Hello, I am running a Markov chain Monte Carlo (MCMC) simulation in which I want to store many sampled states. I have the following data structure:
state(1) = struct('dim', 3, 'coords', rand(3,1), 'vals', rand(3,1));
state(10000) = struct('dim', [], 'coords', [], 'vals', []);  % preallocate the struct array
for i = 2:10000
    state(i) = generateNewState(state(i-1));
end
How can I store the generated state data, proceed with the next 10000 states, append those to the existing .mat file, and repeat until I have generated, say, 1e10 states, and then use the data for calculations? My first problem is that the dimension of each struct (up to 10000) is not fixed. The other problem is that I don't want to load the whole .mat file into memory, since it wouldn't fit; I would like to process the data in chunks. By processing I mean computing the mean, variance, covariance, max, and min, extracting every 100th sample, creating a histogram without knowing the domain in advance, etc.
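To make my intent concrete, here is a rough sketch of the chunked generation I have in mind (the file naming, `chunkSize`, and `nChunks` are just placeholders, and `generateNewState` is my own function):

```matlab
% Sketch: generate chunkSize states at a time and save each chunk to its
% own .mat file, so no single file ever has to fit into memory at once.
chunkSize = 10000;
nChunks   = 100;                        % e.g. 1e6 states in total
state(1)  = struct('dim', 3, 'coords', rand(3,1), 'vals', rand(3,1));

for c = 1:nChunks
    for i = 2:chunkSize
        state(i) = generateNewState(state(i-1));
    end
    save(sprintf('states_chunk_%05d.mat', c), 'state', '-v7.3');
    state(1) = state(chunkSize);        % carry the last state over as the seed
end
```

Whether one file per chunk, or appending variables to a single file with `save(..., '-append')`, is the better layout here is exactly part of my question.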
I already tried the map-reduce formalism, but there I had to commit to a maximum dimension and pad every struct of smaller dimension with NaNs in order to store the structs as a table in a CSV file. This can't be the right way to do it, because maybe I will only ever need 10 dimensions even though 10000 are theoretically possible, so I would end up with a really sparse table. It just depends on the data, which I don't know in advance. Does anybody have a good idea how to solve this?
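For clarity, this is roughly what the NaN padding looked like (a sketch; `maxDim` and the row layout are just how I happened to flatten one state into a fixed-width CSV row):

```matlab
% Sketch: flatten one variable-dimension state into a fixed-width row,
% padding unused entries with NaN. With maxDim = 10000 and dim = 3, all
% but 7 of the 20001 entries are NaN -- hence the very sparse table.
maxDim = 10000;
s = struct('dim', 3, 'coords', rand(3,1), 'vals', rand(3,1));

row = nan(1, 2*maxDim + 1);
row(1) = s.dim;
row(2 : 1 + s.dim) = s.coords;                      % coords block
row(maxDim + 2 : maxDim + 1 + s.dim) = s.vals;      % vals block
```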
Thanks in advance!