@teh_allchemist

Regarding this:

>For 50k systems of solutions, each with 20 variables, Matlab takes 7.63MB in memory to store this system in a Dataset (calculated using this code). Maple takes ~300MB to store the same data in list form (calculated using maple('evalf(kernelopts(bytesused)/1024/1024)')).

You're comparing apples to oranges. According to the docs, kernelopts(bytesused) tells you how much memory has been used by the full Maple kernel so far (not by a single object). The link you provided shows how to estimate the memory used by a single object.
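If you do want a rough per-object estimate using bytesused, one option is to take the difference before and after building the object. This is only a sketch with made-up dummy data, and keep in mind that bytesused counts cumulative kernel allocation, so the difference is an upper bound on the object's footprint, not an exact size:

```maple
restart:
before := kernelopts(bytesused):
# build a list of 50k small solution sets (dummy data, purely illustrative)
L := [seq({a = i, b = 2*i}, i = 1 .. 50000)]:
# the difference gives a rough upper bound on what building L cost,
# since bytesused counts everything the kernel allocated along the way (in MB)
evalf((kernelopts(bytesused) - before)/1024/1024);
```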

Maple will use its internal memory as it computes stuff... this is not really an issue unless it grows beyond your available physical memory and your operating system starts to swap (i.e. use your hard drive as RAM)... and ~300MB doesn't sound unreasonable for an involved calculation. Furthermore, if you're doing all the math in Maple and just returning the results to MATLAB, you'd expect the total memory usage of MATLAB's kernel to be "low" and Maple's to be "high".

In any case... from your code I see that you're casting the results to int32 in MATLAB - these are 4-byte integers.

So 500k worth of 4-byte integers will take less than 2MB... you have multiple entries per "record" - say 10 - so that should be less than 20MB of raw data. In your case the memory footprint will probably be higher, because each of the 500k entries is not just (say) 10 integers, but a Maple "set" object. So the first thing I would do (as outlined in my other post) is create a 500k x 10 integer array up front, i.e. allocate it before you start the loop. Inside the loop, once you have solved for the "Ans", simply fill in the corresponding row of this large array... that guarantees that at the end you have a "raw" data array whose size you *know*.

The key question is what you want to do with the data once you have it... just export it to a file?

If the data are mostly two-digit integers (as I guess from your post), then exporting them as text should in principle be even smaller than 20MB, because on average each entry will take 2 bytes + 1 byte for a delimiter/newline.
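You can sanity-check that estimate by measuring the text length of one formatted record (the ten two-digit values below are made up, just for illustration):

```maple
# one record of 10 two-digit integers, space-separated, newline-terminated
row := sprintf("%d %d %d %d %d %d %d %d %d %d\n",
               12, 34, 56, 78, 90, 11, 22, 33, 44, 55):
length(row);  # 30 characters -> ~30 bytes per record, so 500k records ~ 15MB
```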

So maybe I'm missing something... but I'm not sure why you need the hybrid Maple/MATLAB approach at all?

Exporting directly from Maple should work fine. Have you tried that? If so, what was the result?

**EDIT:**

In the spirit of procrastination, and given that I have so much of my own work that I don't know where to start... here is some dummy code that shows what I proposed in the text above.

```maple
restart:

total_runs := 5;
elems_per_record := 3;

# this could be an Array, but ExportMatrix likes Matrices...
data := Matrix(1..total_runs, 1..elems_per_record, datatype=integer[4]);

# run the loop
for i from 1 to total_runs do
    # here calculate the solution... I'm using a dummy result
    dummy_result := {ICE_T = 10, BAT_P = 30, EM2_T = 50}:

    # now fill in the data for this "record"
    data[i,1] := eval(BAT_P, dummy_result):
    data[i,2] := eval(ICE_T, dummy_result):
    data[i,3] := eval(EM2_T, dummy_result):
end do:

# now data has all the data filled in...
data;

# export the data (for example) as text.
# note: the data on disk is actually smaller than in memory, as explained above!
# (this command returns the byte count it wrote to disk)
ExportMatrix("data.txt", data);
```

... you can change the code to 500k runs with 20 elements per record, and it will still run and save the data to disk just fine.