out of memory - Windows machine

Vladimir Litvak litvak at TECHUNIX.TECHNION.AC.IL
Fri Jul 13 17:34:45 CEST 2007


Dear all,



I was at a MathWorks workshop yesterday where they talked about their new
Distributed Computing Toolbox (version 3.1). They say that if you have a
computer cluster, you can install a slave Matlab engine on each machine and
declare a variable that is distributed among all the machines, thereby
using the combined memory of the whole cluster even without parallelizing
any of the computations. Of course things will run rather slowly that way.
Moreover, it should also be possible to run several instances of this engine
on the same machine to make use of multi-core processors (if the code is
parallelized), but also to address more than 2 GB of memory on a 32-bit
machine, provided that every instance of the engine stores less than 2 GB.
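
In outline, the idea would look something like this. Note that the names
(parpool, distributed, gather) are from the current Parallel Computing
Toolbox and differ from the 2007 version 3.1 API, and the worker count and
matrix size here are arbitrary:

parpool(4);                      % start four workers (local or cluster)
D = rand(20000, 'distributed');  % the array is partitioned across the
                                 % workers, so no single process has to
                                 % hold all of it in memory
colmeans = mean(D, 1);           % operations run on the distributed pieces
result = gather(colmeans);       % pull only the small result back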



I have not tried any of this, but if you try and it works for you I'd be
very interested to hear about it.



Best,



Vladimir



  _____

From: FieldTrip discussion list [mailto:FIELDTRIP at NIC.SURFNET.NL] On Behalf
Of Jared Van Snellenberg
Sent: Friday, June 29, 2007 3:58 AM
To: FIELDTRIP at NIC.SURFNET.NL
Subject: Re: [FIELDTRIP] out of memory - Windows machine



Hi Virginie,



There are a couple of other things you can do.



First, make certain that you are not growing any variables inside a loop.
For example, if you have any code like:



for i=1:size(data,1)
    for j=1:size(data,2)
        newvar(i,j) = data(i,j);  % newvar grows on every iteration unless
                                  % it has been preallocated
    end
end



make sure that you precede this code with:



newvar = zeros(size(data));  % preallocate newvar to its final size



This will not only speed up the execution of your code but can also prevent
memory errors in Matlab, because growing an array forces Matlab to
reallocate and copy it on every iteration, which fragments memory.



Second, use the clear function to remove any variables that are no longer
necessary.



Third, type 'pack' at the command line before executing the part of your
code that generates the memory error. This consolidates fragmented
workspace memory by saving all variables to disk and reloading them.
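
For example (the variable names here are hypothetical):

clear rawdata tmpdata  % release variables you no longer need
pack                   % write the remaining workspace to disk and reload
                       % it, consolidating free memory into one block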



If none of this works, there are two more options that I'm aware of for
avoiding a memory error.



First, you can run your analysis on a computer with a 64-bit architecture
and a 64-bit operating system. The reason you are still encountering memory
errors despite maximizing virtual memory is that a 32-bit system cannot
address more than 4 GB of memory (2^32 bytes), regardless of how much is
available; on 32-bit Windows, each process is normally limited to 2 GB of
user address space. This limitation is effectively removed on 64-bit
systems (or rather, the limit is several orders of magnitude higher).
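
The arithmetic, checked at the Matlab prompt:

2^32 / 2^30  % ans = 4, i.e. 4 GB addressable with 32-bit pointers
2^64 / 2^30  % ans = 1.7180e+10, i.e. ~17 billion GB with 64-bit pointers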



Finally, you can run your analysis in steps, saving the results of each step
and clearing all the data. I've certainly had to do this before when working
in fieldtrip. For example, for a timelock analysis across subjects, load
each subject individually, average their trials, and save the result to a
new variable; then clear that subject's data, load the next subject, and
continue. In addition, with the datasets I've used in FIELDTRIP I've noticed
that for most functions specifying cfg.keeptrials='yes' is likely to
generate memory errors, and for frequency analyses specifying
cfg.output='powandcsd' with a large number of pairings in cfg.channelcmb
(or leaving it at its default) is pretty much guaranteed to generate a
memory error.
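
As a sketch of that per-subject loop (the file and variable names here are
hypothetical, and the ft_ prefixes are from current FieldTrip releases; the
2007 functions were timelockanalysis / timelockgrandaverage, without the
prefix):

subjfiles = {'subj01.mat', 'subj02.mat', 'subj03.mat'};  % hypothetical
avg = cell(1, numel(subjfiles));
for s = 1:numel(subjfiles)
  tmp = load(subjfiles{s});                    % one subject in memory at a time
  cfg = [];                                    % defaults, so keeptrials is 'no'
  avg{s} = ft_timelockanalysis(cfg, tmp.data);
  clear tmp                                    % free this subject before the next
end
cfg = [];
grandavg = ft_timelockgrandaverage(cfg, avg{:});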



Good luck!



-Jared



On 6/28/07, Virginie van Wassenhove <vvw at caltech.edu> wrote:

Hi fieldtrippers,

Would anyone have a trick to share for optimizing memory in Matlab?

What I have tried so far (still insufficient):
- let Windows manage memory
- boost virtual memory to the maximum (on 2 drives!)
- clear all unnecessary variables in the Matlab workspace
- disable Java
- turn off graphics hardware acceleration
- shut off unused processes

I am running out of options... Would using a Mac solve these issues?

Thanks in advance,
-vv


Virginie van Wassenhove, PhD

:::::::::::: contact info  :::::::::::::
Caltech - Division of Biology
1200 E. California Blvd M/C 139-74
Pasadena CA 91125 USA
:::::::::::::::::::::::::::::::::::::::::::::::::
vvw at caltech.edu
Virginie.van.Wassenhove at gmail.com
W: 626.395.8959
http://www.its.caltech.edu/~vvw

:::::::::::::::::: extras ::::::::::::::::::::
http://www.kiva.org
http://www.thehungersite.com/
http://www.agloco.com/r/BBBS1539
:::::::::::::::::::::::::::::::::::::::::::::::::



--
Jared Van Snellenberg
Social Cognitive Affective Neuroscience Unit
http://scan.psych.columbia.edu
(212) 854-7858 p
(212) 854-3609 f
Department of Psychology, Columbia University
406 Schermerhorn Hall
1190 Amsterdam Avenue, Mail Code 5501
New York, NY 10027
_______________________________
"Luck is the residue of design"
-Attributed to Branch Rickey, former US Baseball Administrator, and also to
John Milton. Go figure.

----------------------------------
The aim of this list is to facilitate the discussion between users of the
FieldTrip toolbox, to share experiences and to discuss new ideas for MEG
and EEG analysis. See also
http://listserv.surfnet.nl/archives/fieldtrip.html and
http://www.ru.nl/fcdonders/fieldtrip.