[FieldTrip] downsample while loading

Caspar, Emilie e.caspar at ucl.ac.uk
Fri Mar 20 13:14:07 CET 2015


Hi Omer,

Maybe this can help. It filters and epochs the data before downsampling, which should be faster. However, that depends heavily on the computer you have. I tried this on two different computers with exactly the same specifications; the disk of one was almost full, the other was not. Processing 64 electrodes took 15 minutes on the emptier machine and 40 minutes on the full one.

        %% resampling configuration
        cfgr            = [];
        cfgr.resamplefs = 256;
        cfgr.detrend    = 'yes';

        %% define epochs before filtering
        cfg1                     = [];
        cfg1.dataset             = file.name;
        cfg1.trialdef.eventtype  = 'STATUS';
        cfg1.trialdef.eventvalue = 65;
        cfg1.trialdef.prestim    = abs(windows(1));
        cfg1.trialdef.poststim   = windows(2);
        cfg1                     = ft_definetrial(cfg1);

        %% filter and downsample, one channel at a time
        singlechan = cell(1, nchans);
        for i = 1:nchans
            cfg1.channel  = i;
            cfg1.bpfilter = 'yes';
            cfg1.bpfreq   = [0.9 30];   % band-pass filter
            datp          = ft_preprocessing(cfg1);
            singlechan{i} = ft_resampledata(cfgr, datp);
            clear datp
        end

        %% append the single-channel data back together
        cfg    = [];
        datall = ft_appenddata(cfg, singlechan{:});
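If the trouble is loading a long continuous recording rather than epoched data, a variant of the same per-channel loop might work without defining trials at all. This is only a sketch (not tested) that assumes the same file.name and nchans variables as above; it reads each channel as one continuous trial, so only a single channel ever sits in memory at the original sampling rate:

```matlab
% Hypothetical variant: resample a long continuous recording channel by
% channel, then append the channels. Assumes file.name and nchans exist.
cfgr            = [];
cfgr.resamplefs = 256;
cfgr.detrend    = 'yes';

singlechan = cell(1, nchans);
for i = 1:nchans
    cfgp            = [];
    cfgp.dataset    = file.name;
    cfgp.channel    = i;        % load only one channel at a time
    cfgp.continuous = 'yes';    % treat the file as one long trial
    datp            = ft_preprocessing(cfgp);
    singlechan{i}   = ft_resampledata(cfgr, datp);
    clear datp
end
datall = ft_appenddata([], singlechan{:});
```

Epoching could then be done after downsampling with ft_redefinetrial, if needed.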

---------------------------------------------
Emilie Caspar

Aspirante FNRS - Ph.D. Student

Consciousness, Cognition & Computation Group (CO3)
Centre de Recherche Cognition et Neurosciences (CRCN)
ULB Neurosciences Institute (UNI)

Université Libre de Bruxelles
Av. F.-D. Roosevelt, 50
1050 Bruxelles
BELGIUM

Voice : +32 2 650 32 95
mail : ecaspar at ulb.ac.be
office: DB10-138




On 20 March 2015, at 12:49, Omer Sharon <omerxsharon at gmail.com> wrote:

Hi everybody

Thanks for answering; I'm reading through most of the answers here and it's very helpful.
I'm having trouble loading single (and several) channels sampled at 1000 Hz over about 8 hours: I get an "out of memory" error from ft_preprocessing.

Is there a way to downsample the data while loading (before preprocessing)?
Or, alternatively, to load it in several parts (along the time dimension), downsample each part, and then concatenate?

Thanks a lot
Omer
_______________________________________________
fieldtrip mailing list
fieldtrip at donders.ru.nl
http://mailman.science.ru.nl/mailman/listinfo/fieldtrip


