Freqstatistics Yields Zero Significant Clusters?

Charles Cook charles.cook at ULETH.CA
Fri Jun 5 21:47:39 CEST 2009


Hi Michael,

Excellent points, which were well taken. I've made those changes and
modified my time and frequency windows a bit as well to better reflect our a
priori hypotheses. I first ran the analysis with cfg.latency = [0 250] and
again came up with zero significant clusters. While it's entirely possible
that there is nothing significant there, the topo plots I'd generated
certainly suggested some serious qualitative differences between male and
female images. Be that as it may, I then set cfg.latency = [250 500] and
received this error:
--------------------------------------------------------------------------
.
.
computing clusters in randomization 4999 from 5000
computing clusters in randomization 5000 from 5000
using a cluster-based method for multiple comparison correction
the returned probabilities and the thresholded mask are corrected for
multiple comparisons
the input is freq data with 81 channels, 79 frequencybins and 31 timebins

computing the leave-one-out averages [-------------------------------------]
the input is freq data with 81 channels, 79 frequencybins and 31 timebins

computing the leave-one-out averages [-------------------------------------]
??? Assignment has more non-singleton rhs dimensions than non-singleton
subscripts

Error in ==> clusterplot at 91
    sigposCLM(:,:,iPos) = (posCLM == sigpos(iPos));

Error in ==> CMCWM2_std81 at 160
clusterplot(cfg, stat);
------------------------------------------------------------
I'm assuming that this particular analysis does have significant clusters,
since it got this far where the others simply reported 'no significant
clusters'. Not sure what's going on here with clusterplot though.
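In case it helps with diagnosis, here is a sketch of a check one could run before calling clusterplot, to separate "clusterplot crashed" from "no clusters at all". The field names (posclusters/negclusters, .prob) follow the FieldTrip stat output structure; treat the snippet as an assumption-laden sketch rather than a verified fix:

```matlab
% Hedged diagnostic sketch: print the cluster p-values before plotting.
% 'stat' is the output of the cluster-based statistics call above.
if isfield(stat, 'posclusters') && ~isempty(stat.posclusters)
    fprintf('smallest positive-cluster p: %g\n', min([stat.posclusters.prob]));
end
if isfield(stat, 'negclusters') && ~isempty(stat.negclusters)
    fprintf('smallest negative-cluster p: %g\n', min([stat.negclusters.prob]));
end
```

If p-values print for one tail only, that would at least narrow down where the assignment inside clusterplot goes wrong.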

One further question is whether calculations are still being performed on
all 79 frequencybins and 31 timebins even though I have specified a narrower
selection.
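For reference, the selection options involved look like this; option names are as documented for the FieldTrip statistics configuration, and the window values below are only example placeholders, not my actual settings:

```matlab
% Hedged sketch: restricting the statistics to a sub-window. With these
% options set, only the selected time/frequency range should enter the
% test; the windows below are placeholder examples.
cfg = [];
cfg.latency     = [250 500];  % time window, in the units of the data's time axis
cfg.frequency   = [8 12];     % frequency window in Hz (example values)
cfg.avgovertime = 'no';       % 'yes' would collapse the window to one bin
cfg.avgoverfreq = 'no';
```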

And lastly, I would like to take this opportunity to thank Michael for his
much-valued assistance with the many questions I've had, both on the board
and in personal correspondence.

Cheers,

Charles

On Fri, 5 Jun 2009 10:58:47 +0200, Michael Wibral
<wibral at BIC.UNI-FRANKFURT.DE> wrote:

>Hi Charles,
>
>from your output:
>...
>computing statistic 100 from 100
>performing Bonferoni correction for multiple comparisons
>...
>
>it seems that you're only computing 100 randomizations. It follows that the
>best p-value you could EVER get is 0.01. You then do Bonferroni correction
>(not the cluster-based correction you intended!). So if you set an alpha of
>0.9 and divide this by - say - 2000 for your Bonferroni correction, your
>alpha is 0.9/2000=0.00045. You see that you'll never reach this limit given
>that you do only 100 randomizations and by definition cannot get below
>p=0.01. In addition you won't get clusters when you use Bonferroni.
>
>I suggest using:
>
>cfg.numrandomization = 5000;
>cfg.correctm = 'cluster'; % or 'fdr'
>
>
>Good Luck!
>Michael
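
The arithmetic in the quoted message can be checked in a couple of lines (the numbers are taken directly from Michael's example above):

```matlab
% Hedged check of the point in the quoted message: with N randomizations,
% the smallest attainable Monte Carlo p-value is 1/N, which can never pass
% a Bonferroni-corrected threshold this strict.
N = 100;                   % randomizations in the original run
min_p = 1 / N;             % = 0.01
bonf_thresh = 0.9 / 2000;  % = 0.00045, the example threshold from the message
fprintf('min p = %g, Bonferroni threshold = %g\n', min_p, bonf_thresh);
```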

----------------------------------
The aim of this list is to facilitate the discussion between users of the FieldTrip  toolbox, to share experiences and to discuss new ideas for MEG and EEG analysis. See also http://listserv.surfnet.nl/archives/fieldtrip.html and http://www.ru.nl/neuroimaging/fieldtrip.


