ft_appenddata and denoise_pca causing problems

Luisa Frei l.frei at PSY.GLA.AC.UK
Wed Nov 10 13:12:13 CET 2010


Hi Jan-Mathijs,
thanks for the reply. I have now resorted to denoising per block,
which gets rid of most of the noise components (there is usually one
left per session, but I have been told that is nothing to worry
about). I have also visually inspected the blocks of one session to
make sure no jumps are left, and I have reduced the threshold for
jump-artifact detection. However, this didn't make much of a
difference.
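
For completeness, the jump-detection step looks roughly like this (a
minimal sketch; the dataset name, trial definition and cutoff value
are placeholders, not recommendations):

cfg = [];
cfg.dataset = 'session1_block01';   % placeholder name of one 4D run
cfg.trl     = trl;                  % trial definition from ft_definetrial
cfg.artfctdef.jump.channel = 'MEG';
cfg.artfctdef.jump.cutoff  = 15;    % lowered z-value cutoff
[cfg, artifact] = ft_artifact_jump(cfg);

cfg.artfctdef.reject = 'complete';  % drop trials containing a jump
data = ft_rejectartifact(cfg, data);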

We discussed this in our MEG group meeting, and it turned out that
one other person has had the same problem, whereas a colleague who
had used the old version of ft_appenddata did not. We thought that if
this keeps happening, it might be worth taking a look at
ft_appenddata to make sure it handles the 4D reference channels
correctly. For now, however, I'm happy with the solution I have.
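
In case anyone wants to check this, a quick diagnostic would be
something along these lines (a sketch; it assumes two blocks are
already in memory and that the reference channels carry the usual 4D
names):

cfg = [];
data_all = ft_appenddata(cfg, data_block1, data_block2);

% the 4D reference channels should survive the concatenation
refchan = ft_channelselection('MEGREF', data_all.label);
fprintf('%d reference channels after appending\n', numel(refchan));

% the gradiometer description should also still be there, if it is kept
if isfield(data_all, 'grad')
  disp(numel(data_all.grad.label));
end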

Thanks,
Luisa

On 8 Nov 2010, at 08:07, jan-mathijs schoffelen wrote:

> Dear Luisa,
>
>>
>> Hi,
>> I'm sorry if this has been discussed before; if it has, could you
>> please direct me to the relevant emails? Thanks.
>>
> Don't worry. This has not been discussed here before, and it's an
> interesting issue. Yet probably most people wouldn't know what you
> are talking about, because the function denoise_pca is not yet part
> of the FieldTrip release version ;o). At the moment, only the CCNi
> and the FCDC have it available.
>
>> I am encountering a problem when using ft_appenddata and
>> denoise_pca. My MEG experiment (4D) consists of 10 blocks per
>> session, and I have a separate data set for each block.
>>
>> During preprocessing (after artifact rejection), I concatenate
>> these data sets to find denoising weights per session using the 4D
>> function denoise_pca (I do this on short 400 ms epochs to save
>> computation time). Afterwards, I apply these weights per block,
>> downsample the data, and concatenate the resulting data again in
>> order to run an ICA per session to remove heartbeat artifacts. I
>> downsample because I use longer epochs (1.5 s) for the ICA.
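>>
>> In FieldTrip terms, the steps are roughly as follows (a sketch; I
>> am assuming that denoise_pca follows the usual cfg/data calling
>> convention, and the resampling rate is just an example):
>>
>> % concatenate all blocks of one session (short 400 ms epochs)
>> cfg = [];
>> data_session = ft_appenddata(cfg, block{:});
>>
>> % estimate the denoising weights on the concatenated session
>> denoised = denoise_pca(cfg, data_session);  % assumed interface
>>
>> % apply the weights per block (longer 1.5 s epochs), downsample,
>> % and concatenate again; how the session weights are passed on
>> % depends on the denoise_pca interface
>> for k = 1:numel(longblock)
>>   dn = denoise_pca(cfg, longblock{k});
>>   cfgr = [];
>>   cfgr.resamplefs = 300;            % example target rate
>>   ds{k} = ft_resampledata(cfgr, dn);
>> end
>> data_ica = ft_appenddata(cfg, ds{:});
>>
>> % ICA per session to remove the heartbeat components
>> cfgi        = [];
>> cfgi.method = 'runica';
>> comp = ft_componentanalysis(cfgi, data_ica);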
>> However, when I do an ICA on the denoised and concatenated
>> session, I get lots of high-variance noise components (first
>> figure). Trying to find the root of this, I went back and ran the
>> analysis for one block only (denoising for one block, no
>> concatenating), and the result looks much better (second figure).
>> Then, as a test, I concatenated two blocks, found the weights for
>> those two concatenated blocks, applied them to the individual
>> blocks, concatenated again, and ran the ICA on these two blocks;
>> the result is shown in the last figure.
>> So it seems that when denoising per session (the concatenated
>> blocks), I introduce noise when I apply the PCA weights to the
>> individual blocks, whereas the whole point of the exercise was to
>> reduce noise by denoising per session rather than per block.
>
> Denoise_pca indeed tries to reduce the noise in the magnetometer
> data by computing a set of balancing coefficients, which are used
> to subtract a weighted combination of the signals measured at the
> reference sensors from the signals measured at the magnetometer
> coils. The underlying assumption is that the signals picked up by
> the references reflect purely environmental noise. If there is
> sensor-specific noise, e.g. a jump in one of the references, the
> denoising algorithm will actually inject noise into the data. I
> suspect that in some of the blocks there is unaccounted-for noise
> in your references, which deteriorates the results in the
> concatenated-block case and leads to suboptimal results whenever
> denoise_pca operates on data that contains the 'noisy' block.
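>
> In its simplest form, the balancing amounts to a least-squares
> regression of the magnetometer data on the reference data (a
> minimal sketch of the idea, not the actual denoise_pca code):
>
> % meg: nMegChan x nSamples, ref: nRefChan x nSamples
> w = (meg * ref') / (ref * ref');  % regression weights, nMeg x nRef
> meg_clean = meg - w * ref;        % subtract the predicted noise
>
> A jump in a single reference channel distorts ref*ref' and thereby
> w, so the 'noise' that is subtracted is wrong for every channel in
> that block.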
>
>> Why is this? Am I doing something fundamentally wrong? I'm
>> wondering because a colleague of mine did an almost identical
>> analysis step a while ago, and she doesn't get problems with
>> high-variance noise components. I'm not sure whether the problem
>> lies with ft_appenddata or denoise_pca, but I know that
>> ft_appenddata has been changed recently, so this might be one
>> possible source of the problem.
>
> I don't think ft_appenddata would be the cause of this, because
> this function only concatenates data structures.
>
> As a diagnostic, I would first check the quality of the reference
> channel data and see whether there are anomalies in some of the
> blocks.
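>
> Something along these lines would let you browse the references per
> block (a sketch; the dataset name is made up):
>
> cfg         = [];
> cfg.dataset = 'e,rfhp0.1Hz';   % one 4D block
> cfg.channel = 'MEGREF';        % reference channels only
> ref = ft_preprocessing(cfg);
>
> cfg = [];
> ft_databrowser(cfg, ref);      % look for jumps and other anomalies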
>
> Good luck,
> Jan-Mathijs
>
>
> Dr. J.M. (Jan-Mathijs) Schoffelen
> Donders Institute for Brain, Cognition and Behaviour,
> Centre for Cognitive Neuroimaging,
> Radboud University Nijmegen, The Netherlands
> J.Schoffelen at donders.ru.nl
> Telephone: 0031-24-3614793
>

---------------------------------------------------------------------------
You are receiving this message because you are subscribed to the
FieldTrip list. The aim of this list is to facilitate the discussion
between users of the FieldTrip toolbox, to share experiences and to
discuss new ideas for MEG and EEG analysis.
See also http://listserv.surfnet.nl/archives/fieldtrip.html
and http://www.ru.nl/neuroimaging/fieldtrip.
---------------------------------------------------------------------------


