[FieldTrip] artifact detection -- computing the accumulated z

Wouter Kruijne w.kruijne at gmail.com
Tue May 22 12:37:03 CEST 2018


Hi everyone,

When playing around with different ways of rejecting (muscle) artifacts, I
came across something in the tutorial + code that I do not fully
understand. The tutorial (
http://www.fieldtriptoolbox.org/tutorial/automatic_artifact_rejection#iii_z-transforming_the_filtered_data_and_averaging_it_over_channels
) states that:

> 4. Per timepoint these z-values are averaged. Since an artifact might
> occur on any and often on more than one electrode (think of eyeblinks and
> muscle artifacts), averaging z-values over channels/electrodes allows
> evidence for an artifact to accumulate.
>

This line of reasoning makes perfect sense to me. However, in the
subsequent mathematical description, and in the implementation, averaging
isn't applied at all: instead, a scaled summation is computed:

zsum(t) = SUM_{ch=1..C} z(ch,t) / sqrt(C)

where C is the number of channels [forgive my math editing skills].
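To make the formula concrete, here is a small numeric sketch in plain Python (not FieldTrip's actual MATLAB code; the array shapes and the use of standard-normal dummy data are my own assumptions for illustration). It computes the sqrt-scaled sum alongside the plain average, showing that the two differ only by a constant factor sqrt(C):

```python
import math
import random

random.seed(0)
C, T = 64, 1000  # channels, timepoints (arbitrary example sizes)

# Suppose z[ch][t] holds the per-channel z-transformed data
# (standard-normal dummy values here, purely for illustration).
z = [[random.gauss(0.0, 1.0) for _ in range(T)] for _ in range(C)]

# Accumulated z as in the tutorial's formula: sum over channels / sqrt(C)
zsum = [sum(z[ch][t] for ch in range(C)) / math.sqrt(C) for t in range(T)]

# Plain averaging over channels, for comparison
zmean = [sum(z[ch][t] for ch in range(C)) / C for t in range(T)]

# The two differ only by a constant factor sqrt(C):
for t in range(T):
    assert abs(zsum[t] - zmean[t] * math.sqrt(C)) < 1e-9
```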

I fail to understand why the accumulated z-score uses a division by the
_sqrt_ of the number of channels, rather than simple averaging. In fact,
the potential downsides I see are:
 - The reported z-value, and the one used for thresholding, no longer
corresponds to the traditional interpretation of a z-score.
 - What counts as a 'good' z-threshold depends on the number of channels,
as this changes the scaling.
 - Should we wish to use the channel-level z-scores to identify channels
affected by the artifact, the accumulated threshold value is on a
different scale than the channel-level z-values.
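The channel-count dependence in the second point can be illustrated with a toy calculation (the per-channel artifact z-value of 5.0 and the assumption that the artifact appears identically on every channel are hypothetical simplifications of my own):

```python
import math

# Hypothetical per-channel z-value during an artifact that shows up
# identically on every channel (a deliberate simplification).
z_artifact = 5.0

for C in (16, 64, 256):
    zsum = C * z_artifact / math.sqrt(C)   # sqrt-scaled sum: grows with C
    zmean = C * z_artifact / C             # plain average: constant in C
    print(f"C={C:3d}  zsum={zsum:5.1f}  zmean={zmean:.1f}")
# prints:
# C= 16  zsum= 20.0  zmean=5.0
# C= 64  zsum= 40.0  zmean=5.0
# C=256  zsum= 80.0  zmean=5.0
```

Under the sqrt scaling, the accumulated value for the same per-channel artifact grows with sqrt(C), so a threshold tuned on a 16-channel montage would behave differently on a 256-channel one.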

Am I missing something? What are the upsides of the sqrt scaling?

Thank you!
Wouter
