[FieldTrip] data-deleting bug in FieldTrip (present for 9 hours yesterday, now fixed)
r.oostenveld at donders.ru.nl
Tue Aug 25 10:51:56 CEST 2015
Dear FieldTrip users
In SVN revision 10622 I made this change https://bitbucket.org/fieldtriptoolbox/fieldtrip/commits/0539a45d6bd609673a5c618c8ad6ad62de1c4013
to make the unzipping and subsequent cleanup of *.gz, *.tgz and *.zip datasets more consistent (I had noticed it was done in different ways), but I did not realize that this would interact with dataset2files, the helper function that deals with raw datasets that are split over multiple files.
So in short, for roughly 9 hours yesterday we had a FieldTrip version that might have caused problems by deleting the raw data files that you were processing.
The problem arose when the specified input file was detected to differ from the file that was actually read; in that case the code assumed the file was a temporary unzipped copy and deleted it. But there are cases (handled in private/dataset2files) that were incorrectly detected this way, e.g. a CTF *.ds directory with a *.meg4 data file, or BrainVision vhdr/vmrk/eeg files.
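To make the failure mode concrete, here is a minimal sketch in Python (a hypothetical reconstruction of the heuristic described above, not FieldTrip's actual MATLAB code): if cleanup is triggered merely because the file that was read differs from the file the user specified, any multi-file dataset, where the user names the header file but the reader opens a companion data file, gets its raw data deleted.

```python
import os
import tempfile

def naive_cleanup(requested, actually_read):
    # Flawed heuristic resembling the bug described above: if the file
    # that was read differs from the file the user asked for, assume it
    # is a temporary unzipped copy and delete it.
    if os.path.abspath(requested) != os.path.abspath(actually_read):
        os.remove(actually_read)  # deletes ORIGINAL data for split datasets!

def safe_cleanup(actually_read, we_unzipped_it):
    # Safer rule: only delete a file if this code itself created it
    # during unzipping, tracked by an explicit flag.
    if we_unzipped_it:
        os.remove(actually_read)

# Simulate a split dataset: the user specifies the header file,
# but the reader actually opens the companion data file.
d = tempfile.mkdtemp()
hdr = os.path.join(d, "subject.vhdr")
dat = os.path.join(d, "subject.eeg")
for f in (hdr, dat):
    open(f, "w").close()

naive_cleanup(hdr, dat)
print(os.path.exists(dat))  # False: the raw data file is gone
```

The fix amounts to replacing the filename comparison with an explicit record of whether the toolbox itself produced the file during unzipping.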
Please do not use SVN revision 10622, nor the ftp download version “20150824”. That ftp version has now been deleted from our ftp server, but it was available for an hour or so.
I am very sorry for the inconvenience that this may cause or may have caused.