I am trying some pattern classification in FieldTrip.

Mostly using:

cfg.method = 'crossvalidate';
cfg.mva    = {dml.standardizer dml.glmnet('family','binomial')};

I am trying to predict task accuracy (0,1) in cases where accuracy is often around 80%, so the frequency of the correct class is much greater than the frequency of the incorrect class.

dml.crossvalidate can handle this as follows:

%   In order to balance the occurrence of different classes one may set
%   'resample' equal to true (default: false). Resample will upsample less
%   occurring classes during training and downsample often occurring
%   classes during testing.

... but this requires tossing a lot of data in the downsampling process.
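For reference, this is roughly how I would turn that option on. I am assuming the flag is simply passed through the cfg as cfg.resample, and the statistic/design fields below are just my own guesses at a representative call, so please correct me if the field names are wrong:

cfg           = [];
cfg.method    = 'crossvalidate';
cfg.mva       = {dml.standardizer dml.glmnet('family','binomial')};
cfg.resample  = true;                      % assumed pass-through to the dml 'resample' option
cfg.statistic = {'accuracy' 'binomial'};   % performance measures I would ask for (my guess at the option)
cfg.design    = accuracy_per_trial + 1;    % hypothetical label vector: 1 = incorrect, 2 = correct trials
stat          = ft_timelockstatistics(cfg, single_trial_data);   % or ft_freqstatistics, depending on the data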
Has anybody tried other approaches for dealing with skewed classes that do not involve downsampling? Like this, for example:

Loss functions allowing for unbalanced classes
The classification performance is always evaluated by some loss function, see the section Estimation of the generalization error. Typical examples are the 0/1-loss (i.e., average number of misclassified samples) and the area under the receiver operator characteristic (ROC) curve (Fawcett, 2006). When using misclassification rate, it must be assured that the classes have approximately the same number of samples. Otherwise, the employed performance measure has to consider the different class prior probabilities. For instance, in oddball paradigms the task is to discriminate brain responses to an attended rare stimulus from responses to a frequent stimulus. A typical ratio of frequent-to-rare stimuli is 85:15. In such a setting, an uninformative classifier which always predicts the majority class would obtain an accuracy of 85%. Accordingly, a different loss function needs to be employed. Denoting the number of samples in class i by n_i, the normalized error can be calculated as a weighted average, where errors committed on samples of class i are weighted by N/n_i, with N = Σ_k n_k.

From:
S. Lemm, B. Blankertz, T. Dickhaus, K. R. Müller,
Introduction to machine learning for brain imaging.
NeuroImage, 56:387-399, 2011.
http://doc.ml.tu-berlin.de/bbci/publications/LemBlaDicMue11.pdf

Not being a very good programmer, I got lost in the code before I could find the relevant cost function to apply the normalization.

Any advice on these issues would be much appreciated.

thanks
Tim
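P.S. In case it clarifies what I am after: my reading of the paper's class-normalized error, translated into MATLAB, is roughly the function below. The function and variable names are mine, not anything from DMLT or FieldTrip, so please correct me if I have misread the formula.

function err = normalized_error(true_labels, predicted_labels)
% Weighted 0/1-loss in which an error on a sample of class i counts N/n_i,
% with n_i the number of samples in class i and N the total number of samples
% (Lemm et al., 2011). This works out to the mean of the per-class error rates.
classes = unique(true_labels);
N       = numel(true_labels);
weighted_errors = 0;
weight_total    = 0;
for k = 1:numel(classes)
  idx = (true_labels == classes(k));
  ni  = sum(idx);        % n_i: number of samples in class i
  wi  = N / ni;          % weight N/n_i for each sample of class i
  weighted_errors = weighted_errors + wi * sum(predicted_labels(idx) ~= true_labels(idx));
  weight_total    = weight_total + wi * ni;
end
err = weighted_errors / weight_total;
end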