23/03/2013 — The conceptual and practical limitations of classical multiple linear regression models can be resolved naturally in a Bayesian framework. Unless based on an overly simplistic parameterization, however, exact inference in Bayesian regression models is analytically intractable. This problem can be overcome using methods for approximate inference. This MATLAB toolbox implements variational inference for a fully Bayesian multiple linear regression model, including Bayesian model selection and prediction of unseen data points on the basis of the posterior predictive density.
Download code (358 KB)
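The flavour of such a model is easy to convey in a few lines. The sketch below is not the toolbox's variational scheme but the simpler conjugate special case with a known noise precision, written in Python; the prior precision `alpha`, noise precision `beta`, and toy data are all illustrative. In this special case the weight posterior and the posterior predictive density are available in closed form.

```python
import numpy as np

# Conjugate Bayesian linear regression sketch (illustrative, not the
# toolbox's code): Gaussian prior N(0, alpha^-1 I) on the weights and a
# known noise precision beta give a closed-form Gaussian posterior.
def posterior(X, y, alpha=1.0, beta=25.0):
    """Posterior mean and covariance over regression weights."""
    d = X.shape[1]
    S_inv = alpha * np.eye(d) + beta * X.T @ X
    S = np.linalg.inv(S_inv)
    m = beta * S @ X.T @ y
    return m, S

def predictive(x_star, m, S, beta=25.0):
    """Posterior predictive mean and variance at a new input x_star."""
    mu = x_star @ m
    var = 1.0 / beta + x_star @ S @ x_star  # noise + weight uncertainty
    return mu, var

# Toy data: y = 0.5 + 2*x + Gaussian noise (std 0.2, matching beta=25)
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.uniform(-1, 1, 50)])
y = X @ np.array([0.5, 2.0]) + rng.normal(0, 0.2, 50)
m, S = posterior(X, y)
mu, var = predictive(np.array([1.0, 0.5]), m, S)
```

Note how the predictive variance decomposes into irreducible noise (`1/beta`) plus a term reflecting remaining uncertainty about the weights; this is the quantity a point estimate of the weights would discard.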
03/01/2013 — Mixed-effects inference is critical whenever one wishes to evaluate the performance of a classification algorithm that has been trained and tested on a hierarchically structured dataset. This setting is very common in domains as varied as spam detection, brain-machine interfaces, and neuroimaging. This R package provides an efficient variational Bayesian implementation of the normal-binomial model for mixed-effects inference. The package permits inference on accuracies as well as balanced accuracies.
This software is hosted on mloss.org.
01/06/2012 — Classification algorithms are often used in a hierarchical setting, where a classifier is trained and tested on individual datasets which are themselves sampled from a group. Examples of this sort of analysis are ubiquitous and are common in domains as varied as spam detection, brain-machine interfaces, and neuroimaging. This toolbox provides answers to the questions of statistical inference that arise in all of these settings. It implements models that account for both within-subjects (fixed-effects) and between-subjects (random-effects) variance components and thus provide mixed-effects inference. The toolbox provides (i) asymptotically exact MCMC implementations as well as (ii) computationally efficient variational Bayes approximations.
This software is hosted on mloss.org.
16/11/2011 — The beta-binomial model enables the performance evaluation of a classification algorithm that is used in a hierarchical context. This archive contains a Java implementation of a Metropolis-Hastings/Gibbs sampling scheme. The algorithm is asymptotically exact and about 10 times faster than an equivalent MATLAB implementation.
Download v1.0 (13 KB)
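To illustrate the idea behind such a sampler (this is a Python sketch for exposition, not the Java code, and the proposal scale and prior are illustrative choices): subject-wise accuracies pi_j follow a Beta(a, b) population distribution, observed correct counts k_j are Binomial(n_j, pi_j), the pi_j are updated by exact conjugate Gibbs steps, and the population parameters (a, b) by a random-walk Metropolis step on the log scale under a vague prior p(a, b) ∝ (a + b)^(-5/2).

```python
import math
import numpy as np

def sample(k, n, iters=2000, seed=0):
    """Metropolis-within-Gibbs for the beta-binomial model (illustrative)."""
    rng = np.random.default_rng(seed)
    k, n = np.asarray(k, float), np.asarray(n, float)
    a, b = 1.0, 1.0

    def log_post(a_, b_, pi):
        # Beta log-likelihood of the subject accuracies plus the vague
        # prior p(a, b) proportional to (a + b)^(-5/2)
        ll = np.sum(math.lgamma(a_ + b_) - math.lgamma(a_) - math.lgamma(b_)
                    + (a_ - 1.0) * np.log(pi) + (b_ - 1.0) * np.log1p(-pi))
        return ll - 2.5 * math.log(a_ + b_)

    trace = []
    for _ in range(iters):
        # Gibbs: each subject's accuracy has a conjugate Beta posterior
        pi = rng.beta(a + k, b + n - k)
        # Metropolis: random walk on (log a, log b); the change of
        # variables contributes the log(a*b) Jacobian terms below
        a_p = math.exp(math.log(a) + rng.normal(0.0, 0.3))
        b_p = math.exp(math.log(b) + rng.normal(0.0, 0.3))
        log_ratio = (log_post(a_p, b_p, pi) + math.log(a_p * b_p)
                     - log_post(a, b, pi) - math.log(a * b))
        if math.log(rng.uniform()) < log_ratio:
            a, b = a_p, b_p
        trace.append(a / (a + b))  # population mean accuracy
    return np.array(trace)

# Usage: five subjects with 20 test trials each
trace = sample([18, 16, 19, 17, 15], [20, 20, 20, 20, 20])
posterior_mean = trace[500:].mean()  # discard burn-in
```

The trace of a/(a + b) is the quantity of interest for group-level inference: the posterior over the population mean accuracy rather than any single subject's.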
26/04/2011 — The latest release of the MVPA framework for MATLAB provides a generic environment for the application of statistical methods to high-dimensional datasets, such as those obtained by functional magnetic resonance imaging (fMRI). The framework is highly extensible and supports tight integration with LIBSVM, the Princeton MVPA toolbox, and Sun Grid Engine for use on high-performance compute clusters running Linux. It is published under the terms of the GNU General Public License.
Download v3.2 (r10406) (703 KB)
22/10/2010 — The average accuracy obtained on individual cross-validation folds is a problematic measure of generalization performance. First, it makes statistical inference difficult. Second, it leads to an optimistic estimate when a biased classifier is tested on an imbalanced dataset. Both problems can be overcome by replacing the conventional point estimate of accuracy with an estimate of the posterior distribution of the balanced accuracy. The archive below contains a set of MATLAB functions to estimate the posterior distribution of the balanced accuracy and compute its associated statistics. For details, see Brodersen et al. (2010b) ICPR.
This code is hosted on MATLAB Central.
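The construction can be sketched in a few lines of Python (illustrative code, not the MATLAB functions in the archive; the flat Beta(1, 1) prior is an assumption of this sketch): under a flat prior, the posterior over each class-wise accuracy is Beta(k + 1, n − k + 1), and the posterior over the balanced accuracy is the distribution of their average, which is easy to approximate by Monte Carlo.

```python
import numpy as np

def balanced_accuracy_posterior(k_pos, n_pos, k_neg, n_neg,
                                draws=100_000, seed=0):
    """Monte Carlo draws from the posterior of the balanced accuracy.

    Assumes a flat Beta(1, 1) prior on each class-wise accuracy.
    """
    rng = np.random.default_rng(seed)
    tpr = rng.beta(k_pos + 1, n_pos - k_pos + 1, draws)  # sensitivity
    tnr = rng.beta(k_neg + 1, n_neg - k_neg + 1, draws)  # specificity
    return 0.5 * (tpr + tnr)

# A biased classifier on an imbalanced test set: 90/100 positives correct
# but only 2/10 negatives. Conventional accuracy (92/110, about 0.84)
# looks good; the balanced-accuracy posterior tells a different story.
bac = balanced_accuracy_posterior(90, 100, 2, 10)
mean = bac.mean()
lo, hi = np.quantile(bac, [0.025, 0.975])
```

The posterior mean lands near chance despite the high conventional accuracy, and the credible interval (`lo`, `hi`) supports the statistical inference that a point estimate cannot.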
20/06/2010 — The precision-recall curve (PRC) has become a widespread conceptual tool for assessing classification performance. The curve relates the positive predictive value of a classifier to its true positive rate and provides a useful alternative to the well-known receiver operating characteristic (ROC). A smooth estimate of the PRC can be computed on the basis of a simple distributional assumption about the underlying decision values. The archive below contains a MATLAB implementation of this approach. For details, see Brodersen et al. (2010a) ICPR.
This code is hosted on MATLAB Central.
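The shape of such a smooth estimate is easy to reproduce under the simplest binormal assumption (a Python sketch for exposition, not the MATLAB implementation; the parameter names and the equal-variance Gaussians are illustrative choices): decision values are Gaussian within each class, negatives ~ N(0, 1) and positives ~ N(mu, 1), so sweeping a threshold t yields recall and precision analytically, giving a smooth curve instead of the empirical staircase.

```python
import math
import numpy as np

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def smooth_prc(mu, prevalence, thresholds):
    """Smooth PRC under a binormal assumption (illustrative sketch)."""
    recall, precision = [], []
    for t in thresholds:
        tpr = 1.0 - phi(t - mu)  # P(score > t | positive) = recall
        fpr = 1.0 - phi(t)       # P(score > t | negative)
        denom = prevalence * tpr + (1 - prevalence) * fpr
        if denom > 0:
            recall.append(tpr)
            precision.append(prevalence * tpr / denom)
    return np.array(recall), np.array(precision)

# Class separation mu = 2, balanced classes
rec, prec = smooth_prc(mu=2.0, prevalence=0.5,
                       thresholds=np.linspace(-4, 6, 201))
```

At a very low threshold everything is labelled positive, so recall approaches 1 and precision approaches the prevalence (0.5 here); at a high threshold precision approaches 1. The curve between these extremes is smooth by construction.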
14/04/2010 — Model-based feature construction is a multivariate-decoding approach that addresses the twin challenges of feature selection and biological interpretability by (i) inverting a dynamic systems model of neurophysiological data in a trial-by-trial fashion; (ii) training and testing a discriminative classifier on a feature space derived from the trial-wise model parameter estimates; and (iii) reconstructing how informative each model parameter was in separating the cognitive states of interest. The archive below contains the somatosensory dataset used for illustrating this approach in Brodersen et al. (2011) NeuroImage.
Download (78 KB)
19/02/2010 — Contemporary analysis pipelines for decoding brain states based on fMRI typically comprise thousands of lines of code. In contrast, this example illustrates the core principles behind fMRI classification in roughly two screenfuls of MATLAB code. The example contains a fully preprocessed single-subject dataset. The first analysis performs a simple classification, yielding a classification accuracy. The second analysis creates a probabilistic map of feature importance. Updated on 27/02/2012.
Download (33 MB)
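The same core principles fit comfortably in a few lines of Python too. The sketch below is not the MATLAB example from the archive: it uses synthetic data standing in for the preprocessed fMRI trials (the trial counts, voxel counts, and classifier choice are all illustrative) and runs leave-one-out cross-validation of a nearest-centroid classifier on a trials-by-voxels feature matrix.

```python
import numpy as np

# Synthetic stand-in for a preprocessed single-subject dataset:
# 40 trials x 100 voxels, two cognitive states, 10 informative voxels
rng = np.random.default_rng(0)
n_trials, n_voxels = 40, 100
y = np.repeat([0, 1], n_trials // 2)
X = rng.normal(0, 1, (n_trials, n_voxels))
X[y == 1, :10] += 1.0  # signal carried by the first 10 voxels

correct = 0
for i in range(n_trials):  # leave-one-out cross-validation
    train = np.arange(n_trials) != i
    c0 = X[train & (y == 0)].mean(axis=0)  # class centroids on
    c1 = X[train & (y == 1)].mean(axis=0)  # training trials only
    pred = int(np.linalg.norm(X[i] - c1) < np.linalg.norm(X[i] - c0))
    correct += pred == y[i]
accuracy = correct / n_trials
```

The essential discipline is visible even at this scale: the held-out trial never contributes to the centroids it is classified against, so `accuracy` is an estimate of generalization performance rather than of fit.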