Feature Relevance Analysis Tool

Feature selection is the task of identifying the features that are relevant to a machine learning model. A common approach is to use models that produce a sparse subset of the input features by penalizing the use of additional features (e.g. Lasso with L1 regularization). However, these models are often tuned to filter out redundancies in the input set, and they can produce unstable solutions, especially on high-dimensional data.
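As a rough illustration of that sparsity behaviour (a toy sketch using scikit-learn's Lasso; the data and the `alpha` value are invented for this example, not taken from FRI):

```python
import numpy as np
from sklearn.linear_model import Lasso

# Synthetic data (invented for illustration): feature 0 drives the target,
# feature 1 is a noisy, redundant copy of it, feature 2 is pure noise.
rng = np.random.default_rng(0)
x = rng.normal(size=200)
X = np.column_stack([x,
                     x + 0.3 * rng.normal(size=200),   # redundant copy
                     rng.normal(size=200)])            # irrelevant noise
y = x + 0.1 * rng.normal(size=200)

# The L1 penalty drives the coefficients of the redundant and the
# irrelevant feature towards zero, yielding a sparse feature subset.
w = Lasso(alpha=0.1).fit(X, y).coef_
print(w)
```

Which of two nearly identical features keeps a nonzero coefficient depends on small perturbations of the data, which is the instability the paragraph above refers to.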

FRI calculates relevance bounds for all input features. These bounds give rise to intervals which we call 'feature relevance intervals' (FRI). A feature's interval describes the range of contributions that feature is allowed to make when its weight is maximized and minimized independently of the others. This allows us to approximate the space of globally good solutions instead of relying on the single local solutions produced by the alternatives.
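A minimal sketch of the interval idea (this is not the FRI library's actual optimization; the toy data and the brute-force grid search below are assumptions made for illustration): with two perfectly redundant features, we enumerate all weight vectors that reach optimal loss at the sparse optimum's L1 budget, and record the minimal and maximal contribution of each feature.

```python
import numpy as np

# Toy data (invented): feature 1 is an exact duplicate of feature 0 and
# the target equals feature 0, so any weights (a, b) with a + b = 1 fit
# perfectly -- a sparse solver would pick one of them arbitrarily.
rng = np.random.default_rng(0)
x = rng.normal(size=100)
X = np.column_stack([x, x])
y = x

# FRI-style idea: over all weight vectors that (i) reach near-optimal
# loss and (ii) keep the L1 norm at the sparse optimum's level, record
# the minimal and maximal contribution |w_j| of each feature.
grid = np.linspace(-1.5, 1.5, 301)
m = np.mean(x ** 2)
feasible = []
for a in grid:
    for b in grid:
        mse = (a + b - 1.0) ** 2 * m     # closed form for this toy setup
        if mse <= 1e-8 and abs(a) + abs(b) <= 1.0 + 1e-9:
            feasible.append((abs(a), abs(b)))
feasible = np.array(feasible)

lo, hi = feasible.min(axis=0), feasible.max(axis=0)
print("relevance interval, feature 0:", (lo[0], hi[0]))
print("relevance interval, feature 1:", (lo[1], hi[1]))
```

Both features get the interval [0, 1]: each can carry the whole signal or none of it. A single sparse solution would hide this symmetry, whereas the intervals expose it.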

Lukas Pfannschmidt
PhD candidate

My research interests include feature selection, relevance determination and high performance computing.

Posts

Quick start guide In this guide I am going to describe how to use the FRI python library to analyse arbitrary datasets. (This guide is a …

Publications

Most existing feature selection methods are insufficient for analytic purposes as soon as high dimensional data or redundant sensor …

The increasing occurrence of ordinal data, mainly sociodemographic, led to a renewed research interest in ordinal regression, i.e. the …

Research on feature relevance and feature selection problems goes back several decades, but the importance of these areas continues to …

Biomedical applications often aim for an identification of relevant features for a given classification task, since these carry the …

Talks

Most existing feature selection methods are insufficient for analytic purposes as soon as high dimensional data or redundant sensor …