Feature Selection Algorithms

Petr Somol, Pavel Pudil
Abstract: 
Demonstrations of Feature Selection Algorithms.

(The demonstrations require the Flash plug-in, version 4 or later.)


Floating Search principle explanation

This presentation explains the principle of the classical Floating subset search algorithm in its backward form. Suppose our task is to select a subset out of a total of 6 items (features) so as to maximize an adopted criterion function. The Floating Search algorithm finds solutions for all subset sizes in a single run. For a full explanation see [5,7].
Track the algorithm by clicking the green right-arrow:
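For readers without the Flash plug-in, a minimal Python sketch of the backward floating principle follows. The criterion function J, which scores a feature subset, is a hypothetical placeholder, and the greedy bookkeeping is our simplification rather than the implementation behind the demo; see [5,7] for the precise algorithm.

def sbfs(features, J):
    # Sequential Backward Floating Selection: start from the full set,
    # repeatedly drop the currently least useful feature, and after each
    # drop try to "float" back by re-adding excluded features whenever
    # that yields a new best subset of the corresponding size.
    current = set(features)
    best = {len(current): (J(frozenset(current)), set(current))}

    while len(current) > 1:
        # Exclusion step: drop the feature whose removal hurts J the least.
        worst = max(current, key=lambda f: J(frozenset(current - {f})))
        current.remove(worst)
        score, k = J(frozenset(current)), len(current)
        if k not in best or score > best[k][0]:
            best[k] = (score, set(current))

        # Conditional inclusion: re-add the most promising excluded feature
        # for as long as that beats the best known subset of that size.
        while len(current) < len(features):
            pool = set(features) - current
            cand = max(pool, key=lambda f: J(frozenset(current | {f})))
            score, k = J(frozenset(current | {cand})), len(current) + 1
            if k in best and score <= best[k][0]:
                break
            current.add(cand)
            best[k] = (score, set(current))

    return best   # subset size -> (criterion value, best subset found)

The conditional inclusion steps are what distinguish floating search from plain sequential selection: they let the algorithm escape the nesting of subsets that traps simpler strictly top-down or bottom-up methods.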

(Fast) Branch & Bound basic principle explanation

This presentation explains the principle of the classical Branch & Bound subset search algorithm and of the novel Fast Branch & Bound. Suppose our task is to select 2 items out of a total of 6 items (features) so as to maximize an adopted criterion function. In the demonstrations, red denotes true criterion value computation, while yellow denotes the much faster criterion value prediction. For an explanation see [1].
Track the algorithm by clicking the green right-arrow:
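Again as a plug-in-free complement, here is a minimal Python sketch of the classical Branch & Bound principle. It assumes a hypothetical criterion function J that is monotone, i.e., its value cannot increase when a feature is removed; monotonicity is exactly what makes the cuts below lossless. The full algorithms, including the Fast variant, are described in [1].

def branch_and_bound(features, d, J):
    # Classical Branch & Bound: select d of n features maximizing a
    # criterion J assumed monotone (removing a feature cannot raise it).
    n = len(features)
    best_score, best_subset = float("-inf"), None

    def search(remaining, start, removed):
        nonlocal best_score, best_subset
        if removed == n - d:                      # candidate of target size d
            score = J(tuple(remaining))
            if score > best_score:
                best_score, best_subset = score, list(remaining)
            return
        # Bound: removing further features cannot raise a monotone J, so
        # cut the subtree once the intermediate value is no better than
        # the best complete solution found so far.
        if best_subset is not None and J(tuple(remaining)) <= best_score:
            return
        # Branch: remove one more feature; indices are kept increasing so
        # that every subset is visited exactly once.
        for i in range(start, n):
            if n - i < (n - d) - removed:         # too few features left to remove
                break
            child = [g for g in remaining if g != features[i]]
            search(child, i + 1, removed + 1)

    search(list(features), 0, 0)
    return best_subset, best_score

Note that this classical form computes the true criterion at every node it visits (the red evaluations in the demonstration); the Fast Branch & Bound replaces most of these computations with cheap predictions (the yellow ones).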

Classical (non-predictive) and Fast (predictive) Branch & Bound comparison

This presentation demonstrates the differences between the classical and the Fast Branch & Bound (FBB) subset search algorithms by showing their runs side by side. A sample problem of selecting 2 out of 8 features also illustrates the two possible types of errors of the FBB prediction mechanism. Red depicts true criterion evaluation, yellow depicts fast predictions. For more details see [1].
Start the demonstration by clicking the green right-arrow:
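The core of the prediction mechanism is easy to sketch. The simplified Python class below averages, per feature, the criterion drop observed whenever that feature's removal was evaluated exactly, and later uses the average to estimate child-node values cheaply; the class and method names are our own, and the actual bookkeeping in [1] is more elaborate.

class DropPredictor:
    # Per-feature averaged criterion drop, the core idea behind the
    # Fast Branch & Bound prediction mechanism (simplified sketch).
    def __init__(self):
        self.total = {}   # feature -> accumulated observed drop of J
        self.count = {}   # feature -> number of observations

    def learn(self, feature, parent_value, child_value):
        # Record the drop observed when 'feature' was removed and both
        # criterion values were truly computed.
        self.total[feature] = self.total.get(feature, 0.0) + (parent_value - child_value)
        self.count[feature] = self.count.get(feature, 0) + 1

    def predict(self, feature, parent_value):
        # Cheap estimate of J after removing 'feature'; None means no
        # statistics yet, so the true criterion must be computed instead.
        if not self.count.get(feature):
            return None
        return parent_value - self.total[feature] / self.count[feature]

The safeguard that keeps FBB optimal is that no subtree is ever cut on a predicted value alone: whenever a prediction falls below the current bound, the true criterion is computed before the cut is made, so the two error types shown in the demonstration cost only computing time, never the optimum.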

Floating vs. Oscillating Search

This animated picture illustrates the course of the Floating Search [7] and Oscillating Search [4] algorithms. The Sequential Forward Floating Search (SFFS) starts with an empty set and successively increases the number of selected features; each time it adds a feature, it tries to backtrack so as to find a better, previously unknown solution. The Oscillating Search algorithm successively modifies the current subset of d features for as long as the subset keeps improving. For details see [7,4].
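To make the contrast concrete, here is a minimal Python sketch of the oscillating idea. The parameter max_depth, the greedy one-feature-at-a-time realization of each swing, and the criterion function J are all simplifying assumptions of ours; the full algorithm is given in [4].

def oscillating_search(features, d, J, max_depth=3):
    # Oscillating Search sketch: keep the subset size at d while repeatedly
    # perturbing the current subset -- a "down-swing" (remove o features,
    # then add o back) followed by an "up-swing" (add o, then remove o).
    # The oscillation depth o grows when nothing improves, resets to 1
    # after an improvement, and the search stops once o exceeds its limit.
    current = set(features[:d])              # any initial subset of size d
    best_score = J(frozenset(current))
    o = 1
    while o <= min(max_depth, d, len(features) - d):
        improved = False
        for swing in (("remove", "add"), ("add", "remove")):
            trial = set(current)
            for step in swing:
                for _ in range(o):
                    if step == "remove":
                        f = max(trial, key=lambda x: J(frozenset(trial - {x})))
                        trial.remove(f)
                    else:
                        pool = set(features) - trial
                        f = max(pool, key=lambda x: J(frozenset(trial | {x})))
                        trial.add(f)
            score = J(frozenset(trial))
            if score > best_score:
                current, best_score = set(trial), score
                improved = True
        o = 1 if improved else o + 1
    return current, best_score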

References:
[1] Somol, P., P. Pudil, and J. Kittler, "Fast Branch & Bound algorithms for optimal feature selection", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 26, no. 7, pp. 900-912, 2004.
[2] Somol, P., and P. Pudil, "Feature selection toolbox", Pattern Recognition, vol. 35, no. 12, pp. 2749-2759, 2002.
[3] Somol, P., P. Pudil, and J. Grim, "Branch & Bound algorithm with partial prediction for use with recursive and non-recursive criterion forms", Advances in Pattern Recognition - ICAPR 2001, Proceedings, Heidelberg, Springer, pp. 425-434, March 2001.
[4] Somol, P., and P. Pudil, "Oscillating search algorithms for feature selection", Proceedings of the 15th International Conference on Pattern Recognition, Los Alamitos, IEEE Computer Society, pp. 406-409, September 2000.
[5] Somol, P., P. Pudil, J. Novovičová, and P. Paclík, "Adaptive floating search methods in feature selection", Pattern Recognition Letters, vol. 20, no. 11/13, pp. 1157-1163, 1999.
[6] Novovičová, J., P. Pudil, and J. Kittler, "Divergence based feature selection for multimodal class densities", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 18, no. 1, pp. 218-223, 1996.
[7] Pudil, P., J. Novovičová, and J. Kittler, "Floating search methods in feature selection", Pattern Recognition Letters, vol. 15, pp. 1119-1125, 1994.