

Jx-WFST : A Wrapper Feature Selection Toolbox



"Toward Talent Scientist: Sharing and Learning Together" --- Jingwei Too



Introduction

  • This toolbox offers more than 40 wrapper feature selection methods

  • The A_Main file provides examples of how to apply these methods to a benchmark dataset

  • The source code of each method is written based on its pseudocode and original paper

  • The main goals of this toolbox are:

    • Knowledge sharing on wrapper feature selection
    • Assisting others in data mining projects

Usage

The main function, jfs, performs feature selection. You can switch the algorithm by changing 'pso' to another method's abbreviation (see the list of available methods below).

  • If you wish to use particle swarm optimization (see Example 1), write
FS = jfs('pso',feat,label,opts);
  • If you want to use the slime mould algorithm (see Example 2), write
FS = jfs('sma',feat,label,opts);

Input

  • feat : feature matrix (instances x features; see the shape sketch below)
  • label : label vector (instances x 1)
  • opts : parameter settings
    • N : number of solutions / population size (for all methods)
    • T : maximum number of iterations (for all methods)
    • k : k-value in k-nearest neighbor
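
A minimal sketch of the expected input shapes, using synthetic placeholder data (the random matrices below stand in for a real dataset, not one shipped with the toolbox):

% 100 instances, 20 features : feat is (instances x features)
feat  = rand(100, 20);
% binary labels : label is (instances x 1)
label = randi([0, 1], 100, 1);
% common parameter settings
opts.N = 10;      % population size
opts.T = 100;     % maximum number of iterations
opts.k = 5;       % k-value in k-nearest neighbor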

Output

  • Acc : accuracy of the validation model
  • FS : feature selection model (a struct containing several results; see the usage sketch below)
    • sf : indices of selected features
    • ff : selected features
    • nf : number of selected features
    • c : convergence curve
    • t : computational time (s)
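
For instance, the returned struct can be used as follows; this assumes FS was produced by an earlier jfs call, as in the examples below:

% restrict the dataset to the selected features
reduced = feat(:, FS.sf);
% report the subset size and the runtime
fprintf('Selected %d features in %.2f seconds\n', FS.nf, FS.t);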

Notation

Some methods have their own specific parameters (e.g., PSO, GA, DE); if you do not set them, they fall back to default values

  • you may open a method's .m file to view or change its parameters
  • you may use opts to set a method's parameters (see Example 1 or the list of methods below)
  • you may also change the fitness function in the jFitnessFunction file (a sketch follows this list)
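
As a rough illustration of the kind of fitness function a wrapper method uses, here is a hedged sketch; the name myFitness, the binary mask X, and the weights alpha and beta are illustrative assumptions, not the contents of the actual jFitnessFunction file:

function cost = myFitness(feat, label, X, opts)
% Hypothetical wrapper fitness : weighted sum of classification error
% and the fraction of selected features (alpha, beta are assumed weights)
alpha = 0.99;  beta = 0.01;
err   = 1 - jknn(feat(:, X == 1), label, opts);   % jknn returns accuracy
cost  = alpha * err + beta * (sum(X == 1) / numel(X));
end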

Example 1: Particle Swarm Optimization (PSO)

% Common parameter settings
opts.k  = 5;      % k-value in k-nearest neighbor
opts.N  = 10;     % number of solutions
opts.T  = 100;    % maximum number of iterations
% Parameters of PSO
opts.c1 = 2;      % cognitive factor
opts.c2 = 2;      % social factor
opts.w  = 0.9;    % inertia weight

% Load dataset
load ionosphere.mat;

% Ratio of validation data
ho = 0.2;
% Divide data into training and validation sets
HO = cvpartition(label,'HoldOut',ho); 
opts.Model = HO; 

% Perform feature selection 
FS = jfs('pso',feat,label,opts);

% Define index of selected features
sf_idx = FS.sf;

% Accuracy  
Acc = jknn(feat(:,sf_idx),label,opts); 

% Plot convergence
plot(FS.c); grid on;
xlabel('Number of Iterations'); 
ylabel('Fitness Value');
title('PSO');

Example 2: Slime Mould Algorithm (SMA)

% Common parameter settings
opts.k  = 5;      % k-value in k-nearest neighbor
opts.N  = 10;     % number of solutions
opts.T  = 100;    % maximum number of iterations

% Load dataset
load ionosphere.mat; 

% Ratio of validation data
ho = 0.2;
% Divide data into training and validation sets
HO = cvpartition(label,'HoldOut',ho); 
opts.Model = HO; 

% Perform feature selection 
FS = jfs('sma',feat,label,opts);

% Define index of selected features
sf_idx = FS.sf;

% Accuracy  
Acc = jknn(feat(:,sf_idx),label,opts); 

% Plot convergence
plot(FS.c); grid on; 
xlabel('Number of Iterations');
ylabel('Fitness Value'); 
title('SMA');

Example 3: Whale Optimization Algorithm (WOA)

% Common parameter settings
opts.k  = 5;      % k-value in k-nearest neighbor
opts.N  = 10;     % number of solutions
opts.T  = 100;    % maximum number of iterations
% Parameter of WOA
opts.b = 1;       % constant defining the shape of the logarithmic spiral

% Load dataset
load ionosphere.mat; 

% Ratio of validation data
ho = 0.2;
% Divide data into training and validation sets
HO = cvpartition(label,'HoldOut',ho); 
opts.Model = HO; 

% Perform feature selection 
FS = jfs('woa',feat,label,opts);

% Define index of selected features
sf_idx = FS.sf;

% Accuracy  
Acc = jknn(feat(:,sf_idx),label,opts); 

% Plot convergence
plot(FS.c); grid on; 
xlabel('Number of Iterations'); 
ylabel('Fitness Value'); 
title('WOA');

Requirement

  • MATLAB 2014 or above
  • Statistics and Machine Learning Toolbox

List of available wrapper feature selection methods

  • Note that the methods have been adapted so that they can be used for feature selection tasks
  • The extra parameters are the parameter(s) other than the population size and the maximum number of iterations
  • Click on the name of a method to view its extra parameter(s)
  • Use opts to set the specific parameter(s) (a comparison sketch follows the list below)
No. Abbreviation Name Year Extra Parameters
43 'mpa' Marine Predators Algorithm 2020 Yes
42 'gndo' Generalized Normal Distribution Optimization 2020 No
41 'sma' Slime Mould Algorithm 2020 No
40 'mrfo' Manta Ray Foraging Optimization 2020 Yes
39 'eo' Equilibrium Optimizer 2020 Yes
38 'aso' Atom Search Optimization 2019 Yes
37 'hgso' Henry Gas Solubility Optimization 2019 Yes
36 'hho' Harris Hawks Optimization 2019 No
35 'pfa' Path Finder Algorithm 2019 No
34 'pro' Poor And Rich Optimization 2019 Yes
33 'boa' Butterfly Optimization Algorithm 2018 Yes
32 'epo' Emperor Penguin Optimizer 2018 Yes
31 'tga' Tree Growth Algorithm 2018 Yes
30 'abo' Artificial Butterfly Optimization 2017 Yes
29 'ssa' Salp Swarm Algorithm 2017 No
28 'wsa' Weighted Superposition Attraction 2017 Yes
27 'sbo' Satin Bower Bird Optimization 2017 Yes
26 'ja' Jaya Algorithm 2016 No
25 'csa' Crow Search Algorithm 2016 Yes
24 'sca' Sine Cosine Algorithm 2016 Yes
23 'woa' Whale Optimization Algorithm 2016 Yes
22 'alo' Ant Lion Optimizer 2015 No
21 'hlo' Human Learning Optimization 2015 Yes
20 'mbo' Monarch Butterfly Optimization 2015 Yes
19 'mfo' Moth Flame Optimization 2015 Yes
18 'mvo' Multiverse Optimizer 2015 Yes
17 'tsa' Tree Seed Algorithm 2015 Yes
16 'gwo' Grey Wolf Optimizer 2014 No
15 'sos' Symbiotic Organisms Search 2014 No
14 'fpa' Flower Pollination Algorithm 2012 Yes
13 'foa' Fruitfly Optimization Algorithm 2012 No
12 'ba' Bat Algorithm 2010 Yes
11 'fa' Firefly Algorithm 2010 Yes
10 'cs' Cuckoo Search Algorithm 2009 Yes
09 'gsa' Gravitational Search Algorithm 2009 Yes
08 'abc' Artificial Bee Colony 2007 Yes
07 'hs' Harmony Search - Yes
06 'de' Differential Evolution 1997 Yes
05 'aco' Ant Colony Optimization - Yes
04 'acs' Ant Colony System - Yes
03 'pso' Particle Swarm Optimization 1995 Yes
02 'ga'/'gat' Genetic Algorithm - Yes
01 'sa' Simulated Annealing - Yes
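
Since every method shares the same jfs interface, comparing algorithms is straightforward. A minimal sketch, assuming feat, label, and opts are set up as in the examples above:

% compare the convergence of several methods on the same data
methods = {'pso', 'sma', 'woa'};
figure; hold on;
for i = 1 : numel(methods)
    FS = jfs(methods{i}, feat, label, opts);
    plot(FS.c);                                     % convergence curve
    fprintf('%s : %d features, %.2f s\n', methods{i}, FS.nf, FS.t);
end
legend(methods); grid on;
xlabel('Number of Iterations'); ylabel('Fitness Value');
hold off;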

More Repositories

1. Wrapper-Feature-Selection-Toolbox-Python (Python, 240 stars) : This toolbox offers 13 wrapper feature selection methods (PSO, GA, GWO, HHO, BA, WOA, etc.) with examples. It is simple and easy to implement.
2. EMG-Feature-Extraction-Toolbox (MATLAB, 78 stars) : This toolbox offers 40 feature extraction methods (EMAV, EWL, MAV, WL, SSC, ZC, etc.) for electromyography (EMG) signal applications.
3. EEG-Feature-Extraction-Toolbox (MATLAB, 69 stars) : This toolbox offers 30 types of EEG feature extraction methods (HA, HM, HC, etc.) for electroencephalogram (EEG) applications.
4. Binary-Grey-Wolf-Optimization-for-Feature-Selection (MATLAB, 31 stars) : Demonstration of how binary grey wolf optimization (BGWO) is applied to the feature selection task.
5. Advanced-Feature-Selection-Toolbox (Python, 29 stars) : This toolbox offers advanced feature selection tools. Several modifications, variants, enhancements, or improvements of algorithms such as GWO, FPA, SCA, PSO, and SSA are provided.
6. Machine-Learning-Toolbox (MATLAB, 22 stars) : This toolbox offers 8 machine learning methods, including KNN, SVM, DA, and DT, which are simple and easy to implement.
7. Filter-Feature-Selection-Toolbox (MATLAB, 21 stars) : Simple, fast, and easy to implement. The filter feature selection methods include Relief-F, PCC, TV, and NCA.
8. Whale-Optimization-Algorithm-for-Feature-Selection (MATLAB, 21 stars) : Application of the Whale Optimization Algorithm (WOA) to feature selection tasks.
9. Binary-Harris-Hawk-Optimization-for-Feature-Selection (MATLAB, 11 stars) : The binary version of Harris Hawks Optimization (HHO), called Binary Harris Hawks Optimization (BHHO), applied to feature selection tasks.
10. Binary-Differential-Evolution-for-Feature-Selection (MATLAB, 11 stars) : The binary version of Differential Evolution (DE), called Binary Differential Evolution (BDE), applied to feature selection tasks.
11. Ant-Colony-Optimization-for-Feature-Selection (MATLAB, 10 stars) : Implementation of ant colony optimization (ACO) without a predetermined number of selected features in feature selection tasks.
12. Neural-Network-Toolbox (MATLAB, 9 stars) : This toolbox contains 6 types of neural networks, which are simple and easy to implement.
13. Sine-Cosine-Algorithm-for-Feature-Selection (MATLAB, 8 stars) : Application of the Sine Cosine Algorithm (SCA) to feature selection tasks.
14. Binary-Particle-Swarm-Optimization-for-Feature-Selection (MATLAB, 7 stars) : A simple demonstration of how binary particle swarm optimization (BPSO) is used in feature selection problems.
15. Salp-Swarm-Algorithm-for-Feature-Selection (MATLAB, 7 stars) : Application of the Salp Swarm Algorithm (SSA) to feature selection tasks.
16. Equilibrium-Optimizer-for-Feature-Selection (MATLAB, 6 stars) : Application of the Equilibrium Optimizer (EO) to feature selection tasks.
17. Particle-Swarm-Optimization-for-Feature-Selection (MATLAB, 6 stars) : Application of Particle Swarm Optimization (PSO) to feature selection tasks.
18. Henry-Gas-Solubility-Optimization-for-Feature-Selection (MATLAB, 5 stars) : Application of Henry Gas Solubility Optimization (HGSO) to feature selection tasks.
19. Binary-Dragonfly-Algorithm-for-Feature-Selection (MATLAB, 5 stars) : Application of the Binary Dragonfly Algorithm (BDA) to feature selection tasks.
20. Ant-Colony-System-for-Feature-Selection (MATLAB, 4 stars) : Application of ant colony optimization (ACO) to feature selection problems.
21. Deep-Learning-Toolbox-Python (Python, 4 stars) : This toolbox offers several deep learning methods, which are simple and easy to implement.
22. Atom-Search-Optimization-for-Feature-Selection (MATLAB, 4 stars) : Application of Atom Search Optimization (ASO) to feature selection tasks.
23. Binary-Tree-Growth-Algorithm-for-Feature-Selection (MATLAB, 4 stars) : A feature selection algorithm, the Binary Tree Growth Algorithm (BTGA), applied to feature selection tasks.
24. Genetic-Algorithm-for-Feature-Selection (MATLAB, 4 stars) : A simple demonstration of how the genetic algorithm (GA) is used in feature selection problems.
25. Deep-Learning-Toolbox (MATLAB, 3 stars) : This toolbox offers convolutional neural networks (CNN) with k-fold cross-validation, which are simple and easy to implement.
26. Binary-Atom-Search-Optimization-for-Feature-Selection (MATLAB, 3 stars) : A feature selection algorithm, Binary Atom Search Optimization (BASO), applied to feature selection tasks.
27. Machine-Learning-Regression-Toolbox (Python, 2 stars) : This toolbox offers 7 machine learning methods for regression problems.
28. Machine-Learning-Toolbox-Python (Python, 2 stars) : This toolbox offers 6 machine learning methods, including KNN, SVM, LDA, and DT, which are simple and easy to implement.
29. JingweiToo (1 star)
30. Dimensionality-Reduction-Demonstration (MATLAB, 1 star) : Application of principal component analysis (PCA) for feature reduction.