1) git clone https://github.com/scikit-learn/scikit-learn

2) cd scikit-learn/

3) git fetch origin refs/pull/3204/head:mlp

4) git checkout mlp

Creating an MLP classifier is easy. First, import the classifier from scikit-learn; then initialize it by executing these statements,

from sklearn.neural_network import MultilayerPerceptronClassifier

clf = MultilayerPerceptronClassifier()

If you'd like to have 3 hidden layers of sizes 150-100-50, create an MLP object using this statement,

clf = MultilayerPerceptronClassifier(n_hidden=[150, 100, 50])

Training and testing are done the same way as any learning algorithm in scikit-learn.
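As a sketch of that workflow: the example below uses `MLPClassifier`, the name this implementation was eventually merged under in released scikit-learn (with `hidden_layer_sizes` instead of `n_hidden`); on the PR branch above, the class is `MultilayerPerceptronClassifier`.

```python
# Standard scikit-learn fit/predict workflow, shown with MLPClassifier
# (the merged name in released scikit-learn; the PR branch uses
# MultilayerPerceptronClassifier and the n_hidden parameter instead).
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(50,), max_iter=300, random_state=0)
clf.fit(X_train, y_train)            # train, as with any scikit-learn estimator
predictions = clf.predict(X_test)    # predict labels for unseen samples
print("test accuracy:", clf.score(X_test, y_test))
```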

In addition, you can tune many parameters for your classifier. Some of the interesting ones are,

1) the **algorithm** parameter, which lets users select the method for optimizing the neural network weights: either stochastic gradient descent (SGD) or l-bfgs; and

2) the **max_iter** parameter, which lets users set the maximum number of iterations the network can run.
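A hedged sketch of those two parameters, again using the merged scikit-learn API, where `algorithm` became `solver`; on the PR branch the keyword would be `algorithm`.

```python
# Selecting the weight optimizer and capping iterations. In released
# scikit-learn the parameter is called solver ('lbfgs' or 'sgd'); the
# PR branch calls it algorithm.
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=200, random_state=0)

# l-bfgs is a batch quasi-Newton optimizer, often fast on small datasets
clf = MLPClassifier(solver="lbfgs", max_iter=400, random_state=0)
clf.fit(X, y)
print("iterations used:", clf.n_iter_)  # actual count, capped by max_iter
```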

After training **mlp**, you can view the minimum cost achieved by printing **mlp.cost_**. This gives an idea of how well the algorithm has trained.
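For example, assuming the merged scikit-learn API, where this attribute is named `loss_` rather than the PR branch's `cost_`:

```python
# Inspecting the final training loss after fitting. The PR branch
# exposes this as cost_; released scikit-learn names it loss_.
# Lower values indicate a better fit to the training data.
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=200, random_state=0)
clf = MLPClassifier(solver="lbfgs", max_iter=200, random_state=0).fit(X, y)

print("final training loss:", clf.loss_)
```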

The implementation passed rigorous tests and achieved the expected performance on the MNIST dataset.

An **MLP** with one hidden layer of 200 neurons and 400 iterations achieved strong results on the MNIST benchmark compared to the other algorithms, as shown below,

Classification performance

======================  ==========  =========  ==========
Classifier              train-time  test-time  error-rate
======================  ==========  =========  ==========
Multi layer Perceptron  655.5s      0.30s      0.0169
nystroem_approx_svm     125.0s      0.91s      0.0239
ExtraTrees              79.9s       0.34s      0.0272
fourier_approx_svm      148.9s      0.60s      0.0488
LogisticRegression      68.9s       0.14s      0.0799
======================  ==========  =========  ==========


Does it support auto-tuning network parameters with gridsearchcv?

Oh yes it does. In fact, I have an example file that tests the results of choosing between different numbers of hidden neurons using GridSearchCV.
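A minimal sketch of that kind of search, written against the merged `MLPClassifier` API (where the layer-size parameter is `hidden_layer_sizes`; on the PR branch it is `n_hidden`):

```python
# Tuning the hidden-layer size with GridSearchCV. Uses the merged
# scikit-learn names (MLPClassifier, hidden_layer_sizes); the PR
# branch uses MultilayerPerceptronClassifier and n_hidden.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=200, random_state=0)

param_grid = {"hidden_layer_sizes": [(25,), (50,), (100,)]}
search = GridSearchCV(
    MLPClassifier(solver="lbfgs", max_iter=300, random_state=0),
    param_grid,
    cv=3,
)
search.fit(X, y)
print("best size:", search.best_params_["hidden_layer_sizes"])
```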

I'd love some help with the implementation of this in the Anaconda distribution and IPython notebook. I have run the four steps above in my terminal; all work fine.

I then go to an IPython notebook and add "from sklearn.neural_network import MultilayerPerceptronClassifier".

I get the error "ImportError: cannot import name MultilayerPerceptronClassifier". I assume this is because it is not configured properly with Anaconda.

Help, please.

Hi Myles, I am facing the same problem. Have you found a solution to it?

Thank you!


Hi all, sorry for replying this late! Indeed, this doesn't work all the time; however, it is always possible to copy the files to your working directory and import them directly. I will soon have these implementations uploaded to the pip repository so you can install them using "pip install"!
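A quick sketch of that copy-and-import workaround: place the copied files in a local directory and put it at the front of Python's module search path. The directory name below is a hypothetical placeholder.

```python
# Make Python look in a local directory (containing the copied PR
# files) before the installed sklearn. "./mlp_files" is a hypothetical
# placeholder path, not a name used by the PR itself.
import sys

local_copy = "./mlp_files"      # hypothetical directory with the copied files
sys.path.insert(0, local_copy)  # searched before installed packages

print(sys.path[0])
```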

Cheers.

Hi, I also tried the above commands and was not able to install it. I checked, and the files seem to be in the correct sklearn folder. Do you know why this sometimes doesn't work? Is the pip install available now? Thanks!
