Theano (Python library)
If you want to experiment with neural networks and you know Python, my best recommendation is: use Theano.
Theano is not really about neural networks; it is a mathematical engine, something between Matlab and Mathematica.
It is close to Matlab because it uses vectorization (the compiled function operates on vectors).
It is close to Mathematica because you first define an expression (a function in analytical form). You can then compute analytical derivatives, and this is crucial for neural networks: the main thing you need is derivatives, and nobody wants to spend their time computing derivatives by hand, especially when a library can do it for you.
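As a minimal sketch (the expression here is just an illustration), symbolic differentiation looks like this:

    import theano.tensor as T
    from theano import pp

    x = T.dscalar('x')   # a symbolic scalar variable
    y = x ** 2           # an expression built from it
    gy = T.grad(y, x)    # the analytical derivative, itself an expression
    print(pp(gy))        # prints an (unsimplified) expression equivalent to 2 * x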
Once you have defined the expressions you need (for a neural network, the activation function and the gradient of the loss function), you can compile them with Theano, i.e. obtain an actual function that can be evaluated for given arguments. The compiled function is vectorized and very fast (though compilation usually takes some time), and your functions can be evaluated on a GPU as well.
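For example, a hand-written sigmoid activation can be compiled and then evaluated element-wise over a whole vector (the names below are illustrative):

    import numpy as np
    import theano
    import theano.tensor as T

    v = T.dvector('v')                    # symbolic vector of inputs
    sigmoid = 1 / (1 + T.exp(-v))         # activation as a symbolic expression
    f = theano.function([v], sigmoid)     # compilation happens here

    print(f(np.array([-1.0, 0.0, 1.0])))  # evaluated over the whole vector at once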
This makes it possible to define a new neural network in a few lines of code (essentially by defining its activation function and letting Theano derive the gradients), as in the sketch below. Impressive?
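To make this concrete, here is a toy sketch: logistic regression treated as a one-layer network, with the gradient step compiled into a single function. The data, learning rate, and variable names are all assumptions made up for the example:

    import numpy as np
    import theano
    import theano.tensor as T

    x = T.dmatrix('x')                        # batch of inputs
    t = T.dvector('t')                        # binary targets
    w = theano.shared(np.zeros(3), name='w')  # weights for 3 input features
    b = theano.shared(0.0, name='b')          # bias

    p = T.nnet.sigmoid(T.dot(x, w) + b)                    # the activation
    loss = -T.mean(t * T.log(p) + (1 - t) * T.log(1 - p))  # cross-entropy loss
    gw, gb = T.grad(loss, [w, b])                          # analytical gradients, for free

    # One compiled function performs a full gradient-descent step.
    train = theano.function([x, t], loss,
                            updates=[(w, w - 0.1 * gw), (b, b - 0.1 * gb)])

    X = np.random.randn(100, 3)
    y = (X[:, 0] > 0).astype(float)
    for _ in range(100):
        train(X, y)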
Documentation on Theano
GitHub
Examples (mostly about neural networks; probably the easiest way to start and to understand how good Theano is)
Even more examples (with less explanation) can be found here.