Deep Learning is not a new area of Machine Learning research, but it has grabbed much of the attention of the research community as well as the industry in the last few years, due to its success in many complex learning tasks such as computer vision, NLP, and speech processing. Before 2010, it was not considered a very efficient method of learning, due to its inherent complexity and the lack of high-performance computing hardware. But in the last decade (thanks to Moore’s law), hardware resources have expanded rapidly, and with the introduction of GPUs it has become possible to perform in hours tasks that used to take weeks. The main objective of deep learning is to move Machine Learning closer to one of its original goals: Artificial Intelligence.
The most encouraging property of Deep Learning is its ability to learn multiple levels of representation and abstraction from raw data, which helps to make sense of data such as images, sound, and text. So it is no longer necessary to manually design different feature-extraction algorithms for different types of tasks. In recent years, there has been a resurgence in the field of Artificial Intelligence, and it has spread beyond the academic world. All the major players, such as Google, Microsoft, Facebook, and Baidu, are building their own research teams, making progress in deep learning almost exponential. Some of this can be attributed to the abundance of raw data generated by social network users, much of which needs to be analyzed, as well as to the cheap computational power available via GPUs.
If you wish to start working in deep learning and explore it further, it is not necessary to build your algorithms from the ground up. There are a lot of pre-built libraries, developed by various research groups to boost research potential. Some of the most used libraries are listed below:
Theano
TensorFlow
Caffe
Torch
Keras
Lasagne
Blocks
Pylearn2
Details of the Libraries:
The choice of library depends on your preferred programming language. If you are a Python expert, you can use Theano or TensorFlow. Theano is a Python framework developed by MILA, the lab run by Yoshua Bengio at the University of Montreal, for research and development of state-of-the-art deep learning algorithms. It is best described as a mathematical expression compiler: you symbolically define what you want, and the framework compiles your program to run efficiently on GPUs or CPUs.
Theano is really a low-level platform, and in practice it is rarely used directly. There are a lot of wrappers built on top of Theano to speed up the development process even further. Some of the popular wrappers are Keras, Lasagne, Blocks, and Pylearn2.
TensorFlow is another Python-based library, developed by Google. Keras has the additional advantage that it can also be used on top of TensorFlow.
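To illustrate what a wrapper like Keras buys you, here is a minimal sketch of defining a small classifier in a few lines; the layer sizes (a 100-dimensional input, one hidden layer) are arbitrary choices for the example, and the same code runs whether Keras is backed by Theano or TensorFlow.

```python
from keras.models import Sequential
from keras.layers import Dense, Input

# A tiny binary classifier: the backend handles all the low-level graph work
model = Sequential([
    Input(shape=(100,)),            # 100-dimensional input vectors
    Dense(32, activation='relu'),   # hidden layer
    Dense(1, activation='sigmoid')  # binary output
])
model.compile(optimizer='sgd', loss='binary_crossentropy')

print(model.count_params())  # prints 3265
```

The equivalent network written directly against Theano or TensorFlow would require manually defining weight variables, the forward pass, the loss, and the update rule.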
Caffe is developed by the Berkeley Vision and Learning Center; it was created by Yangqing Jia and is led by Evan Shelhamer. It is implemented in C++, which makes it one of the fastest deep learning libraries available: Caffe can process over 60M images per day on a single NVIDIA K40 GPU with AlexNet. The learning resources available for Caffe are rather limited, though. Caffe also provides interfaces for Python as well as MATLAB.
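Unlike the Python libraries above, Caffe networks are usually defined declaratively in prototxt configuration files rather than in code. As a small illustration, a single convolutional layer in Caffe’s prototxt format looks like this (the layer and blob names here are just conventional examples):

```protobuf
layer {
  name: "conv1"           # layer name
  type: "Convolution"
  bottom: "data"          # input blob
  top: "conv1"            # output blob
  convolution_param {
    num_output: 20        # number of filters
    kernel_size: 5        # 5x5 kernels
  }
}
```

The C++ engine (or the Python/MATLAB interface) then loads this definition and runs it, so the model description stays separate from the training code.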
Torch is developed at NYU and written in Lua. It claims to provide a MATLAB-like environment for machine learning algorithms. The rationale for building Torch in Lua is that Lua is easy to integrate with C: within a few hours’ work, any C or C++ library can become a Lua library. And since Lua itself is written in ANSI C, it can easily be compiled for arbitrary targets.