A Comparison of Deep Learning Frameworks
The more you know, the more you grow! Humankind wants to know more, and thanks to super-fast-moving technologies we constantly have better approaches and tools than ever. Whether you are aware of it or not, there is a relatively new AI technique in our lives: deep learning. This method has been around for quite some time, is evolving fast and is now shaping our future. Deep learning is a branch of machine learning that seeks to imitate the neural activity of the human brain with artificial neural networks, so that they can learn to identify characteristics of digital data such as images or speech. Deep learning enables us to find solutions to very complex problems. Today there are quite a few deep learning frameworks, libraries and tools for developing deep learning solutions.
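To make the idea of imitating neural activity concrete, here is a minimal sketch of a single artificial neuron in plain Python. This is an illustration of the general concept, not code from any of the frameworks discussed below; the function name and the example numbers are made up for demonstration.

```python
import math

def neuron(inputs, weights, bias):
    """A single artificial neuron: a weighted sum of its inputs passed
    through a sigmoid activation, loosely mimicking how a biological
    neuron fires more strongly for some input patterns than others."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid squashes the result into (0, 1)

# Deep networks stack many such neurons into layers; training adjusts
# the weights and biases so the outputs learn to identify features
# (edges in an image, phonemes in speech) from the raw data.
print(neuron([0.5, 0.8], [1.2, -0.7], 0.1))
```

Every framework in this article ultimately automates two things this sketch leaves out: wiring millions of these units together, and computing the weight updates that make them learn.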
There is so much to discover with deep learning frameworks, and naturally all the big players of the tech industry want to take the lead in this “exciting” market. Tech giants like Google, Amazon, Facebook and Microsoft are among the companies investing heavily in the field of deep learning. They either develop their own deep learning frameworks or acquire and support existing ones. No surprise the competition is fierce. There are dozens of deep learning tools available, and we will look into some of the most widely used frameworks.
Theano
Theano is where the whole story began. It was created in 2007 by Yoshua Bengio and the research team at the University of Montreal and was the first widely used DL (deep learning) framework. Theano is a Python library, extremely fast and powerful, but it has been criticised for being a low-level framework. The team behind Theano announced in 2017 that, after the latest release, there would be no further development. Au revoir, Theano!
Torch
Torch is an open source DL framework with support for algorithms that run primarily on GPUs. It is written in the scripting language Lua, is very flexible and makes it easy to build models. It has been used by Google, Twitter and Facebook. PyTorch, a new Python implementation of Torch, has recently brought a lot of popularity to this framework.
TensorFlow
TensorFlow is described on its homepage as an open source software library for numerical computation using data flow graphs. It was developed by the Google Brain team to support machine learning and deep learning research.
As of today it is the most commonly used deep learning framework. It is backed by a big community of tech firms, experts and technology enthusiasts worldwide, even though it has been criticised as too complex to use without higher-level interfaces. Another common criticism is that, according to many experts, it runs much slower than other major frameworks. Nevertheless, Google released TensorFlow 1.5 in January 2018, and its new features and enhancements offer developers a faster programming style. TensorFlow has been adopted by many other big companies such as eBay, Coca-Cola, Twitter and Uber, and there are dozens of applications of the framework. It was open sourced in 2015 to attract more talent to help improve the framework and to grow the amount of ready-made material. The framework is written in C++ and Python and has a large amount of ready-to-use documentation. It seems that TensorFlow will remain the most widely used framework in the DL landscape for at least the next few years.
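The phrase “data flow graphs” means that, in TensorFlow's original style, you first declare a graph of operations and only then execute it with concrete data. The toy sketch below illustrates that define-then-run idea in plain Python; it is not the TensorFlow API, and the `Node` class and placeholder convention are invented here for illustration.

```python
class Node:
    """One operation in a toy dataflow graph: it stores its inputs
    (other nodes, constants, or placeholder names) and a function
    that combines their values when the graph is eventually run."""
    def __init__(self, op, *inputs):
        self.op, self.inputs = op, inputs

    def run(self, feed):
        # Recursively evaluate inputs: nested nodes run themselves,
        # placeholder names are looked up in the feed, constants pass through.
        vals = [n.run(feed) if isinstance(n, Node) else feed.get(n, n)
                for n in self.inputs]
        return self.op(*vals)

# Step 1: build the graph -- nothing is computed yet.
x = "x"  # a placeholder, to be fed with real data at run time
y = Node(lambda a, b: a + b,
         Node(lambda a, b: a * b, x, 3), 1)  # y = 3x + 1

# Step 2: execute the graph with concrete data, mirroring the
# graph-then-session style of the TensorFlow 1.x releases above.
print(y.run({"x": 4}))
```

Separating graph construction from execution is what lets a framework optimise and distribute the computation before any data flows through it; critics of TensorFlow's complexity are largely reacting to this extra indirection.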
Keras
Keras was developed as an easy-to-use interface that simplifies building neural networks quickly. It is written in Python and can run on top of TensorFlow and Theano. We are likely to hear more about Keras in 2018, as Google will be including Keras in the coming TensorFlow releases.
Caffe and Caffe2
Caffe was built with a priority on expression, speed and modularity. It was developed by Berkeley Artificial Intelligence Research. If you are looking to model a convolutional neural network (CNN), Caffe is your go-to framework, since its main application is modelling CNNs. Following the popularity of Caffe, Facebook introduced Caffe2 in 2017. The Caffe2 framework lets users build demo applications from pre-trained models without extra hassle. Caffe2 is likely to be the successor of Berkeley's Caffe.
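The defining operation of a CNN is the convolution: sliding a small kernel over an image and taking a dot product at each position. A minimal pure-Python sketch of a 2-D convolution is shown below; it is for illustration only — a framework like Caffe implements the same operation far more efficiently on the GPU, and the example image and kernel here are made up.

```python
def conv2d(image, kernel):
    """Slide a small kernel over a 2-D image (lists of lists) and
    compute the dot product at each valid position -- the core
    operation of a convolutional neural network."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [[sum(image[i + a][j + b] * kernel[a][b]
                 for a in range(kh) for b in range(kw))
             for j in range(out_w)]
            for i in range(out_h)]

# A tiny vertical-edge detector: it responds only where pixel
# values change from left to right, as at the 0 -> 1 boundary here.
image = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]
kernel = [[-1, 1],
          [-1, 1]]
print(conv2d(image, kernel))
```

Stacking many such learned kernels, layer upon layer, is what lets CNNs recognise progressively more abstract image features.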
PyTorch
PyTorch is a Python version of the Torch framework, released by Facebook in early 2017. It uses dynamic computational graphs, which help significantly when analysing unstructured data. PyTorch has a custom GPU memory allocator that makes DL models more memory efficient. It is a simple framework that offers high speed and flexibility.
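“Dynamic computational graphs” means the graph is built on the fly, as operations execute, rather than declared up front — so the graph can change shape from one input to the next, which suits variable-length, unstructured data. The sketch below is a deliberately tiny, hypothetical version of this define-by-run idea with scalar automatic differentiation; it is not the PyTorch API.

```python
class Value:
    """A scalar that records, as each operation runs, how to send
    gradients back to its inputs. The graph thus exists only as a
    by-product of running the forward computation."""
    def __init__(self, data, parents=(), grad_fn=None):
        self.data, self.parents, self.grad_fn = data, parents, grad_fn
        self.grad = 0.0

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        out.grad_fn = lambda g: (g * other.data, g * self.data)  # product rule
        return out

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        out.grad_fn = lambda g: (g, g)  # addition passes gradients through
        return out

    def backward(self, g=1.0):
        self.grad += g
        if self.grad_fn:
            for parent, pg in zip(self.parents, self.grad_fn(g)):
                parent.backward(pg)

# The graph for y = x*x + x exists only because these lines ran:
x = Value(3.0)
y = x * x + x
y.backward()
print(x.grad)  # dy/dx = 2x + 1 = 7 at x = 3
```

Because nothing is compiled ahead of time, you can use ordinary Python control flow (loops, branches) inside the model — the flexibility the section above credits PyTorch for.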
PyTorch is used by many tech companies, such as Twitter, Facebook and Nvidia, to train DL models. It is developer-friendly and very efficient. Its main disadvantages are that it is still a comparatively new beta release and that community support is not yet extensive. However, with its growing momentum it has the potential to challenge TensorFlow in the upcoming years.
MXNet
Apache Incubator's MXNet is a very flexible and efficient deep learning library, supported by Microsoft and Amazon. MXNet supports many languages, such as JavaScript, R and Go as well as Python and C++. It offers high performance and distinctive scalability, and it performs very well at image classification. Despite these advantages, MXNet has a small community and very limited ready-to-use documentation.
The story of MXNet doesn't end there, though. In October 2017, Amazon and Microsoft launched a new deep learning interface called Gluon, which has been implemented in MXNet. This move made it clear that Google's TensorFlow is not the only big player in this game. The duo claims that the Gluon interface simplifies the development of DL models without giving up training speed. It offers easier coding and more flexibility, in addition to dynamic graphs and high performance.
Deeplearning4j
Deeplearning4j is the first commercially oriented, open source, distributed deep learning library written for Java and Scala. DL4J supports importing neural models from other major libraries and can run on top of some very popular big data tools such as Apache Spark. If Java is your programming language, DL4J is the framework to go for.
Microsoft Cognitive Toolkit
The Microsoft Cognitive Toolkit, also known as CNTK, is Microsoft's open source, commercial-grade DL framework. CNTK has sophisticated algorithms for handling big datasets. It supports 64-bit Linux and Windows operating systems and offers high scalability and performance.
As we can see, every deep learning framework or library has its own characteristics, offers unique features and presents different solutions.
There are hundreds of applications of artificial intelligence and DL today. For example, Amazon uses deep learning tools to recommend to customers what to buy, and these recommendations cover nearly a third of what they actually purchase. The company also runs its physical centres with the help of nearly 80,000 robots. Microsoft uses voice recognition in Bing's real-time language translations and continues to train more DL algorithms with the Microsoft Cognitive Toolkit. YouTube runs automated video recognition for content classification so that viewers can be protected from inappropriate surprises. Google Home Assistant knows who you are and what you like by recognising your voice, and Gmail understands the text in your inbox and suggests tailor-made responses. Uber adapts DL methods to provide real-time prediction systems for more reliable transportation.
From self-driving cars to tailor-made movie recommendations, it appears that the fast-moving landscape of deep learning has much more to offer. The big tech companies will continue their battle to take the lead in this race, and no doubt we are set to see more mind-blowing developments in our lives.