Deep learning, loosely inspired by the human brain and grounded in applied mathematics and statistics, has opened a new way to solve machine learning problems. When we talk about deep learning, we are implicitly referring to neural networks (NNs), simplified models of our neurons and how they work. Neural networks have been used to solve problems in natural language processing, computer vision, sentiment analysis, voice recognition, and autonomous vehicles.
There are some good reasons to implement deep learning:
However, using deep learning can be complicated. Common problems that we can face when using deep learning include:
There is a long list of possible platforms or libraries that you can use, but here I want to focus on some of the most common.
Machine learning platforms are useful because they reduce the complexity of getting started. The downside is a lack of flexibility: you must adapt to their features and options, which can make it difficult to fit them to the specific problems in your project.
Some examples of machine learning platforms include:
Deep learning is no different from the rest of software development: developers rely on libraries. A library is simply a set of functions pre-built by a development team so that they can be used easily, hide complexity, and remain reusable and extensible. Libraries are often maintained by a community, which can help you learn and get involved. Some common libraries for deep learning include:
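To make concrete what these libraries wrap for you, here is a minimal sketch of a forward pass through a tiny fully connected network, written in plain NumPy. The layer sizes, weights, and activation choices are illustrative, not taken from any particular library or model.

```python
import numpy as np

def relu(x):
    # Rectified linear unit: the most common hidden-layer activation
    return np.maximum(0, x)

def sigmoid(x):
    # Squashes the output into (0, 1), e.g. for binary classification
    return 1.0 / (1.0 + np.exp(-x))

def forward(x, w1, b1, w2, b2):
    """One forward pass: input -> hidden (ReLU) -> output (sigmoid)."""
    hidden = relu(x @ w1 + b1)
    return sigmoid(hidden @ w2 + b2)

# Illustrative random weights; a library would learn these via backpropagation
rng = np.random.default_rng(0)
w1, b1 = rng.normal(size=(3, 4)), np.zeros(4)
w2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

x = np.array([[0.5, -1.0, 2.0]])   # one input sample with 3 features
y = forward(x, w1, b1, w2, b2)
print(y.shape)  # (1, 1): one prediction per input row
```

A deep learning library adds, on top of this kind of computation, automatic differentiation, GPU execution, and ready-made layers and optimizers.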
There is no single answer, because every case is different. This graphic explains the most common neural network architectures for different use cases.
Here we provide some real-life examples of where deep neural networks are already being used.
Financial companies implementing new technologies to improve revenue or security are using neural networks and other machine learning techniques to achieve these goals.
Neural networks can be used to recognize objects and then turn that output into a decision based on the environment. This is what underlies the development of self-driving cars. Companies involved include:
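The perception-to-decision loop described above can be sketched in a few lines. The class names, scores, and the action table here are invented for illustration; a real autonomous vehicle uses a far richer perception stack and planner.

```python
import numpy as np

# Hypothetical classes a perception network might emit, and a toy
# mapping from the recognized object to a driving action
CLASSES = ["clear_road", "pedestrian", "stop_sign"]
ACTIONS = {"clear_road": "maintain_speed",
           "pedestrian": "brake",
           "stop_sign": "brake"}

def softmax(scores):
    # Convert raw network scores (logits) into probabilities
    e = np.exp(scores - scores.max())
    return e / e.sum()

def decide(logits):
    """Turn raw network outputs into a label and a driving action."""
    probs = softmax(np.asarray(logits, dtype=float))
    label = CLASSES[int(np.argmax(probs))]
    return label, ACTIONS[label]

# Toy logits where the "pedestrian" class scores highest
label, action = decide([0.2, 3.1, 0.5])
print(label, action)  # pedestrian brake
```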
Recurrent networks cannot retain information for long, which motivated adding a separate memory module that keeps information accessible and works as a "soft" RAM circuit inside the neural network.
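A minimal sketch of such a "soft" read, assuming content-based addressing: instead of a hard lookup at one address, a query is compared against every memory slot and the result is a differentiable weighted blend. The memory contents and query below are illustrative.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def soft_read(memory, query):
    """memory: (slots, dim); query: (dim,). Returns a blended (dim,) vector."""
    scores = memory @ query       # similarity of the query to each slot
    weights = softmax(scores)     # differentiable addressing weights
    return weights @ memory       # soft mixture of slot contents

memory = np.array([[1.0, 0.0],
                   [0.0, 1.0],
                   [0.5, 0.5]])
query = np.array([10.0, 0.0])     # strongly matches the first slot

# The read comes out close to the first slot's contents, but every
# slot contributes a little, which is what keeps the module trainable
print(soft_read(memory, query))
```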
Deep learning is closely related to classical problems from the world of physics; indeed, many of the tools used by scientists and physicists underpin machine learning. With it we can create and visualize new scenarios, such as generating different images of falling objects.
Prediction under uncertainty is one of the biggest challenges scientists are working on. Learning predictive forward models of the world, where machines are asked to predict the future the way our brain does, remains a significant challenge, although there has been some limited success using adversarial training. Some examples include:
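The core of adversarial training is two opposed objectives. As a hedged illustration, the sketch below computes both losses with a fixed toy "discriminator" on hand-picked samples; in a real setup both the generator and the discriminator are neural networks trained jointly by gradient descent.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def bce(p, target):
    """Binary cross-entropy of probability p against target 0 or 1."""
    return -(target * np.log(p) + (1 - target) * np.log(1 - p))

def discriminator(x, w=2.0):
    # Toy critic: maps a sample to the probability that it is "real".
    # In practice this would be a learned neural network.
    return sigmoid(w * x)

real = np.array([1.5, 2.0])    # samples drawn from the "true" data
fake = np.array([-1.0, 0.5])   # samples produced by the generator

# The discriminator wants real -> 1 and fake -> 0 ...
d_loss = bce(discriminator(real), 1).mean() + bce(discriminator(fake), 0).mean()
# ... while the generator wants its fakes mistaken for real (fake -> 1)
g_loss = bce(discriminator(fake), 1).mean()

print(d_loss > 0 and g_loss > 0)  # True: both players have room to improve
```

Training alternates between lowering `d_loss` and lowering `g_loss`, which is what pushes the generator's predictions toward the real data distribution.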