
AI, Machine Learning, Deep Learning, and Neural Networks

(MIT Ray and Maria Stata Center, Jenny Fowter)

 

Artificial Intelligence: Fueling the Next Wave of the Digital Era

 
 

- The Internet of Sensing (IoS)

In the past few years, the field of artificial intelligence (AI) has made significant progress in almost all of its standard subfields, including vision, speech recognition and generation, natural language processing (understanding and generation), image and video generation, multi-agent systems, robotics, planning, decision making, and the integration of vision and motor control.

In addition, breakthrough applications have emerged in many fields, such as gaming, medical diagnosis, logistics systems, autonomous driving, language translation, and interactive personal assistants.

AI, together with Augmented Reality (AR), Virtual Reality (VR), 5G technology, and hyperautomation, is considered one of the main drivers of the Internet of Sensing (IoS), which is expected to grow significantly over the next decade.

The Internet of Sensing (IoS) refers to technology that extends our senses beyond our bodies. It allows us to experience the world around us through multiple senses, including enhanced vision, hearing, touch, and smell.

The Internet of Things (IoT) connects the digital and physical worlds. Sensors monitor physical conditions such as temperature, motion, or other environmental changes; an actuator then receives the signal from a sensor and responds to the change.
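
As a rough illustration of that sensor-to-actuator loop, the minimal Python sketch below polls a simulated temperature sensor and switches a simulated fan; read_temperature() and set_fan() are hypothetical placeholders for whatever hardware interface a real device would expose.

    import random
    import time

    THRESHOLD_C = 28.0  # turn the fan on above this temperature

    def read_temperature():
        # Placeholder for a real sensor driver (e.g., an I2C or GPIO read);
        # here we simply simulate a reading around room temperature.
        return 25.0 + random.uniform(-5.0, 5.0)

    def set_fan(on):
        # Placeholder for a real actuator command.
        print("fan ->", "ON" if on else "OFF")

    # Sensor-actuator loop: monitor a physical condition and respond to changes.
    for _ in range(5):
        temperature = read_temperature()
        print(f"temperature: {temperature:.1f} C")
        set_fan(temperature > THRESHOLD_C)
        time.sleep(1)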

  

- AI: The Science of Making Inanimate Objects Smart

From smartphones to chatbots, AI is already ubiquitous in our digital lives. The momentum behind AI is building, in part because computers can collect vast amounts of data about our everyday preferences, purchases, and activities. AI researchers use all this data to train machine learning (ML) models and predict what we want or dislike.

AI is a general term referring to technology that enables computers to mimic human behavior. It is an umbrella term for a group of technologies. AI deals with computer models and systems that perform human-like cognitive functions, such as reasoning and learning. AI software is able to learn from experience, distinguishing it from more traditional pre-programmed and deterministic software. 

AI does not necessarily mean giving machines intelligence or consciousness in the same way that humans are intelligent and conscious. It simply means that the machine is able to solve a specific problem or class of problems. 

AI helps solve problems by performing tasks involving skills such as pattern recognition, prediction, optimization, and recommendation generation based on data such as video, images, audio, numbers, text, and more. 
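
To make one of these skills concrete, the short sketch below produces naive recommendations from a tiny, made-up user-item rating matrix: a user's unrated items are scored according to the ratings of similar users. The data and the similarity-weighted approach are illustrative assumptions, not a production recommender.

    import numpy as np

    # Rows = users, columns = items; 0 means "not rated yet" (made-up data).
    ratings = np.array([
        [5.0, 4.0, 0.0, 1.0],
        [4.0, 5.0, 1.0, 0.0],
        [1.0, 0.0, 5.0, 4.0],
    ])

    def recommend(user, k=1):
        # Cosine similarity between the target user and every other user.
        norms = np.linalg.norm(ratings, axis=1)
        sims = ratings @ ratings[user] / (norms * norms[user] + 1e-9)
        sims[user] = 0.0
        # Score items by similarity-weighted ratings, then hide items already rated.
        scores = sims @ ratings
        scores[ratings[user] > 0] = -np.inf
        return np.argsort(scores)[::-1][:k]

    print("recommended item(s) for user 0:", recommend(0))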


 

- The AI Resurgence

AI and ML principles have been around for decades. The recent popularity of AI is a direct result of two factors. First, AI/ML algorithms are computationally intensive, and the availability of cloud computing has made it practical to run them at scale. Second, training AI/ML models requires a great deal of data, and the availability of big data platforms and digital data has increased the effectiveness of AI/ML, allowing it to match or exceed human performance in many applications.

The speed, availability, and sheer scale of this infrastructure enable bolder algorithms to tackle more ambitious problems. Not only is the hardware faster, and sometimes enhanced with specialized processor arrays such as GPUs, it is also available as a cloud service. What used to run in specialized labs with access to supercomputers can now be deployed to the cloud at little cost and with far less effort.

This has democratized access to the hardware platforms needed to run AI, allowing startups to proliferate. Additionally, emerging open source technologies, such as Hadoop, allow for faster development of scaled AI techniques applied to large and distributed datasets. 

Larger players are investing heavily in various AI technologies. These investments go beyond simple R&D extensions of existing products and are often strategic. For example, IBM's investment in Watson and Google's investments in driverless cars, deep learning (through its DeepMind acquisition), and even quantum computing promise to significantly improve the efficiency of machine learning algorithms.

 

- The Future of AI

Technology is changing the way humans and machines work together. People rely on machines to help them make smarter decisions, expand range and access, and improve safety and productivity. This new era of human-machine collaboration relies on trust and understanding, allowing each member of the team to do what it does best. The autonomous future is not a future without people; it is a more human one.

AI has exploded over the past few years, especially since 2015. Much of this has to do with the widespread availability of GPUs that make parallel processing faster, cheaper, and more powerful. It also has to do with the simultaneous arrival of practically unlimited storage and a flood of data of every stripe (the whole big data movement): images, text, transactions, mapping data, you name it.

Artificial intelligence has various applications in today's society. It has become critical in this day and age as it can effectively solve complex problems in multiple industries such as healthcare, entertainment, finance, education, etc. AI is making our daily lives more comfortable and faster. 

AI technologies are already changing how we communicate, how we work and play, and how we shop and manage our health. For businesses, AI has become an absolute necessity for creating and maintaining a competitive advantage.

As AI permeates our daily lives and aims to make our lives easier, it will be interesting to see how quickly it develops and evolves, enabling different industries to evolve. Science fiction is slowly becoming a reality as new technological developments appear every day. Who knows what tomorrow will bring?

 

- AI Is Evolving to Process the World Like Humans

AI is beginning to develop on its own. Researchers have created software that draws on concepts from Darwin's theory of evolution, including "survival of the fittest," to build AI programs that improve from one generation to the next without human input. AI offers a wide range of technological capabilities that can be applied across all industries, profoundly changing the world around us.

As AI researchers work to develop and improve their machine learning and AI algorithms, the ultimate goal is to recreate the human brain. The most advanced AI imaginable would be able to process the world around us through typical sensory input while leveraging the storage and computing power of supercomputers.

With this ultimate goal in mind, it is not hard to see where AI is heading as it continues to evolve.

Deep learning AI is able to interpret patterns and draw conclusions. Essentially, it is learning how to mimic the way humans process the world around us. That said, AI today still mostly requires typical computer input, such as encoded data; developing AI that can process the world through audio, visual, and other sensory input is a daunting task.

 

- The Relationship Between AI, ML, DL, and Neural Networks

Both machine learning (ML) and deep learning (DL) are subsets of AI, but the terms are often used interchangeably. ML is the largest component of AI, and most AI-based products or services on the market would not be possible without ML or DL.

Both technologies were introduced decades ago, but it is only in the past few years that their applications have become widespread. AI may be the last invention humans need to make. These three terms -- AI, ML, and DL -- and their relationships are critical to understand for everyone, from sales teams explaining the services they provide to data scientists deciding which type of model to use.

While AI, ML, and DL each have their own definitions, data requirements, levels of sophistication, transparency, and limitations, how they are defined and how they relate to each other depends largely on the context in which you view them.

  • Artificial Intelligence (AI): imitating the intelligence or behavioral patterns of humans or any other biological entity, i.e., machines mimicking human intelligence through prediction, classification, learning, planning, reasoning, and/or perception.
  • Machine Learning (ML): a technique in which computers "learn" from data without being explicitly programmed with a fixed set of rules. This approach is primarily based on training a model from a dataset. Machine learning is a subset of artificial intelligence that combines mathematics and statistics in order to learn from the data itself and improve with experience.
  • Deep Learning (DL): a technique for performing machine learning inspired by our brain's own network of neurons, using networks that can adapt to new data. Deep learning is a subset of ML that uses neural networks to solve increasingly complex challenges such as image, audio, and video classification (a short sketch contrasting classical ML with a small neural network follows this list).
  • Neural Networks: a beautiful, biology-inspired programming paradigm that enables computers to learn from observational data; deep learning is a set of powerful techniques for learning in neural networks. Neural networks and deep learning currently provide the best solutions for many problems in image recognition, speech recognition, and natural language processing.
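
A minimal sketch of how this hierarchy looks in practice, assuming scikit-learn is installed: the same small image-classification task is solved first with a classical ML model (logistic regression) and then with a simple neural network (a multi-layer perceptron), the kind of model that deep learning scales up.

    from sklearn.datasets import load_digits
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    # Small 8x8 handwritten-digit images, flattened into 64-dimensional vectors.
    X, y = load_digits(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Classical machine learning: a linear model learned from the data.
    ml_model = LogisticRegression(max_iter=2000)
    ml_model.fit(X_train, y_train)
    print("logistic regression accuracy:", ml_model.score(X_test, y_test))

    # A small neural network: the building block that deep learning stacks deeper.
    nn_model = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
    nn_model.fit(X_train, y_train)
    print("neural network accuracy:", nn_model.score(X_test, y_test))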

 

Machine learning, while widely considered a form of AI, aims to let machines learn from data rather than from explicit programming. A typical use is predicting outcomes, much as we recognize a red octagonal sign with white letters and know to stop.

AI, on the other hand, can determine the best course of action: how to stop, when to stop, and so on. Simply put, the difference is this: machine learning predicts, artificial intelligence acts.
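
As a toy illustration of that "predict, then act" split (the probabilities, threshold, and speed rule here are invented for the example), a model's prediction is turned into behavior by a simple decision rule:

    # ML: prediction -- a stand-in for a real trained classifier that estimates
    # the probability that the sign ahead is a stop sign.
    def predict_stop_sign_probability(image_features):
        return 0.97

    # AI: action -- turning the prediction into a driving decision.
    def drive_decision(image_features, speed_kmh):
        p_stop = predict_stop_sign_probability(image_features)
        if p_stop > 0.9:
            return "brake now" if speed_kmh > 30 else "slow down and stop at the line"
        return "continue"

    print(drive_decision(image_features=[0.1, 0.8, 0.3], speed_kmh=45))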

 

[Artificial Intelligence System - Deloitte]

- The Rise of Machine Learning (ML)

Machine learning (ML) is an interdisciplinary field that uses statistics, probability, and algorithms to learn from data and provide insights that can be used to build intelligent applications.

ML is the most common way AI is applied today. The technology is based on the idea that we should be able to give machines access to data and let them learn for themselves.

ML is a technique that uses data to train software models. The model learns from training cases, and we can then use the trained model to make predictions on new data cases. 
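
That train-then-predict workflow, sketched with scikit-learn on one of its built-in datasets (any comparable dataset and model would work the same way):

    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    # Training cases: measured flowers with known species labels.
    X, y = load_iris(return_X_y=True)
    X_train, X_new, y_train, y_new = train_test_split(X, y, test_size=0.2, random_state=0)

    # Train a model from the training cases.
    model = RandomForestClassifier(random_state=0)
    model.fit(X_train, y_train)

    # Use the trained model to make predictions on new data cases.
    predictions = model.predict(X_new)
    print("accuracy on new cases:", (predictions == y_new).mean())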

ML provides the foundation for AI. Two important breakthroughs have led to the emergence of machine learning, which is advancing AI at the current rate. 

One of them is the realization that instead of teaching a computer everything it needs to understand the world and how to perform tasks, it is better to teach it to teach itself. The second is the advent of the Internet and the enormous growth in the amount of digital information that is generated, stored, and available for analysis. 

Once these innovations were in place, engineers realized that instead of teaching computers and machines how to do everything, they could write code to make them think like humans, and then connect them to the internet, giving them access to all the information in the world. 

ML is concerned with the scientific research, exploration, design, analysis, and application of algorithms that learn concepts, predictive models, behaviors, strategies of action, and so on from observation, reasoning, and experimentation, as well as with characterizing the precise conditions under which classes of concepts and behaviors can be learned.

Learning algorithms can also be used to model various aspects of human and animal learning. ML integrates and builds on advances in algorithms and data structures, statistical inference, information theory, signal processing, and insights gained from neural, behavioral, and cognitive sciences.

 

- Deep Learning (DL)

Deep learning (DL) uses artificial neural networks (ANNs) to perform complex computations on large amounts of data. It is a form of machine learning based on the structure and function of the human brain. DL algorithms train machines by learning from examples. Industries such as healthcare, e-commerce, entertainment, and advertising commonly use deep learning.

Deep learning algorithms learn representations on their own, but they rely on artificial neural networks that mirror the way the brain computes information. During training, the algorithms use unknown structure in the input distribution to extract features, group objects, and discover useful data patterns. Much like training a machine to teach itself, this happens at multiple levels, using the algorithms to build the models.

DL models use a variety of algorithms. While no network is considered perfect, certain algorithms are better suited to perform specific tasks. To choose the right algorithm, it is best to have a solid understanding of all major algorithms. 

DL is a hot topic these days because it aims to simulate the human mind. It's been getting a lot of attention lately, and for good reason. It is achieving results that were not possible before. In deep learning, computer models learn to perform classification tasks directly from images, text, or sound. 

DL models can achieve state-of-the-art accuracy and sometimes exceed human-level performance. The model is trained by using a large amount of labeled data and a neural network architecture with multiple layers. 

DL is basically ML on steroids that allows for more accurate processing of large amounts of data. Since it is more powerful, it also requires more computing power. Algorithms can determine on their own (without engineer intervention) whether predictions are accurate. 

For example, consider feeding an algorithm thousands of images and videos of cats and dogs. It can learn whether an animal has whiskers, claws, or a furry tail, and use what it has learned to predict whether new data fed into the system is more likely to be a cat or a dog.
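
A minimal sketch of such a multi-layer model, assuming PyTorch is installed; the 64x64 images and labels here are random stand-ins, whereas a real cat-vs-dog classifier would be trained for many steps on thousands of labeled photos.

    import torch
    import torch.nn as nn

    # A small convolutional network for binary cat-vs-dog classification.
    # Successive layers learn features such as edges, textures, whiskers, and ears.
    model = nn.Sequential(
        nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Flatten(),
        nn.Linear(32 * 16 * 16, 2),  # two outputs: cat, dog
    )

    loss_fn = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

    # Stand-in batch: eight random 64x64 RGB "images" with made-up labels.
    images = torch.randn(8, 3, 64, 64)
    labels = torch.randint(0, 2, (8,))

    # One training step: predict, measure the error, adjust the weights.
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
    print("training loss after one step:", loss.item())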

 

- Neural Networks

Neural networks are a family of algorithms that strive to identify underlying relationships in a set of data by simulating the way the human brain works. In this sense, a neural network refers to a system of neurons, whether organic or artificial. Neural networks can adapt to changing inputs, so the network produces the best possible results without redesigning the output criteria.

Neural networks are a set of algorithms, loosely modeled on the human brain, designed to recognize patterns. They interpret sensory data through a kind of machine perception, labeling or clustering of raw input. The patterns they recognize are numerical and contained in vectors, and all real-world data, whether images, sounds, text, or time series, must be converted into vectors. 

Neural networks help us with clustering and classification. You can think of them as layers of clustering and classification on top of the data you store and manage. They help to group unlabeled data based on similarity between example inputs and to classify data when training on labeled datasets. 
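
A small sketch of that clustering-versus-classification distinction, using scikit-learn on synthetic data: unlabeled points are grouped by similarity, while a simple neural-network classifier is trained on the labeled version of the same data.

    from sklearn.cluster import KMeans
    from sklearn.datasets import make_blobs
    from sklearn.neural_network import MLPClassifier

    # Synthetic 2-D data with three natural groups.
    X, y = make_blobs(n_samples=300, centers=3, random_state=0)

    # Clustering: group unlabeled data by similarity between example inputs.
    clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
    print("cluster sizes:", [int((clusters == k).sum()) for k in range(3)])

    # Classification: train on the labeled dataset, then label new points.
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
    clf.fit(X, y)
    print("predicted labels for two new points:", clf.predict([[0, 0], [5, 5]]))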

Neural networks can also extract features that are provided to other algorithms for clustering and classification; therefore, you can think of deep neural networks as components of larger ML applications involving reinforcement learning, classification, and regression algorithms.

Neural networks and deep learning currently provide the best solutions for many problems in image recognition, speech recognition, and natural language processing.

 

[More to come ...] 
