AI & IoT – The coming revolution

This is the wonderful coming world of the Artificial-Intelligence-of-Things (AIoT) …  with a slight glitch at the end! 

The Internet of Things has arrived! It is a revolution in technology that will impact us all, in both our professional and personal lives. The Internet of Things, or IoT, is simply a phrase describing any ‘thing’ that can be connected to or controlled via the Internet. For example, in the video clip the music system, the smoothie mixer, the fire hearth, the calendar and the front door are all connected via the network, and such devices will soon be a part of our everyday lives. In the business world this translates to the hot topics of Smart Cities, Smart Homes, Industry 4.0 and Connected Vehicles. Companies of all kinds – not just technology firms – are now racing to find the next big ‘thing’, and to connect anything and everything to the Internet. Companies such as Ericsson and Cisco Systems predict that 50 billion “things” could be connected to communications networks within six years. “The ‘Internet of things’ is a way of saying that more of the world will become part of the network,” says Gordon Bell, a Microsoft researcher and a pioneer of the original computer revolution.

Individually, the IoT and Artificial Intelligence (AI) are powerful technologies. When you combine them, you get AIoT – the Artificial Intelligence of Things. You can think of IoT devices as the digital nervous system, while artificial intelligence is the brain of the system.

Some examples where AIoT will have a transformative impact:

  • Autonomous Driving: The average new car contains 60 microprocessors, according to the Center for Automotive Research, and electronics account for 40 percent of the cost of making a car. AIoT is used today in autonomous vehicles: the autopilot systems from Zenuity, Tesla and many other companies use radar, sonar, GPS and cameras to gather data about driving conditions, and an AI system then makes decisions based on the data these IoT devices gather.
  • Smart Homes and Offices: Networks of smart sensors will pervade homes and office buildings. These sensors can detect who is present and adjust temperature and lighting accordingly to improve energy efficiency and comfort. In another use case, a smart building can control access through facial recognition: the combination of connected cameras and artificial intelligence that compares images taken in real time against a database to determine who should be granted entry is AIoT at work.
  • Smart Cities: Traffic can be monitored in real time by road cameras or by drones, and adjustments can be made to the traffic flow to reduce congestion. When drones are deployed to monitor a large area, they transmit traffic data, and AI can then analyze the data and decide how best to alleviate congestion – adjusting speed limits and the timing of traffic lights without human involvement.
  • Healthcare: AIoT has the potential to revolutionize healthcare and medicine, for example through remote monitoring of patients (especially elderly patients), remote monitoring of glucose levels in diabetic patients, cancer diagnosis and hospital resource management.
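The common pattern across all of these examples – sensors stream data, and a model turns that data into a decision – can be sketched in a few lines. The following is a toy illustration of the idea (simulated temperature readings and a simple statistical anomaly check), not any production AIoT system:

```python
from statistics import mean, stdev

# Simulated IoT sensor stream: room temperatures reported every minute.
readings = [21.0, 21.2, 20.9, 21.1, 21.0, 26.5, 21.2, 21.1]

# A minimal "intelligence" layer: flag readings far outside the recent
# norm -- the kind of decision an AIoT thermostat or a remote health
# monitor automates without human involvement.
mu, sigma = mean(readings), stdev(readings)
anomalies = [t for t in readings if abs(t - mu) > 2 * sigma]

print(anomalies)  # the 26.5 reading stands out from the rest
```

In a real deployment the decision logic would be a trained model running either in the cloud or, as discussed below, directly on the device.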

There is, however, a big challenge if AI is to be deployed ubiquitously in such IoT devices. Recall the film clip above – it obviously has voice recognition and natural language technology, in the form of deep neural networks (DNNs), installed inside the smoothie blender. But these DNNs are huge monsters with billions of connections, running on large, expensive computing hardware and consuming huge amounts of energy. There is no way you can deploy this inside the smoothie mixer – it is too bulky, too expensive and needs too much energy! Thus, if deep neural networks are to be used successfully in IoT devices, they must run on small, cheap hardware, consuming very little energy, while retaining the accuracy and performance they had on the original hardware.
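To make the size problem concrete, here is a minimal NumPy sketch of one common compression idea, magnitude-based weight pruning: remove the many small weights of a layer and keep only the few large ones. The layer sizes and the 90% pruning ratio are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy fully connected layer: 512 inputs -> 256 outputs.
W = rng.normal(0.0, 0.1, size=(256, 512))
x = rng.normal(size=512)

# Magnitude pruning: zero out the 90% of weights smallest in |w|.
threshold = np.quantile(np.abs(W), 0.90)
W_pruned = W * (np.abs(W) >= threshold)

# Only ~10% of the weights survive, so the layer can be stored
# sparsely at a fraction of the original memory cost...
kept = np.count_nonzero(W_pruned) / W.size

# ...while the pruned layer's output remains correlated with the
# original's, since the largest weights carry most of the signal.
y, y_pruned = W @ x, W_pruned @ x
```

Real compression pipelines combine ideas like this with retraining so that the accuracy lost to pruning is recovered.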

It gets even more interesting: there has been a tremendous explosion of innovation in the hardware space, much of it driven by AI. Nvidia has been the new kid on the block, challenging traditional giants such as Intel by introducing a host of GPU architectures tailored for AI. They are not alone, however: Xilinx produces FPGAs (another type of hardware), and the European Commission has launched the European Processor Initiative (EPI), whose aim is to design and implement a roadmap for a new family of low-power European processors for high-performance Big Data and a range of emerging applications. There is thus a veritable zoo of new hardware for AI!

A company deploying deep learning in its product thus faces very challenging problems in designing systems that meet its requirements. This is where EmbeDL comes in! With its award-winning technology, monster networks are compressed so they fit comfortably on low-end embedded devices, enabling real-time “intelligent” interactions with their environment. The technology is built on a unified approach to compressing commonly used deep learning structures, including fully connected, convolutional and recurrent neural networks, as well as their combinations. The compression is tailored both to meet customer requirements and to run on a chosen hardware platform. Our technology works in an informed manner within these constraints to generate an optimally slim network structure that preserves the accuracy of sensing applications while maximally reducing their resource consumption. More on that in later posts – stay tuned!
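As a flavour of the kind of size–accuracy trade-off involved (a generic sketch of standard 8-bit quantization, not EmbeDL's actual algorithm), each 32-bit float weight can be stored as a single byte plus one shared scale factor, cutting memory by 4x at the cost of a small, bounded rounding error:

```python
import numpy as np

rng = np.random.default_rng(1)

# A toy layer's weights in 32-bit floating point (4 bytes each).
W = rng.normal(0.0, 0.1, size=(256, 512)).astype(np.float32)

# Symmetric 8-bit quantization: one scale maps floats onto int8.
scale = np.abs(W).max() / 127.0
W_int8 = np.round(W / scale).astype(np.int8)   # 1 byte per weight

# Dequantize to measure the rounding error introduced.
W_deq = W_int8.astype(np.float32) * scale
max_err = float(np.abs(W - W_deq).max())       # at most scale / 2

# Memory: W.nbytes vs W_int8.nbytes -- a 4x reduction.
```

Production tool chains go further, choosing per-layer precisions and structures to match the target chip; the point here is only that large savings come from representation choices the original training never worried about.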

Written by Devdatt Dubhashi

Devdatt is a Professor in the Data Science and AI Division of the Department of Computer Science and Engineering at Chalmers University of Technology and a co-founder of EmbeDL. He received his Ph.D. in Computer Science from Cornell University, USA, and was a postdoctoral fellow at the Max Planck Institute for Computer Science in Saarbrücken, Germany. He was with BRICS (Basic Research in Computer Science, a center of the Danish National Science Foundation) at the University of Aarhus, and then on the faculty of the Indian Institute of Technology (IIT) Delhi, before joining Chalmers in 2000. He has led several national projects in machine learning and has been associated with several EU projects. He was an external expert for the OECD report on “Data Driven Innovation”. He has published regularly in premier machine learning and AI venues such as NIPS, ICML and AAAI.

September 16, 2020