Article courtesy of Innodisk
We are moving into a new era of technological innovation. The concept of the Internet of Things (IoT) has been around for a long time, and it has only gained relevance with our rapid technological development. IoT embodies the convergence of the physical and digital worlds: data is gathered from an ever-growing number of devices and then aggregated into what is commonly known as Big Data. The number of these devices continues to grow and is estimated to reach a staggering 50 billion by 2020.
The data gathered by these devices runs into a problem when transmitted to a centralized location such as the cloud: latency. Even though connection speeds are steadily increasing, they fail to keep pace with the exponentially growing volume of data. Left unaddressed, this means latency will increase and overall system performance will suffer.
This is one of the areas where AI can make significant contributions. It also opens the door to new applications, from streamlining city traffic to strengthening public security and enhancing financial services.
More fundamentally, AIoT requires components that can handle the challenging and diverse conditions found at the edge. These locations can be anything from vehicles and airplanes to factories and oil installations in the desert, which calls for a flexible and adaptive approach to component manufacturing. AI also promises to reduce the human factor in decision making. This puts greater pressure on system integrators to ensure quality control, since an accident involving AI, with the human factor removed, will not necessarily have a clear and obvious culprit.
Let us first define the concepts of IoT, AI, and edge computing.
The Internet of Things is a phrase that refers to the trend of “things” being interconnected through a network (usually the internet). The “things” in this case are not necessarily separate electronic devices; they can also be wearable electronics or even people with a medical device worn on or implanted in the body. In short, it is every individual device that can transfer data within a network in some fashion.
IoT in its pure form only gathers data, with little or no computation. This means the data is sent in bulk to the cloud to be analyzed. However, not all data is equally valuable. Take security footage, for example: the interesting parts show people or objects moving, while still shots of an unchanging background are of little interest. Sending all of this data to the cloud for analysis would waste large amounts of bandwidth that could have been used for other applications.
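The security-footage example can be sketched in a few lines of Python. This is a minimal illustration, not a production motion detector: frames are modeled as flat lists of grayscale pixel values, and the threshold is an assumed value chosen for the example.

```python
# Sketch of edge-side filtering for security footage: only frames that
# differ noticeably from the previous one are flagged for upload.
# Frames are modeled as flat lists of grayscale pixel values (0-255);
# the threshold is an illustrative assumption, not a standard.

MOTION_THRESHOLD = 10.0  # mean absolute pixel difference that counts as "motion"

def has_motion(prev_frame, frame, threshold=MOTION_THRESHOLD):
    """Return True if the frame differs enough from the previous one."""
    diff = sum(abs(a - b) for a, b in zip(prev_frame, frame)) / len(frame)
    return diff > threshold

def frames_to_upload(frames):
    """Keep the first frame, then only frames that show motion."""
    selected = [frames[0]]
    prev = frames[0]
    for frame in frames[1:]:
        if has_motion(prev, frame):
            selected.append(frame)
        prev = frame
    return selected

# Example: a static background, then a bright object enters the scene.
static = [20] * 100
moving = [20] * 80 + [200] * 20
stream = [static, static, static, moving, static]
print(len(frames_to_upload(stream)))  # prints 3 -- instead of all 5 frames
```

A real deployment would use a proper computer-vision pipeline, but the principle is the same: the unchanging background never leaves the device, saving the bandwidth it would otherwise consume.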
The AI we are referring to fits within the concept of “Narrow AI.” This is a program or system that is able to perform a set of specific tasks without any direct human input on how to do so. It differs significantly from “General AI,” the human-like autonomous intelligence we are used to seeing in movies and TV series. Current examples of narrow AI are the text, picture, and speech recognition systems built with neural networks and machine learning. Such an AI has gone through thousands, if not millions, of data iterations and taught itself to correctly identify the image or object at hand.
But no matter how sophisticated its predictions become, it is still limited to the narrow function it has been trained for. Anything outside this scope renders the AI all but useless. An AI trained to identify written numbers can easily surpass human capabilities at that task, but it will be completely useless when asked to identify letters.
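A toy classifier makes this narrowness concrete. The sketch below is a nearest-centroid model over invented 3x3 binary patterns, not a real neural network; the patterns and labels are assumptions made purely for illustration.

```python
# Toy sketch of a "narrow" classifier: a nearest-centroid model trained
# only on 3x3 binary patterns for the digits 0 and 1. The patterns are
# invented for illustration; a real system would train a neural network
# on large labeled datasets.

TRAINING = {
    "0": [[1, 1, 1], [1, 0, 1], [1, 1, 1]],
    "1": [[0, 1, 0], [0, 1, 0], [0, 1, 0]],
}

def flatten(grid):
    return [p for row in grid for p in row]

def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def classify(grid):
    """Return whichever trained label's pattern is closest to the input."""
    flat = flatten(grid)
    return min(TRAINING, key=lambda label: distance(flat, flatten(TRAINING[label])))

# Within scope: a slightly noisy "1" is still recognized.
noisy_one = [[0, 1, 0], [1, 1, 0], [0, 1, 0]]
print(classify(noisy_one))  # prints 1

# Out of scope: shown the letter "T", the model can still only answer
# with a digit -- a confident but meaningless classification.
letter_t = [[1, 1, 1], [0, 1, 0], [0, 1, 0]]
print(classify(letter_t))
```

The model has no way to say “this is not a digit”: every input, however alien, is forced into one of the two classes it was trained on, which is exactly the limitation described above.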
AI at the edge can demand a lot of computational power to ensure adequate performance. Standard storage and memory components might deliver that performance, but they are ill-equipped for the rough conditions at the edge: roadside traffic monitoring experiences temperature cycles from day to night and summer to winter, in-vehicle systems must contend with shock and vibration, industrial settings bring increased levels of pollution, and so on.
The original idea of IoT had data sent to a central location, the cloud, for processing and analysis. However, as the number of devices has grown exponentially, many applications have hit a roadblock: the sheer amount of data transmitted back and forth causes severe latency issues. Edge computing tackles this problem by handling more data at the edge, the point where the internet connects to various devices, i.e., the location where data is actually gathered. By moving computational power out to the edge, each device can determine by itself what needs to be sent to the cloud and what can be filtered out.
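That decision logic can be sketched for sensor data as well. In this minimal example, an edge device forwards only out-of-range readings immediately and batches everything else into periodic summaries; the temperature band and summary interval are illustrative assumptions, not values from any particular product.

```python
# Minimal sketch of an edge-computing filter for sensor readings: anomalies
# are sent to the cloud immediately, normal readings are compressed into a
# periodic average. The limits and interval are illustrative assumptions.

NORMAL_RANGE = (15.0, 35.0)   # assumed expected temperature band, in degrees C
SUMMARY_EVERY = 10            # send one aggregate message per 10 readings

def edge_filter(readings):
    """Return the list of messages actually transmitted to the cloud."""
    outbound, window = [], []
    for i, value in enumerate(readings, start=1):
        window.append(value)
        if not NORMAL_RANGE[0] <= value <= NORMAL_RANGE[1]:
            outbound.append(("alert", value))       # anomaly: send immediately
        if i % SUMMARY_EVERY == 0:
            outbound.append(("summary", sum(window) / len(window)))
            window = []                             # start a fresh window
    return outbound

readings = [22.0] * 9 + [80.0]   # nine normal readings, then a spike
messages = edge_filter(readings)
print(len(messages))  # prints 2 -- instead of 10 raw transmissions
```

Ten raw readings shrink to two messages, an alert and a summary, which is the bandwidth and latency saving that edge computing is after.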
When talking about AIoT, we usually refer to an AI platform located at the edge. This normally takes the form of a small IPC (industrial PC) with a built-in industrial-grade CPU. For real-time data analysis, this CPU needs adequate support in the form of flash memory and DRAM.
Industrial-grade memory and storage
Industrial-grade storage and memory components are essential to solving the challenges of implementing AI at the edge. The first step is to explore and identify the risks present at each data-gathering location; components can then be customized to fit the specific requirements of the application.
In the following article, we look at some present-day examples of AIoT applications.