It’s time for AI to get a turbo boost
The AI chipset market is expected to grow from US$4.2 billion in 2019 to US$10 billion in 2024
Many Artificial Intelligence (AI) researchers seem to think that the hardware problem is solved and that all their research should relate to algorithms and software. The reality is that Moore's law (the doubling of computer performance roughly every 18 months) has slowed, and many problems cannot be tackled by massive parallelism in the cloud. In particular, intelligent, autonomous embedded systems must be able to deliver machine intelligence with high performance on a very low power budget.
Major players in the electronics world that one thinks of as software companies, such as Microsoft, Google and Facebook, are forming design groups expressly to create customized processors to meet this challenge. The AI chipset market is expected to grow from US$4.2 billion in 2019 to US$10 billion in 2024 (Allied Market Research, August 19, 2019).
The cloud offers large scale compute infrastructure
There are two distinct phases in machine learning: training and inference. Training is generally compute-, resource-, and time-intensive. The cloud is still the best place for this phase, as it offers large-scale compute infrastructure dominated mainly by GPU-based accelerators. Inference is comparatively less demanding in terms of computing power and can move to the edge, enabling a new class of applications such as autonomous driving and robotics.
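The asymmetry between the two phases can be made concrete with a deliberately tiny, hypothetical sketch: a training loop that makes many passes over the data (the expensive phase, suited to the cloud), and an inference step that reduces to a single multiply, cheap enough for a low-power edge device. The model, data, and function names here are illustrative, not from any particular framework.

```python
# Hypothetical sketch of the two machine-learning phases.
# Training fits y = 2x with one-parameter gradient descent;
# inference afterwards is a single multiply.

def train(data, lr=0.01, epochs=500):
    """Training: many passes over the data to fit the weight w (costly)."""
    w = 0.0
    for _ in range(epochs):
        for x, y in data:
            w -= lr * 2 * (w * x - y) * x  # gradient of squared error
    return w

def infer(w, x):
    """Inference: one multiply-add-free step, trivially cheap at the edge."""
    return w * x

data = [(1, 2), (2, 4), (3, 6)]
w = train(data)          # thousands of arithmetic operations
print(round(infer(w, 10), 1))  # → 20.0, a single operation
```

Real workloads scale this asymmetry up by many orders of magnitude, which is why training stays in the cloud while the fitted parameters are shipped to edge hardware for inference.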
Some AI will never move to the edge, such as chatbots and conversational AI, fraud monitoring, and cybersecurity systems, but there is a strong trend towards embedding ultra-low-power AI chipsets into sensors and other small nodes on wide area networks. This is how data-oriented systems work: most data is generated at the very edge of the network. A simple example is computer vision on a 30-fps high-resolution security camera; transferring the raw, unfiltered data to the cloud for processing would be wasteful. Performing feature extraction, data compression, or even decision-making on the data locally and transmitting only metadata is far more efficient.
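The security-camera example above can be sketched in a few lines. This is a minimal, hypothetical illustration (not a production vision pipeline): the edge node computes a cheap motion score per frame and emits compact metadata only when the score crosses a threshold, so raw frames never leave the device.

```python
# Hypothetical edge-side filtering for a security camera: compute a
# cheap motion score locally and upload only compact metadata for
# frames that cross a threshold, instead of streaming raw video.

def motion_score(prev_frame, frame):
    """Mean absolute pixel difference between consecutive frames."""
    return sum(abs(a - b) for a, b in zip(prev_frame, frame)) / len(frame)

def filter_frames(frames, threshold=10.0):
    """Return (frame index, score) metadata only for 'interesting' frames."""
    events = []
    prev = frames[0]
    for i, frame in enumerate(frames[1:], start=1):
        score = motion_score(prev, frame)
        if score > threshold:
            events.append({"frame": i, "score": round(score, 1)})
        prev = frame
    return events

# Three tiny 4-pixel "frames": two static, then a large change.
frames = [[10, 10, 10, 10], [10, 10, 10, 10], [200, 200, 10, 10]]
print(filter_frames(frames))  # → [{'frame': 2, 'score': 95.0}]
```

A real deployment would replace the pixel-difference heuristic with on-chip feature extraction or a quantized neural network, but the bandwidth argument is the same: a few bytes of metadata per event instead of megabits per second of raw video.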
A new market is born, with rare opportunities
There is increasing demand for high-performance data processing with ultra-low power consumption, and thus a new market is born, with rare opportunities for start-ups to compete with the well-established giants. This space is dominated mainly by FPGA companies, ASIC vendors, and RISC-V designers.
CMC Microsystems is ready to facilitate research in this hot new area, bringing intelligence to the edge of the internet and reducing the load on data communications and cloud computing. As we trend toward one trillion sensors on the planet, it is important that we optimize the efficiency of data management through highly customized hardware. It is the best way to create sustainable products and competitive advantage.
CMC Microsystems (CMC) helps researchers and industry across Canada’s National Design Network develop innovations in microsystems and nanotechnologies.