AI chips

About

AI chips are built with a specialised architecture and integrated AI acceleration to support deep learning-based applications.

Deep learning, also known as deep neural network (DNN) learning, is a branch of machine learning built on artificial neural networks (ANNs) and falls under the umbrella of artificial intelligence.

It is made up of a set of computer instructions, or algorithms, that simulate the activity and structure of the brain.
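Since deep learning is described here as layered algorithms modelled on the brain, the short Python sketch below shows what such a stack of layers looks like in code. The layer sizes, activation function, and dummy input are illustrative assumptions, not details from the article.

    import numpy as np

    def relu(x):
        # Simple non-linearity applied between layers.
        return np.maximum(0, x)

    class DeepNet:
        """A tiny deep neural network: stacked layers of artificial neurons."""
        def __init__(self, layer_sizes, seed=0):
            rng = np.random.default_rng(seed)
            # One weight matrix and one bias vector per layer.
            self.weights = [rng.standard_normal((m, n)) * 0.1
                            for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]
            self.biases = [np.zeros(n) for n in layer_sizes[1:]]

        def forward(self, x):
            # Signals propagate layer by layer, loosely mimicking how
            # connected layers of neurons pass activity forward.
            for w, b in zip(self.weights[:-1], self.biases[:-1]):
                x = relu(x @ w + b)
            return x @ self.weights[-1] + self.biases[-1]

    net = DeepNet([4, 16, 16, 3])      # 4 inputs, two hidden layers, 3 outputs
    print(net.forward(np.ones(4)))     # one forward pass on a dummy input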

AI chips, together with complementary packaging, memory, storage, and connectivity technologies, enable AI to be infused into a wide range of applications, helping transform data into information and then into knowledge.

Demand for chatbots and messaging platforms such as Messenger and Slack, which use NLP to analyse user messages and drive conversational logic, has expanded the use of AI chips for NLP applications.

Traditional chips, with separate processor cores and memory, constantly move instructions and data between the two components. This makes them poorly suited to AI workloads that involve large volumes of data, although some high-end traditional chips can still run AI applications.

AI chips contain both general-purpose processor cores and AI-optimised cores (the number depending on chip size) that work together to perform computational tasks. Because the AI cores are closely integrated with the non-AI processor cores, they are well suited to heterogeneous, enterprise-class AI workloads that require low-latency inferencing.
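To make the division of labour between general-purpose cores and AI-optimised cores more concrete, here is a minimal Python (PyTorch) sketch in which the host processor orchestrates the program and the inference work is dispatched to an accelerator device when one is available. The model, its sizes, and the use of a CUDA device are assumptions for illustration; the article does not describe any specific chip or programming interface.

    import torch
    import torch.nn as nn

    # Use the AI accelerator if one is present, otherwise fall back to the CPU.
    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

    # A small illustrative model; real workloads would be far larger.
    model = nn.Sequential(
        nn.Linear(128, 64),
        nn.ReLU(),
        nn.Linear(64, 10),
    ).to(device)                  # model weights live on the accelerator
    model.eval()

    batch = torch.randn(32, 128).to(device)   # move the input data once

    with torch.no_grad():         # inference only: no gradients, lower latency
        scores = model(batch)

    print(scores.shape)           # torch.Size([32, 10]), computed on `device`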

Fields of application of AI Chips
  1. Natural language processing (NLP)
  2. Computer vision
  3. Robotics
  4. Network security
These applications span a wide variety of sectors, including automotive, IT, healthcare, and retail.

Leading Manufacturers of AI Chips

Nvidia Corporation, Intel Corporation, IBM Corporation, Alphabet Inc., Samsung Electronics Co., Ltd, and Apple Inc. are some of the key players in the AI chip market.

Source: The Hindu





Posted on 23rd May 2022