
Apple, Microsoft, and Google have declared their independence with their own AI semiconductor chips. What about Korea?

by rollirolli 2023. 5. 16.

Why are US companies developing their own semiconductors?

 

NVIDIA's share price on the US stock market has surged roughly 98% since the beginning of this year, lifting its market capitalization past $770 billion. Particularly striking is that its market cap has grown by some $350 billion in under six months, catapulting the company from 13th to 6th in the US market-cap rankings. While some have speculated about a short-term price bubble, NVIDIA appears to be easing such concerns with its technological lead. The market expects NVIDIA to maintain its unrivaled dominance in graphics processing units (GPUs) for the foreseeable future. According to the supercomputer ranking 'TOP500,' as of 2022 NVIDIA held a commanding 92% share of the AI semiconductor (accelerator) market, far ahead of competitors such as AMD (5%) and Intel (1%).

 

An anonymous industry insider familiar with the domestic semiconductor industry said, "As TSMC's top customer, NVIDIA receives preferential treatment from the world's leading foundry, leaving domestic companies lagging behind in fabrication." The insider added, "In particular, AI semiconductor customers demand compatibility with NVIDIA's infrastructure, and that lock-in has been crucial to NVIDIA's dominance over the past few years."

 

It has become a trend for tech companies that were not traditionally in the semiconductor business, such as Google and Amazon, to enter the field. Applying AI across a range of services, these companies have begun developing their own specialized AI chips. Last month, Google unveiled its fourth-generation AI chip, the 'TPU (Tensor Processing Unit) v4.' Tesla introduced its self-designed AI chip, the D1, in 2021 and uses it for training its autonomous-driving software, among other workloads. Amazon Web Services (AWS) unveiled its inference chip 'Inferentia2' in December last year and currently uses it in its data centers (IDCs) and for voice and image recognition services.

 

In Korea, semiconductor companies such as Samsung Electronics and SK Hynix, along with IT companies such as Naver and KT, are joining forces to foster a domestic AI semiconductor ecosystem. Samsung Electronics and Naver have partnered on AI chip development, while KT has teamed up with the AI chip design firm Rebellions. Rebellions' data center AI chip 'ATOM' is slated for integration into KT's IDCs and its large-scale AI service 'Mi:dm.' Meanwhile, FuriosaAI completed development of its first-generation AI chip, 'Warboy,' last year and has moved into full-scale production on Samsung Electronics' foundry line.

 

However, the road ahead is not smooth sailing. Analysts point to a significant gap with overseas big tech firms such as NVIDIA, which have built up a substantial lead. In the AI market in particular, countries are competing fiercely behind the scenes to lead the infrastructure industries that spring from AI, and practitioners worry that missing this market would leave Korea dependent on overseas ecosystems. Just as large language models now heavily determine AI performance, there is apprehension that ceding the cutting-edge semiconductor technology underpinning the industries where AI will be deployed could make the domestic AI ecosystem dependent on foreign companies.

 

Amid the global chatbot development frenzy, NVIDIA's core AI GPU, the 'A100,' has at times been in short supply. Chinese IT company Baidu, maker of the AI chatbot 'Ernie Bot,' reportedly ran short of GPUs and requisitioned A100s from every department in the company. This illustrates the risk that, without independent AI semiconductor technology, a company can end up dependent on global big tech.

 

As a result, calls are growing for mergers and acquisitions between large companies and startups, as well as for government support. In the AI market, the platform itself matters, but the potential of the infrastructure businesses derived from it is also enormous. Industry experts therefore warn that missing this market could mean falling behind in the ecosystem competition.

 

Meanwhile, amid the ChatGPT craze, heavy operating costs have emerged as a critical issue for companies looking to use AI. Training ChatGPT is known to require more than 10,000 NVIDIA GPUs to process large-scale data. According to Morgan Stanley, a Google search currently costs about 0.28 cents (roughly 3.6 Korean won) to serve, but a search run through ChatGPT costs about seven times as much, at 2 cents (roughly 26 won). With more than 100 million ChatGPT users worldwide, even a single ChatGPT search per user would cost at least $2 million (roughly 2.6 billion won) to serve. As AI-based services beyond ChatGPT, such as autonomous driving, data centers, robotics, and smart factories, continue to proliferate, observers expect operating costs to soar astronomically.
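The back-of-envelope arithmetic behind those figures can be checked directly. The sketch below uses only the numbers cited in this article (the Morgan Stanley per-query estimates and the reported 100 million users); they are reported estimates, not measured values.

```python
# Cost figures as cited in the article (Morgan Stanley estimates).
GOOGLE_COST_PER_QUERY = 0.0028   # dollars per query (0.28 cents)
CHATGPT_COST_PER_QUERY = 0.02    # dollars per query (2 cents)
USERS = 100_000_000              # reported ChatGPT user base

# How many times more expensive a ChatGPT query is than a Google query.
cost_ratio = CHATGPT_COST_PER_QUERY / GOOGLE_COST_PER_QUERY

# Cost of serving one ChatGPT query from every user.
one_query_each = USERS * CHATGPT_COST_PER_QUERY

print(f"ChatGPT query is about {cost_ratio:.1f}x a Google query")  # ~7.1x
print(f"One query per user costs ${one_query_each:,.0f}")          # $2,000,000
```

The ratio works out to roughly 7.1, matching the article's "sevenfold," and one query per user lands exactly on the $2 million floor cited above.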

 

Big tech companies are entering the field for the same reason: they are applying AI across a wide range of services and are developing proprietary AI chips tailored to their own domains.

 

