
Why developing countries should look to China to leapfrog into LLM markets


LLMs have great commercial potential for emerging markets. Image: Unsplash/Steve Johnson

Winston Ma, CFA Esq.
Adjunct Professor, New York University
This article is part of: Annual Meeting of the New Champions
  • The performance gap between LLMs is quickly narrowing.
  • China's AI startups can't compete with tech giants on financial capital and computing power, so they must focus on AI applications in areas where they have unique data advantages.
  • China's AI market trend is a useful reference for emerging markets, especially those with limited financing and computing resources in the global AI race.

TikTok may be the social network where global users spend the most time, but few TikTok influencers realize that TikTok’s parent company, ByteDance, also holds another global top 10 mobile app, CapCut, a video editing app. By late 2024, total downloads of CapCut ranked fourth among all mobile applications, only behind Facebook, Instagram and TikTok.

Even fewer appreciate that CapCut is now powered by artificial intelligence (AI). Its companion AI-generated content software, Dreamina, is comparable to OpenAI's Sora, which turns text input into video content.

In late May, ByteDance shocked the AI market by releasing its own large language models (LLMs), joining the likes of OpenAI. It priced its main model, 'Doubao,' significantly lower than similar offerings from its Chinese big tech rivals. The company claims the Doubao LLM can process 2 million Chinese characters, equivalent to 1.25 million tokens, for RMB 1 ($0.14).

Cost of LLMs as of late May 2024. Image: SCMP

Other Chinese tech heavyweights, like Alibaba and Tencent, quickly followed with drastic price cuts. Some offered access to their less powerful LLMs for free.

OpenAI’s most advanced multimodal model, GPT-4o, also unveiled that week, comes in at $5 per million input tokens handled. In AI, a token is a fundamental unit of data processed by algorithms. A token typically equates to between 1 and 1.8 Chinese characters for Chinese LLMs.
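
To put those numbers side by side, here is a rough, back-of-the-envelope comparison using only the figures quoted above (treating the stated 2 million Chinese characters as 1.25 million tokens). It covers input pricing only and ignores the fact that different models use different tokenizers, so it is indicative rather than exact.

```python
# Indicative input-token cost comparison, using only the prices quoted above.
doubao_price_usd = 0.14       # RMB 1 for the quoted bundle
doubao_tokens = 1.25e6        # ~2 million Chinese characters, roughly 1.25 million tokens
gpt4o_usd_per_million = 5.00  # GPT-4o input price per million tokens

doubao_usd_per_million = doubao_price_usd / (doubao_tokens / 1e6)
print(f"Doubao: ~${doubao_usd_per_million:.2f} per million input tokens")              # ~$0.11
print(f"GPT-4o: ~{gpt4o_usd_per_million / doubao_usd_per_million:.0f}x the Doubao input price")  # ~45x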

This price war in China and similar trends in the US (where free, open-source models, like Meta's Llama and xAI's Grok, are challenging OpenAI to drop model prices) show the performance gap between LLMs is quickly narrowing.

As LLM foundation models become commoditized, businesses can treat them as operating systems or platforms, and there is tremendous opportunity to develop the AI applications that have not yet emerged around those models.

In China's AI market, therefore, most players focus on applications – the 'commercialization' of LLMs. For AI applications aimed at industry-specific tasks, the size of the model may not be the sole determinant of performance.

Other aspects, like architecture, training data and fine-tuning techniques, could play a significant role. The price war reduces LLM costs for the startups, boosting their profit margins.

Of course, such a shift in innovation direction is also born of necessity for many Chinese AI startups. Because of US sanctions, the most advanced chips, like Nvidia GPUs, are hard to access.

And the economic slowdown in China has led to a 'venture capital winter' for startups. They can't compete with tech giants on financial capital and computing power, so they must focus on AI applications in areas where they have unique advantages, such as:

Existing data resources

Data differentiates algorithms and models, and proprietary data provides a sustainable AI moat. In AI, high-quality search, classification, forecasting, personalization, anomaly detection and many other use cases all depend on the data. Thus, it's not surprising that CapCut, affiliated with TikTok and ByteDance's unrivaled short-video platforms, is superior video editing software.

CapCut's AI tool – Dreamina – recently rebranded as 'Jimeng' in Chinese and completed the full launch of its AI drawing and AI video editing functions. Like OpenAI's Sora, it allows users to convert text descriptions into images.

Furthermore, Jimeng plans to introduce a 'story creation' feature, which will enhance its product line by allowing users to craft personalized stories with AI (this upcoming feature is still being tested). The feature clearly has the DNA of TikTok short videos.


Investing in robust customer data foundations and data curation for business-specific models

Chinese startup FancyTech – a platform that uses generative AI to produce videos from 3D product models, and the winner of LVMH's 2024 Innovation Award – is a good example of this.

When the FancyTech founders were working at the e-commerce platform Tmall (part of the Alibaba group), they came to understand the pain points of global brands: a brand may need to produce 100 videos for the 100 products in its inventory, something that would be almost impossible using traditional production methods.

Now, generative AI provides a powerful tool for this headache: brands – like Hublot's Tmall flagship store – already feature multiple videos created using FancyTech.

Most importantly, the whole process of advertising video creation and distribution is fully automated, without the need for human quality control. The secret sauce? FancyTech invested heavily in its data-labelling team to curate high-quality datasets to train its AI application, which, in turn, generates high-quality video content for brands.

Enterprises can improve their AI applications in three ways – retrieval augmentation, fine-tuning with prompts and responses, and self-supervised pre-training. The ideal deployment would incorporate all three of these, but that represents a heavy data labelling load.
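
As a concrete illustration of the first of those three approaches, here is a minimal, self-contained Python sketch of retrieval augmentation. The toy embed function, the sample documents and the prompt template are placeholders invented for this example; in practice the embedding step would use a proper text encoder and the assembled prompt would be sent to an LLM or SLM endpoint.

```python
import numpy as np

def embed(text: str, dim: int = 256) -> np.ndarray:
    """Toy stand-in for a real embedding model: a hashed bag-of-words vector."""
    vec = np.zeros(dim)
    for word in text.lower().split():
        vec[hash(word) % dim] += 1.0
    return vec / (np.linalg.norm(vec) + 1e-9)

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the k documents most similar to the query (cosine similarity)."""
    doc_vecs = np.stack([embed(d) for d in docs])
    sims = doc_vecs @ embed(query)
    return [docs[i] for i in np.argsort(-sims)[:k]]

def build_prompt(query: str, context: list[str]) -> str:
    """Ground the model's answer in the retrieved proprietary snippets."""
    joined = "\n".join(f"- {c}" for c in context)
    return f"Answer using only the context below.\n\nContext:\n{joined}\n\nQuestion: {query}"

if __name__ == "__main__":
    docs = [
        "Product A001: stainless-steel watch, 40mm case, water resistant to 100m.",
        "Product B002: hand-stitched leather handbag, ships in 3-5 days.",
        "Return policy: unworn items may be returned within 30 days of delivery.",
    ]
    print(build_prompt("How water resistant is the watch?", retrieve("watch water resistance", docs)))
```

Fine-tuning and self-supervised pre-training follow the same logic of layering proprietary data on top of a commoditized base model, but with progressively heavier data-curation requirements.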


Understanding that small is the new big

Whereas tech giants have been racing to build ever-larger language models, a surprising trend is emerging: smaller models, or small language models (SLMs), which are better suited to specific applications. Where LLMs boast hundreds of billions of parameters, SLMs have on the order of tens of billions. If 2023 was the year of the LLM, then 2024 is likely to be the year of the SLM.

SLMs offer a degree of adaptability and responsiveness crucial for real-time applications. For specific tasks, like domain-specific question answering, these compact, efficient and purpose-built SLMs could deliver superior performance across three key metrics: speed, cost and accuracy. If the largest LLMs aim to be top university PhDs, the smaller SLMs are happy to be vocational school graduates.

Chinese AI players are focusing on SLMs because the era of the internet of things (IoT) and edge computing has quietly arrived (there are now more connected things than connected humans). The smaller size of SLMs allows for lower latency in processing requests, making them ideal for edge AI computing in IoT settings, where speed is of the essence. Various AI companion robots are already available on Chinese e-commerce websites, most of them cheaper than $100.
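
For readers who want a feel for what 'small and close to the data' means in practice, the sketch below loads a compact instruct model with the open-source Hugging Face transformers library and answers a domain-specific question locally. The model identifier, prompt and question are placeholders for illustration, not a reference to any specific product discussed above.

```python
from transformers import pipeline

# Placeholder model ID: substitute any locally hosted small instruct model.
generator = pipeline("text-generation", model="example-org/small-instruct-7b")

question = "What maintenance does pump model X-200 need after 500 hours of use?"
prompt = f"You are a field-service assistant. Answer briefly.\nQ: {question}\nA:"

result = generator(prompt, max_new_tokens=128, do_sample=False)
print(result[0]["generated_text"])
```

Because the model is small, the same code can run on an edge device or in a virtual private cloud, keeping proprietary data close to where it is generated.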

In summary, China's AI market is developing a hybrid AI ecosystem that enables companies to exploit the opportunities generative AI presents, even with limited financing and computing resources in many cases.

They use LLMs, taking full advantage of open-source models, but they put more focus on specific applications and SLMs, which often run on-premises (for example, in smart agriculture) or on virtual private clouds, because they are smaller and may be trained on highly proprietary data.


The result is a wave of AI applications emerging in China that claim product/market fit (PMF) for various industries. As China focuses on digital infrastructure development and emphasizes data as a fundamental element of the economy, its data resources are becoming further standardized, organized and consolidated. Its industrial policy has recently shifted from the last decade's 'Internet+' to 'Data x'. This overall data foundation is poised to make 2024-2025 the years of 'deploy and scale.'

China's AI market trend is a useful reference for emerging markets, especially those with limited financing and computing resources in the global AI race.

We’re only now getting to the point where enterprise-ready architectures are available to really take advantage of AI, so now is the opportunity for emerging markets to leverage their existing data resources and invest in digital infrastructure to implement AI creatively.

By leveraging their proprietary data and subject matter expertise – valuable resources that general-purpose LLMs do not have – emerging markets may take advantage of the current AI boom to leapfrog in their digital transformation.

License and Republishing

World Economic Forum articles may be republished in accordance with the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International Public License, and in accordance with our Terms of Use.

The views expressed in this article are those of the author alone and not the World Economic Forum.
