Artificial intelligence

NCSOFT gears up to take on Naver, Kakao for AI language model

The gaming company says its AI platform outperforms other Korean-language AI models of similar size

By Ju-Hyun Lee, Aug 17, 2023 (GMT+09:00)

2 Min read

The Varco was trained primarily on text, images and videos for game development

NCSOFT Corp. has unveiled a large-scale Korean language model as the first South Korean game developer to advance into the local-language generative artificial intelligence market, where online and mobile giants Naver Corp. and Kakao Corp. are making big strides.

Its self-developed large language model (LLM), Varco LLM, focuses on gaming content such as images, text and artificial humans, and is optimized for developing immersive games, the company said on Wednesday.

The Varco LLM will be available on the machine-learning hub Amazon SageMaker JumpStart and will be free of charge for one month from the release date.
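For developers who want to try a JumpStart-hosted model, the sketch below shows the general pattern for deploying and querying one with the SageMaker Python SDK. The model identifier, prompt and payload format are placeholders assumed for illustration; the actual Varco LLM listing details are published on SageMaker JumpStart.

# Minimal sketch: deploying and querying a SageMaker JumpStart model.
# The model_id below is a hypothetical placeholder, not NCSOFT's official identifier.
from sagemaker.jumpstart.model import JumpStartModel

model = JumpStartModel(model_id="example-varco-llm-13b")  # placeholder ID

# Deploy a real-time inference endpoint (instance type and quotas depend on model size).
predictor = model.deploy()

# Send a prompt; the payload schema varies by model and is assumed here.
response = predictor.predict({"inputs": "Write a one-sentence quest description for a fantasy MMORPG."})
print(response)

# Delete the endpoint when finished to stop incurring charges.
predictor.delete_endpoint()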

It has fewer parameters than the models developed by Naver and Kakao, but NCSOFT said the Varco LLM will be suitable for certain types of applications in terms of both cost and performance.

Naver Cloud CEO Kim Yu-won said in February the upgraded HyperCLOVA X platform under development is more Korean-proficient than ChatGPT

An LLM is trained on vast amounts of data. A larger number of parameters increases the model’s capacity to learn that training data, but it also bumps up operational costs and slows down response times.

NCSOFT’s LLM is divided into three types by parameter size: those with 1.3 billion, 6.4 billion and 13 billion parameters.

They compare with Kakao Corp.’s next-generation, KoGPT-based LLM, scheduled for release in October, which will come in versions with 6 billion, 13 billion, 25 billion and 65 billion parameters.

By comparison, Naver’s revamped AI platform HyperCLOVA, slated to be released later this year, will use 204 billion parameters.
 
NCSOFT's Lineage II game poster

NCSOFT’s Varco LLM outperforms other Korean-language AI models of similar size, its Chief Research Officer Lee Je-hee told reporters.

Next month, the company will also introduce the Varco Studio platform for in-house use and unveil it to the public in the first half of next year.

Separately, it is developing a multimodal LLM with 100 billion parameters that will not only generate images, text and videos but also analyze them. It is scheduled for release next March.

Industry watchers say it remains to be seen whether the Varco LLM can turn around NCSOFT’s earnings and boost its share price, which has been on a losing streak since late May.

Write to Ju-Hyun Lee at deep@hankyung.com
 

Yeonhee Kim edited this article.