SK Telecom announced on July 11 that it has released its self-developed large language model (LLM), “A.X 3.1 Lite,” as open source on the Hugging Face platform. The new model, built entirely in-house from scratch, is a lightweight model with 7 billion parameters. Building a model “from scratch” means developing the architecture and training process independently from the ground up, rather than fine-tuning an existing model.
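For readers who want to try the model, the sketch below shows how an open-weight checkpoint on Hugging Face is typically loaded with the transformers library. The repository ID skt/A.X-3.1-Light is an assumption for illustration and may differ from the actual listing on SKT’s Hugging Face page.

```python
# Minimal sketch: loading an open-weight 7B model from Hugging Face with transformers.
# The repository ID is assumed for illustration; check SKT's Hugging Face page for the real one.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "skt/A.X-3.1-Light"  # assumed repository name

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision keeps the 7B weights within modest GPU memory
    device_map="auto",           # place the weights on available GPU(s), falling back to CPU
)

# A simple Korean prompt to exercise the model's Korean-language focus.
messages = [{"role": "user", "content": "한국의 수도는 어디인가요?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```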
A.X 3.1 Lite is an enhanced iteration of the previously deployed A.X 3.0 Lite, which was used for summarizing phone calls via SKT’s AI assistant, A.Dot. The updated version retains the earlier model’s lightweight structure and high efficiency. These qualities are especially advantageous for mobile and edge devices with limited power or memory, making the model suitable for commercial applications that demand both speed and accuracy.
Despite its compact size, A.X 3.1 Lite demonstrates exceptional Korean language processing capabilities. In KMMLU (Korean Massive Multitask Language Understanding), a prominent benchmark for Korean language comprehension, it achieved a score of 61.70—96% of the performance of its sister model A.X 4.0 Lite, which scored 64.15.
In CLIcK (Cultural and Linguistic Intelligence in Korea), a benchmark designed to evaluate a model’s understanding of Korean language and cultural context, A.X 3.1 Lite outperformed A.X 4.0 Lite, scoring 71.22 compared to 69.97. CLIcK is tailored to address the shortcomings of English-centric datasets by focusing on uniquely Korean cultural and linguistic elements.
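As a quick check on the relative figures quoted above, the published scores can be compared directly; the snippet below simply reproduces the arithmetic behind the article’s percentages.

```python
# Relative performance of A.X 3.1 Lite versus A.X 4.0 Lite on the cited benchmarks.
kmmlu_ax31_lite, kmmlu_ax40_lite = 61.70, 64.15
click_ax31_lite, click_ax40_lite = 71.22, 69.97

print(f"KMMLU: {kmmlu_ax31_lite / kmmlu_ax40_lite:.1%} of A.X 4.0 Lite")  # ~96.2%
print(f"CLIcK: {click_ax31_lite / click_ax40_lite:.1%} of A.X 4.0 Lite")  # ~101.8%
```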
SKT emphasized its commitment to continued open-source contributions, stating that it plans to release the full-scale version of A.X 3.1—built from scratch with 34 billion parameters—later this month. The company aims to further strengthen its LLM development capabilities, particularly in anticipation of participating in government-led initiatives for developing a sovereign AI foundation model.
“We aim to enhance the self-reliance of Korea’s AI ecosystem and contribute to national AI competitiveness by leveraging our expertise in developing Korean-specific LLMs,” said Kim Tae-yoon, head of foundation models at SK Telecom.
The original news is available on the SKT Newsroom website.