OpenAI has announced two sweeping partnerships in Asia, forging a deal with Korea's Kakao and signing a strategic joint venture with Japan's SoftBank. Both moves underscore its ambition to bolster AI services in regional markets while harnessing more localized language data to refine its large language models. They also come at a time of rising competition, especially from Chinese AI company DeepSeek, which recently made headlines in the U.S.

Sam Altman, CEO of OpenAI, traveled to Seoul and unveiled the details at a press conference alongside Kakao CEO Shina Chung. "Korea is a very impressive market," Altman said at the event. "The adoption of AI in Korea is remarkably advanced. Considering various industries, from energy to semiconductors and internet companies, there is a very strong environment conducive to applying AI. It is a market that is extremely important to us and is growing rapidly."

Kakao will integrate OpenAI's technology into KakaoTalk, one of South Korea's most popular messaging platforms. The two firms also plan to collaborate on a new Korean-language digital assistant called Kanana, and Kakao's corporate workforce will adopt ChatGPT Enterprise for internal use. Altman is also using his visit to meet with other local tech giants, including Samsung Electronics and SK Hynix, to explore potential partnerships around AI-enabled devices and custom chips.

In Japan, SoftBank unveiled a joint venture with OpenAI, dubbed SB OpenAI Japan, committing $3 billion annually to adopt and commercialize OpenAI technology. That sum covers multiple products, including ChatGPT Enterprise and newly introduced agents like Operator. The Japanese conglomerate's chairman, Masayoshi Son, has signaled a willingness to invest even more heavily in OpenAI, with reports suggesting a possible funding round valued at up to $40 billion, though no official agreement has been announced.

Key Highlights:

  • SoftBank's Spending: $3 billion per year to integrate OpenAI tools.
  • Kakao Collaboration: Launching AI assistant Kanana, embedding OpenAI in KakaoTalk.
  • Operator & o3-mini: OpenAI's newly introduced web agent and cost-effective reasoning model.
  • Broader Investments: The Stargate Project, backed by Oracle, SoftBank, and OpenAI, could see up to $500 billion over four years.

Last week, OpenAI introduced Operator, a web agent for tasks like vacation planning and restaurant reservations, plus o3-mini, which it calls "the latest and most cost-effective reasoning model." The company also revealed a "Deep Research" feature capable of advanced internet searches for more complex queries. Under its new partnership, SoftBank will deploy these solutions across its sprawling corporate empire, including subsidiaries such as chip designer Arm.

OpenAI's flurry of deals in Japan and Korea comes against the backdrop of growing competition from emerging AI powerhouses in China and beyond. "If the Chinese AI company proves to be more than just a viral flash in the pan," one industry observer wrote earlier, "it will have proven to be a very clear signal to OpenAI that a company building outside of the U.S. has stolen a march on capturing English-language generative AI momentum." By signing agreements that give it exposure to Korean- and Japanese-speaking consumers, OpenAI is positioning itself to diversify beyond its heavy dependence on the English-speaking market.

SoftBank's interest in AI has taken other forms before: in 2023, it formed SB Intuitions to build generative AI products tailored specifically to Japanese users. It remains unclear how that venture fits into the new joint operation with OpenAI. A SoftBank representative has not provided updated details, though some speculate that the specialized initiatives may be rolled into SB OpenAI Japan.

OpenAI has also aligned itself with Oracle to build large-scale data centers under the Stargate Project, beginning with an installation in Texas. SoftBank joins Oracle in co-funding the venture, which may eventually ramp up to $500 billion across multiple years. This initiative will expand OpenAI's compute capacity in the United States, supporting a technology stack that is expected to power Operator, o3-mini, and other products at enterprise scale.