Japanese tech conglomerate SoftBank is looking to develop a “world-class” Japanese-language-specific generative artificial intelligence model, and plans to invest $960 million in the next two years to bolster its computing facilities, according to a Nikkei report.
Training large language models (LLMs), such as OpenAI’s ChatGPT, requires advanced graphics processing units, which SoftBank plans to purchase from U.S. chip giant Nvidia, the Nikkei reported Monday, citing anonymous sources.
The investment of 150 billion yen ($960 million) will be spent across 2024 and 2025, adding to the 20 billion yen SoftBank spent on computing infrastructure last year, the report said.
The latest investment is believed to be the largest of its kind by any Japanese company and, once the buildout is complete, will likely give SoftBank the most powerful computing capabilities in the country, Nikkei added.
According to another report from Nikkei Asia, Japan lacks private companies with the high-performance supercomputers needed to build LLMs, despite increased interest in the technology.
SoftBank’s reported investments could change this and give Japan a strong domestic player in the generative AI space at a time when international players are trying to enter the market.