Language models for Swedish in collaboration with AI Sweden and the Royal Library
This project will develop the first truly large-scale generative language model for the Swedish language. The model will be based on the GPT architecture, built with the Nvidia Megatron framework, and will have up to 100 billion parameters. As such, the project is an extension of the preliminary work done by the AI Sweden team in collaboration with Nvidia on using the Megatron framework to build large-scale language models for Swedish. The project will deliver the largest model built in Sweden to date, and one that is unique even internationally due to its size and, by extension, its broad applicability across the entire Nordic region. The WASP WARA on media and language is an ideal development and application environment for such a model, and the project will rapidly accelerate the competence and capacity of Swedish AI research in general, and Swedish NLP in particular.
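To give a sense of the stated scale, the sketch below estimates the parameter count of a GPT-style decoder-only transformer from its hyperparameters. The layer count, hidden size, and vocabulary size used here are illustrative assumptions chosen to land near 100 billion parameters; they are not the project's actual configuration.

```python
def gpt_params(num_layers: int, hidden: int, vocab_size: int, seq_len: int) -> int:
    """Rough parameter count for a GPT-style decoder-only transformer.

    Each transformer layer contributes roughly 12 * hidden^2 parameters:
    4 * hidden^2 for the attention projections (Q, K, V, output) and
    8 * hidden^2 for the feed-forward MLP (two matrices of size
    hidden x 4*hidden). Biases and layer norms are O(hidden) and ignored.
    """
    per_layer = 12 * hidden ** 2
    embeddings = vocab_size * hidden + seq_len * hidden  # token + position tables
    return num_layers * per_layer + embeddings

# Hypothetical configuration in the ~100B range (illustrative only):
# 80 layers with hidden size 10240, a 51200-token vocabulary, 2048 context.
total = gpt_params(num_layers=80, hidden=10240, vocab_size=51200, seq_len=2048)
print(f"{total / 1e9:.1f}B parameters")  # ≈ 101.2B
```

The estimate shows why the transformer body dominates at this scale: the embedding tables contribute only about half a billion parameters, while the stacked layers account for the remaining hundred billion.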