Microsoft’s new AI model AdaptLLM can learn about ‘specific domains’ faster and cheaper

Devesh Beri
September 28, 2023
1 min read

As reported by Multiplatform.ai, Microsoft has developed a new way to train large language models (LLMs) to better understand and generate text in specific domains. The new method is more cost-effective than previous approaches and produces LLMs that perform better on domain-specific tasks.

LLMs are good at understanding and generating text in a general sense, but they are weaker in specialized domains such as biology, finance, or law.

Microsoft's researchers explored three main approaches to creating these specialized models. The first builds a model from the ground up, but it is complex and resource-intensive. The second refines existing models with additional training, which may not work equally well for all tasks. The third, which Microsoft decided to focus on, leverages existing knowledge about a field to teach the model.

The third approach, called domain-adaptive pretraining, continues training an existing LLM on a large text dataset from a particular domain. This helps the model learn the vocabulary and concepts that matter in that domain.
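To make the idea concrete, here is a deliberately tiny sketch. It uses a toy unigram language model (not an actual LLM, and not Microsoft's code) to show the mechanism: a model pretrained on general text is then further trained on a domain corpus, and domain vocabulary that it previously knew nothing about becomes part of what it has learned. The corpora and the `UnigramLM` class are illustrative assumptions.

```python
from collections import Counter

class UnigramLM:
    """Toy unigram language model: a stand-in for an LLM, used only to
    illustrate how continued pretraining on domain text shifts what a
    model knows about domain vocabulary."""
    def __init__(self):
        self.counts = Counter()
        self.total = 0

    def train(self, corpus):
        # "Pretraining" here is just accumulating token counts.
        for doc in corpus:
            tokens = doc.lower().split()
            self.counts.update(tokens)
            self.total += len(tokens)

    def prob(self, token):
        # Probability the model assigns to a token (0 if never seen).
        return self.counts[token] / self.total if self.total else 0.0

# Stage 1: general-purpose pretraining on broad text.
general = ["the cat sat on the mat", "stocks rose today the market is up"]
# Stage 2: domain-adaptive pretraining continues on domain text (here: finance).
finance = ["the bond yield curve inverted",
           "equity derivatives hedge interest rate risk"]

model = UnigramLM()
model.train(general)
p_before = model.prob("derivatives")   # 0.0: unseen during general pretraining
model.train(finance)                   # continued pretraining on the domain corpus
p_after = model.prob("derivatives")    # now positive: domain term was learned
print(p_before < p_after)
```

The same two-stage shape (general pretraining, then continued training on a domain corpus) is what domain-adaptive pretraining does with real transformer LLMs, just at vastly larger scale.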

Microsoft researchers have found that domain-adaptive pretraining can be done more cost-effectively by transforming raw corpora into reading comprehension texts: passages paired with questions whose answers require the reader to understand the passage.
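A minimal sketch of that transformation step, under loose assumptions: the function below mines simple "X is Y" definition sentences from a raw passage and turns each into a question–answer pair. This is a hypothetical rule of my own for illustration, far simpler than the researchers' actual transformations, but it shows the raw-text-to-comprehension-task shape.

```python
import re

def to_reading_comprehension(passage):
    """Turn a raw passage into (question, answer) pairs by matching
    simple 'X is Y' definition sentences. Illustrative heuristic only;
    not the transformation rules Microsoft's researchers used."""
    pairs = []
    for sentence in re.split(r"(?<=[.!?])\s+", passage.strip()):
        m = re.match(r"(.+?)\s+is\s+(.+?)[.!?]?$", sentence)
        if m:
            subject, definition = m.group(1), m.group(2)
            pairs.append((f"What is {subject.lower()}?", definition))
    return pairs

passage = ("A bond is a fixed-income instrument. "
           "Duration is a measure of interest rate sensitivity.")
for question, answer in to_reading_comprehension(passage):
    print(question, "->", answer)
```

Training on pairs like these, rather than on the raw passage alone, is what makes the comprehension-style corpus: the model must connect a question back to facts stated in the text.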

Microsoft researchers have shown that AdaptLLM, a model trained using domain-adaptive pretraining on reading comprehension texts, performs better on domain-specific tasks.
