Microsoft’s new AI model AdaptLLM can learn about ‘specific domains’ faster and cheaper

Devesh Beri
September 28, 2023

As reported by Multiplatform.ai, Microsoft has developed a new way to train large language models (LLMs) to better understand and generate text in specific domains. The method is more cost-effective than previous approaches and produces LLMs that perform better on domain-specific tasks.

LLMs are good at understanding and generating general-purpose text, but they are weaker in specialized domains such as biomedicine, finance, or law.

The researchers explored three main approaches to building such specialized models. The first trains a model from scratch on domain data, which is complex and resource-intensive. The second refines an existing model with additional training on raw domain text, which may not work equally well across tasks. The third, which Microsoft decided to focus on, leverages a field's existing text to teach the model that domain's knowledge more directly.

This method, called domain-adaptive pretraining, continues training an LLM on a large text corpus from a particular domain. The additional training helps the model learn the vocabulary and concepts that matter in that field.

Microsoft researchers found that domain-adaptive pretraining can be made more cost-effective by transforming the raw corpus into reading comprehension texts: each raw passage is kept and followed by questions or tasks about it, whose answers require understanding the passage.

Microsoft researchers have shown that AdaptLLM, a model trained with domain-adaptive pretraining on these reading comprehension texts, performs better on domain-specific tasks than the same model adapted on the raw corpus alone.
