Recent report reveals why AI progress shows no signs of decelerating

Priya Walia
August 4, 2023
2 min read

Time magazine explains how advancements in AI have been driven by three key factors: compute, data, and algorithms.

In a recent report, the publication explained how the exponential strides in AI over the last seven decades have come from a relentless pursuit of greater computational power, more training data, and algorithmic innovations that achieve equivalent results with less data and compute.

Compute: Bringing Greater Intelligence to AI

Time cites the example of the world’s first artificial neural network, Perceptron Mark I, developed in 1957. It was a modest structure with 1,000 artificial neurons that performed around 700,000 operations. Fast forward to today, OpenAI’s GPT-4 language model relies on a staggering 21 septillion operations for training, underlining the incredible progress made in computational capabilities.
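To put those two figures side by side, here is a quick back-of-the-envelope calculation (the numbers are the ones cited above; "septillion" is read on the short scale as 10^24):

```python
# Rough scale comparison, using the figures cited in the report.
perceptron_ops = 7e5     # ~700,000 operations (Perceptron Mark I, 1957)
gpt4_ops = 21e24         # ~21 septillion operations (GPT-4 training)

growth_factor = gpt4_ops / perceptron_ops
print(f"GPT-4's training used about {growth_factor:.0e} times more operations")
```

That works out to a factor of roughly 3 × 10^19, which is the scale of progress the report is pointing at.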

Aligned with Moore’s law, the falling cost of computational power has made it increasingly affordable to train larger AI models. AI developers now spend millions on training runs; OpenAI CEO Sam Altman has said that training GPT-4 cost over $100 million. Such investment yields massive returns in model capability. AI leaders like OpenAI and Anthropic, backed by significant investments from Microsoft and Google, respectively, spearhead this movement.
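As a rough sketch of what Moore’s-law-style scaling implies (assuming, purely for illustration, that compute per dollar doubles every two years; the real doubling period has varied over time):

```python
def compute_per_dollar_growth(years, doubling_period=2.0):
    """Cumulative growth factor in compute per dollar, assuming
    one doubling every `doubling_period` years (an illustrative figure)."""
    return 2 ** (years / doubling_period)

# Between the Perceptron Mark I (1957) and GPT-4 (2023): 66 years.
print(compute_per_dollar_growth(66))
```

Under that assumption the same dollar buys roughly 8.6 billion times more compute today than in 1957, which is why ever-larger training runs keep becoming economically feasible.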

Data: The Lifeblood of AI

The report further explains that AI systems are essentially learning mechanisms that model relationships within their training data: the more data points, the better the system performs. Take LLaMA, a large language model developed by Meta. It was trained on approximately a billion data points, a vast leap from early systems such as the Perceptron Mark I, which was trained on just six.

How much data an AI system can efficiently process is ultimately limited by the available compute. Interestingly, recent trends suggest that the amount of data used to train AI systems is growing faster than new data is being created on the internet. However, AI developers are optimistic about overcoming this challenge by finding ways to make use of lower-quality language data.

Algorithms: Getting More Out of Less

Algorithms dictate how AI systems use the available compute to model relationships in data. Much of the field’s progress has come from algorithmic innovations that extract more performance from the same compute, and future advances may likewise need to compensate for limits on data availability.
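A generic illustration of that idea (not drawn from the report): a better algorithm can reach the same answer with far fewer operations. Searching a sorted list of a million items can take up to a million comparisons with a linear scan, but only about twenty with binary search:

```python
import bisect

data = list(range(1_000_000))  # a sorted list of one million items
target = 987_654

# Linear scan: examines elements one by one until it finds the target.
linear_steps = next(i for i, x in enumerate(data) if x == target) + 1

# Binary search: ~log2(len(data)) ≈ 20 comparisons for the same answer.
index = bisect.bisect_left(data, target)

print(linear_steps)            # elements examined by the linear scan
print(data[index] == target)   # True: binary search found the same item
```

The same principle, at far larger scale, is what lets newer AI training methods match older results with a fraction of the compute.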

Experts widely concur that the rate of AI progress, driven by the interplay of compute, data, and algorithms, will likely continue unhindered for the next few years. However, this rapid growth has also raised concerns about the possibility of misuse.
