Theoretical particle physicist tackles machine learning’s black box

by Simon Osuji
August 14, 2025
in Artificial Intelligence


Graphical representation of the model. The matrix variables can be viewed as linear maps between vector spaces, which are represented by arrows in the figure. Credit: Machine Learning: Science and Technology (2025). DOI: 10.1088/2632-2153/adc872

From self-driving cars to facial recognition, modern life is growing more dependent on machine learning, a type of artificial intelligence (AI) that learns from datasets without explicit programming.

Despite its omnipresence in society, we’re just beginning to understand the mechanisms driving the technology. In a recent study, Zhengkang (Kevin) Zhang, assistant professor in the University of Utah’s Department of Physics & Astronomy, demonstrated how physicists can play an important role in unraveling its mysteries.

“People used to say machine learning is a black box—you input a lot of data and at some point, it reasons and speaks and makes decisions like humans do. It feels like magic because we don’t really know how it works,” said Zhang. “Now that we’re using AI across many critical sectors of society, we have to understand what our machine learning models are really doing—why something works or why something doesn’t work.”

As a theoretical particle physicist, Zhang explains the world around him by understanding how the smallest, most fundamental components of matter behave in an infinitesimal world. Over the past few years, he’s applied the tools of his field to better understand machine learning’s massively complex models.

Scaling up while scaling down costs

The traditional way to program a computer is with detailed instructions for completing a task. Say you want software that can spot irregularities on a CT scan. A programmer would have to write step-by-step protocols for countless potential scenarios.

Instead, a machine learning model trains itself. A human programmer supplies relevant data—text, numbers, photos, transactions, medical images—and lets the model find patterns or make predictions on its own.

Throughout the process, a human can tweak the parameters to get more accurate results without knowing how the model turns the input data into its output.
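
As a toy illustration of that contrast (a hypothetical sketch, not anything from Zhang's work), the snippet below lets a small scikit-learn model infer a rule from labeled examples instead of being handed the rule as code; the task and data are invented for the example.

# Hypothetical sketch: the model learns a rule from examples rather than
# from hand-written instructions. Task and data are invented for illustration.
from sklearn.linear_model import LogisticRegression

# Toy dataset: the label is 1 when the first number exceeds the second.
X = [[3, 1], [1, 3], [5, 2], [2, 5], [7, 6], [6, 7]]
y = [1, 0, 1, 0, 1, 0]

model = LogisticRegression()
model.fit(X, y)                           # the model finds the pattern itself
print(model.predict([[4, 2], [2, 4]]))    # expected output: [1 0]

Nothing in the code spells out the comparison rule; the model recovers it from the labeled data, which is the contrast with traditional step-by-step programming.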

Machine learning is energy intensive and wildly expensive. To maximize profits, industry trains models on smaller datasets before scaling them up to real-world scenarios with much larger volumes of data.

“We want to be able to predict how much better the model will do at scale. If you double the size of the model or double the size of the dataset, does the model become two times better? Four times better?” said Zhang.
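
A widely used empirical form for such scaling laws (given here only as a generic illustration, not the specific laws derived in Zhang's paper) writes the test loss as a sum of power laws in the number of parameters N and the dataset size D:

\[
  L(N, D) \;\approx\; E \;+\; \frac{A}{N^{\alpha}} \;+\; \frac{B}{D^{\beta}},
\]

where A, B, E, \(\alpha\) and \(\beta\) are constants fitted on small-scale training runs. Under a form like this, doubling the model size does not simply halve the loss; it shrinks the model-size term by a factor of \(2^{-\alpha}\), which is the kind of quantitative prediction the scaling-law program aims to supply before paying for the expensive large-scale run.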

A portion of the Feynman diagrams used to solve a machine learning model. Credit: University of Utah

A physicist’s toolbox

A machine learning model looks simple: input data → black box of computing → output that's a function of the input.

The black box contains a neural network, which is a suite of simple operations connected in a web to approximate complicated functions. To optimize the network’s performance, programmers have conventionally relied on trial and error, fine-tuning and re-training the network and racking up costs.
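
A minimal sketch makes that "suite of simple operations" concrete (an arbitrary toy network written in NumPy, not the model analyzed in the paper): two matrix multiplications and one elementwise nonlinearity already define a function from input to output.

import numpy as np

# Toy two-layer network: the "black box" is just a chain of simple operations
# (matrix multiplications and an elementwise nonlinearity). Sizes and values
# are arbitrary, purely for illustration.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(16, 4))    # first-layer weights (trainable parameters)
W2 = rng.normal(size=(1, 16))    # second-layer weights

def network(x):
    hidden = np.maximum(0.0, W1 @ x)   # linear map followed by a ReLU
    return W2 @ hidden                 # another linear map gives the output

x = rng.normal(size=4)                 # input data
print(network(x))                      # output: a function of the input

Training a real network amounts to adjusting the entries of W1 and W2 (and many more weights like them) until the outputs fit the data; the costly fine-tuning and re-training described above is the search over such settings.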

“Being trained as a physicist, I would like to understand better what is really going on to avoid relying on trial and error,” Zhang said. “What are the properties of a machine learning model that give it the capability to learn to do things we wanted it to do?”

In a new paper published in the journal Machine Learning: Science and Technology, Zhang solved a proposed model's scaling laws, which describe how the system will perform at larger and larger scales. It's not easy—the calculations require adding up an infinite number of terms.

Zhang applied a method that physicists use to track hundreds of thousands of terms, called Feynman diagrams. Richard Feynman invented the technique in the 1940s to deal with hopelessly complicated calculations of elementary particles in the quantum realm. Instead of writing down algebraic equations, Feynman drew simple diagrams—every line and vertex in the diagram represents a value.
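
Schematically, and only as a generic illustration of the bookkeeping rather than the specific diagrams in Zhang's calculation, a diagrammatic expansion assigns a factor to every line and every vertex of a diagram and then sums over diagrams:

\[
  \langle \mathcal{O} \rangle \;=\; \sum_{\text{diagrams } G} \frac{1}{S_G}
  \prod_{\text{lines } \ell \in G} P_\ell
  \prod_{\text{vertices } v \in G} \lambda_v ,
\]

where \(P_\ell\) is the value attached to a line, \(\lambda_v\) the value attached to a vertex, and \(S_G\) a symmetry factor of the diagram G. Drawing and counting diagrams then replaces pages of algebra.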

“It’s so much easier for our brains to grasp, and also easier to keep track of what kind of terms enter your calculation,” Zhang said.

Zhang used Feynman diagrams to solve a model posed in published research from 2022. In that paper, the physicists studied their model in a particular limit. Zhang was able to solve the model beyond that limit, obtaining new and more precise scaling laws that govern its behavior.

As society runs headfirst into AI, many researchers are working to ensure the tools are being used safely. Zhang believes that physicists can join the engineers, computer scientists and others working to use AI responsibly.

“We humans are building machines that are already controlling us—YouTube algorithms that recommend videos that suck each person into their own little corners and influence our behavior,” Zhang said. “That’s the danger of how AI is going to change humanity—it’s not about robots colonizing and enslaving humans. It’s that we humans build machines that we are struggling to understand, and our lives are already deeply influenced by these machines.”

More information:
Zhengkang Zhang, Neural scaling laws from large-N field theory: solvable model beyond the ridgeless limit, Machine Learning: Science and Technology (2025). DOI: 10.1088/2632-2153/adc872

Provided by
University of Utah

Citation:
Theoretical particle physicist tackles machine learning’s black box (2025, August 13)
retrieved 13 August 2025
from https://techxplore.com/news/2025-08-theoretical-particle-physicist-tackles-machine.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no
part may be reproduced without the written permission. The content is provided for information purposes only.




