How states are placing guardrails around AI in the absence of strong federal regulation

By Simon Osuji
August 7, 2025 | Artificial Intelligence


AI guardrails. Credit: AI-generated image

U.S. state legislatures are where the action is for placing guardrails around artificial intelligence technologies, given the lack of meaningful federal regulation. The resounding defeat in Congress of a proposed moratorium on state-level AI regulation means states are free to continue filling the gap.


All 50 states introduced AI-related legislation in 2025, and several have already enacted laws governing the use of AI.

Four aspects of AI in particular stand out from a regulatory perspective: government use of AI, AI in health care, facial recognition and generative AI.

Government use of AI

The oversight and responsible use of AI are especially critical in the public sector. Predictive AI—AI that performs statistical analysis to make forecasts—has transformed many governmental functions, from determining social services eligibility to making recommendations on criminal justice sentencing and parole.

But the widespread use of algorithmic decision-making could have major hidden costs. Potential algorithmic harms posed by AI systems used for government services include racial and gender biases.

Recognizing the potential for algorithmic harms, state legislatures have introduced bills focused on public sector use of AI, emphasizing transparency, consumer protection and awareness of the risks of AI deployment.

Several states have required AI developers to disclose risks posed by their systems. The Colorado Artificial Intelligence Act includes transparency and disclosure requirements for developers of AI systems involved in making consequential decisions, as well as for those who deploy them.

Montana’s new “Right to Compute” law requires developers of AI systems involved in critical infrastructure to adopt risk management frameworks, methods for addressing security and privacy in the development process. Some states have established bodies with oversight and regulatory authority, such as those specified in New York’s SB 8755.

AI in health care

In the first half of 2025, 34 states introduced over 250 AI-related health bills. The bills generally fall into four categories: disclosure requirements, consumer protection, insurers’ use of AI and clinicians’ use of AI.

Bills about transparency define requirements for the information that AI system developers, and the organizations that deploy their systems, must disclose.

Consumer protection bills aim to keep AI systems from unfairly discriminating against some people, and ensure that users of the systems have a way to contest decisions made using the technology.

Bills covering insurers provide oversight of the payers’ use of AI to make decisions about health care approvals and payments. And bills about clinical uses of AI regulate use of the technology in diagnosing and treating patients.

Facial recognition and surveillance

In the U.S., privacy protection, including protection from facial surveillance, rests on a long-standing legal doctrine of safeguarding individual autonomy against interference from the government. In this context, facial recognition technologies pose significant privacy challenges as well as risks from potential biases.

Facial recognition software, commonly used in predictive policing and national security, has exhibited biases against people of color and consequently is often considered a threat to civil liberties. A pathbreaking study by computer scientists Joy Buolamwini and Timnit Gebru found that facial recognition software poses significant challenges for Black people and other historically disadvantaged minorities: the software was less likely to correctly identify darker faces.

Bias also creeps into the data used to train these algorithms, for example when the teams that guide the development of such facial recognition software lack diversity.

By the end of 2024, 15 U.S. states had enacted laws to limit the potential harms from facial recognition. Common elements of these state-level regulations include requirements that vendors publish bias test reports and data management practices, as well as the need for human review in the use of these technologies.
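
Bias test reports of this kind generally come down to comparing error rates across demographic groups, the kind of disparity the Buolamwini and Gebru study documented. The short Python sketch below is only a minimal illustration of that comparison, not a methodology prescribed by any state law; the group labels, similarity threshold and records are hypothetical placeholders.

```python
# Minimal sketch of the per-group error analysis a bias test report might contain.
# The group labels, threshold and records below are hypothetical placeholders.
from collections import defaultdict

# Each record: (demographic_group, ground_truth_is_match, similarity_score).
# In practice these would come from a large labeled evaluation set.
records = [
    ("group_a", True, 0.91), ("group_a", False, 0.32), ("group_a", True, 0.58),
    ("group_b", True, 0.88), ("group_b", False, 0.72), ("group_b", True, 0.47),
]

THRESHOLD = 0.6  # similarity score at or above which the system declares a match

stats = defaultdict(lambda: {"fn": 0, "fp": 0, "pos": 0, "neg": 0})
for group, is_match, score in records:
    predicted_match = score >= THRESHOLD
    if is_match:
        stats[group]["pos"] += 1
        if not predicted_match:
            stats[group]["fn"] += 1  # a true match the system missed
    else:
        stats[group]["neg"] += 1
        if predicted_match:
            stats[group]["fp"] += 1  # a non-match the system wrongly accepted

for group, s in sorted(stats.items()):
    fnmr = s["fn"] / s["pos"] if s["pos"] else 0.0  # false non-match rate
    fmr = s["fp"] / s["neg"] if s["neg"] else 0.0   # false match rate
    print(f"{group}: false non-match rate = {fnmr:.2f}, false match rate = {fmr:.2f}")
```

A real evaluation would draw on a large labeled dataset, but per-group false match and false non-match rates are the kinds of quantities such reports typically compare.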

Generative AI and foundation models

The widespread use of generative AI has also prompted concerns from lawmakers in many states. Utah’s Artificial Intelligence Policy Act requires individuals and organizations to clearly disclose when they’re using generative AI systems to interact with someone when that person asks if AI is being used, though the legislature subsequently narrowed the scope to interactions that could involve dispensing advice or collecting sensitive information.

Last year, California passed AB 2013, a generative AI law that requires developers to post information on their websites about the data used to train their AI systems, including foundation models. A foundation model is an AI model trained on extremely large datasets that can be adapted to a wide range of tasks without additional training.

AI developers have typically not been forthcoming about the training data they use. Such legislation could help copyright owners of content used in training AI overcome the lack of transparency.

Trying to fill the gap

In the absence of a comprehensive federal legislative framework, states have tried to address the gap by moving forward with their own legislative efforts. While such a patchwork of laws may complicate AI developers’ compliance efforts, I believe that states can provide important and needed oversight on privacy, civil rights and consumer protections.

Meanwhile, the Trump administration announced its AI Action Plan on July 23, 2025. The plan says “The Federal government should not allow AI-related Federal funding to be directed toward states with burdensome AI regulations … “

The move could hinder state efforts to regulate AI if states have to weigh regulations that might run afoul of the administration’s definition of burdensome against needed federal funding for AI.

Provided by The Conversation

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Citation: How states are placing guardrails around AI in the absence of strong federal regulation (2025, August 6), retrieved 6 August 2025 from https://techxplore.com/news/2025-08-states-guardrails-ai-absence-strong.html

