
For AI, secrecy often does not improve security

By Simon Osuji
October 14, 2024
in Artificial Intelligence


Credit: Pixabay/CC0 Public Domain

Concern about misuse of artificial intelligence has led political leaders to consider regulating the emerging technology in ways that could limit access to AI models’ inner workings. But researchers at a group of leading universities, including Princeton, caution that such restrictions are likely to do more harm than good.


In an article published online Oct. 10 in the journal Science, a research team that includes Princeton computer science professor Arvind Narayanan and graduate student Sayash Kapoor concludes that limiting public access to the underlying structures of AI systems could have several negative consequences.

These include stifling innovation by restricting engineers’ ability to improve and adapt the models; increasing secrecy of the models’ operation; and concentrating power in the hands of a few individuals and corporations who control access to the technology.

The article discusses in detail the threats posed by misuse of AI systems in areas including disinformation, hacking, bioterrorism and the creation of false images. The researchers assess each risk and discuss whether there are more effective ways to combat it than restricting access to AI models.

For example, discussing how AI could be misused to generate text for email scams called spear-phishing, the researchers note that it is more effective to bolster defenses than restrict AI.

“[T]he key bottleneck for spear phishing is not generally the text of emails but downstream safeguards: modern operating systems, browsers, and email services implement several layers of protection against such malware,” they write.

The emergence of artificial intelligence in the past few years has led to calls for regulating the technology, including steps by the White House and the European Union. At issue are the computer code and data that make up today’s primary AI systems, such as GPT-4 and Llama 2. Known as foundation models, these systems can be harnessed to write reports, create graphics and perform other tasks.

A major distinction among the models is how they are released. Some, called open models, are fully available for public inspection. Others, called closed models, are available only to their designers. A third type, hybrid models, keeps some parts of the model secret while making other parts publicly accessible.

Although the distinction seems technical, it can be critical to regulation. The researchers said most of the concern about AI models relates to ways the models could be subverted for malicious purposes. One option to combat misuse is to make such adaptations harder by restricting access to the models.

Regulators could do this by requiring developers to block outside access. They could also make developers legally responsible for misuse of the models by others, which likely would have the same result.

The researchers found that the available evidence does not show that open models are riskier than closed models, or riskier than information already obtainable through standard research techniques such as online searching.

In an article presented earlier this year at the International Conference on Machine Learning, the researchers concluded that evidence shows that restricting access to models does not necessarily limit the risk of misuse. This is partly because even closed models can be subverted and partly because information for malicious actors might already be available on the internet through search engines.

“Correctly characterizing the distinct risk of open foundation models requires centering marginal risk: To what extent do open foundation models increase risk relative to closed foundation models, or to preexisting technologies such as search engines?” the researchers write.
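The marginal-risk framing the researchers describe can be sketched in a few lines of Python. This is purely illustrative: the function name and the probability figures below are hypothetical assumptions for exposition, not values from the paper.

```python
# Illustrative sketch of the "marginal risk" framing: how much extra risk
# does an open model add beyond what a baseline (a closed model, or an
# ordinary search engine) already enables? All numbers are hypothetical.

def marginal_risk(risk_with_open_model: float, risk_with_baseline: float) -> float:
    """Extra risk attributable to the open model, relative to the baseline."""
    return risk_with_open_model - risk_with_baseline

# Hypothetical example: if an attacker succeeds 30% of the time with an
# open model but 25% of the time using preexisting tools, the open model
# adds only 5 percentage points of marginal risk.
extra = marginal_risk(0.30, 0.25)
print(f"{extra:.2f}")  # prints 0.05
```

The point of the framing is that policy should target the difference, not the absolute risk: if the baseline already makes an attack easy, restricting the open model buys little.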

The researchers said that this does not mean that access to models shouldn’t be limited. In some areas, closed models provide the best solution. But, they argue, regulators need to carefully consider whether limiting access is the best way to prevent harm.

“For many threat vectors, existing evidence of marginal risk is limited,” they write. “This does not mean that open foundation models pose no risk along these vectors but, instead, that more rigorous analysis is required to substantiate policy interventions.”

More information:
Rishi Bommasani et al, Considerations for governing open foundation models, Science (2024). DOI: 10.1126/science.adp1848

Provided by
Princeton University

Citation:
For AI, secrecy often does not improve security (2024, October 14)
retrieved 14 October 2024
from https://techxplore.com/news/2024-10-ai-secrecy.html

