
Meta’s Yann LeCun joins 70 others in calling for more openness in AI development

By Simon Osuji
November 2, 2023
in Creator Economy

On the same day the U.K. gathered some of the world’s corporate and political leaders into the same room at Bletchley Park for the AI Safety Summit, more than 70 signatories added their names to a letter calling for a more open approach to AI development.

“We are at a critical juncture in AI governance,” the letter, published by Mozilla, notes. “To mitigate current and future harms from AI systems, we need to embrace openness, transparency and broad access. This needs to be a global priority.”

Much like what has gone on in the broader software sphere for the past few decades, a major backdrop to the burgeoning AI revolution has been open versus proprietary — and the pros and cons of each. Over the weekend, Facebook parent Meta’s chief AI scientist Yann LeCun took to X to decry efforts from some companies, including OpenAI and Google’s DeepMind, to secure “regulatory capture of the AI industry” by lobbying against open AI R&D.

“If your fear-mongering campaigns succeed, they will *inevitably* result in what you and I would identify as a catastrophe: a small number of companies will control AI,” LeCun wrote.

And this theme continues to permeate the governance efforts now emerging, from President Biden’s executive order to the AI Safety Summit hosted by the U.K. this week. On one hand, the heads of large AI companies warn of the existential threats AI poses, arguing that open source AI can be manipulated by bad actors to, for example, more easily create chemical weapons. On the other, counterarguments posit that such scaremongering merely serves to concentrate control in the hands of a few protectionist companies.

Proprietary control

The truth is probably somewhat more nuanced than that, but it’s against that backdrop that dozens of people put their name to an open letter today, calling for more openness.

“Yes, openly available models come with risks and vulnerabilities — AI models can be abused by malicious actors or deployed by ill-equipped developers,” the letter says. “However, we have seen time and time again that the same holds true for proprietary technologies — and that increasing public access and scrutiny makes technology safer, not more dangerous. The idea that tight and proprietary control of foundational AI models is the only path to protecting us from society-scale harm is naive at best, dangerous at worst.”

Esteemed AI researcher LeCun — who joined Meta 10 years ago — attached his name to the letter, alongside numerous other notable names including Google Brain and Coursera co-founder Andrew Ng, Hugging Face co-founder and CTO Julien Chaumond and renowned technologist Brian Behlendorf from the Linux Foundation.

Specifically, the letter identifies three main areas where openness can support safe AI development: enabling greater independent research and collaboration, increasing public scrutiny and accountability, and lowering the barriers to entry for newcomers to the AI space.

“History shows us that quickly rushing towards the wrong kind of regulation can lead to concentrations of power in ways that hurt competition and innovation,” the letter notes. “Open models can inform an open debate and improve policy making. If our objectives are safety, security and accountability, then openness and transparency are essential ingredients to get us there.”
