Workers suffer when AI gets it wrong, argues professor

By Simon Osuji
November 29, 2023
in Artificial Intelligence

Credit: Unsplash/CC0 Public Domain

Amazon thought it had found an efficient way to find the best workers. Recruitment is time-consuming and expensive, so why not outsource it to artificial intelligence (AI)?

Their team built an AI-based algorithm—a series of instructions telling a computer how to analyze data—that would give each candidate a score from one to five stars. They could then simply choose the candidates with five stars.

But there was a problem. It turned out that women didn’t score well for software and tech jobs. What was going on?

Well, the algorithm was trained on CVs submitted to Amazon over the previous 10 years, and most came from men. The algorithm had “learned” that men were to be preferred. It awarded more stars for masculine language in a CV and took off stars for anyone who went to a women’s college.

The algorithm had been taught to discriminate, copying human bias.
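
To see how this can happen without anyone writing an explicitly sexist rule, consider the toy sketch below. It is not Amazon's actual system: the feature names, weights and synthetic data are illustrative assumptions only, showing how a model trained on historically biased hiring decisions learns to penalise a proxy feature such as attendance at a women's college.

```python
# Toy illustration only (not Amazon's system): a scoring model trained on
# historically biased hiring decisions reproduces that bias.
# Assumes numpy and scikit-learn are installed; all data is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000

# Hypothetical CV features: years of experience, count of "masculine-coded"
# words, and whether the candidate attended a women's college.
experience = rng.integers(0, 15, n)
masculine_words = rng.integers(0, 6, n)
womens_college = rng.integers(0, 2, n)

# Historical "hired" labels that encode human bias: past recruiters rewarded
# masculine language and marked down women's-college graduates,
# independently of experience.
hidden_bias = 0.8 * masculine_words - 2.0 * womens_college
hired = (0.3 * experience + hidden_bias + rng.normal(0, 1, n) > 3).astype(int)

X = np.column_stack([experience, masculine_words, womens_college])
model = LogisticRegression().fit(X, hired)

# The trained model copies the bias in its training data: the coefficient on
# womens_college comes out negative, so those CVs score lower even when
# experience is identical.
for name, coef in zip(["experience", "masculine_words", "womens_college"],
                      model.coef_[0]):
    print(f"{name}: {coef:+.2f}")
```

Note that the model never sees a gender column; it infers the pattern from proxy features, which is why removing names and pronouns, as the next point shows, is not enough.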

Other studies have found that AI can pick up gender signals in a CV, even when a name and pronouns are removed. And, even if AI is trained to be gender-neutral, it might still discriminate against parents or other vulnerable employee groups, like those who are racially or culturally diverse or LGBTQI+.

But most cases of AI-based discrimination won’t be reported. Or maybe even noticed. And that is a big problem.

In a detailed analysis of Australian workplace laws, published in the Melbourne University Law Review, I found that little is known about how Australian employers are using AI.

There are many software tools that use AI to streamline human resource functions—from recruitment to performance management and even to dismissal. But how these are being used is often only revealed when things go really wrong.

For example, the Australian Public Service tried using AI-assisted technology to manage promotions. Many of these promotions were later overturned for not being based on merit, but this was only revealed because the Public Service has a dedicated Merit Protection Commissioner.

What happens in the private sector, where most people work?

Europe has strong privacy and data protection laws—the General Data Protection Regulation (GDPR)—that demand a human decision-maker have the final say in any automated process that significantly affects people’s lives. In the EU, gig workers have used this to challenge Uber and Ola when they were automatically terminated as drivers.

But Australia has no equivalent.

Australian privacy law significantly lags behind jurisdictions such as the UK and the European Union. Incredibly, it contains a blanket exception for “employee records”—while your employer needs your consent to initially gather new data, there are no limits placed on how that data is used once it is held.

And the federal Privacy Act 1988 (Cth) does not apply to small businesses, which employ most Australian workers.

Discrimination law might fill this gap if it can be shown that an AI algorithm discriminates against certain people or groups. But if we don’t know that an algorithm is being used, how do we challenge it?

Discrimination law mostly relies on individuals making a complaint—and few people do, even when they know they have been discriminated against. With automated decisions, we may not even know what algorithm has been used, let alone if it is discriminating against us.
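
When decisions are visible, testing for this kind of disparity is not hard. The sketch below applies a common screening heuristic, the “four-fifths rule” from US employment guidance, purely as an illustration of the statistical checks that become possible once employers report how their algorithms decide; the counts are invented.

```python
# Hypothetical audit sketch: compare selection rates across groups scored by
# an automated system. Counts are invented; a real audit would need access
# to the employer's actual decisions.
def selection_rate(selected: int, applicants: int) -> float:
    """Share of applicants in a group who were selected."""
    return selected / applicants

def disparate_impact_ratio(group_rate: float, reference_rate: float) -> float:
    """Ratio of a group's selection rate to the most-favoured group's rate.

    Under the US "four-fifths rule", a ratio below 0.8 is commonly treated
    as a signal of possible adverse impact worth investigating.
    """
    return group_rate / reference_rate

women_rate = selection_rate(selected=30, applicants=200)  # 0.15
men_rate = selection_rate(selected=60, applicants=200)    # 0.30

ratio = disparate_impact_ratio(women_rate, men_rate)      # 0.50
print(f"Selection-rate ratio: {ratio:.2f}")
if ratio < 0.8:
    print("Possible adverse impact: investigate the scoring system.")
```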

We need reform—and soon. As we watch the management of OpenAI (makers of ChatGPT) implode over fears of where the technology is going, it is clear we need strong regulation of new AI technologies. We cannot rely on the companies themselves to have the answers or call for help when needed, much less to publicly report any serious problems.

The Federal government, in its response to the review of the Privacy Act, has agreed in principle to consult on the employee records exception. And it has agreed in principle to (eventually) remove the small business exception.

But we need more. Adopting rigorous privacy law—like the GDPR—is a first step. But the EU has recognized the need to go further, as it attempts to pass the new EU AI Act. The Act aims to be the world’s first comprehensive AI law and would impose more regulation for riskier technologies. Employment systems using AI would be classed as high-risk.

I have argued, though, that discrimination law also needs an overhaul. Rather than relying on individuals to make a complaint, we need positive, proactive obligations on employers, so it is clear what they are doing, and clear that they must engage with workers before they adopt these new technologies.

We must demand open reporting on AI development and practices, or we might not find out what we need to know until it is too late.

Provided by University of Melbourne

Citation: Workers suffer when AI gets it wrong, argues professor (2023, November 28), retrieved 28 November 2023 from https://techxplore.com/news/2023-11-workers-ai-wrong-professor.html
