

Racism at Work Podcast Episode 5: AI – is it bias, and what does that mean for the workplace?

Pearn Kandola
June 21, 2019

 

Harry and I discussed the role AI has to play in the future of our workplaces, and how our biases, conscious or unconscious, can be transferred directly into its algorithms. We also asked how, in order to combat this growing threat, we can all improve our knowledge of the issue and reduce the likelihood of these biases taking hold.

“We’ve got to ensure that when we’re training algorithms to make decisions on our behalf that we’re not giving it biased training that we can’t undo.” – Harry Gaskell

Harry Gaskell is the Chief Innovation Officer and a Partner at EY UK&I. He is also a Chair at the Employers Network for Equality & Inclusion (enei). You can follow Harry on LinkedIn.

We cover a wide range of topics, including:

  • How AI can be used to empower our workplaces
  • Could AI be fairer than humans?
  • Sources of bias in AI that we need to combat
  • What is required to properly train unbiased AI
  • The challenges of undoing bias in AI
  • Can AI facilitate a fair selection of senior leaders?
  • Examples of how AI can fail to tell the difference
  • Who should be held responsible for AI from a legal perspective?
  • Potential actions we can implement to make AI systems more accurate
  • Is an international standard required to regulate AI?
  • Social responsibility and AI

“If we’re not careful we might start building AI which is around for decades that was never built around principles such as ethics, social responsibility, accountability, explainability, reliability, lineage.” – Harry Gaskell

And much more. Please enjoy, and be sure to grab a copy of Racism at Work: The Danger of Indifference and connect with Professor Binna Kandola OBE on LinkedIn to join the conversation or share your thoughts.

You can also listen on Apple Podcasts, Spotify, Stitcher, TuneIn or in any other podcasting app by searching for “Racism at Work Podcast”, or simply by asking Siri or Alexa to “play the Racism at Work podcast”.

Mentioned in the show

  • History of AI in healthcare [9:35]
  • Amazon scraps secret AI recruiting tool that showed bias against women [11:50]
  • Google ‘fixed’ its racist algorithm by removing gorillas from its image-labeling tech [13:32]
  • Is this a wolf? Understanding bias in machine learning [25:03]
  • AI is the future – but where are the women? [31:45]
  • Trusted AI – IBM Research [34:15]
  • Putting artificial intelligence (AI) to work by EY [34:20]

