Be careful of bias

Do you have a home voice assistant? Alexa, Siri, or Google Assistant? I heard the argument raging in the kitchen from my home office. Herself wanted a light turned on; Siri was quite insistent on asking how long the timer should be. Neither was budging, and tempers (on one side) were getting frayed. I used the app to quietly turn the light on.

A few weeks later Herself showed me a TikTok claiming that these voice assistants respond better to a male voice than a female one, because of biases in how they’re trained. I can’t find a paper on it, as search engines are abysmal at the moment. But it’s food for thought: bias.

(And for the public record, this does indeed confirm that Siri is stupid and Herself was right.)

Bias exists, though. Charity Digital did a good summary here, which covers gender, language, and racial bias:

Is your voice assistant biased?
We explore the social implications of this artificial intelligence technology, and what it says about how our tech can do better

Apparently it’s really bad for the Welsh, but this comedy classic still gives me sympathy for the Scottish.

So it’s good to see the UK Government publishing details of the algorithms it is going to use. Transparency can only help, and as a man I’d probably rarely come across any issues myself.

Warnings AI tools used by government on UK public are ‘racist and biased’
Transparency campaigners welcome government move to publish details of system algorithms

Until the next heated ‘debate’ at home of course.
