It’s part of a bigger conversation about the amount of personal data that we give away to companies and advertisers, which allows them to do things like advertise products to you that you’ve been talking about, but not looking for online.
The hosts explore the different data combinations that allow this to happen – loyalty cards, device proximities, IP addresses and more – and Megan talks about why she chooses not to own voice-activated devices and how that conflicts with her work in conversational AI. Andy asks the important question: does your Alexa or other voice assistant give you more value than your data gives companies and advertisers?
Confirmation bias is also on the cards, as they look at a pricing algorithm that offered Airbnb hosts the opportunity to generate more revenue through automated variable pricing. It sounds like a fantastic idea, but it had unintended consequences: because the majority of people who signed up were white, it inadvertently created greater opportunity for white Airbnb hosts than for those of other ethnicities.
So, why did this happen? Megan discusses the cultural and social contexts behind confirmation bias, how unintended consequences can occur and the best ways to mitigate them.
AI, however, isn’t only used for the dark arts of advertising and revenue generation. In the month’s final news piece, the trio dives into how AI is being used to promote better mental health outcomes, as well as early tests where it’s predicting and detecting cancer and Alzheimer’s disease.
AI, machine learning and data science are becoming increasingly involved in everyday life. Keep up to date with AI, industry trends and innovations by subscribing to AI Right? on all major podcast platforms.