
Today, AI and ML are present in almost every walk of life. They help make the consumer experience more user-friendly, inclusive, and personalized. Google's products offer well-maintained and well-monitored consistency and inclusiveness, and we have Tulsee Doshi to thank for this.

Tulsee Doshi, Google's product lead for ML Fairness and Responsible AI, is the brains behind unbiased product experiences for customers, regardless of their identity or origin.

AI and ML systems perform based on the data used to train them. If that data is limited or unrepresentative, the resulting models will be biased, and not every consumer will be able to connect with the product. Doshi wanted to apply a people-centered approach that gives users an inclusive experience.
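One concrete way to catch this early is to audit how well each group is represented in a training set before a model ever sees it. The sketch below is a minimal, hypothetical Python example; the toy data, function name, and 10% threshold are illustrative assumptions, not a description of Google's actual tooling.

```python
# A minimal sketch of a pre-training representation audit (illustrative only).
from collections import Counter

def representation_report(records, group_key, threshold=0.10):
    """Report each group's share of the data and flag groups below `threshold`."""
    counts = Counter(r[group_key] for r in records)
    total = sum(counts.values())
    return {
        group: {"share": round(n / total, 3), "under_represented": n / total < threshold}
        for group, n in counts.items()
    }

# Hypothetical toy dataset: a heavily skewed image-classification training set.
training_data = (
    [{"label": "face", "skin_tone": "light"} for _ in range(900)]
    + [{"label": "face", "skin_tone": "dark"} for _ in range(100)]
)

print(representation_report(training_data, "skin_tone"))
# {'light': {'share': 0.9, ...}, 'dark': {'share': 0.1, ...}} -> a skewed set worth fixing
```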

But creating all-inclusive products has its own challenges. One of them is understanding the different approaches to measuring and tackling fairness concerns. It is hard to pin down the true meaning of fairness, as it may differ from one perspective to another; one common way to quantify it is sketched below. Google's product teams are investing in tooling and resources for both internal and external use.
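To make that concrete, demographic parity is one of several competing definitions of fairness: it compares the rate of positive outcomes across groups. The sketch below is a minimal, hypothetical Python example; the group names, predictions, and helper functions are illustrative assumptions, not Google's internal tools.

```python
# A minimal sketch of the demographic parity difference: the gap in
# positive-prediction rates between groups (illustrative only).

def positive_rate(predictions):
    """Fraction of predictions that are positive (1)."""
    return sum(predictions) / len(predictions)

def demographic_parity_difference(preds_by_group):
    """Largest gap in positive-prediction rate across groups, plus per-group rates."""
    rates = {group: positive_rate(p) for group, p in preds_by_group.items()}
    return max(rates.values()) - min(rates.values()), rates

# Hypothetical model outputs (1 = approved, 0 = rejected) split by group.
preds = {
    "group_a": [1, 1, 1, 0, 1, 1, 0, 1],
    "group_b": [1, 0, 0, 0, 1, 0, 0, 0],
}

gap, rates = demographic_parity_difference(preds)
print(rates)  # {'group_a': 0.75, 'group_b': 0.25}
print(gap)    # 0.5 -> a large gap suggests the model treats the groups differently
```

Note that this is only one lens: other definitions, such as equal error rates across groups, can conflict with demographic parity, which is exactly why "the true meaning of fairness" is hard to settle.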

More organizations and companies should employ a diverse workforce, consult external experts, and gather input from their target audiences so that products are developed with a broader set of people in mind.

ML fairness is a critical concern in ML development, and Tulsee Doshi is the quiet hero behind Google's progress in addressing it.


#AIMonks #AI #ArtificialIntelligence #ML #MachineLearning #Fairness #Unbiased #Google #TulseeDoshi #ConsumerExperience #AllInclusive


