As AI settles into everyday life, it is becoming a major factor in how we see the world and build relationships.
Think about recruitment, which is increasingly supported by AI. Or think about the way platforms invite us to explore an interest we have shown with a list of “you might also like” links.
Recruitment processes exist to help HR find the right candidate for a job. The AI algorithms used in these processes rely on an existing dataset from which they learn who the best candidate might be. We know that algorithms can help reduce human decision bias and thus provide greater fairness. Where they can introduce bias of their own is when the original dataset acts as a first filter because of its own specificities. Take, for example, a dataset consisting only of students: it can help answer questions about that group, but it will not serve well in a general demographic context.
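To make the mechanism concrete, here is a minimal sketch in Python using NumPy and scikit-learn. It is not any real hiring system: the features (a recent certification, years of experience), the numbers, and the data-generating rules are all invented for illustration. A screening model is trained only on student candidates, among whom a recent certification happens to be the dominant signal, and is then applied to a wider workforce, where experience is what actually matters.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def students(n):
    # Hypothetical student candidates: within this group, a recent
    # certification is the dominant signal of job-readiness.
    cert = rng.binomial(1, 0.5, n)
    experience = rng.uniform(0.0, 2.0, n)
    suitable = np.where(cert == 1,
                        rng.random(n) < 0.9,
                        rng.random(n) < 0.05).astype(int)
    return np.column_stack([cert, experience]), suitable

def workforce(n):
    # The wider workforce: experience is what actually matters, and
    # experienced candidates rarely hold a recent certification.
    cert = rng.binomial(1, 0.2, n)
    experience = rng.uniform(0.0, 10.0, n)
    suitable = (experience > 5.0).astype(int)
    return np.column_stack([cert, experience]), suitable

# Train only on the student subset: the dataset is the first filter.
X_train, y_train = students(5000)
model = LogisticRegression().fit(X_train, y_train)

# The learned rule ("certification => suitable") does not transfer.
X_test, y_test = workforce(5000)
print(f"accuracy on students:  {model.score(X_train, y_train):.2f}")
print(f"accuracy on workforce: {model.score(X_test, y_test):.2f}")
```

In this toy setup the model scores well on the group it was trained on but roughly at chance on everyone else, because the pattern it learned was an artifact of the student dataset rather than a general rule.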
The AI algorithms used to keep us engaged for as long as possible on a platform like YouTube let us explore subjects that interest us. But as these algorithms narrow their focus based on our existing activity, they start to reduce what is visible to us. One consequence is the effect on our ability to develop a point of view: stuck in an echo chamber of singular interests, we might forget that other views exist. The reverse is also true: being steered towards content we have already shown interest in creates the impression of growing demand in that field.
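The narrowing dynamic can be illustrated with a toy simulation. This is a sketch of the general reinforcement pattern, not a description of YouTube's actual recommender: a loop that recommends topics in proportion to past clicks behaves like a Pólya urn, so even a user who starts with no particular preference ends up with a concentrated feed. The topic list and probabilities below are arbitrary.

```python
import random
from collections import Counter

topics = ["politics", "science", "music", "sports", "cooking"]
# One initial "interest point" per topic: a user with no strong
# preference yet.
history = Counter({topic: 1 for topic in topics})

random.seed(1)
for _ in range(500):
    # Recommend in proportion to past engagement: the feedback loop.
    weights = [history[topic] for topic in topics]
    shown = random.choices(topics, weights=weights)[0]
    # The user watches most of what is shown, reinforcing the loop.
    if random.random() < 0.9:
        history[shown] += 1

# Final share of attention per topic.
total = sum(history.values())
for topic in topics:
    print(f"{topic:10s} {history[topic] / total:6.1%}")
```

Re-running with different seeds shows the same shape of result: the final mix is typically skewed towards one or two topics, and which ones dominate is largely an accident of the first few clicks.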
While some of the consequences are frightening, the evolution of AI helps us see some of the changes culture is subject to. It creates an incentive to reevaluate the behaviors that lead to this change. Discovering flaws in AI algorithms has also led to much more research on the impact of AI, research that now investigates not only how AI can become more ethical, but also what “ethical” may mean.
It is an interesting consequence of seeking the one right answer: since it cannot be found, diversity pushes us to look for new ways of finding it.