
How AI Affects Product Recommendation Bias

October 30, 2024 • By UC Irvine Paul Merage School of Business

We’ve all been there. You search for something at your favorite online store, and suddenly a product recommendation catches your eye. The price is right. The customer ratings are solid. The next thing you know, you’re clicking “add to cart” and completing your purchase. But do you know why the store suggested that specific product? Was it because it was the best available option at the best price, or because the profit margin on that product was higher than on another, possibly better, product they never showed you?

That’s the type of scenario Professor Vidyanand Choudhary at the UCI Paul Merage School of Business and Associate Professor Zhe Zhang at the UT Dallas Naveen Jindal School of Management explored in “Product Recommendation and Consumer Search,” published in the Journal of Management Information Systems.

Their paper examines the effects of AI/machine learning-based recommendation systems on how consumers locate products and services online. “Previously, the focus has been on how consumers search for products, but with the advent of machine learning, platforms like YouTube, Netflix, TikTok, Amazon and others can now recommend products based on their knowledge about consumer preferences,” says Choudhary.

When consumers shop online for one item on Amazon, Amazon recommends a slew of other products, he says. Similarly, Netflix recommends certain shows based on a person’s viewing history. The researchers wondered, “What incentive does the platform have to actually recommend what they think you’d really want? And how does that change if the profit margin on certain products is different?”

 

Information Accessibility Affects Search Recommendations

According to their study, the amount of consumer information available to the platform relates to how well the platform can estimate an individual’s preferences. The more consumer information they have, the greater the company’s ability to predict what the customer really wants. Similarly, consumers can determine the product best suited for their needs by searching for and analyzing product information.

“There is a lot of information available about cars, so consumers tend to do their own searches,” says Choudhary. “However, the amount of information available can also mean it may take significant time and effort for a consumer to figure out which car would be best for them.” By contrast, “if a consumer is looking for a niche product such as parts for home repairs or cabinet parts, it can be hard to find the right products online.” In these cases, consumers are more apt to “seek recommendations at their local Home Depot,” Choudhary says.

Searching for the most suitable product takes time and effort, and online stores know this. In their article, Choudhary and Zhang explain how these “search costs” factor into the equation. “That search cost, or how difficult it is for consumers to search for and analyze this information, affects the platform,” says Choudhary. “If it’s easy for customers to find it on their own, then the platform is more disciplined in making better recommendations.” If the customer is unable to search for a product, or if the information isn’t available, the site may take advantage of this. “In other words, you have to trust their recommendation, and they factor that into their decision making.”
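To make that trade-off concrete, here is a stylized Python sketch (not the model from the paper; the product names, values, margins, and search costs are all invented) of how a consumer’s option to search on their own can limit which product a platform gets away with recommending.

```python
# Stylized sketch (not the paper's formal model): how a consumer's search cost
# can discipline a platform's recommendation. All names and numbers are invented.

def platform_recommendation(products, search_cost):
    """Return the product the platform prefers to recommend.

    Each product has a hypothetical 'value' (match quality for the consumer)
    and 'margin' (profit to the platform). The consumer only follows a
    recommendation that beats their outside option of searching on their own,
    net of the search cost.
    """
    best_match = max(products, key=lambda p: p["value"])
    outside_option = best_match["value"] - search_cost  # value of searching alone

    # The platform would like to push high-margin products first...
    for candidate in sorted(products, key=lambda p: p["margin"], reverse=True):
        # ...but can only steer the consumer toward something they would still
        # accept rather than search themselves.
        if candidate["value"] >= outside_option:
            return candidate
    return best_match


products = [
    {"name": "best match, low margin", "value": 10.0, "margin": 1.0},
    {"name": "decent match, high margin", "value": 8.0, "margin": 3.0},
]

for cost in (0.5, 3.0):
    pick = platform_recommendation(products, search_cost=cost)
    print(f"search cost {cost}: platform recommends '{pick['name']}'")
```

In this toy setup, a cheap search forces the platform toward the best match, while a costly search leaves room to push the higher-margin item, which is the “disciplining” effect Choudhary describes.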

 

Profitability, Retaining Business, and Bias Levels Incentivize Platform Decisions

The platform must also take into account how often it makes poor suggestions. Platforms know that if they make subpar recommendations, people will “leave their platform altogether,” Choudhary says. “They are afraid of losing your business, and that is another disciplining force.”

Another key factor in their study relates to the profitability of the recommended products. If a particular product is much more profitable for the platform, that likely skews its recommendation, he says.

To find answers, Choudhary and Zhang considered several outcome variables: the bias level in the platform’s recommendation strategy, the profits the firm earned, the consumer surplus, and social welfare. “The greater the bias in the recommendation, the greater the gap between consumers’ true preferences and the product recommended to them,” says Choudhary. As consumers, we prefer “low bias.”
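Read that way, the outcome variables can be illustrated with a few toy numbers (purely hypothetical, not figures from the study): bias is the gap between the consumer’s ideal product and the one recommended, and consumer surplus plus platform profit make up social welfare.

```python
# Toy numbers (not results from the study) showing how the outcome variables relate.
true_preference_value = 10.0   # value of the consumer's ideal product
recommended_value = 8.0        # value of the product actually recommended
price = 6.0                    # what the consumer pays
platform_margin = 3.0          # what the platform earns on the sale

bias = true_preference_value - recommended_value  # bigger gap = more bias
consumer_surplus = recommended_value - price      # what the consumer keeps
platform_profit = platform_margin
social_welfare = consumer_surplus + platform_profit

print(bias, consumer_surplus, platform_profit, social_welfare)  # 2.0 2.0 3.0 5.0
```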

 

‘Game Theory’ Predicts Algorithmic Behaviors

The researchers’ paper employs the mathematical framework of game theory. As Choudhary explains, “It’s where you can mathematically represent what different people are trying to do and predict how things will play out based on the optimal strategy for each player.”
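As a loose illustration of that idea, the sketch below writes down payoffs for a platform and a consumer and lets each side play its best response; the two-player game and its numbers are made up for this article, not the game analyzed in the paper.

```python
# A made-up two-player game (not the one in the paper), just to show the mechanics:
# list each side's payoffs and see what happens when everyone plays optimally.

# payoffs[(platform_strategy, consumer_response)] = (platform_payoff, consumer_payoff)
payoffs = {
    ("honest", "follow recommendation"): (3, 3),  # good match, moderate margin
    ("honest", "search on their own"):   (1, 2),  # consumer pays a search cost
    ("biased", "follow recommendation"): (4, 0),  # high margin, poor match
    ("biased", "search on their own"):   (1, 2),  # consumer ignores the nudge
}

def consumer_best_response(platform_strategy):
    """The consumer picks whichever response pays them more."""
    return max(
        ["follow recommendation", "search on their own"],
        key=lambda r: payoffs[(platform_strategy, r)][1],
    )

# The platform anticipates that response and chooses its own best strategy.
best_strategy = max(
    ["honest", "biased"],
    key=lambda s: payoffs[(s, consumer_best_response(s))][0],
)

print(best_strategy)  # -> honest: the threat of losing the consumer wins out
```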

They used Amazon and Netflix as examples in their study, but the methodology applies to any platform that makes recommendations, Choudhary says. “Right now it’s hard to think of a website that doesn’t recommend things,” he says. He cites YouTube and TikTok, but nearly “everyone” uses algorithms to suggest content. “Game theory applies not just to economics,” he says, but to almost “anything in real life.”

 

Complexity of ‘Search Costs’: Effects of Consumer Information on Bias

Of course, the platform benefits from having more consumer information. The more information the platform has, the greater its profit. “In a ‘big picture’ sense, we show that recommendation systems change what we know about the impact of search costs on consumers,” says Choudhary. “Whereas previously it was believed that lower search costs are always good for consumers, we find that search costs interact with the firm’s recommendation strategy in complex ways. Reducing search costs can adversely impact consumer surplus in some cases.”

They also determined that, in some cases, sharing more personal information on a platform may enhance the consumer experience. “Consumers may worry the platforms that collect their information will use it to their disadvantage,” says Choudhary. “However, we find this is not necessarily true. While there are cases where more consumer information increases bias, there are also cases where more consumer information actually reduces the level of bias. This part is surprising.”

 

Confirmation Bias and Consumers’ Online ‘Filter Bubble’

The researchers’ findings may have societal implications that reach beyond policy debates about whether platforms overcharge customers or make fair recommendations. “A lot of the stuff we see around politics and polarization is partly because of confirmation bias,” Choudhary says. “Let’s say you love cats, and the site figures out you love cats, so it only shows you posts about cats. Over time you start to think the whole world loves cats. Nobody cares about dogs because you haven’t seen anyone post anything about their dog. This causes a distorted way of thinking; everyone’s in their own bubble.” A person’s “sense of reality” is “distorted,” and “they think this is what the world looks like, but it’s not true.”

 

Game Theory Results Also Apply to Dating Apps

Choudhary is applying what he learned in this study to future research. “I’ve been working on the incentives for technology adoption in healthcare settings and another project on the incentives of the makers of dating apps, and it’s the same thing,” he says. “Maybe it’s better for the dating app to string you along so you keep paying for their service. You keep dating people, but never actually find the one.” That strategy may work for the app, but not for its users, and “that’s what game theory is all about.” It’s about thinking through all the different misaligned incentives and comprehending “how that creates friction.”