Do retailers have a problem with recommendation bias? – RetailWire

Dec 17 2021
Cleber Ikeda is director of investigative analysis and intelligence at Walmart. All views and opinions expressed herein are personal and do not necessarily represent those of Walmart. This article does not contain or refer to any confidential or proprietary information of Walmart.
Recommendation algorithms help connect customers to the products they need or want to buy. They increase the visibility of promotions and, in some cases, make shopping more fun. They have also, in some cases, crossed serious ethical boundaries, undermining the trust consumers place in retailers.
There have been unfortunate cases in which recommendation algorithms have produced misleading profiling results and generated discrimination (in job advertising, for example). The root cause is usually the input data, tainted with bias, extremism, harassment or discrimination. Combined with a negligent approach to privacy and aggressive advertising practices, that data can become the raw material for a terrible customer experience. Irresponsible use of data can even lead to serious and unwanted outcomes, such as threats to human rights.
To address the risks of discrimination and injustice, retailers need to assess whether their algorithms discriminate against specific groups, subgroups or minorities. They need to know whether profiling techniques prevent customer segments from having full visibility into comparable products, and whether poor algorithmic design prevents less affluent customers from accessing good deals.
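One practical way to begin such an assessment is a simple exposure audit of recommendation logs. The sketch below is a minimal illustration, not a production audit: it assumes a hypothetical pandas DataFrame of logged recommendations with made-up column names ("customer_group" for a protected or proxy attribute, "shown_deal" for whether a promotional deal was surfaced), and it compares deal exposure rates across groups. A large gap is a signal worth investigating, not proof of discrimination.

```python
import pandas as pd


def deal_exposure_by_group(recs: pd.DataFrame) -> pd.DataFrame:
    """Return the share of logged recommendations that surface a deal, per group."""
    exposure = (
        recs.groupby("customer_group")["shown_deal"]
        .mean()
        .rename("deal_exposure_rate")
        .reset_index()
    )
    # Flag groups whose exposure falls well below the overall rate -- a common
    # first signal that deals may be steered away from them.
    overall = recs["shown_deal"].mean()
    exposure["below_80pct_of_overall"] = (
        exposure["deal_exposure_rate"] < 0.8 * overall
    )
    return exposure


# Example usage with toy data:
# recs = pd.DataFrame({
#     "customer_group": ["A", "A", "B", "B", "B"],
#     "shown_deal":     [1,   1,   0,   0,   1],
# })
# print(deal_exposure_by_group(recs))
```

A check like this only surfaces disparities in outcomes; interpreting them still requires the human judgment, domain context and governance discussed below.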
Training machine learning developers on the impact of bias, discrimination and prejudice on algorithmic design is a powerful tool. Without proper training and clear communication from leaders, developers could design algorithms that consciously or unconsciously promote values that are not aligned with their company’s ethical standards.
Another concern is confidentiality. Many data points used to profile customers and predict purchasing decisions are personally identifiable information or protected attributes. Marketers should adhere to national and international privacy regulations, but it is also good for business to understand customer expectations for privacy. Breaches of trust are business killers.
Retailers should also exercise caution when it comes to retargeting online advertisements. There is a line to be drawn between useful product reminders and what feels intrusive.
State-of-the-art artificial intelligence is not yet capable of “fixing” biased real-world data. Still, that is no excuse for the owners of recommendation algorithms to be careless about it.
Greater diversity in data science teams would also help, since the marginalized and vulnerable groups who suffer most from inequality in the digital world are underrepresented in them. Businesses can also experiment with bias bounties, in which hackers compete with each other to identify biases inherent in the code.
DISCUSSION QUESTIONS: What is the role of senior managers in the development of ethical recommendation algorithms in retail? How should retailers deal with the potential for discrimination, invasion of privacy, and even human rights threats in AI-powered interactions with shoppers?
“It may be necessary to have an outside team or expert to think about, test and understand the consequences of the algorithms used and to keep members of the C-suite informed.”