By Joe Arney

Digital recommender systems have long been a part of our lives. But those systems might be serving up inequality along with new music, viral videos and hot products.

Now, a leading expert on the technology powering these systems is turning his attention to the way news is recommended and shared. 

“If a system only shows us the news stories of one group of people, we begin to think that is the whole universe of news we need to pay attention to,” said Robin Burke, professor and chair of the information science department. 

Burke studies bias in recommender systems, which tend to favor the most popular creators and products—usually at the expense of newcomers, underrepresented groups and, ultimately, consumers who have fewer choices. The problem is compounded by the fact that these systems are proprietary, so outside researchers aren’t able to examine how they work.

“The people who do this kind of research in industry don’t publish very much about it, so we don’t know exactly what’s going on in terms of how their systems work, or how well they work,” he said.

A quick primer for the uninitiated: Recommender systems use data from individual subscribers to serve personalized content—art, news, commerce, politics—which may limit exposure to new ideas and influences.
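To make the popularity dynamic concrete, here is a deliberately minimal sketch of a recommender that ranks unseen items purely by how many people have already interacted with them. The data and names are invented for illustration; this is not any company’s actual algorithm, only the simplest form of the bias Burke describes.

```python
from collections import Counter

# Hypothetical interaction log of (user, item) pairs -- illustrative only.
interactions = [
    ("ana", "song_a"), ("ben", "song_a"), ("cho", "song_a"),
    ("ana", "song_b"), ("ben", "song_c"), ("cho", "song_d"),
]

def recommend(user, interactions, k=2):
    """Rank items the user hasn't seen by raw popularity.

    Popular items get recommended to everyone, which earns them still
    more interactions next time, while newcomers stay invisible.
    """
    seen = {item for u, item in interactions if u == user}
    popularity = Counter(item for _, item in interactions)
    candidates = [item for item in popularity if item not in seen]
    return sorted(candidates, key=lambda item: -popularity[item])[:k]

# A brand-new user with no history is steered straight to the hit.
print(recommend("dana", interactions))  # ['song_a', 'song_b']
```

Run repeatedly, a loop like this feeds its own output back in as new interactions, which is why popularity bias tends to snowball rather than correct itself.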

It’s why the National Science Foundation awarded Burke and others, including associate professor Amy Voida, a nearly $1 million grant in 2021 to develop “fairness-aware” algorithms that blunt biases baked into recommender systems. And the NSF saw the potential to do something similar in news, leading to a $2 million grant earlier this year to build a platform for researchers eager to experiment with the artificial intelligence that powers news recommender systems.

A platform like this could be game-changing for academic researchers, who are locked out of the proprietary systems built and studied by tech and social media companies. And as more nontraditional providers become sources of news, understanding how these algorithms work is essential: You may think of TikTok as a place for music videos, but a Pew Research Center survey found that about one in four American adults under 30 gets news from the platform.

“We have put all this control over the public square of journalistic discourse into the hands of companies that don’t have any transparency or accountability relative to what they’re doing,” Burke said. “I think that’s dangerous. And so, it’s important to think about what the alternatives might look like.” That includes the business model itself, which is predicated on selling ads while keeping users on a platform.

If successful, the project funded by this latest grant will build a robust system for live experiments on recommender systems, one that will eventually become self-sustaining through contributions from other researchers. Burke compared it to the way space telescopes and supercolliders have created platforms where experts can better understand the world around them.

“Unless you work at one of these companies, you don’t have any insight into how these systems work, or control over them,” Burke said. “I hope that, through this infrastructure, we’re able to understand how these things are governed, and for what objectives—and who gets to decide what those objectives are. That’s something I’m very interested in.”

Lisa Marshall (Jour, PolSci’94; MJour’22) contributed reporting.