
The Risk of Personalization: do people want and trust it?

My name is Lars K Jensen, and I am a former journalist who has been working with digital development, analysis and journalism in the media industry for several years. I work with audience development and digital journalism at Berlingske Media. Sign up for my newsletter, Products in publishing, here.

Personalization is user-centric technology. But it also raises concerns. How will you identify and address those concerns?

Personalized experiences and recommendations are everywhere, and news publishers are, logically, looking into them – with some being farther ahead than others.

There are, however, a number of pitfalls to avoid. Besides thinking hard about how to measure the success of personalized content recommendations and find the right level of algorithm transparency, we also need to take into account whether our users actually want a personalized experience.

Because, even if we can easily assume that users want recommendations that provide them with stories and content more relevant to them, a study indicates that that might not always be the case.

Personalization concerns

This year's edition of Roskilde University's report on how Danes access and use news publishers/media (“Danskernes brug af nyhedsmedier 2023” which is a part of the yearly Digital News Report) has this interesting chart on page 47:

[Chart from the report, page 47: concern about personalized news, by age group]

For all of you who don't understand Danish, here's a little explanation 😉

The chart shows how much certain age groups (the columns – the one on the left is all age groups) agree with these two statements:

  • Blue: “I'm worried that more personalized news can mean I miss out on important information.”
  • Red: “I'm worried that more personalized news can mean that I miss out on challenging opinions.” (Meaning opinions you don't necessarily agree with.)

As you can see, there is general concern regarding personalization in news publishing.

Even though Danes aged 18 to 24 are the least concerned about missing out on important information because of personalization, 44 percent is still quite a lot – and they are the age group most concerned about missing out on challenging opinions, perhaps because they are still forming their views.

Now, I am well aware that there is always a discrepancy between what people say when asked and how they actually behave. But this indicates a degree of concern among our audiences we must not ignore.

As one of the authors, Mads Kæmsgaard Eberholst, wrote in a LinkedIn post:

“Tackling this concern among all stakeholders is, for me, THE big challenge facing the Danish media. Users, politicians, advertisers and media companies must all come to terms with the problems and opportunities that arise with increasing personalization of the news media.”

“Important information” can be many things when it comes to journalism, news, media and the people consuming it.

Information might be important because it keeps people updated on current events and helps them understand the society they are part of – or because knowing what is going on lets them take part in conversations with friends, family and coworkers (sometimes called the “social aspect” or “social component”, if you work with the Jobs To Be Done framework).

Editors or algorithms?

There is also another chart regarding personalization in the report – on page 46:

[Chart from the report, page 46: attitudes towards how stories are selected]

This is about people's attitudes towards how stories are selected for presentation – indicated by how much they agree with these three statements:

  • Blue: “To have stories chosen for me by editors and journalists is a good way to get news.”
  • Red: “To have stories automatically chosen for me based on what I have consumed in the past is a good way to get news.”
  • Green: “To have stories automatically chosen for me based on what my friends have consumed in the past is a good way to get news.”

Here we see (although we should take into account how the questions are framed) that more respondents agree that having stories chosen by editors and journalists is a good way to get news than agree that personalization based on their own previous consumption is – even though it's a tight race.

The exception is personalization based on what friends have consumed – the indication is that you should not implement that kind of algorithm 😉.

> Also by Lars: Paywalled content: Rethink Your Premium Icons

Okay, let's recap the two charts regarding personalization.

In general:

  • 21% of respondents say that personalization is a good way to get news.
  • 57% are worried that personalization can mean they miss out on important information.
  • 52% are worried that personalization can mean they miss out on challenging opinions.

This is really interesting – because it shines a light on a part of personalization that isn't always the most talked about in the media and publishing industry:

  • Do people actually want it?
  • How will people react when we begin personalizing the stories presented to them?
  • And will it affect the trust we have built up?

Differences across borders?

I know that these findings are based on Danish respondents, and judging by the composition of this newsletter's subscribers, there's a good chance that you are from another country.

I have previously written about how different media markets in different countries follow different rules and have different patterns for media consumption etc. – so there is definitely something there to be aware of.

So, does this apply to your market and users as well? That's a great question, and one you should seek out the answer to 😉

I have just seen a recent survey of 1,000 US respondents, done by Readly, about AI – which showed that “Most U.S. Readers Reject Its Use In Journalism”.

Now, AI is a broad term, but this point is interesting to keep in mind:

“Americans are happy to embrace technology in the home, but are wary of overreliance on AI in areas where human judgment plays a crucial role,” concludes Chris Couchman, head of content at Readly.

I wouldn't necessarily call this a backlash against Artificial Intelligence (or technology in general). I see it more as a constant reminder that we need to continually invest in research and knowledge about our users and audiences, especially when implementing novel technology.

As Esther Kezia Thorpe writes in a Media Voices article:

“What I want to emphasize is that real audiences want real news from real people. That's the principle we need to keep at the heart of our businesses.”

(Here we are obviously venturing into the realm of generative AI as well.)

Being transparent

This is also why it's very important to be transparent about the use of personalization and algorithms in the selection of which stories to present to the users.

In an article from MediaWatch about the findings in the Danish study, another one of the authors – Mark Ørsten, professor at Roskilde University – mentions that the concern can stem from people's perception of things like echo chambers and filter bubbles (and then we can discuss whether they actually exist).

He suggests that publishers implement personalization as a choice:

“If news publishers work with algorithms, they need to make sure to tell their users what they have done and what it means for the users. I believe that the users want the option to opt in or out and to have a say in whether they want this.”

Again: Transparency.
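If you want to make that concrete, here is a minimal sketch in TypeScript of what “personalization as a choice” could look like. The types and function names are hypothetical (not from any particular publisher's stack): the front page falls back to the editorially curated feed unless the user has explicitly opted in, and personalized items carry a short label explaining why they were chosen.

```typescript
// Hypothetical types – not tied to any specific CMS or recommender.
type Story = { id: string; title: string; url: string };

type RankedStory = Story & {
  // Short, user-facing explanation shown next to the story,
  // e.g. "Recommended because you often read climate coverage".
  reason?: string;
};

interface UserPreferences {
  // Explicit opt-in, stored with the user profile. Defaults to false.
  personalizationEnabled: boolean;
}

interface FeedSources {
  editorialPicks(): Promise<Story[]>;
  personalizedPicks(userId: string): Promise<RankedStory[]>;
}

// Personalization as a choice: only personalize for users who opted in,
// and always label personalized items so the selection is transparent.
async function buildFrontPage(
  userId: string,
  prefs: UserPreferences,
  sources: FeedSources
): Promise<RankedStory[]> {
  if (!prefs.personalizationEnabled) {
    // Editor-curated feed for everyone who has not opted in.
    return sources.editorialPicks();
  }

  const personalized = await sources.personalizedPicks(userId);
  return personalized.map((story) => ({
    ...story,
    reason: story.reason ?? "Chosen for you based on your reading history",
  }));
}
```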

But as I've written about before in this newsletter, you can have too little transparency – or even too much:

“So, what can that story teach us about algorithms and news media and publishers? First, it teaches us that some kind of transparency is necessary to establish trust between the algorithm and the user. And second, it tells us that at some point we add too much information (obviously for all the right transparency reasons) and the band of trust snaps.”
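As a small, hypothetical illustration of that balance (same TypeScript register as above), one approach is to surface only the one or two strongest, human-readable signals behind a recommendation rather than the full feature dump:

```typescript
// Hypothetical signals a recommender might have used for one story.
interface RecommendationSignal {
  label: string;  // e.g. "you often read climate coverage"
  weight: number; // relative importance, 0..1
}

// Show only the strongest, human-readable signals instead of the full
// feature list – enough transparency to build trust, not so much that
// the explanation itself becomes overwhelming.
function explainRecommendation(
  signals: RecommendationSignal[],
  maxSignals = 2
): string {
  const top = [...signals]
    .sort((a, b) => b.weight - a.weight)
    .slice(0, maxSignals)
    .map((s) => s.label);

  return top.length > 0
    ? `Recommended because ${top.join(" and ")}`
    : "Recommended by our editors";
}

// Example:
// explainRecommendation([
//   { label: "you often read climate coverage", weight: 0.8 },
//   { label: "you follow this author", weight: 0.6 },
//   { label: "readers like you read this", weight: 0.3 },
// ]);
// -> "Recommended because you often read climate coverage and you follow this author"
```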

Brief recap:

So, in your next meeting or project on personalization:

  • Remember not only to talk about how a particular recommender system will help you generate more pageviews, drive traffic to your long tail of content etc. – and how to validate the AI or algorithm's performance and what data it is trained on.
  • Make sure to also discuss how it will provide real value to real users and how you will measure that value creation (a rough sketch of such measures follows after this list).
  • And discuss how you will find out whether people want the personalization you are about to implement – or whether they have concerns.

And, obviously: What can you do to mitigate those concerns?
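As a rough sketch of what measuring that value – and gauging whether people actually want the feature – could look like beyond raw pageviews, here is a hypothetical report structure. The metric names are illustrative, not an industry standard: opt-in and opt-out rates and a simple survey-based concern score sit alongside the usual engagement numbers.

```typescript
// Hypothetical metrics for evaluating a personalized feed – illustrative
// names, not an industry standard.
interface PersonalizationReport {
  period: string;                      // e.g. "2024-W10"
  optInRate: number;                   // share of users who switched personalization on
  optOutRate: number;                  // share who later switched it off again
  avgStoriesReadPersonalized: number;  // engagement with the personalized feed
  avgStoriesReadEditorial: number;     // engagement with the editor-curated feed
  concernScore: number;                // survey-based: share worried about missing
                                       // important information or challenging opinions
}

// Value is not just pageviews: compare engagement across the two feeds
// AND keep an eye on whether users actually want the feature.
function summarize(r: PersonalizationReport): string {
  const lift = r.avgStoriesReadPersonalized / r.avgStoriesReadEditorial - 1;
  return (
    `Period ${r.period}: ` +
    `${(r.optInRate * 100).toFixed(0)}% opted in, ` +
    `${(r.optOutRate * 100).toFixed(0)}% opted out again, ` +
    `engagement lift ${(lift * 100).toFixed(0)}%, ` +
    `concern score ${(r.concernScore * 100).toFixed(0)}%.`
  );
}
```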

People expressing concern is normal when new technology enters our lives and consumption patterns – but that's not the same as saying we shouldn't do anything.

Remember the three types of resistance:

  1. I don't understand this.
  2. I don't like this.
  3. I don't like you.

It's best if you can address people and alleviate their concerns when they are at level 1 or alternatively level 2 😉
