I spent a couple of hours yesterday on a deep update about a Google ranking change for my members' newsletter - and what happens? Google announces another big change.
So, I'd better do another update, hadn't I?
Those of you who don't give a hoot about SEO can leap on to the links.
Google News News
It feels like a long time since we've had a significant update to Google News, other than things like the removal of meta news keywords. But yesterday, we did get a big update:
It's good that this applies to both the main search and Google News - as we all know that Google News is not a huge destination for most.
However, the messaging around this has been interesting, and occasionally just plain wrong. A lot of people have focused on a secondary part of the news:
“Many other kinds of websites have reputations as well. For example, you might find that a newspaper (with an associated website) has won journalistic awards. Prestigious awards, such as the Pulitzer Prize award, or a history of high quality original reporting are strong evidence of positive reputation.”
And that seems to have been translated as "humans are choosing which outlets are promoted". For example:
That's not what's actually happening, though. As Google's Danny Sullivan put it:
So, let's break this down, for clarity's sake:
- Google are changing the news and general search algorithms to prioritise high quality original reporting — and display it for longer
- They have also updated their human reviewer guidelines to suggest that awards are one metric of "high quality"
Humans are not deciding which "high quality" journalism sources are being prioritised - the algorithm is still doing that. Human reviewers are just looking for evidence that the algorithm isn't doing its job, and they're using some new standards and benchmarks to do that. They don't manually intervene in search results - they notify the algorithm team, which uses their data to tweak the algorithm again.
Nor, indeed, is "award-winning" being used as the major metric. It's just one signal reviewers can use to judge how well the algorithm is doing.
So, this isn't another example of a platform bringing humans into the equation, as with Facebook recently.
The really clear reason for this change is the prevalence of large outlets doing quick, SEO-focused write-throughs of other people's stories, and (essentially) capturing as much traffic as they can from that reporting. We can debate the ethics of that all night, but the reality is that it has been a successful strategy for a while.
That is under threat.
We'll see how it works out in practice, but if you are working on a publication that relies on that sort of traffic:
- Start actively monitoring search traffic to that category of reporting
- Start putting in place contingencies to replace that traffic if it does prove to be threatened
Conversely, hopefully this will incentivise people to do more original reporting, as Google is aiming to display the original stories for longer. Intentionally or not, Google has historically had a habit of displaying the newest stories, not the original story, giving further succour to the write-through operations.
Another potential downside is that the change might start helping the big outlets at the expense of the small ones. For good reasons, the "award-winning" outlets messaging has people worried:
Google went some way to try and assuage that doubt in the original announcement:
There is no absolute definition of original reporting, nor is there an absolute standard for establishing how original a given article is. It can mean different things to different newsrooms and publishers at different times, so our efforts will constantly evolve as we work to understand the life cycle of a story.
That gives us at least a little hope that the benefits of this won't only accrue to the big publishers, but also to smaller organisations doing great reporting in niches.
But as always with Google algorithm changes we'll probably see some oddities in the results before it all settles down.
And it's worth remembering that, yet again, this is another example of just how much power two platforms have over our reader acquisition:
Today was just going to be a links email, but as I'd done the work anyway, here they are:
The Sun has been building out a very interesting engagement strategy — several former students are working (or have worked) on it. This is a good chance to do some high-profile work with a good team.
Soothing your subscribers
The Guardian's membership editor functions as the "connective tissue" between supporters and the newsroom
"Members" and "Subscribers" are different words. And for good reason. If you want members, you need to make them feel like members.
“At 7,000 members our lives are already changed for the better”: How the Daily Maverick developed its membership program
"Members" and "Subscribers" are…
Oh, I already said that.
Interesting challenge to traditional thinking. The tl;dr of the argument is that inactive subscribers are dragging down your open rate (so what?) but also that they might be helping your newsletter build up a false positive rate for spam. I'd worry more about the latter.
Your dose of analytics medicine
Deep.BI’s newest platform functionality called “Deep Content Attribution Score” (DCAS) is designed to give editorial and data teams better insight into what drives user conversion.
Go on, take your medicine.
Lots of interesting stuff happening around LinkedIn at the moment.
The Next Web are using a bot to write Bitcoin stories. And it's doing great on traffic.
I, for one, welcome our new AI overlords.
See you over the weekend for the general reading edition.
Hope you've had a fabulously engaging week…