Google Freshness Update

Google not only has project names for its products, it also gives its search engine algorithm changes names, and I don’t just mean all those cuddly zoo animals that begin with the letter P.

Their Inside Search blog, which is updated by Google engineers (including Pandu Nayak, whose name was the catalyst for the Panda update), only goes back to May 2011, but if you look at the task numbers you get an idea of the sheer scope of the changes (the highest number on the August/September update page is #937372, for example).

One big area of work has been the ‘Freshness’ project. It goes back quite a way, although as far as I can see it has only carried the Freshness name since 2012 (if I have that wrong, please do correct me).

For August and September there have been 13 tweaks affecting the way fresh content is analysed.

As far as I can tell, the freshness concept is there to achieve these things:


  • Figure out whether the searcher is looking for fresher content (and turn off that part of the algorithm when it is obvious they are not)
  • Try to ensure newer articles are placed higher up the results (when the above is true)
  • Reduce the occurrence of multiple pages from the same site showing under the same search
  • Improve image search where freshness is concerned (eg. finding relevant news articles related to topical images)
  • Return fresh results in real time (possibly even seconds after an event occurs)
  • Reduce the so-called ‘freshness boost’ to zero if a page is deemed to be of low quality (a toy sketch of this idea follows the list)
  • Promote fresher content if it is breaking news or highly topical
  • Update the handling of stale content (based on document age)

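To make the ‘freshness boost’ idea a little more concrete, here is a toy sketch of how such a boost might be combined with a relevance score. To be clear, the function names, weights and thresholds below are entirely my own invention for illustration, not anything Google has published, but the sketch mimics the points above: the boost is switched off when the query does not call for fresh results, it drops to zero for low-quality pages, and it decays as a document ages.

from datetime import datetime, timedelta, timezone

# Hypothetical illustration only: none of these names, weights or
# thresholds come from Google; they simply mimic the behaviour
# described in the bullet points above.

def freshness_boost(published, quality_score, query_deserves_freshness,
                    now=None, half_life_days=7.0, max_boost=0.3,
                    min_quality=0.5):
    """Return a boost in [0, max_boost] that decays with document age."""
    if not query_deserves_freshness:
        return 0.0   # the searcher is not after fresh content, so no boost
    if quality_score < min_quality:
        return 0.0   # low-quality pages get no freshness boost at all
    now = now or datetime.now(timezone.utc)
    age_days = max((now - published).total_seconds() / 86400.0, 0.0)
    # Exponential decay: a brand-new page gets the full boost, a week-old
    # page roughly half of it, and stale pages approach zero.
    return max_boost * 0.5 ** (age_days / half_life_days)

def final_score(base_relevance, published, quality_score,
                query_deserves_freshness):
    """Combine a base relevance score with the (toy) freshness boost."""
    return base_relevance + freshness_boost(
        published, quality_score, query_deserves_freshness)

# Example: a high-quality page published a day ago, for a query that
# does call for fresh results.
yesterday = datetime.now(timezone.utc) - timedelta(days=1)
print(final_score(0.72, yesterday, quality_score=0.9,
                  query_deserves_freshness=True))

If nothing else, the sketch shows why ‘just publish something new’ does not work on its own: the quality gate is checked before the age of the page even matters.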
All of the above really pushes the idea that original, fresh, high-quality content is the order of the day.

I also suspect it means that, in future, curated content will be culled with greater vigour.

In internet marketing circles, pushing the concept of Huffington Post lookalike sites has been big business for a number of years, and a whole host of tools has been marketed off the back of that.

Ideas like adding original content as an intro and outro and then stuffing the middle with curated content have been rife for ages (and who can blame people if it works!).

But Google is not stupid, and with all that human brainpower there is no way they won’t have figured this out in advance anyway, so algorithm changes that reduce spammy stuff are absolutely a good thing.

Publishing these updates must help to level the playing field between those who genuinely want to add value to the web and those out for a quick buck. I applaud it.

