Thursday, December 26, 2024

A guide to algorithms used by Google and an overview of SMITH

by admin

Google claims to update its search algorithm several thousand times per year. In the vast majority of cases, these updates are too small to notice. But every once in a while, Google introduces a change so fundamental that it permanently disrupts the way SEO is done.

To understand these rather complicated algorithms, here is a breakdown of the major updates in simple language:

Early in 2011, Google launched the Panda update, a search algorithm change that filtered out websites with thin, low-quality content. This was the start of a series of major quality-control checks. Google Panda stripped search engine results pages (SERPs) of poorly constructed spam content, enabling higher-quality websites to rise to the top.

The Google Penguin update was first announced on April 24, 2012. Its primary aim was to decrease the ranking of websites that violate Google's Webmaster Guidelines, and to protect Google's SERPs from low-quality sites. Penguin was Google's response to the increasingly common practice of manipulating search results (and rankings) through black-hat link-building techniques; its objective was to gain greater control over, and reduce the effectiveness of, a number of these spamming techniques. Penguin deals only with a site's incoming links.

The Hummingbird algorithm was announced on 22nd August 2013. It was launched to detect unnecessary keyword stuffing and low-quality content published by some websites. The Hummingbird algorithm helps Google better interpret search queries and provide results that match searcher intent.
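To make the idea of keyword stuffing concrete, here is a toy sketch (not Google's actual implementation) of the kind of signal a stuffing detector might compute: the fraction of a page's words taken up by a single keyword. The function name and threshold are illustrative assumptions.

```python
import re
from collections import Counter

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of words in `text` that are exactly `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return Counter(words)[keyword.lower()] / len(words)

# A stuffed snippet repeats the keyword unnaturally often; a natural one does not.
stuffed = "buy shoes cheap shoes best shoes shoes online shoes"
natural = "our store offers a range of comfortable footwear for every budget"

print(round(keyword_density(stuffed, "shoes"), 2))  # 0.56 (5 of 9 words)
print(round(keyword_density(natural, "shoes"), 2))  # 0.0
```

A real system would of course combine many such signals rather than rely on a single density cut-off, but an abnormally high value like 0.56 is the sort of pattern this kind of update targets.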

The mobile-friendly update was announced on 21st April 2015. It shifted the ranking focus from the desktop version of a website to the mobile version. Today, Google ranks websites based in part on how fast and user-friendly their mobile versions are.

The RankBrain algorithm was announced on 26th October 2015, in order to demote shallow content and poorly designed websites that were ranking at the top of Google's SERPs. RankBrain is a part of Google's Hummingbird algorithm.

The Medic update rolled out on 1st August 2018. Google representatives hinted that the update implemented some of the E-A-T (expertise, authoritativeness, trustworthiness) signals from the Quality Rater Guidelines document. The Medic update seemed to disproportionately affect medical websites, as well as other websites that deal with potentially life-altering decisions (finance, law, education).

The BERT update was announced in October 2019 to devalue the ranking of websites with poor content. BERT (Bidirectional Encoder Representations from Transformers) uses natural language processing to better understand search queries, interpret text, and identify entities and the relationships between them.

Basically, a broad core update is a revamp of the main algorithm: it may change how certain existing ranking factors are weighted against one another, or how they interact. The December 2020 core update was the third core update of the year, after the January and May 2020 updates, and focused on demoting websites with poor-quality content and poor backlinks. Affected websites saw fluctuations in search rankings, dips in traffic, and loss of featured snippets. The aim is to give richer, more contextual results for visitor queries.

Google recently published a research paper on a new algorithm called SMITH, which it claims outperforms BERT at understanding long queries and long documents. SMITH stands for Siamese Multi-depth Transformer-based Hierarchical Encoder. In simple words, the SMITH algorithm is trained to understand entire passages within the context of a long document.
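The hierarchical part of SMITH means a long document is first split into blocks of sentences, each block is encoded on its own, and a document-level encoder then combines the block representations. As a rough illustration of just the splitting step (a simplified sketch, not the paper's exact tokenisation; the function name and block size are assumptions), blocks can be packed greedily:

```python
def split_into_blocks(sentences, max_words_per_block=32):
    """Greedily pack consecutive sentences into word-limited blocks.

    Mimics the two-level input of a hierarchical encoder: each block would be
    encoded independently, then a document-level encoder combines the blocks.
    """
    blocks, current, count = [], [], 0
    for sentence in sentences:
        n = len(sentence.split())
        # Start a new block if adding this sentence would overflow the limit.
        if current and count + n > max_words_per_block:
            blocks.append(" ".join(current))
            current, count = [], 0
        current.append(sentence)
        count += n
    if current:
        blocks.append(" ".join(current))
    return blocks

doc = ["a b c", "d e", "f g h i"]
print(split_into_blocks(doc, max_words_per_block=5))  # ['a b c d e', 'f g h i']
```

Because each block is short enough for a standard transformer, this design sidesteps BERT's input-length limit, which is why SMITH can handle long documents.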

Google never directly reveals which algorithms it is using in production, though the researchers say that the SMITH algorithm outperforms BERT. It is very difficult to say whether Google has actually deployed the new algorithm. The paper conveys that SMITH uses pre-training similar to BERT and many other language models. Pre-training is the stage where an algorithm is first trained on a large, general data set before being adapted to a specific task.
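The pre-training idea behind BERT-style models can be sketched in a few lines: randomly hide some tokens in the input and train the model to predict the hidden originals. This is only a toy illustration of the masking step (the function name, mask rate, and `[MASK]` handling here are simplified assumptions, not the models' exact recipe):

```python
import random

def mask_tokens(tokens, mask_rate=0.15, seed=0):
    """Replace a random subset of tokens with "[MASK]".

    Returns the masked sequence and a dict mapping each masked position to its
    original token - the targets a masked-language model learns to predict.
    """
    rng = random.Random(seed)  # seeded for reproducibility in this sketch
    masked, targets = [], {}
    for i, token in enumerate(tokens):
        if rng.random() < mask_rate:
            targets[i] = token
            masked.append("[MASK]")
        else:
            masked.append(token)
    return masked, targets

sentence = ["the", "quick", "brown", "fox", "jumps", "over", "the", "dog"]
masked, targets = mask_tokens(sentence, mask_rate=0.3)
print(masked, targets)
```

Training on this fill-in-the-blank objective over a huge corpus is what gives the model its general grasp of language before any search-specific fine-tuning.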

The piece has been authored by Mohit Bohra, SEO Executive, BC Web Wise.


© 2024 Media Samosa – All Right Reserved.