Google Algorithms That Affect Your Blog's Optimization

In this article, we will discuss the Google algorithms that affect your blog's optimization, including the most recent updates.

Google Algorithms Affecting Web Optimization

For webmasters, Google's algorithms are essential knowledge. Beyond knowing them, webmasters need to work out how to satisfy them. The goal, of course, is to secure the best possible ranking for the site.

As is well known, web optimization is closely tied to on-page and off-page work. Both on-page and off-page optimization are done so that search engines, including Google, judge our website worthy of the first page of the SERP (Search Engine Results Page).

Google's algorithms are, more or less, what give rise to these on-page and off-page factors. Every on-page factor must meet the criteria it is expected to meet, and the same goes for every off-page factor.

However, on-page and off-page optimization do not cover every criterion that search engines like Google evaluate. There are further factors, and together all of these factors make up Google's algorithms.

So which Google algorithms do webmasters need to know? How does each one assess whether a page deserves a top ranking? Check out the following information.

Google rolls out updates every year, especially to the algorithms that rank web pages. When an update lands, websites that do not meet its criteria, or that it judges unfavorably, suffer the consequences.

The consequence is reduced traffic, which in turn means reduced income. This can hit any website, including major ones, if they fail to meet the criteria the Google algorithm is based on.

Several large websites, even well-known newspapers such as dailymail.co.uk, have seen traffic drop by as much as 50% after a recent Google algorithm update. CCN.com lost 71% of its traffic, with revenue falling by as much as 90%.

So is this a bad thing, or is there a positive side? It depends on how we view Google's developers, and on how neutral Google really is when it ranks websites.

Google's neutrality should, of course, be grounded in user feedback and criticism and aimed at user convenience. As long as that holds, Google is justified in updating its algorithms.

So what are the Google algorithms year by year, up to the latest updates in 2019? Here is a summary, kept as concise and to the point as possible.

1. Panda

Panda was released on February 24, 2011. Initially it was a filter that assigned a quality score to each web page, based on plagiarism, content length, spam, and keyword stuffing.

In 2016, Panda's scoring was folded into the core algorithm Google uses to rank pages. Since then, pages detected with plagiarized or thin content, user-generated spam, or excessive keyword stuffing receive a low score, and their rankings drop.

So avoid duplicate content, keep content at a reasonable length, and use keywords as effectively (and sparingly) as possible. Make sure the page meets these criteria as closely as you can, then check its ranking again.
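As a rough pre-publication self-check, you can measure how often a keyword appears relative to the total word count. The sketch below is a minimal illustration in Python; the 3% threshold is an assumption for demonstration, not a figure Google publishes, and real Panda scoring is far more sophisticated.

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Share of words in `text` that exactly match `keyword` (single word)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words)

article = ("SEO tips for blogs. These SEO tips cover on page SEO "
           "and off page SEO basics.")
density = keyword_density(article, "SEO")
# 3% is an illustrative rule of thumb, not an official Google threshold.
if density > 0.03:
    print(f"Possible keyword stuffing: density {density:.1%}")
else:
    print(f"Keyword density looks reasonable: {density:.1%}")
```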

2. Penguin

On April 24, 2012, Google released an algorithm that evaluates links and anchor text. In 2016, this algorithm too became part of the core Google uses to determine page rankings.

If a page contains inappropriate links, or an unnatural number of them, it will be judged as spam. The same goes for over-optimized anchor text: anchor text that is too dominant looks suspicious and counts against the page.

Check that the links on your page are placed sensibly: not too few, but not so many that the page looks like spam. Also check that each link is sound, returns no errors, and does not point to a bad source.
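One simple way to catch broken links is to request each URL and inspect the status code. Below is a minimal sketch using only Python's standard library; the `links` list is hypothetical, and a real audit would also handle redirects, rate limits, and robots rules.

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

# Hypothetical outbound links collected from one page.
links = ["https://example.com/", "https://example.com/missing-page"]

for url in links:
    req = Request(url, headers={"User-Agent": "link-audit/0.1"})
    try:
        with urlopen(req, timeout=10) as resp:
            print(url, "->", resp.status)        # 200 means the link is fine
    except HTTPError as e:
        print(url, "-> broken:", e.code)         # e.g. 404 for a missing page
    except URLError as e:
        print(url, "-> unreachable:", e.reason)  # DNS or connection failures
```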

When adding links, make sure the anchors are not always the exact target keyword. The trick is to use longer but still related phrases, or synonyms.
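To see whether one anchor text dominates your backlink profile, a quick tally is enough. This is a toy sketch: the anchors list is made up, and the 40% cutoff is an illustrative assumption, not a documented Penguin rule.

```python
from collections import Counter

# Hypothetical anchor texts of backlinks pointing at one page.
anchors = [
    "cheap web hosting", "cheap web hosting", "cheap web hosting",
    "this guide", "example.com", "read more", "cheap web hosting",
]

counts = Counter(anchors)
total = len(anchors)
for anchor, n in counts.most_common():
    share = n / total
    # 40% is an arbitrary illustrative cutoff, not a documented rule.
    flag = "  <-- suspiciously dominant" if share > 0.40 else ""
    print(f"{anchor!r}: {n} ({share:.0%}){flag}")
```

If one exact-match anchor accounts for most of the profile, vary it with longer related phrases and synonyms as described above.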

3. Hummingbird

Hummingbird was released on August 22, 2013 as an algorithm for assessing keyword stuffing and page content. It can detect synonyms of a keyword and treat them as equivalent to the original keyword.

It also looks at the content of the page itself: the algorithm can recognize how good that content is, rather than just detecting words without understanding their meaning.

Content quality is judged by how well it matches its keywords. If the content contains nothing appropriate or related to its keywords, it is judged as unsuitable content.

Improve keyword placement on your website, and do your research so that every piece of content you publish is relevant to its target keywords. Remember that Hummingbird detects synonyms and related terms, so content can be judged relevant even without the exact keyword, as long as it is similar and related.
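As a toy illustration of synonym-aware matching, the sketch below checks whether content covers a keyword through any listed synonym. The synonym map is hand-written for the example; Hummingbird itself relies on Google's own language understanding, not a lookup table like this.

```python
# Hand-written synonym map, purely for illustration.
SYNONYMS = {
    "cheap": {"cheap", "affordable", "inexpensive", "budget"},
    "laptop": {"laptop", "notebook"},
}

def covers_keyword(content_words, keyword):
    """True if the content uses the keyword or any listed synonym."""
    return bool(SYNONYMS.get(keyword, {keyword}) & content_words)

content = set("our guide picks an affordable notebook for students".split())
for kw in ("cheap", "laptop"):
    print(kw, "covered:", covers_keyword(content, kw))
```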

4. Pigeon

Pigeon assesses the on-page and off-page quality of web pages: every on-page and off-page factor is checked against its criteria. So make sure your pages are optimized both on-page and off-page.

Pigeon also adjusts rankings based on the user's location: a page that ranks well in one location may rank differently for users elsewhere. The algorithm was released on July 24, 2014.

5. Mobile

The mobile update was released on April 21, 2015 as an algorithm for assessing how well pages work on mobile devices. Since then, every web page has needed to meet the criteria for comfortable access from a phone.

Webmasters can no longer target desktop users alone; they also need to target mobile users, whose numbers grow year after year. Research all the criteria that make a page mobile friendly.
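One small signal of mobile readiness (far from the whole picture) is whether a page declares a responsive viewport meta tag. Here is a minimal sketch using Python's standard library; Google's own Mobile-Friendly Test checks much more than this.

```python
from html.parser import HTMLParser

class ViewportFinder(HTMLParser):
    """Scans HTML for the responsive viewport meta tag."""
    def __init__(self):
        super().__init__()
        self.has_viewport = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name") == "viewport":
            self.has_viewport = True

# A tiny sample document; in practice you would feed the fetched page HTML.
html = ('<html><head><meta name="viewport" '
        'content="width=device-width, initial-scale=1"></head></html>')
finder = ViewportFinder()
finder.feed(html)
print("Viewport meta tag present:", finder.has_viewport)
```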

6. Rank Brain

On October 26, 2015, Google released RankBrain, an algorithm that focuses on interpreting specific queries and on user experience. Webmasters now need to focus more on each page's content and prioritize user comfort.

7. Possum

Possum was released on September 1, 2016. It ranks pages according to location, making competition more specific to where the searcher is. Webmasters must therefore make sure their keyword competition matches their target locations.

8. Fred

Fred, released on March 8, 2017, rates content and its affiliate links, which usually involve advertising. Even good content with too many affiliate links will be rated negatively under Fred.
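To get a feel for how affiliate-heavy a page is, you can tally how many of its outbound links look like affiliate links. The URL patterns below are illustrative guesses (affiliate URL formats vary widely by network), and the ratio is only a rough signal, not Fred's actual scoring.

```python
import re

# Illustrative affiliate-URL patterns; real formats vary by network.
AFFILIATE_PATTERNS = [r"[?&]aff(id|iliate)?=", r"/ref=", r"[?&]tag="]

def affiliate_ratio(links):
    """Share of links matching an affiliate-looking pattern (toy heuristic)."""
    if not links:
        return 0.0
    hits = sum(
        1 for url in links
        if any(re.search(p, url) for p in AFFILIATE_PATTERNS)
    )
    return hits / len(links)

links = [
    "https://example.com/product?tag=myblog-20",
    "https://example.com/review",
    "https://example.com/item/ref=blog123",
]
print(f"Affiliate-looking links: {affiliate_ratio(links):.0%}")
```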

Those are the various Google algorithms, sorted by update. Hopefully this information helps webmasters anticipate the impact of Google's algorithm updates. Look forward to more reviews on MR JIM EU ORG.
