Google’s algorithms are each more interesting than the last. Speaking as a typical user who runs many Google searches every day, I can see how much this search engine revolves around its users: it is constantly trying to make search easier for them and to bring them better results.
Well, if you remember, we have already talked about the Panda, Penguin, Hummingbird, and Pigeon algorithms. In this article we want to add another algorithm to that list, or more precisely, to review another update: a fairly recent update known as Fred.
Where did the story of Google’s Fred update begin?
The story began on March 7, 2017, when crowds of webmasters looked at the search results pages, some of them in alarm. At the time there were scattered reports of changes in Google’s search results, but Google stayed silent as usual and gave webmasters no specific information.
Webmasters interpreted these changes as the result of a new update, pieced together what they could, and shared their findings with others. In this article we will examine this update, but first let’s look at the curious story of how Fred got its name.
Just two days after the update began rolling out, the name Fred was chosen for it as something of a joke. The story started when Ryan Jones (of the agency SapientRazorfish) said in a tweet: “Anyone who follows Gary Illyes (Google’s Webmaster Trends Analyst) knows that he calls everything that doesn’t already have a name Fred: fish, people, or whatever else you can think of.”
A few hours after this tweet, Barry Schwartz (a search engine specialist) asked Gary Illyes whether he wanted to name the new update.
“Sure. From now on, we’ll call every unofficial update Fred,” Illyes replied.
And that’s how Fred was born.
What is Fred, really?
Be careful: Fred is not an algorithm. Fred is an update applied to the core of the search engine, and it changed the ranking of sites.
What was Google’s goal with the Fred update?
Google has not published details about it even after formally confirming the update. At first it seemed that sites with thin, low-quality content were being penalized, but that was not the whole story; by digging into the penalized sites, SEO experts uncovered more.
They found that Fred targeted sites using black-hat SEO techniques to make money: sites with low-quality content and a large amount of spammy advertising. On these sites the user experience (UX) counts for virtually nothing; the only thing that matters is squeezing out more profit.
Many of these sites sacrificed the user experience to make money and piled advertising onto their pages. Of course, once they became aware of the issue, some of these sites won back a relatively better position on Google, like the following site, which improved on its previous ranking by improving its user experience, publishing quality content, and fixing its technical SEO:
Which sites were hit hardest by the update?
In a useful move, the site seroundtable.com compiled a list of 100 different sites that had lost their place in the search results after the update. Barry Schwartz then reviewed those 100 sites and came to some interesting conclusions. He found that most of the 100 sites shared similar characteristics:
- Their content was extremely thin and unhelpful.
- Advertisements appeared in a thoroughly annoying manner.
- To read any content of real value, the user had to keep clicking through different links.
- Pop-up pages opened constantly.
- Overall, the sites carried a higher percentage of advertising than content.
The Fred update did something else as well: it penalized sites whose backlink profiles were spammy and stuffed with ads. Do you know why?
To answer this question, let’s put ourselves in Google’s place. Would you trust sites that offer no useful content and give you no decent user experience? Obviously not. Well, Google is just like you: it does not trust such sites either, and it knows that their links say nothing about the quality of a site’s content; those links exist only for the site owners’ benefit, not for the users who actually visit.
Look at the two screenshots below, taken from the Analytics page of one site owner:
Unlike the sites with the characteristics above, this site improved overnight: since March 7th, its traffic has increased by 125%.
If you would like to see a sample of the penalized sites, look at the picture below. As you can see, this site offers no pleasant experience at all, cramming ads into every piece of its thin, low-quality content.
What should we do to stay safe from Fred?
See, as I said, Fred is an update to the core of the search engine that improved it; it is not a separate component (like the Panda algorithm) that was bolted on later. So you can’t really say that Fred penalizes websites: it is the whole Google search engine that monitors your site, checks it, and penalizes you if you break the rules. That means you need to follow a set of general guidelines to avoid getting caught out by Google and its crawlers. You can find those guidelines in the SEO article containing the on-page and off-page checklist for a site. But let’s review a few points that relate specifically to Fred.
First point: content, content and content again
Stuffing your content with keywords, or even hitting some keyword-density target, is not the answer. When quality content is at stake, you should put users first and create content for them. If you want to create valuable content, be sure to:
- Keep each article focused on a single topic as much as possible.
- Write articles that engage your audience and, first and foremost, answer their questions.
- Use site analytics tools to see which posts bring the most traffic to your site.
- Use analytics tools to check your bounce rate and time on page.
- Regular posts, infographics, and evergreen content (content that keeps its value and does not go stale over time) increase your followers and bring users back to your site. More return visits will raise your site’s credibility.
- Use video content.
- Work on social networks. Being active on these networks and collecting likes and followers is the fastest way to win back lost credibility.
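Since a couple of the points above mention bounce rate and return visits, here is a minimal sketch of how those two ratios are computed. The session records and their numbers are hypothetical, invented for illustration; real figures would come from your analytics tool.

```python
# Minimal sketch: bounce rate and return-visit rate from
# hypothetical analytics session records.

def bounce_rate(sessions):
    """Share of sessions that viewed only one page."""
    bounced = sum(1 for s in sessions if s["pageviews"] == 1)
    return bounced / len(sessions)

def return_rate(sessions):
    """Share of sessions coming from returning visitors."""
    returning = sum(1 for s in sessions if s["returning"])
    return returning / len(sessions)

# Hypothetical data: each dict is one session.
sessions = [
    {"pageviews": 1, "returning": False},
    {"pageviews": 4, "returning": True},
    {"pageviews": 1, "returning": False},
    {"pageviews": 3, "returning": True},
]

print(f"bounce rate: {bounce_rate(sessions):.0%}")  # 2 of 4 sessions bounced
print(f"return rate: {return_rate(sessions):.0%}")  # 2 of 4 were returning
```

A falling bounce rate and a rising return rate over time are exactly the signals the list above tells you to watch.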
Second point: good user-experience design
Your website should be easy for your target audience to use; so if you want users to have a good experience with your website, follow these guidelines:
- Define personas for your target audience and design the user experience around them.
- Run A/B tests to figure out what your audience prefers.
- Use a heat map to find out which parts of the page users focus on most and which parts they do not interact with.
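To make the A/B-test idea concrete, here is a minimal sketch comparing the click-through rate of two page variants. All the numbers are hypothetical; a real test would also need a statistical-significance check before declaring a winner.

```python
# Minimal A/B-test sketch with hypothetical click data:
# which of two page variants earns the better click-through rate?

def ctr(clicks, impressions):
    """Click-through rate for one variant."""
    return clicks / impressions

# Hypothetical traffic for two versions of a landing page.
variant_a = {"clicks": 90, "impressions": 1000}
variant_b = {"clicks": 120, "impressions": 1000}

ctr_a = ctr(**variant_a)
ctr_b = ctr(**variant_b)

winner = "A" if ctr_a > ctr_b else "B"
print(f"A: {ctr_a:.1%}  B: {ctr_b:.1%}  -> prefer variant {winner}")
# A real test should confirm the difference is statistically
# significant before rolling out the winning variant.
```

The design choice here is deliberate: measure one clear metric per test, and change only one element between variants so you know what caused the difference.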
Third point: keep the advertising on your site relevant
Advertising is a fact of the Internet, and no one can deny it. You have probably landed on pages so full of ads that their content could hardly be seen. For example, you search for “chess training,” open a page from the results, and are greeted by a wall of banner ads; you scroll down, and still there is no content, only more links to other pages and sites. Yes, the site is full of links, banners, and pop-ups.
Users hate these sites and leave them immediately. That means the user experience (UX) is awful, and Google does not let such a mistake slide. So if you want your site to stay clear of Google’s penalties, keep the ads on your site relevant, and prefer nofollow links over dofollow links for them.
Note that the user should never have to click through page after page to reach the content. Your goal should be to let users reach what they want easily.
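As a quick self-audit of the nofollow advice above, here is a minimal sketch that counts dofollow versus nofollow links in a page using only Python’s standard-library HTML parser. The page fragment is hypothetical.

```python
# Minimal sketch: counting dofollow vs. nofollow links in an
# HTML page with Python's standard-library parser.
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.nofollow = 0
        self.dofollow = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        rel = dict(attrs).get("rel") or ""
        # A link is nofollow if "nofollow" appears among its rel tokens.
        if "nofollow" in rel.split():
            self.nofollow += 1
        else:
            self.dofollow += 1

# Hypothetical page fragment: one editorial link, one ad link.
html = '''
<p><a href="/article">Read the article</a></p>
<p><a href="https://ads.example.com" rel="nofollow sponsored">Ad</a></p>
'''

audit = LinkAudit()
audit.feed(html)
print(f"dofollow: {audit.dofollow}, nofollow: {audit.nofollow}")
```

Run against your own pages, a high share of dofollow links pointing at advertisers is exactly the pattern this section warns about.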
Fourth point: review your backlinks
In the old days, one of Google’s ranking factors was the number of backlinks each site had: the more links a site collected, the better it ranked. Of course, Google soon realized that this model had turned some site owners into sellers of expensive links (with no regard for the buying site’s content or even any topical similarity to it) and had let low-quality sites climb to the top. Fortunately, though, Google did not simply discard the factor; it replaced the raw link count with better signals.
Backlinks still have a tremendous impact on SEO, of course, provided they come from a credible, trusted source. Reputable sites are like the trusted elders of a neighborhood or family, whose word is as good as any signed document. If they vouch for you, rest assured your standing is solid!
If you have ever bought links and now have no place in Google’s results, it’s time to clean up. First, try to get the spammy backlinks removed; then use Google Search Console to tell Google which links to ignore in its analysis (that is, disavow them). Google suggests talking to the linking sites’ administrators to remove your links where possible, but if that fails, it’s fine to disavow them.
Tip: Sometimes you have not used spam links yourself, but competitors use this method to sink your site. In such cases you should watch over your backlinks, monitor them with tools such as the Semrush backlink audit, and then disavow the bad ones through Google Search Console.
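The disavow step above comes down to uploading a plain-text file to Search Console. Here is a minimal sketch that builds one; the spammy domains and URL are hypothetical, but the file format (one `domain:` entry or full URL per line, `#` for comments) is the one Google’s disavow tool accepts.

```python
# Minimal sketch: building a disavow.txt file for Google Search
# Console from a hypothetical list of spammy domains and URLs.

spam_domains = ["spammy-links.example", "paid-farm.example"]
spam_urls = ["https://blog.example/comment-spam-page.html"]

lines = ["# Links disavowed after backlink audit"]
lines += [f"domain:{d}" for d in spam_domains]  # disavow whole domains
lines += spam_urls                              # disavow single pages

disavow_txt = "\n".join(lines) + "\n"
print(disavow_txt)

# Save the file, then upload it via Search Console's disavow tool.
with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write(disavow_txt)
```

Prefer `domain:` entries when an entire site is spamming you; individual URLs are only worth listing when the rest of the domain is legitimate.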
A final word
If you ask me, I honestly don’t much care what Google updates or which algorithms it adds; you should always put your users first, in content creation and even in site design, and see what they like best. Only then think about how to make the site understandable to Google and present it in the best light. Google continually tunes its updates to the behavior of real users, so be sure to prioritize users; whether now or years from now, that will bring you no harm, only benefit!