About this sample
Words: 895 | Pages: 2 | 5 min read
Published: Mar 14, 2019
Insights on all things SEO are invaluable, especially when they come from John Mueller. Recently, the Senior Webmaster Trends Analyst at Google answered some pressing questions on the use of nofollow links, duplicate content, and more. Most of his answers came from an "Ask Me Anything" (AMA) session on Reddit. Here are 10 things an SEO can learn from his comments, including excerpts of the answers he gave to Reddit users.
The essence of the question is simply this: now that websites are gravitating toward mobile-first indexing, would hiding some design features to enhance user experience (UX) have a negative impact on rankings?
From Mueller's answer, it can be inferred that although some design features have to be compromised on the mobile-optimized version of a web page, important UX elements should be fully loaded as soon as users open the page. He acknowledged that Google understands why certain layout features have to be hidden, but stressed that it is still important to promote user interaction for better conversion rates.
The nofollow directive is a value added either to a page's robots meta tag in the HTML source code or to an individual link, suggesting that Google not follow the external links in question. It has been used by many websites.
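As a brief illustration (URLs here are placeholders), nofollow can be declared page-wide in a robots meta tag or per link via the rel attribute:

```html
<!-- Page-level: ask crawlers not to follow any links on this page -->
<meta name="robots" content="nofollow">

<!-- Link-level: ask crawlers not to follow this one outbound link -->
<a href="https://example.com/" rel="nofollow">Example site</a>
```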
Mueller said that Google has no plans to ignore nofollow all of a sudden, since it causes no issues and remains useful to the search engine. He did point out, however, that things tend to evolve over time, so an algorithm update that changes how nofollow is handled cannot be ruled out.
This refers to the number of clicks a site appearing on a SERP receives, divided by its impressions. Many in the SEO profession seem to think that organic CTR contributes to search rankings, but the Google executive says it is probably not as influential as most people believe. His answer suggests that people do "weird things" after a search, like hitting the back button, which means clicks on a ranked site are not an accurate measure of its quality. Besides, there are better ways to pursue higher rankings, such as optimizing long-tail keywords, meta descriptions, and so on.
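The metric itself is simple arithmetic; a minimal sketch (function name and sample figures are illustrative, not from the source):

```python
def organic_ctr(clicks: int, impressions: int) -> float:
    """Click-through rate: clicks received divided by impressions shown."""
    if impressions == 0:
        return 0.0  # avoid dividing by zero when a page has no impressions
    return clicks / impressions

# e.g. 30 clicks out of 1,000 impressions is a 3% CTR
print(organic_ctr(30, 1000))  # 0.03
```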
A Reddit user asked whether rankings are inconsistent across regions, as denoted by the country code on Google's websites. This raised the question of whether the search engine uses different algorithms or ranking factors per country. Mueller says Google tries not to adopt distinct algorithms per language or nation, since it is more scalable to build something uniform for the whole web. He puts the inconsistencies, such as rich snippets launching in select markets first, down to local content.
The essence of the question is whether bots and spammers, which illicit marketers can deploy, are responsible for non-organic search trends. In response, Mueller says Google does its best to block and filter them, although captchas pop up even for the most reasonable queries. There should be a way to help people get what they want without having to prove they are not robots, and Mueller rightly feels there should be an easier way to block bots and spammers without relying so heavily on captchas.
A Reddit user whose website has articles on a range of topics asked whether it would be possible to create an article for each question and answer without violating Google's duplicate content guidelines. Mueller suggested a different approach: work on quality, not quantity.
He says that bad SEO advice is, above all, advice that has no effect on rankings in the long run. He equates it to advising a business to change its invoice printing paper: something that consumes time and effort for no real benefit.
It is a common misconception among site owners that publishing content frequently earns higher search rankings. Mueller says Google's algorithms do not give preferential treatment to websites that publish content regularly over those that do not.
One method SEOs use to avoid duplicate content, which may hurt rankings, is to rewrite the source content with subtle differences, using synonyms and related words to make it appear unique. According to Mueller, this can be counterproductive for a website. It is always best to make content genuinely original.
The mobile version of a site is now the priority for what Google includes in its index, making it the main version over the desktop one. This change is rolling out only to sites the search engine considers ready for the move, which is why Mueller tells webmasters not to panic. In other words, until a site is ready for mobile-first indexing, he advises sticking to quality content on the desktop version so it gets crawled and indexed. The logic is to take the time to build the best possible mobile-friendly site, one that ranks higher.