Links are the lifeline of your website’s organic SEO. However, that same lifeline can poison your site’s ranking if link building is done wearing a black hat. This article explains what low-quality links are and how you can identify them, on your own site as well as on other sites, before getting rid of them.
Links: Why are they so important?
Search engines like Google rely on links as one of the crucial signals for ranking websites and pages. It can be understood from Google’s statement that “links are special endorsements by other websites who see a value in your content for their readers”. Simply put, you earn links from other websites when you have unique, valuable content that those websites think is worth showing to their readers.
Google uses links to determine the value of content in its page-ranking algorithm, in much the same way scholars use citations to establish the value and authority of the information in a research paper.
It’s not as simple as it seems!
With black-hat SEO practices, many websites started using artificial methods to increase the number of links pointing to their sites, hoping to improve their rankings. This worked initially, but it degraded Google’s SERPs in terms of relevance and value, because these websites simply stuffed pages with as many links as they could.
This is where Google retaliated with link-checking algorithms that measure link quality and penalize websites using black-hat practices. Google Penguin is one such algorithm, and it was a solid strike against black-hat link-building techniques. It was deployed to ensure that only sites with high-quality links get ranking benefits, while sites with low-quality links are penalized for not adhering to the Webmaster Guidelines.
It was also made clear that ‘quality’ would remain above ‘quantity’. The sheer number of links matters far less than their quality: a few high-quality links will serve your pages better than many poor ones.
Learn: 5 SEO blunders to watch out for on a website
What determines whether a link is of “bad” or “high” quality?
Sadly, there is no official definition of what counts as a “bad link” or a “low-quality link”. It’s confusing. Why? Just read this paragraph from Google’s Quality Guidelines about “link schemes”:
“Any links intended to manipulate PageRank or a site’s ranking in Google search results may be considered part of a link scheme and a violation of Google’s Webmaster Guidelines. This includes any behavior that manipulates links to your site or outgoing links from your site.”
It sounds like anything you would do for link building would just violate the quality guidelines.
Sounds scary? I was scared too at first until I read the last paragraph of the same guideline:
“The best way to get other sites to create high-quality, relevant links to yours is to create unique, relevant content that can naturally gain popularity in the Internet community. Creating good content pays off: Links are usually editorial votes given by choice, and the more useful content you have, the greater the chances someone else will find that content valuable to their readers and link to it.”
So, this brings us to the assertion that high-quality links are:
- the ones that point towards high-quality, relevant, and valuable content.
- the ones that are given editorially, not generated automatically without editorial control.
This also helped me derive a working definition of low-quality links:
- Links that are built automatically, without editorial control, are bad or low-quality links.
- Links to irrelevant content with no value for users are bad or low-quality links.
Well, this definition certainly does not capture the whole idea of bad links, but it is enough for a start before we move on to the more complex, real-world aspects of low-quality links.
The real-world context of low-quality links
We derived some definitions from Google’s guidelines. Now, let’s go a little deeper and see how you can identify low-quality links on your site in a real-world context.
- Low-quality, poor-quality, or bad links are simply links that come from low-quality sites. These sites can be identified with a mix of automated and manual inspection. They usually do not adhere to search engine guidelines and do not follow the recommended standards for quality content.
- Quality content can be identified by inspecting its uniqueness and the real value it adds for readers. That value can be measured by checking what the content provides that other, similar content on the web does not.
- While inspecting uniqueness, you can easily check whether the content is scraped, plagiarized, or poorly repurposed. Such content is not unique and is of low quality, so links coming from sites that add no value for readers are low-quality links.
- As another indicator of low-quality sites, check for an abundance of ads, excessive interlinking, keyword stuffing, and other spam signals. Sites where ads overshadow the main content, internal linking is poor, and keywords are stuffed are routinely identified by Google as low-quality, and the links coming from them are low-quality links (a rough sketch of one such check follows this list).
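Some of these spam signals can be approximated in a few lines of code. Below is a minimal sketch, assuming you only want to estimate keyword density from a page’s visible text; the 5% threshold and the helper names are my own illustration, not an official rule, and a flagged page should still be reviewed by a human.

```python
import re
from collections import Counter

def keyword_density(text: str, keyword: str) -> float:
    """Share of words in `text` that are exactly `keyword` (a rough spam signal)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return hits / len(words)

def looks_stuffed(text: str, keyword: str, density_threshold: float = 0.05) -> bool:
    """Flag a page whose target keyword exceeds a hypothetical density threshold."""
    return keyword_density(text, keyword) > density_threshold

# Example: a page that repeats its keyword in almost every phrase
page = "Cheap shoes here. Buy cheap shoes. Cheap shoes, best cheap shoes online."
print(keyword_density(page, "shoes"))  # ~0.33
print(looks_stuffed(page, "shoes"))    # True -> candidate for manual review
```

A real check would also look at ad-to-content ratio and duplicate content, but the same pattern applies: compute a signal, compare it to a threshold, and queue anything suspicious for review.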
Not everything that looks poor is of low quality
SEO professionals use a variety of data sources to collect hard data for measuring link and site quality. However, the tools they rely on are not necessarily reliable for determining quality.
Professionals use tools like Google Toolbar PageRank (Google took it down in 2016), Google indexation counts, link-count tools, Alexa rankings, and other automated data-collection tools. But I find these tools hardly reliable on their own, as they are not designed to check link quality. They report data based on different assumptions, and that data has to be manually interpreted by SEOs to correlate it with quality.
I can use this data to identify potential culprits, but not to immediately draw the conclusion that “this is a low-quality site”.
Link building is a tough, time-consuming process, especially for new sites. A new site may look poor based on the data signals collected by automated tools, when in fact it simply has not become popular yet.
Learn: How to get traffic to your newly built website?
So a manual review is always recommended, even if you rely on data collected by automated tools to determine quality. The whole task can be simplified by using these tools to flag possible bad links and then manually inspecting the flagged links to confirm. A rough sketch of that flagging step is shown below.
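Here is a minimal sketch of that workflow, assuming you have already exported a few metrics per backlink from whatever tools you use. The metric names, thresholds, and the example domains are hypothetical; the point is that the code only flags candidates, and a human makes the final call.

```python
from dataclasses import dataclass

@dataclass
class LinkMetrics:
    """Hypothetical metrics pulled from your automated data-collection tools."""
    domain: str
    indexed_pages: int    # e.g. pages found via a site: query
    outbound_links: int   # links on the page that links to you
    domain_age_days: int  # young sites get flagged, not condemned

def flag_for_review(m: LinkMetrics) -> bool:
    """Flag, never judge: anything suspicious goes into the manual review queue."""
    too_few_indexed = m.indexed_pages < 10
    link_farm_like = m.outbound_links > 100
    very_new = m.domain_age_days < 90
    return too_few_indexed or link_farm_like or very_new

backlinks = [
    LinkMetrics("example-blog.com", indexed_pages=250, outbound_links=12, domain_age_days=1800),
    LinkMetrics("spammy-directory.net", indexed_pages=3, outbound_links=400, domain_age_days=60),
]
review_queue = [m.domain for m in backlinks if flag_for_review(m)]
print(review_queue)  # ['spammy-directory.net'] -> inspect manually before deciding
```

Note that a brand-new site would also land in this queue; as discussed above, that is exactly why the flag is an invitation to look closer, not a verdict.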
How to perform a manual review?
A manual review should be performed whenever the signals conflict, for example when the data-collection tools indicate a good score but the SEO tools do not give a positive review. You can use these signals to manually review link quality:
Trustworthiness – If, on looking at the site for the first time, you do not feel like visiting it again or you do not trust the information shared there, you are probably looking at a low-quality site.
Scraped content – If the site has no unique information or products, pulls content from other websites without linking back to the original source, and feels like you could get better information on the same topic elsewhere, it is probably a low-quality site.
Ads – Some sites show so many paid ads that they overshadow the main content. Others blend ads into the content so confusingly that you cannot tell the website’s content from the paid ads. Some go beyond ad banners and use automatic redirects to ad pages, triggered by clicks or scroll events. These sites are definitely poor ones, and links from them can kill your SEO.
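If you review many backlinks, it helps to record these three judgments consistently so you can compare sites later. Below is a minimal sketch, assuming you simply record a yes/no answer for each signal above; the field names and the “two strikes” rule are my own illustration, not an official scoring method.

```python
from dataclasses import dataclass

@dataclass
class ManualReview:
    domain: str
    untrustworthy: bool    # would you avoid revisiting, or distrust the information?
    scraped_content: bool  # non-unique content, copied without attribution?
    ad_heavy: bool         # ads overshadow the content or redirect away from it?

    def verdict(self) -> str:
        """Two or more strikes -> treat the linking site as low quality."""
        strikes = sum([self.untrustworthy, self.scraped_content, self.ad_heavy])
        return "low-quality" if strikes >= 2 else "acceptable"

review = ManualReview("spammy-directory.net", untrustworthy=True,
                      scraped_content=True, ad_heavy=False)
print(review.domain, "->", review.verdict())  # spammy-directory.net -> low-quality
```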
To conclude
Finally, there is no need to review the whole site manually if you find a bad page at any point during the review. If a site has one poor-quality page, it is highly likely that most of its pages are poor.
It’s probably time to ask your SEO expert to do a manual review of your own site too and check all the parameters discussed in this post. Hopefully, you now have enough information about low-quality links to prepare a checklist for tracking your SEO provider’s work on them.