
A Specialist’s Remedy to Hide Backlinks from Competitors

How to Hide Your Backlinks from Competitors

 

Backlinks are, after quality content, one of the most important ranking factors. Once a marketer has a decent post, the next step is often to go after the competitors’ backlinks.

Spying on the backlinks of a competitor who already ranks for your target keyword is a very popular link-building strategy. It improves your own chances of ranking for the same keyword.

There are several backlink checker tools on the market, but only a few, such as Majestic and Ahrefs, give reliable results. These two are widely used to look up any site’s backlinks; most other tools have smaller index databases or simply rely on these tools’ data. Since you can’t access a competitor’s Webmaster Tools account, these checkers are what fills the gap.

These tools reveal a competitor’s site data, and most marketers use them for competitor analysis.

At the same time, you can keep your own backlink profile safe from such bots. It’s a sensible way to stay ahead of your competition: by stopping these bots from crawling your site, you prevent competitors from stealing your backlinks.

This is a complete guide to hiding your backlinks. You will learn every aspect of keeping your backlink profile away from these crawlers.

Here is what this post covers:

  1. What are crawlers?
  2. What do we need to block crawlers?
  3. Identify the Bots you want to Block
  4. Download the .htaccess file
  5. Block Bots using .htaccess
  6. Blocking bots using Robots.txt
  7. Plugins to hide backlinks from Competitors
  8. Some Important Points

1. What are crawlers?

Crawlers, or bots, are computer programs designed to crawl the web and collect data by performing automated tasks. The simplest example is a search engine bot such as Googlebot: it visits web pages across the internet and captures a copy so that data like keywords, backlinks, and so on can be referenced later.
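Every bot announces itself with a user agent string. Googlebot, for example, publishes its string as:

Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)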

Some bots also exist for unethical work, such as scraping email addresses from websites or probing for security vulnerabilities.

2. What do we need to block crawlers?

There are a few prerequisites before you can block bots from accessing your backlink profile.

  • Your blog or website must be running on an Apache server. Most commercial web hosts use Apache, so there shouldn’t be any issue in modifying or creating the .htaccess file. On a free web host, you won’t be able to do this.
  • You must have access to your site’s web logs. On a commercial web host, you can see all the web logs in cPanel. Free web hosts don’t allow this.

If your web host meets both of these requirements, you can proceed.

3. Identify the Bots you want to Block

Before you can block bots, you have to identify them. A bot can be recognized by two things:

  1. The IP address the bot is using, or
  2. The name of the bot, known as the user agent string.

The best way to find them is to look in the web logs. Their location varies from host to host. I am using Bluehost; if you are on the same host, you can find the logs quickly in the cPanel section. Search there for Access Logs, from where the web logs can be downloaded to your local drive.

Web logs contain all the data from your visitors. If there is a lot of data in them, identifying a bot can be difficult; if you already know the bots’ names, that saves time.

Now unzip the log file with an archive tool and open it in a plain-text editor, such as Notepad on Windows.

Once you have the list of IP addresses and user agent strings belonging to bots, save them to a file for the next steps.

An IPv4 address is four numbers separated by dots, like 192.168.1.1; bots use an IP address to access your site. The other identifier is the user agent string, a short name used to identify a bot. You aren’t required to use the bot’s complete name; a small chunk of the string that distinguishes it from other bots is enough. That’s it.
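For illustration, here is what a crawler’s entry in an Apache access log might look like (the IP address, timestamp, and URL are made up; the user agent is the one Ahrefs publishes for its bot):

203.0.113.5 - - [12/Mar/2016:10:15:32 +0000] "GET /my-post/ HTTP/1.1" 200 5123 "-" "Mozilla/5.0 (compatible; AhrefsBot/5.0; +http://ahrefs.com/robot/)"

Here the chunk “AhrefsBot” alone is enough to match the bot in the blocking rules that follow.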

4. Download the .htaccess file

Once you have identified the bot’s IP address or user agent string, log in to your web hosting account and look for the File Manager. In many cases, the .htaccess file remains hidden; just enable “show hidden files” and you will be able to locate it.

If, after all possible efforts, you can’t find it on your web hosting server, you will need to create it from scratch. This is a common scenario, as most web hosts don’t provide a default .htaccess file. In that case, open a blank Notepad file and write the code in it.

Note: Always keep a copy of the original version of the .htaccess file. Sometimes you may need to restore it.

5. Block Bots using .htaccess

If you managed to download the .htaccess file from your web host, open it to add the code. Go to the end of the file and append the code there.

If you have created a brand-new file, just place the code at the beginning.

This blocking method works in two ways:

  • Blocking by IP address
  • Blocking by User Agent String

Blocking by IP address:

A simple rule can block the IP address a bot is using:

Order Deny,Allow
Deny from 127.0.0.1

Replace the IP address (127.0.0.1) with the one you want to block. “Order Deny,Allow” means that if the server receives a request matching a Deny rule, it denies the request; if the request doesn’t match any Deny rule, it is allowed.

The second line says that requests coming from 127.0.0.1 should receive a “Forbidden” error instead of the actual page.

If you have more IPs to block, just add another “Deny from” line with the target IP address.

For example:

Order Deny,Allow
Deny from 127.0.0.1
Deny from 72.56.45.1
Deny from 191.168.1.78
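One caveat: the Order/Deny syntax above belongs to Apache 2.2 (available through mod_access_compat in later versions). If your host runs Apache 2.4, a sketch of the equivalent rules in the newer mod_authz_core syntax, with the same placeholder IPs, looks like this:

<RequireAll>
    Require all granted
    Require not ip 127.0.0.1
    Require not ip 72.56.45.1
    Require not ip 191.168.1.78
</RequireAll>

Both syntaxes also accept partial addresses and CIDR ranges (for example, “Deny from 72.56” or “Require not ip 72.56.0.0/16”), which helps when a bot rotates through addresses within one block.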

Blocking by user agent string:

With the user agent string, you can easily block web crawlers. Here we use a module built into Apache called mod_rewrite, enabled with the RewriteEngine directive. By matching on the user agent string, you can block any bot with a 403 Forbidden error.

Let’s see some examples:

RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} AhrefsBot [OR]
RewriteCond %{HTTP_USER_AGENT} msnbot [OR]
RewriteCond %{HTTP_USER_AGENT} AltaVista [OR]
RewriteCond %{HTTP_USER_AGENT} MJ12
RewriteRule . - [F,L]

The F flag returns the Forbidden error, and L marks this as the last rule to process for the request.
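If mod_rewrite isn’t available on your server, here is a sketch of an alternative using Apache’s mod_setenvif; it tags matching bots with an environment variable and denies them, producing the same 403 error:

SetEnvIfNoCase User-Agent "AhrefsBot" bad_bot
SetEnvIfNoCase User-Agent "MJ12" bad_bot
Order Allow,Deny
Allow from all
Deny from env=bad_bot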

Once you have added all the blocking rules in the .htaccess file, save it. If you created a brand-new file in Notepad, save it under the name “.htaccess”, including the quotation marks, so that Notepad doesn’t append a .txt extension. Now upload the file to your web host, overwriting the previous one.
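You can verify the block from the command line by spoofing a blocked user agent (this assumes curl is installed; replace the domain with your own):

curl -I -A "AhrefsBot" http://yourdomain.com/

If the rules work, the response begins with HTTP/1.1 403 Forbidden instead of 200 OK.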

6. Blocking bots using Robots.txt

Web crawlers can also be kept away using the Robots.txt file on your server. A simple Disallow rule does the job very easily.
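The pattern is a group of User-agent lines followed by a Disallow line. Note that blank lines separate groups in Robots.txt, so there must be none inside a group. A minimal example for a single bot:

User-agent: AhrefsBot
Disallow: /

The full list below applies the same Disallow: / rule to every named bot at once.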

User-agent: Rogerbot
User-agent: Exabot
User-agent: MJ12bot
User-agent: Dotbot
User-agent: Gigabot
User-agent: AhrefsBot
User-agent: BlackWidow
User-agent: ChinaClaw
User-agent: Custo
User-agent: DISCo
User-agent: Download Demon
User-agent: eCatch
User-agent: EirGrabber
User-agent: EmailSiphon
User-agent: EmailWolf
User-agent: Express WebPictures
User-agent: ExtractorPro
User-agent: EyeNetIE
User-agent: FlashGet
User-agent: GetRight
User-agent: GetWeb!
User-agent: Go!Zilla
User-agent: Go-Ahead-Got-It
User-agent: GrabNet
User-agent: Grafula
User-agent: HMView
User-agent: HTTrack
User-agent: Image Stripper
User-agent: Image Sucker
User-agent: Indy Library
User-agent: InterGET
User-agent: Internet Ninja
User-agent: JetCar
User-agent: JOC Web Spider
User-agent: larbin
User-agent: LeechFTP
User-agent: Mass Downloader
User-agent: MIDown tool
User-agent: Mister PiX
User-agent: Navroad
User-agent: NearSite
User-agent: NetAnts
User-agent: NetSpider
User-agent: Net Vampire
User-agent: NetZIP
User-agent: Octopus
User-agent: Offline Explorer
User-agent: Offline Navigator
User-agent: PageGrabber
User-agent: Papa Foto
User-agent: pavuk
User-agent: pcBrowser
User-agent: RealDownload
User-agent: ReGet
User-agent: SiteSnagger
User-agent: SmartDownload
User-agent: SuperBot
User-agent: SuperHTTP
User-agent: Surfbot
User-agent: tAkeOut
User-agent: Teleport Pro
User-agent: VoidEYE
User-agent: Web Image Collector
User-agent: Web Sucker
User-agent: WebAuto
User-agent: WebCopier
User-agent: WebFetch
User-agent: WebGo IS
User-agent: WebLeacher
User-agent: WebReaper
User-agent: WebSauger
User-agent: Website eXtractor
User-agent: Website Quester
User-agent: WebStripper
User-agent: WebWhacker
User-agent: WebZIP
User-agent: Wget
User-agent: Widow
User-agent: WWWOFFLE
User-agent: Xaldon WebSpider
User-agent: Zeus
Disallow: /

The example above adds all the popular web crawlers and site downloaders to Robots.txt for blocking.

Note: You shouldn’t rely on the Robots.txt file alone, because some bots don’t obey the rules defined in it; the blocking rules should also be set in the .htaccess file.

7. Plugins to hide backlinks from Competitors

Plugins add extra features to WordPress, so there’s no need to do anything manually; by adding a relevant plugin, you get the required function. If you don’t want competitors to be able to see your site’s backlinks, plugins can do that too.

There are two such plugins available for WordPress. The first is Spyder Spanker, a premium plugin; the other is Link Privacy, a freemium solution for hiding backlinks from competitors. I haven’t used either of them, because I haven’t felt the need to hide my backlinks from competitors.

However, both of these plugins are helpful if you don’t want to get your hands dirty editing the .htaccess file.

These plugins stop your backlinks from being indexed into the web crawlers’ databases.

If you have used either of them before, or are currently using one on your blog or website, share your experience.

Recommended: 8 Best Security Plugins for WordPress to Protect Your Blog from Hackers

8. Some Important Points  

  • Be careful while blocking an IP address: it may belong to an ordinary person who doesn’t care about your backlinks at all. Go ahead only if you are confident about the IP.
  • Blocking one IP a bot is using doesn’t impose a permanent ban on that bot. It may come back from different IPs and try to explore your backlink profile again.
  • Do all of this work during the development phase of your website. Once the bots have crawled your backlinks, it takes a long time for your backlink data to be cleaned from their databases.
  • The above methods are not a permanent solution against web bots. New tools come onto the market every day, so it’s almost impossible to identify and block them all.
  • Try to get links from authority sites; then, even if competitors learn about your backlinks, those links will be difficult for them to earn.
  • Some people use a 301-redirect method to hide backlinks from crawlers, but it’s not as effective as the solutions above. They buy another domain, build backlinks to it, and redirect it to their primary domain (the redirect itself is a one-line rule, shown below). This way they try to hide their backlinks, but in most cases bots can sense the presence of a 301 redirect.
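For reference, here is a sketch of what such a redirect looks like in the buffer domain’s .htaccess (the domain name is a placeholder):

Redirect 301 / http://your-primary-domain.com/

Because the redirect is visible in the HTTP response itself (a 301 status with a Location header pointing at the final domain), backlink crawlers can simply follow it and attribute the links to the primary domain.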

Conclusion:

Internet marketing is becoming more competitive by the day. People are ever more eager to profit from your hard work and will try to rank their websites any way they can. Stealing backlinks is one strategy a competitor can use to outrank your site in the search results. In this situation, hiding your backlinks can help you maintain your rankings for longer.

Code Reference:

http://goo.gl/NRwTIm

http://goo.gl/NCRnYn

 

