Search engine optimization, at its most basic, relies upon one thing above all others: search engine spiders crawling and indexing your site.
But almost every website has pages you don’t want included in this exploration.
In a best-case scenario, these pages do nothing to actively drive traffic to your site, and in a worst-case, they could be diverting traffic away from more important pages.
Thankfully, Google allows webmasters to tell search engine bots which pages and content to crawl and which to ignore. There are several ways to do this, the most common being a robots.txt file or the meta robots tag.
We have an excellent and detailed explanation of the ins and outs of robots.txt, which you should absolutely read.
But in top-level terms, it’s a plain text file that lives in your site’s root and follows the Robots Exclusion Protocol (REP).
Robots.txt provides crawlers with instructions about the site as a whole, while meta robots tags include directives for specific pages.
Some meta robots tags you might use include:
- index – tells search engines to add the page to their index.
- noindex – tells them not to add the page to the index or include it in search results.
- follow – instructs search engines to follow the links on a page.
- nofollow – tells them not to follow the links on a page.
And there is a whole host of others.
Both robots.txt and meta robots tags are useful tools to keep in your toolbox, but there’s also another way to instruct search engine bots to noindex or nofollow: the X-Robots-Tag.
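As a quick sketch, a meta robots tag sits in a page’s head section. The example below keeps a hypothetical page out of the index while still letting crawlers follow its links:

```html
<!-- In the page's <head>: keep this page out of the index,
     but still follow the links it contains -->
<meta name="robots" content="noindex, follow">
```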
What Is The X-Robots-Tag?
The X-Robots-Tag is another way for you to control how your webpages are crawled and indexed by spiders. As part of the HTTP header response for a URL, it controls indexing for an entire page, as well as for specific elements on that page.
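To make that concrete, here is an illustrative HTTP response (the values are placeholders) in which a server attaches the tag to a PDF it serves:

```http
HTTP/1.1 200 OK
Content-Type: application/pdf
X-Robots-Tag: noindex, nofollow
```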
And whereas using meta robots tags is fairly straightforward, the X-Robots-Tag is a bit more complicated.
But this, of course, raises the question:
When Should You Use The X-Robots-Tag?
According to Google, “Any directive that can be used in a robots meta tag can also be specified as an X-Robots-Tag.”
While you can set robots meta tag-related directives in the headers of an HTTP response with both the meta robots tag and the X-Robots-Tag, there are certain situations where you would want to use the X-Robots-Tag. The two most common are when:
- You wish to control how your non-HTML files are being crawled and indexed.
- You want to serve instructions site-wide instead of on a page level.
For example, if you want to block a specific image or video from being crawled, the HTTP response approach makes this easy.
The X-Robots-Tag header is also useful because it allows you to combine multiple tags within an HTTP response, or to use a comma-separated list of directives to specify rules.
Perhaps you don’t want a certain page to be cached and want it to be unavailable after a certain date. You can use a combination of the “noarchive” and “unavailable_after” directives to instruct search engine bots to follow these instructions.
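As a sketch, those two directives can be combined in a single header value; the date below is purely illustrative:

```http
X-Robots-Tag: noarchive, unavailable_after: 25 Jun 2023 15:00:00 PST
```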
Basically, the power of the X-Robots-Tag is that it is much more flexible than the meta robots tag.
The advantage of using an X-Robots-Tag with HTTP responses is that it allows you to use regular expressions to apply crawl directives to non-HTML content, as well as to apply parameters on a larger, global level.
To help you understand the difference between these directives, it’s helpful to categorize them by type. That is, are they crawler directives or indexer directives?
Here’s a handy cheat sheet to explain:
|Crawler Directives|Indexer Directives|
|---|---|
|Robots.txt – uses the user-agent, allow, disallow, and sitemap directives to specify where on-site search engine bots are allowed and not allowed to crawl.|Meta robots tag – allows you to specify and prevent search engines from showing particular pages of a site in search results.|
||Nofollow – allows you to specify links that should not pass on authority or PageRank.|
||X-Robots-Tag – allows you to control how specified file types are indexed.|
Where Do You Put The X-Robots-Tag?
Let’s say you want to block specific file types. An ideal approach would be to add the X-Robots-Tag to an Apache configuration or a .htaccess file.
The X-Robots-Tag can be added to a site’s HTTP responses in an Apache server configuration via the .htaccess file.
Real-World Examples And Uses Of The X-Robots-Tag
So that all sounds great in theory, but what does it look like in the real world? Let’s take a look.
Let’s say we wanted search engines not to index .pdf file types. This configuration on Apache servers would look something like the below:
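A sketch of what that might look like in an Apache configuration or .htaccess file (this assumes the mod_headers module is enabled):

```apache
# Attach a noindex, nofollow X-Robots-Tag to every PDF served
<Files ~ "\.pdf$">
  Header set X-Robots-Tag "noindex, nofollow"
</Files>
```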
In Nginx, it would look like the below:
```nginx
# Attach the same header to PDFs in an Nginx server block
location ~* \.pdf$ {
  add_header X-Robots-Tag "noindex, nofollow";
}
```
Now, let’s look at a different scenario. Let’s say we want to use the X-Robots-Tag to block image files, such as .jpg, .gif, .png, etc., from being indexed. You could do this with an X-Robots-Tag that would look like the below:
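A sketch for Apache, again assuming mod_headers is enabled; the regular expression matches the image extensions by file name:

```apache
# Keep common image formats out of the index
<Files ~ "\.(png|jpe?g|gif)$">
  Header set X-Robots-Tag "noindex"
</Files>
```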
Please note that understanding how these directives work and the impact they have on one another is crucial.
For example, what happens if both the X-Robots-Tag and a meta robots tag are located when crawler bots discover a URL?
If that URL is blocked by robots.txt, then certain indexing and serving directives cannot be discovered and will not be followed.
If directives are to be followed, then the URLs containing them cannot be disallowed from crawling.
Checking For An X-Robots-Tag
There are a few different methods you can use to check for an X-Robots-Tag on a site.
The easiest way to check is to install a browser extension that will show you X-Robots-Tag information about a URL.
Screenshot of Robots Exclusion Checker, December 2022
Another plugin you can use to determine whether an X-Robots-Tag is being used is the Web Developer plugin.
By clicking the plugin in your browser and navigating to “View Response Headers,” you can see the various HTTP headers being used.
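If you’d rather skip the plugins, you can also inspect response headers from the command line; the URL below is a placeholder:

```shell
# -s silences progress output, -I fetches headers only;
# grep prints the X-Robots-Tag header if one is present
curl -sI https://example.com/document.pdf | grep -i "x-robots-tag"
```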
Another method that can be used for scaling, in order to pinpoint issues on websites with millions of pages, is Screaming Frog.
After running a site through Screaming Frog, you can navigate to the “X-Robots-Tag” column.
This will show you which sections of the site are using the tag, along with which specific directives.
Screenshot of Screaming Frog report, X-Robots-Tag, December 2022
Using X-Robots-Tags On Your Website
Understanding and controlling how search engines interact with your website is the cornerstone of search engine optimization. And the X-Robots-Tag is a powerful tool you can use to do exactly that.
Just be aware: It’s not without its risks. It is very easy to make a mistake and deindex your entire site.
That said, if you’re reading this piece, you’re probably not an SEO beginner.
So long as you use it wisely, take your time, and check your work, you’ll find the X-Robots-Tag to be a useful addition to your arsenal.