Everybody is concerned about the privacy and security of their data online, so it is worth spending some time on making your web presence more controlled. Many people put a website or blog live on the internet to publish content about a particular niche. They want a specific group of readers to visit their blog, but they don't necessarily want every page of their site stored in Google's index.
So, in this blog post we will discuss how to create and optimize your robots.txt file for SEO. This file tells Google and other search engines which parts of your site they may index.
To be clear, robots.txt does not encrypt anything; it simply tells well-behaved crawlers which parts of your site they should and should not crawl.
Where is the Robots.txt file? How to Create a Robots.txt file?
First of all, let's talk about where the file is placed on a website.
The robots.txt file always lives at the root of your domain:
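For example, assuming your site is served at www.example.com (a placeholder domain), the file would be available at:

```
https://www.example.com/robots.txt
```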
The format of a robots.txt file is quite simple to understand. The first line usually names a user agent, which is the name of the search bot you are addressing, for example Googlebot or Bingbot. You can use an asterisk (*) to address all bots at once.
The next line follows with Allow or Disallow instructions for search engines, so they know which parts you want them to index, and which ones you don’t want indexed.
In this sample from www.kamranmohsin.com, we have disallowed bots from indexing our WordPress wp-admin directory. In the next lines we have instructed all bots that the only file they may fetch inside that directory is admin-ajax.php.
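The sample described above follows the typical WordPress pattern; a sketch of such a file (the exact contents on the site may differ) looks like this:

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```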
It's also better to disallow the WordPress plugins directory. This strengthens your site's security: if someone is scanning indexed pages for a specific vulnerable plugin to exploit in a mass attack, they won't be able to discover your plugins that way.
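If you want to sanity-check rules like these before deploying them, Python's standard library ships a robots.txt parser. The rules and domain below are illustrative placeholders, not the actual file from the site. Note that Python's parser applies the first matching rule, so the more specific Allow line is listed before the Disallow:

```python
from urllib import robotparser

# Illustrative WordPress-style rules; example.com is a placeholder domain.
RULES = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
Disallow: /wp-content/plugins/
"""

rp = robotparser.RobotFileParser()
rp.parse(RULES.splitlines())

# The admin area and plugins directory are blocked...
print(rp.can_fetch("Googlebot", "https://www.example.com/wp-admin/"))              # False
# ...except for the explicitly allowed AJAX endpoint...
print(rp.can_fetch("Googlebot", "https://www.example.com/wp-admin/admin-ajax.php"))  # True
# ...while ordinary content remains crawlable.
print(rp.can_fetch("Googlebot", "https://www.example.com/blog/some-post/"))        # True
```

Keep in mind that Google itself resolves conflicts by applying the most specific rule rather than the first match, so ordering the Allow line first keeps both interpretations consistent.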
Do I Really Need a Robots.txt File?
It's highly recommended to create a robots.txt file for your site. If you want to submit your site's XML sitemap to search engines, then this is where they will look for it, unless you have specified it in Google Webmaster Tools. The absence of a robots.txt file will not stop search engines from crawling and indexing your website, but it's better to have one.
What is XML Sitemap?
A sitemap is a list of pages on a website that is accessible to all users. An XML sitemap is a way for website owners to tell search engines about all the pages that exist on their website. A WordPress XML sitemap also tells search engines which links on your website are more important than others and how frequently you update your website. While sitemaps do not boost your search rankings, they allow search engines to crawl your website more effectively.
To check the XML sitemap on your site, visit a URL like the following:
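For example, assuming the placeholder domain www.example.com and a plugin such as Yoast SEO, the sitemap index is typically reachable at:

```
https://www.example.com/sitemap_index.xml
```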
Adding Your XML Sitemap to Robots.txt File
If you are using Yoast's WordPress SEO plugin or some other plugin to generate your XML sitemap, then the plugin will try to automatically add the sitemap-related lines to your robots.txt file.
However, if it fails, the plugin will show you the link to your XML sitemap, which you can add to your robots.txt file manually, like this:
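A minimal sketch, assuming your sitemap lives at the placeholder URL below; the Sitemap directive goes on its own line anywhere in robots.txt:

```
Sitemap: https://www.example.com/sitemap_index.xml
```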
We are using the following sitemap on our site.