Optimizing Robots.txt For A WordPress Blog

Most seasoned bloggers know what robots.txt is and why they need this file. Yet few authors rush to create a robots.txt file right after installing a WordPress blog.

Robots.txt is a text file uploaded to the root directory of your site that contains instructions for search engine crawlers. Its main purpose is to prohibit the indexing of individual pages and sections of the site. However, robots.txt can also be used to specify the preferred domain mirror, point crawlers to your sitemap, and so on.
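
For example, a minimal sketch of such a file might look like this (example.com and the sitemap path are placeholders for your own values; Host is a Yandex-specific directive for the preferred mirror):

User-agent: *
Disallow: /wp-admin
Sitemap: https://example.com/sitemap.xml
Host: example.com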

Most modern search engines handle popular CMSs well and usually do not try to index content that is not meant for them. For example, Google will not index your WordPress admin area even if you do not block it explicitly in robots.txt. Still, in some cases explicit bans are useful, above all for blocking duplicate content.

Some webmasters go as far as blocking the indexing of category and tag pages, since their content partially duplicates the front page. Most, however, limit themselves to blocking trackback and feed pages, which fully duplicate article content and are not meant for search engines at all. This precaution not only keeps the site's search results "cleaner", but may also protect you from search filters, especially after the introduction of the Google Panda algorithm.
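
To see why this matters, consider a typical WordPress blog (the URLs here are placeholders): an article published at https://example.com/my-post/ is usually also served at https://example.com/my-post/feed/ and https://example.com/my-post/trackback/ with essentially the same content. These are exactly the pages the wildcard rules in the file below are meant to block.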

Here are the recommended directives for a robots.txt file (they will work for almost any WordPress blog):

User-agent: *
Disallow: /wp-login.php
Disallow: /wp-register.php
Disallow: /xmlrpc.php
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /wp-content/plugins
Disallow: /wp-content/cache
Disallow: /wp-content/themes
Disallow: /trackback/
Disallow: /feed/
Disallow: */trackback/
Disallow: */feed/

Note that in this robots.txt the administrative folders wp-admin and wp-includes are completely blocked from indexing. The wp-content folder is only partially blocked, because it contains the uploads directory, which holds all of your blog's images and should remain indexable.
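
If you prefer to make this explicit, you can add an Allow rule for the uploads directory. This is an optional sketch rather than part of the recommended set above (the Allow directive is understood by Google and Yandex):

Allow: /wp-content/uploads/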

All you need to do is copy the directives above (each directive must be on its own line), save them to a text file named robots.txt, and upload it to the root directory of your site.

You can always check whether robots.txt is working correctly through the Google Webmaster Tools and Yandex Webmaster interfaces.
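
If you want a quick local check before uploading, here is a minimal sketch using Python's standard library (the example.com URLs are placeholders; note that urllib.robotparser uses simple prefix matching, so wildcard rules such as */feed/ may not be evaluated exactly the way Google evaluates them):

# Minimal local check of robots.txt rules with Python's standard library.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://example.com/robots.txt")  # placeholder domain
parser.read()  # fetch and parse the live file

# Article pages should be allowed, the admin area should not.
print(parser.can_fetch("*", "https://example.com/my-first-post/"))  # expected: True
print(parser.can_fetch("*", "https://example.com/wp-admin/"))       # expected: False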
