Recently one of our readers asked us how to optimize the robots.txt file to improve SEO.
The robots.txt file tells search engines how to crawl your website, which makes it an incredibly powerful SEO tool.
In this article, we will show you some tips on how to create a perfect robots.txt file for SEO.

What Is a Robots.txt File?
Robots.txt is a text file that website owners can create to tell search engine bots how to crawl and index pages on their sites.
It is typically stored in the root directory (also known as the main folder) of your website. The basic format for a robots.txt file looks like this:
User-agent: [user-agent name]
Disallow: [URL string not to be crawled]
User-agent: [user-agent name]
Allow: [URL string to be crawled]
Sitemap: [URL of your XML Sitemap]
You can have multiple lines of instructions to allow or disallow specific URLs and add multiple sitemaps. If you do not disallow a URL, then search engine bots assume that they are allowed to crawl it.
Here is what a robots.txt example file can look like:
User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
Sitemap: https://example.com/sitemap_index.xml
In the above robots.txt example, we have allowed search engines to crawl and index files in our WordPress uploads folder.
After that, we have disallowed search bots from crawling and indexing plugins and WordPress admin folders.
Lastly, we have provided the URL of our XML sitemap.
Do You Need a Robots.txt File for Your WordPress Site?
If you don’t have a robots.txt file, then search engines will still crawl and index your website. However, you will not be able to tell them which pages or folders they should not crawl.
This won’t have much impact when you first start a blog and don’t have a lot of content.
However, as your website grows and you add more content, then you will likely want better control over how your website is crawled and indexed.
Here is why.
Search bots have a crawl quota for each website.
This means that they crawl a certain number of pages during a crawl session. If they don’t finish crawling all pages on your site, then they will come back and resume crawling in the next session.
This can slow down your website indexing rate.
You can fix this by disallowing search bots from attempting to crawl unnecessary pages like your WordPress admin pages, plugin files, and themes folder.
By disallowing unnecessary pages, you save your crawl quota. This helps search engines crawl even more pages on your site and index them as quickly as possible.
Another good reason to use a robots.txt file is when you want to stop search engines from crawling a post or page on your website.
This is not the safest way to hide content from the general public, since a blocked page can still be indexed if other sites link to it, but it will help you prevent content from appearing in search results.
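For example, to stop well-behaved bots from crawling a single page, you can add a Disallow rule with that page's path (the path below is a hypothetical illustration):

```text
User-agent: *
Disallow: /private-page/
```

Keep in mind that robots.txt only asks crawlers to stay away; anyone can still open the page directly in a browser.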
What Does an Ideal Robots.txt File Look Like?
Many popular blogs use a very simple robots.txt file. Its contents may vary depending on the needs of the specific site:
User-agent: *
Disallow:
Sitemap: http://www.example.com/post-sitemap.xml
Sitemap: http://www.example.com/page-sitemap.xml
This robots.txt file allows all bots to index all content and provides them with a link to the website’s XML sitemaps.
For WordPress sites, we recommend the following rules in the robots.txt file:
User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-admin/
Disallow: /readme.html
Disallow: /refer/
Sitemap: http://www.example.com/post-sitemap.xml
Sitemap: http://www.example.com/page-sitemap.xml
This tells search bots to crawl all WordPress images and uploaded files. It disallows search bots from crawling the WordPress admin area, the readme file, and cloaked affiliate links.
By adding sitemaps to the robots.txt file, you make it easy for Google bots to find all the pages on your site.
Now that you know what an ideal robots.txt file looks like, let’s take a look at how you can create a robots.txt file in WordPress.
How to Create a Robots.txt File in WordPress
There are two ways to create a robots.txt file in WordPress. You can choose the method that works best for you.
Method 1: Editing Robots.txt File Using All in One SEO
All in One SEO, also known as AIOSEO, is the best WordPress SEO plugin on the market, used by over 3 million websites.
It’s easy to use and comes with a robots.txt file generator.
If you don’t already have the AIOSEO plugin installed, you can see our step-by-step guide on how to install a WordPress plugin.
Note: A free version of AIOSEO is also available and has this feature.
Once the plugin is installed and activated, you can use it to create and edit your robots.txt file directly from your WordPress admin area.
Simply go to All in One SEO » Tools to edit your robots.txt file.

First, you’ll need to turn on the editing option by clicking the ‘Enable Custom Robots.txt’ toggle to blue.
With this toggle on, you can create a custom robots.txt file in WordPress.

All in One SEO will show your existing robots.txt file in the ‘Robots.txt Preview’ section at the bottom of your screen.
This version will show the default rules that were added by WordPress.

These default rules tell the search engines not to crawl your core WordPress files, allow the bots to index all content, and provide them a link to your site’s XML sitemaps.
Now, you can add your own custom rules to improve your robots.txt for SEO.
To add a rule, enter a user agent in the ‘User Agent’ field. Using a * will apply the rule to all user agents.
Then, select whether you want to ‘Allow’ or ‘Disallow’ the search engines to crawl.
Next, enter the filename or directory path in the ‘Directory Path’ field.

The rule will automatically be applied to your robots.txt. To add another rule, just click the ‘Add Rule’ button.
We recommend adding rules until you create the ideal robots.txt format we shared above.
Your custom rules will look like this.

Once you are done, don’t forget to click on the ‘Save Changes’ button to store your changes.
Method 2: Editing the Robots.txt File Manually Using FTP
For this method, you will need to use an FTP client to edit the robots.txt file.
Simply connect to your WordPress website files using an FTP client.
Once inside, you will be able to see the robots.txt file in your website’s root folder.

If you don’t see one, then you likely don’t have a robots.txt file.
In that case, you can just go ahead and create one.

Robots.txt is a plain text file, which means you can download it to your computer and edit it using any plain text editor like Notepad or TextEdit.
After saving your changes, you can upload the robots.txt file back to your website’s root folder.
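As a rough command-line alternative, the same edit can be sketched with Python's built-in ftplib module. Everything below (the host, credentials, and example.com sitemap URLs) is placeholder data; this writes the recommended rules locally and defines a helper you could call with your real FTP login:

```python
from ftplib import FTP

# The recommended rules from this article (example.com URLs are placeholders)
RULES = """\
User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-admin/
Disallow: /readme.html
Disallow: /refer/
Sitemap: http://www.example.com/post-sitemap.xml
Sitemap: http://www.example.com/page-sitemap.xml
"""

def upload_robots_txt(host: str, user: str, password: str) -> None:
    """Upload the local robots.txt into the website's root folder over FTP."""
    with FTP(host) as ftp:
        ftp.login(user, password)
        with open("robots.txt", "rb") as f:
            ftp.storbinary("STOR robots.txt", f)

# Write the file locally; upload it afterwards with your own credentials,
# e.g. upload_robots_txt("ftp.example.com", "username", "password")
with open("robots.txt", "w") as f:
    f.write(RULES)
```

If you prefer a graphical FTP client, you can skip the helper function and simply upload the generated robots.txt by hand.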
How to Test Your Robots.txt File
Once you have created your robots.txt file, it’s always a good idea to test it using a robots.txt tester tool.
There are many robots.txt tester tools out there, but we recommend using the one inside Google Search Console.
First, you will need to have your website linked with Google Search Console. If you haven’t done this yet, see our guide on how to add your WordPress site to Google Search Console.
Then, you can use the Google Search Console Robots Testing Tool.

Simply select your property from the dropdown list.
The tool will automatically fetch your website's robots.txt file and highlight any errors and warnings it finds.
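If you'd rather check your rules locally before uploading, Python's built-in urllib.robotparser module can evaluate a robots.txt file for you. This sketch tests the rules recommended earlier in this article (example.com is a placeholder domain):

```python
from urllib.robotparser import RobotFileParser

# The recommended rules from this article, one directive per line
rules = [
    "User-agent: *",
    "Allow: /wp-content/uploads/",
    "Disallow: /wp-admin/",
    "Disallow: /readme.html",
    "Disallow: /refer/",
]

rp = RobotFileParser()
rp.parse(rules)

# Blocked paths return False; anything not matched defaults to True
print(rp.can_fetch("*", "https://example.com/wp-admin/"))                 # False
print(rp.can_fetch("*", "https://example.com/refer/some-link"))           # False
print(rp.can_fetch("*", "https://example.com/wp-content/uploads/a.jpg"))  # True
print(rp.can_fetch("*", "https://example.com/my-blog-post/"))             # True
```

To test a live site instead, call rp.set_url("https://example.com/robots.txt") followed by rp.read() before using can_fetch(). Note that parsers differ slightly: Google uses longest-match precedence between Allow and Disallow, while Python's parser applies rules in file order.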

Final Thoughts
The goal of optimizing your robots.txt file is to prevent search engines from crawling pages that are not meant for the public, such as the pages in your plugins folder or your WordPress admin folder.
A common myth among SEO experts is that blocking WordPress categories, tags, and archive pages will improve the crawl rate and result in faster indexing and higher rankings.
This is not true. It’s also against Google’s webmaster guidelines.
We recommend that you follow the above robots.txt format to create a robots.txt file for your website.
We hope this article helped you learn how to optimize your WordPress robots.txt file for SEO. You may also want to see our ultimate WordPress SEO guide and our expert picks for the best WordPress SEO tools to grow your website.
If you liked this article, then please subscribe to our YouTube Channel for WordPress video tutorials. You can also find us on Twitter and Facebook.
Stéphane says
Hi,
Thanks for that post, it becomes clearer how to use the robots.txt file. On most websites that you find while looking for some advice regarding the robots.txt file, you can see that the following folders are explicitly excluded from crawling (for WordPress):
Disallow: /wp-content/plugins
Disallow: /wp-content/cache
Disallow: /wp-content/themes
I don’t really understand the reasons to exclude those folders (is there actually one?). What would be your take on that matter?
WPBeginner Support says
It is mainly to prevent anything in those folders from showing up as a result when a user searches for your site. As that is not your content, it is not something most people would want to appear in the site’s search results.
zaid haris says
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
GSC shows a coverage error for “Disallow: /wp-admin/”. Is this wrong?
WPBeginner Support says
For most sites, you do not want anything from your wp-admin area to appear as a search result, so it is fine and expected to receive that coverage warning when you deny Google the ability to crawl your wp-admin.
Hansini says
I am creating my robots.txt manually as you instructed for my WordPress site.
I have one doubt. When I write User-Agent: *, won’t it allow other spamming robots to access my site?
Should I write User-Agent: * or User-Agent: Googlebot?
WPBeginner Support says
The User-Agent line sets the rules that all robots should follow on your site. If you specify a specific bot on that line, it would set rules for that specific bot and none of the others.
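For illustration, a robots.txt file can contain separate groups, so a named bot gets its own rules while everyone else follows the catch-all group (a hypothetical example):

```text
# Rules for Googlebot only
User-agent: Googlebot
Disallow: /wp-admin/

# Rules for every other bot
User-agent: *
Disallow: /
```

One caveat: spam bots typically ignore robots.txt entirely, so these rules only restrain well-behaved crawlers.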
Nishant says
What should we write to make Google index my post?
WPBeginner Support says
For having your site listed, you would want to take a look at our article below:
https://www.wpbeginner.com/beginners-guide/how-do-i-get-my-wordpress-site-listed-on-google-beginners-guide/
Sanjeev Pandey says
Should we also disallow /wp-content/themes/?
It appears in the search results when I run the command site:abcdef.com in Google search.
WPBeginner Support says
You would not need to worry about blocking your themes folder, and as you write SEO-friendly content, you should no longer see the themes folder as a search result.
Salem says
Hi, what do “Disallow: /readme.html” and “Disallow: /refer/” mean?
WPBeginner Support says
That means you’re telling search engines not to look at any referral links or the readme.html file.
sean says
Hi, what are the pros and cons of blocking wp-content/uploads?
Thank you
WPBeginner Support says
If you block your uploads folder then search engines would not normally crawl your uploaded content like images.
Piyush says
Thanks for solving my problem.
WPBeginner Support says
You’re welcome
Ravi kumar says
Sir, I am very confused about robots.txt. I have submitted my sitemap in Blogger many times, but after 3-4 days the same issue comes back. What exactly is robots.txt, and how do I submit it? Please guide me.
WPBeginner Support says
It would depend on your specific issue, you may want to take a look at our page below:
https://www.wpbeginner.com/glossary/robots-txt/
Prem says
If I noindex a URL or page using the robots.txt file, does Google show any error in Search Console?
WPBeginner Support says
No, Google will not list the page, but if the page was already listed, it will not show an error.
Bharat says
Hi
I have a question
I receive a Google Search Console coverage issue warning for “blocked by robots.txt”:
/wp-admin/widgets.php
My question is, can I allow wp-admin/widgets.php in robots.txt, and is this safe?
WPBeginner Support says
If you wanted to, you could, but that is not a file that Google needs to crawl.
Anthony says
Hi there, I’m wondering if you should allow: /wp-admin/admin-ajax.php?
WPBeginner Support says
Normally, yes you should.
Jaira says
May I know why you should allow /wp-admin/admin-ajax.php?
WPBeginner Support says
It is used by different themes and plugins to appear correctly for search engines.
Amila says
Hello! I really like this article, and as I’m a beginner with all this crawling stuff, I would like to ask something in this regard. Recently, Google crawled and indexed one of my websites in a really terrible way, showing pages in search results that are deleted from the website. The website didn’t have the “discourage search engines from indexing” option enabled in the WordPress settings at the beginning, but it did later, after Google showed even 3 more pages in the search results (those pages also don’t exist), and I really don’t understand how it could happen with that option on. So, can the Yoast method be helpful and provide a solution so that Google indexes my website the appropriate way this time? Thanks in advance!
WPBeginner Support says
The Yoast plugin should be able to assist in ensuring the pages you have are indexed properly. There is a chance that your pages were cached before you discouraged search engines from crawling your site.
Amila says
Well yes, and from all the pages, it cached the ones that don’t exist anymore. Anyway, as the current page has the “discourage” setting on, is it better to keep it like that for now or to uncheck the box and let Google crawl and index it again with Yoast’s help? Thanks! With your articles, everything became easier!
WPBeginner Support says
You would want to have Google recrawl your site once it is set up how you want.
Pradhuman Kumar says
Hi I loved the article, very precise and perfect.
Just a small suggestion: kindly update the robots.txt tester image, as Google Search Console has changed, and it would be awesome if you added the link to check robots.txt from Google.
WPBeginner Support says
Thank you for the feedback, we’ll be sure to look into updating the article as soon as we are able.
Kamaljeet Singh says
My blog’s robots.txt file was:
User-Agent: *
crawl-delay: 10
After reading this post, I have changed it to your recommended robots.txt file. Is it okay that I removed crawl-delay?
WPBeginner Support says
It should be fine. Crawl-delay tells search engines to slow down how quickly they crawl your site.
reena says
Very nicely described robots.txt, I am very happy.
You are a very good writer.
WPBeginner Support says
Thank you, glad you liked our article
JJ says
What is the Disallow: /refer/ page? I get a 404; is this a hidden WP file?
Editorial Staff says
We use /refer/ to redirect to various affiliate links on our website. We don’t want those to be indexed since they’re just redirects and not actual content.
Sagar Arakh says
Thank you for sharing. This was really helpful for me to understand robots.txt.
I have updated my robots.txt to the ideal one you suggested. I will wait for the results now.
WPBeginner Support says
You’re welcome, glad you’re willing to use our recommendations
Akash Gogoi says
Very helpful article. Thank you very much.
WPBeginner Support says
Glad our article was helpful
Zingylancer says
Thanks for sharing this useful information with us.
WPBeginner Support says
Glad we could share this information about the robots.txt file
Jasper says
Thanks for the updated information. Your article on the robots.txt file was good; it gave me new information. Thanks, and keep me updated with new ideas.
WPBeginner Support says
Glad our guide was helpful
Imran says
Thanks, I added robots.txt in WordPress. Very good article.
WPBeginner Support says
Thank you, glad our article was helpful