If you’re managing a WordPress website, chances are you’ve heard about the robots.txt file. It may seem like a small technical detail, but it can have a huge impact on how search engines interact with your site. Done right, it helps search engines focus on your best content. Done wrong, it can block search engines from crawling your entire site.
In this guide, you’ll learn everything you need to know about WordPress robots.txt: what it is, where to find it, how to edit it with or without a plugin, how to delete it, and—most importantly—what to actually include in it.
What is robots.txt?
The robots.txt file is a plain-text file located at the root of your website (yourdomain.com/robots.txt). Its job is to tell search engine bots which parts of your site they’re allowed to crawl and which ones they should avoid. Note that it controls crawling, not indexing: a page blocked in robots.txt can still show up in search results if other sites link to it.
For example:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
This is a simple, common setup that tells all bots to stay out of your WordPress admin area, but still allows access to a file needed by plugins and themes.
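If you want to see how a crawler interprets these rules, you can run them through Python’s standard-library urllib.robotparser. One caveat for this sketch: the stdlib parser applies rules in file order (first match wins), so the Allow line is placed before the Disallow line below; major crawlers like Googlebot instead use the most specific (longest) matching rule, so the order in your actual file doesn’t matter to them. example.com and the URLs are placeholders.

```python
from urllib.robotparser import RobotFileParser

# Allow is listed first because Python's parser is first-match-wins;
# Googlebot uses longest-match, so order is irrelevant in a real file.
rules = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

print(parser.can_fetch("*", "https://example.com/wp-admin/"))                # False (blocked)
print(parser.can_fetch("*", "https://example.com/wp-admin/admin-ajax.php"))  # True (allowed)
print(parser.can_fetch("*", "https://example.com/blog/my-post/"))            # True (allowed)
```

This is handy for sanity-checking a rule set offline before you upload it.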
Why is the robots.txt File Important?
If you care about SEO, user privacy, or server performance, robots.txt matters. Here’s why:
- Crawl Budget Control: Search engines allocate each site a limited crawl budget. If bots waste it on low-value pages, important content might be crawled less often.
- Privacy: Some pages aren’t meant for the public (like login or admin areas).
- Duplicate Content: You might want to prevent bots from crawling search results pages or filtered product listings that create duplicate content.
- Site Performance: Reducing unnecessary crawling can lessen server load, especially on high-traffic sites.
Where is the robots.txt File in WordPress?
By default, WordPress creates a virtual robots.txt file. It’s generated dynamically and isn’t a physical file you can edit in your site folder unless you manually create one.
To check if your site has a virtual robots.txt file, go to:
https://yourdomain.com/robots.txt
If you see something like this:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
…then you’re seeing WordPress’s default robots.txt content.
How to Access robots.txt in WordPress
There are multiple ways to access and edit your robots.txt, depending on whether or not you’re using an SEO plugin.
1. Through Your Web Browser
Just go to yourdomain.com/robots.txt. This shows you what search engines are currently seeing.
2. Through an SEO Plugin (like Yoast or All in One SEO)
If you’re using Yoast SEO:
- Go to your WordPress dashboard.
- Navigate to SEO > Tools > File Editor.
- You’ll see the robots.txt file there, if you’ve created one.
If you’re using All in One SEO:
- Go to All in One SEO > Tools > Robots.txt Editor.
- You can enable and edit a custom robots.txt file directly in your dashboard.
3. Via FTP or File Manager
If you prefer to handle files manually:
- Connect to your site using FTP or your hosting control panel’s File Manager.
- Navigate to your root directory (usually called public_html).
- If there’s already a robots.txt file, download and edit it using a text editor.
- If not, create a new text file named robots.txt, and upload it.
How to Create and Edit robots.txt in WordPress (Without a Plugin)
You can manage your robots.txt file even if you don’t use a plugin. Here’s how:
- Open a simple text editor (Notepad, TextEdit, VS Code, etc.).
- Add your rules. For example:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://yourdomain.com/sitemap.xml
- Save the file as robots.txt.
- Use FTP or File Manager to upload it to your site’s root directory.
Once uploaded, visit yourdomain.com/robots.txt to verify it’s live.
How to Update or Change robots.txt in WordPress
Once the file is in place, updating it is easy. Either:
- Use your SEO plugin to change it through the WordPress dashboard.
- Or, manually edit the file on your server using FTP or your hosting panel.
Make sure to save your changes and test the file with a robots.txt testing tool, such as the robots.txt report in Google Search Console.
How to Delete robots.txt in WordPress
If you want to remove the file entirely:
- Access your root directory via FTP or File Manager.
- Locate the robots.txt file.
- Delete it.
WordPress will then revert to using the default virtual file. Just note that any custom rules will be lost.
How to Unblock Content in robots.txt
Sometimes, people accidentally block entire sections of their site. For example, this line would block everything:
Disallow: /
To unblock content:
- Open the file.
- Remove or adjust restrictive rules.
- Use Disallow: with nothing after it to allow all content:
User-agent: *
Disallow:
Also check that your WordPress Settings > Reading doesn’t have “Discourage search engines from indexing this site” checked.
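To confirm that an empty Disallow really opens up the whole site, you can run it through Python’s standard-library urllib.robotparser (example.com and the URL are placeholders):

```python
from urllib.robotparser import RobotFileParser

# An empty Disallow value means "nothing is disallowed".
parser = RobotFileParser()
parser.parse(["User-agent: *", "Disallow:"])

print(parser.can_fetch("*", "https://example.com/any/page/"))  # True
```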
What to Include in Your WordPress robots.txt File
Here’s a well-rounded, beginner-friendly robots.txt configuration for a typical WordPress site:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Disallow: /cgi-bin/
Disallow: /wp-login.php
Sitemap: https://yourdomain.com/sitemap.xml
For Blogs
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-login.php
Allow: /wp-admin/admin-ajax.php
Sitemap: https://yourdomain.com/sitemap_index.xml
For WooCommerce Stores
User-agent: *
Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://yourdomain.com/sitemap.xml
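As a quick sanity check, the store rules above can be fed to Python’s standard-library urllib.robotparser. The paths shown are the common WooCommerce defaults, and example.com plus the product URL are placeholders; adjust them to your store’s permalinks.

```python
from urllib.robotparser import RobotFileParser

# Common WooCommerce defaults; adjust the paths to your store's permalinks.
rules = [
    "User-agent: *",
    "Disallow: /cart/",
    "Disallow: /checkout/",
    "Disallow: /my-account/",
]
parser = RobotFileParser()
parser.parse(rules)

print(parser.can_fetch("*", "https://example.com/checkout/"))         # False (blocked)
print(parser.can_fetch("*", "https://example.com/product/t-shirt/"))  # True (crawlable)
```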
For Membership Sites or Courses
User-agent: *
Disallow: /login/
Disallow: /register/
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://yourdomain.com/sitemap.xml
Advanced Tips
If your site has many filters, search results pages, or query strings, use Disallow rules to prevent those from being crawled.
Example:
Disallow: /?s=
Disallow: /search/
Disallow: /*?filter=
Keep in mind that wildcard patterns (*) are understood by major crawlers like Googlebot and Bingbot, but not by every bot.
If you want to target a specific bot, like Bingbot or Googlebot, you can write rules just for them:
User-agent: Googlebot
Disallow: /private-area/
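You can verify that a per-bot rule applies only to the named crawler using Python’s standard-library urllib.robotparser (example.com and /private-area/ are placeholders):

```python
from urllib.robotparser import RobotFileParser

# Googlebot gets its own group; every other bot falls through to the * group.
rules = [
    "User-agent: Googlebot",
    "Disallow: /private-area/",
    "",
    "User-agent: *",
    "Disallow:",
]
parser = RobotFileParser()
parser.parse(rules)

print(parser.can_fetch("Googlebot", "https://example.com/private-area/"))  # False
print(parser.can_fetch("Bingbot", "https://example.com/private-area/"))    # True
```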
Common Mistakes to Avoid
- Blocking CSS and JavaScript – Modern search engines need to render pages. Don’t block /wp-content/ unless you have a good reason.
- Blocking the Entire Site – A single line like Disallow: / can remove your entire site from search results.
- Ignoring Sitemap – Always add your sitemap URL to help bots find your pages.
- Overcomplicating – Keep rules simple unless you have a large, complex site.
- Forgetting to Test – Always check your rules with a testing tool, such as the robots.txt report in Google Search Console, to make sure they’re working as expected.
Wrapping It Up
The robots.txt file might be small, but it plays a big role in how search engines treat your WordPress website. Whether you’re trying to improve SEO, block unwanted crawlers, or simply guide search engines through your site, knowing how to access, edit, and optimize this file is essential.
To recap, you now know:
- Where the robots.txt file is in WordPress
- How to access and edit it (with or without plugins)
- How to update, remove, or unblock it
- What to include for blogs, eCommerce sites, and more
Take a few minutes today to review your site’s robots.txt. A few tweaks might just make a noticeable difference in your crawl rate and visibility.