When someone launches a website, one of the first goals is to make sure search engines like Google can find and index the pages correctly. However, not every page on a website needs to appear in search results. This is where the Robots Txt File in WordPress becomes important. It helps website owners guide search engine bots on which pages they should crawl and which pages they should ignore.
Think of it as a simple instruction file that communicates with search engine crawlers. Without proper instructions, bots may crawl unnecessary pages such as admin sections, duplicate pages, or plugin directories. This can waste the website’s crawl budget and affect SEO performance.
For WordPress users, understanding the Robots Txt File in WordPress is essential for better search engine visibility, faster indexing, and improved website structure. Whether you run a blog, business site, or ecommerce store, managing this file correctly can help improve your SEO strategy.
In this guide, we will explain what a robots.txt file is, why it matters for SEO, and how you can easily add or edit it in WordPress.
What is a Robots.txt File?

A robots.txt file is a small text file that tells search engine bots how they should crawl a website. When search engines like Google visit a website, their bots first check the robots.txt file to see which pages they are allowed to access and which pages they should ignore.
This file helps website owners control how search engines interact with their site. For example, it can block bots from crawling admin pages, login pages, or other sections that are not useful for search results.
The robots.txt file is placed in the root directory of a website. It uses simple rules, such as Allow and Disallow, to guide search engine crawlers. Using this file properly helps improve website SEO and ensures search engines focus on the most important pages.
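For example, a minimal robots.txt might look like this sketch (the /private/ path and sitemap URL are illustrative, not defaults):

```text
User-agent: *
Disallow: /private/
Allow: /

Sitemap: https://example.com/sitemap.xml
```

The User-agent line says which crawlers the rules apply to (* means all of them), Disallow blocks a path prefix, and Allow permits one.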
How to Add Robots Txt File in WordPress?
Adding a Robots Txt File in WordPress is an important step for managing how search engines crawl your website. It helps you control which pages search engine bots can access and which sections they should ignore. By setting up the Robots Txt File in WordPress, you can improve crawl efficiency, protect private pages, and support better SEO performance. You can also use SEO auditing tools to analyze how search engines interact with your site and ensure your robots.txt configuration is working effectively.
The good news is that WordPress allows you to create or edit this file using different methods such as plugins, FTP access, or hosting tools. Below are the main steps and methods you can follow to add and manage the robots.txt file in your WordPress website.
Method 1: Editing Robots.txt File Using a Plugin
One of the easiest ways to manage the robots.txt file is by using a WordPress plugin. This method is especially helpful for beginners because it allows you to edit the file directly from the WordPress dashboard without accessing server files. Plugins such as WPCode or Virtual Robots.txt provide a simple interface where you can add, edit, or remove rules for search engine crawlers.

To start, I install the WPCode plugin from the WordPress Add Plugins section and activate it. After activation, I open the plugin settings from the dashboard, navigate to the Code Snippets section, and open the File Editor option.
Here, I can locate the robots.txt editor where I can modify the file and add new crawling rules. Once the changes are added, I save them and test the file to ensure everything works correctly.
If I use the Virtual Robots.txt plugin, I install and activate it in the same way. Then I go to the plugin settings, where I can view the default robots.txt rules. From there, I can modify the instructions or add new directives depending on how I want search engines to crawl the website. This approach is simple, quick, and ideal for users who prefer not to work with server files or technical tools.
Method 2: Creating and Uploading Robots.txt File Using FTP

Another effective method is creating the robots.txt file manually and uploading it to the website using FTP access. This approach gives more control over the file because I can create and edit it directly using a text editor before uploading it to the server.
To begin, I open a basic text editor such as Notepad or any similar editor. In this file, I write the rules that I want search engine bots to follow. These rules may include allowing search engines to crawl important pages while blocking directories like the admin area or login pages.
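As a sketch, the rules written in the text editor for a typical WordPress site might look like this (the Allow line keeps admin-ajax.php reachable, which some themes and plugins rely on; adjust the paths to your own site):

```text
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-login.php
Allow: /wp-admin/admin-ajax.php
```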
After writing the instructions, I save the file with the name robots.txt. It is important to ensure the file name is correct because search engines specifically look for this exact file in the root directory of the website. This step is also important during WordPress website content migration, as maintaining proper crawl instructions helps search engines index your new site structure correctly.
Next, I connect to my website using an FTP client such as FileZilla. After logging in with my hosting credentials, I navigate to the root folder of the website, which is usually called public_html.
Inside this folder, I upload the robots.txt file that I created earlier. Once the file is uploaded, it becomes active, and search engine bots will start following the rules defined in it. This method is useful when I want full control over the file structure and prefer managing website files manually.
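Before relying on the uploaded rules, you can sanity-check them offline with Python's built-in urllib.robotparser module. The rules and URLs below are illustrative; note that Python's parser honors the first matching rule, so the narrower Allow line is listed before the broader Disallow it overrides:

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt rules; the Allow line comes first so it
# takes precedence over the broader Disallow in Python's parser.
rules = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Check which URLs a generic crawler may fetch
print(parser.can_fetch("*", "https://example.com/wp-admin/"))                # False
print(parser.can_fetch("*", "https://example.com/wp-admin/admin-ajax.php"))  # True
print(parser.can_fetch("*", "https://example.com/blog/my-post/"))            # True
```

This checks the rules without touching the live site; to test the deployed file instead, you could point set_url at your domain's /robots.txt and call read().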
Method 3: Create Robots Txt File in WordPress Using Cloudways

If your website is hosted on Cloudways, you can easily create and manage the Robots Txt File in WordPress through server access. This method allows you to directly add the file inside your website directory so search engine bots can follow the crawling instructions properly.
First, I log in to my Cloudways hosting dashboard. From the top navigation bar, I open the Servers section and select the server where my WordPress website is hosted. Inside the Server Management area, I open Master Credentials to get the SSH and SFTP login details. These credentials allow me to access the website files securely.
Next, I open an FTP client such as FileZilla and connect to the server using the SFTP credentials provided in Cloudways. After the connection is established, I navigate to the applications folder where all the hosted applications are stored.
To locate the correct application folder, I go back to the Cloudways platform and open the Applications section. Then I select my WordPress website and check the Application Settings under the General tab to find the folder name assigned to that application.
Once I know the folder name, I return to FileZilla and open the path /applications/application-folder-name/public_html. This is the root directory of the WordPress site. Inside this folder, I create a new text file and name it robots.txt.
After creating the file, I open it with a text editor and add the rules that guide search engine crawlers. These rules help control how bots interact with the website. Once the changes are saved, the Robots Txt File in WordPress becomes active, and search engines will start following the instructions defined in the file.
Pages You Should Block in Robots.txt and Why
Not every page on a website is meant to appear in search engine results. Using robots.txt, you can block certain pages so search engines focus only on your important content.
Admin Pages: Protect Website Management Areas
Admin pages like /wp-admin/ are used for managing the website and should not be crawled by search engine bots. These pages contain backend controls and provide no useful information for users searching on Google. Blocking them reduces unnecessary crawling, though keep in mind that robots.txt is a crawling guideline, not a security control, so admin areas still need proper authentication.
Login Pages: Avoid Indexing Private Access Points
Login pages such as /wp-login.php are meant only for website administrators or registered users. Search engines do not need to crawl these pages because they provide no valuable content for search results. Blocking them keeps the website structure organized.
Internal Search Pages: Prevent Duplicate Content
Internal search result pages often generate multiple URLs with similar content. If search engines crawl these pages, it may create duplicate content issues. Blocking them helps maintain better SEO and ensures that search engines focus on original content.
Plugin and System Files: Reduce Unnecessary Crawling
WordPress plugins and system files contain technical scripts that are required for website functionality but not for search engine results. Blocking these directories prevents search engine bots from wasting time crawling files that do not add value to users.
Thank You or Checkout Pages: Keep Transaction Pages Private
Pages like checkout confirmation or thank-you pages are part of the purchasing process. These pages should not appear in search results because they are meant only for users who complete a transaction. Blocking them helps maintain a better user experience.
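Putting these categories together, a robots.txt covering them might look like the following sketch. The /checkout/ and /thank-you/ paths depend on your store setup, ?s= is WordPress's default internal-search parameter, and the Allow line preserves AJAX functionality:

```text
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-login.php
Disallow: /?s=
Disallow: /search/
Disallow: /checkout/
Disallow: /thank-you/
Allow: /wp-admin/admin-ajax.php
```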
Blocking unnecessary pages through robots.txt is a smart step in maintaining a well-structured website. It helps search engines crawl important pages more efficiently while keeping private or irrelevant sections hidden from search results.
Conclusion
Understanding and managing the Robots Txt File in WordPress is an important part of technical SEO and website optimization. This small text file helps search engines understand which pages of your website should be crawled and which sections should be ignored. By properly configuring it, you can guide search engine bots to focus on important pages such as blog posts, product pages, and landing pages.
For WordPress websites, adding or editing the Robots Txt File can be done easily using plugins, FTP access, or hosting platforms like Cloudways. Each method gives website owners control over how search engines interact with their site. When used correctly, this file helps improve website indexing, supports better search engine visibility, and strengthens your overall SEO strategy, website ranking, and search engine performance.