Virtual Robots.txt

Description

Virtual Robots.txt is an easy (i.e. automated) solution for creating and managing a robots.txt file for your site. Instead of mucking about with FTP, files, permissions, etc., just upload and activate the plugin and you’re done.

By default, the Virtual Robots.txt plugin allows access to the parts of WordPress that good bots such as Googlebot need to access; other parts are blocked.

If the plugin detects an existing XML sitemap file, a reference to it will be automatically added to your robots.txt file.
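For illustration, the generated file typically looks something like the example below. The exact directives depend on the plugin’s current defaults and any edits you make, and example.com is a placeholder for your own domain.

    User-agent: *
    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php

    Sitemap: https://example.com/sitemap.xml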

Installation

  1. Upload the pc-robotstxt folder to the /wp-content/plugins/ directory.
  2. Activate the plugin through the Plugins menu in the WordPress admin.
  3. Once the plugin is installed and activated, you’ll see a new Robots.txt link under the Settings menu. Click it to open the plugin settings page, where you can edit the contents of your robots.txt file.

FAQ

Will it conflict with an existing robots.txt file?

If a physical robots.txt file exists on your site, WordPress won’t process any request for one, so there will be no conflict.
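If you want to confirm which version is actually being served, request the file directly, for example (example.com is a placeholder for your own domain):

    curl -s https://example.com/robots.txt

If the output matches what you saved on the plugin settings page, the virtual file is the one being served; if it matches a file sitting in your site root, the physical file is being served instead.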

Will this work for sub-folder installations of WordPress?

Out of the box, no. Because WordPress is in a sub-folder, it won’t "know" when someone requests the robots.txt file, which must be at the root of the site.
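If your site runs Apache with mod_rewrite, one common workaround (independent of this plugin, and only a sketch under those assumptions; the /wordpress/ sub-folder is a placeholder for your actual install path) is to rewrite the root robots.txt request to the WordPress sub-folder in the site root’s .htaccess:

    RewriteEngine On
    RewriteRule ^robots\.txt$ /wordpress/robots.txt [L]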

Does this plugin modify individual posts, pages, or categories?

No, it doesn’t.

Why does the default plugin block certain files and folders?

By default, the virtual robots.txt is set to block WordPress files and folders that don’t need to be accessed by search engines. Of course, if you disagree with the defaults, you can easily change them.

Reviews

September 19, 2023 (4 replies)
I had an issue that I believed was being caused by the plugin but the plugin author was very quick to help and clear up the issue. Great and helpful customer service – thanks!
November 23, 2018
Works great and easy to use and customise. It already set by default the directories that need to be left out of Search Engines scanning/indexing… Very happy with it!
March 18, 2018 (1 reply)
What I saw wasn’t what I got. The XML sitemap wasn’t included in the robots.txt file, even though this was described as a feature that should work out of the box. In addition to that, upon installing this plugin, it blocked certain directories without asking. Lastly, it inserts a line at the top of the file, promoting the plugin. That should be an optional feature that users are empowered to turn off. All in all, it offers the functionality, but falls short and disappoints in other areas.
December 29, 2016
I thought this would be simple. Sure sounds simple. But after I saved your suggested text to my brand new "virtual robots.txt", I clicked the link where it says "You can preview your robots.txt file here (opens a new window). If your robots.txt file doesn’t match what is shown below, you may have a physical file that is being displayed instead." That new window shows text that is indeed different from the plugin’s. So I understand that to mean there’s a physical robots.txt file on my server. So which one is actually going to be used? Your FAQ offers this: Q: Will it conflict with any existing robots.txt file? A: If a physical robots.txt file exists on your site, WordPress won’t process any request for one, so there will be no conflict. If a physical file exists, WP won’t process ANY request for one? This SOUNDS like WP will ignore BOTH the physical file AND your virtual one. In which case, what’s the point? Might as well not have one, it seems to me. When I manually go to mydomain.com/robots.txt, I see what’s in the physical file, not what the plugin saved. So… is it working? I don’t know! Should I delete the physical file and assume the virtual one will work? I don’t know! Should I delete this plugin and edit the physical file manually? Most likely. 2 stars instead of 1 because I appreciate getting the suggested lines to include in my file.
Read all 9 reviews

Contributors & Developers

“Virtual Robots.txt” is open source software. The following people have contributed to this plugin.

Contributors

“Virtual Robots.txt” has been translated into 1 locale. Thank you to the translators for their contributions.

Translate “Virtual Robots.txt” into your language.

Interested in development?

Browse the code, check out the SVN repository, or subscribe to the development log by RSS.

Changelog

1.10

  • Fix to prevent the saving of HTML tags within the robots.txt form field. Thanks to TrustWave for identifying this issue.

1.9

  • Fix for PHP 7. Thanks to SharmPRO.

1.8

  • Undoing last fixes as they had unintended side-effects.

1.7

  • Further fixes to issue with newlines being removed. Thanks to FAMC for reporting and for providing the code fix.
  • After upgrading, visit and re-save your settings and confirm they look correct.

1.6

  • Fixed bug where newlines were being removed. Thanks to FAMC for reporting.

1.5

  • Fixed bug where plugin assumed robots.txt would be at http when it may reside at https. Thanks to jeffmcneill for reporting.

1.4

  • Fixed bug for link to robots.txt that didn’t adjust for sub-folder installations of WordPress.
  • Updated default robots.txt directives to match latest practices for WordPress.
  • Plugin development and support transferred to Marios Alexandrou.

1.3

  • Now uses the do_robots hook and checks is_robots() in the plugin action (see the sketch below).
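For context, the general shape of this approach in WordPress looks roughly like the minimal sketch below, using the robots_txt filter that runs inside do_robots(). This is not the plugin’s actual code, and my_virtual_robots is a hypothetical option name used only for illustration.

    // Minimal sketch, not the plugin's actual code: override the robots.txt
    // output WordPress generates, only on robots.txt requests.
    add_filter( 'robots_txt', function ( $output, $public ) {
        if ( is_robots() ) {
            $saved = get_option( 'my_virtual_robots' ); // hypothetical option name
            if ( $saved ) {
                $output = $saved; // serve the saved virtual robots.txt content
            }
        }
        return $output;
    }, 10, 2 );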

1.2

  • Added support for existing sitemap.xml.gz file.

1.1

  • Added link to settings page, option to delete settings.

1.0

  • Initial release.