It says that "You should not use robots.txt as a means to hide your web pages from Google Search results." (from "Robots.txt Introduction and Guide", Google Search Central)

Maybe Google's advice is right? So if you know another way to prevent the page from showing in Google Search, I will be grateful.
Are you aware of the consequences? It means that your Google ranking can drop sharply, even to the point where your website is kicked out of the index for manipulation tactics. Generally speaking: when you hide something from Google that normal users can see, your website will be punished by Google.
If you want to do that, you could block Google from reading parts of your site by adding some .htaccess rules that treat Google as if it were a bad bot. For example, create an .htaccess file in the directory that Google should not be able to read and put this into it:
Code:
RewriteEngine On
RewriteBase /
# Match requests whose user agent contains "googlebot" (case-insensitive)
RewriteCond %{HTTP_USER_AGENT} ([Gg]ooglebot) [NC]
# Answer matching requests with 403 Forbidden
RewriteRule .* - [F]
It will keep out all requests that use "googlebot" as the bot name, for the current directory (where the .htaccess file is located) and all descending directories. However, when Google checks your site with an anonymous user agent, it will still be able to read the directory content, and you will not be able to keep them out by blocking IPs that you don't know. The better solution is definitely to let them spider the site and tell them through a robots.txt what pages and directories you don't want them to include in their index.
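For example, a robots.txt in the domain's document root could look like this; the directory name is only a placeholder, so replace it with the part of the site you want to keep out:

Code:
# Ask all crawlers to stay out of this (placeholder) directory
User-agent: *
Disallow: /exampledirectory/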
Thank you for all of your help. I just need to deny access to a part of the site; the rest of the site should be visible to all.

Now that is a totally different request. If your work computers have a static IP address (e.g. their internet router has a static IP address to the outside world), you could instead simply use the IP address limiter on the Apache & Nginx settings page to block access to your website.

Like so?

Code:
location ^~ /exampledirectory {
    # Allow only this one (static) IP address ...
    allow 203.0.113.2;
    # ... and deny everyone else
    deny all;
}
For subdirectories, yes, maybe. For the whole site, it is much easier to use the "Apache & nginx settings" and fill the "Deny access to the site" fields.
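If you would rather keep it in the nginx directives, the whole-site equivalent would be roughly the following; the IP address is a placeholder, and the exact rules that Plesk generates from those fields may differ:

Code:
# Sketch only: restrict the whole site to one (placeholder) IP address,
# placed at server level instead of inside a location block
allow 203.0.113.2;
deny all;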
I tried that location block in the Additional nginx directives section with my IP address, but it did not work. Do you know why?