
Question: How can I prevent access to a specific part of a website and keep it from showing in Google Search?

elaa

Basic Pleskian
Hi, I have a website that is already published, and I want to add a new web application to it. The application needs to be hidden from everyone except specific IP addresses, and it should not show up in Google searches.
How can I do this in Plesk?
 
Are you aware of the consequences? Your Google ranking can drop sharply, to the point where your website is removed from the index for manipulative tactics. Generally speaking: when you hide something from Google that normal users can see, Google may penalize your website.

If you want to do that, you could block Google from reading parts of your site by adding .htaccess rules that treat Googlebot as if it were a bad bot. For example, create an .htaccess file in the directory that Google should not be able to read and put this into it:

Code:
RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_USER_AGENT} googlebot [NC]
RewriteRule .* - [F]

This keeps out all requests that identify as "googlebot" in the user agent, for the current directory (where the .htaccess file is located) and all subdirectories. However, if Google crawls with a different user agent, it will still be able to read the directory content, and you cannot block IP addresses you don't know in advance. The better solution is definitely to let Google spider the site and use a robots.txt to tell it which pages and directories you don't want included in the index.
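If you go the robots.txt route, a minimal sketch looks like this (placed in the site's document root; /private-app/ is a placeholder for the directory you want excluded):

Code:
User-agent: *
Disallow: /private-app/

Note that Disallow only stops crawling; a URL that other sites link to can still appear in search results. Keeping pages out of the index entirely requires a noindex robots meta tag or an X-Robots-Tag response header on those pages.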
 


Thank you for everything, but this is really complicated for me. I just wanted my web app to be accessible only from work computers.
 
Now that is a totally different request.

If your work computers have a static IP address (e.g. their internet router presents a static IP address to the outside world), you could instead simply use the IP address restrictions on the "Apache & nginx Settings" page to block access to your website.
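If you prefer configuration over the GUI, the same whole-site restriction can be sketched as nginx directives (e.g. in Plesk's "Additional nginx directives" field; 203.0.113.2 stands in for the office router's static IP):

Code:
# Only the office's static IP may reach the site; everyone else gets 403
allow 203.0.113.2;
deny all;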
 
Like so?
Code:
location ^~ /exampledirectory {
    allow 203.0.113.2;
    deny all;
}
 
For subdirectories, yes, maybe. For the whole site, it is much easier to use the "Apache & nginx settings" and fill the "Deny access to the site" fields.
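As an Apache-side alternative for a single directory, a per-directory .htaccess can achieve the same restriction (Apache 2.4 `Require` syntax; 203.0.113.2 is again a placeholder for the office's static IP):

Code:
# Place in the directory to protect: only the office IP is allowed
Require ip 203.0.113.2

With Apache 2.4, `Require ip` replaces the older `Order`/`Allow`/`Deny` directives.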
 
Thank you for all of your help. I just need to deny access to one part of the site; the rest should remain visible to all.
 