
Bot crawling not disallowed by default on Plesk 8443

Chris1

Regular Pleskian
A client of ours has recently experienced a problem with their Google account: a warning stated that there was an issue with their SSL configuration. We verified that their SSL setup was fine; they are using a valid SSL certificate with SNI support.

The problem turned out to be that Google was indexing their domain on port 8443, which raised a warning because the shared certificate on the server does not match their domain name.

I have since put a robots.txt file containing the following into /usr/local/psa/admin/htdocs/:

Code:
User-agent: *
Disallow: /
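
For anyone who wants to confirm that these rules actually block crawlers, Python's standard-library robot parser can evaluate the file locally. This is just a quick sketch; the hostname is a placeholder, and the rules mirror the robots.txt above:

Code:
```python
from urllib.robotparser import RobotFileParser

# Rules matching the robots.txt placed in /usr/local/psa/admin/htdocs/
rules = ["User-agent: *", "Disallow: /"]

parser = RobotFileParser()
parser.parse(rules)

# A well-behaved crawler such as Googlebot is now blocked from every
# path on the panel port (example.com is a placeholder hostname)
print(parser.can_fetch("Googlebot", "https://example.com:8443/"))
```

Note that robots.txt only stops well-behaved crawlers from fetching pages; it does not remove entries Google has already indexed, hence the removal request below.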

I have also advised the client to contact Google to have the port 8443 listing for their domain removed from the index.

Is there a reason why the Plesk interface is set to allow bot crawling by default?
 