Daniel Richards
New Pleskian
I'm having a strange issue where Google can see that robots.txt exists but cannot always read it.
I've tried all of the obvious fixes (encoding, permissions, etc.),
but no joy.
When I look in /var/log/nginx/access.log, I find:
66.249.69.26 - - [23/Jul/2018:05:36:36 +0000] "GET /robots.txt HTTP/1.1" 301 178 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +Googlebot - Search Console Help)"
I can manually ask Google to fetch robots.txt, and a similar entry appears with a 301, after which Google reports that the fetch is not possible. (It indexes the rest of my site just fine.)
The request doesn't appear in the domain logs, just in the system one above.
How can I enable logging so that I can see the exact request Googlebot is making? Can I determine whether Googlebot is requesting by IP or by domain name? And if so, is it using http or https, and www or no www?
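For context, what I imagine would expose this is a custom nginx `log_format` recording the scheme and the Host header the client actually sent. A minimal sketch, assuming I can edit the server block in the nginx config (example.com is a placeholder; the log path is my guess):

```nginx
# Sketch: extend the default "combined" format with the scheme
# (http/https) and the Host header the request arrived with.
log_format bot_debug '$remote_addr - $remote_user [$time_local] '
                     '"$request" $status $body_bytes_sent '
                     'scheme=$scheme host=$host '
                     '"$http_referer" "$http_user_agent"';

server {
    listen 80;
    server_name example.com www.example.com;  # placeholder domain

    # Write requests for this server block with the extended format.
    access_log /var/log/nginx/bot_debug.log bot_debug;
}
```

(As I understand it, `$host` falls back to the `server_name` when no Host header is present; `$http_host` would show the raw header instead.) Is something like this the right approach, or is there a Plesk-native way to do it?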
Thanks in advance....
(Any ideas on rectifying this are also welcome.)