Hi,
I have a site that was previously hosted on Apache only, and we use a script to generate dynamic pages. A request hits a subfolder whose script catches 404 errors, and if the request URI is in a list, a dynamic page is served instead. This is a massively cut-down version of the script, but it goes something like this:
PHP:
if (preg_match("/\/shop-search\/(.*)-shoes$/", $_SERVER['REQUEST_URI'], $matches)) {
    // Override the 404 status Apache has set for this request
    header("HTTP/1.1 200 OK");

    $colorName = trim(strip_tags($matches[1]));
    $pagenum   = 1;
    $mainURL   = "/shop-search/" . $colorName . "-shoes/";

    // Whitelist of colours we generate pages for
    $color = array(
        "Blue" => "Blue",
        "Red"  => "Red",
    );

    if (isset($color[$colorName])) {
        $keywords = $color[$colorName];
        $pageType = "colour";
        require_once "by-color.php";
        exit;
    }
}
Under Apache, it catches the 404, does the checks and then renders the page with a 200 status, so the browser is never served a 404. With the nginx reverse proxy in front, the page still renders, but a 404 is passed to the browser, which means the pages are erroring for Googlebot and have dropped out of the index. Is there any way to stop nginx processing that folder at all, or any other way to stop the 404 being pushed back through nginx before the 200 status is served?
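To illustrate what I mean by stopping nginx processing that folder: I was picturing something along these lines in the server block, but I'm not sure it's the right approach (the backend address is just a placeholder for wherever Apache is listening):

Nginx:
# Pass /shop-search/ straight through to Apache and keep whatever status it returns
location /shop-search/ {
    proxy_pass http://127.0.0.1:8080;
    # Don't let nginx swap the upstream response for its own error page
    proxy_intercept_errors off;
}

I don't know whether proxy_intercept_errors is even the relevant directive here, so if there's a better way to keep nginx's hands off that path I'm all ears.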
TIA
chubba