Strange Redirect Problem

I am getting redirect errors in Google Search Console for my site.

Specifically, it is complaining about this URL:

I can't see any .php or html file in this directory that might contain redirect code. This is certainly not a directory that I have touched or added files to.

In Search Console I can fetch the page as Google. When I do this I see the following:

Fetch as Google
Googlebot type: Desktop
Redirected on Tuesday, March 20, 2018 at 5:42:19 PM PDT
This URL redirected to:
Downloaded HTTP response:

HTTP/1.1 301 Moved Permanently
Date: Wed, 21 Mar 2018 00:42:20 GMT
Server: Apache
Location:
Cache-Control: max-age=1
Expires: Wed, 21 Mar 2018 00:42:21 GMT
Content-Length: 253
Keep-Alive: timeout=3, max=100
Connection: Keep-Alive
Content-Type: text/html; charset=iso-8859-1

<html><head>
<title>301 Moved Permanently</title>
</head><body>
<h1>Moved Permanently</h1>
<p>The document has moved <a href="">here</a>.</p>
</body></html>

I have searched through the site trying to figure out how this is happening and can't find anything. There is code in the \concrete\symfony\http-foundation\RedirectResponse.php file -- specifically lines 98 and 101 (concrete 8.3) -- that seems to create this markup, but I can't figure out how it is being triggered. Any ideas??

Google is penalizing my site because of this and a few similar redirects that I did not add to the site. Very strange.

Any help or suggestions are appreciated.

View Replies:
edsaxmoore replied on at Permalink Reply
Did you find an answer to this problem? I am experiencing the same issue.

dgreer replied on at Permalink Reply
No - no help on this one. I finally eliminated the pages from the Google crawl so that I was not getting errors.

Let me know if you figure out what is going on.

Did you by chance convert your site via the migration programs? I am wondering if this is some migration-related problem...
edsaxmoore replied on at Permalink Reply
It was a fresh install of 8.1, then an upgrade to 8.2. I'm having IT check at the server level to see if there is something on the server, which I doubt. It is happening on many of my sites. I will let you know my findings.
edsaxmoore replied on at Permalink Reply
I think I've figured it out. One of the guys on Slack told me it may be my cache. I cleared the cache and now I am getting a 404 error, which is correct. I guess I'm going to have to set some cron jobs to clear the cache once a month or so.
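For the cron job: the concrete5 8.x CLI includes a c5:clear-cache command at concrete/bin/concrete5 (verify against your own install). A possible monthly entry, with /var/www/mysite as a placeholder for the site root:

```shell
# crontab entry (added via `crontab -e`): clear the concrete5 cache
# at 03:00 on the 1st of each month.
# /var/www/mysite is a placeholder; adjust to your site root.
0 3 1 * * cd /var/www/mysite && ./concrete/bin/concrete5 c5:clear-cache
```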
byvictoria replied on at Permalink Reply
Hi everyone,
I am having the same problem. I have cleared the cache, but that didn't solve my issue. The problem seems related to the pretty URL .htaccess code, because when I take it off, I don't get that redirection notice...

Any help?
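For comparison, concrete5's pretty-URL rules follow the standard front-controller rewrite pattern; the exact stock rules vary by version, so treat this as a sketch rather than the shipped block. If a modified .htaccess loses the file/directory guards, requests for index.php itself get rewritten again and you can loop:

```apache
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
# The !-f / !-d guards stop real files and directories (including
# index.php itself) from being rewritten -- drop them and you can loop.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)$ index.php/$1 [L]
</IfModule>
```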


edsaxmoore replied on at Permalink Reply
Make sure your robots.txt file is correct. I've had to manually add some disallows. For example, the culprit in my case was as follows:

Disallow: /application/views

Try that and see if it works
byvictoria replied on at Permalink Reply
Thank you @edsaxmoore...

The thing is that my robots.txt already had that line. I added Disallow: /concrete/images and I am waiting to see what happens. This is my robots.txt now:

User-agent: *
Disallow: /application/attributes
Disallow: /application/authentication
Disallow: /application/bootstrap
Disallow: /application/config
Disallow: /application/controllers
Disallow: /application/elements
Disallow: /application/helpers
Disallow: /application/jobs
Disallow: /application/languages
Disallow: /application/mail
Disallow: /application/models
Disallow: /application/page_types
Disallow: /application/single_pages
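A rule set like the one above can be sanity-checked locally with Python's standard-library robots.txt parser, instead of waiting for Google to recrawl. A minimal sketch -- the rule list is trimmed and the paths tested are just examples:

```python
from urllib.robotparser import RobotFileParser

# A trimmed version of the rule set above, plus the /concrete/images
# rule discussed elsewhere in this thread.
rules = """\
User-agent: *
Disallow: /application/config
Disallow: /concrete/images
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# can_fetch(user_agent, path) answers: may this agent crawl this path?
print(parser.can_fetch("Googlebot", "/concrete/images/throbber.gif"))  # False
print(parser.can_fetch("Googlebot", "/about-us"))                      # True
```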
edsaxmoore replied on at Permalink Reply
What is the exact Google error/warning you are getting? Also, have you deleted the cache at the server level? Deleting it in concrete5 is one thing, but if you've got Varnish or another caching program running, it may be holding a deleted graphic in cache at the server level.
byvictoria replied on at Permalink Reply

I just get: Search Console has identified that your site is affected by 1 new Index coverage related issue. This means that Index coverage may be negatively affected in Google Search results. We encourage you to fix this issue.

New issue found: Redirect error

And it points out the page where the error is:

And when I go to that page, I get ERR_TOO_MANY_REDIRECTS
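A redirect loop like ERR_TOO_MANY_REDIRECTS can also be confirmed outside the browser by following Location headers one hop at a time and stopping when a URL repeats. This sketch uses only the Python standard library; the throwaway local server here is a stand-in for the affected page, not the real site:

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urljoin, urlsplit

class LoopHandler(BaseHTTPRequestHandler):
    """Always 301 back to the same path -- the shape of a redirect loop."""
    def do_GET(self):
        self.send_response(301)
        self.send_header("Location", self.path)
        self.end_headers()
    def log_message(self, *args):  # keep output quiet
        pass

def trace(url, max_hops=10):
    """Follow redirects by hand; return (chain of URLs visited, verdict)."""
    chain = [url]
    for _ in range(max_hops):
        parts = urlsplit(url)
        conn = http.client.HTTPConnection(parts.hostname, parts.port)
        conn.request("GET", parts.path or "/")
        resp = conn.getresponse()
        resp.read()
        conn.close()
        if resp.status not in (301, 302, 303, 307, 308):
            return chain, f"final status {resp.status}"
        url = urljoin(url, resp.getheader("Location"))
        if url in chain:  # we have been here before: a loop
            return chain + [url], "loop"
        chain.append(url)
    return chain, "too many redirects"

server = HTTPServer(("127.0.0.1", 0), LoopHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
chain, verdict = trace(f"http://127.0.0.1:{server.server_port}/page")
print(verdict)  # loop
server.shutdown()
```

Pointing trace() at the flagged page on a live site (and comparing with and without the pretty-URL rules in place) would show exactly which hop sends the crawler in circles.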
edsaxmoore replied on at Permalink Reply
The robots.txt example I gave above was only an example. If it happens again, be sure to find the exact path, which in this instance is /concrete/images, and disallow it IF it shouldn't be indexed. In this case, it should be disallowed.

Disallow: /concrete/images

Then you will have to run the 'fix' with Google and have them review it. They will let you know if it is fixed or not. But this should take care of it.
darrellgw replied on at Permalink Reply
I had this exact problem with the concrete/images directory too. I disallowed it in the robots.txt file and the problem went away for a couple of months. Now I am getting a new issue from Google Search Console, although only a warning:

Indexed, though blocked by robots.txt
First detected: 8/7/18
Status: Warning

I really wish the community could come up with an exact fix for this issue. I might try clearing the system cache, deleting the cache files off the server, and removing the disallow in robots.txt to see what happens. Will post results if it works.