Search Engines Indexing Core Folders

We have a site being indexed by Google, and we noticed it's indexing core folders like:

/concrete/libraries
/files/onstates
/concrete/jobs
/concrete/config

See for yourself:

http://www.google.com/search?hl=en&q=site:gigstadpainting.com...

This is no good. Any ideas on how Concrete could block that indexing, rather than me having to set up a custom robots.txt file?

LucasAnderson
 
andrew replied:
It looks like your server is set up to allow directory browsing, so if I go to server.com/concrete/ I see all the files there.

Now, there's nothing an end user should be able to do with that knowledge... but it's still probably not the best server setup (and I think it's also why you're being indexed).

I'd disable directory browsing through Apache yourself or have an administrator do it... that way, requesting a directory that doesn't have a valid index.php in it will get a permission-denied error, and I think Google will stop spidering those directories then:

http://felipecruz.com/blog_disable-directory-listing-browsing-apach...
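
The basic idea is a block like this in httpd.conf (a minimal sketch; the directory path is a placeholder for your own document root):

<Directory "/var/www/html">
# Don't auto-generate a listing when a directory has no index file
Options -Indexes
</Directory>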
LucasAnderson replied:
That seemed to work, but then I checked the Pretty URL functionality and it was spitting out 404s.

I added this to httpd.conf:

<Directory "/home/*">
# Turn off automatic directory listings
Options -Indexes
# Disables .htaccess processing under these directories
AllowOverride None
</Directory>

So server.com/concrete was blocked, but so was server.com/services (an actual page, once prettified by pretty URLs).

Thoughts?
LucasAnderson replied:
Thanks, Andrew. I resolved it by learning more about editing httpd.conf.

Go figure.
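
For anyone who lands on the same problem: AllowOverride None also turns off the .htaccess file that the Pretty URL rewrite rules live in, which is where the 404s came from. Something along these lines blocks the listings without killing .htaccess (a sketch; the path is a placeholder):

<Directory "/home/user/public_html">
# Block automatic directory listings
Options -Indexes
# Leave .htaccess enabled so the Pretty URL rewrite rules still apply
AllowOverride All
</Directory>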
renic replied:
A robots.txt file wouldn't hurt either.
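
For example, something along these lines in a robots.txt at the site root (paths taken from the folders listed above; note that robots.txt only discourages well-behaved crawlers, it doesn't secure anything):

User-agent: *
Disallow: /concrete/
Disallow: /files/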