URL errors
Any advice would be greatly appreciated thanks!
User-agent: *
Disallow: /blocks
Disallow: /concrete
Disallow: /config
Disallow: /controllers
Disallow: /css
Disallow: /elements
Disallow: /helpers
Disallow: /jobs
Disallow: /js
Disallow: /languages
Disallow: /libraries
Disallow: /mail
Disallow: /models
Disallow: /packages
Basically, that file tells search engine crawlers (well, at least the major ones) not to crawl or index the disallowed directories.
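If you want to double-check the rules before relying on them, here's a quick sketch using Python's standard-library robots.txt parser (the example paths are just illustrative, not from your actual site):

```python
# Sanity-check robots.txt rules locally with the standard-library parser.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /blocks
Disallow: /concrete
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Crawlers that honor the file should skip the disallowed directories
# but still be free to fetch everything else.
print(rp.can_fetch("*", "/blocks/page_list"))  # False (disallowed)
print(rp.can_fetch("*", "/about"))             # True (allowed)
```

Handy for confirming a new Disallow line actually covers the paths you expect before waiting on Google to re-crawl.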
That should clear it up. Those sitelinks have been a pain since Google launched that feature.
Thanks for your advice too:-)
Is this a setup error?
Looks a bit odd to me!
The 'robots.txt' file should be set up automatically by C5; you do not need to create your own.
If you need to add to it, you can.
Crawl errors and updates can take a while to rectify with all search engines.
Remember you can exclude pages from the sitemap.xml via 'Page Properties'.
Hope that helps