URL errors

Do you have any suggestions for removing the URL errors shown in the highlighted screenshot?

Any advice would be greatly appreciated, thanks!

1 Attachment

 
adajad replied:
Do you have a robots.txt in your webroot? If not, create one. If you do, make sure it has the following:

User-agent: *
Disallow: /blocks 
Disallow: /concrete 
Disallow: /config 
Disallow: /controllers 
Disallow: /css 
Disallow: /elements 
Disallow: /helpers 
Disallow: /jobs 
Disallow: /js 
Disallow: /languages 
Disallow: /libraries 
Disallow: /mail 
Disallow: /models 
Disallow: /packages


Basically that file tells search engine crawlers (at least the major ones) not to crawl or index the disallowed directories.
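
If you also want to point crawlers at your sitemap, robots.txt supports a Sitemap directive. The URL below is only a placeholder; swap in your own domain and sitemap location:

Sitemap: http://www.example.com/sitemap.xml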
jessicadunbar replied:
Also, you can demote those links in Webmaster Tools to help get them removed faster. Create and submit a sitemap.

That should clear it up. Those sitelinks have been a pain since Google launched that feature.
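
For reference, a minimal sitemap.xml is just a list of your page URLs in the sitemaps.org format; the example.com entries below are placeholders for your own pages:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/about/</loc>
  </url>
</urlset>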
irussell replied:
OK, I updated my robots.txt file and resubmitted my sitemap, but I still see the errors when I search for my URL in Google. Does this just take some time to go away?

Thanks for your advice too :-)
Steevb replied:
Never seen that issue before, but I suppose there's always a first time?

Is this a setup error?

Looks a bit odd to me!

The robots.txt should be set up automatically with C5; you do not need to create your own.

If you need to add to it, you can.

Crawl errors and updates can take a while to rectify with all search engines.

Remember you can exclude pages from the sitemap.xml with 'Page Properties'.

Hope that helps