Sitemap.xml throws errors in Webmaster Tools
Running the automated jobs also goes well.
But when I submit it for crawling in Google Webmaster Tools and in Bing Webmaster Tools, both throw errors and claim they cannot access it ("The request to fetch the page was timed out." and "Sitemap could not be read").
Why would this be?
The sitemap file itself validates cleanly when I run it through XML validators, and bots can view and crawl it. On the server everything looks fine, and the Unix permissions for the sitemap are 766. The site can be viewed over both http and https, with or without www before the domain, and .htaccess doesn't block anything other than the default folders.
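For reference, this is roughly how I spot-checked that the file is reachable on every URL variant (a minimal sketch; example.com is a placeholder for the real domain):

```python
# Minimal sketch: fetch the sitemap over each URL variant and report the
# HTTP status and response time. "example.com" is a placeholder domain.
import time
import urllib.request

VARIANTS = [
    "http://example.com/sitemap.xml",
    "https://example.com/sitemap.xml",
    "http://www.example.com/sitemap.xml",
    "https://www.example.com/sitemap.xml",
]

for url in VARIANTS:
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=30) as resp:
            elapsed = time.monotonic() - start
            print(f"{url}: HTTP {resp.status} in {elapsed:.1f}s")
    except Exception as exc:
        print(f"{url}: failed ({exc})")
```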
Now, 24 hours later, Google Search Console has stopped reporting errors and suddenly says the sitemap has been crawled, but Bing was still throwing errors.
I decided to submit the non-SSL URL to Bing, and now it has been processed correctly.
I'm wondering if other Concrete5 users have experienced the same.
URLs that time out may indicate a problem with the server responding to the request, or there may be an issue with the URL itself that makes it slow to respond, such as an overly large or complex database query. It is worth checking the Google Search Console crawl errors report to see whether it also shows URLs timing out. If so, you may need to work with a developer or server admin to understand what is causing the timeouts.
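Before involving a developer, you can get a rough sense of whether the sitemap is intermittently slow with something like this (a minimal sketch; the URL and the 10-second threshold are placeholder assumptions, not values from your setup):

```python
# Hedged sketch: repeatedly time the full sitemap response to spot
# intermittent slowness that could trip a crawler's timeout.
import time
import urllib.request

SITEMAP_URL = "https://example.com/sitemap.xml"  # placeholder URL
THRESHOLD_SECONDS = 10  # assumed crawler-like timeout

for attempt in range(5):
    start = time.monotonic()
    try:
        with urllib.request.urlopen(SITEMAP_URL, timeout=THRESHOLD_SECONDS) as resp:
            resp.read()  # include transfer time, not just the headers
            elapsed = time.monotonic() - start
            verdict = "OK" if elapsed < THRESHOLD_SECONDS else "SLOW"
            print(f"attempt {attempt + 1}: {elapsed:.1f}s ({verdict})")
    except Exception as exc:
        print(f"attempt {attempt + 1}: failed ({exc})")
    time.sleep(2)
```

If the response time varies widely between attempts, that points at the page generation (for example, the database query behind it) rather than at the webmaster tools themselves.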