Sitemap.xml throws errors in Webmaster Tools

One of my sites has what seems to be a perfectly correct sitemap.xml file. The structure looks good and it can be viewed in a browser at any time without problems.
Automated jobs that read it also run fine.

But when I submit it for crawling in Google Webmaster Tools and also in Bing Webmaster Tools, both throw errors and claim they cannot access it ("The request to fetch the page was timed out." and "Sitemap could not be read").
Why would this be?
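
For anyone who wants to reproduce the check from outside a browser, here is a minimal sketch (Python, with a placeholder URL standing in for my real sitemap) that fetches the sitemap roughly the way a crawler would and reports the HTTP status and response time:

    import time
    import urllib.request

    # Placeholder URL; substitute your own sitemap location.
    SITEMAP_URL = "https://www.example.com/sitemap.xml"

    request = urllib.request.Request(
        SITEMAP_URL,
        headers={"User-Agent": "Mozilla/5.0 (compatible; sitemap-check)"},
    )

    start = time.monotonic()
    try:
        # A short timeout mimics how strict search-engine fetchers can be.
        with urllib.request.urlopen(request, timeout=10) as response:
            body = response.read()
        elapsed = time.monotonic() - start
        print(f"HTTP {response.status}: {len(body)} bytes in {elapsed:.2f}s")
    except Exception as exc:
        print(f"Fetch failed after {time.monotonic() - start:.2f}s: {exc}")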

jirosworld
 
Steevb replied on at Permalink Reply
Do you have a large site?
URLs that time out usually indicate an issue with the server responding to the request, or an issue with the URL itself that makes it slow to respond, such as an overly large or complex database query. It is worth checking the 'crawl errors' report in Google Search Console to see whether it also shows URLs timing out. If so, you may need to work with a developer or server admin to understand what is causing the timeouts.
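
Because this kind of slowness is often intermittent, one fast response in a browser doesn't rule it out. A rough sketch along these lines (Python, placeholder URL) times several fetches in a row so an occasional slow or failed response shows up:

    import time
    import urllib.request

    # Placeholder URL; substitute your own sitemap location.
    SITEMAP_URL = "https://www.example.com/sitemap.xml"

    # Fetch a handful of times to catch intermittent slow responses.
    for attempt in range(5):
        start = time.monotonic()
        try:
            with urllib.request.urlopen(SITEMAP_URL, timeout=10) as response:
                response.read()
            print(f"attempt {attempt + 1}: HTTP {response.status} "
                  f"in {time.monotonic() - start:.2f}s")
        except Exception as exc:
            print(f"attempt {attempt + 1}: failed after "
                  f"{time.monotonic() - start:.2f}s ({exc})")
        time.sleep(1)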
jirosworld replied on at Permalink Best Answer Reply
No, this is not a large site at all. It's an illustrator portfolio site with at most 40 individual content pages, and all images are compressed, so there are no large files or crazy amounts of internal links.
The sitemap file itself also passes XML validators without issues and can be viewed and crawled by bots. On the server everything looks fine and the Unix permissions on the sitemap are 766. The site can be viewed over both http and https, with or without www before the domain, and .htaccess doesn't block anything beyond the default folders.
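
Since the site answers on four host variants, a quick sketch like this (Python, placeholder domain) fetches the sitemap over each one and prints the status plus the final URL after any redirects, which helps spot a variant that stalls or redirects somewhere unexpected:

    import urllib.request

    # Placeholder domain; substitute your own.
    VARIANTS = [
        "http://example.com/sitemap.xml",
        "http://www.example.com/sitemap.xml",
        "https://example.com/sitemap.xml",
        "https://www.example.com/sitemap.xml",
    ]

    # Report the status and the final URL after redirects for each variant.
    for url in VARIANTS:
        try:
            with urllib.request.urlopen(url, timeout=10) as response:
                print(f"{url} -> HTTP {response.status}, final URL {response.geturl()}")
        except Exception as exc:
            print(f"{url} -> failed: {exc}")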

Now, 24 hours later, Google Search Console has stopped reporting errors and suddenly says the sitemap has been crawled, but Bing was still throwing errors.
I decided to submit the non-SSL URL to Bing instead, and now it has been processed correctly.
I'm wondering if other Concrete5 users have experienced the same.