Hello,
For some reason Google can't fetch the generated robots.txt, although it should be able to. I can access robots.txt with curl and get the following output:
curl http://www.mydomain.com/robots.txt
User-agent: *
Disallow: /web/login
Allow: *
User-Agent: Googlebot
Disallow: /web/login
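The rules themselves look parseable. As a sanity check (separate from the reachability problem), here is a quick local test of those exact rules with Python's urllib.robotparser; the domain is a placeholder:

```python
from urllib.robotparser import RobotFileParser

# The robots.txt content exactly as returned by curl above.
robots_txt = """\
User-agent: *
Disallow: /web/login
Allow: *
User-Agent: Googlebot
Disallow: /web/login
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Googlebot may crawl the site root, but not the login page.
print(rp.can_fetch("Googlebot", "http://www.mydomain.com/"))           # True
print(rp.can_fetch("Googlebot", "http://www.mydomain.com/web/login"))  # False
```

So the rules do what you intend, which suggests "unreachable" is about delivery, not content: Googlebot itself is getting a DNS failure, a 5xx, a redirect loop, or is being blocked by a firewall/CDN rule keyed on the user agent. It is worth comparing `curl -I http://www.mydomain.com/robots.txt` with `curl -A "Googlebot/2.1" -I http://www.mydomain.com/robots.txt`. One small aside: `Allow: *` is non-standard; the robots exclusion protocol expects a path there, so `Allow: /` is the safer spelling (Google tolerates both).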
Google is complaining:
Failed: Robots.txt unreachable
Any idea what is wrong?
Because of that, Google can't access sitemap.xml either.
The other problem concerns sitemap.xml itself: it contains URLs with an http prefix instead of https. They are still valid, since we have an http->https redirect rule, but I would prefer the sitemap to contain the correct URLs in the first place. Any help with that?
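If this is Odoo (the /web/login path suggests it), the sitemap URLs are typically built from the web.base.url system parameter, so setting that to the https address (and, if I recall correctly, setting web.base.url.freeze so logins don't overwrite it) should fix generation at the source. As a stopgap, rewriting the `<loc>` entries of an existing sitemap is straightforward; a minimal sketch with placeholder URLs:

```python
import xml.etree.ElementTree as ET

# Sitemap protocol namespace; register it so output has no ns0: prefixes.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)

# Sample sitemap standing in for the real generated file.
sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.mydomain.com/</loc></url>
  <url><loc>http://www.mydomain.com/shop</loc></url>
</urlset>"""

root = ET.fromstring(sitemap)
for loc in root.iter(f"{{{NS}}}loc"):
    # Upgrade the scheme of every http URL to https.
    if loc.text and loc.text.startswith("http://"):
        loc.text = "https://" + loc.text[len("http://"):]

print(ET.tostring(root, encoding="unicode"))
```

Fixing web.base.url is still the better route, because a rewrite step has to be re-run every time the sitemap is regenerated.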
Many thanks in advance.
Lumir