Hello everyone, I'm having a problem with Google indexing my website.
Here is my robots.txt:
User-agent: Googlebot
Disallow:

User-agent: googlebot-image
Disallow:

User-agent: googlebot-mobile
Disallow:

User-agent: MSNBot
Disallow:

User-agent: Slurp
Disallow:

User-agent: Teoma
Disallow:

User-agent: Gigabot
Disallow:

User-agent: Robozilla
Disallow:

User-agent: Nutch
Disallow:

User-agent: ia_archiver
Disallow:

User-agent: baiduspider
Disallow:

User-agent: naverbot
Disallow:

User-agent: yeti
Disallow:

User-agent: yahoo-mmcrawler
Disallow:

User-agent: psbot
Disallow:

User-agent: yahoo-blogs/v3.9
Disallow:

User-agent: *
Disallow:
Disallow: /cgi-bin/

Sitemap: example
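For what it's worth, I also tested these rules locally with Python's urllib.robotparser, and it confirms Googlebot is allowed everywhere. (A minimal sketch; example.com is just a stand-in for my real domain, and I only pasted the relevant groups.)

import urllib.robotparser

# The same rules as above, pasted inline for testing.
ROBOTS_TXT = """\
User-agent: Googlebot
Disallow:

User-agent: *
Disallow:
Disallow: /cgi-bin/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Both print True: the Googlebot group has no Disallow rules at all.
print(rp.can_fetch("Googlebot", "https://example.com/"))
print(rp.can_fetch("Googlebot", "https://example.com/cgi-bin/test"))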
Normally everything should be fine, but Google is showing me this error:

Crawl allowed?
Error
No: blocked by robots.txt
Page fetch
Error
Failed: Blocked by robots.txt
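In case it helps anyone reproduce this, here's how I'm checking what Google actually receives when it requests the file (a minimal sketch; www.example.com again stands in for my real domain). From what I've read, the HTTP status of robots.txt matters: a server error on that URL can make Google treat the whole site as blocked even if the file's contents are fine.

import urllib.request

# Hypothetical URL; replace with the real domain.
url = "https://www.example.com/robots.txt"

with urllib.request.urlopen(url) as resp:
    # Google treats a 5xx response on robots.txt as "everything disallowed",
    # so the status code matters as much as the file's contents.
    print(resp.status)
    print(resp.read().decode("utf-8", errors="replace"))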
I don't understand what's going on. I've used this same kind of robots.txt on several other websites without any problem, but on this site I'm trying to index, it doesn't work.
Thank you in advance for your help.