This question has been flagged
1 Reply
1913 Views

Hello everyone, I have a problem with the indexing of my website on Google.


Here is my robots.txt:


User-agent: Googlebot
Disallow:

User-agent: googlebot-image
Disallow:

User-agent: googlebot-mobile
Disallow:

User-agent: MSNBot
Disallow:

User-agent: Slurp
Disallow:

User-agent: Teoma
Disallow:

User-agent: Gigabot
Disallow:

User-agent: Robozilla
Disallow:

User-agent: Nutch
Disallow:

User-agent: ia_archiver
Disallow:

User-agent: baiduspider
Disallow:

User-agent: naverbot
Disallow:

User-agent: yeti
Disallow:

User-agent: yahoo-mmcrawler
Disallow:

User-agent: psbot
Disallow:

User-agent: yahoo-blogs/v3.9
Disallow:

User-agent: *
Disallow:
Disallow: /cgi-bin/

Sitemap: exemple
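For what it's worth, rules like these can be sanity-checked locally with Python's standard urllib.robotparser. This is a minimal sketch, not a reproduction of Google's parser (which can resolve edge cases differently): example.com and SomeBot are placeholders, and only two of the groups above are shown.

```python
# Minimal sketch: check robots.txt rules locally with Python's stdlib parser.
# example.com is a placeholder domain; only two groups from the file are shown.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /cgi-bin/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# An empty Disallow allows everything for Googlebot.
print(parser.can_fetch("Googlebot", "https://example.com/page"))        # True
# Other bots fall through to the * group, which only blocks /cgi-bin/.
print(parser.can_fetch("SomeBot", "https://example.com/cgi-bin/test"))  # False
print(parser.can_fetch("SomeBot", "https://example.com/page"))          # True
```

If the local check says crawling is allowed but Search Console still reports "Blocked by robots.txt", the file actually served at /robots.txt may differ from the one above (for example, a CMS or staging setting overriding it), so it is worth fetching the live URL directly.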


Normally everything should be fine, but Google is showing me this error:


Crawl allowed?
Error – No: blocked by robots.txt
Page fetch
Error – Failed: Blocked by robots.txt


I don't understand what's going on. I've used this kind of robots.txt several times on other websites, but it doesn't work on this website I'm trying to index.


Thank you in advance for your help.

Best Answer
Hello, I have the same problem. Did you find a solution?

