This question has been flagged
1 Answer
3197 Views

Hello everyone, I have a problem with the indexing of my website on Google.


Here is my robots.txt:


User-agent: Googlebot
Disallow:

User-agent: googlebot-image
Disallow:

User-agent: googlebot-mobile
Disallow:

User-agent: MSNBot
Disallow:

User-agent: Slurp
Disallow:

User-agent: Teoma
Disallow:

User-agent: Gigabot
Disallow:

User-agent: Robozilla
Disallow:

User-agent: Nutch
Disallow:

User-agent: ia_archiver
Disallow:

User-agent: baiduspider
Disallow:

User-agent: naverbot
Disallow:

User-agent: yeti
Disallow:

User-agent: yahoo-mmcrawler
Disallow:

User-agent: psbot
Disallow:

User-agent: yahoo-blogs/v3.9
Disallow:

User-agent: *
Disallow:
Disallow: /cgi-bin/

Sitemap: exemple
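For what it's worth, rules like these can be sanity-checked offline with Python's standard-library robots.txt parser. The snippet below uses a trimmed copy of the file above (the domain `example.com` is just a placeholder): an empty `Disallow:` line means "allow everything" for that user-agent, so this file should not block Googlebot at all.

```python
from urllib.robotparser import RobotFileParser

# A trimmed copy of the rules above; the principle is the same for
# every record: an empty "Disallow:" line allows everything.
rules = """\
User-agent: Googlebot
Disallow:

User-agent: *
Disallow:
Disallow: /cgi-bin/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# example.com is a placeholder domain.
print(parser.can_fetch("Googlebot", "https://example.com/"))           # → True
print(parser.can_fetch("Googlebot", "https://example.com/some-page"))  # → True
```

If this prints `True`, the file as written is not what is blocking Google, which points at the problem being the file the server actually serves rather than these rules.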


Normally everything should be fine, but Google is showing me this error:


Crawl allowed?
Error
No: blocked by robots.txt
Page fetch
Error
Failed: Blocked by robots.txt


I don't understand what's going on. I've used this type of robots.txt on several other websites, but it doesn't work on the website I'm trying to index.
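One thing worth checking (an assumption, since the live site isn't shown here): Search Console reports on the robots.txt that Google actually fetched from the site, which can differ from the file you edited. A CMS "discourage search engines" option, a staging configuration, or a proxy can serve a different file entirely. A file that really does trigger "Blocked by robots.txt" looks like the one below, sketched with Python's stdlib parser (`example.com` is a placeholder domain):

```python
from urllib.robotparser import RobotFileParser

# The kind of file that actually triggers "Blocked by robots.txt"
# (e.g. a CMS maintenance mode or a "discourage indexing" setting):
blocking = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(blocking.splitlines())

# example.com is a placeholder domain.
print(parser.can_fetch("Googlebot", "https://example.com/"))  # → False
```

To see what your server really returns, open `https://<your-domain>/robots.txt` in a browser (or fetch it with `curl`) and compare it with the file you think is deployed; if it contains `Disallow: /`, that mismatch is exactly what Search Console is reporting.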


Thank you in advance for your help.

Best answer
Hello, I have the same problem. Did you find a solution?

