# Taken from source: http://en.wikipedia.org/robots.txt

# A capture bot, downloads gazillions of pages with no public benefit
# http://www.webreaper.net/
User-agent: WebReaper
Disallow: /

# Don't allow the Wayback Machine (ia_archiver) to index user pages
#User-agent: ia_archiver
#Disallow: /wiki/User
#Disallow: /wiki/Benutzer

#
# Friendly, low-speed bots are welcome to view article pages, but please
# avoid dynamically generated pages.
#
# Inktomi's "Slurp" can read a minimum delay between hits; if your
# bot supports such a thing using the 'Crawl-delay' or another
# instruction, please let us know.
#
User-agent: *
# Disallow: /w/
Disallow: /wiki/User:*
Disallow: /wiki/Pengguna:*
Disallow: /wiki/Special:*
Disallow: /wiki/Istimewa*
Disallow: /wiki/Templat:*
Disallow: /wiki/Internal:*
Disallow: /o/index.php?
Disallow: /org/
Disallow: /wt/index.php?
Disallow: /wtest/
Disallow: /wiki/GBI_Rayon_7/Cabang
Disallow: /index.php?title=GBI_Rayon_7/Cabang
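
# Illustration only, kept commented out so it does not change the active
# rules above: a bot that honours the non-standard 'Crawl-delay' extension
# (as mentioned in the Slurp note) could be throttled with a per-agent
# record like this sketch. The agent name and the 5-second value are
# placeholders, not rules taken from the original file.
#User-agent: Slurp
#Crawl-delay: 5
#Disallow: /w/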