standard for robot exclusion robot exclusion standard robots.txt <World-Wide Web> A proposal to prevent the havoc wreaked by many of the early {World-Wide Web} {robots} when they retrieved documents too rapidly or retrieved documents that had side effects (such as voting). The standard for robot exclusion addresses these problems with a file called "robots.txt" placed in the {document root} of the {web site}, listing which parts of the site robots should not visit. It was never ratified as a formal standard but is described in an appendix to the W3C HTML 4 specification (http://w3.org/TR/html4/appendix/notes.html#h-B.4.1.1). (2006-10-17)
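A minimal illustrative robots.txt, using the two core fields of the proposal (the paths shown are hypothetical examples, not part of any standard):

	# Applies to all robots
	User-agent: *
	# Ask robots not to retrieve these (hypothetical) paths
	Disallow: /cgi-bin/
	Disallow: /vote/

An empty "Disallow:" line would instead grant robots access to the whole site. Compliance is voluntary: the file only requests that well-behaved robots stay away, it does not enforce anything.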