robotparser — Parser for robots.txt
Note

The robotparser module has been renamed urllib.robotparser in Python 3. The 2to3 tool will automatically adapt imports when converting your sources to Python 3.
This module provides a single class, RobotFileParser, which answers questions about whether or not a particular user agent can fetch a URL on the Web site that published the robots.txt file. For more details on the structure of robots.txt files, see http://www.robotstxt.org/orig.html.
class robotparser.RobotFileParser(url='')
    This class provides methods to read, parse and answer questions about the robots.txt file at url.
    can_fetch(useragent, url)
        Returns True if the useragent is allowed to fetch the url according to the rules contained in the parsed robots.txt file.
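Rules do not have to come over the network: the parser's parse() method accepts the lines of a robots.txt file directly, which is convenient for testing can_fetch() offline. The sketch below assumes a hypothetical site at example.com and a hand-written rule set; the try/except import covers both the Python 2 name and the Python 3 name mentioned in the note above.

```python
try:
    import robotparser              # Python 2 module name
except ImportError:
    from urllib import robotparser  # renamed urllib.robotparser in Python 3

rp = robotparser.RobotFileParser()
# Feed the parser a robots.txt body as a list of lines instead of
# fetching it with set_url()/read().
rp.parse([
    "User-agent: *",
    "Disallow: /cgi-bin/",
])

print(rp.can_fetch("*", "http://example.com/cgi-bin/search"))  # False
print(rp.can_fetch("*", "http://example.com/index.html"))      # True
```

Because no URL is fetched, this pattern also works in unit tests where network access is unavailable.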
The following example demonstrates basic use of the RobotFileParser class.
>>> import robotparser
>>> rp = robotparser.RobotFileParser()
>>> rp.set_url("http://www.musi-cal.com/robots.txt")
>>> rp.read()
>>> rp.can_fetch("*", "http://www.musi-cal.com/cgi-bin/search?city=San+Francisco")
False
>>> rp.can_fetch("*", "http://www.musi-cal.com/")
True