Package: libwww-robotrules-perl
Version: 6.02-1
Architecture: all
Maintainer: Debian Perl Group
Installed-Size: 36
Depends: perl, liburi-perl
Breaks: libwww-perl (<< 6.00)
Replaces: libwww-perl (<< 6.00)
Section: perl
Priority: optional
Homepage: https://metacpan.org/release/WWW-RobotRules
Description: database of robots.txt-derived permissions
 WWW::RobotRules parses /robots.txt files as specified in "A Standard for
 Robot Exclusion". Webmasters can use the /robots.txt file to forbid
 conforming robots from accessing parts of their web site.
 .
 The parsed files are kept in a WWW::RobotRules object, and this object
 provides methods to check whether access to a given URL is prohibited.
 The same WWW::RobotRules object can be used for one or more parsed
 /robots.txt files on any number of hosts.
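
A minimal usage sketch of the module this package ships. The User-Agent name,
the example.com URLs, and the use of LWP::Simple for fetching are illustrative
only (LWP::Simple is provided by libwww-perl, which is not a dependency of this
package); the WWW::RobotRules calls shown (new, parse, allowed) are the
module's documented interface.

    #!/usr/bin/perl
    use strict;
    use warnings;

    use WWW::RobotRules;
    use LWP::Simple qw(get);   # only used here to fetch robots.txt

    # Create a rules object identified by the robot's User-Agent name.
    my $rules = WWW::RobotRules->new('ExampleBot/1.0');

    # Fetch and parse a site's /robots.txt (URL is illustrative).
    my $robots_url = 'http://www.example.com/robots.txt';
    my $robots_txt = get($robots_url);
    $rules->parse($robots_url, $robots_txt) if defined $robots_txt;

    # The same object can hold parsed rules for any number of hosts;
    # allowed() checks a URL against the rules recorded for its host.
    my $page = 'http://www.example.com/private/index.html';
    if ($rules->allowed($page)) {
        print "robots.txt permits fetching $page\n";
    } else {
        print "robots.txt forbids fetching $page\n";
    }

Repeating the parse step for other hosts' /robots.txt files adds their rules to
the same object, so a single $rules instance can answer allowed() queries for
every host the robot visits.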