Package: libwww-robotrules-perl
Status: install reinstreq unpacked
Priority: optional
Section: perl
Installed-Size: 36
Maintainer: Debian Perl Group
Architecture: all
Version: 6.02-1
Replaces: libwww-perl (<< 6.00)
Depends: perl, liburi-perl
Breaks: libwww-perl (<< 6.00)
Description: database of robots.txt-derived permissions
 WWW::RobotRules parses /robots.txt files as specified in "A Standard for
 Robot Exclusion", at .
 .
 Webmasters can use the /robots.txt file to forbid conforming robots from
 accessing parts of their web site.
 .
 The parsed files are kept in a WWW::RobotRules object, and this object
 provides methods to check if access to a given URL is prohibited. The same
 WWW::RobotRules object can be used for one or more parsed /robots.txt files
 on any number of hosts.
Homepage: https://metacpan.org/release/WWW-RobotRules
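
The parse-then-query workflow the description outlines (parse a /robots.txt file into an object, then ask that object whether a URL may be fetched) is not specific to Perl. As a minimal sketch of the same idea, Python's standard-library `urllib.robotparser` implements the same Robot Exclusion standard; the `User-agent` and `Disallow` rules below are illustrative, not from the source:

```python
from urllib.robotparser import RobotFileParser

# An illustrative robots.txt body; parsed directly, no network access needed.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Query the parsed rules: is this URL allowed for the given robot name?
print(rp.can_fetch("MyBot", "http://example.com/private/page.html"))
print(rp.can_fetch("MyBot", "http://example.com/public/page.html"))
```

As with a WWW::RobotRules object, the parser object holds the parsed rules, so one instance can be queried repeatedly after a single parse.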