org.apache.nutch.protocol.http.api
Class HttpRobotRulesParser
java.lang.Object
org.apache.nutch.protocol.RobotRulesParser
org.apache.nutch.protocol.http.api.HttpRobotRulesParser
- All Implemented Interfaces:
- org.apache.hadoop.conf.Configurable
public class HttpRobotRulesParser
- extends RobotRulesParser
This class is used for parsing robots.txt files for URLs belonging to the HTTP protocol.
It extends the generic RobotRulesParser
class and contains
the HTTP-protocol-specific implementation for obtaining the robots file.
Method Summary
crawlercommons.robots.BaseRobotRules
getRobotRulesSet(Protocol http,
URL url)
If the rules for a host have not yet been cached, this method sends an HTTP request
to the host corresponding to the URL passed, gets the robots file, parses the rules,
and caches the rules object to avoid re-work in the future.
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
LOG
public static final org.slf4j.Logger LOG
allowForbidden
protected boolean allowForbidden
HttpRobotRulesParser
public HttpRobotRulesParser(org.apache.hadoop.conf.Configuration conf)
getRobotRulesSet
public crawlercommons.robots.BaseRobotRules getRobotRulesSet(Protocol http,
URL url)
- If the rules for a host have not yet been cached, this method sends an HTTP request
to the host corresponding to the URL passed, gets the robots file, parses the rules,
and caches the rules object to avoid re-work in the future.
- Specified by:
getRobotRulesSet
in class RobotRulesParser
- Parameters:
http - The Protocol object
url - URL
- Returns:
robotRules - A BaseRobotRules object for the rules
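The per-host caching described above can be sketched as follows. This is a minimal, self-contained illustration, not Nutch's actual implementation: `RobotRules` is a hypothetical stand-in for `crawlercommons.robots.BaseRobotRules`, and the `"protocol:host:port"` cache key is an assumption about how hosts are distinguished.

```java
import java.net.URL;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Sketch of per-host caching of robots rules: robots.txt is fetched and
// parsed at most once per host, then the parsed rules object is reused.
public class RobotRulesCacheSketch {

    /** Hypothetical stand-in for crawlercommons.robots.BaseRobotRules. */
    static final class RobotRules {
        final String source;
        RobotRules(String source) { this.source = source; }
    }

    private final Map<String, RobotRules> cache = new ConcurrentHashMap<>();

    /** Assumed cache key: protocol, host, and port of the URL. */
    static String cacheKey(URL url) {
        int port = url.getPort() == -1 ? url.getDefaultPort() : url.getPort();
        return url.getProtocol().toLowerCase() + ":" + url.getHost() + ":" + port;
    }

    /** Returns cached rules, fetching and parsing only on a cache miss. */
    RobotRules getRobotRulesSet(URL url) {
        return cache.computeIfAbsent(cacheKey(url),
                key -> fetchAndParse(url)); // runs once per distinct host
    }

    private RobotRules fetchAndParse(URL url) {
        // Real code would issue an HTTP GET for <host>/robots.txt and parse
        // the response; here we only record where the rules came from.
        return new RobotRules(cacheKey(url) + "/robots.txt");
    }

    public static void main(String[] args) throws Exception {
        RobotRulesCacheSketch parser = new RobotRulesCacheSketch();
        RobotRules a = parser.getRobotRulesSet(new URL("http://example.com/a.html"));
        RobotRules b = parser.getRobotRulesSet(new URL("http://example.com:80/b.html"));
        System.out.println(a == b); // same host and port, so the cached object is reused
    }
}
```

Note that two URLs on the same host share one rules object even when their paths differ, which is the re-work the method description says the cache avoids.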
Copyright © 2013 The Apache Software Foundation