File "robotparser.cpython-38.pyc"
Full Path: /home/attunedd/public_html/byp/izo/con7ext_sym404/rintoar.txt/lib64/python3.8/urllib/__pycache__/robotparser.cpython-38.pyc
File size: 7.16 KB
MIME-type: text/x-bytecode.python
Charset: 8 bit
The bytecode below decodes to the opening of the standard library's
urllib/robotparser.py (module docstring, imports, and RobotFileParser.__init__):

"""robotparser.py

Copyright (C) 2000 Bastian Kleineidam

You can choose between two licenses when using this package:
1) GNU GPLv2
2) PSF license for Python 2.2

The robots.txt Exclusion Protocol is implemented as specified in
http://www.robotstxt.org/norobots-rfc.txt
"""

import collections
import urllib.parse
import urllib.request

__all__ = ["RobotFileParser"]

RequestRate = collections.namedtuple("RequestRate", "requests seconds")


class RobotFileParser:
    """This class provides a set of methods to read, parse and answer
    questions about a single robots.txt file.

    """

    def __init__(self, url=""):
        self.entries = []
        self.sitemaps = []
        self.default_entry = None
        self.disallow_all = False
        self.allow_all = False
        self.set_url(url)
        self.last_checked = 0
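The class described above is importable as `urllib.robotparser` in any Python 3 installation. A minimal usage sketch, assuming a hypothetical site `example.com` and an illustrative pair of robots.txt rules (not taken from this dump):

```python
from urllib.robotparser import RobotFileParser

# Feed robots.txt rules directly via parse(); read() would fetch them
# from a URL instead. The rules here are made up for illustration.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# can_fetch(useragent, url) answers the "may I crawl this?" question.
print(rp.can_fetch("MyBot", "https://example.com/private/page"))  # False
print(rp.can_fetch("MyBot", "https://example.com/index.html"))    # True
```

Using `parse()` with an explicit list of lines keeps the example offline; in a real crawler you would call `rp.set_url(...)` followed by `rp.read()`.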
The next fragments decode to the mtime() and modified() methods (the dump
is truncated at the start of the following method):

    def mtime(self):
        """Returns the time the robots.txt file was last fetched.

        This is useful for long-running web spiders that need to
        check for new robots.txt files periodically.

        """
        return self.last_checked

    def modified(self):
        """Sets the time the robots.txt file was last fetched to the
        current time.

        """
        import time
        self.last_checked = time.time()
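Per the docstrings above, `mtime()` reports when the robots.txt file was last fetched (0 if never) and `modified()` stamps the current time, which lets a long-running spider decide when to re-fetch. A small sketch:

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
print(rp.mtime())    # 0: nothing has been fetched yet

rp.modified()        # record "now" as the last-fetched time
print(rp.mtime() > 0)  # True: a Unix timestamp is now stored
```

A spider would typically compare `time.time() - rp.mtime()` against a refresh interval and call `rp.read()` again when it grows too large.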