# Parse robots.txt, robots meta and headers
Determine whether a page may be crawled, based on its robots.txt file, robots meta tags and robots headers.
## Support us
<img src="https://github-ads.s3.eu-central-1.amazonaws.com/robots-txt.jpg?t=1" width="419px" />
We invest a lot of resources into creating best-in-class open source packages. You can support us by buying one of our paid products.
We highly appreciate you sending us a postcard from your hometown, mentioning which of our package(s) you are using. You'll find our address on our contact page. We publish all received postcards on our virtual postcard wall.
## Installation
You can install the package via composer:
```bash
composer require spatie/robots-txt
```
## Usage
```php
$robots = Spatie\Robots\Robots::create();

// May this URL be indexed, according to robots.txt, robots meta tags and robots headers?
$robots->mayIndex('https://www.spatie.be/nl/admin');

// May the links on this page be followed?
$robots->mayFollowOn('https://www.spatie.be/nl/admin');
```
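Both methods return a boolean, so you can use them as a gate in your own crawler. Below is a minimal sketch; the `shouldCrawl()` helper and the crawl loop around it are hypothetical, and only `Robots::create()`, `mayIndex()` and `mayFollowOn()` come from the package:

```php
use Spatie\Robots\Robots;

// Hypothetical helper: decide whether a crawler should fetch and index a URL.
function shouldCrawl(Robots $robots, string $url): bool
{
    // Respect robots.txt rules, robots meta tags and robots headers.
    return $robots->mayIndex($url);
}

$robots = Robots::create();

$url = 'https://www.spatie.be/nl/admin';

if (shouldCrawl($robots, $url)) {
    // Fetch and index the page here...

    // Only queue the page's outgoing links when following is allowed.
    if ($robots->mayFollowOn($url)) {
        // Extract links and add them to the crawl queue...
    }
}
```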
You can also specify a user agent:
```php
$robots = Spatie\Robots\Robots::create('UserAgent007');
```
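Because the user agent is matched against the `User-agent` groups in robots.txt, two agents can get different answers for the same URL. A small sketch, assuming `https://example.com/robots.txt` contains the rules shown in the comment (the domain, rules and results are illustrative):

```php
use Spatie\Robots\Robots;

// Assume https://example.com/robots.txt contains:
//
//   User-agent: *
//   Disallow:
//
//   User-agent: UserAgent007
//   Disallow: /top-secret
//
$anyAgent = Robots::create();
$agent007 = Robots::create('UserAgent007');

// With the rules above, a generic agent may index the page, while
// UserAgent007 may not (assuming the page itself carries no noindex
// meta tag or X-Robots-Tag header).
$anyAgent->mayIndex('https://example.com/top-secret'); // true
$agent007->mayIndex('https://example.com/top-secret'); // false
```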
By default, `Robots` will look for a `robots.txt` file at `https://host.com/robots.txt`.
Another location can be specified like so:
```php
$robots = Spatie\Robots\Robots::create()
    ->withTxt('https://www.spatie.be/robots-custom.txt');

$robots = Spatie\Robots\Robots::create()
    ->withTxt(__DIR__ . '/public/robots.txt');
```
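Pointing `withTxt()` at a local file is also convenient in tests, where you don't want to hit the network. A hedged sketch using PHPUnit; the test class, fixture path and fixture contents are illustrative:

```php
use PHPUnit\Framework\TestCase;
use Spatie\Robots\Robots;

class RobotsTest extends TestCase
{
    /** @test */
    public function it_disallows_the_admin_area()
    {
        // Illustrative fixture containing:
        //   User-agent: *
        //   Disallow: /admin
        $robots = Robots::create()
            ->withTxt(__DIR__ . '/fixtures/robots.txt');

        $this->assertFalse($robots->mayIndex('https://example.com/admin'));
    }
}
```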
## Testing
```bash
composer test
```
## Changelog
Please see CHANGELOG for more information on what has changed recently.
## Contributing
Please see CONTRIBUTING for details.
## Security Vulnerabilities
Please review our security policy on how to report security vulnerabilities.
## Postcardware
You're free to use this package, but if it makes it to your production environment, we highly appreciate you sending us a postcard from your hometown, mentioning which of our package(s) you are using.
Our address is: Spatie, Kruikstraat 22, 2018 Antwerp, Belgium.
We publish all received postcards on our company website.
## Credits
## License
The MIT License (MIT). Please see License File for more information.