Provides functions to download and parse 'robots.txt' files. Ultimately, the package makes it easy to check whether bots (spiders, crawlers, scrapers, ...) are allowed to access specific resources on a domain.
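The core check the package performs (may a given user agent fetch a given path?) can be sketched with Python's standard-library `urllib.robotparser`; this is a conceptual analogue, not the R package's API, and the inline robots.txt below stands in for one downloaded from a domain:

```python
from urllib.robotparser import RobotFileParser

# A small inline robots.txt (the package would normally download this
# from https://<domain>/robots.txt).
robots_txt = """\
User-agent: *
Disallow: /private/

User-agent: BadBot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Any bot may fetch public pages, but not anything under /private/.
print(parser.can_fetch("*", "/index.html"))       # True
print(parser.can_fetch("*", "/private/data"))     # False
# BadBot is disallowed everywhere.
print(parser.can_fetch("BadBot", "/index.html"))  # False
```

The same question in R would go through the package's path-checking functions after the robots.txt file has been fetched and parsed.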
![Version](https://anaconda.org/r/r-robotstxt/badges/version.svg)
![Latest release date](https://anaconda.org/r/r-robotstxt/badges/latest_release_date.svg)
![Latest release relative date](https://anaconda.org/r/r-robotstxt/badges/latest_release_relative_date.svg)
![Platforms](https://anaconda.org/r/r-robotstxt/badges/platforms.svg)
![License](https://anaconda.org/r/r-robotstxt/badges/license.svg)
![Downloads](https://anaconda.org/r/r-robotstxt/badges/downloads.svg)