Provides functions to download and parse 'robots.txt' files. Ultimately, the package makes it easy to check whether bots (spiders, crawlers, scrapers, ...) are allowed to access specific resources on a domain.
copied from cf-staging / r-robotstxt

Label | Latest Version
---|---
main | 0.7.15
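As a quick sketch of the described use case, the snippet below checks whether a path on a domain may be crawled. It assumes the `robotstxt` package's `paths_allowed()` function and uses `example.com` as a placeholder domain; a network connection is required to fetch the live robots.txt file.

```r
# Load the robotstxt package (install.packages("robotstxt") if missing)
library(robotstxt)

# Ask whether the default user agent may access a specific path.
# Returns TRUE if allowed, FALSE otherwise.
# Note: example.com is a placeholder; fetching requires network access.
allowed <- paths_allowed(
  paths  = "/some/page",
  domain = "example.com"
)
print(allowed)
```

The result is a logical vector, one element per path checked, so multiple resources can be tested in a single call.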