# robots_txt 2.1.0
A complete, dependency-less and fully documented `robots.txt` ruleset parser.
## 2.1.0

- Added a `.validate()` method for validating files (see the sketch below).
- Renamed `parser.dart` to `robots.dart`.
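A minimal sketch of how the new validation method might be used, assuming `validate` is exposed statically on `Robots` and signals a malformed file with a `FormatException`; the actual signature may differ:

```dart
import 'package:robots_txt/robots_txt.dart';

void main() {
  const contents = '''
User-agent: *
Disallow: /private/
''';

  // Assumed API shape: a static `validate` that throws on an invalid file.
  try {
    Robots.validate(contents);
    print('The file is valid.');
  } on FormatException catch (exception) {
    print('The file is invalid: $exception');
  }
}
```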
## 2.0.1

- Converted the `onlyApplicableTo` parameter in `Robots.parse()` from a `String` into a `Set` to allow multiple user-agents to be specified at once (sketched below).
- Fixed the `onlyApplicableTo` parameter in `Robots.parse()` not being taken into account.
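A brief sketch of passing several user-agents in a single parse. `onlyApplicableTo` and `Robots.parse()` come from this changelog; the ruleset contents and the `verifyCanAccess` lookup method are assumptions for illustration:

```dart
import 'package:robots_txt/robots_txt.dart';

void main() {
  const contents = '''
User-agent: googlebot
Disallow: /gallery/

User-agent: bingbot
Disallow: /
''';

  // `onlyApplicableTo` now takes a set, so rules for several
  // user-agents can be retained in one pass.
  final robots = Robots.parse(
    contents,
    onlyApplicableTo: const {'googlebot', 'bingbot'},
  );

  // Assumed lookup method; not named in the changelog.
  print(robots.verifyCanAccess('/gallery/', userAgent: 'googlebot'));
}
```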
## 2.0.0

- Additions:
  - Added dependencies:
    - `meta` for static analysis.
  - Added developer dependencies:
    - `test` for testing.
  - Added support for the 'Sitemap' field.
  - Added support for specifying (see the sketch after this list):
    - The precedent rule type for determining whether a certain user-agent can or cannot access a certain path. (`PrecedentRuleType`)
    - The comparison strategy to use for comparing rule precedence. (`PrecedenceStrategy`)
  - Added tests.
- Changes:
  - Bumped the minimum SDK version to `2.17.0` for enhanced enum support.
- Improvements:
  - Made all structs `const` and marked them as `@sealed` and `@immutable`.
- Deletions:
  - Removed dependencies:
    - `sprint`
    - `web_scraper`
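As a rough illustration of the new precedence options, a sketch assuming the precedent rule type is passed to the access check; the parameter name `typePrecedence`, the enum member `disallow`, and `verifyCanAccess` itself are assumptions not confirmed by this changelog:

```dart
import 'package:robots_txt/robots_txt.dart';

void main() {
  const contents = '''
User-agent: *
Allow: /blog/
Disallow: /blog/private/
''';

  final robots = Robots.parse(contents);

  // Assumed parameter and value: when both an `Allow` and a `Disallow`
  // rule match a path, let the `Disallow` rule take precedence.
  final canAccess = robots.verifyCanAccess(
    '/blog/private/',
    userAgent: 'crawler',
    typePrecedence: PrecedentRuleType.disallow,
  );

  print(canAccess);
}
```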
## 1.1.1
- Updated project description.
- Adapted code to lint rules.
## 1.1.0+3
- Improved documentation.
- Bumped year in the license.
## 1.1.0+2
- Updated package description.
- Updated dependency versions.
## 1.1.0+1

- Formatted files in accordance with `dartfmt`.
## 1.1.0

- Fixed the reading of the contents of `robots.txt`.
- Fixed the parsing of rule fields to `Rules`.
- Added `example.dart`.
## 1.0.0
- Initial release.