folyd / robotstxt

A native Rust port of Google's robots.txt parser and matcher C++ library.

Home Page: https://crates.io/crates/robotstxt

License: Apache License 2.0

Topics: robotstxt, google-robots-parser, rust

robotstxt's Introduction

robotstxt


A native Rust port of Google's robots.txt parser and matcher C++ library.

  • Native Rust port with no third-party crate dependencies
  • Zero unsafe code
  • Preserves all behavior of the original library
  • API consistent with the original library
  • Passes 100% of Google's original tests

Installation

[dependencies]
robotstxt = "0.3.0"

Quick start

use robotstxt::DefaultMatcher;

let mut matcher = DefaultMatcher::default();
// FooBot is disallowed from the entire site, so the check returns false.
let robots_body = "user-agent: FooBot\n\
                   disallow: /\n";
assert_eq!(false, matcher.one_agent_allowed_by_robots(robots_body, "FooBot", "https://foo.com/"));
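The same API reports the allowed case as well. A minimal sketch (the rules and URLs here are illustrative, not from the upstream README):

use robotstxt::DefaultMatcher;

let mut matcher = DefaultMatcher::default();
// Illustrative rules: FooBot may crawl everything except /private/.
let robots_body = "user-agent: FooBot\n\
                   disallow: /private/\n";
assert!(matcher.one_agent_allowed_by_robots(robots_body, "FooBot", "https://foo.com/public/page"));
assert!(!matcher.one_agent_allowed_by_robots(robots_body, "FooBot", "https://foo.com/private/page"));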

About

Quoting the README from Google's robots.txt parser and matcher repo:

The Robots Exclusion Protocol (REP) is a standard that enables website owners to control which URLs may be accessed by automated clients (i.e. crawlers) through a simple text file with a specific syntax. It's one of the basic building blocks of the internet as we know it and what allows search engines to operate.

Because the REP was only a de-facto standard for the past 25 years, different implementers implement parsing of robots.txt slightly differently, leading to confusion. This project aims to fix that by releasing the parser that Google uses.

The library is slightly modified (i.e. some internal headers and equivalent symbols) production code used by Googlebot, Google's crawler, to determine which URLs it may access based on rules provided by webmasters in robots.txt files. The library is released open-source to help developers build tools that better reflect Google's robots.txt parsing and matching.

Crate robotstxt aims to be a faithful conversion, from C++ to Rust, of Google's robots.txt parser and matcher.

Testing

The transcript below builds and runs Google's original C++ test suite against this port, via the C/CMake bridge in the tests/ directory:

$ git clone https://github.com/Folyd/robotstxt
Cloning into 'robotstxt'...
$ cd robotstxt/tests 
...
$ mkdir c-build && cd c-build
...
$ cmake ..
...
$ make
...
$ make test
Running tests...
Test project ~/robotstxt/tests/c-build
    Start 1: robots-test
1/1 Test #1: robots-test ......................   Passed    0.33 sec

License

The robotstxt parser and matcher Rust library is licensed under the terms of the Apache license. See LICENSE for more information.

robotstxt's People

Contributors: folyd

robotstxt's Issues

Panics when slicing inside a multi-byte character

The parser can panic if it slices a string at a byte index that is not a UTF-8 character boundary.

thread '<unnamed>' panicked at 'byte index 58198 is not a char boundary; it is inside '\u{a0}'
...
/root/.cargo/registry/src/github.com-1ecc6299db9ec823/robotstxt-0.3.0/src/parser.rs:169:57
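A hypothetical reproduction sketch (assumed, not taken from the report: it relies on the parser's split point landing inside the two-byte non-breaking space U+00A0 mentioned in the panic message):

use robotstxt::DefaultMatcher;

fn main() {
    // U+00A0 (non-breaking space) occupies two bytes in UTF-8; on affected
    // versions, slicing through it may trigger the char-boundary panic.
    let robots_body = "user-agent: FooBot\u{a0}\ndisallow: /\n";
    let mut matcher = DefaultMatcher::default();
    matcher.one_agent_allowed_by_robots(robots_body, "FooBot", "https://foo.com/");
}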

Crashes on DeviantArt's robots.txt

When parsing https://www.deviantart.com/robots.txt

User-agent: *
Disallow: /*q=
Disallow: /users/*?
Disallow: /join/*?
Disallow: /morelikethis/
Disallow: /download/
Disallow: /checkout/
Disallow: /global/
Disallow: /api/
Disallow: /critiques/
 
Sitemap: http://sitemaps.deviantart.net/sitemap-index.xml.gz

the parser fails with

thread 'main' panicked at 'assertion failed: !val.is_empty()', /home/me/.local/share/cargo/registry/src/github.com-1ecc6299db9ec823/robotstxt-0.2.0/src/parser.rs:207:17

Reproduction:

use robotstxt::DefaultMatcher;

fn main() {
    let robots_content = r#"User-agent: *
Disallow: /*q=
Disallow: /users/*?
Disallow: /join/*?
Disallow: /morelikethis/
Disallow: /download/
Disallow: /checkout/
Disallow: /global/
Disallow: /api/
Disallow: /critiques/
 
Sitemap: http://sitemaps.deviantart.net/sitemap-index.xml.gz"#;
    let mut matcher = DefaultMatcher::default();
    matcher.one_agent_allowed_by_robots(&robots_content, "oldnews", "https://www.deviantart.com/");
}

I'm assuming it is because of the line between the Disallows and the Sitemap, which only contains a single space.
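Until this is fixed upstream, one possible caller-side workaround (an illustrative sketch, not part of the crate) is to drop whitespace-only lines before parsing:

use robotstxt::DefaultMatcher;

// Hypothetical helper: remove lines consisting solely of whitespace,
// which appear to trigger the `!val.is_empty()` assertion.
fn strip_blank_lines(body: &str) -> String {
    body.lines()
        .filter(|line| !line.trim().is_empty())
        .collect::<Vec<_>>()
        .join("\n")
}

fn main() {
    let robots_content = "User-agent: *\nDisallow: /api/\n \nSitemap: http://sitemaps.deviantart.net/sitemap-index.xml.gz";
    let mut matcher = DefaultMatcher::default();
    matcher.one_agent_allowed_by_robots(&strip_blank_lines(robots_content), "oldnews", "https://www.deviantart.com/");
}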

Document test dependencies

While following README.md and trying to build and run the official Google tests, I get this error at the make stage:

/usr/bin/ld: cannot find -labsl::container
collect2: error: ld returned 1 exit status

What should I install to fix this and run the tests?

uname -a
5.8.0-53-generic #60-Ubuntu SMP Thu May 6 07:46:32 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux

Panics when robots.txt is not valid

Fetching https://install.pivpn.io/robots.txt redirects to https://raw.githubusercontent.com/pivpn/pivpn/master/auto_install/install.sh which is a text/plain shell script.

The parser panics when trying to parse that file.

thread 'main' panicked at 'assertion failed: self.type_ == ParseKeyType::Unknown && !self.key_text.is_empty()', /home/me/.local/share/cargo/git/checkouts/robotstxt-269483cb38f6894f/ffe972d/src/parser.rs:88:9

I'm not entirely sure what the correct behavior should be, but simply ignoring unparsable lines seems like a good option.
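In the meantime, a caller can contain the panic with std::panic::catch_unwind. A defensive sketch (the wrapper and its fall-back policy are illustrative, not part of the crate):

use robotstxt::DefaultMatcher;

// Hypothetical wrapper: fall back to "disallowed" if the parser panics.
fn allowed_or_false(body: &str, agent: &str, url: &str) -> bool {
    std::panic::catch_unwind(|| {
        let mut matcher = DefaultMatcher::default();
        matcher.one_agent_allowed_by_robots(body, agent, url)
    })
    .unwrap_or(false)
}

fn main() {
    // A shell script served in place of robots.txt is not valid input.
    let not_robots = "#!/usr/bin/env bash\nset -e\necho hello\n";
    println!("allowed: {}", allowed_or_false(not_robots, "FooBot", "https://install.pivpn.io/"));
}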
