
shnewto / ashpaper

12 stars, 2 watchers, 3 forks, 148 KB

Rust interpreter for the Esopo language AshPaper, conceived by William Hicks

License: MIT License

Rust 100.00%
esopo esolang esoteric-language esoteric-programming-language interpreter esoteric-interpreter poetry rust rust-lang rust-library

ashpaper's Introduction

ashpaper's People

Contributors

2313499, atul9, crockagile, dependabot[bot], shnewto


ashpaper's Issues

poems, poems, poems

Would love some poems that do interesting things added to the ashpaper-bin/poems directory as .eso files.

unicode support

Need to at least verify that non-ASCII characters don't break anything. For now, it's expected that no Unicode characters will correspond to rules / instructions, but they shouldn't stop us from interpreting some poetry.
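A sanity check might be as simple as the sketch below (ashpaper::program::execute is just a stand-in for the crate's actual entry point; the only expectation is that nothing panics).

#[test]
fn non_ascii_poem_does_not_break_interpretation() {
    // Stand-in entry point; substitute the real one.
    // No particular output is asserted, only that interpretation survives.
    let poem = "café au lait, règle du jeu\nworlds within worlds whisper back\n";
    let _ = ashpaper::program::execute(poem);
}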

Phase out instances of `unwrap`

I'd like to start the work of phasing out all the unwraps going on! But the battle has stalled on what's turning out to be a real thinker... (for me, at least).

To illustrate what I think might be interesting to more people than just me, and at the risk of embarrassing myself, I'm going to go into a little extra detail about what I've been doing to get rid of one instance of unwrap in particular. (Forgive me if I'm missing something extremely obvious. I've been trying not to let APIs convince me it's my fault / that I'm using them wrong; I was inspired by Brian Hicks addressing UX gaslighting users in a talk he gave at Elm in the Spring this year. But all that doesn't mean I'm not using it wrong!)

The offending code is this:

fn parse(line: &str) -> Instructions {
    AshPaper::parse(Rule::program, line)
        .unwrap_or_else(|e| panic!("{}", e))
        .next()
        .unwrap()
}

I'd ultimately like to manually handle all the unwrapping in that function and return an instance of Instructions, but for illustration's sake / brevity of the examples, let's assume we were going to return AshPaper::parse(Rule::program, line). But what is that?

The AshPaper struct is the result of some pest magic.

// #[derive(Parser)] comes from the pest_derive crate; it generates the Rule
// enum from grammar.pest and implements pest::Parser for AshPaper.
#[derive(Parser)]
#[grammar = "grammar.pest"]
pub struct AshPaper;

Instead of digging into the docs, I just put in something that it doesn't return, a String. The compiler knows exactly what it needs and it will tell me! And it did...

    = note: expected type `std::string::String`
               found type `std::result::Result<pest::iterators::Pairs<'_, program::Rule>, pest::error::Error<program::Rule>>`

Ah, okay. So there's some more pest magic happening here with my program's version of that R type parameter. It corresponds to an enum called Rule that's generated from the rules defined in my grammar.pest.

So, I gave that a go. I dropped in

use pest::error::Error;
use pest::iterators::Pairs;

And changed my parse function's signature.

fn parse(line: &str) -> Result<Pairs<'_, Rule>, Error<Rule>> 

And...this:

error[E0283]: type annotations required: cannot resolve `_: pest::Parser<program::Rule>`
   --> ashpaper/src/program.rs:141:5
    |
141 |     Parser::parse(Rule::program, line)
    |     ^^^^^^^^^^^^^
    |
    = note: required by `pest::Parser::parse`

🤔 I'm not sure what to do about that one... So it's time to go to the docs; there's got to be an example of how to do this.

The pest book's section on its parse function says its type is Result< Pairs, Error >... which I have? And the example code uses unwrap 😓 Which I get! Examples would just get bogged down otherwise, and the actual signature is available in the crate's docs. Perusing those led me to try a slightly terser signature. When you're feeling desperate, what isn't on the table, ya know?

The crate's docs say the function signature is this:

fn parse(rule: R, input: &str) -> Result<Pairs<R>, Error<R>>

So, my parse function becomes:

fn parse(line: &str) -> Result<Pairs<Rule>, Error<Rule>>

And...

error[E0283]: type annotations required: cannot resolve `_: pest::Parser<program::Rule>`
   --> ashpaper/src/program.rs:141:5
    |
141 |     Parser::parse(Rule::program, line)
    |     ^^^^^^^^^^^^^
    |
    = note: required by `pest::Parser::parse`

Donk. Same error. Please excuse this next bit if you're troubled by people doing nonsensical things.

I revised the function body, from this:

Parser::parse(Rule::program, line)

to this:

let res: Result<Pairs<Rule>, Error<Rule>> = Parser::parse(Rule::program, line);
res

Again, donk.

error[E0283]: type annotations required: cannot resolve `_: pest::Parser<program::Rule>`
   --> ashpaper/src/program.rs:141:49
    |
141 |     let res: Result<Pairs<Rule>, Error<Rule>> = Parser::parse(Rule::program, line);
    |                                                 ^^^^^^^^^^^^^
    |
    = note: required by `pest::Parser::parse`

error: aborting due to previous error

😅

Maybe this?

    let res: Result<Pairs<Rule>, Error<Rule>> =
        Parser::parse(Rule::program, line) as Result<Pairs<Rule>, Error<Rule>>;
    res

And:

error[E0283]: type annotations required: cannot resolve `_: pest::Parser<program::Rule>`
   --> ashpaper/src/program.rs:142:9
    |
142 |         Parser::parse(Rule::program, line) as Result<Pairs<Rule>, Error<Rule>>;
    |         ^^^^^^^^^^^^^
    |
    = note: required by `pest::Parser::parse`

error: aborting due to previous error

A third donk. And I've tapped out to take a rest. If anyone reading this has a thought to share, please do!
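A postscript, in case it helps anyone picking this up: the snippets that fail all call Parser::parse on the trait itself, which looks like exactly what the "cannot resolve `_: pest::Parser<program::Rule>`" complaint is about; the compiler can't tell which Parser implementation is meant. Calling parse on the concrete AshPaper type, the way the original offending code does, should let inference succeed. A minimal sketch, assuming that's the culprit:

use pest::error::Error;
use pest::iterators::Pairs;
use pest::Parser; // the trait still needs to be in scope for AshPaper::parse to resolve

fn parse(line: &str) -> Result<Pairs<'_, Rule>, Error<Rule>> {
    // Parsing via the concrete type tells the compiler which Parser impl to use.
    AshPaper::parse(Rule::program, line)
}

The caller then gets to decide how to handle the Err case instead of unwrap deciding for them.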

My Spec-Compliant Fork

I thought that this was a really cool project, but was sad to see that it still didn't implement rhyming or alliteration support. It looks like something was in the works, but it seems to be from 2019, so I thought I'd try my hand at it. I would appreciate your thoughts on my attempt and would like to know if you'd consider a pull request.

The reason I'm raising an issue rather than a pull request is that my fork is a substantial deviation from the initial project: it replaces the Pest parser with a more flexible, custom solution. However, I don't know if that's what you want in your project, so it's really up to you.

The parser itself is mainly modeled after the parsing logic in the original Python interpreter, with a few enhancements:

  • all syllable counting is handled at parse time and passed to the executor
  • each instruction passes the register to which it's applicable (eliminating internal state beyond the stack and registers)
  • rhyming and alliteration are both determined at parse time rather than during execution

The syllable counting is done with a custom version of the cmudict crate, called cmudict-fast, that I put together. It optimizes lookup time by loading the entire dictionary into a hashmap in volatile memory, obviously at the cost of startup time. As a backup, it uses the exact same method of counting vowel clusters as the original crate & Python interpreter. Since the original AshPaper interpreter also used the CMU dictionary, programs written for it should also work with this one. It doesn't, however, run identically to your original crate in all cases (for example, you'll notice lovely-poem.eso is missing an 'a' on line 17).
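The vowel-cluster fallback amounts to roughly the sketch below (the general idea, not the literal code): each maximal run of vowels counts as one syllable.

fn vowel_cluster_syllables(word: &str) -> usize {
    let mut count = 0;
    let mut in_cluster = false;
    for c in word.to_lowercase().chars() {
        let is_vowel = matches!(c, 'a' | 'e' | 'i' | 'o' | 'u' | 'y');
        if is_vowel && !in_cluster {
            // A new run of vowels starts a new syllable.
            count += 1;
        }
        in_cluster = is_vowel;
    }
    // Treat every word as having at least one syllable.
    count.max(1)
}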

To make sure the interpreter can be distributed as a single executable, the dictionary is embedded into it, which increases the executable size by about 3.5 MB.
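The embedding itself is nothing fancy; think of something along the lines of the line below, with the lookup hashmap built from that text at startup (the path is illustrative).

// Compile the raw dictionary text into the binary.
static CMUDICT_RAW: &str = include_str!("../resources/cmudict.dict");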

The executor is mostly the same, just modified to use the enumerations from the new parser and with added instruction handling for the new features.

I additionally added a CLI argument that reports how many syllables a string contains, to make it easier to determine the actual syllable count when writing poems.

I'm still in the process of writing comprehensive tests and documenting the code, but the library itself should all be working.

If I remember any other major changes I'll update this appropriately.

In terms of far-off plans, I plan to make an LLVM backend at some point.

Also, thank you - your interpreter was how I originally found the AshPaper language, and it's been a lot of fun to get this working.

Please let me know your thoughts, and if you have any questions, comments, or concerns.

refactor lib with custom errors

Right now, we're just using Result<(),()> to track success or failure. It'd be great to be able to report helpful feedback on failure though.
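A minimal sketch of one shape this could take (names are placeholders, nothing here is settled):

use std::fmt;

#[derive(Debug)]
pub enum AshPaperError {
    Parse(String),
    Execution(String),
}

impl fmt::Display for AshPaperError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            AshPaperError::Parse(msg) => write!(f, "parse error: {}", msg),
            AshPaperError::Execution(msg) => write!(f, "execution error: {}", msg),
        }
    }
}

impl std::error::Error for AshPaperError {}

Library functions could then return Result<T, AshPaperError> instead of Result<(), ()>, and callers would have a message they can actually surface.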

Property tests!

I think the ashpaper lib is a good candidate for some property tests!
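For example, with the proptest crate, a property like "interpreting arbitrary input never panics" might look something like this (ashpaper::program::execute is a placeholder for the real entry point):

use proptest::prelude::*;

proptest! {
    #[test]
    fn interpreting_arbitrary_input_never_panics(poem in ".*") {
        // Placeholder entry point; the property only asserts "no panic".
        let _ = ashpaper::program::execute(&poem);
    }
}

Other invariants (e.g. the same poem always producing the same output) could follow the same pattern.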

Info/debug logs

It's really helpful to see what rules are being evaluated when writing a poegram for this library. The AshPaper spec has great examples of what helpful information looks like.

In an ideal world, info logs would output something like the EXECUTION section of the spec, and debug logs would include the information from the spec's ANNOTATION section.

Not set on info/debug being the actual mechanism though.
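If it did end up using the log crate's levels, a rough sketch of what the two could carry (the function and its arguments are illustrative, not existing code):

use log::{debug, info};

fn trace_step(line_no: usize, line: &str, instruction: &str, stack: &[i64]) {
    // info: roughly the spec's EXECUTION view of what just ran.
    info!("line {:>3}: {:<40} => {}", line_no, line, instruction);
    // debug: extra detail, closer to the spec's ANNOTATION view.
    debug!("          stack after: {:?}", stack);
}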

Don't ignore code in doc tests

Right now, the code in the docs is ignored by the doc tests; it would be great if we could figure out how to get it running.
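For reference, rustdoc only skips fences explicitly marked ignore (or non-Rust ones); a bare fence is compiled and run by cargo test, and no_run is compiled but not executed. So the fix is probably mostly swapping annotations, something like the sketch below (the signature, path, and entry point are illustrative):

/// Runs a poegram and returns its output.
///
/// ```no_run
/// // `no_run` means cargo test compiles this example without executing it;
/// // dropping the annotation entirely would run it as well.
/// let poem = std::fs::read_to_string("poems/lovely-poem.eso").unwrap();
/// let output = ashpaper::program::execute(&poem); // placeholder entry point
/// ```
pub fn execute(input: &str) -> String {
    unimplemented!()
}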

Implement rhyming instruction

End rhyme with previous line: If register 0 < register 1, push the number of
syllables present in the previous line to the stack. Otherwise, push the number of
syllables in the current line to the stack.
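The push half of that is mechanical; a sketch of the semantics as quoted above (register and stack shapes are illustrative):

fn end_rhyme_push(registers: &[i64; 2], stack: &mut Vec<i64>, prev_syllables: i64, cur_syllables: i64) {
    if registers[0] < registers[1] {
        stack.push(prev_syllables);
    } else {
        stack.push(cur_syllables);
    }
}

The harder half is detecting that a line actually end-rhymes with the previous one in the first place.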

One direction this could take is using the rhyme crate. But it's currently not building, and a fix may be a ways off; its issue requires an update in the cmudict project. There is a PR up for that though, so we'll see.

https://gitlab.com/pwoolcoc/cmudict/merge_requests/2

Resolve `cargo doc` warning

✘ cargo doc --verbose --all
warning: output filename collision.
The lib target `ashpaper` in package `ashpaper v0.1.3` has the same output filename as the lib target `ashpaper` in package `ashpaper v0.1.3 (/home/shea/src/ashpaper/ashpaper)`.
Colliding filename is: /home/shea/src/ashpaper/target/doc/ashpaper/index.html
The targets should have unique names.
Consider changing their names to be unique or compiling them separately.
This may become a hard error in the future; see <https://github.com/rust-lang/cargo/issues/6313>.
