
genstark's Issues

Reduce proof size

  1. Add proof-of-work to Merkle trees
  2. Potentially refactor low-degree proof structure to avoid redundant sampling

Potentially simplify linear combination for low degree proof

To reduce the size of FRI proofs, polynomials P(x), B(x), and D(x) are combined into a single polynomial using a random linear combination. In Vitalik Buterin's STARKs, Part 3: Into the Weeds this is done by combining P, P·x^steps, B, B·x^steps, and D as follows:

E = k1 * P + k2 * P * x^steps + k3 * B + k4 * B * x^steps + D

This library implements a generalized version of this approach, but it is not clear to me why the linear combination can't be done with just P·x^steps, B·x^steps, and D as:

E = k1 * P * x^steps + k2 * B * x^steps + D

If the above does not sacrifice security, it would simplify the code a little and also make #5 straightforward to implement.
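For concreteness, here is a minimal sketch of both combinations applied point-by-point to evaluation vectors, using plain bigint modular arithmetic (the field, coefficient names, and function names are illustrative only, not genstark's actual API):

const MODULUS = 2n ** 64n - 2n ** 32n + 1n;     // e.g. a 64-bit STARK-friendly prime; any suitable prime works
const mod = (a: bigint) => ((a % MODULUS) + MODULUS) % MODULUS;

// xPowSteps[i] holds x_i^steps for every point x_i of the evaluation domain
function combineFull(p: bigint[], b: bigint[], d: bigint[], xPowSteps: bigint[], k: bigint[]): bigint[] {
    // E = k1*P + k2*P*x^steps + k3*B + k4*B*x^steps + D
    return p.map((pi, i) => mod(
        k[0] * pi + k[1] * pi * xPowSteps[i] +
        k[2] * b[i] + k[3] * b[i] * xPowSteps[i] +
        d[i]));
}

function combineShort(p: bigint[], b: bigint[], d: bigint[], xPowSteps: bigint[], k: bigint[]): bigint[] {
    // proposed simplification: E = k1*P*x^steps + k2*B*x^steps + D
    return p.map((pi, i) => mod(k[0] * pi * xPowSteps[i] + k[1] * b[i] * xPowSteps[i] + d[i]));
}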

Proof size should not depend on the number of boundary and transition constraints

Right now, STARK proofs include values for all P(x), B(x), and D(x) for all spot checks. Specifically, a leaf of the evaluation Merkle tree has the following form:

╒══════ P(x) ═══════╕╒══════ B(x) ═══════╕╒═══════ D(x) ═══════╕

 P0 P1 P2 P3 ..... Pw B0 B1 B2 B3 ..... Bk D0 D1 D2 D3 ..... Dn 
├──┴──┴──┴──┴──┴──┴──┴──┴──┴──┴──┴──┴──┴──┴──┴──┴──┴──┴──┴──┴──┤

This means that the proof size grows with the number of registers w, the number of registers mentioned in boundary constraints k, and the number of transition constraints n.

While we can't do anything about w (we have to include all evaluations of P(x) into the proof), it seems that we should be able to get away with including only linear combinations of B(x) and D(x).

However, to compute a linear combination, we need a source of randomness, and at the time we build a Merkle tree of evaluations, we don't have anything "random" (once it is built, we use the root of the evaluation Merkle tree as the source of randomness).

One approach to address this could be:

  1. Include only P(x) evaluations in the evaluation Merkle tree.
  2. Use the root of the tree to compute random linear combinations of B(x) and D(x).
  3. Put the random linear combinations into another Merkle tree.

If we can re-purpose the tree we use to prove correctness of the random linear combination for the low degree proof, then this will come at almost no extra cost.
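A rough sketch of steps 2 and 3, assuming the coefficients are derived by hashing the root of the P(x) evaluation tree (the helper names below are placeholders, not genstark's actual classes):

import { createHash } from 'crypto';

// derive pseudorandom field coefficients from the root of the P(x) evaluation tree
function coefficientsFromRoot(root: Buffer, count: number, modulus: bigint): bigint[] {
    const result: bigint[] = [];
    for (let i = 0; i < count; i++) {
        const digest = createHash('sha256').update(root).update(Buffer.from([i])).digest();
        result.push(BigInt('0x' + digest.toString('hex')) % modulus);
    }
    return result;
}

// fold all B(x) and D(x) columns into a single value per domain point;
// the second Merkle tree would then be built over the folded column
function foldColumns(columns: bigint[][], k: bigint[], modulus: bigint): bigint[] {
    const domainSize = columns[0].length;
    const folded = new Array<bigint>(domainSize).fill(0n);
    for (let i = 0; i < domainSize; i++) {
        for (let j = 0; j < columns.length; j++) {
            folded[i] = (folded[i] + k[j] * columns[j][i]) % modulus;
        }
    }
    return folded;
}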

Determine appropriate defaults

Currently, there isn't any solid rationale behind the chosen default parameters. Some analysis needs to go into choosing defaults for:

  • extensionFactor
  • exeSpotCheckCount
  • friSpotCheckCount

Probably #3 should be done first.

Conflicting node versions

yarn add v1.13.0
info No lockfile found.
[1/4] 🔍  Resolving packages...
[2/4] 🚚  Fetching packages...
error @guildofweavers/[email protected]: The engine "node" is incompatible with this module. Expected version "12.3.x". Got "12.6.0"
error Found incompatible module
info Visit https://yarnpkg.com/en/docs/cli/add for documentation about this command.
λ  stark-test nvm install 12.3
Downloading and installing node v12.3.1...
Downloading https://nodejs.org/dist/v12.3.1/node-v12.3.1-darwin-x64.tar.xz...
######################################################################## 100.0%
Computing checksum with shasum -a 256
Checksums matched!
Now using node v12.3.1 (npm v6.9.0)
λ  stark-test yarn add @guildofweavers/genstark
yarn add v1.13.0
info No lockfile found.
[1/4] 🔍  Resolving packages...
[2/4] 🚚  Fetching packages...
error @guildofweavers/[email protected]: The engine "node" is incompatible with this module. Expected version "12.6.x". Got "12.3.1"
error Found incompatible module
info Visit https://yarnpkg.com/en/docs/cli/add for documentation about this command.

Typescript compilation error: cannot find namespace 'WebAssembly'

Thanks for the library; looks very interesting.

From a fresh Git clone, after npm install, tsc -p . fails with:

lib/Stark.ts:364:26 - error TS2304: Cannot find name 'WebAssembly'.

364             memory : new WebAssembly.Memory({
                             ~~~~~~~~~~~

lib/Stark.ts:373:28 - error TS2304: Cannot find name 'WebAssembly'.

373         const memory = new WebAssembly.Memory({ initial: initialMemory, maximum: maximumMemory });
                               ~~~~~~~~~~~

node_modules/@guildofweavers/galois/galois.d.ts:7:26 - error TS2503: Cannot find namespace 'WebAssembly'.

7         readonly memory: WebAssembly.Memory;
                           ~~~~~~~~~~~

node_modules/@guildofweavers/merkle/merkle.d.ts:22:26 - error TS2503: Cannot find namespace 'WebAssembly'.

22         readonly memory: WebAssembly.Memory;
                            ~~~~~~~~~~~


Found 8 errors.

I am running node version v12.10.0.

Any ideas? I could try a global declaration in global.d.ts or the like:

declare const WebAssembly: any

but I would have to modify the dependencies as well.

Presumably I have simply misconfigured something locally.
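For reference, the WebAssembly namespace is normally declared by TypeScript's built-in lib files (lib.dom.d.ts), so one possible workaround, assuming the project's tsconfig.json simply omits it, is to add "dom" to the "lib" array. Alternatively, a slightly more typed stub than declare const WebAssembly: any could go into a global .d.ts file; a sketch:

// global.d.ts (sketch) - declares only what the library actually uses
declare namespace WebAssembly {
    class Memory {
        constructor(descriptor: { initial: number; maximum?: number });
        readonly buffer: ArrayBuffer;
    }
}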

Refactor arithmetic script parsing

The current approach uses a self-implemented parser. This works for simple script semantics, but adding functionality to the script language becomes increasingly difficult. A better way would be to use an off-the-shelf lexer/parser to do the work.

Potential options:

Low degree proof fails for small number of steps

Hello,

I am playing with airscript and genstark to see how they work, and I ran into the following problem:

I use this dummy airscript program:

define Foo over prime field (21888242871839275222246405745257275088548364400416034343698204186575808495617) {

        public input public_inputs: element[${num_public_inputs}];
        secret input secret_inputs: element[${num_secret_inputs}];

        // define transition function
        transition ${num_public_inputs+num_secret_inputs} registers {
            for each (public_inputs, secret_inputs) {
                init { yield [public_inputs, secret_inputs]; }
                for steps [1..${depth-1}] { yield $r; }
            }
        }

        // define transition constraints
        enforce ${num_public_inputs+num_secret_inputs} constraints {
            for all steps { enforce transition($r) = $n; }
        }
    }

Basically, it takes a certain number of public and secret inputs, puts them in the registers, and does nothing for a certain number of steps (called depth).

I want to prove the following assertions:

  • At step 0: the registers initialized with public values have the correct values.
  • At step depth-1: all the registers still have their initial values.

When I use genstark to generate a proof with depth = 8, I get the following error:

StarkError: Low degree proof failed: Invalid array length

When I tried to debug it, I found the source of the problem:

In genstark/lib/components/LowDegreeProver.js:

class LowDegreeProver {

    ...

    prove(cEvaluations, domain, maxDegreePlus1) {
        
       ...

        const componentCount = getComponentCount(cEvaluations.length);
        const proof = {
            lcRoot: pTree.root,
            lcProof: lcProof,
            components: new Array(componentCount),
            remainder: []
        };

        ...

    }

With getComponentCount defined as:

function getComponentCount(valueCount) {
    let result = Math.ceil(Math.log2(valueCount) / 2); // round up log(valueCount, 4);
    result -= REMAINDER_SLOTS;
    return Math.min(result, 0);
}

Here, in the case of depth = 8, cEvaluations.length is 32, which means getComponentCount(cEvaluations.length) returns -1!

Then components: new Array(componentCount) tries to create an array of length -1, which throws the error.
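The arithmetic spelled out (REMAINDER_SLOTS is assumed to be 4 here, which is what the observed -1 implies):

// depth = 8  →  extended evaluation domain of 32 points
const valueCount = 32;
const REMAINDER_SLOTS = 4;                          // assumed value, inferred from the observed result

let result = Math.ceil(Math.log2(valueCount) / 2);  // ceil(5 / 2) = 3
result -= REMAINDER_SLOTS;                          // 3 - 4 = -1
console.log(Math.min(result, 0));                   // -1 → new Array(-1) throws RangeError: Invalid array length
console.log(Math.max(result, 0));                   //  0 → the suggested fix yields an empty components array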

This problem disappears when I set depth >= 32.

Is this the expected behavior?

Shouldn't it be return Math.max(result, 0); in getComponentCount?

error when running examples

Hi, I have ts-node v10.7.0 and tried to run ts-node fibonacci.ts from examples/demo, as well as the other examples, and I get the same error below.
Could you advise on how to work around this? Maybe a different version of ts-node is needed?
Thanks.

The error log:
/home/default2/.nvm/versions/node/v15.14.0/lib/node_modules/ts-node/src/index.ts:820
return new TSError(diagnosticText, diagnosticCodes);
^
TSError: ⨯ Unable to compile TypeScript:
../../lib/Stark.ts:101:76 - error TS2345: Argument of type 'unknown' is not assignable to parameter of type 'Error | undefined'.

101 throw new StarkError(`Failed to generate the execution trace`, error);
~~~~~
../../lib/Stark.ts:143:61 - error TS2345: Argument of type 'unknown' is not assignable to parameter of type 'Error | undefined'.

143 throw new StarkError('Low degree proof failed', error);
~~~~~
../../lib/Stark.ts:212:90 - error TS2345: Argument of type 'unknown' is not assignable to parameter of type 'Error | undefined'.

212 error = new StarkError(`Verification of evaluation Merkle proof failed`, error);
~~~~~
../../lib/Stark.ts:242:71 - error TS2345: Argument of type 'unknown' is not assignable to parameter of type 'Error | undefined'.

242 throw new StarkError('Verification of low degree failed', error);
~~~~~

at createTSError (/home/default2/.nvm/versions/node/v15.14.0/lib/node_modules/ts-node/src/index.ts:820:12)
at reportTSError (/home/default2/.nvm/versions/node/v15.14.0/lib/node_modules/ts-node/src/index.ts:824:19)
at getOutput (/home/default2/.nvm/versions/node/v15.14.0/lib/node_modules/ts-node/src/index.ts:1014:36)
at Object.compile (/home/default2/.nvm/versions/node/v15.14.0/lib/node_modules/ts-node/src/index.ts:1322:43)
at Module.m._compile (/home/default2/.nvm/versions/node/v15.14.0/lib/node_modules/ts-node/src/index.ts:1454:30)
at Module._extensions..js (node:internal/modules/cjs/loader:1121:10)
at Object.require.extensions.<computed> [as .ts] (/home/default2/.nvm/versions/node/v15.14.0/lib/node_modules/ts-node/src/index.ts:1458:12)
at Module.load (node:internal/modules/cjs/loader:972:32)
at Function.Module._load (node:internal/modules/cjs/loader:813:14)
at Module.require (node:internal/modules/cjs/loader:996:19) {

diagnosticCodes: [ 2345, 2345, 2345, 2345 ]
}
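These errors come from newer TypeScript versions (4.4+) typing catch-clause variables as unknown under strict settings. A minimal sketch of how the affected throw sites could be adjusted by narrowing the caught value (the surrounding code is paraphrased from the log, not the actual source):

try {
    // ... generate the low degree proof ...
} catch (error) {
    // under useUnknownInCatchVariables, `error` is `unknown`;
    // narrow it before passing it to StarkError(message, cause?)
    const cause = error instanceof Error ? error : new Error(String(error));
    throw new StarkError('Low degree proof failed', cause);
}

Alternatively, setting "useUnknownInCatchVariables": false in the compiler options (or using the TypeScript version the library was built with) should let the existing code compile unchanged.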

Security level of a STARK

There should be a getter on the Stark object to calculate the security level of a specific STARK instance. Something as simple as Stark.SecurityLevel will do. The security level probably depends on:

  • Constraint degree
  • Extension factor
  • Execution trace query count
  • FRI proof query count
  • Security of the underlying hash function
  • Size of the underlying field
  • Anything else?

Need to come up with the exact formula for how all these come together.
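As a starting point, a back-of-the-envelope sketch of how these factors are often combined; this is a rough conjectured-soundness heuristic with illustrative names, not a vetted formula for this library:

interface SecurityParams {
    extensionFactor: number;   // evaluation domain blowup, e.g. 8
    exeQueryCount: number;     // execution trace spot checks
    friQueryCount: number;     // FRI spot checks
    hashDigestBits: number;    // e.g. 256 for SHA-256
    fieldBits: number;         // log2 of the field size
}

// the overall level is capped by the weakest component
function estimateSecurityLevel(p: SecurityParams): number {
    const bitsPerQuery = Math.log2(p.extensionFactor);                        // conjectured soundness per query
    const querySoundness = Math.min(p.exeQueryCount, p.friQueryCount) * bitsPerQuery;
    const hashSoundness = p.hashDigestBits / 2;                               // collision resistance of the hash
    const fieldSoundness = p.fieldBits;                                       // crude cap from the field size
    return Math.floor(Math.min(querySoundness, hashSoundness, fieldSoundness));
}

Constraint degree would further reduce the per-query term; pinning that down exactly is part of what this issue needs to answer.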

Use expressions to define transition function and constraints

Currently, to create a STARK, transition functions and constraints need to be defined as JavaScript functions. A better way would be to parse math expressions into these functions automatically.

For example, a MiMC constraint could be defined as:

N0 - (R0^3 + K0)

Instead of the current form:

function mimcConstraint(this: EvaluationFrame): bigint {
    const r0 = this.getValue(0);        // current value of register 0
    const k0 = this.getConst(0);        // current value of constant 0
    const n0 = this.getNextValue(0);    // value of register 0 at the next step
    return this.sub(n0, this.add(this.exp(r0, 3n), k0));   // N0 - (R0^3 + K0); zero when the constraint holds
}

This would also allow expressing STARK config as a plain JSON file.
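As a sketch of the direction, here is a small recursive-descent evaluator for +, -, *, ^ and parentheses over register references (an illustrative toy to show feasibility, not a proposal for the actual grammar):

// the methods mirror those used by EvaluationFrame above
type Frame = { getValue(i: number): bigint; getConst(i: number): bigint; getNextValue(i: number): bigint };

// evaluates an expression such as "N0 - (R0^3 + K0)" against a frame, modulo a prime
function evaluateExpression(source: string, frame: Frame, modulus: bigint): bigint {
    const tokens = source.match(/[RNK]\d+|\d+|[+\-*^()]/g) ?? [];
    let pos = 0;
    const peek = () => tokens[pos];
    const next = () => tokens[pos++];
    const mod = (x: bigint) => ((x % modulus) + modulus) % modulus;

    const modPow = (base: bigint, exp: bigint): bigint => {
        let result = 1n, b = mod(base), e = exp;
        while (e > 0n) { if (e & 1n) result = mod(result * b); b = mod(b * b); e >>= 1n; }
        return result;
    };

    function atom(): bigint {
        const t = next();
        if (t === '(') { const v = expr(); next(); /* consume ')' */ return v; }
        if (/^R\d+$/.test(t)) return frame.getValue(Number(t.slice(1)));      // current register value
        if (/^N\d+$/.test(t)) return frame.getNextValue(Number(t.slice(1)));  // next-step register value
        if (/^K\d+$/.test(t)) return frame.getConst(Number(t.slice(1)));      // constant value
        return mod(BigInt(t));                                                // numeric literal
    }

    function power(): bigint {                  // right-associative ^
        const base = atom();
        if (peek() === '^') { next(); return modPow(base, power()); }
        return base;
    }

    function term(): bigint {                   // *
        let v = power();
        while (peek() === '*') { next(); v = mod(v * power()); }
        return v;
    }

    function expr(): bigint {                   // + and -
        let v = term();
        while (peek() === '+' || peek() === '-') {
            const op = next();
            v = op === '+' ? mod(v + term()) : mod(v - term());
        }
        return v;
    }

    return expr();
}

With something like this in place, the MiMC constraint above reduces to evaluateExpression('N0 - (R0^3 + K0)', frame, modulus).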
