poly-commit's Introduction

Polynomial Commitments

poly-commit is a Rust library that implements polynomial commitment schemes. This library was initially developed as part of the Marlin paper, and is released under the MIT License and the Apache v2 License (see License).

WARNING: This is an academic prototype, and in particular has not received careful code review. This implementation is NOT ready for production use.

Overview

A polynomial commitment scheme is a cryptographic primitive that enables a party to commit to a polynomial over a given finite field, and then, later on, to reveal desired evaluations of the polynomial along with cryptographic proofs attesting to their correctness.

This library provides various constructions of polynomial commitment schemes. These constructions support committing to multiple polynomials at a time with differing degree bounds, batching multiple evaluation proofs for the same evaluation point into a single one, and batch verification of proofs.

The key properties satisfied by the polynomial commitment schemes are succinctness, extractability, and hiding. See the Marlin paper for definitions of these properties.

Build guide

The library compiles on the stable toolchain of the Rust compiler. To install the latest version of Rust, first install rustup by following the instructions here, or via your platform's package manager. Once rustup is installed, install the Rust toolchain by invoking:

rustup install stable

After that, use cargo (the standard Rust build tool) to build the library:

git clone https://github.com/scipr-lab/poly-commit.git
cd poly-commit
cargo build --release

This library comes with some unit and integration tests. Run these tests with:

cargo test

A benchmarking module is also provided for the commit, open and verify methods, as well as for computing the commitment and proof size. You can add a new benchmark for your scheme following the examples in the pcs/benches directory, or run the existing benchmarks with:

cargo bench

Lastly, this library is instrumented with profiling infrastructure that prints detailed traces of execution time. To enable this, compile with cargo build --features print-trace.

Usage

The PolynomialCommitment trait defines the interface for a polynomial commitment scheme. It is recommended to use the schemes from this crate that implement this trait (e.g., the vanilla KZG10 scheme does not implement it, but the Marlin scheme, which uses KZG10 under the hood, does).

// In this example, we will commit to a single polynomial, open it first at one point, and then batched at two points, and finally verify the proofs.
// We will use the KZG10 polynomial commitment scheme, following the approach from Marlin.

use ark_poly_commit::{Polynomial, marlin_pc::MarlinKZG10, LabeledPolynomial, PolynomialCommitment, QuerySet, Evaluations};
use ark_bls12_377::Bls12_377;
use ark_crypto_primitives::sponge::poseidon::{PoseidonSponge, PoseidonConfig};
use ark_crypto_primitives::sponge::CryptographicSponge;
use ark_ec::pairing::Pairing;
use ark_ff::UniformRand;
use ark_std::test_rng;
use ark_poly::{univariate::DensePolynomial, DenseUVPolynomial};
use rand_chacha::ChaCha20Rng;
use ark_ff::PrimeField;

type UniPoly_377 = DensePolynomial<<Bls12_377 as Pairing>::ScalarField>;
type Sponge_Bls12_377 = PoseidonSponge<<Bls12_377 as Pairing>::ScalarField>;
type PCS = MarlinKZG10<Bls12_377, UniPoly_377, Sponge_Bls12_377>;

let rng = &mut test_rng();

let max_degree = 16; // max degree supported by the scheme with the given public parameters generated by the setup here.

// 1. PolynomialCommitment::setup
// The setup procedure in this example is for demonstration purposes only - typically a setup ceremony would be run to generate the public parameters.
let pp = PCS::setup(max_degree, None, rng).unwrap();

let degree = 10; //degree of our polynomial
let secret_poly = UniPoly_377::rand(degree, rng);

let point_1 = <Bls12_377 as Pairing>::ScalarField::rand(rng);
let point_2 = <Bls12_377 as Pairing>::ScalarField::rand(rng);

let label = String::from("secret_poly");
let labeled_poly = LabeledPolynomial::new(
   label.clone(),
   secret_poly.clone(),
   Some(degree),
   Some(2), // we will open a univariate poly at two points
);

// TODO: replace by https://github.com/arkworks-rs/crypto-primitives/issues/112.
fn test_sponge<F: PrimeField>() -> PoseidonSponge<F> {
   let full_rounds = 8;
   let partial_rounds = 31;
   let alpha = 17;

   let mds = vec![
      vec![F::one(), F::zero(), F::one()],
      vec![F::one(), F::one(), F::zero()],
      vec![F::zero(), F::one(), F::one()],
   ];

   let mut v = Vec::new();
   let mut ark_rng = test_rng();

   for _ in 0..(full_rounds + partial_rounds) {
      let mut res = Vec::new();

      for _ in 0..3 {
         res.push(F::rand(&mut ark_rng));
      }
      v.push(res);
   }
   let config = PoseidonConfig::new(full_rounds, partial_rounds, alpha, mds, v, 2, 1);
   PoseidonSponge::new(&config)
}
let mut test_sponge = test_sponge::<<Bls12_377 as Pairing>::ScalarField>();

// 2. PolynomialCommitment::trim
// Since the setup produced pp with a max degree of 16, and our poly is of degree 10, we can trim the SRS to tailor it to this example.
let (ck, vk) = PCS::trim(&pp, degree, 2, Some(&[degree])).unwrap(); 

// 3. PolynomialCommitment::commit
// The prover commits to the polynomial using their committer key `ck`.
let (comms, states) = PCS::commit(&ck, [&labeled_poly], Some(rng)).unwrap(); 

// 4a. PolynomialCommitment::open
// Opening proof at a single point.
let proof_single = PCS::open(&ck, [&labeled_poly], &comms, &point_1, &mut (test_sponge.clone()), &states, None).unwrap(); 

// 5a. PolynomialCommitment::check
// Verifying the proof at a single point, given the commitment, the point, the claimed evaluation, and the proof.
assert!(PCS::check(&vk, &comms, &point_1, [secret_poly.evaluate(&point_1)], &proof_single, &mut (test_sponge.clone()), Some(rng)).unwrap()); 

let mut query_set = QuerySet::new();
let mut values = Evaluations::new();
for (i, point) in [point_1, point_2].iter().enumerate() {
   query_set.insert((label.clone(), (format!("{}", i), point.clone())));
   let value = secret_poly.evaluate(&point);
   values.insert((label.clone(), point.clone()), value);
}

// 4b. PolynomialCommitment::batch_open
// Some schemes support batch opening proofs. Generate a single proof for opening the polynomial at multiple points.
let proof_batched = PCS::batch_open(
   &ck,
   [&labeled_poly],
   &comms,
   &query_set,
   &mut (test_sponge.clone()),
   &states,
   Some(rng),
).unwrap();

// 5b. PolynomialCommitment::batch_check
assert!(PCS::batch_check(
   &vk,
   &comms,
   &query_set,
   &values,
   &proof_batched,
   &mut (test_sponge.clone()),
   rng,
).unwrap());

License

This library is licensed under either of the following licenses, at your discretion.

  • Apache License, Version 2.0
  • MIT License

Unless you explicitly state otherwise, any contribution that you submit to this library shall be dual licensed as above (as defined in the Apache v2 License), without any additional terms or conditions.

Reference papers

Polynomial Commitments
Aniket Kate, Gregory M. Zaverucha, Ian Goldberg
ASIACRYPT 2010

Sonic: Zero-Knowledge SNARKs from Linear-Size Universal and Updateable Structured Reference Strings
Mary Maller, Sean Bowe, Markulf Kohlweiss, Sarah Meiklejohn
CCS 2019

AuroraLight: Improved Prover Efficiency and SRS Size in a Sonic-Like System
Ariel Gabizon
ePrint, 2019

Marlin: Preprocessing zkSNARKs with Universal and Updatable SRS
Alessandro Chiesa, Yuncong Hu, Mary Maller, Pratyush Mishra, Noah Vesely, Nicholas Ward
EUROCRYPT 2020

Proof-Carrying Data from Accumulation Schemes
Benedikt Bünz, Alessandro Chiesa, Pratyush Mishra, Nicholas Spooner
TCC 2020

Signatures of Correct Computation
Charalampos Papamanthou, Elaine Shi, Roberto Tamassia
TCC 2013

Acknowledgements

This work was supported by: an Engineering and Physical Sciences Research Council grant; a Google Faculty Award; the RISELab at UC Berkeley; and donations from the Ethereum Foundation and the Interchain Foundation.


poly-commit's Issues

Constructor for generic struct kzg10::Commitment

Summary

A generic struct such as kzg10::Commitment<E: PairingEngine>(pub E::G1Affine) without a proper constructor can be hard to instantiate, as the compiler will complain with an "expected type, found struct" error.

Problem Definition

Assume you have a GroupAffine point (SW or TE) which is supposed to be a kzg10::Commitment, and you try to reconstruct the commitment out of the underlying point value - except you can't with the current API.

|                 .map(|[x, y]| Commitment(GroupAffine::new(x, y, false)))
|                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ expected associated type, found struct `ark_ec::short_weierstrass_jacobian::GroupAffine`
|
   = note: expected associated type `<E as PairingEngine>::G1Affine`
                       found struct `ark_ec::short_weierstrass_jacobian::GroupAffine<_>`

(Side note: I concede that technically we can use CanonicalDeserialize to deserialize into a Commitment, but that's a bit cumbersome.)

Proposal

TBD


For Admin Use

  • Not duplicate issue
  • Appropriate labels applied
  • Appropriate contributors tagged
  • Contributor assigned/self-assigned

Support for evaluation-hiding PCSs

Problem Definition

In many PCSs, a prover commits to a polynomial and is subsequently queried about the value of that committed polynomial at a point chosen by the verifier - letting the verifier learn the actual value in the process.

However, this is not the case for all PCSs: sometimes, the verifier just wants to be convinced that a committed polynomial, at a point of their choice, takes a value the prover only commits to, without revealing it in the clear. This is, for instance, the case of Hyrax (cf. section 6 here), which is currently PRed (see #130). It is not unthinkable that more such schemes will be added as the module grows.

The current PolynomialCommitment trait does not handle that situation very elegantly - in particular, the method check receives a list of expected evaluations. How Hyrax was patched to fit this paradigm is outlined in the linked PR.

Proposal

Three possible ways forward come to mind:

  • Leave the trait untouched and have the check method simply disregard the received evaluations in evaluation-hiding PCSs. This seems inelegant and might lead to misuse where the programmer passes a certain evaluation to that function, receives positive confirmation, and thinks those evaluations were matched against the proof and correspond to evaluations of the committed polynomial. This issue could be mitigated with good documentation.
  • Small-impact breaking change: replace the evaluations (which are elements of a finite field) passed to check with an Option wrapper around them (see the sketch below). Each implementor, depending on which of the two paradigms it fits, would panic if it receives the opposite of what it expects and otherwise run business as usual. By "what it expects" I mean Some for PCSs where evaluations are meant to be learnt by the verifier and None for evaluation-hiding ones. This change would indeed be breaking, but the required changes would be minimal.
  • Larger refactor: a separate trait could be added for those schemes, or perhaps an evaluation_hiding flag inside the current trait. Here I don't have many specific proposals, but I am up for discussion. A point in favour of this is that evaluation-hiding PCSs inherently pay a performance cost, and the plug-out-plug-in design aim of the PolynomialCommitment trait in its current form would put them in the same bag as non-hiding PCSs.

At this moment, the second option seems the most appealing. I am happy to explore other possibilities, though.
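
For concreteness, a hedged sketch of what the second option's check signature might look like (hypothetical; only the values parameter changes relative to the current trait method):

fn check<'a>(
    vk: &Self::VerifierKey,
    commitments: impl IntoIterator<Item = &'a LabeledCommitment<Self::Commitment>>,
    point: &'a P::Point,
    // Some(...) for plain PCSs, None for evaluation-hiding ones.
    values: impl IntoIterator<Item = Option<F>>,
    proof: &Self::Proof,
    challenge_generator: &mut ChallengeGenerator<F, S>,
    rng: Option<&mut dyn RngCore>,
) -> Result<bool, Self::Error>
where
    Self::Commitment: 'a;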


For Admin Use

  • Not duplicate issue
  • Appropriate labels applied
  • Appropriate contributors tagged
  • Contributor assigned/self-assigned

Add `&self` parameter to `PolynomialCommitment` trait methods

Summary

Problem Definition

LigeroPCS requires the prover & the verifier to do some Merkle tree operations in the non-interactive setting.
Specifically, the prover's commitment is a Merkle root, and the verifier should check well-formedness of such a commitment against the provided proofs (MT paths) for a couple of leaf indices.

We are trying to use a Merkle Tree from crypto-primitives inside the commit method (and equivalently, Path in the check method) - which requires the parties to provide a concrete instantiation of a CRHScheme and TwoToOneCRHScheme.
Currently we can do it like:

impl<..., C> PolynomialCommitment<F, P, S>
    for Ligero<F, C, ...>
where
    ...,
    C: merkle_tree::Config + 'static,
{
...
    fn check<'a>(
        vk: &Self::VerifierKey,
        ...,
        rng: Option<&mut dyn RngCore>,
    ) -> Result<bool, Self::Error>
    where
        Self::Commitment: 'a,
    {
        let mut rng = rng.unwrap();
        // take the first commitment
        let labeled_commitment = commitments.into_iter().next().unwrap();
        // IS THERE A BETTER WAY?
        let leaf_hash_params = C::LeafHash::setup(&mut rng).unwrap();
        let two_to_one_params = C::TwoToOneHash::setup(&mut rng).unwrap();
        // well-formedness check internally verifies some merkle paths, which requires passing the params
        Self::well_formedness_check(
            labeled_commitment.commitment(),
            &leaf_hash_params,
            &two_to_one_params,
        );

This looks a bit hacky, because probably the rng parameter wasn't meant to be used for constructing structs that are part of the setup.

A more natural way to do this would be to have

pub struct LigeroPCS<
    F: PrimeField,
    C: Config,
    ...,
> {
    leaf_hash_params: C::LeafHash,
    two_to_one_params: C::TwoToOneHash,
}

and then, when we need either of the params, we'd access self.leaf_hash_params, for example. The problem, of course, is that neither the commit nor the check interface exposes a &self parameter, so there is no way to access parameters from the struct implementing PolynomialCommitment.

Proposal

I propose changing the method signature of the associated PolynomialCommitment methods to include a (non-mutable) &self, e.g.

    /// check but with individual challenges
    fn check<'a>(
        &self,
        vk: &Self::VerifierKey,
        commitments: impl IntoIterator<Item = &'a LabeledCommitment<Self::Commitment>>,
        point: &'a P::Point,
        values: impl IntoIterator<Item = F>,
        proof: &Self::Proof,
        challenge_generator: &mut ChallengeGenerator<F, S>,
        rng: Option<&mut dyn RngCore>,
    ) -> Result<bool, Self::Error>
    where
        Self::Commitment: 'a;

The alternative for this Ligero PC is to have the Merkle tree parameters be part of Self::{Verifier,Committer}Key, but that introduces a great deal of additional constraints (serialization, Sync, etc.) which are cumbersome to enforce on C::LeafHash etc.

@Antonio95


For Admin Use

  • Not duplicate issue
  • Appropriate labels applied
  • Appropriate contributors tagged
  • Contributor assigned/self-assigned

kzg10::PreparedVerifierKey is never used

Neither in kzg10 nor in marlin_pc.

If you still plan to employ it, please note that it contains precomputed powers of the generator as prepared G1 elements. It may be worth considering wNAF instead, or at least using E::G1Projective::batch_normalization_into_affine.

Sonic KZG10 can merge some pairing operations before pairing

Summary

In Aleo, we noticed that the Sonic/AuroraLight KZG10 scheme has room for optimization in the pairing equation check.
https://github.com/arkworks-rs/poly-commit/blob/master/src/sonic_pc/mod.rs#L106

Problem Definition

Currently, when Sonic handles k combined comms, it provides k+2 entries to the Miller loop.

         for (degree_bound, comm) in combined_comms.into_iter() {
            let shift_power = if let Some(degree_bound) = degree_bound {
                vk.get_shift_power(degree_bound)
                    .ok_or(Error::UnsupportedDegreeBound(degree_bound))?
            } else {
                vk.prepared_h.clone()
            };

            g1_projective_elems.push(comm);
            g2_prepared_elems.push(shift_power);
        }

        g1_projective_elems.push(-combined_adjusted_witness);
        g2_prepared_elems.push(vk.prepared_h.clone());

        g1_projective_elems.push(-combined_witness);
        g2_prepared_elems.push(vk.prepared_beta_h.clone());

However, this can in fact be reduced to l+2, where l is the number of combined comms that require a degree bound. Basically, all the entries whose second term is prepared_h can be merged together.

Proposal

Sum the combined comms that do not require a degree bound first, and then fold that sum into the existing prepared_h entry for the witness, as sketched below.
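
For illustration, a hedged sketch of the merged loop, reusing the variable names from the snippet above (not a drop-in patch):

        let mut unbounded_sum = -combined_adjusted_witness;
        for (degree_bound, comm) in combined_comms.into_iter() {
            match degree_bound {
                Some(degree_bound) => {
                    // Degree-bounded comms still need their own shift power.
                    let shift_power = vk
                        .get_shift_power(degree_bound)
                        .ok_or(Error::UnsupportedDegreeBound(degree_bound))?;
                    g1_projective_elems.push(comm);
                    g2_prepared_elems.push(shift_power);
                }
                // No degree bound: fold into a single prepared_h entry instead of
                // adding another Miller loop input.
                None => unbounded_sum += comm,
            }
        }

        // One entry on prepared_h covers all unbounded comms and the adjusted witness...
        g1_projective_elems.push(unbounded_sum);
        g2_prepared_elems.push(vk.prepared_h.clone());

        // ...and the usual entry on prepared_beta_h remains.
        g1_projective_elems.push(-combined_witness);
        g2_prepared_elems.push(vk.prepared_beta_h.clone());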


For Admin Use

  • Not duplicate issue
  • Appropriate labels applied
  • Appropriate contributors tagged
  • Contributor assigned/self-assigned

Integrate with ark-sponge

Summary

Currently, opening challenges are univariate, such that challenges = {x, x^2, x^3, ...} where x is random. Now that ark-sponge is stable, we want to use the sponge API instead. Specifically, opening challenges will have two strategies:

  • Univariate: Squeeze a field element x from sponge, and challenges[i] = x^{i+1}
  • Multivariate: For each opening challenge, we squeeze a field element

Challenge Generator and Challenge Strategy

We define the strategy as an enum, and define a challenge generator to replace opening_challenges: dyn Fn(u64)-> F.

#[derive(Copy, Clone)]
pub struct ChallengeGenerator<'a, S: 'a + CryptographicSponge> {
    sponge_state: &'a mut S,
    strategy: ChallengeStrategy
}

pub enum ChallengeStrategy {
    Multivariate, 
    Univariate
}

ChallengeGenerator will have a method next() to get the next random challenge.
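
For illustration, here is a minimal, self-contained sketch of what next() could look like; the extra F type parameter and the univariate_state field are assumptions of this sketch (to keep it compilable), not part of the proposal above.

use ark_crypto_primitives::sponge::CryptographicSponge;
use ark_ff::PrimeField;

pub enum ChallengeStrategy {
    Multivariate,
    Univariate,
}

pub struct ChallengeGenerator<'a, F: PrimeField, S: CryptographicSponge> {
    sponge_state: &'a mut S,
    strategy: ChallengeStrategy,
    // Univariate-only state: the base challenge x and its current power.
    univariate_state: Option<(F, F)>,
}

impl<'a, F: PrimeField, S: CryptographicSponge> ChallengeGenerator<'a, F, S> {
    pub fn next(&mut self) -> F {
        match self.strategy {
            // Multivariate: squeeze a fresh field element for every challenge.
            ChallengeStrategy::Multivariate => self.sponge_state.squeeze_field_elements::<F>(1)[0],
            // Univariate: squeeze x once, then return x, x^2, x^3, ...
            ChallengeStrategy::Univariate => {
                if self.univariate_state.is_none() {
                    let x: F = self.sponge_state.squeeze_field_elements::<F>(1)[0];
                    self.univariate_state = Some((x, F::one()));
                }
                let (x, power) = self.univariate_state.as_mut().unwrap();
                *power *= *x;
                *power
            }
        }
    }
}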

New Interface for PolyCommit

pub trait PolynomialCommitment<F: Field, P: Polynomial<F>, S: CryptographicSponge> {
    fn setup<R: RngCore>(
        max_degree: usize,
        num_vars: Option<usize>,
        rng: &mut R,
    ) -> Result<Self::UniversalParams, Self::Error>;
    
    fn trim(
        pp: &Self::UniversalParams,
        supported_degree: usize,
        supported_hiding_bound: usize,
        enforced_degree_bounds: Option<&[usize]>,
    ) -> Result<(Self::CommitterKey, Self::VerifierKey), Self::Error>;

    fn commit<'a>(
        ck: &Self::CommitterKey,
        polynomials: impl IntoIterator<Item = &'a LabeledPolynomial<F, P>>,
        rng: Option<&mut dyn RngCore>,
    ) -> Result<
        (
            Vec<LabeledCommitment<Self::Commitment>>,
            Vec<Self::Randomness>,
        ),
        Self::Error,
    >
    where
        P: 'a;
    
    fn open<'a>(
        ck: &Self::CommitterKey,
        labeled_polynomials: impl IntoIterator<Item = &'a LabeledPolynomial<F, P>>,
        commitments: impl IntoIterator<Item = &'a LabeledCommitment<Self::Commitment>>,
        point: &'a P::Point,
        opening_challenges: ChallengeGenerator<_, S>,
        rands: impl IntoIterator<Item = &'a Self::Randomness>,
        rng: Option<&mut dyn RngCore>,
    ) -> Result<Self::Proof, Self::Error>;
    
    fn batch_open<'a>(
        ck: &Self::CommitterKey,
        labeled_polynomials: impl IntoIterator<Item = &'a LabeledPolynomial<F, P>>,
        commitments: impl IntoIterator<Item = &'a LabeledCommitment<Self::Commitment>>,
        query_set: &QuerySet<P::Point>,
        opening_challenges: ChallengeGenerator<_, S>,
        rands: impl IntoIterator<Item = &'a Self::Randomness>,
        rng: Option<&mut dyn RngCore>,
    ) -> Result<Self::BatchProof, Self::Error>;
    
    fn check<'a>(
        vk: &Self::VerifierKey,
        commitments: impl IntoIterator<Item = &'a LabeledCommitment<Self::Commitment>>,
        point: &'a P::Point,
        values: impl IntoIterator<Item = F>,
        proof: &Self::Proof,
        opening_challenge: ChallengeGenerator<_, S>,
    ) -> Result<bool, Self::Error>
    where
        Self::Commitment: 'a;
    
    fn batch_check<'a, R: RngCore>(
        vk: &Self::VerifierKey,
        commitments: impl IntoIterator<Item = &'a LabeledCommitment<Self::Commitment>>,
        query_set: &QuerySet<P::Point>,
        evaluations: &Evaluations<P::Point, F>,
        proof: &Self::BatchProof,
        opening_challenge: ChallengeGenerator<_, S>,
    ) -> Result<bool, Self::Error>
    where
        Self::Commitment: 'a;
    
     fn open_combinations<'a>(
        ck: &Self::CommitterKey,
        linear_combinations: impl IntoIterator<Item = &'a LinearCombination<F>>,
        polynomials: impl IntoIterator<Item = &'a LabeledPolynomial<F, P>>,
        commitments: impl IntoIterator<Item = &'a LabeledCommitment<Self::Commitment>>,
        query_set: &QuerySet<P::Point>,
        opening_challenge: ChallengeGenerator<_, S>,
        rands: impl IntoIterator<Item = &'a Self::Randomness>,
        rng: Option<&mut dyn RngCore>,
    ) -> Result<BatchLCProof<F, P, Self>, Self::Error>
    where
        P: 'a,
        Self::Randomness: 'a,
        Self::Commitment: 'a;
    
     fn check_combinations<'a, R: RngCore>(
        vk: &Self::VerifierKey,
        linear_combinations: impl IntoIterator<Item = &'a LinearCombination<F>>,
        commitments: impl IntoIterator<Item = &'a LabeledCommitment<Self::Commitment>>,
        eqn_query_set: &QuerySet<P::Point>,
        eqn_evaluations: &Evaluations<P::Point, F>,
        proof: &BatchLCProof<F, P, Self>,
        opening_challenge: ChallengeGenerator<_, S>,
    ) -> Result<bool, Self::Error>
    where
        Self::Commitment: 'a;
}

For Admin Use

  • Not duplicate issue
  • Appropriate labels applied
  • Appropriate contributors tagged
  • Contributor assigned/self-assigned

Design and use random oracle interface

Currently the API for PC schemes follows the definition in the Marlin paper. This means that instead of generating an opening challenge for batch verification via a random oracle, methods on PolynomialCommitment explicitly take as input an opening challenge.

This reduces flexibility in challenge generation, and also increases the chance of mistakes in challenge generation due to incorrect domain separation or incorrect state chaining.

Hence, we should define a common random oracle interface, and should use this interface consistently in PolynomialCommitment, as well as downstream in marlin. Goals include making the construction modular with respect to the choice of concrete random oracle (e.g., blake2s, Poseidon, etc.), and enabling optimizations like sampling short linear combination coefficients instead of sampling them as powers of opening_challenge.

Prior art includes libraries like merlin, and the ad-hoc implementation in marlin.

Optimization for `open` for linear-code based PCS

Summary

The current interface doesn't allow room for certain optimizations to open.

Problem Definition

Specifically, for linear-code schemes such as Ligero, during commit we compute the Merkle root, and to create the proof in open we recompute the entire Merkle tree in order to send Merkle proofs for a number of leaf indices. Ideally, the open stage shouldn't need to recompute the tree.

Proposal

We introduce a gap between the output of commit and the input to check.

Namely, commit would return an ExtendedCommitment (aka PrivateCommitment?), and verify would only receive the Commitment. We also introduce a trivial into() method that's a no-op in all current schemes, but for linear codes it sheds a few fields, such as the coefficient matrix, the extended coefficient matrix and the Merkle tree, saving the need to recompute them.

We can then leverage this new interface to further simplify the commit and open trait methods: instead of having a separate rands argument, we place the rands on the ExtendedCommitment. Then the into() method would shed the rands field when converting it to be used in check.
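
A hedged sketch of the proposed split, with illustrative names (nothing here is a settled API):

pub struct ExtendedCommitment<C, Aux> {
    // The public commitment sent to the verifier (e.g. the Merkle root for Ligero).
    pub commitment: C,
    // Prover-only data reused by open (e.g. the coefficient matrices, the Merkle tree
    // and the rands), shed before the commitment reaches check.
    pub aux: Aux,
}

impl<C, Aux> ExtendedCommitment<C, Aux> {
    // The trivial conversion: a no-op for current schemes; for linear-code schemes it
    // drops the prover-only data.
    pub fn into_commitment(self) -> C {
        self.commitment
    }
}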

@Antonio95 @autquis @Pratyush


For Admin Use

  • Not duplicate issue
  • Appropriate labels applied
  • Appropriate contributors tagged
  • Contributor assigned/self-assigned

Update `PC::check_combinations` to optional rng

Summary

PC::check is implemented with optional rng:

poly-commit/src/lib.rs

Lines 225 to 233 in 9da67e2

fn check<'a>(
vk: &Self::VerifierKey,
commitments: impl IntoIterator<Item = &'a LabeledCommitment<Self::Commitment>>,
point: &'a P::Point,
values: impl IntoIterator<Item = F>,
proof: &Self::Proof,
challenge_generator: &mut ChallengeGenerator<F, S>,
rng: Option<&mut dyn RngCore>,
) -> Result<bool, Self::Error>

However, rng is mandatory for check_combinations

rng: &mut R,

Even though PC::batch_check forwards it as optional

Some(rng),

Problem Definition

This creates downstream conflicts for types that comply with PC::check and therefore treat the rng as optional. In case the user does not provide an rng, we can defer to

impl<R: RngCore> RngCore for OptionalRng<R> {

Proposal

Update PC::check_combinations and PC::batch_check to take the rng as optional


For Admin Use

  • Not duplicate issue
  • Appropriate labels applied
  • Appropriate contributors tagged
  • Contributor assigned/self-assigned

`PolynomialCommitment` trait simplification

Summary

The PolynomialCommitment trait is a little cumbersome with all the wrapper types.
See if we can simplify the trait and then implement it for all schemes. Some ideas:

  • rename check -> verify - there are a lot of name variations in the literature for the same algorithm, though check seems to be used less frequently; verify is more intuitive IMO.
  • LabeledPolynomial construction is cumbersome: introduce a From conversion on Polynomial<F> with some reasonable defaults (see the sketch after this list), or:
  • combine it with the trim function, which would take the parameters from the max degree of all labeled polys we are working with.
  • infer QuerySet from a vector of (vectors of?) points. Construct Evaluations from QuerySet and a Vec<Polynomial>.
  • can we construct the ChallengeGenerator automatically? We know the sponge, and we know whether the poly is uni- or multi-variate, so this can be part of e.g. setup.
  • split the trait into a "base" trait and "batched" trait (?)
  • document all the methods
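
As a hedged sketch of the LabeledPolynomial idea above, a defaults helper might look like this (hypothetical; ark-poly-commit does not currently provide it):

use ark_ff::Field;
use ark_poly::Polynomial;
use ark_poly_commit::LabeledPolynomial;

fn label_with_defaults<F: Field, P: Polynomial<F>>(poly: P) -> LabeledPolynomial<F, P> {
    // Default label, no enforced degree bound, no hiding bound.
    LabeledPolynomial::new("unlabeled".to_string(), poly, None, None)
}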

See also the jellyfish PCS trait, although the Espresso traits aren't as generic, as they are mostly tailored for KZG.
@alxiong


For Admin Use

  • Not duplicate issue
  • Appropriate labels applied
  • Appropriate contributors tagged
  • Contributor assigned/self-assigned

Curves renamed in Zexe

Hi,

I assume the recent renaming of some curves in Zexe:

This PR renames some modules relating to the concrete curves that are implemented in zexe.

edwards_bls12 -> edwards_on_bls12_377
jubjub -> edwards_on_bls12_381
edwards_sw6 -> edwards_on_cp6_782
sw6 -> cp6_782

upsets Cargo:

cargo build 
error: failed to select a version for `algebra`.
versions that meet the requirements `*` are: 0.1.0

the package `poly-commit` depends on `algebra`, with features: `jubjub` but `algebra` does not have these features.

Okay to commit to a degree-0 polynomial?

When writing tests, we want to forge a circuit with a specific number of constraints and nonzero entries, even though the circuit itself is trivial (we may just write the same LC a million times).

This, however, would fail. A trivial constraint system may cause the A, B, C matrices in Marlin to be too simple; for example, after the "row" and "col" computations, the "val" polynomial could turn out to be a degree-zero polynomial.

Poly-commit, specifically the KZG10 implementation, for some reason refuses to commit to a degree-zero polynomial (during commit and open):

Error::check_degree_is_within_bounds(polynomial.degree(), powers.size())?;

which,

    pub(crate) fn check_degree_is_within_bounds(
        num_coefficients: usize,
        num_powers: usize,
    ) -> Result<(), Self> {
        if num_coefficients < 1 {
            Err(Error::DegreeIsZero)
        } else {
            Self::check_degree_is_too_large(num_coefficients, num_powers)
        }
    }

Reading the KZG10 paper, it does not seem obvious that a degree-zero polynomial cannot be committed to. I would guess that committing to a degree-zero polynomial (which is just a constant) should be possible?

Should we remove this restriction?
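
If the restriction is indeed unnecessary, the relaxed check could reduce to the following (a hedged sketch mirroring the snippet above, not a committed change):

    pub(crate) fn check_degree_is_within_bounds(
        num_coefficients: usize,
        num_powers: usize,
    ) -> Result<(), Self> {
        // Accept constant (degree-zero) polynomials; keep only the upper-bound check.
        Self::check_degree_is_too_large(num_coefficients, num_powers)
    }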

`KZG10` does not implement `PolynomialCommitment`

Summary of Bug

The KZG10 struct, which implements a polynomial commitment scheme on its own, does not implement the PolynomialCommitment trait. Either the trait should be implemented for it, and/or the trait itself should be simplified to make this implementation easier.

Version

latest commit: cafc05e

Commitment to a certain degree polynomial

Hello,

Does this scheme allow committing to a polynomial of a specific degree d, or only to a polynomial of degree at most D? In other words, can you verify the degree of the committed polynomial?

Thanks!

Load G1 and G2 points from a file rather than relying on the setup function

Summary

We're using Arkworks at sifraitech/rust-kzg and are currently undergoing a migration to EIP-4844 by using ethereum/c-kzg-4844 as a reference. The said implementation loads trusted setups from files, e.g., trusted_setup.txt, that hold the following information:

G1Count | G2Count | [G1] | [G2]

In contrast, Arkworks invokes the kzg10::setup method and feeds it an RngCore object to generate the setup, which is not as secure as loading precomputed G1 and G2 points from a file.

Problem Definition

As far as I understand, powers_of_g and powers_of_gamma_g are responsible for holding G1 and G2 points, respectively. But as you can see, the type of powers_of_gamma_g is BTreeMap::<usize, G1Affine>, which by definition cannot hold G2 points. We attempted to replace the values of powers_of_g with G1 points loaded from a setup file, but we are unsure of where to put the G2 points.

Proposal

Allow us to load G1 and G2 points from a file rather than relying on the setup function.


For Admin Use

  • Not duplicate issue
  • Appropriate labels applied
  • Appropriate contributors tagged
  • Contributor assigned/self-assigned

Tests sample enforced_degree_bounds from max_degree

In the test_template and equation_test_template functions in lib.rs, the randomly generated degree_bounds are sampled from a range up to max_degree, when they should instead be bounded by supported_degree. As a result, the degree_bounds passed into the trimming function can be invalid.

Odd usage/API of PolynomialCommitment trait associated error

Hey!
I've been working with the ark-poly-commit library, and I have my own Error enum which contains an ArkPolyCommit error variant.

The issue is that it seems I can't do:

pub fn universal_setup<R: RngCore>(
        max_degree: usize,
        rng: &mut R,
    ) -> Result<UniversalSRS<F, PC>, MyError> {
        PC::setup(max_degree, None, rng)
            .map_err(|err| ark_poly_commit::Error::from)?
    }

Nor

pub fn universal_setup<R: RngCore>(
        max_degree: usize,
        rng: &mut R,
    ) -> Result<UniversalSRS<F, PC>, MyError> {
        PC::setup(max_degree, None, rng)?
    }

And I have the From<ark_poly_commit::Error> for MyError implementation, of course.

The issue I see is that PolynomialCommitment has an associated Error type, and although the associated trait bounds seem correct, I can't make it work, as it requires me to add generics to all my errors where this should not be necessary.
See: https://github.com/arkworks-rs/poly-commit/blob/master/src/lib.rs#L174

Wondering how people sort this out without ugly solutions like https://github.com/ZK-Garage/plonk/blob/master/plonk-core/src/error.rs#L96-L107.

Ideally, can't we remove the Error type from the trait and just leave the regular error? This would make integration within enums much easier.
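
For reference, the generics leakage complained about above looks roughly like this (a hedged sketch with hypothetical names, not code from the library):

#[derive(Debug)]
pub enum MyError<PCError> {
    // Wraps the scheme's associated PolynomialCommitment::Error.
    ArkPolyCommit(PCError),
    Other(String),
}

impl<PCError> MyError<PCError> {
    // Usable as .map_err(MyError::from_pc)? after calling PC::setup.
    pub fn from_pc(err: PCError) -> Self {
        MyError::ArkPolyCommit(err)
    }
}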

Fix and update dependencies to 0.3

Summary

The latest version is not building. We should target published crates instead of GitHub branches so we can publish a new version of this crate.

Problem Definition

Unpublished branches are harder to manage, and they make it harder to keep track of semver and the latest updates.

Proposal

Fix minor issues and prepare to release


For Admin Use

  • Not duplicate issue
  • Appropriate labels applied
  • Appropriate contributors tagged
  • Contributor assigned/self-assigned

Add `Absorb` trait bound on `PCCommitment`

Summary

Using the PCS in a wider context will require absorbing the commitment for Fiat Shamir.

The current challenge is that Absorb is not implemented for AffineRepr, and the current pairing-based schemes in this repo have commitments of the form:

pub struct Commitment<E: Pairing>(
    /// The commitment is a group element.
    pub E::G1Affine,
);

We unfortunately can't add a blanket impl like:

impl<T> Absorb for T
where
    T: AffineRepr,
{
    fn to_sponge_bytes(&self, dest: &mut Vec<u8>) {
        todo!()
    }

    fn to_sponge_field_elements<F: PrimeField>(&self, dest: &mut Vec<F>) {
        todo!()
    }
}

Rust complains "conflicting implementations of trait absorb::Absorb for type u8".

Option 1

We could add Absorb on AffineRepr in the ec crate. However, aside from creating a cyclic dependency from ec to crypto-primitives, I think this is not the best design.

Option 2

One solution is to further restrict G: AffineRepr + Absorb. This propagates into a lot of trait bounds in the poly-commit repo, though. The upside is that it doesn't require any upstream changes, and the two current implementors of AffineRepr, namely the structs short_weierstrass::Affine and twisted_edwards::Affine, already have Absorb implemented.
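
For concreteness, a hedged sketch of the kind of bound Option 2 introduces wherever a commitment must be absorbed (a hypothetical helper, not current code):

use ark_crypto_primitives::sponge::{Absorb, CryptographicSponge};
use ark_ec::pairing::Pairing;
use ark_poly_commit::kzg10::Commitment;

fn absorb_commitment<E: Pairing, S: CryptographicSponge>(sponge: &mut S, comm: &Commitment<E>)
where
    // This is the bound that would propagate through the poly-commit trait bounds.
    E::G1Affine: Absorb,
{
    sponge.absorb(&comm.0);
}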

Option 3

Larger breaking refactor that makes use of the fact that most pairing computations (and certainly our implementations in ec::models) only ever support SW. The proposal is then to rip out the associated type G1Affine from Pairing, and restrict the class of pairings that can be done to SW-curves, so that we can have:

pub struct Commitment<P: SWCurveConfig>(
    /// The commitment is a group element.
    pub Affine<P>,
);

This almost works, as we still have the IPA Commitment struct here in poly-commit, which uses the trait AffineRepr (i.e., not an associated type from Pairing):

pub struct Commitment<G: AffineRepr> {
    /// A Pedersen commitment to the polynomial.
    pub comm: G,
    ...
}

And so mixing in Option 2 would need to happen anyway, but limited to one PCS.

Option 4

Any other ideas?

Several issues with building the code

Summary of Bug

I tried following the directions provided in the README.md. On running cargo build --release, I am getting several issues, some of which are shown in the following image:
[screenshot of the build errors omitted]

Version

git commit hash is: 88c97d2

Steps to Reproduce

My OS is MacOS Ventura 13.1.
My system is MacBook Air M2.

Clang --version:
Apple clang version 14.0.0 (clang-1400.0.29.202)
Target: arm64-apple-darwin22.2.0
Thread model: posix
InstalledDir: /Library/Developer/CommandLineTools/usr/bin

Master branch does not build

Hi,

I am encountering problems when trying to build the master branch of this library. It seems that common dependencies such as ark-ec or ark-serialize fail to build (see the attached screenshots for examples of the encountered errors). However, I have used these libraries in other Rust projects without issues. I tried modifying Cargo.toml to address these issues, but then other dependency problems arose. Can you assist me with this problem? Thanks a lot!

[screenshots of the dependency build errors omitted]
