privacy-scaling-explorations / halo2curves
License: Other
The recent merge of #69 may make it very difficult to update halo2_proofs and other associated libraries, which are currently architected to depend on halo2curves trait implementations (not zkcrypto/pairing).
I recently gave a go at updating halo2curves in those repos, and the number of cascading changes across repos is very large; the trait mismatches between halo2curves and zkcrypto/pairing will make updating things a bit of a challenge.
Just wanted to flag this as a potential maintainability concern.
In #54 we extended the macros to support curves with a != 0, but this also affects the performance of the remaining curves with a = 0; the performance drop is roughly as follows:
We should make the macros more modular so that the curves with a = 0 can still benefit from the faster algorithm.
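As a minimal illustration of what "more modular" could look like (the names and structure here are hypothetical, not the actual macro API), the doubling tangent term is one place where a = 0 curves can skip work:

```rust
// Hypothetical sketch: gate the extra `a * Z^4` work behind a curve
// constant, so curves with a = 0 keep the cheaper tangent computation.
// Shown over a toy prime field (u64 mod P); not the real macro code.
const P: u64 = 1_000_003;

trait CurveParams {
    const A: u64; // curve coefficient `a` in y^2 = x^3 + a*x + b
}

struct AZero; // stands in for curves like bn256 or secp256k1
impl CurveParams for AZero {
    const A: u64 = 0;
}

// Numerator of the tangent slope used in Jacobian doubling:
// M = 3*X^2 + a*Z^4; the second term vanishes when a = 0.
fn tangent_numerator<C: CurveParams>(x: u64, z: u64) -> u64 {
    let m = 3 * (x * x % P) % P;
    if C::A == 0 {
        m // a = 0: the a*Z^4 multiplications are skipped entirely
    } else {
        let zz = z * z % P;
        (m + C::A * (zz * zz % P)) % P
    }
}
```

Since `C::A` is a constant, the branch can be resolved at compile time, which is the effect a modular macro split should achieve.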
Consider:
zkcrypto/group#48
zcash/pasta_curves#75
which may be relevant to the local integration of the pasta curves, as well as the downstream integration in Nova (microsoft/Nova#198).
/cc @CPerezz
When using the PSE fork of halo2 together with halo2_gadgets, it turns out that many gadgets require F: PrimeFieldBits (e.g. struct LookupRangeCheckConfig<F: PrimeFieldBits, const K: usize>), which is not implemented for any fields except those of the pasta curves.
Is there a good reason for that? Am I overlooking something?
Thanks :)
Random point generation also clears the cofactor just after finding a valid point on the curve. In G2 of Pluto, it seems that either the cofactor clearing or the subgroup check is not correct. Reproduce with:

```rust
use group::cofactor::CofactorGroup;
let point = G2::random(OsRng);
assert!(bool::from(point.is_on_curve()));
assert!(bool::from(point.is_torsion_free()));
```
Would be nice to create macros that generate big testing suites for all the different field and curve implementations we have in the lib.
Recently I have been working on a project that involves both halo2curves and arkworks. I found that the serialization of bn254 curve points such as G1 differs between the two projects: when serializing, halo2curves stores the sign as bit 6 of the last byte, and the point at infinity as bit 7. See
halo2curves/src/derive/curve.rs
Line 138 in a3f15e4
I have dealt with this subtle difference in my project. However, I wonder whether there exists a common standard for (de)serializing field elements and curve points. If there isn't, I wonder how we could change the situation, unify serialization across the many other zk projects, and make proof parsing more portable.
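For concreteness, the bit layout described above amounts to something like the following sketch. The helper names are mine, not the halo2curves API; only the bit positions come from the issue text:

```rust
// Sketch of the flag packing described above for a 32-byte compressed
// encoding: bit 6 of the last byte carries the y-sign, bit 7 marks the
// point at infinity. Hypothetical helpers, for illustration only.
fn pack_flags(x_bytes: [u8; 32], y_sign: bool, is_infinity: bool) -> [u8; 32] {
    let mut repr = x_bytes;
    if y_sign {
        repr[31] |= 1 << 6;
    }
    if is_infinity {
        repr[31] |= 1 << 7;
    }
    repr
}

fn unpack_flags(repr: &[u8; 32]) -> (bool, bool) {
    (repr[31] >> 6 & 1 == 1, repr[31] >> 7 & 1 == 1)
}
```

Other libraries pack their flags differently, which is exactly why a shared standard would help with portability.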
Hey all!
Recent changes in privacy-scaling-explorations/halo2#111 have greatly improved serialization performance, which is great! However, as halo2curves::pasta::Fp etc. do not currently implement SerdeObject, it can make serializing a VerifyingKey rather difficult when using the pasta curves.
Are there any plans in motion to implement SerdeObject on those curve structs? If not, I'm happy to give it a go.
It would be nice to have all the curves implement UncompressedEncoding for dependent usage, so we don't need to deserialize the coordinates manually when we get an uncompressed representation.
Currently the CI is missing a clippy check.
New avenues for cycles of recursion, like Goblin and Nova-like folding schemes, have been introduced lately, so in the short term we will probably be experimenting with the bn256/grumpkin cycle.
The upcoming optimized MSM method privacy-scaling-explorations/halo2#40 heavily uses base field inversion. Currently, inversion is an exponentiation by (p-2). It can be optimized by replacing it with one of these:
See more at privacy-scaling-explorations/halo2#174 (comment).
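For reference, the baseline being replaced here is inversion via Fermat's little theorem, a^(p-2) ≡ a^(-1) (mod p). A toy sketch over a machine-word prime (the real field uses multi-limb arithmetic):

```rust
// Baseline field inversion via Fermat's little theorem: a^(p-2) mod p.
// The proposals above replace this full-length exponentiation with a
// cheaper algorithm (e.g. an addition chain or a GCD-based method).
fn pow_mod(mut base: u128, mut exp: u128, modulus: u128) -> u128 {
    let mut acc = 1u128;
    base %= modulus;
    while exp > 0 {
        if exp & 1 == 1 {
            acc = acc * base % modulus;
        }
        base = base * base % modulus;
        exp >>= 1;
    }
    acc
}

fn invert(a: u128, p: u128) -> u128 {
    pow_mod(a, p - 2, p) // valid for a != 0 and p prime
}
```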
If you check https://github.com/privacy-scaling-explorations/halo2curves/actions/runs/7192434158/job/19596041204?pr=111#step:5:25
a bump in the colored package is causing:
error: package `colored v2.1.0` cannot be built because it requires rustc 1.70 or newer, while the currently active rustc version is 1.67.0
We should consider bumping the MSRV again.
My project uses the halo2 lib from https://github.com/privacy-scaling-explorations/halo2/tree/main/halo2_proofs as a dependency. That halo2 lib depends on
halo2curves = { version = "0.6.0", default-features = false }
I got a compilation error for my project:

```
error[E0599]: no method named `ct_ne` found for type `i64` in the current scope
  --> /home/default2/.cargo/registry/src/index.crates.io-6f17d22bba15001f/halo2curves-0.6.0/src/ff_ext/mod.rs:18:25
   |
18 |         self.legendre().ct_ne(&-1)
   |                         ^^^^^ help: there is a method with a similar name: `count_ones`
```

ct_ne was introduced in subtle 2.5, whereas halo2curves depends on 2.4. I think you need to bump the version.
I wrote a unit test since grcov says this part of the code isn't covered by unit tests...

```rust
// Assumed to be in scope: `CurveExt`, `group::GroupEncoding` (for
// `to_bytes`/`from_bytes`), and `rand_core::OsRng`.
fn serdes<G: CurveExt>() {
    for _ in 0..100 {
        let projective_point = G::random(OsRng);
        let affine_point: G::AffineExt = projective_point.into();
        let projective_repr = projective_point.to_bytes();
        let affine_repr = affine_point.to_bytes();
        println!("{:?} \n{:?}", projective_repr.as_ref(), affine_repr.as_ref());
        // let projective_point_rec = G::from_bytes(&projective_repr).unwrap();
        // let projective_point_rec_unchecked = G::from_bytes(&projective_repr).unwrap();
        let affine_point_rec = G::AffineExt::from_bytes(&affine_repr).unwrap();
        // let affine_point_rec_unchecked = G::AffineExt::from_bytes(&affine_repr).unwrap();
        // assert_eq!(projective_point, projective_point_rec);
        // assert_eq!(projective_point, projective_point_rec_unchecked);
        assert_eq!(affine_point, affine_point_rec);
        // assert_eq!(affine_point, affine_point_rec_unchecked);
    }
}
```
It looks like all the other curves pass this test except for the secp256k1 curve, and the failure looks probabilistic:

```
failures:
---- secp256k1::curve::test_curve stdout ----
[75, 102, 146, 216, 157, 139, 235, 249, 33, 4, 229, 181, 130, 54, 246, 104, 112, 38, 114, 128, 247, 203, 212, 155, 135, 16, 79, 169, 89, 142, 96, 31]
[75, 102, 146, 216, 157, 139, 235, 249, 33, 4, 229, 181, 130, 54, 246, 104, 112, 38, 114, 128, 247, 203, 212, 155, 135, 16, 79, 169, 89, 142, 96, 31]
[59, 37, 55, 74, 178, 53, 60, 76, 177, 231, 181, 66, 97, 223, 249, 152, 176, 150, 173, 214, 89, 244, 220, 182, 10, 195, 32, 72, 53, 29, 102, 205]
[59, 37, 55, 74, 178, 53, 60, 76, 177, 231, 181, 66, 97, 223, 249, 152, 176, 150, 173, 214, 89, 244, 220, 182, 10, 195, 32, 72, 53, 29, 102, 205]
[85, 83, 100, 141, 31, 95, 208, 91, 98, 1, 64, 15, 102, 7, 95, 50, 214, 134, 99, 36, 3, 115, 249, 39, 178, 40, 69, 25, 113, 91, 66, 179]
[85, 83, 100, 141, 31, 95, 208, 91, 98, 1, 64, 15, 102, 7, 95, 50, 214, 134, 99, 36, 3, 115, 249, 39, 178, 40, 69, 25, 113, 91, 66, 179]
thread 'secp256k1::curve::test_curve' panicked at 'assertion failed: `(left == right)`
  left: `(0xb3425b71194528b227f97303246386d6325f07660f4001625bd05f1f8d645355, 0x1c9297eeb9587df7bde2784cb11b13101818fb91503c217f75a29d437e14dedd)`,
 right: `(0x33425b71194528b227f97303246386d6325f07660f4001625bd05f1f8d645355, 0xd71b826f4c8aaf3dcae44293010e340981a379fa6b4bcc881a6ce0025eb2396d)`', src/tests/curve.rs:34:9
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
```
Code is here FYI: https://github.com/scroll-tech/halo2curves/blob/94fe3291fc4f4f7c8eacdcf3c748b33c03de53a9/src/tests/curve.rs#L18
At the current state, Blake2b is hardcoded.
This page is empty, which is sad:
Please note the following piece in the Rust documentation (emphasis mine):
https://doc.rust-lang.org/cargo/reference/manifest.html#the-version-field
Before you reach 1.0.0, anything goes, but if you make breaking changes, increment the minor version. In Rust, breaking changes include adding fields to structs or variants to enums.
Current rust versions of popular crates:
From this I take that releasing a crate, especially one suitably marked as experimental in its Readme, is not a big commitment that has to impact the roadmap or constrain the project. However, it is a huge boon for coordinating dependencies and synchronizing efforts around using the crate. It is also a way to get feedback through actual usage of the project.
In other words, I hope that releasing halo2curves can actually make halo2curves better.
The (S)SWU hash_to_curve methods (RFC), introduced in #47 for grumpkin (see also #70 for secp256k1), require determining whether an element is a quadratic residue in several places in the computation.
This is currently implemented as foo.sqrt().is_some(), which resolves to a full square root computation.
I suspect it may be faster to implement a real Legendre symbol there; see e.g. Arkworks:
https://github.com/arkworks-rs/algebra/blob/13fd33e33c1186b2757e3fd3948b6eee16ee9e4b/ff/src/fields/mod.rs#L256-L260
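The underlying idea is Euler's criterion: a^((p-1)/2) mod p is 1 for quadratic residues and p-1 for non-residues, so residuosity can be decided without recovering the actual root. A toy sketch over a small prime:

```rust
// Legendre symbol via Euler's criterion over a small prime:
// returns 1 for quadratic residues, -1 for non-residues, 0 for zero.
fn pow_mod(mut base: u64, mut exp: u64, m: u64) -> u64 {
    let mut acc = 1u64;
    base %= m;
    while exp > 0 {
        if exp & 1 == 1 {
            acc = acc * base % m;
        }
        base = base * base % m;
        exp >>= 1;
    }
    acc
}

fn legendre(a: u64, p: u64) -> i64 {
    match pow_mod(a, (p - 1) / 2, p) {
        0 => 0,
        1 => 1,
        _ => -1, // the result is p - 1 for non-residues
    }
}
```

A dedicated Legendre routine can also use quadratic-reciprocity / GCD-style algorithms that are cheaper than a full exponentiation, which is presumably where the additional win over sqrt() lies.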
To rebase privacy-scaling-explorations/halo2#40 we require the endomorphism encoding for the pasta curves.
The formatting in this repo is a little strange. I ran cargo fmt and it changed 100% of the files. I'm curious how you're configuring the formatting, and whether you have a specific rustfmt file somewhere?
In order to support curves like those in #38, we need to be able to implement traits for them ONLY because of the pairing.rs traits in halo2curves.
This makes merging PRs like #38 really complex, as we're forced to duplicate code; otherwise we'd have a foreign structure implementing a foreign trait.
But we could actually pull the pairing::Engine-related deps from https://crates.io/crates/pairing and implement them inside our crate for the curves defined here (they are already implemented for foreign curves, which is fine).
@han0110 @kilic is there any issue I'm missing that doing this would imply?
The extension fields of the Bn256 base field, Fq2 and Fq6, are constructed by choosing a quadratic/cubic non-residue in their respective base fields. Each of these extensions implements the function multiply_by_nonresidue, which multiplies an element by a non-residue element of the extension field.
However, these extensions each have another function that is identical:
The remaining extension, Fq12, does not seem to have such a duplicated function.
These functions are not used anywhere; I think they are just duplicates and should be removed for clarity.
Edit:
Moreover, the following comments should be corrected:
Line 258 in 6e2ff38
v is the non-residue used for the quadratic extension [Fq12 : Fq6]. It is a quadratic, not cubic, non-residue.
Line 277 in 6e2ff38
If the derive_serde feature is enabled, turn on the "serde" feature in the pasta_curves crate:
derive_serde = ["serde/derive", "serde_arrays", "hex", "pasta_curves/serde"]
Since many people (including our project) use the pasta curves re-export via halo2curves, I think it would be useful to have all base types serializable.
The following branch: https://github.com/huitseeker/halo2curves/tree/test_asm_output
implements a test which draws byte strings using a deterministically-seeded RNG, passes them to hash_to_curve, and looks at the representation of the obtained point:
https://github.com/huitseeker/halo2curves/blob/test_asm_output/src/bn256/mod.rs#L28-L58

```
cargo nextest run --no-capture -E 'test(test_hash_to_curve_print)' --features "asm"
```
Tested on https://github.com/privacy-scaling-explorations/halo2curves/tree/81a078254518a7a4b7c69fab120621deaace9389, but microsoft/Nova#261 shows this applies to v0.4.0 at least, and probably to every version since.
I would suspect a possible missing modular reduction in the field arithmetic somewhere between the vanilla and asm paths, but haven't looked into the cause any deeper.
Original issue/PR: privacy-scaling-explorations/sonobe#76
This trait is not used for anything, and @kilic mentioned that in the past it was not clear whether there was a bug in some implementation of it.
Currently the halo2 CurveExt trait is not fully implemented except for the pasta curves: the function hash_to_curve is unimplemented (see here). The implementation for the pasta curves (here) uses an algorithm valid only for curves with a*b = 0 (see here). This implementation should therefore work for bn256 and secp256k1, which both have a = 0.
My first attempt to copy-paste the pasta implementation over gets stuck on the alloc dependency. Once that's figured out, I'd be happy to open a PR.
The following code snippet, which computes G + G + G + G = 4G and (G + G) + (G + G) = 4G, gives different results for the Add function on the projective representation here:
halo2curves/src/derive/curve.rs
Line 997 in c7f8867

```rust
let generator: G1 = G1::generator();
let generator_two = generator.add(generator);
let generator_three = generator_two.add(generator);
let generator_four = generator_three.add(generator);
println!("G1 generator is -> {:?}", generator);
println!("G1_two is -> {:?}", generator_two);
println!("G1_three is -> {:?}", generator_three);
println!("G1_four is -> {:?}", generator_four);
let res = generator_two.add(generator_two);
println!("G1_four double is -> {:?}", res);
```
An approach like the one used in halo2, using SerdeFormat instead of having 6 different functions, sounds like a good option.
We should make a proposal and implement it.
Some of the field constants (such as DELTA, ZETA, ...) are not necessary for the functionality we are using right now. However, it would be good to add them in order to have a complete implementation of the fields and curves.
Floating point values are truncated when exporting with serde, and there may be other issues with parsing long floating point values. There should be an option users can enable for accurate floats if needed.
Here is one fix that we used within ezkl: https://github.com/zkonduit/ezkl/blob/main/src/graph/input.rs
There was a need to explicitly map floating point values to f64, otherwise values get truncated.
Without custom serialization, for example:
{"input_data": [[0.053262424, ... ]]}
With custom serialization:
data["input_data"] == [[0.05326242372393608, ...]]
```rust
use serde::ser::{Serialize, SerializeStruct, Serializer};

impl Serialize for GraphInput {
    fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
    where
        S: Serializer,
    {
        let mut state = serializer.serialize_struct("GraphInput", 4)?;
        let input_data_f64: Vec<Vec<f64>> = self
            .input_data
            .iter()
            .map(|v| v.iter().map(|&f| f as f64).collect())
            .collect();
        let output_data_f64: Vec<Vec<f64>> = self
            .output_data
            .iter()
            .map(|v| v.iter().map(|&f| f as f64).collect())
            .collect();
        state.serialize_field("input_data", &input_data_f64)?;
        state.serialize_field("output_data", &output_data_f64)?;
        if let Some(processed_inputs) = &self.processed_inputs {
            state.serialize_field("processed_inputs", &processed_inputs)?;
        }
        if let Some(processed_params) = &self.processed_params {
            state.serialize_field("processed_params", &processed_params)?;
        }
        if let Some(processed_outputs) = &self.processed_outputs {
            state.serialize_field("processed_outputs", &processed_outputs)?;
        }
        state.end()
    }
}
```
Some other issues regarding large floats are documented in this issue. There may be a need to enable the float_roundtrip feature flag for more accurate float parsing, but this comes at an added processing-time cost.
```toml
[[bench]]
name = "msm"
harness = false
required-features = ["multicore"]
```

Seems like we had a leftover feature gate for the msm benches.
Originally posted by @kilic in #122 (comment)
The goal of this RFC is to facilitate contributions to the Halo2 backend.
While trying to implement privacy-scaling-explorations/halo2#187, in particular the "extended Jacobian coordinates" steps, our team is significantly slowed down by having to deal with two different repos.
In particular, to accelerate MSM we need to improve the backend in the halo2curves repo, but the benefit can only be benchmarked in halo2, which is a very clunky workflow.
Given that MSM is a significant bottleneck and focus point, there are likely ongoing changes that will be made by backend teams. In particular, there are GPU provers running on the Taiko testnet that use a modified Halo2/halo2curves and would be delighted to have a standard MSM/NTT API for acceleration.
The Halo2-KZG proof system is split into:
The backend repo should allow easy integration and benchmarking of backends and accelerators. For example, I have identified the following accelerator-focused backends that mirror halo2curves on GPU:
And a "frontend" like zkevm-circuit can be built on top.
Stretch goal:
We use additive notation for elliptic curves, not multiplicative notation, so multi_exp should be multi_scalar_mul.
There are actually very few changes needed to implement the first part.
Note: this will make the library depart from upstream Zcash, but this seems to have been the intent from the start, as #29 by @kilic actually moves the MSM into halo2curves.
Furthermore, porting any MSM or FFT enhancement from the Zcash Halo2-IPA should be easy, as those are very self-contained functions.
To be discussed: the idea is that by just exposing the right traits, the acceleration could live in a GPU repo like icicle or sppark (or even a CPU repo like blstrs or constantine).
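To make the trait idea concrete (this is an entirely hypothetical shape, not a proposed final API), the backend boundary could be as small as a single MSM trait:

```rust
// Hypothetical sketch of a pluggable MSM backend trait. A GPU crate
// (icicle, sppark) or CPU crate (blstrs, constantine) would provide its
// own impl; the toy backend below works over u64 addition mod M as a
// stand-in for a real curve group.
trait MsmBackend {
    type Scalar;
    type Point;
    /// Computes sum_i scalars[i] * bases[i] in the group.
    fn msm(&self, scalars: &[Self::Scalar], bases: &[Self::Point]) -> Self::Point;
}

struct NaiveCpu;

impl MsmBackend for NaiveCpu {
    type Scalar = u64;
    type Point = u64; // stand-in: the group op is addition mod M
    fn msm(&self, scalars: &[u64], bases: &[u64]) -> u64 {
        const M: u64 = 1_000_003;
        scalars
            .iter()
            .zip(bases)
            .fold(0, |acc, (s, b)| (acc + s * b % M) % M)
    }
}
```

The prover would be generic over `MsmBackend`, so swapping a CPU implementation for an accelerated one becomes a type-level choice rather than a fork.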
Benefits:
The pairing-friendly curves Pluto and bn256 share a good amount of code that is currently duplicated (once #98 is merged). In order to fix this we should introduce macros for:
This will be a good opportunity to refactor some of the bn256 code, which is in many parts undocumented and at times quite unclear.
Derived from #116 and as discussed offline with @huitseeker: we should add a README section explaining that this lib (like any Rust lib that uses rayon as a dependency) cannot compile to a wasm target.
Not only that, but I think it would also be beneficial to add a compiler warning so that the user gets a hint about the source of the error.
The current group law uses 16 field multiplications.
https://github.com/privacy-scaling-explorations/halo2curves/blob/main/src/derive/curve.rs#L810
We could use the method from https://eprint.iacr.org/2015/1060, which requires only 12 field multiplications.
I have a demo implementation of this algorithm here:
https://github.com/zhenfeizhang/rcb15-group-law
This should improve the proving speed by 25% or so.
(I noticed that neither Halo2 nor Arkworks uses this 8-year-old method. Did I miss anything here?)
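For reference, here is a sketch of the complete addition from that paper (Algorithm 7, for curves with a = 0), transcribed over a toy curve y^2 = x^3 + 7 mod 11 just to show the operation count; the real implementation is in the linked demo repo:

```rust
// RCB'15 (https://eprint.iacr.org/2015/1060) Algorithm 7: complete
// projective addition for a = 0 curves, 12 generic multiplications plus
// two multiplications by b3 = 3*b. Toy curve: y^2 = x^3 + 7 over GF(11).
const P: u64 = 11;
const B3: u64 = 21 % P; // 3 * b, with b = 7

fn add(a: u64, b: u64) -> u64 { (a + b) % P }
fn sub(a: u64, b: u64) -> u64 { (a + P - b) % P }
fn mul(a: u64, b: u64) -> u64 { a * b % P }

// Complete: also correct when p == q (doubling) and when either input
// is the identity (0 : 1 : 0).
fn point_add(p: (u64, u64, u64), q: (u64, u64, u64)) -> (u64, u64, u64) {
    let ((x1, y1, z1), (x2, y2, z2)) = (p, q);
    let mut t0 = mul(x1, x2);
    let t1 = mul(y1, y2);
    let mut t2 = mul(z1, z2);
    let t3 = sub(mul(add(x1, y1), add(x2, y2)), add(t0, t1));
    let t4 = sub(mul(add(y1, z1), add(y2, z2)), add(t1, t2));
    let mut y3 = sub(mul(add(x1, z1), add(x2, z2)), add(t0, t2));
    let x3 = add(t0, t0);
    t0 = add(x3, t0); // t0 = 3 * X1 * X2
    t2 = mul(B3, t2);
    let mut z3 = add(t1, t2);
    let t1 = sub(t1, t2);
    y3 = mul(B3, y3);
    let x3 = sub(mul(t3, t1), mul(t4, y3));
    y3 = add(mul(t1, z3), mul(y3, t0));
    t0 = mul(t0, t3);
    z3 = add(mul(z3, t4), t0);
    (x3, y3, z3)
}
```

On the toy curve, G = (2, 2) doubles to (5, 0) in affine coordinates, and the formula above returns the projectively equal point (1 : 0 : 9), since 1 = 5 * 9 mod 11.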
Error log:

```
error[E0432]: unresolved imports `maybe_rayon::current_num_threads`, `maybe_rayon::iter::IndexedParallelIterator`, `maybe_rayon::iter::IntoParallelRefIterator`, `maybe_rayon::slice::ParallelSliceMut`
  --> packages/halo2curves/src/multicore.rs:8:5
   |
 8 |     current_num_threads,
   |     ^^^^^^^^^^^^^^^^^^^ no `current_num_threads` in the root
 9 |     iter::{IndexedParallelIterator, IntoParallelRefIterator},
   |            ^^^^^^^^^^^^^^^^^^^^^^^  ^^^^^^^^^^^^^^^^^^^^^^^ no `IntoParallelRefIterator` in `iter`
   |            |
   |            no `IndexedParallelIterator` in `iter`
10 |     slice::ParallelSliceMut,
   |     ^^^^^^^^^^^^^^^^^^^^^^^ no `ParallelSliceMut` in `slice`
   |
help: a similar name exists in the module
   |
 9 |     iter::{IntoParallelIterator, IntoParallelRefIterator},
   |            ~~~~~~~~~~~~~~~~~~~~
help: a similar name exists in the module
   |
 9 |     iter::{IndexedParallelIterator, IntoParallelIterator},
   |                                     ~~~~~~~~~~~~~~~~~~~~
help: a similar name exists in the module
   |
10 |     slice::ParallelSlice,
   |            ~~~~~~~~~~~~~
For more information about this error, try `rustc --explain E0432`.
error: could not compile `halo2curves` (lib) due to previous error
warning: build failed, waiting for other jobs to finish...
Error: Compiling your crate to WebAssembly failed
Caused by: failed to execute `cargo build`: exited with exit status: 101
full command: cd "/home/runner/work/spartan-ecdsa-experiment/packages/spartan_wasm" && "cargo" "build" "--lib" "--release" "--target" "wasm32-unknown-unknown"
error Command failed with exit code 1.
info Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command.
Error: Process completed with exit code 1.
```
See this discussion for some context.
Hello,
Thanks for developing this library and making it open source! I was checking whether this library is compatible with the EVM precompiles, so I ported an existing example here under ecPairing as a Rust test in src/bn256/curve.rs. All the points in the example, on both G1 and G2, unwrap without a panic (so the specified points are on their curves), but the check on the last line fails: assert_eq!(engine::pairing(&p1, &p2), engine::pairing(&p3, &p4));.
Any idea what's happening? Thanks so much in advance for your help!
```rust
#[test]
fn test_evm_precompile() {
    // test ecPairing
    // (G1) x
    let p1_x = Fq::from_raw([0x86b21fb1b76e18da, 0x5bc9eeb6a3d147c1, 0x86308b7af7af02ac, 0x2cf44499d5d27bb1]);
    // (G1) y
    let p1_y = Fq::from_raw([0xfe16d3242dc715f6, 0x0b0c868df0e7bde1, 0xe69108924926e45f, 0x2c0f001f52110ccf]);
    // (G2) x_1
    let p2_x1 = Fq::from_raw([0xaef31e3cfaff3ebc, 0x61cd63919354bc06, 0x4e2a32234da8212f, 0x1fb19bb476f6b9e4]);
    // (G2) x_0
    let p2_x0 = Fq::from_raw([0x79d9159eca2d98d9, 0x4ffe2f2f3504de8a, 0x914e03e21df544c3, 0x22606845ff186793]);
    // (G2) y_1
    let p2_y1 = Fq::from_raw([0xd3ab3698d63e4f90, 0x48eea9abfdd85d7e, 0xcb5fa81fc26cf3f0, 0x2bd368e28381e8ec]);
    // (G2) y_0
    let p2_y0 = Fq::from_raw([0x950bb16041a0a85e, 0x91e66f59be6bd763, 0xf0ff1743cbac6ba2, 0x2fe02e47887507ad]);
    // (G1) x
    let p3_x = Fq::from_raw([0x1, 0x0, 0x0, 0x0]);
    // (G1) y
    let p3_y = Fq::from_raw([0x3c208c16d87cfd45, 0x97816a916871ca8d, 0xb85045b68181585d, 0x30644e72e131a029]);
    // (G2) x_1
    let p4_x1 = Fq::from_raw([0xaad45f40ec133eb4, 0xede09cc4328f5a62, 0x3caaf13cbf443c1a, 0x1971ff0471b09fa9]);
    // (G2) x_0
    let p4_x0 = Fq::from_raw([0xf5abfef9ab163bc7, 0xd6c104e9e9eff40b, 0x5733cbdddfed0fd8, 0x091058a314182298]);
    // (G2) y_1
    let p4_y1 = Fq::from_raw([0x8fc02c2e907baea2, 0x0af8c212d9dc9acd, 0x96c1f4e453a370eb, 0x2a23af9a5ce2ba27]);
    // (G2) y_0
    let p4_y0 = Fq::from_raw([0x54de6c7e0afdc1fc, 0x422ebc0e834613f9, 0xb548a4487da97b02, 0x23a8eb0b0996252c]);
    let p1 = G1Affine::from_xy(p1_x, p1_y).unwrap();
    let p2 = G2Affine::from_xy(
        Fq2 { c0: p2_x0, c1: p2_x1 },
        Fq2 { c0: p2_y0, c1: p2_y1 },
    ).unwrap();
    let p3 = G1Affine::from_xy(p3_x, p3_y).unwrap();
    let p4 = G2Affine::from_xy(
        Fq2 { c0: p4_x0, c1: p4_x1 },
        Fq2 { c0: p4_y0, c1: p4_y1 },
    ).unwrap();
    assert_eq!(engine::pairing(&p1, &p2), engine::pairing(&p3, &p4));
}
```
The Makefile fails to publish with:

```
Enter passphrase for key '......':
Total 0 (delta 0), reused 0 (delta 0), pack-reused 0
To github.com:privacy-scaling-explorations/halo2curves.git
 * [new tag]         v0.1.0 -> v0.1.0
cargo release publish --execute --verbose
error: no such subcommand: `release`
make: *** [Makefile:16: release] Error 101
```

I don't know why we have the release subcommand there, but it is definitely wrong. I will do a PR ASAP. In any case, the crate is already on crates.io.
Originally posted by @CPerezz in #60 (comment)
The current definition of this trait is:
https://github.com/privacy-scaling-explorations/halo2curves/blob/main/src/bn256/fr.rs#L336-L338
This is not correct: N is meant to be the order of a small multiplicative sub-group, not its generator. The error can be seen if we execute:

```
sage: modulus = 0x30644e72e131a029b85045b68181585d2833e84879b9709143e1f593f0000001
sage: factor(modulus - 1)
2^28 * 3^2 * 13 * 29 * 983 * 11003 * 237073 * 405928799 * 1670836401704629 * 13818364434197438864469338081
sage: GF(modulus).primitive_element() ^ ((modulus - 1) // 2)
21888242871839275222246405745257275088548364400416034343698204186575808495616
sage: print(21888242871839275222246405745257275088548364400416034343698204186575808495616.hex())
30644e72e131a029b85045b68181585d2833e84879b9709143e1f593f0000000
```
According to the documentation in the ff crate:

```rust
pub trait WithSmallOrderMulGroup<const N: u8>: PrimeField {
    /// A field element of small multiplicative order $N$.
    ///
    /// The presence of this element allows you to perform (certain types of)
    /// endomorphisms on some elliptic curves.
    ///
    /// It can be calculated using [SageMath] as
    /// `GF(modulus).primitive_element() ^ ((modulus - 1) // N)`.
    /// Choosing the element of order $N$ that is smallest, when considered
    /// as an integer, may help to ensure consistency.
    ///
    /// [SageMath]: https://www.sagemath.org/
    const ZETA: Self;
}
```
We can see that N should be the order of the multiplicative sub-group, which here has 3 as a generator and order 2.
Luckily, the generic doesn't affect the implementation. And indeed, if you compute ZETA you get the correct result (matching the one used in the repo).
So the only wrong thing is the generic, which should be N = 2.
Edit:
I'm definitely missing something, as pasta_curves always forces the multiplicative subgroup order to be 3. What is true is that with N = 3 we don't actually get the ZETA that is used.
Currently for Secp256k1::hash_to_curve we use the Shallue-van de Woestijne method, but we could also use the Simplified Shallue-van de Woestijne-Ulas method for better performance.
While editing the PR, I accidentally merged it.
The only test that is not passing after my fixes in 1fc3b83 and 2ce9e23 is the sqrt() fn for secp256r1::Fq.
@rrrliu could you try to fix it? I've been trying the whole day and haven't succeeded. Otherwise, I'll pair up with @davidnevadoc and try to fix it.
Apologies for the inconvenience.
Please update all PSE libs with the updated version of halo2curves, or bring back the 0.3.x tags.
The Frobenius map function is implemented in a rather confusing way:
Lines 273 to 275 in 6e2ff38
where the precomputed values are:
Lines 570 to 586 in 6e2ff38
The first value is Fq(-1)^0 = 1. The second value is just the Legendre symbol of -1. -1 is by definition (in this field) a quadratic non-residue, so the second value is -1. As a consequence, the Frobenius map of an even power maps each element to itself, and odd powers are mapped to the conjugate. (For an in-depth explanation, check out this blog post: https://alicebob.modp.net/the-frobenius-endomorphism-with-finite-fields/.)
This is the way it is implemented in zkcrypto/bls12_381: https://github.com/zkcrypto/bls12_381/blob/7de7b9d9c509b9973b35a3241b74bbbea95e700a/src/fp2.rs#L141
Have all the tags disappeared? This means older tags of halo2_proofs don't work anymore.