w3c / vc-di-ecdsa
Data Integrity specification for ECDSA using NIST-compliant curves
Home Page: https://w3c.github.io/vc-di-ecdsa/
License: Other
Hi folks,
It looks like between:
ECDSA Cryptosuite v2019
Achieving Data Integrity using ECDSA with NIST-compliant curves
Final Community Group Report 24 July 2022
https://www.w3.org/community/reports/credentials/CG-FINAL-di-ecdsa-2019-20220724/
and
Data Integrity ECDSA Cryptosuites v1.0
Achieving Data Integrity using ECDSA with NIST-compliant curves
W3C Working Draft 23 July 2023
https://www.w3.org/TR/2023/WD-vc-di-ecdsa-20230723/
the intended type for representing P-256 keys went from `EcdsaSecp256r1VerificationKey2019` to `Multikey`.
I can find some earlier discussion about the proliferation of cryptosuites at https://lists.w3.org/Archives/Public/public-vc-wg/2022Jul/0044.html
My questions are: was Multikey the solution selected for streamlining? And can folks use `Multikey` for both new cryptosuites (like P-256) and older curves (like K-256, which currently uses `EcdsaSecp256k1VerificationKey2019`), with the multicodec prefix being all that distinguishes the curves? Seems reasonable, just checking.
cc: @OR13 who hinted at this over in w3c/did-spec-registries#515
Should it be mentioned, e.g. as a security consideration, that the proof type `EcdsaSecp256r1Signature2019` is used by the NYS Excelsior Pass but is not defined by this specification?
More info: spruceid/ssi#330
From the PING's review (w3cping/privacy-request#120):
Is there value in allowing non-deterministic signatures, or should this spec just require the use of RFC 6979, as noted in section 4.2 of the security considerations? This seems like an opportunity for the spec to eliminate behavior that has been implemented incorrectly quite a few times and has led to private-key-reveal issues.
... and follow up from PING:
We reviewed these points today during the PING call and there appeared to be consensus agreement to address these points with the exception that the non-deterministic signatures can be left as SHOULD.
/cc @kdenhartog
We need to ensure that SHA-384 is passed as a parameter / used as the hash in RDFC-1.0 and JCS when using a P-384 key. We also need to ensure that we fetch the verification method early enough in the algorithm steps to get the key size to inform these decisions in implementations.
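A minimal sketch of the selection described above, assuming the key type is probed via its multicodec varint prefix (the function name and prefix-based dispatch are illustrative, not the spec's algorithm text; the normative rule is only that P-256 selects SHA-256 and P-384 selects SHA-384):

```python
import hashlib

def hash_for_key(multikey_prefix: bytes):
    # multicodec varint prefixes: 0x80 0x24 = p256-pub, 0x81 0x24 = p384-pub
    if multikey_prefix == b"\x80\x24":
        return hashlib.sha256   # P-256 key: SHA-2 with 256 bits of output
    if multikey_prefix == b"\x81\x24":
        return hashlib.sha384   # P-384 key: SHA-2 with 384 bits of output
    raise ValueError("unsupported key type for this cryptosuite")

# the selected function is then passed to RDFC-1.0 (or used with JCS)
assert hash_for_key(b"\x80\x24")().digest_size == 32
assert hash_for_key(b"\x81\x24")().digest_size == 48
```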
The fact that JavaScript (on some platforms) uses Uint8Array to express byte strings / byte arrays should not escape into the proof serialization format. The proof data serializes bytes, it doesn't matter that JavaScript in a browser interacts with bytes using a Uint8Array interface. In other words, CBOR has a special tag that can be inserted to very specifically indicate that a Uint8Array is used -- when the simpler major type 2 expression (byte string) will do instead.
We should tell implementers to use major type 2 for simplicity (and not force implementers to work with Uint8Arrays, especially when working in other languages that express bytes in another way that wouldn't result in a special CBOR tag) but perhaps note that it does not affect the security / verifiability of the proofs as it's just a detail of the serialization format of the proof value.
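As a concrete illustration of the point above, here is a sketch that hand-encodes the two CBOR forms (short lengths only, for brevity): the RFC 8746 typed-array tag for Uint8Array only adds header bytes on top of the plain major type 2 byte string, without changing the payload:

```python
def cbor_byte_string(b: bytes) -> bytes:
    """Encode a short byte string as CBOR major type 2 (length < 24 only)."""
    assert len(b) < 24
    return bytes([0x40 | len(b)]) + b          # 0x40 = major type 2

def cbor_tagged_uint8array(b: bytes) -> bytes:
    """Same payload wrapped in RFC 8746 tag 64 (Uint8Array), i.e. 0xd8 0x40."""
    return b"\xd8\x40" + cbor_byte_string(b)

payload = b"\x01\x02\x03"
plain = cbor_byte_string(payload)
tagged = cbor_tagged_uint8array(payload)
assert plain == b"\x43\x01\x02\x03"            # major type 2, length 3
assert tagged == b"\xd8\x40" + plain           # tag adds 2 bytes, nothing else
```

Since the tag carries no information a verifier needs, requiring plain major type 2 keeps the serialization language-neutral without affecting verifiability.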
To address Orie's comment at w3c/vc-di-eddsa#4 (comment) .
From the PING's review (w3cping/privacy-request#120):
This spec doesn't actually define a privacy section yet; open issues are being used to track the topics that still need to be authored.
/cc @kdenhartog
Hi all, I agree that this suite of signatures is important for compatibility. In addition, given the current work on VC Data Integrity and the EdDSA cryptosuite 2020, we should be able to quickly update this document, complete the algorithm details, and furnish test vectors. Below are some questions/comments prior to generating any pull requests.
Key Formats section 2.1. In the EdDSA specification this is "Verification Methods". Do we want to support the general "Multikey" approach? Do we need to support the specific "EcdsaSecp256r1VerificationKey2019" and "EcdsaSecp384r1VerificationKey2019" types? In EdDSA we had a legacy requirement for the "Ed25519VerificationKey2020" key type. Is that the case here? Note that there are multicodec codes for P-256 and P-384 compressed public keys, and these are used in the did:key spec. The did:key draft gives some of these values.
Note that FIPS PUB 186-4 doesn't define the compressed key format. NIST SP 800-186 discusses point compression, but not the most commonly used encoding. That encoding is defined in RFC 5480 section 2.2: "The first octet of the OCTET STRING indicates whether the key is compressed or uncompressed. The uncompressed form is indicated by 0x04 and the compressed form is indicated by either 0x02 or 0x03 (see 2.3.3 in [SEC1]). The public key MUST be rejected if any other value is included in the first octet." This encoding is used, after further processing, with X.509 certificates and is hence widely supported; it should be explicitly cited in this or a referenced document.
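The RFC 5480 / SEC 1 encoding quoted above can be sketched in a few lines of plain Python. The sample coordinates are those of the well-known NIST P-256 generator point, used here purely as a stand-in for a public key:

```python
# NIST P-256 generator point coordinates (sample values only)
Gx = 0x6B17D1F2E12C4247F8BCE6E563A440F277037D812DEB33A0F4A13945D898C296
Gy = 0x4FE342E2FE1A7F9B8EE7EB4A7C0F9E162BCE33576B315ECECBB6406837BF51F5

def encode_point(x: int, y: int, compressed: bool = True) -> bytes:
    """Encode a P-256 point per SEC 1 section 2.3.3 / RFC 5480 section 2.2."""
    xb = x.to_bytes(32, "big")
    if not compressed:
        return b"\x04" + xb + y.to_bytes(32, "big")  # uncompressed: 0x04 || x || y
    prefix = b"\x03" if y & 1 else b"\x02"           # parity of y selects 0x02/0x03
    return prefix + xb

assert encode_point(Gx, Gy)[0] == 0x03               # this Gy is odd
assert len(encode_point(Gx, Gy)) == 33               # compressed: 1 + 32 bytes
assert encode_point(Gx, Gy, compressed=False)[0] == 0x04
assert len(encode_point(Gx, Gy, compressed=False)) == 65
```

A verifier following the quoted MUST would reject any first octet other than 0x02, 0x03, or 0x04.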
Similarly, in section 2.2 on "Signature Formats": can we use the general "DataIntegrityProof" type from the data integrity spec, along with a cryptosuite designation, rather than two new signature types "EcdsaSecp256r1Signature2019" and "EcdsaSecp384r1Signature2019"? Possible cryptosuite names could be: "ecdsa-secp256r1-2019" and "ecdsa-secp384r1-2019".
Section 3 on "Algorithms" is incomplete. It seems like we would want to use a procedure such as the one in the EdDSA draft, which is explicit about the steps to construct a signature, involving the unsigned document, proof options, canonicalization, hashing, hash combination, and signing.
FIPS PUB 186-4: Digital Signature Standard (DSS), cited in the draft, is deprecated and should be replaced with FIPS 186-5 (2023) and NIST SP 800-186 (2023).
RFC 9053, CBOR Object Signing and Encryption (COSE): Initial Algorithms, states: "Implementations SHOULD use a deterministic version of ECDSA such as the one defined in [RFC6979]." Should this document make a similar statement, since this has been such a common source of security issues? Many libraries conform to RFC 6979.
The example below uses the "DataIntegrityProof" type and was created with the algorithmic approach of the EdDSA draft and a deterministic P-256 signing algorithm (verified against RFC 6979 test vectors). The public key included in the verification method is a multicodec-encoded compressed P-256 key.
{
"@context": [
"https://www.w3.org/ns/credentials/v2",
"https://www.w3.org/ns/credentials/examples/v2"
],
"id": "urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33",
"type": [
"VerifiableCredential",
"AlumniCredential"
],
"name": "Alumni Credential",
"description": "An minimum viable example of an Alumni Credential.",
"issuer": "https://vc.example/issuers/5678",
"validFrom": "2023-01-01T00:00:00Z",
"credentialSubject": {
"id": "did:example:abcdefgh",
"alumniOf": "The School of Examples"
},
"proof": {
"type": "DataIntegrityProof",
"cryptosuite": "ecdsa-secp256r1-2019",
"created": "2023-02-24T23:36:38Z",
"verificationMethod": "https://vc.example/issuers/5678#zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP",
"proofPurpose": "assertionMethod",
"proofValue": "z33SWiBLC2FAc1rAJbWU39PoHbTe7zcrVnDBqk94cVf2UDs54XxMuwaa1rMhafE8xn47Hdu7nqETWhks4VanNe46g"
}
}
Just wondering, since I don't remember this showing up anywhere except recently.
When creating a proof, other custom proof fields might be given, but it looks like the proof configuration algorithm will not include these -- and it should.
In §3.2.5, point (4) renders incorrectly on my screen; it seems that the
<li>
Set |proofConfig|.|@context| to
|unsecuredDocument|.|@context|.
</li>
respec idiom does not work (I presume the @ character is the source of the problem). Probably the safest fix is to use <em> or <i> explicitly, but that is really the editors' decision; I did not want to put this into a PR...
https://www.w3.org/TR/vc-di-ecdsa/#transformation-ecdsa-rdfc-2019 takes a map that is not the result of the Node Map Generation algorithm and passes it to https://www.w3.org/TR/json-ld11-api/#deserialize-json-ld-to-rdf-algorithm, which expects "a map node map, which is the result of the Node Map Generation algorithm".
w3c/json-ld-api#579 reports that this type requirement is unclear, but the discussion there indicates that you're expected to do the extra step in callers.
I'll note that the current text refers to a section from vc-data-integrity as "Section 4: Retrieving Cryptographic Material" instead of "Section 4: Retrieve Verification Method" (what it is currently called in vc-data-integrity).
This should be editorial/informative.
The current proof configuration algorithm seems to make `created` required instead of optional; this should be fixed.
To unify error handling language across this specification (and if desired other cryptosuite specifications) I'd recommend:
Below I show codes used but not in the DI specification, and error conditions without codes. Thoughts/opinions?
These seemed to me to need codes and rigorous handling. Line numbers are approximate. We need to assign these to existing error codes or come up with new ones.
Examples of error conditions currently lacking codes (quoting the current algorithm text):

"... `u`, indicating that it is a multibase-base64url-no-pad-encoded value, throwing an error if it does not."

"... `0xd9`, `0x5d`, and `0x01`, throwing an error if it does not."

NIST has updated the signature standard that includes ECDSA (FIPS 186-5, February 2023). It no longer includes the definition of the curves (P-256 or P-384); these are now defined in NIST SP 800-186, February 2023.
Neither of these documents refers to these curves as "secpr1". The SECG document SEC 2 contains definitions for curves secp256r1, secp384r1, and secp521r1, which are the same as NIST curves P-256, P-384, and P-521 respectively, but the term "secpr1" is not in general use. Note that some ECDSA libraries use the secp256r1/secp384r1/secp521r1 terminology, others use the P-256/P-384/P-521 terminology, and some use both.
I would recommend removing the "secpr1" term from the document: use the more modern NIST P-256, P-384, etc. terminology in general, and add a note on the equivalence to secp256r1 and the other curves.
If there is agreement I can come up with a PR.
Cheers, Greg
In Section 2.2.1, DataIntegrityProof, the proof value is required to be "encoded using the base-58-btc header and alphabet"; however, for the ECDSA-SD variants, 3.4.2 serializeBaseProofValue and 3.4.7 serializeDerivedProofValue require a base64url-no-pad encoding.
Hence it seems that section 2.2.1 needs to be updated to permit the base64url-no-pad encoding as well as base-58-btc.
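A sketch of what a verifier permitting both encodings might do, dispatching on the multibase header byte (the function name and the error string are illustrative assumptions, not spec text):

```python
import base64

def decode_proof_value(pv: str) -> bytes:
    """Decode a multibase proofValue with either the 'u' or 'z' header."""
    if pv.startswith("u"):                      # base64url-no-pad (ECDSA-SD)
        pad = "=" * (-len(pv[1:]) % 4)          # restore stripped padding
        return base64.urlsafe_b64decode(pv[1:] + pad)
    if pv.startswith("z"):                      # base58btc (plain ECDSA suites)
        alphabet = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"
        n = 0
        for ch in pv[1:]:
            n = n * 58 + alphabet.index(ch)
        return n.to_bytes((n.bit_length() + 7) // 8, "big")
    raise ValueError("unsupported multibase header in proofValue")

# both headers decode the same payload
assert decode_proof_value("uaGk") == b"hi"
assert decode_proof_value("z8wr") == b"hi"
```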
Hi,
I've not found any mention of a recommended HMAC key length. Test vector Example 49 uses a 32-byte key. Is that the only allowed length?
Forgive me if I will sound very picky...
§3. Algorithms contains this paragraph:
When the RDF Dataset Canonicalization Algorithm [RDF-CANON] is used with ECDSA algorithms, the cryptographic hashing function that is passed to the algorithm MUST be determined by the size of the associated public key. For P-256 keys, SHA-2 with 256 bits of output is utilized. For P-384 keys, SHA-2 with 384-bits of output is utilized.
(Emphasis is mine).
Which is o.k.; the RDFC reference to SHA indeed makes this possible. What it says is:
The default hash algorithm used by RDFC-1.0, namely, SHA-256 [FIPS-180-4].
Implementations MUST support a parameter to define the hash algorithm, MUST support SHA-256 and SHA-384 [FIPS-180-4], and SHOULD support the ability to specify other hash algorithms.
So, if I use ECDSA with P-256, I do not "pass" anything whatsoever; I use the default RDFC behavior. It is only when I use P-384 that I have to use the hash function parameter of RDFC (and the details of how I do that are implementation-dependent).
Isn't it better to say something more "neutral", like
When the RDF Dataset Canonicalization Algorithm (RDFC-1.0) [RDF-CANON] is used with ECDSA algorithms, the cryptographic hashing function used by RDFC-1.0 MUST depend on the size of the associated public key. For P-256 keys, the SHA-2 with 256 bits of output MUST be utilized, which is the default. For P-384 keys, SHA-2 with 384-bits of output MUST be utilized.
MULTIBASE and MULTICODEC are not standards. Please replace uses of them with normative text defining the contents of any fields using them.
This is related to w3c/vc-data-integrity#191.
With the did:plc method, we have been returning verificationMethod objects with type EcdsaSecp256k1VerificationKey2019 and a publicKeyMultibase string encoded with a leading `z` (indicating base58btc encoding) over the uncompressed public key bytes. This is a different encoding from that used for did:key, which includes a varint multiformat table value indicating the specific key type, followed by the compressed curve public key bytes.
For example:
{
"id": "#atproto",
"type": "EcdsaSecp256k1VerificationKey2019",
"controller": "did:plc:yk4dd2qkboz2yv6tpubpc6co",
"publicKeyMultibase": "zQYEBzXeuTM9UR3rfvNag6L3RNAs5pQZyYPsomTsgQhsxLdEgCrPTLgFna8yqCnxPpNT7DBk6Ym3dgPKNu86vt9GR"
}
where the corresponding key in did:key format is did:key:zQ3shXjHeiBuRCKmM36cuYnm7YEMzhGnCmCyW92sRJ9pribSF.
I had assumed that the uncompressed key format was required in a DID specification document somewhere, but when I went looking just now, I could find no reference in the DID Core 1.0 doc, https://www.w3.org/TR/did-spec-registries, Verifiable Credential Data Integrity 1.0, etc. The last link does talk about how the Multikey type specifically requires a publicKeyMultibase encoding with the multicodec included, but doesn't mention how other types like EcdsaSecp256k1VerificationKey2019 should be encoded.
This GitHub issue is asking to clarify: is there a required multibase format? Is it OK to use the multicodec-based encoding, as used with did:key and Multikey?
The examples need to have their contexts checked to ensure they are up to date. At least one of them is incorrect:
Example 2:
https://w3c.github.io/vc-di-ecdsa/#example-two-public-keys-p-256-and-p-384-encoded-as-multikeys-in-a-controller-document
Should not be using context "https://w3id.org/security/data-integrity/v1" but instead: "https://w3id.org/security/multikey/v1".