
vc-di-ecdsa's Issues

Confirming `EcdsaSecp256r1VerificationKey2019` -> `Multikey` transition

Hi folks,

It looks like between:

ECDSA Cryptosuite v2019
Achieving Data Integrity using ECDSA with NIST-compliant curves
Final Community Group Report 24 July 2022
https://www.w3.org/community/reports/credentials/CG-FINAL-di-ecdsa-2019-20220724/

and

Data Integrity ECDSA Cryptosuites v1.0
Achieving Data Integrity using ECDSA with NIST-compliant curves
W3C Working Draft 23 July 2023
https://www.w3.org/TR/2023/WD-vc-di-ecdsa-20230723/

the intended type for representing P-256 keys went from EcdsaSecp256r1VerificationKey2019 to Multikey.

I can find some earlier discussion about the proliferation of cryptosuites at https://lists.w3.org/Archives/Public/public-vc-wg/2022Jul/0044.html

My questions are: was Multikey the solution selected for streamlining? And can folks use Multikey for both newer curves (like P-256) and older curves (like K-256, currently covered by EcdsaSecp256k1VerificationKey2019), with the multicodec prefix being all that distinguishes the curves? Seems reasonable, just checking.

cc: @OR13 who hinted at this over in w3c/did-spec-registries#515
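
For what it's worth, the distinction between curves inside a Multikey comes down to the multicodec prefix of the publicKeyMultibase value. Below is a minimal sketch of that check, assuming the codec values from the multicodec table (secp256k1-pub = 0xe7, p256-pub = 0x1200, p384-pub = 0x1201) and the third-party base58 package; this is an illustration, not something the specification defines.

# Minimal sketch: identify the curve from a Multikey publicKeyMultibase by
# reading its multicodec varint prefix. Codec values are assumptions taken
# from the multicodec table, not from this specification.
import base58  # pip install base58

CODECS = {0xe7: "secp256k1 (K-256)", 0x1200: "P-256", 0x1201: "P-384"}

def read_varint(data: bytes):
    """Decode an unsigned LEB128 varint; return (value, bytes consumed)."""
    value, shift = 0, 0
    for i, b in enumerate(data):
        value |= (b & 0x7F) << shift
        if not b & 0x80:
            return value, i + 1
        shift += 7
    raise ValueError("truncated varint")

def curve_of_multikey(public_key_multibase: str) -> str:
    if not public_key_multibase.startswith("z"):
        raise ValueError("expected multibase base58btc ('z') encoding")
    raw = base58.b58decode(public_key_multibase[1:])
    codec, _ = read_varint(raw)
    return CODECS.get(codec, "unknown codec 0x%x" % codec)

This is also why the base58btc forms have recognizable prefixes (zDna... for compressed P-256, zQ3s... for compressed secp256k1): the multicodec bytes dominate the start of the encoding.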

Excelsior Pass divergence

Should it be mentioned, e.g. as a security consideration, that the proof type EcdsaSecp256r1Signature2019 is used by the NYS Excelsior Pass, but not in accordance with this specification?
More info: spruceid/ssi#330

Add normative guidance that Deterministic signatures SHOULD be used

From the PING's review (w3cping/privacy-request#120):

Is there value in allowing non-deterministic signatures, or should this spec just require the use of RFC6979? This is noted in section 4.2 of the Security Considerations, but this seems like an opportunity for the spec to eliminate behavior that has been implemented incorrectly quite a few times and has led to private-key-reveal issues.

... and follow up from PING:

We reviewed these points today during the PING call and there appeared to be consensus agreement to address these points with the exception that the non-deterministic signatures can be left as SHOULD.

/cc @kdenhartog
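
For reference, here is a minimal deterministic-signing sketch, assuming the third-party python-ecdsa package (the library choice is mine, not the specification's); it just illustrates the RFC6979 behavior the SHOULD would require.

# Minimal sketch of RFC 6979 deterministic ECDSA using the python-ecdsa
# package. The same key and message always yield the same signature, so no
# per-signature random nonce k can be mishandled and leak the private key.
import hashlib
from ecdsa import SigningKey, NIST256p  # pip install ecdsa

sk = SigningKey.generate(curve=NIST256p)
message = b"test message"

sig1 = sk.sign_deterministic(message, hashfunc=hashlib.sha256)
sig2 = sk.sign_deterministic(message, hashfunc=hashlib.sha256)
assert sig1 == sig2  # deterministic: repeated signing is byte-identical

# The default randomized sk.sign(message) would produce a different
# signature on every call.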

Tell implementers to use major type 2 encoding (not tag 64 -- Uint8Array) for CBOR

The fact that JavaScript (on some platforms) uses Uint8Array to express byte strings / byte arrays should not escape into the proof serialization format. The proof data serializes bytes; it doesn't matter that JavaScript in a browser interacts with bytes through a Uint8Array interface. In other words, CBOR has a special tag that can be inserted to very specifically indicate that a Uint8Array is used, when the simpler major type 2 expression (byte string) will do instead.

We should tell implementers to use major type 2 for simplicity (and not force implementers to work with Uint8Arrays, especially when working in other languages that express bytes in another way that wouldn't result in a special CBOR tag) but perhaps note that it does not affect the security / verifiability of the proofs as it's just a detail of the serialization format of the proof value.
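
A quick illustration of the difference, assuming the cbor2 package (a sketch only, not a normative encoding):

# The same three bytes as a plain CBOR major type 2 byte string vs. wrapped
# in tag 64 (RFC 8746 "uint8 typed array").
import cbor2  # pip install cbor2

payload = bytes([0x01, 0x02, 0x03])

plain = cbor2.dumps(payload)                      # b'\x43\x01\x02\x03'
tagged = cbor2.dumps(cbor2.CBORTag(64, payload))  # b'\xd8\x40\x43\x01\x02\x03'

assert cbor2.loads(plain) == payload
# Both carry the same bytes; the tag just leaks a language-level detail
# (Uint8Array) into the serialized proof data.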

Initial Review, Suggestions, Test Vectors...

Hi all, I agree that this suite of signatures is important for compatibility. In addition, given the current work on VC Data Integrity and the EdDSA cryptosuite 2020, we should be able to quickly update this document, complete the algorithm details, and furnish test vectors. Below are some questions/comments prior to generating any pull requests.

  • Key Formats, section 2.1. In the EdDSA specification this is "Verification Methods". Do we want to support the general "Multikey" approach? Do we need to support the "EcdsaSecp256r1VerificationKey2019" and "EcdsaSecp384r1VerificationKey2019" specific types? In EdDSA we had a legacy requirement for the "Ed25519VerificationKey2020" key type. Is that the case here? Note that there are multicodec codes for P-256 and P-384 compressed public keys and these are used in the did:key spec. The did:key draft gives some of these values.

  • Note that FIPS PUB 186-4 doesn't define the compressed key format. NIST SP 800-186 discusses point compression but not the most commonly used encoding. This is defined in RFC5480 section 2.2: "The first octet of the OCTET STRING indicates whether the key is compressed or uncompressed. The uncompressed form is indicated by 0x04 and the compressed form is indicated by either 0x02 or 0x03 (see 2.3.3 in [SEC1]). The public key MUST be rejected if any other value is included in the first octet." This encoding is used, after further processing, with X.509 certificates and is hence widely supported; it should be explicitly cited in this or a referenced document. (A sketch of the first-octet check follows the example below.)

  • Similarly, in section 2.2 on "Signature Formats", can we use the general "DataIntegrityProof" type from the data integrity spec along with a cryptosuite designation, rather than the two new signature types "EcdsaSecp256r1Signature2019" and "EcdsaSecp384r1Signature2019"? Possible cryptosuite names could be "ecdsa-secp256r1-2019" and "ecdsa-secp384r1-2019".

  • Section 3 on "Algorithms" is incomplete. It seems like we would want to use a procedure such as the one in the EdDSA draft, which is explicit about the steps to construct a signature, involving the unsigned document, proof options, canonicalization, hashing, hash combination, and signing.

  • FIPS PUB 186-4: Digital Signature Standard (DSS) cited in the draft is deprecated and should be replaced with FIPS 186-5 (2023) and NIST SP 800-186 (2023).

  • RFC 9053, CBOR Object Signing and Encryption (COSE): Initial Algorithms, states: "Implementations SHOULD use a deterministic version of ECDSA such as the one defined in [RFC6979]." Should this document make a similar statement, since this has been such a common source of security issues? Many libraries conform to RFC6979.

  • The example below uses the "DataIntegrityProof" type and was created with the algorithmic approach of the EdDSA draft and a deterministic P-256 signing algorithm (verified against RFC6979 test vectors). The public key included in the verification method is a multicodec-encoded compressed P-256 key.

{
  "@context": [
    "https://www.w3.org/ns/credentials/v2",
    "https://www.w3.org/ns/credentials/examples/v2"
  ],
  "id": "urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33",
  "type": [
    "VerifiableCredential",
    "AlumniCredential"
  ],
  "name": "Alumni Credential",
  "description": "An minimum viable example of an Alumni Credential.",
  "issuer": "https://vc.example/issuers/5678",
  "validFrom": "2023-01-01T00:00:00Z",
  "credentialSubject": {
    "id": "did:example:abcdefgh",
    "alumniOf": "The School of Examples"
  },
  "proof": {
    "type": "DataIntegrityProof",
    "cryptosuite": "ecdsa-secp256r1-2019",
    "created": "2023-02-24T23:36:38Z",
    "verificationMethod": "https://vc.example/issuers/5678#zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP",
    "proofPurpose": "assertionMethod",
    "proofValue": "z33SWiBLC2FAc1rAJbWU39PoHbTe7zcrVnDBqk94cVf2UDs54XxMuwaa1rMhafE8xn47Hdu7nqETWhks4VanNe46g"
  }
}
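
As mentioned in the compressed-key bullet above, here is a rough sketch of the RFC5480 first-octet rule and of prepending the did:key-style multicodec prefix to a compressed P-256 key; the p256-pub codec value (0x1200, varint-encoded as 0x80 0x24) and the base58 package are assumptions on my part.

# Sketch only: classify an SEC1-encoded EC point per RFC 5480 section 2.2
# and build a did:key-style multibase string for a compressed P-256 key.
import base58  # pip install base58

def sec1_point_form(point: bytes) -> str:
    if point[0] == 0x04:
        return "uncompressed"
    if point[0] in (0x02, 0x03):
        return "compressed"
    raise ValueError("public key MUST be rejected: first octet is not 0x02, 0x03, or 0x04")

def p256_compressed_to_multikey(compressed: bytes) -> str:
    if sec1_point_form(compressed) != "compressed" or len(compressed) != 33:
        raise ValueError("expected a 33-byte compressed P-256 key")
    # 0x80 0x24 is the varint encoding of the p256-pub multicodec (0x1200).
    return "z" + base58.b58encode(b"\x80\x24" + compressed).decode()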

Editorial (respec?) issue in the proof configuration section

In §3.2.5, point (4) looks like this on my screen:

[Screenshot: rendering of §3.2.5 point (4), 2024-07-18]

it seems that the

           <li>
Set |proofConfig|.|@context| to
|unsecuredDocument|.|@context|.
            </li>

respec idiom does not work (I presume the @ character is the source of the problem). Probably the safest is to use <em> or <i> explicitly, but that is really the editors' decision, so I did not want to put this into a PR...

ecdsa-rdfc-2019 Transformation passes wrong type to "Deserialize JSON-LD"

https://www.w3.org/TR/vc-di-ecdsa/#transformation-ecdsa-rdfc-2019 takes a map that is not the result of the Node Map Generation algorithm and passes it to https://www.w3.org/TR/json-ld11-api/#deserialize-json-ld-to-rdf-algorithm, which expects "a map node map, which is the result of the Node Map Generation algorithm".

w3c/json-ld-api#579 reports that this type requirement is unclear, but the discussion there indicates that you're expected to do the extra step in callers.
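
As an aside, most implementations do not call "Deserialize JSON-LD to RDF" directly; a library entry point performs the expansion and node map generation internally. A rough illustration with pyld (library choice is mine, purely illustrative):

# Illustration: pyld's to_rdf() expands the input and builds the node map
# itself before deserializing to RDF, so the caller never hands it a raw map.
from pyld import jsonld  # pip install PyLD

document = {
    "@context": {"ex": "https://example.org/vocab#"},
    "ex:name": "Alumni Credential",
}

nquads = jsonld.to_rdf(document, {"format": "application/n-quads"})
print(nquads)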

Unify Error Handling Language

To unify error handling language across this specification (and if desired other cryptosuite specifications) I'd recommend:

  1. Use appropriate error handling language as in the DI specification, e.g., "an error MUST be raised and SHOULD convey an error type of ERROR_CODE_NAME.", where ERROR_CODE_NAME is defined in the DI specification.
  2. Use standardized error codes from the DI specification, and if needed add new codes to the DI specification.
  3. Check for non-rigorous error handling language and determine whether it needs to be updated (e.g., errors without codes that need codes).

Below I show codes used but not in the DI specification and error conditions without codes. Thoughts/Opinions?

Error Codes Used but Not in DI Spec

  1. INVALID_PROOF_CONFIGURATION used in 3.2.5 Proof Configuration (ecdsa-rdfc-2019), 3.3.5 Proof Configuration (ecdsa-jcs-2019), and 3.6.4 Base Proof Configuration (ecdsa-sd-2023)
  2. INVALID_PROOF_DATETIME used in 3.2.5 Proof Configuration (ecdsa-rdfc-2019), 3.3.5 Proof Configuration (ecdsa-jcs-2019), and 3.6.4 Base Proof Configuration (ecdsa-sd-2023)

Errors without Codes

These seemed to me to need codes and rigorous handling. Line numbers are approximate. We need to assign them to existing error codes or come up with new ones. (A sketch of the proofValue header checks follows this list.)

  • line 1782: "the JSON pointer does not match the given |document|."
  • line 2201: "Ensure the |proofValue| string starts with u, indicating that it is a multibase-base64url-no-pad-encoded value, throwing an error if it does not."
  • line 2209: "Ensure that the |decodedProofValue| starts with the ECDSA-SD base proof header bytes 0xd9, 0x5d, and 0x00, throwing an error if it does not."
  • line 2214: "Ensure the result is an array of five elements."
  • line 2472: "Ensure the |proofValue| string starts with u, indicating that it is a multibase-base64url-no-pad-encoded value, throwing an error if it does not."
  • line 2480: "Ensure that the |decodedProofValue| starts with the ECDSA-SD disclosure proof header bytes 0xd9, 0x5d, and 0x01, throwing an error if it does not."
  • line 2486: "Ensure the result is an array of five elements: a byte array of length 64, a byte array of length 36, an array of byte arrays, each of length 64, a map of integers to byte arrays of length 32, and an array of integers, throwing an error if not."
  • line 2938: "If the length of |signatures| does not match the length of |nonMandatory|, throw an error indicating that the signature count does not match the non-mandatory message count."
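
As flagged above, here is a rough sketch of the proofValue header checks quoted in this list (multibase 'u' prefix, ECDSA-SD header bytes, five-element CBOR array), assuming the cbor2 package; the error name used is a placeholder, which is exactly the gap this issue is about.

# Sketch of the quoted checks only, not the spec's full parsing algorithm.
import base64
import cbor2  # pip install cbor2

BASE_HEADER = bytes([0xd9, 0x5d, 0x00])     # ECDSA-SD base proof
DERIVED_HEADER = bytes([0xd9, 0x5d, 0x01])  # ECDSA-SD disclosure proof

def check_proof_value(proof_value: str, expected_header: bytes) -> list:
    if not proof_value.startswith("u"):
        raise ValueError("ERROR_CODE_TBD: expected multibase base64url-no-pad ('u')")
    body = proof_value[1:]
    decoded = base64.urlsafe_b64decode(body + "=" * (-len(body) % 4))
    if not decoded.startswith(expected_header):
        raise ValueError("ERROR_CODE_TBD: wrong ECDSA-SD proof header bytes")
    components = cbor2.loads(decoded[len(expected_header):])
    if not isinstance(components, list) or len(components) != 5:
        raise ValueError("ERROR_CODE_TBD: expected a CBOR array of five elements")
    return components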

ECDSA Signature and Curve Definition/Terminology

NIST has updated the signature document that includes ECDSA (FIPS 186-5, February 2023). It no longer includes the definition of the curves (P-256 or P-384); these are now defined in NIST SP 800-186, February 2023.

Neither of these documents refers to these curves as secpr1. The SECG document SEC 2 contains definitions for the curves secp256r1, secp384r1, and secp521r1, which are the same as NIST curves P-256, P-384, and P-521 respectively, but the term secpr1 is not in general use. Note that some ECDSA libraries use the secp256r1, secp384r1, and secp521r1 terminology, others use the P-256, P-384, and P-521 terminology, and some use both.

I would recommend removing the secpr1 term from the document: use the more modern NIST P-256, P-384, etc. terminology in general, and add a note on the equivalence to secp256r1 and the other SECG names.

If there is agreement, I can come up with a PR.

Cheers Greg

Update DataIntegrityProof proofValue admissible encodings

In Section 2.2.1, DataIntegrityProof, the proof value is required to be "encoded using the base-58-btc header and alphabet"; however, for the ECDSA-SD variants, 3.4.2 serializeBaseProofValue and 3.4.7 serializeDerivedProofValue, we require:

  • "baseProof to a string with the Multibase base64url-no-pad-encoding of proofValue..."
  • "derived proof as a string with the base64url-no-pad-encoding of proofValue..."

Hence it seems that section 2.2.1 needs to be updated to permit the base64url-no-pad encoding as well as base-58-btc.
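
For concreteness, the two encodings in question look like this (a sketch assuming the third-party base58 package and the standard-library base64 module):

# 'z' + base58btc (section 2.2.1) vs. 'u' + base64url-no-pad (the ECDSA-SD
# serializations in 3.4.2 / 3.4.7), applied to the same stand-in bytes.
import base64
import base58  # pip install base58

proof_bytes = bytes(range(16))  # stand-in for a serialized proof value

base58btc_form = "z" + base58.b58encode(proof_bytes).decode()
base64url_no_pad_form = "u" + base64.urlsafe_b64encode(proof_bytes).rstrip(b"=").decode()

print(base58btc_form)          # the 'z...' form currently allowed by 2.2.1
print(base64url_no_pad_form)   # uAAECAwQFBgcICQoLDA0ODw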

[Editorial] Making the RCH hash function reference more explicit

Forgive me if I sound very picky...

§3. Algorithms contains this paragraph:

When the RDF Dataset Canonicalization Algorithm [RDF-CANON] is used with ECDSA algorithms, the cryptographic hashing function that is passed to the algorithm MUST be determined by the size of the associated public key. For P-256 keys, SHA-2 with 256 bits of output is utilized. For P-384 keys, SHA-2 with 384-bits of output is utilized.

(Emphasis is mine).

Which is OK; the RDFC reference to SHA indeed makes it possible. What it says is:

The default hash algorithm used by RDFC-1.0, namely, SHA-256 [FIPS-180-4].

Implementations MUST support a parameter to define the hash algorithm, MUST support SHA-256 and SHA-384 [FIPS-180-4], and SHOULD support the ability to specify other hash algorithms.

So, if I use ECDSA with P-256, I do not "pass on" anything whatsoever; I use the default RDFC behavior. It is only when I use P-384 that I have to use the hash function parameter of RDFC (and the details of how I have to do that are implementation-dependent).

Isn't it better to say something more "neutral", like

When the RDF Dataset Canonicalization Algorithm (RDFC-1.0) [RDF-CANON] is used with ECDSA algorithms, the cryptographic hashing function used by RDFC-1.0 MUST depend on the size of the associated public key. For P-256 keys, the SHA-2 with 256 bits of output MUST be utilized, which is the default. For P-384 keys, SHA-2 with 384-bits of output MUST be utilized.
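
The hash selection itself is trivial; what the text should make clear is only when a non-default value must be handed to the canonicalizer. A purely illustrative sketch (the rdfc10_canonicalize call is hypothetical, standing in for whatever RDFC-1.0 implementation is used):

# Only the hash selection rule is from the text above; the canonicalizer
# call below is hypothetical and therefore shown commented out.
import hashlib

def hash_for_curve(curve: str):
    if curve == "P-256":
        return hashlib.sha256   # the RDFC-1.0 default; nothing to pass on
    if curve == "P-384":
        return hashlib.sha384   # must be supplied via the hash algorithm parameter
    raise ValueError("unsupported curve for this cryptosuite")

# canonical_nquads = rdfc10_canonicalize(dataset, hash_algorithm=hash_for_curve("P-384"))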

Clarifying `publicKeyMultibase` encoding: `did:key` style with multicodec code, or not?

With the did:plc method, we have been returning verificationMethod objects with type EcdsaSecp256k1VerificationKey2019 and a publicKeyMultibase string encoded with:

  • multibase prefix char z, indicating base58btc encoding
  • followed directly by the base58btc encoding of the uncompressed curve public key bytes

This is a different encoding from that used for did:key, which includes a varint multiformat table value indicating the specific key type, and compressed curve public key bytes.

For example:

{
  "id": "#atproto",
  "type": "EcdsaSecp256k1VerificationKey2019",
  "controller": "did:plc:yk4dd2qkboz2yv6tpubpc6co",
  "publicKeyMultibase": "zQYEBzXeuTM9UR3rfvNag6L3RNAs5pQZyYPsomTsgQhsxLdEgCrPTLgFna8yqCnxPpNT7DBk6Ym3dgPKNu86vt9GR"
}

where the corresponding key has did:key format: did:key:zQ3shXjHeiBuRCKmM36cuYnm7YEMzhGnCmCyW92sRJ9pribSF.

I had assumed that the uncompressed key format was required by a DID specification document somewhere, but when I went looking just now, I could find no reference in the DID Core 1.0 doc, https://www.w3.org/TR/did-spec-registries, Verifiable Credential Data Integrity 1.0, etc. The last of these does talk about how the Multikey type specifically requires a publicKeyMultibase encoding with the multicodec included, but doesn't mention how other types like EcdsaSecp256k1VerificationKey2019 should be encoded.

This GitHub issue is asking to clarify: is there a required multibase format? Is it OK to use the multicodec-based encoding, as used with did:key and Multikey?
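
To make the two styles concrete, here is a minimal sketch (my own, not normative) that tells them apart for a secp256k1 key; the secp256k1-pub multicodec varint (0xe7 0x01) is an assumption taken from the multicodec table.

# A did:key / Multikey-style value starts with the multicodec varint followed
# by a 33-byte compressed key; the did:plc value above is a raw 65-byte
# uncompressed SEC1 key starting with 0x04.
import base58  # pip install base58

def classify_public_key_multibase(value: str) -> str:
    if not value.startswith("z"):
        raise ValueError("expected multibase base58btc ('z') encoding")
    raw = base58.b58decode(value[1:])
    if raw[:2] == b"\xe7\x01" and len(raw) == 35:
        return "multicodec prefix + compressed key (did:key / Multikey style)"
    if raw[:1] == b"\x04" and len(raw) == 65:
        return "raw uncompressed SEC1 key (the did:plc style above)"
    return "unrecognized encoding"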
