Selective DID Resolver Client (@digitalbazaar/did-io)

A DID (Decentralized Identifier) resolution library for JavaScript.

Background

See also (related specs):

Version Compatibility

did-io v1.0 is a major breaking release; see the 1.0 CHANGELOG entry and the "Upgrading from v0.8.x to v1.0.0" checklist for details.

did-io v1.0 is compatible with the following libraries:

  • crypto-ld >= 5.0.0 (and related crypto suites).
  • jsonld-signatures >= 9.0.0
  • @digitalbazaar/did-method-key >= 1.0.0
  • did-veres-one >= 13.0.0 (currently, branch v13.x)
  • vc-js >= 7.0 (currently, branch v7.x)

Install

Requires Node.js 14+

To install locally (for development):

git clone https://github.com/digitalbazaar/did-io.git
cd did-io
npm install

To install as a dependency in another project, add this to your package.json:

"@digitalbazaar/did-io": "^X.x.x"

Usage

Supported DID method drivers

Using the CachedResolver to get() DID documents and keys

import {CachedResolver} from '@digitalbazaar/did-io';

// You can pass cache options to the constructor (see Cache Management below)
const resolver = new CachedResolver({max: 100}); // defaults to 100

On its own, the resolver does not know how to fetch or resolve any DID methods. Support for each one has to be enabled explicitly. It uses a Chai-like plugin architecture, where each driver is loaded via .use(driver).

import * as didKey from '@digitalbazaar/did-method-key';
import * as didVeresOne from 'did-veres-one';

const didKeyDriver = didKey.driver();
// Dev / testnet / live modes
const didVeresOneDriver = didVeresOne.driver({mode: 'dev'});

// Enable resolver to use the did:key and did:v1 methods for cached fetching.
resolver.use(didKeyDriver);
resolver.use(didVeresOneDriver);

After enabling individual DID methods, you can get() individual DIDs. CachedResolver will use the appropriate driver, based on the did: prefix, or throw an 'unsupported did method' error if no driver was installed for that method.

await resolver.get({did}); // -> did document
await resolver.get({url: keyId}); // -> public key node

Key Convenience Methods

You can use the provided convenience methods (methodFor() with .generate(), and didMethodDriver.publicMethodFor() with .get()) to obtain key pair instances. (Previously, this required manually determining the key id and using didDocument.keys[keyId].)

When retrieving documents with .get():

const didDocument = await resolver.get({did});
const publicKeyData = resolver.publicMethodFor({didDocument, purpose: 'authentication'});
// Then you can use the resulting plain JS object to get a key pair instance.
// via a configured CryptoLD instance, when you're working with multiple key types
// (see `crypto-ld` library for setup and usage):
const authPublicKey = await cryptoLd.from(publicKeyData);
// or, directly (if you already know the key type)
const authPublicKey = await Ed25519VerificationKey2020.from(publicKeyData);

When retrieving individual key objects with a .get(), you don't even need to use publicMethodFor():

const keyData = await resolver.get({url: keyId});
const publicKey = await cryptoLd.from(keyData);

Generating and registering DIDs and DID documents

did-io and CachedResolver currently support only get() operations across multiple DID methods. To generate and register new DIDs or DID documents, use each individual driver's .generate() method. (The generation and registration process for each DID method is so different that it didn't make sense to unify them at the CachedResolver level.)

Each driver's .generate() returns a tuple of didDocument, a Map of public/private key pairs (by key id), and a convenience methodFor function that allows lookup of key (verification method) by its intended purpose.

const {didDocument, keyPairs, methodFor} = await didMethodDriver.generate();
didDocument
// -> plain JS object, representing a DID document.
keyPairs
// -> a javascript Map of public/private LDKeyPair instances (from crypto-ld),
//   by key id
methodFor({purpose: 'keyAgreement'});
// for example, an X25519KeyAgreementKey2020 key pair instance, that can
// be used for encryption/decryption using `@digitalbazaar/minimal-cipher`.
methodFor({purpose: 'assertionMethod'});
// for example, an Ed25519VerificationKey2020 key pair instance for
// signing and verifying Verifiable Claims (VCs).

Using CachedResolver as a documentLoader

One of the most common uses of DIDs and their public keys is for cryptographic operations such as signing and verifying signatures of Verifiable Credentials and other documents, and for encrypting and decrypting objects.

For these and other Linked Data Security operations, a documentLoader function is often required. For example, NPM's package.json and package-lock.json mechanisms allow application developers to securely lock down a library's dependencies (by specifying exact content hashes or approximate versions). In the same manner, documentLoaders allow developers to secure their Linked Data Security load operations, such as when loading JSON-LD contexts, fetching DID Documents of supported DID methods, retrieving public keys, and so on.

You can use an initialized CachedResolver instance when constructing a documentLoader for your use case (to handle DID and DID key resolution for installed methods). For example:

const resolver = new CachedResolver();
resolver.use(didMethodDriver1);
resolver.use(didMethodDriver2);

const documentLoader = async url => {
  // Handle other static documents and contexts here...

  // Use CachedResolver to fetch did: URLs.
  if(url && url.startsWith('did:')) {
    // This handles both DIDs and key ids for the two installed drivers.
    const document = await resolver.get({url});
    return {
      url,
      document,
      static: true
    };
  }
  throw new Error(`Document not found: "${url}".`);
};

Cache management

CachedResolver uses lru-memoize to memoize get() promises (as opposed to just the results of the operations), which helps in high-concurrency use cases. (And that library in turn uses lru-cache under the hood.)
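The effect of memoizing the promise (rather than the resolved value) can be sketched in plain JavaScript. This is a hypothetical minimal illustration, not the actual lru-memoize API; lru-memoize additionally provides LRU eviction, max age, and so on:

```javascript
// Minimal sketch of promise memoization: concurrent callers for the same
// key share one in-flight promise instead of triggering duplicate fetches.
// (Hypothetical illustration only, not the lru-memoize implementation.)
function memoizePromise(fn) {
  const cache = new Map();
  return key => {
    if(cache.has(key)) {
      return cache.get(key);
    }
    const promise = fn(key).catch(error => {
      // Drop rejected promises so failures are not cached forever.
      cache.delete(key);
      throw error;
    });
    cache.set(key, promise);
    return promise;
  };
}
```

Because the promise itself is cached, many concurrent get() calls for the same DID result in a single underlying resolution.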

The CachedResolver constructor passes any options given to it through to the lru-cache constructor, so see that repo for the full list of cache management options. Commonly used ones include:

  • max (default: 100) - maximum size of the cache.
  • maxAge (default: 5 sec/5000 ms) - maximum age of an item in ms.
  • updateAgeOnGet (default: false) - When using time-expiring entries with maxAge, setting this to true will make each entry's effective time update to the current time whenever it is retrieved from cache, thereby extending the expiration date of the entry.

Contribute

See the contribute file!

PRs accepted.

If editing the Readme, please conform to the standard-readme specification.

Commercial Support

Commercial support for this library is available upon request from Digital Bazaar: [email protected]

License

New BSD License (3-clause) © Digital Bazaar

did-io's Issues

Ensure UUID type DIDs have DID url-to-document-contents integrity protection

Should we perhaps deprecate the uuid type DID generation? (Since unlike the cryptonym, it does not contain proof of integrity of the document in the url itself).

Ensure that UUID type DIDs have some sort of integrity protection (that ensures that the document that gets resolved at the URL is the same document you expected). Cryptonyms currently have this property (the DID is generated from a key in the DID Doc), but UUID types do not.

Perhaps we can use something like Resource Integrity Protection (or a simplified version of Decentralized Autonomic Data)?

Proper handling of unknown DIDs

const context = didDoc['@context'];

I'm dealing with a negative test in bedrock-ledger-validator-signature where the DID does not exist for a mock key 'did:v1:777ea7ad-ab68-4039-b85b-a45a795b2d93/keys/1'

I'm using v0.7.0 against genesis.testnet.veres.one, is that proper?

What seems to be happening is that the request at client.js line 86 is returning an empty document, which leads to:

TypeError: Cannot read property '@context' of undefined
    at VeresOneClient.get (/home/matt/dev/bedrock-dev/bedrock-ledger-validator-signature/test/node_modules/did-io/lib/methods/veres-one/client.js:86:27)
    at process._tickCallback (internal/process/next_tick.js:68:7)

Add a buildDocumentLoader() convenience method to CachedResolver

Add a buildDocumentLoader() convenience method, with the following functionality:


An initialized CachedResolver instance provides a convenience method that builds a documentLoader instance that works for its registered DID methods. This loader can be further composed with other compatible JSON-LD document loaders.

const resolver = new CachedResolver();
resolver.use(didMethodDriver1);
resolver.use(didMethodDriver2);

const documentLoader = resolver.buildDocumentLoader();

// The resulting documentLoader function now supports getting DID documents
// for did method 1 and 2, as well as fetching public keys from those DIDs.

Consider cache auto-updater

Since fetching some DID docs can be a slow process, we may want to engineer something to keep caches warm independently of resolution requests. What this means is that -- we could schedule a timer to check for cache entries that are close to expiring, and we could re-resolve them in a background process. This would help reduce wait times when popular values have expired from the cache and just need to be refreshed.
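One possible shape for this (a hypothetical sketch, not part of the current API): a periodic background job scans cache entries for approaching expiration and re-resolves them before they go stale. The selection step might look like:

```javascript
// Hypothetical sketch: given cache entries with absolute expiration times,
// pick the DIDs that will expire within `windowMs`, so a background timer
// can re-resolve them before callers see a cache miss.
function entriesToRefresh(entries, {now = Date.now(), windowMs = 1000} = {}) {
  return entries
    .filter(({expiresAt}) => expiresAt - now <= windowMs)
    .map(({did}) => did);
}
```

A setInterval-driven job could then call resolver.get() for each returned DID, repopulating the cache off the hot path.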

Can bad things happen in the `default` case?

return RSAKeyPair.from(data, options);

It appears to me that there could be some unintended consequences from having this default case. This module really only supports two key types, is that correct? I think we should just be explicit about that and throw if the keyType is not one of the acceptable types.

I would add that I think the use of switch with a fall-through to a default case is a pattern to be avoided. I think there is good reasoning behind DB's strong preference for the if...return pattern whenever possible, and if...else when it's absolutely necessary. I just did a quick search and it appears that switch is used in only one place in the entire bedrock backend code base.
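A sketch of the explicit if...return approach suggested here, with hypothetical fromRsa/fromEd25519 stubs standing in for the real key pair constructors:

```javascript
// Hypothetical stubs standing in for the real key pair constructors.
const fromRsa = data => ({type: data.type, kind: 'rsa'});
const fromEd25519 = data => ({type: data.type, kind: 'ed25519'});

// if...return instead of switch-with-default: unknown key types fail
// loudly rather than silently falling through to the RSA case.
function keyPairFrom(data) {
  if(data.type === 'RsaVerificationKey2018') {
    return fromRsa(data);
  }
  if(data.type === 'Ed25519VerificationKey2018') {
    return fromEd25519(data);
  }
  throw new Error(`Unsupported key type: "${data.type}".`);
}
```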

CachedResolver is a weird name

I think CachedResolver is kind of a weird name. It sounds like the resolver is cached vs. what I'm guessing it actually is ... which is a resolver that has a cache for whatever it is that it resolves. It's not clear why we didn't just name it DidResolver and the fact that it has a cache is just a feature of it. Can we do that in a newer version?

Use a good default iteration count

Use this as a guideline for determining a good default PBE iteration count:

http://security.stackexchange.com/questions/3959/recommended-of-iterations-when-using-pkbdf2-sha256

You should use the maximum number of rounds which is tolerable,
performance-wise, in your application. The number of rounds is a slowdown factor,
which you use on the basis that under normal usage conditions, such a slowdown
has negligible impact for you (the user will not see it, the extra CPU cost does not
imply buying a bigger server, and so on). This heavily depends on the operational
context: what machines are involved, how many user authentications per second...
so there is no one-size-fits-all response.

The wide picture goes thus:

  • The time to verify a single password is v on your system. You can adjust this time
    by selecting the number of rounds in PBKDF2. A potential attacker can gather f times
    more CPU power than you (e.g. you have a single server, and the attacker has 100
    big PC, each being twice faster than your server: this leads to f=200).
  • The average user has a password of entropy n bits (this means that trying to guess
    a user password, with a dictionary of "plausible passwords", will take on
    average 2^(n-1) tries).
  • The attacker will find your system worth attacking if the average password can be
    cracked in time less than p (that's the attacker's "patience").

Your goal is to make the average cost to break a single password exceed the
attacker's patience, so that he does not even try, and goes on to concentrate
on another, easier target.

...

So the remaining parameter is v. With f = 200 (an attacker with a dozen good
GPU), a patience of one month, and n = 32, you need v to be at least 8 milliseconds.
So you should set the number of rounds in PBKDF2 such that computing it over a
single password takes at least that much time on your server. You will still be able
to verify 120 passwords per second with a single core, so the CPU impact should
be negligible for you. Actually, it is safer to use more rounds than that, because,
let's face it, getting 32 bits worth of entropy out of the average user password is a
bit optimistic.

So if we think a month is enough to deter an attacker with a dozen good GPUs, our time to generate a key should be at least 8ms. We can easily go higher than this without messing up the UX and should. We just need to consider different browsers and CPUs will be running this.

We may also want to randomize the number of iterations, with a certain required minimum.
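A rough way to pick the count on a given machine is to time a single derivation and double the iterations until it crosses the target. This is a hypothetical calibration sketch using Node's built-in crypto module; the target time, starting count, and test inputs are all illustrative:

```javascript
import crypto from 'node:crypto';

// Hypothetical sketch: grow the PBKDF2-SHA256 iteration count until a
// single derivation takes at least `targetMs` on this machine.
function calibrateIterations({targetMs = 8, start = 100000} = {}) {
  let iterations = start;
  for(;;) {
    const t0 = process.hrtime.bigint();
    crypto.pbkdf2Sync('test-password', 'test-salt', iterations, 32, 'sha256');
    const elapsedMs = Number(process.hrtime.bigint() - t0) / 1e6;
    if(elapsedMs >= targetMs) {
      return iterations;
    }
    iterations *= 2;
  }
}
```

The result should be calibrated on the slowest hardware expected to run the code, since the same count will take longer on weaker CPUs.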

Add some validation to catch null/undefined hostnames

Default settings in bedrock-did-client set the hostname to 'null' which can get passed into did-io here:

https://github.com/digitalbazaar/bedrock-did-client/blob/did-io-0.7.x/lib/index.js#L35

This results in an error thrown out of this module:

Could not fetch ledger agents: { FetchError: request to https://null/ledger-agents failed, reason: getaddrinfo ENOTFOUND null null:443
    at ClientRequest.<anonymous> (/home/matt/dev/bedrock-dev/bedrock-ledger-validator-signature/test/node_modules/node-fetch/lib/index.js:1345:11)
    at ClientRequest.emit (events.js:182:13)
    at TLSSocket.socketErrorListener (_http_client.js:391:9)
    at TLSSocket.emit (events.js:182:13)
    at emitErrorNT (internal/streams/destroy.js:82:8)
    at emitErrorAndCloseNT (internal/streams/destroy.js:50:3)
    at process._tickCallback (internal/process/next_tick.js:63:19)
  message:
   'request to https://null/ledger-agents failed, reason: getaddrinfo ENOTFOUND null null:443',
  type: 'system',
  errno: 'ENOTFOUND',
  code: 'ENOTFOUND' }

There should be some validation of the hostname before attempting to connect to https://null

The mode and did parameters should be validated as well, if they aren't already.
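A validation sketch illustrating the kind of guard suggested here (hypothetical helper, not the module's actual API):

```javascript
// Hypothetical guard: reject missing hostnames, including the string
// 'null' that results from stringifying a null config value.
function validateHostname(hostname) {
  if(typeof hostname !== 'string' || hostname.length === 0 ||
      hostname === 'null' || hostname === 'undefined') {
    throw new TypeError(`Invalid hostname: "${hostname}".`);
  }
  return hostname;
}
```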

Support unregistered/pairwise DID use case

Since there's no way to tell if a DID is registered or unregistered/pairwise (and this is by design), the resolution logic should be:

  1. do a GET to the ledger, check if it exists
  2. If doesn't exist (specifically a 404 error), then check local didStore to see if it's there, and return.
  3. If the DID also does not exist in local did store, then re-throw the 404 error
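The three steps above might be sketched as follows, with hypothetical ledgerGet/didStore interfaces standing in for the real client and local store:

```javascript
// Hypothetical sketch of the resolution order described above:
// try the ledger first, fall back to a local DID store on 404,
// and re-throw the 404 if the DID is unknown to both.
async function resolveDid({did, ledgerGet, didStore}) {
  try {
    return await ledgerGet(did);
  } catch(error) {
    if(error.status !== 404) {
      throw error;
    }
    const local = await didStore.get(did);
    if(local) {
      return local;
    }
    throw error;
  }
}
```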

Where should `attachInvocationProof` live?

@dlongley @dmitrizagidulin

attachInvocationProof({operation, capability, capabilityAction, creator,
  algorithm, privateKeyPem, privateKeyBase58}) {
  // FIXME: use `algorithm` and validate private key, do not switch off of it
  if(privateKeyPem) {
    algorithm = 'RsaSignature2018';
  } else {
    algorithm = 'Ed25519Signature2018';
  }
  // FIXME: validate operation, capability, creator, and privateKeyPem
  // TODO: support `signer` API as alternative to `privateKeyPem`
  const jsigs = this.injector.use('jsonld-signatures');
  return jsigs.sign(operation, {
    algorithm,
    creator,
    privateKeyPem,
    privateKeyBase58,
    proof: {
      '@context': constants.VERES_ONE_V1_CONTEXT,
      proofPurpose: 'capabilityInvocation',
      capability,
      capabilityAction
    }
  });
}

Do we need it at all?

Should it be part of crypto-ld?

If we need the wrapper, we know that we want to be able to add these sorts of proofs to operations other than DID Documents.

Rewrite to remove node-specific dependencies

There are a number of dependencies that are node specific (or that can run in the browser but are unnecessarily large for what they are providing).

We should remove (from at least the browser-side implementation):

  • async
  • request
  • lodash
