digital-asset / daml

The Daml smart contract language

Home Page: https://www.digitalasset.com/developers

Python 0.18% Shell 0.70% Groovy 0.01% PowerShell 0.09% Haskell 15.24% Scala 77.35% Makefile 0.01% TypeScript 0.79% Java 2.32% JavaScript 0.01% Batchfile 0.01% CSS 0.01% Nix 0.16% Go 0.02% Starlark 2.91% Smarty 0.02% sed 0.01% jq 0.01% TeX 0.15% Dockerfile 0.01%
daml smart-contract dlt privacy-by-design workflow-automation distributed-ledger bazel sdk

daml's People

Contributors

aherrmann-da, akrmn, azure-pipelines[bot], carlpulley-da, cocreature, dasormeter, dylant-da, garyverhaegen-da, gerolf-da, hurryabit, leo-da, miklos-da, mziolekda, neil-da, nickchapman-da, nicu-da, nmarton-da, paulbrauner-da, pbatko-da, rautenrieth-da, remyhaemmerle-da, rohanjr, s11001001, samirtalwar, samuel-williams-da, shayne-fletcher, skisel-da, sofiafaro-da, stefanobaghino-da, tudor-da


daml's Issues

Make sure licenses are properly generated

The NOTICE file needs to contain all third party licenses of dependencies.

  • Haskell libs pull all licenses
  • C static libs pull all licenses
  • C dynamic libs pull all licenses
  • Node pull all licenses
  • Scala SBT libs pull all licenses
  • Java Maven libs pull all licenses
  • Python libs pull all licenses
  • Ruby libs (from dev-env) pull all licenses
  • Bazel Go libs
  • Gradle libs
  • Check for known high sev vulnerabilities

Steps to generate from Blackduck

  1. Use latest snapshot version of Blackduck hub-detect-5.3.0.jar from https://repo.blackducksoftware.com/artifactory/bds-integrations-snapshot/com/blackducksoftware/integration/hub-detect/5.3.0-SNAPSHOT/

  2. Run full bazel build //... for daml repo

  3. Run full Bazel Blackduck scan from root of repo for all projects and components (that Blackduck can recognise)
    java -jar ~/Downloads/hub-detect-5.3.0-20190302.191922-36.jar --blackduck.offline.mode=false --detect.included.detector.types=bazel --detect.bazel.target=//... --logging.level.com.blackducksoftware.integration=DEBUG --blackduck.timeout=60 --detect.project.name=local/daml --detect.code.location.name=local/daml --detect.project.version.name=0.11.10 --detect.project.version.phase=development --blackduck.api.token=$BLACKDUCK_HUBDETECT_TOKEN --blackduck.url=https://digitalasset.blackducksoftware.com --detect.npm.include.dev.dependencies=false --detect.python.python3=true --detect.cleanup=false --detect.cleanup.bdio.files=false

  4. Run da use <latest preview release> to pull down all SDK artifacts being released

  5. Run Blackduck scan against the --detect.source.path=~/.da/ directory
    java -jar ~/Downloads/hub-detect-5.3.0-20190302.191922-36.jar --blackduck.offline.mode=false --detect.project.tool=DETECTOR --logging.level.com.blackducksoftware.integration=DEBUG --blackduck.timeout=60 --detect.project.name=local/sdk --detect.code.location.name=local/sdk --detect.project.version.name=0.11.10 --detect.project.version.phase=development --blackduck.api.token=$BLACKDUCK_HUBDETECT_TOKEN --blackduck.url=https://digitalasset.blackducksoftware.com --detect.npm.include.dev.dependencies=false --detect.python.python3=true --detect.cleanup=false --detect.cleanup.bdio.files=false --detect.source.path=~/.da/ --detect.detector.search.depth=8 --detect.blackduck.signature.scanner.exclusion.pattern.search.depth=10

  6. Once scan is processed and results are viewable in Blackduck Hub https://digitalasset.blackducksoftware.com, go to relevant project, Reports tab, and generate notices file in txt format

  7. Concatenate the notices files from the SDK artifact scan and the full DAML repo scan and store the output in source control as the NOTICES file at the root of the daml repo

Steps to Generate Haskell/C Dependencies

  1. Bazel query to extract the list of dependencies - the names of Haskell licenses can either be parsed out or extracted from the --output build or --output xml formats
    bazel query 'kind("(haskell_import|haskell_library|cc_import|cc_library)", filter("@.*", deps(//...)))' --output label_kind --keep_going | sort | uniq

  2. For statically imported haskell packages that live on hackage, retrieve the relevant license under its various names in a manner similar to this code
    https://github.com/DACH-NY/daml/blob/ee3506bb2bb3c1c8e463bc05733250f5ffd7153c/daml-foundations/daml-tools/docs/daml-licenses/licenses/extract.py#L84-L88

  3. For non-Haskell packages and/or dynamic libraries, a mapping of library name to relevant license file will need to be maintained manually a la https://github.com/DACH-NY/daml/blob/ee3506bb2bb3c1c8e463bc05733250f5ffd7153c/daml-foundations/daml-tools/docs/daml-licenses/licenses/extract.py#L171-L183

  4. Generate the output for static and dynamic Haskell and C libraries in a format equivalent to that of the auto-generated notices file above, concatenate it with the Blackduck output, and include it in the NOTICES file at the root
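
A hypothetical helper for step 1 (the workspace names below are made up for illustration): the external package names can be pulled out of the `bazel query --output label_kind` lines with a small script.

```python
# Extract external workspace names from lines like
#   "haskell_import rule @hackage_text//:text"
# produced by `bazel query ... --output label_kind`.
import re

def workspace_names(lines):
    names = set()
    for line in lines:
        m = re.search(r"@([\w-]+)//", line)
        if m:
            names.add(m.group(1))
    return sorted(names)

out = [
    "haskell_import rule @hackage_text//:text",
    "cc_library rule @zlib//:z",
]
print(workspace_names(out))  # -> ['hackage_text', 'zlib']
```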

They should be generated, put in the docs, and checked to make sure they are not GPL. Or should we conclude it's not necessary?

Don't hardcode version of scala libraries in release script

Currently, the generation of .pom files uses hardcoded versions of a few Scala libraries which are taken from rules_scala. This is likely to break if we upgrade rules_scala.

  • org.scala-lang:scala-library:2.12.2
  • org.scala-lang.modules:scala-parser-combinators_2.12:1.0.4
  • com.thesamet.scalapb:scalapb-runtime:0.8.4

Create standalone SDK package for offline installation

We expect many of our users to work in locked-down environments, where the assistant won't be able to download from external sites. For these situations we need to provide a zip / tarball with the latest SDK version that can be installed following some instructions (where to unzip, which things to put on the path, etc.).

Use native installation paradigm (apt for linux)

At present, standard commands to install binaries are not available to developers who want to download DA binary artifacts. To lower the barrier to entry, we want to enable standard installation approaches that are familiar to developers, in particular ones idiomatic to the OS and environment:

Linux

apt-get install daml

check contract visibility when looking up contracts in sandbox

The scenario runner implements a key check making sure that contracts we look up are visible:


we need to implement the same check in the sandbox, but right now we don't, see

SandboxDamle.consume(engine.submit(commands))(packageContainer, getContract, lookupKey).map {

to check this, we need to verify that the committer (which the sandbox knows) can see the requested contract (which the sandbox currently cannot determine).

to be able to do this, we need to start storing some additional data in LedgerState. specifically we need to keep track of which party can see contracts, and update this as we add transactions to the ledger state.

open question: do we need to take the ledger time into account at all in this visibility check? cc @oggy- @andreaslochbihler-da
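
A minimal sketch of the proposed bookkeeping (hypothetical names, not the actual sandbox code): the ledger state tracks which parties can see each contract, updates that map as transactions are added, and the lookup path rejects requests from committers without visibility.

```python
# Hypothetical model of the visibility data the issue asks to add to
# LedgerState: contract id -> set of parties that can see the contract.

class LedgerState:
    def __init__(self):
        self.visibility = {}

    def add_transaction(self, contract_id, witnesses):
        # record every party that witnessed the contract when the
        # transaction was added to the ledger state
        self.visibility.setdefault(contract_id, set()).update(witnesses)

    def lookup(self, contract_id, committer):
        # mirror the scenario runner's key check: the committer must be
        # able to see the contract it looks up
        if committer not in self.visibility.get(contract_id, set()):
            raise LookupError(f"{committer} cannot see {contract_id}")
        return contract_id

state = LedgerState()
state.add_transaction("#1:0", {"Alice", "Bob"})
state.lookup("#1:0", "Alice")  # ok; lookup as "Charlie" would raise
```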

introduce a runnable postgres

We need to introduce an easily runnable postgres instance we can use for testing. Use https://github.com/testcontainers/testcontainers-java and write a unit test for validating that it works.

multiple grpc-core jars on classpath

In the sandbox project we noticed that multiple instances (of the same version) of grpc-core are put on the classpath. The order of dependencies makes a difference, and we can end up with NoSuchMethodErrors depending on which jar is found first.
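
A quick illustrative sketch (the jar names and version pattern here are assumptions, not the sandbox's real classpath) of how duplicate artifacts can be detected by stripping version suffixes from jar names:

```python
# Detect artifacts that appear more than once on a classpath, ignoring
# the directory and the version suffix of each jar.
import re
from collections import Counter

def duplicate_artifacts(classpath):
    names = [re.sub(r"-\d[\w.]*\.jar$", "", p.split("/")[-1]) for p in classpath]
    return [n for n, c in Counter(names).items() if c > 1]

cp = ["lib/grpc-core-1.17.1.jar", "other/grpc-core-1.17.1.jar", "lib/grpc-netty-1.17.1.jar"]
print(duplicate_artifacts(cp))  # -> ['grpc-core']
```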

Ledger API 2.0

Create a v2.0 API interface parallel to the v1.x. The new version should implement all the change proposals we collected here. For a transitional period we could support both versions.

Speed up compilation of ghc-lib

Currently compiling ghc-lib is super slow. We should figure out why, and see what can be done. That may well involve profiling GHC itself and figuring out which modules take longest to compile and why.

CC @shayne-fletcher-da

damlc: no acceptable C compiler found in path

$ bazel build damlc
[dev-env] Building tools.bazel...
these derivations will be built:
  /nix/store/a794c4nz8vmqwx7vwqqnaawnavwcsim5-bazel.drv
these paths will be fetched (109.01 MiB download, 111.54 MiB unpacked):
  /nix/store/h5w8nnmbi7gw0rs57b02k8k5xxh5a779-bazel-0.22.0
copying path '/nix/store/h5w8nnmbi7gw0rs57b02k8k5xxh5a779-bazel-0.22.0' from 'http://cache.da-int.net'...
building '/nix/store/a794c4nz8vmqwx7vwqqnaawnavwcsim5-bazel.drv'...
[dev-env] Built tools.bazel in /nix/store/sfv7abpb02amwhzl0x88crj0sx0xz5ks-bazel and linked to /home/flokli/dev/daml/dev-env/var/gc-roots/bazel
Extracting Bazel installation...
Starting local Bazel server and connecting to it...
INFO: Invocation ID: f79fbe22-34cb-4fa2-a560-e3703b31d771
INFO: SHA256 (https://github.com/tferi-da/rules_scala/archive/b6a96474a5ee9eed0515cb0464de374e0df516f8.zip) = f953bb45ee17196a8c054786e30489e3b8764e7fbeb6d6942ac2b61b3c8f7221
DEBUG: Rule 'io_bazel_rules_scala' modified arguments {"sha256": "f953bb45ee17196a8c054786e30489e3b8764e7fbeb6d6942ac2b61b3c8f7221"}
DEBUG: /home/flokli/.cache/bazel/_bazel_flokli/69f3fdf621d09d121bb086d392d083d6/external/bazel_skylib/skylark_library.bzl:23:1: WARNING: skylark_library.bzl is deprecated and will go away in the future, please use bzl_library.bzl instead.
/nix/store/nvvmnb7qn2accgpxzbwvp5i995slycyy-bazel-cc-toolchain
WARNING: /home/flokli/dev/daml/daml-foundations/daml-ghc/package-database/BUILD.bazel:12:1: target 'ghc-prim' is both a rule and a file; please choose another name for the rule
DEBUG: /home/flokli/.cache/bazel/_bazel_flokli/69f3fdf621d09d121bb086d392d083d6/external/bazel_skylib/lib.bzl:30:1: WARNING: lib.bzl is deprecated and will go away in the future, please directly load the bzl file(s) of the module(s) needed as it is more efficient.
[dev-env] Building tools.sed...
[dev-env] Built tools.sed in /nix/store/wkgszaq2dkc4asapcbx6ypd7xdnzad9f-gnused-4.4 and linked to /home/flokli/dev/daml/dev-env/var/gc-roots/sed
checking build system type... x86_64-pc-linux-gnu
checking host system type... x86_64-pc-linux-gnu
checking for gcc... no
checking for cc... no
checking for cl.exe... no
configure: error: in `/home/flokli/.cache/bazel/_bazel_flokli/69f3fdf621d09d121bb086d392d083d6/external/haskell_network_1843485230':
configure: error: no acceptable C compiler found in $PATH
See `config.log' for more details
cabal2bazel: callProcess: ./configure (exit 1): failed
ERROR: Analysis of target '//:damlc' failed; build aborted: no such package '@haskell_network_1843485230//': Traceback (most recent call last):
        File "/home/flokli/.cache/bazel/_bazel_flokli/69f3fdf621d09d121bb086d392d083d6/external/ai_formation_hazel/hazel.bzl", line 19
                symlink_and_invoke_hazel(ctx, ctx.attr.hazel_base_repo_name, <4 more arguments>)
        File "/home/flokli/.cache/bazel/_bazel_flokli/69f3fdf621d09d121bb086d392d083d6/external/ai_formation_hazel/hazel_base_repository/hazel_base_repository.bzl", line 84, in symlink_and_invoke_hazel
                fail("Error running hazel on {}:\n{}\...))
Error running hazel on network.cabal:
checking build system type... x86_64-pc-linux-gnu
checking host system type... x86_64-pc-linux-gnu
checking for gcc... no
checking for cc... no
checking for cl.exe... no

[dev-env] Building tools.sed...
[dev-env] Built tools.sed in /nix/store/wkgszaq2dkc4asapcbx6ypd7xdnzad9f-gnused-4.4 and linked to /home/flokli/dev/daml/dev-env/var/gc-roots/sed
configure: error: in `/home/flokli/.cache/bazel/_bazel_flokli/69f3fdf621d09d121bb086d392d083d6/external/haskell_network_1843485230':
configure: error: no acceptable C compiler found in $PATH
See `config.log' for more details
cabal2bazel: callProcess: ./configure (exit 1): failed
INFO: Elapsed time: 22.424s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (131 packages loaded, 1373 targets configured)
    currently loading: @haskell_ghc_lib__6093350//
    Fetching @haskell_network_1843485230; fetching 5s

find out what's going on with traits with state and bazel

In tag broken-stateful-trait-inheritance, we have TestExecusionSequencerFactory, a trait with a variable executionSequencerFactory.

Before, this class was in the test code for AbstractContractServiceIT. @gaboraranyossy-da moved it to platform-test-utils and things started breaking. Specifically, the setter for executionSequencerFactory in AbstractContractServiceIT is abstract, for some reason.

We suspect that this has to do with ijar files that bazel generates, but we're not entirely sure.

Right now the workaround is to duplicate the TestExecusionSequencerFactory in the tests that need it, but we clearly should fix it.

Include MacOS version in Bazel cache keys

Currently we don’t seem to include the MacOS version in the Bazel cache keys which leads to warnings about object files having been built on a different version of MacOS. While that doesn’t seem to have caused problems so far, it seems very sketchy and we should make sure that the MacOS version gets included in the cache keys.

come up with DAML-LF 2, which is exactly like DAML-LF 1.x, but without recursive data type

we need to get rid of the recursive data type in the DAML-LF proto. it causes endless trouble (see #437 for the latest in a long series of complaints). i propose to just have an array of exprs at the top level for each module, i.e.

message ExprTable {
  repeated Expr exprs = 1;
}

and then have pointers into said array instead of recursive occurrences.
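
The flattening the proposal describes can be sketched as follows (a toy model, not the real DAML-LF encoder): recursive sub-expressions are replaced by integer indices into a single table per module.

```python
# Flatten a recursive expression tree into a table, replacing recursive
# occurrences with pointers (indices) into the table, as in the
# proposed ExprTable message.

def flatten(expr, table):
    """expr is a tuple like ("app", fun, arg) or a leaf ("var", name);
    returns the index of expr's entry in the table."""
    if expr[0] == "var":
        node = expr
    else:
        tag, fun, arg = expr
        # children are flattened first, so their indices already exist
        node = (tag, flatten(fun, table), flatten(arg, table))
    table.append(node)
    return len(table) - 1

table = []
root = flatten(("app", ("var", "f"), ("app", ("var", "g"), ("var", "x"))), table)
# every "app" node in `table` now holds indices instead of subtrees
```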

however, we might as well clean up other stuff since we can:

  • remove recursive data type
  • remove party literals, getParty and the flag indicating the absence of party literals
  • remove deprecated arrow type
  • clean up identifier data type to be in line with what we have in Scala
  • remove the variant of exercise which expects actors
  • remove foldl, foldr and equal_list

@martin-drhu-da @remyhaemmerle-da please add items as you see fit.

Public SDK artifacts to public repos -- maven central and/or bintray jcenter

To ease the barrier to entry of using the SDK, so developers do not need to add custom repositories or manage bintray credentials, publish SDK artifacts to a public repository that is already defined by default for most build tools (maven / sbt).

Leading contenders are maven central and bintray jcenter

Approach for publishing outlined here https://reflectoring.io/bintray-jcenter-maven-central/

For the general SDK userbase we want to optimise for making the software available in readily accessible public locations (maven central, jcenter bintray), and that is the reason we want to get our software into those public repos.

Additionally, we need to consider that many enterprise institutions will not have public internet access enabled to pull from public repos by default, so having all the artifacts in the single bintray private repo location would be advantageous.

So the recommendation is that we continue to use our private bintray as a staging repo and promote to the public repos from there.

We can follow a publish to maven central / jcenter bintray approach as outlined in this article https://reflectoring.io/bintray-jcenter-maven-central/

Things to consider

  1. There are no credentials or special license agreements required for JAR artifacts that are published through the public repositories. Therefore, if users pull these in via Group-Artifact-Version (GAV) definition in their build files, they will immediately have the classes and artifacts ready to use.

References

https://reflectoring.io/bintray-jcenter-maven-central/
https://maven.apache.org/repository/guide-central-repository-upload.html

packageId depends on path given

da run damlc -- package /Users/bame/Desktop/Projects/bilateral-repo-demo/scripts/../backend/src/main/daml/LibraryModules.daml br
and
da run damlc -- package /Users/bame/Desktop/Projects/bilateral-repo-demo/backend/src/main/daml/LibraryModules.daml br
give different packageIds.
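
A minimal illustration of the bug (using a made-up hashing scheme, not damlc's actual one): if the file path as given feeds into the package hash, the two invocations above produce different ids even though both paths name the same file, and normalising the path restores a stable id.

```python
# Hypothetical packageId computation that (incorrectly) includes the
# raw path, reproducing the path-dependence described above.
import hashlib
import posixpath

def package_id(path, contents):
    return hashlib.sha256((path + contents).encode()).hexdigest()[:8]

src = "module LibraryModules where"
p1 = "scripts/../backend/src/main/daml/LibraryModules.daml"
p2 = "backend/src/main/daml/LibraryModules.daml"

assert package_id(p1, src) != package_id(p2, src)      # the reported bug
assert package_id(posixpath.normpath(p1), src) == package_id(p2, src)
```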

gRPC Recursion Limit

Long scenarios fail with the below:

Waiting for Sandbox................Failed to start sandbox-java
Description: user error (Unable to connect to Sandbox on port 7600: timeout after 15 seconds)
Error type: SomeException
------ last 20 lines from log ------
    at com.digitalasset.daml_lf.DamlLf1$CaseAlt$1.parsePartialFrom(DamlLf1.java:42673)
    at com.digitalasset.daml_lf.DamlLf1$CaseAlt$1.parsePartialFrom(DamlLf1.java:42668)
    at com.google.protobuf.CodedInputStream$ArrayDecoder.readMessage(CodedInputStream.java:911)
    at com.digitalasset.daml_lf.DamlLf1$Case.<init>(DamlLf1.java:42816)
    at com.digitalasset.daml_lf.DamlLf1$Case.<init>(DamlLf1.java:42760)
    at com.digitalasset.daml_lf.DamlLf1$Case$1.parsePartialFrom(DamlLf1.java:43725)
    at com.digitalasset.daml_lf.DamlLf1$Case$1.parsePartialFrom(DamlLf1.java:43720)
    at com.google.protobuf.CodedInputStream$ArrayDecoder.readMessage(CodedInputStream.java:911)
    at com.digitalasset.daml_lf.DamlLf1$Expr.<init>(DamlLf1.java:22608)
    at com.digitalasset.daml_lf.DamlLf1$Expr.<init>(DamlLf1.java:22394)
    at com.digitalasset.daml_lf.DamlLf1$Expr$1.parsePartialFrom(DamlLf1.java:39643)
    at com.digitalasset.daml_lf.DamlLf1$Expr$1.parsePartialFrom(DamlLf1.java:39638)
    at com.google.protobuf.CodedInputStream$ArrayDecoder.readMessage(CodedInputStream.java:911)
    at com.digitalasset.daml_lf.DamlLf1$CaseAlt.<init>(DamlLf1.java:39836)
    at com.digitalasset.daml_lf.DamlLf1$CaseAlt.<init>(DamlLf1.java:39733)
    at com.digitalasset.daml_lf.DamlLf1$CaseAlt$1.parsePartialFrom(DamlLf1.java:42673)
    at com.digitalasset.daml_lf.DamlLf1$CaseAlt$1.parsePartialFrom(DamlLf1.java:42668)
    at com.google.protobuf.CodedInputStream$ArrayDecoder.readMessage(CodedInputStream.java:911)
    at com.digitalasset.daml_lf.DamlLf1$Case.<init>(DamlLf1.java:42816)
    at com.digitalasset.daml_lf.DamlLf1$Case.<init>(DamlLf1.java:42760)

The actual error is:

com.google.protobuf.InvalidProtocolBufferException: Protocol message had too many levels of nesting. May be malicious. Use CodedInputStream.setRecursionLimit() to increase the depth limit.

The solution is either to increase the recursion limit or to refactor the data structures so they are not so deeply right-nested.

Sign binaries

For general security reasons it would be nice to sign all binaries we produce for a release, including the installer. No need to do it immediately, but in the medium term.

CC @shaul-da

Fix flaky ScalaCodeGenIT

It just failed in my PR with totally unrelated changes:

ScalaCodeGenIT:
- generated package ID among those returned by the packageClient (665 milliseconds)
- alice creates CallablePayout contract and receives corresponding event *** FAILED *** (10 seconds, 179 milliseconds)
  A timeout occurred waiting for a future to complete. Queried 40 times, sleeping 250 milliseconds between each query. (ScalaCodeGenIT.scala:323)
  org.scalatest.concurrent.Futures$FutureConcept$$anon$1:
  at org.scalatest.concurrent.Futures$FutureConcept.tryTryAgain$1(Futures.scala:538)
  at org.scalatest.concurrent.Futures$FutureConcept.futureValueImpl(Futures.scala:550)
  at org.scalatest.concurrent.Futures$FutureConcept.futureValueImpl$(Futures.scala:479)
  at org.scalatest.concurrent.ScalaFutures$$anon$1.futureValueImpl(ScalaFutures.scala:275)

test java codegen with strange type / module names using daml-lf directly

there are various edge cases in the java codegen that we cannot test easily or at all using a .daml file. for example i do not believe we can currently generate type names with dollars in them from .daml.

before it was very hard to write DAML-LF directly, but it's actually much easier now thanks to @remyhaemmerle-da 's parser. see https://github.com/DACH-NY/daml/blob/2766d13da75fcf7b1e08f00fbc02d6891aa53645/daml-foundations/daml-lf/interface/src/test/scala/com/digitalasset/daml/lf/iface/TypeSpec.scala#L75 for an example test where i use the type parser.

problematic cases we should make sure to test:

  • strange names (dollars, keywords, etc)
  • prefix clashes, e.g. Foo:Bar and Foo.Bar:Baz -- this is now forbidden as per DAML-LF collision restriction, see https://github.com/DACH-NY/daml/blob/c876df419ddcfb13b4cc445e9f014c8ac6f6642a/daml-lf/spec/daml-lf-1.rst#id31 . specifically, the type Foo:Bar would conflict with module Foo.Bar
  • Foo:Bar resulting in class Foo in the default namespace, rendering it effectively unusable (cannot be used in another Java file). -- not applicable anymore because now we create a package per module, rather than a class per module.

and anything else @stefanobaghino-da , @gerolf-da , or @mariop-da can think of.
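
The prefix-clash case in the list can be sketched as a simple collision check (an illustrative model of the DAML-LF restriction, not the codegen's actual implementation): a type Foo:Bar and a module Foo.Bar occupy the same dotted name once flattened.

```python
# Detect dotted-name collisions between module names and flattened
# module:type names, as forbidden by the DAML-LF collision restriction.

def dotted_name_clashes(modules, types):
    """modules: ["Foo.Bar", ...]; types: [(module, type), ...]."""
    names = set(modules)
    clashes = []
    for mod, ty in types:
        name = f"{mod}.{ty}"
        if name in names:
            clashes.append(name)
        names.add(name)
    return clashes

print(dotted_name_clashes(["Foo.Bar"], [("Foo", "Bar")]))  # -> ['Foo.Bar']
```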

add grpcurl tutorial to the docs

grpcurl is great because it's code we do not have to maintain. however, getting started is a bit finicky. @bame-da and i agree that having a tutorial for people to work from would be very useful.

implement sdk release process for monorepo

we have to come up with something like the old daml-foundations ci, but for the new repo.

specifically, it should build all the artifacts that an SDK release is made of and upload them to the proper channels.

for jars, the main piece of work required is to come up with the pom.xml files programmatically from bazel queries.

it's not entirely clear yet what we need to do for typescript / javascript / docs.

Components we need to release because the da assistant will download them:

  • damlc
  • daml-extension
  • navigator
  • extractor
  • ledger-api-protos
  • sandbox
  • quickstart-java
  • quickstart-js
  • sdk

JVM libraries pulled in via the pom.xml in quickstart-java:

  • bindings-java
  • bindings-js
  • daml-lf-archive
  • rs-grpc-bridge

Other stuff

Desugaring should insert necessary Paren

Given:

daml 1.2
-- A simple example, deliberately encoding in CRLF format
module Simple where

import DA.Optional

template Person
  with
    person : Party
  where
    signatory person
    controller fromOptional (Just person) can
      Sleep : () do return ()

This gives the error:

/Users/neil/projects/daml/daml-foundations/daml-ghc/tests/Simple.daml:12:16: error:
    * No instance for (IsParties
                         (Optional (Maybe Party) -> Maybe Party))
        arising from a use of `toParties'
        (maybe you haven't applied a function to enough arguments?)
    * In the expression: toParties fromOptional (Just person)
      In the first argument of `concat', namely
        `[toParties fromOptional (Just person)]'
      In the expression: concat [toParties fromOptional (Just person)]

It all worked, but the printed expression is toParties fromOptional (Just person). However, the Apps are nested such that the actual semantics are toParties (fromOptional (Just person)). We should insert a Paren node there so the pretty-printed output doesn't confuse people.

CC @shayne-fletcher-da
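
The fix can be sketched as a pretty-printer rule (a toy model of the desugared App tree, not the real GHC/damlc printer): when the argument of an application is itself an application, wrap it in parentheses so the printed output matches the actual semantics.

```python
# Pretty-print ("app", fun, arg) trees, inserting parentheses around
# any argument that is itself an application.

def pretty(expr):
    if isinstance(expr, tuple):
        _, fun, arg = expr
        arg_s = pretty(arg)
        if isinstance(arg, tuple):  # argument is an App: needs a Paren
            arg_s = f"({arg_s})"
        return f"{pretty(fun)} {arg_s}"
    return expr

e = ("app", "toParties", ("app", "fromOptional", "(Just person)"))
print(pretty(e))  # -> toParties (fromOptional (Just person))
```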

create targets to produce a zip / jar file for each .proto set we want to release

right now we do not have bazel targets producing packages with the .proto files that we want to distribute. these are, at least:

  • daml-lf value
  • daml-lf transaction
  • daml-lf archive
  • ledger api

note that the proto_library rule produces a binary file with the descriptor set for the .protos, but without the protos themselves.

introduce a Sandbox config file

  • Introduce a config file using Typesafe config
  • Put all DB / TLS related entries there
  • Use it for various hardcoded elements as well
  • Reconsider shared use cases with the CLI

Improve damlc warnings on the command line

@raphael-speyer-da requests:

  1. Allow suppressing warnings on the command line. When you just care about errors, having to visually scan through a bunch of warnings on the terminal to find them is slow and finicky.
  • The command line can show warnings regardless of whether there are any errors.

  • When warnings are emitted, also emit their unique id so that users know how to -Werror or suppress them.

Document warning suppression

We should probably document a subset of the options - e.g. encourage people to use {-# OPTIONS -Wno-incomplete-patterns #-} as the less-GHC-specific and more modern thing.

Strip Haskell binaries

We should strip the Haskell binaries that we produce in Bazel. Ideally we would fix this by getting rules_haskell to respect Bazel’s --strip flag.

fix WARNING: An illegal reflective access operation has occurred

On da run sandbox I get some text saying:

WARNING: An illegal reflective access operation has occurred

At this point I freaked out, and, assuming the computer was illegally accessing reflective services, smashed all the mirrors in my house. This message should not be something users see.

The fix for this is in the protobuf lib and will be released in version v3.7.0. There are only RCs for it as I write this. It's going to take some time until this change makes its way up the protobuf -> grpc-protobuf -> grpc-java -> .. dependency chain.

daml-assistant: thin daml assistant.

We're moving toward a new assistant called daml, instead of da, that redirects commands in the style of git. Each sdk version release will ship with a set of tools, called daml-compile, daml-upgrade, daml-doctor, daml-uninstall, etc. When the user invokes the daml command, for example daml compile Foo.daml, the assistant will find the appropriate version of the daml-compile tool and invoke that with the argument Foo.daml.

There will be a .daml folder in the user home folder on unix, and an appropriate location on Windows. The .daml directory will look like:

.daml/
  |- config.yaml
  |- sdk/
       |- 0.12.1/
            |- bin/
                 |- daml
                 |- daml-compile
                 |- daml-upgrade
                 |- ...
            |- ...
       |- 0.12.0/
            |- bin/
                 |- daml
                 |- daml-compile
                 |- daml-upgrade
                 |- ...
            |- ...

The daml command is only responsible for:

  1. Finding the current project by going up the current directory path and finding a da.yaml file.

  2. Selecting the appropriate version of the sdk to use (based on da.yaml, or latest version if not found).

  3. Invoking the appropriate daml-something command with the arguments it was given, and with additional environment variables:

    • DAML_HOME = the absolute path of .daml folder
    • DAML_PROJECT = the absolute path of the project folder (if it exists)
    • DAML_SDK = the absolute path of the chosen sdk version folder
    • DAML_SDK_VERSION = the version number of the chosen sdk version

The invoked daml-something tools should read the $DAML_HOME/config.yaml and $DAML_PROJECT/da.yaml files directly for configuration.
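
Step 1 can be sketched as follows (a hypothetical helper, not the shipped assistant): walk up from the current directory until a da.yaml file is found, or give up at the filesystem root.

```python
# Find the current project by walking up the directory tree looking for
# a da.yaml file, as the daml command is responsible for doing.
import os

def find_project(start):
    d = os.path.abspath(start)
    while True:
        if os.path.isfile(os.path.join(d, "da.yaml")):
            return d
        parent = os.path.dirname(d)
        if parent == d:  # reached the filesystem root: no project found
            return None
        d = parent
```

From the project directory the assistant would then read da.yaml, pick the SDK version, and export DAML_PROJECT and friends before invoking the daml-something tool.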


Packaging and releasing

We want each SDK version to be a single tarball (or zip file). The daml tools, and the daml assistant itself, will be part of the SDK tarball.

We want to be able to move away from relying on Bintray for distributing the SDK. The latest version of the SDK tarball should be made available through some consistent URL for each platform (e.g. daml.com/sdk/latest-linux.tar.gz) that redirects to the latest version.

$DAML_HOME/config.yaml will be configured with this URL, and daml-upgrade will fetch a new version when it becomes available.

Separately, we will need an installer script for each platform, and to incorporate this installer script with existing software distribution channels (such as brew and apt-get). Ideally the installer simply downloads the tarball, expands it to the right place, writes a basic $DAML_HOME/config.yaml, puts daml in the PATH, and you're good to go.


TBD:

  1. Should there be a $DAML_HOME/bin folder? Or should daml be installed somewhere in the PATH? Or something else? Or both?

  2. Should the $DAML_SDK directory be readonly?

    • Except for tools that manage SDK versions, such as daml-upgrade and daml-uninstall, of course.
  3. If a daml-something tool is invoked directly, without the environment variables above, should the tool attempt to determine the flags itself? Or give an error?

    • Part of the daml-assistant project should be a Haskell library that makes it easy to determine these variables if they aren't passed directly, and this is what daml will use, so there is uniformity.
  4. If YES to 3, and the current tool is from the wrong SDK version for the current project (based on da.yaml), should the tool go ahead and silently invoke the appropriately versioned tool instead? Should this be an error? A warning?

Use pure instead of return

Using return confuses people who expect:

return 1
debug 2
return 2

to return immediately at 1 without continuing, like in every other language in the world. Using pure or wrap solves that problem. We should decide what the preferred flavour is and encourage it throughout, perhaps putting return in the DEPRECATED set of compatible things.

CC @meiersi-da @martin-drhu-da @bethaitman-da @bame-da
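
A loose model of the confusion (an analogy, not DAML semantics): in a do block, return is just pure, so it wraps a value and does not exit early; every subsequent action still runs.

```python
# Model a monadic do-block as a list of actions that are all executed
# in order: "return" merely produces a value, it never short-circuits.

def run(actions):
    results = []
    for act in actions:
        results.append(act())
    return results

log = []
actions = [
    lambda: 1,              # "return 1" -- only produces a value
    lambda: log.append(2),  # "debug 2"  -- still executes afterwards
    lambda: 2,              # "return 2"
]
run(actions)
print(log)  # -> [2]: the debug after "return 1" ran anyway
```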

Use native installation paradigm for DAML (brew for OSX)

At present, standard commands to install binaries are not available to developers who want to download DA binary artifacts. To lower the barrier to entry, we want to enable standard installation approaches that are familiar to developers, in particular ones idiomatic to the OS and environment:

OSX

brew cask create daml

cask 'daml' do
  version '110'
  sha256 '2ca6cc3741e7bec3f9efe06eb485ce7a5052a697d5bffa7d9608d624dc2ff0bf'

  url 'https://bintray.com/digitalassetsdk/DigitalAssetSDK/download_file?file_path=com%2Fdigitalasset%2Fda-cli%2F110-f42c34eba1%2Fda-cli-110-f42c34eba1-osx.run'
  name 'Digital Asset SDK da-cli'
  homepage 'https://bintray.com/digitalassetsdk/DigitalAssetSDK/da-cli'

  app 'DAML'
end

Using Brew Detail and Reference

https://engineering.innovid.com/distributing-command-line-tools-with-homebrew-d03e795cadc8
https://kylebebak.github.io/post/distribute-program-via-homebrew
https://docs.brew.sh/Acceptable-Formulae

Related Issues

#331 #332

Faster damlc compilation

@raphael-speyer-da

  • faster compilation: with our big project it currently takes over a minute just to compile the daml, and the code base will likely grow quite a lot over the next year or so

Possible solutions:

  • Expose a -j flag which is faster, but non-deterministic - which doesn't matter for active dev anyway.
  • Use incremental caching with .hi files.
  • Profile and optimise the process

tidy up filesystem structure of the repo

a draft of how to organize the components:

  • release -- what is now daml-foundations/ci
  • daml-lf/ https://github.com/DACH-NY/daml/pull/454
    • spec
    • archive
    • interpreter
    • data
    • transaction
    • lfpackage
    • repl
    • testing-tools
  • ledger-api/
    • proto files
    • basic tests
  • haskell-ide-core/
    • "will be its own component until we spin it out" -- @neil-da
  • compiler/
    • compiler
    • scenario interpreter
    • vs code extension?
  • ledger/
    • common libraries
    • sandbox
    • semantic tests
  • language-support/
    • java/
      • bindings
      • codegen
    • js/
      • bindings
      • ts codegen
    • scala/
      • bindings
      • codegen
  • da-assistant/ (old)
  • daml-assistant/ (new)
  • navigator/ -- #783
  • extractor/ -- #688
  • installers/
  • docs/
    • all the sdk docs
  • bazel_tools/
    • utility functions, etc.
  • dev-env
  • README.md
  • CONTRIBUTING.md
  • BAZEL*.md
  • LICENSE, once we open source

moreover,

  • all sbt files are removed
  • all BUCK files are removed
  • everything that does not fit in the above directories is removed
  • ledger-bindings removed, see #123

change imports for daml_lf.proto

in daml_lf.proto

https://github.com/DACH-NY/daml/blob/master/daml-foundations/daml-lf/archive/da/daml_lf.proto

imports are done as,

import "da/daml_lf_0.proto";
import "da/daml_lf_1.proto";
import "da/daml_lf_dev.proto";

(the da/ prefix is the problem)

which makes it hard to import the files from another project

if you could change that to

import "daml_lf_0.proto";
import "daml_lf_1.proto";
import "daml_lf_dev.proto";

without the da/ prefixes, the files could then be referenced in other projects
