digital-asset / daml
The Daml smart contract language
Home Page: https://www.digitalasset.com/developers
The NOTICE file needs to contain all third party licenses of dependencies.
Use latest snapshot version of Blackduck hub-detect-5.3.0.jar
from https://repo.blackducksoftware.com/artifactory/bds-integrations-snapshot/com/blackducksoftware/integration/hub-detect/5.3.0-SNAPSHOT/
Run a full `bazel build //...` for the daml repo.
Run full Bazel Blackduck scan from root of repo for all projects and components (that Blackduck can recognise)
java -jar ~/Downloads/hub-detect-5.3.0-20190302.191922-36.jar --blackduck.offline.mode=false --detect.included.detector.types=bazel --detect.bazel.target=//... --logging.level.com.blackducksoftware.integration=DEBUG --blackduck.timeout=60 --detect.project.name=local/daml --detect.code.location.name=local/daml --detect.project.version.name=0.11.10 --detect.project.version.phase=development --blackduck.api.token=$BLACKDUCK_HUBDETECT_TOKEN --blackduck.url=https://digitalasset.blackducksoftware.com --detect.npm.include.dev.dependencies=false --detect.python.python3=true --detect.cleanup=false --detect.cleanup.bdio.files=false
Run `da use <latest preview release>` to pull down all SDK artifacts being released.
Run a Blackduck scan against the `--detect.source.path=~/.da/` directory:
java -jar ~/Downloads/hub-detect-5.3.0-20190302.191922-36.jar --blackduck.offline.mode=false --detect.project.tool=DETECTOR --logging.level.com.blackducksoftware.integration=DEBUG --blackduck.timeout=60 --detect.project.name=local/sdk --detect.code.location.name=local/sdk --detect.project.version.name=0.11.10 --detect.project.version.phase=development --blackduck.api.token=$BLACKDUCK_HUBDETECT_TOKEN --blackduck.url=https://digitalasset.blackducksoftware.com --detect.npm.include.dev.dependencies=false --detect.python.python3=true --detect.cleanup=false --detect.cleanup.bdio.files=false --detect.source.path=~/.da/ --detect.detector.search.depth=8 --detect.blackduck.signature.scanner.exclusion.pattern.search.depth=10
Once scan is processed and results are viewable in Blackduck Hub https://digitalasset.blackducksoftware.com, go to relevant project, Reports tab, and generate notices file in txt format
Concatenate the notices file from the SDK artifact scan and the full DAML repo scan and store output in source as NOTICES file at root of daml repo
Bazel query to extract the list of dependencies - the names of Haskell licenses can either be parsed out or extracted from the `--output build` or `--output xml` formats:
bazel query 'kind("(haskell_import|haskell_library|cc_import|cc_library)", filter("@.*", deps(//...)))' --output label_kind --keep_going | sort | uniq
For statically imported Haskell packages that live on Hackage, retrieve the relevant license under its various names in a manner similar to this code:
https://github.com/DACH-NY/daml/blob/ee3506bb2bb3c1c8e463bc05733250f5ffd7153c/daml-foundations/daml-tools/docs/daml-licenses/licenses/extract.py#L84-L88
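As a rough sketch (the candidate file names and URL scheme below are assumptions modeled on the linked script, not taken from it), the retrieval could look like:

```python
# Hypothetical sketch: probe common license file names for a Hackage package.
# The candidate names and the Hackage "src" URL scheme are assumptions here.
def hackage_license_urls(package, version):
    """Return candidate URLs at which a package's license file may live."""
    candidates = ["LICENSE", "LICENSE.txt", "LICENCE", "COPYING", "COPYING.LESSER"]
    base = "https://hackage.haskell.org/package/{}-{}/src/{}"
    return [base.format(package, version, name) for name in candidates]

# a downloader would then try each URL in order and keep the first hit
urls = hackage_license_urls("network", "2.8.0.0")
```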
For non-Haskell packages and/or dynamic libraries, a mapping of library name to relevant license file will need to be maintained manually a la https://github.com/DACH-NY/daml/blob/ee3506bb2bb3c1c8e463bc05733250f5ffd7153c/daml-foundations/daml-tools/docs/daml-licenses/licenses/extract.py#L171-L183
Generate output for static and dynamic Haskell and C libraries in a format equivalent to that of the auto-generated notices file above, concatenate it to the Blackduck output, and include it in the NOTICES file at the root.
They should be generated, put in the docs, and checked to ensure they are not GPL. Or should we conclude it's not necessary?
Currently, the generation of `.pom` files uses hardcoded versions of a few Scala libraries which are taken from `rules_scala`. This is likely to break if we upgrade `rules_scala`.
org.scala-lang:scala-library:2.12.2
org.scala-lang.modules:scala-parser-combinators_2.12:1.0.4
com.thesamet.scalapb:scalapb-runtime:0.8.4
Hi team,
Would it be possible to have a flag in java codegen to package the output into a JAR? It would help with dependencies and distribution.
We expect many of our users to work in locked-down environments, where the assistant won't be able to download from external sites. For these situations we need to provide a zip / tarball with the latest SDK version that can be installed following some instructions (where to unzip, which things to put on the path, etc.).
There are lots of docs on syntax etc everywhere, so someone needs to trawl and put together a complete proposal.
At present, to download DA binary artifacts, standard commands to install binaries are not available to developers. To lower the barrier to entry we want to enable standard installation approaches that are familiar to developers in particular languages idiomatic to the OS and environment:
apt-get daml
The scenario runner implements a key check making sure that contracts we look up are visible:
we need to implement the same check in the sandbox, but right now we don't, see
to implement it, we need to check that the committer (which the sandbox knows) can see the contract requested (which the sandbox does not currently know).
to be able to do this, we need to start storing some additional data in `LedgerState`. Specifically, we need to keep track of which parties can see which contracts, and update this as we add transactions to the ledger state.
open question: do we need to take the ledger time into account at all in this visibility check? cc @oggy- @andreaslochbihler-da
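A minimal sketch of the extra bookkeeping, assuming a dict-based ledger state and hypothetical method names (none of this is the sandbox's real API):

```python
# Sketch only: track which parties can see each contract as transactions are
# added, then check committer visibility on lookup. All names are assumptions.
class LedgerState:
    def __init__(self):
        self.visible_to = {}  # contract_id -> set of parties

    def record_create(self, contract_id, stakeholders):
        # stakeholders (signatories and observers) learn of the contract
        self.visible_to[contract_id] = set(stakeholders)

    def record_divulgence(self, contract_id, party):
        # a party witnessing the contract in a later transaction learns of it
        self.visible_to.setdefault(contract_id, set()).add(party)

    def can_see(self, committer, contract_id):
        # the visibility check the sandbox needs to perform on lookups
        return committer in self.visible_to.get(contract_id, set())
```

Whether ledger time must also feed into the check is exactly the open question above.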
We need to introduce an easily runnable postgres instance we can use for testing. Use https://github.com/testcontainers/testcontainers-java and write a unit test validating that it works.
At present, to download DA binary artifacts, standard commands to install binaries are not available to developers. To lower the barrier to entry we want to enable standard installation approaches that are familiar to developers in particular languages idiomatic to the OS and environment:
....
consider https://hackage.haskell.org/package/nsis
In the sandbox project we noticed that multiple instances (although the same version) of `grpc-core` are put on the classpath. The order of dependencies does make a difference and we can end up with `NoSuchMethodError`s depending on which jar gets found first.
Create a `v2.0` API interface parallel to the `v1.x` one. The new version should implement all the change proposals we collected here. For a transitional period we could support both versions.
Currently compiling ghc-lib is super slow. We should figure out why, and see what can be done. That may well involve profiling GHC itself and figuring out which modules take longest to compile and why.
CC @shayne-fletcher-da
The markdown-to-rst converter in `DA.Daml.GHC.Damldoc.Render.Rst` does not handle code blocks in lists very well, as evidenced by #344. This seems to be a general issue with (GitHub-flavored) markdown (see https://gist.github.com/clintel/1155906). Unfortunately this is not a case where there is an obvious correct fix. Once this is fixed, fix the `Enum` docs accordingly (see change in #344).
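For reference, the indentation that RST expects for a literal block inside a list item can be sketched like this (the helper is hypothetical, not the converter's real API):

```python
# A sketch of the indentation rule the renderer needs: in RST, continuation
# lines of a list item align under the item text, and the literal block body
# is indented further under the paragraph that ends in "::".
def rst_list_item_with_code(text, code_lines):
    item_indent = "  "  # continuation lines align under the "- " marker
    lines = ["- " + text + "::", ""]
    for line in code_lines:
        # four extra spaces mark the literal block relative to the item body
        lines.append(item_indent + "    " + line)
    return "\n".join(lines)
```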
We should put some form of changelog in the `sdk.tar.gz`.
as reported on slack
no main manifest attribute, in /damlc-linux.jar
$ bazel build damlc
[dev-env] Building tools.bazel...
these derivations will be built:
/nix/store/a794c4nz8vmqwx7vwqqnaawnavwcsim5-bazel.drv
these paths will be fetched (109.01 MiB download, 111.54 MiB unpacked):
/nix/store/h5w8nnmbi7gw0rs57b02k8k5xxh5a779-bazel-0.22.0
copying path '/nix/store/h5w8nnmbi7gw0rs57b02k8k5xxh5a779-bazel-0.22.0' from 'http://cache.da-int.net'...
building '/nix/store/a794c4nz8vmqwx7vwqqnaawnavwcsim5-bazel.drv'...
[dev-env] Built tools.bazel in /nix/store/sfv7abpb02amwhzl0x88crj0sx0xz5ks-bazel and linked to /home/flokli/dev/daml/dev-env/var/gc-roots/bazel
Extracting Bazel installation...
Starting local Bazel server and connecting to it...
INFO: Invocation ID: f79fbe22-34cb-4fa2-a560-e3703b31d771
INFO: SHA256 (https://github.com/tferi-da/rules_scala/archive/b6a96474a5ee9eed0515cb0464de374e0df516f8.zip) = f953bb45ee17196a8c054786e30489e3b8764e7fbeb6d6942ac2b61b3c8f7221
DEBUG: Rule 'io_bazel_rules_scala' modified arguments {"sha256": "f953bb45ee17196a8c054786e30489e3b8764e7fbeb6d6942ac2b61b3c8f7221"}
DEBUG: /home/flokli/.cache/bazel/_bazel_flokli/69f3fdf621d09d121bb086d392d083d6/external/bazel_skylib/skylark_library.bzl:23:1: WARNING: skylark_library.bzl is deprecated and will go away in the future, please use bzl_library.bzl instead.
/nix/store/nvvmnb7qn2accgpxzbwvp5i995slycyy-bazel-cc-toolchain
WARNING: /home/flokli/dev/daml/daml-foundations/daml-ghc/package-database/BUILD.bazel:12:1: target 'ghc-prim' is both a rule and a file; please choose another name for the rule
DEBUG: /home/flokli/.cache/bazel/_bazel_flokli/69f3fdf621d09d121bb086d392d083d6/external/bazel_skylib/lib.bzl:30:1: WARNING: lib.bzl is deprecated and will go away in the future, please directly load the bzl file(s) of the module(s) needed as it is more efficient.
[dev-env] Building tools.sed...
[dev-env] Built tools.sed in /nix/store/wkgszaq2dkc4asapcbx6ypd7xdnzad9f-gnused-4.4 and linked to /home/flokli/dev/daml/dev-env/var/gc-roots/sed
checking build system type... x86_64-pc-linux-gnu
checking host system type... x86_64-pc-linux-gnu
checking for gcc... no
checking for cc... no
checking for cl.exe... no
configure: error: in `/home/flokli/.cache/bazel/_bazel_flokli/69f3fdf621d09d121bb086d392d083d6/external/haskell_network_1843485230':
configure: error: no acceptable C compiler found in $PATH
See `config.log' for more details
cabal2bazel: callProcess: ./configure (exit 1): failed
ERROR: Analysis of target '//:damlc' failed; build aborted: no such package '@haskell_network_1843485230//': Traceback (most recent call last):
File "/home/flokli/.cache/bazel/_bazel_flokli/69f3fdf621d09d121bb086d392d083d6/external/ai_formation_hazel/hazel.bzl", line 19
symlink_and_invoke_hazel(ctx, ctx.attr.hazel_base_repo_name, <4 more arguments>)
File "/home/flokli/.cache/bazel/_bazel_flokli/69f3fdf621d09d121bb086d392d083d6/external/ai_formation_hazel/hazel_base_repository/hazel_base_repository.bzl", line 84, in symlink_and_invoke_hazel
fail("Error running hazel on {}:\n{}\...))
Error running hazel on network.cabal:
checking build system type... x86_64-pc-linux-gnu
checking host system type... x86_64-pc-linux-gnu
checking for gcc... no
checking for cc... no
checking for cl.exe... no
[dev-env] Building tools.sed...
[dev-env] Built tools.sed in /nix/store/wkgszaq2dkc4asapcbx6ypd7xdnzad9f-gnused-4.4 and linked to /home/flokli/dev/daml/dev-env/var/gc-roots/sed
configure: error: in `/home/flokli/.cache/bazel/_bazel_flokli/69f3fdf621d09d121bb086d392d083d6/external/haskell_network_1843485230':
configure: error: no acceptable C compiler found in $PATH
See `config.log' for more details
cabal2bazel: callProcess: ./configure (exit 1): failed
INFO: Elapsed time: 22.424s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (131 packages loaded, 1373 targets configured)
currently loading: @haskell_ghc_lib__6093350//
Fetching @haskell_network_1843485230; fetching 5s
The compiling N/M message would do that.
To store contracts and transactions we need to serialize them to JSON.
Options:
In tag `broken-stateful-trait-inheritance`, we have class `TestExecusionSequencerFactory` being a trait with a variable `executionSequencerFactory`.
Before, this class was in the test code for `AbstractContractServiceIT`. @gaboraranyossy-da moved it to `platform-test-utils` and stuff started breaking. Specifically, the setter for `executionSequencerFactory` in `AbstractContractServiceIT` is abstract, for some reason.
We suspect that this has to do with the `ijar` files that Bazel generates, but we're not entirely sure.
Right now the workaround is to duplicate `TestExecusionSequencerFactory` in the tests that need it, but we clearly should fix it.
Requested by @raphael-speyer-da and @andrae-da so they can pack/unpack strings. Concretely, something like `Text -> [Int]` and `[Int] -> Text`, or on a single `Int`, or something around that area.
Currently we don’t seem to include the macOS version in the Bazel cache keys, which leads to warnings about object files having been built on a different version of macOS. While that doesn’t seem to have caused problems so far, it seems very sketchy and we should make sure that the macOS version gets included in the cache keys.
we need to get rid of the recursive data type in the DAML-LF proto. it causes endless trouble (see #437 for the latest in a long series of complaints). i propose to just have an array of exprs at the top level for each module, i.e.
message ExprTable {
repeated Expr exprs = 1;
}
and then have pointers into said array instead of recursive occurrences.
however, we might as well clean up other stuff since we can:
- `getParty` and the flag indicating the absence of party literals
- `exercise`, which expects actors
- `foldl`, `foldr` and `equal_list`
@martin-drhu-da @remyhaemmerle-da please add items as you see fit.
If I have calls to `trace`, `traceRaw`, and `traceId` in my code and I’m running it as a .dar in the sandbox, it should be possible to run the sandbox in a mode in which these are output to the sandbox logs.
To ease barrier to entry of using SDK so developers do not need to add custom repositories or manage their bintray credentials, publish SDK artifacts to a public repository that is already defined by default for most build tools (maven / sbt)
Leading contenders are maven central and bintray jcenter
Approach for publishing outlined here https://reflectoring.io/bintray-jcenter-maven-central/
For the general SDK userbase we want to optimise for making the software available in readily accessible public locations (maven central, jcenter bintray), and that is the reason we want to get our software to those public repos.
Additionally we need to consider that many enterprise institutions will not have public internet access enabled to pull from public repos by default, and having all the artifacts in the single bintray private repo location would be advantageous.
So the recommendation is that we continue to use our private bintray as a staging repo and promote to public repos from there.
We can follow a publish to maven central / jcenter bintray approach as outlined in this article https://reflectoring.io/bintray-jcenter-maven-central/
https://maven.apache.org/repository/guide-central-repository-upload.html
da run damlc -- package /Users/bame/Desktop/Projects/bilateral-repo-demo/scripts/../backend/src/main/daml/LibraryModules.daml br
and
da run damlc -- package /Users/bame/Desktop/Projects/bilateral-repo-demo/backend/src/main/daml/LibraryModules.daml br
give different packageIds.
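One plausible cause (an assumption, not confirmed) is that the package id is computed over the path exactly as given, so the unnormalized `scripts/..` segment changes the hash. A sketch of the symptom and a normalization fix (the `package_id` helper is hypothetical):

```python
# Hypothetical reproduction: hashing the raw path makes equivalent paths
# produce different ids; normalizing the path first makes them agree.
import hashlib
import os.path

def package_id(path):
    normalized = os.path.normpath(path)  # collapses "scripts/.." segments
    return hashlib.sha256(normalized.encode()).hexdigest()[:8]

a = package_id("/proj/scripts/../backend/src/main/daml/LibraryModules.daml")
b = package_id("/proj/backend/src/main/daml/LibraryModules.daml")
assert a == b  # identical after normalization
```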
I see documentation for DA.Text.parseInt_, which isn't exported (or shouldn't be).
All rules are marked with `FIXME(#448)`.
Long scenarios fail with the below:
Waiting for Sandbox................Failed to start sandbox-java
Description: user error (Unable to connect to Sandbox on port 7600: timeout after 15 seconds)
Error type: SomeException
------ last 20 lines from log ------
at com.digitalasset.daml_lf.DamlLf1$CaseAlt$1.parsePartialFrom(DamlLf1.java:42673)
at com.digitalasset.daml_lf.DamlLf1$CaseAlt$1.parsePartialFrom(DamlLf1.java:42668)
at com.google.protobuf.CodedInputStream$ArrayDecoder.readMessage(CodedInputStream.java:911)
at com.digitalasset.daml_lf.DamlLf1$Case.<init>(DamlLf1.java:42816)
at com.digitalasset.daml_lf.DamlLf1$Case.<init>(DamlLf1.java:42760)
at com.digitalasset.daml_lf.DamlLf1$Case$1.parsePartialFrom(DamlLf1.java:43725)
at com.digitalasset.daml_lf.DamlLf1$Case$1.parsePartialFrom(DamlLf1.java:43720)
at com.google.protobuf.CodedInputStream$ArrayDecoder.readMessage(CodedInputStream.java:911)
at com.digitalasset.daml_lf.DamlLf1$Expr.<init>(DamlLf1.java:22608)
at com.digitalasset.daml_lf.DamlLf1$Expr.<init>(DamlLf1.java:22394)
at com.digitalasset.daml_lf.DamlLf1$Expr$1.parsePartialFrom(DamlLf1.java:39643)
at com.digitalasset.daml_lf.DamlLf1$Expr$1.parsePartialFrom(DamlLf1.java:39638)
at com.google.protobuf.CodedInputStream$ArrayDecoder.readMessage(CodedInputStream.java:911)
at com.digitalasset.daml_lf.DamlLf1$CaseAlt.<init>(DamlLf1.java:39836)
at com.digitalasset.daml_lf.DamlLf1$CaseAlt.<init>(DamlLf1.java:39733)
at com.digitalasset.daml_lf.DamlLf1$CaseAlt$1.parsePartialFrom(DamlLf1.java:42673)
at com.digitalasset.daml_lf.DamlLf1$CaseAlt$1.parsePartialFrom(DamlLf1.java:42668)
at com.google.protobuf.CodedInputStream$ArrayDecoder.readMessage(CodedInputStream.java:911)
at com.digitalasset.daml_lf.DamlLf1$Case.<init>(DamlLf1.java:42816)
at com.digitalasset.daml_lf.DamlLf1$Case.<init>(DamlLf1.java:42760)
The actual error is:
com.google.protobuf.InvalidProtocolBufferException: Protocol message had too many levels of nesting. May be malicious. Use CodedInputStream.setRecursionLimit() to increase the depth limit.
The solution is to either increase the recursion limit or refactor the data structures not to be so right-tailed.
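The refactoring option amounts to consuming the right-nested chain with an explicit loop (or stack) instead of recursion, so depth is no longer bounded by a recursion limit. An illustrative sketch, not the actual decoder:

```python
# Illustrative only: a right-nested chain like the Case/Expr chain in the
# stack trace above, walked iteratively so depth cannot blow the stack.
def chain_depth(node):
    depth = 0
    while node is not None:       # O(1) call-stack usage regardless of depth
        depth += 1
        node = node.get("child")
    return depth

# build a nesting far deeper than typical recursion limits
deep = None
for _ in range(100_000):
    deep = {"child": deep}
assert chain_depth(deep) == 100_000
```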
The general problem is to show relevant instances for all types and all typeclasses. It would be best to solve the general problem, or to make it easier to do so at a later date.
For general security reasons it would be nice to sign all binaries we produce for a release, including the installer. No need to do it immediately, but in the medium term.
CC @shaul-da
This test just flaked in my PR with totally unrelated changes:
ScalaCodeGenIT:
- generated package ID among those returned by the packageClient (665 milliseconds)
- alice creates CallablePayout contract and receives corresponding event *** FAILED *** (10 seconds, 179 milliseconds)
A timeout occurred waiting for a future to complete. Queried 40 times, sleeping 250 milliseconds between each query. (ScalaCodeGenIT.scala:323)
org.scalatest.concurrent.Futures$FutureConcept$$anon$1:
at org.scalatest.concurrent.Futures$FutureConcept.tryTryAgain$1(Futures.scala:538)
at org.scalatest.concurrent.Futures$FutureConcept.futureValueImpl(Futures.scala:550)
at org.scalatest.concurrent.Futures$FutureConcept.futureValueImpl$(Futures.scala:479)
at org.scalatest.concurrent.ScalaFutures$$anon$1.futureValueImpl(ScalaFutures.scala:275)
there are various edge cases in the java codegen that we cannot test easily or at all using a `.daml` file. for example i do not believe we can currently generate type names with dollars in them from `.daml`.
before, it was very hard to write DAML-LF directly, but it's actually much easier now thanks to @remyhaemmerle-da's parser. see https://github.com/DACH-NY/daml/blob/2766d13da75fcf7b1e08f00fbc02d6891aa53645/daml-foundations/daml-lf/interface/src/test/scala/com/digitalasset/daml/lf/iface/TypeSpec.scala#L75 for an example test where i use the type parser.
problematic cases we should make sure to test:
- `Foo:Bar` and `Foo.Bar:Baz` -- this is now forbidden as per the DAML-LF collision restriction, see https://github.com/DACH-NY/daml/blob/c876df419ddcfb13b4cc445e9f014c8ac6f6642a/daml-lf/spec/daml-lf-1.rst#id31 . specifically, the type `Foo:Bar` would conflict with module `Foo.Bar`.
- `Foo:Bar` resulting in class `Foo` in the default namespace, rendering it effectively unusable (it cannot be used in another Java file). -- not applicable anymore because now we create a package per module, rather than a class per module.
and anything else @stefanobaghino-da, @gerolf-da, or @mariop-da can think of.
grpcurl is great because it's code we do not have to maintain. however, getting started is a bit finicky. @bame-da and i agree that having a tutorial for people to work from would be very useful.
we have to come up with something like the old daml-foundations ci, but for the new repo.
specifically, it should build all the artifacts that an SDK release is made of and upload them to the proper channels.
for jars, the main piece of work required is to come up with the `pom.xml` files programmatically from bazel queries.
it's not entirely clear yet what we need to do for typescript / javascript / docs.
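For the jar part, the shape of the task can be sketched as follows (hypothetical helper; the real maven coordinates would come out of the bazel queries):

```python
# Hypothetical sketch: turn "group:artifact:version" maven coordinates (as
# they might be extracted from a bazel query over external jar targets) into
# pom.xml <dependency> entries.
def pom_dependencies(coords):
    entries = []
    for coord in coords:
        group, artifact, version = coord.split(":")
        entries.append(
            "    <dependency>\n"
            f"      <groupId>{group}</groupId>\n"
            f"      <artifactId>{artifact}</artifactId>\n"
            f"      <version>{version}</version>\n"
            "    </dependency>"
        )
    return "  <dependencies>\n" + "\n".join(entries) + "\n  </dependencies>"

xml = pom_dependencies(["org.scala-lang:scala-library:2.12.2"])
```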
Components we need to release because the `da` assistant will download them:
damlc
daml-extension
navigator
extractor
ledger-api-protos
sandbox
quickstart-java
quickstart-js
sdk
JVM libraries pulled in via the `pom.xml` in `quickstart-java`:
bindings-java
bindings-js
daml-lf-archive
rs-grpc-bridge
Other stuff
`damlc.jar`: nice to have, but see #465
Given:
daml 1.2
-- A simple example, deliberately encoding in CRLF format
module Simple where
import DA.Optional
template Person
  with
    person : Party
  where
    signatory person
    controller fromOptional (Just person) can
      Sleep : () do return ()
This gives the error:
/Users/neil/projects/daml/daml-foundations/daml-ghc/tests/Simple.daml:12:16: error:
* No instance for (IsParties
(Optional (Maybe Party) -> Maybe Party))
arising from a use of `toParties'
(maybe you haven't applied a function to enough arguments?)
* In the expression: toParties fromOptional (Just person)
In the first argument of `concat', namely
`[toParties fromOptional (Just person)]'
In the expression: concat [toParties fromOptional (Just person)]
It all worked, but the printed expression is `toParties fromOptional (Just person)`. However, the Apps are written such that the actual semantics are `toParties (fromOptional (Just person))`. We should insert a Paren application there so the pretty-printed output doesn't confuse people.
CC @shayne-fletcher-da
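The fix amounts to parenthesizing application arguments that are themselves applications. A toy sketch of the rule (not the actual pretty-printer):

```python
# Toy model: an expression is an atom (str) or ("app", fun, arg). When the
# argument of an application is itself an application, wrap it in parentheses
# so the printed form matches the actual association.
def pretty(expr):
    if isinstance(expr, str):
        return expr
    _, fun, arg = expr
    arg_str = pretty(arg)
    if isinstance(arg, tuple):       # argument is an App: needs parentheses
        arg_str = "(" + arg_str + ")"
    return pretty(fun) + " " + arg_str

e = ("app", "toParties", ("app", "fromOptional", ("app", "Just", "person")))
assert pretty(e) == "toParties (fromOptional (Just person))"
```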
right now we do not have bazel targets producing packages with the .proto files that we want to distribute. these are, at least:
note that the `proto_library` rule produces a binary file with the descriptor set for the .protos, but without the protos themselves.
@gerolf-da said it's good for user experience and we have to do it if we want to be good citizens.
@raphael-speyer-da requests:
command line can show warnings regardless of whether there are any errors.
when warnings are emitted, also emit their unique id so that users know how to -Werror or suppress them
We should probably document a subset of the options - e.g. encourage people to use `{-# OPTIONS -Wno-incomplete-patterns #-}` as the less-GHC-specific and more modern thing.
We should strip the Haskell binaries that we produce in Bazel. Ideally we would fix this by getting `rules_haskell` to respect Bazel’s `--strip` flag.
On `da run sandbox` I get some text saying:
WARNING: An illegal reflective access operation has occurred
At this point I freaked out, and, assuming the computer was illegally accessing reflective services, smashed all the mirrors in my house. This message should not be something users see.
The fix for this is in the protobuf lib and will be released in version v3.7.0. There are only RCs for it when I am writing this. It's gonna take some time until this change makes it up the protobuf -> grpc-protobuf -> grpc-java -> .. dependency chain.
We're moving toward a new assistant called `daml`, instead of `da`, that redirects commands in the style of `git`. Each SDK version release will ship with a set of tools, called `daml-compile`, `daml-upgrade`, `daml-doctor`, `daml-uninstall`, etc. When the user invokes the `daml` command, for example `daml compile Foo.daml`, the assistant will find the appropriate version of the `daml-compile` tool and invoke that with the argument `Foo.daml`.
There will be a `.daml` folder in the user home folder on unix, and an appropriate location on Windows. The `.daml` directory will look like:
.daml/
|- config.yaml
|- sdk/
|- 0.12.1/
|- bin/
|- daml
|- daml-compile
|- daml-upgrade
|- ...
|- ...
|- 0.12.0/
|- bin/
|- daml
|- daml-compile
|- daml-upgrade
|- ...
|- ...
The `daml` command is only responsible for:
Finding the current project by going up the current directory path and finding a `da.yaml` file.
Selecting the appropriate version of the SDK to use (based on `da.yaml`, or the latest version if not found).
Invoking the appropriate `daml-something` command with the arguments it was given, and with additional environment variables pointing at:
- the `.daml` folder
- the sdk version folder
- the sdk version
The invoked `daml-something` tools should read the `$DAML_HOME/config.yaml` and `$DAML_PROJECT/da.yaml` files directly for configuration.
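The first responsibility above can be sketched in a few lines (hypothetical helper, not the assistant's real code):

```python
# Sketch: locate the project root by walking up from the current directory
# until a da.yaml file is found; None means "no project, use the latest SDK".
import os

def find_project_root(start):
    path = os.path.abspath(start)
    while True:
        if os.path.isfile(os.path.join(path, "da.yaml")):
            return path
        parent = os.path.dirname(path)
        if parent == path:            # hit the filesystem root: no project
            return None
        path = parent
```

The assistant would then read the SDK version out of that `da.yaml` and exec the matching `daml-something` binary.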
We want each SDK version to be a single tarball (or zip file). The daml tools, and the `daml` assistant itself, will be part of the SDK tarball.
We want to be able to move away from relying on Bintray for distributing the SDK. The latest version of the SDK tarball should be made available through some consistent URL for each platform (e.g. `daml.com/sdk/latest-linux.tar.gz`) that redirects to the latest version. `$DAML_HOME/config.yaml` will be configured with this URL, and `daml-upgrade` will fetch a new version when it becomes available.
Separately, we will need an installer script for each platform, and to incorporate this installer script with existing software distribution channels (such as `brew` and `apt-get`). Ideally the installer simply downloads the tarball, expands it to the right place, writes a basic `$DAML_HOME/config.yaml`, puts `daml` on the PATH, and you're good to go.
TBD:
1. Should there be a `$DAML_HOME/bin` folder? Or should `daml` be installed somewhere in the PATH? Or something else? Or both?
2. Should the `$DAML_SDK` directory be readonly? (Except to `daml-upgrade` and `daml-uninstall`, of course.)
3. If a `daml-something` tool is invoked directly, without the environment variables above, should the tool attempt to determine the flags itself? Or give an error? The `daml-assistant` project should be a Haskell library that makes it easy to determine these variables if they aren't passed directly, and this is what `daml` will use, so there is uniformity.
4. If YES to 3, and the current tool is from the wrong SDK version for the current project (based on `da.yaml`), should the tool go ahead and silently invoke the appropriately versioned tool instead? Should this be an error? A warning?
Using `return` confuses people who expect:
return 1
debug 2
return 2
to `return` immediately at `1` without continuing, like in every other language in the world. Using `pure` or `wrap` solves that problem. We should decide what the preferred flavour is, and encourage it throughout, perhaps putting `return` in the DEPRECATED set of compatible things.
Currently, tarballs created using `package_app` have the executable flag set. We should not set this flag.
At present, to download DA binary artifacts, standard commands to install binaries are not available to developers. To lower the barrier to entry we want to enable standard installation approaches that are familiar to developers in particular languages idiomatic to the OS and environment:
brew cask create daml
cask 'daml' do
version '110'
sha256 '2ca6cc3741e7bec3f9efe06eb485ce7a5052a697d5bffa7d9608d624dc2ff0bf'
url 'https://bintray.com/digitalassetsdk/DigitalAssetSDK/download_file?file_path=com%2Fdigitalasset%2Fda-cli%2F110-f42c34eba1%2Fda-cli-110-f42c34eba1-osx.run'
name 'Digital Asset SDK da-cli'
homepage 'https://bintray.com/digitalassetsdk/DigitalAssetSDK/da-cli'
app 'DAML'
end
https://engineering.innovid.com/distributing-command-line-tools-with-homebrew-d03e795cadc8
https://kylebebak.github.io/post/distribute-program-via-homebrew
https://docs.brew.sh/Acceptable-Formulae
Possible solutions:
`-j` flag, which is faster but non-deterministic - which doesn't matter for active dev anyway.
Likely a dependency for #188.
a draft of how to organize the components:
release -- what is now daml-foundations/ci
daml-lf/
https://github.com/DACH-NY/daml/pull/454
spec
archive
interpreter
data
transaction
lfpackage
repl
testing-tools
ledger-api/
haskell-ide-core/
compiler/
ledger/
language-support/
java/
js/
scala/
da-assistant/ (old)
daml-assistant/ (new)
navigator/ -- #783
extractor/ -- #688
installers/
docs/
bazel_tools/
dev-env
README.md
CONTRIBUTING.md
BAZEL*.md
LICENSE, once we open source
moreover, BUCK files are removed
in `daml_lf.proto`
https://github.com/DACH-NY/daml/blob/master/daml-foundations/daml-lf/archive/da/daml_lf.proto
imports are done as,
import "da/daml_lf_0.proto";
import "da/daml_lf_1.proto";
import "da/daml_lf_dev.proto";
which makes it hard to import the files from another project. If you could change that to
import "daml_lf_0.proto";
import "daml_lf_1.proto";
import "daml_lf_dev.proto";
Without the `da/` prefixes the files could then be referenced in other projects.