
Building the first permissionless layer2 orderbook DEX on Ethereum, powered by PLONK zk-rollup

Home Page: https://www.fluidex.io/

fluidex-backend's Introduction

FluiDex Backend

The FluiDex team is building the first permissionless layer2 orderbook DEX on Ethereum, powered by PLONK zk-rollup.

This repo contains all the backend components, including the exchange matching engine, the rollup state manager, the prover cluster (master and workers), and the ZK circuit code. You can read through our design rationale here.

Currently it is only a demo/PoC version, and many features are still WIP.

Architecture

Components & Submodules

Submodules:

  • circuits: ZK-Rollup circuits written in circom. This submodule is nested inside the rollup-state-manager submodule.
  • dingir-exchange: the matching-engine server. It matches EdDSA-signed L2 orders from users and generates trades. It writes all the 'events' (e.g., orders/trades/balance updates) to the global Kafka message bus (a sketch for inspecting the bus follows this list).
  • rollup-state-manager: maintains the global rollup state Merkle tree. It fetches events/transactions from the Kafka message bus, updates the Merkle tree accordingly, and generates L2 blocks.
  • prover-cluster: a master-workers cluster for proving PLONK ZK-SNARK circuits. It loads L2 blocks generated by rollup-state-manager, proves them, and writes the proofs to the database.
  • regnbue-bridge: an L1-L2 bridge for fast withdrawal/deposit. In the current demo version it acts like a faucet, sending some initial tokens to each new user of the FluiDex zk-rollup network.
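
Since the components communicate through Kafka, a quick way to observe the event flow is to tail one of the topics. A minimal sketch using kcat (formerly kafkacat); the broker address is an assumption about the local compose setup:

# tail the 'trades' topic from the beginning; the other topic names
# (orders, balances, internaltransfer, registeruser) appear in the persistor logs quoted below
$ kcat -b localhost:9092 -t trades -C -o beginning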

Some external services:

  • Kafka: used as the global message bus.
  • PostgreSQL: the main database we use. It stores the matching-engine history/state, the prover-cluster state, the rollup state (L2 blocks/L2 txs), etc.
  • TimescaleDB: a time-series database, used for exchange market data (e.g., K-line). (a quick health check follows this list)
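
These services are launched from the docker-compose files in this repo; to check that they are up, you can query compose directly (the compose file and project name below are taken from the Makefile's tail_log output quoted in the issues section):

# list the containers of the main compose cluster
$ docker-compose --file orchestra/docker/docker-compose.yaml --project-name orchestra ps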

Some zero-knowledge development tools developed by the FluiDex team are used to process the circuits, including snarkit and plonkit.
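
If the install script misses them, both tools can be installed manually; these commands are copied from the issue notes further below:

# install plonkit from source and snarkit globally
$ cargo install --git https://github.com/Fluidex/plonkit
$ npm i -g snarkit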

How to run it

Ubuntu 20.04 is currently the only supported environment. You can speed up builds by following https://doc.rust-lang.org/cargo/guide/build-cache.html#shared-cache, and more documentation can be found here.
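
For example, a shared compilation cache across the Rust submodules can be set up roughly like this (using sccache here is an assumption; the cargo guide linked above describes the general approach):

# route rustc through a shared compilation cache
$ cargo install sccache
$ export RUSTC_WRAPPER=sccache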

# install some dependencies and tools
# including rust / docker / docker-compose / nodejs etc.
$ bash scripts/install_all.sh

# compile zk circuits and setup circuits locally
# start databases and message queue with docker compose
# and launch all the services
# and a mock orders/trades generator
$ bash run.sh

# stop all the processes and destroy docker compose clusters
$ bash stop.sh

Some useful commands have been added to the Makefile:

# print the L2 blocks status, total block number, prover block number, etc.
$ make prover_status

# print the latest trades generated by matchengine
$ make new_trades

You can also attach a prover client cluster to the backend; see the document.

Persist Data

NOTE: on the first run, DO NOT set DX_CLEAN before running run.sh. Afterwards, set the env var DX_CLEAN to false (case-insensitive) to skip the data-purging stage when executing stop.sh.
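
In other words, a typical persist-and-restart flow looks like this (note the related known issue, "restart with export DX_CLEAN=false may fail", in the issues section below):

# first launch: leave DX_CLEAN unset so the stack initializes from scratch
$ bash run.sh
# later: stop without purging data, then restart on top of the persisted state
$ export DX_CLEAN=false
$ bash stop.sh
$ bash run.sh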

TODOs

  • Data availability. Also, change the original inputs to private inputs, then use their hash as the (single) public input.

Known Issues

  • In order signature verification, the user nonce and order ID should be signed, but are not yet.
  • For testing convenience, the common reference string (CRS) is set up locally rather than through an MPC ceremony.

fluidex-backend's People

Contributors

haoyuathz, lightsing, lispc, noel2004, silathdiir

fluidex-backend's Issues

latest contract too large?

Transaction reverted: trying to deploy a contract whose code is too large
Warning: Contract code size exceeds 24576 bytes (a limit introduced in Spurious Dragon). This contract may not be deployable on mainnet. Consider enabling the optimizer (with a low "runs" value!), turning off revert strings, or using libraries.  

devops: metric of apis and sqls

  1. how much time every API endpoint takes (more urgent, in case we forget to add a correct index to the DB)
  2. how often each endpoint is queried (not urgent)
  3. error ratio (a crude probe sketch follows this list)
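
Until proper metrics exist, a crude latency probe is possible with curl; the endpoint URL below is a purely hypothetical placeholder:

# print the total request time for one endpoint (hypothetical URL)
$ curl -o /dev/null -s -w '%{time_total}\n' http://localhost:PORT/api/endpoint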

restart without cleaning DB

Allow restarting without cleaning the DB.

Make "clean the DB" vs. "do not clean the DB" configurable.

The default should be "do not clean the DB".

  1. pg DB
  2. sled DB
  3. do we need to handle the MQ carefully? (a sketch follows this list)
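
A sketch of how stop.sh could gate the purge on such a flag (illustrative only, not the actual script; the case-insensitive DX_CLEAN semantics follow the Persist Data note above):

# skip the data purge when DX_CLEAN is 'false' (case-insensitive)
if [ "${DX_CLEAN,,}" != "false" ]; then
    docker-compose down -v    # destroy containers and their data volumes
else
    docker-compose down       # keep the pg data; leave sled files on disk
fi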

restart with `export DX_CLEAN=false` may fail

description

restart with export DX_CLEAN=false may fail for

  • prover-cluster client
  • block_submitter
  • tick
  • restapi
  • contracts ticker

how to reproduce

export DX_CLEAN=false
./stop.sh
./run.sh

log

ubuntu@ip-172-31-18-140:~/repos/fluidex-backend$ make tail_log 
docker-compose --file orchestra/docker/docker-compose.yaml --project-name orchestra logs > orchestra/docker-compose.log
docker-compose --file regnbue-bridge/docker/docker-compose.yaml --project-name faucet logs > regnbue-bridge/docker-compose.log
ls rollup-state-manager/*.log prover-cluster/*.log dingir-exchange/*.log dingir-exchange/logs/*.log regnbue-bridge/*.log contracts/*.log orchestra/*.log | xargs tail -n 3
==> contracts/ganache.2021-09-23.log <==
9007199254740991

Listening on 127.0.0.1:8545

==> contracts/ticker.2021-09-23.log <==
        at emitErrorNT (node:internal/streams/destroy:193:8)
        at emitErrorCloseNT (node:internal/streams/destroy:158:3)
        at processTicksAndRejections (node:internal/process/task_queues:83:21)

==> dingir-exchange/logs/matchengine.2021-09-23.log <==
    
Sep 23 04:10:38.953  INFO dingir_exchange::message::producer: kafka producer disconnected    
Sep 23 04:10:38.953  INFO dingir_exchange::message::producer: kafka producer running terminated    

==> dingir-exchange/logs/persistor.2021-09-23.log <==
Sep 23 04:10:40.281  INFO dingir_exchange::message::persist: start auto commiting for topic internaltransfer    
Sep 23 04:10:40.281  INFO dingir_exchange::message::persist: start auto commiting for topic registeruser    
Sep 23 04:10:41.301  INFO dingir_exchange::message::consumer: start consuming topic ["trades", "internaltransfer", "registeruser", "balances", "orders"]    

==> dingir-exchange/logs/restapi.2021-09-23.log <==
  44: __libc_start_main
  45: _start
    

==> dingir-exchange/tick.2021-09-23.log <==
  details: 'No connection established',
  metadata: Metadata { internalRepr: Map(0) {}, options: {} }
}

==> orchestra/docker-compose.log <==
exchange_envoy        | [2021-09-23 04:19:48.260][19][debug][connection] [source/common/network/connection_impl.cc:243] [C49] closing socket: 0
exchange_envoy        | [2021-09-23 04:19:48.260][19][debug][client] [source/common/http/codec_client.cc:107] [C49] disconnect. resetting 0 pending requests
exchange_envoy        | [2021-09-23 04:19:48.260][19][debug][pool] [source/common/conn_pool/conn_pool_base.cc:407] [C49] client disconnected, failure reason: 

==> prover-cluster/client.2021-09-23.log <==
  15: __libc_start_main
  16: _start
    

==> prover-cluster/coordinator.2021-09-23.log <==
  status = $1
  and updated_time < current_timestamp - interval '172800 seconds'
  

==> regnbue-bridge/block_submitter.2021-09-23.log <==
    0: error trying to connect: tcp connect error: Connection refused (os error 111)
    1: tcp connect error: Connection refused (os error 111)
    2: Connection refused (os error 111)

==> regnbue-bridge/docker-compose.log <==
regnbue_bridge_pq | 2021-09-23 04:10:26.580 UTC [28] LOG:  TimescaleDB background worker launcher connected to shared catalogs
regnbue_bridge_pq | 2021-09-23 04:11:31.098 UTC [51] LOG:  the "timescaledb" extension is not up-to-date
regnbue_bridge_pq | 2021-09-23 04:11:31.098 UTC [51] HINT:  The most up-to-date version is 2.4.2, the installed version is 2.1.0.

==> regnbue-bridge/faucet.2021-09-23.log <==
  
Sep 23 04:19:50.229  INFO sqlx::query: COMMIT; rows: 0, elapsed: 93.348µs  
Sep 23 04:19:50.230  INFO sqlx::query: /* SQLx ping */; rows: 0, elapsed: 83.309µs  

==> rollup-state-manager/rollup_state_manager.2021-09-23.log <==
Sep 23 04:15:22.823  INFO rollup_state_manager: generate 0 blocks with block_size 2 in 240.00372s: average TPS: 0    
Sep 23 04:17:22.823  INFO rollup_state_manager: generate 0 blocks with block_size 2 in 360.00385s: average TPS: 0    
Sep 23 04:19:22.823  INFO rollup_state_manager: generate 0 blocks with block_size 2 in 480.00394s: average TPS: 0    
ubuntu@ip-172-31-18-140:~/repos/fluidex-backend$ tail -n 300 dingir-exchange/logs/restapi.2021-09-23.log
Sep 23 04:10:40.256 DEBUG restapi: Prepared DB connection: postgres://exchange:[email protected]/exchange    
Sep 23 04:10:40.257  INFO restapi: Connect to manage channel http://0.0.0.0:50051    
Sep 23 04:10:41.666 ERROR fluidex_common::non_blocking_tracing: thread 'main' panicked at 'called `Result::unwrap()` on an `Err` value: tonic::transport::Error(Transport, hyper::Error(Connect, ConnectError("tcp connect error", Os { code: 111, kind: ConnectionRefused, message: "Connection refused" })))', src/bin/restapi.rs:34:18
   0: fluidex_common::non_blocking_tracing::get_backtrace
             at /home/ubuntu/.cargo/git/checkouts/common-rs-f10c3d305ff6aa0a/24293b3/src/non_blocking_tracing.rs:50:30
   1: fluidex_common::non_blocking_tracing::set_panic_hook::{{closure}}
             at /home/ubuntu/.cargo/git/checkouts/common-rs-f10c3d305ff6aa0a/24293b3/src/non_blocking_tracing.rs:39:13
   2: std::panicking::rust_panic_with_hook
             at /rustc/53cb7b09b00cbea8754ffb78e7e3cb521cb8af4b/library/std/src/panicking.rs:595:17
   3: std::panicking::begin_panic_handler::{{closure}}
             at /rustc/53cb7b09b00cbea8754ffb78e7e3cb521cb8af4b/library/std/src/panicking.rs:497:13
   4: std::sys_common::backtrace::__rust_end_short_backtrace
             at /rustc/53cb7b09b00cbea8754ffb78e7e3cb521cb8af4b/library/std/src/sys_common/backtrace.rs:141:18
   5: rust_begin_unwind
             at /rustc/53cb7b09b00cbea8754ffb78e7e3cb521cb8af4b/library/std/src/panicking.rs:493:5
   6: core::panicking::panic_fmt
             at /rustc/53cb7b09b00cbea8754ffb78e7e3cb521cb8af4b/library/core/src/panicking.rs:92:14
   7: core::result::unwrap_failed
             at /rustc/53cb7b09b00cbea8754ffb78e7e3cb521cb8af4b/library/core/src/result.rs:1355:5
   8: core::result::Result<T,E>::unwrap
             at /rustc/53cb7b09b00cbea8754ffb78e7e3cb521cb8af4b/library/core/src/result.rs:1037:23
   9: restapi::main::{{closure}}
             at src/bin/restapi.rs:29:13
  10: <core::future::from_generator::GenFuture<T> as core::future::future::Future>::poll
             at /rustc/53cb7b09b00cbea8754ffb78e7e3cb521cb8af4b/library/core/src/future/mod.rs:80:19
  11: <tokio::task::local::RunUntil<T> as core::future::future::Future>::poll::{{closure}}::{{closure}}
             at /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.9.0/src/task/local.rs:668:65
  12: tokio::coop::with_budget::{{closure}}
             at /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.9.0/src/coop.rs:106:9
  13: std::thread::local::LocalKey<T>::try_with
             at /rustc/53cb7b09b00cbea8754ffb78e7e3cb521cb8af4b/library/std/src/thread/local.rs:376:16
  14: std::thread::local::LocalKey<T>::with
             at /rustc/53cb7b09b00cbea8754ffb78e7e3cb521cb8af4b/library/std/src/thread/local.rs:352:9
  15: tokio::coop::with_budget
             at /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.9.0/src/coop.rs:99:5
      tokio::coop::budget
             at /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.9.0/src/coop.rs:76:5
      <tokio::task::local::RunUntil<T> as core::future::future::Future>::poll::{{closure}}
             at /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.9.0/src/task/local.rs:668:42
  16: tokio::macros::scoped_tls::ScopedKey<T>::set
             at /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.9.0/src/macros/scoped_tls.rs:61:9
  17: tokio::task::local::LocalSet::with
             at /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.9.0/src/task/local.rs:573:9
  18: <tokio::task::local::RunUntil<T> as core::future::future::Future>::poll
             at /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.9.0/src/task/local.rs:658:9
  19: tokio::task::local::LocalSet::run_until::{{closure}}
             at /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.9.0/src/task/local.rs:516:9
  20: <core::future::from_generator::GenFuture<T> as core::future::future::Future>::poll
             at /rustc/53cb7b09b00cbea8754ffb78e7e3cb521cb8af4b/library/core/src/future/mod.rs:80:19
  21: <core::pin::Pin<P> as core::future::future::Future>::poll
             at /rustc/53cb7b09b00cbea8754ffb78e7e3cb521cb8af4b/library/core/src/future/future.rs:120:9
  22: tokio::runtime::basic_scheduler::Inner<P>::block_on::{{closure}}::{{closure}}
             at /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.9.0/src/runtime/basic_scheduler.rs:208:62
  23: tokio::coop::with_budget::{{closure}}
             at /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.9.0/src/coop.rs:106:9
  24: std::thread::local::LocalKey<T>::try_with
             at /rustc/53cb7b09b00cbea8754ffb78e7e3cb521cb8af4b/library/std/src/thread/local.rs:376:16
  25: std::thread::local::LocalKey<T>::with
             at /rustc/53cb7b09b00cbea8754ffb78e7e3cb521cb8af4b/library/std/src/thread/local.rs:352:9
  26: tokio::coop::with_budget
             at /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.9.0/src/coop.rs:99:5
      tokio::coop::budget
             at /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.9.0/src/coop.rs:76:5
      tokio::runtime::basic_scheduler::Inner<P>::block_on::{{closure}}
             at /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.9.0/src/runtime/basic_scheduler.rs:208:39
  27: tokio::runtime::basic_scheduler::enter::{{closure}}
             at /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.9.0/src/runtime/basic_scheduler.rs:299:29
  28: tokio::macros::scoped_tls::ScopedKey<T>::set
             at /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.9.0/src/macros/scoped_tls.rs:61:9
  29: tokio::runtime::basic_scheduler::enter
             at /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.9.0/src/runtime/basic_scheduler.rs:299:5
  30: tokio::runtime::basic_scheduler::Inner<P>::block_on
             at /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.9.0/src/runtime/basic_scheduler.rs:197:9
  31: tokio::runtime::basic_scheduler::InnerGuard<P>::block_on
             at /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.9.0/src/runtime/basic_scheduler.rs:452:9
  32: tokio::runtime::basic_scheduler::BasicScheduler<P>::block_on
             at /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.9.0/src/runtime/basic_scheduler.rs:157:24
  33: tokio::runtime::Runtime::block_on
             at /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.9.0/src/runtime/mod.rs:450:46
  34: tokio::task::local::LocalSet::block_on
             at /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.9.0/src/task/local.rs:477:9
  35: actix_rt::runtime::Runtime::block_on
             at /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/actix-rt-2.1.0/src/runtime.rs:85:9
  36: actix_rt::system::SystemRunner::block_on
             at /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/actix-rt-2.1.0/src/system.rs:186:9
  37: restapi::main
             at src/bin/restapi.rs:17:1
  38: core::ops::function::FnOnce::call_once
             at /rustc/53cb7b09b00cbea8754ffb78e7e3cb521cb8af4b/library/core/src/ops/function.rs:227:5
  39: std::sys_common::backtrace::__rust_begin_short_backtrace
             at /rustc/53cb7b09b00cbea8754ffb78e7e3cb521cb8af4b/library/std/src/sys_common/backtrace.rs:125:18
  40: std::rt::lang_start::{{closure}}
             at /rustc/53cb7b09b00cbea8754ffb78e7e3cb521cb8af4b/library/std/src/rt.rs:49:18
  41: core::ops::function::impls::<impl core::ops::function::FnOnce<A> for &F>::call_once
             at /rustc/53cb7b09b00cbea8754ffb78e7e3cb521cb8af4b/library/core/src/ops/function.rs:259:13
      std::panicking::try::do_call
             at /rustc/53cb7b09b00cbea8754ffb78e7e3cb521cb8af4b/library/std/src/panicking.rs:379:40
      std::panicking::try
             at /rustc/53cb7b09b00cbea8754ffb78e7e3cb521cb8af4b/library/std/src/panicking.rs:343:19
      std::panic::catch_unwind
             at /rustc/53cb7b09b00cbea8754ffb78e7e3cb521cb8af4b/library/std/src/panic.rs:431:14
      std::rt::lang_start_internal
             at /rustc/53cb7b09b00cbea8754ffb78e7e3cb521cb8af4b/library/std/src/rt.rs:34:21
  42: std::rt::lang_start
             at /rustc/53cb7b09b00cbea8754ffb78e7e3cb521cb8af4b/library/std/src/rt.rs:48:5
  43: main
  44: __libc_start_main
  45: _start

block_submitter error: VM Exception

Nov 02 07:42:09.887 ERROR regnbue_bridge::block_submitter::eth_sender: (code: -32000, message: VM Exception while processing transaction: revert, data: Some(Object({"stack": String("o: VM Exception while processing transaction: revert\n    at Function.o.fromResults (/app/ganache-core.docker.cli.js:4:297036)\n    at e.exports (/app/ganache-core.docker.cli.js:55:2302856)"), "name": String("o")})))

add BIG CI for this repo

Run it every 12 hours rather than on every PR.

Clone the code, start everything, and check many things:

check the DB, check the API, check that processes are alive, etc. (a sketch follows)
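
A minimal smoke-test script such a scheduled job could run (a sketch reusing the commands documented above; the repo URL and process name are assumptions):

# clone, launch, probe, tear down
$ git clone --recursive https://github.com/fluidex/fluidex-backend && cd fluidex-backend
$ bash run.sh
$ sleep 600                              # let some L2 blocks be produced
$ make prover_status                     # DB check: block / prover progress
$ make new_trades                        # API check: trades are flowing
$ pgrep -f rollup_state_manager || echo "state manager died"    # process-alive check
$ bash stop.sh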

contracts ticker nonce error

ticker deployed to: 0x2FDBb59691c94d390c8C1e6cCC3caB7B635ab6bF
tick tock {
  hash: '0xaa14a6af62c661afba9c90f2324c89a095722fc089ad9d6a0909645666b9bfcc',
  type: 0,
  accessList: null,
  blockHash: '0xbb605555413ee60ce11161d6960ffde717ac8ebb0a98e26d6e080495aef2ae60',
  blockNumber: 3,
  transactionIndex: 0,
  confirmations: 1,
  from: '0x3577C97DA04A1cf179bcf651Cb41fbDD1B3e4E43',
  gasPrice: BigNumber { _hex: '0x0ba43b7400', _isBigNumber: true },
  gasLimit: BigNumber { _hex: '0x0324ab', _isBigNumber: true },
  to: '0x2FDBb59691c94d390c8C1e6cCC3caB7B635ab6bF',
  value: BigNumber { _hex: '0x00', _isBigNumber: true },
  nonce: 2,
  data: '0x3eaf5d9f',
  r: '0x5c7a3c31b8c0effb9490fb4e726fab279cb1cfe6a49bf3a092fecd65d34d9d26',
  s: '0x096c8729c949e9f1aa255b9879a150422a9312d86c6cd3b47f21955452b84d20',
  v: 106778,
  creates: null,
  chainId: 53371,
  wait: [Function (anonymous)]
}
tick tock {
  hash: '0xe734f984c04b8b34e5ba4d7f8d4ed9a1be93428e2d061dd85466d021e7f3c5a9',
  type: 0,
  accessList: null,
  blockHash: '0xc63ab70636da6e32b56d64fb67aae08ea69dbc208ed2df2217563378fe7968ed',
  blockNumber: 4,
  transactionIndex: 0,
  confirmations: 1,
  from: '0x3577C97DA04A1cf179bcf651Cb41fbDD1B3e4E43',
  gasPrice: BigNumber { _hex: '0x0ba43b7400', _isBigNumber: true },
  gasLimit: BigNumber { _hex: '0x01ffb3', _isBigNumber: true },
  to: '0x2FDBb59691c94d390c8C1e6cCC3caB7B635ab6bF',
  value: BigNumber { _hex: '0x00', _isBigNumber: true },
  nonce: 3,
  data: '0x3eaf5d9f',
  r: '0xffeb618adec94f4144dace870c0840418844beb84bbc7039e6284328f0fcb338',
  s: '0x0022b31c645741fa05293320770d36bf46c9aa632f2bd9f30aa4956c74523209',
  v: 106778,
  creates: null,
  chainId: 53371,
  wait: [Function (anonymous)]
}
tick tock {
  hash: '0x34936a24387854469537e62f9dad82bc31378fa4e2d1a702672364e393b9a841',
  type: 0,
  accessList: null,
  blockHash: '0x3c742977e4471018272a737daf2372f535c27ed2a33e9580df2d617d4c7722b8',
  blockNumber: 5,
  transactionIndex: 0,
  confirmations: 1,
  from: '0x3577C97DA04A1cf179bcf651Cb41fbDD1B3e4E43',
  gasPrice: BigNumber { _hex: '0x0ba43b7400', _isBigNumber: true },
  gasLimit: BigNumber { _hex: '0x01ffb3', _isBigNumber: true },
  to: '0x2FDBb59691c94d390c8C1e6cCC3caB7B635ab6bF',
  value: BigNumber { _hex: '0x00', _isBigNumber: true },
  nonce: 4,
  data: '0x3eaf5d9f',
  r: '0x735f5a23b824ebe8db316a1335064f736a37a3cf04dd809a043ef32e6eb2e4ee',
  s: '0x565c7efaba56400a06cc7d3b7ad35c0e6146ad53755591b4c2e2bf7d8732119d',
  v: 106778,
  creates: null,
  chainId: 53371,
  wait: [Function (anonymous)]
}
tick tock {
  hash: '0x49856918d71f31b62b1a369750166f88a659d56a1d65bc4b030e38a589c23f2b',
  type: 0,
  accessList: null,
  blockHash: '0xc44a93e4f96de22d48d1a82142a69d50a72854c238153d5a221eabda87be8483',
  blockNumber: 6,
  transactionIndex: 0,
  confirmations: 1,
  from: '0x3577C97DA04A1cf179bcf651Cb41fbDD1B3e4E43',
  gasPrice: BigNumber { _hex: '0x0ba43b7400', _isBigNumber: true },
  gasLimit: BigNumber { _hex: '0x01ffb3', _isBigNumber: true },
  to: '0x2FDBb59691c94d390c8C1e6cCC3caB7B635ab6bF',
  value: BigNumber { _hex: '0x00', _isBigNumber: true },
  nonce: 5,
  data: '0x3eaf5d9f',
  r: '0xa0f1b9dd6c600a5e2fff68e6e291b7d82944e3cc8ebf732af8eba62aa880beba',
  s: '0x12f8549844059c7955252950e4f6b7d68363be874d5bee02dd81c7398d585593',
  v: 106777,
  creates: null,
  chainId: 53371,
  wait: [Function (anonymous)]
}
tick tock {
  hash: '0x601223e2a767f8b341cf84db23be2be49598bb1efca9ae2e8f9b1c2ac84752ba',
  type: 0,
  accessList: null,
  blockHash: '0x5dfccc7ac7867ccd3fe2901cb4544084d2a5dcd4374fbcedd67dfd69d332800a',
  blockNumber: 7,
  transactionIndex: 0,
  confirmations: 1,
  from: '0x3577C97DA04A1cf179bcf651Cb41fbDD1B3e4E43',
  gasPrice: BigNumber { _hex: '0x0ba43b7400', _isBigNumber: true },
  gasLimit: BigNumber { _hex: '0x01ffb3', _isBigNumber: true },
  to: '0x2FDBb59691c94d390c8C1e6cCC3caB7B635ab6bF',
  value: BigNumber { _hex: '0x00', _isBigNumber: true },
  nonce: 6,
  data: '0x3eaf5d9f',
  r: '0x309cc25fc8cfffb1573fb0a36877f5a3d34e7f4a05c039e82fb3559c6fbc3f0f',
  s: '0x6454e913f92abda3b3413f0650ba8bbf762e45ce2a5695becd51c0ce7e8b6776',
  v: 106777,
  creates: null,
  chainId: 53371,
  wait: [Function (anonymous)]
}
tick tock {
  hash: '0x9dc1c76adee3532c2d6d08c7788e4d8b6ef09e1a28667ae9f5a6ad5e0e74b0eb',
  type: 0,
  accessList: null,
  blockHash: '0x29ff008f4b8dd3dd824a21c4c92117e54d37cc2e9327d44b0fa4ac0026243926',
  blockNumber: 8,
  transactionIndex: 0,
  confirmations: 1,
  from: '0x3577C97DA04A1cf179bcf651Cb41fbDD1B3e4E43',
  gasPrice: BigNumber { _hex: '0x0ba43b7400', _isBigNumber: true },
  gasLimit: BigNumber { _hex: '0x01ffb3', _isBigNumber: true },
  to: '0x2FDBb59691c94d390c8C1e6cCC3caB7B635ab6bF',
  value: BigNumber { _hex: '0x00', _isBigNumber: true },
  nonce: 7,
  data: '0x3eaf5d9f',
  r: '0x75e8e4e0039d84a9a06df5d63154b251167e07ead7083fedf9457028bc7e0995',
  s: '0x09e8c4d5bd45fa3c2b83333b5fd7810f3c0bde59e1edcda9fec9a0a6ce836c0c',
  v: 106778,
  creates: null,
  chainId: 53371,
  wait: [Function (anonymous)]
}
error ProviderError: the tx doesn't have the correct nonce. account has nonce of: 8 tx has nonce of: 2
    at HttpProvider.request (/home/ubuntu/repos/fluidex-backend/contracts/node_modules/hardhat/src/internal/core/providers/http.ts:49:19)
    at GanacheGasMultiplierProvider.request (/home/ubuntu/repos/fluidex-backend/contracts/node_modules/hardhat/src/internal/core/providers/gas-providers.ts:181:34)
    at processTicksAndRejections (node:internal/process/task_queues:96:5)
(the same ProviderError and stack trace repeat verbatim many more times)

feat: add k8s prover cluster

The cluster can be split into two parts:

  1. The prover cluster. It should be elastic; we can use k8s for this. Some cloud providers offer serverless elastic k8s: https://www.alibabacloud.com/help/doc-detail/86377.htm. Even deploying the prover cluster on Aliyun and the other parts on AWS is OK. I suggest using k8s for proving even for an early-stage testnet product. (Late July? Aug?)
  2. Other processes. They can be managed manually, at least for a while.

We can even set up a single-node local k8s using https://github.com/kubernetes-sigs/kind, so users can still run run.sh to launch everything.

So in run.sh: if some env var like K8S_ENDPOINT is non-null, the script deploys the prover cluster (master/workers/db) into that k8s; if K8S_ENDPOINT is null, it launches a local k8s using kind and deploys the prover cluster into it. The rollup will then insert records into the DB in k8s. (a sketch follows)
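
A sketch of that run.sh branch (kind's CLI is real; the manifest path is a hypothetical placeholder):

# deploy the prover cluster into an existing k8s, or fall back to a local kind cluster
if [ -z "${K8S_ENDPOINT:-}" ]; then
    kind create cluster --name fluidex-prover    # local single-node k8s
fi
kubectl apply -f k8s/prover-cluster.yaml         # hypothetical manifests for master/workers/db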

All the issues found when running run.sh on a 'clean' host

  • Should check the prerequisites for rust, cargo, Node.js and npm (see the check sketch at the end of this issue). On Ubuntu I installed node via snap (sudo snap install node --classic --channel=14); the distro package may be too old (8.0).

  • The following global tools need to be installed:

    • plonkit (cargo install --git https://github.com/Fluidex/plonkit)
    • sqlx (cargo install sqlx-cli --no-default-features --features postgres)
    • snarkit (npm i -g snarkit)
    (we may also want to link the cargo-installed binaries to a public path like /usr/local/bin)
    
  • The circuits submodule seems unable to be initialized correctly by npm i because of a problematic package-lock.json; the lock file needs to be removed first, as I did in the prover-cluster setup Docker image.

  • envsubst issue (#39)

  • snarkit compiling issue (fluidex/snarkit#14)

  • #48

Also, should we consider running ticker.ts at the very end, only after verifying that all other processes are up?
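
A prerequisite check along these lines could run at the top of install_all.sh (a sketch; the Node.js version floor follows the snap channel mentioned above):

# fail fast if a required tool is missing or node is too old
for tool in rustc cargo node npm docker docker-compose; do
    command -v "$tool" >/dev/null || { echo "missing: $tool"; exit 1; }
done
node -e 'if (+process.versions.node.split(".")[0] < 14) { console.error("node >= 14 required"); process.exit(1); }'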

For the circuits repo, recursively use the submodule managed by rollup-state-manager

Currently the backend maintains a standalone git-submodule reference to the circuits repo, and so does the rollup-state-manager project.

We have noticed that the protocols inside the L2 circuits must be consistent with the state-managing code in rollup-state-manager. When the L2 circuits cannot recognize the input objects generated by rollup-state-manager, the whole backend fails.

A better solution may be for the backend to simply use the circuits inside rollup-state-manager, ensuring the L2 circuits and the state-producing code are always consistent. (a sketch follows)
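
One possible shape of the change (a sketch; replacing the standalone submodule with a symlink is an assumption, not a decided design):

# initialize only the nested copy, then point the backend at it
$ git submodule update --init --recursive rollup-state-manager
$ ln -sfn rollup-state-manager/circuits circuits    # reuse the circuits revision pinned by rollup-state-manager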

skip listening `add_token`

MNEMONIC argument is not correct

Only the first word of the mnemonic is used:

  HD Wallet
  ==================
  Mnemonic:      anxiety
  Base HD Path:  m/44'/60'/0'/0/{account_index}

  Gas Price
  ==================
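
The symptom matches an unquoted shell expansion splitting the mnemonic at its first space; a sketch of the suspected bug and fix (ganache-cli's -m flag expects the whole phrase as one argument):

# MNEMONIC="anxiety ... (twelve words) ..."
$ ganache-cli -m $MNEMONIC      # wrong: word-splitting passes only "anxiety"
$ ganache-cli -m "$MNEMONIC"    # right: quoting keeps the full phrase as one argument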
