
XQuery v2.0

Setup For Testing Milestones 1, 2 & 3

Dependencies & Requirements

  • Linux Ubuntu 20.04 LTS (or similar)
  • python 3.8 or 3.9
  • docker version 20.10
  • docker compose version v2.3.4
  • alembic 1.8.0
  • a running Web3-enabled node (e.g. ETH, AVAX, Sys NEVM)
  • Ensure no containers or services are listening on any of the following ports: 5432, 6379, 8080 (a quick check is sketched below this list)
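
One way to check that those ports are free (a minimal sketch using ss, which ships with Ubuntu 20.04):

# No output means the ports are free; any output shows what is already listening
sudo ss -tlnp | grep -E ':(5432|6379|8080)\b'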

Clone This Repository and Enter Cloned Repo

# --recursive flag is needed while cloning to ensure the xquery2 submodule is  
# also cloned.

git clone --recursive https://github.com/blocknetdx/xquery-v2-testing

# If this repo has previously been cloned and the original xquery2 repo
# has since been updated, the xquery2 submodule in this repo can be
# updated with:

git submodule update

# Enter the clone of the xquery-v2-testing repo:

cd xquery-v2-testing
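
Optionally, confirm the xquery2 submodule was actually populated:

# A commit hash next to the xquery2 path means the submodule is checked out;
# a leading '-' means it was not initialized (re-clone with --recursive or
# run: git submodule update --init)
git submodule status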

Launch Postgres, Redis & Hasura containers

cp .env.template .env

# Review the env variables set in .env; modify them if desired

./run.sh
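
Once run.sh completes, you can optionally confirm that the Postgres, Redis and Hasura containers are up and bound to the expected ports (container names depend on the compose configuration in this repo):

# Expect three containers exposing ports 5432, 6379 and 8080
docker ps --format 'table {{.Names}}\t{{.Ports}}\t{{.Status}}'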

Enter xquery2 submodule

cd xquery2

Set Up & Activate Virtual Environment

sudo apt install python3-virtualenv
virtualenv -p python3 ./.venv
source .venv/bin/activate

pip install -r requirements.txt

Errors running pip install -r requirements.txt

If you get errors running pip install -r requirements.txt, see the following:

XQuery requires the psycopg2 Python package, which is compiled from source and thus has additional system prerequisites (a C compiler and system dev packages); see the psycopg2 installation documentation for details.

The required system packages can be installed with:

sudo apt install build-essential python3-dev libpq-dev gcc

Alternatively, install the precompiled psycopg2-binary Python package instead:

Edit requirements.txt and change this line:

psycopg2==2.9.3

to this:

psycopg2-binary==2.9.3
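
The same edit can be made with a one-liner (a sketch, assuming requirements.txt contains the exact psycopg2==2.9.3 line shown above):

# Swap psycopg2 for the precompiled psycopg2-binary package in requirements.txt
sed -i 's/^psycopg2==/psycopg2-binary==/' requirements.txt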

Then issue once again:

pip install -r requirements.txt

Configuration

All configurable settings are consolidated in xquery2/xquery/config.py. Generally, no other files need to be modified!

The following options (with their default values) are available and can be adjusted in the configuration file. Alternatively, each option can also be set via its corresponding environment variable, as sketched after the listing below:

CONFIG = {
    # Database settings
    "DB_HOST": os.getenv("DB_HOST", "localhost"),
    "DB_PORT": os.getenv("DB_PORT", 5432),
    "DB_USERNAME": os.getenv("DB_USERNAME", "root"),
    "DB_PASSWORD": os.getenv("DB_PASSWORD", "password"),
    "DB_DATABASE": os.getenv("DB_DATABASE", "debug"),
    "DB_SCHEMA": os.getenv("DB_SCHEMA", "public"),

    # Redis cache settings
    "REDIS_HOST": os.getenv("REDIS_HOST", "localhost"),
    "REDIS_PORT": os.getenv("REDIS_PORT", 6379),
    "REDIS_PASSWORD": os.getenv("REDIS_PASSWORD", "password"),
    "REDIS_DATABASE": os.getenv("REDIS_DATABASE", 0),

    # Controller
    "XQ_NUM_WORKERS": os.getenv("XQ_NUM_WORKERS", 8),
    
    # web3 provider RPC url
    "API_URL": os.getenv("API_URL", "http://localhost:8545/"),
    # "API_URL": os.getenv("API_URL", "https://cloudflare-eth.com/v1/mainnet"),  # ETH
    # "API_URL": os.getenv("API_URL", "https://api.avax.network/ext/bc/C/rpc"),  # AVAX
    # "API_URL": os.getenv("API_URL", "https://rpc.syscoin.org/"),  # SYS
}
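
For example, selected options can be exported in the shell that will run XQuery instead of editing config.py. The values below are placeholders, not recommendations:

# Hypothetical example values: override options via environment variables
export DB_PASSWORD="a-stronger-password"
export XQ_NUM_WORKERS=4
export API_URL="https://api.avax.network/ext/bc/C/rpc"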

The xquery2/xquery/config.py file included in this repo should be considered an example/template; feel free to change the settings as desired. For example, the web3 provider RPC URL options in this template config are as follows:

    # web3 provider RPC url
    "API_URL": os.getenv("API_URL", "http://localhost:8545/"),
    # "API_URL": os.getenv("API_URL", "https://cloudflare-eth.com/v1/mainnet"),  # ETH
    # "API_URL": os.getenv("API_URL", "https://api.avax.network/ext/bc/C/rpc"),  # AVAX
    # "API_URL": os.getenv("API_URL", "https://rpc.syscoin.org/"),  # SYS

Setting API_URL as above configures XQuery to fetch blockchain data from http://localhost:8545/. Which value to use depends on your setup:

  • If you plan to test XQuery's indexing of the Sys NEVM chain and Sys NEVM is running locally with its blockchain data available via RPC calls to port 8545 of localhost, the default value works as-is.
  • If Sys NEVM is running locally in a Docker container and port 8545 is only accessible on the container itself (not on localhost), replace localhost with the IP of the SYS container.
  • If NEVM is not running locally, you can set API_URL to https://rpc.syscoin.org/, though that public source of NEVM data is rate limited.
  • To test indexing of AVAX against the public AVAX endpoint, set API_URL to https://api.avax.network/ext/bc/C/rpc. That works fine, but the public source of AVAX data is also rate limited.
  • If you have AVAX running locally and want to ensure AVAX indexing speed is not limited by the public endpoint, set API_URL to something like http://172.31.11.28:9650/ext/bc/C/rpc, where 172.31.11.28 is the IP of the local AVAX container.
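
Whichever endpoint you choose, you can quickly confirm it is reachable before indexing by issuing a standard JSON-RPC call against it (adjust the URL to match your API_URL):

# A healthy web3 endpoint returns a JSON object with a hex block number in "result"
curl -s -X POST -H 'Content-Type: application/json' \
  --data '{"jsonrpc":"2.0","method":"eth_blockNumber","params":[],"id":1}' \
  http://localhost:8545/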

Database

Run the following commands to create the database tables.

alembic -n default -c alembic/alembic.ini revision --autogenerate -m 'creating schema'

alembic -n default -c alembic/alembic.ini upgrade head

NOTE: If the first alembic command above returns an error like this:

ERROR [alembic.util.messaging] Target database is not up to date.

it means that command has already been run in this environment, so you can skip it and go directly to the second alembic command.
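
To check which migration revision the database is currently at (for example, when deciding whether the first command has already been run), alembic's current command can be used with the same config:

alembic -n default -c alembic/alembic.ini current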

Verify Setup

Optionally, test the environment and configuration:

python -m test_setup

If the above command returns something like the following, the setup is correct:

06:09:17.0905 INFO  [MainThread 2178926] xquery.util.misc: Processing time of 'main()': 0.3918 seconds.

If it returns errors, something in the setup needs to be corrected before proceeding.

Run Example

Run one of the preconfigured examples, Pangolin (PNG) Exchange on Avalanche or Pegasys (PSYS) Exchange on Syscoin:

python -m run_png 2> run_png.log &
python -m run_psys 2> run_psys.log &

The above commands start indexing of the AVAX or Sys NEVM blockchain into a database stored on a (temporary) volume attached to the xquery-pg (Postgres) container. Logs generated by python -m run_png or python -m run_psys are streamed to STDERR, which is why these examples redirect STDERR to a .log file with 2> run_png.log or 2> run_psys.log. To free the Linux terminal, the examples run the run_png or run_psys module in the background with a trailing &.
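
To confirm the indexer is still running in the background, standard shell job control can be used:

# List background jobs in this shell; the run_png or run_psys process should appear
jobs -l

# Or search for the process from any shell
ps aux | grep -E 'run_(png|psys)' | grep -v grep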

To watch the streaming logs, you can issue:

# For indexing avax/pangolin:
tail -f run_png.log

# For indexing Sys NEVM/Pegasys:
tail -f run_psys.log 

Then issue ^C to stop following the logs. To interrupt the python -m run_png or python -m run_psys command itself:

# bring the run_png or run_psys process to the foreground:

fg

# then interrupt the process by issuing (at least one) ^C

(Instead of issuing python -m run_png 2> run_png.log & or python -m run_psys 2> run_psys.log &, one could alternatively open a tmux window, activate the same virtual environment there with source .venv/bin/activate, then issue python -m run_png or python -m run_psys in the tmux window and let the logs scroll within the tmux window.)
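
A minimal sketch of that tmux-based alternative (assuming tmux is installed):

# Start a named tmux session, activate the virtual environment inside it,
# and let the indexer's logs scroll in the tmux window
tmux new -s xquery
source .venv/bin/activate
python -m run_png        # or: python -m run_psys
# Detach with Ctrl-b d; reattach later with: tmux attach -t xquery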

Verify Indexed Data in Hasura Console

Find the IP of the server where XQuery v2 is running:

curl ifconfig.co
# OR
curl ifconfig.io

Note the IP of your server, which we'll refer to here as <SERVER-IP>.

Navigate in a web browser to:

http://<SERVER-IP>:8080/console

This should give you a graphical Hasura GraphQL interface to the database indexed by XQuery v2. You can make specific queries for specific blocks through this GUI, then compare the query results to the information given in Blockchain Explorers.
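
Once tables are tracked (see the next section), the same data can also be queried outside the GUI via Hasura's GraphQL endpoint at /v1/graphql. The table and field names below are placeholders; use the names shown in your Hasura console (and add an x-hasura-admin-secret header if one is configured):

# Hypothetical query; replace block/number/hash with tables and fields
# actually present in your Hasura schema
curl -s -X POST http://<SERVER-IP>:8080/v1/graphql \
  -H 'Content-Type: application/json' \
  --data '{"query":"{ block(limit: 1) { number hash } }"}'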

Track Tables in Hasura Console

The first time you navigate in a web browser to http://<SERVER-IP>:8080/console, you'll probably see the message, "Looks like you do not have any tables" under the API tab of the console, and a suggestion to "Click the Data tab on top to create tables." Follow this suggestion and click the Data tab. The initial Hasura Console message will look something like this:

Hasura Console 1

After clicking the Data tab, click the public tab on the left. You should now see a page something like this:

Hasura Console 2

On this screen you should see an option to track a number of available tables. Click "Track All".

In the previous version of xquery2, the main table of interest was the xquery table, which is where one could query for a variety of indexed EVM data. It seems the xquery table is not utilized or populated in the latest version of xquery2 (confirming this now w/ Riku...), but lots of interesting data is available for querying in the other tables and aggregates.

Tests

WARNING: Some tests currently affect the state of the cache and database. Only run on a development setup!

Currently, some tests only run on Avalanche (AVAX).

pytest --collect-only tests/

pytest -v tests/
pytest -v -rP tests/

pytest -v -k="cache" tests/
pytest -v -k="filter" tests/
pytest -v -k="indexer" tests/
pytest -v -k="middleware" tests/

Benchmarks

python -m bench.bench_cache_redis
python -m bench.bench_fetch_token
python -m bench.bench_fetch_token_batched
python -m bench.bench_get_block
python -m bench.bench_get_block_batched
python -m bench.bench_get_logs
python -m bench.bench_serialize

Deactivate Virtual Environment When Finished

deactivate
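
When testing is finished, the containers launched by run.sh can also be stopped. Assuming run.sh wraps docker compose (as the dependency list suggests), something like the following from the xquery-v2-testing directory should stop them; check run.sh for the exact setup:

# Assumes the containers were started via docker compose by run.sh;
# add -v to also remove the (temporary) database volume
docker compose down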
