
o3

Hadoop-Airflow Analytics.

Development Environment (macOS)

First set up Conda with Python 3.6, then:

git clone git@github.com:mblomdahl/o3.git
cd o3/
wget -P resources/ http://apache.mirrors.spacedump.net/hadoop/common/hadoop-2.9.2/hadoop-2.9.2.tar.gz
wget -P resources/ http://apache.mirrors.spacedump.net/spark/spark-2.4.0/spark-2.4.0-bin-hadoop2.7.tgz
wget -P resources/ http://apache.mirrors.spacedump.net/hive/hive-2.3.4/apache-hive-2.3.4-bin.tar.gz
wget -P resources/ https://repo.continuum.io/archive/Anaconda3-2018.12-Linux-x86_64.sh
wget -P resources/ https://www-eu.apache.org/dist/avro/avro-1.8.2/java/avro-tools-1.8.2.jar
wget -P resources/ https://www-eu.apache.org/dist/avro/avro-1.7.7/java/avro-tools-1.7.7.jar
conda env create --name o3 -f environment-macos.yml
conda activate o3
export AIRFLOW_GPL_UNIDECODE=yes
export AIRFLOW_HOME=$(pwd)/airflow_home
export HADOOP_USER_NAME=airflow
export AVRO_TOOLS_PATH=$(pwd)/resources/avro-tools-1.7.7.jar
pip install -e .
# Check out your secret enterprise DAGs into the `prod-dags` root dir.
git clone ssh://[email protected]:7999/ANALY/prod-dags.git prod-dags

Start a Postgres database and initialize the Airflow DB on it:

mkdir -p pgdata
docker run -d -p 2345:5432 -v $(pwd)/pgdata:/var/lib/postgresql/data --name o3_postgres postgres:9.6
# Set sql_alchemy_conn to point at the local Postgres around airflow_home/airflow.cfg#L55
# (adjust credentials if your local Postgres requires them).
sed -i '' 's|sqlite:////.*|postgresql+psycopg2://postgres:postgres@localhost:2345/postgres|' airflow_home/airflow.cfg
airflow initdb

Start the Airflow webserver:

airflow webserver -p 8080

From a second terminal session in the repo root dir, activate the Python environment and start the scheduler:

conda activate o3
export AIRFLOW_HOME=$(pwd)/airflow_home
airflow scheduler

Finally, browse to http://localhost:8080/, and that's it. :)

Creating the Conda Environment File

These commands need to be executed on the target platform. The output file can then replace the corresponding file in this repo, i.e. environment-linux.yml or environment-macos.yml:

conda create --name o3 --yes python=3.6
conda install --name o3 -c conda-forge --yes psycopg2 hdfs3 airflow libhdfs3=2.3.0=1 ansible netaddr \
    ipython pandas fastavro pyhive pyspark jupyter xlrd matplotlib paramiko bcrypt requests-futures \
    dictdiffer pip openpyxl xlwt fastparquet python-snappy pyarrow
conda activate o3; pip install -e .
conda env export --name o3 > environment-<platform>.yml

Provisioning

ansible-playbook -i inventories/<inventory>.ini provision.yml

Jupyter

During an Ansible provisioning run, a Jupyter Notebook is deployed on port 8888 of the provisioned server. It can also be started in a development environment, like so:

conda activate o3; jupyter notebook --notebook-dir=notebooks   

Contributors

carlba, mblomdahl, vtepliuk


Issues

DAG 2

Build upon the DAG from #10 and add the missing/scoped-out features, namely using HDFS as the filesystem backend.
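
A minimal sketch of what the HDFS piece could look like, using the hdfs3 library that is already part of the o3 Conda environment; the namenode host/port and the inbox/processing paths below are illustrative assumptions, not settled project details:

# Hypothetical sketch only: stage an inbox file on HDFS via hdfs3.
# The namenode host/port and the paths are assumptions for illustration.
from hdfs3 import HDFileSystem


def stage_inbox_file(hdfs_host='localhost', hdfs_port=8020):
    hdfs = HDFileSystem(host=hdfs_host, port=hdfs_port)
    for path in hdfs.ls('/user/airflow/inbox'):
        if path.endswith('.txt'):
            target = path.replace('/inbox/', '/processing/')
            hdfs.mv(path, target)  # hand the file over to downstream tasks
            return target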

Bug fix for SecondaryNameNode Consistency Checks

We keep getting this error in the SecondaryNameNode logs when putting data into HDFS:

2019-01-20 13:49:09,581 ERROR org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode: Exception in doCheckpoint
java.io.IOException: Inconsistent checkpoint fields.
LV = -63 namespaceID = 553474715 cTime = 1547394276488 ; clusterId = CID-76c76a10-13bc-4774-ae24-f7c30af38867 ; blockpoolId = BP-1604516264-10.100.1.185-1547394276488.
Expecting respectively: -63; 1106474597; 1547383651693; CID-23f73b9c-2bac-40d6-86d9-a393aeee9cfe; BP-766916281-127.0.0.1-1547383651693.
	at org.apache.hadoop.hdfs.server.namenode.CheckpointSignature.validateStorageInfo(CheckpointSignature.java:134)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.doCheckpoint(SecondaryNameNode.java:550)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.doWork(SecondaryNameNode.java:360)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode$1.run(SecondaryNameNode.java:325)
	at org.apache.hadoop.security.SecurityUtil.doAsLoginUserOrFatal(SecurityUtil.java:481)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.run(SecondaryNameNode.java:321)
	at java.lang.Thread.run(Thread.java:748)

We should find out what's wrong and repair it. Here's a how-to resource that seems credible: https://community.pivotal.io/s/article/Secondary-NameNode-Checkpoint-Error-Inconsistent-Checkpoint-Fields

Log dir configurations

Update all of the Hive, Spark, YARN and HDFS components to write their logs under /var/log, not in their installation dirs.

DAG 1

Create a first DAG that looks for a .txt file in hdfs://user/airflow/inbox, applies tasks t1 and t2 to it in parallel, then writes a summary of the upstream results in t3 (see the sketch below the diagram).

           /---> t1 ---\
input_file              t3 ---> output_file
           \---> t2 ---/
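
A minimal sketch of how this layout could be expressed with Airflow's PythonOperator; the DAG id, schedule and callable bodies are placeholders, not decided implementation details:

# Hypothetical sketch for DAG 1: t1 and t2 run in parallel, then t3 joins them.
# The DAG id, schedule and callable bodies are placeholders only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python_operator import PythonOperator


def run_t1():
    pass  # e.g. derive one metric from the inbox .txt file


def run_t2():
    pass  # e.g. derive another metric from the same file


def run_t3():
    pass  # e.g. write a summary of the t1/t2 results to the output file


with DAG('o3_dag_1', start_date=datetime(2019, 1, 1),
         schedule_interval='@daily', catchup=False) as dag:
    t1 = PythonOperator(task_id='t1', python_callable=run_t1)
    t2 = PythonOperator(task_id='t2', python_callable=run_t2)
    t3 = PythonOperator(task_id='t3', python_callable=run_t3)

    [t1, t2] >> t3  # t1 and t2 in parallel, then t3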
