airflow_hive_spark_sqoop

Customized Airflow Docker image

Dependencies

  • airflow==1.10.14
  • hadoop==3.2.1
  • hive==3.1.2
  • spark==3.1.1
  • sqoop==1.4.7

How to run

Clone this repository and run the following command to start the Airflow webserver:

docker compose up -d

You can then access the Airflow webserver at http://localhost:8080.

The dags and plugins directories are already mounted as Docker volumes; feel free to change the configuration in docker-compose.yml to your preference.
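As a sketch, the relevant part of docker-compose.yml might look like the following (the service name, image name, and container paths are assumptions; check the actual file in the repository):

```yaml
services:
  webserver:
    image: airflow-hive-spark-sqoop:latest   # assumed image name
    ports:
      - "8080:8080"                          # Airflow webserver UI
    volumes:
      - ./dags:/usr/local/airflow/dags       # local DAGs mounted into the container
      - ./plugins:/usr/local/airflow/plugins # custom plugins mounted into the container
```

Editing the host-side paths on the left of each volume mapping is enough to point the container at your own DAGs and plugins.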

To stop and remove the Airflow containers:

docker compose down -v

Todos

  • Build a Hive, Spark, and Sqoop cluster for testing the Airflow operators.

Project milestones

  • Build an Airflow Docker image with Postgres, Sqoop, Spark, and Hive components.
  • Publish the image to Docker Hub as an arm64-architecture contribution.
  • Use it in a follow-up project to build a data engineering challenge pipeline.

Learning objectives

Docker

  • Understand how to build a Docker image on top of existing images with a Dockerfile.
    • Understand the differences between Dockerfile instructions (ENV, RUN, ARG, CMD, etc.).
  • Be able to change or override parameters of an existing built image.
    • Successfully modify and build the image for the Airflow container.
  • Learn how to structure a Docker project, e.g., .dockerignore and docker-compose.yml.
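To illustrate those instruction differences, here is a minimal Dockerfile sketch (the version numbers and paths are illustrative, not the project's actual values):

```dockerfile
FROM python:3.6-stretch

# ARG exists only at build time; override it with --build-arg AIRFLOW_VERSION=...
ARG AIRFLOW_VERSION=1.10.14

# ENV persists into the running container's environment
ENV AIRFLOW_HOME=/usr/local/airflow

# RUN executes at build time and bakes its result into an image layer
RUN pip install "apache-airflow==${AIRFLOW_VERSION}"

# CMD is the default command when a container starts; it can be
# overridden on the `docker run` command line
CMD ["airflow", "webserver"]
```

In short: ARG is for build-time knobs, ENV for runtime configuration, RUN for build-time steps, and CMD for the container's default process.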

Hadoop

  • Understand the basics of configuring the Hadoop ecosystem, e.g., the configuration files core-site.xml, hdfs-site.xml, etc.
  • Be able to work around the dependency issues between Hadoop components. For example, which Hive version should we use with Hadoop 3.2.1?
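As an example of that configuration, a minimal core-site.xml pointing HDFS clients at a NameNode might look like this (the hostname and port are placeholders, not this project's actual values):

```xml
<?xml version="1.0"?>
<configuration>
  <property>
    <!-- fs.defaultFS tells Hadoop clients where the default filesystem lives -->
    <name>fs.defaultFS</name>
    <!-- "namenode" is a placeholder; use your actual NameNode hostname -->
    <value>hdfs://namenode:9000</value>
  </property>
</configuration>
```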

Notes

  1. The tricky part of this project is not Docker or the Hadoop ecosystem itself, but making all the component dependencies work together. For example, you have to understand why the Hadoop-based image must be built from python:3.6-stretch instead of the python:3.7-slim-buster base used in the original Docker image.

    • The quick answer is that the slim-buster variant doesn't support Java 8, which is required to install the Hadoop components.
  2. You will face many dependency problems, not only from the Linux base but also from the Python and pip environment. Almost all of the time you have to find a workaround on Stack Overflow, and trust me, you are not the first to face these issues.

    • Trial and error helps a lot in fixing each issue. Please don't give up; that's all I learned from this process.
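As a sketch, installing Java 8 on the stretch-based image looks roughly like this (package and path names are as they appear in Debian stretch; the actual Dockerfile may differ):

```dockerfile
FROM python:3.6-stretch

# openjdk-8 is available in the Debian stretch repositories;
# buster (the base of slim-buster) dropped it, which is why the
# stretch-based image is required here
RUN apt-get update && \
    apt-get install -y --no-install-recommends openjdk-8-jdk && \
    rm -rf /var/lib/apt/lists/*

# Hadoop components expect JAVA_HOME to point at the JDK
ENV JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
```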
