Comments (6)
Hi @akshungupta, thanks for the issue. Could you please share settings for your Apache Spark cluster and docker.conf?
from mist.
Hi,
Thank you for getting back to me.
The settings for Apache Spark are as follows:
- I have 1 master and 2 worker nodes each on different machines. All of the machines can talk to each other.
- I have used the sbin/start-master.sh and sbin/start-slave.sh scripts (part of the Apache Spark installation) to start the master and the slaves (workers).
- There are no other custom settings that I have configured to create my Apache Spark cluster. It is a very simple cluster that I have brought up, consisting of three machines.
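The cluster bring-up described above can be sketched as a small script. SPARK_HOME, the master IP, and port 7077 (the standalone default) are assumptions; substitute your own values. The start commands are commented out so the sketch runs without a cluster:

```shell
#!/bin/sh
# Hypothetical values -- replace with your own paths and machines.
SPARK_HOME=/opt/spark-2.1.0
MASTER_IP=10.0.0.1
MASTER_URL="spark://${MASTER_IP}:7077"   # 7077 is the standalone default port

# On the master machine:
# "$SPARK_HOME"/sbin/start-master.sh

# On each of the two worker machines, pointing the worker at the master:
# "$SPARK_HOME"/sbin/start-slave.sh "$MASTER_URL"

echo "$MASTER_URL"
```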
I have attached my docker.conf, router.conf and the application simple_streaming.py in configs_job.zip to this post. simple_streaming.py is a very simple application; we wanted to get a feel for deploying mist on an Apache Spark cluster.
Here is how I compiled the mist image and ran it:
sudo docker build -t test_mist .
sudo docker run -p 2003:2003 --name mist -v /var/run/docker.sock:/var/run/docker.sock -v $PWD/configs:/usr/share/mist/configs -v $PWD/jobs:/jobs test_mist mist
One thing to note: we had downloaded the Apache Spark source code and used the binary built from that project to launch the Apache Spark cluster, telling mist the IP and port of the master. We could not find a way to use mist's own Apache Spark binary to set up a custom cluster. Please also let us know if this is possible.
Hi @akshungupta
You can start workers and connect them to the master via:
./sbin/start-slave.sh <master-spark-URL>
If you used a spark-defaults.conf, please show me its content.
Generally it is better to use the IP address of the master node instead of the hostname reported by the Spark master:
spark://<master-machine-IP>:7077
Once you have started a worker, look at the master's web UI (port 8080 by default; a running application's UI is on port 4040). You should see all your nodes listed there.
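As a sketch of that verification step (the master IP here is an assumption): the master's web UI can also be queried from the command line, and the standalone master serves a machine-readable status view under /json/. The network call is commented out so the sketch runs without a cluster:

```shell
#!/bin/sh
# Hypothetical master address -- substitute your own.
MASTER_IP=10.0.0.1
UI_URL="http://${MASTER_IP}:8080"

# Fetch the master's JSON status view and count registered live workers
# (commented out; requires a running cluster):
# curl -s "${UI_URL}/json/" | grep -o '"ALIVE"' | wc -l

echo "$UI_URL"
```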
Hi @leonid133
I had started my workers and master using the sbin/start-master.sh and sbin/start-slave.sh scripts available as part of the Spark code base. I did see all my worker nodes on the master's web UI.
I am not sure whether this error is because I am not using mist's own spark binary or not. I am using a different binary but same spark version (2.1.0).
I did not use spark-defaults.conf. I just started 2 slaves on 2 different machines and provided the master spark URL to each slave when running the sbin/start-slave.sh script.
I tried using the IP address instead of the DNS name but I am getting the same error.
Hi @akshungupta
Try checking the connection from the docker container to the Spark master node, as well as the SPARK_VERSION inside the container:
docker exec -it <mist_docker_name> bash
telnet <master_host> <master_port>
echo $SPARK_VERSION
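The checks above can be collected into one sketch; the container name, master host, and port below are assumptions, and the docker/telnet calls are commented out since they need a running container and cluster:

```shell
#!/bin/sh
# Hypothetical names -- replace with your container name and master address.
MIST_CONTAINER=mist
MASTER_HOST=10.0.0.1
MASTER_PORT=7077

# Open a shell inside the running mist container:
# docker exec -it "$MIST_CONTAINER" bash

# From inside the container: is the master reachable, and which Spark
# version was the image built with? (It should match the cluster: 2.1.0.)
# telnet "$MASTER_HOST" "$MASTER_PORT"
# echo "$SPARK_VERSION"

echo "${MASTER_HOST}:${MASTER_PORT}"
```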
For Spark version 2.1.0 we have a specific docker image:
docker run -p 2003:2003 -v /var/run/docker.sock:/var/run/docker.sock -d hydrosphere/mist:master-2.1.0 mist
Hi @leonid133, using the master-2.1.0 image fixed the issue!
Thank you!