flink-client

This library provides a Java client for managing Apache Flink via the Monitoring REST API.

The client is generated with Swagger Codegen from an OpenAPI specification file.

License

The library is distributed under the terms of the BSD 3-Clause License.

Copyright (c) 2019-2021, Andrea Medeghini
All rights reserved.

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:

* Redistributions of source code must retain the above copyright notice, this
  list of conditions and the following disclaimer.

* Redistributions in binary form must reproduce the above copyright notice,
  this list of conditions and the following disclaimer in the documentation
  and/or other materials provided with the distribution.

* Neither the name of the library nor the names of its
  contributors may be used to endorse or promote products derived from
  this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

How to get the binaries

The library is available in the Maven Central Repository and on GitHub.

If you are using Maven, add this dependency to your POM:

<dependency>
    <groupId>com.nextbreakpoint</groupId>
    <artifactId>com.nextbreakpoint.flinkclient</artifactId>
    <version>1.0.4</version>
</dependency>        
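If you are using Gradle instead, the equivalent dependency declaration would be (a sketch assuming the same Maven coordinates as above):

```groovy
dependencies {
    implementation 'com.nextbreakpoint:com.nextbreakpoint.flinkclient:1.0.4'
}
```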

How to generate the code

Generate the code from the OpenAPI specification using Maven:

mvn clean compile

How to build the library

Build the library using Maven:

mvn clean package

How to run the tests

Make sure you have Docker installed and create the required bridge network:

docker network create flink-test 

Run the tests using Maven:

mvn clean verify

Documentation

Create the Flink client:

FlinkApi api = new FlinkApi();

Configure host and port of the server:

api.getApiClient().setBasePath("http://localhost:8081");

Configure socket timeouts:

api.getApiClient().getHttpClient().setConnectTimeout(20000, TimeUnit.MILLISECONDS);
api.getApiClient().getHttpClient().setWriteTimeout(30000, TimeUnit.MILLISECONDS);
api.getApiClient().getHttpClient().setReadTimeout(30000, TimeUnit.MILLISECONDS);

Optionally enable debugging:

api.getApiClient().setDebugging(true);

Get Flink cluster configuration:

DashboardConfiguration config = api.showConfig();

Show the list of uploaded jars:

JarListInfo jars = api.listJars();

Upload a jar which contains a Flink job:

JarUploadResponseBody result = api.uploadJar(new File("flink-job.jar"));

Run an uploaded jar with some arguments:

JarRunResponseBody response = api.runJar("bf4afb3b-d662-435e-b465-5ddb40d68379_flink-job.jar", true, null, "--INPUT A --OUTPUT B", null, "your-main-class", null);
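The id of the submitted job can then be read from the run response. A minimal sketch, assuming the Swagger-generated getter on `JarRunResponseBody` is named `getJobid()` (check the generated model classes for the exact name):

```java
// Sketch: read the job id from the run response
// (assumes the Swagger-generated getter is getJobid()).
String jobId = response.getJobid();
System.out.println("Submitted job: " + jobId);
```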

Get status of all jobs:

JobIdsWithStatusOverview jobs = api.getJobs();

Get details of a job:

JobDetailsInfo details = api.getJobDetails("f370f5421e5254eed8d6fc6673829c83");

Terminate a job:

api.terminateJob("f370f5421e5254eed8d6fc6673829c83", "cancel");
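Putting the steps together, a minimal end-to-end sketch might look like the following. It assumes a Flink cluster reachable at localhost:8081, a job jar at `flink-job.jar`, and that the jar id expected by `runJar` is the last path segment of the filename returned by the upload response; `FlinkClientExample` and `your-main-class` are placeholder names.

```java
import java.io.File;
import java.util.concurrent.TimeUnit;

import com.nextbreakpoint.flinkclient.api.ApiException;
import com.nextbreakpoint.flinkclient.api.FlinkApi;
import com.nextbreakpoint.flinkclient.model.JarRunResponseBody;
import com.nextbreakpoint.flinkclient.model.JarUploadResponseBody;

public class FlinkClientExample {
    public static void main(String[] args) {
        FlinkApi api = new FlinkApi();
        api.getApiClient().setBasePath("http://localhost:8081");
        api.getApiClient().getHttpClient().setConnectTimeout(20000, TimeUnit.MILLISECONDS);
        api.getApiClient().getHttpClient().setReadTimeout(30000, TimeUnit.MILLISECONDS);
        try {
            // Upload the jar containing the Flink job
            JarUploadResponseBody upload = api.uploadJar(new File("flink-job.jar"));
            // Assumption: the jar id is the last path segment of the returned filename
            String jarPath = upload.getFilename();
            String jarId = jarPath.substring(jarPath.lastIndexOf('/') + 1);
            // Run the uploaded jar with some arguments
            JarRunResponseBody run = api.runJar(jarId, true, null,
                    "--INPUT A --OUTPUT B", null, "your-main-class", null);
            System.out.println("Job id: " + run.getJobid());
        } catch (ApiException e) {
            // The generated client reports REST errors as ApiException
            System.err.println("Flink API call failed: " + e.getMessage());
        }
    }
}
```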

For all the remaining operations, see the documentation of the Monitoring REST API or the OpenAPI specification file.

Flink compatibility

The current version of the library has been tested against Flink 1.10.0, and it is known to work with older versions.
The library is compatible with Flink Monitoring REST API v1, which has not changed for a long time (at least since Flink 1.7). The library should be compatible with newer versions too, unless a breaking change is introduced in the Flink Monitoring REST API.

Known limitations

The library has integration tests with a code coverage of 90%. A few endpoints don't have tests, and not all fields in the responses are currently verified.

Contributors

boliza, nextbreakpoint


Issues

About the return value when calling the 'runJar' method

Hello, this project is very helpful for me, but when I call the 'runJar' method I can't get the jobId from the return value, which is very important for me. However, when I enable debug mode by calling setDebugging(true), the jobId appears in the console, like below:

{"jobid":"796bae32665baed05232765484e28dff"}

Now I'm looking for the code that prints the jobId, but I haven't found anything about it yet. So the question that bothers me is: why doesn't the return value contain the jobId, and what should I do to get it?
Thanks!

Sorry, the sources?

I was looking at this client, but I cannot find any trace of the library sources in the master branch (the only one available). Only tests are available, which is not exactly what an open source project is expected to provide. Maybe a push is missing?

About more subtask metrics details

Regarding the URL /jobs/:jobid/vertices/:vertexid/subtasks/:subtaskindex/metrics:
I want more metrics info from a subtask, however the subtask information I get from the client is a simple map,
and the metrics are not just IOMetricsInfo.

JobIdsWithStatusOverview jobIdsWithStatusOverview = api.getJobs();
String jobId = jobIdsWithStatusOverview.getJobs().get(0).getId();
JobDetailsInfo details = api.getJobDetails(jobId);
details.getVertices().forEach(v -> {
    IOMetricsInfo metricsInfo = v.getMetrics();
    Map<String, Integer> tasks = v.getTasks();
});

More information, such as the Kafka-related metrics, is not available through the client.

Hello

Thank you for providing this tool!

Flink v1.10 support

Flink v1.10 has been released. Does the current version v1.0.2 support it?
