
Example pipeline for App Connect Enterprise

An example Tekton pipeline for deploying an IBM App Connect Enterprise application to Red Hat OpenShift.

Overview

This repository contains an example of how to use Tekton to create a CI/CD pipeline that builds and deploys an App Connect Enterprise application to Red Hat OpenShift.

animated gif

This pipeline uses the IBM App Connect Operator to build, deploy and manage your applications in containers. The pipeline runs on OpenShift, so it can be integrated into an automated continuous delivery workflow without building anything locally on a developer's workstation.
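For context, the Operator works with custom resources like the one sketched below - a minimal, illustrative IntegrationServer (assuming the appconnect.ibm.com/v1beta1 API; the values are placeholders, and the pipeline generates the real resource for you):

apiVersion: appconnect.ibm.com/v1beta1
kind: IntegrationServer
metadata:
  name: sample-integration-server
spec:
  license:
    accept: true                       # you must review and accept the licence terms yourself
    license: <licence-id>              # placeholder - use the licence ID for your entitlement
    use: AppConnectEnterpriseProduction
  version: '12.0'                      # App Connect version channel
  replicas: 1
  configurations:                      # names of Configuration resources to provide at startup
    - sample-setdbparms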

For background information about the Operator, and the different resources that this pipeline will create (e.g. IntegrationServer and Configuration), see the blog posts linked from the project home page (https://dalelane.co.uk/blog/?p=4676).

Pipeline

The pipeline builds and deploys your App Connect Enterprise application. You need to run it every time your application has changed and you want to deploy the new version to OpenShift.

When running App Connect Enterprise in containers, there is a lot of flexibility about how much of your application is built into your container image, and how much is provided when the container starts.

For background reading on some of the options, and some of the considerations around them, see the blog post linked from the project home page.

This pipeline provides almost all parts of your application at runtime when the container starts. The only component that is baked into the image is the application BAR file.

Baking the BAR files into custom App Connect images avoids the need to run a dedicated content server to host BAR files. If you would prefer that approach instead, see the documentation on Mechanisms for providing BAR files to an integration server for more details. (The pipeline in this repository uses the approach described as "Custom image" in that documentation.)

Install tools

  1. git CLI, if not already installed.

  2. oc CLI, if not already installed.

  3. tkn CLI, if not already installed.

  4. jq, if not already installed.

  5. Apache Kafka (command-line tools). You need this to test the complex message flow.
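You can quickly confirm the CLIs are available on your PATH:

% git version
% oc version --client
% tkn version
% jq --version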

Fork or clone the repository

  1. You can first fork the repository into your own Git organisation.

  2. Then clone the repository using the following command:

% git clone https://github.com/<org>/app-connect-tekton-pipeline

Obtain IBM entitlement key

  1. Go to the Container software library.

  2. Copy the key - this will be used to create a pull secret later.
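The key is typically used to create a docker-registry pull secret for the IBM container registry (cp.icr.io). A sketch of the command follows - the secret name and namespace here are illustrative, and the scripts in demo-pre-reqs may create this for you:

% oc create secret docker-registry ibm-entitlement-key \
    --docker-server=cp.icr.io \
    --docker-username=cp \
    --docker-password=<your-entitlement-key> \
    -n <namespace>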

Install prerequisites

If you have a vanilla OpenShift cluster, you will need to install the prerequisites.

For a simple pipeline run, you need to install the following:

  • Platform Navigator - which automatically installs Cloud Pak for Integration
  • Cloud Pak Operators
  • App Connect Dashboard (optional)

For a complex pipeline run, in addition to the above components, you also need to install the following:

  • Event Streams
  • Postgres database

Scripts to install these components are provided in the demo-pre-reqs folder. The instructions are in its README.md.

To access the OpenShift cluster via the oc CLI, you need to log in with a token. Run this command in a terminal to open the token request page in your browser:

% open https://oauth-openshift.apps.<clusterID>.<domainName>/oauth/token/request

After logging in, you will be presented with a command, which you then run in the terminal:

% oc login --token=<token> --server=https://api.<clusterID>.<domainName>:6443

Once you have access to the cluster, you can install the prerequisites.

Running the pipeline

  • pipeline spec: pipeline.yaml
  • example pipeline runs: simple-pipelinerun.yaml, complex-pipelinerun.yaml
  • helper scripts: 1-deploy-simple-integration-server.sh, 1-deploy-complex-integration-server.sh
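For example, once you have generated simple-pipelinerun.yaml from its template (see "Before you run pipelines" below), one way to start a run and follow its logs is sketched here, assuming the pipeline and its tasks have already been applied to your namespace - the helper scripts listed above handle all of this end to end:

% oc create -f simple-pipelinerun.yaml    # start a new run of the pipeline
% tkn pipelinerun logs --last -f          # follow the logs of the most recent run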

What the pipeline does

Builds your IBM App Connect Enterprise application and deploys it to the OpenShift cluster.

Outcome from running the pipeline

A new version of your application is deployed with zero downtime, replacing any existing version of the app once the new version is ready.

Screenshot

screenshot of the deploy pipeline tasks

Background

As discussed above, most of your application configuration will be provided to your application container at runtime by the Operator using Configuration resources.

As shown in the screenshot above, this example pipeline currently supports many, but not all, of the types of Configuration resource.

For more information about the other Configuration types, see the documentation on Configuration types for integration servers. Adding support for any of the additional types would involve adding extra tasks alongside the tasks provided in this repo - the existing tasks are commented to help with this.
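For illustration, Configuration resources follow this general shape - a minimal sketch assuming the appconnect.ibm.com/v1beta1 API (the exact fields depend on the configuration type, and the pipeline tasks create these resources for you):

apiVersion: appconnect.ibm.com/v1beta1
kind: Configuration
metadata:
  name: sample-setdbparms       # illustrative name
spec:
  type: setdbparms              # one of the supported Configuration types
  description: example credentials configuration
  data: <base64-encoded setdbparms contents>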

Each of these configuration resources is individually optional. Two example App Connect applications are provided to show how the pipeline supports different application types.

Simple stand-alone applications

The pipeline can be used to deploy a stand-alone application with no configuration dependencies.

  • sample application: simple-demo
  • pipeline run config: simple-pipelinerun.yaml
  • demo script: 1-deploy-simple-integration-server.sh

screenshot

This is a simple App Connect application with no external configuration.

screenshot

When deploying this, the pipeline skips all of the Configuration tasks:

screenshot of a pipeline run for the simple app

Watching the pipeline run looks like this (except it takes longer).

animated gif

Complex applications

The pipeline can be used to deploy complex applications with multiple configuration dependencies and supporting Java projects.

  • sample application: sample-ace-application
  • pipeline run config: complex-pipelinerun.yaml
  • demo script: 1-deploy-complex-integration-server.sh

screenshot

This is an example of an App Connect application that needs configuration for connecting to:

  • a PostgreSQL database
  • an external HTTP API
  • an Apache Kafka cluster

screenshot

When deploying this, the pipeline runs all of the Configuration tasks required for this application:

screenshot of a pipeline run for the complex app

Watching the pipeline run (also sped up!) looks like this.

animated gif

To avoid needing to store credentials in git with your application code, the pipeline retrieves credentials from Kubernetes secrets. When configuring the pipeline for your application, you need to specify the secrets it should use.
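As an illustration only - the secret names and keys the pipeline expects are set through its parameters, so check the sample PipelineRun files for the real ones - a credentials secret can be created like this:

# "postgres-credentials" and the key names below are illustrative - use the names the pipeline parameters expect
% oc create secret generic postgres-credentials \
    --from-literal=username=<db-username> \
    --from-literal=password=<db-password> \
    -n <pipeline-namespace>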

Tests

If you have a test project for your App Connect application, the pipeline can run this test as well.

Provide the name of your test project in the pipeline config, and your tests will be run after the BAR file is built.

If you don't provide a test project, the test step in the pipeline will be skipped.
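In a PipelineRun this might look something like the fragment below; the parameter name here is hypothetical, so check pipeline.yaml for the actual one:

spec:
  params:
    - name: ace-test-project     # hypothetical parameter name - see pipeline.yaml
      value: simple-demo_Test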

Sample apps

I've put notes on how I set up the sample apps to demonstrate the pipeline in demo-pre-reqs/README.md. Neither of the sample apps is particularly useful, however - they were written purely to test and demo the pipeline.

You can import them into the App Connect Toolkit to edit them if you want to:

  1. File -> Import... -> Projects from Folder or Archive
  2. Use the location of the ace-projects folder as the import source.
  3. Tick all of the projects.

That will let you open the projects and work on them locally. If you're curious what they do, I'll include some brief notes below:

Simple app

It provides an HTTP endpoint that returns a Hello World message.

screenshot of the message flow

Running this:

% curl "http://$(oc get route -nace-demo hello-world-http -o jsonpath='{.spec.host}')/hello"

returns this:

{"hello":"world"}

Test

A test for this app is provided in simple-demo_Test.

To run it:

  1. Create a local integration server called TEST_SERVER (inheriting the configuration in the TEST_SERVER folder)
  2. Run the test launch configuration simple-demo_Test.launch

Complex app

It provides an intentionally contrived event-driven flow that:

  • "Kafka consumer todo updates"
    • receives a JSON message from a Kafka topic
  • "get id from update message"
    • parses the JSON message and extracts an ID number from it
    • uses the id number to create an HTTP URL for an external API
  • "retrieve current todo details"
    • makes an HTTP GET call to the external API
  • "base64 encode the description"
    • transforms the response from the external API using a custom Java class
  • "insert into database"
    • inserts the transformed response payload into a PostgreSQL database

screenshot of the message flow

The aim of this application was to demonstrate an ACE application that needs a variety of Configuration resources.

But it means that running this:

% echo '{"id": 1, "message": "quick test"}' | ./kafka-console-producer.sh \
    --bootstrap-server $BOOTSTRAP \
    --topic TODO.UPDATES \
    --producer-property "security.protocol=SASL_SSL" \
    --producer-property "sasl.mechanism=SCRAM-SHA-512" \
    --producer-property "sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username="appconnect-kafka-user" password="$PASSWORD";" \
    --producer-property "ssl.truststore.location=ca.p12" \
    --producer-property "ssl.truststore.type=PKCS12" \
    --producer-property "ssl.truststore.password=$CA_PASSWORD"

gets you this:

store=# select * from todos;
 id | user_id |       title        |            encoded_title             | is_completed
----+---------+--------------------+--------------------------------------+--------------
  1 |       1 | delectus aut autem | RU5DT0RFRDogZGVsZWN0dXMgYXV0IGF1dGVt | f
(1 row)

Test

A test for this app is provided in sample-ace-application_Test.

To run it:

  1. Create a local integration server called TEST_SERVER (inheriting the configuration in the TEST_SERVER folder)
  2. Run the test launch configuration sample-ace-application_Test.launch

Configuring the pipeline for your App Connect Enterprise application

To run the pipeline for your own application, you need to first create a PipelineRun.

The sample pipeline runs described above provide a good starting point for this, which you can modify to suit your own needs. You need to specify the location of your App Connect Enterprise application code and configuration resources. All of the available parameters are documented in the pipeline spec if further guidance is needed.
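As a very rough sketch of the shape of a PipelineRun (the pipeline name and parameter names here are hypothetical - copy one of the sample PipelineRun templates and edit it rather than starting from this):

apiVersion: tekton.dev/v1beta1
kind: PipelineRun
metadata:
  generateName: ace-pipeline-run-
spec:
  pipelineRef:
    name: ace-pipeline                 # hypothetical name - see pipeline.yaml
  params:
    - name: git-repository             # hypothetical parameter names; the full list
      value: https://github.com/<org>/<your-ace-repo>
    - name: ace-application            #   is documented in pipeline.yaml
      value: <your-application-project>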

Accessing Git using credentials

If your App Connect Enterprise files are in a git repository that requires authentication to access, you will also need to provide credentials for the pipeline to be able to clone your repository.

Create a file called github-credentials.yaml (in the same folder as the 0-setup.sh script).

It should look like this:

apiVersion: v1
kind: Secret
metadata:
  name: github-credentials
  annotations:
    tekton.dev/git-0: https://github.com
type: kubernetes.io/basic-auth
stringData:
  username: your-user-name
  password: your-github-token

You can alter the GitHub URL in the annotation if your Git repository is hosted somewhere different (e.g. GitHub Enterprise).

If your Git repository is publicly readable, you can skip this step.
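If you are not relying on the setup script to apply it for you, a sketch of creating the secret manually (the namespace is illustrative): Tekton uses the tekton.dev/git-0 annotation to match the credential to your Git host, and the secret also needs to be available to the service account that the pipeline runs as, which your setup may already take care of.

% oc apply -f github-credentials.yaml -n <pipeline-namespace>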

Supported versions

This sample pipeline was tested on OpenShift 4.12.

You can see the versions of the operators I was running in ./demo-pre-reqs/operators/. It is possible that this pipeline would need modifying to work with different versions of OpenShift, Tekton, or App Connect.

Before you run pipelines

Substitute the ${BLOCK_STORAGECLASS} variable to create simple-pipelinerun.yaml and complex-pipelinerun.yaml from the provided templates:

% envsubst < simple-pipelinerun.yaml.tmpl > simple-pipelinerun.yaml
% envsubst < complex-pipelinerun.yaml.tmpl > complex-pipelinerun.yaml
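envsubst takes the value from your environment, so export the variable first, using the name of a block storage class that exists on your cluster (the class name below is only an example):

% oc get storageclass                                      # list the storage classes available on your cluster
% export BLOCK_STORAGECLASS=ocs-storagecluster-ceph-rbd    # example name only - use one of yours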

More info

For help with using this or if you have any questions, please create an issue or contact me.

