
iDaaS KIC

KIC (Knowledge, Insight and Conformance) is designed to be a platform that maintains a record of all activity that occurs across the iDaaS components. This is the upstream for Red Hat Healthcare's iDaaS KIC.

iDaaS-KIC (Knowledge, Insight and Conformance)

The iDaaS DataHub platform persists data needed within iDaaS. Its current primary use is auditing and logging activity.

Add-Ons

This solution contains supporting directories whose artifacts are intended to enable teams to work locally:

  • platform-scripts: scripts that support common platform activities; each script is named to describe its capability
  • platform-datatier: DDL that supports this implementation

Prerequisites

For all iDaaS design patterns, it is assumed that you will either install the following as part of this effort or already have them:

  1. An existing Kafka instance (or a compatible distribution) up and running. Red Hat currently implements AMQ Streams, which is based on Apache Kafka; however, we have implemented iDaaS with numerous Kafka distributions. Please see the following files, which we have included to help:
    Kafka
    KafkaWindows
    Whichever platform you choose, it is important to know that the out-of-the-box Kafka configuration may require changes depending upon your implementation needs. One change we make: in the /config/consumer.properties file, set the auto.offset.reset property to earliest. This enables any new consumer group entering the topic to read ALL the messages from the start (a consumer sketch follows this list).
    auto.offset.reset=earliest
  2. Some understanding of building and deploying Java artifacts and the commands associated with them. If using Maven commands, Maven needs to be installed and working in the environment you are using. More details can be found in the Apache Maven documentation.
  3. An active internet connection, to ensure that any libraries needed when Maven commands are run can be pulled down.
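
For reference, the same earliest-offset behavior can also be set programmatically on a consumer. The sketch below is illustrative only: it assumes a local broker on localhost:9092 and uses a hypothetical consumer group id; the topic name mirrors the default used later in this document.

import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class EarliestOffsetConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumption: local broker
        props.put("group.id", "idaas-kic-demo");          // hypothetical group id
        // Mirrors the consumer.properties change above: a group with no
        // committed offset starts reading from the beginning of the topic.
        props.put("auto.offset.reset", "earliest");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("opsmgmt_platformtransactions"));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("offset=%d value=%s%n", record.offset(), record.value());
            }
        }
    }
}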

We also leverage Kafka Tools to help us show Kafka details and transactions; however, you can use code or various other Kafka technologies to view the topics.

RDBMS (Relational Database Management Systems)

We wanted to provide a very simple way for users to visualize and report on anything the iDaaS framework(s) do, so we have included a basic, extensible relational database tier. The Community Edition RDBMS options we use are:

  • MySQL Community Edition: v8 or greater
  • PostgreSQL: v9.6 or greater

You will need to create a database within either technology and have all the credentials for the database.

Database Implementation

Implementing the relational database tier is optional: you can instead set up transactions to be processed and routed out as JSON events, then handled by whatever technology suits your implementation. If you do want the database tier:

  1. Within the supported RDBMS, create a user and password, and keep that information for later in the configuration process.
  2. Run the DDL for the database from platform-datatier (latest release for your RDBMS).
  3. Configure application.properties.
  4. Start up and begin processing data (a connectivity sketch follows these steps).
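
Before starting the service, it can help to confirm that the database and credentials from step 1 actually work. Below is a minimal, hypothetical JDBC check; the URL and credentials shown are the MySQL sample values used later in this document, and the class is not part of the project itself.

import java.sql.Connection;
import java.sql.DriverManager;

public class DbConnectivityCheck {
    public static void main(String[] args) throws Exception {
        // Assumption: these mirror the idaas.db* properties configured later.
        String url = "jdbc:mysql://localhost:3306/kic?useLegacyDatetimeCode=false&serverTimezone=GMT";
        String user = "root";
        String password = "Developer123";

        // DriverManager locates the driver (e.g. mysql-connector-j) on the classpath.
        try (Connection conn = DriverManager.getConnection(url, user, password)) {
            System.out.println("Connected to: " + conn.getMetaData().getDatabaseProductName());
        }
    }
}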

Platform Implementation

Scenario: Integration

The implementation processes all the data from the topic named opmgmt_transactions.

Integration Data Flow Steps

  1. The integration application subscribes to all data from the opmgmt_transactions topic.
  2. The data is consumed from the topic and all of the header attributes are parsed.
  3. Depending on the settings in application.properties, the header attributes can go to files, to a relational database table named insight within the kic database, or to both. These actions are driven by parsing the header attributes (a sketch follows this list).
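
To make the flow concrete, the sketch below shows the general shape of collecting a Kafka record's header attributes and routing them based on configuration. It is an illustration of the steps above, using assumed flag names that mirror storeInFs/storeInDb; it is not the project's actual route code.

import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.common.header.Header;

public class HeaderAuditSketch {

    // Step 2: collect the record's header attributes into a simple map.
    static Map<String, String> parseHeaders(ConsumerRecord<String, String> record) {
        Map<String, String> attributes = new HashMap<>();
        for (Header header : record.headers()) {
            attributes.put(header.key(), new String(header.value(), StandardCharsets.UTF_8));
        }
        return attributes;
    }

    // Step 3: route the parsed attributes to a file, a database row, or both.
    static void route(Map<String, String> attributes, boolean storeInFs, boolean storeInDb) {
        if (storeInFs) {
            System.out.println("write to audit dir: " + attributes);      // stand-in for file output
        }
        if (storeInDb) {
            System.out.println("insert into insight table: " + attributes); // stand-in for DB insert
        }
    }
}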

Builds

This section will cover both local and automated builds.

Local Builds

Within the code base, you can find the local build commands in the platform-scripts directory.

  1. Run the build-solution.sh script. It runs the Maven commands to build and then package the solution, using the usual settings in the pom.xml file, and outputs target/idaas-datahub.jar.

Running

Once built, you can run the solution by executing ./platform-scripts/start-solution.sh. The script will start up Kafka and the iDaaS DataHub service.

You can enable storing audit records in the database by running the service with the --idaas.storeInDb=true parameter. You can customize the default DB connection details by providing the following properties:

MySQL property example:

# MySQL
idaas.dbDriverClassName=com.mysql.cj.jdbc.Driver
idaas.dbUrl=jdbc:mysql://localhost:3306/kic?useLegacyDatetimeCode=false&serverTimezone=GMT
idaas.dbPassword=Developer123
idaas.dbUsername=root
idaas.dbTableName=insight

Postgres property example:

idaas.dbDriverClassName=org.postgresql.Driver
idaas.dbUrl=jdbc:postgresql://localhost:5432/audit_db
idaas.dbUsername=postgres
idaas.dbPassword=Postgres123
idaas.dbTableName=audit
idaas.createDbTable=true

It is possible to override the configuration by:

  1. Providing parameters via the command line, e.g. ./start-solution.sh --idaas.auditDir=some/other/audit/dir
  2. Creating an application.properties next to idaas-datahub.jar in the target directory
  3. Creating a properties file in a custom location: ./start-solution.sh --spring.config.location=file:./config/application.properties

Supported properties include:

#logging.config=classpath:logback.xml

# the options from org.apache.camel.spring.boot.CamelConfigurationProperties can be configured here
camel.springboot.name=iDaaS-KIC
# let's listen on all interfaces to ensure we can be invoked from the pod IP
server.address=0.0.0.0
management.address=0.0.0.0
# let's use a different management port in case you need to listen to HTTP requests on 8080
management.port=8081
# Server - Internal
server.port=9970
# disable all management endpoints except health
endpoints.enabled=false
endpoints.health.enabled=true
spring.main.web-application-type=none
camel.springboot.main-run-controller=true

# Kafka Details
idaas.kafkaBrokers=localhost:9092
idaas.kafkaTopicName=opsmgmt_platformtransactions
# Audit Directory Location
idaas.storeInFs=true
idaas.auditDir=audit
# Relational Database Detail
idaas.storeInDb=true
# Setting the createDbTable=true will try to autocreate a table
# code is located in DataSourceConfiguration.java
idaas.createDbTable=false
# Postgres
#idaas.dbDriverClassName=org.postgresql.Driver
#idaas.dbUrl=jdbc:postgresql://localhost:5432/idaas_kic
#idaas.dbPassword=Developer123
#idaas.dbUsername=postgres
#idaas.dbTableName=insight
# MySQL
idaas.dbDriverClassName=com.mysql.cj.jdbc.Driver
idaas.dbUrl=jdbc:mysql://localhost:3306/kic?useLegacyDatetimeCode=false&serverTimezone=GMT
idaas.dbPassword=Developer123
idaas.dbUsername=root
idaas.dbTableName=insight

Alternatively, if you already have a running instance of Kafka, you can start the solution with: ./platform-scripts/start-solution-with-external-kafka.sh --idaas.kafkaBrokers=host1:port1,host2:port2. The script starts up only the iDaaS DataHub service.
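
To sanity-check that the brokers you pass in are reachable before starting the service, a small AdminClient probe like the following can list the visible topics. This is an illustrative snippet, not part of the project; the broker list is the placeholder from the command above.

import java.util.Properties;
import java.util.Set;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;

public class BrokerProbe {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Assumption: the same broker list passed to --idaas.kafkaBrokers
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "host1:port1,host2:port2");

        try (AdminClient admin = AdminClient.create(props)) {
            Set<String> topics = admin.listTopics().names().get();
            System.out.println("Visible topics: " + topics);
        }
    }
}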

Optionally, you can configure the service to store audit data in a database. You can run an instance of Postgres with:

docker run --rm --name datahub-db -e POSTGRES_PASSWORD=Postgres123 -p 5432:5432 postgres:alpine

Usage

The service listens for Kafka messages on the opsmgmt_platformtransactions topic by default and outputs them to the audit directory as JSON files.

The output includes both message and headers.

There is also an endpoint available for putting messages on the topic. The REST service runs on port 8080 by default, but this can be customized by providing the --server.port=8082 parameter.

Testing:
In order for any testing to work you must:

  1. Have a Kafka server up and ready to process data.
  2. Start the iDaaS-KIC platform (there are multiple ways to do so).
  3. POST a message, for example (a Java equivalent follows this list):
curl --location --request POST 'http://localhost:8080/message' \
--header 'Content-Type: application/json' \
--data-raw '{
    "auditEntireMessage": "test",
    "processingtype": "data",
    "industrystd": "HL7",
    "messagetrigger": "ADT"
}'
  4. Alternatively, you can use the iDaaS-Simulator-KIC, which simply runs and transacts to the iDaaS-KIC tier when files move.
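
If you prefer code over curl, a minimal Java (15+) equivalent of the request above might look like this; the payload and default port 8080 match the curl example:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class PostTestMessage {
    public static void main(String[] args) throws Exception {
        // Same hypothetical test payload as the curl example above.
        String json = """
                {
                  "auditEntireMessage": "test",
                  "processingtype": "data",
                  "industrystd": "HL7",
                  "messagetrigger": "ADT"
                }""";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8080/message"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(json))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}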

Automated Builds

Automated builds will be done in Azure Pipelines.

Getting Involved

Here are a few ways you can get or stay involved.

Ongoing Enhancements

We maintain all enhancements within the GitHub portal under the Projects tab. Please feel free to make recommendations; our goal is to continually improve.

Defects/Bugs

All defects or bugs should be submitted through the GitHub portal under the Issues tab. Happy using and coding...
