# Transit Dashboard App
This repository contains the code for the Transit Dashboard App, which uses Splunk Cloud Services to display a dashboard with transit route status by stop using data from Seattle Transit agencies.
This README contains the following sections:
- Set up your environment
- Key concepts and terminology of Splunk Cloud Services
- Log in to Splunk Cloud Services and use the APIs
- Set up a tenant with a data pipeline
- Get sample data in and out of your tenant
- Define the app and create a subscription
- Build and run the Transit Dashboard App
## Set up your environment
If you haven't done so already, sign up for Splunk Investigate with a new or existing Splunk account, accept the Terms of Service, and choose a name for your Splunk Cloud Services tenant.
To ensure your local development environment is set up to successfully submit API requests, and to access and run code examples and applications, install the following tools:
- The Splunk Cloud Services CLI, or `scloud`, is a command-line tool that makes it easy to submit API requests against Splunk Cloud Services. Download and unpack the `scloud` binary from GitHub.
- Git is a source-code management tool that allows you to run commands to access projects and keep track of changes to application files.
- Node.js is a JavaScript runtime environment that is required for a number of example applications accessed from the Splunk Developer Portal. Node.js version 8.3.0 or later is required for this app.
- npm is a package manager that makes it easy for programmers to publish and share the source code of Node.js libraries.
  - The npm client is installed with Node.js.
  - Log in to npm (run `npm login`) before running `npm` or `npx` commands.
- Yarn is a package manager that enables additional workflows for JavaScript applications. Yarn is used to install packages and run commands in the Transit Dashboard App. Yarn version 1.3.2 or later is required for this app.
## Key concepts and terminology of Splunk Cloud Services
Splunk Cloud Services is a collection of services exposed via REST APIs to enable fast and flexible app development.
The most fundamental concepts to understand for this sample app are the identities and their relationships:
- A principal represents an actor that makes API requests against Splunk Cloud Services.
- Users, apps, and service accounts are kinds of principals.
- Tenants represent a way to contain and isolate the resources provided by Splunk Cloud Services.
- Splunk Cloud Services is multi-tenant. There is no separate deployment per customer as with Splunk Enterprise.
- API requests are all made in the context of a specific tenant.
- Principals are allowed to make API requests against a tenant only if a membership exists.
- The membership includes specific permissions granted to the principal within the tenant.
- Apps are allowed to operate on behalf of users with a subscription to the tenant.
- The subscription includes specific permissions that are granted to the app principal within the tenant.
- The subscription and membership provide consent between the app and principal.
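The relationships above can be sketched as plain data: a request succeeds only when a membership (for a user) or a subscription (for an app) links the principal to the target tenant with the needed permission. The following is an illustrative model only, with made-up principal and permission names; it is not how the service itself is implemented.

```javascript
// Illustrative model of the identity relationships (not the actual service implementation).
const memberships = [
  { principal: "alice@example.com", tenant: "acme", permissions: ["identity.read"] },
];
const subscriptions = [
  { app: "transit.demo.acme", tenant: "acme", permissions: ["ingest.write"] },
];

// A user principal may call an API on a tenant only if a membership grants the permission.
function canRequest(principal, tenant, permission) {
  return memberships.some(
    (m) => m.principal === principal && m.tenant === tenant && m.permissions.includes(permission)
  );
}

// An app may operate on a tenant only if a subscription grants the permission.
function appCanRequest(app, tenant, permission) {
  return subscriptions.some(
    (s) => s.app === app && s.tenant === tenant && s.permissions.includes(permission)
  );
}

console.log(canRequest("alice@example.com", "acme", "identity.read")); // true
console.log(appCanRequest("transit.demo.acme", "other", "ingest.write")); // false
```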
## Clone the repo
1. Create a folder on your computer for the project files.
2. At the command line, navigate to the folder and run the following command:

   $ git clone https://github.com/splunk/conf19-sdc-workshop
## Log in to Splunk Cloud Services and use the APIs
You can use the Splunk Cloud Services APIs in different ways, depending on your preference. Here are a couple of different ways to get information about your principal and see the tenants you are a member of.
### Use the API Reference Console to explore the APIs
The API Reference Console is similar to using a Postman collection, allowing you to make Splunk Cloud Services REST API requests using pre-defined forms and view the formatted results.
1. Log in to the Splunk Developer Portal with your Splunk.com credentials.
2. Navigate to the API Reference page.
3. Select Identity from the list of services on the left, then click the `GET /principals` endpoint.
4. In the main window under the endpoint, click Console.
5. Click Show to reveal your authorization token.

   Each API request requires a valid token, for example, when using cURL directly. Because you are logged in to the Splunk Developer Portal, your access token is automatically applied to requests. Tokens expire after one hour.
6. Click Submit to make the API request, which returns your principal name. Copy the principal name that is displayed for use in the next API request.
7. Under Identity in the list of services on the left, click the `GET /principals/{p}` endpoint.
8. In the main window under the endpoint, click Console.
9. In the form field, enter the principal name you copied in step 6.
10. Click Submit to make the API request, which displays details about your principal account. From the response, copy the value in the `tenants` array and store it for later use. Each API request must target a specific tenant.
### Use the Splunk Cloud Services CLI to explore APIs

For a more programmatic approach, use `scloud` at the command line to explore the Splunk Cloud Services APIs.
To log in at the command line, enter:
$ scloud -u <YOUR-PRINCIPAL-NAME> login
To get details about your user account (your principal), enter:
$ scloud identity get-principal <YOUR-PRINCIPAL-NAME>
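Both the console and `scloud` ultimately issue the same REST calls to the Identity service. As a rough sketch, the request for a principal can be assembled as below; note that the host and versioned path segment are assumptions for illustration, so verify the exact URL against the API Reference.

```javascript
// Build (but do not send) the Identity request behind `scloud identity get-principal`.
// The host and the "v2beta1" path segment are assumptions; check the API Reference.
function principalRequest(principalName, token) {
  return {
    url: `https://api.scp.splunk.com/system/identity/v2beta1/principals/${encodeURIComponent(principalName)}`,
    headers: { Authorization: `Bearer ${token}` },
  };
}

const req = principalRequest("alice@example.com", "<ACCESS-TOKEN>");
console.log(req.url);
```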
## Set up a tenant with a data pipeline
All incoming data moves through the Splunk Data Stream Processor (DSP). Data from REST APIs or Splunk Forwarders first flows through the Splunk Firehose. The Firehose aggregates your data into a single stream. From there, your data goes through a data pipeline where you can perform complex transformations and troubleshooting on your data before sending it to your indexers for searching.
Indexes are defined as a kind of dataset managed by the Catalog service, along with other dataset kinds such as search jobs and KV store lookups. Datasets are knowledge objects that contain data and can be read from or written to.
After events are indexed, they can be searched with an updated and refined Splunk Search Processing Language (SPL2). SPL2 uses a natural grammar that more closely resembles SQL. The same `stats` and `eval` functions are still there to let you create visualizations. See the SPL2 Command Quick Reference for details.
Before data can be ingested, your tenant must have a pipeline defined and activated to process the events. For this app, create a simple passthrough pipeline that reads events from the Splunk Firehose and writes them to the "main" index.
To access the pipeline DSL and sample data files, enter the `data` directory:
$ cd data
To create a passthrough pipeline, enter the following `scloud` commands:
$ scloud set tenant <YOUR-TENANT-NAME>
$ scloud streams compile-dsl -dsl-file passthrough.dsl > passthrough.upl
$ scloud streams create-pipeline -name passthrough -bypass-validation true -data-file passthrough.upl
Make note of the `id` that is returned underneath the `description` field. You'll need it for the next command.
To activate the pipeline:
$ scloud streams activate-pipelines <PIPELINE-ID>
## Get sample data in and out of your tenant
Once you have an activated pipeline, you can start sending events to your tenant using the Ingest service.
This repo includes two JSON files with sample data from Seattle transit agencies, containing service and arrival/departure data for routes provided during a set period of time.
Run the following `scloud` commands to ingest the sample data files.
On *nix:
$ cat agencies-with-coverage.json \
| scloud ingest post-events \
-host localhost \
-source agencies_with_coverage_json \
-sourcetype json_no_timestamp \
-format raw
$ cat arrivals-and-departures.json \
| scloud ingest post-events \
-host localhost \
-source arrivals_and_departures_json \
-sourcetype json_no_timestamp \
-format raw
On Windows:
more arrivals-and-departures.json | scloud ingest post-events -host localhost -source arrivals_and_departures_json -sourcetype json_no_timestamp -format raw
more agencies-with-coverage.json | scloud ingest post-events -host localhost -source agencies_with_coverage_json -sourcetype json_no_timestamp -format raw
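Conceptually, `scloud ingest post-events` wraps each raw line of input into an event envelope before posting it to the Ingest service. A rough sketch of that shaping step is shown below; the field names follow the Ingest API's event schema as best understood here, so treat them as illustrative and check the Ingest API documentation for the authoritative shape.

```javascript
// Wrap raw newline-delimited JSON text into Ingest-service event envelopes.
// Field names (body, host, source, sourcetype) are per the Ingest API event
// schema; verify against the official docs before relying on them.
function toEvents(rawText, { host, source, sourcetype }) {
  return rawText
    .split("\n")
    .filter((line) => line.trim().length > 0) // skip blank lines
    .map((line) => ({ body: line, host, source, sourcetype }));
}

const sample = '{"data": 1}\n{"data": 2}\n';
const events = toEvents(sample, {
  host: "localhost",
  source: "agencies_with_coverage_json",
  sourcetype: "json_no_timestamp",
});
console.log(events.length); // 2
```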
### Explore the data through search
After the sample data is ingested and passes through the pipeline, it is indexed and available for search.
Run the following commands to search the sample data files and see how many routes are active for each transit agency.
On *nix:
$ scloud search "| from index:main where source=\"arrivals_and_departures_json\" \
| stats count() as refCount \
by 'data.references.agencies{}.name'" \
-earliest 0 \
-latest now
On Windows:
scloud search "from index:main where source=\"arrivals_and_departures_json\" | stats count() as refCount by 'data.references.agencies{}.name' " -earliest 0 -latest now
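The `stats count() as refCount by ...` clause in the search above groups events by a field's value and counts each group. For intuition, the same aggregation in plain JavaScript over some made-up parsed events (agency names here are placeholders, not values from the sample data):

```javascript
// Equivalent of `stats count() as refCount by agency` over parsed events.
// Event values are placeholders for illustration.
const events = [
  { agency: "Agency A" },
  { agency: "Agency B" },
  { agency: "Agency A" },
];

const refCount = events.reduce((acc, e) => {
  acc[e.agency] = (acc[e.agency] || 0) + 1; // count events per agency
  return acc;
}, {});

console.log(refCount); // { 'Agency A': 2, 'Agency B': 1 }
```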
### Use Splunk Investigate to get data in
Go to Splunk Investigate, select a tenant, click Data, and then click Add Data to launch wizards that help you get data in, set up and configure pipelines, and preview live data streaming into your pipeline.
Note: After data has been ingested, you can see the number of events passing through each node in your pipeline.
## Define the app and create a subscription
A quick overview of apps in Splunk Cloud Services:
- Apps are self-hosted and run in isolation from Splunk Cloud Services. This model is different from Splunk Enterprise, where apps are installed to run alongside Splunk Enterprise.
- Apps are use-case driven, where the use case doesn’t have to be about Splunk: apps use Splunk Cloud Services to solve problems.
- Apps are developed with consistent integration points to Splunk Cloud Services. Apps all use the same APIs to configure, run, and develop apps.
- Subscriptions represent an authorization grant between an app and a tenant, and are required before any API requests can be made. Every subscription results in a webhook call back to the app, so that the app knows it can start.
To define the app and create a subscription with your tenant:
1. Create the app with a unique name and title.

   Apps are defined in a "home tenant" so that Splunk Cloud Services knows about metadata such as the name, description, required permissions, and webhooks that are triggered on subscription events.

   Note: App names and titles are unique across all tenants, so for this sample app, replace `<TENANT>` below with your tenant name.

   $ scloud appreg create-app transit.demo.<TENANT> web \
       -redirect-urls http://localhost:3000 \
       -login-url https://auth.scp.splunk.com \
       -title "Transit Dashboard Demo App for <TENANT>" \
       -description "Copy of the transit dashboard demo app"

   Make note of the `<CLIENT_ID>` that is returned. You'll need it when configuring the Transit Dashboard App.
2. Create a subscription between your tenant and the app:

   $ scloud appreg create-subscription transit.demo.<TENANT>
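As noted above, every subscription results in a webhook call back to the app so that the app knows it can start. A minimal sketch of handling such a callback payload follows; the field names (`kind`, `tenant`) are assumptions made for illustration, not the documented webhook schema.

```javascript
// Illustrative handler for a subscription webhook payload.
// The payload field names below are assumptions, not the documented schema.
function handleSubscriptionEvent(event) {
  if (event.kind === "subscription.created") {
    // The app can now start serving this tenant.
    return `start serving tenant ${event.tenant}`;
  }
  return "ignored";
}

console.log(handleSubscriptionEvent({ kind: "subscription.created", tenant: "acme" }));
// start serving tenant acme
```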
## Build and run the Transit Dashboard App
The Transit Dashboard App is a simple dashboard that displays transit route status by stop for Seattle transit agencies.
To build and run the app:
1. At the command line, navigate to the `transit_dashboard_app` folder.
2. Install the dependencies by running the following command:

   $ yarn
3. In the ./src/config/config.json file, update the following values:
   - Replace `YOUR CLIENT ID` with your client ID.
   - Replace `YOUR TENANT ID` with your tenant name.
4. Start the example app in develop mode:

   $ yarn run start
5. In a browser, open localhost:3000 to view the app.
## Resources
For more information about Splunk Cloud Services, see the Splunk Developer Portal.
For more dashboard examples, see the SCS Dashboard Examples repo.
For more about the tools that were used to build this app, see:
- Splunk Cloud Services SDK for JavaScript, which uses JavaScript to communicate with the Splunk Cloud Services REST APIs.
- Splunk Cloud Services Auth component, which provides a UI for users to authenticate with Splunk Cloud Services.
- Dashboard Framework, which contains a framework for working with dashboards and visualizations.
- Create React App, for bootstrapping this app.