microsoft / spektate


This project is a visualization tool for https://github.com/microsoft/bedrock

Home Page: http://spektate-cd.centralus.azurecontainer.io

License: MIT License


Spektate's Introduction


Spektate

This is an initiative to visualize Project Bedrock. Spektate ties together information from the repositories API, the pipelines API, and information stored in an Azure Table to display the dashboard.

A detailed diagram describes the Spektate workflow: each pipeline is responsible for sending a unique set of data to the storage, which is used to connect all the pieces together.

Currently, Spektate consists of a command line interface and a simple dashboard prototype. The instructions to use both are below.

Note: the Spektate dashboard will delete deployments once their corresponding builds/releases have expired in Azure DevOps.

Official docker images for this dashboard are located at mcr.microsoft.com/k8s/bedrock/spektate.

Onboard a Bedrock project to use Spektate

Follow the steps in this guide to onboard a project to use Spektate.

Install on your cluster

The helm chart for this dashboard is located here. The chart includes a Load Balancer, which is disabled by default via the externalIP setting in values.yaml. Set externalIP to true if you would like to expose the dashboard via a public endpoint.

Substitute values for your configuration and install this dashboard via the command below:

cd ./chart
helm install . --name spektate \
  --set storageAccessKey=<storageAccessKey> \
  --set storageTableName=<storageTableName> \
  --set storagePartitionKey=<storagePartitionKey> \
  --set storageAccountName=<storageAccountName> \
  --set pipelineProject=<pipelineProjectName> \
  --set pipelineOrg=<pipelineOrg> \
  --set pipelineAccessToken=<PipelinePAT> \
  --set manifest=<manifestRepoName> \
  --set manifestAccessToken=<manifestAccessToken> \
  --set githubManifestUsername=<gitHubUserName> \
  --set sourceRepoAccessToken=<sourceRepoAccessToken>
  • storageAccessKey: Access key for the storage account
  • storageTableName: Table name for the storage account
  • storagePartitionKey: Partition key for your configuration; consider using the project name or another identifier that separates unrelated configurations for the purpose of introspection
  • storageAccountName: Storage account name
  • pipelineProject: Project name for the pipelines in Azure DevOps
  • pipelineOrg: Org name for the pipelines in Azure DevOps
  • pipelineAccessToken: Access token for pipelines in Azure DevOps
  • manifestRepoName: Manifest repository name
  • manifestAccessToken: Access token for the manifest repository
  • sourceRepoAccessToken: Access token for the source repository
  • Note: If you're using GitHub, also set githubManifestUsername: the account or organization name under which the manifest repository resides.

If you're not using an external IP, use port-forwarding to access the dashboard:

  1. Copy the pod name from kubectl get pods
  2. Run kubectl port-forward pod/<pod-name> 2200:5000 (replace 2200 with a port of your choice)
  3. Navigate to http://localhost:2200 (using the same port)

Dashboard dev mode

  1. Clone this repository.

  2. There are frontend and backend folders, and each is a separate yarn project. Run cd frontend in one window and cd backend in another.

  3. Export the following environment variables in the shell where you changed directory to backend:

    export REACT_APP_STORAGE_ACCESS_KEY=
    export REACT_APP_STORAGE_TABLE_NAME=
    export REACT_APP_STORAGE_PARTITION_KEY=
    export REACT_APP_STORAGE_ACCOUNT_NAME=
    export REACT_APP_PIPELINE_PROJECT=
    export REACT_APP_PIPELINE_ORG=
    export REACT_APP_PIPELINE_ACCESS_TOKEN=
    export REACT_APP_MANIFEST=
    export REACT_APP_MANIFEST_ACCESS_TOKEN=
    export REACT_APP_SOURCE_REPO_ACCESS_TOKEN=
    • REACT_APP_STORAGE_ACCESS_KEY: Access key for the storage account
    • REACT_APP_STORAGE_TABLE_NAME: Table name for the storage account
    • REACT_APP_STORAGE_PARTITION_KEY: Partition key for your configuration; consider using the project name or another identifier that separates unrelated configurations for the purpose of introspection
    • REACT_APP_STORAGE_ACCOUNT_NAME: Storage account name
    • REACT_APP_PIPELINE_PROJECT: Project name for the pipelines in Azure DevOps
    • REACT_APP_PIPELINE_ORG: Org name for the pipelines in Azure DevOps
    • REACT_APP_PIPELINE_ACCESS_TOKEN: Access token for pipelines in Azure DevOps
    • REACT_APP_MANIFEST: Manifest repository name
    • REACT_APP_MANIFEST_ACCESS_TOKEN: Access token for the manifest repository
    • REACT_APP_SOURCE_REPO_ACCESS_TOKEN: Access token for the source repository
    • Note: If you're using GitHub, also set REACT_APP_GITHUB_MANIFEST_USERNAME: the account or organization name under which the manifest repository resides.
  4. Run yarn in both to install dependencies

  5. Run yarn start in both to start the applications. The dashboard should launch in one window and a Node.js server in the other, and a browser tab should open where the dashboard is running.

Publish Docker image

In order to publish images to this registry, you will need access to devcrewsacr.azurecr.io.

  1. Run az acr login --name devcrewsacr
  2. If you do not know the credentials you will need to log in, grab them from portal.azure.com or run az acr credential show --name devcrewsacr.
  3. Run docker login devcrewsacr.azurecr.io and you will be prompted to enter the credentials
  4. Run docker push devcrewsacr.azurecr.io/public/k8s/bedrock/spektate:<tag>

Azure Web App Hosting

You can provide an Azure Active Directory layer of authentication on top of Spektate. Follow the instructions here.

Command Line Interface

To use the CLI for Spektate, head over to https://github.com/microsoft/bedrock-cli.

Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.

When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments.


Spektate's Issues

Can't see anything on the site with a lot of old data

We applied the Spektate pipeline scripts a few months ago (just in the pipelines) without running the Spektate site.
Now we've run the site, but it doesn't show anything, even though the Azure storage table contains a lot of data.
I then ran and debugged the site locally and found problems in the method exports.parseDeploymentsFromDB (IDeployment.js file in node_modules):

  • const p1 = srcPipeline.getListOfBuilds(srcBuildIds); didn't return anything when srcBuildIds had a length of ~1k
  • batch.deleteEntity(entry); raised errors when deleting the 100th item and beyond.
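
The second bullet is consistent with Azure Table storage's cap of 100 operations per entity-group batch, so large deletions must be split. A minimal sketch of chunked deletion, where the chunk helper is generic and executeBatch stands in for the actual table SDK call (an assumption, not Spektate's real code):

```typescript
// Split an array into groups of at most `size` items.
function chunk<T>(items: T[], size: number): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}

// Sketch: delete expired entries 100 at a time instead of in one oversized
// batch. `executeBatch` is a placeholder for the real table SDK call.
async function deleteInBatches(
  entries: object[],
  executeBatch: (batch: object[]) => Promise<void>
): Promise<void> {
  for (const group of chunk(entries, 100)) {
    await executeBatch(group); // each group stays within the 100-op limit
  }
}
```

Splitting at 100 keeps every batch within the service limit no matter how many expired entries accumulate.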

Inconsistent ImageTag in examples

Items in the Azure table don't match up between the App Code Build and the ACR to HLD update steps. Note the differing imageTag values:

Step # 2 says

    tag_name="$(PARTITION_KEY)-$(Build.SourceBranchName)-$(Build.BuildId)" 
    commitId=$(Build.SourceVersion)
    commitId=$(echo "${commitId:0:7}")
    echo "python update_pipeline.py $(ACCOUNT_NAME) $(ACCOUNT_KEY) $(TABLE_NAME) $(PARTITION_KEY) p1 $(Build.BuildId)  imageTag $tag_name commitId $commitId"
    python update_pipeline.py $(ACCOUNT_NAME) $(ACCOUNT_KEY) $(TABLE_NAME) $(PARTITION_KEY) p1 $(Build.BuildId) imageTag $tag_name commitId $commitId 

Step # 3 says

echo "python update_pipeline.py $(ACCOUNT_NAME) $(ACCOUNT_KEY) $(TABLE_NAME) $(PARTITION_KEY) imageTag $(Build.BuildId) p2 $(Release.ReleaseId) hldCommitId $latest_commit"
python update_pipeline.py $(ACCOUNT_NAME) $(ACCOUNT_KEY) $(TABLE_NAME) $(PARTITION_KEY) imageTag $(Build.BuildId) p2 $(Release.ReleaseId) hldCommitId $latest_commit

The code shows that an equals filter is applied:

    table_service = TableService(account_name=account_name, account_key=account_key)
    entities = table_service.query_entities(table_name, filter=filter_name + " eq '"+ filter_value + "'")

Add ability to run update_pipeline.py with a parameter whose value is none

Hello @samiyaakhtar, I am using Bedrock. I have 3 stages corresponding to 3 environments (dev, qa, prod), each of which is transformed by Fabrikate into materialized infra. I am using Spektate in 3 places.

  1. After building the image.
  2. After the release (release stage).
  3. After Fabrikate transforms.
    After running at positions 1 and 2, Spektate creates and updates a row, but running at position 3 creates a new row. I expect that after all three runs there will be only 1 row for each environment.
    I have found a solution to this problem, but it needs a small modification to the script: it needs to support running with one input variable whose value is none,
    e.g.:

python "$HOME/spektate/pipeline-scripts/update_pipeline.py" $(ACCOUNT_NAME) $(ACCOUNT_KEY) $(TABLE_NAME) $(PARTITION_KEY) imageTag $IMAGE_TAG p2 $BUILD_TAG manifestCommitId
I have also forked the code and changed it myself. Can you take a look?
https://github.com/TaiNguyen2406/spektate/blob/master/pipeline-scripts/update_pipeline.py

Restructure README

Since we hope to have Spektate support multiple CI/CD platforms, we should restructure the README.

  • What is Spektate?
    • Architecture (link to a separate page)
  • Requirements
  • Download Link
  • Getting Started
    • Azure Devops (link to a separate page)
    • GitHub Actions (link to a separate page)

Investigate fluxv2

  • Setup a spektate dashboard for a configuration deployed using fluxv2
  • Investigate the information available within fluxv2 that can be displayed on spektate for improving the post-deployment experience. Create tasks accordingly

When a build/release is expired by ADO, Spektate needs to stop displaying it

This is causing empty gaps in the dashboard because some builds have expired in Azure DevOps but the tool still tries to show them (because they exist in the storage). Perhaps there should be a cleanup/garbage-collection mechanism that removes these entries from the storage table once they've been removed from the ADO builds?


CI/CD pipeline scripts for Github actions and Gitlab to update Spektate storage

Right now, bedrock-cli generates bash scripts for the pipelines to update Spektate storage only for AzDO pipelines. The Spektate repository has no documentation on where to find these scripts if a user is not using bedrock-cli or is using a tool other than AzDO.

  • Docs on pipeline scripts for Github actions, Gitlab and AzDO
  • Add support to bedrock cli to generate pipelines for Github actions
  • Add support to bedrock cli to generate pipelines for Gitlab

Spektate configuration for first time setup needs work

There is no clear definition or documentation on where users should start when launching Spektate and setting up its configuration. Right now there's a different list of ~10 environment variables to export for each of the orchestrators (AzDO/GitHub/GitLab), and there's no error handling or guidance for the user on setting this up.

  • bedrock-cli supports setting this up
  • for users not using bedrock-cli, create a configuration file / improve docs on setting up Spektate independently.

Cluster sync status improvement

  • Investigate if it's possible to get the history of past flux-sync tags that were pushed to the GitHub repository
    • If possible, add a time stamp to past deployments of when they were synced
    • If not possible, figure out a better way to display it rather than using a full column

Identity provider in Spektate

Given an AAD application client id and password we could authenticate. The set up could be as follows:

  • Configure Spektate with AAD Identity provider credentials for a tenant
  • Run Spektate in your Kubernetes cluster
  • Setup an Azure public IP address
  • Setup an ingress in your Kubernetes cluster using the public IP address (you can configure it such that only Azure Front Door has access)
  • Setup something like Azure Front Door to route to your backend (exposed public IP address)

This way folks can access Spektate securely, with authorization, while using Kubernetes to run the workload.

Helps with #167

Spektate storage update scripts need to be templatized

As a developer,
I would like to take the Spektate scripts as templates for updating the storage in the three pipelines
So that I can leverage the templates without having to use bedrock-cli to generate them.

Right now, the scripts aren't well documented and aren't kept in a template format for anyone to take and re-use. bedrock-cli supports creating them, but it comes with a very specific HLD format and dependencies on Fabrikate, which not all teams may want to use.

Here's Andre's repository for templates: https://github.com/andrebriggs/bedrock-templates; this is a good place to start.

Gitlab support

As a developer I want Spektate to support Gitlab repositories and Gitlab CI/CD pipelines

Explore code velocity metrics

A great add-on for Spektate would be to measure the velocity of a container until it ends up on the cluster; measuring this metric will help us diagnose issues with Azure, flux, etc.

Possible way to do this: measure the time it takes from a commit into source code until the cluster sync tag is updated by flux in the manifest repo, and push this velocity into Azure storage.
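
The metric itself is just a timestamp difference. A minimal sketch, where the Deployment shape and field names are hypothetical illustrations rather than Spektate's actual data model:

```typescript
// Hypothetical deployment record holding the two timestamps the metric needs.
interface Deployment {
  srcCommitTime: Date;   // when the source commit was pushed
  clusterSyncTime: Date; // when flux updated the cluster sync tag
}

// Elapsed time from source commit to cluster sync, in minutes.
function velocityMinutes(d: Deployment): number {
  const ms = d.clusterSyncTime.getTime() - d.srcCommitTime.getTime();
  return ms / 60000;
}
```

The resulting number per deployment could then be written to the Azure table alongside the existing pipeline data.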

Variable Groups: update documentation on how to use them

Following the instructions, my app code pipeline didn't recognize the variable group (see #8). In order to get the pipeline to recognize my variable group, I had to alter my azure-pipelines.yaml file. What I originally had for variables:

variables:
  GOBIN:  '$(GOPATH)/bin' # Go binaries path
  GOROOT: '/usr/local/go1.11' # Go installation path
  GOPATH: '$(system.defaultWorkingDirectory)/gopath' # Go workspace path
  modulePath: '$(GOPATH)/src/github.com/$(build.repository.name)' # Path to the module's code

Change I made to get the variable group to be recognized:

variables:
- group: container journey
- name: GOBIN  
  value: '$(GOPATH)/bin' # Go binaries path
- name: GOROOT 
  value: '/usr/local/go1.11' # Go installation path
- name: GOPATH
  value: '$(system.defaultWorkingDirectory)/gopath' # Go workspace path
- name: modulePath
  value: '$(GOPATH)/src/github.com/$(build.repository.name)' # Path to the module's code

I also had to update my HLD azure-pipelines.yaml file to add

variables:
- group: container journey

I followed the instructions here.

Did you have the same issue? We should update the instructions if so.

Dashboard - provide detail of what pipeline stage failed

As a:
User of the introspection dashboard

I want:
to know what pipeline stage failed

So that:
I know the state of the pipeline

Describe the solution you'd like:
Currently we show the overall result of a pipeline. If a pipeline is multi-stage and the first stage passes but the second stage fails, we show the whole pipeline as failed; this is due to the API we're currently using.

  • Update the build statuses of the pipelines such that they reflect the status of the stage rather than the overall build, and they should link into each stage when clicked

Acceptance Criteria:
Build status is correctly displayed based on stage rather than overall build

Describe alternatives you've considered:
Leave as is

Additional context:

Does this require updates to documentation?:
No
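
One possible direction (an assumption, not a prescribed fix): the Azure DevOps build timeline endpoint (GET .../_apis/build/builds/{buildId}/timeline) returns one record per stage, job, and task, which could be filtered for per-stage display. A sketch of that filtering step over an already-fetched timeline, using a record shape reduced to the fields the sketch needs:

```typescript
// Reduced shape of an Azure DevOps timeline record; the real response
// carries many more fields (id, parentId, log links, etc.).
interface TimelineRecord {
  type: string;    // "Stage", "Phase", "Job", "Task", ...
  name: string;
  result?: string; // "succeeded", "failed", ...
}

// Pick out per-stage results so the dashboard can show which stage failed
// rather than only the overall build result.
function stageResults(records: TimelineRecord[]): Array<{ name: string; result: string }> {
  return records
    .filter(r => r.type === "Stage")
    .map(r => ({ name: r.name, result: r.result ?? "unknown" }));
}
```

Each stage entry could then link to that stage's view in Azure DevOps, satisfying the acceptance criterion above.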

Consider UI acceptance testing

Testing frameworks such as Robot might make UI acceptance testing easier.

An idea is that a test could run against the CI version of Spektate after the nightly deploy is done.

Allow adjusting column width for columns (in UI or in config)

Suggestion

Allow adjusting column width for columns (in UI or in config)

Issue

If I have a service name with the value "Azure Voting App Front End", the UI trims it starting at 10 characters.

It seems very possible that some service names will be more than 10 characters, and perhaps multiple services will share the same starting 10 characters.


The instance of Spektate is {"version":"spektateacr.azurecr.io/spektate:spektate-master-23213"} according to http://40.64.74.69:5000/api/version

cc @samiyaakhtar @gemorris
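
Short of configurable column widths, a middle-ground fix would be to truncate with an ellipsis and keep the full name available, e.g. in a tooltip. A hypothetical helper, not existing Spektate code:

```typescript
// Truncate a display name to at most `max` characters, replacing the tail
// with an ellipsis; the full name can still be shown via a title/tooltip.
function truncateName(name: string, max: number = 10): string {
  return name.length <= max ? name : name.slice(0, max - 1) + "…";
}
```

This at least disambiguates "nothing was cut" from "something was cut", even when two services share a long common prefix.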

ACR to HLD pipeline instructions issue

The current instructions say to use:

latest_commit=$(git rev-parse --short HEAD)

cd ../container-journey/pipeline-scripts

sudo /usr/bin/easy_install virtualenv
pip install virtualenv 
pip install --upgrade pip
python -m virtualenv venv
source venv/bin/activate
python -m pip install --upgrade pip
pip install -r requirements.txt

echo "python update_pipeline.py $(ACCOUNT_NAME) $(ACCOUNT_KEY) $(TABLE_NAME) $(PARTITION_KEY) imageTag $(Build.BuildId) p2 $(Release.ReleaseId) hldCommitId $latest_commit"
python update_pipeline.py $(ACCOUNT_NAME) $(ACCOUNT_KEY) $(TABLE_NAME) $(PARTITION_KEY) imageTag $(Build.BuildId) p2 $(Release.ReleaseId) hldCommitId $latest_commit

The issue here is that we don't clone the repo and cd into it. Instead we should have:

latest_commit=$(git rev-parse --short HEAD)

git clone https://github.com/samiyaakhtar/container-journey.git
cd ./container-journey/pipeline-scripts

sudo /usr/bin/easy_install virtualenv
pip install virtualenv 
pip install --upgrade pip
python -m virtualenv venv
source venv/bin/activate
python -m pip install --upgrade pip
pip install -r requirements.txt

echo "python update_pipeline.py $ACCOUNT_NAME $ACCOUNT_KEY $TABLE_NAME $PARTITION_KEY imageTag $(Build.BuildId) p2 $(Release.ReleaseId) hldCommitId $latest_commit"
python update_pipeline.py $ACCOUNT_NAME $ACCOUNT_KEY $TABLE_NAME $PARTITION_KEY imageTag $(Build.BuildId) p2 $(Release.ReleaseId) hldCommitId $latest_commit
