opensearch-project / opensearch-dashboards

📊 Open source visualization dashboards for OpenSearch.

Home Page: https://opensearch.org/docs/latest/dashboards/index/

License: Apache License 2.0

Languages: Dockerfile 0.03%, Shell 0.24%, JavaScript 20.55%, TypeScript 77.66%, SCSS 0.72%, HTML 0.01%, CSS 0.71%, Batchfile 0.02%, Handlebars 0.04%, EJS 0.05%
Topics: search, opensearch, analytics, foss, apache2

opensearch-dashboards's Introduction

Welcome

OpenSearch Dashboards is an open-source data visualization tool designed to work with OpenSearch. OpenSearch Dashboards gives you data visualization tools to improve and automate business intelligence and support data-driven decision-making and strategic planning.

We aim to be an exceptional community-driven platform and to foster open participation and collective contribution with all contributors. Stay up to date on what's happening with the OpenSearch Project by tracking GitHub issues and pull requests.

You can contribute to this project by opening issues to give feedback, share ideas, identify bugs, and contribute code.

Set up your OpenSearch Dashboards development environment today! The project team looks forward to your contributions.

Code Summary

CI status badges: Build and Test · Unit Test · Code Coverage · Link Checker

Project Resources

Code of Conduct

This project has adopted the Amazon Open Source Code of Conduct. For more information see the Code of Conduct FAQ, or contact [email protected] with any additional questions or comments.

License

This project is licensed under the Apache v2.0 License.

Copyright

Copyright OpenSearch Contributors. See NOTICE for details.

opensearch-dashboards's People

Contributors

bargs, bigfunger, bleskes, chrisronline, cjcenizal, cqliu1, epixa, flash1293, jbudz, jgowdyelastic, kobelb, lcawl, legrego, lukasolson, mshustov, nreese, panda01, ppisljar, rashidkpc, simianhacker, sorenlouv, spalger, stacey-gammon, stormpython, thomasneirynck, timroes, tsullivan, w33ble, walterra, ycombinator

opensearch-dashboards's Issues

Proposal: [Discuss] [Core] Remove custom Package manager from osd-pm

As of now, the Dashboards project depends entirely on custom build logic inside osd-pm. Its main purpose was to support OSS vs. non-OSS development and distribution, and it adds significant overhead for maintaining a custom build system.

Describe the solution you'd like
This issue is to collect ideas, PoCs, and designs for adopting a more open, community-friendly tool such as Lerna, Yarn workspaces, or the latest release of npm workspaces.

Note: This issue doesn't propose any immediate changes or ongoing implementation; it is meant to collect ideas on how to improve the overall developer experience with community-friendly tooling.

[Tests] Ensure Unit test cases pass

This issue is to track and ensure that all unit test cases pass. As part of the rename exercise, we excluded snapshot files and will rely on Jest to regenerate them.

  • Update automated snapshots: run yarn test:jest -u
  • Run yarn test:jest and list all broken unit test cases
  • Fix test cases, or remove them if they are no longer relevant.

[Tests] Enable integration tests ignored due to an invalid snapshot URL

Description
In osd-opensearch/src/artifact.js there is a reference to an invalid snapshot URL; however, a valid link is required for the tests in ui_settings/integration_tests/index.test.ts and ui_settings/create_or_upgrade_saved_config/integration_tests/create_or_upgrade.test.ts to run successfully. Since we do not currently have a replacement URL or alternative, these tests were ignored so that integration tests not blocked by this external dependency can still run.

Once a valid snapshot URL or an alternative is in place, these tests can be re-enabled. Until then, this issue is BLOCKED.
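
For reference, this is roughly how such tests are typically disabled and re-enabled with Jest; the exact mechanism used in these files is an assumption, and the test body below is illustrative only:

    // Illustrative sketch, not the actual contents of index.test.ts.
    // Skipped while the snapshot URL referenced in osd-opensearch/src/artifact.js is invalid.
    describe.skip('ui settings integration', () => {
      it('persists settings against a cluster started from a downloaded snapshot', async () => {
        // ...test body that depends on the external snapshot artifact...
      });
    });

    // To re-enable once a valid snapshot URL exists, change `describe.skip` back to `describe`.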

To Reproduce
Steps to reproduce the behavior:

  1. Enable tests that were ignored in ui_settings/integration_tests/index.test.ts and ui_settings/create_or_upgrade_saved_config/integration_tests/create_or_upgrade.test.ts
  2. Run yarn run test:jest_integration
  3. See test failures

Expected behavior
Successful run of integration tests with no test failures

OpenSearch Version
n/a

Dashboards Version
n/a

Plugins

n/a

Screenshots

n/a

Host/Environment (please complete the following information):

  • OS: Ubuntu
  • Browser and version: n/a

Additional context

Currently blocked until we get a snapshot URL or alternative.

Create build artifacts

As of now, there are no tested build artifacts. This issue tracks producing all the artifacts and cleaning up the build pipeline.

  • Use the official Node.js website for downloading the artifacts
  • Remove the --no-oss flag from the build tool
  • Remove the --oss flag from the build tool
  • Ensure we produce only the OSS distribution
  • Tar
    • windows
    • OSX
    • Linux
  • Docker
  • OS Distribution
    • Deb
    • RPM

Update README 'Getting started' section

The current README says to run yarn start --oss to start the server, but the --oss flag was removed in #251, so that command will fail. Note that just running yarn start serves on port 5603 instead of 5601; however, it defaults to port 5601 when the --no-base-path flag is passed.

Creating this issue to track what the suggested command should be.

[Meta] Plugin Decoupling

Is your feature request related to a problem? Please describe.
Currently, developing a plugin is cumbersome and requires the entire repository to be available. We should find a way to reduce this and simplify Dashboards plugin development.

Specific problems include

  • coupled versioning/build
  • coupled dependencies
  • lack of a true extensibility platform for customers (e.g. dynamic loading of extensions)
  • and more (WIP)

Describe the solution you'd like
Publish all independent packages (plugin helpers, generators) to the npm registry and provide Docker-based plugin development to get started.
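
For context, a decoupled plugin would still implement the standard Dashboards plugin lifecycle. A minimal sketch under that assumption follows; the plugin name and the import path are illustrative, not taken from this repository:

    // public/plugin.ts - illustrative entry point for an externally developed plugin
    import { CoreSetup, CoreStart, Plugin } from 'opensearch-dashboards/public';

    export class ExamplePlugin implements Plugin {
      public setup(core: CoreSetup) {
        // Register applications, routes, or UI extensions against core services here.
      }

      public start(core: CoreStart) {
        // Wire up anything that depends on other plugins' start contracts.
      }

      public stop() {}
    }

Publishing the types and helpers such a plugin depends on to npm is what would let a file like this live outside the main repository.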

[BUG] Maps show error

Describe the bug
Calls to get map data error out. Navigating to any page that fetches map data sends a request to a non-existent service, because the original URL for the service was replaced while the fork was being renamed. This bug was expected, but it needs to be fixed.

To Reproduce
Steps to reproduce the behavior:

  1. Load sample data
  2. Go to dashboards
  3. Go to a map widget
  4. See error

Expected behavior
A successful response from an existing service, with map data displayed.

OpenSearch Version
n/a

Dashboards Version
1.x main

Plugins
n/a

Screenshots

[Screenshot attached: 2021-03-23 9:47 AM]

Host/Environment (please complete the following information):

  • OS: Ubuntu
  • Browser and version: Firefox 78.7.0esr

Additional context

Need to discuss with the community about what map service to use.

[Test] Ensure that all Jest Unit test cases pass

This issue is to track and ensure that all Jest unit test cases pass. As part of the rename exercise, we excluded snapshot files and will rely on Jest to regenerate them.

  • Update automated snapshots: run yarn test:jest -u
  • Run yarn test:jest and list all broken unit test cases
  • Fix test cases, or remove them if they are no longer relevant.

[Tests] Migrate mocha tests to jest

There are a few Mocha tests that need to be migrated to Jest. The test cases listed in the yarn test:mocha output below should be converted (a conversion sketch follows the listing).

Steps to convert:

  • Move tests out of the __tests__ directory and rename them with a .test.js / .test.ts extension.
  • Ensure the tests pass and can be run via Jest.
yarn test:mocha
yarn run v1.22.10
$ node scripts/mocha


  dev/File
    constructor
      βœ“ throws if path is not a string
    #getRelativePath()
      βœ“ returns the path relative to the repo root
    #isJs()
      βœ“ returns true if extension is .js
      βœ“ returns false if extension is .xml
      βœ“ returns false if extension is .css
      βœ“ returns false if extension is .html
      βœ“ returns false if file has no extension
    #getRelativeParentDirs()
      βœ“ returns the parents of a file, stopping at the repo root, in descending order
    #toString()
      βœ“ returns the relativePath
    #toJSON()
      βœ“ returns the relativePath

  All configs should use a single version of Node
    βœ“ should compare .node-version and .nvmrc
    βœ“ should compare .node-version and engines.node from package.json

  getUrl
    βœ“ should convert to a url
    βœ“ should convert to a url with port
    βœ“ should convert to a secure hashed url

  tasks/lib/licenses
    assertLicensesValid()
      βœ“ returns undefined when package has valid license
      βœ“ throw an error when the packages license is invalid
      βœ“ throws an error when the package has no licenses
      βœ“ includes the relative path to packages in error message

  dev/mocha/junit report generation
    βœ“ reports on failed setup hooks

  unset(obj, key)
    invalid input
      βœ“ should do nothing if not given an object
      βœ“ should do nothing if not given a key
      βœ“ should do nothing if given an empty string as a key
    shallow removal
      βœ“ should remove the param using a string key
      βœ“ should remove the param using an array key
    deep removal
      βœ“ should remove the param using a string key
      βœ“ should remove the param using an array key
    recursive removal
      βœ“ should clear object if only value is removed
      βœ“ should clear object if no props are left
      βœ“ should remove deep property, then clear the object

  plugins/console
    #getOpenSearchProxyConfig
      βœ“ sets timeout
      βœ“ uses https.Agent when url's protocol is https
      βœ“ uses http.Agent when url's protocol is http
      ssl
        βœ“ sets rejectUnauthorized to false when verificationMode is none
        βœ“ sets rejectUnauthorized to true when verificationMode is certificate
        βœ“ sets checkServerIdentity to not check hostname when verificationMode is certificate
        βœ“ sets rejectUnauthorized to true when verificationMode is full
        βœ“ doesn't set checkServerIdentity when verificationMode is full
        βœ“ sets ca when certificateAuthorities are specified
        when alwaysPresentCertificate is false
          βœ“ doesn't set cert and key when certificate and key paths are specified
          βœ“ doesn't set passphrase when certificate, key and keyPassphrase are specified
        when alwaysPresentCertificate is true
          βœ“ sets cert and key when certificate and key are specified
          βœ“ sets passphrase when certificate, key and keyPassphrase are specified
          βœ“ doesn't set cert when only certificate path is specified
          βœ“ doesn't set key when only key path is specified

  ProxyConfig
    constructor
      βœ“ uses ca to create sslAgent
      βœ“ uses cert, and key to create sslAgent
      βœ“ uses ca, cert, and key to create sslAgent
    #getForParsedUri
      parsed url does not match
        βœ“ returns {}
      parsed url does match
        βœ“ assigns timeout value
        βœ“ assigns ssl.verify to rejectUnauthorized
        uri us http
          ca is set
            βœ“ creates but does not output the agent
          cert is set
            βœ“ creates but does not output the agent
          key is set
            βœ“ creates but does not output the agent
          cert + key are set
            βœ“ creates but does not output the agent
        uri us https
          ca is set
            βœ“ creates and outputs the agent
          cert is set
            βœ“ creates and outputs the agent
          key is set
            βœ“ creates and outputs the agent
          cert + key are set
            βœ“ creates and outputs the agent

  ProxyConfigCollection
    http://localhost:5601
      βœ“ defaults to the first matching timeout
    https://localhost:5601/.opensearch_dashboards
      βœ“ defaults to the first matching timeout
    http://localhost:5602
      βœ“ defaults to the first matching timeout
    https://localhost:5602
      βœ“ defaults to the first matching timeout
    http://localhost:5603
      βœ“ defaults to the first matching timeout
    https://localhost:5603
      βœ“ defaults to the first matching timeout
    https://localhost:5601/index
      βœ“ defaults to the first matching timeout
    http://localhost:5601/index
      βœ“ defaults to the first matching timeout
    https://localhost:5601/index/type
      βœ“ defaults to the first matching timeout
    http://notlocalhost
      βœ“ defaults to the first matching timeout
    collection with ssl config and root level verify:false
      βœ“ verifies for config that produces ssl agent
      βœ“ disabled verification for * config

  #set_headers
    βœ“ throws if not given an object as the first argument
    βœ“ throws if not given an object as the second argument
    βœ“ returns a new object
    βœ“ returns object with newHeaders merged with originalHeaders
    βœ“ returns object where newHeaders takes precedence for any matching keys

  WildcardMatcher
    pattern = *
      βœ“ matches http
      βœ“ matches https
      βœ“ matches nothing
      βœ“ does not match /
      βœ“ matches localhost
      βœ“ matches a path
      defaultValue = /
        βœ“ matches /
    pattern = http
      βœ“ matches http
      βœ“ does not match https
      βœ“ does not match nothing
      βœ“ does not match localhost
      βœ“ does not match a path
    pattern = 560{1..9}
      βœ“ does not match http
      βœ“ does not matches 5600
      βœ“ matches 5601
      βœ“ matches 5602
      βœ“ matches 5603
      βœ“ matches 5604
      βœ“ matches 5605
      βœ“ matches 5606
      βœ“ matches 5607
      βœ“ matches 5608
      βœ“ matches 5609
      βœ“ does not matches 5610

  dateMath
    errors
      βœ“ should return undefined if passed something falsy
      βœ“ should return undefined if I pass an operator besides [+-/]
      βœ“ should return undefined if I pass a unit besidess,m,h,d,w,M,y,ms
      βœ“ should return undefined if rounding unit is not 1
      βœ“ should not go into an infinite loop when missing a unit
      forceNow
        βœ“ should throw an Error if passed a string
        βœ“ should throw an Error if passed a moment
        βœ“ should throw an Error if passed an invalid date
    objects and strings
      βœ“ should return the same moment if passed a moment
      βœ“ should return a moment if passed a date
      βœ“ should return a moment if passed an ISO8601 string
      βœ“ should return the current time when parsing now
      βœ“ should use the forceNow parameter when parsing now
    subtraction
      βœ“ should return 5s ago
      βœ“ should return 5s before 2014-01-01T06:06:06.666Z
      βœ“ should return 5s before forceNow
      βœ“ should return 5m ago
      βœ“ should return 5m before 2014-01-01T06:06:06.666Z
      βœ“ should return 5m before forceNow
      βœ“ should return 5h ago
      βœ“ should return 5h before 2014-01-01T06:06:06.666Z
      βœ“ should return 5h before forceNow
      βœ“ should return 5d ago
      βœ“ should return 5d before 2014-01-01T06:06:06.666Z
      βœ“ should return 5d before forceNow
      βœ“ should return 5w ago
      βœ“ should return 5w before 2014-01-01T06:06:06.666Z
      βœ“ should return 5w before forceNow
      βœ“ should return 5M ago
      βœ“ should return 5M before 2014-01-01T06:06:06.666Z
      βœ“ should return 5M before forceNow
      βœ“ should return 5y ago
      βœ“ should return 5y before 2014-01-01T06:06:06.666Z
      βœ“ should return 5y before forceNow
      βœ“ should return 5ms ago
      βœ“ should return 5ms before 2014-01-01T06:06:06.666Z
      βœ“ should return 5ms before forceNow
      βœ“ should return 12s ago
      βœ“ should return 12s before 2014-01-01T06:06:06.666Z
      βœ“ should return 12s before forceNow
      βœ“ should return 12m ago
      βœ“ should return 12m before 2014-01-01T06:06:06.666Z
      βœ“ should return 12m before forceNow
      βœ“ should return 12h ago
      βœ“ should return 12h before 2014-01-01T06:06:06.666Z
      βœ“ should return 12h before forceNow
      βœ“ should return 12d ago
      βœ“ should return 12d before 2014-01-01T06:06:06.666Z
      βœ“ should return 12d before forceNow
      βœ“ should return 12w ago
      βœ“ should return 12w before 2014-01-01T06:06:06.666Z
      βœ“ should return 12w before forceNow
      βœ“ should return 12M ago
      βœ“ should return 12M before 2014-01-01T06:06:06.666Z
      βœ“ should return 12M before forceNow
      βœ“ should return 12y ago
      βœ“ should return 12y before 2014-01-01T06:06:06.666Z
      βœ“ should return 12y before forceNow
      βœ“ should return 12ms ago
      βœ“ should return 12ms before 2014-01-01T06:06:06.666Z
      βœ“ should return 12ms before forceNow
      βœ“ should return 247s ago
      βœ“ should return 247s before 2014-01-01T06:06:06.666Z
      βœ“ should return 247s before forceNow
      βœ“ should return 247m ago
      βœ“ should return 247m before 2014-01-01T06:06:06.666Z
      βœ“ should return 247m before forceNow
      βœ“ should return 247h ago
      βœ“ should return 247h before 2014-01-01T06:06:06.666Z
      βœ“ should return 247h before forceNow
      βœ“ should return 247d ago
      βœ“ should return 247d before 2014-01-01T06:06:06.666Z
      βœ“ should return 247d before forceNow
      βœ“ should return 247w ago
      βœ“ should return 247w before 2014-01-01T06:06:06.666Z
      βœ“ should return 247w before forceNow
      βœ“ should return 247M ago
      βœ“ should return 247M before 2014-01-01T06:06:06.666Z
      βœ“ should return 247M before forceNow
      βœ“ should return 247y ago
      βœ“ should return 247y before 2014-01-01T06:06:06.666Z
      βœ“ should return 247y before forceNow
      βœ“ should return 247ms ago
      βœ“ should return 247ms before 2014-01-01T06:06:06.666Z
      βœ“ should return 247ms before forceNow
    addition
      βœ“ should return 5s from now
      βœ“ should return 5s after 2014-01-01T06:06:06.666Z
      βœ“ should return 5s after forceNow
      βœ“ should return 5m from now
      βœ“ should return 5m after 2014-01-01T06:06:06.666Z
      βœ“ should return 5m after forceNow
      βœ“ should return 5h from now
      βœ“ should return 5h after 2014-01-01T06:06:06.666Z
      βœ“ should return 5h after forceNow
      βœ“ should return 5d from now
      βœ“ should return 5d after 2014-01-01T06:06:06.666Z
      βœ“ should return 5d after forceNow
      βœ“ should return 5w from now
      βœ“ should return 5w after 2014-01-01T06:06:06.666Z
      βœ“ should return 5w after forceNow
      βœ“ should return 5M from now
      βœ“ should return 5M after 2014-01-01T06:06:06.666Z
      βœ“ should return 5M after forceNow
      βœ“ should return 5y from now
      βœ“ should return 5y after 2014-01-01T06:06:06.666Z
      βœ“ should return 5y after forceNow
      βœ“ should return 5ms from now
      βœ“ should return 5ms after 2014-01-01T06:06:06.666Z
      βœ“ should return 5ms after forceNow
      βœ“ should return 12s from now
      βœ“ should return 12s after 2014-01-01T06:06:06.666Z
      βœ“ should return 12s after forceNow
      βœ“ should return 12m from now
      βœ“ should return 12m after 2014-01-01T06:06:06.666Z
      βœ“ should return 12m after forceNow
      βœ“ should return 12h from now
      βœ“ should return 12h after 2014-01-01T06:06:06.666Z
      βœ“ should return 12h after forceNow
      βœ“ should return 12d from now
      βœ“ should return 12d after 2014-01-01T06:06:06.666Z
      βœ“ should return 12d after forceNow
      βœ“ should return 12w from now
      βœ“ should return 12w after 2014-01-01T06:06:06.666Z
      βœ“ should return 12w after forceNow
      βœ“ should return 12M from now
      βœ“ should return 12M after 2014-01-01T06:06:06.666Z
      βœ“ should return 12M after forceNow
      βœ“ should return 12y from now
      βœ“ should return 12y after 2014-01-01T06:06:06.666Z
      βœ“ should return 12y after forceNow
      βœ“ should return 12ms from now
      βœ“ should return 12ms after 2014-01-01T06:06:06.666Z
      βœ“ should return 12ms after forceNow
      βœ“ should return 247s from now
      βœ“ should return 247s after 2014-01-01T06:06:06.666Z
      βœ“ should return 247s after forceNow
      βœ“ should return 247m from now
      βœ“ should return 247m after 2014-01-01T06:06:06.666Z
      βœ“ should return 247m after forceNow
      βœ“ should return 247h from now
      βœ“ should return 247h after 2014-01-01T06:06:06.666Z
      βœ“ should return 247h after forceNow
      βœ“ should return 247d from now
      βœ“ should return 247d after 2014-01-01T06:06:06.666Z
      βœ“ should return 247d after forceNow
      βœ“ should return 247w from now
      βœ“ should return 247w after 2014-01-01T06:06:06.666Z
      βœ“ should return 247w after forceNow
      βœ“ should return 247M from now
      βœ“ should return 247M after 2014-01-01T06:06:06.666Z
      βœ“ should return 247M after forceNow
      βœ“ should return 247y from now
      βœ“ should return 247y after 2014-01-01T06:06:06.666Z
      βœ“ should return 247y after forceNow
      βœ“ should return 247ms from now
      βœ“ should return 247ms after 2014-01-01T06:06:06.666Z
      βœ“ should return 247ms after forceNow
    rounding
      βœ“ should round now to the beginning of the s
      βœ“ should round now to the beginning of forceNow's s
      βœ“ should round now to the end of the s
      βœ“ should round now to the end of forceNow's s
      βœ“ should round now to the beginning of the m
      βœ“ should round now to the beginning of forceNow's m
      βœ“ should round now to the end of the m
      βœ“ should round now to the end of forceNow's m
      βœ“ should round now to the beginning of the h
      βœ“ should round now to the beginning of forceNow's h
      βœ“ should round now to the end of the h
      βœ“ should round now to the end of forceNow's h
      βœ“ should round now to the beginning of the d
      βœ“ should round now to the beginning of forceNow's d
      βœ“ should round now to the end of the d
      βœ“ should round now to the end of forceNow's d
      βœ“ should round now to the beginning of the w
      βœ“ should round now to the beginning of forceNow's w
      βœ“ should round now to the end of the w
      βœ“ should round now to the end of forceNow's w
      βœ“ should round now to the beginning of the M
      βœ“ should round now to the beginning of forceNow's M
      βœ“ should round now to the end of the M
      βœ“ should round now to the end of forceNow's M
      βœ“ should round now to the beginning of the y
      βœ“ should round now to the beginning of forceNow's y
      βœ“ should round now to the end of the y
      βœ“ should round now to the end of forceNow's y
      βœ“ should round now to the beginning of the ms
      βœ“ should round now to the beginning of forceNow's ms
      βœ“ should round now to the end of the ms
      βœ“ should round now to the end of forceNow's ms
    math and rounding
      βœ“ should round to the nearest second with 0 value
      βœ“ should subtract 17s, rounded to the nearest second
      βœ“ should add 555ms, rounded to the nearest millisecond
      βœ“ should subtract 555ms, rounded to the nearest second
      βœ“ should round weeks to Sunday by default
      βœ“ should round weeks based on the passed moment locale start of week setting
      βœ“ should round up weeks based on the passed moment locale start of week setting
      βœ“ should round relative to forceNow
      βœ“ should parse long expressions
    used momentjs instance
      βœ“ should use the default moment instance if parameter not specified
      βœ“ should not use default moment instance if parameter is specified
      βœ“ should work with multiple different instances
      βœ“ should use global instance after passing an instance
    units
      βœ“ should have units descending for unitsDesc
      βœ“ should have units ascending for unitsAsc

  testSubjSelector()
    βœ“ converts subjectSelectors to cssSelectors

  @osd/eslint/disallow-license-headers
    valid
      βœ“ /* license */

console.log('foo')
      βœ“ // license

console.log('foo')
    invalid
      βœ“ console.log('foo')
      βœ“ console.log('foo')
      βœ“ console.log('foo')
      βœ“ console.log('foo')
      βœ“ console.log('foo')

  @osd/eslint/no-restricted-paths
    valid
      βœ“ import a from "../client/a.js"
      βœ“ const a = require("../client/a.js")
      βœ“ import b from "../server/b.js"
      βœ“ notrequire("../server/b.js")
      βœ“ notrequire("../server/b.js")
      βœ“ require("../server/b.js")
      βœ“ import b from "../server/b.js"
      βœ“ require("os")
      βœ“ const d = require("./deep/d.js")
      βœ“ const d = require("./deep/d.js")
      βœ“ import { X } from "./index_patterns"
    invalid
      βœ“ export { b } from "../server/b.js"
      βœ“ import b from "../server/b.js"
      βœ“ import a from "../client/a"
import c from "./c"
      βœ“ const b = require("../server/b.js")
      βœ“ const b = require("../server/b.js")
      βœ“ const d = require("./deep/d.js")
      βœ“ const d = require("src/core/server/saved_objects")
      βœ“ const d = require("ui/kfetch")
      βœ“ const d = require("ui/kfetch/public/index")
      βœ“ import { X } from "./index_patterns"

  @osd/eslint/require-license-header
    valid
      βœ“ /* license */

console.log('foo')
      βœ“ // license

console.log('foo')
    invalid
      βœ“ console.log('foo')
      βœ“ console.log('foo')
      βœ“ console.log('foo')
      βœ“ console.log('foo')
      βœ“ console.log('foo')
      βœ“ console.log('foo')
      βœ“

/* license */

console.log('foo')
      βœ“ /* not license */
/* license */
console.log('foo')

  opensearchArchiver: Stats
    #skippedIndex(index)
      βœ“ marks the index as skipped
      βœ“ logs that the index was skipped
    #deletedIndex(index)
      βœ“ marks the index as deleted
      βœ“ logs that the index was deleted
    #createdIndex(index, [metadata])
      βœ“ marks the index as created
      βœ“ logs that the index was created
      with metadata
        βœ“ debug-logs each key from the metadata
      without metadata
        βœ“ no debug logging
    #archivedIndex(index, [metadata])
      βœ“ marks the index as archived
      βœ“ logs that the index was archived
      with metadata
        βœ“ debug-logs each key from the metadata
      without metadata
        βœ“ no debug logging
    #indexedDoc(index)
      βœ“ increases the docs.indexed count for the index
    #archivedDoc(index)
      βœ“ increases the docs.archived count for the index
    #toJSON()
      βœ“ returns the stats for all indexes
      βœ“ returns a deep clone of the stats
    #forEachIndex(fn)
      βœ“ iterates a clone of the index stats

  opensearchArchiver createFormatArchiveStreams
    { gzip: false }
      βœ“ returns an array of streams
      βœ“ streams consume js values and produces buffers
      βœ“ product is pretty-printed JSON separated by two newlines
    { gzip: true }
      βœ“ returns an array of streams
      βœ“ streams consume js values and produces buffers
      βœ“ output can be gunzipped
    defaults
      βœ“ product is not gzipped

  opensearchArchiver createParseArchiveStreams
    { gzip: false }
      βœ“ returns an array of streams
      streams
        βœ“ consume buffers of valid JSON
        βœ“ consume buffers of valid JSON separated by two newlines
        βœ“ provides each JSON object as soon as it is parsed
      stream errors
        βœ“ stops when any document contains invalid json
    { gzip: true }
      βœ“ returns an array of streams
      βœ“ parses blank files
      streams
        βœ“ consumes gzipped buffers of valid JSON
        βœ“ parses valid gzipped JSON strings separated by two newlines
      stream errors
        βœ“ stops when the input is not valid gzip archive
    defaults
      βœ“ does not try to gunzip the content

  opensearchArchiver: createGenerateDocRecordsStream()
    βœ“ scolls 1000 documents at a time
    βœ“ uses a 1 minute scroll timeout
    βœ“ consumes index names and scrolls completely before continuing

  opensearchArchiver: createIndexDocRecordsStream()
    βœ“ consumes doc records and sends to `_bulk` api
    βœ“ consumes multiple doc records and sends to `_bulk` api together
    βœ“ waits until request is complete before sending more
    βœ“ sends a maximum of 300 documents at a time
    βœ“ emits an error if any request fails

  opensearchArchiver: createCreateIndexStream()
    defaults
      βœ“ deletes existing indices, creates all
      βœ“ deletes existing aliases, creates all
      βœ“ passes through "hit" records
      βœ“ creates aliases
      βœ“ passes through records with unknown types
      βœ“ passes through non-record values
    skipExisting = true
      βœ“ ignores preexisting indexes
      βœ“ filters documents for skipped indices

  opensearchArchiver: createDeleteIndexStream()
    βœ“ deletes the index without checking if it exists
    βœ“ reports the delete when the index existed

  opensearchArchiver: createGenerateIndexRecordsStream()
    βœ“ consumes index names and queries for the mapping of each
    βœ“ filters index metadata from settings
    βœ“ produces one index record for each index name it receives
    βœ“ understands aliases

  opensearchArchiver: createFilterRecordsStream()
    βœ“ consumes any value
    βœ“ produces record values that have a matching type

  basic config file with a single app and test
    βœ“ runs and prints expected output

  failure hooks
    βœ“ runs and prints expected output

  readConfigFile()
    βœ“ reads config from a file, returns an instance of Config class
    βœ“ merges setting overrides into log
    βœ“ supports loading config files from within config files
    βœ“ throws if settings are invalid


  408 passing (6s)

Done in 8.62s.
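
As referenced above, here is a sketch of what a converted test might look like. The module, file names, and assertion style are illustrative; the existing Mocha tests may use a different assertion library:

    // Before: src/dev/__tests__/example.js (Mocha; shown as comments for comparison)
    //   describe('example', () => {
    //     it('adds two numbers', () => { expect(add(1, 2)).to.be(3); });
    //   });

    // After: src/dev/example.test.js (Jest picks this up by the .test.js suffix)
    import { add } from './example'; // hypothetical module under test

    describe('example', () => {
      it('adds two numbers', () => {
        expect(add(1, 2)).toBe(3);
      });
    });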

[Tests] Ensure functional test cases pass

This issue is to track and ensure that all functional test cases pass.

  • Run yarn test:ftr and list all broken test cases
  • Fix test cases, or remove them if they are no longer relevant.

[PURIFY] Create the OSS distribution only and remove the Default distribution

The Default distribution is generated under the Elastic license; we should remove it and create only the OSS distribution, with no other distributions.

Clean up distribution

  • Clean up non-OSS distributions
  • Clean up non-OSS distributions for all platforms

Purify buildtools

  • Remove all logic that checks the distribution flavor and make OSS the only option (i.e., remove all other distribution options and the --oss flag)
  • Remove all tests for the different flavors

[BUG] Sample data path from Overview page is broken

When navigating from the home page to the sample data page using the 'Try our sample data' link, routing works properly, but using the same link from the overview page breaks and shows an empty page.

This is likely because the code currently just appends the /tutorial_directory/sampleData path to whatever path the user is on. When navigating from the overview page, I see the path /opensearch_dashboards_overview#/tutorial_directory/sampleData, but it should always use /home#/tutorial_directory/sampleData.
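
A minimal sketch of the kind of fix, assuming the link is built in client-side code with access to the core application service; the helper name and import path are illustrative:

    // Illustrative only: build the URL from the home app's root instead of the current path.
    import { CoreStart } from 'opensearch-dashboards/public';

    export function getSampleDataUrl(core: CoreStart): string {
      // Resolves to <basePath>/app/home#/tutorial_directory/sampleData regardless of the current app.
      return core.application.getUrlForApp('home', { path: '#/tutorial_directory/sampleData' });
    }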

Remove snapshot update (-u) from .github/pr_check_workflow.yml

Is your feature request related to a problem? Please describe.

When adding yarn test:jest to .github/pr_check_workflow.yml, the workflow fails due to out-of-date snapshots. To get some tests running, we added -u to the workflow so that the snapshots update and the tests pass, just as we did in the Jenkinsfile. But this is bad, because tests should fail if the code author has not updated and committed the snapshots.

Describe the solution you'd like

Figure out why yarn test:jest is failing, fix it, and remove the -u snapshot-update flag from the CI workflow.
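
For context, this is why -u in CI defeats the check; a minimal, illustrative snapshot test:

    // yarn test:jest compares the value below against the committed snapshot and fails on any mismatch.
    // yarn test:jest -u silently rewrites the committed snapshot instead, so stale snapshots never fail CI.
    it('renders the header title', () => {
      expect({ title: 'OpenSearch Dashboards' }).toMatchSnapshot();
    });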

Describe alternatives you've considered

n/a

Additional context

n/a

Proposal: OpenSearch Dashboards Search Multiple Indices

Requirements - What kind of business use case are you trying to solve?

The ability to search multiple indices at the same time with a single query

Problem - What blocks you from solving this today?

Today the user interface in Kibana allows for the query to only run against a single index.

Proposal - what do you suggest to solve the problem or improve the existing situation?

We should modify Kibana to allow selecting multiple indices at the same time via checkboxes and run the corresponding query against all of them. Today you can only pick one at a time.


Assumptions

Installation of Kibana

Any open questions to address

None

Thanks from @jkowall and @galangel

[CI] Enable CI for project

  • Enable initial building on CI
  • Enable Unit test cases
  • Enable Integration tests
  • Enable building distributions.

[BUG] Errors building docker container

I am trying to build a Docker container with OpenSearch-Dashboards, but the build fails with the following error:

   β”‚      Step 23/31 : COPY --chown=1000:0 config/kibana.yml /usr/share/kibana/config/kibana.yml
   β”‚          at makeError (/mnt/build/OpenSearch-Dashboards/node_modules/execa/lib/error.js:59:11)
   β”‚          at handlePromise (/mnt/build/OpenSearch-Dashboards/node_modules/execa/index.js:114:26)
   β”‚          at process._tickCallback (internal/process/next_tick.js:68:7)

error Command failed with exit code 1.

The Dockerfile, which I think is generated from src/dev/build/tasks/os_packages/docker_generator/templates/Dockerfile, seems to reference kibana many times, so I'm guessing it needs to be updated?

To Reproduce
Steps to reproduce the behavior:
Run the following commands

yarn osd bootstrap --force-install --allow-root
yarn build --docker

Expected behavior
A docker container is built

[CI] Cleanup CI directory

Currently, the .ci directory is coupled to upstream infrastructure; remove it and start fresh for CI.

[RENAME] Renaming from Kibana to OpenSearch Dashboards

Track progress toward renaming from Kibana to the new OpenSearch Dashboards project.

Methods / Comments / Variables / Docs / Classes / Interfaces

Old → New
elasticsearch → opensearch
es → opensearch
Es → OpenSearch
kibana → opensearchDashboards
Kibana → OpenSearchDashboards
kbn → osd
KBN → OSD
Elasticsearch → OpenSearch

Files

Old → New
kbn-* → osd_*
kibana → opensearch_dashboards_*

Directories

Old → New
kbn-* → osd-*
kibana → opensearch-dashboards-*

packages

  • .ci
  • elastic-datemath
  • elastic-eslint-config-kibana
  • elastic-safer-lodash-set
  • kbn-ace #40
  • kbn-analytics #44
  • kbn-apm-config-loader #78
  • kbn-babel-preset #52
  • kbn-config #64
  • kbn-config-schema #71
  • kbn-dev-utils #74
  • kbn-es #79
  • kbn-es-archiver #42
  • kbn-eslint-import-resolver-kibana #45
  • kbn-eslint-plugin-eslint #46
  • kbn-expect #47
  • kbn-i18n #48
  • kbn-interpreter #49
  • kbn-logging #50
  • kbn-monaco #51
  • kbn-optimizer #59
  • kbn-plugin-generator #62
  • kbn-plugin-helpers
  • kbn-pm
  • kbn-release-notes
  • kbn-spec-to-console
  • kbn-std
  • kbn-storybook
  • kbn-telemetry-tools
  • kbn-test
  • kbn-test-subj-selector #43
  • kbn-ui-framework #41
  • kbn-ui-shared-deps
  • kbn-utility-types #39
  • kbn-utils #38

src/plugins

  • advanced_settings
  • apm_oss
  • bfetch
  • charts
  • console
  • dashboard
  • data
  • dev_tools
  • discover
  • embeddable
  • es_ui_shared
  • expressions
  • home
  • index_pattern_management
  • input_control_vis
  • inspector
  • kibana_legacy
  • kibana_overview
  • kibana_react
  • kibana_usage_collection
  • kibana_utils
  • legacy_export
  • management
  • maps_legacy
  • navigation
  • newsfeed
  • region_map
  • saved_objects
  • saved_objects_management
  • security_oss
  • share
  • telemetry
  • telemetry_collection_manager
  • telemetry_management_section
  • tile_map
  • timelion
  • ui_actions
  • url_forwarding #94
  • usage_collection
  • vis_default_editor
  • vis_type_markdown
  • vis_type_metric
  • vis_type_table
  • vis_type_tagcloud
  • vis_type_timelion
  • vis_type_timeseries
  • vis_type_vega #88
  • vis_type_vislib
  • vis_type_xy
  • visualizations
  • visualize

Write unit tests for restricted endpoints and upstream communication

We need unit tests that make sure no PRs or code try to hit Elastic telemetry endpoints such as /api/ui_metric/report or other telemetry endpoints.

We also need unit tests that fail if opensearch_dashboards.json manifest files exist for plugins we have disabled, such as Newsfeed and Telemetry Management Section.
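
A minimal sketch of the second check, assuming Jest and assuming the disabled plugins live under src/plugins with the directory names shown (the paths and names are assumptions, not verified against this repository):

    // disabled_plugins.test.ts - illustrative only
    import * as fs from 'fs';
    import * as path from 'path';

    const REPO_ROOT = path.resolve(__dirname, '..'); // adjust to where this test file lives
    const DISABLED_PLUGINS = ['newsfeed', 'telemetry_management_section'];

    describe('disabled upstream plugins', () => {
      it.each(DISABLED_PLUGINS)('has no opensearch_dashboards.json manifest for %s', (plugin) => {
        const manifest = path.resolve(REPO_ROOT, 'src/plugins', plugin, 'opensearch_dashboards.json');
        expect(fs.existsSync(manifest)).toBe(false);
      });
    });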

[BUG] package contains legacy information

Describe the bug

package.json contains legacy information.

  • "homepage": "https://www.opensearch.co/products/kibana", - this URL will not exist.
  • build points to a number and sha hash that are probably not valid any longer.

It's questionable to have the author field still pointing to someone not involved with the fork - I'll leave this call to someone else.

[Versioning] Release Process and Versioning for OpenSearch Dashboards

What is our release cadence?

What model of versioning will be used?

New forks will start using the Apache Versioning

How will OpenSearch and OpenSearch-Dashboards versions be related to each other?

OpenSearch Dashboards and OpenSearch will release major versions together. They will NOT synchronize minor releases: whenever the team feels ready to release a minor version or patch (modulo the schedule above), they should release.

What we guarantee is that any major release of OpenSearch Dashboards is compatible with the same major release of OpenSearch. For example, 3.2.1 of NotKibana will work with 3.0.4 of NotElasticsearch, but 2.3.1 of NotKibana is not guaranteed to work with 3.0.4 of NotElasticsearch.

Breaking Changes and Backwards Compatibility

We will not release any breaking changes except in major releases.

More on versioning has been discussed in opensearch-project/OpenSearch#95.

[BUG] README OpenSearch GitHub repo link

Describe the bug

The README says to clone OpenSearch as a prerequisite to running OpenSearch Dashboards, but the link for the OpenSearch engine points to OpenSearch Dashboards instead. Can we rectify the README?

[Tests] Ensure Jest integration test cases pass

This issue is to track and ensure that all Jest integration test cases pass. As part of the rename exercise, we excluded snapshot files and will rely on Jest to regenerate them.

  • Update automated snapshots: run yarn test:jest_integration -u
  • Run yarn test:jest_integration and list all broken test cases
  • Fix test cases, or remove them if they are no longer relevant.

Maintainer Areas of Responsibility

It would be great if we could also add people's areas of responsibility to the new MAINTAINERS file. This helps people understand which maintainer is responsible for a particular area. We could add this as a new column in the existing table or as a separate section of the file. I'm happy to do this if someone lets me know which people are responsible for what.

Here's an example.

The motivation for my suggestion is to help the rest of us outside of Amazon better understand the maintainers and what they are likely to be working on. The key here is likely and not exclusively. Having a bit more info helps the rest of us:

  • understand who to talk to if we need to escalate a PR that has been neglected for a long time or if something urgent comes up.
  • learn about potential gaps that we could fill - maybe no one from Amazon is focused on a particular area that one of us has expertise in.
  • we're all individuals interacting with each other, and if we all know a bit more about the people in leadership positions, it can help the rest of us understand where you're coming from.
