
dev-team-enablement's People

Contributors

chriscool, daviddias, jessicaschilling, stebalien, victorb


Forkers

isabella232

dev-team-enablement's Issues

Development flow/process with sharness in its own repository

Now, with sharness in its own repository, we need to figure out a development flow that works for everyone. The goal is to allow developers to make changes in sharness without having to update the implementations until they are ready. The idea is similar to how you decouple DB migrations and backend deployments from each other so they can be run when appropriate.

So the current thinking is some flow like this:

  • New Feature

    • Add new feature tests in sharness
    • New go-ipfs PRs will run sharness as usual by triggering a build after go-ipfs builds
    • If a test that never passed before is not passing, don't treat this as a test failure[0] (see the sketch after this list)
    • If the test has passed before but is not passing now, it's a failure
  • Bugfix

    • If something was fixed as a bugfix (maybe the asserted hash was wrong), we should be able to fix it asynchronously from other fixes, so as not to stop development. Changed tests need to be marked somehow so that they can fail without it being treated as a failure.
  • [0] This is to avoid new changes in sharness suddenly failing a bunch of PRs that are not related to the change at all.
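
To make the flow concrete, here is a minimal sketch of a CI gate that only fails the build on regressions. It assumes the run writes per-test results to a results.txt file (one "<test-name> pass|fail" line per test) and that the repository keeps a committed known-passing.txt baseline; both file names and formats are hypothetical, not existing sharness tooling.

```ts
// baseline-check.ts — a minimal sketch of the gate described above (hypothetical files).
import { readFileSync } from "fs";

// Committed baseline: one sharness test script name per line.
const knownPassing = new Set(
  readFileSync("known-passing.txt", "utf8").split("\n").map(l => l.trim()).filter(Boolean)
);

// Current run results: lines of the form "<test-name> <pass|fail>".
const results = readFileSync("results.txt", "utf8")
  .split("\n")
  .map(l => l.trim())
  .filter(Boolean)
  .map(l => {
    const [name, status] = l.split(/\s+/);
    return { name, passed: status === "pass" };
  });

let regressions = 0;
for (const { name, passed } of results) {
  if (passed) continue;
  if (knownPassing.has(name)) {
    // A test that passed before is failing now: treat as a real failure.
    console.error(`REGRESSION: ${name}`);
    regressions++;
  } else {
    // A test that has never passed yet: report it, but don't fail the build.
    console.warn(`not yet passing (ignored): ${name}`);
  }
}

process.exit(regressions > 0 ? 1 : 0);
```

Updating known-passing.txt as tests start to pass (and relaxing it when a bugfix legitimately changes expectations) is the bookkeeping this approach requires.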

The goal of this issue is to arrive at a process that works for the teams involved (mainly go-ipfs, js-ipfs, and rs-ipfs when that happens).

cc @chriscool @travisperson @diasdavid @whyrusleeping

OKR: Test isolation (`interface-ipfs-core`)

https://github.com/ipfs/testing/projects/1#card-9340002

The recent refactor of interface-ipfs-core did a lot to improve test isolation. There is currently only a single test that fails to run in isolation from the others (from js-ipfs-api under go-ipfs v0.4.15).

https://github.com/ipfs/interface-ipfs-core/issues/320

It also appears that this is more of an issue with js-ipfs-api itself; it only snuck through as valid because go-ipfs was rejecting the type change of the config key.

Repository listings

Allow website jobs to choose hugo version

From ipfs-inactive/docs#68 (comment)

Currently, websites using Jenkins to deploy to IPFS cannot choose which Hugo version to use.

We want the website() function in the ipfs/jenkins-libs repository to accept a hugo_version argument so that projects can specify exactly which Hugo version to use.

For this to work, we should tag our ci-websites images with the Hugo version, then build and push them to Docker Hub.

Currently failing JS projects

The following projects should be checked to see why they are failing, fixing any issues that are on CI's side rather than in the projects themselves.

Progressive Testing

Does anyone know of a better name / the actual name for this?


This relates to #45, interface-ipfs-core, as well as other interface testing projects.

The goal of what I'm calling, for the moment, progressive testing is to enable a way to run a test suite that may contain failing tests and still get value out of the results. It also aims to remove implementation-specific workarounds inside the interface tests themselves, which currently exist quite extensively inside interface-ipfs-core.

The basic idea is to allow failing tests to not affect the outcome of a test suite as long as they have not previously been proven to pass.

I think this is important because IPFS and libp2p are built around the idea of interfaces and protocols. These ideas enable a powerful way to build and compose software, but I think it's important to also make it easy to consume and work with the tests.

  • libp2p
  • ipfs
  • Protocol Driven Development: https://github.com/ipfs/pdd

The issue at the moment is due to the simplicity of most test runners and reporters. A suite of tests is considered to have failed if any single test fails, which is generally communicated through the runner's exit code.

When starting work on a new project, this means you are in a continual state of failure until a full implementation has been completed. To get around this issue at the moment, we rely on making modifications to the tests themselves for each implementation (see interface-ipfs-core).

One possible solution is to define a new kind of reporter: one that knows which tests have passed previously. Any previously passing test that fails results in a failure of the suite. If a test has never passed, it is considered skipped.

This enables a project to introduce the entire test suite and incrementally move towards 100% test completion.

This does have a drawback though, as it requires tracking the previous state of the tests.
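
As an illustration only, here is a minimal sketch of that idea using mocha's programmatic API; the passing.json baseline file (a JSON array of full test titles) is a hypothetical convention, not an existing package.

```ts
// progressive-run.ts — a sketch, not an existing tool: run a mocha suite and decide
// success purely from regressions against a committed baseline of passing tests.
import Mocha from "mocha";
import { readFileSync } from "fs";

// Hypothetical baseline file: a JSON array of full test titles that have passed before.
const previouslyPassing = new Set<string>(
  JSON.parse(readFileSync("passing.json", "utf8"))
);

const mocha = new Mocha();
mocha.addFile("test/interface.spec.js"); // placeholder path

let regressions = 0;

mocha
  .run(() => {
    // Ignore mocha's own failure count; only regressions decide the exit code.
    process.exitCode = regressions > 0 ? 1 : 0;
  })
  .on("fail", (test) => {
    if (previouslyPassing.has(test.fullTitle())) {
      console.error(`regression: ${test.fullTitle()}`); // passed before, failing now
      regressions++;
    } else {
      console.warn(`not yet passing (ignored): ${test.fullTitle()}`); // never proven to pass
    }
  });
```

The drawback mentioned above is visible here: passing.json has to be kept up to date, e.g. regenerated and committed whenever new tests start passing.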

@alanshaw has made some progress with this by allowing more finely controlled execution of tests, breaking things down into smaller sections.

ipfs-inactive/interface-js-ipfs-core#290

This is a great step in the right direction, as it enables projects to incrementally build and introduce subsystems. However, we still run into the issue of a new test being added and breaking implementations that have yet to introduce the new feature.

Custom npm registry running with jenkins master

The npm registry is still having issues serving packages, returning 404s whenever it feels like it. We should run our own package cache that automatically retries fetching packages that come back as 404, at least a couple of times.

This would make build times faster and, most importantly, make builds more stable.
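
As a rough illustration of the retry part (a real setup would more likely reuse an existing caching proxy such as verdaccio), a tiny Node proxy in front of the upstream registry could look like this; the port and retry count are arbitrary, and global fetch assumes Node 18+.

```ts
// registry-retry-proxy.ts — illustration only, not production code: forwards requests
// to the upstream npm registry and retries when it answers 404 or 5xx.
import { createServer } from "http";

const UPSTREAM = "https://registry.npmjs.org";
const RETRIES = 3;

createServer(async (req, res) => {
  const url = UPSTREAM + (req.url ?? "/");
  let upstream: Response | null = null;

  for (let attempt = 1; attempt <= RETRIES; attempt++) {
    upstream = await fetch(url);
    if (upstream.status !== 404 && upstream.status < 500) break;
    // Flaky 404/5xx from the registry: back off briefly and try again.
    await new Promise((r) => setTimeout(r, attempt * 500));
  }

  const body = Buffer.from(await upstream!.arrayBuffer());
  res.writeHead(upstream!.status, {
    "content-type": upstream!.headers.get("content-type") ?? "application/octet-stream",
  });
  res.end(body);
}).listen(4873, () => console.log("registry proxy listening on :4873"));
```

Builds would then point npm at it, e.g. with `npm config set registry http://localhost:4873` or the equivalent `--registry` flag.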

Regular weekly sync?

Might be useful to start having sync meetings so we can ensure that we're making progress on the right things.

@travisperson, for now it'll just be us (and interested people), but more people will join in the future, so it might be good to start right away. What do you think?

Tracking requests

Is there a way to know whether a request has been acknowledged, and what the expectation is for when it will be done?

CI status badge

We need Jenkins CI badges so that we can replace the Jenkins, Circle, and AppVeyor badges in the README.


OKR: Reduce complexity (interface-ipfs-core)

https://github.com/ipfs/testing/projects/1#card-9340136

A large refactor of the interface-ipfs-core project (ipfs-inactive/interface-js-ipfs-core#290) was merged last week. The new organization of tests makes it easier to find the tests for any given command, as the test structure mimics the js-ipfs / js-ipfs-api command layouts.

I recently opened an issue (https://github.com/ipfs/interface-ipfs-core/issues/313) to build on top of this refactor and address some of the patterns and conventions around writing tests, with the aim of reducing code duplication and complexity.

Currently seeking feedback before moving forward.
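
For illustration, one pattern in that direction is to define each command's tests once against a small factory interface that every implementation supplies; all names below (createAddSuite, DaemonFactory, IpfsLike) are hypothetical and not the actual interface-ipfs-core API. The suite is intended to run under mocha, which provides describe/it/before/after.

```ts
// A sketch of the "write once, run against any implementation" pattern; hypothetical names.
import { expect } from "chai";

export interface IpfsLike {
  // Only the bit of the interface this example needs.
  add(content: Uint8Array): Promise<{ cid: string }>;
}

export interface DaemonFactory {
  spawn(): Promise<IpfsLike>;
  cleanup(): Promise<void>;
}

// Each command gets exactly one suite like this; js-ipfs and js-ipfs-api would both
// call it with their own factory instead of duplicating the test bodies.
export function createAddSuite(factory: DaemonFactory): void {
  describe("add", () => {
    let ipfs: IpfsLike;

    before(async () => { ipfs = await factory.spawn(); });
    after(async () => { await factory.cleanup(); });

    it("adds a small buffer and returns a CID", async () => {
      const { cid } = await ipfs.add(new TextEncoder().encode("hello"));
      expect(cid).to.be.a("string").and.not.be.empty;
    });
  });
}
```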

Set up Jenkinsfile for sharness to pull down go-ipfs and run the tests

As a first step with sharness, we should add it to Jenkins now that it's under the IPFS organization.

The plan is to:

  • Have Jenkins find the latest successful build of go-ipfs
  • Clone sharness
  • Put the go-ipfs binary in the right place
  • Run sharness
  • Save the test results
  • Figure out how to set commit statuses on other projects (when sharness is triggered from a go-ipfs PR, we need to set a cross-project commit status); see the sketch after this list for the first and last steps
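
For the first and last steps, a hedged sketch of what the glue could look like — a hypothetical Node script, not part of ipfs/jenkins-libs, with a placeholder Jenkins host:

```ts
// fetch-and-report.ts — sketch of steps 1 and 6 of the plan above: locate the latest
// successful go-ipfs build via the Jenkins JSON API, then report a cross-project
// commit status via the GitHub statuses API. Requires Node 18+ for global fetch.
const JENKINS = process.env.JENKINS_URL ?? "https://ci.example.org"; // placeholder host

interface JenkinsBuild {
  url: string;
  artifacts: { relativePath: string }[];
}

async function latestGoIpfsBuild(): Promise<JenkinsBuild> {
  // Jenkins exposes build metadata as JSON under .../api/json.
  const res = await fetch(`${JENKINS}/job/go-ipfs/lastSuccessfulBuild/api/json`);
  if (!res.ok) throw new Error(`Jenkins API returned ${res.status}`);
  return res.json() as Promise<JenkinsBuild>;
}

async function setCommitStatus(sha: string, state: "success" | "failure", targetUrl: string) {
  // Cross-project commit status on the go-ipfs repo.
  const res = await fetch(`https://api.github.com/repos/ipfs/go-ipfs/statuses/${sha}`, {
    method: "POST",
    headers: {
      authorization: `token ${process.env.GITHUB_TOKEN}`,
      "content-type": "application/json",
    },
    body: JSON.stringify({ state, context: "sharness", target_url: targetUrl }),
  });
  if (!res.ok) throw new Error(`GitHub API returned ${res.status}`);
}

(async () => {
  const build = await latestGoIpfsBuild();
  console.log("latest successful go-ipfs build:", build.url);
  // Downloading the go-ipfs binary artifact, running sharness against it and saving
  // the results (steps 2–5) would go here; omitted in this sketch.
  // await setCommitStatus(commitSha, hadRegressions ? "failure" : "success", build.url);
})();
```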

cc @chriscool

Tracking OKRs 2018 Q2

WIP issue to organize information obtained from team leads of the rest of the WGs

ipfs-cluster (Hector)

  • Jenkins needs to handle secrets for code coverage uploading
  • kubernetes-ipfs might need support for automation; it also needs to handle secrets
  • test-lab: we're both unsure whether this falls under the DX WG

Interop tests

We have some tests designed to guarantee interoperability between the js and go implementations (also these ones to a degree, although I don't believe that's their primary purpose).

Are they run as part of CI or the release process anywhere? They should really prevent releases if they fail.
