

heupr

The service app 🔧


Introduction

Heupr automates project management for software teams working on GitHub. Our goal is to build features and services that allow developers and managers to stay in the "flow zone" and do what they do best: write code!

Many projects can benefit from automating management tasks, and Heupr is designed to provide a platform to do so quickly. The app can be easily installed on a target GitHub repository and configured by including a modular .heupr.yml file in the root directory. Any of the core packages, which provide the various feature functionalities such as issue assignment or pull request estimates, can be included in the configuration setup, and there are plans to support third-party packages made available by the community.

See the open Issues for information regarding current limitations to the platform and the status of work being done.

"Heupr" is a portmanteau of the words "heuristic" and "projects" and is pronounced "hew-Per."

Contributing

Check out our contribution and conduct guidelines; jump in and get involved!

We're excited to have you working with us!

Code

Here are a few quick points to help get you started working on the core Heupr repository:

  • We follow test-driven development (TDD) on this project, so be sure to build test cases alongside the production code; Travis-CI and Coveralls will ensure that they are run and that our coverage is adequate, but feel free to test things out locally too.
  • Our overall design goal is to be as "plug-n-play" as possible so new packages or features can be added easily; keep everything modular and minimal.
  • All of our code should be clean and readable so be sure to run gofmt and golint on your code - this is also checked by Travis-CI just to be safe!

Packages

NOTE: External, third-party packages are not yet supported, but support is planned. For now, if a third-party package is broadly useful, it can be included among the "built-in" packages provided by Heupr. The guidelines below are for the planned external package support.

If you want to contribute a package that can be used by Heupr's backend, here are some guidelines:

  • Packages need to conform to the Backend interface provided by the backend package in the core repo (a rough sketch of what an implementation might look like follows this list).
  • A publicly accessible .so plugin file location needs to be provided via URL so that it can be referenced as a third-party backend package in your or another user's Heupr instance.
  • If you feel like you've got a really cool package, feel free to reach out to the project maintainers and request to have it added to the core packages - we'd love to include it!
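
As a rough illustration only: the sketch below shows what a third-party plugin package might look like. The Backend interface shown here is an assumption (the real definition lives in the core repo's backend package), and the NewBackend symbol name, method set, and build command are illustrative rather than prescribed.

```go
// Hypothetical third-party backend plugin; the real Backend interface is
// defined in the core repo's backend package and may differ from this sketch.
package main

import "log"

// Backend mirrors the shape a core Backend interface might have; the actual
// method set is defined by the heupr core repo.
type Backend interface {
	Learn(payload []byte) error
	Predict(payload []byte) ([]byte, error)
}

// labeler is a stand-in implementation of the assumed interface.
type labeler struct{}

func (l *labeler) Learn(payload []byte) error {
	log.Printf("training on %d bytes", len(payload))
	return nil
}

func (l *labeler) Predict(payload []byte) ([]byte, error) {
	return []byte(`{"assignee":"example-user"}`), nil
}

// NewBackend is the symbol a Heupr instance could look up after plugin.Open
// on the published .so file (built with `go build -buildmode=plugin`).
func NewBackend() Backend { return &labeler{} }
```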

Contact

Feel free to reach out to us on Twitter if you're unable to find the answers you need in this README, the repositories' Issues tabs, or the Wiki FAQ page.

heupr's People

Contributors

forstmeier, taylormike


heupr's Issues

Create backend processing "new" function

Description

This will be just a general "generator"-style function that will be publicly available for use by the instantiating caller. The first iteration of this function (and probably the default) will just return an ElasticSearch instance that can be plugged into the backend to serve as the decision engine.

Right now, the design is essentially that the New function will reference an externally instantiated ES instance (the first cut/default will likely be a Heroku add-on).

Specifics

  • build instantiate/launch node
  • hook up necessary interfaces (if needed)
  • return Backend struct with running node
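
A minimal sketch of the generator-style constructor described above, assuming the Backend struct simply holds a reference to an externally running ES node; the struct fields and the ELASTICSEARCH_URL environment variable name are assumptions, not the actual implementation.

```go
package backend

import (
	"errors"
	"os"
)

// Backend is a sketch of the struct the New function would return; the
// real definition lives in the core backend package.
type Backend struct {
	// searchURL points at an externally instantiated ElasticSearch node,
	// e.g. a Heroku add-on; the env var name below is only an assumption.
	searchURL string
}

// New is the "generator"-style constructor described above: it references an
// already-running ES instance rather than launching one itself.
func New() (*Backend, error) {
	url := os.Getenv("ELASTICSEARCH_URL") // hypothetical variable name
	if url == "" {
		return nil, errors.New("backend: no ElasticSearch URL configured")
	}
	return &Backend{searchURL: url}, nil
}
```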

Also: Hey, @taylormike and @forstmeier - here's a new issue for you to check out!

Refactor backend package into sub-package

Description

In order to make the application multi-functional, the resources currently stored directly in backend/ need to be factored out into their own sub-package; other backend packages can then be added alongside it.

Specifics

Some initial points:

  • package named estimatepr/ (or something else)
  • all existing file resources moved into it
  • adjusted import references throughout rest of codebase

Additional changes will need to be made to the frontend/ package to reflect these changes and the "root" logic will need to be updated to account for multiple sub-packages. See #18 for reference.

Hey, @taylormike and @forstmeier - here's a new issue for you to check out!

Build dependent-pin logic into preprocess

Description

This is one of the major additions I'd like to make to the platform. The goal here is to make the identification of the particular class for a given feature's model more flexible and configurable to the end user. Essentially, this identifies which contributor "owns" a particular object in GitHub (e.g. a piece of text); in model terms, this is responsible for identifying the dependent variables in the training process.

This should be as general as possible but in the first iteration, it will likely still be pretty heavily tied to the identification of a particular GitHub user (as opposed to some other object).

Specifics

We'll gradually expand these TODOs going forward.

  • build helper function (or interface)
  • arg(s): web struct
  • output(s): updated web struct with identified dependent (GitHub user)
  • build directly into the api/ module (but keep modular as possible)
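
As a very loose sketch (the web struct is not yet defined, so every field and name below is hypothetical), the helper could look something like this:

```go
package preprocess

// web is a placeholder for the feature-web struct described in the related
// issue; its real fields are still to be determined.
type web struct {
	text       string
	candidates []string // contributor logins attached to the web's nodes
	dependent  string   // the GitHub user identified as "owning" the object
}

// pinDependent is a sketch of the helper: it takes a web struct and returns
// an updated copy with the dependent (GitHub user) identified. The selection
// rule here is a trivial stand-in; the real logic is what this issue builds.
func pinDependent(w web) web {
	if len(w.candidates) > 0 {
		w.dependent = w.candidates[0]
	}
	return w
}
```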

Note that this will likely come after/alongside the completion of #1. Additionally, this will be closely related to the logic built in #3.

Hey, @taylormike and @forstmeier - here's a new issue for you to check out!

Create minimal issue assignment backend logic

Description

This would reintroduce the currently missing issue assignment feature. Exact specifications for how to accomplish this may change, but using ElasticSearch might be the quickest solution; a rough sketch of the handler shape follows the list below.

Specifics

Some points:

  • build "handler"
    • accept issue body input (possibly additional fields or whole struct)
    • return predicted username
  • create wrapper around ES API for mocking
  • set/reference necessary ES environment variables
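
A minimal sketch of the handler and the ES wrapper idea, assuming a single-method interface so the search call can be mocked in tests; the names and signatures are placeholders, not the final API.

```go
package backend

// searcher wraps the ElasticSearch call the handler needs so that tests can
// substitute a mock; the single method here is an assumption.
type searcher interface {
	matchUser(body string) (string, error)
}

// assignIssue is a sketch of the "handler": it accepts the issue body text
// and returns the predicted assignee username from the search backend.
func assignIssue(s searcher, body string) (string, error) {
	username, err := s.matchUser(body)
	if err != nil {
		return "", err
	}
	return username, nil
}
```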

This will come after the work in #18 and #19. Also see #21 for frontend reference.

Hey, @taylormike and @forstmeier - here's a new issue for you to check out!

Add in error checks for the reading methods called in Server.Start()

Description

Currently, we have two methods that read from the database but don't actually evaluate the errors that are generated; we need to implement a check for these and also update the unit test coverage accordingly.

Specifics

  • add in an error check for the readIntegrations method
  • add in an error check for the readSettings method
  • add in cases to the table-driven tests in server_test.go
    • scenario where readIntegrations returns an error
    • scenario where readSettings returns an error
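
A small sketch of the check to add, assuming the two methods return a value and an error (the actual signatures live in the codebase):

```go
package server

import "fmt"

// integration and setting are placeholders; the real structs live in the
// repo's database code, as do the actual method signatures.
type integration struct{}
type setting struct{}

// reader captures the two calls Start() makes so the pattern (and a mock in
// tests) can be shown.
type reader interface {
	readIntegrations() ([]integration, error)
	readSettings() ([]setting, error)
}

// load sketches the checks to add in Server.Start(): evaluate both returned
// errors instead of discarding them.
func load(db reader) error {
	if _, err := db.readIntegrations(); err != nil {
		return fmt.Errorf("reading integrations: %w", err)
	}
	if _, err := db.readSettings(); err != nil {
		return fmt.Errorf("reading settings: %w", err)
	}
	return nil
}
```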

Note that these changes may require some refactoring in order to effectively mock out these method calls; please reach out to the maintainers if additional guidance is needed.

Also, please see #4 for additional color on unit test coverage expansion.

Hey, @taylormike and @forstmeier - here's a new issue for you to check out!

Build "create" method for feature database

Description

It probably makes sense to build out a series of helper methods to encapsulate the database and keep the actual "feature processing" logic separate; however, these database methods will likely be called by the "feature processing" resources.

This will need to be able to receive new features and insert the object into the target database. Basically, we want the "C" in "CRUD." A sketch covering all four CRUD helpers follows the Specifics list.

Specifics

  • create a retrieve call using the interface
  • check if object already exists
    • true: perform update (possibly)
    • false: insert object
  • comprehensive table-based unit tests
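
A sketch of what the combined helper surface might look like, covering this issue and the related read/update/delete issues; the feature struct is a placeholder and all names/signatures are assumptions:

```go
package backend

// feature is a placeholder; the real struct is still being designed.
type feature struct {
	ID   string
	Body string
}

// featureStore sketches the CRUD helper methods described in this issue and
// in the related read/update/delete issues; names and signatures are
// assumptions, not the final API.
type featureStore interface {
	create(f feature) error          // insert, or update if the ID already exists
	read(id string) (feature, error) // the "R" issue
	update(f feature) error          // overwrite the existing object
	delete(id string) error          // remove, e.g. after features are merged
}
```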

Also, see #8, #10, and #11 for the other CRUD operations we'll (likely) need. This is also an MVP of the function that we'll need going forward (fleshing this out will be dependent on the structure of the feature struct).

Hey, @taylormike and @forstmeier - here's a new issue for you to check out!

Re-implement unit tests for the server start method

Description

The server_test.go file needs to have the currently commented-out unit test reimplemented to work towards more complete coverage (and, ideally, better adherence to TDD); a sketch of the table-driven shape follows the list below.

Specifics

  • follow the tests := []struct{} table-driven pattern
  • include the following scenarios at a minimum:
    • no repos in the database
    • one repo in the database
    • three repos in the database
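
A bare-bones sketch of the table-driven shape, with a stand-in function since the real Server and database fixtures come from the repo's own types:

```go
package server

import "testing"

// startRepos is a stand-in for whatever Start() ends up consuming; the real
// test would build Server and database fixtures from the repo's own types.
func startRepos(repos []string) int { return len(repos) }

func TestStart(t *testing.T) {
	tests := []struct {
		desc  string
		repos []string
		want  int
	}{
		{"no repos in the database", nil, 0},
		{"one repo in the database", []string{"heupr/test"}, 1},
		{"three repos in the database", []string{"a/a", "b/b", "c/c"}, 3},
	}

	for _, tc := range tests {
		if got := startRepos(tc.repos); got != tc.want {
			t.Errorf("%s: got %d, want %d", tc.desc, got, tc.want)
		}
	}
}
```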

There may be more scenarios worth adding, but those will be included in a separate issue.

Hey, @taylormike and @forstmeier - here's a new issue for you to check out!

Flatten backend directory structure

Description

Some of the packaging within backend/ is likely unnecessary; the only real case I can think of would be to support easier unit tests (e.g. worker logic) but if that's an issue, it should be solved on a case-by-case basis. This also ties in with the goals of implementing a dependent-pin and feature-web.

Specifics

This is a preliminary list and will be expanded with more details going forward; a sketch of the two proposed interfaces follows it.

  • move sub-directories into root
    • preprocess/ folder
    • process/ folder
    • response/ folder
  • change all package statements to backend
  • build new interfaces
    • Model for learn/predict operations (this might need a rename)
    • Preprocessor for feature web and dependent pin logic
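
A sketch of the two proposed interfaces; the method sets are assumptions and, as noted above, the names themselves may change:

```go
package backend

// Model covers the learn/predict operations mentioned above; the method
// signatures are assumptions and the name itself may change.
type Model interface {
	Learn(input []byte) error
	Predict(input []byte) ([]byte, error)
}

// Preprocessor covers the feature-web and dependent-pin logic; again, the
// exact arguments are still to be designed.
type Preprocessor interface {
	Preprocess(input []byte) ([]byte, error)
}
```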

Hey, @taylormike and @forstmeier - here's a new issue for you to check out!

Build "read" method for feature database

Description

It probably makes sense to build out a series of helper methods to encapsulate the database and keep the actual "feature processing" logic separate; however, these database methods will likely be called by the "feature processing" resources.

This will need to be able to retrieve target features from the database in an ad-hoc manner. Basically, we want the "R" in "CRUD."

Specifics

  • accept feature ID as argument (likely ID)
  • return full feature struct/object
  • comprehensive table-based unit tests

Also, see #7, #10, and #11 for the other CRUD operations we'll (likely) need. This is also an MVP of the function that we'll need going forward (fleshing this out will be dependent on the structure of the feature struct).

Hey, @taylormike and @forstmeier - here's a new issue for you to check out!

Remove feature index from api/ package

Description

Although the "feature index" included in the api/ package has some potential value, it isn't of much use currently. It was possibly going to be a source for a "replay server" of GitHub event/object data, but on reflection it's probably better served as a standalone backend/ module.

The api/ package will likely now pivot to primarily focus on absorbing requests from the frontend/ package and coordinating outbound requests to various backend/ packages. For starters, this will likely be a simple 1:1 relationship with more complex routing being implemented as more backend combinations are presented.

Specifics

  • remove feature.go file
    • move reference number identifier logic to backend resources
    • move any required unmarshalling logic to backend resources
  • remove index.go file (nothing here is actually needed)
  • delete all associated _test.go files
  • adjust api.go in-code comments to indicate simple Learn/Predict calls without the "feature calls"

Hey, @taylormike and @forstmeier - here's a new issue for you to check out!

Add unit test for database.readSettings()

Description

This is one of the three unit tests that still need to be built for the database methods; these could be a bit tricky and may require some refactoring to be fully flexible.

Specifics

  • include the scenarios below as a minimum
    • d.sqlDB.Query returns an error
    • toml.Decode returns an error
    • successful method call

Other cases are encouraged but not required at this time as they might require additional refactoring.

Hey, @taylormike and @forstmeier - here's a new issue for you to check out!

Build feature-web logic into preprocess

Description

This is similar to the normalization/conflation logic that we have in heupr/core but would ideally be more configurable and more flexible to allow different elements of the GitHub UI to be tied together (e.g. to create blobs of text from sources other than just Issues and Pull Requests for text models). Essentially, this creates "webs" from feature "nodes" to be fed into models for training/prediction; this also creates an object hierarchy which would allow things like partial credit to enter the system.

The results of this implementation will (likely) be fed into a graph database that will live in the api/ module and be referenced when new events are received. This will need logic to search/retrieve the existing "webs" in the database and to update them. A sketch of the interface shape follows the Specifics list.

Specifics

  • create interface struct (e.g. web)
    • insertMultiple: bulk feature creation, create() database call, typically called by Learn (when a repo starts up)
    • insertSingle: individual feature input, update or newly received
  • add an initial package implementation of the interface
  • include calls to web database management methods
  • embed directly into api/ module (for now)
  • test for method input/output expected results
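
A loose sketch of the interface shape; the node type and method signatures are placeholders until the feature struct design settles:

```go
package preprocess

// node is a placeholder for a single feature "node" (e.g. an issue or pull
// request); the real struct is not yet defined.
type node struct {
	ID   string
	Text string
}

// web sketches the interface described above: bulk inserts when a repo
// starts up (called from Learn) and single inserts for new or updated events.
type web interface {
	insertMultiple(nodes []node) error // bulk feature creation via the create() database call
	insertSingle(n node) error         // individual feature input, updated or newly received
}
```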

Note that this logic will likely be called before the logic in #2; it may need to be a part of the same interface as well.

Hey, @taylormike and @forstmeier - here's a new issue for you to check out!

Build "update" method for feature database

Description

It probably makes sense to build out a series of helper methods to encapsulate the database and keep the actual "feature processing" logic separate; however, these database methods will likely be called by the "feature processing" resources.

This will need to be able to update existing feature objects in the database with new information. Basically, we want the "U" in "CRUD."

Specifics

  • accept a full feature struct argument
  • place object into database (likely with overwrite)
  • include test cases (table-driven)

Note that this particular method may also need to be able to call the delete method in the event that the update logic combines features together.

Also, see #7, #8, and #11 for the other CRUD operations we'll (likely) need. This is also an MVP of the function that we'll need going forward (fleshing this out will be dependent on the structure of the feature struct).

Hey, @taylormike and @forstmeier - here's a new issue for you to check out!

Refactor frontend package into sub-package

Description

In order to make the application multi-functional, the resources currently stored directly in frontend/ need to be factored out into their own sub-package; other frontend packages can then be added alongside it.

Specifics

Some initial points:

  • package named estimatepr/ (or something else)
  • all existing file resources moved into it
  • adjusted import references throughout rest of codebase

Additional changes will need to be made to the backend/ package to reflect these changes and the "root" logic will need to be updated to account for multiple sub-packages. See #19 for reference.

Hey, @taylormike and @forstmeier - here's a new issue for you to check out!

Create CloudFormation template for application deployment

Description

This will be a JSON file containing the definition of the resources required to launch the application in AWS; most of this stuff will be for the benefit of third-party users looking to spin up their own instance of the app but it will also be good to have for internal use.

These specifications still need to be sorted out/expanded and it's possible a serverless approach will be adopted which will change the requirements.

Hey, @taylormike and @forstmeier - here's a new issue for you to check out!

Build "delete" method for feature database

Description

It probably makes sense to build out a series of helper methods to encapsulate the database and keep the actual "feature processing" logic separate; however, these database methods will likely be called by the "feature processing" resources.

This will need to be able to delete feature objects in the database as needed; this could potentially be necessary in the event that an update action combines features or something along those lines. Basically, we want the "D" in "CRUD."

Specifics

  • accept an object ID argument
  • delete entire feature object from database
  • include test cases (table-driven)

Also, see #7, #8, and #10 for the other CRUD operations we'll (likely) need. This is also an MVP of the function that we'll need going forward (fleshing this out will be dependent on the structure of the feature struct).

Hey, @taylormike and @forstmeier - here's a new issue for you to check out!

Add unit tests for processRepoInstallation() method

Description

The tests have been temporarily removed from the ingestor/worker_test.go resources; they need to be added back and the actual production code needs to be reevaluated as well.

Specifics

  • test for "created" scenario
  • test for "deleted" scenario
  • remove anonymous function from production code
  • add error handling to production code
  • add result logging
  • add installed repo to in-memory storage

Hey, @taylormike and @forstmeier - here's a new issue for you to check out!

Build Learn/Predict backend logic methods

Description

Both the bulk Learn and the transactional Predict methods need to be implemented. They'll be called in their respective API event handlers.

Specifics

Most of the actual processing will involve "webbing" the various objects together and combining the text in a meaningful way. Some generalized stopword removal and lowercasing will also likely be needed; a sketch of those helpers follows the list below.

  • create generalized internal object w/ required JSON field tags
  • build Learn method
    • parse byte array into internal object
    • parse received feature struct pointer slice
    • construct initial/simple "webbing" logic to combine title/body text fields together
    • index completed text blobs w/ associated contributor to ES
  • build Predict method
    • parse received byte object and feature struct pointer
    • perform matching search on the ES index
    • format returned prediction scores
  • build generalized stopwords removal (see heupr/core for guidance)
  • build lowercasing helper method
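
A sketch of the text-cleanup helpers and the method shapes, with the ES index/search calls left out; the stopword list, receiver type, and signatures are stand-ins rather than the final implementation:

```go
package backend

import "strings"

// stopwords is a tiny stand-in list; the real implementation would pull a
// fuller set (see heupr/core for guidance).
var stopwords = map[string]bool{"the": true, "a": true, "and": true, "to": true}

// cleanText lowercases the combined title/body text and strips stopwords
// before the blob is indexed or searched against ES.
func cleanText(text string) string {
	var kept []string
	for _, word := range strings.Fields(strings.ToLower(text)) {
		if !stopwords[word] {
			kept = append(kept, word)
		}
	}
	return strings.Join(kept, " ")
}

// Backend is a minimal stand-in receiver for this sketch; the real struct
// lives in the backend package.
type Backend struct{}

// Learn and Predict sketch the method shapes only; parsing the payloads and
// the actual ES index/search calls are what this issue implements.
func (b *Backend) Learn(payload []byte) error {
	_ = cleanText(string(payload)) // index the cleaned blob with its contributor
	return nil
}

func (b *Backend) Predict(payload []byte) ([]byte, error) {
	_ = cleanText(string(payload)) // run the matching search and format scores
	return nil, nil
}
```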

Hey, @taylormike and @forstmeier - here's a new issue for you to check out!

Create frontend/ package GitHub client and bulk request method

Description

This is a twofold project for the frontend/ package.

The first part will be creating the required GitHub client resources for the frontend/ package so that it can interact with the API. The authenticated client should be granted sufficient but minimal access. Additionally, I still need to sort out exactly how I want to provide the client access to the user's GitHub account/resources (app keys?).

Additionally, a "bulk ingestion" method needs to be built into the frontend/. This will be needed to pull down and load in all of the necessary data that will be used to "train" whatever backend models are in place. It will need to 1) be triggerable by a request from the CLI when the Heupr app is launched and 2) be configurable to pull down only the events/objects needed for the associated backend services. A rough sketch of both pieces follows the Specifics list.

Specifics

  • GitHub frontend/ client resources
    • embedded resource in the Frontend struct
    • created with the New() method call
    • accepted arguments:
      • user credentials (authentication keys possibly - still not confirmed)
  • "bulk data retrieval" method
    • called by separate event handler on frontend server
    • event handler listens to CLI request
    • accepted arguments:
      • target owner/repo strings (or IDs if preferred)
      • specified object/events to pull down for backend model "training"
      • note: these are also possibly part of the request received by the parent handler
    • makes required request to api/ package server "learn" endpoint
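
A rough sketch of both pieces, assuming the commonly used github.com/google/go-github and golang.org/x/oauth2 libraries (the repo may settle on something else) and token-based authentication, since the app-key question is still open; pagination and the call to the api/ package's learn endpoint are omitted:

```go
package frontend

import (
	"context"

	"github.com/google/go-github/github"
	"golang.org/x/oauth2"
)

// Frontend is a stand-in for the package's struct with the client embedded;
// the real definition lives in the frontend/ package.
type Frontend struct {
	client *github.Client
}

// New builds the authenticated GitHub client; whether a token or app keys
// are used is still an open question, so a token is assumed here.
func New(ctx context.Context, token string) *Frontend {
	ts := oauth2.StaticTokenSource(&oauth2.Token{AccessToken: token})
	return &Frontend{client: github.NewClient(oauth2.NewClient(ctx, ts))}
}

// bulkIssues sketches the "bulk data retrieval" step for one object type:
// pulling down existing issues so backend models can be trained.
func (f *Frontend) bulkIssues(ctx context.Context, owner, repo string) ([]*github.Issue, error) {
	opts := &github.IssueListByRepoOptions{State: "all"}
	issues, _, err := f.client.Issues.ListByRepo(ctx, owner, repo, opts)
	return issues, err
}
```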

Hey, @taylormike and @forstmeier - here's a new issue for you to check out!

Create minimal issue assignment frontend logic

Description

In order to process issues specifically, some adjustments may need to be made to the frontend to provide an additional sub-package.

Specifics

The application will need to be able to consume issues and potentially do some minor "webbing" between resources prior to passing them into the backend sub-package for processing.

  • create "handler"
    • process individually raised/new issues
    • bulk process existing issues for indexing into ES
  • add in unit tests for handler(s)

See #20 for backend reference.

Hey, @taylormike and @forstmeier - here's a new issue for you to check out!
