rallytime / tamarack

A bot built with Tornado to automate common pull request tasks. Specifically designed for use with the SaltStack repositories.
License: Apache License 2.0
Some of the files changed in saltstack/salt#47238 are salt-ssh files, and team-ssh should have been requested to review the PR via tamarack. Look into where this is falling down and fix it.
Because of the way review_request events get triggered instead of opened events on some PRs, this action handling will occasionally re-trigger review requests on PRs, even if the team has already approved the PR.

You can see this happening here: saltstack/salt#46002

When I requested the review from Mike, it triggered a review_request event, and so the bot added team-core and team-transport back on the review list, even though those teams had already approved the PR. We need to account for this here: if a team has already approved the PR, we need to filter it out of the review request we send back to the GitHub API.
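A minimal sketch of that filtering, assuming we can get the approving logins from the PR reviews endpoint and each team's membership from the Teams API (both are passed in here as plain data, and the helper name is hypothetical):

```python
def filter_approved_teams(requested_teams, approved_logins, team_members):
    """Return only the team slugs that still need a review request.

    requested_teams: list of team slugs we were about to re-request.
    approved_logins: logins whose latest review state is APPROVED.
    team_members: mapping of team slug -> set of member logins.
    """
    still_needed = []
    for slug in requested_teams:
        members = team_members.get(slug, set())
        # If any member of this team already approved, skip the re-request.
        if not members & set(approved_logins):
            still_needed.append(slug)
    return still_needed
```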
In some circumstances, environment variables are fine to use. However, as the project grows, it would be nice to configure certain options. Before we can figure out what those options are, we need to add support for a config file in general.

A good first option to move from an environment variable to a config file would be the PORT option. We can build on it from there.
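A rough sketch of what that fallback could look like with configparser; the file name, section name, option name, and default port are all assumptions:

```python
import configparser
import os


def get_port(config_path="tamarack.conf", default=8080):
    """Read the port from a config file, falling back to the
    PORT environment variable, then to a default.

    Config file layout assumed here:
        [tamarack]
        port = 8080
    """
    parser = configparser.ConfigParser()
    parser.read(config_path)  # silently ignores a missing file
    if parser.has_option("tamarack", "port"):
        return parser.getint("tamarack", "port")
    return int(os.environ.get("PORT", default))
```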
As the title says, this project needs a proper setup.py file.
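A minimal starting point could look something like this; the version, description, and dependency values are placeholders to fill in:

```python
# setup.py -- minimal sketch; version and metadata values are placeholders.
from setuptools import setup, find_packages

setup(
    name="tamarack",
    version="0.0.1",
    description="A bot built with Tornado to automate common pull request tasks.",
    license="Apache License 2.0",
    packages=find_packages(),
    install_requires=["tornado"],
)
```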
mention-bot has not been working correctly for months now, and today they archived the project.
This is a useful service and we could still benefit from it greatly. Let's implement it here with the correct "this was inspired by" text.
The ability to comment on the given PR is already there. We just need to parse out the logic of how to determine who should be pinged in the review request comment.
We need to add the ability to comment on PRs when test results come back from Jenkins. The easiest one to start with would likely be the pylint run. If the pylint run fails and there are results available, comment on the PR to let the user know.
Something like:
@<user-name>: There is a lint failure on this pull request. The lint items must be fixed before the pull request can be merged.
<link-to-lint-violations-page>
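Building the comment body itself could be as simple as this; the helper name is made up, and the wording just mirrors the template above:

```python
def build_lint_comment(user, lint_url):
    """Format the lint-failure comment body for a PR."""
    return (
        "@{0}: There is a lint failure on this pull request. The lint items "
        "must be fixed before the pull request can be merged.\n\n"
        "{1}".format(user, lint_url)
    )
```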
Occasionally tests fail for various reasons and we don't get any results back. We should add the ability to check for test results and if they're missing, automatically re-run those tests.
The easiest ones to start with would be the docs and lint tests.
The scenario could go something like:
This depends on #5 being completed. It also would make sense to fix #3 before this issue.
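As a rough sketch, once we can list the commit statuses, finding the runs that never reported could look like this; the Jenkins context names are guesses, and the actual re-trigger mechanism still needs research:

```python
def find_missing_contexts(statuses, expected=("jenkins/pr/docs", "jenkins/pr/lint")):
    """Return the expected status contexts that never reported a result.

    statuses: the list of status dicts from the commit statuses endpoint.
    expected: the contexts we require before a PR can merge (names assumed).
    """
    reported = {status["context"] for status in statuses}
    return [ctx for ctx in expected if ctx not in reported]
```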
Originally, GitHub was still working out the API for requesting team reviews. I no longer see the warnings about this feature being in development mode on the API docs, so I need to add this ability. It should be easy to add.
We can keep the comment functionality that is currently there, but it is probably just better to auto-request team reviewers instead.
Relevant API docs: https://developer.github.com/v3/pulls/review_requests/#create-a-review-request (see the team_reviewers input).
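Building the request body for that endpoint could look like this; per the linked docs, team_reviewers takes team slugs, and the helper name here is made up:

```python
def build_review_request(team_slugs, reviewers=None):
    """Build the JSON body for
    POST /repos/:owner/:repo/pulls/:number/requested_reviewers.

    team_slugs: team slugs for the team_reviewers input.
    reviewers: optional individual user logins to request as well.
    """
    return {
        "reviewers": list(reviewers or []),
        "team_reviewers": list(team_slugs),
    }
```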
Research how Jenkins sends information from test runs back to GitHub as the results for test runs come in and start coding the building blocks for reacting to Jenkins events.
To get this up and running, it was easy to just print output to the console. Before this project gets too much bigger, we should convert these statements to use a log handler.
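A minimal sketch of that handler setup; the logger name and format string are just reasonable defaults:

```python
import logging

log = logging.getLogger("tamarack")


def setup_logging(level=logging.INFO):
    """One-time handler setup; call once at startup, then replace
    bare print() calls with log.info()/log.error() etc."""
    handler = logging.StreamHandler()
    handler.setFormatter(
        logging.Formatter("%(asctime)s [%(levelname)s] %(name)s: %(message)s")
    )
    log.addHandler(handler)
    log.setLevel(level)
```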
We want to be able to simply notify people when a new branch is created in the repo. This can be simple, using Slack's Incoming Webhooks.
Workflow:
Tasks:
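For the notification itself, the Incoming Webhook payload can be as simple as a text field; the wording here is just a suggestion, and the payload would be POSTed as JSON to the configured webhook URL:

```python
def build_branch_notification(repo, branch):
    """Build the Slack Incoming Webhook payload announcing a new branch."""
    return {"text": "A new branch `{0}` was created in {1}.".format(branch, repo)}
```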
The SUSE folks want to help review any type of PRs that the core team is requested to review. Rather than duplicating everything in the CODEOWNERS file, just handle that programmatically here.
We don't need to request reviews from all teams on merge-forwards. GitHub will request a review from team-core automatically because of the CODEOWNERS file and the team permissions, but we can skip these types of PRs for the other teams.
Easiest thing to do would be to add a check to search for "Merge forward" in the title of the PR, like we used to do for mention bot.
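That check could be as small as this; making it case-insensitive is my own addition, to be safe:

```python
def is_merge_forward(pr_title):
    """Mirror the old mention-bot behavior: treat any PR whose title
    contains "Merge forward" (case-insensitive) as a merge-forward."""
    return "merge forward" in pr_title.lower()
```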
Some unit tests need to be created for this repo so we can get started on the right foot. Let's use pytest ;)
@gtmanfred - Let's do this! We can chat about what this might look like moving forward if you like.
I will work on getting some docs together for this repo as well. But for now, the python parts should be able to have some tests running quickly. Then we can make a plan for getting tests running on PRs using travis or jenkins or whatever you like for now.
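A first test could look something like this; is_merge_forward here is just a stand-in for whichever helper we end up testing first:

```python
# test_tamarack.py -- a first pytest-style test (run with `pytest`).


def is_merge_forward(pr_title):
    """Stand-in for the real helper under test."""
    return "merge forward" in pr_title.lower()


def test_is_merge_forward():
    assert is_merge_forward("Merge forward from 2017.7")
    assert not is_merge_forward("Fix typo in docs")
```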
If a PR is merged and causes another PR to have a merge conflict, we should have the bot comment on the PR and ask the user to fix it and rebase.
Is there an event from GitHub that we can react to for this? If yes, this is easiest.
If no, we could poll through the list of PRs and comment. However, we need a mechanism for if the bot has already notified the user of the conflict. (Sort of like how stale-bot uses the "stale" label to track what issues are newly stale and what issues should be closed.)
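If we do end up polling, the decision logic could look like this; the merge-conflict tracking label is an assumption, borrowed from the stale-bot pattern mentioned above, and `mergeable` is the field the PR API payload reports:

```python
def needs_conflict_comment(pull_request, labels):
    """Decide whether to post the rebase comment: the PR reports a merge
    conflict (mergeable is False) and we have not already flagged it
    with the (assumed) "merge-conflict" tracking label."""
    return pull_request.get("mergeable") is False and "merge-conflict" not in labels
```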
There will be a lot of docs to add, but here's a list to get started:
Now that it's possible to assign reviewers directly on PRs, the GitHub webhook payload has changed as well. Let's leverage this instead of pulling down the CODEOWNERS file and parsing it. That's extra work we don't need to do.
The team payload looks something like this:

```json
{
    "action": "review_requested",
    "number": 45897,
    "pull_request": {
        "url": "https://api.github.com/repos/saltstack/salt/pulls/45897",
        "id": 167730763,
        <snipped>
        "requested_reviewers": [],
        "requested_teams": [
            {
                "name": "team-boto",
                "id": foo,
                "slug": "team-boto",
                "description": "Boto Reviewers",
                <snipped>
            }
        ],
        <snipped>
    }
}
```
So, let's use that list of requested_teams, figure out what requested_reviewers might look like, and handle that as well.
The way we're doing this currently is fine and it works, but we can cut the number of API calls in half if we just use the data in the given webhook payload.
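Pulling the slugs straight out of the payload shown above could be as simple as:

```python
def get_requested_team_slugs(payload):
    """Extract team slugs from a pull_request webhook payload's
    requested_teams list, instead of re-fetching and parsing CODEOWNERS."""
    return [team["slug"] for team in payload.get("requested_teams", [])]
```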
Use this issue to post comments to for automated testing. This issue can largely be ignored.