
Gather metrics on issues/prs/discussions such as time to first response, count of issues opened, closed, etc.

Home Page: https://github.blog/2023-07-19-metrics-for-issues-pull-requests-and-discussions/

License: MIT License



Issue Metrics Action


This is a GitHub Action that searches for issues, pull requests, and discussions in a repository, measures several metrics, and generates a report in the form of a GitHub issue. The issues/pull requests/discussions to search for can be filtered using a search query.

This action was developed by the GitHub OSPO for our internal use and is open-sourced in case it benefits you as well. Feel free to ask about its usage by creating an issue in this repository.

Available Metrics

| Metric | Description |
| --- | --- |
| Time to First Response | The duration from creation to the initial comment or review.* |
| Time to Close | The period from creation to closure.* |
| Time to Answer (Discussions Only) | The time from creation to an answer. |
| Time in Label | The duration from label application to removal. Requires the LABELS_TO_MEASURE env variable. |

*For pull requests, these metrics exclude the time the PR was in draft mode.

*For issues and pull requests, comments by the issue/pull request authors and comments by bots are excluded.

To find syntax for search queries, check out the documentation on searching issues and pull requests or searching discussions.

Sample Report

The output of this action is a report in the form of a GitHub issue. Below is a sample of such an issue.

Sample GitHub issue created by the issue/metrics GitHub Action

Getting Started

Create a workflow file (e.g., .github/workflows/issue-metrics.yml) in your repository with the following contents:

Note: repo:owner/repo is the repository you want to measure metrics on

name: Monthly issue metrics
on:
  workflow_dispatch:
  schedule:
    - cron: '3 2 1 * *'

permissions:
  contents: read

jobs:
  build:
    name: issue metrics
    runs-on: ubuntu-latest
    permissions:
      issues: write
      pull-requests: read
    steps:
    - name: Get dates for last month
      shell: bash
      run: |
        # Calculate the first day of the previous month
        first_day=$(date -d "last month" +%Y-%m-01)

        # Calculate the last day of the previous month
        last_day=$(date -d "$first_day +1 month -1 day" +%Y-%m-%d)

        # Set an environment variable with the date range
        echo "$first_day..$last_day"
        echo "last_month=$first_day..$last_day" >> "$GITHUB_ENV"

    - name: Run issue-metrics tool
      uses: github/issue-metrics@v3
      env:
        GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        SEARCH_QUERY: 'repo:owner/repo is:issue created:${{ env.last_month }} -reason:"not planned"'

    - name: Create issue
      uses: peter-evans/create-issue-from-file@v5
      with:
        title: Monthly issue metrics report
        token: ${{ secrets.GITHUB_TOKEN }}
        content-filepath: ./issue_metrics.md

Example use cases

  • As a maintainer, I want to see metrics for issues and pull requests on the repository I maintain in order to ensure I am giving them the proper amount of attention.
  • As a first responder on a repository, I want to ensure that users are getting contact from me in a reasonable amount of time.
  • As an OSPO, I want to see how many open source repository requests are open/closed, and metrics for how long it takes to get through the open source process.
  • As a product development team, I want to see metrics around how long pull request reviews are taking, so that we can reflect on that data during retrospectives.

Support

If you need support using this project or have questions about it, please open an issue in this repository. Requests made directly to GitHub staff or the support team will be redirected here to open an issue. GitHub SLAs and support/services contracts do not apply to this repository.

OSPO GitHub Actions as a Whole

All feedback regarding our GitHub Actions, as a whole, should be communicated through issues on our github-ospo repository.

Use as a GitHub Action

  1. Create a repository to host this GitHub Action or select an existing repository. This is easiest if it is the same repository as the one you want to measure metrics on.
  2. Select a best fit workflow file from the examples directory for your use case.
  3. Copy that example into your repository (from step 1), into the proper directory for GitHub Actions: .github/workflows/, with the file extension .yml (e.g., .github/workflows/issue-metrics.yml)
  4. Edit the values (SEARCH_QUERY, assignees) from the sample workflow with your information. See the SEARCH_QUERY section for more information on how to configure the search query.
  5. If you are running metrics on a repository other than the one where the workflow file is going to be, then update the value of GH_TOKEN.
    • Do this by creating a GitHub API token with permissions to read the repository and write issues.
    • Then take the value of the API token you just created and create a repository secret whose name is GH_TOKEN and whose value is the API token.
    • Then finally update the workflow file to use that repository secret by changing GH_TOKEN: ${{ secrets.GITHUB_TOKEN }} to GH_TOKEN: ${{ secrets.GH_TOKEN }}. The name of the secret can be anything; it just needs to match between when you create the secret and when you refer to it in the workflow file.
  6. If you want the resulting issue with the metrics in it to appear in a repository other than the one the workflow file runs in, update the line token: ${{ secrets.GITHUB_TOKEN }} with your own GitHub API token stored as a repository secret.
    • This process is the same as described in the step above. More info on creating secrets can be found here.
  7. Commit the workflow file to the default branch (often master or main)
  8. Wait for the action to trigger based on the schedule entry or manually trigger the workflow as shown in the documentation.
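To illustrate step 5, the only workflow change is the secret name referenced in the env block (the repository and date range below are placeholders):

```yaml
    - name: Run issue-metrics tool
      uses: github/issue-metrics@v3
      env:
        # GH_TOKEN here is the repository secret you created in step 5
        GH_TOKEN: ${{ secrets.GH_TOKEN }}
        SEARCH_QUERY: 'repo:other-owner/other-repo is:issue created:2024-01-01..2024-01-31'
```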

Configuration

Below are the allowed configuration options:

Authentication

This action can be configured to authenticate with a GitHub App Installation or a Personal Access Token (PAT). If all configuration options are provided, the GitHub App Installation configuration takes precedence. You can choose one of the following methods to authenticate:

GitHub App Installation

| field | required | default | description |
| --- | --- | --- | --- |
| GH_APP_ID | True | "" | GitHub Application ID. See documentation for more details. |
| GH_APP_INSTALLATION_ID | True | "" | GitHub Application Installation ID. See documentation for more details. |
| GH_APP_PRIVATE_KEY | True | "" | GitHub Application Private Key. See documentation for more details. |

Personal Access Token (PAT)

| field | required | default | description |
| --- | --- | --- | --- |
| GH_TOKEN | True | "" | The GitHub token used to scan the repository. Must have read access to all repositories you are interested in scanning. |

Other Configuration Options

| field | required | default | description |
| --- | --- | --- | --- |
| GH_ENTERPRISE_URL | False | "" | URL of the GitHub Enterprise instance to use for auth instead of github.com. |
| HIDE_AUTHOR | False | False | If set to true, the author will not be displayed in the generated Markdown file. |
| HIDE_LABEL_METRICS | False | False | If set to true, the time-in-label metrics will not be displayed in the generated Markdown file. |
| HIDE_TIME_TO_ANSWER | False | False | If set to true, the time to answer a discussion will not be displayed in the generated Markdown file. |
| HIDE_TIME_TO_CLOSE | False | False | If set to true, the time to close will not be displayed in the generated Markdown file. |
| HIDE_TIME_TO_FIRST_RESPONSE | False | False | If set to true, the time to first response will not be displayed in the generated Markdown file. |
| IGNORE_USERS | False | False | A comma-separated list of users to ignore when calculating metrics (e.g., IGNORE_USERS: 'user1,user2'). To ignore bots, append [bot] to the user (e.g., IGNORE_USERS: 'github-actions[bot]'). |
| ENABLE_MENTOR_COUNT | False | False | If set to 'TRUE', counts the number of comments users left on discussions, issues, and PRs, and displays the number of active mentors. |
| MIN_MENTOR_COMMENTS | False | 10 | Minimum number of comments to count as a mentor. |
| MAX_COMMENTS_EVAL | False | 20 | Maximum number of comments per thread to evaluate for mentor stats. |
| HEAVILY_INVOLVED_CUTOFF | False | 3 | Cutoff after which a mentor's comments in one issue are no longer counted against their total score. |
| LABELS_TO_MEASURE | False | "" | A comma-separated list of labels for which to measure how long each label is applied. If not provided, no label durations will be measured. Not compatible with discussions at this time. |
| SEARCH_QUERY | True | "" | The query by which to filter issues/PRs; must contain a repo:, org:, owner:, or user: entry. For discussions, include type:discussions in the query. |
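As an illustrative (not prescriptive) combination of these options, a step's env block might look like this; the repository, labels, and date range are placeholders:

```yaml
      env:
        GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        SEARCH_QUERY: 'repo:owner/repo is:issue created:2024-01-01..2024-01-31'
        LABELS_TO_MEASURE: 'bug,help wanted'
        HIDE_AUTHOR: 'true'
        IGNORE_USERS: 'github-actions[bot]'
```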

Further Documentation

Contributions

We would ❤️ contributions to improve this action. Please see CONTRIBUTING.md for how to get involved.

License

MIT

More OSPO Tools

Looking for more resources for your open source program office (OSPO)? Check out the github-ospo repository for a variety of tools designed to support your needs.


Contributors

advik-b, ananta, ashleywolf, azamb, chrheg, dependabot[bot], eichisanden, id, jmeridth, kzk-maeda, lawang24, mainec, martincostello, mihirkohli, okabe-junya, parkerbxyz, pressxtochris, rajveer43, shanemalachow, shanemalachow-sl, smstone, spier, zkoppert


Issues

I've got "The search query is invalid; Check the search query."

I can't run the action because of this error.

Run github/issue-metrics@v2
/usr/bin/docker run --name ghcriogithubissue_metricsv2_770918 --label ef7d85 --workdir /github/workspace --rm -e "GH_TOKEN" -e "SEARCH_QUERY" -e "HOME" -e "GITHUB_JOB" -e "GITHUB_REF" -e "GITHUB_SHA" -e "GITHUB_REPOSITORY" -e "GITHUB_REPOSITORY_OWNER" -e "GITHUB_REPOSITORY_OWNER_ID" -e "GITHUB_RUN_ID" -e "GITHUB_RUN_NUMBER" -e "GITHUB_RETENTION_DAYS" -e "GITHUB_RUN_ATTEMPT" -e "GITHUB_REPOSITORY_ID" -e "GITHUB_ACTOR_ID" -e "GITHUB_ACTOR" -e "GITHUB_TRIGGERING_ACTOR" -e "GITHUB_WORKFLOW" -e "GITHUB_HEAD_REF" -e "GITHUB_BASE_REF" -e "GITHUB_EVENT_NAME" -e "GITHUB_SERVER_URL" -e "GITHUB_API_URL" -e "GITHUB_GRAPHQL_URL" -e "GITHUB_REF_NAME" -e "GITHUB_REF_PROTECTED" -e "GITHUB_REF_TYPE" -e "GITHUB_WORKFLOW_REF" -e "GITHUB_WORKFLOW_SHA" -e "GITHUB_WORKSPACE" -e "GITHUB_ACTION" -e "GITHUB_EVENT_PATH" -e "GITHUB_ACTION_REPOSITORY" -e "GITHUB_ACTION_REF" -e "GITHUB_PATH" -e "GITHUB_ENV" -e "GITHUB_STEP_SUMMARY" -e "GITHUB_STATE" -e "GITHUB_OUTPUT" -e "RUNNER_OS" -e "RUNNER_ARCH" -e "RUNNER_NAME" -e "RUNNER_ENVIRONMENT" -e "RUNNER_TOOL_CACHE" -e "RUNNER_TEMP" -e "RUNNER_WORKSPACE" -e "ACTIONS_RUNTIME_URL" -e "ACTIONS_RUNTIME_TOKEN" -e "ACTIONS_CACHE_URL" -e GITHUB_ACTIONS=true -e CI=true -v "/var/run/docker.sock":"/var/run/docker.sock" -v "/home/runner/work/_temp/_github_home":"/github/home" -v "/home/runner/work/_temp/_github_workflow":"/github/workflow" -v "/home/runner/work/_temp/_runner_file_commands":"/github/file_commands" -v "/home/runner/work/twioku/twioku":"/github/workspace" ghcr.io/github/issue_metrics:v2
Starting issue-metrics search...
Searching for issues...
The search query is invalid; Check the search query.

I see this error in the action's log.

This is my workflow file.

name: Monthly issue metrics
on:
  workflow_dispatch:

permissions:
  issues: write
  pull-requests: write

jobs:
  build:
    name: issue metrics
    runs-on: ubuntu-latest

    steps:

    - name: Run issue-metrics tool
      uses: github/issue-metrics@v2
      env:
        GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        SEARCH_QUERY: 'repo:owner/repo created:2023-05-01..2023-05-31'

    - name: Create issue
      uses: peter-evans/create-issue-from-file@v4
      with:
        title: Monthly issue metrics report
        token: ${{ secrets.GITHUB_TOKEN }}
        content-filepath: ./issue_metrics.md
        assignees: theMistletoe

Maybe I wrote SEARCH_QUERY incorrectly in the workflow, but I can't figure out how to fix it.

Please tell me what mistake or misunderstanding I have here.

Better install instructions

Can we get some better installation instructions?

I'm struggling to understand what exactly I need to do in order to use this action.

Step 1 states "Create a repository to host this GitHub Action or select an existing repository."

Do I fork this repo? Clone it and then upload it to my own account? Or if I select a pre-existing repo, is it the one I want to be scanned?

Step 2 "Create the env values from the sample workflow below (GH_TOKEN, SEARCH_QUERY) with your information as repository secrets. More info on creating secrets can be found here. Note: Your GitHub token will need to have read access to the repository in the organization that you want evaluated"

Where do I put these env vars? I see an .env-example file, so I guess I need an .env file, but where would it go? In the repo I want scanned, or in some other repo (see my issues with step 1)?

Refactor: Move time window from search query to env variable

This will allow the application to do multiple searches based on issues closed and issues opened during the given time window. This could allow for taking multiple search passes so that searching for issues opened or closed in a time window can be seen in a single report.

Integration with V2 Projects

Firstly, thanks for this action, it's extremely useful.

However, I've been very excited to move our PR and issue management to V2 projects because I think that project statuses will, over time, be a better mechanism for managing issue workflow than labels. Here's the board we use for managing incoming contributions, to help understand our workflow. We want to measure each step to detect slowdowns and optimize flow.

Do you anticipate that this action will support "time in status" for V2 projects?

I don't think this works today, so perhaps supporting this is currently blocked?

Add ability to set/compare to metric goals

If a team wants to keep their time to first comment or other metric under a certain goal such as 3 days, it would be cool to configure that and put red/green/yellow status emojis in the report to indicate on/off target.

Table columns break when PR title contains vertical bar ( | )

For example:

| Title | URL | Time to first response | Time to close | Time to answer |
| --- | --- | --- | --- | --- |
| FEED-1234 | Implement sorting | https://github.com/foo/bar/pull/6813 | 0:30:06 | None | None |

Above text results in "Implement sorting" shown under URL column
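One possible fix, sketched here rather than taken from the action's code, is to escape pipes before a title is placed in a Markdown table cell:

```python
def escape_md_cell(text: str) -> str:
    """Escape the vertical bar so it cannot terminate a table cell."""
    return text.replace("|", "\\|")

# A title containing "|" now stays within a single column:
row = "| {} | {} |".format(
    escape_md_cell("FEED-1234 | Implement sorting"),
    "https://github.com/foo/bar/pull/6813",
)
```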

Authentication fails on GitHub Enterprise Server

It looks like instances of GitHub Enterprise Server aren't handled properly with regard to authentication and queries against the API. When authenticating to GitHub Enterprise Server installations with github3, you need to specifically use the GitHubEnterprise object and pass in the URL of the server, and then use that object for the rest of the calls. It looks like the code in

def auth_to_github() -> github3.GitHub:
only really uses the default object, which will interact with the GitHub.com API. The environment variables should be able to be used to determine whether this is a GitHub.com workflow or an enterprise server workflow, specifically GITHUB_SERVER_URL which is already passed into the Docker environment for the action.
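A stdlib-only sketch of that idea, assuming the documented GHES convention that the REST API lives under <server>/api/v3 (the actual github3 auth call is omitted):

```python
import os

def api_base_url() -> str:
    """Pick the REST API base URL from GITHUB_SERVER_URL.

    github.com uses api.github.com; a GitHub Enterprise Server
    instance serves its REST API under <server>/api/v3.
    """
    server = os.environ.get("GITHUB_SERVER_URL", "https://github.com").rstrip("/")
    if server == "https://github.com":
        return "https://api.github.com"
    return server + "/api/v3"
```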

Discussions are limited to 100 results

It seems some folks might want this limit increased so they can measure metrics on more than 100 discussions. We could either implement a more reasonable limit or figure out how to handle paging up to the limit of GitHub search results.

Assign members of a GitHub team in addition to individuals

First of all, thank you very much for open sourcing this action. It has potential to replace one of our internally homegrown workflows.

Here's my request:
Instead of assigning myself to the issue, I'd like the action to support setting a GitHub team handle as assignee and assign the members of that team instead.

Monthly contributor report

Feature: Filter Out CI User Responses from Time to First Response

Hi! Awesome tool, I could definitely see us using it.

One thing that makes it somewhat hard to use (at least for our org at the moment, which is why I thought the metrics were wrong in that other issue :) ) is that we have CI users that respond to PRs. This skews the time-to-response calculations, since the CI users are usually quick to respond and thus beat the human response most of the time. I've tested by removing the first_comment_time from this line and re-running to get the actual time to first response (at least for us, assuming that a review comment is the first response is an acceptable assumption; I understand that isn't always true for everyone who would use this tool.)

My proposed solution would be to add a list of users whose comments should be ignored, but I'm not really a Python person and can't tell how easy that would be to implement, since it looks like the tool currently just pulls the first comment from each issue, PR, or discussion.

Is repo required?

The README states that SEARCH_QUERY "must contain a repo: entry or an org: entry," but when supplying only an org, an error occurs:

ValueError: The search query for issues/prs must include a repository owner and name

Is this expected?

Feature: Report time-quantiles

In the generated report, I currently see metrics such as these:

| Average time to first response | 22:26:39.529412 |
| Average time to close | 3 days, 17:21:22.066667 |
| Average time to answer | None |

While averages are helpful, they can be biased by outliers. For example, a PR that took forever to close skews the reported average for the entire team, making it an unfair representation of how the team works. Skew occurs far more often than one may imagine, so I propose introducing quantiles, specifically the median, P80, P95, and P99.

Better yet, configuration-driven time metrics, letting the user choose what kind of time-reporting they need.
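A minimal sketch of what quantile reporting could compute, using only the standard library (durations as timedeltas; P80/P95 taken from `statistics.quantiles`, which needs at least two data points):

```python
import statistics
from datetime import timedelta

def time_stats(durations: list[timedelta]) -> dict[str, timedelta]:
    """Median, P80, and P95 of a list of durations."""
    seconds = sorted(d.total_seconds() for d in durations)
    cuts = statistics.quantiles(seconds, n=100)  # 99 percentile cut points
    return {
        "median": timedelta(seconds=statistics.median(seconds)),
        "p80": timedelta(seconds=cuts[79]),
        "p95": timedelta(seconds=cuts[94]),
    }
```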

feat: README example that demonstrates multiple search queries with 1 report


Use case: I want to be able to run different searches in different repositories and report on the metrics in a single markdown report
Use case: I want to be able to see both open and closed issues over a certain time period

This could look like feeding the action a list of SEARCH_QUERYs, or a workflow file that ties together several JSON outputs to create a single Markdown report.

"Who are we helping" metric

Some InnerSource projects essentially provide platform capabilities for the organisation, meaning that they "only" have internal teams as customers. Understanding and talking about the customer value and business value of these platforms sometimes can be tricky.

With a bit of a workaround though it becomes easier: Look at issues and PRs created and check which team (filtered down to e.g. product teams with external customers) the reporter is assigned to. Then provide aggregate information on how many times each of these teams was unblocked by solved issues and how many times such teams were able to unblock themselves with PRs.

Is the generated table too long?

Due to the addition of features, the generated table has become long. Being able to collect many metrics is good, but if the table is too long, it becomes difficult to read.

example

| Title | URL | Author | Time to first response | Time to close | Time to answer | Time spent in waiting-for-review | Time spent in waiting-for-manager |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Monthly issue metrics report | https://github.com/Okabe-Junya/sandbox/issues/249 | github-actions[bot] | None | None | None | None | None |

Original post

When putting these tables into an issue, it can be hard to read a wide table with lots of entries, which is why I think this should be configurable. Any thoughts on that?

Originally posted by @zkoppert in #129 (comment)

`make test` doesn't work

The tests pass, but it seems like make test is not working properly.

make test result

$ make test
pytest -v --cov=. --cov-config=.coveragerc --cov-fail-under=80 --cov-report term-missing
ERROR: usage: pytest [options] [file_or_dir] [file_or_dir] [...]
pytest: error: unrecognized arguments: --cov=. --cov-config=.coveragerc --cov-fail-under=80 --cov-report
  inifile: None
  rootdir: /Users/junyaokabe/workspace/issue-metrics

make: *** [test] Error 4

pytest result

$ pytest
======================================================================== test session starts =========================================================================
platform darwin -- Python 3.11.0, pytest-7.4.0, pluggy-1.2.0
rootdir: /Users/junyaokabe/workspace/issue-metrics
collected 49 items                                                                                                                                                   

...

========================================================================= 49 passed in 0.17s =========================================================================

Originally posted by @Okabe-Junya in #129 (comment)

Specifying exception names in the overgeneral-exceptions

pylint: Command line or configuration file:1: UserWarning: Specifying exception names in the overgeneral-exceptions option without module name is deprecated and support for it will be removed in pylint 3.0. Use fully qualified name (maybe 'builtins.Exception' ?) instead.

Option to include number of very active reviewers in report

One metric that we found helpful at $dayjob was looking at how many people there are that provide the bulk of mentorship for contributors. If that drops below a certain line (2-3 people minimum) it typically means the project is in danger.

Technically what we do is simple: Compute the number of people that provide at least $n comments over time period $x.

Is there any way to retrieve that kind of information in a/this GitHub action?
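This has since landed as the ENABLE_MENTOR_COUNT / MIN_MENTOR_COMMENTS options described in the configuration table above; the underlying computation can be sketched as follows (the flat list of comment authors is a hypothetical data shape):

```python
from collections import Counter

def active_mentors(comment_authors: list[str], min_comments: int = 10) -> list[str]:
    """Users who left at least `min_comments` comments, most active first."""
    counts = Counter(comment_authors)
    return [user for user, n in counts.most_common() if n >= min_comments]
```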

Extended make lint Execution Time

What's the issue?

When I code for this project and run make lint before committing, it takes a long time.

Why is this happening?

I've set up the development environment for this repo using venv. Therefore, there's a venv directory (which isn't being tracked by git, by the way). Tools like flake8 and pylint check this directory, resulting in the extended execution time.

How should we resolve this?

Modifying the Makefile or .pylintrc could likely address this issue.

Future considerations (not addressed here)

While we currently pass options as arguments when executing flake8, we might also consider using a configuration file like pyproject.toml.

Exclude the issue owner's own comment from first response

I think comments by the issue's own author should be excluded from Time to first response,

because authors often add something they forgot to write in the body of the issue, or post additional comments to supplement the body (like #26).

We could make this the default behavior, but if we want to be more cautious, we might add an option like IGNORE_OWNER_COMMENT.

Search query for PRs returns metrics for issues

I have a query "repo:owner/repo is:pr created:${{ env.last_month }}" that I run after the same query, but for issues.

However, the issue created for this query is exactly the same as the one for issues. No PRs are mentioned in the issue metrics text.

Measuring discussions

It would be great to be able to measure all these metrics for discussions but this currently returns no results.

Formatting of stats output

The new stats features introduced by #127 made two changes to the output that might not have been intended.

I marked them in the screenshot below, and will try my best to describe them in text as well.

Label of 2nd column

Previously, the label of the 2nd column of the stats table was "Value".
Now it is "Average".

However "Average" is not correct for rows 4-6 in the table, as those are just counts.

I don't have a great suggestion here. One could split up the table and introduce a new header for rows 4-6.
Not sure how future-proof that would be either though.

Milliseconds in average/median/90th-percentile

The stats in the first 3 rows contain the milliseconds.

They reduce readability a bit, and I doubt that many users need them.
My recommendation would be to remove the milliseconds.

One might even argue that for most purposes the seconds are not needed either, but that might be a matter of preference.

Backwards compatibility?

@zkoppert had mentioned that introducing a new configuration option for the new stats would be good.

I guess that since the 2nd row still contains the average (as before), that idea was dropped?

Screenshot


update wording on report for "issues"

The output says "Number of issues that remain open", "Number of issues closed", and "Total number of issues created" even when the items could be discussions or PRs. We should update the language to say "items open/closed/created" instead of "issues open/closed/created" to better reflect what they are.

Once that is updated in the code, the sample output in the README.md will also need to be updated.

[docs] Simple example in main README?

I noticed that after changes in #139 the main README.md does not contain a simple workflow example any more.
I believe that this file is also used for the documentation on the GHA marketplace?

When trying out a new GitHub Action, I tend to copy and paste a simple example from the documentation, and then adapt it to my purposes from there.

To do that, I find it inconvenient if such a workflow example is more than a few clicks away.

I would therefore suggest leaving at least a basic example in the main README.md.

Not sure which example workflow would be the simplest to start with? Maybe this one?
https://github.com/github/issue-metrics/blob/main/docs/example-workflows.md#fixed-time-example

Feature: Create metrics using project status of an issue

It would be a good idea to be able to generate a report that uses the status of the issue in a project instead of labels. This would help generate, for example, bug-resolution progress reports with the time the bug has spent in each phase of its resolution.

| | Open | In progress | Resolved | To test | Tested |
| --- | --- | --- | --- | --- | --- |
| Issue 1 | 00:22:32 | 5:23:00 | | | |
| Issue 2 | 01:34:23 | 2:33:34 | 00:45:44 | | |
| Issue 3 | | | | | |

Combining multiple reports into a single GitHub issue

In InnerSourceCommons/InnerSourcePatterns#599 I learned that it is already possible to merge multiple reports into a single report by applying a smart concatenation of multiple GHA steps. Very cool!

Here is an example of what the resulting combined report looks like:
InnerSourceCommons/InnerSourcePatterns#601

I had some possible improvement ideas, related to this "combined reports" scenario:

Custom report title

The top-level title in the GitHub issue is always "Issue Metrics" right now. That is not really a problem if the GitHub issue only contains a single report, as the title of the GitHub issue itself can be used to provide a customized description of what the report is about.

However when a single GitHub issue contains multiple reports, this becomes more tricky.

Therefore we could add a configuration value REPORT_TITLE, roughly like this:

| field | required | default | description |
| --- | --- | --- | --- |
| REPORT_TITLE | false | "Issue Metrics" | A custom title that will be printed at the very top of the report. Note: most useful when combining multiple reports into a single issue. |

Custom output file

The issue_metrics GHA always writes to the file issue_metrics.md.

When combining multiple reports into a single issue it would be helpful to write to different filenames, so that those files can then be concatenated into a single file, which is then written to a GitHub issue.

| field | required | default | description |
| --- | --- | --- | --- |
| OUTPUT_FILE | false | issue_metrics.md | A custom output file that the report will be written to. Note: most useful when combining multiple reports into a single issue. |

Always print the title and the search query

As shown in InnerSourceCommons/InnerSourcePatterns#601, an empty report will only say "no issues found for the given search criteria".

It would be helpful to always print the title, as well as the search query that was used.

That helps with debugging and identifying what the specific report was meant to be about.


Sample report

Issue Metrics

| Metric | Value |
| --- | --- |
| Average time to first response | 0:50:44.666667 |
| Average time to close | 6 days, 7:08:52 |
| Average time to answer | 1 day |
| Number of items that remain open | 2 |
| Number of items closed | 1 |
| Total number of items created | 3 |

| Title | URL | Time to first response | Time to close | Time to answer |
| --- | --- | --- | --- | --- |
| Discussion Title 1 | https://github.com/user/repo/discussions/1 | 0:00:41 | 6 days, 7:08:52 | 1 day |
| Pull Request Title 2 | https://github.com/user/repo/pulls/2 | 0:05:26 | None | None |
| Issue Title 3 | https://github.com/user/repo/issues/3 | 2:26:07 | None | None |

README is really long

The README could be improved by having collapsible sections or making the examples collapsible. Maybe splitting some of the sections out into separate files that are linked from the README would be good too?

Typo in README.md

In the table describing the different actions, in the "Time in label" row, "applied" is misspelled as "appplied."

Add author line for result in pull-requests

I would be really happy to see an author column in the pull-request results.

Current:

| Title | URL | Time to first response | Time to close | Time to answer |
| --- | --- | --- | --- | --- |
| Hoge | url | 0:01:55 | 0:56:43 | None |

Expected:

| Title | URL | Author | Time to first response | Time to close | Time to answer |
| --- | --- | --- | --- | --- | --- |
| Hoge | url | @ostk0069 | 0:01:55 | 0:56:43 | None |
