
gitlab_stats's Introduction

Gitlab-stats

[Badges: GitHub, PyPI version, GitLab, Python, Build Status, codecov, Codacy]

Connects to the GitLab API and generates a report based on the pipeline builds. It creates a report for the pipelines of the last two weeks (on the assumption that there are fewer than 100 pushes per two weeks).

Installation

Install via pip using:
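Assuming the package is published on PyPI as gitlab_stats, that would be:

pip install gitlab_stats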

Local install with pip3:
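Assuming you are at the root of a cloned copy of this repository:

pip3 install .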

In order to make it work:

  • Create a GITLAB_TOKEN env variable with your access token.

Get the project ID

For the script to work, you will need the project ID of your GitLab project. It is a unique ID used by the GitLab REST API to identify your project and retrieve its information.
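For illustration, this is the ID the GitLab REST API expects in its project endpoints (the gitlab.com URL and the <project_id> placeholder below are examples, not values from this project):

curl --header "PRIVATE-TOKEN: $GITLAB_TOKEN" "https://gitlab.com/api/v4/projects/<project_id>"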

Get it in [your project] > Settings > General > General project settings

[Screenshot: General project settings]

How to use

Once installed, you should be able to run it like this:
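For example, with the project ID found above (the URL is a placeholder for your own GitLab instance):

gitlab_stats <ID>
gitlab_stats <ID> -u "http://my-gitlab.com"  # point the tool at your own GitLab server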

For the full list of options, run gitlab_stats -h.

You can save your proxy in the HTTP_PROXY environment variable and your GitLab URL in GITLAB_URL.
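For example, in bash (the URLs are placeholders):

export HTTP_PROXY="http://my-proxy.com"
export GITLAB_URL="http://my-gitlab.com"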

Docker

To build the Docker image, use:
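A minimal sketch, assuming the Dockerfile sits at the repository root (the image tag is an arbitrary choice):

docker build -t gitlab_stats .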

To run the Docker container, use:
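A minimal sketch, assuming the image entrypoint is the gitlab_stats command (the image tag and <ID> are placeholders):

docker run -e GITLAB_TOKEN="$GITLAB_TOKEN" gitlab_stats <ID>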

Important Note:

You will need an access token set up as an environment variable to reach your GitLab instance.

To get an access token based on your personal credentials, go to your GitLab server: [Account] -> [Settings] -> [Access Tokens]

[Screenshot: Access Tokens settings]

Then give it a name and click Create personal access token.

Save this token somewhere safe, then in bash:
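For example (the value is a placeholder for your own token):

export GITLAB_TOKEN="<your_access_token>"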

gitlab_stats's People

Contributors

anderslindho, sylhare


gitlab_stats's Issues

Add list of projects per page with the CLI

In order to see the project ID and the project name, it would be nice to do something like:

gitlab_stats projects  # for first page

This would return the first page of 100 projects; with that we could grep for the project we want and get its ID. If there are multiple pages, we could do something like:

gitlab_stats projects 2  # for page 2

flaky test?

I have detected a failed flakefinder run on test_utils.py with the pytest plugin flakefinder; you can find the details of this plugin at:
https://github.com/dropbox/pytest-flakefinder/tree/master/docs
The detailed issue:
Seemingly, project_info is being polluted when flakefinder reruns the test several times.
A snapshot of the error:


project_info = {'duration_in_minutes': '0 min 59s', 'duration_moy': 58.9, 'id': 4895805, 'name': 'integration-tests', ...}

    def get_success_percentage(project_info):
>       success = [pipeline['status'] for pipeline in project_info['pipelines']]
E       KeyError: 'pipelines'

../gitlab_stats/utils.py:62: KeyError
___________________________________________ UtilsTest.test_056_enhance_project_info ___________________________________________

self = <tests.test_utils.UtilsTest testMethod=test_056_enhance_project_info>

    def test_056_enhance_project_info(self):
>       response = enhance_project_info(tests.PROJECT_INFO)

test_utils.py:84: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../gitlab_stats/utils.py:81: in enhance_project_info
    project_info.update({'duration_moy': get_duration_moy(project_info)})
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

project_info = {'duration_in_minutes': '0 min 59s', 'duration_moy': 58.9, 'id': 4895805, 'name': 'integration-tests', ...}

    def get_duration_moy(project_info):
        print(project_info)
>       duration = (pipeline['duration'] for pipeline in project_info['pipelines'])
E       KeyError: 'pipelines'

../gitlab_stats/utils.py:55: KeyError

To reproduce such a report, run pytest tests/test_utils.py --flake-finder at the root directory.
The test can pass with a normal pytest run, which suggests that the test is flaky.
I have tried to resolve this but I cannot understand why project_info gets polluted; my guess is that the tests are order dependent.
Please see whether this should be fixed or improved.

Change Status on PyPI

The status is now Beta; it could be moved a bit higher to reflect reality.

  • Change setup.py
  • Create a new release
  • Deploy to PyPI

How about git stats?

For now it only gets stats from GitLab.
What if we used the powerful (but complicated) git command to get some git stats like the ones below (a rough sketch of the corresponding commands follows the list):

  • Number of commits per person
  • Number of commits per ticket (with #9 in the commit message, or a Jira issue JIRA-123)
  • All commits between two releases
  • All tickets worked on between two releases (with some Fix #9 .... commit or something else)
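A rough sketch of the plain git commands this could build on (the tag names v1.0.0 and v2.0.0 are placeholders, not actual releases of this project):

git shortlog -sn                                 # number of commits per person
git log --oneline --grep="#9"                    # commits referencing ticket #9
git log --oneline v1.0.0..v2.0.0                 # all commits between two releases
git log --oneline v1.0.0..v2.0.0 --grep="Fix #"  # ticket-closing commits between two releases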

Add gitlab mock server for test

Right now some tests run directly against the GitLab API, which is not best practice.
It would be better to add a simple server that would fake the requests.

Have the possibility to use a proxy or URL saved inside a defined env variable

If you are behind a proxy or use a different URL for your own personal GitLab, it is a bother to always have to enter something like:

gitlab-stats <ID> -u "http://my-gitlab.com" -p "http://my-proxy.com"

Something nice would be to use the predefined HTTP_PROXY and something like GITLAB_URL environment variables so that the command would just be:

gitlab-stats <ID> -up
# or something like a new flag for saved url and proxy
gitlab-stats -s <ID>

Add start date of the stats

As of now it gives the stats for the two previous weeks, but it would be nice to have the exact start date and end date to avoid confusion.

Plus, it could be useful if we decide to allow a custom date range.

The start date would be the date of the first pipeline in the time range. According to #3, if there are more than 100 pipelines, those beyond the 100th are not counted, so the start date would be the date of the 100th pipeline.

Add the number of pipeline runs

As of now, we look into the first page of pipelines (100 pipelines max), then we check when they ran. A nice feature would be to know the number of pipeline runs, so we could see (a sketch of the paging request follows the list):

  • a low number of pipelines in a sprint could explain some success stats
  • a normal number of pipelines would show our average per sprint
  • a hundred pipelines would mean we need to add the feature to look into more pages.
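For reference, a rough sketch of how a further page could be requested from the GitLab API (the gitlab.com URL and <ID> are placeholders); the X-Total and X-Total-Pages response headers of such paginated endpoints would also give the total number of pipelines directly:

curl --header "PRIVATE-TOKEN: $GITLAB_TOKEN" "https://gitlab.com/api/v4/projects/<ID>/pipelines?per_page=100&page=2"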
