wg-metrics-development's Introduction

CHAOSS Metrics Development Working Group

Table of Contents

Introduction

Goals

The Metrics Development Working Group focuses on defining the metrics that are used by multiple working groups or are important for community health.

Purpose

From time to time, we come across metrics that do not cleanly fit into one of the other existing working groups. This working group acts as a central hub for creating and debating such metrics.

Who should join this working group?

All contributors are welcome to participate in the Metrics Development Working Group. The areas of interest include organizational affiliation, responsiveness, geographic coverage, and more.

Participate

How to Join Us?

You are welcome to participate in our video conferences. The details of these meetings can be found here.

Read the agenda and meeting minutes to catch up on previous discussions and find out more about the next meeting.

As a contributor, you can help us keep our community open and inclusive. Please adhere to the guidelines in the CHAOSS Community: Code of Conduct.

Contributing

See the CONTRIBUTING.md for more information.

Metrics

Released Metrics

We are involved in the release of CHAOSS metrics. Check out our published work at <https://chaoss.community/metrics/>

Contributors

Chairs

  • Kevin Lumbard
  • Matt Germonprez

Please feel free to contact our chairs if you need any assistance.

Amazing CHAOSS Project Contributors

We greatly appreciate our CHAOSS contributors and look forward to you joining us as well.

License

The documents in this repository are released under the MIT License. See LICENSE file.

Copyright © CHAOSS, a Linux Foundation Project


wg-metrics-development's Issues

Inconsistency in referencing images

Hi,
I saw that the images in this repository are referenced differently across the markdown files.
A good example is the directory focus-areas/when.
The following metric markdown files reference images as:

Since there is already an attempt to bring all WGs into the same structure (doc), can we start by deciding on a common way of referencing images?

Also, there is an error in the metric mapping in the README; I tried to fix that in PR #109.

Common Metrics Release Notes

This issue was created to capture the continuous release notes. Insight from comments will be used to create the regular release notes.

Example comments could be:

  • Name of new metric and date of release
  • Descriptions of changes to existing metrics
  • Changes to focus areas or restructuring of the WG

Standard Structure for README.md in each WG repo

Hi everyone,

While standardizing the repositories, I noticed that the README file of each working group has a different structure. After discussing with @GeorgLink, I have drafted a proposal for a common structure that can be implemented across these README files. I have also replicated it as a template.

Link: https://docs.google.com/document/d/1pfipIiaemdtdiDQpvY7jOKhKzsv_3lXLLsbJiByl5GU/edit?usp=sharing

Please provide your thoughts and suggestions regarding the proposal.

Release Notes - 2021-02

This issue was created to capture the continuous contribution notes. Insight from comments will be used to create the regular release notes.

Example comments could be:

  • Name of new metric and date of release
  • Descriptions of changes to existing metrics
  • Changes to focus areas or restructuring of the WG

Standardizing wg-common repository

Hi,

There is an ongoing effort to standardize the WG repositories, as listed here. The wg-common repository contains a directory called template-folder, which defines the metrics template. Given that we already have a dedicated repository for metrics, should we remove this directory?

Check Common Metrics for Grammar Issues and Typos

Please check out the following links to CHAOSS Common Metrics. These metrics could use a double-check for grammar-related issues and typos. Thanks!

Metrics are published on our website with markdown pages as the source in this repository. Please create pull requests to edit the markdown.

Standardize Working groups repository structure

Hello everyone,

We are trying to make a standard and uniform repository structure for all working groups of CHAOSS. This will make things easier for new members. See this document for details about the proposed repository structure.

Please provide your feedback or concerns about this proposal.

Metric Idea: Project segments by timezone

Hi everyone.

When evaluating a project, one metric could be concerned with the timezone in which the majority of activity occurs. A company's ability to contribute to a project may be impacted if it is located in a timezone that is different from the project's.

Taking this idea further: if a project is spread out across timezones, that is good for (geographic) diversity. However, if the work within each timezone is limited to separate areas of the project or code base, then this could be a sign that the project has disconnected parts. From this angle, the metric becomes a measure of risk.

Does this make sense?
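
To make the disconnected-parts angle concrete, here is a minimal sketch of one possible implementation (grouping touched files by author UTC offset and using Jaccard overlap are illustrative assumptions, not an agreed metric definition):

```python
# Sketch: group the files touched by commits by author UTC offset and
# measure how much those groups overlap. Low overlap between offsets
# could be one signal of "disconnected parts"; it is not a verdict.
# Assumes a local git checkout; filenames starting with "@" would be
# misparsed, so treat this as an illustration only.
import subprocess
from collections import defaultdict
from itertools import combinations

log = subprocess.run(
    ["git", "log", "--format=@%ad", "--date=format:%z", "--name-only"],
    capture_output=True, text=True, check=True,
).stdout

files_by_offset = defaultdict(set)
offset = None
for line in log.splitlines():
    if line.startswith("@"):
        offset = line[1:]                 # e.g. "+0200"
    elif line.strip() and offset is not None:
        files_by_offset[offset].add(line.strip())

for (tz_a, files_a), (tz_b, files_b) in combinations(files_by_offset.items(), 2):
    union = files_a | files_b
    jaccard = len(files_a & files_b) / len(union) if union else 0.0
    print(f"{tz_a} vs {tz_b}: file overlap (Jaccard) = {jaccard:.2f}")
```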

Candidate Release Comments (Organizational Diversity)

This issue was created to collect comments about the upcoming metrics release.

This thread is for comments about Organizational Diversity

GitHub location: https://github.com/chaoss/wg-common/blob/master/focus-areas/organizational-affiliation/organizational-diversity.md

Release candidate: https://chaoss.community/metric-organizational-diversity/

See all metric release candidates at:
https://chaoss.community/metrics-rc/

Important Dates

Release Freeze: June 21st, 2019
Candidate Release: June 24th, 2019
Comments Close: July 24th, 2019
Release Date: August 1st, 2019

New Metric: Burstiness of activity (distribution over time measurement)

UPDATE: Working on the metric here:
https://docs.google.com/document/d/1cIt6aJdWppIAW7LvLHMIW7oOLZtkYnZPjSaYhX_NBHI/edit


Burstiness captures whether activity is distributed evenly over time, is absent, or occurs in bursts of high activity separated by long pauses.

I first heard about this metric applied to Wikipedia edits to understand coordination and communication behavior (non-archival conference, sorry, no reference).

Goh, K.-I., & Barabási, A.-L. (2008). Burstiness and Memory in Complex Systems. EPL (Europhysics Letters), 81(4), 48002. Retrieved from: https://arxiv.org/pdf/physics/0610233
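
For reference, the burstiness parameter defined in the Goh and Barabási paper cited above is computed from the mean and standard deviation of inter-event times. A minimal sketch (the timestamps are placeholders; in a CHAOSS context they might be commit times, issue comments, or mailing list posts):

```python
# Sketch: burstiness parameter B = (sigma - mu) / (sigma + mu) of the
# inter-event times, following Goh & Barabasi (2008). B near -1 means
# regular/periodic activity, near 0 Poisson-like, near +1 very bursty.
# The timestamps below are placeholders for real activity data.
from datetime import datetime
from statistics import mean, stdev

events = sorted([
    datetime(2021, 1, 1, 9, 0),
    datetime(2021, 1, 1, 9, 5),
    datetime(2021, 1, 1, 9, 7),
    datetime(2021, 2, 15, 14, 0),
    datetime(2021, 2, 15, 14, 30),
])

gaps = [(b - a).total_seconds() for a, b in zip(events, events[1:])]
mu, sigma = mean(gaps), stdev(gaps)
print(f"burstiness B = {(sigma - mu) / (sigma + mu):.2f}")
```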

New metric: Geographic Coverage

It would be interesting to track the geographic location of the committers to an open source project. If they are all from the very same region or location, the project is quite localized: is this a successfully adopted project? If they are from all over the world, would it be a more successful one?

When I discussed this with Jesus in Sonoma, he suggested looking at the timezone of the commits in the git repository. It is already available from GrimoireLab and useful as is. If it were possible to detail it in terms of country rather than timezone, that would be even better, e.g. Italy or France or South Africa are all in the same timezone.
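
As a rough illustration of the timezone approach outside of GrimoireLab, commit author timezones can be tallied directly from git metadata. A minimal sketch, assuming a local clone (the offset is only a proxy for location, for the reasons noted above):

```python
# Sketch: tally commits per author UTC offset from a local git clone.
# The offset is only a proxy for location: many countries share an
# offset and committers may configure arbitrary timezones.
import subprocess
from collections import Counter

offsets = subprocess.run(
    ["git", "log", "--format=%ad", "--date=format:%z"],
    capture_output=True, text=True, check=True,
).stdout.split()

for offset, count in Counter(offsets).most_common():
    print(f"UTC{offset[:3]}:{offset[3:]}  {count} commits")
```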

Clarifying qualitative measures

This issue is a discussion that started during the review phase of the first CHAOSS metrics release.


Under the "Quantitative" section, what's the difference between New Contributing Organizations (7th bullet) and New Contributor Organizations (8th bullet)?

Originally posted by @rpaik: #22 (comment)


@geekygirldawn and @bproffitt: Do you know what the 7th and 8th bullets are intended to represent?

  • New Contributing Organizations (7th bullet) and
  • New Contributor Organizations (8th bullet)?

Originally posted by @sgoggins: #22 (comment)


@sgoggins I believe we were differentiating between organizations that contribute as an organization and which organizations contributors belong to.

Originally posted by @bproffitt: #22 (comment)


Could they be written like this:

  • New organizations that contribute to the project
  • Organizations that new contributors belong to

Originally posted by @germonprez: #22 (comment)

New metric: Occasional Contributors

Continued discussion from gitlab:cauldronio/cauldron#661 and conversation with @GeorgLink.


Summary

Create a metric, or common definition, of what a drive-through contributor means.

Background

CHAOSS has discussed the idea of "drive-by" contributors, "fly-by" contributions, and other metaphors for folks who come to make one contribution to a project and then disappear. There is a strong negative connotation to "drive-by" contributors, and it is not a common phrase outside of the United States, which leads to confusion when interpreting the meaning of the measure. To date, there does not seem to be a standard way to describe these kinds of contributors.

Details

I propose a new metric to define drive-through contributors. This is the most widely-used term I see in place of drive-bys. You can see @vmbrasseur's 2017 talk on Have It Your Way: Maximizing Drive-Thru Contributions, which is where I first heard this term and started using it.

Brainstorming and ideation on what criteria define drive-through contributions would help expand this idea.

Outcome

Common language and understanding of what drive-through contributions and contributors are, and the role of their contributions in the larger context of an Open Source project and/or ecosystem.
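
As a brainstorming input for the criteria question above, here is a minimal sketch of one possible, purely hypothetical operationalization (a single contribution followed by no further activity within a follow-up window; the 180-day window and the data are placeholders, not an agreed CHAOSS definition):

```python
# Sketch: one hypothetical operationalization of a "drive-through"
# contributor: exactly one recorded contribution and no further
# activity within a follow-up window. The 180-day window and the data
# are placeholders for brainstorming, not an agreed CHAOSS definition.
from collections import defaultdict
from datetime import datetime, timedelta

FOLLOW_UP_WINDOW = timedelta(days=180)

# (author, timestamp) pairs from any data source (commits, PRs, posts)
contributions = [
    ("alice", datetime(2021, 1, 5)),
    ("alice", datetime(2021, 3, 9)),
    ("bob", datetime(2021, 2, 1)),
]

by_author = defaultdict(list)
for author, ts in contributions:
    by_author[author].append(ts)

as_of = datetime(2021, 9, 1)
drive_through = [
    author
    for author, stamps in by_author.items()
    if len(stamps) == 1 and as_of - stamps[0] > FOLLOW_UP_WINDOW
]
print(drive_through)  # ['bob']
```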

Release Notes 2021-09

This issue was created to capture the continuous contribution notes. Insight from comments will be used to create the regular release notes.

Example comments could be:

  • Name of new metric and date of release
  • Descriptions of changes to existing metrics
  • Changes to focus areas or restructuring of the WG

New Metric: Event Locations

Google doc is available at https://docs.google.com/document/d/1vSJFXA_YhqR0LjixTx4Y0Y9MUB1zXiF0Owh6-1XrvVM/edit

This issue was created to collect comments about the rolling release of "Event Locations".

This thread is for comments about the proposed metric.

This metric can be found here: https://github.com/chaoss/wg-common/blob/master/focus-areas/place/event-locations.md

See all release candidates: https://chaoss.community/metrics/

CHAOSS Metric Quality Checklist

Process

  • Create the “review issue” in the authoring WG’s repo for comments during the review period and paste this template in
  • Create pull request to edit or add metric to WG’s repo (after checking Content Quality and Technical Requirements below)
  • Add the new metric or metric edit to release notes issue in working group repo
  • Update the Metrics Spreadsheet
  • Create issue in CHAOSS/Translations repository to kick off translation to other languages (please use the translation issue template)
  • "Metric Candidate Release" label added to the metric release candidate issue.
  • Metric was added to website

When above steps are completed:

  • Announce new/updated metric on mailing list, newsletter, community Zoom call, and Twitter. This can be coordinated with the community manager.

Content Quality

  • Required headings are filled in, including Questions.
  • Description provides context to metric
  • Objectives list sample uses for the metric and desired outcomes
  • DEI uses of the metric, if any, are included in the Objectives
  • Optional headings that have no content were removed
  • Contributors section lists those contributors who want to be named
  • The name of the metric is the same in (1) metric heading, (2) metric file name, (3) focus area, (4) metrics spreadsheet, (5) “review issue”, (6) translation issue, and (7) website

Technical Requirements

  • A message at the top of the metric markdown file states that the metric will be part of the next regular release, and its links are correct (this is in the metric template)
  • Metric file name is the full metric name and only contains lower case letters and hyphens (“-”) for spaces
  • Images are included using markdown and relative links (as described in the metrics template)
  • Images have at least one empty line above and below them
  • Ensure images are placed in the image folder and follow the naming convention
  • If a new focus area is created, ensure it is added to the WG repo README and to the focus area folder README
  • Within the focus area, add the metric to the table and provide the link to the metric and the metric question
  • Ensure tables within the metric are converted to images and placed in the image folder (both the original MD and a screenshotted PNG) and follow the naming convention
  • No HTML code in the metrics markdown file

[Large Scope for Discussion] Management of Organizational Affiliation and Email Aliases

There is a broad need for the curation of a list of email addresses and organizational affiliations. Curation of such a list will be helpful for developing more precise implementations of metrics related to organizational contribution levels.
Questions of privacy associated with the new European data laws must also be considered. One possibility is a Mozilla project aimed at supporting individual, global consent levels across sites we may use, like GitHub.
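
As one possible shape for such a curated list, here is a minimal, purely hypothetical sketch (the emails, names, and organizations are placeholders, and any real implementation would need the consent and privacy handling discussed above):

```python
# Sketch: a minimal, hypothetical affiliation map that resolves email
# aliases to a canonical identity and organization. All entries are
# placeholders; a real deployment would need explicit consent handling
# (e.g. under GDPR) before curating data like this.
AFFILIATIONS = {
    "jane@example.com": ("Jane Doe", "Example Corp"),
    "jane.doe@example.org": ("Jane Doe", "Example Corp"),
    "jdoe@university.example": ("Jane Doe", "Example University"),
}

def resolve(email):
    """Return (identity, organization) for an email, or a default."""
    return AFFILIATIONS.get(email.lower(), ("unknown", "unaffiliated"))

print(resolve("Jane.Doe@example.org"))  # ('Jane Doe', 'Example Corp')
```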

Discussion on responsiveness metrics

The goal of this task is to start the discussion about the several ways we can track responsiveness metrics.

From a broader perspective, these responsiveness metrics can be seen in several of the data sources for software development.

As an example, a code review process may have several steps such as:

  • the time to merge
  • the time waiting for a submitter action
  • the time waiting for a reviewer action
  • the time for each of the iterations in a code review process
  • the time to merge into master after a review is approved
  • the time for approval
  • the time to first response (see the sketch at the end of this issue)

In addition to this, we may find other measurable times in places such as a mailing list (the time to get a first answer), or in more forum-based formats such as Discourse or Stack Overflow (the time to get an accepted answer).

With all of this in mind, and as a suggestion, we may start by writing down the motivation for these metrics (mainly the goals we are pursuing) and then start producing the specific questions and metrics.

As potential goals related to this, I have a couple in mind:

  • Efficiency in the software development process
  • Care for the volunteer community
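
As a concrete starting point for one of the measures listed above, time to first response on a pull request can be sketched with the GitHub REST API. This is only an assumption of one possible implementation (the repository, PR number, use of the requests library, and the choice to count only non-author comments are illustrative):

```python
# Sketch: time to first non-author comment on a GitHub pull request,
# using the public REST API (unauthenticated, so rate-limited).
# Owner, repo, and PR number are placeholders.
from datetime import datetime
import requests

OWNER, REPO, PR_NUMBER = "chaoss", "wg-metrics-development", 1  # placeholders

def parse(ts):
    return datetime.strptime(ts, "%Y-%m-%dT%H:%M:%SZ")

base = f"https://api.github.com/repos/{OWNER}/{REPO}"
pr = requests.get(f"{base}/pulls/{PR_NUMBER}").json()
comments = requests.get(f"{base}/issues/{PR_NUMBER}/comments").json()

opened = parse(pr["created_at"])
author = pr["user"]["login"]
responses = [
    parse(c["created_at"]) for c in comments if c["user"]["login"] != author
]

print(min(responses) - opened if responses else "no response yet")
```

The same approach could be repeated for review comments or commit timestamps to cover the other time measures listed above.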
