
⚖️ Define a set of evaluation criterias to illustrate #1

Open
janhalen opened this issue Jan 14, 2025 · 11 comments
janhalen commented Jan 14, 2025

📊 Objective

Create a comprehensive set of criteria to assess the overall health and security of our software projects in OS².

📝 Next Steps

  1. Brainstorm and finalize evaluation factors
  2. Break down factors into technical tasks
  3. Develop methods for data collection and visualization

💡 Discussion

Share your thoughts on important criteria we should include. What metrics do you find most valuable for assessing project health?


janhalen commented Jan 14, 2025

💡 Suggestions for Evaluation Criteria

🏥 Project Health

🔄 Update Frequency

  • Check last update date and commit frequency
  • Regular updates indicate active maintenance

🎫 Issues and Pull Requests

  • Analyze open/closed issues and PR merge frequency
  • Healthy projects show active issue management and regular contributions

🚀 Release Frequency

  • Examine version release cadence, including bug fixes
  • Consistent releases suggest ongoing development

👥 Contributor Activity

  • Assess contributor count and new contributor influx
  • Diverse, growing contributor base is positive

🤝 Community Engagement

  • Evaluate responsiveness to issues/PRs and interaction tone

⭐ Community Interest

  • Monitor stars and forks as indicators of interest

🔒 Security & Quality

🛡️ Security (OpenSSF Scorecard)

  • Utilize for quick assessment of security practices
  • Provides easy-to-understand project scores

📚 Documentation Quality

  • Evaluate completeness and clarity
  • Well-maintained docs show user-centric approach

💻 Code Quality

  • Use automated tools for assessment

📊 Usage Statistics

  • Check download stats for package manager-published projects

Further reading:
https://opensource.guide/metrics/


@zorp & @thpa-bitmind: Your input is valuable! Please share thoughts on:

  1. Viability of these criteria for our illustration goals: what is most important?
  2. Technical details: GitHub API availability and potential calculations

Let's refine these criteria to create a robust evaluation framework! 🚀
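On the technical-details question: several of these signals are directly available from the GitHub REST API (`GET /repos/{owner}/{repo}` returns fields such as `stargazers_count`, `forks_count` and `pushed_at`). A minimal sketch, using made-up sample values in place of a live API response:

```python
from datetime import datetime, timezone

# Fields mirror the GitHub REST API response for GET /repos/{owner}/{repo};
# the sample values here are invented for illustration.
repo = {
    "stargazers_count": 412,
    "forks_count": 37,
    "open_issues_count": 14,
    "pushed_at": "2025-01-10T08:30:00Z",
}

def days_since_last_push(repo, now=None):
    """Rough 'update frequency' signal: days since the last push."""
    now = now or datetime.now(timezone.utc)
    pushed = datetime.fromisoformat(repo["pushed_at"].replace("Z", "+00:00"))
    return (now - pushed).days

print(days_since_last_push(repo, now=datetime(2025, 1, 14, tzinfo=timezone.utc)))  # → 3
```

Commit, issue and PR frequency would need the paginated list endpoints (`/repos/{owner}/{repo}/commits`, `/issues`, `/pulls`), but the aggregation step looks the same.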

@janhalen janhalen added this to the Project setup milestone Jan 14, 2025
@janhalen janhalen changed the title Evaluation criterias ⚖️ Define a set of evaluation criterias to illustrate Jan 14, 2025
@sissekaliaswendyfan

Discussions as evaluation criteria?

I recently learned about the 'Discussions' feature in GitHub repos (I'm a newbie 👶).

As I understand it, a repo admin has to activate Discussions (right?). Even though Discussions might not be activated in every repo, could we consider including discussions in the evaluation criteria under ⭐️ Community Engagement?

(As mentioned, I'm kindergarten-level at GitHub and apologize if I've got it wrong or if this is irrelevant 😅)

Later🐊
/Sis

@janhalen


Hi @nocodesissi! 👋

Thank you for your question regarding the use of Discussions as an evaluation criterion. It's great to see you engaging with the features of GitHub!
While Discussions can indeed foster community engagement, it's important to note that it is a proprietary GitHub feature, not an open standard. This means that relying on it could lead to vendor lock-in, where we become dependent on GitHub's specific platform and features. If GitHub were to change its offerings, or if we decided to move to another platform, transitioning away from Discussions could be challenging.
Moreover, many of the collaborative functionalities that Discussions offers are already available through issue comments. Issues allow for similar conversations and community interactions without the added dependency on a specific feature.
I appreciate your input, and it's always valuable to consider how we can engage our communities effectively! If you have any more questions or thoughts, feel free to share.

zorp commented Jan 14, 2025


I'm not that afraid of introducing Discussions (or Projects), as long as it is very clear to the community that, should GitHub decide to start charging for them in the future, they will no longer be part of our offering, and since the content created there cannot be migrated, the project would move away from GitHub in that case.

That being said, we are introducing Zulip as a collaboration platform to boost community engagement. Could we measure activity in Zulip channels and use that as a metric for community engagement? For example:

  • Number of people in the channel
  • Number of messages within a recent time frame
  • Number of different people engaging in conversation
  • Number of topics in the channel
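If the Zulip route is taken, the four channel metrics above reduce to simple aggregations over a message log. A sketch with an invented data shape (the real field names would come from whatever the Zulip API actually returns):

```python
# Hypothetical message log; in practice this would be fetched from Zulip,
# but the record shape here is made up for illustration.
messages = [
    {"sender": "alice", "topic": "roadmap"},
    {"sender": "bob", "topic": "roadmap"},
    {"sender": "alice", "topic": "release 1.2"},
    {"sender": "carol", "topic": "onboarding"},
]

def channel_metrics(messages):
    """Reduce a message log to the engagement numbers suggested above."""
    return {
        "message_count": len(messages),
        "unique_senders": len({m["sender"] for m in messages}),
        "topic_count": len({m["topic"] for m in messages}),
    }

print(channel_metrics(messages))
# → {'message_count': 4, 'unique_senders': 3, 'topic_count': 3}
```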

Also, could we measure meeting activity based on how many meeting minutes were published in the current and previous year? What would it require?

We could also consider introducing files describing the organisation in the project template (if not already there). For example:

  • steering-committee.md
  • coordination-group.md

These files being present, and having relevant content, could also be a measure of community engagement.

@sissekaliaswendyfan

Thank you @janhalen and @zorp for elaborating.

I understand the importance of being conscious about vendor lock-in.

Despite that, I mostly agree with Zorp, as I think it's important to also evaluate the community's interactions in forums like Discussions or Zulip. Even though it resembles the opportunities found in issues, I would argue that there's something different at play when a community can brainstorm and interact a bit more informally—as long as there's good practice in place to ensure that dialogues are moved to issues as soon as it's relevant. Naturally, this requires guidelines.

I’ve looked into [VMware Tanzu's](https://github.com/vmware-tanzu) approach to measuring community engagement, and I think there’s a lot of great stuff to draw from. I suggest we take inspiration from (or directly copy) their methods unless there are other projects doing something even better. There’s no need to reinvent the wheel. ;)

I believe it’s important for us to balance quantitative and qualitative parameters. Just because something can be quantified doesn’t necessarily mean it will provide the best answers to what we want to investigate—at least, that’s what I try to remind myself.

Feel free to check out VMware Tanzu’s health-check sheet: [HEALTHCHECK-SHEET.md](https://github.com/vmware-tanzu/community-engagement/blob/main/HEALTHCHECK-SHEET.md)

Can we use some of it?

I’m also a big fan of their Inclusive Terminology guide ❤️. That’s something we could consider as well.

What are your thoughts? 💭

@janhalen

@nocodesissi:

I completely agree that we shouldn't reinvent the wheel, and looking to established practices is a great approach. However, I'd like to suggest a slight shift in our reference point. While VMware Tanzu's methods are certainly valuable, it might be even more beneficial for us to align with a more independent, community-driven standard.

In this light, I'd like to draw your attention to the CHAOSS (Community Health Analytics Open Source Software) project: https://github.com/chaoss

CHAOSS is a Linux Foundation project that focuses on creating analytics and metrics to help define community health. It offers a comprehensive, vendor-neutral approach to measuring open source community health and sustainability. Its metrics and methods are developed collaboratively by a diverse community of professionals and academics, which helps ensure a balanced and unbiased perspective. Some key advantages of using CHAOSS as our model include:

  • Vendor neutrality, reducing potential bias towards specific commercial interests
  • A wide range of metrics covering various aspects of community health
  • Regular updates and improvements based on community feedback
  • Alignment with broader open source community standards

@nocodesissi, @zorp, @thpa-bitmind: What do you think about pivoting towards CHAOSS as our primary reference?

I'm interested to hear your thoughts on this approach!

@janhalen

The evaluation of a product's maturity is a complex process that encompasses numerous factors, including community health. To enhance the credibility and universality of our assessment, it would be beneficial to align our maturity levels with internationally recognized standards, such as those established by the Cloud Native Computing Foundation.

The challenge of accurately describing an open source project's state, health, and maturity is not unique to OS2, and we need not reinvent the wheel by creating our own criteria or levels. For a comprehensive understanding of this topic, I recommend exploring the CNCF's project metrics. These metrics are grounded in the well-established theory of Diffusion of Innovations, which has been popularized through frameworks like Crossing the Chasm. By adopting more widely accepted models, we can ensure our evaluation process is both robust and relatable to the broader open source community.

@janhalen

To boil it down and reach our milestone, I think we (@nocodesissi and @zorp) need to narrow the evaluation criteria for this PoC down to a maximum of 5. Let's agree on 5 initial evaluation criteria and set a deadline for doing so!

That way @thpa-bitmind can get started estimating the time needed to deliver a solution.

Later, more criteria can be added...

@janhalen

janhalen commented Jan 23, 2025

🔎 Criteria Refinement Results

After meeting on 23 January 2025, we narrowed it down and identified four initial criteria we want to measure and report on:

  • 🌊 Implement activity frequency tracker
    Measure issues, pull requests and releases: how often do they occur?

  • 🔀 Implement issue conversion rate tracker
    Activity/Impact: measure how many issues result in PRs that are merged.

  • Community engagement: measure on issues, commits and PRs. How many unique users are contributing?

  • Core governance: verify that CONTRIBUTING.md and CODE_OF_CONDUCT.md exist, that they contain the recommended sections from the os2offdig template, and that there are commits/PRs to the template files by one of the appointed project maintainers. (This might introduce a need for a MAINTAINERS.md file.)
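As a rough sketch of what the last two checks could look like in code (the file names and conversion-rate definition come from the bullets above; everything else, including the function names, is assumed):

```python
from pathlib import Path

# File names taken from the core-governance criterion above.
REQUIRED_GOVERNANCE_FILES = ["CONTRIBUTING.md", "CODE_OF_CONDUCT.md"]

def governance_check(repo_root):
    """Check that the required governance files exist in a checked-out repo."""
    root = Path(repo_root)
    return {name: (root / name).is_file() for name in REQUIRED_GOVERNANCE_FILES}

def issue_conversion_rate(issues_opened, issues_with_merged_pr):
    """Share of issues that resulted in a merged PR (0.0 when there are no issues)."""
    if issues_opened == 0:
        return 0.0
    return issues_with_merged_pr / issues_opened

print(issue_conversion_rate(40, 10))  # → 0.25
```

Checking for the recommended template sections and for maintainer commits would need more than an existence test, but this shows the basic shape.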

The values/results of these four criteria should be separated into three buckets to make a simple "traffic light" red/yellow/green illustration. These buckets need to be discussed; when ready, this can be made a separate issue:

  • Discuss bucket boundaries for the specific criteria

@nocodesissi & @zorp please confirm that you agree on these criteria (and upcoming tasks)

@sissekaliaswendyfan

Thanks for following up on the meeting @janhalen.

I agree on these criteria 👍 As I recall, we also discussed one more aspect of the activity/frequency criterion: possibly measuring how many issues result in PRs.

This was something we considered to accommodate the possible scenario where a project has a lot of issues that might not be handled by the community (i.e. never lead to PRs). What do you think about this? Is it already included in the activity/frequency criterion, or should we leave it for now and include it in a later version of health measuring? I completely trust your opinion and decision in this regard; just wanted to mention it.

Thanks again! I really enjoy being able to contribute here and am so excited about seeing the final results 🥳

@janhalen

@nocodesissi: You're right! I forgot, thanks for reminding me!

I have added it as a further criterion: "Activity/Impact - Measure how many issues result in PRs that are merged."
