⚖️ Define a set of evaluation criteria to illustrate #1
💡 Suggestions for Evaluation Criteria

🏥 Project Health
- 🔄 Update Frequency
- 🎫 Issues and Pull Requests
- 🚀 Release Frequency
- 👥 Contributor Activity
- 🤝 Community Engagement
- ⭐ Community Interest

🔒 Security & Quality
- 🛡️ Security (OpenSSF Scorecard)
- 📚 Documentation Quality
- 💻 Code Quality
- 📊 Usage Statistics
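Several of the frequency-oriented criteria above (🔄 Update Frequency, 🚀 Release Frequency) boil down to measuring time between events. A minimal sketch of how release cadence could be computed, assuming the release dates have already been fetched (e.g. from the GitHub API's `GET /repos/{owner}/{repo}/releases` endpoint; the sample dates below are made up):

```python
from datetime import datetime

def release_cadence_days(release_dates):
    """Average number of days between consecutive releases.

    `release_dates` is a list of ISO 8601 date strings in any order;
    returns None when fewer than two releases exist.
    """
    stamps = sorted(datetime.fromisoformat(d) for d in release_dates)
    if len(stamps) < 2:
        return None
    span = stamps[-1] - stamps[0]
    return span.days / (len(stamps) - 1)

# Hypothetical sample data; in practice these dates would come from the
# GitHub API. 121 days across 2 intervals -> 60.5 days between releases.
print(release_cadence_days(["2024-01-01", "2024-03-01", "2024-05-01"]))  # 60.5
```

The same pattern works for commit timestamps to approximate update frequency.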
Further reading: @zorp & @thpa-bitmind: Your input is valuable! Please share thoughts on:
Let's refine these criteria to create a robust evaluation framework! 🚀
Discussions as evaluation criteria? I recently learned about the 'Discussions' feature in GitHub repos (I'm a newbie 👶). As I understand it, the repo admin has to activate Discussions (right?). Even though Discussions might not be activated in every repo, we could consider including them in the evaluation criteria under ⭐️ Community Engagement? (As mentioned, I'm kindergarten-level at GitHub and apologize if I've got it wrong or if this is irrelevant 😅) Later 🐊
Hi @nocodesissi! 👋 Thank you for your question regarding the use of Discussions as an evaluation criterion. It's great to see you engaging with the features of GitHub!
I'm not that afraid of introducing Discussions (or Projects) as long as it is very clear to the community that, should GitHub decide to charge money for them in the future, they would no longer be part of our offering, and it would not be possible to migrate the content created in them if the project moves away from GitHub. That being said, we are introducing Zulip as a collaboration platform to boost community engagement. Could we measure activity in channels on Zulip and use that as a metric for community engagement? For example:
Also, could we measure meeting activity based on how many meeting minutes were published last year and this year? What would it require? We could also consider introducing files describing the organisation in the project template (if not already there). For example:
These being present and having relevant content could also be a measure of community engagement.
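Checking for the presence of such organisation-describing files is easy to automate. A sketch under the assumption that the file set would look something like the one below (the actual list is still to be decided in this thread, so these names are placeholders):

```python
# Hypothetical set of organisation-describing files for the project
# template; the actual list has not been agreed yet.
EXPECTED_FILES = {"GOVERNANCE.md", "CODE_OF_CONDUCT.md", "CONTRIBUTING.md"}

def governance_score(files_in_repo):
    """Fraction of the expected organisation files present in the repo root.

    Presence alone does not prove the content is relevant; that part
    would still need human review, as noted above.
    """
    present = EXPECTED_FILES & set(files_in_repo)
    return len(present) / len(EXPECTED_FILES)

# Two of three expected files present -> score of 2/3.
print(governance_score(["README.md", "GOVERNANCE.md", "CONTRIBUTING.md"]))
```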
Thank you @janhalen and @zorp for elaborating. I understand the importance of being conscious about vendor lock-in. Despite that, I mostly agree with Zorp, as I think it's important to also evaluate the community's interactions in forums like Discussions or Zulip. Even though it resembles the opportunities found in issues, I would argue that there's something different at play when a community can brainstorm and interact a bit more informally, as long as there's good practice in place to ensure that dialogues are moved to issues as soon as it's relevant. Naturally, this requires guidelines. I've looked into [VMware Tanzu's](https://github.com/vmware-tanzu) approach to measuring community engagement, and I think there's a lot of great stuff to draw from. I suggest we take inspiration from (or directly copy) their methods unless there are other projects doing something even better. There's no need to reinvent the wheel. ;) I believe it's important for us to balance quantitative and qualitative parameters. Just because something can be quantified doesn't necessarily mean it will provide the best answers to what we want to investigate; at least, that's what I try to remind myself. Feel free to check out VMware Tanzu's health-check sheet: [HEALTHCHECK-SHEET.md](https://github.com/vmware-tanzu/community-engagement/blob/main/HEALTHCHECK-SHEET.md) Can we use some of it? I'm also a big fan of their Inclusive Terminology guide ❤️. That's something we could consider as well. What are your thoughts? 💭
@nocodesissi: I completely agree that we shouldn't reinvent the wheel, and looking to established practices is a great approach. However, I'd like to suggest a slight shift in our reference point. While VMware Tanzu's methods are certainly valuable, it might be even more beneficial for us to align with a more independent, community-driven standard. In this light, I'd like to draw your attention to the CHAOSS (Community Health Analytics Open Source Software) project: https://github.com/chaoss CHAOSS is a Linux Foundation project that focuses on creating analytics and metrics to help define community health. They offer a comprehensive, vendor-neutral approach to measuring open source community health and sustainability. Their metrics and methods are developed collaboratively by a diverse community of professionals and academics, which helps ensure a balanced and unbiased perspective. Some key advantages of using CHAOSS as our model include:
@nocodesissi, @zorp, @thpa-bitmind: What do you think about pivoting towards CHAOSS as our primary reference? I'm interested to hear your thoughts on this approach!
The evaluation of a product's maturity is a complex process that encompasses numerous factors, including community health. To enhance the credibility and universality of our assessment, it would be beneficial to align our maturity levels with internationally recognized standards, such as those established by the Cloud Native Computing Foundation. The challenge of accurately describing an open source project's state, health, and maturity is not unique to OS2, and we need not reinvent the wheel by creating our own criteria or levels. For a comprehensive understanding of this topic, I recommend exploring the CNCF's project metrics.
To boil it down and to reach our milestone, I think we (@nocodesissi & @zorp) need to narrow the evaluation criteria for this PoC down to a maximum of 5. Let's agree on 5 initial evaluation criteria and on a deadline for doing so! That way @thpa-bitmind can get started estimating the time needed for delivering a solution. More criteria can be added later.
🔎 Criteria Refinement Results

After meeting on 23 January 2025 we narrowed it down and identified 3 initial criteria we want to measure and report on:
The values/results of these criteria should be separated into 3 separate buckets to make a simple "traffic light" red/yellow/green illustration. These buckets need to be discussed; when ready, this can be made a separate issue:
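The traffic-light mapping described here amounts to a simple threshold function. A sketch of how it could look; the thresholds below are purely illustrative placeholders, since the actual buckets are still to be agreed in a separate issue:

```python
def traffic_light(value, yellow_threshold, green_threshold):
    """Map a metric value onto red/yellow/green buckets.

    Assumes higher values are better; the thresholds are placeholders
    until the real buckets are decided.
    """
    if value >= green_threshold:
        return "green"
    if value >= yellow_threshold:
        return "yellow"
    return "red"

# e.g. commits in the last 90 days, with made-up thresholds
print(traffic_light(12, yellow_threshold=5, green_threshold=20))  # yellow
```

For metrics where lower is better (e.g. days since last release), the comparisons would simply be inverted.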
@nocodesissi & @zorp please confirm that you agree on these criteria (and upcoming tasks)
Thanks for following up on the meeting @janhalen. I agree on these criteria 👍 As I recall, we also discussed one more aspect of the activity/frequency criteria, which was to maybe include the following: measuring how many issues result in PRs. This was something we considered to accommodate the possible scenario where a project has a lot of issues that might not be handled by the community (i.e., lead to PRs). What do you think about this? Is it already included in the Activity/Frequency criteria, or should we leave it for now and include it in a later version of health measuring? I completely trust your opinion and decision in this regard; I just wanted to mention it. Thanks again! I really enjoy being able to contribute here and am so excited about seeing the final results 🥳
@nocodesissi: You're right! I forgot, thanks for reminding me! I have added it as a further criterion: "Activity/Impact: Measure how many issues result in PRs that are merged."
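This criterion could be computed as a simple ratio once issues are cross-referenced with their linked pull requests. A sketch with made-up sample data; the `merged_pr` flag is a hypothetical field, since linking issues to merged PRs would in practice require extra lookups against the GitHub API:

```python
def issue_to_merged_pr_ratio(issues):
    """Share of issues that resulted in a merged PR.

    Each issue is a dict carrying a hypothetical 'merged_pr' flag;
    real data would need issues cross-referenced with linked PRs.
    """
    if not issues:
        return 0.0
    merged = sum(1 for issue in issues if issue.get("merged_pr"))
    return merged / len(issues)

sample = [{"merged_pr": True}, {"merged_pr": False}, {"merged_pr": True}]
print(issue_to_merged_pr_ratio(sample))  # 2 of 3 issues led to a merged PR
```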
📊 Objective
Create a comprehensive set of criteria to assess the overall health and security of our software projects in OS².
📝 Next Steps
💡 Discussion
Share your thoughts on important criteria we should include. What metrics do you find most valuable for assessing project health?