Transition Criteria Mapping to OpenSSF Scorecard Criteria

  • Legal
    • License (a score of 10 is a pass; anything below fails)
  • Diversity
    • Consider the "Contributors" metric (research is needed on an appropriate threshold, and on whether we can tweak the parameters)

...

    • Also the diversity of maintainers (try a GitHub Action that parses the MAINTAINERS.md file; a parsing sketch follows this list fragment. As a fallback, the TOC will inspect manually.)
  • Release
    • Packaging (OpenSSF seems to award a 10 easily here, based on a single publish action, so perhaps we should treat this as a soft criterion and mandate the highest score)

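As a starting point for the maintainer-diversity check, here is a minimal Python sketch that counts distinct affiliations in a MAINTAINERS.md. It assumes the file contains a Markdown table with a "Company" or "Affiliation" column; real layouts vary across projects, which is why manual TOC inspection remains the fallback. The MIN_ORGS threshold is a placeholder pending the research noted above.

```
#!/usr/bin/env python3
"""Sketch: count distinct maintainer affiliations in a MAINTAINERS.md.

Assumes a Markdown table whose header has a "Company" or "Affiliation"
column, e.g.:

    | Name | GitHub | Company |
    | ---- | ------ | ------- |
    | Jane | jdoe   | Acme    |

Real MAINTAINERS.md layouts vary, so treat this as a starting point.
"""
import sys

MIN_ORGS = 3  # placeholder threshold, pending TOC research


def affiliations(path: str) -> list[str]:
    header, rows = None, []
    with open(path, encoding="utf-8") as f:
        for line in f:
            if not line.strip().startswith("|"):
                continue
            cells = [c.strip() for c in line.strip().strip("|").split("|")]
            if header is None:
                header = [c.lower() for c in cells]  # first table row is the header
            elif set(cells[0]) <= set("-: "):
                continue  # skip the |---|---| separator row
            else:
                rows.append(dict(zip(header, cells)))
    col = next((c for c in ("company", "affiliation") if header and c in header), None)
    if col is None:
        sys.exit("no Company/Affiliation column found; fall back to manual review")
    return [r[col] for r in rows if r.get(col)]


if __name__ == "__main__":
    orgs = affiliations(sys.argv[1] if len(sys.argv) > 1 else "MAINTAINERS.md")
    distinct = {o.lower() for o in orgs}
    print(f"{len(orgs)} maintainers across {len(distinct)} distinct organizations")
    sys.exit(0 if len(distinct) >= MIN_ORGS else 1)
```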
...

    • Also timeliness of releases using major and minor version numbers (use a script/action that checks the time since the last release against a threshold; a sketch follows below). (Consider adding such scripts to the TOC repo so any member can run them when required for reviews and evaluations.)
  • Testing and CI/CD
    • CI Tests (think about this one; if a 10 is easily attainable, maybe mandate that; perhaps require a code-coverage action and pick a threshold to exceed)
  • Security
    • Dangerous-Workflow (require a 10)

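For the release-timeliness script, a minimal sketch against the public GitHub REST API (unauthenticated, so rate-limited); the 180-day threshold is a placeholder, not an agreed TOC number.

```
#!/usr/bin/env python3
"""Sketch: fail if a repo's latest GitHub release is older than a threshold."""
import json
import sys
import urllib.request
from datetime import datetime, timezone

MAX_AGE_DAYS = 180  # placeholder threshold, not an agreed TOC number


def latest_release_age_days(repo: str) -> float:
    # A 404 here means the project has published no GitHub releases at all.
    url = f"https://api.github.com/repos/{repo}/releases/latest"
    with urllib.request.urlopen(url) as resp:
        published = json.load(resp)["published_at"]  # e.g. "2024-01-31T12:00:00Z"
    when = datetime.fromisoformat(published.replace("Z", "+00:00"))
    return (datetime.now(timezone.utc) - when).total_seconds() / 86400


if __name__ == "__main__":
    repo = sys.argv[1]  # e.g. "hyperledger/fabric"
    age = latest_release_age_days(repo)
    print(f"{repo}: latest release is {age:.0f} days old")
    sys.exit(0 if age <= MAX_AGE_DAYS else 1)
```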
...

    • Branch-Protection (check with Ry whether the 2-org merge-approval policy is set by default for all repos; require Tier 4, or 9/10 points; a per-repo check is sketched below)

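For Branch-Protection, a sketch that reads a repo's required-review settings through the GitHub API (the endpoint needs a token with admin access, exported here as GITHUB_TOKEN). It cannot tell whether the 2-approval policy is applied org-wide by default, which remains the question for Ry, but it can confirm the setting per repo.

```
#!/usr/bin/env python3
"""Sketch: read a branch's required-review count via the GitHub API."""
import json
import os
import sys
import urllib.request


def required_approvals(repo: str, branch: str = "main") -> int:
    # GET /repos/{owner}/{repo}/branches/{branch}/protection needs admin access.
    url = f"https://api.github.com/repos/{repo}/branches/{branch}/protection"
    req = urllib.request.Request(
        url, headers={"Authorization": f"Bearer {os.environ['GITHUB_TOKEN']}"}
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    reviews = data.get("required_pull_request_reviews", {})
    return reviews.get("required_approving_review_count", 0)


if __name__ == "__main__":
    repo = sys.argv[1]  # e.g. "hyperledger/fabric"
    n = required_approvals(repo)
    print(f"{repo}: {n} required approving review(s)")
    sys.exit(0 if n >= 2 else 1)  # the notes mention a 2-approval merge policy
```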
...

    • Dependency-Update-Tools (require a 10, which can be obtained just by configuring a bot like Dependabot; a presence check is sketched below)

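A companion check for reviews: probe the documented default config paths for Dependabot and Renovate via the GitHub contents API to confirm that some dependency-update bot is configured.

```
#!/usr/bin/env python3
"""Sketch: check whether a repo has a dependency-update bot configured."""
import sys
import urllib.error
import urllib.request

CONFIG_PATHS = [  # documented default locations for Dependabot and Renovate
    ".github/dependabot.yml",
    ".github/dependabot.yaml",
    "renovate.json",
    ".github/renovate.json",
]


def has_update_bot(repo: str) -> bool:
    for path in CONFIG_PATHS:
        url = f"https://api.github.com/repos/{repo}/contents/{path}"
        try:
            urllib.request.urlopen(url)  # HTTP 200 means the file exists
            return True
        except urllib.error.HTTPError:
            continue  # 404: try the next known path
    return False


if __name__ == "__main__":
    repo = sys.argv[1]  # e.g. "hyperledger/fabric"
    found = has_update_bot(repo)
    print(f"{repo}: dependency-update bot config {'found' if found else 'missing'}")
    sys.exit(0 if found else 1)
```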
...

    • Fuzzing (check whether it's easy to integrate tools like OSS-Fuzz, and if so, mandate a 10)

...

    • Pinned-Dependencies (investigate if this is a 0, but give the maintainers the benefit of the doubt if they can provide good reasons for that score)

...

    • SAST (check whether it's easy to integrate tools like CodeQL, and if so, mandate a 10)

...

    • Security-Policy (at least a 9; also ensure that the project's SECURITY.md file builds on the default template in the HL TOC governing documents; a conformance check is sketched below)

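A sketch of the SECURITY.md conformance check; REQUIRED_HEADINGS below is a hypothetical placeholder and should be replaced with the actual section names from the default template in the HL TOC governing documents.

```
#!/usr/bin/env python3
"""Sketch: check that SECURITY.md carries the sections the TOC template expects."""
import sys

REQUIRED_HEADINGS = [  # placeholders, not the real template sections
    "Reporting a Vulnerability",
    "Supported Versions",
]


def missing_headings(path: str) -> list[str]:
    with open(path, encoding="utf-8") as f:
        text = f.read().lower()
    return [h for h in REQUIRED_HEADINGS if h.lower() not in text]


if __name__ == "__main__":
    missing = missing_headings(sys.argv[1] if len(sys.argv) > 1 else "SECURITY.md")
    if missing:
        print("SECURITY.md is missing expected sections:", ", ".join(missing))
        sys.exit(1)
    print("SECURITY.md covers all expected sections")
```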
...

    • Signed-Releases (don't mandate anything until the Security Artifacts Task Force comes to a conclusion)

...

    • Token-Permissions (mandate a 10, ensuring least privilege use of tokens)

...

    • Vulnerabilities (investigate if this is less than 10, but give the maintainers the benefit of the doubt if they can provide good reasons for the score, e.g. because a ready fix is unavailable for a critical feature)
  • Structure
    • Code Review (require a 10, as human maintainer review of every PR is a basic requirement)
  • Maintenance Activity (not in our original badge list)
    • Maintenance (require a certain amount of activity to graduate, but don't require a high number for mature projects to keep their graduated status; the TOC can run a dormancy check, but it will be subjective, so treat this as a soft criterion without a hard threshold)
  • Production
    • ?? (should we have an ADOPTERS.md file? Interesting to look at, but we shouldn't mandate it as a criterion for graduation)
  • Documentation
    • (subjective criterion; test for presence and publication; make a list of things docs should cover, such as setup/installation instructions and a basic tutorial)

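To make the mapping above executable, a sketch of a TOC-repo script that runs the Scorecard CLI in JSON mode and applies per-check thresholds. The thresholds mirror the notes above where a number was proposed and are otherwise omitted; it assumes the scorecard CLI is installed and GITHUB_AUTH_TOKEN is exported, per the Scorecard docs.

```
#!/usr/bin/env python3
"""Sketch: apply the TOC's per-check thresholds to Scorecard JSON output."""
import json
import subprocess
import sys

THRESHOLDS = {  # mirrors the notes above where a number was proposed
    "License": 10,
    "Dangerous-Workflow": 10,
    "Branch-Protection": 9,
    "Dependency-Update-Tools": 10,
    "Security-Policy": 9,
    "Token-Permissions": 10,
    "Vulnerabilities": 10,  # soft: maintainers may justify a lower score
    "Code-Review": 10,
}


def run_scorecard(repo: str) -> dict:
    out = subprocess.run(
        ["scorecard", f"--repo={repo}", "--format=json"],
        capture_output=True, text=True, check=True,
    )
    return json.loads(out.stdout)


if __name__ == "__main__":
    repo = sys.argv[1]  # e.g. "github.com/hyperledger/fabric"
    failures = []
    for check in run_scorecard(repo)["checks"]:
        want = THRESHOLDS.get(check["name"])
        # Scorecard reports -1 when a check is inconclusive; that fails here too.
        if want is not None and check["score"] < want:
            failures.append(f'{check["name"]}: scored {check["score"]}, need {want}')
    print("\n".join(failures) if failures else "all mapped thresholds met")
    sys.exit(1 if failures else 0)
```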

(How do we enforce a particular version of the Scorecard?

  • Perhaps announce on the maintainers' Discord channel whenever an upgrade is due, and expect each project's maintainers to submit a PR upgrading the Scorecard GitHub Action; a version-check sketch follows below.)
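To support that process, a sketch that reports which ossf/scorecard-action version a repo's workflows pin, so stale pins can be flagged during reviews. EXPECTED is a placeholder for whatever version the TOC mandates, and workflows commonly pin a commit SHA rather than a tag, so the comparison may need adjusting.

```
#!/usr/bin/env python3
"""Sketch: report which ossf/scorecard-action version a repo's workflows pin."""
import pathlib
import re
import sys

EXPECTED = "v2.3.1"  # placeholder for whatever version the TOC mandates
PATTERN = re.compile(r"uses:\s*ossf/scorecard-action@(\S+)")


def pinned_refs(workflow_dir: str = ".github/workflows") -> set[str]:
    refs = set()
    for wf in pathlib.Path(workflow_dir).glob("*.y*ml"):  # .yml and .yaml
        refs.update(PATTERN.findall(wf.read_text(encoding="utf-8")))
    return refs


if __name__ == "__main__":
    refs = pinned_refs()
    if not refs:
        sys.exit("no scorecard-action usage found")
    print("pinned refs:", ", ".join(sorted(refs)))
    sys.exit(1 if refs != {EXPECTED} else 0)
```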