
Add OpenSSF Best Practices Badge #14342


Merged
merged 1 commit on Mar 28, 2025

Conversation

@maennchen (Member) commented Mar 18, 2025

Changes

  • Adds the passing badge to the README
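The badge line itself is not shown in this thread. As a sketch, a README badge for this project would typically look like the following, using the project ID 10187 from the form linked below (the exact markup in the PR may differ):

```markdown
[![OpenSSF Best Practices](https://www.bestpractices.dev/projects/10187/badge)](https://www.bestpractices.dev/projects/10187)
```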

Best Practices

I've filled out the form to the best of my knowledge; the answers can be seen here: https://www.bestpractices.dev/en/projects/10187

Approving this PR also means confirming that the information there is correct.

Unmet SUGGESTED practices

For SUGGESTED best practices, we can decide to ignore them and still pass the badge. The following have been marked as UNMET:

  • test_most - It is SUGGESTED that the test suite cover most (or ideally all) the code branches, input fields, and functionality.

    We currently do not have any test coverage reporting; it would be good to add a coverage reporter to the setup. I have started exploring this here: Test Coverage Reporting #14343

  • static_analysis_common_vulnerabilities - It is SUGGESTED that at least one of the static analysis tools used for the static_analysis criterion include rules or approaches to look for common vulnerabilities in the analyzed language or environment.
  • static_analysis_often - It is SUGGESTED that static source code analysis occur on every commit or at least daily.

    We're currently using dialyzer, as well as various small tools like shellcheck and markdown lint, but we're not employing any security-focused static analysis on the Erlang/Elixir code. While there are tools in the ecosystem, such as elvis, sobelow, credo, etc., I'm not convinced that they would have an impact on this repository.

  • dynamic_analysis - It is SUGGESTED that at least one dynamic analysis tool be applied to any proposed major production release of the software before its release.
  • dynamic_analysis_enable_assertions - It is SUGGESTED that the project use a configuration for at least some dynamic analysis (such as testing or fuzzing) which enables many assertions. In many cases these assertions should not be enabled in production builds.

    To my knowledge we're not employing any dynamic analysis tools, and I also can't think of one that would make sense to use.
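Regarding the test_most item above: one low-friction starting point is Mix's built-in coverage reporting, which needs no extra dependencies. A sketch of what that configuration could look like (the approach ultimately explored in Test Coverage Reporting #14343 may differ, e.g. a third-party reporter; `:my_app` and the threshold are placeholders):

```elixir
# In mix.exs — `mix test --cover` reads this configuration.
def project do
  [
    app: :my_app,
    version: "0.1.0",
    # `summary: [threshold: 80]` prints a per-module coverage summary and
    # fails the run when total coverage drops below 80%.
    test_coverage: [summary: [threshold: 80]]
  ]
end
```

Running `mix test --cover` then reports line coverage per module, which would satisfy the "coverage reporting" gap noted above without committing to an external service.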

@kikofernandez

Erlang/OTP is in the same boat with static analysis for Erlang, except that we also use CodeChecker for C/C++ and dynamic analysis for the C/C++ parts (Valgrind).

What you wrote seems reasonable to me, and I think enough to get the passing badge.

@kikofernandez left a comment

Looks good to me and reasonable.
Dialyzer is a good static analysis tool, not a type system, so that criterion is met.
Regarding security coding tools, I only know of SAFE from Erlang Solutions, and it is not free (AFAIK).

@maennchen maennchen marked this pull request as ready for review March 20, 2025 09:57
@josevalim josevalim merged commit 33b6e6d into elixir-lang:main Mar 28, 2025
10 checks passed
@josevalim (Member)

💚 💙 💜 💛 ❤️

@maennchen maennchen deleted the jm/best_practices_badge branch March 28, 2025 10:51
@mohamedalikhechine

Hey all, SAFE is a static analysis tool that is free for open source. Please send an email to [email protected] and we will assist with it. The documentation is here

@maennchen (Member, Author)

@mohamedalikhechine I'm personally in favor of adding scanners to improve the quality / security of Elixir.
However, Elixir is different from most projects built on it. For example, creating atoms at runtime is a feature here, not a potential issue as it would be in most libraries. Therefore, for a scanner to be useful, we would need to see that it can find relevant issues in the current code and is not full of false positives. If you think that SAFE can do this, I would recommend trying it out and, if the results are promising, opening a separate discussion. (In case there are relevant findings, please be careful with the disclosure according to the security policy...)

For the scope of the best practices badge, I'm not sure if SAFE would be helpful since it is not FLOSS.
