Abstract—New contributors are critical to open source projects. Without them, a project will eventually atrophy and become inactive, or its experienced contributors will bias the future directions it takes. However, new contributors can also bring a greater risk of introducing vulnerable code. For projects that need both secure implementations and a strong, diverse contributor community, this conflict is a pressing issue. One avenue being pursued that could reconcile these goals is rewriting components of C or C++ code in Rust, a language designed for the same domains as C and C++ but with stronger safety guarantees. Seeking to answer whether Rust can help keep new contributors from introducing vulnerabilities, and therefore ease the burden on maintainers, we examine Mozilla's Oxidation project, which has replaced components of the Firefox web browser with equivalents written in Rust. We use the available data from these projects to derive parameters for a novel application of learning curves, which lets us estimate, in a directly comparable way, the proportion of commits from new contributors that introduce vulnerabilities. We find that, despite concerns about Rust's ease of use, first-time contributors to Rust projects are about 70 times less likely to introduce vulnerabilities than first-time contributors to C++ projects. We also find that the rate of new contributors increased overall after the switch to Rust, implying that this decrease in vulnerabilities from new contributors does not result from a smaller pool of more skilled developers, and that Rust can in fact facilitate new contributors. In the process, we also qualitatively analyze the Rust vulnerabilities in these projects and measure the efficacy of the commonly used SZZ algorithm for identifying bug-inducing commits from their fixes.
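
To make the abstract's central claim concrete, the sketch below (illustrative only, not taken from the paper or the Oxidation codebase) shows the kind of memory-safety mistake the Rust compiler rejects outright, where the equivalent C++ compiles and only misbehaves at run time:

    // A minimal sketch (not from the paper) of a memory-safety bug class that
    // Rust rejects at compile time while the C++ equivalent compiles silently.
    fn main() {
        let data = vec![1, 2, 3];
        let first = &data[0]; // shared borrow into `data`

        // drop(data); // uncommenting this fails to compile (E0505: cannot move
        //             // out of `data` because it is borrowed); the analogous C++
        //             // frees the buffer and then reads through a dangling pointer.

        println!("first element: {first}");

        // Out-of-bounds reads are checked too: `get` returns None instead of
        // reading past the allocation, where C++'s operator[] is undefined behavior.
        match data.get(10) {
            Some(v) => println!("value: {v}"),
            None => println!("index 10 is out of bounds"),
        }
    }

Checks like these hold regardless of a contributor's experience level, which is one plausible mechanism behind the lower rate of vulnerability-inducing commits from first-time Rust contributors reported above.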

  • Railing5132@lemmy.world · 1 year ago

    No matter what tool is used, if you don’t start from a foundation of security first, your code will not be inherently secure. I can accept that some tools have more guardrails than others, but we are not teaching foundational security skills and principles, privacy, or ethics, even at the college level. Until that is addressed at a large scale and applied at the lowest layers, down to the silicon, we’re doomed to this security hell hole dystopia we’re living in.

    • realharo@lemm.ee · 1 year ago

      Would that actually help?

      Like, if you look at a list of recent vulnerabilities and breaches, what skills would have prevented those from happening?

      • Railing5132@lemmy.world · edited · 1 year ago

        Looking at specific vulnerabilities or breaches in a complex, interconnected system wouldn’t be particularly helpful in the context I was aiming for. I was thinking more along the lines of generational education in secure practices: thinking and acting securely on a global scale, to ingrain that mindset in future engineers, with security and ethics courses for high school and engineering college undergrads.

        Of course, this all comes down to market forces. Manufacturers don’t have an incentive to do more than the bare minimum of QA…

        Here’s an example of the sorry current state: my son just graduated from a Big 10 school with a degree in robotics and electronics engineering. It was very heavy in programming. He’s continuing on to a Ph.D. program. He had exactly ONE lecture regarding secure coding and programming ethics, and he isn’t required to have any more. In a 7-8 year program, that’s 1.5 hours of formal instruction on secure coding practices and ethics.