Context

Senate Bill (SB) 1047 is California legislation introduced by State Senator Scott Wiener to regulate AI models that cost over $100 million to train. The bill was designed to hold AI companies accountable for potential damages caused by their models.

It gained broad public support in California, along with backing from a coalition of labor unions, AI safety advocates, Hollywood figures, and current and former employees of AI megacorporations.

However, major corporations including Google, Amazon, Meta, and OpenAI opposed the bill and asked Governor Gavin Newsom to veto it.

Mozilla’s statement

On August 29, Mozilla joined those corporations in calling for a veto, publishing its own statement:

Mozilla is a champion for both openness and trustworthiness in AI, and we are deeply concerned that SB 1047 would imperil both of those objectives. For over 25 years, Mozilla has fought Big Tech to make the Internet better, creating an open source browser that challenged incumbents and raised the bar on privacy, security, and functionality for everyone in line with our manifesto.

Today, we see parallels to the early Internet in the AI ecosystem, which has also become increasingly closed and consolidated in the hands of a few large tech companies. We are concerned that SB 1047 would further this trend, harming the open-source community and making AI less safe — not more.

Mozilla has engaged with Senator Wiener’s team on the legislation; we appreciate the Senator’s collaboration, along with many of the positive changes made throughout the legislative process. However, we continue to be concerned about key provisions likely to have serious repercussions. For instance, provisions like those that grant the Board of Frontier Models oversight of computing thresholds without statutory requirements for updating thresholds as AI proves safe will likely harm the open-source AI community and the startups, small businesses, researchers, and academic communities that utilize open-source AI.

As the bill heads to the Governor’s desk, we ask that Governor Newsom consider the serious harm this bill may do to the open source ecosystem and pursue alternatives that address concrete AI risks to ensure a better AI future for all.

Source: Mozilla (PDF).

Governor Newsom vetoed the bill on September 29.

  • Scrubbles@poptalk.scrubbles.tech · 2 months ago

    Okay, I’m definitely not defending them because they’ve been annoying me a lot lately, but I personally did disagree with this bill.

    Essentially it made it a pay-to-enter contest for AI, where the bar for entry was that you had to be a mega-tech-company to get to play with AI. Because we all trust Google and Facebook to play nicely. So startups wouldn’t be allowed to train AI models (more or less, wording is vague), but big tech is “trusted” to uphold the moral standards and can do so. So, yeah, glad the bill was vetoed.

    We do need regulation, but that one was just non-competitive bullshit.

    • LWD@lemm.ee (OP) · 2 months ago

      Did you read the first paragraph of what I wrote before responding to it? Because this…

      Essentially it made it a pay-to-enter contest for AI, where the bar for entry was that you had to be a mega-tech-company

      …is clearly not the case.

      It’s the opposite: The bill only affects huge companies, not small ones.

      And let’s use a little critical reasoning: Google opposed the bill. OpenAI opposed the bill. Amazon opposed the bill. The biggest megacorporations sent their lobbyists to stop this bill from getting passed. Do you genuinely think they were acting against their own collective self-interests?

      • chicken@lemmy.dbzer0.com · 2 months ago

        It’s not actually clear that it only affects huge companies. Much of open source AI today is done by working with models that have been released for free by large companies, and the concern was that the requirements in the bill would deter them from continuing to do this. The “kill switch” requirement in particular made it seem like the people behind the bill were either oblivious to this state of affairs or intent on forcing companies to stop releasing model weights and only offer centralized services like what OpenAI is doing.