How big new AI regulatory pushes could affect open source

GitHub’s chief legal officer weighs in.

Shelley McKinley


With governments around the world currently deliberating new laws that could define the future of AI regulation, one of the common sticking points is open-source AI.

Policymakers and regulators are grappling with questions like how they can avoid stifling a culture of open-source innovation, whether open source is safe when it comes to massive AI models, and what open-source AI even is, exactly.

As chief legal officer at GitHub, a platform home to tens of millions of open-source projects, Shelley McKinley’s job description includes working with lawmakers to ensure the interests of open-source developers are represented in policy discussions.

Tech Brew spoke with McKinley about the potential effects of the EU’s AI Act and California’s AI safety bill, the effort to define open source, and the possibility of eventual federal legislation.

This conversation has been edited for length and clarity.

There has been a lot of talk among open-source developers about the effects of California’s AI safety bill, which is now awaiting a signature from the governor. Do you think that lawmakers have done enough to address the issue?

We’re still waiting to see what’s going to happen with all of that as it goes to signature and with the two different California bills [SB 1047, which mandates guardrails for big foundation models, and AB 3211, which requires platforms to detect, label, and in some cases block AI-generated imagery]. There’s going to continue to be a need to weigh in as this moves forward, whether it ultimately goes into law as written or more improvements are made along the way. That’s the kind of thing that we would be monitoring. We held an open discussion at GitHub in San Francisco about it to ensure that we heard voices from the community and from the policymakers.

What are some of the concerns you heard with the bill?

We’re going to be looking at what impacts open source, specifically in a negative way, and at how those bills get created to provide transparency and any number of requirements around things like watermarking…[That] doesn’t mean it needs to be easy, but it needs to be clear, so we always understand what it is we need people to do…And so those are the kinds of things we work through in the details of the bill at the working level: are there things, like consent banners that are required to be shown, that simply are not going to work in the way the system works? So, that’s what the focus of our work generally is: how do we make sure it’s actually going to work in practice? How do we move from the bill, in its theoretical state of all of the things it’s trying to promote [or] prevent, to something developers can actually implement, so that we aren’t blocking innovation? We’re instead taking a safe, responsible approach where innovation can continue.

Has the EU’s AI Act done enough to carve out and promote open-source AI in GitHub’s view?

I think that still remains to be seen. I think the great news is that there’s something in the EU AI Act. But of course, as much as we like to think that the EU AI Act is final and done and shipped, and now we’re all focused on compliance, there are a whole bunch of things that still need to get written. So, codes of practice, etcetera, those all need to get actually finished before people can implement over the next couple of years. So, that’s where we would work to ensure that as those things move from the legislation that’s passed, they actually are, in the end, able to be implemented by developers.

Will efforts to formally delineate open-source AI, like the Open Source Initiative’s recent definition, help with conversations about regulation?

I think that’s headed in the right direction in terms of how do you make things as transparent, open, and open source as possible, while also trying to avoid some of the pitfalls that could come with, for example, requiring that data sets are completely open publicly. So, I think that’s a good development. And it’s funny, it’s a question I get a lot: what is the definition of open source? There are so many opinions about that, and certainly, people like to take the opinion that most suits what they’re pushing at the moment. So, it’s nice to see standards organizations and things that are helping set the level for everyone out there to take those kinds of positions.

Do you expect other states to follow the example of California’s AI safety bill?

We’re going to see a slew of these. I think there’s no doubt that the regulatory pressure around AI is not going to subside over this year. I do think we’re starting to see and think about more of a shift to societal resilience as a concept in terms of, “What do we really need to do to get ready for AI?” There is the regulatory component, and that’s an important one, but there’s also all of the investments that we need to make as a society to ensure that we’ve got skilling right so people could participate, to ensure that we’re thinking about…cyberattacks using AI. So, how are we going to make sure we have AI defenses up to that?...So, I think we’ll see some of this shift, starting to take a broader look at the whole ecosystem, versus just the narrow regulatory focus we’ve had to date.

Will we eventually see regulation on the federal level?

That’s hard to say. We’ve been waiting on a comprehensive privacy bill for a long time in the United States, and that has not come to fruition. Will there be a comprehensive AI [bill]? I can’t predict the future. I don’t necessarily expect it in the next year. You see a lot of states popping up—I mean, that’s the same thing we’ve seen in privacy—this is not really a new playbook in the United States. And the Europeans are out ahead on the AI Act, the same as [General Data Protection Regulation], and GDPR has become essentially the de facto standard. So, we’ll see where ultimately the implementation goes in the EU, and how they’re going to figure out who the actual regulators are. Is it the privacy regulators, or are there others? They’ll probably have a mix, but my guess is that will continue to be, at least for the foreseeable future, the leading real regulatory effort. But surprise me. We’ve got a lot coming up with the election, so we’ll see.
