Overturning of Chevron Deference’s Impact on Cybersecurity Regulation
2024-08-20 18:30:59 Author: securityboulevard.com

Catch this episode on YouTube, Apple, Spotify, or Amazon. You can read the show notes here.

Welcome back to Adopting Zero Trust, or AZT. In our latest episode, we assembled a distinguished panel to dig into a timely topic that affects the cybersecurity landscape but still has the fog of war wrapped around it. Today's conversation centered on recent developments in cybersecurity regulations and their potential impacts, ignited by the Supreme Court overturning Chevron deference. The decision has potential implications for all types of regulation enforced and shaped by federal agencies, but our focus here is on cybersecurity, privacy, and AI.

The Panel

We welcome back Ilona Cohen, Chief Legal and Policy Officer at HackerOne, who joined us last year to discuss the National Cybersecurity Strategy. Ilona is also the former General Counsel of the Office of Management and Budget (OMB). We are also joined by the GRC meme king, Troy Fine, Director of SOC and ISO Assurance Services at Gills Norton. Beyond the memes, Troy takes a practical perspective on regulations and acts as our voice for those who may be most immediately impacted.

Key Takeaways

  • Chevron Deference overturned: The Supreme Court's decision removes the requirement for courts to defer to federal agencies' interpretations of ambiguous statutes; interpretation now rests with the courts themselves.

  • Increased regulatory uncertainty: This ruling may lead to more challenges to existing and future regulations, potentially affecting cybersecurity and AI policies.

  • State vs. Federal regulation: The uncertainty at the federal level might prompt states to act more quickly on issues like AI and cybersecurity, potentially creating a patchwork of regulations.

  • Impact on AI regulation: With about 40 federal bills addressing AI in the pipeline, the ruling could complicate the process of creating comprehensive federal AI regulations.

  • Cybersecurity implications: Existing and proposed cybersecurity regulations, such as the Cyber Incident Reporting for Critical Infrastructure Act, may face new challenges.

  • Business concerns: While some business organizations applauded the ruling, the resulting regulatory uncertainty could be problematic for companies trying to plan and comply with regulations.

  • Expertise concerns: There are worries that courts may lack the technical expertise to make decisions on complex technological issues like AI without deferring to agency experts.

  • Potential for innovation: The regulatory uncertainty might create a wild west period for AI, potentially fostering innovation before more stringent regulations are imposed.

  • Self-regulation importance: In the absence of clear federal regulations, industry self-regulation initiatives may become more significant, especially in rapidly evolving fields like AI.

Editor’s Note

AZT does not take political stances, but as you’ll quickly hear in this episode, we do switch between legal, business, and some political opinions. Frankly, it’s difficult not to impose any bias on a discussion that has political implications. Still, I just want to reiterate that this is an important issue for CISOs, GRC teams, engineers, and cybersecurity teams who support maintaining compliance.

Revisiting Chevron Deference and Its Implications

This week, our primary focus is the Supreme Court's recent overturning of Chevron deference, a legal doctrine established in 1984 that directed courts to defer to federal agencies' interpretations of ambiguous statutes as long as those interpretations were reasonable. Ilona Cohen provided a comprehensive breakdown of this landmark change and its implications for the regulatory landscape.


The overturning signals a shift in which courts must now interpret statutes independently, likely leading to more regulatory challenges and greater uncertainty. This decision could notably impact sectors with rapidly evolving technologies, particularly cybersecurity and AI. One of the primary concerns is whether decisions will rest on agency expertise or on legal authority alone, and that is where politics comes into play.

State vs. Federal: The Regulation Tug of War

Neal raised pertinent questions about the potential shift of regulatory power from the federal level to the states. Ilona and Troy discussed how this shift could lead to a patchwork of state regulations, complicating compliance for businesses operating across multiple jurisdictions. The conversation highlighted the balance needed between timely federal action and the proactive measures taken by states such as California and Texas, especially in areas like privacy and cybersecurity regulations.
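To make that patchwork concrete, here is a minimal, purely illustrative sketch (not something discussed in the episode) of how a GRC team might track overlapping state obligations against internal controls. The jurisdiction keys, requirement labels, and control IDs below are hypothetical placeholders rather than real statutory text.

```python
# Illustrative "regulatory crosswalk": hypothetical state obligations mapped to
# the internal controls believed to satisfy them, plus a per-state gap report.
from collections import defaultdict

STATE_REQUIREMENTS = {  # hypothetical obligations, keyed by jurisdiction
    "CA": ["breach_notification_72h", "consumer_data_deletion"],
    "TX": ["breach_notification_60d", "annual_risk_assessment"],
    "NY": ["breach_notification_72h", "annual_risk_assessment", "ciso_designation"],
}

CONTROL_COVERAGE = {  # hypothetical internal controls and what each claims to cover
    "IR-01 incident response runbook": ["breach_notification_72h", "breach_notification_60d"],
    "PR-04 data subject request workflow": ["consumer_data_deletion"],
    "GV-02 yearly risk assessment": ["annual_risk_assessment"],
}

def coverage_gaps(state_reqs, controls):
    """Return, per state, the obligations that no current control claims to cover."""
    covered = {req for reqs in controls.values() for req in reqs}
    gaps = defaultdict(list)
    for state, reqs in state_reqs.items():
        for req in reqs:
            if req not in covered:
                gaps[state].append(req)
    return dict(gaps)

if __name__ == "__main__":
    print(coverage_gaps(STATE_REQUIREMENTS, CONTROL_COVERAGE))
    # -> {'NY': ['ciso_designation']} with the sample data above
```

Multiply the handful of entries above by fifty states, a few federal rules, and the EU, and the maintenance burden the panel describes becomes obvious.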

AI Regulation: Navigating Uncertainty

The panel explored the burgeoning field of AI, where regulation remains fragmented and inconsistent. With around 40 bills in the pipeline addressing AI at the federal level, the discussion underscored the complexities of regulating a technology that evolves faster than legislative processes.

Businesses are left navigating a gray area, experimenting with self-regulation frameworks like responsible AI initiatives. Neal pointed out the historical context of the Wild West days of the internet, suggesting that a similar period might be necessary for AI to foster innovation before stringent regulations are imposed. Meanwhile, companies that plan to offer their AI capabilities in the EU will still have to comply with the newly passed EU AI Act. This is similar to compliance with international regulations like GDPR.

Sector-Specific Insights

The discussion also touched on sector-specific regulatory approaches. The payment card industry and the advertising sector were cited as examples of successful self-regulation. Ilona emphasized that while some industries can self-regulate effectively, others, particularly those critical to infrastructure, might require more stringent oversight. The Colonial Pipeline incident was highlighted as a pivotal moment that underscored the need for comprehensive cybersecurity standards.

The Role of Agency Enforcement

A significant part of the discussion revolved around the role of agency enforcement actions in the absence of clear regulations. Ilona argued that proactive rulemaking by agencies, despite its challenges, provides preemptive clarity for businesses, in contrast with reactive enforcement actions after an incident.

Troy added that industries with well-established self-regulation mechanisms, like the payment card industry, can serve as models. However, replicating this success across different sectors remains challenging due to varying levels of maturity and complexity.

Future Scenarios: Best and Worst Cases

The panelists shared their perspectives on potential future scenarios.

Best case: Congress acts to provide clear delegations of authority to agencies, enabling them to create precise, informed regulations.

Worst case: A cacophony of state-level regulations leading to a compliance nightmare for businesses and inconsistent enforcement from various courts.

Thank you for joining AZT, an independent series. For more detailed insights from our episodes, visit adoptingzerotrust.com and subscribe to our newsletter.

Show Transcript

This transcript was automatically created and is undoubtedly filled with typos. As usual, we blame the machines for any errors.

Elliot Volkman: Hello and welcome back to Adopting Zero Trust, or AZT. Today, we have a wonderful panel to talk about a very timely, but sort of what-if, situation. And I say what if because it remains to be seen how the situations that are unfolding are going to impact cybersecurity regulations and everything else that wraps around that.

However we do have a fantastic group of folks who can help explore the different aspects of this. And I will reintroduce a couple of folks. I don’t have to introduce you to Neal. I hope you know him by now and his voice.

Neal Dennis: You did your own little stint there for a while without me, but we got it.

Elliot Volkman: Hey, you went on vacation. That’s

Neal Dennis: go on vacation and I left you alone and you did just fine. You did great.

Elliot Volkman: I had to basically bribe people with hats. That's the only form of payment that we have available, since we make zero dollars. So just put that out there. Anyways, let's go ahead and start with Ilona. Your background is just insane. What's the best way to describe it? You work over at HackerOne.

Everyone should probably be familiar with that brand, but you have a nice history beyond that. So what’s the best way to position your background as it relates to this conversation?

Ilona Cohen: Sure. But before we get started, I’m a little disappointed that apparently you bribe people with hats and I don’t have one yet. So we’re gonna have to remedy that.

Elliot Volkman: To be fair, I only got them six months ago and I feel like we had you on a little bit before that.

Ilona Cohen: All right, fair enough. So I'm the Chief Legal and Policy Officer of HackerOne. HackerOne, if for some reason you don't know what that is: we will hack you and try to fix your vulnerabilities before the bad guys do. That's the short version. My background is I've spent over a decade in the government, including in the White House Counsel's Office for the Obama administration. And then I was the General Counsel of the Office of Management and Budget, which, relevant to this discussion, is the agency that handles all rulemaking in the administration. So all the rules come through there.

Elliot Volkman: Perfect. All right. Troy, you're a newcomer to this. What is your background? Who are you? And then I'll actually probably talk about the topic, which will add some more context for why we have this panel.

Troy Fine: So I am Troy Fine. I am the Director of SOC and ISO Assurance Services at Gills Norton. We are an audit firm, a CPA firm, that performs SOC 2 and anything in the compliance alphabet; we're doing those types of services and audits. I've been doing this for probably 12, 13 years. I really started my career as an auditor and kind of just built my career on auditing, and then in the cyber world, which is a little different, I think, from other cybersecurity practitioners.

But I also post a lot of memes. So if you follow me on LinkedIn, I'm starting to think I might be better at memes than at auditing, but that's still up for debate. But I started to realize that I might try to make a career out of that. We'll see.

Elliot Volkman: That is true. And that's, in fact, why I was trying to build a podcast around you. Certainly 40,000 people can't just care about your memes. They gotta care about your opinions. Or maybe it only is in meme form, and

Troy Fine: Opinions that people can, opinions are hard. Memes are easy. So I try to stick to the memes.

Elliot Volkman: Alright, totally fair. So that brings us to the topic of today, which is, again, very timely. It could change by the time we actually hit publish on this thing. So just fair warning. I'm not going to say the exact date because then I will get in trouble for something else that I currently do on the side.

We’re not going on that topic though anyways, so we’re gonna be talking about the recent Supreme Court, overturning of the Chevron deference which is I don’t wanna try to explain this. We have an expert for this. That I’m handing this over to you. Maybe you can contextualize this and then the rest of us can ask a couple idiotic questions to bring it back to reality.

Ilona Cohen: Sure. Happy to try. So at the end of the Supreme Court's term, there were a couple of opinions, actually, that overturned a bedrock legal doctrine known as Chevron deference. And what the Supreme Court held was that courts could no longer give deference to federal agencies' interpretations of federal statutes.

So under the Chevron doctrine before these cases, which had been in place since 1984, courts would defer to federal agencies' interpretations of ambiguous statutes as long as those interpretations were reasonable. That is no longer the case. Now it's up to each court to read the statute anew and make a determination about whether the agency's interpretation is the best interpretation.

And I know the headline says, you know, AZT, Chevron deference, but I would also like to mention that there was another case that is just as important as Chevron, and that you have to look at the two together to really get a full understanding of the impact of the court's actions.

And that is the Corner Post case. And that is important because, until that case, there had been an understanding that companies, plaintiffs, had up to six years after a rule was finalized to be able to challenge that rule. And that was always understood to be six years after the rule was complete.

And now the Supreme Court has said no, it's six years after the injury has been discovered. So this puts basically every federal regulation on the potential chopping block.

Elliot Volkman: Yeah, just a few potential impacts for our space. Not to mention there's still a lot of things that we're currently working on. But maybe we can start with the big hot topic, which is AI. There was a headline, probably just this week, or let's say middle of July is the safe way to label it, about the other administration, which may or may not make it in, planning to basically pull out all the stops and make AI regulation a little more friendly towards us. And I think that's how the EU AI Act has positioned some of their stuff. But that is just one aspect of things that we're already hearing about. So right now there is no federal AI regulation, correct me if I'm wrong.

But there’s 40 bills in the works. There’s a pipeline where there could be some impact. I’m curious from your perspective, like From all the things that are in the works today, how do you feel that some of that could impact especially on the cybersecurity front where people are abusing it for increased social engineering and kind of ransomware bail building and malware and some of that stuff.

Do you have any visibility into how you feel like pulling out the stops could increase risk or something to that extent?

Ilona Cohen: Sure. So following the ruling, or the set of rulings, the conventional wisdom is there's going to be a lot more challenges to all rulemaking going forward and a lot of regulatory uncertainty. And that's in part because there aren't that many clear rules or clear statutes

on either cybersecurity, really, or AI. You're right that there are a lot of pending bills right now, so there's potential for future law, but Congress doesn't move very quickly. As I'm sure everyone on this call knows, they move very slowly. And even when they do move, they don't actually have the capacity to necessarily prescribe a certain set of standards that will keep pace with technology. So no matter what, anytime an agency has to interpret a law in order to make a relevant and timely rule, they're going to have to rely on something that's ambiguous. And that just means it's going to be more and more likely to be challenged.

The outcome is not necessarily good for business, because although there are many business organizations that applauded the set of rulings, the reality is the uncertainty when it comes to regulation could, in fact, be very bad for business. Companies rely on the fact that they understand what the regulatory outcome is and can plan for it.

And in this case, you’re going to see something very different. You’re going to see uncertainty when it comes to the existing set of rules, but you’re also going to see uncertainty when it comes to future rules. So it’s going to be very challenging and this is actually something AI in particular, something that came up in the oral arguments when the Supreme Court considered this.

Justice Kagan mentioned AI, and she talked about the fact that it might be something courts really might not get right, because they don't have the expertise necessary to make a decision about how the statute should be interpreted; that you need to be able to rely on the agency experts who have more of the scientific or technical expertise necessary to promulgate the rule.

Elliot Volkman: Okay, so that certainly makes sense. If I was to summarize that: we're entering a state of significantly increased uncertainty, which means both positives and negatives can come from that. There's more gray area for people to explore things, but without the guardrails that are being put up elsewhere, it could create some other scenarios.

Neal, I’m gonna maybe throw that over to you a little bit. Feel free to jump on Soapbox, cause, I don’t know, there might be something there, right? With the framing of, yes, we, we know that even on the federal side, probably not cybersecurity experts, let alone AI experts. Yeah. What are your thoughts?

Neal Dennis: An initial quick curiosity question before we go down the AI particular rabbit hole: does this impact the federal government's ability to push things down to the state level, or does this give more power to the state level to impact regulatory things like this in their own boundaries?

So there’s a point behind that when we get there. I’m just curious, does this empower the states a little bit more potentially, or possibly push more requirements down on the states to have their own guidelines for stuff AI or other things of that nature?

Ilona Cohen: I think it’s the way I see it, this uncertainty at the federal level may prompt states to act. In a faster or with greater breath than they might have otherwise. And that also creates a problem for industry, as because, all of us who are trying to deal with privacy law, for example, understand, you have to deal with California and that’s different than Illinois and that’s different than New York.

And each one of these states has a different set of standards, which makes it really difficult for industry to try to comply with the differing views of the state lawmakers, and then it's sometimes better for the federal government to act. But here, when the federal government is going to be potentially in a state of turmoil, it might prompt faster action from the states.

Neal Dennis: So that makes sense to me. So the reason why I ask is because you started down it a little bit. When we think about AI regulation or privacy regulations or in the state of Texas at the moment, this is a big thing here where they’re enforcing privacy regulations around the adult entertainment side of the house.

There’s things that they’re putting into play that are impacting that, which is living two minutes away from Austin. And They’ve literally picketed the crap out of that, which is hilarious, but different topic, different podcast. So I asked because At the

Ilona Cohen: Yeah, I’m not on that podcast, by the way. Don’t invite me, please.

Neal Dennis: But the reason why I'm curious about it is because Texas as a state, I think, has a little bit more impactful cyber regulation.

Forward thinking or negative, either way you wanna look at it, whichever aspect: they do a lot of really neat things in the cyber world. They've done a lot of weird things that are just annoying from a regulation standpoint. So when we think about AI, because now, as everybody knows, we have SpaceX and Twitter formally being relocated here, and Elon's position on AI regulation, all the other stuff, whatever it is. So back on AI, I'm just more curious about whether that seems like a technology that could be impacted at these regional levels while we wait for the federal government to do things. I feel like AI as a construct, and security, is too much of a global thing for a state to really try to wrap something around without it being hardcore challenged every day of the week.

But that’s what my personal fear is. And so I’m just curious, federal regulation versus state regulation around things that seem a little more global impactful, especially the AI topic.

Ilona Cohen: I do think you will have other bodies, like international bodies, state bodies, that are going to act in the face of uncertainty at the federal level. I do think, by the way, the federal government is going to try to legislate, but they move so slowly, and whatever they come up with on the federal front in legislation is going to need the interpretation of the experts at the agencies, and that subsequent interpretation and application to real-life standards, real-life circumstances.

That’s what’s going to be really under the gun here when we’re talking about the post Chevron. World. There are a couple of things not just an AI, but also in cyber like Circea, the, cyber incident reporting for critical infrastructure act, which is currently a proposed rule.

Now, there are other, similar critical infrastructure rules coming up as well that are trying to impose cyber standards using laws that never even dreamed of the word cyber, let alone regulating it. Those are all going to come under the gun and are going to be subject to a lot of scrutiny in the courts.

And not only are we going to have to rely on judges who may not know the first thing about these topics, you’re also going to have different judges in different jurisdictions, reaching different opinions, which is going to lead to I’m not going to say a state of chaos, but certainly uncertainty that’s going to make it very difficult for companies to know where to invest and how to, make sure that they’re actually going to comply with what ultimately becomes the standard.

Neal Dennis: I can definitely see a lot of escalations going unanswered up towards the top of the food chain just because of the scale of how many will probably happen. You talked about,

Elliot Volkman: Troy, you deal with these all the time. I'm curious, from your perspective, how do you untangle that when you have folks coming to y'all from whatever perspective they have? Is there a standard talk track? Like, how do you untangle the 30 different overlapping regulations?

Troy Fine: You don’t. That’s the short answer. I think. I don’t think you can, especially when it comes to privacy, right? As we were talking about before with all the state laws and if you’re a global company, you have to deal with country specific privacy laws. So it’s not just in the U S it can be all over the world, right?

There is no, you can’t, right? If you look at meta, look at Google they have, that they have privacy teams, but like they have budgets for dealing with privacy Violations. They’re budgeting for that because they know it’s impossible to do it with and you’re gonna spend more money trying to comply than you are in probably, paying the bills to the governments, right?

I don’t think you really can, I think you can only do your best effort and hope you’re not doing anything negligent that would cause somebody to take a closer look. And that’s what they try to do so

Ilona Cohen: Think about what courts would do with something like quantum computing, right? There's rulemaking that's expected, and do we really think that Congress is going to get that right, that they'll do it in such a prescriptive way that it will be without ambiguity and therefore without legal challenge?

And even if they do, do we think that the courts are going to be able to interpret whether or not the agencies got that right? I'm not suggesting that the agencies should be without any checks on their authority in some of these areas, because we want rules that are fair and not burdensome, not overly cumbersome for industry, and that are tailored really narrowly.

But on some of these more complex challenges, like in the technology realm, AI, and, like I mentioned, quantum computing, do we really think that the courts can get it right? And this is again where Justice Kagan's dissent, writing for the minority, said this opinion took a rule that was based on judicial humility and turned it into one based on judicial hubris, suggesting that the court really thinks it can do no wrong, that it will be in the best position to decide all of these questions.

And I really do question that. They can't; it's impossible because of the rate of change, right? You can't write a law in technology or cyber or quantum, whatever you want to call it, that covers it all. It changes too fast. Five years from now, it's going to be completely different. Six months from now, it's going to be completely different.

Neal Dennis: And I think, maybe, for me personally, that's the pro to this: indirectly, just by sheer volume, deregulating things. And this is the libertarian side of the conversation saying, federal overreach on things that weren't clearly defined constitutionally, blah, blah, blah, blah, blah.

Everybody pick your political party, figure out what you want to talk about there. But I see potential, especially in the tech world where, nineties, early two thousands wild West. In the grand scheme of things, you had hackers, crackers, white hats, black hats, pick a flavor of the day, doing all sorts of things and not really having a lot of legal structure to put them into prison or to do things when they were blatantly being malicious until the court hearings back in 2007 or 8, whatever it was, when the gentleman in Austin went to jail for the first ever spam abuse case.

I don’t, Big thing, fun stuff lived down the road from him when it was happening. It was fun to watch them roll them up. I didn’t know what was going on, but that set precedents for what it meant to be a spammer and take up internet bandwidth. So they wrote legal structure of some sort around him to arrest him.

Post his arrest to capture other people like him. I like the wild West and the tech, a lot of fun things get done personally. Yes, I still need to know who to arrest and throw in jail, but when it comes to developing tech, I think personally, less regulation equals better innovation down the road, you throw some guidelines on it relative to what it means to do so maliciously.

But I think they still need to be a little more open ended, especially when the tech is just getting there. I like the approach of the movie theaters and the music industry: self-regulation. And as long as you do it correctly, then the government should have no need to come in and slap you around and tell you what to do.

That’s my personal

Ilona Cohen: So you’re, there’s one, first of all, I don’t think of this as a political issue. I know that you just mentioned politics, but I think of this as just a, like an understanding of how you view the world. But there’s one assumption that you made in your statement that I think is false. And that is that less regulation means less burden on business.

And I actually think that in the absence of rulemaking, and the challenges associated with rulemaking, you will not see agency inaction. You might see more agency enforcement action. So a different form of action. It will depend on the agency that you're talking about and their authorities and how they use those authorities.

And I’m not suggesting even that those would be without any. Challenges. But the difference is you’ve got in rulemaking. You’ve got an agency telling you in advance. Actually, not you telling everyone in advance how they perceive this. Statute and its interpretation of the statute before you act, whereas agency enforcement action, you only see that to a single party or a set of parties after you’ve already acted.

And that’s why I prefer that. You, as an industry representative, I would rather know in advance how the agency perceives something, how they view the interpretation of a particular statute before action rather than after action. I don’t necessarily agree that, less regulation means less burden.

Neal Dennis: I definitely agree with that. I don’t agree with that side. So I apologize if that’s how it came up. I think burden of labor and burden of proof, all that fun stuff falls on the private side. Whoever’s managing, developing, running XYZ technology. Until the agency does step in and say, we think something’s wrong.

And I think once again, back to the old internet, that’s loosely what happened initially. We had people early nineties, eighties, early nineties, doing all sorts of stuff, positive and negative, no regulation blatantly there. So we had to rely on, in the early two thousands on someone like Time Warner to say you’re going against our personal business bylaws for usage and access.

And that’s what they wrote him up on was one of the big things was he was using someone else’s Wi Fi along with his own, and they just put him in court initially for some kind of bandwidth stuff and illegal usage of Wi Fi per private guidelines, whatever it was. And then they wrote other laws about the consumption and tier one ISP utilization rates and all this other stuff for him.

And I personally like that. But I think it's on the persons doing the work to understand where they could potentially go wrong, and on the industry that's doing the work to initially self-regulate, until they reach a boiling point where the agencies come in and say, hey, WTF, you're doing some things that we don't necessarily agree with anymore.

And then let’s duke it out in the courts. And maybe that’s how the law comes into existence. I don’t know, but I like self regulation at the industry layer as a starting point.

Ilona Cohen: What about critical infrastructure? Think about critical infrastructure's lack of cybersecurity adherence, cybersecurity regulation or self-regulation, right? There are times when failure to adhere to basic cyber hygiene leads to entire sections of our economy shutting down; Colonial Pipeline is, of course, what I'm thinking about.

And that impacts not just the industry, not just the company, but an entire region and an entire economy. And so that is the basis for the cybersecurity national action plan and this administration saying, we need to level the playing field, and if you're not going to get there yourself, we're going to read these statutes in order to make sure that you do protect not just yourselves, but all of us. So I'm curious, I know you're the podcast host, not me, but I'm going to turn it back on you and say: do you think that those companies that have completely ignored cyber, to the detriment of an entire economy, should just be able to continue to proceed as normal?

Eastern seaboard be damned. I’m just going to do what I’m going to do.

Neal Dennis: no I think, so that’s a very valid question. So I think in that case, we had, we already had two, both federal and private regulation bodies that were at play and they ignored both. So I think under a private industry perspective the hilarity was that Colonial Pipeline

me in

had some issues in the private sector of things with their cohorts and their partner services and things like that.

We’ll leave it at that verbally, but they obviously had issues there. So they were paying the penalties before they even went before the federal courts. Once all this happened with their private sector partners. And I think that’s very important as an initial base. They did something wrong.

The private sector, those other businesses that they were with and doing things with, regulated them as much as they could at the time when that was happening, and more or less impacted the prolonged business once things were back up and running financially. And then the federal courts came in and saw obviously what had happened.

They saw what was going on, and I think that was a good tell, or a moment, to make a new regulation or law for how that happened. And yeah, I think you're going to have key turning points like that that lead us down the road where government oversight should be applied, or a review of what current oversight is there and where you missed the ball from the private regulations that your industry vertical may or may not have had.

And yeah, I think it’s a very fair question. And I think that is one of those milestone moments where someone needs to get smacked in the face and then government oversight can review what’s already been going on. But absence of

Ilona Cohen: The issue is that courts are no longer really able to defer to a federal agency that makes a calculation based on common sense, right? The agency is generally supposed to come up with a reasonable interpretation. And it might be reasonable, from my perspective, from your perspective, to finally get to some kind of a regulation that would prevent another Colonial Pipeline from happening, but that's not the calculus anymore.

That’s no longer the

Troy Fine: To clarify, are you saying... because there's this thing that the TSA is the one that oversees the pipelines, and they interpreted a law based on this doctrine to come up with rules. Are you saying that they wouldn't be able to do that anymore because of that?

Yeah.

Ilona Cohen: look, I have not specifically looked at that rule to assess whether or not they would have the authority to be able to move forward, whether they relied on a reasonable interpretation, or whether they relied on the best reading of the statute. But I’m just saying generally Neal is having an argument with me.

Or discussion, I should say, about whether rules are good or not, about when is the right time for regulation. That's the discussion we were just having. And I'm saying you can't really apply common sense anymore to when is the right time to have a regulation, because even though you and I might agree that it's time because there's such a significant impact, it's really all about the statutory construction and whether or not the statutory construction is the best. And in light of the fact that most of this administration's efforts to level the playing field when it comes to critical infrastructure and cybersecurity standards for critical infrastructure are based on old laws that very likely talk about safety more generically than cybersecurity safety, you could easily see a court ultimately invalidating those. It's just something to think about; I think that's our new reality, and that is

Troy Fine: Concerning, too. On the safety things, I know the FTC brings a lot of, they fine a lot of companies based on old rules that just talk about safety, and they're interpreting safety as being cybersecurity-related, when the law might've been written over a hundred years ago, could have been written 50 years ago.

The law talked about safety, but the FTC is using this to say, you didn't have the right cybersecurity protections in place, we're now fining you, blah, blah, blah. Or they're using it for privacy. So I think you're right. I do think it's going to impact cyber from that perspective, because how could you say that safety covers it? When they wrote that law, they didn't even know cybersecurity was a word, right?

And now you’re interpreting it to include cyber. I could see them saying that’s not allowed anymore, for sure.

Ilona Cohen: Sure. Now, that’s not always a bad thing, right? Yesterday, the court threw out most of the solar winds. they did it in part because the, they said as a matter of statutory construction, the way the SEC was interpreting its authority was wrong. So they used a law about internal accounting and disclosure controls.

To apply to controls relating to cyber security and the court said no, that’s not what the statute says. You don’t have that authority. I’m not sure if that had been decided before the Supreme Court’s decisions. I’m not sure it would have had the same outcome. So like I said, it’s not always a bad thing.

I don't know; I didn't look at the facts of that case well enough to know if that was a good thing or a bad thing, but the bottom line is there will be times when agencies completely get it wrong. And so there is a role for the courts there, but I do worry that most of the time they're not going to get the technical stuff right. And they're not going to get the sort of scope and breadth of the rule and its application. They're not going to be able to apply common sense.

Neal Dennis: And actually write a new regulation, a new law, a new rule, based off of current issues and concerns, and see how that goes at some level. I think for me, that's the big thing: instead of using 50-year-old policies to interpret things to fine or debilitate or grow (either way, it goes both ways), industry verticals should make sure we have as blatant, best-practice regulation as we can, understanding that the technology behind it is going to change in five years, but the fundamental compliance pieces, the fundamental security nature of it, stay. The tech targeting it changes, but the construct around cybersecurity and the things that you're expected to secure, like a zero trust environment and methodology and things like that, are a little less fluid, right, from a higher-level, strategic terminology perspective.

From a higher level, strategic terminology perspective. And so I think back on those, I think that’s for me, the big thing. You, when we leave it open to the agency to interpret, we have someone who’s more likely to get fined or brought up on charges and spend time in court for something that, doesn’t make as much sense.

And I’m not a lawyer. So this is all just me hoping to never go to court for stuff that I think I did. Statute of limitations is up on most of the stuff I did. That was bad. So we’re good. But that being said.

Ilona Cohen: I can refer you to someone if you need it.

Neal Dennis: I might. I have confessed to way too much crap on this podcast over the last two, three years, but nobody's knocked on my door again, so we're good. But that being said, I think there's a time for regulation. Think back on the critical infrastructure, key resources side of the house: if you're a large enough persona in something that's already been defined as CIKR stuff, you already should be expecting to have some kind of regulatory requirements, both from your direct industry vertical as well as the government. And I think that's a fair assessment to say within those echelons: congratulations, you've got federal oversight. For the rest of the things that haven't been defined, back on AI and quantum computing and things like that, for the time being I don't see a world where, at the start of the technology, federal oversight is necessarily a great thing, until we have some kind of milestone, landslide moment that does push something bad, to say, wow, you definitely stepped outside your bounds, congratulations, whatever you need to do.

I'm just curious, back on the questioning side of the house: we'll go back and forth with our personal views all day long. Not a bad thing. We should just have a podcast where we literally just get the most divisive topic and get three, four people in a row; I don't care what it is.

Ilona Cohen: We’ll just take Neal’s view and say this is the new standard that the courts have to apply. Whatever he thinks is right and is appropriate for federal regulation shall be regulated. And whatever is not appropriate according to Neal shall Is that’s I think what we

should do.

Neal Dennis: The whole world will go to pot. I will say, the one thing that recently happened here in Texas: a federal judge struck down the private distillation licensing requirements by the federal government. I make wine part time, so this is exciting. But it might actually be, and this is why I'm talking about it, indirectly related to this, because they just made the announcement this week that the federal government, the ATF, is only allowed to apply licensing regulation for stills to actual production, commercial facilities, and that the home production of any alcohol should be non-regulated within the confines of the quantity that they allow, to include distilled spirits or something like that.

Anyway, that’s exciting for me. But on that note, from a compliance perspective, again, moving forward I was very curious back on GDPR y type things, and we touched this at the beginning, when we think about state empowerment to do these things, do y’all see maybe where. Where with California’s privacy laws and the ones that are happening in Texas and a few other places, do y’all see those as possibly moments in time that could push federal regulation a little farther, whether that’s, zero trust compliance whatever it may be, but do y’all see that, more of a grassroots movement at a state level being able to help guide federal rulings and structure in law versus waiting for the federal government to do things again?

Ilona Cohen: I think the biggest impact, I don't know, Troy, you can answer because you have to comply with these laws. But just one quick answer for me: the biggest issue for some of these companies is not in the United States, it's in the EU, because they're moving so much faster than everybody else. So multinational companies are obsessed slash focused on what they have to do to comply with the laws that are so far ahead of us. And that, if anything, I think would push the U.S. more than some of these states, given the breadth of the

Troy Fine: I would agree. I don't think the states are really going to push the federal government to do anything. If you look at the EU, there was a time recently where a lot of people didn't want to send data over to the U.S. because we weren't, whatever, I can't remember the terminology.

We weren't officially approved by the EU, under GDPR, to send data across the Atlantic to us. And so we came up with the new data agreement, the DPA, and then all of a sudden they were allowed to. It was a quick fix, but that is what's going to drive the U.S. to come up with a federal privacy law.

Businesses can’t get. Can’t be global and can do things because certain countries don’t want to send data to the U S or it’s too problematic for them to work with the EU. That’s, what’s going to drive the U S to change. Unless they can come up with a quick fix like they did and solve the problem. We were close.

We were getting there and then they did this quick fix and now it became a domestic bipartisan issue of why we don’t have a federal privacy law. And that’s probably not going to happen anytime soon.

Ilona Cohen: My goodness. There are so many reasons we don't have a federal privacy law. That's a whole other podcast that we could do for hours. Yeah, the EU really is moving quickly, especially when it comes to AI; the EU AI Act for sure is the focus of so many different companies. And I'm not sure you will ever see those large companies deal with self-regulation, because what they'll be doing is dealing with another country's or body's legislation or law, and that will force them to make changes in the U.S. regardless of whether the U.S. acts.

Neal Dennis: So thinking about that a little bit more: if we get to a point where we have some kind of formalized federal policy on this, do y'all think it's going to be very cut and dry, or do you see things like California versus other states, or whatever, challenging this for years to come, courtesy of the new rulings on Chevron?

Ilona Cohen: There is no federal privacy law, and I don’t think there ever will be a federal privacy law in my lifetime. Maybe that’s a little bit too much of an overstatement, especially given that there was an attempt this time around with Congress introducing the bipartisan privacy framework, but, we all saw what happened to that.

It died a slow death or actually died a pretty quick death, to be honest.

Troy Fine: Yeah, there’s not going to be

Neal Dennis: Troy, would you be fine if there was one?

Troy Fine: Probably not one like GDPR. There are privacy laws, right, that protect children and other stuff like that. Those exist, but I'm not expert enough to know how this would impact those types of laws. But I agree: a comprehensive federal one, like GDPR, probably not.

We need to start off small. We need basic protections at the federal level instead of trying to go all in on everything GDPR does; maybe basic rights to be forgotten and things like that. We could start off small at the federal level, but we don't normally; in the U.S. government we don't do the crawl, walk, run, we just run sometimes.

And that’s, what’s happening with privacy. They need to slow it down a little bit and get like the most foundational things Hey, we all agree. You should have a right to be forgotten. If you tell somebody, okay, let’s get that in a law, but yeah, it’ll probably be. Never before that happens. So there you go.

Ilona Cohen: Curbing the agencies' ability to act is going to have consequences, because Congress does not act with expediency, as we know, and they don't always act with the expertise necessary to make informed judgments. So that's suboptimal, and it's going to lead to a suboptimal outcome.

Neal Dennis: So one last question from me, back on that: best case scenario from y'all's perspectives? We hit on this a little bit throughout, but I'm just curious, a quick summarization. Worst case, best case for you, Troy?

Troy Fine: I don’t know if I can answer worst case, but best case for me would be that it actually accelerates the government to come out with cyber security regulations that are specific to cyber, right? If, with the, going back to the pipeline, instead of them having to interpret a old law to come up with new requirements for pipelines to protect.

Data, they should come out with a cyber security regulation where they don’t have to interpret or There’ll always be interpretation. Let’s but interpret less right not have to use the word safety To mean cyber security, right? So best cases that accelerates some of these things that are not in place That would be best case for me Accelerate i’m saying at the government’s pace, whatever that means right like instead of five years it happens in two years I don’t know but to me that would be best case please

Neal Dennis: Worst case, you get 50 states doing 50 different things.

Ilona Cohen: Best case is that Congress can fix this. It's not a done deal, in that Congress can delegate discretionary authority to an agency. So they could say, in the interpretation of this statute, we want this agency to have the final authority, and that's that.

And then this is not an issue and the agency can rely on its expertise and continue to go on as always. I don’t know if that’s the best case. Like the best case is always, tailored regulations, not burdensome, not cumbersome, like actually helpful to the economy, helpful to the security and the well being of all Americans.

That’s the best case generally always, but in terms of this precise issue, this is, this can be solved by Congress. I don’t see that happening anytime soon, by the way, but it can happen. The worst case is, as we’ve discussed at length. During this session, a lot of uncertainty, differing opinions, different rulings, companies not knowing what to do, sinking a ton of money into the like trying to understand how Texas rules on something versus how New York rules on something, and and then courts not being able to have the ability to issue an opinion that actually is a common sense one. So it’s hard to know what. Rules will come before these courts now without looking into all the different statutes, but you can imagine a circumstance where common sense is just not allowed to prevail in light of the restrictions now on the courts.

Neal Dennis: And I will agree on that last point. That is a concern. Whenever the rule finally has to be addressed, changed, adjudicated, wherever it goes in the court, I'm right there with you on the worry about that court and its ability to understand whatever that stuff is. That is a big concern.

So I’m, I appreciate it. I’m going to throw it back over to Elliot cause I definitely went down some oddities there. So I appreciate it. But I’ll leave it with this. If either of y’all want to help me interpret the Distiller’s ruling here in Texas, I’d appreciate that because I own one, but I would love to own one and be able to blatantly break it.

Say I own one without the federal government doing what they did here and calling up the club that was distilling stuff. So I would love help with that later, but anyway, moving over back to Elliot.

Elliot Volkman: Just just don’t blow yourself up or anything. Otherwise, I have to cancel the show and I don’t know how to handle that one.

Neal Dennis: I still have all my fingers mostly from the 4th of July. So we’re good. We’re good.

Ilona Cohen: You know, I would help, but my legal advice is generally worth a little bit more than a hat, and I didn't even get that. So I'm going to pass on the additional invitation to provide free legal advice. Elliot, look what you're doing to me, man.

Elliot Volkman: We just have to make these a rare batch start auctioning them off and then it’ll equate to

Neal Dennis: I want to also go on record

Elliot Volkman: both of us have to blow up at your production facility.

Neal Dennis: I want to also iterate, Elliot’s the producer. I’m just cast. I’m not even

Elliot Volkman: I literally just press the mute button on myself while I laugh. That’s my job. But I do have one question. I feel like Neal, this is into your waters for like self regulation, but it’s a dance between BS and not BS. And that’s where we get to these concepts of privacy by design, secure by design, shift left and some of these other things of like self regulation.

There are no hard guardrails that you have to follow; it's these concepts that are about doing the right thing because it's a differentiator and all that. Panel, what are our thoughts on that? Let's say we devolve into chaos with these situations and things just unravel.

Do we feel like there’s going to be organizations that stand to the occasion and start aligning more with these things to do the right thing? What are our perspectives on that?

Neal Dennis: I have a view, but I would love to ask Troy to answer first, because I feel like there is... it already is the wild west.

Elliot Volkman: Fair.

Troy Fine: Look at United Health Group, right? That caused mass problems. That happens probably once every four months; there's a major thing going on, it's causing problems, not all cyber-related, but CDK, same thing.

Dealerships were reporting that their financials were impacted for the quarter ending June 30th. They were actually reporting that, because of the outages, they couldn't make their sales. So we already are in chaos, and we're just trying to figure out how to live in the chaos right now.

So I think it’s only going to get more chaotic based on what we just discussed, to be honest, but.

Ilona Cohen: Yeah, I agree with that. There are some industries that are better equipped to self-regulate than others; take the payment card industry. They have a robust industry association that has imposed real regulation, real standards that everyone complies with, because they want to be able to offer payment services. But it takes a long time to get to that point, and the process is really messy.

And so with AI, in the absence of a national standard, or even a state standard, you've got companies trying to impose what they're calling responsible AI. They're trying to come up with their own version of what that means, and it's all over the map. And so you've got companies now, through contractual requirements, imposing things like definitions of responsible AI, and other companies having to comply with multiple different versions of what that means. And that is messy and it's time-consuming and it's expensive and it's hard to track, and so that's why sometimes a national or even state standard can be useful, because it reduces the cost to comply.

Neal Dennis: At its inception, as well as at its growth points to where it's at now, it's getting back to critical industry, key resource related stuff. On your anecdote, there's another comparable piece for advertising on the internet.

There’s a large company or large group called tag trust and accountability group. They are there specifically to regulate what it means to be a content media content provider in the advertising world on the internet. And they started off as a very fledgling foundation, like any good thing, sitting up in DC on the Hill, trying to get their pundits together.

And now, as a private industry group, they own the regulation for how to appropriately put things through content delivery providers and stuff like that on the internet. It started off as a US-based thing, but it's a global-impact regulatory body, a private regulatory body, now. And like with everything else, it did take them time.

They were born semi out of some of the stuff in the mid-2000s where a lot of threat actors were leveraging CDNs to push a lot of not-so-good things. So they had their moment there, their milestone moment that pushed them over the edge, and here they are, a self-regulatory body.

And I think that’s probably what helped keep the government out of those worlds and keep the Googles and Microsoft’s safe, but in a private trust group mentality. But to your point, that’s not always going to happen. I agree. That is not always going to happen. And in absence of that happening, especially in a timely fashion, when you have a breakdown, unfortunately, you’re going to get slapped legally somewhere and someone’s going to have to create a rule to make sure it doesn’t happen elsewhere.

So bust your butts to make some kind of private regulatory landmark for your group, and if it sticks, self-regulate. If it doesn't, congratulations, you're in the Supreme Court. So at least that's, yeah,

Elliot Volkman: That takes us to the end of the episode. So Ilona, Troy, thank you so much for joining. Neal, so glad you're back, so I don't have to talk to the wall. We will see y'all next time. And yeah, we're hopefully off a little hiatus for the summer, but otherwise expect us back to our every-other-week publishing schedule.

Neal Dennis: Are you putting this out before Black Hat?

Elliot Volkman: I don’t know. Hopefully.

Neal Dennis: See ya at Black Hat, if he gets this out in time.

Elliot Volkman: Oh yeah. We’re going to be at black hat. I think find us somewhere. We’ll figure it out.

Announcer: Thank you for joining AZT, an independent series. Your hosts have been Elliot Volkman and Neal Dennis. To learn more about zero trust, go to adoptingzerotrust.com, subscribe to our newsletter, or join our Slack community. Viewpoints expressed during the show did not reflect the brands, employers, or companies of our hosts, guests, or potential sponsors.

*** This is a Security Bloggers Network syndicated blog from Adopting Zero Trust authored by Elliot Volkman. Read the original post at: https://www.adoptingzerotrust.com/p/overturning-of-chevron-deferences

