[NOTE: No post about legislation is complete without a lot of acronyms representing lengthy and forgettable names of bills. There are three main ones that I talk about in this post:
CISA – the Cybersecurity Information Sharing Act of 2015 – a Senate bill that will likely go to a vote soon. The bill aims to facilitate cybersecurity information sharing and create a framework for private and government participation.
ICPA – the International Cybercrime Prevention Act of 2015 – proposed bill to extend law enforcement authorities and penalties in order to deter cybercrime. Proposed by Senators Graham and Whitehouse of the Senate Committee on the Judiciary.
CFAA – the Computer Fraud and Abuse Act – main “anti-hacking” law in the US. Passed in 1986 and updated a number of times since then, the law is considered by many to be out-of-date and in dire need of reform.]
[UPDATE - 5.25.16] Senators Graham and Whitehouse have re-introduced this amendment as S.356, which is slated to be discussed at a meeting of the Senate Judiciary Committee tomorrow (May 26th). The bill is mostly unchanged since I wrote about it last year – the only change seems to be the removal of the section on “Stopping the Sale of Americans’ Financial Information,” as that ended up being rolled into the Cybersecurity Act of 2015, which is what CISA eventually became when it passed at the end of 2015 as part of an omnibus spending bill.
Rapid7’s position is also unchanged – we’re still neutral on the language of the amendment itself and keen to see broader reform of the CFAA to address the underlying issues:
- Lack of clarity and definitions around “exceeding authorized access” and “accessing without authorization”
- Outdated terms (e.g. “protected computer”) and thresholds for prosecutions (e.g. the value of accessed information being set at $5000)
- Inclusion of civil causes of action, enabling technology providers to use the law to ward off researchers
Unless and until these core underlying issues are fixed, we do not intend to support any legislation that would extend authorities and penalties for CFAA prosecutions.
Finally, as detailed in the original post below, we agree that botnets are a very real problem, and creating clear guidelines, expectations, and requirements for law enforcement to address them is sensible. Still, we feel the section on “shutting down botnets” raises questions around how far the proposed authorities would extend. That section would allow the Government to compel companies via injunction to take unspecified actions against botnets. This might include demanding that a company hack a computer controlling a botnet, force an update to infected computers, or re-route infected computers' internet traffic to a site where the user can download a patch. There are a number of questions around the implications of this for the owners of victim machines – could the Government use this authority to access information on those machines? If the machines are rendered unusable through action taken to shut down the botnet, what recourse is there for the owners?
In addition, the statute that the amendment would modify – 18 USC 1345 – is broad and may permit any action, restraining order, or injunction that "is warranted to prevent a continuing and substantial injury to the United States or to any person or class of persons for whose protection the action is brought." One question raised by the recent (and ongoing) encryption debate is whether this authority could be used as the All Writs Act was in the Apple case. That might seem far-fetched, but it’s unclear, and may be a question worth asking during the Judiciary Committee’s meeting tomorrow.
[UPDATE - 10.21.15] CISA went to the floor for debate yesterday. Today we heard that the Whitehouse/Graham Amendment will not advance to a vote with CISA. According to The Hill, "Whitehouse has some parliamentary options to bring the amendment to a vote still at his disposal but any attempt would likely be blocked by opponents of the provision."
[ORIGINAL POST - 10.20.15] After several years of debate around a cybersecurity information sharing bill, it looks like we’re getting closer to something passing. Earlier this year, the House passed two information sharing bills with pretty strong support. The Senate-side proposal – the Cybersecurity Information Sharing Act of 2015 (CISA) – has been considerably more controversial, but is likely to go to a vote very soon, possibly even in the next few days. A number of concerns have been raised around this bill – mostly around privacy, data handling, and the way the Government can use information that is shared. These issues are hopefully being tackled in a number of amendments; Senate leaders unofficially agreed to a limit of 22 amendments, made up of the manager’s amendment (which is put forward by the bill’s primary sponsors), plus 11 Democratic amendments and 10 Republican amendments. You can see a full list of the proposed amendments here, thanks to New America’s fantastic round-up.
Included among these is Amendment 2713, proposed by Senators Whitehouse and Graham. This amendment has been controversial, and has raised alarm in the civil liberties and security research communities, resulting in a letter requesting that the amendment not go to a vote with CISA. While I was initially one of the most vocal about these concerns, I am now comfortable that the current language of the amendment, attached at the bottom of this post, does not have negative implications for researchers. (Caveat – this language is the latest available as of writing this on October 20, 2015. I reserve the right to change my position if the language changes.)
A brief history of Amendment 2713
The amendment is an abridged version of the International Cybercrime Prevention Act (ICPA) (also known as the Graham/Whitehouse bill). The full bill proposed eight key areas of legislative updates to extend law enforcement authorities in order to reduce cybercrime; the CISA amendment cuts this down to four main areas of focus:
- Extending authority to prosecute the sale of US financial information overseas
- Formalizing authority for law enforcement to shut down botnets
- Increasing the penalties for cyberattacks on critical infrastructure
- Criminalizing the trafficking of botnets and the sale of exploits to bad actors
These feel like pretty reasonable goals to me. In particular, updating the law to tackle the burgeoning botnet problem makes sense, and I like the idea of a clear legal framework for shutting down botnets. Yet, for a long time I was very worried about both the full bill, and the shorter abridged version of the Amendment. In fact, I even testified on the bill to the Senate Judiciary Subcommittee on Crime and Terrorism. For those that want to see the entire hearing, there’s a video here. This is probably the point at which I should remind people that I’m not a lawyer. (NOTE: the hearing was held before the bill was abridged and submitted as an amendment for CISA).
Most of my testimony focuses on updates to the Computer Fraud and Abuse Act (CFAA), the main anti-hacking law in the US. In the case of the Amendment, three of the four sections – critical infrastructure, shutting down botnets, trafficking in botnets and exploits – relate to the CFAA. As I’ve written before, this law is a big concern as it chills security research, which, in my opinion, results in more opportunities for cybercriminals. The CFAA is woefully out of date, and lacks clear boundaries for what is permissible – a situation exacerbated by the law containing both criminal and civil causes of action. The TL;DR of my testimony is that I was concerned that, far from improving the situation with the CFAA, the ICPA would actually make matters worse for researchers.
In relation to the Amendment, my particular concern was the language of Section 4 (though I also provided feedback on Sections 2 and 3). It was not much of a stretch for me to imagine a scenario in which a researcher disclosing a vulnerability finding at a conference might satisfy all the requirements of Section 4 as they were written in the original proposal, resulting in the potential for them to be prosecuted.
So what changed?
In a nutshell, the language changed. And it changed because the people writing it really LISTENED to our feedback.
Maybe you can sense my surprise here. Those in the security research community who are familiar with the CFAA have felt frustration over its lack of clarity and misuse for years, and in many cases we’ve come to expect the worst of the Government. Recently though, we’ve started to see a change. We saw the Department of Justice proactively address the research community on the issue of CFAA prosecutions at Black Hat this year. We’ve seen various federal agencies, such as the FDA and FTC, call for researchers to help them update their position on, and understanding of, cybersecurity challenges. We’ve even seen the introduction of legislative proposals designed to support and protect research, such as the Breaking Down Barriers to Innovation Act introduced by Senator Wyden and Representative Polis. In short, we’ve seen the start of a huge shift in attitudes towards security research in the Government sector.
Don’t get me wrong, it’s not all sunshine and kittens. A new bill proposal “To provide greater transparency, accountability, and safety authority to the National Highway Traffic Safety Administration” explicitly makes research on cars illegal, which is, at best, short-sighted, and puts consumers at risk. Similarly, the recent proposal for implementation of the Wassenaar Arrangement’s export controls on intrusion software highlighted that we in the research community have some work to do to educate the Government on the value, role, and practices of security research – work which I’m happy to say is now underway.
Yet my experience working with the offices of Senators Whitehouse and Graham highlighted that we are making progress and that there are people in the Government that genuinely value the role of security research in reducing cybercrime. Throughout their collaborative engagement, the staff mentioned numerous times that they wanted to ensure security research would not be harmed by unintended consequences of the language. OK, I know security people can – cough – sometimes be a little cynical – cough, so you’re possibly thinking that words are all well and good, but the proof is in actions. I agree. So it was probably around the time that I was reviewing redraft number 5 or so that I started to believe they may be serious about this whole “no negative consequences” thing.
Yeah, I know I sound like I drank some funky Kool-Aid while I was on the Hill, but today an updated version of the Amendment – attached at the bottom of this post – was distributed on the Hill. Again, a quick reminder that it might change, and if so, my position may change too, but at the time of writing this, the current language is pretty well crafted. I’ve spent some time vetting it with both lawyers and security researchers, and we really tried to find ways that the language would cause problems for research, but it looks solid. The drafters listened to all the weird examples and edge case anecdotes we threw at them, and even the skeptical jabs at the concept of “prosecutorial discretion,” and they wrote language that holds up against this scrutiny and does not create negative consequences for research – at least as far as I can see (and remember, I’m not a lawyer).
I’d like to pause here to thank the drafters for that diligence, and their grace in the face of some very blunt feedback.
So what does this mean for the security community?
In the short-term it means that a significantly better Amendment will go to the vote with CISA. I do wish it had gone through more public debate – it feels like that process was side-stepped – but if it’s going to go to vote, I’m very glad that the language has been updated as it has. Any debate that takes place around CISA is likely to focus on the core challenges around privacy, the way the Government can use information that is shared, and the controversial issue of “countermeasures.” Resolution on those issues is enormously important, so it’s good that the language of Amendment 2713 is not so concerning that it needs to steal focus from CISA. As an aside, Rapid7 is supportive of legislation that creates a clear framework for sharing cyber threat information, establishes meaningful protections for Personally Identifiable Information (PII), and clarifies the role and responsibilities of government entities in the process. We think CISA has work to do to get there, but are hopeful that the discussion around the 22 amendments will result in a better outcome.
Back to Amendment 2713… If it passes, the most significant immediate outcome will be greater law enforcement authority for addressing botnets, which, I hope, is a good thing.
To be clear though, this process wasn’t a silver bullet that miraculously fixed the CFAA. If the Amendment passes, the CFAA will gain a new provision ((a)(8) for those that want to get nerdy about it), but it will still lack a definition of its core principle: “authorization.” That means it still lacks a clear line for what constitutes a violation and what does not. And it will still be out of date, and will still contain both civil and criminal causes of action, providing a handy stick that defensive vendors will use to threaten researchers.
The process for addressing these issues in the CFAA will be considerably harder and will take far longer. It will likely be a bloody battle, as there are many angry voices on all sides of this debate and it has already raged for a long time. But there may be light at the end of the tunnel if we can continue to work with legislators to find common ground. For me, that’s the most significant takeaway from Amendment 2713 – hope for more productive collaboration, and by extension, hope for better cybersecurity legislation in the future.