This post is one in a series of reports on significant issues for startups in 2015. In the past year, the startup community’s voice helped drive notable debates in tech and entrepreneurship policy, but many of the tech world’s policy goals in 2015, such as immigration and patent reform, remain unfulfilled. Check back for more year-end updates and continue to watch this space in 2016 as we follow policy issues affecting the startup community.
by Emma Peck and Evan Engstrom
This year saw an alarming number of high-profile data breaches—as of mid-December, there had been over 750 breaches exposing nearly 178 million records, surpassing 2014’s pace. Not surprisingly, policymakers were compelled to turn their attention to data security and privacy, putting forth a range of proposals around privacy, encryption, surveillance, and data security. The startup community met these proposals with both praise and disapproval, but one clear message permeated the debates around all of these issues: too often, proposed policy solutions fail to account for the needs and realities of the quickly evolving startup ecosystem. Outdated laws are applied to new technologies, and increasingly, new proposals reveal a clear lack of understanding of the technologies they would affect.
A prime example of a law completely ill-suited to today’s technological realities is the Electronic Communications Privacy Act (ECPA), the federal statute that governs law enforcement access to our electronic communications. The law was passed in 1986 (before most people even had computers in their homes), and it allows law enforcement to access electronic communications older than six months without a warrant, on the theory that such communications are “abandoned.” These rules do not conform to how people use and understand the Internet, and such lax privacy protections for electronic communications erode consumer confidence in the safety and security of online services. Thankfully, a coalition of industry and civil rights organizations—including Engine—helped persuade California to correct this problem at the state level. In October, California enacted the California Electronic Communications Privacy Act (CalECPA), requiring law enforcement to obtain a warrant before accessing a wide range of digital information. A bill to reform ECPA at the federal level also resurfaced this year, garnering support from over 300 cosponsors in the House, making it one of the most popular bills in Congress. A few federal agencies still oppose reform, but a provision in the recently passed omnibus served as a rebuke to the remaining holdouts. ECPA reform will likely be one of the first topics the House Judiciary Committee considers in 2016, and we will be watching closely.
This year, 2016 presidential candidates and current policymakers also put forth proposals aimed at combating terrorist threats to our nation, ranging from “closing that Internet up,” which has been broadly dismissed as preposterous, to mandating backdoors in encrypted technologies, which has unfortunately found acceptance in circles that ought to know better. The encryption debate is one of the clearest examples of what goes wrong when lawmakers do not understand the technologies their policies target. Cryptographic experts widely agree that building a backdoor (or “golden key”) into encrypted technologies is not technically feasible without undermining the security of the system as a whole. And beyond threatening constitutional rights, a backdoor mandate would likely be ineffective at keeping bad actors from using unbreakable encryption, since strong encryption tools are already freely available from sources beyond U.S. jurisdiction. Yet this “solution” has been central to a number of policy proposals, highlighting a lack of understanding among lawmakers on both sides of the aisle.
In October, the tech world was left in a state of confusion after the European Court of Justice (ECJ) invalidated the safe harbor agreement that allowed for the legal transfer of data between the U.S. and EU. In short, the ECJ ruled that U.S. laws allowing the government to broadly and secretly collect consumer data violated EU privacy rules, and that the safe harbor was therefore incompatible with EU law. As a result, companies that had relied on the safe harbor as the legal basis for importing EU customer data into the U.S. had to find other lawful mechanisms for doing so. Many commentators downplayed the significance of the ruling, noting that companies could find other legal pathways for importing EU data via contractual agreements or pre-approved internal data protection mechanisms. But while such protocols may be feasible for large multinationals, startups with EU customers but without the resources to negotiate alternative arrangements were left in legal limbo. Though EU and U.S. policymakers are working to develop a new safe harbor arrangement, the recent push in the U.S. to further weaken privacy protections through misguided encryption policies could ensure that any new safe harbor rule also falls short of EU privacy standards. More broadly, the ruling helped crystallize the nonsensical nature of enforcing territorial data restrictions in a globally interconnected digital world, and highlighted the impact that U.S. surveillance practices have on the ability of American startups to compete globally at a time when many countries have serious concerns about the privacy of their citizens’ data.
Spurred on by the many data breaches in 2015, members of Congress spent considerable time crafting legislation on how companies should handle security breaches. A number of the proposals would require notification within a set amount of time after a breach is discovered. While well-intentioned, such mandates can backfire: security professionals agree that publicly announcing a breach too early actually decreases security by allowing bad actors to exploit vulnerabilities that have not yet been patched. A uniform federal standard would be preferable to the existing 47 state laws governing breach notification, since complying with a single standard is far easier for cash-strapped startups than navigating 47 different regimes. But the bills introduced this year are deficient in ways that highlight the alarming lack of technological expertise on Capitol Hill.
Additionally, as a result of a federal appeals court decision earlier this year (FTC v. Wyndham), the Federal Trade Commission (FTC) now has explicit authority to police whether companies’ data security practices are “reasonable.” But determining which practices are reasonable and which are not is a difficult task; even security experts disagree over which protections companies should have in place. The FTC has tried to bridge this gap in technological understanding through its “Start with Security” initiative, and Engine hosted an event in October to help startups navigate data security policy. Still, it is clear that lawmakers do not understand data security as a technical matter, and it falls to the startup community to educate policymakers on the ways in which proposed policies can dictate practices, for better or worse.
Privacy and security were two of the most talked-about issues in 2015, and they are unlikely to fade in 2016. They have already featured heavily in the presidential debates, and as more of our daily lives migrate online, cybersecurity threats will only continue to grow. Policymakers know they must do something to address these threats, but doing something is not necessarily better than doing nothing if proposed policies don’t reflect today’s technological realities. Before the tech sector can convince Congress to pass bills that support the work we are doing, we have an obligation to show policymakers how their policies will affect and interact with technologies in practice. Engaging at this level will pay dividends down the road.