“In many ways Facebook is the Exxon of our time,” Signal founder Moxie Marlinspike said at the RSA Cryptographers’ panel April 17.

There may have been a time when the crowd of cybersecurity industry professionals, assembled in auditoriums and overflow rooms in San Francisco’s Moscone Center, would have objected to the characterization of the tech giant as a sloppy, disastrous behemoth, but that time is not 2018.

Marlinspike’s remarks came after the panel discussed nations demanding hard-coded backdoors into secure systems. These backdoors are essentially deliberate vulnerabilities designed to let the government in, but by their nature they are flaws that lend themselves to abuse in nefarious hands.

“I think it’s easier to say ‘I can’t’ rather than ‘I won’t,’” said Marlinspike, arguing that principled self-restraint with backdoors is far less likely to have the same effect as simply not hard-coding the backdoors in at all.

“Countries are starting to legislate against programs that they can’t get a backdoor into,” said RSA algorithm co-inventor Adi Shamir, in light of both Iran’s and Russia’s attempts to shut down the messaging service Telegram this year.

Marlinspike, whose app Signal is a direct competitor with Telegram, noted that Telegram’s choice to store complete message history in the cloud made users vulnerable in a way that end-to-end encryption, with messages stored only on users’ personal devices, does not.

“Entities that are choosing risk for us are not us,” said security researcher Paul Kocher, “and they may not make choices that we would make.”

RSA CTO Zulfikar Ramzan echoed the danger of this widespread trend of companies choosing risks their users would not, citing a study that suggested “the cost of data insecurity is trillions depending on how you count it.”

“External costs. Those are external costs,” replied pioneering cryptographer Whitfield Diffie, getting at the failure of the market to impose the costs of that external risk on the companies creating the risk themselves. This spun the conversation right back to Facebook, whose market share hasn’t changed despite a series of revelations about how the company mishandled, and even simply gave away, the data it collected on users.

Yet simply opting out of the social network may not be as easy as suggested. Marlinspike compared Facebook to Exxon. “No matter how much oil we see them spilling, for many people Exxon was civilization. For many people Facebook is the internet,” he said.

Exxon is a well-chosen metaphor: emblematic of an industry whose product people use daily for the essence of modern life, and yet forever linked in the public understanding with disaster and external cost. The consequences and damages from a breached oil tanker differ wildly from freely surrendered and exploited data, but for the average consumer it feels roughly similar: something awful happened in the process of getting something needed, and what are we to do with the consequences? The panel ended before the panelists could further explore the implications, but again Exxon is an instructive comparison.

A year after the Exxon Valdez disaster, Congress passed and President George H.W. Bush signed into law the Oil Pollution Act of 1990. The Exxon Valdez spill was the highest-profile of the oil pollution incidents that year, though it was just one of several that captured headlines, a flurry of literal hull breaches and other mishandlings that together amounted to an industry-wide problem more than a company-wide one. The regulation passed to meet these problems hit multiple fronts: requiring better hulls for oil transport, requiring better training for the people in charge of the cargo, funding research into new cleanup techniques, and requiring companies to preemptively pay into a fund to clean up any disasters that may come.

Were Congress to take the same approach to Facebook, it could adapt almost every measure in some form. Better security and more siloed data would limit what could be obtained at any one time, stolen in a breach, or handed over to researchers. Laws regulating the handling and stewardship of data could serve a function similar to the ones requiring that ships have trained pilots, ensuring that competence is externally verified. Research into data security would mean that what is collected stays securely held, rather than freely laundered across the open web. And a preemptive payment from tech companies to clean up future breaches is a cost that can certainly be borne by some of the most valuable companies in the world.

Should Congress choose to regulate tech the way it regulated the oil industry after Exxon Valdez, it could also learn from the limitations of that legislation. It could write in dollar amounts tied to inflation, rather than static numbers set in a given year. It could require that companies not only make plans for how to deal with data misuse and loss, but also update those plans regularly. And it could nudge companies to devote more resources to preventing data loss, theft, or misuse in the first place, since cleaning up after a disaster is harder than preventing it.

For now, Facebook may be as much a part of the infrastructure of life for hundreds of millions as Exxon was in 1989 (and still is today). Treating it like a problem that people can opt out of, rather than an industry with external costs that can be regulated and mitigated, misses the greater opportunity to tackle what is ultimately a structural problem.