Matthew Herrick, a restaurant worker and aspiring actor in New York, claimed that for months an ex-boyfriend used the dating app Grindr to harass him.
His former partner created fake profiles on the app to impersonate Herrick and then direct men to show up at Herrick’s home and the restaurant where he worked asking for sex, sometimes more than a dozen times per day. Herrick took action against his ex, filing 14 police reports.
He also filed a lawsuit against Grindr in 2017. The alleged harassment continued for months, even after Herrick obtained a temporary restraining order against Grindr that required the company to disable the impersonating profiles.
Herrick’s story echoes the online harassment that many people have experienced, often with little to no legal consequences for the companies that created the technology in question. A 1996 law designed to foster free speech online generally protects companies from liability.
But Herrick is pursuing an unusual legal theory as he continues to push back against Grindr, arguing that tech companies should face greater accountability for what happens on their platforms. His lawsuit alleges that the software developers who write code for Grindr have been negligent, producing an app that’s defective in its design and that is “fundamentally unsafe” and “unreasonably dangerous” — echoing language that’s more typically used in lawsuits about, say, a faulty kitchen appliance or a defective car part.
If successful, the lawsuit could bring about a significant legal change to the risks tech companies face for what happens on their platforms, adding to growing public and political pressure for change.
“This is a case about a company abdicating responsibility for a dangerous product it released into the stream of commerce,” his lawsuit argues, adding: “Grindr’s inaction enables the weaponization of its products and services.”
Software, hard problem
Lawsuits over product-related injuries or harm fall under a category of the law known as products liability, which exists to hold manufacturers responsible for defective items they put into the “stream of commerce” and ultimately to keep people safe.
Those laws generally haven’t been applied to software such as smartphone apps, but lawyers for Herrick aim to do just that — a development that could reshape consumers’ relationship with software, alter speech protections online and put pressure on Silicon Valley to find flaws in products before introducing them to the world.
“Products liability started as people thinking, ‘Oh, my stove burnt me,’ or, ‘This saw cut my hand,’” said Christopher Robinette, a law professor at Widener University who specializes in that area of the law. “But as people have started to purchase more information-related items, we have to reconsider how we classify those things.”
A federal appeals court is scheduled to consider the subject on Monday, weighing whether Herrick’s case should be allowed to move forward after a federal district judge threw it out last year. A ruling is likely within a few months.
The tech industry is pushing back on Herrick, saying in court papers that he is trying to artfully skirt the protections afforded to free speech online.
Carrie Goldberg, one of Herrick’s attorneys, said they decided to pursue the argument out of frustration with Grindr’s failure to add product features to reduce harassment.
“Grindr has created a defective product,” she said in an interview. “It was very foreseeable that their product could be used this way.”
Grindr said in a statement on Thursday that it is committed to creating a safe and secure environment, and that any fraudulent account is a clear violation of its terms of service. Its staff removes offending profiles as appropriate, the company said.
In court, Grindr is relying on the more sweeping defense allowed by the 1996 law known as the Communications Decency Act. The act’s Section 230 has been interpreted by courts to immunize internet services from liability for content posted online by third parties — whether ex-boyfriends or otherwise.
That immunity, though, is subject to a raging debate about whether social media companies and other tech firms should be so free to introduce products without much forethought about the hazards they could create.
“When someone is injured, they and their families want recourse, but our legal system is woefully bad at delivering justice,” Sen. Ron Wyden, D-Ore., said in March as Congress debated creating an exception to Section 230 aimed at cracking down on alleged sex traffickers. One of the authors of Section 230, Wyden has warned the law may be weakened if tech companies don’t police their platforms more effectively.
Tech scandals over the past two years have led to mounting concerns about unchecked industry power.
Facebook and other online ad systems allowed Russia-based operatives to buy paid political ads until a public outcry led the companies to self-regulate. YouTube’s recommendations algorithm has at times encouraged the spread of conspiracy theories, prompting the CEO of parent company Google to tell Congress he is studying the subject.
Herrick’s case has drawn interest from the tech industry, its supporters and its critics, who see his lawsuit as a test of a new legal theory for holding tech firms to account.
The Computer & Communications Industry Association, a trade group that represents a broad swath of the tech industry including Facebook and Google, said in a filing with the appeals court that Herrick’s suit would gut protections it says have made the U.S. tech industry the world’s leader.
The Electronic Frontier Foundation, which advocates for privacy online, is among the groups that have lined up to support Grindr’s position, while Herrick has drawn supportive court briefs from organizations including the National Network to End Domestic Violence.
Two things make the lawsuit different from past challenges to tech companies’ immunity under Section 230, said Marc Rotenberg, president of the Electronic Privacy Information Center, which filed a brief in the case backing Herrick. One is the timing, he said, as calls are rising for more ethics in the tech sector, and the second is the borrowing of arguments from cases about manufacturing defects.
“When you make a manufacturer effectively immune, it means that the consequences will be borne by the user,” Rotenberg said in a phone interview.
The Electronic Privacy Information Center has similarly argued that the U.S. Consumer Product Safety Commission should broaden its oversight to include internet-connected devices. The commission held a hearing on the subject last year.
Other personal injury lawsuits have taken aim at smartphone apps. In Georgia, lawyers for a man with permanent brain damage have sued a driver and Snapchat after they say a speed tracker on the app caused the car crash that injured him. Snapchat says the suit has no merit and has moved to dismiss it. An appeals court has let the suit move forward.
Beyond the question of whether software developers can be held liable for user-generated speech, software is sometimes considered not a product at all but a service, which may put it out of the reach of laws affecting manufacturers.
“A number of people are still hung up on that idea: Is it a product?” Robinette, the law professor, said. “It’s going to take some cases before people become accustomed to that idea.”