Accountability for Technology and AI
The Recent Negligent Design Verdict Should Be a Wake-Up Call
AI and digital technologies have been unleashed upon us with unprecedented zeal and recklessness, and a reckoning is long overdue.
The Meta and YouTube $6 million addiction verdict this week is an important milestone. I’m not an expert on the addiction issue, so I won’t opine on the merits of the case. Instead, I want to point out how the case affects the bigger picture of accountability for technology and AI.
The case was focused on liability for negligent design, a theory that avoids CDA Section 230 immunity because it views technology as a product rather than a form of speech. This is a complicated issue, one I’ll examine in a later post. For now, if you’re interested in exploring it, I recommend a few thoughtful works:
Zahra Takhshid, Large Language Models as Products
Caitlin Burke, Are Platforms Products?
Nita Farahany, Is An Algorithm Speech?
On the issue of addiction, you should be sure to read Gaia Bernstein’s book, Unwired: Gaining Control over Addictive Technologies. She just published a new edition of this book a week ago. Talk about perfect timing!
Design Is Key
It is essential for the law to regulate design. As I wrote in my recent book, On Privacy and Technology:
The way we experience the internet is shaped by design choices. We don’t have direct access to the internet; our access is instead mediated by design, which directs what we see. Designers control the default settings on websites, which can be initially set to maximize data collection, sharing, and use. Designers shape how readily people can opt into or out of certain data uses. Designers code the algorithms that determine the content we see on platforms. The more we share and engage, the more tech companies benefit financially. So, unsurprisingly, the interfaces often nudge, cajole, and manipulate us to share and engage.
Woodrow Hartzog’s Privacy’s Blueprint: The Battle to Control the Design of New Technologies is a great account of why regulating design is key to any meaningful protection of privacy.
[D]esign is power. Every design decision makes a certain reality more or less likely. Designers therefore wield a certain amount of control over others. . . . Design affects our perceptions of relationships and risk. It also affects our behavior. . . . The power of design makes it dangerous to ignore.
The same holds true for AI. There are many ways to regulate design that don’t involve second-guessing or micromanaging every design choice. The law can impose meaningful standards, just as it does for cars and other products. Few people cry out that car safety requirements are destroying innovation in cars.
Techlash Is Real, It Ain’t Over, and It’s Justified
The verdict in Los Angeles against Meta and YouTube, along with a $375 million verdict in New Mexico against Meta for child safety issues, demonstrates that the public has had enough. These verdicts send a powerful message that the public is convinced that Big Tech is creating harmful products and must pay more attention to addressing the harms.
I hope Big Tech gets the message and doesn’t keep responding indignantly. Instead of focusing on the particularities of each case, Big Tech should take a step back and look at the bigger picture. The industry as a whole must do a lot better. Their products are causing harm.
Tech companies can certainly point to limitations and safety features they’ve included with their technologies, but they aren’t enough. There are too many cases of spreading disinformation, invading privacy, encouraging malicious gossip and defamation, facilitating suicide and self-harm, sparking violence, affirming delusions, inspiring dangerous behavior, stoking extremism, boosting harassment and threats, enabling the creation of deepfakes, allowing nude and sexualized images of people (including children), manipulating and controlling people, and much more. Digital technologies can bring about wonderful things, but why do they need to produce so much blood, death, pain, lies, hate, and tears?
The public is tired of the typical responses:
“Oops! We really care about safety and are doing our best.”
“Everything has hiccups. Overall, it’s not so bad.”
“It’s not our fault. Blame our users.”
“It’s free speech!”
“Nothing should stand in the way of innovation.”
A different ethos is needed. The current ethos is a Speed Racer approach: win the AI and tech race at all costs. Speed can only think about racing; he’s maniacal about it. “You gotta win if you want to keep driving,” he says, “and that’s what I want to do.”
Tech companies want to roll out tech as fast as possible in a scrum for money and supremacy.
I get why companies do this. They feel existential pressure for speed and money. But they need to be reined in. That’s the law’s responsibility, and policymakers have been failing to do enough.
Private Litigation Is Essential
These cases are evidence that private litigation is an essential enforcement tool. I argue this in my recent article, Enforcing Privacy Law: Why Private Litigation Is Essential, forthcoming in 107 B.U. L. Rev.
As I wrote:
Private litigation can supplement government enforcement, helping to overcome problems with limited funding and staff. As discussed earlier, there are incentives for enforcement agencies to pursue open-and-shut violations, but these aren’t necessarily the ones that cause the most harm. Litigation has the ability to escape the reticence of government enforcers because the economics works differently for lawsuits. Success in litigation is rewarded by a payoff. In contrast, the budgets for government enforcers rarely grow with success. . . .
It is immensely empowering for individuals who are victims of privacy violations to have a mechanism to stand up for themselves. Otherwise, they would have to raise a complaint to a beleaguered government enforcer without the funding and personnel to be able to perform a thorough investigation or pursue the case vigorously (or at all). Victims can be made to sit by helplessly, often never hearing anything back, never having an opportunity to be heard, and then most likely seeing the perpetrator receive a slap on the wrist or a warning or nothing at all. Private rights of action allow victims to punch back, demand accountability, and make a powerful statement that their injuries matter and that they shouldn’t be ignored or exploited in a company’s quest for power and profit. Private rights of action allow victims to vindicate their rights when government enforcers forsake them. . . .
Because the vast majority of enforcement authorities are under-resourced, overstretched, and politically constrained, they are unable to enforce frequently and consistently enough, and they must adopt strategies that weaken their effectiveness and leave their full powers unused.
The best, most resilient enforcement involves multiple enforcers using different enforcement tools. Private litigation is an essential dimension of enforcement, as it complements government enforcement by filling in where it is weak.
Truly effective enforcement must create the right incentives. Organizations must be forced to fully internalize their costs. Enforcement must counteract the manifold forces that lead organizations to risk violating the law and externalizing costs and harms to individuals. Enforcement must change the risk equation.
For more, check out the whole piece.