Could a burning hoverboard help change platform liability rules?
Developments in platform liability could signal a new approach to responsibility for safety…
Product liability sets out where legal responsibility, and therefore compensation for harm, lies when consumer products fail. In other words, who gets sued if something goes badly wrong? The concept of liability has become more important since the rise of online marketplace platforms, which have been defined as ‘intermediaries’ or ‘neutral platforms’ that connect buyers and sellers. By this definition, they are ‘just the tubes’ and therefore have nothing to do with what flows through them. This has left some serious gaps in safeguards for shoppers, who can and do buy products that are unsafe. A 2020 study by European consumer associations found that two thirds of products bought on online marketplaces were non-compliant with EU safety laws, including exploding power banks, toxic toys and faulty smoke alarms. Self-regulatory codes and various schemes are in place, but to date online marketplaces have fought hard not to be held liable for problems in most jurisdictions.
A recent US judgement could change this: it deemed Amazon legally and financially responsible for the safety of a hoverboard that caught fire, even though Amazon didn't make the product. Effectively, the judgement says Amazon is a key part of transactions because it has a 'direct link' to customers, and this puts it in a different category from a neutral intermediary. In Europe, product safety laws are under review for the first time since 2001, and consumer groups want online marketplaces to be treated as ‘business operators’ in the supply chain and thus to bear more responsibility for product safety.
Another interesting US development on platform liability happened this week, when a US court ruled that Snapchat could be sued for the wrongful deaths of a teenage driver and their passengers, who were killed in a crash while using a 'speed filter'. This filter captures and broadcasts the speeds at which people drive. The case was initially dismissed under a rule known as Section 230, which covers platforms in the US. The rule states that providers of an “interactive computer service” can’t be treated as publishers and are therefore not responsible for third-party content. The initial claim by the drivers’ parents was turned down because it was the users who were uploading third-party content to the ‘speed filter’. However, on appeal the court ruled that the design of the Snapchat platform itself, as well as the actual user content being posted, could be implicitly rewarding the reckless and ultimately fatal behaviour.
The suggestion that the way platforms are designed could contribute to their liability could shake up how responsibility is understood. According to Stanford’s Digital Policy Centre, the path is now open for a precedent to be set:
This uncertainty regarding Sec. 230’s applicability may create an opening for more cases against tech companies for flawed platform design, and increases the chances that the Supreme Court will hear a Section 230 case to settle the contested legal landscape.
The Snapchat and Amazon marketplace decisions mean the argument about who bears responsibility and liability could come to a head in the next few years, which could fundamentally change the way platforms operate.