Tesla employees passed around videos taken in car owners’ private garages and other interesting recordings captured by the cameras built into the company’s vehicles, Reuters recently reported. “We could see them doing laundry and really intimate things. We could see their kids,” according to one of nine former employees who told the news agency about the practice.
File this under “shocking but unsurprising.” Shocking because it’s a significant abuse of privacy — but unsurprising because the same pattern has played out repeatedly for many years. Unfortunately, today, when you accept some company’s recording devices into your life, you lose control of your privacy.
Examples abound. Video streams from Amazon-owned Ring cameras installed by homeowners were monitored by Ring employees, who could watch video from any Ring camera in the world with little more than the click of a mouse. (We wrote about the privacy problems with Ring cameras in 2019.) Amazon employees also listened to recordings made by its voice assistant Alexa. Some told Bloomberg they believe they overheard a sexual assault. Contractors also monitored video from the company’s Cloud Cam security cameras.
Recordings made by Google’s voice assistant were listened to by company contractors. Some of the recordings were leaked to a Belgian media outlet, which was able to identify some of the people in the clips and to hear recordings of people discussing medical conditions and of a woman in distress. Human contractors working for Microsoft listened to personal audio conversations held by users of the Skype translation service. One told Vice that some of the calls included phone sex. Human contractors regularly listened to audio recordings transmitted by Apple’s Siri voice assistant. A whistleblower told the Guardian that contractors regularly heard medical conversations, drug deals, and recordings of people having sex. Facebook likewise paid contractors to listen to and transcribe audio clips from its services.
It’s unclear how much of this spying continues. In some of the above cases the companies announced they were discontinuing human review or reviewing their practices; in others, privacy policies and opt-out possibilities were clarified.
Surveillance camera manufacturers have also been found accessing the data from their devices. Hackers broke into the servers of the company Verkada in 2021 and discovered that it had a secret backdoor into roughly 150,000 cameras around the world installed by its customers, which permitted the company to view video from any of those cameras at any time. In 2017, a security researcher discovered that Hikvision, a Chinese camera manufacturer which then had 12 percent of the U.S. surveillance camera market, had a backdoor that allowed access to any of its cameras. (Hikvision was subsequently banned by the U.S. government.)
The reasons for corporate monitoring of customer devices vary. Where AI is part of a company’s product, that company will always have a strong incentive to look at customer data in order to improve the training of its algorithms. In the case of Tesla, it appears to have been employee curiosity and gawking — though as Reuters explains, the reason Tesla collects video is to allow human “data labelers” to sift through images taken by its cars for AI training. In other cases, perhaps, people at companies can’t resist engaging in surveillance even with no specific purpose other than a general will to power.
None of those incentives are going away, so people should continue to be very deliberate about buying products with Internet-connected cameras or microphones. As we have discussed, when people invite an Internet-connected camera or microphone into their lives, they (and their potentially unsuspecting friends and family) become vulnerable not only to the companies that make those devices, but also to hackers and to law enforcement officials with their plentiful powers to make companies do their bidding.
Finally, the Tesla story in particular is a reminder of the specific issues raised by driverless cars (of all makes and models) and other robots that may end up proliferating in our lives. To make navigational and other decisions, those robots will need sensors to collect data about the world — and that data will not always be used for good.
Everyone needs to be more aware of these risks. Companies need to be clear about what kind of data they are collecting and exactly how they’re using it. Wherever possible they need to give users control over those data flows, such as by providing hardware on/off switches. (We outlined our recommendations for devices in more detail here.) And above all, we need good privacy laws to give some teeth to the good practices that people have every right to expect.