A recent study conducted by the University of Guelph in Ontario looked at privacy and the practices of computer repair shops. You know, like the repair shop where Hunter Biden took his Mac computer to be repaired. The study concluded that the repair shops committed “widespread privacy violations,” including “snooping on personal data, copying data off the device, and removing [evidence] of snooping activities.” Worse, the service technicians snooped most often on data belonging to women, searching for photographs and other personal information about them. In the case of President Biden’s son, Hunter, the repair shop not only reviewed the data on the laptop but also provided what may have been forensic copies of the hard drive to the local FBI office. So, what are your rights with respect to the privacy of data on a computer that you hand to a service technician for repair?
When you drop off a computer for repair, the technician is generally going to need access to the data on the device, which means access to the system password. With that password, the technician likely (but not inevitably) has access to all of the unencrypted files stored on the device, and likely also to stored passwords providing access to cloud-based storage or processing, Gmail or other online accounts (as well as the mail stored on the device), and virtually anything else on the device. If online access (say, to the corporate VPN) requires additional credentials like MFA, those ordinarily would not be provided to a technician—unless the problem is with access to the VPN itself, in which case the customer should stay with the laptop to provide the MFA credential (and then log out when the technician is done, right?).
So, when you provide a computer for repair or diagnosis, you should expect that everything on or connected to that computer is accessible to the technician. And maybe to others. Thus, it’s a good idea not only to encrypt the drive but also to encrypt any sensitive files with a separate encryption tool and a separate passphrase. Right. You do that, right? The other problem is that, if the encrypted files themselves are the problem (they’re probably not, but bear with me here), the technician may need to decrypt them—and, again, they will need the password.
Effectively, in order to do their job, the repair shop needs the ability to become the customer. The question is whether they can be trusted with that kind of access.
Whose Computer is it, Anyway?
The next question concerns ownership of the computer and the data on it. Typically (but not universally), corporate computers should be repaired by corporate agents—either in-house or outsourced technicians. When an employee takes a corporate device to a repair shop, they are giving up not only their own privacy but that of their employer, the employer’s customers and any entities that have provided data to the employer. Before you drag your laptop to the Apple store to fix a glitch (like that turkey gravy you accidentally poured onto the keyboard), make sure your employer knows you are doing this. That’s even more true if your company is responsible for sensitive data, PII or PHI and the repair shop is a smaller, independent one. It’s not that the repair shop won’t do a good job. It’s just that the employer should know where the computer is.
In fact, the employer’s policy typically includes language warning users to NEVER SHARE PASSWORDS ever, under any circumstances, even under threat of death! So, before you share your password with some geek, make sure that someone in the IT department approves. Better safe than sorry.
But that’s just ownership of the device. With the new work-from-home (WFH) modality, many employees, contractors or independent third parties are using corporate data—including sensitive data—on personal or hybrid devices. Sharing credentials with technicians exposes the data on these devices, irrespective of who holds title to the hardware. If the tech does not need access to cloud services or connected cloud storage, don’t provide that. Important safety tip: Don’t cross the streams.
Promises of Privacy
Other promises of privacy come from the repair shop itself. What data will the shop access, and what is it allowed to do with that data? That’s where things get, shall we say, hairy.
The Repair “Contract”
Best Buy’s privacy policy, for example, promises: “Our service processes are designed so that if the service requested doesn’t require access to the data stored on a device, the Agent does not access that data. For example, our Geek Squad Agents are trained to never access data on a customer’s device provided to Geek Squad for service except in limited circumstances, and only to the extent necessary to perform the service, such as when you ask us to recover your data. Devices that could have your information on them and are traded in or provided to Best Buy for recycling are protected until they are sent to the appropriate location for data wipe or other required disposition.”
In general, it should be assumed that the technician has sufficient access to the computer to do their job, and also that the understanding of the parties (unwritten) is that the access is provided only for that purpose. If you give the keys to your candy apple red 1985 Modena Spyder to some parking lot attendant in Chicago, you should reasonably expect them to use those keys simply to park the car and move it as necessary. You certainly would not expect the attendant (and a friend) to take a “day off” with your car. If a technician goes joy riding with your personal data, it’s not clear that you have a cause of action against them. But you might.
CSAM Say (Uncle) Sam
Various states require repair shops to report any child pornography or child sexual abuse material (CSAM) they may find on a device they are repairing. While these laws do not require repair shops to actively search for CSAM, they do require the shops to report any CSAM they find. Indeed, if a repair shop “knowingly” possesses CSAM, it (and its technicians) may face criminal liability for the mere possession. Michigan law not only requires computer technicians to report CSAM but also provides statutory anonymity and immunity. States that require reporting of suspected CSAM include Arkansas, Missouri, Oklahoma, North Carolina, South Carolina and South Dakota. It’s not clear whether repair shops have processes for searching for CSAM, or even whether they have access to the National Center for Missing and Exploited Children (NCMEC) database for the purpose of identifying the MD5 or other hashes of these materials. Similarly, online service providers are required, under 18 USC 2258A (formerly 42 USC 13032), to report CSAM they know about, and those providers have processes to actually look for it.
The problem is, if the technician finds a single copy of CSAM (even in a thumbnail) they don’t just turn over that file—they copy and turn over the entire contents of the computer. Because this is (likely) a private search, Fourth Amendment limitations on warrants and scope of searches may not apply when the computer is turned over to the cops.
While the repair shop, as a bailee, has a duty to return the bailed property in good shape, it’s not clear what their duty is as to the data on that laptop.
In the United States, the Fourth Amendment protects “[t]he right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures.” However, to come within the scope of this provision, the search must involve a “state actor”—that is, some government agent. Purely private searches—like those performed by computer technicians—are not generally covered by the Fourth Amendment. Of course, this assumes that the technicians are purely “commercial.” In 2018, it was revealed that the FBI had been offering a “bounty” to computer technicians to search for, and report to the FBI, evidence of certain crimes. In such cases, the expectation of payment or reward, or the direction provided by the FBI, might turn the technician into a “government agent” and render the otherwise “private” search a government search. It depends on how cozy the relationship is.
We think of the technician as inadvertently coming across contraband, such as CSAM, where it is readily apparent that there is evidence of a crime. But the technician has access to everything on the computer and many things that the computer may connect to. If, by giving your user ID and password, you have abandoned your privacy rights, then there’s nothing to stop the technician from searching for evidence of tax fraud (by you or the company), procurement fraud, infidelity or, indeed, just about anything. Again, it’s reasonable to assume that the tech can access the data to fix the computer and that going beyond this is “unreasonable”—but it’s better to specify this in a contract.
According to the Guelph study, “[W]hile data theft was uncommon, casual snooping of customers’ data was a regular occurrence. Accordingly, viewing of (revealing) pictures or casual folder snooping was noted as the most common violation. We note that while our logs did not provide any evidence of the theft of financial data, the technicians may have copied it using other means (i.e., copied to a piece of paper). Our investigation also showed that the technicians made an effort not to leave a trace of their snooping…” More disturbingly, the Guelph study found that the incidence of casual snooping on customers’ photographs was significantly higher if the perceived “owner” or “user” of the computer was a woman, and higher still when the researchers added female-coded revealing pictures.
It is equally disturbing that the technicians were rummaging through these customers’ intimate photographs and that they were taking steps to conceal the fact that they were doing so.
Consistent with standard data privacy practices, it should be the law that a consumer who takes a computer in for diagnosis or repair consents to examination of that computer only to the extent strictly necessary for that diagnosis or repair. Whether that is, in fact, the law remains to be seen, especially in the United States, which lacks a comprehensive data privacy regime.
Hunter Biden’s Laptop
With the GOP regaining control over the House of Representatives, we can expect additional hearings concerning data found on the laptop computer Hunter Biden reportedly delivered for repair to a computer shop in Delaware. The owner of the shop examined the contents of the computer and turned them over to the FBI (which actually gave the owner a “cover subpoena” for the laptop). The computer shop alleged that Hunter abandoned the laptop by not paying for the repairs in a timely fashion or picking up the computer and that, therefore, both the laptop and its contents were fair game.
This may, in fact, be the case. Under Delaware law, “any person who holds, stores, safekeeps or otherwise is left with possession of any abandoned personal property … shall be vested with complete and absolute title to said abandoned personal property and shall have all right to sell, alienate, gift or otherwise dispose of the said abandoned personal property…” The caveat here is that the “abandoned property” statute applies to “tangible personal property” which is abandoned—the physical laptop computer. So, once Hunter abandoned the laptop, the ability to gain full title to the computer vests in the computer shop. Cool. Cool. Cool.
But “intangible property” does not vest in the computer repair shop. There is no transfer of copyright in Hunter’s photos, diaries, etc. If a book containing the unpublished writings of J.R.R. Tolkien were found in a bungalow on Lakeside Road in Bournemouth, Hampshire, England, the book itself might vest in the bungalow’s owner as “abandoned property,” but the right to publish that book would remain with the Tolkien estate.
Privacy is a strange form of intangible property. The right to privacy has a “property” component to it, but it’s not the same as “ownership.” Does the owner of an abandoned laptop have unfettered rights to publish and disseminate the contents of that laptop? Do they now “own” those contents? Has the prior “owner” abandoned their expectations of privacy along with abandoning the computer itself? Have those who have shared information with the owner of the laptop taken the risk that their own privacy rights may be subject to the laptop owner’s failure to expeditiously pick up the computer? Is Hunter liable to those who shared personal information with him for failure to protect that information by not picking up his computer in time?
Courts have struggled with the concept of privacy as a property right. When personal data is compromised, a tort may lie for breach of privacy or for breach of a contract (express or implied) to protect privacy. Privacy-related data has real value and is bought and sold on the dark web and by data brokers all the time. Invasions of privacy (lawful and unlawful) are big business. Because “privacy” exists in this netherworld between property and something else, it’s not clear who “owns” the data or when a privacy right has been abandoned. Do you surrender your privacy when you give the guy at the Genius Bar your password, or is it permanently surrendered when you fail to pick up your computer?
At the end of the day, we need to disassociate the “ownership” of the device from the privacy rights associated with the data on the device or traveling through the device. Each needs to be protected but with possibly a different set of laws.
Until then, just try not to spill Pepsi on your keyboard.