Apple's App 'Privacy Labels' Are Here—and They're a Big Step Forward

It remains unclear how effective the warnings will be, but the attempt alone is a promising development.
Starting today, you'll see labels in the iOS and Mac App Stores that show just how seriously developers take your privacy. Photograph: PHILIPPE LOPEZ/Getty Images

Starting today, apps in the Mac and iOS App Stores will display mandatory labels that provide a rundown of their privacy policies. Think of it as a sort of "nutrition facts" for apps. It's Apple's most visible move yet to give you easily digestible details about what data every app collects and has access to—and what they do with it.

The idea of developing privacy or security breakdown labels for laypeople isn't new. In the early 2010s, academic researchers had already developed mobile app privacy label prototypes. More recently, countries like Finland, Singapore, and the United Kingdom have started pushing security-focused labels for internet-of-things products. But Apple is seemingly the first global tech giant to embrace and promote the tactic so extensively.

"Apple’s approach looks very promising, but it's unclear how much user testing went into it," says Lorrie Cranor, director of Carnegie Mellon's CyLab Usable Privacy and Security Laboratory. "As it rolls out with real apps and real users it will be interesting to see what works and what doesn’t—whether developers understand how to accurately complete the information, whether they actually tell the truth, and whether consumers understand what this means are all open questions."

The labels have three categories: Data Used to Track You, Data Linked to You, and Data Not Linked to You, with bullet points under each detailing what the app has going on under the hood. A label might reveal that an app collects your location data, financial details, and contact information, and links all of it to an in-service account or to identifiers like your device's ID number. The label might also show that the app goes a step further and shares that information with other companies to track you across their websites and services as well.

Many apps that have submitted information will have their labels go live today, but it will take some time before they become universal. The privacy details are only mandatory once a developer submits a new app or an update to Apple for review, and many apps have infrequent update cycles. Apple says, though, that some developers have added the information anyway, perhaps to avoid the appearance of withholding something.

In the reality of today's app landscape, it's difficult to find mainstream software that doesn't do at least some linking and tracking. The privacy labels will help drive that point home, but that pervasiveness might also make it hard to find something actionable in the information. And while providing data for the labels is now mandatory in the iOS and macOS App Stores, it's also the developer's job to provide factual information and revise it over time.

"You’re responsible for keeping your responses accurate and up to date," Apple says in its developer guidelines. The company told WIRED that it will vet the information as part of its app review process, but app stores like Google Play and the App Store have consistently struggled over the years with malicious apps that slip past these audit and review processes. Given the ongoing nature of the challenge, it seems likely that misleading privacy details will also sneak by sometimes, at least until researchers or concerned users catch and flag discrepancies.

Pardis Emami-Naeini, a privacy researcher at the University of Washington who has worked with Cranor and others on developing an IoT security label, points out that lying isn't the only hurdle. Some developers also may not fully understand what's being asked, or might not have an accurate understanding to begin with of how their app collects and manages data. It may feel like this should be obvious, but in truth developers often simply build what they're told to without a directive to specifically map and understand the flow of information. For example, as a matter of course, apps often incorporate preexisting, open source code that could contain data collection mechanisms or trackers that developers don't fully recognize. The process of providing privacy details to Apple could serve as a positive opportunity for developers to make sure they truly understand what's going on inside their software. But it's just as easy to imagine some app makers phoning it in and overlooking important details.

There are also certain types of data collection that are "optional to disclose," because the data isn't used for tracking or is collected infrequently. The language is meant to make things easier, since there are benign situations where an app collects, say, a one-time location ping but doesn't share it anywhere and gives a clear option for users to decline. The problem, though, is that the "optional disclosure" category feels like open season for loopholes and workarounds.

"If you satisfy all of the parameters, then you don’t have to disclose certain information you collect, which seems like not a great idea," Emami-Naeini says. "It’s so easy for app developers to just say, 'We satisfy all of them.'"

Emami-Naeini applauds Apple for taking such a large step toward normalizing consumer-friendly disclosures that don't require slogging through a complicated and opaque privacy policy. But she emphasizes that she is also worried about how Apple will monitor the veracity of the information in the labels. As with nutrition facts on food, plenty of users will simply ignore them or check only the one criterion they care about. But for people who actually examine and think about the labels, the information needs to paint an accurate picture to be helpful.

"I am cautiously optimistic that these labels will actually turn out to be pretty useful," says Carnegie Mellon's Cranor.

Apple says it is still very much in a listening mode when it comes to the labels and that the company plans to absorb feedback and translate it into appropriate changes. But as Emami-Naeini and Cranor both point out, real effectiveness will take real enforcement—not just from Apple, but from government regulators as well.
