These Are the 8 Ways Apple Tracks You — and Their #1 Security Flaw


This is the next installment in our ongoing series about the choices we make about our personal data, and how these choices relate to the online products and services many of us use daily. 


Recently on these pages, I wondered: what does Google know about me? The short answer: a lot. The detailed answer: 8.75 GB of search history, browser history, emails, GPS locations, voice searches, voicemails, a few short YouTube videos, and an ad profile built from my behavior with all of the above. Kinda creepy, though to be fair, also kinda transparent — I found all this data because Google has a handy tool for downloading everything at once, which, in theory, let me see what I’ve been giving the company since my Google account was created in 2008. Creepy? Yes. And, just maybe, also cool.

So now I want to know what Apple knows about me. How does the company store my information? And how transparent are they about what they’re doing?

Starting With the Man in the Mirror

So many ports on Apple’s Aluminum PowerBook G4

First, full disclosure: I’m a Mac guy. I bought my first one, a 17″ Aluminum PowerBook G4, in 2005. A few years later, I traded in my Palm Treo 700wx for my first iPhone. Apple devices always felt like magic, and I’ve been a loyal user, willingly adopting each new exciting product, feature, and service as announced at the company’s signature events.

I started my exploration of data privacy with Google because they’re so open about us being their product. But maybe I also started with Google because I was trying to avoid taking a deeper look at a different company that has even more opportunity to collect my personal data. So it’s time to tear off the band-aid: let’s look at what Apple has, why they have it, and how much privacy figures into their business model.



An Apple Backstory of Piracy and Privacy

“Blue box”: by all accounts, an especially popular product among criminals.

Apple started in 1976 as a hardware company founded by Steve Jobs, Stephen Wozniak (aka Woz), and the lesser-known Ronald Wayne. But if we go back even further, the first product Jobs and Woz created, back in 1972, was the “blue box” — a nifty little gadget that could make free, untraceable (aka illegal) long-distance calls on the AT&T network. For obvious reasons, this venture was shut down, and the team turned their attention toward computers. But it’s instructive to see how themes of technology, convenience, privacy — and above all, control — would reverberate through Apple for decades to come.

Apple’s Current Position on Privacy

Today, Apple is ranked by Forbes Magazine as the most valuable brand of 2016, a position the company attained, they say, by making trust their number one value. How do they say they establish trust? By protecting privacy. Google hosts a page with extensive content on your personal data and how they use it; Apple has a similar section on their website about privacy. On it, you’re greeted by a letter from Apple CEO Tim Cook:

“We believe in telling you up front exactly what’s going to happen to your personal information, and asking for your permission before you share it with us. And if you change your mind later, we make it easy to stop sharing with us. Every Apple product is designed around those principles. When we do ask to use your data, it’s to provide you with a better user experience.”

– Tim Cook, Apple CEO

Read on, and Apple says, “As you add photos, messages, contacts, and credit cards to your Apple devices, they become more personal. So we design innovative ways to protect that data. And we build powerful safeguards into our operating systems, our apps, and the devices themselves. Because the things you rely on every day should keep your personal information safe.”

In other words, Apple products become more valuable and useful when they hold and make use of important personal data, making them an intimate part of all aspects of life that involve information and communication. They effectively wouldn’t exist without our personal data, so their privacy and security should be considered alongside utility and convenience.

Apple says they believe you should have control over which data and services you’re willing to use, and how and when you want to use them. We can take Apple at their word, or we can look at the public record of their products and policies — which are actually pretty impressive. First the product side:

Some of Apple’s Notable Security and Privacy Innovations

Apple has a tradition of innovating around user-controlled privacy features, including full-disk encryption (FileVault), isolated hardware for sensitive data like fingerprints (TouchID), and using known, authenticated devices to authorize new ones (two-step and two-factor verification).

FileVault Disk Encryption (2003)

Apple introduced built-in disk encryption with FileVault, a feature that shipped with Mac OS X Panther (10.3) in October 2003. Flipping this on originally encrypted a user’s home folder; since FileVault 2 arrived with OS X Lion in 2011, it encrypts the entire startup volume, making the data readable only by a logged-in user.

TouchID (2013)

Components in the iPhone/iPad Home Button with TouchID

Starting with the iPhone 5S in 2013, the home button on iPhones — and eventually iPads — had a fingerprint sensor that could be used to unlock the device, authenticate for apps, and even sign for payments. To keep this secure, Apple added an isolated portion of their custom A7 chip, called the Secure Enclave, to handle TouchID separately from the rest of the functions of the phone. The only data passing back and forth would be a successful or unsuccessful identification. So even if someone gained access to the iPhone data and hardware, TouchID would not be directly accessible.
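For the curious (and the developers among us), here’s a rough sketch of how an app asks for a TouchID check using Apple’s LocalAuthentication framework. Notice how little comes back to the app: a yes or a no, never the fingerprint itself. The prompt text here is just a placeholder.

```swift
import LocalAuthentication

let context = LAContext()
var error: NSError?

// Ask whether TouchID is available on this device before prompting.
if context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) {
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your notes") { success, evaluationError in
        if success {
            // The fingerprint matched; the app only ever learns "yes".
            print("Authenticated with TouchID")
        } else {
            // The app only learns "no" (plus an error), never any fingerprint data.
            print("Authentication failed: \(evaluationError?.localizedDescription ?? "unknown error")")
        }
    }
} else {
    print("TouchID not available: \(error?.localizedDescription ?? "unknown reason")")
}
```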

Two-Step (2013) vs. Two-Factor (2015) Verification

Going beyond passwords, Apple has found ways to improve security by adding additional steps when the Apple ID is activated on a new device. Two-step verification was introduced in 2013.

Secondary steps for two-step verification:

  1. Push notification to a trusted device
  2. Text message or phone call to a registered number
  3. Offline recovery key
  4. Application-specific password

Two-factor authentication, released with iOS 9 and OS X El Capitan (10.11) in 2015, replaces offline recovery keys and app-specific passwords.

Secondary steps for two-factor verification:

  1. Push notification to a trusted device (previously authenticated device)
  2. Text message or phone call to a registered number (via “Did not get a verification code?” prompt)
  3. Apple suggests appending a 6-digit authentication code to the original password
  4. Offline, time-dependent code generated from the Settings of a trusted device

Apple vs. The FBI

You may remember the recent public kerfuffle when Apple’s privacy policy collided with the FBI’s need-to-know mandate.

The FBI had an iPhone from one of the terrorists involved in the 2015 San Bernardino, California attack, and wanted Apple to give them the ability to bypass the lockscreen. Apple made themselves the symbol of privacy protection by refusing to add a backdoor, saying that this same entry point could be used by malicious hackers, and gained a lot of publicity along the way. Here’s more on the topics of privacy, encryption, and law enforcement from both sides of the debate.


Good for Users, Good for Business

All of this reinforces a culture at Apple that prioritizes the development of privacy and security in tandem with new advances in hardware, software, and services. One of the ongoing selling points for Apple computers is the very limited number of valid security threats, virtually eliminating the need for malware and virus protection software. The latest MacBook Pro with Touch Bar has already sold four times the units of Microsoft’s Surface Pro.

Want to learn more? Get it straight from the horse’s mouth.

What Personal Data Does Apple Have, and How Is It Protected?

Apple has a lot of our data, but most of it is pretty opaque to them. On Apple computers running macOS (formerly known as OS X), data is not encrypted automatically, but you can turn encryption on using FileVault. By default, all data is still protected by a username and password, or by your Apple ID when you log in on startup. All data stored on iOS devices (iPhone, iPad, iPod Touch) is encrypted on the device. Data is also encrypted in transit from any source device to Apple’s iCloud servers and in storage on those servers. This includes basics like Contacts, Calendars, and iOS device backups.

There are some other forms of personal data that deserve more than just a mention:

  1. Photos and Live Photos
  2. Location Services
  3. HealthKit & ResearchKit
  4. ApplePay
  5. Call History
  6. iCloud Keychain
  7. Diagnostics & Usage
  8. Advertising

 

Photos and Live Photos

Live Photo on iPhone Home Screen

iPhones and iPads have some of the best built-in mobile cameras on the market. And like all other data stored on iOS devices, Photos (including Live Photos, with their three seconds of motion and audio) are encrypted on the device, in transit, and when stored on iCloud.

Location Services

If you use GPS, Maps, weather info, location-based reminders, etc., then you’re using Location Services. Apple collects location information to improve the user experience and deliver relevant content based on where you are and what you’re doing. This data can be collected in obvious ways through the use of an app, and it is also collected when the device is on but sitting inactive in your pocket or bag. Location Services is something you opt into on your first use of a device, and again whenever the operating system is updated to a new version. On iOS, you can adjust Location Services under Settings > Privacy > Location Services. There, you can turn the whole feature on or off for the device, and for individual apps. For each app, you can choose from three options: Never, While Using the App, and Always.
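To make that opt-in concrete, here’s a minimal sketch of what an app has to do before it can see your location, using Apple’s CoreLocation framework. Until you answer the permission prompt (or flip the switch in Settings), the app gets nothing. The class name and messages are illustrative, not from any real app.

```swift
import CoreLocation

final class LocationPrompter: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    func start() {
        manager.delegate = self
        // Shows the system permission prompt; the app also needs an
        // NSLocationWhenInUseUsageDescription string in its Info.plist.
        manager.requestWhenInUseAuthorization()
    }

    func locationManager(_ manager: CLLocationManager,
                         didChangeAuthorization status: CLAuthorizationStatus) {
        switch status {
        case .authorizedWhenInUse, .authorizedAlways:
            manager.startUpdatingLocation()   // only now does the app receive GPS fixes
        case .denied, .restricted:
            print("User chose Never: this app gets no location data")
        default:
            break                             // .notDetermined: waiting on the user
        }
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        if let fix = locations.last {
            print("Latest fix: \(fix.coordinate.latitude), \(fix.coordinate.longitude)")
        }
    }
}
```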

In April 2011, Apple came under fire for storing some 10 months of detailed location information in an unprotected, unencrypted “consolidated.db” file on iOS devices. Since the issue came to light, Apple has encrypted this data along with the rest of the personal data on the device, in the cloud, and in transit.

HealthKit & ResearchKit

In 2014, Apple announced HealthKit, a system for collecting health data from user input, built-in sensors, and third-party hardware and apps. This central collection of data can then be accessed by app developers to create apps that promote health — and even provide data for medical research. Health data is some of the most personal data we have, and the fact that HealthKit interacts with third parties means there are opportunities for our data to be used, in ways we’re unaware of, to generate profits for businesses.

Here’s how Apple explains the situation: “You decide which information is placed in Health and which apps can access your data through the Health app. When your phone is locked with a passcode or Touch ID, all of your health and fitness data in the Health app — other than what you’ve added to your Medical ID emergency card — is encrypted with your passcode. You can back up data stored in the Health app to iCloud, where it is encrypted while in transit and at rest. Apps that access HealthKit are required to have a privacy policy, so be sure to review these policies before providing apps with access to your health and fitness data.”
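For a sense of what that permission gate looks like in practice, here’s a minimal sketch of an app asking to read step counts through Apple’s HealthKit framework. Nothing flows to the app until you approve the request, and HealthKit deliberately won’t even tell the app whether you said no. The data type chosen here is just an example.

```swift
import HealthKit

let healthStore = HKHealthStore()

// The one data type this hypothetical app wants to read: daily step counts.
guard let stepType = HKObjectType.quantityType(forIdentifier: .stepCount) else {
    fatalError("Step count type unavailable on this platform")
}

healthStore.requestAuthorization(toShare: nil, read: [stepType]) { processed, error in
    // `processed` means the permission sheet was handled, not that access was granted;
    // HealthKit intentionally hides whether the user denied read access.
    if processed {
        print("Authorization request completed; queries simply return no data if denied")
    } else {
        print("Could not request authorization: \(error?.localizedDescription ?? "unknown error")")
    }
}
```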

ResearchKit Announcement at WWDC ’16

App developers are not allowed to share the data they collect with additional third parties (fourth parties?) and must include a privacy policy detailing who has access to user data, when, and for what purposes. Apps may not use or disclose to third parties any data gathered in the health, fitness, and medical research context—including from the HealthKit API, Motion and Fitness, or health-related human subject research—for advertising or other use-based data mining purposes; such data may only be used to improve health management or for health research, and then only with permission. And apps collecting data for human subject medical research must also submit themselves to an independent ethics review board.

All of this requires that:

  1. App developers willingly comply
  2. Apple’s review process is thorough, complete, and accurate

With tens of thousands of HealthKit app developers, and a review process run by humans, the system is exposed to human error, negligence, and exploitation. Apple sets the bar pretty high, and there has been no widely reported instance of misuse of this data to date. All the same, make sure you read the privacy statement of an app before you try it.

ApplePay

ApplePay allows you to securely pay for products and services in apps and, using NFC (Near Field Communication), at participating brick-and-mortar merchants. Banking institutions that support ApplePay are quick to tell their customers that card information isn’t stored by Apple on its servers — in fact, the card information isn’t even on your iOS device. Instead, your device receives and stores a device-specific Device Account Number (DAN) through a series of encrypted communications with your bank. Every time you set up the same payment card on a new device, a new DAN is generated for that device and that payment method, so only that device can authorize payments with its unique DAN. If the device is lost or stolen, only the device’s DAN needs to be cancelled, not the card itself. Apple doesn’t get to see what you bought, and the seller never receives your account number.

Source: insights.moneris.com
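To see why the merchant never touches your card number, here’s a bare-bones sketch of the developer’s side of an ApplePay transaction using Apple’s PassKit framework. The merchant identifier and line item are placeholders; the point is that what comes back to the app is an encrypted, device-specific payment token rather than your card details.

```swift
import PassKit
import UIKit

// A hypothetical merchant setup; the identifier and items are placeholders.
func makeApplePayRequest() -> PKPaymentRequest {
    let request = PKPaymentRequest()
    request.merchantIdentifier = "merchant.com.example.juicebar"   // placeholder
    request.supportedNetworks = [.visa, .masterCard, .amex]
    request.merchantCapabilities = .capability3DS
    request.countryCode = "US"
    request.currencyCode = "USD"
    request.paymentSummaryItems = [
        PKPaymentSummaryItem(label: "Green Smoothie", amount: NSDecimalNumber(string: "6.50"))
    ]
    return request
}

func presentApplePay(from viewController: UIViewController) {
    guard let sheet = PKPaymentAuthorizationViewController(paymentRequest: makeApplePayRequest()) else {
        return
    }
    // In a real app you would also set sheet.delegate; the delegate callback hands the app
    // a PKPayment whose token contains encrypted, device-specific payment data, never the card number.
    viewController.present(sheet, animated: true, completion: nil)
}
```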

This is great, unless someone collects additional data at the time of payment, like your geolocation from Location Services. Here’s an example: if you make an ApplePay purchase at a juice bar in the mall and use their app to sign into their loyalty program, the fact that those two events happened close together in time lets the merchant connect you to your transaction behavior. Most of the time, no big deal. What we must then wonder is how securely the juice bar stores that data. If their system gets hacked, a list of transactions can easily be matched against loyalty-program events, giving the hacker detailed information including your name, phone number, and address, as well as the times and frequency of your visits. So ApplePay itself is hyper-secure, but we can compromise that by volunteering personal data or our identity at the time of purchase.

Call History

A lesser-known, rarely mentioned feature of iCloud is that it saves up to four months of your iPhone call history. You have to have iCloud Drive turned on to activate this feature, but it’s not something Apple makes you aware of, nor do they offer an opportunity to opt out. This call history holds detailed information, including phone numbers, dates and times calls were made or received, durations, and missed and ignored calls. Starting with iOS 10, it also includes data from VoIP apps that use Apple’s new CallKit framework. Four months is twice the length of time mobile carriers in the US hold that information, and the feature has raised concerns because this is exactly the kind of information the FBI and other intelligence and law enforcement agencies desire most. The only way to turn this off: turn off iCloud Drive on your iPhone, iPad, or iPod Touch.

iCloud Keychain

Passwords, passwords, passwords. So many apps and services require them, and requirements for these passwords keep getting more strict. Meanwhile, best practices for data security mean that we shouldn’t repeat passwords, and should change them often. This has led to the rise of password services like 1Password, Dashlane, and LastPass — apps that securely store passwords for all apps and services in one place, and fill in forms automatically as long as you are logged into that service.

What many folks don’t realize is that Apple has a convenient and extremely secure password safe built right into macOS, iOS, and iCloud — it’s called iCloud Keychain. When you type a password on an Apple device, you might be asked if you would like to save the login credentials to the Keychain. If iCloud Keychain is on, this data is then synced to all other Apple devices (macOS and iOS) signed in with the same Apple ID. And though the data is encrypted on the device, in the cloud, and in transit between them, Apple goes a step further by using device-based encryption keys, so that even Apple is unable to unlock your Keychain, and therefore can’t be compelled to. To set it up, Apple requires a separate two-factor verification step beyond what is required to activate a device with your Apple ID.
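Under the hood, this is the same keychain that apps reach through Apple’s Security framework. Here’s a small sketch of saving a credential and marking it synchronizable, which is what enrolls the item in iCloud Keychain (assuming you have iCloud Keychain turned on) so it can appear on your other devices. The account and server names are made up.

```swift
import Foundation
import Security

// A made-up credential to store.
let secret = Data("correct horse battery staple".utf8)

let item: [String: Any] = [
    kSecClass as String: kSecClassInternetPassword,
    kSecAttrServer as String: "www.example.com",         // placeholder website
    kSecAttrAccount as String: "appleseed@example.com",  // placeholder account
    kSecValueData as String: secret,
    kSecAttrSynchronizable as String: true                // opt this item into iCloud Keychain
]

let status = SecItemAdd(item as CFDictionary, nil)
print(status == errSecSuccess ? "Saved to the keychain" : "Keychain error: \(status)")
```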

Diagnostics & Usage


iOS and macOS ask you on initial startup whether you’re willing to share diagnostic and usage information with Apple and third-party app developers to help them improve software and services. If you accept, the system collects logs of usage and behavior that can be sent to Apple and/or app developers when something crashes.

To change your settings on your iOS 10 device, go to Settings > Privacy > Diagnostics & Usage, and select Automatically Send or Don’t Send. To change your settings on macOS, choose Apple menu > System Preferences, click Security & Privacy, click Privacy, click Diagnostics & Usage, then select or deselect “Share crash data with app developers” and select/deselect “Send diagnostic & usage data to Apple.”


On macOS, you can view diagnostic and usage information in the Console app: open Console and click Diagnostic and Usage Data in the left panel. This data is collected even if you opted out of automatically sending it to Apple and app developers. Any lines that include SubmitDiagInfo indicate when diagnostic and usage information was sent to Apple. If you’re logged in as an administrator, you can view any item in the section; if not, you can only view User Diagnostic Reports.

Advertising. Yes, Apple Does That Too

Apple collects personal data to target you for advertisements on their ads platform on iOS and AppleTV. Apple collects what it calls “contextual” information and “targeting” information on users by default. This data is owned by Apple, and is not made available to advertisers on an individual user level, but only in aggregate as targeting groups.

Contextual data is collected into a profile that determines which ads you see. This includes keyboard language settings, device type, connection type, location information (if Location Services is on), App Store searches, and Apple News articles you read.

Targeting data is collected to place you into various groups of users that advertisers can choose to share their ads with. This includes account information (associated with your Apple ID, including name, address, age, gender, etc.), purchase and download history, in-app data (purchases, game levels completed, etc.), and past interactions with ads on their ads platform.

Recently, users have gained new controls over information shared with advertisers:

Change Advertising Settings on iOS (iPhone, iPad, iPod Touch)

In iOS 10, you can change your ad tracking settings in Settings > Privacy > Advertising. You can also control whether you allow location-based ads in Settings > Privacy > Location Services > System Services. These settings affect in-app advertising as well as advertising in Apple News. To be clear, the settings don’t stop ads from appearing; by turning off tracking, you simply stop contributing the data that would make the ads you see more relevant.
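Apps see the effect of that switch through Apple’s AdSupport framework: starting in iOS 10, when Limit Ad Tracking is on, the advertising identifier an app can read comes back as all zeros. A quick sketch:

```swift
import AdSupport

let adManager = ASIdentifierManager.shared()

if adManager.isAdvertisingTrackingEnabled {
    // Tracking allowed: apps and ad networks can use this ID to tie your activity together.
    print("IDFA: \(adManager.advertisingIdentifier.uuidString)")
} else {
    // Limit Ad Tracking is on: the identifier is zeroed, so it can't be used to profile you.
    print("IDFA (zeroed): \(adManager.advertisingIdentifier.uuidString)")
}
```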

Change Advertising Settings on AppleTV

AppleTV lets you manage ad tracking in Settings > General > Privacy. As with iOS, this setting does not stop ads from showing.


Got That? Now Here’s Apple’s #1 Security Vulnerability:

You are.

Or in my case, I am. We all are. Because though Apple’s been vigilant in pursuing the highest levels of privacy and encryption for personal data, there’s still a dead simple way bad guys can gain access to it: just ask for it. Or more likely, phish for it.

Apple's #1 Vulnerability - Phishing

Phishing is a variation on the age-old hacker technique of social engineering (which you may know by less vaunted terms like grifting, conning, or scamming). Social engineering means persuading a real-life human to bypass security protocols; phishing means sending said human fake electronic communications that appear to come from a legitimate source (like Apple), then getting them to share their secure information. These messages usually include a dire warning about a security issue; to fix it, the target is asked to provide their login information or click a fake link, which then harvests that information.

It’s like being barricaded in a castle, then tossing down the keys to a guy who says he’s with Castle Maintenance. In Apple’s case, giving someone access to your Apple ID can give them access to your iCloud account — including contacts, calendar, photos, and iOS device backups. Doesn’t sound like it would work, but it does.

That’s exactly what happened in 2014 when over 500 very personal pictures of celebrities (no, we won’t show them here) were made public. These celebrities had fallen victim to phishing scams, and provided their Apple IDs. And the rest — christened The Fappening — was internet history.

How To Defend Yourself Against Apple (or any) Phishing Scams

  1. NEVER send account login or personal information by text or email.
  2. Change passwords often (hooray for iCloud Keychain for keeping track of this).
  3. When in doubt about any contact from Apple (or any other technology provider), contact their support desk directly. Here’s Apple Customer Support for Apple ID.
  4. When provided, be sure to watch for notifications and emails from technology providers indicating login activity on your account (Apple does this).

But Is All This Enough? A (Sort of) Conclusion

Here’s a reminder of the three common user perspectives on data privacy:

  1. Couldn’t care less
  2. Concerned
  3. Fatalistic

And as I noted in my previous writing on this topic, where you fall on this spectrum will depend in part on what these three words mean to you: Privacy, Choice, and Control.

In the case of Apple, privacy appears to be prioritized over the profits that could be made if Apple were more Google-like in their use of personal data and sold it to the highest bidder. Apple does give us more opportunities to opt out of that business model without giving up the many magical devices and services they provide. But remember: these opt-out opportunities, buried deep in system settings, offer the veneer of choice, giving us an illusion of control.

Compared to Google, Apple seems like an impenetrable tower of privacy. But don’t let a few toggle switches fool you: Apple’s success is just as tied up with personal data as Google’s, and your willingness to trust them with that data is what lets them sell products and services. And unlike Google, nowhere do we get to see what Apple does with our data, what they know about us, or what leverage this information gives them.

I guess we can take their word for it, but there are precious few windows in this tower. Another thing to consider in this walled world: am I experiencing my data and media the way I want, or am I just cruising down the consumer highways of expected behavior? For many (even most?), the slick and ‘simple’ proprietary structures Apple provides are adequate tools for radical creativity and innovation, and any privacy tradeoffs are worthwhile. For others — not so much. And that’s where your own personal decision matrix comes into play. Privacy, Choice, and Control: where do you stand?


Next up: Facebook

If we’re going to line up the 1,000 lb gorillas of personal data, then we need to bring Facebook into the room. Facebook’s been on a trajectory of unprecedented growth, most recently taking on news and entertainment as it moves to become a part of all aspects of life. Pics or it didn’t happen, right? But what happens to all these shares, and what does Facebook do with all this information? Stay tuned!



