A recent report in the Financial Times reveals that app developers are considering surreptitious new forms of user tracking in order to circumvent Apple’s new privacy rules. The rules, introduced late last year, look set to bring sweeping changes to the mobile advertising industry over the coming months. While users and privacy campaigners may welcome the news, app developers risk a dramatic cut in revenue. Faced with such a bleak outlook, some developers are reportedly exploring underhand forms of user tracking to evade Apple’s rules. Tempting though this may be, those who do so risk not only exclusion from Apple’s App Store, but also enforcement action under European data protection rules and potential group litigation claims (‘class actions’) for misuse of private information. 

Apple’s new privacy rules

Apple first announced its intention to provide more transparency around iOS app data collection in June of last year, with the launch of ‘privacy labels’ on its App Store. The new labels will be required for apps on all of Apple’s platforms, including iOS, iPadOS, macOS, watchOS and tvOS. Any new or updated app must include a privacy label, or it will not be permitted on the App Store. Apple will also ensure that its own software, such as Apple Music, Apple TV and Apple Wallet, is labelled. 

Apple breaks data collection down into three categories: ‘data used to track you’; ‘data linked to you’; and ‘data not linked to you’. Tracking involves the collection of app users’ personal data or location data. App developers then link this data to other businesses’ apps or sites for purposes such as targeted advertising and other ad-related activities, generating revenue streams.  

As well as introducing these privacy ‘nutrition labels’, Apple announced late last year that, as part of the iOS 14 update, it would give users the choice to block apps from accessing the IDFA. An IDFA (Identifier for Advertisers) is a unique identifier for a mobile device that is used to target advertising and measure its effectiveness at an individual user level. The change means that app developers will need to obtain users’ permission to collect and share this data. The effect is likely to be significant. According to one source, ‘Currently, about 70% of iOS users share their IDFA data with app publishers; after this change it’s estimated that this number will drop to 10% or 15%’.
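In practice, that permission is requested through Apple’s App Tracking Transparency framework. The minimal sketch below assumes an iOS 14+ app with an NSUserTrackingUsageDescription entry in its Info.plist; it shows the prompt a developer must now present before the IDFA becomes readable. If the user declines, the identifier is returned as all zeroes.

```swift
import AppTrackingTransparency
import AdSupport

/// Minimal sketch: ask the user for tracking permission before reading the IDFA.
func requestTrackingPermission() {
    if #available(iOS 14, *) {
        ATTrackingManager.requestTrackingAuthorization { status in
            switch status {
            case .authorized:
                // Permission granted: the real IDFA is available.
                let idfa = ASIdentifierManager.shared().advertisingIdentifier
                print("IDFA: \(idfa.uuidString)")
            default:
                // Denied, restricted or not determined: the IDFA comes back as
                // 00000000-0000-0000-0000-000000000000 and cannot be used to
                // track the user across other companies' apps and websites.
                print("Tracking not authorized: \(status)")
            }
        }
    }
}
```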

Apple’s goal is to address privacy concerns that arise where it has not been clear to individuals how app developers use their personal data. Users and privacy advocates will welcome Apple’s changes. However, app developers now face an unenviable choice between accepting the changes and watching their revenues plummet, or attempting an unauthorized ‘work-around’ and being thrown out of the App Store. The App Store is the gateway to an economy estimated to be worth US$500bn. 

Clandestine work-arounds

Some app developers are reportedly considering ‘device fingerprinting’ to track users covertly. Device fingerprinting works by correlating a combination of a device’s hardware and software characteristics, such as its internet connection, battery and language settings, and usage patterns. The technique can be used to recognize repeat visits from the same device, and may be used across multiple apps. It is banned on the App Store, but it is difficult to detect. 
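Part of the reason fingerprinting is hard to detect is that it relies only on ordinary system information that any app can read without special permission. The purely illustrative, hypothetical sketch below shows the basic idea: individually innocuous attributes are combined and hashed into a stable identifier, which is also why the resulting value is likely to count as personal data under the GDPR.

```swift
import UIKit
import CryptoKit

/// Conceptual illustration only: combining ordinary device attributes into a
/// stable hash. This is the kind of technique Apple bans on the App Store.
func deviceFingerprint() -> String {
    let signals = [
        UIDevice.current.model,          // e.g. "iPhone"
        UIDevice.current.systemVersion,  // OS version
        Locale.current.identifier,       // language/region settings
        TimeZone.current.identifier,     // time zone
        "\(UIScreen.main.nativeBounds)", // screen dimensions
    ].joined(separator: "|")

    // Hash the combined signals so the result looks like an opaque ID.
    let digest = SHA256.hash(data: Data(signals.utf8))
    return digest.map { String(format: "%02x", $0) }.joined()
}
```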

Another clandestine approach involves the use of ‘hashed emails’, whereby email addresses used to sign up for services and games are converted into a string of letters and numbers. Because the same address always produces the same hash, hashing enables companies to match and share users’ details without exchanging the email address itself. 
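As a rough sketch of the conversion (assuming SHA-256, one common choice for this purpose), the process looks like this. The output is deterministic: anyone hashing the same address gets the same string, which is why a hashed email can still act as an identifier.

```swift
import Foundation
import CryptoKit

/// Illustrative only: turn an email address into a stable string of letters
/// and numbers. The raw address is never shared, but the hash still picks out
/// the same user wherever the same address is hashed.
func hashedEmail(_ email: String) -> String {
    let normalised = email.lowercased().trimmingCharacters(in: .whitespacesAndNewlines)
    let digest = SHA256.hash(data: Data(normalised.utf8))
    return digest.map { String(format: "%02x", $0) }.joined()
}

// Example: always produces the same 64-character hash for this address.
print(hashedEmail("Jane.Doe@example.com"))
```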

Application of the GDPR

Developers that use surreptitious tracking techniques risk more than being barred from the App Store: they may also be in breach of European data protection rules. The General Data Protection Regulation (GDPR) regulates the processing of ‘personal data’, that is, information that directly or indirectly identifies a living individual. The GDPR definition of personal data includes any ‘identifier such as a name, an identification number, location data, an online identifier’, which could cover a device ID, an IDFA or potentially a hashed email address. In other words, any identifier that an app developer is able to tie to a specific user is likely to be personal data and hence subject to the GDPR. 

The GDPR applies to businesses that are established in the European Union. It also applies to businesses established outside the EU which offer goods and services to individuals in the European Union, or monitor their behaviour. This ‘long arm’ legislation applies regardless of whether payment is required from individuals. Accordingly, a developer established in the US that offers free games to people in Europe is likely to be subject to the GDPR. Equivalent rules also apply in the post-Brexit United Kingdom, thanks to the ‘UK GDPR’.

A fundamental principle of the GDPR is that personal data must be processed lawfully, fairly and in a transparent manner. In practice, app developers will almost certainly need to establish users’ informed consent in order to collect their information. The GDPR requirements around consent are strict, and many prevalent practices around consent are likely to be unlawful. In simple terms, any practice that a user would regard as sneaky, creepy or dishonest is likely to breach the lawfulness, fairness and transparency principle, and hence the GDPR. Clandestine tracking falls squarely into the ‘sneaky, creepy and dishonest’ category and, if it came to the attention of the data protection authorities, could result in enforcement action.

This is more than a theoretical risk. In the UK, the Information Commissioner’s Office (ICO) has already conducted an investigation into the workings of the ad tech sector, and expressed concerns about what it described as ‘invisible processing’, i.e. processing of which the individual is unaware. Nor is the ICO a toothless regulator: since the GDPR took effect in May 2018, it has fined British Airways £20 million (reduced from £183 million due to the coronavirus pandemic), Marriott Hotel Group £18.4 million (reduced from £99 million) and Ticketmaster £1.25 million. Developers would be unwise to ignore this risk. 

Other risks

In the UK, developments in case law mean that individuals may claim compensation where their personal data has been misused. The first group litigation claim (or ‘class action’) for a personal data breach in the UK was recently brought against British Airways. The claim follows a data breach that took place in 2018 and reportedly affected 432,000 people. Lawyers acting for 16,000 affected individuals estimate that each could be entitled to £2,000 in compensation from BA. While this is not a particularly significant sum for any one individual, in aggregate the value of the claim is likely to be in the region of £32 million. 

In the UK, it is early days for this type of legal action and it could come to nothing. However, if the litigation were to succeed, data breach group litigation claims could become more widespread. An app that surreptitiously collects users’ personal information could give rise to claims for misuse of private information, potentially on a large scale, depending on how widely the app has been distributed. App developers considering flouting the rules should at least add this to their risk registers. 

How business leaders in this sector should manage these risks

App developers may find themselves between a rock and a hard place as a result of Apple’s new privacy rules. Faced with the prospect of dwindling revenues, some may see surreptitious tracking as a potential solution. However, any developer considering that route must weigh the risks carefully: exclusion from the App Store, regulatory enforcement under the GDPR and the prospect of group litigation.