In this session we bust Apple's differential privacy claims for iOS devices by reverse-engineering telemetry data. We illustrate how the privacy-preserving algorithm suffers from systemic implementation issues, how those issues lead to re-identification risk, how advertising IDs and hardware IDs are misused to fingerprint users, and what needs to be done to preserve privacy. We start with a working threat model of an iOS device, layer privacy threats on top, and expose how personal information is collected and abused in advertising. We then expose the flaws in the differential privacy implementation, despite claims that user data is only used in aggregate form; a sketch of the underlying failure mode follows below.

What makes this session innovative? First, it dives deep into threat modelling the implementation of Apple's differential privacy algorithm. Second, it exposes the weaknesses of the algorithm and the harm that present-day data aggregation methods cause users when applied to ad targeting.
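To make the core failure mode concrete, here is a minimal sketch of local differential privacy using classic randomized response. This is not Apple's actual mechanism (Apple's deployment is reportedly based on count-sketch constructions), and the epsilon value, names, and report counts below are hypothetical. The point it illustrates is the one the session builds on: each individual report is epsilon-private in isolation, but repeated reports about the same underlying value compose, so an observer who collects enough telemetry can recover the truth with high confidence.

```python
import math
import random

def randomized_response(true_bit: int, epsilon: float) -> int:
    """epsilon-locally-private report of one bit: tell the truth with
    probability e^eps / (e^eps + 1), otherwise flip the bit."""
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return true_bit if random.random() < p_truth else 1 - true_bit

def attacker_guess(reports: list) -> int:
    """Maximum-likelihood guess of the true bit: since the truth
    probability exceeds 1/2, the majority report value is the ML estimate."""
    return 1 if 2 * sum(reports) > len(reports) else 0

if __name__ == "__main__":
    EPSILON = 1.0   # hypothetical per-report budget, not Apple's real value
    TRUE_BIT = 1    # the sensitive value the device keeps re-reporting
    TRIALS = 1000
    for n in (1, 10, 50):
        hits = sum(
            attacker_guess([randomized_response(TRUE_BIT, EPSILON)
                            for _ in range(n)]) == TRUE_BIT
            for _ in range(TRIALS)
        )
        # Basic composition: n reports spend n * EPSILON of privacy budget,
        # so attacker accuracy climbs toward 100% as reports accumulate.
        print(f"{n:>3} reports per user -> attacker guesses right "
              f"{hits / TRIALS:.0%} of the time")
```

Running the sketch shows attacker accuracy rising from roughly a coin flip at one report toward near certainty at fifty, which is why per-report privacy guarantees say little if the privacy budget is not enforced across a device's lifetime of submissions.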