Around the world, governments are racing to anchor identity in the digital realm. India’s Aadhaar links more than a billion citizens to fingerprints and iris scans. China couples a sweeping Social Credit System with a pilot central-bank digital currency that tracks and shapes everyday life. The European Union is building its own Digital Identity Wallet to let citizens authenticate across borders. Now the United Kingdom, long a privacy-minded democracy, is preparing to tie every citizen’s legal identity to the device they carry every waking hour: the smartphone.
The UK plan centres on GOV.UK One Login and its companion “digital wallet”: a single, verified national credential for public services. Ministers present the scheme as a way to cut fraud and deter illegal immigration, but it effectively fuses state identity with the most powerful personal tracking device yet invented.
The logic falters on inspection. Employment fraud and informal hiring already evade existing checks. Legal work requires a National Insurance number and, for migrants, an eVisa. Neither prevents cash-in-hand jobs or off-book renting. HM Revenue & Customs estimates the “hidden economy,” largely small employers paying in cash, costs more than £1.8 billion annually.¹ A mandatory phone-based ID cannot stop a landlord from renting a room for cash or a tradesperson from hiring day labourers outside official systems. A centralised credential will not solve a problem rooted in offline cash transactions and weak enforcement. The mismatch exposes the policy’s true vector: surveillance, not fraud control.
Equally revealing is the political framing. Ministers have tied the digital-ID proposal to immigration enforcement, a connection that plays to public concern over border control while doing little to address it. Invoking migration as a rationale for a universal digital credential is a strategic nudge, leveraging a high-salience issue to normalise pervasive identification and tracking for everyone. The linkage is less about solving a genuine immigration problem than about making a surveillance infrastructure more palatable to voters.
Britain’s record on large-scale digital projects offers further caution. Birmingham City Council’s Oracle ERP upgrade, launched with a £19 million budget, is now projected to exceed £100 million and was cited as a factor in the council’s 2023 Section 114 “effective bankruptcy” declaration.² National schemes have fared little better: the NHS National Programme for IT was written off after costs ballooned beyond £12 billion, and the Emergency Services Network has slipped years behind schedule with projected costs over £11 billion.³ These failures underscore how centralised, high-stakes IT projects can overrun in cost and complexity, an omen for a nationwide smartphone ID.
Digital identity itself is not new. More than 270 private firms already provide trusted verification services worth about £2 billion a year, supporting everything from NHS log-ins and HMRC filings to Companies House records.⁴ Existing law, including the Data (Use and Access) Act and the OfDIA trust framework, enables verification as reliable as paper documents without a central government database. The challenge is adoption and trust. Janine Hirt, Chief Executive of Innovate Finance and RegTech UK, welcomed government support for digital identity but warned that a mandatory state credential could “crowd out companies who have been developing innovative solutions” and risk a costly monopoly.⁵
Why, then, insist on the smartphone? Nearly 65% of UK internet use occurs on mobile devices.⁶ The device is rarely more than a few metres away and already tracks far more than location. GPS, Wi-Fi, and cell-tower signals create an unbroken record of movement. Accelerometers and gyroscopes reveal gait and posture. Cameras and depth sensors feed facial-recognition systems. Paired wearables capture heart rate, sleep, and stress. A landmark mobility study showed that just four randomly chosen time-stamped location points are enough to uniquely identify 95% of individuals in a large dataset.⁷ MIT researchers have demonstrated that accelerometer data alone can distinguish individuals by their walking patterns.⁸ Machine-learning models now infer mood or stress from subtle micro-movements and typing rhythms, turning every sensor into a behavioural fingerprint that is constantly refreshed and hard to disguise.
British habits make that fingerprint even richer. Ofcom reports that adults spend more than four hours a day online, overwhelmingly on their phones, and more than 80% keep their handset in the bedroom overnight.⁹ For practical purposes, the smartphone is an extension of the body. Linking a state-mandated ID to that device would give the government a continuously updated map of citizens’ movements and interactions.
Law enforcement already treats the smartphone as an evidentiary treasure chest. Under the Police and Criminal Evidence Act 1984, officers may seize a phone at the moment of arrest without a separate warrant if they believe it contains evidence.¹⁰ Forensic tools such as Cellebrite and GrayKey can, once authorised, recover deleted texts, app data, and location histories. Physical seizure ensures that data are preserved while the owner is cut off from communication at the moment of greatest vulnerability.
Technology companies have long understood the economic value of this intimate record. Google earns the bulk of its revenue by selling targeted advertising based on location and behavioural data. Apple, while marketing privacy as a competitive advantage, now derives more than a quarter of its profits from services that rely on detailed analytics to keep users buying apps, music, and wearables.¹¹ Apple Watch and Android-compatible wearables monitor heart rhythms and sleep cycles, aggregating readings into datasets of immense medical and commercial value.¹² Each new capability, from ultra-wideband chips that allow centimetre-level location tracking to models that infer mood from micro-movements, deepens the companies’ ability to predict and shape behaviour.
Regulators push back, but penalties barely dent the business model. In 2022, Google agreed to pay $391 million to settle investigations by 40 U.S. states after continuing to collect location data when users thought tracking was off.¹³ The Irish Data Protection Commission fined Meta €1.2 billion in 2023 for transferring European data to the United States,¹⁴ and Apple has faced EU probes and fines for failing to obtain valid consent for targeted advertising.¹⁵ These sums remain tiny relative to revenues and therefore weak deterrents.
If private companies can steer shoppers toward a purchase, governments can steer citizens toward desired behaviours. The UK’s Behavioural Insights Team, the original “nudge unit,” applies psychological insights to policy, from encouraging timely tax payments to boosting organ-donor registration.¹⁶ During the pandemic, states worldwide used targeted digital messages and location alerts to influence compliance with health restrictions. Authoritarian regimes go further. China’s Social Credit System aggregates financial records, travel patterns, and social-media activity into a “trustworthiness” score that affects access to loans, jobs, and travel, while the People’s Bank of China’s pilot central-bank digital currency adds programmable money to the mix.¹⁷
Data are not merely descriptive but prescriptive. Knowing when someone is tired, isolated, or financially stressed reveals when they are most open to a nudge, an advert, or a political message. Binding a verified national ID to the smartphone would merge government authority with the commercial telemetry of Apple and Google, creating a single point where identity and behavioural prediction converge.
Some argue, “I have nothing to hide, so why should I care?” This view misses three dangers. Awareness of constant monitoring discourages free expression and association, leading to self-censorship even when no wrongdoing exists. Algorithms misclassify, and a corrupted dataset or misread pattern can wrongly flag someone as a security risk or deny them a service with limited recourse. Predictive systems replicate or amplify discrimination, and credit scoring, insurance pricing, and hiring models can penalise minority groups or the poor even when the raw data are “accurate.”¹⁸ Privacy is not about hiding; it is about the freedom to live without unjustified oversight or algorithmic gatekeeping.
The implications for personal freedom reach far beyond individual privacy. Algorithmic feeds can amplify some views and bury others, shaping discourse without overt coercion. Location profiles enable dynamic pricing and personalised insurance rates. Lenders already experiment with phone-based behavioural signals as proxies for credit risk. Security risks compound the problem. Unlike a password, a facial template or a detailed location history cannot be changed once leaked. India’s Aadhaar programme, linking fingerprints and iris scans to welfare and banking, has suffered repeated breaches and documented exclusion when biometric verification fails.¹⁹ Even if the UK’s digital-ID credentials are strongly encrypted, the smartphones that hold them will continue to run third-party apps, each a potential entry point for attackers or inadvertent data leakage.
The United States shows how easily government and corporate interests can align. Under Section 702 of the Foreign Intelligence Surveillance Act, U.S. intelligence agencies can compel American service providers to provide communications data of non-U.S. persons abroad with only a secret court’s approval. The 2018 CLOUD Act widened that reach by allowing U.S. authorities to demand information held on servers anywhere in the world so long as the company controlling the data is subject to U.S. jurisdiction.²⁰ Because Google, Apple, Microsoft, and Amazon dominate global cloud infrastructure, the practical effect is near-global access to data routed through their systems.
Europe presents a similar tension even while positioning itself as a privacy leader. The proposed European Digital Identity Wallet promises decentralised control, but member states continue to push for data retention rules, citing security concerns. GDPR provides strong formal protections, but national-security exceptions show how robust frameworks bend under political pressure. At the same time, both Brussels and London face lobbying from Big Tech and the U.S. administration to weaken or delay the EU’s Digital Markets Act and related privacy measures.²¹ China offers a glimpse of the endpoint as its Social Credit System aggregates vast datasets into a trust score that determines eligibility for loans, travel, or education, and the e-CNY pilot gives the state programmable money and real-time visibility of transactions.²²
Digital identity is not inherently a tool of oppression. Properly designed, it can reduce fraud and speed transactions without eroding privacy. A safer model is “self-sovereign identity,” in which individuals hold cryptographically signed credentials in a secure wallet they control. Verification is peer-to-peer, so a bank can confirm age or residency without learning anything else, and no central server stores every transaction. Credentials can be revoked or rotated at will. The UK could adopt this federated approach, issuing a core “UK citizen” credential while banks, universities, and healthcare providers supply others. The One Login app could become a standards-based wallet under the user’s cryptographic control, backed by strong encryption, open-source code, and independent audits.
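The selective-disclosure property described above, confirming age or residency without revealing anything else, can be sketched with salted hash commitments, the idea behind standards such as SD-JWT. The sketch below is a simplified illustration, not the One Login design: an HMAC with an issuer key stands in for a real digital signature (a production system would use asymmetric keys such as Ed25519, so verifiers need no secret), and all names and fields are hypothetical.

```python
import hashlib
import hmac
import json
import secrets

ISSUER_KEY = secrets.token_bytes(32)  # hypothetical issuer secret

def digest(salt: bytes, name: str, value: str) -> str:
    """Salted hash commitment to one attribute."""
    return hashlib.sha256(salt + name.encode() + value.encode()).hexdigest()

def issue(attributes: dict) -> dict:
    """Issuer: commit to each attribute with a salted hash, then
    sign only the sorted list of digests, never the values."""
    salts = {k: secrets.token_bytes(16) for k in attributes}
    digests = sorted(digest(salts[k], k, v) for k, v in attributes.items())
    signature = hmac.new(ISSUER_KEY, json.dumps(digests).encode(),
                         hashlib.sha256).hexdigest()
    return {"digests": digests, "signature": signature, "salts": salts}

def present(cred: dict, attributes: dict, reveal: str) -> dict:
    """Holder: disclose one attribute plus its salt, nothing else."""
    return {"digests": cred["digests"], "signature": cred["signature"],
            "name": reveal, "value": attributes[reveal],
            "salt": cred["salts"][reveal]}

def verify(p: dict) -> bool:
    """Verifier: check the signature over the digests, then that the
    disclosed attribute hashes to one of them."""
    expected = hmac.new(ISSUER_KEY, json.dumps(p["digests"]).encode(),
                        hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, p["signature"])
            and digest(p["salt"], p["name"], p["value"]) in p["digests"])

attrs = {"over_18": "true", "resident": "UK", "name": "A. Citizen"}
cred = issue(attrs)
proof = present(cred, attrs, "over_18")  # the bank learns only this field
assert verify(proof)
```

The design choice matters: because the issuer signs hashes rather than values, the holder decides at presentation time which attributes to reveal, and no central server needs to log the transaction, which is exactly the property a centralised One Login database would forgo.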
Britain, therefore, faces a pivotal decision. It can follow the gravitational pull of convenience toward a centralised smartphone ID that doubles as a tracking badge, or build a decentralised system that treats identity as a personal right. Convenience must not trump control. A trustworthy digital identity should empower citizens, not monitor them. The smartphone is already the universal key to modern life. The question is whether that key remains in the hands of citizens or in the pocket of the state.
illuminem Voices is a democratic space presenting the thoughts and opinions of leading Sustainability & Energy writers, their opinions do not necessarily represent those of illuminem.
References
1. HM Revenue & Customs, Measuring Tax Gaps 2023 Edition (UK Government, 2023).
2. Birmingham City Council, Section 114 Notice and Audit Committee Reports (2023).
3. UK National Audit Office, NHS National Programme for IT (2011); UK Home Office, Emergency Services Network Progress Review (2024).
4. UK Cabinet Office, Digital Identity and Attributes Trust Framework (2023).
5. Innovate Finance, Digital Identity Consultation Response (2024).
6. Ofcom, Online Nation 2023 Report (2023).
7. Yves-Alexandre de Montjoye et al., “Unique in the Crowd: The Privacy Bounds of Human Mobility,” Scientific Reports (2013).
8. MIT Media Lab, “Gait Recognition via Accelerometer Data” (2021).
9. Ofcom, Adults’ Media Use and Attitudes (2023).
10. Police and Criminal Evidence Act 1984 (UK Statute).
11. Alphabet Inc. Annual Report (2023); Apple Inc. Annual Report (2023).
12. Apple, “Health Research Highlights” (2023).
13. “Google to Pay $391.5 Million in Location-Tracking Settlement,” Michigan Attorney General (2022).
14. Irish Data Protection Commission, “Meta Platforms Ireland Decision” (May 2023).
15. Irish Data Protection Commission, “Apple Privacy Enforcement Action” (2021).
16. UK Cabinet Office, Behavioural Insights Team Annual Report (2022).
17. Rogier Creemers, “China’s Social Credit System,” Leiden University (2022); International Monetary Fund, “e-CNY Pilot” (2023).
18. Solon Barocas and Andrew Selbst, “Big Data’s Disparate Impact,” California Law Review (2016).
19. Human Rights Watch, “India: Aadhaar Failures and Data Breaches” (2024).
20. U.S. Congress, “Foreign Intelligence Surveillance Act Section 702; CLOUD Act” (2018).
21. European Commission, “EU–U.S. Privacy and Digital Markets Act Briefings” (2024).
22. International Monetary Fund, “Central Bank Digital Currency in China” (2023).