Surveillance Capitalism: Data, Behavior & Power
- Surveillance capitalism is an economic system that monetizes personal data through behavioral prediction and manipulation, reshaping sectors from advertising to governance.
- It leverages digital tracking, IoT, and machine learning technologies to extract actionable insights from user interactions for profit.
- Ongoing debates call for enhanced regulatory, ethical, and technical interventions to safeguard autonomy and ensure data justice.
Surveillance capitalism is an economic logic and mode of accumulation in which human experience becomes free raw material for extraction, behavioral prediction, and influence, as theorized by Zuboff (2019) and subsequently elaborated across a spectrum of empirical and philosophical studies. It operates by appropriating digital traces, intentional and unintentional, from users on digital platforms and connected devices, translating them into structured data assets that are algorithmically modeled to optimize, predict, and ultimately steer behavior for profit (Wang, 19 Jun 2025, Foini, 2023, Bonfils, 10 Aug 2025, Valeriani, 2023, Jin, 2022, Zard, 2023, Adams et al., 2018, Dehaye et al., 2020, Cremonini, 2023, Torre et al., 10 May 2024, Belgroun et al., 8 Sep 2025, Michel et al., 26 Feb 2024, Olof-Ors et al., 9 Nov 2025). This system is not confined to advertising but extends to insurance, employment, finance, law enforcement, health, and political governance.
1. Theoretical Foundations and Definitions
Surveillance capitalism is formally characterized by:
- Behavioral data appropriation: Corporations systematically harvest data from user interactions (clicks, GPS traces, sleep patterns, social networks, biometric sensors) with or without explicit awareness or consent (Wang, 19 Jun 2025, Bonfils, 10 Aug 2025, Adams et al., 2018).
- Predictive and behavioral modulation: Extracted data populate proprietary machine learning models for profiling, prediction, and direct modification of behavior (recommendation engines, nudges, scoring systems) (Wang, 19 Jun 2025, Bonfils, 10 Aug 2025, Zard, 2023, Foini, 2023).
- Profit via behavioral surplus: The economic value derives not merely from data, but from "behavioral surplus": actionable predictions and interventions that generate new markets for influencing and selling future behavior (Foini, 2023, Bonfils, 10 Aug 2025, Torre et al., 10 May 2024).
- Instrumentarian power: This term (Zuboff) designates the capacity to shape, direct, and control populations without recourse to traditional coercive or ideological mechanisms, operating instead through continuous, algorithmically mediated feedback loops (Wang, 19 Jun 2025, Olof-Ors et al., 9 Nov 2025).
Surveillance capitalism is analytically distinguished from:
- Datafication: Turning lived activities into data. Surveillance capitalism is a subset where data are systematically monetized at scale (Adams et al., 2018).
- Platform capitalism: Ownership and operation of digital marketplaces may coexist with or enable surveillance capitalism but does not require behavioral extraction as a core revenue driver (Adams et al., 2018).
2. Technical Infrastructures and Data Extraction Mechanisms
Surveillance capitalists exploit a layered technical stack designed for pervasive data capture, transport, modeling, and activation:
- Web and mobile tracking: Cookies (first- and third-party), fingerprinting, pixel tags, SDKs embedded across apps, and SSO authentication tokens facilitate continuous, cross-site user tracking (Adams et al., 2018, Bonfils, 10 Aug 2025, Dehaye et al., 2020, Jin, 2022).
- Wearables and IoT: Biometric streams (heart-rate, GPS, sleep), sensor data, Bluetooth beacons, and machine-generated location signatures further densify the data substrate (Wang, 19 Jun 2025, Jin, 2022).
- Cloud-based aggregation: Identity-linked and pseudonymized data converge in data lakes, supporting high-dimensional user vectors for ML-based profiling (Wang, 19 Jun 2025, Foini, 2023).
- Machine learning pipelines: Models (deep networks, clustering, regression) infer latent traits, preferences, vulnerabilities, and micro-tendencies for individualized product, risk, or behavioral targeting (Wang, 19 Jun 2025, Foini, 2023, Zard, 2023).
- Behavioral feedback loops: Recommendations, nudges, and dynamically personalized interfaces feed behavioral outputs back into the models, intensifying modulation (Michel et al., 26 Feb 2024, Torre et al., 10 May 2024).
A representative formalization (from (Valeriani, 2023)) can be written as

$$V_i = M\big(\Phi(D_i)\big),$$

where $D_i$ is the raw data footprint for user $i$, $\Phi$ maps data to profile vectors, and $M$ monetizes the resulting profiles.
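As a minimal illustration of this formalization, the following sketch maps a hypothetical raw event log to a profile vector and then to a predicted monetary value. The event schema, the feature map `phi`, and the monetization function below are illustrative assumptions rather than the model specified in (Valeriani, 2023).

```python
from collections import Counter

TOPICS = ["fitness", "insurance", "travel"]
CPC = {"fitness": 0.80, "insurance": 2.50, "travel": 1.20}  # assumed cost-per-click values

# Hypothetical raw data footprint D_i: a list of behavioral events for one user.
D_i = [
    {"type": "click",  "topic": "fitness"},
    {"type": "search", "topic": "insurance"},
    {"type": "dwell",  "topic": "fitness"},
    {"type": "click",  "topic": "fitness"},
]

def phi(events):
    """Phi: map raw events to a fixed-length profile vector of per-topic interest shares."""
    counts = Counter(e["topic"] for e in events)
    total = sum(counts.values()) or 1
    return [counts[t] / total for t in TOPICS]

def monetize(profile):
    """M: toy monetization, expected ad revenue as an interest-weighted cost-per-click."""
    return sum(share * CPC[topic] for share, topic in zip(profile, TOPICS))

V_i = monetize(phi(D_i))  # V_i = M(Phi(D_i))
print(f"profile = {phi(D_i)}, expected value per impression = {V_i:.2f}")
```

In deployed systems the profile vector would be high-dimensional and continuously updated, and the monetization step would feed real-time bidding rather than a fixed cost-per-click table.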
3. Scope, Empirical Evidence, and Industrial Actors
Large-scale studies provide quantitative and qualitative evidence for the pervasiveness and normalization of surveillance capitalism:
- Patent pipeline evidence: 79% of patent-cited CVPR papers from the 2010s contributed to surveillance-enabled patents; >11,000 such patents cite academic research on human data extraction (Kalluri et al., 2023).
- Tracking reach metrics: Google trackers appear on ~78% of observed web page loads, Facebook trackers on 21%, and Amazon trackers on 17%, forming a three-tier stratification in corporate surveillance reach (Bonfils, 10 Aug 2025); a measurement sketch follows this list.
- Bluetooth tracking networks: Commercial platforms (Apple, Samsung, Tile) mobilize user devices as unpaid digital labor, scanning and uploading BLE metadata en masse for proprietary aggregation (Jin, 2022).
- Ambient datafication: Epidemiological tools (proximity tracing apps) become structurally embedded in surveillance infrastructure through re-identification and biosurveillance attacks that exploit Android permission models and third-party SDKs with >100 million installs (Dehaye et al., 2020).
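The reach figures above can be approximated with a simple measurement script. The sketch below is illustrative only: the log format and the operator-to-domain mapping are assumptions, not the crawl methodology of (Bonfils, 10 Aug 2025).

```python
# Each page load is recorded as the set of third-party domains it contacted (hypothetical log).
page_loads = [
    {"google-analytics.com", "doubleclick.net", "cdn.example.org"},
    {"facebook.net", "googletagmanager.com"},
    {"amazon-adsystem.com"},
    {"cdn.example.org"},
]

TRACKER_OPERATORS = {
    "Google":   {"google-analytics.com", "doubleclick.net", "googletagmanager.com"},
    "Facebook": {"facebook.net", "facebook.com"},
    "Amazon":   {"amazon-adsystem.com"},
}

def reach(operator_domains, loads):
    """Fraction of page loads contacting at least one domain attributed to the operator."""
    hits = sum(1 for load in loads if load & operator_domains)
    return hits / len(loads)

for operator, domains in TRACKER_OPERATORS.items():
    print(f"{operator}: present on {reach(domains, page_loads):.0%} of page loads")
```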
Dominant actors: Elite US and Chinese universities, "big tech" platforms (Google, Facebook/Meta, Amazon, Microsoft) and data-broker intermediaries (Acxiom, Epsilon, Equifax) operate as central nodes (Kalluri et al., 2023, Bonfils, 10 Aug 2025, Cremonini, 2023, Jin, 2022).
4. Behavioral Influence, Manipulation, and Autonomy
Surveillance capitalism focuses not just on inference, but on behavioral modulation and manipulation:
- Nudging via interface and feedback design: Gamification (leaderboards, badges), default setting manipulation, and notification schemes modulate user routines, often transforming "empowerment" into neoliberal self-discipline (Wang, 19 Jun 2025, Michel et al., 26 Feb 2024, Zard, 2023).
- Online Behavioral Advertising (OBA): Real-time ad auctions use tracking-derived "interest" and "psychographic" profiles for microtargeting, covertly exploiting users' layered vulnerabilities (Zard, 2023); a stylized auction sketch follows this list.
- Algorithmic emotional governance: Content-recommendation systems optimize for high-arousal/negative engagement, reinforcing filter bubbles, emotional polarization, and viral dissemination of misinformation (Michel et al., 26 Feb 2024, Torre et al., 10 May 2024, Belgroun et al., 8 Sep 2025).
- Anthropomorphic AI agents: Chatbots and virtual assistants implement "simulated affection" and engineered trust mechanisms to elicit persistent, intimate self-disclosure, strengthening behavioral surplus and dependency (Olof-Ors et al., 9 Nov 2025).
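To make the auction mechanics concrete, the sketch below implements a stylized real-time auction in which each advertiser bids as a linear function of a tracking-derived profile and the winner pays the second-highest bid (one common pricing rule, though many exchanges now run first-price auctions). The profile fields and advertiser weights are hypothetical.

```python
# Tracking-derived user profile (hypothetical signals inferred from browsing and sensor data).
user_profile = {"interest_insurance": 0.9, "interest_travel": 0.2, "recent_life_event": 1.0}

# Each advertiser values profile signals differently (hypothetical value-per-signal weights).
advertisers = {
    "insurer_A":  {"interest_insurance": 3.0, "recent_life_event": 1.5},
    "travel_B":   {"interest_travel": 2.0},
    "retailer_C": {"interest_insurance": 0.5, "interest_travel": 0.5},
}

def bid(weights, profile):
    """Bid = advertiser's expected value of this impression given the profile."""
    return sum(w * profile.get(signal, 0.0) for signal, w in weights.items())

bids = sorted(((bid(w, user_profile), name) for name, w in advertisers.items()), reverse=True)
(win_bid, winner), (second_bid, _) = bids[0], bids[1]
print(f"{winner} wins with bid {win_bid:.2f} and pays the second price {second_bid:.2f}")
```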
This pervasive behavioral steering erodes personal and collective autonomy, narrowing the space for independent decision-making and rational reflection (Foini, 2023, Michel et al., 26 Feb 2024, Olof-Ors et al., 9 Nov 2025).
5. Social, Psychological, and Environmental Externalities
Surveillance capitalism imposes a range of negative externalities:
- Attention commodification: Platforms capture, fragment, and auction "brain time" as a divisible, rivalrous commodity, often resulting in overexploitation characteristic of a "tragedy of the attentional commons" (Belgroun et al., 8 Sep 2025).
- Mental and physical health impacts: Excessive attention capture is associated with increased anxiety, depression, ADHD symptoms, sleep disruption, and harms to child development (Belgroun et al., 8 Sep 2025, Michel et al., 26 Feb 2024).
- Democratic and societal harms: Algorithmic ecosystems facilitate political radicalization, misinformation virality, and erosion of civic deliberation (Michel et al., 26 Feb 2024, Torre et al., 10 May 2024, Belgroun et al., 8 Sep 2025).
- Agency and labor: Users become uncompensated data laborers, producing economic value through passive device operation and engagement, without recourse to meaningful consent or agency (Jin, 2022, Wang, 19 Jun 2025).
- Environmental costs: Web tracking inflates bandwidth use (up to 10 MB per page for some trackers), increasing power consumption and carbon emissions; ad-blocking therefore functions as a partial environmental mitigation (Bonfils, 10 Aug 2025). A back-of-envelope sketch follows this list.
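A back-of-envelope calculation shows how tracking overhead compounds at scale. The traffic volume and the energy and carbon coefficients below are assumed, order-of-magnitude placeholders, not figures reported in (Bonfils, 10 Aug 2025).

```python
# Assumed, order-of-magnitude coefficients; replace with measured values for real estimates.
TRACKING_MB_PER_PAGE = 10        # upper-bound tracking payload per page load (MB)
PAGE_LOADS_PER_DAY = 1_000_000   # hypothetical traffic for one mid-sized site
KWH_PER_GB_TRANSFER = 0.05       # assumed network + data-center energy intensity (kWh/GB)
KG_CO2_PER_KWH = 0.4             # assumed grid carbon intensity (kg CO2e/kWh)

gb_per_day = TRACKING_MB_PER_PAGE * PAGE_LOADS_PER_DAY / 1024
kwh_per_day = gb_per_day * KWH_PER_GB_TRANSFER
co2_per_day = kwh_per_day * KG_CO2_PER_KWH

print(f"Tracking overhead: {gb_per_day:,.0f} GB/day, "
      f"{kwh_per_day:,.0f} kWh/day, {co2_per_day:,.0f} kg CO2e/day")
```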
6. Regulatory, Ethical, and Design Interventions
Policy responses and reform proposals address, but have not yet resolved, foundational surveillance capitalism dynamics:
- Consent and data minimization: While GDPR, CCPA, and similar regulations enshrine consent and minimization, default-permissive settings, dark patterns, and post-consent data practices undermine genuine user control (Wang, 19 Jun 2025, Cremonini, 2023, Adams et al., 2018).
- Transparency and auditability: Proposals include algorithmic transparency, human-auditable A/B-testing, mandated disclosure of data flows, and right to contest model-based risk assessments (Wang, 19 Jun 2025, Torre et al., 10 May 2024, Bonfils, 10 Aug 2025, Olof-Ors et al., 9 Nov 2025).
- Governance innovation: Data trusts, collective bargaining intermediaries, and design-justice frameworks advocate for shifting from contractual "user rights" to "citizen rights," emphasizing collective control, democratic accountability, and community stewardship of data (Wang, 19 Jun 2025, Jin, 2022).
- Technical architectures: Decentralized data processing (edge/on-device computation), active friction in data submission, and privacy-preserving alternatives (e.g., the Gemini protocol, cryptographic cohorts) offer concrete technical countermeasures to extraction (Bonfils, 10 Aug 2025, Wang, 19 Jun 2025, Jin, 2022); see the sketch after this list.
- Economic regulation: Pigouvian taxes on attention capture, antitrust measures, and incentive realignment reflect efforts to internalize social costs and curb the "attention arbitrage" foundations of surveillance capitalism (Belgroun et al., 8 Sep 2025, Michel et al., 26 Feb 2024).
- Socio-structural interventions: Strengthening public mental health infrastructure, promoting participatory design, and emphasizing the value of individual and collective autonomy recenter human flourishing over optimization for profit (Olof-Ors et al., 9 Nov 2025, Torre et al., 10 May 2024).
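As one concrete instance of privacy-preserving, on-device processing, the sketch below applies classic randomized response, a local differential privacy technique: each device perturbs its own report before transmission, so aggregate statistics remain estimable while any individual report is deniable. The attribute and parameters are illustrative and not tied to a specific protocol in the cited works.

```python
import random

def randomized_response(true_value: bool, p_truth: float = 0.75) -> bool:
    """Report the true binary attribute with probability p_truth, else a fair coin flip.

    No individual report can be taken at face value, limiting per-user profiling,
    while the server can still estimate population-level frequencies.
    """
    if random.random() < p_truth:
        return true_value
    return random.random() < 0.5

def estimate_true_rate(reports, p_truth=0.75):
    """Debias the aggregate: observed_rate = p_truth * true_rate + (1 - p_truth) * 0.5."""
    observed = sum(reports) / len(reports)
    return (observed - (1 - p_truth) * 0.5) / p_truth

# Simulated population where 30% of users truly have the sensitive attribute.
truths = [random.random() < 0.3 for _ in range(100_000)]
reports = [randomized_response(t) for t in truths]
print(f"estimated rate = {estimate_true_rate(reports):.3f} (true rate 0.300)")
```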
Comprehensive regulation must move beyond the checkbox-consent paradigm to interrogate asymmetrical power structures, enforce accountable oversight, and develop architectures that foreground agency, justice, and care (Wang, 19 Jun 2025, Cremonini, 2023).
7. Critiques, Controversies, and Future Directions
Empirical studies challenge claims of omnipotent surveillance: targeting efficacy is only marginally better than classical approaches, and algorithmic exclusions can reinforce data "deserts" or misclassifications (Cremonini, 2023). The received narrative of surveillance capitalism's technical infallibility is often perpetuated by both critics and advocates, obscuring the nuanced, sociotechnical reality of adaptation, resistance, and incremental reform.
Emerging directions include:
- Beyond individual optimization: Advocacy for shifting the focus from personal productivity or micro-optimization to structural reforms addressing collective data justice, ecological attention, and democratic accountability (Wang, 19 Jun 2025, Torre et al., 10 May 2024).
- Digital commons and public mission: Positioning platforms as public or civic infrastructure, subject to transparent governance, mission-based mandates, and educational obligations (Michel et al., 26 Feb 2024, Torre et al., 10 May 2024).
- Robust empirical monitoring: Longitudinal and fine-grained tracking of surveillance practices, protocols, and regulatory impacts (e.g., auditing browser APIs, SEC filings, and device-level data flows) (Bonfils, 10 Aug 2025).
- Paradigm shift in governance and design: Implementing architectures of care, friction, and deliberative agency over exploitation and seamless manipulation, with policy instruments targeting both technical and socio-economic vectors of exploitation (Wang, 19 Jun 2025, Torre et al., 10 May 2024, Belgroun et al., 8 Sep 2025).
Future work must confront both the empirical limits and structural resilience of surveillance capitalism, combining interdisciplinary methods, technological innovation, and ambitious regulatory frameworks to re-align digital infrastructures with autonomy, justice, and the public good.