Research on privacy information disclosure and protection mechanism of digital platform users

Student thesis: PhD Thesis

Abstract

The growth of the digital economy has made personal information (PI) a valuable asset, driving innovation and revenue on digital platforms. However, this shift has raised concerns about user privacy, presenting ethical, regulatory, and practical challenges in balancing privacy protection with economic interests. This doctoral thesis investigates privacy information disclosure and protection mechanisms in the context of China, where regulatory, cultural, and market factors shape unique privacy practices and attitudes. Through a compilation thesis format, this research examines specific aspects of digital privacy, including PI valuation, consumer behavior, regulatory frameworks, and innovative privacy protection strategies. Combining theoretical frameworks with empirical methods, the thesis advances understanding of consumer privacy issues and proposes a multi-layered regulatory framework to guide privacy practices in the digital economy.
The research aims to comprehensively understand privacy information disclosure and protection mechanisms for digital platform users. It focuses on four main objectives: (1) analyzing factors influencing privacy information disclosure, (2) evaluating the effectiveness of various regulatory tools on consumer privacy behaviors, (3) exploring the potential of embedded regulatory tools in fostering privacy-conscious decisions, and (4) proposing a multi-layered regulatory framework combining control-command, economic incentive, and information-led policies. These objectives address critical questions about the dynamics between consumer behavior, platform practices, and regulatory effectiveness, particularly within China’s regulatory and cultural landscape.
The thesis draws on privacy computing theory, behavioral economics, and regulatory policy analysis to interpret consumer privacy behaviors and assess regulatory approaches. Privacy computing theory explores how consumers weigh privacy risks against perceived benefits, while behavioral economics explains cognitive biases—such as optimism bias, the endowment effect, and social norms—that influence privacy choices and contribute to the privacy paradox. Regulatory policy frameworks, including control-command, economic incentives, and information-led policies, are analyzed to evaluate their applicability and limitations in the digital economy. Given China’s high digital integration, evolving public awareness, and regulatory changes, this thesis explores how these perspectives apply to China’s unique context.
This thesis addresses core aspects of consumer privacy within the digital economy, focusing on PI valuation, privacy paradoxes, regulatory frameworks, and market dynamics within China. Findings reveal that although consumers recognize the value of their PI, they often undervalue it due to information asymmetry and limited understanding of data monetization practices. Experimental studies show a notable gap between consumers’ willingness to pay (WTP) for privacy protections and their willingness to accept (WTA) compensation for data disclosure. This discrepancy underscores the complexity of PI valuation, which is influenced by cultural, psychological, and contextual factors unique to different market contexts, particularly in China. These findings highlight the need for PI valuation frameworks that incorporate both economic and subjective dimensions.
Further, this research explores the privacy paradox, demonstrating that cognitive biases—such as optimism bias—and social influences significantly impact users’ privacy decision-making. Many consumers underestimate privacy risks, assuming potential harms are unlikely to affect them. Additionally, social norms within digital communities encourage data sharing as users seek social rewards and personalized services. This gap between privacy awareness and action reveals limitations in relying solely on informed consent or user awareness, suggesting that effective privacy protections must also consider cognitive and social influences.
The thesis emphasizes the potential of embedded regulatory tools—such as privacy prompts, default privacy settings, and real-time notifications—in fostering privacy-conscious behaviors. Traditional regulatory approaches, including control-command, economic incentives, and information-led policies, show limitations in addressing privacy issues in China’s fast-evolving digital economy. By embedding regulatory tools within these frameworks, the study suggests that regulatory models can more effectively influence user behaviors and mitigate market failures in consumer privacy protection. Findings indicate that privacy-by-design principles, which integrate such tools within platform interfaces, promote more informed decision-making by simplifying privacy choices and addressing behavioral biases. This integrated approach aligns with China’s Personal Information Protection Law (PIPL) and offers a pathway for achieving both regulatory compliance and user autonomy.
Lastly, the thesis proposes a comprehensive, multi-layered regulatory framework that combines control-command policies, economic incentives, information-led approaches, and embedded regulatory tools. By establishing control-command policies with mandatory privacy protections, providing economic incentives for privacy-conscious practices, promoting transparency through information-led policies, and embedding privacy tools within platform interfaces, this framework balances regulatory rigor with flexibility. It is especially suited to China’s digital landscape, where consumer expectations, regulatory frameworks, and digital practices are evolving alongside the global economy.
The thesis makes theoretical contributions by extending privacy valuation theory, enhancing understanding of the privacy paradox, and refining regulatory theory with embedded tools. It demonstrates that PI value is influenced by cultural, psychological, and contextual factors, broadening traditional privacy models. The study also clarifies the privacy paradox, showing how cognitive biases and social influences shape privacy behaviors. By proposing embedded regulatory tools, the study refines regulatory theory, suggesting that effective privacy protections can be incorporated into digital platforms to address behavioral and cognitive barriers.
Practically, this research provides actionable recommendations for policymakers and digital platforms. For governments, the findings support adopting multi-layered regulatory frameworks that integrate embedded tools with traditional approaches. Policymakers can enhance privacy outcomes by incorporating economic incentives, promoting transparency, and embedding privacy prompts to address behavioral biases. For businesses, the study emphasizes privacy-by-design principles, recommending that companies integrate user-friendly privacy tools to build trust and meet evolving regulations. This consumer-centric approach fosters customer loyalty and provides a competitive advantage in the digital marketplace.
This thesis offers a comprehensive analysis of privacy information disclosure and protection mechanisms in the digital economy, particularly within China. Through a multi-theoretical framework and empirical studies, the research advances understanding of consumer privacy behaviors, regulatory efficacy, and PI valuation. By proposing a multi-layered regulatory framework that incorporates control-command policies, economic incentives, information-led approaches, and embedded tools, the study presents a balanced solution to digital privacy challenges. The insights from this research provide valuable guidance for policymakers, businesses, and scholars navigating the complex landscape of digital privacy, where PI is both a commodity and a right.
Date of Award: 17 Mar 2025
Original language: English
Awarding Institution
  • University of Nottingham
Supervisors: Jie YU, Alain Chong, Nana Kufuor & Haijun Bao

Keywords

  • Digital Privacy
  • Privacy Computing Theory
  • Information Framing
  • Privacy Regulatory Tool
  • User Psychological Characteristics
