Catalogue Search | MBRL
850,492 result(s) for "PRIVACY"
Contextual Integrity Up and Down the Data Food Chain
by Nissenbaum, Helen, in Privacy, 2019
According to the theory of contextual integrity (CI), privacy norms prescribe information flows with reference to five parameters — sender, recipient, subject, information type, and transmission principle. Because privacy is grasped contextually (e.g., health, education, civic life), the values of these parameters range over contextually meaningful ontologies — of information types (or topics) and actors (subjects, senders, and recipients), in contextually defined capacities. As an alternative to predominant approaches to privacy, which were ineffective against novel information practices enabled by IT, CI was able both to pinpoint sources of disruption and to provide grounds for either accepting or rejecting them. Mounting challenges from a burgeoning array of networked, sensor-enabled devices (IoT) and data-ravenous machine learning systems, similar in form though magnified in scope, call for renewed attention to theory. This Article introduces the metaphor of a data (food) chain to capture the nature of these challenges. With motion up the chain, where data of higher order is inferred from lower-order data, the crucial question is whether privacy norms governing lower-order data are sufficient for the inferred higher-order data. While CI has a response to this question, a greater challenge comes from data primitives, such as digital impulses of mouse clicks, motion detectors, and bare GPS coordinates, because they appear to have no meaning. Absent a semantics, they escape CI's privacy norms entirely.
Journal Article
Privacy and Manipulation in the Digital Age
by Zarsky, Tal Z., in Privacy, 2019
The digital age brings with it novel forms of data flow. As a result, individuals are constantly being monitored while consuming products, services and content. These abilities have given rise to a variety of concerns, which are most often framed using "privacy" and "data protection"-related paradigms. An important, oft-noted yet undertheorized concern is that these dynamics might facilitate the manipulation of subjects; a process in which firms strive to motivate and influence individuals to take specific steps and make particular decisions in a manner considered to be socially unacceptable. That it is important and imperative to battle manipulation carries with it a strong intuitive appeal. Intuition, however, does not always indicate the existence of a sound justification or policy option. For that, a deeper analytic and academic discussion is called for.

This Article begins by emphasizing the importance of addressing the manipulation-based argument, which derives from several crucial problems and flaws in the legal and policy setting currently striving to meet the challenges of the digital age. Next, the Article examines whether the manipulation-based concerns are sustainable, or are merely a visceral response to changing technologies which cannot be provided with substantial analytical backing. Here the Article details the reasons for striving to block manipulative conduct and, on the other hand, reasons why legal intervention should be, in the best case, limited. The Article concludes with some general implications of this discussion for the broader themes and future directions of privacy law, while trying to ascertain whether the rise of the manipulation-based discourse will lead to information privacy's expansion or perhaps its demise.
Journal Article
Privacy: A Very Short Introduction
by Wacks, Raymond
Some would argue that scarcely a day passes without a new assault on our privacy. In the wake of the whistle-blower Edward Snowden's revelations about the extent of surveillance conducted by the security services in the United States, Britain, and elsewhere, concerns about individual privacy have significantly increased. The Internet generates risks, unimagined even twenty years ago, to the security and integrity of information in all its forms. The manner in which information is collected, stored, exchanged, and used has changed forever; and with it, the character of the threats to individual privacy. The scale of accessible private data generated by the phenomenal growth of blogs, social media, and other contrivances of our information age poses disturbing threats to our privacy. And the hunger for gossip continues to fuel sensationalist media that frequently degrade the notion of a private domain to which we reasonably lay claim. In this new edition of the Very Short Introduction, Raymond Wacks looks at all aspects of privacy, taking account of numerous recent changes, and considers how this fundamental value might be reconciled with competing interests such as security and freedom of expression.
Role of Incentives in the Use of Blockchain-Based Platforms for Sharing Sensitive Health Data: Experimental Study
2023
Blockchain is an emerging technology that enables secure and decentralized approaches to reduce technical risks and governance challenges associated with sharing data. Although blockchain-based solutions have been suggested for sharing health information, it is still unclear whether a suitable incentive mechanism (intrinsic or extrinsic) can be identified to encourage individuals to share their sensitive data for research purposes. This study aimed to investigate how important extrinsic incentives are and what type of incentive is the best option in blockchain-based platforms designed for sharing sensitive health information. In this study, we conducted 3 experiments with 493 individuals to investigate the role of extrinsic incentives (i.e., cryptocurrency, money, and recognition) in data sharing with research organizations. The findings highlight that offering different incentives is insufficient to encourage individuals to use blockchain technology or to change their perceptions about the technology's premise for sharing sensitive health data. The results demonstrate that individuals still attribute serious risks to blockchain-based platforms. Privacy and security concerns, trust issues, lack of knowledge about the technology, lack of public acceptance, and lack of regulations are reported as top risks. In terms of attracting people to use blockchain-based platforms for data sharing in health care, we show that the effects of extrinsic motivations (crypto-incentives, money, and status) are significantly overshadowed by inhibitors to technology use. We suggest that before emphasizing the use of various types of extrinsic incentives, users must be educated about the capabilities and benefits offered by this technology. Thus, an essential first step for shifting from an institution-based data exchange to a patient-centric data exchange (using blockchain) is addressing technology inhibitors to promote patient-driven data access control.
This study shows that extrinsic incentives alone are inadequate to change users' perceptions, increase their trust, or encourage them to use technology for sharing health data.
Journal Article
Three Control Views on Privacy
by Menges, Leonhard, in Privacy, 2022
This paper discusses the idea that the concept of privacy should be understood in terms of control. Three different attempts to spell out this idea are critically discussed. The conclusion is that the Source Control View on privacy is the most promising version of the idea that privacy is to be understood in terms of control.
Journal Article
Digital by Default: Children's Capacity to Understand and Manage Online Data and Privacy
2020
How do children understand the privacy implications of the contemporary digital environment? This question is pressing as technologies transform children's lives into data which is recorded, tracked, aggregated, analysed and monetized. This article takes a child-centred, qualitative approach to charting the nature and limits of children's understanding of privacy in digital contexts. We conducted focus group interviews with 169 UK children aged 11-16 to explore their understanding of privacy in three distinct digital contexts--interpersonal, institutional and commercial. We find, first, that children primarily conceptualize privacy in relation to interpersonal contexts, conceiving of personal information as something they have agency and control over as regards deciding when and with whom to share it, even if they do not always exercise such control. This leads them to some misapprehensions about how personal data is collected, inferred and used by organizations, be these public institutions such as their schools or commercial businesses. Children's expectation of agency in interpersonal contexts, and their tendency to trust familiar institutions such as their schools, make for a doubly problematic orientation towards data and privacy online in commercial contexts, leading to a mix of frustration, misapprehension and risk. We argue that, since the complexity of the digital environment challenges teachers' capacity to address children's knowledge gaps, businesses, educators, parents and the state must exercise a shared responsibility to create a legible, transparent and privacy-respecting digital environment in which children can exercise genuine choice and agency.
Journal Article