WorldWideScience

Sample records for decisions privacy human

  1. Fuzzy Privacy Decision for Context-Aware Access Personal Information

    Institute of Scientific and Technical Information of China (English)

    ZHANG Qingsheng; QI Yong; ZHAO Jizhong; HOU Di; NIU Yujie

    2007-01-01

    A context-aware privacy protection framework was designed for context-aware services and for controlling access to personal information in pervasive environments. During a user's privacy decision, the framework produces fuzzy privacy decisions that vary with the sensitivity of the personal information and the trust placed in the information receiver. An uncertain privacy decision model for personal information disclosure was proposed based on changes in receiver trust and information sensitivity, and a fuzzy privacy decision information system was designed according to this model. Personal privacy control policies can be extracted from this information system using rough set theory, which also addresses the problem of learning privacy control policies for personal information disclosure.
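
    The decision described above varies a disclosure outcome with information sensitivity and receiver trust. The minimal Python sketch below illustrates that idea only; the membership functions, thresholds and rules are illustrative assumptions, not the authors' published model, and the rough-set policy extraction step is not reproduced.

        # Illustrative sketch: a fuzzy disclosure decision driven by information
        # sensitivity and receiver trust. All memberships, thresholds and rules
        # are assumptions made for this example.
        def fuzzy_disclosure_decision(sensitivity: float, trust: float) -> str:
            """Map sensitivity and trust (both in [0, 1]) to a disclosure decision."""
            # Fuzzy memberships for "highly sensitive" and "highly trusted"
            high_sensitivity = max(0.0, min(1.0, (sensitivity - 0.3) / 0.4))
            high_trust = max(0.0, min(1.0, (trust - 0.3) / 0.4))

            # Rule: willingness to disclose rises with trust and falls with sensitivity
            willingness = min(high_trust, 1.0 - high_sensitivity)

            if willingness > 0.6:
                return "disclose"
            if willingness > 0.3:
                return "disclose partially (generalize or blur)"
            return "withhold"

        print(fuzzy_disclosure_decision(sensitivity=0.8, trust=0.4))  # withhold
        print(fuzzy_disclosure_decision(sensitivity=0.2, trust=0.9))  # disclose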

  2. Towards context adaptive privacy decisions in ubiquitous computing

    NARCIS (Netherlands)

    Schaub, Florian; Könings, Bastian; Weber, M.; Kargl, Frank

    2012-01-01

    In ubiquitous systems control of privacy settings will be increasingly difficult due to the pervasive nature of sensing and communication capabilities. We identify challenges for privacy decisions in ubiquitous systems and propose a system for in situ privacy decision support. When context changes

  3. Self-disclosure decision making based on intimacy and privacy

    OpenAIRE

    Such, Jose M.; Espinosa, Agustin; Garcia-Fornes, Ana; Sierra, Carles

    2012-01-01

    Autonomous agents may encapsulate their principals' personal data attributes. These attributes may be disclosed to other agents during agent interactions, producing a loss of privacy. Thus, agents need self-disclosure decision-making mechanisms to autonomously decide whether disclosing personal data attributes to other agents is acceptable or not. Current self-disclosure decision-making mechanisms consider the direct benefit and the privacy loss of disclosing an attribute. Howe...

  4. Footprints near the Surf: Individual Privacy Decisions in Online Contexts

    Science.gov (United States)

    McDonald, Aleecia M.

    2010-01-01

    As more people seek the benefits of going online, more people are exposed to privacy risks from their time online. With a largely unregulated Internet, self-determination about privacy risks must be feasible for people from all walks of life. Yet in many cases decisions are either not obvious or not accessible. As one example, privacy policies are…

  5. An overview of human genetic privacy.

    Science.gov (United States)

    Shi, Xinghua; Wu, Xintao

    2017-01-01

    The study of human genomics is becoming a Big Data science, owing to recent biotechnological advances leading to availability of millions of personal genome sequences, which can be combined with biometric measurements from mobile apps and fitness trackers, and of human behavior data monitored from mobile devices and social media. With increasing research opportunities for integrative genomic studies through data sharing, genetic privacy emerges as a legitimate yet challenging concern that needs to be carefully addressed, not only for individuals but also for their families. In this paper, we present potential genetic privacy risks and relevant ethics and regulations for sharing and protecting human genomics data. We also describe the techniques for protecting human genetic privacy from three broad perspectives: controlled access, differential privacy, and cryptographic solutions. © 2016 New York Academy of Sciences.

  6. An overview of human genetic privacy

    Science.gov (United States)

    Shi, Xinghua; Wu, Xintao

    2016-01-01

    The study of human genomics is becoming a Big Data science, owing to recent biotechnological advances leading to availability of millions of personal genome sequences, which can be combined with biometric measurements from mobile apps and fitness trackers, and of human behavior data monitored from mobile devices and social media. With increasing research opportunities for integrative genomic studies through data sharing, genetic privacy emerges as a legitimate yet challenging concern that needs to be carefully addressed, not only for individuals but also for their families. In this paper, we present potential genetic privacy risks and relevant ethics and regulations for sharing and protecting human genomics data. We also describe the techniques for protecting human genetic privacy from three broad perspectives: controlled access, differential privacy, and cryptographic solutions. PMID:27626905

  7. FCJ-195 Privacy, Responsibility, and Human Rights Activism

    Directory of Open Access Journals (Sweden)

    Becky Kazansky

    2015-06-01

    Full Text Available In this article, we argue that many difficulties associated with the protection of digital privacy are rooted in the framing of privacy as a predominantly individual responsibility. We examine how models of privacy protection, such as Notice and Choice, contribute to the ‘responsibilisation’ of human rights activists who rely on the use of technologies for their work. We also consider how a group of human rights activists countered technology-mediated threats that this ‘responsibilisation’ causes by developing a collective approach to address their digital privacy and security needs. We conclude this article by discussing how technological tools used to maintain or counter the loss of privacy can be improved in order to support the privacy and digital security of human rights activists.

  8. Privacy-preserving clinical decision support system using Gaussian kernel-based classification.

    Science.gov (United States)

    Rahulamathavan, Yogachandran; Veluru, Suresh; Phan, Raphael C-W; Chambers, Jonathon A; Rajarajan, Muttukrishnan

    2014-01-01

    A clinical decision support system forms a critical capability to link health observations with health knowledge to influence choices by clinicians for improved healthcare. Recent trends toward remote outsourcing can be exploited to provide efficient and accurate clinical decision support in healthcare. In this scenario, clinicians can use the health knowledge located in remote servers via the Internet to diagnose their patients. However, the fact that these servers are third party and therefore potentially not fully trusted raises possible privacy concerns. In this paper, we propose a novel privacy-preserving protocol for a clinical decision support system where the patients' data always remain in an encrypted form during the diagnosis process. Hence, the server involved in the diagnosis process is not able to learn any extra knowledge about the patient's data and results. Our experimental results on popular medical datasets from UCI-database demonstrate that the accuracy of the proposed protocol is up to 97.21% and the privacy of patient data is not compromised.
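
    The protocol above keeps patient data encrypted throughout diagnosis; the sketch below only illustrates the underlying Gaussian (RBF) kernel classification in the clear, using scikit-learn on a UCI dataset. The dataset choice and parameters are assumptions made for illustration; the encrypted-domain computation from the paper is not reproduced.

        # Plaintext sketch of Gaussian (RBF) kernel classification on a UCI dataset.
        # The privacy-preserving, encrypted-domain protocol from the paper is NOT
        # reproduced here; this only shows the kind of classifier it builds on.
        from sklearn.datasets import load_breast_cancer
        from sklearn.model_selection import train_test_split
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        X, y = load_breast_cancer(return_X_y=True)  # UCI Wisconsin breast cancer data
        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma="scale"))
        clf.fit(X_train, y_train)
        print("diagnostic accuracy:", clf.score(X_test, y_test))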

  9. A model-driven privacy compliance decision support for medical data sharing in Europe.

    Science.gov (United States)

    Boussi Rahmouni, H; Solomonides, T; Casassa Mont, M; Shiu, S; Rahmouni, M

    2011-01-01

    Clinical practitioners and medical researchers often have to share health data with other colleagues across Europe. Privacy compliance in this context is very important but challenging. Automated privacy guidelines are a practical way of increasing users' awareness of privacy obligations and help eliminate unintentional breaches of privacy. In this paper we present an ontology-plus-rules based approach to privacy decision support for the sharing of patient data across European platforms. We use ontologies to model the required domain and context information about data sharing and privacy requirements. In addition, we use a set of Semantic Web Rule Language rules to reason about legal privacy requirements that are applicable to a specific context of data disclosure. We make the complete set invocable through a semantic web application that acts as an interactive privacy guideline system and can invoke the full model in order to provide decision support. When asked, the system will generate privacy reports applicable to a specific case of data disclosure described by the user; reports showing guidelines per Member State may also be obtained. The advantage of this approach lies in the expressiveness and extensibility of the modelling and inference languages adopted and the ability they confer to reason with complex requirements interpreted from high-level regulations. However, the system cannot at this stage fully simulate the role of an ethics committee or review board.

  10. Human Rights, Privacy and Medical Research; Analysing UK Policy on Tissue and Data

    OpenAIRE

    Gillott, John

    2006-01-01

    This report is one outcome of a study into privacy and human genetics initiated by John Gillott and staff and trustees of the Genetic Interest Group. The initial focus was on genetics and human rights, with an emphasis on legal aspects and policy decisions informed by law and rights ideology. Article 8 of the Human Rights Act 1998, the right to respect for private and family life, is of most relevance to this study, though other Articles are considered. The study as a whole co...

  11. Personality and Social Framing in Privacy Decision-Making: A Study on Cookie Acceptance.

    Science.gov (United States)

    Coventry, Lynne M; Jeske, Debora; Blythe, John M; Turland, James; Briggs, Pam

    2016-01-01

    Despite their best intentions, people struggle with the realities of privacy protection and will often sacrifice privacy for convenience in their online activities. Individuals show systematic, personality dependent differences in their privacy decision making, which makes it interesting for those who seek to design 'nudges' designed to manipulate privacy behaviors. We explore such effects in a cookie decision task. Two hundred and ninety participants were given an incidental website review task that masked the true aim of the study. At the task outset, they were asked whether they wanted to accept a cookie in a message that either contained a social framing 'nudge' (they were told that either a majority or a minority of users like themselves had accepted the cookie) or contained no information about social norms (control). At the end of the task, participants were asked to complete a range of personality assessments (impulsivity, risk-taking, willingness to self-disclose and sociability). We found social framing to be an effective behavioral nudge, reducing cookie acceptance in the minority social norm condition. Further, we found personality effects in that those scoring highly on risk-taking and impulsivity were significantly more likely to accept the cookie. Finally, we found that the application of a social nudge could attenuate the personality effects of impulsivity and risk-taking. We explore the implications for those working in the privacy-by-design space.

  12. Variability in adolescent portal privacy features: how the unique privacy needs of the adolescent patient create a complex decision-making process.

    Science.gov (United States)

    Sharko, Marianne; Wilcox, Lauren; Hong, Matthew K; Ancker, Jessica S

    2018-05-17

    Medical privacy policies, which are clear-cut for adults and young children, become ambiguous during adolescence. Yet medical organizations must establish unambiguous rules about patient and parental access to electronic patient portals. We conducted a national interview study to characterize the diversity in adolescent portal policies across a range of institutions and determine the factors influencing decisions about these policies. Within a sampling framework that ensured diversity of geography and medical organization type, we used purposive and snowball sampling to identify key informants. Semi-structured interviews were conducted and analyzed with inductive thematic analysis, followed by a member check. We interviewed informants from 25 medical organizations. Policies established different degrees of adolescent access (from none to partial to complete), access ages (from 10 to 18 years), degrees of parental access, and types of information considered sensitive. Federal and state law did not dominate policy decisions. Other factors in the decision process were: technology capabilities; differing patient population needs; resources; community expectations; balance between information access and privacy; balance between promoting autonomy and promoting family shared decision-making; and tension between teen privacy and parental preferences. Some informants believed that clearer standards would simplify policy-making; others worried that standards could restrict high-quality polices. In the absence of universally accepted standards, medical organizations typically undergo an arduous decision-making process to develop teen portal policies, weighing legal, economic, social, clinical, and technological factors. As a result, portal access policies are highly inconsistent across the United States and within individual states.

  13. Personality and Social Framing in Privacy Decision-Making: A Study on Cookie Acceptance

    Directory of Open Access Journals (Sweden)

    Lynne Margaret Coventry

    2016-09-01

    Full Text Available Despite their best intentions, people struggle with the realities of privacy protection and will often sacrifice privacy for convenience in their online activities. Individuals show systematic, personality dependent differences in their privacy decision making, which makes it interesting for those who seek to design ‘nudges’ designed to manipulate privacy behaviors. We explore such effects in a cookie decision task. Two hundred and ninety participants were given an incidental website review task that masked the true aim of the study. At the task outset, they were asked whether they wanted to accept a cookie in a message that either contained a social framing ’nudge’ (they were told that either a majority or a minority of users like themselves had accepted the cookie) or contained no information about social norms (control). At the end of the task, participants were asked to complete a range of personality assessments (impulsivity, risk-taking, willingness to self-disclose and sociability). We found social framing to be an effective behavioral nudge, reducing cookie acceptance in the minority social norm condition. Further, we found personality effects such that those scoring highly on risk-taking and impulsivity were significantly more likely to accept the cookie. Finally, we found that the application of a social nudge could attenuate the personality effects of impulsivity and risk-taking. We explore the implications for those working in the privacy-by-design space.

  14. Personality and Social Framing in Privacy Decision-Making: A Study on Cookie Acceptance

    Science.gov (United States)

    Coventry, Lynne M.; Jeske, Debora; Blythe, John M.; Turland, James; Briggs, Pam

    2016-01-01

    Despite their best intentions, people struggle with the realities of privacy protection and will often sacrifice privacy for convenience in their online activities. Individuals show systematic, personality dependent differences in their privacy decision making, which makes it interesting for those who seek to design ‘nudges’ designed to manipulate privacy behaviors. We explore such effects in a cookie decision task. Two hundred and ninety participants were given an incidental website review task that masked the true aim of the study. At the task outset, they were asked whether they wanted to accept a cookie in a message that either contained a social framing ‘nudge’ (they were told that either a majority or a minority of users like themselves had accepted the cookie) or contained no information about social norms (control). At the end of the task, participants were asked to complete a range of personality assessments (impulsivity, risk-taking, willingness to self-disclose and sociability). We found social framing to be an effective behavioral nudge, reducing cookie acceptance in the minority social norm condition. Further, we found personality effects in that those scoring highly on risk-taking and impulsivity were significantly more likely to accept the cookie. Finally, we found that the application of a social nudge could attenuate the personality effects of impulsivity and risk-taking. We explore the implications for those working in the privacy-by-design space. PMID:27656157

  15. The Privacy Coach: Supporting customer privacy in the Internet of Things

    OpenAIRE

    Broenink, Gerben; Hoepman, Jaap-Henk; Hof, Christian van 't; van Kranenburg, Rob; Smits, David; Wisman, Tijmen

    2010-01-01

    The Privacy Coach is an application running on a mobile phone that supports customers in making privacy decisions when confronted with RFID tags. The approach we take to increase customer privacy is a radical departure from the mainstream research efforts that focus on implementing privacy enhancing technologies on the RFID tags themselves. Instead the Privacy Coach functions as a mediator between customer privacy preferences and corporate privacy policies, trying to find a match between the ...

  16. The privacy coach: Supporting customer privacy in the internet of things

    NARCIS (Netherlands)

    Broenink, E.G.; Hoepman, J.H.; Hof, C. van 't; Kranenburg, R. van; Smits, D.; Wisman, T.

    2010-01-01

    The Privacy Coach is an application running on a mobile phone that supports customers in making privacy decisions when confronted with RFID tags. The approach we take to increase customer privacy is a radical departure from the mainstream research efforts that focus on implementing privacy enhancing

  17. Human Flesh Search Engine and Online Privacy.

    Science.gov (United States)

    Zhang, Yang; Gao, Hong

    2016-04-01

    Human flesh search engine can be a double-edged sword, bringing convenience on the one hand and leading to infringement of personal privacy on the other hand. This paper discusses the ethical problems brought about by the human flesh search engine, as well as possible solutions.

  18. Protecting Privacy and Confidentiality in Environmental Health Research.

    Science.gov (United States)

    Resnik, David B

    2010-01-01

    Environmental health researchers often need to make difficult decisions on how to protect privacy and confidentiality when they conduct research in the home or workplace. These dilemmas are different from those normally encountered in clinical research. Although protecting privacy and confidentiality is one of the most important principles of research involving human subjects, it can be overridden to prevent imminent harm to individuals or if required by law. Investigators should carefully consider the facts and circumstances and use good judgment when deciding whether to breach privacy or confidentiality.

  19. 78 FR 43258 - Privacy Act; System of Records: Human Resources Records, State-31

    Science.gov (United States)

    2013-07-19

    ... DEPARTMENT OF STATE [Public Notice 8384] Privacy Act; System of Records: Human Resources Records... system of records, Human Resources Records, State-31, pursuant to the provisions of the Privacy Act of... State proposes that the current system will retain the name "Human Resources Records" (previously...

  20. An overview of human genetic privacy

    OpenAIRE

    Shi, Xinghua; Wu, Xintao

    2016-01-01

    The study of human genomics is becoming a Big Data science, owing to recent biotechnological advances leading to availability of millions of personal genome sequences, which can be combined with biometric measurements from mobile apps and fitness trackers, and of human behavior data monitored from mobile devices and social media. With increasing research opportunities for integrative genomic studies through data sharing, genetic privacy emerges as a legitimate yet challenging concern that nee...

  1. Privacy preserving mechanisms for optimizing cross-organizational collaborative decisions based on the Karmarkar algorithm

    NARCIS (Netherlands)

    Zhu, H.; Liu, H.W.; Ou, Carol; Davison, R.M.; Yang, Z.R.

    2017-01-01

    Cross-organizational collaborative decision-making involves a great deal of private information which companies are often reluctant to disclose, even when they need to analyze data collaboratively. The lack of effective privacy-preserving mechanisms for optimizing cross-organizational collaborative

  2. Privacy as human flourishing: could a shift towards virtue ethics strengthen privacy protection in the age of Big Data?

    NARCIS (Netherlands)

    van der Sloot, B.

    2014-01-01

    Privacy is commonly seen as an instrumental value in relation to negative freedom, human dignity and personal autonomy. Article 8 ECHR, protecting the right to privacy, was originally coined as a doctrine protecting the negative freedom of citizens in vertical relations, that is between citizen and

  3. A Taxonomy of Privacy Constructs for Privacy-Sensitive Robotics

    OpenAIRE

    Rueben, Matthew; Grimm, Cindy M.; Bernieri, Frank J.; Smart, William D.

    2017-01-01

    The introduction of robots into our society will also introduce new concerns about personal privacy. In order to study these concerns, we must do human-subject experiments that involve measuring privacy-relevant constructs. This paper presents a taxonomy of privacy constructs based on a review of the privacy literature. Future work in operationalizing privacy constructs for HRI studies is also discussed.

  4. Advertising and Invasion of Privacy.

    Science.gov (United States)

    Rohrer, Daniel Morgan

    The right of privacy as it relates to advertising and the use of a person's name or likeness is discussed in this paper. After an introduction that traces some of the history of invasion of privacy in court decisions, the paper examines cases involving issues such as public figures and newsworthy items, right of privacy waived, right of privacy…

  5. Protection of the right to privacy in the practice of the European Court of Human Rights

    Directory of Open Access Journals (Sweden)

    Mladenov Marijana

    2013-01-01

    Full Text Available The right to privacy is a fundamental human right and an essential component of the protection of human autonomy and freedom. The development of science and information systems creates various opportunities for interference with the physical and moral integrity of a person. Therefore, it is necessary to determine the precise content of the right to privacy. The European Convention on Human Rights and Fundamental Freedoms guarantees this right under Article 8. The European Court of Human Rights did not precisely define the content of the right to privacy, and applicants could thereby bring different aspects of life into the scope of respect for private life. According to the Court, the concept of privacy and private life includes the following areas of human life: the right to establish and maintain relationships with other human beings, protection of the physical and moral integrity of persons, protection of personal data, change of personal name, and various issues related to sexual orientation and transgender identity. This paper addresses these spheres of human life in the light of the interpretation of Article 8 of the Convention.

  6. Owning genetic information and gene enhancement techniques: why privacy and property rights may undermine social control of the human genome.

    Science.gov (United States)

    Moore, A D

    2000-04-01

    In this article I argue that the proper subjects of intangible property claims include medical records, genetic profiles, and gene enhancement techniques. Coupled with a right to privacy these intangible property rights allow individuals a zone of control that will, in most cases, justifiably exclude governmental or societal invasions into private domains. I argue that the threshold for overriding privacy rights and intangible property rights is higher, in relation to genetic enhancement techniques and sensitive personal information, than is commonly suggested. Once the bar is raised, so-to-speak, the burden of overriding it is formidable. Thus many policy decisions that have been recently proposed or enacted--citywide audio and video surveillance, law enforcement DNA sweeps, genetic profiling, national bans on genetic testing and enhancement of humans, to name a few--will have to be backed by very strong arguments.

  7. 32 CFR 701.118 - Privacy, IT, and PIAs.

    Science.gov (United States)

    2010-07-01

    ...) Development. Privacy must be considered when requirements are being analyzed and decisions are being made...-347) directs agencies to conduct reviews of how privacy issues are considered when purchasing or... a PIA to effectively address privacy factors. Guidance is provided at http://www.doncio.navy.mil. (f...

  8. Unpicking the privacy paradox: can structuration theory help to explain location-based privacy decisions?

    OpenAIRE

    Zafeiropoulou, Aristea M.; Millard, David E.; Webber, Craig; O'Hara, Kieron

    2013-01-01

    Social Media and Web 2.0 tools have dramatically increased the amount of previously private data that users share on the Web; now with the advent of GPS-enabled smartphones users are also actively sharing their location data through a variety of applications and services. Existing research has explored people’s privacy attitudes, and shown that the way people trade their personal data for services of value can be inconsistent with their stated privacy preferences (a phenomenon known as the pr...

  9. Valuating Privacy with Option Pricing Theory

    Science.gov (United States)

    Berthold, Stefan; Böhme, Rainer

    One of the key challenges in the information society is responsible handling of personal data. An often-cited reason why people fail to make rational decisions regarding their own informational privacy is the high uncertainty about future consequences of information disclosures today. This chapter builds an analogy to financial options and draws on principles of option pricing to account for this uncertainty in the valuation of privacy. For this purpose, the development of a data subject's personal attributes over time and the development of the attribute distribution in the population are modeled as two stochastic processes, which fit into the Binomial Option Pricing Model (BOPM). Possible applications of such valuation methods to guide decision support in future privacy-enhancing technologies (PETs) are sketched.
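
    The valuation machinery referenced above is the standard binomial option pricing model; a generic Cox-Ross-Rubinstein implementation is sketched below. How personal attributes and their population distribution map onto the model is the chapter's contribution and is not reproduced here; the parameters are purely illustrative.

        # Generic Cox-Ross-Rubinstein binomial pricing of a European call, shown
        # only to illustrate the valuation machinery the chapter builds on.
        # Parameters are illustrative; the mapping to privacy attributes is not reproduced.
        from math import exp, sqrt

        def binomial_option_value(s0, strike, rate, sigma, steps, maturity):
            dt = maturity / steps
            u = exp(sigma * sqrt(dt))           # up factor per step
            d = 1.0 / u                         # down factor per step
            p = (exp(rate * dt) - d) / (u - d)  # risk-neutral up probability

            # Terminal payoffs, then backward induction to the present value
            values = [max(s0 * u**j * d**(steps - j) - strike, 0.0) for j in range(steps + 1)]
            for step in range(steps, 0, -1):
                values = [exp(-rate * dt) * (p * values[j + 1] + (1 - p) * values[j])
                          for j in range(step)]
            return values[0]

        print(binomial_option_value(s0=100, strike=100, rate=0.05, sigma=0.2, steps=50, maturity=1.0))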

  10. I am not that interesting. Social media, privacy literacy, and the interplay between knowledge and experience

    OpenAIRE

    Rundhovde, Heidi Molvik

    2013-01-01

    Sharing of personal information on the Internet has become increasingly popular. In social media interactions users face a trade-off between the pleasure and usefulness of sharing and the need to protect their privacy. This study employs recent theory in the research area Human-Computer interaction to investigate users' privacy decisions on the social networking service Facebook from a holistic view, including aspects like emotions, dialectics, and social and temporal context. The purpose is ...

  11. Big Data and Consumer Participation in Privacy Contracts: Deciding who Decides on Privacy

    Directory of Open Access Journals (Sweden)

    Michiel Rhoen

    2015-02-01

    Full Text Available Big data puts data protection to the test. Consumers granting permission to process their personal data are increasingly opening up their personal lives, thanks to the “datafication” of everyday life, indefinite data retention and the increasing sophistication of algorithms for analysis. The privacy implications of big data call for serious consideration of consumers’ opportunities to participate in decision-making processes about their contracts. If these opportunities are insufficient, the resulting rules may represent special interests rather than consumers’ needs. This may undermine the legitimacy of big data applications. This article argues that providing sufficient consumer participation in privacy matters requires choosing the best available decision making mechanism. Is a consumer to negotiate his own privacy terms in the market, will lawmakers step in on his behalf, or is he to seek protection through courts? Furthermore, is this a matter of national law or European law? These choices will affect the opportunities for achieving different policy goals associated with the possible benefits of the “big data revolution”.

  12. BORDERS OF COMMUNICATION PRIVACY IN SLOVENIAN CRIMINAL PROCEDURE – CONSTITUTIONAL CHALLENGES

    Directory of Open Access Journals (Sweden)

    Sabina Zgaga

    2015-01-01

    Full Text Available Due to fast technological development and our constant communication, protection of communication privacy in every aspect of our (legal) life has become more important than ever before. Regarding protection of privacy in criminal procedure, special emphasis should be given to the regulation of privacy in the Slovenian Constitution and its interpretation in the case law of the Constitutional Court. This paper presents the definition of privacy and communication privacy in Slovenian constitutional law and exposes the main issues of communication privacy that have been discussed in the case law of the Constitutional Court in the last twenty years. Thereby the paper tries to show the general trend in the case law of the Constitutional Court regarding the protection of communication privacy and to expose certain unsolved issues and unanswered challenges. Slovenian constitutional regulation of communication privacy is very protective, considering the broad definition of privacy and the strict conditions for encroachment of communication privacy. The case law of the Slovenian Constitutional Court has also shown such a trend, with the possible exception of the recent decision on a dynamic IP address. The importance of this decision is nevertheless significant, since it could be applicable to all forms of communication via the internet, the prevailing form of communication nowadays. Certain challenges still lie ahead, such as the current proposal for the amendment of the Criminal Procedure Act-M, which includes the use of IMSI catchers, and numerous unanswered issues regarding data retention after the decisive annulment of its partial legal basis by the Constitutional Court.

  13. Privacy and human behavior in the age of information.

    Science.gov (United States)

    Acquisti, Alessandro; Brandimarte, Laura; Loewenstein, George

    2015-01-30

    This Review summarizes and draws connections between diverse streams of empirical research on privacy behavior. We use three themes to connect insights from social and behavioral sciences: people's uncertainty about the consequences of privacy-related behaviors and their own preferences over those consequences; the context-dependence of people's concern, or lack thereof, about privacy; and the degree to which privacy concerns are malleable—manipulable by commercial and governmental interests. Organizing our discussion by these themes, we offer observations concerning the role of public policy in the protection of privacy in the information age. Copyright © 2015, American Association for the Advancement of Science.

  14. Quantifying the costs and benefits of privacy-preserving health data publishing.

    Science.gov (United States)

    Khokhar, Rashid Hussain; Chen, Rui; Fung, Benjamin C M; Lui, Siu Man

    2014-08-01

    Cost-benefit analysis is a prerequisite for making good business decisions. In the business environment, companies intend to make profit from maximizing information utility of published data while having an obligation to protect individual privacy. In this paper, we quantify the trade-off between privacy and data utility in health data publishing in terms of monetary value. We propose an analytical cost model that can help health information custodians (HICs) make better decisions about sharing person-specific health data with other parties. We examine relevant cost factors associated with the value of anonymized data and the possible damage cost due to potential privacy breaches. Our model guides an HIC to find the optimal value of publishing health data and could be utilized for both perturbative and non-perturbative anonymization techniques. We show that our approach can identify the optimal value for different privacy models, including K-anonymity, LKC-privacy, and ∊-differential privacy, under various anonymization algorithms and privacy parameters through extensive experiments on real-life data. Copyright © 2014 Elsevier Inc. All rights reserved.
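
    As a toy illustration of the monetary trade-off described above, the sketch below sweeps an anonymization level k and picks the value that maximizes data value minus expected breach cost. The value and risk curves are simple assumptions invented for this example, not the cost factors or models proposed in the paper.

        # Toy cost-benefit sweep over an anonymization level k (e.g., k-anonymity).
        # The value and breach-cost curves are assumptions for illustration only.
        def data_value(k, base_value=100_000.0, utility_decay=0.05):
            # Utility (and hence value) of the published data shrinks as k grows
            return base_value * (1 - utility_decay) ** k

        def expected_breach_cost(k, damage=500_000.0, base_reidentification_risk=0.2):
            # Re-identification risk assumed to fall roughly like 1/k under k-anonymity
            return damage * min(1.0, base_reidentification_risk / max(k, 1))

        best_k = max(range(2, 51), key=lambda k: data_value(k) - expected_breach_cost(k))
        print("optimal k:", best_k,
              "net benefit:", round(data_value(best_k) - expected_breach_cost(best_k), 2))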

  15. Human Errors in Decision Making

    OpenAIRE

    Mohamad, Shahriari; Aliandrina, Dessy; Feng, Yan

    2005-01-01

    The aim of this paper was to identify human errors in the decision-making process. The study focused on the research question: what human errors can potentially cause decision failure when evaluating alternatives in the decision-making process? Two case studies were selected from the literature and analyzed to find the human errors that contribute to decision failure. The analysis of human errors was then linked with mental models in the alternative-evaluation step. The results o...

  16. Rethinking the Privacy Calculus: On the Role of Dispositional Factors and Affect

    OpenAIRE

    Kehr, Flavius; Wentzel, Daniel; Mayer, Peter

    2013-01-01

    Existing research on information privacy has mostly relied on the privacy calculus model which views privacy-related decision making as a rational process where individuals weigh the anticipated risks of disclosing personal data against the potential benefits. However, scholars have recently challenged two basic propositions of the privacy calculus model. First, some authors have distinguished between general and situational factors in the context of privacy calculus and have argued that ...

  17. Human factors influencing decision making

    OpenAIRE

    Jacobs, Patricia A.

    1998-01-01

    This report supplies references and comments on literature that identifies human factors influencing decision making, particularly military decision making. The literature has been classified as follows (the classes are not mutually exclusive): features of human information processing; decision making models which are not mathematical models but rather are descriptive; non-personality factors influencing decision making; national characteristics influencing decision makin...

  18. A Heuristic Model for Supporting Users’ Decision-Making in Privacy Disclosure for Recommendation

    Directory of Open Access Journals (Sweden)

    Hongchen Wu

    2018-01-01

    Full Text Available Privacy issues have become a major concern in the web of resource sharing, and users often have difficulty managing their information disclosure in the context of high-quality experiences from social media and the Internet of Things. Recent studies have shown that users’ disclosure decisions may be influenced by heuristics from the crowds, leading to inconsistency in the disclosure volumes and reduction of the prediction accuracy. Therefore, an analysis of why this influence occurs and how to optimize the user experience is highly important. We propose a novel heuristic model that defines the data structures of items and participants in social media, utilizes a modified decision-tree classifier that can predict participants’ disclosures, and puts forward a correlation analysis for detecting disclosure inconsistencies. The heuristic model is applied to a real-time dataset to evaluate the behavioral effects. The decision-tree classifier and correlation analysis indeed show that some participants’ behaviors in information disclosures became decreasingly correlated during item requesting. Participants can be “persuaded” to change their disclosure behaviors, and the users’ answers to the mildly sensitive items tend to be more variable and less predictable. Using this approach, recommender systems in social media can thus know the users better and provide service with higher prediction accuracy.
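
    A minimal stand-in for the prediction-plus-correlation idea above is sketched below with scikit-learn: a shallow decision tree predicts whether a participant disclosed an item, and a correlation coefficient checks how disclosure tracks item sensitivity. The features and data are synthetic, invented purely for illustration, and are not the paper's model.

        # Minimal stand-in: decision tree predicting disclosure, plus a correlation check.
        # Feature names and data are synthetic assumptions for this example.
        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.tree import DecisionTreeClassifier

        rng = np.random.default_rng(0)
        n = 500
        item_sensitivity = rng.uniform(0, 1, n)      # how sensitive the requested item is
        peer_disclosure_rate = rng.uniform(0, 1, n)  # heuristic signal from the crowd
        disclosed = ((peer_disclosure_rate - item_sensitivity + rng.normal(0, 0.2, n)) > 0).astype(int)

        X = np.column_stack([item_sensitivity, peer_disclosure_rate])
        X_train, X_test, y_train, y_test = train_test_split(X, disclosed, random_state=0)

        tree = DecisionTreeClassifier(max_depth=3).fit(X_train, y_train)
        print("prediction accuracy:", tree.score(X_test, y_test))
        print("correlation(sensitivity, disclosure):", np.corrcoef(item_sensitivity, disclosed)[0, 1])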

  19. Effective online privacy mechanisms with persuasive communication

    OpenAIRE

    Coopamootoo, P L

    2016-01-01

    This thesis contributes to research by taking a social psychological perspective to managing privacy online. The thesis proposes to support the effort to form a mental model that is required to evaluate a context with regards to privacy attitudes or to ease the effort by biasing activation of privacy attitudes. Privacy being a behavioural concept, the human-computer interaction design plays a major role in supporting and contributing to end users’ ability to manage their privacy online. Howev...

  20. Human decision error (HUMDEE) trees

    International Nuclear Information System (INIS)

    Ostrom, L.T.

    1993-01-01

    Graphical presentations of human actions in incident and accident sequences have been used for many years. However, for the most part, human decision making has been underrepresented in these trees. This paper presents a method of incorporating the human decision process into graphical presentations of incident/accident sequences. This presentation is in the form of logic trees. These trees are called Human Decision Error Trees or HUMDEE for short. The primary benefit of HUMDEE trees is that they graphically illustrate what else the individuals involved in the event could have done to prevent either the initiation or continuation of the event. HUMDEE trees also present the alternate paths available at the operator decision points in the incident/accident sequence. This is different from the Technique for Human Error Rate Prediction (THERP) event trees. There are many uses of these trees. They can be used for incident/accident investigations to show what other courses of action were available and for training operators. The trees also have a consequence component, so that not only the decision but also the consequence of that decision can be explored.

  1. What was privacy?

    Science.gov (United States)

    McCreary, Lew

    2008-10-01

    Why is that question in the past tense? Because individuals can no longer feel confident that the details of their lives--from identifying numbers to cultural preferences--will be treated with discretion rather than exploited. Even as Facebook users happily share the names of their favorite books, movies, songs, and brands, they often regard marketers' use of that information as an invasion of privacy. In this wide-ranging essay, McCreary, a senior editor at HBR, examines numerous facets of the privacy issue, from Google searches, public shaming on the internet, and cell phone etiquette to passenger screening devices, public surveillance cameras, and corporate chief privacy officers. He notes that IBM has been a leader on privacy; its policy forswearing the use of employees' genetic information in hiring and benefits decisions predated the federal Genetic Information Nondiscrimination Act by three years. Now IBM is involved in an open-source project known as Higgins to provide users with transportable, potentially anonymous online presences. Craigslist, whose CEO calls it "as close to 100% user driven as you can get," has taken an extremely conservative position on privacy--perhaps easier for a company with a declared lack of interest in maximizing revenue. But TJX and other corporate victims of security breaches have discovered that retaining consumers' transaction information can be both costly and risky. Companies that underestimate the importance of privacy to their customers or fail to protect it may eventually face harsh regulation, reputational damage, or both. The best thing they can do, says the author, is negotiate directly with those customers over where to draw the line.

  2. A vision-based system for intelligent monitoring: human behaviour analysis and privacy by context.

    Science.gov (United States)

    Chaaraoui, Alexandros Andre; Padilla-López, José Ramón; Ferrández-Pastor, Francisco Javier; Nieto-Hidalgo, Mario; Flórez-Revuelta, Francisco

    2014-05-20

    Due to progress and demographic change, society is facing a crucial challenge related to increased life expectancy and a higher number of people in situations of dependency. As a consequence, there exists a significant demand for support systems for personal autonomy. This article outlines the vision@home project, whose goal is to extend independent living at home for elderly and impaired people, providing care and safety services by means of vision-based monitoring. Different kinds of ambient-assisted living services are supported, from the detection of home accidents, to telecare services. In this contribution, the specification of the system is presented, and novel contributions are made regarding human behaviour analysis and privacy protection. By means of a multi-view setup of cameras, people's behaviour is recognised based on human action recognition. For this purpose, a weighted feature fusion scheme is proposed to learn from multiple views. In order to protect the right to privacy of the inhabitants when a remote connection occurs, a privacy-by-context method is proposed. The experimental results of the behaviour recognition method show an outstanding performance, as well as support for multi-view scenarios and real-time execution, which are required in order to provide the proposed services.
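
    The multi-view recognition step above relies on weighted feature fusion across cameras. The sketch below shows a toy weighted fusion of per-view class scores; the weighting scheme and features in the paper are more elaborate, and the weights and scores here are illustrative assumptions.

        # Toy weighted fusion of per-view classification scores. The paper's actual
        # feature fusion scheme is more elaborate; all numbers here are assumptions.
        import numpy as np

        def fuse_views(per_view_scores, view_weights):
            """per_view_scores: (n_views, n_classes) class scores from each camera view."""
            weights = np.asarray(view_weights, dtype=float)
            weights = weights / weights.sum()              # normalise view weights
            fused = weights @ np.asarray(per_view_scores)  # weighted sum over views
            return fused.argmax(), fused

        scores = [[0.7, 0.2, 0.1],   # view 1: confident it is action 0
                  [0.3, 0.4, 0.3],   # view 2: ambiguous (e.g., partly occluded)
                  [0.6, 0.3, 0.1]]   # view 3
        label, fused = fuse_views(scores, view_weights=[1.0, 0.5, 1.0])
        print("recognised action:", label, "fused scores:", np.round(fused, 3))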

  3. A Vision-Based System for Intelligent Monitoring: Human Behaviour Analysis and Privacy by Context

    Directory of Open Access Journals (Sweden)

    Alexandros Andre Chaaraoui

    2014-05-01

    Full Text Available Due to progress and demographic change, society is facing a crucial challenge related to increased life expectancy and a higher number of people in situations of dependency. As a consequence, there exists a significant demand for support systems for personal autonomy. This article outlines the vision@home project, whose goal is to extend independent living at home for elderly and impaired people, providing care and safety services by means of vision-based monitoring. Different kinds of ambient-assisted living services are supported, from the detection of home accidents, to telecare services. In this contribution, the specification of the system is presented, and novel contributions are made regarding human behaviour analysis and privacy protection. By means of a multi-view setup of cameras, people’s behaviour is recognised based on human action recognition. For this purpose, a weighted feature fusion scheme is proposed to learn from multiple views. In order to protect the right to privacy of the inhabitants when a remote connection occurs, a privacy-by-context method is proposed. The experimental results of the behaviour recognition method show an outstanding performance, as well as support for multi-view scenarios and real-time execution, which are required in order to provide the proposed services.

  4. Interpretation and Analysis of Privacy Policies of Websites in India

    DEFF Research Database (Denmark)

    Dhotre, Prashant Shantaram; Olesen, Henning; Khajuria, Samant

    2016-01-01

    the conditions specified in the policy document. So, ideally, privacy policies should be readable and provide sufficient information to empower users to make knowledgeable decisions. Thus, we examined more than 50 privacy policies and discuss the content analysis in this paper. We discovered that the policies are not only unstructured but also described in complicated language. Our analysis shows that user data security measures are nonspecific and unsatisfactory in 57% of the privacy policies. In spite of the huge amount of information collected, the privacy policies do not clearly describe the information collection methods, the purpose, the names of the entities with which data are shared, or data transit. In this study, only 11% of the privacy policies comply with privacy standards, which indicates that the other privacy policies are less committed to supporting transparency, choice, and accountability in the process of information collection.

  5. Humans Optimize Decision-Making by Delaying Decision Onset

    Science.gov (United States)

    Teichert, Tobias; Ferrera, Vincent P.; Grinband, Jack

    2014-01-01

    Why do humans make errors on seemingly trivial perceptual decisions? It has been shown that such errors occur in part because the decision process (evidence accumulation) is initiated before selective attention has isolated the relevant sensory information from salient distractors. Nevertheless, it is typically assumed that subjects increase accuracy by prolonging the decision process rather than delaying decision onset. To date it has not been tested whether humans can strategically delay decision onset to increase response accuracy. To address this question we measured the time course of selective attention in a motion interference task using a novel variant of the response signal paradigm. Based on these measurements we estimated time-dependent drift rate and showed that subjects should in principle be able to trade speed for accuracy very effectively by delaying decision onset. Using the time-dependent estimate of drift rate we show that subjects indeed delay decision onset in addition to raising response threshold when asked to stress accuracy over speed in a free reaction version of the same motion-interference task. These findings show that decision onset is a critical aspect of the decision process that can be adjusted to effectively improve decision accuracy. PMID:24599295

  6. From Data Privacy to Location Privacy

    Science.gov (United States)

    Wang, Ting; Liu, Ling

    Over the past decade, the research on data privacy has achieved considerable advancement in the following two aspects: First, a variety of privacy threat models and privacy principles have been proposed, aiming at providing sufficient protection against different types of inference attacks; Second, a plethora of algorithms and methods have been developed to implement the proposed privacy principles, while attempting to optimize the utility of the resulting data. The first part of the chapter presents an overview of data privacy research by taking a close examination at the achievements from the above two aspects, with the objective of pinpointing individual research efforts on the grand map of data privacy protection. As a special form of data privacy, location privacy possesses its unique characteristics. In the second part of the chapter, we examine the research challenges and opportunities of location privacy protection, in a perspective analogous to data privacy. Our discussion attempts to answer the following three questions: (1) Is it sufficient to apply the data privacy models and algorithms developed to date for protecting location privacy? (2) What is the current state of the research on location privacy? (3) What are the open issues and technical challenges that demand further investigation? Through answering these questions, we intend to provide a comprehensive review of the state of the art in location privacy research.

  7. Privacy-Preserving Electrocardiogram Monitoring for Intelligent Arrhythmia Detection.

    Science.gov (United States)

    Son, Junggab; Park, Juyoung; Oh, Heekuck; Bhuiyan, Md Zakirul Alam; Hur, Junbeom; Kang, Kyungtae

    2017-06-12

    Long-term electrocardiogram (ECG) monitoring, as a representative application of cyber-physical systems, facilitates the early detection of arrhythmia. A considerable number of previous studies has explored monitoring techniques and the automated analysis of sensing data. However, ensuring patient privacy or confidentiality has not been a primary concern in ECG monitoring. First, we propose an intelligent heart monitoring system, which involves a patient-worn ECG sensor (e.g., a smartphone) and a remote monitoring station, as well as a decision support server that interconnects these components. The decision support server analyzes the heart activity, using the Pan-Tompkins algorithm to detect heartbeats and a decision tree to classify them. Our system protects sensing data and user privacy, which is an essential attribute of dependability, by adopting signal scrambling and anonymous identity schemes. We also employ a public key cryptosystem to enable secure communication between the entities. Simulations using data from the MIT-BIH arrhythmia database demonstrate that our system achieves a 95.74% success rate in heartbeat detection and almost a 96.63% accuracy in heartbeat classification, while successfully preserving privacy and securing communications among the involved entities.
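
    The decision support server described above detects heartbeats with the Pan-Tompkins algorithm before classifying them. The sketch below is a greatly simplified stand-in for that detection stage (band-pass filter, squared derivative, peak picking with SciPy); it is not the paper's implementation, and the sampling rate, thresholds and synthetic test signal are assumptions.

        # Greatly simplified stand-in for QRS detection (NOT the full Pan-Tompkins
        # algorithm): band-pass filter, squared derivative, simple peak picking.
        # Sampling rate, thresholds and the test signal are assumptions.
        import numpy as np
        from scipy.signal import butter, filtfilt, find_peaks

        def detect_heartbeats(ecg, fs=360):
            # Band-pass roughly around the QRS energy band (5-15 Hz)
            b, a = butter(2, [5 / (fs / 2), 15 / (fs / 2)], btype="band")
            filtered = filtfilt(b, a, ecg)
            # Differentiate and square to emphasise the steep QRS slopes
            energy = np.square(np.diff(filtered))
            # Keep peaks above a simple threshold, at least 0.25 s apart
            peaks, _ = find_peaks(energy, height=0.5 * energy.max(), distance=int(0.25 * fs))
            return peaks

        # Synthetic trace: one narrow pulse per second for 10 s, sampled at 360 Hz
        fs = 360
        t = np.arange(0, 10, 1 / fs)
        ecg = sum(np.exp(-((t - beat) ** 2) / (2 * 0.01 ** 2)) for beat in np.arange(0.5, 10, 1.0))
        ecg = ecg + 0.05 * np.random.randn(t.size)
        print("beats detected:", len(detect_heartbeats(ecg, fs)))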

  8. Privacy-preserving heterogeneous health data sharing.

    Science.gov (United States)

    Mohammed, Noman; Jiang, Xiaoqian; Chen, Rui; Fung, Benjamin C M; Ohno-Machado, Lucila

    2013-05-01

    Privacy-preserving data publishing addresses the problem of disclosing sensitive data when mining for useful information. Among existing privacy models, ε-differential privacy provides one of the strongest privacy guarantees and makes no assumptions about an adversary's background knowledge. All existing solutions that ensure ε-differential privacy handle the problem of disclosing relational and set-valued data in a privacy-preserving manner separately. In this paper, we propose an algorithm that considers both relational and set-valued data in differentially private disclosure of healthcare data. The proposed approach makes a simple yet fundamental switch in differentially private algorithm design: instead of listing all possible records (ie, a contingency table) for noise addition, records are generalized before noise addition. The algorithm first generalizes the raw data in a probabilistic way, and then adds noise to guarantee ε-differential privacy. We showed that the disclosed data could be used effectively to build a decision tree induction classifier. Experimental results demonstrated that the proposed algorithm is scalable and performs better than existing solutions for classification analysis. The resulting utility may degrade when the output domain size is very large, making it potentially inappropriate to generate synthetic data for large health databases. Unlike existing techniques, the proposed algorithm allows the disclosure of health data containing both relational and set-valued data in a differentially private manner, and can retain essential information for discriminative analysis.
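
    The "add noise to guarantee ε-differential privacy" step above is, at its core, the Laplace mechanism. A minimal sketch of that mechanism for a count query is shown below; the probabilistic generalization step and the healthcare-specific algorithm from the paper are not reproduced, and the example records are invented.

        # Minimal Laplace mechanism for a count query under epsilon-differential
        # privacy. This illustrates only the noise-addition half of the approach;
        # the probabilistic generalization step from the paper is not reproduced.
        import numpy as np

        def dp_count(records, predicate, epsilon):
            true_count = sum(1 for r in records if predicate(r))
            sensitivity = 1.0  # adding/removing one record changes a count by at most 1
            noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
            return true_count + noise

        patients = [{"age": 34, "diagnosis": "flu"},
                    {"age": 58, "diagnosis": "diabetes"},
                    {"age": 61, "diagnosis": "diabetes"}]
        print(dp_count(patients, lambda r: r["diagnosis"] == "diabetes", epsilon=0.5))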

  9. Monitoring Employee Behavior Through the Use of Technology and Issues of Employee Privacy in America

    Directory of Open Access Journals (Sweden)

    Mahmoud Moussa

    2015-04-01

    Full Text Available Despite the historic American love for privacy that has enhanced innovation and creativity throughout the country, encroachments on privacy restrain individual freedom. Notably, advances in technology have offered decision makers remarkable monitoring aptitudes that can be used in numerous tasks for multiple reasons. This has led scholars and practitioners to pose a significant number of questions about what is legitimate and illegitimate in the day-to-day affairs of a business. This article is composed of (a) research about electronic monitoring and privacy concerns; (b) definitions of, critiques of, and alternatives to electronic performance monitoring (EPM); (c) motives behind employee monitoring and leadership behaviors; (d) advice that makes monitoring less distressful; (e) employee monitoring policies; (f) reviewing policies and procedures; (g) the role of human resource development (HRD) in employee assessment and development; and (h) conclusion and recommendations for further studies.

  10. Privacy preserving interactive record linkage (PPIRL).

    Science.gov (United States)

    Kum, Hye-Chung; Krishnamurthy, Ashok; Machanavajjhala, Ashwin; Reiter, Michael K; Ahalt, Stanley

    2014-01-01

    Record linkage to integrate uncoordinated databases is critical in biomedical research using Big Data. Balancing privacy protection against the need for high quality record linkage requires a human-machine hybrid system to safely manage uncertainty in the ever changing streams of chaotic Big Data. In the computer science literature, private record linkage is the most published area. It investigates how to apply a known linkage function safely when linking two tables. However, in practice, the linkage function is rarely known. Thus, there are many data linkage centers whose main role is to be the trusted third party to determine the linkage function manually and link data for research via a master population list for a designated region. Recently, a more flexible computerized third-party linkage platform, Secure Decoupled Linkage (SDLink), has been proposed based on: (1) decoupling data via encryption, (2) obfuscation via chaffing (adding fake data) and universe manipulation; and (3) minimum information disclosure via recoding. We synthesize this literature to formalize a new framework for privacy preserving interactive record linkage (PPIRL) with tractable privacy and utility properties and then analyze the literature using this framework. Human-based third-party linkage centers for privacy preserving record linkage are the accepted norm internationally. We find that a computer-based third-party platform that can precisely control the information disclosed at the micro level and allow frequent human interaction during the linkage process, is an effective human-machine hybrid system that significantly improves on the linkage center model both in terms of privacy and utility.
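
    Two of the ingredients mentioned above, keeping raw identifiers away from the linking party and obfuscation via chaffing, can be illustrated with the toy sketch below. Keyed hashing is used here as a simple stand-in for the encryption-based decoupling described; this is not the SDLink platform, and the shared key and records are invented.

        # Toy illustration only: keyed hashing (a stand-in for encryption-based
        # decoupling) plus chaffing (mixing in fake records). Not the SDLink
        # platform; the key and records are invented for this example.
        import hashlib
        import hmac
        import random

        SECRET_KEY = b"shared-linkage-key"  # assumption: agreed between data holders

        def blind(identifier: str) -> str:
            # Keyed hash so the linking party never sees the raw identifier
            return hmac.new(SECRET_KEY, identifier.lower().encode(), hashlib.sha256).hexdigest()

        def with_chaff(hashed_ids, n_fake=5):
            # Obfuscation via chaffing: mix fake hashed records in with the real ones
            fakes = [blind(f"chaff-{random.random()}") for _ in range(n_fake)]
            mixed = hashed_ids + fakes
            random.shuffle(mixed)
            return mixed

        table_a = with_chaff([blind("Alice Smith"), blind("Bob Jones")])
        table_b = with_chaff([blind("Alice Smith"), blind("Carol White")])
        print("candidate links:", len(set(table_a) & set(table_b)))  # expect 1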

  11. Governance Through Privacy, Fairness, and Respect for Individuals.

    Science.gov (United States)

    Baker, Dixie B; Kaye, Jane; Terry, Sharon F

    2016-01-01

    Individuals have a moral claim to be involved in the governance of their personal data. Individuals' rights include privacy, autonomy, and the ability to choose for themselves how they want to manage risk, consistent with their own personal values and life situations. The Fair Information Practices principles (FIPPs) offer a framework for governance. Privacy-enhancing technology that complies with applicable law and FIPPs offers a dynamic governance tool for enabling the fair and open use of individual's personal data. Any governance model must protect against the risks posed by data misuse. Individual perceptions of risks are a subjective function involving individuals' values toward self, family, and society, their perceptions of trust, and their cognitive decision-making skills. Individual privacy protections and individuals' right to choose are codified in the HIPAA Privacy Rule, which attempts to strike a balance between the dual goals of information flow and privacy protection. The choices most commonly given individuals regarding the use of their health information are binary ("yes" or "no") and immutable. Recent federal recommendations and law recognize the need for granular, dynamic choices. Individuals expect that they will govern the use of their own health and genomic data. Failure to build and maintain individuals' trust increases the likelihood that they will refuse to grant permission to access or use their data. The "no surprises principle" asserts that an individual's personal information should never be collected, used, transmitted, or disclosed in a way that would surprise the individual were she to learn about it. The FIPPs provide a powerful framework for enabling data sharing and use, while maintaining trust. We introduce the eight FIPPs adopted by the Department of Health and Human Services, and provide examples of their interpretation and implementation. Privacy risk and health risk can be reduced by giving consumers control, autonomy, and

  12. Exercising privacy rights in medical science.

    Science.gov (United States)

    Hillmer, Michael; Redelmeier, Donald A

    2007-12-04

    Privacy laws are intended to preserve human well-being and improve medical outcomes. We used the Sportstats website, a repository of competitive athletic data, to test how easily these laws can be circumvented. We designed a haphazard, unrepresentative case-series analysis and applied unscientific methods based on an Internet connection and idle time. We found it both feasible and titillating to breach anonymity, stockpile personal information and generate misquotations. We extended our methods to snoop on celebrities, link to outside databases and uncover refusal to participate. Throughout our study, we evaded capture and public humiliation despite violating these 6 privacy fundamentals. We suggest that the legitimate principle of safeguarding personal privacy is undermined by the natural human tendency toward showing off.

  13. Privacy-Preserving Electrocardiogram Monitoring for Intelligent Arrhythmia Detection †

    Science.gov (United States)

    Son, Junggab; Park, Juyoung; Oh, Heekuck; Bhuiyan, Md Zakirul Alam; Hur, Junbeom; Kang, Kyungtae

    2017-01-01

    Long-term electrocardiogram (ECG) monitoring, as a representative application of cyber-physical systems, facilitates the early detection of arrhythmia. A considerable number of previous studies have explored monitoring techniques and the automated analysis of sensing data. However, ensuring patient privacy or confidentiality has not been a primary concern in ECG monitoring. First, we propose an intelligent heart monitoring system, which involves a patient-worn ECG sensor (e.g., a smartphone) and a remote monitoring station, as well as a decision support server that interconnects these components. The decision support server analyzes the heart activity, using the Pan–Tompkins algorithm to detect heartbeats and a decision tree to classify them. Our system protects sensing data and user privacy, which is an essential attribute of dependability, by adopting signal scrambling and anonymous identity schemes. We also employ a public key cryptosystem to enable secure communication between the entities. Simulations using data from the MIT-BIH arrhythmia database demonstrate that our system achieves a 95.74% success rate in heartbeat detection and almost a 96.63% accuracy in heartbeat classification, while successfully preserving privacy and securing communications among the involved entities. PMID:28604628
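
    The detection stage described above can be illustrated with a much-simplified sketch of a Pan–Tompkins-style pipeline (band-pass, differentiate, square, integrate, threshold); the filter band, the fixed threshold, and the 360 Hz sampling rate are illustrative assumptions borrowed from the MIT-BIH recordings, not the authors' exact configuration.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def detect_beats(ecg: np.ndarray, fs: float = 360.0) -> np.ndarray:
    """Very simplified Pan-Tompkins-style QRS detector for a 1-D ECG array:
    band-pass, differentiate, square, integrate over a moving window, threshold."""
    b, a = butter(2, [5 / (fs / 2), 15 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, ecg)                  # keep the QRS frequency band
    deriv = np.diff(filtered, prepend=filtered[0])  # emphasise steep slopes
    squared = deriv ** 2
    win = int(0.15 * fs)                            # ~150 ms integration window
    integrated = np.convolve(squared, np.ones(win) / win, mode="same")
    threshold = 0.5 * integrated.max()              # crude fixed threshold
    above = integrated > threshold
    onsets = np.flatnonzero(above[1:] & ~above[:-1])
    return onsets                                   # sample indices where a beat starts
```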

  14. Automated Decision-Making and Big Data: Concerns for People With Mental Illness.

    Science.gov (United States)

    Monteith, Scott; Glenn, Tasha

    2016-12-01

    Automated decision-making by computer algorithms based on data from our behaviors is fundamental to the digital economy. Automated decisions impact everyone, occurring routinely in education, employment, health care, credit, and government services. Technologies that generate tracking data, including smartphones, credit cards, websites, social media, and sensors, offer unprecedented benefits. However, people are vulnerable to errors and biases in the underlying data and algorithms, especially those with mental illness. Algorithms based on big data from seemingly unrelated sources may create obstacles to community integration. Voluntary online self-disclosure and constant tracking blur traditional concepts of public versus private data, medical versus non-medical data, and human versus automated decision-making. In contrast to sharing sensitive information with a physician in a confidential relationship, there may be numerous readers of information revealed online; data may be sold repeatedly; used in proprietary algorithms; and are effectively permanent. Technological changes challenge traditional norms affecting privacy and decision-making, and continued discussions on new approaches to provide privacy protections are needed.

  15. Virtue, Privacy and Self-Determination

    DEFF Research Database (Denmark)

    Stamatellos, Giannis

    2011-01-01

    The ethical problem of privacy lies at the core of computer ethics and cyber ethics discussions. The extensive use of personal data in digital networks poses a serious threat to the user’s right of privacy, not only at the level of a user’s data integrity and security but also at the level of a user’s identity and freedom. In normative ethical theory the need for an informational self-deterministic approach to privacy is stressed, with greater emphasis on control over personal data. However, scant attention has been paid to a virtue ethics approach to information privacy. Plotinus’ discussion of self-determination is related to ethical virtue, human freedom and intellectual autonomy. The Plotinian virtue ethics approach of self-determination is not primarily related to the sphere of moral action, but to the quality of the self prior to moral practice. In this paper, it is argued that the problem of information privacy...

  16. Disentangling privacy from property: toward a deeper understanding of genetic privacy.

    Science.gov (United States)

    Suter, Sonia M

    2004-04-01

    With the mapping of the human genome, genetic privacy has become a concern to many. People care about genetic privacy because genes play an important role in shaping us--our genetic information is about us, and it is deeply connected to our sense of ourselves. In addition, unwanted disclosure of our genetic information, like a great deal of other personal information, makes us vulnerable to unwanted exposure, stigmatization, and discrimination. One recent approach to protecting genetic privacy is to create property rights in genetic information. This Article argues against that approach. Privacy and property are fundamentally different concepts. At heart, the term "property" connotes control within the marketplace and over something that is disaggregated or alienable from the self. "Privacy," in contrast, connotes control over access to the self as well as things close to, intimately connected to, and about the self. Given these different meanings, a regime of property rights in genetic information would impoverish our understanding of that information, ourselves, and the relationships we hope will be built around and through its disclosure. This Article explores our interests in genetic information in order to deepen our understanding of the ongoing discourse about the distinction between property and privacy. It develops a conception of genetic privacy with a strong relational component. We ordinarily share genetic information in the context of relationships in which disclosure is important to the relationship--family, intimate, doctor-patient, researcher-participant, employer-employee, and insurer-insured relationships. Such disclosure makes us vulnerable to and dependent on the person to whom we disclose it. As a result, trust is essential to the integrity of these relationships and our sharing of genetic information. Genetic privacy can protect our vulnerability in these relationships and enhance the trust we hope to have in them. Property, in contrast, by

  17. Privacy proof in the cloud

    NARCIS (Netherlands)

    Jessen, Veerle; Weigand, Hans; Mouratidis, Haris

    Cloud computing has been a frequently researched subject as it brings many advantages, such as the ability to store data remotely and scale rapidly, but also comes with several issues, including privacy, trust and security. The decision whether it is best to go `into the cloud' or to `stay inside'

  18. Trust information-based privacy architecture for ubiquitous health.

    Science.gov (United States)

    Ruotsalainen, Pekka Sakari; Blobel, Bernd; Seppälä, Antto; Nykänen, Pirkko

    2013-10-08

    analysis. The architecture mimics the way humans use trust information in decision making, and enables the DS to design system-specific privacy policies using computational trust information based on systems' measured features. The trust attributes developed describe the level at which systems support awareness and transparency, and how they follow general and domain-specific regulations and laws. The monitoring component of the architecture offers dynamic feedback on how the system enforces the DS's policies. The privacy management architecture developed in this study enables the DS to dynamically manage information privacy in ubiquitous health and to define individual policies for all systems considering their trust value and corresponding attributes. The DS can also set policies for secondary use and reuse of health information. The architecture offers protection against privacy threats existing in ubiquitous environments. Although the architecture is targeted to ubiquitous health, it can easily be modified for other ubiquitous applications.

  19. Privacy Challenges of Genomic Big Data.

    Science.gov (United States)

    Shen, Hong; Ma, Jian

    2017-01-01

    With the rapid advancement of high-throughput DNA sequencing technologies, genomics has become a big data discipline where large-scale genetic information of human individuals can be obtained efficiently at low cost. However, such a massive amount of personal genomic data creates a tremendous challenge for privacy, especially given the emergence of the direct-to-consumer (DTC) industry that provides genetic testing services. Here we review recent developments in genomic big data and their implications for privacy. We also discuss the current dilemmas and future challenges of genomic privacy.

  20. Scalable privacy-preserving data sharing methodology for genome-wide association studies: an application to iDASH healthcare privacy protection challenge.

    Science.gov (United States)

    Yu, Fei; Ji, Zhanglong

    2014-01-01

    In response to the growing interest in genome-wide association study (GWAS) data privacy, the Integrating Data for Analysis, Anonymization and SHaring (iDASH) center organized the iDASH Healthcare Privacy Protection Challenge, with the aim of investigating the effectiveness of applying privacy-preserving methodologies to human genetic data. This paper is based on a submission to the iDASH Healthcare Privacy Protection Challenge. We apply privacy-preserving methods that are adapted from Uhler et al. 2013 and Yu et al. 2014 to the challenge's data and analyze the data utility after the data are perturbed by the privacy-preserving methods. Major contributions of this paper include new interpretation of the χ2 statistic in a GWAS setting and new results about the Hamming distance score, a key component for one of the privacy-preserving methods.
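
    For readers unfamiliar with the statistic mentioned above, the sketch below computes a standard allelic χ2 test from case/control allele counts at a single SNP; it only illustrates the quantity being protected and perturbed, not the challenge submission itself, and the counts are invented.

```python
import numpy as np

def allelic_chi2(case_counts, control_counts):
    """Chi-square statistic for a 2x2 table of allele counts
    (rows: case/control, columns: major/minor allele)."""
    table = np.array([case_counts, control_counts], dtype=float)
    row = table.sum(axis=1, keepdims=True)
    col = table.sum(axis=0, keepdims=True)
    expected = row @ col / table.sum()          # expected counts under independence
    return float(((table - expected) ** 2 / expected).sum())

# Example: 120 major / 80 minor alleles in cases, 150 / 50 in controls.
print(allelic_chi2((120, 80), (150, 50)))
```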

  1. A Cross-Cultural Perspective on the Privacy Calculus

    Directory of Open Access Journals (Sweden)

    Sabine Trepte

    2017-01-01

    Full Text Available The “privacy calculus” approach to studying online privacy implies that willingness to engage in disclosures on social network sites (SNSs) depends on evaluation of the resulting risks and benefits. In this article, we propose that cultural factors influence the perception of privacy risks and social gratifications. Based on survey data collected from participants from five countries (Germany [n = 740], the Netherlands [n = 89], the United Kingdom [n = 67], the United States [n = 489], and China [n = 165]), we successfully replicated the privacy calculus. Furthermore, we found that culture plays an important role: As expected, people from cultures ranking high in individualism found it less important to generate social gratifications on SNSs as compared to people from collectivist-oriented countries. However, the latter placed greater emphasis on privacy risks—presumably to safeguard the collective. Furthermore, we identified uncertainty avoidance to be a cultural dimension crucially influencing the perception of SNS risks and benefits. As expected, people from cultures ranking high in uncertainty avoidance found privacy risks to be more important when making privacy-related disclosure decisions. At the same time, these participants ascribed lower importance to social gratifications—possibly because social encounters are perceived to be less controllable in the social media environment.

  2. Privacy and security in the digital age: Contemporary ethical challenges and future directions

    DEFF Research Database (Denmark)

    Hiranandani, Vanmala Sunder

    2011-01-01

    Privacy is at the core of civil rights from which all other human rights and freedoms flow. Since the twentieth century, and particularly since 9/11, rapid deployment of information and surveillance technologies in the name of national security has grave implications for individual privacy and human rights. This article reviews major strands in the contemporary privacy-security debate, while critiquing existing conceptualisations of privacy that are inadequate in the context of multifaceted and ubiquitous surveillance technologies post 9/11. Further, this paper contends most privacy...

  3. (IN-)PRIVACY IN MOBILE APPS. CUSTOMER OPPORTUNITIES

    Directory of Open Access Journals (Sweden)

    Yu.S. Chemerkina

    2016-01-01

    Full Text Available Subject of Study. The paper presents the results of an investigation of cross-platform mobile applications. It focuses on cross-platform app data investigation for the purpose of creating a database that helps decisions to be made from a data privacy viewpoint. These decisions refer to knowledge about mobile apps that are available to the public, especially how consumer data is protected while stored locally or transferred over the network, as well as what types of data may leak. Methods. The paper proposes a forensics methodology as the cornerstone of the app data investigation process. The object of research is application data protection under different security control types across modern mobile OSs. The subject of research is a modification of the forensics approach, combined with behavioral analysis, to examine application data privacy in order to find data that are not properly handled by applications and lead to data leakages, and to define the protection control type without forensic limits. In addition, the paper relies on the simplest tools, limiting examination to locally stored data and data transmitted over the network, and excluding memory and code analysis unless it is valuable (behavioral analysis). The research methods include digital forensics methods depending on data conception (at-rest, in-use/memory, in-transit) with behavioral analysis of the application, and static and dynamic application code analysis. Main Results. The research was carried out within the scope of this work, and the following scientific results were obtained. First, the methods used to investigate the privacy of application data allow application features and protection code design and flaws to be considered in the context of incomplete user awareness about the privacy state due to external activity of the developer. Second, the knowledge set about facts of application data protection that allows making a knowledge database to

  4. Health information: reconciling personal privacy with the public good of human health.

    Science.gov (United States)

    Gostin, L O

    2001-01-01

    The success of the health care system depends on the accuracy, correctness and trustworthiness of the information, and the privacy rights of individuals to control the disclosure of personal information. A national policy on health informational privacy should be guided by ethical principles that respect individual autonomy while recognizing the important collective interests in the use of health information. At present there are no adequate laws or constitutional principles to help guide a rational privacy policy. The laws are scattered and fragmented across the states. Constitutional law is highly general, without important specific safeguards. Finally, a case study is provided showing the important trade-offs that exist between public health and privacy. For a model public health law, see www.critpath.org/msphpa/privacy.

  5. The Different Functions of Speech in Defamation and Privacy Cases.

    Science.gov (United States)

    Kebbel, Gary

    1984-01-01

    Reviews United States Supreme Court decisions since 1900 to show that free speech decisions often rest on the circumstances surrounding the speech. Indicates that freedom of speech wins out over privacy when the speech serves a social or political function, but not when personal happiness is at issue.

  6. A Distributed Ensemble Approach for Mining Healthcare Data under Privacy Constraints.

    Science.gov (United States)

    Li, Yan; Bai, Changxin; Reddy, Chandan K

    2016-02-10

    In recent years, electronic health records (EHRs) have been widely adopted at many healthcare facilities in an attempt to improve the quality of patient care and increase the productivity and efficiency of healthcare delivery. These EHRs can accurately diagnose diseases if utilized appropriately. While the EHRs can potentially resolve many of the existing problems associated with disease diagnosis, one of the main obstacles in effectively using them is the patient privacy and sensitivity of the medical information available in the EHR. Due to these concerns, even if the EHRs are available for storage and retrieval purposes, sharing of the patient records between different healthcare facilities has become a major concern and has hampered some of the effective advantages of using EHRs. Due to this lack of data sharing, most of the facilities aim at building clinical decision support systems using a limited amount of patient data from their own EHR systems to provide important diagnosis related decisions. It becomes quite infeasible for a newly established healthcare facility to build a robust decision making system due to the lack of sufficient patient records. However, to make effective decisions from clinical data, it is indispensable to have large amounts of data to train the decision models. In this regard, there are conflicting objectives of preserving patient privacy and having sufficient data for modeling and decision making. To handle such disparate goals, we develop two adaptive distributed privacy-preserving algorithms based on a distributed ensemble strategy. The basic idea of our approach is to build an elegant model for each participating facility to accurately learn the data distribution, and then transfer the useful healthcare knowledge acquired on their data from these participators in the form of their own decision models, without revealing and sharing the patient-level sensitive data, thus protecting patient privacy. We demonstrate that our
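
    The distributed-ensemble idea described above can be sketched as follows: each facility fits a model on its own records, only the fitted models travel, and predictions are combined by majority vote. The scikit-learn decision tree and the simulated data are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def train_local_model(X_local, y_local):
    """Each facility trains on its own EHR data; patient-level rows never leave the site."""
    return DecisionTreeClassifier(max_depth=4).fit(X_local, y_local)

def ensemble_predict(models, X_new):
    """Combine the shared models by unweighted majority vote."""
    votes = np.stack([m.predict(X_new) for m in models])   # shape: (n_models, n_samples)
    return (votes.mean(axis=0) >= 0.5).astype(int)

# Example with three simulated facilities and a binary outcome.
rng = np.random.default_rng(0)
models = []
for _ in range(3):
    X = rng.normal(size=(200, 5))
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)
    models.append(train_local_model(X, y))
print(ensemble_predict(models, rng.normal(size=(4, 5))))
```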

  7. Beyond Lawrence v. Texas: crafting a fundamental right to sexual privacy.

    Science.gov (United States)

    Fasullo, Kristin

    2009-05-01

    After the watershed 2003 U.S. Supreme Court decision Lawrence v. Texas, courts are faced with the daunting task of navigating the bounds of sexual privacy in light of Lawrence's sweeping language and unconventional structure. This Note focuses on the specific issue of state governments regulating sexual device distribution. Evaluating the substantive due process rights of sexual device retailers and users, this Note ultimately argues that the privacy interest identified in Lawrence is sufficiently broad to protect intimate decisions to engage in adult consensual sexual behavior, including the liberty to sell, purchase, and use a sexual device.

  8. Human centric security and privacy for the IoT using formal techniques

    OpenAIRE

    Kammueller, Florian

    2018-01-01

    In this paper, we summarize a new approach to make security and privacy issues in the Internet of Things (IoT) more transparent for vulnerable users. As a pilot project, we investigate monitoring of Alzheimer’s patients for a low-cost early warning system based on bio-markers supported with smart technologies. To provide trustworthy and secure IoT infrastructures, we employ formal methods and techniques that allow specification of IoT scenarios with human actors, refinement and analysis of at...

  9. Control use of data to protect privacy.

    Science.gov (United States)

    Landau, Susan

    2015-01-30

    Massive data collection by businesses and governments calls into question traditional methods for protecting privacy, underpinned by two core principles: (i) notice, that there should be no data collection system whose existence is secret, and (ii) consent, that data collected for one purpose not be used for another without user permission. But notice, designated as a fundamental privacy principle in a different era, makes little sense in situations where collection consists of lots and lots of small amounts of information, whereas consent is no longer realistic, given the complexity and number of decisions that must be made. Thus, efforts to protect privacy by controlling use of data are gaining more attention. I discuss relevant technology, policy, and law, as well as some examples that can illuminate the way. Copyright © 2015, American Association for the Advancement of Science.

  10. 45 CFR 164.534 - Compliance dates for initial implementation of the privacy standards.

    Science.gov (United States)

    2010-10-01

    ... privacy standards. 164.534 Section 164.534 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES ADMINISTRATIVE DATA STANDARDS AND RELATED REQUIREMENTS SECURITY AND PRIVACY Privacy of Individually Identifiable Health Information § 164.534 Compliance dates for initial implementation of the privacy standards. (a...

  11. Human-centric decision-making models for social sciences

    CERN Document Server

    Pedrycz, Witold

    2014-01-01

    The volume delivers a wealth of effective methods to deal with various types of uncertainty inherently existing in human-centric decision problems. It elaborates on comprehensive decision frameworks for handling different decision scenarios, which help to use explicit and tacit knowledge and intuition effectively, and to model perceptions and preferences in a more human-oriented style. The book presents original approaches and delivers new results on fundamentals and applications related to human-centered decision making approaches to business, economics and social systems. Individual chapters cover multi-criteria (multiattribute) decision making, decision making with prospect theory, decision making with incomplete probabilistic information, granular models of decision making and decision making realized with the use of non-additive measures. New and emerging decision theories presented along with a wide spectrum of ongoing research make the book valuable to all interested in the field of advanced decision-mak...

  12. Blood rights: the body and information privacy.

    Science.gov (United States)

    Alston, Bruce

    2005-05-01

    Genetic and other medical technology makes blood, human tissue and other bodily samples an immediate and accessible source of comprehensive personal and health information about individuals. Yet, unlike medical records, bodily samples are not subject to effective privacy protection or other regulation to ensure that individuals have rights to control the collection, use and transfer of such samples. This article examines the existing coverage of privacy legislation, arguments in favour of baseline protection for bodily samples as sources of information and possible approaches to new regulation protecting individual privacy rights in bodily samples.

  13. Privacy, confidentiality and abortion statistics: a question of public interest?

    Science.gov (United States)

    McHale, Jean V; Jones, June

    2012-01-01

    The precise nature and scope of healthcare confidentiality has long been the subject of debate. While the obligation of confidentiality is integral to professional ethical codes and is also safeguarded under English law through the equitable remedy of breach of confidence, underpinned by the right to privacy enshrined in Article 8 of the Human Rights Act 1998, it has never been regarded as absolute. But when can and should personal information be made available for statistical and research purposes and what if the information in question is highly sensitive information, such as that relating to the termination of pregnancy after 24 weeks? This article explores the case of In the Matter of an Appeal to the Information Tribunal under section 57 of the Freedom of Information Act 2000, concerning the decision of the Department of Health to withhold some statistical data from the publication of its annual abortion statistics. The specific data being withheld concerned the termination for serious fetal handicap under section 1(1)d of the Abortion Act 1967. The paper explores the implications of this case, which relate both to the nature and scope of personal privacy. It suggests that lessons can be drawn from this case about public interest and use of statistical information and also about general policy issues concerning the legal regulation of confidentiality and privacy in the future.

  14. Security and privacy in biometrics

    CERN Document Server

    Campisi, Patrizio

    2013-01-01

    This important text/reference presents the latest secure and privacy-compliant techniques in automatic human recognition. Featuring viewpoints from an international selection of experts in the field, the comprehensive coverage spans both theory and practical implementations, taking into consideration all ethical and legal issues. Topics and features: presents a unique focus on novel approaches and new architectures for unimodal and multimodal template protection; examines signal processing techniques in the encrypted domain, security and privacy leakage assessment, and aspects of standardization...

  15. 77 FR 48984 - Privacy Act of 1974; System of Records Notice

    Science.gov (United States)

    2012-08-15

    ... Privacy Act systems, to facilitate their ability to respond to data security breach incidents (see OMB... DEPARTMENT OF HEALTH AND HUMAN SERVICES Privacy Act of 1974; System of Records Notice AGENCY...: In accordance with the requirements of the Privacy Act of 1974, HHS gives notice of a proposed...

  16. Privacy Attitudes among Early Adopters of Emerging Health Technologies.

    Directory of Open Access Journals (Sweden)

    Cynthia Cheung

    Full Text Available Advances in health technology such as genome sequencing and wearable sensors now allow for the collection of highly granular personal health data from individuals. It is unclear how people think about privacy in the context of these emerging health technologies. An open question is whether early adopters of these advances conceptualize privacy in different ways than non-early adopters. This study sought to understand privacy attitudes of early adopters of emerging health technologies. Transcripts from in-depth, semi-structured interviews with early adopters of genome sequencing and health devices and apps were analyzed with a focus on participant attitudes and perceptions of privacy. Themes were extracted using inductive content analysis. Although interviewees were willing to share personal data to support scientific advancements, they still expressed concerns, as well as uncertainty about who has access to their data, and for what purpose. In short, they were not dismissive of privacy risks. Key privacy-related findings are organized into four themes as follows: first, personal data privacy; second, control over personal information; third, concerns about discrimination; and fourth, contributing personal data to science. Early adopters of emerging health technologies appear to have more complex and nuanced conceptions of privacy than might be expected based on their adoption of personal health technologies and participation in open science. Early adopters also voiced uncertainty about the privacy implications of their decisions to use new technologies and share their data for research. Though not representative of the general public, studies of early adopters can provide important insights into evolving attitudes toward privacy in the context of emerging health technologies and personal health data research.

  17. Privacy Attitudes among Early Adopters of Emerging Health Technologies.

    Science.gov (United States)

    Cheung, Cynthia; Bietz, Matthew J; Patrick, Kevin; Bloss, Cinnamon S

    2016-01-01

    Advances in health technology such as genome sequencing and wearable sensors now allow for the collection of highly granular personal health data from individuals. It is unclear how people think about privacy in the context of these emerging health technologies. An open question is whether early adopters of these advances conceptualize privacy in different ways than non-early adopters. This study sought to understand privacy attitudes of early adopters of emerging health technologies. Transcripts from in-depth, semi-structured interviews with early adopters of genome sequencing and health devices and apps were analyzed with a focus on participant attitudes and perceptions of privacy. Themes were extracted using inductive content analysis. Although interviewees were willing to share personal data to support scientific advancements, they still expressed concerns, as well as uncertainty about who has access to their data, and for what purpose. In short, they were not dismissive of privacy risks. Key privacy-related findings are organized into four themes as follows: first, personal data privacy; second, control over personal information; third, concerns about discrimination; and fourth, contributing personal data to science. Early adopters of emerging health technologies appear to have more complex and nuanced conceptions of privacy than might be expected based on their adoption of personal health technologies and participation in open science. Early adopters also voiced uncertainty about the privacy implications of their decisions to use new technologies and share their data for research. Though not representative of the general public, studies of early adopters can provide important insights into evolving attitudes toward privacy in the context of emerging health technologies and personal health data research.

  18. A Case Study on Differential Privacy

    OpenAIRE

    Asseffa, Samrawit; Seleshi, Bihil

    2017-01-01

    Throughout the ages, human beings have preferred to keep most things secret and brand this overall state with the title of privacy. Like most significant terms, privacy tends to create controversy regarding the extent of its flexible boundaries, since various technological advancements are slowly leaching away the power people have over their own information. Even as cell phone brands release new upgrades, the ways in which information is communicated have drastically increased, in turn facilitating t...

  19. 76 FR 72325 - Privacy Act; Exempt Record System

    Science.gov (United States)

    2011-11-23

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES 45 CFR Part 5b RIN 0906-AA91 Privacy Act; Exempt Record... are guaranteed access to, and correction rights for, substantive information reported to the NPDB. The procedures, appearing in 45 CFR part 60, use the Privacy Act access and correction procedures as a basic...

  20. Privacy protectionism and health information: is there any redress for harms to health?

    Science.gov (United States)

    Allen, Judy; Holman, C D'arcy J; Meslin, Eric M; Stanley, Fiona

    2013-12-01

    Health information collected by governments can be a valuable resource for researchers seeking to improve diagnostics, treatments and public health outcomes. Responsible use requires close attention to privacy concerns and to the ethical acceptability of using personal health information without explicit consent. Less well appreciated are the legal and ethical issues that are implicated when privacy protection is extended to the point where the potential benefits to the public from research are lost. Balancing these issues is a delicate matter for data custodians. This article examines the legal, ethical and structural context in which data custodians make decisions about the release of data for research. It considers the impact of those decisions on individuals. While there is strong protection against risks to privacy and multiple avenues of redress, there is no redress where harms result from a failure to release data for research.

  1. Privacy under construction : A developmental perspective on privacy perception

    NARCIS (Netherlands)

    Steijn, W.M.P.; Vedder, A.H.

    2015-01-01

    We present a developmental perspective regarding the difference in perceptions toward privacy between young and old. Here, we introduce the notion of privacy conceptions, that is, the specific ideas that individuals have regarding what privacy actually is. The differences in privacy concerns often

  2. Privacy Bridges: EU and US Privacy Experts In Search of Transatlantic Privacy Solutions

    NARCIS (Netherlands)

    Abramatic, J.-F.; Bellamy, B.; Callahan, M.E.; Cate, F.; van Eecke, P.; van Eijk, N.; Guild, E.; de Hert, P.; Hustinx, P.; Kuner, C.; Mulligan, D.; O'Connor, N.; Reidenberg, J.; Rubinstein, I.; Schaar, P.; Shadbolt, N.; Spiekermann, S.; Vladeck, D.; Weitzner, D.J.; Zuiderveen Borgesius, F.; Hagenauw, D.; Hijmans, H.

    2015-01-01

    The EU and US share a common commitment to privacy protection as a cornerstone of democracy. Following the Treaty of Lisbon, data privacy is a fundamental right that the European Union must proactively guarantee. In the United States, data privacy derives from constitutional protections in the

  3. Leaking privacy and shadow profiles in online social networks.

    Science.gov (United States)

    Garcia, David

    2017-08-01

    Social interaction and data integration in the digital society can affect the control that individuals have on their privacy. Social networking sites can access data from other services, including user contact lists where nonusers are listed too. Although most research on online privacy has focused on inference of personal information of users, this data integration poses the question of whether it is possible to predict personal information of nonusers. This article tests the shadow profile hypothesis, which postulates that the data given by the users of an online service predict personal information of nonusers. Using data from a disappeared social networking site, we perform a historical audit to evaluate whether personal data of nonusers could have been predicted with the personal data and contact lists shared by the users of the site. We analyze personal information of sexual orientation and relationship status, which follow regular mixing patterns in the social network. Going back in time over the growth of the network, we measure predictor performance as a function of network size and tendency of users to disclose their contact lists. This article presents robust evidence supporting the shadow profile hypothesis and reveals a multiplicative effect of network size and disclosure tendencies that accelerates the performance of predictors. These results call for new privacy paradigms that take into account the fact that individual privacy decisions do not happen in isolation and are mediated by the decisions of others.
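
    A toy version of the shadow-profile audit described above: predict an undisclosed attribute of a nonuser from the disclosed attributes of the users whose uploaded contact lists mention them. The names, attribute values, and majority-vote rule are made up for illustration and are far simpler than the article's network-mixing analysis.

```python
from collections import Counter

def predict_nonuser_attribute(nonuser, contact_lists, disclosed):
    """Majority vote over the disclosed attribute values of users whose
    uploaded contact lists include the nonuser."""
    votes = [disclosed[user] for user, contacts in contact_lists.items()
             if nonuser in contacts and user in disclosed]
    return Counter(votes).most_common(1)[0][0] if votes else None

# Users share contact lists that mention a nonuser ("dana") plus their own attribute.
contact_lists = {"alice": {"dana", "bob"}, "bob": {"dana"}, "carol": {"erin"}}
disclosed = {"alice": "single", "bob": "single", "carol": "married"}
print(predict_nonuser_attribute("dana", contact_lists, disclosed))  # -> "single"
```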

  4. A Generic Privacy Quantification Framework for Privacy-Preserving Data Publishing

    Science.gov (United States)

    Zhu, Zutao

    2010-01-01

    In recent years, the concerns about the privacy for the electronic data collected by government agencies, organizations, and industries are increasing. They include individual privacy and knowledge privacy. Privacy-preserving data publishing is a research branch that preserves the privacy while, at the same time, withholding useful information in…

  5. Patient Privacy in the Era of Big Data.

    Science.gov (United States)

    Kayaalp, Mehmet

    2018-01-20

    Privacy was defined as a fundamental human right in the Universal Declaration of Human Rights at the 1948 United Nations General Assembly. However, there is still no consensus on what constitutes privacy. In this review, we look at the evolution of privacy as a concept from the era of Hippocrates to the era of social media and big data. To appreciate the modern measures of patient privacy protection and correctly interpret the current regulatory framework in the United States, we need to analyze and understand the concepts of individually identifiable information, individually identifiable health information, protected health information, and de-identification. The Privacy Rule of the Health Insurance Portability and Accountability Act defines the regulatory framework and strikes a balance between protective measures and access to health information for secondary (scientific) use. The rule defines the conditions when health information is protected by law and how protected health information can be de-identified for secondary use. With the advent of artificial intelligence and computational linguistics, computational text de-identification algorithms produce de-identified results nearly as well as those produced by human experts, but much faster, more consistently and basically for free. Modern clinical text de-identification systems now pave the road to big data and enable scientists to access de-identified clinical information while firmly protecting patient privacy. However, clinical text de-identification is not a perfect process. In order to maximize the protection of patient privacy and to free clinical and scientific information from the confines of electronic healthcare systems, all stakeholders, including patients, health institutions and institutional review boards, scientists and the scientific communities, as well as regulatory and law enforcement agencies must collaborate closely. On the one hand, public health laws and privacy regulations define rules
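
    As a flavour of what computational text de-identification does (far cruder than the systems discussed above), the sketch below masks a few identifier patterns in clinical free text; the regular expressions and placeholders are illustrative only and come nowhere near HIPAA Safe Harbor coverage.

```python
import re

PATTERNS = {
    "[PHONE]": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
    "[SSN]":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "[DATE]":  re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
    "[MRN]":   re.compile(r"\bMRN[:\s]*\d+\b", re.IGNORECASE),
}

def deidentify(note: str) -> str:
    """Replace matched identifiers with category placeholders."""
    for placeholder, pattern in PATTERNS.items():
        note = pattern.sub(placeholder, note)
    return note

print(deidentify("Pt seen 03/14/2018, MRN: 445821, callback 412-555-0198."))
# -> "Pt seen [DATE], [MRN], callback [PHONE]."
```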

  6. Choose Privacy Week: Educate Your Students (and Yourself) about Privacy

    Science.gov (United States)

    Adams, Helen R.

    2016-01-01

    The purpose of "Choose Privacy Week" is to encourage a national conversation to raise awareness of the growing threats to personal privacy online and in day-to-day life. The 2016 Choose Privacy Week theme is "respecting individuals' privacy," with an emphasis on minors' privacy. A plethora of issues relating to minors' privacy…

  7. 75 FR 63703 - Privacy Act of 1974; Privacy Act Regulation

    Science.gov (United States)

    2010-10-18

    ... FEDERAL RESERVE SYSTEM 12 CFR Part 261a [Docket No. R-1313] Privacy Act of 1974; Privacy Act... implementing the Privacy Act of 1974 (Privacy Act). The primary changes concern the waiver of copying fees... records under the Privacy Act; the amendment of special procedures for the release of medical records to...

  8. Policy recommendations for addressing privacy challenges associated with cell-based research and interventions.

    Science.gov (United States)

    Ogbogu, Ubaka; Burningham, Sarah; Ollenberger, Adam; Calder, Kathryn; Du, Li; El Emam, Khaled; Hyde-Lay, Robyn; Isasi, Rosario; Joly, Yann; Kerr, Ian; Malin, Bradley; McDonald, Michael; Penney, Steven; Piat, Gayle; Roy, Denis-Claude; Sugarman, Jeremy; Vercauteren, Suzanne; Verhenneman, Griet; West, Lori; Caulfield, Timothy

    2014-02-03

    The increased use of human biological material for cell-based research and clinical interventions poses risks to the privacy of patients and donors, including the possibility of re-identification of individuals from anonymized cell lines and associated genetic data. These risks will increase as technologies and databases used for re-identification become affordable and more sophisticated. Policies that require ongoing linkage of cell lines to donors' clinical information for research and regulatory purposes, and existing practices that limit research participants' ability to control what is done with their genetic data, amplify the privacy concerns. To date, the privacy issues associated with cell-based research and interventions have not received much attention in the academic and policymaking contexts. This paper, arising out of a multi-disciplinary workshop, aims to rectify this by outlining the issues, proposing novel governance strategies and policy recommendations, and identifying areas where further evidence is required to make sound policy decisions. The authors of this paper take the position that existing rules and norms can be reasonably extended to address privacy risks in this context without compromising emerging developments in the research environment, and that exceptions from such rules should be justified using a case-by-case approach. In developing new policies, the broader framework of regulations governing cell-based research and related areas must be taken into account, as well as the views of impacted groups, including scientists, research participants and the general public. This paper outlines deliberations at a policy development workshop focusing on privacy challenges associated with cell-based research and interventions. The paper provides an overview of these challenges, followed by a discussion of key themes and recommendations that emerged from discussions at the workshop. The paper concludes that privacy risks associated with cell

  9. Designing Privacy for You : A User Centric Approach For Privacy

    OpenAIRE

    Senarath, Awanthika; Arachchilage, Nalin A. G.; Slay, Jill

    2017-01-01

    Privacy directly concerns the user as the data owner (data-subject) and hence privacy in systems should be implemented in a manner which concerns the user (user-centered). There are many concepts and guidelines that support development of privacy and embedding privacy into systems. However, none of them approaches privacy in a user-centered manner. Through this research we propose a framework that would enable developers and designers to grasp privacy in a user-centered manner and implement...

  10. An examination of electronic health information privacy in older adults.

    Science.gov (United States)

    Le, Thai; Thompson, Hilaire; Demiris, George

    2013-01-01

    Older adults are the quickest growing demographic group and are key consumers of health services. As the United States health system transitions to electronic health records, it is important to understand older adult perceptions of privacy and security. We performed a secondary analysis of the Health Information National Trends Survey (2012, Cycle 1), to examine differences in perceptions of electronic health information privacy between older adults and the general population. We found differences in the level of importance placed on access to electronic health information (older adults placed greater emphasis on provider as opposed to personal access) and tendency to withhold information out of concerns for privacy and security (older adults were less likely to withhold information). We provide recommendations to alleviate some of these privacy concerns. This may facilitate greater use of electronic health communication between patient and provider, while promoting shared decision making.

  11. Anonymization of Court Decisions: Are Restrictions on the Right to Information in “Accordance with the Law”?

    Directory of Open Access Journals (Sweden)

    Gruodytė Edita

    2016-12-01

    Full Text Available In Lithuania rules for the anonymization of court decisions were introduced in 2005. These rules require automatic anonymization of all court decisions, which, in the authors' opinion, violates the public interest in knowing and unjustifiably restricts freedom of expression on behalf of the right to privacy. This issue involves two diametrically opposed human rights: the right to privacy and the right to information. The first question is how a balance between two equivalent rights can be reached. The second question is whether this regulation is in accordance with the law as established in the national Constitution and revealed by the Constitutional Court of the Republic of Lithuania, and as developed in the jurisprudence of the European Court of Human Rights. The authors conclude that the legislator is not empowered to delegate to the Judicial Council issues which are a matter of legal regulation, and suggest possible solutions drawing on the practice of the Court of Justice of the European Union, the European Court of Human Rights, and selected EU countries.

  12. The evolutionary roots of human decision making.

    Science.gov (United States)

    Santos, Laurie R; Rosati, Alexandra G

    2015-01-03

    Humans exhibit a suite of biases when making economic decisions. We review recent research on the origins of human decision making by examining whether similar choice biases are seen in nonhuman primates, our closest phylogenetic relatives. We propose that comparative studies can provide insight into four major questions about the nature of human choice biases that cannot be addressed by studies of our species alone. First, research with other primates can address the evolution of human choice biases and identify shared versus human-unique tendencies in decision making. Second, primate studies can constrain hypotheses about the psychological mechanisms underlying such biases. Third, comparisons of closely related species can identify when distinct mechanisms underlie related biases by examining evolutionary dissociations in choice strategies. Finally, comparative work can provide insight into the biological rationality of economically irrational preferences.

  13. Privacy vs security

    CERN Document Server

    Stalla-Bourdillon, Sophie; Ryan, Mark D

    2014-01-01

    Securing privacy in the current environment is one of the great challenges of today's democracies. Privacy vs. Security explores the issues of privacy and security and their complicated interplay, from a legal and a technical point of view. Sophie Stalla-Bourdillon provides a thorough account of the legal underpinnings of the European approach to privacy and examines their implementation through privacy, data protection and data retention laws. Joshua Philips and Mark D. Ryan focus on the technological aspects of privacy, in particular, on today's attacks on privacy by the simple use of today'

  14. Can the obstacles to privacy self-management be overcome? Exploring the consent intermediary approach

    Directory of Open Access Journals (Sweden)

    Tuukka Lehtiniemi

    2017-07-01

    Full Text Available In privacy self-management, people are expected to perform cost–benefit analysis on the use of their personal data, and only consent when their subjective benefits outweigh the costs. However, the ubiquitous collection of personal data and Big Data analytics present increasing challenges to successful privacy management. A number of services and research initiatives have proposed similar solutions to provide people with more control over their data by consolidating consent decisions under a single interface. We have named this the ‘consent intermediary’ approach. In this paper, we first identify the eight obstacles to privacy self-management which make cost–benefit analysis conceptually and practically challenging. We then analyse to what extent consent intermediaries can help overcome the obstacles. We argue that simply bringing consent decisions under one interface offers limited help, but that the potential of this approach lies in leveraging the intermediary position to provide aids for privacy management. We find that with suitable tools, some of the more practical obstacles indeed can become solvable, while others remain fundamentally insuperable within the individuated privacy self-management model. Attention should also be paid to how the consent intermediaries may take advantage of the power vested in the intermediary positions between users and other services.
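
    A minimal sketch of the intermediary position described above: user-defined rules live in one place and each service's access request is answered against them. The rule vocabulary (data category and purpose) and the default-deny behaviour are simplifying assumptions, not the paper's design.

```python
from dataclasses import dataclass, field

@dataclass
class ConsentIntermediary:
    """Holds one user's disclosure rules and answers services' consent requests."""
    rules: dict = field(default_factory=dict)   # (data_category, purpose) -> bool

    def set_rule(self, data_category: str, purpose: str, allow: bool) -> None:
        self.rules[(data_category, purpose)] = allow

    def request(self, service: str, data_category: str, purpose: str) -> bool:
        # Default-deny: anything the user has not explicitly allowed is refused.
        decision = self.rules.get((data_category, purpose), False)
        print(f"{service} asked for {data_category}/{purpose}: {'grant' if decision else 'deny'}")
        return decision

ci = ConsentIntermediary()
ci.set_rule("location", "research", True)
ci.request("fitness-app", "location", "research")     # grant
ci.request("fitness-app", "location", "advertising")  # deny
```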

  15. Privacy transparency patterns

    NARCIS (Netherlands)

    Siljee B.I.J.

    2015-01-01

    This paper describes two privacy patterns for creating privacy transparency: the Personal Data Table pattern and the Privacy Policy Icons pattern, as well as a full overview of privacy transparency patterns. It is a first step in creating a full set of privacy design patterns, which will aid

  16. Simulation Models of Human Decision-Making Processes

    Directory of Open Access Journals (Sweden)

    Nina RIZUN

    2014-10-01

    Full Text Available The main purpose of the paper is to present a new concept of modeling the human decision-making process using an analogy with Automatic Control Theory. From the author's point of view this concept allows the theory of decision-making to be developed and improved in terms of studying and classifying the specificity of human intellectual processes in different conditions. It was shown that the main distinguishing feature between the Heuristic/Intuitive and Rational Decision-Making Models is the presence of a so-called phenomenon of "enrichment" of the input information with human propensities, hobbies, tendencies, expectations, axioms and judgments, presumptions or bias and their justification. In order to obtain additional knowledge about the basic intellectual processes, as well as the possibility of modeling decision results under various parameters characterizing the decision-maker, a complex of simulation models was developed. These models are based on the assumptions that: basic intellectual processes of the Rational Decision-Making Model can be adequately simulated and identified by the transient processes of the proportional-integral-derivative controller; and basic intellectual processes of the Bounded Rationality and Intuitive Models can be adequately simulated and identified by the transient processes of nonlinear elements. A taxonomy of the most typical automatic control theory elements and their correspondence with particular decision-making models was obtained, from the point of view of decision-making process specificity and decision-maker behavior during a certain period of professional activity.
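
    To make the Automatic Control Theory analogy above concrete, the sketch below simulates the step response of a discrete proportional-integral-derivative (PID) controller, the element the paper associates with the Rational Decision-Making Model; the gains, time step, and first-order plant are arbitrary illustrative choices.

```python
def pid_step_response(kp=1.2, ki=0.4, kd=0.05, dt=0.1, steps=100):
    """Discrete PID controller driving a first-order 'plant' toward a unit setpoint."""
    setpoint, y = 1.0, 0.0
    integral, prev_error = 0.0, setpoint
    trajectory = []
    for _ in range(steps):
        error = setpoint - y
        integral += error * dt
        derivative = (error - prev_error) / dt
        u = kp * error + ki * integral + kd * derivative   # controller output
        y += dt * (-y + u)                                 # simple first-order plant
        prev_error = error
        trajectory.append(y)
    return trajectory

print(pid_step_response()[-1])   # settles near the setpoint of 1.0
```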

  17. HIPAA privacy regulations: practical information for physicians.

    Science.gov (United States)

    McMahon, E B; Lee-Huber, T

    2001-07-01

    After much debate and controversy, the Bush administration announced on April 12, 2001, that it would implement the Health Insurance Portability and Accountability Act (HIPAA) privacy regulations issued by the Clinton administration in December of 2000. The privacy regulations became effective on April 14, 2001. Although the regulations are considered final, the Secretary of the Department of Health and Human Services has the power to modify the regulations at any time during the first year of implementation. These regulations affect how a patient's health information is used and disclosed, as well as how patients are informed of their privacy rights. As "covered entities," physicians have until April 14, 2003, to comply fully with the HIPAA privacy regulations, which are more than 1,500 pages in length. This article presents a basic overview of the new and complex regulations and highlights practical information about physicians' compliance with the regulations. However, this summary of the HIPAA privacy regulations should not be construed as legal advice or an opinion on specific situations. Please consult an attorney concerning your compliance with HIPAA and the regulations promulgated thereunder.

  18. Privacy Awareness: A Means to Solve the Privacy Paradox?

    Science.gov (United States)

    Pötzsch, Stefanie

    People are limited in their resources, i.e. they have limited memory capabilities, cannot pay attention to too many things at the same time, and forget much information after a while; computers do not suffer from these limitations. Thus, revealing personal data in electronic communication environments and being completely unaware of the impact of privacy might cause a lot of privacy issues later. Even if people are privacy aware in general, the so-called privacy paradox shows that they do not behave according to their stated attitudes. This paper discusses explanations for the existing dichotomy between the intentions of people towards disclosure of personal data and their behaviour. We present requirements on tools for privacy-awareness support in order to counteract the privacy paradox.

  19. Privacy Policy

    Science.gov (United States)

    NLM Privacy Policy page (https://medlineplus.gov/privacy.html), including instructions for opting out of cookies in the most popular browsers (http://www.usa.gov/optout_instructions.shtml).

  20. Designing Privacy Notices: Supporting User Understanding and Control

    Science.gov (United States)

    Kelley, Patrick Gage

    2013-01-01

    Users are increasingly expected to manage complex privacy settings in their normal online interactions. From shopping to social networks, users make decisions about sharing their personal information with corporations and contacts, frequently with little assistance. Current solutions require consumers to read long documents or go out of their way…

  1. Concentrated Differential Privacy

    OpenAIRE

    Dwork, Cynthia; Rothblum, Guy N.

    2016-01-01

    We introduce Concentrated Differential Privacy, a relaxation of Differential Privacy enjoying better accuracy than both pure differential privacy and its popular "(epsilon,delta)" relaxation without compromising on cumulative privacy loss over multiple computations.
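
    Concentrated differential privacy itself is an accounting framework rather than a single algorithm; as general background (not the paper's construction), the sketch below shows the standard Gaussian mechanism whose per-query privacy loss such frameworks track across repeated computations. The parameter choices are illustrative.

```python
import numpy as np

def gaussian_mechanism(true_value: float, sensitivity: float, sigma: float,
                       rng=np.random.default_rng()) -> float:
    """Release a query answer with Gaussian noise scaled to its sensitivity;
    repeating such releases accumulates privacy loss, which CDP-style
    accounting tracks more tightly than naive composition."""
    return true_value + rng.normal(0.0, sensitivity * sigma)

# Example: a counting query (sensitivity 1) released 5 times.
count = 42
noisy = [gaussian_mechanism(count, sensitivity=1.0, sigma=4.0) for _ in range(5)]
print(noisy)
```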

  2. Youth, Privacy and Online Media: Framing the right to privacy in public policy-making

    DEFF Research Database (Denmark)

    Hasselbalch, Gry; Jørgensen, Rikke Frank

    2015-01-01

    The right to privacy is a fundamental human right defined in international and regional human rights instruments. As such it has been included as a core component of key legislature and policy proceedings throughout the brief history of the World Wide Web. While it is generally recognized in public policy making that the right to privacy is challenged in new ways in a structurally transformed online public sphere, the way in which it has been framed does not seem to acknowledge this transformation. This paper therefore argues for a reformulation of “online privacy” in the current global policy debate. It presents the results of a qualitative study amongst 68 Danish high school students concerning how they perceive, negotiate and control their private sphere when using social media, and builds a case for utilizing the results of studies such as this to inform the ongoing policy discourses concerning...

  3. Modeling Human Elements of Decision-Making

    Science.gov (United States)

    2002-06-01

    include factors such as personality, emotion, and level of expertise, which vary from individual to individual. The process of decision-making during... rational choice theories such as utility theory, to more descriptive psychological models that focus more on the process of decision-making ...descriptive nature, they provide a more realistic representation of human decision-making than the rationally based models. However these models do

  4. The social life of genes: privacy, property and the new genetics.

    Science.gov (United States)

    Everett, Margaret

    2003-01-01

    With the advent of the Human Genome Project and widespread fears over human cloning and medical privacy, a number of states have moved to protect genetic privacy. Oregon's unique Genetic Privacy Act of 1995, which declared that an individual had property rights to their DNA, has provoked national and international interest and controversy. This paper critically reviews the literature on genetic privacy and gene patenting from law, philosophy, science and anthropology. The debate in Oregon, from 1995 to 2001, illustrates many of the key issues in this emerging area. Both sides of the debate invoke the property metaphor, reinforcing deterministic assumptions and avoiding more fundamental questions about the integrity of the body and self-identity. The anthropological critique of the commodification of the body, and the concept of 'embodiment' are useful in analyzing the debate over DNA as property.

  5. Privacy Policies

    NARCIS (Netherlands)

    Dekker, M.A.C.; Etalle, Sandro; den Hartog, Jeremy; Petkovic, M.; Jonker, W.; Jonker, Willem

    2007-01-01

    Privacy is a prime concern in today's information society. To protect the privacy of individuals, enterprises must follow certain privacy practices, while collecting or processing personal data. In this chapter we look at the setting where an enterprise collects private data on its website,

  6. Human Decision-Making under Limited Time

    OpenAIRE

    Ortega, Pedro A.; Stocker, Alan A.

    2016-01-01

    Subjective expected utility theory assumes that decision-makers possess unlimited computational resources to reason about their choices; however, virtually all decisions in everyday life are made under resource constraints - i.e. decision-makers are bounded in their rationality. Here we experimentally tested the predictions made by a formalization of bounded rationality based on ideas from statistical mechanics and information theory. We systematically tested human subjects in their ability t...
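
    One common information-theoretic formalization of bounded rationality (offered here as general background, not necessarily the exact model tested in the paper) trades expected utility against the cost of deviating from a prior choice distribution, yielding a softmax choice rule with a resource parameter beta. The utilities and prior below are invented for illustration.

```python
import numpy as np

def bounded_rational_choice(utilities, prior, beta):
    """p(a) proportional to prior(a) * exp(beta * U(a)); beta -> 0 reproduces the
    prior, large beta approaches the expected-utility maximizer."""
    u = np.asarray(utilities, dtype=float)
    p0 = np.asarray(prior, dtype=float)
    logits = np.log(p0) + beta * u
    logits -= logits.max()                 # numerical stability
    p = np.exp(logits)
    return p / p.sum()

utilities = [1.0, 0.5, 0.0]
prior = [1 / 3, 1 / 3, 1 / 3]
for beta in (0.0, 1.0, 10.0):
    print(beta, bounded_rational_choice(utilities, prior, beta))
```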

  7. Privacy policies

    NARCIS (Netherlands)

    Dekker, M.A.C.; Etalle, S.; Hartog, den J.I.; Petkovic, M.; Jonker, W.

    2007-01-01

    Privacy is a prime concern in today’s information society. To protect the privacy of individuals, enterprises must follow certain privacy practices while collecting or processing personal data. In this chapter we look at the setting where an enterprise collects private data on its website, processes

  8. The human factor: behavioral and neural correlates of humanized perception in moral decision making.

    Science.gov (United States)

    Majdandžić, Jasminka; Bauer, Herbert; Windischberger, Christian; Moser, Ewald; Engl, Elisabeth; Lamm, Claus

    2012-01-01

    The extent to which people regard others as full-blown individuals with mental states ("humanization") seems crucial for their prosocial motivation towards them. Previous research has shown that decisions about moral dilemmas in which one person can be sacrificed to save multiple others do not consistently follow utilitarian principles. We hypothesized that this behavior can be explained by the potential victim's perceived humanness and an ensuing increase in vicarious emotions and emotional conflict during decision making. Using fMRI, we assessed neural activity underlying moral decisions that affected fictitious persons that had or had not been experimentally humanized. In implicit priming trials, participants either engaged in mentalizing about these persons (Humanized condition) or not (Neutral condition). In subsequent moral dilemmas, participants had to decide about sacrificing these persons' lives in order to save the lives of numerous others. Humanized persons were sacrificed less often, and the activation pattern during decisions about them indicated increased negative affect, emotional conflict, vicarious emotions, and behavioral control (pgACC/mOFC, anterior insula/IFG, aMCC and precuneus/PCC). Besides, we found enhanced effective connectivity between aMCC and anterior insula, which suggests increased emotion regulation during decisions affecting humanized victims. These findings highlight the importance of others' perceived humanness for prosocial behavior - with aversive affect and other-related concern when imagining harming more "human-like" persons acting against purely utilitarian decisions.

  9. The human factor: behavioral and neural correlates of humanized perception in moral decision making.

    Directory of Open Access Journals (Sweden)

    Jasminka Majdandžić

    Full Text Available The extent to which people regard others as full-blown individuals with mental states ("humanization") seems crucial for their prosocial motivation towards them. Previous research has shown that decisions about moral dilemmas in which one person can be sacrificed to save multiple others do not consistently follow utilitarian principles. We hypothesized that this behavior can be explained by the potential victim's perceived humanness and an ensuing increase in vicarious emotions and emotional conflict during decision making. Using fMRI, we assessed neural activity underlying moral decisions that affected fictitious persons that had or had not been experimentally humanized. In implicit priming trials, participants either engaged in mentalizing about these persons (Humanized condition) or not (Neutral condition). In subsequent moral dilemmas, participants had to decide about sacrificing these persons' lives in order to save the lives of numerous others. Humanized persons were sacrificed less often, and the activation pattern during decisions about them indicated increased negative affect, emotional conflict, vicarious emotions, and behavioral control (pgACC/mOFC, anterior insula/IFG, aMCC and precuneus/PCC). Besides, we found enhanced effective connectivity between aMCC and anterior insula, which suggests increased emotion regulation during decisions affecting humanized victims. These findings highlight the importance of others' perceived humanness for prosocial behavior - with aversive affect and other-related concern when imagining harming more "human-like" persons acting against purely utilitarian decisions.

  10. Respecting an Individual's Privacy Critical in Making Difficult Journalistic Judgments.

    Science.gov (United States)

    Adams, Julian

    1984-01-01

    Discusses the criteria of "newsworthy" regarding news reporting and the right to privacy. Examines the thin line between what is legal and what is ethical to print and some components of the law to consider when making such decisions. (HTH)

  11. 76 FR 64115 - Privacy Act of 1974; Privacy Act System of Records

    Science.gov (United States)

    2011-10-17

    ... NATIONAL AERONAUTICS AND SPACE ADMINISTRATION [Notice (11-092)] Privacy Act of 1974; Privacy Act... retirement of one Privacy Act system of records notice. SUMMARY: In accordance with the Privacy Act of 1974, NASA is giving notice that it proposes to cancel the following Privacy Act system of records notice...

  12. Genetic privacy.

    Science.gov (United States)

    Sankar, Pamela

    2003-01-01

    During the past 10 years, the number of genetic tests performed more than tripled, and public concern about genetic privacy emerged. The majority of states and the U.S. government have passed regulations protecting genetic information. However, research has shown that concerns about genetic privacy are disproportionate to known instances of information misuse. Beliefs in genetic determinacy explain some of the heightened concern about genetic privacy. Discussion of the debate over genetic testing within families illustrates the most recent response to genetic privacy concerns.

  13. Privacy preserving processing of genomic data: A survey.

    Science.gov (United States)

    Akgün, Mete; Bayrak, A Osman; Ozer, Bugra; Sağıroğlu, M Şamil

    2015-08-01

    Recently, the rapid advance in genome sequencing technology has led to the production of huge amounts of sensitive genomic data. However, a serious privacy challenge is confronted with the increasing number of genetic tests, as genomic data is the ultimate source of identity for humans. Lately, privacy threats and possible solutions regarding undesired access to genomic data have been discussed; however, it is challenging to apply the proposed solutions to real-life problems due to the complex nature of security definitions. In this review, we have categorized pre-existing problems and corresponding solutions in a more understandable and convenient way. Additionally, we have also included open privacy problems coming with each genomic data processing procedure. We believe our classification of genome-associated privacy problems will pave the way for linking real-life problems with previously proposed methods. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. 78 FR 69076 - Privacy Act of 1974; System of Records

    Science.gov (United States)

    2013-11-18

    ... Medical Human Resources System internet (DMHRSi). DHA 12 EDHA 12 Third Party Collection System. DHA 16 DoD... DEPARTMENT OF DEFENSE Office of the Secretary [Docket ID: DoD-2013-OS-0216] Privacy Act of 1974... Defense Health Agency's compilation of Privacy Act SORNS. The realignment of the nineteen system...

  15. Crowdsourcing for Context: Regarding Privacy in Beacon Encounters via Contextual Integrity

    Directory of Open Access Journals (Sweden)

    Bello-Ogunu Emmanuel

    2016-07-01

    Full Text Available Research shows that context is important to the privacy perceptions associated with technology. With Bluetooth Low Energy beacons, one of the latest technologies for providing proximity and indoor tracking, the current identifiers that characterize a beacon are not sufficient for ordinary users to make informed privacy decisions about the location information that could be shared. One solution would be to have standardized category and privacy labels, produced by beacon providers or an independent third party. An alternative solution is to find an approach driven by users, for users. In this paper, we propose a novel crowdsourcing based approach to introduce elements of context in beacon encounters. We demonstrate the effectiveness of this approach through a user study, where participants use a crowd-based mobile app designed to collect beacon category and privacy information as a scavenger hunt game. Results show that our approach was effective in helping users label beacons according to the specific context of a given beacon encounter, as well as the privacy perceptions associated with it. This labeling was done with an accuracy of 92%, and with an acceptance rate of 82% of all recommended crowd labels. Lastly, we conclusively show how crowdsourcing for context can be used towards a user-centric framework for privacy management during beacon encounters.

  16. Conundrums with penumbras: the right to privacy encompasses non-gamete providers who create preembryos with the intent to become parents.

    Science.gov (United States)

    Dillon, Lainie M C

    2003-05-01

    To date, five state high courts have resolved disputes over frozen preembryos. These disputes arose during divorce proceedings between couples who had previously used assisted reproduction and cryopreserved excess preembryos. In each case, one spouse wished to have the preembryos destroyed, while the other wanted to be able to use or donate them in the future. The parties in these cases invoked the constitutional right to privacy to argue for dispositional control over the preembryos; two of the five cases were resolved by relying on this right. The constitutional right to privacy protects intimate decisions involving procreation, marriage, and family life. However, when couples use donated sperm or ova to create preembryos, a unique circumstance arises: one spouse--the gamete provider--is genetically related to the preembryos and the other is not. If courts resolve frozen preembryo disputes that involve non-gamete providers based on the constitutional right to privacy, they should find that the constitutional right to privacy encompasses the interests of both gamete and non-gamete providers. Individuals who create preembryos with the intent to become a parent have made an intimate decision involving procreation, marriage, and family life that falls squarely within the right to privacy. In such cases, the couple together made the decision to create a family through the use of assisted reproduction, and the preembryos would not exist but for that joint decision. Therefore, gamete and non-gamete providers should be afforded equal constitutional protection in disputes over frozen preembryos.

  17. Human Decision Processes: Implications for SSA Support Tools

    Science.gov (United States)

    Picciano, P.

    2013-09-01

    Despite significant advances in computing power and artificial intelligence (AI), few critical decisions are made without a human decision maker in the loop. Space Situational Awareness (SSA) missions are both critical and complex, typically adhering to the human-in-the-loop (HITL) model. The collection of human operators injects a needed diversity of expert knowledge, experience, and authority required to successfully fulfill SSA tasking. A wealth of literature on human decision making exists citing myriad empirical studies and offering a varied set of prescriptive and descriptive models of judgment and decision making (Hastie & Dawes, 2001; Baron, 2000). Many findings have been proven sufficiently robust to allow information architects or system/interface designers to take action to improve decision processes. For the purpose of discussion, these concepts are bifurcated into two groups: 1) vulnerabilities to mitigate, and 2) capabilities to augment. These vulnerabilities and capabilities refer specifically to the decision process and should not be confused with a shortcoming or skill of a specific human operator. Thus the framing of questions and orders, the automated tools with which to collaborate, priming and contextual data, and the delivery of information all play a critical role in human judgment and choice. Evaluating the merits of any decision can be elusive; in order to constrain this discussion, 'rational choice' will tend toward the economic model characteristics such as maximizing utility and selection consistency (e.g., if A preferred to B, and B preferred to C, then A should be preferred to C). Simple decision models often encourage one to list the pros and cons of a decision, perhaps use a weighting schema, but one way or another weigh the future benefit (or harm) of making a selection. The result (sought by the rationalist models) should drive toward higher utility. Despite notable differences in researchers' theses (to be discussed in the full

  18. Privacy og selvbeskrivelse

    DEFF Research Database (Denmark)

    Rosengaard, Hans Ulrik

    2015-01-01

    A description of the field of research on privacy, with particular focus on the significance of privacy for the possibility of controlling one's own self-description.

  19. Computational Complexity and Human Decision-Making.

    Science.gov (United States)

    Bossaerts, Peter; Murawski, Carsten

    2017-12-01

    The rationality principle postulates that decision-makers always choose the best action available to them. It underlies most modern theories of decision-making. The principle does not take into account the difficulty of finding the best option. Here, we propose that computational complexity theory (CCT) provides a framework for defining and quantifying the difficulty of decisions. We review evidence showing that human decision-making is affected by computational complexity. Building on this evidence, we argue that most models of decision-making, and metacognition, are intractable from a computational perspective. To be plausible, future theories of decision-making will need to take into account both the resources required for implementing the computations implied by the theory, and the resource constraints imposed on the decision-maker by biology. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Couldn't or wouldn't? The influence of privacy concerns and self-efficacy in privacy management on privacy protection.

    Science.gov (United States)

    Chen, Hsuan-Ting; Chen, Wenghong

    2015-01-01

    Sampling 515 college students, this study investigates how privacy protection, including profile visibility, self-disclosure, and friending, are influenced by privacy concerns and efficacy regarding one's own ability to manage privacy settings, a factor that researchers have yet to give a great deal of attention to in the context of social networking sites (SNSs). The results of this study indicate an inconsistency in adopting strategies to protect privacy, a disconnect from limiting profile visibility and friending to self-disclosure. More specifically, privacy concerns lead SNS users to limit their profile visibility and discourage them from expanding their network. However, they do not constrain self-disclosure. Similarly, while self-efficacy in privacy management encourages SNS users to limit their profile visibility, it facilitates self-disclosure. This suggests that if users are limiting their profile visibility and constraining their friending behaviors, it does not necessarily mean they will reduce self-disclosure on SNSs because these behaviors are predicted by different factors. In addition, the study finds an interaction effect between privacy concerns and self-efficacy in privacy management on friending. It points to the potential problem of increased risk-taking behaviors resulting from high self-efficacy in privacy management and low privacy concerns.

  1. Semantic Security: Privacy Definitions Revisited

    OpenAIRE

    Jinfei Liu; Li Xiong; Jun Luo

    2013-01-01

    In this paper we illustrate a privacy framework named Indistinguishabley Privacy. Indistinguishable privacy could be deemed as the formalization of the existing privacy definitions in privacy preserving data publishing as well as secure multi-party computation. We introduce three representative privacy notions in the literature, Bayes-optimal privacy for privacy preserving data publishing, differential privacy for statistical data release, and privacy w.r.t. semi-honest behavior in the secure...

  2. Cyberbullying: Should Schools choose between Safety and Privacy?

    Directory of Open Access Journals (Sweden)

    Michael Laubscher

    2015-12-01

    Full Text Available In this theoretical article, we explore the tangled messiness of the application of human rights versus the 21st-century monster called "cyberbullying" in schools and focus on some of the challenges schools face daily. The research will reveal that cyberbullying victims were almost twice as likely to attempt suicide as youth who had not experienced cyberbullying, which implies that this is a phenomenon schools ought not to take lightly. We argue that everyone has a right to the freedom of expression, including in cyberspace, and begin by exploring how legal principles evolved in an attempt to deal with the limitations placed on an individual's right to freedom of expression. As we are about to reveal, though, matters become even more complicated when this freedom of expression relates to cyberspace, a space where users might have an expectation of privacy and even enjoy a state of anonymity. Clearly, the right to privacy and the right to freedom of expression need to be balanced and respected should school authorities be called upon to identify and discipline a cyberbully. This balancing act is one that needs to be investigated and carefully expounded upon, and is an issue that has not yet been sufficiently addressed in South Africa. Seeing that countries such as the United States of America and Canada have attempted to deal with this issue, it would be prudent to discuss the strides these countries have made, the challenges they have faced, and the insights they have gained, in an attempt to alert South Africa to the complex issues cyberbullying could raise. Working from this premise, this article will focus on the right to privacy, specifically in relation to Bill C-13 recently passed in Canada and the resultant Canadian Supreme Court decision in the case R v Spencer, a case that shed further light on the issue of privacy in cyberspace. We conclude the discussion by highlighting several potential pitfalls legislation such as Bill C-13 could

  3. Tales from the dark side: Privacy dark strategies and privacy dark patterns

    DEFF Research Database (Denmark)

    Bösch, Christoph; Erb, Benjamin; Kargl, Frank

    2016-01-01

    Privacy strategies and privacy patterns are fundamental concepts of the privacy-by-design engineering approach. While they support a privacy-aware development process for IT systems, the concepts used by malicious, privacy-threatening parties are generally less understood and known. We argue that understanding the "dark side", namely how personal data is abused, is of equal importance. In this paper, we introduce the concept of privacy dark strategies and privacy dark patterns and present a framework that collects, documents, and analyzes such malicious concepts. In addition, we investigate from a psychological perspective why privacy dark strategies are effective. The resulting framework allows for a better understanding of these dark concepts, fosters awareness, and supports the development of countermeasures. We aim to contribute to an easier detection and successive removal of such approaches from...

  4. Workshop--E-leaks: the privacy of health information in the age of electronic information.

    Science.gov (United States)

    Vonn, Michael; Lang, Renée; Perras, Maude

    2011-10-01

    This workshop examined some of the new challenges to health-related privacy emerging as a result of the proliferation of electronic communications and data storage, including through social media, electronic health records and ready access to personal information on the internet. The right to privacy is a human right. As such, protecting privacy and enforcing the duty of confidentiality regarding health information are fundamental to treating people with autonomy, dignity and respect. For people living with HIV, unauthorized disclosure of their status can lead to discrimination and breaches of other human rights. While this is not new, in this information age a new breed of privacy violation is emerging and our legal protections are not necessarily keeping pace.

  5. Emotion-affected decision making in human simulation.

    Science.gov (United States)

    Zhao, Y; Kang, J; Wright, D K

    2006-01-01

    Human modelling is an interdisciplinary research field. The topic, emotion-affected decision making, was originally a cognitive psychology issue, but is now recognized as an important research direction for both computer science and biomedical modelling. The main aim of this paper is to attempt to bridge the gap between psychology and bioengineering in emotion-affected decision making. The work is based on Ortony's theory of emotions and bounded rationality theory, and attempts to connect the emotion process with decision making. A computational emotion model is proposed, and the initial framework of this model in virtual human simulation within the platform of Virtools is presented.

  6. Dissociating sensory from decision processes in human perceptual decision making

    OpenAIRE

    Mostert, Pim; Kok, Peter; de Lange, Floris P.

    2015-01-01

    A key question within systems neuroscience is how the brain translates physical stimulation into a behavioral response: perceptual decision making. To answer this question, it is important to dissociate the neural activity underlying the encoding of sensory information from the activity underlying the subsequent temporal integration into a decision variable. Here, we adopted a decoding approach to empirically assess this dissociation in human magnetoencephalography recordings. We used a funct...

  7. Privacy at end of life in ICU: A review of the literature.

    Science.gov (United States)

    Timmins, Fiona; Parissopoulos, Stelios; Plakas, Sotirios; Naughton, Margaret T; de Vries, Jan Ma; Fouka, Georgia

    2018-06-01

    To explore the issues surrounding privacy during death in ICU. While the provision of ICU care is vital, the nature and effect of the potential lack of privacy during death and dying in ICUs have not been extensively explored. A literature search using CINAHL and Pubmed revealed articles related to privacy, death and dying in ICU. Keywords used in the search were "ICU," "Privacy," "Death" and "Dying." A combination of these terms using Boolean operators "or" or "and" revealed a total of 23 citations. Six papers were ultimately deemed suitable for inclusion in the review and were subjected to code analysis with Atlas.ti v8 QDA software. The analysis of the studies revealed eight themes, and this study presents the three key themes that were found to be recurring and strongly interconnected to the experience of privacy and death in ICU: "Privacy in ICU," "ICU environment" and "End-of-Life Care". Research has shown that patient and family privacy during the ICU hospitalisation, and the provision of the circumstances that lead to an environment of privacy during and after death, remain a significant challenge for ICU nurses. Family members have little or no privacy in shared rooms and cramped waiting rooms, while they wish to be better informed and involved in end-of-life decisions. Hence, death and dying for many patients take place in open and/or shared spaces, which is problematic in terms of both the level of privacy and respect that death ought to afford. It is best if end-of-life care in the ICU is planned and coordinated, where possible. Nurses need to become more self-reflective and aware in relation to end-of-life situations in ICU in order to develop privacy practices that are responsive to family and patient needs. © 2018 John Wiley & Sons Ltd.

  8. The Human Factor: Behavioral and Neural Correlates of Humanized Perception in Moral Decision Making

    Science.gov (United States)

    Majdandžić, Jasminka; Bauer, Herbert; Windischberger, Christian; Moser, Ewald; Engl, Elisabeth; Lamm, Claus

    2012-01-01

    The extent to which people regard others as full-blown individuals with mental states (“humanization”) seems crucial for their prosocial motivation towards them. Previous research has shown that decisions about moral dilemmas in which one person can be sacrificed to save multiple others do not consistently follow utilitarian principles. We hypothesized that this behavior can be explained by the potential victim’s perceived humanness and an ensuing increase in vicarious emotions and emotional conflict during decision making. Using fMRI, we assessed neural activity underlying moral decisions that affected fictitious persons that had or had not been experimentally humanized. In implicit priming trials, participants either engaged in mentalizing about these persons (Humanized condition) or not (Neutral condition). In subsequent moral dilemmas, participants had to decide about sacrificing these persons’ lives in order to save the lives of numerous others. Humanized persons were sacrificed less often, and the activation pattern during decisions about them indicated increased negative affect, emotional conflict, vicarious emotions, and behavioral control (pgACC/mOFC, anterior insula/IFG, aMCC and precuneus/PCC). Besides, we found enhanced effective connectivity between aMCC and anterior insula, which suggests increased emotion regulation during decisions affecting humanized victims. These findings highlight the importance of others’ perceived humanness for prosocial behavior - with aversive affect and other-related concern when imagining harming more “human-like” persons acting against purely utilitarian decisions. PMID:23082194

  9. Assessment of human decision reliability - a case study

    International Nuclear Information System (INIS)

    Pyy, P

    1998-01-01

    In his discussion of this case study, the author indicates that human beings are not merely machines who use rules. Thus, more focus needs to be put on studying decision making situations and their contexts. Decision theory (both normative and descriptive) and contextual psychological approaches may offer tools to cope with operator decision making. Further an ideal decision space needs to be defined for operators. The case study specifically addressed a loss of feedwater scenario and the various operator decisions that were involved in that scenario. It was concluded from this particular study that there are significant differences in the crew decision behaviours that are not explained by process variables. Through use of evidence from simulator tests with expert judgement, an approach to estimate probabilities has been developed. The modelling approach presented in this discussion is an extension of current HRA paradigms, but a natural one since all human beings make decisions

  10. 75 FR 81205 - Privacy Act: Revision of Privacy Act Systems of Records

    Science.gov (United States)

    2010-12-27

    ... DEPARTMENT OF AGRICULTURE Office of the Secretary Privacy Act: Revision of Privacy Act Systems of Records AGENCY: Office of the Secretary, USDA. ACTION: Notice to Revise Privacy Act Systems of Records... two Privacy Act Systems of Records entitled ``Information on Persons Disqualified from the...

  11. 78 FR 40515 - Privacy Act of 1974; Privacy Act System of Records

    Science.gov (United States)

    2013-07-05

    ... NATIONAL AERONAUTICS AND SPACE ADMINISTRATION [Notice 13-071] Privacy Act of 1974; Privacy Act System of Records AGENCY: National Aeronautics and Space Administration (NASA). ACTION: Notice of Privacy Act system of records. SUMMARY: Each Federal agency is required by the Privacy Act of 1974 to publish...

  12. Privacy, autonomy, and public policy: French and North American perspectives.

    Science.gov (United States)

    Merchant, Jennifer

    2016-12-01

    This article raises the question of whether, in both the United States and in France, an individual's autonomy and private decision-making right(s) in matters of health care and access to reproductive technologies can be conciliated with the general interest, and more specifically, the role of the State. Can a full-fledged right to privacy, the ability to exercise one's autonomy, exist alongside the general interest, and depend neither on financial resources, as in the United States, nor on centralised government decisions or the medical hierarchy, as in France? The contrast between these two modern democracies justifies the importance of comparing them. I will demonstrate that overlaps do exist: the free exercise of religion and opinion, freedom of expression, the inherent value of each individual. What differs, however, are the institutions and how they provide, protect, promote, or frame access to and expressions of these democratic principles. The impact of the global economy, the exposure of people around the world to each other via the internet, and the mirror effects of social media, blogs, and other such forums, have created new perspectives that countries project onto one another. For example, does France now seem to tout 'autonomy' as a new and important value because it appears to be an 'American success story'? Does the United States now seem to value human rights and a social-democratic approach because of the 'French model'? There seems to be some truth behind these assertions, but as this article will demonstrate, the portrayals of what the 'right to privacy' is in the United States and what 'socialised medicine' is in France are not necessarily fully accurate.

  13. 78 FR 77503 - Privacy Act of 1974; Privacy Act System of Records

    Science.gov (United States)

    2013-12-23

    ... NATIONAL AERONAUTICS AND SPACE ADMINISTRATION [Notice 13-149] Privacy Act of 1974; Privacy Act... proposed revisions to existing Privacy Act systems of records. SUMMARY: Pursuant to the provisions of the Privacy Act of 1974 (5 U.S.C. 552a), the National Aeronautics and Space Administration is issuing public...

  14. Privacy Preserving Association Rule Mining Revisited: Privacy Enhancement and Resources Efficiency

    Science.gov (United States)

    Mohaisen, Abedelaziz; Jho, Nam-Su; Hong, Dowon; Nyang, Daehun

    Privacy preserving association rule mining algorithms have been designed for discovering the relations between variables in data while maintaining the data privacy. In this article we revise one of the recently introduced schemes for association rule mining using fake transactions (FS). In particular, our analysis shows that the FS scheme has exhaustive storage and high computation requirements for guaranteeing a reasonable level of privacy. We introduce a realistic definition of privacy that benefits from the average case privacy and motivates the study of a weakness in the structure of FS by fake transactions filtering. In order to overcome this problem, we improve the FS scheme by presenting a hybrid scheme that considers both privacy and resources as two concurrent guidelines. Analytical and empirical results show the efficiency and applicability of our proposed scheme.

  15. 76 FR 67763 - Privacy Act of 1974; Privacy Act System of Records

    Science.gov (United States)

    2011-11-02

    ... NATIONAL AERONAUTICS AND SPACE ADMINISTRATION [Notice (11-109)] Privacy Act of 1974; Privacy Act... proposed revisions to an existing Privacy Act system of records. SUMMARY: Pursuant to the provisions of the Privacy Act of 1974 (5 U.S.C. 552a), the National Aeronautics and Space Administration is issuing public...

  16. 76 FR 64114 - Privacy Act of 1974; Privacy Act System of Records

    Science.gov (United States)

    2011-10-17

    ... NATIONAL AERONAUTICS AND SPACE ADMINISTRATION [Notice (11-093)] Privacy Act of 1974; Privacy Act... proposed revisions to an existing Privacy Act system of records. SUMMARY: Pursuant to the provisions of the Privacy Act of 1974 (5 U.S.C. 552a), the National Aeronautics and Space Administration is issuing public...

  17. 77 FR 69898 - Privacy Act of 1974; Privacy Act System of Records

    Science.gov (United States)

    2012-11-21

    ... NATIONAL AERONAUTICS AND SPACE ADMINISTRATION [Notice 12-100] Privacy Act of 1974; Privacy Act... proposed revisions to an existing Privacy Act system of records. SUMMARY: Pursuant to the provisions of the Privacy Act of 1974 (5 U.S.C. 552a), the National Aeronautics and Space Administration is issuing public...

  18. Privacy and Innovation

    OpenAIRE

    Avi Goldfarb; Catherine Tucker

    2011-01-01

    Information and communication technology now enables firms to collect detailed and potentially intrusive data about their customers both easily and cheaply. This means that privacy concerns are no longer limited to government surveillance and public figures' private lives. The empirical literature on privacy regulation shows that privacy regulation may affect the extent and direction of data-based innovation. We also show that the impact of privacy regulation can be extremely heterogeneous. T...

  19. Privacy-preserving techniques of genomic data-a survey.

    Science.gov (United States)

    Aziz, Md Momin Al; Sadat, Md Nazmus; Alhadidi, Dima; Wang, Shuang; Jiang, Xiaoqian; Brown, Cheryl L; Mohammed, Noman

    2017-11-07

    Genomic data hold salient information about the characteristics of a living organism. Throughout the past decade, pinnacle developments have given us more accurate and inexpensive methods to retrieve genome sequences of humans. However, with the advancement of genomic research, there is a growing privacy concern regarding the collection, storage and analysis of such sensitive human data. Recent results show that given some background information, it is possible for an adversary to reidentify an individual from a specific genomic data set. This can reveal the current association or future susceptibility of some diseases for that individual (and sometimes the kinship between individuals), resulting in a privacy violation. Regardless of these risks, our genomic data remain of great importance for analyzing our well-being and that of future generations. Thus, in this article, we discuss the different privacy and security-related problems revolving around human genomic data. In addition, we will explore some of the cardinal cryptographic concepts, which can bring efficacy to secure and private genomic data computation. This article relates the gaps between these two research areas: Cryptography and Genomics. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
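
    As one illustration of the kind of cryptographic concept such surveys cover, additive secret sharing lets two non-colluding parties jointly hold and aggregate a sensitive value, for example a carrier count for a genetic variant, without either party ever seeing it in the clear. The sketch below is a generic textbook construction offered for orientation only; the modulus, function names and the genomic framing are assumptions of this example, not material from the article.

```python
import secrets

MODULUS = 2**61 - 1  # a large prime; all arithmetic is done modulo this value

def share(value):
    """Split an integer into two additive shares: value = (s1 + s2) mod p."""
    s1 = secrets.randbelow(MODULUS)
    s2 = (value - s1) % MODULUS
    return s1, s2

def add_shared(a_shares, b_shares):
    """Each party adds its own shares locally; the sum stays secret-shared."""
    return ((a_shares[0] + b_shares[0]) % MODULUS,
            (a_shares[1] + b_shares[1]) % MODULUS)

def reconstruct(shares):
    """Combine both shares to reveal only the final aggregate."""
    return (shares[0] + shares[1]) % MODULUS

# Toy usage: two cohorts contribute carrier counts without revealing them.
a = share(17)
b = share(25)
total = add_shared(a, b)
print(reconstruct(total))  # 42
```

    Only the reconstructed aggregate is ever revealed, which is the general flavour of the secure-computation ideas the survey relates to genomics.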

  20. Privacy in Digital Age: Dead or Alive?! Regarding the New EU Data Protection Regulations

    Directory of Open Access Journals (Sweden)

    Seyed Ebrahim Dorraji

    2015-02-01

    Full Text Available Purpose – To review and critically discuss the current state of privacy in the context of constant technological changes and to emphasize the pace of technological advancements and developments reached over the time when the last EU data protection laws came into effect. These facts inevitably affect the perception of privacy and raise the question of whether privacy is dead or taking its last breath in the digital age. This paper is an attempt to address this question. Design/Methodology/Approach – Based on the comparison and systematic analysis of scientific literature, the authors discuss problematic issues related to privacy and data protection in the technology era – where these issues are too complicated to be clearly regulated by laws and rules since “laws move as a function of years and technology moves as a function of months” (Ron Rivest). Therefore, this analytical approach towards the issue may help to facilitate reaching the best-fit decision in this area. Findings – The authors emphasize the change in the perception of privacy, which originated and grew on the idea of “an integral part of our humanity”, the “heart of our liberty” and “the beginning of all freedoms” (Solove, 2008), leading to the recently raised idea that privacy is under severe threat. The authors are of the opinion that legislation and regulation may be one of the best and most effective techniques for protecting privacy in the twenty-first century, but it is not currently adequate (Wacks, 2012). One of the solutions lies in technology design. Research limitations/implications – The aspects of privacy and data protection in the European Union have been widely discussed recently because of their broad applicability. Therefore, it is hardly possible to review and cover all the important aspects of the issue. This article focuses on the roles of technology and legislation in securing privacy. The authors examine and provide their own views based on

  1. Dissociating sensory from decision processes in human perceptual decision making.

    Science.gov (United States)

    Mostert, Pim; Kok, Peter; de Lange, Floris P

    2015-12-15

    A key question within systems neuroscience is how the brain translates physical stimulation into a behavioral response: perceptual decision making. To answer this question, it is important to dissociate the neural activity underlying the encoding of sensory information from the activity underlying the subsequent temporal integration into a decision variable. Here, we adopted a decoding approach to empirically assess this dissociation in human magnetoencephalography recordings. We used a functional localizer to identify the neural signature that reflects sensory-specific processes, and subsequently traced this signature while subjects were engaged in a perceptual decision making task. Our results revealed a temporal dissociation in which sensory processing was limited to an early time window and consistent with occipital areas, whereas decision-related processing became increasingly pronounced over time, and involved parietal and frontal areas. We found that the sensory processing accurately reflected the physical stimulus, irrespective of the eventual decision. Moreover, the sensory representation was stable and maintained over time when it was required for a subsequent decision, but unstable and variable over time when it was task-irrelevant. In contrast, decision-related activity displayed long-lasting sustained components. Together, our approach dissects neuro-anatomically and functionally distinct contributions to perceptual decisions.

  2. Dissociating sensory from decision processes in human perceptual decision making

    Science.gov (United States)

    Mostert, Pim; Kok, Peter; de Lange, Floris P.

    2015-01-01

    A key question within systems neuroscience is how the brain translates physical stimulation into a behavioral response: perceptual decision making. To answer this question, it is important to dissociate the neural activity underlying the encoding of sensory information from the activity underlying the subsequent temporal integration into a decision variable. Here, we adopted a decoding approach to empirically assess this dissociation in human magnetoencephalography recordings. We used a functional localizer to identify the neural signature that reflects sensory-specific processes, and subsequently traced this signature while subjects were engaged in a perceptual decision making task. Our results revealed a temporal dissociation in which sensory processing was limited to an early time window and consistent with occipital areas, whereas decision-related processing became increasingly pronounced over time, and involved parietal and frontal areas. We found that the sensory processing accurately reflected the physical stimulus, irrespective of the eventual decision. Moreover, the sensory representation was stable and maintained over time when it was required for a subsequent decision, but unstable and variable over time when it was task-irrelevant. In contrast, decision-related activity displayed long-lasting sustained components. Together, our approach dissects neuro-anatomically and functionally distinct contributions to perceptual decisions. PMID:26666393

  3. Privacy versus autonomy: a tradeoff model for smart home monitoring technologies.

    Science.gov (United States)

    Townsend, Daphne; Knoefel, Frank; Goubran, Rafik

    2011-01-01

    Smart homes are proposed as a new location for the delivery of healthcare services. They provide healthcare monitoring and communication services by using integrated sensor network technologies. We validate a hypothesis regarding older adults' adoption of home monitoring technologies by conducting a literature review of articles studying older adults' attitudes and perceptions of sensor technologies. Using current literature to support the hypothesis, this paper applies the tradeoff model to decisions about sensor acceptance. Older adults are willing to trade privacy (by accepting a monitoring technology) for autonomy. As the information captured by the sensor becomes more intrusive and the infringement on privacy increases, sensors are accepted if the loss in privacy is traded for autonomy. Even video cameras, the most intrusive sensor type, were accepted in exchange for the height of autonomy, which is to remain in the home.

  4. The Models of Applying Online Privacy Literacy Strategies: A Case Study of Instagram Girl Users

    OpenAIRE

    Abdollah Bicharanlou; Seyedeh farzaneh Siasi rad

    2017-01-01

    Social networks have a remarkable effect on the lives of virtual space users. These networks, like most human relations, involve a compromise between self-disclosure and privacy protection, a process which is realized through improving privacy and empowering the user at the personal level. This study aimed to assess strategies based on online privacy literacy, in particular the strategies that young girl users of Instagram should employ to achieve the optimum level of privacy. For this purpose, firstly the...

  5. Internet and Privacy

    OpenAIRE

    Al-Fadhli, Meshal Shehab

    2007-01-01

    The concept of privacy is hard to understand and is not easy to define, because this concept is linked with several dimensions. Internet privacy is associated with the use of the Internet and is most often placed under communications privacy, involving Internet users' personal information and activities and the disclosure of them online. This essay is going to present the meaning of privacy and the implications of it for Internet users. Also, this essay will demonstrate some of t...

  6. An approach for assessing human decision reliability

    International Nuclear Information System (INIS)

    Pyy, P.

    2000-01-01

    This paper presents a method to study human reliability in decision situations related to nuclear power plant disturbances. Decisions often play a significant role in the handling of emergency situations. The method may be applied to probabilistic safety assessments (PSAs) in cases where decision making is an important dimension of an accident sequence. Such situations are frequent e.g. in accident management. In this paper, a modelling approach for decision reliability studies is first proposed. Then, a case study with two decision situations with relatively different characteristics is presented. Qualitative and quantitative findings of the study are discussed. In very simple decision cases with time pressure, time reliability correlation proved to be a feasible reliability modelling method. In all other decision situations, more advanced probabilistic decision models have to be used. Finally, decision probability assessment by using simulator run results and expert judgement is presented.
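
    For the simple time-pressure cases mentioned above, a time reliability correlation (TRC) expresses the probability that a crew has not yet made the required decision as a function of the time available. The snippet below is a generic, illustrative lognormal non-response curve; the median time, the error factor, and the convention sigma = ln(error_factor)/1.645 (treating the error factor as the ratio of the 95th percentile to the median) are assumed parameters, not values from the case study.

```python
from math import log
from statistics import NormalDist

def trc_non_response_prob(t_minutes, median=10.0, error_factor=3.0):
    """Generic lognormal time reliability correlation (illustrative only).

    Returns P(decision not yet made by time t): the survival function of a
    lognormal distribution with the given median and error factor.
    """
    sigma = log(error_factor) / 1.645          # assumed 95th-percentile convention
    z = (log(t_minutes) - log(median)) / sigma
    return 1.0 - NormalDist().cdf(z)

# Non-response probability shrinks as more time becomes available.
for t in (5, 10, 20, 40):
    print(t, round(trc_non_response_prob(t), 3))
```

    More complex decision situations, as the abstract notes, call for richer probabilistic decision models than a single time-based curve.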

  7. 76 FR 64112 - Privacy Act of 1974; Privacy Act System of Records Appendices

    Science.gov (United States)

    2011-10-17

    ... NATIONAL AERONAUTICS AND SPACE ADMINISTRATION [Notice (11-091)] Privacy Act of 1974; Privacy Act...: Revisions of NASA Appendices to Privacy Act System of Records. SUMMARY: Notice is hereby given that NASA is... Privacy Act of 1974. This notice publishes those amendments as set forth below under the caption...

  8. Privacy encounters in Teledialogue

    DEFF Research Database (Denmark)

    Andersen, Lars Bo; Bøge, Ask Risom; Danholt, Peter

    2017-01-01

    Privacy is a major concern when new technologies are introduced between public authorities and private citizens. What is meant by privacy, however, is often unclear and contested. Accordingly, this article utilises grounded theory to study privacy empirically in the research and design project Teledialogue, aimed at introducing new ways for public case managers and placed children to communicate through IT. The resulting argument is that privacy can be understood as an encounter, that is, as something that arises between implicated actors and entails some degree of friction and negotiation, an argument which is further qualified through the philosophy of Gilles Deleuze. The article opens with a review of privacy literature before continuing to present privacy as an encounter with five different foci: what technologies bring into the encounter; who is related to privacy by implication; what...

  9. Human monitoring and decision-making in man/machine systems

    International Nuclear Information System (INIS)

    Johannsen, G.

    1979-01-01

    Monitoring and decision-making together characterize very well the role of the human operator in highly automated systems. In this report, the analysis of human monitoring and decision-making behavior as well as its modeling are described. The goal is to present a survey. 'Classic' and optimal control theoretic monitoring models are dealt with. The relationship between attention allocation and eye movements is discussed. As an example of applications, the evaluation of predictor displays by means of the optimal control model is explained. Fault detection in continuous signals and decision-making behavior of the human operator in fault diagnosis during different operation and maintenance situations are illustrated. Computer-aided decision-making is considered as a queueing problem. It is shown to what extent computer-aiding may be based on the state of human activity as measured by psychophysiological quantities. Finally, management information systems for different application areas are mentioned. As an appendix, the report includes a paper written in English in which the possibilities of mathematical modeling of human behavior in complex man-machine systems are critically assessed. (orig.)

  10. Redefining genomic privacy: trust and empowerment.

    Directory of Open Access Journals (Sweden)

    Yaniv Erlich

    2014-11-01

    Full Text Available Fulfilling the promise of the genetic revolution requires the analysis of large datasets containing information from thousands to millions of participants. However, sharing human genomic data requires protecting subjects from potential harm. Current models rely on de-identification techniques in which privacy versus data utility becomes a zero-sum game. Instead, we propose the use of trust-enabling techniques to create a solution in which researchers and participants both win. To do so we introduce three principles that facilitate trust in genetic research and outline one possible framework built upon those principles. Our hope is that such trust-centric frameworks provide a sustainable solution that reconciles genetic privacy with data sharing and facilitates genetic research.

  11. Redefining genomic privacy: trust and empowerment.

    Science.gov (United States)

    Erlich, Yaniv; Williams, James B; Glazer, David; Yocum, Kenneth; Farahany, Nita; Olson, Maynard; Narayanan, Arvind; Stein, Lincoln D; Witkowski, Jan A; Kain, Robert C

    2014-11-01

    Fulfilling the promise of the genetic revolution requires the analysis of large datasets containing information from thousands to millions of participants. However, sharing human genomic data requires protecting subjects from potential harm. Current models rely on de-identification techniques in which privacy versus data utility becomes a zero-sum game. Instead, we propose the use of trust-enabling techniques to create a solution in which researchers and participants both win. To do so we introduce three principles that facilitate trust in genetic research and outline one possible framework built upon those principles. Our hope is that such trust-centric frameworks provide a sustainable solution that reconciles genetic privacy with data sharing and facilitates genetic research.

  12. Trajectory data privacy protection based on differential privacy mechanism

    Science.gov (United States)

    Gu, Ke; Yang, Lihao; Liu, Yongzhi; Liao, Niandong

    2018-05-01

    In this paper, we propose a trajectory data privacy protection scheme based on the differential privacy mechanism. In the proposed scheme, the algorithm first selects the protected points from the user’s trajectory data; secondly, the algorithm forms a polygon from each protected point and the adjacent, frequently accessed points selected from the accessing-point database, and then calculates the polygon centroid; finally, noise is added to the polygon centroids by the differential privacy method, the noisy centroids replace the protected points, and the algorithm constructs and issues the new trajectory data. The experiments show that the running time of the proposed algorithms is fast, the privacy protection of the scheme is effective, and the data usability of the scheme is comparatively high.
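
    To make the mechanics concrete, the following is a minimal Python sketch of the final perturbation step described above: compute the centroid of the polygon formed by a protected point and its neighbouring frequent points, then add Laplace noise calibrated to a privacy budget epsilon. The function names, the unit sensitivity, and the coordinate handling are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def laplace_noise(scale, size):
    # Draw i.i.d. Laplace noise; scale = sensitivity / epsilon.
    return np.random.laplace(loc=0.0, scale=scale, size=size)

def perturb_protected_point(protected_pt, neighbour_pts, epsilon, sensitivity=1.0):
    """Replace a protected trajectory point by a noisy polygon centroid.

    protected_pt  : (2,) array, the point to hide (x/y or lon/lat, assumed).
    neighbour_pts : (k, 2) array of adjacent, frequently accessed points that
                    form the polygon together with the protected point.
    epsilon       : differential-privacy budget spent on this point.
    sensitivity   : assumed L1 sensitivity of the centroid query.
    """
    polygon = np.vstack([protected_pt, neighbour_pts])
    centroid = polygon.mean(axis=0)                       # polygon centroid
    noisy = centroid + laplace_noise(sensitivity / epsilon, size=2)
    return noisy                                          # released instead of protected_pt

# Toy usage: one protected point and three frequent neighbours.
protected = np.array([10.0, 20.0])
neighbours = np.array([[10.5, 20.2], [9.8, 19.7], [10.2, 20.6]])
print(perturb_protected_point(protected, neighbours, epsilon=0.5))
```

    A smaller epsilon injects more noise and gives stronger protection at the cost of data usability, mirroring the trade-off the experiments evaluate.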

  13. Life Written in Bytes . The Superinformacional and New Technologies Company : Will the End of Privacy and Human Dignity ?

    Directory of Open Access Journals (Sweden)

    Cleide Aparecida Gomes Rodrigues Fermentão

    2015-12-01

    Full Text Available Recent technologies have changed the way human beings communicate, allowing direct contact with many people anywhere in the world. Allied to this fact, there is an increasing virtualization of the human person, culminating in an immersion in the virtual world which ultimately creates a growing dependence on technology in order to exist socially. This transformation in the world of concepts means that the virtual has a direct impact on the real world. Attracted by the glitter and glamour of the virtual network, the person finds no limits to their self-promotion. Private life is increasingly exposed to an undetermined number of people. Thus the person who is exposed in the virtual media in search of acceptance forgets that they are stripping themselves not only of their clothes or their privacy, but above all of their dignity. The frantic search for "likes" finds no limit in common sense, objectifying the person and transforming them into a mere virtual profile. The human person, in this state of total lack of dignity and without realizing it, becomes an object on display. The internet is a stage conducive to the spectacle of the virtual self, making it fertile ground for indignity. The history of civilization is a history of fighting for and winning the dignity of the human person; however, the time in which we live witnesses a reverse movement. Contemporaneously, it is no longer the state or private parties that constitute the constant threat to human dignity, but those who, seduced by the possibility of becoming the personality of the moment, voluntarily abdicate their dignity in a process whose reversibility is questionable. Legislation cannot keep up with the speed of the transformations occurring in the virtual world, and this mismatch can leave the person unprotected, especially in relation to their rights to intimacy, privacy and human dignity itself.

  14. Practical Privacy Assessment

    DEFF Research Database (Denmark)

    Peen, Søren; Jansen, Thejs Willem; Jensen, Christian D.

    2008-01-01

    This chapter proposes a privacy assessment model called the Operational Privacy Assessment Model that includes organizational, operational and technical factors for the protection of personal data stored in an IT system. The factors can be evaluated in a simple scale so that not only can the resulting graphical depiction be easily created for an IT system, but graphical comparisons across multiple IT systems are also possible. Examples of factors presented in a Kiviat graph are also presented. This assessment tool may be used to standardize privacy assessment criteria, making it less painful for the management to assess privacy risks on their systems.

  15. The Common Law Threesome: Libel, Slander, and Invasion of Privacy.

    Science.gov (United States)

    Anapol, Malthon M.

    Unlike most of the regulatory constraints which have impact on the media, libel, slander, and invasion of privacy are common law concepts developed from the precedents of previous court decisions and from reasoning employed in the written judicial opinions of appellate courts. Since common law is thus both traditional in nature and subject to…

  16. Commodification and privacy: a Lockean perspective.

    Science.gov (United States)

    Volkman, Richard

    2010-09-01

    This paper defends the thesis that privacy as a right is derived from fundamental rights to life, liberty, and property and does not permit restricting the commodification of bodily material; however, privacy as life, liberty, property does require conventions that ensure a robust and just market in bodily material. The analysis proceeds by defending a general commitment to liberty and markets, but not in the manner one might expect from a 'doctrinaire' libertarian. Ethical concerns about commodification are legitimate in the context of new medical and information technologies, but these concerns are not sufficiently well defined to justify political conclusions, since not every ethical concern is in itself a political concern, and the best way to resolve certain ethical difficulties is to draw up political boundaries that facilitate the discovery and testing of various solutions to our ethical puzzles. To illustrate the point, I will indicate how privacy as life, liberty, property defines such a dynamic solution to the problems of commodification of human bodily material and slippery information in insurance markets.

  17. Modelling human emotions for tactical decision-making games

    NARCIS (Netherlands)

    Visschedijk, G.C.; Lazonder, A.W.; Hulst, A.H. van der; Vink, N.; Leemkuil, H.

    2013-01-01

    The training of tactical decision making increasingly occurs through serious computer games. A challenging aspect of designing such games is the modelling of human emotions. Two studies were performed to investigate the relation between fidelity and human emotion recognition in virtual human

  18. Modelling human emotions for tactical decision-making games

    NARCIS (Netherlands)

    Visschedijk, G.; Lazonder, Adrianus W.; van der Hulst, A.; Vink, N.; Leemkuil, Hendrik H.

    2013-01-01

    The training of tactical decision making increasingly occurs through serious computer games. A challenging aspect of designing such games is the modelling of human emotions. Two studies were performed to investigate the relation between fidelity and human emotion recognition in virtual human

  19. Neuroethics and Brain Privacy

    DEFF Research Database (Denmark)

    Ryberg, Jesper

    2017-01-01

    An introduction is presented in which the editor discusses various articles within the issue, on topics including ethical challenges concerning the importance of privacy for well-being, the impact of brain-reading on mind privacy, and neurotechnology.

  20. IoT Privacy and Security Challenges for Smart Home Environments

    OpenAIRE

    Huichen Lin; Neil W. Bergmann

    2016-01-01

    Often the Internet of Things (IoT) is considered as a single problem domain, with proposed solutions intended to be applied across a wide range of applications. However, the privacy and security needs of critical engineering infrastructure or sensitive commercial operations are very different to the needs of a domestic Smart Home environment. Additionally, the financial and human resources available to implement security and privacy vary greatly between application domains. In domestic enviro...

  1. Context-Aware Generative Adversarial Privacy

    Directory of Open Access Journals (Sweden)

    Chong Huang

    2017-12-01

    Preserving the utility of published datasets while simultaneously providing provable privacy guarantees is a well-known challenge. On the one hand, context-free privacy solutions, such as differential privacy, provide strong privacy guarantees, but often lead to a significant reduction in utility. On the other hand, context-aware privacy solutions, such as information theoretic privacy, achieve an improved privacy-utility tradeoff, but assume that the data holder has access to dataset statistics. We circumvent these limitations by introducing a novel context-aware privacy framework called generative adversarial privacy (GAP). GAP leverages recent advancements in generative adversarial networks (GANs) to allow the data holder to learn privatization schemes from the dataset itself. Under GAP, learning the privacy mechanism is formulated as a constrained minimax game between two players: a privatizer that sanitizes the dataset in a way that limits the risk of inference attacks on the individuals' private variables, and an adversary that tries to infer the private variables from the sanitized dataset. To evaluate GAP's performance, we investigate two simple (yet canonical) statistical dataset models: (a) the binary data model; and (b) the binary Gaussian mixture model. For both models, we derive game-theoretically optimal minimax privacy mechanisms, and show that the privacy mechanisms learned from data (in a generative adversarial fashion) match the theoretically optimal ones. This demonstrates that our framework can be easily applied in practice, even in the absence of dataset statistics.
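
    To make the minimax game described above concrete, one schematic way to write a distortion-constrained privatizer-adversary objective (generic notation chosen purely for illustration, not the authors' own formulation) is:

        \[
        \min_{g \in \mathcal{G}} \; \max_{h \in \mathcal{H}} \;
            -\,\mathbb{E}_{(X,Y)}\!\left[ \ell\big( h(g(X)),\, Y \big) \right]
        \quad \text{subject to} \quad
        \mathbb{E}_{X}\!\left[ d\big( g(X),\, X \big) \right] \le \delta ,
        \]

    where X denotes the data, Y the private variable, g the privatizer, h the adversary's inference rule, \ell a loss such as cross-entropy, d a distortion measure that caps the utility loss, and \delta the distortion budget. The adversary maximizes its inference accuracy, while the privatizer plays against the best such adversary subject to the distortion constraint.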

  2. Context-Aware Generative Adversarial Privacy

    Science.gov (United States)

    Huang, Chong; Kairouz, Peter; Chen, Xiao; Sankar, Lalitha; Rajagopal, Ram

    2017-12-01

    Preserving the utility of published datasets while simultaneously providing provable privacy guarantees is a well-known challenge. On the one hand, context-free privacy solutions, such as differential privacy, provide strong privacy guarantees, but often lead to a significant reduction in utility. On the other hand, context-aware privacy solutions, such as information theoretic privacy, achieve an improved privacy-utility tradeoff, but assume that the data holder has access to dataset statistics. We circumvent these limitations by introducing a novel context-aware privacy framework called generative adversarial privacy (GAP). GAP leverages recent advancements in generative adversarial networks (GANs) to allow the data holder to learn privatization schemes from the dataset itself. Under GAP, learning the privacy mechanism is formulated as a constrained minimax game between two players: a privatizer that sanitizes the dataset in a way that limits the risk of inference attacks on the individuals' private variables, and an adversary that tries to infer the private variables from the sanitized dataset. To evaluate GAP's performance, we investigate two simple (yet canonical) statistical dataset models: (a) the binary data model, and (b) the binary Gaussian mixture model. For both models, we derive game-theoretically optimal minimax privacy mechanisms, and show that the privacy mechanisms learned from data (in a generative adversarial fashion) match the theoretically optimal ones. This demonstrates that our framework can be easily applied in practice, even in the absence of dataset statistics.

  3. Protecting genetic privacy.

    Science.gov (United States)

    Roche, P A; Annas, G J

    2001-05-01

    This article outlines the arguments for and against new rules to protect genetic privacy. We explain why genetic information is different to other sensitive medical information, why researchers and biotechnology companies have opposed new rules to protect genetic privacy (and favour anti-discrimination laws instead), and discuss what can be done to protect privacy in relation to genetic-sequence information and to DNA samples themselves.

  4. Privacy-Preserving Verifiability: A Case for an Electronic Exam Protocol

    DEFF Research Database (Denmark)

    Giustolisi, Rosario; Iovino, Vincenzo; Lenzini, Gabriele

    2017-01-01

    We introduce the notion of privacy-preserving verifiability for security protocols. It holds when a protocol admits a verifiability test that does not reveal, to the verifier that runs it, more pieces of information about the protocol’s execution than those required to run the test. Our definition of privacy-preserving verifiability is general and applies to cryptographic protocols as well as to human security protocols. In this paper we exemplify it in the domain of e-exams. We prove that the notion is meaningful by studying an existing exam protocol that is verifiable but whose verifiability tests are not privacy-preserving. We prove that the notion is applicable: we review the protocol using functional encryption so that it admits a verifiability test that preserves privacy according to our definition. We analyse, in ProVerif, that the verifiability holds despite malicious parties and that the new...

  5. The "GeneTrustee": a universal identification system that ensures privacy and confidentiality for human genetic databases.

    Science.gov (United States)

    Burnett, Leslie; Barlow-Stewart, Kris; Proos, Anné L; Aizenberg, Harry

    2003-05-01

    This article describes a generic model for access to samples and information in human genetic databases. The model utilises a "GeneTrustee", a third-party intermediary independent of the subjects and of the investigators or database custodians. The GeneTrustee model has been implemented successfully in various community genetics screening programs and has facilitated research access to genetic databases while protecting the privacy and confidentiality of research subjects. The GeneTrustee model could also be applied to various types of non-conventional genetic databases, including neonatal screening Guthrie card collections, and to forensic DNA samples.

  6. 75 FR 5604 - Privacy Act of 1974; Report of an Altered System of Records

    Science.gov (United States)

    2010-02-03

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Health Resources and Services Administration Privacy Act...). SUMMARY: In accordance with the requirements of the Privacy Act of 1974, the Health Resources and Services... to include breach notification language required by Memoranda (M) 07-16, Safeguarding Against and...

  7. Privacy-Enhanced and Multifunctional Health Data Aggregation under Differential Privacy Guarantees.

    Science.gov (United States)

    Ren, Hao; Li, Hongwei; Liang, Xiaohui; He, Shibo; Dai, Yuanshun; Zhao, Lian

    2016-09-10

    With the rapid growth of the health data scale, the limited storage and computation resources of wireless body area sensor networks (WBANs) are becoming a barrier to their development. Therefore, outsourcing the encrypted health data to the cloud has been an appealing strategy. However, data aggregation will become difficult. Some recently-proposed schemes try to address this problem. However, there are still some functions and privacy issues that are not discussed. In this paper, we propose a privacy-enhanced and multifunctional health data aggregation scheme (PMHA-DP) under differential privacy. Specifically, we achieve a new aggregation function, weighted average (WAAS), and design a privacy-enhanced aggregation scheme (PAAS) to protect the aggregated data from cloud servers. Besides, a histogram aggregation scheme with high accuracy is proposed. PMHA-DP supports fault tolerance while preserving data privacy. The performance evaluation shows that the proposal leads to less communication overhead than the existing one.
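
    The scheme above builds on differential privacy; as a minimal illustration of the underlying primitive only (the Laplace mechanism, not the PMHA-DP, WAAS, or PAAS constructions themselves), the following Python sketch releases a differentially private weighted average of bounded health readings. The weights, bounds, and epsilon below are hypothetical.

        import numpy as np

        def dp_weighted_average(values, weights, value_range, epsilon, rng=None):
            """Differentially private weighted average via the Laplace mechanism.

            values      : per-user readings, clipped to [lo, hi]
            weights     : non-negative public weights summing to 1
            value_range : (lo, hi) bounds used for clipping
            epsilon     : privacy budget for this single release
            """
            rng = rng or np.random.default_rng()
            lo, hi = value_range
            values = np.clip(np.asarray(values, dtype=float), lo, hi)
            weights = np.asarray(weights, dtype=float)
            true_avg = float(np.dot(weights, values))
            # Changing one clipped value shifts the average by at most
            # (max weight) * (hi - lo), which bounds the sensitivity.
            sensitivity = float(weights.max()) * (hi - lo)
            noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
            return true_avg + noise

        # Hypothetical usage: five heart-rate readings with equal weights.
        readings = [72, 88, 65, 90, 78]
        print(dp_weighted_average(readings, [0.2] * 5, value_range=(40, 180), epsilon=0.5))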

  8. Privacy of genetic information: a review of the laws in the United States.

    Science.gov (United States)

    Fuller, B; Ip, M

    2001-01-01

    This paper examines the privacy of genetic information and the laws in the United States designed to protect genetic privacy. While all 50 states have laws protecting the privacy of health information, there are many states that have additional laws that carve out additional protections specifically for genetic information. The majority of the individual states have enacted legislation to protect individuals from discrimination on the basis of genetic information, and most of this legislation also has provisions to protect the privacy of genetic information. On the Federal level, there has been no antidiscrimination or genetic privacy legislation. Secretary Donna Shalala of the Department of Health and Human Services has issued proposed regulations to protect the privacy of individually identifiable health information. These regulations encompass individually identifiable health information and do not make specific provisions for genetic information. The variety of laws regarding genetic privacy, some found in statutes to protect health information and some found in statutes to prevent genetic discrimination, presents challenges to those charged with administering and executing these laws.

  9. Privacy Verification Using Ontologies

    NARCIS (Netherlands)

    Kost, Martin; Freytag, Johann-Christoph; Kargl, Frank; Kung, Antonio

    2011-01-01

    As information systems extensively exchange information between participants, privacy concerns may arise from its potential misuse. A Privacy by Design (PbD) approach considers privacy requirements of different stakeholders during the design and the implementation of a system. Currently, a

  10. Human centred design of software agent in social network service against privacy concerns

    OpenAIRE

    Kim, Hojung

    2016-01-01

    This thesis was submitted for the award of Doctor of Philosophy and was awarded by Brunel University London. The rapid growth and influence of social network services has led many scholars to focus on privacy issues. However, the research described in this thesis was motivated by the small number of design studies that have focused on practical approaches to identifying tacit information from users’ instant non-verbal responses to privacy issues. The research therefore aimed to propose pers...

  11. Identity management and privacy languages technologies: Improving user control of data privacy

    Science.gov (United States)

    García, José Enrique López; García, Carlos Alberto Gil; Pacheco, Álvaro Armenteros; Organero, Pedro Luis Muñoz

    Identity management solutions have the capability to bring confidence to internet services, but this confidence could be improved if the user had more control over the privacy policy governing his or her attributes. Privacy languages could help with this task thanks to their capability to define privacy policies for data in a very flexible way. An integration problem therefore arises: making identity management and privacy languages work together. Although several proposals for accomplishing this have already been defined, this paper suggests some topics and improvements that could be considered.

  12. Privacy and internet services

    OpenAIRE

    Samec, Marek

    2010-01-01

    This thesis focuses on the privacy of internet service users. Its goal is to determine the level of user awareness of how their privacy is treated while using internet services, and then to suggest procedures that improve this awareness or lead to better control of individual privacy. In the theoretical part I analyze the general and legislative approaches to privacy, followed by an analysis of the behaviour of internet service users and providers. Part of this analysis deals with the usage of web cookies ...

  13. Gender and online privacy among teens: risk perception, privacy concerns, and protection behaviors.

    Science.gov (United States)

    Youn, Seounmi; Hall, Kimberly

    2008-12-01

    Survey data from 395 high school students revealed that girls perceive more privacy risks and have a higher level of privacy concerns than boys. Regarding privacy protection behaviors, boys tended to read unsolicited e-mail and register for Web sites while directly sending complaints in response to unsolicited e-mail. This study found girls to provide inaccurate information as their privacy concerns increased. Boys, however, refrained from registering to Web sites as their concerns increased.

  14. Accommodating complexity and human behaviors in decision analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Backus, George A.; Siirola, John Daniel; Schoenwald, David Alan; Strip, David R.; Hirsch, Gary B.; Bastian, Mark S.; Braithwaite, Karl R.; Homer, Jack [Homer Consulting

    2007-11-01

    This is the final report for an LDRD effort to address human behavior in decision support systems. One sister LDRD effort reports the extension of this work to include actual human choices and additional simulation analyses. Another provides the background for this effort and the programmatic directions for future work. This specific effort considered the feasibility of five aspects of model development required for analysis viability. To avoid the use of classified information, healthcare decisions and the system embedding them became the illustrative example for assessment.

  15. Ratio Utility and Cost Analysis for Privacy Preserving Subspace Projection

    OpenAIRE

    Al, Mert; Wan, Shibiao; Kung, Sun-Yuan

    2017-01-01

    With a rapidly increasing number of devices connected to the internet, big data has been applied to various domains of human life. Nevertheless, it has also opened new avenues for breaching users' privacy. Hence it is highly desirable to develop techniques that enable data owners to privatize their data while keeping it useful for intended applications. Existing methods, however, do not offer enough flexibility for controlling the utility-privacy trade-off and may incur unfavorable results when...

  16. Privacy-Enhanced and Multifunctional Health Data Aggregation under Differential Privacy Guarantees

    Science.gov (United States)

    Ren, Hao; Li, Hongwei; Liang, Xiaohui; He, Shibo; Dai, Yuanshun; Zhao, Lian

    2016-01-01

    With the rapid growth of the health data scale, the limited storage and computation resources of wireless body area sensor networks (WBANs) are becoming a barrier to their development. Therefore, outsourcing the encrypted health data to the cloud has been an appealing strategy. However, data aggregation will become difficult. Some recently-proposed schemes try to address this problem. However, there are still some functions and privacy issues that are not discussed. In this paper, we propose a privacy-enhanced and multifunctional health data aggregation scheme (PMHA-DP) under differential privacy. Specifically, we achieve a new aggregation function, weighted average (WAAS), and design a privacy-enhanced aggregation scheme (PAAS) to protect the aggregated data from cloud servers. Besides, a histogram aggregation scheme with high accuracy is proposed. PMHA-DP supports fault tolerance while preserving data privacy. The performance evaluation shows that the proposal leads to less communication overhead than the existing one. PMID:27626417

  17. 48 CFR 39.105 - Privacy.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Privacy. 39.105 Section 39... CONTRACTING ACQUISITION OF INFORMATION TECHNOLOGY General 39.105 Privacy. Agencies shall ensure that contracts for information technology address protection of privacy in accordance with the Privacy Act (5 U.S.C...

  18. Privacy in domestic environments

    OpenAIRE

    Radics, Peter J; Gracanin, Denis

    2011-01-01

    Non-peer-reviewed. While there is a growing body of research on privacy, most of the work puts the focus on information privacy. Physical and psychological privacy issues receive little to no attention. However, the introduction of technology into our lives can cause problems with regard to these aspects of privacy. This is especially true when it comes to our homes, both as nodes of our social life and as places for relaxation. This paper presents the results of a study intended to captu...

  19. Privacy is an essentially contested concept: a multi-dimensional analytic for mapping privacy

    Science.gov (United States)

    Koopman, Colin; Doty, Nick

    2016-01-01

    The meaning of privacy has been much disputed throughout its history in response to wave after wave of new technological capabilities and social configurations. The current round of disputes over privacy fuelled by data science has been a cause of despair for many commentators and a death knell for privacy itself for others. We argue that privacy’s disputes are neither an accidental feature of the concept nor a lamentable condition of its applicability. Privacy is essentially contested. Because it is, privacy is transformable according to changing technological and social conditions. To make productive use of privacy’s essential contestability, we argue for a new approach to privacy research and practical design, focused on the development of conceptual analytics that facilitate dissecting privacy’s multiple uses across multiple contexts. This article is part of the themed issue ‘The ethical impact of data science’. PMID:28336797

  20. Genetic secrets: Protecting privacy and confidentiality in the genetic era

    Energy Technology Data Exchange (ETDEWEB)

    Rothstein, M.A. [ed.

    1998-07-01

    Few developments are likely to affect human beings more profoundly in the long run than the discoveries resulting from advances in modern genetics. Although the developments in genetic technology promise to provide many additional benefits, their application to genetic screening poses ethical, social, and legal questions, many of which are rooted in issues of privacy and confidentiality. The ethical, practical, and legal ramifications of these and related questions are explored in depth. The broad range of topics includes: the privacy and confidentiality of genetic information; the challenges to privacy and confidentiality that may be projected to result from the emerging genetic technologies; the role of informed consent in protecting the confidentiality of genetic information in the clinical setting; the potential uses of genetic information by third parties; the implications of changes in the health care delivery system for privacy and confidentiality; relevant national and international developments in public policies, professional standards, and laws; recommendations; and the identification of research needs.

  1. Decision science a human-oriented perspective

    CERN Document Server

    Mengov, George

    2015-01-01

    This book offers a new perspective on human decision-making by comparing the established methods in decision science with innovative modelling at the level of neurons and neural interactions. The book presents a new generation of computer models, which can predict with astonishing accuracy individual economic choices when people make them by quick intuition rather than by effort. A vision for a new kind of social science is outlined, whereby neural models of emotion and cognition capture the dynamics of socioeconomic systems and virtual social networks. The exposition is approachable by experts as well as by advanced students. The author is an Associate Professor of Decision Science with a doctorate in Computational Neuroscience, and a former software consultant to banks in the City of London.

  2. Modelling Human Emotions for Tactical Decision-Making Games

    Science.gov (United States)

    Visschedijk, Gillian C.; Lazonder, Ard W.; van der Hulst, Anja; Vink, Nathalie; Leemkuil, Henny

    2013-01-01

    The training of tactical decision making increasingly occurs through serious computer games. A challenging aspect of designing such games is the modelling of human emotions. Two studies were performed to investigate the relation between fidelity and human emotion recognition in virtual human characters. Study 1 compared five versions of a virtual…

  3. Towards Territorial Privacy in Smart Environments

    NARCIS (Netherlands)

    Könings, Bastian; Schaub, Florian; Weber, M.; Kargl, Frank

    Territorial privacy is an old concept for privacy of the personal space dating back to the 19th century. Despite its former relevance, territorial privacy has been neglected in recent years, while privacy research and legislation mainly focused on the issue of information privacy. However, with the

  4. Privacy and Library Records

    Science.gov (United States)

    Bowers, Stacey L.

    2006-01-01

    This paper summarizes the history of privacy as it relates to library records. It commences with a discussion of how the concept of privacy first originated through case law and follows the concept of privacy as it has affected library records through current day and the "USA PATRIOT Act."

  5. HUMAN DECISIONS AND MACHINE PREDICTIONS.

    Science.gov (United States)

    Kleinberg, Jon; Lakkaraju, Himabindu; Leskovec, Jure; Ludwig, Jens; Mullainathan, Sendhil

    2018-02-01

    Can machine learning improve human decision making? Bail decisions provide a good test case. Millions of times each year, judges make jail-or-release decisions that hinge on a prediction of what a defendant would do if released. The concreteness of the prediction task combined with the volume of data available makes this a promising machine-learning application. Yet comparing the algorithm to judges proves complicated. First, the available data are generated by prior judge decisions. We only observe crime outcomes for released defendants, not for those judges detained. This makes it hard to evaluate counterfactual decision rules based on algorithmic predictions. Second, judges may have a broader set of preferences than the variable the algorithm predicts; for instance, judges may care specifically about violent crimes or about racial inequities. We deal with these problems using different econometric strategies, such as quasi-random assignment of cases to judges. Even accounting for these concerns, our results suggest potentially large welfare gains: one policy simulation shows crime reductions up to 24.7% with no change in jailing rates, or jailing rate reductions up to 41.9% with no increase in crime rates. Moreover, all categories of crime, including violent crimes, show reductions; and these gains can be achieved while simultaneously reducing racial disparities. These results suggest that while machine learning can be valuable, realizing this value requires integrating these tools into an economic framework: being clear about the link between predictions and decisions; specifying the scope of payoff functions; and constructing unbiased decision counterfactuals. JEL Codes: C10 (Econometric and statistical methods and methodology), C55 (Large datasets: Modeling and analysis), K40 (Legal procedure, the legal system, and illegal behavior).
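
    To make the notion of a decision counterfactual concrete, the following purely illustrative Python sketch fits a risk model on synthetic data and simulates a release rule that holds the jailing rate fixed while detaining the highest-predicted-risk defendants. The data, features, and rates are invented, and this is not the authors' econometric procedure, which must also address the selective-labels problem the abstract describes.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)

        # Synthetic "defendants": two features loosely predictive of re-offending.
        n = 5000
        X = rng.normal(size=(n, 2))
        p_reoffend = 1 / (1 + np.exp(-(0.8 * X[:, 0] + 0.5 * X[:, 1] - 0.5)))
        reoffend = rng.binomial(1, p_reoffend)

        # Status quo policy: jail a fixed 30% of defendants at random.
        jail_rate = 0.30
        status_quo_jailed = rng.random(n) < jail_rate

        # Risk model trained only on released defendants (outcomes are
        # unobserved for the jailed, echoing the selective-labels issue).
        released = ~status_quo_jailed
        model = LogisticRegression().fit(X[released], reoffend[released])
        risk = model.predict_proba(X)[:, 1]

        # Counterfactual policy: same jailing rate, but detain the highest-risk cases.
        threshold = np.quantile(risk, 1 - jail_rate)
        algo_jailed = risk >= threshold

        # Compare crime among released defendants under each policy; the synthetic
        # outcomes stand in for counterfactuals a real study must estimate.
        print(f"jailing rate held at {jail_rate:.0%}")
        print("crimes, status quo policy :", int(reoffend[~status_quo_jailed].sum()))
        print("crimes, risk-ranked policy:", int(reoffend[~algo_jailed].sum()))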

  6. Data privacy for the smart grid

    CERN Document Server

    Herold, Rebecca

    2015-01-01

    Contents: The Smart Grid and Privacy; What Is the Smart Grid?; Changes from Traditional Energy Delivery; Smart Grid Possibilities; Business Model Transformations; Emerging Privacy Risks; The Need for Privacy Policies; Privacy Laws, Regulations, and Standards; Privacy-Enhancing Technologies; New Privacy Challenges; IoT; Big Data; What Is the Smart Grid?; Market and Regulatory Overview; Traditional Electricity Business Sector; The Electricity Open Market; Classifications of Utilities; Rate-Making Processes; Electricity Consumer

  7. The Genetic Privacy Act and commentary

    Energy Technology Data Exchange (ETDEWEB)

    Annas, G.J.; Glantz, L.H.; Roche, P.A.

    1995-02-28

    The Genetic Privacy Act is a proposal for federal legislation. The Act is based on the premise that genetic information is different from other types of personal information in ways that require special protection. The DNA molecule holds an extensive amount of currently indecipherable information. The major goal of the Human Genome Project is to decipher this code so that the information it contains is accessible. The privacy question is, accessible to whom? The highly personal nature of the information contained in DNA can be illustrated by thinking of DNA as containing an individual's "future diary." A diary is perhaps the most personal and private document a person can create. It contains a person's innermost thoughts and perceptions, and is usually hidden and locked to assure its secrecy. Diaries describe the past. The information in one's genetic code can be thought of as a coded probabilistic future diary because it describes an important part of a unique and personal future. This document presents an introduction to the proposal for federal legislation, the Genetic Privacy Act; a copy of the proposed act; and comment.

  8. Designing Privacy-by-Design

    NARCIS (Netherlands)

    Rest, J.H.C. van; Boonstra, D.; Everts, M.H.; Rijn, M. van; Paassen, R.J.G. van

    2014-01-01

    The proposal for a new privacy regulation d.d. January 25th 2012 introduces sanctions of up to 2% of the annual turnover of enterprises. This elevates the importance of mitigation of privacy risks. This paper makes Privacy by Design more concrete, and positions it as the mechanism to mitigate these

  9. Metabolic state alters economic decision making under risk in humans.

    Directory of Open Access Journals (Sweden)

    Mkael Symmonds

    2010-06-01

    Animals' attitudes to risk are profoundly influenced by metabolic state (hunger and baseline energy stores). Specifically, animals often express a preference for risky (more variable) food sources when below a metabolic reference point (hungry), and safe (less variable) food sources when sated. Circulating hormones report the status of energy reserves and acute nutrient intake to widespread targets in the central nervous system that regulate feeding behaviour, including brain regions strongly implicated in risk- and reward-based decision-making in humans. Despite this, physiological influences per se have not previously been considered to influence economic decisions in humans. We hypothesised that baseline metabolic reserves and alterations in metabolic state would systematically modulate decision-making and financial risk-taking in humans. We used a controlled feeding manipulation and assayed decision-making preferences across different metabolic states following a meal. To elicit risk preference, we presented a sequence of 200 paired lotteries, subjects' task being to select their preferred option from each pair. We also measured prandial suppression of circulating acyl-ghrelin (a centrally acting orexigenic hormone signalling acute nutrient intake), and circulating leptin levels (providing an assay of energy reserves). We show both immediate and delayed effects on risky decision-making following a meal, and that these changes correlate with an individual's baseline leptin and changes in acyl-ghrelin levels, respectively. We show that human risk preferences are exquisitely sensitive to current metabolic state, in a direction consistent with ecological models of feeding behaviour but not predicted by normative economic theory. These substantive effects of state changes on economic decisions perhaps reflect shared evolutionarily conserved neurobiological mechanisms. We suggest that this sensitivity in human risk-preference to current metabolic state has

  10. When Differential Privacy Meets Randomized Perturbation: A Hybrid Approach for Privacy-Preserving Recommender System

    KAUST Repository

    Liu, Xiao; Liu, An; Zhang, Xiangliang; Li, Zhixu; Liu, Guanfeng; Zhao, Lei; Zhou, Xiaofang

    2017-01-01

    result. However, none is designed for both hiding users’ private data and preventing privacy inference. To achieve this goal, we propose in this paper a hybrid approach for privacy-preserving recommender systems by combining differential privacy (DP

  11. Engineering and lawyering privacy by design : understanding online privacy both as a technical and an international human rights issue

    NARCIS (Netherlands)

    Rachovitsa, Adamantia

    2016-01-01

    There is already evidence that “governmental mass surveillance emerges as a dangerous habit”. Despite the serious interests at stake, we are far from fully comprehending the ramifications of the systematic and pervasive violation of privacy online. This article underscores the reasons that

  12. Privacy information management for video surveillance

    Science.gov (United States)

    Luo, Ying; Cheung, Sen-ching S.

    2013-05-01

    The widespread deployment of surveillance cameras has raised serious privacy concerns. Many privacy-enhancing schemes have been proposed to automatically redact images of trusted individuals in the surveillance video. To identify these individuals for protection, the most reliable approach is to use biometric signals such as iris patterns as they are immutable and highly discriminative. In this paper, we propose a privacy data management system to be used in a privacy-aware video surveillance system. The privacy status of a subject is anonymously determined based on her iris pattern. For a trusted subject, the surveillance video is redacted and the original imagery is considered to be the privacy information. Our proposed system allows a subject to access her privacy information via the same biometric signal for privacy status determination. Two secure protocols, one for privacy information encryption and the other for privacy information retrieval are proposed. Error control coding is used to cope with the variability in iris patterns and efficient implementation is achieved using surrogate data records. Experimental results on a public iris biometric database demonstrate the validity of our framework.
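
    The redaction step mentioned above can be as simple as pixelating the image region occupied by a trusted subject once that subject has been located. The short Python sketch below is a hypothetical illustration of that single step only; it is unrelated to the paper's iris-matching, encryption, or surrogate-record protocols, and the frame and box coordinates are made up.

        import numpy as np

        def redact_region(frame, box, block=16):
            """Pixelate a rectangular region of a video frame.

            frame : H x W x 3 uint8 array
            box   : (top, left, bottom, right) region to redact
            block : edge length of the pixelation blocks
            """
            out = frame.copy()
            top, left, bottom, right = box
            region = out[top:bottom, left:right]  # a view into `out`
            h, w = region.shape[:2]
            for y in range(0, h, block):
                for x in range(0, w, block):
                    tile = region[y:y + block, x:x + block]
                    tile[:] = tile.reshape(-1, 3).mean(axis=0).astype(np.uint8)
            return out

        # Hypothetical usage: redact a detected face region in a synthetic frame.
        frame = np.random.randint(0, 256, size=(480, 640, 3), dtype=np.uint8)
        redacted = redact_region(frame, box=(100, 200, 220, 320))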

  13. Technical Privacy Metrics: a Systematic Survey

    OpenAIRE

    Wagner, Isabel; Eckhoff, David

    2018-01-01

    The file attached to this record is the author's final peer-reviewed version. The goal of privacy metrics is to measure the degree of privacy enjoyed by users in a system and the amount of protection offered by privacy-enhancing technologies. In this way, privacy metrics contribute to improving user privacy in the digital world. The diversity and complexity of privacy metrics in the literature makes an informed choice of metrics challenging. As a result, instead of using existing metrics, n...

  14. Advanced research in data privacy

    CERN Document Server

    Torra, Vicenç

    2015-01-01

    This book provides an overview of the research work on data privacy and privacy enhancing technologies carried out by the participants of the ARES project. ARES (Advanced Research in Privacy and Security, CSD2007-00004) has been one of the most important research projects funded by the Spanish Government in the fields of computer security and privacy. It is part of the now extinct CONSOLIDER INGENIO 2010 program, a highly competitive program which aimed to advance knowledge and open new research lines among top Spanish research groups. The project started in 2007 and will finish in 2014. Composed of 6 research groups from 6 different institutions, it has gathered an important number of researchers during its lifetime. Among the work produced by the ARES project, one specific work package has been related to privacy. This book gathers works produced by members of the project related to data privacy and privacy enhancing technologies. The presented works not only summarize important research carried in the proje...

  15. A privacy protection model to support personal privacy in relational databases.

    OpenAIRE

    2008-01-01

    The individual of today incessantly insists on more protection of his/her personal privacy than a few years ago. During the last few years, rapid technological advances, especially in the field of information technology, directed most attention and energy to the privacy protection of the Internet user. Research was done and is still being done covering a vast area to protect the privacy of transactions performed on the Internet. However, it was established that almost no research has been don...

  16. Internet privacy options for adequate realisation

    CERN Document Server

    2013-01-01

    A thorough multidisciplinary analysis of various perspectives on internet privacy was published as the first volume of a study, revealing the results of the acatech project "Internet Privacy - A Culture of Privacy and Trust on the Internet." The second publication from this project presents integrated, interdisciplinary options for improving privacy on the Internet utilising a normative, value-oriented approach. The ways in which privacy promotes and preconditions fundamental societal values and how privacy violations endanger the flourishing of said values are exemplified. The conditions which must be fulfilled in order to achieve a culture of privacy and trust on the internet are illuminated. This volume presents options for policy-makers, educators, businesses and technology experts on how to facilitate solutions for more privacy on the Internet and identifies further research requirements in this area.

  17. Emotion-affected decision making in human simulation

    OpenAIRE

    Zhao, Y; Kang, J; Wright, D K

    2006-01-01

    Human modelling is an interdisciplinary research field. The topic, emotion-affected decision making, was originally a cognitive psychology issue, but is now recognized as an important research direction for both computer science and biomedical modelling. The main aim of this paper is to attempt to bridge the gap between psychology and bioengineering in emotion-affected decision making. The work is based on Ortony's theory of emotions and bounded rationality theory, and attempts to connect the...

  18. Cognitive Privacy for Personal Clouds

    Directory of Open Access Journals (Sweden)

    Milena Radenkovic

    2016-01-01

    This paper proposes a novel Cognitive Privacy (CogPriv) framework that improves privacy of data sharing between Personal Clouds for different application types and across heterogeneous networks. Depending on the behaviour of neighbouring network nodes, their estimated privacy levels, resource availability, and social network connectivity, each Personal Cloud may decide to use different transmission network for different types of data and privacy requirements. CogPriv is fully distributed, uses complex graph contacts analytics and multiple implicit novel heuristics, and combines these with smart probing to identify presence and behaviour of privacy compromising nodes in the network. Based on sensed local context and through cooperation with remote nodes in the network, CogPriv is able to transparently and on-the-fly change the network in order to avoid transmissions when privacy may be compromised. We show that CogPriv achieves higher end-to-end privacy levels compared to both noncognitive cellular network communication and state-of-the-art strategies based on privacy-aware adaptive social mobile networks routing for a range of experiment scenarios based on real-world user and network traces. CogPriv is able to adapt to varying network connectivity and maintain high quality of service while managing to keep low data exposure for a wide range of privacy leakage levels in the infrastructure.

  19. Discrimination and Privacy in the Information Society Data Mining and Profiling in Large Databases

    CERN Document Server

    Calders, Toon; Schermer, Bart; Zarsky, Tal

    2013-01-01

    Vast amounts of data are nowadays collected, stored and processed, in an effort to assist in  making a variety of administrative and governmental decisions. These innovative steps considerably improve the speed, effectiveness and quality of decisions. Analyses are increasingly performed by data mining and profiling technologies that statistically and automatically determine patterns and trends. However, when such practices lead to unwanted or unjustified selections, they may result in unacceptable forms of  discrimination. Processing vast amounts of data may lead to situations in which data controllers know many of the characteristics, behaviors and whereabouts of people. In some cases, analysts might know more about individuals than these individuals know about themselves. Judging people by their digital identities sheds a different light on our views of privacy and data protection. This book discusses discrimination and privacy issues related to data mining and profiling practices. It provides technologic...

  20. Space in Space: Designing for Privacy in the Workplace

    Science.gov (United States)

    Akin, Jonie

    2015-01-01

    Privacy is cultural, socially embedded in the spatial, temporal, and material aspects of the lived experience. Definitions of privacy are as varied among scholars as they are among those who fight for their personal rights in the home and the workplace. Privacy in the workplace has become a topic of interest in recent years, as evident in discussions on Big Data as well as the shrinking office spaces in which people work. An article in The New York Times published in February of this year noted that "many companies are looking to cut costs, and one way to do that is by trimming personal space". Increasingly, organizations ranging from tech start-ups to large corporations are downsizing square footage and opting for open-office floorplans hoping to trim the budget and spark creative, productive communication among their employees. The question of how much is too much to trim when it comes to privacy is one that is being actively addressed by the National Aeronautics and Space Administration (NASA) as they explore habitat designs for future space missions. NASA recognizes privacy as a design-related stressor impacting human health and performance. Given the challenges of sustaining life in an isolated, confined, and extreme environment such as Mars, NASA deems it necessary to determine the minimal acceptable amount of habitable volume for activities requiring at least some level of privacy in order to support optimal crew performance. Ethnographic research was conducted in 2013 to explore perceptions of privacy and privacy needs among astronauts living and working in space as part of a long-distance, long-duration mission. The allocation of space, or habitable volume, becomes an increasingly complex issue in outer space due to the costs associated with maintaining an artificial, confined environment bounded by limitations of mass while located in an extreme environment. Privacy in space, or space in space, provides a unique case study of the complex notions of

  1. Cybersecurity and Privacy

    DEFF Research Database (Denmark)

    The huge potential in future connected services has as a precondition that privacy and security needs are dealt with in order for new services to be accepted. This issue is increasingly on the agenda both at the company and at the individual level. Cybersecurity and Privacy – bridging the gap addresses two very complex fields of the digital world, i.e., Cybersecurity and Privacy. These multifaceted, multidisciplinary and complex issues are usually understood and valued differently by different individuals, data holders and legal bodies. But a change in one field immediately affects the others. Policies, frameworks, strategies, laws, tools, techniques, and technologies – all of these are tightly interwoven when it comes to security and privacy. This book is another attempt to bridge the gap between the industry and academia. The book addresses the views from academia and industry on the subject...

  2. Privacy for Sale?

    DEFF Research Database (Denmark)

    Sørensen, Lene Tolstrup; Sørensen, Jannick Kirk; Khajuria, Samant

    Data brokers have become central players in the online collection of private user data. Data brokers’ activities are, however, not very transparent or even known by users. Many users regard privacy as a central element when they use online services. Based on 12 short interviews with users, this paper analyses how users perceive the concept of online privacy in respect to data brokers’ collection of private data, and particularly novel services that offer users the possibility to sell their private data. Two groups of users are identified: those who are considering selling their data under specific conditions, and those who reject the idea completely. Based on the literature we identify two positions on privacy: either as an instrumental good, or as an intrinsic good. The paper positions various user perceptions of privacy that are relevant for future service development.

  3. 78 FR 47210 - National Practitioner Data Bank and Privacy Act; Exempt Records System; Technical Correction

    Science.gov (United States)

    2013-08-05

    ... reference cited in the Privacy Act regulations. The National Practitioner Data Bank (NPDB) system of records... DEPARTMENT OF HEALTH AND HUMAN SERVICES 45 CFR Part 5b RIN 0906-AA97 National Practitioner Data Bank and Privacy Act; Exempt Records System; Technical Correction AGENCY: Health Resources and Services...

  4. Forensic DNA phenotyping: Developing a model privacy impact assessment.

    Science.gov (United States)

    Scudder, Nathan; McNevin, Dennis; Kelty, Sally F; Walsh, Simon J; Robertson, James

    2018-05-01

    Forensic scientists around the world are adopting new technology platforms capable of efficiently analysing a larger proportion of the human genome. Undertaking this analysis could provide significant operational benefits, particularly in giving investigators more information about the donor of genetic material, a particularly useful investigative lead. Such information could include predicting externally visible characteristics such as eye and hair colour, as well as biogeographical ancestry. This article looks at the adoption of this new technology from a privacy perspective, using this to inform and critique the application of a Privacy Impact Assessment to this emerging technology. Noting the benefits and limitations, the article develops a number of themes that would influence a model Privacy Impact Assessment as a contextual framework for forensic laboratories and law enforcement agencies considering implementing forensic DNA phenotyping for operational use. Copyright © 2018 Elsevier B.V. All rights reserved.

  5. Location Privacy in RFID Applications

    Science.gov (United States)

    Sadeghi, Ahmad-Reza; Visconti, Ivan; Wachsmann, Christian

    RFID-enabled systems allow fully automatic wireless identification of objects and are rapidly becoming a pervasive technology with various applications. However, despite their benefits, RFID-based systems also pose challenging risks, in particular concerning user privacy. Indeed, improvident use of RFID can disclose sensitive information about users and their locations, allowing detailed user profiles. Hence, it is crucial to identify and to enforce appropriate security and privacy requirements of RFID applications (that are also compliant with legislation). This chapter first discusses security and privacy requirements for RFID-enabled systems, focusing in particular on location privacy issues. Then it explores the advances in RFID applications, stressing the security and privacy shortcomings of existing proposals. Finally, it presents new promising directions for privacy-preserving RFID systems, where as a case study we focus on electronic tickets (e-tickets) for public transportation.

  6. Privacy enhancing techniques - the key to secure communication and management of clinical and genomic data.

    Science.gov (United States)

    De Moor, G J E; Claerhout, B; De Meyer, F

    2003-01-01

    To introduce some of the privacy protection problems related to genomics based medicine and to highlight the relevance of Trusted Third Parties (TTPs) and of Privacy Enhancing Techniques (PETs) in the restricted context of clinical research and statistics. Practical approaches based on two different pseudonymisation models, both for batch and interactive data collection and exchange, are described and analysed. The growing need of managing both clinical and genetic data raises important legal and ethical challenges. Protecting human rights in the realm of privacy, while optimising research potential and other statistical activities is a challenge that can easily be overcome with the assistance of a trust service provider offering advanced privacy enabling/enhancing solutions. As such, the use of pseudonymisation and other innovative Privacy Enhancing Techniques can unlock valuable data sources.
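
    As a toy illustration of the pseudonymisation idea discussed above (not the two models analysed in the record, whose details are not reproduced here), a trusted third party can replace a patient identifier with a keyed pseudonym so that records remain linkable across submissions while the identity stays hidden from researchers. A minimal Python sketch with a made-up key and identifier:

        import hmac
        import hashlib

        # Secret key held only by the trusted third party (hypothetical value).
        TTP_KEY = b"ttp-secret-key-rotated-regularly"

        def pseudonymise(patient_id: str) -> str:
            """Return a stable, keyed pseudonym for a patient identifier.

            The same identifier always maps to the same pseudonym, so records
            can be linked, but without the TTP's key the mapping cannot be
            recomputed or reversed by researchers who hold the data.
            """
            digest = hmac.new(TTP_KEY, patient_id.encode("utf-8"), hashlib.sha256)
            return digest.hexdigest()[:16]

        # Hypothetical usage: submissions for the same patient link up.
        print(pseudonymise("patient-000123"))
        print(pseudonymise("patient-000123"))  # identical pseudonym
        print(pseudonymise("patient-000124"))  # different pseudonym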

  7. Privacy and Data-Based Research

    OpenAIRE

    Ori Heffetz; Katrina Ligett

    2013-01-01

    What can we, as users of microdata, formally guarantee to the individuals (or firms) in our dataset, regarding their privacy? We retell a few stories, well-known in data-privacy circles, of failed anonymization attempts in publicly released datasets. We then provide a mostly informal introduction to several ideas from the literature on differential privacy, an active literature in computer science that studies formal approaches to preserving the privacy of individuals in statistical databases...
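
    For reference, the formal guarantee around which the differential-privacy literature mentioned above revolves can be stated as follows (standard textbook notation, not the record's own): a randomized mechanism \mathcal{M} is \varepsilon-differentially private if, for all datasets D and D' differing in a single individual's record and all measurable output sets S,

        \[
        \Pr[\, \mathcal{M}(D) \in S \,] \;\le\; e^{\varepsilon}\, \Pr[\, \mathcal{M}(D') \in S \,].
        \]

    Smaller \varepsilon means the published statistics depend less on any one individual and therefore reveal less about them.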

  8. 76 FR 59073 - Privacy Act

    Science.gov (United States)

    2011-09-23

    ... CENTRAL INTELLIGENCE AGENCY 32 CFR Part 1901 Privacy Act AGENCY: Central Intelligence Agency. ACTION: Proposed rule. SUMMARY: Consistent with the Privacy Act (PA), the Central Intelligence Agency...-1379. SUPPLEMENTARY INFORMATION: Consistent with the Privacy Act (PA), the CIA has undertaken and...

  9. Vehicular Internet: Security & Privacy Challenges and Opportunities

    Directory of Open Access Journals (Sweden)

    Kamran Zaidi

    2015-07-01

    The vehicular internet will drive the future of vehicular technology and intelligent transportation systems (ITS). Whether it is road safety, infotainment, or driver-less cars, the vehicular internet will lay the foundation for the future of road travel. Governments and companies are pursuing driver-less vehicles as they are considered to be more reliable than humans and, therefore, safer. The vehicles today are not just a means of transportation but are also equipped with a wide range of sensors that provide valuable data. If vehicles are enabled to share data that they collect with other vehicles or authorities for decision-making and safer driving, they thereby form a vehicular network. However, there is a lot at stake in vehicular networks if they are compromised. With the stakes so high, it is imperative that the vehicular networks are secured and made resilient to any attack or attempt that may have serious consequences. The vehicular internet can also be the target of a cyber attack, which can be devastating. In this paper, the opportunities that the vehicular internet offers are presented and then various security and privacy aspects are discussed and some solutions are presented.

  10. The Models of Applying Online Privacy Literacy Strategies: A Case Study of Instagram Girl Users

    Directory of Open Access Journals (Sweden)

    Abdollah Bicharanlou

    2017-09-01

    Social networks have a remarkable effect on the lives of virtual space users. Like most human relations, these networks involve a compromise between self-disclosure and privacy protection, a process realized through improving privacy and empowering the user at the personal level. This study aimed to assess strategies based on online privacy literacy, in particular the strategies that young girl users of Instagram should employ to achieve the optimum level of privacy. For this purpose, the paradox of privacy and the benefits and risks of self-disclosure are first explained; then, drawing on online privacy literacy, some social and technological strategies are introduced by which users can solve the “paradox of privacy.” In the results section, after describing the main benefits and risks of self-disclosure by girl users, the current models of using these social and technological strategies to solve the mentioned paradox are discussed. The research method is ethnography, based on non-collaborative observation of Instagram pages and semi-structured interviews with 20 girl users of social networks.

  11. A new privacy preserving technique for cloud service user endorsement using multi-agents

    Directory of Open Access Journals (Sweden)

    D. Chandramohan

    2016-01-01

    In data analysis, storage services play a crucial part, yet user data can be compromised while using them. In recent years, service users' valuable information has been exploited by unauthorized users and service providers. This paper examines privacy awareness and the importance of preserving users' secrecy in the current cloud computing era. The information kept in the cloud environment grows steadily because of the cloud's elasticity and availability; however, highly sensitive information is under serious attack from various sources. Once private information is misused, the probability of a privacy breach increases, which in turn reduces users' trust in cloud providers. In the modern internet world, information management and maintenance is among the most decisive tasks, and information stored in the cloud by the finance, healthcare, and government sectors, among others, makes it all the more challenging since such tasks are handled globally. The present scenario therefore demands a new Petri-net Privacy Preserving Framework (PPPF) for safeguarding users' privacy and providing consistent, breach-less services from the cloud. This paper illustrates the design of PPPF and addresses the issue of users' trust in cloud providers. The proposed technique collaborates with a Privacy Preserving Cohesion Technique (PPCT) to develop, validate, promote, adapt and reinforce data privacy. Moreover, the paper focuses on detecting and verifying unknown user intervention into the confidential data present in the storage area and on ensuring the performance of cloud services. It also acts as an information-preserving guard for highly secret data storage areas.

  12. Protecting patron privacy

    CERN Document Server

    Beckstrom, Matthew

    2015-01-01

    In a world where almost anyone with computer savvy can hack, track, and record the online activities of others, your library can serve as a protected haven for your visitors who rely on the Internet to conduct research, if you take the necessary steps to safeguard their privacy. This book shows you how to protect patrons' privacy while using the technology that your library provides, including public computers, Internet access, wireless networks, and other devices. Logically organized into two major sections, the first part of the book discusses why the privacy of your users is of paramount

  13. Bridging the transatlantic divide in privacy

    Directory of Open Access Journals (Sweden)

    Paula Kift

    2013-08-01

    In the context of the US National Security Agency surveillance scandal, the transatlantic privacy divide has come back to the fore. In the United States, the right to privacy is primarily understood as a right to physical privacy, thus the protection from unwarranted government searches and seizures. In Germany on the other hand, it is also understood as a right to spiritual privacy, thus the right of citizens to develop into autonomous moral agents. The following article will discuss the different constitutional assumptions that underlie American and German attitudes towards privacy, namely privacy as an aspect of liberty or as an aspect of dignity. As data flows defy jurisdictional boundaries, however, policymakers across the Atlantic are faced with a conundrum: how can German and American privacy cultures be reconciled?

  14. Friends, Connections, and Social Norms of Privacy. Do Social Network Sites Change Our Conception of Friendship?

    NARCIS (Netherlands)

    Roessler, B.

    2013-01-01

    Technological changes have always had an influence on human relationships in general, as well as more particularly on social norms of privacy - think only of Georg Simmel's observations on changing norms of privacy after the invention of the metropolitan subway and its influence on our behaviour

  15. Toward Privacy-Preserving Personalized Recommendation Services

    Directory of Open Access Journals (Sweden)

    Cong Wang

    2018-02-01

    Recommendation systems are crucially important for the delivery of personalized services to users. With personalized recommendation services, users can enjoy a variety of targeted recommendations such as movies, books, ads, restaurants, and more. In addition, personalized recommendation services have become extremely effective revenue drivers for online business. Despite the great benefits, deploying personalized recommendation services typically requires the collection of users’ personal data for processing and analytics, which undesirably makes users susceptible to serious privacy violation issues. Therefore, it is of paramount importance to develop practical privacy-preserving techniques to maintain the intelligence of personalized recommendation services while respecting user privacy. In this paper, we provide a comprehensive survey of the literature related to personalized recommendation services with privacy protection. We present the general architecture of personalized recommendation systems, the privacy issues therein, and existing works that focus on privacy-preserving personalized recommendation services. We classify the existing works according to their underlying techniques for personalized recommendation and privacy protection, and thoroughly discuss and compare their merits and demerits, especially in terms of privacy and recommendation accuracy. We also identify some future research directions. Keywords: Privacy protection, Personalized recommendation services, Targeted delivery, Collaborative filtering, Machine learning

  16. Towards Privacy Managment of Information Systems

    OpenAIRE

    Drageide, Vidar

    2009-01-01

    This master's thesis provides insight into the concept of privacy. It argues why privacy is important, and why developers and system owners should keep privacy in mind when developing and maintaining systems containing personal information. Following this, a strategy for evaluating the overall level of privacy in a system is defined. The strategy is then applied to parts of the cellphone system in an attempt to evaluate the privacy of traffic and location data in this system.

  17. Adding query privacy to robust DHTs

    DEFF Research Database (Denmark)

    Backes, Michael; Goldberg, Ian; Kate, Aniket

    2012-01-01

    intermediate peers that (help to) route the queries towards their destinations. In this paper, we satisfy this requirement by presenting an approach for providing privacy for the keys in DHT queries. We use the concept of oblivious transfer (OT) in communication over DHTs to preserve query privacy without...... privacy over robust DHTs. Finally, we compare the performance of our privacy-preserving protocols with their more privacy-invasive counterparts. We observe that there is no increase in the message complexity...

  18. Privacy in an Ambient World

    NARCIS (Netherlands)

    Dekker, M.A.C.; Etalle, Sandro; den Hartog, Jeremy

    Privacy is a prime concern in today's information society. To protect the privacy of individuals, enterprises must follow certain privacy practices, while collecting or processing personal data. In this chapter we look at the setting where an enterprise collects private data on its website,

  19. Information Privacy Revealed

    Science.gov (United States)

    Lavagnino, Merri Beth

    2013-01-01

    Why is Information Privacy the focus of the January-February 2013 issue of "EDUCAUSE Review" and "EDUCAUSE Review Online"? Results from the 2012 annual survey of the International Association of Privacy Professionals (IAPP) indicate that "meeting regulatory compliance requirements continues to be the top perceived driver…

  20. A Survey of Privacy on Data Integration

    OpenAIRE

    Do Son, Thanh

    2015-01-01

    This survey is an integrated view of other surveys on privacy preserving for data integration. First, we review the database context and challenges and research questions. Second, we formulate the privacy problems for schema matching and data matching. Next, we introduce the elements of privacy models. Then, we summarize the existing privacy techniques and the analysis (proofs) of privacy guarantees. Finally, we describe the privacy frameworks and their applications.

  1. Privacy in social networking sites

    OpenAIRE

    Λεονάρδος, Γεώργιος; Leonardos, Giorgos

    2016-01-01

    The purpose of this study is to explore the aspects of privacy over the use of social network web sites. More specifically, we will show the types of social networks, their privacy mechanisms, which differ from one social network site to another, and the privacy options that are offered to users. We will report some serious privacy violation incidents involving the most popular social network sites such as Facebook, Twitter, and LinkedIn. Also, we will report some important surveys about social networks and pr...

  2. Methodology for eliciting, encoding and simulating human decision making behaviour

    OpenAIRE

    Rider, Conrad Edgar Scott

    2012-01-01

    Agent-based models (ABM) are an increasingly important research tool for describing and predicting interactions among humans and their environment. A key challenge for such models is the ability to faithfully represent human decision making with respect to observed behaviour. This thesis aims to address this challenge by developing a methodology for empirical measurement and simulation of decision making in human-environment systems. The methodology employs the Beliefs-Desires-I...

  3. Location-Related Privacy in Geo-Social Networks

    DEFF Research Database (Denmark)

    Ruiz Vicente, Carmen; Freni, Dario; Bettini, Claudio

    2011-01-01

    -ins." However, this ability to reveal users' locations causes new privacy threats, which in turn call for new privacy-protection methods. The authors study four privacy aspects central to these social networks - location, absence, co-location, and identity privacy - and describe possible means of protecting...... privacy in these circumstances....

  4. PRIVACY AS A CULTURAL PHENOMENON

    Directory of Open Access Journals (Sweden)

    Garfield Benjamin

    2017-07-01

    Full Text Available Privacy remains both contentious and ever more pertinent in contemporary society. Yet it persists as an ill-defined term, not only within specific fields but in its various uses and implications between and across technical, legal and political contexts. This article offers a new critical review of the history of privacy in terms of two dominant strands of thinking: freedom and property. These two conceptions of privacy can be seen as successive historical epochs brought together under digital technologies, yielding increasingly complex socio-technical dilemmas. By simplifying the taxonomy to its socio-cultural function, the article provides a generalisable, interdisciplinary approach to privacy. Drawing on new technologies, historical trends, sociological studies and political philosophy, the article presents a discussion of the value of privacy as a term, before proposing a defense of the term cyber security as a mode of scalable cognitive privacy that integrates the relative needs of individuals, governments and corporations.

  5. Personal-Data Disclosure in a Field Experiment: Evidence on Explicit Prices, Political Attitudes, and Privacy Preferences

    Directory of Open Access Journals (Sweden)

    Joachim Plesch

    2018-05-01

    Full Text Available Many people implicitly sell or give away their data when using online services and participating in loyalty programmes—despite growing concerns about companies’ use of private data. Our paper studies potential reasons and co-variates that contribute to resolving this apparent paradox, which has not been studied previously. We ask customers of a bakery delivery service for their consent to disclose their personal data to a third party in exchange for a monetary rebate on their past orders. We study the role of implicitly and explicitly stated prices and add new determinants such as political orientation, income proxies and membership in loyalty programmes to the analysis of privacy decisions. We document large heterogeneity in privacy valuations, and that the offered monetary benefits have less predictive power for data-disclosure decisions than expected. However, we find significant predictors of such decisions, such as political orientation towards the liberal democrats (FDP) and membership in loyalty programmes. We also find suggestive evidence that loyalty programmes are successful in disguising their “money for data” exchange mechanism.

  6. 77 FR 31371 - Public Workshop: Privacy Compliance Workshop

    Science.gov (United States)

    2012-05-25

    ... presentations, including the privacy compliance fundamentals, privacy and data security, and the privacy... DEPARTMENT OF HOMELAND SECURITY Office of the Secretary Public Workshop: Privacy Compliance... Homeland Security Privacy Office will host a public workshop, ``Privacy Compliance Workshop.'' DATES: The...

  7. Why should we respect the privacy of donors of biological material?

    Science.gov (United States)

    Tännsjö, Torbjörn

    2011-02-01

    Why should we respect the privacy of donors of biological material? The question is answered in the present article in general philosophical terms from the point of view of an ethics of honour, a libertarian theory of rights, a view of respect for privacy based on the idea that autonomy is of value in itself, and utilitarianism respectively. For different reasons the ethics of honour and the idea of the value of autonomy are set to one side. It surfaces that the moral rights theory and utilitarianism present conflicting answers to the question. The main thrust of the argument is that there is no way of finding an overlapping consensus, so politicians have to take decisions that are bound to be controversial in that they can be questioned on reasonable philosophical grounds.

  8. Privacy Expectations in Online Contexts

    Science.gov (United States)

    Pure, Rebekah Abigail

    2013-01-01

    Advances in digital networked communication technology over the last two decades have brought the issue of personal privacy into sharper focus within contemporary public discourse. In this dissertation, I explain the Fourth Amendment and the role that privacy expectations play in the constitutional protection of personal privacy generally, and…

  9. Second thoughts about privacy, safety and deception

    Science.gov (United States)

    Sorell, Tom; Draper, Heather

    2017-07-01

    In this paper, we point out some difficulties with interpreting three of five principles formulated at a retreat on robot ethics sponsored by the Arts and Humanities Research Council and the Engineering and Physical Sciences Research Council. We also attempt to iron out some conflicts between the principles. Some of the difficulties arise from the way that the autonomy of robot users - their capacity to live by their own choices - can be a goal in the design of care robots. We discuss (a) problems for Principle 2 that arise from competing legal and philosophical understandings of privacy; (b) a tension between privacy and safety (Principles 2 and 3); and (c) some scepticism about the application of Principle 4, which addresses robot design that might result in the deception of vulnerable users.

  10. Online Tracking Technologies and Web Privacy

    OpenAIRE

    Acar, Mustafa Gunes Can

    2017-01-01

    In my PhD thesis, I would like to study the problem of online privacy with a focus on Web and mobile applications. Key research questions to be addressed by my study are the following: How can we formalize and quantify web tracking? What are the threats presented against privacy by different tracking techniques such as browser fingerprinting and cookie based tracking? What kind of privacy enhancing technologies (PET) can be used to ensure privacy without degrading service quality? The stud...

  11. 39 CFR 262.5 - Systems (Privacy).

    Science.gov (United States)

    2010-07-01

    ... 39 Postal Service 1 2010-07-01 2010-07-01 false Systems (Privacy). 262.5 Section 262.5 Postal... DEFINITIONS § 262.5 Systems (Privacy). (a) Privacy Act system of records. A Postal Service system containing... individual. (c) Computer matching program. A “matching program,” as defined in the Privacy Act, 5 U.S.C. 552a...

  12. Extending SQL to Support Privacy Policies

    Science.gov (United States)

    Ghazinour, Kambiz; Pun, Sampson; Majedi, Maryam; Chinaci, Amir H.; Barker, Ken

    Increasing concerns over Internet applications that violate user privacy by exploiting (back-end) database vulnerabilities must be addressed to protect both customer privacy and to ensure corporate strategic assets remain trustworthy. This chapter describes an extension onto database catalogues and Structured Query Language (SQL) for supporting privacy in Internet applications, such as in social networks, e-health, e-government, etc. The idea is to introduce new predicates to SQL commands to capture common privacy requirements, such as purpose, visibility, generalization, and retention for both mandatory and discretionary access control policies. The contribution is that corporations, when creating the underlying databases, will be able to define what their mandatory privacy policies are with which all application users have to comply. Furthermore, each application user, when providing their own data, will be able to define their own privacy policies with which other users have to comply. The extension is supported with underlying catalogues and algorithms. The experiments demonstrate a very reasonable overhead for the extension. The result is a low-cost mechanism to create new systems that are privacy aware and also to transform legacy databases to their privacy-preserving equivalents. Although the examples are from social networks, one can apply the results to data security and user privacy of other enterprises as well.
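
    To make the flavour of such purpose/visibility/retention predicates concrete, here is a minimal Python sketch of a policy check that could gate a query before it reaches the database. It is not the chapter's actual SQL extension; the policy fields, function names, and the example query are illustrative assumptions.

```python
# Hypothetical sketch of privacy-predicate enforcement before a query is released.
# Not the chapter's SQL syntax; field names and the example query are illustrative.
from dataclasses import dataclass
from datetime import date, timedelta


@dataclass
class PrivacyPolicy:
    purposes: set        # purposes the data may be used for, e.g. {"billing"}
    visibility: set      # roles allowed to see the data
    retention_days: int  # how long the data may be kept after collection


def may_execute(policy: PrivacyPolicy, purpose: str, role: str, collected: date) -> bool:
    """Release the query only if the purpose, visibility and retention predicates all hold."""
    within_retention = date.today() <= collected + timedelta(days=policy.retention_days)
    return purpose in policy.purposes and role in policy.visibility and within_retention


policy = PrivacyPolicy(purposes={"billing"}, visibility={"owner", "billing_dept"}, retention_days=365)
collected_on = date.today() - timedelta(days=30)

if may_execute(policy, purpose="billing", role="billing_dept", collected=collected_on):
    print("SELECT address FROM customers WHERE id = 42;")  # query passed on to the database
else:
    print("query rejected by privacy policy")
```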

  13. Is Electronic Privacy Achievable?

    National Research Council Canada - National Science Library

    Irvine, Cynthia E; Levin, Timothy E

    2000-01-01

    ... individuals. The purpose of this panel was to focus on how new technologies are affecting privacy. Technologies that might adversely affect privacy were identified by Rein Turn at previous symposia...

  14. Environmental psychology, Privacy in the workplace (analysis of ...

    African Journals Online (AJOL)

    The work environment in an organization is a perennially important issue, and the importance of creating a conducive environment for efficient human resources is consistently emphasized. An environment with a proper definition of privacy and jurisdiction can increase the growth and productivity of ...

  15. Privacy driven internet ecosystem

    OpenAIRE

    Trinh, Tuan Anh; Gyarmati, Laszlo

    2012-01-01

    The dominant business model of today's Internet is built upon advertisements; users can access Internet services while the providers show ads to them. Although significant efforts have been made to model and analyze the economic aspects of this ecosystem, the heart of the current status quo, namely privacy, has not received the attention of the research community yet. Accordingly, we propose an economic model of the privacy driven Internet ecosystem where privacy is handled as an asset that c...

  16. Adding Query Privacy to Robust DHTs

    DEFF Research Database (Denmark)

    Backes, Michael; Goldberg, Ian; Kate, Aniket

    2011-01-01

    intermediate peers that (help to) route the queries towards their destinations. In this paper, we satisfy this requirement by presenting an approach for providing privacy for the keys in DHT queries. We use the concept of oblivious transfer (OT) in communication over DHTs to preserve query privacy without...... of obtaining query privacy over robust DHTs. Finally, we compare the performance of our privacy-preserving protocols with their more privacy-invasive counterparts. We observe that there is no increase in the message complexity and only a small overhead in the computational complexity....

  17. Data Fusion Research of Triaxial Human Body Motion Gesture based on Decision Tree

    Directory of Open Access Journals (Sweden)

    Feihong Zhou

    2014-05-01

    Full Text Available The development status of human body motion gesture data fusion, both domestically and overseas, is analyzed. A triaxial accelerometer is adopted to develop a wearable human body motion gesture monitoring system aimed at healthcare for the elderly. After a brief introduction to the decision tree algorithm, the WEKA workbench is adopted to generate a human body motion gesture decision tree. Finally, the classification quality of the decision tree is validated through experiments. The experimental results show that the decision tree algorithm can reach an average prediction accuracy of 97.5% with low time cost.
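
    As a rough illustration of the kind of classifier the record describes, the sketch below trains a decision tree on synthetic triaxial-accelerometer features. The record used WEKA; this example assumes scikit-learn instead, and the posture classes, feature values, and resulting accuracy are purely illustrative.

```python
# Decision-tree classification of synthetic triaxial accelerometer features.
# Illustrative only: postures, feature distributions, and accuracy are made up.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)


def make_samples(n, mean_xyz, label):
    """Simulate mean x/y/z acceleration for one posture class."""
    features = rng.normal(loc=mean_xyz, scale=0.3, size=(n, 3))
    return features, np.full(n, label)

# Three illustrative postures with different gravity/motion profiles.
X1, y1 = make_samples(200, [0.0, 0.0, 1.0], 0)   # standing
X2, y2 = make_samples(200, [1.0, 0.0, 0.0], 1)   # lying
X3, y3 = make_samples(200, [0.2, 0.2, 0.9], 2)   # walking

X = np.vstack([X1, X2, X3])
y = np.concatenate([y1, y2, y3])
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_train, y_train)
print(f"test accuracy: {tree.score(X_test, y_test):.3f}")
```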

  18. Security and privacy issues in implantable medical devices: A comprehensive survey.

    Science.gov (United States)

    Camara, Carmen; Peris-Lopez, Pedro; Tapiador, Juan E

    2015-06-01

    Bioengineering is an expanding field. New technologies are appearing to provide a more efficient treatment of diseases or human deficiencies. Implantable Medical Devices (IMDs) constitute one example, these being devices with more computing, decision making and communication capabilities. Several research works in the computer security field have identified serious security and privacy risks in IMDs that could compromise the implant and even the health of the patient who carries it. This article surveys the main security goals for the next generation of IMDs and analyzes the most relevant protection mechanisms proposed so far. On the one hand, the security proposals must take into consideration the inherent constraints of these small and implanted devices: energy, storage and computing power. On the other hand, proposed solutions must achieve an adequate balance between the safety of the patient and the security level offered, with the battery lifetime being another critical parameter in the design phase. Copyright © 2015 Elsevier Inc. All rights reserved.

  19. Genetic secrets: Protecting privacy and confidentiality in the genetic era. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Rothstein, M.A. [ed.

    1998-09-01

    Few developments are likely to affect human beings more profoundly in the long run than the discoveries resulting from advances in modern genetics. Although the developments in genetic technology promise to provide many additional benefits, their application to genetic screening poses ethical, social, and legal questions, many of which are rooted in issues of privacy and confidentiality. The ethical, practical, and legal ramifications of these and related questions are explored in depth. The broad range of topics includes: the privacy and confidentiality of genetic information; the challenges to privacy and confidentiality that may be projected to result from the emerging genetic technologies; the role of informed consent in protecting the confidentiality of genetic information in the clinical setting; the potential uses of genetic information by third parties; the implications of changes in the health care delivery system for privacy and confidentiality; relevant national and international developments in public policies, professional standards, and laws; recommendations; and the identification of research needs.

  20. When Differential Privacy Meets Randomized Perturbation: A Hybrid Approach for Privacy-Preserving Recommender System

    KAUST Repository

    Liu, Xiao

    2017-03-21

    Privacy risks of recommender systems have attracted increasing attention. Users’ private data is often collected by a possibly untrusted recommender system in order to provide high-quality recommendations. Meanwhile, malicious attackers may utilize recommendation results to make inferences about other users’ private data. Existing approaches focus either on keeping users’ private data protected during recommendation computation or on preventing the inference of any single user’s data from the recommendation result. However, none is designed for both hiding users’ private data and preventing privacy inference. To achieve this goal, we propose in this paper a hybrid approach for privacy-preserving recommender systems by combining differential privacy (DP) with randomized perturbation (RP). We theoretically show that the noise added by RP has limited effect on recommendation accuracy and that the noise added by DP can be well controlled based on the sensitivity analysis of functions on the perturbed data. Extensive experiments on three large-scale real-world datasets show that the hybrid approach generally provides more privacy protection with acceptable recommendation accuracy loss, and surprisingly sometimes achieves better privacy without sacrificing accuracy, thus validating its feasibility in practice.
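
    The sketch below illustrates the general idea of layering the two kinds of noise, under the assumption of local randomized perturbation of ratings plus Laplace noise on a released aggregate. It is not the paper's actual construction; the noise scales and the item-mean aggregate are illustrative choices.

```python
# Toy combination of randomized perturbation (RP) on user ratings with
# differentially private (DP) noise on an aggregate. Illustrative only.
import numpy as np

rng = np.random.default_rng(1)
ratings = rng.integers(1, 6, size=(100, 20)).astype(float)   # 100 users x 20 items, ratings 1..5

# RP step: each user perturbs ratings locally before sending them to the recommender.
perturbed = ratings + rng.normal(0.0, 0.5, size=ratings.shape)

# DP step: the server releases item means with Laplace noise calibrated to sensitivity.
epsilon = 1.0
sensitivity = 4.0 / ratings.shape[0]          # one user can shift an item mean by at most (5-1)/n
dp_item_means = perturbed.mean(axis=0) + rng.laplace(0.0, sensitivity / epsilon, size=ratings.shape[1])

print("true item means      :", np.round(ratings.mean(axis=0)[:5], 2))
print("protected item means :", np.round(dp_item_means[:5], 2))
```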

  1. Reasoning, learning, and creativity: frontal lobe function and human decision-making.

    Directory of Open Access Journals (Sweden)

    Anne Collins

    Full Text Available The frontal lobes subserve decision-making and executive control--that is, the selection and coordination of goal-directed behaviors. Current models of frontal executive function, however, do not explain human decision-making in everyday environments featuring uncertain, changing, and especially open-ended situations. Here, we propose a computational model of human executive function that clarifies this issue. Using behavioral experiments, we show that unlike others, the proposed model predicts human decisions and their variations across individuals in naturalistic situations. The model reveals that for driving action, the human frontal function monitors up to three/four concurrent behavioral strategies and infers online their ability to predict action outcomes: whenever one appears more reliable than unreliable, this strategy is chosen to guide the selection and learning of actions that maximize rewards. Otherwise, a new behavioral strategy is tentatively formed, partly from those stored in long-term memory, then probed, and if competitive confirmed to subsequently drive action. Thus, the human executive function has a monitoring capacity limited to three or four behavioral strategies. This limitation is compensated by the binary structure of executive control that in ambiguous and unknown situations promotes the exploration and creation of new behavioral strategies. The results support a model of human frontal function that integrates reasoning, learning, and creative abilities in the service of decision-making and adaptive behavior.

  2. Reasoning, learning, and creativity: frontal lobe function and human decision-making.

    Science.gov (United States)

    Collins, Anne; Koechlin, Etienne

    2012-01-01

    The frontal lobes subserve decision-making and executive control--that is, the selection and coordination of goal-directed behaviors. Current models of frontal executive function, however, do not explain human decision-making in everyday environments featuring uncertain, changing, and especially open-ended situations. Here, we propose a computational model of human executive function that clarifies this issue. Using behavioral experiments, we show that unlike others, the proposed model predicts human decisions and their variations across individuals in naturalistic situations. The model reveals that for driving action, the human frontal function monitors up to three/four concurrent behavioral strategies and infers online their ability to predict action outcomes: whenever one appears more reliable than unreliable, this strategy is chosen to guide the selection and learning of actions that maximize rewards. Otherwise, a new behavioral strategy is tentatively formed, partly from those stored in long-term memory, then probed, and if competitive confirmed to subsequently drive action. Thus, the human executive function has a monitoring capacity limited to three or four behavioral strategies. This limitation is compensated by the binary structure of executive control that in ambiguous and unknown situations promotes the exploration and creation of new behavioral strategies. The results support a model of human frontal function that integrates reasoning, learning, and creative abilities in the service of decision-making and adaptive behavior.
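
    A toy caricature of the monitoring idea described above might look like the following, where a handful of strategies are tracked by how often they predict outcomes and a new strategy is probed when none is reliable. This is not the authors' computational model; the capacity of three, the reliability threshold, and the update rule are illustrative assumptions.

```python
# Caricature of monitoring a small set of behavioural strategies by their reliability.
# Capacity, threshold, and the outcome process are illustrative assumptions only.
import random

CAPACITY = 3          # at most three concurrently monitored strategies
RELIABLE = 0.6        # a strategy above this estimated reliability drives behaviour

strategies = [{"name": "s0", "hits": 1, "trials": 2}]   # start with one tentative strategy


def reliability(s):
    return s["hits"] / s["trials"]


for trial in range(50):
    best = max(strategies, key=reliability)
    if reliability(best) >= RELIABLE:
        chosen = best                                   # exploit the reliable strategy
    else:
        if len(strategies) < CAPACITY:                  # otherwise probe a new strategy
            strategies.append({"name": f"s{len(strategies)}", "hits": 1, "trials": 2})
        chosen = strategies[-1]
    outcome_predicted = random.random() < 0.7           # stand-in for the environment
    chosen["hits"] += outcome_predicted
    chosen["trials"] += 1

print({s["name"]: round(reliability(s), 2) for s in strategies})
```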

  3. The Impact of Privacy Concerns and Perceived Vulnerability to Risks on Users Privacy Protection Behaviors on SNS: A Structural Equation Model

    OpenAIRE

    Noora Sami Al-Saqer; Mohamed E. Seliaman

    2016-01-01

    This research paper investigates Saudi users’ awareness levels about privacy policies in Social Networking Sites (SNSs), their privacy concerns and their privacy protection measures. For this purpose, a research model that consists of five main constructs namely information privacy concern, awareness level of privacy policies of social networking sites, perceived vulnerability to privacy risks, perceived response efficacy, and privacy protecting behavior was developed. An online survey questi...

  4. Regulating Online Data Privacy

    OpenAIRE

    Paul Reid

    2004-01-01

    With existing data protection laws proving inadequate in the fight to protect online data privacy and with the offline law of privacy in a state of change and uncertainty, the search for an alternative solution to the important problem of online data privacy should commence. With the inherent problem of jurisdiction that the Internet presents, such a solution is best coming from a multi-national body with the power to approximate laws in as many jurisdictions as possible, with a recognised au...

  5. Neural mechanisms underlying human consensus decision-making.

    Science.gov (United States)

    Suzuki, Shinsuke; Adachi, Ryo; Dunne, Simon; Bossaerts, Peter; O'Doherty, John P

    2015-04-22

    Consensus building in a group is a hallmark of animal societies, yet little is known about its underlying computational and neural mechanisms. Here, we applied a computational framework to behavioral and fMRI data from human participants performing a consensus decision-making task with up to five other participants. We found that participants reached consensus decisions through integrating their own preferences with information about the majority group members' prior choices, as well as inferences about how much each option was stuck to by the other people. These distinct decision variables were separately encoded in distinct brain areas-the ventromedial prefrontal cortex, posterior superior temporal sulcus/temporoparietal junction, and intraparietal sulcus-and were integrated in the dorsal anterior cingulate cortex. Our findings provide support for a theoretical account in which collective decisions are made through integrating multiple types of inference about oneself, others, and environments, processed in distinct brain modules. Copyright © 2015 Elsevier Inc. All rights reserved.

  6. Customer privacy on UK healthcare websites.

    Science.gov (United States)

    Mundy, Darren P

    2006-09-01

    Privacy has been and continues to be one of the key challenges of an age devoted to the accumulation, processing, and mining of electronic information. In particular, privacy of healthcare-related information is seen as a key issue as health organizations move towards the electronic provision of services. The aim of the research detailed in this paper has been to analyse privacy policies on popular UK healthcare-related websites to determine the extent to which consumer privacy is protected. The author has combined approaches (such as approaches focused on usability, policy content, and policy quality) used in studies by other researchers on e-commerce and US healthcare websites to provide a comprehensive analysis of UK healthcare privacy policies. The author identifies a wide range of issues related to the protection of consumer privacy through his research analysis using quantitative results. The main outcomes from the author's research are that only 61% of healthcare-related websites in their sample group posted privacy policies. In addition, most of the posted privacy policies had poor readability standards and included a variety of privacy vulnerability statements. Overall, the author's findings represent significant current issues in relation to healthcare information protection on the Internet. The hope is that raising awareness of these results will drive forward changes in the industry, similar to those experienced with information quality.

  7. Lattice Based Mix Network for Location Privacy in Mobile System

    Directory of Open Access Journals (Sweden)

    Kunwar Singh

    2015-01-01

    Full Text Available In 1981, David Chaum proposed a cryptographic primitive for privacy called the mix network (mixnet). A mixnet is a cryptographic construction that establishes an anonymous communication channel through a set of servers. In 2004, Golle et al. proposed a new cryptographic primitive called universal reencryption, which takes as input messages encrypted under the public keys of the recipients rather than the public key of the universal mixnet. In Eurocrypt 2010, Gentry, Halevi, and Vaikuntanathan presented a cryptosystem which is additively homomorphic and multiplicatively homomorphic for only one multiplication. In MIST 2013, Singh et al. presented a lattice based universal reencryption scheme under the learning with errors (LWE) assumption. In this paper, we have improved Singh et al.’s scheme using Fairbrother’s idea. LWE is a hard lattice problem for which, to date, no polynomial time quantum algorithm is known. Wiangsripanawan et al. proposed a protocol for location privacy in mobile systems using universal reencryption, whose security is reducible to the Decisional Diffie-Hellman assumption. Once quantum computers become a reality, universal reencryption can be broken in polynomial time by Shor’s algorithm. In post-quantum cryptography, our scheme can replace the universal reencryption scheme used in Wiangsripanawan et al.’s protocol for location privacy in mobile systems.
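
    To make the LWE primitive underlying such lattice constructions concrete, here is a toy single-bit LWE encryption in Python. The parameters are far too small for real security, and this is not the paper's universal reencryption scheme; it only shows why decryption tolerates the small error terms.

```python
# Toy single-bit LWE encryption; parameters are illustrative and insecure.
import numpy as np

rng = np.random.default_rng(7)
n, q, m = 16, 8192, 32                      # dimension, modulus, number of LWE samples

s = rng.integers(0, q, size=n)              # secret key
A = rng.integers(0, q, size=(m, n))         # public random matrix
e = rng.integers(-1, 2, size=m)             # small error terms
b = (A @ s + e) % q                         # public key: (A, b)


def encrypt(bit):
    subset = rng.integers(0, 2, size=m)     # random 0/1 combination of the samples
    c1 = (subset @ A) % q
    c2 = (subset @ b + bit * (q // 2)) % q
    return c1, c2


def decrypt(c1, c2):
    centred = (c2 - c1 @ s) % q             # ≈ bit*(q/2) plus a small accumulated error
    return int(q // 4 < centred < 3 * q // 4)


for bit in (0, 1):
    assert decrypt(*encrypt(bit)) == bit
print("toy LWE round-trip succeeded")
```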

  8. Big data privacy protection model based on multi-level trusted system

    Science.gov (United States)

    Zhang, Nan; Liu, Zehua; Han, Hongfeng

    2018-05-01

    This paper introduces and builds on the multi-level trusted system model, which counters Trojan attacks by encrypting users' private data and enforcing the principle: do not read data of a higher priority level, and do not write to a level of lower priority. This ensures that a privacy leak of low-priority data does not lead to the disclosure of high-priority data. The inherited multi-level trusted system model is divided into seven risk levels. Priority levels 1-7 represent user data of increasing privacy value and are matched to seven encryption algorithms of differing execution efficiency: the higher the priority, the greater the privacy value of the data, and the stronger (but less efficient) the encryption algorithm chosen to ensure data security. For enterprises, the price point is determined by how long users keep data on a unit of equipment; the higher the risk subgroup's algorithm, the longer the encryption time. The model assumes that users prefer lower-priority encryption algorithms to preserve efficiency. This paper proposes a privacy cost model for each of the seven risk subgroups: the higher the privacy cost, the higher the priority of the risk subgroup, and the higher the price the user needs to pay to ensure the privacy of the data. Furthermore, by combining an existing economic pricing model with the human traffic model proposed in this paper, and letting prices fluctuate with market demand, the paper raises the unit price when market demand is low; when market demand increases, the enterprise's profit is guaranteed, under government guidance, by reducing the price per unit of product. The dynamic factors of consumers' mood and age are then introduced for optimization. At the same time, seven algorithms are selected from symmetric and asymmetric encryption algorithms to define the enterprise
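
    A minimal sketch of the level-to-algorithm mapping might look as follows; the algorithm names, relative costs, and the privacy-cost formula are placeholders rather than the paper's concrete choices.

```python
# Illustrative mapping from a data item's privacy value (1..7) to a protection plan.
# Algorithm names and cost figures are placeholders, not the paper's selections.
LEVELS = {
    1: {"algorithm": "XOR stream (demo only)",          "relative_cost": 1},
    2: {"algorithm": "lightweight stream cipher",       "relative_cost": 2},
    3: {"algorithm": "3DES",                            "relative_cost": 4},
    4: {"algorithm": "AES-128",                         "relative_cost": 5},
    5: {"algorithm": "AES-256",                         "relative_cost": 6},
    6: {"algorithm": "AES-256 + RSA-2048 key wrap",     "relative_cost": 9},
    7: {"algorithm": "AES-256 + RSA-4096 key wrap",     "relative_cost": 12},
}


def protect(privacy_value: int) -> dict:
    """Choose a stronger, slower, costlier algorithm as the data's privacy value rises."""
    level = max(1, min(7, privacy_value))
    plan = LEVELS[level]
    # Naive privacy-cost proxy: cost grows with the chosen algorithm's overhead.
    return {"level": level, **plan, "privacy_cost": plan["relative_cost"] * 10}


print(protect(2))   # low-value data: cheap, fast protection
print(protect(7))   # high-value data: expensive, strong protection
```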

  9. Exercising privacy rights in medical science

    OpenAIRE

    Hillmer, Michael; Redelmeier, Donald A.

    2007-01-01

    Privacy laws are intended to preserve human well-being and improve medical outcomes. We used the Sportstats website, a repository of competitive athletic data, to test how easily these laws can be circumvented. We designed a haphazard, unrepresentative case-series analysis and applied unscientific methods based on an Internet connection and idle time. We found it both feasible and titillating to breach anonymity, stockpile personal information and generate misquotations. We extended our metho...

  10. Redefining Genomic Privacy: Trust and Empowerment

    OpenAIRE

    Erlich, Yaniv; Williams, James B.; Glazer, David; Yocum, Kenneth; Farahany, Nita; Olson, Maynard; Narayanan, Arvind; Stein, Lincoln D.; Witkowski, Jan A.; Kain, Robert C.

    2014-01-01

    Fulfilling the promise of the genetic revolution requires the analysis of large datasets containing information from thousands to millions of participants. However, sharing human genomic data requires protecting subjects from potential harm. Current models rely on de-identification techniques in which privacy versus data utility becomes a zero-sum game. Instead, we propose the use of trust-enabling techniques to create a solution in which researchers and participants both win. To do so we int...

  11. Information privacy fundamentals for librarians and information professionals

    CERN Document Server

    Givens, Cherie L

    2014-01-01

    This book introduces library and information professionals to information privacy, provides an overview of information privacy in the library and information science context, U.S. privacy laws by sector, information privacy policy, and key considerations when planning and creating a privacy program.

  12. Privacy-invading technologies : safeguarding privacy, liberty & security in the 21st century

    NARCIS (Netherlands)

    Klitou, Demetrius

    2012-01-01

    With a focus on the growing development and deployment of the latest technologies that threaten privacy, the PhD dissertation argues that the US and UK legal frameworks, in their present form, are inadequate to defend privacy and other civil liberties against the intrusive capabilities of body

  13. Opioid Modulation of Value-Based Decision-Making in Healthy Humans.

    Science.gov (United States)

    Eikemo, Marie; Biele, Guido; Willoch, Frode; Thomsen, Lotte; Leknes, Siri

    2017-08-01

    Modifying behavior to maximize reward is integral to adaptive decision-making. In rodents, the μ-opioid receptor (MOR) system encodes motivation and preference for high-value rewards. Yet it remains unclear whether and how human MORs contribute to value-based decision-making. We reasoned that if the human MOR system modulates value-based choice, this would be reflected by opposite effects of agonist and antagonist drugs. In a double-blind pharmacological cross-over study, 30 healthy men received morphine (10 mg), placebo, and the opioid antagonist naltrexone (50 mg). They completed a two-alternative decision-making task known to induce a considerable bias towards the most frequently rewarded response option. To quantify MOR involvement in this bias, we fitted accuracy and reaction time data with the drift-diffusion model (DDM) of decision-making. The DDM analysis revealed the expected bidirectional drug effects for two decision subprocesses. MOR stimulation with morphine increased the preference for the stimulus with high-reward probability (shift in starting point). Compared to placebo, morphine also increased, and naltrexone reduced, the efficiency of evidence accumulation. Since neither drug affected motor-coordination, speed-accuracy trade-off, or subjective state (indeed participants were still blinded after the third session), we interpret the MOR effects on evidence accumulation efficiency as a consequence of changes in effort exerted in the task. Together, these findings support a role for the human MOR system in value-based choice by tuning decision-making towards high-value rewards across stimulus domains.
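
    The drift-diffusion account can be made concrete with a small simulation in which the starting point (bias toward the frequently rewarded option) and the drift rate (efficiency of evidence accumulation) are varied; the parameter values below are illustrative, not the study's fitted estimates.

```python
# Minimal drift-diffusion simulation: starting-point bias and drift rate are varied.
# Parameter values are illustrative, not fitted estimates from the study.
import numpy as np

rng = np.random.default_rng(3)


def simulate_ddm(drift, start_bias, n_trials=1000, threshold=1.0, dt=0.001, noise=1.0):
    """Proportion of trials hitting the upper boundary (the high-reward-probability option)."""
    upper_hits = 0
    for _ in range(n_trials):
        x = start_bias * threshold                 # biased starting point between 0 and threshold
        while 0.0 < x < threshold:                 # accumulate noisy evidence until a boundary is hit
            x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        upper_hits += x >= threshold
    return upper_hits / n_trials


print("placebo-like    :", simulate_ddm(drift=0.8, start_bias=0.55))
print("morphine-like   :", simulate_ddm(drift=1.0, start_bias=0.65))   # stronger bias, faster accumulation
print("naltrexone-like :", simulate_ddm(drift=0.6, start_bias=0.55))   # less efficient accumulation
```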

  14. Privacy in Social Networks

    CERN Document Server

    Zheleva, Elena

    2012-01-01

    This synthesis lecture provides a survey of work on privacy in online social networks (OSNs). This work encompasses concerns of users as well as service providers and third parties. Our goal is to approach such concerns from a computer-science perspective, and building upon existing work on privacy, security, statistical modeling and databases to provide an overview of the technical and algorithmic issues related to privacy in OSNs. We start our survey by introducing a simple OSN data model and describe common statistical-inference techniques that can be used to infer potentially sensitive inf

  15. SIED, a Data Privacy Engineering Framework

    OpenAIRE

    Mivule, Kato

    2013-01-01

    While a number of data privacy techniques have been proposed in the recent years, a few frameworks have been suggested for the implementation of the data privacy process. Most of the proposed approaches are tailored towards implementing a specific data privacy algorithm but not the overall data privacy engineering and design process. Therefore, as a contribution, this study proposes SIED (Specification, Implementation, Evaluation, and Dissemination), a conceptual framework that takes a holist...

  16. User Privacy in RFID Networks

    Science.gov (United States)

    Singelée, Dave; Seys, Stefaan

    Wireless RFID networks are getting deployed at a rapid pace and have already entered the public space on a massive scale: public transport cards, the biometric passport, office ID tokens, customer loyalty cards, etc. Although RFID technology offers interesting services to customers and retailers, it could also endanger the privacy of the end-users. The lack of protection mechanisms being deployed could potentially result in a privacy leakage of personal data. Furthermore, there is the emerging threat of location privacy. In this paper, we will show some practical attack scenarios and illustrate some of them with cases that have received press coverage. We will present the main challenges of enhancing privacy in RFID networks and evaluate some solutions proposed in the literature. The main advantages and shortcomings will be briefly discussed. Finally, we will give an overview of some academic and industrial research initiatives on RFID privacy.

  17. Comparison of two speech privacy measurements, articulation index (AI) and speech privacy noise isolation class (NIC'), in open workplaces

    Science.gov (United States)

    Yoon, Heakyung C.; Loftness, Vivian

    2002-05-01

    Lack of speech privacy has been reported to be the main dissatisfaction among occupants in open workplaces, according to workplace surveys. Two speech privacy measurements, Articulation Index (AI), standardized by the American National Standards Institute in 1969, and Speech Privacy Noise Isolation Class (NIC', Noise Isolation Class Prime), adapted from Noise Isolation Class (NIC) by U. S. General Services Administration (GSA) in 1979, have been claimed as objective tools to measure speech privacy in open offices. To evaluate which of them, normal privacy for AI or satisfied privacy for NIC', is a better tool in terms of speech privacy in a dynamic open office environment, measurements were taken in the field. AIs and NIC's in the different partition heights and workplace configurations have been measured following ASTM E1130 (Standard Test Method for Objective Measurement of Speech Privacy in Open Offices Using Articulation Index) and GSA test PBS-C.1 (Method for the Direct Measurement of Speech-Privacy Potential (SPP) Based on Subjective Judgments) and PBS-C.2 (Public Building Service Standard Method of Test Method for the Sufficient Verification of Speech-Privacy Potential (SPP) Based on Objective Measurements Including Methods for the Rating of Functional Interzone Attenuation and NC-Background), respectively.
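
    The flavour of an Articulation-Index-style calculation can be sketched as a clipped, importance-weighted sum of per-band speech-to-noise differences; the band levels, weights, and the privacy cut-off below are illustrative and do not reproduce the ANSI S3.5-1969 tables or the ASTM E1130 procedure.

```python
# Simplified AI-style figure: clipped band SNRs combined with importance weights.
# Band levels, weights, and the 0.15 cut-off are illustrative assumptions only.
def articulation_index(speech_db, noise_db, weights):
    """Roughly 0 = unintelligible intruding speech (good privacy), 1 = fully intelligible."""
    ai = 0.0
    for s, n, w in zip(speech_db, noise_db, weights):
        snr = min(max(s - n, 0.0), 30.0)   # clip each band's speech-to-noise difference to 0..30 dB
        ai += w * (snr / 30.0)             # weight by that band's assumed importance for speech
    return ai


speech = [52, 55, 53, 48, 44]              # intruding-speech level per band (dB), illustrative
noise = [46, 44, 43, 45, 47]               # background-noise level per band (dB), illustrative
weights = [0.15, 0.25, 0.30, 0.20, 0.10]   # illustrative band-importance weights (sum to 1)

ai = articulation_index(speech, noise, weights)
verdict = "speech privacy likely poor" if ai > 0.15 else "speech privacy likely acceptable"
print(f"AI = {ai:.2f} -> {verdict}")
```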

  18. An Alternative View of Privacy on Facebook

    Directory of Open Access Journals (Sweden)

    Christian Fuchs

    2011-02-01

    Full Text Available The predominant analysis of privacy on Facebook focuses on personal information revelation. This paper is critical of this kind of research and introduces an alternative analytical framework for studying privacy on Facebook, social networking sites and web 2.0. This framework is connecting the phenomenon of online privacy to the political economy of capitalism—a focus that has thus far been rather neglected in research literature about Internet and web 2.0 privacy. Liberal privacy philosophy tends to ignore the political economy of privacy in capitalism that can mask socio-economic inequality and protect capital and the rich from public accountability. Facebook is in this paper analyzed with the help of an approach, in which privacy for dominant groups, in regard to the ability of keeping wealth and power secret from the public, is seen as problematic, whereas privacy at the bottom of the power pyramid for consumers and normal citizens is seen as a protection from dominant interests. Facebook’s privacy concept is based on an understanding that stresses self-regulation and on an individualistic understanding of privacy. The theoretical analysis of the political economy of privacy on Facebook in this paper is based on the political theories of Karl Marx, Hannah Arendt and Jürgen Habermas. Based on the political economist Dallas Smythe’s concept of audience commodification, the process of prosumer commodification on Facebook is analyzed. The political economy of privacy on Facebook is analyzed with the help of a theory of drives that is grounded in Herbert Marcuse’s interpretation of Sigmund Freud, which allows to analyze Facebook based on the concept of play labor (= the convergence of play and labor.

  19. An Alternative View of Privacy on Facebook

    OpenAIRE

    Christian Fuchs

    2011-01-01

    The predominant analysis of privacy on Facebook focuses on personal information revelation. This paper is critical of this kind of research and introduces an alternative analytical framework for studying privacy on Facebook, social networking sites and web 2.0. This framework is connecting the phenomenon of online privacy to the political economy of capitalism—a focus that has thus far been rather neglected in research literature about Internet and web 2.0 privacy. Liberal privacy philosophy ...

  20. PriBots: Conversational Privacy with Chatbots

    OpenAIRE

    Harkous, Hamza; Fawaz, Kassem; Shin, Kang G.; Aberer, Karl

    2016-01-01

    Traditional mechanisms for delivering notice and enabling choice have so far failed to protect users’ privacy. Users are continuously frustrated by complex privacy policies, unreachable privacy settings, and a multitude of emerging standards. The miniaturization trend of smart devices and the emergence of the Internet of Things (IoTs) will exacerbate this problem further. In this paper, we propose Conversational Privacy Bots (PriBots) as a new way of delivering notice and choice through a two...

  1. Privacy Protection: Mandating New Arrangements to Implement and Assess Federal Privacy Policy and Practice

    National Research Council Canada - National Science Library

    Relyea, Harold C

    2004-01-01

    When Congress enacted the Privacy Act of 1974, it established a temporary national study commission to conduct a comprehensive assessment of privacy policy and practice in both the public and private...

  2. 24 CFR 3280.107 - Interior privacy.

    Science.gov (United States)

    2010-04-01

    ... 24 Housing and Urban Development 5 2010-04-01 2010-04-01 false Interior privacy. 3280.107 Section 3280.107 Housing and Urban Development Regulations Relating to Housing and Urban Development (Continued... privacy. Bathroom and toilet compartment doors shall be equipped with a privacy lock. ...

  3. Online privacy: overview and preliminary research

    Directory of Open Access Journals (Sweden)

    Renata Mekovec

    2010-12-01

    Full Text Available Over the last decade using the Internet for online shopping, information browsing and searching as well as for online communication has become part of everyday life. Although the Internet technology has a lot of benefits for users, one of the most important disadvantages is related to the increasing capacity for users’ online activity surveillance. However, the users are increasingly becoming aware of online surveillance methods, which results in their increased concern for privacy protection. Numerous factors influence the way in which individuals perceive the level of privacy protection when they are online. This article provides a review of factors that influence the privacy perception of Internet users. Previous online privacy research related to e-business was predominantly focused on the dimension of information privacy and concerned with the way users’ personal information is collected, saved and used by an online company. This article’s main aim is to provide an overview of numerous Internet users’ privacy perception elements across various privacy dimensions as well as their potential categorization. In addition, considering that e-banking and online shopping are one of the most widely used e-services, an examination of online privacy perception of e-banking/online shopping users was performed.

  4. Protection of the Locational Privacy Using Mosaic Theory of Data (Varstvo lokacijske zasebnosti s pomočjo mozaične teorije podatkov

    Directory of Open Access Journals (Sweden)

    Primož Križnar

    2016-12-01

    Full Text Available The individual’s right to privacy is one of the fundamental human rights. Part of this »embedded« right is a person’s capability to move between a variety of different points and locations with a reasonable expectation that the paths taken, stops made and current locations are not systematically recorded and stored for future use. Notwithstanding this, individuals often seem to be unaware of the capabilities of modern technology, which aggressively interferes with a wide spectrum of their privacy, part of which is locational privacy. The latter, as an essential component of privacy, must nevertheless be given the necessary legal protection, which, at least for the time being, is reflected in the implementation of the mosaic theory in European legal traditions with the help of the established privacy-related legal standards of the European Court of Human Rights.

  5. 45 CFR 503.1 - Definitions-Privacy Act.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 3 2010-10-01 2010-10-01 false Definitions-Privacy Act. 503.1 Section 503.1... THE UNITED STATES, DEPARTMENT OF JUSTICE RULES OF PRACTICE PRIVACY ACT AND GOVERNMENT IN THE SUNSHINE REGULATIONS Privacy Act Regulations § 503.1 Definitions—Privacy Act. For the purpose of this part: Agency...

  6. 48 CFR 52.224-2 - Privacy Act.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 2 2010-10-01 2010-10-01 false Privacy Act. 52.224-2... AND FORMS SOLICITATION PROVISIONS AND CONTRACT CLAUSES Text of Provisions and Clauses 52.224-2 Privacy... agency function: Privacy Act (APR 1984) (a) The Contractor agrees to— (1) Comply with the Privacy Act of...

  7. Treatment of human-computer interface in a decision support system

    International Nuclear Information System (INIS)

    Heger, A.S.; Duran, F.A.; Cox, R.G.

    1992-01-01

    One of the most challenging applications facing the computer community is the development of effective adaptive human-computer interfaces. This challenge stems from the complex nature of the human part of this symbiosis. The application of this discipline to environmental restoration and waste management is further complicated by the nature of environmental data. The information that is required to manage the environmental impacts of human activity is fundamentally complex. This paper discusses the efforts at Sandia National Laboratories in developing the adaptive conceptual model manager within the constraints of environmental decision-making. A computer workstation that hosts the Conceptual Model Manager and the Sandia Environmental Decision Support System will also be discussed.

  8. Older and Wiser? Facebook Use, Privacy Concern, and Privacy Protection in the Life Stages of Emerging, Young, and Middle Adulthood

    Directory of Open Access Journals (Sweden)

    Evert Van den Broeck

    2015-11-01

    Full Text Available A large part of research conducted on privacy concern and protection on social networking sites (SNSs) concentrates on children and adolescents. Individuals in these developmental stages are often described as vulnerable Internet users. But how vulnerable are adults in terms of online informational privacy? This study applied a privacy boundary management approach and investigated Facebook use, privacy concern, and the application of privacy settings on Facebook by linking the results to Erikson’s three stages of adulthood: emerging, young, and middle adulthood. An online survey was distributed among 18- to 65-year-old Dutch-speaking adults (N = 508, 51.8% female). Analyses revealed clear differences between the three adult age groups in terms of privacy concern, Facebook use, and privacy protection. Results indicated that respondents in young adulthood and middle adulthood were more vulnerable in terms of privacy protection than emerging adults. Clear discrepancies were found between privacy concern and protection for these age groups. More particularly, the middle adulthood group was more concerned about their privacy in comparison to the emerging adulthood and young adulthood groups. Yet, they reported using privacy settings less frequently than the younger age groups. Emerging adults were found to be pragmatic and privacy-conscious SNS users. Young adults occupied the intermediate position, suggesting a developmental shift. The impact of generational differences is discussed, as well as implications for education and governmental action.

  9. 49 CFR 10.13 - Privacy Officer.

    Science.gov (United States)

    2010-10-01

    ... INDIVIDUALS General § 10.13 Privacy Officer. (a) To assist with implementation, evaluation, and administration issues, the Chief Information Officer appoints a principal coordinating official with the title Privacy... 49 Transportation 1 2010-10-01 2010-10-01 false Privacy Officer. 10.13 Section 10.13...

  10. Web Security, Privacy & Commerce

    CERN Document Server

    Garfinkel, Simson

    2011-01-01

    Since the first edition of this classic reference was published, World Wide Web use has exploded and e-commerce has become a daily part of business and personal life. As Web use has grown, so have the threats to our security and privacy--from credit card fraud to routine invasions of privacy by marketers to web site defacements to attacks that shut down popular web sites. Web Security, Privacy & Commerce goes behind the headlines, examines the major security risks facing us today, and explains how we can minimize them. It describes risks for Windows and Unix, Microsoft Internet Exp

  11. 77 FR 33761 - Privacy Act of 1974; Notification to Update an Existing Privacy Act System of Records, “Grievance...

    Science.gov (United States)

    2012-06-07

    ... of a data breach. (See also on HUD's privacy Web site, Appendix I for other ways that the Privacy Act... DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT [Docket No. FR-5613-N-04] Privacy Act of 1974; Notification to Update an Existing Privacy Act System of Records, ``Grievance Records'' AGENCY: Office of the...

  12. Online Privacy as a Corporate Social Responsibility

    DEFF Research Database (Denmark)

    Pollach, Irene

    2011-01-01

    Information technology and the Internet have added a new stakeholder concern to the corporate social responsibility agenda: online privacy. While theory suggests that online privacy is a corporate social responsibility, only very few studies in the business ethics literature have connected...... of the companies have comprehensive privacy programs, although more than half of them voice moral or relational motives for addressing online privacy. The privacy measures they have taken are primarily compliance measures, while measures that stimulate a stakeholder dialogue are rare. Overall, a wide variety...

  13. Privacy-related context information for ubiquitous health.

    Science.gov (United States)

    Seppälä, Antto; Nykänen, Pirkko; Ruotsalainen, Pekka

    2014-03-11

    Ubiquitous health has been defined as a dynamic network of interconnected systems. A system is composed of one or more information systems, their stakeholders, and the environment. These systems offer health services to individuals and thus implement ubiquitous computing. Privacy is the key challenge for ubiquitous health because of autonomous processing, rich contextual metadata, lack of predefined trust among participants, and the business objectives. Additionally, regulations and policies of stakeholders may be unknown to the individual. Context-sensitive privacy policies are needed to regulate information processing. Our goal was to analyze privacy-related context information and to define the corresponding components and their properties that support privacy management in ubiquitous health. These properties should describe the privacy issues of information processing. With components and their properties, individuals can define context-aware privacy policies and set their privacy preferences that can change in different information-processing situations. Scenarios and user stories are used to analyze typical activities in ubiquitous health to identify main actors, goals, tasks, and stakeholders. Context arises from an activity and, therefore, we can determine different situations, services, and systems to identify properties for privacy-related context information in information-processing situations. Privacy-related context information components are situation, environment, individual, information technology system, service, and stakeholder. Combining our analyses and previously identified characteristics of ubiquitous health, more detailed properties for the components are defined. Properties define explicitly what context information for different components is needed to create context-aware privacy policies that can control, limit, and constrain information processing. With properties, we can define, for example, how data can be processed or how components

  14. Privacy-Aware Image Encryption Based on Logistic Map and Data Hiding

    Science.gov (United States)

    Sun, Jianglin; Liao, Xiaofeng; Chen, Xin; Guo, Shangwei

    The increasing need for image communication and storage has created a great necessity for securely transmitting and storing images over a network. Whereas traditional image encryption algorithms usually consider the security of the whole plain image, region of interest (ROI) encryption schemes, which are of great importance in practical applications, protect the privacy regions of plain images. Existing ROI encryption schemes usually adopt approximate techniques to detect the privacy region and measure the quality of encrypted images; however, their performance is usually inconsistent with the human visual system (HVS) and is sensitive to statistical attacks. In this paper, we propose a novel privacy-aware ROI image encryption (PRIE) scheme based on logistic mapping and data hiding. The proposed scheme utilizes salient object detection to automatically, adaptively and accurately detect the privacy region of a given plain image. After private pixels have been encrypted using chaotic cryptography, the significant bits are embedded into the non-privacy region of the plain image using data hiding. Extensive experiments are conducted to illustrate the consistency between our automatic ROI detection and the HVS. Our experimental results also demonstrate that the proposed scheme exhibits satisfactory security performance.
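
    The chaotic-keystream part of such a scheme can be sketched as follows: a logistic map generates a byte keystream that XOR-encrypts only the pixels of a privacy region. Here the ROI is a fixed rectangle rather than the result of salient-object detection, and the data-hiding step is omitted, so this is only a partial, assumed illustration.

```python
# Logistic-map keystream encrypting only a fixed rectangular ROI of a synthetic image.
# Partial illustration only: no salient-object detection, no data-hiding step.
import numpy as np


def logistic_keystream(length, x0=0.3141592, r=3.9999):
    """Iterate x_{k+1} = r * x_k * (1 - x_k) and quantize each state to a byte."""
    x, out = x0, np.empty(length, dtype=np.uint8)
    for i in range(length):
        x = r * x * (1.0 - x)
        out[i] = int(x * 256) % 256
    return out


image = np.random.default_rng(5).integers(0, 256, size=(64, 64), dtype=np.uint8)
top, left, h, w = 16, 16, 24, 24                     # privacy region (fixed for the demo)

roi = image[top:top + h, left:left + w]
ks = logistic_keystream(roi.size).reshape(roi.shape)

encrypted = image.copy()
encrypted[top:top + h, left:left + w] = roi ^ ks     # encrypt only the ROI

restored = encrypted.copy()
restored[top:top + h, left:left + w] ^= ks           # the same keystream decrypts
assert np.array_equal(restored, image)
print("ROI encrypted and recovered with the logistic-map keystream")
```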

  15. Enhancing Privacy for Digital Rights Management

    NARCIS (Netherlands)

    Petkovic, M.; Conrado, C.; Schrijen, G.J.; Jonker, Willem

    2007-01-01

    This chapter addresses privacy issues in DRM systems. These systems provide a means of protecting digital content, but may violate the privacy of users in that the content they purchase and their actions in the system can be linked to specific users. The chapter proposes a privacy-preserving DRM

  16. Location privacy protection in mobile networks

    CERN Document Server

    Liu, Xinxin

    2013-01-01

    This SpringerBrief analyzes the potential privacy threats in wireless and mobile network environments, and reviews some existing works. It proposes multiple privacy preserving techniques against several types of privacy threats that are targeting users in a mobile network environment. Depending on the network architecture, different approaches can be adopted. The first proposed approach considers a three-party system architecture where there is a trusted central authority that can be used to protect users? privacy. The second approach considers a totally distributed environment where users per

  17. Privacy enhanced recommender system

    NARCIS (Netherlands)

    Erkin, Zekeriya; Erkin, Zekeriya; Beye, Michael; Veugen, Thijs; Lagendijk, Reginald L.

    2010-01-01

    Recommender systems are widely used in online applications since they enable personalized service to the users. The underlying collaborative filtering techniques work on user’s data which are mostly privacy sensitive and can be misused by the service provider. To protect the privacy of the users, we

  18. Privacy Metrics and Boundaries

    NARCIS (Netherlands)

    L-F. Pau (Louis-François)

    2005-01-01

    This paper aims at defining a set of privacy metrics (quantitative and qualitative) for the relation between a privacy protector and an information gatherer. The aims of such metrics are: to allow assessment and comparison of different user scenarios and their differences; for

  19. The Privacy Calculus: Mobile Apps and User Perceptions of Privacy and Security

    Directory of Open Access Journals (Sweden)

    Elizabeth Fife

    2012-07-01

    Full Text Available A continuing stream of new mobile data services are being released that rely upon the collection of personal data to support a business model. New technologies including facial recognition, sensors and Near Field Communications (NFC) will increasingly become a part of everyday services and applications that challenge traditional concepts of individual privacy. The average person as well as the “tech-savvy” mobile phone user may not yet be fully aware of the extent to which their privacy and security are being affected through their mobile activities and how comparable this situation is to personal computer usage. We investigate perceptions and usage of mobile data services that appear to have specific privacy and security sensitivities, specifically social networking, banking/payments and health-related activities. Our annual survey of smartphone users in the U.S. and Japan is presented from 2011. This nationally representative survey data is used to show demographic and cultural differences, and to substantiate our hypotheses about the links between use and privacy concerns.

  20. Simulation of human decision making

    Science.gov (United States)

    Forsythe, J Chris [Sandia Park, NM; Speed, Ann E [Albuquerque, NM; Jordan, Sabina E [Albuquerque, NM; Xavier, Patrick G [Albuquerque, NM

    2008-05-06

    A method for computer emulation of human decision making defines a plurality of concepts related to a domain and a plurality of situations related to the domain, where each situation is a combination of at least two of the concepts. Each concept and situation is represented in the computer as an oscillator output, and each situation and concept oscillator output is distinguishable from all other oscillator outputs. Information is input to the computer representative of detected concepts, and the computer compares the detected concepts with the stored situations to determine if a situation has occurred.

  1. Management and Decisions in the Structures of Human Activities

    Directory of Open Access Journals (Sweden)

    Tadeusz Galanc

    2017-01-01

    This article is devoted to the key dimensions of decision-making. The main goal of the authors was to point out the role and effect of invariants of nature, logic and the conceptual systems of science and management, which are extremely important in decision-making processes. The research hypothesis tested is that the complexity of decision-making and management is determined by the state of reality (Nature). This hypothesis is related to the fact that there is currently no uniform methodology in science associated with decision-making, just as science itself is not methodologically uniform. One can even doubt whether it is possible to describe the essential dimensions of decisions undertaken by Man, as discussed in this article. These problems are not a novelty to science, since they have been analysed by many scientists in the past. The authors of the article present the complexity and diversity of concepts defining systems of decision-making and management, based on selected fields of knowledge which are generally relevant to this issue, in particular fields associated with ontology and epistemology. Therefore, the text refers broadly to investigating the reality of basic areas of human knowledge and the overlapping relationships between them. This applies to the so-called circle of the sciences proposed and examined by the psychologist J. Piaget. An additional aim of the authors was to create a text presenting contemporary human knowledge about the reality which surrounds us. To understand reality means to be in relative equilibrium with it. (original abstract)

  2. Privacy Protection Research of Mobile RFID

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Radio Frequency Identification is one of the most controversial technologies at present. It is very difficult to detect who reads a tag incorporated into a product owned by a person, and this gives rise to significant privacy threats in RFID systems. The user privacy problem is a primary consideration for mobile RFID services, because most mobile RFID services are end-user services. We propose a solution for user privacy protection, which is a modification of the EPC Class 1 Generation 2 protocol, and introduce a privacy protection scenario for mobile RFID services using this method.

  3. Analysis of Privacy on Social Networks

    OpenAIRE

    Tomandl, Luboš

    2015-01-01

    This thesis deals with the question of privacy in the context of social networks. The main substance of these services is the users' option to share information about their lives, which in itself can be a problem for privacy. The first part of the thesis concentrates on the meaning of privacy as well as its value for both individuals and society. In the next part the privacy threats on social networks, namely Facebook, are discussed. These threats are disclosed on four levels according to f...

  4. 75 FR 17938 - Privacy Act of 1974; Deletion of an Existing System of Records

    Science.gov (United States)

    2010-04-08

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Health Resources and Services Administration Privacy Act of 1974; Deletion of an Existing System of Records AGENCY: Department of Health and Human Services... address comments to: Policy Director, Bureau of Clinician Recruitment and Service (BCRS), Health Resources...

  5. Privacy by design in personal health monitoring.

    Science.gov (United States)

    Nordgren, Anders

    2015-06-01

    The concept of privacy by design is becoming increasingly popular among regulators of information and communications technologies. This paper aims at analysing and discussing the ethical implications of this concept for personal health monitoring. I assume a privacy theory of restricted access and limited control. On the basis of this theory, I suggest a version of the concept of privacy by design that constitutes a middle road between what I call broad privacy by design and narrow privacy by design. The key feature of this approach is that it attempts to balance automated privacy protection and autonomously chosen privacy protection in a way that is context-sensitive. In personal health monitoring, this approach implies that in some contexts like medication assistance and monitoring of specific health parameters one single automatic option is legitimate, while in some other contexts, for example monitoring in which relatives are receivers of health-relevant information rather than health care professionals, a multi-choice approach stressing autonomy is warranted.

  6. Pre-Capture Privacy for Small Vision Sensors.

    Science.gov (United States)

    Pittaluga, Francesco; Koppal, Sanjeev Jagannatha

    2017-11-01

    The next wave of micro and nano devices will create a world with trillions of small networked cameras. This will lead to increased concerns about privacy and security. Most privacy preserving algorithms for computer vision are applied after image/video data has been captured. We propose to use privacy preserving optics that filter or block sensitive information directly from the incident light-field before sensor measurements are made, adding a new layer of privacy. In addition to balancing the privacy and utility of the captured data, we address trade-offs unique to miniature vision sensors, such as achieving high-quality field-of-view and resolution within the constraints of mass and volume. Our privacy preserving optics enable applications such as depth sensing, full-body motion tracking, people counting, blob detection and privacy preserving face recognition. While we demonstrate applications on macro-scale devices (smartphones, webcams, etc.) our theory has impact for smaller devices.

  7. Through Patients' Eyes: Regulation, Technology, Privacy, and the Future.

    Science.gov (United States)

    Petersen, Carolyn

    2018-04-22

    Privacy is commonly regarded as a regulatory requirement achieved via technical and organizational management practices. Those working in the field of informatics often play a role in privacy preservation as a result of their expertise in information technology, workflow analysis, implementation science, or related skills. Viewing privacy from the perspective of patients whose protected health information is at risk broadens the considerations to include the perceived duality of privacy; the existence of privacy within a context unique to each patient; the competing needs inherent within privacy management; the need for particular consideration when data are shared; and the need for patients to control health information in a global setting. With precision medicine, artificial intelligence, and other treatment innovations on the horizon, health care professionals need to think more broadly about how to preserve privacy in a health care environment driven by data sharing. Patient-reported privacy preferences, privacy portability, and greater transparency around privacy-preserving functionalities are potential strategies for ensuring that privacy regulations are met and privacy is preserved. Georg Thieme Verlag KG Stuttgart.

  8. Development of a statistical method for predicting human driver decisions.

    Science.gov (United States)

    2015-09-01

    As autonomous vehicles enter the fleet, there will be a long period when these vehicles will have to interact with human drivers. One of the challenges for autonomous vehicles is that human drivers do not communicate their decisions well. However...

  9. 75 FR 9012 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/U.S. Department of Health and...

    Science.gov (United States)

    2010-02-26

    ... INFORMATION: A. General The Computer Matching and Privacy Protection Act of 1988 (Pub. L. 100-503), amended... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA-2009-0052] Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ U.S. Department of Health and Human Services (HHS), Administration for...

  10. The privacy implications of Bluetooth

    OpenAIRE

    Kostakos, Vassilis

    2008-01-01

    A substantial amount of research, as well as media hype, has surrounded RFID technology and its privacy implications. Currently, researchers and the media focus on the privacy threats posed by RFID, while consumer groups choose to boycott products bearing RFID tags. At the same time, however, a very similar technology has quietly become part of our everyday lives: Bluetooth. In this paper we highlight the fact that Bluetooth is a widespread technology that has real privacy implications. Furthermor...

  11. Big data privacy: The datafication of personal information

    DEFF Research Database (Denmark)

    Mai, Jens-Erik

    2016-01-01

    In the age of big data we need to think differently about privacy. We need to shift our thinking from definitions of privacy (characteristics of privacy) to models of privacy (how privacy works). Moreover, in addition to the existing models of privacy—the surveillance model and the capture model—we need to also consider a new model: the datafication model presented in this article, wherein new personal information is deduced by employing predictive analytics on already-gathered data. These three models of privacy supplement each other; they are not competing understandings of privacy. This broadened approach will take our thinking beyond the current preoccupation with whether or not individuals’ consent was secured for data collection to privacy issues arising from the development of new information on individuals' likely behavior through analysis of already collected data—this new information...

  12. Human and Citizen Rights Guarantees While Providing Information Security

    Directory of Open Access Journals (Sweden)

    Serhii Yesimov

    2018-05-01

    With the development of information and communication technologies, issues of providing information security are becoming increasingly acute. These include crimes involving electronic computers, computer systems and networks, and telecommunication networks, as well as the propaganda of separatism and extremism. In providing information security in the digital environment, the role of technical guarantees of human rights, delivered through technical means of protection, is increasing. Reliance on the developers of technical means of protection distinguishes this approach from the traditional approach to protecting human and citizen rights, in which responsibilities are placed on information intermediaries and owners of confidential information. Technical guarantees of human rights are a necessary component of ensuring information security, but their effectiveness depends on their application in conjunction with legal guarantees of human rights, as evidenced by the tendency in European Union law to recognize the principle of inviolability of privacy on the basis of design decisions. Providing information security is a legitimate goal for establishing constraints on human rights, since it can be correlated with the norms of international law. The establishment of constraints on human rights is permissible in order to attain other objectives: ensuring state security, public order, health, and the rights and freedoms of the person in the information sphere. The legitimacy of this goal is determined by its compliance with the objectives envisaged by international agreements ratified in the established order. The article examines the impact of the use of technical means for providing information security on observance of fundamental human and civil rights in Ukraine, taking into account the legislation of the European Union and the decision of the European Court

  13. IoT Privacy and Security Challenges for Smart Home Environments

    Directory of Open Access Journals (Sweden)

    Huichen Lin

    2016-07-01

    Often the Internet of Things (IoT) is considered as a single problem domain, with proposed solutions intended to be applied across a wide range of applications. However, the privacy and security needs of critical engineering infrastructure or sensitive commercial operations are very different from the needs of a domestic Smart Home environment. Additionally, the financial and human resources available to implement security and privacy vary greatly between application domains. In domestic environments, human issues may be as important as technical issues. After surveying existing solutions for enhancing IoT security, the paper identifies key future requirements for trusted Smart Home systems. A gateway architecture is selected as the most appropriate for resource-constrained devices and for high system availability. Two key technologies to assist system auto-management are identified. Firstly, support for system auto-configuration will enhance system security. Secondly, the automatic update of system software and firmware is needed to maintain ongoing secure system operation.

  14. Decision-making in the inductive mode : The role of human behavior

    OpenAIRE

    Nobel, Johan

    2013-01-01

    Economists have persistently maintained the assumption that humans are able to arrive at decisions by perfect deductive rationality, despite the fact that empirical evidence shows otherwise. This contradictory evidence has resulted in a personal view that, instead of finding a unified theory of decision-making, a sound approach would be to study how humans in fact reason in specific contexts. The context of interest for this paper is one where it could be assumed humans’ persistence o...

  15. The Insertion of Human Factors Concerns into NextGen Programmatic Decisions

    Science.gov (United States)

    Beard, Bettina L.; Holbrook, Jon Brian; Seely, Rachel

    2013-01-01

    Since the costs of proposed improvements in air traffic management exceed available funding, FAA decision makers must select and prioritize what actually gets implemented. We discuss a set of methods to help forecast operational and human performance issues and benefits before new automation is introduced. This strategy could minimize the impact of politics, assist decision makers in selecting and prioritizing potential improvements, make the process more transparent and strengthen the link between the engineering and human factors domains.

  16. Social Media Users’ Legal Consciousness About Privacy

    Directory of Open Access Journals (Sweden)

    Katharine Sarikakis

    2017-02-01

    This article explores the ways in which the concept of privacy is understood in the context of social media and with regard to users’ awareness of privacy policies and laws in the ‘Post-Snowden’ era. In the light of presumably increased public exposure to privacy debates, generated partly due to the European “Right to be Forgotten” ruling and the Snowden revelations on mass surveillance, this article explores users’ meaning-making of privacy as a matter of legal dimension in terms of its violations and threats online and users’ ways of negotiating their Internet use, in particular social networking sites. Drawing on the concept of legal consciousness, this article explores through focus group interviews the ways in which social media users negotiate privacy violations and what role their understanding of privacy laws (or lack thereof) might play in their strategies of negotiation. The findings are threefold: first, privacy is understood almost universally as a matter of controlling one’s own data, including information disclosure even to friends, and is strongly connected to issues about personal autonomy; second, a form of resignation with respect to control over personal data appears to coexist with a recognized need to protect one’s private data, while respondents describe conscious attempts to circumvent systems of monitoring or violation of privacy; and third, despite widespread coverage of privacy legal issues in the press, respondents’ concerns about and engagement in “self-protecting” tactics derive largely from being personally affected by violations of law and privacy.

  17. Post-event human decision errors: operator action tree/time reliability correlation

    International Nuclear Information System (INIS)

    Hall, R.E.; Fragola, J.; Wreathall, J.

    1982-11-01

    This report documents an interim framework for the quantification of the probability of errors of decision on the part of nuclear power plant operators after the initiation of an accident. The framework can easily be incorporated into an event tree/fault tree analysis. The method presented consists of a structure called the operator action tree and a time reliability correlation which assumes the time available for making a decision to be the dominating factor in situations requiring cognitive human response. This limited approach decreases the magnitude and complexity of the decision modeling task. Specifically, in the past, some human performance models have attempted prediction by trying to emulate sequences of human actions, or by identifying and modeling the information processing approach applicable to the task. The model developed here is directed at describing the statistical performance of a representative group of hypothetical individuals responding to generalized situations

  18. Post-event human decision errors: operator action tree/time reliability correlation

    Energy Technology Data Exchange (ETDEWEB)

    Hall, R E; Fragola, J; Wreathall, J

    1982-11-01

    This report documents an interim framework for the quantification of the probability of errors of decision on the part of nuclear power plant operators after the initiation of an accident. The framework can easily be incorporated into an event tree/fault tree analysis. The method presented consists of a structure called the operator action tree and a time reliability correlation which assumes the time available for making a decision to be the dominating factor in situations requiring cognitive human response. This limited approach decreases the magnitude and complexity of the decision modeling task. Specifically, in the past, some human performance models have attempted prediction by trying to emulate sequences of human actions, or by identifying and modeling the information processing approach applicable to the task. The model developed here is directed at describing the statistical performance of a representative group of hypothetical individuals responding to generalized situations.

  19. 31 CFR 0.216 - Privacy Act.

    Science.gov (United States)

    2010-07-01

    ... 31 Money and Finance: Treasury 1 2010-07-01 2010-07-01 false Privacy Act. 0.216 Section 0.216... RULES OF CONDUCT Rules of Conduct § 0.216 Privacy Act. Employees involved in the design, development, operation, or maintenance of any system of records or in maintaining records subject to the Privacy Act of...

  20. Privacy-Related Context Information for Ubiquitous Health

    Science.gov (United States)

    Nykänen, Pirkko; Ruotsalainen, Pekka

    2014-01-01

    Background Ubiquitous health has been defined as a dynamic network of interconnected systems. A system is composed of one or more information systems, their stakeholders, and the environment. These systems offer health services to individuals and thus implement ubiquitous computing. Privacy is the key challenge for ubiquitous health because of autonomous processing, rich contextual metadata, lack of predefined trust among participants, and the business objectives. Additionally, regulations and policies of stakeholders may be unknown to the individual. Context-sensitive privacy policies are needed to regulate information processing. Objective Our goal was to analyze privacy-related context information and to define the corresponding components and their properties that support privacy management in ubiquitous health. These properties should describe the privacy issues of information processing. With components and their properties, individuals can define context-aware privacy policies and set their privacy preferences that can change in different information-processing situations. Methods Scenarios and user stories are used to analyze typical activities in ubiquitous health to identify main actors, goals, tasks, and stakeholders. Context arises from an activity and, therefore, we can determine different situations, services, and systems to identify properties for privacy-related context information in information-processing situations. Results Privacy-related context information components are situation, environment, individual, information technology system, service, and stakeholder. Combining our analyses and previously identified characteristics of ubiquitous health, more detailed properties for the components are defined. Properties define explicitly what context information for different components is needed to create context-aware privacy policies that can control, limit, and constrain information processing. With properties, we can define, for example, how

  1. 75 FR 28051 - Public Workshop: Pieces of Privacy

    Science.gov (United States)

    2010-05-19

    ... DEPARTMENT OF HOMELAND SECURITY Office of the Secretary Public Workshop: Pieces of Privacy AGENCY: Privacy Office, DHS. ACTION: Notice announcing public workshop. SUMMARY: The Department of Homeland Security Privacy Office will host a public workshop, ``Pieces of Privacy.'' DATES: The workshop will be...

  2. Data Mining and Privacy of Social Network Sites' Users: Implications of the Data Mining Problem.

    Science.gov (United States)

    Al-Saggaf, Yeslam; Islam, Md Zahidul

    2015-08-01

    This paper explores the potential of data mining as a technique that could be used by malicious data miners to threaten the privacy of social network sites (SNS) users. It applies a data mining algorithm to a real dataset to provide empirically-based evidence of the ease with which characteristics about the SNS users can be discovered and used in a way that could invade their privacy. One major contribution of this article is the use of the decision forest data mining algorithm (SysFor) in the context of SNS, which builds not just a single decision tree but a forest, allowing the exploration of more logic rules from a dataset. One logic rule that SysFor built in this study, for example, revealed that anyone having a profile picture showing just the face or a picture showing a family is less likely to be lonely. Another contribution of this article is the discussion of the implications of the data mining problem for governments, businesses, developers and the SNS users themselves.
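
    To make the idea of mining logic rules from profile attributes concrete, the sketch below trains a single decision tree on a small, invented dataset and prints its rules with scikit-learn. It only illustrates the kind of rule quoted above (e.g. a rule linking profile pictures to loneliness); it is not the SysFor forest algorithm used in the article, and all feature names and labels are hypothetical.

    ```python
    # Hypothetical sketch: extracting human-readable logic rules from a single
    # decision tree over made-up SNS profile attributes. This is NOT the SysFor
    # algorithm (which builds a forest of trees); it only shows the kind of
    # rule a data miner could derive from profile data.
    from sklearn.tree import DecisionTreeClassifier, export_text

    # Toy features: [has_face_profile_picture, has_family_picture, friend_count]
    X = [
        [1, 0, 250], [1, 1, 40], [0, 0, 12], [0, 0, 5],
        [1, 1, 300], [0, 1, 80], [0, 0, 9],  [1, 0, 150],
    ]
    y = [0, 0, 1, 1, 0, 0, 1, 0]  # 1 = "lonely" label in this invented example

    tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

    # Print the learned rules, e.g. "friend_count <= 26 -> class 1".
    print(export_text(tree, feature_names=[
        "has_face_profile_picture", "has_family_picture", "friend_count"]))
    ```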

  3. Using genetic information while protecting the privacy of the soul.

    Science.gov (United States)

    Moor, J H

    1999-01-01

    Computing plays an important role in genetics (and vice versa). Theoretically, computing provides a conceptual model for the function and malfunction of our genetic machinery. Practically, contemporary computers and robots equipped with advanced algorithms make the revelation of the complete human genome imminent--computers are about to reveal our genetic souls for the first time. Ethically, computers help protect privacy by restricting access in sophisticated ways to genetic information. But the inexorable fact that computers will increasingly collect, analyze, and disseminate abundant amounts of genetic information made available through the genetic revolution, not to mention that inexpensive computing devices will make genetic information gathering easier, underscores the need for strong and immediate privacy legislation.

  4. Genome privacy: challenges, technical approaches to mitigate risk, and ethical considerations in the United States

    Science.gov (United States)

    Wang, Shuang; Jiang, Xiaoqian; Singh, Siddharth; Marmor, Rebecca; Bonomi, Luca; Fox, Dov; Dow, Michelle; Ohno-Machado, Lucila

    2016-01-01

    Accessing and integrating human genomic data with phenotypes is important for biomedical research. Making genomic data accessible for research purposes, however, must be handled carefully to avoid leakage of sensitive individual information to unauthorized parties and improper use of data. In this article, we focus on data sharing within the scope of data accessibility for research. Current common practices to gain biomedical data access are strictly rule based, without a clear and quantitative measurement of the risk of privacy breaches. In addition, several types of studies require privacy-preserving linkage of genotype and phenotype information across different locations (e.g., genotypes stored in a sequencing facility and phenotypes stored in an electronic health record) to accelerate discoveries. The computer science community has developed a spectrum of techniques for data privacy and confidentiality protection, many of which have yet to be tested on real-world problems. In this article, we discuss clinical, technical, and ethical aspects of genome data privacy and confidentiality in the United States, as well as potential solutions for privacy-preserving genotype–phenotype linkage in biomedical research. PMID:27681358

  5. Do Privacy Concerns Matter for Millennials?

    DEFF Research Database (Denmark)

    Fodor, Mark; Brem, Alexander

    2015-01-01

    ... data have raised the question of whether location data are considered sensitive data by users. Thus, we use two privacy concern models, namely Concern for Information Privacy (CFIP) and Internet Users’ Information Privacy Concerns (IUIPC), to find out. Our sample comprises 235 individuals between 18 and 34 years (Generation C) from Germany. The results of this study indicate that the second-order factor IUIPC showed better fit for the underlying data than CFIP did. Overall privacy concerns have been found to have an impact on behavioral intentions of users for LBS adoption. Furthermore, other risk...

  6. Predicting user concerns about online privacy in Hong Kong.

    Science.gov (United States)

    Yao, Mike Z; Zhang, Jinguang

    2008-12-01

    Empirical studies on people's online privacy concerns have largely been conducted in the West. The global threat of privacy violations on the Internet calls for similar studies to be done in non-Western regions. To fill this void, the current study develops a path model to investigate the influence of people's Internet use-related factors, their beliefs in the right to privacy, and psychological need for privacy on Hong Kong people's concerns about online privacy. Survey responses from 332 university students were analyzed. Results from this study show that people's belief in the right to privacy was the most important predictor of their online privacy concerns. It also significantly mediated the relationship between people's psychological need for privacy and their concerns with privacy violations online. Moreover, while frequent use of the Internet may increase concerns about online privacy issues, Internet use diversity may actually reduce such worries. The final model, well supported by the observed data, successfully explained 25% of the variability in user concerns about online privacy.

  7. Privacy Issues: Journalists Should Balance Need for Privacy with Need to Cover News.

    Science.gov (United States)

    Plopper, Bruce

    1998-01-01

    Notes that journalists have to balance their desire to print the news with personal rights to privacy. Argues that a working knowledge of ethics and law helps journalism students resolve such issues. Discusses ethical issues; legal aspects of privacy; and "training" administrators. Offers a list of questions to ask, six notable court…

  8. The Regulatory Framework for Privacy and Security

    Science.gov (United States)

    Hiller, Janine S.

    The internet enables the easy collection of massive amounts of personally identifiable information. Unregulated data collection causes distrust and conflicts with widely accepted principles of privacy. The regulatory framework in the United States for ensuring privacy and security in the online environment consists of federal, state, and self-regulatory elements. New laws have been passed to address technological and internet practices that conflict with privacy protecting policies. The United States and the European Union approaches to privacy differ significantly, and the global internet environment will likely cause regulators to face the challenge of balancing privacy interests with data collection for many years to come.

  9. An anonymization technique using intersected decision trees

    Directory of Open Access Journals (Sweden)

    Sam Fletcher

    2015-07-01

    Data mining plays an important role in analyzing the massive amount of data collected in today’s world. However, due to the public’s rising awareness of privacy and lack of trust in organizations, suitable Privacy Preserving Data Mining (PPDM) techniques have become vital. A PPDM technique provides individual privacy while allowing useful data mining. We present a novel noise addition technique called Forest Framework, two novel data quality evaluation techniques called EDUDS and EDUSC, and a security evaluation technique called SERS. Forest Framework builds a decision forest from a dataset and preserves all the patterns (logic rules) of the forest while adding noise to the dataset. We compare Forest Framework to its predecessor, Framework, and another established technique, GADP. Our comparison is done using our three evaluation criteria, as well as Prediction Accuracy. Our experimental results demonstrate the success of our proposed extensions to Framework and the usefulness of our evaluation criteria.

  10. Privacy-Preserving Trajectory Collection

    DEFF Research Database (Denmark)

    Gidofalvi, Gyozo; Xuegang, Huang; Pedersen, Torben Bach

    2008-01-01

    In order to provide context-aware Location-Based Services, real location data of mobile users must be collected and analyzed by spatio-temporal data mining methods. However, the data mining methods need precise location data, while the mobile users want to protect their location privacy. To remedy this situation, this paper first formally defines novel location privacy requirements. Then, it briefly presents a system for privacy-preserving trajectory collection that meets these requirements. The system is composed of an untrusted server and clients communicating in a P2P network. Location data is anonymized in the system using data cloaking and data swapping techniques. Finally, the paper empirically demonstrates that the proposed system is effective and feasible.
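
    As a rough illustration of the data cloaking idea mentioned in the abstract, the sketch below snaps raw GPS fixes to a coarse grid so that a collector only learns the cell a user was in, not the exact position. The grid size, coordinates and function name are invented for the example; the paper's actual P2P collection protocol and its data swapping step are not reproduced here.

    ```python
    # Minimal sketch of spatial cloaking: snapping GPS fixes to a coarse grid.
    # This illustrates the general idea of "data cloaking" only; it is not the
    # P2P trajectory-collection protocol described in the paper.
    def cloak(lat: float, lon: float, cell_deg: float = 0.01) -> tuple[float, float]:
        """Return the south-west corner of the grid cell containing (lat, lon)."""
        snap = lambda v: (v // cell_deg) * cell_deg
        return round(snap(lat), 6), round(snap(lon), 6)

    # A short hypothetical trajectory; nearby users fall into the same cells,
    # so individual paths blur together at the collector.
    trajectory = [(55.6761, 12.5683), (55.6772, 12.5711), (55.6790, 12.5740)]
    print([cloak(lat, lon) for lat, lon in trajectory])
    ```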

  11. Data Security and Privacy in Apps for Dementia: An Analysis of Existing Privacy Policies.

    Science.gov (United States)

    Rosenfeld, Lisa; Torous, John; Vahia, Ipsit V

    2017-08-01

    Despite tremendous growth in the number of health applications (apps), little is known about how well these apps protect their users' health-related data. This gap in knowledge is of particular concern for apps targeting people with dementia, whose cognitive impairment puts them at increased risk of privacy breaches. In this article, we determine how many dementia apps have privacy policies and how well they protect user data. Our analysis included all iPhone apps that matched the search terms "medical + dementia" or "health & fitness + dementia" and collected user-generated content. We evaluated all available privacy policies for these apps based on criteria that systematically measure how individual user data is handled. Seventy-two apps met the above search terms and collected user data. Of these, only 33 (46%) had an available privacy policy. Nineteen of the 33 with policies (58%) were specific to the app in question, and 25 (76%) specified how individual-user as opposed to aggregate data would be handled. Among these, there was a preponderance of missing information, the majority acknowledged collecting individual data for internal purposes, and most admitted to instances in which they would share user data with outside parties. At present, the majority of health apps focused on dementia lack a privacy policy, and those that do exist lack clarity. Bolstering safeguards and improving communication about privacy protections will help facilitate consumer trust in apps, thereby enabling more widespread and meaningful use by people with dementia and those involved in their care. Copyright © 2017. Published by Elsevier Inc.

  12. The medical examination in United States immigration applications: the potential use of genetic testing leads to heightened privacy concerns.

    Science.gov (United States)

    Burroughs, A Maxwell

    2005-01-01

    The medical examination has been an integral part of the immigration application process since the passing of the Immigration Act of 1891. Failing the medical examination can result in denial of the application. Over the years the medical examination has been expanded to include questioning about diseases that are scientifically shown to be rooted in an individual's genetic makeup. Recent advances in the fields of genomics and bioinformatics are making accurate and precise screening for these conditions a reality. Government policymakers will soon be faced with decisions regarding whether or not to sanction the use of these newly-developed genetic tests in the immigration application procedure. The terror threat currently facing the United States may ultimately bolster the argument in favor of genetic testing and/or DNA collection of applicants. However, the possibility of a government mandate requiring genetic testing raises a host of ethical issues; including the threat of eugenics and privacy concerns. Genetic testing has the ability to uncover a wealth of sensitive medical information about an individual and currently there are no medical information privacy protections afforded to immigration applicants. This article examines the potential for genetic testing in the immigration application process and the ethical issues surrounding this testing. In particular, this article explores the existing framework of privacy protections afforded to individuals living in the United States and how this and newly-erected standards like those released by the Health and Human Services (HHS) might apply to individuals seeking to immigrate to the United States.

  13. Where is the harm in a privacy violation : Calculating the damages afforded in privacy cases by the European Court of Human Rights

    NARCIS (Netherlands)

    van der Sloot, Bart

    2017-01-01

    It has always been difficult to pinpoint what harm follows a privacy violation. What harm is done by someone entering your home without permission, or by the state eavesdropping on a telephone conversation when no property is stolen or information disclosed to third parties? The question is becoming

  14. Explicable Planning and Replanning for Human-in-the-loop Decision Support

    Data.gov (United States)

    National Aeronautics and Space Administration — For the decision support scenarios that are particularly relevant to NASA, such as planning for human space missions, human operators will need a system that can (i)...

  15. Privacy concerns, dead or misunderstood? : The perceptions of privacy amongst the young and old

    NARCIS (Netherlands)

    Steijn, Wouter; Vedder, Anton

    2015-01-01

    The concept of ‘privacy’ has become an important topic for academics and policy-makers. Ubiquitous computing and internet access raise new questions in relation to privacy in the virtual world, including individuals’ appreciation of privacy and how this can be safeguarded. This article contributes

  16. Privacy in the Genomic Era

    Science.gov (United States)

    NAVEED, MUHAMMAD; AYDAY, ERMAN; CLAYTON, ELLEN W.; FELLAY, JACQUES; GUNTER, CARL A.; HUBAUX, JEAN-PIERRE; MALIN, BRADLEY A.; WANG, XIAOFENG

    2015-01-01

    Genome sequencing technology has advanced at a rapid pace and it is now possible to generate highly-detailed genotypes inexpensively. The collection and analysis of such data has the potential to support various applications, including personalized medical services. While the benefits of the genomics revolution are trumpeted by the biomedical community, the increased availability of such data has major implications for personal privacy; notably because the genome has certain essential features, which include (but are not limited to) (i) an association with traits and certain diseases, (ii) identification capability (e.g., forensics), and (iii) revelation of family relationships. Moreover, direct-to-consumer DNA testing increases the likelihood that genome data will be made available in less regulated environments, such as the Internet and for-profit companies. The problem of genome data privacy thus resides at the crossroads of computer science, medicine, and public policy. While the computer scientists have addressed data privacy for various data types, there has been less attention dedicated to genomic data. Thus, the goal of this paper is to provide a systematization of knowledge for the computer science community. In doing so, we address some of the (sometimes erroneous) beliefs of this field and we report on a survey we conducted about genome data privacy with biomedical specialists. Then, after characterizing the genome privacy problem, we review the state-of-the-art regarding privacy attacks on genomic data and strategies for mitigating such attacks, as well as contextualizing these attacks from the perspective of medicine and public policy. This paper concludes with an enumeration of the challenges for genome data privacy and presents a framework to systematize the analysis of threats and the design of countermeasures as the field moves forward. PMID:26640318

  17. Privacy in the Genomic Era.

    Science.gov (United States)

    Naveed, Muhammad; Ayday, Erman; Clayton, Ellen W; Fellay, Jacques; Gunter, Carl A; Hubaux, Jean-Pierre; Malin, Bradley A; Wang, Xiaofeng

    2015-09-01

    Genome sequencing technology has advanced at a rapid pace and it is now possible to generate highly-detailed genotypes inexpensively. The collection and analysis of such data has the potential to support various applications, including personalized medical services. While the benefits of the genomics revolution are trumpeted by the biomedical community, the increased availability of such data has major implications for personal privacy; notably because the genome has certain essential features, which include (but are not limited to) (i) an association with traits and certain diseases, (ii) identification capability (e.g., forensics), and (iii) revelation of family relationships. Moreover, direct-to-consumer DNA testing increases the likelihood that genome data will be made available in less regulated environments, such as the Internet and for-profit companies. The problem of genome data privacy thus resides at the crossroads of computer science, medicine, and public policy. While the computer scientists have addressed data privacy for various data types, there has been less attention dedicated to genomic data. Thus, the goal of this paper is to provide a systematization of knowledge for the computer science community. In doing so, we address some of the (sometimes erroneous) beliefs of this field and we report on a survey we conducted about genome data privacy with biomedical specialists. Then, after characterizing the genome privacy problem, we review the state-of-the-art regarding privacy attacks on genomic data and strategies for mitigating such attacks, as well as contextualizing these attacks from the perspective of medicine and public policy. This paper concludes with an enumeration of the challenges for genome data privacy and presents a framework to systematize the analysis of threats and the design of countermeasures as the field moves forward.

  18. Comparative Approaches to Biobanks and Privacy.

    Science.gov (United States)

    Rothstein, Mark A; Knoppers, Bartha Maria; Harrell, Heather L

    2016-03-01

    Laws in the 20 jurisdictions studied for this project display many similar approaches to protecting privacy in biobank research. Although few have enacted biobank-specific legislation, many countries address biobanking within other laws. All provide for some oversight mechanisms for biobank research, even though the nature of that oversight varies between jurisdictions. Most have some sort of controlled access system in place for research with biobank specimens. While broad consent models facilitate biobanking, countries without national or federated biobanks have been slow to adopt broad consent. International guidelines have facilitated sharing and generally take a proportional risk approach, but many countries have provisions guiding international sharing and a few even limit international sharing. Although privacy laws may not prohibit international collaborations, the multi-prong approach to privacy unique to each jurisdiction can complicate international sharing. These symposium issues can serve as a resource for explaining the sometimes intricate privacy laws in each studied jurisdiction, outlining the key issues with regards to privacy and biobanking, and serving to describe a framework for the process of harmonization of privacy laws. © 2016 American Society of Law, Medicine & Ethics.

  19. Privacy Implications of Surveillance Systems

    DEFF Research Database (Denmark)

    Thommesen, Jacob; Andersen, Henning Boje

    2009-01-01

    This paper presents a model for assessing the privacy 'cost' of a surveillance system. Surveillance systems collect and provide personal information or observations of people by means of surveillance technologies such as databases, video or location tracking. Such systems can be designed for various purposes, even as a service for those being observed, but in any case they will to some degree invade their privacy. The model provided here can indicate how invasive any particular system may be – and be used to compare the invasiveness of different systems. Applying a functional approach, the model is established by first considering the social function of privacy in everyday life, which in turn lets us determine which different domains will be considered as private, and finally identify the different types of privacy invasion. This underlying model (function – domain – invasion) then serves

  20. Privacy and the Connected Society

    DEFF Research Database (Denmark)

    Sørensen, Lene Tolstrup; Khajuria, Samant; Skouby, Knud Erik

    The vision of the 5G enabled connected society is highly based on the evolution and implementation of the Internet of Things. This involves, amongst others, a significant rise in devices, sensors and communication in pervasive interconnections, as well as cooperation amongst devices and entities across the society. Enabling the vision of the connected society, researchers point in the direction of security and privacy as areas that challenge the vision. By use of the Internet of Things reference model as well as the vision of the connected society, this paper identifies privacy of the individual with respect to three selected areas: shopping, connected cars and online gaming. The paper concludes that privacy is a complexity within the connected society vision and that there is a need for more privacy use cases to shed light on the challenge.

  1. Privacy-preserving Kruskal-Wallis test.

    Science.gov (United States)

    Guo, Suxin; Zhong, Sheng; Zhang, Aidong

    2013-10-01

    Statistical tests are powerful tools for data analysis. Kruskal-Wallis test is a non-parametric statistical test that evaluates whether two or more samples are drawn from the same distribution. It is commonly used in various areas. But sometimes, the use of the method is impeded by privacy issues raised in fields such as biomedical research and clinical data analysis because of the confidential information contained in the data. In this work, we give a privacy-preserving solution for the Kruskal-Wallis test which enables two or more parties to coordinately perform the test on the union of their data without compromising their data privacy. To the best of our knowledge, this is the first work that solves the privacy issues in the use of the Kruskal-Wallis test on distributed data. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
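
    For readers unfamiliar with the underlying statistic, the snippet below runs an ordinary, non-private Kruskal-Wallis test with SciPy on three hypothetical samples, one per party. In the privacy-preserving setting described in the paper, the same test outcome would be computed jointly without any party revealing its raw data; that cryptographic protocol is not shown here.

    ```python
    # A plain, non-private Kruskal-Wallis test, shown only to illustrate the
    # statistic the paper computes securely across parties. In the actual
    # protocol, no party would reveal its raw sample to the others.
    from scipy.stats import kruskal

    party_a = [12.1, 9.8, 11.3, 10.5]   # hypothetical measurements held by party A
    party_b = [14.2, 13.9, 15.0, 12.8]  # held by party B
    party_c = [10.0, 10.7, 9.5, 11.1]   # held by party C

    h_stat, p_value = kruskal(party_a, party_b, party_c)
    print(f"H = {h_stat:.3f}, p = {p_value:.4f}")  # small p: samples likely differ
    ```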

  2. Negotiating privacy in surveillant welfare relations

    DEFF Research Database (Denmark)

    Andersen, Lars Bo; Lauritsen, Peter; Bøge, Ask Risom

    However, while privacy is central to debates of surveillance, it has proven less productive as an analytical resource for studying surveillance in practice. Consequently, this paper reviews different conceptualisations of privacy in relation to welfare and surveillance and argues for strengthening the analytical capacity of the concept by rendering it a situated and relational concept. The argument is developed through a research and design project called Teledialogue, meant to improve the relation between case managers and children placed at institutions or in foster families. Privacy in Teledialogue ... notion of privacy are discussed in relation to both research and public debates on surveillance in a welfare setting.

  3. 32 CFR 311.7 - OSD/JS Privacy Office Processes.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 2 2010-07-01 2010-07-01 false OSD/JS Privacy Office Processes. 311.7 Section...) PRIVACY PROGRAM OFFICE OF THE SECRETARY OF DEFENSE AND JOINT STAFF PRIVACY PROGRAM § 311.7 OSD/JS Privacy Office Processes. The OSD/JS Privacy Office shall: (a) Exercise oversight and administrative control of...

  4. 32 CFR 701.101 - Privacy program terms and definitions.

    Science.gov (United States)

    2010-07-01

    ... from a project on privacy issues, identifying and resolving the privacy risks, and approval by a... 32 National Defense 5 2010-07-01 2010-07-01 false Privacy program terms and definitions. 701.101... DEPARTMENT OF THE NAVY DOCUMENTS AFFECTING THE PUBLIC DON Privacy Program § 701.101 Privacy program terms and...

  5. An informational theory of privacy

    NARCIS (Netherlands)

    Schottmuller, C.; Jann, Ole

    2016-01-01

    We develop a theory that explains how and when privacy can increase welfare. Without privacy, some individuals misrepresent their preferences, because they will otherwise be statistically discriminated against. This "chilling effect" hurts them individually, and impairs information aggregation. The

  6. 45 CFR 503.2 - General policies-Privacy Act.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 3 2010-10-01 2010-10-01 false General policies-Privacy Act. 503.2 Section 503.2... THE UNITED STATES, DEPARTMENT OF JUSTICE RULES OF PRACTICE PRIVACY ACT AND GOVERNMENT IN THE SUNSHINE REGULATIONS Privacy Act Regulations § 503.2 General policies—Privacy Act. The Commission will protect the...

  7. SmartPrivacy for the smart grid : embedding privacy into the design of electricity conservation

    Energy Technology Data Exchange (ETDEWEB)

    Cavoukian, A. [Ontario Information and Privacy Commissioner, Toronto, ON (Canada)]; Polonetsky, J.; Wolf, C. [Future of Privacy Forum, Washington, DC (United States)]

    2009-11-15

    Modernization efforts are underway to make the current electrical grid smarter. The Smart Grid of the future will be capable of informing consumers of their day-to-day energy use, curbing greenhouse gas emissions, and reducing consumers' energy bills. However, the Smart Grid also brings with it the possibility of collecting detailed information on individual energy consumption and usage patterns within people's homes. This paper discussed the Smart Grid and its benefits, as well as the questions that should be examined regarding privacy. The paper also outlined the concept of SmartPrivacy and discussed its application to the Smart Grid scenario. Privacy by design foundational principles and Smart Grid components were also presented in an appendix. It was concluded that the information collected on a Smart Grid will form a library of personal information. The mishandling of this information could be extremely invasive of consumer privacy. 46 refs., 1 fig., 2 appendices.

  8. 16 CFR 313.2 - Model privacy form and examples.

    Science.gov (United States)

    2010-01-01

    ... 16 Commercial Practices 1 2010-01-01 2010-01-01 false Model privacy form and examples. 313.2... PRIVACY OF CONSUMER FINANCIAL INFORMATION § 313.2 Model privacy form and examples. (a) Model privacy form..., although use of the model privacy form is not required. (b) Examples. The examples in this part are not...

  9. Preserving differential privacy under finite-precision semantics.

    Directory of Open Access Journals (Sweden)

    Ivan Gazeau

    2013-06-01

    The approximation introduced by finite-precision representation of continuous data can induce arbitrarily large information leaks even when the computation using exact semantics is secure. Such leakage can thus undermine design efforts aimed at protecting sensitive information. We focus here on differential privacy, an approach to privacy that emerged from the area of statistical databases and is now widely applied also in other domains. In this approach, privacy is protected by the addition of noise to a true (private) value. To date, this approach to privacy has been proved correct only in the ideal case in which computations are made using an idealized, infinite-precision semantics. In this paper, we analyze the situation at the implementation level, where the semantics is necessarily finite-precision, i.e., the representation of real numbers, and the operations on them, are rounded according to some level of precision. We show that in general there are violations of the differential privacy property, and we study the conditions under which we can still guarantee a limited (but, arguably, totally acceptable) variant of the property, under only a minor degradation of the privacy level. Finally, we illustrate our results on two cases of noise-generating distributions: the standard Laplacian mechanism commonly used in differential privacy, and a bivariate version of the Laplacian recently introduced in the setting of privacy-aware geolocation.
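
    The idealized mechanism the paper starts from is the standard Laplacian mechanism: noise drawn from a Laplace distribution with scale sensitivity/epsilon is added to the true value. The sketch below shows that idealized version with NumPy; it deliberately ignores the finite-precision rounding effects that the paper shows can leak information, and the query, sensitivity and epsilon values are illustrative only.

    ```python
    # The idealized Laplacian mechanism for differential privacy, as analyzed in
    # the paper: noise with scale sensitivity/epsilon is added to a true value.
    # Real implementations use finite-precision floats, which is exactly the gap
    # the paper studies; this sketch ignores those rounding effects.
    import numpy as np

    def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float,
                          rng: np.random.Generator) -> float:
        """Return a differentially private release of true_value."""
        scale = sensitivity / epsilon
        return true_value + rng.laplace(loc=0.0, scale=scale)

    rng = np.random.default_rng(42)
    # e.g. a counting query (sensitivity 1) answered with epsilon = 0.5
    print(laplace_mechanism(true_value=137.0, sensitivity=1.0, epsilon=0.5, rng=rng))
    ```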

  10. A Privacy Model for RFID Tag Ownership Transfer

    Directory of Open Access Journals (Sweden)

    Xingchun Yang

    2017-01-01

    The ownership of an RFID tag is often transferred from one owner to another during its life cycle. To address the privacy problems caused by tag ownership transfer, we propose a tag privacy model which captures the adversary’s abilities to obtain secret information inside readers, to corrupt tags, to authenticate tags, and to observe tag ownership transfer processes. This model gives formal definitions of tag forward privacy and backward privacy and can be used to measure the privacy properties of a tag ownership transfer scheme. We also present a tag ownership transfer scheme which is privacy-preserving under the proposed model and satisfies the other common security requirements, in addition to achieving better performance.

  11. 12 CFR 716.2 - Model privacy form and examples.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Model privacy form and examples. 716.2 Section... PRIVACY OF CONSUMER FINANCIAL INFORMATION § 716.2 Model privacy form and examples. (a) Model privacy form..., although use of the model privacy form is not required. (b) Examples. The examples in this part are not...

  12. Privacy-preserving distributed clustering

    DEFF Research Database (Denmark)

    Erkin, Zekeriya; Veugen, Thijs; Toft, Tomas

    2013-01-01

    ... or in some cases, information from different databases is pooled to enrich the data so that the merged database can improve the clustering effort. However, in either case, the content of the database may be privacy sensitive and/or commercially valuable, such that the owners may not want to share their data with any other entity, including the service provider. Such privacy concerns lead to trust issues between entities, which clearly damages the functioning of the service and even blocks cooperation between entities with similar data sets. To enable joint efforts with private data, we propose a protocol ... provider with computations. Experimental results clearly indicate that the work we present is an efficient way of deploying a privacy-preserving clustering algorithm in a distributed manner.

  13. Culture, Privacy Conception and Privacy Concern: Evidence from Europe before PRISM

    OpenAIRE

    Omrani, Nessrine; Soulié, Nicolas

    2017-01-01

    This article analyses individuals’ online privacy concerns between cultural country groups. We use a dataset of more than 14 000 Internet users collected by the European Union in 2010 in 26 EU countries. We use a probit model to examine the variables associated with the probability of being concerned about privacy, in order to draw policy and regulatory implications. The results show that women and poor people are more concerned than their counterparts. People who often use Internet are not p...

  14. Evidence Accumulation and Choice Maintenance Are Dissociated in Human Perceptual Decision Making.

    Directory of Open Access Journals (Sweden)

    Mads Lund Pedersen

    Perceptual decision making in monkeys relies on decision neurons, which accumulate evidence and maintain choices until a response is given. In humans, several brain regions have been proposed to accumulate evidence, but it is unknown if these regions also maintain choices. To test if accumulator regions in humans also maintain decisions, we compared delayed and self-paced responses during a face/house discrimination decision making task. Computational modeling and fMRI results revealed dissociated processes of evidence accumulation and decision maintenance, with potential accumulator activations found in the dorsomedial prefrontal cortex, right inferior frontal gyrus and bilateral insula. Potential maintenance activation spanned the frontal pole, temporal gyri, precuneus and the lateral occipital and frontal orbital cortices. Results of a quantitative reverse inference meta-analysis performed to differentiate the functions associated with the identified regions did not narrow down potential accumulation regions, but suggested that response-maintenance might rely on a verbalization of the response.

  15. Guaranteeing Privacy-Observing Data Exchange

    DEFF Research Database (Denmark)

    Probst, Christian W.

    2016-01-01

    Privacy is a major concern in large parts of the world when exchanging information. Ideally, we would like to have fine-grained control over how information that we deem sensitive can be propagated and used. While privacy policy languages exist, it is not possible to control whether the entity that receives data is living up to its own policy specification. In this work we present our initial work on an approach that empowers data owners to specify their privacy preferences, and data consumers to specify their data needs. Using a static analysis of the two specifications, our approach then finds a communication scheme that complies with these preferences and needs. While applicable to online transactions, the same techniques can be used in the development of IT systems dealing with sensitive data. To the best of our knowledge, no existing privacy policy language supports negotiation...

  16. Privacy and policy for genetic research.

    Science.gov (United States)

    DeCew, Judith Wagner

    2004-01-01

    I begin with a discussion of the value of privacy and what we lose without it. I then turn to the difficulties of preserving privacy for genetic information and other medical records in the face of advanced information technology. I suggest three alternative public policy approaches to the problem of protecting individual privacy and also preserving databases for genetic research: (1) governmental guidelines and centralized databases, (2) corporate self-regulation, and (3) my hybrid approach. None of these are unproblematic; I discuss strengths and drawbacks of each, emphasizing the importance of protecting the privacy of sensitive medical and genetic information as well as letting information technology flourish to aid patient care, public health and scientific research.

  17. Differential privacy-based evaporative cooling feature selection and classification with relief-F and random forests.

    Science.gov (United States)

    Le, Trang T; Simmons, W Kyle; Misaki, Masaya; Bodurka, Jerzy; White, Bill C; Savitz, Jonathan; McKinney, Brett A

    2017-09-15

    Classification of individuals into disease or clinical categories from high-dimensional biological data with low prediction error is an important challenge of statistical learning in bioinformatics. Feature selection can improve classification accuracy but must be incorporated carefully into cross-validation to avoid overfitting. Recently, feature selection methods based on differential privacy, such as differentially private random forests and reusable holdout sets, have been proposed. However, for domains such as bioinformatics, where the number of features is much larger than the number of observations p≫n , these differential privacy methods are susceptible to overfitting. We introduce private Evaporative Cooling, a stochastic privacy-preserving machine learning algorithm that uses Relief-F for feature selection and random forest for privacy preserving classification that also prevents overfitting. We relate the privacy-preserving threshold mechanism to a thermodynamic Maxwell-Boltzmann distribution, where the temperature represents the privacy threshold. We use the thermal statistical physics concept of Evaporative Cooling of atomic gases to perform backward stepwise privacy-preserving feature selection. On simulated data with main effects and statistical interactions, we compare accuracies on holdout and validation sets for three privacy-preserving methods: the reusable holdout, reusable holdout with random forest, and private Evaporative Cooling, which uses Relief-F feature selection and random forest classification. In simulations where interactions exist between attributes, private Evaporative Cooling provides higher classification accuracy without overfitting based on an independent validation set. In simulations without interactions, thresholdout with random forest and private Evaporative Cooling give comparable accuracies. We also apply these privacy methods to human brain resting-state fMRI data from a study of major depressive disorder. Code
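
    The record combines Relief-F scoring with a differentially private threshold. The Python sketch below is not the authors' private Evaporative Cooling algorithm; it only illustrates the generic building block of comparing feature scores against a Laplace-perturbed threshold, with illustrative function names, parameters and noise scales:

        import numpy as np

        def noisy_threshold_select(scores, threshold, epsilon, sensitivity=1.0, rng=None):
            """Keep features whose (noised) score exceeds a (noised) threshold.

            scores      -- dict of feature name -> relevance score (e.g., Relief-F)
            threshold   -- public cut-off applied to the scores
            epsilon     -- privacy budget controlling the Laplace noise scale
            sensitivity -- assumed worst-case change of a score if one record changes
            """
            rng = rng or np.random.default_rng()
            selected = []
            for name, score in scores.items():
                # perturb both sides of the comparison so the selection decision
                # only leaks noisy information about any individual record
                noisy_t = threshold + rng.laplace(0.0, 2.0 * sensitivity / epsilon)
                noisy_s = score + rng.laplace(0.0, 4.0 * sensitivity / epsilon)
                if noisy_s >= noisy_t:
                    selected.append(name)
            return selected

        if __name__ == "__main__":
            demo_scores = {"snp_1": 0.42, "snp_2": 0.05, "voxel_7": 0.31, "age": 0.02}
            print(noisy_threshold_select(demo_scores, threshold=0.20, epsilon=1.0))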

  18. Are personal health records safe? A review of free web-accessible personal health record privacy policies.

    Science.gov (United States)

    Carrión Señor, Inmaculada; Fernández-Alemán, José Luis; Toval, Ambrosio

    2012-08-23

    Several obstacles prevent the adoption and use of personal health record (PHR) systems, including users' concerns regarding the privacy and security of their personal health information. To analyze the privacy and security characteristics of PHR privacy policies. It is hoped that identification of the strengths and weaknesses of the PHR systems will be useful for PHR users, health care professionals, decision makers, and designers. We conducted a systematic review using the principal databases related to health and computer science to discover the Web-based and free PHR systems mentioned in published articles. The privacy policy of each PHR system selected was reviewed to extract its main privacy and security characteristics. The search of databases and the myPHR website provided a total of 52 PHR systems, of which 24 met our inclusion criteria. Of these, 17 (71%) allowed users to manage their data and to control access to their health care information. Only 9 (38%) PHR systems permitted users to check who had accessed their data. The majority of PHR systems used information related to the users' accesses to monitor and analyze system use, 12 (50%) of them aggregated user information to publish trends, and 20 (83%) used diverse types of security measures. Finally, 15 (63%) PHR systems were based on regulations or principles such as the US Health Insurance Portability and Accountability Act (HIPAA) and the Health on the Net Foundation Code of Conduct (HONcode). Most privacy policies of PHR systems do not provide an in-depth description of the security measures that they use. Moreover, compliance with standards and regulations in PHR systems is still low.

  19. Privacy Analysis in Mobile Social Networks

    DEFF Research Database (Denmark)

    Sapuppo, Antonio

    2012-01-01

    disclosure decisions happening in ordinary human communication. Consequently, in this paper we provide insight into influential factors of human data disclosure decisions, by presenting and analysing results of an empirical investigation comprising two online surveys. We focus on the following influential...

  20. Mandatory Submission to the Identification of the Genetic Profile for Criminal Purposes: An Approach Pursuant to the Right to Privacy and the Dignity of the Human Person

    Directory of Open Access Journals (Sweden)

    George Maia Santos

    2015-12-01

    Full Text Available This article aims to demonstrate that mandatory submission of a person convicted of an intentional crime involving serious violence against the person, or of a heinous crime, to genetic profiling through DNA (deoxyribonucleic acid) extraction is offensive to fundamental rights, even when the sample is taken by a proper and painless technique. To this end, the article starts from the overall concept of the right to privacy, which is configured as a negative right, or protection against unlawful state interference, intended to safeguard a basic right to free individual self-determination. Genetic intimacy is then defined as an asset capable of revealing physical, psychological, behavioral and disease features which, if disclosed or accessed without the consent of the accused, may generate stigmatization and discrimination of the subject involved, thereby violating the right to privacy. In conclusion, we emphasize that, beyond the right to privacy, compulsory provision of biological material for genetic profiling is offensive to the fundamental rights to physical liberty and freedom of movement, to physical integrity, to freedom of religion or conscience, to non-discrimination, and to silence and the privilege against self-incrimination, and, ultimately, to the greatest vector of all fundamental rights: the dignity of the human person.

  1. Achieving Network Level Privacy in Wireless Sensor Networks†

    Science.gov (United States)

    Shaikh, Riaz Ahmed; Jameel, Hassan; d’Auriol, Brian J.; Lee, Heejo; Lee, Sungyoung; Song, Young-Jae

    2010-01-01

    Full network level privacy has often been categorized into four sub-categories: Identity, Route, Location and Data privacy. Achieving full network level privacy is a critical and challenging problem due to the constraints imposed by the sensor nodes (e.g., energy, memory and computation power), sensor networks (e.g., mobility and topology) and QoS issues (e.g., packet reachability and timeliness). In this paper, we propose two new identity, route and location privacy algorithms and a data privacy mechanism that address this problem. The proposed solutions provide additional trustworthiness and reliability at a modest cost of memory and energy. Also, we prove that our proposed solutions provide protection against various privacy disclosure attacks, such as eavesdropping and hop-by-hop trace back attacks. PMID:22294881

  2. Achieving Network Level Privacy in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Sungyoung Lee

    2010-02-01

    Full Text Available Full network level privacy has often been categorized into four sub-categories: Identity, Route, Location and Data privacy. Achieving full network level privacy is a critical and challenging problem due to the constraints imposed by the sensor nodes (e.g., energy, memory and computation power), sensor networks (e.g., mobility and topology) and QoS issues (e.g., packet reachability and timeliness). In this paper, we propose two new identity, route and location privacy algorithms and a data privacy mechanism that address this problem. The proposed solutions provide additional trustworthiness and reliability at a modest cost of memory and energy. Also, we prove that our proposed solutions provide protection against various privacy disclosure attacks, such as eavesdropping and hop-by-hop trace back attacks.

  3. Digital privacy in the marketplace perspectives on the information exchange

    CERN Document Server

    Milne, George

    2015-01-01

    Digital Privacy in the Marketplace focuses on the data exchanges between marketers and consumers, with special attention to the privacy challenges that are brought about by new information technologies. The purpose of this book is to provide a background source to help the reader think more deeply about the impact of privacy issues on both consumers and marketers. It covers topics such as: why privacy is needed, the technological, historical and academic theories of privacy, how market exchange affects privacy, what are the privacy harms and protections available, and what is the likely future of privacy.

  4. 48 CFR 352.224-70 - Privacy Act.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Privacy Act. 352.224-70... SOLICITATION PROVISIONS AND CONTRACT CLAUSES Texts of Provisions and Clauses 352.224-70 Privacy Act. As prescribed in 324.103(b)(2), the Contracting Officer shall insert the following clause: Privacy Act (January...

  5. Access to Information and Privacy | IDRC - International ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    As a Crown corporation, IDRC is subject to Canada's laws on access to information and privacy protection. The following resources will help you learn more about IDRC and the access to information and privacy acts, including instructions for submitting an access to information or privacy act (ATIP) request. IDRC and ATIP ...

  6. PRIVACY PROTECTION PROBLEMS IN SOCIAL NETWORKS

    OpenAIRE

    OKUR, M. Cudi

    2011-01-01

    Protecting privacy has become a major concern for most social network users because of the increased difficulty of controlling online data. This article presents an assessment of the common privacy-related risks of social networking sites. Open and hidden privacy risks of active and passive online profiles are examined, and the increasing share of social networking in these phenomena is discussed. The inadequacy of available legal and institutional protection is demonstrated and the effectiveness of...

  7. Facebook: Personality and privacy on profiles

    OpenAIRE

    Casado Riera, Carla; Oberst, Ursula; Carbonell, Xavier

    2015-01-01

    The aim of this study was to examine the possible relationship between the privacy settings in Facebook profiles and two personality dimensions, extraversion and neuroticism, in relation to gender. The Privacy on Facebook Questionnaire and the Eysenck Personality Inventory were applied to a sample of 92 women and 70 men, all users of Facebook. No significant relationship was found between extraversion or neuroticism and the privacy settings of Facebook profiles, but the results showed significant...

  8. New threats to health data privacy.

    Science.gov (United States)

    Li, Fengjun; Zou, Xukai; Liu, Peng; Chen, Jake Y

    2011-11-24

    Along with the rapid digitalization of health data (e.g. Electronic Health Records), there is increasing concern about maintaining data privacy while garnering the benefits, especially when the data are required to be published for secondary use. Most of the current research on protecting health data privacy is centered around data de-identification and data anonymization, which removes the identifiable information from the published health data to prevent an adversary from reasoning about the privacy of the patients. However, published health data is not the only source that the adversaries can count on: with a large amount of information that people voluntarily share on the Web, sophisticated attacks that join disparate information pieces from multiple sources against health data privacy become practical. Limited effort has so far been devoted to studying these attacks. We study how patient privacy could be compromised with the help of today's information technologies. In particular, we show that private healthcare information could be collected by aggregating and associating disparate pieces of information from multiple online data sources including online social networks, public records and search engine results. We present a real-world case study to show that user identity and privacy are highly vulnerable to attribution, inference and aggregation attacks. We also show, with real data analysis, that people are highly identifiable to adversaries even with inaccurate information pieces about the target. We argue that so much information has been made available electronically and online that people are very vulnerable without effective privacy protection.

  9. New threats to health data privacy

    Directory of Open Access Journals (Sweden)

    Li Fengjun

    2011-11-01

    Full Text Available Abstract Background Along with the rapid digitalization of health data (e.g. Electronic Health Records), there is increasing concern about maintaining data privacy while garnering the benefits, especially when the data are required to be published for secondary use. Most of the current research on protecting health data privacy is centered around data de-identification and data anonymization, which removes the identifiable information from the published health data to prevent an adversary from reasoning about the privacy of the patients. However, published health data is not the only source that the adversaries can count on: with a large amount of information that people voluntarily share on the Web, sophisticated attacks that join disparate information pieces from multiple sources against health data privacy become practical. Limited effort has so far been devoted to studying these attacks. Results We study how patient privacy could be compromised with the help of today’s information technologies. In particular, we show that private healthcare information could be collected by aggregating and associating disparate pieces of information from multiple online data sources including online social networks, public records and search engine results. We present a real-world case study to show that user identity and privacy are highly vulnerable to attribution, inference and aggregation attacks. We also show, with real data analysis, that people are highly identifiable to adversaries even with inaccurate information pieces about the target. Conclusion We argue that so much information has been made available electronically and online that people are very vulnerable without effective privacy protection.

  10. PrivateRide: A Privacy-Enhanced Ride-Hailing Service

    Directory of Open Access Journals (Sweden)

    Pham Anh

    2017-04-01

    Full Text Available In the past few years, we have witnessed a rise in the popularity of ride-hailing services (RHSs, an online marketplace that enables accredited drivers to use their own cars to drive ride-hailing users. Unlike other transportation services, RHSs raise significant privacy concerns, as providers are able to track the precise mobility patterns of millions of riders worldwide. We present the first survey and analysis of the privacy threats in RHSs. Our analysis exposes high-risk privacy threats that do not occur in conventional taxi services. Therefore, we propose PrivateRide, a privacy-enhancing and practical solution that offers anonymity and location privacy for riders, and protects drivers’ information from harvesting attacks. PrivateRide lowers the high-risk privacy threats in RHSs to a level that is at least as low as that of many taxi services. Using real data-sets from Uber and taxi rides, we show that PrivateRide significantly enhances riders’ privacy, while preserving tangible accuracy in ride matching and fare calculation, with only negligible effects on convenience. Moreover, by using our Android implementation for experimental evaluations, we show that PrivateRide’s overhead during ride setup is negligible. In short, we enable privacy-conscious riders to achieve levels of privacy that are not possible in current RHSs and even in some conventional taxi services, thereby offering a potential business differentiator.

  11. Privacy Practices of Health Social Networking Sites: Implications for Privacy and Data Security in Online Cancer Communities.

    Science.gov (United States)

    Charbonneau, Deborah H

    2016-08-01

    While online communities for social support continue to grow, little is known about the state of privacy practices of health social networking sites. This article reports on a structured content analysis of privacy policies and disclosure practices for 25 online ovarian cancer communities. All of the health social networking sites in the study sample provided privacy statements to users, yet privacy practices varied considerably across the sites. The majority of sites informed users that personal information was collected about participants and shared with third parties (96%, n = 24). Furthermore, more than half of the sites (56%, n = 14) stated that cookies technology was used to track user behaviors. Despite these disclosures, only 36% (n = 9) offered opt-out choices for sharing data with third parties. In addition, very few of the sites (28%, n = 7) allowed individuals to delete their personal information. Discussions about specific security measures used to protect personal information were largely missing. Implications for privacy, confidentiality, consumer choice, and data safety in online environments are discussed. Overall, nurses and other health professionals can utilize these findings to encourage individuals seeking online support and participating in social networking sites to build awareness of privacy risks to better protect their personal health information in the digital age.

  12. Big Brother’s Little Helpers: The Right to Privacy and the Responsibility of Internet Service Providers

    Directory of Open Access Journals (Sweden)

    Yael Ronen

    2015-02-01

    Full Text Available Following the 2013 revelations on the extent of intelligence gathering through internet service providers, this article concerns the responsibility of internet service providers (ISPs) involved in disclosure of personal data to government authorities under the right to privacy, by reference to the developing, non-binding standards applied to businesses under the Protect, Respect and Remedy Framework. The article examines the manner in which the Framework applies to ISPs and looks at measures that ISPs can take to fulfil their responsibility to respect the right to privacy. It utilizes the challenges to the right to privacy to discuss some aspects of the extension of human rights responsibilities to corporations. These include the respective roles of government and non-state actors, the extent to which corporations may be required to act proactively in order to protect the privacy of clients, and the relevance of transnational activity.

  13. Sexiled: Privacy Acquisition Strategies of College Roommates

    Science.gov (United States)

    Erlandson, Karen

    2014-01-01

    This study sought to understand how roommates make privacy bids in college residence halls. The results indicate that privacy for sexual activity is a problem for students living in college residence halls, as almost all participants (82%) reported having dealt with this issue. Two sets of responses were collected and analyzed: privacy acquisition…

  14. Privacy revisited? Old ideals, new realities, and their impact on biobank regimes.

    Science.gov (United States)

    Bialobrzeski, Arndt; Ried, Jens; Dabrock, Peter

    2011-11-01

    Biobanks, collecting human specimens, medical records, and lifestyle-related data, face the challenge of having contradictory missions: on the one hand serving the collective welfare through easy access for medical research, on the other hand adhering to restrictive privacy expectations of people in order to maintain their willingness to participate in such research. In this article, ethical frameworks stressing the societal value of low-privacy expectations in order to secure biomedical research are discussed. It will turn out that neither utilitarian nor communitarian nor classical libertarian ethics frameworks will help to serve both goals. Instead, John Rawls' differentiation of the "right" and the "good" is presented in order to illustrate the possibility of "serving two masters": individual interests of privacy, and societal interests of scientific progress and intergenerational justice. In order to illustrate this counterbalancing concept with an example, the five-pillar concept of the German Ethics Council will be briefly discussed.

  15. Privacy in the Post-NSA Era: Time for a Fundamental Revision?

    NARCIS (Netherlands)

    van der Sloot, B.

    2014-01-01

    Big Brother Watch and others have filed a complaint against the United Kingdom under the European Convention on Human Rights about a violation of Article 8, the right to privacy. It regards the NSA affair and UK-based surveillance activities operated by secret services. The question is whether it

  16. Privacy and CHI : methodologies for studying privacy issues

    NARCIS (Netherlands)

    Patil, S.; Romero, N.A.; Karat, J.

    2006-01-01

    This workshop aims to reflect on methodologies to empirically study privacy issues related to advanced technology. The goal is to address methodological concerns by drawing upon both theoretical perspectives as well as practical experiences.

  17. Patient Privacy in the Era of Big Data

    Directory of Open Access Journals (Sweden)

    Mehmet Kayaalp

    2018-02-01

    Full Text Available Protecting patient privacy requires various technical tools. It involves regulations for sharing, de-identifying, securely storing, transmitting and handling protected health information (PHI). It involves privacy laws and legal agreements. It requires establishing rules for monitoring privacy leaks, determining actions when they occur, and handling de-identified clinical narrative reports. Deidentification is one such indispensable instrument in this set of privacy tools

  18. Biobanking and Privacy in India.

    Science.gov (United States)

    Chaturvedi, Sachin; Srinivas, Krishna Ravi; Muthuswamy, Vasantha

    2016-03-01

    Biobank-based research is not specifically addressed in Indian statutory law and therefore Indian Council for Medical Research guidelines are the primary regulators of biobank research in India. The guidelines allow for broad consent and for any level of identification of specimens. Although privacy is a fundamental right under the Indian Constitution, courts have limited this right when it conflicts with other rights or with the public interest. Furthermore, there is no established privacy test or actionable privacy right in the common law of India. In order to facilitate biobank-based research, both of these lacunae should be addressed by statutory law specifically addressing biobanking and more directly addressing the accompanying privacy concerns. A biobank-specific law should be written with international guidelines in mind, but harmonization with other laws should not be attempted until after India has created a law addressing biobank research within the unique legal and cultural environment of India. © 2016 American Society of Law, Medicine & Ethics.

  19. Location Privacy Techniques in Client-Server Architectures

    DEFF Research Database (Denmark)

    Jensen, Christian Søndergaard; Lu, Hua; Yiu, Man Lung

    2009-01-01

    A typical location-based service returns nearby points of interest in response to a user location. As such services are becoming increasingly available and popular, location privacy emerges as an important issue. In a system that does not offer location privacy, users must disclose their exact...... locations in order to receive the desired services. We view location privacy as an enabling technology that may lead to increased use of location-based services. In this chapter, we consider location privacy techniques that work in traditional client-server architectures without any trusted components other....... Third, their effectiveness is independent of the distribution of other users, unlike the k-anonymity approach. The chapter characterizes the privacy models assumed by existing techniques and categorizes these according to their approach. The techniques are then covered in turn according...

  20. Privacy amplification for quantum key distribution

    International Nuclear Information System (INIS)

    Watanabe, Yodai

    2007-01-01

    This paper examines classical privacy amplification using a universal family of hash functions. In quantum key distribution, the adversary's measurement can wait until the choice of hash functions is announced, and so the adversary's information may depend on the choice. Therefore the existing result on classical privacy amplification, which assumes the independence of the choice from the other random variables, is not applicable to this case. This paper provides a security proof of privacy amplification which is valid even when the adversary's information may depend on the choice of hash functions. The compression rate of the proposed privacy amplification can be taken to be the same as that of the existing one with an exponentially small loss in secrecy of a final key. (fast track communication)
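
    As a toy illustration of privacy amplification with a universal family of hash functions (independent of the security proof discussed in this record), the Python sketch below compresses a raw key with a randomly chosen member of the 2-universal family h(x) = ((a*x + b) mod p) mod 2^m; the prime and key lengths are illustrative choices:

        import secrets

        P = (1 << 127) - 1  # Mersenne prime, larger than the toy raw-key space below

        def choose_hash(out_bits):
            """Pick a random member of the 2-universal family h(x) = ((a*x + b) mod P) mod 2**out_bits."""
            a = secrets.randbelow(P - 1) + 1
            b = secrets.randbelow(P)
            return lambda x: ((a * x + b) % P) % (1 << out_bits)

        def privacy_amplify(raw_key_bits, out_bits):
            """Compress a raw key (list of 0/1 ints) into a shorter, more secret key."""
            assert len(raw_key_bits) < 127, "toy example: the raw key must fit below the prime P"
            x = int("".join(map(str, raw_key_bits)), 2)
            y = choose_hash(out_bits)(x)     # the hash choice is announced publicly
            return [int(bit) for bit in format(y, f"0{out_bits}b")]

        if __name__ == "__main__":
            raw = [secrets.randbelow(2) for _ in range(100)]   # e.g., a partially secret sifted key
            final = privacy_amplify(raw, out_bits=40)          # shorter key, reduced adversary information
            print(len(raw), "->", len(final), "bits:", "".join(map(str, final)))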

  1. 10 CFR 1304.103 - Privacy Act inquiries.

    Science.gov (United States)

    2010-01-01

    ... writing may be sent to: Privacy Act Officer, U.S. Nuclear Waste Technical Review Board, 2300 Clarendon... NUCLEAR WASTE TECHNICAL REVIEW BOARD PRIVACY ACT OF 1974 § 1304.103 Privacy Act inquiries. (a) Requests... contains a record pertaining to him or her may file a request in person or in writing, via the internet, or...

  2. The role of privacy protection in healthcare information systems adoption.

    Science.gov (United States)

    Hsu, Chien-Lung; Lee, Ming-Ren; Su, Chien-Hui

    2013-10-01

    Privacy protection is an important issue and challenge in healthcare information systems (HISs). Recently, several privacy-enhanced HISs have been proposed. Users' privacy perception, intention, and attitude might affect the adoption of such systems. This paper aims to propose a privacy-enhanced HIS framework and investigate the role of privacy protection in HIS adoption. In the proposed framework, privacy protection, access control, and secure transmission modules are designed to enhance the privacy protection of a HIS. An experimental privacy-enhanced HIS is also implemented. Furthermore, we propose a research model extending the unified theory of acceptance and use of technology by considering perceived security and information security literacy and then investigate user adoption of a privacy-enhanced HIS. The experimental results and analyses showed that user adoption of a privacy-enhanced HIS is directly affected by social influence, performance expectancy, facilitating conditions, and perceived security. Perceived security has a mediating effect between information security literacy and user adoption. This study proposes several implications for research and practice to improve the design, development, and promotion of a good healthcare information system with privacy protection.

  3. A Model-Based Privacy Compliance Checker

    OpenAIRE

    Siani Pearson; Damien Allison

    2009-01-01

    Increasingly, e-business organisations are coming under pressure to be compliant to a range of privacy legislation, policies and best practice. There is a clear need for high-level management and administrators to be able to assess in a dynamic, customisable way the degree to which their enterprise complies with these. We outline a solution to this problem in the form of a model-driven automated privacy process analysis and configuration checking system. This system models privacy compliance ...

  4. Privacy Law and Print Photojournalism.

    Science.gov (United States)

    Dykhouse, Caroline Dow

    Reviews of publications about privacy law, of recent court actions, and of interviews with newspaper photographers and attorneys indicate that torts of privacy often conflict with the freedoms to publish and to gather news. Although some guidelines have already been established (about running distorted pictures, "stealing" pictures, taking…

  5. Anonymity versus privacy in the dictator game: revealing donor decisions to recipients does not substantially impact donor behavior.

    Directory of Open Access Journals (Sweden)

    Jeffrey Winking

    Full Text Available Anonymity is often offered in economic experiments in order to eliminate observer effects and induce behavior that would be exhibited under private circumstances. However, anonymity differs from privacy in that interactants are only unaware of each others' identities, while having full knowledge of each others' actions. Such situations are rare outside the laboratory and anonymity might not meet the requirements of some participants to psychologically engage as if their actions were private. In order to explore the impact of a lack of privacy on prosocial behaviors, I expand on a study reported in Dana et al. (2006), in which recipients were left unaware of the Dictator Game and given donations as "bonuses" to their show-up fees for other tasks. In the current study, I explore whether differences between a private Dictator Game (sensu Dana et al. 2006) and a standard anonymous one are due to a desire by dictators to avoid shame or to pursue prestige. Participants of a Dictator Game were randomly assigned to one of four categories: one in which the recipient knew of (1) any donation by an anonymous donor (including zero donations), (2) nothing at all, (3) only zero donations, and (4) only non-zero donations. The results suggest that a lack of privacy increases the shame that selfish-acting participants experience, but that removing such a cost has only minimal effects on actual behavior.

  6. Smart Grid Privacy through Distributed Trust

    Science.gov (United States)

    Lipton, Benjamin

    Though the smart electrical grid promises many advantages in efficiency and reliability, the risks to consumer privacy have impeded its deployment. Researchers have proposed protecting privacy by aggregating user data before it reaches the utility, using techniques of homomorphic encryption to prevent exposure of unaggregated values. However, such schemes generally require users to trust in the correct operation of a single aggregation server. We propose two alternative systems based on secret sharing techniques that distribute this trust among multiple service providers, protecting user privacy against a misbehaving server. We also provide an extensive evaluation of the systems considered, comparing their robustness to privacy compromise, error handling, computational performance, and data transmission costs. We conclude that while all the systems should be computationally feasible on smart meters, the two methods based on secret sharing require much less computation while also providing better protection against corrupted aggregators. Building systems using these techniques could help defend the privacy of electricity customers, as well as customers of other utilities as they move to a more data-driven architecture.
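
    A simplified illustration of the secret-sharing approach described above (a Python sketch, not the specific schemes evaluated in this record): each meter reading is split into additive shares for several aggregation servers, so no single server sees an individual reading, while the servers' partial sums still add up to the true total. The modulus and readings are illustrative values:

        import secrets

        MODULUS = 2 ** 61 - 1  # all arithmetic is done modulo a fixed public value

        def share(reading, n_servers):
            """Split one meter reading into n additive shares that sum to it mod MODULUS."""
            shares = [secrets.randbelow(MODULUS) for _ in range(n_servers - 1)]
            last = (reading - sum(shares)) % MODULUS
            return shares + [last]

        def aggregate(per_server_shares):
            """Each server sums the shares it received; the utility adds the partial sums."""
            partial_sums = [sum(col) % MODULUS for col in per_server_shares]
            return sum(partial_sums) % MODULUS

        if __name__ == "__main__":
            readings = [523, 610, 498, 731]          # watt-hours from four households (toy values)
            n_servers = 3
            # per_server[s] holds the share each household sent to server s
            per_server = [[] for _ in range(n_servers)]
            for r in readings:
                for s, sh in enumerate(share(r, n_servers)):
                    per_server[s].append(sh)
            print("true total:", sum(readings), "reconstructed total:", aggregate(per_server))

    A single misbehaving server learns only uniformly random shares; the individual readings stay hidden unless all servers collude.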

  7. Incorporating BDI Agents into Human-Agent Decision Making Research

    Science.gov (United States)

    Kamphorst, Bart; van Wissen, Arlette; Dignum, Virginia

    Artificial agents, people, institutes and societies all have the ability to make decisions. Decision making as a research area therefore involves a broad spectrum of sciences, ranging from Artificial Intelligence to economics to psychology. The Colored Trails (CT) framework is designed to aid researchers in all fields in examining decision making processes. It is developed both to study interaction between multiple actors (humans or software agents) in a dynamic environment, and to study and model the decision making of these actors. However, agents in the current implementation of CT lack the explanatory power to help understand the reasoning processes involved in decision making. The BDI paradigm that has been proposed in the agent research area to describe rational agents, enables the specification of agents that reason in abstract concepts such as beliefs, goals, plans and events. In this paper, we present CTAPL: an extension to CT that allows BDI software agents that are written in the practical agent programming language 2APL to reason about and interact with a CT environment.

  8. Privacy Act

    Science.gov (United States)

    Learn about the Privacy Act of 1974, the Electronic Government Act of 2002, the Federal Information Security Management Act, and other information about how the Environmental Protection Agency maintains its records.

  9. Privacy-Preserving Restricted Boltzmann Machine

    Directory of Open Access Journals (Sweden)

    Yu Li

    2014-01-01

    Full Text Available With the arrival of the big data era, it is predicted that distributed data mining will lead to an information technology revolution. To motivate different institutes to collaborate with each other, the crucial issue is to eliminate their concerns regarding data privacy. In this paper, we propose a privacy-preserving method for training a restricted Boltzmann machine (RBM). Using our privacy-preserving method, the RBM can be trained without the parties revealing their private data to each other. We provide a correctness and efficiency analysis of our algorithms. The comparative experiment shows that the accuracy is very close to that of the original RBM model.

  10. Social cobots: Anticipatory decision-making for collaborative robots incorporating unexpected human behaviors

    CSIR Research Space (South Africa)

    Can Görür, O

    2018-03-01

    Full Text Available We propose an architecture as a robot’s decision-making mechanism to anticipate a human’s state of mind, and so plan accordingly during a human-robot collaboration task. At the core of the architecture lies a novel stochastic decision...

  11. Privacy-Preserving Location-Based Services

    Science.gov (United States)

    Chow, Chi Yin

    2010-01-01

    Location-based services (LBS for short) providers require users' current locations to answer their location-based queries, e.g., range and nearest-neighbor queries. Revealing personal location information to potentially untrusted service providers could create privacy risks for users. To this end, our objective is to design a privacy-preserving…

  12. Story Lab: Student Data Privacy

    Science.gov (United States)

    Herold, Benjamin

    2015-01-01

    Student data privacy is an increasingly high-profile--and controversial--issue that touches schools and families across the country. There are stories to tell in virtually every community. About three dozen states have passed legislation addressing student data privacy in the past two years, and eight different proposals were floating around…

  13. Vehicular ad hoc network security and privacy

    CERN Document Server

    Lin, X

    2015-01-01

    Unlike any other book in this area, this book provides innovative solutions to security issues, making this book a must read for anyone working with or studying security measures. Vehicular Ad Hoc Network Security and Privacy mainly focuses on security and privacy issues related to vehicular communication systems. It begins with a comprehensive introduction to vehicular ad hoc network and its unique security threats and privacy concerns and then illustrates how to address those challenges in highly dynamic and large size wireless network environments from multiple perspectives. This book is richly illustrated with detailed designs and results for approaching security and privacy threats.

  14. Enhancing Privacy in Wearable IoT through a Provenance Architecture

    Directory of Open Access Journals (Sweden)

    Richard K. Lomotey

    2018-04-01

    Full Text Available The Internet of Things (IoT) is inspired by network interconnectedness of humans, objects, and cloud services to facilitate new use cases and new business models across multiple enterprise domains including healthcare. This creates the need for continuous data streaming in IoT architectures, which are mainly designed following the broadcast model. The model allows IoT devices to sense and deliver information to other nodes (e.g., cloud, physical objects, etc.) that are interested in the information. However, this is a recipe for privacy breaches since sensitive data, such as personal vitals from wearables, can be delivered to undesired sniffing nodes. In order to protect users’ privacy and manufacturers’ IP, as well as to detect and block malicious activity, this research paper proposes a privacy-oriented IoT architecture following the provenance technique. This ensures that the IoT data will only be delivered to the nodes that subscribe to receive the information. Using the provenance technique to ensure high transparency, the work is able to provide trace routes for a digital audit trail. Several empirical evaluations are conducted in a real-world wearable IoT ecosystem to prove the superiority of the proposed work.

  15. Ensuring privacy in the study of pathogen genetics.

    Science.gov (United States)

    Mehta, Sanjay R; Vinterbo, Staal A; Little, Susan J

    2014-08-01

    Rapid growth in the genetic sequencing of pathogens in recent years has led to the creation of large sequence databases. This aggregated sequence data can be very useful for tracking and predicting epidemics of infectious diseases. However, the balance between the potential public health benefit and the risk to personal privacy for individuals whose genetic data (personal or pathogen) are included in such work has been difficult to delineate, because neither the true benefit nor the actual risk to participants has been adequately defined. Existing approaches to minimise the risk of privacy loss to participants are based on de-identification of data by removal of a predefined set of identifiers. These approaches neither guarantee privacy nor protect the usefulness of the data. We propose a new approach to privacy protection that will quantify the risk to participants, while still maximising the usefulness of the data to researchers. This emerging standard in privacy protection and disclosure control, which is known as differential privacy, uses a process-driven rather than data-centred approach to protecting privacy. Copyright © 2014 Elsevier Ltd. All rights reserved.
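
    The differential-privacy idea mentioned in this record can be illustrated with the standard Laplace mechanism for a counting query (a generic Python sketch under an assumed epsilon, not the process-driven approach the authors propose):

        import random

        def dp_count(records, predicate, epsilon):
            """Return a differentially private count of records satisfying a predicate.

            A counting query has sensitivity 1 (adding or removing one individual's
            record changes the count by at most 1), so Laplace noise with scale
            1/epsilon suffices for epsilon-differential privacy.
            """
            true_count = sum(1 for r in records if predicate(r))
            # difference of two exponentials is a Laplace(0, 1/epsilon) sample
            noise = random.expovariate(epsilon) - random.expovariate(epsilon)
            return true_count + noise

        if __name__ == "__main__":
            # toy records: (sequence id, subtype) pairs standing in for pathogen sequences
            sequences = [("s1", "B"), ("s2", "C"), ("s3", "B"), ("s4", "B"), ("s5", "A")]
            noisy = dp_count(sequences, lambda r: r[1] == "B", epsilon=0.5)
            print("noisy count of subtype B sequences:", round(noisy, 2))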

  16. Personalized privacy-preserving frequent itemset mining using randomized response.

    Science.gov (United States)

    Sun, Chongjing; Fu, Yan; Zhou, Junlin; Gao, Hui

    2014-01-01

    Frequent itemset mining is the important first step of association rule mining, which discovers interesting patterns from the massive data. There are increasing concerns about the privacy problem in the frequent itemset mining. Some works have been proposed to handle this kind of problem. In this paper, we introduce a personalized privacy problem, in which different attributes may need different privacy levels protection. To solve this problem, we give a personalized privacy-preserving method by using the randomized response technique. By providing different privacy levels for different attributes, this method can get a higher accuracy on frequent itemset mining than the traditional method providing the same privacy level. Finally, our experimental results show that our method can have better results on the frequent itemset mining while preserving personalized privacy.
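
    A minimal Python sketch of the randomized-response building block behind this record, with a different flip probability per attribute to model personalized privacy levels; the attribute names and probabilities are illustrative and the full itemset-mining step is omitted:

        import random

        def randomize_transaction(transaction, flip_prob):
            """Perturb one boolean transaction with per-attribute randomized response.

            transaction -- dict attribute -> bool (true item presence)
            flip_prob   -- dict attribute -> probability of reporting the opposite value
                           (a higher probability means stronger privacy for that attribute)
            """
            return {a: (not v) if random.random() < flip_prob[a] else v
                    for a, v in transaction.items()}

        def estimate_support(reports, attribute, p_flip):
            """Unbiased estimate of the true support of one attribute from noisy reports."""
            observed = sum(1 for r in reports if r[attribute]) / len(reports)
            # E[observed] = true*(1-p) + (1-true)*p  =>  true = (observed - p) / (1 - 2p)
            return (observed - p_flip) / (1.0 - 2.0 * p_flip)

        if __name__ == "__main__":
            random.seed(7)
            flip = {"diagnosis": 0.30, "zip_code": 0.10}     # diagnosis gets stronger protection
            truth = [{"diagnosis": i % 4 == 0, "zip_code": i % 2 == 0} for i in range(20000)]
            noisy = [randomize_transaction(t, flip) for t in truth]
            est = estimate_support(noisy, "diagnosis", flip["diagnosis"])
            print("estimated support of 'diagnosis':", round(est, 3))   # true support is 0.25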

  17. Privacy as virtue: searching for a new privacy paradigm in the age of Big Data

    NARCIS (Netherlands)

    van der Sloot, B.; Beyvers, E.; Helm, P.; Hennig, M.; Keckeis, C.; Kreknin, I.; Püschel, F.

    2017-01-01

    Originally, privacy was conceived primarily as a duty of the state not to abuse its powers. It could not, for example, enter a private house without legitimate reason or reasonable suspicion that the owner of the house had engaged in, for example, criminal conduct. Gradually, however, privacy has been

  18. Security measures required for HIPAA privacy.

    Science.gov (United States)

    Amatayakul, M

    2000-01-01

    HIPAA security requirements include administrative, physical, and technical services and mechanisms to safeguard confidentiality, availability, and integrity of health information. Security measures, however, must be implemented in the context of an organization's privacy policies. Because HIPAA's proposed privacy rules are flexible and scalable to account for the nature of each organization's business, size, and resources, each organization will be determining its own privacy policies within the context of the HIPAA requirements and its security capabilities. Security measures cannot be implemented in a vacuum.

  19. Efficient Dynamic Searchable Encryption with Forward Privacy

    Directory of Open Access Journals (Sweden)

    Etemad Mohammad

    2018-01-01

    Full Text Available Searchable symmetric encryption (SSE) enables a client to perform searches over its outsourced encrypted files while preserving privacy of the files and queries. Dynamic schemes, where files can be added or removed, leak more information than static schemes. For dynamic schemes, forward privacy requires that a newly added file cannot be linked to previous searches. We present a new dynamic SSE scheme that achieves forward privacy by replacing the keys revealed to the server on each search. Our scheme is efficient and parallelizable and outperforms the best previous schemes providing forward privacy, and achieves competitive performance with dynamic schemes without forward privacy. We provide a full security proof in the random oracle model. In our experiments on the Wikipedia archive of about four million pages, the server takes one second to perform a search with 100,000 results.
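
    A highly simplified Python sketch of the key-replacement idea summarized above; it is not the scheme presented in this record. The client keeps a per-keyword key and counter, a search reveals only the keys used so far, and the client then switches to a fresh key so that later additions cannot be linked to the revealed search tokens (class and helper names are illustrative):

        import hmac, hashlib, secrets

        def prf(key, message):
            return hmac.new(key, message, hashlib.sha256).digest()

        class ToyForwardPrivateIndex:
            def __init__(self):
                self.server_index = {}    # label -> file id (held by the server)
                self.client_state = {}    # keyword -> list of (key, counter); the last entry is active

            def add(self, keyword, file_id):
                epochs = self.client_state.setdefault(keyword, [(secrets.token_bytes(32), 0)])
                key, count = epochs[-1]
                label = prf(key, count.to_bytes(4, "big"))   # unlinkable without the active key
                self.server_index[label] = file_id
                epochs[-1] = (key, count + 1)

            def search(self, keyword):
                results = []
                epochs = self.client_state.get(keyword, [])
                for key, count in epochs:
                    # in a real scheme the client sends tokens; the lookup is inlined here
                    results += [self.server_index[prf(key, i.to_bytes(4, "big"))] for i in range(count)]
                if epochs:
                    # forward privacy: open a fresh epoch, so files added after this search
                    # use a key the server has never seen and cannot link to the old tokens
                    epochs.append((secrets.token_bytes(32), 0))
                return results

        if __name__ == "__main__":
            idx = ToyForwardPrivateIndex()
            idx.add("privacy", "doc-17")
            idx.add("privacy", "doc-42")
            print(idx.search("privacy"))    # ['doc-17', 'doc-42']
            idx.add("privacy", "doc-99")    # stored under the fresh, still-secret key
            print(idx.search("privacy"))    # ['doc-17', 'doc-42', 'doc-99']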

  20. Protecting privacy in a clinical data warehouse.

    Science.gov (United States)

    Kong, Guilan; Xiao, Zhichun

    2015-06-01

    Peking University has several prestigious teaching hospitals in China. To make secondary use of massive medical data for research purposes, construction of a clinical data warehouse is imperative at Peking University. However, a big concern for clinical data warehouse construction is how to protect patient privacy. In this project, we propose to use a combination of symmetric block ciphers, asymmetric ciphers, and cryptographic hashing algorithms to protect patient privacy information. The novelty of our privacy protection approach lies in message-level data encryption, the key caching system, and the cryptographic key management system. The proposed privacy protection approach is scalable to clinical data warehouse construction with any size of medical data. With the composite privacy protection approach, the clinical data warehouse can be secure enough to keep the confidential data from leaking to the outside world. © The Author(s) 2014.
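
    The combination of symmetric block ciphers, asymmetric ciphers and cryptographic hashing described above can be sketched with a standard hybrid-encryption pattern (Python, using the third-party cryptography package); this is a generic illustration under assumed requirements, not the warehouse's actual message-level design or key-caching system, and the identifiers and salt are made up for the example:

        import hashlib
        from cryptography.fernet import Fernet
        from cryptography.hazmat.primitives import hashes
        from cryptography.hazmat.primitives.asymmetric import rsa, padding

        def pseudonymize(patient_id, secret_salt):
            """Replace a direct identifier with a salted hash so records can still be linked."""
            return hashlib.sha256(secret_salt + patient_id.encode()).hexdigest()

        def encrypt_record(record_bytes, recipient_public_key):
            """Hybrid encryption: a fresh symmetric key per record, wrapped with RSA-OAEP."""
            data_key = Fernet.generate_key()                      # symmetric key for the record body
            ciphertext = Fernet(data_key).encrypt(record_bytes)   # fast bulk encryption
            wrapped_key = recipient_public_key.encrypt(           # only the key custodian can unwrap
                data_key,
                padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                             algorithm=hashes.SHA256(), label=None))
            return ciphertext, wrapped_key

        if __name__ == "__main__":
            private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
            pid = pseudonymize("PKU-000123", secret_salt=b"demo-salt-change-me")
            ct, wrapped = encrypt_record(b'{"dx": "I10", "sbp": 142}', private_key.public_key())
            # decryption by the authorized party: unwrap the data key, then decrypt the record
            data_key = private_key.decrypt(wrapped, padding.OAEP(
                mgf=padding.MGF1(algorithm=hashes.SHA256()), algorithm=hashes.SHA256(), label=None))
            print(pid[:16], Fernet(data_key).decrypt(ct))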

  1. Kids Sell: Celebrity Kids’ Right to Privacy

    Directory of Open Access Journals (Sweden)

    Seong Choul Hong

    2016-04-01

    Full Text Available The lives of celebrities are often spotlighted in the media because of their newsworthiness; however, many celebrities argue that their right to privacy is often infringed upon. Concerns about celebrity privacy are not limited to the celebrities themselves and often expand to their children. As a result of their popularity, public interest has pushed paparazzi and journalists to pursue trivial and private details about the lives of both celebrities and their children. This paper investigates conflicting areas where the right to privacy and the right to know collide when dealing with the children of celebrities. In general, the courts have been unsympathetic to celebrity privacy claims, noting their newsworthiness and self-promoted characteristic. Unless the press violates news-gathering ethics or torts, the courts will often rule in favor of the media. However, the story becomes quite different when related to an infringement on the privacy of celebrities’ children. This paper argues that all children have a right to protect their privacy regardless of their parents’ social status. Children of celebrities should not be exempt from principles of privacy just because their parents are celebrities. Furthermore, they should not be exposed by the media without the voluntary consent of their legal patrons. That is, the right of the media to publish and the newsworthiness of children of celebrities must be acknowledged only within narrow limits.

  2. Practical secure decision tree learning in a teletreatment application

    NARCIS (Netherlands)

    de Hoogh, Sebastiaan; Schoenmakers, Berry; Chen, Ping; op den Akker, Harm

    In this paper we develop a range of practical cryptographic protocols for secure decision tree learning, a primary problem in privacy preserving data mining. We focus on particular variants of the well-known ID3 algorithm allowing a high level of security and performance at the same time. Our

  3. Practical secure decision tree learning in a teletreatment application

    NARCIS (Netherlands)

    Hoogh, de S.J.A.; Schoenmakers, B.; Chen, Ping; Op den Akker, H.; Christin, N.; Safavi-Naini, R.

    2014-01-01

    In this paper we develop a range of practical cryptographic protocols for secure decision tree learning, a primary problem in privacy preserving data mining. We focus on particular variants of the well-known ID3 algorithm allowing a high level of security and performance at the same time. Our
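
    As background for the ID3 variants mentioned in the two records above, the Python sketch below computes the plain (non-secure) information-gain criterion that ID3 uses to choose a split attribute; the secure protocols in the records evaluate essentially this quantity under cryptographic protection, which is not shown here, and the toy data are illustrative:

        import math
        from collections import Counter

        def entropy(labels):
            """Shannon entropy of a list of class labels."""
            total = len(labels)
            return -sum((c / total) * math.log2(c / total) for c in Counter(labels).values())

        def information_gain(rows, labels, attribute):
            """ID3 split criterion: entropy reduction from partitioning on one attribute."""
            subsets = {}
            for row, label in zip(rows, labels):
                subsets.setdefault(row[attribute], []).append(label)
            remainder = sum(len(sub) / len(labels) * entropy(sub) for sub in subsets.values())
            return entropy(labels) - remainder

        if __name__ == "__main__":
            # toy teletreatment-style data: does the patient follow an exercise advice message?
            rows = [{"time": "morning", "tone": "neutral"}, {"time": "morning", "tone": "friendly"},
                    {"time": "evening", "tone": "friendly"}, {"time": "evening", "tone": "neutral"}]
            labels = ["yes", "yes", "no", "no"]
            best = max(rows[0], key=lambda a: information_gain(rows, labels, a))
            print({a: round(information_gain(rows, labels, a), 3) for a in rows[0]}, "-> split on", best)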

  4. 17 CFR 160.2 - Model privacy form and examples.

    Science.gov (United States)

    2010-04-01

    ... examples. 160.2 Section 160.2 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION PRIVACY OF CONSUMER FINANCIAL INFORMATION § 160.2 Model privacy form and examples. (a) Model privacy form..., although use of the model privacy form is not required. (b) Examples. The examples in this part are not...

  5. Privacy-preserving digital rights management

    NARCIS (Netherlands)

    Conrado, C.; Petkovic, M.; Jonker, W.; Jonker, W.; Petkovic, M.

    2004-01-01

    DRM systems provide a means for protecting digital content, but at the same time they violate the privacy of users in a number of ways. This paper addresses privacy issues in DRM systems. The main challenge is how to allow a user to interact with the system in an anonymous/pseudonymous way, while

  6. Privacy-preserving Identity Management

    OpenAIRE

    Milutinovic, Milica

    2015-01-01

    With the technological advances and the evolution of online services, user privacy is becoming a crucial issue in the modern day society. Privacy in the general sense refers to individuals’ ability to protect information about themselves and selectively present it to other entities. This concept is nowadays strongly affected by everyday practices that assume personal data disclosure, such as online shopping and participation in loyalty schemes. This makes it difficult for an individual to con...

  7. The Privacy Problem: Although School Librarians Seldom Discuss It, Students' Privacy Rights Are under Attack

    Science.gov (United States)

    Adams, Helen R.

    2011-01-01

    Every day in school libraries nationwide, students' privacy rights are under attack, but many principals, teachers, parents, and community members do not know much about these rights. Even though school librarians are among the strongest proponents of privacy, the subject is rarely discussed, probably because state and federal laws can be…

  8. Privacy in Online Social Networking Sites

    OpenAIRE

    M.Ida Evones

    2015-01-01

    There are more than 192 active social networking websites. Bringing every kind of social group together in one place and letting them interact is really a big thing indeed. A huge amount of information is processed on these sites each day, which ends up making them vulnerable to attack. There is no systematic framework taking into account the importance of privacy. Increased privacy settings don’t always guarantee privacy when there is a loophole in the applications. Lack of user education results in over sh...

  9. A Secure and Privacy-Preserving Targeted Ad-System

    Science.gov (United States)

    Androulaki, Elli; Bellovin, Steven M.

    Thanks to its low product-promotion cost and its efficiency, targeted online advertising has become very popular. Unfortunately, being profile-based, online advertising methods violate consumers' privacy, which has engendered resistance to the ads. However, protecting privacy through anonymity seems to encourage click-fraud. In this paper, we define consumer's privacy and present a privacy-preserving, targeted ad system (PPOAd) which is resistant towards click fraud. Our scheme is structured to provide financial incentives to all entities involved.

  10. Legal process, litigation, and judicial decisions.

    Science.gov (United States)

    Beresford, H Richard

    2013-01-01

    Ethically salient issues in neurologic care may have important legal overtones. This chapter considers some of these, emphasizing how law may influence the outcome of controversies over how best to promote autonomy, beneficence, and justice in the care of individuals with neurologic disorders. Constitutional, statutory, and judicial dimensions are addressed. With respect to autonomy, discussion emphasizes legal dimensions of the doctrine of informed consent and the obligations of medical professionals to protect the privacy and confidentiality of their patients. The discussion of beneficence focuses on issues relating to actual or potential conflicts of interest in the care of patients and on the conduct of research involving human subjects. The section on justice considers how law aims to define protectable rights and interests of individuals and to provide a fair and efficient process for resolving disputes. Applications of legal principles and doctrines are illustrated primarily through the examples afforded by judicial decisions. These cases demonstrate how law both promotes ethical decision-making and protects the rights and interests of those affected. The cases also highlight some of the ethical quandaries that evoke resort to litigation and the limits of law in advancing ethically appropriate outcomes. © 2013 Elsevier B.V. All rights reserved.

  11. Computer-Aided Identification and Validation of Privacy Requirements

    Directory of Open Access Journals (Sweden)

    Rene Meis

    2016-05-01

    Full Text Available Privacy is a software quality that is closely related to security. The main difference is that security properties aim at the protection of assets that are crucial for the considered system, and privacy aims at the protection of personal data that are processed by the system. The identification of privacy protection needs in complex systems is a hard and error prone task. Stakeholders whose personal data are processed might be overlooked, or the sensitivity and the need of protection of the personal data might be underestimated. The later personal data and the needs to protect them are identified during the development process, the more expensive it is to fix these issues, because the needed changes of the system-to-be often affect many functionalities. In this paper, we present a systematic method to identify the privacy needs of a software system based on a set of functional requirements by extending the problem-based privacy analysis (ProPAn method. Our method is tool-supported and automated where possible to reduce the effort that has to be spent for the privacy analysis, which is especially important when considering complex systems. The contribution of this paper is a semi-automatic method to identify the relevant privacy requirements for a software-to-be based on its functional requirements. The considered privacy requirements address all dimensions of privacy that are relevant for software development. As our method is solely based on the functional requirements of the system to be, we enable users of our method to identify the privacy protection needs that have to be addressed by the software-to-be at an early stage of the development. As initial evaluation of our method, we show its applicability on a small electronic health system scenario.

  12. Gain-Based Relief for Invasion of Privacy

    Directory of Open Access Journals (Sweden)

    Sirko Harder

    2013-11-01

    Full Text Available In many common law jurisdictions, some or all instances of invasion of privacy constitute a privacy-specific wrong either at common law (including equity or under statute. A remedy invariably available for such a wrong is compensation for loss. However, the plaintiff may instead seek to claim the profit the defendant has made from the invasion. This article examines when a plaintiff is, and should be, entitled to claim that profit, provided that invasion of privacy is actionable as such. After a brief overview of the relevant law in major common law jurisdictions, the article investigates how invasion of privacy fits into a general concept of what is called ‘restitution for wrongs’. It will be argued that the right to privacy is a right against the whole world and as such forms a proper basis of awarding gain-based relief for the unauthorised use of that right.

  13. 12 CFR 573.2 - Model privacy form and examples.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 5 2010-01-01 2010-01-01 false Model privacy form and examples. 573.2 Section... FINANCIAL INFORMATION § 573.2 Model privacy form and examples. (a) Model privacy form. Use of the model... privacy form is not required. (b) Examples. The examples in this part are not exclusive. Compliance with...

  14. 12 CFR 332.2 - Model privacy form and examples.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 4 2010-01-01 2010-01-01 false Model privacy form and examples. 332.2 Section... POLICY PRIVACY OF CONSUMER FINANCIAL INFORMATION § 332.2 Model privacy form and examples. (a) Model... this part, although use of the model privacy form is not required. (b) Examples. The examples in this...

  15. 12 CFR 216.2 - Model privacy form and examples.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 2 2010-01-01 2010-01-01 false Model privacy form and examples. 216.2 Section... PRIVACY OF CONSUMER FINANCIAL INFORMATION (REGULATION P) § 216.2 Model privacy form and examples. (a... of this part, although use of the model privacy form is not required. (b) Examples. The examples in...

  16. 43 CFR 2.47 - Records subject to Privacy Act.

    Science.gov (United States)

    2010-10-01

    ... 43 Public Lands: Interior 1 2010-10-01 2010-10-01 false Records subject to Privacy Act. 2.47 Section 2.47 Public Lands: Interior Office of the Secretary of the Interior RECORDS AND TESTIMONY; FREEDOM OF INFORMATION ACT Privacy Act § 2.47 Records subject to Privacy Act. The Privacy Act applies to all...

  17. Genetic privacy and non-discrimination.

    Science.gov (United States)

    Romeo Casabona, Carlos María

    2011-01-01

    The UN Inter-Agency Committee on Bioethics met for its tenth meeting at the UNESCO headquarters in Paris on 4-5th March 2011. Member organisations such as the WHO and UNESCO were in attendance alongside associate members such as the Council for Europe, the European Commission, the Organisation for Economic Co-operation and Development and the World Trade Organisation. Discussion centred on the theme "genetic privacy and nondiscrimination". The United Nations Economic and Social Council (ECOSOC) had previously considered, from a legal and ethical perspective, the implications of increasingly sophisticated technologies for genetic privacy and non-discrimination in fields such as medicine, employment and insurance. Thus, the ECOSOC requested that UNESCO report on relevant developments in the field of genetic privacy and non-discrimination. In parallel with a consultation process with member states, UNESCO launched a consultation with the UN Interagency Committee on Bioethics. This article analyses the report presented by the author concerning the analysis of the current contentions in the field and illustrates attempts at responding on a normative level to a perceived threat to genetic privacy and non-discrimination.

  18. How awareness changes the relative weights of evidence during human decision-making

    NARCIS (Netherlands)

    de Lange, F.P.; van Gaal, S.; Lamme, V.A.F.; Dehaene, S.

    2011-01-01

    Human decisions are based on accumulating evidence over time for different options. Here we ask a simple question: How is the accumulation of evidence affected by the level of awareness of the information? We examined the influence of awareness on decision-making using combined behavioral methods

  19. Data privacy foundations, new developments and the big data challenge

    CERN Document Server

    Torra, Vicenç

    2017-01-01

    This book offers a broad, cohesive overview of the field of data privacy. It discusses, from a technological perspective, the problems and solutions of the three main communities working on data privacy: statistical disclosure control (those with a statistical background), privacy-preserving data mining (those working with databases and data mining), and privacy-enhancing technologies (those involved in communications and security). Presenting different approaches, the book describes alternative privacy models and disclosure risk measures as well as data protection procedures for respondent, holder and user privacy. It also discusses specific data privacy problems and solutions for readers who need to deal with big data.

  20. Influence of biases in numerical magnitude allocation on human prosocial decision making.

    Science.gov (United States)

    Arshad, Qadeer; Nigmatullina, Yuliya; Siddiqui, Shuaib; Franka, Mustafa; Mediratta, Saniya; Ramachandaran, Sanjeev; Lobo, Rhannon; Malhotra, Paresh A; Roberts, R E; Bronstein, Adolfo M

    2017-12-01

    Over the past decade neuroscientific research has attempted to probe the neurobiological underpinnings of human prosocial decision making. Such research has almost ubiquitously employed tasks such as the dictator game or similar variations (i.e., ultimatum game). Considering the explicit numerical nature of such tasks, it is surprising that the influence of numerical cognition on decision making during task performance remains unknown. While performing these tasks, participants typically tend to anchor on a 50:50 split that necessitates an explicit numerical judgement (i.e., number-pair bisection). Accordingly, we hypothesize that the decision-making process during the dictator game recruits cognitive processes that overlap with those known to be engaged during number-pair bisection. We observed that biases in numerical magnitude allocation correlated with the formulation of decisions during the dictator game. That is, intrinsic biases toward smaller numerical magnitudes were associated with the formulation of less favorable decisions, whereas biases toward larger magnitudes were associated with more favorable choices. We proceeded to corroborate this relationship by subliminally and systematically inducing biases in numerical magnitude toward either higher or lower numbers using a visuo-vestibular stimulation paradigm. Such subliminal alterations in numerical magnitude allocation led to proportional and corresponding changes to an individual's decision making during the dictator game. Critically, no relationship was observed between either intrinsic or induced biases in numerical magnitude and decision making when assessed using a nonnumerical-based prosocial questionnaire. Our findings demonstrate numerical influences on decisions formulated during the dictator game and highlight the necessity to control for confounds associated with numerical cognition in human decision-making paradigms. NEW & NOTEWORTHY We demonstrate that intrinsic biases in numerical magnitude

  1. 77 FR 32111 - Privacy Act System of Records

    Science.gov (United States)

    2012-05-31

    ... contacted in order to obtain that office's advice regarding obligations under the Privacy Act; 8. Breach... FEDERAL COMMUNICATIONS COMMISSION Privacy Act System of Records AGENCY: Federal Communications Commission. ACTION: Notice; one new Privacy Act system of records. SUMMARY: Pursuant to subsection (e)(4) of...

  2. CARAVAN: Providing Location Privacy for VANET

    National Research Council Canada - National Science Library

    Sampigethaya, Krishna; Huang, Leping; Li, Mingyan; Poovendran, Radha; Matsuura, Kanta; Sezaki, Kaoru

    2005-01-01

    .... This type of tracking leads to threats on the location privacy of the vehicle's user. In this paper, we study the problem of providing location privacy in VANET by allowing vehicles to prevent tracking of their broadcast communications...

  3. Moving beyond the special rapporteur on privacy with the establishment of a new, specialised United Nations Agency : Addressing the deficit in global cooperation for the protection of data privacy

    NARCIS (Netherlands)

    de Hert, Paul; Papakonstantinou, Vagelis; Jerker Svantesson, Dan; Kloza, Dariusz

    2017-01-01

    In July 2015, the UN Human Rights Council appointed Professor Joseph Cannataci as its first-ever Special Rapporteur on the right to privacy. His mandate is, among others, to gather information, identify obstacles, take part in global initiatives and raise awareness. In order to address this global

  4. The Human Factor: Behavioral and Neural Correlates of Humanized Perception in Moral Decision Making

    OpenAIRE

    Majdandžić, Jasminka; Bauer, Herbert; Windischberger, Christian; Moser, Ewald; Engl, Elisabeth; Lamm, Claus

    2012-01-01

    The extent to which people regard others as full-blown individuals with mental states ("humanization") seems crucial for their prosocial motivation towards them. Previous research has shown that decisions about moral dilemmas in which one person can be sacrificed to save multiple others do not consistently follow utilitarian principles. We hypothesized that this behavior can be explained by the potential victim's perceived humanness and an ensuing increase in vicarious emotions and emotional ...

  5. 76 FR 4436 - Privacy Act of 1974; Report of Modified or Altered System of Records

    Science.gov (United States)

    2011-01-25

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Disease Control and Prevention Privacy Act of... following Breach Response Routine Use Language to comply with the Office of Management and Budget...

  6. Privacy and Open Government

    Directory of Open Access Journals (Sweden)

    Teresa Scassa

    2014-06-01

    Full Text Available The public-oriented goals of the open government movement promise increased transparency and accountability of governments, enhanced citizen engagement and participation, improved service delivery, economic development and the stimulation of innovation. In part, these goals are to be achieved by making more and more government information public in reusable formats and under open licences. This paper identifies three broad privacy challenges raised by open government. The first is how to balance privacy with transparency and accountability in the context of “public” personal information. The second challenge flows from the disruption of traditional approaches to privacy based on a collapse of the distinctions between public and private sector actors. The third challenge is that of the potential for open government data—even if anonymized—to contribute to the big data environment in which citizens and their activities are increasingly monitored and profiled.

  7. The Role of Intuition in Risk/Benefit Decision-Making in Human Subjects Research.

    Science.gov (United States)

    Resnik, David B

    2017-01-01

    One of the key principles of ethical research involving human subjects is that the risks of research to subjects should be acceptable in relation to expected benefits. Institutional review board (IRB) members often rely on intuition to make risk/benefit decisions concerning proposed human studies. Some have objected to using intuition to make these decisions because intuition is unreliable and biased and lacks transparency. In this article, I examine the role of intuition in IRB risk/benefit decision-making and argue that there are practical and philosophical limits to our ability to reduce our reliance on intuition in this process. The fact that IRB risk/benefit decision-making involves intuition need not imply that it is hopelessly subjective or biased, however, since there are strategies that IRBs can employ to improve their decisions, such as using empirical data to estimate the probability of potential harms and benefits, developing classification systems to guide the evaluation of harms and benefits, and engaging in moral reasoning concerning the acceptability of risks.

  8. 5G Visions of User Privacy

    DEFF Research Database (Denmark)

    Sørensen, Lene Tolstrup; Khajuria, Samant; Skouby, Knud Erik

    2015-01-01

    Discussions are currently ongoing about the elements and definition of 5G networks. One of the elements in this discussion is how to provide user controlled privacy for securing users' digital interaction. The purpose of this paper is to present elements of user controlled privacy needed...... for the future 5G networks. The paper concludes that an ecosystem consisting of a Trusted Third Party between the end user and the service providers, realised as a distributed system, could be integrated to secure user controlled privacy for future systems...

  9. Mum's the Word: Feds Are Serious About Protecting Patients' Privacy.

    Science.gov (United States)

    Conde, Crystal

    2010-08-01

    The Health Information Technology for Economic and Clinical Health (HITECH) Act significantly changes HIPAA privacy and security policies that affect physicians. Chief among the changes are the new breach notification regulations, developed by the U.S. Department of Health and Human Services Office for Civil Rights. The Texas Medical Association has developed resources to help physicians comply with the new HIPAA regulations.

  10. 36 CFR 902.56 - Protection of personal privacy.

    Science.gov (United States)

    2010-07-01

    ... privacy. 902.56 Section 902.56 Parks, Forests, and Public Property PENNSYLVANIA AVENUE DEVELOPMENT... Protection of personal privacy. (a) Any of the following personnel, medical, or similar records is within the... invasion of his personal privacy: (1) Personnel and background records personal to any officer or employee...

  11. Privacy-Preserving and Scalable Service Recommendation Based on SimHash in a Distributed Cloud Environment

    Directory of Open Access Journals (Sweden)

    Yanwei Xu

    2017-01-01

    Full Text Available With the increasing volume of web services in the cloud environment, Collaborative Filtering (CF) based service recommendation has become one of the most effective techniques to alleviate the heavy burden on the service selection decisions of a target user. However, the service recommendation bases, that is, historical service usage data, are often distributed in different cloud platforms. Two challenges are present in such a cross-cloud service recommendation scenario. First, a cloud platform is often not willing to share its data with other cloud platforms due to privacy concerns, which decreases the feasibility of cross-cloud service recommendation severely. Second, the historical service usage data recorded in each cloud platform may update over time, which reduces the recommendation scalability significantly. In view of these two challenges, a novel privacy-preserving and scalable service recommendation approach based on SimHash, named SerRecSimHash, is proposed in this paper. Finally, through a set of experiments deployed on a real distributed service quality dataset WS-DREAM, we validate the feasibility of our proposal in terms of recommendation accuracy and efficiency while guaranteeing privacy-preservation.
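    The record above hinges on exchanging compact SimHash fingerprints instead of raw usage records. Below is a minimal, generic SimHash sketch in Python; it is not the SerRecSimHash algorithm from the paper, and the feature names, weights, and 64-bit size are illustrative assumptions. Two platforms could exchange only such fingerprints and compare users by Hamming distance.

import hashlib

def simhash(features, bits=64):
    # Generic SimHash: fold per-feature hashes into one weighted fingerprint.
    acc = [0] * bits
    for feature, weight in features.items():
        h = int(hashlib.md5(feature.encode("utf-8")).hexdigest(), 16)
        for i in range(bits):
            acc[i] += weight if (h >> i) & 1 else -weight
    return sum(1 << i for i in range(bits) if acc[i] > 0)

def hamming(a, b):
    return bin(a ^ b).count("1")

# Hypothetical per-user service-quality observations (service id -> rating).
user_a = {"svc1": 4.5, "svc2": 1.0, "svc3": 3.0}
user_b = {"svc1": 4.0, "svc2": 1.5, "svc3": 3.5}

# Only the 64-bit fingerprints would need to cross cloud-platform boundaries.
print(hamming(simhash(user_a), simhash(user_b)))

    Similar usage profiles give small Hamming distances, which is what makes neighbour search possible without revealing the underlying quality records.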

  12. Visual privacy by context: proposal and evaluation of a level-based visualisation scheme.

    Science.gov (United States)

    Padilla-López, José Ramón; Chaaraoui, Alexandros Andre; Gu, Feng; Flórez-Revuelta, Francisco

    2015-06-04

    Privacy in image and video data has become an important subject since cameras are being installed in an increasing number of public and private spaces. Specifically, in assisted living, intelligent monitoring based on computer vision can allow one to provide risk detection and support services that increase people's autonomy at home. In the present work, a level-based visualisation scheme is proposed to provide visual privacy when human intervention is necessary, such as at telerehabilitation and safety assessment applications. Visualisation levels are dynamically selected based on the previously modelled context. In this way, different levels of protection can be provided, maintaining the necessary intelligibility required for the applications. Furthermore, a case study of a living room, where a top-view camera is installed, is presented. Finally, the performed survey-based evaluation indicates the degree of protection provided by the different visualisation models, as well as the personal privacy preferences and valuations of the users.
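    As a rough illustration of the idea of context-driven visualisation levels, the sketch below selects a protection level from a context label and degrades a NumPy "frame" accordingly. The level names, the context-to-level mapping, and the pixelation routine are assumptions for illustration, not the scheme evaluated in the paper.

import numpy as np

def pixelate(img, block):
    # Coarsen the image by averaging over block x block tiles.
    out = img.copy()
    h, w = img.shape
    for y in range(0, h, block):
        for x in range(0, w, block):
            out[y:y + block, x:x + block] = img[y:y + block, x:x + block].mean()
    return out

def render(img, context):
    # Pick a visualisation level from the (previously modelled) context.
    level = {"emergency": "raw",
             "telerehabilitation": "coarse",
             "routine monitoring": "silhouette"}.get(context, "silhouette")
    if level == "raw":
        return img
    if level == "coarse":
        return pixelate(img, block=8)
    return (img > img.mean()).astype(img.dtype) * 255  # crude silhouette

frame = np.random.randint(0, 256, (64, 64)).astype(float)
protected = render(frame, "routine monitoring")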

  13. Theoretical foundations of human decision-making in agent-based land use models – A review

    NARCIS (Netherlands)

    Groeneveld, Geert J.; Müller, B.; Buchmann, C.M.; Dressler, Gunnar; Guo, C.; Hase, N.; Hoffmann, F.; John, F.; Klassert, C.; Lauf, T.; Liebelt, V.; Nolzen, H.; Pannicke, N.; Schulze, J.; Weise, H.; Schwarz, N.

    2017-01-01

    Recent reviews stated that the complex and context-dependent nature of human decision-making resulted in ad-hoc representations of human decision-making in agent-based land use change models (LUCC ABMs) and that these representations are often not explicitly grounded in theory. However, a systematic survey

  14. 76 FR 51869 - Privacy Act Implementation

    Science.gov (United States)

    2011-08-19

    ... permanent residence. Maintain includes collect, use, disseminate, or control. Privacy Act means the Privacy... announces the creation, deletion, or amendment of one or more system of records. System of records notices... reference and university libraries or electronically at the...

  15. Measuring privacy compliance using fitness metrics

    NARCIS (Netherlands)

    Banescu, S.; Petkovic, M.; Zannone, N.; Barros, A.; Gal, A.; Kindler, E.

    2012-01-01

    Nowadays, repurposing of personal data is a major privacy issue. Detection of data repurposing requires a posteriori mechanisms able to determine how data have been processed. However, current a posteriori solutions for privacy compliance are often manual, allowing infringements to remain undetected.

  16. Defining Privacy Is Supposed to Be Easy

    DEFF Research Database (Denmark)

    Mödersheim, Sebastian Alexander; Gross, Thomas; Viganò, Luca

    2013-01-01

    Formally specifying privacy goals is not trivial. The most widely used approach in formal methods is based on the static equivalence of frames in the applied pi-calculus, basically asking whether or not the intruder is able to distinguish two given worlds. A subtle question is how we can be sure...... that we have specified all pairs of worlds to properly reflect our intuitive privacy goal. To address this problem, we introduce in this paper a novel and declarative way to specify privacy goals, called α-β privacy, and relate it to static equivalence. This new approach is based on specifying two...... formulae α and β in first-order logic with Herbrand universes, where α reflects the intentionally released information and β includes the actual cryptographic (“technical”) messages the intruder can see. Then α-β privacy means that the intruder cannot derive any “non-technical” statement from β that he...

  17. 77 FR 61275 - Privacy Act of 1974: Implementation

    Science.gov (United States)

    2012-10-09

    ... (FBI) Privacy Act system of records titled FBI Data Warehouse System, JUSTICE/FBI- 022. This system is...)(G), (H), and (I), (5), and (8); (f); and (g) of the Privacy Act: (1) FBI Data Warehouse System... security; disclose information that would constitute an unwarranted invasion of another's personal privacy...

  18. 22 CFR 212.22 - Protection of personal privacy.

    Science.gov (United States)

    2010-04-01

    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Protection of personal privacy. 212.22 Section... Information for Public Inspection and Copying § 212.22 Protection of personal privacy. To the extent required to prevent a clearly unwarranted invasion of personal privacy, USAID may delete identifying details...

  19. Modelling information dissemination under privacy concerns in social media

    Science.gov (United States)

    Zhu, Hui; Huang, Cheng; Lu, Rongxing; Li, Hui

    2016-05-01

    Social media has recently become an important platform for users to share news, express views, and post messages. However, due to user privacy preservation in social media, many privacy setting tools are employed, which inevitably change the patterns and dynamics of information dissemination. In this study, a general stochastic model using dynamic evolution equations was introduced to illustrate how privacy concerns impact the process of information dissemination. Extensive simulations and analyses involving the privacy settings of general users, privileged users, and pure observers were conducted on real-world networks, and the results demonstrated that user privacy settings affect information differently. Finally, we also studied the process of information diffusion analytically and numerically with different privacy settings using two classic networks.
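    A toy way to see how privacy settings reshape dissemination is to run a simple cascade on a random graph in which "strict" users never re-share what they receive. This is only an illustrative simulation with made-up parameters, not the stochastic evolution equations of the study.

import random

random.seed(1)
n, p_edge, p_forward = 200, 0.03, 0.6
# Users with strict privacy settings receive content but never re-share it.
strict = {i for i in range(n) if random.random() < 0.4}
adj = {i: [j for j in range(n) if j != i and random.random() < p_edge]
       for i in range(n)}

def cascade(seed):
    seen, frontier = {seed}, [seed]
    while frontier:
        nxt = []
        for u in frontier:
            if u in strict:          # privacy setting blocks forwarding
                continue
            for v in adj[u]:
                if v not in seen and random.random() < p_forward:
                    seen.add(v)
                    nxt.append(v)
        frontier = nxt
    return len(seen)

print("users reached:", cascade(seed=0))

    Raising the share of strict users shrinks the cascade, mirroring the qualitative effect described above.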

  20. Privacy Preservation in Distributed Subgradient Optimization Algorithms

    OpenAIRE

    Lou, Youcheng; Yu, Lean; Wang, Shouyang

    2015-01-01

    Privacy preservation is becoming an increasingly important issue in data mining and machine learning. In this paper, we consider the privacy preserving features of distributed subgradient optimization algorithms. We first show that a well-known distributed subgradient synchronous optimization algorithm, in which all agents make their optimization updates simultaneously at all times, is not privacy preserving in the sense that the malicious agent can learn other agents' subgradients asymptotic...

  1. Certificate Transparency with Privacy

    Directory of Open Access Journals (Sweden)

    Eskandarian Saba

    2017-10-01

    Full Text Available Certificate transparency (CT) is an elegant mechanism designed to detect when a certificate authority (CA) has issued a certificate incorrectly. Many CAs now support CT and it is being actively deployed in browsers. However, a number of privacy-related challenges remain. In this paper we propose practical solutions to two issues. First, we develop a mechanism that enables web browsers to audit a CT log without violating user privacy. Second, we extend CT to support non-public subdomains.

  2. 37 CFR 251.23 - FOIA and Privacy Act.

    Science.gov (United States)

    2010-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2010-07-01 2010-07-01 false FOIA and Privacy Act. 251.23 Section 251.23 Patents, Trademarks, and Copyrights COPYRIGHT OFFICE, LIBRARY OF CONGRESS COPYRIGHT... Access to and Inspection of Records § 251.23 FOIA and Privacy Act. Freedom of Information Act and Privacy...

  3. 32 CFR 806b.4 - Privacy Act complaints.

    Science.gov (United States)

    2010-07-01

    ... be identified, the local Privacy Act officer will assume these duties. Issues that cannot be resolved... 32 National Defense 6 2010-07-01 2010-07-01 false Privacy Act complaints. 806b.4 Section 806b.4 National Defense Department of Defense (Continued) DEPARTMENT OF THE AIR FORCE ADMINISTRATION PRIVACY ACT...

  4. Privacy in the Sharing Economy

    DEFF Research Database (Denmark)

    Ranzini, Giulia; Etter, Michael; Lutz, Christoph

    ’s digital services through providing recommendations to Europe’s institutions. The initial stage of this research project involves a set of three literature reviews of the state of research on three core topics in relation to the sharing economy: participation (1), privacy (2), and power (3). This piece...... is a literature review on the topic of privacy. It addresses key privacy challenges for different stakeholders in the sharing economy. Throughout, we use the term "consumers" to refer to users on the receiving end (e.g., Airbnb guests, Uber passengers), "providers" to refer to users on the providing end (e.......g., Airbnb hosts, Uber drivers) and "platforms" to refer to the mediating sites, apps and infrastructures matching consumers and providers (e.g., Airbnb, Uber)....

  5. AnonySense: Opportunistic and Privacy-Preserving Context Collection

    DEFF Research Database (Denmark)

    Triandopoulos, Nikolaos; Kapadia, Apu; Cornelius, Cory

    2008-01-01

    on tessellation and clustering to protect users' privacy against the system while reporting context, and k-anonymous report aggregation to improve the users' privacy against applications receiving the context. We outline the architecture and security properties of AnonySense, and focus on evaluating our...... We propose AnonySense, a general-purpose architecture for leveraging users' mobile devices for measuring context, while maintaining the privacy of the users. AnonySense features multiple layers of privacy protection: a framework for nodes to receive tasks anonymously, a novel blurring mechanism based...
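    The k-anonymous report aggregation mentioned above can be illustrated in a few lines of Python: a location report is only released as part of a per-cell aggregate once at least k devices have contributed to that cell. The cell size and threshold below are placeholder values, not AnonySense's actual tessellation.

from collections import defaultdict

K = 5          # minimum number of contributing devices per released cell
CELL = 0.01    # degrees; a crude stand-in for a tessellation tile

def cell(lat, lon):
    return (round(lat / CELL), round(lon / CELL))

def aggregate(reports):
    # Release a per-cell average only when at least K devices contributed.
    buckets = defaultdict(list)
    for lat, lon, value in reports:
        buckets[cell(lat, lon)].append(value)
    return {c: sum(v) / len(v) for c, v in buckets.items() if len(v) >= K}

reports = [(48.8566 + i * 1e-4, 2.3522, 20 + i % 3) for i in range(12)]
print(aggregate(reports))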

  6. Privacy Data Decomposition and Discretization Method for SaaS Services

    Directory of Open Access Journals (Sweden)

    Changbo Ke

    2017-01-01

    Full Text Available In cloud computing, user functional requirements are satisfied through service composition. However, due to the process of interaction and sharing among SaaS services, user privacy data tends to be illegally disclosed to the service participants. In this paper, we propose a privacy data decomposition and discretization method for SaaS services. First, according to the logical relationships between the data, we classify the privacy data into discrete privacy data and continuous privacy data. Next, in order to protect the user privacy information, continuous data chains are decomposed into discrete data chains, and discrete data chains are prevented from being synthesized into continuous data chains. Finally, we propose a protection framework for privacy data and demonstrate its correctness and feasibility with experiments.

  7. 75 FR 17937 - Privacy Act of 1974; Deletion of an Existing System of Records

    Science.gov (United States)

    2010-04-08

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Health Resources and Services Administration Privacy Act... Clinician Recruitment and Service (BCRS), Health Resources and Services Administration (HRSA), 5600 Fishers... Administrator, Bureau of Clinician Recruitment and Service (BCRS), Health Resources and Services Administration...

  8. Privacy preserving surveillance and the tracking-paradox

    OpenAIRE

    Greiner, S.; Birnstill, Pascal; Krempel, Erik; Beckert, B.; Beyerer, Jürgen

    2013-01-01

    Increasing capabilities of intelligent video surveillance systems impose new threats to privacy while, at the same time, offering opportunities for reducing the privacy invasiveness of surveillance measures as well as their selectivity. We show that aggregating more data about observed people does not necessarily lead to less privacy, but can increase the selectivity of surveillance measures. In case of video surveillance in a company environment, if we enable the system to authenticate emplo...

  9. 76 FR 30952 - Published Privacy Impact Assessments on the Web

    Science.gov (United States)

    2011-05-27

    ... DEPARTMENT OF HOMELAND SECURITY Office of the Secretary Published Privacy Impact Assessments on... the Department. These assessments were approved and published on the Privacy Office's web site between..., 2011 and March 31, 2011, the Chief Privacy Officer of the DHS approved and published sixteen Privacy...

  10. 76 FR 58814 - Published Privacy Impact Assessments on the Web

    Science.gov (United States)

    2011-09-22

    ... DEPARTMENT OF HOMELAND SECURITY Office of the Secretary Published Privacy Impact Assessments on... DHS. These assessments were approved and published on the Privacy Office's Web site between June 1... 31, 2011, the Chief Privacy Officer of the DHS approved and published twenty-six Privacy Impact...

  11. 76 FR 78934 - Published Privacy Impact Assessments on the Web

    Science.gov (United States)

    2011-12-20

    ... DEPARTMENT OF HOMELAND SECURITY Office of the Secretary Published Privacy Impact Assessments on.... These assessments were approved and published on the Privacy Office's web site between September 1, 2011... November 30, 2011, the Chief Privacy Officer of the DHS approved and published seven Privacy Impact...

  12. 76 FR 37823 - Published Privacy Impact Assessments on the Web

    Science.gov (United States)

    2011-06-28

    ... DEPARTMENT OF HOMELAND SECURITY Office of the Secretary Published Privacy Impact Assessments on... Department. These assessments were approved and published on the Privacy Office's Web site between March 31... 31, 2011, the Chief Privacy Officer of the DHS approved and published ten Privacy Impact Assessments...

  13. Privacy vs. Reward in Indoor Location-Based Services

    Directory of Open Access Journals (Sweden)

    Fawaz Kassem

    2016-10-01

    Full Text Available With the advance of indoor localization technology, indoor location-based services (ILBS) are gaining popularity. They are, however, accompanied by privacy concerns. ILBS providers track the users’ mobility to learn more about their behavior, and then provide them with improved and personalized services. Our survey of 200 individuals highlighted their concerns about this tracking for potential leakage of their personal/private traits, but also showed their willingness to accept reduced tracking for improved service. In this paper, we propose PR-LBS (Privacy vs. Reward for Location-Based Service), a system that addresses these seemingly conflicting requirements by balancing the users’ privacy concerns and the benefits of sharing location information in indoor location tracking environments. PR-LBS relies on a novel location-privacy criterion to quantify the privacy risks pertaining to sharing indoor location information. It also employs a repeated play model to ensure that the received service is proportionate to the privacy risk. We implement and evaluate PR-LBS extensively with various real-world user mobility traces. Results show that PR-LBS has low overhead, protects the users’ privacy, and makes a good tradeoff between the quality of service for the users and the utility of shared location data for service providers.

  14. 32 CFR 701.109 - Privacy Act (PA) appeals.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 5 2010-07-01 2010-07-01 false Privacy Act (PA) appeals. 701.109 Section 701... OF THE NAVY DOCUMENTS AFFECTING THE PUBLIC DON Privacy Program § 701.109 Privacy Act (PA) appeals. (a... commence when the appeal reaches the office of the review authority having jurisdiction over the record...

  15. Privacy Information Security Classification for Internet of Things Based on Internet Data

    OpenAIRE

    Lu, Xiaofeng; Qu, Zhaowei; Li, Qi; Hui, Pan

    2015-01-01

    Many privacy protection technologies have been proposed, but most of them operate independently and aim at protecting some specific type of privacy. The attributes of privacy have hardly been studied in depth. To minimize the damage and influence of privacy disclosure, the important and sensitive privacy should be preserved as a priority if not all privacy pieces can be preserved. This paper focuses on studying the attributes of privacy and proposes privacy information security classification (P...

  16. Differential privacy in intelligent transportation systems

    NARCIS (Netherlands)

    Kargl, Frank; Friedman, Arik; Boreli, Roksana

    2013-01-01

    In this paper, we investigate how the concept of differential privacy can be applied to Intelligent Transportation Systems (ITS), focusing on protection of Floating Car Data (FCD) stored and processed in central Traffic Data Centers (TDC). We illustrate an integration of differential privacy with
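    The standard building block for such an integration is the Laplace mechanism; the sketch below adds calibrated noise to a count query over floating car data. It is a generic illustration under assumed parameters, not the paper's actual FCD pipeline, and the query and epsilon value are made up.

import random

def laplace_count(true_count, epsilon):
    # Counting queries have sensitivity 1, so noise of scale 1/epsilon suffices.
    # The difference of two exponentials with rate epsilon is Laplace(0, 1/epsilon).
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise

# e.g. "how many probe vehicles crossed this road segment in five minutes"
vehicles_on_segment = 42
print(laplace_count(vehicles_on_segment, epsilon=0.5))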

  17. The privacy paradox : Investigating discrepancies between expressed privacy concerns and actual online behavior - A systematic literature review

    NARCIS (Netherlands)

    Barth, Susanne; de Jong, Menno D.T.

    2017-01-01

    Also known as the privacy paradox, recent research on online behavior has revealed discrepancies between user attitude and their actual behavior. More specifically: While users claim to be very concerned about their privacy, they nevertheless undertake very little to protect their personal data.

  18. Privacy Breach Analysis in Social Networks

    Science.gov (United States)

    Nagle, Frank

    This chapter addresses various aspects of analyzing privacy breaches in social networks. We first review literature that defines three types of privacy breaches in social networks: interactive, active, and passive. We then survey the various network anonymization schemes that have been constructed to address these privacy breaches. After exploring these breaches and anonymization schemes, we evaluate a measure for determining the level of anonymity inherent in a network graph based on its topological structure. Finally, we close by emphasizing the difficulty of anonymizing social network data while maintaining usability for research purposes and offering areas for future work.

  19. Outsourcing medical data analyses: can technology overcome legal, privacy, and confidentiality issues?

    Science.gov (United States)

    Brumen, Bostjan; Heričko, Marjan; Sevčnikar, Andrej; Završnik, Jernej; Hölbl, Marko

    2013-12-16

    Medical data are gold mines for deriving the knowledge that could change the course of a single patient's life or even the health of the entire population. A data analyst needs to have full access to relevant data, but full access may be denied by privacy and confidentiality of medical data legal regulations, especially when the data analyst is not affiliated with the data owner. Our first objective was to analyze the privacy and confidentiality issues and the associated regulations pertaining to medical data, and to identify technologies to properly address these issues. Our second objective was to develop a procedure to protect medical data in such a way that the outsourced analyst would be capable of doing analyses on protected data and the results would be comparable, if not the same, as if they had been done on the original data. Specifically, our hypothesis was there would not be a difference between the outsourced decision trees built on encrypted data and the ones built on original data. Using formal definitions, we developed an algorithm to protect medical data for outsourced analyses. The algorithm was applied to publicly available datasets (N=30) from the medical and life sciences fields. The analyses were performed on the original and the protected datasets and the results of the analyses were compared. Bootstrapped paired t tests for 2 dependent samples were used to test whether the mean differences in size, number of leaves, and the accuracy of the original and the encrypted decision trees were significantly different. The decision trees built on encrypted data were virtually the same as those built on original data. Out of 30 datasets, 100% of the trees had identical accuracy. The size of a tree and the number of leaves was different only once (1/30, 3%, P=.19). The proposed algorithm encrypts a file with plain text medical data into an encrypted file with the data protected in such a way that external data analyses are still possible. The results
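    The central claim above, that trees built on suitably protected data match trees built on the originals, can be illustrated with any strictly increasing transformation of numeric attributes, since split quality depends only on how the values are ordered. The scikit-learn sketch below uses a simple affine map as a stand-in for the paper's encryption procedure; the dataset and the transform are assumptions for illustration.

from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

def protect(A):
    # Strictly increasing transform: value order, and hence every split, is preserved.
    return 3.7 * A + 11.0

plain = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
masked = DecisionTreeClassifier(random_state=0).fit(protect(X_tr), y_tr)

# The two accuracies coincide, mirroring the identical-accuracy finding above.
print(plain.score(X_te, y_te), masked.score(protect(X_te), y_te))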

  20. Evaluating Common Privacy Vulnerabilities in Internet Service Providers

    Science.gov (United States)

    Kotzanikolaou, Panayiotis; Maniatis, Sotirios; Nikolouzou, Eugenia; Stathopoulos, Vassilios

    Privacy in electronic communications receives increased attention in both research and industry forums, stemming from both the users' needs and from legal and regulatory requirements in national or international context. Privacy in internet-based communications heavily relies on the level of security of the Internet Service Providers (ISPs), as well as on the security awareness of the end users. This paper discusses the role of the ISP in the privacy of the communications. Based on real security audits performed in national-wide ISPs, we illustrate privacy-specific threats and vulnerabilities that many providers fail to address when implementing their security policies. We subsequently provide and discuss specific security measures that the ISPs can implement, in order to fine-tune their security policies in the context of privacy protection.

  1. 76 FR 63896 - Federal Acquisition Regulation; Privacy Training, 2010-013

    Science.gov (United States)

    2011-10-14

    ... should a breach occur; and (7) Any agency-specific privacy training requirements. (d) The contractor is... Acquisition Regulation; Privacy Training, 2010-013 AGENCY: Department of Defense (DoD), General Services... contractors to complete training that addresses the protection of privacy, in accordance with the Privacy Act...

  2. Protecting Privacy in Shared Photos via Adversarial Examples Based Stealth

    Directory of Open Access Journals (Sweden)

    Yujia Liu

    2017-01-01

    Full Text Available Online image sharing in social platforms can lead to undesired privacy disclosure. For example, some enterprises may detect these large volumes of uploaded images to do users’ in-depth preference analysis for commercial purposes. And their technology might be today’s most powerful learning model, the deep neural network (DNN). To elude these automatic DNN detectors without affecting the visual quality perceived by human eyes, we design and implement a novel Stealth algorithm, which makes the automatic detector blind to the existence of objects in an image by crafting a kind of adversarial example. It is as if all objects disappear after wearing an “invisible cloak” from the view of the detector. We then evaluate the effectiveness of the Stealth algorithm through our newly defined measurement, named privacy insurance. The results indicate that our scheme has a considerable success rate in guaranteeing privacy compared with other methods, such as mosaic, blur, and noise. Better still, the Stealth algorithm has the smallest impact on image visual quality. Meanwhile, we set a user adjustable parameter called cloak thickness for regulating the perturbation intensity. Furthermore, we find that the processed images have a transferability property; that is, the adversarial images generated for one particular DNN will influence the others as well.
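    The mechanism underneath such schemes is the adversarial example: a small, targeted perturbation that flips an automatic detector while barely changing the image. The sketch below shows the idea against a tiny NumPy logistic "detector" using a fast-gradient-sign step; it is not the Stealth algorithm and not a DNN, and every model and parameter here is an illustrative assumption.

import numpy as np

rng = np.random.default_rng(0)

# Toy "detector": logistic regression trained to flag images containing an object.
w, b = rng.normal(size=784), 0.0
X = rng.normal(size=(200, 784))
y = (X @ w + rng.normal(scale=0.5, size=200) > 0).astype(float)

def score(v):
    return 1.0 / (1.0 + np.exp(-(v @ w + b)))

for _ in range(300):                    # plain gradient descent on logistic loss
    p = score(X)
    w -= 0.1 * X.T @ (p - y) / len(y)
    b -= 0.1 * float(np.mean(p - y))

x = X[y == 1][0]                        # an image the detector currently flags
p0 = score(x)
grad_x = p0 * (1.0 - p0) * w            # gradient of the detection score w.r.t. pixels
x_adv = x - 0.05 * np.sign(grad_x)      # small sign-based nudge: the "invisible cloak"

print(round(float(p0), 3), "->", round(float(score(x_adv)), 3))

    The per-pixel change is bounded by 0.05, yet the detection score collapses, which is the same qualitative effect the record above exploits for privacy.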

  3. Syllabus for Privacy and Information Technology, Fall 2017, UCLA Information Studies

    OpenAIRE

    Borgman, Christine L.

    2017-01-01

    Privacy is a broad topic that covers many disciplines, stakeholders, and concerns. This course addresses the intersection of privacy and information technology, surveying a wide array of topics of concern for research and practice in the information fields. Among the topics covered are the history and changing contexts of privacy; privacy risks and harms; law, policies, and practices; privacy in searching for information, in reading, and in libraries; surveillance, networks, and privacy by de...

  4. Hacking Facebook Privacy and Security

    Science.gov (United States)

    2012-08-28

    Report: Hacking Facebook Privacy and Security. Authors: Omar Galban; Dr. Jeff Duffany (Advisor). Subject terms: Facebook, Privacy, Security, Social Network. Abstract fragment: When people talk about hacking and social networks, they're... transmit personal information that many people dare not do personally. Facebook is a popular social networking...

  5. 77 FR 46100 - Published Privacy Impact Assessments on the Web

    Science.gov (United States)

    2012-08-02

    ... DEPARTMENT OF HOMELAND SECURITY Office of the Secretary Published Privacy Impact Assessments on... published on the Privacy Office's Web site between March 1, 2012 and May 31, 2012. DATES: The PIAs will be... approved and published fifteen Privacy Impact Assessments (PIAs) on the DHS Privacy Office Web site, www...

  6. Designing Privacy-aware Internet of Things Applications

    OpenAIRE

    Perera, Charith; Barhamgi, Mahmoud; Bandara, Arosha K.; Ajmal, Muhammad; Price, Blaine; Nuseibeh, Bashar

    2017-01-01

    Internet of Things (IoT) applications typically collect and analyse personal data that can be used to derive sensitive information about individuals. However, thus far, privacy concerns have not been explicitly considered in software engineering processes when designing IoT applications. In this paper, we explore how a Privacy-by-Design (PbD) framework, formulated as a set of guidelines, can help software engineers to design privacy-aware IoT applications. We studied the utility of our propos...

  7. Toward sensitive document release with privacy guarantees

    OpenAIRE

    David Sánchez; Montserrat Batet

    2017-01-01

    Toward sensitive document release with privacy guarantees. DOI: 10.1016/j.engappai.2016.12.013. URL: http://www.sciencedirect.com/science/article/pii/S0952197616302408. Privacy has become a serious concern for modern Information Societies. The sensitive nature of much of the data that are daily exchanged or released to untrusted parties requires that responsible organizations undertake appropriate privacy protection measures. Nowadays, much...

  8. Frames, Biases, and Rational Decision-Making in the Human Brain

    OpenAIRE

    De Martino, Benedetto; Kumaran, Dharshan; Seymour, Ben; Dolan, Raymond J.

    2006-01-01

    Human choices are remarkably susceptible to the manner in which options are presented. This so-called “framing effect” represents a striking violation of standard economic accounts of human rationality, although its underlying neurobiology is not understood. We found that the framing effect was specifically associated with amygdala activity, suggesting a key role for an emotional system in mediating decision biases. Moreover, across individuals, orbital and medial prefrontal cortex activity p...

  9. Frames, biases, and rational decision-making in the human brain

    OpenAIRE

    De Martino, B.; Kumaran, D.; Seymour, B.; Dolan, R. J.

    2006-01-01

    Human choices are remarkably susceptible to the manner in which options are presented. This so-called "framing effect" represents a striking violation of standard economic accounts of human rationality, although its underlying neurobiology is not understood. We found that the framing effect was specifically associated with amygdala activity, suggesting a key role for an emotional system in mediating decision biases. Moreover, across individuals, orbital and medial prefrontal cortex activity p...

  10. Achieving Optimal Privacy in Trust-Aware Social Recommender Systems

    Science.gov (United States)

    Dokoohaki, Nima; Kaleli, Cihan; Polat, Huseyin; Matskin, Mihhail

    Collaborative filtering (CF) recommenders are subject to numerous shortcomings such as centralized processing, vulnerability to shilling attacks, and, most important of all, privacy. To overcome these obstacles, researchers have proposed utilizing interpersonal trust between users to alleviate many of these crucial shortcomings. Until now, attention has mainly been paid to the strong points of trust-aware recommenders, such as alleviating profile sparsity or calculation cost efficiency, while the least attention has been paid to investigating the notion of privacy surrounding the disclosure of individual ratings and, most importantly, the protection of trust computation across the social networks forming the backbone of these systems. To contribute to addressing the problem of privacy in trust-aware recommenders, in this paper we first introduce a framework for enabling privacy-preserving trust-aware recommendation generation. While the trust mechanism aims at elevating the recommender's accuracy, to preserve privacy the accuracy of the system needs to be decreased. Since, within this context, privacy and accuracy are conflicting goals, we show that a Pareto set can be found as an optimal setting for both privacy-preserving and trust-enabling mechanisms. We show that this Pareto set, when used as the configuration for measuring the accuracy of the base collaborative filtering engine, yields an optimized tradeoff between the conflicting goals of privacy and accuracy. We prove this concept along with the applicability of our framework by experimenting with accuracy and privacy factors, and we show through experiment how such an optimal set can be inferred.
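    The Pareto set mentioned above can be extracted mechanically once each candidate configuration has been scored on both axes. A minimal sketch, assuming made-up (privacy, accuracy) scores where higher is better on both:

def pareto_front(points):
    # Keep configurations that no other configuration dominates on both axes.
    front = []
    for p in points:
        dominated = any(q[0] >= p[0] and q[1] >= p[1] and q != p for q in points)
        if not dominated:
            front.append(p)
    return front

# Hypothetical (privacy, accuracy) scores for candidate parameter settings.
candidates = [(0.9, 0.61), (0.7, 0.74), (0.6, 0.70), (0.5, 0.80), (0.3, 0.82)]
print(pareto_front(candidates))   # (0.6, 0.70) drops out; the rest form the front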

  11. The study on privacy preserving data mining for information security

    Science.gov (United States)

    Li, Xiaohui

    2012-04-01

    Privacy preserving data mining has developed rapidly in just a few years, but it still faces many challenges. Firstly, the level of privacy is defined differently in different fields, so the degree to which privacy preserving data mining technologies protect private information is not measured in the same way; presenting a unified privacy definition and measure is therefore an urgent issue. Secondly, most research in privacy preserving data mining is presently confined to theoretical study.

  12. 32 CFR 701.119 - Privacy and the web.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 5 2010-07-01 2010-07-01 false Privacy and the web. 701.119 Section 701.119... THE NAVY DOCUMENTS AFFECTING THE PUBLIC DON Privacy Program § 701.119 Privacy and the web. DON activities shall consult SECNAVINST 5720.47B for guidance on what may be posted on a Navy Web site. ...

  13. Incentivizing Verifiable Privacy-Protection Mechanisms for Offline Crowdsensing Applications.

    Science.gov (United States)

    Sun, Jiajun; Liu, Ningzhong

    2017-09-04

    Incentive mechanisms for crowdsensing have recently been intensively explored. Most of these mechanisms mainly focus on standard economic goals like truthfulness and utility maximization. However, enormous privacy and security challenges need to be faced directly in real-life environments, such as cost privacy. In this paper, we investigate offline verifiable privacy-protection crowdsensing issues. We first present a general verifiable privacy-protection incentive mechanism for the offline homogeneous and heterogeneous sensing job model. In addition, we also propose a more complex verifiable privacy-protection incentive mechanism for the offline submodular sensing job model. The two mechanisms not only explore the privacy protection issues of users and the platform, but also ensure the verifiable correctness of payments between the platform and users. Finally, we demonstrate that the two mechanisms satisfy privacy-protection, verifiable correctness of payments and the same revenue as the generic one without privacy protection. Our experiments also validate that the two mechanisms are both scalable and efficient, and applicable for mobile devices in crowdsensing applications based on auctions, where the main incentive for the user is the remuneration.

  14. Unveiling consumer's privacy paradox behaviour in an economic exchange.

    Science.gov (United States)

    Motiwalla, Luvai F; Li, Xiao-Bai

    2016-01-01

    Privacy paradox is of great interest to IS researchers and firms gathering personal information. It has been studied from social, behavioural, and economic perspectives independently. However, prior research has not examined the degrees of influence these perspectives contribute to the privacy paradox problem. We combine both economic and behavioural perspectives in our study of the privacy paradox with a price valuation of personal information through an economic experiment combined with a behavioural study on privacy paradox. Our goal is to reveal more insights on the privacy paradox through economic valuation on personal information. Results indicate that general privacy concerns or individual disclosure concerns do not have a significant influence on the price valuation of personal information. Instead, prior disclosure behaviour in specific scenario, like with healthcare providers or social networks, is a better indicator of consumer price valuations.

  15. Privacy and confidentiality in pragmatic clinical trials.

    Science.gov (United States)

    McGraw, Deven; Greene, Sarah M; Miner, Caroline S; Staman, Karen L; Welch, Mary Jane; Rubel, Alan

    2015-10-01

    With pragmatic clinical trials, an opportunity exists to answer important questions about the relative risks, burdens, and benefits of therapeutic interventions. However, concerns about protecting the privacy of this information are significant and must be balanced with the imperative to learn from the data gathered in routine clinical practice. Traditional privacy protections for research uses of identifiable information rely disproportionately on informed consent or authorizations, based on a presumption that this is necessary to fulfill ethical principles of respect for persons. But frequently, the ideal of informed consent is not realized in its implementation. Moreover, the principle of respect for persons—which encompasses their interests in health information privacy—can be honored through other mechanisms. Data anonymization also plays a role in protecting privacy but is not suitable for all research, particularly pragmatic clinical trials. In this article, we explore both the ethical foundation and regulatory framework intended to protect privacy in pragmatic clinical trials. We then review examples of novel approaches to respecting persons in research that may have the added benefit of honoring patient privacy considerations. © The Author(s) 2015.

  16. Fourteen Reasons Privacy Matters: A Multidisciplinary Review of Scholarly Literature

    Science.gov (United States)

    Magi, Trina J.

    2011-01-01

    Librarians have long recognized the importance of privacy to intellectual freedom. As digital technology and its applications advance, however, efforts to protect privacy may become increasingly difficult. With some users behaving in ways that suggest they do not care about privacy and with powerful voices claiming that privacy is dead, librarians…

  17. Who should make the decision on the use of GPS for people with dementia?

    Science.gov (United States)

    Landau, Ruth; Auslander, Gail K; Werner, Shirli; Shoval, Noam; Heinik, Jeremia

    2011-01-01

    In recent years advanced technologies, such as Global Positioning Systems (GPS), allow for tracking of human spatial activity and provide the ability to intervene to manage that activity. The purpose of this study is to examine the issue of who should decide about the use of electronic tracking using GPS for people with dementia. Based on quantitative data collected from 296 participants comprising cognitively intact elderly, family caregivers of people with dementia, social workers, other professionals, and social work students, study participants were asked to rate nine different potential decision-makers to make this decision. The results show that figures inside the family, particularly the spouse or the most involved family caregiver, were perceived more important in the decision-making process than figures outside the family, whereas the person with dementia was ranked third in the order of the figures. Since the decision to use GPS for tracking raises the ethical dilemma of personal safety versus autonomy and privacy of people with dementia, the findings seem to indicate that the reluctance of professional caregivers to assist family caregivers to make this decision is experienced as frustrating. The findings imply that in order to reach a balance between the wishes and interests of both people with dementia and their family caregivers, there is a need for more active involvement of the professional caregivers to facilitate the family decision-making process.

  18. Understanding Engagement with the Privacy Domain Through Design Research.

    OpenAIRE

    Vasalou, A.; Oostveen, A.; Bowers, Christopher; Beale, R.

    2015-01-01

    This paper reports findings from participatory design research aimed at uncovering how technological interventions can engage users in the domain of privacy. Our work was undertaken in the context of a new design concept “Privacy Trends” whose aspiration is to foster technology users’ digital literacy regarding ongoing privacy risks and elucidate how such risks fit within existing social, organizational and political systems, leading to a longer term privacy concern. Our study reveals two cha...

  19. Privacy Preserving Mapping Schemes Supporting Comparison

    NARCIS (Netherlands)

    Tang, Qiang

    2010-01-01

    To cater to the privacy requirements in cloud computing, we introduce a new primitive, namely Privacy Preserving Mapping (PPM) schemes supporting comparison. An PPM scheme enables a user to map data items into images in such a way that, with a set of images, any entity can determine the <, =, >
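    One concrete way to realise a comparison-supporting mapping is a keyed, strictly increasing function: whoever holds only the images can still rank and test equality, but recovering the plain values requires the key. The sketch below is an illustration of that general idea under assumed names and a toy construction, not the PPM scheme of the paper, and it ignores the scheme's security analysis.

import hashlib

SECRET = b"user-key"   # hypothetical per-user secret

def image(value):
    # Map a small non-negative integer to a strictly increasing keyed "image".
    total = 0
    for i in range(value + 1):
        h = hashlib.sha256(SECRET + i.to_bytes(4, "big")).digest()
        total += 1 + h[0]          # positive pseudorandom gap per step
    return total

a, b, c = image(17), image(42), image(42)
print(a < b, b == c)               # <, =, > relations survive the mapping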

  20. Privacy een grondrecht, maar ook handelswaar

    NARCIS (Netherlands)

    Olsthoorn, P.

    2015-01-01

    Journalist Brenno de Winter delivers scathing commentary on the speakers on privacy at the NLIGF congress 2015. He takes Bart Schermer, an adviser to companies and organisations, to task. Schermer had just argued that privacy must not become a yoke ('a chore') but should be the focus of innovation by

  1. User Privacy and Empowerment: Trends, Challenges, and Opportunities

    DEFF Research Database (Denmark)

    Dhotre, Prashant Shantaram; Olesen, Henning; Khajuria, Samant

    2018-01-01

    to the service providers. Considering business models that are slanted towards service providers, privacy has become a crucial issue in today’s fast growing digital world. Hence, this paper elaborates personal information flow between users, service providers, and data brokers. We also discussed the significant...... privacy issues like present business models, user awareness about privacy and user control over personal data. To address such issues, this paper also identified challenges that comprise unavailability of effective privacy awareness or protection tools and the effortless way to study and see the flow...... of personal information and its management. Thus, empowering users and enhancing awareness are essential to comprehending the value of secrecy. This paper also introduced latest advances in the domain of privacy issues like User Managed Access (UMA) can state suitable requirements for user empowerment...

  2. Overview of Privacy in Social Networking Sites (SNS)

    Science.gov (United States)

    Powale, Pallavi I.; Bhutkar, Ganesh D.

    2013-07-01

    Social Networking Sites (SNS) have become an integral part of communication and lifestyle in today's world. Because of the wide range of services offered by SNSs, mostly free of cost, these sites are attracting the attention of all possible Internet users. Most importantly, users from all age groups have become members of SNSs. Since many users are not aware of the data thefts associated with information sharing, they freely share their personal information with SNSs. Therefore, SNSs may be used for investigating users' character and social habits by familiar or even unknown persons and agencies. Such a commercial and social scenario has led to a number of privacy and security threats. Though all major issues in SNSs need to be addressed by SNS providers, the privacy of SNS users is the most crucial. Therefore, in this paper, we have focused our discussion on "privacy in SNSs". We discuss different ways in which Personally Identifiable Information (PII) leaks from SNSs, information revelation to third-party domains without user consent, and privacy related threats associated with such information sharing. We expect that this comprehensive overview of privacy in SNSs will help raise user awareness about sharing data and managing privacy with SNSs. It will also help SNS providers to rethink their privacy policies.

  3. Trust-aware Privacy Control for Social Media

    OpenAIRE

    Li, Na; Najafian-Razavi, Maryam; Gillet, Denis

    2011-01-01

    Due to the huge exposure of personal information in social media, a challenge now is to design effective privacy mechanisms that protect against unauthorized access to social data. In this paper, a trust model for social media is first presented. Based on the trust model, a trust-aware privacy control protocol is proposed, that exploits the underlying inter-entity trust information. The objective is to design a fine-grained privacy scheme that ensures a user’s online information is disclosed ...

  4. Electronic Mail, Privacy, and the Electronic Communications Privacy Act of 1986: Technology in Search of Law.

    Science.gov (United States)

    Samoriski, Jan H.; And Others

    1996-01-01

    Attempts to clarify the status of e-mail privacy under the Electronic Communications Privacy Act of 1986 (ECPA). Examines current law and the paucity of definitive case law. A review of cases and literature suggests there is a gap in the existing ECPA that allows for potentially abusive electronic monitoring and interception of e-mail,…

  5. From Trust in Automation to Decision Neuroscience: Applying Cognitive Neuroscience Methods to Understand and Improve Interaction Decisions Involved in Human Automation Interaction

    Science.gov (United States)

    Drnec, Kim; Marathe, Amar R.; Lukos, Jamie R.; Metcalfe, Jason S.

    2016-01-01

    Human automation interaction (HAI) systems have thus far failed to live up to expectations mainly because human users do not always interact with the automation appropriately. Trust in automation (TiA) has been considered a central influence on the way a human user interacts with an automation; if TiA is too high there will be overuse, if TiA is too low there will be disuse. However, even though extensive research into TiA has identified specific HAI behaviors, or trust outcomes, a unique mapping between trust states and trust outcomes has yet to be clearly identified. Interaction behaviors have been intensely studied in the domain of HAI and TiA and this has led to a reframing of the issues of problems with HAI in terms of reliance and compliance. We find the behaviorally defined terms reliance and compliance to be useful in their functionality for application in real-world situations. However, we note that once an inappropriate interaction behavior has occurred it is too late to mitigate it. We therefore take a step back and look at the interaction decision that precedes the behavior. We note that the decision neuroscience community has revealed that decisions are fairly stereotyped processes accompanied by measurable psychophysiological correlates. Two literatures were therefore reviewed. TiA literature was extensively reviewed in order to understand the relationship between TiA and trust outcomes, as well as to identify gaps in current knowledge. We note that an interaction decision precedes an interaction behavior and believe that we can leverage knowledge of the psychophysiological correlates of decisions to improve joint system performance. As we believe that understanding the interaction decision will be critical to the eventual mitigation of inappropriate interaction behavior, we reviewed the decision making literature and provide a synopsis of the state of the art understanding of the decision process from a decision neuroscience perspective. We forward

  6. From Trust in Automation to Decision Neuroscience: Applying Cognitive Neuroscience Methods to Understand and Improve Interaction Decisions Involved in Human Automation Interaction.

    Science.gov (United States)

    Drnec, Kim; Marathe, Amar R; Lukos, Jamie R; Metcalfe, Jason S

    2016-01-01

    Human automation interaction (HAI) systems have thus far failed to live up to expectations mainly because human users do not always interact with the automation appropriately. Trust in automation (TiA) has been considered a central influence on the way a human user interacts with an automation; if TiA is too high there will be overuse, if TiA is too low there will be disuse. However, even though extensive research into TiA has identified specific HAI behaviors, or trust outcomes, a unique mapping between trust states and trust outcomes has yet to be clearly identified. Interaction behaviors have been intensely studied in the domain of HAI and TiA and this has led to a reframing of the issues of problems with HAI in terms of reliance and compliance. We find the behaviorally defined terms reliance and compliance to be useful in their functionality for application in real-world situations. However, we note that once an inappropriate interaction behavior has occurred it is too late to mitigate it. We therefore take a step back and look at the interaction decision that precedes the behavior. We note that the decision neuroscience community has revealed that decisions are fairly stereotyped processes accompanied by measurable psychophysiological correlates. Two literatures were therefore reviewed. TiA literature was extensively reviewed in order to understand the relationship between TiA and trust outcomes, as well as to identify gaps in current knowledge. We note that an interaction decision precedes an interaction behavior and believe that we can leverage knowledge of the psychophysiological correlates of decisions to improve joint system performance. As we believe that understanding the interaction decision will be critical to the eventual mitigation of inappropriate interaction behavior, we reviewed the decision making literature and provide a synopsis of the state of the art understanding of the decision process from a decision neuroscience perspective. We forward

  7. Millennials' sex differences on Snapchat perceived privacy

    Directory of Open Access Journals (Sweden)

    Antonietta Rauzzino

    2017-07-01

    Snapchat offers a feature that distinguishes it from other social networks in that its users control the visibility of the contents they share with others by defining how long these contents may be available. Snapchat is changing the way men and women perceive online information privacy and content management. This paper aims to illustrate the relevance of social representation theory to evaluate perceived privacy in Snapchat users, with a sample of 268 young adults residing in Bogotá. A survey method was employed for data collection purposes. The results reveal that Snapchat users are concerned about their networks’ privacy, with no significant sex differences, although men perceive Snapchat privacy as safer than women do. Finally, a discussion is presented as to the limitations and implications of these results for further studies.

  8. Development and Analyses of Privacy Management Models in Online Social Networks Based on Communication Privacy Management Theory

    Science.gov (United States)

    Lee, Ki Jung

    2013-01-01

    Online social networks (OSNs), while serving as an emerging means of communication, promote various issues of privacy. Users of OSNs encounter diverse occasions that lead to invasion of their privacy, e.g., published conversation, public revelation of their personally identifiable information, and open boundary of distinct social groups within…

  9. Envisioning a Future Decision Support System for Requirements Engineering: A Holistic and Human-centred Perspective

    OpenAIRE

    Alenljung, Beatrice

    2008-01-01

    Complex decision-making is a prominent aspect of requirements engineering (RE) and the need for improved decision support for RE decision-makers has been identified by a number of authors in the research literature. The fundamental viewpoint that permeates this thesis is that RE decision-making can be substantially improved by RE decision support systems (REDSS) based on the actual needs of RE decision-makers as well as the actual generic human decision-making activities that take place in th...

  10. Preserving location and absence privacy in geo-social networks

    DEFF Research Database (Denmark)

    Freni, Dario; Vicente, Carmen Ruiz; Mascetti, Sergio

    2010-01-01

    The resulting geo-aware social networks (GeoSNs) pose privacy threats beyond those found in location-based services. Content published in a GeoSN is often associated with references to multiple users, without the publisher being aware of the privacy preferences of those users. Moreover, this content is often accessible to multiple users. This renders it difficult for GeoSN users to control which information about them is available and to whom it is available. This paper addresses two privacy threats that occur in GeoSNs: location privacy and absence privacy. The former concerns the availability of information about the presence of users in specific locations at given times, while the latter concerns the availability of information about the absence of an individual from specific locations during given periods of time. The challenge addressed is that of supporting privacy while still enabling useful services.

  11. Privacy Training Program

    Science.gov (United States)

    Recognizing that training and awareness are critical to protecting agency Personally Identifiable Information (PII), the EPA is developing online training for privacy contacts in its programs and regions.

  12. The Effects of Public Concern for Information Privacy on the Adoption of Health Information Exchanges (HIEs) by Healthcare Entities.

    Science.gov (United States)

    Esmaeilzadeh, Pouyan

    2018-05-08

    The implementation of Health Information Exchanges (HIEs) by healthcare organizations may not achieve the desired outcomes, as consumers may request that their health information remain unshared because of information privacy concerns. Drawing on insights from the concern for information privacy (CFIP) literature, this work extends the application of CFIP to the HIE domain. This study attempts to develop and test a model centered on the four dimensions of the CFIP construct (collection, errors, unauthorized access, and secondary use) and their antecedents to predict consumers' opt-in behavioral intention toward HIE in the presence of the effects of perceived health status. We conducted an online survey in the United States with a sample of 826 respondents. The results demonstrate that perceived health information sensitivity and computer anxiety contribute meaningfully to information privacy concerns and that the CFIP construct significantly impedes consumers' opt-in decisions regarding HIEs. Interestingly, contrary to our expectation, perceived poor health status considerably attenuates the negative effects exerted by CFIP on opt-in intention. The model proposed by this study can be used as a useful conceptual tool by both further studies and practitioners to examine the complex nature of patients' reactions to information privacy threats associated with the use of HIE technology in the healthcare industry.

  13. Cyber security challenges in Smart Cities: Safety, security and privacy

    Science.gov (United States)

    Elmaghraby, Adel S.; Losavio, Michael M.

    2014-01-01

    The world is experiencing an evolution of Smart Cities. These emerge from innovations in information technology that, while they create new economic and social opportunities, pose challenges to our security and expectations of privacy. Humans are already interconnected via smart phones and gadgets. Smart energy meters, security devices and smart appliances are being used in many cities. Homes, cars, public venues and other social systems are now on their path to the full connectivity known as the “Internet of Things.” Standards are evolving for all of these potentially connected systems. They will lead to unprecedented improvements in the quality of life. To benefit from them, city infrastructures and services are changing with new interconnected systems for monitoring, control and automation. Intelligent transportation, public and private, will access a web of interconnected data from GPS location to weather and traffic updates. Integrated systems will aid public safety, emergency responders and disaster recovery. We examine two important and entangled challenges: security and privacy. Security includes illegal access to information and attacks causing physical disruptions in service availability. As digital citizens are more and more instrumented with data available about their location and activities, privacy seems to disappear. Privacy protecting systems that gather data and trigger emergency response when needed are technological challenges that go hand-in-hand with the continuous security challenges. Their implementation is essential for a Smart City in which we would wish to live. We also present a model representing the interactions between persons, servers and things. Those are the major elements in the Smart City and their interactions are what we need to protect. PMID:25685517

  14. Cyber security challenges in Smart Cities: Safety, security and privacy

    Directory of Open Access Journals (Sweden)

    Adel S. Elmaghraby

    2014-07-01

    The world is experiencing an evolution of Smart Cities. These emerge from innovations in information technology that, while they create new economic and social opportunities, pose challenges to our security and expectations of privacy. Humans are already interconnected via smart phones and gadgets. Smart energy meters, security devices and smart appliances are being used in many cities. Homes, cars, public venues and other social systems are now on their path to the full connectivity known as the “Internet of Things.” Standards are evolving for all of these potentially connected systems. They will lead to unprecedented improvements in the quality of life. To benefit from them, city infrastructures and services are changing with new interconnected systems for monitoring, control and automation. Intelligent transportation, public and private, will access a web of interconnected data from GPS location to weather and traffic updates. Integrated systems will aid public safety, emergency responders and disaster recovery. We examine two important and entangled challenges: security and privacy. Security includes illegal access to information and attacks causing physical disruptions in service availability. As digital citizens are more and more instrumented with data available about their location and activities, privacy seems to disappear. Privacy protecting systems that gather data and trigger emergency response when needed are technological challenges that go hand-in-hand with the continuous security challenges. Their implementation is essential for a Smart City in which we would wish to live. We also present a model representing the interactions between persons, servers and things. Those are the major elements in the Smart City and their interactions are what we need to protect.

  15. Cyber security challenges in Smart Cities: Safety, security and privacy.

    Science.gov (United States)

    Elmaghraby, Adel S; Losavio, Michael M

    2014-07-01

    The world is experiencing an evolution of Smart Cities. These emerge from innovations in information technology that, while they create new economic and social opportunities, pose challenges to our security and expectations of privacy. Humans are already interconnected via smart phones and gadgets. Smart energy meters, security devices and smart appliances are being used in many cities. Homes, cars, public venues and other social systems are now on their path to the full connectivity known as the "Internet of Things." Standards are evolving for all of these potentially connected systems. They will lead to unprecedented improvements in the quality of life. To benefit from them, city infrastructures and services are changing with new interconnected systems for monitoring, control and automation. Intelligent transportation, public and private, will access a web of interconnected data from GPS location to weather and traffic updates. Integrated systems will aid public safety, emergency responders and disaster recovery. We examine two important and entangled challenges: security and privacy. Security includes illegal access to information and attacks causing physical disruptions in service availability. As digital citizens are more and more instrumented with data available about their location and activities, privacy seems to disappear. Privacy protecting systems that gather data and trigger emergency response when needed are technological challenges that go hand-in-hand with the continuous security challenges. Their implementation is essential for a Smart City in which we would wish to live. We also present a model representing the interactions between persons, servers and things. Those are the major elements in the Smart City and their interactions are what we need to protect.

  16. 49 CFR 801.56 - Unwarranted invasion of personal privacy.

    Science.gov (United States)

    2010-10-01

    Title 49 (Transportation), vol. 7 (2010-10-01), § 801.56, Unwarranted invasion of personal privacy: Pursuant to 5 U.S.C. 552(b)(6), any personal, medical, or similar ... a clearly unwarranted invasion of the person's personal privacy. ...

  17. Privacy na Babel : de vermeende ongrijpbaarheid van het privacybegrip

    NARCIS (Netherlands)

    Vedder, A.H.

    1998-01-01

    The widespread view, recently advanced again by Serge Gutwirth, that privacy is in principle undefinable is incorrect. To defend privacy as a value, one must accept that privacy is admittedly a vague and complex concept, partly determined by context,

  18. 20 CFR 401.30 - Privacy Act and other responsibilities.

    Science.gov (United States)

    2010-04-01

    Title 20 (Employees' Benefits), vol. 2 (2010-04-01), Social Security Administration, Privacy and Disclosure of ..., § 401.30, Privacy Act and other responsibilities: ... information privacy issues, including those relating to the collection, use, sharing, and disclosure of ...

  19. Heuristic and optimal policy computations in the human brain during sequential decision-making.

    Science.gov (United States)

    Korn, Christoph W; Bach, Dominik R

    2018-01-23

    Optimal decisions across extended time horizons require value calculations over multiple probabilistic future states. Humans may circumvent such complex computations by resorting to easy-to-compute heuristics that approximate optimal solutions. To probe the potential interplay between heuristic and optimal computations, we develop a novel sequential decision-making task, framed as virtual foraging in which participants have to avoid virtual starvation. Rewards depend only on final outcomes over five-trial blocks, necessitating planning over five sequential decisions and probabilistic outcomes. Here, we report model comparisons demonstrating that participants primarily rely on the best available heuristic but also use the normatively optimal policy. FMRI signals in medial prefrontal cortex (MPFC) relate to heuristic and optimal policies and associated choice uncertainties. Crucially, reaction times and dorsal MPFC activity scale with discrepancies between heuristic and optimal policies. Thus, sequential decision-making in humans may emerge from integration between heuristic and optimal policies, implemented by controllers in MPFC.
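
    A minimal sketch may help make the heuristic-versus-optimal contrast concrete. The toy task below is an assumption-laden stand-in for the foraging paradigm described above (the energy range, option payoffs, and success threshold are invented for illustration, not taken from the study): a finite-horizon dynamic program computes the normatively optimal policy over five probabilistic trials, while a simple expected-one-step-gain rule plays the role of an easy-to-compute heuristic, and the script lists the states where the two disagree.

        # Illustrative sketch only: a toy five-trial "virtual foraging" block.
        # All numbers (energy range, payoffs, success threshold) are assumptions.
        from functools import lru_cache

        HORIZON = 5                       # five sequential decisions per block
        TARGET = 8                        # hypothetical energy needed at the end
        # Hypothetical options: name -> (win probability, gain if win, loss if lose)
        OPTIONS = {"safe": (1.0, 1, 0), "risky": (0.5, 3, -2)}

        def step(energy, delta):
            return max(0, min(12, energy + delta))

        @lru_cache(maxsize=None)
        def optimal_value(energy, t):
            """Finite-horizon dynamic programming; value = P(success at block end)."""
            if energy == 0:
                return 0.0                                   # starvation is absorbing
            if t == HORIZON:
                return 1.0 if energy >= TARGET else 0.0      # only the final outcome counts
            return max(p * optimal_value(step(energy, gain), t + 1)
                       + (1 - p) * optimal_value(step(energy, loss), t + 1)
                       for p, gain, loss in OPTIONS.values())

        def option_value(option, energy, t):
            p, gain, loss = OPTIONS[option]
            return (p * optimal_value(step(energy, gain), t + 1)
                    + (1 - p) * optimal_value(step(energy, loss), t + 1))

        def optimal_choice(energy, t):
            return max(OPTIONS, key=lambda o: option_value(o, energy, t))

        def heuristic_choice(energy, t):
            """Easy-to-compute stand-in heuristic: maximize expected one-step gain."""
            return max(OPTIONS, key=lambda o: OPTIONS[o][0] * OPTIONS[o][1]
                                              + (1 - OPTIONS[o][0]) * OPTIONS[o][2])

        # States where the two policies disagree are where discrepancy-related
        # signals (longer reaction times, dorsal MPFC activity) would be expected.
        for energy in (3, 5, 7):
            for t in range(HORIZON):
                h, o = heuristic_choice(energy, t), optimal_choice(energy, t)
                if h != o:
                    print(f"energy={energy}, trial={t}: heuristic={h}, optimal={o}")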

  20. Towards quantum-based privacy and voting

    International Nuclear Information System (INIS)

    Hillery, Mark; Ziman, Mario; Buzek, Vladimir; Bielikova, Martina

    2006-01-01

    The privacy of communicating participants is often of paramount importance, but in some situations it is an essential condition. A typical example is fair (secret) voting. We analyze in detail communication privacy based on quantum resources, and we propose new quantum protocols. Possible generalizations that would lead to voting schemes are discussed.

  1. The Privacy Attitude Questionnaire (PAQ): Initial Development and Validation

    OpenAIRE

    Chignell, Mark H.; Quan-Haase, Anabel; Gwizdka, Jacek

    2003-01-01

    Privacy has been identified as a key issue in a variety of domains, including electronic commerce and public policy. While there are many discussions of privacy issues from a legal and policy perspective, there is little information on the structure of privacy as a psychometric construct. Our goal is to develop a method for measuring attitudes towards privacy that can guide the design and personalization of services. This paper reports on the development of an initial version of the PAQ. Four...

  2. The Privacy Jungle: On the Market for Data Protection in Social Networks

    Science.gov (United States)

    Bonneau, Joseph; Preibusch, Sören

    We have conducted the first thorough analysis of the market for privacy practices and policies in online social networks. From an evaluation of 45 social networking sites using 260 criteria we find that many popular assumptions regarding privacy and social networking need to be revisited when considering the entire ecosystem instead of only a handful of well-known sites. Contrary to the common perception of an oligopolistic market, we find evidence of vigorous competition for new users. Although we observed many poor security practices, there is evidence that social network providers are making efforts to implement privacy enhancing technologies with substantial diversity in the amount of privacy control offered. However, privacy is rarely used as a selling point, and even then only as an auxiliary, non-decisive feature. Sites also failed to promote their existing privacy controls within the site. We similarly found great diversity in the length and content of formal privacy policies, but found an opposite promotional trend: though almost all policies are not accessible to ordinary users due to obfuscating legal jargon, they conspicuously vaunt the sites' privacy practices. We conclude that the market for privacy in social networks is dysfunctional in that there is significant variation in sites' privacy controls, data collection requirements, and legal privacy policies, but this is not effectively conveyed to users. Our empirical findings motivate us to introduce the novel model of a privacy communication game, where the economically rational choice for a site operator is to make privacy control available to evade criticism from privacy fundamentalists, while hiding the privacy control interface and privacy policy to maximize sign-up numbers and encourage data sharing from the pragmatic majority of users.

  3. MoCog1: A computer simulation of recognition-primed human decision making

    Science.gov (United States)

    Gevarter, William B.

    1991-01-01

    The results of the first stage of a research effort to develop a 'sophisticated' computer model of human cognitive behavior are described. Most human decision making is an experience-based, relatively straight-forward, largely automatic response to internal goals and drives, utilizing cues and opportunities perceived from the current environment. The development of the architecture and computer program (MoCog1) associated with such 'recognition-primed' decision making is discussed. The resultant computer program was successfully utilized as a vehicle to simulate earlier findings that relate how an individual's implicit theories orient the individual toward particular goals, with resultant cognitions, affects, and behavior in response to their environment.

  4. Privacy concerns in smart cities

    OpenAIRE

    van Zoonen, Liesbet

    2016-01-01

    In this paper a framework is constructed to hypothesize if and how smart city technologies and urban big data produce privacy concerns among the people in these cities (as inhabitants, workers, visitors, and otherwise). The framework is built on the basis of two recurring dimensions in research about people's concerns about privacy: one dimension represents that people perceive particular data as more personal and sensitive than others, the other dimension represents that people'...

  5. Users or Students? Privacy in University MOOCs.

    Science.gov (United States)

    Jones, Meg Leta; Regner, Lucas

    2016-10-01

    Two terms, student privacy and Massive Open Online Courses, have received a significant amount of attention recently. Both represent interesting sites of change in entrenched structures, one educational and one legal. MOOCs represent something college courses have never been able to provide: universal access. Universities not wanting to miss the MOOC wave have started to build MOOC courses and integrate them into the university system in various ways. However, the design and scale of university MOOCs create tension for privacy laws intended to regulate information practices exercised by educational institutions. Are MOOCs part of the educational institutions these laws and policies aim to regulate? Are MOOC users students whose data are protected by the aforementioned laws and policies? Many university researchers and faculty members are asked to participate as designers and instructors in MOOCs but may not know how to approach the issues raised. While recent scholarship has addressed the disruptive nature of MOOCs, student privacy generally, and data privacy in the K-12 system, we provide an in-depth description and analysis of the MOOC phenomenon and the privacy laws and policies that guide and regulate educational institutions today. We offer privacy case studies of three major MOOC providers active in the market today to reveal inconsistencies among MOOC platforms and the level and type of legal uncertainty surrounding them. Finally, we provide a list of organizational questions to pose internally to navigate the uncertainty presented to university MOOC teams.

  6. Privacy Management and Networked PPD Systems - Challenges and Solutions.

    Science.gov (United States)

    Ruotsalainen, Pekka; Pharow, Peter; Petersen, Francoise

    2015-01-01

    Modern personal portable health devices (PPDs) are increasingly becoming part of a larger, inhomogeneous information system. Information collected by sensors is stored and processed in global clouds. Services are often free of charge, but at the same time service providers' business model is based on the disclosure of users' intimate health information. Health data processed in PPD networks is not regulated by health care specific legislation. In PPD networks, there is no guarantee that stakeholders share the same ethical principles as the user. Service providers often have their own security and privacy policies, and they rarely offer users the possibility to define their own privacy policies or adapt existing ones. This all raises huge ethical and privacy concerns. In this paper, the authors have analyzed privacy challenges in PPD networks from the users' viewpoint using a system modeling method and propose that the principle "Personal Health Data under Personal Control" must be generally accepted at a global level. Among possible implementations of this principle, the authors propose encryption, computer understandable privacy policies, and privacy labels, or trust-based privacy management methods. The latter can be realized using an infrastructural trust calculation and monitoring service. A first step is to require that the protection of personal health information and the proposed principle be internationally mandatory. This requires both regulatory and standardization activities, and the availability of open and certified software applications which all service providers can implement. One of those applications should be the independent Trust verifier.

  7. Just in Time Research: Privacy Practices

    Science.gov (United States)

    Grama, Joanna Lyn

    2014-01-01

    The January 2014 edition of the ECAR Update subscriber newsletter included an informal poll on information privacy practices. The poll was intended to collect a quick snapshot of the higher education community's thoughts on this important topic during Data Privacy Month. Results of the poll will be used to inform EDUCAUSE research, programs,…

  8. Scalable privacy-preserving big data aggregation mechanism

    Directory of Open Access Journals (Sweden)

    Dapeng Wu

    2016-08-01

    As the massive sensor data generated by large-scale Wireless Sensor Networks (WSNs) has recently become an indispensable part of ‘Big Data’, the collection, storage, transmission and analysis of the big sensor data attract considerable attention from researchers. Targeting the privacy requirements of large-scale WSNs and focusing on the energy-efficient collection of big sensor data, a Scalable Privacy-preserving Big Data Aggregation (Sca-PBDA) method is proposed in this paper. Firstly, according to the pre-established gradient topology structure, sensor nodes in the network are divided into clusters. Secondly, sensor data is modified by each node according to the privacy-preserving configuration message received from the sink. Subsequently, intra- and inter-cluster data aggregation is employed during the big sensor data reporting phase to reduce energy consumption. Lastly, aggregated results are recovered by the sink to complete the privacy-preserving big data aggregation. Simulation results validate the efficacy and scalability of Sca-PBDA and show that the big sensor data generated by large-scale WSNs is efficiently aggregated to reduce network resource consumption and the sensor data privacy is effectively protected to meet the ever-growing application requirements.
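
    As a rough illustration of the general idea (node-side masking configured by the sink, cluster-level summation, and recovery of the true aggregate at the sink), the sketch below shows a generic additive-perturbation aggregation scheme. It is not the actual Sca-PBDA algorithm; the seed construction, noise range, and clustering are assumptions made purely for illustration.

        # Generic privacy-preserving additive aggregation sketch (not Sca-PBDA itself).
        import random

        MASTER_SEED = 12345     # stands in for the sink's privacy-preserving configuration
        EPOCH = 7               # reporting round

        def per_node_offset(node_id, epoch, master_seed):
            """Pseudorandom masking offset that both the node and the sink can derive."""
            rng = random.Random(master_seed * 1_000_003 + node_id * 1_009 + epoch)
            return rng.uniform(-50.0, 50.0)

        def node_report(node_id, reading, epoch, master_seed):
            """Each node perturbs its own reading before it ever leaves the node."""
            return reading + per_node_offset(node_id, epoch, master_seed)

        def sink_recover(masked_total, node_ids, epoch, master_seed):
            """The sink regenerates and removes all offsets, recovering the true sum."""
            return masked_total - sum(per_node_offset(n, epoch, master_seed) for n in node_ids)

        # Toy network: three clusters of sensor nodes reporting in one epoch.
        clusters = {"c0": [1, 2, 3], "c1": [4, 5], "c2": [6, 7, 8, 9]}
        readings = {n: random.uniform(15.0, 30.0)
                    for members in clusters.values() for n in members}

        # Intra-cluster aggregation at cluster heads, then inter-cluster aggregation.
        masked_total = sum(sum(node_report(n, readings[n], EPOCH, MASTER_SEED) for n in members)
                           for members in clusters.values())
        recovered = sink_recover(masked_total, list(readings), EPOCH, MASTER_SEED)
        print(round(recovered, 6), round(sum(readings.values()), 6))  # equal up to float rounding

    Individual masked readings reveal little about any single sensor, while the sink still obtains the exact aggregate, which is the trade-off the abstract describes.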

  9. Privacy-Preserving Biometric Authentication: Challenges and Directions

    Directory of Open Access Journals (Sweden)

    Elena Pagnin

    2017-01-01

    An emerging direction for authenticating people is the adoption of biometric authentication systems. Biometric credentials are becoming increasingly popular as a means of authenticating people due to the wide range of advantages that they provide with respect to classical authentication methods (e.g., password-based authentication). The most characteristic feature of this authentication method is the naturally strong bond between a user and her biometric credentials. This very same advantageous property, however, raises serious security and privacy concerns in case the biometric trait gets compromised. In this article, we present the most challenging issues that need to be taken into consideration when designing secure and privacy-preserving biometric authentication protocols. More precisely, we describe the main threats against privacy-preserving biometric authentication systems and give directions on possible countermeasures in order to design secure and privacy-preserving biometric authentication protocols.

  10. Routes for breaching and protecting genetic privacy.

    Science.gov (United States)

    Erlich, Yaniv; Narayanan, Arvind

    2014-06-01

    We are entering an era of ubiquitous genetic information for research, clinical care and personal curiosity. Sharing these data sets is vital for progress in biomedical research. However, a growing concern is the ability to protect the genetic privacy of the data originators. Here, we present an overview of genetic privacy breaching strategies. We outline the principles of each technique, indicate the underlying assumptions, and assess their technological complexity and maturation. We then review potential mitigation methods for privacy-preserving dissemination of sensitive data and highlight different cases that are relevant to genetic applications.

  11. Analysis of Privacy-Enhancing Identity Management Systems

    DEFF Research Database (Denmark)

    Adjei, Joseph K.; Olesen, Henning

    Privacy has become a major issue for policy makers. This has been impelled by the rapid development of technologies that facilitate collection, distribution, storage, and manipulation of personal information. Business organizations are finding new ways of leveraging the value derived from consumer...... is an attempt to understand the relationship between individuals’ intentions to disclose personal information, their actual personal information disclosure behaviours, and how these can be leveraged to develop privacy-enhancing identity management systems (IDMS) that users can trust. Legal, regulatory...... and technological aspects of privacy and technology adoption are also discussed....

  12. 32 CFR 806b.30 - Evaluating information systems for Privacy Act compliance.

    Science.gov (United States)

    2010-07-01

    Title 32 (National Defense), vol. 6 (2010-07-01), Department of the Air Force, Administration, Privacy Act Program, Privacy Impact Assessments, § 806b.30, Evaluating information systems for Privacy Act compliance: ... privacy issues are unchanged. (d) The depth and content of the Privacy Impact Assessment should be ...

  13. Fuzzy Decision-Making Fuser (FDMF) for Integrating Human-Machine Autonomous (HMA) Systems with Adaptive Evidence Sources

    Directory of Open Access Journals (Sweden)

    Yu-Ting Liu

    2017-06-01

    A brain-computer interface (BCI) creates a direct communication pathway between the human brain and an external device or system. In contrast to patient-oriented BCIs, which are intended to restore inoperative or malfunctioning aspects of the nervous system, a growing number of BCI studies focus on designing auxiliary systems that are intended for everyday use. The goal of building these BCIs is to provide capabilities that augment existing intact physical and mental capabilities. However, a key challenge to BCI research is human variability; factors such as fatigue, inattention, and stress vary both across different individuals and for the same individual over time. If these issues are addressed, autonomous systems may provide additional benefits that enhance system performance and prevent problems introduced by individual human variability. This study proposes a human-machine autonomous (HMA) system that simultaneously aggregates human and machine knowledge to recognize targets in a rapid serial visual presentation (RSVP) task. The HMA focuses on integrating an RSVP BCI with computer vision techniques in an image-labeling domain. A fuzzy decision-making fuser (FDMF) is then applied in the HMA system to provide a natural adaptive framework for evidence-based inference by incorporating an integrated summary of the available evidence (i.e., human and machine decisions) and associated uncertainty. Consequently, the HMA system dynamically aggregates decisions involving uncertainties from both human and autonomous agents. The collaborative decisions made by an HMA system can achieve and maintain superior performance more efficiently than either the human or autonomous agents can achieve independently. The experimental results shown in this study suggest that the proposed HMA system with the FDMF can effectively fuse decisions from human brain activities and the computer vision techniques to improve overall performance on the RSVP recognition task. This

  14. Fuzzy Decision-Making Fuser (FDMF) for Integrating Human-Machine Autonomous (HMA) Systems with Adaptive Evidence Sources.

    Science.gov (United States)

    Liu, Yu-Ting; Pal, Nikhil R; Marathe, Amar R; Wang, Yu-Kai; Lin, Chin-Teng

    2017-01-01

    A brain-computer interface (BCI) creates a direct communication pathway between the human brain and an external device or system. In contrast to patient-oriented BCIs, which are intended to restore inoperative or malfunctioning aspects of the nervous system, a growing number of BCI studies focus on designing auxiliary systems that are intended for everyday use. The goal of building these BCIs is to provide capabilities that augment existing intact physical and mental capabilities. However, a key challenge to BCI research is human variability; factors such as fatigue, inattention, and stress vary both across different individuals and for the same individual over time. If these issues are addressed, autonomous systems may provide additional benefits that enhance system performance and prevent problems introduced by individual human variability. This study proposes a human-machine autonomous (HMA) system that simultaneously aggregates human and machine knowledge to recognize targets in a rapid serial visual presentation (RSVP) task. The HMA focuses on integrating an RSVP BCI with computer vision techniques in an image-labeling domain. A fuzzy decision-making fuser (FDMF) is then applied in the HMA system to provide a natural adaptive framework for evidence-based inference by incorporating an integrated summary of the available evidence (i.e., human and machine decisions) and associated uncertainty. Consequently, the HMA system dynamically aggregates decisions involving uncertainties from both human and autonomous agents. The collaborative decisions made by an HMA system can achieve and maintain superior performance more efficiently than either the human or autonomous agents can achieve independently. The experimental results shown in this study suggest that the proposed HMA system with the FDMF can effectively fuse decisions from human brain activities and the computer vision techniques to improve overall performance on the RSVP recognition task. This conclusion
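
    The following sketch illustrates, in a deliberately simplified form, what fusing a human (BCI) decision with a machine (computer vision) decision under uncertainty can look like: each agent contributes a target probability weighted by its current confidence. This is a generic confidence-weighted fusion example, not the FDMF algorithm itself; the probabilities and uncertainty values are invented for illustration.

        # Generic confidence-weighted fusion of human and machine decisions (not the FDMF).
        def fuse(p_human, unc_human, p_machine, unc_machine):
            """Fuse two probability estimates that an RSVP image contains a target.
            unc_* in [0, 1] expresses each agent's current uncertainty (e.g. driven by
            fatigue for the human or poor image quality for the machine)."""
            w_human, w_machine = 1.0 - unc_human, 1.0 - unc_machine
            if w_human + w_machine == 0.0:
                return 0.5                 # no usable evidence from either agent
            return (w_human * p_human + w_machine * p_machine) / (w_human + w_machine)

        examples = [
            (0.9, 0.2, 0.4, 0.1),   # alert human says "target"; confident machine disagrees
            (0.9, 0.8, 0.4, 0.1),   # fatigued human: the machine's evidence dominates
            (0.6, 0.3, 0.7, 0.3),   # weak agreement from both agents
        ]
        for p_h, u_h, p_m, u_m in examples:
            p = fuse(p_h, u_h, p_m, u_m)
            print(f"fused p(target) = {p:.2f} -> {'target' if p >= 0.5 else 'non-target'}")

    The point of the sketch is simply that the fused decision shifts toward whichever agent is currently the more reliable source of evidence, which is the behavior the abstract attributes to the HMA system.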

  15. Reward-based spatial crowdsourcing with differential privacy preservation

    Science.gov (United States)

    Xiong, Ping; Zhang, Lefeng; Zhu, Tianqing

    2017-11-01

    In recent years, the popularity of mobile devices has transformed spatial crowdsourcing (SC) into a novel mode for performing complicated projects. Workers can perform tasks at specified locations in return for rewards offered by employers. Existing methods ensure the efficiency of their systems by submitting the workers' exact locations to a centralised server for task assignment, which can lead to privacy violations. Thus, implementing crowdsourcing applications while preserving the privacy of workers' locations is a key issue that needs to be tackled. We propose a reward-based SC method that achieves acceptable utility as measured by task assignment success rates, while efficiently preserving privacy. A differential privacy model ensures a rigorous privacy guarantee, and Laplace noise is introduced to protect workers' exact locations. We then present a reward allocation mechanism that adjusts each piece of the reward for a task using the distribution of the workers' locations. Through experimental results, we demonstrate that this optimised-reward method is efficient for SC applications.
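
    The Laplace mechanism mentioned above can be sketched in a few lines. The example below perturbs a worker's reported coordinates with per-axis Laplace noise whose scale is sensitivity/epsilon; the sensitivity value, the kilometre-to-degree conversion, and the sample coordinates are illustrative assumptions, and the paper's exact obfuscation and reward-adjustment rules are not reproduced.

        # Illustrative per-axis Laplace mechanism for a reported worker location.
        import random

        def laplace_noise(scale):
            """Laplace(0, scale) sample as the difference of two i.i.d. exponentials."""
            return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

        def perturb_location(lat, lon, epsilon, sensitivity_km=1.0):
            """Standard Laplace mechanism per axis: noise scale = sensitivity / epsilon."""
            scale_km = sensitivity_km / epsilon
            km_per_degree = 111.0          # rough conversion, illustrative assumption
            return (lat + laplace_noise(scale_km) / km_per_degree,
                    lon + laplace_noise(scale_km) / km_per_degree)

        # A worker reports only a perturbed position; the server assigns tasks (and
        # could adjust rewards) using the noisy report alone.
        true_lat, true_lon = 30.5200, 114.3100     # hypothetical worker location
        for epsilon in (0.1, 1.0, 10.0):           # smaller epsilon -> stronger privacy, more noise
            print(f"epsilon={epsilon}: reported location = {perturb_location(true_lat, true_lon, epsilon)}")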

  16. Selective attention increases choice certainty in human decision making.

    Science.gov (United States)

    Zizlsperger, Leopold; Sauvigny, Thomas; Haarmeier, Thomas

    2012-01-01

    Choice certainty is a probabilistic estimate of past performance and expected outcome. In perceptual decisions the degree of confidence correlates closely with choice accuracy and reaction times, suggesting an intimate relationship to objective performance. Here we show that spatial and feature-based attention increase human subjects' certainty more than accuracy in visual motion discrimination tasks. Our findings demonstrate for the first time a dissociation of choice accuracy and certainty with a significantly stronger influence of voluntary top-down attention on subjective performance measures than on objective performance. These results reveal a so far unknown mechanism of the selection process implemented by attention and suggest a unique biological valence of choice certainty beyond a faithful reflection of the decision process.

  17. MODEL REGULATION FOR DATA PRIVACY IN THE APPLICATION OF BIOMETRIC SMART CARD

    Directory of Open Access Journals (Sweden)

    Sinta Dewi

    2017-03-01

    This article explores a data privacy regulatory model intended to regulate and protect data privacy. The regulatory model combines several approaches to managing data privacy, especially in the use of biometric smart cards. Firstly, through laws that enforce the relevant principles and international standards. Secondly, through the market approach (market-based solutions) derived through industry associations, which help protect consumer data privacy by applying a privacy policy in the form of a statement that the industry will protect consumers' privacy by implementing fair information principles. Thirdly, through technological approaches such as PETs (privacy-enhancing technologies), i.e., techniques for anonymous and pseudo-anonymous payment, communication, and web access. Fourthly, through corporate privacy rules.

  18. Privacy-Preserving Data Mining of Medical Data Using Data Separation-Based Techniques

    Directory of Open Access Journals (Sweden)

    Gang Kou

    2007-08-01

    Data mining is concerned with the extraction of useful knowledge from various types of data. Medical data mining has been a popular data mining topic of late. Compared with other data mining areas, medical data mining has some unique characteristics. Because medical files are related to human subjects, privacy concerns are taken more seriously than in other data mining tasks. This paper applies data separation-based techniques to preserve privacy in the classification of medical data. We take two approaches to protect privacy: one approach is to vertically partition the medical data and mine these partitioned data at multiple sites; the other approach is to horizontally split data across multiple sites. In the vertical partition approach, each site uses a portion of the attributes to compute its results, and the distributed results are assembled at a central trusted party using a majority-vote ensemble method. In the horizontal partition approach, data are distributed among several sites. Each site computes its own data, and a central trusted party is responsible for integrating these results. We implement these two approaches using medical datasets from the UCI KDD archive and report the experimental results.
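
    A small sketch of the vertical-partition idea may be useful: each site sees only its own attributes, trains (here: hard-codes) a local classifier, and shares only its predicted label with a trusted party that takes a majority vote. The feature split, the toy decision rules, and the example record are assumptions for illustration, not the datasets or models used in the paper.

        # Toy illustration of vertically partitioned classification with majority voting.
        from collections import Counter

        # Hypothetical vertical partition of one medical record across three sites.
        SITE_FEATURES = {
            "site_lab":      ["glucose", "cholesterol"],
            "site_imaging":  ["scan_score"],
            "site_clinical": ["age", "bmi"],
        }

        def site_predict(site, record):
            """Stand-in for a classifier trained locally at each site; only the predicted
            label leaves the site, never the raw attribute values."""
            view = {f: record[f] for f in SITE_FEATURES[site]}
            if site == "site_lab":
                return int(view["glucose"] > 6.5 or view["cholesterol"] > 6.0)
            if site == "site_imaging":
                return int(view["scan_score"] > 0.7)
            return int(view["age"] > 60 and view["bmi"] > 30)

        def ensemble_predict(record):
            """The central trusted party combines the per-site labels by majority vote."""
            votes = Counter(site_predict(site, record) for site in SITE_FEATURES)
            return votes.most_common(1)[0][0]

        patient = {"glucose": 7.1, "cholesterol": 5.2, "scan_score": 0.4, "age": 66, "bmi": 33}
        print("ensemble label:", ensemble_predict(patient))   # two of three sites vote positive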

  19. European Perspectives on Privacy in the Sharing Economy

    DEFF Research Database (Denmark)

    Ranzini, Giulia; Etter, Michael; Vermeulen, Ivar

    Report from the EU H2020 Research Project Ps2Share: Participation, Privacy, and Power in the Sharing Economy. This report ‘European Perspectives on Privacy in the Sharing Economy’ forms one element of a European Union Horizon 2020 Research Project on the sharing economy: Ps2Share ‘Participation, Privacy, and Power in the Sharing Economy’. The study is undertaken within the scope of the European Union’s Horizon 2020 research and innovation programme, funded under grant agreement No. 732117 and with the objective (ICT-35) of “Enabling responsible ICT-related research and innovation”. This project ... recommendations to Europe’s institutions. We focus on topics of participation, privacy, and power in the sharing economy.

  20. PRIVACY CONCERNS IN FACEBOOK SITE

    OpenAIRE

    Vandana Singh

    2014-01-01

    Today, social networking sites provide an important and inexpensive way to maintain existing relationships and present oneself. However, the increasing use of online sites gives rise to privacy concerns and risks. All Internet sites are also under attack from phishers, fraudsters, and spammers. They aim to steal user information and expose users to unwanted spam. They have many resources at their disposal. This paper studies the awareness of college students regarding privacy in Facebook...