WorldWideScience

Sample records for secrets protecting privacy

  1. 76 FR 66940 - Privacy Act of 1974; Department of Homeland Security/United States Secret Service-004 Protection...

    Science.gov (United States)

    2011-10-28

    ... DEPARTMENT OF HOMELAND SECURITY Office of the Secretary [Docket No. DHS-2011-0083] Privacy Act of 1974; Department of Homeland Security/United States Secret Service--004 Protection Information System... Security (DHS)/United States Secret Service (USSS)-004 System name: DHS/USSS-004 Protection Information...

  2. Control use of data to protect privacy.

    Science.gov (United States)

    Landau, Susan

    2015-01-30

    Massive data collection by businesses and governments calls into question traditional methods for protecting privacy, underpinned by two core principles: (i) notice, that there should be no data collection system whose existence is secret, and (ii) consent, that data collected for one purpose not be used for another without user permission. But notice, designated as a fundamental privacy principle in a different era, makes little sense in situations where collection consists of lots and lots of small amounts of information, whereas consent is no longer realistic, given the complexity and number of decisions that must be made. Thus, efforts to protect privacy by controlling use of data are gaining more attention. I discuss relevant technology, policy, and law, as well as some examples that can illuminate the way. Copyright © 2015, American Association for the Advancement of Science.

  3. Genetic secrets: Protecting privacy and confidentiality in the genetic era

    Energy Technology Data Exchange (ETDEWEB)

Rothstein, M.A. [ed.]

    1998-07-01

    Few developments are likely to affect human beings more profoundly in the long run than the discoveries resulting from advances in modern genetics. Although the developments in genetic technology promise to provide many additional benefits, their application to genetic screening poses ethical, social, and legal questions, many of which are rooted in issues of privacy and confidentiality. The ethical, practical, and legal ramifications of these and related questions are explored in depth. The broad range of topics includes: the privacy and confidentiality of genetic information; the challenges to privacy and confidentiality that may be projected to result from the emerging genetic technologies; the role of informed consent in protecting the confidentiality of genetic information in the clinical setting; the potential uses of genetic information by third parties; the implications of changes in the health care delivery system for privacy and confidentiality; relevant national and international developments in public policies, professional standards, and laws; recommendations; and the identification of research needs.

  4. Protecting genetic privacy.

    Science.gov (United States)

    Roche, P A; Annas, G J

    2001-05-01

This article outlines the arguments for and against new rules to protect genetic privacy. We explain why genetic information is different from other sensitive medical information, why researchers and biotechnology companies have opposed new rules to protect genetic privacy (and favour anti-discrimination laws instead), and discuss what can be done to protect privacy in relation to genetic-sequence information and to DNA samples themselves.

  5. Genetic secrets: Protecting privacy and confidentiality in the genetic era. Final report

    Energy Technology Data Exchange (ETDEWEB)

Rothstein, M.A. [ed.]

    1998-09-01

    Few developments are likely to affect human beings more profoundly in the long run than the discoveries resulting from advances in modern genetics. Although the developments in genetic technology promise to provide many additional benefits, their application to genetic screening poses ethical, social, and legal questions, many of which are rooted in issues of privacy and confidentiality. The ethical, practical, and legal ramifications of these and related questions are explored in depth. The broad range of topics includes: the privacy and confidentiality of genetic information; the challenges to privacy and confidentiality that may be projected to result from the emerging genetic technologies; the role of informed consent in protecting the confidentiality of genetic information in the clinical setting; the potential uses of genetic information by third parties; the implications of changes in the health care delivery system for privacy and confidentiality; relevant national and international developments in public policies, professional standards, and laws; recommendations; and the identification of research needs.

  6. Protecting patron privacy

    CERN Document Server

    Beckstrom, Matthew

    2015-01-01

In a world where almost anyone with computer savvy can hack, track, and record the online activities of others, your library can serve as a protected haven for visitors who rely on the Internet to conduct research, if you take the necessary steps to safeguard their privacy. This book shows you how to protect patrons' privacy while they use the technology that your library provides, including public computers, Internet access, wireless networks, and other devices. Logically organized into two major sections, the first part of the book discusses why the privacy of your users is of paramount

  7. Trajectory data privacy protection based on differential privacy mechanism

    Science.gov (United States)

    Gu, Ke; Yang, Lihao; Liu, Yongzhi; Liao, Niandong

    2018-05-01

In this paper, we propose a trajectory data privacy protection scheme based on a differential privacy mechanism. In the proposed scheme, the algorithm first selects the protected points from the user's trajectory data; secondly, it forms a polygon from each protected point and the adjacent, frequently accessed points selected from the accessing-point database, and then calculates the polygon centroids; finally, noise is added to the polygon centroids by the differential privacy method, the noisy centroids replace the protected points, and the algorithm constructs and issues the new trajectory data. The experiments show that the proposed algorithms run quickly, the privacy protection of the scheme is effective, and its data usability is high.
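The noise-addition step of such a scheme can be sketched as follows. This is a minimal illustration of perturbing a polygon centroid with Laplace noise calibrated to a privacy budget epsilon, not the authors' implementation; the point-selection and polygon-construction logic is assumed away, and the sensitivity value is a placeholder:

```python
import math
import random

def polygon_centroid(points):
    """Centroid (vertex mean) of a list of 2-D points."""
    n = len(points)
    return (sum(x for x, _ in points) / n, sum(y for _, y in points) / n)

def laplace_noise(scale):
    """Sample from Laplace(0, scale) by inverse-CDF sampling."""
    u = random.random() - 0.5  # uniform in [-0.5, 0.5)
    return -scale * math.copysign(math.log(1.0 - 2.0 * abs(u)), u)

def perturb_centroid(points, epsilon, sensitivity):
    """Return the polygon centroid with Laplace noise calibrated to the
    epsilon budget; this noisy point would replace the protected point."""
    cx, cy = polygon_centroid(points)
    scale = sensitivity / epsilon
    return (cx + laplace_noise(scale), cy + laplace_noise(scale))
```

A smaller epsilon (or larger sensitivity) widens the noise distribution, trading trajectory usability for stronger privacy.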

  8. Privacy protection schemes for fingerprint recognition systems

    Science.gov (United States)

    Marasco, Emanuela; Cukic, Bojan

    2015-05-01

The deployment of fingerprint recognition systems has always raised concerns related to personal privacy. A fingerprint is permanently associated with an individual and, generally, it cannot be reset if compromised in one application. Given that fingerprints are not a secret, potential misuses besides personal recognition represent privacy threats and may lead to public distrust. Privacy mechanisms control access to personal information and limit the likelihood of intrusions. In this paper, image- and feature-level schemes for privacy protection in fingerprint recognition systems are reviewed. Storing only key features of a biometric signature can reduce the likelihood of biometric data being used for unintended purposes. In biometric cryptosystems and biometric-based key release, the biometric component verifies the identity of the user, while the cryptographic key protects the communication channel. In transformation-based approaches, only a transformed version of the original biometric signature is stored. Different applications can use different transforms. Matching is performed in the transformed domain, which enables the preservation of low error rates. Since such templates do not reveal information about individuals, they are referred to as cancelable templates. A compromised template can be re-issued using a different transform. At the image level, de-identification schemes can remove identifiers disclosed for objectives unrelated to the original purpose, while permitting other authorized uses of personal information. Fingerprint images can be de-identified by, for example, mixing fingerprints or removing the gender signature. In both cases, degradation of matching performance is minimized.
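A cancelable template of the transformation-based kind described above can be sketched generically as a seed-specific random projection with matching in the transformed domain. This is an illustration of the idea, not any particular scheme from the reviewed literature; the feature vector, seed strings, and dimensions are hypothetical:

```python
import hashlib
import math
import struct

def _rand_floats(seed: bytes, n: int):
    """Deterministic pseudo-random floats in [-1, 1) derived from a seed."""
    out, counter = [], 0
    while len(out) < n:
        digest = hashlib.sha256(seed + counter.to_bytes(4, "big")).digest()
        for i in range(0, 32, 4):
            val = struct.unpack(">I", digest[i:i + 4])[0]
            out.append(val / 2**31 - 1.0)
            if len(out) == n:
                break
        counter += 1
    return out

def cancelable_template(features, seed: bytes, out_dim: int):
    """Project a feature vector through a seed-specific random matrix.
    Re-issuing a compromised template only requires a new seed."""
    d = len(features)
    matrix = _rand_floats(seed, out_dim * d)
    return [sum(matrix[r * d + c] * features[c] for c in range(d))
            for r in range(out_dim)]

def match(t1, t2, threshold=0.9):
    """Cosine-similarity matching entirely in the transformed domain."""
    dot = sum(a * b for a, b in zip(t1, t2))
    n1 = math.sqrt(sum(a * a for a in t1))
    n2 = math.sqrt(sum(b * b for b in t2))
    return dot / (n1 * n2) >= threshold
```

With the same seed, two captures of the same feature vector match; enrolling with a different seed produces an unlinkable template, which is what allows a compromised template to be revoked and replaced.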

  9. A privacy protection model to support personal privacy in relational databases.

    OpenAIRE

    2008-01-01

Today's individual insists on more protection of his or her personal privacy than a few years ago. During the last few years, rapid technological advances, especially in the field of information technology, have directed most attention and energy to the privacy protection of the Internet user. Research was done, and is still being done, covering a vast area to protect the privacy of transactions performed on the Internet. However, it was established that almost no research has been don...

  10. 45 CFR 164.522 - Rights to request privacy protection for protected health information.

    Science.gov (United States)

    2010-10-01

    ... ADMINISTRATIVE DATA STANDARDS AND RELATED REQUIREMENTS SECURITY AND PRIVACY Privacy of Individually Identifiable Health Information § 164.522 Rights to request privacy protection for protected health information. (a)(1... 45 Public Welfare 1 2010-10-01 2010-10-01 false Rights to request privacy protection for protected...

  11. Protecting privacy in a clinical data warehouse.

    Science.gov (United States)

    Kong, Guilan; Xiao, Zhichun

    2015-06-01

    Peking University has several prestigious teaching hospitals in China. To make secondary use of massive medical data for research purposes, construction of a clinical data warehouse is imperative in Peking University. However, a big concern for clinical data warehouse construction is how to protect patient privacy. In this project, we propose to use a combination of symmetric block ciphers, asymmetric ciphers, and cryptographic hashing algorithms to protect patient privacy information. The novelty of our privacy protection approach lies in message-level data encryption, the key caching system, and the cryptographic key management system. The proposed privacy protection approach is scalable to clinical data warehouse construction with any size of medical data. With the composite privacy protection approach, the clinical data warehouse can be secure enough to keep the confidential data from leaking to the outside world. © The Author(s) 2014.
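The message-level approach described here can be illustrated with a toy sketch: a keyed hash pseudonymizes patient identifiers while a symmetric keystream encrypts record content. The SHA-256 counter-mode keystream below is a standard-library stand-in for a real block cipher such as AES, and the asymmetric key wrapping and key-caching layers of the proposed system are omitted; key and salt values are placeholders:

```python
import hashlib
import hmac
import os

def pseudonymize(patient_id: str, salt: bytes) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256) so
    records can still be linked without exposing the identity."""
    return hmac.new(salt, patient_id.encode(), hashlib.sha256).hexdigest()

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """SHA-256 counter-mode keystream; a stand-in for a real cipher."""
    blocks = []
    for counter in range((length + 31) // 32):
        blocks.append(hashlib.sha256(
            key + nonce + counter.to_bytes(8, "big")).digest())
    return b"".join(blocks)[:length]

def encrypt_record(record: bytes, key: bytes):
    """Message-level encryption: each record gets a fresh nonce."""
    nonce = os.urandom(16)
    ct = bytes(a ^ b for a, b in zip(record, _keystream(key, nonce, len(record))))
    return nonce, ct

def decrypt_record(nonce: bytes, ct: bytes, key: bytes) -> bytes:
    """XOR with the same keystream inverts the encryption."""
    return bytes(a ^ b for a, b in zip(ct, _keystream(key, nonce, len(ct))))
```

Because encryption happens per message rather than per table or per disk, individual records can be decrypted on demand without exposing the rest of the warehouse.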

  12. Privacy Protection Research of Mobile RFID

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

Radio Frequency Identification (RFID) is one of the most controversial technologies at present. Because it is very difficult to detect who reads a tag incorporated into a product owned by a person, significant privacy threats arise in RFID systems. User privacy is a primary consideration for mobile RFID services, because most mobile RFID services are end-user services. We propose a solution for user privacy protection, a modification of the EPC Class 1 Generation 2 protocol, and introduce a privacy protection scenario for mobile RFID services using this method.

  13. Smart Grid Privacy through Distributed Trust

    Science.gov (United States)

    Lipton, Benjamin

    Though the smart electrical grid promises many advantages in efficiency and reliability, the risks to consumer privacy have impeded its deployment. Researchers have proposed protecting privacy by aggregating user data before it reaches the utility, using techniques of homomorphic encryption to prevent exposure of unaggregated values. However, such schemes generally require users to trust in the correct operation of a single aggregation server. We propose two alternative systems based on secret sharing techniques that distribute this trust among multiple service providers, protecting user privacy against a misbehaving server. We also provide an extensive evaluation of the systems considered, comparing their robustness to privacy compromise, error handling, computational performance, and data transmission costs. We conclude that while all the systems should be computationally feasible on smart meters, the two methods based on secret sharing require much less computation while also providing better protection against corrupted aggregators. Building systems using these techniques could help defend the privacy of electricity customers, as well as customers of other utilities as they move to a more data-driven architecture.
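The additive secret-sharing idea behind such systems can be sketched in a few lines: each meter splits its reading into random shares that sum to the reading modulo a prime, sends one share to each aggregation server, and only the combination of all per-server totals reveals the aggregate. The server count, field modulus, and readings below are illustrative assumptions:

```python
import random

PRIME = 2**61 - 1  # field modulus; all arithmetic is mod PRIME

def share(reading: int, n_servers: int):
    """Split a meter reading into n additive shares that sum to the
    reading mod PRIME; any n-1 shares are uniformly random and reveal
    nothing about the reading."""
    shares = [random.randrange(PRIME) for _ in range(n_servers - 1)]
    shares.append((reading - sum(shares)) % PRIME)
    return shares

def aggregate(all_shares):
    """Each server sums the shares it received from every meter; the
    utility combines the per-server totals to recover total consumption."""
    server_totals = [sum(col) % PRIME for col in zip(*all_shares)]
    return sum(server_totals) % PRIME
```

No single server ever sees anything but uniformly random field elements, so a misbehaving aggregator learns no individual reading, matching the trust model argued for above.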

  14. PRIVACY PROTECTION PROBLEMS IN SOCIAL NETWORKS

    OpenAIRE

    OKUR, M. Cudi

    2011-01-01

    Protecting privacy has become a major concern for most social network users because of increased difficulties of controlling the online data. This article presents an assessment of the common privacy related risks of social networking sites. Open and hidden privacy risks of active and passive online profiles are examined and increasing share of social networking in these phenomena is discussed. Inadequacy of available legal and institutional protection is demonstrated and the effectiveness of...

  15. Incentivizing Verifiable Privacy-Protection Mechanisms for Offline Crowdsensing Applications.

    Science.gov (United States)

    Sun, Jiajun; Liu, Ningzhong

    2017-09-04

Incentive mechanisms for crowdsensing have recently been intensively explored. Most of these mechanisms focus on standard economic goals such as truthfulness and utility maximization. However, enormous privacy and security challenges, such as the privacy of users' costs, must be faced directly in real-life environments. In this paper, we investigate offline verifiable privacy-protection crowdsensing issues. We first present a general verifiable privacy-protection incentive mechanism for the offline homogeneous and heterogeneous sensing job models. In addition, we propose a more complex verifiable privacy-protection incentive mechanism for the offline submodular sensing job model. The two mechanisms not only address the privacy protection of users and the platform, but also ensure the verifiable correctness of payments between platform and users. Finally, we demonstrate that the two mechanisms satisfy privacy protection, verifiable correctness of payments, and the same revenue as the generic mechanism without privacy protection. Our experiments also validate that the two mechanisms are scalable, efficient, and applicable to mobile devices in auction-based crowdsensing applications, where the main incentive for the user is remuneration.

  16. The role of privacy protection in healthcare information systems adoption.

    Science.gov (United States)

    Hsu, Chien-Lung; Lee, Ming-Ren; Su, Chien-Hui

    2013-10-01

Privacy protection is an important issue and challenge in healthcare information systems (HISs). Recently, some privacy-enhanced HISs have been proposed. Users' privacy perception, intention, and attitude might affect the adoption of such systems. This paper aims to propose a privacy-enhanced HIS framework and investigate the role of privacy protection in HIS adoption. In the proposed framework, privacy protection, access control, and secure transmission modules are designed to enhance the privacy protection of a HIS. An experimental privacy-enhanced HIS is also implemented. Furthermore, we propose a research model extending the unified theory of acceptance and use of technology by considering perceived security and information security literacy, and then investigate user adoption of a privacy-enhanced HIS. The experimental results and analyses showed that user adoption of a privacy-enhanced HIS is directly affected by social influence, performance expectancy, facilitating conditions, and perceived security. Perceived security has a mediating effect between information security literacy and user adoption. This study offers several implications for research and practice to improve the design, development, and promotion of a good healthcare information system with privacy protection.

  17. 22 CFR 212.22 - Protection of personal privacy.

    Science.gov (United States)

    2010-04-01

    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Protection of personal privacy. 212.22 Section... Information for Public Inspection and Copying § 212.22 Protection of personal privacy. To the extent required to prevent a clearly unwarranted invasion of personal privacy, USAID may delete identifying details...

  18. Location privacy protection in mobile networks

    CERN Document Server

    Liu, Xinxin

    2013-01-01

This SpringerBrief analyzes the potential privacy threats in wireless and mobile network environments, and reviews some existing works. It proposes multiple privacy preserving techniques against several types of privacy threats that are targeting users in a mobile network environment. Depending on the network architecture, different approaches can be adopted. The first proposed approach considers a three-party system architecture where there is a trusted central authority that can be used to protect users' privacy. The second approach considers a totally distributed environment where users per

  19. Gender and online privacy among teens: risk perception, privacy concerns, and protection behaviors.

    Science.gov (United States)

    Youn, Seounmi; Hall, Kimberly

    2008-12-01

    Survey data from 395 high school students revealed that girls perceive more privacy risks and have a higher level of privacy concerns than boys. Regarding privacy protection behaviors, boys tended to read unsolicited e-mail and register for Web sites while directly sending complaints in response to unsolicited e-mail. This study found girls to provide inaccurate information as their privacy concerns increased. Boys, however, refrained from registering to Web sites as their concerns increased.

  20. 78 FR 76986 - Children's Online Privacy Protection Rule

    Science.gov (United States)

    2013-12-20

    ... FEDERAL TRADE COMMISSION 16 CFR Part 312 RIN 3084-AB20 Children's Online Privacy Protection Rule... published final rule amendments to the Children's Online Privacy Protection Rule on January 17, 2013 to update the requirements set forth in the notice, parental consent, confidentiality and security, and safe...

  1. 78 FR 3971 - Children's Online Privacy Protection Rule

    Science.gov (United States)

    2013-01-17

    ... functionality or content of their properties or gain greater publicity through social media in an effort to... Children's Online Privacy Protection Rule; Final Rule #0;#0;Federal Register / Vol. 78 , No. 12 / Thursday... 3084-AB20 Children's Online Privacy Protection Rule AGENCY: Federal Trade Commission (``FTC'' or...

  2. Couldn't or wouldn't? The influence of privacy concerns and self-efficacy in privacy management on privacy protection.

    Science.gov (United States)

    Chen, Hsuan-Ting; Chen, Wenghong

    2015-01-01

Sampling 515 college students, this study investigates how privacy protection behaviors, including limiting profile visibility, self-disclosure, and friending, are influenced by privacy concerns and by self-efficacy regarding one's own ability to manage privacy settings, a factor that researchers have yet to give a great deal of attention to in the context of social networking sites (SNSs). The results indicate an inconsistency in adopting strategies to protect privacy: a disconnect between limiting profile visibility and friending on the one hand and self-disclosure on the other. More specifically, privacy concerns lead SNS users to limit their profile visibility and discourage them from expanding their network; however, they do not constrain self-disclosure. Similarly, while self-efficacy in privacy management encourages SNS users to limit their profile visibility, it facilitates self-disclosure. This suggests that users who limit their profile visibility and constrain their friending behaviors will not necessarily reduce self-disclosure on SNSs, because these behaviors are predicted by different factors. In addition, the study finds an interaction effect between privacy concerns and self-efficacy in privacy management on friending. It points to the potential problem of increased risk-taking behaviors resulting from high self-efficacy in privacy management combined with low privacy concerns.

  3. 76 FR 66937 - Privacy Act of 1974; Department of Homeland Security/United States Secret Service-003 Non...

    Science.gov (United States)

    2011-10-28

    ... 1974; Department of Homeland Security/United States Secret Service--003 Non-Criminal Investigation... Security/United States Secret Service--003 Non-Criminal Investigation Information System.'' As a result of... Secret Service, 245 Murray Lane SW., Building T-5, Washington, DC 20223. For privacy issues please...

  4. Effective evaluation of privacy protection techniques in visible and thermal imagery

    Science.gov (United States)

    Nawaz, Tahir; Berg, Amanda; Ferryman, James; Ahlberg, Jörgen; Felsberg, Michael

    2017-09-01

Privacy protection may be defined as replacing the original content in an image region with less intrusive content whose target appearance information has been modified to make it less recognizable. The development of privacy protection techniques needs to be complemented with an established objective evaluation method to facilitate their assessment and comparison. Generally, existing evaluation methods rely on subjective judgments, or assume a specific target type in the image data and use target detection and recognition accuracies to assess privacy protection. An annotation-free evaluation method that is neither subjective nor assumes a specific target type is proposed. It assesses two key aspects of privacy protection: "protection" and "utility." Protection is quantified as an appearance similarity, and utility is measured as a structural similarity between original and privacy-protected image regions. We performed extensive experimentation using six challenging datasets (comprising 12 video sequences), including a new dataset (with six sequences) that contains visible and thermal imagery. The new dataset is made available online for the community. We demonstrate the effectiveness of the proposed method by evaluating six image-based privacy protection techniques, and also compare the proposed method with existing methods.
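A toy version of the protection/utility pair might look like the following, with a histogram-based appearance difference standing in for "protection" and edge survival standing in for "utility." The actual measures used by the authors are more sophisticated; the bin count, edge threshold, and test images here are arbitrary assumptions:

```python
def histogram(pixels, bins=16):
    """Normalized intensity histogram of a grayscale image (values 0-255)."""
    h = [0] * bins
    for p in pixels:
        h[min(p * bins // 256, bins - 1)] += 1
    n = len(pixels)
    return [c / n for c in h]

def protection_score(original, protected):
    """Higher when the protected region's appearance differs more from
    the original: 1 minus the histogram intersection."""
    ho, hp = histogram(original), histogram(protected)
    return 1.0 - sum(min(a, b) for a, b in zip(ho, hp))

def utility_score(original, protected, width):
    """Fraction of horizontal edges (intensity jumps > 10) that survive
    the protection filter; a crude stand-in for structural similarity."""
    def edges(px):
        return {i for i in range(len(px) - 1)
                if (i + 1) % width != 0 and abs(px[i + 1] - px[i]) > 10}
    eo, ep = edges(original), edges(protected)
    return len(eo & ep) / len(eo) if eo else 1.0
```

A good protection technique scores high on both: it changes how the region looks while keeping its structure usable for downstream tasks.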

  5. 36 CFR 902.56 - Protection of personal privacy.

    Science.gov (United States)

    2010-07-01

    ... privacy. 902.56 Section 902.56 Parks, Forests, and Public Property PENNSYLVANIA AVENUE DEVELOPMENT... Protection of personal privacy. (a) Any of the following personnel, medical, or similar records is within the... invasion of his personal privacy: (1) Personnel and background records personal to any officer or employee...

  6. Protecting Privacy and Confidentiality in Environmental Health Research.

    Science.gov (United States)

    Resnik, David B

    2010-01-01

    Environmental health researchers often need to make difficult decisions on how to protect privacy and confidentiality when they conduct research in the home or workplace. These dilemmas are different from those normally encountered in clinical research. Although protecting privacy and confidentiality is one of the most important principles of research involving human subjects, it can be overridden to prevent imminent harm to individuals or if required by law. Investigators should carefully consider the facts and circumstances and use good judgment when deciding whether to breach privacy or confidentiality.

  7. Protecting Privacy in the Global South (Phase 2) | CRDI - Centre de ...

    International Development Research Centre (IDRC) Digital Library (Canada)

The absence of appropriate privacy protections can lead to grave problems. Privacy ... Developing countries are home to the greatest number of Internet and mobile users, but such privacy protection is scarce. ...

  8. The Impact of Privacy Concerns and Perceived Vulnerability to Risks on Users Privacy Protection Behaviors on SNS: A Structural Equation Model

    OpenAIRE

    Noora Sami Al-Saqer; Mohamed E. Seliaman

    2016-01-01

    This research paper investigates Saudi users’ awareness levels about privacy policies in Social Networking Sites (SNSs), their privacy concerns and their privacy protection measures. For this purpose, a research model that consists of five main constructs namely information privacy concern, awareness level of privacy policies of social networking sites, perceived vulnerability to privacy risks, perceived response efficacy, and privacy protecting behavior was developed. An online survey questi...

  9. 45 CFR 164.520 - Notice of privacy practices for protected health information.

    Science.gov (United States)

    2010-10-01

    ... DATA STANDARDS AND RELATED REQUIREMENTS SECURITY AND PRIVACY Privacy of Individually Identifiable Health Information § 164.520 Notice of privacy practices for protected health information. (a) Standard... 45 Public Welfare 1 2010-10-01 2010-10-01 false Notice of privacy practices for protected health...

  10. Are Data Sharing and Privacy Protection Mutually Exclusive?

    Science.gov (United States)

    Joly, Yann; Dyke, Stephanie O M; Knoppers, Bartha M; Pastinen, Tomi

    2016-11-17

    We review emerging strategies to protect the privacy of research participants in international epigenome research: open consent, genome donation, registered access, automated procedures, and privacy-enhancing technologies. Copyright © 2016 Elsevier Inc. All rights reserved.

  11. Privacy rules for DNA databanks. Protecting coded 'future diaries'.

    Science.gov (United States)

    Annas, G J

    1993-11-17

    In privacy terms, genetic information is like medical information. But the information contained in the DNA molecule itself is more sensitive because it contains an individual's probabilistic "future diary," is written in a code that has only partially been broken, and contains information about an individual's parents, siblings, and children. Current rules for protecting the privacy of medical information cannot protect either genetic information or identifiable DNA samples stored in DNA databanks. A review of the legal and public policy rationales for protecting genetic privacy suggests that specific enforceable privacy rules for DNA databanks are needed. Four preliminary rules are proposed to govern the creation of DNA databanks, the collection of DNA samples for storage, limits on the use of information derived from the samples, and continuing obligations to those whose DNA samples are in the databanks.

  12. Privacy Protection Method in the Era of Cloud Computing and Big Data

    Directory of Open Access Journals (Sweden)

    Liu Ying

    2015-01-01

Cloud computing has become an academic and industrial hotspot in China in recent years. Cloud computing can help business clients manage finances more conveniently and efficiently, but it can also weaken the protection of privacy. In addition, its inherent deficiencies, such as safety issues and differing criteria, hinder its application to privacy protection. This paper analyzes the application of cloud computing and big data to privacy protection and the existing problems, and puts forward ways to promote privacy protection in the era of cloud computing and big data.

  13. Older and Wiser? Facebook Use, Privacy Concern, and Privacy Protection in the Life Stages of Emerging, Young, and Middle Adulthood

    Directory of Open Access Journals (Sweden)

    Evert Van den Broeck

    2015-11-01

A large part of the research conducted on privacy concern and protection on social networking sites (SNSs) concentrates on children and adolescents. Individuals in these developmental stages are often described as vulnerable Internet users. But how vulnerable are adults in terms of online informational privacy? This study applied a privacy boundary management approach and investigated Facebook use, privacy concern, and the application of privacy settings on Facebook by linking the results to Erikson's three stages of adulthood: emerging, young, and middle adulthood. An online survey was distributed among 18- to 65-year-old Dutch-speaking adults (N = 508, 51.8% female). Analyses revealed clear differences between the three adult age groups in terms of privacy concern, Facebook use, and privacy protection. Results indicated that respondents in young adulthood and middle adulthood were more vulnerable in terms of privacy protection than emerging adults. Clear discrepancies were found between privacy concern and protection for these age groups. More particularly, the middle adulthood group was more concerned about their privacy than the emerging adulthood and young adulthood groups, yet reported using privacy settings less frequently than the younger age groups. Emerging adults were found to be pragmatic and privacy-conscious SNS users. Young adults occupied an intermediate position, suggesting a developmental shift. The impact of generational differences is discussed, as well as implications for education and governmental action.

  14. Routes for breaching and protecting genetic privacy.

    Science.gov (United States)

    Erlich, Yaniv; Narayanan, Arvind

    2014-06-01

    We are entering an era of ubiquitous genetic information for research, clinical care and personal curiosity. Sharing these data sets is vital for progress in biomedical research. However, a growing concern is the ability to protect the genetic privacy of the data originators. Here, we present an overview of genetic privacy breaching strategies. We outline the principles of each technique, indicate the underlying assumptions, and assess their technological complexity and maturation. We then review potential mitigation methods for privacy-preserving dissemination of sensitive data and highlight different cases that are relevant to genetic applications.

  15. Protecting Your Child's Privacy Online

    Science.gov (United States)

The Federal Trade Commission (FTC) is the nation's consumer protection agency. The FTC works to prevent fraudulent, deceptive ...

  16. Do Smartphone Power Users Protect Mobile Privacy Better than Nonpower Users? Exploring Power Usage as a Factor in Mobile Privacy Protection and Disclosure.

    Science.gov (United States)

    Kang, Hyunjin; Shin, Wonsun

    2016-03-01

    This study examines how consumers' competence at using smartphone technology (i.e., power usage) affects their privacy protection behaviors. A survey conducted with smartphone users shows that power usage influences privacy protection behavior not only directly but also indirectly through privacy concerns and trust placed in mobile service providers. A follow-up experiment indicates that the effects of power usage on smartphone users' information management can be a function of content personalization. Users, high on power usage, are less likely to share personal information on personalized mobile sites, but they become more revealing when they interact with nonpersonalized mobile sites.

  17. On the comprehensibility and perceived privacy protection of indirect questioning techniques.

    Science.gov (United States)

    Hoffmann, Adrian; Waubert de Puiseau, Berenike; Schmidt, Alexander F; Musch, Jochen

    2017-08-01

    On surveys that assess sensitive personal attributes, indirect questioning aims at increasing respondents' willingness to answer truthfully by protecting confidentiality. However, the assumption that subjects understand questioning procedures fully and trust them to protect their privacy is rarely tested. In a scenario-based design, we compared four indirect questioning procedures in terms of their comprehensibility and perceived privacy protection. All indirect questioning techniques were found to be less comprehensible by respondents than a conventional direct question used for comparison. Less-educated respondents experienced more difficulties when confronted with any indirect questioning technique. Regardless of education, the crosswise model was found to be the most comprehensible among the four indirect methods. Indirect questioning in general was perceived to increase privacy protection in comparison to a direct question. Unexpectedly, comprehension and perceived privacy protection did not correlate. We recommend assessing these factors separately in future evaluations of indirect questioning.
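For the crosswise model singled out above as the most comprehensible indirect method, the prevalence of the sensitive attribute is typically estimated from the observed proportion of "same" responses. A minimal sketch of that estimator follows; the known "yes" probability p of the innocuous question is an assumption built into the survey design, and the sample numbers in the usage note are made up:

```python
def crosswise_prevalence(same_answers: int, total: int, p: float) -> float:
    """Estimate the prevalence pi of a sensitive attribute from
    crosswise-model responses.

    Respondents report only whether their answer to the sensitive
    question matches their answer to an innocuous question whose
    'yes' probability p is known in advance (p must differ from 0.5).
    With lam the observed proportion of 'same' responses:
        lam = pi * p + (1 - pi) * (1 - p)
    Solving for pi gives the estimator returned below.
    """
    if total <= 0 or p == 0.5:
        raise ValueError("need total > 0 and p != 0.5")
    lam = same_answers / total
    return (lam + p - 1) / (2 * p - 1)
```

For example, with p = 0.25 and 650 of 1,000 respondents reporting "same," the estimated prevalence is 0.20, even though no individual respondent's answer to the sensitive question is ever revealed.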

  18. Protecting privacy in data release

    CERN Document Server

    Livraga, Giovanni

    2015-01-01

This book presents a comprehensive approach to protecting sensitive information when large data collections are released by their owners. It addresses three key requirements of data privacy: the protection of data explicitly released, the protection of information not explicitly released but potentially vulnerable due to a release of other data, and the enforcement of owner-defined access restrictions to the released data. It is also the first book with a complete examination of how to enforce dynamic read and write access authorizations on released data, applicable to the emerging data outsourcing…

  19. Trade secrets protection mode of nuclear power plant

    International Nuclear Information System (INIS)

    Zeng Bin

    2015-01-01

The paper analyzes the legal environment in which nuclear power enterprises operate, and mainly discusses the business secret protection modes of China's nuclear power enterprises, with the aim of providing insights and guidance to help these enterprises protect their business secrets. Firstly, the paper briefly expounds the legal basis of business secret protection and the state of China's legislation in this regard. It then puts forward a business secret management framework and recommendations for nuclear power enterprises, along with key points in the application and protection of nuclear power business secrets. (author)

  20. Anonymous communication networks protecting privacy on the web

    CERN Document Server

    Peng, Kun

    2014-01-01

In today's interactive network environment, where various types of organizations are eager to monitor and track Internet use, anonymity is one of the most powerful resources available to counterbalance the threat of unknown spectators and to ensure Internet privacy. Addressing the demand for authoritative information on anonymous Internet usage, Anonymous Communication Networks: Protecting Privacy on the Web examines anonymous communication networks as a solution to Internet privacy concerns. It explains how anonymous communication networks make it possible for participants to communicate with…

  1. Routes for breaching and protecting genetic privacy

    OpenAIRE

    Erlich, Yaniv; Narayanan, Arvind

    2013-01-01

    We are entering an era of ubiquitous genetic information for research, clinical care and personal curiosity. Sharing these datasets is vital for progress in biomedical research. However, one growing concern is the ability to protect the genetic privacy of the data originators. Here, we present an overview of genetic privacy breaching strategies. We outline the principles of each technique, point to the underlying assumptions, and assess its technological complexity and maturati...

  2. Data protection and privacy : The age of intelligent machines

    NARCIS (Netherlands)

    Leenes, Ronald; van Brakel, Rosamunde; Gutwirth, Serge; de Hert, Paul

    2017-01-01

    This volume arises from the tenth annual International Conference on Computers, Privacy, and Data Protection (CPDP 2017) held in Brussels in January 2017, bringing together papers that offer conceptual analyses, highlight issues, propose solutions, and discuss practices regarding privacy and data

  3. Protection of industrial and business secrets in environmental protection law

    International Nuclear Information System (INIS)

    Breuer, R.

    1986-01-01

The author investigates the relation between environmental protection and data protection, especially concerning the protection of industrial and business secrets. For this kind of conflict there are only general administrative and procedural provisions, with little systematic order. Special provisions on data protection covering all aspects of industrial and business secrets, such as exist in social or tax law, are so far absent from German law. (WG) [de

  4. Scalable privacy-preserving data sharing methodology for genome-wide association studies: an application to iDASH healthcare privacy protection challenge.

    Science.gov (United States)

    Yu, Fei; Ji, Zhanglong

    2014-01-01

In response to the growing interest in genome-wide association study (GWAS) data privacy, the Integrating Data for Analysis, Anonymization and SHaring (iDASH) center organized the iDASH Healthcare Privacy Protection Challenge, with the aim of investigating the effectiveness of applying privacy-preserving methodologies to human genetic data. This paper is based on a submission to the iDASH Healthcare Privacy Protection Challenge. We apply privacy-preserving methods that are adapted from Uhler et al. 2013 and Yu et al. 2014 to the challenge's data and analyze the data utility after the data are perturbed by the privacy-preserving methods. Major contributions of this paper include a new interpretation of the χ2 statistic in a GWAS setting and new results about the Hamming distance score, a key component of one of the privacy-preserving methods.
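For context, the χ2 statistic in a GWAS setting is typically computed from a 2×2 case/control-by-allele contingency table. The sketch below shows the standard Pearson χ2 for such a table; it is a generic illustration with our own naming, not the paper's reinterpretation or code.

```python
def allelic_chi2(case_a, case_b, ctrl_a, ctrl_b):
    """Pearson chi-square statistic for a 2x2 allele count table.

    Rows: cases / controls; columns: counts of allele A / allele a.
    Uses the shortcut form N*(ad - bc)^2 / (row1 * row2 * col1 * col2).
    """
    n = case_a + case_b + ctrl_a + ctrl_b
    num = n * (case_a * ctrl_b - case_b * ctrl_a) ** 2
    den = ((case_a + case_b) * (ctrl_a + ctrl_b)
           * (case_a + ctrl_a) * (case_b + ctrl_b))
    return num / den

# A strongly associated SNP: allele A is enriched in cases.
stat = allelic_chi2(10, 20, 20, 10)  # about 6.67
```

Privacy-preserving GWAS release methods perturb tables like this one and then ask how much such test statistics (the "data utility") change.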

  5. 76 FR 48811 - Computer Matching and Privacy Protection Act of 1988

    Science.gov (United States)

    2011-08-09

    ... CORPORATION FOR NATIONAL AND COMMUNITY SERVICE Computer Matching and Privacy Protection Act of... of the Computer Matching and Privacy Protection Act of 1988 (54 FR 25818, June 19, 1989), and OMB... Security Administration (``SSA''). DATES: CNCS will file a report on the computer matching agreement with...

  6. Privacy as human flourishing: could a shift towards virtue ethics strengthen privacy protection in the age of Big Data?

    NARCIS (Netherlands)

    van der Sloot, B.

    2014-01-01

    Privacy is commonly seen as an instrumental value in relation to negative freedom, human dignity and personal autonomy. Article 8 ECHR, protecting the right to privacy, was originally coined as a doctrine protecting the negative freedom of citizens in vertical relations, that is between citizen and

  7. Privacy Protection in Cloud Using Rsa Algorithm

    OpenAIRE

    Amandeep Kaur; Manpreet Kaur

    2014-01-01

The cloud computing architecture has been in high demand nowadays. The cloud has prevailed over grid and distributed environments due to its low cost and high reliability, along with high security. However, research observes that cloud computing still has some security issues regarding privacy. Cloud brokers provide cloud services to the general public and ensure that data is protected; however, they sometimes lack security and privacy. Thus in this work...
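The RSA algorithm named in the title can be illustrated with a textbook key generation/encryption/decryption round trip. The sketch below uses toy parameters for clarity and is not the paper's cloud scheme; real deployments rely on vetted cryptographic libraries, OAEP padding, and 2048-bit or larger keys.

```python
# Textbook RSA with small primes, for illustration only.

def make_keypair(p, q, e):
    """Build an RSA keypair from two primes p, q and a public exponent e."""
    n = p * q
    phi = (p - 1) * (q - 1)
    d = pow(e, -1, phi)      # modular inverse of e mod phi (Python 3.8+)
    return (n, e), (n, d)

def encrypt(pub, m):
    n, e = pub
    return pow(m, e, n)      # c = m^e mod n

def decrypt(priv, c):
    n, d = priv
    return pow(c, d, n)      # m = c^d mod n

# Classic small-number example: p=61, q=53, e=17 gives n=3233, d=2753.
pub, priv = make_keypair(61, 53, 17)
c = encrypt(pub, 65)
m = decrypt(priv, c)         # recovers 65
```

In a cloud setting, the idea is that data encrypted under the user's public key stays opaque to the broker and storage provider, and only the private-key holder can decrypt it.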

  8. HOTEL GUEST’S PRIVACY PROTECTION IN TOURISM BUSINESS LAW

    OpenAIRE

    Oliver Radolovic

    2010-01-01

    In the tourism business law, especially in the hotel-keeper’s contract (direct, agency, allotment), the hotel-keeper assumes certain obligations to the guests, among which, in the last twenty years, the protection of the guest’s privacy is particularly emphasized. The subject of the paper is hotel guest’s privacy protection in the Croatian and comparative tourism business law. The paper is structured in a way that it analyzes, through the laws of Croatia, France, Italy, Germany, UK and USA, t...

  9. Privacy Protection in Personal Health Information and Shared Care Records

    Directory of Open Access Journals (Sweden)

    Roderick L B Neame

    2014-03-01

Full Text Available Background The protection of personal information privacy has become one of the most pressing security concerns for record keepers. Many institutions have yet to implement the essential infrastructure for data privacy protection and patient control when accessing and sharing data; even more have failed to instil a privacy and security awareness mindset and culture amongst their staff. Increased regulation, together with better compliance monitoring, has led to the imposition of increasingly significant monetary penalties for failures to protect privacy. Objective There is growing pressure in clinical environments to deliver shared patient care and to support this with integrated information. This demands that more information passes between institutions and care providers without breaching patient privacy or autonomy. This can be achieved with relatively minor enhancements of existing infrastructures and does not require extensive investment in inter-operating electronic records: indeed such investments to date have been shown not to materially improve data sharing. Requirements for Privacy There is an ethical duty as well as a legal obligation on the part of care providers (and record keepers) to keep patient information confidential and to share it only with the authorisation of the patient. To achieve this, information storage and retrieval systems and communication systems must be appropriately configured. Patients may consult clinicians anywhere and at any time: therefore their data must be available for recipient-driven retrieval under patient control and kept private.

  10. Privacy protection for personal health information and shared care records.

    Science.gov (United States)

    Neame, Roderick L B

    2014-01-01

The protection of personal information privacy has become one of the most pressing security concerns for record keepers: this will become more onerous with the introduction of the European General Data Protection Regulation (GDPR) in mid-2014. Many institutions, both large and small, have yet to implement the essential infrastructure for data privacy protection and patient consent and control when accessing and sharing data; even more have failed to instil a privacy and security awareness mindset and culture amongst their staff. Increased regulation, together with better compliance monitoring, has led to the imposition of increasingly significant monetary penalties for failure to protect privacy: these too are set to become more onerous under the GDPR, increasing to a maximum of 2% of annual turnover. There is growing pressure in clinical environments to deliver shared patient care and to support this with integrated information. This demands that more information passes between institutions and care providers without breaching patient privacy or autonomy. This can be achieved with relatively minor enhancements of existing infrastructures and does not require extensive investment in inter-operating electronic records: indeed such investments to date have been shown not to materially improve data sharing. REQUIREMENTS FOR PRIVACY: There is an ethical duty as well as a legal obligation on the part of care providers (and record keepers) to keep patient information confidential and to share it only with the authorisation of the patient. To achieve this, information storage and retrieval and communication systems must be appropriately configured. There are many components of this, which are discussed in this paper. Patients may consult clinicians anywhere and at any time: therefore, their data must be available for recipient-driven retrieval (i.e. like the World Wide Web) under patient control and kept private: a method for delivering this is outlined.

  11. LEA in Private: A Privacy and Data Protection Framework for a Learning Analytics Toolbox

    Science.gov (United States)

    Steiner, Christina M.; Kickmeier-Rust, Michael D.; Albert, Dietrich

    2016-01-01

    To find a balance between learning analytics research and individual privacy, learning analytics initiatives need to appropriately address ethical, privacy, and data protection issues. A range of general guidelines, model codes, and principles for handling ethical issues and for appropriate data and privacy protection are available, which may…

  12. From Data Privacy to Location Privacy

    Science.gov (United States)

    Wang, Ting; Liu, Ling

    Over the past decade, the research on data privacy has achieved considerable advancement in the following two aspects: First, a variety of privacy threat models and privacy principles have been proposed, aiming at providing sufficient protection against different types of inference attacks; Second, a plethora of algorithms and methods have been developed to implement the proposed privacy principles, while attempting to optimize the utility of the resulting data. The first part of the chapter presents an overview of data privacy research by taking a close examination at the achievements from the above two aspects, with the objective of pinpointing individual research efforts on the grand map of data privacy protection. As a special form of data privacy, location privacy possesses its unique characteristics. In the second part of the chapter, we examine the research challenges and opportunities of location privacy protection, in a perspective analogous to data privacy. Our discussion attempts to answer the following three questions: (1) Is it sufficient to apply the data privacy models and algorithms developed to date for protecting location privacy? (2) What is the current state of the research on location privacy? (3) What are the open issues and technical challenges that demand further investigation? Through answering these questions, we intend to provide a comprehensive review of the state of the art in location privacy research.

  13. Location Privacy Protection Based on Improved K-Value Method in Augmented Reality on Mobile Devices

    Directory of Open Access Journals (Sweden)

    Chunyong Yin

    2017-01-01

Full Text Available With the development of Augmented Reality technology, the application of location based services (LBS) is more and more popular, providing enormous convenience to people's lives. User location information can be obtained at any time and anywhere, so user location privacy faces huge threats. Therefore, it is crucial to pay attention to location privacy protection in LBS. Based on the architecture of the trusted third party (TTP), we analyzed the advantages and shortcomings of existing location privacy protection methods for LBS on mobile terminals. We then proposed an improved K-value location privacy protection method according to privacy level, which combines the k-anonymity method with the pseudonym method. Simulation results show that this improved method can anonymize all service requests effectively, and the execution-time experiment demonstrates that our proposed method realizes location privacy protection more efficiently.
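The two ingredients the record names, k-anonymity and pseudonyms under a trusted third party, can be sketched in a few lines. The following is a generic illustration of TTP-based spatial cloaking, not the paper's improved K-value method: the anonymizer enlarges a square region around the requester until it covers at least k known users, then issues the request for that region under a fresh pseudonym.

```python
import secrets

def cloak(users, requester, k, step=1.0):
    """Return a square cloaking region around `requester` covering >= k users,
    plus a one-time pseudonym, in the spirit of TTP-based k-anonymity.

    users: list of (x, y) positions known to the anonymizer (incl. requester)
    requester: (x, y) of the querying user
    k: anonymity level; step: how far to grow the region per iteration
    """
    rx, ry = requester
    half = step
    while True:
        region = (rx - half, ry - half, rx + half, ry + half)
        inside = [u for u in users
                  if region[0] <= u[0] <= region[2]
                  and region[1] <= u[1] <= region[3]]
        if len(inside) >= k:
            pseudonym = secrets.token_hex(8)  # fresh pseudonym per request
            return region, pseudonym
        half += step  # enlarge until k users are covered

# The LBS sees only the region and the pseudonym, never the exact position.
region, name = cloak([(0, 0), (1, 1), (2, 0), (5, 5), (6, 6)], (0, 0), k=3)
```

The LBS answers for the whole region, so the requester is indistinguishable from at least k-1 other users; the fresh pseudonym prevents linking successive requests.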

  14. A secure data privacy preservation for on-demand

    Directory of Open Access Journals (Sweden)

    Dhasarathan Chandramohan

    2017-04-01

Full Text Available This paper spotlights privacy and the obfuscation issues surrounding the intellectual, confidential information owned by the insurance and finance sectors: the privacy risk that arises when authorities misuse secret information, software intrusions that steal digital data in the name of third-party services, and liability for digital secrecy in business continuity. Mishandling that breaches privacy, and its prevention, are pressing concerns in the cloud, where a huge amount of data is stored and maintained. As the IT world moves toward the cloud, protecting users' privacy is becoming a big question. Although cloud computing has changed the computing field by increasing the effectiveness, efficiency and optimization of the service environment, cloud users' data and identity, and their reliability, maintainability and privacy, may vary across different CPs (cloud providers). A CP must ensure that the user's proprietary information is maintained secretly with current technologies. More remarkably, even the cloud provider may have no indication of where the digital data are stored and maintained globally in the cloud. The proposed system addresses one of the obligatory research issues in cloud computing: we put forward the Privacy Preserving Model to Prevent Digital Data Loss in the Cloud (PPM–DDLC). This proposal helps the CR (cloud requester/user) to trust their proprietary information and data stored in the cloud.

  15. Utility-preserving privacy protection of textual healthcare documents.

    Science.gov (United States)

    Sánchez, David; Batet, Montserrat; Viejo, Alexandre

    2014-12-01

The adoption of ITs by medical organisations makes possible the compilation of large amounts of healthcare data, which quite often need to be released to third parties for research or business purposes. Much of these data are of a sensitive nature, because they may include patient-related documents such as electronic healthcare records. In order to protect the privacy of individuals, several pieces of legislation on healthcare data management, which state the kind of information that should be protected, have been defined. Traditionally, to comply with current legislation, a manual redaction process is applied to patient-related documents in order to remove or black out sensitive terms. This process is costly and time-consuming and has the undesired side effect of severely reducing the utility of the released content. Automatic methods available in the literature usually propose ad hoc solutions that are limited to protecting specific types of structured information (e.g. e-mail addresses, social security numbers, etc.); as a result, they are hardly applicable to the sensitive entities stated in current regulations that do not present those structural regularities (e.g. diseases, symptoms, treatments, etc.). To tackle these limitations, in this paper we propose an automatic sanitisation method for textual medical documents (e.g. electronic healthcare records) that is able to protect, regardless of their structure, sensitive entities (e.g. diseases) and also those semantically related terms (e.g. symptoms) that may disclose the former ones. Contrary to redaction schemes based on term removal, our approach improves the utility of the protected output by replacing sensitive terms with appropriate generalisations retrieved from several medical and general-purpose knowledge bases. Experiments conducted on highly sensitive documents and in coherency with current regulations on healthcare data privacy show promising results in terms of the practical privacy and utility of the…
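The replace-with-generalisation idea can be sketched in miniature. The toy taxonomy below is our own invented example, not drawn from the paper or from any real medical knowledge base (a real system would query sources such as SNOMED CT and also generalise semantically related terms):

```python
import re

# Toy taxonomy mapping sensitive terms (and a related treatment term)
# to broader generalisations. Entries are illustrative, not clinical advice.
GENERALISATIONS = {
    "HIV": "viral infection",
    "AIDS": "immune disorder",
    "zidovudine": "antiretroviral medication",  # related term that could
}                                               # disclose the diagnosis

def sanitise(text, taxonomy=GENERALISATIONS):
    """Replace each sensitive term with its generalisation.

    Unlike black-out redaction, the output stays readable and retains
    some analytic utility.
    """
    for term, general in taxonomy.items():
        text = re.sub(rf"\b{re.escape(term)}\b", general, text)
    return text

clean = sanitise("Patient diagnosed with HIV; prescribed zidovudine.")
# -> "Patient diagnosed with viral infection; prescribed antiretroviral medication."
```

Note how the related term "zidovudine" must also be generalised, since leaving it intact would disclose the sanitised diagnosis; this is the semantic-disclosure problem the abstract highlights.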

  16. Data protection laws and privacy on Facebook

    Directory of Open Access Journals (Sweden)

    Phillip Nyoni

    2015-07-01

Full Text Available Background: Social networks have changed the way people communicate. Business processes and social interactions increasingly revolve around cyberspace. However, as these cyber technologies advance, users become more exposed to privacy threats. Regulatory frameworks and legal instruments, which currently lack a strong cyber presence, are required for the protection of users. Objectives: There is a need to explore and evaluate the extent to which users are exposed to vulnerabilities and threats in the context of the existing protection laws and policies, and to investigate how the existing legal instruments can be enhanced to better protect users. Method: This article evaluates and analyses these privacy challenges from a legalistic point of view. The study is focused on South African Facebook users. Poll information gathered from the profile pages of users at North-West University was analysed. A short survey was also conducted to validate the poll results. Descriptive statistics, including measures of central tendency and measures of spread, have been used to present the data, summarised in a meaningful combination of tables and graphs. Results: The results clearly show that the legal frameworks and laws are still evolving and that they are not adequately drafted to deal with specific cyber violations of privacy. Conclusion: This highlights the need to review legal instruments on a regular basis, with wider consultation with users, in an endeavour to develop a robust and enforceable legal framework. A proactive legal framework would be the ideal approach; unfortunately, law is reactive to cyber-crimes.

  17. Computers, privacy and data protection an element of choice

    CERN Document Server

    Gutwirth, Serge; De Hert, Paul; Leenes, Ronald

    2011-01-01

    This timely volume presents current developments in ICT and privacy/data protection. Readers will find an alternative view of the Data Protection Directive, the contentious debates on data sharing with the USA (SWIFT, PNR), and the judicial and political resistance against data retention.

  18. An Alternative View of Privacy on Facebook

    Directory of Open Access Journals (Sweden)

    Christian Fuchs

    2011-02-01

Full Text Available The predominant analysis of privacy on Facebook focuses on personal information revelation. This paper is critical of this kind of research and introduces an alternative analytical framework for studying privacy on Facebook, social networking sites and web 2.0. This framework connects the phenomenon of online privacy to the political economy of capitalism—a focus that has thus far been rather neglected in research literature about Internet and web 2.0 privacy. Liberal privacy philosophy tends to ignore the political economy of privacy in capitalism that can mask socio-economic inequality and protect capital and the rich from public accountability. This paper analyzes Facebook with the help of an approach in which privacy for dominant groups, in regard to their ability to keep wealth and power secret from the public, is seen as problematic, whereas privacy at the bottom of the power pyramid, for consumers and ordinary citizens, is seen as a protection from dominant interests. Facebook’s privacy concept is based on an understanding that stresses self-regulation and on an individualistic understanding of privacy. The theoretical analysis of the political economy of privacy on Facebook in this paper is based on the political theories of Karl Marx, Hannah Arendt and Jürgen Habermas. Based on the political economist Dallas Smythe’s concept of audience commodification, the process of prosumer commodification on Facebook is analyzed. The political economy of privacy on Facebook is analyzed with the help of a theory of drives that is grounded in Herbert Marcuse’s interpretation of Sigmund Freud, which allows one to analyze Facebook based on the concept of play labor (= the convergence of play and labor).

  19. The Protection of the Image and Privacy in France

    Directory of Open Access Journals (Sweden)

    Leonardo Estevam de Assis Zanini

    2018-03-01

Full Text Available This article analyzes the emergence and development of the protection of the image and privacy in France. It emphasizes that initially the defense of these rights was solely the work of the courts, which created rules applicable to concrete cases. The courts used the general clause of civil liability, because there was no developed doctrine on personality rights. Subsequently the matter also became an object of study for French legal scholars. Unlike Germany, which granted protection very early, France only regulated these rights with the promulgation of Law 70-643 of 17 July 1970, which introduced the right to privacy into Article 9 of the French Civil Code. This norm reinforced the protection of the personality, but it remains to be seen whether there has also been an improvement in the protection of the image in France, which we study in this article.

  20. Large-scale Health Information Database and Privacy Protection.

    Science.gov (United States)

    Yamamoto, Ryuichi

    2016-09-01

    Japan was once progressive in the digitalization of healthcare fields but unfortunately has fallen behind in terms of the secondary use of data for public interest. There has recently been a trend to establish large-scale health databases in the nation, and a conflict between data use for public interest and privacy protection has surfaced as this trend has progressed. Databases for health insurance claims or for specific health checkups and guidance services were created according to the law that aims to ensure healthcare for the elderly; however, there is no mention in the act about using these databases for public interest in general. Thus, an initiative for such use must proceed carefully and attentively. The PMDA projects that collect a large amount of medical record information from large hospitals and the health database development project that the Ministry of Health, Labour and Welfare (MHLW) is working on will soon begin to operate according to a general consensus; however, the validity of this consensus can be questioned if issues of anonymity arise. The likelihood that researchers conducting a study for public interest would intentionally invade the privacy of their subjects is slim. However, patients could develop a sense of distrust about their data being used since legal requirements are ambiguous. Nevertheless, without using patients' medical records for public interest, progress in medicine will grind to a halt. Proper legislation that is clear for both researchers and patients will therefore be highly desirable. A revision of the Act on the Protection of Personal Information is currently in progress. In reality, however, privacy is not something that laws alone can protect; it will also require guidelines and self-discipline. We now live in an information capitalization age. I will introduce the trends in legal reform regarding healthcare information and discuss some basics to help people properly face the issue of health big data and privacy

  1. Young adult females' views regarding online privacy protection at two time points.

    Science.gov (United States)

    Moreno, Megan A; Kelleher, Erin; Ameenuddin, Nusheen; Rastogi, Sarah

    2014-09-01

    Risks associated with adolescent Internet use include exposure to inappropriate information and privacy violations. Privacy expectations and policies have changed over time. Recent Facebook security setting changes heighten these risks. The purpose of this study was to investigate views and experiences with Internet safety and privacy protection among older adolescent females at two time points, in 2009 and 2012. Two waves of focus groups were conducted, one in 2009 and the other in 2012. During these focus groups, female university students discussed Internet safety risks and strategies and privacy protection. All focus groups were audio recorded and manually transcribed. Qualitative analysis was conducted at the end of each wave and then reviewed and combined in a separate analysis using the constant comparative method. A total of 48 females participated across the two waves. The themes included (1) abundant urban myths, such as the ability for companies to access private information; (2) the importance of filtering one's displayed information; and (3) maintaining age limits on social media access to avoid younger teens' presence on Facebook. The findings present a complex picture of how adolescents view privacy protection and online safety. Older adolescents may be valuable partners in promoting safe and age-appropriate Internet use for younger teens in the changing landscape of privacy. Copyright © 2014. Published by Elsevier Inc.

  2. The privacy concerns in location based services: protection approaches and remaining challenges

    OpenAIRE

    Basiri, Anahid; Moore, Terry; Hill, Chris

    2016-01-01

    Despite the growth in the developments of the Location Based Services (LBS) applications, there are still several challenges remaining. One of the most important concerns about LBS, shared by many users and service providers is the privacy. Privacy has been considered as a big threat to the adoption of LBS among many users and consequently to the growth of LBS markets. This paper discusses the privacy concerns associated with location data, and the current privacy protection approaches. It re...

  3. A Privacy-Protecting Authentication Scheme for Roaming Services with Smart Cards

    Science.gov (United States)

    Son, Kyungho; Han, Dong-Guk; Won, Dongho

In this work we propose a novel smart card based privacy-protecting authentication scheme for roaming services. Our proposal achieves so-called Class 2 privacy protection, i.e., no information identifying a roaming user or linking the user's behaviors is revealed in a visited network. It can be used to overcome the inherent structural flaws of recently issued smart card based anonymous authentication schemes. As shown in our analysis, our scheme is computationally efficient for a mobile user.

  4. Development of measures of online privacy concern and protection for use on the Internet

    OpenAIRE

    Buchanan, T; Paine, C; Joinson, A; Reips, U D

    2007-01-01

    As the Internet grows in importance, concerns about online privacy have arisen. We describe the development and validation of three short Internet-administered scales measuring privacy related attitudes ('Privacy Concern') and behaviors ('General Caution' and 'Technical Protection').

  5. Libraries Protecting Privacy on Social Media: Sharing without "Oversharing"

    Directory of Open Access Journals (Sweden)

    Kelley Cotter

    2016-11-01

    Full Text Available Libraries have increasingly adopted social media as an integral means of connecting with their users. However, social media presents many potential concerns regarding library patron privacy. This article presents the findings from a study of how librarians and library staff perceive and handle issues of patron privacy related to social media marketing in libraries. The study reports the results from a mixed-methods online survey, which used a nonprobability self-selection sampling method to collect responses from individuals employed by libraries, without restrictions on position or library type. Nearly three-quarters of respondents reported working in libraries that have either an official or unofficial social media policy. Approximately 53% of those policies mention patron privacy. The findings suggest that many respondents’ views and practices are influenced by the perception of the library’s physical space and social media presence as public places. The findings also suggest a lack of consensus regarding the extent of the library’s obligation to protect patron privacy on library social media sites and what would constitute a violation of privacy.

  6. Privacy protection for patients with substance use problems

    Directory of Open Access Journals (Sweden)

    Hu LL

    2011-12-01

Full Text Available Lianne Lian Hu,1 Steven Sparenborg,2 Betty Tai2; 1Department of Preventive Medicine and Biometrics, Uniformed Services University of the Health Sciences; 2Center for the Clinical Trials Network, National Institute on Drug Abuse, National Institutes of Health, Bethesda, MD. Abstract: Many Americans with substance use problems will have opportunities to receive coordinated health care through the integration of primary care and specialty care for substance use disorders under the Patient Protection and Affordable Care Act of 2010. Sharing of patient health records among care providers is essential to realize the benefits of electronic health records. Health information exchange through meaningful use of electronic health records can improve health care safety, quality, and efficiency. Implementation of electronic health records and health information exchange presents great opportunities for health care integration, but also makes patient privacy potentially vulnerable. Privacy issues are paramount for patients with substance use problems. This paper discusses major differences between two federal privacy laws associated with health care for substance use disorders, identifies health care problems created by privacy policies, and describes potential solutions to these problems through technology innovation and policy improvement. Keywords: substance abuse, patient privacy, electronic health records, health information exchange

  7. Accountability as a Way Forward for Privacy Protection in the Cloud

    Science.gov (United States)

    Pearson, Siani; Charlesworth, Andrew

    The issue of how to provide appropriate privacy protection for cloud computing is important, and as yet unresolved. In this paper we propose an approach in which procedural and technical solutions are co-designed to demonstrate accountability as a path forward to resolving jurisdictional privacy and security risks within the cloud.

  8. 32 CFR 9.9 - Protection of State secrets.

    Science.gov (United States)

    2010-07-01

    ... FOR TRIALS BY MILITARY COMMISSIONS OF CERTAIN NON-UNITED STATES CITIZENS IN THE WAR AGAINST TERRORISM § 9.9 Protection of State secrets. Nothing in this part shall be construed to authorize disclosure of... 32 National Defense 1 2010-07-01 2010-07-01 false Protection of State secrets. 9.9 Section 9.9...

  9. Privacy Bridges: EU and US Privacy Experts In Search of Transatlantic Privacy Solutions

    NARCIS (Netherlands)

    Abramatic, J.-F.; Bellamy, B.; Callahan, M.E.; Cate, F.; van Eecke, P.; van Eijk, N.; Guild, E.; de Hert, P.; Hustinx, P.; Kuner, C.; Mulligan, D.; O'Connor, N.; Reidenberg, J.; Rubinstein, I.; Schaar, P.; Shadbolt, N.; Spiekermann, S.; Vladeck, D.; Weitzner, D.J.; Zuiderveen Borgesius, F.; Hagenauw, D.; Hijmans, H.

    2015-01-01

    The EU and US share a common commitment to privacy protection as a cornerstone of democracy. Following the Treaty of Lisbon, data privacy is a fundamental right that the European Union must proactively guarantee. In the United States, data privacy derives from constitutional protections in the

  10. Protecting location privacy for outsourced spatial data in cloud storage.

    Science.gov (United States)

    Tian, Feng; Gui, Xiaolin; An, Jian; Yang, Pan; Zhao, Jianqiang; Zhang, Xuejun

    2014-01-01

As cloud computing services and location-aware devices are fully developed, a large amount of spatial data needs to be outsourced to the cloud storage provider, so research on privacy protection for outsourced spatial data is receiving increasing attention from academia and industry. As a kind of spatial transformation method, the Hilbert curve is widely used to protect the location privacy of spatial data. But sufficient security analysis of the standard Hilbert curve (SHC) has seldom been performed. In this paper, we propose an index modification method for SHC (SHC(∗)) and a density-based space filling curve (DSC) to improve the security of SHC; they can partially violate the distance-preserving property of SHC, so as to achieve better security. We formally define the indistinguishability and attack model for measuring the privacy disclosure risk of spatial transformation methods. The evaluation results indicate that SHC(∗) and DSC are more secure than SHC, and DSC achieves the best index generation performance.
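As background, the distance-preserving mapping at issue can be illustrated with the textbook iterative algorithm for the standard 2-D Hilbert index. This sketch is only the plain SHC-style mapping, not the paper's SHC(∗) or DSC constructions, and the function name is illustrative:

```python
def xy2d(n, x, y):
    """Map a point (x, y) on an n-by-n grid (n a power of two) to its
    one-dimensional index along the standard Hilbert curve."""
    d = 0
    s = n // 2
    while s > 0:
        rx = 1 if (x & s) > 0 else 0
        ry = 1 if (y & s) > 0 else 0
        d += s * s * ((3 * rx) ^ ry)
        # rotate the quadrant so the sub-curve has standard orientation
        if ry == 0:
            if rx == 1:
                x = s - 1 - x
                y = s - 1 - y
            x, y = y, x
        s //= 2
    return d

# each cell of a 4x4 grid receives a distinct index in 0..15
indices = sorted(xy2d(4, x, y) for x in range(4) for y in range(4))
```

Because nearby cells usually map to nearby indices, the indices alone still leak relative proximity; weakening exactly this property is what SHC(∗) and DSC aim for.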

  11. Large-scale Health Information Database and Privacy Protection

    Science.gov (United States)

    YAMAMOTO, Ryuichi

    2016-01-01

Japan was once progressive in the digitalization of healthcare fields but unfortunately has fallen behind in terms of the secondary use of data for public interest. There has recently been a trend to establish large-scale health databases in the nation, and a conflict between data use for public interest and privacy protection has surfaced as this trend has progressed. Databases for health insurance claims or for specific health checkups and guidance services were created according to the law that aims to ensure healthcare for the elderly; however, there is no mention in the act about using these databases for public interest in general. Thus, an initiative for such use must proceed carefully and attentively. The PMDA projects that collect a large amount of medical record information from large hospitals and the health database development project that the Ministry of Health, Labour and Welfare (MHLW) is working on will soon begin to operate according to a general consensus; however, the validity of this consensus can be questioned if issues of anonymity arise. The likelihood that researchers conducting a study for public interest would intentionally invade the privacy of their subjects is slim. However, patients could develop a sense of distrust about their data being used since legal requirements are ambiguous. Nevertheless, without using patients’ medical records for public interest, progress in medicine will grind to a halt. Proper legislation that is clear for both researchers and patients will therefore be highly desirable. A revision of the Act on the Protection of Personal Information is currently in progress. In reality, however, privacy is not something that laws alone can protect; it will also require guidelines and self-discipline. We now live in an information capitalization age. I will introduce the trends in legal reform regarding healthcare information and discuss some basics to help people properly face the issue of health big data and privacy.

  12. A Strategy toward Collaborative Filter Recommended Location Service for Privacy Protection

    Science.gov (United States)

    Wang, Peng; Yang, Jing; Zhang, Jianpei

    2018-01-01

A new collaborative filtering recommendation strategy was proposed for the existing privacy and security issues in location services. In this strategy, every user establishes his/her own position profiles according to their daily position data, which is preprocessed using a density clustering method. Then, density prioritization was used to choose similar user groups as service request responders, and the neighboring users in the chosen groups recommended appropriate location services using a collaborative filtering recommendation algorithm. The two filter algorithms, based on position profile similarity and position point similarity measures respectively, were designed for the recommendation. At the same time, the homomorphic encryption method was used to transfer location data for effective protection of privacy and security. A real location dataset was applied to test the proposed strategy and the results showed that the strategy provides better location service and protects users’ privacy. PMID:29751670

  13. A Strategy toward Collaborative Filter Recommended Location Service for Privacy Protection.

    Science.gov (United States)

    Wang, Peng; Yang, Jing; Zhang, Jianpei

    2018-05-11

A new collaborative filtering recommendation strategy was proposed for the existing privacy and security issues in location services. In this strategy, every user establishes his/her own position profiles according to their daily position data, which is preprocessed using a density clustering method. Then, density prioritization was used to choose similar user groups as service request responders, and the neighboring users in the chosen groups recommended appropriate location services using a collaborative filtering recommendation algorithm. The two filter algorithms, based on position profile similarity and position point similarity measures respectively, were designed for the recommendation. At the same time, the homomorphic encryption method was used to transfer location data for effective protection of privacy and security. A real location dataset was applied to test the proposed strategy and the results showed that the strategy provides better location service and protects users' privacy.

  14. Towards quantitative evaluation of privacy protection schemes for electricity usage data sharing

    Directory of Open Access Journals (Sweden)

    Daisuke Mashima

    2018-03-01

    Full Text Available Thanks to the roll-out of smart meters, availability of fine-grained electricity usage data has rapidly grown. Such data has enabled utility companies to perform robust and efficient grid operations. However, at the same time, privacy concerns associated with sharing and disclosure of such data have been raised. In this paper, we first demonstrate the feasibility of estimating privacy-sensitive household attributes based solely on the energy usage data of residential customers. We then discuss a framework to measure privacy gain and evaluate the effectiveness of customer-centric privacy-protection schemes, namely redaction of data irrelevant to services and addition of bounded artificial noise. Keywords: Privacy, Smart meter data, Quantitative evaluation

  15. Efficient task assignment in spatial crowdsourcing with worker and task privacy protection

    KAUST Repository

    Liu, An

    2017-08-01

Spatial crowdsourcing (SC) outsources tasks to a set of workers who are required to physically move to specified locations and accomplish tasks. Recently, it is emerging as a promising tool for emergency management, as it enables efficient and cost-effective collection of critical information in emergencies such as earthquakes, when searching for and rescuing survivors in potential areas is required. However, in current SC systems, task locations and worker locations are all exposed in public without any privacy protection. SC systems, if attacked, thus carry a potential risk of privacy leakage. In this paper, we propose a protocol for protecting the privacy of both workers and task requesters while maintaining the functionality of SC systems. The proposed protocol is built on partially homomorphic encryption schemes, and can efficiently realize complex operations required during task assignment over encrypted data through a well-designed computation strategy. We prove that the proposed protocol is privacy-preserving against semi-honest adversaries. Simulation on two real-world datasets shows that the proposed protocol is more effective than existing solutions and can achieve mutual privacy preservation with acceptable computation and communication cost.
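A partially homomorphic scheme of the kind such protocols build on can be sketched with textbook Paillier encryption, whose ciphertexts support addition of the underlying plaintexts. This is a toy sketch with small demo primes, not the paper's protocol; real systems use vetted libraries with moduli of about 2048 bits:

```python
import math
import random

def keygen(p, q):
    """Toy Paillier key generation from two demo primes."""
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)          # valid because we fix the generator g = n + 1
    return n, (lam, mu)

def encrypt(n, m):
    """Encrypt message m < n with fresh randomness r."""
    n2 = n * n
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(n, lam, mu, c):
    n2 = n * n
    l = (pow(c, lam, n2) - 1) // n   # the L function: L(u) = (u - 1) / n
    return (l * mu) % n

# additive homomorphism: multiplying ciphertexts adds the plaintexts
n, (lam, mu) = keygen(1009, 1013)
c_sum = (encrypt(n, 30) * encrypt(n, 12)) % (n * n)
```

Here `decrypt(n, lam, mu, c_sum)` recovers 42 without either encryptor revealing its input, which is the building block that lets a server compare or combine encrypted locations during task assignment.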

  16. A Strategy toward Collaborative Filter Recommended Location Service for Privacy Protection

    Directory of Open Access Journals (Sweden)

    Peng Wang

    2018-05-01

Full Text Available A new collaborative filtering recommendation strategy was proposed for the existing privacy and security issues in location services. In this strategy, every user establishes his/her own position profiles according to their daily position data, which is preprocessed using a density clustering method. Then, density prioritization was used to choose similar user groups as service request responders, and the neighboring users in the chosen groups recommended appropriate location services using a collaborative filtering recommendation algorithm. The two filter algorithms, based on position profile similarity and position point similarity measures respectively, were designed for the recommendation. At the same time, the homomorphic encryption method was used to transfer location data for effective protection of privacy and security. A real location dataset was applied to test the proposed strategy and the results showed that the strategy provides better location service and protects users’ privacy.

  17. DE-IDENTIFICATION TECHNIQUE FOR IOT WIRELESS SENSOR NETWORK PRIVACY PROTECTION

    Directory of Open Access Journals (Sweden)

    Yennun Huang

    2017-02-01

Full Text Available As the IoT ecosystem becomes more and more mature, hardware and software vendors are trying to create new value by connecting all kinds of devices together via IoT. IoT devices are usually equipped with sensors to collect data, and the data collected are transmitted over the air via different kinds of wireless connection. To extract the value of the data collected, the data owner may choose to seek third-party help with data analysis, or even release the data to the public for more insight. In this scenario it is important to protect the released data from privacy leakage. Here we propose that differential privacy, as a de-identification technique, can be a useful approach to add privacy protection to the data released, as well as to prevent the collected data from being intercepted and decoded during over-the-air transmission. A way to increase the accuracy of count queries performed on edge cases in a synthetic database is also presented in this research.
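As a hedged illustration of differential privacy applied to count queries (a standard construction, not necessarily the exact mechanism of the study above), the Laplace mechanism adds noise scaled to the query's sensitivity, which is 1 for a count:

```python
import math
import random

def laplace_noise(scale, rng=random):
    """Draw one sample from Laplace(0, scale) via inverse-CDF sampling."""
    u = rng.random() - 0.5          # uniform on [-0.5, 0.5)
    return -scale * math.copysign(math.log(1.0 - 2.0 * abs(u)), u)

def dp_count(records, predicate, epsilon, rng=random):
    """epsilon-differentially private count: a count query has
    sensitivity 1, so noise is drawn from Laplace(1/epsilon)."""
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon, rng)
```

Smaller epsilon means more noise and stronger privacy; improving accuracy for sparse edge-case counts, as the article proposes, would sit on top of a baseline mechanism like this.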

  18. Genetic privacy and confidentiality: why they are so hard to protect.

    Science.gov (United States)

    Rothstein, M A

    1998-01-01

    Author notes that widespread concerns have been raised about protecting genetic privacy and confidentiality in insurance and employment. He argues that effective protections are difficult because complicated issues, such as the right of access to health care, are invariably implicated.

  19. A smart-card-enabled privacy preserving E-prescription system.

    Science.gov (United States)

    Yang, Yanjiang; Han, Xiaoxi; Bao, Feng; Deng, Robert H

    2004-03-01

Within the overall context of protection of health care information, privacy of prescription data needs special treatment. First, the involvement of diverse parties, especially nonmedical parties, in the process of drug prescription complicates the protection of prescription data. Second, both patients and doctors have privacy stakes in prescription, and their privacy should be equally protected. Third, the following facts determine that prescription should not be processed in a truly anonymous manner: certain involved parties conduct useful research on the basis of aggregation of prescription data that are linkable with respect to either the patients or the doctors; prescription data have to be identifiable in some extreme circumstances, e.g., under a court order, for inspection and liability assignment. In this paper, we propose an e-prescription system to address issues pertaining to privacy protection in the process of drug prescription. In our system, patients' smart cards play an important role. For one thing, the smart cards are implemented as portable repositories carrying up-to-date personal medical records and insurance information, providing doctors instant data access crucial to the process of diagnosis and prescription. For another, with the secret signing key stored inside, the smart card enables the patient to sign the prescription pad electronically, declaring his acceptance of the prescription. To make the system more realistic, we identify the need for a patient to delegate his signing capability to other people so as to protect the privacy of information housed on his card. A strong proxy signature scheme achieving technologically mutual agreements on the delegation is proposed to implement the delegation functionality.

  20. Privacy protected text analysis in DataSHIELD

    Directory of Open Access Journals (Sweden)

    Rebecca Wilson

    2017-04-01

    Whilst it is possible to analyse free text within a DataSHIELD infrastructure, the challenge is creating generalised and resilient anti-disclosure methods for free text analysis. There are a range of biomedical and health sciences applications for DataSHIELD methods of privacy protected analysis of free text including analysis of electronic health records and analysis of qualitative data e.g. from social media.

  1. Privacy in Digital Age: Dead or Alive?! Regarding the New EU Data Protection Regulations

    Directory of Open Access Journals (Sweden)

    Seyed Ebrahim Dorraji

    2015-02-01

Full Text Available Purpose – To review and critically discuss the current state of privacy in the context of constant technological changes and to emphasize the pace of technological advancements and developments reached over the time when the last EU data protection laws came into effect. These facts inevitably affect the perception of privacy and raise the question of whether privacy is dead or taking its last breath in the digital age. This paper is an attempt to address this question. Design/Methodology/Approach – Based on the comparison and systematic analysis of scientific literature, the authors discuss problematic issues related to privacy and data protection in the technology era – where these issues are too complicated to be clearly regulated by laws and rules since “laws move as a function of years and technology moves as a function of months” (Ron Rivest. Therefore, this analytical approach towards the issue may help to facilitate reaching the best-fit decision in this area. Findings – The authors emphasize the change in the perception of privacy, which originated and grew on the idea of “an integral part of our humanity”, the “heart of our liberty” and “the beginning of all freedoms” (Solove, 2008, leading to the recently raised idea that privacy is under severe threat. The authors are of the opinion that legislation and regulation may be one of the best and most effective techniques for protecting privacy in the twenty-first century, but it is not currently adequate (Wacks, 2012. One of the solutions lies in technology design. Research limitations/implications – The aspects of privacy and data protection in the European Union have been widely discussed recently because of their broad applicability. Therefore, it is hardly possible to review and cover all the important aspects of the issue. This article focuses on the roles of technology and legislation in securing privacy. The authors examine and provide their own views based on

  2. An Adaptive Privacy Protection Method for Smart Home Environments Using Supervised Learning

    Directory of Open Access Journals (Sweden)

    Jingsha He

    2017-03-01

    Full Text Available In recent years, smart home technologies have started to be widely used, bringing a great deal of convenience to people’s daily lives. At the same time, privacy issues have become particularly prominent. Traditional encryption methods can no longer meet the needs of privacy protection in smart home applications, since attacks can be launched even without the need for access to the cipher. Rather, attacks can be successfully realized through analyzing the frequency of radio signals, as well as the timestamp series, so that the daily activities of the residents in the smart home can be learnt. Such types of attacks can achieve a very high success rate, making them a great threat to users’ privacy. In this paper, we propose an adaptive method based on sample data analysis and supervised learning (SDASL, to hide the patterns of daily routines of residents that would adapt to dynamically changing network loads. Compared to some existing solutions, our proposed method exhibits advantages such as low energy consumption, low latency, strong adaptability, and effective privacy protection.

  3. Improving privacy protection in the area of behavioural targeting

    NARCIS (Netherlands)

    Zuiderveen Borgesius, F.J.

    2014-01-01

    This PhD thesis discusses how European law could improve privacy protection in the area of behavioural targeting. Behavioural targeting, also referred to as online profiling, involves monitoring people’s online behaviour, and using the collected information to show people individually targeted

  4. Standpoints and protection of business secrets

    Directory of Open Access Journals (Sweden)

    Brane Bertoncelj

    2001-06-01

Full Text Available The human impact on an information system in which databases containing business secrets are stored is one of the most unreliable and unforeseeable factors. For this reason, it must not be underestimated. The results of this study indicate a correlation between behavioural intention and the protection of business secrets. There is a statistically significant correlation between behavioural intention and behavioural supervision. This means that an increased level of perceived supervision over one's own behaviour is related to behavioural intention. A great majority of participants would not divulge a business secret due to internal moral factors, i.e., they possess the appropriate capabilities to determine the advantages of social moral values over personal values.

  5. Privacy preservation and information security protection for patients' portable electronic health records.

    Science.gov (United States)

    Huang, Lu-Chou; Chu, Huei-Chung; Lien, Chung-Yueh; Hsiao, Chia-Hung; Kao, Tsair

    2009-09-01

    As patients face the possibility of copying and keeping their electronic health records (EHRs) through portable storage media, they will encounter new risks to the protection of their private information. In this study, we propose a method to preserve the privacy and security of patients' portable medical records in portable storage media to avoid any inappropriate or unintentional disclosure. Following HIPAA guidelines, the method is designed to protect, recover and verify patient's identifiers in portable EHRs. The results of this study show that our methods are effective in ensuring both information security and privacy preservation for patients through portable storage medium.
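One conventional way to protect and later verify identifiers in portable records, in the spirit of the HIPAA-guided method described above (the paper's exact scheme is not reproduced here), is keyed pseudonymization with an HMAC:

```python
import hashlib
import hmac

def pseudonymize(identifier: str, key: bytes) -> str:
    """Replace an identifier with a deterministic keyed token (HMAC-SHA256).
    Without the secret key the token cannot be linked back or forged."""
    return hmac.new(key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

def verify(identifier: str, token: str, key: bytes) -> bool:
    """Re-derive the token and compare in constant time."""
    return hmac.compare_digest(pseudonymize(identifier, key), token)
```

Unlike a bare hash, the keyed construction resists dictionary attacks on guessable identifiers such as names or record numbers, while authorized parties holding the key can still re-verify an identity, matching the protect/recover/verify goals the abstract describes.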

  6. Balance between Privacy Protecting and Selling User Data of Wearable Devices

    OpenAIRE

    Huang, Kuang-Chiu; Hsu, Jung-Fang

    2017-01-01

Smart bracelets are capable of identifying individual data, which can synchronize the step count, mileage, calorie consumption, heart rate, sleeping data and even the pictures users uploaded with the APP. This feature is convenient on one hand but makes us lose control of our privacy on the other. With poor privacy protection mechanisms embedded in these wearable devices, hackers can easily invade them and steal user data. In addition, most smart bracelet companies have not made a clear...

  7. A privacy protection for an mHealth messaging system

    Science.gov (United States)

    Aaleswara, Lakshmipathi; Akopian, David; Chronopoulos, Anthony T.

    2015-03-01

In this paper, we propose a new software system that employs features that help the organization to comply with USA HIPAA regulations. The system uses SMS as the primary way of communication to transfer information. Lack of knowledge about some diseases is still a major reason for the spread of some harmful diseases. The developed system includes different features that may help communication among low-income people who don't even have access to the internet. Since the software system deals with Personal Health Information (PHI), it is equipped with an access-control authentication mechanism to protect privacy. The system is analyzed for performance to identify how much overhead the privacy rules impose.

  8. Perspectives of Australian adults about protecting the privacy of their health information in statistical databases.

    Science.gov (United States)

    King, Tatiana; Brankovic, Ljiljana; Gillard, Patricia

    2012-04-01

Assuring individuals that their personal health information is de-identified reduces their concern about the necessity of consent for releasing health information for research purposes, but many people are not aware that removing their names and other direct identifiers from medical records does not guarantee full privacy protection for their health information. Privacy concerns decrease as extra security measures are introduced to protect privacy. Therefore, instead of "tailoring concern" as proposed by Willison, we suggest improving the privacy protection of personal information by introducing additional security measures in data publishing. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
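The observation that removing names does not guarantee privacy is commonly demonstrated with quasi-identifiers such as ZIP code, birth year, and sex. A minimal k-anonymity check (a standard technique, not taken from the surveyed study) makes this concrete:

```python
from collections import Counter

def is_k_anonymous(rows, quasi_identifiers, k):
    """True if every combination of quasi-identifier values occurs in at
    least k records, so no individual stands out in a group smaller than k."""
    groups = Counter(tuple(row[c] for c in quasi_identifiers) for row in rows)
    return min(groups.values()) >= k

rows = [
    {"zip": "3050", "birth_year": 1978, "sex": "F", "diagnosis": "flu"},
    {"zip": "3050", "birth_year": 1978, "sex": "F", "diagnosis": "asthma"},
    {"zip": "3051", "birth_year": 1983, "sex": "M", "diagnosis": "flu"},
]
# the single 3051/1983/M record is unique, so the table is not 2-anonymous
```

An attacker who knows someone's ZIP code, birth year, and sex can link that unique row back to the person even though no name appears, which is the "extra security measures in data publishing" gap the study highlights.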

  9. The Influence of Security Statement, Technical Protection, and Privacy on Satisfaction and Loyalty; A Structural Equation Modeling

    Science.gov (United States)

    Peikari, Hamid Reza

Customer satisfaction and loyalty have been cited as critical success factors for e-commerce, and various studies have been conducted to find the antecedent determinants of these concepts in online transactions. One of the variables suggested by some studies is perceived security. However, these studies have treated security from a broad, general perspective, and no attempts have been made to study specific security-related variables. This paper studies the influence of security statement and technical protection on satisfaction, loyalty and privacy. The data was collected from 337 respondents and, after the reliability and validity tests, path analysis was applied to examine the hypotheses. The results suggest that loyalty is influenced by satisfaction and security statement, and no empirical support was found for the influence of technical protection and privacy on loyalty. Moreover, it was found that security statement and technical protection have a positive significant influence on satisfaction, while no significant effect was found for privacy. Furthermore, the analysis indicated that security statement has a positive significant influence on technical protection, while technical protection was found to have a significant negative impact on perceived privacy.

  10. Hybrid Paradigm from European and America Concerning Privacy and Personal Data Protection in Indonesia

    Directory of Open Access Journals (Sweden)

    Edmon Makarim

    2013-05-01

Full Text Available In the emerging era of information and technology, the importance of privacy and data protection has been growing ever greater. However, despite this common concern in society, there is some confusion about how to differentiate and delimit the discussion of privacy from the protection of personal data, and these topics are even blended with spamming issues. Drawing on a comparison of European and US legal perspectives, this paper therefore discusses the problem from the standpoint of the laws governing communication itself.

  11. A compressive sensing based secure watermark detection and privacy preserving storage framework.

    Science.gov (United States)

    Qia Wang; Wenjun Zeng; Jun Tian

    2014-03-01

    Privacy is a critical issue when the data owners outsource data storage or processing to a third party computing service, such as the cloud. In this paper, we identify a cloud computing application scenario that requires simultaneously performing secure watermark detection and privacy preserving multimedia data storage. We then propose a compressive sensing (CS)-based framework using secure multiparty computation (MPC) protocols to address such a requirement. In our framework, the multimedia data and secret watermark pattern are presented to the cloud for secure watermark detection in a CS domain to protect the privacy. During CS transformation, the privacy of the CS matrix and the watermark pattern is protected by the MPC protocols under the semi-honest security model. We derive the expected watermark detection performance in the CS domain, given the target image, watermark pattern, and the size of the CS matrix (but without the CS matrix itself). The correctness of the derived performance has been validated by our experiments. Our theoretical analysis and experimental results show that secure watermark detection in the CS domain is feasible. Our framework can also be extended to other collaborative secure signal processing and data-mining applications in the cloud.
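The feasibility claim rests on the fact that random projections approximately preserve inner products, so watermark correlation can be evaluated on compressive measurements alone. A toy sketch of that idea follows (a plain random sensing matrix, no MPC protocols or security model; all names are illustrative):

```python
import random

random.seed(1)
n, m = 512, 128                      # signal length, number of CS measurements
A = [[random.gauss(0.0, 1.0) for _ in range(n)] for _ in range(m)]

def project(v):
    """Compressive measurements y = A v."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def cs_score(signal, watermark):
    """Correlation detector evaluated entirely in the CS domain."""
    ps, pw = project(signal), project(watermark)
    return sum(a * b for a, b in zip(ps, pw)) / m

watermark = [random.choice((-1.0, 1.0)) for _ in range(n)]
host = [random.gauss(0.0, 5.0) for _ in range(n)]
marked = [h + 0.8 * w for h, w in zip(host, watermark)]
```

A detector that sees only `project(signal)` (never the raw signal) still separates marked from unmarked content, because the score of the marked signal exceeds the host's by roughly the watermark energy; in the paper this projection step is additionally wrapped in MPC so neither the CS matrix nor the watermark pattern is revealed.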

  12. Battling for the Rights to Privacy and Data Protection in the Irish Courts

    Directory of Open Access Journals (Sweden)

    Shane Darcy

    2015-02-01

    Full Text Available Far-reaching mass surveillance by the US National Security Agency and other national security services has brought issues of privacy and data protection to the fore in recent years. Information and technology companies have been embroiled in this scandal for having shared, unwittingly or otherwise, users’ personal data with the security services. Facebook, the world’s largest social media company, has long-been criticised by privacy advocates because of its treatment of users’ data. Proceedings before the Irish courts concerning the role of national data protection authorities have seen an examination of these practices in light of relevant Irish and EU law.

  13. Biomedical databases: protecting privacy and promoting research.

    Science.gov (United States)

    Wylie, Jean E; Mineau, Geraldine P

    2003-03-01

    When combined with medical information, large electronic databases of information that identify individuals provide superlative resources for genetic, epidemiology and other biomedical research. Such research resources increasingly need to balance the protection of privacy and confidentiality with the promotion of research. Models that do not allow the use of such individual-identifying information constrain research; models that involve commercial interests raise concerns about what type of access is acceptable. Researchers, individuals representing the public interest and those developing regulatory guidelines must be involved in an ongoing dialogue to identify practical models.

  14. The Secret Life of Your Classmates: Understanding Communication Privacy Management

    Science.gov (United States)

    Nodulman, Jessica A.

    2011-01-01

    This article presents an activity that combines this popular website, Postsecret.com, with college students' love for the internet, and course content on privacy boundaries and theory, disclosure, communicative control, and privacy rule development. By taking part in this activity, students practice privacy disclosure and are able to examine their…

  15. Privacy Policies

    NARCIS (Netherlands)

    Dekker, M.A.C.; Etalle, Sandro; den Hartog, Jeremy; Petkovic, M.; Jonker, W.; Jonker, Willem

    2007-01-01

    Privacy is a prime concern in today's information society. To protect the privacy of individuals, enterprises must follow certain privacy practices, while collecting or processing personal data. In this chapter we look at the setting where an enterprise collects private data on its website,

  16. Privacy policies

    NARCIS (Netherlands)

    Dekker, M.A.C.; Etalle, S.; Hartog, den J.I.; Petkovic, M.; Jonker, W.

    2007-01-01

    Privacy is a prime concern in today’s information society. To protect the privacy of individuals, enterprises must follow certain privacy practices while collecting or processing personal data. In this chapter we look at the setting where an enterprise collects private data on its website, processes

  17. Using genetic information while protecting the privacy of the soul.

    Science.gov (United States)

    Moor, J H

    1999-01-01

    Computing plays an important role in genetics (and vice versa). Theoretically, computing provides a conceptual model for the function and malfunction of our genetic machinery. Practically, contemporary computers and robots equipped with advanced algorithms make the revelation of the complete human genome imminent--computers are about to reveal our genetic souls for the first time. Ethically, computers help protect privacy by restricting access in sophisticated ways to genetic information. But the inexorable fact that computers will increasingly collect, analyze, and disseminate abundant amounts of genetic information made available through the genetic revolution, not to mention that inexpensive computing devices will make genetic information gathering easier, underscores the need for strong and immediate privacy legislation.

  18. A Legal Approach to Civilian Use of Drones in Europe. Privacy and Personal Data Protection Concerns

    OpenAIRE

    Pauner Chulvi, Cristina; Viguri Cordero, Jorge Agustín

    2015-01-01

Drones are a growth industry, evolving quickly from military to civilian uses; however, they have the potential to pose a serious risk to security, privacy and data protection. After a first stage focused on safety issues, Europe is facing the challenge of developing a regulatory framework for the integration of drones into the airspace system while safeguarding the guarantees of fundamental rights and civil liberties. This paper analyses the potential privacy and data protection risks ...

  19. Protection of the right to privacy in the practice of the European Court of Human Rights

    Directory of Open Access Journals (Sweden)

    Mladenov Marijana

    2013-01-01

    Full Text Available The right to privacy is a fundamental human right and an essential component of the protection of human autonomy and freedom. The development of science and information systems creates various opportunities for interference with the physical and moral integrity of a person. Therefore, it is necessary to determine the precise content of the right to privacy. The European Convention on Human Rights and Fundamental Freedoms guarantees this right under Article 8. The European Court of Human Rights has not precisely defined the content of the right to privacy, and applicants have therefore been able to bring many different aspects of life within the scope of respect for private life. According to the Court, the concept of privacy and private life includes the following areas of human life: the right to establish and maintain relationships with other human beings, the protection of the physical and moral integrity of persons, the protection of personal data, the change of a personal name, and various issues related to sexual orientation and transgender persons. This paper examines the aforementioned spheres of human life in light of the interpretation of Article 8 of the Convention.

  20. Energy-efficient privacy protection for smart home environments using behavioral semantics.

    Science.gov (United States)

    Park, Homin; Basaran, Can; Park, Taejoon; Son, Sang Hyuk

    2014-09-02

    Research on smart environments saturated with ubiquitous computing devices is rapidly advancing while raising serious privacy issues. According to recent studies, privacy concerns significantly hinder widespread adoption of smart home technologies. Previous work has shown that it is possible to infer the activities of daily living within environments equipped with wireless sensors by monitoring radio fingerprints and traffic patterns. Since data encryption cannot prevent privacy invasions exploiting transmission pattern analysis and statistical inference, various methods based on fake data generation for concealing traffic patterns have been studied. In this paper, we describe an energy-efficient, light-weight, low-latency algorithm for creating dummy activities that are semantically similar to the observed phenomena. By using these cloaking activities, the amount of fake data transmissions can be flexibly controlled to support a trade-off between energy efficiency and privacy protection. According to the experiments using real data collected from a smart home environment, our proposed method can extend the lifetime of the network by more than 2× compared to the previous methods in the literature. Furthermore, the activity cloaking method supports low latency transmission of real data while also significantly reducing the accuracy of the wireless snooping attacks.
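
    The dummy-activity idea described in this abstract can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: the activity clusters, names, and the fake_ratio knob are all assumptions (the paper derives behavioral semantics from real sensor data).

    ```python
    import random

    # Hypothetical semantic clusters of daily activities (assumption; the paper
    # learns semantics from sensor data rather than using hard-coded groups).
    CLUSTERS = {
        "kitchen": ["cooking", "dishwashing", "making_coffee"],
        "rest": ["sleeping", "reading", "watching_tv"],
    }

    def cloak(activity, fake_ratio, rng=random):
        """Return the real activity plus semantically similar dummy activities.

        fake_ratio controls the energy/privacy trade-off: more dummies conceal
        the real activity better but cost more (fake) transmissions.
        """
        cluster = next(c for c in CLUSTERS.values() if activity in c)
        dummies = [a for a in cluster if a != activity]
        n_fake = round(fake_ratio * len(dummies))
        return [activity] + rng.sample(dummies, n_fake)

    print(cloak("cooking", 1.0))  # real activity first, then its cluster mates
    ```

    With fake_ratio = 0 only the real activity is transmitted (maximum energy efficiency, no concealment); with fake_ratio = 1 the whole semantic cluster is emitted, so a traffic-pattern observer cannot tell which activity actually occurred.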

  1. A Quantum Private Query Protocol for Enhancing both User and Database Privacy

    Science.gov (United States)

    Zhou, Yi-Hua; Bai, Xue-Wei; Li, Lei-Lei; Shi, Wei-Min; Yang, Yu-Guang

    2018-01-01

    In order to protect the privacy of both the query user and the database, several QKD-based quantum private query (QPQ) protocols have been proposed. Unfortunately, some of them cannot perfectly resist internal attacks from the database, while others ensure better user privacy only at the cost of reduced database privacy. In this paper, a novel two-way QPQ protocol is proposed to ensure the privacy of both sides of the communication. In our protocol, the user prepares the initial quantum states and derives each key bit by comparing the initial state with the outcome state returned from the database in ctrl or shift mode, instead of announcing two non-orthogonal qubits as other protocols do, which may leak part of the secret information. In this way, not only is the privacy of the database ensured, but user privacy is also strengthened. Furthermore, our protocol is loss-tolerant, cheat-sensitive, and resistant to the JM attack. Supported by National Natural Science Foundation of China under Grant Nos. U1636106, 61572053, 61472048, 61602019, 61502016; Beijing Natural Science Foundation under Grant Nos. 4152038, 4162005; Basic Research Fund of Beijing University of Technology (No. X4007999201501); The Scientific Research Common Program of Beijing Municipal Commission of Education under Grant No. KM201510005016

  2. The problem of using trade secrets in economic relations

    Directory of Open Access Journals (Sweden)

    А. О. Олефір

    2015-05-01

    Full Text Available Problem setting. In a market economy with increased competition between enterprises, concepts such as business information, trade secrets, know-how, confidential information, and information with restricted access become increasingly important. Given that patent protection alone cannot meet the needs of researchers, in addition to formal public protection secured by legal means we would like to draw attention to private legal measures, in particular the trade-secret regime. Recent research and publications analysis. Different aspects of the protection of trade secrets were investigated by specialists such as G. Androschuk, J. Berzhye, I. Davydov, O. Davydyuk, D. Zadyhaylo, P. Kraynov, G. Nikiforov, S. Nikiforov, V. Rubanov, E. Solovyov, L. Hoffman, V. Chaplygin, A. Cherniavsky and others. However, there is at present no comprehensive research on this legal phenomenon that is equally useful for innovators and for businesses that actively protect corporate security. Paper objective. This article aims to determine the legal characteristics, structural elements, and mechanisms by which the use of trade secrets in business has a positive impact on innovation development and the corporate security of entities. Paper main body. On the basis of the requirements of Art. 505 of the Civil Code of Ukraine and Art. 39 of the TRIPS Agreement, we formulated the attributes under which commercial information receives legal protection as an object of intellectual property: (1) secrecy (real or potential), in the sense that the information, as a whole or in the precise combination of its components, is not generally known or readily accessible to persons in the circles that normally deal with such information; (2) commercial value (not purely industrial), due to its secrecy; the information is unknown to others, which is of commercial interest; (3) the lawful holder of the information takes active special measures (technical, organizational, legal) to preserve its secrecy.

  3. PRUB: A Privacy Protection Friend Recommendation System Based on User Behavior

    Directory of Open Access Journals (Sweden)

    Wei Jiang

    2016-01-01

    Full Text Available The fast-developing social network is a double-edged sword: providing users with excellent mobile social network services while also protecting their private data remains a serious problem. Most popular social applications use the behavior of users to build connections with people having similar behavior, thus improving the user experience. However, many users do not want to share certain behavioral information with the recommendation system. In this paper, we design a secure friend recommendation system based on user behavior, called PRUB. The proposed system aims to achieve fine-grained recommendation of friends who share some of the same characteristics without exposing actual user behavior. We tested our system on three months of anonymized data from a Chinese ISP that records user browsing behavior. The experiment result shows that our system achieves a remarkable recommendation goal and, at the same time, protects the privacy of user behavior information.

  4. Taiwan's perspective on electronic medical records' security and privacy protection: lessons learned from HIPAA.

    Science.gov (United States)

    Yang, Che-Ming; Lin, Herng-Ching; Chang, Polun; Jian, Wen-Shan

    2006-06-01

    The protection of patients' health information is a very important concern in the information age. The purpose of this study is to ascertain what constitutes an effective legal framework for protecting both the security and privacy of health information, especially electronic medical records. Bills regarding electronic medical data protection have been proposed around the world, including the Health Insurance Portability and Accountability Act (HIPAA) of the U.S. The trend toward a centralized bill that focuses on managing computerized health information deserves our further attention. Under the sponsorship of Taiwan's Department of Health (DOH), our expert panel drafted the "Medical Information Security and Privacy Protection Guidelines", which identify nine principles and entail 12 articles, in the hope that medical organizations will have an effective reference for managing their medical information in a confidential and secure fashion, especially in electronic transactions.

  5. Towards quantum-based privacy and voting

    International Nuclear Information System (INIS)

    Hillery, Mark; Ziman, Mario; Buzek, Vladimir; Bielikova, Martina

    2006-01-01

    The privacy of communicating participants is often of paramount importance, and in some situations it is an essential condition. A typical example is fair (secret) voting. We analyze in detail communication privacy based on quantum resources, and we propose new quantum protocols. Possible generalizations that would lead to voting schemes are discussed.

  6. Understanding Factors Associated with Singaporean Adolescents' Intention to Adopt Privacy Protection Behavior Using an Extended Theory of Planned Behavior.

    Science.gov (United States)

    Ho, Shirley S; Lwin, May O; Yee, Andrew Z H; Lee, Edmund W J

    2017-09-01

    Using an extended theory of planned behavior (TPB), this study explores how the original TPB variables (attitude, subjective norms, and perceived behavioral control), personality traits, privacy concern, past privacy protection behaviors (PPBs), as well as parental mediation strategies relate to adolescents' intention to engage in privacy protection measures. We administered a cross-sectional survey to a nationally representative sample of adolescents (N = 4,920) in Singapore. The sample comprised 50.5 percent females and 49.5 percent males with age ranging from 13 to 21 years (M = 14.73). Results from the hierarchical regression analysis showed that the proposed extended TPB model received partial support. Subjective norms, among the TPB and other factors, have the strongest relationship with adolescents' intention to engage in PPBs on social network sites. Adolescents' privacy concern and their past PPBs are more important in influencing their future PPB compared with personality traits such as neuroticism and extraversion. Adolescents whose parents have engaged in regulated parental mediation are more likely to protect their privacy on SNSs compared with adolescents whose parents have adopted active mediation style.

  7. Privacy vs security

    CERN Document Server

    Stalla-Bourdillon, Sophie; Ryan, Mark D

    2014-01-01

    Securing privacy in the current environment is one of the great challenges of today's democracies. Privacy vs. Security explores the issues of privacy and security and their complicated interplay, from a legal and a technical point of view. Sophie Stalla-Bourdillon provides a thorough account of the legal underpinnings of the European approach to privacy and examines their implementation through privacy, data protection and data retention laws. Joshua Philips and Mark D. Ryan focus on the technological aspects of privacy, in particular, on today's attacks on privacy by the simple use of today'

  8. Location-Related Privacy in Geo-Social Networks

    DEFF Research Database (Denmark)

    Ruiz Vicente, Carmen; Freni, Dario; Bettini, Claudio

    2011-01-01

    -ins." However, this ability to reveal users' locations causes new privacy threats, which in turn call for new privacy-protection methods. The authors study four privacy aspects central to these social networks - location, absence, co-location, and identity privacy - and describe possible means of protecting...... privacy in these circumstances....

  9. Privacy and Security Issues Surrounding the Protection of Data Generated by Continuous Glucose Monitors.

    Science.gov (United States)

    Britton, Katherine E; Britton-Colonnese, Jennifer D

    2017-03-01

    Being able to track, analyze, and use data from continuous glucose monitors (CGMs), and from the platforms and apps that communicate with CGMs, helps achieve better outcomes and can advance the understanding of diabetes. The risks to patients' expectation of privacy are great, and their ability to control how their information is collected, stored, and used is virtually nonexistent. Patients' physical security is also at risk if adequate cybersecurity measures are not taken. Currently, data privacy and security protections are not robust enough to address these risks, and this stymies the current and future benefits of CGMs and the platforms and apps that communicate with them.

  10. From privacy to data protection in the EU : Implications for big data health research

    NARCIS (Netherlands)

    Mostert, Menno; Bredenoord, Annelien L.; Van Der Slootb, Bart; Van Delden, Johannes J.M.

    2018-01-01

    The right to privacy has usually been considered as the most prominent fundamental right to protect in data-intensive (Big Data) health research. Within the European Union (EU), however, the right to data protection is gaining relevance as a separate fundamental right that should in particular be

  11. Energy-Efficient Privacy Protection for Smart Home Environments Using Behavioral Semantics

    Directory of Open Access Journals (Sweden)

    Homin Park

    2014-09-01

    Full Text Available Research on smart environments saturated with ubiquitous computing devices is rapidly advancing while raising serious privacy issues. According to recent studies, privacy concerns significantly hinder widespread adoption of smart home technologies. Previous work has shown that it is possible to infer the activities of daily living within environments equipped with wireless sensors by monitoring radio fingerprints and traffic patterns. Since data encryption cannot prevent privacy invasions exploiting transmission pattern analysis and statistical inference, various methods based on fake data generation for concealing traffic patterns have been studied. In this paper, we describe an energy-efficient, light-weight, low-latency algorithm for creating dummy activities that are semantically similar to the observed phenomena. By using these cloaking activities, the amount of fake data transmissions can be flexibly controlled to support a trade-off between energy efficiency and privacy protection. According to the experiments using real data collected from a smart home environment, our proposed method can extend the lifetime of the network by more than 2× compared to the previous methods in the literature. Furthermore, the activity cloaking method supports low latency transmission of real data while also significantly reducing the accuracy of the wireless snooping attacks.

  12. End-to-End Privacy Protection for Facebook Mobile Chat based on AES with Multi-Layered MD5

    Directory of Open Access Journals (Sweden)

    Wibisono Sukmo Wardhono

    2018-01-01

    Full Text Available As social media environments become more interactive and the number of users grows tremendously, privacy is a matter of increasing concern. When personal data become a commodity, a social media company can share user data with other parties, such as governments. Facebook, Inc. is one social media company that is frequently asked for users' data. Although such requests pass through a formal and valid legal process, they still undermine the fundamental right to information privacy. In this case, social media users need protection against privacy violations by the social media platform provider itself. Private chat is the most popular feature of social media, and inside a chat room users share private information. Cryptography is one data protection method that can hide private communication from unauthorized parties. In this study, we propose a system that encrypts chat content based on AES and multi-layered MD5 to ensure that social media users have privacy protection against a social media company that treats user information as a commodity. In addition, the system lets users conveniently share their private information through the social media platform.
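
    A rough sketch of the key-derivation half of such a scheme follows. The abstract does not specify the exact construction, so the function name and layer count below are assumptions; the AES encryption step itself would use a cryptography library (e.g. pycryptodome) with the derived 16-byte key.

    ```python
    import hashlib

    def layered_md5_key(passphrase, layers=3):
        """Derive a 16-byte key by applying MD5 repeatedly (multi-layered MD5)."""
        digest = passphrase.encode("utf-8")
        for _ in range(layers):
            digest = hashlib.md5(digest).digest()
        return digest

    key = layered_md5_key("chat-room-secret")
    print(len(key))  # → 16, the AES-128 key size
    ```

    Layering MD5 makes each key a deterministic function of the shared passphrase, so both chat endpoints can derive the same AES key without transmitting it; note that MD5 is not collision-resistant, which is presumably why the paper layers it rather than using a single pass.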

  13. Privacy-leakage codes for biometric authentication systems

    NARCIS (Netherlands)

    Ignatenko, T.; Willems, F.M.J.

    2014-01-01

    In biometric privacy-preserving authentication systems that are based on key-binding, two terminals observe two correlated biometric sequences. The first terminal selects a secret key, which is independent of the biometric data, binds this secret key to the observed biometric sequence and

  14. Network Security Hacks Tips & Tools for Protecting Your Privacy

    CERN Document Server

    Lockhart, Andrew

    2009-01-01

    This second edition of Network Security Hacks offers 125 concise and practical hacks, including more information for Windows administrators, hacks for wireless networking (such as setting up a captive portal and securing against rogue hotspots), and techniques to ensure privacy and anonymity, including ways to evade network traffic analysis, encrypt email and files, and protect against phishing attacks. System administrators looking for reliable answers will also find concise examples of applied encryption, intrusion detection, logging, trending, and incident response.

  15. Large-scale Health Information Database and Privacy Protection*1

    OpenAIRE

    YAMAMOTO, Ryuichi

    2016-01-01

    Japan was once progressive in the digitalization of healthcare fields but unfortunately has fallen behind in terms of the secondary use of data for public interest. There has recently been a trend to establish large-scale health databases in the nation, and a conflict between data use for public interest and privacy protection has surfaced as this trend has progressed. Databases for health insurance claims or for specific health checkups and guidance services were created according to the law...

  16. Ambiguity in Social Network Data for Presence, Sensitive-Attribute, Degree and Relationship Privacy Protection.

    Science.gov (United States)

    Rajaei, Mehri; Haghjoo, Mostafa S; Miyaneh, Eynollah Khanjari

    2015-01-01

    Maintaining privacy in network data publishing is a major challenge, because known characteristics of individuals can be used to extract new information about them. Recently, researchers have developed privacy methods based on k-anonymity and l-diversity to prevent re-identification or sensitive-label disclosure through certain structural information. However, most of these studies have considered only structural information and have been developed for undirected networks. Furthermore, most existing approaches rely on generalization and node clustering and so may entail significant information loss, as all properties of all members of each group are generalized to the same value. In this paper, we introduce a framework for protecting the sensitive attributes, degrees (the number of connected entities), and relationships, as well as the presence of individuals, in directed social network data whose nodes contain attributes. First, we define a privacy model that specifies privacy requirements for the above private information. Then, we introduce the technique of Ambiguity in Social Network data (ASN), based on anatomy, which specifies how to publish social network data. To employ ASN, individuals are partitioned into groups. ASN then publishes the exact values of the properties of the individuals of each group, with a common group ID, in several tables. The lossy join of those tables based on group ID injects uncertainty into any reconstruction of the original network. We also show how to measure different privacy requirements in ASN. Simulation results on real and synthetic datasets demonstrate that our framework, which protects against four types of private-information disclosure, preserves data utility in the tabular, topological, and spectral aspects of networks at a satisfactory level.
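
    A toy sketch of the anatomy idea that ASN builds on (the record layout and grouping rule here are illustrative assumptions, not ASN itself): identifying and sensitive columns are published in separate tables linked only by a group ID, so joining them back together is lossy.

    ```python
    # Toy records: (name, age, diagnosis); the name is never published.
    records = [
        ("alice", 34, "flu"), ("bob", 36, "hiv"),
        ("carol", 52, "flu"), ("dave", 55, "cold"),
    ]

    def group_of(i):
        return i // 2  # partition members into groups of two (toy grouping rule)

    # Quasi-identifier table and sensitive-attribute table share only the group ID.
    qi_table = [(age, group_of(i)) for i, (_, age, _) in enumerate(records)]
    sa_table = [(group_of(i), diag) for i, (_, _, diag) in enumerate(records)]

    print(qi_table)  # [(34, 0), (36, 0), (52, 1), (55, 1)]
    print(sa_table)  # [(0, 'flu'), (0, 'hiv'), (1, 'flu'), (1, 'cold')]
    # Within a group, either sensitive value could belong to either member,
    # yet every published value is exact, avoiding generalization loss.
    ```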

  17. Privacy Protection on Multiple Sensitive Attributes

    Science.gov (United States)

    Li, Zhen; Ye, Xiaojun

    In recent years, a privacy model called k-anonymity has gained popularity for microdata releasing. As microdata may contain multiple sensitive attributes about an individual, the protection of multiple sensitive attributes has become an important problem. Unlike the existing models for a single sensitive attribute, the extra associations among multiple sensitive attributes should be investigated. Two kinds of disclosure scenarios may arise from these logical associations. Q&S Diversity is checked to prevent the foregoing disclosure risks, with an α Requirement definition used to ensure the diversity requirement. Finally, a two-step greedy generalization algorithm carries out the multiple-sensitive-attribute processing, dealing with quasi-identifiers and sensitive attributes respectively. We reduce the overall distortion by the measure of Masking SA.
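
    The generalization step such algorithms perform on quasi-identifiers can be sketched as follows. The field names, bucket widths, and record layout are illustrative assumptions, not the paper's algorithm: exact quasi-identifier values are coarsened so that several records share the same published values.

    ```python
    def generalize_record(rec, age_width=10, zip_digits=3):
        """Coarsen quasi-identifiers so records fall into larger anonymity groups."""
        lo = (rec["age"] // age_width) * age_width
        hidden = len(rec["zip"]) - zip_digits
        return {
            "age": f"{lo}-{lo + age_width - 1}",            # exact age -> range
            "zip": rec["zip"][:zip_digits] + "*" * hidden,  # ZIP -> prefix
            "diagnosis": rec["diagnosis"],  # sensitive attribute kept exact
        }

    print(generalize_record({"age": 34, "zip": "94110", "diagnosis": "flu"}))
    # {'age': '30-39', 'zip': '941**', 'diagnosis': 'flu'}
    ```

    Every record whose age falls in 30-39 and whose ZIP starts with 941 now publishes identical quasi-identifiers, which is what makes re-identification within the group ambiguous; the distortion the abstract mentions is exactly the precision lost in this coarsening.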

  18. Genetic privacy.

    Science.gov (United States)

    Sankar, Pamela

    2003-01-01

    During the past 10 years, the number of genetic tests performed more than tripled, and public concern about genetic privacy emerged. The majority of states and the U.S. government have passed regulations protecting genetic information. However, research has shown that concerns about genetic privacy are disproportionate to known instances of information misuse. Beliefs in genetic determinacy explain some of the heightened concern about genetic privacy. Discussion of the debate over genetic testing within families illustrates the most recent response to genetic privacy concerns.

  19. A Utility Maximizing and Privacy Preserving Approach for Protecting Kinship in Genomic Databases.

    Science.gov (United States)

    Kale, Gulce; Ayday, Erman; Tastan, Oznur

    2017-09-12

    Rapid and low-cost sequencing of genomes has enabled the widespread use of genomic data in research studies and personalized customer applications, where genomic data are shared in public databases. Although the identities of the participants are anonymized in these databases, sensitive information about individuals can still be inferred. One such piece of information is kinship. We define two routes by which kinship privacy can leak and propose a technique to protect kinship privacy against these risks while maximizing the utility of the shared data. The method involves systematic identification of minimal portions of genomic data to mask as new participants are added to the database. Choosing the proper positions to hide is cast as an optimization problem in which the number of positions to mask is minimized subject to privacy constraints that ensure the familial relationships are not revealed. We evaluate the proposed technique on real genomic data. Results indicate that concurrent sharing of data pertaining to a parent and an offspring results in high risks to kinship privacy, whereas sharing data from more distant relatives together is often safer. We also show that the arrival order of family members has a high impact on the level of privacy risk and on the utility of sharing data. Available at: https://github.com/tastanlab/Kinship-Privacy. erman@cs.bilkent.edu.tr or oznur.tastan@cs.bilkent.edu.tr. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  20. A Privacy Model for RFID Tag Ownership Transfer

    Directory of Open Access Journals (Sweden)

    Xingchun Yang

    2017-01-01

    Full Text Available The ownership of an RFID tag is often transferred from one owner to another during its life cycle. To address the privacy problems caused by tag ownership transfer, we propose a tag privacy model that captures the adversary's abilities to obtain secret information inside readers, to corrupt tags, to authenticate tags, and to observe tag ownership transfer processes. This model gives formal definitions of tag forward privacy and backward privacy and can be used to measure the privacy properties of tag ownership transfer schemes. We also present a tag ownership transfer scheme that is privacy-preserving under the proposed model and satisfies the other common security requirements, in addition to achieving better performance.

  1. Regulating Online Data Privacy

    OpenAIRE

    Paul Reid

    2004-01-01

    With existing data protection laws proving inadequate in the fight to protect online data privacy, and with the offline law of privacy in a state of change and uncertainty, the search for an alternative solution to the important problem of online data privacy should commence. Given the inherent problem of jurisdiction that the Internet presents, such a solution would best come from a multi-national body with the power to approximate laws in as many jurisdictions as possible, with a recognised au...

  2. Courts, privacy and data protection in Belgium : Fundamental rights that might as well be struck from the constitution

    NARCIS (Netherlands)

    de Hert, Paul; Brkan, Maja; Psychogiopoulou, Evangelia

    2017-01-01

    Through critical analysis of case law in Belgium courts, this chapter reveals the significant role courts play in the protection of privacy and personal data within the new technological environment. It addresses the pressing question from a public who are increasingly aware of their privacy rights

  3. Fast implementation of length-adaptive privacy amplification in quantum key distribution

    International Nuclear Information System (INIS)

    Zhang Chun-Mei; Li Mo; Huang Jing-Zheng; Li Hong-Wei; Li Fang-Yi; Wang Chuan; Yin Zhen-Qiang; Chen Wei; Han Zhen-Fu; Treeviriyanupab Patcharapong; Sripimanwat Keattisak

    2014-01-01

    Post-processing is indispensable in quantum key distribution (QKD), which aims at sharing secret keys between two distant parties. It mainly consists of key reconciliation and privacy amplification, which are used for sharing identical keys and for distilling unconditionally secure keys, respectively. In this paper, we focus on speeding up the privacy amplification process by choosing a simple multiplicative universal class of hash functions. By constructing an optimal multiplication algorithm based on four basic multiplication algorithms, we give a fast software implementation of length-adaptive privacy amplification. "Length-adaptive" indicates that the implementation of privacy amplification automatically adapts to different lengths of input blocks. When the lengths of the input blocks are 1 Mbit and 10 Mbit, the speed of privacy amplification can be as fast as 14.86 Mbps and 10.88 Mbps, respectively. Thus, it is practical for GHz or even higher repetition-frequency QKD systems. (general)
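
    The multiplicative universal hash family referred to here can be sketched as follows; the block sizes are toy values chosen for illustration (the paper operates on ~1 Mbit blocks with optimized multiplication), and treating the key as a single integer is a simplification.

    ```python
    import secrets

    def mult_universal_hash(x, a, in_bits, out_bits):
        """h_a(x) = (a * x mod 2^in_bits) >> (in_bits - out_bits), with a odd."""
        return ((a * x) % (1 << in_bits)) >> (in_bits - out_bits)

    # Privacy amplification: compress a reconciled (partially secret) key into
    # a shorter key on which an eavesdropper has negligible information.
    in_bits, out_bits = 64, 16              # toy sizes for illustration
    reconciled_key = secrets.randbits(in_bits)
    a = secrets.randbits(in_bits) | 1       # random odd multiplier, publicly announced
    final_key = mult_universal_hash(reconciled_key, a, in_bits, out_bits)
    assert 0 <= final_key < 1 << out_bits
    ```

    Because the hash reduces to one modular multiplication and a shift, it is far cheaper than matrix-based (e.g. Toeplitz) constructions, which is the source of the speed-up the abstract reports.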

  4. Toward Privacy-Preserving Personalized Recommendation Services

    Directory of Open Access Journals (Sweden)

    Cong Wang

    2018-02-01

    Full Text Available Recommendation systems are crucially important for the delivery of personalized services to users. With personalized recommendation services, users can enjoy a variety of targeted recommendations such as movies, books, ads, restaurants, and more. In addition, personalized recommendation services have become extremely effective revenue drivers for online business. Despite the great benefits, deploying personalized recommendation services typically requires the collection of users’ personal data for processing and analytics, which undesirably makes users susceptible to serious privacy violation issues. Therefore, it is of paramount importance to develop practical privacy-preserving techniques to maintain the intelligence of personalized recommendation services while respecting user privacy. In this paper, we provide a comprehensive survey of the literature related to personalized recommendation services with privacy protection. We present the general architecture of personalized recommendation systems, the privacy issues therein, and existing works that focus on privacy-preserving personalized recommendation services. We classify the existing works according to their underlying techniques for personalized recommendation and privacy protection, and thoroughly discuss and compare their merits and demerits, especially in terms of privacy and recommendation accuracy. We also identify some future research directions. Keywords: Privacy protection, Personalized recommendation services, Targeted delivery, Collaborative filtering, Machine learning

  5. 48 CFR 39.105 - Privacy.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Privacy. 39.105 Section 39... CONTRACTING ACQUISITION OF INFORMATION TECHNOLOGY General 39.105 Privacy. Agencies shall ensure that contracts for information technology address protection of privacy in accordance with the Privacy Act (5 U.S.C...

  6. Secret rate - Privacy leakage in biometric systems

    NARCIS (Netherlands)

    Ignatenko, T.; Willems, F.M.J.

    2009-01-01

    Ahlswede and Csiszár [1993] introduced the concept of secret sharing. In their source model two terminals observe two correlated sequences. It is the objective of the terminals to form a common secret by interchanging a public message (helper data) in such a way that the secrecy leakage is

  7. Authentication Without Secrets

    Energy Technology Data Exchange (ETDEWEB)

    Pierson, Lyndon G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Robertson, Perry J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-11-01

    This work examines a new approach to authentication, which is the most fundamental security primitive that underpins all cyber security protections. Current Internet authentication techniques require the protection of one or more secret keys, along with integrity protection of the algorithms/computations designed to prove possession of the secret without actually revealing it. Protecting a secret requires physical barriers or encryption with yet another secret key. The reason to strive for "Authentication without Secret Keys" is that protecting secrets (even small ones kept only in a small corner of a component or device) is much harder than protecting the integrity of information that is not secret. Promising methods are examined for authentication of components, data, programs, network transactions, and/or individuals. The successful development of authentication without secret keys will enable far more tractable system security engineering for high-exposure, high-consequence systems by eliminating the need for brittle protection mechanisms to protect secret keys (such as are now protected in smart cards, etc.). This paper is a re-release of SAND2009-7032 with new figures and numerous edits.

  8. Secure Mix-Zones for Privacy Protection of Road Network Location Based Services Users

    Directory of Open Access Journals (Sweden)

    Rubina S. Zuberi

    2016-01-01

    Full Text Available Privacy has been found to be the major impediment to the provision of Location-Based Services in the wide sense, and hence the area most in need of work. With the emergence of smart, easily portable, communicating devices, information acquisition is reaching new domains. The work presented here extends ongoing work toward achieving privacy for present-day communication techniques, and emphasizes one of the most effective real-time privacy-enhancement techniques, called Mix-Zones. In this paper, we present a model of a secure road network with Mix-Zones that are activated on the basis of spatial as well as temporal factors. The temporal factors are determined by the amount of traffic and its flow. The paper also discusses the importance of the number of Mix-Zones a user traverses and their mixing effectiveness. Our simulations, which are required for real-time treatment of the problem, show that the proposed transient Mix-Zones are part of a viable and robust solution toward the road-network privacy protection of the communicating moving objects of the present scenario.

  9. Protecting Privacy of Shared Epidemiologic Data without Compromising Analysis Potential

    Directory of Open Access Journals (Sweden)

    John Cologne

    2012-01-01

    Full Text Available Objective. Ensuring privacy of research subjects when epidemiologic data are shared with outside collaborators involves masking (modifying) the data, but overmasking can compromise utility (analysis potential). Methods of statistical disclosure control for protecting privacy may be impractical for individual researchers involved in small-scale collaborations. Methods. We investigated a simple approach based on measures of disclosure risk and analytical utility that are straightforward for epidemiologic researchers to derive. The method is illustrated using data from the Japanese Atomic-bomb Survivor population. Results. Masking by modest rounding did not adequately enhance security but rounding to remove several digits of relative accuracy effectively reduced the risk of identification without substantially reducing utility. Grouping or adding random noise led to noticeable bias. Conclusions. When sharing epidemiologic data, it is recommended that masking be performed using rounding. Specific treatment should be determined separately in individual situations after consideration of the disclosure risks and analysis needs.
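    The rounding-based masking the authors recommend can be sketched as rounding each value to a fixed number of significant digits (a minimal illustration, not the paper's exact procedure; `sig_digits` is a hypothetical parameter): the low-order digits that carry most of the re-identification risk are discarded, while the leading digits that carry most of the analytic information are kept.

```python
import math

def mask_round(value, sig_digits=2):
    """Mask a numeric value by rounding to `sig_digits`
    significant digits, i.e. removing several digits of
    relative accuracy as the abstract describes."""
    if value == 0:
        return 0.0
    magnitude = math.floor(math.log10(abs(value)))
    factor = 10.0 ** (magnitude - sig_digits + 1)
    return round(value / factor) * factor
```

    For example, with two significant digits a dose-like value 1.2345678 is released as roughly 1.2 and 0.0456789 as roughly 0.046, so two records differing only in trailing digits become indistinguishable.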

  10. Protecting Privacy of Shared Epidemiologic Data without Compromising Analysis Potential

    International Nuclear Information System (INIS)

    Cologne, J.; Nakashima, E.; Funamoto, S.; Grant, E.J.; Chen, Y.; Hiroaki Katayama, H.

    2012-01-01

    Objective. Ensuring privacy of research subjects when epidemiologic data are shared with outside collaborators involves masking (modifying) the data, but overmasking can compromise utility (analysis potential). Methods of statistical disclosure control for protecting privacy may be impractical for individual researchers involved in small-scale collaborations. Methods. We investigated a simple approach based on measures of disclosure risk and analytical utility that are straightforward for epidemiologic researchers to derive. The method is illustrated using data from the Japanese Atomic-bomb Survivor population. Results. Masking by modest rounding did not adequately enhance security but rounding to remove several digits of relative accuracy effectively reduced the risk of identification without substantially reducing utility. Grouping or adding random noise led to noticeable bias. Conclusions. When sharing epidemiologic data, it is recommended that masking be performed using rounding. Specific treatment should be determined separately in individual situations after consideration of the disclosure risks and analysis needs.

  11. Server-Aided Verification Signature with Privacy for Mobile Computing

    Directory of Open Access Journals (Sweden)

    Lingling Xu

    2015-01-01

    Full Text Available With the development of wireless technology, much data communication and processing has been conducted on mobile devices with wireless connections. Although mobile devices will improve in absolute ability, they will always be resource-poor relative to static ones and therefore cannot process some expensive computational tasks with their constrained computational resources. To address this problem, server-aided computing has been studied, in which power-constrained mobile devices can outsource expensive computations to a server with powerful resources in order to reduce their computational load. However, in existing server-aided verification signature schemes, the server can learn some information about the message-signature pair to be verified, which is undesirable, especially when the message includes secret information. In this paper, we study server-aided verification signatures with privacy, in which the message-signature pair to be verified is protected from the server. Two definitions of privacy for server-aided verification signatures are presented under collusion attacks between the server and the signer. Then, based on existing signatures, two concrete server-aided verification signature schemes with privacy are proposed and proved secure.

  12. Secret-key rates and privacy leakage in biometric systems

    NARCIS (Netherlands)

    Ignatenko, T.

    2009-01-01

    In this thesis both the generation of secret keys from biometric data and the binding of secret keys to biometric data are investigated. These secret keys can be used to regulate access to sensitive data, services, and environments. In a biometric secrecy system a secret key is generated or chosen

  13. A Case Study on Differential Privacy

    OpenAIRE

    Asseffa, Samrawit; Seleshi, Bihil

    2017-01-01

    Throughout the ages, human beings prefer to keep most things secret and brand this overall state with the title of privacy. Like most significant terms, privacy tends to create controversy regarding the extent of its flexible boundaries, since various technological advancements are slowly leaching away the power people have over their own information. Even as cell phone brands release new upgrades, the ways in which information is communicated has drastically increased, in turn facilitating t...

  14. DLP system and the secret of personal correspondence

    Directory of Open Access Journals (Sweden)

    Mavrinskaya T.V.

    2017-04-01

    Full Text Available According to the authors, the number of threats to information security increases every day, and this requires increased resources (systems of information protection) in organizations and enterprises. There are many information security tools with different functionality, but the main means of preventing information leakage is the Data Loss Prevention (DLP) system. Establishing control over leaks of confidential information raises a number of questions about the conformity of such measures with legislation and regulations. This article examines the compliance of DLP system functionality with the provisions and requirements of legislation on the protection of family and personal secrets, as well as with the constitutional right of citizens to privacy of correspondence.

  15. 34 CFR 98.4 - Protection of students' privacy in examination, testing, or treatment.

    Science.gov (United States)

    2010-07-01

    ... 34 Education 1 2010-07-01 2010-07-01 false Protection of students' privacy in examination, testing, or treatment. 98.4 Section 98.4 Education Office of the Secretary, Department of Education STUDENT... are not directly related to academic instruction and that is designed to affect behavioral, emotional...

  16. 78 FR 3015 - Privacy Act of 1974; U.S. Customs and Border Protection; DHS/CBP-004-Intellectual Property Rights...

    Science.gov (United States)

    2013-01-15

    ... Search Systems, System of Records AGENCY: Department of Homeland Security, Privacy Office. ACTION: Notice... and Border Protection, Mint Annex, 799 9th Street NW., Washington, DC 20229-1177. For privacy issues... Property Rights Internal Search (IPRiS) system. IPRS provides a web-based search engine for the public to...

  17. Security controls in an integrated Biobank to protect privacy in data sharing: rationale and study design.

    Science.gov (United States)

    Takai-Igarashi, Takako; Kinoshita, Kengo; Nagasaki, Masao; Ogishima, Soichi; Nakamura, Naoki; Nagase, Sachiko; Nagaie, Satoshi; Saito, Tomo; Nagami, Fuji; Minegishi, Naoko; Suzuki, Yoichi; Suzuki, Kichiya; Hashizume, Hiroaki; Kuriyama, Shinichi; Hozawa, Atsushi; Yaegashi, Nobuo; Kure, Shigeo; Tamiya, Gen; Kawaguchi, Yoshio; Tanaka, Hiroshi; Yamamoto, Masayuki

    2017-07-06

    With the goal of realizing genome-based personalized healthcare, we have developed a biobank that integrates personal health, genome, and omics data along with biospecimens donated by 150,000 volunteers. Such large-scale data integration involves obvious risks of privacy violation. The research use of personal genome and health information is a topic of global discussion with regard to protecting privacy while promoting scientific advancement. The present paper reports on our plans, current attempts, and accomplishments in addressing security problems involved in data sharing to ensure donor privacy while promoting scientific advancement. Biospecimens and data have been collected in prospective cohort studies under comprehensive agreement. The sample size of 150,000 participants was required for multiple research aims, including genome-wide screening of gene-by-environment interactions, haplotype phasing, and parametric linkage analysis. We established the Tohoku Medical Megabank (TMM) data sharing policy: a privacy protection rule that requires physical, personnel, and technological safeguards against privacy violation in the use and sharing of data. The proposed policy refers to that of NCBI and that of the Sanger Institute. It classifies shared data according to the strength of re-identification risks. Local committees organized by TMM evaluate re-identification risk and assign a security category to each dataset. Every dataset is stored in an assigned segment of a supercomputer in accordance with its security category. A security manager should be designated to handle all security problems at individual data use locations. The proposed policy requires closed networks and IP-VPN remote connections. The mission of the biobank is to distribute biological resources most productively. This mission motivated us to collect biospecimens and health data and simultaneously analyze genome/omics data in-house.
The biobank also has the

  18. Customer privacy on UK healthcare websites.

    Science.gov (United States)

    Mundy, Darren P

    2006-09-01

    Privacy has been and continues to be one of the key challenges of an age devoted to the accumulation, processing, and mining of electronic information. In particular, privacy of healthcare-related information is seen as a key issue as health organizations move towards the electronic provision of services. The aim of the research detailed in this paper has been to analyse privacy policies on popular UK healthcare-related websites to determine the extent to which consumer privacy is protected. The author has combined approaches used by other researchers in studies of e-commerce and US healthcare websites (focused on usability, policy content, and policy quality) to provide a comprehensive analysis of UK healthcare privacy policies. Through quantitative analysis, the author identifies a wide range of issues related to the protection of consumer privacy. The main finding is that only 61% of healthcare-related websites in the sample group posted privacy policies. In addition, most of the posted privacy policies had poor readability standards and included a variety of privacy vulnerability statements. Overall, these findings represent significant current issues in relation to healthcare information protection on the Internet. The hope is that raising awareness of these results will drive forward changes in the industry, similar to those experienced with information quality.

  19. Privacy Protection: Mandating New Arrangements to Implement and Assess Federal Privacy Policy and Practice

    National Research Council Canada - National Science Library

    Relyea, Harold C

    2004-01-01

    When Congress enacted the Privacy Act of 1974, it established a temporary national study commission to conduct a comprehensive assessment of privacy policy and practice in both the public and private...

  20. Privacy in an Ambient World

    NARCIS (Netherlands)

    Dekker, M.A.C.; Etalle, Sandro; den Hartog, Jeremy

    Privacy is a prime concern in today's information society. To protect the privacy of individuals, enterprises must follow certain privacy practices, while collecting or processing personal data. In this chapter we look at the setting where an enterprise collects private data on its website,

  1. Formal Security-Proved Mobile Anonymous Authentication Protocols with Credit-Based Chargeability and Controllable Privacy

    Directory of Open Access Journals (Sweden)

    Chun-I Fan

    2016-06-01

    Full Text Available Smart mobile phones are widely popularized and advanced mobile communication services are provided increasingly often, such that ubiquitous computing environments will soon be a reality. However, there are many security threats to mobile networks and their impact on security is more serious than that in wireline networks owing to the features of wireless transmissions and the ubiquity property. The secret information which mobile users carry may be stolen by malicious entities. To guarantee the quality of advanced services, security and privacy would be important issues when users roam within various mobile networks. In this manuscript, an anonymous authentication scheme will be proposed to protect the security of the network system and the privacy of users. Not only does the proposed scheme provide mutual authentication between each user and the system, but also each user’s identity is kept secret against anyone else, including the system. Although the system anonymously authenticates the users, it can still generate correct bills to charge these anonymous users via a credit-based solution instead of debit-based ones. Furthermore, our protocols also achieve fair privacy which allows the judge to revoke the anonymity and trace the illegal users when they have misused the anonymity property, for example, if they have committed crimes. Finally, in this paper, we also carry out complete theoretical proofs on each claimed security property.

  2. How can hospitals better protect the privacy of electronic medical records? Perspectives from staff members of health information management departments.

    Science.gov (United States)

    Sher, Ming-Ling; Talley, Paul C; Cheng, Tain-Junn; Kuo, Kuang-Ming

    2017-05-01

    The adoption of electronic medical records (EMR) is expected to better improve overall healthcare quality and to offset the financial pressure of excessive administrative burden. However, safeguarding EMR against potentially hostile security breaches from both inside and outside healthcare facilities has created increased patients' privacy concerns from all sides. The aim of our study was to examine the influencing factors of privacy protection for EMR by healthcare professionals. We used survey methodology to collect questionnaire responses from staff members in health information management departments among nine Taiwanese hospitals active in EMR utilisation. A total of 209 valid responses were collected in 2014. We used partial least squares for analysing the collected data. Perceived benefits, perceived barriers, self-efficacy and cues to action were found to have a significant association with intention to protect EMR privacy, while perceived susceptibility and perceived severity were not. Based on the findings obtained, we suggest that hospitals should provide continuous ethics awareness training to relevant staff and design more effective strategies for improving the protection of EMR privacy in their charge. Further practical and research implications are also discussed.

  3. Ensuring privacy in the study of pathogen genetics.

    Science.gov (United States)

    Mehta, Sanjay R; Vinterbo, Staal A; Little, Susan J

    2014-08-01

    Rapid growth in the genetic sequencing of pathogens in recent years has led to the creation of large sequence databases. This aggregated sequence data can be very useful for tracking and predicting epidemics of infectious diseases. However, the balance between the potential public health benefit and the risk to personal privacy for individuals whose genetic data (personal or pathogen) are included in such work has been difficult to delineate, because neither the true benefit nor the actual risk to participants has been adequately defined. Existing approaches to minimise the risk of privacy loss to participants are based on de-identification of data by removal of a predefined set of identifiers. These approaches neither guarantee privacy nor protect the usefulness of the data. We propose a new approach to privacy protection that will quantify the risk to participants, while still maximising the usefulness of the data to researchers. This emerging standard in privacy protection and disclosure control, which is known as differential privacy, uses a process-driven rather than data-centred approach to protecting privacy. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. Patient Privacy in the Era of Big Data

    Directory of Open Access Journals (Sweden)

    Mehmet Kayaalp

    2018-02-01

    Full Text Available Protecting patient privacy requires various technical tools. It involves regulations for sharing, de-identifying, securely storing, transmitting, and handling protected health information (PHI). It involves privacy laws and legal agreements. It requires establishing rules for monitoring privacy leaks, determining actions when they occur, and handling de-identified clinical narrative reports. De-identification is one such indispensable instrument in this set of privacy tools.

  5. Practical Privacy Assessment

    DEFF Research Database (Denmark)

    Peen, Søren; Jansen, Thejs Willem; Jensen, Christian D.

    2008-01-01

    This chapter proposes a privacy assessment model called the Operational Privacy Assessment Model that includes organizational, operational and technical factors for the protection of personal data stored in an IT system. The factors can be evaluated on a simple scale so that not only can the resulting graphical depiction be easily created for an IT system, but graphical comparisons across multiple IT systems are also possible. Examples of factors presented in a Kiviat graph are also given. This assessment tool may be used to standardize privacy assessment criteria, making it less painful for the management to assess privacy risks on their systems.

  6. Privacy Expectations in Online Contexts

    Science.gov (United States)

    Pure, Rebekah Abigail

    2013-01-01

    Advances in digital networked communication technology over the last two decades have brought the issue of personal privacy into sharper focus within contemporary public discourse. In this dissertation, I explain the Fourth Amendment and the role that privacy expectations play in the constitutional protection of personal privacy generally, and…

  7. Computer-Aided Identification and Validation of Privacy Requirements

    Directory of Open Access Journals (Sweden)

    Rene Meis

    2016-05-01

    Full Text Available Privacy is a software quality that is closely related to security. The main difference is that security properties aim at the protection of assets that are crucial for the considered system, and privacy aims at the protection of personal data that are processed by the system. The identification of privacy protection needs in complex systems is a hard and error prone task. Stakeholders whose personal data are processed might be overlooked, or the sensitivity and the need of protection of the personal data might be underestimated. The later personal data and the needs to protect them are identified during the development process, the more expensive it is to fix these issues, because the needed changes of the system-to-be often affect many functionalities. In this paper, we present a systematic method to identify the privacy needs of a software system based on a set of functional requirements by extending the problem-based privacy analysis (ProPAn method. Our method is tool-supported and automated where possible to reduce the effort that has to be spent for the privacy analysis, which is especially important when considering complex systems. The contribution of this paper is a semi-automatic method to identify the relevant privacy requirements for a software-to-be based on its functional requirements. The considered privacy requirements address all dimensions of privacy that are relevant for software development. As our method is solely based on the functional requirements of the system to be, we enable users of our method to identify the privacy protection needs that have to be addressed by the software-to-be at an early stage of the development. As initial evaluation of our method, we show its applicability on a small electronic health system scenario.

  8. 78 FR 57319 - Children's Online Privacy Protection Rule Safe Harbor Proposed Self-Regulatory Guidelines...

    Science.gov (United States)

    2013-09-18

    ...-AB20 Children's Online Privacy Protection Rule Safe Harbor Proposed Self-Regulatory Guidelines; kidSAFE... proposed self-regulatory guidelines submitted by the kidSAFE Seal Program (``kidSAFE''), owned and operated... enabling industry groups or others to submit to the Commission for approval self-regulatory guidelines that...

  9. The trade secrets protection in U.S. and in Europe: a comparative study

    Directory of Open Access Journals (Sweden)

    Chiara Gaido

    2017-12-01

    Full Text Available Only by deeply understanding the new laws that govern trade secrets protection in the United States and Europe will companies be able to effectively protect their own trade secrets. The purpose of this paper is to highlight the similarities and differences between the two regulations and give useful guidelines to international companies operating in both geographical areas. The paper therefore focuses first on the economic value of trade secrets and the costs related to cybercrime and cyberespionage. It then analyzes the US and EU historical legal backgrounds that led to the adoption of both laws. Finally, it makes a comparative analysis of the provisions in each law and offers suggestions for companies that operate under both regimes.

  10. Secret Sharing Schemes with a large number of players from Toric Varieties

    DEFF Research Database (Denmark)

    Hansen, Johan P.

    A general theory for constructing linear secret sharing schemes over a finite field $\mathbb{F}_q$ from toric varieties is introduced. The number of players can be as large as $(q-1)^r-1$ for $r\geq 1$. We present general methods for obtaining the reconstruction and privacy thresholds, as well as conditions for multiplication on the associated secret sharing schemes. In particular, we apply the method to certain toric surfaces. The main results are ideal linear secret sharing schemes where the number of players can be as large as $(q-1)^2-1$. We determine bounds for the reconstruction and privacy thresholds...
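    As a concrete, much simpler instance of a linear secret sharing scheme over a finite field with reconstruction and privacy thresholds, the classic Shamir construction can be sketched as follows. This is not the toric-variety construction of the record, only the textbook polynomial scheme it generalizes; the prime and parameters are illustrative.

```python
import random

PRIME = 2**61 - 1  # a Mersenne prime, large enough for small secrets

def make_shares(secret, threshold, num_players, rng):
    """Split `secret` into `num_players` shares such that any
    `threshold` shares reconstruct it and fewer reveal nothing
    (Shamir's scheme: shares are points on a random polynomial
    of degree threshold-1 with constant term `secret`)."""
    coeffs = [secret] + [rng.randrange(PRIME) for _ in range(threshold - 1)]
    shares = []
    for x in range(1, num_players + 1):
        y = 0
        for c in reversed(coeffs):  # Horner evaluation mod PRIME
            y = (y * x + c) % PRIME
        shares.append((x, y))
    return shares

def reconstruct(shares):
    # Lagrange interpolation at x = 0 over GF(PRIME).
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        # pow(den, PRIME-2, PRIME) is the modular inverse of den.
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret
```

    Here the reconstruction threshold equals `threshold` and the privacy threshold equals `threshold - 1`; the toric schemes in the record obtain such thresholds for much larger player counts.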

  11. Outsourcing Set Intersection Computation Based on Bloom Filter for Privacy Preservation in Multimedia Processing

    Directory of Open Access Journals (Sweden)

    Hongliang Zhu

    2018-01-01

    Full Text Available With the development of cloud computing, its advantages of low cost and high computational ability meet the demands of the complicated computations involved in multimedia processing. Outsourcing computation to the cloud enables users with limited computing resources to store and process distributed multimedia application data without installing multimedia application software on local computer terminals, but the main problem is how to protect the security of user data in untrusted public cloud services. In recent years, privacy-preserving outsourced computation has become one of the most common methods for solving the security problems of cloud computing. However, existing schemes cannot meet the needs of large numbers of nodes and dynamic topologies. In this paper, we introduce a novel privacy-preserving outsourced computation method that combines the GM homomorphic encryption scheme with Bloom filters to solve this problem, and we propose a new privacy-preserving outsourced set intersection computation protocol. Results show that the new protocol resolves the privacy-preserving outsourced set intersection computation problem without increasing complexity or the false-positive probability. Moreover, the number of participants, the size of the input secret sets, and the online time of participants are not limited.
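    The Bloom-filter side of such a set intersection protocol can be sketched in the clear (the GM homomorphic encryption layer that hides the filter from the cloud is omitted, and the sizes are illustrative): one party inserts its set into the filter, and the other tests its own elements against it, with false positives possible but false negatives impossible.

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter with k independent hash positions
    derived from salted SHA-256 digests."""

    def __init__(self, size=256, num_hashes=3):
        self.size = size
        self.num_hashes = num_hashes
        self.bits = [0] * size

    def _positions(self, item):
        for i in range(self.num_hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:4], "big") % self.size

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos] = 1

    def might_contain(self, item):
        return all(self.bits[pos] for pos in self._positions(item))

def candidate_intersection(bloom, other_set):
    # Every true intersection element is reported; a spurious
    # element may appear with the filter's false-positive rate.
    return {x for x in other_set if bloom.might_contain(x)}
```

    In the actual protocol the bit array would be encrypted bit-by-bit under GM (which is XOR-homomorphic), so membership tests can be evaluated without revealing either party's set to the cloud.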

  12. Technical Privacy Metrics: a Systematic Survey

    OpenAIRE

    Wagner, Isabel; Eckhoff, David

    2018-01-01

    The file attached to this record is the author's final peer reviewed version The goal of privacy metrics is to measure the degree of privacy enjoyed by users in a system and the amount of protection offered by privacy-enhancing technologies. In this way, privacy metrics contribute to improving user privacy in the digital world. The diversity and complexity of privacy metrics in the literature makes an informed choice of metrics challenging. As a result, instead of using existing metrics, n...

  13. 76 FR 11435 - Privacy Act of 1974; Computer Matching Program

    Science.gov (United States)

    2011-03-02

    ... Security Administration. SUMMARY: Pursuant to the Computer Matching and Privacy Protection Act of 1988, Public Law 100-503, the Computer Matching and Privacy Protections Amendments of 1990, Pub. L. 101-508... Interpreting the Provisions of Public Law 100-503, the Computer Matching and Privacy Protection Act of 1988...

  14. Protection of Location Privacy Based on Distributed Collaborative Recommendations.

    Science.gov (United States)

    Wang, Peng; Yang, Jing; Zhang, Jian-Pei

    2016-01-01

    In the existing centralized location services system structure, the server is easily attacked and becomes a communication bottleneck, which can cause the disclosure of users' locations. To address this, we present a new distributed collaborative recommendation strategy based on a distributed system. In this strategy, each node establishes a profile of its own location information. When a request for location services arises, the user can obtain the corresponding location services according to the recommendations of neighboring users' location information profiles. If no suitable recommended location service results are obtained, the user can send a service request to the server based on the construction of a k-anonymous data set with a centroid position of the neighbors. In this strategy, we designed a new model of distributed collaborative recommendation location service based on users' location information profiles and used generalization and encryption to ensure the safety of users' location information privacy. Finally, we used a real location data set for theoretical and experimental analysis. The results show that the strategy proposed in this paper is capable of reducing the frequency of access to the location server, providing better location services, and better protecting the user's location privacy.
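    The fallback step, sending the server a k-anonymous request built around the centroid of neighboring users, can be sketched as follows (a minimal illustration; the paper additionally applies generalization and encryption, and the function name is hypothetical):

```python
def cloaked_request(user_pos, neighbor_positions, k):
    """Replace the user's true (x, y) position with the centroid
    of a group of at least k positions (the user plus k-1
    neighbors), so the location server only ever sees a query
    point it cannot attribute to any single member."""
    if len(neighbor_positions) < k - 1:
        raise ValueError("not enough neighbors for k-anonymity")
    group = [user_pos] + list(neighbor_positions)[: k - 1]
    cx = sum(p[0] for p in group) / len(group)
    cy = sum(p[1] for p in group) / len(group)
    return (cx, cy)
```

    The server answers the query for the centroid, and the requesting node refines the result locally against its own true position.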

  15. The secret to health information technology's success within the diabetes patient population: a comprehensive privacy and security framework.

    Science.gov (United States)

    Pandya, Sheel M

    2010-05-01

    Congress made an unprecedented investment in health information technology (IT) when it passed the American Recovery and Reinvestment Act in February 2009. Health IT provides enormous opportunities to improve health care quality, reduce costs, and engage patients in their own care. But the potential payoff for use of health IT for diabetes care is magnified given the prevalence, cost, and complexity of the disease. However, without proper privacy and security protections in place, diabetes patient data are at risk of misuse, and patient trust in the system is undermined. We need a comprehensive privacy and security framework that articulates clear parameters for access, use, and disclosure of diabetes patient data for all entities storing and exchanging electronic data. (c) 2010 Diabetes Technology Society.

  16. INSPIRATIONS OF THE FRAMEWORK OF INTERNET PRIVACY PROTECTION IN AMERICA%美国网络隐私保护框架的启示

    Institute of Scientific and Technical Information of China (English)

    王忠

    2013-01-01

    This paper introduces the background and main content of "Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy", released by the White House. Combining this with the actual situation of online privacy protection in China, it proposes measures to promote positive interaction between online privacy protection and industrial innovation.

  17. Blood rights: the body and information privacy.

    Science.gov (United States)

    Alston, Bruce

    2005-05-01

    Genetic and other medical technology makes blood, human tissue and other bodily samples an immediate and accessible source of comprehensive personal and health information about individuals. Yet, unlike medical records, bodily samples are not subject to effective privacy protection or other regulation to ensure that individuals have rights to control the collection, use and transfer of such samples. This article examines the existing coverage of privacy legislation, arguments in favour of baseline protection for bodily samples as sources of information and possible approaches to new regulation protecting individual privacy rights in bodily samples.

  18. Privacy and confidentiality in pragmatic clinical trials.

    Science.gov (United States)

    McGraw, Deven; Greene, Sarah M; Miner, Caroline S; Staman, Karen L; Welch, Mary Jane; Rubel, Alan

    2015-10-01

    With pragmatic clinical trials, an opportunity exists to answer important questions about the relative risks, burdens, and benefits of therapeutic interventions. However, concerns about protecting the privacy of this information are significant and must be balanced with the imperative to learn from the data gathered in routine clinical practice. Traditional privacy protections for research uses of identifiable information rely disproportionately on informed consent or authorizations, based on a presumption that this is necessary to fulfill ethical principles of respect for persons. But frequently, the ideal of informed consent is not realized in its implementation. Moreover, the principle of respect for persons—which encompasses their interests in health information privacy—can be honored through other mechanisms. Data anonymization also plays a role in protecting privacy but is not suitable for all research, particularly pragmatic clinical trials. In this article, we explore both the ethical foundation and regulatory framework intended to protect privacy in pragmatic clinical trials. We then review examples of novel approaches to respecting persons in research that may have the added benefit of honoring patient privacy considerations. © The Author(s) 2015.

  19. Predicting Facebook users' online privacy protection: risk, trust, norm focus theory, and the theory of planned behavior.

    Science.gov (United States)

    Saeri, Alexander K; Ogilvie, Claudette; La Macchia, Stephen T; Smith, Joanne R; Louis, Winnifred R

    2014-01-01

    The present research adopts an extended theory of the planned behavior model that included descriptive norms, risk, and trust to investigate online privacy protection in Facebook users. Facebook users (N = 119) completed a questionnaire assessing their attitude, subjective injunctive norm, subjective descriptive norm, perceived behavioral control, implicit perceived risk, trust of other Facebook users, and intentions toward protecting their privacy online. Behavior was measured indirectly 2 weeks after the study. The data show partial support for the theory of planned behavior and strong support for the independence of subjective injunctive and descriptive norms. Risk also uniquely predicted intentions over and above the theory of planned behavior, but there were no unique effects of trust on intentions, nor of risk or trust on behavior. Implications are discussed.

  20. When Differential Privacy Meets Randomized Perturbation: A Hybrid Approach for Privacy-Preserving Recommender System

    KAUST Repository

    Liu, Xiao

    2017-03-21

    Privacy risks of recommender systems have attracted increasing attention. Users' private data are often collected by potentially untrusted recommender systems in order to provide high-quality recommendations. Meanwhile, malicious attackers may utilize recommendation results to make inferences about other users' private data. Existing approaches focus either on keeping users' private data protected during recommendation computation or on preventing the inference of any single user's data from the recommendation result. However, none is designed both to hide users' private data and to prevent privacy inference. To achieve this goal, we propose in this paper a hybrid approach for privacy-preserving recommender systems that combines differential privacy (DP) with randomized perturbation (RP). We theoretically show that the noise added by RP has limited effect on recommendation accuracy and that the noise added by DP can be well controlled based on the sensitivity analysis of functions on the perturbed data. Extensive experiments on three large-scale real-world datasets show that the hybrid approach generally provides more privacy protection with acceptable recommendation accuracy loss, and surprisingly sometimes achieves better privacy without sacrificing accuracy, thus validating its feasibility in practice.
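As a hedged illustration of the differential-privacy half of such a hybrid, the sketch below releases a per-item average rating via the classic Laplace mechanism. It is not the paper's DP+RP construction; the function names, rating bounds, and per-item release are assumptions.

```python
import math
import random

def laplace_noise(scale):
    # Inverse-CDF sampling from Laplace(0, scale).
    u = random.random() - 0.5
    sign = 1.0 if u >= 0 else -1.0
    return -scale * sign * math.log(max(1e-300, 1.0 - 2.0 * abs(u)))

def dp_item_average(ratings, epsilon, r_min=1.0, r_max=5.0):
    """Release an item's average rating under epsilon-DP.

    Changing one user's rating moves the clipped sum by at most
    (r_max - r_min), so that bound is the sensitivity used to
    scale the noise."""
    clipped = [min(max(r, r_min), r_max) for r in ratings]
    noisy_sum = sum(clipped) + laplace_noise((r_max - r_min) / epsilon)
    return noisy_sum / len(ratings)
```

With a large epsilon the noise is negligible; smaller epsilon values trade accuracy for stronger privacy.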

  1. Minutiae Matching with Privacy Protection Based on the Combination of Garbled Circuit and Homomorphic Encryption

    Directory of Open Access Journals (Sweden)

    Mengxing Li

    2014-01-01

    Full Text Available Biometrics plays an important role in authentication applications since biometric traits are strongly linked to their holders. With the increasing growth of e-commerce and e-government, one can expect biometric-based authentication systems to be deployed over open networks in the near future. However, due to its openness, the Internet poses a great challenge to the security and privacy of biometric authentication. Biometric data cannot be revoked, so it is of paramount importance that they be handled in a secure way. In this paper we present a scheme achieving privacy-preserving fingerprint authentication between two parties, in which the fingerprint minutiae matching algorithm is performed in the encrypted domain. To improve efficiency, we exploit homomorphic encryption as well as garbled circuits to design the protocol. Our goal is to protect both the security of the stored template and the data privacy of the two parties in the transaction. The experimental results show that the proposed authentication protocol runs efficiently. Therefore, the protocol can run over open networks and help to alleviate concerns about the security and privacy of biometric applications over open networks.

  2. Minutiae matching with privacy protection based on the combination of garbled circuit and homomorphic encryption.

    Science.gov (United States)

    Li, Mengxing; Feng, Quan; Zhao, Jian; Yang, Mei; Kang, Lijun; Wu, Lili

    2014-01-01

    Biometrics plays an important role in authentication applications since biometric traits are strongly linked to their holders. With the increasing growth of e-commerce and e-government, one can expect biometric-based authentication systems to be deployed over open networks in the near future. However, due to its openness, the Internet poses a great challenge to the security and privacy of biometric authentication. Biometric data cannot be revoked, so it is of paramount importance that they be handled in a secure way. In this paper we present a scheme achieving privacy-preserving fingerprint authentication between two parties, in which the fingerprint minutiae matching algorithm is performed in the encrypted domain. To improve efficiency, we exploit homomorphic encryption as well as garbled circuits to design the protocol. Our goal is to protect both the security of the stored template and the data privacy of the two parties in the transaction. The experimental results show that the proposed authentication protocol runs efficiently. Therefore, the protocol can run over open networks and help to alleviate concerns about the security and privacy of biometric applications over open networks.
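The additively homomorphic building block used in such protocols can be illustrated with a toy Paillier cryptosystem, in which multiplying two ciphertexts yields an encryption of the sum of their plaintexts. This is a sketch with deliberately tiny parameters, not the paper's protocol; real deployments use moduli of 2048 bits or more.

```python
import math
import random

# Toy Paillier keypair (tiny primes, illustration only).
p, q = 17, 19
n = p * q                      # public modulus
n2 = n * n
g = n + 1                      # standard generator choice
lam = math.lcm(p - 1, q - 1)   # private key
mu = pow(lam, -1, n)           # valid because L(g^lam mod n^2) = lam when g = n + 1

def L(x):
    return (x - 1) // n

def encrypt(m):
    # m must satisfy 0 <= m < n
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

# Additive homomorphism: multiplying ciphertexts adds plaintexts.
c_sum = (encrypt(3) * encrypt(4)) % n2  # decrypts to 3 + 4 = 7
```

In a minutiae-matching protocol, such ciphertext arithmetic lets one party compute sums and distances over the other party's encrypted template without ever seeing the template itself.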

  3. Privacy and data protection: Legal aspects in the Republic of Macedonia

    Directory of Open Access Journals (Sweden)

    Nora Osmani

    2016-07-01

    Full Text Available The purpose of this paper is to present a theoretical assessment of the existing Law on Personal Data Protection in the Republic of Macedonia. The paper aims to analyse whether there is a need for additional legal tools in order to achieve a balance between maintaining data integrity in the digital age and the use of modern technology. The paper discusses the meaning of “information privacy” in the age of big data, cyber threats and the domestic and international response to these issues. Special focus is dedicated to privacy policy enforcement in European Union Law. Having regard to the development of new technologies, prevailing data protection legislation may no longer be able to provide effective protection for individuals’ personal information. Therefore, existing laws should be continuously adapted to respond to new challenges and situations deriving from different online activities and communications.

  4. Privacy enhanced recommender system

    NARCIS (Netherlands)

    Erkin, Zekeriya; Erkin, Zekeriya; Beye, Michael; Veugen, Thijs; Lagendijk, Reginald L.

    2010-01-01

    Recommender systems are widely used in online applications since they enable personalized service to the users. The underlying collaborative filtering techniques work on user’s data which are mostly privacy sensitive and can be misused by the service provider. To protect the privacy of the users, we

  5. Privacy and policy for genetic research.

    Science.gov (United States)

    DeCew, Judith Wagner

    2004-01-01

    I begin with a discussion of the value of privacy and what we lose without it. I then turn to the difficulties of preserving privacy for genetic information and other medical records in the face of advanced information technology. I suggest three alternative public policy approaches to the problem of protecting individual privacy and also preserving databases for genetic research: (1) governmental guidelines and centralized databases, (2) corporate self-regulation, and (3) my hybrid approach. None of these are unproblematic; I discuss strengths and drawbacks of each, emphasizing the importance of protecting the privacy of sensitive medical and genetic information as well as letting information technology flourish to aid patient care, public health and scientific research.

  6. Enhancing Privacy for Digital Rights Management

    NARCIS (Netherlands)

    Petkovic, M.; Conrado, C.; Schrijen, G.J.; Jonker, Willem

    2007-01-01

    This chapter addresses privacy issues in DRM systems. These systems provide a means of protecting digital content, but may violate the privacy of users in that the content they purchase and their actions in the system can be linked to specific users. The chapter proposes a privacy-preserving DRM

  7. Privacy Information Security Classification for Internet of Things Based on Internet Data

    OpenAIRE

    Lu, Xiaofeng; Qu, Zhaowei; Li, Qi; Hui, Pan

    2015-01-01

    A lot of privacy protection technologies have been proposed, but most of them are independent and aim at protecting some specific privacy. There is hardly enough deep study into the attributes of privacy. To minimize the damage and influence of the privacy disclosure, the important and sensitive privacy should be a priori preserved if all privacy pieces cannot be preserved. This paper focuses on studying the attributes of the privacy and proposes privacy information security classification (P...

  8. Privacy by design in personal health monitoring.

    Science.gov (United States)

    Nordgren, Anders

    2015-06-01

    The concept of privacy by design is becoming increasingly popular among regulators of information and communications technologies. This paper aims at analysing and discussing the ethical implications of this concept for personal health monitoring. I assume a privacy theory of restricted access and limited control. On the basis of this theory, I suggest a version of the concept of privacy by design that constitutes a middle road between what I call broad privacy by design and narrow privacy by design. The key feature of this approach is that it attempts to balance automated privacy protection and autonomously chosen privacy protection in a way that is context-sensitive. In personal health monitoring, this approach implies that in some contexts like medication assistance and monitoring of specific health parameters one single automatic option is legitimate, while in some other contexts, for example monitoring in which relatives are receivers of health-relevant information rather than health care professionals, a multi-choice approach stressing autonomy is warranted.

  9. MODEL REGULATION FOR DATA PRIVACY IN THE APPLICATION OF BIOMETRIC SMART CARD

    Directory of Open Access Journals (Sweden)

    Sinta Dewi

    2017-03-01

    This article explores a model regulation for data privacy, intended to regulate and protect data privacy in the use of biometric smartcards by combining several approaches. First, through laws that enforce principles and international standards. Second, through a market-based approach, derived through industry associations that help protect consumer data privacy by adopting privacy policies stating that the industry will protect consumers' privacy by implementing fair information principles. Third, through technological approaches such as PETs (privacy-enhancing technologies), i.e., techniques for anonymous and pseudonymous payment, communication, and web access. Fourth, through corporate privacy rules.

  10. FCJ-195 Privacy, Responsibility, and Human Rights Activism

    Directory of Open Access Journals (Sweden)

    Becky Kazansky

    2015-06-01

    Full Text Available In this article, we argue that many difficulties associated with the protection of digital privacy are rooted in the framing of privacy as a predominantly individual responsibility. We examine how models of privacy protection, such as Notice and Choice, contribute to the ‘responsibilisation’ of human rights activists who rely on the use of technologies for their work. We also consider how a group of human rights activists countered technology-mediated threats that this ‘responsibilisation’ causes by developing a collective approach to address their digital privacy and security needs. We conclude this article by discussing how technological tools used to maintain or counter the loss of privacy can be improved in order to support the privacy and digital security of human rights activists.

  11. Genetic information, non-discrimination, and privacy protections in genetic counseling practice.

    Science.gov (United States)

    Prince, Anya E R; Roche, Myra I

    2014-12-01

    The passage of the Genetic Information Non Discrimination Act (GINA) was hailed as a pivotal achievement that was expected to calm the fears of both patients and research participants about the potential misuse of genetic information. However, 6 years later, patient and provider awareness of legal protections at both the federal and state level remains discouragingly low, thereby, limiting their potential effectiveness. The increasing demand for genetic testing will expand the number of individuals and families who could benefit from obtaining accurate information about the privacy and anti-discriminatory protections that GINA and other laws extend. In this paper we describe legal protections that are applicable to individuals seeking genetic counseling, review the literature on patient and provider fears of genetic discrimination and examine their awareness and understandings of existing laws, and summarize how genetic counselors currently discuss genetic discrimination. We then present three genetic counseling cases to illustrate issues of genetic discrimination and provide relevant information on applicable legal protections. Genetic counselors have an unprecedented opportunity, as well as the professional responsibility, to disseminate accurate knowledge about existing legal protections to their patients. They can strengthen their effectiveness in this role by achieving a greater knowledge of current protections including being able to identify specific steps that can help protect genetic information.

  12. A survey of the SWISS researchers on the impact of sibling privacy protections on pedigree recruitment.

    Science.gov (United States)

    Worrall, Bradford B; Chen, Donna T; Brown, Robert D; Brott, Thomas G; Meschia, James F

    2005-01-01

    To understand the perceptions and attitudes about privacy safeguards in research and investigate the impact of letter-based proband-initiated contact on recruitment, we surveyed researchers in the Siblings With Ischemic Stroke Study (SWISS). All 49 actively recruiting sites provided at least 1 response, and 61% reported that potential probands were enthusiastic. Although 66% of researchers valued proband-initiated contact, only 23% said that probands viewed this strategy as important to protecting the privacy of siblings. A substantial minority of researchers (37%) said the strategy impeded enrollment, and 44% said it was overly burdensome to probands.

  13. Data Security and Privacy in Apps for Dementia: An Analysis of Existing Privacy Policies.

    Science.gov (United States)

    Rosenfeld, Lisa; Torous, John; Vahia, Ipsit V

    2017-08-01

    Despite tremendous growth in the number of health applications (apps), little is known about how well these apps protect their users' health-related data. This gap in knowledge is of particular concern for apps targeting people with dementia, whose cognitive impairment puts them at increased risk of privacy breaches. In this article, we determine how many dementia apps have privacy policies and how well they protect user data. Our analysis included all iPhone apps that matched the search terms "medical + dementia" or "health & fitness + dementia" and collected user-generated content. We evaluated all available privacy policies for these apps based on criteria that systematically measure how individual user data are handled. Seventy-two apps matched the above search terms and collected user data. Of these, only 33 (46%) had an available privacy policy. Nineteen of the 33 with policies (58%) were specific to the app in question, and 25 (76%) specified how individual-user as opposed to aggregate data would be handled. Among these, there was a preponderance of missing information, the majority acknowledged collecting individual data for internal purposes, and most admitted to instances in which they would share user data with outside parties. At present, the majority of health apps focused on dementia lack a privacy policy, and those that do exist lack clarity. Bolstering safeguards and improving communication about privacy protections will help facilitate consumer trust in apps, thereby enabling more widespread and meaningful use by people with dementia and those involved in their care. Copyright © 2017. Published by Elsevier Inc.

  14. AnonySense: Opportunistic and Privacy-Preserving Context Collection

    DEFF Research Database (Denmark)

    Triandopoulos, Nikolaos; Kapadia, Apu; Cornelius, Cory

    2008-01-01

    We propose AnonySense, a general-purpose architecture for leveraging users' mobile devices to measure context while maintaining the privacy of the users. AnonySense features multiple layers of privacy protection: a framework for nodes to receive tasks anonymously, a novel blurring mechanism based on tessellation and clustering to protect users' privacy against the system while reporting context, and k-anonymous report aggregation to improve the users' privacy against applications receiving the context. We outline the architecture and security properties of AnonySense and focus on evaluating our design.
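The k-anonymous aggregation idea can be sketched as follows: snap each location report to a grid cell and release only cells backed by at least k reports. This is a toy stand-in assuming a square-grid tessellation and illustrative parameter names, not AnonySense's actual mechanism.

```python
def k_anonymous_cells(reports, k=5, cell=0.01):
    """Blur (lat, lon) reports onto a square grid and release only
    cells containing at least k reports, so no released cell can be
    traced back to fewer than k users."""
    counts = {}
    for lat, lon in reports:
        key = (int(lat // cell), int(lon // cell))
        counts[key] = counts.get(key, 0) + 1
    return {c: n for c, n in counts.items() if n >= k}
```

Smaller cells give finer-grained context but make it harder for any cell to accumulate k reports, which is the blurring trade-off the record above describes.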

  15. Molecular mechanisms of lipoapoptosis and metformin protection in GLP-1 secreting cells

    DEFF Research Database (Denmark)

    Kappe, Camilla; Holst, Jens Juul; Zhang, Qimin

    2012-01-01

    Evidence is emerging that elevated serum free fatty acids (hyperlipidemia) contribute to the pathogenesis of type-2-diabetes, and lipotoxicity is observed in many cell types. We recently published data indicating lipotoxic effects of simulated hyperlipidemia also in GLP-1-secreting cells, where the antidiabetic drug metformin conferred protection from lipoapoptosis. The aim of the present study was to identify mechanisms involved in mediating lipotoxicity and metformin lipoprotection in GLP-1 secreting cells. The signaling events triggered by simulated hyperlipidemia may underlie reduced GLP-1 secretion in diabetic subjects, and lipoprotection by metformin could explain elevated plasma GLP-1 levels in diabetic patients on chronic metformin therapy. The present study may thus identify potential molecular targets for increasing endogenous GLP-1 secretion through enhanced viability of GLP-1 secreting cells.

  16. openPDS: protecting the privacy of metadata through SafeAnswers.

    Directory of Open Access Journals (Sweden)

    Yves-Alexandre de Montjoye

    Full Text Available The rise of smartphones and web services made possible the large-scale collection of personal metadata. Information about individuals' location, phone call logs, or web searches is collected and used intensively by organizations and big data researchers. Metadata has, however, yet to realize its full potential. Privacy and legal concerns, as well as the lack of technical solutions for personal metadata management, are preventing metadata from being shared and reconciled under the control of the individual. This lack of access and control is furthermore fueling growing concerns, as it prevents individuals from understanding and managing the risks associated with the collection and use of their data. Our contribution is two-fold: (1) we describe openPDS, a personal metadata management framework that allows individuals to collect, store, and give fine-grained access to their metadata to third parties. It has been implemented in two field studies; (2) we introduce and analyze SafeAnswers, a new and practical way of protecting the privacy of metadata at an individual level. SafeAnswers turns a hard anonymization problem into a more tractable security one. It allows services to ask questions whose answers are calculated against the metadata instead of trying to anonymize individuals' metadata. The dimensionality of the data shared with the services is reduced from high-dimensional metadata to low-dimensional answers that are less likely to be re-identifiable and to contain sensitive information. These answers can then be directly shared individually or in aggregate. openPDS and SafeAnswers provide a new way of dynamically protecting personal metadata, thereby supporting the creation of smart data-driven services and data science research.
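The SafeAnswers pattern of answering questions against metadata, rather than exporting the metadata, can be sketched in a few lines. The class and method names below are illustrative assumptions, not openPDS's actual API.

```python
class PersonalDataStore:
    """Sketch of the SafeAnswers idea: a service submits a question and
    receives only a small, low-dimensional answer; the raw metadata
    never leaves the store."""

    def __init__(self, call_log):
        # list of (hour_of_day, contact) tuples, held privately
        self._call_log = call_log

    def answer_is_night_caller(self):
        # One boolean leaves the store instead of the full call log.
        night = sum(1 for hour, _ in self._call_log if hour >= 22 or hour < 6)
        return night > len(self._call_log) / 2
```

The data shared with the service drops from the full call log to a single bit, which is far less likely to be re-identifiable.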

  17. The Privacy Jungle:On the Market for Data Protection in Social Networks

    Science.gov (United States)

    Bonneau, Joseph; Preibusch, Sören

    We have conducted the first thorough analysis of the market for privacy practices and policies in online social networks. From an evaluation of 45 social networking sites using 260 criteria, we find that many popular assumptions regarding privacy and social networking need to be revisited when considering the entire ecosystem instead of only a handful of well-known sites. Contrary to the common perception of an oligopolistic market, we find evidence of vigorous competition for new users. Despite observing many poor security practices, there is evidence that social network providers are making efforts to implement privacy-enhancing technologies, with substantial diversity in the amount of privacy control offered. However, privacy is rarely used as a selling point, and even then only as an auxiliary, nondecisive feature. Sites also failed to promote their existing privacy controls within the site. We similarly found great diversity in the length and content of formal privacy policies, but found an opposite promotional trend: though almost all policies are inaccessible to ordinary users due to obfuscating legal jargon, they conspicuously vaunt the sites' privacy practices. We conclude that the market for privacy in social networks is dysfunctional in that there is significant variation in sites' privacy controls, data collection requirements, and legal privacy policies, but this is not effectively conveyed to users. Our empirical findings motivate us to introduce the novel model of a privacy communication game, where the economically rational choice for a site operator is to make privacy control available to evade criticism from privacy fundamentalists, while hiding the privacy control interface and privacy policy to maximize sign-up numbers and encourage data sharing from the pragmatic majority of users.

  18. Privacy Act

    Science.gov (United States)

    Learn about the Privacy Act of 1974, the Electronic Government Act of 2002, the Federal Information Security Management Act, and other information about how the Environmental Protection Agency maintains its records.

  19. Disentangling privacy from property: toward a deeper understanding of genetic privacy.

    Science.gov (United States)

    Suter, Sonia M

    2004-04-01

    With the mapping of the human genome, genetic privacy has become a concern to many. People care about genetic privacy because genes play an important role in shaping us--our genetic information is about us, and it is deeply connected to our sense of ourselves. In addition, unwanted disclosure of our genetic information, like a great deal of other personal information, makes us vulnerable to unwanted exposure, stigmatization, and discrimination. One recent approach to protecting genetic privacy is to create property rights in genetic information. This Article argues against that approach. Privacy and property are fundamentally different concepts. At heart, the term "property" connotes control within the marketplace and over something that is disaggregated or alienable from the self. "Privacy," in contrast, connotes control over access to the self as well as things close to, intimately connected to, and about the self. Given these different meanings, a regime of property rights in genetic information would impoverish our understanding of that information, ourselves, and the relationships we hope will be built around and through its disclosure. This Article explores our interests in genetic information in order to deepen our understanding of the ongoing discourse about the distinction between property and privacy. It develops a conception of genetic privacy with a strong relational component. We ordinarily share genetic information in the context of relationships in which disclosure is important to the relationship--family, intimate, doctor-patient, researcher-participant, employer-employee, and insurer-insured relationships. Such disclosure makes us vulnerable to and dependent on the person to whom we disclose it. As a result, trust is essential to the integrity of these relationships and our sharing of genetic information. Genetic privacy can protect our vulnerability in these relationships and enhance the trust we hope to have in them. Property, in contrast, by

  20. Co-regulation in EU personal data protection : The case of technical standards and the privacy by design standardisation ‘mandate’

    NARCIS (Netherlands)

    Kamara, Irene

    The recently adopted General Data Protection Regulation (GDPR), a technology-neutral law, endorses self-regulatory instruments, such as certification and technical standards. Even before the adoption of the General Data Protection Regulation, standardisation activity in the field of privacy

  1. Privacy-Enhanced and Multifunctional Health Data Aggregation under Differential Privacy Guarantees.

    Science.gov (United States)

    Ren, Hao; Li, Hongwei; Liang, Xiaohui; He, Shibo; Dai, Yuanshun; Zhao, Lian

    2016-09-10

    With the rapid growth of the health data scale, the limited storage and computation resources of wireless body area sensor networks (WBANs) are becoming a barrier to their development. Therefore, outsourcing encrypted health data to the cloud has been an appealing strategy. However, data aggregation then becomes difficult. Some recently proposed schemes try to address this problem, but several functions and privacy issues remain unaddressed. In this paper, we propose a privacy-enhanced and multifunctional health data aggregation scheme (PMHA-DP) under differential privacy. Specifically, we achieve a new aggregation function, weighted average (WAAS), and design a privacy-enhanced aggregation scheme (PAAS) to protect the aggregated data from cloud servers. Besides, a histogram aggregation scheme with high accuracy is proposed. PMHA-DP supports fault tolerance while preserving data privacy. The performance evaluation shows that the proposal incurs less communication overhead than the existing one.
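A generic epsilon-DP weighted average can be sketched as below. This is a hedged illustration of the weighted-average (WAAS) idea only, not the paper's PMHA-DP construction; the clipping bounds and function names are assumptions.

```python
import random

def laplace(scale):
    # The difference of two iid exponentials with mean `scale`
    # is distributed as Laplace(0, scale).
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def dp_weighted_average(values, weights, epsilon, v_max=1.0):
    """Release a weighted average under epsilon-DP. With values clipped
    to [0, v_max] and weights normalised to sum to 1, one participant's
    value moves the weighted sum by at most max(w) * v_max, which
    bounds the sensitivity used to scale the noise."""
    total = sum(weights)
    w = [x / total for x in weights]
    clipped = [min(max(v, 0.0), v_max) for v in values]
    s = sum(wi * vi for wi, vi in zip(w, clipped))
    return s + laplace(max(w) * v_max / epsilon)
```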

  2. An Efficient Secret Key Homomorphic Encryption Used in Image Processing Service

    Directory of Open Access Journals (Sweden)

    Pan Yang

    2017-01-01

    Full Text Available Homomorphic encryption can protect a user's privacy when operating on the user's data in cloud computing. But it is not yet practical for wide use, as the data and service types in cloud computing are diverse. Among these data types, the digital image is an important form of personal data for users. There are also many image processing services in cloud computing. To protect users' privacy in these services, this paper proposes a scheme using homomorphic encryption in image processing. Firstly, a secret key homomorphic encryption scheme (IGHE) was constructed for encrypting images. IGHE can operate on encrypted floating-point numbers efficiently to adapt to the image processing service. Then, by translating traditional image processing methods into operations on encrypted pixels, the encrypted image can be processed homomorphically. That is, the service can process the encrypted image directly, and the result after decryption is the same as that of processing the plain image. To illustrate our scheme, three common image processing instances are given in this paper. The experiments show that our scheme is secure, correct, and efficient enough to be used in practical image processing applications.
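The additive half of operating on encrypted pixels can be illustrated with a toy secret-key scheme that masks each pixel with a keystream value, so a brightness offset applied to ciphertexts survives decryption. This illustrates the homomorphic property only, with invented names; it is not the IGHE construction and is not secure as a standalone cipher.

```python
import random

class ToyAdditiveCipher:
    """Toy secret-key scheme: enc(x) = x + k for a per-pixel keystream
    k, so adding a constant to ciphertexts adds it to the plaintext
    image. Illustration only; not secure on its own."""

    def __init__(self, n_pixels, seed=0):
        rng = random.Random(seed)
        self._keys = [rng.uniform(-1e6, 1e6) for _ in range(n_pixels)]

    def encrypt(self, pixels):
        return [x + k for x, k in zip(pixels, self._keys)]

    def decrypt(self, cipher):
        return [c - k for c, k in zip(cipher, self._keys)]

def brighten(cipher, delta):
    # The untrusted server adjusts brightness directly on ciphertexts.
    return [c + delta for c in cipher]
```

Decrypting the brightened ciphertexts yields the brightened plain image, which is the commuting property the record above relies on.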

  3. Patient Privacy in the Era of Big Data.

    Science.gov (United States)

    Kayaalp, Mehmet

    2018-01-20

    Privacy was defined as a fundamental human right in the Universal Declaration of Human Rights at the 1948 United Nations General Assembly. However, there is still no consensus on what constitutes privacy. In this review, we look at the evolution of privacy as a concept from the era of Hippocrates to the era of social media and big data. To appreciate the modern measures of patient privacy protection and correctly interpret the current regulatory framework in the United States, we need to analyze and understand the concepts of individually identifiable information, individually identifiable health information, protected health information, and de-identification. The Privacy Rule of the Health Insurance Portability and Accountability Act defines the regulatory framework and casts a balance between protective measures and access to health information for secondary (scientific) use. The rule defines the conditions when health information is protected by law and how protected health information can be de-identified for secondary use. With the advents of artificial intelligence and computational linguistics, computational text de-identification algorithms produce de-identified results nearly as well as those produced by human experts, but much faster, more consistently and basically for free. Modern clinical text de-identification systems now pave the road to big data and enable scientists to access de-identified clinical information while firmly protecting patient privacy. However, clinical text de-identification is not a perfect process. In order to maximize the protection of patient privacy and to free clinical and scientific information from the confines of electronic healthcare systems, all stakeholders, including patients, health institutions and institutional review boards, scientists and the scientific communities, as well as regulatory and law enforcement agencies must collaborate closely. 
On the one hand, public health laws and privacy regulations define rules

  4. 76 FR 75603 - Family Educational Rights and Privacy

    Science.gov (United States)

    2011-12-02

    ... dropout status, demographics, and unique student identifiers. Schools and LEAs are the primary collectors... of using student data must always be balanced with the need to protect student privacy. Protecting student privacy helps achieve a number of important goals, including avoiding discrimination, identity...

  5. Privacy of genetic information: a review of the laws in the United States.

    Science.gov (United States)

    Fuller, B; Ip, M

    2001-01-01

    This paper examines the privacy of genetic information and the laws in the United States designed to protect genetic privacy. While all 50 states have laws protecting the privacy of health information, there are many states that have additional laws that carve out additional protections specifically for genetic information. The majority of the individual states have enacted legislation to protect individuals from discrimination on the basis of genetic information, and most of this legislation also has provisions to protect the privacy of genetic information. On the Federal level, there has been no antidiscrimination or genetic privacy legislation. Secretary Donna Shalala of the Department of Health and Human Services has issued proposed regulations to protect the privacy of individually identifiable health information. These regulations encompass individually identifiable health information and do not make specific provisions for genetic information. The variety of laws regarding genetic privacy, some found in statutes to protect health information and some found in statutes to prevent genetic discrimination, presents challenges to those charged with administering and executing these laws.

  6. Challenges of privacy protection in big data analytics

    DEFF Research Database (Denmark)

    Jensen, Meiko

    2013-01-01

    The big data paradigm implies that almost every type of information eventually can be derived from sufficiently large datasets. However, in such terms, linkage of personal data of individuals poses a severe threat to privacy and civil rights. In this position paper, we propose a set of challenges that have to be addressed in order to perform big data analytics in a privacy-compliant way.

  7. Sharing Privacy Protected and Statistically Sound Clinical Research Data Using Outsourced Data Storage

    Directory of Open Access Journals (Sweden)

    Geontae Noh

    2014-01-01

    Full Text Available It is critical to scientific progress to share clinical research data stored in outsourced, generally available cloud computing services. Researchers are able to obtain valuable information that they would not otherwise be able to access; however, privacy concerns arise when sharing clinical data in these outsourced publicly available data storage services. HIPAA requires researchers to deidentify private information when disclosing clinical data for research purposes and describes two available methods for doing so. Unfortunately, both techniques degrade statistical accuracy. Therefore, the need to protect privacy presents a significant problem for data sharing between hospitals and researchers. In this paper, we propose a controlled secure aggregation protocol to secure both privacy and accuracy when researchers outsource their clinical research data for sharing. Since clinical data must remain private beyond a patient's lifetime, we take advantage of lattice-based homomorphic encryption to guarantee long-term security against quantum computing attacks. Using lattice-based homomorphic encryption, we design an aggregation protocol that aggregates outsourced ciphertexts under distinct public keys. It enables researchers to get aggregated results from the outsourced ciphertexts of distinct researchers. To the best of our knowledge, our protocol is the first aggregation protocol that can aggregate ciphertexts encrypted under distinct public keys.
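A far simpler stand-in for aggregation that reveals only the total is additive secret sharing across two non-colluding servers: each researcher splits its value into two random shares, and neither server alone learns anything. This illustrates the aggregation goal only; it is a different technique from the paper's lattice-based homomorphic protocol, and all names below are illustrative.

```python
import random

MOD = 2**61 - 1  # shares live in a large ring

def share(value):
    # Split a value into two additive shares, one per server.
    r = random.randrange(MOD)
    return r, (value - r) % MOD

def aggregate(values):
    # Each server sums the shares it holds; only recombining the two
    # server totals reveals the aggregate, never an individual value.
    s0 = s1 = 0
    for v in values:
        a, b = share(v)
        s0 = (s0 + a) % MOD
        s1 = (s1 + b) % MOD
    return (s0 + s1) % MOD
```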

  8. PriBots: Conversational Privacy with Chatbots

    OpenAIRE

    Harkous, Hamza; Fawaz, Kassem; Shin, Kang G.; Aberer, Karl

    2016-01-01

    Traditional mechanisms for delivering notice and enabling choice have so far failed to protect users’ privacy. Users are continuously frustrated by complex privacy policies, unreachable privacy settings, and a multitude of emerging standards. The miniaturization trend of smart devices and the emergence of the Internet of Things (IoTs) will exacerbate this problem further. In this paper, we propose Conversational Privacy Bots (PriBots) as a new way of delivering notice and choice through a two...

  9. An overview of human genetic privacy.

    Science.gov (United States)

    Shi, Xinghua; Wu, Xintao

    2017-01-01

    The study of human genomics is becoming a Big Data science, owing to recent biotechnological advances leading to availability of millions of personal genome sequences, which can be combined with biometric measurements from mobile apps and fitness trackers, and of human behavior data monitored from mobile devices and social media. With increasing research opportunities for integrative genomic studies through data sharing, genetic privacy emerges as a legitimate yet challenging concern that needs to be carefully addressed, not only for individuals but also for their families. In this paper, we present potential genetic privacy risks and relevant ethics and regulations for sharing and protecting human genomics data. We also describe the techniques for protecting human genetic privacy from three broad perspectives: controlled access, differential privacy, and cryptographic solutions. © 2016 New York Academy of Sciences.
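    Of the three broad techniques the abstract names, differential privacy lends itself to a compact illustration. Below is a minimal sketch of the Laplace mechanism applied to a counting query (e.g. how many participants carry a given variant); the function names are illustrative, not from the paper.

```python
import math
import random

def laplace_noise(scale, rng=random):
    """Sample Laplace(0, scale) via the inverse-CDF transform."""
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(values, predicate, epsilon):
    """Release a counting query under epsilon-differential privacy.
    A count has sensitivity 1, so Laplace(1/epsilon) noise suffices."""
    true_count = sum(1 for v in values if predicate(v))
    return true_count + laplace_noise(1.0 / epsilon)
```

    Smaller `epsilon` means stronger privacy but noisier answers; the released count is a float and may be post-processed (e.g. rounded) without weakening the guarantee.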

  11. 45 CFR 503.2 - General policies-Privacy Act.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 3 2010-10-01 2010-10-01 false General policies-Privacy Act. 503.2 Section 503.2... THE UNITED STATES, DEPARTMENT OF JUSTICE RULES OF PRACTICE PRIVACY ACT AND GOVERNMENT IN THE SUNSHINE REGULATIONS Privacy Act Regulations § 503.2 General policies—Privacy Act. The Commission will protect the...

  12. Do privacy and data protection rules apply to legal persons and should they? A proposal for a two-tiered system

    NARCIS (Netherlands)

    van der Sloot, B.

    2015-01-01

    Privacy and data protection rules are usually said to protect the individual against intrusive governments and nosy companies. These rights guarantee the individual's freedom, personal autonomy and human dignity, among others. More and more, however, legal persons are also allowed to invoke the

  13. Pathology Image-Sharing on Social Media: Recommendations for Protecting Privacy While Motivating Education.

    Science.gov (United States)

    Crane, Genevieve M; Gardner, Jerad M

    2016-08-01

    There is a rising interest in the use of social media by pathologists. However, the use of pathology images on social media has been debated, particularly gross examination, autopsy, and dermatologic condition photographs. The immediacy of the interactions, increased interest from patients and patient groups, and fewer barriers to public discussion raise additional considerations to ensure patient privacy is protected. Yet these very features all add to the power of social media for educating other physicians and the nonmedical public about disease and for creating better understanding of the important role of pathologists in patient care. The professional and societal benefits are overwhelmingly positive, and we believe the potential for harm is minimal provided common sense and routine patient privacy principles are utilized. We lay out ethical and practical guidelines for pathologists who use social media professionally. © 2016 American Medical Association. All Rights Reserved.

  14. 78 FR 15732 - Privacy Act of 1974; Computer Matching Program

    Science.gov (United States)

    2013-03-12

    ... 1974; Computer Matching Program AGENCY: Department of Homeland Security/U.S. Citizenship and... Privacy Act of 1974 (5 U.S.C. 552a), as amended by the Computer Matching and Privacy Protection Act of 1988 (Pub. L. 100-503) and the Computer Matching and Privacy Protection Amendments of 1990 (Pub. L. 101...

  15. Online privacy: overview and preliminary research

    Directory of Open Access Journals (Sweden)

    Renata Mekovec

    2010-12-01

    Full Text Available Over the last decade, using the Internet for online shopping, information browsing and searching, as well as for online communication, has become part of everyday life. Although Internet technology brings many benefits to users, one of its most important disadvantages is the increasing capacity for surveillance of users' online activity. Users, however, are increasingly becoming aware of online surveillance methods, which results in their increased concern for privacy protection. Numerous factors influence the way in which individuals perceive the level of privacy protection when they are online. This article provides a review of factors that influence the privacy perception of Internet users. Previous online privacy research related to e-business was predominantly focused on the dimension of information privacy and concerned with the way users' personal information is collected, saved and used by an online company. This article's main aim is to provide an overview of the numerous elements of Internet users' privacy perception across various privacy dimensions, as well as their potential categorization. In addition, considering that e-banking and online shopping are among the most widely used e-services, an examination of the online privacy perception of e-banking/online shopping users was performed.

  16. Privacy-Enhanced and Multifunctional Health Data Aggregation under Differential Privacy Guarantees

    Science.gov (United States)

    Ren, Hao; Li, Hongwei; Liang, Xiaohui; He, Shibo; Dai, Yuanshun; Zhao, Lian

    2016-01-01

    With the rapid growth of the scale of health data, the limited storage and computation resources of wireless body area sensor networks (WBANs) are becoming a barrier to their development. Therefore, outsourcing encrypted health data to the cloud has been an appealing strategy. However, data aggregation then becomes difficult. Some recently-proposed schemes try to address this problem, but there are still functions and privacy issues that they do not discuss. In this paper, we propose a privacy-enhanced and multifunctional health data aggregation scheme (PMHA-DP) under differential privacy. Specifically, we achieve a new aggregation function, weighted average (WAAS), and design a privacy-enhanced aggregation scheme (PAAS) to protect the aggregated data from cloud servers. Besides, a histogram aggregation scheme with high accuracy is proposed. PMHA-DP supports fault tolerance while preserving data privacy. The performance evaluation shows that the proposal incurs less communication overhead than the existing scheme. PMID:27626417

  17. Privacy Protection for Personal Health Device Communication and Healthcare Building Applications

    Directory of Open Access Journals (Sweden)

    Soon Seok Kim

    2014-01-01

    Full Text Available This paper proposes a new method for protecting patient privacy when communicating with a gateway which collects bioinformation through using personal health devices, a type of biosensor for telemedicine, at home and in other buildings. As the suggested method is designed to conform with ISO/IEEE 11073-20601, which is the international standard, interoperability with various health devices was considered. We believe it will be a highly valuable resource for dealing with basic data because it suggests an additional standard for security with the Continua Health Alliance or related international groups in the future.

  18. Security, privacy, and confidentiality issues on the Internet

    OpenAIRE

    Kelly, Grant; McKenzie, Bruce

    2002-01-01

    We introduce the issues around protecting information about patients and related data sent via the Internet. We begin by reviewing three concepts necessary to any discussion about data security in a healthcare environment: privacy, confidentiality, and consent. We then give some advice on how to protect local data. Authentication and privacy of e-mail via encryption is offered by Pretty Good Privacy (PGP) and Secure Multipurpose Internet Mail Extensions (S/MIME). The de facto Internet standa...

  19. Privacy and technology challenges for ubiquitous social networking

    DEFF Research Database (Denmark)

    Sapuppo, Antonio; Seet, Boon-Chong

    2015-01-01

    The paper is oriented towards important challenges such as social sensing, enabling social networking, and privacy protection. We first investigate the methods and technologies for acquisition of the relevant context for the promotion of sociability among inhabitants of USN environments. Afterwards, we review architectures and techniques for enabling social interactions between participants. Finally, we identify privacy as the major challenge for networking in USN environments. Consequently, we present design guidelines and review privacy protection models for facilitating personal information disclosure.

  20. Access to Information and Privacy | IDRC - International ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    As a Crown corporation, IDRC is subject to Canada's laws on access to information and privacy protection. The following resources will help you learn more about IDRC and the access to information and privacy acts, including instructions for submitting an access to information or privacy act (ATIP) request. IDRC and ATIP ...

  1. Gaussian operations and privacy

    International Nuclear Information System (INIS)

    Navascues, Miguel; Acin, Antonio

    2005-01-01

    We consider the possibilities offered by Gaussian states and operations for two honest parties, Alice and Bob, to obtain privacy against a third eavesdropping party, Eve. We first extend the security analysis of the protocol proposed in [Navascues et al. Phys. Rev. Lett. 94, 010502 (2005)]. Then, we prove that a generalized version of this protocol does not allow one to distill a secret key out of bound entangled Gaussian states

  2. Privacy Data Decomposition and Discretization Method for SaaS Services

    Directory of Open Access Journals (Sweden)

    Changbo Ke

    2017-01-01

    Full Text Available In cloud computing, user functional requirements are satisfied through service composition. However, due to the process of interaction and sharing among SaaS services, user privacy data tends to be illegally disclosed to the service participants. In this paper, we propose a privacy data decomposition and discretization method for SaaS services. First, according to the logical relations between the data, we classify the privacy data into discrete privacy data and continuous privacy data. Next, in order to protect the user's privacy information, continuous data chains are decomposed into discrete data chains, and discrete data chains are prevented from being synthesized into continuous data chains. Finally, we propose a protection framework for privacy data and demonstrate its correctness and feasibility with experiments.

  3. Personal Privacy Protection in Big Data Era%大数据时代个人隐私的保护

    Institute of Scientific and Technical Information of China (English)

    张永兵

    2016-01-01

    In recent years, the era of big data, built on cloud computing platforms, has officially arrived. Because big data carries enormous commercial value, criminals try every means to steal personal privacy data, disrupting users' normal lives. By analyzing the severe challenges to personal privacy security in the big data era, this paper summarizes the technical measures adopted to protect personal privacy, puts forward the laws and industry standards that individuals and enterprises should abide by, and finally explores directions for further research on personal privacy protection.

  4. Enhancing Security and Privacy in Video Surveillance through Role-Oriented Access Control Mechanism

    DEFF Research Database (Denmark)

    Mahmood Rajpoot, Qasim

    Pervasive usage of such systems gives substantial power to those monitoring the videos and poses a threat to the privacy of anyone observed by the system. Aside from protecting privacy from outside attackers, it is equally important to protect the privacy of individuals from the inside personnel involved in monitoring surveillance data, to minimize the chances of misuse of the system, e.g. voyeurism. In this context, several techniques to protect the privacy of individuals, called privacy enhancing techniques (PET), have been proposed in the literature; these detect and mask privacy-sensitive regions, e.g. faces, in the videos. However, very few research efforts have focused on addressing the security aspects of video surveillance data and on authorizing access to this data. Interestingly, while PETs help protect the privacy of individuals, they may also hinder the usefulness of the data.

  5. Differences in legislation of data privacy protection in internet marketing in USA, EU and Serbia

    Directory of Open Access Journals (Sweden)

    Markov Jasmina

    2012-01-01

    Full Text Available There is a growing number of companies that, in their operations and dealings with consumers, are turning to the Internet and using the huge opportunities it provides. Internet marketing is therefore experiencing extreme expansion and is considered to be the marketing segment subject to intensive and continuous change. Along with the positive effects brought to both businesses and consumers, there are some negatives associated with this form of marketing, and one of them is insufficient protection of privacy. The fact is that we must raise the level of data protection and improve its quality. Intense changes have to be undertaken at the normative level, because there are still plenty of reasons for consumer dissatisfaction when it comes to protecting their privacy. Thus, legislation must play a key role in building consumer confidence as well as in establishing a positive relationship with marketers. The aim of this paper is to show the importance of building a level of private data protection that will establish long-term partnerships between consumers, marketers and other participants in the market, since only such relations can bring prosperity to all parties. The paper makes a comparative analysis of the legislative framework in this field in the United States, the European Union and Serbia, and highlights Serbia's still-significant lag behind these developed countries.

  6. mSieve: Differential Behavioral Privacy in Time Series of Mobile Sensor Data.

    Science.gov (United States)

    Saleheen, Nazir; Chakraborty, Supriyo; Ali, Nasir; Mahbubur Rahman, Md; Hossain, Syed Monowar; Bari, Rummana; Buder, Eugene; Srivastava, Mani; Kumar, Santosh

    2016-09-01

    Differential privacy concepts have been successfully used to protect anonymity of individuals in population-scale analysis. Sharing of mobile sensor data, especially physiological data, raise different privacy challenges, that of protecting private behaviors that can be revealed from time series of sensor data. Existing privacy mechanisms rely on noise addition and data perturbation. But the accuracy requirement on inferences drawn from physiological data, together with well-established limits within which these data values occur, render traditional privacy mechanisms inapplicable. In this work, we define a new behavioral privacy metric based on differential privacy and propose a novel data substitution mechanism to protect behavioral privacy. We evaluate the efficacy of our scheme using 660 hours of ECG, respiration, and activity data collected from 43 participants and demonstrate that it is possible to retain meaningful utility, in terms of inference accuracy (90%), while simultaneously preserving the privacy of sensitive behaviors.

  7. Bridging the transatlantic divide in privacy

    Directory of Open Access Journals (Sweden)

    Paula Kift

    2013-08-01

    Full Text Available In the context of the US National Security Agency surveillance scandal, the transatlantic privacy divide has come back to the fore. In the United States, the right to privacy is primarily understood as a right to physical privacy, thus the protection from unwarranted government searches and seizures. In Germany on the other hand, it is also understood as a right to spiritual privacy, thus the right of citizens to develop into autonomous moral agents. The following article will discuss the different constitutional assumptions that underlie American and German attitudes towards privacy, namely privacy as an aspect of liberty or as an aspect of dignity. As data flows defy jurisdictional boundaries, however, policymakers across the Atlantic are faced with a conundrum: how can German and American privacy cultures be reconciled?

  8. Data security breaches and privacy in Europe

    CERN Document Server

    Wong, Rebecca

    2013-01-01

    Data Security Breaches and Privacy in Europe aims to consider data protection and cybersecurity issues; more specifically, it aims to provide a fruitful discussion of data security breaches. A detailed analysis of the European data protection framework is presented. In particular, the Data Protection Directive 95/46/EC, the Directive on Privacy and Electronic Communications and the proposed changes under the Data Protection Regulation (data breach notifications) and their implications are considered. This is followed by an examination of the Directive on Attacks against information systems a

  9. Privacy Training Program

    Science.gov (United States)

    Recognizing that training and awareness are critical to protecting agency Personally Identifiable Information (PII), the EPA is developing online training for privacy contacts in its programs and regions.

  10. Privacy protection on the internet: The European model

    Directory of Open Access Journals (Sweden)

    Baltezarević Vesna

    2017-01-01

    Full Text Available The Internet has a huge impact on all areas of social activity. Everyday life, social interaction and economics are directed to new information and communication technologies. A positive aspect of the new technology is that it has created a virtual space that eliminates various barriers, enabling interaction and information exchange across the world. Inclusion in virtual social networks provides connectivity for communicators who are looking for a space that allows them freedom of expression and connects them with new 'friends'. Because of the feeling of complete freedom and the absence of censorship on the network, communicators leave many personal details and photos without thinking about possible abuses of privacy. The recording of various incidents on the network has resulted in the need to take precautionary measures in order to protect users and the rule of law, given that freedom on the network is only possible with an adequate system of safety and security. In this paper we deal with the problem of protecting the personal data of users of virtual social networks against malicious activity and abuse, with special reference to the activities of the European Union in its effort to regulate this area. The European Commission has concentrated on finding the best solutions to protect users' virtual space for more than two decades, starting in 1995 and culminating in the Directive on Security of Network and Information Systems, adopted in the first half of 2016.

  11. The Best of Both Worlds? Free Trade in Services and EU Law on Privacy and Data Protection

    NARCIS (Netherlands)

    Yakovleva, S.; Irion, K.

    2016-01-01

    The article focuses on the interplay between European Union (EU) law on privacy and data protection and international trade law, in particular the General Agreement on Trade in Services (GATS) and the WTO dispute settlement system. The argument distinguishes between the effects of international

  12. Protection of business and industrial secrets under the Atomic Energy Act and the relevant ordinances governing licensing and supervisory procedures

    International Nuclear Information System (INIS)

    Steinberg, R.

    1988-01-01

    The article deals with problems concerning the protection of secret information in licensing and supervisory procedures under the Atomic Energy Act and the relevant ordinances. The extent of protection afforded to business and industrial secrets is regulated differently for the two procedures. These legal provisions have to be interpreted with due consideration for third-party interests in information. (WG) [de

  13. (IN)PRIVACY IN MOBILE APPS. CUSTOMER OPPORTUNITIES

    Directory of Open Access Journals (Sweden)

    Yu.S. Chemerkina

    2016-01-01

    Full Text Available Subject of Study. The paper presents the results of an investigation of cross-platform mobile applications. It focuses on cross-platform app data investigation for the purpose of creating a database that helps to make decisions from a data privacy viewpoint. These decisions draw on knowledge about mobile apps that are available to the public, especially on how consumer data is protected while it is stored locally or transferred via the network, as well as on what types of data may leak. Methods. The paper proposes a forensics methodology as the cornerstone of the app data investigation process. The object of research is application data protection under different security control types across modern mobile OSs. The subject of research is a modification of the forensics approach, combined with behavioral analysis, to examine application data privacy in order to find data that are not properly handled by applications and lead to data leakage, and to determine the protection control type without the usual forensics limits. In addition, the paper relies on the simplest tools, limiting examination to data stored locally and data transmitted over the network, and excluding memory and code analysis unless it is valuable (behavioral analysis). The research methods include digital forensics approaches depending on the data's state (at-rest, in-use/in-memory, in-transit), together with behavioral analysis of applications and static and dynamic application code analysis. Main Results. The research was carried out within the scope of the thesis, and the following scientific results were obtained. First, the methods used to investigate the privacy of application data allow considering application features and protection code design and flaws in the context of incomplete user awareness of the privacy state due to external activity of the developer.
Second, the knowledge set about facts of application data protection that allows making a knowledge database to

  14. Privacy-preserving digital rights management

    NARCIS (Netherlands)

    Conrado, C.; Petkovic, M.; Jonker, W.; Jonker, W.; Petkovic, M.

    2004-01-01

    DRM systems provide a means for protecting digital content, but at the same time they violate the privacy of users in a number of ways. This paper addresses privacy issues in DRM systems. The main challenge is how to allow a user to interact with the system in an anonymous/pseudonymous way, while

  15. The privacy paradox : Investigating discrepancies between expressed privacy concerns and actual online behavior - A systematic literature review

    NARCIS (Netherlands)

    Barth, Susanne; de Jong, Menno D.T.

    2017-01-01

    Also known as the privacy paradox, recent research on online behavior has revealed discrepancies between user attitude and their actual behavior. More specifically: While users claim to be very concerned about their privacy, they nevertheless undertake very little to protect their personal data.

  16. 77 FR 64962 - Privacy Act of 1974, as Amended

    Science.gov (United States)

    2012-10-24

    ... social media, and recipients of other public relations materials issued by the CFPB about CFPB sponsored... THE BUREAU OF CONSUMER FINANCIAL PROTECTION Privacy Act of 1974, as Amended AGENCY: Bureau of Consumer Financial Protection. ACTION: Notice of Proposed Privacy Act System of Records. SUMMARY: In...

  17. 77 FR 60382 - Privacy Act of 1974, as Amended

    Science.gov (United States)

    2012-10-03

    ... financial products or services, (b) consumer behavior with respect to consumer financial products and... BUREAU OF CONSUMER FINANCIAL PROTECTION Privacy Act of 1974, as Amended AGENCY: Bureau of Consumer... the Privacy Act of 1974, as amended, the Bureau of Consumer Financial Protection, hereinto referred to...

  18. Privacy information management for video surveillance

    Science.gov (United States)

    Luo, Ying; Cheung, Sen-ching S.

    2013-05-01

    The widespread deployment of surveillance cameras has raised serious privacy concerns. Many privacy-enhancing schemes have been proposed to automatically redact images of trusted individuals in the surveillance video. To identify these individuals for protection, the most reliable approach is to use biometric signals such as iris patterns as they are immutable and highly discriminative. In this paper, we propose a privacy data management system to be used in a privacy-aware video surveillance system. The privacy status of a subject is anonymously determined based on her iris pattern. For a trusted subject, the surveillance video is redacted and the original imagery is considered to be the privacy information. Our proposed system allows a subject to access her privacy information via the same biometric signal for privacy status determination. Two secure protocols, one for privacy information encryption and the other for privacy information retrieval are proposed. Error control coding is used to cope with the variability in iris patterns and efficient implementation is achieved using surrogate data records. Experimental results on a public iris biometric database demonstrate the validity of our framework.

  19. Privacy and ethics in pediatric environmental health research-part II: protecting families and communities.

    Science.gov (United States)

    Fisher, Celia B

    2006-10-01

    In pediatric environmental health research, information about family members is often directly sought or indirectly obtained in the process of identifying child risk factors and helping to tease apart and identify interactions between genetic and environmental factors. However, federal regulations governing human subjects research do not directly address ethical issues associated with protections for family members who are not identified as the primary "research participant." Ethical concerns related to family consent and privacy become paramount as pediatric environmental health research increasingly turns to questions of gene-environment interactions. In this article I identify issues arising from and potential solutions for the privacy and informed consent challenges of pediatric environmental health research intended to adequately protect the rights and welfare of children, family members, and communities. I first discuss family members as secondary research participants and then the specific ethical challenges of longitudinal research on late-onset environmental effects and gene-environment interactions. I conclude with a discussion of the confidentiality and social risks of recruitment and data collection of research conducted within small or unique communities, ethnic minority populations, and low-income families. The responsible conduct of pediatric environmental health research must be conceptualized as a goodness of fit between the specific research context and the unique characteristics of subjects and other family stakeholders.

  20. Differential privacy in intelligent transportation systems

    NARCIS (Netherlands)

    Kargl, Frank; Friedman, Arik; Boreli, Roksana

    2013-01-01

    In this paper, we investigate how the concept of differential privacy can be applied to Intelligent Transportation Systems (ITS), focusing on protection of Floating Car Data (FCD) stored and processed in central Traffic Data Centers (TDC). We illustrate an integration of differential privacy with

  1. Fourier domain asymmetric cryptosystem for privacy protected multimodal biometric security

    Science.gov (United States)

    Choudhury, Debesh

    2016-04-01

    We propose a Fourier domain asymmetric cryptosystem for multimodal biometric security. One modality of biometrics (such as face) is used as the plaintext, which is encrypted by another modality of biometrics (such as fingerprint). A private key is synthesized from the encrypted biometric signature by complex spatial Fourier processing. The encrypted biometric signature is further encrypted by other biometric modalities, and the corresponding private keys are synthesized. The resulting biometric signature is privacy protected since the encryption keys are provided by the human, and hence those are private keys. Moreover, the decryption keys are synthesized using those private encryption keys. The encrypted signatures are decrypted using the synthesized private keys and inverse complex spatial Fourier processing. Computer simulations demonstrate the feasibility of the technique proposed.
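    The Fourier-domain encryption idea can be illustrated in one dimension with a random phase mask applied to a signal's spectrum. This is a much-simplified, symmetric-key analogue of the asymmetric multimodal scheme the abstract describes; the `fft`, `phase_mask`, `encrypt`, and `decrypt` helpers below are illustrative names, not from the paper.

```python
import cmath
import random

def fft(x, inverse=False):
    """Radix-2 Cooley-Tukey FFT; input length must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2], inverse)
    odd = fft(x[1::2], inverse)
    sign = 1 if inverse else -1
    out = [0j] * n
    for k in range(n // 2):
        t = cmath.exp(sign * 2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + t
        out[k + n // 2] = even[k] - t
    return out

def ifft(x):
    return [v / len(x) for v in fft(x, inverse=True)]

def phase_mask(key, n):
    """Pseudo-random unit-magnitude phases derived from a key."""
    rng = random.Random(key)
    return [cmath.exp(2j * cmath.pi * rng.random()) for _ in range(n)]

def encrypt(signal, key):
    """Multiply the spectrum by a key-derived phase mask."""
    spec = fft([complex(s) for s in signal])
    mask = phase_mask(key, len(signal))
    return ifft([s * m for s, m in zip(spec, mask)])

def decrypt(cipher, key):
    """Undo the mask by dividing by the same unit-magnitude phases."""
    spec = fft(list(cipher))
    mask = phase_mask(key, len(cipher))
    return [v.real for v in ifft([s / m for s, m in zip(spec, mask)])]
```

    In the paper's setting the mask would itself be derived from a biometric modality, which is what makes the key private to the human holder.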

  2. Preserving differential privacy under finite-precision semantics.

    Directory of Open Access Journals (Sweden)

    Ivan Gazeau

    2013-06-01

    Full Text Available The approximation introduced by finite-precision representation of continuous data can induce arbitrarily large information leaks even when the computation using exact semantics is secure. Such leakage can thus undermine design efforts aimed at protecting sensitive information. We focus here on differential privacy, an approach to privacy that emerged from the area of statistical databases and is now widely applied in other domains as well. In this approach, privacy is protected by the addition of noise to a true (private) value. To date, this approach to privacy has been proved correct only in the ideal case in which computations are made using an idealized, infinite-precision semantics. In this paper, we analyze the situation at the implementation level, where the semantics is necessarily finite-precision, i.e. the representation of real numbers, and the operations on them, are rounded according to some level of precision. We show that in general there are violations of the differential privacy property, and we study the conditions under which we can still guarantee a limited (but, arguably, totally acceptable) variant of the property, under only a minor degradation of the privacy level. Finally, we illustrate our results on two cases of noise-generating distributions: the standard Laplacian mechanism commonly used in differential privacy, and a bivariate version of the Laplacian recently introduced in the setting of privacy-aware geolocation.
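    One remedy often discussed for floating-point leakage is to release integer-valued noise instead of a continuous Laplacian, i.e. the two-sided geometric ("discrete Laplace") distribution. The sketch below illustrates that general idea only; it is not the mechanism analyzed in the paper, and the inverse-CDF geometric sampler itself still uses floating point (exact integer samplers exist but are longer).

```python
import math
import random

def two_sided_geometric(epsilon, rng=random):
    """Integer-valued 'discrete Laplace' noise with
    P(k) proportional to exp(-epsilon * |k|), sampled as the
    difference of two geometric draws."""
    p = 1.0 - math.exp(-epsilon)  # success probability of each geometric
    def geometric():
        # number of failures before the first success, via inverse CDF
        return int(math.log(1.0 - rng.random()) / math.log(1.0 - p))
    return geometric() - geometric()
```

    Because every output is an exact integer, the rounding of the released value itself cannot create the pathological gaps that finite-precision continuous noise can.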

  3. 76 FR 63896 - Federal Acquisition Regulation; Privacy Training, 2010-013

    Science.gov (United States)

    2011-10-14

    ... should a breach occur; and (7) Any agency-specific privacy training requirements. (d) The contractor is... Acquisition Regulation; Privacy Training, 2010-013 AGENCY: Department of Defense (DoD), General Services... contractors to complete training that addresses the protection of privacy, in accordance with the Privacy Act...

  4. BORDERS OF COMMUNICATION PRIVACY IN SLOVENIAN CRIMINAL PROCEDURE – CONSTITUTIONAL CHALLENGES

    Directory of Open Access Journals (Sweden)

    Sabina Zgaga

    2015-01-01

    Full Text Available Due to fast technological development and our constant communication, protection of communication privacy in every aspect of our (legal) life has become more important than ever before. Regarding the protection of privacy in criminal procedure, special emphasis should be given to the regulation of privacy in the Slovenian Constitution and its interpretation in the case law of the Constitutional Court. This paper presents the definition of privacy and communication privacy in Slovenian constitutional law and exposes the main issues of communication privacy that have been discussed in the case law of the Constitutional Court in the last twenty years. Thereby the paper tries to show the general trend in the case law of the Constitutional Court regarding the protection of communication privacy and to expose certain unsolved issues and unanswered challenges. Slovenian constitutional regulation of communication privacy is very protective, considering the broad definition of privacy and the strict conditions for encroachment on communication privacy. The case law of the Slovenian Constitutional Court has also shown such a trend, with the possible exception of the recent decision on a dynamic IP address. The importance of this decision is nevertheless significant, since it could be applicable to all forms of communication via the internet, the prevailing form of communication nowadays. Certain challenges still lie ahead, such as the current proposal for the amendment of the Criminal Procedure Act-M, which includes the use of IMSI catchers, and numerous unanswered issues regarding data retention after the decisive annulment of its partial legal basis by the Constitutional Court.

  5. Privacy Practices of Health Social Networking Sites: Implications for Privacy and Data Security in Online Cancer Communities.

    Science.gov (United States)

    Charbonneau, Deborah H

    2016-08-01

    While online communities for social support continue to grow, little is known about the state of privacy practices of health social networking sites. This article reports on a structured content analysis of privacy policies and disclosure practices for 25 online ovarian cancer communities. All of the health social networking sites in the study sample provided privacy statements to users, yet privacy practices varied considerably across the sites. The majority of sites informed users that personal information was collected about participants and shared with third parties (96%, n = 24). Furthermore, more than half of the sites (56%, n = 14) stated that cookies technology was used to track user behaviors. Despite these disclosures, only 36% (n = 9) offered opt-out choices for sharing data with third parties. In addition, very few of the sites (28%, n = 7) allowed individuals to delete their personal information. Discussions about specific security measures used to protect personal information were largely missing. Implications for privacy, confidentiality, consumer choice, and data safety in online environments are discussed. Overall, nurses and other health professionals can utilize these findings to encourage individuals seeking online support and participating in social networking sites to build awareness of privacy risks to better protect their personal health information in the digital age.

  6. Decrypting Information Sensitivity: Risk, Privacy, and Data Protection Law in the United States and the European Union

    Science.gov (United States)

    Fazlioglu, Muge

    2017-01-01

    This dissertation examines the risk-based approach to privacy and data protection and the role of information sensitivity within risk management. Determining what information carries the greatest risk is a multi-layered challenge that involves balancing the rights and interests of multiple actors, including data controllers, data processors, and…

  7. Protection of the Locational Privacy Using Mosaic Theory of Data (Varstvo lokacijske zasebnosti s pomočjo mozaične teorije podatkov)

    Directory of Open Access Journals (Sweden)

    Primož Križnar

    2016-12-01

    Full Text Available The individual’s right to privacy is one of the fundamental human rights. Part of this »embedded« right is a person’s ability to move between a variety of different points and locations with the reasonable expectation that the paths taken, the stops made and the current location are not systematically recorded and stored for future use. Notwithstanding this, individuals often seem unaware of the capabilities of modern technology, which aggressively interferes with a wide spectrum of their privacy, including locational privacy. As an essential component of privacy, locational privacy must be given all the necessary legal protection, which, at least for the time being, is reflected in the implementation of the mosaic theory of data in European legal traditions, aided by the established legal standards of the European Court of Human Rights regarding privacy.

  8. Secure privacy-preserving biometric authentication scheme for telecare medicine information systems.

    Science.gov (United States)

    Li, Xuelei; Wen, Qiaoyan; Li, Wenmin; Zhang, Hua; Jin, Zhengping

    2014-11-01

    Healthcare delivery services via telecare medicine information systems (TMIS) can help patients obtain their desired telemedicine services conveniently. However, information security and privacy protection are important issues and crucial challenges in healthcare information systems, where only authorized patients and doctors should be able to employ telecare medicine facilities and access electronic medical records. Therefore, a secure authentication scheme is urgently required to achieve the goals of entity authentication, data confidentiality and privacy protection. This paper investigates a new biometric authentication with key agreement scheme, which focuses on patient privacy and medical data confidentiality in TMIS. The new scheme employs a hash function, a fuzzy extractor, nonces and authenticated Diffie-Hellman key agreement as primitives. It provides patient privacy protection, e.g., protecting the identity from theft and tracking by unauthorized participants, and preserving the password and biometric template from being compromised by untrusted servers. Moreover, the key agreement supports secure transmission by symmetric encryption to protect the patient's medical data from being leaked. Finally, the analysis shows that our proposal provides more security and privacy protection for TMIS.
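The Diffie-Hellman key-agreement core of such a scheme, combined with nonces and a hash-based key derivation, can be sketched as follows. The group parameters are deliberately tiny toys, the names are illustrative, and the scheme's fuzzy-extractor-based authentication step is omitted:

```python
import hashlib
import secrets

# Toy Diffie-Hellman parameters for illustration only; real deployments
# use standardized groups of 2048 bits or more (e.g. RFC 3526).
P, G = 23, 5

def dh_keypair():
    """Generate an ephemeral private exponent and the matching public value."""
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def session_key(priv, peer_pub, nonce_patient, nonce_server):
    """Derive a shared session key from the DH secret and both nonces."""
    shared = pow(peer_pub, priv, P)
    return hashlib.sha256(
        shared.to_bytes(4, "big") + nonce_patient + nonce_server
    ).hexdigest()
```

Both sides compute the same key from their own private exponent and the peer's public value; changing either nonce yields a different session key.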

  9. Big data privacy protection model based on multi-level trusted system

    Science.gov (United States)

    Zhang, Nan; Liu, Zehua; Han, Hongfeng

    2018-05-01

    This paper builds on the multi-level trusted system model, which counters Trojan-horse attacks by encrypting the privacy-sensitive parts of user data and enforcing the principle "no reading of higher-priority levels, no writing to lower-priority levels", ensuring that a privacy leak of low-priority data does not lead to the disclosure of high-priority data. The model divides user data into seven risk levels, with priorities 1 to 7 representing the value of the data privacy from low to high, and pairs them with seven encryption algorithms of differing execution efficiency: the higher the priority, the more valuable the data and the stronger (and slower) the chosen encryption algorithm. For enterprises, the price is determined by the length of time users spend per unit of equipment; the higher the risk subgroup's algorithm, the longer the encryption time. The model assumes that users prefer lower-priority encryption algorithms for the sake of efficiency. The paper proposes a privacy cost model for each of the seven risk subgroups: the higher the privacy cost, the higher the priority of the risk subgroup and the higher the price users must pay to keep their data private. Furthermore, by combining an existing economic pricing model with the human traffic model proposed in this paper, prices fluctuate with market demand: the unit price is raised when market demand is low and, under government guidance, lowered when demand increases so that enterprise profit is still guaranteed. The model is further optimized with dynamic factors for consumer mood and age. Finally, seven algorithms are selected from symmetric and asymmetric encryption algorithms to define the enterprise

  10. Piloting a deceased subject integrated data repository and protecting privacy of relatives.

    Science.gov (United States)

    Huser, Vojtech; Kayaalp, Mehmet; Dodd, Zeyno A; Cimino, James J

    2014-01-01

    Use of deceased subjects' Electronic Health Records can be an important piloting platform for informatics or biomedical research. The existing legal framework allows such research under less strict de-identification criteria; however, the privacy of non-decedents must be protected. We report on the creation of the deceased subject Integrated Data Repository (dsIDR) at the National Institutes of Health Clinical Center and a pilot methodology to remove secondary protected health information or identifiable information (secondary PxI; information about persons other than the primary patient). We characterize the available structured coded data in dsIDR and report the estimated frequencies of secondary PxI, ranging from 12.9% (sensitive token presence) to 1.1% (using stricter criteria). Federating decedent EHR data from multiple institutions can address sample size limitations, and our pilot study provides lessons learned and a methodology that can be adopted by other institutions.

  11. Security, privacy, and confidentiality issues on the Internet.

    Science.gov (United States)

    Kelly, Grant; McKenzie, Bruce

    2002-01-01

    We introduce the issues around protecting information about patients and related data sent via the Internet. We begin by reviewing three concepts necessary to any discussion of data security in a healthcare environment: privacy, confidentiality, and consent, and we give some advice on how to protect local data. Authentication and privacy of e-mail via encryption are offered by Pretty Good Privacy (PGP) and Secure Multipurpose Internet Mail Extensions (S/MIME). The de facto Internet standard for encrypting Web-based information interchanges is Secure Sockets Layer (SSL), more recently known as Transport Layer Security (TLS). A public key infrastructure makes it possible to 'sign' a message: the message is hashed, the hash is processed with the sender's private key, and the result can then be verified against the sender's public key. This ensures the data's authenticity and origin without conferring privacy, and is called a 'digital signature'. The best protection against viruses is not opening e-mails from unknown sources or those containing unusual message headers.
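The digital-signature process described above can be illustrated with textbook RSA and tiny classroom parameters (no padding, illustrative only; real systems use 2048-bit keys with a padding scheme such as PSS):

```python
import hashlib

# Textbook-RSA toy parameters: p = 61, q = 53, so n = 3233 and phi = 3120.
N = 61 * 53            # public modulus
E = 17                 # public exponent
D = 2753               # private exponent (E * D = 1 mod 3120)

def sign(message: bytes) -> int:
    """Hash the message, then apply the private exponent to the hash."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % N
    return pow(h, D, N)

def verify(message: bytes, sig: int) -> bool:
    """Apply the public exponent to the signature and compare hashes."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % N
    return pow(sig, E, N) == h
```

Any tampering with the signature (or the message) makes verification fail, which is exactly the authenticity-and-origin guarantee the abstract describes.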

  12. Secret-key and identification rates for biometric identification systems with protected templates

    NARCIS (Netherlands)

    Ignatenko, T.; Willems, F.M.J.

    2010-01-01

    In this paper we consider secret generation in biometric identification systems with protected templates. This problem is closely related to the study of the biometric identification capacity [Willems et al., 2003] and [O'Sullivan and Schmid, 2002] and the common randomness generation scheme

  13. Consumer Responses to the Introduction of Privacy Protection Measures: An Exploratory Research Framework

    OpenAIRE

    Heng Xu

    2009-01-01

    Information privacy is at the center of discussion and controversy among multiple stakeholders including business leaders, privacy activists, and government regulators. However, conceptualizations of information privacy have been somewhat patchy in current privacy literature. In this article, we review the conceptualizations of information privacy through three different lenses (information exchange, social contract and information control), and then try to build upon previous literature from...

  14. Fourteen Reasons Privacy Matters: A Multidisciplinary Review of Scholarly Literature

    Science.gov (United States)

    Magi, Trina J.

    2011-01-01

    Librarians have long recognized the importance of privacy to intellectual freedom. As digital technology and its applications advance, however, efforts to protect privacy may become increasingly difficult. With some users behaving in ways that suggest they do not care about privacy and with powerful voices claiming that privacy is dead, librarians…

  15. Privacy-preserving Identity Management

    OpenAIRE

    Milutinovic, Milica

    2015-01-01

    With the technological advances and the evolution of online services, user privacy is becoming a crucial issue in the modern day society. Privacy in the general sense refers to individuals’ ability to protect information about themselves and selectively present it to other entities. This concept is nowadays strongly affected by everyday practices that assume personal data disclosure, such as online shopping and participation in loyalty schemes. This makes it difficult for an individual to con...

  16. The Future Internet: A World of Secret Shares

    Directory of Open Access Journals (Sweden)

    William J. Buchanan

    2015-11-01

    Full Text Available The Public Key Infrastructure (PKI) is crumbling, partially due to the lack of a strong understanding of how encryption actually works, but also due to weaknesses in its implementation. This paper outlines an Internet storage technique using secret sharing methods which could be used to overcome the problems inherent with PKI, while supporting new types of architectures incorporating such things as automated failover and break-glass data recovery. The paper outlines a novel architecture: SECRET, which supports a robust cloud-based infrastructure with in-built privacy and failover. In order to understand the performance overhead of SECRET, the paper outlines a range of experiments that investigate the overhead of this and other secret share methods.
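The secret-sharing primitive such an architecture builds on is typically Shamir's (k, n) threshold scheme; here is a compact sketch over a prime field (the field size and function names are illustrative, not SECRET's actual parameters):

```python
import random

PRIME = 2**61 - 1  # a Mersenne prime large enough for a toy secret

def make_shares(secret, k, n, rng=None):
    """Split `secret` into n shares; any k of them reconstruct it."""
    rng = rng or random.Random()
    # A random degree-(k-1) polynomial with the secret as constant term.
    coeffs = [secret] + [rng.randrange(PRIME) for _ in range(k - 1)]
    shares = []
    for x in range(1, n + 1):
        y = 0
        for c in reversed(coeffs):     # Horner evaluation mod PRIME
            y = (y * x + c) % PRIME
        shares.append((x, y))
    return shares

def recover(shares):
    """Lagrange interpolation at x = 0 over GF(PRIME)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        # pow(den, PRIME - 2, PRIME) is the modular inverse (Fermat).
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret
```

Any k-subset of the shares recovers the secret, while k - 1 shares reveal nothing, which is what makes the break-glass and failover properties possible.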

  17. Are organisations in South Africa ready to comply with personal data protection or privacy legislation and regulations?

    CSIR Research Space (South Africa)

    Baloyi, Ntsako

    2017-06-01

    Full Text Available Organisations require people's trust and, in turn, people are entitled to demand, as far as practicable and lawful, certain privileges from these organisations, such as the right to data protection or privacy. The power imbalance between... of restrictions on international data transfers, where there are no 'adequate' levels of personal data protection [5, 6]. This could have dire consequences for businesses. The European Union (EU) Directive [5] was a game changer. It resulted in the conclusion...

  18. 78 FR 38724 - Privacy Act of 1974; Computer Matching Program

    Science.gov (United States)

    2013-06-27

    ... 1974; Computer Matching Program AGENCY: Department of Homeland Security/U.S. Citizenship and... Agreement that establishes a computer matching program between the Department of Homeland Security/U.S... and Privacy Protection Act of 1988 (Pub. L. 100-503) and the Computer Matching and Privacy Protection...

  19. The Regulatory Framework for Privacy and Security

    Science.gov (United States)

    Hiller, Janine S.

    The internet enables the easy collection of massive amounts of personally identifiable information. Unregulated data collection causes distrust and conflicts with widely accepted principles of privacy. The regulatory framework in the United States for ensuring privacy and security in the online environment consists of federal, state, and self-regulatory elements. New laws have been passed to address technological and internet practices that conflict with privacy protecting policies. The United States and the European Union approaches to privacy differ significantly, and the global internet environment will likely cause regulators to face the challenge of balancing privacy interests with data collection for many years to come.

  20. A Secure and Privacy-Preserving Targeted Ad-System

    Science.gov (United States)

    Androulaki, Elli; Bellovin, Steven M.

    Thanks to its low product-promotion cost and its efficiency, targeted online advertising has become very popular. Unfortunately, being profile-based, online advertising methods violate consumers' privacy, which has engendered resistance to the ads. However, protecting privacy through anonymity seems to encourage click-fraud. In this paper, we define consumer's privacy and present a privacy-preserving, targeted ad system (PPOAd) which is resistant towards click fraud. Our scheme is structured to provide financial incentives to all entities involved.

  1. Selling health data: de-identification, privacy, and speech.

    Science.gov (United States)

    Kaplan, Bonnie

    2015-07-01

    Two court cases that involve selling prescription data for pharmaceutical marketing affect biomedical informatics, patient and clinician privacy, and regulation. Sorrell v. IMS Health Inc. et al. in the United States and R v. Department of Health, Ex Parte Source Informatics Ltd. in the United Kingdom concern privacy and health data protection, data de-identification and reidentification, drug detailing (marketing), commercial benefit from the required disclosure of personal information, clinician privacy and the duty of confidentiality, beneficial and unsavory uses of health data, regulating health technologies, and considering data as speech. Individuals should, at the very least, be aware of how data about them are collected and used. Taking account of how those data are used is needed so societal norms and law evolve ethically as new technologies affect health data privacy and protection.

  2. Privacy and Ethics in Pediatric Environmental Health Research—Part II: Protecting Families and Communities

    Science.gov (United States)

    Fisher, Celia B.

    2006-01-01

    Background In pediatric environmental health research, information about family members is often directly sought or indirectly obtained in the process of identifying child risk factors and helping to tease apart and identify interactions between genetic and environmental factors. However, federal regulations governing human subjects research do not directly address ethical issues associated with protections for family members who are not identified as the primary “research participant.” Ethical concerns related to family consent and privacy become paramount as pediatric environmental health research increasingly turns to questions of gene–environment interactions. Objectives In this article I identify issues arising from and potential solutions for the privacy and informed consent challenges of pediatric environmental health research intended to adequately protect the rights and welfare of children, family members, and communities. Discussion I first discuss family members as secondary research participants and then the specific ethical challenges of longitudinal research on late-onset environmental effects and gene–environment interactions. I conclude with a discussion of the confidentiality and social risks of recruitment and data collection of research conducted within small or unique communities, ethnic minority populations, and low-income families. Conclusions The responsible conduct of pediatric environmental health research must be conceptualized as a goodness of fit between the specific research context and the unique characteristics of subjects and other family stakeholders. PMID:17035154

  3. "Everybody Knows Everybody Else's Business"-Privacy in Rural Communities.

    Science.gov (United States)

    Leung, Janni; Smith, Annetta; Atherton, Iain; McLaughlin, Deirdre

    2016-12-01

    Patients have a right to privacy in a health care setting. This involves conversational discretion, security of medical records and physical privacy of remaining unnoticed or unidentified when using health care services other than by those who need to know or whom the patient wishes to know. However, the privacy of cancer patients who live in rural areas is more difficult to protect due to the characteristics of rural communities. The purpose of this article is to reflect on concerns relating to the lack of privacy experienced by cancer patients and health care professionals in the rural health care setting. In addition, this article suggests future research directions to provide much needed evidence for educating health care providers and guiding health care policies that can lead to better protection of privacy among cancer patients living in rural communities.

  4. User Privacy in RFID Networks

    Science.gov (United States)

    Singelée, Dave; Seys, Stefaan

    Wireless RFID networks are getting deployed at a rapid pace and have already entered the public space on a massive scale: public transport cards, the biometric passport, office ID tokens, customer loyalty cards, etc. Although RFID technology offers interesting services to customers and retailers, it could also endanger the privacy of the end-users. The lack of deployed protection mechanisms could potentially result in a privacy leakage of personal data. Furthermore, there is the emerging threat of location privacy. In this paper, we show some practical attack scenarios and illustrate some of them with cases that have received press coverage. We present the main challenges of enhancing privacy in RFID networks and evaluate some solutions proposed in the literature. The main advantages and shortcomings are briefly discussed. Finally, we give an overview of some academic and industrial research initiatives on RFID privacy.

  5. Privacy After Snowden: Theoretical Developments and Public Opinion Perceptions of Privacy in Slovenia (Zasebnost po Snowdnu: novejša pojmovanja zasebnosti in odnos javnosti do le-te v Sloveniji)

    Directory of Open Access Journals (Sweden)

    Aleš Završnik

    2014-10-01

    Full Text Available The article analyses recent theorizing of privacy arising from new technologies that allow constant and ubiquitous monitoring of our communication and movement. The theoretical part analyses Helen Nissenbaum’s theory of contextual integrity of privacy and pluralistic understanding of privacy by Daniel Solove. The empirical part presents the results of an online survey on the Slovenian public perceptions of privacy that includes questions on types and frequency of victimizations relating to the right to privacy; self-reported privacy violations; concern for the protection of one’s own privacy; perception of primary privacy offenders; the value of privacy; attitude towards data retention in public telecommunication networks; and acquaintance with the Information Commissioner of RS. Despite growing distrust of large internet corporations and – after Edward Snowden’s revelations – Intelligence agencies, the findings indicate a low degree of awareness and care for the protection of personal data.

  6. Mechanism of personal privacy protection based on blockchain (基于区块链的个人隐私保护机制)

    Institute of Scientific and Technical Information of China (English)

    章宁; 钟珊

    2017-01-01

    Aiming at the problem of personal privacy protection in the Internet car-rental scenario, a personal privacy protection mechanism based on blockchain was proposed. Firstly, a blockchain-based framework for personal privacy protection was proposed to solve the personal privacy issues exposed in Internet car rental. Secondly, the design and definition of the model were given through participant profiles, database design and performance analysis, and the framework and implementation of the model were expounded in terms of granting authority, writing data, reading data and revoking authority. Finally, the realizability of the mechanism was demonstrated by the development of a blockchain-based system.
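The grant/write/read/revoke workflow can be sketched as a toy hash-chained ledger of permission records; the class and field names below are illustrative assumptions, not the paper's actual design:

```python
import hashlib
import json

class PermissionChain:
    """Toy append-only ledger of grant/revoke events, hash-chained so that
    tampering with any past record is detectable."""

    def __init__(self):
        self.blocks = []

    def _append(self, record):
        prev = self.blocks[-1]["hash"] if self.blocks else "0" * 64
        payload = json.dumps(record, sort_keys=True)
        h = hashlib.sha256((prev + payload).encode()).hexdigest()
        self.blocks.append({"prev": prev, "record": record, "hash": h})

    def grant(self, owner, grantee):
        self._append({"op": "grant", "owner": owner, "grantee": grantee})

    def revoke(self, owner, grantee):
        self._append({"op": "revoke", "owner": owner, "grantee": grantee})

    def can_read(self, owner, grantee):
        """The latest grant/revoke record for this pair wins."""
        allowed = False
        for b in self.blocks:
            r = b["record"]
            if r["owner"] == owner and r["grantee"] == grantee:
                allowed = r["op"] == "grant"
        return allowed

    def verify(self):
        """Recompute every link to detect tampering."""
        prev = "0" * 64
        for b in self.blocks:
            payload = json.dumps(b["record"], sort_keys=True)
            if b["prev"] != prev:
                return False
            if hashlib.sha256((prev + payload).encode()).hexdigest() != b["hash"]:
                return False
            prev = b["hash"]
        return True
```

The owner (e.g. a renter) grants a rental firm read access, the firm's reads are authorized against the latest record, and revocation simply appends a newer record rather than rewriting history.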

  7. A Certificate Authority (CA-based cryptographic solution for HIPAA privacy/security regulations

    Directory of Open Access Journals (Sweden)

    Sangram Ray

    2014-07-01

    Full Text Available The Health Insurance Portability and Accountability Act (HIPAA passed by the US Congress establishes a number of privacy/security regulations for e-healthcare systems. These regulations support patients’ medical privacy and secure exchange of PHI (protected health information among medical practitioners. Three existing HIPAA-based schemes have been studied but appear to be ineffective as patients’ PHI is stored in smartcards. Moreover, carrying a smartcard during a treatment session and accessing PHI from different locations results in restrictions. In addition, authentication of the smartcard presenter would not be possible if the PIN is compromised. In this context, we propose an MCS (medical center server should be located at each hospital and accessed via the Internet for secure handling of patients’ PHI. All entities of the proposed e-health system register online with the MCS, and each entity negotiates a contributory registration key, where public-key certificates issued and maintained by CAs are used for authentication. Prior to a treatment session, a doctor negotiates a secret session key with MCS and uploads/retrieves patients’ PHI securely. The proposed scheme has five phases, which have been implemented in a secure manner for supporting HIPAA privacy/security regulations. Finally, the security aspects, computation and communication costs of the scheme are analyzed and compared with existing methods that display satisfactory performance.

  8. New threats to health data privacy.

    Science.gov (United States)

    Li, Fengjun; Zou, Xukai; Liu, Peng; Chen, Jake Y

    2011-11-24

    Along with the rapid digitalization of health data (e.g. Electronic Health Records), there is an increasing concern about maintaining data privacy while garnering the benefits, especially when the data are required to be published for secondary use. Most of the current research on protecting health data privacy is centered around data de-identification and data anonymization, which remove the identifiable information from the published health data to prevent an adversary from reasoning about the privacy of the patients. However, published health data are not the only source that adversaries can count on: with a large amount of information that people voluntarily share on the Web, sophisticated attacks that join disparate information pieces from multiple sources against health data privacy become practical. Limited efforts have been devoted to studying these attacks so far. We study how patient privacy could be compromised with the help of today's information technologies. In particular, we show that private healthcare information could be collected by aggregating and associating disparate pieces of information from multiple online data sources, including online social networks, public records and search engine results. We demonstrate a real-world case study to show that user identity and privacy are highly vulnerable to attribution, inference and aggregation attacks. We also show, with real data analysis, that people are highly identifiable to adversaries even with inaccurate information pieces about the target. We conclude that so much information has been made electronic and available online that people are very vulnerable without effective privacy protection.
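The aggregation and linkage attacks described can be sketched as a simple join on quasi-identifiers across two independently "anonymized" sources (the field names are illustrative):

```python
def link(records_a, records_b, keys=("zip", "birth", "sex")):
    """Join two datasets on shared quasi-identifiers.

    records_a: identified records (e.g. a public voter roll);
    records_b: 'de-identified' records (e.g. published health data).
    Rows agreeing on all quasi-identifiers are merged, re-attaching
    identities to the supposedly anonymous records.
    """
    index = {}
    for r in records_b:
        index.setdefault(tuple(r[k] for k in keys), []).append(r)
    matches = []
    for r in records_a:
        for s in index.get(tuple(r[k] for k in keys), []):
            matches.append({**r, **s})
    return matches
```

Even though neither dataset alone names a patient's diagnosis, the join does, which is why de-identification alone is insufficient against adversaries holding auxiliary data.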

  9. New threats to health data privacy

    Directory of Open Access Journals (Sweden)

    Li Fengjun

    2011-11-01

    Full Text Available Abstract Background Along with the rapid digitalization of health data (e.g. Electronic Health Records), there is an increasing concern about maintaining data privacy while garnering the benefits, especially when the data are required to be published for secondary use. Most of the current research on protecting health data privacy is centered around data de-identification and data anonymization, which remove the identifiable information from the published health data to prevent an adversary from reasoning about the privacy of the patients. However, published health data are not the only source that adversaries can count on: with a large amount of information that people voluntarily share on the Web, sophisticated attacks that join disparate information pieces from multiple sources against health data privacy become practical. Limited efforts have been devoted to studying these attacks so far. Results We study how patient privacy could be compromised with the help of today's information technologies. In particular, we show that private healthcare information could be collected by aggregating and associating disparate pieces of information from multiple online data sources, including online social networks, public records and search engine results. We demonstrate a real-world case study to show that user identity and privacy are highly vulnerable to attribution, inference and aggregation attacks. We also show, with real data analysis, that people are highly identifiable to adversaries even with inaccurate information pieces about the target. Conclusion We conclude that so much information has been made electronic and available online that people are very vulnerable without effective privacy protection.

  10. Privacy-Preserving Trajectory Collection

    DEFF Research Database (Denmark)

    Gidofalvi, Gyozo; Xuegang, Huang; Pedersen, Torben Bach

    2008-01-01

    In order to provide context-aware Location-Based Services, real location data of mobile users must be collected and analyzed by spatio-temporal data mining methods. However, the data mining methods need precise location data, while the mobile users want to protect their location privacy. To remedy this situation, this paper first formally defines novel location privacy requirements. Then, it briefly presents a system for privacy-preserving trajectory collection that meets these requirements. The system is composed of an untrusted server and clients communicating in a P2P network. Location data is anonymized in the system using data cloaking and data swapping techniques. Finally, the paper empirically demonstrates that the proposed system is effective and feasible.
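The two anonymization techniques named above, data cloaking and data swapping, can be sketched directly; the grid size and helper names are illustrative assumptions, not the paper's actual parameters:

```python
def cloak(lat, lon, cell_deg=0.01):
    """Data cloaking: snap a GPS fix to the centre of a coarse grid cell
    (0.01 degrees is roughly 1 km of latitude), hiding the exact position."""
    snap = lambda v: int(v // cell_deg) * cell_deg + cell_deg / 2
    return round(snap(lat), 6), round(snap(lon), 6)

def swap_tails(traj_a, traj_b, k):
    """Data swapping: exchange the tails of two trajectories after index k,
    breaking the link between a user and their full path."""
    return traj_a[:k] + traj_b[k:], traj_b[:k] + traj_a[k:]
```

Cloaking degrades spatial precision uniformly, while swapping preserves the exact points but scrambles which user they belong to; a collection system can combine both before data reach the untrusted server.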

  11. Digital privacy in the marketplace perspectives on the information exchange

    CERN Document Server

    Milne, George

    2015-01-01

    Digital Privacy in the Marketplace focuses on the data exchanges between marketers and consumers, with special attention to the privacy challenges that are brought about by new information technologies. The purpose of this book is to provide a background source to help the reader think more deeply about the impact of privacy issues on both consumers and marketers. It covers topics such as: why privacy is needed, the technological, historical and academic theories of privacy, how market exchange affects privacy, what privacy harms and protections are available, and what is the likely future of privacy.

  12. E-Commerce and Privacy: Conflict and Opportunity.

    Science.gov (United States)

    Farah, Badie N.; Higby, Mary A.

    2001-01-01

    Electronic commerce has intensified conflict between businesses' need to collect data and customers' desire to protect privacy. Web-based privacy tools and legislation could add to the costs of e-commerce and reduce profitability. Business models not based on profiling customers may be needed. (SK)

  13. Data Security and Privacy in Cloud Computing

    OpenAIRE

    Yunchuan Sun; Junsheng Zhang; Yongping Xiong; Guangyu Zhu

    2014-01-01

    Data security has consistently been a major issue in information technology. In the cloud computing environment, it becomes particularly serious because the data is located in different places, even all over the globe. Data security and privacy protection are the two main factors of users’ concerns about the cloud technology. Though many techniques on these topics have been investigated in both academia and industry, data security and privacy protection are becoming more important...

  14. Fuzzy Privacy Decision for Context-Aware Access Personal Information

    Institute of Scientific and Technical Information of China (English)

    ZHANG Qingsheng; QI Yong; ZHAO Jizhong; HOU Di; NIU Yujie

    2007-01-01

    A context-aware privacy protection framework was designed for context-aware services, together with privacy control methods for access to personal information in pervasive environments. In the process of a user's privacy decision, it can produce a fuzzy privacy decision as personal information sensitivity and personal information receiver trust change. An uncertain privacy decision model for personal information disclosure was proposed, based on changes in personal information receiver trust and personal information sensitivity. A fuzzy privacy decision information system was designed according to this model. Personal privacy control policies can be extracted from this information system by using rough set theory. This also solves the problem of learning privacy control policies for personal information disclosure.

  15. Role-task conditional-purpose policy model for privacy preserving data publishing

    Directory of Open Access Journals (Sweden)

    Rana Elgendy

    2017-12-01

    Full Text Available Privacy has become a major concern for both consumers and enterprises; therefore many research efforts have been devoted to the development of privacy preserving technology. The challenge in data privacy is to share the data while assuring the protection of personal information. Data privacy includes assuring protection from both insider and outsider threats even if the data is published. Access control can help to protect the data from outsider threats. Access control is defined as the process of mediating every request to resources and data maintained by a system and determining whether the request should be granted or denied. This can be enforced by a mechanism implementing regulations established by a security policy. In this paper, we present a privacy preserving data publishing model based on the integration of CPBAC, MD-TRBAC, PBFW, a protection-against-database-administrator technique inspired by the Oracle Vault technique, and the benefits of anonymization to protect data when it is published, using k-anonymity. The proposed model meets the requirements of workflow and non-workflow systems in enterprise environments. It is based on the characteristics of conditional purposes, conditional roles, tasks, and policies. It guarantees protection against insider threats such as the database administrator. Finally, it assures the needed protection in case the data is published. Keywords: Database security, Access control, Data publishing, Anonymization

  16. Privacy in the Internet: Myth or reality

    Directory of Open Access Journals (Sweden)

    Mikarić Bratislav

    2016-01-01

    Full Text Available The present time, unthinkable without using the Internet - from e-mail, through social networks, cloud services, GPS, to YouTube and mobile computing, in business as well as on a private level - poses a question: is there a way to protect data and their privacy on the Internet? What are the ways to control what personal information we will publicly share with others, and is there a safe way to protect privacy on the world's global computer network? The paper gives an overview of the situation in the field, as well as tips for achieving the desired level of data protection.

  17. Big Data and Consumer Participation in Privacy Contracts: Deciding who Decides on Privacy

    Directory of Open Access Journals (Sweden)

    Michiel Rhoen

    2015-02-01

    Full Text Available Big data puts data protection to the test. Consumers granting permission to process their personal data are increasingly opening up their personal lives, thanks to the “datafication” of everyday life, indefinite data retention and the increasing sophistication of algorithms for analysis. The privacy implications of big data call for serious consideration of consumers’ opportunities to participate in decision-making processes about their contracts. If these opportunities are insufficient, the resulting rules may represent special interests rather than consumers’ needs. This may undermine the legitimacy of big data applications. This article argues that providing sufficient consumer participation in privacy matters requires choosing the best available decision-making mechanism. Is a consumer to negotiate his own privacy terms in the market, will lawmakers step in on his behalf, or is he to seek protection through the courts? Furthermore, is this a matter of national law or European law? These choices will affect the opportunities for achieving different policy goals associated with the possible benefits of the “big data revolution”.

  18. Security, privacy, and confidentiality issues on the Internet

    Science.gov (United States)

    Kelly, Grant; McKenzie, Bruce

    2002-01-01

    We introduce the issues around protecting information about patients and related data sent via the Internet. We begin by reviewing three concepts necessary to any discussion about data security in a healthcare environment: privacy, confidentiality, and consent. We also give some advice on how to protect local data. Authentication and privacy of e-mail via encryption are offered by Pretty Good Privacy (PGP) and Secure Multipurpose Internet Mail Extensions (S/MIME). The de facto Internet standard for encrypting Web-based information interchanges is Secure Sockets Layer (SSL), more recently known as Transport Layer Security (TLS). There is a public key infrastructure process to `sign' a message, whereby a hash of the message is encrypted with the sender's private key; the result can then be verified with the sender's public key. This ensures the data's authenticity and origin without conferring privacy, and is called a `digital signature'. The best protection against viruses is not opening e-mails from unknown sources or those containing unusual message headers. PMID:12554559
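The hash-then-sign flow described above can be sketched with deliberately tiny toy RSA numbers. This is an illustration of the principle only: the key values are classic textbook toys, and real digital signatures use 2048-bit (or larger) keys through a vetted library, as in S/MIME or TLS.

```python
import hashlib

# Toy RSA keypair with tiny primes -- illustrative only, NOT secure.
p, q = 61, 53
n = p * q          # modulus 3233
e = 17             # public exponent
d = 2753           # private exponent: (e * d) mod lcm(p-1, q-1) == 1

def sign(message: bytes) -> int:
    """Hash the message, then 'encrypt' the hash with the private key."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)

def verify(message: bytes, signature: int) -> bool:
    """Recover the hash with the public key and compare."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == h

msg = b"patient record"
sig = sign(msg)
print(verify(msg, sig))  # → True
```

Verification succeeds only for the original message, which gives the authenticity and origin guarantees described in the abstract without hiding the message itself.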

  19. On Secret Sharing with Nonlinear Product Reconstruction

    DEFF Research Database (Denmark)

    Cascudo Pueyo, Ignacio; Cramer, Ronald; Mirandola, Diego

    2015-01-01

    Multiplicative linear secret sharing is a fundamental notion in the area of secure multiparty computation and, since recently, in the area of two-party cryptography as well. In a nutshell, this notion guarantees that the product of two secrets is obtained as a linear function of the vector consisting of the coordinate-wise products of the two respective share vectors. This paper asks what happens when the linearity condition is relaxed to allow a not-necessarily-linear “product reconstruction function.” Is the resulting notion equivalent to multiplicative linear secret sharing? We show the (perhaps somewhat counterintuitive) result that this relaxed notion is strictly more general. Concretely, fix a finite field $\mathbb{F}_q$ as the base field over which linear secret sharing is considered. Then we show there exists an (exotic) linear secret sharing scheme with an unbounded number of players $n$ such that it has $t$-privacy with $t = \Omega(n)$ and such that it does admit a product reconstruction function, yet this function is necessarily nonlinear. In addition, we...

  20. An Effective Grouping Method for Privacy-Preserving Bike Sharing Data Publishing

    Directory of Open Access Journals (Sweden)

    A S M Touhidul Hasan

    2017-10-01

    Full Text Available Bike sharing programs are eco-friendly transportation systems that are widespread in smart city environments. In this paper, we study the problem of privacy-preserving bike sharing microdata publishing. Bike sharing systems collect visiting information along with user identity and make it public after removing the user identity. Even after excluding user identification, the published bike sharing dataset will not be protected against privacy disclosure risks. An adversary may link published datasets based on bikes’ visiting information to breach a user’s privacy. In this paper, we propose a grouping-based anonymization method to protect the published bike sharing dataset from linking attacks. The proposed grouping method ensures that the published bike sharing microdata will be protected from disclosure risks. Experimental results show that our approach can protect user privacy in the released datasets while keeping more data utility than existing methods.

  1. GAIN RATIO BASED FEATURE SELECTION METHOD FOR PRIVACY PRESERVATION

    Directory of Open Access Journals (Sweden)

    R. Praveena Priyadarsini

    2011-04-01

    Full Text Available Privacy preservation is a step in data mining that tries to safeguard sensitive information from unsanctioned disclosure, hence protecting individual data records and their privacy. There are various privacy preservation techniques, such as k-anonymity, l-diversity, t-closeness and data perturbation. In this paper the k-anonymity privacy protection technique is applied to high dimensional datasets, namely Adult and Census. Since both datasets are high dimensional, a feature subset selection method, Gain Ratio, is applied: the attributes of the datasets are ranked and low ranking attributes are filtered out to form new reduced data subsets. The k-anonymization privacy preservation technique is then applied to the reduced datasets. The accuracy of the privacy preserved reduced datasets and the original datasets is compared on two data mining tasks, classification and clustering, using the naïve Bayesian and k-means algorithms respectively. Experimental results show that classification and clustering accuracy are comparable between the reduced k-anonymized datasets and the original datasets.
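The k-anonymity property applied above can be sketched minimally: a table is k-anonymous with respect to its quasi-identifiers when every combination of quasi-identifier values occurs in at least k records. The toy table below is invented for illustration; the paper works on the Adult and Census datasets.

```python
from collections import Counter

# Toy table with generalized quasi-identifiers (age range, masked ZIP).
records = [
    {"age": "30-40", "zip": "541**", "disease": "flu"},
    {"age": "30-40", "zip": "541**", "disease": "cold"},
    {"age": "30-40", "zip": "541**", "disease": "flu"},
    {"age": "20-30", "zip": "542**", "disease": "asthma"},
    {"age": "20-30", "zip": "542**", "disease": "flu"},
]

def k_of(table, quasi_ids):
    """Smallest equivalence-class size over the quasi-identifier columns."""
    groups = Counter(tuple(r[q] for q in quasi_ids) for r in table)
    return min(groups.values())

print(k_of(records, ["age", "zip"]))  # → 2: the table is 2-anonymous
```

Any record's quasi-identifier combination matches at least one other record, so a linking attack on (age, zip) cannot single out an individual.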

  2. ‘Regulating’ Online Data Privacy

    Directory of Open Access Journals (Sweden)

    Paul Reid

    2004-09-01

    Full Text Available With existing data protection laws proving inadequate in the fight to protect online data privacy, and with the offline law of privacy in a state of change and uncertainty, the search for an alternative solution to the important problem of online data privacy should commence. With the inherent problem of jurisdiction that the Internet presents, such a solution is best coming from a multi-national body with the power to approximate laws in as many jurisdictions as possible, with a recognised authority and a functioning enforcement mechanism. The European Union is such a body and, while existing data protection laws stem from the EU, they were neither tailored specifically for the Internet and the online world, nor do they fully harmonise the laws of the member states – an essential element in Internet regulation. Current laws face further problems with the ease and frequency of data transfers outwith the EU. An Internet-specific online data privacy regulation would fully approximate the laws of the twenty-five member states and, if suitably drafted, could perhaps, drawing upon EC competition jurisprudence, achieve a degree of extraterritoriality, thus combating the problem posed by transfers outwith the EU. Any solution, however, is dependent upon our political leaders having the political will and courage to reach an agreement upon any new law.

  3. Practical security and privacy attacks against biometric hashing using sparse recovery

    Science.gov (United States)

    Topcu, Berkay; Karabat, Cagatay; Azadmanesh, Matin; Erdogan, Hakan

    2016-12-01

    Biometric hashing is a cancelable biometric verification method that has received research interest recently. This method can be considered a two-factor authentication method which combines a personal password (or secret key) with a biometric to obtain a secure binary template which is used for authentication. We present novel practical security and privacy attacks against biometric hashing when the attacker is assumed to know the user's password, in order to quantify the additional protection due to biometrics when the password is compromised. We present four methods that can reconstruct a biometric feature and/or the image from a hash and one method which can find the closest biometric data (i.e., face image) from a database. Two of the reconstruction methods are based on 1-bit compressed sensing signal reconstruction, for which the data acquisition scenario is very similar to biometric hashing. Previous literature introduced simple attack methods, but we show that a higher level of security threat can be achieved using compressed sensing recovery techniques. In addition, we present privacy attacks which reconstruct a biometric image that resembles the original image. We quantify the performance of the attacks using detection error tradeoff curves and equal error rates under advanced attack scenarios. We show that conventional biometric hashing methods suffer from high security and privacy leaks under practical attacks, and we believe more advanced hash generation methods are necessary to avoid these attacks.

  4. Quantifying privacy and security of biometric fuzzy commitment

    NARCIS (Netherlands)

    Zhou, Xuebing; Kuijper, Arjan; Veldhuis, Raymond N.J.; Busch, Christoph

    2011-01-01

    Fuzzy commitment is an efficient template protection algorithm that can improve security and safeguard privacy of biometrics. Existing theoretical security analysis has proved that although privacy leakage is unavoidable, perfect security from information-theoretical points of view is possible when

  5. Toward sensitive document release with privacy guarantees

    OpenAIRE

    David Sánchez; Montserrat Batet

    2017-01-01

    Toward sensitive document release with privacy guarantees. DOI: 10.1016/j.engappai.2016.12.013 URL: http://www.sciencedirect.com/science/article/pii/S0952197616302408 Privacy has become a serious concern for modern Information Societies. The sensitive nature of much of the data that are daily exchanged or released to untrusted parties requires that responsible organizations undertake appropriate privacy protection measures. Nowadays, much...

  6. The study on privacy preserving data mining for information security

    Science.gov (United States)

    Li, Xiaohui

    2012-04-01

    Privacy preserving data mining has undergone rapid development in a short period, but it still faces many challenges. Firstly, the level of privacy has different definitions in different fields; accordingly, the measure by which privacy preserving data mining technology protects private information is not uniform, so presenting a unified privacy definition and measure is an urgent issue. Secondly, most research in privacy preserving data mining is presently confined to theoretical study.

  7. Protecting patient privacy when sharing patient-level data from clinical trials.

    Science.gov (United States)

    Tucker, Katherine; Branson, Janice; Dilleen, Maria; Hollis, Sally; Loughlin, Paul; Nixon, Mark J; Williams, Zoë

    2016-07-08

    Greater transparency and, in particular, sharing of patient-level data for further scientific research is an increasingly important topic for the pharmaceutical industry and other organisations who sponsor and conduct clinical trials as well as generally in the interests of patients participating in studies. A concern remains, however, over how to appropriately prepare and share clinical trial data with third party researchers, whilst maintaining patient confidentiality. Clinical trial datasets contain very detailed information on each participant. Risk to patient privacy can be mitigated by data reduction techniques. However, retention of data utility is important in order to allow meaningful scientific research. In addition, for clinical trial data, an excessive application of such techniques may pose a public health risk if misleading results are produced. After considering existing guidance, this article makes recommendations with the aim of promoting an approach that balances data utility and privacy risk and is applicable across clinical trial data holders. Our key recommendations are as follows: 1. Data anonymisation/de-identification: Data holders are responsible for generating de-identified datasets which are intended to offer increased protection for patient privacy through masking or generalisation of direct and some indirect identifiers. 2. Controlled access to data, including use of a data sharing agreement: A legally binding data sharing agreement should be in place, including agreements not to download or further share data and not to attempt to seek to identify patients. Appropriate levels of security should be used for transferring data or providing access; one solution is use of a secure 'locked box' system which provides additional safeguards. This article provides recommendations on best practices to de-identify/anonymise clinical trial data for sharing with third-party researchers, as well as controlled access to data and data sharing

  8. A multi-agent approach: To preserve user information privacy for a pervasive and ubiquitous environment

    Directory of Open Access Journals (Sweden)

    Chandramohan Dhasarathan

    2015-03-01

    Full Text Available Cloud users’ data are becoming insecure amid current technological advancement. This research focuses on proposing a secure model to maintain secrecy in a cloud environment using an intelligent agent. This paper presents an intelligent model to protect users’ valuable personal data. Preserving proprietors’ data and information in the cloud is one of the most challenging missions for a cloud provider. Many researchers have devoted their valuable time to discovering techniques, algorithms and protocols that solve the secrecy issue and develop a full-fledged cloud computing standard structure as the newest form of computing for all cloud users. Some researchers came forward with cryptographic techniques, cyber middleware techniques, noise injection and third-party layer techniques to preserve the privacy of data in the cloud. We propose a hybrid authentication technique as an end point lock. It is a composite model coupled with an algorithm for users’ privacy preservation, namely Hash Diff Anomaly Detection and Prevention (HDAD). This algorithmic protocol acts intelligently as a privacy preserving model and technique to ensure that users’ data are kept more secret and to develop an endorsed trust in providers. We also explore the high necessity of maintaining the confidentiality of cloud users’ data.

  9. Extending SQL to Support Privacy Policies

    Science.gov (United States)

    Ghazinour, Kambiz; Pun, Sampson; Majedi, Maryam; Chinaci, Amir H.; Barker, Ken

    Increasing concerns over Internet applications that violate user privacy by exploiting (back-end) database vulnerabilities must be addressed to protect both customer privacy and to ensure corporate strategic assets remain trustworthy. This chapter describes an extension onto database catalogues and Structured Query Language (SQL) for supporting privacy in Internet applications, such as in social networks, e-health, e-government, etc. The idea is to introduce new predicates to SQL commands to capture common privacy requirements, such as purpose, visibility, generalization, and retention for both mandatory and discretionary access control policies. The contribution is that corporations, when creating the underlying databases, will be able to define what their mandatory privacy policies are, with which all application users have to comply. Furthermore, each application user, when providing their own data, will be able to define their own privacy policies with which other users have to comply. The extension is supported with underlying catalogues and algorithms. The experiments demonstrate a very reasonable overhead for the extension. The result is a low-cost mechanism to create new systems that are privacy aware and also to transform legacy databases to their privacy-preserving equivalents. Although the examples are from social networks, one can apply the results to data security and user privacy of other enterprises as well.

  10. Privacy and Property? Multi-level Strategies for Protecting Personal Interests in Genetic Material

    OpenAIRE

    Laurie, Graeme

    2003-01-01

    The paper builds on earlier medico-legal work by Laurie on privacy in relation to genetic material. In this chapter, the author discusses not only Laurie's views as 'pro-privacy' but the limitations of privacy, particularly once information, genetic or otherwise, enters a public sphere. The article draws on cases and laws in the UK, continental Europe, and the US, to provide a comparative view in suggesting an alternative approach to privacy.

  11. 78 FR 23810 - Privacy Act System of Records

    Science.gov (United States)

    2013-04-22

    ... SMALL BUSINESS ADMINISTRATION Privacy Act System of Records AGENCY: Small Business Administration. ACTION: Notice of new Privacy Act system of records and request for comment. SUMMARY: The Small Business... the protected information collected from applicants and participants in the Small Business Innovation...

  12. Achieving Network Level Privacy in Wireless Sensor Networks†

    Science.gov (United States)

    Shaikh, Riaz Ahmed; Jameel, Hassan; d’Auriol, Brian J.; Lee, Heejo; Lee, Sungyoung; Song, Young-Jae

    2010-01-01

    Full network level privacy has often been categorized into four sub-categories: Identity, Route, Location and Data privacy. Achieving full network level privacy is a critical and challenging problem due to the constraints imposed by the sensor nodes (e.g., energy, memory and computation power), sensor networks (e.g., mobility and topology) and QoS issues (e.g., packet reachability and timeliness). In this paper, we propose two new identity, route and location privacy algorithms and a data privacy mechanism that address this problem. The proposed solutions provide additional trustworthiness and reliability at a modest cost of memory and energy. Also, we prove that our proposed solutions provide protection against various privacy disclosure attacks, such as eavesdropping and hop-by-hop trace back attacks. PMID:22294881

  13. Achieving Network Level Privacy in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Sungyoung Lee

    2010-02-01

    Full Text Available Full network level privacy has often been categorized into four sub-categories: Identity, Route, Location and Data privacy. Achieving full network level privacy is a critical and challenging problem due to the constraints imposed by the sensor nodes (e.g., energy, memory and computation power), sensor networks (e.g., mobility and topology) and QoS issues (e.g., packet reachability and timeliness). In this paper, we propose two new identity, route and location privacy algorithms and a data privacy mechanism that address this problem. The proposed solutions provide additional trustworthiness and reliability at a modest cost of memory and energy. Also, we prove that our proposed solutions provide protection against various privacy disclosure attacks, such as eavesdropping and hop-by-hop trace back attacks.

  14. Applying secret sharing for HIS backup exchange.

    Science.gov (United States)

    Kuroda, Tomohiro; Kimura, Eizen; Matsumura, Yasushi; Yamashita, Yoshinori; Hiramatsu, Haruhiko; Kume, Naoto; Sato, Atsushi

    2013-01-01

    Securing business continuity is indispensable for hospitals to fulfill their social responsibility under disasters. Although backing up the data of the hospital information system (HIS) at multiple remote sites is a key strategy of a business continuity plan (BCP), the requirements for handling privacy-sensitive data jack up the cost of the backup. Secret sharing is a method of splitting an original secret message so that each individual piece is meaningless, while putting a sufficient number of pieces together reveals the original message. The secret sharing method makes it easy to exchange HIS backups between multiple hospitals. This paper evaluated the feasibility of a commercial secret sharing solution for HIS backup through several simulations. The result shows that the commercial solution is feasible for realizing a reasonable HIS backup exchange platform once a template contract between participating hospitals is ready.
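The split-and-reconstruct principle described above can be sketched with a minimal (t, n) Shamir scheme over a prime field: any t shares recover the secret, while fewer reveal nothing. This illustrates the general technique only, not the commercial solution evaluated in the paper.

```python
import random

P = 2**127 - 1  # a Mersenne prime, large enough for a small numeric secret

def split(secret, t, n):
    """Create n shares of `secret` with reconstruction threshold t."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    poly = lambda x: sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret from t shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = split(123456789, t=3, n=5)
print(reconstruct(shares[:3]) == 123456789)  # → True: any 3 of 5 suffice
```

Each hospital would hold one share of the encoded backup; a single compromised site learns nothing, which is what eases the privacy requirements on the exchange.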

  15. Privacy and User Experience in 21st Century Library Discovery

    Directory of Open Access Journals (Sweden)

    Shayna Pekala

    2017-06-01

    Full Text Available Over the last decade, libraries have taken advantage of emerging technologies to provide new discovery tools to help users find information and resources more efficiently. In the wake of this technological shift in discovery, privacy has become an increasingly prominent and complex issue for libraries. The nature of the web, over which users interact with discovery tools, has substantially diminished the library’s ability to control patron privacy. The emergence of a data economy has led to a new wave of online tracking and surveillance, in which multiple third parties collect and share user data during the discovery process, making it much more difficult, if not impossible, for libraries to protect patron privacy. In addition, users are increasingly starting their searches with web search engines, diminishing the library’s control over privacy even further. While libraries have a legal and ethical responsibility to protect patron privacy, they are simultaneously challenged to meet evolving user needs for discovery. In a world where “search” is synonymous with Google, users increasingly expect their library discovery experience to mimic their experience using web search engines. However, web search engines rely on a drastically different set of privacy standards, as they strive to create tailored, personalized search results based on user data. Libraries are seemingly forced to make a choice between delivering the discovery experience users expect and protecting user privacy. This paper explores the competing interests of privacy and user experience, and proposes possible strategies to address them in the future design of library discovery tools.

  16. Trust-aware Privacy Control for Social Media

    OpenAIRE

    Li, Na; Najafian-Razavi, Maryam; Gillet, Denis

    2011-01-01

    Due to the huge exposure of personal information in social media, a challenge now is to design effective privacy mechanisms that protect against unauthorized access to social data. In this paper, a trust model for social media is first presented. Based on the trust model, a trust-aware privacy control protocol is proposed, that exploits the underlying inter-entity trust information. The objective is to design a fine-grained privacy scheme that ensures a user’s online information is disclosed ...

  17. The protection of the right to privacy as the social imperative of digital age: How vulnerable are we?

    Directory of Open Access Journals (Sweden)

    Levakov-Vermezović Tijana

    2016-01-01

    Full Text Available The paper examines various forms of jeopardizing the privacy of individuals in the digital world, with specific focus on the criminal protection provided by the current international and national legal framework and the jurisprudence of the European Court of Human Rights. Conducting this scientific research is essential considering that we live in an era of electronic communications in which no one is anonymous. The development of information and communication technologies has brought, among its many advantages, various challenges in all spheres of modern life. Since the Internet has become a global forum, individuals have increasingly been the target of countless insults, defamation and threats; moreover, numerous pieces of personal information or media get published without consent. Practice shows that effective suppression and control of illegal behavior on the Internet, and punishment of the perpetrators, remain at a rudimentary level. In order to provide proper protection for the victims of criminal offenses committed against their privacy in the digital world, it is necessary to create new models and approaches to solving this problem.

  18. Biometrics and privacy

    NARCIS (Netherlands)

    Grijpink, J.H.A.M.

    2001-01-01

    Biometrics offers many alternatives for protecting our privacy and preventing us from falling victim to crime. Biometrics can even serve as a solid basis for safe anonymous and semi-anonymous legal transactions. In this article Jan Grijpink clarifies which concepts and practical applications this

  19. Data privacy foundations, new developments and the big data challenge

    CERN Document Server

    Torra, Vicenç

    2017-01-01

    This book offers a broad, cohesive overview of the field of data privacy. It discusses, from a technological perspective, the problems and solutions of the three main communities working on data privacy: statistical disclosure control (those with a statistical background), privacy-preserving data mining (those working with data bases and data mining), and privacy-enhancing technologies (those involved in communications and security) communities. Presenting different approaches, the book describes alternative privacy models and disclosure risk measures as well as data protection procedures for respondent, holder and user privacy. It also discusses specific data privacy problems and solutions for readers who need to deal with big data.

  20. The disclosure of diagnosis codes can breach research participants' privacy.

    Science.gov (United States)

    Loukides, Grigorios; Denny, Joshua C; Malin, Bradley

    2010-01-01

    De-identified clinical data in standardized form (eg, diagnosis codes), derived from electronic medical records, are increasingly combined with research data (eg, DNA sequences) and disseminated to enable scientific investigations. This study examines whether released data can be linked with identified clinical records that are accessible via various resources to jeopardize patients' anonymity, and the ability of popular privacy protection methodologies to prevent such an attack. The study experimentally evaluates the re-identification risk of a de-identified sample of Vanderbilt's patient records involved in a genome-wide association study. It also measures the level of protection from re-identification, and data utility, provided by suppression and generalization. Privacy protection is quantified using the probability of re-identifying a patient in a larger population through diagnosis codes. Data utility is measured at a dataset level, using the percentage of retained information, as well as its description, and at a patient level, using two metrics based on the difference between the distribution of International Classification of Diseases (ICD) version 9 codes before and after applying privacy protection. More than 96% of 2800 patients' records are shown to be uniquely identified by their diagnosis codes with respect to a population of 1.2 million patients. Generalization is shown to reduce further the percentage of de-identified records by less than 2%, and over 99% of the three-digit ICD-9 codes need to be suppressed to prevent re-identification. Popular privacy protection methods are inadequate to deliver a sufficiently protected and useful result when sharing data derived from complex clinical systems. The development of alternative privacy protection models is thus required.
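The uniqueness measure underlying the study can be sketched as follows: the re-identification probability for a record is the reciprocal of the number of people in the population sharing its exact set of diagnosis codes. The code sets below are invented toy data; the actual study evaluated roughly 2800 records against a population of 1.2 million patients.

```python
from collections import Counter

# Toy population: each patient is represented by their set of ICD-9 codes.
population = [
    frozenset({"250.0"}),              # common single diagnosis
    frozenset({"250.0"}),
    frozenset({"250.0", "401.9"}),
    frozenset({"250.0", "401.9"}),
    frozenset({"042", "176.0"}),       # rare combination -- unique here
]

counts = Counter(population)

def reident_prob(codes):
    """Chance of re-identifying a patient who has exactly this code set."""
    c = counts[frozenset(codes)]
    return 1.0 / c if c else 0.0

print(reident_prob({"250.0"}))         # → 0.5
print(reident_prob({"042", "176.0"}))  # → 1.0: unique, hence re-identifiable
```

A record whose code set occurs only once in the population is uniquely identifying, which is the situation the study found for more than 96% of the sampled records.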

  1. Developing genetic privacy legislation: the South Carolina experience.

    Science.gov (United States)

    Edwards, J G; Young, S R; Brooks, K A; Aiken, J H; Patterson, E D; Pritchett, S T

    1998-01-01

    The availability of presymptomatic and predisposition genetic testing has spawned the need for legislation prohibiting health insurance discrimination on the basis of genetic information. The federal effort, the Health Insurance Portability and Accountability Act (HIPAA) of 1996, falls short by protecting only those who access insurance through group plans. A committee of University of South Carolina professionals convened in 1996 to develop legislation in support of genetic privacy for the state of South Carolina. The legislation prevents health insurance companies from denying coverage or setting insurance rates on the basis of genetic information. It also protects the privacy of genetic information and prohibits performance of genetic tests without specific informed consent. In preparing the bill, genetic privacy laws from other states were reviewed, and a modified version of the Virginia law adopted. The South Carolina Committee for the Protection of Genetic Privacy version went a step further by including enforcement language and excluding Virginia's sunset clause. The definition of genetic information encompassed genetic test results, and importantly, includes family history of genetic disease. Our experience in navigating through the state legislature and working through opposition from the health insurance lobby is detailed herein.

  2. Privacy and human behavior in the age of information.

    Science.gov (United States)

    Acquisti, Alessandro; Brandimarte, Laura; Loewenstein, George

    2015-01-30

    This Review summarizes and draws connections between diverse streams of empirical research on privacy behavior. We use three themes to connect insights from social and behavioral sciences: people's uncertainty about the consequences of privacy-related behaviors and their own preferences over those consequences; the context-dependence of people's concern, or lack thereof, about privacy; and the degree to which privacy concerns are malleable—manipulable by commercial and governmental interests. Organizing our discussion by these themes, we offer observations concerning the role of public policy in the protection of privacy in the information age. Copyright © 2015, American Association for the Advancement of Science.

  3. Robust image obfuscation for privacy protection in Web 2.0 applications

    Science.gov (United States)

    Poller, Andreas; Steinebach, Martin; Liu, Huajian

    2012-03-01

    We present two approaches to robust image obfuscation based on permutation of image regions and channel intensity modulation. The proposed concept of robust image obfuscation is a step towards end-to-end security in Web 2.0 applications. It helps to protect the privacy of the users against threats caused by internet bots and web applications that extract biometric and other features from images for data-linkage purposes. The approaches described in this paper consider that images uploaded to Web 2.0 applications pass several transformations, such as scaling and JPEG compression, until the receiver downloads them. In contrast to existing approaches, our focus is on usability; therefore, the primary goal is not a maximum of security but an acceptable trade-off between security and resulting image quality.
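The region-permutation idea can be sketched as follows (our own minimal illustration, not the paper's scheme: it keys a tile shuffle to a shared secret and is exactly reversible, but unlike the robust approach described above it would not survive scaling or JPEG recompression; `permute_tiles` and its parameters are assumed names):

```python
import numpy as np

def permute_tiles(img, key, tile=8, inverse=False):
    """Obfuscate (or recover) an image by shuffling its square tiles
    with a permutation derived from a shared key. Pixels outside the
    tile grid (when dimensions are not multiples of `tile`) are left as-is."""
    h, w = img.shape[:2]
    th, tw = h // tile, w // tile
    tiles = [img[r*tile:(r+1)*tile, c*tile:(c+1)*tile].copy()
             for r in range(th) for c in range(tw)]
    perm = np.random.default_rng(key).permutation(len(tiles))
    if inverse:
        perm = np.argsort(perm)  # invert the keyed permutation
    out = img.copy()
    for dst, src in enumerate(perm):
        r, c = divmod(dst, tw)
        out[r*tile:(r+1)*tile, c*tile:(c+1)*tile] = tiles[src]
    return out
```

A receiver who knows `key` recovers the original with `permute_tiles(obfuscated, key, inverse=True)`; without the key, feature extractors see only scrambled regions.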

  4. Privacy-Preserving Matching of Spatial Datasets with Protection against Background Knowledge

    DEFF Research Database (Denmark)

    Ghinita, Gabriel; Vicente, Carmen Ruiz; Shang, Ning

    2010-01-01

    should be disclosed. Previous research efforts focused on private matching for relational data, and rely either on spaceembedding or on SMC techniques. Space-embedding transforms data points to hide their exact attribute values before matching is performed, whereas SMC protocols simulate complex digital...... circuits that evaluate the matching condition without revealing anything else other than the matching outcome. However, existing solutions have at least one of the following drawbacks: (i) they fail to protect against adversaries with background knowledge on data distribution, (ii) they compromise privacy...... by returning large amounts of false positives and (iii) they rely on complex and expensive SMC protocols. In this paper, we introduce a novel geometric transformation to perform private matching on spatial datasets. Our method is efficient and it is not vulnerable to background knowledge attacks. We consider...

  5. Exploring the Perceived Measures of Privacy: RFID in Public Applications

    Directory of Open Access Journals (Sweden)

    Mohammad Alamgir Hossain

    2014-06-01

    Full Text Available The purpose of this study is to explore the measures that may protect the privacy of users in the context of RFID use in public applications. More specifically, this study investigates what measures the users perceive as securing their privacy, particularly for RFID applications in public uses. A qualitative research approach has been utilised for this study. The author conducted two focus-group discussion sessions and eight in-depth interviews in two countries: one from the Australasia region (Australia) and the other from Asia (Bangladesh), assuming that the status, and the perceptions and tolerance of the citizens on privacy issues are different in the stated regions. The explored factors have been analysed from privacy perspectives. The findings show that, in developed and developing countries, the basic perceptions of the users on privacy protection are complementary; however, privacy is a more serious concern in Australia than in Bangladesh. Data analysis proposed some attributes that may improve users’ privacy perceptions when RFID is used in public applications. This study is the single initiative that focuses on privacy of RFID users from a national-use context. As a practical implication, the proposed attributes can be exercised by the deploying agencies that implement RFID technology for citizens’ use.

  6. Through Patients' Eyes: Regulation, Technology, Privacy, and the Future.

    Science.gov (United States)

    Petersen, Carolyn

    2018-04-22

    Privacy is commonly regarded as a regulatory requirement achieved via technical and organizational management practices. Those working in the field of informatics often play a role in privacy preservation as a result of their expertise in information technology, workflow analysis, implementation science, or related skills. Viewing privacy from the perspective of patients whose protected health information is at risk broadens the considerations to include the perceived duality of privacy; the existence of privacy within a context unique to each patient; the competing needs inherent within privacy management; the need for particular consideration when data are shared; and the need for patients to control health information in a global setting. With precision medicine, artificial intelligence, and other treatment innovations on the horizon, health care professionals need to think more broadly about how to preserve privacy in a health care environment driven by data sharing. Patient-reported privacy preferences, privacy portability, and greater transparency around privacy-preserving functionalities are potential strategies for ensuring that privacy regulations are met and privacy is preserved. Georg Thieme Verlag KG Stuttgart.

  7. An innovative privacy preserving technique for incremental datasets on cloud computing.

    Science.gov (United States)

    Aldeen, Yousra Abdul Alsahib S; Salleh, Mazleena; Aljeroudi, Yazan

    2016-08-01

    Cloud computing (CC) is a magnificent service-based delivery with gigantic computer processing power and data storage across connected communications channels. It imparted overwhelming technological impetus in the internet (web) mediated IT industry, where users can easily share private data for further analysis and mining. Furthermore, user-affable CC services enable users to deploy sundry applications economically. Meanwhile, simple data sharing impelled various phishing attacks and malware assisted security threats. Some privacy sensitive applications like health services on cloud that are built with several economic and operational benefits necessitate enhanced security. Thus, absolute cyberspace security and mitigation against phishing blitz became mandatory to protect overall data privacy. Typically, diverse applications datasets are anonymized with better privacy to owners without providing all secrecy requirements to the newly added records. Some proposed techniques emphasized this issue by re-anonymizing the datasets from scratch. The utmost privacy protection over incremental datasets on CC is far from being achieved. Certainly, the distribution of huge datasets volume across multiple storage nodes limits the privacy preservation. In this view, we propose a new anonymization technique to attain better privacy protection with high data utility over distributed and incremental datasets on CC. The proficiency of data privacy preservation and improved confidentiality requirements is demonstrated through performance evaluation. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. Privacy Protection in Participatory Sensing Applications Requiring Fine-Grained Locations

    DEFF Research Database (Denmark)

    Dong, Kai; Gu, Tao; Tao, Xianping

    2010-01-01

    The emerging participatory sensing applications have brought a privacy risk where users expose their location information. Most of the existing solutions preserve location privacy by generalizing a precise user location to a coarse-grained location, and hence they cannot be applied in those appli...... provider is an untrustworthy entity, making our solution more feasible to practical applications. We present and analyze our security model, and evaluate the performance and scalability of our system....

  9. 75 FR 68852 - Privacy Act of 1974; System of Records Notice

    Science.gov (United States)

    2010-11-09

    ...., Washington, DC 20590 or [email protected] . FOR FURTHER INFORMATION CONTACT: For privacy issues please... DEPARTMENT OF TRANSPORTATION Office of the Secretary Privacy Act of 1974; System of Records Notice... Secretary of Transportation (DOT/OST) proposes to establish a DOT-wide system of records under the Privacy...

  10. The Genetic Privacy Act and commentary

    Energy Technology Data Exchange (ETDEWEB)

    Annas, G.J.; Glantz, L.H.; Roche, P.A.

    1995-02-28

    The Genetic Privacy Act is a proposal for federal legislation. The Act is based on the premise that genetic information is different from other types of personal information in ways that require special protection. Therefore, to effectively protect genetic privacy unauthorized collection and analysis of individually identifiable DNA must be prohibited. As a result, the premise of the Act is that no stranger should have or control identifiable DNA samples or genetic information about an individual unless that individual specifically authorizes the collection of DNA samples for the purpose of genetic analysis, authorized the creation of that private information, and has access to and control over the dissemination of that information.

  11. Protecting Privacy in Big Data: A Layered Approach for Curriculum Integration

    Science.gov (United States)

    Schwieger, Dana; Ladwig, Christine

    2016-01-01

    The demand for college graduates with skills in big data analysis is on the rise. Employers in all industry sectors have found significant value in analyzing both separate and combined data streams. However, news reports continue to script headlines drawing attention to data improprieties, privacy breaches and identity theft. While data privacy is…

  12. Personalized privacy-preserving frequent itemset mining using randomized response.

    Science.gov (United States)

    Sun, Chongjing; Fu, Yan; Zhou, Junlin; Gao, Hui

    2014-01-01

    Frequent itemset mining is the important first step of association rule mining, which discovers interesting patterns from massive data. There are increasing concerns about the privacy problem in frequent itemset mining. Some works have been proposed to handle this kind of problem. In this paper, we introduce a personalized privacy problem, in which different attributes may need different levels of privacy protection. To solve this problem, we give a personalized privacy-preserving method by using the randomized response technique. By providing different privacy levels for different attributes, this method can achieve higher accuracy in frequent itemset mining than the traditional method providing the same privacy level. Finally, our experimental results show that our method achieves better results in frequent itemset mining while preserving personalized privacy.
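The randomized response primitive at the heart of this approach can be sketched as follows (a minimal Warner-style sketch of our own; the function names are assumptions, and choosing a different `p` per attribute corresponds to the paper's personalized privacy levels): each respondent reports the true bit with probability `p` and a random bit otherwise, and the aggregator inverts the known noise to recover an unbiased frequency estimate for the itemset count.

```python
import random

def randomize_bit(value, p):
    """Local randomization: report the true bit with probability p,
    otherwise report a uniformly random bit. Smaller p = stronger privacy."""
    return value if random.random() < p else random.choice((0, 1))

def estimate_frequency(reports, p):
    """Invert the randomization: P(report=1) = p*f + (1-p)/2, solve for f."""
    observed = sum(reports) / len(reports)
    return (observed - (1 - p) / 2) / p

random.seed(1)
truth = [1] * 3000 + [0] * 7000          # true frequency of the item: 0.3
reports = [randomize_bit(b, p=0.7) for b in truth]
print(round(estimate_frequency(reports, p=0.7), 2))  # close to 0.3
```

Because the aggregator only ever sees noisy bits, no individual transaction is revealed, yet the itemset frequency needed for mining is still recoverable.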

  13. Digital privacy in Asia: Setting the agenda | IDRC - International ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2016-06-09

    Jun 9, 2016 ... The report, A New Dawn: Privacy in Asia, summarizes the findings of the research. ... among citizens about protecting their personal data and Internet privacy. ... A study on mobile phone use by the poor has resulted in the ...

  14. Mandatory Enforcement of Privacy Policies using Trusted Computing Principles

    NARCIS (Netherlands)

    Kargl, Frank; Schaub, Florian; Dietzel, Stefan

    Modern communication systems and information technology create significant new threats to information privacy. In this paper, we discuss the need for proper privacy protection in cooperative intelligent transportation systems (cITS), one instance of such systems. We outline general principles for

  15. Because we care: Privacy Dashboard on Firefox OS

    OpenAIRE

    Piekarska, Marta; Zhou, Yun; Strohmeier, Dominik; Raake, Alexander

    2015-01-01

    In this paper we present the Privacy Dashboard -- a tool designed to inform and empower the people using mobile devices, by introducing features such as Remote Privacy Protection, Backup, Adjustable Location Accuracy, Permission Control and Secondary-User Mode. We have implemented our solution on FirefoxOS and conducted user studies to verify the usefulness and usability of our tool. The paper starts with a discussion of different aspects of mobile privacy, how users perceive it and how much ...

  16. Privacy and security of patient data in the pathology laboratory.

    Science.gov (United States)

    Cucoranu, Ioan C; Parwani, Anil V; West, Andrew J; Romero-Lauro, Gonzalo; Nauman, Kevin; Carter, Alexis B; Balis, Ulysses J; Tuthill, Mark J; Pantanowitz, Liron

    2013-01-01

    Data protection and security are critical components of routine pathology practice because laboratories are legally required to securely store and transmit electronic patient data. With increasing connectivity of information systems, laboratory work-stations, and instruments themselves to the Internet, the demand to continuously protect and secure laboratory information can become a daunting task. This review addresses informatics security issues in the pathology laboratory related to passwords, biometric devices, data encryption, internet security, virtual private networks, firewalls, anti-viral software, and emergency security situations, as well as the potential impact that newer technologies such as mobile devices have on the privacy and security of electronic protected health information (ePHI). In the United States, the Health Insurance Portability and Accountability Act (HIPAA) governs the privacy and protection of medical information and health records. The HIPAA security standards final rule mandates administrative, physical, and technical safeguards to ensure the confidentiality, integrity, and security of ePHI. Importantly, security failures often lead to privacy breaches, invoking the HIPAA privacy rule as well. Therefore, this review also highlights key aspects of HIPAA and its impact on the pathology laboratory in the United States.

  17. Privacy and security of patient data in the pathology laboratory

    Directory of Open Access Journals (Sweden)

    Ioan C Cucoranu

    2013-01-01

    Full Text Available Data protection and security are critical components of routine pathology practice because laboratories are legally required to securely store and transmit electronic patient data. With increasing connectivity of information systems, laboratory work-stations, and instruments themselves to the Internet, the demand to continuously protect and secure laboratory information can become a daunting task. This review addresses informatics security issues in the pathology laboratory related to passwords, biometric devices, data encryption, internet security, virtual private networks, firewalls, anti-viral software, and emergency security situations, as well as the potential impact that newer technologies such as mobile devices have on the privacy and security of electronic protected health information (ePHI). In the United States, the Health Insurance Portability and Accountability Act (HIPAA) governs the privacy and protection of medical information and health records. The HIPAA security standards final rule mandates administrative, physical, and technical safeguards to ensure the confidentiality, integrity, and security of ePHI. Importantly, security failures often lead to privacy breaches, invoking the HIPAA privacy rule as well. Therefore, this review also highlights key aspects of HIPAA and its impact on the pathology laboratory in the United States.

  18. Keeping our patients' secrets.

    Science.gov (United States)

    Clough, J D; Rowan, D W; Nickelson, D E

    1999-10-01

    Protecting the privacy of the patient's medical record is a central issue in current discussions about a patient bill of rights, and controversy over a proposed "unique health identifier" has raised the decibel level of these discussions. At the heart of the debate is how best to resolve the inherent conflict between the individual's right to privacy and the need for access to patients' health information for reasons of public health, research, and health care management.

  19. Reclaiming Data Ownership: Differential Privacy in a Decentralized Setting

    OpenAIRE

    Asplund, Alexander Benjamin; Hartvigsen, Peter F

    2015-01-01

    In the field of privacy-preserving data mining the common practice has been to gather data from the users, centralize it in a single database, and employ various anonymization techniques to protect the personally identifiable information contained within the data. Both theoretical analyses and real-world examples of data breaches have proven that these methods have severe shortcomings in protecting an individual's privacy. A major breakthrough was achieved in 2006 when a method called differ...

  20. Privacy and Anonymity in the Information Society – Challenges for the European Union

    Directory of Open Access Journals (Sweden)

    Ioannis A. Tsoukalas

    2011-01-01

    Full Text Available Electronic information is challenging traditional views on property and privacy. The explosion of digital data, driven by novel web applications, social networking, and mobile devices makes data security and the protection of privacy increasingly difficult. Furthermore, biometric data and radiofrequency identification applications enable correlations that are able to trace our cultural, behavioral, and emotional states. The concept of privacy in the digital realm is transformed and emerges as one of the biggest risks facing today's Information Society. In this context, the European Union (EU policy-making procedures strive to adapt to the pace of technological advancement. The EU needs to improve the existing legal frameworks for privacy and data protection. It needs to work towards a “privacy by education” approach for the empowerment of “privacy-literate” European digital citizens.

  1. Security and privacy in biometrics

    CERN Document Server

    Campisi, Patrizio

    2013-01-01

    This important text/reference presents the latest secure and privacy-compliant techniques in automatic human recognition. Featuring viewpoints from an international selection of experts in the field, the comprehensive coverage spans both theory and practical implementations, taking into consideration all ethical and legal issues. Topics and features: presents a unique focus on novel approaches and new architectures for unimodal and multimodal template protection; examines signal processing techniques in the encrypted domain, security and privacy leakage assessment, and aspects of standardizati

  2. 77 FR 46643 - Children's Online Privacy Protection Rule

    Science.gov (United States)

    2012-08-06

    ... providing notice to and obtaining consent from parents. Conversely, online services whose business models..., challenging others to gameplay, swapping digital collectibles, participating in monitored `chat' with... Digital Democracy (``CDD''), Consumers Union (``CU''), and the Electronic Privacy Information Center...

  3. Information Privacy: The Attitudes and Behaviours of Internet Users

    OpenAIRE

    Jakovljević, Marija

    2011-01-01

    The rise of electronic commerce and the Internet has created new technologies and capabilities, which increase concern for privacy online. This study reports on the results of an investigation of Internet users' attitudes towards concern for privacy online, online behaviours adopted under varying levels of concern for privacy (high, moderate and low) and the types of information Internet users are protective of. Methodological triangulation was used, whereby both quantitative and qualitative ...

  4. Preserving Differential Privacy for Similarity Measurement in Smart Environments

    Directory of Open Access Journals (Sweden)

    Kok-Seng Wong

    2014-01-01

    Full Text Available Advances in both sensor technologies and network infrastructures have encouraged the development of smart environments to enhance people’s lives and lifestyles. However, collecting and storing users’ data in the smart environments pose severe privacy concerns because these data may contain sensitive information about the subject. Hence, privacy protection is now an emerging issue that we need to consider especially when data sharing is essential for analysis purposes. In this paper, we consider the case where two agents in the smart environment want to measure the similarity of their collected or stored data. We use the similarity coefficient function FSC as the measurement metric for the comparison under a differential privacy model. Unlike existing solutions, our protocol can facilitate more than one request to compute FSC without modifying the protocol. Our solution ensures privacy protection for both the inputs and the computed FSC results.
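One common way to realize differentially private release of a similarity score, shown here as a single-party sketch of our own rather than the paper's two-agent FSC protocol, is to compute a set-similarity coefficient (Jaccard, used as a stand-in for FSC) and perturb it with Laplace noise scaled to sensitivity/ε:

```python
import math
import random

def laplace_noise(scale):
    """Sample Laplace(0, scale) via the inverse-CDF transform."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def private_similarity(a, b, epsilon, sensitivity=1.0):
    """Jaccard similarity of two sets, released with Laplace noise
    calibrated for epsilon-differential privacy."""
    true_sim = len(a & b) / len(a | b)
    return true_sim + laplace_noise(sensitivity / epsilon)

random.seed(0)
a = {"hr", "temp", "motion"}
b = {"temp", "motion", "co2"}
print(private_similarity(a, b, epsilon=1.0))  # 0.5 (= 2/4) plus noise
```

Smaller ε means larger noise and stronger privacy; repeated queries consume privacy budget, which is the issue the paper's multi-request protocol is designed to address.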

  5. What was privacy?

    Science.gov (United States)

    McCreary, Lew

    2008-10-01

    Why is that question in the past tense? Because individuals can no longer feel confident that the details of their lives--from identifying numbers to cultural preferences--will be treated with discretion rather than exploited. Even as Facebook users happily share the names of their favorite books, movies, songs, and brands, they often regard marketers' use of that information as an invasion of privacy. In this wide-ranging essay, McCreary, a senior editor at HBR, examines numerous facets of the privacy issue, from Google searches, public shaming on the internet, and cell phone etiquette to passenger screening devices, public surveillance cameras, and corporate chief privacy officers. He notes that IBM has been a leader on privacy; its policy forswearing the use of employees' genetic information in hiring and benefits decisions predated the federal Genetic Information Nondiscrimination Act by three years. Now IBM is involved in an open-source project known as Higgins to provide users with transportable, potentially anonymous online presences. Craigslist, whose CEO calls it "as close to 100% user driven as you can get," has taken an extremely conservative position on privacy--perhaps easier for a company with a declared lack of interest in maximizing revenue. But TJX and other corporate victims of security breaches have discovered that retaining consumers' transaction information can be both costly and risky. Companies that underestimate the importance of privacy to their customers or fail to protect it may eventually face harsh regulation, reputational damage, or both. The best thing they can do, says the author, is negotiate directly with those customers over where to draw the line.

  6. Privacy in the digital world: medical and health data outside of HIPAA protections.

    Science.gov (United States)

    Glenn, Tasha; Monteith, Scott

    2014-11-01

    Increasing quantities of medical and health data are being created outside of HIPAA protection, primarily by patients. Data sources are varied, including the use of credit cards for physician visit and medication co-pays, Internet searches, email content, social media, support groups, and mobile health apps. Most medical and health data not covered by HIPAA are controlled by third party data brokers and Internet companies. These companies combine this data with a wide range of personal information about consumer daily activities, transactions, movements, and demographics. The combined data are used for predictive profiling of individual health status, and often sold for advertising and other purposes. The rapid expansion of medical and health data outside of HIPAA protection is encroaching on privacy and the doctor-patient relationship, and is of particular concern for psychiatry. Detailed discussion of the appropriate handling of this medical and health data is needed by individuals with a wide variety of expertise.

  7. Public privacy: Reciprocity and Silence

    Directory of Open Access Journals (Sweden)

    Jenny Kennedy

    2014-10-01

    Full Text Available In his 1958 poem 'Dedication to my Wife' TS Eliot proclaims "these are private words addressed to you in public". Simultaneously written for his wife, Valerie Fletcher, and to the implied you of a discourse network, Eliot's poem helps to illustrate the narrative voices and silences that are constitutive of an intimate public sphere. This paper situates reciprocity as a condition of possibility for public privacy. It shows how reciprocity is enabled by systems of code operating through material and symbolic registers. Code promises to control communication, to produce neutral, systemic forms of meaning. Yet such automation is challenged by uneven and fragmented patterns of reciprocity. Moreover, examining the media of public privacy reveals historical trajectories important for understanding contemporary socio-technical platforms of reciprocity. To explore the implicit requirement of reciprocity in publicly private practices, three sites of communication are investigated framed by a media archaeology perspective: postal networks, the mail-art project PostSecret and the anonymous zine 'You'.

  8. Privacy Protection in Data Sharing : Towards Feedback Solutions

    NARCIS (Netherlands)

    R. Meijer; P. Conradie; R. Choenni; M.S. Bargh

    2014-01-01

    Sharing data is gaining importance in recent years due to proliferation of social media and a growing tendency of governments to gain citizens’ trust through being transparent. Data dissemination, however, increases chance of compromising privacy sensitive data, which undermines trust of data

  9. A Failure to "Do No Harm" -- India's Aadhaar biometric ID program and its inability to protect privacy in relation to measures in Europe and the U.S.

    Science.gov (United States)

    Dixon, Pam

    2017-01-01

    It is important that digital biometric identity systems be used by governments with a "Do No Harm" mandate, and the establishment of regulatory, enforcement and restorative frameworks ensuring data protection and privacy needs to transpire prior to the implementation of technological programs and services. However, when and where large government bureaucracies are involved, the proper planning and execution of public service programs very often result in ungainly outcomes, and are often qualitatively not guaranteeable. Several important factors, such as the strength of the political and legal systems, may affect such cases as the implementation of a national digital identity system. Digital identity policy development, as well as technical deployment of biometric technologies and enrollment processes, may all differ markedly, and could depend in some part at least on the overall economic development of the country or political jurisdiction in question, among other factors. This article focuses on the Republic of India's national digital biometric identity system, the Aadhaar, for its development, data protection and privacy policies, and impact. Two additional political jurisdictions, the European Union and the United States, are also situationally analyzed as they may be germane to data protection and privacy policies originated to safeguard biometric identities. Since biometrics are foundational elements in modern digital identity systems, expression of data protection policies that orient and direct how biometrics are to be utilized as unique identifiers is the focus of this analysis. As more of the world's economies create and elaborate capacities, capabilities and functionalities within their respective digital ambits, it is not enough to simply install suitable digital identity technologies; much, much more is durably required.
For example, both vigorous and descriptive means of data protection should be well situated within any jurisdictionally relevant

  10. Comparative Approaches to Biobanks and Privacy.

    Science.gov (United States)

    Rothstein, Mark A; Knoppers, Bartha Maria; Harrell, Heather L

    2016-03-01

    Laws in the 20 jurisdictions studied for this project display many similar approaches to protecting privacy in biobank research. Although few have enacted biobank-specific legislation, many countries address biobanking within other laws. All provide for some oversight mechanisms for biobank research, even though the nature of that oversight varies between jurisdictions. Most have some sort of controlled access system in place for research with biobank specimens. While broad consent models facilitate biobanking, countries without national or federated biobanks have been slow to adopt broad consent. International guidelines have facilitated sharing and generally take a proportional risk approach, but many countries have provisions guiding international sharing and a few even limit international sharing. Although privacy laws may not prohibit international collaborations, the multi-prong approach to privacy unique to each jurisdiction can complicate international sharing. These symposium issues can serve as a resource for explaining the sometimes intricate privacy laws in each studied jurisdiction, outlining the key issues with regards to privacy and biobanking, and serving to describe a framework for the process of harmonization of privacy laws. © 2016 American Society of Law, Medicine & Ethics.

  11. 45 CFR 2508.3 - What is the Corporation's Privacy Act policy?

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 4 2010-10-01 2010-10-01 false What is the Corporation's Privacy Act policy? 2508... NATIONAL AND COMMUNITY SERVICE IMPLEMENTATION OF THE PRIVACY ACT OF 1974 § 2508.3 What is the Corporation's Privacy Act policy? It is the policy of the Corporation to protect, preserve, and defend the right of...

  12. δ-dependency for privacy-preserving XML data publishing.

    Science.gov (United States)

    Landberg, Anders H; Nguyen, Kinh; Pardede, Eric; Rahayu, J Wenny

    2014-08-01

    An ever increasing amount of medical data, such as electronic health records, is being collected, stored, shared and managed in large online health information systems and electronic medical record systems (EMR) (Williams et al., 2001; Virtanen, 2009; Huang and Liou, 2007) [1-3]. From such rich collections, data is often published in the form of census and statistical data sets for the purpose of knowledge sharing and enabling medical research. This brings with it an increasing need for protecting individual privacy, which becomes an issue of great importance especially when information about patients is exposed to the public. While the concept of data privacy has been comprehensively studied for relational data, models and algorithms addressing the distinct differences and complex structure of XML data are yet to be explored. Currently, the common compromise method is to convert private XML data into relational data for publication. This ad hoc approach results in significant loss of useful semantic information previously carried in the private XML data. Health data often has a very complex structure, which is best expressed in XML. In fact, XML is the standard format for exchanging (e.g. HL7 version 3(1)) and publishing health information. Lack of means to deal directly with data in XML format is inevitably a serious drawback. In this paper we propose a novel privacy protection model for XML, and an algorithm for implementing this model. We provide general rules, both for transforming a private XML schema into a published XML schema, and for mapping private XML data to the new privacy-protected published XML data. In addition, we propose a new privacy property, δ-dependency, which can be applied to both relational and XML data, and that takes into consideration the hierarchical nature of sensitive data (as opposed to "quasi-identifiers"). Lastly, we provide an implementation of our model, algorithm and privacy property, and perform an experimental analysis.
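    The abstract does not reproduce the paper's transformation rules, but the general shape of mapping private XML data to a privacy-protected published copy can be sketched. This is a minimal illustration with hypothetical element names and generalization rules, not the δ-dependency algorithm itself:

```python
import xml.etree.ElementTree as ET

# Hypothetical generalization rules: replace a precise value with a coarser one.
GENERALIZE = {
    "birthdate": lambda v: v[:4],        # keep the year only
    "postcode": lambda v: v[:2] + "**",  # keep the leading digits only
}

def publish(private_xml: str) -> str:
    """Map private XML data to a privacy-protected published copy."""
    root = ET.fromstring(private_xml)
    for tag, rule in GENERALIZE.items():
        for el in root.iter(tag):
            if el.text:
                el.text = rule(el.text)
    for rec in root.iter("patient"):     # drop direct identifiers entirely
        for ident in rec.findall("name"):
            rec.remove(ident)
    return ET.tostring(root, encoding="unicode")

doc = ("<records><patient><name>Jane</name><birthdate>1980-02-14</birthdate>"
       "<postcode>5000</postcode></patient></records>")
print(publish(doc))
# <records><patient><birthdate>1980</birthdate><postcode>50**</postcode></patient></records>
```

    A real schema-driven transformation would derive these rules from the published schema rather than a hard-coded table, which is what the paper's mapping rules formalize.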

  13. Adolescents and Social Media: Privacy, Brain Development, and the Law.

    Science.gov (United States)

    Costello, Caitlin R; McNiel, Dale E; Binder, Renée L

    2016-09-01

    Adolescents under the age of 18 are not recognized in the law as adults, nor do they have the fully developed capacity of adults. Yet teens regularly enter into contractual arrangements with operators of websites to send and post information about themselves. Their level of development limits their capacity to understand the implications of online communications, yet the risks are real to adolescents' privacy and reputations. This article explores an apparent contradiction in the law: that in areas other than online communications, U.S. legal systems seek to protect minors from the limitations of youth. The Children's Online Privacy Protection Act provides some protection to the privacy of young people, but applies only to children under age 13, leaving minors of ages 13 to 17 with little legal protection in their online activities. In this article, we discuss several strategies to mitigate the risks of adolescent online activity. © 2016 American Academy of Psychiatry and the Law.

  14. Privacy in the Post-NSA Era: Time for a Fundamental Revision?

    NARCIS (Netherlands)

    van der Sloot, B.

    2014-01-01

    Big Brother Watch and others have filed a complaint against the United Kingdom under the European Convention on Human Rights about a violation of Article 8, the right to privacy. It regards the NSA affair and UK-based surveillance activities operated by secret services. The question is whether it

  15. Pervasive Computing, Privacy and Distribution of the Self

    Directory of Open Access Journals (Sweden)

    Soraj Hongladarom

    2011-05-01

    Full Text Available The emergence of what is commonly known as “ambient intelligence” or “ubiquitous computing” means that our conception of privacy and trust needs to be reconsidered. Many have voiced their concerns about the threat to privacy and the more prominent role of trust that have been brought about by emerging technologies. In this paper, I will present an investigation of what this means for the self and identity in our ambient intelligence environment. Since information about oneself can be actively distributed and processed, it is proposed that in a significant sense it is the self itself that is distributed throughout a pervasive or ubiquitous computing network when information pertaining to the self of the individual travels through the network. Hence privacy protection needs to be extended to all types of information distributed. It is also recommended that appropriately strong legislation on privacy and data protection regarding this pervasive network is necessary, but at present not sufficient, to ensure public trust. What is needed is a campaign on public awareness and positive perception of the technology.

  16. Privacy as Fundamental right:The Case of Indian AAdhar

    DEFF Research Database (Denmark)

    Khajuria, Samant; Skouby, Knud Erik; Sørensen, Lene Tolstrup

    In August 2017, a unanimous judgment by the Supreme Court of India (SCI) was a resounding victory for privacy. The ruling was the outcome of a petition challenging the constitutional validity of the Indian biometric identity scheme Aadhaar. The one-page order signed by all nine judges declares....... "The right to privacy is protected as an intrinsic part of the right to life and personal liberty under Article 21 and as a part of the freedoms guaranteed by Part III of the Constitution." Privacy is a key concern today in the emerging digital market of India. The vision of Digital India can only......-mining tool. Looking to the EU General Data Protection Regulation (GDPR), the main goal of the regulation is to build and/or increase the trust of EU citizens in using digital services. Similarly, clear, well-defined privacy regulations need to be in place in India, with heavy fines for failing to comply....

  17. Practical Secure Transaction for Privacy-Preserving Ride-Hailing Services

    Directory of Open Access Journals (Sweden)

    Chenglong Cao

    2018-01-01

    Full Text Available Ride-hailing services address the difficulty of hailing a taxi in rush hours. They are changing the way people travel and have developed rapidly in recent years. Since the service is offered over the Internet, there is a great deal of uncertainty about security and privacy. Focusing on this issue, we changed the payment pattern of existing systems and designed a privacy-protecting ride-hailing scheme. E-cash is generated by a new partially blind signature protocol that achieves e-cash unforgeability and passenger privacy. In particular, in the face of a service platform and a payment platform, a passenger remains anonymous. Additionally, a lightweight hash chain is constructed to keep e-cash divisible and reusable, which increases the practicality of transaction systems. The analysis shows that the scheme has small communication and computation costs, and it can be effectively applied in ride-hailing services with privacy protection.
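    The lightweight hash chain mentioned in the abstract can be sketched generically: the passenger commits to the end of a hash chain and reveals preimages one by one, one per unit of fare. This is a standard hash-chain micropayment construction, not the paper's exact protocol:

```python
import hashlib

def h(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def make_chain(seed: bytes, units: int) -> list[bytes]:
    """chain[0] is the root commitment; chain[i] is the token that spends i
    fare units, since hashing it i times reproduces the root."""
    chain = [seed]
    for _ in range(units):
        chain.append(h(chain[-1]))
    chain.reverse()
    return chain

def verify_payment(root: bytes, token: bytes, i: int) -> bool:
    """The platform re-hashes the revealed token i times and compares."""
    for _ in range(i):
        token = h(token)
    return token == root

chain = make_chain(b"passenger-secret-seed", 10)  # e-cash worth 10 units
assert verify_payment(chain[0], chain[3], 3)      # spend 3 units
assert not verify_payment(chain[0], chain[3], 2)
```

    Divisibility comes for free: revealing chain[i] after chain[j] (i > j) spends the difference, and each token can be checked with only hash evaluations, which keeps transaction costs small.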

  18. Evaluating Common Privacy Vulnerabilities in Internet Service Providers

    Science.gov (United States)

    Kotzanikolaou, Panayiotis; Maniatis, Sotirios; Nikolouzou, Eugenia; Stathopoulos, Vassilios

    Privacy in electronic communications receives increased attention in both research and industry forums, stemming from both the users' needs and from legal and regulatory requirements in national or international context. Privacy in internet-based communications heavily relies on the level of security of the Internet Service Providers (ISPs), as well as on the security awareness of the end users. This paper discusses the role of the ISP in the privacy of the communications. Based on real security audits performed in nation-wide ISPs, we illustrate privacy-specific threats and vulnerabilities that many providers fail to address when implementing their security policies. We subsequently provide and discuss specific security measures that the ISPs can implement, in order to fine-tune their security policies in the context of privacy protection.

  19. Pythia: A Privacy-enhanced Personalized Contextual Suggestion System for Tourism

    NARCIS (Netherlands)

    Drosatos, G.; Efraimidis, P.S.; Arampatzis, A.; Stamatelatos, G.; Athanasiadis, I.N.

    2015-01-01

    We present Pythia, a privacy-enhanced non-invasive contextual suggestion system for tourists, with important architectural innovations. The system offers high quality personalized recommendations, non-invasive operation and protection of user privacy. A key feature of Pythia is the exploitation of

  20. 76 FR 71417 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/Law Enforcement Agencies (LEA...

    Science.gov (United States)

    2011-11-17

    ...; Computer Matching Program (SSA/ Law Enforcement Agencies (LEA)) Match Number 5001 AGENCY: Social Security... protections for such persons. The Privacy Act, as amended, regulates the use of computer matching by Federal... accordance with the Privacy Act of 1974, as amended by the Computer Matching and Privacy Protection Act of...

  1. Acoustic assessment of speech privacy curtains in two nursing units

    Science.gov (United States)

    Pope, Diana S.; Miller-Klein, Erik T.

    2016-01-01

    Hospitals have complex soundscapes that create challenges to patient care. Extraneous noise and high reverberation rates impair speech intelligibility, which leads to raised voices. In an unintended spiral, the increasing noise may result in diminished speech privacy, as people speak loudly to be heard over the din. The products available to improve hospital soundscapes include construction materials that absorb sound (acoustic ceiling tiles, carpet, wall insulation) and reduce reverberation rates. Enhanced privacy curtains are now available and offer potential for a relatively simple way to improve speech privacy and speech intelligibility by absorbing sound at the hospital patient's bedside. Acoustic assessments were performed over 2 days on two nursing units with a similar design in the same hospital. One unit was built with the 1970s' standard hospital construction and the other was newly refurbished (2013) with sound-absorbing features. In addition, we determined the effect of an enhanced privacy curtain versus standard privacy curtains using acoustic measures of speech privacy and speech intelligibility indexes. Privacy curtains provided auditory protection for the patients. In general, that protection was increased by the use of enhanced privacy curtains. On average, the enhanced curtain improved sound absorption from 20% to 30%; however, there was considerable variability, depending on the configuration of the rooms tested. Enhanced privacy curtains provide measurable improvement to the acoustics of patient rooms but cannot overcome larger acoustic design issues. To shorten reverberation time, additional absorption, together with compact and more fragmented nursing unit floor plate shapes, should be considered. PMID:26780959

  2. Acoustic assessment of speech privacy curtains in two nursing units.

    Science.gov (United States)

    Pope, Diana S; Miller-Klein, Erik T

    2016-01-01

    Hospitals have complex soundscapes that create challenges to patient care. Extraneous noise and high reverberation rates impair speech intelligibility, which leads to raised voices. In an unintended spiral, the increasing noise may result in diminished speech privacy, as people speak loudly to be heard over the din. The products available to improve hospital soundscapes include construction materials that absorb sound (acoustic ceiling tiles, carpet, wall insulation) and reduce reverberation rates. Enhanced privacy curtains are now available and offer potential for a relatively simple way to improve speech privacy and speech intelligibility by absorbing sound at the hospital patient's bedside. Acoustic assessments were performed over 2 days on two nursing units with a similar design in the same hospital. One unit was built with the 1970s' standard hospital construction and the other was newly refurbished (2013) with sound-absorbing features. In addition, we determined the effect of an enhanced privacy curtain versus standard privacy curtains using acoustic measures of speech privacy and speech intelligibility indexes. Privacy curtains provided auditory protection for the patients. In general, that protection was increased by the use of enhanced privacy curtains. On average, the enhanced curtain improved sound absorption from 20% to 30%; however, there was considerable variability, depending on the configuration of the rooms tested. Enhanced privacy curtains provide measurable improvement to the acoustics of patient rooms but cannot overcome larger acoustic design issues. To shorten reverberation time, additional absorption, together with compact and more fragmented nursing unit floor plate shapes, should be considered.

  3. Acoustic assessment of speech privacy curtains in two nursing units

    Directory of Open Access Journals (Sweden)

    Diana S Pope

    2016-01-01

    Full Text Available Hospitals have complex soundscapes that create challenges to patient care. Extraneous noise and high reverberation rates impair speech intelligibility, which leads to raised voices. In an unintended spiral, the increasing noise may result in diminished speech privacy, as people speak loudly to be heard over the din. The products available to improve hospital soundscapes include construction materials that absorb sound (acoustic ceiling tiles, carpet, wall insulation) and reduce reverberation rates. Enhanced privacy curtains are now available and offer potential for a relatively simple way to improve speech privacy and speech intelligibility by absorbing sound at the hospital patient's bedside. Acoustic assessments were performed over 2 days on two nursing units with a similar design in the same hospital. One unit was built with the 1970s' standard hospital construction and the other was newly refurbished (2013) with sound-absorbing features. In addition, we determined the effect of an enhanced privacy curtain versus standard privacy curtains using acoustic measures of speech privacy and speech intelligibility indexes. Privacy curtains provided auditory protection for the patients. In general, that protection was increased by the use of enhanced privacy curtains. On average, the enhanced curtain improved sound absorption from 20% to 30%; however, there was considerable variability, depending on the configuration of the rooms tested. Enhanced privacy curtains provide measurable improvement to the acoustics of patient rooms but cannot overcome larger acoustic design issues. To shorten reverberation time, additional absorption, together with compact and more fragmented nursing unit floor plate shapes, should be considered.

  4. Enhancing Privacy Education with a Technical Emphasis in IT Curriculum

    Directory of Open Access Journals (Sweden)

    Svetlana Peltsverger

    2015-12-01

    Full Text Available The paper describes the development of four learning modules that focus on technical details of how a person's privacy might be compromised in real-world scenarios. The paper shows how students benefited from the addition of hands-on learning experiences in privacy and data protection to the existing information technology courses. These learning modules raised students' awareness of potential breaches of privacy, both as users and as developers. The demonstration of a privacy breach in action helped students to design, configure, and implement technical solutions to prevent privacy violations. The assessment results demonstrate the strength of the technical approach.

  5. Mining Roles and Access Control for Relational Data under Privacy and Accuracy Constraints

    Science.gov (United States)

    Pervaiz, Zahid

    2013-01-01

    Access control mechanisms protect sensitive information from unauthorized users. However, when sensitive information is shared and a Privacy Protection Mechanism (PPM) is not in place, an authorized insider can still compromise the privacy of a person leading to identity disclosure. A PPM can use suppression and generalization to anonymize and…

  6. Protecting and Evaluating Genomic Privacy in Medical Tests and Personalized Medicine

    OpenAIRE

    Ayday, Erman; Raisaro, Jean Louis; Rougemont, Jacques; Hubaux, Jean-Pierre

    2013-01-01

    In this paper, we propose privacy-enhancing technologies for medical tests and personalized medicine methods that use patients' genomic data. Focusing on genetic disease-susceptibility tests, we develop a new architecture (between the patient and the medical unit) and propose a "privacy-preserving disease susceptibility test" (PDS) by using homomorphic encryption and proxy re-encryption. Assuming the whole genome sequencing to be done by a certified institution, we propose to store patients' ...
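    The homomorphic encryption the abstract refers to can be illustrated with a toy Paillier instance: the additive homomorphism lets a medical unit combine encrypted genomic contributions without ever decrypting them. This is only a sketch with insecure toy parameters, not the paper's PDS protocol:

```python
import math
import random

# Toy Paillier parameters; real deployments use primes of >= 1024 bits each.
p, q = 17, 19
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)
g = n + 1
mu = pow(lam, -1, n)  # valid because L(g^lam mod n^2) = lam mod n when g = n + 1

def enc(m: int) -> int:
    """Encrypt m in Z_n: c = g^m * r^n mod n^2 with random r coprime to n."""
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def dec(c: int) -> int:
    """Decrypt: m = L(c^lam mod n^2) * mu mod n, where L(x) = (x - 1) / n."""
    return (((pow(c, lam, n2) - 1) // n) * mu) % n

# Additive homomorphism: multiplying ciphertexts adds the plaintexts.
a, b = 42, 101
print(dec((enc(a) * enc(b)) % n2))  # 143
```

    In a disease-susceptibility setting, this property allows weighted sums over encrypted genetic markers to be computed server-side, with only the final score decrypted by an authorized party.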

  7. Optimal Black-Box Secret Sharing over Arbitrary Abelian Groups

    DEFF Research Database (Denmark)

    Cramer, Ronald; Fehr, Serge

    2002-01-01

    A black-box secret sharing scheme for the threshold access structure T t,n is one which works over any finite Abelian group G. Briefly, such a scheme differs from an ordinary linear secret sharing scheme (over, say, a given finite field) in that distribution matrix and reconstruction vectors...... are defined over ℤ and are designed independently of the group G from which the secret and the shares are sampled. This means that perfect completeness and perfect privacy are guaranteed regardless of which group G is chosen. We define the black-box secret sharing problem as the problem of devising......, for an arbitrary given T t,n , a scheme with minimal expansion factor, i.e., where the length of the full vector of shares divided by the number of players n is minimal. Such schemes are relevant for instance in the context of distributed cryptosystems based on groups with secret or hard to compute group order...
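    The full threshold construction is beyond the scope of an abstract, but the simplest black-box case, (n, n) additive sharing, already works over any finite Abelian group, because it uses only the group operation, inverses, and uniform sampling. A minimal sketch in ℤ_m:

```python
import random

def share(secret: int, n_players: int, m: int) -> list[int]:
    """(n, n) additive sharing in Z_m. The same scheme works verbatim over any
    finite Abelian group: only addition, negation, and uniform sampling are
    used -- the 'black-box' property in its simplest form."""
    shares = [random.randrange(m) for _ in range(n_players - 1)]
    shares.append((secret - sum(shares)) % m)  # force shares to sum to secret
    return shares

def reconstruct(shares: list[int], m: int) -> int:
    return sum(shares) % m

m = 2**16 + 1          # the group order is arbitrary
shares = share(31337, 5, m)
print(reconstruct(shares, m))  # 31337
```

    Any n-1 of the shares are uniformly distributed and reveal nothing about the secret; the scheme in the paper generalizes this idea to arbitrary thresholds t < n while keeping the share vectors defined over ℤ, independent of the group.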

  8. User Privacy and Empowerment: Trends, Challenges, and Opportunities

    DEFF Research Database (Denmark)

    Dhotre, Prashant Shantaram; Olesen, Henning; Khajuria, Samant

    2018-01-01

    to the service providers. Considering business models that are slanted towards service providers, privacy has become a crucial issue in today’s fast growing digital world. Hence, this paper elaborates personal information flow between users, service providers, and data brokers. We also discussed the significant...... privacy issues like present business models, user awareness about privacy and user control over personal data. To address such issues, this paper also identified challenges that comprise unavailability of effective privacy awareness or protection tools and the effortless way to study and see the flow...... of personal information and its management. Thus, empowering users and enhancing awareness are essential to comprehending the value of secrecy. This paper also introduced latest advances in the domain of privacy issues like User Managed Access (UMA) can state suitable requirements for user empowerment...

  9. DQC Comments on the Posted Recommendations Regarding Data Security and Privacy Protections

    Science.gov (United States)

    Data Quality Campaign, 2010

    2010-01-01

    The U.S. Department of Education is conducting several activities to address privacy and security issues related to education data. Earlier this year a contractor for the Department convened a group of privacy and security experts and produced a report with recommendations to the Department on ways they can address emerging challenges in…

  10. Privacy Enhancements for Inexact Biometric Templates

    Science.gov (United States)

    Ratha, Nalini; Chikkerur, Sharat; Connell, Jonathan; Bolle, Ruud

    Traditional authentication schemes utilize tokens or depend on some secret knowledge possessed by the user for verifying his or her identity. Although these techniques are widely used, they have several limitations. Both token- and knowledge-based approaches cannot differentiate between an authorized user and an impersonator having access to the tokens or passwords. Biometrics-based authentication schemes overcome these limitations while offering usability advantages in the area of password management. However, despite its obvious advantages, the use of biometrics raises several security and privacy concerns.
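    One widely discussed remedy, sketched here as the general idea of cancelable templates rather than this chapter's specific scheme, is to store a keyed transform of the biometric features: leaking the transformed template does not expose the raw biometric, and issuing a new key revokes the old template:

```python
import hashlib
import random

def cancelable_template(features: list[float], user_key: bytes,
                        out_dim: int = 4) -> list[float]:
    """Project the feature vector with a pseudorandom matrix derived from a
    revocable user key. Matching is done in the transformed domain; if the
    stored template leaks, re-keying yields an unrelated template."""
    rng = random.Random(hashlib.sha256(user_key).digest())
    return [sum(rng.uniform(-1.0, 1.0) * f for f in features)
            for _ in range(out_dim)]

feats = [0.2, 1.5, -0.7, 3.1]
assert cancelable_template(feats, b"key-1") == cancelable_template(feats, b"key-1")
assert cancelable_template(feats, b"key-1") != cancelable_template(feats, b"key-2")
```

    A deployable scheme additionally quantizes the projections and tolerates the matching noise inherent in inexact biometric readings; this sketch shows only the keyed, revocable transform.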

  11. 77 FR 30433 - Privacy Act of 1974: Implementation of Exemptions; Automated Targeting System

    Science.gov (United States)

    2012-05-23

    ... Border Protection, Mint Annex, 799 Ninth Street NW., Washington, DC 20229. For privacy issues please... Secretary 6 CFR Part 5 [Docket No. DHS-2012-0020] Privacy Act of 1974: Implementation of Exemptions; Automated Targeting System AGENCY: Privacy Office, DHS. ACTION: Notice of proposed rulemaking. SUMMARY: The...

  12. Protecting the privacy of individual general practice patient electronic records for geospatial epidemiology research.

    Science.gov (United States)

    Mazumdar, Soumya; Konings, Paul; Hewett, Michael; Bagheri, Nasser; McRae, Ian; Del Fante, Peter

    2014-12-01

    General practitioner (GP) practices in Australia are increasingly storing patient information in electronic databases. These practice databases can be accessed by clinical audit software to generate reports that inform clinical or population health decision making and public health surveillance. Many audit software applications also have the capacity to generate de-identified patient unit record data. However, the de-identified nature of the extracted data means that these records often lack geographic information. Without spatial references, it is impossible to build maps reflecting the spatial distribution of patients with particular conditions and needs. Links to socioeconomic, demographic, environmental or other geographically based information are also not possible. In some cases, relatively coarse geographies such as postcode are available, but these are of limited use and researchers cannot undertake precision spatial analyses such as calculating travel times. We describe a method that allows researchers to implement meaningful mapping and spatial epidemiological analyses of practice level patient data while preserving privacy. This solution has been piloted in a diabetes risk research project in the patient population of a practice in Adelaide. The method offers researchers a powerful means of analysing geographic clinic data in a privacy-protected manner. © 2014 Public Health Association of Australia.
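    The article's workflow is not detailed in the abstract; a common privacy-preserving step in this setting is to coarsen exact patient coordinates to grid cells before records leave the practice, so exported data carry area-level rather than address-level location. An illustrative sketch (the cell size is an assumption):

```python
def snap_to_grid(lat: float, lon: float, cell_deg: float = 0.01):
    """Coarsen a precise coordinate to the centre of its grid cell
    (~1 km at this cell size), removing address-level precision."""
    centre = lambda v: (int(v // cell_deg) * cell_deg) + cell_deg / 2
    return round(centre(lat), 6), round(centre(lon), 6)

# Two nearby households fall into the same published cell:
print(snap_to_grid(-34.9281, 138.5911) == snap_to_grid(-34.9289, 138.5919))  # True
```

    Because geocoding happens inside the practice and only cell-level coordinates are exported, spatial epidemiological analyses (clustering, travel-time estimates at area level) remain possible without exposing individual addresses.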

  13. A Privacy Preservation Model for Health-Related Social Networking Sites

    Science.gov (United States)

    2015-01-01

    The increasing use of social networking sites (SNS) in health care has resulted in a growing number of individuals posting personal health information online. These sites may disclose users' health information to many different individuals and organizations and mine it for a variety of commercial and research purposes, yet the revelation of personal health information to unauthorized individuals or entities brings a concomitant concern of greater risk for loss of privacy among users. Many users join multiple social networks for different purposes and enter personal and other specific information covering social, professional, and health domains into other websites. Integration of multiple online and real social networks makes the users vulnerable to unintentional and intentional security threats and misuse. This paper analyzes the privacy and security characteristics of leading health-related SNS. It presents a threat model and identifies the most important threats to users and SNS providers. Building on threat analysis and modeling, this paper presents a privacy preservation model that incorporates individual self-protection and privacy-by-design approaches and uses the model to develop principles and countermeasures to protect user privacy. This study paves the way for analysis and design of privacy-preserving mechanisms on health-related SNS. PMID:26155953

  14. A Privacy Preservation Model for Health-Related Social Networking Sites.

    Science.gov (United States)

    Li, Jingquan

    2015-07-08

    The increasing use of social networking sites (SNS) in health care has resulted in a growing number of individuals posting personal health information online. These sites may disclose users' health information to many different individuals and organizations and mine it for a variety of commercial and research purposes, yet the revelation of personal health information to unauthorized individuals or entities brings a concomitant concern of greater risk for loss of privacy among users. Many users join multiple social networks for different purposes and enter personal and other specific information covering social, professional, and health domains into other websites. Integration of multiple online and real social networks makes the users vulnerable to unintentional and intentional security threats and misuse. This paper analyzes the privacy and security characteristics of leading health-related SNS. It presents a threat model and identifies the most important threats to users and SNS providers. Building on threat analysis and modeling, this paper presents a privacy preservation model that incorporates individual self-protection and privacy-by-design approaches and uses the model to develop principles and countermeasures to protect user privacy. This study paves the way for analysis and design of privacy-preserving mechanisms on health-related SNS.

  15. Secure and privacy-preserving data communication in Internet of Things

    CERN Document Server

    Zhu, Liehuang; Xu, Chang

    2017-01-01

    This book concentrates on protecting data security and privacy when participants communicate with each other in the Internet of Things (IoT). Technically, this book categorizes and introduces a collection of secure and privacy-preserving data communication schemes/protocols in three traditional scenarios of IoT: wireless sensor networks, smart grid, and vehicular ad-hoc networks. This book presents three advantages which will appeal to readers. Firstly, it broadens the reader’s horizon in IoT by touching on three interesting and complementary topics: data aggregation, privacy protection, and key agreement and management. Secondly, various cryptographic schemes/protocols used to protect data confidentiality and integrity are presented. Finally, this book illustrates how to design practical systems to implement the algorithms in the context of IoT communication. In summary, readers can simply learn and directly apply the new technologies to communicate data in IoT after reading this book.

  16. Enforcement of Privacy Policies over Multiple Online Social Networks for Collaborative Activities

    Science.gov (United States)

    Wu, Zhengping; Wang, Lifeng

    Our goal is to develop an enforcement architecture for privacy policies over multiple online social networks, to solve the problem of privacy protection when several social networks build a permanent or temporary collaboration. Theoretically, this idea is practical, especially as more and more social networks support the open-source framework “OpenSocial”. But as we know, different social network websites may offer the same privacy policy settings on top of different enforcement mechanisms, which causes problems: we would have to manually write code for both sides to make the privacy policy settings enforceable. This is a huge workload, given the number of current social networks. So we focus on proposing a middleware that automatically generates a privacy protection component for permanent integration or temporary interaction of social networks. This middleware provides functions such as collecting the privacy policy of each participant in the new collaboration, generating a standard policy model for each participant, and mapping those standard policies to the different enforcement mechanisms of the participants.
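    The policy mapping such a middleware performs can be sketched with a toy normalization table: each network's native vocabulary is mapped onto one common model, and translation between networks goes through that model. Network names and setting vocabularies here are hypothetical:

```python
# Hypothetical native vocabularies of two networks, mapped onto one common model.
NATIVE_TO_COMMON = {
    "networkA": {"only_me": "private", "friends": "friends", "everyone": "public"},
    "networkB": {"0": "private", "1": "friends", "2": "public"},
}

def translate(setting: str, src: str, dst: str) -> str:
    """Translate a privacy setting from src's vocabulary to dst's via the
    common model, so each network keeps its own enforcement mechanism."""
    common = NATIVE_TO_COMMON[src][setting]
    inverse = {level: native for native, level in NATIVE_TO_COMMON[dst].items()}
    return inverse[common]

print(translate("only_me", "networkA", "networkB"))  # 0
```

    With such a table per participant, a collaboration component can be generated automatically instead of hand-writing translation code for every pair of networks.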

  17. Social Media Users’ Legal Consciousness About Privacy

    Directory of Open Access Journals (Sweden)

    Katharine Sarikakis

    2017-02-01

    Full Text Available This article explores the ways in which the concept of privacy is understood in the context of social media and with regard to users’ awareness of privacy policies and laws in the ‘Post-Snowden’ era. In the light of presumably increased public exposure to privacy debates, generated partly due to the European “Right to be Forgotten” ruling and the Snowden revelations on mass surveillance, this article explores users’ meaning-making of privacy as a matter of legal dimension in terms of its violations and threats online and users’ ways of negotiating their Internet use, in particular social networking sites. Drawing on the concept of legal consciousness, this article explores through focus group interviews the ways in which social media users negotiate privacy violations and what role their understanding of privacy laws (or lack thereof might play in their strategies of negotiation. The findings are threefold: first, privacy is understood almost universally as a matter of controlling one’s own data, including information disclosure even to friends, and is strongly connected to issues about personal autonomy; second, a form of resignation with respect to control over personal data appears to coexist with a recognized need to protect one’s private data, while respondents describe conscious attempts to circumvent systems of monitoring or violation of privacy, and third, despite widespread coverage of privacy legal issues in the press, respondents’ concerns about and engagement in “self-protecting” tactics derive largely from being personally affected by violations of law and privacy.

  18. Privacy and security in teleradiology

    International Nuclear Information System (INIS)

    Ruotsalainen, Pekka

    2010-01-01

    Teleradiology is probably the most successful eHealth service available today. Its business model is based on the remote transmission of radiological images (e.g. X-ray and CT-images) over electronic networks, and on the interpretation of the transmitted images for diagnostic purposes. Two basic service models are commonly used in teleradiology today. The most common approach is based on the message paradigm (off-line model), but more developed teleradiology systems are based on the interactive use of PACS/RIS systems. Modern teleradiology is also more and more a cross-organisational or even cross-border service between service providers having different jurisdictions and security policies. This paper defines the requirements needed to make different teleradiology models trusted. Those requirements include a common security policy that covers all partners and entities, common security and privacy protection principles and requirements, controlled contracts between partners, and the use of security controls and tools that support the common security policy. The security and privacy protection of any teleradiology system must be planned in advance, and the necessary security and privacy enhancing tools should be selected (e.g. strong authentication, data encryption, non-repudiation services and audit-logs) based on the risk analysis and requirements set by the legislation. In any case the teleradiology system should fulfil ethical and regulatory requirements. Certification of the whole teleradiology service system, including security and privacy, is also proposed. In the future, teleradiology services will be an integrated part of pervasive eHealth. Security requirements for this environment, including dynamic and context-aware security services, are also discussed in this paper.

  19. Privacy and security in teleradiology

    Energy Technology Data Exchange (ETDEWEB)

    Ruotsalainen, Pekka [National Institute for Health and Welfare, Helsinki (Finland)], E-mail: pekka.ruotsalainen@THL.fi

    2010-01-15

    Teleradiology is probably the most successful eHealth service available today. Its business model is based on the remote transmission of radiological images (e.g. X-ray and CT images) over electronic networks, and on the interpretation of the transmitted images for diagnostic purposes. Two basic service models are commonly used in teleradiology today. The most common approach is based on the message paradigm (off-line model), but more developed teleradiology systems are based on the interactive use of PACS/RIS systems. Modern teleradiology is also increasingly a cross-organisational or even cross-border service between providers with different jurisdictions and security policies. This paper defines the requirements needed to make different teleradiology models trusted. Those requirements include a common security policy that covers all partners and entities, common security and privacy protection principles and requirements, controlled contracts between partners, and the use of security controls and tools that support the common security policy. The security and privacy protection of any teleradiology system must be planned in advance, and the necessary security and privacy enhancing tools should be selected (e.g. strong authentication, data encryption, non-repudiation services and audit logs) based on a risk analysis and the requirements set by legislation. In any case, the teleradiology system should fulfil ethical and regulatory requirements. Certification of the whole teleradiology service system, including security and privacy, is also proposed. In the future, teleradiology services will be an integrated part of pervasive eHealth. Security requirements for this environment, including dynamic and context-aware security services, are also discussed in this paper.

  20. 78 FR 15731 - Privacy Act of 1974; Computer Matching Program

    Science.gov (United States)

    2013-03-12

    ... DEPARTMENT OF HOMELAND SECURITY Office of the Secretary [Docket No. DHS-2013-0011] Privacy Act of 1974; Computer Matching Program AGENCY: Department of Homeland Security/U.S. Citizenship and... amended by the Computer Matching and Privacy Protection Act of 1988 (Pub. L. 100-503) and the Computer...

  1. Open source tools for standardized privacy protection of medical images

    Science.gov (United States)

    Lien, Chung-Yueh; Onken, Michael; Eichelberg, Marco; Kao, Tsair; Hein, Andreas

    2011-03-01

    In addition to the primary care context, medical images are often useful for research projects and community healthcare networks, so-called "secondary use". Patient privacy becomes an issue in such scenarios since the disclosure of personal health information (PHI) has to be prevented in a sharing environment. In general, most PHI should be completely removed from the images according to the respective privacy regulations, but some basic, non-identifying data is usually required for accurate image interpretation. Our objective is to utilize and enhance the standardized de-identification specifications in order to provide reliable software implementations for de- and re-identification of medical images suitable for online and offline delivery. DICOM (Digital Imaging and Communications in Medicine) images are de-identified by replacing PHI-specific information with values still being reasonable for imaging diagnosis and patient indexing. In this paper, this approach is evaluated based on a prototype implementation built on top of the open source framework DCMTK (DICOM Toolkit) utilizing standardized de- and re-identification mechanisms. A set of tools has been developed for DICOM de-identification that meets privacy requirements of an offline and online sharing environment and fully relies on standard-based methods.
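The de-identification workflow the abstract describes — stripping PHI attributes while keeping a controlled re-identification path — can be sketched in Python as follows. This is a minimal, illustrative sketch, not the DCMTK implementation: the attribute names, the `deidentify`/`reidentify` helpers, and the registry are all assumptions for demonstration.

```python
import hashlib

# Toy subset of attributes treated as PHI; real profiles are defined in
# DICOM PS3.15 Annex E, and the names used here are illustrative only.
PHI_TAGS = {"PatientName", "PatientBirthDate", "PatientAddress"}

def deidentify(dataset, secret, registry):
    """Strip PHI and replace the patient ID with a keyed pseudonym.

    The registry maps pseudonym -> real ID and would be stored securely
    by the trusted party that is allowed to re-identify.
    """
    pseudo_id = hashlib.sha256((secret + dataset["PatientID"]).encode()).hexdigest()[:16]
    registry[pseudo_id] = dataset["PatientID"]
    cleaned = {tag: value for tag, value in dataset.items() if tag not in PHI_TAGS}
    cleaned["PatientID"] = pseudo_id
    cleaned["PatientIdentityRemoved"] = "YES"  # standard-style marker attribute
    return cleaned

def reidentify(dataset, registry):
    """Recover the original patient ID (only possible with the registry)."""
    return registry[dataset["PatientID"]]
```

The image stays usable for diagnosis and indexing (non-PHI attributes such as the modality survive), while re-identification remains possible only for the holder of the registry.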

  2. By Policy or Design? Privacy in the US in a Post-Snowden World

    OpenAIRE

    Halbert, Debora; Larsson, Stefan

    2015-01-01

    By drawing from a number of studies in the field as well as the Snowden revelations and the case of MegaUpload/MEGA, the article makes an analysis of relevant legislation on privacy in the digital context. The purpose of the analysis is to understand to what extent and how the current paradigm of privacy protection is, or is not, sufficient for contemporary needs. In particular, we ask how privacy is protected by policy in an American context and to what extent this is or is not insufficient ...

  3. A Content Analysis of Library Vendor Privacy Policies: Do They Meet Our Standards?

    Science.gov (United States)

    Magi, Trina J.

    2010-01-01

    Librarians have a long history of protecting user privacy, but they have done seemingly little to understand or influence the privacy policies of library resource vendors that increasingly collect user information through Web 2.0-style personalization features. After citing evidence that college students value privacy, this study used content…

  4. 77 FR 37061 - DHS Data Privacy and Integrity Advisory Committee

    Science.gov (United States)

    2012-06-20

    .... Please note that the meeting may end early if the Committee has completed its business. ADDRESSES: The... draft report to the Department providing guidance on privacy protections for cybersecurity pilot... . Please note that the meeting may end early if all business is completed. Privacy Act Statement: DHS's Use...

  5. 78 FR 1275 - Privacy Act of 1974; Computer Matching Program

    Science.gov (United States)

    2013-01-08

    ... Social Security Administration (Computer Matching Agreement 1071). SUMMARY: In accordance with the Privacy Act of 1974 (5 U.S.C. 552a), as amended by the Computer Matching and Privacy Protection Act of... of its new computer matching program with the Social Security Administration (SSA). DATES: OPM will...

  6. Governance Through Privacy, Fairness, and Respect for Individuals.

    Science.gov (United States)

    Baker, Dixie B; Kaye, Jane; Terry, Sharon F

    2016-01-01

    Individuals have a moral claim to be involved in the governance of their personal data. Individuals' rights include privacy, autonomy, and the ability to choose for themselves how they want to manage risk, consistent with their own personal values and life situations. The Fair Information Practices principles (FIPPs) offer a framework for governance. Privacy-enhancing technology that complies with applicable law and FIPPs offers a dynamic governance tool for enabling the fair and open use of individuals' personal data. Any governance model must protect against the risks posed by data misuse. Individual perceptions of risk are a subjective function involving individuals' values toward self, family, and society, their perceptions of trust, and their cognitive decision-making skills. Individual privacy protections and individuals' right to choose are codified in the HIPAA Privacy Rule, which attempts to strike a balance between the dual goals of information flow and privacy protection. The choices most commonly given individuals regarding the use of their health information are binary ("yes" or "no") and immutable. Recent federal recommendations and law recognize the need for granular, dynamic choices. Individuals expect that they will govern the use of their own health and genomic data. Failure to build and maintain individuals' trust increases the likelihood that they will refuse to grant permission to access or use their data. The "no surprises principle" asserts that an individual's personal information should never be collected, used, transmitted, or disclosed in a way that would surprise the individual were she to learn about it. The FIPPs provide a powerful framework for enabling data sharing and use, while maintaining trust. We introduce the eight FIPPs adopted by the Department of Health and Human Services, and provide examples of their interpretation and implementation. Privacy risk and health risk can be reduced by giving consumers control, autonomy, and

  7. PrivateRide: A Privacy-Enhanced Ride-Hailing Service

    Directory of Open Access Journals (Sweden)

    Pham Anh

    2017-04-01

    Full Text Available In the past few years, we have witnessed a rise in the popularity of ride-hailing services (RHSs, an online marketplace that enables accredited drivers to use their own cars to drive ride-hailing users. Unlike other transportation services, RHSs raise significant privacy concerns, as providers are able to track the precise mobility patterns of millions of riders worldwide. We present the first survey and analysis of the privacy threats in RHSs. Our analysis exposes high-risk privacy threats that do not occur in conventional taxi services. Therefore, we propose PrivateRide, a privacy-enhancing and practical solution that offers anonymity and location privacy for riders, and protects drivers’ information from harvesting attacks. PrivateRide lowers the high-risk privacy threats in RHSs to a level that is at least as low as that of many taxi services. Using real data-sets from Uber and taxi rides, we show that PrivateRide significantly enhances riders’ privacy, while preserving tangible accuracy in ride matching and fare calculation, with only negligible effects on convenience. Moreover, by using our Android implementation for experimental evaluations, we show that PrivateRide’s overhead during ride setup is negligible. In short, we enable privacy-conscious riders to achieve levels of privacy that are not possible in current RHSs and even in some conventional taxi services, thereby offering a potential business differentiator.

  8. MUSES RT2AE V P/DP: On the Road to Privacy-Friendly Security Technologies in the Workplace

    OpenAIRE

    Van Der Sype, Yung Shin Marleen; Guislain, Jonathan; Seigneur, Jean-Marc; Titi, Xavier

    2016-01-01

    Successful protection of company data assets requires strong technological support. As many security incidents still originate from within, security technologies often include elements to monitor the behaviour of employees. As those security systems are considered privacy-intrusive, they are hard to align with the privacy and data protection rights of the employees of the company. Even though there is currently no legal obligation for developers to embed privacy and data protection in security...

  9. Electronic Communication of Protected Health Information: Privacy, Security, and HIPAA Compliance.

    Science.gov (United States)

    Drolet, Brian C; Marwaha, Jayson S; Hyatt, Brad; Blazar, Phillip E; Lifchez, Scott D

    2017-06-01

    Technology has enhanced modern health care delivery, particularly through accessibility to health information and ease of communication with tools like mobile device messaging (texting). However, text messaging has created new risks for breach of protected health information (PHI). In the current study, we sought to evaluate hand surgeons' knowledge of and compliance with privacy and security standards for electronic communication by text message. A cross-sectional survey of the American Society for Surgery of the Hand membership was conducted in March and April 2016. Descriptive and inferential statistical analyses were performed on composite results as well as relevant subgroup analyses. A total of 409 responses were obtained (11% response rate). Although 63% of surgeons reported that they believe that text messaging does not meet Health Insurance Portability and Accountability Act of 1996 security standards, only 37% reported they do not use text messages to communicate PHI. Younger surgeons and respondents who believed that their texting was compliant were significantly more likely to report messaging of PHI (odds ratios, 1.59 and 1.22, respectively). A majority of hand surgeons in this study reported the use of text messaging to communicate PHI. Of note, neither the Health Insurance Portability and Accountability Act of 1996 statute nor the US Department of Health and Human Services specifically prohibits this form of electronic communication. To be compliant, surgeons, practices, and institutions need to take reasonable security precautions to prevent breach of privacy with electronic communication. Communication of clinical information by text message is not prohibited under the Health Insurance Portability and Accountability Act of 1996, but surgeons should use appropriate safeguards to prevent breach when using this form of communication. Copyright © 2017 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.

  10. Aligning the Effective Use of Student Data with Student Privacy and Security Laws

    Science.gov (United States)

    Winnick, Steve; Coleman, Art; Palmer, Scott; Lipper, Kate; Neiditz, Jon

    2011-01-01

    This legal and policy guidance provides a summary framework for state policymakers as they work to use longitudinal data to improve student achievement while also protecting the privacy and security of individual student records. Summarizing relevant federal privacy and security laws, with a focus on the Family Educational Rights and Privacy Act…

  11. Privacy and legal issues in cloud computing

    CERN Document Server

    Weber, Rolf H

    2015-01-01

    Adopting a multi-disciplinary and comparative approach, this book focuses on emerging and innovative attempts to tackle privacy and legal issues in cloud computing, such as personal data privacy, security and intellectual property protection. Leading international academics and practitioners in the fields of law and computer science examine the specific legal implications of cloud computing pertaining to jurisdiction, biomedical practice and information ownership. This collection offers original and critical responses to the rising challenges posed by cloud computing.

  12. 77 FR 60131 - DHS Data Privacy and Integrity Advisory Committee

    Science.gov (United States)

    2012-10-02

    .... to 5 p.m. Please note that the meeting may end early if the Committee has completed its business... privacy protections for the collection and use of biometrics and for cybersecurity pilot programs. These... meeting may end early if all business is completed. Privacy Act Statement: DHS's Use of Your Information...

  13. 77 FR 70795 - Privacy Act of 1974; Retirement of Department of Homeland Security Transportation Security...

    Science.gov (United States)

    2012-11-27

    ... 20598-6036; email: [email protected] . For privacy issues please contact: Jonathan Cantor, (202-343... DEPARTMENT OF HOMELAND SECURITY Office of the Secretary Privacy Act of 1974; Retirement of Department of Homeland Security Transportation Security Administration System of Records AGENCY: Privacy...

  14. 77 FR 70792 - Privacy Act of 1974; Retirement of Department of Homeland Security Transportation Security...

    Science.gov (United States)

    2012-11-27

    ..., VA 20598-6036; email: [email protected] . For privacy issues please contact: Jonathan R. Cantor... DEPARTMENT OF HOMELAND SECURITY Office of the Secretary Privacy Act of 1974; Retirement of Department of Homeland Security Transportation Security Administration System of Records AGENCY: Privacy...

  15. Authentication Method for Privacy Protection in Smart Grid Environment

    Directory of Open Access Journals (Sweden)

    Do-Eun Cho

    2014-01-01

    Full Text Available Recently, interest in green energy has been increasing as a means to resolve problems such as the exhaustion of energy sources and the effective management of energy through the convergence of various fields. Smart grid projects, which build an intelligent electrical grid for achieving low-carbon green growth, are therefore being pursued at a rapid pace. However, as IT is layered onto the electrical grid, the security weaknesses of IT also appear in the smart grid, and the complexity of this convergence aggravates the problem. In addition, the personal and payment information within the smart grid is gradually becoming big data and a target for external intrusion and attack, raising growing concerns. The purpose of this study is to analyze the security vulnerabilities and security requirements within the smart grid, and the authentication and access control methods for privacy protection within the home network. We propose a secure access authentication and remote control method for a user's home devices within the home network environment, and we present its security analysis. The proposed access authentication method blocks unauthorized external access and enables secure remote access to the home network and its devices with a secure message authentication protocol.
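The message-authentication step such a protocol relies on can be illustrated with a standard HMAC challenge-response exchange. This is a generic sketch, not the authors' protocol; the key handling, function names, and command strings are assumptions for demonstration.

```python
import hashlib
import hmac
import os

def make_challenge():
    """Home gateway issues a fresh nonce so responses cannot be replayed."""
    return os.urandom(16)

def respond(shared_key, challenge, command):
    """User side: authenticate a remote-control command for a home device."""
    tag = hmac.new(shared_key, challenge + command.encode(), hashlib.sha256).hexdigest()
    return command, tag

def verify(shared_key, challenge, command, tag):
    """Gateway side: accept the command only if the MAC verifies."""
    expected = hmac.new(shared_key, challenge + command.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(tag, expected)
```

Binding the MAC to a per-session nonce blocks unauthorized external access and replay; `hmac.compare_digest` avoids timing side channels during verification.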

  16. Scalable privacy-preserving data sharing methodology for genome-wide association studies.

    Science.gov (United States)

    Yu, Fei; Fienberg, Stephen E; Slavković, Aleksandra B; Uhler, Caroline

    2014-08-01

    The protection of privacy of individual-level information in genome-wide association study (GWAS) databases has been a major concern of researchers following the publication of "an attack" on GWAS data by Homer et al. (2008). Traditional statistical methods for confidentiality and privacy protection of statistical databases do not scale well to GWAS data, especially in terms of guarantees regarding protection from linkage to external information. The more recent concept of differential privacy, introduced by the cryptographic community, is an approach that provides a rigorous definition of privacy with meaningful privacy guarantees in the presence of arbitrary external information, although the guarantees may come at a serious price in terms of data utility. Building on such notions, Uhler et al. (2013) proposed new methods to release aggregate GWAS data without compromising an individual's privacy. We extend the methods developed in Uhler et al. (2013) for releasing differentially-private χ²-statistics by allowing for an arbitrary number of cases and controls, and for releasing differentially-private allelic test statistics. We also provide a new interpretation by assuming the controls' data are known, which is a realistic assumption because some GWAS use publicly available data as controls. We assess the performance of the proposed methods through a risk-utility analysis on a real data set consisting of DNA samples collected by the Wellcome Trust Case Control Consortium and compare the methods with the differentially-private release mechanism proposed by Johnson and Shmatikov (2013). Copyright © 2014 Elsevier Inc. All rights reserved.
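The core release step — adding Laplace noise calibrated to a sensitivity bound — can be sketched as follows. This is an illustrative sketch of the generic Laplace mechanism, not the authors' exact mechanism: the sensitivity bound for a χ²-statistic must be derived for the specific test (as done in the papers cited) and is treated here as a given parameter.

```python
import math
import random

def laplace_noise(scale):
    """Sample from Laplace(0, scale) via the inverse-CDF transform."""
    u = random.random() - 0.5
    return -scale * math.copysign(math.log(1 - 2 * abs(u)), u)

def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 case/control allelic count table."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

def release_dp(stat, sensitivity, epsilon):
    """Epsilon-differentially-private release of a bounded-sensitivity statistic."""
    return stat + laplace_noise(sensitivity / epsilon)
```

Smaller epsilon means stronger privacy but noisier released statistics — exactly the utility price the abstract mentions.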

  17. New Collaborative Filtering Algorithms Based on SVD++ and Differential Privacy

    Directory of Open Access Journals (Sweden)

    Zhengzheng Xian

    2017-01-01

    Full Text Available Collaborative filtering technology has been widely used in recommender systems, and its implementation is supported by the large amount of real and reliable user data from the big-data era. However, as users' information-security awareness increases, these data are reduced or their quality becomes worse. Singular Value Decomposition (SVD) is one of the common matrix factorization methods used in collaborative filtering; it introduces the bias information of users and items and is realized by using algebraic feature extraction. SVD++, a derivative model of SVD, achieves better predictive accuracy due to the addition of implicit feedback information. Differential privacy has a strict, provable definition and has become an effective measure against attackers indirectly deducing personal private information by using background knowledge. In this paper, differential privacy is applied to the SVD++ model through three approaches: gradient perturbation, objective-function perturbation, and output perturbation. Through theoretical derivation and experimental verification, the new algorithms proposed can better protect the privacy of the original data while ensuring predictive accuracy. In addition, an effective scheme is given that can measure the privacy protection strength and predictive accuracy, and a reasonable range for selection of the differential privacy parameter is provided.
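Of the three approaches, gradient perturbation is the most direct to illustrate: noise is injected into each stochastic gradient before the factor update. The sketch below uses a plain matrix-factorization update rather than the full SVD++ model, and all names and parameter values are illustrative, not the paper's algorithm.

```python
import math
import random

def laplace_noise(scale):
    """Sample from Laplace(0, scale) via the inverse-CDF transform."""
    u = random.random() - 0.5
    return -scale * math.copysign(math.log(1 - 2 * abs(u)), u)

def dp_sgd_step(p, q, rating, lr, reg, noise_scale):
    """One gradient-perturbed SGD update for a single (user, item) rating.

    p, q are the user and item latent-factor vectors; noise is added to
    each gradient component before the update is applied.
    """
    err = rating - sum(pu * qi for pu, qi in zip(p, q))
    p_new = [pu + lr * (err * qi - reg * pu + laplace_noise(noise_scale))
             for pu, qi in zip(p, q)]
    q_new = [qi + lr * (err * pu - reg * qi + laplace_noise(noise_scale))
             for pu, qi in zip(p, q)]
    return p_new, q_new
```

The noise scale trades privacy for accuracy: with a small scale the factors still converge toward the observed rating, while a large scale masks individual gradients at the cost of predictive error.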

  18. Adipose Tissue-Derived Stem Cell Secreted IGF-1 Protects Myoblasts from the Negative Effect of Myostatin

    Directory of Open Access Journals (Sweden)

    Sebastian Gehmert

    2014-01-01

    Full Text Available Myostatin, a TGF-β family member, is associated with inhibition of muscle growth and differentiation and might interact with the IGF-1 signaling pathway. Since IGF-1 is secreted at a bioactive level by adipose tissue-derived mesenchymal stem cells (ASCs), these cells provide a therapeutic option for Duchenne Muscular Dystrophy (DMD). But the protective effect of stem cell-secreted IGF-1 on myoblasts under high levels of myostatin remains unclear. In the present study, murine myoblasts were exposed to myostatin in the presence of ASC-conditioned medium and investigated for proliferation and apoptosis. The protective effect of IGF-1 was further examined by using IGF-1-neutralizing and receptor antibodies as well as gene-silencing RNAi technology. MyoD expression was detected to identify the impact of IGF-1 on myoblast differentiation when exposed to myostatin. IGF-1 accounted for 43.6% of the antiapoptotic effect and 48.8% of the proliferative effect of ASC-conditioned medium. Furthermore, IGF-1 restored MyoD expression of myoblasts at risk at both mRNA and protein levels. Besides fusion and transdifferentiation, the beneficial effect of ASCs is mediated by paracrine-secreted cytokines, particularly IGF-1. The present study underlines the potential of ASCs as a therapeutic option for Duchenne muscular dystrophy and other dystrophic muscle diseases.

  19. Immunoglobins in mammary secretions

    DEFF Research Database (Denmark)

    Hurley, W L; Theil, Peter Kappel

    2013-01-01

    Immunoglobulins secreted in colostrum and milk by the lactating mammal are major factors providing immune protection to the newborn. Immunoglobulins in mammary secretions represent the cumulative immune response of the lactating animal to exposure to antigenic stimulation that occurs through...... the immunoglobulins found in mammary secretions in the context of their diversity of structure, origin, mechanisms of transfer, and function....

  20. Workshop--E-leaks: the privacy of health information in the age of electronic information.

    Science.gov (United States)

    Vonn, Michael; Lang, Renée; Perras, Maude

    2011-10-01

    This workshop examined some of the new challenges to health-related privacy emerging as a result of the proliferation of electronic communications and data storage, including through social media, electronic health records and ready access to personal information on the internet. The right to privacy is a human right. As such, protecting privacy and enforcing the duty of confidentiality regarding health information are fundamental to treating people with autonomy, dignity and respect. For people living with HIV, unauthorized disclosure of their status can lead to discrimination and breaches of other human rights. While this is not new, in this information age a new breed of privacy violation is emerging and our legal protections are not necessarily keeping pace.

  1. Reward-based spatial crowdsourcing with differential privacy preservation

    Science.gov (United States)

    Xiong, Ping; Zhang, Lefeng; Zhu, Tianqing

    2017-11-01

    In recent years, the popularity of mobile devices has transformed spatial crowdsourcing (SC) into a novel mode for performing complicated projects. Workers can perform tasks at specified locations in return for rewards offered by employers. Existing methods ensure the efficiency of their systems by submitting the workers' exact locations to a centralised server for task assignment, which can lead to privacy violations. Thus, implementing crowdsourcing applications while preserving the privacy of workers' locations is a key issue that needs to be tackled. We propose a reward-based SC method that achieves acceptable utility as measured by task assignment success rates, while efficiently preserving privacy. A differential privacy model ensures a rigorous privacy guarantee, and Laplace noise is introduced to protect workers' exact locations. We then present a reward allocation mechanism that adjusts each piece of the reward for a task using the distribution of the workers' locations. Through experimental results, we demonstrate that this optimised-reward method is efficient for SC applications.
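The location-protection step can be sketched with per-coordinate Laplace noise, a common simplification of planar Laplace mechanisms. This is an illustrative sketch only — the task-assignment rule, function names, and parameters are assumptions, not the paper's mechanism.

```python
import math
import random

def laplace_noise(scale):
    """Sample from Laplace(0, scale) via the inverse-CDF transform."""
    u = random.random() - 0.5
    return -scale * math.copysign(math.log(1 - 2 * abs(u)), u)

def obfuscate_location(x, y, epsilon, sensitivity=1.0):
    """Worker side: report a noisy location instead of the exact one."""
    scale = sensitivity / epsilon
    return x + laplace_noise(scale), y + laplace_noise(scale)

def assign_task(task_xy, workers):
    """Server side: assign the task to the nearest *reported* location.

    workers is a list of (worker_id, (x, y)) pairs with obfuscated coordinates.
    """
    tx, ty = task_xy
    return min(workers, key=lambda w: math.hypot(w[1][0] - tx, w[1][1] - ty))
```

Because the server only ever sees noisy coordinates, assignment success degrades as epsilon shrinks — which is why the paper's reward allocation compensates using the distribution of reported locations.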

  2. Protecting Privacy in Shared Photos via Adversarial Examples Based Stealth

    Directory of Open Access Journals (Sweden)

    Yujia Liu

    2017-01-01

    Full Text Available Online image sharing on social platforms can lead to undesired privacy disclosure. For example, some enterprises may analyze these large volumes of uploaded images to perform in-depth analysis of users' preferences for commercial purposes, and their technology might be today's most powerful learning model, the deep neural network (DNN). To elude these automatic DNN detectors without affecting the visual quality perceived by human eyes, we design and implement a novel Stealth algorithm, which makes the automatic detector blind to the existence of objects in an image by crafting a kind of adversarial example. It is as if all objects disappear after wearing an "invisible cloak" from the view of the detector. We then evaluate the effectiveness of the Stealth algorithm through our newly defined measurement, named privacy insurance. The results indicate that our scheme has a considerable success rate in guaranteeing privacy compared with other methods, such as mosaic, blur, and noise. Better still, the Stealth algorithm has the smallest impact on image visual quality. Meanwhile, we provide a user-adjustable parameter called cloak thickness for regulating the perturbation intensity. Furthermore, we find that the processed images have a transferability property; that is, the adversarial images generated for one particular DNN will influence others as well.

  3. Health Records and the Cloud Computing Paradigm from a Privacy Perspective

    Directory of Open Access Journals (Sweden)

    Christian Stingl

    2011-01-01

    Full Text Available With the advent of cloud computing, the realization of highly available electronic health records providing location-independent access seems very promising. However, cloud computing raises major security issues that need to be addressed, particularly within the health care domain. The protection of the privacy of individuals often seems to be left on the sidelines. For instance, common protection against malicious insiders, i.e., non-disclosure agreements, is purely organizational. Clearly, such measures cannot prevent misuse but can at least discourage it. In this paper, we present an approach to storing highly sensitive health data in the cloud whereby the protection of patients' privacy is based exclusively on technical measures, so that users and providers of health records do not need to trust the cloud provider with privacy-related issues. Our technical measures comprise anonymous communication and authentication, anonymous yet authorized transactions, and pseudonymization of databases.
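Pseudonymization of a database can be illustrated with a keyed hash: identical identifiers map to the same pseudonym, so records stay linkable for care and research, but the mapping cannot be inverted without the key held in the trusted zone. This is an illustrative sketch; the field names are assumptions, not the paper's schema.

```python
import hashlib
import hmac

def pseudonym(identifier, key):
    """Keyed pseudonym: stable for record linkage, irreversible without the key."""
    return hmac.new(key, identifier.encode(), hashlib.sha256).hexdigest()[:12]

def pseudonymize_records(records, key):
    """Replace patient identifiers before records leave the trusted zone."""
    return [{"pid": pseudonym(r["patient"], key), "data": r["data"]}
            for r in records]
```

A cloud provider storing only the pseudonymized rows learns which records belong together but not to whom they belong — the technical (rather than organizational) protection the abstract argues for.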

  4. Kids Sell: Celebrity Kids’ Right to Privacy

    Directory of Open Access Journals (Sweden)

    Seong Choul Hong

    2016-04-01

    Full Text Available The lives of celebrities are often spotlighted in the media because of their newsworthiness; however, many celebrities argue that their right to privacy is often infringed upon. Concerns about celebrity privacy are not limited to the celebrities themselves and often extend to their children. As a result of their popularity, public interest has pushed paparazzi and journalists to pursue trivial and private details about the lives of both celebrities and their children. This paper investigates the areas of conflict where the right to privacy and the right to know collide in dealing with the children of celebrities. In general, the courts have been unsympathetic to celebrity privacy claims, noting their newsworthiness and self-promotional character. Unless the press violates news-gathering ethics or commits torts, the courts will often rule in favor of the media. However, the story becomes quite different when related to an infringement of the privacy of celebrities' children. This paper argues that all children have a right to protect their privacy regardless of their parents' social status. Children of celebrities should not be exempt from privacy principles just because a parent is a celebrity. Furthermore, they should not be exposed by the media without the voluntary consent of their legal guardians. That is, the right of the media to publish and the newsworthiness of children of celebrities must be acknowledged only within strict limits.

  5. Enhancing Privacy Education with a Technical Emphasis in IT Curriculum

    Science.gov (United States)

    Peltsverger, Svetlana; Zheng, Guangzhi

    2016-01-01

    The paper describes the development of four learning modules that focus on technical details of how a person's privacy might be compromised in real-world scenarios. The paper shows how students benefited from the addition of hands-on learning experiences of privacy and data protection to the existing information technology courses. These learning…

  6. Usability Issues in the User Interfaces of Privacy-Enhancing Technologies

    Science.gov (United States)

    LaTouche, Lerone W.

    2013-01-01

    Privacy on the Internet has become one of the leading concerns for Internet users. These users are not wrong in their concerns if personally identifiable information is not protected and under their control. To minimize the collection of Internet users' personal information and help solve the problem of online privacy, a number of…

  7. Privacy and security in teleradiology.

    Science.gov (United States)

    Ruotsalainen, Pekka

    2010-01-01

    Teleradiology is probably the most successful eHealth service available today. Its business model is based on the remote transmission of radiological images (e.g. X-ray and CT images) over electronic networks, and on the interpretation of the transmitted images for diagnostic purposes. Two basic service models are commonly used in teleradiology today. The most common approach is based on the message paradigm (off-line model), but more developed teleradiology systems are based on the interactive use of PACS/RIS systems. Modern teleradiology is also increasingly a cross-organisational or even cross-border service between providers with different jurisdictions and security policies. This paper defines the requirements needed to make different teleradiology models trusted. Those requirements include a common security policy that covers all partners and entities, common security and privacy protection principles and requirements, controlled contracts between partners, and the use of security controls and tools that support the common security policy. The security and privacy protection of any teleradiology system must be planned in advance, and the necessary security and privacy enhancing tools should be selected (e.g. strong authentication, data encryption, non-repudiation services and audit logs) based on a risk analysis and the requirements set by legislation. In any case, the teleradiology system should fulfil ethical and regulatory requirements. Certification of the whole teleradiology service system, including security and privacy, is also proposed. In the future, teleradiology services will be an integrated part of pervasive eHealth. Security requirements for this environment, including dynamic and context-aware security services, are also discussed in this paper. Copyright (c) 2009 Elsevier Ireland Ltd. All rights reserved.

  8. Service Outsourcing Character Oriented Privacy Conflict Detection Method in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Changbo Ke

    2014-01-01

    Full Text Available Cloud computing provides services to users as a software paradigm. However, it is difficult to ensure the security of private information because of cloud computing's openness, virtualization, and service-outsourcing features. How to protect users' private information has therefore become a research focus. In this paper, firstly, we model the service privacy policy and the user privacy preference with description logic. Secondly, we use the Pellet reasoner to verify consistency and satisfiability, so as to detect privacy conflicts between services and the user. Thirdly, we present an algorithm for detecting privacy conflicts in the process of cloud service composition and prove the correctness and feasibility of this method by case study and experimental analysis. Our method can reduce the risk of users' sensitive private information being illegally used and propagated by outsourcing services. At the same time, the method avoids exceptions in the service composition process caused by privacy conflicts, and improves the trust degree of cloud service providers.
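
    The conflict-detection idea in this record — comparing what a service's privacy policy wants to collect against what the user's preference permits — can be illustrated with a toy set-based check. The paper itself models policies in description logic and verifies them with the Pellet reasoner; the data items below are hypothetical and the set comparison is only a simplified stand-in for that reasoning:

```python
def privacy_conflicts(service_policy: set, user_allows: set) -> set:
    """Return the data items a service wants to collect that the user's
    privacy preference does not permit (empty set means no conflict)."""
    return service_policy - user_allows

# Hypothetical example: the outsourced service requests three items,
# but the user consents to only two of them.
service_policy = {"email", "location", "purchase_history"}
user_allows = {"email", "purchase_history"}

assert privacy_conflicts(service_policy, user_allows) == {"location"}
```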

  9. Filaggrin-dependent secretion of sphingomyelinase protects against staphylococcal α-toxin-induced keratinocyte death.

    Science.gov (United States)

    Brauweiler, Anne M; Bin, Lianghua; Kim, Byung Eui; Oyoshi, Michiko K; Geha, Raif S; Goleva, Elena; Leung, Donald Y M

    2013-02-01

    The skin of patients with atopic dermatitis (AD) has defects in keratinocyte differentiation, particularly in expression of the epidermal barrier protein filaggrin. AD skin lesions are often exacerbated by Staphylococcus aureus-mediated secretion of the virulence factor α-toxin. It is unknown whether lack of keratinocyte differentiation predisposes to enhanced lethality from staphylococcal toxins. We investigated whether keratinocyte differentiation and filaggrin expression protect against cell death induced by staphylococcal α-toxin. Filaggrin-deficient primary keratinocytes were generated through small interfering RNA gene knockdown. RNA expression was determined by using real-time PCR. Cell death was determined by using the lactate dehydrogenase assay. Keratinocyte cell survival in filaggrin-deficient (ft/ft) mouse skin biopsies was determined based on Keratin 5 staining. α-Toxin heptamer formation and acid sphingomyelinase expression were determined by means of immunoblotting. We found that filaggrin expression, occurring as the result of keratinocyte differentiation, significantly inhibits staphylococcal α-toxin-mediated pathogenicity. Furthermore, filaggrin plays a crucial role in protecting cells by mediating the secretion of sphingomyelinase, an enzyme that reduces the number of α-toxin binding sites on the keratinocyte surface. Finally, we determined that sphingomyelinase enzymatic activity directly prevents α-toxin binding and protects keratinocytes against α-toxin-induced cytotoxicity. The current study introduces the novel concept that S aureus α-toxin preferentially targets and destroys filaggrin-deficient keratinocytes. It also provides a mechanism to explain the increased propensity for S aureus-mediated exacerbation of AD skin disease. Copyright © 2012 American Academy of Allergy, Asthma & Immunology. Published by Mosby, Inc. All rights reserved.

  10. Balancing Cyberspace Promise, Privacy, and Protection: Tracking the Debate.

    Science.gov (United States)

    Metivier-Carreiro, Karen A.; LaFollette, Marcel C.

    1997-01-01

    Examines aspects of cyberspace policy: Internet content and expectations; privacy: medical information and data collected by the government; and the regulation of offensive material: the Communications Decency Act, Internet filters, and the American Library Association's proactive great Web sites for children. Suggests that even "child…

  11. Preserving Employee Privacy in Wellness.

    Science.gov (United States)

    Terry, Paul E

    2017-07-01

    The proposed "Preserving Employee Wellness Programs Act" states that the collection of information about the manifested disease or disorder of a family member shall not be considered an unlawful acquisition of genetic information. The bill recognizes employee privacy protections that are already in place and includes specific language on nondiscrimination based on illness. Why did legislation expressly intended to "preserve wellness programs" generate such antipathy toward wellness among journalists? This article argues that those who are committed to preserving employee wellness must be equally committed to preserving employee privacy. Relatedly, we should distinguish more clearly between discussions and rules about commonplace health screenings and those about much less common genetic testing.

  12. Privacy enhancing techniques - the key to secure communication and management of clinical and genomic data.

    Science.gov (United States)

    De Moor, G J E; Claerhout, B; De Meyer, F

    2003-01-01

    To introduce some of the privacy protection problems related to genomics based medicine and to highlight the relevance of Trusted Third Parties (TTPs) and of Privacy Enhancing Techniques (PETs) in the restricted context of clinical research and statistics. Practical approaches based on two different pseudonymisation models, both for batch and interactive data collection and exchange, are described and analysed. The growing need of managing both clinical and genetic data raises important legal and ethical challenges. Protecting human rights in the realm of privacy, while optimising research potential and other statistical activities is a challenge that can easily be overcome with the assistance of a trust service provider offering advanced privacy enabling/enhancing solutions. As such, the use of pseudonymisation and other innovative Privacy Enhancing Techniques can unlock valuable data sources.
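
    A core building block behind such pseudonymisation services can be sketched with a keyed hash: a trusted third party holding a secret key derives stable pseudonyms, so research records for one patient still link while identities stay hidden. This is only an illustrative sketch assuming HMAC-SHA-256 as the primitive; the specific pseudonymisation models analysed in the record are not reproduced here:

```python
import hashlib
import hmac

def pseudonymise(identifier: str, secret_key: bytes) -> str:
    """Derive a stable pseudonym from a patient identifier.

    Only the trusted third party holding secret_key can recompute the
    mapping, so research datasets can be joined on the pseudonym
    without exposing the underlying identity.
    """
    return hmac.new(secret_key, identifier.encode(), hashlib.sha256).hexdigest()

key = b"ttp-secret-key"  # held by the trust service provider only
p1 = pseudonymise("patient-12345", key)
p2 = pseudonymise("patient-12345", key)
p3 = pseudonymise("patient-67890", key)
assert p1 == p2  # deterministic: records for one patient still link
assert p1 != p3  # different patients receive different pseudonyms
```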

  13. 78 FR 59082 - Privacy Act of 1974; Department of Transportation, Federal Motor Carrier Safety Administration...

    Science.gov (United States)

    2013-09-25

    ..., [email protected] . For privacy issues please contact: Claire W. Barrett, Departmental Chief... DEPARTMENT OF TRANSPORTATION Office of the Secretary [Docket No. FMCSA-2013-0306] Privacy Act of... Administration (FMCSA), DOT. ACTION: Notice to amend a system of records. SUMMARY: In accordance with the Privacy...

  14. Economics of Privacy: Users' Attitudes and Economic Impact of Information Privacy Protection

    OpenAIRE

    Frik, Alisa

    2017-01-01

    This doctoral thesis consists of three essays within the field of economics of information privacy examined through the lens of behavioral and experimental economics. Rapid development and expansion of Internet, mobile and network technologies in the last decades has provided multitudinous opportunities and benefits to both business and society proposing the customized services and personalized offers at a relatively low price and high speed. However, such innovations and progress have al...

  15. Insights to develop privacy policy for organization in Indonesia

    Science.gov (United States)

    Rosmaini, E.; Kusumasari, T. F.; Lubis, M.; Lubis, A. R.

    2018-03-01

    Nowadays, the increased use of shared applications over networks not only dictates enhanced security but also emphasizes the need to balance privacy protection with ease of use. Meanwhile, the accessibility and availability demanded of organizational services make privacy obligations a more complex process to handle and control. Nonetheless, the underlying principles for privacy policy exist in current Indonesian law, even though they are spread across various regulations. Religious, constitutional, statutory, regulatory, and customary and cultural requirements still serve as the reference model for controlling data collection and information sharing. Moreover, as customers and organizations often misinterpret their responsibilities and rights within business functions, processes and levels, it is essential for professionals to consider how to articulate clearly the rules that govern information gathering and distribution, in a manner that translates into information system specifications and requirements for developers and managers. This study focuses on providing suggestions and recommendations for developing privacy policy, based on a descriptive analysis of 791 respondents on personal data protection in accordance with political and economic factors in Indonesia.

  16. Toward protocols for quantum-ensured privacy and secure voting

    International Nuclear Information System (INIS)

    Bonanome, Marianna; Buzek, Vladimir; Ziman, Mario; Hillery, Mark

    2011-01-01

    We present a number of schemes that use quantum mechanics to preserve privacy; in particular, we show that entangled quantum states can be useful in maintaining privacy. We further develop our original proposal [see M. Hillery, M. Ziman, V. Buzek, and M. Bielikova, Phys. Lett. A 349, 75 (2006)] for protecting privacy in voting, and examine its security under certain types of attacks, in particular dishonest voters and external eavesdroppers. A variation of these quantum-based schemes can be used for multiparty function evaluation. We consider functions corresponding to group multiplication of N group elements, with each element chosen by a different party. We show how quantum mechanics can be useful in maintaining the privacy of the choices of group elements.

  17. Toward protocols for quantum-ensured privacy and secure voting

    Energy Technology Data Exchange (ETDEWEB)

    Bonanome, Marianna [Department of Applied Mathematics and Computer Science, New York City College of Technology, 300 Jay Street, Brooklyn, New York 11201 (United States); Buzek, Vladimir; Ziman, Mario [Research Center for Quantum Information, Slovak Academy of Sciences, Dubravska cesta 9, 845 11 Bratislava (Slovakia); Faculty of Informatics, Masaryk University, Botanicka 68a, 602 00 Brno (Czech Republic); Hillery, Mark [Department of Physics, Hunter College of CUNY, 695 Park Avenue, New York, New York 10021 (United States)

    2011-08-15

    We present a number of schemes that use quantum mechanics to preserve privacy; in particular, we show that entangled quantum states can be useful in maintaining privacy. We further develop our original proposal [see M. Hillery, M. Ziman, V. Buzek, and M. Bielikova, Phys. Lett. A 349, 75 (2006)] for protecting privacy in voting, and examine its security under certain types of attacks, in particular dishonest voters and external eavesdroppers. A variation of these quantum-based schemes can be used for multiparty function evaluation. We consider functions corresponding to group multiplication of N group elements, with each element chosen by a different party. We show how quantum mechanics can be useful in maintaining the privacy of the choices of group elements.

  18. Privacy vs. Reward in Indoor Location-Based Services

    Directory of Open Access Journals (Sweden)

    Fawaz Kassem

    2016-10-01

    Full Text Available With the advance of indoor localization technology, indoor location-based services (ILBS) are gaining popularity. They, however, come with privacy concerns. ILBS providers track the users’ mobility to learn more about their behavior, and then provide them with improved and personalized services. Our survey of 200 individuals highlighted their concerns about this tracking for potential leakage of their personal/private traits, but also showed their willingness to accept reduced tracking for improved service. In this paper, we propose PR-LBS (Privacy vs. Reward for Location-Based Service), a system that addresses these seemingly conflicting requirements by balancing the users’ privacy concerns and the benefits of sharing location information in indoor location tracking environments. PR-LBS relies on a novel location-privacy criterion to quantify the privacy risks of sharing indoor location information. It also employs a repeated play model to ensure that the received service is proportionate to the privacy risk. We implement and evaluate PR-LBS extensively with various real-world user mobility traces. Results show that PR-LBS has low overhead, protects the users’ privacy, and makes a good tradeoff between the quality of service for the users and the utility of shared location data for service providers.
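
    One simple building block for trading location precision against service quality — in the spirit of, though much simpler than, PR-LBS's privacy criterion — is grid-based spatial cloaking. The 5-metre cell size below is an arbitrary illustrative parameter:

```python
def cloak(x: float, y: float, cell: float = 5.0) -> tuple:
    """Snap an indoor coordinate (in metres) to the centre of its
    cell-by-cell grid square, so a service learns only a coarse zone.
    Larger cells mean more privacy but less personalized service."""
    cx = (int(x // cell) + 0.5) * cell
    cy = (int(y // cell) + 0.5) * cell
    return (cx, cy)

# Two nearby positions map to the same 5 m zone.
assert cloak(12.3, 7.9) == (12.5, 7.5)
assert cloak(13.8, 9.1) == (12.5, 7.5)
```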

  19. Anthocyanin increases adiponectin secretion and protects against diabetes-related endothelial dysfunction.

    Science.gov (United States)

    Liu, Yan; Li, Dan; Zhang, Yuhua; Sun, Ruifang; Xia, Min

    2014-04-15

    Adiponectin is an adipose tissue-secreted adipokine with beneficial effects on the cardiovascular system. In this study, we evaluated a potential role for adiponectin in the protective effects of anthocyanin on diabetes-related endothelial dysfunction. We treated db/db mice on a normal diet with anthocyanin cyanidin-3-O-β-glucoside (C3G; 2 g/kg diet) for 8 wk. Endothelium-dependent and -independent relaxations of the aorta were then evaluated. Adiponectin expression and secretion were also measured. C3G treatment restores endothelium-dependent relaxation of the aorta in db/db mice, whereas diabetic mice treated with an anti-adiponectin antibody do not respond. C3G treatment induces adiponectin expression and secretion in cultured 3T3 adipocytes through transcription factor forkhead box O1 (Foxo1). Silencing Foxo1 expression prevented C3G-stimulated induction of adiponectin expression. In contrast, overexpression of Foxo1-ADA promoted adiponectin expression in adipocytes. C3G activates Foxo1 by increasing its deacetylation via silent mating type information regulation 2 homolog 1 (Sirt1). Furthermore, purified anthocyanin supplementation significantly improved flow-mediated dilation (FMD) and increased serum adiponectin concentrations in patients with type 2 diabetes. Changes in adiponectin concentrations positively correlated with FMD in the anthocyanin group. Mechanistically, adiponectin activates cAMP-PKA-eNOS signaling pathways in human aortic endothelial cells, increasing endothelial nitric oxide bioavailability. These results demonstrate that adipocyte-derived adiponectin is required for anthocyanin C3G-mediated improvement of endothelial function in diabetes.

  20. Crossing borders: Security and privacy issues of the European e-passport

    NARCIS (Netherlands)

    Hoepman, J.H.; Hubbers, E.; Jacobs, B.P.F.; Oostdijk, M.D.; Wichers Schreur, R.

    2008-01-01

    The first generation of European e-passports will be issued in 2006. We discuss how borders are crossed regarding the security and privacy erosion of the proposed schemes, and show which borders need to be crossed to improve the security and the privacy protection of the next generation of

  1. Privacy and equality in diagnostic genetic testing.

    Science.gov (United States)

    Nyrhinen, Tarja; Hietala, Marja; Puukka, Pauli; Leino-Kilpi, Helena

    2007-05-01

    This study aimed to determine the extent to which the principles of privacy and equality were observed during diagnostic genetic testing according to views held by patients or child patients' parents (n = 106) and by staff (n = 162) from three Finnish university hospitals. The data were collected through a structured questionnaire and analysed using the SAS 8.1 statistical software. In general, the two principles were observed relatively satisfactorily in clinical practice. According to patients/parents, equality in the post-analytic phase and, according to staff, privacy in the pre-analytic phase, involved the greatest ethical problems. The two groups differed in their views concerning pre-analytic privacy. Although there were no major problems regarding the two principles, the differences between the testing phases require further clarification. To enhance privacy protection and equality, professionals need to be given more genetics/ethics training, and patients individual counselling by genetics units staff, giving more consideration to patients' world-view, the purpose of the test and the test result.

  2. 77 FR 75409 - Multistakeholder Meetings To Develop Consumer Data Privacy Code of Conduct Concerning Mobile...

    Science.gov (United States)

    2012-12-20

    ... Protecting Privacy and Promoting Innovation in the Global Digital Economy (the ``Privacy Blueprint'').\\1\\ The Privacy Blueprint directs NTIA to convene multistakeholder processes to develop legally enforceable codes... services for mobile devices handle personal data.\\3\\ On July 12, 2012, NTIA convened the first meeting of...

  3. Impact of Mini-drone based Video Surveillance on Invasion of Privacy

    OpenAIRE

    Korshunov, Pavel; Bonetto, Margherita; Ebrahimi, Touradj; Ramponi, Giovanni

    2015-01-01

    An increase in adoption of video surveillance, affecting many aspects of daily lives, raises public concern about an intrusion into individual privacy. New sensing and surveillance technologies, such as mini-drones, threaten to eradicate boundaries of private space even more. Therefore, it is important to study the effect of mini-drones on privacy intrusion and to understand how existing protection privacy filters perform on a video captured by a mini-drone. To this end, we have built a publi...

  4. Security and Privacy Analyses of Internet of Things Toys

    OpenAIRE

    Chu, Gordon; Apthorpe, Noah; Feamster, Nick

    2018-01-01

    This paper investigates the security and privacy of Internet-connected children's smart toys through case studies of three commercially-available products. We conduct network and application vulnerability analyses of each toy using static and dynamic analysis techniques, including application binary decompilation and network monitoring. We discover several publicly undisclosed vulnerabilities that violate the Children's Online Privacy Protection Rule (COPPA) as well as the toys' individual pr...

  5. Security of electronic medical information and patient privacy: what you need to know.

    Science.gov (United States)

    Andriole, Katherine P

    2014-12-01

    The responsibility that physicians have to protect their patients from harm extends to protecting the privacy and confidentiality of patient health information including that contained within radiological images. The intent of HIPAA and subsequent HIPAA Privacy and Security Rules is to keep patients' private information confidential while allowing providers access to and maintaining the integrity of relevant information needed to provide care. Failure to comply with electronic protected health information (ePHI) regulations could result in financial or criminal penalties or both. Protected health information refers to anything that can reasonably be used to identify a patient (eg, name, age, date of birth, social security number, radiology examination accession number). The basic tools and techniques used to maintain medical information security and patient privacy described in this article include physical safeguards such as computer device isolation and data backup, technical safeguards such as firewalls and secure transmission modes, and administrative safeguards including documentation of security policies, training of staff, and audit tracking through system logs. Other important concepts related to privacy and security are explained, including user authentication, authorization, availability, confidentiality, data integrity, and nonrepudiation. Patient privacy and security of medical information are critical elements in today's electronic health care environment. Radiology has led the way in adopting digital systems to make possible the availability of medical information anywhere anytime, and in identifying and working to eliminate any risks to patients. Copyright © 2014 American College of Radiology. Published by Elsevier Inc. All rights reserved.
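
    The article's list of protected health information suggests a simple safeguard: strip direct identifiers before a record leaves a controlled system. The sketch below is a minimal illustration in the spirit of HIPAA-style de-identification; the field names are hypothetical, and a real system would follow the full Safe Harbor list or an expert-determination method:

```python
# Hypothetical direct-identifier fields, echoing the article's examples
# (name, date of birth, social security number, accession number).
PHI_FIELDS = {"name", "date_of_birth", "ssn", "accession_number"}

def deidentify(record: dict) -> dict:
    """Drop direct identifiers, keeping only clinically relevant fields."""
    return {k: v for k, v in record.items() if k not in PHI_FIELDS}

record = {
    "name": "Jane Doe",
    "ssn": "123-45-6789",
    "accession_number": "RAD-2014-0042",
    "modality": "CT",
    "finding": "no acute abnormality",
}
assert deidentify(record) == {"modality": "CT", "finding": "no acute abnormality"}
```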

  6. Anti-discrimination Analysis Using Privacy Attack Strategies

    KAUST Repository

    Ruggieri, Salvatore; Hajian, Sara; Kamiran, Faisal; Zhang, Xiangliang

    2014-01-01

    Social discrimination discovery from data is an important task to identify illegal and unethical discriminatory patterns towards protected-by-law groups, e.g., ethnic minorities. We deploy privacy attack strategies as tools for discrimination

  7. Privacy preserving data anonymization of spontaneous ADE reporting system dataset.

    Science.gov (United States)

    Lin, Wen-Yang; Yang, Duen-Chuan; Wang, Jie-Teng

    2016-07-18

    To facilitate long-term safety surveillance of marketed drugs, many spontaneous reporting systems (SRSs) for ADR events have been established world-wide. Since the data collected by SRSs contain sensitive personal health information that should be protected to prevent the identification of individuals, this raises the issue of privacy-preserving data publishing (PPDP), that is, how to sanitize (anonymize) raw data before publishing. Although much work has been done on PPDP, very few studies have focused on protecting the privacy of SRS data, and none of the existing anonymization methods is well suited to SRS datasets, which exhibit characteristics such as rare events, multiple records per individual, and multi-valued sensitive attributes. We propose a new privacy model called MS(k, θ*)-bounding for protecting published spontaneous ADE reporting data from privacy attacks. Our model has the flexibility of varying privacy thresholds, i.e., θ*, for different sensitive values and takes the characteristics of SRS data into consideration. We also propose an anonymization algorithm for sanitizing the raw data to meet the requirements specified through the proposed model. Our algorithm adopts a greedy clustering strategy to group the records into clusters, conforming to an anonymization metric that aims to minimize the privacy risk while maintaining the data utility for ADR detection. An empirical study was conducted using the FAERS dataset from 2004Q1 to 2011Q4. We compared our model with four prevailing methods, including k-anonymity, (X, Y)-anonymity, multi-sensitive l-diversity, and (α, k)-anonymity, evaluated via two measures, Danger Ratio (DR) and Information Loss (IL), and considered three different scenarios of threshold setting for θ*: uniform, level-wise, and frequency-based. We also conducted experiments to inspect the impact of anonymized data on the strengths of discovered ADR signals. With all three
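
    The baseline k-anonymity model that MS(k, θ*)-bounding builds on can be illustrated with a small check over quasi-identifiers. This is not the paper's algorithm — only the standard definition it generalizes, shown with hypothetical, already-generalized ADE-style records:

```python
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k):
    """A table is k-anonymous if every combination of quasi-identifier
    values is shared by at least k records, so no individual's row can
    be singled out via those attributes."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return all(count >= k for count in groups.values())

reports = [
    {"age": "30-39", "zip": "021**", "drug": "X", "reaction": "rash"},
    {"age": "30-39", "zip": "021**", "drug": "Y", "reaction": "nausea"},
    {"age": "40-49", "zip": "190**", "drug": "X", "reaction": "rash"},
    {"age": "40-49", "zip": "190**", "drug": "Z", "reaction": "headache"},
]
assert is_k_anonymous(reports, ["age", "zip"], k=2)
# Adding "drug" as a quasi-identifier makes each row unique.
assert not is_k_anonymous(reports, ["age", "zip", "drug"], k=2)
```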

  8. Scalable privacy-preserving big data aggregation mechanism

    Directory of Open Access Journals (Sweden)

    Dapeng Wu

    2016-08-01

    Full Text Available As the massive sensor data generated by large-scale Wireless Sensor Networks (WSNs) have recently become an indispensable part of ‘Big Data’, the collection, storage, transmission and analysis of big sensor data attract considerable attention from researchers. Targeting the privacy requirements of large-scale WSNs and focusing on the energy-efficient collection of big sensor data, a Scalable Privacy-preserving Big Data Aggregation (Sca-PBDA) method is proposed in this paper. Firstly, according to the pre-established gradient topology structure, sensor nodes in the network are divided into clusters. Secondly, sensor data is modified by each node according to the privacy-preserving configuration message received from the sink. Subsequently, intra- and inter-cluster data aggregation is employed during the big sensor data reporting phase to reduce energy consumption. Lastly, aggregated results are recovered by the sink to complete the privacy-preserving big data aggregation. Simulation results validate the efficacy and scalability of Sca-PBDA and show that the big sensor data generated by large-scale WSNs is efficiently aggregated to reduce network resource consumption and the sensor data privacy is effectively protected to meet the ever-growing application requirements.
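
    A common technique underlying privacy-preserving aggregation schemes of this kind (not necessarily the exact Sca-PBDA construction) is additive masking: each node perturbs its reading with a pseudorandom mask derived from a seed shared with the sink, and the sink removes the aggregate mask to recover the true sum without ever seeing an individual reading. A minimal sketch, with hypothetical seeds and readings:

```python
import random

MOD = 2**32  # work modulo a fixed word size

def mask_for(seed: int, epoch: int) -> int:
    # Pseudorandom per-epoch mask; the sink regenerates it from the seed.
    return random.Random(seed * 1_000_003 + epoch).randrange(MOD)

def masked_reading(seed: int, epoch: int, reading: int) -> int:
    return (reading + mask_for(seed, epoch)) % MOD

def recover_sum(masked: list, seeds: list, epoch: int) -> int:
    total_mask = sum(mask_for(s, epoch) for s in seeds)
    return (sum(masked) - total_mask) % MOD

seeds = [101, 202, 303]   # one secret seed per sensor node
readings = [17, 42, 5]    # individual readings stay hidden from observers
epoch = 7
masked = [masked_reading(s, epoch, r) for s, r in zip(seeds, readings)]
assert recover_sum(masked, seeds, epoch) == sum(readings)
```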

  9. Privacy enhanced group communication in clinical environment

    Science.gov (United States)

    Li, Mingyan; Narayanan, Sreeram; Poovendran, Radha

    2005-04-01

    Privacy protection of medical records has always been an important issue and is mandated by the recent Health Insurance Portability and Accountability Act (HIPAA) standards. In this paper, we propose security architectures for a tele-referring system that allows electronic group communication among professionals for better quality treatments, while protecting patient privacy against unauthorized access. Although DICOM defines the much-needed guidelines for confidentiality of medical data during transmission, there is no provision in the existing medical security systems to guarantee patient privacy once the data has been received. In our design, we address this issue by enabling tracing back to the recipient whose received data is disclosed to outsiders, using a watermarking technique. We present the security architecture design of a tele-referring system using a distributed approach and a centralized web-based approach. The resulting tele-referring system (i) provides confidentiality during transmission and ensures integrity and authenticity of the received data, (ii) allows tracing of the recipient who has either distributed the data to outsiders or whose system has been compromised, (iii) provides proof of receipt or origin, and (iv) is easy to use and low-cost to employ in a clinical environment.

  10. Ethics and Privacy Implications of Using the Internet and Social Media to Recruit Participants for Health Research: A Privacy-by-Design Framework for Online Recruitment

    Science.gov (United States)

    Cyr, Alaina B; Arbuckle, Luk; Ferris, Lorraine E

    2017-01-01

    Background The Internet and social media offer promising ways to improve the reach, efficiency, and effectiveness of recruitment efforts at a reasonable cost, but raise unique ethical dilemmas. We describe how we used social media to recruit cancer patients and family caregivers for a research study, the ethical issues we encountered, and the strategies we developed to address them. Objective Drawing on the principles of Privacy by Design (PbD), a globally recognized standard for privacy protection, we aimed to develop a PbD framework for online health research recruitment. Methods We proposed a focus group study on the dietary behaviors of cancer patients and their families, and the role of Web-based dietary self-management tools. Using an established blog on our hospital website, we proposed publishing a recruitment post and sharing the link on our Twitter and Facebook pages. The Research Ethics Board (REB) raised concern about the privacy risks associated with our recruitment strategy; by clicking on a recruitment post, an individual could inadvertently disclose personal health information to third-party companies engaged in tracking online behavior. The REB asked us to revise our social media recruitment strategy with the following questions in mind: (1) How will you inform users about the potential for privacy breaches and their implications? and (2) How will you protect users from privacy breaches or inadvertently sharing potentially identifying information about themselves? Results Ethical guidelines recommend a proportionate approach to ethics assessment, which advocates for risk mitigation strategies that are proportional to the magnitude and probability of risks. We revised our social media recruitment strategy to inform users about privacy risks and to protect their privacy, while at the same time meeting our recruitment objectives. We provide a critical reflection of the perceived privacy risks associated with our social media recruitment strategy and

  11. Realizing IoT service's policy privacy over publish/subscribe-based middleware.

    Science.gov (United States)

    Duan, Li; Zhang, Yang; Chen, Shiping; Wang, Shiyao; Cheng, Bo; Chen, Junliang

    2016-01-01

    The publish/subscribe paradigm makes IoT service collaborations more scalable and flexible, due to the space, time and control decoupling of event producers and consumers. Thus, the paradigm can be used to establish large-scale IoT service communication infrastructures such as Supervisory Control and Data Acquisition systems. However, preserving an IoT service's policy privacy is difficult in this paradigm, because a classical publisher has little control over its own events after they are published, and a subscriber has to accept all the events of the subscribed event type with no choice. Few existing publish/subscribe middleware have built-in mechanisms to address the above issues. In this paper, we present a novel access control framework, which is capable of preserving IoT services' policy privacy. In particular, we adopt the publish/subscribe paradigm as the IoT service communication infrastructure to facilitate the protection of IoT services' policy privacy. The key idea in our policy-privacy solution is using a two-layer cooperating method to match bi-directional privacy control requirements: (a) a data layer for protecting IoT events; and (b) an application layer for preserving the privacy of service policy. Furthermore, the anonymous-set-based principle is adopted to realize the functionalities of the framework, including policy embedding and policy encoding as well as policy matching. Our security analysis shows that the policy privacy framework is Chosen-Plaintext Attack secure. We extend the open source Apache ActiveMQ broker by building in a policy-based authorization mechanism to enforce the privacy policy. The performance evaluation results indicate that our approach is scalable with reasonable overheads.
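
    The anonymous-set principle mentioned in this record — letting a broker match a publisher's policy against subscriber credentials without reading either in plaintext — can be sketched with salted hashing. This is a simplified stand-in for the paper's policy embedding, encoding and matching mechanism, and the salt and credential strings are hypothetical:

```python
import hashlib

SALT = "event-type-2016"  # hypothetical per-event-type salt shared by endpoints

def blind(token: str) -> str:
    """Hash a credential so the broker can compare values it cannot read."""
    return hashlib.sha256((SALT + token).encode()).hexdigest()

def authorized(encoded_policy: set, blinded_credentials: set) -> bool:
    """Deliver an event only if the subscriber holds at least one
    credential named in the publisher's encoded policy."""
    return bool(encoded_policy & blinded_credentials)

policy = {blind("role:operator"), blind("zone:substation-7")}
subscriber = {blind("role:operator"), blind("zone:substation-9")}
outsider = {blind("role:guest")}

assert authorized(policy, subscriber)
assert not authorized(policy, outsider)
```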

  12. Privacy, technology, and norms: the case of Smart Meters.

    Science.gov (United States)

    Horne, Christine; Darras, Brice; Bean, Elyse; Srivastava, Anurag; Frickel, Scott

    2015-05-01

    Norms shift and emerge in response to technological innovation. One such innovation is Smart Meters - components of Smart Grid energy systems capable of minute-to-minute transmission of consumer electricity use information. We integrate theory from sociological research on social norms and privacy to examine how privacy threats affect the demand for and expectations of norms that emerge in response to new technologies, using Smart Meters as a test case. Results from three vignette experiments suggest that increased threats to privacy created by Smart Meters are likely to provoke strong demand for and expectations of norms opposing the technology and that the strength of these normative rules is at least partly conditional on the context. Privacy concerns vary little with actors' demographic characteristics. These findings contribute to theoretical understanding of norm emergence and have practical implications for implementing privacy protections that effectively address concerns of electricity users. Copyright © 2014 Elsevier Inc. All rights reserved.

  13. A Taxonomy of Privacy Constructs for Privacy-Sensitive Robotics

    OpenAIRE

    Rueben, Matthew; Grimm, Cindy M.; Bernieri, Frank J.; Smart, William D.

    2017-01-01

    The introduction of robots into our society will also introduce new concerns about personal privacy. In order to study these concerns, we must do human-subject experiments that involve measuring privacy-relevant constructs. This paper presents a taxonomy of privacy constructs based on a review of the privacy literature. Future work in operationalizing privacy constructs for HRI studies is also discussed.

  14. Problem of data privacy protection in direct marketing

    Directory of Open Access Journals (Sweden)

    Markov Jasmina

    2011-01-01

    Full Text Available The dynamism of modern business conditions, as well as increasing competition, calls for companies to change their usual ways of doing business and communicating with consumers. Today's direct marketing industry is therefore experiencing explosive growth, as more and more companies include these activities in their communication mix. Many companies benefit from the development and use of direct marketing, but at the same time its growing use has led to numerous problems for companies as well as consumers. Direct marketing, advanced information technologies and the Internet, on whose use it is increasingly based, have raised a number of unprecedented legal and ethical questions. One issue that makes consumers more and more worried concerns the privacy of the personal data and information being collected by a large number of companies. In addition, consumers are often not aware of this data collection, which adds even more gravity to the problem. The remainder of this paper points to the necessity and great importance of careful and responsible use of consumers' personal data by direct marketers, with the aim of building long-term partnership relationships between the two. In addition, special attention is paid to the major problems that consumers face today in the field of data protection, as well as to the efforts committed to reducing these problems to a minimum by getting consumers more involved in decisions about the use of their personal data and information.

  15. Privacy exposed : Consumer responses to data collection and usage practices of mobile apps

    NARCIS (Netherlands)

    Wottrich, V.M.

    2018-01-01

    Mobile apps are increasingly jeopardizing consumer privacy by collecting, storing, and sharing personal information. However, little is known about users’ responses to data collection and usage practices of apps. This dissertation investigated (1) the status quo of privacy protection behavior, (2)

  16. Privacy policies for health social networking sites

    Science.gov (United States)

    Li, Jingquan

    2013-01-01

    Health social networking sites (HSNS), virtual communities where users connect with each other around common problems and share relevant health data, have been increasingly adopted by medical professionals and patients. The growing use of HSNS like Sermo and PatientsLikeMe has prompted public concerns about the risks that such online data-sharing platforms pose to the privacy and security of personal health data. This paper articulates a set of privacy risks introduced by social networking in health care and presents a practical example that demonstrates how the risks might be intrinsic to some HSNS. The aim of this study is to identify and sketch the policy implications of using HSNS and how policy makers and stakeholders should elaborate upon them to protect the privacy of online health data. PMID:23599228

  17. Privacy policies for health social networking sites.

    Science.gov (United States)

    Li, Jingquan

    2013-01-01

    Health social networking sites (HSNS), virtual communities where users connect with each other around common problems and share relevant health data, have been increasingly adopted by medical professionals and patients. The growing use of HSNS like Sermo and PatientsLikeMe has prompted public concerns about the risks that such online data-sharing platforms pose to the privacy and security of personal health data. This paper articulates a set of privacy risks introduced by social networking in health care and presents a practical example that demonstrates how the risks might be intrinsic to some HSNS. The aim of this study is to identify and sketch the policy implications of using HSNS and how policy makers and stakeholders should elaborate upon them to protect the privacy of online health data.

  18. Fair Information Principles of Brazilian Companies online privacy policies

    Directory of Open Access Journals (Sweden)

    Patricia Zeni Marchiori

    2016-05-01

    Full Text Available This research presents the Fair Information Principles found in the privacy policies of the websites of major Brazilian companies (according to the 2014 Forbes Magazine list). The check and analysis were supported by a checklist compiled from documents issued by the Federal Trade Commission and the Organization for Economic Co-operation and Development. The study selected fourteen companies from a universe of twenty-five, using the criterion of immediate access to the privacy policy on their websites. The security (safeguards) principle is the most widespread, appearing in eight of the fourteen policies analyzed, while the principle of responsibility receives the least adhesion, being covered in none of the examined online privacy policies. The Sabesp Company presents the most complete privacy policy in terms of compliance with the Fair Information Principles, while WEG presents none of the principles identified in the documentary survey. As for e-commerce, the number of companies adopting some of the principles is even smaller. For the selected universe, adherence to the Fair Information Principles is still incipient, although their use is not mandatory. An open discussion of the proposed Brazilian law on personal data protection should play an important role in creating further guidance on the subject. Additional studies should involve the perception of users, as well as a sample of companies focused on e-commerce, since effective alignment with these principles and other guidelines is required to protect users' privacy and personal data in the web environment.

  19. Beyond Concern: K-12 Faculty and Staff's Perspectives on Privacy Topics and Cybersafety

    Science.gov (United States)

    Hipsky, Shellie; Younes, Wiam

    2015-01-01

    In a time when discussions about information privacy dominate the media, research on Cybersafety education reveals that K-12 teachers and staff are concerned about information privacy in schools and they seek to learn more about the protection of their students' and own personal information online. Privacy topics are typically introduced to the…

  20. 75 FR 28042 - Privacy Act of 1974: System of Records; Department of Homeland Security Transportation Security...

    Science.gov (United States)

    2010-05-19

    ..., VA 20598-6036 or [email protected] . For privacy issues please contact: Mary Ellen Callahan (703-235... DEPARTMENT OF HOMELAND SECURITY Office of the Secretary [Docket No. DHS-2010-0013] Privacy Act of..., Transportation Security Enforcement Record System, System of Records AGENCY: Privacy Office, DHS. ACTION: Notice...

  1. Privacy under construction : A developmental perspective on privacy perception

    NARCIS (Netherlands)

    Steijn, W.M.P.; Vedder, A.H.

    2015-01-01

    We present a developmental perspective regarding the difference in perceptions toward privacy between young and old. Here, we introduce the notion of privacy conceptions, that is, the specific ideas that individuals have regarding what privacy actually is. The differences in privacy concerns often

  2. Trade Secrets in Life Science and Pharmaceutical Companies

    Science.gov (United States)

    Nealey, Tara; Daignault, Ronald M.; Cai, Yu

    2015-01-01

    Trade secret protection arises under state common law and state statutes. In general, a trade secret is information that is not generally known to the public and is maintained as a secret, and it provides a competitive advantage or economic benefit to the trade secret holder. Trade secrets can be worth tens or hundreds of millions of dollars, and damage awards in trade secret litigation have been high; often, there is a lot at stake. Obtaining a trade secret through “improper means” is misappropriation. If the alleged trade secret, however, was developed independently, known publicly, or not maintained as a secret, then those defenses may successfully overcome a claim for trade secret misappropriation. With today’s interconnectedness in the biotechnology and pharmaceutical fields, more collaborations, joint ventures, and outsourcing arrangements among firms, and increased mobility of employees’ careers, life science companies need to not only understand how to protect their trade secrets, but also know how to defend against a claim for trade secret theft. PMID:25414378

  3. Trade secrets in life science and pharmaceutical companies.

    Science.gov (United States)

    Nealey, Tara; Daignault, Ronald M; Cai, Yu

    2014-11-20

    Trade secret protection arises under state common law and state statutes. In general, a trade secret is information that is not generally known to the public and is maintained as a secret, and it provides a competitive advantage or economic benefit to the trade secret holder. Trade secrets can be worth tens or hundreds of millions of dollars, and damage awards in trade secret litigation have been high; often, there is a lot at stake. Obtaining a trade secret through "improper means" is misappropriation. If the alleged trade secret, however, was developed independently, known publicly, or not maintained as a secret, then those defenses may successfully overcome a claim for trade secret misappropriation. With today's interconnectedness in the biotechnology and pharmaceutical fields, more collaborations, joint ventures, and outsourcing arrangements among firms, and increased mobility of employees' careers, life science companies need to not only understand how to protect their trade secrets, but also know how to defend against a claim for trade secret theft. Copyright © 2015 Cold Spring Harbor Laboratory Press; all rights reserved.

  4. Achieving Optimal Privacy in Trust-Aware Social Recommender Systems

    Science.gov (United States)

    Dokoohaki, Nima; Kaleli, Cihan; Polat, Huseyin; Matskin, Mihhail

    Collaborative filtering (CF) recommenders are subject to numerous shortcomings, such as centralized processing, vulnerability to shilling attacks, and, most important of all, privacy. To overcome these obstacles, researchers have proposed utilizing interpersonal trust between users to alleviate many of these crucial shortcomings. Until now, attention has mainly been paid to the strong points of trust-aware recommenders, such as alleviating profile sparsity or calculation cost efficiency, while the least attention has been paid to investigating the notion of privacy surrounding the disclosure of individual ratings and, most importantly, the protection of trust computation across the social networks forming the backbone of these systems. To contribute to addressing the problem of privacy in trust-aware recommenders, in this paper we first introduce a framework for enabling privacy-preserving trust-aware recommendation generation. While the trust mechanism aims at elevating the recommender's accuracy, preserving privacy requires the system's accuracy to be decreased. Since privacy and accuracy are conflicting goals in this context, we show that a Pareto set can be found as an optimal setting for both privacy-preserving and trust-enabling mechanisms. We show that this Pareto set, when used as the configuration for measuring the accuracy of the base collaborative filtering engine, yields an optimized tradeoff between the conflicting goals of privacy and accuracy. We demonstrate this concept, along with the applicability of our framework, by experimenting with accuracy and privacy factors, and we show through experiments how such an optimal set can be inferred.

  5. SafePub: A Truthful Data Anonymization Algorithm With Strong Privacy Guarantees

    Directory of Open Access Journals (Sweden)

    Bild Raffael

    2018-01-01

    Full Text Available Methods for privacy-preserving data publishing and analysis trade off privacy risks for individuals against the quality of output data. In this article, we present a data publishing algorithm that satisfies the differential privacy model. The transformations performed are truthful, which means that the algorithm does not perturb input data or generate synthetic output data. Instead, records are randomly drawn from the input dataset and the uniqueness of their features is reduced. This also offers an intuitive notion of privacy protection. Moreover, the approach is generic, as it can be parameterized with different objective functions to optimize its output towards different applications. We show this by integrating six well-known data quality models. We present an extensive analytical and experimental evaluation and a comparison with prior work. The results show that our algorithm is the first practical implementation of the described approach and that it can be used with reasonable privacy parameters resulting in high degrees of protection. Moreover, when parameterizing the generic method with an objective function quantifying the suitability of data for building statistical classifiers, we measured prediction accuracies that compare very well with results obtained using state-of-the-art differentially private classification algorithms.
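    The abstract above describes a differentially private publishing approach built on random sampling and on reducing the uniqueness of record features. As a rough illustration of that general idea only (this is our own toy sketch, not the SafePub algorithm; all attribute names, bin widths, and sampling rates are invented for the example):

```python
import random

# Toy sketch (NOT the SafePub algorithm): randomly draw records from the
# input dataset, then coarsen quasi-identifiers so fewer records remain
# unique. Attribute names and parameters here are illustrative choices.

def generalize_age(age, width=10):
    """Replace an exact age with a coarse interval label."""
    lo = (age // width) * width
    return f"{lo}-{lo + width - 1}"

def publish(records, sample_fraction=0.5, seed=0):
    """Sample records at random, then generalize 'age' and mask 'zip'."""
    rng = random.Random(seed)
    sample = [r for r in records if rng.random() < sample_fraction]
    return [{"age": generalize_age(r["age"]), "zip": r["zip"][:3] + "**"}
            for r in sample]

records = [{"age": 34, "zip": "75001"}, {"age": 37, "zip": "75002"},
           {"age": 52, "zip": "13008"}, {"age": 58, "zip": "13009"}]
published = publish(records)
print(published)
```

    The output is "truthful" in the sense the abstract uses: every published record is a coarsened version of a real input record, not synthetic or perturbed data.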

  6. Privacy-preserving data aggregation in two-tiered wireless sensor networks with mobile nodes.

    Science.gov (United States)

    Yao, Yonglei; Liu, Jingfa; Xiong, Neal N

    2014-11-10

    Privacy-preserving data aggregation in wireless sensor networks (WSNs) with mobile nodes is a challenging problem, as an accurate aggregation result should be derived in a privacy-preserving manner, under the condition that nodes are mobile and have no pre-specified keys for cryptographic operations. In this paper, we focus on the SUM aggregation function and propose two privacy-preserving data aggregation protocols for two-tiered sensor networks with mobile nodes: Privacy-preserving Data Aggregation against non-colluded Aggregator and Sink (PDAAS) and Privacy-preserving Data Aggregation against Colluded Aggregator and Sink (PDACAS). Both protocols guarantee that the sink can derive the SUM of all raw sensor data but each sensor's raw data is kept confidential. In PDAAS, two keyed values are used, one shared with the sink and the other shared with the aggregator. PDAAS can protect the privacy of sensed data against external eavesdroppers, compromised sensor nodes, the aggregator or the sink, but fails if the aggregator and the sink collude. In PDACAS, multiple keyed values are used in data perturbation, which are not shared with the aggregator or the sink. PDACAS can protect the privacy of sensor nodes even the aggregator and the sink collude, at the cost of a little more overhead than PDAAS. Thorough analysis and experiments are conducted, which confirm the efficacy and efficiency of both schemes.
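    The two-key construction described above (one key shared with the sink, one with the aggregator) can be illustrated with a minimal additive-masking sketch. This is our own toy rendering of the general idea, not the PDAAS protocol itself; key sizes, modulus, and key distribution are simplified assumptions:

```python
# Toy sketch of keyed additive masking for SUM aggregation (illustrative
# only; the PDAAS protocol in the paper differs in detail). Arithmetic is
# modulo M so masked values reveal nothing about individual readings.

M = 2**32

def mask(value, k_sink, k_agg):
    """Sensor: hide its reading under two keys before transmission."""
    return (value + k_sink + k_agg) % M

def aggregate(masked_values, agg_keys):
    """Aggregator: sum masked values and strip its own keys.
    Individual readings stay hidden behind the sink keys."""
    return (sum(masked_values) - sum(agg_keys)) % M

def unmask(partial_sum, sink_keys):
    """Sink: remove the sink keys to recover the exact SUM."""
    return (partial_sum - sum(sink_keys)) % M

readings  = [17, 4, 23]
sink_keys = [11111, 22222, 33333]   # shared sensor <-> sink
agg_keys  = [44444, 55555, 66666]   # shared sensor <-> aggregator

masked = [mask(v, ks, ka) for v, ks, ka in zip(readings, sink_keys, agg_keys)]
partial = aggregate(masked, agg_keys)
print(unmask(partial, sink_keys))   # 44 = 17 + 4 + 23
```

    Neither party alone can recover a reading: the aggregator sees values still masked by the sink keys, and the sink only ever receives an aggregate. As the abstract notes, this style of scheme breaks down if aggregator and sink collude and pool their keys.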

  7. A Generic Privacy Quantification Framework for Privacy-Preserving Data Publishing

    Science.gov (United States)

    Zhu, Zutao

    2010-01-01

    In recent years, concerns about the privacy of electronic data collected by government agencies, organizations, and industries have been increasing. These include individual privacy and knowledge privacy. Privacy-preserving data publishing is a research branch that preserves privacy while, at the same time, retaining useful information in…

  8. 76 FR 21091 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/Centers for Medicare & Medicaid...

    Science.gov (United States)

    2011-04-14

    ...: Social Security Administration (SSA). ACTION: Notice of a renewal of an existing computer matching...: A. General The Computer Matching and Privacy Protection Act of 1988 (Public Law (Pub. L.) 100-503...), as amended, (Pub. L. 100-503, the Computer Matching and Privacy Protection Act (CMPPA) of 1988), the...

  9. Student Data Privacy Communications Toolkit

    Science.gov (United States)

    Foundation for Excellence in Education, 2016

    2016-01-01

    Parents expect school districts and schools to keep their children safe while they are in school. That expectation of safety and security also extends to the protection of their children's learning data. Therefore, it is critical that school districts and schools are open and transparent about their student data privacy practices, and that those…

  10. The Models of Applying Online Privacy Literacy Strategies: A Case Study of Instagram Girl Users

    OpenAIRE

    Abdollah Bicharanlou; Seyedeh farzaneh Siasi rad

    2017-01-01

    Social networks have a remarkable effect on the lives of virtual-space users. Like most human relations, these networks involve a compromise between self-disclosure and privacy protection, a process realized through improving privacy and empowering the user at the personal level. This study aimed to assess strategies based on online privacy literacy, in particular the strategies that young female Instagram users should employ to achieve the optimum level of privacy. For this purpose, firstly the...

  11. Cervicovaginal secretions protect from human papillomavirus infection: effects of vaginal douching.

    Science.gov (United States)

    Chu, Tang-Yuan; Chang, Ying-Cheng; Ding, Dah-Ching

    2013-06-01

    Cervicovaginal secretions (CVSs) are reported to protect against human papillomavirus (HPV) infection. Although vaginal douching is known to clear both viral inoculants and CVSs, its effect on CVSs in women with HPV infection is unknown. The in vitro HPV pseudovirus infection system was used to test the protective activity of CVSs against HPV infection in samples collected before and after vaginal douching. To simulate different time points of vaginal douching in relation to viral exposure, the cell CVS reconstitute was washed after different viral exposure durations. In the CVSs of premenopausal and postmenopausal women who did not perform douching, the CVSs inhibited HPV infection by 56.7 ± 1.8% and 53.6 ± 2.5%, respectively; in women who had performed douching, the CVSs inhibited HPV infection by only 31.2 ± 7.1%, which was significantly lower (p infection existed for up to 8 hours after HPV exposure, and cell washing increased the clearance to up to 82-93% of the infectious load. This study confirms the protective activity of CVSs against HPV infection regardless of age. In this in vitro study, the net effect of douching was found to be beneficial. Copyright © 2013. Published by Elsevier B.V.

  12. Surveillance, Privacy and Trans-Atlantic Relations

    DEFF Research Database (Denmark)

    Recent revelations, by Edward Snowden and others, of the vast network of government spying enabled by modern technology have raised major concerns both in the European Union and the United States on how to protect privacy in the face of increasing governmental surveillance. This book brings...

  13. Differential Privacy and Machine Learning: a Survey and Review

    OpenAIRE

    Ji, Zhanglong; Lipton, Zachary C.; Elkan, Charles

    2014-01-01

    The objective of machine learning is to extract useful information from data, while privacy is preserved by concealing information. Thus it seems hard to reconcile these competing interests. However, they frequently must be balanced when mining sensitive data. For example, medical research represents an important application where it is necessary both to extract useful information and protect patient privacy. One way to resolve the conflict is to extract general characteristics of whole popul...
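    One standard tool covered by surveys of this area is the Laplace mechanism: adding noise calibrated to a query's sensitivity so that the released statistic is ε-differentially private. The following sketch is our own minimal illustration (parameter values and the example query are invented), not code from the paper:

```python
import math
import random

# Minimal sketch of the Laplace mechanism for differential privacy.
# A counting query has sensitivity 1, so adding Laplace(1/epsilon) noise
# makes the released count epsilon-differentially private.

def dp_count(values, predicate, epsilon, rng):
    """Release a count perturbed with Laplace noise of scale 1/epsilon."""
    true_count = sum(1 for v in values if predicate(v))
    # Sample Laplace noise via inverse-CDF from a uniform draw.
    u = rng.random() - 0.5
    noise = -(1.0 / epsilon) * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))
    return true_count + noise

rng = random.Random(42)
ages = [23, 35, 41, 29, 52, 64, 38]
noisy = dp_count(ages, lambda a: a >= 40, epsilon=0.5, rng=rng)
print(round(noisy, 2))  # true count is 3; the released value is perturbed
```

    Smaller ε means larger noise and stronger privacy but less useful output, which is exactly the utility/privacy tension the abstract describes.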

  14. Trust information-based privacy architecture for ubiquitous health.

    Science.gov (United States)

    Ruotsalainen, Pekka Sakari; Blobel, Bernd; Seppälä, Antto; Nykänen, Pirkko

    2013-10-08

    Ubiquitous health is defined as a dynamic network of interconnected systems that offers health services independent of time and location to a data subject (DS). The network operates in an open and unsecured information space. It is created and managed by the DS, who sets rules that regulate the way personal health information is collected and used. Compared to health care, it is impossible in ubiquitous health to assume the existence of a priori trust between the DS and service providers and to produce privacy using static security services. In ubiquitous health, the features, business goals, and regulations of the systems involved often remain unknown. Furthermore, health-care-specific regulations do not govern the ways health data is processed and shared. To be successful, ubiquitous health requires a novel privacy architecture. The goal of this study was to develop a privacy management architecture that helps the DS to create and dynamically manage the network and to maintain information privacy. The architecture should enable the DS to dynamically define service- and system-specific rules that regulate the way subject data is processed. The architecture should provide the DS with reliable trust information about systems and assist in the formulation of privacy policies. Furthermore, the architecture should give feedback on how systems follow the policies of the DS and offer protection against privacy and trust threats existing in ubiquitous environments. A sequential method that combines methodologies used in system theory, systems engineering, requirement analysis, and system design was used in the study. In the first phase, principles, trust and privacy models, and viewpoints were selected. Thereafter, functional requirements and services were developed on the basis of a careful analysis of existing research published in journals and conference proceedings. 
Based on principles, models, and requirements, architectural components and their interconnections were developed using system

  15. Choose Privacy Week: Educate Your Students (and Yourself) about Privacy

    Science.gov (United States)

    Adams, Helen R.

    2016-01-01

    The purpose of "Choose Privacy Week" is to encourage a national conversation to raise awareness of the growing threats to personal privacy online and in day-to-day life. The 2016 Choose Privacy Week theme is "respecting individuals' privacy," with an emphasis on minors' privacy. A plethora of issues relating to minors' privacy…

  16. 75 FR 63703 - Privacy Act of 1974; Privacy Act Regulation

    Science.gov (United States)

    2010-10-18

    ... FEDERAL RESERVE SYSTEM 12 CFR Part 261a [Docket No. R-1313] Privacy Act of 1974; Privacy Act... implementing the Privacy Act of 1974 (Privacy Act). The primary changes concern the waiver of copying fees... records under the Privacy Act; the amendment of special procedures for the release of medical records to...

  17. 75 FR 7648 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/Department of Veterans Affairs...

    Science.gov (United States)

    2010-02-22

    ... Computer Matching and Privacy Protection Act of 1988 (Public Law (Pub. L.) 100-503), amended the Privacy... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2010-0006] Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ Department of Veterans Affairs/Veterans Benefits Administration (VA/ VBA...

  18. Designing Privacy for You : A User Centric Approach For Privacy

    OpenAIRE

    Senarath, Awanthika; Arachchilage, Nalin A. G.; Slay, Jill

    2017-01-01

    Privacy directly concerns the user as the data owner (data-subject) and hence privacy in systems should be implemented in a manner which concerns the user (user-centered). There are many concepts and guidelines that support development of privacy and embedding privacy into systems. However, none of them approaches privacy in a user-centered manner. Through this research we propose a framework that would enable developers and designers to grasp privacy in a user-centered manner and implement...

  19. The Privacy Coach: Supporting customer privacy in the Internet of Things

    OpenAIRE

    Broenink, Gerben; Hoepman, Jaap-Henk; Hof, Christian van 't; van Kranenburg, Rob; Smits, David; Wisman, Tijmen

    2010-01-01

    The Privacy Coach is an application running on a mobile phone that supports customers in making privacy decisions when confronted with RFID tags. The approach we take to increase customer privacy is a radical departure from the mainstream research efforts that focus on implementing privacy enhancing technologies on the RFID tags themselves. Instead the Privacy Coach functions as a mediator between customer privacy preferences and corporate privacy policies, trying to find a match between the ...

  20. Quantifying the costs and benefits of privacy-preserving health data publishing.

    Science.gov (United States)

    Khokhar, Rashid Hussain; Chen, Rui; Fung, Benjamin C M; Lui, Siu Man

    2014-08-01

    Cost-benefit analysis is a prerequisite for making good business decisions. In the business environment, companies intend to make profit from maximizing information utility of published data while having an obligation to protect individual privacy. In this paper, we quantify the trade-off between privacy and data utility in health data publishing in terms of monetary value. We propose an analytical cost model that can help health information custodians (HICs) make better decisions about sharing person-specific health data with other parties. We examine relevant cost factors associated with the value of anonymized data and the possible damage cost due to potential privacy breaches. Our model guides an HIC to find the optimal value of publishing health data and could be utilized for both perturbative and non-perturbative anonymization techniques. We show that our approach can identify the optimal value for different privacy models, including K-anonymity, LKC-privacy, and ∊-differential privacy, under various anonymization algorithms and privacy parameters through extensive experiments on real-life data. Copyright © 2014 Elsevier Inc. All rights reserved.
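    The trade-off the paper monetizes can be caricatured in a few lines: stronger anonymization (e.g. a larger k in k-anonymity) lowers both the data's utility value and the expected breach cost, and the custodian picks the parameter with the highest net value. The sketch below is purely our own numeric illustration under made-up cost curves, not the paper's model:

```python
# Toy illustration of cost-benefit analysis for anonymization strength
# (NOT the paper's model): utility degrades slowly with k while expected
# breach cost falls quickly, so an interior optimum can exist.
# All dollar figures and curve shapes are invented for the example.

def net_value(k, base_value=100_000.0, damage=150_000.0):
    utility = base_value * k ** -0.3     # data value shrinks as k grows
    expected_breach_cost = damage / k    # re-identification risk falls faster
    return utility - expected_breach_cost

candidates = [2, 5, 10, 25, 50]
best_k = max(candidates, key=net_value)
print(best_k, round(net_value(best_k)))
```

    Under these invented curves the optimum is interior: very weak anonymization is dominated by breach exposure, very strong anonymization destroys the data's value.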

  1. Hacktivism 1-2-3: how privacy enhancing technologies change the face of anonymous hacktivism

    NARCIS (Netherlands)

    Bodó, B.

    2014-01-01

    This short essay explores how the notion of hacktivism changes due to easily accessible, military grade Privacy Enhancing Technologies (PETs). Privacy Enhancing Technologies, technological tools which provide anonymous communications and protect users from online surveillance enable new forms of

  2. Toward privacy-preserving JPEG image retrieval

    Science.gov (United States)

    Cheng, Hang; Wang, Jingyue; Wang, Meiqing; Zhong, Shangping

    2017-07-01

    This paper proposes a privacy-preserving retrieval scheme for JPEG images based on local variance. Three parties are involved in the scheme: the content owner, the server, and the authorized user. The content owner encrypts JPEG images for privacy protection by jointly using permutation cipher and stream cipher, and then, the encrypted versions are uploaded to the server. With an encrypted query image provided by an authorized user, the server may extract blockwise local variances in different directions without knowing the plaintext content. After that, it can calculate the similarity between the encrypted query image and each encrypted database image by a local variance-based feature comparison mechanism. The authorized user with the encryption key can decrypt the returned encrypted images with plaintext content similar to the query image. The experimental results show that the proposed scheme not only provides effective privacy-preserving retrieval service but also ensures both format compliance and file size preservation for encrypted JPEG images.
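    One property that permutation-based encryption can exploit, and which makes server-side feature extraction like the above plausible, is that statistics such as variance are invariant under reordering. The check below is our own toy illustration of that single property; the cited scheme additionally uses a stream cipher and is considerably more involved:

```python
import random
import statistics

# Toy check: shuffling the pixel values inside a block destroys their
# arrangement but leaves their variance unchanged, so a server could
# compare blockwise variances without learning the plaintext layout.
# (Illustration only; not the paper's full encryption scheme.)

rng = random.Random(7)
block = [12, 200, 45, 45, 90, 33, 17, 180, 76]   # one "pixel block"
shuffled = block[:]
rng.shuffle(shuffled)                             # permutation-cipher stand-in

print(statistics.pvariance(block), statistics.pvariance(shuffled))
```

    Because the multiset of values is unchanged, the two variances are exactly equal, which is what lets similarity comparison proceed on the encrypted data.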

  3. 78 FR 69925 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/Bureau of the Fiscal Service...

    Science.gov (United States)

    2013-11-21

    ... regarding protections for such persons. The Privacy Act, as amended, regulates the use of computer matching... savings securities. C. Authority for Conducting the Matching Program This computer matching agreement sets... amended by the Computer Matching and Privacy Protection Act of 1988, as amended, and the regulations and...

  4. 76 FR 5235 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA Internal Match)-Match Number 1014

    Science.gov (United States)

    2011-01-28

    ...; Computer Matching Program (SSA Internal Match)--Match Number 1014 AGENCY: Social Security Administration... regarding protections for such persons. The Privacy Act, as amended, regulates the use of computer matching....C. 552a, as amended, and the provisions of the Computer Matching and Privacy Protection Act of 1988...

  5. Auditing cloud computing a security and privacy guide

    CERN Document Server

    Halpert, Ben

    2011-01-01

    The auditor's guide to ensuring correct security and privacy practices in a cloud computing environment Many organizations are reporting or projecting a significant cost savings through the use of cloud computing-utilizing shared computing resources to provide ubiquitous access for organizations and end users. Just as many organizations, however, are expressing concern with security and privacy issues for their organization's data in the "cloud." Auditing Cloud Computing provides necessary guidance to build a proper audit to ensure operational integrity and customer data protection, among othe

  6. Ethics and Privacy Implications of Using the Internet and Social Media to Recruit Participants for Health Research: A Privacy-by-Design Framework for Online Recruitment.

    Science.gov (United States)

    Bender, Jacqueline Lorene; Cyr, Alaina B; Arbuckle, Luk; Ferris, Lorraine E

    2017-04-06

    The Internet and social media offer promising ways to improve the reach, efficiency, and effectiveness of recruitment efforts at a reasonable cost, but raise unique ethical dilemmas. We describe how we used social media to recruit cancer patients and family caregivers for a research study, the ethical issues we encountered, and the strategies we developed to address them. Drawing on the principles of Privacy by Design (PbD), a globally recognized standard for privacy protection, we aimed to develop a PbD framework for online health research recruitment. We proposed a focus group study on the dietary behaviors of cancer patients and their families, and the role of Web-based dietary self-management tools. Using an established blog on our hospital website, we proposed publishing a recruitment post and sharing the link on our Twitter and Facebook pages. The Research Ethics Board (REB) raised concern about the privacy risks associated with our recruitment strategy; by clicking on a recruitment post, an individual could inadvertently disclose personal health information to third-party companies engaged in tracking online behavior. The REB asked us to revise our social media recruitment strategy with the following questions in mind: (1) How will you inform users about the potential for privacy breaches and their implications? and (2) How will you protect users from privacy breaches or inadvertently sharing potentially identifying information about themselves? Ethical guidelines recommend a proportionate approach to ethics assessment, which advocates for risk mitigation strategies that are proportional to the magnitude and probability of risks. We revised our social media recruitment strategy to inform users about privacy risks and to protect their privacy, while at the same time meeting our recruitment objectives. We provide a critical reflection of the perceived privacy risks associated with our social media recruitment strategy and the appropriateness of the risk

  7. SecureMA: protecting participant privacy in genetic association meta-analysis

    OpenAIRE

    Xie, Wei; Kantarcioglu, Murat; Bush, William S.; Crawford, Dana; Denny, Joshua C.; Heatherly, Raymond; Malin, Bradley A.

    2014-01-01

    Motivation: Sharing genomic data is crucial to support scientific investigation such as genome-wide association studies. However, recent investigations suggest the privacy of the individual participants in these studies can be compromised, leading to serious concerns and consequences, such as overly restricted access to data.

  8. Users or Students? Privacy in University MOOCS.

    Science.gov (United States)

    Jones, Meg Leta; Regner, Lucas

    2016-10-01

    Two terms, student privacy and Massive Open Online Courses, have received a significant amount of attention recently. Both represent interesting sites of change in entrenched structures, one educational and one legal. MOOCs represent something college courses have never been able to provide: universal access. Universities not wanting to miss the MOOC wave have started to build MOOC courses and integrate them into the university system in various ways. However, the design and scale of university MOOCs create tension for privacy laws intended to regulate information practices exercised by educational institutions. Are MOOCs part of the educational institutions these laws and policies aim to regulate? Are MOOC users students whose data are protected by aforementioned laws and policies? Many university researchers and faculty members are asked to participate as designers and instructors in MOOCs but may not know how to approach the issues proposed. While recent scholarship has addressed the disruptive nature of MOOCs, student privacy generally, and data privacy in the K-12 system, we provide an in-depth description and analysis of the MOOC phenomenon and the privacy laws and policies that guide and regulate educational institutions today. We offer privacy case studies of three major MOOC providers active in the market today to reveal inconsistencies among MOOC platform and the level and type of legal uncertainty surrounding them. Finally, we provide a list of organizational questions to pose internally to navigate the uncertainty presented to university MOOC teams.

  9. Privacy Management and Networked PPD Systems - Challenges and Solutions.

    Science.gov (United States)

    Ruotsalainen, Pekka; Pharow, Peter; Petersen, Francoise

    2015-01-01

    Modern personal portable health devices (PPDs) are increasingly part of a larger, inhomogeneous information system. Information collected by sensors is stored and processed in global clouds. Services are often free of charge, but at the same time service providers' business models are based on the disclosure of users' intimate health information. Health data processed in PPD networks is not regulated by health-care-specific legislation. In PPD networks, there is no guarantee that stakeholders share the same ethical principles as the user. Service providers often have their own security and privacy policies, and they rarely offer users the possibility to define their own privacy policies or adapt existing ones. All of this raises serious ethical and privacy concerns. In this paper, the authors analyze privacy challenges in PPD networks from the user's viewpoint using a system modeling method and propose that the principle "Personal Health Data under Personal Control" must be generally accepted at the global level. Among possible implementations of this principle, the authors propose encryption, computer-understandable privacy policies, and privacy labels or trust-based privacy management methods. The latter can be realized using an infrastructural trust calculation and monitoring service. A first step is to require the protection of personal health information and to make the proposed principle internationally mandatory. This requires both regulatory and standardization activities, as well as the availability of open and certified software applications which all service providers can implement. One of those applications should be the independent Trust verifier.

  10. Do privacy and security regulations need a status update? Perspectives from an intergenerational survey.

    Directory of Open Access Journals (Sweden)

    Stacey Pereira

    Full Text Available The importance of health privacy protections in the era of the "Facebook Generation" has been called into question. The ease with which younger people share personal information about themselves has led to the assumption that they are less concerned than older generations about the privacy of their information, including health information. We explored whether survey respondents' views toward health privacy suggest that efforts to strengthen privacy protections as health information is moved online are unnecessary. Using Amazon's Mechanical Turk (MTurk), which is well-known for recruitment for survey research, we distributed a 45-item survey to individuals in the U.S. to assess their perspectives toward privacy and security of online and health information, social media behaviors, use of health and fitness devices, and demographic information. 1310 participants (mean age: 36 years, 50% female, 78% non-Hispanic white, 54% college graduates or higher) were categorized by generations: Millennials, Generation X, and Baby Boomers. In multivariate regression models, we found that generational cohort was an independent predictor of level of concern about privacy and security of both online and health information. Younger generations were significantly less likely to be concerned than older generations (all P < 0.05), but generational cohort was not a significant predictor of behaviors related to the privacy or security of online or health information (all P > 0.05). This study is limited by the non-representativeness of our sample. Though Millennials reported lower levels of concern about privacy and security, this was not related to internet or social media behaviors, and majorities within all generations reported concern about both the privacy and security of their health information. Thus, there is no intergenerational imperative to relax privacy and security standards, and it would be advisable to take privacy and security of health information more seriously.

  11. PRIVACY IN CLOUD COMPUTING: A SURVEY

    OpenAIRE

    Arockiam L; Parthasarathy G; Monikandan S

    2012-01-01

    Various cloud computing models are used to increase the profit of an organization. Cloud provides a convenient environment and more advantages to business organizations to run their business. But, it has some issues related to the privacy of data. User’s data are stored and maintained out of user’s premises. The failure of data protection causes many issues like data theft which affects the individual organization. The cloud users may be satisfied, if their data are protected p...

  12. Defending Privacy: the Development and Deployment of a Darknet

    OpenAIRE

    McManamon, Conor; Mtenzi, Fredrick

    2010-01-01

    New measures imposed by governments, Internet service providers and other third parties which threaten the state of privacy are also opening new avenues to protecting it. The unwarranted scrutiny of legitimate services such as file hosters and the BitTorrent protocol, once relatively unknown to the casual Internet user, is becoming more obvious. The darknet is a rising contender against these new measures and will preserve the default right to privacy of Internet users. A darknet is defined i...

  13. Do privacy and security regulations need a status update? Perspectives from an intergenerational survey

    Science.gov (United States)

    Pereira, Stacey; Robinson, Jill Oliver; Gutierrez, Amanda M.; Majumder, Mary A.; McGuire, Amy L.; Rothstein, Mark A.

    2017-01-01

    Background The importance of health privacy protections in the era of the “Facebook Generation” has been called into question. The ease with which younger people share personal information about themselves has led to the assumption that they are less concerned than older generations about the privacy of their information, including health information. We explored whether survey respondents’ views toward health privacy suggest that efforts to strengthen privacy protections as health information is moved online are unnecessary. Methods Using Amazon’s Mechanical Turk (MTurk), which is well-known for recruitment for survey research, we distributed a 45-item survey to individuals in the U.S. to assess their perspectives toward privacy and security of online and health information, social media behaviors, use of health and fitness devices, and demographic information. Results 1310 participants (mean age: 36 years, 50% female, 78% non-Hispanic white, 54% college graduates or higher) were categorized by generations: Millennials, Generation X, and Baby Boomers. In multivariate regression models, we found that generational cohort was an independent predictor of level of concern about privacy and security of both online and health information. Younger generations were significantly less likely to be concerned than older generations (all P < 0.05), but generational cohort was not a significant predictor of behaviors related to the privacy or security of online or health information (all P > 0.05). Limitations This study is limited by the non-representativeness of our sample. Conclusions Though Millennials reported lower levels of concern about privacy and security, this was not related to internet or social media behaviors, and majorities within all generations reported concern about both the privacy and security of their health information. Thus, there is no intergenerational imperative to relax privacy and security standards, and it would be advisable to take privacy and security of health information more seriously. PMID:28926626

  14. Do privacy and security regulations need a status update? Perspectives from an intergenerational survey.

    Science.gov (United States)

    Pereira, Stacey; Robinson, Jill Oliver; Peoples, Hayley A; Gutierrez, Amanda M; Majumder, Mary A; McGuire, Amy L; Rothstein, Mark A

    2017-01-01

    The importance of health privacy protections in the era of the "Facebook Generation" has been called into question. The ease with which younger people share personal information about themselves has led to the assumption that they are less concerned than older generations about the privacy of their information, including health information. We explored whether survey respondents' views toward health privacy suggest that efforts to strengthen privacy protections as health information is moved online are unnecessary. Using Amazon's Mechanical Turk (MTurk), which is well-known for recruitment for survey research, we distributed a 45-item survey to individuals in the U.S. to assess their perspectives toward privacy and security of online and health information, social media behaviors, use of health and fitness devices, and demographic information. 1310 participants (mean age: 36 years, 50% female, 78% non-Hispanic white, 54% college graduates or higher) were categorized by generations: Millennials, Generation X, and Baby Boomers. In multivariate regression models, we found that generational cohort was an independent predictor of level of concern about privacy and security of both online and health information. Younger generations were significantly less likely to be concerned than older generations (all P < 0.05), but generational cohort was not a significant predictor of behaviors related to the privacy or security of online or health information (all P > 0.05). This study is limited by the non-representativeness of our sample. Though Millennials reported lower levels of concern about privacy and security, this was not related to internet or social media behaviors, and majorities within all generations reported concern about both the privacy and security of their health information. Thus, there is no intergenerational imperative to relax privacy and security standards, and it would be advisable to take privacy and security of health information more seriously.

  15. Privacy transparency patterns

    NARCIS (Netherlands)

    Siljee B.I.J.

    2015-01-01

    This paper describes two privacy patterns for creating privacy transparency: the Personal Data Table pattern and the Privacy Policy Icons pattern, as well as a full overview of privacy transparency patterns. It is a first step in creating a full set of privacy design patterns, which will aid

  16. 78 FR 15734 - Privacy Act of 1974; Computer Matching Program

    Science.gov (United States)

    2013-03-12

    ... 1974; Computer Matching Program AGENCY: Department of Homeland Security/U.S. Citizenship and... computer matching program between the Department of Homeland Security/U.S. Citizenship and Immigration... Protection Act of 1988 (Pub. L. 100-503) and the Computer Matching and Privacy Protection Amendments of 1990...

  17. 78 FR 15733 - Privacy Act of 1974; Computer Matching Program

    Science.gov (United States)

    2013-03-12

    ... 1974; Computer Matching Program AGENCY: Department of Homeland Security/U.S. Citizenship and... computer matching program between the Department of Homeland Security/U.S. Citizenship and Immigration... Protection Act of 1988 (Pub. L. 100-503) and the Computer Matching and Privacy Protection Amendments of 1990...

  18. Open-source intelligence and privacy by design

    NARCIS (Netherlands)

    Koops, B.J.; Hoepman, J.H.; Leenes, R.

    2013-01-01

    As demonstrated by other papers on this issue, open-source intelligence (OSINT) by state authorities poses challenges for privacy protection and intellectual-property enforcement. A possible strategy to address these challenges is to adapt the design of OSINT tools to embed normative requirements,

  19. Controlling the signal: Practical privacy protection of genomic data sharing through Beacon services.

    Science.gov (United States)

    Wan, Zhiyu; Vorobeychik, Yevgeniy; Kantarcioglu, Murat; Malin, Bradley

    2017-07-26

    Genomic data is increasingly collected by a wide array of organizations. As such, there is a growing demand to make summary information about such collections available more widely. However, over the past decade, a series of investigations have shown that attacks, rooted in statistical inference methods, can be applied to discern the presence of a known individual's DNA sequence in the pool of subjects. Recently, it was shown that the Beacon Project of the Global Alliance for Genomics and Health, a web service for querying about the presence (or absence) of a specific allele, was vulnerable. The Integrating Data for Analysis, Anonymization, and Sharing (iDASH) Center modeled a track in their third Privacy Protection Challenge on how to mitigate the Beacon vulnerability. We developed the winning solution for this track. This paper describes our computational method to optimize the tradeoff between the utility and the privacy of the Beacon service. We generalize the genomic data sharing problem beyond that which was introduced in the iDASH Challenge to be more representative of real-world scenarios and to allow for a more comprehensive evaluation. We then conduct a sensitivity analysis of our method with respect to several state-of-the-art methods using a dataset of 400,000 positions in Chromosome 10 for 500 individuals from Phase 3 of the 1000 Genomes Project. All methods are evaluated for utility, privacy and efficiency. Our method achieves better performance than all state-of-the-art methods, irrespective of how key factors (e.g., the allele frequency in the population, the size of the pool and utility weights) change from the original parameters of the problem. We further illustrate that it is possible for our method to exhibit subpar performance under special cases of allele query sequences. However, we show our method can be extended to address this issue when the query sequence is fixed and known a priori to the data custodian, so that they can plan accordingly.
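
    The privacy/utility trade-off this record optimizes can be pictured with a toy randomized-response Beacon: answer presence queries truthfully, but deny a fraction of true hits so membership-inference attacks lose statistical power. This is only a sketch of the general idea, not the authors' optimized method; the function name and the flipping rule are illustrative assumptions.

```python
import random

def beacon_answer(database, allele, flip_prob, rng=random):
    """Answer a Beacon-style allele-presence query.

    Truthful for absent alleles; denies a true hit with probability
    flip_prob, trading query utility for membership privacy
    (illustrative randomized response, not the iDASH-winning method).
    """
    present = allele in database
    if present and rng.random() < flip_prob:
        return False  # mask a true hit
    return present
```

    With flip_prob = 0 the Beacon is fully truthful; raising it suppresses the signal an attacker needs, at the cost of false "absent" answers.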

  20. Privacy Awareness: A Means to Solve the Privacy Paradox?

    Science.gov (United States)

    Pötzsch, Stefanie

    People are limited in their resources: they have limited memory capabilities, cannot pay attention to too many things at the same time, and forget much information after a while; computers do not suffer from these limitations. Thus, revealing personal data in electronic communication environments while being completely unaware of the privacy impact may cause many privacy issues later. Even if people are privacy-aware in general, the so-called privacy paradox shows that they do not behave according to their stated attitudes. This paper discusses explanations for the existing dichotomy between people's intentions toward disclosure of personal data and their behaviour. We present requirements on tools for privacy-awareness support in order to counteract the privacy paradox.

  1. Towards data protection compliance

    NARCIS (Netherlands)

    Zannone, N.; Petkovic, M.; Etalle, S.

    2010-01-01

    Privacy and data protection are fundamental issues nowadays for every organization. This paper calls for the development of methods, techniques and infrastructure to allow the deployment of privacy-aware IT systems, in which humans are integral part of the organizational processes and accountable

  2. Towards Data Protection Compliance

    NARCIS (Netherlands)

    Zannone, Nicola; Petkovic, M.; Etalle, Sandro

    2010-01-01

    Privacy and data protection are fundamental issues nowadays for every organization. This paper calls for the development of methods, techniques and infrastructure to allow the deployment of privacy-aware IT systems, in which humans are integral part of the organizational processes and accountable

  3. Lessons learned from a privacy breach at an academic health science centre.

    Science.gov (United States)

    Malonda, Jacqueline; Campbell, Janice; Crivianu-Gaita, Daniela; Freedman, Melvin H; Stevens, Polly; Laxer, Ronald M

    2009-01-01

    In 2007, the Hospital for Sick Children experienced a serious privacy breach when a laptop computer containing the personal health information of approximately 3,000 patients and research subjects was stolen from a physician-researcher's vehicle. This incident was reported to the information and privacy commissioner of Ontario (IPC). The IPC issued an order that required the hospital to examine and revise its policies, practices and research protocols related to the protection of personal health information and to educate staff on privacy-related matters.

  4. Privacy Policy

    Science.gov (United States)

    ... Home → NLM Privacy Policy URL of this page: https://medlineplus.gov/privacy.html NLM Privacy Policy To ... out of cookies in the most popular browsers, http://www.usa.gov/optout_instructions.shtml. Please note ...

  5. Privacy Policies for Apps Targeted Toward Youth: Descriptive Analysis of Readability

    Science.gov (United States)

    Das, Gitanjali; Cheung, Cynthia; Nebeker, Camille; Bietz, Matthew

    2018-01-01

    Background Due to the growing availability of consumer information, the protection of personal data is of increasing concern. Objective We assessed readability metrics of privacy policies for apps that are either available to or targeted toward youth to inform strategies to educate and protect youth from unintentional sharing of personal data. Methods We reviewed the 1200 highest ranked apps from the Apple and Google Play Stores and systematically selected apps geared toward youth. After applying exclusion criteria, 99 highly ranked apps geared toward minors remained, 64 of which had a privacy policy. We obtained and analyzed these privacy policies using reading grade level (RGL) as a metric. Policies were further compared as a function of app category (free vs paid; entertainment vs social networking vs utility). Results Analysis of privacy policies for these 64 apps revealed an average RGL of 12.78, which is well above the average reading level (8.0) of adults in the United States. There was also a small but statistically significant difference in word count as a function of app category (entertainment: 2546 words, social networking: 3493 words, and utility: 1038 words; P=.02). Conclusions Although users must agree to privacy policies to access digital tools and products, readability analyses suggest that these agreements are not comprehensible to most adults, let alone youth. We propose that stakeholders, including pediatricians and other health care professionals, play a role in educating youth and their guardians about the use of Web-based services and potential privacy risks, including the unintentional sharing of personal data. PMID:29301737
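
    Reading grade level can be computed several ways, and the record does not say which formula was used; the sketch below assumes the common Flesch-Kincaid grade level with a crude vowel-group syllable counter, which is enough to reproduce the qualitative finding that long, polysyllabic policy prose scores far above grade 8.

```python
import re

def count_syllables(word: str) -> int:
    """Crude heuristic: count vowel groups, at least 1 per word."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def reading_grade_level(text: str) -> float:
    """Flesch-Kincaid grade level (one common RGL metric, assumed here)."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * len(words) / len(sentences)
            + 11.8 * syllables / len(words) - 15.59)
```

    Longer sentences and more syllables per word push the grade up, which is exactly what dense privacy-policy boilerplate does.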

  6. Concentrated Differential Privacy

    OpenAIRE

    Dwork, Cynthia; Rothblum, Guy N.

    2016-01-01

    We introduce Concentrated Differential Privacy, a relaxation of Differential Privacy enjoying better accuracy than both pure differential privacy and its popular "(epsilon,delta)" relaxation without compromising on cumulative privacy loss over multiple computations.
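
    For context, the pure epsilon-DP baseline that concentrated DP relaxes can be sketched with the standard Laplace mechanism; under naive sequential composition, the privacy losses of repeated pure-DP releases add up linearly, which is the cumulative accounting that concentrated DP tightens. A minimal illustration (standard textbook constructions, not this paper's definitions):

```python
import random

def laplace_noise(scale, rng=random):
    # The difference of two independent exponentials is Laplace-distributed.
    return rng.expovariate(1.0 / scale) - rng.expovariate(1.0 / scale)

def laplace_mechanism(true_value, sensitivity, epsilon, rng=random):
    """Release a numeric query result with pure epsilon-DP."""
    return true_value + laplace_noise(sensitivity / epsilon, rng)

def basic_composition(epsilons):
    """Naive sequential composition: cumulative loss is the sum of the
    per-query epsilons (the bound relaxations like CDP improve on)."""
    return sum(epsilons)
```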

  7. Redefining genomic privacy: trust and empowerment.

    Directory of Open Access Journals (Sweden)

    Yaniv Erlich

    2014-11-01

    Full Text Available Fulfilling the promise of the genetic revolution requires the analysis of large datasets containing information from thousands to millions of participants. However, sharing human genomic data requires protecting subjects from potential harm. Current models rely on de-identification techniques in which privacy versus data utility becomes a zero-sum game. Instead, we propose the use of trust-enabling techniques to create a solution in which researchers and participants both win. To do so we introduce three principles that facilitate trust in genetic research and outline one possible framework built upon those principles. Our hope is that such trust-centric frameworks provide a sustainable solution that reconciles genetic privacy with data sharing and facilitates genetic research.

  8. Redefining genomic privacy: trust and empowerment.

    Science.gov (United States)

    Erlich, Yaniv; Williams, James B; Glazer, David; Yocum, Kenneth; Farahany, Nita; Olson, Maynard; Narayanan, Arvind; Stein, Lincoln D; Witkowski, Jan A; Kain, Robert C

    2014-11-01

    Fulfilling the promise of the genetic revolution requires the analysis of large datasets containing information from thousands to millions of participants. However, sharing human genomic data requires protecting subjects from potential harm. Current models rely on de-identification techniques in which privacy versus data utility becomes a zero-sum game. Instead, we propose the use of trust-enabling techniques to create a solution in which researchers and participants both win. To do so we introduce three principles that facilitate trust in genetic research and outline one possible framework built upon those principles. Our hope is that such trust-centric frameworks provide a sustainable solution that reconciles genetic privacy with data sharing and facilitates genetic research.

  9. Privacy Protection Method for Multiple Sensitive Attributes Based on Strong Rule

    Directory of Open Access Journals (Sweden)

    Tong Yi

    2015-01-01

    Full Text Available At present, most studies on data publishing consider only a single sensitive attribute, and work on multiple sensitive attributes is still scarce. Moreover, almost all existing studies on multiple sensitive attributes have not taken the inherent relationships between sensitive attributes into account, so an adversary can use background knowledge about these relationships to attack the privacy of users. This paper presents an attack model based on the association rules between sensitive attributes and, accordingly, presents a data publication method for multiple sensitive attributes. Through proof and analysis, the new model can prevent an adversary from using background knowledge about association rules to attack privacy, and it is able to release high-quality information. Finally, this paper verifies the above conclusions with experiments.
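
    The attack the record describes hinges on rule confidence: if a published table reveals that one sensitive value almost always co-occurs with another, an adversary who learns the first has effectively learned the second. A hypothetical two-attribute illustration (the record pairs and attribute names are invented for the example):

```python
def rule_confidence(records, antecedent, consequent):
    """Confidence of the rule antecedent -> consequent, where each
    record is a (sensitive_attr_1, sensitive_attr_2) pair."""
    matching = [r for r in records if r[0] == antecedent]
    if not matching:
        return 0.0
    return sum(1 for r in matching if r[1] == consequent) / len(matching)
```

    A confidence near 1.0 means publishing the first attribute effectively discloses the second, which is why rule-aware publication models suppress or dilute such groups.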

  10. Isolating Graphical Failure-Inducing Input for Privacy Protection in Error Reporting Systems

    Directory of Open Access Journals (Sweden)

    Matos João

    2016-04-01

    Full Text Available This work proposes a new privacy-enhancing system that minimizes the disclosure of information in error reports. Error reporting mechanisms are of the utmost importance to correct software bugs but, unfortunately, the transmission of an error report may reveal users’ private information. Some privacy-enhancing systems for error reporting have been presented in past years, yet they rely on path condition analysis, which we show in this paper to be ineffective when it comes to graphical-based input. Knowing that numerous applications have graphical user interfaces (GUIs), it is very important to overcome such a limitation. This work describes a new privacy-enhancing error reporting system, based on a new input minimization algorithm called GUImin that is geared towards GUIs, to remove input that is unnecessary to reproduce the observed failure. Before deciding whether to submit the error report, the user is provided with a step-by-step graphical replay of the minimized input, to evaluate whether it still yields sensitive information. We also provide an open-source implementation of the proposed system and evaluate it with well-known applications.

  11. 76 FR 67755 - Privacy Act of 1974; Department of Homeland Security U.S. Customs and Border Protection DHS/CBP...

    Science.gov (United States)

    2011-11-02

    ... DEPARTMENT OF HOMELAND SECURITY Office of the Secretary [Docket No. DHS-2011-0102] Privacy Act of... Data System of Records AGENCY: Privacy Office, DHS. ACTION: Notice of Privacy Act system of records. SUMMARY: In accordance with the Privacy Act of 1974 the Department of Homeland Security proposes to...

  12. Privacy and security disclosures on telecardiology websites

    NARCIS (Netherlands)

    Dubbeld, L.

    2006-01-01

    This article discusses telemedicine providers' online privacy and security disclosures. It presents the results of an exploratory study of a number of telecardiology companies' Web sites, providing insight into some of the current strategies towards data protection and information security in the

  13. The interplay between decentralization and privacy: the case of blockchain technologies

    OpenAIRE

    De Filippi, Primavera

    2016-01-01

    International audience; Decentralized architectures are gaining popularity as a way to protect one's privacy against the ubiquitous surveillance of states and corporations. Yet, in spite of the obvious benefits they provide when it comes to data sovereignty, decentralized architectures also present certain characteristics that—if not properly accounted for—might ultimately impinge upon users' privacy. While they are capable of preserving the confidentiality of data, decentralized architecture...

  14. Query Monitoring and Analysis for Database Privacy - A Security Automata Model Approach.

    Science.gov (United States)

    Kumar, Anand; Ligatti, Jay; Tu, Yi-Cheng

    2015-11-01

    Privacy and usage restriction issues are important when valuable data are exchanged or acquired by different organizations. Standard access control mechanisms either restrict or completely grant access to valuable data. On the other hand, data obfuscation limits the overall usability and may result in loss of total value. There are no standard policy enforcement mechanisms for data acquired through mutual and copyright agreements. In practice, many different types of policies can be enforced in protecting data privacy. Hence there is a need for a unified framework that encapsulates multiple suites of policies to protect the data. We present our vision of an architecture named security automata model (SAM) to enforce privacy-preserving policies and usage restrictions. SAM analyzes the input queries and their outputs to enforce various policies, liberating data owners from the burden of monitoring data access. SAM allows administrators to specify various policies and enforces them to monitor queries and control data access. Our goal is to address the problems of data usage control and protection through privacy policies that can be defined, enforced, and integrated with existing access control mechanisms using SAM. In this paper, we lay out the theoretical foundation of SAM, which is based on an automaton model named Mandatory Result Automata. We also discuss the major challenges of implementing SAM in a real-world database environment, as well as ideas to meet such challenges.
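
    The paper's Mandatory Result Automata are formal objects; the toy monitor below only conveys the flavor: a stateful filter that sits between query execution and the user, stripping a forbidden column and cutting off output once a disclosure budget is exhausted. The class, policies, and names are illustrative assumptions, not SAM's actual design.

```python
class ResultMonitor:
    """Toy result automaton: state (rows released so far) persists
    across queries, and two policies run before any row is returned."""

    def __init__(self, forbidden_column, row_budget):
        self.forbidden_column = forbidden_column
        self.row_budget = row_budget
        self.released = 0  # automaton state

    def filter_result(self, rows):
        out = []
        for row in rows:
            if self.released >= self.row_budget:
                break  # budget policy: suppress further disclosure
            # column policy: strip the forbidden attribute
            out.append({k: v for k, v in row.items()
                        if k != self.forbidden_column})
            self.released += 1
        return out
```

    Because the monitor inspects results rather than queries alone, it can enforce policies that depend on what the data actually contains, which is the property the record highlights.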

  15. Trade Secrets in the Legal Studies Curriculum--A Case Study

    Science.gov (United States)

    Evans, Michelle

    2012-01-01

    Trade secrets can be a valuable company asset because of their potential to last forever. Unfortunately, along with such a significant benefit, there is also a significant risk--the risk that the trade secret can be lost in an instant if it is not sufficiently protected. Companies must be vigilant in protecting these secrets. However, the law is…

  16. Regulating genetic privacy in the online health information era.

    Science.gov (United States)

    Magnusson, Roger S

    As the clinical implications of the genetic components of disease come to be better understood, there is likely to be a significant increase in the volume of genetic information held within clinical records. As patient health care records, in turn, come on-line as part of broader health information networks, there is likely to be considerable pressure in favour of special laws protecting genetic privacy. This paper reviews some of the privacy challenges posed by electronic health records, some government initiatives in this area, and notes the impact that developments in genetic testing will have upon the 'genetic content' of e-health records. Despite the sensitivity of genetic information, the paper argues against a policy of 'genetic exceptionalism', and its implications for genetic privacy laws.

  17. 76 FR 58007 - Privacy Act of 1974; Report of a New System of Records

    Science.gov (United States)

    2011-09-19

    ..., Division of Information Security & Privacy Management, Enterprise Architecture and Strategy Group, Office... system contain Protected Health Information (PHI) as defined by HHS regulation ``Standards for Privacy of... such PHI that are otherwise authorized by these routine uses may only be made if, and as, permitted or...

  18. (a,k)-Anonymous Scheme for Privacy-Preserving Data Collection in IoT-based Healthcare Services Systems.

    Science.gov (United States)

    Li, Hongtao; Guo, Feng; Zhang, Wenyin; Wang, Jie; Xing, Jinsheng

    2018-02-14

    The wide use of IoT technologies in healthcare services has advanced the medical intelligence of those services. However, it also brings potential privacy threats to data collection. In healthcare service systems, health and medical data that contain private information are often transmitted across networks, and such private information should be protected. Therefore, there is a need for a privacy-preserving data collection (PPDC) scheme to protect clients' (patients') data. We adopt the (a,k)-anonymity model as the privacy protection scheme for data collection and propose a novel anonymity-based PPDC method for healthcare services in this paper. The threat model is analyzed in the client-server-to-user (CS2U) model. On the client side, we utilize the (a,k)-anonymity notion to generate anonymous tuples that can resist possible attacks, and adopt a bottom-up clustering method to create clusters that satisfy a base privacy level of (a1,k1)-anonymity. On the server side, we reduce the communication cost through generalization technology, and compress (a1,k1)-anonymous data through a UPGMA-based cluster combination method to make the data meet the deeper privacy level of (a2,k2)-anonymity (a1 ≥ a2, k2 ≥ k1). Theoretical analysis and experimental results prove that our scheme is effective in privacy preservation and data quality.
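
    On the usual reading of (a,k)-anonymity (parameter conventions vary across the literature, so this is a hedged interpretation, not necessarily the paper's exact definition), every quasi-identifier equivalence class must contain at least k records, and no single sensitive value may exceed a fraction a of any class. A minimal checker:

```python
from collections import Counter

def satisfies_ak_anonymity(classes, a, k):
    """classes: one list of sensitive values per quasi-identifier
    equivalence class. Returns True iff (a,k)-anonymity holds."""
    for sensitive_values in classes:
        if len(sensitive_values) < k:
            return False  # class too small: k-anonymity violated
        counts = Counter(sensitive_values)
        if max(counts.values()) / len(sensitive_values) > a:
            return False  # one value dominates: a-threshold violated
    return True
```

    The paper's a1 ≥ a2, k2 ≥ k1 ordering then just says the server-side compression step must pass this check with stricter parameters than the client-side clustering step.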

  19. 40 CFR 370.64 - What information can I claim as trade secret or confidential?

    Science.gov (United States)

    2010-07-01

    ... secret or confidential? 370.64 Section 370.64 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... secret or confidential? (a) Trade secrets. You may be able to withhold the name of a specific chemical... trade secret. The requirements for withholding trade secret information are set forth in EPCRA section...

  20. SecureMA: protecting participant privacy in genetic association meta-analysis.

    Science.gov (United States)

    Xie, Wei; Kantarcioglu, Murat; Bush, William S; Crawford, Dana; Denny, Joshua C; Heatherly, Raymond; Malin, Bradley A

    2014-12-01

    Sharing genomic data is crucial to support scientific investigation such as genome-wide association studies. However, recent investigations suggest the privacy of the individual participants in these studies can be compromised, leading to serious concerns and consequences, such as overly restricted access to data. We introduce a novel cryptographic strategy to securely perform meta-analysis for genetic association studies in large consortia. Our methodology is useful for supporting joint studies among disparate data sites, where privacy or confidentiality is of concern. We validate our method using three multisite association studies. Our research shows that genetic associations can be analyzed efficiently and accurately across substudy sites, without leaking information on individual participants and site-level association summaries. Our software for secure meta-analysis of genetic association studies, SecureMA, is publicly available at http://github.com/XieConnect/SecureMA. Our customized secure computation framework is also publicly available at http://github.com/XieConnect/CircuitService. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
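
    The computation SecureMA protects cryptographically is ordinary meta-analysis of site-level association summaries; shown in the clear below as a standard inverse-variance fixed-effect pool (a textbook formula, not SecureMA's secure protocol).

```python
import math

def fixed_effect_meta(effects, std_errors):
    """Inverse-variance-weighted pooled effect estimate and its
    standard error across study sites."""
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    return pooled, math.sqrt(1.0 / sum(weights))
```

    A secure version evaluates exactly this arithmetic inside an encrypted computation, so no site-level summary is revealed in the clear.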

  1. Privacy preserving interactive record linkage (PPIRL).

    Science.gov (United States)

    Kum, Hye-Chung; Krishnamurthy, Ashok; Machanavajjhala, Ashwin; Reiter, Michael K; Ahalt, Stanley

    2014-01-01

Record linkage to integrate uncoordinated databases is critical in biomedical research using Big Data. Balancing privacy protection against the need for high-quality record linkage requires a human-machine hybrid system to safely manage uncertainty in the ever-changing streams of chaotic Big Data. In the computer science literature, private record linkage is the most heavily published area; it investigates how to safely apply a known linkage function when linking two tables. In practice, however, the linkage function is rarely known. Thus, there are many data linkage centers whose main role is to act as the trusted third party that determines the linkage function manually and links data for research via a master population list for a designated region. Recently, a more flexible computerized third-party linkage platform, Secure Decoupled Linkage (SDLink), has been proposed, based on: (1) decoupling data via encryption; (2) obfuscation via chaffing (adding fake data) and universe manipulation; and (3) minimum information disclosure via recoding. We synthesize this literature to formalize a new framework for privacy preserving interactive record linkage (PPIRL) with tractable privacy and utility properties, and then analyze the literature using this framework. Human-based third-party linkage centers for privacy-preserving record linkage are the accepted norm internationally. We find that a computer-based third-party platform that can precisely control the information disclosed at the micro level and allows frequent human interaction during the linkage process is an effective human-machine hybrid system that significantly improves on the linkage center model in terms of both privacy and utility.
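Two of the ingredients named in the abstract, decoupling via a one-way transform and obfuscation via chaffing, can be sketched in a few lines. This is a toy illustration under assumed details (salted hashing stands in for encryption; the `real` flag would of course never be shared with the linkage party), not the SDLink design itself.

```python
import hashlib
import random

def pseudonym(name, salt):
    """Decouple identity from data via a salted one-way hash (stand-in for encryption)."""
    return hashlib.sha256((salt + name).encode()).hexdigest()[:12]

def chaff_table(real_names, n_chaff, salt="site-secret"):
    """Mix real pseudonymized records with fake ones so the linkage party
    cannot tell which pseudonyms correspond to real people (chaffing)."""
    rows = [{"pid": pseudonym(n, salt), "real": True} for n in real_names]
    rows += [{"pid": pseudonym(f"chaff-{i}-{random.random()}", salt), "real": False}
             for i in range(n_chaff)]
    random.shuffle(rows)
    return rows

table = chaff_table(["alice", "bob"], n_chaff=3)
assert len(table) == 5                         # 2 real + 3 chaff, indistinguishable
assert sum(r["real"] for r in table) == 2      # only the owner keeps this flag
```

Because the same salt yields the same pseudonym, two data owners sharing a salt can still link on `pid` without ever exchanging names.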

  2. An Efficient and Privacy-Preserving Multiuser Cloud-Based LBS Query Scheme

    Directory of Open Access Journals (Sweden)

    Lu Ou

    2018-01-01

Full Text Available Location-based services (LBSs) are increasingly popular in today’s society. People reveal their location information to LBS providers to obtain personalized services such as map directions, restaurant recommendations, and taxi reservations. Usually, LBS providers offer a privacy protection statement to assure users that their private location information will not be given away. However, many LBSs run on third-party cloud infrastructures, and it is challenging to guarantee user location privacy against curious cloud operators while still permitting users to query their own location data. In this paper, we propose an efficient privacy-preserving cloud-based LBS query scheme for the multiuser setting. We encrypt LBS data and LBS queries with a hybrid encryption mechanism, which efficiently implements privacy-preserving search over encrypted LBS data and is well suited to the multiuser setting, with secure and effective user enrollment and user revocation. The paper includes a security analysis and performance experiments that demonstrate the privacy-preserving properties and efficiency of the proposed scheme.
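The notion of "privacy-preserving search over encrypted data" can be sketched with a simple searchable-index construction: the cloud stores keyed-hash tokens rather than plaintext keywords, and a user queries by submitting a trapdoor token. This is a generic textbook sketch, not the paper's hybrid scheme; the key handling and names here are assumptions for illustration.

```python
import hashlib
import hmac

KEY = b"demo-user-key"  # in practice, per-user keys come from the enrollment step

def trapdoor(keyword):
    """Deterministic search token; the cloud never sees the plaintext keyword."""
    return hmac.new(KEY, keyword.encode(), hashlib.sha256).hexdigest()

def build_index(records):
    """Cloud-side index: token -> record ids (record payloads stay encrypted)."""
    index = {}
    for rid, keywords in records.items():
        for kw in keywords:
            index.setdefault(trapdoor(kw), []).append(rid)
    return index

records = {"r1": ["cafe", "downtown"], "r2": ["taxi", "downtown"]}
index = build_index(records)
assert sorted(index[trapdoor("downtown")]) == ["r1", "r2"]  # search works
assert index.get(trapdoor("museum")) is None                # absent keyword, no leak
```

A curious cloud operator sees only opaque tokens and encrypted payloads; revoking a user amounts to rotating the key material from which trapdoors are derived.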

  3. Display methods of electronic patient record screens: patient privacy concerns.

    Science.gov (United States)

    Niimi, Yukari; Ota, Katsumasa

    2013-01-01

To provide adequate care, medical professionals have to collect not only medical information but also information that may be related to private aspects of the patient's life. With patients' increasing awareness of information privacy, healthcare providers have to pay attention to the patients' right of privacy. This study aimed to clarify the requirements of the display method of electronic patient record (EPR) screens in consideration of both patients' information privacy concerns and health professionals' information needs. For this purpose, semi-structured group interviews were conducted with 78 medical professionals. They pointed out that partial concealment of information to meet patients' requests for privacy could result in challenges in (1) safety in healthcare, (2) information sharing, (3) collaboration, (4) hospital management, and (5) communication. They believed that EPRs should (1) meet the requirements of the therapeutic process, (2) have restricted access, (3) provide convenient access to necessary information, and (4) facilitate interprofessional collaboration. This study provides direction for the development of display methods that balance the sharing of vital information and protection of patient privacy.

  4. 76 FR 64115 - Privacy Act of 1974; Privacy Act System of Records

    Science.gov (United States)

    2011-10-17

    ... NATIONAL AERONAUTICS AND SPACE ADMINISTRATION [Notice (11-092)] Privacy Act of 1974; Privacy Act... retirement of one Privacy Act system of records notice. SUMMARY: In accordance with the Privacy Act of 1974, NASA is giving notice that it proposes to cancel the following Privacy Act system of records notice...

  5. UnLynx: A Decentralized System for Privacy-Conscious Data Sharing

    Directory of Open Access Journals (Sweden)

    Froelicher David

    2017-10-01

Full Text Available Current solutions for privacy-preserving data sharing among multiple parties either depend on a centralized authority that must be trusted and provides only weakest-link security (e.g., the entity that manages private/secret cryptographic keys), or leverage decentralized but impractical approaches (e.g., secure multi-party computation). When the data to be shared are of a sensitive nature and the number of data providers is high, these solutions are not appropriate. Therefore, we present UnLynx, a new decentralized system for efficient privacy-preserving data sharing. We consider m servers that constitute a collective authority whose goal is to verifiably compute on data sent from n data providers. UnLynx guarantees the confidentiality of the data, unlinkability between data providers and their data, privacy of the end result, and the correctness of computations by the servers. Furthermore, to support differentially private queries, UnLynx can collectively add noise under encryption. All of this is achieved through a combination of new distributed and secure protocols based on homomorphic cryptography, verifiable shuffling and zero-knowledge proofs. UnLynx is highly parallelizable and modular by design, as it enables multiple security/privacy vs. runtime tradeoffs. Our evaluation shows that UnLynx can execute a secure survey on 400,000 personal data records containing 5 encrypted attributes, distributed over 20 independent databases, for a total of 2,000,000 ciphertexts, in 24 minutes.

  6. New Technology "Clouds" Student Data Privacy

    Science.gov (United States)

    Krueger, Keith R.; Moore, Bob

    2015-01-01

    As technology has leaped forward to provide valuable learning tools, parents and policy makers have begun raising concerns about the privacy of student data that schools and systems have. Federal laws are intended to protect students and their families but they have not and will never be able to keep up with rapidly evolving technology. School…

  7. A Neural-Network Clustering-Based Algorithm for Privacy Preserving Data Mining

    Science.gov (United States)

    Tsiafoulis, S.; Zorkadis, V. C.; Karras, D. A.

The increasing use of fast and efficient data mining algorithms on huge collections of personal data, facilitated by the exponential growth of technology, in particular in the fields of electronic data storage media and processing power, has raised serious ethical, philosophical and legal issues related to privacy protection. To cope with these concerns, several privacy preserving methodologies have been proposed, classified into two categories: methodologies that aim at protecting the sensitive data, and those that aim at protecting the mining results. In our work, we focus on sensitive data protection and compare existing techniques according to the anonymity degree achieved, the information loss suffered, and their performance characteristics. The ℓ-diversity principle is combined with k-anonymity concepts, so that background information cannot be exploited to successfully attack the privacy of the data subjects to whom the data refer. Based on Kohonen Self Organizing Feature Maps (SOMs), we first organize data sets in subspaces according to their information-theoretical distance to each other, then create the most relevant classes, paying special attention to rare sensitive attribute values, and finally generalize attribute values to the minimum extent required, so that both the data disclosure probability and the information loss are kept negligible. Furthermore, we propose information-theoretical measures for assessing the anonymity degree achieved, and empirical tests to demonstrate it.
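The k-anonymity property and the generalization step it motivates are easy to state concretely: every combination of quasi-identifier values must occur at least k times, and attributes are coarsened until that holds. The sketch below uses a hypothetical age/zip table and a simple age-banding generalizer; it illustrates the concept only, not the SOM-based method of the paper.

```python
from collections import Counter

def is_k_anonymous(records, quasi_ids, k):
    """Every quasi-identifier combination must appear at least k times."""
    groups = Counter(tuple(r[q] for q in quasi_ids) for r in records)
    return all(count >= k for count in groups.values())

def generalize_age(records, width=10):
    """Coarsen exact ages into bands, the simplest generalization step."""
    out = []
    for r in records:
        lo = (r["age"] // width) * width
        out.append({**r, "age": f"{lo}-{lo + width - 1}"})
    return out

data = [{"age": 23, "zip": "301"}, {"age": 27, "zip": "301"},
        {"age": 24, "zip": "301"}, {"age": 29, "zip": "301"}]
assert not is_k_anonymous(data, ["age", "zip"], k=2)              # raw ages are unique
assert is_k_anonymous(generalize_age(data), ["age", "zip"], k=2)  # bands of 10 suffice
```

Generalizing just enough, as the abstract argues, is the balance point: wider bands raise anonymity but also the information loss.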

  8. 75 FR 5166 - Privacy Act of 1974, as Amended; Computer Matching Program (Social Security Administration...

    Science.gov (United States)

    2010-02-01

    ... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2009-0043] Privacy Act of 1974, as Amended; Computer Matching Program (Social Security Administration/Railroad Retirement Board (SSA/RRB))-- Match... INFORMATION: A. General The Computer Matching and Privacy Protection Act of 1988 Public Law (Pub. L.) 100-503...

  9. Privacy og selvbeskrivelse

    DEFF Research Database (Denmark)

    Rosengaard, Hans Ulrik

    2015-01-01

A description of the field of privacy research, with particular attention to the significance of privacy for the ability to control one's own self-description.

  10. Concurrently Deniable Group Key Agreement and Its Application to Privacy-Preserving VANETs

    Directory of Open Access Journals (Sweden)

    Shengke Zeng

    2018-01-01

Full Text Available VANETs need secure communication. Authentication in VANETs resists attacks based on the receipt of false information. Authenticated group key agreement (GKA) is used to establish a confidential and authenticated communication channel for multiple vehicles. However, authentication by digital signature incurs privacy leakage; deniability is therefore desirable for GKA (termed DGKA) as a privacy protection. In a DGKA protocol, each participant interacts with its intended partners to establish a common group session key. After the agreement session, each participant can be regarded as the intended sender, yet can also deny ever having participated in the session. Under the established key, vehicles can thus send confidential messages with the authentication property, while deniability protects the vehicles' privacy. We present a novel transformation from an unauthenticated group key agreement to a deniable (authenticated) group key agreement without increasing the number of communication rounds. Full deniability is achieved even in the concurrent setting, which suits the Internet environment. In addition, we design an authenticated and privacy-preserving communication protocol for VANETs using the proposed deniable group key agreement.

  11. The privacy coach: Supporting customer privacy in the internet of things

    NARCIS (Netherlands)

    Broenink, E.G.; Hoepman, J.H.; Hof, C. van 't; Kranenburg, R. van; Smits, D.; Wisman, T.

    2010-01-01

    The Privacy Coach is an application running on a mobile phone that supports customers in making privacy decisions when confronted with RFID tags. The approach we take to increase customer privacy is a radical departure from the mainstream research efforts that focus on implementing privacy enhancing

  12. Semantic Security: Privacy Definitions Revisited

    OpenAIRE

    Jinfei Liu; Li Xiong; Jun Luo

    2013-01-01

In this paper we present a privacy framework named Indistinguishable Privacy. Indistinguishable privacy can be viewed as a formalization of the existing privacy definitions in privacy preserving data publishing as well as secure multi-party computation. We introduce three representative privacy notions from the literature: Bayes-optimal privacy for privacy preserving data publishing, differential privacy for statistical data release, and privacy w.r.t. semi-honest behavior in the secure...

  13. Privacy protectionism and health information: is there any redress for harms to health?

    Science.gov (United States)

    Allen, Judy; Holman, C D'arcy J; Meslin, Eric M; Stanley, Fiona

    2013-12-01

    Health information collected by governments can be a valuable resource for researchers seeking to improve diagnostics, treatments and public health outcomes. Responsible use requires close attention to privacy concerns and to the ethical acceptability of using personal health information without explicit consent. Less well appreciated are the legal and ethical issues that are implicated when privacy protection is extended to the point where the potential benefits to the public from research are lost. Balancing these issues is a delicate matter for data custodians. This article examines the legal, ethical and structural context in which data custodians make decisions about the release of data for research. It considers the impact of those decisions on individuals. While there is strong protection against risks to privacy and multiple avenues of redress, there is no redress where harms result from a failure to release data for research.

  14. Federal Privacy Laws That Apply to Children and Education. Safeguarding Data

    Science.gov (United States)

    Data Quality Campaign, 2014

    2014-01-01

    This table identifies and briefly describes the following federal policies that safeguard and protect the confidentiality of personal information: (1) Family Educational Rights and Privacy Act (FERPA); (2) Protection of Pupil Rights Amendment (PPRA); (3) Health Insurance Portability and Accountability Act (HIPAA); (4) Children's Online Privacy…

  15. Tales from the dark side: Privacy dark strategies and privacy dark patterns

    DEFF Research Database (Denmark)

    Bösch, Christoph; Erb, Benjamin; Kargl, Frank

    2016-01-01

    Privacy strategies and privacy patterns are fundamental concepts of the privacy-by-design engineering approach. While they support a privacy-aware development process for IT systems, the concepts used by malicious, privacy-threatening parties are generally less understood and known. We argue...... that understanding the “dark side”, namely how personal data is abused, is of equal importance. In this paper, we introduce the concept of privacy dark strategies and privacy dark patterns and present a framework that collects, documents, and analyzes such malicious concepts. In addition, we investigate from...... a psychological perspective why privacy dark strategies are effective. The resulting framework allows for a better understanding of these dark concepts, fosters awareness, and supports the development of countermeasures. We aim to contribute to an easier detection and successive removal of such approaches from...

  16. A PhD abstract presentation on Personal Information Privacy System based on Proactive Design

    DEFF Research Database (Denmark)

    Dhotre, Prashant Shantaram; Olesen, Henning

    2014-01-01

providers and websites collect and make extensive use of personal information. Using various Big Data methods and techniques, knowledge and patterns are generated or extracted from the data, which can lead to serious privacy breaches. Hence, there is a need to embed privacy...... in the design phase, which will be the basic principle on which data security can be provided and privacy protected. This will give users more control and power over their personal information....

  17. An Efficient Association Rule Hiding Algorithm for Privacy Preserving Data Mining

    OpenAIRE

    Yogendra Kumar Jain,; Vinod Kumar Yadav,; Geetika S. Panday

    2011-01-01

The security of large databases that contain crucial information becomes a serious issue when data are shared over a network and must be guarded against unauthorized access. Privacy preserving data mining is a new research trend in privacy protection for data mining and statistical databases. Association analysis is a powerful tool for discovering relationships hidden in large databases. Association rule hiding algorithms achieve strong and efficient performance for protecting confidential and cru...

  18. A Knowledge Model Sharing Based Approach to Privacy-Preserving Data Mining

    OpenAIRE

    Hongwei Tian; Weining Zhang; Shouhuai Xu; Patrick Sharkey

    2012-01-01

Privacy-preserving data mining (PPDM) is an important problem and is currently studied via three approaches: the cryptographic approach, data publishing, and model publishing. However, each of these approaches has some problems. The cryptographic approach does not protect the privacy of learned knowledge models and may have performance and scalability issues. Data publishing, although popular, may suffer from too much utility loss for certain types of data mining applications. The m...

  19. Mum's the Word: Feds Are Serious About Protecting Patients' Privacy.

    Science.gov (United States)

    Conde, Crystal

    2010-08-01

    The Health Information Technology for Economic and Clinical Health (HITECH) Act significantly changes HIPAA privacy and security policies that affect physicians. Chief among the changes are the new breach notification regulations, developed by the U.S. Department of Health and Human Services Office for Civil Rights. The Texas Medical Association has developed resources to help physicians comply with the new HIPAA regulations.

  20. The social life of genes: privacy, property and the new genetics.

    Science.gov (United States)

    Everett, Margaret

    2003-01-01

    With the advent of the Human Genome Project and widespread fears over human cloning and medical privacy, a number of states have moved to protect genetic privacy. Oregon's unique Genetic Privacy Act of 1995, which declared that an individual had property rights to their DNA, has provoked national and international interest and controversy. This paper critically reviews the literature on genetic privacy and gene patenting from law, philosophy, science and anthropology. The debate in Oregon, from 1995 to 2001, illustrates many of the key issues in this emerging area. Both sides of the debate invoke the property metaphor, reinforcing deterministic assumptions and avoiding more fundamental questions about the integrity of the body and self-identity. The anthropological critique of the commodification of the body, and the concept of 'embodiment' are useful in analyzing the debate over DNA as property.

  1. Public Auditing with Privacy Protection in a Multi-User Model of Cloud-Assisted Body Sensor Networks

    Science.gov (United States)

    Li, Song; Cui, Jie; Zhong, Hong; Liu, Lu

    2017-01-01

    Wireless Body Sensor Networks (WBSNs) are gaining importance in the era of the Internet of Things (IoT). The modern medical system is a particular area where the WBSN techniques are being increasingly adopted for various fundamental operations. Despite such increasing deployments of WBSNs, issues such as the infancy in the size, capabilities and limited data processing capacities of the sensor devices restrain their adoption in resource-demanding applications. Though providing computing and storage supplements from cloud servers can potentially enrich the capabilities of the WBSNs devices, data security is one of the prevailing issues that affects the reliability of cloud-assisted services. Sensitive applications such as modern medical systems demand assurance of the privacy of the users’ medical records stored in distant cloud servers. Since it is economically impossible to set up private cloud servers for every client, auditing data security managed in the remote servers has necessarily become an integral requirement of WBSNs’ applications relying on public cloud servers. To this end, this paper proposes a novel certificateless public auditing scheme with integrated privacy protection. The multi-user model in our scheme supports groups of users to store and share data, thus exhibiting the potential for WBSNs’ deployments within community environments. Furthermore, our scheme enriches user experiences by offering public verifiability, forward security mechanisms and revocation of illegal group members. Experimental evaluations demonstrate the security effectiveness of our proposed scheme under the Random Oracle Model (ROM) by outperforming existing cloud-assisted WBSN models. PMID:28475110

  2. Public Auditing with Privacy Protection in a Multi-User Model of Cloud-Assisted Body Sensor Networks.

    Science.gov (United States)

    Li, Song; Cui, Jie; Zhong, Hong; Liu, Lu

    2017-05-05

    Wireless Body Sensor Networks (WBSNs) are gaining importance in the era of the Internet of Things (IoT). The modern medical system is a particular area where the WBSN techniques are being increasingly adopted for various fundamental operations. Despite such increasing deployments of WBSNs, issues such as the infancy in the size, capabilities and limited data processing capacities of the sensor devices restrain their adoption in resource-demanding applications. Though providing computing and storage supplements from cloud servers can potentially enrich the capabilities of the WBSNs devices, data security is one of the prevailing issues that affects the reliability of cloud-assisted services. Sensitive applications such as modern medical systems demand assurance of the privacy of the users' medical records stored in distant cloud servers. Since it is economically impossible to set up private cloud servers for every client, auditing data security managed in the remote servers has necessarily become an integral requirement of WBSNs' applications relying on public cloud servers. To this end, this paper proposes a novel certificateless public auditing scheme with integrated privacy protection. The multi-user model in our scheme supports groups of users to store and share data, thus exhibiting the potential for WBSNs' deployments within community environments. Furthermore, our scheme enriches user experiences by offering public verifiability, forward security mechanisms and revocation of illegal group members. Experimental evaluations demonstrate the security effectiveness of our proposed scheme under the Random Oracle Model (ROM) by outperforming existing cloud-assisted WBSN models.

  3. 76 FR 34177 - Privacy Act of 1974: Implementation of Exemptions; U.S. Citizenship and Immigration Services...

    Science.gov (United States)

    2011-06-13

    ...] Privacy Act of 1974: Implementation of Exemptions; U.S. Citizenship and Immigration Services, Immigration... Privacy Act of 1974 for the Department of Homeland Security United States Citizenship and Immigration Services, Immigration and Customs Enforcement, and Customs and Border Protection--001 Alien File, Index...

  4. 77 FR 43639 - Privacy Act of 1974, as Amended; Computer Matching Program (Social Security Administration (SSA...

    Science.gov (United States)

    2012-07-25

    ... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2011-0090] Privacy Act of 1974, as Amended; Computer Matching Program (Social Security Administration (SSA)/Department of Veterans Affairs (VA.... SUPPLEMENTARY INFORMATION: A. General The Computer Matching and Privacy Protection Act of 1988 (Pub. L. 100-503...

  5. 77 FR 54943 - Privacy Act of 1974, as Amended; Computer Matching Program (Social Security Administration (SSA...

    Science.gov (United States)

    2012-09-06

    ... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2012-0016] Privacy Act of 1974, as Amended; Computer Matching Program (Social Security Administration (SSA)/Department of Veterans Affairs (VA.... SUPPLEMENTARY INFORMATION: A. General The Computer Matching and Privacy Protection Act of 1988 (Pub. L. 100-503...

  6. Big Brother’s Little Helpers: The Right to Privacy and the Responsibility of Internet Service Providers

    Directory of Open Access Journals (Sweden)

    Yael Ronen

    2015-02-01

    Full Text Available Following the 2013 revelations on the extent of intelligence gathering through internet service providers, this article concerns the responsibility of internet service providers (ISPs involved in disclosure of personal data to government authorities under the right to privacy, by reference to the developing, non-binding standards applied to businesses under the Protect, Respect and Remedy Framework. The article examines the manner in which the Framework applies to ISPs and looks at measures that ISPs can take to fulfil their responsibility to respect the right to privacy. It utilizes the challenges to the right to privacy to discuss some aspects of the extension of human rights responsibilities to corporations. These include the respective roles of government and non-state actors, the extent to which corporations may be required to act proactively in order to protect the privacy of clients, and the relevance of transnational activity.

  7. Privacy and Data Protection in the Age of Pervasive Technologies in AI and Robotics (European Data Protection Law Review, Volume 3, Issue 3; DOI: https://doi.org/10.21552/edpl/2017/3/8)

    NARCIS (Netherlands)

    van den Hoven van Genderen, R.

    2017-01-01

    Robots have been a part of the popular imagination since antiquity. And yet the idea of a robot — a being that exists somehow in the twilight between machine and person — continues to fascinate. Privacy, data protection and physical integrity will be structurally influenced by the pervasive

  8. Privacy-Preserving Meter Report Protocol of Isolated Smart Grid Devices

    Directory of Open Access Journals (Sweden)

    Zhiwei Wang

    2017-01-01

Full Text Available Smart grid aims to improve the reliability, efficiency, and security of the traditional grid, allowing two-way transmission and efficiency-driven response. However, a main concern with this new technique is that the fine-grained metering data may leak the customers' personal privacy information. Thus, a data aggregation mechanism for privacy protection is required in the meter report protocol of the smart grid. In this paper, we propose an efficient privacy-preserving meter report protocol for isolated smart grid devices. Our protocol consists of an encryption scheme with the additively homomorphic property and a linearly homomorphic signature scheme, where the linearly homomorphic signature scheme is suitable for privacy-preserving data aggregation. We also provide a security analysis of our protocol in the context of some typical attacks on the smart grid. The implementation of our protocol on the Intel Edison platform shows that our protocol is efficient enough for physically constrained devices, like smart meters.
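The additively homomorphic property the abstract relies on means that ciphertexts can be combined so the result decrypts to the sum of the plaintexts, letting a utility learn the aggregate consumption without seeing individual readings. Below is a textbook Paillier sketch with toy parameters (the paper's concrete scheme and its signature component are not reproduced here; the demo primes are far too small for real use).

```python
import math
import random

# Toy Paillier cryptosystem: E(a) * E(b) mod n^2 decrypts to a + b.
p, q = 293, 433            # demo primes only; real deployments need >=2048-bit keys
n, n2 = p * q, (p * q) ** 2
lam = math.lcm(p - 1, q - 1)
g = n + 1
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)   # modular inverse (Python 3.8+)

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return ((pow(c, lam, n2) - 1) // n * mu) % n

readings = [5, 12, 8]      # fine-grained meter readings stay encrypted in transit
aggregate = 1
for m in readings:
    aggregate = (aggregate * encrypt(m)) % n2  # homomorphic addition
assert decrypt(aggregate) == sum(readings)     # the utility learns only the total
```

Multiplying ciphertexts is the only operation the aggregator performs, so it never needs, and never obtains, the decryption key.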

  9. Public Opinion about the Importance of Privacy in Biobank Research

    Science.gov (United States)

    Kaufman, David J.; Murphy-Bollinger, Juli; Scott, Joan; Hudson, Kathy L.

    2009-01-01

    Concerns about privacy may deter people from participating in genetic research. Recruitment and retention of biobank participants requires understanding the nature and magnitude of these concerns. Potential participants in a proposed biobank were asked about their willingness to participate, their privacy concerns, informed consent, and data sharing. A representative survey of 4659 U.S. adults was conducted. Ninety percent of respondents would be concerned about privacy, 56% would be concerned about researchers having their information, and 37% would worry that study data could be used against them. However, 60% would participate in the biobank if asked. Nearly half (48%) would prefer to provide consent once for all research approved by an oversight panel, whereas 42% would prefer to provide consent for each project separately. Although 92% would allow academic researchers to use study data, 80% and 75%, respectively, would grant access to government and industry researchers. Concern about privacy was related to lower willingness to participate only when respondents were told that they would receive $50 for participation and would not receive individual research results back. Among respondents who were told that they would receive $200 or individual research results, privacy concerns were not related to willingness. Survey respondents valued both privacy and participation in biomedical research. Despite pervasive privacy concerns, 60% would participate in a biobank. Assuring research participants that their privacy will be protected to the best of researchers' abilities may increase participants' acceptance of consent for broad research uses of biobank data by a wide range of researchers. PMID:19878915

  10. Privacy, the individual and genetic information: a Buddhist perspective.

    Science.gov (United States)

    Hongladarom, Soraj

    2009-09-01

    Bioinformatics is a new field of study whose ethical implications involve a combination of bioethics, computer ethics and information ethics. This paper is an attempt to view some of these implications from the perspective of Buddhism. Privacy is a central concern in both computer/information ethics and bioethics, and with information technology being increasingly utilized to process biological and genetic data, the issue has become even more pronounced. Traditionally, privacy presupposes the individual self but as Buddhism does away with the ultimate conception of an individual self, it has to find a way to analyse and justify privacy that does not presuppose such a self. It does this through a pragmatic conception that does not depend on a positing of the substantial self, which is then found to be unnecessary for an effective protection of privacy. As it may be possible one day to link genetic data to individuals, the Buddhist conception perhaps offers a more flexible approach, as what is considered to be integral to an individual person is not fixed in objectivity but depends on convention.

  11. Simulating cloud environment for HIS backup using secret sharing.

    Science.gov (United States)

    Kuroda, Tomohiro; Kimura, Eizen; Matsumura, Yasushi; Yamashita, Yoshinori; Hiramatsu, Haruhiko; Kume, Naoto

    2013-01-01

In the face of a disaster, hospitals are expected to continue providing efficient, high-quality care to patients. It is therefore crucial for hospitals to develop business continuity plans (BCPs) that identify their vulnerabilities and prepare procedures to overcome them. A key aspect of most hospitals' BCPs is backing up hospital information system (HIS) data at multiple remote sites. However, the need to keep the data confidential dramatically increases the cost of making such backups. Secret sharing is a method of splitting an original secret message so that the individual pieces are meaningless, but putting a sufficient number of pieces together reveals the original message. It allows the creation of pseudo-redundant arrays of independent disks for privacy-sensitive data over the Internet. We developed a secret sharing environment for StarBED, a large-scale network experiment environment, and evaluated its potential and performance during disaster recovery. Simulation results showed that the entire main HIS database of Kyoto University Hospital could be retrieved within three days even if one of the distributed storage systems crashed during a disaster.
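The splitting-and-reconstruction behavior described here is exactly what Shamir's (k, n) threshold scheme provides: any k shares recover the secret, while fewer reveal nothing. This is a minimal generic implementation for illustration; the paper's actual backup environment and parameters are not specified in the abstract, so the example values below are assumptions.

```python
import random

PRIME = 2**127 - 1  # Mersenne prime, larger than any secret we split

def make_shares(secret, k, n):
    """Split `secret` into n shares; any k of them reconstruct it (Shamir)."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

record_key = 123456789  # e.g. a key protecting one HIS backup chunk (hypothetical)
shares = make_shares(record_key, k=3, n=5)
assert reconstruct(shares[:3]) == record_key   # any 3 of the 5 shares suffice
assert reconstruct(shares[1:4]) == record_key
```

Distributing the five shares to five remote sites gives the fault tolerance the simulation measures: losing one or two sites still leaves enough shares to rebuild the database keys.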

  12. 75 FR 81205 - Privacy Act: Revision of Privacy Act Systems of Records

    Science.gov (United States)

    2010-12-27

    ... DEPARTMENT OF AGRICULTURE Office of the Secretary Privacy Act: Revision of Privacy Act Systems of Records AGENCY: Office of the Secretary, USDA. ACTION: Notice to Revise Privacy Act Systems of Records... two Privacy Act Systems of Records entitled ``Information on Persons Disqualified from the...

  13. A case study of the Secure Anonymous Information Linkage (SAIL) Gateway: A privacy-protecting remote access system for health-related research and evaluation

    Science.gov (United States)

    Jones, Kerina H.; Ford, David V.; Jones, Chris; Dsilva, Rohan; Thompson, Simon; Brooks, Caroline J.; Heaven, Martin L.; Thayer, Daniel S.; McNerney, Cynthia L.; Lyons, Ronan A.

    2014-01-01

    With the current expansion of data linkage research, the challenge is to find the balance between preserving the privacy of person-level data whilst making these data accessible for use to their full potential. We describe a privacy-protecting safe haven and secure remote access system, referred to as the Secure Anonymised Information Linkage (SAIL) Gateway. The Gateway provides data users with a familiar Windows interface and their usual toolsets to access approved anonymously linked datasets for research and evaluation. We outline the principles and operating model of the Gateway, the features provided to users within the secure environment, and how we are approaching the challenges of making data safely accessible to increasing numbers of research users. The Gateway represents a powerful analytical environment and has been designed to be scalable and adaptable to meet the needs of the rapidly growing data linkage community. PMID:24440148

  14. Democracy and genetic privacy: the value of bodily integrity.

    Science.gov (United States)

    Beckman, Ludvig

    2005-01-01

    The right to genetic privacy is presently being incorporated into legal systems all over the world. It remains largely unclear, however, what interests and values this right serves to protect. Many different arguments are made in the literature, yet none takes into account the problem of how particular values can be justified given the plurality of moral and religious doctrines in our societies. In this article, theories of public reason are used to explore how genetic privacy could be justified in a way that is sensitive to the "fact of pluralism". The idea of public reason is specified as the idea that, in justifying rights, governments should appeal only to values and beliefs that are acceptable to all reasonable citizens. In examining prevalent arguments for genetic privacy, based on the value of autonomy or on the value of intimacy, it is concluded that they do not meet this requirement. In dealing with this deficiency in the literature, an argument is developed that genetic privacy is fundamental to the democratic participation of all citizens. By referring to the preconditions of democratic citizenship, genetic privacy can be justified in a way that respects the plurality of comprehensive doctrines of morality and religion in contemporary societies.

  15. Privacy and Technology: Counseling Institutions of Higher Education.

    Science.gov (United States)

    Cranman, Kevin A.

    1998-01-01

    Examines the challenges to colleges and universities associated with maintaining privacy as use of technology increases and technology advances. Lapses in security, types of information needing protection, liability under federal laws, other relevant laws and pending legislation, ethics, and policy implementation in the electronic age are…

  16. Security of Linear Secret-Sharing Schemes Against Mass Surveillance

    DEFF Research Database (Denmark)

    Giacomelli, Irene; Olimid, Ruxandra; Ranellucci, Samuel

    2015-01-01

    …by a proprietary code that the provider ("big brother") could manipulate to covertly violate the privacy of the users (by implementing Algorithm-Substitution Attacks, or ASAs). First, we formalize the security notion that expresses the goal of big brother and prove that for any linear secret-sharing scheme there exists an undetectable subversion of it that efficiently allows surveillance. Second, we formalize the security notion that assures that a sharing scheme is secure against ASAs and construct the first sharing scheme that meets this notion.
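
    For intuition about what makes a sharing scheme "linear": the simplest example is additive sharing over a prime field, where shares of two secrets combine pointwise into shares of their sum. The sketch below uses hypothetical names and a hypothetical modulus; it shows only the linearity property, not the subversion attack studied in the paper.

```python
import random

P = 2**31 - 1  # prime modulus for the share field

def share(secret, n):
    """Additive n-of-n sharing: shares are uniformly random, sum to secret mod P."""
    parts = [random.randrange(P) for _ in range(n - 1)]
    parts.append((secret - sum(parts)) % P)   # last share forces the sum
    return parts

def reconstruct(parts):
    return sum(parts) % P

a, b = 1000, 234
sa, sb = share(a, 4), share(b, 4)
assert reconstruct(sa) == a
# Linearity: adding shares pointwise yields a valid sharing of a + b,
# without any party learning a or b individually.
summed = [(x + y) % P for x, y in zip(sa, sb)]
assert reconstruct(summed) == (a + b) % P
```

    The attack surface discussed in the paper arises precisely because the share-generation code is randomized: subverted randomness can leak the secret while the shares still look well-formed.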

  17. 78 FR 40515 - Privacy Act of 1974; Privacy Act System of Records

    Science.gov (United States)

    2013-07-05

    ... NATIONAL AERONAUTICS AND SPACE ADMINISTRATION [Notice 13-071] Privacy Act of 1974; Privacy Act System of Records AGENCY: National Aeronautics and Space Administration (NASA). ACTION: Notice of Privacy Act system of records. SUMMARY: Each Federal agency is required by the Privacy Act of 1974 to publish...

  18. 78 FR 77503 - Privacy Act of 1974; Privacy Act System of Records

    Science.gov (United States)

    2013-12-23

    ... NATIONAL AERONAUTICS AND SPACE ADMINISTRATION [Notice 13-149] Privacy Act of 1974; Privacy Act... proposed revisions to existing Privacy Act systems of records. SUMMARY: Pursuant to the provisions of the Privacy Act of 1974 (5 U.S.C. 552a), the National Aeronautics and Space Administration is issuing public...

  19. The Genetic Privacy Act and commentary

    Energy Technology Data Exchange (ETDEWEB)

    Annas, G.J.; Glantz, L.H.; Roche, P.A.

    1995-02-28

    The Genetic Privacy Act is a proposal for federal legislation. The Act is based on the premise that genetic information is different from other types of personal information in ways that require special protection. The DNA molecule holds an extensive amount of currently indecipherable information. The major goal of the Human Genome Project is to decipher this code so that the information it contains is accessible. The privacy question is, accessible to whom? The highly personal nature of the information contained in DNA can be illustrated by thinking of DNA as containing an individual's "future diary." A diary is perhaps the most personal and private document a person can create. It contains a person's innermost thoughts and perceptions, and is usually hidden and locked to assure its secrecy. Diaries describe the past. The information in one's genetic code can be thought of as a coded probabilistic future diary because it describes an important part of a unique and personal future. This document presents an introduction to the proposal for federal legislation, the Genetic Privacy Act; a copy of the proposed act; and commentary.

  20. An Identity-Based Anti-Quantum Privacy-Preserving Blind Authentication in Wireless Sensor Networks.

    Science.gov (United States)

    Zhu, Hongfei; Tan, Yu-An; Zhu, Liehuang; Wang, Xianmin; Zhang, Quanxin; Li, Yuanzhang

    2018-05-22

    With the development of wireless sensor networks, IoT devices are crucial for the Smart City; these devices change people's lives through, for example, e-payment and e-voting systems. However, in these two systems the state-of-the-art authentication protocols, based on traditional number theory, cannot withstand a quantum computer attack. In order to protect user privacy and guarantee the trustworthiness of big data, we propose a new identity-based blind signature scheme based on the number theorem research unit (NTRU) lattice; the scheme uses rejection sampling instead of constructing a trapdoor. Meanwhile, the scheme does not depend on a complex public key infrastructure and can resist quantum computer attack. We then design an e-payment protocol using the proposed scheme. Furthermore, we prove our scheme is secure in the random oracle model, and satisfies confidentiality, integrity, and non-repudiation. Finally, we demonstrate that the proposed scheme outperforms existing identity-based blind signature schemes in signing and verification speed, and outperforms other lattice-based blind signatures in signing speed, verification speed, and signing secret key size.
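
    The blinding and unblinding steps of a blind signature can be sketched with Chaum's classic RSA construction. This is deliberately not the lattice-based scheme proposed in the paper, and it uses insecure toy parameters; it only illustrates how a signer can sign a message without ever seeing it.

```python
# Toy RSA key (far too small for real use).
p, q = 61, 53
n = p * q                           # modulus, 3233
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent

m = 42                              # message (in practice, a hash of it)
r = 7                               # user's blinding factor, gcd(r, n) == 1

blinded = (m * pow(r, e, n)) % n          # user blinds the message
blind_sig = pow(blinded, d, n)            # signer signs the blinded value
sig = (blind_sig * pow(r, -1, n)) % n     # user strips the blinding factor

assert pow(sig, e, n) == m                # valid RSA signature on m
```

    The signer only ever sees `blinded`, which is statistically independent of `m`; this is the unlinkability that e-payment and e-voting protocols rely on.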