WorldWideScience

Sample records for privacy protection technology

  1. A privacy protection model to support personal privacy in relational databases.

    OpenAIRE

    2008-01-01

    The individual of today incessantly insists on more protection of his/her personal privacy than a few years ago. During the last few years, rapid technological advances, especially in the field of information technology, directed most attention and energy to the privacy protection of the Internet user. Research was done and is still being done covering a vast area to protect the privacy of transactions performed on the Internet. However, it was established that almost no research has been don...

  2. Protecting patron privacy

    CERN Document Server

    Beckstrom, Matthew

    2015-01-01

    In a world where almost anyone with computer savvy can hack, track, and record the online activities of others, your library can serve as a protected haven for your visitors who rely on the Internet to conduct research, if you take the necessary steps to safeguard their privacy. This book shows you how to protect patrons' privacy while using the technology that your library provides, including public computers, Internet access, wireless networks, and other devices. Logically organized into two major sections, the first part of the book discusses why the privacy of your users is of paramount

  3. Privacy Protection Research of Mobile RFID

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Radio Frequency Identification (RFID) is one of the most controversial technologies at present. Because it is very difficult to detect who reads a tag incorporated into a product owned by a person, significant privacy threats arise in RFID systems. User privacy is a primary consideration for mobile RFID services, because most such services are end-user oriented. We propose a solution for user privacy protection, based on a modification of the EPC Class 1 Generation 2 protocol, and introduce a privacy protection scenario for mobile RFID services that uses this method.
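
    The record above does not spell out the modified EPC protocol, but a common building block behind such schemes is a hash-based pseudonym exchange in which the tag never emits its static EPC in the clear. The following Python sketch is illustrative only and assumes a secret shared between each tag and the back-end; it is not the authors' protocol.

```python
import hashlib
import os

def tag_response(tag_secret: bytes, reader_nonce: bytes):
    """Privacy-friendly tag reply: instead of the static EPC, the tag returns a
    fresh pseudonym derived from its secret and two nonces (randomized hash-lock idea)."""
    tag_nonce = os.urandom(8)
    pseudonym = hashlib.sha256(tag_secret + reader_nonce + tag_nonce).digest()
    return tag_nonce, pseudonym

def identify(tag_db: dict, reader_nonce: bytes, tag_nonce: bytes, pseudonym: bytes):
    """Back-end identification: recompute the pseudonym for every known secret and
    return the matching EPC. Only the back-end can link replies to a tag."""
    for epc, secret in tag_db.items():
        if hashlib.sha256(secret + reader_nonce + tag_nonce).digest() == pseudonym:
            return epc
    return None

# An eavesdropper observes only (reader_nonce, tag_nonce, pseudonym),
# which change on every query and reveal nothing about the EPC itself.
db = {"urn:epc:id:sgtin:0614141.107346.2017": os.urandom(16)}
nonce = os.urandom(8)
t_nonce, p = tag_response(db["urn:epc:id:sgtin:0614141.107346.2017"], nonce)
print(identify(db, nonce, t_nonce, p))
```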

  4. Protecting genetic privacy.

    Science.gov (United States)

    Roche, P A; Annas, G J

    2001-05-01

    This article outlines the arguments for and against new rules to protect genetic privacy. We explain why genetic information is different to other sensitive medical information, why researchers and biotechnology companies have opposed new rules to protect genetic privacy (and favour anti-discrimination laws instead), and discuss what can be done to protect privacy in relation to genetic-sequence information and to DNA samples themselves.

  5. Are Data Sharing and Privacy Protection Mutually Exclusive?

    Science.gov (United States)

    Joly, Yann; Dyke, Stephanie O M; Knoppers, Bartha M; Pastinen, Tomi

    2016-11-17

    We review emerging strategies to protect the privacy of research participants in international epigenome research: open consent, genome donation, registered access, automated procedures, and privacy-enhancing technologies. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. The role of privacy protection in healthcare information systems adoption.

    Science.gov (United States)

    Hsu, Chien-Lung; Lee, Ming-Ren; Su, Chien-Hui

    2013-10-01

    Privacy protection is an important issue and challenge in healthcare information systems (HISs). Recently, some privacy-enhanced HISs have been proposed. Users' privacy perception, intention, and attitude might affect the adoption of such systems. This paper aims to propose a privacy-enhanced HIS framework and investigate the role of privacy protection in HIS adoption. In the proposed framework, privacy protection, access control, and secure transmission modules are designed to enhance the privacy protection of a HIS. An experimental privacy-enhanced HIS is also implemented. Furthermore, we propose a research model that extends the unified theory of acceptance and use of technology by considering perceived security and information security literacy, and then investigate user adoption of a privacy-enhanced HIS. The experimental results and analyses show that user adoption of a privacy-enhanced HIS is directly affected by social influence, performance expectancy, facilitating conditions, and perceived security. Perceived security has a mediating effect between information security literacy and user adoption. This study offers several implications for research and practice to improve the design, development, and promotion of healthcare information systems with privacy protection.

  7. Privacy and technology challenges for ubiquitous social networking

    DEFF Research Database (Denmark)

    Sapuppo, Antonio; Seet, Boon-Chong

    2015-01-01

    towards important challenges such as social sensing, enabling social networking and privacy protection. In this paper we firstly investigate the methods and technologies for acquisition of the relevant context for promotion of sociability among inhabitants of USN environments. Afterwards, we review...... architectures and techniques for enabling social interactions between participants. Finally, we identify privacy as the major challenge for networking in USN environments. Consequently, we depict design guidelines and review privacy protection models for facilitating personal information disclosure....

  8. Routes for breaching and protecting genetic privacy.

    Science.gov (United States)

    Erlich, Yaniv; Narayanan, Arvind

    2014-06-01

    We are entering an era of ubiquitous genetic information for research, clinical care and personal curiosity. Sharing these data sets is vital for progress in biomedical research. However, a growing concern is the ability to protect the genetic privacy of the data originators. Here, we present an overview of genetic privacy breaching strategies. We outline the principles of each technique, indicate the underlying assumptions, and assess their technological complexity and maturation. We then review potential mitigation methods for privacy-preserving dissemination of sensitive data and highlight different cases that are relevant to genetic applications.

  9. Trajectory data privacy protection based on differential privacy mechanism

    Science.gov (United States)

    Gu, Ke; Yang, Lihao; Liu, Yongzhi; Liao, Niandong

    2018-05-01

    In this paper, we propose a trajectory data privacy protection scheme based on a differential privacy mechanism. In the proposed scheme, the algorithm first selects the protected points from the user's trajectory data; secondly, it forms a polygon from each protected point and the adjacent, frequently accessed points selected from the access-point database, and calculates the polygon centroid; finally, noise is added to the polygon centroid by the differential privacy method, the noisy centroid replaces the protected point, and the algorithm constructs and issues the new trajectory data. The experiments show that the proposed algorithms run quickly, the scheme provides effective privacy protection, and it preserves higher data usability.
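
    As a concrete illustration of the pipeline described above (polygon from the protected point and its frequent neighbours, centroid, differential-privacy noise), the sketch below applies the Laplace mechanism to a vertex-average centroid. The epsilon and sensitivity values are placeholder assumptions, not the parameters derived in the paper.

```python
import numpy as np

def perturb_point(protected_pt, neighbour_pts, epsilon=0.5, sensitivity=1.0):
    """Replace one protected trajectory point by the noisy centroid of the polygon
    formed by that point and its adjacent, frequently accessed points."""
    pts = np.array([protected_pt] + neighbour_pts, dtype=float)
    cx, cy = pts.mean(axis=0)                 # vertex-average centroid of the polygon
    b = sensitivity / epsilon                 # Laplace mechanism scale b = sensitivity / epsilon
    return (cx + np.random.laplace(0.0, b),   # independently perturb each coordinate
            cy + np.random.laplace(0.0, b))

# One protected point and three nearby high-frequency access points
print(perturb_point((3.0, 4.0), [(2.5, 4.2), (3.4, 3.8), (3.1, 4.5)]))
```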

  10. Routes for breaching and protecting genetic privacy

    OpenAIRE

    Erlich, Yaniv; Narayanan, Arvind

    2013-01-01

    We are entering an era of ubiquitous genetic information for research, clinical care and personal curiosity. Sharing these datasets is vital for progress in biomedical research. However, one growing concern is the ability to protect the genetic privacy of the data originators. Here, we present an overview of genetic privacy breaching strategies. We outline the principles of each technique, point to the underlying assumptions, and assess its technological complexity and maturati...

  11. Do Smartphone Power Users Protect Mobile Privacy Better than Nonpower Users? Exploring Power Usage as a Factor in Mobile Privacy Protection and Disclosure.

    Science.gov (United States)

    Kang, Hyunjin; Shin, Wonsun

    2016-03-01

    This study examines how consumers' competence at using smartphone technology (i.e., power usage) affects their privacy protection behaviors. A survey conducted with smartphone users shows that power usage influences privacy protection behavior not only directly but also indirectly through privacy concerns and trust placed in mobile service providers. A follow-up experiment indicates that the effects of power usage on smartphone users' information management can be a function of content personalization. Users, high on power usage, are less likely to share personal information on personalized mobile sites, but they become more revealing when they interact with nonpersonalized mobile sites.

  12. Control use of data to protect privacy.

    Science.gov (United States)

    Landau, Susan

    2015-01-30

    Massive data collection by businesses and governments calls into question traditional methods for protecting privacy, underpinned by two core principles: (i) notice, that there should be no data collection system whose existence is secret, and (ii) consent, that data collected for one purpose not be used for another without user permission. But notice, designated as a fundamental privacy principle in a different era, makes little sense in situations where collection consists of lots and lots of small amounts of information, whereas consent is no longer realistic, given the complexity and number of decisions that must be made. Thus, efforts to protect privacy by controlling use of data are gaining more attention. I discuss relevant technology, policy, and law, as well as some examples that can illuminate the way. Copyright © 2015, American Association for the Advancement of Science.

  13. Privacy in Digital Age: Dead or Alive?! Regarding the New EU Data Protection Regulations

    Directory of Open Access Journals (Sweden)

    Seyed Ebrahim Dorraji

    2015-02-01

    Full Text Available Purpose – To review and critically discuss the current state of privacy in the context of constant technological change, and to emphasize the pace of technological advancement since the last EU data protection laws came into effect. These facts inevitably affect the perception of privacy and raise the question of whether privacy is dead, or taking its last breath, in the digital age. This paper is an attempt to address this question. Design/Methodology/Approach – Based on a comparison and systematic analysis of the scientific literature, the authors discuss problematic issues related to privacy and data protection in the technology era, where these issues are too complicated to be clearly regulated by laws and rules since “laws move as a function of years and technology moves as a function of months” (Ron Rivest). This analytical approach may therefore help in reaching the best-fit decision in this area. Findings – The authors emphasize the changed perception of privacy, which originated and grew on the idea of “an integral part of our humanity”, the “heart of our liberty” and “the beginning of all freedoms” (Solove, 2008), leading to the recently raised idea that privacy is under severe threat. The authors are of the opinion that legislation and regulation may be one of the best and most effective techniques for protecting privacy in the twenty-first century, but it is not currently adequate (Wacks, 2012). One of the solutions lies in technology design. Research limitations/implications – The aspects of privacy and data protection in the European Union have been widely discussed recently because of their broad applicability, so it is hardly possible to review and cover all the important aspects of the issue. This article focuses on the roles of technology and legislation in securing privacy. The authors examine and provide their own views based on

  14. New Technology "Clouds" Student Data Privacy

    Science.gov (United States)

    Krueger, Keith R.; Moore, Bob

    2015-01-01

    As technology has leaped forward to provide valuable learning tools, parents and policy makers have begun raising concerns about the privacy of the student data that schools and systems hold. Federal laws are intended to protect students and their families, but they have not kept up with rapidly evolving technology and never will. School…

  15. Privacy, technology, and norms: the case of Smart Meters.

    Science.gov (United States)

    Horne, Christine; Darras, Brice; Bean, Elyse; Srivastava, Anurag; Frickel, Scott

    2015-05-01

    Norms shift and emerge in response to technological innovation. One such innovation is Smart Meters - components of Smart Grid energy systems capable of minute-to-minute transmission of consumer electricity use information. We integrate theory from sociological research on social norms and privacy to examine how privacy threats affect the demand for and expectations of norms that emerge in response to new technologies, using Smart Meters as a test case. Results from three vignette experiments suggest that increased threats to privacy created by Smart Meters are likely to provoke strong demand for and expectations of norms opposing the technology and that the strength of these normative rules is at least partly conditional on the context. Privacy concerns vary little with actors' demographic characteristics. These findings contribute to theoretical understanding of norm emergence and have practical implications for implementing privacy protections that effectively address concerns of electricity users. Copyright © 2014 Elsevier Inc. All rights reserved.

  16. 45 CFR 164.522 - Rights to request privacy protection for protected health information.

    Science.gov (United States)

    2010-10-01

    ... ADMINISTRATIVE DATA STANDARDS AND RELATED REQUIREMENTS SECURITY AND PRIVACY Privacy of Individually Identifiable Health Information § 164.522 Rights to request privacy protection for protected health information. (a)(1... 45 Public Welfare 1 2010-10-01 2010-10-01 false Rights to request privacy protection for protected...

  17. Hacktivism 1-2-3: how privacy enhancing technologies change the face of anonymous hacktivism

    NARCIS (Netherlands)

    Bodó, B.

    2014-01-01

    This short essay explores how the notion of hacktivism changes due to easily accessible, military grade Privacy Enhancing Technologies (PETs). Privacy Enhancing Technologies, technological tools which provide anonymous communications and protect users from online surveillance enable new forms of

  18. Protecting privacy in a clinical data warehouse.

    Science.gov (United States)

    Kong, Guilan; Xiao, Zhichun

    2015-06-01

    Peking University has several prestigious teaching hospitals in China. To make secondary use of massive medical data for research purposes, construction of a clinical data warehouse is imperative at Peking University. However, a major concern in clinical data warehouse construction is how to protect patient privacy. In this project, we propose to use a combination of symmetric block ciphers, asymmetric ciphers, and cryptographic hashing algorithms to protect patient privacy information. The novelty of our privacy protection approach lies in message-level data encryption, the key caching system, and the cryptographic key management system. The proposed privacy protection approach is scalable to clinical data warehouse construction with any volume of medical data. With the composite privacy protection approach, the clinical data warehouse can be secure enough to keep confidential data from leaking to the outside world. © The Author(s) 2014.
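
    The record names three primitives (symmetric block ciphers, asymmetric ciphers, cryptographic hashing) combined for message-level protection. A minimal sketch of one such combination using the third-party cryptography package is shown below: each message is encrypted with a fresh symmetric key, that key is wrapped with the warehouse's RSA public key, and the patient identifier is replaced by a hash. The key caching and key management systems described in the paper are not modelled here.

```python
import hashlib
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Warehouse key pair (asymmetric cipher); in practice held by a key-management system.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

def protect_record(patient_id: str, message: bytes):
    """Message-level protection: pseudonymize the identifier with a hash,
    encrypt the payload with a per-message symmetric key, and wrap that key
    with the warehouse public key. (A keyed hash would resist dictionary
    attacks better than plain SHA-256; this is a simplified sketch.)"""
    pseudonym = hashlib.sha256(patient_id.encode()).hexdigest()
    sym_key = Fernet.generate_key()                      # symmetric (AES-based) key
    ciphertext = Fernet(sym_key).encrypt(message)
    wrapped_key = public_key.encrypt(
        sym_key,
        padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                     algorithm=hashes.SHA256(), label=None))
    return pseudonym, wrapped_key, ciphertext

def unprotect_record(wrapped_key: bytes, ciphertext: bytes) -> bytes:
    """Only a holder of the warehouse private key can recover the payload."""
    sym_key = private_key.decrypt(
        wrapped_key,
        padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                     algorithm=hashes.SHA256(), label=None))
    return Fernet(sym_key).decrypt(ciphertext)

pid, key_blob, blob = protect_record("PKU-000123", b'{"dx": "I10", "bp": "150/95"}')
print(pid[:16], unprotect_record(key_blob, blob))
```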

  19. Privacy and Technology: Counseling Institutions of Higher Education.

    Science.gov (United States)

    Cranman, Kevin A.

    1998-01-01

    Examines the challenges to colleges and universities associated with maintaining privacy as use of technology increases and technology advances. Lapses in security, types of information needing protection, liability under federal laws, other relevant laws and pending legislation, ethics, and policy implementation in the electronic age are…

  20. PRIVACY PROTECTION PROBLEMS IN SOCIAL NETWORKS

    OpenAIRE

    OKUR, M. Cudi

    2011-01-01

    Protecting privacy has become a major concern for most social network users because of increased difficulties of controlling the online data. This article presents an assessment of the common privacy related risks of social networking sites. Open and hidden privacy risks of active and passive online profiles are examined and increasing share of social networking in these phenomena is discussed. Inadequacy of available legal and institutional protection is demonstrated and the effectiveness of...

  1. Location Privacy Protection Based on Improved K-Value Method in Augmented Reality on Mobile Devices

    Directory of Open Access Journals (Sweden)

    Chunyong Yin

    2017-01-01

    Full Text Available With the development of Augmented Reality technology, location-based services (LBS) have become more and more popular, providing enormous convenience to people's lives. User location information can be obtained anytime and anywhere, so user location privacy faces serious threats. It is therefore crucial to pay attention to location privacy protection in LBS. Based on the trusted third party (TTP) architecture, we analyzed the advantages and shortcomings of existing location privacy protection methods for LBS on mobile terminals. We then propose an improved K-value location privacy protection method based on the user's privacy level, which combines the k-anonymity method with the pseudonym method. The simulation results show that this improved method can anonymize all service requests effectively. An additional experiment on execution time demonstrates that the proposed method realizes location privacy protection more efficiently.
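
    The improved K-value method itself is not detailed in this record, but its two ingredients, k-anonymity and pseudonyms, can be sketched as follows: each request is issued under a throwaway pseudonym together with a cloaking set of k candidate locations (the real one plus k-1 dummies), with k driven by the chosen privacy level. The mapping from privacy level to k and the dummy-generation strategy below are illustrative assumptions.

```python
import random
import uuid

def cloaked_request(real_loc, privacy_level: int, spread: float = 0.01):
    """Build an LBS query that hides the user identity behind a fresh pseudonym
    and the true position inside a k-anonymous set of candidate locations."""
    k = {1: 3, 2: 5, 3: 10}.get(privacy_level, 5)        # privacy level -> k (assumed mapping)
    lat, lon = real_loc
    dummies = [(lat + random.uniform(-spread, spread),
                lon + random.uniform(-spread, spread)) for _ in range(k - 1)]
    candidates = dummies + [real_loc]
    random.shuffle(candidates)                           # real point indistinguishable from dummies
    return {"pseudonym": uuid.uuid4().hex,               # new pseudonym for every request
            "locations": candidates}

req = cloaked_request((39.9042, 116.4074), privacy_level=2)
print(req["pseudonym"], len(req["locations"]))
```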

  2. Through Patients' Eyes: Regulation, Technology, Privacy, and the Future.

    Science.gov (United States)

    Petersen, Carolyn

    2018-04-22

    Privacy is commonly regarded as a regulatory requirement achieved via technical and organizational management practices. Those working in the field of informatics often play a role in privacy preservation as a result of their expertise in information technology, workflow analysis, implementation science, or related skills. Viewing privacy from the perspective of patients whose protected health information is at risk broadens the considerations to include the perceived duality of privacy; the existence of privacy within a context unique to each patient; the competing needs inherent within privacy management; the need for particular consideration when data are shared; and the need for patients to control health information in a global setting. With precision medicine, artificial intelligence, and other treatment innovations on the horizon, health care professionals need to think more broadly about how to preserve privacy in a health care environment driven by data sharing. Patient-reported privacy preferences, privacy portability, and greater transparency around privacy-preserving functionalities are potential strategies for ensuring that privacy regulations are met and privacy is preserved. Georg Thieme Verlag KG Stuttgart.

  3. Genetic secrets: Protecting privacy and confidentiality in the genetic era

    Energy Technology Data Exchange (ETDEWEB)

    Rothstein, M.A. [ed.

    1998-07-01

    Few developments are likely to affect human beings more profoundly in the long run than the discoveries resulting from advances in modern genetics. Although the developments in genetic technology promise to provide many additional benefits, their application to genetic screening poses ethical, social, and legal questions, many of which are rooted in issues of privacy and confidentiality. The ethical, practical, and legal ramifications of these and related questions are explored in depth. The broad range of topics includes: the privacy and confidentiality of genetic information; the challenges to privacy and confidentiality that may be projected to result from the emerging genetic technologies; the role of informed consent in protecting the confidentiality of genetic information in the clinical setting; the potential uses of genetic information by third parties; the implications of changes in the health care delivery system for privacy and confidentiality; relevant national and international developments in public policies, professional standards, and laws; recommendations; and the identification of research needs.

  4. MUSES RT2AE V P/DP: On the Road to Privacy-Friendly Security Technologies in the Workplace

    OpenAIRE

    Van Der Sype, Yung Shin Marleen; Guislain, Jonathan; Seigneur, Jean-Marc; Titi, Xavier

    2016-01-01

    Successful protection of company data assets requires strong technological support. As many security incidents still occur from within, security technologies often include elements to monitor the behaviour of employees. As those security systems are considered as privacy-intrusive, they are hard to align with the privacy and data protection rights of the employees of the company. Even though there is currently no legal obligation for developers to embed privacy and data protection in security...

  5. Incentivizing Verifiable Privacy-Protection Mechanisms for Offline Crowdsensing Applications.

    Science.gov (United States)

    Sun, Jiajun; Liu, Ningzhong

    2017-09-04

    Incentive mechanisms for crowdsensing have recently been intensively explored. Most of these mechanisms focus mainly on standard economic goals such as truthfulness and utility maximization. However, enormous privacy and security challenges, such as cost privacy, must be faced directly in real-life environments. In this paper, we investigate offline verifiable privacy-protection crowdsensing issues. We first present a general verifiable privacy-protection incentive mechanism for the offline homogeneous and heterogeneous sensing job models. In addition, we propose a more complex verifiable privacy-protection incentive mechanism for the offline submodular sensing job model. The two mechanisms not only address the privacy protection of users and the platform, but also ensure the verifiable correctness of payments between the platform and users. Finally, we demonstrate that the two mechanisms satisfy privacy protection, verifiable correctness of payments, and the same revenue as the generic mechanism without privacy protection. Our experiments also validate that the two mechanisms are scalable, efficient, and applicable to mobile devices in auction-based crowdsensing applications, where the main incentive for the user is remuneration.

  6. 22 CFR 212.22 - Protection of personal privacy.

    Science.gov (United States)

    2010-04-01

    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Protection of personal privacy. 212.22 Section... Information for Public Inspection and Copying § 212.22 Protection of personal privacy. To the extent required to prevent a clearly unwarranted invasion of personal privacy, USAID may delete identifying details...

  7. Location privacy protection in mobile networks

    CERN Document Server

    Liu, Xinxin

    2013-01-01

    This SpringerBrief analyzes the potential privacy threats in wireless and mobile network environments, and reviews some existing works. It proposes multiple privacy preserving techniques against several types of privacy threats that are targeting users in a mobile network environment. Depending on the network architecture, different approaches can be adopted. The first proposed approach considers a three-party system architecture where there is a trusted central authority that can be used to protect users' privacy. The second approach considers a totally distributed environment where users per

  8. Data protection laws and privacy on Facebook

    Directory of Open Access Journals (Sweden)

    Phillip Nyoni

    2015-07-01

    Full Text Available Background: Social networks have changed the way people communicate. Business processes and social interactions revolve increasingly around cyberspace. However, as these cyber technologies advance, users become more exposed to privacy threats. Regulatory frameworks and legal instruments, which currently lack a strong cyber presence, are required for the protection of users. Objectives: There is a need to explore and evaluate the extent to which users are exposed to vulnerabilities and threats in the context of the existing protection laws and policies, and to investigate how the existing legal instruments can be enhanced to better protect users. Method: This article evaluates and analyses these privacy challenges from a legalistic point of view. The study focuses on South African Facebook users. Poll information gathered from the profile pages of users at North-West University was analysed, and a short survey was also conducted to validate the poll results. Descriptive statistics, including measures of central tendency and measures of spread, have been used to present the data. In addition, a combination of tabulated and graphical descriptive data was summarised in a meaningful way. Results: The results clearly show that the legal frameworks and laws are still evolving and are not adequately drafted to deal with specific cyber violations of privacy. Conclusion: This highlights the need to review legal instruments on a regular basis, with wider consultation with users, in an endeavour to develop a robust and enforceable legal framework. A proactive legal framework would be the ideal approach; unfortunately, law is reactive to cyber-crimes.

  9. Privacy protection for patients with substance use problems

    Directory of Open Access Journals (Sweden)

    Hu LL

    2011-12-01

    Full Text Available Lianne Lian Hu,1 Steven Sparenborg,2 Betty Tai2 (1Department of Preventive Medicine and Biometrics, Uniformed Services University of the Health Sciences; 2Center for the Clinical Trials Network, National Institute on Drug Abuse, National Institutes of Health, Bethesda, MD) Abstract: Many Americans with substance use problems will have opportunities to receive coordinated health care through the integration of primary care and specialty care for substance use disorders under the Patient Protection and Affordable Care Act of 2010. Sharing of patient health records among care providers is essential to realize the benefits of electronic health records. Health information exchange through meaningful use of electronic health records can improve health care safety, quality, and efficiency. Implementation of electronic health records and health information exchange presents great opportunities for health care integration, but also makes patient privacy potentially vulnerable. Privacy issues are paramount for patients with substance use problems. This paper discusses major differences between two federal privacy laws associated with health care for substance use disorders, identifies health care problems created by privacy policies, and describes potential solutions to these problems through technology innovation and policy improvement. Keywords: substance abuse, patient privacy, electronic health records, health information exchange

  10. Genetic secrets: Protecting privacy and confidentiality in the genetic era. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Rothstein, M.A. [ed.

    1998-09-01

    Few developments are likely to affect human beings more profoundly in the long run than the discoveries resulting from advances in modern genetics. Although the developments in genetic technology promise to provide many additional benefits, their application to genetic screening poses ethical, social, and legal questions, many of which are rooted in issues of privacy and confidentiality. The ethical, practical, and legal ramifications of these and related questions are explored in depth. The broad range of topics includes: the privacy and confidentiality of genetic information; the challenges to privacy and confidentiality that may be projected to result from the emerging genetic technologies; the role of informed consent in protecting the confidentiality of genetic information in the clinical setting; the potential uses of genetic information by third parties; the implications of changes in the health care delivery system for privacy and confidentiality; relevant national and international developments in public policies, professional standards, and laws; recommendations; and the identification of research needs.

  11. 48 CFR 39.105 - Privacy.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Privacy. 39.105 Section 39... CONTRACTING ACQUISITION OF INFORMATION TECHNOLOGY General 39.105 Privacy. Agencies shall ensure that contracts for information technology address protection of privacy in accordance with the Privacy Act (5 U.S.C...

  12. Gender and online privacy among teens: risk perception, privacy concerns, and protection behaviors.

    Science.gov (United States)

    Youn, Seounmi; Hall, Kimberly

    2008-12-01

    Survey data from 395 high school students revealed that girls perceive more privacy risks and have a higher level of privacy concerns than boys. Regarding privacy protection behaviors, boys tended to read unsolicited e-mail and register for Web sites while directly sending complaints in response to unsolicited e-mail. This study found girls to provide inaccurate information as their privacy concerns increased. Boys, however, refrained from registering to Web sites as their concerns increased.

  13. 78 FR 76986 - Children's Online Privacy Protection Rule

    Science.gov (United States)

    2013-12-20

    ... FEDERAL TRADE COMMISSION 16 CFR Part 312 RIN 3084-AB20 Children's Online Privacy Protection Rule... published final rule amendments to the Children's Online Privacy Protection Rule on January 17, 2013 to update the requirements set forth in the notice, parental consent, confidentiality and security, and safe...

  14. 78 FR 3971 - Children's Online Privacy Protection Rule

    Science.gov (United States)

    2013-01-17

    ... functionality or content of their properties or gain greater publicity through social media in an effort to... Children's Online Privacy Protection Rule; Final Rule. Federal Register / Vol. 78, No. 12 / Thursday... 3084-AB20 Children's Online Privacy Protection Rule AGENCY: Federal Trade Commission (``FTC'' or...

  15. Couldn't or wouldn't? The influence of privacy concerns and self-efficacy in privacy management on privacy protection.

    Science.gov (United States)

    Chen, Hsuan-Ting; Chen, Wenghong

    2015-01-01

    Sampling 515 college students, this study investigates how privacy protection behaviors, including profile visibility, self-disclosure, and friending, are influenced by privacy concerns and by efficacy regarding one's own ability to manage privacy settings, a factor that researchers have yet to give a great deal of attention to in the context of social networking sites (SNSs). The results of this study indicate an inconsistency in adopting strategies to protect privacy: limiting profile visibility and friending is disconnected from self-disclosure. More specifically, privacy concerns lead SNS users to limit their profile visibility and discourage them from expanding their network, but they do not constrain self-disclosure. Similarly, while self-efficacy in privacy management encourages SNS users to limit their profile visibility, it facilitates self-disclosure. This suggests that if users limit their profile visibility and constrain their friending behaviors, it does not necessarily mean they will reduce self-disclosure on SNSs, because these behaviors are predicted by different factors. In addition, the study finds an interaction effect between privacy concerns and self-efficacy in privacy management on friending. It points to the potential problem of increased risk-taking behaviors resulting from high self-efficacy in privacy management and low privacy concerns.

  16. Effective evaluation of privacy protection techniques in visible and thermal imagery

    Science.gov (United States)

    Nawaz, Tahir; Berg, Amanda; Ferryman, James; Ahlberg, Jörgen; Felsberg, Michael

    2017-09-01

    Privacy protection may be defined as replacing the original content in an image region with a (less intrusive) content having modified target appearance information to make it less recognizable by applying a privacy protection technique. Indeed, the development of privacy protection techniques also needs to be complemented with an established objective evaluation method to facilitate their assessment and comparison. Generally, existing evaluation methods rely on the use of subjective judgments or assume a specific target type in image data and use target detection and recognition accuracies to assess privacy protection. An annotation-free evaluation method that is neither subjective nor assumes a specific target type is proposed. It assesses two key aspects of privacy protection: "protection" and "utility." Protection is quantified as an appearance similarity, and utility is measured as a structural similarity between original and privacy-protected image regions. We performed an extensive experimentation using six challenging datasets (having 12 video sequences), including a new dataset (having six sequences) that contains visible and thermal imagery. The new dataset is made available online for the community. We demonstrate effectiveness of the proposed method by evaluating six image-based privacy protection techniques and also show comparisons of the proposed method over existing methods.
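
    The two quantities defined above can be approximated with off-the-shelf image metrics. The sketch below uses intensity-histogram correlation as a stand-in for appearance similarity (protection) and SSIM from scikit-image for structural similarity (utility); this is one reasonable reading of the record, not the authors' exact formulation.

```python
import numpy as np
from skimage.metrics import structural_similarity

def appearance_similarity(original: np.ndarray, protected: np.ndarray, bins: int = 32) -> float:
    """Intensity-histogram correlation in [-1, 1]; lower values indicate that the
    target appearance has been altered more strongly (better protection)."""
    h1, _ = np.histogram(original, bins=bins, range=(0, 255), density=True)
    h2, _ = np.histogram(protected, bins=bins, range=(0, 255), density=True)
    return float(np.corrcoef(h1, h2)[0, 1])

def utility(original: np.ndarray, protected: np.ndarray) -> float:
    """Structural similarity (SSIM) in [-1, 1]; higher values indicate that the
    protected region still preserves scene structure useful for analysis."""
    return float(structural_similarity(original, protected, data_range=255))

rng = np.random.default_rng(0)
region = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)      # original image region
noisy = np.clip(region + rng.normal(0, 40, region.shape), 0, 255).astype(np.uint8)  # "protected" region
print(appearance_similarity(region, noisy), utility(region, noisy))
```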

  17. An Adaptive Privacy Protection Method for Smart Home Environments Using Supervised Learning

    Directory of Open Access Journals (Sweden)

    Jingsha He

    2017-03-01

    Full Text Available In recent years, smart home technologies have started to be widely used, bringing a great deal of convenience to people’s daily lives. At the same time, privacy issues have become particularly prominent. Traditional encryption methods can no longer meet the needs of privacy protection in smart home applications, since attacks can be launched even without the need for access to the cipher. Rather, attacks can be successfully realized through analyzing the frequency of radio signals, as well as the timestamp series, so that the daily activities of the residents in the smart home can be learnt. Such types of attacks can achieve a very high success rate, making them a great threat to users’ privacy. In this paper, we propose an adaptive method based on sample data analysis and supervised learning (SDASL, to hide the patterns of daily routines of residents that would adapt to dynamically changing network loads. Compared to some existing solutions, our proposed method exhibits advantages such as low energy consumption, low latency, strong adaptability, and effective privacy protection.

  18. 36 CFR 902.56 - Protection of personal privacy.

    Science.gov (United States)

    2010-07-01

    ... privacy. 902.56 Section 902.56 Parks, Forests, and Public Property PENNSYLVANIA AVENUE DEVELOPMENT... Protection of personal privacy. (a) Any of the following personnel, medical, or similar records is within the... invasion of his personal privacy: (1) Personnel and background records personal to any officer or employee...

  19. New technologies and the right to privacy in Nigeria: Evaluating the ...

    African Journals Online (AJOL)

    Nnamdi Azikiwe University Journal of International Law and Jurisprudence ... The paper concludes that in spite of the wide use of new technologies, the jurisprudence protecting privacy is still largely underdeveloped in Nigeria. This is largely ...

  20. Hybrid Paradigm from European and America Concerning Privacy and Personal Data Protection in Indonesia

    Directory of Open Access Journals (Sweden)

    Edmon Makarim

    2013-05-01

    Full Text Available In the emerging era of information and technology, the importance of privacy and data protection has been growing ever since. However, despite this common concern in society, there is some confusion about how privacy is to be distinguished from, and scoped against, the protection of personal data, and the discussion is even blended with spamming issues. Drawing a comparison between European and US legal perspectives, this paper therefore discusses the problem from the perspective of the laws governing the communication itself.

  1. Protecting Privacy and Confidentiality in Environmental Health Research.

    Science.gov (United States)

    Resnik, David B

    2010-01-01

    Environmental health researchers often need to make difficult decisions on how to protect privacy and confidentiality when they conduct research in the home or workplace. These dilemmas are different from those normally encountered in clinical research. Although protecting privacy and confidentiality is one of the most important principles of research involving human subjects, it can be overridden to prevent imminent harm to individuals or if required by law. Investigators should carefully consider the facts and circumstances and use good judgment when deciding whether to breach privacy or confidentiality.

  2. Protecting Privacy in the Global South (Phase 2) | CRDI - Centre de ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    The absence of appropriate privacy protections can lead to grave problems. Privacy ... Developing countries are home to the greatest number of Internet and mobile users, but such privacy protection is scarce. ...

  3. The Impact of Privacy Concerns and Perceived Vulnerability to Risks on Users Privacy Protection Behaviors on SNS: A Structural Equation Model

    OpenAIRE

    Noora Sami Al-Saqer; Mohamed E. Seliaman

    2016-01-01

    This research paper investigates Saudi users' awareness levels about privacy policies in Social Networking Sites (SNSs), their privacy concerns, and their privacy protection measures. For this purpose, a research model was developed that consists of five main constructs, namely information privacy concern, awareness level of privacy policies of social networking sites, perceived vulnerability to privacy risks, perceived response efficacy, and privacy protecting behavior. An online survey questi...

  4. Battling for the Rights to Privacy and Data Protection in the Irish Courts

    Directory of Open Access Journals (Sweden)

    Shane Darcy

    2015-02-01

    Full Text Available Far-reaching mass surveillance by the US National Security Agency and other national security services has brought issues of privacy and data protection to the fore in recent years. Information and technology companies have been embroiled in this scandal for having shared, unwittingly or otherwise, users' personal data with the security services. Facebook, the world's largest social media company, has long been criticised by privacy advocates because of its treatment of users' data. Proceedings before the Irish courts concerning the role of national data protection authorities have seen an examination of these practices in light of relevant Irish and EU law.

  5. 45 CFR 164.520 - Notice of privacy practices for protected health information.

    Science.gov (United States)

    2010-10-01

    ... DATA STANDARDS AND RELATED REQUIREMENTS SECURITY AND PRIVACY Privacy of Individually Identifiable Health Information § 164.520 Notice of privacy practices for protected health information. (a) Standard... 45 Public Welfare 1 2010-10-01 2010-10-01 false Notice of privacy practices for protected health...

  6. Privacy rules for DNA databanks. Protecting coded 'future diaries'.

    Science.gov (United States)

    Annas, G J

    1993-11-17

    In privacy terms, genetic information is like medical information. But the information contained in the DNA molecule itself is more sensitive because it contains an individual's probabilistic "future diary," is written in a code that has only partially been broken, and contains information about an individual's parents, siblings, and children. Current rules for protecting the privacy of medical information cannot protect either genetic information or identifiable DNA samples stored in DNA databanks. A review of the legal and public policy rationales for protecting genetic privacy suggests that specific enforceable privacy rules for DNA databanks are needed. Four preliminary rules are proposed to govern the creation of DNA databanks, the collection of DNA samples for storage, limits on the use of information derived from the samples, and continuing obligations to those whose DNA samples are in the databanks.

  7. Courts, privacy and data protection in Belgium : Fundamental rights that might as well be struck from the constitution

    NARCIS (Netherlands)

    de Hert, Paul; Brkan, Maja; Psychogiopoulou, Evangelia

    2017-01-01

    Through critical analysis of case law in Belgium courts, this chapter reveals the significant role courts play in the protection of privacy and personal data within the new technological environment. It addresses the pressing question from a public who are increasingly aware of their privacy rights

  8. Privacy vs security

    CERN Document Server

    Stalla-Bourdillon, Sophie; Ryan, Mark D

    2014-01-01

    Securing privacy in the current environment is one of the great challenges of today's democracies. Privacy vs. Security explores the issues of privacy and security and their complicated interplay, from a legal and a technical point of view. Sophie Stalla-Bourdillon provides a thorough account of the legal underpinnings of the European approach to privacy and examines their implementation through privacy, data protection and data retention laws. Joshua Philips and Mark D. Ryan focus on the technological aspects of privacy, in particular, on today's attacks on privacy by the simple use of today'

  9. Privacy Protection Method in the Era of Cloud Computing and Big Data

    Directory of Open Access Journals (Sweden)

    Liu Ying

    2015-01-01

    Full Text Available Cloud computing has become an academic and industrial hotspot in China in recent years. Cloud computing can help business clients manage finances more conveniently and efficiently, but it can also weaken the protection of privacy. In addition, inherent deficiencies such as safety concerns and differing criteria hinder its application to privacy protection. This paper analyzes the application of cloud computing and big data to privacy protection and the existing problems, and puts forward ways to promote privacy protection in the era of cloud computing and big data.

  10. Older and Wiser? Facebook Use, Privacy Concern, and Privacy Protection in the Life Stages of Emerging, Young, and Middle Adulthood

    Directory of Open Access Journals (Sweden)

    Evert Van den Broeck

    2015-11-01

    Full Text Available A large part of research conducted on privacy concern and protection on social networking sites (SNSs) concentrates on children and adolescents. Individuals in these developmental stages are often described as vulnerable Internet users. But how vulnerable are adults in terms of online informational privacy? This study applied a privacy boundary management approach and investigated Facebook use, privacy concern, and the application of privacy settings on Facebook by linking the results to Erikson's three stages of adulthood: emerging, young, and middle adulthood. An online survey was distributed among 18- to 65-year-old Dutch-speaking adults (N = 508, 51.8% female). Analyses revealed clear differences between the three adult age groups in terms of privacy concern, Facebook use, and privacy protection. Results indicated that respondents in young adulthood and middle adulthood were more vulnerable in terms of privacy protection than emerging adults. Clear discrepancies were found between privacy concern and protection for these age groups. More particularly, the middle adulthood group was more concerned about their privacy in comparison to the emerging adulthood and young adulthood group. Yet, they reported to use privacy settings less frequently than the younger age groups. Emerging adults were found to be pragmatic and privacy conscious SNS users. Young adults occupied the intermediate position, suggesting a developmental shift. The impact of generational differences is discussed, as well as implications for education and governmental action.

  11. 76 FR 31425 - HIPAA Privacy Rule Accounting of Disclosures Under the Health Information Technology for Economic...

    Science.gov (United States)

    2011-05-31

    ... 164 HIPAA Privacy Rule Accounting of Disclosures Under the Health Information Technology for Economic... Secretary 45 CFR Part 164 RIN 0991-AB62 HIPAA Privacy Rule Accounting of Disclosures Under the Health... accounting of disclosures of protected health information. The purpose of these modifications is, in part, to...

  12. Privacy Attitudes among Early Adopters of Emerging Health Technologies.

    Directory of Open Access Journals (Sweden)

    Cynthia Cheung

    Full Text Available Advances in health technology such as genome sequencing and wearable sensors now allow for the collection of highly granular personal health data from individuals. It is unclear how people think about privacy in the context of these emerging health technologies. An open question is whether early adopters of these advances conceptualize privacy in different ways than non-early adopters. This study sought to understand privacy attitudes of early adopters of emerging health technologies. Transcripts from in-depth, semi-structured interviews with early adopters of genome sequencing and health devices and apps were analyzed with a focus on participant attitudes and perceptions of privacy. Themes were extracted using inductive content analysis. Although interviewees were willing to share personal data to support scientific advancements, they still expressed concerns, as well as uncertainty about who has access to their data, and for what purpose. In short, they were not dismissive of privacy risks. Key privacy-related findings are organized into four themes as follows: first, personal data privacy; second, control over personal information; third, concerns about discrimination; and fourth, contributing personal data to science. Early adopters of emerging health technologies appear to have more complex and nuanced conceptions of privacy than might be expected based on their adoption of personal health technologies and participation in open science. Early adopters also voiced uncertainty about the privacy implications of their decisions to use new technologies and share their data for research. Though not representative of the general public, studies of early adopters can provide important insights into evolving attitudes toward privacy in the context of emerging health technologies and personal health data research.

  13. Privacy Attitudes among Early Adopters of Emerging Health Technologies.

    Science.gov (United States)

    Cheung, Cynthia; Bietz, Matthew J; Patrick, Kevin; Bloss, Cinnamon S

    2016-01-01

    Advances in health technology such as genome sequencing and wearable sensors now allow for the collection of highly granular personal health data from individuals. It is unclear how people think about privacy in the context of these emerging health technologies. An open question is whether early adopters of these advances conceptualize privacy in different ways than non-early adopters. This study sought to understand privacy attitudes of early adopters of emerging health technologies. Transcripts from in-depth, semi-structured interviews with early adopters of genome sequencing and health devices and apps were analyzed with a focus on participant attitudes and perceptions of privacy. Themes were extracted using inductive content analysis. Although interviewees were willing to share personal data to support scientific advancements, they still expressed concerns, as well as uncertainty about who has access to their data, and for what purpose. In short, they were not dismissive of privacy risks. Key privacy-related findings are organized into four themes as follows: first, personal data privacy; second, control over personal information; third, concerns about discrimination; and fourth, contributing personal data to science. Early adopters of emerging health technologies appear to have more complex and nuanced conceptions of privacy than might be expected based on their adoption of personal health technologies and participation in open science. Early adopters also voiced uncertainty about the privacy implications of their decisions to use new technologies and share their data for research. Though not representative of the general public, studies of early adopters can provide important insights into evolving attitudes toward privacy in the context of emerging health technologies and personal health data research.

  14. Uniting Legislation with RFID Privacy-Enhancing Technologies

    NARCIS (Netherlands)

    Rieback, M.R.; Crispo, B.; Tanenbaum, A.S.

    2005-01-01

    RFID is a popular identification and automation technology with serious security and privacy threats. Legislation expounds upon the actual security and privacy needs of people in RFID-enabled environments, while technology helps to ensure legal compliance. This paper examines the main aims of RFID

  15. FCJ-195 Privacy, Responsibility, and Human Rights Activism

    Directory of Open Access Journals (Sweden)

    Becky Kazansky

    2015-06-01

    Full Text Available In this article, we argue that many difficulties associated with the protection of digital privacy are rooted in the framing of privacy as a predominantly individual responsibility. We examine how models of privacy protection, such as Notice and Choice, contribute to the ‘responsibilisation’ of human rights activists who rely on the use of technologies for their work. We also consider how a group of human rights activists countered technology-mediated threats that this ‘responsibilisation’ causes by developing a collective approach to address their digital privacy and security needs. We conclude this article by discussing how technological tools used to maintain or counter the loss of privacy can be improved in order to support the privacy and digital security of human rights activists.

  16. Protecting Your Child's Privacy Online

    Science.gov (United States)

    The Federal Trade Commission (FTC) is the nation's consumer protection agency. The FTC works to prevent fraudulent, deceptive ...

  17. On the comprehensibility and perceived privacy protection of indirect questioning techniques.

    Science.gov (United States)

    Hoffmann, Adrian; Waubert de Puiseau, Berenike; Schmidt, Alexander F; Musch, Jochen

    2017-08-01

    On surveys that assess sensitive personal attributes, indirect questioning aims at increasing respondents' willingness to answer truthfully by protecting confidentiality. However, the assumption that subjects understand questioning procedures fully and trust them to protect their privacy is rarely tested. In a scenario-based design, we compared four indirect questioning procedures in terms of their comprehensibility and perceived privacy protection. All indirect questioning techniques were found to be less comprehensible by respondents than a conventional direct question used for comparison. Less-educated respondents experienced more difficulties when confronted with any indirect questioning technique. Regardless of education, the crosswise model was found to be the most comprehensible among the four indirect methods. Indirect questioning in general was perceived to increase privacy protection in comparison to a direct question. Unexpectedly, comprehension and perceived privacy protection did not correlate. We recommend assessing these factors separately in future evaluations of indirect questioning.
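
    For readers unfamiliar with the crosswise model referenced above: respondents report only whether their answer to the sensitive item and their answer to an innocuous item with known prevalence p are the same, so no individual answer is disclosed. A short sketch of the standard prevalence estimator under this design follows; the survey numbers are made up for illustration.

```python
def crosswise_estimate(n_same: int, n_total: int, p: float) -> float:
    """Prevalence estimate of the sensitive attribute under the crosswise model.
    'same' means the answers to the sensitive and the innocuous item are both
    yes or both no; p is the known prevalence of the innocuous item (p != 0.5)."""
    lam = n_same / n_total                      # observed P(same) = pi*p + (1 - pi)*(1 - p)
    return (lam + p - 1.0) / (2.0 * p - 1.0)    # solve for pi

# 600 respondents, 460 report "same", innocuous item with p = 2/12
# (e.g. "my mother was born in January or February")
print(round(crosswise_estimate(460, 600, p=2 / 12), 3))
```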

  18. Protecting privacy in data release

    CERN Document Server

    Livraga, Giovanni

    2015-01-01

    This book presents a comprehensive approach to protecting sensitive information when large data collections are released by their owners. It addresses three key requirements of data privacy: the protection of data explicitly released, the protection of information not explicitly released but potentially vulnerable due to a release of other data, and the enforcement of owner-defined access restrictions to the released data. It is also the first book with a complete examination of how to enforce dynamic read and write access authorizations on released data, applicable to the emerging data outsou

  19. Anonymous communication networks protecting privacy on the web

    CERN Document Server

    Peng, Kun

    2014-01-01

    In today's interactive network environment, where various types of organizations are eager to monitor and track Internet use, anonymity is one of the most powerful resources available to counterbalance the threat of unknown spectators and to ensure Internet privacy.Addressing the demand for authoritative information on anonymous Internet usage, Anonymous Communication Networks: Protecting Privacy on the Web examines anonymous communication networks as a solution to Internet privacy concerns. It explains how anonymous communication networks make it possible for participants to communicate with

  20. Privacy in context technology, policy, and the integrity of social life

    CERN Document Server

    Nissenbaum, Helen

    2009-01-01

    Privacy is one of the most urgent issues associated with information technology and digital media. This book claims that what people really care about when they complain and protest that privacy has been violated is not the act of sharing information itself—most people understand that this is crucial to social life —but the inappropriate, improper sharing of information. Arguing that privacy concerns should not be limited solely to concern about control over personal information, Helen Nissenbaum counters that information ought to be distributed and protected according to norms governing distinct social contexts—whether it be workplace, health care, schools, or among family and friends. She warns that basic distinctions between public and private, informing many current privacy policies, in fact obscure more than they clarify. In truth, contemporary information systems should alarm us only when they function without regard for social norms and values, and thereby weaken the fabric of social life.

  1. Data protection and privacy : The age of intelligent machines

    NARCIS (Netherlands)

    Leenes, Ronald; van Brakel, Rosamunde; Gutwirth, Serge; de Hert, Paul

    2017-01-01

    This volume arises from the tenth annual International Conference on Computers, Privacy, and Data Protection (CPDP 2017) held in Brussels in January 2017, bringing together papers that offer conceptual analyses, highlight issues, propose solutions, and discuss practices regarding privacy and data

  2. Privacy Expectations in Online Contexts

    Science.gov (United States)

    Pure, Rebekah Abigail

    2013-01-01

    Advances in digital networked communication technology over the last two decades have brought the issue of personal privacy into sharper focus within contemporary public discourse. In this dissertation, I explain the Fourth Amendment and the role that privacy expectations play in the constitutional protection of personal privacy generally, and…

  3. Privacy and policy for genetic research.

    Science.gov (United States)

    DeCew, Judith Wagner

    2004-01-01

    I begin with a discussion of the value of privacy and what we lose without it. I then turn to the difficulties of preserving privacy for genetic information and other medical records in the face of advanced information technology. I suggest three alternative public policy approaches to the problem of protecting individual privacy and also preserving databases for genetic research: (1) governmental guidelines and centralized databases, (2) corporate self-regulation, and (3) my hybrid approach. None of these are unproblematic; I discuss strengths and drawbacks of each, emphasizing the importance of protecting the privacy of sensitive medical and genetic information as well as letting information technology flourish to aid patient care, public health and scientific research.

  4. Energy-efficient privacy protection for smart home environments using behavioral semantics.

    Science.gov (United States)

    Park, Homin; Basaran, Can; Park, Taejoon; Son, Sang Hyuk

    2014-09-02

    Research on smart environments saturated with ubiquitous computing devices is rapidly advancing while raising serious privacy issues. According to recent studies, privacy concerns significantly hinder widespread adoption of smart home technologies. Previous work has shown that it is possible to infer the activities of daily living within environments equipped with wireless sensors by monitoring radio fingerprints and traffic patterns. Since data encryption cannot prevent privacy invasions exploiting transmission pattern analysis and statistical inference, various methods based on fake data generation for concealing traffic patterns have been studied. In this paper, we describe an energy-efficient, light-weight, low-latency algorithm for creating dummy activities that are semantically similar to the observed phenomena. By using these cloaking activities, the amount of  fake data transmissions can be flexibly controlled to support a trade-off between energy efficiency and privacy protection. According to the experiments using real data collected from a smart home environment, our proposed method can extend the lifetime of the network by more than 2× compared to the previous methods in the literature. Furthermore, the activity cloaking method supports low latency transmission of real data while also significantly reducing the accuracy of the wireless snooping attacks.
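
    The record above describes the approach only at a high level and the algorithm itself is not reproduced here. As a loose illustration of the general idea of "semantically similar" cloaking traffic, the sketch below picks a dummy activity whose sensor footprint overlaps the observed one while respecting a fake-transmission budget. The activity catalogue, similarity measure and budget handling are hypothetical stand-ins, not the authors' design.

```python
import random

# Hypothetical catalogue: each activity of daily living is described by the
# sensors it typically triggers and a rough radio-transmission rate.
ACTIVITY_PROFILES = {
    "cooking":     {"sensors": {"kitchen_motion", "stove", "fridge"}, "tx_per_min": 12},
    "dishwashing": {"sensors": {"kitchen_motion", "sink", "fridge"},  "tx_per_min": 6},
    "showering":   {"sensors": {"bathroom_motion", "humidity"},       "tx_per_min": 8},
    "sleeping":    {"sensors": {"bedroom_motion"},                    "tx_per_min": 1},
}

def semantic_similarity(a: str, b: str) -> float:
    """Jaccard overlap of the sensor sets, used as a crude semantic similarity."""
    sa = ACTIVITY_PROFILES[a]["sensors"]
    sb = ACTIVITY_PROFILES[b]["sensors"]
    return len(sa & sb) / len(sa | sb)

def pick_dummy_activity(observed: str, tx_budget_per_min: int):
    """Choose a cloaking activity that is semantically close to the observed one
    but whose fake transmissions still fit the remaining energy budget."""
    candidates = [a for a in ACTIVITY_PROFILES
                  if a != observed
                  and ACTIVITY_PROFILES[a]["tx_per_min"] <= tx_budget_per_min]
    if not candidates:
        return None  # budget exhausted: favour energy efficiency over privacy
    # Prefer similar activities; break ties randomly so the cover traffic varies.
    return max(candidates,
               key=lambda a: (semantic_similarity(observed, a), random.random()))

if __name__ == "__main__":
    print(pick_dummy_activity("cooking", tx_budget_per_min=10))  # likely "dishwashing"
```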

  5. MODEL REGULATION FOR DATA PRIVACY IN THE APPLICATION OF BIOMETRIC SMART CARD

    Directory of Open Access Journals (Sweden)

    Sinta Dewi

    2017-03-01

    This article explores a model regulation intended to regulate and protect data privacy. The regulatory model combines several approaches to managing data privacy, especially in the use of biometric smart cards. First, through laws that enforce the relevant principles and international standards. Second, through a market approach (market-based solutions), in which industry associations help protect consumer data privacy by adopting privacy policies stating that the industry will protect consumers' privacy by implementing fair information principles. Third, through technological approaches such as PETs (privacy enhancing technologies), i.e. techniques for anonymous and pseudo-anonymous payment, communication, and web access. Fourth, through corporate privacy rules.

  6. Privacy-invading technologies : safeguarding privacy, liberty & security in the 21st century

    NARCIS (Netherlands)

    Klitou, Demetrius

    2012-01-01

    With a focus on the growing development and deployment of the latest technologies that threaten privacy, the PhD dissertation argues that the US and UK legal frameworks, in their present form, are inadequate to defend privacy and other civil liberties against the intrusive capabilities of body

  7. Scalable privacy-preserving data sharing methodology for genome-wide association studies: an application to iDASH healthcare privacy protection challenge.

    Science.gov (United States)

    Yu, Fei; Ji, Zhanglong

    2014-01-01

    In response to the growing interest in genome-wide association study (GWAS) data privacy, the Integrating Data for Analysis, Anonymization and SHaring (iDASH) center organized the iDASH Healthcare Privacy Protection Challenge, with the aim of investigating the effectiveness of applying privacy-preserving methodologies to human genetic data. This paper is based on a submission to the iDASH Healthcare Privacy Protection Challenge. We apply privacy-preserving methods that are adapted from Uhler et al. 2013 and Yu et al. 2014 to the challenge's data and analyze the data utility after the data are perturbed by the privacy-preserving methods. Major contributions of this paper include new interpretation of the χ2 statistic in a GWAS setting and new results about the Hamming distance score, a key component for one of the privacy-preserving methods.
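
    The record does not reproduce the statistics it refers to. For orientation only, the standard allelic association test and a plain Hamming distance between genotype vectors are written out below; the paper's own reinterpretation of the χ2 statistic and its Hamming distance score are not reproduced here.

```latex
% Standard allelic chi-square test for one SNP (2 x 2 table: case/control x allele),
% with O_{ij} the observed counts and E_{ij} the usual expected counts.
\chi^2 = \sum_{i \in \{\text{case},\,\text{control}\}} \; \sum_{j \in \{A,\,a\}}
         \frac{(O_{ij} - E_{ij})^2}{E_{ij}},
\qquad
E_{ij} = \frac{O_{i\cdot}\, O_{\cdot j}}{O_{\cdot\cdot}}

% Plain Hamming distance between two genotype vectors g, h in {0,1,2}^m.
d_H(g, h) = \sum_{k=1}^{m} \mathbf{1}\!\left[ g_k \neq h_k \right]
```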

  8. 76 FR 48811 - Computer Matching and Privacy Protection Act of 1988

    Science.gov (United States)

    2011-08-09

    ... CORPORATION FOR NATIONAL AND COMMUNITY SERVICE Computer Matching and Privacy Protection Act of... of the Computer Matching and Privacy Protection Act of 1988 (54 FR 25818, June 19, 1989), and OMB... Security Administration (``SSA''). DATES: CNCS will file a report on the computer matching agreement with...

  9. Privacy as human flourishing: could a shift towards virtue ethics strengthen privacy protection in the age of Big Data?

    NARCIS (Netherlands)

    van der Sloot, B.

    2014-01-01

    Privacy is commonly seen as an instrumental value in relation to negative freedom, human dignity and personal autonomy. Article 8 ECHR, protecting the right to privacy, was originally coined as a doctrine protecting the negative freedom of citizens in vertical relations, that is between citizen and

  10. Privacy Protection in Cloud Using Rsa Algorithm

    OpenAIRE

    Amandeep Kaur; Manpreet Kaur

    2014-01-01

    The cloud computing architecture is in high demand nowadays. The cloud has succeeded over grid and distributed environments due to its low cost and high reliability, along with strong security. However, research shows that cloud computing still has some security issues regarding privacy. Cloud brokers provide cloud services to the general public and are expected to ensure that data is protected; however, they sometimes fall short on security and privacy. Thus in this work...

  11. HOTEL GUEST’S PRIVACY PROTECTION IN TOURISM BUSINESS LAW

    OpenAIRE

    Oliver Radolovic

    2010-01-01

    In the tourism business law, especially in the hotel-keeper’s contract (direct, agency, allotment), the hotel-keeper assumes certain obligations to the guests, among which, in the last twenty years, the protection of the guest’s privacy is particularly emphasized. The subject of the paper is hotel guest’s privacy protection in the Croatian and comparative tourism business law. The paper is structured in a way that it analyzes, through the laws of Croatia, France, Italy, Germany, UK and USA, t...

  12. Privacy and data protection: Legal aspects in the Republic of Macedonia

    Directory of Open Access Journals (Sweden)

    Nora Osmani

    2016-07-01

    Full Text Available The purpose of this paper is to present a theoretical assessment of the existing Law on Personal Data Protection in the Republic of Macedonia. The paper aims to analyse whether there is a need for additional legal tools in order to achieve a balance between maintaining data integrity in the digital age and the use of modern technology. The paper discusses the meaning of “information privacy” in the age of big data, cyber threats and the domestic and international response to these issues. Special focus is dedicated to privacy policy enforcement in European Union Law. Having regard to the development of new technologies, prevailing data protection legislation may no longer be able to provide effective protection for individuals’ personal information. Therefore, existing laws should be continuously adapted to respond to new challenges and situations deriving from different online activities and communications.

  13. Privacy Protection in Personal Health Information and Shared Care Records

    Directory of Open Access Journals (Sweden)

    Roderick L B Neame

    2014-03-01

    Full Text Available Background: The protection of personal information privacy has become one of the most pressing security concerns for record keepers. Many institutions have yet to implement the essential infrastructure for data privacy protection and patient control when accessing and sharing data; even more have failed to instil a privacy and security awareness mindset and culture amongst their staff. Increased regulation, together with better compliance monitoring, has led to the imposition of increasingly significant monetary penalties for failures to protect privacy. Objective: There is growing pressure in clinical environments to deliver shared patient care and to support this with integrated information. This demands that more information passes between institutions and care providers without breaching patient privacy or autonomy. This can be achieved with relatively minor enhancements of existing infrastructures and does not require extensive investment in inter-operating electronic records: indeed, such investments to date have been shown not to materially improve data sharing. Requirements for Privacy: There is an ethical duty as well as a legal obligation on the part of care providers (and record keepers) to keep patient information confidential and to share it only with the authorisation of the patient. To achieve this, information storage, retrieval, and communication systems must be appropriately configured. Patients may consult clinicians anywhere and at any time: therefore their data must be available for recipient-driven retrieval under patient control and kept private.

  14. Co-regulation in EU personal data protection : The case of technical standards and the privacy by design standardisation ‘mandate’

    NARCIS (Netherlands)

    Kamara, Irene

    The recently adopted General Data Protection Regulation (GDPR), a technology-neutral law, endorses self-regulatory instruments, such as certification and technical standards. Even before the adoption of the General Data Protection Regulation, standardisation activity in the field of privacy

  15. Energy-Efficient Privacy Protection for Smart Home Environments Using Behavioral Semantics

    Directory of Open Access Journals (Sweden)

    Homin Park

    2014-09-01

    Full Text Available Research on smart environments saturated with ubiquitous computing devices is rapidly advancing while raising serious privacy issues. According to recent studies, privacy concerns significantly hinder widespread adoption of smart home technologies. Previous work has shown that it is possible to infer the activities of daily living within environments equipped with wireless sensors by monitoring radio fingerprints and traffic patterns. Since data encryption cannot prevent privacy invasions exploiting transmission pattern analysis and statistical inference, various methods based on fake data generation for concealing traffic patterns have been studied. In this paper, we describe an energy-efficient, light-weight, low-latency algorithm for creating dummy activities that are semantically similar to the observed phenomena. By using these cloaking activities, the amount of  fake data transmissions can be flexibly controlled to support a trade-off between energy efficiency and privacy protection. According to the experiments using real data collected from a smart home environment, our proposed method can extend the lifetime of the network by more than 2× compared to the previous methods in the literature. Furthermore, the activity cloaking method supports low latency transmission of real data while also significantly reducing the accuracy of the wireless snooping attacks.

  16. Privacy protection for personal health information and shared care records.

    Science.gov (United States)

    Neame, Roderick L B

    2014-01-01

    The protection of personal information privacy has become one of the most pressing security concerns for record keepers: this will become more onerous with the introduction of the European General Data Protection Regulation (GDPR) in mid-2014. Many institutions, both large and small, have yet to implement the essential infrastructure for data privacy protection and patient consent and control when accessing and sharing data; even more have failed to instil a privacy and security awareness mindset and culture amongst their staff. Increased regulation, together with better compliance monitoring, has led to the imposition of increasingly significant monetary penalties for failure to protect privacy: these too are set to become more onerous under the GDPR, increasing to a maximum of 2% of annual turnover. There is growing pressure in clinical environments to deliver shared patient care and to support this with integrated information. This demands that more information passes between institutions and care providers without breaching patient privacy or autonomy. This can be achieved with relatively minor enhancements of existing infrastructures and does not require extensive investment in inter-operating electronic records: indeed such investments to date have been shown not to materially improve data sharing. REQUIREMENTS FOR PRIVACY: There is an ethical duty as well as a legal obligation on the part of care providers (and record keepers) to keep patient information confidential and to share it only with the authorisation of the patient. To achieve this, information storage, retrieval, and communication systems must be appropriately configured. There are many components of this, which are discussed in this paper. Patients may consult clinicians anywhere and at any time: therefore, their data must be available for recipient-driven retrieval (i.e. like the World Wide Web) under patient control and kept private: a method for delivering this is outlined.

  17. Privacy protection schemes for fingerprint recognition systems

    Science.gov (United States)

    Marasco, Emanuela; Cukic, Bojan

    2015-05-01

    The deployment of fingerprint recognition systems has always raised concerns related to personal privacy. A fingerprint is permanently associated with an individual and, generally, it cannot be reset if compromised in one application. Given that fingerprints are not a secret, potential misuses besides personal recognition represent privacy threats and may lead to public distrust. Privacy mechanisms control access to personal information and limit the likelihood of intrusions. In this paper, image- and feature-level schemes for privacy protection in fingerprint recognition systems are reviewed. Storing only key features of a biometric signature can reduce the likelihood of biometric data being used for unintended purposes. In biometric cryptosystems and biometric-based key release, the biometric component verifies the identity of the user, while the cryptographic key protects the communication channel. In transformation-based approaches, only a transformed version of the original biometric signature is stored. Different applications can use different transforms. Matching is performed in the transformed domain, which enables the preservation of low error rates. Since such templates do not reveal information about individuals, they are referred to as cancelable templates. A compromised template can be re-issued using a different transform. At the image level, de-identification schemes can remove identifiers disclosed for objectives unrelated to the original purpose, while permitting other authorized uses of personal information. Fingerprint images can be de-identified by, for example, mixing fingerprints or removing the gender signature. In both cases, degradation of matching performance is minimized.
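
    As a loose illustration of the transformation-based (cancelable template) idea summarised above, the sketch below applies a key-derived random projection to a fixed-length feature vector, matches in the transformed domain, and revokes a compromised template by changing the key. The feature dimensions, threshold and projection are illustrative assumptions, not any of the specific schemes reviewed in the paper.

```python
import numpy as np

def make_transform(key: int, dim_in: int, dim_out: int) -> np.ndarray:
    """Derive a user/application-specific random projection from a key.
    Re-issuing a template just means choosing a new key."""
    rng = np.random.default_rng(key)
    return rng.standard_normal((dim_out, dim_in))

def cancelable_template(features: np.ndarray, key: int, dim_out: int = 64) -> np.ndarray:
    """Store only the transformed (many-to-one, dimension-reducing) version."""
    projection = make_transform(key, features.size, dim_out)
    return projection @ features

def match(stored: np.ndarray, probe_features: np.ndarray, key: int,
          threshold: float = 0.9) -> bool:
    """Matching is performed entirely in the transformed domain."""
    probe = cancelable_template(probe_features, key, stored.size)
    cos = float(stored @ probe / (np.linalg.norm(stored) * np.linalg.norm(probe)))
    return cos >= threshold

if __name__ == "__main__":
    enrol = np.random.default_rng(0).standard_normal(256)      # stand-in for minutiae-derived features
    probe = enrol + 0.05 * np.random.default_rng(1).standard_normal(256)  # noisy re-capture
    template = cancelable_template(enrol, key=1234)
    print("genuine match:", match(template, probe, key=1234))
    print("after revocation (new key):", match(template, probe, key=9999))
```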

  18. LEA in Private: A Privacy and Data Protection Framework for a Learning Analytics Toolbox

    Science.gov (United States)

    Steiner, Christina M.; Kickmeier-Rust, Michael D.; Albert, Dietrich

    2016-01-01

    To find a balance between learning analytics research and individual privacy, learning analytics initiatives need to appropriately address ethical, privacy, and data protection issues. A range of general guidelines, model codes, and principles for handling ethical issues and for appropriate data and privacy protection are available, which may…

  19. Technical Privacy Metrics: a Systematic Survey

    OpenAIRE

    Wagner, Isabel; Eckhoff, David

    2018-01-01

    The goal of privacy metrics is to measure the degree of privacy enjoyed by users in a system and the amount of protection offered by privacy-enhancing technologies. In this way, privacy metrics contribute to improving user privacy in the digital world. The diversity and complexity of privacy metrics in the literature makes an informed choice of metrics challenging. As a result, instead of using existing metrics, n...

  20. Beyond individual-centric privacy : Information technology in social systems

    NARCIS (Netherlands)

    Pieters, W.

    2017-01-01

    In the public debate, social implications of information technology are mainly seen through the privacy lens. Impact assessments of information technology are also often limited to privacy impact assessments, which are focused on individual rights and well-being, as opposed to the social

  1. From Data Privacy to Location Privacy

    Science.gov (United States)

    Wang, Ting; Liu, Ling

    Over the past decade, the research on data privacy has achieved considerable advancement in the following two aspects: First, a variety of privacy threat models and privacy principles have been proposed, aiming at providing sufficient protection against different types of inference attacks; Second, a plethora of algorithms and methods have been developed to implement the proposed privacy principles, while attempting to optimize the utility of the resulting data. The first part of the chapter presents an overview of data privacy research by taking a close examination at the achievements from the above two aspects, with the objective of pinpointing individual research efforts on the grand map of data privacy protection. As a special form of data privacy, location privacy possesses its unique characteristics. In the second part of the chapter, we examine the research challenges and opportunities of location privacy protection, in a perspective analogous to data privacy. Our discussion attempts to answer the following three questions: (1) Is it sufficient to apply the data privacy models and algorithms developed to date for protecting location privacy? (2) What is the current state of the research on location privacy? (3) What are the open issues and technical challenges that demand further investigation? Through answering these questions, we intend to provide a comprehensive review of the state of the art in location privacy research.

  2. Review of the model of technological pragmatism considering privacy and security

    Directory of Open Access Journals (Sweden)

    Kovačević-Lepojević Marina M.

    2013-01-01

    Full Text Available The model of technological pragmatism assumes awareness that technological development involves both benefits and dangers. Most modern security technologies represent citizens' mass surveillance tools, which can lead to compromising a significant amount of personal data due to the lack of institutional monitoring and control. On the other hand, people are interested in improving crime control and reducing the fear of potential victimization which this framework provides as a rational justification for the apparent loss of privacy, personal rights and freedoms. Citizens' perception on the categories of security and privacy, and their balancing, can provide the necessary guidelines to regulate the application of security technologies in the actual context. The aim of this paper is to analyze the attitudes of students at the University of Belgrade (N = 269 toward the application of security technology and identification of the key dimensions. On the basis of the relevant research the authors have formed assumptions about the following dimensions: security, privacy, trust in institutions and concern about the misuse of security technology. The Prise Questionnaire on Security Technology and Privacy was used for data collection. Factor analysis abstracted eight factors which together account for 58% of variance, with the highest loading of the four factors that are identified as security, privacy, trust and concern. The authors propose a model of technological pragmatism considering security and privacy. The data also showed that students are willing to change their privacy for the purpose of improving security and vice versa.

  3. Blood rights: the body and information privacy.

    Science.gov (United States)

    Alston, Bruce

    2005-05-01

    Genetic and other medical technology makes blood, human tissue and other bodily samples an immediate and accessible source of comprehensive personal and health information about individuals. Yet, unlike medical records, bodily samples are not subject to effective privacy protection or other regulation to ensure that individuals have rights to control the collection, use and transfer of such samples. This article examines the existing coverage of privacy legislation, arguments in favour of baseline protection for bodily samples as sources of information and possible approaches to new regulation protecting individual privacy rights in bodily samples.

  4. Online Tracking Technologies and Web Privacy

    OpenAIRE

    Acar, Mustafa Gunes Can

    2017-01-01

    In my PhD thesis, I would like to study the problem of online privacy with a focus on Web and mobile applications. Key research questions to be addressed by my study are the following: How can we formalize and quantify web tracking? What are the threats presented against privacy by different tracking techniques such as browser fingerprinting and cookie based tracking? What kind of privacy enhancing technologies (PET) can be used to ensure privacy without degrading service quality? The stud...

  5. Privacy Information Security Classification for Internet of Things Based on Internet Data

    OpenAIRE

    Lu, Xiaofeng; Qu, Zhaowei; Li, Qi; Hui, Pan

    2015-01-01

    Many privacy protection technologies have been proposed, but most of them are independent and aim at protecting some specific kind of privacy. There has been little in-depth study of the attributes of privacy itself. To minimize the damage and influence of privacy disclosure, the most important and sensitive privacy should be preserved as a priority if not all privacy pieces can be preserved. This paper focuses on studying the attributes of privacy and proposes a privacy information security classification (P...

  6. Data Security and Privacy in Cloud Computing

    OpenAIRE

    Yunchuan Sun; Junsheng Zhang; Yongping Xiong; Guangyu Zhu

    2014-01-01

    Data security has consistently been a major issue in information technology. In the cloud computing environment, it becomes particularly serious because the data is located in different places, even across the globe. Data security and privacy protection are the two main factors behind users' concerns about cloud technology. Though many techniques on these topics in cloud computing have been investigated in both academia and industry, data security and privacy protection are becoming more impo...

  7. Utility-preserving privacy protection of textual healthcare documents.

    Science.gov (United States)

    Sánchez, David; Batet, Montserrat; Viejo, Alexandre

    2014-12-01

    The adoption of ITs by medical organisations makes possible the compilation of large amounts of healthcare data, which quite often need to be released to third parties for research or business purposes. Much of this data is of a sensitive nature, because it may include patient-related documents such as electronic healthcare records. In order to protect the privacy of individuals, several pieces of legislation on healthcare data management, which state the kind of information that should be protected, have been defined. Traditionally, to comply with current legislation, a manual redaction process is applied to patient-related documents in order to remove or black-out sensitive terms. This process is costly and time-consuming and has the undesired side effect of severely reducing the utility of the released content. Automatic methods available in the literature usually propose ad-hoc solutions that are limited to protecting specific types of structured information (e.g. e-mail addresses, social security numbers, etc.); as a result, they are hardly applicable to the sensitive entities stated in current regulations that do not present those structural regularities (e.g. diseases, symptoms, treatments, etc.). To tackle these limitations, in this paper we propose an automatic sanitisation method for textual medical documents (e.g. electronic healthcare records) that is able to protect, regardless of their structure, sensitive entities (e.g. diseases) and also those semantically related terms (e.g. symptoms) that may disclose the former ones. Contrary to redaction schemes based on term removal, our approach improves the utility of the protected output by replacing sensitive terms with appropriate generalisations retrieved from several medical and general-purpose knowledge bases. Experiments conducted on highly sensitive documents and in coherency with current regulations on healthcare data privacy show promising results in terms of the practical privacy and utility of the
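
    As a toy illustration of the replace-with-generalisation idea described above, the sketch below substitutes sensitive terms with more general concepts drawn from a small hand-made dictionary that stands in for the medical and general-purpose knowledge bases mentioned in the record. The term list and generalisations are invented for the example; the paper's actual detection of semantically related terms is not reproduced.

```python
import re

# Toy stand-in for a knowledge base: sensitive terms (and related terms that
# could disclose them) mapped to a more general concept.
GENERALISATIONS = {
    "hiv":            "chronic infectious disease",
    "aids":           "chronic infectious disease",
    "antiretroviral": "long-term medication",
    "chemotherapy":   "oncological treatment",
    "melanoma":       "skin condition",
}

def sanitise(text: str) -> str:
    """Replace each protected term with its generalisation instead of blacking
    it out, so the released document keeps some analytical utility."""
    pattern = re.compile(r"\b(" + "|".join(map(re.escape, GENERALISATIONS)) + r")\b",
                         re.IGNORECASE)
    return pattern.sub(lambda m: GENERALISATIONS[m.group(0).lower()], text)

if __name__ == "__main__":
    record = "Patient started antiretroviral therapy after HIV diagnosis."
    print(sanitise(record))
    # -> "Patient started long-term medication therapy after chronic infectious disease diagnosis."
```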

  8. Computers, privacy and data protection an element of choice

    CERN Document Server

    Gutwirth, Serge; De Hert, Paul; Leenes, Ronald

    2011-01-01

    This timely volume presents current developments in ICT and privacy/data protection. Readers will find an alternative view of the Data Protection Directive, the contentious debates on data sharing with the USA (SWIFT, PNR), and the judicial and political resistance against data retention.

  9. The Protection of the Image and Privacy in France

    Directory of Open Access Journals (Sweden)

    Leonardo Estevam de Assis Zanini

    2018-03-01

    Full Text Available This article analyzes the emergence and development of the protection of the image and of privacy in France. It emphasizes that initially the defense of these rights was solely the work of the courts, which created rules applicable to concrete cases. The courts relied on the general clause of civil liability, because there was no developed doctrine on personality rights. Subsequently, the matter also became an object of study for French legal scholars. Unlike Germany, which granted protection very early, France only regulated these rights with the promulgation of Law 70-643 of 17 July 1970, which introduced the right to privacy into Article 9 of the French Civil Code. This norm reinforced the protection of the personality, but it remains to be seen whether there has also been a corresponding improvement in the protection of the image in France, which we examine in this article.

  10. Digital privacy in the marketplace perspectives on the information exchange

    CERN Document Server

    Milne, George

    2015-01-01

    Digital Privacy in the Marketplace focuses on the data exchanges between marketers and consumers, with special attention to the privacy challenges brought about by new information technologies. The purpose of this book is to provide a background source to help the reader think more deeply about the impact of privacy issues on both consumers and marketers. It covers topics such as: why privacy is needed, the technological, historical and academic theories of privacy, how market exchange affects privacy, what privacy harms and protections are available, and what the likely future of privacy is.

  11. Large-scale Health Information Database and Privacy Protection.

    Science.gov (United States)

    Yamamoto, Ryuichi

    2016-09-01

    Japan was once progressive in the digitalization of healthcare fields but unfortunately has fallen behind in terms of the secondary use of data for public interest. There has recently been a trend to establish large-scale health databases in the nation, and a conflict between data use for public interest and privacy protection has surfaced as this trend has progressed. Databases for health insurance claims or for specific health checkups and guidance services were created according to the law that aims to ensure healthcare for the elderly; however, there is no mention in the act about using these databases for public interest in general. Thus, an initiative for such use must proceed carefully and attentively. The PMDA projects that collect a large amount of medical record information from large hospitals and the health database development project that the Ministry of Health, Labour and Welfare (MHLW) is working on will soon begin to operate according to a general consensus; however, the validity of this consensus can be questioned if issues of anonymity arise. The likelihood that researchers conducting a study for public interest would intentionally invade the privacy of their subjects is slim. However, patients could develop a sense of distrust about their data being used since legal requirements are ambiguous. Nevertheless, without using patients' medical records for public interest, progress in medicine will grind to a halt. Proper legislation that is clear for both researchers and patients will therefore be highly desirable. A revision of the Act on the Protection of Personal Information is currently in progress. In reality, however, privacy is not something that laws alone can protect; it will also require guidelines and self-discipline. We now live in an information capitalization age. I will introduce the trends in legal reform regarding healthcare information and discuss some basics to help people properly face the issue of health big data and privacy

  12. Are privacy-enhancing technologies for genomic data ready for the clinic? A survey of medical experts of the Swiss HIV Cohort Study.

    Science.gov (United States)

    Raisaro, Jean-Louis; McLaren, Paul J; Fellay, Jacques; Cavassini, Matthias; Klersy, Catherine; Hubaux, Jean-Pierre

    2018-03-01

    Protecting patient privacy is a major obstacle for the implementation of genomic-based medicine. Emerging privacy-enhancing technologies can become key enablers for managing sensitive genetic data. We studied physicians' attitude toward this kind of technology in order to derive insights that might foster their future adoption for clinical care. We conducted a questionnaire-based survey among 55 physicians of the Swiss HIV Cohort Study who tested the first implementation of a privacy-preserving model for delivering genomic test results. We evaluated their feedback on three different aspects of our model: clinical utility, ability to address privacy concerns and system usability. 38/55 (69%) physicians participated in the study. Two thirds of them acknowledged genetic privacy as a key aspect that needs to be protected to help building patient trust and deploy new-generation medical information systems. All of them successfully used the tool for evaluating their patients' pharmacogenomics risk and 90% were happy with the user experience and the efficiency of the tool. Only 8% of physicians were unsatisfied with the level of information and wanted to have access to the patient's actual DNA sequence. This survey, although limited in size, represents the first evaluation of privacy-preserving models for genomic-based medicine. It has allowed us to derive unique insights that will improve the design of these new systems in the future. In particular, we have observed that a clinical information system that uses homomorphic encryption to provide clinicians with risk information based on sensitive genetic test results can offer information that clinicians feel sufficient for their needs and appropriately respectful of patients' privacy. The ability of this kind of systems to ensure strong security and privacy guarantees and to provide some analytics on encrypted data has been assessed as a key enabler for the management of sensitive medical information in the near future

  13. Young adult females' views regarding online privacy protection at two time points.

    Science.gov (United States)

    Moreno, Megan A; Kelleher, Erin; Ameenuddin, Nusheen; Rastogi, Sarah

    2014-09-01

    Risks associated with adolescent Internet use include exposure to inappropriate information and privacy violations. Privacy expectations and policies have changed over time. Recent Facebook security setting changes heighten these risks. The purpose of this study was to investigate views and experiences with Internet safety and privacy protection among older adolescent females at two time points, in 2009 and 2012. Two waves of focus groups were conducted, one in 2009 and the other in 2012. During these focus groups, female university students discussed Internet safety risks and strategies and privacy protection. All focus groups were audio recorded and manually transcribed. Qualitative analysis was conducted at the end of each wave and then reviewed and combined in a separate analysis using the constant comparative method. A total of 48 females participated across the two waves. The themes included (1) abundant urban myths, such as the ability for companies to access private information; (2) the importance of filtering one's displayed information; and (3) maintaining age limits on social media access to avoid younger teens' presence on Facebook. The findings present a complex picture of how adolescents view privacy protection and online safety. Older adolescents may be valuable partners in promoting safe and age-appropriate Internet use for younger teens in the changing landscape of privacy. Copyright © 2014. Published by Elsevier Inc.

  14. The privacy concerns in location based services: protection approaches and remaining challenges

    OpenAIRE

    Basiri, Anahid; Moore, Terry; Hill, Chris

    2016-01-01

    Despite the growth in the developments of the Location Based Services (LBS) applications, there are still several challenges remaining. One of the most important concerns about LBS, shared by many users and service providers is the privacy. Privacy has been considered as a big threat to the adoption of LBS among many users and consequently to the growth of LBS markets. This paper discusses the privacy concerns associated with location data, and the current privacy protection approaches. It re...

  15. An efficient reversible privacy-preserving data mining technology over data streams.

    Science.gov (United States)

    Lin, Chen-Yi; Kao, Yuan-Hung; Lee, Wei-Bin; Chen, Rong-Chang

    2016-01-01

    With the popularity of smart handheld devices and the emergence of cloud computing, users and companies can save various data, which may contain private data, to the cloud. Topics relating to data security have therefore received much attention. This study focuses on data stream environments and uses the concept of a sliding window to design a reversible privacy-preserving technology to process continuous data in real time, known as a continuous reversible privacy-preserving (CRP) algorithm. Data with CRP algorithm protection can be accurately recovered through a data recovery process. In addition, by using an embedded watermark, the integrity of the data can be verified. The results from the experiments show that, compared to existing algorithms, CRP is better at preserving knowledge and is more effective in terms of reducing information loss and privacy disclosure risk. In addition, it takes far less time for CRP to process continuous data than existing algorithms. As a result, CRP is confirmed as suitable for data stream environments and fulfills the requirements of being lightweight and energy-efficient for smart handheld devices.
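
    The record does not detail the CRP algorithm, so the sketch below is not that construction; it only illustrates the general notion of a reversible, key-dependent perturbation of a numeric stream, where noise derived from a secret key and the stream position can be regenerated and subtracted exactly during recovery. The sliding-window processing and watermark embedding of the paper are not modelled, and all names and parameters are illustrative.

```python
import hashlib
import struct

def keyed_noise(key: bytes, index: int, scale: int = 100) -> int:
    """Deterministic noise in [-scale, scale] derived from (key, stream position);
    anyone holding the key can regenerate it and remove it exactly."""
    digest = hashlib.sha256(key + struct.pack(">Q", index)).digest()
    return int.from_bytes(digest[:8], "big") % (2 * scale + 1) - scale

def publish(key: bytes, stream):
    """Perturb each value of a numeric stream before releasing it."""
    return [value + keyed_noise(key, i) for i, value in enumerate(stream)]

def recover(key: bytes, perturbed):
    """Exactly reverse the perturbation, given the key."""
    return [value - keyed_noise(key, i) for i, value in enumerate(perturbed)]

if __name__ == "__main__":
    key = b"shared-secret"
    original = [10.0, 12.5, 11.0, 30.0]
    released = publish(key, original)
    print(released)
    print(recover(key, released))   # == original
```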

  16. Selling health data: de-identification, privacy, and speech.

    Science.gov (United States)

    Kaplan, Bonnie

    2015-07-01

    Two court cases that involve selling prescription data for pharmaceutical marketing affect biomedical informatics, patient and clinician privacy, and regulation. Sorrell v. IMS Health Inc. et al. in the United States and R v. Department of Health, Ex Parte Source Informatics Ltd. in the United Kingdom concern privacy and health data protection, data de-identification and reidentification, drug detailing (marketing), commercial benefit from the required disclosure of personal information, clinician privacy and the duty of confidentiality, beneficial and unsavory uses of health data, regulating health technologies, and considering data as speech. Individuals should, at the very least, be aware of how data about them are collected and used. Taking account of how those data are used is needed so societal norms and law evolve ethically as new technologies affect health data privacy and protection.

  17. A Privacy-Protecting Authentication Scheme for Roaming Services with Smart Cards

    Science.gov (United States)

    Son, Kyungho; Han, Dong-Guk; Won, Dongho

    In this work we propose a novel smart card based privacy-protecting authentication scheme for roaming services. Our proposal achieves so-called Class 2 privacy protection, i.e., no information identifying a roaming user or linking the user's behaviors is revealed in a visited network. It can be used to overcome the inherent structural flaws of recently issued smart card based anonymous authentication schemes. As shown in our analysis, our scheme is computationally efficient for a mobile user.

  18. Development of measures of online privacy concern and protection for use on the Internet

    OpenAIRE

    Buchanan, T; Paine, C; Joinson, A; Reips, U D

    2007-01-01

    As the Internet grows in importance, concerns about online privacy have arisen. We describe the development and validation of three short Internet-administered scales measuring privacy related attitudes ('Privacy Concern') and behaviors ('General Caution' and 'Technical Protection').

  19. Syllabus for Privacy and Information Technology, Fall 2017, UCLA Information Studies

    OpenAIRE

    Borgman, Christine L.

    2017-01-01

    Privacy is a broad topic that covers many disciplines, stakeholders, and concerns. This course addresses the intersection of privacy and information technology, surveying a wide array of topics of concern for research and practice in the information fields. Among the topics covered are the history and changing contexts of privacy; privacy risks and harms; law, policies, and practices; privacy in searching for information, in reading, and in libraries; surveillance, networks, and privacy by de...

  20. Fourteen Reasons Privacy Matters: A Multidisciplinary Review of Scholarly Literature

    Science.gov (United States)

    Magi, Trina J.

    2011-01-01

    Librarians have long recognized the importance of privacy to intellectual freedom. As digital technology and its applications advance, however, efforts to protect privacy may become increasingly difficult. With some users behaving in ways that suggest they do not care about privacy and with powerful voices claiming that privacy is dead, librarians…

  1. Libraries Protecting Privacy on Social Media: Sharing without "Oversharing"

    Directory of Open Access Journals (Sweden)

    Kelley Cotter

    2016-11-01

    Full Text Available Libraries have increasingly adopted social media as an integral means of connecting with their users. However, social media presents many potential concerns regarding library patron privacy. This article presents the findings from a study of how librarians and library staff perceive and handle issues of patron privacy related to social media marketing in libraries. The study reports the results from a mixed-methods online survey, which used a nonprobability self-selection sampling method to collect responses from individuals employed by libraries, without restrictions on position or library type. Nearly three-quarters of respondents reported working in libraries that have either an official or unofficial social media policy. Approximately 53% of those policies mention patron privacy. The findings suggest that many respondents’ views and practices are influenced by the perception of the library’s physical space and social media presence as public places. The findings also suggest a lack of consensus regarding the extent of the library’s obligation to protect patron privacy on library social media sites and what would constitute a violation of privacy.

  2. Accountability as a Way Forward for Privacy Protection in the Cloud

    Science.gov (United States)

    Pearson, Siani; Charlesworth, Andrew

    The issue of how to provide appropriate privacy protection for cloud computing is important, and as yet unresolved. In this paper we propose an approach in which procedural and technical solutions are co-designed to demonstrate accountability as a path forward to resolving jurisdictional privacy and security risks within the cloud.

  3. Data privacy foundations, new developments and the big data challenge

    CERN Document Server

    Torra, Vicenç

    2017-01-01

    This book offers a broad, cohesive overview of the field of data privacy. It discusses, from a technological perspective, the problems and solutions of the three main communities working on data privacy: statistical disclosure control (those with a statistical background), privacy-preserving data mining (those working with databases and data mining), and privacy-enhancing technologies (those involved in communications and security). Presenting different approaches, the book describes alternative privacy models and disclosure risk measures as well as data protection procedures for respondent, holder and user privacy. It also discusses specific data privacy problems and solutions for readers who need to deal with big data.
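
    One classical privacy model covered by such overviews is k-anonymity from the statistical disclosure control literature. The minimal sketch below only checks the k of a generalised table over a chosen set of quasi-identifiers; the table values are made up for the example and the check is not taken from the book.

```python
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Return the k of the table: the size of the smallest group of records
    sharing the same combination of quasi-identifier values."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(groups.values())

if __name__ == "__main__":
    table = [
        {"zip": "476**", "age": "2*", "disease": "flu"},
        {"zip": "476**", "age": "2*", "disease": "asthma"},
        {"zip": "479**", "age": "3*", "disease": "flu"},
        {"zip": "479**", "age": "3*", "disease": "cancer"},
    ]
    print(k_anonymity(table, quasi_identifiers=["zip", "age"]))  # -> 2
```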

  4. Hacktivism 1-2-3: how privacy enhancing technologies change the face of anonymous hacktivism

    Directory of Open Access Journals (Sweden)

    Balázs Bodó

    2014-11-01

    Full Text Available This short essay explores how the notion of hacktivism changes due to easily accessible, military-grade Privacy Enhancing Technologies (PETs). Privacy Enhancing Technologies, technological tools which provide anonymous communications and protect users from online surveillance, enable new forms of online political activism. Through a short summary of the ad-hoc vigilante group Anonymous, this article describes hacktivism 1.0 as electronic civil disobedience conducted by outsiders. Through the analysis of Wikileaks, the anonymous whistleblowing website, it describes how strong PETs enable the development of hacktivism 2.0, where the source of threat is shifted from outsiders to insiders. Insiders have access to documents with which power can be exposed, and, by using PETs, they can anonymously engage in political action. We also describe the emergence of a third generation of hacktivists who use PETs to disengage and create their own autonomous spaces rather than to engage with power through anonymous whistleblowing.

  5. Privacy Bridges: EU and US Privacy Experts In Search of Transatlantic Privacy Solutions

    NARCIS (Netherlands)

    Abramatic, J.-F.; Bellamy, B.; Callahan, M.E.; Cate, F.; van Eecke, P.; van Eijk, N.; Guild, E.; de Hert, P.; Hustinx, P.; Kuner, C.; Mulligan, D.; O'Connor, N.; Reidenberg, J.; Rubinstein, I.; Schaar, P.; Shadbolt, N.; Spiekermann, S.; Vladeck, D.; Weitzner, D.J.; Zuiderveen Borgesius, F.; Hagenauw, D.; Hijmans, H.

    2015-01-01

    The EU and US share a common commitment to privacy protection as a cornerstone of democracy. Following the Treaty of Lisbon, data privacy is a fundamental right that the European Union must proactively guarantee. In the United States, data privacy derives from constitutional protections in the

  6. Protecting location privacy for outsourced spatial data in cloud storage.

    Science.gov (United States)

    Tian, Feng; Gui, Xiaolin; An, Jian; Yang, Pan; Zhao, Jianqiang; Zhang, Xuejun

    2014-01-01

    As cloud computing services and location-aware devices are fully developed, a large amount of spatial data needs to be outsourced to cloud storage providers, so research on privacy protection for outsourced spatial data is getting increasing attention from academia and industry. As a kind of spatial transformation method, the Hilbert curve is widely used to protect the location privacy of spatial data. But sufficient security analysis for the standard Hilbert curve (SHC) has seldom been conducted. In this paper, we propose an index modification method for SHC (SHC*) and a density-based space filling curve (DSC) to improve the security of SHC; they can partially violate the distance-preserving property of SHC, so as to achieve better security. We formally define the indistinguishability and attack model for measuring the privacy disclosure risk of spatial transformation methods. The evaluation results indicate that SHC* and DSC are more secure than SHC, and DSC achieves the best index generation performance.
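
    The SHC* and DSC constructions are not reproduced in the record. For reference, the sketch below implements only the standard Hilbert-curve mapping from a grid cell to its one-dimensional index (the well-known xy-to-index conversion) that such spatial transformations start from; the grid order and example points are arbitrary.

```python
def hilbert_index(order: int, x: int, y: int) -> int:
    """Map a cell (x, y) of a 2**order x 2**order grid to its position along the
    Hilbert curve (standard xy-to-index conversion)."""
    n = 1 << order
    d = 0
    s = n >> 1
    while s > 0:
        rx = 1 if (x & s) else 0
        ry = 1 if (y & s) else 0
        d += s * s * ((3 * rx) ^ ry)
        # Rotate/flip the quadrant so the recursion stays consistent.
        if ry == 0:
            if rx == 1:
                x = n - 1 - x
                y = n - 1 - y
            x, y = y, x
        s >>= 1
    return d

if __name__ == "__main__":
    # A location service could store or publish only the transformed index.
    for x, y in [(0, 0), (1, 0), (1, 1), (0, 1)]:
        print((x, y), "->", hilbert_index(1, x, y))   # 0, 3, 2, 1
```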

  7. Privacy by design in personal health monitoring.

    Science.gov (United States)

    Nordgren, Anders

    2015-06-01

    The concept of privacy by design is becoming increasingly popular among regulators of information and communications technologies. This paper aims at analysing and discussing the ethical implications of this concept for personal health monitoring. I assume a privacy theory of restricted access and limited control. On the basis of this theory, I suggest a version of the concept of privacy by design that constitutes a middle road between what I call broad privacy by design and narrow privacy by design. The key feature of this approach is that it attempts to balance automated privacy protection and autonomously chosen privacy protection in a way that is context-sensitive. In personal health monitoring, this approach implies that in some contexts like medication assistance and monitoring of specific health parameters one single automatic option is legitimate, while in some other contexts, for example monitoring in which relatives are receivers of health-relevant information rather than health care professionals, a multi-choice approach stressing autonomy is warranted.

  8. Privacy protection on the internet: The European model

    Directory of Open Access Journals (Sweden)

    Baltezarević Vesna

    2017-01-01

    Full Text Available The Internet has a huge impact on all areas of social activity. Everyday life, social interaction and economics are directed towards new information and communication technologies. A positive aspect of the new technology is reflected in the fact that it has created a virtual space that has led to the elimination of various barriers, which has enabled interaction and information exchange across the world. Inclusion in the virtual social network provides connectivity for communicators who are looking for a space that allows them freedom of expression and connects them with new 'friends'. Because of the feeling of complete freedom and the absence of censorship on the network, communicators leave many personal details and photos, without thinking about the possible abuses of privacy. Recording of the different incidents on the network has resulted in the need to take precautionary measures, in order to protect the users and the rule of law, given that freedom on the network is only possible with the existence of an adequate system of safety and security. In this paper we deal with the problem of the protection of personal data of users of virtual social networks against malicious activity and abuse, with special reference to the activities of the European Union in an effort to regulate this area. The European Commission has concentrated on finding the best solutions to protect the user's virtual space for more than two decades, starting in 1995 and culminating in a directive on the security of networks and information systems, adopted in the first half of 2016.

  9. Large-scale Health Information Database and Privacy Protection*1

    Science.gov (United States)

    YAMAMOTO, Ryuichi

    2016-01-01

    Japan was once progressive in the digitalization of healthcare fields but unfortunately has fallen behind in terms of the secondary use of data for public interest. There has recently been a trend to establish large-scale health databases in the nation, and a conflict between data use for public interest and privacy protection has surfaced as this trend has progressed. Databases for health insurance claims or for specific health checkups and guidance services were created according to the law that aims to ensure healthcare for the elderly; however, there is no mention in the act about using these databases for public interest in general. Thus, an initiative for such use must proceed carefully and attentively. The PMDA*2 projects that collect a large amount of medical record information from large hospitals and the health database development project that the Ministry of Health, Labour and Welfare (MHLW) is working on will soon begin to operate according to a general consensus; however, the validity of this consensus can be questioned if issues of anonymity arise. The likelihood that researchers conducting a study for public interest would intentionally invade the privacy of their subjects is slim. However, patients could develop a sense of distrust about their data being used since legal requirements are ambiguous. Nevertheless, without using patients’ medical records for public interest, progress in medicine will grind to a halt. Proper legislation that is clear for both researchers and patients will therefore be highly desirable. A revision of the Act on the Protection of Personal Information is currently in progress. In reality, however, privacy is not something that laws alone can protect; it will also require guidelines and self-discipline. We now live in an information capitalization age. I will introduce the trends in legal reform regarding healthcare information and discuss some basics to help people properly face the issue of health big data and privacy

  10. A Strategy toward Collaborative Filter Recommended Location Service for Privacy Protection

    Science.gov (United States)

    Wang, Peng; Yang, Jing; Zhang, Jianpei

    2018-01-01

    A new collaborative filtered recommendation strategy was proposed for existing privacy and security issues in location services. In this strategy, every user establishes his/her own position profiles according to their daily position data, which is preprocessed using a density clustering method. Then, density prioritization was used to choose similar user groups as service request responders and the neighboring users in the chosen groups recommended appropriate location services using a collaborative filter recommendation algorithm. The two filter algorithms based on position profile similarity and position point similarity measures were designed in the recommendation, respectively. At the same time, the homomorphic encryption method was used to transfer location data for effective protection of privacy and security. A real location dataset was applied to test the proposed strategy and the results showed that the strategy provides better location service and protects users’ privacy. PMID:29751670
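
    The record names position-profile similarity and a collaborative filter but does not give the formulas. The sketch below shows only that part, using cosine similarity over visit-frequency profiles as a stand-in for the paper's similarity measures; the density clustering and the homomorphic-encryption transport of location data are not reproduced, and all profile and service names are invented.

```python
import math

def cosine(u, v):
    """Cosine similarity between two position profiles, each a mapping from a
    discretised location (e.g. a grid cell) to visit frequency."""
    common = set(u) & set(v)
    num = sum(u[k] * v[k] for k in common)
    den = math.sqrt(sum(x * x for x in u.values())) * math.sqrt(sum(x * x for x in v.values()))
    return num / den if den else 0.0

def recommend(target_profile, neighbour_ratings, neighbour_profiles, top_n=2):
    """Weight each neighbour's service ratings by how similar their position
    profile is to the requesting user's, then return the best-scoring services."""
    scores = {}
    for user, ratings in neighbour_ratings.items():
        weight = cosine(target_profile, neighbour_profiles[user])
        for service, rating in ratings.items():
            scores[service] = scores.get(service, 0.0) + weight * rating
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

if __name__ == "__main__":
    target = {"cell_12": 5, "cell_13": 2}
    profiles = {"alice": {"cell_12": 4, "cell_20": 1}, "bob": {"cell_40": 6}}
    ratings = {"alice": {"cafe_A": 5, "gym_B": 3}, "bob": {"cafe_C": 5}}
    print(recommend(target, ratings, profiles))   # e.g. ['cafe_A', 'gym_B']
```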

  11. A Strategy toward Collaborative Filter Recommended Location Service for Privacy Protection.

    Science.gov (United States)

    Wang, Peng; Yang, Jing; Zhang, Jianpei

    2018-05-11

    A new collaborative filtered recommendation strategy was proposed for existing privacy and security issues in location services. In this strategy, every user establishes his/her own position profiles according to their daily position data, which is preprocessed using a density clustering method. Then, density prioritization was used to choose similar user groups as service request responders and the neighboring users in the chosen groups recommended appropriate location services using a collaborative filter recommendation algorithm. The two filter algorithms based on position profile similarity and position point similarity measures were designed in the recommendation, respectively. At the same time, the homomorphic encryption method was used to transfer location data for effective protection of privacy and security. A real location dataset was applied to test the proposed strategy and the results showed that the strategy provides better location service and protects users' privacy.

  12. Towards quantitative evaluation of privacy protection schemes for electricity usage data sharing

    Directory of Open Access Journals (Sweden)

    Daisuke Mashima

    2018-03-01

    Full Text Available Thanks to the roll-out of smart meters, availability of fine-grained electricity usage data has rapidly grown. Such data has enabled utility companies to perform robust and efficient grid operations. However, at the same time, privacy concerns associated with sharing and disclosure of such data have been raised. In this paper, we first demonstrate the feasibility of estimating privacy-sensitive household attributes based solely on the energy usage data of residential customers. We then discuss a framework to measure privacy gain and evaluate the effectiveness of customer-centric privacy-protection schemes, namely redaction of data irrelevant to services and addition of bounded artificial noise. Keywords: Privacy, Smart meter data, Quantitative evaluation
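
    One of the customer-centric schemes evaluated above is the addition of bounded artificial noise. A minimal sketch of that idea is shown below, assuming a half-hourly kWh load profile and a hypothetical noise bound; the paper's privacy-gain metric and redaction scheme are not reproduced.

```python
import random

def add_bounded_noise(readings, bound_kwh=0.05, seed=None):
    """Perturb each meter reading with uniform noise limited to +/- bound_kwh,
    clamping at zero so the published profile stays physically plausible."""
    rng = random.Random(seed)
    return [max(0.0, r + rng.uniform(-bound_kwh, bound_kwh)) for r in readings]

if __name__ == "__main__":
    profile = [0.21, 0.18, 0.35, 1.40, 0.90, 0.22]   # kWh per half hour
    print(add_bounded_noise(profile, bound_kwh=0.05, seed=42))
```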

  13. Privacy-preserving Identity Management

    OpenAIRE

    Milutinovic, Milica

    2015-01-01

    With the technological advances and the evolution of online services, user privacy is becoming a crucial issue in the modern day society. Privacy in the general sense refers to individuals’ ability to protect information about themselves and selectively present it to other entities. This concept is nowadays strongly affected by everyday practices that assume personal data disclosure, such as online shopping and participation in loyalty schemes. This makes it difficult for an individual to con...

  14. Efficient task assignment in spatial crowdsourcing with worker and task privacy protection

    KAUST Repository

    Liu, An

    2017-08-01

    Spatial crowdsourcing (SC) outsources tasks to a set of workers who are required to physically move to specified locations and accomplish tasks. Recently, it has been emerging as a promising tool for emergency management, as it enables efficient and cost-effective collection of critical information in emergencies such as earthquakes, when searching for and rescuing survivors in potential areas is required. However, in current SC systems, task locations and worker locations are all exposed in public without any privacy protection. If attacked, SC systems thus carry a potential risk of privacy leakage. In this paper, we propose a protocol for protecting the privacy of both workers and task requesters while maintaining the functionality of SC systems. The proposed protocol is built on partially homomorphic encryption schemes, and can efficiently realize complex operations required during task assignment over encrypted data through a well-designed computation strategy. We prove that the proposed protocol is privacy-preserving against semi-honest adversaries. Simulation on two real-world datasets shows that the proposed protocol is more effective than existing solutions and can achieve mutual privacy preservation with acceptable computation and communication cost.
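
    The protocol itself is not reproduced in the record. Since it is said to be built on partially homomorphic encryption, the toy Paillier sketch below (fixed small primes, deliberately insecure, demo only) illustrates the additive homomorphism that such protocols rely on for computing over encrypted values, e.g. summing encrypted squared coordinate differences; all parameter choices are illustrative.

```python
import math
import random

def lcm(a, b):
    return a * b // math.gcd(a, b)

def keygen(p=104729, q=104723):
    """Toy Paillier key generation with fixed small primes (demo only, insecure)."""
    n = p * q
    return {"n": n, "g": n + 1}, {"n": n, "lam": lcm(p - 1, q - 1)}

def encrypt(pk, m):
    n, g = pk["n"], pk["g"]
    n2 = n * n
    while True:
        r = random.randrange(2, n)
        if math.gcd(r, n) == 1:
            break
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(sk, c):
    n, lam = sk["n"], sk["lam"]
    n2 = n * n
    u = pow(c, lam, n2)
    l = (u - 1) // n                    # L(u) = (u - 1) / n
    return (l * pow(lam, -1, n)) % n    # mu = lam^{-1} mod n, valid because g = n + 1

if __name__ == "__main__":
    pk, sk = keygen()
    a, b = 1234, 4321                   # e.g. squared coordinate differences
    ca, cb = encrypt(pk, a), encrypt(pk, b)
    # Multiplying ciphertexts adds the plaintexts -- computed without decrypting.
    print(decrypt(sk, (ca * cb) % (pk["n"] * pk["n"])))   # -> 5555
```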

  15. Usability Issues in the User Interfaces of Privacy-Enhancing Technologies

    Science.gov (United States)

    LaTouche, Lerone W.

    2013-01-01

    Privacy on the Internet has become one of the leading concerns for Internet users. These users are not wrong in their concerns if personally identifiable information is not protected and under their control. To minimize the collection of Internet users' personal information and help solve the problem of online privacy, a number of…

  16. A Strategy toward Collaborative Filter Recommended Location Service for Privacy Protection

    Directory of Open Access Journals (Sweden)

    Peng Wang

    2018-05-01

    Full Text Available A new collaborative filtered recommendation strategy was proposed for existing privacy and security issues in location services. In this strategy, every user establishes his/her own position profiles according to their daily position data, which is preprocessed using a density clustering method. Then, density prioritization was used to choose similar user groups as service request responders and the neighboring users in the chosen groups recommended appropriate location services using a collaborative filter recommendation algorithm. The two filter algorithms based on position profile similarity and position point similarity measures were designed in the recommendation, respectively. At the same time, the homomorphic encryption method was used to transfer location data for effective protection of privacy and security. A real location dataset was applied to test the proposed strategy and the results showed that the strategy provides better location service and protects users’ privacy.

  17. DE-IDENTIFICATION TECHNIQUE FOR IOT WIRELESS SENSOR NETWORK PRIVACY PROTECTION

    Directory of Open Access Journals (Sweden)

    Yennun Huang

    2017-02-01

    Full Text Available As the IoT ecosystem becomes more mature, hardware and software vendors are trying to create new value by connecting all kinds of devices together via IoT. IoT devices are usually equipped with sensors to collect data, and the collected data are transmitted over the air via different kinds of wireless connections. To extract the value of the collected data, the data owner may choose to seek third-party help with data analysis, or even release the data to the public for further insight. In this scenario it is important to protect the released data from privacy leakage. Here we propose that differential privacy, as a de-identification technique, can be a useful approach to add privacy protection to the released data, as well as to prevent the collected data from being intercepted and decoded during over-the-air transmission. A way to increase the accuracy of count queries performed on edge cases in a synthetic database is also presented in this research.
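
    A minimal sketch of the de-identification idea for count queries: add Laplace noise calibrated to the query's sensitivity before releasing the count. The epsilon value, the sensor readings, and the predicate below are illustrative assumptions, not the paper's parameters.

    ```python
    # Epsilon-differentially-private counting via the Laplace mechanism (sketch).
    import random

    def dp_count(records, predicate, epsilon=0.5):
        true_count = sum(1 for r in records if predicate(r))
        # A count query has sensitivity 1, so Laplace(0, 1/epsilon) noise suffices.
        # A Laplace sample is the difference of two independent Exp(1) variables.
        noise = (random.expovariate(1.0) - random.expovariate(1.0)) / epsilon
        return true_count + noise

    # Hypothetical IoT sensor readings: (device_id, temperature)
    readings = [("d1", 31.2), ("d2", 29.8), ("d3", 33.1), ("d4", 30.4)]
    print(dp_count(readings, lambda r: r[1] > 30.0))   # noisy version of the true count 3
    ```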

  18. Mandatory Enforcement of Privacy Policies using Trusted Computing Principles

    NARCIS (Netherlands)

    Kargl, Frank; Schaub, Florian; Dietzel, Stefan

    Modern communication systems and information technology create significant new threats to information privacy. In this paper, we discuss the need for proper privacy protection in cooperative intelligent transportation systems (cITS), one instance of such systems. We outline general principles for

  19. Genetic privacy and confidentiality: why they are so hard to protect.

    Science.gov (United States)

    Rothstein, M A

    1998-01-01

    Author notes that widespread concerns have been raised about protecting genetic privacy and confidentiality in insurance and employment. He argues that effective protections are difficult because complicated issues, such as the right of access to health care, are invariably implicated.

  20. Privacy protected text analysis in DataSHIELD

    Directory of Open Access Journals (Sweden)

    Rebecca Wilson

    2017-04-01

    Whilst it is possible to analyse free text within a DataSHIELD infrastructure, the challenge is creating generalised and resilient anti-disclosure methods for free text analysis. There are a range of biomedical and health sciences applications for DataSHIELD methods of privacy protected analysis of free text including analysis of electronic health records and analysis of qualitative data e.g. from social media.

  1. A Failure to "Do No Harm" -- India's Aadhaar biometric ID program and its inability to protect privacy in relation to measures in Europe and the U.S.

    Science.gov (United States)

    Dixon, Pam

    2017-01-01

    It is important that digital biometric identity systems be used by governments with a Do No Harm mandate, and the establishment of regulatory, enforcement and restorative frameworks ensuring data protection and privacy needs to transpire prior to the implementation of technological programs and services. However, when and where large government bureaucracies are involved, the proper planning and execution of public service programs very often result in ungainly outcomes whose quality cannot be guaranteed. Several important factors, such as the strength of the political and legal systems, may affect such cases as the implementation of a national digital identity system. Digital identity policy development, as well as technical deployment of biometric technologies and enrollment processes, may all differ markedly, and could depend in some part at least on the overall economic development of the country in question, or political jurisdiction, among other factors. This article focuses on the Republic of India's national digital biometric identity system, Aadhaar: its development, data protection and privacy policies, and impact. Two additional political jurisdictions, the European Union and the United States, are also situationally analyzed as they may be germane to data protection and privacy policies designed to safeguard biometric identities. Since biometrics are foundational elements in modern digital identity systems, the expression of data protection policies that orient and direct how biometrics are to be utilized as unique identifiers is the focus of this analysis. As more of the world's economies create and elaborate capacities, capabilities and functionalities within their respective digital ambits, it is not enough to simply install suitable digital identity technologies; much more is durably required. For example, both vigorous and descriptive means of data protection should be well situated within any jurisdictionally relevant

  2. Improving privacy protection in the area of behavioural targeting

    NARCIS (Netherlands)

    Zuiderveen Borgesius, F.J.

    2014-01-01

    This PhD thesis discusses how European law could improve privacy protection in the area of behavioural targeting. Behavioural targeting, also referred to as online profiling, involves monitoring people’s online behaviour, and using the collected information to show people individually targeted

  3. Security controls in an integrated Biobank to protect privacy in data sharing: rationale and study design.

    Science.gov (United States)

    Takai-Igarashi, Takako; Kinoshita, Kengo; Nagasaki, Masao; Ogishima, Soichi; Nakamura, Naoki; Nagase, Sachiko; Nagaie, Satoshi; Saito, Tomo; Nagami, Fuji; Minegishi, Naoko; Suzuki, Yoichi; Suzuki, Kichiya; Hashizume, Hiroaki; Kuriyama, Shinichi; Hozawa, Atsushi; Yaegashi, Nobuo; Kure, Shigeo; Tamiya, Gen; Kawaguchi, Yoshio; Tanaka, Hiroshi; Yamamoto, Masayuki

    2017-07-06

    With the goal of realizing genome-based personalized healthcare, we have developed a biobank that integrates personal health, genome, and omics data along with biospecimens donated by 150,000 volunteers. Such large-scale data integration involves obvious risks of privacy violation. The research use of personal genome and health information is a topic of global discussion with regard to the protection of privacy while promoting scientific advancement. The present paper reports on our plans, current attempts, and accomplishments in addressing security problems involved in data sharing to ensure donor privacy while promoting scientific advancement. Biospecimens and data have been collected in prospective cohort studies under a comprehensive agreement. The sample size of 150,000 participants was required for multiple research aims including genome-wide screening of gene-by-environment interactions, haplotype phasing, and parametric linkage analysis. We established the Tohoku Medical Megabank (TMM) data sharing policy: a privacy protection rule that requires physical, personnel, and technological safeguards against privacy violation regarding the use and sharing of data. The proposed policy refers to that of NCBI and that of the Sanger Institute. The proposed policy classifies shared data according to the strength of re-identification risks. Local committees organized by TMM evaluate re-identification risk and assign a security category to a dataset. Every dataset is stored in an assigned segment of a supercomputer in accordance with its security category. A security manager should be designated to handle all security problems at individual data use locations. The proposed policy requires closed networks and IP-VPN remote connections. The mission of the biobank is to distribute biological resources most productively. This mission motivated us to collect biospecimens and health data and simultaneously analyze genome/omics data in-house. The biobank also has the

  4. The study on privacy preserving data mining for information security

    Science.gov (United States)

    Li, Xiaohui

    2012-04-01

    Privacy preserving data mining has developed rapidly in recent years, but it still faces many challenges. Firstly, the level of privacy has different definitions in different fields, so the measures by which privacy preserving data mining technology protects private information are not the same; presenting a unified privacy definition and measure is therefore an urgent issue. Secondly, most research in privacy preserving data mining is presently confined to theoretical study.

  5. The Regulatory Framework for Privacy and Security

    Science.gov (United States)

    Hiller, Janine S.

    The internet enables the easy collection of massive amounts of personally identifiable information. Unregulated data collection causes distrust and conflicts with widely accepted principles of privacy. The regulatory framework in the United States for ensuring privacy and security in the online environment consists of federal, state, and self-regulatory elements. New laws have been passed to address technological and internet practices that conflict with privacy protecting policies. The United States and the European Union approaches to privacy differ significantly, and the global internet environment will likely cause regulators to face the challenge of balancing privacy interests with data collection for many years to come.

  6. Privacy versus autonomy: a tradeoff model for smart home monitoring technologies.

    Science.gov (United States)

    Townsend, Daphne; Knoefel, Frank; Goubran, Rafik

    2011-01-01

    Smart homes are proposed as a new location for the delivery of healthcare services. They provide healthcare monitoring and communication services by using integrated sensor network technologies. We validate a hypothesis regarding older adults' adoption of home monitoring technologies by conducting a literature review of articles studying older adults' attitudes and perceptions of sensor technologies. Using current literature to support the hypothesis, this paper applies the tradeoff model to decisions about sensor acceptance. Older adults are willing to trade privacy (by accepting a monitoring technology) for autonomy. As the information captured by the sensor becomes more intrusive and the infringement on privacy increases, sensors are accepted if the loss in privacy is traded for autonomy. Even video cameras, the most intrusive sensor type, were accepted in exchange for the height of autonomy, which is to remain in the home.

  7. Privacy preservation and information security protection for patients' portable electronic health records.

    Science.gov (United States)

    Huang, Lu-Chou; Chu, Huei-Chung; Lien, Chung-Yueh; Hsiao, Chia-Hung; Kao, Tsair

    2009-09-01

    As patients face the possibility of copying and keeping their electronic health records (EHRs) through portable storage media, they will encounter new risks to the protection of their private information. In this study, we propose a method to preserve the privacy and security of patients' portable medical records in portable storage media to avoid any inappropriate or unintentional disclosure. Following HIPAA guidelines, the method is designed to protect, recover and verify patient's identifiers in portable EHRs. The results of this study show that our methods are effective in ensuring both information security and privacy preservation for patients through portable storage medium.
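
    The paper's HIPAA-aligned method is not reproduced in this record; the sketch below only illustrates the general protect/recover/verify pattern for identifiers, using a keyed pseudonym (HMAC) for protection and verification and a separately stored lookup table for recovery. The key handling, fields, and storage layout are illustrative assumptions.

    ```python
    # Sketch: pseudonymize direct identifiers before copying a record to portable media,
    # keep a keyed table for recovery, and verify integrity with an HMAC tag.
    import hashlib
    import hmac
    import json
    import secrets

    KEY = secrets.token_bytes(32)          # in practice this key stays with the issuing system

    def protect(record, id_fields=("name", "mrn")):
        public, vault = dict(record), {}
        for field in id_fields:
            value = public.pop(field)
            pseudonym = hmac.new(KEY, f"{field}:{value}".encode(), hashlib.sha256).hexdigest()[:16]
            public[field] = pseudonym
            vault[pseudonym] = value       # recovery table, stored separately from the media
        tag = hmac.new(KEY, json.dumps(public, sort_keys=True).encode(), hashlib.sha256).hexdigest()
        return public, vault, tag

    def verify(public, tag):
        expected = hmac.new(KEY, json.dumps(public, sort_keys=True).encode(), hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, tag)

    def recover(public, vault, id_fields=("name", "mrn")):
        restored = dict(public)
        for field in id_fields:
            restored[field] = vault[restored[field]]
        return restored

    record = {"name": "Jane Doe", "mrn": "A1234", "glucose": 5.4}
    public, vault, tag = protect(record)
    assert verify(public, tag) and recover(public, vault) == record
    ```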

  8. The secret to health information technology's success within the diabetes patient population: a comprehensive privacy and security framework.

    Science.gov (United States)

    Pandya, Sheel M

    2010-05-01

    Congress made an unprecedented investment in health information technology (IT) when it passed the American Recovery and Reinvestment Act in February 2009. Health IT provides enormous opportunities to improve health care quality, reduce costs, and engage patients in their own care. But the potential payoff for use of health IT for diabetes care is magnified given the prevalence, cost, and complexity of the disease. However, without proper privacy and security protections in place, diabetes patient data are at risk of misuse, and patient trust in the system is undermined. We need a comprehensive privacy and security framework that articulates clear parameters for access, use, and disclosure of diabetes patient data for all entities storing and exchanging electronic data. (c) 2010 Diabetes Technology Society.

  9. Protection of the Locational Privacy Using Mosaic Theory of Data (Varstvo lokacijske zasebnosti s pomočjo mozaične teorije podatkov)

    Directory of Open Access Journals (Sweden)

    Primož Križnar

    2016-12-01

    Full Text Available The individual’s right to privacy is one of the fundamental human rights. Part of this »embedded« right is a person’s capability to move between a variety of different points and locations with a reasonable expectation that the paths taken, stops made and current locations are not systematically recorded and stored for future use. Notwithstanding this, individuals often seem to be unaware of the capabilities of modern technology, which aggressively interferes with a wide spectrum of their privacy, part of which is locational privacy. As an essential component of privacy, locational privacy must also be given the necessary legal protection, which, at least for the time being, is reflected in the implementation of the mosaic theory in European legal traditions with the help of the established legal standards of the European Court of Human Rights regarding privacy.

  10. Security and privacy issues with health care information technology.

    Science.gov (United States)

    Meingast, Marci; Roosta, Tanya; Sastry, Shankar

    2006-01-01

    The face of health care is changing as new technologies are being incorporated into the existing infrastructure. Electronic patient records and sensor networks for in-home patient monitoring are at the current forefront of new technologies. Paper-based patient records are being put in electronic format enabling patients to access their records via the Internet. Remote patient monitoring is becoming more feasible as specialized sensors can be placed inside homes. The combination of these technologies will improve the quality of health care by making it more personalized and reducing costs and medical errors. While there are benefits to technologies, associated privacy and security issues need to be analyzed to make these systems socially acceptable. In this paper we explore the privacy and security implications of these next-generation health care technologies. We describe existing methods for handling issues as well as discussing which issues need further consideration.

  11. Pervasive Computing, Privacy and Distribution of the Self

    Directory of Open Access Journals (Sweden)

    Soraj Hongladarom

    2011-05-01

    Full Text Available The emergence of what is commonly known as “ambient intelligence” or “ubiquitous computing” means that our conception of privacy and trust needs to be reconsidered. Many have voiced their concerns about the threat to privacy and the more prominent role of trust that have been brought about by emerging technologies. In this paper, I will present an investigation of what this means for the self and identity in our ambient intelligence environment. Since information about oneself can be actively distributed and processed, it is proposed that in a significant sense it is the self itself that is distributed throughout a pervasive or ubiquitous computing network when information pertaining to the self of the individual travels through the network. Hence privacy protection needs to be extended to all types of information distributed. It is also recommended that appropriately strong legislation on privacy and data protection regarding this pervasive network is necessary, but at present not sufficient, to ensure public trust. What is needed is a campaign on public awareness and positive perception of the technology.

  12. Information Privacy: The Attitudes and Behaviours of Internet Users

    OpenAIRE

    Jakovljević, Marija

    2011-01-01

    The rise of electronic commerce and the Internet have created new technologies and capabilities, which increase concern for privacy online. This study reports on the results of an investigation of Internet users attitudes towards concern for privacy online, online behaviours adopted under varying levels of concern for privacy (high, moderate and low) and the types of information Internet users are protective of. Methodological triangulation was used, whereby both quantitative and qualitative ...

  13. Online privacy: overview and preliminary research

    Directory of Open Access Journals (Sweden)

    Renata Mekovec

    2010-12-01

    Full Text Available Over the last decade using the Internet for online shopping, information browsing and searching as well as for online communication has become part of everyday life. Although the Internet technology has a lot of benefits for users, one of the most important disadvantages is related to the increasing capacity for users’ online activity surveillance. However, the users are increasingly becoming aware of online surveillance methods, which results in their increased concern for privacy protection. Numerous factors influence the way in which individuals perceive the level of privacy protection when they are online. This article provides a review of factors that influence the privacy perception of Internet users. Previous online privacy research related to e-business was predominantly focused on the dimension of information privacy and concerned with the way users’ personal information is collected, saved and used by an online company. This article’s main aim is to provide an overview of numerous Internet users’ privacy perception elements across various privacy dimensions as well as their potential categorization. In addition, considering that e-banking and online shopping are one of the most widely used e-services, an examination of online privacy perception of e-banking/online shopping users was performed.

  14. Privacy and User Experience in 21st Century Library Discovery

    Directory of Open Access Journals (Sweden)

    Shayna Pekala

    2017-06-01

    Full Text Available Over the last decade, libraries have taken advantage of emerging technologies to provide new discovery tools to help users find information and resources more efficiently. In the wake of this technological shift in discovery, privacy has become an increasingly prominent and complex issue for libraries. The nature of the web, over which users interact with discovery tools, has substantially diminished the library’s ability to control patron privacy. The emergence of a data economy has led to a new wave of online tracking and surveillance, in which multiple third parties collect and share user data during the discovery process, making it much more difficult, if not impossible, for libraries to protect patron privacy. In addition, users are increasingly starting their searches with web search engines, diminishing the library’s control over privacy even further. While libraries have a legal and ethical responsibility to protect patron privacy, they are simultaneously challenged to meet evolving user needs for discovery. In a world where “search” is synonymous with Google, users increasingly expect their library discovery experience to mimic their experience using web search engines. However, web search engines rely on a drastically different set of privacy standards, as they strive to create tailored, personalized search results based on user data. Libraries are seemingly forced to make a choice between delivering the discovery experience users expect and protecting user privacy. This paper explores the competing interests of privacy and user experience, and proposes possible strategies to address them in the future design of library discovery tools.

  15. The Privacy Jungle: On the Market for Data Protection in Social Networks

    Science.gov (United States)

    Bonneau, Joseph; Preibusch, Sören

    We have conducted the first thorough analysis of the market for privacy practices and policies in online social networks. From an evaluation of 45 social networking sites using 260 criteria we find that many popular assumptions regarding privacy and social networking need to be revisited when considering the entire ecosystem instead of only a handful of well-known sites. Contrary to the common perception of an oligopolistic market, we find evidence of vigorous competition for new users. Despite observing many poor security practices, there is evidence that social network providers are making efforts to implement privacy enhancing technologies with substantial diversity in the amount of privacy control offered. However, privacy is rarely used as a selling point, and even then only as an auxiliary, nondecisive feature. Sites also failed to promote their existing privacy controls within the site. We similarly found great diversity in the length and content of formal privacy policies, but found an opposite promotional trend: though almost all policies are not accessible to ordinary users due to obfuscating legal jargon, they conspicuously vaunt the sites' privacy practices. We conclude that the market for privacy in social networks is dysfunctional in that there is significant variation in sites' privacy controls, data collection requirements, and legal privacy policies, but this is not effectively conveyed to users. Our empirical findings motivate us to introduce the novel model of a privacy communication game, where the economically rational choice for a site operator is to make privacy control available to evade criticism from privacy fundamentalists, while hiding the privacy control interface and privacy policy to maximize sign-up numbers and encourage data sharing from the pragmatic majority of users.

  16. Nano-technology and privacy: on continuous surveillance outside the panopticon.

    Science.gov (United States)

    Hoven, Jeroen Van Den; Vermaas, Pieter E

    2007-01-01

    We argue that nano-technology in the form of invisible tags, sensors, and Radio Frequency Identity Chips (RFIDs) will give rise to privacy issues that are in two ways different from the traditional privacy issues of the last decades. One, they will not exclusively revolve around the idea of centralization of surveillance and concentration of power, as the metaphor of the Panopticon suggests, but will be about constant observation at decentralized levels. Two, privacy concerns may not exclusively be about constraining information flows but also about designing of materials and nano-artifacts such as chips and tags. We begin by presenting a framework for structuring the current debates on privacy, and then present our arguments.

  17. Balance between Privacy Protecting and Selling User Data of Wearable Devices

    OpenAIRE

    Huang, Kuang-Chiu; Hsu, Jung-Fang

    2017-01-01

    Smart bracelets can identify individual data and synchronize the step count, mileage, calorie consumption, heart rate, sleeping data and even the pictures users upload with the APP. This feature is convenient on one hand but makes us lose control of our privacy on the other. With the poor privacy protection mechanisms embedded in these wearable devices, hackers can easily invade them and steal user data. In addition, most smart bracelet companies have not made a clear...

  18. A privacy protection for an mHealth messaging system

    Science.gov (United States)

    Aaleswara, Lakshmipathi; Akopian, David; Chronopoulos, Anthony T.

    2015-03-01

    In this paper, we propose a new software system with features that help organizations comply with USA HIPAA regulations. The system uses SMS as the primary way to communicate and transfer information. Lack of knowledge about some diseases is still a major reason why some harmful diseases spread. The developed system includes different features that may help communication among low-income people who do not even have access to the internet. Since the software system deals with Personal Health Information (PHI), it is equipped with an access control authentication mechanism to protect privacy. The system is analyzed for performance to identify how much overhead the privacy rules impose.

  19. Perspectives of Australian adults about protecting the privacy of their health information in statistical databases.

    Science.gov (United States)

    King, Tatiana; Brankovic, Ljiljana; Gillard, Patricia

    2012-04-01

    Assuring individuals that their personal health information is de-identified reduces their concern about the necessity of consent for releasing health information for research purposes, but many people are not aware that removing their names and other direct identifiers from medical records does not guarantee full privacy protection for their health information. Privacy concerns decrease as extra security measures are introduced to protect privacy. Therefore, instead of "tailoring concern" as proposed by Willison, we suggest improving privacy protection of personal information by introducing additional security measures in data publishing. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  20. User Privacy in RFID Networks

    Science.gov (United States)

    Singelée, Dave; Seys, Stefaan

    Wireless RFID networks are getting deployed at a rapid pace and have already entered the public space on a massive scale: public transport cards, the biometric passport, office ID tokens, customer loyalty cards, etc. Although RFID technology offers interesting services to customers and retailers, it could also endanger the privacy of the end-users. The lack of protection mechanisms being deployed could potentially result in a privacy leakage of personal data. Furthermore, there is the emerging threat of location privacy. In this paper, we will show some practical attack scenarios and illustrate some of them with cases that have received press coverage. We will present the main challenges of enhancing privacy in RFID networks and evaluate some solutions proposed in the literature. The main advantages and shortcomings will be briefly discussed. Finally, we will give an overview of some academic and industrial research initiatives on RFID privacy.

  1. Privacy After Snowden: Theoretical Developments and Public Opinion Perceptions of Privacy in Slovenia (Zasebnost po Snowdnu: novejša pojmovanja zasebnosti in odnos javnosti do le-te v Sloveniji)

    Directory of Open Access Journals (Sweden)

    Aleš Završnik

    2014-10-01

    Full Text Available The article analyses recent theorizing of privacy arising from new technologies that allow constant and ubiquitous monitoring of our communication and movement. The theoretical part analyses Helen Nissenbaum’s theory of contextual integrity of privacy and pluralistic understanding of privacy by Daniel Solove. The empirical part presents the results of an online survey on the Slovenian public perceptions of privacy that includes questions on types and frequency of victimizations relating to the right to privacy; self-reported privacy violations; concern for the protection of one’s own privacy; perception of primary privacy offenders; the value of privacy; attitude towards data retention in public telecommunication networks; and acquaintance with the Information Commissioner of RS. Despite growing distrust of large internet corporations and – after Edward Snowden’s revelations – Intelligence agencies, the findings indicate a low degree of awareness and care for the protection of personal data.

  2. The interplay between decentralization and privacy: the case of blockchain technologies

    OpenAIRE

    De Filippi , Primavera

    2016-01-01

    International audience; Decentralized architectures are gaining popularity as a way to protect one's privacy against the ubiquitous surveillance of states and corporations. Yet, in spite of the obvious benefits they provide when it comes to data sovereignty, decentralized architectures also present certain characteristics that—if not properly accounted for—might ultimately impinge upon users' privacy. While they are capable of preserving the confidentiality of data, decentralized architecture...

  3. Privacy Practices of Health Social Networking Sites: Implications for Privacy and Data Security in Online Cancer Communities.

    Science.gov (United States)

    Charbonneau, Deborah H

    2016-08-01

    While online communities for social support continue to grow, little is known about the state of privacy practices of health social networking sites. This article reports on a structured content analysis of privacy policies and disclosure practices for 25 online ovarian cancer communities. All of the health social networking sites in the study sample provided privacy statements to users, yet privacy practices varied considerably across the sites. The majority of sites informed users that personal information was collected about participants and shared with third parties (96%, n = 24). Furthermore, more than half of the sites (56%, n = 14) stated that cookies technology was used to track user behaviors. Despite these disclosures, only 36% (n = 9) offered opt-out choices for sharing data with third parties. In addition, very few of the sites (28%, n = 7) allowed individuals to delete their personal information. Discussions about specific security measures used to protect personal information were largely missing. Implications for privacy, confidentiality, consumer choice, and data safety in online environments are discussed. Overall, nurses and other health professionals can utilize these findings to encourage individuals seeking online support and participating in social networking sites to build awareness of privacy risks to better protect their personal health information in the digital age.

  4. Privacy and senior willingness to adopt smart home information technology in residential care facilities.

    Science.gov (United States)

    Courtney, K L

    2008-01-01

    With large predicted increases of the older adult (65 years and older) population, researchers have been exploring the use of smart home information technologies (IT) in residential care (RC) facilities to enhance resident quality of life and safety. Older adults' perceptions of privacy can inhibit their acceptance and subsequent adoption of smart home IT. This qualitative study, guided by principles of grounded theory research, investigated the relationship between privacy, living environment and willingness of older adults living in residential care facilities to adopt smart home IT through focus groups and individual interviews. The findings from this study indicate that privacy can be a barrier for older adults' adoption of smart home IT; however their own perception of their need for the technology may override their privacy concerns. Privacy concerns, as a barrier to technology adoption, can be influenced by both individual-level and community-level factors. Further exploration of the factors influencing older adults' perceptions of smart home IT need is necessary.

  5. The Influence of Security Statement, Technical Protection, and Privacy on Satisfaction and Loyalty; A Structural Equation Modeling

    Science.gov (United States)

    Peikari, Hamid Reza

    Customer satisfaction and loyalty have been cited as critical success factors for e-commerce, and various studies have been conducted to find the antecedent determinants of these concepts in online transactions. One of the variables suggested by some studies is perceived security. However, these studies have referred to security from a broad, general perspective and no attempts have been made to study specific security-related variables. This paper studies the influence of security statement and technical protection on satisfaction, loyalty and privacy. The data was collected from 337 respondents and, after the reliability and validity tests, path analysis was applied to examine the hypotheses. The results suggest that loyalty is influenced by satisfaction and security statement, and no empirical support was found for the influence of technical protection and privacy on loyalty. Moreover, it was found that security statement and technical protection have a positive significant influence on satisfaction, while no significant effect was found for privacy. Furthermore, the analysis indicated that security statement has a positive significant influence on technical protection, while technical protection was found to have a significant negative impact on perceived privacy.

  6. Outsourcing medical data analyses: can technology overcome legal, privacy, and confidentiality issues?

    Science.gov (United States)

    Brumen, Bostjan; Heričko, Marjan; Sevčnikar, Andrej; Završnik, Jernej; Hölbl, Marko

    2013-12-16

    Medical data are gold mines for deriving the knowledge that could change the course of a single patient's life or even the health of the entire population. A data analyst needs to have full access to relevant data, but full access may be denied by privacy and confidentiality of medical data legal regulations, especially when the data analyst is not affiliated with the data owner. Our first objective was to analyze the privacy and confidentiality issues and the associated regulations pertaining to medical data, and to identify technologies to properly address these issues. Our second objective was to develop a procedure to protect medical data in such a way that the outsourced analyst would be capable of doing analyses on protected data and the results would be comparable, if not the same, as if they had been done on the original data. Specifically, our hypothesis was there would not be a difference between the outsourced decision trees built on encrypted data and the ones built on original data. Using formal definitions, we developed an algorithm to protect medical data for outsourced analyses. The algorithm was applied to publicly available datasets (N=30) from the medical and life sciences fields. The analyses were performed on the original and the protected datasets and the results of the analyses were compared. Bootstrapped paired t tests for 2 dependent samples were used to test whether the mean differences in size, number of leaves, and the accuracy of the original and the encrypted decision trees were significantly different. The decision trees built on encrypted data were virtually the same as those built on original data. Out of 30 datasets, 100% of the trees had identical accuracy. The size of a tree and the number of leaves was different only once (1/30, 3%, P=.19). The proposed algorithm encrypts a file with plain text medical data into an encrypted file with the data protected in such a way that external data analyses are still possible. The results
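
    The paper's encryption algorithm is not shown in this record; the sketch below only illustrates why such a result is plausible: decision-tree splits depend on the ordering of feature values, so a strictly monotonic (order-preserving) transformation of the features typically yields a tree with the same accuracy. The affine transformation and the iris dataset are illustrative stand-ins (requires scikit-learn), not the authors' scheme.

    ```python
    # Sketch: trees built on order-preserving transformed features usually score the same.
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    def transform(values):
        # A toy order-preserving "encoding"; a real scheme would use proper
        # order-preserving or property-preserving encryption.
        return values * 3.7 + 100.0

    plain = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
    masked = DecisionTreeClassifier(random_state=0).fit(transform(X_train), y_train)

    # The two accuracies are expected to match, mirroring the paper's observation.
    print(plain.score(X_test, y_test), masked.score(transform(X_test), y_test))
    ```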

  7. Protecting and Evaluating Genomic Privacy in Medical Tests and Personalized Medicine

    OpenAIRE

    Ayday, Erman; Raisaro, Jean Louis; Rougemont, Jacques; Hubaux, Jean-Pierre

    2013-01-01

    In this paper, we propose privacy-enhancing technologies for medical tests and personalized medicine methods that use patients' genomic data. Focusing on genetic disease-susceptibility tests, we develop a new architecture (between the patient and the medical unit) and propose a "privacy-preserving disease susceptibility test" (PDS) by using homomorphic encryption and proxy re-encryption. Assuming the whole genome sequencing to be done by a certified institution, we propose to store patients' ...
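
    The sketch below is not the paper's architecture (which also involves proxy re-encryption and a certified sequencing institution); it only illustrates how an additively homomorphic scheme lets a medical unit compute a weighted susceptibility score over encrypted variant data. It assumes the open-source python-paillier package (phe) is installed, and the SNP values and weights are made up.

    ```python
    # Weighted disease-susceptibility score over encrypted genotypes (sketch).
    # Requires: pip install phe   (python-paillier)
    from phe import paillier

    public_key, private_key = paillier.generate_paillier_keypair(n_length=1024)

    snp_values = [0, 1, 2, 1]          # hypothetical risk-allele counts per SNP
    weights = [2, 5, 1, 3]             # hypothetical per-SNP risk weights

    encrypted = [public_key.encrypt(v) for v in snp_values]   # done on the patient's side

    # The medical unit combines ciphertexts without ever seeing the genotypes:
    score = encrypted[0] * weights[0]
    for enc, w in zip(encrypted[1:], weights[1:]):
        score = score + enc * w

    print(private_key.decrypt(score))   # 2*0 + 5*1 + 1*2 + 3*1 = 10
    ```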

  8. Enhancing Privacy Education with a Technical Emphasis in IT Curriculum

    Science.gov (United States)

    Peltsverger, Svetlana; Zheng, Guangzhi

    2016-01-01

    The paper describes the development of four learning modules that focus on technical details of how a person's privacy might be compromised in real-world scenarios. The paper shows how students benefited from the addition of hands-on learning experiences of privacy and data protection to the existing information technology courses. These learning…

  9. Biomedical databases: protecting privacy and promoting research.

    Science.gov (United States)

    Wylie, Jean E; Mineau, Geraldine P

    2003-03-01

    When combined with medical information, large electronic databases of information that identify individuals provide superlative resources for genetic, epidemiological and other biomedical research. Such research resources increasingly need to balance the protection of privacy and confidentiality with the promotion of research. Models that do not allow the use of such individual-identifying information constrain research; models that involve commercial interests raise concerns about what type of access is acceptable. Researchers, individuals representing the public interest and those developing regulatory guidelines must be involved in an ongoing dialogue to identify practical models.

  10. Genomic research and data-mining technology: implications for personal privacy and informed consent.

    Science.gov (United States)

    Tavani, Herman T

    2004-01-01

    This essay examines issues involving personal privacy and informed consent that arise at the intersection of information and communication technology (ICT) and population genomics research. I begin by briefly examining the ethical, legal, and social implications (ELSI) program requirements that were established to guide researchers working on the Human Genome Project (HGP). Next I consider a case illustration involving deCODE Genetics, a privately owned genetic company in Iceland, which raises some ethical concerns that are not clearly addressed in the current ELSI guidelines. The deCODE case also illustrates some ways in which an ICT technique known as data mining has both aided and posed special challenges for researchers working in the field of population genomics. On the one hand, data-mining tools have greatly assisted researchers in mapping the human genome and in identifying certain "disease genes" common in specific populations (which, in turn, has accelerated the process of finding cures for diseases that affect those populations). On the other hand, this technology has significantly threatened the privacy of research subjects participating in population genomics studies, who may, unwittingly, contribute to the construction of new groups (based on arbitrary and non-obvious patterns and statistical correlations) that put those subjects at risk for discrimination and stigmatization. In the final section of this paper I examine some ways in which the use of data mining in the context of population genomics research poses a critical challenge for the principle of informed consent, which traditionally has played a central role in protecting the privacy interests of research subjects participating in epidemiological studies.

  11. Privacy Policies

    NARCIS (Netherlands)

    Dekker, M.A.C.; Etalle, Sandro; den Hartog, Jeremy; Petkovic, M.; Jonker, W.; Jonker, Willem

    2007-01-01

    Privacy is a prime concern in today's information society. To protect the privacy of individuals, enterprises must follow certain privacy practices, while collecting or processing personal data. In this chapter we look at the setting where an enterprise collects private data on its website,

  12. BORDERS OF COMMUNICATION PRIVACY IN SLOVENIAN CRIMINAL PROCEDURE – CONSTITUTIONAL CHALLENGES

    Directory of Open Access Journals (Sweden)

    Sabina Zgaga

    2015-01-01

    Full Text Available Due to fast technological development and our constant communication, protection of communication privacy in every aspect of our (legal) life has become more important than ever before. Regarding protection of privacy in criminal procedure, special emphasis should be given to the regulation of privacy in the Slovenian Constitution and its interpretation in the case law of the Constitutional Court. This paper presents the definition of privacy and communication privacy in Slovenian constitutional law and exposes the main issues of communication privacy that have been discussed in the case law of the Constitutional Court in the last twenty years. Thereby the paper tries to show the general trend in the case law of the Constitutional Court regarding the protection of communication privacy and to expose certain unsolved issues and unanswered challenges. Slovenian constitutional regulation of communication privacy is very protective, considering the broad definition of privacy and the strict conditions for encroachment on communication privacy. The case law of the Slovenian Constitutional Court has also shown such a trend, with the possible exception of the recent decision on a dynamic IP address. The importance of this decision is however significant, since it could be applicable to all forms of communication via the internet, the prevailing form of communication nowadays. Certain challenges still lie ahead, such as the current proposal for the amendment of the Criminal Procedure Act-M, which includes the use of IMSI catchers, and numerous unanswered issues regarding data retention after the decisive annulment of its partial legal basis by the Constitutional Court.

  13. Privacy and Anonymity in the Information Society – Challenges for the European Union

    Directory of Open Access Journals (Sweden)

    Ioannis A. Tsoukalas

    2011-01-01

    Full Text Available Electronic information is challenging traditional views on property and privacy. The explosion of digital data, driven by novel web applications, social networking, and mobile devices makes data security and the protection of privacy increasingly difficult. Furthermore, biometric data and radiofrequency identification applications enable correlations that are able to trace our cultural, behavioral, and emotional states. The concept of privacy in the digital realm is transformed and emerges as one of the biggest risks facing today's Information Society. In this context, the European Union (EU policy-making procedures strive to adapt to the pace of technological advancement. The EU needs to improve the existing legal frameworks for privacy and data protection. It needs to work towards a “privacy by education” approach for the empowerment of “privacy-literate” European digital citizens.

  14. Privacy and Security within Biobanking: The Role of Information Technology.

    Science.gov (United States)

    Heatherly, Raymond

    2016-03-01

    Along with technical issues, biobanking frequently raises important privacy and security issues that must be resolved as biobanks continue to grow in scale and scope. Consent mechanisms currently in use range from fine-grained to very broad, and in some cases participants are offered very few privacy protections. However, developments in information technology are bringing improvements. New programs and systems are being developed to allow researchers to conduct analyses without distributing the data itself offsite, either by allowing the investigator to communicate with a central computer, or by having each site participate in meta-analysis that results in a shared statistic or final significance result. The implementation of security protocols into the research biobanking setting requires three key elements: authentication, authorization, and auditing. Authentication is the process of making sure individuals are who they claim to be, frequently through the use of a password, a key fob, or a physical (i.e., retinal or fingerprint) scan. Authorization involves ensuring that every individual who attempts an action has permission to do that action. Finally, auditing allows for actions to be logged so that inappropriate or unethical actions can later be traced back to their source. © 2016 American Society of Law, Medicine & Ethics.
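
    A minimal sketch of how the three elements fit together in code; the user table, roles, and log format are illustrative assumptions rather than any biobank's actual implementation.

    ```python
    # Authentication (salted password hash), authorization (role check), auditing (append-only log).
    import hashlib
    import hmac
    import os
    import time

    USERS = {"rsmith": {"salt": os.urandom(16), "role": "analyst"}}
    USERS["rsmith"]["pw_hash"] = hashlib.pbkdf2_hmac("sha256", b"correct horse", USERS["rsmith"]["salt"], 100_000)

    PERMISSIONS = {"analyst": {"run_analysis"}, "curator": {"run_analysis", "export_data"}}
    AUDIT_LOG = []

    def authenticate(username, password):
        user = USERS.get(username)
        if not user:
            return False
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), user["salt"], 100_000)
        return hmac.compare_digest(candidate, user["pw_hash"])

    def authorize(username, action):
        return action in PERMISSIONS.get(USERS[username]["role"], set())

    def audit(username, action, allowed):
        AUDIT_LOG.append({"ts": time.time(), "user": username, "action": action, "allowed": allowed})

    def request(username, password, action):
        ok = authenticate(username, password) and authorize(username, action)
        audit(username, action, ok)
        return ok

    print(request("rsmith", "correct horse", "export_data"))   # False: analysts cannot export
    ```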

  15. Privacy policies

    NARCIS (Netherlands)

    Dekker, M.A.C.; Etalle, S.; Hartog, den J.I.; Petkovic, M.; Jonker, W.

    2007-01-01

    Privacy is a prime concern in today’s information society. To protect the privacy of individuals, enterprises must follow certain privacy practices while collecting or processing personal data. In this chapter we look at the setting where an enterprise collects private data on its website, processes

  16. Using genetic information while protecting the privacy of the soul.

    Science.gov (United States)

    Moor, J H

    1999-01-01

    Computing plays an important role in genetics (and vice versa). Theoretically, computing provides a conceptual model for the function and malfunction of our genetic machinery. Practically, contemporary computers and robots equipped with advanced algorithms make the revelation of the complete human genome imminent--computers are about to reveal our genetic souls for the first time. Ethically, computers help protect privacy by restricting access in sophisticated ways to genetic information. But the inexorable fact that computers will increasingly collect, analyze, and disseminate abundant amounts of genetic information made available through the genetic revolution, not to mention that inexpensive computing devices will make genetic information gathering easier, underscores the need for strong and immediate privacy legislation.

  17. A Legal Approach to Civilian Use of Drones in Europe. Privacy and Personal Data Protection Concerns

    OpenAIRE

    Pauner Chulvi, Cristina; Viguri Cordero, Jorge Agustín

    2015-01-01

    Drones are a growth industry, evolving quickly from military to civilian uses; however, they have the potential to pose a serious risk to security, privacy and data protection. After a first stage focused on safety issues, Europe is facing the challenge of developing a regulatory framework for the integration of drones into the airspace system while safeguarding fundamental rights and civil liberties. This paper analyses the potential privacy and data protection risks ...

  18. Enhancing Privacy Education with a Technical Emphasis in IT Curriculum

    Directory of Open Access Journals (Sweden)

    Svetlana Peltsverger

    2015-12-01

    Full Text Available The paper describes the development of four learning modules that focus on technical details of how a person’s privacy might be compromised in real-world scenarios. The paper shows how students benefited from the addition of hands-on learning experiences of privacy and data protection to the existing information technology courses. These learning modules raised students’ awareness of potential breaches of privacy as a user as well as a developer. The demonstration of a privacy breach in action helped students to design, configure, and implement technical solutions to prevent privacy violations. The assessment results demonstrate the strength of the technical approach.

  19. The protection of the right to privacy as the social imperative of digital age: How vulnerable are we?

    Directory of Open Access Journals (Sweden)

    Levakov-Vermezović Tijana

    2016-01-01

    Full Text Available The paper examines various forms of jeopardizing the privacy of individuals in the digital world, with a specific focus on the criminal protection provided by the current international and national legal framework and the jurisprudence of the European Court of Human Rights. Conducting this research is essential considering that we live in an era of electronic communications in which no one is anonymous. The development of information and communication technologies has brought, among its many advantages, various challenges in all spheres of modern life. Since the Internet has become a global forum, individuals have increasingly been the target of countless insults, defamation and threats; moreover, personal information and media are often published without consent. Practice shows that effective suppression and control of illegal behavior on the Internet, and punishment of perpetrators, remain at a rudimentary level. In order to provide proper protection for victims of criminal offenses committed against their privacy in the digital world, it is necessary to create new models and approaches to solving this problem.

  20. Role-task conditional-purpose policy model for privacy preserving data publishing

    Directory of Open Access Journals (Sweden)

    Rana Elgendy

    2017-12-01

    Full Text Available Privacy has become a major concern for both consumers and enterprises; therefore many research efforts have been devoted to the development of privacy preserving technology. The challenge in data privacy is to share the data while assuring the protection of personal information. Data privacy includes assuring protection against both insider and outsider threats, even if the data is published. Access control can help to protect the data from outsider threats. Access control is defined as the process of mediating every request to resources and data maintained by a system and determining whether the request should be granted or denied. This can be enforced by a mechanism implementing regulations established by a security policy. In this paper, we present a privacy preserving data publishing model based on the integration of CPBAC, MD-TRBAC, PBFW, a protection-against-database-administrator technique inspired by the Oracle Vault technique, and the benefits of anonymization to protect data when it is published, using k-anonymity. The proposed model meets the requirements of workflow and non-workflow systems in an enterprise environment. It is based on the characteristics of conditional purposes, conditional roles, tasks, and policies. It guarantees protection against insider threats such as a database administrator. Finally, it assures the needed protection in case the data is published. Keywords: Database security, Access control, Data publishing, Anonymization
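
    As a minimal illustration of the anonymization step mentioned above (the attributes, generalization rules, and k value are assumptions for the example, not the paper's model), records can be generalized on their quasi-identifiers until every combination occurs at least k times:

    ```python
    # Check k-anonymity over quasi-identifiers, generalizing age and zip first (sketch).
    from collections import Counter

    records = [
        {"age": 34, "zip": "10115", "diagnosis": "flu"},
        {"age": 36, "zip": "10115", "diagnosis": "asthma"},
        {"age": 52, "zip": "10117", "diagnosis": "flu"},
        {"age": 58, "zip": "10117", "diagnosis": "diabetes"},
    ]

    def generalize(record):
        low = (record["age"] // 10) * 10
        return {"age": f"{low}-{low + 9}", "zip": record["zip"][:3] + "**", "diagnosis": record["diagnosis"]}

    def is_k_anonymous(rows, quasi_identifiers=("age", "zip"), k=2):
        groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in rows)
        return all(count >= k for count in groups.values())

    published = [generalize(r) for r in records]
    print(is_k_anonymous(records))    # False: raw ages/zips are unique
    print(is_k_anonymous(published))  # True: each (age band, zip prefix) appears twice
    ```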

  1. Protection of the right to privacy in the practice of the European Court of Human Rights

    Directory of Open Access Journals (Sweden)

    Mladenov Marijana

    2013-01-01

    Full Text Available The right to privacy is a fundamental human right and an essential component of the protection of human autonomy and freedom. The development of science and information systems creates various opportunities for interference with the physical and moral integrity of a person. Therefore, it is necessary to determine the precise content of the right to privacy. The European Convention on Human Rights and Fundamental Freedoms guarantees this right under Article 8. The European Court of Human Rights did not precisely define the content of the right to privacy, and thereby applicants could bring different aspects of life into the scope of respect for private life. According to the Court, the concept of privacy and private life includes the following areas of human life: the right to establish and maintain relationships with other human beings, protection of the physical and moral integrity of persons, protection of personal data, change of personal name, and various issues related to sexual orientation and transgender persons. This paper examines these spheres of human life in light of the interpretation of Article 8 of the Convention.

  2. Economics of Privacy: Users’ Attitudes and Economic Impact of Information Privacy Protection

    OpenAIRE

    Frik, Alisa

    2017-01-01

    This doctoral thesis consists of three essays within the field of the economics of information privacy, examined through the lens of behavioral and experimental economics. The rapid development and expansion of Internet, mobile and network technologies in the last decades have provided multitudinous opportunities and benefits to both business and society, offering customized services and personalized offers at relatively low price and high speed. However, such innovations and progress have al...

  3. Preserving Differential Privacy for Similarity Measurement in Smart Environments

    Directory of Open Access Journals (Sweden)

    Kok-Seng Wong

    2014-01-01

    Full Text Available Advances in both sensor technologies and network infrastructures have encouraged the development of smart environments to enhance people’s lives and living styles. However, collecting and storing users’ data in smart environments poses severe privacy concerns because these data may contain sensitive information about the subject. Hence, privacy protection is now an emerging issue that we need to consider, especially when data sharing is essential for analysis purposes. In this paper, we consider the case where two agents in the smart environment want to measure the similarity of their collected or stored data. We use the similarity coefficient function FSC as the measurement metric for the comparison, under a differential privacy model. Unlike existing solutions, our protocol can facilitate more than one request to compute FSC without modification. Our solution ensures privacy protection for both the inputs and the computed FSC results.
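
    A rough sketch of the idea, not the paper's FSC protocol: the two agents' data are compared with a set-similarity coefficient, and Laplace noise is added before the result is shared. Treating the noise scale as sensitivity/epsilon is an assumption here; a real protocol must calibrate it to the coefficient's actual sensitivity and handle repeated queries.

    ```python
    # Jaccard-style similarity with Laplace noise added before release (sketch).
    import random

    def jaccard(a, b):
        a, b = set(a), set(b)
        return len(a & b) / len(a | b) if a | b else 1.0

    def noisy_similarity(a, b, sensitivity, epsilon=1.0):
        # Laplace(0, sensitivity/epsilon) noise as the difference of two Exp(1) draws.
        noise = (random.expovariate(1.0) - random.expovariate(1.0)) * (sensitivity / epsilon)
        return jaccard(a, b) + noise

    agent1 = {"motion:kitchen", "light:on", "door:open"}
    agent2 = {"motion:kitchen", "light:on", "temp:21"}
    print(noisy_similarity(agent1, agent2, sensitivity=0.5))   # true Jaccard is 2/4 = 0.5
    ```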

  4. Surveillance, Privacy and Trans-Atlantic Relations

    DEFF Research Database (Denmark)

    Recent revelations, by Edward Snowden and others, of the vast network of government spying enabled by modern technology have raised major concerns both in the European Union and the United States on how to protect privacy in the face of increasing governmental surveillance. This book brings...

  5. PRUB: A Privacy Protection Friend Recommendation System Based on User Behavior

    Directory of Open Access Journals (Sweden)

    Wei Jiang

    2016-01-01

    Full Text Available The fast-developing social network is a double-edged sword: it remains a serious problem to provide users with excellent mobile social network services while also protecting private data. Most popular social applications use the behavior of users to build connections with people having similar behavior, thus improving user experience. However, many users do not want to share certain behavioral information with the recommendation system. In this paper, we aim to design a secure friend recommendation system based on user behavior, called PRUB. The proposed system aims at achieving fine-grained recommendation of friends who share some of the same characteristics without exposing actual user behavior. We used anonymized data from a Chinese ISP, which records user browsing behavior, for 3 months to test our system. The experiment result shows that our system can achieve a remarkable recommendation goal and, at the same time, protect the privacy of user behavior information.

  6. Taiwan's perspective on electronic medical records' security and privacy protection: lessons learned from HIPAA.

    Science.gov (United States)

    Yang, Che-Ming; Lin, Herng-Ching; Chang, Polun; Jian, Wen-Shan

    2006-06-01

    The protection of patients' health information is a very important concern in the information age. The purpose of this study is to ascertain what constitutes an effective legal framework for protecting both the security and privacy of health information, especially electronic medical records. All sorts of bills regarding electronic medical data protection have been proposed around the world, including the Health Insurance Portability and Accountability Act (HIPAA) of the U.S. The trend toward a centralized bill that focuses on managing computerized health information is the part that needs our further attention. Under the sponsorship of Taiwan's Department of Health (DOH), our expert panel drafted the "Medical Information Security and Privacy Protection Guidelines", which identify nine principles and entail 12 articles, in the hope that medical organizations will have an effective reference for how to manage their medical information in a confidential and secure fashion, especially in electronic transactions.

  7. New threats to health data privacy.

    Science.gov (United States)

    Li, Fengjun; Zou, Xukai; Liu, Peng; Chen, Jake Y

    2011-11-24

    Along with the rapid digitalization of health data (e.g. Electronic Health Records), there is increasing concern about maintaining data privacy while garnering the benefits, especially when the data are required to be published for secondary use. Most of the current research on protecting health data privacy is centered around data de-identification and data anonymization, which removes the identifiable information from the published health data to prevent an adversary from reasoning about the privacy of the patients. However, published health data is not the only source that the adversaries can count on: with a large amount of information that people voluntarily share on the Web, sophisticated attacks that join disparate information pieces from multiple sources against health data privacy become practical. Limited efforts have been devoted to studying these attacks yet. We study how patient privacy could be compromised with the help of today's information technologies. In particular, we show that private healthcare information could be collected by aggregating and associating disparate pieces of information from multiple online data sources including online social networks, public records and search engine results. We demonstrate a real-world case study to show that user identity and privacy are highly vulnerable to attribution, inference and aggregation attacks. We also show, with real data analysis, that people are highly identifiable to adversaries even with inaccurate information pieces about the target. We claim that so much information has been made available electronically and online that people are very vulnerable without effective privacy protection.
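
    To make the aggregation and linkage threat concrete, the sketch below joins a "de-identified" health record set with a public-looking directory on shared quasi-identifiers; all names and records are fictitious, and the attack is far simpler than the multi-source attacks studied in the paper.

    ```python
    # Re-identification by joining two sources on quasi-identifiers (toy example).
    health_records = [   # published without names
        {"zip": "46201", "birth_year": 1975, "sex": "F", "condition": "hypertension"},
        {"zip": "46202", "birth_year": 1982, "sex": "M", "condition": "asthma"},
    ]
    public_directory = [   # e.g. scraped from a social profile or public register
        {"name": "A. Smith", "zip": "46201", "birth_year": 1975, "sex": "F"},
        {"name": "B. Jones", "zip": "46350", "birth_year": 1990, "sex": "M"},
    ]

    def link(published, directory, keys=("zip", "birth_year", "sex")):
        for record in published:
            for person in directory:
                if all(record[k] == person[k] for k in keys):
                    yield person["name"], record["condition"]

    print(list(link(health_records, public_directory)))   # [('A. Smith', 'hypertension')]
    ```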

  8. New threats to health data privacy

    Directory of Open Access Journals (Sweden)

    Li Fengjun

    2011-11-01

    Full Text Available Abstract Background Along with the rapid digitalization of health data (e.g., Electronic Health Records), there is increasing concern about maintaining data privacy while garnering its benefits, especially when the data are published for secondary use. Most current research on protecting health data privacy is centered on data de-identification and data anonymization, which remove identifiable information from the published health data to prevent an adversary from reasoning about the privacy of the patients. However, published health data are not the only source that adversaries can count on: with the large amount of information that people voluntarily share on the Web, sophisticated attacks that join disparate pieces of information from multiple sources against health data privacy become practical. Limited effort has so far been devoted to studying these attacks. Results We study how patient privacy could be compromised with the help of today's information technologies. In particular, we show that private healthcare information could be collected by aggregating and associating disparate pieces of information from multiple online data sources, including online social networks, public records, and search engine results. We demonstrate a real-world case study to show that user identity and privacy are highly vulnerable to attribution, inference, and aggregation attacks. Using real data analysis, we also show that people are highly identifiable to adversaries even with inaccurate pieces of information about the target. Conclusion We conclude that so much information has been made available electronically and online that people are very vulnerable without effective privacy protection.

  9. Understanding Factors Associated with Singaporean Adolescents' Intention to Adopt Privacy Protection Behavior Using an Extended Theory of Planned Behavior.

    Science.gov (United States)

    Ho, Shirley S; Lwin, May O; Yee, Andrew Z H; Lee, Edmund W J

    2017-09-01

    Using an extended theory of planned behavior (TPB), this study explores how the original TPB variables (attitude, subjective norms, and perceived behavioral control), personality traits, privacy concern, past privacy protection behaviors (PPBs), as well as parental mediation strategies relate to adolescents' intention to engage in privacy protection measures. We administered a cross-sectional survey to a nationally representative sample of adolescents (N = 4,920) in Singapore. The sample comprised 50.5 percent females and 49.5 percent males, with ages ranging from 13 to 21 years (M = 14.73). Results from the hierarchical regression analysis showed that the proposed extended TPB model received partial support. Subjective norms, among the TPB and other factors, have the strongest relationship with adolescents' intention to engage in PPBs on social network sites (SNSs). Adolescents' privacy concern and their past PPBs are more important in influencing their future PPBs than personality traits such as neuroticism and extraversion. Adolescents whose parents have engaged in regulated parental mediation are more likely to protect their privacy on SNSs than adolescents whose parents have adopted an active mediation style.

  10. Impact of Mini-drone based Video Surveillance on Invasion of Privacy

    OpenAIRE

    Korshunov, Pavel; Bonetto, Margherita; Ebrahimi, Touradj; Ramponi, Giovanni

    2015-01-01

    An increase in the adoption of video surveillance, affecting many aspects of daily life, raises public concern about intrusion into individual privacy. New sensing and surveillance technologies, such as mini-drones, threaten to eradicate the boundaries of private space even further. Therefore, it is important to study the effect of mini-drones on privacy intrusion and to understand how existing privacy protection filters perform on video captured by a mini-drone. To this end, we have built a publi...

  11. Modifications to the HIPAA Privacy, Security, Enforcement, and Breach Notification rules under the Health Information Technology for Economic and Clinical Health Act and the Genetic Information Nondiscrimination Act; other modifications to the HIPAA rules.

    Science.gov (United States)

    2013-01-25

    The Department of Health and Human Services (HHS or ``the Department'') is issuing this final rule to: Modify the Health Insurance Portability and Accountability Act (HIPAA) Privacy, Security, and Enforcement Rules to implement statutory amendments under the Health Information Technology for Economic and Clinical Health Act (``the HITECH Act'' or ``the Act'') to strengthen the privacy and security protection for individuals' health information; modify the rule for Breach Notification for Unsecured Protected Health Information (Breach Notification Rule) under the HITECH Act to address public comment received on the interim final rule; modify the HIPAA Privacy Rule to strengthen the privacy protections for genetic information by implementing section 105 of Title I of the Genetic Information Nondiscrimination Act of 2008 (GINA); and make certain other modifications to the HIPAA Privacy, Security, Breach Notification, and Enforcement Rules (the HIPAA Rules) to improve their workability and effectiveness and to increase flexibility for and decrease burden on the regulated entities.

  12. Location-Related Privacy in Geo-Social Networks

    DEFF Research Database (Denmark)

    Ruiz Vicente, Carmen; Freni, Dario; Bettini, Claudio

    2011-01-01

    -ins." However, this ability to reveal users' locations causes new privacy threats, which in turn call for new privacy-protection methods. The authors study four privacy aspects central to these social networks - location, absence, co-location, and identity privacy - and describe possible means of protecting...... privacy in these circumstances....

  13. Privacy and Security Issues Surrounding the Protection of Data Generated by Continuous Glucose Monitors.

    Science.gov (United States)

    Britton, Katherine E; Britton-Colonnese, Jennifer D

    2017-03-01

    Being able to track, analyze, and use data from continuous glucose monitors (CGMs), and from the platforms and apps that communicate with CGMs, helps achieve better outcomes and can advance the understanding of diabetes. The risks to patients' expectation of privacy are great, and their ability to control how their information is collected, stored, and used is virtually nonexistent. Patients' physical security is also at risk if adequate cybersecurity measures are not taken. Currently, data privacy and security protections are not robust enough to address the privacy and security risks, and this stymies the current and future benefits of CGMs and the platforms and apps that communicate with them.

  14. Exploring the Perceived Measures of Privacy: RFID in Public Applications

    Directory of Open Access Journals (Sweden)

    Mohammad Alamgir Hossain

    2014-06-01

    Full Text Available The purpose of this study is to explore the measures that may protect the privacy of users in the context of RFID use in public applications. More specifically, this study investigates what measures the users perceive to be securing their privacy, particularly for RFID applications in public use. A qualitative research approach has been utilised for this study. The author conducted two focus-group discussion sessions and eight in-depth interviews in two countries: one from the Australasia region (Australia) and the other from Asia (Bangladesh), assuming that the status, perceptions, and tolerance of citizens on privacy issues differ between the two regions. The explored factors have been analysed from a privacy perspective. The findings show that, in developed and developing countries, the basic perceptions of users on privacy protection are complementary; however, privacy is a more serious concern in Australia than in Bangladesh. The data analysis proposed some attributes that may improve users' privacy perceptions when RFID is used in public applications. This study is the single initiative that focuses on the privacy of RFID users in a national-use context. As a practical implication, the proposed attributes can be exercised by the agencies that deploy RFID technology for citizens' use.

  15. Privacy and security of patient data in the pathology laboratory.

    Science.gov (United States)

    Cucoranu, Ioan C; Parwani, Anil V; West, Andrew J; Romero-Lauro, Gonzalo; Nauman, Kevin; Carter, Alexis B; Balis, Ulysses J; Tuthill, Mark J; Pantanowitz, Liron

    2013-01-01

    Data protection and security are critical components of routine pathology practice because laboratories are legally required to securely store and transmit electronic patient data. With the increasing connectivity of information systems, laboratory workstations, and the instruments themselves to the Internet, the demand to continuously protect and secure laboratory information can become a daunting task. This review addresses informatics security issues in the pathology laboratory related to passwords, biometric devices, data encryption, internet security, virtual private networks, firewalls, anti-viral software, and emergency security situations, as well as the potential impact that newer technologies such as mobile devices have on the privacy and security of electronic protected health information (ePHI). In the United States, the Health Insurance Portability and Accountability Act (HIPAA) governs the privacy and protection of medical information and health records. The HIPAA security standards final rule mandates administrative, physical, and technical safeguards to ensure the confidentiality, integrity, and security of ePHI. Importantly, security failures often lead to privacy breaches, invoking the HIPAA privacy rule as well. Therefore, this review also highlights key aspects of HIPAA and its impact on the pathology laboratory in the United States.

  16. Privacy and security of patient data in the pathology laboratory

    Directory of Open Access Journals (Sweden)

    Ioan C Cucoranu

    2013-01-01

    Full Text Available Data protection and security are critical components of routine pathology practice because laboratories are legally required to securely store and transmit electronic patient data. With the increasing connectivity of information systems, laboratory workstations, and the instruments themselves to the Internet, the demand to continuously protect and secure laboratory information can become a daunting task. This review addresses informatics security issues in the pathology laboratory related to passwords, biometric devices, data encryption, internet security, virtual private networks, firewalls, anti-viral software, and emergency security situations, as well as the potential impact that newer technologies such as mobile devices have on the privacy and security of electronic protected health information (ePHI). In the United States, the Health Insurance Portability and Accountability Act (HIPAA) governs the privacy and protection of medical information and health records. The HIPAA security standards final rule mandates administrative, physical, and technical safeguards to ensure the confidentiality, integrity, and security of ePHI. Importantly, security failures often lead to privacy breaches, invoking the HIPAA privacy rule as well. Therefore, this review also highlights key aspects of HIPAA and its impact on the pathology laboratory in the United States.

  17. From privacy to data protection in the EU : Implications for big data health research

    NARCIS (Netherlands)

    Mostert, Menno; Bredenoord, Annelien L.; Van Der Slootb, Bart; Van Delden, Johannes J.M.

    2018-01-01

    The right to privacy has usually been considered as the most prominent fundamental right to protect in data-intensive (Big Data) health research. Within the European Union (EU), however, the right to data protection is gaining relevance as a separate fundamental right that should in particular be

  18. End-to-End Privacy Protection for Facebook Mobile Chat based on AES with Multi-Layered MD5

    Directory of Open Access Journals (Sweden)

    Wibisono Sukmo Wardhono

    2018-01-01

    Full Text Available As social media environments become more interactive and the number of users grows tremendously, privacy is a matter of increasing concern. When personal data become a commodity, a social media company can share user data with another party such as a government. Facebook, Inc. is one of the social media companies that is frequently asked for users' data. Although this private data request mechanism goes through a formal and valid legal process, it still undermines the fundamental right to information privacy. In this case, social media users need protection against privacy violations by the social media platform provider itself. Private chat is the most popular feature of a social media platform. Inside a chat room, users can share private information. Cryptography is one of the data protection methods that can be used to hide private communication data from unauthorized parties. In our study, we propose a system that can encrypt chat content based on AES and multi-layered MD5 to ensure that social media users have privacy protection against a social media company that uses user information as a commodity. In addition, this system makes it convenient for users to share private information through the social media platform.
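
    The scheme described above, AES encryption of chat content with a key derived from multi-layered MD5, can be sketched as follows. This is a minimal illustration of the idea rather than the paper's implementation; it assumes the third-party cryptography package for AES-CBC, and the layer count, passphrase handling, and message format are invented for the example. Note that MD5-based key derivation is weak by modern standards and is shown only to mirror the described scheme; a production system should use a proper KDF.

    ```python
    import os
    import hashlib
    from cryptography.hazmat.primitives import padding
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    def layered_md5_key(passphrase: str, layers: int = 3) -> bytes:
        """Derive a 16-byte AES-128 key by iterating MD5 over the passphrase."""
        digest = passphrase.encode("utf-8")
        for _ in range(layers):
            digest = hashlib.md5(digest).digest()
        return digest  # 16 bytes

    def encrypt_message(plaintext: str, passphrase: str) -> bytes:
        key = layered_md5_key(passphrase)
        iv = os.urandom(16)  # fresh IV for every chat message
        padder = padding.PKCS7(128).padder()
        padded = padder.update(plaintext.encode("utf-8")) + padder.finalize()
        enc = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
        return iv + enc.update(padded) + enc.finalize()

    def decrypt_message(blob: bytes, passphrase: str) -> str:
        key = layered_md5_key(passphrase)
        iv, ciphertext = blob[:16], blob[16:]
        dec = Cipher(algorithms.AES(key), modes.CBC(iv)).decryptor()
        padded = dec.update(ciphertext) + dec.finalize()
        unpadder = padding.PKCS7(128).unpadder()
        return (unpadder.update(padded) + unpadder.finalize()).decode("utf-8")

    # Usage: both chat parties share the passphrase out of band.
    blob = encrypt_message("meet at 7?", "shared-secret")
    assert decrypt_message(blob, "shared-secret") == "meet at 7?"
    ```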

  19. An innovative privacy preserving technique for incremental datasets on cloud computing.

    Science.gov (United States)

    Aldeen, Yousra Abdul Alsahib S; Salleh, Mazleena; Aljeroudi, Yazan

    2016-08-01

    Cloud computing (CC) is a service-based delivery model with vast computer processing power and data storage across connected communication channels. It has given strong technological impetus to the web-mediated IT industry, where users can easily share private data for further analysis and mining, and its user-friendly services make it economical to deploy a wide variety of applications. At the same time, easy data sharing has enabled various phishing attacks and malware-assisted security threats. Privacy-sensitive applications, such as health services on the cloud, which are built with several economic and operational benefits, necessitate enhanced security. Thus, thorough cyberspace security and mitigation against phishing attacks have become mandatory to protect overall data privacy. Typically, datasets from diverse applications are anonymized to give better privacy to their owners, but without providing all secrecy requirements for newly added records. Some proposed techniques have addressed this issue by re-anonymizing the datasets from scratch, yet full privacy protection over incremental datasets on CC is far from being achieved. Moreover, the distribution of huge dataset volumes across multiple storage nodes limits privacy preservation. In this view, we propose a new anonymization technique to attain better privacy protection with high data utility over distributed and incremental datasets on CC. The proficiency of data privacy preservation and improved confidentiality requirements is demonstrated through performance evaluation. Copyright © 2016 Elsevier Inc. All rights reserved.

  20. Secure and privacy-preserving data communication in Internet of Things

    CERN Document Server

    Zhu, Liehuang; Xu, Chang

    2017-01-01

    This book concentrates on protecting data security and privacy when participants communicate with each other in the Internet of Things (IoT). Technically, it categorizes and introduces a collection of recent secure and privacy-preserving data communication schemes/protocols in three traditional IoT scenarios: wireless sensor networks, the smart grid, and vehicular ad-hoc networks. The book offers readers three advantages. First, it broadens readers' horizons in IoT by touching on three interesting and complementary topics: data aggregation, privacy protection, and key agreement and management. Second, various cryptographic schemes/protocols used to protect data confidentiality and integrity are presented. Finally, the book illustrates how to design practical systems that implement the algorithms in the context of IoT communication. In summary, readers can simply learn and directly apply the new technologies to communicate data in the IoT after reading this book.

  1. Network Security Hacks Tips & Tools for Protecting Your Privacy

    CERN Document Server

    Lockhart, Andrew

    2009-01-01

    This second edition of Network Security Hacks offers 125 concise and practical hacks, including more information for Windows administrators, hacks for wireless networking (such as setting up a captive portal and securing against rogue hotspots), and techniques to ensure privacy and anonymity, including ways to evade network traffic analysis, encrypt email and files, and protect against phishing attacks. System administrators looking for reliable answers will also find concise examples of applied encryption, intrusion detection, logging, trending, and incident response.

  2. Large-scale Health Information Database and Privacy Protection*1

    OpenAIRE

    YAMAMOTO, Ryuichi

    2016-01-01

    Japan was once progressive in the digitalization of healthcare fields but unfortunately has fallen behind in terms of the secondary use of data for public interest. There has recently been a trend to establish large-scale health databases in the nation, and a conflict between data use for public interest and privacy protection has surfaced as this trend has progressed. Databases for health insurance claims or for specific health checkups and guidance services were created according to the law...

  3. Ambiguity in Social Network Data for Presence, Sensitive-Attribute, Degree and Relationship Privacy Protection.

    Science.gov (United States)

    Rajaei, Mehri; Haghjoo, Mostafa S; Miyaneh, Eynollah Khanjari

    2015-01-01

    Maintaining privacy in network data publishing is a major challenge. This is because known characteristics of individuals can be used to extract new information about them. Recently, researchers have developed privacy methods based on k-anonymity and l-diversity to prevent re-identification or sensitive label disclosure through certain structural information. However, most of these studies have considered only structural information and have been developed for undirected networks. Furthermore, most existing approaches rely on generalization and node clustering and so may entail significant information loss, as all properties of all members of each group are generalized to the same value. In this paper, we introduce a framework for protecting the sensitive attribute, degree (the number of connected entities), and relationships, as well as the presence of individuals, in directed social network data whose nodes contain attributes. First, we define a privacy model that specifies privacy requirements for the above private information. Then, we introduce the technique of Ambiguity in Social Network data (ASN), based on anatomy, which specifies how to publish social network data. To employ ASN, individuals are partitioned into groups. ASN then publishes the exact values of the properties of the individuals in each group, with a common group ID, in several tables. The lossy join of those tables based on the group ID injects uncertainty into any reconstruction of the original network. We also show how to measure different privacy requirements in ASN. Simulation results on real and synthetic datasets demonstrate that our framework, which protects against four types of private information disclosure, preserves data utility in the tabular, topological, and spectral aspects of networks at a satisfactory level.
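
    The anatomy-based release that ASN builds on can be pictured with a much-simplified sketch: records are split into groups, and each attribute is published in its own table keyed only by a group ID, so the join back to individuals is lossy. The code below illustrates that tabular idea only; it ignores the degree and relationship protections described in the paper, and the group size and column names are assumptions.

    ```python
    import random

    def anatomy_publish(records, group_size=3):
        """Partition records into groups and publish per-attribute tables keyed by group ID.

        records: list of dicts such as {"id": ..., "age": ..., "disease": ...}.
        The two returned tables can only be joined on group_id, which is lossy:
        within a group, any quasi-identifier row could match any sensitive row.
        """
        shuffled = records[:]
        random.shuffle(shuffled)

        quasi_table, sensitive_table = [], []
        for gid, start in enumerate(range(0, len(shuffled), group_size)):
            group = shuffled[start:start + group_size]
            for r in group:
                quasi_table.append({"group_id": gid, "age": r["age"]})
                sensitive_table.append({"group_id": gid, "disease": r["disease"]})
        return quasi_table, sensitive_table

    # Usage: exact values are preserved, but linkage within a group is ambiguous.
    people = [
        {"id": i, "age": 20 + i, "disease": d}
        for i, d in enumerate(["flu", "asthma", "flu", "diabetes", "flu", "asthma"])
    ]
    quasi, sensitive = anatomy_publish(people)
    ```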

  4. Privacy protection in HealthGrid: distributing encryption management over the VO.

    Science.gov (United States)

    Torres, Erik; de Alfonso, Carlos; Blanquer, Ignacio; Hernández, Vicente

    2006-01-01

    Grid technologies have proven very successful in tackling challenging problems in which data access and processing is a bottleneck. Notwithstanding the benefits that Grid technologies could bring to Health applications, privacy leakages in current DataGrid technologies, due to the sharing of data in VOs and the use of remote resources, compromise their widespread adoption. Privacy control for Grid technology has therefore become a key requirement for the adoption of Grids in the Healthcare sector. Encrypted storage of confidential data effectively reduces the risk of disclosure. A self-enforcing scheme for encrypted data storage can be achieved by combining Grid security systems with distributed key management and classical cryptography techniques. Virtual Organizations, as the main unit of user management in Grid, can provide a way to organize key sharing, access control lists, and secure encryption management. This paper provides programming models and discusses the value, costs, and behavior of such a system implemented on top of one of the latest Grid middlewares. This work is partially funded by the Spanish Ministry of Science and Technology in the frame of the project Investigación y Desarrollo de Servicios GRID: Aplicación a Modelos Cliente-Servidor, Colaborativos y de Alta Productividad, with reference TIC2003-01318.

  5. Privacy Protection on Multiple Sensitive Attributes

    Science.gov (United States)

    Li, Zhen; Ye, Xiaojun

    In recent years, a privacy model called k-anonymity has gained popularity for microdata releasing. As microdata may contain multiple sensitive attributes about an individual, the protection of multiple sensitive attributes has become an important problem. Unlike the existing models for a single sensitive attribute, the extra associations among multiple sensitive attributes should be investigated. Two kinds of disclosure scenarios may happen because of logical associations. The Q&S Diversity is checked to prevent the foregoing disclosure risks, with an α Requirement definition used to ensure the diversity requirement. Finally, a two-step greedy generalization algorithm is used to carry out the processing of multiple sensitive attributes, dealing with quasi-identifiers and sensitive attributes respectively. We reduce the overall distortion by the measure of Masking SA.

  6. The Privacy Coach: Supporting customer privacy in the Internet of Things

    OpenAIRE

    Broenink, Gerben; Hoepman, Jaap-Henk; Hof, Christian van 't; van Kranenburg, Rob; Smits, David; Wisman, Tijmen

    2010-01-01

    The Privacy Coach is an application running on a mobile phone that supports customers in making privacy decisions when confronted with RFID tags. The approach we take to increase customer privacy is a radical departure from the mainstream research efforts that focus on implementing privacy enhancing technologies on the RFID tags themselves. Instead the Privacy Coach functions as a mediator between customer privacy preferences and corporate privacy policies, trying to find a match between the ...

  7. Genetic privacy.

    Science.gov (United States)

    Sankar, Pamela

    2003-01-01

    During the past 10 years, the number of genetic tests performed more than tripled, and public concern about genetic privacy emerged. The majority of states and the U.S. government have passed regulations protecting genetic information. However, research has shown that concerns about genetic privacy are disproportionate to known instances of information misuse. Beliefs in genetic determinacy explain some of the heightened concern about genetic privacy. Discussion of the debate over genetic testing within families illustrates the most recent response to genetic privacy concerns.

  8. Protecting patient privacy when sharing patient-level data from clinical trials.

    Science.gov (United States)

    Tucker, Katherine; Branson, Janice; Dilleen, Maria; Hollis, Sally; Loughlin, Paul; Nixon, Mark J; Williams, Zoë

    2016-07-08

    Greater transparency and, in particular, sharing of patient-level data for further scientific research is an increasingly important topic for the pharmaceutical industry and other organisations who sponsor and conduct clinical trials as well as generally in the interests of patients participating in studies. A concern remains, however, over how to appropriately prepare and share clinical trial data with third party researchers, whilst maintaining patient confidentiality. Clinical trial datasets contain very detailed information on each participant. Risk to patient privacy can be mitigated by data reduction techniques. However, retention of data utility is important in order to allow meaningful scientific research. In addition, for clinical trial data, an excessive application of such techniques may pose a public health risk if misleading results are produced. After considering existing guidance, this article makes recommendations with the aim of promoting an approach that balances data utility and privacy risk and is applicable across clinical trial data holders. Our key recommendations are as follows: 1. Data anonymisation/de-identification: Data holders are responsible for generating de-identified datasets which are intended to offer increased protection for patient privacy through masking or generalisation of direct and some indirect identifiers. 2. Controlled access to data, including use of a data sharing agreement: A legally binding data sharing agreement should be in place, including agreements not to download or further share data and not to attempt to seek to identify patients. Appropriate levels of security should be used for transferring data or providing access; one solution is use of a secure 'locked box' system which provides additional safeguards. This article provides recommendations on best practices to de-identify/anonymise clinical trial data for sharing with third-party researchers, as well as controlled access to data and data sharing

  9. A smart-card-enabled privacy preserving E-prescription system.

    Science.gov (United States)

    Yang, Yanjiang; Han, Xiaoxi; Bao, Feng; Deng, Robert H

    2004-03-01

    Within the overall context of protection of health care information, the privacy of prescription data needs special treatment. First, the involvement of diverse parties, especially nonmedical parties, in the process of drug prescription complicates the protection of prescription data. Second, both patients and doctors have privacy stakes in prescription, and their privacy should be equally protected. Third, the following facts determine that prescription should not be processed in a truly anonymous manner: certain involved parties conduct useful research on the basis of aggregated prescription data that are linkable with respect to either the patients or the doctors; and prescription data have to be identifiable in some extreme circumstances, e.g., under a court order for inspection and to assign liability. In this paper, we propose an e-prescription system to address issues pertaining to privacy protection in the process of drug prescription. In our system, patients' smart cards play an important role. For one thing, the smart cards are implemented as portable repositories carrying up-to-date personal medical records and insurance information, providing doctors instant access to data crucial to the process of diagnosis and prescription. For another, with the secret signing key stored inside, the smart card enables the patient to sign the prescription pad electronically, declaring his acceptance of the prescription. To make the system more realistic, we identify the need for a patient to delegate his signing capability to other people so as to protect the privacy of information housed on his card. A strong proxy signature scheme achieving technologically mutual agreement on the delegation is proposed to implement the delegation functionality.

  10. A Utility Maximizing and Privacy Preserving Approach for Protecting Kinship in Genomic Databases.

    Science.gov (United States)

    Kale, Gulce; Ayday, Erman; Tastan, Oznur

    2017-09-12

    Rapid and low-cost sequencing of genomes has enabled the widespread use of genomic data in research studies and personalized customer applications, where genomic data are shared in public databases. Although the identities of the participants are anonymized in these databases, sensitive information about individuals can still be inferred. One such piece of information is kinship. We define two routes through which kinship privacy can leak and propose a technique to protect kinship privacy against these risks while maximizing the utility of shared data. The method involves systematic identification of minimal portions of genomic data to mask as new participants are added to the database. Choosing the proper positions to hide is cast as an optimization problem in which the number of positions to mask is minimized subject to privacy constraints that ensure the familial relationships are not revealed. We evaluate the proposed technique on real genomic data. Results indicate that concurrent sharing of data pertaining to a parent and an offspring results in a high risk to kinship privacy, whereas sharing data from more distant relatives together is often safer. We also show that the arrival order of family members has a high impact on the level of privacy risk and on the utility of sharing data. Available at: https://github.com/tastanlab/Kinship-Privacy. erman@cs.bilkent.edu.tr or oznur.tastan@cs.bilkent.edu.tr. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  11. Identity management and privacy languages technologies: Improving user control of data privacy

    Science.gov (United States)

    García, José Enrique López; García, Carlos Alberto Gil; Pacheco, Álvaro Armenteros; Organero, Pedro Luis Muñoz

    Identity management solutions have the capability to bring confidence to internet services, but this confidence could be improved if the user had more control over the privacy policy applied to his or her attributes. Privacy languages could help with this task due to their capability to define privacy policies for data in a very flexible way. Thus, an integration problem arises: making identity management and privacy languages work together. Although several proposals for accomplishing this have already been defined, this paper suggests some topics and improvements that could be considered.

  12. Regulating Online Data Privacy

    OpenAIRE

    Paul Reid

    2004-01-01

    With existing data protection laws proving inadequate in the fight to protect online data privacy and with the offline law of privacy in a state of change and uncertainty, the search for an alternative solution to the important problem of online data privacy should commence. With the inherent problem of jurisdiction that the Internet presents, such a solution is best coming from a multi-national body with the power to approximate laws in as many jurisdictions as possible, with a recognised au...

  13. Cancer surveillance and information: balancing public health with privacy and confidentiality concerns (United States).

    Science.gov (United States)

    Deapen, Dennis

    2006-06-01

    Rapid advances in informatics and communication technologies are greatly expanding the capacity for information capture and transportation. While these tools can be used for great good, they also offer new opportunities for those who seek to obtain and use information for improper purposes. While issues related to identity theft for financial gain garner the most attention, protection of privacy in public health endeavors such as cancer surveillance is also a significant concern. Some efforts to protect health-related information have had unintended consequences detrimental to health research and public health practice. Achieving a proper balance between measures to protect privacy and the ability to guard and improve public health requires careful consideration and development of appropriate policies, regulations and use of technology.

  14. Toward Privacy-Preserving Personalized Recommendation Services

    Directory of Open Access Journals (Sweden)

    Cong Wang

    2018-02-01

    Full Text Available Recommendation systems are crucially important for the delivery of personalized services to users. With personalized recommendation services, users can enjoy a variety of targeted recommendations such as movies, books, ads, restaurants, and more. In addition, personalized recommendation services have become extremely effective revenue drivers for online business. Despite the great benefits, deploying personalized recommendation services typically requires the collection of users’ personal data for processing and analytics, which undesirably makes users susceptible to serious privacy violation issues. Therefore, it is of paramount importance to develop practical privacy-preserving techniques to maintain the intelligence of personalized recommendation services while respecting user privacy. In this paper, we provide a comprehensive survey of the literature related to personalized recommendation services with privacy protection. We present the general architecture of personalized recommendation systems, the privacy issues therein, and existing works that focus on privacy-preserving personalized recommendation services. We classify the existing works according to their underlying techniques for personalized recommendation and privacy protection, and thoroughly discuss and compare their merits and demerits, especially in terms of privacy and recommendation accuracy. We also identify some future research directions. Keywords: Privacy protection, Personalized recommendation services, Targeted delivery, Collaborative filtering, Machine learning

  15. Is Electronic Privacy Achievable?

    National Research Council Canada - National Science Library

    Irvine, Cynthia E; Levin, Timothy E

    2000-01-01

    ... individuals. The purpose of this panel was to focus on how new technologies are affecting privacy. Technologies that might adversely affect privacy were identified by Rein Turn at previous symposia...

  16. Mum's the Word: Feds Are Serious About Protecting Patients' Privacy.

    Science.gov (United States)

    Conde, Crystal

    2010-08-01

    The Health Information Technology for Economic and Clinical Health (HITECH) Act significantly changes HIPAA privacy and security policies that affect physicians. Chief among the changes are the new breach notification regulations, developed by the U.S. Department of Health and Human Services Office for Civil Rights. The Texas Medical Association has developed resources to help physicians comply with the new HIPAA regulations.

  17. (a,k)-Anonymous Scheme for Privacy-Preserving Data Collection in IoT-based Healthcare Services Systems.

    Science.gov (United States)

    Li, Hongtao; Guo, Feng; Zhang, Wenyin; Wang, Jie; Xing, Jinsheng

    2018-02-14

    The wide use of IoT technologies in healthcare services has raised the level of medical intelligence in those services. However, it also brings potential privacy threats to data collection. In a healthcare services system, health and medical data that contain private information are often transmitted across networks, and such private information should be protected. Therefore, there is a need for a privacy-preserving data collection (PPDC) scheme to protect clients' (patients') data. We adopt the (a,k)-anonymity model as the privacy protection scheme for data collection and propose a novel anonymity-based PPDC method for healthcare services in this paper. The threat model is analyzed in the client-server-to-user (CS2U) model. On the client side, we utilize the (a,k)-anonymity notion to generate anonymous tuples that can resist possible attacks, and adopt a bottom-up clustering method to create clusters that satisfy a base privacy level of (a1,k1)-anonymity. On the server side, we reduce the communication cost through generalization technology, and compress the (a1,k1)-anonymous data through a UPGMA-based cluster combination method to make the data meet the deeper privacy level of (a2,k2)-anonymity (a1 ≥ a2, k2 ≥ k1). Theoretical analysis and experimental results prove that our scheme is effective in privacy preservation and data quality.
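
    As a reading aid for the abstract above, the sketch below checks whether a clustering of collected records satisfies (a,k)-anonymity as it is commonly defined: every cluster contains at least k records, and no sensitive value accounts for more than a fraction a of any cluster. It is an illustrative checker only, not the paper's bottom-up clustering or UPGMA-based combination method, and the field names are assumptions.

    ```python
    from collections import Counter

    def satisfies_ak_anonymity(clusters, a, k):
        """Return True if every cluster has at least k records and no sensitive
        value appears in more than a fraction `a` of that cluster's records."""
        for cluster in clusters:
            if len(cluster) < k:
                return False
            counts = Counter(record["sensitive"] for record in cluster)
            if max(counts.values()) / len(cluster) > a:
                return False
        return True

    # Usage: check a base privacy level of (a1=0.5, k1=3) on two candidate clusters.
    clusters = [
        [{"sensitive": "hypertension"}, {"sensitive": "diabetes"}, {"sensitive": "flu"}],
        [{"sensitive": "flu"}, {"sensitive": "flu"},
         {"sensitive": "asthma"}, {"sensitive": "diabetes"}],
    ]
    print(satisfies_ak_anonymity(clusters, a=0.5, k=3))  # True
    ```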

  18. Privacy and Security Research Group workshop on network and distributed system security: Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    1993-05-01

    This report contains papers on the following topics: NREN Security Issues: Policies and Technologies; Layer Wars: Protect the Internet with Network Layer Security; Electronic Commission Management; Workflow 2000 - Electronic Document Authorization in Practice; Security Issues of a UNIX PEM Implementation; Implementing Privacy Enhanced Mail on VMS; Distributed Public Key Certificate Management; Protecting the Integrity of Privacy-enhanced Electronic Mail; Practical Authorization in Large Heterogeneous Distributed Systems; Security Issues in the Truffles File System; Issues surrounding the use of Cryptographic Algorithms and Smart Card Applications; Smart Card Augmentation of Kerberos; and An Overview of the Advanced Smart Card Access Control System. Selected papers were processed separately for inclusion in the Energy Science and Technology Database.

  19. Secure Mix-Zones for Privacy Protection of Road Network Location Based Services Users

    Directory of Open Access Journals (Sweden)

    Rubina S. Zuberi

    2016-01-01

    Full Text Available Privacy has been found to be the major impediment, and hence the key area to be worked on, for the provision of Location Based Services in the broad sense. With the emergence of smart, easily portable, communicating devices, information acquisition is reaching new domains. The work presented here extends ongoing work towards achieving privacy for today's emerging communication techniques. It emphasizes one of the most effective real-time privacy enhancement techniques, called Mix-Zones. In this paper, we present a model of a secure road network with Mix-Zones that are activated on the basis of spatial as well as temporal factors. The temporal factors are determined by the amount of traffic and its flow. The paper also discusses the importance of the number of Mix-Zones a user traverses and their mixing effectiveness. Using simulations, which are required for the real-time treatment of the problem, we also show that the proposed transient Mix-Zones are part of a viable and robust solution towards road network privacy protection for today's communicating moving objects.

  20. Privacy encounters in Teledialogue

    DEFF Research Database (Denmark)

    Andersen, Lars Bo; Bøge, Ask Risom; Danholt, Peter

    2017-01-01

    Privacy is a major concern when new technologies are introduced between public authorities and private citizens. What is meant by privacy, however, is often unclear and contested. Accordingly, this article utilises grounded theory to study privacy empirically in the research and design project...... Teledialogue aimed at introducing new ways for public case managers and placed children to communicate through IT. The resulting argument is that privacy can be understood as an encounter, that is, as something that arises between implicated actors and entails some degree of friction and negotiation....... An argument which is further qualified through the philosophy of Gilles Deleuze. The article opens with a review of privacy literature before continuing to present privacy as an encounter with five different foci: what technologies bring into the encounter; who is related to privacy by implication; what...

  1. Protecting Privacy of Shared Epidemiologic Data without Compromising Analysis Potential

    Directory of Open Access Journals (Sweden)

    John Cologne

    2012-01-01

    Full Text Available Objective. Ensuring the privacy of research subjects when epidemiologic data are shared with outside collaborators involves masking (modifying) the data, but overmasking can compromise utility (analysis potential). Methods of statistical disclosure control for protecting privacy may be impractical for individual researchers involved in small-scale collaborations. Methods. We investigated a simple approach based on measures of disclosure risk and analytical utility that are straightforward for epidemiologic researchers to derive. The method is illustrated using data from the Japanese Atomic-bomb Survivor population. Results. Masking by modest rounding did not adequately enhance security, but rounding to remove several digits of relative accuracy effectively reduced the risk of identification without substantially reducing utility. Grouping or adding random noise led to noticeable bias. Conclusions. When sharing epidemiologic data, it is recommended that masking be performed using rounding. The specific treatment should be determined separately in individual situations after consideration of the disclosure risks and analysis needs.

  2. Protecting Privacy of Shared Epidemiologic Data without Compromising Analysis Potential

    International Nuclear Information System (INIS)

    Cologne, J.; Nakashima, E.; Funamoto, S.; Grant, E.J.; Chen, Y.; Hiroaki Katayama, H.

    2012-01-01

    Objective. Ensuring the privacy of research subjects when epidemiologic data are shared with outside collaborators involves masking (modifying) the data, but overmasking can compromise utility (analysis potential). Methods of statistical disclosure control for protecting privacy may be impractical for individual researchers involved in small-scale collaborations. Methods. We investigated a simple approach based on measures of disclosure risk and analytical utility that are straightforward for epidemiologic researchers to derive. The method is illustrated using data from the Japanese Atomic-bomb Survivor population. Results. Masking by modest rounding did not adequately enhance security, but rounding to remove several digits of relative accuracy effectively reduced the risk of identification without substantially reducing utility. Grouping or adding random noise led to noticeable bias. Conclusions. When sharing epidemiologic data, it is recommended that masking be performed using rounding. The specific treatment should be determined separately in individual situations after consideration of the disclosure risks and analysis needs.
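
    The masking recommended in the two records above reduces relative accuracy by rounding shared values to a few significant digits. The sketch below is a minimal illustration of such rounding-based masking; the choice of two significant digits and the field names are assumptions, not the study's settings.

    ```python
    import math

    def round_significant(x, digits=2):
        """Round a value to a fixed number of significant digits."""
        if x == 0:
            return 0.0
        scale = digits - 1 - math.floor(math.log10(abs(x)))
        return round(x, scale)

    def mask_record(record, fields, digits=2):
        """Return a copy of a record with the selected numeric fields rounded."""
        masked = dict(record)
        for f in fields:
            masked[f] = round_significant(masked[f], digits)
        return masked

    # Usage: precision is removed before sharing, limiting re-identification risk
    # while keeping the values usable for analysis.
    print(mask_record({"dose_mgy": 123.47, "followup_years": 38.6},
                      ["dose_mgy", "followup_years"]))
    # {'dose_mgy': 120.0, 'followup_years': 39.0}
    ```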

  3. 34 CFR 98.4 - Protection of students' privacy in examination, testing, or treatment.

    Science.gov (United States)

    2010-07-01

    ... 34 Education 1 2010-07-01 2010-07-01 false Protection of students' privacy in examination, testing, or treatment. 98.4 Section 98.4 Education Office of the Secretary, Department of Education STUDENT... are not directly related to academic instruction and that is designed to affect behavioral, emotional...

  4. Towards Self-Awareness Privacy Protection for Internet of Things Data Collection

    Directory of Open Access Journals (Sweden)

    Kok-Seng Wong

    2014-01-01

    Full Text Available The Internet of Things (IoT) is an emerging global Internet-based information architecture used to facilitate the exchange of goods and services. IoT-related applications aim to bring technology to people anytime and anywhere, with any device. However, the use of IoT raises a privacy concern because data will be collected automatically from network devices and objects that are embedded with IoT technologies. In current applications, the data collector is a dominant player who enforces a secure protocol that cannot be verified by the data owners. In view of this, some respondents might refuse to contribute their personal data or might submit inaccurate data. In this paper, we study a self-awareness data collection protocol to raise the confidence of respondents when submitting their personal data to the data collector. Our self-awareness protocol requires each respondent to help others in preserving their privacy. The communication (between respondents and the data collector) and collaboration (among respondents) in our solution are performed automatically.

  5. Protecting Privacy in Shared Photos via Adversarial Examples Based Stealth

    Directory of Open Access Journals (Sweden)

    Yujia Liu

    2017-01-01

    Full Text Available Online image sharing on social platforms can lead to undesired privacy disclosure. For example, some enterprises may analyze these large volumes of uploaded images to perform in-depth analysis of users' preferences for commercial purposes, and their technology might be today's most powerful learning model, the deep neural network (DNN). To elude these automatic DNN detectors without affecting the visual quality perceived by human eyes, we design and implement a novel Stealth algorithm, which makes the automatic detector blind to the existence of objects in an image by crafting a kind of adversarial example. It is as if all objects disappear after wearing an "invisible cloak", from the point of view of the detector. We then evaluate the effectiveness of the Stealth algorithm through our newly defined measurement, named privacy insurance. The results indicate that our scheme has a considerable success rate in guaranteeing privacy compared with other methods, such as mosaic, blur, and noise. Better still, the Stealth algorithm has the smallest impact on image visual quality. Meanwhile, we provide a user-adjustable parameter called cloak thickness for regulating the perturbation intensity. Furthermore, we find that the processed images have a transferability property; that is, the adversarial images generated for one particular DNN will influence others as well.
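
    The Stealth algorithm itself is not reproduced here, but the general idea of perturbing an image so that a detector's object scores drop can be sketched with a standard one-step gradient-sign attack in PyTorch. Everything below, including the model, the loss, and the epsilon that loosely plays the role of the cloak thickness, is a hypothetical stand-in rather than the paper's method.

    ```python
    import torch

    def stealth_like_perturbation(model, image, objectness_loss, epsilon=0.02):
        """Craft a small perturbation that pushes a detector's objective downward.

        model: a differentiable detector returning per-image object scores.
        image: tensor of shape (1, 3, H, W) with values in [0, 1].
        objectness_loss: maps the model output to a scalar to be minimized,
                         e.g. the sum of objectness scores.
        epsilon: perturbation budget, loosely analogous to the cloak thickness.
        """
        image = image.clone().detach().requires_grad_(True)
        loss = objectness_loss(model(image))
        loss.backward()
        # Step against the gradient so the detector's object scores drop,
        # then clamp back to a valid image.
        adversarial = image - epsilon * image.grad.sign()
        return adversarial.clamp(0.0, 1.0).detach()
    ```

    A practical attack would iterate this step under a perceptual constraint so that the change stays invisible to human viewers, which is the balance the cloak-thickness parameter is meant to control.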

  6. 78 FR 3015 - Privacy Act of 1974; U.S. Customs and Border Protection; DHS/CBP-004-Intellectual Property Rights...

    Science.gov (United States)

    2013-01-15

    ... Search Systems, System of Records AGENCY: Department of Homeland Security, Privacy Office. ACTION: Notice... and Border Protection, Mint Annex, 799 9th Street NW., Washington, DC 20229-1177. For privacy issues... Property Rights Internal Search (IPRiS) system. IPRS provides a web-based search engine for the public to...

  7. Efficient Method of Achieving Agreements between Individuals and Organizations about RFID Privacy

    Science.gov (United States)

    Cha, Shi-Cho

    This work presents novel technical and legal approaches that address privacy concerns for personal data in RFID systems. In recent years, to minimize the conflict between convenience and the privacy risk of RFID systems, organizations have been requested to disclose their policies regarding RFID activities, obtain customer consent, and adopt appropriate mechanisms to enforce these policies. However, current research on RFID typically focuses on enforcement mechanisms to protect personal data stored in RFID tags and prevent organizations from tracking user activity through information emitted by specific RFID tags. A missing piece is how organizations can obtain customers' consent efficiently and flexibly. This study recommends that organizations obtain licenses automatically or semi-automatically before collecting personal data via RFID technologies rather than deal with written consents. Such digitalized and standard licenses can be checked automatically to ensure that collection and use of personal data is based on user consent. While individuals can easily control who has licenses and license content, the proposed framework provides an efficient and flexible way to overcome the deficiencies in current privacy protection technologies for RFID systems.

  8. New Collaborative Filtering Algorithms Based on SVD++ and Differential Privacy

    Directory of Open Access Journals (Sweden)

    Zhengzheng Xian

    2017-01-01

    Full Text Available Collaborative filtering technology is widely used in recommender systems, and its implementation is supported by the large amounts of real and reliable user data of the big-data era. However, as users' information-security awareness increases, these data are reduced or their quality becomes worse. Singular Value Decomposition (SVD) is one of the common matrix factorization methods used in collaborative filtering; it introduces bias information for users and items and is realized using algebraic feature extraction. SVD++, a derivative model of SVD, achieves better predictive accuracy due to the addition of implicit feedback information. Differential privacy is defined very strictly and can be proved; it has become an effective measure against attackers who indirectly deduce personal private information using background knowledge. In this paper, differential privacy is applied to the SVD++ model through three approaches: gradient perturbation, objective-function perturbation, and output perturbation. Through theoretical derivation and experimental verification, the proposed algorithms better protect the privacy of the original data while maintaining predictive accuracy. In addition, an effective scheme is given that can measure the privacy protection strength and predictive accuracy, and a reasonable range for selection of the differential privacy parameter is provided.
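
    Of the three approaches listed, gradient perturbation is the easiest to sketch: during stochastic gradient descent on a matrix-factorization model, calibrated noise is added to every gradient before it is applied. The code below shows that idea on a plain matrix-factorization update rather than the full SVD++ model with biases and implicit feedback, and the learning rate, clipping bound, and Laplace noise scale are illustrative assumptions, not the paper's settings.

    ```python
    import numpy as np

    def dp_mf_sgd(ratings, n_users, n_items, k=8, epochs=20,
                  lr=0.01, reg=0.05, clip=1.0, noise_scale=0.1, seed=0):
        """Matrix factorization trained with clipped, Laplace-perturbed gradients.

        ratings: list of (user, item, rating) triples.
        noise_scale: Laplace scale; larger values mean stronger perturbation
                     (more privacy) at the cost of predictive accuracy.
        """
        rng = np.random.default_rng(seed)
        P = 0.1 * rng.standard_normal((n_users, k))  # user factors
        Q = 0.1 * rng.standard_normal((n_items, k))  # item factors

        for _ in range(epochs):
            for u, i, r in ratings:
                err = r - P[u] @ Q[i]
                grad_p = -err * Q[i] + reg * P[u]
                grad_q = -err * P[u] + reg * Q[i]
                # Clip each gradient, then perturb it before the update.
                grad_p = np.clip(grad_p, -clip, clip) + rng.laplace(0, noise_scale, k)
                grad_q = np.clip(grad_q, -clip, clip) + rng.laplace(0, noise_scale, k)
                P[u] -= lr * grad_p
                Q[i] -= lr * grad_q
        return P, Q

    # Usage: a toy rating matrix with three users and three items on a 1-5 scale.
    P, Q = dp_mf_sgd([(0, 0, 5), (0, 1, 3), (1, 1, 4), (2, 2, 2)], n_users=3, n_items=3)
    print(P @ Q.T)  # noisy predicted ratings
    ```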

  9. Customer privacy on UK healthcare websites.

    Science.gov (United States)

    Mundy, Darren P

    2006-09-01

    Privacy has been and continues to be one of the key challenges of an age devoted to the accumulation, processing, and mining of electronic information. In particular, privacy of healthcare-related information is seen as a key issue as health organizations move towards the electronic provision of services. The aim of the research detailed in this paper has been to analyse privacy policies on popular UK healthcare-related websites to determine the extent to which consumer privacy is protected. The author has combined approaches (such as approaches focused on usability, policy content, and policy quality) used in studies by other researchers on e-commerce and US healthcare websites to provide a comprehensive analysis of UK healthcare privacy policies. The author identifies a wide range of issues related to the protection of consumer privacy through his research analysis using quantitative results. The main outcomes from the author's research are that only 61% of healthcare-related websites in their sample group posted privacy policies. In addition, most of the posted privacy policies had poor readability standards and included a variety of privacy vulnerability statements. Overall, the author's findings represent significant current issues in relation to healthcare information protection on the Internet. The hope is that raising awareness of these results will drive forward changes in the industry, similar to those experienced with information quality.

  10. Privacy Protection: Mandating New Arrangements to Implement and Assess Federal Privacy Policy and Practice

    National Research Council Canada - National Science Library

    Relyea, Harold C

    2004-01-01

    When Congress enacted the Privacy Act of 1974, it established a temporary national study commission to conduct a comprehensive assessment of privacy policy and practice in both the public and private...

  11. Privacy in an Ambient World

    NARCIS (Netherlands)

    Dekker, M.A.C.; Etalle, Sandro; den Hartog, Jeremy

    Privacy is a prime concern in today's information society. To protect the privacy of individuals, enterprises must follow certain privacy practices, while collecting or processing personal data. In this chapter we look at the setting where an enterprise collects private data on its website,

  12. The Models of Applying Online Privacy Literacy Strategies: A Case Study of Instagram Girl Users

    Directory of Open Access Journals (Sweden)

    Abdollah Bicharanlou

    2017-09-01

    Full Text Available Social networks have a remarkable effect on the lives of virtual-space users. Like most human relations, these networks involve a compromise between self-disclosure and privacy protection, a process which is realized through improving privacy and empowering the user at the personal level. This study aimed to assess strategies based on online privacy literacy, in particular the strategies that young female Instagram users should employ to achieve an optimal level of privacy. For this purpose, the paradox of privacy and the benefits and risks of self-disclosure are first explained; then, drawing on online privacy literacy, some social and technological strategies are introduced by which users can resolve the "paradox of privacy." In the results section, after describing the main benefits and risks of self-disclosure by female users, the current models of using these social and technological strategies to resolve the mentioned paradox are discussed. The research method is ethnography, based on non-collaborative observation of Instagram pages and semi-structured interviews with 20 female users of social networks.

  13. Privacy and Innovation

    OpenAIRE

    Avi Goldfarb; Catherine Tucker

    2011-01-01

    Information and communication technology now enables firms to collect detailed and potentially intrusive data about their customers both easily and cheaply. This means that privacy concerns are no longer limited to government surveillance and public figures' private lives. The empirical literature on privacy regulation shows that privacy regulation may affect the extent and direction of data-based innovation. We also show that the impact of privacy regulation can be extremely heterogeneous. T...

  14. Privacy vs. Reward in Indoor Location-Based Services

    Directory of Open Access Journals (Sweden)

    Fawaz Kassem

    2016-10-01

    Full Text Available With the advance of indoor localization technology, indoor location-based services (ILBS) are gaining popularity. They, however, bring privacy concerns with them. ILBS providers track users' mobility to learn more about their behavior and then provide them with improved and personalized services. Our survey of 200 individuals highlighted their concerns about this tracking and the potential leakage of their personal/private traits, but also showed their willingness to accept reduced tracking in exchange for improved service. In this paper, we propose PR-LBS (Privacy vs. Reward for Location-Based Service), a system that addresses these seemingly conflicting requirements by balancing the users' privacy concerns and the benefits of sharing location information in indoor location tracking environments. PR-LBS relies on a novel location-privacy criterion to quantify the privacy risks of sharing indoor location information. It also employs a repeated play model to ensure that the received service is proportionate to the privacy risk. We implement and evaluate PR-LBS extensively with various real-world user mobility traces. Results show that PR-LBS has low overhead, protects the users' privacy, and makes a good tradeoff between the quality of service for the users and the utility of shared location data for service providers.

  15. Minding the Gap: The Growing Divide Between Privacy and Surveillance Technology

    Science.gov (United States)

    2013-06-01

    issues regarding evolving technology remain unaddressed. George Orwell saw government as big brother—all watching (Orwell, 1949). However, complex...action, and information. Within the broad realm of the literature on privacy, inclusive of the works of Thomas Locke through Margaret Mead, more...review (Herbert, 2011, pp. 448–450). Additionally, as newer technology emerges, the level and degree of the government's physical intrusion into

  16. How can hospitals better protect the privacy of electronic medical records? Perspectives from staff members of health information management departments.

    Science.gov (United States)

    Sher, Ming-Ling; Talley, Paul C; Cheng, Tain-Junn; Kuo, Kuang-Ming

    2017-05-01

    The adoption of electronic medical records (EMR) is expected to better improve overall healthcare quality and to offset the financial pressure of excessive administrative burden. However, safeguarding EMR against potentially hostile security breaches from both inside and outside healthcare facilities has created increased patients' privacy concerns from all sides. The aim of our study was to examine the influencing factors of privacy protection for EMR by healthcare professionals. We used survey methodology to collect questionnaire responses from staff members in health information management departments among nine Taiwanese hospitals active in EMR utilisation. A total of 209 valid responses were collected in 2014. We used partial least squares for analysing the collected data. Perceived benefits, perceived barriers, self-efficacy and cues to action were found to have a significant association with intention to protect EMR privacy, while perceived susceptibility and perceived severity were not. Based on the findings obtained, we suggest that hospitals should provide continuous ethics awareness training to relevant staff and design more effective strategies for improving the protection of EMR privacy in their charge. Further practical and research implications are also discussed.

  17. 76 FR 66940 - Privacy Act of 1974; Department of Homeland Security/United States Secret Service-004 Protection...

    Science.gov (United States)

    2011-10-28

    ... DEPARTMENT OF HOMELAND SECURITY Office of the Secretary [Docket No. DHS-2011-0083] Privacy Act of 1974; Department of Homeland Security/United States Secret Service--004 Protection Information System... Security (DHS)/United States Secret Service (USSS)-004 System name: DHS/USSS-004 Protection Information...

  18. Ensuring privacy in the study of pathogen genetics.

    Science.gov (United States)

    Mehta, Sanjay R; Vinterbo, Staal A; Little, Susan J

    2014-08-01

    Rapid growth in the genetic sequencing of pathogens in recent years has led to the creation of large sequence databases. This aggregated sequence data can be very useful for tracking and predicting epidemics of infectious diseases. However, the balance between the potential public health benefit and the risk to personal privacy for individuals whose genetic data (personal or pathogen) are included in such work has been difficult to delineate, because neither the true benefit nor the actual risk to participants has been adequately defined. Existing approaches to minimise the risk of privacy loss to participants are based on de-identification of data by removal of a predefined set of identifiers. These approaches neither guarantee privacy nor protect the usefulness of the data. We propose a new approach to privacy protection that will quantify the risk to participants, while still maximising the usefulness of the data to researchers. This emerging standard in privacy protection and disclosure control, which is known as differential privacy, uses a process-driven rather than data-centred approach to protecting privacy. Copyright © 2014 Elsevier Ltd. All rights reserved.
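    As a minimal illustration of the differential-privacy approach the authors advocate, the sketch below releases a count query over a hypothetical sequence database using the Laplace mechanism with sensitivity 1. The database layout and predicate are invented for the example; this is not the authors' system.

```python
import numpy as np

def dp_count(records, predicate, epsilon):
    """Release the number of records matching `predicate` under epsilon-DP.

    A counting query has sensitivity 1 (adding or removing one participant
    changes the count by at most 1), so Laplace noise with scale 1/epsilon
    suffices.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + np.random.laplace(loc=0.0, scale=1.0 / epsilon)

# Hypothetical pathogen-sequence records: (participant_id, viral_subtype)
db = [(i, "B" if i % 3 else "C") for i in range(300)]
noisy = dp_count(db, lambda r: r[1] == "C", epsilon=0.5)
print(f"noisy count of subtype C sequences: {noisy:.1f}")
```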

  19. The privacy implications of Bluetooth

    OpenAIRE

    Kostakos, Vassilis

    2008-01-01

    A substantial amount of research, as well as media hype, has surrounded RFID technology and its privacy implications. Currently, researchers and the media focus on the privacy threats posed by RFID, while consumer groups choose to boycott products bearing RFID tags. At the same time, however, a very similar technology has quietly become part of our everyday lives: Bluetooth. In this paper we highlight the fact that Bluetooth is a widespread technology that has real privacy implications. Furthermor...

  20. Patient Privacy in the Era of Big Data

    Directory of Open Access Journals (Sweden)

    Mehmet Kayaalp

    2018-02-01

    Full Text Available Protecting patient privacy requires various technical tools. It involves regulations for sharing, de-identifying, securely storing, transmitting and handling protected health information (PHI). It involves privacy laws and legal agreements. It requires establishing rules for monitoring privacy leaks, determining actions when they occur, and handling de-identified clinical narrative reports. Deidentification is one such indispensable instrument in this set of privacy tools.

  1. Practical Privacy Assessment

    DEFF Research Database (Denmark)

    Peen, Søren; Jansen, Thejs Willem; Jensen, Christian D.

    2008-01-01

    This chapter proposes a privacy assessment model called the Operational Privacy Assessment Model that includes organizational, operational and technical factors for the protection of personal data stored in an IT system. The factors can be evaluated on a simple scale, so that not only can a graphical depiction be easily created for an IT system, but graphical comparisons across multiple IT systems are also possible. Examples of factors presented in a Kiviat graph are also given. This assessment tool may be used to standardize privacy assessment criteria, making it less painful for management to assess privacy risks on their systems.

  2. Radio frequency identification (RFID) in health care: privacy and security concerns limiting adoption.

    Science.gov (United States)

    Rosenbaum, Benjamin P

    2014-03-01

    Radio frequency identification (RFID) technology has been implemented in a wide variety of industries. Health care is no exception. This article explores implementations and limitations of RFID in several health care domains: authentication, medication safety, patient tracking, and blood transfusion medicine. Each domain has seen increasing utilization of unique applications of RFID technology. Given the importance of protecting patient and data privacy, potential privacy and security concerns in each domain are discussed. Such concerns, some of which are inherent to existing RFID hardware and software technology, may limit ubiquitous adoption. In addition, an apparent lack of security standards within the RFID domain and specifically health care may also hinder the growth and utility of RFID within health care for the foreseeable future. Safeguarding the privacy of patient data may be the most important obstacle to overcome to allow the health care industry to take advantage of the numerous benefits RFID technology affords.

  3. Computer-Aided Identification and Validation of Privacy Requirements

    Directory of Open Access Journals (Sweden)

    Rene Meis

    2016-05-01

    Full Text Available Privacy is a software quality that is closely related to security. The main difference is that security properties aim at the protection of assets that are crucial for the considered system, and privacy aims at the protection of personal data that are processed by the system. The identification of privacy protection needs in complex systems is a hard and error-prone task. Stakeholders whose personal data are processed might be overlooked, or the sensitivity and the need of protection of the personal data might be underestimated. The later personal data and the needs to protect them are identified during the development process, the more expensive it is to fix these issues, because the needed changes of the system-to-be often affect many functionalities. In this paper, we present a systematic method to identify the privacy needs of a software system based on a set of functional requirements by extending the problem-based privacy analysis (ProPAn) method. Our method is tool-supported and automated where possible to reduce the effort that has to be spent for the privacy analysis, which is especially important when considering complex systems. The contribution of this paper is a semi-automatic method to identify the relevant privacy requirements for a software-to-be based on its functional requirements. The considered privacy requirements address all dimensions of privacy that are relevant for software development. As our method is solely based on the functional requirements of the system to be, we enable users of our method to identify the privacy protection needs that have to be addressed by the software-to-be at an early stage of the development. As initial evaluation of our method, we show its applicability on a small electronic health system scenario.

  4. 78 FR 57319 - Children's Online Privacy Protection Rule Safe Harbor Proposed Self-Regulatory Guidelines...

    Science.gov (United States)

    2013-09-18

    ...-AB20 Children's Online Privacy Protection Rule Safe Harbor Proposed Self-Regulatory Guidelines; kidSAFE... proposed self-regulatory guidelines submitted by the kidSAFE Seal Program (``kidSAFE''), owned and operated... enabling industry groups or others to submit to the Commission for approval self-regulatory guidelines that...

  5. Privacy is an essentially contested concept: a multi-dimensional analytic for mapping privacy

    Science.gov (United States)

    Koopman, Colin; Doty, Nick

    2016-01-01

    The meaning of privacy has been much disputed throughout its history in response to wave after wave of new technological capabilities and social configurations. The current round of disputes over privacy fuelled by data science has been a cause of despair for many commentators and a death knell for privacy itself for others. We argue that privacy’s disputes are neither an accidental feature of the concept nor a lamentable condition of its applicability. Privacy is essentially contested. Because it is, privacy is transformable according to changing technological and social conditions. To make productive use of privacy’s essential contestability, we argue for a new approach to privacy research and practical design, focused on the development of conceptual analytics that facilitate dissecting privacy’s multiple uses across multiple contexts. This article is part of the themed issue ‘The ethical impact of data science’. PMID:28336797

  6. Electronic Communication of Protected Health Information: Privacy, Security, and HIPAA Compliance.

    Science.gov (United States)

    Drolet, Brian C; Marwaha, Jayson S; Hyatt, Brad; Blazar, Phillip E; Lifchez, Scott D

    2017-06-01

    Technology has enhanced modern health care delivery, particularly through accessibility to health information and ease of communication with tools like mobile device messaging (texting). However, text messaging has created new risks for breach of protected health information (PHI). In the current study, we sought to evaluate hand surgeons' knowledge and compliance with privacy and security standards for electronic communication by text message. A cross-sectional survey of the American Society for Surgery of the Hand membership was conducted in March and April 2016. Descriptive and inferential statistical analyses were performed on composite results as well as relevant subgroup analyses. A total of 409 responses were obtained (11% response rate). Although 63% of surgeons reported that they believe that text messaging does not meet Health Insurance Portability and Accountability Act of 1996 security standards, only 37% reported they do not use text messages to communicate PHI. Younger surgeons and respondents who believed that their texting was compliant were statistically significantly more likely to report messaging of PHI (odds ratio, 1.59 and 1.22, respectively). A majority of hand surgeons in this study reported the use of text messaging to communicate PHI. Of note, neither the Health Insurance Portability and Accountability Act of 1996 statute nor US Department of Health and Human Services specifically prohibits this form of electronic communication. To be compliant, surgeons, practices, and institutions need to take reasonable security precautions to prevent breach of privacy with electronic communication. Communication of clinical information by text message is not prohibited under Health Insurance Portability and Accountability Act of 1996, but surgeons should use appropriate safeguards to prevent breach when using this form of communication. Copyright © 2017 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.

  7. Data Transmission and Access Protection of Community Medical Internet of Things

    OpenAIRE

    Wang, Xunbao; Chen, Fulong; Ye, Heping; Yang, Jie; Zhu, Junru; Zhang, Ziyang; Huang, Yakun

    2017-01-01

    On the basis of Internet of Things (IoT) technologies, Community Medical Internet of Things (CMIoT) is a new medical information system and generates massive multiple types of medical data which contain all kinds of user identity data, various types of medical data, and other sensitive information. To effectively protect users’ privacy, we propose a secure privacy data protection scheme including transmission protection and access control. For the uplink transmission data protection, bidirect...

  8. Educational Research on the Technological Dimension of Private Life

    OpenAIRE

    Liliana Mâţă

    2010-01-01

    Following the development of new technologies in recent decades, a number of innovative elements have appeared that also threaten privacy: bank cards, personal computers, communication networks, the internet, digital signatures, email, and surveillance systems for children. The structure of the technological dimension of private life can be represented by the following elements (Figure 2): personal objects technology (material itSelf), electronic identity, personal blog (personal Self), spec...

  9. Protecting Privacy in Shared Photos via Adversarial Examples Based Stealth

    OpenAIRE

    Liu, Yujia; Zhang, Weiming; Yu, Nenghai

    2017-01-01

    Online image sharing in social platforms can lead to undesired privacy disclosure. For example, some enterprises may analyze these large volumes of uploaded images to perform in-depth analysis of users’ preferences for commercial purposes. And their technology might be today’s most powerful learning model, the deep neural network (DNN). To elude these automatic DNN detectors without affecting the visual quality perceived by human eyes, we design and implement a novel Stealth algorithm, which makes the automatic det...

  10. Public assessment of new surveillance-oriented security technologies: Beyond the trade-off between privacy and security.

    Science.gov (United States)

    Pavone, Vincenzo; Esposti, Sara Degli

    2012-07-01

    As surveillance-oriented security technologies (SOSTs) are considered security enhancing but also privacy infringing, citizens are expected to trade part of their privacy for higher security. Drawing from the PRISE project, this study casts some light on how citizens actually assess SOSTs through a combined analysis of focus groups and survey data. First, the outcomes suggest that people did not assess SOSTs in abstract terms but in relation to the specific institutional and social context of implementation. Second, from this embedded viewpoint, citizens either expressed concern about government's surveillance intentions and considered SOSTs mainly as privacy infringing, or trusted political institutions and believed that SOSTs effectively enhanced their security. None of them, however, seemed to trade privacy for security because concerned citizens saw their privacy being infringed without having their security enhanced, whilst trusting citizens saw their security being increased without their privacy being affected.

  11. 76 FR 11435 - Privacy Act of 1974; Computer Matching Program

    Science.gov (United States)

    2011-03-02

    ... Security Administration. SUMMARY: Pursuant to the Computer Matching and Privacy Protection Act of 1988, Public Law 100-503, the Computer Matching and Privacy Protections Amendments of 1990, Pub. L. 101-508... Interpreting the Provisions of Public Law 100-503, the Computer Matching and Privacy Protection Act of 1988...

  12. Protection of Location Privacy Based on Distributed Collaborative Recommendations.

    Science.gov (United States)

    Wang, Peng; Yang, Jing; Zhang, Jian-Pei

    2016-01-01

    In the existing centralized location services system structure, the server is easily attacked and becomes the communication bottleneck, which can cause disclosure of users' locations. To address this, we present a new distributed collaborative recommendation strategy built on a distributed system. In this strategy, each node establishes a profile of its own location information. When a request for location services appears, the user can obtain the corresponding location services according to the recommendation of the neighboring users' location information profiles. If no suitable recommended location service results are obtained, then the user can send a service request to the server according to the construction of a k-anonymous data set with a centroid position of the neighbors. In this strategy, we designed a new model of distributed collaborative recommendation location service based on the users' location information profiles and used generalization and encryption to ensure the safety of the user's location information privacy. Finally, we used a real location data set for theoretical and experimental analysis. The results show that the strategy proposed in this paper is capable of reducing the frequency of access to the location server, providing better location services and better protecting the user's location privacy.
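    A hedged sketch of the fallback step described above: the user forms a k-anonymous set from its own position plus at least k-1 neighboring users and queries the server with the centroid of that set. The Peer structure and coordinates are hypothetical, and the paper's generalization and encryption steps are omitted.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Peer:
    user_id: str
    lat: float
    lon: float

def k_anonymous_centroid(own: Tuple[float, float],
                         neighbors: List[Peer], k: int) -> Tuple[float, float]:
    """Build a cloaked query position from the user's own location plus at
    least k-1 neighboring users, and return the centroid of that set."""
    if len(neighbors) < k - 1:
        raise ValueError("not enough neighbors to form a k-anonymous set")
    group = [own] + [(p.lat, p.lon) for p in neighbors[:k - 1]]
    lat = sum(p[0] for p in group) / len(group)
    lon = sum(p[1] for p in group) / len(group)
    return lat, lon

# Example: cloak a request among 4 nearby peers (k = 5).
peers = [Peer("u1", 45.801, 15.970), Peer("u2", 45.802, 15.972),
         Peer("u3", 45.799, 15.969), Peer("u4", 45.803, 15.971)]
print(k_anonymous_centroid((45.800, 15.971), peers, k=5))
```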

  13. INSPIRATIONS OF THE FRAMEWORK OF INTERNET PRIVACY PROTECTION IN AMERICA

    Institute of Scientific and Technical Information of China (English)

    王忠

    2013-01-01

    The background and main content of "Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy", released by the White House, are introduced. Combined with the actual situation of online privacy protection in China, measures are proposed to promote positive interaction between China's online privacy protection and industrial innovation.

  14. Governance Through Privacy, Fairness, and Respect for Individuals.

    Science.gov (United States)

    Baker, Dixie B; Kaye, Jane; Terry, Sharon F

    2016-01-01

    Individuals have a moral claim to be involved in the governance of their personal data. Individuals' rights include privacy, autonomy, and the ability to choose for themselves how they want to manage risk, consistent with their own personal values and life situations. The Fair Information Practices principles (FIPPs) offer a framework for governance. Privacy-enhancing technology that complies with applicable law and FIPPs offers a dynamic governance tool for enabling the fair and open use of individual's personal data. Any governance model must protect against the risks posed by data misuse. Individual perceptions of risks are a subjective function involving individuals' values toward self, family, and society, their perceptions of trust, and their cognitive decision-making skills. Individual privacy protections and individuals' right to choose are codified in the HIPAA Privacy Rule, which attempts to strike a balance between the dual goals of information flow and privacy protection. The choices most commonly given individuals regarding the use of their health information are binary ("yes" or "no") and immutable. Recent federal recommendations and law recognize the need for granular, dynamic choices. Individuals expect that they will govern the use of their own health and genomic data. Failure to build and maintain individuals' trust increases the likelihood that they will refuse to grant permission to access or use their data. The "no surprises principle" asserts that an individual's personal information should never be collected, used, transmitted, or disclosed in a way that would surprise the individual were she to learn about it. The FIPPs provide a powerful framework for enabling data sharing and use, while maintaining trust. We introduce the eight FIPPs adopted by the Department of Health and Human Services, and provide examples of their interpretation and implementation. Privacy risk and health risk can be reduced by giving consumers control, autonomy, and

  15. Privacy and confidentiality in pragmatic clinical trials.

    Science.gov (United States)

    McGraw, Deven; Greene, Sarah M; Miner, Caroline S; Staman, Karen L; Welch, Mary Jane; Rubel, Alan

    2015-10-01

    With pragmatic clinical trials, an opportunity exists to answer important questions about the relative risks, burdens, and benefits of therapeutic interventions. However, concerns about protecting the privacy of this information are significant and must be balanced with the imperative to learn from the data gathered in routine clinical practice. Traditional privacy protections for research uses of identifiable information rely disproportionately on informed consent or authorizations, based on a presumption that this is necessary to fulfill ethical principles of respect for persons. But frequently, the ideal of informed consent is not realized in its implementation. Moreover, the principle of respect for persons—which encompasses their interests in health information privacy—can be honored through other mechanisms. Data anonymization also plays a role in protecting privacy but is not suitable for all research, particularly pragmatic clinical trials. In this article, we explore both the ethical foundation and regulatory framework intended to protect privacy in pragmatic clinical trials. We then review examples of novel approaches to respecting persons in research that may have the added benefit of honoring patient privacy considerations. © The Author(s) 2015.

  16. Identifying genetic relatives without compromising privacy.

    Science.gov (United States)

    He, Dan; Furlotte, Nicholas A; Hormozdiari, Farhad; Joo, Jong Wha J; Wadia, Akshay; Ostrovsky, Rafail; Sahai, Amit; Eskin, Eleazar

    2014-04-01

    The development of high-throughput genomic technologies has impacted many areas of genetic research. While many applications of these technologies focus on the discovery of genes involved in disease from population samples, applications of genomic technologies to an individual's genome or personal genomics have recently gained much interest. One such application is the identification of relatives from genetic data. In this application, genetic information from a set of individuals is collected in a database, and each pair of individuals is compared in order to identify genetic relatives. An inherent issue that arises in the identification of relatives is privacy. In this article, we propose a method for identifying genetic relatives without compromising privacy by taking advantage of novel cryptographic techniques customized for secure and private comparison of genetic information. We demonstrate the utility of these techniques by allowing a pair of individuals to discover whether or not they are related without compromising their genetic information or revealing it to a third party. The idea is that individuals only share enough special-purpose cryptographically protected information with each other to identify whether or not they are relatives, but not enough to expose any information about their genomes. We show in HapMap and 1000 Genomes data that our method can recover first- and second-order genetic relationships and, through simulations, show that our method can identify relationships as distant as third cousins while preserving privacy.

  17. Advanced research in data privacy

    CERN Document Server

    Torra, Vicenç

    2015-01-01

    This book provides an overview of the research work on data privacy and privacy enhancing technologies carried by the participants of the ARES project. ARES (Advanced Research in Privacy and Security, CSD2007-00004) has been one of the most important research projects funded by the Spanish Government in the fields of computer security and privacy. It is part of the now extinct CONSOLIDER INGENIO 2010 program, a highly competitive program which aimed to advance knowledge and open new research lines among top Spanish research groups. The project started in 2007 and finishes in 2014. Composed of 6 research groups from 6 different institutions, it has gathered a significant number of researchers during its lifetime. Among the work produced by the ARES project, one specific work package has been related to privacy. This book gathers works produced by members of the project related to data privacy and privacy enhancing technologies. The presented works not only summarize important research carried in the proje...

  18. Predicting Facebook users' online privacy protection: risk, trust, norm focus theory, and the theory of planned behavior.

    Science.gov (United States)

    Saeri, Alexander K; Ogilvie, Claudette; La Macchia, Stephen T; Smith, Joanne R; Louis, Winnifred R

    2014-01-01

    The present research adopts an extended theory of the planned behavior model that included descriptive norms, risk, and trust to investigate online privacy protection in Facebook users. Facebook users (N = 119) completed a questionnaire assessing their attitude, subjective injunctive norm, subjective descriptive norm, perceived behavioral control, implicit perceived risk, trust of other Facebook users, and intentions toward protecting their privacy online. Behavior was measured indirectly 2 weeks after the study. The data show partial support for the theory of planned behavior and strong support for the independence of subjective injunctive and descriptive norms. Risk also uniquely predicted intentions over and above the theory of planned behavior, but there were no unique effects of trust on intentions, nor of risk or trust on behavior. Implications are discussed.

  19. Survey on Privacy Preserving Techniques for Blockchain Technology

    Institute of Scientific and Technical Information of China (English)

    祝烈煌; 高峰; 沈蒙; 李艳东; 郑宝昆; 毛洪亮; 吴震

    2017-01-01

    Core features of the blockchain technology are "de-centralization" and "de-trusting". As a distributed ledger technology, smart contract infrastructure platform and novel distributed computing paradigm, it can effectively build programmable currency, programmable finance and programmable society, which will have a far-reaching impact on the financial and other fields, and drive a new round of technological and application change. While blockchain technology can improve efficiency, reduce costs and enhance data security, it still faces serious privacy issues which have been widely noted by researchers. This survey first analyzes the technical characteristics of the blockchain, defines the concepts of identity privacy and transaction privacy, points out the advantages and disadvantages of blockchain technology in privacy protection and introduces the attack methods in existing research, such as transaction tracing technology and account clustering technology. We then introduce a variety of privacy mechanisms, including malicious node detection and restricted access technology for the network layer; transaction mixing technology, encryption technology and limited release technology for the transaction layer; and some defense mechanisms for the blockchain application layer. In the end, we discuss the limitations of the existing technologies and envision future directions on this topic. In addition, the regulatory approach to malicious use of blockchain technology is discussed.

  20. Smart Grid Privacy through Distributed Trust

    Science.gov (United States)

    Lipton, Benjamin

    Though the smart electrical grid promises many advantages in efficiency and reliability, the risks to consumer privacy have impeded its deployment. Researchers have proposed protecting privacy by aggregating user data before it reaches the utility, using techniques of homomorphic encryption to prevent exposure of unaggregated values. However, such schemes generally require users to trust in the correct operation of a single aggregation server. We propose two alternative systems based on secret sharing techniques that distribute this trust among multiple service providers, protecting user privacy against a misbehaving server. We also provide an extensive evaluation of the systems considered, comparing their robustness to privacy compromise, error handling, computational performance, and data transmission costs. We conclude that while all the systems should be computationally feasible on smart meters, the two methods based on secret sharing require much less computation while also providing better protection against corrupted aggregators. Building systems using these techniques could help defend the privacy of electricity customers, as well as customers of other utilities as they move to a more data-driven architecture.
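    As an illustration of the secret-sharing idea described above, the following sketch splits each meter reading into additive shares over a finite field, one share per aggregator, so that no single aggregator learns an individual reading while the utility can still recover the exact total. This is a minimal sketch under those assumptions, not either of the paper's full systems; error handling, authentication and defenses against corrupted aggregators are omitted.

```python
import secrets

PRIME = 2**61 - 1  # shares live in a finite field, so one share alone leaks nothing

def share(reading: int, n_aggregators: int):
    """Split one meter reading into additive shares, one per aggregator."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_aggregators - 1)]
    last = (reading - sum(shares)) % PRIME
    return shares + [last]

def aggregate(per_meter_shares):
    """Each aggregator sums the shares it received; the utility then adds
    the partial sums to recover only the total consumption."""
    n_aggr = len(per_meter_shares[0])
    partials = [sum(s[i] for s in per_meter_shares) % PRIME for i in range(n_aggr)]
    return sum(partials) % PRIME

readings = [512, 730, 402, 615]            # watt-hours from four meters
shares = [share(r, n_aggregators=3) for r in readings]
assert aggregate(shares) == sum(readings)  # total recovered, no single reading exposed
```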

  1. When Differential Privacy Meets Randomized Perturbation: A Hybrid Approach for Privacy-Preserving Recommender System

    KAUST Repository

    Liu, Xiao

    2017-03-21

    Privacy risks of recommender systems have attracted increasing attention. Users’ private data is often collected by a possibly untrusted recommender system in order to provide high-quality recommendation. Meanwhile, malicious attackers may utilize recommendation results to make inferences about other users’ private data. Existing approaches focus either on keeping users’ private data protected during recommendation computation or on preventing the inference of any single user’s data from the recommendation result. However, none is designed for both hiding users’ private data and preventing privacy inference. To achieve this goal, we propose in this paper a hybrid approach for privacy-preserving recommender systems by combining differential privacy (DP) with randomized perturbation (RP). We theoretically show the noise added by RP has limited effect on recommendation accuracy and the noise added by DP can be well controlled based on the sensitivity analysis of functions on the perturbed data. Extensive experiments on three large-scale real world datasets show that the hybrid approach generally provides more privacy protection with acceptable recommendation accuracy loss, and surprisingly sometimes achieves better privacy without sacrificing accuracy, thus validating its feasibility in practice.
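    A minimal sketch of the general DP-plus-RP idea, assuming users perturb their ratings locally with zero-mean noise (RP) and the recommender adds Laplace noise calibrated to the sensitivity of per-item means (DP). The noise parameters and the choice of per-item means as the released statistic are illustrative assumptions, not the paper's exact mechanism.

```python
import numpy as np

def perturb_ratings(ratings, rp_sigma):
    """Randomized perturbation: each user adds zero-mean Gaussian noise to
    their ratings before sending them to the recommender."""
    return ratings + np.random.normal(0.0, rp_sigma, size=ratings.shape)

def dp_item_means(perturbed, epsilon, rating_range=4.0):
    """Differentially private per-item mean ratings: Laplace noise is
    calibrated to the sensitivity of the mean over n users (range/n)."""
    n_users = perturbed.shape[0]
    sensitivity = rating_range / n_users
    means = perturbed.mean(axis=0)
    return means + np.random.laplace(0.0, sensitivity / epsilon, size=means.shape)

rng = np.random.default_rng(0)
ratings = rng.integers(1, 6, size=(1000, 5)).astype(float)  # 1000 users, 5 items
noisy_means = dp_item_means(perturb_ratings(ratings, rp_sigma=0.5), epsilon=1.0)
print(np.round(noisy_means, 2))
```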

  2. Minutiae Matching with Privacy Protection Based on the Combination of Garbled Circuit and Homomorphic Encryption

    Directory of Open Access Journals (Sweden)

    Mengxing Li

    2014-01-01

    Full Text Available Biometrics plays an important role in authentication applications since they are strongly linked to holders. With an increasing growth of e-commerce and e-government, one can expect that biometric-based authentication systems are possibly deployed over the open networks in the near future. However, due to its openness, the Internet poses a great challenge to the security and privacy of biometric authentication. Biometric data cannot be revoked, so it is of paramount importance that biometric data should be handled in a secure way. In this paper we present a scheme achieving privacy-preserving fingerprint authentication between two parties, in which fingerprint minutiae matching algorithm is completed in the encrypted domain. To improve the efficiency, we exploit homomorphic encryption as well as garbled circuits to design the protocol. Our goal is to provide protection for the security of template in storage and data privacy of two parties in transaction. The experimental results show that the proposed authentication protocol runs efficiently. Therefore, the protocol can run over open networks and help to alleviate the concerns on security and privacy of biometric applications over the open networks.

  3. Minutiae matching with privacy protection based on the combination of garbled circuit and homomorphic encryption.

    Science.gov (United States)

    Li, Mengxing; Feng, Quan; Zhao, Jian; Yang, Mei; Kang, Lijun; Wu, Lili

    2014-01-01

    Biometrics plays an important role in authentication applications since they are strongly linked to holders. With an increasing growth of e-commerce and e-government, one can expect that biometric-based authentication systems are possibly deployed over the open networks in the near future. However, due to its openness, the Internet poses a great challenge to the security and privacy of biometric authentication. Biometric data cannot be revoked, so it is of paramount importance that biometric data should be handled in a secure way. In this paper we present a scheme achieving privacy-preserving fingerprint authentication between two parties, in which fingerprint minutiae matching algorithm is completed in the encrypted domain. To improve the efficiency, we exploit homomorphic encryption as well as garbled circuits to design the protocol. Our goal is to provide protection for the security of template in storage and data privacy of two parties in transaction. The experimental results show that the proposed authentication protocol runs efficiently. Therefore, the protocol can run over open networks and help to alleviate the concerns on security and privacy of biometric applications over the open networks.
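    The scheme above combines garbled circuits with homomorphic encryption; the sketch below shows only the additively homomorphic half, computing an encrypted squared Euclidean distance between a probe vector and a server-side template using the python-paillier (`phe`) package, which is assumed to be installed. The final threshold comparison, which the paper performs inside a garbled circuit so that neither party sees the raw distance, is omitted here.

```python
# Requires the `phe` (python-paillier) package: pip install phe
from phe import paillier

def encrypted_squared_distance(enc_x, enc_x_sq, y, public_key):
    """Server side: given E(x_i) and E(x_i^2) from the client and its own
    plaintext template y, compute E(sum_i (x_i - y_i)^2) using only
    homomorphic additions and multiplications by plaintext scalars."""
    total = public_key.encrypt(0)
    for exi, exi2, yi in zip(enc_x, enc_x_sq, y):
        total += exi2 + exi * (-2 * yi) + yi * yi
    return total

# Client side: encrypt the probe feature vector and its squares.
public_key, private_key = paillier.generate_paillier_keypair(n_length=1024)
x = [12, 7, 31, 5]                      # probe minutiae-derived features (toy values)
y = [10, 9, 30, 5]                      # enrolled template held by the server
enc_x = [public_key.encrypt(v) for v in x]
enc_x_sq = [public_key.encrypt(v * v) for v in x]

enc_dist = encrypted_squared_distance(enc_x, enc_x_sq, y, public_key)
print(private_key.decrypt(enc_dist))    # 9 = 4 + 4 + 1 + 0
```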

  4. Privacy enhanced recommender system

    NARCIS (Netherlands)

    Erkin, Zekeriya; Erkin, Zekeriya; Beye, Michael; Veugen, Thijs; Lagendijk, Reginald L.

    2010-01-01

    Recommender systems are widely used in online applications since they enable personalized service to the users. The underlying collaborative filtering techniques work on user’s data which are mostly privacy sensitive and can be misused by the service provider. To protect the privacy of the users, we

  5. Problem of data privacy protection in direct marketing

    Directory of Open Access Journals (Sweden)

    Markov Jasmina

    2011-01-01

    Full Text Available The dynamism of modern business conditions, as well as increasing competition, calls for companies to change their usual ways of doing business and communicating with consumers. Therefore, today's direct marketing industry is experiencing explosive growth, as more and more companies include these activities in their communication mix. Many companies benefit from the development and usage of direct marketing, but at the same time, its growing usage has led to numerous problems for companies as well as for consumers. Direct marketing, advanced information technologies and the Internet, on whose use it is increasingly based, have raised a number of unprecedented legal and ethical questions. One of the issues that makes consumers more and more worried concerns the privacy of their personal data and the information collected by a large number of companies. In addition, consumers are often not aware of this data collection, which adds even more gravity to the problem. The remainder of this paper points to the necessity and great importance of careful and responsible use of consumers' personal data by direct marketers, with the aim of building long-term partnership relationships between the two. In addition, special attention is paid to the major problems that consumers face today in the field of data protection, as well as to the efforts committed to reducing these problems to a minimum by getting consumers more involved in making decisions about the usage of their personal data and information.

  6. Enhancing Privacy for Digital Rights Management

    NARCIS (Netherlands)

    Petkovic, M.; Conrado, C.; Schrijen, G.J.; Jonker, Willem

    2007-01-01

    This chapter addresses privacy issues in DRM systems. These systems provide a means of protecting digital content, but may violate the privacy of users in that the content they purchase and their actions in the system can be linked to specific users. The chapter proposes a privacy-preserving DRM

  7. Information protection playbook

    CERN Document Server

    Kane, Greg

    2013-01-01

    The primary goal of the Information Protection Playbook is to serve as a comprehensive resource for information protection (IP) professionals who must provide adequate information security at a reasonable cost. It emphasizes a holistic view of IP: one that protects the applications, systems, and networks that deliver business information from failures of confidentiality, integrity, availability, trust and accountability, and privacy. Using the guidelines provided in the Information Protection Playbook, security and information technology (IT) managers will learn how to

  8. Data privacy for the smart grid

    CERN Document Server

    Herold, Rebecca

    2015-01-01

    The Smart Grid and Privacy; What Is the Smart Grid?; Changes from Traditional Energy Delivery; Smart Grid Possibilities; Business Model Transformations; Emerging Privacy Risks; The Need for Privacy Policies; Privacy Laws, Regulations, and Standards; Privacy-Enhancing Technologies; New Privacy Challenges; IOT; Big Data; What Is the Smart Grid?; Market and Regulatory Overview; Traditional Electricity Business Sector; The Electricity Open Market; Classifications of Utilities; Rate-Making Processes; Electricity Consumer

  9. Genetic information, non-discrimination, and privacy protections in genetic counseling practice.

    Science.gov (United States)

    Prince, Anya E R; Roche, Myra I

    2014-12-01

    The passage of the Genetic Information Non Discrimination Act (GINA) was hailed as a pivotal achievement that was expected to calm the fears of both patients and research participants about the potential misuse of genetic information. However, 6 years later, patient and provider awareness of legal protections at both the federal and state level remains discouragingly low, thereby, limiting their potential effectiveness. The increasing demand for genetic testing will expand the number of individuals and families who could benefit from obtaining accurate information about the privacy and anti-discriminatory protections that GINA and other laws extend. In this paper we describe legal protections that are applicable to individuals seeking genetic counseling, review the literature on patient and provider fears of genetic discrimination and examine their awareness and understandings of existing laws, and summarize how genetic counselors currently discuss genetic discrimination. We then present three genetic counseling cases to illustrate issues of genetic discrimination and provide relevant information on applicable legal protections. Genetic counselors have an unprecedented opportunity, as well as the professional responsibility, to disseminate accurate knowledge about existing legal protections to their patients. They can strengthen their effectiveness in this role by achieving a greater knowledge of current protections including being able to identify specific steps that can help protect genetic information.

  10. A survey of the SWISS researchers on the impact of sibling privacy protections on pedigree recruitment.

    Science.gov (United States)

    Worrall, Bradford B; Chen, Donna T; Brown, Robert D; Brott, Thomas G; Meschia, James F

    2005-01-01

    To understand the perceptions and attitudes about privacy safeguards in research and investigate the impact of letter-based proband-initiated contact on recruitment, we surveyed researchers in the Siblings With Ischemic Stroke Study (SWISS). All 49 actively recruiting sites provided at least 1 response, and 61% reported that potential probands were enthusiastic. Although 66% of researchers valued proband-initiated contact, only 23% said that probands viewed this strategy as important to protecting the privacy of siblings. A substantial minority of researchers (37%) said the strategy impeded enrollment, and 44% said it was overly burdensome to probands.

  11. Parasiteware: Unlocking Personal Privacy

    Directory of Open Access Journals (Sweden)

    Daniel B. Garrie

    2006-09-01

    Full Text Available Spyware presents a threat of privacy infringement to unassuming internet users irrespective of their country of citizenship. European legislation attempts to protect end-users from unethical processing of their personal data. Spyware technologies, however, skirt these laws and often break them entirely. Outlawing spyware and strengthening the legal consent requirement for mining data are statutory solutions that can prevent spyware users from skirting the law. An internationally standardized technology education system for the judiciaries in Europe and the U.S. can help ensure that when spyware users do break the law, they cannot hide by escaping from one nation to another without being held accountable. Transnational improvements are necessary to remedy the global spyware epidemic.

  12. Data Security and Privacy in Apps for Dementia: An Analysis of Existing Privacy Policies.

    Science.gov (United States)

    Rosenfeld, Lisa; Torous, John; Vahia, Ipsit V

    2017-08-01

    Despite tremendous growth in the number of health applications (apps), little is known about how well these apps protect their users' health-related data. This gap in knowledge is of particular concern for apps targeting people with dementia, whose cognitive impairment puts them at increased risk of privacy breaches. In this article, we determine how many dementia apps have privacy policies and how well they protect user data. Our analysis included all iPhone apps that matched the search terms "medical + dementia" or "health & fitness + dementia" and collected user-generated content. We evaluated all available privacy policies for these apps based on criteria that systematically measure how individual user data is handled. Seventy-two apps met the above search terms and collected user data. Of these, only 33 (46%) had an available privacy policy. Nineteen of the 33 with policies (58%) were specific to the app in question, and 25 (76%) specified how individual-user as opposed to aggregate data would be handled. Among these, there was a preponderance of missing information, the majority acknowledged collecting individual data for internal purposes, and most admitted to instances in which they would share user data with outside parties. At present, the majority of health apps focused on dementia lack a privacy policy, and those that do exist lack clarity. Bolstering safeguards and improving communication about privacy protections will help facilitate consumer trust in apps, thereby enabling more widespread and meaningful use by people with dementia and those involved in their care. Copyright © 2017. Published by Elsevier Inc.

  13. AnonySense: Opportunistic and Privacy-Preserving Context Collection

    DEFF Research Database (Denmark)

    Triandopoulos, Nikolaos; Kapadia, Apu; Cornelius, Cory

    2008-01-01

    We propose AnonySense, a general-purpose architecture for leveraging users' mobile devices for measuring context, while maintaining the privacy of the users. AnonySense features multiple layers of privacy protection: a framework for nodes to receive tasks anonymously, a novel blurring mechanism based on tessellation and clustering to protect users' privacy against the system while reporting context, and k-anonymous report aggregation to improve the users' privacy against applications receiving the context. We outline the architecture and security properties of AnonySense, and focus on evaluating our...
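    A rough sketch of the tessellation-based blurring idea: a node reports only the tile that contains it, so its report is indistinguishable from that of any other user in the same tile. A fixed latitude/longitude grid stands in for AnonySense's actual tessellation, and the cell size is an arbitrary assumption.

```python
def blur_to_tile(lat: float, lon: float, cell_deg: float = 0.01):
    """Replace an exact position with the index of the grid cell (tile)
    that contains it; every user inside the same tile reports the same value."""
    row = int(lat // cell_deg)
    col = int(lon // cell_deg)
    return row, col

def tile_center(tile, cell_deg: float = 0.01):
    row, col = tile
    return (row + 0.5) * cell_deg, (col + 0.5) * cell_deg

exact = (43.70741, -72.28882)            # a precise GPS fix
tile = blur_to_tile(*exact)
print(tile, tile_center(tile))           # coarse tile id and its center
```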

  14. Privacy in domestic environments

    OpenAIRE

    Radics, Peter J; Gracanin, Denis

    2011-01-01

    non-peer-reviewed While there is a growing body of research on privacy, most of the work puts the focus on information privacy. Physical and psychological privacy issues receive little to no attention. However, the introduction of technology into our lives can cause problems with regard to these aspects of privacy. This is especially true when it comes to our homes, both as nodes of our social life and places for relaxation. This paper presents the results of a study intended to captu...

  15. A secure data privacy preservation for on-demand

    Directory of Open Access Journals (Sweden)

    Dhasarathan Chandramohan

    2017-04-01

    Full Text Available This paper spotlights privacy and its obfuscation issues for the intellectual and confidential information owned by the insurance and finance sectors. Privacy is at risk when secret information is misused, and when software intrusions steal digital data in the name of third-party services. Liability for digital secrecy, business continuity, and the prevention of mishandling that breaches privacy are pressing concerns in the cloud, where a huge amount of data is stored and maintained. As the IT world moves toward the cloud, protecting users' privacy is becoming a big question: although cloud computing has changed the computing field by increasing the effectiveness, efficiency and optimization of the service environment, the treatment of cloud users' data and identity, and its reliability, maintainability and privacy, may vary for different CPs (cloud providers). A CP should ensure that the user's proprietary information is kept confidential with current technologies. More remarkably, even the cloud provider may have no indication of where the information and digital data are stored and maintained globally in the cloud. Addressing this is one of the obligatory research issues in cloud computing. We propose the Privacy Preserving Model to Prevent Digital Data Loss in the Cloud (PPM–DDLC). This proposal helps the CR (cloud requester)/users to trust that their proprietary information and data stored in the cloud remain protected.

  16. PRIVACY AS A CULTURAL PHENOMENON

    Directory of Open Access Journals (Sweden)

    Garfield Benjamin

    2017-07-01

    Full Text Available Privacy remains both contentious and ever more pertinent in contemporary society. Yet it persists as an ill-defined term, not only within specific fields but in its various uses and implications between and across technical, legal and political contexts. This article offers a new critical review of the history of privacy in terms of two dominant strands of thinking: freedom and property. These two conceptions of privacy can be seen as successive historical epochs brought together under digital technologies, yielding increasingly complex socio-technical dilemmas. By simplifying the taxonomy to its socio-cultural function, the article provides a generalisable, interdisciplinary approach to privacy. Drawing on new technologies, historical trends, sociological studies and political philosophy, the article presents a discussion of the value of privacy as a term, before proposing a defense of the term cyber security as a mode of scalable cognitive privacy that integrates the relative needs of individuals, governments and corporations.

  17. Privacy and Security in Mobile Health (mHealth) Research.

    Science.gov (United States)

    Arora, Shifali; Yttri, Jennifer; Nilse, Wendy

    2014-01-01

    Research on the use of mobile technologies for alcohol use problems is a developing field. Rapid technological advances in mobile health (or mHealth) research generate both opportunities and challenges, including how to create scalable systems capable of collecting unprecedented amounts of data and conducting interventions-some in real time-while at the same time protecting the privacy and safety of research participants. Although the research literature in this area is sparse, lessons can be borrowed from other communities, such as cybersecurity or Internet security, which offer many techniques to reduce the potential risk of data breaches or tampering in mHealth. More research into measures to minimize risk to privacy and security effectively in mHealth is needed. Even so, progress in mHealth research should not stop while the field waits for perfect solutions.

  18. openPDS: protecting the privacy of metadata through SafeAnswers.

    Directory of Open Access Journals (Sweden)

    Yves-Alexandre de Montjoye

    Full Text Available The rise of smartphones and web services made possible the large-scale collection of personal metadata. Information about individuals' location, phone call logs, or web-searches, is collected and used intensively by organizations and big data researchers. Metadata has however yet to realize its full potential. Privacy and legal concerns, as well as the lack of technical solutions for personal metadata management is preventing metadata from being shared and reconciled under the control of the individual. This lack of access and control is furthermore fueling growing concerns, as it prevents individuals from understanding and managing the risks associated with the collection and use of their data. Our contribution is two-fold: (1) we describe openPDS, a personal metadata management framework that allows individuals to collect, store, and give fine-grained access to their metadata to third parties. It has been implemented in two field studies; (2) we introduce and analyze SafeAnswers, a new and practical way of protecting the privacy of metadata at an individual level. SafeAnswers turns a hard anonymization problem into a more tractable security one. It allows services to ask questions whose answers are calculated against the metadata instead of trying to anonymize individuals' metadata. The dimensionality of the data shared with the services is reduced from high-dimensional metadata to low-dimensional answers that are less likely to be re-identifiable and to contain sensitive information. These answers can then be directly shared individually or in aggregate. openPDS and SafeAnswers provide a new way of dynamically protecting personal metadata, thereby supporting the creation of smart data-driven services and data science research.
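    A toy sketch of the SafeAnswers idea, assuming a personal data store that keeps raw call-log metadata local and returns only a low-dimensional, user-approved answer to an outside service. The class, question name and approval list are invented for illustration and are not the openPDS API.

```python
from datetime import datetime

class PersonalDataStore:
    """Toy personal data store: raw metadata stays inside; only the answer
    to a user-approved question is returned to the requesting service."""

    def __init__(self, call_log):
        self._call_log = call_log            # raw metadata, never exported
        self._approved = {"nocturnal_activity_ratio"}

    def answer(self, question: str):
        if question not in self._approved:
            raise PermissionError("question not approved by the user")
        if question == "nocturnal_activity_ratio":
            night = sum(1 for ts in self._call_log if ts.hour < 6 or ts.hour >= 22)
            return round(night / len(self._call_log), 2)

calls = [datetime(2014, 5, 1, h) for h in (2, 9, 11, 14, 23, 23)]
pds = PersonalDataStore(calls)
print(pds.answer("nocturnal_activity_ratio"))   # 0.5 -- a single scalar leaves the store
```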

  19. Privacy Act

    Science.gov (United States)

    Learn about the Privacy Act of 1974, the Electronic Government Act of 2002, the Federal Information Security Management Act, and other information about how the Environmental Protection Agency maintains its records.

  20. Privacy Implications for Information and Communications Technology (ICT): the Case of the Jordanian E-Government

    OpenAIRE

    Almatarneh, Akram

    2011-01-01

    Information and Communications Technology (ICT) is one of the fastest growing sectors in Jordan. The importance of ICT cannot be ignored as it affects all aspects of Jordanian society including telecommunications, education, banking, commerce and employment. However, the issue of individual privacy in this sector is a particular challenge as individuals are disclosing larger amounts of personal information than ever at a time when there are no specific privacy laws or regulations. This paper i...

  1. Security and privacy preserving in social networks

    CERN Document Server

    Chbeir, Richard

    2013-01-01

    This volume aims at assessing the current approaches and technologies, as well as to outline the major challenges and future perspectives related to the security and privacy protection of social networks. It provides the reader with an overview of the state-of-the art techniques, studies, and approaches as well as outlining future directions in this field. A wide range of interdisciplinary contributions from various research groups ensures for a balanced and complete perspective.

  2. Disentangling privacy from property: toward a deeper understanding of genetic privacy.

    Science.gov (United States)

    Suter, Sonia M

    2004-04-01

    With the mapping of the human genome, genetic privacy has become a concern to many. People care about genetic privacy because genes play an important role in shaping us--our genetic information is about us, and it is deeply connected to our sense of ourselves. In addition, unwanted disclosure of our genetic information, like a great deal of other personal information, makes us vulnerable to unwanted exposure, stigmatization, and discrimination. One recent approach to protecting genetic privacy is to create property rights in genetic information. This Article argues against that approach. Privacy and property are fundamentally different concepts. At heart, the term "property" connotes control within the marketplace and over something that is disaggregated or alienable from the self. "Privacy," in contrast, connotes control over access to the self as well as things close to, intimately connected to, and about the self. Given these different meanings, a regime of property rights in genetic information would impoverish our understanding of that information, ourselves, and the relationships we hope will be built around and through its disclosure. This Article explores our interests in genetic information in order to deepen our understanding of the ongoing discourse about the distinction between property and privacy. It develops a conception of genetic privacy with a strong relational component. We ordinarily share genetic information in the context of relationships in which disclosure is important to the relationship--family, intimate, doctor-patient, researcher-participant, employer-employee, and insurer-insured relationships. Such disclosure makes us vulnerable to and dependent on the person to whom we disclose it. As a result, trust is essential to the integrity of these relationships and our sharing of genetic information. Genetic privacy can protect our vulnerability in these relationships and enhance the trust we hope to have in them. Property, in contrast, by

  3. A Survey of Privacy Awareness and Current Online Practices of Indian Users

    DEFF Research Database (Denmark)

    Dhotre, Prashant Shantaram; Olesen, Henning

    2015-01-01

    Today, users with their smart devices can communicate and access a wide range of services via the Internet to make their life easier. However, loss of privacy is becoming a major issue for architects or policy makers, accelerated by the rapid development of mobile and wireless technologies that eases the collection, storage, sharing, analysis, and manipulation of the individual’s information. The main objective of this paper is to study the privacy perception and awareness of Internet users in an Indian context. Results of a comprehensive survey with 297 users are presented, focusing on their perception and awareness of personal information privacy (PIP). The survey responses show that the users’ perception of PIP is noticeably low and that their privacy awareness is not the same as their understanding. The results indicate the need for a solution for PIP protection where the users...

  4. Recommendation on the Use of Biometric Technology

    DEFF Research Database (Denmark)

    Juul, Niels Christian

    2013-01-01

    Biometric technology is based on the use of information linked to individuals. Hence, privacy and security in biometric applications become a concern, and the need to assess such applications thoroughly becomes equally important. Guidelines for the application of biometric technology must ensure a positive impact on both security and privacy. Based on two cases of biometric application, which have been assessed by the Danish Data Protection Agency, this chapter presents a set of recommendations to legislators, regulators, corporations and individuals on the appropriate use of biometric technologies put forward by the Danish Board of Technology. The recommendations are discussed and compared to the similar proposal put forward by the European Article 29 Data Protection Working Party.

  5. Privacy-Enhanced and Multifunctional Health Data Aggregation under Differential Privacy Guarantees.

    Science.gov (United States)

    Ren, Hao; Li, Hongwei; Liang, Xiaohui; He, Shibo; Dai, Yuanshun; Zhao, Lian

    2016-09-10

    With the rapid growth of the health data scale, the limited storage and computation resources of wireless body area sensor networks (WBANs) are becoming a barrier to their development. Therefore, outsourcing the encrypted health data to the cloud has been an appealing strategy. However, data aggregation then becomes difficult. Some recently proposed schemes try to address this problem, but there are still some functions and privacy issues that are not discussed. In this paper, we propose a privacy-enhanced and multifunctional health data aggregation scheme (PMHA-DP) under differential privacy. Specifically, we achieve a new aggregation function, weighted average (WAAS), and design a privacy-enhanced aggregation scheme (PAAS) to protect the aggregated data from cloud servers. Besides, a histogram aggregation scheme with high accuracy is proposed. PMHA-DP supports fault tolerance while preserving data privacy. The performance evaluation shows that the proposal leads to less communication overhead than the existing one.

  6. Patient Privacy in the Era of Big Data.

    Science.gov (United States)

    Kayaalp, Mehmet

    2018-01-20

    Privacy was defined as a fundamental human right in the Universal Declaration of Human Rights at the 1948 United Nations General Assembly. However, there is still no consensus on what constitutes privacy. In this review, we look at the evolution of privacy as a concept from the era of Hippocrates to the era of social media and big data. To appreciate the modern measures of patient privacy protection and correctly interpret the current regulatory framework in the United States, we need to analyze and understand the concepts of individually identifiable information, individually identifiable health information, protected health information, and de-identification. The Privacy Rule of the Health Insurance Portability and Accountability Act defines the regulatory framework and casts a balance between protective measures and access to health information for secondary (scientific) use. The rule defines the conditions when health information is protected by law and how protected health information can be de-identified for secondary use. With the advents of artificial intelligence and computational linguistics, computational text de-identification algorithms produce de-identified results nearly as well as those produced by human experts, but much faster, more consistently and basically for free. Modern clinical text de-identification systems now pave the road to big data and enable scientists to access de-identified clinical information while firmly protecting patient privacy. However, clinical text de-identification is not a perfect process. In order to maximize the protection of patient privacy and to free clinical and scientific information from the confines of electronic healthcare systems, all stakeholders, including patients, health institutions and institutional review boards, scientists and the scientific communities, as well as regulatory and law enforcement agencies must collaborate closely. On the one hand, public health laws and privacy regulations define rules
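    To illustrate the replace-identifiers-with-categories idea behind computational text de-identification, the sketch below scrubs a few obvious PHI patterns (a name with title, a date, an MRN-style number, a phone number) from a toy clinical note. Real de-identification systems such as the ones discussed above rely on much richer lexicons, context and machine learning; the patterns and sample note here are deliberately minimal assumptions.

```python
import re

# A few illustrative PHI patterns; production de-identification systems use
# far more categories and context-sensitive models.
PATTERNS = [
    (re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"), "[DATE]"),
    (re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"), "[PHONE]"),
    (re.compile(r"\bMRN[:\s]*\d{6,10}\b", re.IGNORECASE), "[MRN]"),
    (re.compile(r"\b(Mr|Mrs|Ms|Dr)\.\s+[A-Z][a-z]+\b"), "[NAME]"),
]

def deidentify(text: str) -> str:
    """Replace each matched identifier with its category placeholder."""
    for pattern, placeholder in PATTERNS:
        text = pattern.sub(placeholder, text)
    return text

note = ("Dr. Alvarez saw the patient on 03/14/2017. "
        "MRN: 00482913. Callback number 617-555-0142.")
print(deidentify(note))
# [NAME] saw the patient on [DATE]. [MRN]. Callback number [PHONE].
```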

  7. Privacy, the individual and genetic information: a Buddhist perspective.

    Science.gov (United States)

    Hongladarom, Soraj

    2009-09-01

    Bioinformatics is a new field of study whose ethical implications involve a combination of bioethics, computer ethics and information ethics. This paper is an attempt to view some of these implications from the perspective of Buddhism. Privacy is a central concern in both computer/information ethics and bioethics, and with information technology being increasingly utilized to process biological and genetic data, the issue has become even more pronounced. Traditionally, privacy presupposes the individual self but as Buddhism does away with the ultimate conception of an individual self, it has to find a way to analyse and justify privacy that does not presuppose such a self. It does this through a pragmatic conception that does not depend on a positing of the substantial self, which is then found to be unnecessary for an effective protection of privacy. As it may be possible one day to link genetic data to individuals, the Buddhist conception perhaps offers a more flexible approach, as what is considered to be integral to an individual person is not fixed in objectivity but depends on convention.

  8. 76 FR 75603 - Family Educational Rights and Privacy

    Science.gov (United States)

    2011-12-02

    ... dropout status, demographics, and unique student identifiers. Schools and LEAs are the primary collectors... of using student data must always be balanced with the need to protect student privacy. Protecting student privacy helps achieve a number of important goals, including avoiding discrimination, identity...

  9. Privacy of genetic information: a review of the laws in the United States.

    Science.gov (United States)

    Fuller, B; Ip, M

    2001-01-01

    This paper examines the privacy of genetic information and the laws in the United States designed to protect genetic privacy. While all 50 states have laws protecting the privacy of health information, many states have additional laws that carve out protections specifically for genetic information. The majority of the individual states have enacted legislation to protect individuals from discrimination on the basis of genetic information, and most of this legislation also has provisions to protect the privacy of genetic information. On the Federal level, there has been no antidiscrimination or genetic privacy legislation. Secretary Donna Shalala of the Department of Health and Human Services has issued proposed regulations to protect the privacy of individually identifiable health information. These regulations encompass individually identifiable health information and do not make specific provisions for genetic information. The variety of laws regarding genetic privacy, some found in statutes to protect health information and some found in statutes to prevent genetic discrimination, presents challenges to those charged with administering and executing these laws.

  10. A multibiometric face recognition fusion framework with template protection

    Science.gov (United States)

    Chindaro, S.; Deravi, F.; Zhou, Z.; Ng, M. W. R.; Castro Neves, M.; Zhou, X.; Kelkboom, E.

    2010-04-01

    In this work we present a multibiometric face recognition framework based on combining information from 2D with 3D facial features. The 3D biometrics channel is protected by a privacy enhancing technology, which uses error correcting codes and cryptographic primitives to safeguard the privacy of the users of the biometric system while at the same time enabling accurate matching through fusion with 2D. Experiments are conducted to compare the matching performance of such multibiometric systems with the individual biometric channels working alone and with unprotected multibiometric systems. The results show that the proposed hybrid systems incorporating template protection match, and in some cases exceed, the performance of the corresponding unprotected equivalents, in addition to offering privacy protection.

  11. Challenges of privacy protection in big data analytics

    DEFF Research Database (Denmark)

    Jensen, Meiko

    2013-01-01

    The big data paradigm implies that almost every type of information eventually can be derived from sufficiently large datasets. However, in such terms, linkage of personal data of individuals poses a severe threat to privacy and civil rights. In this position paper, we propose a set of challenges...... that have to be addressed in order to perform big data analytics in a privacy-compliant way....

  12. Sharing Privacy Protected and Statistically Sound Clinical Research Data Using Outsourced Data Storage

    Directory of Open Access Journals (Sweden)

    Geontae Noh

    2014-01-01

    Full Text Available It is critical to scientific progress to share clinical research data stored in outsourced, generally available cloud computing services. Researchers are able to obtain valuable information that they would not otherwise be able to access; however, privacy concerns arise when sharing clinical data in these outsourced, publicly available data storage services. HIPAA requires researchers to deidentify private information when disclosing clinical data for research purposes and describes two available methods for doing so. Unfortunately, both techniques degrade statistical accuracy. Therefore, the need to protect privacy presents a significant problem for data sharing between hospitals and researchers. In this paper, we propose a controlled secure aggregation protocol to secure both privacy and accuracy when researchers outsource their clinical research data for sharing. Since clinical data must remain private beyond a patient’s lifetime, we take advantage of lattice-based homomorphic encryption to guarantee long-term security against quantum computing attacks. Using lattice-based homomorphic encryption, we design an aggregation protocol that aggregates outsourced ciphertexts under distinct public keys. It enables researchers to obtain aggregated results from ciphertexts outsourced by distinct researchers. To the best of our knowledge, our protocol is the first aggregation protocol that can aggregate ciphertexts encrypted under distinct public keys.

  13. 77 FR 13098 - Multistakeholder Process To Develop Consumer Data Privacy Codes of Conduct

    Science.gov (United States)

    2012-03-05

    ... Promoting Innovation in the Global Digital Economy (the ``Privacy and Innovation Blueprint'') on February 23... practices do not appear to have kept pace with these rapid developments in technology and business models... publicly accessible. Do not submit Confidential Business Information or otherwise sensitive or protected...

  14. Cyber security challenges in Smart Cities: Safety, security and privacy

    Science.gov (United States)

    Elmaghraby, Adel S.; Losavio, Michael M.

    2014-01-01

    The world is experiencing an evolution of Smart Cities. These emerge from innovations in information technology that, while they create new economic and social opportunities, pose challenges to our security and expectations of privacy. Humans are already interconnected via smart phones and gadgets. Smart energy meters, security devices and smart appliances are being used in many cities. Homes, cars, public venues and other social systems are now on their path to the full connectivity known as the “Internet of Things.” Standards are evolving for all of these potentially connected systems. They will lead to unprecedented improvements in the quality of life. To benefit from them, city infrastructures and services are changing with new interconnected systems for monitoring, control and automation. Intelligent transportation, public and private, will access a web of interconnected data from GPS location to weather and traffic updates. Integrated systems will aid public safety, emergency responders and disaster recovery. We examine two important and entangled challenges: security and privacy. Security includes illegal access to information and attacks causing physical disruptions in service availability. As digital citizens are more and more instrumented with data available about their location and activities, privacy seems to disappear. Privacy protecting systems that gather data and trigger emergency response when needed are technological challenges that go hand-in-hand with the continuous security challenges. Their implementation is essential for a Smart City in which we would wish to live. We also present a model representing the interactions between person, servers and things. Those are the major elements in the Smart City, and their interactions are what we need to protect. PMID:25685517

  15. Cyber security challenges in Smart Cities: Safety, security and privacy

    Directory of Open Access Journals (Sweden)

    Adel S. Elmaghraby

    2014-07-01

    Full Text Available The world is experiencing an evolution of Smart Cities. These emerge from innovations in information technology that, while they create new economic and social opportunities, pose challenges to our security and expectations of privacy. Humans are already interconnected via smart phones and gadgets. Smart energy meters, security devices and smart appliances are being used in many cities. Homes, cars, public venues and other social systems are now on their path to the full connectivity known as the “Internet of Things.” Standards are evolving for all of these potentially connected systems. They will lead to unprecedented improvements in the quality of life. To benefit from them, city infrastructures and services are changing with new interconnected systems for monitoring, control and automation. Intelligent transportation, public and private, will access a web of interconnected data from GPS location to weather and traffic updates. Integrated systems will aid public safety, emergency responders and disaster recovery. We examine two important and entangled challenges: security and privacy. Security includes illegal access to information and attacks causing physical disruptions in service availability. As digital citizens are more and more instrumented with data available about their location and activities, privacy seems to disappear. Privacy protecting systems that gather data and trigger emergency response when needed are technological challenges that go hand-in-hand with the continuous security challenges. Their implementation is essential for a Smart City in which we would wish to live. We also present a model representing the interactions between person, servers and things. Those are the major elements in the Smart City, and their interactions are what we need to protect.

  16. Cyber security challenges in Smart Cities: Safety, security and privacy.

    Science.gov (United States)

    Elmaghraby, Adel S; Losavio, Michael M

    2014-07-01

    The world is experiencing an evolution of Smart Cities. These emerge from innovations in information technology that, while they create new economic and social opportunities, pose challenges to our security and expectations of privacy. Humans are already interconnected via smart phones and gadgets. Smart energy meters, security devices and smart appliances are being used in many cities. Homes, cars, public venues and other social systems are now on their path to the full connectivity known as the "Internet of Things." Standards are evolving for all of these potentially connected systems. They will lead to unprecedented improvements in the quality of life. To benefit from them, city infrastructures and services are changing with new interconnected systems for monitoring, control and automation. Intelligent transportation, public and private, will access a web of interconnected data from GPS location to weather and traffic updates. Integrated systems will aid public safety, emergency responders and disaster recovery. We examine two important and entangled challenges: security and privacy. Security includes illegal access to information and attacks causing physical disruptions in service availability. As digital citizens are more and more instrumented with data available about their location and activities, privacy seems to disappear. Privacy protecting systems that gather data and trigger emergency response when needed are technological challenges that go hand-in-hand with the continuous security challenges. Their implementation is essential for a Smart City in which we would wish to live. We also present a model representing the interactions between person, servers and things. Those are the major elements in the Smart City, and their interactions are what we need to protect.

  17. PriBots: Conversational Privacy with Chatbots

    OpenAIRE

    Harkous, Hamza; Fawaz, Kassem; Shin, Kang G.; Aberer, Karl

    2016-01-01

    Traditional mechanisms for delivering notice and enabling choice have so far failed to protect users’ privacy. Users are continuously frustrated by complex privacy policies, unreachable privacy settings, and a multitude of emerging standards. The miniaturization trend of smart devices and the emergence of the Internet of Things (IoTs) will exacerbate this problem further. In this paper, we propose Conversational Privacy Bots (PriBots) as a new way of delivering notice and choice through a two...

  18. An overview of human genetic privacy.

    Science.gov (United States)

    Shi, Xinghua; Wu, Xintao

    2017-01-01

    The study of human genomics is becoming a Big Data science, owing to recent biotechnological advances leading to availability of millions of personal genome sequences, which can be combined with biometric measurements from mobile apps and fitness trackers, and of human behavior data monitored from mobile devices and social media. With increasing research opportunities for integrative genomic studies through data sharing, genetic privacy emerges as a legitimate yet challenging concern that needs to be carefully addressed, not only for individuals but also for their families. In this paper, we present potential genetic privacy risks and relevant ethics and regulations for sharing and protecting human genomics data. We also describe the techniques for protecting human genetic privacy from three broad perspectives: controlled access, differential privacy, and cryptographic solutions. © 2016 New York Academy of Sciences.

  19. An overview of human genetic privacy

    Science.gov (United States)

    Shi, Xinghua; Wu, Xintao

    2016-01-01

    The study of human genomics is becoming a Big Data science, owing to recent biotechnological advances leading to availability of millions of personal genome sequences, which can be combined with biometric measurements from mobile apps and fitness trackers, and of human behavior data monitored from mobile devices and social media. With increasing research opportunities for integrative genomic studies through data sharing, genetic privacy emerges as a legitimate yet challenging concern that needs to be carefully addressed, not only for individuals but also for their families. In this paper, we present potential genetic privacy risks and relevant ethics and regulations for sharing and protecting human genomics data. We also describe the techniques for protecting human genetic privacy from three broad perspectives: controlled access, differential privacy, and cryptographic solutions. PMID:27626905

  20. Revocable privacy: Principles, use cases, and technologies

    NARCIS (Netherlands)

    Lueks, W.; Everts, M.H.; Hoepman, J.H.

    2016-01-01

    Security and privacy often seem to be at odds with one another. In this paper, we revisit the design principle of revocable privacy which guides the creation of systems that offer anonymity for people who do not violate a predefined rule, but can still have consequences for people who do violate the

  1. Dynamic Recognition of Driver’s Propensity Based on GPS Mobile Sensing Data and Privacy Protection

    Directory of Open Access Journals (Sweden)

    Xiaoyuan Wang

    2016-01-01

    Full Text Available Driver’s propensity is a dynamic measure of a driver’s emotional preference characteristics during the driving process. It is a core parameter for computing driver intention and awareness in safety driving assist systems, especially in vehicle collision warning systems. It is also an important factor in achieving Driver-Vehicle-Environment Collaborative Wisdom and Control at the macroscopic level. In this paper, a dynamic recognition model of driver’s propensity based on a support vector machine is established, taking vehicle safety control technology and respect for and protection of the driver’s privacy as preconditions. Travel time on the experimental roads, obtained through GPS, is taken as the characteristic parameter. Driver-Vehicle-Environment sensing information was obtained through psychological questionnaire tests, real vehicle experiments, and virtual driving experiments, and this information is used for parameter calibration and validation of the model. Results show that the established recognition model of driver’s propensity is reasonable and feasible and can achieve dynamic recognition of driver’s propensity to some extent. The recognition model provides a reference and theoretical basis for personalized, people-centered vehicle active safety systems, especially for networked vehicle safety technology.

  2. 45 CFR 503.2 - General policies-Privacy Act.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 3 2010-10-01 2010-10-01 false General policies-Privacy Act. 503.2 Section 503.2... THE UNITED STATES, DEPARTMENT OF JUSTICE RULES OF PRACTICE PRIVACY ACT AND GOVERNMENT IN THE SUNSHINE REGULATIONS Privacy Act Regulations § 503.2 General policies—Privacy Act. The Commission will protect the...

  3. Do privacy and data protection rules apply to legal persons and should they? A proposal for a two-tiered system

    NARCIS (Netherlands)

    van der Sloot, B.

    2015-01-01

    Privacy and data protection rules are usually said to protect the individual against intrusive governments and nosy companies. These rights guarantee the individual's freedom, personal autonomy and human dignity, among others. More and more, however, legal persons are also allowed to invoke the

  4. Pathology Image-Sharing on Social Media: Recommendations for Protecting Privacy While Motivating Education.

    Science.gov (United States)

    Crane, Genevieve M; Gardner, Jerad M

    2016-08-01

    There is a rising interest in the use of social media by pathologists. However, the use of pathology images on social media has been debated, particularly gross examination, autopsy, and dermatologic condition photographs. The immediacy of the interactions, increased interest from patients and patient groups, and fewer barriers to public discussion raise additional considerations to ensure patient privacy is protected. Yet these very features all add to the power of social media for educating other physicians and the nonmedical public about disease and for creating better understanding of the important role of pathologists in patient care. The professional and societal benefits are overwhelmingly positive, and we believe the potential for harm is minimal provided common sense and routine patient privacy principles are utilized. We lay out ethical and practical guidelines for pathologists who use social media professionally. © 2016 American Medical Association. All Rights Reserved.

  5. 78 FR 15732 - Privacy Act of 1974; Computer Matching Program

    Science.gov (United States)

    2013-03-12

    ... 1974; Computer Matching Program AGENCY: Department of Homeland Security/U.S. Citizenship and... Privacy Act of 1974 (5 U.S.C. 552a), as amended by the Computer Matching and Privacy Protection Act of 1988 (Pub. L. 100-503) and the Computer Matching and Privacy Protection Amendments of 1990 (Pub. L. 101...

  6. Privacy Protection: Regulations and Technologies, Opportunities and Threats

    OpenAIRE

    PEDRESCHI, Dino; BONCHI, Francesco; TURINI, Franco; VERYKIOS, Vassilios; Atzori, Maurizio; Malin, Brad; MOELANS, Bart; SAYGIN, Yucel

    2008-01-01

    Information and communication technologies (ICTs) touch many aspects of our lives. The integration of ICTs is enhanced by the advent of mobile, wireless, and ubiquitous technologies. ICTs are increasingly embedded in common services, such as mobile and wireless communication, Internet browsing, credit card e-transactions, and electronic health records. As ICT-based services become ubiquitous, our everyday actions leave behind increasingly detailed digital traces in the information systems of I...

  7. Patient Perceptions About Data Sharing & Privacy: Insights from ActionADE.

    Science.gov (United States)

    Small, Serena S; Peddie, David; Ackerley, Christine; Hohl, Corinne M; Balka, Ellen

    2017-01-01

    Information communication technologies (ICTs) may improve health delivery by enhancing informational continuity of care and enabling secondary use of health data including public health surveillance and research. ICTs also introduce concerns related to privacy. In this paper, we examine and address this tension in the context of the development and implementation of a novel platform that will enable the documentation and communication of patient-specific ADE information, titled ActionADE. We explored privacy concerns qualitatively from the perspective of patients. Our findings will inform a series of recommendations for system design that seek to balance the need to both share and protect personal health information.

  8. Privacy-Enhanced and Multifunctional Health Data Aggregation under Differential Privacy Guarantees

    Science.gov (United States)

    Ren, Hao; Li, Hongwei; Liang, Xiaohui; He, Shibo; Dai, Yuanshun; Zhao, Lian

    2016-01-01

    With the rapid growth of the health data scale, the limited storage and computation resources of wireless body area sensor networks (WBANs) are becoming a barrier to their development. Therefore, outsourcing the encrypted health data to the cloud has become an appealing strategy. However, data aggregation then becomes difficult. Some recently proposed schemes try to address this problem, but several functions and privacy issues remain undiscussed. In this paper, we propose a privacy-enhanced and multifunctional health data aggregation scheme (PMHA-DP) under differential privacy. Specifically, we achieve a new aggregation function, weighted average (WAAS), and design a privacy-enhanced aggregation scheme (PAAS) to protect the aggregated data from cloud servers. Besides, a histogram aggregation scheme with high accuracy is proposed. PMHA-DP supports fault tolerance while preserving data privacy. The performance evaluation shows that the proposal incurs less communication overhead than the existing scheme. PMID:27626417

  9. A Neural-Network Clustering-Based Algorithm for Privacy Preserving Data Mining

    Science.gov (United States)

    Tsiafoulis, S.; Zorkadis, V. C.; Karras, D. A.

    The increasing use of fast and efficient data mining algorithms in huge collections of personal data, facilitated through the exponential growth of technology, in particular in the field of electronic data storage media and processing power, has raised serious ethical, philosophical and legal issues related to privacy protection. To cope with these concerns, several privacy preserving methodologies have been proposed, classified in two categories: methodologies that aim at protecting the sensitive data and those that aim at protecting the mining results. In our work, we focus on sensitive data protection and compare existing techniques according to the anonymity degree achieved, the information loss suffered and their performance characteristics. The ℓ-diversity principle is combined with k-anonymity concepts, so that background information cannot be exploited to successfully attack the privacy of the data subjects the data refer to. Based on Kohonen Self Organizing Feature Maps (SOMs), we first organize data sets in subspaces according to their information theoretical distance to each other, then create the most relevant classes paying special attention to rare sensitive attribute values, and finally generalize attribute values to the minimum extent required so that both the data disclosure probability and the information loss are kept negligible. Furthermore, we propose information theoretical measures for assessing the anonymity degree achieved and empirical tests to demonstrate it.
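
    As an illustration of the k-anonymity and ℓ-diversity notions combined in this abstract (not of the SOM-based method itself), the following sketch checks whether a generalized table satisfies k-anonymity and distinct ℓ-diversity; the function name and toy table are hypothetical:

        from collections import defaultdict

        def satisfies_k_anonymity_and_l_diversity(records, quasi_ids, sensitive, k, l):
            """records: list of dicts; quasi_ids: column names; sensitive: column name."""
            groups = defaultdict(list)
            for row in records:
                key = tuple(row[q] for q in quasi_ids)     # equivalence class key
                groups[key].append(row[sensitive])
            for sens_values in groups.values():
                if len(sens_values) < k:                   # too few indistinguishable rows
                    return False
                if len(set(sens_values)) < l:              # sensitive values not diverse enough
                    return False
            return True

        # Toy generalized table: ZIP codes and ages already coarsened.
        table = [
            {"zip": "130**", "age": "20-30", "disease": "flu"},
            {"zip": "130**", "age": "20-30", "disease": "cancer"},
            {"zip": "130**", "age": "20-30", "disease": "hepatitis"},
            {"zip": "148**", "age": "30-40", "disease": "flu"},
            {"zip": "148**", "age": "30-40", "disease": "flu"},
            {"zip": "148**", "age": "30-40", "disease": "cancer"},
        ]
        print(satisfies_k_anonymity_and_l_diversity(table, ["zip", "age"], "disease", k=3, l=2))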

  10. Privacy Protection for Personal Health Device Communication and Healthcare Building Applications

    Directory of Open Access Journals (Sweden)

    Soon Seok Kim

    2014-01-01

    Full Text Available This paper proposes a new method for protecting patient privacy when communicating with a gateway which collects bioinformation using personal health devices, a type of biosensor for telemedicine, at home and in other buildings. As the suggested method is designed to conform with ISO/IEEE 11073-20601, which is the international standard, interoperability with various health devices was considered. We believe it will be a highly valuable resource for dealing with basic data because it suggests an additional security standard for the Continua Health Alliance and related international groups in the future.

  11. Security, privacy, and confidentiality issues on the Internet

    OpenAIRE

    Kelly, Grant; McKenzie, Bruce

    2002-01-01

    We introduce the issues around protecting information about patients and related data sent via the Internet. We begin by reviewing three concepts necessary to any discussion about data security in a healthcare environment: privacy, confidentiality, and consent. We then give some advice on how to protect local data. Authentication and privacy of e-mail via encryption is offered by Pretty Good Privacy (PGP) and Secure Multipurpose Internet Mail Extensions (S/MIME). The de facto Internet standa...

  12. Access to Information and Privacy | IDRC - International ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    As a Crown corporation, IDRC is subject to Canada's laws on access to information and privacy protection. The following resources will help you learn more about IDRC and the access to information and privacy acts, including instructions for submitting an access to information or privacy act (ATIP) request. IDRC and ATIP ...

  13. Privacy Data Decomposition and Discretization Method for SaaS Services

    Directory of Open Access Journals (Sweden)

    Changbo Ke

    2017-01-01

    Full Text Available In cloud computing, user functional requirements are satisfied through service composition. However, due to the process of interaction and sharing among SaaS services, user privacy data tends to be illegally disclosed to the service participants. In this paper, we propose a privacy data decomposition and discretization method for SaaS services. First, according to the logic between the data, we classify the privacy data into discrete privacy data and continuous privacy data. Next, in order to protect the user privacy information, continuous data chains are decomposed into discrete data chains, and discrete data chains are prevented from being synthesized into continuous data chains. Finally, we propose a protection framework for privacy data and demonstrate its correctness and feasibility with experiments.

  14. Personal Privacy Protection in the Big Data Era

    Institute of Scientific and Technical Information of China (English)

    张永兵

    2016-01-01

    In recent years, the era of big data built on cloud computing platforms has officially arrived. Because big data holds enormous commercial value, criminals devise ways to steal personal privacy data, disrupting users' normal lives. By analyzing the severe challenges that personal privacy security faces in the big data era, this paper summarizes the technical measures adopted to protect personal privacy, puts forward the laws and industry norms that individuals and enterprises should abide by, and finally explores directions for further research on personal privacy protection.

  15. Enhancing Security and Privacy in Video Surveillance through Role-Oriented Access Control Mechanism

    DEFF Research Database (Denmark)

    Mahmood Rajpoot, Qasim

    Pervasive usage of such systems gives substantial powers to those monitoring the videos and poses a threat to the privacy of anyone observed by the system. Aside from protecting privacy from the outside attackers, it is equally important to protect the privacy of individuals from the inside personnel...... involved in monitoring surveillance data to minimize the chances of misuse of the system, e.g. voyeurism. In this context, several techniques to protect the privacy of individuals, called privacy enhancing techniques (PET), have therefore been proposed in the literature which detect and mask the privacy...... sensitive regions, e.g. faces, from the videos. However, very few research efforts have focused on addressing the security aspects of video surveillance data and on authorizing access to this data. Interestingly, while PETs help protect the privacy of individuals, they may also hinder the usefulness......

  16. Differences in legislation of data privacy protection in internet marketing in USA, EU and Serbia

    Directory of Open Access Journals (Sweden)

    Markov Jasmina

    2012-01-01

    Full Text Available There is a growing number of companies that are, in their operations and dealings with consumers, turning to the Internet and using the huge opportunities that it provides. Therefore, Internet marketing is now experiencing extreme expansion and is considered to be the marketing segment most exposed to intensive and continuous change. Along with the positive effects brought to both businesses and consumers, there are some negatives associated with this form of marketing, and one of them is the insufficient protection of privacy. The fact is that we must raise the level of data protection and improve its quality. Intense changes have to be made at the normative level, because there are still plenty of reasons for the dissatisfaction of consumers when it comes to protecting their privacy. Thus, the legislation must play a key role in building consumer confidence as well as in the establishment of a positive relationship with marketers. The aim of this paper is to show the importance of building a level of private data protection that will establish long-term partnerships between consumers, marketers and other participants in the market, since only such relations can bring prosperity to all parties. The paper will make a comparative analysis of the legislative framework in this field in the United States, the European Union and Serbia, as well as stress Serbia's still significant lag behind the aforementioned developed countries.

  17. mSieve: Differential Behavioral Privacy in Time Series of Mobile Sensor Data.

    Science.gov (United States)

    Saleheen, Nazir; Chakraborty, Supriyo; Ali, Nasir; Mahbubur Rahman, Md; Hossain, Syed Monowar; Bari, Rummana; Buder, Eugene; Srivastava, Mani; Kumar, Santosh

    2016-09-01

    Differential privacy concepts have been successfully used to protect the anonymity of individuals in population-scale analysis. Sharing of mobile sensor data, especially physiological data, raises a different privacy challenge: that of protecting private behaviors that can be revealed from time series of sensor data. Existing privacy mechanisms rely on noise addition and data perturbation. But the accuracy requirement on inferences drawn from physiological data, together with well-established limits within which these data values occur, render traditional privacy mechanisms inapplicable. In this work, we define a new behavioral privacy metric based on differential privacy and propose a novel data substitution mechanism to protect behavioral privacy. We evaluate the efficacy of our scheme using 660 hours of ECG, respiration, and activity data collected from 43 participants and demonstrate that it is possible to retain meaningful utility, in terms of inference accuracy (90%), while simultaneously preserving the privacy of sensitive behaviors.
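
    The sketch below illustrates only the general idea of substitution-based behavioral privacy (replacing segments labeled sensitive with plausible benign segments instead of adding noise); it is not the mSieve mechanism, and all names and the toy data are hypothetical:

        import random

        def substitute_sensitive_segments(stream, labels, benign_pool):
            """stream: list of per-window feature vectors; labels: 'sensitive' or 'benign';
            benign_pool: library of benign segments to draw replacements from."""
            released = []
            for segment, label in zip(stream, labels):
                if label == "sensitive":
                    # Hide the behavior by releasing a plausible benign segment instead.
                    released.append(random.choice(benign_pool))
                else:
                    released.append(segment)
            return released

        stream = [[72, 0.1], [75, 0.2], [150, 0.9], [74, 0.1]]
        labels = ["benign", "benign", "sensitive", "benign"]
        pool = [[70, 0.1], [73, 0.2]]
        print(substitute_sensitive_segments(stream, labels, pool))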

  18. Bridging the transatlantic divide in privacy

    Directory of Open Access Journals (Sweden)

    Paula Kift

    2013-08-01

    Full Text Available In the context of the US National Security Agency surveillance scandal, the transatlantic privacy divide has come back to the fore. In the United States, the right to privacy is primarily understood as a right to physical privacy, thus the protection from unwarranted government searches and seizures. In Germany on the other hand, it is also understood as a right to spiritual privacy, thus the right of citizens to develop into autonomous moral agents. The following article will discuss the different constitutional assumptions that underlie American and German attitudes towards privacy, namely privacy as an aspect of liberty or as an aspect of dignity. As data flows defy jurisdictional boundaries, however, policymakers across the Atlantic are faced with a conundrum: how can German and American privacy cultures be reconciled?

  19. The Internet of Things ecosystem: the blockchain and data protection issues

    Directory of Open Access Journals (Sweden)

    Nicola Fabiano

    2018-03-01

    Full Text Available The IoT is an innovative and important phenomenon that lends itself to several services and applications, such as the blockchain, which is itself an emerging phenomenon. We can describe the blockchain as 'blockchain as a service' because of the opportunity to use several applications based on this technology. Indeed, we should take into account the legal issues related to data protection and privacy law in order to avoid breaches of the law. In this context, it is important to consider the new European General Data Protection Regulation (GDPR), which will be in force on 25 May 2018. The contribution describes the main legal issues related to data protection and privacy, focusing on the Data Protection by Design approach according to the GDPR. Furthermore, I resolutely believe that it is possible to develop a global privacy standard framework that organizations can use for their data protection activities.

  20. Data security breaches and privacy in Europe

    CERN Document Server

    Wong, Rebecca

    2013-01-01

    Data Security Breaches and Privacy in Europe aims to consider data protection and cybersecurity issues; more specifically, it aims to provide a fruitful discussion on data security breaches. A detailed analysis of the European Data Protection framework will be examined. In particular, the Data Protection Directive 95/46/EC, the Directive on Privacy and Electronic Communications and the proposed changes under the Data Protection Regulation (data breach notifications) and its implications are considered. This is followed by an examination of the Directive on Attacks against information systems a

  1. Breathing Room in Monitored Space: The Impact of Passive Monitoring Technology on Privacy in Independent Living.

    Science.gov (United States)

    Berridge, Clara

    2016-10-01

    This study examines articulations of the relationship between privacy and passive monitoring by users and former users of a sensor-based remote monitoring system. A new conceptualization of privacy provides a framework for a constructive analysis of the study's findings with practical implications. Forty-nine in-depth semistructured interviews were conducted with elder residents, family members, and staff of 6 low-income independent living residence apartment buildings where the passive monitoring system had been offered for 6 years. Transcribed interviews were coded into the Dedoose software service and were analyzed using methods of grounded theory. Five diverse articulations of the relationship between privacy and passive monitoring emerged. The system produced new knowledge about residents and enabled staff to decide how much of that knowledge to disclose to residents. They chose not to disclose to residents their reason for following up on system-generated alerts for 2 reasons: concern that feelings of privacy invasion may arise and cause dissatisfaction with the technology, and the knowledge that many resident users did not comprehend the extent of its features and would be alarmed. This research reveals the importance and challenges of obtaining informed consent. It identifies where boundary intrusion can occur in the use of passive monitoring as well as how changes to technology design and practice could create opportunities for residents to manage their own boundaries according to their privacy needs. The diversity of approaches to privacy supports the need for "opportunity for boundary management" to be employed as both a design and practice principle. © The Author 2015. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  2. Privacy Training Program

    Science.gov (United States)

    Recognizing that training and awareness are critical to protecting agency Personally Identifiable Information (PII), the EPA is developing online training for privacy contacts in its programs and regions.

  3. Analysis of Privacy-Enhancing Identity Management Systems

    DEFF Research Database (Denmark)

    Adjei, Joseph K.; Olesen, Henning

    Privacy has become a major issue for policy makers. This has been impelled by the rapid development of technologies that facilitate collection, distribution, storage, and manipulation of personal information. Business organizations are finding new ways of leveraging the value derived from consumer...... is an attempt to understand the relationship between individuals’ intentions to disclose personal information, their actual personal information disclosure behaviours, and how these can be leveraged to develop privacy-enhancing identity management systems (IDMS) that users can trust. Legal, regulatory...... and technological aspects of privacy and technology adoption are also discussed....

  4. The Best of Both Worlds? Free Trade in Services and EU Law on Privacy and Data Protection

    NARCIS (Netherlands)

    Yakovleva, S.; Irion, K.

    2016-01-01

    The article focuses on the interplay between European Union (EU) law on privacy and data protection and international trade law, in particular the General Agreement on Trade in Services (GATS) and the WTO dispute settlement system. The argument distinguishes between the effects of international

  5. A Quantum Private Query Protocol for Enhancing both User and Database Privacy

    Science.gov (United States)

    Zhou, Yi-Hua; Bai, Xue-Wei; Li, Lei-Lei; Shi, Wei-Min; Yang, Yu-Guang

    2018-01-01

    In order to protect the privacy of both the query user and the database, several QKD-based quantum private query (QPQ) protocols have been proposed. Unfortunately, some of them cannot perfectly resist internal attacks from the database, while others ensure better user privacy only at the cost of reduced database privacy. In this paper, a novel two-way QPQ protocol is proposed to ensure the privacy of both sides of the communication. In our protocol, the user prepares the initial quantum states and derives the key bit by comparing the initial quantum state with the outcome state returned from the database in ctrl or shift mode, instead of announcing two non-orthogonal qubits as other protocols do, which may leak part of the secret information. In this way, not only is the privacy of the database ensured, but user privacy is also strengthened. Furthermore, our protocol is loss-tolerant, cheat-sensitive, and resistant to the JM attack. Supported by National Natural Science Foundation of China under Grant Nos. U1636106, 61572053, 61472048, 61602019, 61502016; Beijing Natural Science Foundation under Grant Nos. 4152038, 4162005; Basic Research Fund of Beijing University of Technology (No. X4007999201501); The Scientific Research Common Program of Beijing Municipal Commission of Education under Grant No. KM201510005016

  6. (IN)PRIVACY IN MOBILE APPS. CUSTOMER OPPORTUNITIES

    Directory of Open Access Journals (Sweden)

    Yu.S. Chemerkina

    2016-01-01

    Full Text Available Subject of Study. The paper presents the results of an investigation of cross-platform mobile applications. It focuses on cross-platform app data investigation for the purpose of creating a database that supports decisions from a data privacy viewpoint. These decisions rely on knowledge about mobile apps available to the public, especially how consumer data is protected while stored locally or transferred over the network, and what types of data may leak. Methods. The paper proposes a forensics methodology as the cornerstone of the app data investigation process. The object of research is application data protection under different types of security controls across modern mobile operating systems. The subject of research is a modification of the forensics approach, combined with behavioral analysis, to examine application data privacy in order to find data that are not properly handled by applications and therefore lead to data leakage, and to determine the type of protection control without the usual forensics limits. In addition, the paper relies on the simplest tools, limiting the examination to locally stored data and data transmitted over the network so as to cover all data, and excluding memory and code analysis unless it is valuable (behavioral analysis). The research methods include digital forensics techniques chosen according to the state of the data (at rest, in use/memory, in transit), behavioral analysis of the application, and static and dynamic application code analysis. Main Results. The research was carried out within the scope of this thesis, and the following scientific results were obtained. First, the methods used to investigate the privacy of application data make it possible to consider application features and protection design flaws in the context of incomplete user awareness of the privacy state caused by external developer activity. Second, the knowledge set about facts of application data protection that allows making a knowledge database to

  7. Privacy-preserving digital rights management

    NARCIS (Netherlands)

    Conrado, C.; Petkovic, M.; Jonker, W.; Jonker, W.; Petkovic, M.

    2004-01-01

    DRM systems provide a means for protecting digital content, but at the same time they violate the privacy of users in a number of ways. This paper addresses privacy issues in DRM systems. The main challenge is how to allow a user to interact with the system in an anonymous/pseudonymous way, while

  8. Privacy and Data Protection in the Age of Pervasive Technologies in AI and Robotics (European Data Protection Law Review, Volume 3, Issue 3, DOI: https://doi.org/10.21552/edpl/2017/3/8)

    NARCIS (Netherlands)

    van den Hoven van Genderen, R.

    2017-01-01

    Robots have been a part of the popular imagination since antiquity. And yet the idea of a robot — a being that exists somehow in the twilight between machine and person — continues to fascinate. Privacy, data protection and physical integrity will be structurally influenced by the pervasive

  9. An Alternative View of Privacy on Facebook

    Directory of Open Access Journals (Sweden)

    Christian Fuchs

    2011-02-01

    Full Text Available The predominant analysis of privacy on Facebook focuses on personal information revelation. This paper is critical of this kind of research and introduces an alternative analytical framework for studying privacy on Facebook, social networking sites and web 2.0. This framework connects the phenomenon of online privacy to the political economy of capitalism—a focus that has thus far been rather neglected in research literature about Internet and web 2.0 privacy. Liberal privacy philosophy tends to ignore the political economy of privacy in capitalism that can mask socio-economic inequality and protect capital and the rich from public accountability. Facebook is analyzed in this paper with the help of an approach in which privacy for dominant groups, in regard to the ability to keep wealth and power secret from the public, is seen as problematic, whereas privacy at the bottom of the power pyramid for consumers and normal citizens is seen as a protection from dominant interests. Facebook’s privacy concept is based on an understanding that stresses self-regulation and on an individualistic understanding of privacy. The theoretical analysis of the political economy of privacy on Facebook in this paper is based on the political theories of Karl Marx, Hannah Arendt and Jürgen Habermas. Based on the political economist Dallas Smythe’s concept of audience commodification, the process of prosumer commodification on Facebook is analyzed. The political economy of privacy on Facebook is analyzed with the help of a theory of drives that is grounded in Herbert Marcuse’s interpretation of Sigmund Freud, which allows Facebook to be analyzed based on the concept of play labor (= the convergence of play and labor).

  10. The privacy paradox : Investigating discrepancies between expressed privacy concerns and actual online behavior - A systematic literature review

    NARCIS (Netherlands)

    Barth, Susanne; de Jong, Menno D.T.

    2017-01-01

    Also known as the privacy paradox, recent research on online behavior has revealed discrepancies between user attitude and their actual behavior. More specifically: While users claim to be very concerned about their privacy, they nevertheless undertake very little to protect their personal data.

  11. 77 FR 64962 - Privacy Act of 1974, as Amended

    Science.gov (United States)

    2012-10-24

    ... social media, and recipients of other public relations materials issued by the CFPB about CFPB sponsored... THE BUREAU OF CONSUMER FINANCIAL PROTECTION Privacy Act of 1974, as Amended AGENCY: Bureau of Consumer Financial Protection. ACTION: Notice of Proposed Privacy Act System of Records. SUMMARY: In...

  12. 77 FR 60382 - Privacy Act of 1974, as Amended

    Science.gov (United States)

    2012-10-03

    ... financial products or services, (b) consumer behavior with respect to consumer financial products and... BUREAU OF CONSUMER FINANCIAL PROTECTION Privacy Act of 1974, as Amended AGENCY: Bureau of Consumer... the Privacy Act of 1974, as amended, the Bureau of Consumer Financial Protection, hereinto referred to...

  13. Privacy & Social Media in the Context of the Arab Gulf

    OpenAIRE

    Abokhodair, Norah; Vieweg, Sarah

    2016-01-01

    Theories of privacy and how it relates to the use of Information Communication Technology (ICT) have been a topic of research for decades. However, little attention has been paid to the perception of privacy from the perspective of technology users in the Middle East. In this paper, we delve into interpretations of privacy from the approach of Arab Gulf citizens. We consider how privacy is practiced and understood in technology-mediated environments among this population, paying particular at...

  14. Privacy information management for video surveillance

    Science.gov (United States)

    Luo, Ying; Cheung, Sen-ching S.

    2013-05-01

    The widespread deployment of surveillance cameras has raised serious privacy concerns. Many privacy-enhancing schemes have been proposed to automatically redact images of trusted individuals in the surveillance video. To identify these individuals for protection, the most reliable approach is to use biometric signals such as iris patterns as they are immutable and highly discriminative. In this paper, we propose a privacy data management system to be used in a privacy-aware video surveillance system. The privacy status of a subject is anonymously determined based on her iris pattern. For a trusted subject, the surveillance video is redacted and the original imagery is considered to be the privacy information. Our proposed system allows a subject to access her privacy information via the same biometric signal for privacy status determination. Two secure protocols, one for privacy information encryption and the other for privacy information retrieval are proposed. Error control coding is used to cope with the variability in iris patterns and efficient implementation is achieved using surrogate data records. Experimental results on a public iris biometric database demonstrate the validity of our framework.

  15. Privacy and ethics in pediatric environmental health research-part II: protecting families and communities.

    Science.gov (United States)

    Fisher, Celia B

    2006-10-01

    In pediatric environmental health research, information about family members is often directly sought or indirectly obtained in the process of identifying child risk factors and helping to tease apart and identify interactions between genetic and environmental factors. However, federal regulations governing human subjects research do not directly address ethical issues associated with protections for family members who are not identified as the primary "research participant." Ethical concerns related to family consent and privacy become paramount as pediatric environmental health research increasingly turns to questions of gene-environment interactions. In this article I identify issues arising from and potential solutions for the privacy and informed consent challenges of pediatric environmental health research intended to adequately protect the rights and welfare of children, family members, and communities. I first discuss family members as secondary research participants and then the specific ethical challenges of longitudinal research on late-onset environmental effects and gene-environment interactions. I conclude with a discussion of the confidentiality and social risks of recruitment and data collection of research conducted within small or unique communities, ethnic minority populations, and low-income families. The responsible conduct of pediatric environmental health research must be conceptualized as a goodness of fit between the specific research context and the unique characteristics of subjects and other family stakeholders.

  16. Differential privacy in intelligent transportation systems

    NARCIS (Netherlands)

    Kargl, Frank; Friedman, Arik; Boreli, Roksana

    2013-01-01

    In this paper, we investigate how the concept of differential privacy can be applied to Intelligent Transportation Systems (ITS), focusing on protection of Floating Car Data (FCD) stored and processed in central Traffic Data Centers (TDC). We illustrate an integration of differential privacy with

  17. Fourier domain asymmetric cryptosystem for privacy protected multimodal biometric security

    Science.gov (United States)

    Choudhury, Debesh

    2016-04-01

    We propose a Fourier domain asymmetric cryptosystem for multimodal biometric security. One modality of biometrics (such as face) is used as the plaintext, which is encrypted by another modality of biometrics (such as fingerprint). A private key is synthesized from the encrypted biometric signature by complex spatial Fourier processing. The encrypted biometric signature is further encrypted by other biometric modalities, and the corresponding private keys are synthesized. The resulting biometric signature is privacy protected since the encryption keys are provided by the human, and hence those are private keys. Moreover, the decryption keys are synthesized using those private encryption keys. The encrypted signatures are decrypted using the synthesized private keys and inverse complex spatial Fourier processing. Computer simulations demonstrate the feasibility of the technique proposed.
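
    A simplified numerical illustration of Fourier-domain encryption of one biometric image with a phase mask derived from another is given below; it is a symmetric toy example under stated assumptions, not the asymmetric cryptosystem proposed here, and the array sizes and names are hypothetical:

        import numpy as np

        def encrypt_with_biometric_key(plain_img, key_img):
            """Encrypt plain_img (e.g. a face) using a phase mask built from key_img."""
            key_phase = np.exp(1j * 2 * np.pi * (key_img / key_img.max()))  # unit-modulus mask
            return np.fft.fft2(plain_img * key_phase)                       # complex ciphertext

        def decrypt_with_biometric_key(cipher, key_img):
            key_phase = np.exp(1j * 2 * np.pi * (key_img / key_img.max()))
            return np.real(np.fft.ifft2(cipher) / key_phase)

        # Toy example with random "images" standing in for face and fingerprint data.
        rng = np.random.default_rng(0)
        face = rng.random((64, 64))
        fingerprint = rng.random((64, 64))
        cipher = encrypt_with_biometric_key(face, fingerprint)
        print(np.allclose(decrypt_with_biometric_key(cipher, fingerprint), face))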

  18. Preserving differential privacy under finite-precision semantics.

    Directory of Open Access Journals (Sweden)

    Ivan Gazeau

    2013-06-01

    Full Text Available The approximation introduced by finite-precision representation of continuous data can induce arbitrarily large information leaks even when the computation using exact semantics is secure. Such leakage can thus undermine design efforts aimed at protecting sensitive information. We focus here on differential privacy, an approach to privacy that emerged from the area of statistical databases and is now widely applied also in other domains. In this approach, privacy is protected by the addition of noise to a true (private) value. To date, this approach to privacy has been proved correct only in the ideal case in which computations are made using an idealized, infinite-precision semantics. In this paper, we analyze the situation at the implementation level, where the semantics is necessarily finite-precision, i.e. the representation of real numbers and the operations on them are rounded according to some level of precision. We show that in general there are violations of the differential privacy property, and we study the conditions under which we can still guarantee a limited (but, arguably, totally acceptable) variant of the property, under only a minor degradation of the privacy level. Finally, we illustrate our results on two cases of noise-generating distributions: the standard Laplacian mechanism commonly used in differential privacy, and a bivariate version of the Laplacian recently introduced in the setting of privacy-aware geolocation.
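
    A minimal sketch of the idealized Laplacian mechanism discussed here, followed by a simple grid-rounding step in the spirit of finite-precision mitigations (not the exact construction analyzed in the paper), is shown below; the function names and grid size are hypothetical:

        import numpy as np

        def laplace_mechanism(true_value, sensitivity, epsilon):
            """Release true_value with Laplace(sensitivity/epsilon) noise (ideal semantics)."""
            scale = sensitivity / epsilon
            return true_value + np.random.laplace(loc=0.0, scale=scale)

        def coarsened_release(true_value, sensitivity, epsilon, grid=2.0 ** -10):
            # Rounding the noisy output onto a fixed grid is one simple way to limit the
            # information carried by low-order floating-point bits (a snapping-style fix).
            noisy = laplace_mechanism(true_value, sensitivity, epsilon)
            return round(noisy / grid) * grid

        print(coarsened_release(true_value=42.0, sensitivity=1.0, epsilon=0.1))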

  19. A Research on Issues Related to RFID Security and Privacy

    Science.gov (United States)

    Kim, Jongki; Yang, Chao; Jeon, Jinhwan

    Radio Frequency Identification (RFID) is a technology for automated identification of objects and people. RFID systems have been gaining popularity, especially in supply chain management and automated identification systems. However, there are many existing and potential problems in RFID systems which could threaten the technology's future. To successfully adopt RFID technology in various applications, we need to develop solutions to protect the RFID system's data. This study investigates important issues related to privacy and security of RFID based on the recent literature and suggests solutions to cope with the problem.

  20. 76 FR 63896 - Federal Acquisition Regulation; Privacy Training, 2010-013

    Science.gov (United States)

    2011-10-14

    ... should a breach occur; and (7) Any agency-specific privacy training requirements. (d) The contractor is... Acquisition Regulation; Privacy Training, 2010-013 AGENCY: Department of Defense (DoD), General Services... contractors to complete training that addresses the protection of privacy, in accordance with the Privacy Act...

  1. Understanding Engagement with the Privacy Domain Through Design Research.

    OpenAIRE

    Vasalou, A.; Oostveen, A.; Bowers, Christopher; Beale, R.

    2015-01-01

    This paper reports findings from participatory design research aimed at uncovering how technological interventions can engage users in the domain of privacy. Our work was undertaken in the context of a new design concept “Privacy Trends” whose aspiration is to foster technology users’ digital literacy regarding ongoing privacy risks and elucidate how such risks fit within existing social, organizational and political systems, leading to a longer term privacy concern. Our study reveals two cha...

  2. Cancelable remote quantum fingerprint templates protection scheme

    International Nuclear Information System (INIS)

    Liao Qin; Guo Ying; Huang Duan

    2017-01-01

    With the increasing popularity of fingerprint identification technology, much attention has been paid to its security and privacy. Only if the security and privacy of biological information are ensured can the biometric technology be better accepted and used by the public. In this paper, we propose a novel quantum bit (qbit)-based scheme to solve the security and privacy problems existing in traditional fingerprint identification systems. By exploiting the properties of quantum mechanics, our proposed scheme, a cancelable remote quantum fingerprint templates protection scheme, can achieve unconditional security guaranteed in an information-theoretical sense. Moreover, this novel quantum scheme can invalidate most of the attacks aimed at fingerprint identification systems. In addition, the proposed scheme is applicable to the requirement of remote communication, with no need to worry about its security and privacy during transmission. This is an absolute advantage when comparing with other traditional methods. Security analysis shows that the proposed scheme can effectively ensure the communication security and the privacy of users’ information for fingerprint identification. (paper)

  3. Decrypting Information Sensitivity: Risk, Privacy, and Data Protection Law in the United States and the European Union

    Science.gov (United States)

    Fazlioglu, Muge

    2017-01-01

    This dissertation examines the risk-based approach to privacy and data protection and the role of information sensitivity within risk management. Determining what information carries the greatest risk is a multi-layered challenge that involves balancing the rights and interests of multiple actors, including data controllers, data processors, and…

  4. Secure privacy-preserving biometric authentication scheme for telecare medicine information systems.

    Science.gov (United States)

    Li, Xuelei; Wen, Qiaoyan; Li, Wenmin; Zhang, Hua; Jin, Zhengping

    2014-11-01

    Healthcare delivery services via telecare medicine information systems (TMIS) can help patients to obtain their desired telemedicine services conveniently. However, information security and privacy protection are important issues and crucial challenges in healthcare information systems, where only authorized patients and doctors may employ telecare medicine facilities and access electronic medical records. Therefore, a secure authentication scheme is urgently required to achieve the goals of entity authentication, data confidentiality and privacy protection. This paper investigates a new biometric authentication with key agreement scheme, which focuses on patient privacy and medical data confidentiality in TMIS. The new scheme employs a hash function, a fuzzy extractor, nonces and authenticated Diffie-Hellman key agreement as primitives. It provides patient privacy protection, e.g., hiding the identity from theft and tracking by unauthorized participants, and protecting the password and biometric template from compromise by untrusted servers. Moreover, the key agreement supports secure transmission by symmetric encryption to protect the patient's medical data from being leaked. Finally, the analysis shows that our proposal provides more security and privacy protection for TMIS.
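
    The scheme above is built from standard primitives (hash functions, nonces, Diffie-Hellman key agreement). As a rough illustration of the key-agreement building block only, the sketch below derives a shared session key from a plain Diffie-Hellman exchange hashed together with two nonces; the parameters are toy values, and the fuzzy-extractor, biometric and authentication steps of the actual protocol are omitted.

    ```python
    import hashlib
    import secrets

    # Toy Diffie-Hellman parameters (NOT secure; real systems use standardized
    # groups such as RFC 3526 or elliptic-curve variants).
    P = 0xFFFFFFFB  # small prime, for illustration only
    G = 5

    def dh_keypair():
        private = secrets.randbelow(P - 2) + 2
        public = pow(G, private, P)
        return private, public

    def session_key(own_private: int, peer_public: int, nonce_a: bytes, nonce_b: bytes) -> bytes:
        shared = pow(peer_public, own_private, P)
        # Hash the shared secret together with both nonces, as in the
        # hash-based key-derivation step of authenticated key agreement.
        return hashlib.sha256(shared.to_bytes(16, "big") + nonce_a + nonce_b).digest()

    # Patient and server each generate a key pair and a nonce, exchange the
    # public halves, and derive the same symmetric session key.
    a_priv, a_pub = dh_keypair()
    b_priv, b_pub = dh_keypair()
    na, nb = secrets.token_bytes(16), secrets.token_bytes(16)
    assert session_key(a_priv, b_pub, na, nb) == session_key(b_priv, a_pub, na, nb)
    ```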

  5. Privacy and CHI : methodologies for studying privacy issues

    NARCIS (Netherlands)

    Patil, S.; Romero, N.A.; Karat, J.

    2006-01-01

    This workshop aims to reflect on methodologies to empirically study privacy issues related to advanced technology. The goal is to address methodological concerns by drawing upon both theoretical perspectives as well as practical experiences.

  6. Internet privacy options for adequate realisation

    CERN Document Server

    2013-01-01

    A thorough multidisciplinary analysis of various perspectives on internet privacy was published as the first volume of a study revealing the results of the acatech project "Internet Privacy - A Culture of Privacy and Trust on the Internet." The second publication from this project presents integrated, interdisciplinary options for improving privacy on the Internet using a normative, value-oriented approach. The ways in which privacy promotes and preconditions fundamental societal values, and how privacy violations endanger the flourishing of those values, are exemplified. The conditions which must be fulfilled in order to achieve a culture of privacy and trust on the internet are illuminated. This volume presents options for policy-makers, educators, businesses and technology experts on how to facilitate solutions for more privacy on the Internet and identifies further research requirements in this area.

  7. Big data privacy protection model based on multi-level trusted system

    Science.gov (United States)

    Zhang, Nan; Liu, Zehua; Han, Hongfeng

    2018-05-01

    This paper introduces and extends the multi-level trusted system model, which counters Trojan-style attacks by encrypting users' private data and enforces the principle "no read up, no write down": a privacy leak of low-priority data does not lead to the disclosure of high-priority data. Building on this model, seven risk levels are defined, with priority levels 1-7 representing user data privacy of increasing value, and seven encryption algorithms of different execution efficiency are applied: the higher the priority, the greater the value of the user data and the stronger (but slower) the chosen encryption algorithm, trading efficiency for data security. For enterprises, the price point is determined by how long users employ a unit of equipment; the higher the risk subgroup, the longer the encryption time. The model assumes that users prefer lower-priority encryption algorithms in order to preserve efficiency. This paper proposes a privacy cost model for each of the seven risk subgroups: the higher the privacy cost, the higher the priority of the risk subgroup and the higher the price users need to pay to ensure the privacy of their data. Furthermore, by combining an existing economic pricing model with the human traffic model proposed in this paper, and letting prices fluctuate with market demand, the unit product price is raised when market demand is low; when market demand increases, enterprise profit is preserved, under government guidance, by reducing the price per unit of product. Dynamic factors such as consumer mood and age are then introduced for further optimization. At the same time, seven algorithms are selected from symmetric and asymmetric encryption algorithms to define the enterprise
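
    To make the level-to-algorithm idea concrete, the sketch below maps hypothetical risk levels 1-7 to progressively stronger (and costlier) cipher choices with a toy privacy-cost rule; the algorithm names, cost figures and pricing rule are illustrative assumptions, not the assignment used in the paper.

    ```python
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class CipherChoice:
        name: str             # algorithm family (illustrative)
        key_bits: int         # key length
        relative_cost: float  # rough encryption-time multiplier

    # Hypothetical mapping: priority 1 (lowest-value data) to 7 (highest-value).
    LEVEL_TO_CIPHER = {
        1: CipherChoice("AES-128-CTR", 128, 1.0),
        2: CipherChoice("AES-192-CTR", 192, 1.1),
        3: CipherChoice("AES-256-CTR", 256, 1.2),
        4: CipherChoice("AES-256-GCM", 256, 1.5),
        5: CipherChoice("ChaCha20-Poly1305", 256, 1.6),
        6: CipherChoice("RSA-2048 + AES-256", 2048, 4.0),
        7: CipherChoice("RSA-4096 + AES-256", 4096, 9.0),
    }

    def choose_cipher(priority: int) -> CipherChoice:
        """Pick the cipher for a data item's privacy priority (1 = low, 7 = high)."""
        if priority not in LEVEL_TO_CIPHER:
            raise ValueError("priority must be between 1 and 7")
        return LEVEL_TO_CIPHER[priority]

    def privacy_cost(priority: int, base_price: float) -> float:
        """Toy privacy-cost model: price grows with the relative encryption cost."""
        return base_price * choose_cipher(priority).relative_cost

    print(choose_cipher(7).name, privacy_cost(7, base_price=10.0))
    ```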

  8. A New Heuristic Anonymization Technique for Privacy Preserved Datasets Publication on Cloud Computing

    Science.gov (United States)

    Aldeen Yousra, S.; Mazleena, Salleh

    2018-05-01

    Recent advances in Information and Communication Technologies (ICT) have led to heavy use of cloud services for sharing users' private data. Data from various organizations are a vital information source for analysis and research. Generally, this sensitive or private information involves medical, census, voter registration, social network, and customer service data. A primary concern of cloud service providers in data publishing is to hide the sensitive information of individuals. One of the cloud services that addresses this confidentiality concern is Privacy Preserving Data Mining (PPDM). The PPDM service in Cloud Computing (CC) enables data publishing with minimized distortion and absolute privacy. In this method, datasets are anonymized via generalization to accomplish the privacy requirements. However, the well-known privacy preserving data mining technique called K-anonymity suffers from several limitations. To surmount those shortcomings, I propose a new heuristic anonymization framework for preserving the privacy of sensitive datasets when publishing on the cloud. The advantages of the K-anonymity, L-diversity and (α, k)-anonymity methods for efficient information utilization and privacy protection are emphasized. Experimental results revealed that the developed technique outperforms the K-anonymity, L-diversity, and (α, k)-anonymity measures.
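
    For readers unfamiliar with the baseline the framework builds on, the sketch below checks k-anonymity on a toy table and applies one simple generalization step; the attribute names and generalization rule are illustrative and do not reproduce the proposed heuristic.

    ```python
    from collections import Counter

    def is_k_anonymous(records, quasi_identifiers, k):
        """True if every combination of quasi-identifier values occurs >= k times."""
        groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
        return all(count >= k for count in groups.values())

    def generalize_age(record, bucket=10):
        """Replace an exact age with a coarse range, e.g. 37 -> '30-39'."""
        lo = (record["age"] // bucket) * bucket
        return {**record, "age": f"{lo}-{lo + bucket - 1}"}

    records = [
        {"age": 34, "zip": "12345", "disease": "flu"},
        {"age": 37, "zip": "12345", "disease": "asthma"},
        {"age": 62, "zip": "54321", "disease": "diabetes"},
        {"age": 68, "zip": "54321", "disease": "flu"},
    ]

    quasi = ["age", "zip"]
    print(is_k_anonymous(records, quasi, k=2))       # False: exact ages are unique
    generalized = [generalize_age(r) for r in records]
    print(is_k_anonymous(generalized, quasi, k=2))   # True after generalization
    ```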

  9. Piloting a deceased subject integrated data repository and protecting privacy of relatives.

    Science.gov (United States)

    Huser, Vojtech; Kayaalp, Mehmet; Dodd, Zeyno A; Cimino, James J

    2014-01-01

    Use of deceased subject Electronic Health Records can be an important piloting platform for informatics or biomedical research. The existing legal framework allows such research under less strict de-identification criteria; however, the privacy of non-decedents must be protected. We report on the creation of the deceased subject Integrated Data Repository (dsIDR) at the National Institutes of Health Clinical Center and a pilot methodology to remove secondary protected health information or identifiable information (secondary PxI; information about persons other than the primary patient). We characterize the available structured coded data in dsIDR and report the estimated frequencies of secondary PxI, ranging from 12.9% (sensitive token presence) to 1.1% (using stricter criteria). Federating decedent EHR data from multiple institutions can address sample size limitations, and our pilot study provides lessons learned and a methodology that can be adopted by other institutions.

  10. Security, privacy, and confidentiality issues on the Internet.

    Science.gov (United States)

    Kelly, Grant; McKenzie, Bruce

    2002-01-01

    We introduce the issues around protecting information about patients and related data sent via the Internet. We begin by reviewing three concepts necessary to any discussion about data security in a healthcare environment: privacy, confidentiality, and consent. We then give some advice on how to protect local data. Authentication and privacy of e-mail via encryption is offered by Pretty Good Privacy (PGP) and Secure Multipurpose Internet Mail Extensions (S/MIME). The de facto Internet standard for encrypting Web-based information interchanges is Secure Sockets Layer (SSL), more recently known as Transport Layer Security or TLS. There is a public key infrastructure process to 'sign' a message whereby an individual's private key is used to sign a hash of the message; this can then be verified against the sender's public key. This ensures the data's authenticity and origin without conferring privacy, and is called a 'digital signature'. The best protection against viruses is not opening e-mails from unknown sources or those containing unusual message headers.
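
    As a small illustration of the sign-and-verify flow described above, the sketch below uses the third-party `cryptography` package (assumed available), with Ed25519 standing in for any public-key signature scheme; as the abstract notes, the signature proves authenticity and origin but does not make the message confidential.

    ```python
    # Minimal sign/verify sketch; the message and key handling are illustrative.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    private_key = Ed25519PrivateKey.generate()
    public_key = private_key.public_key()

    message = b"Referral letter for patient #1234"
    signature = private_key.sign(message)      # the 'digital signature' over the message

    try:
        public_key.verify(signature, message)  # succeeds: authentic and unmodified
        print("signature valid")
    except InvalidSignature:
        print("message was tampered with or signed by someone else")
    ```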

  11. Privacy issues in mobile advertising

    DEFF Research Database (Denmark)

    Cleff, Evelyne Beatrix

    The emergence of the wired Internet and mobile telecommunication networks is creating new opportunities for advertisers to generate new revenue streams through mobile users. As consumer adoption of mobile technology continues to increase, it is only a question of time before mobile advertising becomes an important part of marketing strategies. The development of mobile advertising, however, will be dependent on acceptance and usability issues in order to ensure permission-based advertising. Growing concerns about the protection of users' privacy have been raised, since mobile advertising may become an extremely intrusive practice in an intimate personal space. This article focuses on the evaluation of legal problems raised by this novel form of advertising. It is assumed that a technological design which is in line with the legal framework will ensure that the benefits of mobile advertising...

  12. Consumer Responses to the Introduction of Privacy Protection Measures: An Exploratory Research Framework

    OpenAIRE

    Heng Xu

    2009-01-01

    Information privacy is at the center of discussion and controversy among multiple stakeholders including business leaders, privacy activists, and government regulators. However, conceptualizations of information privacy have been somewhat patchy in current privacy literature. In this article, we review the conceptualizations of information privacy through three different lenses (information exchange, social contract and information control), and then try to build upon previous literature from...

  13. Face and Emotion Recognition on Commercial Property under EU Data Protection Law

    DEFF Research Database (Denmark)

    Lewinski, Peter; Trzaskowski, Jan; Luzak, Joasia

    2016-01-01

    This paper integrates and cuts through domains of privacy law and biometrics. Specifically, this paper presents a legal analysis on the use of Automated Facial Recognition Systems (the AFRS) in commercial (retail store) settings within the European Union data protection framework. The AFRS...... to the technology's potential of becoming a substantial privacy issue. First, this paper introduces the AFRS and EU data protection law. This is followed by an analysis of European Data protection law and its application in relation to the use of the AFRS, including requirements concerning data quality...

  14. Protecting human health and security in digital Europe: how to deal with the "privacy paradox"?

    Science.gov (United States)

    Büschel, Isabell; Mehdi, Rostane; Cammilleri, Anne; Marzouki, Yousri; Elger, Bernice

    2014-09-01

    This article is the result of international research between law and ethics scholars from universities in France and Switzerland, who have been closely collaborating with technical experts on the design and use of information and communication technologies in the fields of human health and security. The interdisciplinary approach is a unique feature and guarantees important new insights into the social, ethical and legal implications of these technologies for the individual and society as a whole. Its aim is to shed light on the tension between secrecy and transparency in the digital era. A special focus is placed, from the perspectives of psychology, medical ethics and European law, on the contradiction between individuals' motivations for consented processing of personal data and their fears about unknown disclosure, transferal and sharing of personal data via information and communication technologies (named the "privacy paradox"). Potential benefits and harms for the individual and society resulting from the use of computers, mobile phones, the Internet and social media are discussed. Furthermore, the authors point out the ethical and legal limitations inherent to the processing of personal data in a democratic society governed by the rule of law. Finally, they seek to demonstrate that the impact of information and communication technology use on individuals' well-being, the latter being closely correlated with a high level of fundamental rights protection in Europe, is a promising feature of the so-called "e-democracy" as a new way to collectively attribute meaning to large-scale online actions, motivations and ideas.

  15. Cybersecurity and Privacy

    DEFF Research Database (Denmark)

    The huge potential in future connected services has as a precondition that privacy and security needs are dealt with in order for new services to be accepted. This issue is increasingly on the agenda both at the company and at the individual level. Cybersecurity and Privacy – Bridging the Gap addresses two very complex fields of the digital world, i.e., cybersecurity and privacy. These multifaceted, multidisciplinary and complex issues are usually understood and valued differently by different individuals, data holders and legal bodies. But a change in one field immediately affects the others. Policies, frameworks, strategies, laws, tools, techniques, and technologies – all of these are tightly interwoven when it comes to security and privacy. This book is another attempt to bridge the gap between industry and academia. The book addresses the views from academia and industry on the subject.

  16. Are organisations in South Africa ready to comply with personal data protection or privacy legislation and regulations?

    CSIR Research Space (South Africa)

    Baloyi, Ntsako

    2017-06-01

    Full Text Available people. Organisations require people’s trust and in turn, people are entitled to demand, as far as practicable and lawful, certain privileges from these organisations, such as the right to data protection or privacy. The power imbalance between... of restrictions on international data transfers, where there are no ‘adequate’ levels of personal data protection [5, 6]. This could have dire consequences for businesses. The European Union (EU) Directive [5] was a game changer. It resulted in the conclusion...

  17. Multilayered security and privacy protection in Car-to-X networks solutions from application down to physical layer

    CERN Document Server

    Stübing, Hagen

    2013-01-01

    Car-to-X (C2X) communication, in terms of Car-to-Car (C2C) and Car-to-Infrastructure (C2I) communication, aims at increasing road safety and traffic efficiency by exchanging foresighted traffic information. Thereby, security and privacy are regarded as an absolute prerequisite for successfully establishing C2X technology on the market. Towards the paramount objective of covering the entire ITS reference model with security and privacy measures, Hagen Stübing develops dedicated solutions for each layer in turn. On the application layer, a security architecture in terms of a Public Key Infras...

  18. NCTA v. FCC - Do Commercial Free Speech Justifications Trump Consumers' Personal Data Protection Rights? Answer To Shape Mobile Advertising Industry

    DEFF Research Database (Denmark)

    Cleff, Evelyne Beatrix; King, Nancy J.

    2010-01-01

    's right to communicate with their customers. Considering privacy risks associated with advances in computer technology, the complexities of modern information processing and evolving mobile advertising (m-advertising) practices, privacy regulations should not be equated with unwarranted speech regulations...... balance between protecting consumers' information privacy in an era of pervasive data processing and protecting the rights of marketers to engage in protected commercial free speech that involves using customers' personal information. A ruling against the FCC would have limited the use of government...... to support the growth of the global mobile advertising (m-advertising) industry....

  19. 78 FR 38724 - Privacy Act of 1974; Computer Matching Program

    Science.gov (United States)

    2013-06-27

    ... 1974; Computer Matching Program AGENCY: Department of Homeland Security/U.S. Citizenship and... Agreement that establishes a computer matching program between the Department of Homeland Security/U.S... and Privacy Protection Act of 1988 (Pub. L. 100-503) and the Computer Matching and Privacy Protection...

  20. Data privacy considerations in Intensive Care Grids.

    Science.gov (United States)

    Luna, Jesus; Dikaiakos, Marios D; Kyprianou, Theodoros; Bilas, Angelos; Marazakis, Manolis

    2008-01-01

    Novel eHealth systems are being designed to provide a citizen-centered health system; however, the ever growing demand for computing and data resources has required the adoption of Grid technologies. In most cases, such a Health Grid requires not only conveying patients' personal data through public networks, but also storing it in shared resources outside the hospital premises. These features introduce new security concerns, in particular related to privacy. In this paper we survey current legal and technological approaches that have been taken to protect a patient's personal data in eHealth systems, with a particular focus on Intensive Care Grids. However, a security analysis of the Intensive Care Grid system (ICGrid) shows that these security mechanisms are not enough to provide a comprehensive solution, mainly because the data-at-rest is still vulnerable to attacks from untrusted Storage Elements, where an attacker may access the data directly. To cope with these issues, we propose a new privacy-oriented protocol which uses a combination of encryption and fragmentation to improve data assurance while keeping compatibility with current legislation and Health Grid security mechanisms.
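
    The general encrypt-then-fragment idea can be illustrated in a few lines, as in the sketch below, which uses the third-party `cryptography` package (assumed available); it is a generic illustration rather than the ICGrid protocol itself, and the fragment count and storage layout are arbitrary.

    ```python
    from cryptography.fernet import Fernet

    def encrypt_and_fragment(data: bytes, key: bytes, n_fragments: int = 3):
        """Encrypt data, then split the ciphertext across several storage elements."""
        ciphertext = Fernet(key).encrypt(data)
        size = -(-len(ciphertext) // n_fragments)      # ceiling division
        return [ciphertext[i:i + size] for i in range(0, len(ciphertext), size)]

    def reassemble_and_decrypt(fragments, key: bytes) -> bytes:
        return Fernet(key).decrypt(b"".join(fragments))

    key = Fernet.generate_key()                        # kept inside the hospital domain
    record = b"ICU observation: patient 42, vitals stable"
    fragments = encrypt_and_fragment(record, key)      # each fragment goes to a different SE
    assert reassemble_and_decrypt(fragments, key) == record
    ```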

  1. Monitoring Employee Behavior Through the Use of Technology and Issues of Employee Privacy in America

    Directory of Open Access Journals (Sweden)

    Mahmoud Moussa

    2015-04-01

    Full Text Available Despite the historic American love for privacy that has enhanced innovation and creativity throughout the country, encroachments on privacy restrain individual freedom. Noticeably, advances in technology have offered decision makers remarkable monitoring capabilities that can be used in numerous tasks for multiple reasons. This has led scholars and practitioners to pose a significant number of questions about what is legitimate and illegitimate in the day-to-day affairs of a business. This article is composed of (a) research about electronic monitoring and privacy concerns; (b) definitions of, critiques of, and alternatives to electronic performance monitoring (EPM); (c) motives behind employee monitoring and leadership behaviors; (d) advice that makes monitoring less distressful; (e) employee monitoring policies; (f) reviewing policies and procedures; (g) the role of human resource development (HRD) in employee assessment and development; and (h) conclusion and recommendations for further studies.

  2. A Secure and Privacy-Preserving Targeted Ad-System

    Science.gov (United States)

    Androulaki, Elli; Bellovin, Steven M.

    Thanks to its low product-promotion cost and its efficiency, targeted online advertising has become very popular. Unfortunately, being profile-based, online advertising methods violate consumers' privacy, which has engendered resistance to the ads. However, protecting privacy through anonymity seems to encourage click-fraud. In this paper, we define consumer's privacy and present a privacy-preserving, targeted ad system (PPOAd) which is resistant towards click fraud. Our scheme is structured to provide financial incentives to all entities involved.

  3. Location Privacy in RFID Applications

    Science.gov (United States)

    Sadeghi, Ahmad-Reza; Visconti, Ivan; Wachsmann, Christian

    RFID-enabled systems allow fully automatic wireless identification of objects and are rapidly becoming a pervasive technology with various applications. However, despite their benefits, RFID-based systems also pose challenging risks, in particular concerning user privacy. Indeed, improvident use of RFID can disclose sensitive information about users and their locations, allowing detailed user profiles. Hence, it is crucial to identify and to enforce appropriate security and privacy requirements of RFID applications (that are also compliant with legislation). This chapter first discusses security and privacy requirements for RFID-enabled systems, focusing in particular on location privacy issues. Then it explores the advances in RFID applications, stressing the security and privacy shortcomings of existing proposals. Finally, it presents new promising directions for privacy-preserving RFID systems, where, as a case study, we focus on electronic tickets (e-tickets) for public transportation.

  4. Privacy and Ethics in Pediatric Environmental Health Research—Part II: Protecting Families and Communities

    Science.gov (United States)

    Fisher, Celia B.

    2006-01-01

    Background In pediatric environmental health research, information about family members is often directly sought or indirectly obtained in the process of identifying child risk factors and helping to tease apart and identify interactions between genetic and environmental factors. However, federal regulations governing human subjects research do not directly address ethical issues associated with protections for family members who are not identified as the primary “research participant.” Ethical concerns related to family consent and privacy become paramount as pediatric environmental health research increasingly turns to questions of gene–environment interactions. Objectives In this article I identify issues arising from and potential solutions for the privacy and informed consent challenges of pediatric environmental health research intended to adequately protect the rights and welfare of children, family members, and communities. Discussion I first discuss family members as secondary research participants and then the specific ethical challenges of longitudinal research on late-onset environmental effects and gene–environment interactions. I conclude with a discussion of the confidentiality and social risks of recruitment and data collection of research conducted within small or unique communities, ethnic minority populations, and low-income families. Conclusions The responsible conduct of pediatric environmental health research must be conceptualized as a goodness of fit between the specific research context and the unique characteristics of subjects and other family stakeholders. PMID:17035154

  5. "Everybody Knows Everybody Else's Business"-Privacy in Rural Communities.

    Science.gov (United States)

    Leung, Janni; Smith, Annetta; Atherton, Iain; McLaughlin, Deirdre

    2016-12-01

    Patients have a right to privacy in a health care setting. This involves conversational discretion, security of medical records and physical privacy of remaining unnoticed or unidentified when using health care services other than by those who need to know or whom the patient wishes to know. However, the privacy of cancer patients who live in rural areas is more difficult to protect due to the characteristics of rural communities. The purpose of this article is to reflect on concerns relating to the lack of privacy experienced by cancer patients and health care professionals in the rural health care setting. In addition, this article suggests future research directions to provide much needed evidence for educating health care providers and guiding health care policies that can lead to better protection of privacy among cancer patients living in rural communities.

  6. Online Privacy as a Corporate Social Responsibility

    DEFF Research Database (Denmark)

    Pollach, Irene

    2011-01-01

    Information technology and the Internet have added a new stakeholder concern to the corporate social responsibility agenda: online privacy. While theory suggests that online privacy is a corporate social responsibility, only very few studies in the business ethics literature have connected...... of the companies have comprehensive privacy programs, although more than half of them voice moral or relational motives for addressing online privacy. The privacy measures they have taken are primarily compliance measures, while measures that stimulate a stakeholder dialogue are rare. Overall, a wide variety...

  7. Privacy Implications of Surveillance Systems

    DEFF Research Database (Denmark)

    Thommesen, Jacob; Andersen, Henning Boje

    2009-01-01

    This paper presents a model for assessing the privacy 'cost' of a surveillance system. Surveillance systems collect and provide personal information or observations of people by means of surveillance technologies such as databases, video or location tracking. Such systems can be designed for various purposes, even as a service for those being observed, but in any case they will to some degree invade their privacy. The model provided here can indicate how invasive any particular system may be – and be used to compare the invasiveness of different systems. Applying a functional approach, the model is established by first considering the social function of privacy in everyday life, which in turn lets us determine which different domains will be considered as private, and finally identify the different types of privacy invasion. This underlying model (function – domain – invasion) then serves...

  8. Mechanism of personal privacy protection based on blockchain

    Institute of Scientific and Technical Information of China (English)

    章宁; 钟珊

    2017-01-01

    Aiming at the problem of personal privacy protection in the Internet car rental scenario, a personal privacy protection mechanism based on blockchain was proposed. Firstly, a blockchain-based framework for personal privacy protection was proposed to address the personal privacy issues exposed in Internet car rental. Secondly, the design and definition of the model were given through participant profiles, database design and performance analysis, and the framework and implementation of the model were elaborated in terms of granting authority, writing data, reading data and revoking authority. Finally, the realizability of the mechanism was demonstrated by the development of a blockchain-based system.
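
    As a rough illustration of the grant/write/read/revoke flow described above, the toy sketch below records permission events and data writes in a small hash-chained ledger; the data model and rules are assumptions for illustration, not the system implemented in the paper.

    ```python
    import hashlib
    import json
    import time

    class PermissionLedger:
        """Toy hash-chained ledger of permission grants, revocations and data writes."""

        def __init__(self):
            self.chain = []

        def _append(self, payload: dict) -> None:
            prev_hash = self.chain[-1]["hash"] if self.chain else "0" * 64
            body = {"payload": payload, "prev": prev_hash, "ts": time.time()}
            body["hash"] = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
            self.chain.append(body)

        def grant(self, owner, grantee):
            self._append({"op": "grant", "owner": owner, "grantee": grantee})

        def revoke(self, owner, grantee):
            self._append({"op": "revoke", "owner": owner, "grantee": grantee})

        def write(self, owner, data):
            self._append({"op": "write", "owner": owner, "data": data})

        def can_read(self, owner, reader) -> bool:
            """Reader may read owner's data iff the latest grant/revoke event allows it."""
            allowed = reader == owner
            for block in self.chain:
                p = block["payload"]
                if p.get("owner") == owner and p.get("grantee") == reader:
                    allowed = p["op"] == "grant"
            return allowed

    ledger = PermissionLedger()
    ledger.write("alice", "driving licence #A123")
    ledger.grant("alice", "rental_company")
    print(ledger.can_read("alice", "rental_company"))   # True
    ledger.revoke("alice", "rental_company")
    print(ledger.can_read("alice", "rental_company"))   # False
    ```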

  9. Privacy-Preserving Trajectory Collection

    DEFF Research Database (Denmark)

    Gidofalvi, Gyozo; Xuegang, Huang; Pedersen, Torben Bach

    2008-01-01

    In order to provide context-aware Location-Based Services, real location data of mobile users must be collected and analyzed by spatio-temporal data mining methods. However, the data mining methods need precise location data, while the mobile users want to protect their location privacy. To remedy this situation, this paper first formally defines novel location privacy requirements. Then, it briefly presents a system for privacy-preserving trajectory collection that meets these requirements. The system is composed of an untrusted server and clients communicating in a P2P network. Location data is anonymized in the system using data cloaking and data swapping techniques. Finally, the paper empirically demonstrates that the proposed system is effective and feasible.

  10. The Effectiveness of Health Care Information Technologies: Evaluation of Trust, Security Beliefs, and Privacy as Determinants of Health Care Outcomes

    Science.gov (United States)

    2018-01-01

    Background The diffusion of health information technologies (HITs) within the health care sector continues to grow. However, there is no theory explaining how success of HITs influences patient care outcomes. With the increase in data breaches, HITs’ success now hinges on the effectiveness of data protection solutions. Still, empirical research has only addressed privacy concerns, with little regard for other factors of information assurance. Objective The objective of this study was to study the effectiveness of HITs using the DeLone and McLean Information Systems Success Model (DMISSM). We examined the role of information assurance constructs (ie, the role of information security beliefs, privacy concerns, and trust in health information) as measures of HIT effectiveness. We also investigated the relationships between information assurance and three aspects of system success: attitude toward health information exchange (HIE), patient access to health records, and perceived patient care quality. Methods Using structural equation modeling, we analyzed the data from a sample of 3677 cancer patients from a public dataset. We used R software (R Project for Statistical Computing) and the Lavaan package to test the hypothesized relationships. Results Our extension of the DMISSM to health care was supported. We found that increased privacy concerns reduce the frequency of patient access to health records use, positive attitudes toward HIE, and perceptions of patient care quality. Also, belief in the effectiveness of information security increases the frequency of patient access to health records and positive attitude toward HIE. Trust in health information had a positive association with attitudes toward HIE and perceived patient care quality. Trust in health information had no direct effect on patient access to health records; however, it had an indirect relationship through privacy concerns. Conclusions Trust in health information and belief in the effectiveness of

  11. The Effectiveness of Health Care Information Technologies: Evaluation of Trust, Security Beliefs, and Privacy as Determinants of Health Care Outcomes.

    Science.gov (United States)

    Kisekka, Victoria; Giboney, Justin Scott

    2018-04-11

    The diffusion of health information technologies (HITs) within the health care sector continues to grow. However, there is no theory explaining how success of HITs influences patient care outcomes. With the increase in data breaches, HITs' success now hinges on the effectiveness of data protection solutions. Still, empirical research has only addressed privacy concerns, with little regard for other factors of information assurance. The objective of this study was to study the effectiveness of HITs using the DeLone and McLean Information Systems Success Model (DMISSM). We examined the role of information assurance constructs (ie, the role of information security beliefs, privacy concerns, and trust in health information) as measures of HIT effectiveness. We also investigated the relationships between information assurance and three aspects of system success: attitude toward health information exchange (HIE), patient access to health records, and perceived patient care quality. Using structural equation modeling, we analyzed the data from a sample of 3677 cancer patients from a public dataset. We used R software (R Project for Statistical Computing) and the Lavaan package to test the hypothesized relationships. Our extension of the DMISSM to health care was supported. We found that increased privacy concerns reduce the frequency of patient access to health records use, positive attitudes toward HIE, and perceptions of patient care quality. Also, belief in the effectiveness of information security increases the frequency of patient access to health records and positive attitude toward HIE. Trust in health information had a positive association with attitudes toward HIE and perceived patient care quality. Trust in health information had no direct effect on patient access to health records; however, it had an indirect relationship through privacy concerns. Trust in health information and belief in the effectiveness of information security safeguards increases

  12. E-Commerce and Privacy: Conflict and Opportunity.

    Science.gov (United States)

    Farah, Badie N.; Higby, Mary A.

    2001-01-01

    Electronic commerce has intensified conflict between businesses' need to collect data and customers' desire to protect privacy. Web-based privacy tools and legislation could add to the costs of e-commerce and reduce profitability. Business models not based on profiling customers may be needed. (SK)

  13. Fuzzy Privacy Decision for Context-Aware Access Personal Information

    Institute of Scientific and Technical Information of China (English)

    ZHANG Qingsheng; QI Yong; ZHAO Jizhong; HOU Di; NIU Yujie

    2007-01-01

    A context-aware privacy protection framework was designed for context-aware services, together with privacy control methods for accessing personal information in pervasive environments. In the process of a user's privacy decision, fuzzy privacy decisions can be produced as personal information sensitivity and the trust in the personal information receiver change. An uncertain privacy decision model for personal information disclosure was proposed based on changes in receiver trust and information sensitivity, and a fuzzy privacy decision information system was designed according to this model. Personal privacy control policies can be extracted from this information system by using rough set theory, which also solves the problem of learning privacy control policies for personal information disclosure.

  14. Technology as a Threat to Privacy: Ethical Challenges and Guidelines for the Information Professionals.

    Science.gov (United States)

    Britz, J. J.

    1996-01-01

    Assesses the impact of technology on privacy. Discusses electronic monitoring of people in the workplace; interception and reading of e-mail messages; merging of databases which contain personal information; rise in the number of hackers; and the development of software that makes the decoding of digital information virtually impossible. Presents…

  15. Children's Privacy in the Big Data Era: Research Opportunities.

    Science.gov (United States)

    Montgomery, Kathryn C; Chester, Jeff; Milosevic, Tijana

    2017-11-01

    This article focuses on the privacy implications of advertising on social media, mobile apps, and games directed at children. Academic research on children's privacy has primarily focused on the safety risks involved in sharing personal information on the Internet, leaving market forces (such as commercial data collection) as a less discussed aspect of children's privacy. Yet, children's privacy in the digital era cannot be fully understood without examining marketing practices, especially in the context of "big data." As children increasingly consume content on an ever-expanding variety of digital devices, media and advertising industries are creating new ways to track their behaviors and target them with personalized content and marketing messages based on individual profiles. The advent of the so-called Internet of Things, with its ubiquitous sensors, is expanding these data collection and profiling practices. These trends raise serious concerns about digital dossiers that could follow young people into adulthood, affecting their access to education, employment, health care, and financial services. Although US privacy law provides some safeguards for children younger than 13 years old online, adolescents are afforded no such protections. Moreover, scholarship on children and privacy continues to lag behind the changes taking place in global media, advertising, and technology. This article proposes collaboration among researchers from a range of fields that will enable cross-disciplinary studies addressing not only the developmental issues related to different age groups but also the design of digital media platforms and the strategies used to influence young people. Copyright © 2017 by the American Academy of Pediatrics.

  16. PMDP: A Framework for Preserving Multiparty Data Privacy in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Ji Li

    2017-01-01

    Full Text Available The amount of Internet data is significantly increasing due to the development of network technology, inducing the appearance of big data. Experiments have shown that deep mining and analysis on large datasets can introduce great benefits. Although cloud computing supports data analysis in an outsourced and cost-effective way, it brings serious privacy issues when the original data is sent to cloud servers. Meanwhile, the returned analysis result suffers from malicious inference attacks and can also disclose user privacy. In this paper, to address the above privacy issues, we propose a general framework for Preserving Multiparty Data Privacy (PMDP for short) in cloud computing. The PMDP framework can protect numeric data computing and publishing with the assistance of untrusted cloud servers and achieve delegation of storage simultaneously. Our framework is built upon several cryptographic primitives (e.g., secure multiparty computation and the differential privacy mechanism), which guarantees its security against semi-honest participants without collusion. We further instantiate PMDP with specific algorithms and demonstrate its security, efficiency, and advantages by presenting a security analysis and a performance discussion. Moreover, we propose a security-enhanced framework, sPMDP, to resist malicious inside participants and outside adversaries. We illustrate that both PMDP and sPMDP are reliable and scale well, and thus are desirable for practical applications.
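
    PMDP builds on primitives such as secure multiparty computation and differential privacy. As a minimal illustration of the multiparty flavor, the sketch below computes a sum over additive secret shares so that no single server sees any party's raw value; this is a generic textbook construction with assumed parameters, not the PMDP protocol itself.

    ```python
    import secrets

    MODULUS = 2 ** 61 - 1  # large prime modulus (illustrative choice)

    def share(value: int, n_servers: int):
        """Split a value into n additive shares that sum to the value mod MODULUS."""
        shares = [secrets.randbelow(MODULUS) for _ in range(n_servers - 1)]
        shares.append((value - sum(shares)) % MODULUS)
        return shares

    def aggregate(all_shares):
        """Each server sums the shares it holds; combining the partial sums gives the total."""
        per_server = [sum(column) % MODULUS for column in zip(*all_shares)]
        return sum(per_server) % MODULUS

    # Three data owners share their private values among three untrusted servers.
    values = [12, 30, 7]
    all_shares = [share(v, n_servers=3) for v in values]
    assert aggregate(all_shares) == sum(values) % MODULUS
    ```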

  17. Privacy in the Internet: Myth or reality

    Directory of Open Access Journals (Sweden)

    Mikarić Bratislav

    2016-01-01

    Full Text Available The present time, unthinkable without the Internet - from e-mail, through social networks, cloud services and GPS, to YouTube and mobile computing, at the business as well as the private level - poses a question: is there a way to protect data and their privacy on the Internet? What are the ways to control what personal information we will publicly share with others, and is there a safe way to protect privacy on the world's global computer network? The paper gives an overview of the situation in the field, as well as tips for achieving the desired level of data protection.

  18. Safeguarding patient privacy in electronic healthcare in the USA: the legal view.

    Science.gov (United States)

    Walsh, Diana; Passerini, Katia; Varshney, Upkar; Fjermestad, Jerry

    2008-01-01

    The conflict between the sweeping power of technology to access and assemble personal information and the ongoing concern about our privacy and security is ever increasing. While we gradually need greater electronic access to medical information, issues relating to patient privacy and vulnerability to security breaches mount. In this paper, we take a legal perspective and examine the existing patchwork of laws and obligations governing health information in the USA. The study finds that as Electronic Medical Records (EMRs) increase in scope and dissemination, privacy protections gradually decrease due to shortcomings in the legal system. The contributions of this paper are (1) an overview of the legal EMR issues in the USA, and (2) the identification of the unresolved legal issues and how these will escalate when health information is transmitted over wireless networks. More specifically, the paper discusses federal and state government regulations such as the Electronic Communications Privacy Act, the Health Insurance Portability and Accountability Act (HIPAA) and judicial intervention. Based on the legal overview, the unresolved challenges are identified and suggestions for future research are included.

  19. Big Data and Consumer Participation in Privacy Contracts: Deciding who Decides on Privacy

    Directory of Open Access Journals (Sweden)

    Michiel Rhoen

    2015-02-01

    Full Text Available Big data puts data protection to the test. Consumers granting permission to process their personal data are increasingly opening up their personal lives, thanks to the “datafication” of everyday life, indefinite data retention and the increasing sophistication of algorithms for analysis. The privacy implications of big data call for serious consideration of consumers’ opportunities to participate in decision-making processes about their contracts. If these opportunities are insufficient, the resulting rules may represent special interests rather than consumers’ needs. This may undermine the legitimacy of big data applications. This article argues that providing sufficient consumer participation in privacy matters requires choosing the best available decision-making mechanism. Is a consumer to negotiate his own privacy terms in the market, will lawmakers step in on his behalf, or is he to seek protection through the courts? Furthermore, is this a matter of national law or European law? These choices will affect the opportunities for achieving different policy goals associated with the possible benefits of the “big data revolution”.

  20. The pedagogy of Momus technologies: Facebook, privacy, and online intimacy.

    Science.gov (United States)

    van Manen, Max

    2010-08-01

    Through cable and wireless connections at home and at work, through Wi-Fi networks and wireless spots in hotels, coffee shops, and town squares, we are indeed connected to each other. But what is the phenomenology of this connection? Technologies of expression such as Facebook, MySpace, Twitter, and other social networking technologies increasingly become like Momus windows of Greek mythology, revealing one's innermost thoughts for all to see. They give access to what used to be personal, secret, and hidden in the lives of its users, especially the young. In this article I explore the pedagogy of Momus effects of social networking technologies in the way they may alter young people's experience of privacy, secrecy, solitude, and intimacy. In addition, I examine the forms of contact afforded by instant messaging and texting on wireless mobile technologies such as the cell phone (and its wireless hybrids) for the way young people are and stay in touch with each other, and how intimacies and inner lives are attended to.

  1. Security, privacy, and confidentiality issues on the Internet

    Science.gov (United States)

    Kelly, Grant; McKenzie, Bruce

    2002-01-01

    We introduce the issues around protecting information about patients and related data sent via the Internet. We begin by reviewing three concepts necessary to any discussion about data security in a healthcare environment: privacy, confidentiality, and consent. We then give some advice on how to protect local data. Authentication and privacy of e-mail via encryption is offered by Pretty Good Privacy (PGP) and Secure Multipurpose Internet Mail Extensions (S/MIME). The de facto Internet standard for encrypting Web-based information interchanges is Secure Sockets Layer (SSL), more recently known as Transport Layer Security or TLS. There is a public key infrastructure process to 'sign' a message whereby an individual's private key is used to sign a hash of the message; this can then be verified against the sender's public key. This ensures the data's authenticity and origin without conferring privacy, and is called a 'digital signature'. The best protection against viruses is not opening e-mails from unknown sources or those containing unusual message headers. PMID:12554559

  2. A concatenated coding scheme for biometric template protection

    NARCIS (Netherlands)

    Shao, X.; Xu, H.; Veldhuis, Raymond N.J.; Slump, Cornelis H.

    2012-01-01

    Cryptography may mitigate the privacy problem in biometric recognition systems. However, cryptographic technologies lack error-tolerance and biometric samples cannot be reproduced exactly, raising the robustness problem. The biometric template protection system needs a good feature extraction

  3. Mars Technology Program Planetary Protection Technology Development

    Science.gov (United States)

    Lin, Ying

    2006-01-01

    The objectives of the NASA Planetary Protection program are to preserve biological and organic conditions of solar-system bodies for future scientific exploration and to protect the Earth from potential hazardous extraterrestrial contamination. As the exploration of solar system continues, NASA remains committed to the implementation of planetary protection policy and regulations. To fulfill this commitment, the Mars Technology Program (MTP) has invested in a portfolio of tasks for developing necessary technologies to meet planetary protection requirements for the next decade missions.

  4. Transnational Saudi Arabian Youth and Facebook: Enacting Privacy and Identity

    Science.gov (United States)

    Abokhodair, Norah Abdulwahab

    2017-01-01

    Theories of privacy and identity in relationship to the use of Information Communication Technology (ICT) have been a topic of research for decades. However, little attention has been paid to the perception of privacy and identity from the perspective of Muslim Arab technology users. Privacy and identity in the context of the Arab world is highly…

  5. An Effective Grouping Method for Privacy-Preserving Bike Sharing Data Publishing

    Directory of Open Access Journals (Sweden)

    A S M Touhidul Hasan

    2017-10-01

    Full Text Available Bike sharing programs are eco-friendly transportation systems that are widespread in smart city environments. In this paper, we study the problem of privacy-preserving bike sharing microdata publishing. Bike sharing systems collect visiting information along with user identity and make it public after removing the user identity. Even after excluding user identification, the published bike sharing dataset will not be protected against privacy disclosure risks. An adversary may correlate published datasets based on bikes' visiting information to breach a user's privacy. In this paper, we propose a grouping-based anonymization method to protect the published bike sharing dataset from linking attacks. The proposed Grouping method ensures that the published bike sharing microdata will be protected from disclosure risks. Experimental results show that our approach can protect user privacy in the released datasets from disclosure risks and retains more data utility compared with existing methods.

  6. Millennial dissonance: an analysis of the privacy generational gap

    OpenAIRE

    Sher, Matthew J.

    2012-01-01

    The young Millennial generation has adopted social media and internet technology to an unprecedented degree. But this generation’s extensive usage of online services leaves Millennials open to various privacy vulnerabilities that have emerged with the new technology. Older generations hold concern that Millennials are ignoring the value of privacy when disclosing their personal information in exchange for online connectivity. This paper investigates the generational privacy concern through di...

  7. GAIN RATIO BASED FEATURE SELECTION METHOD FOR PRIVACY PRESERVATION

    Directory of Open Access Journals (Sweden)

    R. Praveena Priyadarsini

    2011-04-01

    Full Text Available Privacy preservation is a step in data mining that tries to safeguard sensitive information from unsanctioned disclosure, hence protecting individual data records and their privacy. There are various privacy preservation techniques such as k-anonymity, l-diversity, t-closeness and data perturbation. In this paper the k-anonymity privacy protection technique is applied to high-dimensional datasets such as Adult and Census. Since both datasets are high dimensional, a feature subset selection method based on Gain Ratio is applied: the attributes of the datasets are ranked and low-ranking attributes are filtered out to form new reduced data subsets. The k-anonymization privacy preservation technique is then applied to the reduced datasets. The privacy-preserved reduced datasets and the original datasets are compared for their accuracy on two data mining tasks, namely classification and clustering, using the naïve Bayes and k-means algorithms respectively. Experimental results show that classification and clustering accuracy are comparable for the reduced k-anonymized datasets and the original datasets.
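
    For readers who want the ranking criterion spelled out, the sketch below computes the gain ratio of one categorical attribute with respect to a class label; the toy data and function names are illustrative.

    ```python
    import math
    from collections import Counter

    def entropy(labels):
        counts = Counter(labels)
        total = len(labels)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    def gain_ratio(attribute_values, labels):
        """Information gain of the attribute divided by its split information."""
        total = len(labels)
        base = entropy(labels)
        conditional, split_info = 0.0, 0.0
        for value in set(attribute_values):
            subset = [lbl for a, lbl in zip(attribute_values, labels) if a == value]
            weight = len(subset) / total
            conditional += weight * entropy(subset)
            split_info -= weight * math.log2(weight)
        gain = base - conditional
        return gain / split_info if split_info > 0 else 0.0

    # Toy example: does 'workclass' help predict the income label?
    workclass = ["private", "private", "gov", "gov", "self", "self"]
    income = ["<=50K", "<=50K", ">50K", "<=50K", ">50K", ">50K"]
    print(round(gain_ratio(workclass, income), 3))
    ```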

  8. ‘Regulating’ Online Data Privacy

    Directory of Open Access Journals (Sweden)

    Paul Reid

    2004-09-01

    Full Text Available With existing data protection laws proving inadequate in the fight to protect online data privacy, and with the offline law of privacy in a state of change and uncertainty, the search for an alternative solution to the important problem of online data privacy should commence. Given the inherent problem of jurisdiction that the Internet presents, such a solution is best provided by a multi-national body with the power to approximate laws in as many jurisdictions as possible, with a recognised authority and a functioning enforcement mechanism. The European Union is such a body, and while existing data protection laws stem from the EU, they were neither tailored specifically for the Internet and the online world, nor do they fully harmonise the laws of the member states – an essential element in Internet regulation. Current laws face further problems with the ease and frequency of data transfers outwith the EU. An Internet-specific online data privacy regulation would fully approximate the laws of the twenty-five member states and, if suitably drafted, could perhaps, drawing upon EC competition jurisprudence, achieve a degree of extraterritoriality, thus combating the problem posed by transfers outwith the EU. Any solution, however, is dependent upon our political leaders having the political will and courage to reach an agreement upon any new law.

  9. Quantifying privacy and security of biometric fuzzy commitment

    NARCIS (Netherlands)

    Zhou, Xuebing; Kuijper, Arjan; Veldhuis, Raymond N.J.; Busch, Christoph

    2011-01-01

    Fuzzy commitment is an efficient template protection algorithm that can improve security and safeguard privacy of biometrics. Existing theoretical security analysis has proved that although privacy leakage is unavoidable, perfect security from information-theoretical points of view is possible when

  10. Toward sensitive document release with privacy guarantees

    OpenAIRE

    David Sánchez; Montserrat Batet

    2017-01-01

    Toward sensitive document release with privacy guarantees. DOI: 10.1016/j.engappai.2016.12.013 URL: http://www.sciencedirect.com/science/article/pii/S0952197616302408 Privacy has become a serious concern for modern Information Societies. The sensitive nature of much of the data that are daily exchanged or released to untrusted parties requires that responsible organizations undertake appropriate privacy protection measures. Nowadays, much...

  11. Mars Technology Program: Planetary Protection Technology Development

    Science.gov (United States)

    Lin, Ying

    2006-01-01

    This slide presentation reviews the development of Planetary Protection technology in the Mars Technology Program. The goal of the program is to develop technologies that will enable NASA to build, launch, and operate a mission that has subsystems with different Planetary Protection (PP) classifications, specifically for operating a Category IVb-equivalent subsystem from a Category IVa platform. The IVa category of planetary protection requires bioburden reduction only (no terminal sterilization is required), whereas the IVb category requires, in addition to the IVa requirements, terminal sterilization of the spacecraft. The differences between the categories are further reviewed.

  12. Extending SQL to Support Privacy Policies

    Science.gov (United States)

    Ghazinour, Kambiz; Pun, Sampson; Majedi, Maryam; Chinaci, Amir H.; Barker, Ken

    Increasing concerns over Internet applications that violate user privacy by exploiting (back-end) database vulnerabilities must be addressed to protect both customer privacy and to ensure corporate strategic assets remain trustworthy. This chapter describes an extension of database catalogues and Structured Query Language (SQL) for supporting privacy in Internet applications, such as in social networks, e-health, e-government, etc. The idea is to introduce new predicates to SQL commands to capture common privacy requirements, such as purpose, visibility, generalization, and retention for both mandatory and discretionary access control policies. The contribution is that corporations, when creating the underlying databases, will be able to define what their mandatory privacy policies are, with which all application users have to comply. Furthermore, each application user, when providing their own data, will be able to define their own privacy policies with which other users have to comply. The extension is supported with underlying catalogues and algorithms. The experiments demonstrate a very reasonable overhead for the extension. The result is a low-cost mechanism to create new systems that are privacy aware and also to transform legacy databases into their privacy-preserving equivalents. Although the examples are from social networks, one can apply the results to data security and user privacy of other enterprises as well.
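
    The idea of attaching purpose and retention predicates to the catalogue can be mimicked at the application layer, as in the sketch below, which filters column access by declared purpose and data age; the column names, policy fields and enforcement rule are illustrative assumptions and not the chapter's actual SQL syntax.

    ```python
    from datetime import datetime, timedelta, timezone

    # Hypothetical per-column privacy policy: allowed purposes and retention period.
    CATALOGUE_POLICY = {
        "email":     {"purposes": {"account_recovery"}, "retention_days": 365},
        "birthdate": {"purposes": {"age_verification"}, "retention_days": 30},
    }

    def allowed(column: str, purpose: str, stored_at: datetime) -> bool:
        """Return True if reading `column` for `purpose` respects the policy."""
        policy = CATALOGUE_POLICY.get(column)
        if policy is None:
            return True                                   # no policy: unrestricted
        if purpose not in policy["purposes"]:
            return False                                  # purpose limitation
        age = datetime.now(timezone.utc) - stored_at
        return age <= timedelta(days=policy["retention_days"])   # retention limit

    stored = datetime.now(timezone.utc) - timedelta(days=10)
    print(allowed("email", "marketing", stored))             # False: purpose not permitted
    print(allowed("birthdate", "age_verification", stored))  # True: purpose and retention ok
    ```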

  13. Privacy and Property? Multi-level Strategies for Protecting Personal Interests in Genetic Material

    OpenAIRE

    Laurie, Graeme

    2003-01-01

    The paper builds on earlier medico-legal work by Laurie on privacy in relation to genetic material. In this chapter, the author discusses not only Laurie's 'pro-privacy' views but also the limitations of privacy, particularly once information, genetic or otherwise, enters the public sphere. The article draws on cases and laws in the UK, continental Europe, and the US to provide a comparative view, suggesting an alternative approach to privacy.

  14. Data Transmission and Access Protection of Community Medical Internet of Things

    Directory of Open Access Journals (Sweden)

    Xunbao Wang

    2017-01-01

    Full Text Available On the basis of Internet of Things (IoT) technologies, the Community Medical Internet of Things (CMIoT) is a new medical information system that generates massive amounts of multiple types of medical data, containing all kinds of user identity data, various types of medical data, and other sensitive information. To effectively protect users’ privacy, we propose a secure privacy data protection scheme including transmission protection and access control. For uplink transmission data protection, bidirectional identity authentication and fragmented multipath data transmission are used, and for downlink data protection, fine-grained access control and dynamic authorization are used. Through theoretical analysis and experimental evaluation, it is proved that community medical data can be effectively protected during transmission and access without high performance loss.
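
    To illustrate the bidirectional identity authentication step mentioned above, the sketch below shows a symmetric challenge-response exchange based on HMAC over a pre-shared key; it is a generic textbook pattern with assumed message formats, not the exact CMIoT protocol.

    ```python
    import hashlib
    import hmac
    import secrets

    SHARED_KEY = secrets.token_bytes(32)   # provisioned to both device and gateway

    def respond(challenge: bytes, key: bytes) -> bytes:
        """Prove knowledge of the key by MACing the peer's challenge."""
        return hmac.new(key, challenge, hashlib.sha256).digest()

    def verify(challenge: bytes, response: bytes, key: bytes) -> bool:
        return hmac.compare_digest(respond(challenge, key), response)

    # Gateway authenticates the device ...
    gw_challenge = secrets.token_bytes(16)
    device_response = respond(gw_challenge, SHARED_KEY)
    assert verify(gw_challenge, device_response, SHARED_KEY)

    # ... and the device authenticates the gateway in the reverse direction.
    dev_challenge = secrets.token_bytes(16)
    gateway_response = respond(dev_challenge, SHARED_KEY)
    assert verify(dev_challenge, gateway_response, SHARED_KEY)
    ```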

  15. 78 FR 23810 - Privacy Act System of Records

    Science.gov (United States)

    2013-04-22

    ... SMALL BUSINESS ADMINISTRATION Privacy Act System of Records AGENCY: Small Business Administration. ACTION: Notice of new Privacy Act system of records and request for comment. SUMMARY: The Small Business... the protected information collected from applicants and participants in the Small Business Innovation...

  16. Achieving Network Level Privacy in Wireless Sensor Networks†

    Science.gov (United States)

    Shaikh, Riaz Ahmed; Jameel, Hassan; d’Auriol, Brian J.; Lee, Heejo; Lee, Sungyoung; Song, Young-Jae

    2010-01-01

    Full network level privacy has often been categorized into four sub-categories: Identity, Route, Location and Data privacy. Achieving full network level privacy is a critical and challenging problem due to the constraints imposed by the sensor nodes (e.g., energy, memory and computation power), sensor networks (e.g., mobility and topology) and QoS issues (e.g., packet reach-ability and timeliness). In this paper, we propose two new identity, route and location privacy algorithms and a data privacy mechanism that address this problem. The proposed solutions provide additional trustworthiness and reliability at a modest cost of memory and energy. We also prove that our proposed solutions provide protection against various privacy disclosure attacks, such as eavesdropping and hop-by-hop trace back attacks. PMID:22294881

  17. Achieving Network Level Privacy in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Sungyoung Lee

    2010-02-01

    Full Text Available Full network level privacy has often been categorized into four sub-categories: Identity, Route, Location and Data privacy. Achieving full network level privacy is a critical and challenging problem due to the constraints imposed by the sensor nodes (e.g., energy, memory and computation power), sensor networks (e.g., mobility and topology) and QoS issues (e.g., packet reach-ability and timeliness). In this paper, we propose two new identity, route and location privacy algorithms and a data privacy mechanism that address this problem. The proposed solutions provide additional trustworthiness and reliability at a modest cost of memory and energy. We also prove that our proposed solutions provide protection against various privacy disclosure attacks, such as eavesdropping and hop-by-hop trace back attacks.

  18. The Impact of User Privacy Concerns and Ethnic Cultural Values on Attitudes toward the Use of Biometric Technology

    Science.gov (United States)

    Carpenter, Darrell R.

    2011-01-01

    Biometric technology is rapidly gaining popularity as an access control mechanism in the workplace. In some instances, systems relying on biometric technology have not been well received by employees. One reason for resistance may be perceived privacy issues associated with biometrics. This research draws on previous organizational information…

  19. Trust-aware Privacy Control for Social Media

    OpenAIRE

    Li, Na; Najafian-Razavi, Maryam; Gillet, Denis

    2011-01-01

    Due to the huge exposure of personal information in social media, a challenge now is to design effective privacy mechanisms that protect against unauthorized access to social data. In this paper, a trust model for social media is first presented. Based on the trust model, a trust-aware privacy control protocol is proposed, that exploits the underlying inter-entity trust information. The objective is to design a fine-grained privacy scheme that ensures a user’s online information is disclosed ...

  20. Personal Data Protection in New Zealand: Lessons for South Africa?

    Directory of Open Access Journals (Sweden)

    A Roos

    2008-12-01

    Full Text Available In 1995 the European Union adopted a Directive on data protection. Article 25 of this Directive compels all EU member countries to adopt data protection legislation and to prevent the transfer of personal data to non-EU member countries ('third countries') that do not provide an adequate level of data protection. Article 25 results in the Directive having extra-territorial effect and exerting an influence in countries outside the EU. Like South Africa, New Zealand is a 'third' country in terms of the EU Directive on data protection. New Zealand recognised the need for data protection and adopted a data protection Act over 15 years ago. The focus of this article is on the data protection provisions in New Zealand law with a view to establishing whether South Africa can learn any lessons from them. In general, it can be said that although New Zealand law does not expressly recognise a right to privacy, it has a data protection regime that functions well and that goes a long way to providing adequate data protection as required by the EU Directive on data protection. Nevertheless, the EU has not made a finding to that effect as yet. The New Zealand data protection Act requires a couple of amendments before New Zealand might be adjudged ‘adequate’. South Africa’s protection of the right to privacy and identity is better developed and more extensive than that of New Zealand. Privacy is recognised and protected in the law of delict and by the South African Constitution. Despite South Africa’s apparently high regard for the individual’s right to privacy and identity and our well-developed common and constitutional law of privacy, South Africa does not meet the adequacy requirement of the EU Directive, because we do not have a data protection Act. This means that South African participants in the information technology arena are at a constant disadvantage. It is argued that South Africa should follow New Zealand’s example and adopt a data

  1. Electronic Mail, Privacy, and the Electronic Communications Privacy Act of 1986: Technology in Search of Law.

    Science.gov (United States)

    Samoriski, Jan H.; And Others

    1996-01-01

    Attempts to clarify the status of e-mail privacy under the Electronic Communications Privacy Act of 1986 (ECPA). Examines current law and the paucity of definitive case law. A review of cases and literature suggests there is a gap in the existing ECPA that allows for potentially abusive electronic monitoring and interception of e-mail,…

  2. The awareness of Privacy issues in Ambient Intelligence

    Directory of Open Access Journals (Sweden)

    Mar LÓPEZ

    2015-03-01

    Full Text Available Ambient Intelligence (AmI) involves extensive and invisible integration of computer technologies in people's daily lives: Smart Sensors, Smart Phones, Tablets, Wireless Sensor Networks (Wi-Fi, Bluetooth, NFC, RFID, etc.), Internet (Facebook, WhatsApp, Twitter, YouTube, Blogs, Cloud Computing, etc.). The Intelligent Environments (IE) collect and process a massive amount of person-related and sensitive information. The aim of this work is to show the awareness of privacy issues in AmI and to identify the relevant design issues that should be addressed in order to provide privacy in the design of Ambient Intelligence applications, focused on the user's domain and the technologies involved. We propose a conceptual framework to enforce privacy that takes care of the interaction between technologies and devices, users and the application's domain, with different modules that contain different steps relating to the privacy policies.

  3. Biometrics and privacy

    NARCIS (Netherlands)

    Grijpink, J.H.A.M.

    2001-01-01

    Biometrics offers many alternatives for protecting our privacy and preventing us from falling victim to crime. Biometrics can even serve as a solid basis for safe anonymous and semi-anonymous legal transactions. In this article Jan Grijpink clarifies which concepts and practical applications this

  4. The disclosure of diagnosis codes can breach research participants' privacy.

    Science.gov (United States)

    Loukides, Grigorios; Denny, Joshua C; Malin, Bradley

    2010-01-01

    De-identified clinical data in standardized form (eg, diagnosis codes), derived from electronic medical records, are increasingly combined with research data (eg, DNA sequences) and disseminated to enable scientific investigations. This study examines whether released data can be linked with identified clinical records that are accessible via various resources to jeopardize patients' anonymity, and the ability of popular privacy protection methodologies to prevent such an attack. The study experimentally evaluates the re-identification risk of a de-identified sample of Vanderbilt's patient records involved in a genome-wide association study. It also measures the level of protection from re-identification, and data utility, provided by suppression and generalization. Privacy protection is quantified using the probability of re-identifying a patient in a larger population through diagnosis codes. Data utility is measured at a dataset level, using the percentage of retained information, as well as its description, and at a patient level, using two metrics based on the difference between the distribution of International Classification of Diseases (ICD) version 9 codes before and after applying privacy protection. More than 96% of 2800 patients' records are shown to be uniquely identified by their diagnosis codes with respect to a population of 1.2 million patients. Generalization is shown to reduce further the percentage of de-identified records by less than 2%, and over 99% of the three-digit ICD-9 codes need to be suppressed to prevent re-identification. Popular privacy protection methods are inadequate to deliver a sufficiently protected and useful result when sharing data derived from complex clinical systems. The development of alternative privacy protection models is thus required.
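
    The central risk measure described above, how often a patient's combination of diagnosis codes singles them out in a reference population, can be approximated with a simple uniqueness count; the toy ICD-9 code sets below are invented for illustration.

```python
# Toy illustration of re-identification risk from diagnosis codes: a record is
# risky when its exact set of ICD-9 codes matches very few people in the
# reference population. All code sets below are invented.
from collections import Counter

population = [
    frozenset({"250.00", "401.9"}),            # diabetes + hypertension (common)
    frozenset({"250.00", "401.9"}),
    frozenset({"250.00", "401.9", "277.00"}),  # adds a rare code (rare combination)
    frozenset({"401.9"}),
]
released_sample = [frozenset({"250.00", "401.9", "277.00"}),
                   frozenset({"250.00", "401.9"})]

counts = Counter(population)
for codes in released_sample:
    matches = counts.get(codes, 0)
    # Probability of picking the right person among everyone with these codes.
    risk = 1.0 / matches if matches else None
    print(sorted(codes), "matches:", matches, "re-identification risk:", risk)
```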

  5. Security, privacy and trust in cloud systems

    CERN Document Server

    Nepal, Surya

    2013-01-01

    The book compiles technologies for enhancing and provisioning security, privacy and trust in cloud systems based on Quality of Service requirements. It is a timely contribution to a field that is gaining considerable research interest and momentum, and it provides comprehensive coverage of technologies related to cloud security, privacy and trust. In particular, the book includes - Cloud security fundamentals and related technologies to-date, with a comprehensive coverage of evolution, current landscape, and future roadmap. - A smooth organization with introductory, advanced and specialist content

  6. Developing genetic privacy legislation: the South Carolina experience.

    Science.gov (United States)

    Edwards, J G; Young, S R; Brooks, K A; Aiken, J H; Patterson, E D; Pritchett, S T

    1998-01-01

    The availability of presymptomatic and predisposition genetic testing has spawned the need for legislation prohibiting health insurance discrimination on the basis of genetic information. The federal effort, the Health Insurance Portability and Accountability Act (HIPAA) of 1996, falls short by protecting only those who access insurance through group plans. A committee of University of South Carolina professionals convened in 1996 to develop legislation in support of genetic privacy for the state of South Carolina. The legislation prevents health insurance companies from denying coverage or setting insurance rates on the basis of genetic information. It also protects the privacy of genetic information and prohibits performance of genetic tests without specific informed consent. In preparing the bill, genetic privacy laws from other states were reviewed, and a modified version of the Virginia law adopted. The South Carolina Committee for the Protection of Genetic Privacy version went a step further by including enforcement language and excluding Virginia's sunset clause. The definition of genetic information encompassed genetic test results, and importantly, includes family history of genetic disease. Our experience in navigating through the state legislature and working through opposition from the health insurance lobby is detailed herein.

  7. Privacy and human behavior in the age of information.

    Science.gov (United States)

    Acquisti, Alessandro; Brandimarte, Laura; Loewenstein, George

    2015-01-30

    This Review summarizes and draws connections between diverse streams of empirical research on privacy behavior. We use three themes to connect insights from social and behavioral sciences: people's uncertainty about the consequences of privacy-related behaviors and their own preferences over those consequences; the context-dependence of people's concern, or lack thereof, about privacy; and the degree to which privacy concerns are malleable—manipulable by commercial and governmental interests. Organizing our discussion by these themes, we offer observations concerning the role of public policy in the protection of privacy in the information age. Copyright © 2015, American Association for the Advancement of Science.

  8. Robust image obfuscation for privacy protection in Web 2.0 applications

    Science.gov (United States)

    Poller, Andreas; Steinebach, Martin; Liu, Huajian

    2012-03-01

    We present two approaches to robust image obfuscation based on permutation of image regions and channel intensity modulation. The proposed concept of robust image obfuscation is a step towards end-to-end security in Web 2.0 applications. It helps to protect the privacy of the users against threats caused by internet bots and web applications that extract biometric and other features from images for data-linkage purposes. The approaches described in this paper consider that images uploaded to Web 2.0 applications pass several transformations, such as scaling and JPEG compression, until the receiver downloads them. In contrast to existing approaches, our focus is on usability, therefore the primary goal is not a maximum of security but an acceptable trade-off between security and resulting image quality.
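
    A minimal sketch of the region-permutation idea, assuming a keyed pseudo-random shuffle of fixed-size blocks: the key holder can invert the shuffle, while real robustness against scaling and JPEG recompression, which the paper targets, needs considerably more machinery than this.

```python
# Minimal keyed block-permutation obfuscation (illustrative only; a production
# scheme must survive scaling and JPEG recompression, which this does not).
import numpy as np

def permute_blocks(img: np.ndarray, key: int, block: int = 16, inverse: bool = False):
    """Shuffle (or un-shuffle) fixed-size image blocks with a keyed permutation."""
    h, w = img.shape[:2]
    bh, bw = h // block, w // block
    blocks = [img[r*block:(r+1)*block, c*block:(c+1)*block].copy()
              for r in range(bh) for c in range(bw)]
    perm = np.random.default_rng(key).permutation(len(blocks))
    out = img.copy()
    for src, dst in enumerate(perm):
        a, b = (dst, src) if inverse else (src, dst)   # invert mapping on decode
        r, c = divmod(b, bw)
        out[r*block:(r+1)*block, c*block:(c+1)*block] = blocks[a]
    return out

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)  # dummy image
obfuscated = permute_blocks(image, key=1234)
restored = permute_blocks(obfuscated, key=1234, inverse=True)
assert np.array_equal(restored, image)
```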

  9. Privacy-Preserving Matching of Spatial Datasets with Protection against Background Knowledge

    DEFF Research Database (Denmark)

    Ghinita, Gabriel; Vicente, Carmen Ruiz; Shang, Ning

    2010-01-01

    should be disclosed. Previous research efforts focused on private matching for relational data, and rely either on space-embedding or on SMC techniques. Space-embedding transforms data points to hide their exact attribute values before matching is performed, whereas SMC protocols simulate complex digital circuits that evaluate the matching condition without revealing anything else other than the matching outcome. However, existing solutions have at least one of the following drawbacks: (i) they fail to protect against adversaries with background knowledge on data distribution, (ii) they compromise privacy by returning large amounts of false positives and (iii) they rely on complex and expensive SMC protocols. In this paper, we introduce a novel geometric transformation to perform private matching on spatial datasets. Our method is efficient and it is not vulnerable to background knowledge attacks. We consider...
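
    To illustrate the generic space-embedding flavour referred to above (not the paper's own, stronger transformation), the sketch below applies a secret keyed rotation and translation to both parties' points before matching; because the transform is an isometry, proximity matching still works on the transformed coordinates. All coordinates and the key are made up.

```python
# Generic space-embedding flavour: both parties apply the same secret rotation
# and translation to their points, then match on the transformed coordinates.
# Distances are preserved, so proximity matching still works. (As the abstract
# notes, such simple transforms are vulnerable to background-knowledge attacks;
# the paper proposes a stronger transformation.)
import math
import random

def keyed_transform(points, key):
    rnd = random.Random(key)                     # shared secret key
    theta = rnd.uniform(0, 2 * math.pi)          # secret rotation angle
    tx, ty = rnd.uniform(-1e3, 1e3), rnd.uniform(-1e3, 1e3)  # secret shift
    cos_t, sin_t = math.cos(theta), math.sin(theta)
    return [(x * cos_t - y * sin_t + tx, x * sin_t + y * cos_t + ty)
            for x, y in points]

def matches(a, b, radius):
    """Pairs (i, j) whose transformed points lie within the given radius."""
    return [(i, j) for i, p in enumerate(a) for j, q in enumerate(b)
            if math.dist(p, q) <= radius]

alice = keyed_transform([(10.0, 20.0), (55.0, 40.0)], key="shared-secret")
bob   = keyed_transform([(10.2, 19.9), (90.0, 90.0)], key="shared-secret")
print(matches(alice, bob, radius=1.0))           # -> [(0, 0)]
```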

  10. 75 FR 23214 - HIPAA Privacy Rule Accounting of Disclosures Under the Health Information Technology for Economic...

    Science.gov (United States)

    2010-05-03

    ...-AB62 HIPAA Privacy Rule Accounting of Disclosures Under the Health Information Technology for Economic... disclosures, the administrative burden on covered entities and business associates of accounting for such...: HITECH Accounting of Disclosures, Hubert H. Humphrey Building, Room 509F, 200 Independence Avenue, SW...

  11. Server-Aided Verification Signature with Privacy for Mobile Computing

    Directory of Open Access Journals (Sweden)

    Lingling Xu

    2015-01-01

    Full Text Available With the development of wireless technology, much data communication and processing has been conducted in mobile devices with wireless connections. Mobile devices will always be resource-poor relative to static ones, even as their absolute capabilities improve, and therefore they cannot process some expensive computational tasks due to their constrained computational resources. To address this problem, server-aided computing has been studied, in which power-constrained mobile devices can outsource some expensive computation to a server with powerful resources in order to reduce their computational load. However, in existing server-aided verification signature schemes, the server can learn some information about the message-signature pair to be verified, which is undesirable, especially when the message includes some secret information. In this paper, we mainly study server-aided verification signatures with privacy, in which the message-signature pair to be verified can be protected from the server. Two definitions of privacy for server-aided verification signatures are presented under collusion attacks between the server and the signer. Then, based on existing signatures, two concrete server-aided verification signature schemes with privacy are proposed, which are both proved secure.

  12. Privacy Protection in Participatory Sensing Applications Requiring Fine-Grained Locations

    DEFF Research Database (Denmark)

    Dong, Kai; Gu, Tao; Tao, Xianping

    2010-01-01

    The emerging participatory sensing applications have brought a privacy risk where users expose their location information. Most of the existing solutions preserve location privacy by generalizing a precise user location to a coarse-grained location, and hence they cannot be applied in those appli...... provider is an trustworthy entity, making our solution more feasible to practical applications. We present and analyze our security model, and evaluate the performance and scalability of our system....

  13. 75 FR 68852 - Privacy Act of 1974; System of Records Notice

    Science.gov (United States)

    2010-11-09

    ...., Washington, DC 20590 or [email protected] . FOR FURTHER INFORMATION CONTACT: For privacy issues please... DEPARTMENT OF TRANSPORTATION Office of the Secretary Privacy Act of 1974; System of Records Notice... Secretary of Transportation (DOT/OST) proposes to establish a DOT-wide system of records under the Privacy...

  14. Privacy as Personality Right: Why the ECtHR’s Focus on Ulterior Interests Might Prove Indispensable in the Age of “Big Data”

    Directory of Open Access Journals (Sweden)

    Bart van der Sloot

    2015-02-01

    Full Text Available Article 8 ECHR was adopted as a classic negative right, which provides the citizen protection from unlawful and arbitrary interference by the state with his private and family life, home and communication. The ECtHR, however, has gradually broadened its scope so that the right to privacy encroaches upon other provisions embodied in the Convention, includes rights and freedoms explicitly left out of the ECHR by the drafters of the Convention and functions as the main pillar on which the Court has built its practice of opening up the Convention for new rights and freedoms. Consequently, Article 8 ECHR has been transformed from a classic privacy right to a personality right, providing protection to the personal development of individuals. Apart from its theoretical significance, this shift might prove indispensable in the age of Big Data, as personality rights protect a different type of interest, which is far easier to substantiate in the new technological paradigm than those associated with the right to privacy.

  15. The Genetic Privacy Act and commentary

    Energy Technology Data Exchange (ETDEWEB)

    Annas, G.J.; Glantz, L.H.; Roche, P.A.

    1995-02-28

    The Genetic Privacy Act is a proposal for federal legislation. The Act is based on the premise that genetic information is different from other types of personal information in ways that require special protection. Therefore, to effectively protect genetic privacy, unauthorized collection and analysis of individually identifiable DNA must be prohibited. As a result, the premise of the Act is that no stranger should have or control identifiable DNA samples or genetic information about an individual unless that individual specifically authorizes the collection of DNA samples for the purpose of genetic analysis, authorizes the creation of that private information, and has access to and control over the dissemination of that information.

  16. Protecting Privacy in Big Data: A Layered Approach for Curriculum Integration

    Science.gov (United States)

    Schwieger, Dana; Ladwig, Christine

    2016-01-01

    The demand for college graduates with skills in big data analysis is on the rise. Employers in all industry sectors have found significant value in analyzing both separate and combined data streams. However, news reports continue to script headlines drawing attention to data improprieties, privacy breaches and identity theft. While data privacy is…

  17. Personalized privacy-preserving frequent itemset mining using randomized response.

    Science.gov (United States)

    Sun, Chongjing; Fu, Yan; Zhou, Junlin; Gao, Hui

    2014-01-01

    Frequent itemset mining is the important first step of association rule mining, which discovers interesting patterns from massive data. There are increasing concerns about the privacy problem in frequent itemset mining, and some works have been proposed to handle this kind of problem. In this paper, we introduce a personalized privacy problem, in which different attributes may need different levels of privacy protection. To solve this problem, we give a personalized privacy-preserving method using the randomized response technique. By providing different privacy levels for different attributes, this method can achieve higher accuracy in frequent itemset mining than the traditional method that provides the same privacy level for all attributes. Finally, our experimental results show that our method obtains better results in frequent itemset mining while preserving personalized privacy.
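
    A minimal sketch of randomized response with per-attribute privacy levels, in the spirit of the abstract: each item's presence is reported truthfully with an item-specific probability, and the true support is estimated by inverting the perturbation. The items, probabilities and estimator form are illustrative assumptions, not the paper's exact mechanism.

```python
# Randomized response sketch with per-attribute privacy levels: attributes a
# user considers more sensitive keep their true value with lower probability.
import random

def randomize(transaction, all_items, keep_prob):
    """Report each item's presence truthfully with probability keep_prob[item],
    otherwise report the opposite (classic Warner-style randomized response)."""
    noisy = set()
    for item in all_items:
        truth = item in transaction
        reported = truth if random.random() < keep_prob[item] else not truth
        if reported:
            noisy.add(item)
    return noisy

def estimate_support(noisy_db, item, keep_prob, n):
    """Unbiased estimate of true support: observed = pi*p + (1-pi)*(1-p)."""
    p = keep_prob[item]
    observed = sum(item in t for t in noisy_db) / n
    return (observed - (1 - p)) / (2 * p - 1)

items = ["bread", "beer", "medication"]
keep_prob = {"bread": 0.95, "beer": 0.85, "medication": 0.60}  # per-attribute levels
random.seed(7)
true_db = [{"bread", "beer"} if i % 2 else {"bread", "medication"} for i in range(10000)]
noisy_db = [randomize(t, items, keep_prob) for t in true_db]
print(round(estimate_support(noisy_db, "medication", keep_prob, len(true_db)), 3))
# close to the true support of 0.5
```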

  18. Digital privacy in Asia: Setting the agenda | IDRC - International ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2016-06-09

    Jun 9, 2016 ... The report, A New Dawn: Privacy in Asia, summarizes the findings of the research. ... among citizens about protecting their personal data and Internet privacy. ... A study on mobile phone use by the poor has resulted in the ...

  19. What's that, you say? Employee expectations of privacy when using employer-provided technology--and how employers can defeat them.

    Science.gov (United States)

    Herrin, Barry S

    2012-01-01

    Two 2010 court cases that determined the effectiveness of policies governing employees' use of employer-provided communication devices can be used to guide employers when constructing their own technology policies. In light of a policy that stated that "users should have no expectation of privacy or confidentiality," one case established that the employer was in the right. However, a separate case favored the employee due, in part, to an "unclear and ambiguous" policy. Ultimately, employers can restrict the use of employer-furnished technology by employees by: 1) clearly outlining that employees do not have a reasonable expectation of privacy in their use of company devices; 2) stating that any use of personal e-mail accounts using employer-provided technology will be subject to the policy; 3) detailing all technology used to monitor employees; 4) identifying company devices covered; 5) not exposing the content of employee communications; and 6) having employees sign and acknowledge the policy.

  20. Because we care: Privacy Dashboard on Firefox OS

    OpenAIRE

    Piekarska, Marta; Zhou, Yun; Strohmeier, Dominik; Raake, Alexander

    2015-01-01

    In this paper we present the Privacy Dashboard -- a tool designed to inform and empower the people using mobile devices, by introducing features such as Remote Privacy Protection, Backup, Adjustable Location Accuracy, Permission Control and Secondary-User Mode. We have implemented our solution on FirefoxOS and conducted user studies to verify the usefulness and usability of our tool. The paper starts with a discussion of different aspects of mobile privacy, how users perceive it and how much ...

  1. Reclaiming Data Ownership: Differential Privacy in a Decentralized Setting

    OpenAIRE

    Asplund, Alexander Benjamin; Hartvigsen, Peter F

    2015-01-01

    In the field of privacy-preserving data mining the common practice has been to gather data from the users, centralize it in a single database, and employ various anonymization techniques to protect the personally identifiable information contained within the data. Both theoretical analyses and real-world examples of data breaches have proven that these methods have severe shortcomings in protecting an individual's privacy. A major breakthrough was achieved in 2006 when a method called differ...
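
    One common way to keep data ownership with the user is local perturbation: each participant adds calibrated Laplace noise before anything leaves their device, so only noisy values are centralized. The sketch below illustrates that general idea under assumed parameters; it is not the specific mechanism of the thesis.

```python
# Minimal sketch of decentralized (local) perturbation: each user adds
# calibrated Laplace noise to their own value before sending it, so the
# aggregator never sees raw data. Parameters are illustrative only.
import math
import random

def laplace_noise(scale, rnd):
    """Sample Laplace(0, scale) via the inverse CDF."""
    u = rnd.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def local_release(value, lo, hi, epsilon, rnd):
    """Clamp the user's value and add noise calibrated to the range (sensitivity)."""
    clamped = min(max(value, lo), hi)
    return clamped + laplace_noise((hi - lo) / epsilon, rnd)

rnd = random.Random(42)
true_ages = [rnd.randint(18, 80) for _ in range(50_000)]
reports = [local_release(a, lo=18, hi=80, epsilon=1.0, rnd=rnd) for a in true_ages]
# Individual reports are very noisy, but the aggregate mean stays close.
print(sum(true_ages) / len(true_ages), sum(reports) / len(reports))
```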

  2. Balancing Between Privacy and Patient Needs for Health Information in the Age of Participatory Health and Social Media: A Scoping Review.

    Science.gov (United States)

    Househ, Mowafa; Grainger, Rebecca; Petersen, Carolyn; Bamidis, Panagiotis; Merolli, Mark

    2018-04-22

    balancing individual needs and the desire to uphold privacy and confidentiality. We recommend that guidelines for both patients and clinicians, in terms of their use of participatory health-enabling technologies, are developed to ensure that patient privacy and confidentiality are protected, and a maximum benefit can be realized. Georg Thieme Verlag KG Stuttgart.

  3. FIRE PROTECTION SYSTEMS AND TECHNOLOGIES

    Directory of Open Access Journals (Sweden)

    Aristov Denis Ivanovich

    2016-03-01

    Full Text Available The All-Russian Congress “Fire Stop Moscow” was devoted to the analysis of the four segments of the industry of fire protection systems and technologies: the design of fire protection systems, the latest developments and technologies of active and passive fire protection of buildings, the state and the development of the legal framework, and the practice of fire protection of buildings and structures. The forum brought together representatives of the industry of fire protection systems, scientists, leading experts, specialists in fire protection and representatives of construction companies from different regions of Russia. In parallel with the Congress, an Industrial Exhibition of fire protection systems, materials and technology was held, where manufacturers presented their products. The urgency of the “Fire Stop Moscow” Congress in 2015, organized by the Congress Bureau ODF Events, lies primarily in the fact that it considered the full range of issues related to the fire protection of building and construction projects and studied the state of the regulatory framework for fire safety and the efficiency of public services, research centers, private companies and businesses in the area of fire safety. The main practical significance of the event, which was widely covered in the media space, was the opportunity to share views and information between management, science, and business practice on implementing fire protection systems in the conditions of modern economic relations and market realities. Keywords: congress, fire protection, systems, technologies, fire protection systems, exhibition

  4. Security and privacy in biometrics

    CERN Document Server

    Campisi, Patrizio

    2013-01-01

    This important text/reference presents the latest secure and privacy-compliant techniques in automatic human recognition. Featuring viewpoints from an international selection of experts in the field, the comprehensive coverage spans both theory and practical implementations, taking into consideration all ethical and legal issues. Topics and features: presents a unique focus on novel approaches and new architectures for unimodal and multimodal template protection; examines signal processing techniques in the encrypted domain, security and privacy leakage assessment, and aspects of standardizati

  5. 77 FR 46643 - Children's Online Privacy Protection Rule

    Science.gov (United States)

    2012-08-06

    ... providing notice to and obtaining consent from parents. Conversely, online services whose business models..., challenging others to gameplay, swapping digital collectibles, participating in monitored `chat' with... Digital Democracy (``CDD''), Consumers Union (``CU''), and the Electronic Privacy Information Center...

  6. Not All Adware Is Badware: Towards Privacy-Aware Advertising

    Science.gov (United States)

    Haddadi, Hamed; Guha, Saikat; Francis, Paul

    Online advertising is a major economic force in the Internet today. A basic goal of any advertising system is to accurately target the ad to the recipient audience. While Internet technology brings the promise of extremely well-targeted ad placement, there have always been serious privacy concerns surrounding personalization. Today there is a constant battle between privacy advocates and advertisers, where advertisers try to push new personalization technologies, and privacy advocates try to stop them. As long as privacy advocates, however, are unable to propose an alternative personalization system that is private, this is a battle they are destined to lose. This paper presents the framework for such an alternative system, the Private Verifiable Advertising (Privad). We describe the privacy issues associated with today’s advertising systems, describe Privad, and discuss its pros and cons and the challenges that remain.

  7. Privacy Challenges of Genomic Big Data.

    Science.gov (United States)

    Shen, Hong; Ma, Jian

    2017-01-01

    With the rapid advancement of high-throughput DNA sequencing technologies, genomics has become a big data discipline where large-scale genetic information of human individuals can be obtained efficiently with low cost. However, such massive amount of personal genomic data creates tremendous challenge for privacy, especially given the emergence of direct-to-consumer (DTC) industry that provides genetic testing services. Here we review the recent development in genomic big data and its implications on privacy. We also discuss the current dilemmas and future challenges of genomic privacy.

  8. What was privacy?

    Science.gov (United States)

    McCreary, Lew

    2008-10-01

    Why is that question in the past tense? Because individuals can no longer feel confident that the details of their lives--from identifying numbers to cultural preferences--will be treated with discretion rather than exploited. Even as Facebook users happily share the names of their favorite books, movies, songs, and brands, they often regard marketers' use of that information as an invasion of privacy. In this wide-ranging essay, McCreary, a senior editor at HBR, examines numerous facets of the privacy issue, from Google searches, public shaming on the internet, and cell phone etiquette to passenger screening devices, public surveillance cameras, and corporate chief privacy officers. He notes that IBM has been a leader on privacy; its policy forswearing the use of employees' genetic information in hiring and benefits decisions predated the federal Genetic Information Nondiscrimination Act by three years. Now IBM is involved in an open-source project known as Higgins to provide users with transportable, potentially anonymous online presences. Craigslist, whose CEO calls it "as close to 100% user driven as you can get," has taken an extremely conservative position on privacy--perhaps easier for a company with a declared lack of interest in maximizing revenue. But TJX and other corporate victims of security breaches have discovered that retaining consumers' transaction information can be both costly and risky. Companies that underestimate the importance of privacy to their customers or fail to protect it may eventually face harsh regulation, reputational damage, or both. The best thing they can do, says the author, is negotiate directly with those customers over where to draw the line.

  9. Privacy in the digital world: medical and health data outside of HIPAA protections.

    Science.gov (United States)

    Glenn, Tasha; Monteith, Scott

    2014-11-01

    Increasing quantities of medical and health data are being created outside of HIPAA protection, primarily by patients. Data sources are varied, including the use of credit cards for physician visit and medication co-pays, Internet searches, email content, social media, support groups, and mobile health apps. Most medical and health data not covered by HIPAA are controlled by third party data brokers and Internet companies. These companies combine this data with a wide range of personal information about consumer daily activities, transactions, movements, and demographics. The combined data are used for predictive profiling of individual health status, and often sold for advertising and other purposes. The rapid expansion of medical and health data outside of HIPAA protection is encroaching on privacy and the doctor-patient relationship, and is of particular concern for psychiatry. Detailed discussion of the appropriate handling of this medical and health data is needed by individuals with a wide variety of expertise.

  10. The ABC of ABC : An analysis of attribute-based credentials in the light of data protection, privacy and identity.

    NARCIS (Netherlands)

    Korenhof, P.E.I.; Koning, Merel; Alpár, Gergely; Hoepman, J.H.; Padullés, Joan Balcells; i Martínez, Agustí Cerrillo; Poch, Miquel Peguera; López, Ismael Peña; de Moner, María José Pifarré; Solana, Mònica Vilasau

    2014-01-01

    Our networked society increasingly needs secure identity systems. The Attribute-based credential (ABC) technology is designed to be privacy-friendlier than contemporary authentication methods, which often suffer from information leakage. So far, however, some of the wider implications of ABC have

  11. Privacy and security in the digital age: Contemporary ethical challenges and future directions

    DEFF Research Database (Denmark)

    Hiranandani, Vanmala Sunder

    2011-01-01

    Privacy is at the core of civil rights from which all other human rights and freedoms flow. Since the twentieth century, and particularly since 9/11, rapid deployment of information and surveillance technologies in the name of national security has grave implications for individual privacy and human rights. This article reviews major strands in contemporary privacy-security debate, while critiquing existing conceptualisations of privacy that are inadequate in the context of multifaceted and ubiquitous surveillance technologies post 9/11. Further, this paper contends most privacy...

  12. Discrimination and Privacy in the Information Society Data Mining and Profiling in Large Databases

    CERN Document Server

    Calders, Toon; Schermer, Bart; Zarsky, Tal

    2013-01-01

    Vast amounts of data are nowadays collected, stored and processed, in an effort to assist in  making a variety of administrative and governmental decisions. These innovative steps considerably improve the speed, effectiveness and quality of decisions. Analyses are increasingly performed by data mining and profiling technologies that statistically and automatically determine patterns and trends. However, when such practices lead to unwanted or unjustified selections, they may result in unacceptable forms of  discrimination. Processing vast amounts of data may lead to situations in which data controllers know many of the characteristics, behaviors and whereabouts of people. In some cases, analysts might know more about individuals than these individuals know about themselves. Judging people by their digital identities sheds a different light on our views of privacy and data protection. This book discusses discrimination and privacy issues related to data mining and profiling practices. It provides technologic...

  13. Privacy Concerns: The Effects of the Latest FERPA Changes

    Science.gov (United States)

    Cossler, Christine

    2010-01-01

    Privacy, something once taken for granted, has again become top-of-mind for public school districts thanks to technology's increasing reach, as well as new changes to privacy laws governing student information. Recently, educators have had to face important changes to the Family Educational Rights and Privacy Act (FERPA), originally signed into…

  14. Privacy Protection in Data Sharing : Towards Feedback Solutions

    NARCIS (Netherlands)

    R. Meijer; P. Conradie; R. Choenni; M.S. Bargh

    2014-01-01

    Sharing data is gaining importance in recent years due to proliferation of social media and a growing tendency of governments to gain citizens’ trust through being transparent. Data dissemination, however, increases chance of compromising privacy sensitive data, which undermines trust of data

  15. Comparative Approaches to Biobanks and Privacy.

    Science.gov (United States)

    Rothstein, Mark A; Knoppers, Bartha Maria; Harrell, Heather L

    2016-03-01

    Laws in the 20 jurisdictions studied for this project display many similar approaches to protecting privacy in biobank research. Although few have enacted biobank-specific legislation, many countries address biobanking within other laws. All provide for some oversight mechanisms for biobank research, even though the nature of that oversight varies between jurisdictions. Most have some sort of controlled access system in place for research with biobank specimens. While broad consent models facilitate biobanking, countries without national or federated biobanks have been slow to adopt broad consent. International guidelines have facilitated sharing and generally take a proportional risk approach, but many countries have provisions guiding international sharing and a few even limit international sharing. Although privacy laws may not prohibit international collaborations, the multi-prong approach to privacy unique to each jurisdiction can complicate international sharing. These symposium issues can serve as a resource for explaining the sometimes intricate privacy laws in each studied jurisdiction, outlining the key issues with regards to privacy and biobanking, and serving to describe a framework for the process of harmonization of privacy laws. © 2016 American Society of Law, Medicine & Ethics.

  16. 45 CFR 2508.3 - What is the Corporation's Privacy Act policy?

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 4 2010-10-01 2010-10-01 false What is the Corporation's Privacy Act policy? 2508... NATIONAL AND COMMUNITY SERVICE IMPLEMENTATION OF THE PRIVACY ACT OF 1974 § 2508.3 What is the Corporation's Privacy Act policy? It is the policy of the Corporation to protect, preserve, and defend the right of...

  17. Privacy-Preserving Restricted Boltzmann Machine

    Directory of Open Access Journals (Sweden)

    Yu Li

    2014-01-01

    Full Text Available With the arrival of the big data era, it is predicted that distributed data mining will lead to an information technology revolution. To motivate different institutes to collaborate with each other, the crucial issue is to eliminate their concerns regarding data privacy. In this paper, we propose a privacy-preserving method for training a Restricted Boltzmann Machine (RBM). With our privacy-preserving method, the RBM can be obtained without the parties revealing their private data to each other. We provide a correctness and efficiency analysis of our algorithms. The comparative experiment shows that the accuracy is very close to that of the original RBM model.

  18. δ-dependency for privacy-preserving XML data publishing.

    Science.gov (United States)

    Landberg, Anders H; Nguyen, Kinh; Pardede, Eric; Rahayu, J Wenny

    2014-08-01

    An ever increasing amount of medical data, such as electronic health records, is being collected, stored, shared and managed in large online health information systems and electronic medical record systems (EMR) (Williams et al., 2001; Virtanen, 2009; Huang and Liou, 2007) [1-3]. From such rich collections, data is often published in the form of census and statistical data sets for the purpose of knowledge sharing and enabling medical research. This brings with it an increasing need for protecting individual privacy, and it becomes an issue of great importance especially when information about patients is exposed to the public. While the concept of data privacy has been comprehensively studied for relational data, models and algorithms addressing the distinct differences and complex structure of XML data are yet to be explored. Currently, the common compromise method is to convert private XML data into relational data for publication. This ad hoc approach results in significant loss of useful semantic information previously carried in the private XML data. Health data often has very complex structure, which is best expressed in XML. In fact, XML is the standard format for exchanging (e.g. HL7 version 3(1)) and publishing health information. Lack of means to deal directly with data in XML format is inevitably a serious drawback. In this paper we propose a novel privacy protection model for XML, and an algorithm for implementing this model. We provide general rules, both for transforming a private XML schema into a published XML schema, and for mapping private XML data to the new privacy-protected published XML data. In addition, we propose a new privacy property, δ-dependency, which can be applied to both relational and XML data, and that takes into consideration the hierarchical nature of sensitive data (as opposed to "quasi-identifiers"). Lastly, we provide an implementation of our model, algorithm and privacy property, and perform an experimental analysis

  19. Adolescents and Social Media: Privacy, Brain Development, and the Law.

    Science.gov (United States)

    Costello, Caitlin R; McNiel, Dale E; Binder, Renée L

    2016-09-01

    Adolescents under the age of 18 are not recognized in the law as adults, nor do they have the fully developed capacity of adults. Yet teens regularly enter into contractual arrangements with operators of websites to send and post information about themselves. Their level of development limits their capacity to understand the implications of online communications, yet the risks are real to adolescents' privacy and reputations. This article explores an apparent contradiction in the law: that in areas other than online communications, U.S. legal systems seek to protect minors from the limitations of youth. The Children's Online Privacy Protection Act provides some protection to the privacy of young people, but applies only to children under age 13, leaving minors of ages 13 to 17 with little legal protection in their online activities. In this article, we discuss several strategies to mitigate the risks of adolescent online activity. © 2016 American Academy of Psychiatry and the Law.

  20. Privacy as Fundamental Right: The Case of Indian Aadhaar

    DEFF Research Database (Denmark)

    Khajuria, Samant; Skouby, Knud Erik; Sørensen, Lene Tolstrup

    In August 2017, a unanimous judgment by the Supreme Court of India (SCI) was a resounding victory for privacy. The ruling was the outcome of a petition challenging the constitutional validity of the Indian biometric identity scheme Aadhaar. The one-page order signed by all nine judges declares: “The right to privacy is protected as an intrinsic part of the right to life and personal liberty under Article 21 and as a part of the freedoms guaranteed by Part III of the Constitution.” Privacy is a key concern today in the emerging digital market of India. The vision of Digital India can only … -mining tool. Looking at the EU General Data Protection Regulation (GDPR), the main goal of the regulation is to build and/or increase the trust of EU citizens in using digital services. Similarly, clear and well-defined privacy regulations need to be in place in India, with heavy fines for failing to comply…

  1. Practical Secure Transaction for Privacy-Preserving Ride-Hailing Services

    Directory of Open Access Journals (Sweden)

    Chenglong Cao

    2018-01-01

    Full Text Available Ride-hailing services address the difficulty of hailing a taxi during rush hours. They are changing the way people travel and have developed rapidly in recent years. Since the service is offered over the Internet, there is a great deal of uncertainty about security and privacy. Focusing on this issue, we changed the payment pattern of existing systems and designed a privacy-protecting ride-hailing scheme. E-cash was generated by a new partially blind signature protocol that achieves e-cash unforgeability and passenger privacy. In particular, even when facing a service platform and a payment platform, a passenger remains anonymous. Additionally, a lightweight hash chain was constructed to keep e-cash divisible and reusable, which increases the practicability of transaction systems. The analysis shows that the scheme has small communication and computation costs, and it can be effectively applied in ride-hailing services with privacy protection.
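
    The divisible, reusable e-cash built from a lightweight hash chain can be illustrated with a classic PayWord-style chain: the passenger commits to the chain's end value (signed by the issuer in a real scheme), and paying k units means revealing the k-th preimage, which anyone can verify by hashing forward. The signing and blinding steps are omitted in this sketch, and all sizes are assumptions.

```python
# PayWord-style hash-chain sketch: the payer commits to the chain root (which
# an issuer would sign, possibly blindly, in the real scheme); spending k coins
# reveals the k-th preimage, verifiable by hashing forward k times.
import hashlib
import os

def h(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def make_chain(n: int):
    """Build w_0 .. w_n with w_i = H(w_{i-1}); w_n is the public root."""
    w0 = os.urandom(32)                   # secret seed held by the payer
    chain = [w0]
    for _ in range(n):
        chain.append(h(chain[-1]))
    return chain

def verify_payment(root: bytes, token: bytes, k: int) -> bool:
    """Check that hashing the revealed token k times reproduces the root."""
    x = token
    for _ in range(k):
        x = h(x)
    return x == root

chain = make_chain(n=100)                 # 100 one-unit coins
root = chain[-1]
payment = chain[100 - 3]                  # spend 3 coins by revealing w_97
print(verify_payment(root, payment, k=3)) # True
```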

  2. Evaluating Common Privacy Vulnerabilities in Internet Service Providers

    Science.gov (United States)

    Kotzanikolaou, Panayiotis; Maniatis, Sotirios; Nikolouzou, Eugenia; Stathopoulos, Vassilios

    Privacy in electronic communications receives increased attention in both research and industry forums, stemming from both the users' needs and from legal and regulatory requirements in national or international context. Privacy in internet-based communications heavily relies on the level of security of the Internet Service Providers (ISPs), as well as on the security awareness of the end users. This paper discusses the role of the ISP in the privacy of the communications. Based on real security audits performed in national-wide ISPs, we illustrate privacy-specific threats and vulnerabilities that many providers fail to address when implementing their security policies. We subsequently provide and discuss specific security measures that the ISPs can implement, in order to fine-tune their security policies in the context of privacy protection.

  3. Pythia: A Privacy-enhanced Personalized Contextual Suggestion System for Tourism

    NARCIS (Netherlands)

    Drosatos, G.; Efraimidis, P.S.; Arampatzis, A.; Stamatelatos, G.; Athanasiadis, I.N.

    2015-01-01

    We present Pythia, a privacy-enhanced non-invasive contextual suggestion system for tourists, with important architectural innovations. The system offers high quality personalized recommendations, non-invasive operation and protection of user privacy. A key feature of Pythia is the exploitation of

  4. Survey of main challenges (security and privacy in wireless body area networks for healthcare applications

    Directory of Open Access Journals (Sweden)

    Samaher Al-Janabi

    2017-07-01

    Full Text Available Wireless Body Area Network (WBAN) is a new trend in technology that provides a remote mechanism to monitor and collect patient's health record data using wearable sensors. It is widely recognized that a high level of system security and privacy plays a key role in protecting these data when they are used by healthcare professionals and during storage, to ensure that patient's records are kept safe from intruders. It is therefore of great interest to discuss security and privacy issues in WBANs. In this paper, we reviewed the WBAN communication architecture, security and privacy requirements, security threats, and the primary challenges to these systems, based on the latest standards and publications. This paper also covers the state-of-the-art security measures and research in WBAN. Finally, open areas for future research and enhancements are explored.

  5. The Privacy Calculus: Mobile Apps and User Perceptions of Privacy and Security

    Directory of Open Access Journals (Sweden)

    Elizabeth Fife

    2012-07-01

    Full Text Available A continuing stream of new mobile data services is being released that rely upon the collection of personal data to support a business model. New technologies, including facial recognition, sensors and Near Field Communications (NFC), will increasingly become a part of everyday services and applications that challenge traditional concepts of individual privacy. The average person as well as the “tech-savvy” mobile phone user may not yet be fully aware of the extent to which their privacy and security are being affected through their mobile activities and how comparable this situation is to personal computer usage. We investigate perceptions and usage of mobile data services that appear to have specific privacy and security sensitivities, specifically social networking, banking/payments and health-related activities. Our annual survey of smartphone users in the U.S. and Japan from 2011 is presented. This nationally representative survey data is used to show demographic and cultural differences and to substantiate our hypotheses about the links between use and privacy concerns.

  6. 76 FR 71417 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/Law Enforcement Agencies (LEA...

    Science.gov (United States)

    2011-11-17

    ...; Computer Matching Program (SSA/ Law Enforcement Agencies (LEA)) Match Number 5001 AGENCY: Social Security... protections for such persons. The Privacy Act, as amended, regulates the use of computer matching by Federal... accordance with the Privacy Act of 1974, as amended by the Computer Matching and Privacy Protection Act of...

  7. Acoustic assessment of speech privacy curtains in two nursing units

    Science.gov (United States)

    Pope, Diana S.; Miller-Klein, Erik T.

    2016-01-01

    Hospitals have complex soundscapes that create challenges to patient care. Extraneous noise and high reverberation rates impair speech intelligibility, which leads to raised voices. In an unintended spiral, the increasing noise may result in diminished speech privacy, as people speak loudly to be heard over the din. The products available to improve hospital soundscapes include construction materials that absorb sound (acoustic ceiling tiles, carpet, wall insulation) and reduce reverberation rates. Enhanced privacy curtains are now available and offer potential for a relatively simple way to improve speech privacy and speech intelligibility by absorbing sound at the hospital patient's bedside. Acoustic assessments were performed over 2 days on two nursing units with a similar design in the same hospital. One unit was built with the 1970s’ standard hospital construction and the other was newly refurbished (2013) with sound-absorbing features. In addition, we determined the effect of an enhanced privacy curtain versus standard privacy curtains using acoustic measures of speech privacy and speech intelligibility indexes. Privacy curtains provided auditory protection for the patients. In general, that protection was increased by the use of enhanced privacy curtains. On an average, the enhanced curtain improved sound absorption from 20% to 30%; however, there was considerable variability, depending on the configuration of the rooms tested. Enhanced privacy curtains provide measureable improvement to the acoustics of patient rooms but cannot overcome larger acoustic design issues. To shorten reverberation time, additional absorption, and compact and more fragmented nursing unit floor plate shapes should be considered. PMID:26780959

  8. Acoustic assessment of speech privacy curtains in two nursing units.

    Science.gov (United States)

    Pope, Diana S; Miller-Klein, Erik T

    2016-01-01

    Hospitals have complex soundscapes that create challenges to patient care. Extraneous noise and high reverberation rates impair speech intelligibility, which leads to raised voices. In an unintended spiral, the increasing noise may result in diminished speech privacy, as people speak loudly to be heard over the din. The products available to improve hospital soundscapes include construction materials that absorb sound (acoustic ceiling tiles, carpet, wall insulation) and reduce reverberation rates. Enhanced privacy curtains are now available and offer potential for a relatively simple way to improve speech privacy and speech intelligibility by absorbing sound at the hospital patient's bedside. Acoustic assessments were performed over 2 days on two nursing units with a similar design in the same hospital. One unit was built with the 1970s' standard hospital construction and the other was newly refurbished (2013) with sound-absorbing features. In addition, we determined the effect of an enhanced privacy curtain versus standard privacy curtains using acoustic measures of speech privacy and speech intelligibility indexes. Privacy curtains provided auditory protection for the patients. In general, that protection was increased by the use of enhanced privacy curtains. On an average, the enhanced curtain improved sound absorption from 20% to 30%; however, there was considerable variability, depending on the configuration of the rooms tested. Enhanced privacy curtains provide measureable improvement to the acoustics of patient rooms but cannot overcome larger acoustic design issues. To shorten reverberation time, additional absorption, and compact and more fragmented nursing unit floor plate shapes should be considered.

  9. Acoustic assessment of speech privacy curtains in two nursing units

    Directory of Open Access Journals (Sweden)

    Diana S Pope

    2016-01-01

    Full Text Available Hospitals have complex soundscapes that create challenges to patient care. Extraneous noise and high reverberation rates impair speech intelligibility, which leads to raised voices. In an unintended spiral, the increasing noise may result in diminished speech privacy, as people speak loudly to be heard over the din. The products available to improve hospital soundscapes include construction materials that absorb sound (acoustic ceiling tiles, carpet, wall insulation) and reduce reverberation rates. Enhanced privacy curtains are now available and offer potential for a relatively simple way to improve speech privacy and speech intelligibility by absorbing sound at the hospital patient's bedside. Acoustic assessments were performed over 2 days on two nursing units with a similar design in the same hospital. One unit was built with the 1970s' standard hospital construction and the other was newly refurbished (2013) with sound-absorbing features. In addition, we determined the effect of an enhanced privacy curtain versus standard privacy curtains using acoustic measures of speech privacy and speech intelligibility indexes. Privacy curtains provided auditory protection for the patients. In general, that protection was increased by the use of enhanced privacy curtains. On an average, the enhanced curtain improved sound absorption from 20% to 30%; however, there was considerable variability, depending on the configuration of the rooms tested. Enhanced privacy curtains provide measureable improvement to the acoustics of patient rooms but cannot overcome larger acoustic design issues. To shorten reverberation time, additional absorption, and compact and more fragmented nursing unit floor plate shapes should be considered.

  10. Privacy and Ethics in Undergraduate GIS Curricula

    Science.gov (United States)

    Scull, Peter; Burnett, Adam; Dolfi, Emmalee; Goldfarb, Ali; Baum, Peter

    2016-01-01

    The development of location-aware technologies, such as smartphones, raises serious questions regarding locational privacy and the ethical use of geographic data. The degree to which these concepts are taught in undergraduate geographic information science (GISci) courses is unknown. A survey of GISci educators shows that issues of privacy and…

  11. Privacy concerns in smart cities

    OpenAIRE

    van Zoonen, Liesbet

    2016-01-01

    In this paper a framework is constructed to hypothesize if and how smart city technologies and urban big data produce privacy concerns among the people in these cities (as inhabitants, workers, visitors, and otherwise). The framework is built on the basis of two recurring dimensions in research about people's concerns about privacy: one dimension represents that people perceive particular data as more personal and sensitive than others, the other dimension represents that people'...

  12. Understanding compliance to privacy guidelines using text-and video-based scenarios

    NARCIS (Netherlands)

    Al Mahmud, A.; Kaptein, M.C.; Moran, O.P.; Garde - Perik, van de E.M.; Markopoulos, P.; Baranauskas, C.; Palanque, P.

    2008-01-01

    Privacy is a major concern for the design and user acceptance of pervasive technology. Investigating privacy poses several methodological challenges. A popular approach involves surveying reactions of people to scenarios that highlight privacy issues. This paper examines the validity of this

  13. Mining Roles and Access Control for Relational Data under Privacy and Accuracy Constraints

    Science.gov (United States)

    Pervaiz, Zahid

    2013-01-01

    Access control mechanisms protect sensitive information from unauthorized users. However, when sensitive information is shared and a Privacy Protection Mechanism (PPM) is not in place, an authorized insider can still compromise the privacy of a person leading to identity disclosure. A PPM can use suppression and generalization to anonymize and…

  14. Location Privacy Techniques in Client-Server Architectures

    DEFF Research Database (Denmark)

    Jensen, Christian Søndergaard; Lu, Hua; Yiu, Man Lung

    2009-01-01

    A typical location-based service returns nearby points of interest in response to a user location. As such services are becoming increasingly available and popular, location privacy emerges as an important issue. In a system that does not offer location privacy, users must disclose their exact ... locations in order to receive the desired services. We view location privacy as an enabling technology that may lead to increased use of location-based services. In this chapter, we consider location privacy techniques that work in traditional client-server architectures without any trusted components other ... Third, their effectiveness is independent of the distribution of other users, unlike the k-anonymity approach. The chapter characterizes the privacy models assumed by existing techniques and categorizes these according to their approach. The techniques are then covered in turn according ...

  15. User Privacy and Empowerment: Trends, Challenges, and Opportunities

    DEFF Research Database (Denmark)

    Dhotre, Prashant Shantaram; Olesen, Henning; Khajuria, Samant

    2018-01-01

    to the service providers. Considering business models that are slanted towards service providers, privacy has become a crucial issue in today's fast growing digital world. Hence, this paper elaborates personal information flow between users, service providers, and data brokers. We also discussed the significant ... privacy issues like present business models, user awareness about privacy and user control over personal data. To address such issues, this paper also identified challenges that comprise unavailability of effective privacy awareness or protection tools and the effortless way to study and see the flow ... of personal information and its management. Thus, empowering users and enhancing awareness are essential to comprehending the value of secrecy. This paper also introduced latest advances in the domain of privacy issues like User Managed Access (UMA) can state suitable requirements for user empowerment ...

  16. DQC Comments on the Posted Recommendations Regarding Data Security and Privacy Protections

    Science.gov (United States)

    Data Quality Campaign, 2010

    2010-01-01

    The U.S. Department of Education is conducting several activities to address privacy and security issues related to education data. Earlier this year a contractor for the Department convened a group of privacy and security experts and produced a report with recommendations to the Department on ways they can address emerging challenges in…

  17. Protecting genomic sequence anonymity with generalization lattices.

    Science.gov (United States)

    Malin, B A

    2005-01-01

    Current genomic privacy technologies assume the identity of genomic sequence data is protected if personal information, such as demographics, is obscured, removed, or encrypted. While demographic features can directly compromise an individual's identity, recent research demonstrates such protections are insufficient because sequence data itself is susceptible to re-identification. To counteract this problem, we introduce an algorithm for anonymizing a collection of person-specific DNA sequences. The technique is termed DNA lattice anonymization (DNALA), and is based upon the formal privacy protection schema of k-anonymity. Under this model, it is impossible to observe or learn features that distinguish one genetic sequence from k-1 other entries in a collection. To maximize information retained in protected sequences, we incorporate a concept generalization lattice to learn the distance between two residues in a single nucleotide region. The lattice provides the most similar generalized concept for two residues (e.g. adenine and guanine are both purines). The method is tested and evaluated with several publicly available human population datasets ranging in size from 30 to 400 sequences. Our findings imply the anonymization schema is feasible for the protection of sequence privacy. The DNALA method is the first computational disclosure control technique for general DNA sequences. Given the computational nature of the method, guarantees of anonymity can be formally proven. There is room for improvement and validation, though this research provides the groundwork from which future researchers can construct genomics anonymization schemas tailored to specific data-sharing scenarios.
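
    The core generalization step described in this abstract, mapping two differing residues to a common ancestor in a concept lattice (e.g., adenine and guanine both generalize to the purine code R), can be sketched briefly. The snippet below is a simplified illustration in Python, not the published DNALA implementation; the small IUPAC-style lattice, the pairing of exactly two sequences, and the example inputs are assumptions made for demonstration. In the full method, sequences would first be paired by similarity so that each merge loses as little information as possible.

```python
# Simplified sketch of DNALA-style pairwise generalization (not the published code).
# Two aligned sequences are merged position by position: identical residues are kept,
# and differing residues are generalized to an IUPAC ambiguity code (a lattice ancestor).

# Minimal concept lattice over single nucleotides (assumed here for illustration).
LATTICE = {
    frozenset("AG"): "R",  # purines
    frozenset("CT"): "Y",  # pyrimidines
    frozenset("AC"): "M",
    frozenset("GT"): "K",
    frozenset("AT"): "W",
    frozenset("CG"): "S",
}

def generalize_pair(seq1: str, seq2: str) -> str:
    """Return a generalized sequence that both inputs map to, making the pair indistinguishable."""
    if len(seq1) != len(seq2):
        raise ValueError("sequences must be aligned to equal length")
    out = []
    for a, b in zip(seq1.upper(), seq2.upper()):
        if a == b:
            out.append(a)
        else:
            # Fall back to the fully generalized symbol N if the pair is not in the lattice.
            out.append(LATTICE.get(frozenset((a, b)), "N"))
    return "".join(out)

if __name__ == "__main__":
    # Hypothetical example: after generalization the two donors share one published sequence.
    print(generalize_pair("ACGTAC", "ACATAT"))  # -> ACRTAY
```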

  18. 77 FR 30433 - Privacy Act of 1974: Implementation of Exemptions; Automated Targeting System

    Science.gov (United States)

    2012-05-23

    ... Border Protection, Mint Annex, 799 Ninth Street NW., Washington, DC 20229. For privacy issues please... Secretary 6 CFR Part 5 [Docket No. DHS-2012-0020] Privacy Act of 1974: Implementation of Exemptions; Automated Targeting System AGENCY: Privacy Office, DHS. ACTION: Notice of proposed rulemaking. SUMMARY: The...

  19. Protecting the privacy of individual general practice patient electronic records for geospatial epidemiology research.

    Science.gov (United States)

    Mazumdar, Soumya; Konings, Paul; Hewett, Michael; Bagheri, Nasser; McRae, Ian; Del Fante, Peter

    2014-12-01

    General practitioner (GP) practices in Australia are increasingly storing patient information in electronic databases. These practice databases can be accessed by clinical audit software to generate reports that inform clinical or population health decision making and public health surveillance. Many audit software applications also have the capacity to generate de-identified patient unit record data. However, the de-identified nature of the extracted data means that these records often lack geographic information. Without spatial references, it is impossible to build maps reflecting the spatial distribution of patients with particular conditions and needs. Links to socioeconomic, demographic, environmental or other geographically based information are also not possible. In some cases, relatively coarse geographies such as postcode are available, but these are of limited use and researchers cannot undertake precision spatial analyses such as calculating travel times. We describe a method that allows researchers to implement meaningful mapping and spatial epidemiological analyses of practice level patient data while preserving privacy. This solution has been piloted in a diabetes risk research project in the patient population of a practice in Adelaide. The method offers researchers a powerful means of analysing geographic clinic data in a privacy-protected manner. © 2014 Public Health Association of Australia.

  20. National Privacy Research Strategy

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — On July 1, NITRD released the National Privacy Research Strategy. Research agencies across government participated in the development of the strategy, reviewing...

  1. A Privacy Preservation Model for Health-Related Social Networking Sites

    Science.gov (United States)

    2015-01-01

    The increasing use of social networking sites (SNS) in health care has resulted in a growing number of individuals posting personal health information online. These sites may disclose users' health information to many different individuals and organizations and mine it for a variety of commercial and research purposes, yet the revelation of personal health information to unauthorized individuals or entities brings a concomitant concern of greater risk for loss of privacy among users. Many users join multiple social networks for different purposes and enter personal and other specific information covering social, professional, and health domains into other websites. Integration of multiple online and real social networks makes the users vulnerable to unintentional and intentional security threats and misuse. This paper analyzes the privacy and security characteristics of leading health-related SNS. It presents a threat model and identifies the most important threats to users and SNS providers. Building on threat analysis and modeling, this paper presents a privacy preservation model that incorporates individual self-protection and privacy-by-design approaches and uses the model to develop principles and countermeasures to protect user privacy. This study paves the way for analysis and design of privacy-preserving mechanisms on health-related SNS. PMID:26155953

  2. A Privacy Preservation Model for Health-Related Social Networking Sites.

    Science.gov (United States)

    Li, Jingquan

    2015-07-08

    The increasing use of social networking sites (SNS) in health care has resulted in a growing number of individuals posting personal health information online. These sites may disclose users' health information to many different individuals and organizations and mine it for a variety of commercial and research purposes, yet the revelation of personal health information to unauthorized individuals or entities brings a concomitant concern of greater risk for loss of privacy among users. Many users join multiple social networks for different purposes and enter personal and other specific information covering social, professional, and health domains into other websites. Integration of multiple online and real social networks makes the users vulnerable to unintentional and intentional security threats and misuse. This paper analyzes the privacy and security characteristics of leading health-related SNS. It presents a threat model and identifies the most important threats to users and SNS providers. Building on threat analysis and modeling, this paper presents a privacy preservation model that incorporates individual self-protection and privacy-by-design approaches and uses the model to develop principles and countermeasures to protect user privacy. This study paves the way for analysis and design of privacy-preserving mechanisms on health-related SNS.

  3. Enforcement of Privacy Policies over Multiple Online Social Networks for Collaborative Activities

    Science.gov (United States)

    Wu, Zhengping; Wang, Lifeng

    Our goal is to develop an enforcement architecture for privacy policies across multiple online social networks, used to solve the problem of privacy protection when several social networks form permanent or temporary collaborations. Theoretically, this idea is practical, especially because more and more social networks tend to support the open-source framework “OpenSocial”. But, as we know, different social network websites may implement the same privacy policy settings on top of different enforcement mechanisms, which causes problems. In that case, we would have to manually write code for both sides to make the privacy policy settings enforceable, a huge workload given the sheer number of current social networks. So we focus on proposing a middleware that automatically generates privacy protection components for permanent integration or temporary interaction of social networks. This middleware provides functions such as collecting the privacy policy of each participant in the new collaboration, generating a standard policy model for each participant, and mapping all those standard policies to the different enforcement mechanisms of those participants.

  4. Social Media Users’ Legal Consciousness About Privacy

    Directory of Open Access Journals (Sweden)

    Katharine Sarikakis

    2017-02-01

    Full Text Available This article explores the ways in which the concept of privacy is understood in the context of social media and with regard to users’ awareness of privacy policies and laws in the ‘Post-Snowden’ era. In the light of presumably increased public exposure to privacy debates, generated partly due to the European “Right to be Forgotten” ruling and the Snowden revelations on mass surveillance, this article explores users’ meaning-making of privacy as a matter of legal dimension in terms of its violations and threats online and users’ ways of negotiating their Internet use, in particular social networking sites. Drawing on the concept of legal consciousness, this article explores through focus group interviews the ways in which social media users negotiate privacy violations and what role their understanding of privacy laws (or lack thereof might play in their strategies of negotiation. The findings are threefold: first, privacy is understood almost universally as a matter of controlling one’s own data, including information disclosure even to friends, and is strongly connected to issues about personal autonomy; second, a form of resignation with respect to control over personal data appears to coexist with a recognized need to protect one’s private data, while respondents describe conscious attempts to circumvent systems of monitoring or violation of privacy, and third, despite widespread coverage of privacy legal issues in the press, respondents’ concerns about and engagement in “self-protecting” tactics derive largely from being personally affected by violations of law and privacy.

  5. Privacy and data security in E-health: requirements from the user's perspective.

    Science.gov (United States)

    Wilkowska, Wiktoria; Ziefle, Martina

    2012-09-01

    In this study two currently relevant aspects of using medical assistive technologies were addressed: security and privacy. In a two-step empirical approach that used focus groups (n = 19) and a survey (n = 104), users' requirements for the use of medical technologies were collected and evaluated. Specifically, we focused on the perceived importance of data security and privacy issues. Outcomes showed that both security and privacy aspects play an important role in the successful adoption of medical assistive technologies in the home environment. In particular, analysis of data with respect to gender, health status and age (young, middle-aged and old users) revealed that females and healthy adults require, and insist on, the highest security and privacy standards compared with males and the ailing elderly.

  6. Privacy and security in teleradiology

    International Nuclear Information System (INIS)

    Ruotsalainen, Pekka

    2010-01-01

    Teleradiology is probably the most successful eHealth service available today. Its business model is based on the remote transmission of radiological images (e.g. X-ray and CT-images) over electronic networks, and on the interpretation of the transmitted images for diagnostic purposes. Two basic service models are commonly used in teleradiology today. The most common approach is based on the message paradigm (off-line model), but more developed teleradiology systems are based on the interactive use of PACS/RIS systems. Modern teleradiology is also more and more a cross-organisational or even cross-border service between service providers with different jurisdictions and security policies. This paper defines the requirements needed to make different teleradiology models trusted. Those requirements include a common security policy that covers all partners and entities, common security and privacy protection principles and requirements, controlled contracts between partners, and the use of security controls and tools that support the common security policy. The security and privacy protection of any teleradiology system must be planned in advance, and the necessary security and privacy enhancing tools should be selected (e.g. strong authentication, data encryption, non-repudiation services and audit-logs) based on the risk analysis and requirements set by the legislation. In any case the teleradiology system should fulfil ethical and regulatory requirements. Certification of the whole teleradiology service system including security and privacy is also proposed. In the future, teleradiology services will be an integrated part of pervasive eHealth. Security requirements for this environment including dynamic and context aware security services are also discussed in this paper.

  7. Privacy and security in teleradiology

    Energy Technology Data Exchange (ETDEWEB)

    Ruotsalainen, Pekka [National Institute for Health and Welfare, Helsinki (Finland)], E-mail: pekka.ruotsalainen@THL.fi

    2010-01-15

    Teleradiology is probably the most successful eHealth service available today. Its business model is based on the remote transmission of radiological images (e.g. X-ray and CT-images) over electronic networks, and on the interpretation of the transmitted images for diagnostic purposes. Two basic service models are commonly used in teleradiology today. The most common approach is based on the message paradigm (off-line model), but more developed teleradiology systems are based on the interactive use of PACS/RIS systems. Modern teleradiology is also more and more a cross-organisational or even cross-border service between service providers with different jurisdictions and security policies. This paper defines the requirements needed to make different teleradiology models trusted. Those requirements include a common security policy that covers all partners and entities, common security and privacy protection principles and requirements, controlled contracts between partners, and the use of security controls and tools that support the common security policy. The security and privacy protection of any teleradiology system must be planned in advance, and the necessary security and privacy enhancing tools should be selected (e.g. strong authentication, data encryption, non-repudiation services and audit-logs) based on the risk analysis and requirements set by the legislation. In any case the teleradiology system should fulfil ethical and regulatory requirements. Certification of the whole teleradiology service system including security and privacy is also proposed. In the future, teleradiology services will be an integrated part of pervasive eHealth. Security requirements for this environment including dynamic and context aware security services are also discussed in this paper.

  8. 78 FR 15731 - Privacy Act of 1974; Computer Matching Program

    Science.gov (United States)

    2013-03-12

    ... DEPARTMENT OF HOMELAND SECURITY Office of the Secretary [Docket No. DHS-2013-0011] Privacy Act of 1974; Computer Matching Program AGENCY: Department of Homeland Security/U.S. Citizenship and... amended by the Computer Matching and Privacy Protection Act of 1988 (Pub. L. 100-503) and the Computer...

  9. Open source tools for standardized privacy protection of medical images

    Science.gov (United States)

    Lien, Chung-Yueh; Onken, Michael; Eichelberg, Marco; Kao, Tsair; Hein, Andreas

    2011-03-01

    In addition to the primary care context, medical images are often useful for research projects and community healthcare networks, so-called "secondary use". Patient privacy becomes an issue in such scenarios since the disclosure of personal health information (PHI) has to be prevented in a sharing environment. In general, most PHI should be completely removed from the images according to the respective privacy regulations, but some basic and alleviated data is usually required for accurate image interpretation. Our objective is to utilize and enhance these specifications in order to provide reliable software implementations for de- and re-identification of medical images suitable for online and offline delivery. DICOM (Digital Imaging and Communications in Medicine) images are de-identified by replacing PHI-specific information with values still being reasonable for imaging diagnosis and patient indexing. In this paper, this approach is evaluated based on a prototype implementation built on top of the open source framework DCMTK (DICOM Toolkit) utilizing standardized de- and re-identification mechanisms. A set of tools has been developed for DICOM de-identification that meets privacy requirements of an offline and online sharing environment and fully relies on standard-based methods.
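
    As a rough illustration of the de-identification step described above, the sketch below uses the Python pydicom library rather than the DCMTK-based tools presented in the paper; the replacement values and the short list of attributes touched are assumptions for demonstration, not the standardized de-identification profile the authors implement.

```python
# Minimal DICOM de-identification sketch using pydicom (illustrative only; the paper's
# tools are built on DCMTK and follow standardized de- and re-identification mechanisms).
import pydicom

def deidentify(in_path: str, out_path: str, pseudo_id: str = "PSEUDO-0001") -> None:
    ds = pydicom.dcmread(in_path)

    # Replace direct identifiers with values that still allow indexing and interpretation.
    ds.PatientName = "ANONYMIZED"
    ds.PatientID = pseudo_id

    # Blank a few commonly protected attributes if present (non-exhaustive, assumed list).
    for keyword in ("PatientBirthDate", "PatientAddress", "ReferringPhysicianName"):
        if keyword in ds:
            setattr(ds, keyword, "")

    # Private tags frequently carry PHI and are removed wholesale in this sketch.
    ds.remove_private_tags()

    ds.save_as(out_path)

if __name__ == "__main__":
    deidentify("original.dcm", "deidentified.dcm")
```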

  10. By Policy or Design? Privacy in the US in a Post-Snowden World

    OpenAIRE

    Halbert, Debora; Larsson, Stefan

    2015-01-01

    By drawing from a number of studies in the field as well as the Snowden revelations and the case of MegaUpload/MEGA, the article makes an analysis of relevant legislation on privacy in the digital context. The purpose of the analysis is to understand to what extent and how the current paradigm of privacy protection is, or is not, sufficient for contemporary needs. In particular, we ask how privacy is protected by policy in an American context and to what extent this is or is not insufficient ...

  11. A Content Analysis of Library Vendor Privacy Policies: Do They Meet Our Standards?

    Science.gov (United States)

    Magi, Trina J.

    2010-01-01

    Librarians have a long history of protecting user privacy, but they have done seemingly little to understand or influence the privacy policies of library resource vendors that increasingly collect user information through Web 2.0-style personalization features. After citing evidence that college students value privacy, this study used content…

  12. Lightweight Privacy-Preserving Authentication Protocols Secure against Active Attack in an Asymmetric Way

    Science.gov (United States)

    Cui, Yank; Kobara, Kazukuni; Matsuura, Kanta; Imai, Hideki

    As pervasive computing technologies develop fast, privacy protection becomes a crucial issue that needs to be handled very carefully. Typically, it is difficult to efficiently identify and manage the plethora of low-cost pervasive devices, such as Radio Frequency Identification Devices (RFID), without leaking any private information. In particular, an attacker may not only eavesdrop on the communication in a passive way, but also mount an active attack that asks queries adaptively, which is obviously more dangerous. Towards settling this problem, in this paper we propose two lightweight authentication protocols that are privacy-preserving against active attack, in an asymmetric way. That asymmetric style, with privacy-oriented simplification, reduces the load on low-cost devices and drastically decreases the computation cost of management at the server. This is because, unlike the usual management of identities, our approach requires neither synchronization nor exhaustive search in the database, which is a great convenience in a large-scale system. The protocols are based on a fast asymmetric encryption with specialized simplification and only one cryptographic hash function, which consequently assigns only light work to pervasive devices. Besides, our results do not require the strong assumption of the random oracle model.

  13. 77 FR 37061 - DHS Data Privacy and Integrity Advisory Committee

    Science.gov (United States)

    2012-06-20

    .... Please note that the meeting may end early if the Committee has completed its business. ADDRESSES: The... draft report to the Department providing guidance on privacy protections for cybersecurity pilot... . Please note that the meeting may end early if all business is completed. Privacy Act Statement: DHS's Use...

  14. 78 FR 1275 - Privacy Act of 1974; Computer Matching Program

    Science.gov (United States)

    2013-01-08

    ... Social Security Administration (Computer Matching Agreement 1071). SUMMARY: In accordance with the Privacy Act of 1974 (5 U.S.C. 552a), as amended by the Computer Matching and Privacy Protection Act of... of its new computer matching program with the Social Security Administration (SSA). DATES: OPM will...

  15. PrivateRide: A Privacy-Enhanced Ride-Hailing Service

    Directory of Open Access Journals (Sweden)

    Pham Anh

    2017-04-01

    Full Text Available In the past few years, we have witnessed a rise in the popularity of ride-hailing services (RHSs, an online marketplace that enables accredited drivers to use their own cars to drive ride-hailing users. Unlike other transportation services, RHSs raise significant privacy concerns, as providers are able to track the precise mobility patterns of millions of riders worldwide. We present the first survey and analysis of the privacy threats in RHSs. Our analysis exposes high-risk privacy threats that do not occur in conventional taxi services. Therefore, we propose PrivateRide, a privacy-enhancing and practical solution that offers anonymity and location privacy for riders, and protects drivers’ information from harvesting attacks. PrivateRide lowers the high-risk privacy threats in RHSs to a level that is at least as low as that of many taxi services. Using real data-sets from Uber and taxi rides, we show that PrivateRide significantly enhances riders’ privacy, while preserving tangible accuracy in ride matching and fare calculation, with only negligible effects on convenience. Moreover, by using our Android implementation for experimental evaluations, we show that PrivateRide’s overhead during ride setup is negligible. In short, we enable privacy-conscious riders to achieve levels of privacy that are not possible in current RHSs and even in some conventional taxi services, thereby offering a potential business differentiator.

  16. Aligning the Effective Use of Student Data with Student Privacy and Security Laws

    Science.gov (United States)

    Winnick, Steve; Coleman, Art; Palmer, Scott; Lipper, Kate; Neiditz, Jon

    2011-01-01

    This legal and policy guidance provides a summary framework for state policymakers as they work to use longitudinal data to improve student achievement while also protecting the privacy and security of individual student records. Summarizing relevant federal privacy and security laws, with a focus on the Family Educational Rights and Privacy Act…

  17. Privacy and legal issues in cloud computing

    CERN Document Server

    Weber, Rolf H

    2015-01-01

    Adopting a multi-disciplinary and comparative approach, this book focuses on emerging and innovative attempts to tackle privacy and legal issues in cloud computing, such as personal data privacy, security and intellectual property protection. Leading international academics and practitioners in the fields of law and computer science examine the specific legal implications of cloud computing pertaining to jurisdiction, biomedical practice and information ownership. This collection offers original and critical responses to the rising challenges posed by cloud computing.

  18. 77 FR 60131 - DHS Data Privacy and Integrity Advisory Committee

    Science.gov (United States)

    2012-10-02

    .... to 5 p.m. Please note that the meeting may end early if the Committee has completed its business... privacy protections for the collection and use of biometrics and for cybersecurity pilot programs. These... meeting may end early if all business is completed. Privacy Act Statement: DHS's Use of Your Information...

  19. 77 FR 70795 - Privacy Act of 1974; Retirement of Department of Homeland Security Transportation Security...

    Science.gov (United States)

    2012-11-27

    ... 20598-6036; email: [email protected] . For privacy issues please contact: Jonathan Cantor, (202-343... DEPARTMENT OF HOMELAND SECURITY Office of the Secretary Privacy Act of 1974; Retirement of Department of Homeland Security Transportation Security Administration System of Records AGENCY: Privacy...

  20. 77 FR 70792 - Privacy Act of 1974; Retirement of Department of Homeland Security Transportation Security...

    Science.gov (United States)

    2012-11-27

    ..., VA 20598-6036; email: [email protected] . For privacy issues please contact: Jonathan R. Cantor... DEPARTMENT OF HOMELAND SECURITY Office of the Secretary Privacy Act of 1974; Retirement of Department of Homeland Security Transportation Security Administration System of Records AGENCY: Privacy...

  1. Authentication Method for Privacy Protection in Smart Grid Environment

    Directory of Open Access Journals (Sweden)

    Do-Eun Cho

    2014-01-01

    Full Text Available Recently, interest in green energy has been increasing as a means to resolve problems such as the exhaustion of energy sources and to manage energy effectively through the convergence of various fields. Smart grid projects, which build an intelligent electrical grid to achieve low-carbon green growth, are therefore being pushed forward in a rush. However, as IT is layered onto the electrical grid, the weaknesses of IT also appear in the smart grid, and the complexity of convergence aggravates the problem. In addition, the large volumes of personal and payment information within the smart grid are gradually becoming big data and a target for external intrusion and attack, so concerns about this matter are increasing. The purpose of this study is to analyze the security vulnerabilities and security requirements within the smart grid and the authentication and access control methods for privacy protection within the home network. Therefore, we propose a secure access authentication and remote control method for a user's home devices within the home network environment, and we present their security analysis. The proposed access authentication method blocks unauthorized external access and enables secure remote access to the home network and its devices with a secure message authentication protocol.

  2. Scalable privacy-preserving data sharing methodology for genome-wide association studies.

    Science.gov (United States)

    Yu, Fei; Fienberg, Stephen E; Slavković, Aleksandra B; Uhler, Caroline

    2014-08-01

    The protection of privacy of individual-level information in genome-wide association study (GWAS) databases has been a major concern of researchers following the publication of "an attack" on GWAS data by Homer et al. (2008). Traditional statistical methods for confidentiality and privacy protection of statistical databases do not scale well to deal with GWAS data, especially in terms of guarantees regarding protection from linkage to external information. The more recent concept of differential privacy, introduced by the cryptographic community, is an approach that provides a rigorous definition of privacy with meaningful privacy guarantees in the presence of arbitrary external information, although the guarantees may come at a serious price in terms of data utility. Building on such notions, Uhler et al. (2013) proposed new methods to release aggregate GWAS data without compromising an individual's privacy. We extend the methods developed in Uhler et al. (2013) for releasing differentially-private χ²-statistics by allowing for arbitrary number of cases and controls, and for releasing differentially-private allelic test statistics. We also provide a new interpretation by assuming the controls' data are known, which is a realistic assumption because some GWAS use publicly available data as controls. We assess the performance of the proposed methods through a risk-utility analysis on a real data set consisting of DNA samples collected by the Wellcome Trust Case Control Consortium and compare the methods with the differentially-private release mechanism proposed by Johnson and Shmatikov (2013). Copyright © 2014 Elsevier Inc. All rights reserved.
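
    To make the release pattern concrete, the sketch below perturbs a χ²-statistic with Laplace noise calibrated to a sensitivity bound and a privacy budget ε, which is the generic shape of such mechanisms. It is a simplified illustration only: the contingency table, the sensitivity value, and ε are placeholders, not the bounds derived by Uhler et al. (2013) or in this extension.

```python
# Generic Laplace-mechanism sketch for releasing a chi-squared statistic under
# differential privacy (illustrative; the cited papers derive the actual sensitivity bounds).
import numpy as np
from scipy.stats import chi2_contingency

def dp_chi2(table: np.ndarray, sensitivity: float, epsilon: float,
            rng: np.random.Generator) -> float:
    """Return a noisy chi-squared statistic; `sensitivity` must upper-bound how much
    the statistic can change when a single individual's record is modified."""
    chi2, _, _, _ = chi2_contingency(table)
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    # Truncate at zero, since a chi-squared statistic cannot be negative.
    return max(chi2 + noise, 0.0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical 2x3 genotype count table (cases vs. controls, three genotypes).
    table = np.array([[120, 240, 140], [100, 260, 160]])
    # Placeholder sensitivity and privacy budget, for demonstration only.
    print(dp_chi2(table, sensitivity=4.0, epsilon=1.0, rng=rng))
```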

  3. Workshop--E-leaks: the privacy of health information in the age of electronic information.

    Science.gov (United States)

    Vonn, Michael; Lang, Renée; Perras, Maude

    2011-10-01

    This workshop examined some of the new challenges to health-related privacy emerging as a result of the proliferation of electronic communications and data storage, including through social media, electronic health records and ready access to personal information on the internet. The right to privacy is a human right. As such, protecting privacy and enforcing the duty of confidentiality regarding health information are fundamental to treating people with autonomy, dignity and respect. For people living with HIV, unauthorized disclosure of their status can lead to discrimination and breaches of other human rights. While this is not new, in this information age a new breed of privacy violation is emerging and our legal protections are not necessarily keeping pace.

  4. Reward-based spatial crowdsourcing with differential privacy preservation

    Science.gov (United States)

    Xiong, Ping; Zhang, Lefeng; Zhu, Tianqing

    2017-11-01

    In recent years, the popularity of mobile devices has transformed spatial crowdsourcing (SC) into a novel mode for performing complicated projects. Workers can perform tasks at specified locations in return for rewards offered by employers. Existing methods ensure the efficiency of their systems by submitting the workers' exact locations to a centralised server for task assignment, which can lead to privacy violations. Thus, implementing crowdsourcing applications while preserving the privacy of workers' locations is a key issue that needs to be tackled. We propose a reward-based SC method that achieves acceptable utility as measured by task assignment success rates, while efficiently preserving privacy. A differential privacy model ensures a rigorous privacy guarantee, and Laplace noise is introduced to protect workers' exact locations. We then present a reward allocation mechanism that adjusts each piece of the reward for a task using the distribution of the workers' locations. Through experimental results, we demonstrate that this optimised-reward method is efficient for SC applications.
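
    A minimal sketch of the location-obfuscation idea is given below: independent Laplace noise is added to each coordinate of a worker's reported position before it reaches the task-assignment server. This is a deliberately simplified stand-in rather than the calibrated mechanism and reward-adjustment scheme proposed in the paper; the planar coordinate frame and the ε value are assumptions.

```python
# Simplified location-obfuscation sketch: perturb a worker's planar coordinates with
# Laplace noise before submission (not the paper's calibrated mechanism).
import numpy as np

def perturb_location(x_m: float, y_m: float, epsilon: float,
                     rng: np.random.Generator) -> tuple[float, float]:
    """Add Laplace noise of scale 1/epsilon (in metres) to each coordinate independently."""
    scale = 1.0 / epsilon
    return (x_m + rng.laplace(0.0, scale), y_m + rng.laplace(0.0, scale))

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    true_location = (1250.0, 430.0)      # hypothetical worker location in metres
    reported = perturb_location(*true_location, epsilon=0.01, rng=rng)
    print(reported)                      # what the task-assignment server would see
```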

  5. A Case Study on Differential Privacy

    OpenAIRE

    Asseffa, Samrawit; Seleshi, Bihil

    2017-01-01

    Throughout the ages, human beings prefer to keep most things secret and brand this overall state with the title of privacy. Like most significant terms, privacy tends to create controversy regarding the extent of its flexible boundaries, since various technological advancements are slowly leaching away the power people have over their own information. Even as cell phone brands release new upgrades, the ways in which information is communicated has drastically increased, in turn facilitating t...

  6. Health Records and the Cloud Computing Paradigm from a Privacy Perspective

    Directory of Open Access Journals (Sweden)

    Christian Stingl

    2011-01-01

    Full Text Available With the advent of cloud computing, the realization of highly available electronic health records providing location-independent access seems very promising. However, cloud computing raises major security issues that need to be addressed, particularly within the health care domain, and the protection of the privacy of individuals often seems to be left on the sidelines. For instance, common protection against malicious insiders, i.e., non-disclosure agreements, is purely organizational. Clearly, such measures cannot prevent misuse but can at least discourage it. In this paper, we present an approach to storing highly sensitive health data in the cloud in which the protection of patients' privacy is based exclusively on technical measures, so that users and providers of health records do not need to trust the cloud provider with privacy-related issues. Our technical measures comprise anonymous communication and authentication, anonymous yet authorized transactions and pseudonymization of databases.

  7. Kids Sell: Celebrity Kids’ Right to Privacy

    Directory of Open Access Journals (Sweden)

    Seong Choul Hong

    2016-04-01

    Full Text Available The lives of celebrities are often spotlighted in the media because of their newsworthiness; however, many celebrities argue that their right to privacy is often infringed upon. Concerns about celebrity privacy are not limited to the celebrities themselves and often extend to their children. As a result of their popularity, public interest has pushed paparazzi and journalists to pursue trivial and private details about the lives of both celebrities and their children. This paper investigates conflicting areas where the right to privacy and the right to know collide when dealing with the children of celebrities. In general, the courts have been unsympathetic to celebrity privacy claims, noting their newsworthiness and self-promoted character. Unless the press violates news-gathering ethics or torts, the courts will often rule in favor of the media. However, the story becomes quite different when related to an infringement on the privacy of celebrities' children. This paper argues that all children have a right to protect their privacy regardless of their parents' social status. Children of celebrities should not be exempt from principles of privacy just because their parents are celebrities. Furthermore, they should not be exposed by the media without the voluntary consent of their legal guardians. That is, the right of the media to publish and the newsworthiness of children of celebrities must be restrictedly acknowledged.

  8. Privacy and security in teleradiology.

    Science.gov (United States)

    Ruotsalainen, Pekka

    2010-01-01

    Teleradiology is probably the most successful eHealth service available today. Its business model is based on the remote transmission of radiological images (e.g. X-ray and CT-images) over electronic networks, and on the interpretation of the transmitted images for diagnostic purposes. Two basic service models are commonly used in teleradiology today. The most common approach is based on the message paradigm (off-line model), but more developed teleradiology systems are based on the interactive use of PACS/RIS systems. Modern teleradiology is also more and more a cross-organisational or even cross-border service between service providers with different jurisdictions and security policies. This paper defines the requirements needed to make different teleradiology models trusted. Those requirements include a common security policy that covers all partners and entities, common security and privacy protection principles and requirements, controlled contracts between partners, and the use of security controls and tools that support the common security policy. The security and privacy protection of any teleradiology system must be planned in advance, and the necessary security and privacy enhancing tools should be selected (e.g. strong authentication, data encryption, non-repudiation services and audit-logs) based on the risk analysis and requirements set by the legislation. In any case the teleradiology system should fulfil ethical and regulatory requirements. Certification of the whole teleradiology service system including security and privacy is also proposed. In the future, teleradiology services will be an integrated part of pervasive eHealth. Security requirements for this environment including dynamic and context aware security services are also discussed in this paper. Copyright (c) 2009 Elsevier Ireland Ltd. All rights reserved.

  9. Service Outsourcing Character Oriented Privacy Conflict Detection Method in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Changbo Ke

    2014-01-01

    Full Text Available Cloud computing provides services to users as a software paradigm. However, it is difficult to ensure the security of private information because of cloud computing's openness, virtualization, and service-outsourcing features. Therefore, how to protect users' private information has become a research focus. In this paper, firstly, we model the service privacy policy and the user privacy preference with description logic. Secondly, we use the Pellet reasoner to verify consistency and satisfiability, so as to detect privacy conflicts between services and users. Thirdly, we present an algorithm for detecting privacy conflicts in the process of cloud service composition and prove the correctness and feasibility of this method by case study and experimental analysis. Our method can reduce the risk of users' sensitive private information being illegally used and propagated by outsourcing services. At the same time, the method avoids exceptions caused by privacy conflicts during service composition and improves the trustworthiness of cloud service providers.

  10. Balancing Cyberspace Promise, Privacy, and Protection: Tracking the Debate.

    Science.gov (United States)

    Metivier-Carreiro, Karen A.; LaFollette, Marcel C.

    1997-01-01

    Examines aspects of cyberspace policy: Internet content and expectations; privacy: medical information and data collected by the government; and the regulation of offensive material: the Communications Decency Act, Internet filters, and the American Library Association's proactive great Web sites for children. Suggests that even "child…

  11. Preserving Employee Privacy in Wellness.

    Science.gov (United States)

    Terry, Paul E

    2017-07-01

    The proposed "Preserving Employee Wellness Programs Act" states that the collection of information about the manifested disease or disorder of a family member shall not be considered an unlawful acquisition of genetic information. The bill recognizes employee privacy protections that are already in place and includes specific language relating to nondiscrimination based on illness. Why did legislation expressly intending to "preserve wellness programs" generate such antipathy about wellness among journalists? This article argues that those who are committed to preserving employee wellness must be equally committed to preserving employee privacy. Related to this, we should better parse between discussions and rules about commonplace health screenings versus much less common genetic testing.

  12. Privacy enhancing techniques - the key to secure communication and management of clinical and genomic data.

    Science.gov (United States)

    De Moor, G J E; Claerhout, B; De Meyer, F

    2003-01-01

    To introduce some of the privacy protection problems related to genomics-based medicine and to highlight the relevance of Trusted Third Parties (TTPs) and of Privacy Enhancing Techniques (PETs) in the restricted context of clinical research and statistics. Practical approaches based on two different pseudonymisation models, both for batch and interactive data collection and exchange, are described and analysed. The growing need to manage both clinical and genetic data raises important legal and ethical challenges. Protecting human rights in the realm of privacy, while optimising research potential and other statistical activities, is a challenge that can easily be overcome with the assistance of a trust service provider offering advanced privacy enabling/enhancing solutions. As such, the use of pseudonymisation and other innovative Privacy Enhancing Techniques can unlock valuable data sources.
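
    As a rough sketch of one common batch pseudonymisation approach in this spirit, the snippet below derives pseudonyms by keyed hashing of the patient identifier, so that only the trusted third party holding the key can reproduce or re-link the mapping. The key handling and identifier format are assumptions for illustration; the two pseudonymisation models analysed in the paper are not reproduced here.

```python
# Sketch of keyed batch pseudonymisation: the TTP holding `secret_key` can reproduce the
# mapping, while recipients of the pseudonymised data cannot reverse it.
import hmac
import hashlib

def pseudonymize(patient_id: str, secret_key: bytes) -> str:
    """Derive a stable pseudonym from a patient identifier using HMAC-SHA256."""
    return hmac.new(secret_key, patient_id.encode("utf-8"), hashlib.sha256).hexdigest()

if __name__ == "__main__":
    key = b"example-key-held-only-by-the-TTP"   # placeholder; use a managed secret in practice
    records = [{"patient_id": "BE-1234567", "dx": "E11.9"},
               {"patient_id": "BE-7654321", "dx": "I10"}]
    released = [{"pseudonym": pseudonymize(r["patient_id"], key), "dx": r["dx"]}
                for r in records]
    print(released)
```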

  13. 78 FR 59082 - Privacy Act of 1974; Department of Transportation, Federal Motor Carrier Safety Administration...

    Science.gov (United States)

    2013-09-25

    ..., [email protected] . For privacy issues please contact: Claire W. Barrett, Departmental Chief... DEPARTMENT OF TRANSPORTATION Office of the Secretary [Docket No. FMCSA-2013-0306] Privacy Act of... Administration (FMCSA), DOT. ACTION: Notice to amend a system of records. SUMMARY: In accordance with the Privacy...

  14. Insights to develop privacy policy for organization in Indonesia

    Science.gov (United States)

    Rosmaini, E.; Kusumasari, T. F.; Lubis, M.; Lubis, A. R.

    2018-03-01

    Nowadays, the increased use of shared applications over networks not only dictates enhanced security but also emphasizes the need to balance privacy protection with ease of use. Meanwhile, the accessibility and availability demanded of organizational services make privacy obligations a more complex process to handle and control. Nonetheless, the underlying principles for privacy policy exist in current Indonesian laws, even though they are spread across various regulations. Religious, constitutional, statutory, regulatory, and customary and cultural requirements still serve as the reference model for controlling the processes of data collection and information sharing. Moreover, because customers and organizations often misinterpret their responsibilities and rights at the level of business functions and processes, it is essential for professionals to articulate clearly the rules that govern information gathering and distribution, in a manner that translates into information system specifications and requirements for developers and managers. This study focuses on providing suggestions and recommendations for developing privacy policy, based on a descriptive analysis of 791 respondents on personal data protection in accordance with political and economic factors in Indonesia.

  15. Toward protocols for quantum-ensured privacy and secure voting

    International Nuclear Information System (INIS)

    Bonanome, Marianna; Buzek, Vladimir; Ziman, Mario; Hillery, Mark

    2011-01-01

    We present a number of schemes that use quantum mechanics to preserve privacy; in particular, we show that entangled quantum states can be useful in maintaining privacy. We further develop our original proposal [see M. Hillery, M. Ziman, V. Buzek, and M. Bielikova, Phys. Lett. A 349, 75 (2006)] for protecting privacy in voting, and examine its security under certain types of attacks, in particular dishonest voters and external eavesdroppers. A variation of these quantum-based schemes can be used for multiparty function evaluation. We consider functions corresponding to group multiplication of N group elements, with each element chosen by a different party. We show how quantum mechanics can be useful in maintaining the privacy of the choices of group elements.

  16. Toward protocols for quantum-ensured privacy and secure voting

    Energy Technology Data Exchange (ETDEWEB)

    Bonanome, Marianna [Department of Applied Mathematics and Computer Science, New York City College of Technology, 300 Jay Street, Brooklyn, New York 11201 (United States); Buzek, Vladimir; Ziman, Mario [Research Center for Quantum Information, Slovak Academy of Sciences, Dubravska cesta 9, 845 11 Bratislava (Slovakia); Faculty of Informatics, Masaryk University, Botanicka 68a, 602 00 Brno (Czech Republic); Hillery, Mark [Department of Physics, Hunter College of CUNY, 695 Park Avenue, New York, New York 10021 (United States)

    2011-08-15

    We present a number of schemes that use quantum mechanics to preserve privacy; in particular, we show that entangled quantum states can be useful in maintaining privacy. We further develop our original proposal [see M. Hillery, M. Ziman, V. Buzek, and M. Bielikova, Phys. Lett. A 349, 75 (2006)] for protecting privacy in voting, and examine its security under certain types of attacks, in particular dishonest voters and external eavesdroppers. A variation of these quantum-based schemes can be used for multiparty function evaluation. We consider functions corresponding to group multiplication of N group elements, with each element chosen by a different party. We show how quantum mechanics can be useful in maintaining the privacy of the choices of group elements.

  17. 78 FR 54454 - Open Meeting of the Information Security and Privacy Advisory Board

    Science.gov (United States)

    2013-09-04

    ... security and privacy issues pertaining to federal computer systems. Details regarding the ISPAB's... Information Security and Privacy Advisory Board AGENCY: National Institute of Standards and Technology, Commerce. ACTION: Notice. SUMMARY: The Information Security and Privacy Advisory Board (ISPAB) will meet...

  18. 78 FR 72063 - Open Meeting of the Information Security and Privacy Advisory Board

    Science.gov (United States)

    2013-12-02

    ... NIST on information security and privacy issues pertaining to federal computer systems. Details... Information Security and Privacy Advisory Board AGENCY: National Institute of Standards and Technology, Commerce. ACTION: Notice. SUMMARY: The Information Security and Privacy Advisory Board (ISPAB) will meet...

  19. Forensic DNA phenotyping: Developing a model privacy impact assessment.

    Science.gov (United States)

    Scudder, Nathan; McNevin, Dennis; Kelty, Sally F; Walsh, Simon J; Robertson, James

    2018-05-01

    Forensic scientists around the world are adopting new technology platforms capable of efficiently analysing a larger proportion of the human genome. Undertaking this analysis could provide significant operational benefits, particularly in giving investigators more information about the donor of genetic material, a particularly useful investigative lead. Such information could include predicting externally visible characteristics such as eye and hair colour, as well as biogeographical ancestry. This article looks at the adoption of this new technology from a privacy perspective, using this to inform and critique the application of a Privacy Impact Assessment to this emerging technology. Noting the benefits and limitations, the article develops a number of themes that would influence a model Privacy Impact Assessment as a contextual framework for forensic laboratories and law enforcement agencies considering implementing forensic DNA phenotyping for operational use. Copyright © 2018 Elsevier B.V. All rights reserved.

  20. Crossing borders: Security and privacy issues of the European e-passport

    NARCIS (Netherlands)

    Hoepman, J.H.; Hubbers, E.; Jacobs, B.P.F.; Oostdijk, M.D.; Wichers Schreur, R.

    2008-01-01

    The first generation of European e-passports will be issued in 2006. We discuss how borders are crossed regarding the security and privacy erosion of the proposed schemes, and show which borders need to be crossed to improve the security and the privacy protection of the next generation of

  1. Genetic privacy and non-discrimination.

    Science.gov (United States)

    Romeo Casabona, Carlos María

    2011-01-01

    The UN Inter-Agency Committee on Bioethics met for its tenth meeting at the UNESCO headquarters in Paris on 4-5th March 2011. Member organisations such as the WHO and UNESCO were in attendance alongside associate members such as the Council for Europe, the European Commission, the Organisation for Economic Co-operation and Development and the World Trade Organisation. Discussion centred on the theme "genetic privacy and nondiscrimination". The United Nations Economic and Social Council (ECOSOC) had previously considered, from a legal and ethical perspective, the implications of increasingly sophisticated technologies for genetic privacy and non-discrimination in fields such as medicine, employment and insurance. Thus, the ECOSOC requested that UNESCO report on relevant developments in the field of genetic privacy and non-discrimination. In parallel with a consultation process with member states, UNESCO launched a consultation with the UN Interagency Committee on Bioethics. This article analyses the report presented by the author concerning the analysis of the current contentions in the field and illustrates attempts at responding on a normative level to a perceived threat to genetic privacy and non-discrimination.

  2. Privacy and equality in diagnostic genetic testing.

    Science.gov (United States)

    Nyrhinen, Tarja; Hietala, Marja; Puukka, Pauli; Leino-Kilpi, Helena

    2007-05-01

    This study aimed to determine the extent to which the principles of privacy and equality were observed during diagnostic genetic testing according to views held by patients or child patients' parents (n = 106) and by staff (n = 162) from three Finnish university hospitals. The data were collected through a structured questionnaire and analysed using the SAS 8.1 statistical software. In general, the two principles were observed relatively satisfactorily in clinical practice. According to patients/parents, equality in the post-analytic phase and, according to staff, privacy in the pre-analytic phase, involved the greatest ethical problems. The two groups differed in their views concerning pre-analytic privacy. Although there were no major problems regarding the two principles, the differences between the testing phases require further clarification. To enhance privacy protection and equality, professionals need to be given more genetics/ethics training, and patients need individual counselling by genetics unit staff, with more consideration given to patients' world-view, the purpose of the test and the test result.

  3. 77 FR 75409 - Multistakeholder Meetings To Develop Consumer Data Privacy Code of Conduct Concerning Mobile...

    Science.gov (United States)

    2012-12-20

    ... Protecting Privacy and Promoting Innovation in the Global Digital Economy (the ``Privacy Blueprint'').\\1\\ The Privacy Blueprint directs NTIA to convene multistakeholder processes to develop legally enforceable codes... services for mobile devices handle personal data.\\3\\ On July 12, 2012, NTIA convened the first meeting of...

  4. Data Privacy Laws Follow Lead of Oklahoma and California

    Science.gov (United States)

    Vance, Amelia

    2016-01-01

    Oklahoma's Student Data Accessibility, Transparency, and Accountability Act (known as the Student DATA Act) arose just as privacy concerns about student data were beginning to surface. According to Linnette Attai, founder of education technology compliance consultancy PlayWell LLC, "When this climate of data privacy first emerged in its…

  5. Security and Privacy Analyses of Internet of Things Toys

    OpenAIRE

    Chu, Gordon; Apthorpe, Noah; Feamster, Nick

    2018-01-01

    This paper investigates the security and privacy of Internet-connected children's smart toys through case studies of three commercially-available products. We conduct network and application vulnerability analyses of each toy using static and dynamic analysis techniques, including application binary decompilation and network monitoring. We discover several publicly undisclosed vulnerabilities that violate the Children's Online Privacy Protection Rule (COPPA) as well as the toys' individual pr...

  6. Security of electronic medical information and patient privacy: what you need to know.

    Science.gov (United States)

    Andriole, Katherine P

    2014-12-01

    The responsibility that physicians have to protect their patients from harm extends to protecting the privacy and confidentiality of patient health information including that contained within radiological images. The intent of HIPAA and subsequent HIPAA Privacy and Security Rules is to keep patients' private information confidential while allowing providers access to and maintaining the integrity of relevant information needed to provide care. Failure to comply with electronic protected health information (ePHI) regulations could result in financial or criminal penalties or both. Protected health information refers to anything that can reasonably be used to identify a patient (eg, name, age, date of birth, social security number, radiology examination accession number). The basic tools and techniques used to maintain medical information security and patient privacy described in this article include physical safeguards such as computer device isolation and data backup, technical safeguards such as firewalls and secure transmission modes, and administrative safeguards including documentation of security policies, training of staff, and audit tracking through system logs. Other important concepts related to privacy and security are explained, including user authentication, authorization, availability, confidentiality, data integrity, and nonrepudiation. Patient privacy and security of medical information are critical elements in today's electronic health care environment. Radiology has led the way in adopting digital systems to make possible the availability of medical information anywhere anytime, and in identifying and working to eliminate any risks to patients. Copyright © 2014 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  7. Anti-discrimination Analysis Using Privacy Attack Strategies

    KAUST Repository

    Ruggieri, Salvatore; Hajian, Sara; Kamiran, Faisal; Zhang, Xiangliang

    2014-01-01

    Social discrimination discovery from data is an important task to identify illegal and unethical discriminatory patterns towards protected-by-law groups, e.g., ethnic minorities. We deploy privacy attack strategies as tools for discrimination

  8. Privacy preserving data anonymization of spontaneous ADE reporting system dataset.

    Science.gov (United States)

    Lin, Wen-Yang; Yang, Duen-Chuan; Wang, Jie-Teng

    2016-07-18

    To facilitate long-term safety surveillance of marketed drugs, many spontaneous reporting systems (SRSs) for ADR events have been established world-wide. Since the data collected by SRSs contain sensitive personal health information that should be protected to prevent the identification of individuals, this raises the issue of privacy preserving data publishing (PPDP), that is, how to sanitize (anonymize) raw data before publishing. Although much work has been done on PPDP, very few studies have focused on protecting the privacy of SRS data, and none of the existing anonymization methods is well suited to SRS datasets, which exhibit characteristics such as rare events, multiple individual records, and multi-valued sensitive attributes. We propose a new privacy model called MS(k, θ*)-bounding for protecting published spontaneous ADE reporting data from privacy attacks. Our model has the flexibility of varying privacy thresholds, i.e., θ*, for different sensitive values and takes the characteristics of SRS data into consideration. We also propose an anonymization algorithm for sanitizing the raw data to meet the requirements specified through the proposed model. Our algorithm adopts a greedy clustering strategy to group the records into clusters, conforming to an innovative anonymization metric that aims to minimize the privacy risk as well as maintain the data utility for ADR detection. An empirical study was conducted using the FAERS dataset from 2004Q1 to 2011Q4. We compared our model with four prevailing methods, including k-anonymity, (X, Y)-anonymity, Multi-sensitive l-diversity, and (α, k)-anonymity, evaluated via two measures, Danger Ratio (DR) and Information Loss (IL), and considered three different scenarios of threshold setting for θ*, including uniform setting, level-wise setting and frequency-based setting. We also conducted experiments to inspect the impact of anonymized data on the strengths of discovered ADR signals. With all three
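
    The MS(k, θ*)-bounding algorithm itself is not reproduced here, but the clustering-then-generalization idea it builds on can be illustrated with a minimal, generic k-anonymization sketch. The record fields, distance measure, and generalization rule below are illustrative assumptions, not the authors' method.

```python
# Minimal sketch of greedy clustering-based k-anonymization, in the spirit of the
# sanitization step described above. This is NOT the MS(k, theta*)-bounding algorithm;
# the fields, similarity measure, and generalization rule are illustrative assumptions.
from dataclasses import dataclass
from typing import List

@dataclass
class Record:
    age: int          # quasi-identifier
    zip3: str         # quasi-identifier (3-digit ZIP prefix)
    drug: str         # sensitive attribute (e.g., suspected drug in an ADE report)

def distance(a: Record, b: Record) -> float:
    """Simple quasi-identifier distance: age gap plus ZIP-prefix mismatch penalty."""
    return abs(a.age - b.age) + (0 if a.zip3 == b.zip3 else 10)

def greedy_clusters(records: List[Record], k: int) -> List[List[Record]]:
    """Greedily grow clusters of at least k records around a seed record."""
    remaining = list(records)
    clusters = []
    while len(remaining) >= k:
        seed = remaining.pop(0)
        remaining.sort(key=lambda r: distance(seed, r))
        cluster, remaining = [seed] + remaining[:k - 1], remaining[k - 1:]
        clusters.append(cluster)
    if clusters and remaining:          # fold leftovers into the last cluster
        clusters[-1].extend(remaining)
    return clusters

def generalize(cluster: List[Record]) -> List[dict]:
    """Replace exact quasi-identifiers with cluster-wide ranges/sets."""
    ages = [r.age for r in cluster]
    zips = sorted({r.zip3 for r in cluster})
    return [{"age": f"{min(ages)}-{max(ages)}",
             "zip3": "|".join(zips),
             "drug": r.drug} for r in cluster]

if __name__ == "__main__":
    raw = [Record(34, "100", "drugA"), Record(36, "100", "drugB"),
           Record(35, "101", "drugA"), Record(62, "200", "drugC"),
           Record(60, "200", "drugA"), Record(61, "201", "drugB")]
    for cluster in greedy_clusters(raw, k=3):
        for row in generalize(cluster):
            print(row)   # each published row shares its quasi-identifiers with >= k-1 others
```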

  9. Privacy as Invisibility: Pervasive Surveillance and the Privatization of Peer-to-Peer Systems

    Directory of Open Access Journals (Sweden)

    Francesca Musiani

    2011-06-01

    Full Text Available This article addresses the ongoing, increasing privatization of peer-to-peer (P2P) file sharing systems – the emergence of systems that users may only join by personal, friend-to-friend invitation. It argues that, within P2P systems, privacy is increasingly coinciding with “mere” invisibility vis-à-vis the rest of the Internet ecosystem because of a trend that has shaped the recent history of P2P technology: The alternation between forms of pervasive surveillance of such systems, and reactions by developers and users to such restrictive measures. Yet, it also suggests that the richness of today’s landscape of P2P technology development and use, mainly in the field of Internet-based services, opens up new dimensions to the conceptualization of privacy, and may give room to a more articulate definition of the concept as related to P2P technology; one that includes not only the need of protection from external attacks, and the temporary outcomes of the competition between surveillance and counter-surveillance measures, but also issues such as user empowerment through better control over personal information, reconfiguration of data management practices, and removal of intermediaries in sharing and communication activities.

  10. Scalable privacy-preserving big data aggregation mechanism

    Directory of Open Access Journals (Sweden)

    Dapeng Wu

    2016-08-01

    Full Text Available As the massive sensor data generated by large-scale Wireless Sensor Networks (WSNs) recently become an indispensable part of ‘Big Data’, the collection, storage, transmission and analysis of the big sensor data attract considerable attention from researchers. Targeting the privacy requirements of large-scale WSNs and focusing on the energy-efficient collection of big sensor data, a Scalable Privacy-preserving Big Data Aggregation (Sca-PBDA) method is proposed in this paper. Firstly, according to the pre-established gradient topology structure, sensor nodes in the network are divided into clusters. Secondly, sensor data is modified by each node according to the privacy-preserving configuration message received from the sink. Subsequently, intra- and inter-cluster data aggregation is employed during the big sensor data reporting phase to reduce energy consumption. Lastly, aggregated results are recovered by the sink to complete the privacy-preserving big data aggregation. Simulation results validate the efficacy and scalability of Sca-PBDA and show that the big sensor data generated by large-scale WSNs is efficiently aggregated to reduce network resource consumption and the sensor data privacy is effectively protected to meet the ever-growing application requirements.
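
    The abstract does not spell out how the sink's configuration message masks the readings, so the sketch below illustrates one common way such schemes work: each node adds a pseudo-random offset derived from a secret shared with the sink, intermediate nodes only sum masked values, and the sink removes the offsets it can recompute. The node IDs, hash-based offset function, and epoch parameter are assumptions for illustration, not the Sca-PBDA protocol itself.

```python
# Minimal sketch of privacy-preserving additive aggregation: sensors mask readings with
# per-node offsets known only to the sink; cluster heads never see raw readings; the
# sink subtracts the offsets to recover the true aggregate.
import hashlib

def offset(node_id: int, epoch: int, secret: bytes, modulus: int = 10**6) -> int:
    """Deterministic per-node, per-epoch masking offset (a simple hash-based PRF)."""
    digest = hashlib.sha256(secret + node_id.to_bytes(4, "big") + epoch.to_bytes(4, "big")).digest()
    return int.from_bytes(digest[:8], "big") % modulus

def node_report(node_id: int, reading: int, epoch: int, secret: bytes) -> int:
    """Sensor node: publish only the masked reading."""
    return reading + offset(node_id, epoch, secret)

def cluster_head_aggregate(masked_reports: list) -> int:
    """Cluster head / intermediate node: sum masked values without learning any reading."""
    return sum(masked_reports)

def sink_recover(aggregate: int, node_ids: list, epoch: int, secret: bytes) -> int:
    """Sink: subtract the offsets it can recompute to obtain the true sum."""
    return aggregate - sum(offset(nid, epoch, secret) for nid in node_ids)

if __name__ == "__main__":
    secret, epoch = b"shared-with-sink", 42
    readings = {1: 17, 2: 23, 3: 5, 4: 11}            # true sensor readings
    masked = [node_report(nid, val, epoch, secret) for nid, val in readings.items()]
    agg = cluster_head_aggregate(masked)               # intra-/inter-cluster aggregation
    print(sink_recover(agg, list(readings), epoch, secret))   # -> 56, the true sum
```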

  11. Centralized Duplicate Removal Video Storage System with Privacy Preservation in IoT

    Directory of Open Access Journals (Sweden)

    Hongyang Yan

    2018-06-01

    Full Text Available In recent years, the Internet of Things (IoT) has found wide application and attracted much attention. Since most of the end-terminals in IoT have limited capabilities for storage and computing, it has become a trend to outsource the data from local to cloud computing. To further reduce the communication bandwidth and storage space, data deduplication has been widely adopted to eliminate the redundant data. However, since data collected in IoT are sensitive and closely related to users’ personal information, the privacy protection of users’ information becomes a challenge. As the channels, like the wireless channels between the terminals and the cloud servers in IoT, are public and the cloud servers are not fully trusted, data have to be encrypted before being uploaded to the cloud. However, encryption makes the performance of deduplication by the cloud server difficult because the ciphertext will be different even if the underlying plaintext is identical. In this paper, we build a centralized privacy-preserving duplicate removal storage system, which supports both file-level and block-level deduplication. In order to avoid the leakage of statistical information of data, Intel Software Guard Extensions (SGX) technology is utilized to protect the deduplication process on the cloud server. The results of the experimental analysis demonstrate that the new scheme can significantly improve the deduplication efficiency and enhance the security. It is envisioned that the duplicated removal system with privacy preservation will be of great use in the centralized storage environment of IoT.

  12. Centralized Duplicate Removal Video Storage System with Privacy Preservation in IoT.

    Science.gov (United States)

    Yan, Hongyang; Li, Xuan; Wang, Yu; Jia, Chunfu

    2018-06-04

    In recent years, the Internet of Things (IoT) has found wide application and attracted much attention. Since most of the end-terminals in IoT have limited capabilities for storage and computing, it has become a trend to outsource the data from local to cloud computing. To further reduce the communication bandwidth and storage space, data deduplication has been widely adopted to eliminate the redundant data. However, since data collected in IoT are sensitive and closely related to users' personal information, the privacy protection of users' information becomes a challenge. As the channels, like the wireless channels between the terminals and the cloud servers in IoT, are public and the cloud servers are not fully trusted, data have to be encrypted before being uploaded to the cloud. However, encryption makes the performance of deduplication by the cloud server difficult because the ciphertext will be different even if the underlying plaintext is identical. In this paper, we build a centralized privacy-preserving duplicate removal storage system, which supports both file-level and block-level deduplication. In order to avoid the leakage of statistical information of data, Intel Software Guard Extensions (SGX) technology is utilized to protect the deduplication process on the cloud server. The results of the experimental analysis demonstrate that the new scheme can significantly improve the deduplication efficiency and enhance the security. It is envisioned that the duplicated removal system with privacy preservation will be of great use in the centralized storage environment of IoT.
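
    The core tension described in both records - deduplication requires identical ciphertexts, while ordinary encryption randomizes them - can be sketched with convergent (message-locked) encryption, a commonly used alternative to the SGX-protected approach the authors take. The toy XOR cipher and the store interface below are illustrative assumptions only.

```python
# File-level deduplication over encrypted data, sketched with convergent encryption:
# the key is derived from the file content, so identical plaintexts yield identical
# ciphertexts and can be deduplicated by the server. The paper itself protects the
# deduplication step with Intel SGX instead; this is only a generic illustration.
import hashlib
from itertools import cycle

def convergent_key(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()                  # key derived from the content itself

def encrypt(data: bytes, key: bytes) -> bytes:
    # Toy XOR keystream for illustration only -- a real system would use AES-GCM or similar.
    stream = hashlib.sha256(key).digest()
    return bytes(b ^ k for b, k in zip(data, cycle(stream)))

class DedupStore:
    """Server-side store that keeps one ciphertext per unique content fingerprint."""
    def __init__(self):
        self.blobs = {}                                    # fingerprint -> ciphertext

    def upload(self, ciphertext: bytes) -> str:
        tag = hashlib.sha256(ciphertext).hexdigest()       # deduplication fingerprint
        duplicate = tag in self.blobs
        self.blobs.setdefault(tag, ciphertext)
        print("duplicate, not stored again" if duplicate else "stored new blob")
        return tag

if __name__ == "__main__":
    store = DedupStore()
    video_a = b"frame-data-from-camera-1"
    video_b = b"frame-data-from-camera-1"                  # identical content, different owner
    for clip in (video_a, video_b):
        key = convergent_key(clip)
        store.upload(encrypt(clip, key))                   # second upload is deduplicated
    print("unique blobs stored:", len(store.blobs))        # -> 1
```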

  13. Privacy enhanced group communication in clinical environment

    Science.gov (United States)

    Li, Mingyan; Narayanan, Sreeram; Poovendran, Radha

    2005-04-01

    Privacy protection of medical records has always been an important issue and is mandated by the recent Health Insurance Portability and Accountability Act (HIPAA) standards. In this paper, we propose security architectures for a tele-referring system that allows electronic group communication among professionals for better quality treatments, while protecting patient privacy against unauthorized access. Although DICOM defines the much-needed guidelines for confidentiality of medical data during transmission, there is no provision in existing medical security systems to guarantee patient privacy once the data has been received. In our design, we address this issue by using a watermarking technique to enable tracing back to the recipient whose received data is disclosed to outsiders. We present the security architecture design of a tele-referring system using a distributed approach and a centralized web-based approach. The resulting tele-referring system (i) provides confidentiality during transmission and ensures integrity and authenticity of the received data, (ii) allows tracing of the recipient who has either distributed the data to outsiders or whose system has been compromised, (iii) provides proof of receipt or origin, and (iv) is easy to use and low-cost to employ in a clinical environment.
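
    The paper does not detail its watermarking scheme beyond its purpose of recipient tracing, so the following is only a minimal stand-in: a least-significant-bit watermark that embeds a hypothetical 32-bit recipient ID into the first pixels of an image so that a leaked copy can be traced back to the recipient it was sent to.

```python
# Minimal sketch of recipient tracing via LSB watermarking. Illustrative only; the
# tele-referring system's actual watermarking scheme is not described in this record.
import numpy as np

def embed_recipient_id(image: np.ndarray, recipient_id: int) -> np.ndarray:
    """Embed a 32-bit recipient ID into the LSBs of the first 32 pixel values."""
    marked = image.copy().ravel()
    bits = np.unpackbits(np.array([recipient_id], dtype=">u4").view(np.uint8))
    marked[:32] = (marked[:32] & 0xFE) | bits              # overwrite only the LSB
    return marked.reshape(image.shape)

def extract_recipient_id(image: np.ndarray) -> int:
    """Read the 32-bit recipient ID back from the LSBs of a (possibly leaked) copy."""
    bits = image.ravel()[:32] & 1
    return int.from_bytes(np.packbits(bits).tobytes(), "big")

if __name__ == "__main__":
    scan = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)   # stand-in for an image
    for recipient in (1001, 1002):                                     # one marked copy per clinician
        copy_sent = embed_recipient_id(scan, recipient)
        assert extract_recipient_id(copy_sent) == recipient
    print("leaked copy traced to recipient", extract_recipient_id(copy_sent))
```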

  14. Valuating Privacy with Option Pricing Theory

    Science.gov (United States)

    Berthold, Stefan; Böhme, Rainer

    One of the key challenges in the information society is responsible handling of personal data. An often-cited reason why people fail to make rational decisions regarding their own informational privacy is the high uncertainty about future consequences of information disclosures today. This chapter builds an analogy to financial options and draws on principles of option pricing to account for this uncertainty in the valuation of privacy. For this purpose, the development of a data subject's personal attributes over time and the development of the attribute distribution in the population are modeled as two stochastic processes, which fit into the Binomial Option Pricing Model (BOPM). Possible applications of such valuation methods to guide decision support in future privacy-enhancing technologies (PETs) are sketched.
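
    The chapter relies on the standard Binomial Option Pricing Model rather than introducing a new one, so a generic Cox-Ross-Rubinstein valuation routine conveys the mechanics. In the privacy analogy, the "underlying" would be the value of keeping an attribute undisclosed and the "option" the choice to disclose it later; the numeric parameters below are purely illustrative and not taken from the chapter.

```python
# Generic Cox-Ross-Rubinstein binomial option pricing (the BOPM referenced above).
import math

def binomial_option_value(s0: float, strike: float, r: float, sigma: float,
                          maturity: float, steps: int, call: bool = True) -> float:
    dt = maturity / steps
    u = math.exp(sigma * math.sqrt(dt))        # up factor
    d = 1.0 / u                                # down factor
    q = (math.exp(r * dt) - d) / (u - d)       # risk-neutral up probability
    disc = math.exp(-r * dt)

    # terminal payoffs: node i has i up-moves and steps - i down-moves
    values = []
    for i in range(steps + 1):
        s_t = s0 * (u ** i) * (d ** (steps - i))
        payoff = max(s_t - strike, 0.0) if call else max(strike - s_t, 0.0)
        values.append(payoff)

    # backward induction to time 0
    for _ in range(steps):
        values = [disc * (q * values[i + 1] + (1 - q) * values[i])
                  for i in range(len(values) - 1)]
    return values[0]

if __name__ == "__main__":
    # Illustrative parameters only: current "value" 100, strike 100, 2% rate, 30% volatility.
    print(round(binomial_option_value(s0=100, strike=100, r=0.02,
                                      sigma=0.3, maturity=1.0, steps=100), 2))
```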

  15. Ethics and Privacy Implications of Using the Internet and Social Media to Recruit Participants for Health Research: A Privacy-by-Design Framework for Online Recruitment

    Science.gov (United States)

    Cyr, Alaina B; Arbuckle, Luk; Ferris, Lorraine E

    2017-01-01

    Background: The Internet and social media offer promising ways to improve the reach, efficiency, and effectiveness of recruitment efforts at a reasonable cost, but raise unique ethical dilemmas. We describe how we used social media to recruit cancer patients and family caregivers for a research study, the ethical issues we encountered, and the strategies we developed to address them. Objective: Drawing on the principles of Privacy by Design (PbD), a globally recognized standard for privacy protection, we aimed to develop a PbD framework for online health research recruitment. Methods: We proposed a focus group study on the dietary behaviors of cancer patients and their families, and the role of Web-based dietary self-management tools. Using an established blog on our hospital website, we proposed publishing a recruitment post and sharing the link on our Twitter and Facebook pages. The Research Ethics Board (REB) raised concern about the privacy risks associated with our recruitment strategy; by clicking on a recruitment post, an individual could inadvertently disclose personal health information to third-party companies engaged in tracking online behavior. The REB asked us to revise our social media recruitment strategy with the following questions in mind: (1) How will you inform users about the potential for privacy breaches and their implications? and (2) How will you protect users from privacy breaches or inadvertently sharing potentially identifying information about themselves? Results: Ethical guidelines recommend a proportionate approach to ethics assessment, which advocates for risk mitigation strategies that are proportional to the magnitude and probability of risks. We revised our social media recruitment strategy to inform users about privacy risks and to protect their privacy, while at the same time meeting our recruitment objectives. We provide a critical reflection of the perceived privacy risks associated with our social media recruitment strategy and

  16. Realizing IoT service's policy privacy over publish/subscribe-based middleware.

    Science.gov (United States)

    Duan, Li; Zhang, Yang; Chen, Shiping; Wang, Shiyao; Cheng, Bo; Chen, Junliang

    2016-01-01

    The publish/subscribe paradigm makes IoT service collaborations more scalable and flexible, due to the space, time and control decoupling of event producers and consumers. Thus, the paradigm can be used to establish large-scale IoT service communication infrastructures such as Supervisory Control and Data Acquisition systems. However, preserving an IoT service's policy privacy is difficult in this paradigm, because a classical publisher has little control over its own event after it is published, and a subscriber has to accept all the events of the subscribed event type with no choice. Few existing publish/subscribe middleware have built-in mechanisms to address these issues. In this paper, we present a novel access control framework that is capable of preserving IoT services' policy privacy. In particular, we adopt the publish/subscribe paradigm as the IoT service communication infrastructure to facilitate the protection of IoT services' policy privacy. The key idea in our policy-privacy solution is using a two-layer cooperating method to match bi-directional privacy control requirements: (a) a data layer for protecting IoT events; and (b) an application layer for preserving the privacy of service policy. Furthermore, an anonymous-set-based principle is adopted to realize the functionalities of the framework, including policy embedding and policy encoding as well as policy matching. Our security analysis shows that the policy privacy framework is Chosen-Plaintext Attack secure. We extend the open source Apache ActiveMQ broker by building in a policy-based authorization mechanism to enforce the privacy policy. The performance evaluation results indicate that our approach is scalable with reasonable overheads.
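
    The anonymous-set-based encoding and the ActiveMQ extension are specific to the paper, but the general idea of broker-enforced, policy-based event authorization can be sketched with a tiny in-process publish/subscribe broker. The topic names, subscriber attributes, and policy predicate below are illustrative assumptions, not the paper's construction.

```python
# Minimal in-process sketch of broker-side, policy-based event authorization in a
# publish/subscribe setting: the broker delivers an event only when the publisher's
# policy accepts the subscriber's attributes.
from typing import Callable, Dict, List, Tuple

Policy = Callable[[Dict[str, str]], bool]        # publisher policy over subscriber attributes

class PolicyBroker:
    def __init__(self):
        # topic -> list of (subscriber attributes, delivery callback)
        self.subscriptions: Dict[str, List[Tuple[Dict[str, str], Callable]]] = {}

    def subscribe(self, topic: str, attributes: Dict[str, str], deliver: Callable) -> None:
        self.subscriptions.setdefault(topic, []).append((attributes, deliver))

    def publish(self, topic: str, event: dict, policy: Policy) -> None:
        for attributes, deliver in self.subscriptions.get(topic, []):
            if policy(attributes):               # broker enforces the publisher's policy
                deliver(event)

if __name__ == "__main__":
    broker = PolicyBroker()
    broker.subscribe("scada/pressure", {"role": "operator"},
                     lambda e: print("operator got", e))
    broker.subscribe("scada/pressure", {"role": "guest"},
                     lambda e: print("guest got", e))
    # Publisher policy: only operators may receive this event.
    broker.publish("scada/pressure", {"value": 7.2},
                   policy=lambda attrs: attrs.get("role") == "operator")
```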

  17. Profiling the Mobile Customer

    DEFF Research Database (Denmark)

    Jessen, Pernille Wegener; King, Nancy J.

    2010-01-01

    of significant concerns about privacy and data protection. This second article in a two part series on "Profiling the Mobile Customer" explores how to best protect consumers' privacy and personal data through available mechanisms that include industry self-regulation, privacy-enhancing technologies...... discusses the current limitations of using technology to protect consumers from privacy abuses related to profiling. Concluding that industry self-regulation and available privacy-enhancing technologies will not be adequate to close important privacy gaps related to consumer profiling without legislative...

  18. A Taxonomy of Privacy Constructs for Privacy-Sensitive Robotics

    OpenAIRE

    Rueben, Matthew; Grimm, Cindy M.; Bernieri, Frank J.; Smart, William D.

    2017-01-01

    The introduction of robots into our society will also introduce new concerns about personal privacy. In order to study these concerns, we must do human-subject experiments that involve measuring privacy-relevant constructs. This paper presents a taxonomy of privacy constructs based on a review of the privacy literature. Future work in operationalizing privacy constructs for HRI studies is also discussed.

  19. Privacy exposed : Consumer responses to data collection and usage practices of mobile apps

    NARCIS (Netherlands)

    Wottrich, V.M.

    2018-01-01

    Mobile apps are increasingly jeopardizing consumer privacy by collecting, storing, and sharing personal information. However, little is known about users’ responses to data collection and usage practices of apps. This dissertation investigated (1) the status quo of privacy protection behavior, (2)

  20. Privacy policies for health social networking sites

    Science.gov (United States)

    Li, Jingquan

    2013-01-01

    Health social networking sites (HSNS), virtual communities where users connect with each other around common problems and share relevant health data, have been increasingly adopted by medical professionals and patients. The growing use of HSNS like Sermo and PatientsLikeMe has prompted public concerns about the risks that such online data-sharing platforms pose to the privacy and security of personal health data. This paper articulates a set of privacy risks introduced by social networking in health care and presents a practical example that demonstrates how the risks might be intrinsic to some HSNS. The aim of this study is to identify and sketch the policy implications of using HSNS and how policy makers and stakeholders should elaborate upon them to protect the privacy of online health data. PMID:23599228