WorldWideScience

Sample records for model privacy form

  1. 16 CFR 313.2 - Model privacy form and examples.

    Science.gov (United States)

    2010-01-01

    ... 16 Commercial Practices 1 2010-01-01 2010-01-01 false Model privacy form and examples. 313.2... PRIVACY OF CONSUMER FINANCIAL INFORMATION § 313.2 Model privacy form and examples. (a) Model privacy form..., although use of the model privacy form is not required. (b) Examples. The examples in this part are not...

  2. 12 CFR 716.2 - Model privacy form and examples.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Model privacy form and examples. 716.2 Section... PRIVACY OF CONSUMER FINANCIAL INFORMATION § 716.2 Model privacy form and examples. (a) Model privacy form..., although use of the model privacy form is not required. (b) Examples. The examples in this part are not...

  3. 12 CFR 573.2 - Model privacy form and examples.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 5 2010-01-01 2010-01-01 false Model privacy form and examples. 573.2 Section... FINANCIAL INFORMATION § 573.2 Model privacy form and examples. (a) Model privacy form. Use of the model... privacy form is not required. (b) Examples. The examples in this part are not exclusive. Compliance with...

  4. 12 CFR 332.2 - Model privacy form and examples.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 4 2010-01-01 2010-01-01 false Model privacy form and examples. 332.2 Section... POLICY PRIVACY OF CONSUMER FINANCIAL INFORMATION § 332.2 Model privacy form and examples. (a) Model... this part, although use of the model privacy form is not required. (b) Examples. The examples in this...

  5. 17 CFR 160.2 - Model privacy form and examples.

    Science.gov (United States)

    2010-04-01

    ... examples. 160.2 Section 160.2 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION PRIVACY OF CONSUMER FINANCIAL INFORMATION § 160.2 Model privacy form and examples. (a) Model privacy form..., although use of the model privacy form is not required. (b) Examples. The examples in this part are not...

  6. 12 CFR 216.2 - Model privacy form and examples.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 2 2010-01-01 2010-01-01 false Model privacy form and examples. 216.2 Section... PRIVACY OF CONSUMER FINANCIAL INFORMATION (REGULATION P) § 216.2 Model privacy form and examples. (a... of this part, although use of the model privacy form is not required. (b) Examples. The examples in...

  7. 12 CFR 40.2 - Model privacy form and examples.

    Science.gov (United States)

    2010-01-01

    ... 40.2 Banks and Banking COMPTROLLER OF THE CURRENCY, DEPARTMENT OF THE TREASURY PRIVACY OF CONSUMER... privacy form in Appendix A of this part, consistent with the instructions in Appendix A, constitutes compliance with the notice content requirements of §§ 40.6 and 40.7 of this part, although use of the model...

  8. A Model-Based Privacy Compliance Checker

    OpenAIRE

    Siani Pearson; Damien Allison

    2009-01-01

Increasingly, e-business organisations are coming under pressure to be compliant with a range of privacy legislation, policies and best practice. There is a clear need for high-level management and administrators to be able to assess, in a dynamic, customisable way, the degree to which their enterprise complies with these. We outline a solution to this problem in the form of a model-driven automated privacy process analysis and configuration checking system. This system models privacy compliance ...

  9. MODEL REGULATION FOR DATA PRIVACY IN THE APPLICATION OF BIOMETRIC SMART CARD

    Directory of Open Access Journals (Sweden)

    Sinta Dewi

    2017-03-01

This article explores a model regulation for data privacy which is intended to regulate and protect data privacy. The regulatory model combines several approaches to managing data privacy, especially in the use of biometric smartcards. First, through laws that enforce the relevant principles and international standards. Second, through the market approach (market-based solutions driven by industry associations that help protect consumer data privacy by applying a privacy policy in the form of a statement that the industry will protect consumers' privacy by implementing fair information principles). Third, through technological approaches such as PETs (privacy enhancing technologies), i.e. techniques for anonymous and pseudo-anonymous payment, communication, and web access. Fourth, through corporate privacy rules.

  10. From Data Privacy to Location Privacy

    Science.gov (United States)

    Wang, Ting; Liu, Ling

    Over the past decade, the research on data privacy has achieved considerable advancement in the following two aspects: First, a variety of privacy threat models and privacy principles have been proposed, aiming at providing sufficient protection against different types of inference attacks; Second, a plethora of algorithms and methods have been developed to implement the proposed privacy principles, while attempting to optimize the utility of the resulting data. The first part of the chapter presents an overview of data privacy research by taking a close examination at the achievements from the above two aspects, with the objective of pinpointing individual research efforts on the grand map of data privacy protection. As a special form of data privacy, location privacy possesses its unique characteristics. In the second part of the chapter, we examine the research challenges and opportunities of location privacy protection, in a perspective analogous to data privacy. Our discussion attempts to answer the following three questions: (1) Is it sufficient to apply the data privacy models and algorithms developed to date for protecting location privacy? (2) What is the current state of the research on location privacy? (3) What are the open issues and technical challenges that demand further investigation? Through answering these questions, we intend to provide a comprehensive review of the state of the art in location privacy research.

  11. A Privacy Model for RFID Tag Ownership Transfer

    Directory of Open Access Journals (Sweden)

    Xingchun Yang

    2017-01-01

Full Text Available The ownership of an RFID tag is often transferred from one owner to another during its life cycle. To address the privacy problem caused by tag ownership transfer, we propose a tag privacy model which captures the adversary’s abilities to get secret information inside readers, to corrupt tags, to authenticate tags, and to observe tag ownership transfer processes. This model gives formal definitions for tag forward privacy and backward privacy and can be used to measure the privacy property of a tag ownership transfer scheme. We also present a tag ownership transfer scheme, which is privacy-preserving under the proposed model and satisfies the other common security requirements, in addition to achieving better performance.

  12. Review of the model of technological pragmatism considering privacy and security

    Directory of Open Access Journals (Sweden)

    Kovačević-Lepojević Marina M.

    2013-01-01

    Full Text Available The model of technological pragmatism assumes awareness that technological development involves both benefits and dangers. Most modern security technologies represent citizens' mass surveillance tools, which can lead to compromising a significant amount of personal data due to the lack of institutional monitoring and control. On the other hand, people are interested in improving crime control and reducing the fear of potential victimization which this framework provides as a rational justification for the apparent loss of privacy, personal rights and freedoms. Citizens' perception on the categories of security and privacy, and their balancing, can provide the necessary guidelines to regulate the application of security technologies in the actual context. The aim of this paper is to analyze the attitudes of students at the University of Belgrade (N = 269 toward the application of security technology and identification of the key dimensions. On the basis of the relevant research the authors have formed assumptions about the following dimensions: security, privacy, trust in institutions and concern about the misuse of security technology. The Prise Questionnaire on Security Technology and Privacy was used for data collection. Factor analysis abstracted eight factors which together account for 58% of variance, with the highest loading of the four factors that are identified as security, privacy, trust and concern. The authors propose a model of technological pragmatism considering security and privacy. The data also showed that students are willing to change their privacy for the purpose of improving security and vice versa.

  13. 17 CFR Appendix A to Part 160 - Model Privacy Form

    Science.gov (United States)

    2010-04-01

    ... information to market to me.” A financial institution that chooses to offer an opt-out for joint marketing... institutions to jointly market to me.” (h) Barcodes. A financial institution may elect to include a barcode and... of a financial institution, including a group of financial institutions that use a common privacy...

  14. A Framework For Enhancing Privacy In Location Based Services Using K-Anonymity Model

    Directory of Open Access Journals (Sweden)

    Jane Mugi

    2015-08-01

Full Text Available Abstract This paper presents a framework for enhancing privacy in Location Based Services using the K-anonymity model. Users of location based services have to reveal their location information in order to use these services; however, this threatens user privacy. The K-anonymity approach has been studied extensively in various forms, but it is only effective when the user location is fixed. When a user moves and continuously sends their location information, the location service provider can approximate the user trajectory, which poses a threat to the trajectory privacy of the user. This framework ensures that user privacy is enhanced for both snapshot and continuous queries. The efficiency and effectiveness of the proposed framework were evaluated; the results indicate that the proposed framework has a high success rate and good run time performance.
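
    A minimal sketch of the spatial-cloaking idea that K-anonymity frameworks of this kind build on (not the paper's own algorithm): the user's exact position is replaced by a region grown until it covers at least k users, so the query location is indistinguishable from at least k-1 others. Function names and parameters here are illustrative assumptions.

      import random

      def cloak_location(user_xy, other_users_xy, k, step=0.001):
          """Grow a square region around the user's location until it contains
          at least k users (the querying user plus k-1 others), then return the
          region instead of the exact coordinates."""
          x, y = user_xy
          half = step
          while True:
              inside = [p for p in other_users_xy
                        if abs(p[0] - x) <= half and abs(p[1] - y) <= half]
              if len(inside) + 1 >= k:            # +1 counts the querying user
                  return (x - half, y - half, x + half, y + half)
              half += step                        # expand the cloaking region

      # Toy usage: 200 other users scattered around the query point.
      random.seed(0)
      others = [(random.uniform(-0.05, 0.05), random.uniform(-0.05, 0.05))
                for _ in range(200)]
      print(cloak_location((0.0, 0.0), others, k=10))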

  15. Effective online privacy mechanisms with persuasive communication

    OpenAIRE

    Coopamootoo, P L

    2016-01-01

    This thesis contributes to research by taking a social psychological perspective to managing privacy online. The thesis proposes to support the effort to form a mental model that is required to evaluate a context with regards to privacy attitudes or to ease the effort by biasing activation of privacy attitudes. Privacy being a behavioural concept, the human-computer interaction design plays a major role in supporting and contributing to end users’ ability to manage their privacy online. Howev...

  16. Context trees for privacy-preserving modeling of genetic data

    NARCIS (Netherlands)

    Kusters, C.J.; Ignatenko, T.

    2016-01-01

In this work, we use context trees for privacy-preserving modeling of genetic sequences. The resulting estimated models are applied for functional comparison of genetic sequences in a privacy-preserving way. Here we define privacy as uncertainty about the genetic source sequence given its model and

  17. The Impact of Privacy Concerns and Perceived Vulnerability to Risks on Users Privacy Protection Behaviors on SNS: A Structural Equation Model

    OpenAIRE

    Noora Sami Al-Saqer; Mohamed E. Seliaman

    2016-01-01

    This research paper investigates Saudi users’ awareness levels about privacy policies in Social Networking Sites (SNSs), their privacy concerns and their privacy protection measures. For this purpose, a research model that consists of five main constructs namely information privacy concern, awareness level of privacy policies of social networking sites, perceived vulnerability to privacy risks, perceived response efficacy, and privacy protecting behavior was developed. An online survey questi...

  18. Modelling information dissemination under privacy concerns in social media

    Science.gov (United States)

    Zhu, Hui; Huang, Cheng; Lu, Rongxing; Li, Hui

    2016-05-01

Social media has recently become an important platform for users to share news, express views, and post messages. However, because user privacy must be preserved in social media, many privacy setting tools are employed, which inevitably change the patterns and dynamics of information dissemination. In this study, a general stochastic model using dynamic evolution equations was introduced to illustrate how privacy concerns impact the process of information dissemination. Extensive simulations and analyses involving the privacy settings of general users, privileged users, and pure observers were conducted on real-world networks, and the results demonstrated that user privacy settings affect information dissemination differently. Finally, we also studied the process of information diffusion analytically and numerically with different privacy settings using two classic networks.
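
    The abstract does not reproduce the evolution equations, so the following is only a toy Monte Carlo illustration of the qualitative claim: each user has a privacy setting that scales down the probability of passing a message on, and stricter settings slow and shrink the cascade. The random-graph model and all parameters are assumptions, not the authors'.

      import random

      def simulate_spread(n=500, avg_degree=6, share_prob=0.3,
                          privacy_level=0.5, steps=20, seed=1):
          """Spread a message over a random graph; each user's privacy setting
          (0 = open, 1 = most restrictive) scales down the re-share probability."""
          rng = random.Random(seed)
          p_edge = avg_degree / (n - 1)
          neighbors = [[] for _ in range(n)]
          for i in range(n):
              for j in range(i + 1, n):
                  if rng.random() < p_edge:
                      neighbors[i].append(j)
                      neighbors[j].append(i)
          informed = {0}                              # user 0 posts the message
          for _ in range(steps):
              newly = set()
              p = share_prob * (1.0 - privacy_level)  # privacy damps sharing
              for u in informed:
                  for v in neighbors[u]:
                      if v not in informed and rng.random() < p:
                          newly.add(v)
              informed |= newly
          return len(informed)

      for level in (0.0, 0.5, 0.9):
          print(f"privacy setting {level}: {simulate_spread(privacy_level=level)} users reached")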

  19. Towards a Formal Model of Privacy-Sensitive Dynamic Coalitions

    Directory of Open Access Journals (Sweden)

    Sebastian Bab

    2012-04-01

Full Text Available The concept of dynamic coalitions (also virtual organizations) describes the temporary interconnection of autonomous agents, who share information or resources in order to achieve a common goal. Through modern technologies these coalitions may form across company, organization and system borders. Therefore, questions of access control and security are of vital significance for the architectures supporting these coalitions. In this paper, we present our first steps towards a formal framework for modeling and verifying the design of privacy-sensitive dynamic coalition infrastructures and their processes. In order to do so we extend existing dynamic coalition modeling approaches with an access control concept, which manages access to information through policies. Furthermore, we regard the processes underlying these coalitions and present first work in formalizing these processes. As a result, we illustrate the usefulness of the Abstract State Machine (ASM) method for this task. We demonstrate a formal treatment of privacy-sensitive dynamic coalitions by two example ASMs which model certain access control situations. A logical consideration of these ASMs can lead to a better understanding and a verification of the ASMs according to the aspired specification.

  20. A Privacy Preservation Model for Health-Related Social Networking Sites

    Science.gov (United States)

    2015-01-01

    The increasing use of social networking sites (SNS) in health care has resulted in a growing number of individuals posting personal health information online. These sites may disclose users' health information to many different individuals and organizations and mine it for a variety of commercial and research purposes, yet the revelation of personal health information to unauthorized individuals or entities brings a concomitant concern of greater risk for loss of privacy among users. Many users join multiple social networks for different purposes and enter personal and other specific information covering social, professional, and health domains into other websites. Integration of multiple online and real social networks makes the users vulnerable to unintentional and intentional security threats and misuse. This paper analyzes the privacy and security characteristics of leading health-related SNS. It presents a threat model and identifies the most important threats to users and SNS providers. Building on threat analysis and modeling, this paper presents a privacy preservation model that incorporates individual self-protection and privacy-by-design approaches and uses the model to develop principles and countermeasures to protect user privacy. This study paves the way for analysis and design of privacy-preserving mechanisms on health-related SNS. PMID:26155953

  1. A Privacy Preservation Model for Health-Related Social Networking Sites.

    Science.gov (United States)

    Li, Jingquan

    2015-07-08

    The increasing use of social networking sites (SNS) in health care has resulted in a growing number of individuals posting personal health information online. These sites may disclose users' health information to many different individuals and organizations and mine it for a variety of commercial and research purposes, yet the revelation of personal health information to unauthorized individuals or entities brings a concomitant concern of greater risk for loss of privacy among users. Many users join multiple social networks for different purposes and enter personal and other specific information covering social, professional, and health domains into other websites. Integration of multiple online and real social networks makes the users vulnerable to unintentional and intentional security threats and misuse. This paper analyzes the privacy and security characteristics of leading health-related SNS. It presents a threat model and identifies the most important threats to users and SNS providers. Building on threat analysis and modeling, this paper presents a privacy preservation model that incorporates individual self-protection and privacy-by-design approaches and uses the model to develop principles and countermeasures to protect user privacy. This study paves the way for analysis and design of privacy-preserving mechanisms on health-related SNS.

  2. Model-driven Privacy Assessment in the Smart Grid

    Energy Technology Data Exchange (ETDEWEB)

    Knirsch, Fabian [Salzburg Univ. (Austria); Engel, Dominik [Salzburg Univ. (Austria); Neureiter, Christian [Salzburg Univ. (Austria); Frincu, Marc [Univ. of Southern California, Los Angeles, CA (United States); Prasanna, Viktor [Univ. of Southern California, Los Angeles, CA (United States)

    2015-02-09

    In a smart grid, data and information are transported, transmitted, stored, and processed with various stakeholders having to cooperate effectively. Furthermore, personal data is the key to many smart grid applications and therefore privacy impacts have to be taken into account. For an effective smart grid, well integrated solutions are crucial and for achieving a high degree of customer acceptance, privacy should already be considered at design time of the system. To assist system engineers in early design phase, frameworks for the automated privacy evaluation of use cases are important. For evaluation, use cases for services and software architectures need to be formally captured in a standardized and commonly understood manner. In order to ensure this common understanding for all kinds of stakeholders, reference models have recently been developed. In this paper we present a model-driven approach for the automated assessment of such services and software architectures in the smart grid that builds on the standardized reference models. The focus of qualitative and quantitative evaluation is on privacy. For evaluation, the framework draws on use cases from the University of Southern California microgrid.

  3. Big data privacy protection model based on multi-level trusted system

    Science.gov (United States)

    Zhang, Nan; Liu, Zehua; Han, Hongfeng

    2018-05-01

This paper builds on the multi-level trusted system model, which counters the Trojan-horse threat by encrypting users' private data and enforces the principle "not to read the high-priority level, not to write to the low-priority level", thus ensuring that a privacy leak of low-priority data does not lead to the disclosure of high-priority data. The model divides user data into seven risk levels: priority levels 1-7 represent the value of user data privacy from low to high and are realized by seven encryption algorithms with different execution efficiency. The higher the priority, the greater the value of the user data privacy and the stronger the encryption algorithm chosen to ensure data security, at the expense of efficiency. For enterprises, the price point is determined by the unit equipment and the length of time decided by users; the higher the risk sub-group's algorithm, the longer the encryption time. The model assumes that users prefer the lower-priority encryption algorithms to ensure efficiency. This paper proposes a privacy cost model for each of the seven risk sub-groups: the higher the privacy cost, the higher the priority of the risk sub-group and the higher the price the user needs to pay to ensure the privacy of the data. Furthermore, by introducing an existing economic pricing model together with the human traffic model proposed in this paper, prices fluctuate with market demand: the unit price is raised when market demand is low, and when market demand increases the enterprise's profit is maintained, under the guidance of the government, by reducing the price per unit of product. The dynamic factors of consumers' mood and age are then introduced for optimization. At the same time, seven algorithms are selected from symmetric and asymmetric encryption algorithms to define the enterprise...
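
    The abstract names neither the seven algorithms nor the exact cost and pricing formulas, so the sketch below is purely illustrative of the selection logic it describes: data with a higher privacy priority is routed to a stronger but slower algorithm, and the unit price moves against market demand. Algorithm names, times and prices are assumptions.

      # Illustrative mapping of the seven risk/priority levels to progressively
      # stronger (and slower) encryption choices; all names and numbers are made up.
      LEVELS = {
          1: ("RC4-128 (legacy)",  1.0, 0.10),
          2: ("3DES",              1.5, 0.15),
          3: ("AES-128-CBC",       2.0, 0.25),
          4: ("AES-256-CBC",       2.6, 0.40),
          5: ("AES-256-GCM",       3.0, 0.60),
          6: ("RSA-2048 hybrid",   5.0, 0.85),
          7: ("RSA-4096 hybrid",   9.0, 1.20),
      }

      def choose_scheme(priority, demand_factor=1.0):
          """Return (algorithm, relative encryption time, unit price) for a data
          item's privacy priority (1..7); the price is raised when market demand
          is low and lowered when demand is high, echoing the paper's pricing idea."""
          algorithm, rel_time, base_price = LEVELS[priority]
          price = base_price * (2.0 - min(demand_factor, 1.5))
          return algorithm, rel_time, round(price, 2)

      print(choose_scheme(priority=7, demand_factor=0.4))   # high-value data, low demand
      print(choose_scheme(priority=2, demand_factor=1.3))   # low-value data, high demand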

  4. A model-driven privacy compliance decision support for medical data sharing in Europe.

    Science.gov (United States)

    Boussi Rahmouni, H; Solomonides, T; Casassa Mont, M; Shiu, S; Rahmouni, M

    2011-01-01

Clinical practitioners and medical researchers often have to share health data with other colleagues across Europe. Privacy compliance in this context is very important but challenging. Automated privacy guidelines are a practical way of increasing users' awareness of privacy obligations and help eliminate unintentional breaches of privacy. In this paper we present an ontology-plus-rules based approach to privacy decision support for the sharing of patient data across European platforms. We use ontologies to model the required domain and context information about data sharing and privacy requirements. In addition, we use a set of Semantic Web Rule Language rules to reason about legal privacy requirements that are applicable to a specific context of data disclosure. We make the complete set invocable through a semantic web application that acts as an interactive privacy guideline system and can invoke the full model in order to provide decision support. When asked, the system will generate privacy reports applicable to a specific case of data disclosure described by the user. Reports showing guidelines per Member State may also be obtained. The advantage of this approach lies in the expressiveness and extensibility of the modelling and inference languages adopted and the ability they confer to reason with complex requirements interpreted from high level regulations. However, the system cannot at this stage fully simulate the role of an ethics committee or review board.

  5. A privacy protection model to support personal privacy in relational databases.

    OpenAIRE

    2008-01-01

    The individual of today incessantly insists on more protection of his/her personal privacy than a few years ago. During the last few years, rapid technological advances, especially in the field of information technology, directed most attention and energy to the privacy protection of the Internet user. Research was done and is still being done covering a vast area to protect the privacy of transactions performed on the Internet. However, it was established that almost no research has been don...

  6. Trajectory data privacy protection based on differential privacy mechanism

    Science.gov (United States)

    Gu, Ke; Yang, Lihao; Liu, Yongzhi; Liao, Niandong

    2018-05-01

In this paper, we propose a trajectory data privacy protection scheme based on the differential privacy mechanism. In the proposed scheme, the algorithm first selects the protected points from the user's trajectory data; secondly, it forms a polygon from each protected point and the adjacent, frequently accessed points selected from the access point database, and calculates the polygon centroids; finally, noise is added to the polygon centroids by the differential privacy method, the noisy centroids replace the protected points, and the algorithm constructs and issues the new trajectory data. The experiments show that the running time of the proposed algorithms is short, the privacy protection of the scheme is effective, and the data usability of the scheme is high.
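
    A minimal sketch of the core step as the abstract describes it: a protected point is grouped with nearby frequently accessed points into a polygon, the polygon centroid is perturbed with Laplace noise (the standard differential privacy mechanism), and the noisy centroid is released in place of the protected point. The neighbour-selection rule, sensitivity and epsilon values are simplifying assumptions.

      import math
      import random

      def laplace_noise(scale, rng):
          """Draw Laplace(0, scale) noise by inverse transform sampling."""
          u = rng.random() - 0.5
          return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

      def perturb_protected_point(protected, frequent_points, epsilon,
                                  sensitivity=0.01, n_neighbors=4, seed=42):
          """Build a small polygon from the protected point and its nearest
          frequently accessed points, then release the Laplace-perturbed centroid
          in place of the protected point."""
          rng = random.Random(seed)
          nearest = sorted(frequent_points,
                           key=lambda p: (p[0] - protected[0]) ** 2 +
                                         (p[1] - protected[1]) ** 2)[:n_neighbors]
          polygon = [protected] + nearest
          cx = sum(p[0] for p in polygon) / len(polygon)
          cy = sum(p[1] for p in polygon) / len(polygon)
          scale = sensitivity / epsilon                 # Laplace mechanism scale
          return (cx + laplace_noise(scale, rng), cy + laplace_noise(scale, rng))

      frequent = [(0.01, 0.00), (0.02, 0.01), (-0.01, 0.02), (0.00, -0.02), (0.03, 0.03)]
      print(perturb_protected_point((0.0, 0.0), frequent, epsilon=0.5))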

  7. Practical Privacy Assessment

    DEFF Research Database (Denmark)

    Peen, Søren; Jansen, Thejs Willem; Jensen, Christian D.

    2008-01-01

This chapter proposes a privacy assessment model called the Operational Privacy Assessment Model that includes organizational, operational and technical factors for the protection of personal data stored in an IT system. The factors can be evaluated on a simple scale so that not only can the resulting graphical depiction be easily created for an IT system, but graphical comparisons across multiple IT systems are also possible. Examples of factors presented in a Kiviat graph are also given. This assessment tool may be used to standardize privacy assessment criteria, making it less painful for the management to assess privacy risks on their systems.

  8. A Model for Calculated Privacy and Trust in pHealth Ecosystems.

    Science.gov (United States)

    Ruotsalainen, Pekka; Blobel, Bernd

    2018-01-01

A pHealth ecosystem is a community of service users and providers. It is also a dynamic socio-technical system. One of its main goals is to help users to maintain their personal health status. Another goal is to give economic benefit to stakeholders which use personal health information existing in the ecosystem. In pHealth ecosystems, a huge amount of health related data is collected and used by service providers, such as data extracted from the regulated health record and information related to personal characteristics, genetics, lifestyle and environment. In pHealth ecosystems, there are different kinds of service providers such as regulated health care service providers, unregulated health service providers, ICT service providers, researchers and industrial organizations. This fact, together with the multidimensional personal health data used, raises serious privacy concerns. Privacy is a necessary enabler for successful pHealth, but it is also an elastic concept without any universally agreed definition. Regardless of what kind of privacy model is used in dynamic socio-technical systems, it is difficult for a service user to know the privacy level of services in real life situations. As privacy and trust are interrelated concepts, the authors have developed a hybrid solution where knowledge gained from regulatory privacy requirements and publicly available privacy related documents is used for the calculation of a service provider's specific initial privacy value. This value is then used as an estimate for the initial trust score. In this solution, the total trust score is a combination of recommended trust, proposed trust and initial trust. The initial privacy level is a weighted arithmetic mean of knowledge and user selected weights. The total trust score for any service provider in the ecosystem can be calculated by deploying either a beta trust model or the Fuzzy trust calculation method. The proposed solution is easy to use and to understand, and it can also be automated. It is
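
    A minimal sketch of the two calculations named in the abstract, under assumptions about their exact form (the abstract does not spell them out): the initial privacy value as a weighted arithmetic mean of per-criterion knowledge scores, a beta-reputation style trust estimate seeded with that value, and a weighted combination of initial, recommended and proposed trust.

      def initial_privacy_value(scores, weights):
          """Weighted arithmetic mean of privacy-knowledge scores in [0, 1],
          e.g. scores derived from regulatory requirements and published privacy
          documents, combined with user-selected weights."""
          return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

      def beta_trust(positive, negative, prior=0.5, prior_weight=2.0):
          """Beta-reputation estimate: mean of Beta(alpha, beta) with a prior
          seeded from the initial privacy value."""
          alpha = positive + prior * prior_weight
          beta = negative + (1.0 - prior) * prior_weight
          return alpha / (alpha + beta)

      def total_trust(initial, recommended, proposed, w=(0.4, 0.3, 0.3)):
          """Combine initial, recommended and proposed trust (weights assumed)."""
          return w[0] * initial + w[1] * recommended + w[2] * proposed

      init = initial_privacy_value(scores=[0.9, 0.6, 0.8], weights=[3, 1, 2])
      trust = beta_trust(positive=12, negative=3, prior=init)
      print(round(init, 3), round(trust, 3), round(total_trust(init, 0.7, trust), 3))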

  9. A standardised graphic method for describing data privacy frameworks in primary care research using a flexible zone model.

    Science.gov (United States)

    Kuchinke, Wolfgang; Ohmann, Christian; Verheij, Robert A; van Veen, Evert-Ben; Arvanitis, Theodoros N; Taweel, Adel; Delaney, Brendan C

    2014-12-01

    To develop a model describing core concepts and principles of data flow, data privacy and confidentiality, in a simple and flexible way, using concise process descriptions and a diagrammatic notation applied to research workflow processes. The model should help to generate robust data privacy frameworks for research done with patient data. Based on an exploration of EU legal requirements for data protection and privacy, data access policies, and existing privacy frameworks of research projects, basic concepts and common processes were extracted, described and incorporated into a model with a formal graphical representation and a standardised notation. The Unified Modelling Language (UML) notation was enriched by workflow and own symbols to enable the representation of extended data flow requirements, data privacy and data security requirements, privacy enhancing techniques (PET) and to allow privacy threat analysis for research scenarios. Our model is built upon the concept of three privacy zones (Care Zone, Non-care Zone and Research Zone) containing databases, data transformation operators, such as data linkers and privacy filters. Using these model components, a risk gradient for moving data from a zone of high risk for patient identification to a zone of low risk can be described. The model was applied to the analysis of data flows in several general clinical research use cases and two research scenarios from the TRANSFoRm project (e.g., finding patients for clinical research and linkage of databases). The model was validated by representing research done with the NIVEL Primary Care Database in the Netherlands. The model allows analysis of data privacy and confidentiality issues for research with patient data in a structured way and provides a framework to specify a privacy compliant data flow, to communicate privacy requirements and to identify weak points for an adequate implementation of data privacy. Copyright © 2014 Elsevier Ireland Ltd. All rights

  10. Context-Aware Generative Adversarial Privacy

    Directory of Open Access Journals (Sweden)

    Chong Huang

    2017-12-01

Full Text Available Preserving the utility of published datasets while simultaneously providing provable privacy guarantees is a well-known challenge. On the one hand, context-free privacy solutions, such as differential privacy, provide strong privacy guarantees, but often lead to a significant reduction in utility. On the other hand, context-aware privacy solutions, such as information theoretic privacy, achieve an improved privacy-utility tradeoff, but assume that the data holder has access to dataset statistics. We circumvent these limitations by introducing a novel context-aware privacy framework called generative adversarial privacy (GAP). GAP leverages recent advancements in generative adversarial networks (GANs) to allow the data holder to learn privatization schemes from the dataset itself. Under GAP, learning the privacy mechanism is formulated as a constrained minimax game between two players: a privatizer that sanitizes the dataset in a way that limits the risk of inference attacks on the individuals’ private variables, and an adversary that tries to infer the private variables from the sanitized dataset. To evaluate GAP’s performance, we investigate two simple (yet canonical) statistical dataset models: (a) the binary data model; and (b) the binary Gaussian mixture model. For both models, we derive game-theoretically optimal minimax privacy mechanisms, and show that the privacy mechanisms learned from data (in a generative adversarial fashion) match the theoretically optimal ones. This demonstrates that our framework can be easily applied in practice, even in the absence of dataset statistics.

  11. Context-Aware Generative Adversarial Privacy

    Science.gov (United States)

    Huang, Chong; Kairouz, Peter; Chen, Xiao; Sankar, Lalitha; Rajagopal, Ram

    2017-12-01

    Preserving the utility of published datasets while simultaneously providing provable privacy guarantees is a well-known challenge. On the one hand, context-free privacy solutions, such as differential privacy, provide strong privacy guarantees, but often lead to a significant reduction in utility. On the other hand, context-aware privacy solutions, such as information theoretic privacy, achieve an improved privacy-utility tradeoff, but assume that the data holder has access to dataset statistics. We circumvent these limitations by introducing a novel context-aware privacy framework called generative adversarial privacy (GAP). GAP leverages recent advancements in generative adversarial networks (GANs) to allow the data holder to learn privatization schemes from the dataset itself. Under GAP, learning the privacy mechanism is formulated as a constrained minimax game between two players: a privatizer that sanitizes the dataset in a way that limits the risk of inference attacks on the individuals' private variables, and an adversary that tries to infer the private variables from the sanitized dataset. To evaluate GAP's performance, we investigate two simple (yet canonical) statistical dataset models: (a) the binary data model, and (b) the binary Gaussian mixture model. For both models, we derive game-theoretically optimal minimax privacy mechanisms, and show that the privacy mechanisms learned from data (in a generative adversarial fashion) match the theoretically optimal ones. This demonstrates that our framework can be easily applied in practice, even in the absence of dataset statistics.
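
    GAP itself trains a privatizer and an adversary as neural networks in a minimax game; the toy sketch below replaces both with the simplest possible stand-ins for a binary private variable: a randomized-response privatizer parameterized by a flip probability, and a Bayes-optimal adversary. It grid-searches the privatizer that minimizes the adversary's inference accuracy under a distortion budget, so it illustrates only the privacy-utility game, not the GAN training described in the paper; the prior and budget are assumed values.

      def adversary_accuracy(flip_p, prior=0.7):
          """Bayes-optimal adversary guessing a private bit X ~ Bernoulli(prior)
          from the released bit Y, where Y = X with probability 1 - flip_p
          (randomized response). Accuracy is computed exactly."""
          acc = 0.0
          for y in (0, 1):
              joint = {x: (prior if x == 1 else 1.0 - prior) *
                          ((1.0 - flip_p) if x == y else flip_p) for x in (0, 1)}
              acc += max(joint.values())          # adversary picks the likelier x
          return acc

      def best_privatizer(distortion_budget=0.3, steps=100):
          """Flip probability (= expected distortion) within the budget that
          minimizes the adversary's accuracy."""
          best_p, best_acc = 0.0, 1.0
          for i in range(steps + 1):
              flip_p = distortion_budget * i / steps
              acc = adversary_accuracy(flip_p)
              if acc < best_acc:
                  best_p, best_acc = flip_p, acc
          return best_p, best_acc

      flip_p, acc = best_privatizer(distortion_budget=0.3)
      print(f"flip probability {flip_p:.2f} -> adversary accuracy {acc:.3f}")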

  12. Privacy-invading technologies : safeguarding privacy, liberty & security in the 21st century

    NARCIS (Netherlands)

    Klitou, Demetrius

    2012-01-01

    With a focus on the growing development and deployment of the latest technologies that threaten privacy, the PhD dissertation argues that the US and UK legal frameworks, in their present form, are inadequate to defend privacy and other civil liberties against the intrusive capabilities of body

  13. A Knowledge Model Sharing Based Approach to Privacy-Preserving Data Mining

    OpenAIRE

    Hongwei Tian; Weining Zhang; Shouhuai Xu; Patrick Sharkey

    2012-01-01

Privacy-preserving data mining (PPDM) is an important problem and is currently studied via three approaches: the cryptographic approach, data publishing, and model publishing. However, each of these approaches has some problems. The cryptographic approach does not protect the privacy of learned knowledge models and may have performance and scalability issues. Data publishing, although popular, may suffer from too much utility loss for certain types of data mining applications. The m...

  14. Privacy-Preserving Evaluation of Generalization Error and Its Application to Model and Attribute Selection

    Science.gov (United States)

    Sakuma, Jun; Wright, Rebecca N.

Privacy-preserving classification is the task of learning or training a classifier on the union of privately distributed datasets without sharing the datasets. The emphasis of existing studies in privacy-preserving classification has primarily been put on the design of privacy-preserving versions of particular data mining algorithms. However, in classification problems, preprocessing and postprocessing, such as model selection or attribute selection, play a prominent role in achieving higher classification accuracy. In this paper, we show that the generalization error of classifiers in privacy-preserving classification can be securely evaluated without sharing prediction results. Our main technical contribution is a new generalized Hamming distance protocol that is universally applicable to preprocessing and postprocessing of various privacy-preserving classification problems, such as model selection in support vector machines and attribute selection in naive Bayes classification.
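
    In the clear, the quantity the protocol evaluates is just a (generalized) Hamming distance between a classifier's predictions and the held-out labels, i.e. the empirical error used for model or attribute selection. The sketch below computes that plaintext quantity; the paper's contribution is computing it securely across parties without revealing the prediction vectors, which is not reproduced here. The candidate models and labels are made-up data.

      def generalized_hamming(u, v):
          """Number of positions at which two label sequences differ."""
          if len(u) != len(v):
              raise ValueError("sequences must have equal length")
          return sum(a != b for a, b in zip(u, v))

      def empirical_error(predictions, labels):
          """Generalization-error estimate = normalized Hamming distance."""
          return generalized_hamming(predictions, labels) / len(labels)

      # Model selection: keep the candidate with the lowest held-out error.
      labels = [1, 0, 1, 1, 0, 1, 0, 0]
      candidates = {
          "svm_C=1":     [1, 0, 1, 0, 0, 1, 0, 0],
          "svm_C=10":    [1, 1, 1, 0, 0, 1, 1, 0],
          "naive_bayes": [1, 0, 1, 1, 0, 0, 0, 0],
      }
      best = min(candidates, key=lambda name: empirical_error(candidates[name], labels))
      print(best, {name: empirical_error(p, labels) for name, p in candidates.items()})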

  15. 48 CFR 52.224-2 - Privacy Act.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 2 2010-10-01 2010-10-01 false Privacy Act. 52.224-2... AND FORMS SOLICITATION PROVISIONS AND CONTRACT CLAUSES Text of Provisions and Clauses 52.224-2 Privacy... agency function: Privacy Act (APR 1984) (a) The Contractor agrees to— (1) Comply with the Privacy Act of...

  16. Role-task conditional-purpose policy model for privacy preserving data publishing

    Directory of Open Access Journals (Sweden)

    Rana Elgendy

    2017-12-01

Full Text Available Privacy has become a major concern for both consumers and enterprises; therefore, many research efforts have been devoted to the development of privacy preserving technology. The challenge in data privacy is to share the data while assuring the protection of personal information. Data privacy includes assuring protection against both insider and outsider threats, even if the data is published. Access control can help to protect the data from outsider threats. Access control is defined as the process of mediating every request to resources and data maintained by a system and determining whether the request should be granted or denied. This can be enforced by a mechanism implementing regulations established by a security policy. In this paper, we present a privacy preserving data publishing model based on the integration of CPBAC, MD-TRBAC, PBFW, a protection-against-database-administrator technique inspired by the Oracle Vault technique, and the benefits of anonymization (k-anonymity) to protect data when it is published. The proposed model meets the requirements of workflow and non-workflow systems in an enterprise environment. It is based on the characteristics of the conditional purposes, conditional roles, tasks, and policies. It guarantees protection against insider threats such as the database administrator. Finally, it assures the needed protection in case the data is published. Keywords: Database security, Access control, Data publishing, Anonymization

  17. Privacy driven internet ecosystem

    OpenAIRE

    Trinh, Tuan Anh; Gyarmati, Laszlo

    2012-01-01

    The dominant business model of today's Internet is built upon advertisements; users can access Internet services while the providers show ads to them. Although significant efforts have been made to model and analyze the economic aspects of this ecosystem, the heart of the current status quo, namely privacy, has not received the attention of the research community yet. Accordingly, we propose an economic model of the privacy driven Internet ecosystem where privacy is handled as an asset that c...

  18. Forensic DNA phenotyping: Developing a model privacy impact assessment.

    Science.gov (United States)

    Scudder, Nathan; McNevin, Dennis; Kelty, Sally F; Walsh, Simon J; Robertson, James

    2018-05-01

    Forensic scientists around the world are adopting new technology platforms capable of efficiently analysing a larger proportion of the human genome. Undertaking this analysis could provide significant operational benefits, particularly in giving investigators more information about the donor of genetic material, a particularly useful investigative lead. Such information could include predicting externally visible characteristics such as eye and hair colour, as well as biogeographical ancestry. This article looks at the adoption of this new technology from a privacy perspective, using this to inform and critique the application of a Privacy Impact Assessment to this emerging technology. Noting the benefits and limitations, the article develops a number of themes that would influence a model Privacy Impact Assessment as a contextual framework for forensic laboratories and law enforcement agencies considering implementing forensic DNA phenotyping for operational use. Copyright © 2018 Elsevier B.V. All rights reserved.

  19. Big data privacy: The datafication of personal information

    DEFF Research Database (Denmark)

    Mai, Jens-Erik

    2016-01-01

In the age of big data we need to think differently about privacy. We need to shift our thinking from definitions of privacy (characteristics of privacy) to models of privacy (how privacy works). Moreover, in addition to the existing models of privacy—the surveillance model and capture model—we need to also consider a new model: the datafication model presented in this article, wherein new personal information is deduced by employing predictive analytics on already-gathered data. These three models of privacy supplement each other; they are not competing understandings of privacy. This broadened approach will take our thinking beyond the current preoccupation with whether or not individuals’ consent was secured for data collection to privacy issues arising from the development of new information on individuals' likely behavior through analysis of already collected data.

  20. Determining the privacy policy deficiencies of health ICT applications through semi-formal modelling.

    Science.gov (United States)

    Croll, Peter R

    2011-02-01

    To ensure that patient confidentiality is securely maintained, health ICT applications that contain sensitive personal information demand comprehensive privacy policies. Determining the adequacy of these policies to meet legal conformity together with clinical users and patient expectation is demanding in practice. Organisations and agencies looking to analyse their Privacy and Security policies can benefit from guidance provided by outside entities such as the Privacy Office of their State or Government together with law firms and ICT specialists. The advice given is not uniform and often open to different interpretations. Of greater concern is the possibility of overlooking any important aspects that later result in a data breach. Based on three case studies, this paper considers whether a more formal approach to privacy analysis could be taken that would help identify the full coverage of a Privacy Impact Analysis and determine the deficiencies with an organisation's current policies and approach. A diagrammatic model showing the relationships between Confidentiality, Privacy, Trust, Security and Safety is introduced. First the validity of this model is determined by mapping it against the real-world case studies taken from three healthcare services that depend on ICT. Then, by using software engineering methods, a formal mapping of the relationships is undertaken to identify a full set of policies needed to satisfy the model. How effective this approach may prove as a generic method for deriving a comprehensive set of policies in health ICT applications is finally discussed. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  1. Model-based Assessment for Balancing Privacy Requirements and Operational Capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Knirsch, Fabian [Salzburg Univ. (Austria); Engel, Dominik [Salzburg Univ. (Austria); Frincu, Marc [Univ. of Southern California, Los Angeles, CA (United States); Prasanna, Viktor [Univ. of Southern California, Los Angeles, CA (United States)

    2015-02-17

The smart grid changes the way energy is produced and distributed. In addition, both energy and information are exchanged bidirectionally among participating parties. Therefore, heterogeneous systems have to cooperate effectively in order to achieve a common high-level use case, such as smart metering for billing or demand response for load curtailment. Furthermore, a substantial amount of personal data is often needed for achieving that goal. Capturing and processing personal data in the smart grid increases customer concerns about privacy, and in addition, certain statutory and operational requirements regarding privacy aware data processing and storage have to be met. An increase of privacy constraints, however, often limits the operational capabilities of the system. In this paper, we present an approach that automates the process of finding an optimal balance between privacy requirements and operational requirements in a smart grid use case and application scenario. This is achieved by formally describing use cases in an abstract model and by finding an algorithm that determines the optimum balance by forward mapping privacy and operational impacts. For this optimal balancing algorithm, both a numeric approximation and, if feasible, an analytic assessment are presented and investigated. The system is evaluated by applying the tool to a real-world use case from the University of Southern California (USC) microgrid.

  2. δ-dependency for privacy-preserving XML data publishing.

    Science.gov (United States)

    Landberg, Anders H; Nguyen, Kinh; Pardede, Eric; Rahayu, J Wenny

    2014-08-01

    An ever increasing amount of medical data such as electronic health records, is being collected, stored, shared and managed in large online health information systems and electronic medical record systems (EMR) (Williams et al., 2001; Virtanen, 2009; Huang and Liou, 2007) [1-3]. From such rich collections, data is often published in the form of census and statistical data sets for the purpose of knowledge sharing and enabling medical research. This brings with it an increasing need for protecting individual people privacy, and it becomes an issue of great importance especially when information about patients is exposed to the public. While the concept of data privacy has been comprehensively studied for relational data, models and algorithms addressing the distinct differences and complex structure of XML data are yet to be explored. Currently, the common compromise method is to convert private XML data into relational data for publication. This ad hoc approach results in significant loss of useful semantic information previously carried in the private XML data. Health data often has very complex structure, which is best expressed in XML. In fact, XML is the standard format for exchanging (e.g. HL7 version 3(1)) and publishing health information. Lack of means to deal directly with data in XML format is inevitably a serious drawback. In this paper we propose a novel privacy protection model for XML, and an algorithm for implementing this model. We provide general rules, both for transforming a private XML schema into a published XML schema, and for mapping private XML data to the new privacy-protected published XML data. In addition, we propose a new privacy property, δ-dependency, which can be applied to both relational and XML data, and that takes into consideration the hierarchical nature of sensitive data (as opposed to "quasi-identifiers"). Lastly, we provide an implementation of our model, algorithm and privacy property, and perform an experimental analysis

  3. Data privacy for the smart grid

    CERN Document Server

    Herold, Rebecca

    2015-01-01

Contents: The Smart Grid and Privacy; What Is the Smart Grid?; Changes from Traditional Energy Delivery; Smart Grid Possibilities; Business Model Transformations; Emerging Privacy Risks; The Need for Privacy Policies; Privacy Laws, Regulations, and Standards; Privacy-Enhancing Technologies; New Privacy Challenges; IoT; Big Data; What Is the Smart Grid?; Market and Regulatory Overview; Traditional Electricity Business Sector; The Electricity Open Market; Classifications of Utilities; Rate-Making Processes; Electricity Consumer

  4. TRANSFoRm: a flexible zone model of a data privacy framework for Primary Care research.

    NARCIS (Netherlands)

    Kuchinke, W.; Veen, E.B. van; Delaney, B.C.; Verheij, R.; Taweel, A.; Ohmann, C.

    2011-01-01

    As part of the TRANSFoRm project a flexible zone model for data privacy in Primary Care research was developed. The model applies different privacy generating methods to different aspects of the research data flow and allows in this way for only minimal hindrance of research activities. This is

  5. Privacy in Social Networks

    CERN Document Server

    Zheleva, Elena

    2012-01-01

    This synthesis lecture provides a survey of work on privacy in online social networks (OSNs). This work encompasses concerns of users as well as service providers and third parties. Our goal is to approach such concerns from a computer-science perspective, and building upon existing work on privacy, security, statistical modeling and databases to provide an overview of the technical and algorithmic issues related to privacy in OSNs. We start our survey by introducing a simple OSN data model and describe common statistical-inference techniques that can be used to infer potentially sensitive inf

  6. Privacy Implications of Surveillance Systems

    DEFF Research Database (Denmark)

    Thommesen, Jacob; Andersen, Henning Boje

    2009-01-01

This paper presents a model for assessing the privacy "cost" of a surveillance system. Surveillance systems collect and provide personal information or observations of people by means of surveillance technologies such as databases, video or location tracking. Such systems can be designed for various purposes, even as a service for those being observed, but in any case they will to some degree invade their privacy. The model provided here can indicate how invasive any particular system may be, and can be used to compare the invasiveness of different systems. Applying a functional approach, the model is established by first considering the social function of privacy in everyday life, which in turn lets us determine which different domains will be considered as private, and finally identify the different types of privacy invasion. This underlying model (function – domain – invasion) then serves...

  7. Privacy versus autonomy: a tradeoff model for smart home monitoring technologies.

    Science.gov (United States)

    Townsend, Daphne; Knoefel, Frank; Goubran, Rafik

    2011-01-01

Smart homes are proposed as a new location for the delivery of healthcare services. They provide healthcare monitoring and communication services by using integrated sensor network technologies. We validate a hypothesis regarding older adults' adoption of home monitoring technologies by conducting a literature review of articles studying older adults' attitudes and perceptions of sensor technologies. Using current literature to support the hypothesis, this paper applies the tradeoff model to decisions about sensor acceptance. Older adults are willing to trade privacy (by accepting a monitoring technology) for autonomy. As the information captured by the sensor becomes more intrusive and the infringement on privacy increases, sensors are accepted if the loss in privacy is traded for autonomy. Even video cameras, the most intrusive sensor type, were accepted in exchange for the height of autonomy, which is to remain in the home.

  8. A Survey of Privacy on Data Integration

    OpenAIRE

    Do Son, Thanh

    2015-01-01

This survey is an integrated view of other surveys on privacy preservation for data integration. First, we review the database context, challenges and research questions. Second, we formulate the privacy problems for schema matching and data matching. Next, we introduce the elements of privacy models. Then, we summarize the existing privacy techniques and the analysis (proofs) of privacy guarantees. Finally, we describe the privacy frameworks and their applications.

  9. The Models of Applying Online Privacy Literacy Strategies: A Case Study of Instagram Girl Users

    Directory of Open Access Journals (Sweden)

    Abdollah Bicharanlou

    2017-09-01

Full Text Available Social networks have a remarkable effect on the lives of virtual space users. These networks, like most human relations, involve a compromise between self-disclosure and privacy protection, a process which is realized through improving privacy and empowering the user at the personal level. This study aimed to assess strategies based on online privacy literacy, in particular strategies that young girl users of Instagram should employ to achieve the optimum level of privacy. For this purpose, firstly the paradox of privacy and the benefits and risks of self-disclosure are explained; then, drawing on online privacy literacy, some social and technological strategies are introduced by which users can solve the “paradox of privacy.” In the results section, after describing the main benefits and risks of self-disclosure by girl users, the current models of using these social and technological strategies to solve the mentioned paradox are discussed. The research method is ethnography, based on non-collaborative observation of Instagram pages and semi-structured interviews with 20 girl users of social networks.

  10. A Game Theoretic Approach for Modeling Privacy Settings of an Online Social Network

    Directory of Open Access Journals (Sweden)

    Jundong Chen

    2014-05-01

Full Text Available Users of online social networks often adjust their privacy settings to control how much information on their profiles is accessible to other users of the networks. While a variety of factors have been shown to affect the privacy strategies of these users, very little work has been done in analyzing how these factors influence each other and collectively contribute towards the users’ privacy strategies. In this paper, we analyze the influence of attribute importance, benefit, risk and network topology on the users’ attribute disclosure behavior by introducing a weighted evolutionary game model. Results show that: irrespective of risk, users are more likely to reveal their most important attributes than their least important attributes; when the users’ range of influence is increased, the risk factor plays a smaller role in attribute disclosure; the network topology exhibits a considerable effect on the privacy in an environment with risk.
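
    The exact payoff function of the weighted evolutionary game is not given in the abstract, so the sketch below only illustrates the kind of comparison such a model rests on: an attribute is revealed when its weighted benefit outweighs its weighted risk, with the benefit term scaled by the user's range of influence so that, as influence grows, the risk factor weighs relatively less (in line with the reported result). All attribute values and weights are assumptions.

      def disclosure_payoff(importance, benefit, risk, influence_range,
                            w_benefit=1.0, w_risk=1.0):
          """Net payoff of revealing one attribute: the benefit term is scaled by
          the user's range of influence, so the risk term matters relatively less
          as influence increases."""
          gain = w_benefit * benefit * importance * (1.0 + influence_range)
          loss = w_risk * risk
          return gain - loss

      def chosen_disclosures(attributes, influence_range):
          """Attributes a payoff-maximizing user would reveal."""
          return [name for name, (imp, ben, risk) in attributes.items()
                  if disclosure_payoff(imp, ben, risk, influence_range) > 0]

      profile = {
          #  attribute             (importance, benefit, risk)
          "hometown":              (0.9, 0.8, 0.2),
          "employer":              (0.7, 0.6, 0.4),
          "relationship_status":   (0.4, 0.5, 0.4),
          "phone_number":          (0.3, 0.4, 0.9),
      }
      print(chosen_disclosures(profile, influence_range=0.5))   # narrow influence
      print(chosen_disclosures(profile, influence_range=2.0))   # wide influence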

  11. Fuzzy Privacy Decision for Context-Aware Access Personal Information

    Institute of Scientific and Technical Information of China (English)

    ZHANG Qingsheng; QI Yong; ZHAO Jizhong; HOU Di; NIU Yujie

    2007-01-01

A context-aware privacy protection framework was designed for context-aware services, together with privacy control methods for access to personal information in pervasive environments. In the user's privacy decision process, it can produce a fuzzy privacy decision as personal information sensitivity and the trust in the personal information receiver change. An uncertain privacy decision model for personal information disclosure was proposed, based on changes in the receiver's trust and the information's sensitivity, and a fuzzy privacy decision information system was designed according to this model. Personal privacy control policies can be extracted from this information system by using rough set theory. This also solves the problem of learning privacy control policies for personal information disclosure.

  12. Privacy context model for dynamic privacy adaptation in ubiquitous computing

    NARCIS (Netherlands)

    Schaub, Florian; Koenings, Bastian; Dietzel, Stefan; Weber, M.; Kargl, Frank

    Ubiquitous computing is characterized by the merger of physical and virtual worlds as physical artifacts gain digital sensing, processing, and communication capabilities. Maintaining an appropriate level of privacy in the face of such complex and often highly dynamic systems is challenging. We argue

  13. SmartPrivacy for the smart grid : embedding privacy into the design of electricity conservation

    Energy Technology Data Exchange (ETDEWEB)

    Cavoukian, A. [Ontario Information and Privacy Commissioner, Toronto, ON (Canada); Polonetsky, J.; Wolf, C. [Future of Privacy Forum, Washington, DC (United States)

    2009-11-15

    Modernization efforts are underway to make the current electrical grid smarter. The future of the Smart Grid will be capable of informing consumers of their day-to-day energy use, curbing greenhouse gas emissions, and reducing consumers' energy bills. However, the Smart Grid also brings with it the possibility of collecting detailed information on individual energy consumption use and patterns within peoples' homes. This paper discussed the Smart Grid and its benefits, as well as the questions that should be examined regarding privacy. The paper also outlined the concept of SmartPrivacy and discussed its application to the Smart Grid scenario. Privacy by design foundational principles and Smart Grid components were also presented in an appendix. It was concluded that the information collected on a Smart Grid will form a library of personal information. The mishandling of this information could be extremely invasive of consumer privacy. 46 refs., 1 fig., 2 appendices.

  14. Predicting user concerns about online privacy in Hong Kong.

    Science.gov (United States)

    Yao, Mike Z; Zhang, Jinguang

    2008-12-01

    Empirical studies on people's online privacy concerns have largely been conducted in the West. The global threat of privacy violations on the Internet calls for similar studies to be done in non-Western regions. To fill this void, the current study develops a path model to investigate the influence of people's Internet use-related factors, their beliefs in the right to privacy, and psychological need for privacy on Hong Kong people's concerns about online privacy. Survey responses from 332 university students were analyzed. Results from this study show that people's belief in the right to privacy was the most important predictor of their online privacy concerns. It also significantly mediated the relationship between people's psychological need for privacy and their concerns with privacy violations online. Moreover, while frequent use of the Internet may increase concerns about online privacy issues, Internet use diversity may actually reduce such worries. The final model, well supported by the observed data, successfully explained 25% of the variability in user concerns about online privacy.

  15. Building trusted national identity management systems: Presenting the privacy concern-trust (PCT) model

    DEFF Research Database (Denmark)

    Adjei, Joseph K.; Olesen, Henning

    This paper discusses the effect of trust and information privacy concerns on citizens’ attitude towards national identity management systems. We introduce the privacy concern-trust (PCT) model, which shows the role of trust in mediating and moderating citizens’ attitude towards identity management...... systems. We adopted a qualitative research approach in our analysis of data that was gathered through a series of interviews and a stakeholder workshop in Ghana. Our findings indicate that, beyond the threshold level of trust, societal information privacy concern is low; hence, trust is high, thereby......

  16. Disentangling privacy from property: toward a deeper understanding of genetic privacy.

    Science.gov (United States)

    Suter, Sonia M

    2004-04-01

    connoting commodification, disaggregation, and arms-length dealings, can negatively affect the self and harm these relationships. This Article concludes that a deeper understanding of genetic privacy calls for remedies for privacy violations that address dignitary harm and breach of trust, as opposed to market harms, as the property model suggests.

  17. Gain-Based Relief for Invasion of Privacy

    Directory of Open Access Journals (Sweden)

    Sirko Harder

    2013-11-01

    Full Text Available In many common law jurisdictions, some or all instances of invasion of privacy constitute a privacy-specific wrong either at common law (including equity or under statute. A remedy invariably available for such a wrong is compensation for loss. However, the plaintiff may instead seek to claim the profit the defendant has made from the invasion. This article examines when a plaintiff is, and should be, entitled to claim that profit, provided that invasion of privacy is actionable as such. After a brief overview of the relevant law in major common law jurisdictions, the article investigates how invasion of privacy fits into a general concept of what is called ‘restitution for wrongs’. It will be argued that the right to privacy is a right against the whole world and as such forms a proper basis of awarding gain-based relief for the unauthorised use of that right.

  18. A Taxonomy of Privacy Constructs for Privacy-Sensitive Robotics

    OpenAIRE

    Rueben, Matthew; Grimm, Cindy M.; Bernieri, Frank J.; Smart, William D.

    2017-01-01

    The introduction of robots into our society will also introduce new concerns about personal privacy. In order to study these concerns, we must do human-subject experiments that involve measuring privacy-relevant constructs. This paper presents a taxonomy of privacy constructs based on a review of the privacy literature. Future work in operationalizing privacy constructs for HRI studies is also discussed.

  19. Development and Analyses of Privacy Management Models in Online Social Networks Based on Communication Privacy Management Theory

    Science.gov (United States)

    Lee, Ki Jung

    2013-01-01

    Online social networks (OSNs), while serving as an emerging means of communication, promote various issues of privacy. Users of OSNs encounter diverse occasions that lead to invasion of their privacy, e.g., published conversation, public revelation of their personally identifiable information, and open boundary of distinct social groups within…

  20. Trust-aware Privacy Control for Social Media

    OpenAIRE

    Li, Na; Najafian-Razavi, Maryam; Gillet, Denis

    2011-01-01

    Due to the huge exposure of personal information in social media, a challenge now is to design effective privacy mechanisms that protect against unauthorized access to social data. In this paper, a trust model for social media is first presented. Based on the trust model, a trust-aware privacy control protocol is proposed, that exploits the underlying inter-entity trust information. The objective is to design a fine-grained privacy scheme that ensures a user’s online information is disclosed ...

  1. Social Media Users’ Legal Consciousness About Privacy

    Directory of Open Access Journals (Sweden)

    Katharine Sarikakis

    2017-02-01

    Full Text Available This article explores the ways in which the concept of privacy is understood in the context of social media and with regard to users’ awareness of privacy policies and laws in the ‘Post-Snowden’ era. In the light of presumably increased public exposure to privacy debates, generated partly due to the European “Right to be Forgotten” ruling and the Snowden revelations on mass surveillance, this article explores users’ meaning-making of privacy as a matter of legal dimension in terms of its violations and threats online and users’ ways of negotiating their Internet use, in particular social networking sites. Drawing on the concept of legal consciousness, this article explores through focus group interviews the ways in which social media users negotiate privacy violations and what role their understanding of privacy laws (or lack thereof might play in their strategies of negotiation. The findings are threefold: first, privacy is understood almost universally as a matter of controlling one’s own data, including information disclosure even to friends, and is strongly connected to issues about personal autonomy; second, a form of resignation with respect to control over personal data appears to coexist with a recognized need to protect one’s private data, while respondents describe conscious attempts to circumvent systems of monitoring or violation of privacy, and third, despite widespread coverage of privacy legal issues in the press, respondents’ concerns about and engagement in “self-protecting” tactics derive largely from being personally affected by violations of law and privacy.

  2. BangA: An Efficient and Flexible Generalization-Based Algorithm for Privacy Preserving Data Publication

    Directory of Open Access Journals (Sweden)

    Adeel Anjum

    2017-01-01

    Full Text Available Privacy-Preserving Data Publishing (PPDP) has become a critical issue for companies and organizations that would release their data. k-Anonymization was proposed as a first generalization model to guarantee against identity disclosure of individual records in a data set. Point access methods (PAMs) are not well studied for the problem of data anonymization. In this article, we propose yet another approximation algorithm for anonymization, coined BangA, that combines useful features from Point Access Methods (PAMs) and clustering. Hence, it achieves fast computation and scalability as a PAM, and very high quality thanks to its density-based clustering step. Extensive experiments show the efficiency and effectiveness of our approach. Furthermore, we provide guidelines for extending BangA to achieve a relaxed form of differential privacy which provides stronger privacy guarantees as compared to traditional privacy definitions.
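
    Since BangA itself is not specified in the abstract, the following is a generic sketch of generalisation-based k-anonymisation on one numeric quasi-identifier: records are sorted, grouped into blocks of at least k, and each block's quasi-identifier is replaced by its value range. Attribute names and records are made up for illustration.

```python
def k_anonymise(records, k):
    """records: list of (age, diagnosis). Returns (age_range, diagnosis) rows."""
    ordered = sorted(records)
    groups = [ordered[i:i + k] for i in range(0, len(ordered), k)]
    if len(groups) > 1 and len(groups[-1]) < k:   # fold a short tail into the previous group
        groups[-2].extend(groups.pop())
    anonymised = []
    for group in groups:
        lo, hi = group[0][0], group[-1][0]        # generalise the quasi-identifier to a range
        for _, diagnosis in group:
            anonymised.append((f"{lo}-{hi}", diagnosis))
    return anonymised

rows = [(23, "flu"), (25, "cold"), (27, "flu"), (31, "asthma"),
        (35, "flu"), (36, "cold"), (41, "flu")]
print(k_anonymise(rows, k=3))
```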

  3. Privacy under construction : A developmental perspective on privacy perception

    NARCIS (Netherlands)

    Steijn, W.M.P.; Vedder, A.H.

    2015-01-01

    We present a developmental perspective regarding the difference in perceptions toward privacy between young and old. Here, we introduce the notion of privacy conceptions, that is, the specific ideas that individuals have regarding what privacy actually is. The differences in privacy concerns often

  4. Do Privacy Concerns Matter for Millennials?

    DEFF Research Database (Denmark)

    Fodor, Mark; Brem, Alexander

    2015-01-01

    data have raised the question of whether location data are considered sensitive data by users. Thus, we use two privacy concern models, namely Concern for Information Privacy (CFIP) and Internet Users’ Information Privacy Concerns (IUIPC) to find out. Our sample comprises 235 individuals between 18...... and 34 years (Generation C) from Germany. The results of this study indicate that the second-order factor IUIPC showed better fit for the underlying data than CFIP did. Overall privacy concerns have been found to have an impact on behavioral intentions of users for LBS adoption. Furthermore, other risk...

  5. Privacy Bridges: EU and US Privacy Experts In Search of Transatlantic Privacy Solutions

    NARCIS (Netherlands)

    Abramatic, J.-F.; Bellamy, B.; Callahan, M.E.; Cate, F.; van Eecke, P.; van Eijk, N.; Guild, E.; de Hert, P.; Hustinx, P.; Kuner, C.; Mulligan, D.; O'Connor, N.; Reidenberg, J.; Rubinstein, I.; Schaar, P.; Shadbolt, N.; Spiekermann, S.; Vladeck, D.; Weitzner, D.J.; Zuiderveen Borgesius, F.; Hagenauw, D.; Hijmans, H.

    2015-01-01

    The EU and US share a common commitment to privacy protection as a cornerstone of democracy. Following the Treaty of Lisbon, data privacy is a fundamental right that the European Union must proactively guarantee. In the United States, data privacy derives from constitutional protections in the

  6. A Generic Privacy Quantification Framework for Privacy-Preserving Data Publishing

    Science.gov (United States)

    Zhu, Zutao

    2010-01-01

    In recent years, the concerns about the privacy for the electronic data collected by government agencies, organizations, and industries are increasing. They include individual privacy and knowledge privacy. Privacy-preserving data publishing is a research branch that preserves the privacy while, at the same time, retaining useful information in…

  7. European Perspectives on Privacy in the Sharing Economy

    DEFF Research Database (Denmark)

    Ranzini, Giulia; Etter, Michael; Vermeulen, Ivar

    Report from the EU H2020 Research Project Ps2Share: Participation, Privacy, and Power in the Sharing Economy. This report ‘European Perspectives on Privacy in the Sharing Economy’ forms one element of a European Union Horizon 2020 Research Project on the sharing economy: Ps2Share ‘Participation, Privacy, and Power in the Sharing Economy’. The study is undertaken within the scope of the European Union’s Horizon 2020 research and innovation programme, funded under grant agreement No. 732117 and with the objective (ICT-35) of “Enabling responsible ICT-related research and innovation”. This project...... recommendations to Europe’s institutions. We focus on topics of participation, privacy, and power in the sharing economy....

  8. Culture, Privacy Conception and Privacy Concern: Evidence from Europe before PRISM

    OpenAIRE

    Omrani, Nessrine; Soulié, Nicolas

    2017-01-01

    This article analyses individuals’ online privacy concerns between cultural country groups. We use a dataset of more than 14 000 Internet users collected by the European Union in 2010 in 26 EU countries. We use a probit model to examine the variables associated with the probability of being concerned about privacy, in order to draw policy and regulatory implications. The results show that women and poor people are more concerned than their counterparts. People who often use Internet are not p...

  9. Choose Privacy Week: Educate Your Students (and Yourself) about Privacy

    Science.gov (United States)

    Adams, Helen R.

    2016-01-01

    The purpose of "Choose Privacy Week" is to encourage a national conversation to raise awareness of the growing threats to personal privacy online and in day-to-day life. The 2016 Choose Privacy Week theme is "respecting individuals' privacy," with an emphasis on minors' privacy. A plethora of issues relating to minors' privacy…

  10. 75 FR 63703 - Privacy Act of 1974; Privacy Act Regulation

    Science.gov (United States)

    2010-10-18

    ... FEDERAL RESERVE SYSTEM 12 CFR Part 261a [Docket No. R-1313] Privacy Act of 1974; Privacy Act... implementing the Privacy Act of 1974 (Privacy Act). The primary changes concern the waiver of copying fees... records under the Privacy Act; the amendment of special procedures for the release of medical records to...

  11. Designing Privacy for You : A User Centric Approach For Privacy

    OpenAIRE

    Senarath, Awanthika; Arachchilage, Nalin A. G.; Slay, Jill

    2017-01-01

    Privacy directly concerns the user as the data owner (data-subject) and hence privacy in systems should be implemented in a manner which concerns the user (user-centered). There are many concepts and guidelines that support development of privacy and embedding privacy into systems. However, none of them approaches privacy in a user-centered manner. Through this research we propose a framework that would enable developers and designers to grasp privacy in a user-centered manner and implement...

  12. Query Monitoring and Analysis for Database Privacy - A Security Automata Model Approach.

    Science.gov (United States)

    Kumar, Anand; Ligatti, Jay; Tu, Yi-Cheng

    2015-11-01

    Privacy and usage restriction issues are important when valuable data are exchanged or acquired by different organizations. Standard access control mechanisms either restrict or completely grant access to valuable data. On the other hand, data obfuscation limits the overall usability and may result in loss of total value. There are no standard policy enforcement mechanisms for data acquired through mutual and copyright agreements. In practice, many different types of policies can be enforced in protecting data privacy. Hence there is the need for a unified framework that encapsulates multiple suites of policies to protect the data. We present our vision of an architecture named security automata model (SAM) to enforce privacy-preserving policies and usage restrictions. SAM analyzes the input queries and their outputs to enforce various policies, liberating data owners from the burden of monitoring data access. SAM allows administrators to specify various policies and enforces them to monitor queries and control the data access. Our goal is to address the problems of data usage control and protection through privacy policies that can be defined, enforced, and integrated with the existing access control mechanisms using SAM. In this paper, we lay out the theoretical foundation of SAM, which is based on automata named Mandatory Result Automata. We also discuss the major challenges of implementing SAM in a real-world database environment as well as ideas to meet such challenges.
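
    A toy illustration of the monitoring idea described above: query results pass through a stateful monitor that enforces a policy before anything is released. The cumulative row-quota policy is an illustrative assumption, not the Mandatory Result Automata construction from the paper.

```python
class ResultMonitor:
    """Per-user result monitor: releases rows only while a cumulative quota holds."""

    def __init__(self, max_rows_per_user):
        self.max_rows = max_rows_per_user
        self.rows_released = {}          # user -> rows released so far

    def filter(self, user, rows):
        released = self.rows_released.get(user, 0)
        allowed = max(0, self.max_rows - released)
        permitted = rows[:allowed]       # suppress anything beyond the quota
        self.rows_released[user] = released + len(permitted)
        return permitted

monitor = ResultMonitor(max_rows_per_user=5)
print(monitor.filter("alice", [("p1", "flu"), ("p2", "cold"), ("p3", "flu")]))  # all 3 released
print(monitor.filter("alice", [("p4", "flu"), ("p5", "cold"), ("p6", "flu")]))  # only 2 more
```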

  13. The Privacy Coach: Supporting customer privacy in the Internet of Things

    OpenAIRE

    Broenink, Gerben; Hoepman, Jaap-Henk; Hof, Christian van 't; van Kranenburg, Rob; Smits, David; Wisman, Tijmen

    2010-01-01

    The Privacy Coach is an application running on a mobile phone that supports customers in making privacy decisions when confronted with RFID tags. The approach we take to increase customer privacy is a radical departure from the mainstream research efforts that focus on implementing privacy enhancing technologies on the RFID tags themselves. Instead the Privacy Coach functions as a mediator between customer privacy preferences and corporate privacy policies, trying to find a match between the ...

  14. BORDERS OF COMMUNICATION PRIVACY IN SLOVENIAN CRIMINAL PROCEDURE – CONSTITUTIONAL CHALLENGES

    Directory of Open Access Journals (Sweden)

    Sabina Zgaga

    2015-01-01

    Full Text Available Due to fast technological development and our constant communication, protection of communication privacy in every aspect of our (legal) life has become more important than ever before. Regarding protection of privacy in criminal procedure, special emphasis should be given to the regulation of privacy in Slovenian Constitution and its interpretation in the case law of the Constitutional Court. This paper presents the definition of privacy and communication privacy in Slovenian constitutional law and exposes the main issues of communication privacy that have been discussed in the case law of the Constitutional Court in the last twenty years. Thereby the paper tries to show the general trend in the case law of Constitutional Court regarding the protection of communication privacy and to expose certain unsolved issues and unanswered challenges. Slovenian constitutional regulation of communication privacy is very protective, considering the broad definition of privacy and the strict conditions for encroachment of communication privacy. The case law of Slovenian Constitutional Court has also shown such trend, with the possible exception of the recent decision on a dynamic IP address. The importance of this decision is however significant, since it could be applicable to all forms of communication via internet, the prevailing form of communication nowadays. Certain challenges still lie ahead, such as the current proposal for the amendment of Criminal Procedure Act-M, which includes the use of IMSI catchers and numerous unanswered issues regarding data retention after the decisive annulment of its partial legal basis by the Constitutional Court.

  15. Privacy vs security

    CERN Document Server

    Stalla-Bourdillon, Sophie; Ryan, Mark D

    2014-01-01

    Securing privacy in the current environment is one of the great challenges of today's democracies. Privacy vs. Security explores the issues of privacy and security and their complicated interplay, from a legal and a technical point of view. Sophie Stalla-Bourdillon provides a thorough account of the legal underpinnings of the European approach to privacy and examines their implementation through privacy, data protection and data retention laws. Joshua Philips and Mark D. Ryan focus on the technological aspects of privacy, in particular, on today's attacks on privacy by the simple use of today'

  16. Privacy transparency patterns

    NARCIS (Netherlands)

    Siljee B.I.J.

    2015-01-01

    This paper describes two privacy patterns for creating privacy transparency: the Personal Data Table pattern and the Privacy Policy Icons pattern, as well as a full overview of privacy transparency patterns. It is a first step in creating a full set of privacy design patterns, which will aid

  17. User Privacy and Empowerment: Trends, Challenges, and Opportunities

    DEFF Research Database (Denmark)

    Dhotre, Prashant Shantaram; Olesen, Henning; Khajuria, Samant

    2018-01-01

    to the service providers. Considering business models that are slanted towards service providers, privacy has become a crucial issue in today’s fast growing digital world. Hence, this paper elaborates personal information flow between users, service providers, and data brokers. We also discussed the significant...... privacy issues like present business models, user awareness about privacy and user control over personal data. To address such issues, this paper also identified challenges that comprise unavailability of effective privacy awareness or protection tools and the effortless way to study and see the flow...... of personal information and its management. Thus, empowering users and enhancing awareness are essential to comprehending the value of secrecy. This paper also introduced latest advances in the domain of privacy issues like User Managed Access (UMA) can state suitable requirements for user empowerment...

  18. Privacy Awareness: A Means to Solve the Privacy Paradox?

    Science.gov (United States)

    Pötzsch, Stefanie

    People are limited in their resources, i.e. they have limited memory capabilities, cannot pay attention to too many things at the same time, and forget much information after a while; computers do not suffer from these limitations. Thus, revealing personal data in electronic communication environments and being completely unaware of the impact of privacy might cause a lot of privacy issues later. Even if people are privacy aware in general, the so-called privacy paradox shows that they do not behave according to their stated attitudes. This paper discusses explanations for the existing dichotomy between the intentions of people towards disclosure of personal data and their behaviour. We present requirements on tools for privacy-awareness support in order to counteract the privacy paradox.

  19. Privacy Policy

    Science.gov (United States)

    NLM Privacy Policy: https://medlineplus.gov/privacy.html (includes instructions for opting out of cookies in the most popular browsers: http://www.usa.gov/optout_instructions.shtml).

  20. Rethinking the Privacy Calculus: On the Role of Dispositional Factors and Affect

    OpenAIRE

    Kehr, Flavius; Wentzel, Daniel; Mayer, Peter

    2013-01-01

    Existing research on information privacy has mostly relied on the privacy calculus model which views privacy-related decision making as a rational process where individuals weigh the anticipated risks of disclosing personal data against the potential benefits. However, scholars have recently challenged two basic propositions of the privacy calculus model. First, some authors have distinguished between general and situational factors in the context of privacy calculus and have argued that ...

  1. The Privacy Calculus: Mobile Apps and User Perceptions of Privacy and Security

    Directory of Open Access Journals (Sweden)

    Elizabeth Fife

    2012-07-01

    Full Text Available A continuing stream of new mobile data services are being released that rely upon the collection of personal data to support a business model. New technologies including facial recognition, sensors and Near Field Communications (NFC) will increasingly become a part of everyday services and applications that challenge traditional concepts of individual privacy. The average person as well as the “tech-savvy” mobile phone user may not yet be fully aware of the extent to which their privacy and security are being affected through their mobile activities and how comparable this situation is to personal computer usage. We investigate perceptions and usage of mobile data services that appear to have specific privacy and security sensitivities, specifically social networking, banking/payments, and health-related activities. Our annual survey of smartphone users in the U.S. and Japan is presented from 2011. This nationally representative survey data is used to show demographic and cultural differences, and substantiate our hypotheses about the links between use and privacy concerns.

  2. Concentrated Differential Privacy

    OpenAIRE

    Dwork, Cynthia; Rothblum, Guy N.

    2016-01-01

    We introduce Concentrated Differential Privacy, a relaxation of Differential Privacy enjoying better accuracy than both pure differential privacy and its popular "(epsilon,delta)" relaxation without compromising on cumulative privacy loss over multiple computations.
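
    As a small illustration of the accounting style such relaxations enable, the sketch below uses the closely related zero-concentrated DP (zCDP) formulation: a Gaussian mechanism with sensitivity Δ and noise scale σ satisfies ρ = Δ²/(2σ²)-zCDP, and ρ composes additively across queries. This is a generic example, not the paper's own definition or mechanism.

```python
import random

def gaussian_release(true_value, sigma):
    """Release a query answer perturbed with Gaussian noise of scale sigma."""
    return true_value + random.gauss(0.0, sigma)

def zcdp_cost(sensitivity, sigma):
    # rho for one Gaussian-mechanism release under zCDP.
    return sensitivity ** 2 / (2 * sigma ** 2)

sigma = 4.0
counts = [120, 87, 45]                       # three counting queries, sensitivity 1 each
noisy = [gaussian_release(c, sigma) for c in counts]
total_rho = sum(zcdp_cost(1.0, sigma) for _ in counts)
print(noisy, "cumulative zCDP cost rho =", total_rho)
```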

  3. Game-Theoretic Model of Incentivizing Privacy-Aware Users to Consent to Location Tracking

    OpenAIRE

    Panaousis, Emmanouil; Laszka, Aron; Pohl, Johannes; Noack, Andreas; Alpcan, Tansu

    2016-01-01

    Nowadays, mobile users have a vast number of applications and services at their disposal. Each of these might impose some privacy threats on users' "Personally Identifiable Information" (PII). Location privacy is a crucial part of PII, and as such, privacy-aware users wish to maximize it. This privacy can be, for instance, threatened by a company, which collects users' traces and shares them with third parties. To maximize their location privacy, users can decide to get offline so that the co...

  4. Privacy-aware knowledge discovery novel applications and new techniques

    CERN Document Server

    Bonchi, Francesco

    2010-01-01

    Covering research at the frontier of this field, Privacy-Aware Knowledge Discovery: Novel Applications and New Techniques presents state-of-the-art privacy-preserving data mining techniques for application domains, such as medicine and social networks, that face the increasing heterogeneity and complexity of new forms of data. Renowned authorities from prominent organizations not only cover well-established results-they also explore complex domains where privacy issues are generally clear and well defined, but the solutions are still preliminary and in continuous development. Divided into seve

  5. Privacy and the Connected Society

    DEFF Research Database (Denmark)

    Sørensen, Lene Tolstrup; Khajuria, Samant; Skouby, Knud Erik

    The Vision of the 5G enabled connected society is highly based on the evolution and implementation of Internet of Things. This involves, amongst others, a significant rise in devices, sensors and communication in pervasive interconnections as well as cooperation amongst devices and entities across...... the society. Enabling the vision of the connected society, researchers point in the direction of security and privacy as areas to challenge the vision. By use of the Internet of Things reference model as well as the vision of the connected society, this paper identifies privacy of the individual with respect...... to three selected areas: Shopping, connected cars and online gaming. The paper concludes that privacy is a complexity within the connected society vision and that there is a need for more privacy use cases to shed light on the challenge....

  6. Incentivizing Verifiable Privacy-Protection Mechanisms for Offline Crowdsensing Applications.

    Science.gov (United States)

    Sun, Jiajun; Liu, Ningzhong

    2017-09-04

    Incentive mechanisms of crowdsensing have recently been intensively explored. Most of these mechanisms mainly focus on standard economic goals like truthfulness and utility maximization. However, enormous privacy and security challenges need to be faced directly in real-life environments, such as cost privacies. In this paper, we investigate offline verifiable privacy-protection crowdsensing issues. We firstly present a general verifiable privacy-protection incentive mechanism for the offline homogeneous and heterogeneous sensing job model. In addition, we also propose a more complex verifiable privacy-protection incentive mechanism for the offline submodular sensing job model. The two mechanisms not only explore the privacy protection issues of users and the platform, but also ensure the verifiable correctness of payments between platform and users. Finally, we demonstrate that the two mechanisms satisfy privacy-protection, verifiable correctness of payments and the same revenue as the generic one without privacy protection. Our experiments also validate that the two mechanisms are both scalable and efficient, and applicable for mobile devices in crowdsensing applications based on auctions, where the main incentive for the user is the remuneration.

  7. FCJ-195 Privacy, Responsibility, and Human Rights Activism

    Directory of Open Access Journals (Sweden)

    Becky Kazansky

    2015-06-01

    Full Text Available In this article, we argue that many difficulties associated with the protection of digital privacy are rooted in the framing of privacy as a predominantly individual responsibility. We examine how models of privacy protection, such as Notice and Choice, contribute to the ‘responsibilisation’ of human rights activists who rely on the use of technologies for their work. We also consider how a group of human rights activists countered technology-mediated threats that this ‘responsibilisation’ causes by developing a collective approach to address their digital privacy and security needs. We conclude this article by discussing how technological tools used to maintain or counter the loss of privacy can be improved in order to support the privacy and digital security of human rights activists.

  8. Privacy-Preserving Restricted Boltzmann Machine

    Directory of Open Access Journals (Sweden)

    Yu Li

    2014-01-01

    Full Text Available With the arrival of the big data era, it is predicted that distributed data mining will lead to an information technology revolution. To motivate different institutes to collaborate with each other, the crucial issue is to eliminate their concerns regarding data privacy. In this paper, we propose a privacy-preserving method for training a restricted Boltzmann machine (RBM). The RBM can be obtained without the institutes revealing their private data to each other when using our privacy-preserving method. We provide a correctness and efficiency analysis of our algorithms. The comparative experiment shows that the accuracy is very close to the original RBM model.

  9. Privacy Policies

    NARCIS (Netherlands)

    Dekker, M.A.C.; Etalle, Sandro; den Hartog, Jeremy; Petkovic, M.; Jonker, W.; Jonker, Willem

    2007-01-01

    Privacy is a prime concern in today's information society. To protect the privacy of individuals, enterprises must follow certain privacy practices, while collecting or processing personal data. In this chapter we look at the setting where an enterprise collects private data on its website,

  10. Valuating Privacy with Option Pricing Theory

    Science.gov (United States)

    Berthold, Stefan; Böhme, Rainer

    One of the key challenges in the information society is responsible handling of personal data. An often-cited reason why people fail to make rational decisions regarding their own informational privacy is the high uncertainty about future consequences of information disclosures today. This chapter builds an analogy to financial options and draws on principles of option pricing to account for this uncertainty in the valuation of privacy. For this purpose, the development of a data subject's personal attributes over time and the development of the attribute distribution in the population are modeled as two stochastic processes, which fit into the Binomial Option Pricing Model (BOPM). Possible applications of such valuation methods to guide decision support in future privacy-enhancing technologies (PETs) are sketched.
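
    A generic n-step binomial (CRR-style) valuation, the pricing machinery the chapter draws on; interpreting the underlying process as the evolving value of keeping an attribute undisclosed is the chapter's analogy, and all parameters below are illustrative assumptions.

```python
def binomial_value(s0, up, down, rate, steps, payoff):
    """Risk-neutral backward induction on a recombining binomial tree."""
    q = (1 + rate - down) / (up - down)          # risk-neutral up-probability
    values = [payoff(s0 * up ** j * down ** (steps - j)) for j in range(steps + 1)]
    for _ in range(steps):
        values = [(q * values[j + 1] + (1 - q) * values[j]) / (1 + rate)
                  for j in range(len(values) - 1)]
    return values[0]

# Value today of an "option" to disclose later at a fixed benefit threshold of 100.
print(binomial_value(s0=100, up=1.1, down=0.9, rate=0.02, steps=5,
                     payoff=lambda s: max(s - 100, 0)))
```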

  11. Privacy policies

    NARCIS (Netherlands)

    Dekker, M.A.C.; Etalle, S.; Hartog, den J.I.; Petkovic, M.; Jonker, W.

    2007-01-01

    Privacy is a prime concern in today’s information society. To protect the privacy of individuals, enterprises must follow certain privacy practices while collecting or processing personal data. In this chapter we look at the setting where an enterprise collects private data on its website, processes

  12. 76 FR 64115 - Privacy Act of 1974; Privacy Act System of Records

    Science.gov (United States)

    2011-10-17

    ... NATIONAL AERONAUTICS AND SPACE ADMINISTRATION [Notice (11-092)] Privacy Act of 1974; Privacy Act... retirement of one Privacy Act system of records notice. SUMMARY: In accordance with the Privacy Act of 1974, NASA is giving notice that it proposes to cancel the following Privacy Act system of records notice...

  13. Genetic privacy.

    Science.gov (United States)

    Sankar, Pamela

    2003-01-01

    During the past 10 years, the number of genetic tests performed more than tripled, and public concern about genetic privacy emerged. The majority of states and the U.S. government have passed regulations protecting genetic information. However, research has shown that concerns about genetic privacy are disproportionate to known instances of information misuse. Beliefs in genetic determinacy explain some of the heightened concern about genetic privacy. Discussion of the debate over genetic testing within families illustrates the most recent response to genetic privacy concerns.

  14. Location Privacy Techniques in Client-Server Architectures

    DEFF Research Database (Denmark)

    Jensen, Christian Søndergaard; Lu, Hua; Yiu, Man Lung

    2009-01-01

    A typical location-based service returns nearby points of interest in response to a user location. As such services are becoming increasingly available and popular, location privacy emerges as an important issue. In a system that does not offer location privacy, users must disclose their exact...... locations in order to receive the desired services. We view location privacy as an enabling technology that may lead to increased use of location-based services. In this chapter, we consider location privacy techniques that work in traditional client-server architectures without any trusted components other....... Third, their effectiveness is independent of the distribution of other users, unlike the k-anonymity approach. The chapter characterizes the privacy models assumed by existing techniques and categorizes these according to their approach. The techniques are then covered in turn according...

  15. Data privacy foundations, new developments and the big data challenge

    CERN Document Server

    Torra, Vicenç

    2017-01-01

    This book offers a broad, cohesive overview of the field of data privacy. It discusses, from a technological perspective, the problems and solutions of the three main communities working on data privacy: statistical disclosure control (those with a statistical background), privacy-preserving data mining (those working with data bases and data mining), and privacy-enhancing technologies (those involved in communications and security) communities. Presenting different approaches, the book describes alternative privacy models and disclosure risk measures as well as data protection procedures for respondent, holder and user privacy. It also discusses specific data privacy problems and solutions for readers who need to deal with big data.

  16. Privacy og selvbeskrivelse

    DEFF Research Database (Denmark)

    Rosengaard, Hans Ulrik

    2015-01-01

    A description of the field of privacy research, with particular attention to the significance of privacy for the ability to govern one's own self-description.

  17. Couldn't or wouldn't? The influence of privacy concerns and self-efficacy in privacy management on privacy protection.

    Science.gov (United States)

    Chen, Hsuan-Ting; Chen, Wenghong

    2015-01-01

    Sampling 515 college students, this study investigates how privacy protection, including profile visibility, self-disclosure, and friending, are influenced by privacy concerns and efficacy regarding one's own ability to manage privacy settings, a factor that researchers have yet to give a great deal of attention to in the context of social networking sites (SNSs). The results of this study indicate an inconsistency in adopting strategies to protect privacy, a disconnect from limiting profile visibility and friending to self-disclosure. More specifically, privacy concerns lead SNS users to limit their profile visibility and discourage them from expanding their network. However, they do not constrain self-disclosure. Similarly, while self-efficacy in privacy management encourages SNS users to limit their profile visibility, it facilitates self-disclosure. This suggests that if users are limiting their profile visibility and constraining their friending behaviors, it does not necessarily mean they will reduce self-disclosure on SNSs because these behaviors are predicted by different factors. In addition, the study finds an interaction effect between privacy concerns and self-efficacy in privacy management on friending. It points to the potential problem of increased risk-taking behaviors resulting from high self-efficacy in privacy management and low privacy concerns.

  18. The privacy coach: Supporting customer privacy in the internet of things

    NARCIS (Netherlands)

    Broenink, E.G.; Hoepman, J.H.; Hof, C. van 't; Kranenburg, R. van; Smits, D.; Wisman, T.

    2010-01-01

    The Privacy Coach is an application running on a mobile phone that supports customers in making privacy decisions when confronted with RFID tags. The approach we take to increase customer privacy is a radical departure from the mainstream research efforts that focus on implementing privacy enhancing

  19. Semantic Security: Privacy Definitions Revisited

    OpenAIRE

    Jinfei Liu; Li Xiong; Jun Luo

    2013-01-01

    In this paper we illustrate a privacy framework named Indistinguishable Privacy. Indistinguishable privacy could be deemed as the formalization of the existing privacy definitions in privacy preserving data publishing as well as secure multi-party computation. We introduce three representative privacy notions in the literature, Bayes-optimal privacy for privacy preserving data publishing, differential privacy for statistical data release, and privacy w.r.t. semi-honest behavior in the secure...

  20. Quantifying the costs and benefits of privacy-preserving health data publishing.

    Science.gov (United States)

    Khokhar, Rashid Hussain; Chen, Rui; Fung, Benjamin C M; Lui, Siu Man

    2014-08-01

    Cost-benefit analysis is a prerequisite for making good business decisions. In the business environment, companies intend to make profit from maximizing information utility of published data while having an obligation to protect individual privacy. In this paper, we quantify the trade-off between privacy and data utility in health data publishing in terms of monetary value. We propose an analytical cost model that can help health information custodians (HICs) make better decisions about sharing person-specific health data with other parties. We examine relevant cost factors associated with the value of anonymized data and the possible damage cost due to potential privacy breaches. Our model guides an HIC to find the optimal value of publishing health data and could be utilized for both perturbative and non-perturbative anonymization techniques. We show that our approach can identify the optimal value for different privacy models, including K-anonymity, LKC-privacy, and ∊-differential privacy, under various anonymization algorithms and privacy parameters through extensive experiments on real-life data. Copyright © 2014 Elsevier Inc. All rights reserved.
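
    A toy version of the kind of trade-off the model formalises, assuming published value is data-utility value minus expected breach damage; the utility figures, re-identification probabilities and damage costs are invented for illustration, and the paper's cost factors are richer than this.

```python
def net_value(utility_value, reident_probability, damage_per_breach, records):
    """Expected net value of publishing under one anonymisation setting."""
    expected_damage = reident_probability * damage_per_breach * records
    return utility_value - expected_damage

settings = {
    "k=5 anonymisation":  net_value(50_000, 0.002, 1_000, records=10_000),
    "k=50 anonymisation": net_value(30_000, 0.0002, 1_000, records=10_000),
}
print(max(settings, key=settings.get), settings)
```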

  1. GAIN RATIO BASED FEATURE SELECTION METHOD FOR PRIVACY PRESERVATION

    Directory of Open Access Journals (Sweden)

    R. Praveena Priyadarsini

    2011-04-01

    Full Text Available Privacy preservation is a step in data mining that tries to safeguard sensitive information from unsanctioned disclosure, thereby protecting individual data records and their privacy. There are various privacy preservation techniques such as k-anonymity, l-diversity, t-closeness and data perturbation. In this paper the k-anonymity privacy protection technique is applied to high dimensional datasets like adult and census. Since both data sets are high dimensional, a feature subset selection method, Gain Ratio, is applied; the attributes of the datasets are ranked and low-ranking attributes are filtered to form new reduced data subsets. The k-anonymization privacy preservation technique is then applied on the reduced datasets. The privacy-preserved reduced datasets and the original datasets are compared for their accuracy on two data mining tasks, classification and clustering, using the naïve Bayesian and k-means algorithms respectively. Experimental results show that classification and clustering accuracy are comparatively the same for the reduced k-anonymized datasets and the original data sets.
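
    A small worked example of the gain-ratio ranking step described above (information gain divided by split information) for one categorical attribute against a class label; the tiny dataset mimics adult/census-style attributes but is made up.

```python
from collections import Counter
from math import log2

def entropy(labels):
    total = len(labels)
    return -sum((c / total) * log2(c / total) for c in Counter(labels).values())

def gain_ratio(attribute_values, labels):
    total = len(labels)
    split_info, remainder = 0.0, 0.0
    for value in set(attribute_values):
        subset = [l for a, l in zip(attribute_values, labels) if a == value]
        p = len(subset) / total
        remainder += p * entropy(subset)     # conditional entropy contribution
        split_info -= p * log2(p)            # intrinsic information of the split
    info_gain = entropy(labels) - remainder
    return info_gain / split_info if split_info else 0.0

workclass = ["private", "private", "gov", "gov", "self", "self"]
income = ["<=50K", ">50K", "<=50K", "<=50K", ">50K", ">50K"]
print(round(gain_ratio(workclass, income), 3))   # ~0.421
```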

  2. E-Commerce and Privacy: Conflict and Opportunity.

    Science.gov (United States)

    Farah, Badie N.; Higby, Mary A.

    2001-01-01

    Electronic commerce has intensified conflict between businesses' need to collect data and customers' desire to protect privacy. Web-based privacy tools and legislation could add to the costs of e-commerce and reduce profitability. Business models not based on profiling customers may be needed. (SK)

  3. Comparative Approaches to Biobanks and Privacy.

    Science.gov (United States)

    Rothstein, Mark A; Knoppers, Bartha Maria; Harrell, Heather L

    2016-03-01

    Laws in the 20 jurisdictions studied for this project display many similar approaches to protecting privacy in biobank research. Although few have enacted biobank-specific legislation, many countries address biobanking within other laws. All provide for some oversight mechanisms for biobank research, even though the nature of that oversight varies between jurisdictions. Most have some sort of controlled access system in place for research with biobank specimens. While broad consent models facilitate biobanking, countries without national or federated biobanks have been slow to adopt broad consent. International guidelines have facilitated sharing and generally take a proportional risk approach, but many countries have provisions guiding international sharing and a few even limit international sharing. Although privacy laws may not prohibit international collaborations, the multi-prong approach to privacy unique to each jurisdiction can complicate international sharing. These symposium issues can serve as a resource for explaining the sometimes intricate privacy laws in each studied jurisdiction, outlining the key issues with regards to privacy and biobanking, and serving to describe a framework for the process of harmonization of privacy laws. © 2016 American Society of Law, Medicine & Ethics.

  4. Privacy and security in teleradiology

    International Nuclear Information System (INIS)

    Ruotsalainen, Pekka

    2010-01-01

    Teleradiology is probably the most successful eHealth service available today. Its business model is based on the remote transmission of radiological images (e.g. X-ray and CT-images) over electronic networks, and on the interpretation of the transmitted images for diagnostic purpose. Two basic service models are commonly used in teleradiology today. The most common approach is based on the message paradigm (off-line model), but more developed teleradiology systems are based on the interactive use of PACS/RIS systems. Modern teleradiology is also more and more a cross-organisational or even cross-border service between service providers having different jurisdictions and security policies. This paper defines the requirements needed to make different teleradiology models trusted. Those requirements include a common security policy that covers all partners and entities, common security and privacy protection principles and requirements, controlled contracts between partners, and the use of security controls and tools that support the common security policy. The security and privacy protection of any teleradiology system must be planned in advance, and the necessary security and privacy enhancing tools should be selected (e.g. strong authentication, data encryption, non-repudiation services and audit-logs) based on the risk analysis and requirements set by the legislation. In any case the teleradiology system should fulfil ethical and regulatory requirements. Certification of the whole teleradiology service system including security and privacy is also proposed. In the future, teleradiology services will be an integrated part of pervasive eHealth. Security requirements for this environment including dynamic and context aware security services are also discussed in this paper.

  5. Privacy and security in teleradiology

    Energy Technology Data Exchange (ETDEWEB)

    Ruotsalainen, Pekka [National Institute for Health and Welfare, Helsinki (Finland)], E-mail: pekka.ruotsalainen@THL.fi

    2010-01-15

    Teleradiology is probably the most successful eHealth service available today. Its business model is based on the remote transmission of radiological images (e.g. X-ray and CT-images) over electronic networks, and on the interpretation of the transmitted images for diagnostic purpose. Two basic service models are commonly used in teleradiology today. The most common approach is based on the message paradigm (off-line model), but more developed teleradiology systems are based on the interactive use of PACS/RIS systems. Modern teleradiology is also more and more a cross-organisational or even cross-border service between service providers having different jurisdictions and security policies. This paper defines the requirements needed to make different teleradiology models trusted. Those requirements include a common security policy that covers all partners and entities, common security and privacy protection principles and requirements, controlled contracts between partners, and the use of security controls and tools that support the common security policy. The security and privacy protection of any teleradiology system must be planned in advance, and the necessary security and privacy enhancing tools should be selected (e.g. strong authentication, data encryption, non-repudiation services and audit-logs) based on the risk analysis and requirements set by the legislation. In any case the teleradiology system should fulfil ethical and regulatory requirements. Certification of the whole teleradiology service system including security and privacy is also proposed. In the future, teleradiology services will be an integrated part of pervasive eHealth. Security requirements for this environment including dynamic and context aware security services are also discussed in this paper.

  6. Tales from the dark side: Privacy dark strategies and privacy dark patterns

    DEFF Research Database (Denmark)

    Bösch, Christoph; Erb, Benjamin; Kargl, Frank

    2016-01-01

    Privacy strategies and privacy patterns are fundamental concepts of the privacy-by-design engineering approach. While they support a privacy-aware development process for IT systems, the concepts used by malicious, privacy-threatening parties are generally less understood and known. We argue...... that understanding the “dark side”, namely how personal data is abused, is of equal importance. In this paper, we introduce the concept of privacy dark strategies and privacy dark patterns and present a framework that collects, documents, and analyzes such malicious concepts. In addition, we investigate from...... a psychological perspective why privacy dark strategies are effective. The resulting framework allows for a better understanding of these dark concepts, fosters awareness, and supports the development of countermeasures. We aim to contribute to an easier detection and successive removal of such approaches from...

  7. Privacy in Pharmacogenetics: An End-to-End Case Study of Personalized Warfarin Dosing.

    Science.gov (United States)

    Fredrikson, Matthew; Lantz, Eric; Jha, Somesh; Lin, Simon; Page, David; Ristenpart, Thomas

    2014-08-01

    We initiate the study of privacy in pharmacogenetics, wherein machine learning models are used to guide medical treatments based on a patient's genotype and background. Performing an in-depth case study on privacy in personalized warfarin dosing, we show that suggested models carry privacy risks, in particular because attackers can perform what we call model inversion: an attacker, given the model and some demographic information about a patient, can predict the patient's genetic markers. As differential privacy (DP) is an oft-proposed solution for medical settings such as this, we evaluate its effectiveness for building private versions of pharmacogenetic models. We show that DP mechanisms prevent our model inversion attacks when the privacy budget is carefully selected. We go on to analyze the impact on utility by performing simulated clinical trials with DP dosing models. We find that for privacy budgets effective at preventing attacks, patients would be exposed to increased risk of stroke, bleeding events, and mortality. We conclude that current DP mechanisms do not simultaneously improve genomic privacy while retaining desirable clinical efficacy, highlighting the need for new mechanisms that should be evaluated in situ using the general methodology introduced by our work.

  8. Efficient Dynamic Searchable Encryption with Forward Privacy

    Directory of Open Access Journals (Sweden)

    Etemad Mohammad

    2018-01-01

    Full Text Available Searchable symmetric encryption (SSE) enables a client to perform searches over its outsourced encrypted files while preserving privacy of the files and queries. Dynamic schemes, where files can be added or removed, leak more information than static schemes. For dynamic schemes, forward privacy requires that a newly added file cannot be linked to previous searches. We present a new dynamic SSE scheme that achieves forward privacy by replacing the keys revealed to the server on each search. Our scheme is efficient and parallelizable and outperforms the best previous schemes providing forward privacy, and achieves competitive performance with dynamic schemes without forward privacy. We provide a full security proof in the random oracle model. In our experiments on the Wikipedia archive of about four million pages, the server takes one second to perform a search with 100,000 results.
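
    The scheme itself is not given in the abstract, so the sketch below is a generic counter-based illustration of forward privacy: every addition for a keyword is stored under a fresh pseudorandom label, so labels revealed in earlier searches never match later additions. HMAC stands in for the PRF, file identifiers are stored in the clear, and nothing here reproduces the paper's construction or its proof.

```python
import hmac, hashlib

def label(key, keyword, counter):
    """Pseudorandom storage label derived from the client's key and a per-keyword counter."""
    return hmac.new(key, f"{keyword}|{counter}".encode(), hashlib.sha256).hexdigest()

class Server:
    def __init__(self):
        self.index = {}                  # label -> file id; the server never sees keywords
    def store(self, lbl, file_id):
        self.index[lbl] = file_id
    def lookup(self, lbl):
        return self.index.get(lbl)

class Client:
    def __init__(self, key):
        self.key, self.counters = key, {}

    def add(self, server, keyword, file_id):
        c = self.counters.get(keyword, 0)
        server.store(label(self.key, keyword, c), file_id)   # fresh label for each addition
        self.counters[keyword] = c + 1

    def search(self, server, keyword):
        c = self.counters.get(keyword, 0)
        return [server.lookup(label(self.key, keyword, i)) for i in range(c)]

client, server = Client(b"secret-key"), Server()
client.add(server, "privacy", "doc1")
client.add(server, "privacy", "doc7")
print(client.search(server, "privacy"))   # ['doc1', 'doc7']
```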

  9. Achieving Optimal Privacy in Trust-Aware Social Recommender Systems

    Science.gov (United States)

    Dokoohaki, Nima; Kaleli, Cihan; Polat, Huseyin; Matskin, Mihhail

    Collaborative filtering (CF) recommenders are subject to numerous shortcomings such as centralized processing, vulnerability to shilling attacks, and, most important of all, privacy. To overcome these obstacles, researchers proposed utilizing interpersonal trust between users to alleviate many of these crucial shortcomings. Till now, attention has been mainly paid to strong points about trust-aware recommenders such as alleviating profile sparsity or calculation cost efficiency, while least attention has been paid to investigating the notion of privacy surrounding the disclosure of individual ratings and most importantly protection of trust computation across social networks forming the backbone of these systems. To contribute to addressing the problem of privacy in trust-aware recommenders, within this paper, first we introduce a framework for enabling privacy-preserving trust-aware recommendation generation. While the trust mechanism aims at elevating the recommender's accuracy, to preserve privacy, accuracy of the system needs to be decreased. Since, within this context, privacy and accuracy are conflicting goals, we show that a Pareto set can be found as an optimal setting for both privacy-preserving and trust-enabling mechanisms. We show that this Pareto set, when used as the configuration for measuring the accuracy of the base collaborative filtering engine, yields an optimized tradeoff between conflicting goals of privacy and accuracy. We prove this concept along with applicability of our framework by experimenting with accuracy and privacy factors, and we show through experiment how such an optimal set can be inferred.
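
    A minimal sketch of extracting a Pareto set from candidate configurations scored on privacy and accuracy (both to be maximised); the candidate points below are invented, whereas the paper's points would come from the trust-aware recommender experiments.

```python
def pareto_front(points):
    """Keep points not dominated by any other point (>= in both dimensions, > in one)."""
    front = []
    for p in points:
        dominated = any(q != p and q[0] >= p[0] and q[1] >= p[1]
                        and (q[0] > p[0] or q[1] > p[1]) for q in points)
        if not dominated:
            front.append(p)
    return front

# Candidate (privacy, accuracy) scores for different configurations.
candidates = [(0.2, 0.92), (0.5, 0.90), (0.7, 0.85), (0.6, 0.80), (0.9, 0.70)]
print(pareto_front(candidates))   # the dominated point (0.6, 0.80) is dropped
```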

  10. Privacy-preserving heterogeneous health data sharing.

    Science.gov (United States)

    Mohammed, Noman; Jiang, Xiaoqian; Chen, Rui; Fung, Benjamin C M; Ohno-Machado, Lucila

    2013-05-01

    Privacy-preserving data publishing addresses the problem of disclosing sensitive data when mining for useful information. Among existing privacy models, ε-differential privacy provides one of the strongest privacy guarantees and makes no assumptions about an adversary's background knowledge. All existing solutions that ensure ε-differential privacy handle the problem of disclosing relational and set-valued data in a privacy-preserving manner separately. In this paper, we propose an algorithm that considers both relational and set-valued data in differentially private disclosure of healthcare data. The proposed approach makes a simple yet fundamental switch in differentially private algorithm design: instead of listing all possible records (ie, a contingency table) for noise addition, records are generalized before noise addition. The algorithm first generalizes the raw data in a probabilistic way, and then adds noise to guarantee ε-differential privacy. We showed that the disclosed data could be used effectively to build a decision tree induction classifier. Experimental results demonstrated that the proposed algorithm is scalable and performs better than existing solutions for classification analysis. The resulting utility may degrade when the output domain size is very large, making it potentially inappropriate to generate synthetic data for large health databases. Unlike existing techniques, the proposed algorithm allows the disclosure of health data containing both relational and set-valued data in a differentially private manner, and can retain essential information for discriminative analysis.
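
    A toy generalise-then-perturb example in the spirit described above: records are coarsened into groups first and Laplace noise (scale 1/ε for counting queries) is added to the group counts. The deterministic binning, the bin width and ε are illustrative assumptions; the paper's generalisation is probabilistic and its algorithm is more involved.

```python
import random
from collections import Counter

def laplace(scale):
    """Laplace(0, scale) noise, drawn as the difference of two exponentials."""
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def private_histogram(ages, bin_width, epsilon):
    # Generalise: bucket each age, then perturb each bucket count.
    bins = Counter((age // bin_width) * bin_width for age in ages)
    return {f"{b}-{b + bin_width - 1}": count + laplace(1.0 / epsilon)
            for b, count in bins.items()}

ages = [23, 25, 27, 31, 35, 36, 41, 44, 52]
print(private_histogram(ages, bin_width=10, epsilon=0.5))
```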

  11. Nano-technology and privacy: on continuous surveillance outside the panopticon.

    Science.gov (United States)

    Hoven, Jeroen Van Den; Vermaas, Pieter E

    2007-01-01

    We argue that nano-technology in the form of invisible tags, sensors, and Radio Frequency Identity Chips (RFIDs) will give rise to privacy issues that are in two ways different from the traditional privacy issues of the last decades. One, they will not exclusively revolve around the idea of centralization of surveillance and concentration of power, as the metaphor of the Panopticon suggests, but will be about constant observation at decentralized levels. Two, privacy concerns may not exclusively be about constraining information flows but also about designing of materials and nano-artifacts such as chips and tags. We begin by presenting a framework for structuring the current debates on privacy, and then present our arguments.

  12. The role of privacy protection in healthcare information systems adoption.

    Science.gov (United States)

    Hsu, Chien-Lung; Lee, Ming-Ren; Su, Chien-Hui

    2013-10-01

    Privacy protection is an important issue and challenge in healthcare information systems (HISs). Recently, some privacy-enhanced HISs are proposed. Users' privacy perception, intention, and attitude might affect the adoption of such systems. This paper aims to propose a privacy-enhanced HIS framework and investigate the role of privacy protection in HISs adoption. In the proposed framework, privacy protection, access control, and secure transmission modules are designed to enhance the privacy protection of a HIS. An experimental privacy-enhanced HIS is also implemented. Furthermore, we proposed a research model extending the unified theory of acceptance and use of technology by considering perceived security and information security literacy and then investigate user adoption of a privacy-enhanced HIS. The experimental results and analyses showed that user adoption of a privacy-enhanced HIS is directly affected by social influence, performance expectancy, facilitating conditions, and perceived security. Perceived security has a mediating effect between information security literacy and user adoption. This study proposes several implications for research and practice to improve designing, development, and promotion of a good healthcare information system with privacy protection.

  13. 75 FR 81205 - Privacy Act: Revision of Privacy Act Systems of Records

    Science.gov (United States)

    2010-12-27

    ... DEPARTMENT OF AGRICULTURE Office of the Secretary Privacy Act: Revision of Privacy Act Systems of Records AGENCY: Office of the Secretary, USDA. ACTION: Notice to Revise Privacy Act Systems of Records... two Privacy Act Systems of Records entitled ``Information on Persons Disqualified from the...

  14. Information privacy in organizations: empowering creative and extrarole performance.

    Science.gov (United States)

    Alge, Bradley J; Ballinger, Gary A; Tangirala, Subrahmaniam; Oakley, James L

    2006-01-01

    This article examines the relationship of employee perceptions of information privacy in their work organizations and important psychological and behavioral outcomes. A model is presented in which information privacy predicts psychological empowerment, which in turn predicts discretionary behaviors on the job, including creative performance and organizational citizenship behavior (OCB). Results from 2 studies (Study 1: single organization, N=310; Study 2: multiple organizations, N=303) confirm that information privacy entails judgments of information gathering control, information handling control, and legitimacy. Moreover, a model linking information privacy to empowerment and empowerment to creative performance and OCBs was supported. Findings are discussed in light of organizational attempts to control employees through the gathering and handling of their personal information. (c) 2006 APA, all rights reserved.

  15. 78 FR 40515 - Privacy Act of 1974; Privacy Act System of Records

    Science.gov (United States)

    2013-07-05

    ... NATIONAL AERONAUTICS AND SPACE ADMINISTRATION [Notice 13-071] Privacy Act of 1974; Privacy Act System of Records AGENCY: National Aeronautics and Space Administration (NASA). ACTION: Notice of Privacy Act system of records. SUMMARY: Each Federal agency is required by the Privacy Act of 1974 to publish...

  16. Privacy Management and Networked PPD Systems - Challenges Solutions.

    Science.gov (United States)

    Ruotsalainen, Pekka; Pharow, Peter; Petersen, Francoise

    2015-01-01

    Modern personal portable health devices (PPDs) are increasingly becoming part of a larger, inhomogeneous information system. Information collected by sensors is stored and processed in global clouds. Services are often free of charge, but at the same time service providers' business model is based on the disclosure of users' intimate health information. Health data processed in PPD networks is not regulated by health care specific legislation. In PPD networks, there is no guarantee that stakeholders share the same ethical principles as the user. Service providers often have their own security and privacy policies and rarely offer users the possibility to define their own, or adapt existing, privacy policies. This all raises significant ethical and privacy concerns. In this paper, the authors analyze privacy challenges in PPD networks from the users' viewpoint using a system modeling method and propose that the principle "Personal Health Data under Personal Control" must be generally accepted at the global level. Among possible implementations of this principle, the authors propose encryption, computer-understandable privacy policies, and privacy labels or trust-based privacy management methods. The latter can be realized using an infrastructural trust calculation and monitoring service. A first step is to make the protection of personal health information and the proposed principle internationally mandatory. This requires both regulatory and standardization activities, and the availability of open and certified software applications which all service providers can implement. One of those applications should be an independent trust verifier.

  17. Privacy and security in teleradiology.

    Science.gov (United States)

    Ruotsalainen, Pekka

    2010-01-01

    Teleradiology is probably the most successful eHealth service available today. Its business model is based on the remote transmission of radiological images (e.g. X-ray and CT-images) over electronic networks, and on the interpretation of the transmitted images for diagnostic purposes. Two basic service models are commonly used in teleradiology today. The most common approach is based on the message paradigm (off-line model), but more developed teleradiology systems are based on the interactive use of PACS/RIS systems. Modern teleradiology is also more and more a cross-organisational or even cross-border service between service providers having different jurisdictions and security policies. This paper defines the requirements needed to make different teleradiology models trusted. Those requirements include a common security policy that covers all partners and entities, common security and privacy protection principles and requirements, controlled contracts between partners, and the use of security controls and tools that support the common security policy. The security and privacy protection of any teleradiology system must be planned in advance, and the necessary security and privacy enhancing tools should be selected (e.g. strong authentication, data encryption, non-repudiation services and audit-logs) based on the risk analysis and requirements set by the legislation. In any case the teleradiology system should fulfil ethical and regulatory requirements. Certification of the whole teleradiology service system including security and privacy is also proposed. In the future, teleradiology services will be an integrated part of pervasive eHealth. Security requirements for this environment including dynamic and context aware security services are also discussed in this paper. Copyright (c) 2009 Elsevier Ireland Ltd. All rights reserved.

  18. 78 FR 77503 - Privacy Act of 1974; Privacy Act System of Records

    Science.gov (United States)

    2013-12-23

    ... NATIONAL AERONAUTICS AND SPACE ADMINISTRATION [Notice 13-149] Privacy Act of 1974; Privacy Act... proposed revisions to existing Privacy Act systems of records. SUMMARY: Pursuant to the provisions of the Privacy Act of 1974 (5 U.S.C. 552a), the National Aeronautics and Space Administration is issuing public...

  19. Privacy Preserving Association Rule Mining Revisited: Privacy Enhancement and Resources Efficiency

    Science.gov (United States)

    Mohaisen, Abedelaziz; Jho, Nam-Su; Hong, Dowon; Nyang, Daehun

    Privacy preserving association rule mining algorithms have been designed for discovering the relations between variables in data while maintaining the data privacy. In this article we revise one of the recently introduced schemes for association rule mining using fake transactions (FS). In particular, our analysis shows that the FS scheme has exhaustive storage and high computation requirements for guaranteeing a reasonable level of privacy. We introduce a realistic definition of privacy that benefits from the average case privacy and motivates the study of a weakness in the structure of FS by fake transactions filtering. In order to overcome this problem, we improve the FS scheme by presenting a hybrid scheme that considers both privacy and resources as two concurrent guidelines. Analytical and empirical results show the efficiency and applicability of our proposed scheme.
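
    The fake-transaction idea that the article analyses can be pictured with a toy sketch: each real transaction is uploaded together with a few randomly generated fake transactions, so the collector cannot tell which itemset is genuine. This only illustrates the general obfuscation step, not the FS scheme's actual construction or the authors' hybrid improvement; the item universe and parameters are hypothetical.

      import random

      ITEM_UNIVERSE = list(range(100))          # hypothetical item identifiers

      def obfuscate(real_transaction, n_fake=3, fake_len=4):
          """Mix the real transaction with n_fake random fake transactions and
          shuffle, so the server cannot single out the genuine itemset."""
          batch = [set(real_transaction)]
          for _ in range(n_fake):
              batch.append(set(random.sample(ITEM_UNIVERSE, fake_len)))
          random.shuffle(batch)
          return batch

      print(obfuscate({3, 17, 42}))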

  20. 76 FR 67763 - Privacy Act of 1974; Privacy Act System of Records

    Science.gov (United States)

    2011-11-02

    ... NATIONAL AERONAUTICS AND SPACE ADMINISTRATION [Notice (11-109)] Privacy Act of 1974; Privacy Act... proposed revisions to an existing Privacy Act system of records. SUMMARY: Pursuant to the provisions of the Privacy Act of 1974 (5 U.S.C. 552a), the National Aeronautics and Space Administration is issuing public...

  1. 76 FR 64114 - Privacy Act of 1974; Privacy Act System of Records

    Science.gov (United States)

    2011-10-17

    ... NATIONAL AERONAUTICS AND SPACE ADMINISTRATION [Notice (11-093)] Privacy Act of 1974; Privacy Act... proposed revisions to an existing Privacy Act system of records. SUMMARY: Pursuant to the provisions of the Privacy Act of 1974 (5 U.S.C. 552a), the National Aeronautics and Space Administration is issuing public...

  2. 77 FR 69898 - Privacy Act of 1974; Privacy Act System of Records

    Science.gov (United States)

    2012-11-21

    ... NATIONAL AERONAUTICS AND SPACE ADMINISTRATION [Notice 12-100] Privacy Act of 1974; Privacy Act... proposed revisions to an existing Privacy Act system of records. SUMMARY: Pursuant to the provisions of the Privacy Act of 1974 (5 U.S.C. 552a), the National Aeronautics and Space Administration is issuing public...

  3. Privacy and Innovation

    OpenAIRE

    Avi Goldfarb; Catherine Tucker

    2011-01-01

    Information and communication technology now enables firms to collect detailed and potentially intrusive data about their customers both easily and cheaply. This means that privacy concerns are no longer limited to government surveillance and public figures' private lives. The empirical literature on privacy regulation shows that privacy regulation may affect the extent and direction of data-based innovation. We also show that the impact of privacy regulation can be extremely heterogeneous. T...

  4. Simple Peer-to-Peer SIP Privacy

    Science.gov (United States)

    Koskela, Joakim; Tarkoma, Sasu

    In this paper, we introduce a model for enhancing privacy in peer-to-peer communication systems. The model is based on data obfuscation, preventing intermediate nodes from tracking calls, while still utilizing the shared resources of the peer network. This increases security when moving between untrusted, limited and ad-hoc networks, when the user is forced to rely on peer-to-peer schemes. The model is evaluated using a Host Identity Protocol-based prototype on mobile devices, and is found to provide good privacy, especially when combined with a source address hiding scheme. The contribution of this paper is to present the model and results obtained from its use, including usability considerations.

  5. Privacy and technology challenges for ubiquitous social networking

    DEFF Research Database (Denmark)

    Sapuppo, Antonio; Seet, Boon-Chong

    2015-01-01

    towards important challenges such as social sensing, enabling social networking and privacy protection. In this paper we firstly investigate the methods and technologies for acquisition of the relevant context for promotion of sociability among inhabitants of USN environments. Afterwards, we review...... architectures and techniques for enabling social interactions between participants. Finally, we identify privacy as the major challenge for networking in USN environments. Consequently, we depict design guidelines and review privacy protection models for facilitating personal information disclosure....

  6. Smartdata privacy meets evolutionary robotics

    CERN Document Server

    Harvey, Inman; Tomko, George

    2013-01-01

    Privacy by Design and the Promise of SmartData.- SmartData: the Need, the Goal and the Challenge.- Perspectives on Artificial Intelligence.- Context dependent information processing entails scale-free dynamics.- Philosophy and SmartData.- Relevance Realization and the Neurodynamics and Neural Connectivity of General Intelligence.- What Matters: Real Bodies and Virtual Worlds.- The development of autonomous virtual agents.- Patterns of Attractors in the "Brain".- A Privacy-Enabled Mobile Computing Model Using Intelligent Cloud-Based Services.- Unconstraint the Population: the Benefits of Horiz

  7. Privacy-Aware Relevant Data Access with Semantically Enriched Search Queries for Untrusted Cloud Storage Services.

    Science.gov (United States)

    Pervez, Zeeshan; Ahmad, Mahmood; Khattak, Asad Masood; Lee, Sungyoung; Chung, Tae Choong

    2016-01-01

    Privacy-aware search of outsourced data ensures relevant data access in the untrusted domain of a public cloud service provider. Subscriber of a public cloud storage service can determine the presence or absence of a particular keyword by submitting search query in the form of a trapdoor. However, these trapdoor-based search queries are limited in functionality and cannot be used to identify secure outsourced data which contains semantically equivalent information. In addition, trapdoor-based methodologies are confined to pre-defined trapdoors and prevent subscribers from searching outsourced data with arbitrarily defined search criteria. To solve the problem of relevant data access, we have proposed an index-based privacy-aware search methodology that ensures semantic retrieval of data from an untrusted domain. This method ensures oblivious execution of a search query and leverages authorized subscribers to model conjunctive search queries without relying on predefined trapdoors. A security analysis of our proposed methodology shows that, in a conspired attack, unauthorized subscribers and untrusted cloud service providers cannot deduce any information that can lead to the potential loss of data privacy. A computational time analysis on commodity hardware demonstrates that our proposed methodology requires moderate computational resources to model a privacy-aware search query and for its oblivious evaluation on a cloud service provider.
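
    The trapdoor-based baseline that this work extends can be sketched with a keyed hash: the subscriber derives a trapdoor from each keyword, and the server can only test trapdoor equality without ever seeing the keyword. The key, keywords, and index below are hypothetical, and the sketch omits the paper's semantic enrichment and conjunctive-query features.

      import hmac, hashlib

      KEY = b"subscriber-secret-key"            # hypothetical key shared by authorized subscribers

      def trapdoor(keyword):
          """Keyed hash of the keyword; the server only ever sees this value."""
          return hmac.new(KEY, keyword.lower().encode(), hashlib.sha256).hexdigest()

      # Server-side index maps trapdoors to (encrypted) document identifiers.
      index = {trapdoor("diabetes"): ["doc-07"], trapdoor("hypertension"): ["doc-12"]}

      def search(index, keyword):
          return index.get(trapdoor(keyword), [])

      print(search(index, "diabetes"))          # ['doc-07']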

  8. Internet and Privacy

    OpenAIRE

    Al-Fadhli, Meshal Shehab

    2007-01-01

    The concept of privacy is hard to understand and is not easy to define, because this concept is linked with several dimensions. Internet Privacy is associated with the use of the Internet and most likely appointed under communications privacy, involving the user of the Internet’s personal information and activities, and the disclosure of them online. This essay is going to present the meaning of privacy and the implications of it for Internet users. Also, this essay will demonstrate some of t...

  9. The future of privacy - Addressing singularities to identify bright-line rules that speak to us

    NARCIS (Netherlands)

    de Hert, Paul

    2016-01-01

    To apprehend the future of privacy I have opted for a controlled exploration of the issue, mainly taking the form of delamination: an exploration or assessment of privacy in a broad sense is not the object of this reflection. The focus is on technology-related privacy. Is the future of (some aspects

  10. The disclosure of diagnosis codes can breach research participants' privacy.

    Science.gov (United States)

    Loukides, Grigorios; Denny, Joshua C; Malin, Bradley

    2010-01-01

    De-identified clinical data in standardized form (eg, diagnosis codes), derived from electronic medical records, are increasingly combined with research data (eg, DNA sequences) and disseminated to enable scientific investigations. This study examines whether released data can be linked with identified clinical records that are accessible via various resources to jeopardize patients' anonymity, and the ability of popular privacy protection methodologies to prevent such an attack. The study experimentally evaluates the re-identification risk of a de-identified sample of Vanderbilt's patient records involved in a genome-wide association study. It also measures the level of protection from re-identification, and data utility, provided by suppression and generalization. Privacy protection is quantified using the probability of re-identifying a patient in a larger population through diagnosis codes. Data utility is measured at a dataset level, using the percentage of retained information, as well as its description, and at a patient level, using two metrics based on the difference between the distribution of International Classification of Diseases (ICD) version 9 codes before and after applying privacy protection. More than 96% of 2800 patients' records are shown to be uniquely identified by their diagnosis codes with respect to a population of 1.2 million patients. Generalization is shown to reduce further the percentage of de-identified records by less than 2%, and over 99% of the three-digit ICD-9 codes need to be suppressed to prevent re-identification. Popular privacy protection methods are inadequate to deliver a sufficiently protected and useful result when sharing data derived from complex clinical systems. The development of alternative privacy protection models is thus required.
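
    The uniqueness measurement at the heart of this study can be reproduced in miniature: group a population by its combination of diagnosis codes and report the fraction of records whose combination is unique. The records below are hypothetical; the real analysis covered 2800 study participants against a population of 1.2 million patients.

      from collections import Counter

      # Hypothetical population: each patient is a frozenset of ICD-9 codes.
      population = [
          frozenset({"250.01", "401.9"}),
          frozenset({"250.01", "401.9"}),
          frozenset({"272.4", "414.01"}),
          frozenset({"296.2", "300.02", "401.9"}),
      ]

      def uniqueness_rate(records):
          """Fraction of records whose diagnosis-code combination is unique;
          a unique combination re-identifies its owner with probability 1."""
          sizes = Counter(records)
          return sum(1 for r in records if sizes[r] == 1) / len(records)

      print(uniqueness_rate(population))        # 0.5 in this toy example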

  11. AN EMPIRICAL EXAMINATION OF THE PRIVACY BEHAVIOUR WITH REFERENCE SOCIAL NETWORKING SITES

    OpenAIRE

    Tanvi Gupta, Mamta Rani & Ravneet Singh Bhandari

    2018-01-01

    Purpose – The purpose of this paper is to propose and examine a privacy behaviour model in the context of Social networking factors (Indian scenario). The effects of key elements of SNS factors on perceived values of privacy behaviour were empirically determined. Design/methodology/approach – SNS is conceptualized as a multi-dimensional construct including emotional privacy, social privacy, personal privacy, and value. The investigated socio demographic factors included, age, gender, ethn...

  12. Self-reflection on privacy research in social networking sites

    OpenAIRE

    De Wolf, Ralf; Vanderhoven, Ellen; Berendt, Bettina; Pierson, Jo; Schellens, Tammy

    2017-01-01

    The increasing popularity of social networking sites has been a source of many privacy concerns. To mitigate these concerns and empower users, different forms of educational and technological solutions have been developed. Developing and evaluating such solutions, however, cannot be considered a neutral process. Instead, it is socially bound and interwoven with norms and values of the researchers. In this contribution, we aim to make the research process and development of privacy solutions m...

  13. Assessing the privacy policies in mobile personal health records.

    Science.gov (United States)

    Zapata, Belén Cruz; Hernández Niñirola, Antonio; Fernández-Alemán, José Luis; Toval, Ambrosio

    2014-01-01

    The huge increase in the number and use of smartphones and tablets has led health service providers to take an interest in mHealth. Popular mobile app markets like Apple App Store or Google Play contain thousands of health applications. Although mobile personal health records (mPHRs) have a number of benefits, important challenges appear in the form of adoption barriers. Security and privacy have been identified as part of these barriers and should be addressed. This paper analyzes and assesses a total of 24 free mPHRs for Android and iOS. Characteristics regarding privacy and security were extracted from the HIPAA. The results show important differences in both the mPHRs and the characteristics analyzed. A questionnaire containing six questions concerning privacy policies was defined. Our questionnaire may assist developers and stakeholders to evaluate the security and privacy of their mPHRs.

  14. 76 FR 64112 - Privacy Act of 1974; Privacy Act System of Records Appendices

    Science.gov (United States)

    2011-10-17

    ... NATIONAL AERONAUTICS AND SPACE ADMINISTRATION [Notice (11-091)] Privacy Act of 1974; Privacy Act...: Revisions of NASA Appendices to Privacy Act System of Records. SUMMARY: Notice is hereby given that NASA is... Privacy Act of 1974. This notice publishes those amendments as set forth below under the caption...

  15. Privacy encounters in Teledialogue

    DEFF Research Database (Denmark)

    Andersen, Lars Bo; Bøge, Ask Risom; Danholt, Peter

    2017-01-01

    Privacy is a major concern when new technologies are introduced between public authorities and private citizens. What is meant by privacy, however, is often unclear and contested. Accordingly, this article utilises grounded theory to study privacy empirically in the research and design project...... Teledialogue aimed at introducing new ways for public case managers and placed children to communicate through IT. The resulting argument is that privacy can be understood as an encounter, that is, as something that arises between implicated actors and entails some degree of friction and negotiation....... An argument which is further qualified through the philosophy of Gilles Deleuze. The article opens with a review of privacy literature before continuing to present privacy as an encounter with five different foci: what technologies bring into the encounter; who is related to privacy by implication; what...

  16. A Distributed Ensemble Approach for Mining Healthcare Data under Privacy Constraints.

    Science.gov (United States)

    Li, Yan; Bai, Changxin; Reddy, Chandan K

    2016-02-10

    In recent years, electronic health records (EHRs) have been widely adopted at many healthcare facilities in an attempt to improve the quality of patient care and increase the productivity and efficiency of healthcare delivery. These EHRs can accurately diagnose diseases if utilized appropriately. While the EHRs can potentially resolve many of the existing problems associated with disease diagnosis, one of the main obstacles in effectively using them is the patient privacy and sensitivity of the medical information available in the EHR. Due to these concerns, even if the EHRs are available for storage and retrieval purposes, sharing of the patient records between different healthcare facilities has become a major concern and has hampered some of the effective advantages of using EHRs. Due to this lack of data sharing, most of the facilities aim at building clinical decision support systems using a limited amount of patient data from their own EHR systems to provide important diagnosis related decisions. It becomes quite infeasible for a newly established healthcare facility to build a robust decision making system due to the lack of sufficient patient records. However, to make effective decisions from clinical data, it is indispensable to have large amounts of data to train the decision models. In this regard, there are conflicting objectives of preserving patient privacy and having sufficient data for modeling and decision making. To handle such disparate goals, we develop two adaptive distributed privacy-preserving algorithms based on a distributed ensemble strategy. The basic idea of our approach is to build an elegant model for each participating facility to accurately learn the data distribution, and then transfer the useful healthcare knowledge acquired from these participants' data in the form of their own decision models without revealing and sharing the patient-level sensitive data, thus protecting patient privacy. We demonstrate that our
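
    The distributed-ensemble idea can be sketched as follows: each facility trains a model on its own patients, only the fitted models are exchanged, and a prediction is made by majority vote over the models. This is a minimal stand-in rather than the paper's adaptive algorithms; it assumes scikit-learn is available and uses made-up toy data.

      from collections import Counter
      from sklearn.tree import DecisionTreeClassifier

      def train_local(X, y):
          """Each facility fits a model on its own records; only the model leaves the site."""
          return DecisionTreeClassifier(max_depth=3).fit(X, y)

      def ensemble_predict(models, x):
          votes = [int(m.predict([x])[0]) for m in models]
          return Counter(votes).most_common(1)[0][0]

      # Hypothetical per-facility data (features, labels); patient rows never leave a site.
      site_a = ([[1, 0], [0, 1], [1, 1]], [1, 0, 1])
      site_b = ([[0, 0], [1, 1], [0, 1]], [0, 1, 0])
      models = [train_local(*site_a), train_local(*site_b)]
      print(ensemble_predict(models, [1, 1]))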

  17. Hacktivism 1-2-3: how privacy enhancing technologies change the face of anonymous hacktivism

    NARCIS (Netherlands)

    Bodó, B.

    2014-01-01

    This short essay explores how the notion of hacktivism changes due to easily accessible, military grade Privacy Enhancing Technologies (PETs). Privacy Enhancing Technologies, technological tools which provide anonymous communications and protect users from online surveillance enable new forms of

  18. Privacy vs. Reward in Indoor Location-Based Services

    Directory of Open Access Journals (Sweden)

    Fawaz Kassem

    2016-10-01

    Full Text Available With the advance of indoor localization technology, indoor location-based services (ILBS) are gaining popularity. They, however, accompany privacy concerns. ILBS providers track the users’ mobility to learn more about their behavior, and then provide them with improved and personalized services. Our survey of 200 individuals highlighted their concerns about this tracking for potential leakage of their personal/private traits, but also showed their willingness to accept reduced tracking for improved service. In this paper, we propose PR-LBS (Privacy vs. Reward for Location-Based Service), a system that addresses these seemingly conflicting requirements by balancing the users’ privacy concerns and the benefits of sharing location information in indoor location tracking environments. PR-LBS relies on a novel location-privacy criterion to quantify the privacy risks pertaining to sharing indoor location information. It also employs a repeated play model to ensure that the received service is proportionate to the privacy risk. We implement and evaluate PR-LBS extensively with various real-world user mobility traces. Results show that PR-LBS has low overhead, protects the users’ privacy, and makes a good tradeoff between the quality of service for the users and the utility of shared location data for service providers.

  19. Reward-based spatial crowdsourcing with differential privacy preservation

    Science.gov (United States)

    Xiong, Ping; Zhang, Lefeng; Zhu, Tianqing

    2017-11-01

    In recent years, the popularity of mobile devices has transformed spatial crowdsourcing (SC) into a novel mode for performing complicated projects. Workers can perform tasks at specified locations in return for rewards offered by employers. Existing methods ensure the efficiency of their systems by submitting the workers' exact locations to a centralised server for task assignment, which can lead to privacy violations. Thus, implementing crowdsourcing applications while preserving the privacy of workers' locations is a key issue that needs to be tackled. We propose a reward-based SC method that achieves acceptable utility as measured by task assignment success rates, while efficiently preserving privacy. A differential privacy model ensures a rigorous privacy guarantee, and Laplace noise is introduced to protect workers' exact locations. We then present a reward allocation mechanism that adjusts each piece of the reward for a task using the distribution of the workers' locations. Through experimental results, we demonstrate that this optimised-reward method is efficient for SC applications.
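
    The location-perturbation step can be sketched as independent Laplace noise added to each coordinate before upload. This is not the paper's exact mechanism and omits the reward-adjustment step; the epsilon and sensitivity values below are purely illustrative.

      import numpy as np

      def perturb_location(lat, lon, epsilon, sensitivity=0.01):
          """Add Laplace noise with scale sensitivity/epsilon to each coordinate
          before the worker's position is sent to the task-assignment server."""
          scale = sensitivity / epsilon
          return lat + np.random.laplace(0.0, scale), lon + np.random.laplace(0.0, scale)

      print(perturb_location(30.52, 114.35, epsilon=0.5))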

  20. The Models of Applying Online Privacy Literacy Strategies: A Case Study of Instagram Girl Users

    OpenAIRE

    Abdollah Bicharanlou; Seyedeh farzaneh Siasi rad

    2017-01-01

    Social networks affect remarkably in the lives of virtual space users. These networks like most human relations involve compromising between self-disclosure and privacy protection. A process which is realized through improving privacy and empowering the user at the personal level. This study aimed to assess strategies based on online privacy literacy. In particular, strategies that Instagram young girls users should employ to achieve the optimum level of privacy. For this purpose, firstly the...

  1. Privacy preserving data anonymization of spontaneous ADE reporting system dataset.

    Science.gov (United States)

    Lin, Wen-Yang; Yang, Duen-Chuan; Wang, Jie-Teng

    2016-07-18

    To facilitate long-term safety surveillance of marketed drugs, many spontaneous reporting systems (SRSs) of ADR events have been established world-wide. Since the data collected by SRSs contain sensitive personal health information that should be protected to prevent the identification of individuals, this raises the issue of privacy preserving data publishing (PPDP), that is, how to sanitize (anonymize) raw data before publishing. Although much work has been done on PPDP, very few studies have focused on protecting the privacy of SRS data, and none of the anonymization methods is favorable for SRS datasets, which contain characteristics such as rare events, multiple individual records, and multi-valued sensitive attributes. We propose a new privacy model called MS(k, θ*)-bounding for protecting published spontaneous ADE reporting data from privacy attacks. Our model has the flexibility of varying privacy thresholds, i.e., θ*, for different sensitive values and takes the characteristics of SRS data into consideration. We also propose an anonymization algorithm for sanitizing the raw data to meet the requirements specified through the proposed model. Our algorithm adopts a greedy clustering strategy to group the records into clusters, conforming to an innovative anonymization metric aiming to minimize the privacy risk as well as maintain the data utility for ADR detection. An empirical study was conducted using the FAERS dataset from 2004Q1 to 2011Q4. We compared our model with four prevailing methods, including k-anonymity, (X, Y)-anonymity, Multi-sensitive l-diversity, and (α, k)-anonymity, evaluated via two measures, Danger Ratio (DR) and Information Loss (IL), and considered three different scenarios of threshold setting for θ*, including uniform setting, level-wise setting and frequency-based setting. We also conducted experiments to inspect the impact of anonymized data on the strengths of discovered ADR signals. With all three
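
    The greedy clustering step can be pictured with a drastically simplified sketch that only groups records into clusters of at least k by distance to a seed. The MS(k, θ*)-bounding thresholds, the multi-valued sensitive attributes, and the paper's anonymization metric are all omitted; the values shown are hypothetical.

      def greedy_clusters(records, k, distance):
          """Greedily grow clusters of size >= k by pulling in the records
          closest to each cluster seed (simplified single-pass version)."""
          remaining = list(records)
          clusters = []
          while len(remaining) >= k:
              seed = remaining.pop(0)
              remaining.sort(key=lambda r: distance(seed, r))
              clusters.append([seed] + remaining[:k - 1])
              remaining = remaining[k - 1:]
          if remaining and clusters:            # attach leftovers to the last cluster
              clusters[-1].extend(remaining)
          return clusters

      ages = [23, 25, 31, 44, 46, 47, 52]       # hypothetical quasi-identifier values
      print(greedy_clusters(ages, k=3, distance=lambda a, b: abs(a - b)))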

  2. Neuroethics and Brain Privacy

    DEFF Research Database (Denmark)

    Ryberg, Jesper

    2017-01-01

    An introduction is presented in which the editor discusses various articles within the issue on topics including ethical challenges with the importance of privacy for well-being, the impact of brain-reading on mind privacy, and neurotechnology.

  3. Opening More Data : A New Privacy Risk Scoring Model for Open Data

    NARCIS (Netherlands)

    Ali-Eldin, A.M.T.; Zuiderwijk-van Eijk, AMG; Janssen, M.F.W.H.A.

    2017-01-01

    While the opening of data has become a common practice for both governments and companies, many datasets are still not published since they might violate privacy regulations. The risk of privacy violations is a factor that often blocks the publication of data and results in a reserved

  4. Service Outsourcing Character Oriented Privacy Conflict Detection Method in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Changbo Ke

    2014-01-01

    Full Text Available Cloud computing has provided services for users as a software paradigm. However, it is difficult to ensure the security of privacy information because of its openness, virtualization, and service outsourcing features. Therefore, how to protect user privacy information has become a research focus. In this paper, firstly, we model the service privacy policy and the user privacy preference with description logic. Secondly, we use the Pellet reasoner to verify consistency and satisfiability, so as to detect privacy conflicts between services and the user. Thirdly, we present an algorithm for detecting privacy conflicts in the process of cloud service composition and prove the correctness and feasibility of this method by case study and experimental analysis. Our method can reduce the risk of users' sensitive privacy information being illegally used and propagated by outsourcing services. In the meantime, the method avoids exceptions caused by privacy conflicts in the process of service composition, and improves the trust degree of cloud service providers.
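
    The paper performs its conflict check with description logic and the Pellet reasoner; the sketch below replaces that machinery with a plain dictionary comparison purely to illustrate what a privacy conflict means here: the service demands an attribute for a purpose the user has not permitted. All policies shown are hypothetical.

      # Hypothetical service policy: attribute -> purpose it is requested for.
      service_policy = {"email": "marketing", "location": "service-delivery"}
      # Hypothetical user preference: attribute -> purposes the user permits.
      user_preference = {"email": {"service-delivery"}, "location": {"service-delivery"}}

      def detect_conflicts(policy, preference):
          """Return the attributes whose requested purpose the user has not allowed."""
          return [attr for attr, purpose in policy.items()
                  if purpose not in preference.get(attr, set())]

      print(detect_conflicts(service_policy, user_preference))   # ['email']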

  5. Protecting genetic privacy.

    Science.gov (United States)

    Roche, P A; Annas, G J

    2001-05-01

    This article outlines the arguments for and against new rules to protect genetic privacy. We explain why genetic information is different to other sensitive medical information, why researchers and biotechnology companies have opposed new rules to protect genetic privacy (and favour anti-discrimination laws instead), and discuss what can be done to protect privacy in relation to genetic-sequence information and to DNA samples themselves.

  6. Privacy Preserved Self-Awareness on the Community via Crowd Sensing

    Directory of Open Access Journals (Sweden)

    Huiting Fan

    2017-01-01

    Full Text Available In social activities, people are interested in some statistical data, such as purchase records, monthly consumption, and health data, which are usually utilized in recommendation systems. It is also appealing for them to learn the ranking of these data among friends or other communities, while at the same time they want their private data to remain confidential. Therefore, a strategy is presented that allows users to obtain the result of computations over their private data while keeping those data hidden. In this method, a polynomial approximation function model is first set up for each user. Afterwards, the coefficients of each model are "fragmented" into pieces. Eventually all fragments are "blended" to build the global model of all users. Users can use the global model to obtain their corresponding ranking results after a special computation. Security analyses of three aspects demonstrate the validity of the proposed privacy method, even if malicious attackers try to steal users' private data, no matter who they are (users or someone outside the community). Experimental results show that the global model fits all users' data well and all private data are protected.
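
    The fragment-and-blend step can be read as additive secret sharing of each user's coefficient vector: every user splits its coefficients into random shares that sum to the original, and only the sum over all contributed shares (the blended global coefficients) is ever reconstructed. The sketch below illustrates that arithmetic only; the polynomial fitting and the ranking computation are omitted, and the coefficients are made up.

      import random

      def fragment(coeffs, n_shares):
          """Split a coefficient vector into n_shares (>= 2) random vectors that sum to it."""
          shares = [[random.uniform(-1, 1) for _ in coeffs] for _ in range(n_shares - 1)]
          last = [c - sum(col) for c, col in zip(coeffs, zip(*shares))]
          return shares + [last]

      def blend(all_shares):
          """Sum every contributed share to recover only the global (summed) coefficients."""
          return [sum(vals) for vals in zip(*all_shares)]

      users = [[1.0, 2.0, 0.5], [0.5, 1.5, 1.0], [2.0, 0.5, 0.5]]   # hypothetical per-user coefficients
      shares = [s for u in users for s in fragment(u, n_shares=3)]
      print([round(c, 6) for c in blend(shares)])                    # [3.5, 4.0, 2.0]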

  7. Privacy-Enhanced and Multifunctional Health Data Aggregation under Differential Privacy Guarantees.

    Science.gov (United States)

    Ren, Hao; Li, Hongwei; Liang, Xiaohui; He, Shibo; Dai, Yuanshun; Zhao, Lian

    2016-09-10

    With the rapid growth of the health data scale, the limited storage and computation resources of wireless body area sensor networks (WBANs) are becoming a barrier to their development. Therefore, outsourcing the encrypted health data to the cloud has been an appealing strategy. However, data aggregation will become difficult. Some recently-proposed schemes try to address this problem. However, there are still some functions and privacy issues that are not discussed. In this paper, we propose a privacy-enhanced and multifunctional health data aggregation scheme (PMHA-DP) under differential privacy. Specifically, we achieve a new aggregation function, weighted average (WAAS), and design a privacy-enhanced aggregation scheme (PAAS) to protect the aggregated data from cloud servers. Besides, a histogram aggregation scheme with high accuracy is proposed. PMHA-DP supports fault tolerance while preserving data privacy. The performance evaluation shows that the proposal leads to less communication overhead than the existing one.
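
    A weighted average under differential privacy can be sketched with the generic recipe of perturbing both the weighted sum and the total weight with Laplace noise before dividing. This is not the PMHA-DP construction; the bounds, the epsilon split, and the readings below are hypothetical.

      import numpy as np

      def dp_weighted_average(values, weights, epsilon, value_bound=1.0, weight_bound=1.0):
          """Split the budget between the weighted sum and the total weight, add Laplace
          noise scaled to each quantity's sensitivity, then divide the noisy results."""
          wsum = sum(v * w for v, w in zip(values, weights))
          wtot = sum(weights)
          noisy_sum = wsum + np.random.laplace(0.0, (value_bound * weight_bound) / (epsilon / 2))
          noisy_tot = wtot + np.random.laplace(0.0, weight_bound / (epsilon / 2))
          return noisy_sum / max(noisy_tot, 1e-9)

      readings = [0.62, 0.58, 0.71, 0.66]        # hypothetical normalized sensor readings
      weights = [1.0, 0.8, 1.0, 0.9]
      print(dp_weighted_average(readings, weights, epsilon=1.0))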

  8. Privacy Verification Using Ontologies

    NARCIS (Netherlands)

    Kost, Martin; Freytag, Johann-Christoph; Kargl, Frank; Kung, Antonio

    2011-01-01

    As information systems extensively exchange information between participants, privacy concerns may arise from its potential misuse. A Privacy by Design (PbD) approach considers privacy requirements of different stakeholders during the design and the implementation of a system. Currently, a

  9. Identity management and privacy languages technologies: Improving user control of data privacy

    Science.gov (United States)

    García, José Enrique López; García, Carlos Alberto Gil; Pacheco, Álvaro Armenteros; Organero, Pedro Luis Muñoz

    Identity management solutions have the capability to bring confidence to internet services, but this confidence could be improved if the user had more control over the privacy policy applied to his or her attributes. Privacy languages could help with this task due to their capability to define privacy policies for data in a very flexible way. So, an integration problem arises: making identity management and privacy languages work together. Although several proposals for accomplishing this have already been defined, this paper suggests some topics and improvements that could be considered.

  10. Privacy and internet services

    OpenAIRE

    Samec, Marek

    2010-01-01

    This thesis is focused on the privacy of internet service users. The goal of this thesis is to determine the level of user awareness of how their privacy is handled while using internet services, and then to suggest procedures that improve this awareness or lead to better control of individual privacy. In the theoretical part I analyze general and legislative approaches to privacy, followed by an analysis of the behaviour of internet service users and providers. Part of this analysis deals with the usage of web cookies ...

  11. Privacy, Liveliness and Fairness for Reputation

    Science.gov (United States)

    Schiffner, Stefan; Clauß, Sebastian; Steinbrecher, Sandra

    In various Internet applications, reputation systems are typical means to collect experiences users make with each other. We present a reputation system that balances the security and privacy requirements of all users involved. Our system provides privacy in the form of information theoretic relationship anonymity w.r.t. users and the reputation provider. Furthermore, it preserves liveliness, i.e., all past ratings can influence the current reputation profile of a user. In addition, mutual ratings are forced to be simultaneous and self rating is prevented, which enforces fairness. What is more, without performing mock interactions - even if all users are colluding - users cannot forge ratings. As far as we know, this is the first protocol proposed that fulfills all these properties simultaneously.

  12. Gender and online privacy among teens: risk perception, privacy concerns, and protection behaviors.

    Science.gov (United States)

    Youn, Seounmi; Hall, Kimberly

    2008-12-01

    Survey data from 395 high school students revealed that girls perceive more privacy risks and have a higher level of privacy concerns than boys. Regarding privacy protection behaviors, boys tended to read unsolicited e-mail and register for Web sites while directly sending complaints in response to unsolicited e-mail. This study found girls to provide inaccurate information as their privacy concerns increased. Boys, however, refrained from registering to Web sites as their concerns increased.

  13. Enabling secure and privacy preserving communications in smart grids

    CERN Document Server

    Li, Hongwei

    2014-01-01

    This brief focuses on the current research on security and privacy preservation in smart grids. Along with a review of the existing works, this brief includes fundamental system models, possible frameworks, useful performance, and future research directions. It explores privacy preservation demand response with adaptive key evolution, secure and efficient Merkle tree based authentication, and fine-grained keywords comparison in the smart grid auction market. By examining the current and potential security and privacy threats, the author equips readers to understand the developing issues in sma

  14. Privacy-Enhanced and Multifunctional Health Data Aggregation under Differential Privacy Guarantees

    Science.gov (United States)

    Ren, Hao; Li, Hongwei; Liang, Xiaohui; He, Shibo; Dai, Yuanshun; Zhao, Lian

    2016-01-01

    With the rapid growth of the health data scale, the limited storage and computation resources of wireless body area sensor networks (WBANs) are becoming a barrier to their development. Therefore, outsourcing the encrypted health data to the cloud has been an appealing strategy. However, data aggregation will become difficult. Some recently-proposed schemes try to address this problem. However, there are still some functions and privacy issues that are not discussed. In this paper, we propose a privacy-enhanced and multifunctional health data aggregation scheme (PMHA-DP) under differential privacy. Specifically, we achieve a new aggregation function, weighted average (WAAS), and design a privacy-enhanced aggregation scheme (PAAS) to protect the aggregated data from cloud servers. Besides, a histogram aggregation scheme with high accuracy is proposed. PMHA-DP supports fault tolerance while preserving data privacy. The performance evaluation shows that the proposal leads to less communication overhead than the existing one. PMID:27626417

  15. 48 CFR 39.105 - Privacy.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Privacy. 39.105 Section 39... CONTRACTING ACQUISITION OF INFORMATION TECHNOLOGY General 39.105 Privacy. Agencies shall ensure that contracts for information technology address protection of privacy in accordance with the Privacy Act (5 U.S.C...

  16. Privacy in domestic environments

    OpenAIRE

    Radics, Peter J; Gracanin, Denis

    2011-01-01

    non-peer-reviewed While there is a growing body of research on privacy, most of the work puts the focus on information privacy. Physical and psychological privacy issues receive little to no attention. However, the introduction of technology into our lives can cause problems with regard to these aspects of privacy. This is especially true when it comes to our homes, both as nodes of our social life and places for relaxation. This paper presents the results of a study intended to captu...

  17. Distributed privacy preserving data collection

    KAUST Repository

    Xue, Mingqiang

    2011-01-01

    We study the distributed privacy preserving data collection problem: an untrusted data collector (e.g., a medical research institute) wishes to collect data (e.g., medical records) from a group of respondents (e.g., patients). Each respondent owns a multi-attributed record which contains both non-sensitive (e.g., quasi-identifiers) and sensitive information (e.g., a particular disease), and submits it to the data collector. Assuming T is the table formed by all the respondent data records, we say that the data collection process is privacy preserving if it allows the data collector to obtain a k-anonymized or l-diversified version of T without revealing the original records to the adversary. We propose a distributed data collection protocol that outputs an anonymized table by generalization of quasi-identifier attributes. The protocol employs cryptographic techniques such as homomorphic encryption, private information retrieval and secure multiparty computation to ensure the privacy goal in the process of data collection. Meanwhile, the protocol is designed to leak limited but non-critical information to achieve practicability and efficiency. Experiments show that the utility of the anonymized table derived by our protocol is on par with the utility achieved by traditional anonymization techniques. © 2011 Springer-Verlag.
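
    The homomorphic-encryption building block mentioned in the abstract can be illustrated with additively homomorphic Paillier encryption, assuming the third-party python-paillier (phe) package is installed: respondents encrypt their values under the collector's public key, ciphertexts are combined, and only the aggregate is decrypted. This shows the primitive only, not the protocol's anonymization, private information retrieval, or multiparty steps.

      # Assumes the third-party python-paillier package:  pip install phe
      from phe import paillier

      public_key, private_key = paillier.generate_paillier_keypair(n_length=1024)

      # Each respondent encrypts a sensitive bit (e.g., 1 = has the condition).
      ciphertexts = [public_key.encrypt(v) for v in [1, 0, 1, 1, 0]]

      # Ciphertexts add homomorphically; no individual value is ever decrypted.
      encrypted_total = ciphertexts[0]
      for c in ciphertexts[1:]:
          encrypted_total = encrypted_total + c

      print(private_key.decrypt(encrypted_total))   # 3 -- only the aggregate is revealed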

  18. Privacy is an essentially contested concept: a multi-dimensional analytic for mapping privacy

    Science.gov (United States)

    Koopman, Colin; Doty, Nick

    2016-01-01

    The meaning of privacy has been much disputed throughout its history in response to wave after wave of new technological capabilities and social configurations. The current round of disputes over privacy fuelled by data science has been a cause of despair for many commentators and a death knell for privacy itself for others. We argue that privacy’s disputes are neither an accidental feature of the concept nor a lamentable condition of its applicability. Privacy is essentially contested. Because it is, privacy is transformable according to changing technological and social conditions. To make productive use of privacy’s essential contestability, we argue for a new approach to privacy research and practical design, focused on the development of conceptual analytics that facilitate dissecting privacy’s multiple uses across multiple contexts. This article is part of the themed issue ‘The ethical impact of data science’. PMID:28336797

  19. Factors and Predictors of Online Security and Privacy Behavior

    Directory of Open Access Journals (Sweden)

    Goran Bubaš

    2008-12-01

    Full Text Available Assumptions and habits regarding computer and Internet use are among the major factors which influence online privacy and security of Internet users. In our study a survey was performed on 312 subjects (college students who are Internet users with IT skills) that investigated how assumptions and habits of Internet users are related to their online security and privacy. The following four factors of online security and privacy related behaviors were revealed in factor analysis: F1 – conscientiousness in the maintenance of the operating system, upgrading of the Internet browser and use of antivirus and antispyware programs; F2 – engagement in risky and careless online activities with lack of concern for personal online privacy; F3 – disbelief that privacy violations and security threats represent possible problems; F4 – lack of fear regarding potential privacy and security threats with no need for change in personal online behavior. Statistically significant correlations were found between some of the discovered factors on the one side, and criteria variables occurrence of malicious code (C1) and data loss on the home computer (C2) on the other. In addition, a regression analysis was performed which revealed that the potentially risky online behaviors of Internet users were associated with the two criteria variables. To properly interpret the results of correlation and regression analyses a conceptual model was developed of the potential causal relationships between the behavior of Internet users and their experiences with online security threats. An additional study was also performed which partly confirmed the conceptual model, as well as the factors of online security and privacy related behaviors.

  20. Deriving a Set of Privacy Specific Heuristics for the Assessment of PHRs (Personal Health Records).

    Science.gov (United States)

    Furano, Riccardo F; Kushniruk, Andre; Barnett, Jeff

    2017-01-01

    With the emergence of personal health record (PHR) platforms becoming more widely available, this research focused on the development of privacy heuristics to assess PHRs regarding privacy. Existing sets of heuristics are typically not application specific and do not address patient-centric privacy as a main concern prior to undergoing PHR procurement. A set of privacy specific heuristics were developed based on a scoping review of the literature. An internet-based commercially available, vendor specific PHR application was evaluated using the derived set of privacy specific heuristics. The proposed set of privacy specific derived heuristics is explored in detail in relation to ISO 29100. The assessment of the internet-based commercially available, vendor specific PHR application indicated numerous violations. These violations were noted within the study. It is argued that the new derived privacy heuristics should be used in addition to Nielsen's well-established set of heuristics. Privacy specific heuristics could be used to assess PHR portal system-level privacy mechanisms in the procurement process of a PHR application and may prove to be a beneficial form of assessment to prevent the selection of a PHR platform with a poor privacy specific interface design.

  1. Towards Territorial Privacy in Smart Environments

    NARCIS (Netherlands)

    Könings, Bastian; Schaub, Florian; Weber, M.; Kargl, Frank

    Territorial privacy is an old concept for privacy of the personal space dating back to the 19th century. Despite its former relevance, territorial privacy has been neglected in recent years, while privacy research and legislation mainly focused on the issue of information privacy. However, with the

  2. Privacy-preserving clinical decision support system using Gaussian kernel-based classification.

    Science.gov (United States)

    Rahulamathavan, Yogachandran; Veluru, Suresh; Phan, Raphael C-W; Chambers, Jonathon A; Rajarajan, Muttukrishnan

    2014-01-01

    A clinical decision support system forms a critical capability to link health observations with health knowledge to influence choices by clinicians for improved healthcare. Recent trends toward remote outsourcing can be exploited to provide efficient and accurate clinical decision support in healthcare. In this scenario, clinicians can use the health knowledge located in remote servers via the Internet to diagnose their patients. However, the fact that these servers are third party and therefore potentially not fully trusted raises possible privacy concerns. In this paper, we propose a novel privacy-preserving protocol for a clinical decision support system where the patients' data always remain in an encrypted form during the diagnosis process. Hence, the server involved in the diagnosis process is not able to learn any extra knowledge about the patient's data and results. Our experimental results on popular medical datasets from UCI-database demonstrate that the accuracy of the proposed protocol is up to 97.21% and the privacy of patient data is not compromised.
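
    The classifier at the core of the protocol, a Gaussian (RBF) kernel classifier, can be sketched in the clear; the paper's contribution, evaluating it over encrypted patient data, is not reproduced here. The features, labels, and gamma below are hypothetical.

      import numpy as np

      def rbf_kernel(x, z, gamma=0.5):
          return np.exp(-gamma * np.sum((np.asarray(x) - np.asarray(z)) ** 2))

      def classify(query, training_points, labels, gamma=0.5):
          """Kernel-weighted vote: each training point supports its label with a
          weight equal to its Gaussian-kernel similarity to the query."""
          scores = {}
          for point, label in zip(training_points, labels):
              scores[label] = scores.get(label, 0.0) + rbf_kernel(query, point, gamma)
          return max(scores, key=scores.get)

      X = [[5.1, 3.5], [4.9, 3.0], [6.7, 3.1], [6.3, 2.9]]   # hypothetical feature vectors
      y = ["healthy", "healthy", "at-risk", "at-risk"]
      print(classify([6.5, 3.0], X, y))                      # 'at-risk'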

  3. Privacy and Library Records

    Science.gov (United States)

    Bowers, Stacey L.

    2006-01-01

    This paper summarizes the history of privacy as it relates to library records. It commences with a discussion of how the concept of privacy first originated through case law and follows the concept of privacy as it has affected library records through current day and the "USA PATRIOT Act."

  4. Using medical history embedded in biometrics medical card for user identity authentication: privacy preserving authentication model by features matching.

    Science.gov (United States)

    Fong, Simon; Zhuang, Yan

    2012-01-01

    Many forms of biometrics have been proposed and studied for biometrics authentication. Recently researchers are looking into longitudinal pattern matching that is based on more than just a single biometric; data from user's activities are used to characterise the identity of a user. In this paper we advocate a novel type of authentication by using a user's medical history which can be electronically stored in a biometric security card. This is a sequel paper from our previous work about defining an abstract format of medical data to be queried and tested upon authentication. The challenge to overcome is preserving the user's privacy by choosing only the useful features from the medical data for use in authentication. The features should contain less sensitive elements and they are implicitly related to the target illness. Therefore exchanging questions and answers about a few carefully chosen features in an open channel would not easily or directly expose the illness, but yet it can verify by inference whether the user has a record of it stored in his smart card. The design of a privacy preserving model by backward inference is introduced in this paper. Some live medical data are used in experiments for validation and demonstration.

  5. Using Medical History Embedded in Biometrics Medical Card for User Identity Authentication: Privacy Preserving Authentication Model by Features Matching

    Directory of Open Access Journals (Sweden)

    Simon Fong

    2012-01-01

    Full Text Available Many forms of biometrics have been proposed and studied for biometrics authentication. Recently researchers are looking into longitudinal pattern matching that is based on more than just a single biometric; data from user’s activities are used to characterise the identity of a user. In this paper we advocate a novel type of authentication by using a user’s medical history which can be electronically stored in a biometric security card. This is a sequel paper from our previous work about defining an abstract format of medical data to be queried and tested upon authentication. The challenge to overcome is preserving the user’s privacy by choosing only the useful features from the medical data for use in authentication. The features should contain less sensitive elements and they are implicitly related to the target illness. Therefore exchanging questions and answers about a few carefully chosen features in an open channel would not easily or directly expose the illness, but yet it can verify by inference whether the user has a record of it stored in his smart card. The design of a privacy preserving model by backward inference is introduced in this paper. Some live medical data are used in experiments for validation and demonstration.

  6. Designing Privacy-by-Design

    NARCIS (Netherlands)

    Rest, J.H.C. van; Boonstra, D.; Everts, M.H.; Rijn, M. van; Paassen, R.J.G. van

    2014-01-01

    The proposal for a new privacy regulation d.d. January 25th 2012 introduces sanctions of up to 2% of the annual turnover of enterprises. This elevates the importance of mitigation of privacy risks. This paper makes Privacy by Design more concrete, and positions it as the mechanism to mitigate these

  7. When Differential Privacy Meets Randomized Perturbation: A Hybrid Approach for Privacy-Preserving Recommender System

    KAUST Repository

    Liu, Xiao; Liu, An; Zhang, Xiangliang; Li, Zhixu; Liu, Guanfeng; Zhao, Lei; Zhou, Xiaofang

    2017-01-01

    result. However, none is designed for both hiding users’ private data and preventing privacy inference. To achieve this goal, we propose in this paper a hybrid approach for privacy-preserving recommender systems by combining differential privacy (DP

  8. Recommendations for the Sharing Economy: Safeguarding Privacy

    NARCIS (Netherlands)

    Ranzini, G.; Kusber, Nina; Vermeulen, I.E.; Etter, Michael

    2018-01-01

    This report, ‘Recommendations: Privacy’, forms one element of a European Union Horizon 2020 Research Project on the sharing economy: Ps2Share ‘Participation, Privacy, and Power in the Sharing Economy’. The study is undertaken within the scope of the European Union’s Horizon 2020 research and

  9. A standardised graphic method for describing data privacy frameworks in primary care research using a flexible zone model.

    NARCIS (Netherlands)

    Kuchinke, W.; Ohmann, C.; Verheij, R.A.; Veen, E.B. van; Arvanitis, T.N.; Taweel, A.; Delaney, B.C.

    2014-01-01

    Purpose: To develop a model describing core concepts and principles of data flow, data privacy and confidentiality, in a simple and flexible way, using concise process descriptions and a diagrammatic notation applied to research workflow processes. The model should help to generate robust data

  10. New Collaborative Filtering Algorithms Based on SVD++ and Differential Privacy

    Directory of Open Access Journals (Sweden)

    Zhengzheng Xian

    2017-01-01

    Full Text Available Collaborative filtering technology has been widely used in the recommender system, and its implementation is supported by the large amount of real and reliable user data from the big-data era. However, with the increase of the users’ information-security awareness, these data are reduced or the quality of the data becomes worse. Singular Value Decomposition (SVD) is one of the common matrix factorization methods used in collaborative filtering, which introduces the bias information of users and items and is realized by using algebraic feature extraction. The derivative model SVD++ of SVD achieves better predictive accuracy due to the addition of implicit feedback information. Differential privacy is defined very strictly and can be proved, which has become an effective measure to solve the problem of attackers indirectly deducing the personal privacy information by using background knowledge. In this paper, differential privacy is applied to the SVD++ model through three approaches: gradient perturbation, objective-function perturbation, and output perturbation. Through theoretical derivation and experimental verification, the new algorithms proposed can better protect the privacy of the original data on the basis of ensuring the predictive accuracy. In addition, an effective scheme is given that can measure the privacy protection strength and predictive accuracy, and a reasonable range for selection of the differential privacy parameter is provided.
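
    The gradient-perturbation variant can be sketched on a plain matrix-factorization objective: each per-example gradient is clipped and Laplace noise is added before the update. The SVD++ bias and implicit-feedback terms are omitted, the noise calibration below is deliberately simplified (a real scheme must account for composition across all updates), and the ratings are made up.

      import numpy as np

      def dp_mf(ratings, n_users, n_items, k=2, epochs=20, lr=0.05,
                epsilon=1.0, clip=1.0, seed=0):
          """Matrix factorization trained with gradient perturbation: per-example
          gradients are clipped to `clip`, then Laplace noise is added."""
          rng = np.random.default_rng(seed)
          P = rng.normal(0, 0.1, (n_users, k))
          Q = rng.normal(0, 0.1, (n_items, k))
          scale = 2 * clip / epsilon        # simplified; ignores composition over updates
          for _ in range(epochs):
              for u, i, r in ratings:
                  err = r - P[u] @ Q[i]
                  g_p, g_q = -err * Q[i], -err * P[u]
                  for g in (g_p, g_q):      # clip each gradient to bound sensitivity
                      norm = np.linalg.norm(g)
                      if norm > clip:
                          g *= clip / norm
                  P[u] -= lr * (g_p + rng.laplace(0, scale, k))
                  Q[i] -= lr * (g_q + rng.laplace(0, scale, k))
          return P, Q

      ratings = [(0, 0, 5), (0, 1, 3), (1, 0, 4), (1, 2, 1)]   # hypothetical (user, item, rating)
      P, Q = dp_mf(ratings, n_users=2, n_items=3)
      print(round(float(P[0] @ Q[0]), 3))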

  11. 76 FR 65196 - Privacy Act of 1974; Report of a New Routine Use for Selected CMS System of Records

    Science.gov (United States)

    2011-10-20

    ... and privacy requirements included. A Data Use Agreement (DUA) (CMS Form 0235) must be completed by the...: CMS Privacy Officer, Division of Information Security & Privacy Management, Enterprise Architecture... requirements that she may specify for QEs to meet, such as ensuring the security of data. The Medicare claims...

  12. Trust information-based privacy architecture for ubiquitous health.

    Science.gov (United States)

    Ruotsalainen, Pekka Sakari; Blobel, Bernd; Seppälä, Antto; Nykänen, Pirkko

    2013-10-08

    Ubiquitous health is defined as a dynamic network of interconnected systems that offers health services independent of time and location to a data subject (DS). The network takes place in open and unsecure information space. It is created and managed by the DS who sets rules that regulate the way personal health information is collected and used. Compared to health care, it is impossible in ubiquitous health to assume the existence of a priori trust between the DS and service providers and to produce privacy using static security services. In ubiquitous health, the features, business goals, and regulations of the systems involved often remain unknown. Furthermore, health care-specific regulations do not rule the ways health data is processed and shared. To be successful, ubiquitous health requires a novel privacy architecture. The goal of this study was to develop a privacy management architecture that helps the DS to create and dynamically manage the network and to maintain information privacy. The architecture should enable the DS to dynamically define service and system-specific rules that regulate the way subject data is processed. The architecture should provide the DS with reliable trust information about systems and assist in the formulation of privacy policies. Furthermore, the architecture should give feedback on how systems follow the policies of the DS and offer protection against privacy and trust threats existing in ubiquitous environments. A sequential method that combines methodologies used in system theory, systems engineering, requirement analysis, and system design was used in the study. In the first phase, principles, trust and privacy models, and viewpoints were selected. Thereafter, functional requirements and services were developed on the basis of a careful analysis of existing research published in journals and conference proceedings. Based on principles, models, and requirements, architectural components and their interconnections were developed using system

  13. Privacy information management for video surveillance

    Science.gov (United States)

    Luo, Ying; Cheung, Sen-ching S.

    2013-05-01

    The widespread deployment of surveillance cameras has raised serious privacy concerns. Many privacy-enhancing schemes have been proposed to automatically redact images of trusted individuals in the surveillance video. To identify these individuals for protection, the most reliable approach is to use biometric signals such as iris patterns as they are immutable and highly discriminative. In this paper, we propose a privacy data management system to be used in a privacy-aware video surveillance system. The privacy status of a subject is anonymously determined based on her iris pattern. For a trusted subject, the surveillance video is redacted and the original imagery is considered to be the privacy information. Our proposed system allows a subject to access her privacy information via the same biometric signal for privacy status determination. Two secure protocols, one for privacy information encryption and the other for privacy information retrieval are proposed. Error control coding is used to cope with the variability in iris patterns and efficient implementation is achieved using surrogate data records. Experimental results on a public iris biometric database demonstrate the validity of our framework.

  14. Technical Privacy Metrics: a Systematic Survey

    OpenAIRE

    Wagner, Isabel; Eckhoff, David

    2018-01-01

    The file attached to this record is the author's final peer-reviewed version. The goal of privacy metrics is to measure the degree of privacy enjoyed by users in a system and the amount of protection offered by privacy-enhancing technologies. In this way, privacy metrics contribute to improving user privacy in the digital world. The diversity and complexity of privacy metrics in the literature make an informed choice of metrics challenging. As a result, instead of using existing metrics, n...

  15. Advanced research in data privacy

    CERN Document Server

    Torra, Vicenç

    2015-01-01

    This book provides an overview of the research work on data privacy and privacy enhancing technologies carried out by the participants of the ARES project. ARES (Advanced Research in Privacy and Security, CSD2007-00004) has been one of the most important research projects funded by the Spanish Government in the fields of computer security and privacy. It is part of the now extinct CONSOLIDER INGENIO 2010 program, a highly competitive program which aimed to advance knowledge and open new research lines among top Spanish research groups. The project started in 2007 and will finish in 2014. Composed of 6 research groups from 6 different institutions, it has gathered an important number of researchers during its lifetime. Among the work produced by the ARES project, one specific work package has been related to privacy. This book gathers works produced by members of the project related to data privacy and privacy enhancing technologies. The presented works not only summarize important research carried in the proje...

  16. Internet privacy options for adequate realisation

    CERN Document Server

    2013-01-01

    A thorough multidisciplinary analysis of various perspectives on internet privacy was published as the first volume of a study revealing the results of the acatech project "Internet Privacy - A Culture of Privacy and Trust on the Internet." The second publication from this project presents integrated, interdisciplinary options for improving privacy on the Internet, utilising a normative, value-oriented approach. The ways in which privacy promotes and preconditions fundamental societal values, and how privacy violations endanger the flourishing of said values, are exemplified. The conditions which must be fulfilled in order to achieve a culture of privacy and trust on the internet are illuminated. This volume presents options for policy-makers, educators, businesses and technology experts on how to facilitate solutions for more privacy on the Internet and identifies further research requirements in this area.

  17. Cognitive Privacy for Personal Clouds

    Directory of Open Access Journals (Sweden)

    Milena Radenkovic

    2016-01-01

    Full Text Available This paper proposes a novel Cognitive Privacy (CogPriv) framework that improves the privacy of data sharing between Personal Clouds for different application types and across heterogeneous networks. Depending on the behaviour of neighbouring network nodes, their estimated privacy levels, resource availability, and social network connectivity, each Personal Cloud may decide to use a different transmission network for different types of data and privacy requirements. CogPriv is fully distributed, uses complex graph contact analytics and multiple novel implicit heuristics, and combines these with smart probing to identify the presence and behaviour of privacy-compromising nodes in the network. Based on sensed local context and through cooperation with remote nodes in the network, CogPriv is able to transparently and on-the-fly change the transmission network in order to avoid transmissions when privacy may be compromised. We show that CogPriv achieves higher end-to-end privacy levels compared to both noncognitive cellular network communication and state-of-the-art strategies based on privacy-aware adaptive social mobile network routing, for a range of experiment scenarios based on real-world user and network traces. CogPriv is able to adapt to varying network connectivity and maintain high quality of service while keeping data exposure low across a wide range of privacy leakage levels in the infrastructure.

  18. The Privacy Jungle:On the Market for Data Protection in Social Networks

    Science.gov (United States)

    Bonneau, Joseph; Preibusch, Sören

    We have conducted the first thorough analysis of the market for privacy practices and policies in online social networks. From an evaluation of 45 social networking sites using 260 criteria, we find that many popular assumptions regarding privacy and social networking need to be revisited when considering the entire ecosystem instead of only a handful of well-known sites. Contrary to the common perception of an oligopolistic market, we find evidence of vigorous competition for new users. Despite observing many poor security practices, there is evidence that social network providers are making efforts to implement privacy-enhancing technologies, with substantial diversity in the amount of privacy control offered. However, privacy is rarely used as a selling point, and even then only as an auxiliary, non-decisive feature. Sites also failed to promote their existing privacy controls within the site. We similarly found great diversity in the length and content of formal privacy policies, but found an opposite promotional trend: though almost all policies are inaccessible to ordinary users due to obfuscating legal jargon, they conspicuously vaunt the sites' privacy practices. We conclude that the market for privacy in social networks is dysfunctional in that there is significant variation in sites' privacy controls, data collection requirements, and legal privacy policies, but this is not effectively conveyed to users. Our empirical findings motivate us to introduce the novel model of a privacy communication game, where the economically rational choice for a site operator is to make privacy control available in order to evade criticism from privacy fundamentalists, while hiding the privacy control interface and privacy policy to maximize sign-up numbers and encourage data sharing from the pragmatic majority of users.

  19. Cybersecurity and Privacy

    DEFF Research Database (Denmark)

    The huge potential in future connected services has as a precondition that privacy and security needs are dealt with in order for new services to be accepted. This issue is increasingly on the agenda both at the company and at the individual level. Cybersecurity and Privacy – bridging the gap addresses two very complex fields of the digital world, i.e., Cybersecurity and Privacy. These multifaceted, multidisciplinary and complex issues are usually understood and valued differently by different individuals, data holders and legal bodies. But a change in one field immediately affects the others. Policies, frameworks, strategies, laws, tools, techniques, and technologies – all of these are tightly interwoven when it comes to security and privacy. This book is another attempt to bridge the gap between industry and academia. The book addresses the views from academia and industry on the subject...

  20. Privacy for Sale?

    DEFF Research Database (Denmark)

    Sørensen, Lene Tolstrup; Sørensen, Jannick Kirk; Khajuria, Samant

    Data brokers have become central players in the online collection of private user data. Data brokers' activities are, however, not very transparent or even known by users. Many users regard privacy as a central element when they use online services. Based on 12 short interviews with users, this paper analyses how users perceive the concept of online privacy with respect to data brokers' collection of private data, and particularly novel services that offer users the possibility to sell their private data. Two groups of users are identified: those who are considering selling their data under specific conditions, and those who reject the idea completely. Based on the literature we identify two positions on privacy: either as an instrumental good, or as an intrinsic good. The paper positions various user perceptions on privacy that are relevant for future service development.

  1. Perceived Speech Privacy in Computer Simulated Open-plan Offices

    DEFF Research Database (Denmark)

    Pop, Claudiu B.; Rindel, Jens Holger

    2005-01-01

    In open plan offices the lack of speech privacy between workstations is one of the major acoustic problems. Improving the speech privacy in an open plan design is therefore the main concern for a successful open plan environment. The project described in this paper aimed to find an objective parameter that correlates well with the perceived degree of speech privacy and to derive a clear method for evaluating the acoustic conditions in open plan offices. Acoustic measurements were carried out in an open plan office, followed by data analysis at the Acoustic Department, DTU. A computer model...

  2. Sex Differences in Attitudes towards Online Privacy and Anonymity among Israeli Students with Different Technical Backgrounds

    Science.gov (United States)

    Weinberger, Maor; Zhitomirsky-Geffet, Maayan; Bouhnik, Dan

    2017-01-01

    Introduction: In this exploratory study, we proposed an experimental framework to investigate and model male/female differences in attitudes towards online privacy and anonymity among Israeli students. Our aim was to comparatively model men and women's online privacy attitudes, and to assess the online privacy gender gap. Method: Various factors…

  3. A secure data privacy preservation for on-demand

    Directory of Open Access Journals (Sweden)

    Dhasarathan Chandramohan

    2017-04-01

    Full Text Available This paper spotlights privacy and the obfuscation issues surrounding intellectual, confidential information owned by the insurance and finance sectors. Privacy is at risk in the business era when authorities misuse secret information or when software intrusions steal digital data in the name of third-party services. Liability for digital secrecy, business-continuity isolation, and the prevention of mishandling that breaches privacy are scrupulous concerns in the cloud, where a huge amount of data is stored and maintained. In this developing IT world moving toward the cloud, users' privacy protection is becoming a big question. Although cloud computing has changed the computing field by increasing the effectiveness, efficiency and optimization of the service environment, cloud users' data and their identity, reliability, maintainability and privacy may vary for different CPs (cloud providers). The CP must ensure that the user's proprietary information is maintained secretly with current technologies. More remarkably, even the cloud provider may have no insight into the information and digital data stored and maintained globally anywhere in the cloud. The proposed system addresses one of the obligatory research issues in cloud computing. We come forward by proposing the Privacy Preserving Model to Prevent Digital Data Loss in the Cloud (PPM–DDLC). This proposal helps the CR (cloud requester/user) to trust that their proprietary information and data stored in the cloud remain protected.

  4. Safeguarding donors' personal rights and biobank autonomy in biobank networks: the CRIP privacy regime.

    Science.gov (United States)

    Schröder, Christina; Heidtke, Karsten R; Zacherl, Nikolaus; Zatloukal, Kurt; Taupitz, Jochen

    2011-08-01

    Governance, underlying general ICT (Information and Communication Technology) architecture, and workflow of the Central Research Infrastructure for molecular Pathology (CRIP) are discussed as a model enabling biobank networks to form operational "meta biobanks" whilst respecting the donors' privacy, biobank autonomy and confidentiality, and the researchers' needs for appropriate biospecimens and information, as well as confidentiality. Tailored to these needs, CRIP efficiently accelerates and facilitates research with human biospecimens and data.

  5. VOIP for Telerehabilitation: A Risk Analysis for Privacy, Security and HIPAA Compliance: Part II.

    Science.gov (United States)

    Watzlaf, Valerie J M; Moeini, Sohrab; Matusow, Laura; Firouzan, Patti

    2011-01-01

    In a previous publication the authors developed a privacy and security checklist to evaluate Voice over Internet Protocol (VoIP) videoconferencing software used between patients and therapists to provide telerehabilitation (TR) therapy. In this paper, the previously developed privacy and security checklist is used to perform a risk analysis of the top ten VoIP videoconferencing software packages to determine whether their policies provide answers to the checklist. Sixty percent of the companies claimed they do not listen in on video-therapy calls unless maintenance is needed. Only 50% of the companies assessed use some form of encryption, and some did not specify what type of encryption was used. Seventy percent of the companies assessed did not specify any form of auditing on their servers. Statistically significant differences across company websites were found for sharing information outside of the country (p=0.010), encryption (p=0.006), and security evaluation (p=0.005). Healthcare providers considering VoIP software for TR services may wish to apply this privacy and security checklist before deciding to incorporate a VoIP software system. Other videoconferencing software that is specific to TR, with strong encryption, good access controls, and hardware that meets privacy and security standards, should be considered for use with TR.

  6. Location Privacy in RFID Applications

    Science.gov (United States)

    Sadeghi, Ahmad-Reza; Visconti, Ivan; Wachsmann, Christian

    RFID-enabled systems allow fully automatic wireless identification of objects and are rapidly becoming a pervasive technology with various applications. However, despite their benefits, RFID-based systems also pose challenging risks, in particular concerning user privacy. Indeed, improvident use of RFID can disclose sensitive information about users and their locations, allowing detailed user profiles. Hence, it is crucial to identify and to enforce appropriate security and privacy requirements of RFID applications (that are also compliant with legislation). This chapter first discusses security and privacy requirements for RFID-enabled systems, focusing in particular on location privacy issues. Then it explores the advances in RFID applications, stressing the security and privacy shortcomings of existing proposals. Finally, it presents new promising directions for privacy-preserving RFID systems, where, as a case study, we focus on electronic tickets (e-tickets) for public transportation.

  7. Privacy and Data-Based Research

    OpenAIRE

    Ori Heffetz; Katrina Ligett

    2013-01-01

    What can we, as users of microdata, formally guarantee to the individuals (or firms) in our dataset, regarding their privacy? We retell a few stories, well-known in data-privacy circles, of failed anonymization attempts in publicly released datasets. We then provide a mostly informal introduction to several ideas from the literature on differential privacy, an active literature in computer science that studies formal approaches to preserving the privacy of individuals in statistical databases...

  8. A Randomized Response Model For Privacy Preserving Smart Metering

    Science.gov (United States)

    Cui, Lijuan; Que, Jialan; Choi, Dae-Hyun; Jiang, Xiaoqian; Cheng, Samuel; Xie, Le

    2012-01-01

    The adoption of smart meters may bring new privacy concerns to the general public. Given the fact that metering data of individual homes/factories is accumulated every 15 minutes, it is possible to infer the pattern of electricity consumption of individual users. In order to protect the privacy of users in a completely de-centralized setting (i.e., individuals do not communicate with one another), we propose a novel protocol, which allows individual meters to report the true electricity consumption reading with a pre-determined probability. Load serving entities (LSEs) can reconstruct the total electricity consumption of a region or a district through an inference algorithm, but their ability to identify individual users' energy consumption patterns is significantly reduced. Using simulated data, we verify the feasibility of the proposed method and demonstrate performance advantages over existing approaches. PMID:23243488
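
    The reporting scheme described above lends itself to a short sketch: each meter reports its true reading with probability p and an unrelated decoy value otherwise, and the aggregator inverts the known reporting distribution to recover the regional total. The probability p, the reading range, and the number of meters below are illustrative assumptions, not values from the paper.

```python
# Toy randomized-reporting sketch for smart metering: true reading with
# probability p, a decoy drawn from a known range otherwise; the load serving
# entity then forms an unbiased estimate of the total consumption.
import numpy as np

rng = np.random.default_rng(1)
true_readings = rng.uniform(0.0, 5.0, size=10_000)   # kWh per 15-minute interval
p, low, high = 0.7, 0.0, 5.0
mu_decoy = (low + high) / 2.0

keep = rng.random(true_readings.size) < p
decoys = rng.uniform(low, high, size=true_readings.size)
reports = np.where(keep, true_readings, decoys)

# E[report] = p * true + (1 - p) * mu_decoy, so invert that relation in aggregate.
estimated_total = (reports.sum() - (1 - p) * mu_decoy * reports.size) / p
print(f"true total {true_readings.sum():.1f}  estimated total {estimated_total:.1f}")
```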

  9. Public privacy: Reciprocity and Silence

    Directory of Open Access Journals (Sweden)

    Jenny Kennedy

    2014-10-01

    Full Text Available In his 1958 poem 'Dedication to my Wife' TS Eliot proclaims "these are private words addressed to you in public". Simultaneously written for his wife, Valerie Fletcher, and to the implied you of a discourse network, Eliot's poem helps to illustrate the narrative voices and silences that are constitutive of an intimate public sphere. This paper situates reciprocity as a condition of possibility for public privacy. It shows how reciprocity is enabled by systems of code operating through material and symbolic registers. Code promises to control communication, to produce neutral, systemic forms of meaning. Yet such automation is challenged by uneven and fragmented patterns of reciprocity. Moreover, examining the media of public privacy reveals historical trajectories important for understanding contemporary socio-technical platforms of reciprocity. To explore the implicit requirement of reciprocity in publicly private practices, three sites of communication are investigated framed by a media archaeology perspective: postal networks, the mail-art project PostSecret and the anonymous zine 'You'.

  10. Interpersonal Privacy Management in Distributed Collaboration: Situational Characteristics and Interpretive Influences

    Science.gov (United States)

    Patil, Sameer; Kobsa, Alfred; John, Ajita; Brotman, Lynne S.; Seligmann, Doree

    To understand how collaborators reconcile the often conflicting needs of awareness and privacy, we studied a large software development project in a multinational corporation involving individuals at sites in the U.S. and India. We present a theoretical framework describing privacy management practices and their determinants that emerged from field visits, interviews, and questionnaire responses. The framework identifies five relevant situational characteristics: issue(s) under consideration, physical place(s) involved in interaction(s), temporal aspects, affordances and limitations presented by technology, and nature of relationships among parties. Each actor, in turn, interprets the situation based on several simultaneous influences: self, team, work site, organization, and cultural environment. This interpretation guides privacy management action(s). Past actions form a feedback loop refining and/or reinforcing the interpretive influences. The framework suggests that effective support for privacy management will require that designers follow a socio-technical approach incorporating a wider scope of situational and interpretive differences.

  11. 76 FR 59073 - Privacy Act

    Science.gov (United States)

    2011-09-23

    ... CENTRAL INTELLIGENCE AGENCY 32 CFR Part 1901 Privacy Act AGENCY: Central Intelligence Agency. ACTION: Proposed rule. SUMMARY: Consistent with the Privacy Act (PA), the Central Intelligence Agency...-1379. SUPPLEMENTARY INFORMATION: Consistent with the Privacy Act (PA), the CIA has undertaken and...

  12. An Efficient Context-Aware Privacy Preserving Approach for Smartphones

    Directory of Open Access Journals (Sweden)

    Lichen Zhang

    2017-01-01

    Full Text Available With the proliferation of smartphones and smartphone apps, privacy preservation has become an important issue. Existing privacy preservation approaches for smartphones are usually inefficient because they do not consider active defense policies or the temporal correlations between contexts related to users. In this paper, by modeling the temporal correlations among contexts, we formalize the privacy preservation problem as an optimization problem and prove its correctness and optimality through theoretical analysis. To further reduce the running time, we transform the original optimization problem into an approximate problem, a linear programming problem. By solving the linear programming problem, an efficient context-aware privacy preserving algorithm (CAPP) is designed, which adopts an active defense policy and decides how to release the current context of a user so as to maximize the quality of service (QoS) of context-aware apps while preserving privacy. Extensive simulations conducted on a real dataset demonstrate the improved performance of CAPP over traditional approaches.
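
    To make the linear-programming relaxation mentioned above concrete, here is a minimal sketch: release probabilities for a handful of contexts are chosen to maximize expected QoS subject to a privacy-cost budget. The QoS gains, privacy costs, and budget are made-up numbers, and the paper's actual constraint set (temporal correlations, the active defense policy) is much richer than this toy.

```python
# Minimal LP sketch: choose per-context release probabilities to maximize QoS
# under a privacy-cost budget. All numbers are illustrative assumptions.
import numpy as np
from scipy.optimize import linprog

qos_gain = np.array([3.0, 5.0, 2.0])      # QoS gained if context i is released
priv_cost = np.array([1.0, 4.0, 0.5])     # privacy cost of releasing context i
budget = 3.0                               # total privacy cost allowed

# linprog minimizes, so negate the objective to maximize expected QoS.
res = linprog(c=-qos_gain,
              A_ub=priv_cost.reshape(1, -1), b_ub=[budget],
              bounds=[(0.0, 1.0)] * 3, method="highs")
print("release probabilities:", res.x, " expected QoS:", -res.fun)
```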

  13. Attribute-Based Signcryption: Signer Privacy, Strong Unforgeability and IND-CCA Security in Adaptive-Predicates Model (Extended Version

    Directory of Open Access Journals (Sweden)

    Tapas Pandit

    2016-08-01

    Full Text Available Attribute-Based Signcryption (ABSC) is a natural extension of Attribute-Based Encryption (ABE) and Attribute-Based Signature (ABS), where one can have message confidentiality and authenticity together. Since signer privacy is captured in the security of ABS, it is quite natural to expect that signer privacy will also be preserved in ABSC. In this paper, we first propose an ABSC scheme which is weakly existentially unforgeable and IND-CCA secure in adaptive-predicates models and achieves signer privacy. Then, by applying a strongly unforgeable one-time signature (OTS), the above scheme is lifted to an ABSC scheme that attains strong existential unforgeability in the adaptive-predicates model. Both ABSC schemes are constructed on a common setup, i.e., the public parameters and keys are the same for both the encryption and signature modules. Our first construction is in the flavor of the CtE&S paradigm, except for one extra component that is computed using both signature components and ciphertext components. The second proposed construction follows a new paradigm (an extension of CtE&S), which we call "Commit then Encrypt and Sign then Sign" (CtE&StS). The last signature is generated using a strong OTS scheme. Since non-repudiation is achieved by the CtE&S paradigm, our systems achieve it as well.

  14. Protecting patron privacy

    CERN Document Server

    Beckstrom, Matthew

    2015-01-01

    In a world where almost anyone with computer savvy can hack, track, and record the online activities of others, your library can serve as a protected haven for visitors who rely on the Internet to conduct research, if you take the necessary steps to safeguard their privacy. This book shows you how to protect patrons' privacy while using the technology that your library provides, including public computers, Internet access, wireless networks, and other devices. Logically organized into two major sections, the first part of the book discusses why the privacy of your users is of paramount

  15. Bridging the transatlantic divide in privacy

    Directory of Open Access Journals (Sweden)

    Paula Kift

    2013-08-01

    Full Text Available In the context of the US National Security Agency surveillance scandal, the transatlantic privacy divide has come back to the fore. In the United States, the right to privacy is primarily understood as a right to physical privacy, thus the protection from unwarranted government searches and seizures. In Germany on the other hand, it is also understood as a right to spiritual privacy, thus the right of citizens to develop into autonomous moral agents. The following article will discuss the different constitutional assumptions that underlie American and German attitudes towards privacy, namely privacy as an aspect of liberty or as an aspect of dignity. As data flows defy jurisdictional boundaries, however, policymakers across the Atlantic are faced with a conundrum: how can German and American privacy cultures be reconciled?

  16. LEA in Private: A Privacy and Data Protection Framework for a Learning Analytics Toolbox

    Science.gov (United States)

    Steiner, Christina M.; Kickmeier-Rust, Michael D.; Albert, Dietrich

    2016-01-01

    To find a balance between learning analytics research and individual privacy, learning analytics initiatives need to appropriately address ethical, privacy, and data protection issues. A range of general guidelines, model codes, and principles for handling ethical issues and for appropriate data and privacy protection are available, which may…

  17. Privacy preserving interactive record linkage (PPIRL).

    Science.gov (United States)

    Kum, Hye-Chung; Krishnamurthy, Ashok; Machanavajjhala, Ashwin; Reiter, Michael K; Ahalt, Stanley

    2014-01-01

    Record linkage to integrate uncoordinated databases is critical in biomedical research using Big Data. Balancing privacy protection against the need for high-quality record linkage requires a human-machine hybrid system to safely manage uncertainty in the ever-changing streams of chaotic Big Data. In the computer science literature, private record linkage is the most published area. It investigates how to apply a known linkage function safely when linking two tables. However, in practice, the linkage function is rarely known. Thus, there are many data linkage centers whose main role is to be the trusted third party that determines the linkage function manually and links data for research via a master population list for a designated region. Recently, a more flexible computerized third-party linkage platform, Secure Decoupled Linkage (SDLink), has been proposed based on: (1) decoupling data via encryption; (2) obfuscation via chaffing (adding fake data) and universe manipulation; and (3) minimum information disclosure via recoding. We synthesize this literature to formalize a new framework for privacy-preserving interactive record linkage (PPIRL) with tractable privacy and utility properties and then analyze the literature using this framework. Human-based third-party linkage centers for privacy-preserving record linkage are the accepted norm internationally. We find that a computer-based third-party platform that can precisely control the information disclosed at the micro level and allow frequent human interaction during the linkage process is an effective human-machine hybrid system that significantly improves on the linkage center model in terms of both privacy and utility.
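
    The chaffing idea mentioned above (obfuscation by mixing fake records in with real ones before they are handed to a third party) can be illustrated with a tiny sketch. The record layout, the fake-record generator, and the chaff ratio below are invented for the example and are not part of the SDLink design.

```python
# Tiny chaffing illustration: real records are mixed with plausible fakes so
# that possession of the released table alone does not reveal who is in it.
import random
import string

random.seed(42)
real_records = [
    {"first": "ALICE", "last": "NG", "dob": "1980-02-11"},
    {"first": "BOB", "last": "ORTIZ", "dob": "1975-07-30"},
]

def fake_record():
    name = lambda n: "".join(random.choices(string.ascii_uppercase, k=n))
    dob = f"{random.randint(1940, 2005)}-{random.randint(1, 12):02d}-{random.randint(1, 28):02d}"
    return {"first": name(5), "last": name(6), "dob": dob}

chaff_ratio = 3                                    # three fakes per real record
table = real_records + [fake_record() for _ in range(chaff_ratio * len(real_records))]
random.shuffle(table)                              # order no longer reveals which rows are real
print(f"released table holds {len(table)} rows, only {len(real_records)} of them real")
```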

  18. Toward Privacy-Preserving Personalized Recommendation Services

    Directory of Open Access Journals (Sweden)

    Cong Wang

    2018-02-01

    Full Text Available Recommendation systems are crucially important for the delivery of personalized services to users. With personalized recommendation services, users can enjoy a variety of targeted recommendations such as movies, books, ads, restaurants, and more. In addition, personalized recommendation services have become extremely effective revenue drivers for online business. Despite the great benefits, deploying personalized recommendation services typically requires the collection of users' personal data for processing and analytics, which undesirably makes users susceptible to serious privacy violation issues. Therefore, it is of paramount importance to develop practical privacy-preserving techniques to maintain the intelligence of personalized recommendation services while respecting user privacy. In this paper, we provide a comprehensive survey of the literature related to personalized recommendation services with privacy protection. We present the general architecture of personalized recommendation systems, the privacy issues therein, and existing works that focus on privacy-preserving personalized recommendation services. We classify the existing works according to their underlying techniques for personalized recommendation and privacy protection, and thoroughly discuss and compare their merits and demerits, especially in terms of privacy and recommendation accuracy. We also identify some future research directions. Keywords: Privacy protection, Personalized recommendation services, Targeted delivery, Collaborative filtering, Machine learning

  19. Towards Privacy Managment of Information Systems

    OpenAIRE

    Drageide, Vidar

    2009-01-01

    This master's thesis provides insight into the concept of privacy. It argues why privacy is important, and why developers and system owners should keep privacy in mind when developing and maintaining systems containing personal information. Following this, a strategy for evaluating the overall level of privacy in a system is defined. The strategy is then applied to parts of the cellphone system in an attempt to evaluate the privacy of traffic and location data in this system.

  20. Adding query privacy to robust DHTs

    DEFF Research Database (Denmark)

    Backes, Michael; Goldberg, Ian; Kate, Aniket

    2012-01-01

    intermediate peers that (help to) route the queries towards their destinations. In this paper, we satisfy this requirement by presenting an approach for providing privacy for the keys in DHT queries. We use the concept of oblivious transfer (OT) in communication over DHTs to preserve query privacy without...... privacy over robust DHTs. Finally, we compare the performance of our privacy-preserving protocols with their more privacy-invasive counterparts. We observe that there is no increase in the message complexity...

  1. Privacy in an Ambient World

    NARCIS (Netherlands)

    Dekker, M.A.C.; Etalle, Sandro; den Hartog, Jeremy

    Privacy is a prime concern in today's information society. To protect the privacy of individuals, enterprises must follow certain privacy practices, while collecting or processing personal data. In this chapter we look at the setting where an enterprise collects private data on its website,

  2. Information Privacy Revealed

    Science.gov (United States)

    Lavagnino, Merri Beth

    2013-01-01

    Why is Information Privacy the focus of the January-February 2013 issue of "EDUCAUSE Review" and "EDUCAUSE Review Online"? Results from the 2012 annual survey of the International Association of Privacy Professionals (IAPP) indicate that "meeting regulatory compliance requirements continues to be the top perceived driver…

  3. Collecting and Analyzing Data from Smart Device Users with Local Differential Privacy

    OpenAIRE

    Nguyên, Thông T.; Xiao, Xiaokui; Yang, Yin; Hui, Siu Cheung; Shin, Hyejin; Shin, Junbum

    2016-01-01

    Organizations with a large user base, such as Samsung and Google, can potentially benefit from collecting and mining users' data. However, doing so raises privacy concerns, and risks accidental privacy breaches with serious consequences. Local differential privacy (LDP) techniques address this problem by only collecting randomized answers from each user, with guarantees of plausible deniability; meanwhile, the aggregator can still build accurate models and predictors by analyzing large amount...
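
    The simplest local-differential-privacy primitive behind the approach sketched above is binary randomized response: each user flips their true answer with a probability tied to epsilon, and the aggregator corrects for the known flipping rate to estimate the population frequency. The epsilon value and the simulated population below are assumptions for illustration only.

```python
# Binary randomized response under local differential privacy: per-user noisy
# reports, with an unbiased frequency estimate recovered by the aggregator.
import numpy as np

rng = np.random.default_rng(7)
epsilon = 1.0
p_truth = np.exp(epsilon) / (np.exp(epsilon) + 1.0)   # probability of answering truthfully

true_bits = rng.random(100_000) < 0.30                # 30% of users truly have the attribute
truthful = rng.random(true_bits.size) < p_truth
reported = np.where(truthful, true_bits, ~true_bits)

# Unbias: E[reported rate] = p*f + (1-p)*(1-f)  =>  f = (rate - (1-p)) / (2p - 1)
rate = reported.mean()
estimate = (rate - (1.0 - p_truth)) / (2.0 * p_truth - 1.0)
print(f"true frequency 0.300  LDP estimate {estimate:.3f}")
```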

  4. Privacy in social networking sites

    OpenAIRE

    Λεονάρδος, Γεώργιος; Leonardos, Giorgos

    2016-01-01

    The purpose of this study is to explore aspects of privacy in the use of social network web sites. More specifically, we will show the types of social networks, their privacy mechanisms, which differ across social network sites, and the privacy options that are offered to users. We will report some serious privacy violation incidents of the most popular social network sites such as Facebook, Twitter, and LinkedIn. Also, we will report some important surveys about social networks and pr...

  5. Location-Related Privacy in Geo-Social Networks

    DEFF Research Database (Denmark)

    Ruiz Vicente, Carmen; Freni, Dario; Bettini, Claudio

    2011-01-01

    -ins." However, this ability to reveal users' locations causes new privacy threats, which in turn call for new privacy-protection methods. The authors study four privacy aspects central to these social networks - location, absence, co-location, and identity privacy - and describe possible means of protecting...... privacy in these circumstances....

  6. Privacy and security perceptions of european citizens: A test of the trade-off model

    NARCIS (Netherlands)

    Friedewald, M.; Lieshout, M. van; Rung, S.; Ooms, M.; Ypma, J.

    2015-01-01

    This paper considers the relationship between privacy and security and, in particular, the traditional “trade-off” paradigm that argues that citizens might be willing to sacrifice some privacy for more security. Academics have long argued against the trade-off paradigm, but these arguments have

  7. PRIVACY AS A CULTURAL PHENOMENON

    Directory of Open Access Journals (Sweden)

    Garfield Benjamin

    2017-07-01

    Full Text Available Privacy remains both contentious and ever more pertinent in contemporary society. Yet it persists as an ill-defined term, not only within specific fields but in its various uses and implications between and across technical, legal and political contexts. This article offers a new critical review of the history of privacy in terms of two dominant strands of thinking: freedom and property. These two conceptions of privacy can be seen as successive historical epochs brought together under digital technologies, yielding increasingly complex socio-technical dilemmas. By simplifying the taxonomy to its socio-cultural function, the article provides a generalisable, interdisciplinary approach to privacy. Drawing on new technologies, historical trends, sociological studies and political philosophy, the article presents a discussion of the value of privacy as a term, before proposing a defense of the term cyber security as a mode of scalable cognitive privacy that integrates the relative needs of individuals, governments and corporations.

  8. Health information: reconciling personal privacy with the public good of human health.

    Science.gov (United States)

    Gostin, L O

    2001-01-01

    The success of the health care system depends on the accuracy, correctness and trustworthiness of the information, and the privacy rights of individuals to control the disclosure of personal information. A national policy on health informational privacy should be guided by ethical principles that respect individual autonomy while recognizing the important collective interests in the use of health information. At present there are no adequate laws or constitutional principles to help guide a rational privacy policy. The laws are scattered and fragmented across the states. Constitutional law is highly general, without important specific safeguards. Finally, a case study is provided showing the important trade-offs that exist between public health and privacy. For a model public health law, see www.critpath.org/msphpa/privacy.

  9. Physical factors that influence patients' privacy perception toward a psychiatric behavioral monitoring system: a qualitative study.

    Science.gov (United States)

    Zakaria, Nasriah; Ramli, Rusyaizila

    2018-01-01

    Psychiatric patients have privacy concerns when it comes to technology intervention in the hospital setting. In this paper, we present scenarios for psychiatric behavioral monitoring systems to be placed in psychiatric wards to understand patients' perception regarding privacy. Psychiatric behavioral monitoring refers to systems that are deemed useful in measuring clinical outcomes, but little research has been done on how these systems will impact patients' privacy. We conducted a case study in one teaching hospital in Malaysia. We investigated the physical factors that influence patients' perceived privacy with respect to a psychiatric monitoring system. The eight physical factors identified from the information system development privacy model, a comprehensive model for designing a privacy-sensitive information system, were adapted in this research. Scenario-based interviews were conducted with 25 patients in a psychiatric ward for 3 months. Psychiatric patients were able to share how physical factors influence their perception of privacy. Results show how patients responded to each of these dimensions in the context of a psychiatric behavioral monitoring system. Some subfactors under physical privacy are modified to reflect the data obtained in the interviews. We were able to capture the different physical factors that influence patient privacy.

  10. The Convergence of Virtual Reality and Social Networks: Threats to Privacy and Autonomy.

    Science.gov (United States)

    O'Brolcháin, Fiachra; Jacquemard, Tim; Monaghan, David; O'Connor, Noel; Novitzky, Peter; Gordijn, Bert

    2016-02-01

    The rapid evolution of information, communication and entertainment technologies will transform the lives of citizens and ultimately transform society. This paper focuses on ethical issues associated with the likely convergence of virtual realities (VR) and social networks (SNs), hereafter VRSNs. We examine a scenario in which a significant segment of the world's population has a presence in a VRSN. Given the pace of technological development and the popularity of these new forms of social interaction, this scenario is plausible. However, it brings with it ethical problems. Two central ethical issues are addressed: those of privacy and those of autonomy. VRSNs pose threats to both privacy and autonomy. The threats to privacy can be broadly categorized as threats to informational privacy, threats to physical privacy, and threats to associational privacy. Each of these threats is further subdivided. The threats to autonomy can be broadly categorized as threats to freedom, to knowledge and to authenticity. Again, these three threats are divided into subcategories. Having categorized the main threats posed by VRSNs, a number of recommendations are provided so that policy-makers, developers, and users can make the best possible use of VRSNs.

  11. 77 FR 31371 - Public Workshop: Privacy Compliance Workshop

    Science.gov (United States)

    2012-05-25

    ... presentations, including the privacy compliance fundamentals, privacy and data security, and the privacy... DEPARTMENT OF HOMELAND SECURITY Office of the Secretary Public Workshop: Privacy Compliance... Homeland Security Privacy Office will host a public workshop, ``Privacy Compliance Workshop.'' DATES: The...

  12. What was privacy?

    Science.gov (United States)

    McCreary, Lew

    2008-10-01

    Why is that question in the past tense? Because individuals can no longer feel confident that the details of their lives--from identifying numbers to cultural preferences--will be treated with discretion rather than exploited. Even as Facebook users happily share the names of their favorite books, movies, songs, and brands, they often regard marketers' use of that information as an invasion of privacy. In this wide-ranging essay, McCreary, a senior editor at HBR, examines numerous facets of the privacy issue, from Google searches, public shaming on the internet, and cell phone etiquette to passenger screening devices, public surveillance cameras, and corporate chief privacy officers. He notes that IBM has been a leader on privacy; its policy forswearing the use of employees' genetic information in hiring and benefits decisions predated the federal Genetic Information Nondiscrimination Act by three years. Now IBM is involved in an open-source project known as Higgins to provide users with transportable, potentially anonymous online presences. Craigslist, whose CEO calls it "as close to 100% user driven as you can get," has taken an extremely conservative position on privacy--perhaps easier for a company with a declared lack of interest in maximizing revenue. But TJX and other corporate victims of security breaches have discovered that retaining consumers' transaction information can be both costly and risky. Companies that underestimate the importance of privacy to their customers or fail to protect it may eventually face harsh regulation, reputational damage, or both. The best thing they can do, says the author, is negotiate directly with those customers over where to draw the line.

  13. Privacy Expectations in Online Contexts

    Science.gov (United States)

    Pure, Rebekah Abigail

    2013-01-01

    Advances in digital networked communication technology over the last two decades have brought the issue of personal privacy into sharper focus within contemporary public discourse. In this dissertation, I explain the Fourth Amendment and the role that privacy expectations play in the constitutional protection of personal privacy generally, and…

  14. Online Tracking Technologies and Web Privacy:Technologieën voor Online volgen en Web Privacy

    OpenAIRE

    Acar, Mustafa Gunes Can

    2017-01-01

    In my PhD thesis, I would like to study the problem of online privacy with a focus on Web and mobile applications. Key research questions to be addressed by my study are the following: How can we formalize and quantify web tracking? What are the threats presented against privacy by different tracking techniques such as browser fingerprinting and cookie based tracking? What kind of privacy enhancing technologies (PET) can be used to ensure privacy without degrading service quality? The stud...

  15. Preserving Differential Privacy for Similarity Measurement in Smart Environments

    Directory of Open Access Journals (Sweden)

    Kok-Seng Wong

    2014-01-01

    Full Text Available Advances in both sensor technologies and network infrastructures have encouraged the development of smart environments to enhance people's lives and living styles. However, collecting and storing users' data in smart environments poses severe privacy concerns because these data may contain sensitive information about the subject. Hence, privacy protection is now an emerging issue that we need to consider, especially when data sharing is essential for analysis purposes. In this paper, we consider the case where two agents in the smart environment want to measure the similarity of their collected or stored data. We use the similarity coefficient function FSC as the measurement metric for the comparison, under a differential privacy model. Unlike existing solutions, our protocol can facilitate more than one request to compute FSC without modification. Our solution ensures privacy protection for both the inputs and the computed FSC results.
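
    A minimal sketch of the general idea above: two agents' binary observation vectors are compared with a similarity coefficient, and Laplace noise is added before the value is shared. The FSC in the paper is a general similarity coefficient; the Jaccard coefficient, the epsilon value, and the loose sensitivity bound of 1 used below are simplifying assumptions, not the paper's protocol.

```python
# Releasing a similarity coefficient with Laplace noise (differential privacy).
import numpy as np

rng = np.random.default_rng(3)
a = rng.random(500) < 0.4          # agent A's binary sensor readings
b = rng.random(500) < 0.4          # agent B's binary sensor readings

jaccard = np.logical_and(a, b).sum() / np.logical_or(a, b).sum()

epsilon, sensitivity = 0.5, 1.0    # coefficient lies in [0, 1], so 1 is a safe (loose) bound
noisy = jaccard + rng.laplace(0.0, sensitivity / epsilon)
print(f"exact {jaccard:.3f}  released {noisy:.3f}")
```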

  16. 39 CFR 262.5 - Systems (Privacy).

    Science.gov (United States)

    2010-07-01

    ... 39 Postal Service 1 2010-07-01 2010-07-01 false Systems (Privacy). 262.5 Section 262.5 Postal... DEFINITIONS § 262.5 Systems (Privacy). (a) Privacy Act system of records. A Postal Service system containing... individual. (c) Computer matching program. A “matching program,” as defined in the Privacy Act, 5 U.S.C. 552a...

  17. Extending SQL to Support Privacy Policies

    Science.gov (United States)

    Ghazinour, Kambiz; Pun, Sampson; Majedi, Maryam; Chinaci, Amir H.; Barker, Ken

    Increasing concerns over Internet applications that violate user privacy by exploiting (back-end) database vulnerabilities must be addressed to protect both customer privacy and to ensure corporate strategic assets remain trustworthy. This chapter describes an extension onto database catalogues and Structured Query Language (SQL) for supporting privacy in Internet applications, such as in social networks, e-health, e-government, etc. The idea is to introduce new predicates to SQL commands to capture common privacy requirements, such as purpose, visibility, generalization, and retention for both mandatory and discretionary access control policies. The contribution is that corporations, when creating the underlying databases, will be able to define what their mandatory privacy policies are with which all application users have to comply. Furthermore, each application user, when providing their own data, will be able to define their own privacy policies with which other users have to comply. The extension is supported with underlying catalogues and algorithms. The experiments demonstrate a very reasonable overhead for the extension. The result is a low-cost mechanism to create new systems that are privacy aware and also to transform legacy databases to their privacy-preserving equivalents. Although the examples are from social networks, one can apply the results to data security and user privacy of other enterprises as well.

  18. Is Electronic Privacy Achievable?

    National Research Council Canada - National Science Library

    Irvine, Cynthia E; Levin, Timothy E

    2000-01-01

    ... individuals. The purpose of this panel was to focus on how new technologies are affecting privacy. Technologies that might adversely affect privacy were identified by Rein Turn at previous symposia...

  19. Technical and policy approaches to balancing patient privacy and data sharing in clinical and translational research.

    Science.gov (United States)

    Malin, Bradley; Karp, David; Scheuermann, Richard H

    2010-01-01

    Clinical researchers need to share data to support scientific validation and information reuse and to comply with a host of regulations and directives from funders. Various organizations are constructing informatics resources in the form of centralized databases to ensure reuse of data derived from sponsored research. The widespread use of such open databases is contingent on the protection of patient privacy. We review privacy-related problems associated with data sharing for clinical research from technical and policy perspectives. We investigate existing policies for secondary data sharing and privacy requirements in the context of data derived from research and clinical settings. In particular, we focus on policies specified by the US National Institutes of Health and the Health Insurance Portability and Accountability Act and touch on how these policies are related to current and future use of data stored in public database archives. We address aspects of data privacy and identifiability from a technical, although approachable, perspective and summarize how biomedical databanks can be exploited and seemingly anonymous records can be reidentified using various resources without hacking into secure computer systems. We highlight which clinical and translational data features, specified in emerging research models, are potentially vulnerable or exploitable. In the process, we recount a recent privacy-related concern associated with the publication of aggregate statistics from pooled genome-wide association studies that have had a significant impact on the data sharing policies of National Institutes of Health-sponsored databanks. Based on our analysis and observations we provide a list of recommendations that cover various technical, legal, and policy mechanisms that open clinical databases can adopt to strengthen data privacy protection as they move toward wider deployment and adoption.

  20. Adding Query Privacy to Robust DHTs

    DEFF Research Database (Denmark)

    Backes, Michael; Goldberg, Ian; Kate, Aniket

    2011-01-01

    intermediate peers that (help to) route the queries towards their destinations. In this paper, we satisfy this requirement by presenting an approach for providing privacy for the keys in DHT queries. We use the concept of oblivious transfer (OT) in communication over DHTs to preserve query privacy without...... of obtaining query privacy over robust DHTs. Finally, we compare the performance of our privacy-preserving protocols with their more privacy-invasive counterparts. We observe that there is no increase in the message complexity and only a small overhead in the computational complexity....

  1. One Size Doesn’t Fit All: Measuring Individual Privacy in Aggregate Genomic Data

    Science.gov (United States)

    Simmons, Sean; Berger, Bonnie

    2017-01-01

    Even in the aggregate, genomic data can reveal sensitive information about individuals. We present a new model-based measure, PrivMAF, that provides provable privacy guarantees for aggregate data (namely minor allele frequencies) obtained from genomic studies. Unlike many previous measures that have been designed to measure the total privacy lost by all participants in a study, PrivMAF gives an individual privacy measure for each participant in the study, not just an average measure. These individual measures can then be combined to measure the worst-case privacy loss in the study. Our measure also allows us to quantify the privacy gains achieved by perturbing the data, either by adding noise or by binning. Our findings demonstrate that both perturbation approaches offer significant privacy gains. Moreover, we see that these privacy gains can be achieved while minimizing perturbation (and thus maximizing the utility) relative to stricter notions of privacy, such as differential privacy. We test PrivMAF using genotype data from the Wellcome Trust Case Control Consortium, providing a more nuanced understanding of the privacy risks involved in actual genome-wide association studies. Interestingly, our analysis demonstrates that the privacy implications of releasing MAFs from a study can differ greatly from individual to individual. An implementation of our method is available at http://privmaf.csail.mit.edu. PMID:29202050
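
    The two perturbation options mentioned above (adding noise and binning) can be sketched for a single released minor allele frequency. The genotype counts, epsilon, and bin width below are illustrative assumptions; PrivMAF itself is a measure of per-individual privacy loss rather than this release mechanism.

```python
# Perturbing a released minor allele frequency: Laplace noise and coarse binning.
import numpy as np

rng = np.random.default_rng(11)
n = 2000                                   # study participants
genotypes = rng.choice([0, 1, 2], size=n, p=[0.64, 0.32, 0.04])  # minor-allele counts per person

maf = genotypes.sum() / (2 * n)            # minor allele frequency

# (a) Laplace noise: changing one participant shifts the allele count by at
#     most 2, so the frequency's sensitivity is 2 / (2n) = 1/n.
epsilon = 1.0
noisy_maf = maf + rng.laplace(0.0, (1.0 / n) / epsilon)

# (b) Binning: report only which coarse interval the frequency falls into.
bin_width = 0.05
binned_maf = np.floor(maf / bin_width) * bin_width

print(f"exact {maf:.4f}  noisy {noisy_maf:.4f}  binned {binned_maf:.2f}")
```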

  2. When Differential Privacy Meets Randomized Perturbation: A Hybrid Approach for Privacy-Preserving Recommender System

    KAUST Repository

    Liu, Xiao

    2017-03-21

    Privacy risks of recommender systems have attracted increasing attention. Users' private data is often collected by a possibly untrusted recommender system in order to provide high-quality recommendations. Meanwhile, malicious attackers may utilize recommendation results to make inferences about other users' private data. Existing approaches focus either on keeping users' private data protected during recommendation computation or on preventing the inference of any single user's data from the recommendation result. However, none is designed both to hide users' private data and to prevent privacy inference. To achieve this goal, we propose in this paper a hybrid approach for privacy-preserving recommender systems by combining differential privacy (DP) with randomized perturbation (RP). We theoretically show the noise added by RP has limited effect on recommendation accuracy and the noise added by DP can be well controlled based on the sensitivity analysis of functions on the perturbed data. Extensive experiments on three large-scale real-world datasets show that the hybrid approach generally provides more privacy protection with acceptable recommendation accuracy loss, and surprisingly sometimes achieves better privacy without sacrificing accuracy, thus validating its feasibility in practice.
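
    A toy sketch of combining the two ingredients named above: randomized perturbation (RP) noise is added to each rating before it leaves the user, and differentially private (DP) noise is added to the aggregate the recommender releases. The item-average statistic, noise scales, and rating range are assumptions; the paper applies the combination to full collaborative filtering rather than a single average.

```python
# RP on the inputs (user side) plus DP noise on the released aggregate (server side).
import numpy as np

rng = np.random.default_rng(5)
ratings = rng.integers(1, 6, size=1000).astype(float)   # one item's ratings, 1..5

# RP step: each user submits rating + zero-mean random noise, clipped back to the scale.
perturbed = np.clip(ratings + rng.normal(0.0, 1.0, size=ratings.size), 1.0, 5.0)

# DP step: Laplace noise on the released mean. With submissions clipped to
# [1, 5], one user changes the mean by at most 4/n.
epsilon, n = 0.5, ratings.size
dp_mean = perturbed.mean() + rng.laplace(0.0, (4.0 / n) / epsilon)

print(f"true mean {ratings.mean():.3f}  released mean {dp_mean:.3f}")
```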

  3. (a,k)-Anonymous Scheme for Privacy-Preserving Data Collection in IoT-based Healthcare Services Systems.

    Science.gov (United States)

    Li, Hongtao; Guo, Feng; Zhang, Wenyin; Wang, Jie; Xing, Jinsheng

    2018-02-14

    The wide use of IoT technologies in healthcare services has pushed forward the intelligence level of medical services. However, it also brings potential privacy threats to data collection. In healthcare services systems, health and medical data that contain private information are often transmitted among networks, and such private information should be protected. Therefore, there is a need for a privacy-preserving data collection (PPDC) scheme to protect clients' (patients') data. We adopt the (a,k)-anonymity model as the privacy protection scheme for data collection, and propose a novel anonymity-based PPDC method for healthcare services in this paper. The threat model is analyzed in the client-server-to-user (CS2U) model. On the client side, we utilize the (a,k)-anonymity notion to generate anonymous tuples which can resist possible attacks, and adopt a bottom-up clustering method to create clusters that satisfy a base privacy level of (a1,k1)-anonymity. On the server side, we reduce the communication cost through generalization technology, and compress (a1,k1)-anonymous data through a UPGMA-based cluster combination method to make the data meet the deeper privacy level of (a2,k2)-anonymity (a1 ≥ a2, k2 ≥ k1). Theoretical analysis and experimental results prove that our scheme is effective in privacy preservation and data quality.
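
    The (a,k)-anonymity condition used above can be stated as a small check: every equivalence class induced by the quasi-identifiers must contain at least k records, and within each class the relative frequency of any sensitive value must not exceed a. The table, attribute names, and (a, k) values below are made up for illustration and are not from the paper.

```python
# Checker for the (a, k)-anonymity condition on a generalized table.
from collections import Counter, defaultdict

def satisfies_a_k(records, quasi_ids, sensitive, a, k):
    classes = defaultdict(list)
    for rec in records:
        classes[tuple(rec[q] for q in quasi_ids)].append(rec[sensitive])
    for values in classes.values():
        if len(values) < k:                                  # class too small
            return False
        if max(Counter(values).values()) / len(values) > a:  # sensitive value too frequent
            return False
    return True

table = [
    {"age": "30-39", "zip": "120**", "disease": "flu"},
    {"age": "30-39", "zip": "120**", "disease": "flu"},
    {"age": "30-39", "zip": "120**", "disease": "hepatitis"},
    {"age": "40-49", "zip": "130**", "disease": "flu"},
    {"age": "40-49", "zip": "130**", "disease": "cancer"},
    {"age": "40-49", "zip": "130**", "disease": "flu"},
]
print(satisfies_a_k(table, ["age", "zip"], "disease", a=0.7, k=3))  # True
```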

  4. Regulating Online Data Privacy

    OpenAIRE

    Paul Reid

    2004-01-01

    With existing data protection laws proving inadequate in the fight to protect online data privacy, and with the offline law of privacy in a state of change and uncertainty, the search for an alternative solution to the important problem of online data privacy should commence. Given the inherent problem of jurisdiction that the Internet presents, such a solution is best provided by a multi-national body with the power to approximate laws in as many jurisdictions as possible, with a recognised au...

  5. Customer privacy on UK healthcare websites.

    Science.gov (United States)

    Mundy, Darren P

    2006-09-01

    Privacy has been and continues to be one of the key challenges of an age devoted to the accumulation, processing, and mining of electronic information. In particular, privacy of healthcare-related information is seen as a key issue as health organizations move towards the electronic provision of services. The aim of the research detailed in this paper has been to analyse privacy policies on popular UK healthcare-related websites to determine the extent to which consumer privacy is protected. The author has combined approaches (such as those focused on usability, policy content, and policy quality) used in studies by other researchers on e-commerce and US healthcare websites to provide a comprehensive analysis of UK healthcare privacy policies. Through this quantitative analysis, the author identifies a wide range of issues related to the protection of consumer privacy. The main outcome is that only 61% of healthcare-related websites in the sample group posted privacy policies. In addition, most of the posted privacy policies had poor readability standards and included a variety of privacy vulnerability statements. Overall, these findings represent significant current issues in relation to healthcare information protection on the Internet. The hope is that raising awareness of these results will drive forward changes in the industry, similar to those experienced with information quality.

  6. Balancing Health Information Exchange and Privacy Governance from a Patient-Centred Connected Health and Telehealth Perspective.

    Science.gov (United States)

    Kuziemsky, Craig E; Gogia, Shashi B; Househ, Mowafa; Petersen, Carolyn; Basu, Arindam

    2018-04-22

    Connected healthcare is an essential part of patient-centred care delivery. Technology such as telehealth is a critical part of connected healthcare. However, exchanging health information brings the risk of privacy issues. To better manage privacy risks, we first need to understand the different patterns of patient-centred care so that solutions can be tailored to address those risks. Drawing upon published literature, we develop a business model to enable patient-centred care via telehealth. The model identifies three patient-centred connected health patterns. We then use the patterns to analyse potential privacy risks and possible solutions for different types of telehealth delivery. Connected healthcare raises the risk of unwarranted access to health data and related invasion of privacy. However, the risk and extent of privacy issues differ according to the pattern of patient-centred care delivery and the type of telehealth delivery; the most highly connected patterns pose a particular challenge as they enable the highest degree of connectivity and thus the greatest potential for privacy breaches. Privacy issues are a major concern in telehealth systems, and patients, providers, and administrators need to be aware of these issues and have guidance on how to manage them. This paper integrates patient-centred connected health care, telehealth, and privacy risks to provide an understanding of how risks vary across different patterns of patient-centred connected health and different types of telehealth delivery.

  7. Efficiency and Privacy Enhancement for a Track and Trace System of RFID-Based Supply Chains

    Directory of Open Access Journals (Sweden)

    Xunjun Chen

    2015-06-01

    Full Text Available One of the major applications of Radio Frequency Identification (RFID) technology is in supply chain management, as it promises to provide real-time visibility based on the function of track and trace. However, such an RFID-based track and trace system raises new security and privacy challenges due to the restricted resources of tags. In this paper, we refine three privacy-related models (i.e., privacy, path unlinkability, and tag unlinkability) of RFID-based track and trace systems, and clarify the relations among these privacy models. Specifically, we have proven that privacy is equivalent to path unlinkability and that tag unlinkability implies privacy. Our results simplify the privacy concept and protocol design for RFID-based track and trace systems. Furthermore, we propose an efficient track and trace scheme, Tracker+, which allows for authentic and private identification of RFID-tagged objects in supply chains. In Tracker+, no computational ability is required of tags; only a few bytes of storage (such as EPC Class 1 Gen 2 tags provide) are needed to store the tag state. Indeed, Tracker+ reduces the memory requirement for each tag by one group element compared to the Tracker scheme presented in other literature. Moreover, Tracker+ provides privacy against supply chain inside attacks.
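
    A rough, hypothetical sketch of the general idea behind such schemes (not the Tracker+ protocol itself, which relies on cryptographic path encodings): the tag carries only a small state, each reader along the supply chain updates it, and a back-end verifier checks the final state against the set of valid paths. All names below are illustrative.

        import hashlib

        def step(state: bytes, reader_id: str) -> bytes:
            # Update the compact tag state as the tag passes a reader.
            return hashlib.sha256(state + reader_id.encode()).digest()

        def expected_state(tag_secret: bytes, path) -> bytes:
            # Recompute the state a genuine tag should carry after a given path.
            state = tag_secret
            for reader_id in path:
                state = step(state, reader_id)
            return state

        VALID_PATHS = [["factory", "warehouse", "retailer"],
                       ["factory", "distributor", "retailer"]]

        def verify(tag_secret: bytes, observed_state: bytes) -> bool:
            # Back-end check: accept only states reachable via a valid path.
            return any(observed_state == expected_state(tag_secret, p)
                       for p in VALID_PATHS)

        state = secret = b"per-tag-secret"
        for rid in ["factory", "warehouse", "retailer"]:
            state = step(state, rid)
        print(verify(secret, state))  # True

    In this toy version the tag stores a single hash value and performs no computation of its own, mirroring the abstract's point that a few bytes of tag storage suffice when readers perform the state update.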

  8. Advertising and Invasion of Privacy.

    Science.gov (United States)

    Rohrer, Daniel Morgan

    The right of privacy as it relates to advertising and the use of a person's name or likeness is discussed in this paper. After an introduction that traces some of the history of invasion of privacy in court decisions, the paper examines cases involving issues such as public figures and newsworthy items, right of privacy waived, right of privacy…

  9. Physical factors that influence patients’ privacy perception toward a psychiatric behavioral monitoring system: a qualitative study

    Science.gov (United States)

    Zakaria, Nasriah; Ramli, Rusyaizila

    2018-01-01

    Background Psychiatric patients have privacy concerns when it comes to technology intervention in the hospital setting. In this paper, we present scenarios for psychiatric behavioral monitoring systems to be placed in psychiatric wards to understand patients’ perception regarding privacy. Psychiatric behavioral monitoring refers to systems that are deemed useful in measuring clinical outcomes, but little research has been done on how these systems will impact patients’ privacy. Methods We conducted a case study in one teaching hospital in Malaysia. We investigated the physical factors that influence patients’ perceived privacy with respect to a psychiatric monitoring system. The eight physical factors identified from the information system development privacy model, a comprehensive model for designing a privacy-sensitive information system, were adapted in this research. Scenario-based interviews were conducted with 25 patients in a psychiatric ward for 3 months. Results Psychiatric patients were able to share how physical factors influence their perception of privacy. Results show how patients responded to each of these dimensions in the context of a psychiatric behavioral monitoring system. Conclusion Some subfactors under physical privacy are modified to reflect the data obtained in the interviews. We were able to capture the different physical factors that influence patient privacy. PMID:29343963

  10. SafePub: A Truthful Data Anonymization Algorithm With Strong Privacy Guarantees

    Directory of Open Access Journals (Sweden)

    Bild Raffael

    2018-01-01

    Full Text Available Methods for privacy-preserving data publishing and analysis trade off privacy risks for individuals against the quality of output data. In this article, we present a data publishing algorithm that satisfies the differential privacy model. The transformations performed are truthful, which means that the algorithm does not perturb input data or generate synthetic output data. Instead, records are randomly drawn from the input dataset and the uniqueness of their features is reduced. This also offers an intuitive notion of privacy protection. Moreover, the approach is generic, as it can be parameterized with different objective functions to optimize its output towards different applications. We show this by integrating six well-known data quality models. We present an extensive analytical and experimental evaluation and a comparison with prior work. The results show that our algorithm is the first practical implementation of the described approach and that it can be used with reasonable privacy parameters resulting in high degrees of protection. Moreover, when parameterizing the generic method with an objective function quantifying the suitability of data for building statistical classifiers, we measured prediction accuracies that compare very well with results obtained using state-of-the-art differentially private classification algorithms.
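
    As a toy illustration of the two ingredients named above, random record sampling and generalization that reduces feature uniqueness, consider the sketch below. It is not the SafePub algorithm: it omits the calibrated sampling probability, the search over transformation schemes, and the objective functions that make the real method differentially private; the attribute names are invented.

        import random

        def generalize_age(age: int) -> str:
            # Coarsen an exact age into a 10-year interval.
            lo = (age // 10) * 10
            return f"{lo}-{lo + 9}"

        def publish(records, sample_prob=0.5, seed=0):
            # Randomly draw records, then reduce the uniqueness of their features.
            rng = random.Random(seed)
            released = []
            for rec in records:
                if rng.random() < sample_prob:               # random sampling
                    released.append({"age": generalize_age(rec["age"]),
                                     "zip": rec["zip"][:3] + "**"})  # generalization
            return released

        data = [{"age": 34, "zip": "10115"}, {"age": 71, "zip": "80331"},
                {"age": 29, "zip": "20095"}]
        print(publish(data))

    The point of the illustration is that the released records are truthful, coarsened subsets of the input rather than perturbed or synthetic data.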

  11. Risk-Based Models for Managing Data Privacy in Healthcare

    Science.gov (United States)

    AL Faresi, Ahmed

    2011-01-01

    Current research in health care lacks a systematic investigation to identify and classify various sources of threats to information privacy when sharing health data. Identifying and classifying such threats would enable the development of effective information security risk monitoring and management policies. In this research I put the first step…

  12. Redefining genomic privacy: trust and empowerment.

    Directory of Open Access Journals (Sweden)

    Yaniv Erlich

    2014-11-01

    Full Text Available Fulfilling the promise of the genetic revolution requires the analysis of large datasets containing information from thousands to millions of participants. However, sharing human genomic data requires protecting subjects from potential harm. Current models rely on de-identification techniques in which privacy versus data utility becomes a zero-sum game. Instead, we propose the use of trust-enabling techniques to create a solution in which researchers and participants both win. To do so we introduce three principles that facilitate trust in genetic research and outline one possible framework built upon those principles. Our hope is that such trust-centric frameworks provide a sustainable solution that reconciles genetic privacy with data sharing and facilitates genetic research.

  13. Redefining genomic privacy: trust and empowerment.

    Science.gov (United States)

    Erlich, Yaniv; Williams, James B; Glazer, David; Yocum, Kenneth; Farahany, Nita; Olson, Maynard; Narayanan, Arvind; Stein, Lincoln D; Witkowski, Jan A; Kain, Robert C

    2014-11-01

    Fulfilling the promise of the genetic revolution requires the analysis of large datasets containing information from thousands to millions of participants. However, sharing human genomic data requires protecting subjects from potential harm. Current models rely on de-identification techniques in which privacy versus data utility becomes a zero-sum game. Instead, we propose the use of trust-enabling techniques to create a solution in which researchers and participants both win. To do so we introduce three principles that facilitate trust in genetic research and outline one possible framework built upon those principles. Our hope is that such trust-centric frameworks provide a sustainable solution that reconciles genetic privacy with data sharing and facilitates genetic research.

  14. Information privacy fundamentals for librarians and information professionals

    CERN Document Server

    Givens, Cherie L

    2014-01-01

    This book introduces library and information professionals to information privacy, provides an overview of information privacy in the library and information science context, U.S. privacy laws by sector, information privacy policy, and key considerations when planning and creating a privacy program.

  15. SIED, a Data Privacy Engineering Framework

    OpenAIRE

    Mivule, Kato

    2013-01-01

    While a number of data privacy techniques have been proposed in the recent years, a few frameworks have been suggested for the implementation of the data privacy process. Most of the proposed approaches are tailored towards implementing a specific data privacy algorithm but not the overall data privacy engineering and design process. Therefore, as a contribution, this study proposes SIED (Specification, Implementation, Evaluation, and Dissemination), a conceptual framework that takes a holist...

  16. User Privacy in RFID Networks

    Science.gov (United States)

    Singelée, Dave; Seys, Stefaan

    Wireless RFID networks are getting deployed at a rapid pace and have already entered the public space on a massive scale: public transport cards, the biometric passport, office ID tokens, customer loyalty cards, etc. Although RFID technology offers interesting services to customers and retailers, it could also endanger the privacy of the end-users. The lack of protection mechanisms being deployed could potentially result in a privacy leakage of personal data. Furthermore, there is the emerging threat of location privacy. In this paper, we will show some practical attack scenarios and illustrates some of them with cases that have received press coverage. We will present the main challenges of enhancing privacy in RFID networks and evaluate some solutions proposed in literature. The main advantages and shortcomings will be briefly discussed. Finally, we will give an overview of some academic and industrial research initiatives on RFID privacy.

  17. Comparison of two speech privacy measurements, articulation index (AI) and speech privacy noise isolation class (NIC'), in open workplaces

    Science.gov (United States)

    Yoon, Heakyung C.; Loftness, Vivian

    2002-05-01

    Lack of speech privacy has been reported to be the main dissatisfaction among occupants in open workplaces, according to workplace surveys. Two speech privacy measurements, Articulation Index (AI), standardized by the American National Standards Institute in 1969, and Speech Privacy Noise Isolation Class (NIC', Noise Isolation Class Prime), adapted from Noise Isolation Class (NIC) by U. S. General Services Administration (GSA) in 1979, have been claimed as objective tools to measure speech privacy in open offices. To evaluate which of them, normal privacy for AI or satisfied privacy for NIC', is a better tool in terms of speech privacy in a dynamic open office environment, measurements were taken in the field. AIs and NIC's in the different partition heights and workplace configurations have been measured following ASTM E1130 (Standard Test Method for Objective Measurement of Speech Privacy in Open Offices Using Articulation Index) and GSA test PBS-C.1 (Method for the Direct Measurement of Speech-Privacy Potential (SPP) Based on Subjective Judgments) and PBS-C.2 (Public Building Service Standard Method of Test Method for the Sufficient Verification of Speech-Privacy Potential (SPP) Based on Objective Measurements Including Methods for the Rating of Functional Interzone Attenuation and NC-Background), respectively.
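
    For reference, a commonly quoted simplified form of the Articulation Index is a weighted, clipped signal-to-noise ratio summed over frequency bands (the full ANSI 1969 procedure specifies the exact bands, weights, and corrections):

        AI \approx \sum_{i=1}^{N} w_i \, \frac{\min(\max(\mathrm{SNR}_i, 0), 30)}{30},
        \qquad \sum_{i=1}^{N} w_i = 1,

    where SNR_i is the speech-to-background-noise level difference (in dB) in band i, clipped to the 0-30 dB range; lower AI values indicate less intelligible speech and therefore better speech privacy.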

  18. An Alternative View of Privacy on Facebook

    Directory of Open Access Journals (Sweden)

    Christian Fuchs

    2011-02-01

    Full Text Available The predominant analysis of privacy on Facebook focuses on personal information revelation. This paper is critical of this kind of research and introduces an alternative analytical framework for studying privacy on Facebook, social networking sites and web 2.0. This framework connects the phenomenon of online privacy to the political economy of capitalism—a focus that has thus far been rather neglected in research literature about Internet and web 2.0 privacy. Liberal privacy philosophy tends to ignore the political economy of privacy in capitalism that can mask socio-economic inequality and protect capital and the rich from public accountability. Facebook is analyzed in this paper with the help of an approach in which privacy for dominant groups, in regard to the ability of keeping wealth and power secret from the public, is seen as problematic, whereas privacy at the bottom of the power pyramid for consumers and normal citizens is seen as a protection from dominant interests. Facebook’s privacy concept is based on an understanding that stresses self-regulation and on an individualistic understanding of privacy. The theoretical analysis of the political economy of privacy on Facebook in this paper is based on the political theories of Karl Marx, Hannah Arendt and Jürgen Habermas. Based on the political economist Dallas Smythe’s concept of audience commodification, the process of prosumer commodification on Facebook is analyzed. The political economy of privacy on Facebook is analyzed with the help of a theory of drives that is grounded in Herbert Marcuse’s interpretation of Sigmund Freud, which allows one to analyze Facebook based on the concept of play labor (= the convergence of play and labor).

  19. An Alternative View of Privacy on Facebook

    OpenAIRE

    Christian Fuchs

    2011-01-01

    The predominant analysis of privacy on Facebook focuses on personal information revelation. This paper is critical of this kind of research and introduces an alternative analytical framework for studying privacy on Facebook, social networking sites and web 2.0. This framework is connecting the phenomenon of online privacy to the political economy of capitalism—a focus that has thus far been rather neglected in research literature about Internet and web 2.0 privacy. Liberal privacy philosophy ...

  20. PriBots: Conversational Privacy with Chatbots

    OpenAIRE

    Harkous, Hamza; Fawaz, Kassem; Shin, Kang G.; Aberer, Karl

    2016-01-01

    Traditional mechanisms for delivering notice and enabling choice have so far failed to protect users’ privacy. Users are continuously frustrated by complex privacy policies, unreachable privacy settings, and a multitude of emerging standards. The miniaturization trend of smart devices and the emergence of the Internet of Things (IoTs) will exacerbate this problem further. In this paper, we propose Conversational Privacy Bots (PriBots) as a new way of delivering notice and choice through a two...

  1. 29 CFR 1904.29 - Forms.

    Science.gov (United States)

    2010-07-01

    ... OSHA 300 Log. Instead, enter “privacy case” in the space normally used for the employee's name. This...) Basic requirement. You must use OSHA 300, 300-A, and 301 forms, or equivalent forms, for recordable injuries and illnesses. The OSHA 300 form is called the Log of Work-Related Injuries and Illnesses, the 300...

  2. Who commits virtual identity suicide? Differences in privacy concerns, Internet addiction, and personality between Facebook users and quitters.

    Science.gov (United States)

    Stieger, Stefan; Burger, Christoph; Bohn, Manuel; Voracek, Martin

    2013-09-01

    Social networking sites such as Facebook attract millions of users by offering highly interactive social communications. Recently, a counter movement of users has formed, deciding to leave social networks by quitting their accounts (i.e., virtual identity suicide). To investigate whether Facebook quitters (n=310) differ from Facebook users (n=321), we examined privacy concerns, Internet addiction scores, and personality. We found Facebook quitters to be significantly more cautious about their privacy, having higher Internet addiction scores, and being more conscientious than Facebook users. The main self-stated reason for committing virtual identity suicide was privacy concerns (48 percent). Although the adequacy of privacy in online communication has been questioned, privacy is still an important issue in online social communications.

  3. Privacy Protection: Mandating New Arrangements to Implement and Assess Federal Privacy Policy and Practice

    National Research Council Canada - National Science Library

    Relyea, Harold C

    2004-01-01

    When Congress enacted the Privacy Act of 1974, it established a temporary national study commission to conduct a comprehensive assessment of privacy policy and practice in both the public and private...

  4. 24 CFR 3280.107 - Interior privacy.

    Science.gov (United States)

    2010-04-01

    ... 24 Housing and Urban Development 5 2010-04-01 2010-04-01 false Interior privacy. 3280.107 Section 3280.107 Housing and Urban Development Regulations Relating to Housing and Urban Development (Continued... privacy. Bathroom and toilet compartment doors shall be equipped with a privacy lock. ...

  5. Online privacy: overview and preliminary research

    Directory of Open Access Journals (Sweden)

    Renata Mekovec

    2010-12-01

    Full Text Available Over the last decade using the Internet for online shopping, information browsing and searching as well as for online communication has become part of everyday life. Although the Internet technology has a lot of benefits for users, one of the most important disadvantages is related to the increasing capacity for users’ online activity surveillance. However, the users are increasingly becoming aware of online surveillance methods, which results in their increased concern for privacy protection. Numerous factors influence the way in which individuals perceive the level of privacy protection when they are online. This article provides a review of factors that influence the privacy perception of Internet users. Previous online privacy research related to e-business was predominantly focused on the dimension of information privacy and concerned with the way users’ personal information is collected, saved and used by an online company. This article’s main aim is to provide an overview of numerous Internet users’ privacy perception elements across various privacy dimensions as well as their potential categorization. In addition, considering that e-banking and online shopping are one of the most widely used e-services, an examination of online privacy perception of e-banking/online shopping users was performed.

  6. 45 CFR 503.1 - Definitions-Privacy Act.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 3 2010-10-01 2010-10-01 false Definitions-Privacy Act. 503.1 Section 503.1... THE UNITED STATES, DEPARTMENT OF JUSTICE RULES OF PRACTICE PRIVACY ACT AND GOVERNMENT IN THE SUNSHINE REGULATIONS Privacy Act Regulations § 503.1 Definitions—Privacy Act. For the purpose of this part: Agency...

  7. 'Privacy lost - and found?' : the information value chain as a model to meet citizens' concerns

    NARCIS (Netherlands)

    van de Pas, John; van Bussel, Geert-Jan

    2015-01-01

    In this paper we explore the extent to which privacy enhancing technologies (PETs) could be effective in providing privacy to citizens. Rapid development of ubiquitous computing and ‘the internet of things’ are leading to Big Data and the application of Predictive Analytics, effectively merging the

  8. Virtue, Privacy and Self-Determination

    DEFF Research Database (Denmark)

    Stamatellos, Giannis

    2011-01-01

    The ethical problem of privacy lies at the core of computer ethics and cyber ethics discussions. The extensive use of personal data in digital networks poses a serious threat to the user’s right of privacy not only at the level of a user’s data integrity and security but also at the level of a user’s identity and freedom. In normative ethical theory the need for an informational self-deterministic approach of privacy is stressed with greater emphasis on the control over personal data. However, scant attention has been paid to a virtue ethics approach of information privacy. Plotinus’ discussion of self-determination is related to ethical virtue, human freedom and intellectual autonomy. The Plotinian virtue ethics approach of self-determination is not primarily related to the sphere of moral action, but to the quality of the self prior to moral practice. In this paper, it is argued that the problem of information privacy...

  9. Enforcement of Privacy Policies over Multiple Online Social Networks for Collaborative Activities

    Science.gov (United States)

    Wu, Zhengping; Wang, Lifeng

    Our goal is to develop an enforcement architecture for privacy policies over multiple online social networks. It is used to solve the problem of privacy protection when several social networks build permanent or temporary collaborations. Theoretically, this idea is practical, especially because more and more social networks tend to support the open source framework “OpenSocial”. But as we know, different social network websites may have the same privacy policy settings based on different enforcement mechanisms, which would cause problems. In this case, we have to manually write code for both sides to make the privacy policy settings enforceable. We can imagine that this is a huge workload given the huge number of current social networks. So we focus on proposing a middleware which is used to automatically generate a privacy protection component for permanent integration or temporary interaction of social networks. This middleware provides functions such as collecting the privacy policy of each participant in the new collaboration, generating a standard policy model for each participant, and mapping those standard policies to the different enforcement mechanisms of the participants.
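
    A hypothetical sketch of that middleware idea: collect each network's native settings, normalize them into a shared standard policy model, keep the strictest value per field, and translate the merged policy back to each network's own enforcement mechanism. Field and class names are illustrative, not taken from the paper.

        # Shared standard policy model (illustrative field names).
        STANDARD_FIELDS = ("profile_visibility", "photo_sharing")
        STRICTNESS = {"public": 0, "friends_only": 1, "private": 2}

        class NetworkAdapter:
            # Maps one network's native privacy settings to and from the standard model.
            def __init__(self, name, field_map):
                self.name = name
                self.field_map = field_map  # standard field -> native setting name

            def to_standard(self, native):
                return {std: native[nat] for std, nat in self.field_map.items()}

            def to_native(self, standard):
                return {nat: standard[std] for std, nat in self.field_map.items()}

        def merge_for_collaboration(adapters, native_settings):
            # Collect each participant's policy, keep the strictest value per field,
            # and translate the merged policy back to every enforcement mechanism.
            merged = {f: "public" for f in STANDARD_FIELDS}
            for a in adapters:
                for field, value in a.to_standard(native_settings[a.name]).items():
                    if STRICTNESS[value] > STRICTNESS[merged[field]]:
                        merged[field] = value
            return {a.name: a.to_native(merged) for a in adapters}

        site_a = NetworkAdapter("site_a", {"profile_visibility": "who_sees_profile",
                                           "photo_sharing": "photo_audience"})
        site_b = NetworkAdapter("site_b", {"profile_visibility": "profile_acl",
                                           "photo_sharing": "media_acl"})
        settings = {"site_a": {"who_sees_profile": "private", "photo_audience": "friends_only"},
                    "site_b": {"profile_acl": "public", "media_acl": "private"}}
        print(merge_for_collaboration([site_a, site_b], settings))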

  10. Older and Wiser? Facebook Use, Privacy Concern, and Privacy Protection in the Life Stages of Emerging, Young, and Middle Adulthood

    Directory of Open Access Journals (Sweden)

    Evert Van den Broeck

    2015-11-01

    Full Text Available A large part of research conducted on privacy concern and protection on social networking sites (SNSs) concentrates on children and adolescents. Individuals in these developmental stages are often described as vulnerable Internet users. But how vulnerable are adults in terms of online informational privacy? This study applied a privacy boundary management approach and investigated Facebook use, privacy concern, and the application of privacy settings on Facebook by linking the results to Erikson’s three stages of adulthood: emerging, young, and middle adulthood. An online survey was distributed among 18- to 65-year-old Dutch-speaking adults (N = 508, 51.8% females). Analyses revealed clear differences between the three adult age groups in terms of privacy concern, Facebook use, and privacy protection. Results indicated that respondents in young adulthood and middle adulthood were more vulnerable in terms of privacy protection than emerging adults. Clear discrepancies were found between privacy concern and protection for these age groups. More particularly, the middle adulthood group was more concerned about their privacy in comparison to the emerging adulthood and young adulthood group. Yet, they reported to use privacy settings less frequently than the younger age groups. Emerging adults were found to be pragmatic and privacy conscious SNS users. Young adults occupied the intermediate position, suggesting a developmental shift. The impact of generational differences is discussed, as well as implications for education and governmental action.

  11. A Formal Study of the Privacy Concerns in Biometric-Based Remote Authentication Schemes

    NARCIS (Netherlands)

    Tang, Qiang; Bringer, Julien; Chabanne, Hervé; Pointcheval, David; Chen, L.; Mu, Y.; Susilo, W.

    With their increasing popularity in cryptosystems, biometrics have attracted more and more attention from the information security community. However, how to handle the relevant privacy concerns remains to be troublesome. In this paper, we propose a novel security model to formalize the privacy

  12. 49 CFR 10.13 - Privacy Officer.

    Science.gov (United States)

    2010-10-01

    ... INDIVIDUALS General § 10.13 Privacy Officer. (a) To assist with implementation, evaluation, and administration issues, the Chief Information Officer appoints a principal coordinating official with the title Privacy... 49 Transportation 1 2010-10-01 2010-10-01 false Privacy Officer. 10.13 Section 10.13...

  13. Web Security, Privacy & Commerce

    CERN Document Server

    Garfinkel, Simson

    2011-01-01

    Since the first edition of this classic reference was published, World Wide Web use has exploded and e-commerce has become a daily part of business and personal life. As Web use has grown, so have the threats to our security and privacy--from credit card fraud to routine invasions of privacy by marketers to web site defacements to attacks that shut down popular web sites. Web Security, Privacy & Commerce goes behind the headlines, examines the major security risks facing us today, and explains how we can minimize them. It describes risks for Windows and Unix, Microsoft Internet Exp

  14. 77 FR 33761 - Privacy Act of 1974; Notification to Update an Existing Privacy Act System of Records, “Grievance...

    Science.gov (United States)

    2012-06-07

    ... of a data breach. (See also on HUD's privacy Web site, Appendix I for other ways that the Privacy Act... DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT [Docket No. FR-5613-N-04] Privacy Act of 1974; Notification to Update an Existing Privacy Act System of Records, ``Grievance Records'' AGENCY: Office of the...

  15. Online Privacy as a Corporate Social Responsibility

    DEFF Research Database (Denmark)

    Pollach, Irene

    2011-01-01

    Information technology and the Internet have added a new stakeholder concern to the corporate social responsibility agenda: online privacy. While theory suggests that online privacy is a corporate social responsibility, only very few studies in the business ethics literature have connected...... of the companies have comprehensive privacy programs, although more than half of them voice moral or relational motives for addressing online privacy. The privacy measures they have taken are primarily compliance measures, while measures that stimulate a stakeholder dialogue are rare. Overall, a wide variety...

  16. Privacy-related context information for ubiquitous health.

    Science.gov (United States)

    Seppälä, Antto; Nykänen, Pirkko; Ruotsalainen, Pekka

    2014-03-11

    Ubiquitous health has been defined as a dynamic network of interconnected systems. A system is composed of one or more information systems, their stakeholders, and the environment. These systems offer health services to individuals and thus implement ubiquitous computing. Privacy is the key challenge for ubiquitous health because of autonomous processing, rich contextual metadata, lack of predefined trust among participants, and the business objectives. Additionally, regulations and policies of stakeholders may be unknown to the individual. Context-sensitive privacy policies are needed to regulate information processing. Our goal was to analyze privacy-related context information and to define the corresponding components and their properties that support privacy management in ubiquitous health. These properties should describe the privacy issues of information processing. With components and their properties, individuals can define context-aware privacy policies and set their privacy preferences that can change in different information-processing situations. Scenarios and user stories are used to analyze typical activities in ubiquitous health to identify main actors, goals, tasks, and stakeholders. Context arises from an activity and, therefore, we can determine different situations, services, and systems to identify properties for privacy-related context information in information-processing situations. Privacy-related context information components are situation, environment, individual, information technology system, service, and stakeholder. Combining our analyses and previously identified characteristics of ubiquitous health, more detailed properties for the components are defined. Properties define explicitly what context information for different components is needed to create context-aware privacy policies that can control, limit, and constrain information processing. With properties, we can define, for example, how data can be processed or how components

  17. Enhancing Privacy for Digital Rights Management

    NARCIS (Netherlands)

    Petkovic, M.; Conrado, C.; Schrijen, G.J.; Jonker, Willem

    2007-01-01

    This chapter addresses privacy issues in DRM systems. These systems provide a means of protecting digital content, but may violate the privacy of users in that the content they purchase and their actions in the system can be linked to specific users. The chapter proposes a privacy-preserving DRM

  18. Location privacy protection in mobile networks

    CERN Document Server

    Liu, Xinxin

    2013-01-01

    This SpringerBrief analyzes the potential privacy threats in wireless and mobile network environments, and reviews some existing works. It proposes multiple privacy preserving techniques against several types of privacy threats that are targeting users in a mobile network environment. Depending on the network architecture, different approaches can be adopted. The first proposed approach considers a three-party system architecture where there is a trusted central authority that can be used to protect users? privacy. The second approach considers a totally distributed environment where users per

  19. Privacy enhanced recommender system

    NARCIS (Netherlands)

    Erkin, Zekeriya; Erkin, Zekeriya; Beye, Michael; Veugen, Thijs; Lagendijk, Reginald L.

    2010-01-01

    Recommender systems are widely used in online applications since they enable personalized service to the users. The underlying collaborative filtering techniques work on user’s data which are mostly privacy sensitive and can be misused by the service provider. To protect the privacy of the users, we

  20. Privacy Metrics and Boundaries

    NARCIS (Netherlands)

    L-F. Pau (Louis-François)

    2005-01-01

    This paper aims at defining a set of privacy metrics (quantitative and qualitative) in the case of the relation between a privacy protector and an information gatherer. The aims with such metrics are: to allow assessing and comparing different user scenarios and their differences; for

  1. Physical factors that influence patients’ privacy perception toward a psychiatric behavioral monitoring system: a qualitative study

    Directory of Open Access Journals (Sweden)

    Zakaria N

    2017-12-01

    Full Text Available Nasriah Zakaria,1,2 Rusyaizila Ramli3 1Research Chair of Health Informatics and Promotion, 2Medical Informatics and E-learning Unit, Medical Education Department, College of Medicine, King Saud University, Riyadh, Kingdom of Saudi Arabia; 3Advanced Military Maintenance Repair and Overhaul Center (AMMROC), Abu Dhabi, UAE. Background: Psychiatric patients have privacy concerns when it comes to technology intervention in the hospital setting. In this paper, we present scenarios for psychiatric behavioral monitoring systems to be placed in psychiatric wards to understand patients’ perception regarding privacy. Psychiatric behavioral monitoring refers to systems that are deemed useful in measuring clinical outcomes, but little research has been done on how these systems will impact patients’ privacy. Methods: We conducted a case study in one teaching hospital in Malaysia. We investigated the physical factors that influence patients’ perceived privacy with respect to a psychiatric monitoring system. The eight physical factors identified from the information system development privacy model, a comprehensive model for designing a privacy-sensitive information system, were adapted in this research. Scenario-based interviews were conducted with 25 patients in a psychiatric ward for 3 months. Results: Psychiatric patients were able to share how physical factors influence their perception of privacy. Results show how patients responded to each of these dimensions in the context of a psychiatric behavioral monitoring system. Conclusion: Some subfactors under physical privacy are modified to reflect the data obtained in the interviews. We were able to capture the different physical factors that influence patient privacy. Keywords: information system development (ISD), physical factor, privacy, psychiatric monitoring system

  2. Control of interior surface materials for speech privacy in high-speed train cabins.

    Science.gov (United States)

    Jang, H S; Lim, H; Jeon, J Y

    2017-05-01

    The effect of interior materials with various absorption coefficients on speech privacy was investigated in a 1:10 scale model of one high-speed train cabin geometry. The speech transmission index (STI) and privacy distance (rP) were measured in the train cabin to quantify speech privacy. Measurement cases were selected for the ceiling, sidewall, and front and back walls and were classified as high-, medium- and low-absorption coefficient cases. Interior materials with high absorption coefficients yielded a low rP, and the ceiling had the largest impact on both the STI and rP among the interior elements. Combinations of the three cases were measured, and the maximum reduction in rP by the absorptive surfaces was 2.4 m, which exceeds the space between two rows of chairs in the high-speed train. Additionally, the contribution of the interior elements to speech privacy was analyzed using recorded impulse responses and a multiple regression model for rP using the equivalent absorption area. The analysis confirmed that the ceiling was the most important interior element for improving speech privacy. These results can be used to find the relative decrease in rP in the acoustic design of interior materials to improve speech privacy in train cabins. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  3. Examining Self-Disclosure on Social Networking Sites: A Flow Theory and Privacy Perspective

    Directory of Open Access Journals (Sweden)

    George Oppong Appiagyei Ampong

    2018-06-01

    Full Text Available Social media and other web 2.0 tools have provided users with the platform to interact with and also disclose personal information to not only their friends and acquaintances but also relative strangers with unprecedented ease. This has enhanced the ability of people to share more about themselves, their families, and their friends through a variety of media including text, photo, and video, thus developing and sustaining social and business relationships. The purpose of the paper is to identify the factors that predict self-disclosure on social networking sites from the perspective of privacy and flow. Data was collected from 452 students in three leading universities in Ghana and analyzed with Partial Least Square-Structural Equation Modeling. Results from the study revealed that privacy risk was the most significant predictor. We also found privacy awareness, privacy concerns, and privacy invasion experience to be significant predictors of self-disclosure. Interaction and perceived control were found to have significant effect on self-disclosure. In all, the model accounted for 54.6 percent of the variance in self-disclosure. The implications and limitations of the current study are discussed, and directions for future research proposed.

  4. Examining Self-Disclosure on Social Networking Sites: A Flow Theory and Privacy Perspective.

    Science.gov (United States)

    Ampong, George Oppong Appiagyei; Mensah, Aseda; Adu, Adolph Sedem Yaw; Addae, John Agyekum; Omoregie, Osaretin Kayode; Ofori, Kwame Simpe

    2018-06-06

    Social media and other web 2.0 tools have provided users with the platform to interact with and also disclose personal information to not only their friends and acquaintances but also relative strangers with unprecedented ease. This has enhanced the ability of people to share more about themselves, their families, and their friends through a variety of media including text, photo, and video, thus developing and sustaining social and business relationships. The purpose of the paper is to identify the factors that predict self-disclosure on social networking sites from the perspective of privacy and flow. Data was collected from 452 students in three leading universities in Ghana and analyzed with Partial Least Square-Structural Equation Modeling. Results from the study revealed that privacy risk was the most significant predictor. We also found privacy awareness, privacy concerns, and privacy invasion experience to be significant predictors of self-disclosure. Interaction and perceived control were found to have significant effect on self-disclosure. In all, the model accounted for 54.6 percent of the variance in self-disclosure. The implications and limitations of the current study are discussed, and directions for future research proposed.

  5. Privacy Protection Research of Mobile RFID

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Radio Frequency Identification is one of the most controversial technologies at present. It is very difficult to detect who reads a tag incorporated into products owned by a person, and a significant privacy threat in RFID systems arises from this fact. The user privacy problem is a primary consideration for mobile RFID services, because most mobile RFID services are based on end-user services. We propose a solution for user privacy protection, which is a modification of the EPC Class 1 Generation 2 protocol, and introduce a privacy protection scenario for mobile RFID services using this method.

  6. Analysis of Privacy on Social Networks

    OpenAIRE

    Tomandl, Luboš

    2015-01-01

    This thesis deals with the question of privacy in the context of social networks. The main substance of these services is the users' option to share information about their lives. This alone can be a problem for privacy. The first part of this thesis concentrates on the meaning of privacy as well as its value for both individuals and society. In the next part the privacy threats on social networks, namely Facebook, are discussed. These threats are disclosed on four levels according to f...

  7. Privacy by design in personal health monitoring.

    Science.gov (United States)

    Nordgren, Anders

    2015-06-01

    The concept of privacy by design is becoming increasingly popular among regulators of information and communications technologies. This paper aims at analysing and discussing the ethical implications of this concept for personal health monitoring. I assume a privacy theory of restricted access and limited control. On the basis of this theory, I suggest a version of the concept of privacy by design that constitutes a middle road between what I call broad privacy by design and narrow privacy by design. The key feature of this approach is that it attempts to balance automated privacy protection and autonomously chosen privacy protection in a way that is context-sensitive. In personal health monitoring, this approach implies that in some contexts like medication assistance and monitoring of specific health parameters one single automatic option is legitimate, while in some other contexts, for example monitoring in which relatives are receivers of health-relevant information rather than health care professionals, a multi-choice approach stressing autonomy is warranted.

  8. Pre-Capture Privacy for Small Vision Sensors.

    Science.gov (United States)

    Pittaluga, Francesco; Koppal, Sanjeev Jagannatha

    2017-11-01

    The next wave of micro and nano devices will create a world with trillions of small networked cameras. This will lead to increased concerns about privacy and security. Most privacy preserving algorithms for computer vision are applied after image/video data has been captured. We propose to use privacy preserving optics that filter or block sensitive information directly from the incident light-field before sensor measurements are made, adding a new layer of privacy. In addition to balancing the privacy and utility of the captured data, we address trade-offs unique to miniature vision sensors, such as achieving high-quality field-of-view and resolution within the constraints of mass and volume. Our privacy preserving optics enable applications such as depth sensing, full-body motion tracking, people counting, blob detection and privacy preserving face recognition. While we demonstrate applications on macro-scale devices (smartphones, webcams, etc.) our theory has impact for smaller devices.

  9. Through Patients' Eyes: Regulation, Technology, Privacy, and the Future.

    Science.gov (United States)

    Petersen, Carolyn

    2018-04-22

    Privacy is commonly regarded as a regulatory requirement achieved via technical and organizational management practices. Those working in the field of informatics often play a role in privacy preservation as a result of their expertise in information technology, workflow analysis, implementation science, or related skills. Viewing privacy from the perspective of patients whose protected health information is at risk broadens the considerations to include the perceived duality of privacy; the existence of privacy within a context unique to each patient; the competing needs inherent within privacy management; the need for particular consideration when data are shared; and the need for patients to control health information in a global setting. With precision medicine, artificial intelligence, and other treatment innovations on the horizon, health care professionals need to think more broadly about how to preserve privacy in a health care environment driven by data sharing. Patient-reported privacy preferences, privacy portability, and greater transparency around privacy-preserving functionalities are potential strategies for ensuring that privacy regulations are met and privacy is preserved. Georg Thieme Verlag KG Stuttgart.

  10. The privacy implications of Bluetooth

    OpenAIRE

    Kostakos, Vassilis

    2008-01-01

    A substantial amount of research, as well as media hype, has surrounded RFID technology and its privacy implications. Currently, researchers and the media focus on the privacy threats posed by RFID, while consumer groups choose to boycott products bearing RFID tags. At the same time, however, a very similar technology has quietly become part of our everyday lives: Bluetooth. In this paper we highlight the fact that Bluetooth is a widespread technology that has real privacy implications. Furthermor...

  11. Privacy-preserving smart meter control strategy including energy storage losses

    OpenAIRE

    Avula, Chinni Venkata Ramana R.; Oechtering, Tobias J.; Månsson, Daniel

    2018-01-01

    Privacy-preserving smart meter control strategies proposed in the literature so far make some ideal assumptions such as instantaneous control without delay, lossless energy storage systems etc. In this paper, we present a one-step-ahead predictive control strategy using Bayesian risk to measure and control privacy leakage with an energy storage system. The controller estimates energy state using a three-circuit energy storage model to account for steady-state energy losses. With numerical exp...

  12. Privacy Protection in Participatory Sensing Applications Requiring Fine-Grained Locations

    DEFF Research Database (Denmark)

    Dong, Kai; Gu, Tao; Tao, Xianping

    2010-01-01

    The emerging participatory sensing applications have brought a privacy risk where users expose their location information. Most of the existing solutions preserve location privacy by generalizing a precise user location to a coarse-grained location, and hence they cannot be applied in those appli... provider is a trustworthy entity, making our solution more feasible for practical applications. We present and analyze our security model, and evaluate the performance and scalability of our system....

  13. 31 CFR 0.216 - Privacy Act.

    Science.gov (United States)

    2010-07-01

    ... 31 Money and Finance: Treasury 1 2010-07-01 2010-07-01 false Privacy Act. 0.216 Section 0.216... RULES OF CONDUCT Rules of Conduct § 0.216 Privacy Act. Employees involved in the design, development, operation, or maintenance of any system of records or in maintaining records subject to the Privacy Act of...

  14. Privacy-Related Context Information for Ubiquitous Health

    Science.gov (United States)

    Nykänen, Pirkko; Ruotsalainen, Pekka

    2014-01-01

    Background Ubiquitous health has been defined as a dynamic network of interconnected systems. A system is composed of one or more information systems, their stakeholders, and the environment. These systems offer health services to individuals and thus implement ubiquitous computing. Privacy is the key challenge for ubiquitous health because of autonomous processing, rich contextual metadata, lack of predefined trust among participants, and the business objectives. Additionally, regulations and policies of stakeholders may be unknown to the individual. Context-sensitive privacy policies are needed to regulate information processing. Objective Our goal was to analyze privacy-related context information and to define the corresponding components and their properties that support privacy management in ubiquitous health. These properties should describe the privacy issues of information processing. With components and their properties, individuals can define context-aware privacy policies and set their privacy preferences that can change in different information-processing situations. Methods Scenarios and user stories are used to analyze typical activities in ubiquitous health to identify main actors, goals, tasks, and stakeholders. Context arises from an activity and, therefore, we can determine different situations, services, and systems to identify properties for privacy-related context information in information-processing situations. Results Privacy-related context information components are situation, environment, individual, information technology system, service, and stakeholder. Combining our analyses and previously identified characteristics of ubiquitous health, more detailed properties for the components are defined. Properties define explicitly what context information for different components is needed to create context-aware privacy policies that can control, limit, and constrain information processing. With properties, we can define, for example, how

  15. 75 FR 28051 - Public Workshop: Pieces of Privacy

    Science.gov (United States)

    2010-05-19

    ... DEPARTMENT OF HOMELAND SECURITY Office of the Secretary Public Workshop: Pieces of Privacy AGENCY: Privacy Office, DHS. ACTION: Notice announcing public workshop. SUMMARY: The Department of Homeland Security Privacy Office will host a public workshop, ``Pieces of Privacy.'' DATES: The workshop will be...

  16. Scalable privacy-preserving data sharing methodology for genome-wide association studies: an application to iDASH healthcare privacy protection challenge.

    Science.gov (United States)

    Yu, Fei; Ji, Zhanglong

    2014-01-01

    In response to the growing interest in genome-wide association study (GWAS) data privacy, the Integrating Data for Analysis, Anonymization and SHaring (iDASH) center organized the iDASH Healthcare Privacy Protection Challenge, with the aim of investigating the effectiveness of applying privacy-preserving methodologies to human genetic data. This paper is based on a submission to the iDASH Healthcare Privacy Protection Challenge. We apply privacy-preserving methods that are adapted from Uhler et al. 2013 and Yu et al. 2014 to the challenge's data and analyze the data utility after the data are perturbed by the privacy-preserving methods. Major contributions of this paper include a new interpretation of the χ2 statistic in a GWAS setting and new results about the Hamming distance score, a key component for one of the privacy-preserving methods.
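
    For context, the statistic in question is the standard Pearson chi-squared test applied to a 2x2 table of allele counts in cases and controls (the paper's contribution is a new interpretation of it in a privacy-preserving setting, not the statistic itself):

        \chi^2 = \sum_{i \in \{\text{case},\,\text{control}\}} \sum_{j \in \{A,\,a\}}
                 \frac{(O_{ij} - E_{ij})^2}{E_{ij}},
        \qquad E_{ij} = \frac{(\text{row } i \text{ total}) \times (\text{column } j \text{ total})}{\text{grand total}},

    where O_ij are the observed allele counts; under the null hypothesis of no association the statistic is approximately chi-squared distributed with one degree of freedom.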

  17. On the comprehensibility and perceived privacy protection of indirect questioning techniques.

    Science.gov (United States)

    Hoffmann, Adrian; Waubert de Puiseau, Berenike; Schmidt, Alexander F; Musch, Jochen

    2017-08-01

    On surveys that assess sensitive personal attributes, indirect questioning aims at increasing respondents' willingness to answer truthfully by protecting confidentiality. However, the assumption that subjects understand questioning procedures fully and trust them to protect their privacy is rarely tested. In a scenario-based design, we compared four indirect questioning procedures in terms of their comprehensibility and perceived privacy protection. All indirect questioning techniques were found to be less comprehensible by respondents than a conventional direct question used for comparison. Less-educated respondents experienced more difficulties when confronted with any indirect questioning technique. Regardless of education, the crosswise model was found to be the most comprehensible among the four indirect methods. Indirect questioning in general was perceived to increase privacy protection in comparison to a direct question. Unexpectedly, comprehension and perceived privacy protection did not correlate. We recommend assessing these factors separately in future evaluations of indirect questioning.
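
    For reference, the crosswise model mentioned above pairs the sensitive question with an innocuous statement of known prevalence p (for example, a birthday falling in a given part of the year) and asks only whether the answers to the two are the same. The prevalence pi of the sensitive attribute is then estimated, as usually presented in the randomized-response literature, by

        \hat{\pi} = \frac{\hat{\lambda} + p - 1}{2p - 1}, \qquad p \neq \tfrac{1}{2},

    where \hat{\lambda} is the observed proportion of "both true or both false" answers. The cited study evaluates how well respondents understand and trust such procedures rather than the estimator itself.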

  18. Privacy Issues: Journalists Should Balance Need for Privacy with Need to Cover News.

    Science.gov (United States)

    Plopper, Bruce

    1998-01-01

    Notes that journalists have to balance their desire to print the news with personal rights to privacy. Argues that a working knowledge of ethics and law helps journalism students resolve such issues. Discusses ethical issues; legal aspects of privacy; and "training" administrators. Offers a list of questions to ask, six notable court…

  19. The Regulatory Framework for Privacy and Security

    Science.gov (United States)

    Hiller, Janine S.

    The internet enables the easy collection of massive amounts of personally identifiable information. Unregulated data collection causes distrust and conflicts with widely accepted principles of privacy. The regulatory framework in the United States for ensuring privacy and security in the online environment consists of federal, state, and self-regulatory elements. New laws have been passed to address technological and internet practices that conflict with privacy protecting policies. The United States and the European Union approaches to privacy differ significantly, and the global internet environment will likely cause regulators to face the challenge of balancing privacy interests with data collection for many years to come.

  20. A New Look at The Right to Privacy: Case Snowden and legal postmodernity

    Directory of Open Access Journals (Sweden)

    José Isaac Pilati

    2014-12-01

    Full Text Available http://dx.doi.org/10.5007/2177-7055.2014v35n69p281 Edward Snowden was responsible for the disclosure of the data collection program developed by the National Security Agency. This sparked a strong debate on new forms of violation of the right to privacy, which demonstrates the need to adapt the law to the reality resulting from technological innovations. In this new technological context, this article draws on the Snowden case to discuss the political and legal issues of privacy. The doctrinal treatment of the topic is updated, and a theoretical approach to privacy as a collective good is proposed within the Legal Theory of Postmodernism, a new paradigm.

  1. Privacy-Preserving Trajectory Collection

    DEFF Research Database (Denmark)

    Gidofalvi, Gyozo; Xuegang, Huang; Pedersen, Torben Bach

    2008-01-01

    In order to provide context-aware Location-Based Services, real location data of mobile users must be collected and analyzed by spatio-temporal data mining methods. However, the data mining methods need precise location data, while the mobile users want to protect their location privacy. To remedy this situation, this paper first formally defines novel location privacy requirements. Then, it briefly presents a system for privacy-preserving trajectory collection that meets these requirements. The system is composed of an untrusted server and clients communicating in a P2P network. Location data is anonymized in the system using data cloaking and data swapping techniques. Finally, the paper empirically demonstrates that the proposed system is effective and feasible.
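
    A toy illustration (not the paper's protocol) of the two anonymization ideas named above: cloaking a precise position to a coarse grid cell, and swapping trajectory tails between two clients before the data reach the untrusted server. The cell size and the swap rule are arbitrary choices made for this example.

        import random

        def cloak(lat: float, lon: float, cell_deg: float = 0.01):
            # Generalize a precise GPS fix to the corner of its grid cell (~1 km).
            return (round(lat // cell_deg * cell_deg, 4),
                    round(lon // cell_deg * cell_deg, 4))

        def swap_tails(traj_a, traj_b, rng=random):
            # Swap the tails of two trajectories at a random cut point, so the
            # server cannot attribute a full trajectory to a single client.
            cut = rng.randint(1, min(len(traj_a), len(traj_b)) - 1)
            return traj_a[:cut] + traj_b[cut:], traj_b[:cut] + traj_a[cut:]

        a = [cloak(55.676 + i * 0.003, 12.568) for i in range(5)]
        b = [cloak(55.690 + i * 0.003, 12.550) for i in range(5)]
        print(swap_tails(a, b, random.Random(0)))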

  2. Data Security and Privacy in Apps for Dementia: An Analysis of Existing Privacy Policies.

    Science.gov (United States)

    Rosenfeld, Lisa; Torous, John; Vahia, Ipsit V

    2017-08-01

    Despite tremendous growth in the number of health applications (apps), little is known about how well these apps protect their users' health-related data. This gap in knowledge is of particular concern for apps targeting people with dementia, whose cognitive impairment puts them at increased risk of privacy breaches. In this article, we determine how many dementia apps have privacy policies and how well they protect user data. Our analysis included all iPhone apps that matched the search terms "medical + dementia" or "health & fitness + dementia" and collected user-generated content. We evaluated all available privacy policies for these apps based on criteria that systematically measure how individual user data is handled. Seventy-two apps met the above search terms and collected user data. Of these, only 33 (46%) had an available privacy policy. Nineteen of the 33 with policies (58%) were specific to the app in question, and 25 (76%) specified how individual-user as opposed to aggregate data would be handled. Among these, there was a preponderance of missing information, the majority acknowledged collecting individual data for internal purposes, and most admitted to instances in which they would share user data with outside parties. At present, the majority of health apps focused on dementia lack a privacy policy, and those that do exist lack clarity. Bolstering safeguards and improving communication about privacy protections will help facilitate consumer trust in apps, thereby enabling more widespread and meaningful use by people with dementia and those involved in their care. Copyright © 2017. Published by Elsevier Inc.

  3. Enhancing Privacy in Wearable IoT through a Provenance Architecture

    Directory of Open Access Journals (Sweden)

    Richard K. Lomotey

    2018-04-01

    Full Text Available The Internet of Things (IoT) is inspired by network interconnectedness of humans, objects, and cloud services to facilitate new use cases and new business models across multiple enterprise domains including healthcare. This creates the need for continuous data streaming in IoT architectures, which are mainly designed following the broadcast model. The model facilitates IoT devices to sense and deliver information to other nodes (e.g., cloud, physical objects, etc.) that are interested in the information. However, this is a recipe for privacy breaches since sensitive data, such as personal vitals from wearables, can be delivered to undesired sniffing nodes. In order to protect users’ privacy and manufacturers’ IP, as well as to detect and block malicious activity, this research paper proposes a privacy-oriented IoT architecture following the provenance technique. This ensures that the IoT data will only be delivered to the nodes that subscribe to receive the information. Using the provenance technique to ensure high transparency, the work is able to provide trace routes for a digital audit trail. Several empirical evaluations are conducted in a real-world wearable IoT ecosystem to prove the superiority of the proposed work.
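
    A hypothetical sketch of the delivery behaviour described above: readings are handed only to nodes that have subscribed for that data type, and every delivery is appended to a provenance log that can later serve as the digital audit trail. Class and method names are illustrative, not taken from the paper.

        import time

        class ProvenanceBroker:
            # Delivers wearable readings only to subscribed nodes and keeps an audit trail.
            def __init__(self):
                self.subscriptions = {}   # data_type -> set of node ids
                self.provenance_log = []  # audit trail of every delivery

            def subscribe(self, node_id: str, data_type: str):
                self.subscriptions.setdefault(data_type, set()).add(node_id)

            def publish(self, source: str, data_type: str, value):
                receivers = self.subscriptions.get(data_type, set())
                for node in receivers:  # no broadcast to non-subscribed nodes
                    self.provenance_log.append({"time": time.time(), "source": source,
                                                "type": data_type, "delivered_to": node})
                return receivers

        broker = ProvenanceBroker()
        broker.subscribe("cardiology_service", "heart_rate")
        broker.publish("wristband_17", "heart_rate", 72)
        broker.publish("wristband_17", "gps", (55.67, 12.57))  # no subscribers, not delivered
        print(broker.provenance_log)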

  4. Privacy concerns, dead or misunderstood? : The perceptions of privacy amongst the young and old

    NARCIS (Netherlands)

    Steijn, Wouter; Vedder, Anton

    2015-01-01

    The concept of ‘privacy’ has become an important topic for academics and policy-makers. Ubiquitous computing and internet access raise new questions in relation to privacy in the virtual world, including individuals’ appreciation of privacy and how this can be safeguarded. This article contributes

  5. Privacy in the Genomic Era

    Science.gov (United States)

    NAVEED, MUHAMMAD; AYDAY, ERMAN; CLAYTON, ELLEN W.; FELLAY, JACQUES; GUNTER, CARL A.; HUBAUX, JEAN-PIERRE; MALIN, BRADLEY A.; WANG, XIAOFENG

    2015-01-01

    Genome sequencing technology has advanced at a rapid pace and it is now possible to generate highly-detailed genotypes inexpensively. The collection and analysis of such data has the potential to support various applications, including personalized medical services. While the benefits of the genomics revolution are trumpeted by the biomedical community, the increased availability of such data has major implications for personal privacy; notably because the genome has certain essential features, which include (but are not limited to) (i) an association with traits and certain diseases, (ii) identification capability (e.g., forensics), and (iii) revelation of family relationships. Moreover, direct-to-consumer DNA testing increases the likelihood that genome data will be made available in less regulated environments, such as the Internet and for-profit companies. The problem of genome data privacy thus resides at the crossroads of computer science, medicine, and public policy. While the computer scientists have addressed data privacy for various data types, there has been less attention dedicated to genomic data. Thus, the goal of this paper is to provide a systematization of knowledge for the computer science community. In doing so, we address some of the (sometimes erroneous) beliefs of this field and we report on a survey we conducted about genome data privacy with biomedical specialists. Then, after characterizing the genome privacy problem, we review the state-of-the-art regarding privacy attacks on genomic data and strategies for mitigating such attacks, as well as contextualizing these attacks from the perspective of medicine and public policy. This paper concludes with an enumeration of the challenges for genome data privacy and presents a framework to systematize the analysis of threats and the design of countermeasures as the field moves forward. PMID:26640318

  6. Privacy in the Genomic Era.

    Science.gov (United States)

    Naveed, Muhammad; Ayday, Erman; Clayton, Ellen W; Fellay, Jacques; Gunter, Carl A; Hubaux, Jean-Pierre; Malin, Bradley A; Wang, Xiaofeng

    2015-09-01

    Genome sequencing technology has advanced at a rapid pace and it is now possible to generate highly-detailed genotypes inexpensively. The collection and analysis of such data has the potential to support various applications, including personalized medical services. While the benefits of the genomics revolution are trumpeted by the biomedical community, the increased availability of such data has major implications for personal privacy; notably because the genome has certain essential features, which include (but are not limited to) (i) an association with traits and certain diseases, (ii) identification capability (e.g., forensics), and (iii) revelation of family relationships. Moreover, direct-to-consumer DNA testing increases the likelihood that genome data will be made available in less regulated environments, such as the Internet and for-profit companies. The problem of genome data privacy thus resides at the crossroads of computer science, medicine, and public policy. While the computer scientists have addressed data privacy for various data types, there has been less attention dedicated to genomic data. Thus, the goal of this paper is to provide a systematization of knowledge for the computer science community. In doing so, we address some of the (sometimes erroneous) beliefs of this field and we report on a survey we conducted about genome data privacy with biomedical specialists. Then, after characterizing the genome privacy problem, we review the state-of-the-art regarding privacy attacks on genomic data and strategies for mitigating such attacks, as well as contextualizing these attacks from the perspective of medicine and public policy. This paper concludes with an enumeration of the challenges for genome data privacy and presents a framework to systematize the analysis of threats and the design of countermeasures as the field moves forward.

  7. Social Transparency through Recommendation Engines and its Challenges: Looking Beyond Privacy

    Directory of Open Access Journals (Sweden)

    Remus TITIRIGA

    2011-01-01

    Full Text Available Our knowledge society is quickly becoming a 'transparent' one. This transparency is acquired, among other means, by 'personalization' or 'profiling': ICT tools gathering contextualized information about individuals in human–computer interactions. The paper begins with an overview of these ICT tools (behavioral targeting, recommendation engines, 'personalization' through social networking). Based on these developments, the analysis focuses on a case study of a social network (Facebook) and the trade-offs between 'personalization' and privacy constraints. A deeper analysis reveals unexpected challenges and the need to move beyond the privacy paradigm. Finally, a draft of possible normative solutions, grounded in new forms of individual rights, is outlined.

  8. Privacy-Preserving Location-Based Service Scheme for Mobile Sensing Data

    Directory of Open Access Journals (Sweden)

    Qingqing Xie

    2016-11-01

    Full Text Available With the wide use of mobile sensing application, more and more location-embedded data are collected and stored in mobile clouds, such as iCloud, Samsung cloud, etc. Using these data, the cloud service provider (CSP) can provide location-based service (LBS) for users. However, the mobile cloud is untrustworthy. The privacy concerns force the sensitive locations to be stored on the mobile cloud in an encrypted form. However, this brings a great challenge to utilize these data to provide efficient LBS. To solve this problem, we propose a privacy-preserving LBS scheme for mobile sensing data, based on the RSA (for Rivest, Shamir and Adleman) algorithm and ciphertext policy attribute-based encryption (CP-ABE) scheme. The mobile cloud can perform location distance computing and comparison efficiently for authorized users, without location privacy leakage. In the end, theoretical security analysis and experimental evaluation demonstrate that our scheme is secure against the chosen plaintext attack (CPA) and efficient enough for practical applications in terms of user side computation overhead.

  9. Privacy-Preserving Location-Based Service Scheme for Mobile Sensing Data.

    Science.gov (United States)

    Xie, Qingqing; Wang, Liangmin

    2016-11-25

    With the wide use of mobile sensing application, more and more location-embedded data are collected and stored in mobile clouds, such as iCloud, Samsung cloud, etc. Using these data, the cloud service provider (CSP) can provide location-based service (LBS) for users. However, the mobile cloud is untrustworthy. The privacy concerns force the sensitive locations to be stored on the mobile cloud in an encrypted form. However, this brings a great challenge to utilize these data to provide efficient LBS. To solve this problem, we propose a privacy-preserving LBS scheme for mobile sensing data, based on the RSA (for Rivest, Shamir and Adleman) algorithm and ciphertext policy attribute-based encryption (CP-ABE) scheme. The mobile cloud can perform location distance computing and comparison efficiently for authorized users, without location privacy leakage. In the end, theoretical security analysis and experimental evaluation demonstrate that our scheme is secure against the chosen plaintext attack (CPA) and efficient enough for practical applications in terms of user side computation overhead.

  10. Privacy-preserving Kruskal-Wallis test.

    Science.gov (United States)

    Guo, Suxin; Zhong, Sheng; Zhang, Aidong

    2013-10-01

    Statistical tests are powerful tools for data analysis. Kruskal-Wallis test is a non-parametric statistical test that evaluates whether two or more samples are drawn from the same distribution. It is commonly used in various areas. But sometimes, the use of the method is impeded by privacy issues raised in fields such as biomedical research and clinical data analysis because of the confidential information contained in the data. In this work, we give a privacy-preserving solution for the Kruskal-Wallis test which enables two or more parties to coordinately perform the test on the union of their data without compromising their data privacy. To the best of our knowledge, this is the first work that solves the privacy issues in the use of the Kruskal-Wallis test on distributed data. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
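
    For context, the statistic being protected is the ordinary Kruskal-Wallis H. The sketch below computes it on pooled plaintext data; this is the computation that the privacy-preserving protocol obtains jointly without any party revealing its raw values. The party names and samples are invented for illustration.

    ```python
    # Non-private reference computation of the Kruskal-Wallis H statistic.
    import numpy as np
    from scipy import stats

    party_a = np.array([4.1, 5.3, 2.8, 6.0])
    party_b = np.array([3.9, 4.4, 5.1])
    party_c = np.array([7.2, 6.8, 5.9, 8.0, 6.5])

    # What a trusted curator would compute on the union of the parties' data;
    # the protocol in the record yields the same H without pooling raw values.
    h_stat, p_value = stats.kruskal(party_a, party_b, party_c)
    print(f"H = {h_stat:.3f}, p = {p_value:.4f}")
    ```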

  11. Negotiating privacy in surveillant welfare relations

    DEFF Research Database (Denmark)

    Andersen, Lars Bo; Lauritsen, Peter; Bøge, Ask Risom

    However, while privacy is central to debates of surveillance, it has proven less productive as an analytical resource for studying surveillance in practice. Consequently, this paper reviews different conceptualisations of privacy in relation to welfare and surveillance and argues for strengthening the analytical capacity of the concept by rendering it a situated and relational concept. The argument is developed through a research and design project called Teledialogue, meant to improve the relation between case managers and children placed at institutions or in foster families. Privacy in Teledialogue […] notion of privacy are discussed in relation to both research and public debates on surveillance in a welfare setting.

  12. Insights to develop privacy policy for organization in Indonesia

    Science.gov (United States)

    Rosmaini, E.; Kusumasari, T. F.; Lubis, M.; Lubis, A. R.

    2018-03-01

    Nowadays, the increased utilization of shared applications over the network not only dictates enhanced security but also emphasizes the need to balance privacy protection with ease of use. Meanwhile, the accessibility and availability demanded of organizational services make privacy obligations a more complex process to handle and control. Nonetheless, the underlying principles for privacy policy exist in current Indonesian law, even though they are spread across various regulations. Religious, constitutional, statutory, regulatory, customary and cultural requirements still serve as the reference model for controlling data collection and information sharing. Moreover, as customers and organizations often misinterpret their responsibilities and rights at the business function, process and level, the essential consideration for professionals is how to articulate clearly the rules that govern information gathering and distribution, in a manner that translates into information system specifications and requirements for developers and managers. This study focuses on providing suggestions and recommendations for developing privacy policy, based on a descriptive analysis of 791 respondents on personal data protection with respect to political and economic factors in Indonesia.

  13. Privacy-preserving efficient searchable encryption

    OpenAIRE

    Ferreira, Bernardo Luís da Silva

    2016-01-01

    Data storage and computation outsourcing to third-party managed data centers, in environments such as Cloud Computing, is increasingly being adopted by individuals, organizations, and governments. However, as cloud-based outsourcing models expand to society-critical data and services, the lack of effective and independent control over security and privacy conditions in such settings presents significant challenges. An interesting solution to these issues is to perform comput...

  14. Trust and Privacy in Healthcare

    Science.gov (United States)

    Singleton, Peter; Kalra, Dipak

    This paper considers issues of trust and privacy in healthcare around increased data-sharing through Electronic Health Records (EHRs). It uses a model structured around different aspects of trust in the healthcare organisation’s reasons for greater data-sharing and their ability to execute EHR projects, particularly any associated confidentiality controls. It reflects the individual’s personal circumstances and attitude to use of health records.

  15. 32 CFR 311.7 - OSD/JS Privacy Office Processes.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 2 2010-07-01 2010-07-01 false OSD/JS Privacy Office Processes. 311.7 Section...) PRIVACY PROGRAM OFFICE OF THE SECRETARY OF DEFENSE AND JOINT STAFF PRIVACY PROGRAM § 311.7 OSD/JS Privacy Office Processes. The OSD/JS Privacy Office shall: (a) Exercise oversight and administrative control of...

  16. 32 CFR 701.101 - Privacy program terms and definitions.

    Science.gov (United States)

    2010-07-01

    ... from a project on privacy issues, identifying and resolving the privacy risks, and approval by a... 32 National Defense 5 2010-07-01 2010-07-01 false Privacy program terms and definitions. 701.101... DEPARTMENT OF THE NAVY DOCUMENTS AFFECTING THE PUBLIC DON Privacy Program § 701.101 Privacy program terms and...

  17. An informational theory of privacy

    NARCIS (Netherlands)

    Schottmuller, C.; Jann, Ole

    2016-01-01

    We develop a theory that explains how and when privacy can increase welfare. Without privacy, some individuals misrepresent their preferences, because they will otherwise be statistically discriminated against. This "chilling effect" hurts them individually, and impairs information aggregation. The

  18. 45 CFR 503.2 - General policies-Privacy Act.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 3 2010-10-01 2010-10-01 false General policies-Privacy Act. 503.2 Section 503.2... THE UNITED STATES, DEPARTMENT OF JUSTICE RULES OF PRACTICE PRIVACY ACT AND GOVERNMENT IN THE SUNSHINE REGULATIONS Privacy Act Regulations § 503.2 General policies—Privacy Act. The Commission will protect the...

  19. Preserving differential privacy under finite-precision semantics.

    Directory of Open Access Journals (Sweden)

    Ivan Gazeau

    2013-06-01

    Full Text Available The approximation introduced by finite-precision representation of continuous data can induce arbitrarily large information leaks even when the computation using exact semantics is secure. Such leakage can thus undermine design efforts aimed at protecting sensitive information. We focus here on differential privacy, an approach to privacy that emerged from the area of statistical databases and is now widely applied also in other domains. In this approach, privacy is protected by the addition of noise to a true (private) value. To date, this approach to privacy has been proved correct only in the ideal case in which computations are made using an idealized, infinite-precision semantics. In this paper, we analyze the situation at the implementation level, where the semantics is necessarily finite-precision, i.e., the representation of real numbers and the operations on them are rounded according to some level of precision. We show that in general there are violations of the differential privacy property, and we study the conditions under which we can still guarantee a limited (but, arguably, totally acceptable) variant of the property, under only a minor degradation of the privacy level. Finally, we illustrate our results on two cases of noise-generating distributions: the standard Laplacian mechanism commonly used in differential privacy, and a bivariate version of the Laplacian recently introduced in the setting of privacy-aware geolocation.
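
    A minimal sketch of the standard Laplacian mechanism the record refers to, with the caveat the paper analyzes: in any implementation the noisy output is a finite-precision float rather than an ideal real number. The epsilon, sensitivity, and query value below are illustrative assumptions.

    ```python
    import numpy as np

    def laplace_mechanism(true_value, sensitivity, epsilon, rng):
        """Return true_value plus Laplace(scale = sensitivity/epsilon) noise."""
        scale = sensitivity / epsilon
        return true_value + rng.laplace(loc=0.0, scale=scale)

    rng = np.random.default_rng(0)
    count = 42            # true (private) counting-query answer, sensitivity 1
    noisy = laplace_mechanism(count, sensitivity=1.0, epsilon=0.5, rng=rng)

    # In the idealized analysis `noisy` is a real number; here it is an
    # IEEE-754 double, so the set of reachable outputs is discrete. The paper
    # shows this gap can leak information unless noise generation and rounding
    # are handled carefully.
    print(round(noisy, 6))
    ```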

  20. Privacy-preserving distributed clustering

    DEFF Research Database (Denmark)

    Erkin, Zekeriya; Veugen, Thijs; Toft, Tomas

    2013-01-01

    with any other entity, including the service provider. Such privacy concerns lead to trust issues between entities, which clearly damages the functioning of the service and even blocks cooperation between entities with similar data sets. To enable joint efforts with private data, we propose a protocol......, or in some cases, information from different databases is pooled to enrich the data so that the merged database can improve the clustering effort. However, in either case, the content of the database may be privacy sensitive and/or commercially valuable such that the owners may not want to share their data...... provider with computations. Experimental results clearly indicate that the work we present is an efficient way of deploying a privacy-preserving clustering algorithm in a distributed manner....

  1. A Privacy-Preserving Distributed Optimal Scheduling for Interconnected Microgrids

    Directory of Open Access Journals (Sweden)

    Nian Liu

    2016-12-01

    Full Text Available With the development of microgrids (MGs), interconnected operation of multiple MGs is becoming a promising strategy for the smart grid. In this paper, a privacy-preserving distributed optimal scheduling method is proposed for the interconnected microgrids (IMG) with a battery energy storage system (BESS) and renewable energy resources (RESs). The optimal scheduling problem is modeled to minimize the coalitional operation cost of the IMG, including the fuel cost of conventional distributed generators and the life loss cost of BESSs. By using the framework of the alternating direction method of multipliers (ADMM), a distributed optimal scheduling model and an iterative solution algorithm for the IMG are introduced; only the expected exchanging power (EEP) of each MG is required during the iterations. Furthermore, a privacy-preserving strategy for the sharing of the EEP among MGs is designed to work with the mechanism of the distributed algorithm. According to the security analysis, the EEP can be delivered in a cooperative and privacy-preserving way. A case study and numerical results are given in terms of the convergence of the algorithm, the comparison of the costs and the implementation efficiency.
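
    As a rough illustration of this style of iteration (not the authors' model), the sketch below runs a textbook ADMM "sharing" update in which each microgrid keeps its cost parameters private and only an expected exchanging power value is reported each round, while a coordinator drives the exchanges toward balance. The quadratic cost functions and all numbers are invented.

    ```python
    import numpy as np

    # Private per-MG quadratic cost a_i * (eep - preferred_i)^2, a stand-in for
    # the fuel and battery-life costs described in the record.
    a = np.array([1.0, 0.5, 2.0])          # private cost curvatures
    pref = np.array([3.0, -1.0, -4.0])     # privately preferred exchange levels

    rho, n_iter = 1.0, 100
    x = np.zeros(3)                         # EEPs (the only shared quantities)
    u = 0.0                                 # scaled dual variable (coordinator)

    for _ in range(n_iter):
        x_bar = x.mean()                    # coordinator aggregates reported EEPs
        # Each MG solves its local subproblem using only its own private data
        # plus the broadcast aggregate; closed form for quadratic costs.
        c = x - x_bar - u
        x = (2 * a * pref + rho * c) / (2 * a + rho)
        u = u + x.mean()                    # dual update pushes sum(x) toward 0

    print(np.round(x, 3), "sum =", round(x.sum(), 6))
    ```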

  2. Security and Privacy in Video Surveillance: Requirements and Challenges

    DEFF Research Database (Denmark)

    Mahmood Rajpoot, Qasim; Jensen, Christian D.

    2014-01-01

    observed by the system. Several techniques to protect the privacy of individuals have therefore been proposed, but very little research work has focused on the specific security requirements of video surveillance data (in transit or in storage) and on authorizing access to this data. In this paper, we...... present a general model of video surveillance systems that will help identify the major security and privacy requirements for a video surveillance system and we use this model to identify practical challenges in ensuring the security of video surveillance data in all stages (in transit and at rest). Our...... study shows a gap between the identified security requirements and the proposed security solutions where future research efforts may focus in this domain....

  3. A framework for privacy and security analysis of probe-based traffic information systems

    KAUST Repository

    Canepa, Edward S.; Claudel, Christian G.

    2013-01-01

    Most large scale traffic information systems rely on fixed sensors (e.g. loop detectors, cameras) and user-generated data, the latter in the form of GPS traces sent by smartphones or GPS devices onboard vehicles. While this type of data is relatively inexpensive to gather, it can pose multiple security and privacy risks, even if the location tracks are anonymous. In particular, creating bogus location tracks and sending them to the system is relatively easy. This bogus data could perturb traffic flow estimates, and disrupt the transportation system whenever these estimates are used for actuation. In this article, we propose a new framework for solving a variety of privacy and cybersecurity problems arising in transportation systems. The state of traffic is modeled by the Lighthill-Whitham-Richards traffic flow model, which is a first order scalar conservation law with concave flux function. Given a set of traffic flow data, we show that the constraints resulting from this partial differential equation are mixed integer linear inequalities for some decision variable. The resulting framework is very flexible, and can in particular be used to detect spoofing attacks in real time, or carry out attacks on location tracks. Numerical implementations are performed on experimental data from the Mobile Century experiment to validate this framework. © 2013 ACM.
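
    Written out, the Lighthill-Whitham-Richards conservation law the record references takes the form below; the Greenshields flux on the right is shown only as one common concave choice, not necessarily the flux used in the paper.

    ```latex
    % LWR model: vehicle density \rho(x,t) obeys a scalar conservation law
    % with a concave flux function Q.
    \[
      \partial_t \rho(x,t) + \partial_x\, Q\bigl(\rho(x,t)\bigr) = 0,
      \qquad
      Q(\rho) = v_f\,\rho\left(1 - \frac{\rho}{\rho_{\max}}\right),
    \]
    % where v_f is the free-flow speed and \rho_{max} the jam density
    % (Greenshields flux, one standard concave example).
    ```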

  4. 77 FR 46643 - Children's Online Privacy Protection Rule

    Science.gov (United States)

    2012-08-06

    ... providing notice to and obtaining consent from parents. Conversely, online services whose business models..., challenging others to gameplay, swapping digital collectibles, participating in monitored `chat' with... Digital Democracy (``CDD''), Consumers Union (``CU''), and the Electronic Privacy Information Center...

  5. Guaranteeing Privacy-Observing Data Exchange

    DEFF Research Database (Denmark)

    Probst, Christian W.

    2016-01-01

    Privacy is a major concern in large parts of the world when exchanging information. Ideally, we would like to have fine-grained control over how information that we deem sensitive can be propagated and used. While privacy policy languages exist, it is not possible to control whether the entity that receives data is living up to its own policy specification. In this work we present our initial work on an approach that empowers data owners to specify their privacy preferences, and data consumers to specify their data needs. Using a static analysis of the two specifications, our approach then finds a communication scheme that complies with these preferences and needs. While applicable to online transactions, the same techniques can be used in the development of IT systems dealing with sensitive data. To the best of our knowledge, no existing privacy policy language supports negotiation...

  6. Privacy and policy for genetic research.

    Science.gov (United States)

    DeCew, Judith Wagner

    2004-01-01

    I begin with a discussion of the value of privacy and what we lose without it. I then turn to the difficulties of preserving privacy for genetic information and other medical records in the face of advanced information technology. I suggest three alternative public policy approaches to the problem of protecting individual privacy and also preserving databases for genetic research: (1) governmental guidelines and centralized databases, (2) corporate self-regulation, and (3) my hybrid approach. None of these are unproblematic; I discuss strengths and drawbacks of each, emphasizing the importance of protecting the privacy of sensitive medical and genetic information as well as letting information technology flourish to aid patient care, public health and scientific research.

  7. Achieving Network Level Privacy in Wireless Sensor Networks†

    Science.gov (United States)

    Shaikh, Riaz Ahmed; Jameel, Hassan; d’Auriol, Brian J.; Lee, Heejo; Lee, Sungyoung; Song, Young-Jae

    2010-01-01

    Full network level privacy has often been categorized into four sub-categories: Identity, Route, Location and Data privacy. Achieving full network level privacy is a critical and challenging problem due to the constraints imposed by the sensor nodes (e.g., energy, memory and computation power), sensor networks (e.g., mobility and topology) and QoS issues (e.g., packet reach-ability and timeliness). In this paper, we propose two new identity, route and location privacy algorithms and a data privacy mechanism that address this problem. The proposed solutions provide additional trustworthiness and reliability at a modest cost in memory and energy. Also, we prove that our proposed solutions provide protection against various privacy disclosure attacks, such as eavesdropping and hop-by-hop trace back attacks. PMID:22294881

  8. Achieving Network Level Privacy in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Sungyoung Lee

    2010-02-01

    Full Text Available Full network level privacy has often been categorized into four sub-categories: Identity, Route, Location and Data privacy. Achieving full network level privacy is a critical and challenging problem due to the constraints imposed by the sensor nodes (e.g., energy, memory and computation power), sensor networks (e.g., mobility and topology) and QoS issues (e.g., packet reach-ability and timeliness). In this paper, we propose two new identity, route and location privacy algorithms and a data privacy mechanism that address this problem. The proposed solutions provide additional trustworthiness and reliability at a modest cost in memory and energy. Also, we prove that our proposed solutions provide protection against various privacy disclosure attacks, such as eavesdropping and hop-by-hop trace back attacks.

  9. Digital privacy in the marketplace perspectives on the information exchange

    CERN Document Server

    Milne, George

    2015-01-01

    Digital Privacy in the Marketplace focuses on the data exchanges between marketers and consumers, with special attention to the privacy challenges that are brought about by new information technologies. The purpose of this book is to provide a background source to help the reader think more deeply about the impact of privacy issues on both consumers and marketers. It covers topics such as: why privacy is needed, the technological, historical and academic theories of privacy, how market exchange affects privacy, what are the privacy harms and protections available, and what is the likely future of privacy.

  10. 48 CFR 352.224-70 - Privacy Act.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Privacy Act. 352.224-70... SOLICITATION PROVISIONS AND CONTRACT CLAUSES Texts of Provisions and Clauses 352.224-70 Privacy Act. As prescribed in 324.103(b)(2), the Contracting Officer shall insert the following clause: Privacy Act (January...

  11. Access to Information and Privacy | IDRC - International ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    As a Crown corporation, IDRC is subject to Canada's laws on access to information and privacy protection. The following resources will help you learn more about IDRC and the access to information and privacy acts, including instructions for submitting an access to information or privacy act (ATIP) request. IDRC and ATIP ...

  12. Privacy enhancing techniques - the key to secure communication and management of clinical and genomic data.

    Science.gov (United States)

    De Moor, G J E; Claerhout, B; De Meyer, F

    2003-01-01

    To introduce some of the privacy protection problems related to genomics based medicine and to highlight the relevance of Trusted Third Parties (TTPs) and of Privacy Enhancing Techniques (PETs) in the restricted context of clinical research and statistics. Practical approaches based on two different pseudonymisation models, both for batch and interactive data collection and exchange, are described and analysed. The growing need of managing both clinical and genetic data raises important legal and ethical challenges. Protecting human rights in the realm of privacy, while optimising research potential and other statistical activities is a challenge that can easily be overcome with the assistance of a trust service provider offering advanced privacy enabling/enhancing solutions. As such, the use of pseudonymisation and other innovative Privacy Enhancing Techniques can unlock valuable data sources.
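
    A minimal sketch of keyed-hash pseudonymisation in the spirit of the TTP-based models this record mentions; the field names, key handling, and pseudonym truncation are illustrative assumptions, not the authors' protocol.

    ```python
    import hmac
    import hashlib

    # Secret held only by the trusted third party (illustrative placeholder).
    TTP_SECRET_KEY = b"held-only-by-the-trusted-third-party"

    def pseudonymise(record, id_field="patient_id"):
        """Return a copy of `record` with the identifier replaced by a pseudonym."""
        out = dict(record)
        pid = out.pop(id_field).encode()
        out["pseudonym"] = hmac.new(TTP_SECRET_KEY, pid, hashlib.sha256).hexdigest()[:16]
        return out

    clinical_row = {"patient_id": "BE-123456", "diagnosis": "C50.9", "variant": "BRCA1"}
    print(pseudonymise(clinical_row))
    # The same patient always maps to the same pseudonym, so clinical and
    # genetic records can still be linked without exposing the real identifier.
    ```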

  13. Thumbs up for privacy?: Differences in online self-disclosure behavior across national cultures.

    Science.gov (United States)

    Reed, Philip J; Spiro, Emma S; Butts, Carter T

    2016-09-01

    This study investigates relationships between national-level culture and online self-disclosure behavior. We operationalize culture through the GLOBE dimensions, a set of nine variables measuring cultural practices and another nine measuring values. Our observations of self-disclosure come from the privacy settings of approximately 200,000 randomly sampled Facebook users who designated a geographical network in 2009. We model privacy awareness as a function of one or more GLOBE variables with demographic covariates, evaluating the relative influence of each factor. In the top-performing models, we find that the majority of the cultural dimensions are significantly related to privacy awareness behavior. We also find that the hypothesized directions of several of these relationships, based largely on cultural attitudes towards threat mitigation, are confirmed. Copyright © 2016. Published by Elsevier Inc.
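
    A toy sketch of the modelling approach described, regressing a binary privacy-awareness indicator on a cultural practice score plus demographic covariates; all data below are synthetic, and the single GLOBE-style predictor is an assumption made only for illustration.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    n = 1000
    uncertainty_avoidance = rng.normal(4.5, 0.8, n)   # one GLOBE-style practice score
    age = rng.integers(18, 65, n)
    female = rng.integers(0, 2, n)

    # Synthetic ground truth: higher uncertainty avoidance -> more privacy awareness.
    logit = 0.9 * (uncertainty_avoidance - 4.5) + 0.02 * (age - 35) + 0.2 * female
    privacy_aware = rng.random(n) < 1 / (1 + np.exp(-logit))

    X = np.column_stack([uncertainty_avoidance, age, female])
    model = LogisticRegression().fit(X, privacy_aware)
    print(dict(zip(["uncertainty_avoidance", "age", "female"],
                   model.coef_[0].round(2))))
    ```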

  14. PRIVACY PROTECTION PROBLEMS IN SOCIAL NETWORKS

    OpenAIRE

    OKUR, M. Cudi

    2011-01-01

    Protecting privacy has become a major concern for most social network users because of increased difficulties of controlling the online data. This article presents an assessment of the common privacy related risks of social networking sites. Open and hidden privacy risks of active and passive online profiles are examined and increasing share of social networking in these phenomena is discussed. Inadequacy of available legal and institutional protection is demonstrated and the effectiveness of...

  15. Facebook: Personality and privacy on profiles

    OpenAIRE

    Casado Riera, Carla; Oberst, Ursula; Carbonell, Xavier

    2015-01-01

    The aim of this study was to examine the possible relationship between the privacy settings of Facebook profiles and two personality dimensions, extraversion and neuroticism, in relation to gender. The Privacy on Facebook Questionnaire and the Eysenck Personality Inventory were applied to a sample of 92 women and 70 men, all users of Facebook. No significant relationship was found between extraversion or neuroticism and the privacy settings of Facebook profiles, but the results showed significant...

  16. Locking it down: The privacy and security of mobile medication apps.

    Science.gov (United States)

    Grindrod, Kelly; Boersema, Jonathan; Waked, Khrystine; Smith, Vivian; Yang, Jilan; Gebotys, Catherine

    2017-01-01

    To explore the privacy and security of free medication applications (apps) available to Canadian consumers. The authors searched the Canadian iTunes store for iOS apps and the Canadian Google Play store for Android apps related to medication use and management. Using an Apple iPad Air 2 and a Google Nexus 7 tablet, 2 reviewers generated a list of apps that met the following inclusion criteria: free, available in English, intended for consumer use and related to medication management. Using a standard data collection form, 2 reviewers independently coded each app for the presence/absence of passwords, the storage of personal health information, a privacy statement, encryption, remote wipe and third-party sharing. A Cohen's Kappa statistic was used to measure interrater agreement. Of the 184 apps evaluated, 70.1% had no password protection or sign-in system. Personal information, including name, date of birth and gender, was requested by 41.8% (77/184) of apps. Contact information, such as address, phone number and email, was requested by 25% (46/184) of apps. Finally, personal health information, other than medication name, was requested by 89.1% (164/184) of apps. Only 34.2% (63/184) of apps had a privacy policy in place. Most free medication apps offer very limited authentication and privacy protocols. As a result, the onus currently falls on patients to input information in these apps selectively and to be aware of the potential privacy issues. Until more secure systems are built, health care practitioners cannot fully support patients wanting to use such apps.

  17. New threats to health data privacy.

    Science.gov (United States)

    Li, Fengjun; Zou, Xukai; Liu, Peng; Chen, Jake Y

    2011-11-24

    Along with the rapid digitalization of health data (e.g. Electronic Health Records), there is increasing concern about maintaining data privacy while garnering the benefits, especially when the data are required to be published for secondary use. Most of the current research on protecting health data privacy is centered around data de-identification and data anonymization, which removes the identifiable information from the published health data to prevent an adversary from reasoning about the privacy of the patients. However, published health data is not the only source that the adversaries can count on: with a large amount of information that people voluntarily share on the Web, sophisticated attacks that join disparate information pieces from multiple sources against health data privacy become practical. Little effort has so far been devoted to studying these attacks. We study how patient privacy could be compromised with the help of today's information technologies. In particular, we show that private healthcare information could be collected by aggregating and associating disparate pieces of information from multiple online data sources including online social networks, public records and search engine results. We demonstrate a real-world case study to show user identity and privacy are highly vulnerable to the attribution, inference and aggregation attacks. We also show that people are highly identifiable to adversaries even with inaccurate information pieces about the target, with real data analysis. We claim that so much information has been made available electronically and online that people are very vulnerable without effective privacy protection.

  18. New threats to health data privacy

    Directory of Open Access Journals (Sweden)

    Li Fengjun

    2011-11-01

    Full Text Available Abstract Background Along with the rapid digitalization of health data (e.g. Electronic Health Records), there is increasing concern about maintaining data privacy while garnering the benefits, especially when the data are required to be published for secondary use. Most of the current research on protecting health data privacy is centered around data de-identification and data anonymization, which removes the identifiable information from the published health data to prevent an adversary from reasoning about the privacy of the patients. However, published health data is not the only source that the adversaries can count on: with a large amount of information that people voluntarily share on the Web, sophisticated attacks that join disparate information pieces from multiple sources against health data privacy become practical. Little effort has so far been devoted to studying these attacks. Results We study how patient privacy could be compromised with the help of today’s information technologies. In particular, we show that private healthcare information could be collected by aggregating and associating disparate pieces of information from multiple online data sources including online social networks, public records and search engine results. We demonstrate a real-world case study to show user identity and privacy are highly vulnerable to the attribution, inference and aggregation attacks. We also show that people are highly identifiable to adversaries even with inaccurate information pieces about the target, with real data analysis. Conclusion We claim that so much information has been made available electronically and online that people are very vulnerable without effective privacy protection.

  19. Public Auditing with Privacy Protection in a Multi-User Model of Cloud-Assisted Body Sensor Networks

    Science.gov (United States)

    Li, Song; Cui, Jie; Zhong, Hong; Liu, Lu

    2017-01-01

    Wireless Body Sensor Networks (WBSNs) are gaining importance in the era of the Internet of Things (IoT). The modern medical system is a particular area where the WBSN techniques are being increasingly adopted for various fundamental operations. Despite such increasing deployments of WBSNs, issues such as the infancy in the size, capabilities and limited data processing capacities of the sensor devices restrain their adoption in resource-demanding applications. Though providing computing and storage supplements from cloud servers can potentially enrich the capabilities of the WBSNs devices, data security is one of the prevailing issues that affects the reliability of cloud-assisted services. Sensitive applications such as modern medical systems demand assurance of the privacy of the users’ medical records stored in distant cloud servers. Since it is economically impossible to set up private cloud servers for every client, auditing data security managed in the remote servers has necessarily become an integral requirement of WBSNs’ applications relying on public cloud servers. To this end, this paper proposes a novel certificateless public auditing scheme with integrated privacy protection. The multi-user model in our scheme supports groups of users to store and share data, thus exhibiting the potential for WBSNs’ deployments within community environments. Furthermore, our scheme enriches user experiences by offering public verifiability, forward security mechanisms and revocation of illegal group members. Experimental evaluations demonstrate the security effectiveness of our proposed scheme under the Random Oracle Model (ROM) by outperforming existing cloud-assisted WBSN models. PMID:28475110

  20. Public Auditing with Privacy Protection in a Multi-User Model of Cloud-Assisted Body Sensor Networks.

    Science.gov (United States)

    Li, Song; Cui, Jie; Zhong, Hong; Liu, Lu

    2017-05-05

    Wireless Body Sensor Networks (WBSNs) are gaining importance in the era of the Internet of Things (IoT). The modern medical system is a particular area where the WBSN techniques are being increasingly adopted for various fundamental operations. Despite such increasing deployments of WBSNs, issues such as the infancy in the size, capabilities and limited data processing capacities of the sensor devices restrain their adoption in resource-demanding applications. Though providing computing and storage supplements from cloud servers can potentially enrich the capabilities of the WBSNs devices, data security is one of the prevailing issues that affects the reliability of cloud-assisted services. Sensitive applications such as modern medical systems demand assurance of the privacy of the users' medical records stored in distant cloud servers. Since it is economically impossible to set up private cloud servers for every client, auditing data security managed in the remote servers has necessarily become an integral requirement of WBSNs' applications relying on public cloud servers. To this end, this paper proposes a novel certificateless public auditing scheme with integrated privacy protection. The multi-user model in our scheme supports groups of users to store and share data, thus exhibiting the potential for WBSNs' deployments within community environments. Furthermore, our scheme enriches user experiences by offering public verifiability, forward security mechanisms and revocation of illegal group members. Experimental evaluations demonstrate the security effectiveness of our proposed scheme under the Random Oracle Model (ROM) by outperforming existing cloud-assisted WBSN models.

  1. PrivateRide: A Privacy-Enhanced Ride-Hailing Service

    Directory of Open Access Journals (Sweden)

    Pham Anh

    2017-04-01

    Full Text Available In the past few years, we have witnessed a rise in the popularity of ride-hailing services (RHSs), an online marketplace that enables accredited drivers to use their own cars to drive ride-hailing users. Unlike other transportation services, RHSs raise significant privacy concerns, as providers are able to track the precise mobility patterns of millions of riders worldwide. We present the first survey and analysis of the privacy threats in RHSs. Our analysis exposes high-risk privacy threats that do not occur in conventional taxi services. Therefore, we propose PrivateRide, a privacy-enhancing and practical solution that offers anonymity and location privacy for riders, and protects drivers’ information from harvesting attacks. PrivateRide lowers the high-risk privacy threats in RHSs to a level that is at least as low as that of many taxi services. Using real data-sets from Uber and taxi rides, we show that PrivateRide significantly enhances riders’ privacy, while preserving tangible accuracy in ride matching and fare calculation, with only negligible effects on convenience. Moreover, by using our Android implementation for experimental evaluations, we show that PrivateRide’s overhead during ride setup is negligible. In short, we enable privacy-conscious riders to achieve levels of privacy that are not possible in current RHSs and even in some conventional taxi services, thereby offering a potential business differentiator.

  2. Privacy Practices of Health Social Networking Sites: Implications for Privacy and Data Security in Online Cancer Communities.

    Science.gov (United States)

    Charbonneau, Deborah H

    2016-08-01

    While online communities for social support continue to grow, little is known about the state of privacy practices of health social networking sites. This article reports on a structured content analysis of privacy policies and disclosure practices for 25 online ovarian cancer communities. All of the health social networking sites in the study sample provided privacy statements to users, yet privacy practices varied considerably across the sites. The majority of sites informed users that personal information was collected about participants and shared with third parties (96%, n = 24). Furthermore, more than half of the sites (56%, n = 14) stated that cookies technology was used to track user behaviors. Despite these disclosures, only 36% (n = 9) offered opt-out choices for sharing data with third parties. In addition, very few of the sites (28%, n = 7) allowed individuals to delete their personal information. Discussions about specific security measures used to protect personal information were largely missing. Implications for privacy, confidentiality, consumer choice, and data safety in online environments are discussed. Overall, nurses and other health professionals can utilize these findings to encourage individuals seeking online support and participating in social networking sites to build awareness of privacy risks to better protect their personal health information in the digital age.

  3. Governance Through Privacy, Fairness, and Respect for Individuals.

    Science.gov (United States)

    Baker, Dixie B; Kaye, Jane; Terry, Sharon F

    2016-01-01

    Individuals have a moral claim to be involved in the governance of their personal data. Individuals' rights include privacy, autonomy, and the ability to choose for themselves how they want to manage risk, consistent with their own personal values and life situations. The Fair Information Practices principles (FIPPs) offer a framework for governance. Privacy-enhancing technology that complies with applicable law and FIPPs offers a dynamic governance tool for enabling the fair and open use of individual's personal data. Any governance model must protect against the risks posed by data misuse. Individual perceptions of risks are a subjective function involving individuals' values toward self, family, and society, their perceptions of trust, and their cognitive decision-making skills. Individual privacy protections and individuals' right to choose are codified in the HIPAA Privacy Rule, which attempts to strike a balance between the dual goals of information flow and privacy protection. The choices most commonly given individuals regarding the use of their health information are binary ("yes" or "no") and immutable. Recent federal recommendations and law recognize the need for granular, dynamic choices. Individuals expect that they will govern the use of their own health and genomic data. Failure to build and maintain individuals' trust increases the likelihood that they will refuse to grant permission to access or use their data. The "no surprises principle" asserts that an individual's personal information should never be collected, used, transmitted, or disclosed in a way that would surprise the individual were she to learn about it. The FIPPs provide a powerful framework for enabling data sharing and use, while maintaining trust. We introduce the eight FIPPs adopted by the Department of Health and Human Services, and provide examples of their interpretation and implementation. Privacy risk and health risk can be reduced by giving consumers control, autonomy, and

  4. Anonymising the Sparse Dataset: A New Privacy Preservation Approach while Predicting Diseases

    Directory of Open Access Journals (Sweden)

    V. Shyamala Susan

    2016-09-01

    Full Text Available Data mining techniques analyze medical datasets with the intention of enhancing patients’ health and privacy. Most of the existing techniques are suited to low-dimensional medical datasets. The proposed methodology designs a model for the representation of sparse, high-dimensional medical datasets, with the aim of protecting the patient’s privacy from an adversary and additionally predicting the disease’s threat degree. In a sparse data set many non-zero values are randomly spread over the entire data space. Hence, the challenge is to cluster correlated patient records to predict the risk degree of the disease before it occurs in patients and to preserve privacy. The first phase converts the sparse dataset into a band matrix through a Genetic algorithm combined with Cuckoo Search (GCS). This groups the correlated patient records together and arranges them close to the diagonal. The next phase dissociates the patient’s disease, which is a sensitive value (SA), from the parameters that normally determine the disease, the Quasi Identifiers (QI). Finally, a density-based clustering technique is used over the underlying data to create anonymized groups that maintain privacy and to predict the risk level of the disease. Empirical assessments on actual health care data, the V.A. Medical Centre heart disease dataset, reveal the efficiency of this model in terms of information loss, utility and privacy.
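
    A rough sketch of the pipeline shape described, with stand-ins clearly noted: a standard bandwidth-reducing ordering (reverse Cuthill-McKee) substitutes for the paper's GA-plus-Cuckoo-Search step, and DBSCAN substitutes for its density-based clustering; the patient matrix is synthetic.

    ```python
    import numpy as np
    from scipy.sparse import csr_matrix
    from scipy.sparse.csgraph import reverse_cuthill_mckee
    from sklearn.cluster import DBSCAN

    rng = np.random.default_rng(0)
    records = (rng.random((40, 30)) < 0.08).astype(float)   # sparse patient QI matrix

    # Build a patient-similarity graph (shared non-zero attributes) and reorder
    # it so correlated records end up near the diagonal (a band-matrix shape).
    similarity = records @ records.T
    np.fill_diagonal(similarity, 0)
    perm = reverse_cuthill_mckee(csr_matrix(similarity), symmetric_mode=True)
    reordered = records[perm]

    # Density-based clustering over the reordered records forms the anonymised
    # groups within which the sensitive attribute would be generalised.
    labels = DBSCAN(eps=2.0, min_samples=3).fit_predict(reordered)
    print("groups found:", len(set(labels) - {-1}),
          "noise records:", int((labels == -1).sum()))
    ```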

  5. An Analysis of Security and Privacy Issues in Smart Grid Software Architectures on Clouds

    Energy Technology Data Exchange (ETDEWEB)

    Simmhan, Yogesh; Kumbhare, Alok; Cao, Baohua; Prasanna, Viktor K.

    2011-07-09

    Power utilities globally are increasingly upgrading to Smart Grids that use bi-directional communication with the consumer to enable an information-driven approach to distributed energy management. Clouds offer features well suited for Smart Grid software platforms and applications, such as elastic resources and shared services. However, the security and privacy concerns inherent in an information rich Smart Grid environment are further exacerbated by their deployment on Clouds. Here, we present an analysis of security and privacy issues in a Smart Grids software architecture operating on different Cloud environments, in the form of a taxonomy. We use the Los Angeles Smart Grid Project that is underway in the largest U.S. municipal utility to drive this analysis that will benefit both Cloud practitioners targeting Smart Grid applications, and Cloud researchers investigating security and privacy.

  6. Sexiled: Privacy Acquisition Strategies of College Roommates

    Science.gov (United States)

    Erlandson, Karen

    2014-01-01

    This study sought to understand how roommates make privacy bids in college residence halls. The results indicate that privacy for sexual activity is a problem for students living in college residence halls, as almost all participants (82%) reported having dealt with this issue. Two sets of responses were collected and analyzed: privacy acquisition…

  7. Privacy and CHI : methodologies for studying privacy issues

    NARCIS (Netherlands)

    Patil, S.; Romero, N.A.; Karat, J.

    2006-01-01

    This workshop aims to reflect on methodologies to empirically study privacy issues related to advanced technology. The goal is to address methodological concerns by drawing upon both theoretical perspectives as well as practical experiences.

  8. Patient Privacy in the Era of Big Data

    Directory of Open Access Journals (Sweden)

    Mehmet Kayaalp

    2018-02-01

    Full Text Available Protecting patient privacy requires various technical tools. It involves regulations for sharing, de-identifying, securely storing, transmitting and handling protected health information (PHI). It involves privacy laws and legal agreements. It requires establishing rules for monitoring privacy leaks, determining actions when they occur, and handling de-identified clinical narrative reports. De-identification is one such indispensable instrument in this set of privacy tools.

  9. Biobanking and Privacy in India.

    Science.gov (United States)

    Chaturvedi, Sachin; Srinivas, Krishna Ravi; Muthuswamy, Vasantha

    2016-03-01

    Biobank-based research is not specifically addressed in Indian statutory law and therefore Indian Council for Medical Research guidelines are the primary regulators of biobank research in India. The guidelines allow for broad consent and for any level of identification of specimens. Although privacy is a fundamental right under the Indian Constitution, courts have limited this right when it conflicts with other rights or with the public interest. Furthermore, there is no established privacy test or actionable privacy right in the common law of India. In order to facilitate biobank-based research, both of these lacunae should be addressed by statutory law specifically addressing biobanking and more directly addressing the accompanying privacy concerns. A biobank-specific law should be written with international guidelines in mind, but harmonization with other laws should not be attempted until after India has created a law addressing biobank research within the unique legal and cultural environment of India. © 2016 American Society of Law, Medicine & Ethics.

  10. Privacy amplification for quantum key distribution

    International Nuclear Information System (INIS)

    Watanabe, Yodai

    2007-01-01

    This paper examines classical privacy amplification using a universal family of hash functions. In quantum key distribution, the adversary's measurement can wait until the choice of hash functions is announced, and so the adversary's information may depend on the choice. Therefore the existing result on classical privacy amplification, which assumes the independence of the choice from the other random variables, is not applicable to this case. This paper provides a security proof of privacy amplification which is valid even when the adversary's information may depend on the choice of hash functions. The compression rate of the proposed privacy amplification can be taken to be the same as that of the existing one with an exponentially small loss in secrecy of a final key. (fast track communication)
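
    A minimal sketch of classical privacy amplification with a 2-universal hash family (here, multiplication by a random binary matrix over GF(2)); the key lengths are toy values and this is not the paper's specific construction.

    ```python
    import numpy as np

    rng = np.random.default_rng(seed=7)

    n_raw, n_final = 64, 16                   # raw key bits -> final key bits
    raw_key = rng.integers(0, 2, size=n_raw)  # partially secret reconciled key

    # The hash function is a uniformly random binary matrix; its choice may be
    # announced publicly. The record's point is that a security proof must hold
    # even when the adversary's information can depend on this announced choice.
    H = rng.integers(0, 2, size=(n_final, n_raw))

    final_key = (H @ raw_key) % 2             # matrix-vector product over GF(2)
    print("".join(map(str, final_key)))
    ```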

  11. Do Online Privacy Concerns Predict Selfie Behavior among Adolescents, Young Adults and Adults?

    Science.gov (United States)

    Dhir, Amandeep; Torsheim, Torbjørn; Pallesen, Ståle; Andreassen, Cecilie S.

    2017-01-01

    Selfies, or self-portraits, are often taken and shared on social media for online self-presentation reasons, which are considered essential for the psychosocial development and well-being of people in today’s culture. Despite the growing popularity and widespread sharing of selfies in the online space, little is known about how privacy concerns moderate selfie behavior. In addition to this, it is also not known whether privacy concerns across age and gender groups influence selfie behavior. To address this timely issue, a survey assessing common selfie behaviors, that is, frequency of taking (individual and group selfies), editing (cropping and filtering), and posting selfies online, and social media privacy concerns (over personal data being accessed and misused by third parties) was conducted. The web-survey was administered to 3,763 Norwegian social media users, ranging from 13 to 50 years, with a preponderance of women (n = 2,509, 66.7%). The present study investigated the impact of privacy concerns on selfie behaviors across gender and age groups (adolescent, young adult, and adult) by use of the structural equation modeling approach. The results suggest that young adults have greater privacy concerns compared to adolescents and adults. Females have greater privacy concerns than males. Greater privacy concerns among female social media users were linked to lower engagement in selfie behavior, but privacy concerns did not influence selfie behavior in the case of male adolescents and young adults. Overall, privacy concerns were more consistently and inversely related to selfie behavior (taking and posting) among females than males. The study results have theoretical as well as practical implications for both researchers and policy makers. PMID:28588530

  12. Do Online Privacy Concerns Predict Selfie Behavior among Adolescents, Young Adults and Adults?

    Science.gov (United States)

    Dhir, Amandeep; Torsheim, Torbjørn; Pallesen, Ståle; Andreassen, Cecilie S

    2017-01-01

    Selfies, or self-portraits, are often taken and shared on social media for online self-presentation reasons, which are considered essential for the psychosocial development and well-being of people in today's culture. Despite the growing popularity and widespread sharing of selfies in the online space, little is known about how privacy concerns moderate selfie behavior. In addition to this, it is also not known whether privacy concerns across age and gender groups influence selfie behavior. To address this timely issue, a survey assessing common selfie behaviors, that is, frequency of taking (individual and group selfies), editing (cropping and filtering), and posting selfies online, and social media privacy concerns (over personal data being accessed and misused by third parties) was conducted. The web-survey was administered to 3,763 Norwegian social media users, ranging from 13 to 50 years, with a preponderance of women ( n = 2,509, 66.7%). The present study investigated the impact of privacy concerns on selfie behaviors across gender and age groups (adolescent, young adult, and adult) by use of the structural equation modeling approach. The results suggest that young adults have greater privacy concerns compared to adolescents and adults. Females have greater privacy concerns than males. Greater privacy concerns among female social media users were linked to lower engagement in selfie behavior, but privacy concerns did not influence selfie behavior in the case of male adolescents and young adults. Overall, privacy concerns were more consistently and inversely related to selfie behavior (taking and posting) among females than males. The study results have theoretical as well as practical implications for both researchers and policy makers.

  13. Do Online Privacy Concerns Predict Selfie Behavior among Adolescents, Young Adults and Adults?

    Directory of Open Access Journals (Sweden)

    Amandeep Dhir

    2017-05-01

    Full Text Available Selfies, or self-portraits, are often taken and shared on social media for online self-presentation reasons, which are considered essential for the psychosocial development and well-being of people in today’s culture. Despite the growing popularity and widespread sharing of selfies in the online space, little is known about how privacy concerns moderate selfie behavior. In addition to this, it is also not known whether privacy concerns across age and gender groups influence selfie behavior. To address this timely issue, a survey assessing common selfie behaviors, that is, frequency of taking (individual and group selfies), editing (cropping and filtering), and posting selfies online, and social media privacy concerns (over personal data being accessed and misused by third parties) was conducted. The web-survey was administered to 3,763 Norwegian social media users, ranging from 13 to 50 years, with a preponderance of women (n = 2,509, 66.7%). The present study investigated the impact of privacy concerns on selfie behaviors across gender and age groups (adolescent, young adult, and adult) by use of the structural equation modeling approach. The results suggest that young adults have greater privacy concerns compared to adolescents and adults. Females have greater privacy concerns than males. Greater privacy concerns among female social media users were linked to lower engagement in selfie behavior, but privacy concerns did not influence selfie behavior in the case of male adolescents and young adults. Overall, privacy concerns were more consistently and inversely related to selfie behavior (taking and posting) among females than males. The study results have theoretical as well as practical implications for both researchers and policy makers.

  14. Do privacy and security regulations need a status update? Perspectives from an intergenerational survey

    Science.gov (United States)

    Pereira, Stacey; Robinson, Jill Oliver; Gutierrez, Amanda M.; Majumder, Mary A.; McGuire, Amy L.; Rothstein, Mark A.

    2017-01-01

    Background The importance of health privacy protections in the era of the “Facebook Generation” has been called into question. The ease with which younger people share personal information about themselves has led to the assumption that they are less concerned than older generations about the privacy of their information, including health information. We explored whether survey respondents’ views toward health privacy suggest that efforts to strengthen privacy protections as health information is moved online are unnecessary. Methods Using Amazon’s Mechanical Turk (MTurk), which is well-known for recruitment for survey research, we distributed a 45-item survey to individuals in the U.S. to assess their perspectives toward privacy and security of online and health information, social media behaviors, use of health and fitness devices, and demographic information. Results 1310 participants (mean age: 36 years, 50% female, 78% non-Hispanic white, 54% college graduates or higher) were categorized by generations: Millennials, Generation X, and Baby Boomers. In multivariate regression models, we found that generational cohort was an independent predictor of level of concern about privacy and security of both online and health information. Younger generations were significantly less likely to be concerned than older generations (all P < 0.05). However, internet and social media behaviors were not associated with concern about the privacy or security of online or health information (all P > 0.05). Limitations This study is limited by the non-representativeness of our sample. Conclusions Though Millennials reported lower levels of concern about privacy and security, this was not related to internet or social media behaviors, and majorities within all generations reported concern about both the privacy and security of their health information. Thus, there is no intergenerational imperative to relax privacy and security standards, and it would be advisable to take privacy and security of health information more seriously. PMID:28926626

  15. Do privacy and security regulations need a status update? Perspectives from an intergenerational survey.

    Science.gov (United States)

    Pereira, Stacey; Robinson, Jill Oliver; Peoples, Hayley A; Gutierrez, Amanda M; Majumder, Mary A; McGuire, Amy L; Rothstein, Mark A

    2017-01-01

    The importance of health privacy protections in the era of the "Facebook Generation" has been called into question. The ease with which younger people share personal information about themselves has led to the assumption that they are less concerned than older generations about the privacy of their information, including health information. We explored whether survey respondents' views toward health privacy suggest that efforts to strengthen privacy protections as health information is moved online are unnecessary. Using Amazon's Mechanical Turk (MTurk), which is well-known for recruitment for survey research, we distributed a 45-item survey to individuals in the U.S. to assess their perspectives toward privacy and security of online and health information, social media behaviors, use of health and fitness devices, and demographic information. 1310 participants (mean age: 36 years, 50% female, 78% non-Hispanic white, 54% college graduates or higher) were categorized by generations: Millennials, Generation X, and Baby Boomers. In multivariate regression models, we found that generational cohort was an independent predictor of level of concern about privacy and security of both online and health information. Younger generations were significantly less likely to be concerned than older generations (all P < 0.05). However, internet and social media behaviors were not associated with concern about the privacy or security of online or health information (all P > 0.05). This study is limited by the non-representativeness of our sample. Though Millennials reported lower levels of concern about privacy and security, this was not related to internet or social media behaviors, and majorities within all generations reported concern about both the privacy and security of their health information. Thus, there is no intergenerational imperative to relax privacy and security standards, and it would be advisable to take privacy and security of health information more seriously.

  16. 10 CFR 1304.103 - Privacy Act inquiries.

    Science.gov (United States)

    2010-01-01

    ... writing may be sent to: Privacy Act Officer, U.S. Nuclear Waste Technical Review Board, 2300 Clarendon... NUCLEAR WASTE TECHNICAL REVIEW BOARD PRIVACY ACT OF 1974 § 1304.103 Privacy Act inquiries. (a) Requests... contains a record pertaining to him or her may file a request in person or in writing, via the internet, or...

  17. Privacy Law and Print Photojournalism.

    Science.gov (United States)

    Dykhouse, Caroline Dow

    Reviews of publications about privacy law, of recent court actions, and of interviews with newspaper photographers and attorneys indicate that torts of privacy often conflict with the freedoms to publish and to gather news. Although some guidelines have already been established (about running distorted pictures, "stealing" pictures, taking…

  18. The Role of Glass in Interior Architecture: Aesthetics, Community, and Privacy

    Science.gov (United States)

    Ziff, Matthew

    2004-01-01

    Advances in glass technologies are being applied in contemporary interior architecture. Glass forms and surfaces are appearing in settings and applications that offer vivid aesthetic experiences for users, but create ambiguous messages concerning community and privacy. Where a modernist application of glass may have been directed toward creating a…

  19. Smart Grid Privacy through Distributed Trust

    Science.gov (United States)

    Lipton, Benjamin

    Though the smart electrical grid promises many advantages in efficiency and reliability, the risks to consumer privacy have impeded its deployment. Researchers have proposed protecting privacy by aggregating user data before it reaches the utility, using techniques of homomorphic encryption to prevent exposure of unaggregated values. However, such schemes generally require users to trust in the correct operation of a single aggregation server. We propose two alternative systems based on secret sharing techniques that distribute this trust among multiple service providers, protecting user privacy against a misbehaving server. We also provide an extensive evaluation of the systems considered, comparing their robustness to privacy compromise, error handling, computational performance, and data transmission costs. We conclude that while all the systems should be computationally feasible on smart meters, the two methods based on secret sharing require much less computation while also providing better protection against corrupted aggregators. Building systems using these techniques could help defend the privacy of electricity customers, as well as customers of other utilities as they move to a more data-driven architecture.
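
    The aggregation idea behind such secret-sharing schemes can be illustrated with a short sketch. The following Python fragment is only a toy model under simplifying assumptions (additive sharing modulo a public prime, honest-but-curious servers, made-up meter readings); it is not one of the specific protocols evaluated in the record above.

        import random

        PRIME = 2**61 - 1  # all arithmetic is done modulo a public prime

        def share(reading, n_servers):
            """Split one meter reading into n additive shares that sum to it mod PRIME."""
            shares = [random.randrange(PRIME) for _ in range(n_servers - 1)]
            shares.append((reading - sum(shares)) % PRIME)
            return shares

        def aggregate(all_shares):
            """Each server sums the shares it received; adding the per-server totals
            recovers total consumption without exposing any individual reading."""
            per_server_totals = [sum(column) % PRIME for column in zip(*all_shares)]
            return sum(per_server_totals) % PRIME

        readings = [30, 12, 7, 25]                      # hypothetical household readings
        all_shares = [share(r, n_servers=3) for r in readings]
        assert aggregate(all_shares) == sum(readings)   # only the sum is revealed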

  20. Do privacy and security regulations need a status update? Perspectives from an intergenerational survey.

    Directory of Open Access Journals (Sweden)

    Stacey Pereira

    Full Text Available The importance of health privacy protections in the era of the "Facebook Generation" has been called into question. The ease with which younger people share personal information about themselves has led to the assumption that they are less concerned than older generations about the privacy of their information, including health information. We explored whether survey respondents' views toward health privacy suggest that efforts to strengthen privacy protections as health information is moved online are unnecessary. Using Amazon's Mechanical Turk (MTurk), which is well-known for recruitment for survey research, we distributed a 45-item survey to individuals in the U.S. to assess their perspectives toward privacy and security of online and health information, social media behaviors, use of health and fitness devices, and demographic information. 1310 participants (mean age: 36 years, 50% female, 78% non-Hispanic white, 54% college graduates or higher) were categorized by generations: Millennials, Generation X, and Baby Boomers. In multivariate regression models, we found that generational cohort was an independent predictor of level of concern about privacy and security of both online and health information. Younger generations were significantly less likely to be concerned than older generations (all P < 0.05). However, internet and social media behaviors were not associated with concern about the privacy or security of online or health information (all P > 0.05). This study is limited by the non-representativeness of our sample. Though Millennials reported lower levels of concern about privacy and security, this was not related to internet or social media behaviors, and majorities within all generations reported concern about both the privacy and security of their health information. Thus, there is no intergenerational imperative to relax privacy and security standards, and it would be advisable to take privacy and security of health information more seriously.

  1. Privacy Act

    Science.gov (United States)

    Learn about the Privacy Act of 1974, the Electronic Government Act of 2002, the Federal Information Security Management Act, and other information about how the Environmental Protection Agency maintains its records.

  2. High utility-itemset mining and privacy-preserving utility mining

    Directory of Open Access Journals (Sweden)

    Jerry Chun-Wei Lin

    2016-03-01

    Full Text Available In recent decades, high-utility itemset mining (HUIM) has emerged as a critical research topic, since both quantity and profit factors are considered when mining high-utility itemsets (HUIs). Generally, data mining is commonly used to discover interesting and useful knowledge from massive data. It may, however, lead to privacy threats if private or secure information (e.g., HUIs) is published in public or misused. In this paper, we focus on the issues of HUIM and privacy-preserving utility mining (PPUM), and present two evolutionary algorithms to respectively mine HUIs and hide the sensitive high-utility itemsets in PPUM. Extensive experiments showed that the two proposed models for the applications of HUIM and PPUM can not only generate high-quality profitable itemsets according to the user-specified minimum utility threshold, but also enable the capability of privacy preserving for private or secure information (e.g., HUIs) in real-world applications.
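
    As a concrete illustration of the utility measure that HUIM relies on, the sketch below enumerates itemsets by brute force over a tiny, invented transaction database (item profits, quantities, and the threshold are all hypothetical); the evolutionary search and the hiding step described in the record above are not shown.

        from itertools import combinations

        profit = {"bread": 1, "milk": 2, "wine": 9}   # external utility (profit per unit)
        transactions = [                              # item -> purchased quantity
            {"bread": 4, "milk": 2},
            {"bread": 1, "wine": 2},
            {"milk": 3, "wine": 1},
        ]

        def utility(itemset, tx):
            """Utility of an itemset in one transaction: quantity x profit, if all items occur."""
            if not all(item in tx for item in itemset):
                return 0
            return sum(tx[item] * profit[item] for item in itemset)

        def high_utility_itemsets(min_util):
            """Return every itemset whose total utility meets the minimum utility threshold."""
            items = sorted(profit)
            huis = {}
            for size in range(1, len(items) + 1):
                for combo in combinations(items, size):
                    total = sum(utility(combo, tx) for tx in transactions)
                    if total >= min_util:
                        huis[combo] = total
            return huis

        print(high_utility_itemsets(min_util=15))  # itemsets containing wine dominate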

  3. Privacy-Preserving Location-Based Services

    Science.gov (United States)

    Chow, Chi Yin

    2010-01-01

    Location-based services (LBS for short) providers require users' current locations to answer their location-based queries, e.g., range and nearest-neighbor queries. Revealing personal location information to potentially untrusted service providers could create privacy risks for users. To this end, our objective is to design a privacy-preserving…

  4. Story Lab: Student Data Privacy

    Science.gov (United States)

    Herold, Benjamin

    2015-01-01

    Student data privacy is an increasingly high-profile--and controversial--issue that touches schools and families across the country. There are stories to tell in virtually every community. About three dozen states have passed legislation addressing student data privacy in the past two years, and eight different proposals were floating around…

  5. Vehicular ad hoc network security and privacy

    CERN Document Server

    Lin, X

    2015-01-01

    Unlike any other book in this area, this book provides innovative solutions to security issues, making this book a must read for anyone working with or studying security measures. Vehicular Ad Hoc Network Security and Privacy mainly focuses on security and privacy issues related to vehicular communication systems. It begins with a comprehensive introduction to vehicular ad hoc network and its unique security threats and privacy concerns and then illustrates how to address those challenges in highly dynamic and large size wireless network environments from multiple perspectives. This book is richly illustrated with detailed designs and results for approaching security and privacy threats.

  6. Ensuring privacy in the study of pathogen genetics.

    Science.gov (United States)

    Mehta, Sanjay R; Vinterbo, Staal A; Little, Susan J

    2014-08-01

    Rapid growth in the genetic sequencing of pathogens in recent years has led to the creation of large sequence databases. This aggregated sequence data can be very useful for tracking and predicting epidemics of infectious diseases. However, the balance between the potential public health benefit and the risk to personal privacy for individuals whose genetic data (personal or pathogen) are included in such work has been difficult to delineate, because neither the true benefit nor the actual risk to participants has been adequately defined. Existing approaches to minimise the risk of privacy loss to participants are based on de-identification of data by removal of a predefined set of identifiers. These approaches neither guarantee privacy nor protect the usefulness of the data. We propose a new approach to privacy protection that will quantify the risk to participants, while still maximising the usefulness of the data to researchers. This emerging standard in privacy protection and disclosure control, which is known as differential privacy, uses a process-driven rather than data-centred approach to protecting privacy. Copyright © 2014 Elsevier Ltd. All rights reserved.
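
    Differential privacy is usually realized by adding calibrated noise to query answers. The sketch below shows the standard Laplace mechanism for a counting query over a made-up sequence collection; the epsilon value, field names, and predicate are illustrative and are not taken from the record above.

        import numpy as np

        def private_count(records, predicate, epsilon):
            """Answer a counting query with epsilon-differential privacy.
            A count has sensitivity 1, so Laplace noise of scale 1/epsilon suffices."""
            true_count = sum(1 for r in records if predicate(r))
            return true_count + np.random.laplace(loc=0.0, scale=1.0 / epsilon)

        # Hypothetical pathogen-sequence metadata: count subtype-B samples privately.
        sequences = [{"subtype": "B"}, {"subtype": "B"}, {"subtype": "C"}, {"subtype": "B"}]
        print(private_count(sequences, lambda r: r["subtype"] == "B", epsilon=0.5))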

  7. Personalized privacy-preserving frequent itemset mining using randomized response.

    Science.gov (United States)

    Sun, Chongjing; Fu, Yan; Zhou, Junlin; Gao, Hui

    2014-01-01

    Frequent itemset mining is the important first step of association rule mining, which discovers interesting patterns from the massive data. There are increasing concerns about the privacy problem in the frequent itemset mining. Some works have been proposed to handle this kind of problem. In this paper, we introduce a personalized privacy problem, in which different attributes may need different privacy levels protection. To solve this problem, we give a personalized privacy-preserving method by using the randomized response technique. By providing different privacy levels for different attributes, this method can get a higher accuracy on frequent itemset mining than the traditional method providing the same privacy level. Finally, our experimental results show that our method can have better results on the frequent itemset mining while preserving personalized privacy.
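
    The randomized response idea can be sketched in a few lines: each respondent reports the true value of a sensitive attribute only with some probability, and the analyst corrects for the induced bias. The per-attribute probabilities, attribute names, and data below are invented for illustration; the paper's exact estimator for itemsets may differ.

        import random

        def randomize(value, p_truth):
            """Report the true boolean value with probability p_truth, otherwise flip it."""
            return value if random.random() < p_truth else not value

        def estimate_true_rate(reports, p_truth):
            """Unbiased estimate of the true proportion from randomized reports:
            observed = p*true + (1-p)*(1-true)  =>  true = (observed - (1-p)) / (2p - 1)."""
            observed = sum(reports) / len(reports)
            return (observed - (1 - p_truth)) / (2 * p_truth - 1)

        # Different attributes get different privacy levels (lower p_truth = stronger privacy).
        p_per_attribute = {"buys_alcohol": 0.6, "buys_bread": 0.9}
        data = [{"buys_alcohol": random.random() < 0.3, "buys_bread": random.random() < 0.7}
                for _ in range(10000)]

        for attr, p in p_per_attribute.items():
            reports = [randomize(row[attr], p) for row in data]
            print(attr, round(estimate_true_rate(reports, p), 3))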

  8. Privacy as virtue: searching for a new privacy paradigm in the age of Big Data

    NARCIS (Netherlands)

    van der Sloot, B.; Beyvers, E.; Helm, P.; Hennig, M.; Keckeis, C.; Kreknin, I.; Püschel, F.

    2017-01-01

    Originally, privacy was conceived primarily as a duty of the state not to abuse its powers. It could not, for example, enter a private house without legitimate reason or reasonable suspicion that the owner of the house had engaged in, for example, criminal conduct. Gradually, however, privacy has been

  9. Privacy-Preserving Location-Based Service Scheme for Mobile Sensing Data †

    Science.gov (United States)

    Xie, Qingqing; Wang, Liangmin

    2016-01-01

    With the wide use of mobile sensing applications, more and more location-embedded data are collected and stored in mobile clouds, such as iCloud, Samsung cloud, etc. Using these data, the cloud service provider (CSP) can provide location-based service (LBS) for users. However, the mobile cloud is untrustworthy. The privacy concerns force the sensitive locations to be stored on the mobile cloud in an encrypted form. However, this brings a great challenge to utilizing these data to provide efficient LBS. To solve this problem, we propose a privacy-preserving LBS scheme for mobile sensing data, based on the RSA (Rivest, Shamir and Adleman) algorithm and the ciphertext-policy attribute-based encryption (CP-ABE) scheme. The mobile cloud can perform location distance computing and comparison efficiently for authorized users, without location privacy leakage. In the end, theoretical security analysis and experimental evaluation demonstrate that our scheme is secure against the chosen plaintext attack (CPA) and efficient enough for practical applications in terms of user-side computation overhead. PMID:27897984

  10. Security measures required for HIPAA privacy.

    Science.gov (United States)

    Amatayakul, M

    2000-01-01

    HIPAA security requirements include administrative, physical, and technical services and mechanisms to safeguard confidentiality, availability, and integrity of health information. Security measures, however, must be implemented in the context of an organization's privacy policies. Because HIPAA's proposed privacy rules are flexible and scalable to account for the nature of each organization's business, size, and resources, each organization will be determining its own privacy policies within the context of the HIPAA requirements and its security capabilities. Security measures cannot be implemented in a vacuum.

  11. Protecting privacy in a clinical data warehouse.

    Science.gov (United States)

    Kong, Guilan; Xiao, Zhichun

    2015-06-01

    Peking University has several prestigious teaching hospitals in China. To make secondary use of massive medical data for research purposes, construction of a clinical data warehouse is imperative in Peking University. However, a big concern for clinical data warehouse construction is how to protect patient privacy. In this project, we propose to use a combination of symmetric block ciphers, asymmetric ciphers, and cryptographic hashing algorithms to protect patient privacy information. The novelty of our privacy protection approach lies in message-level data encryption, the key caching system, and the cryptographic key management system. The proposed privacy protection approach is scalable to clinical data warehouse construction with any size of medical data. With the composite privacy protection approach, the clinical data warehouse can be secure enough to keep the confidential data from leaking to the outside world. © The Author(s) 2014.
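
    The combination of symmetric ciphers, asymmetric ciphers, and hashing mentioned above can be illustrated with a minimal message-level envelope-encryption sketch using the Python cryptography package. Key caching, key management, and the warehouse schema are omitted, and all identifiers and the sample record are hypothetical; this is not the project's actual implementation.

        import os, hashlib
        from cryptography.hazmat.primitives.ciphers.aead import AESGCM
        from cryptography.hazmat.primitives.asymmetric import rsa, padding
        from cryptography.hazmat.primitives import hashes

        warehouse_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
        OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                            algorithm=hashes.SHA256(), label=None)

        def encrypt_record(record: bytes) -> dict:
            data_key = AESGCM.generate_key(bit_length=256)      # fresh symmetric key per message
            nonce = os.urandom(12)
            ciphertext = AESGCM(data_key).encrypt(nonce, record, None)
            wrapped_key = warehouse_key.public_key().encrypt(data_key, OAEP)  # asymmetric wrap
            digest = hashlib.sha256(record).hexdigest()          # hash for lookup / integrity
            return {"nonce": nonce, "ciphertext": ciphertext,
                    "wrapped_key": wrapped_key, "sha256": digest}

        def decrypt_record(blob: dict) -> bytes:
            data_key = warehouse_key.decrypt(blob["wrapped_key"], OAEP)
            return AESGCM(data_key).decrypt(blob["nonce"], blob["ciphertext"], None)

        blob = encrypt_record(b"patient_id=12345; diagnosis=J45.0")
        assert decrypt_record(blob) == b"patient_id=12345; diagnosis=J45.0"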

  12. Kids Sell: Celebrity Kids’ Right to Privacy

    Directory of Open Access Journals (Sweden)

    Seong Choul Hong

    2016-04-01

    Full Text Available The lives of celebrities are often spotlighted in the media because of their newsworthiness; however, many celebrities argue that their right to privacy is often infringed upon. Concerns about celebrity privacy are not limited to the celebrities themselves and often extend to their children. As a result of their popularity, public interest has pushed paparazzi and journalists to pursue trivial and private details about the lives of both celebrities and their children. This paper investigates conflicting areas where the right to privacy and the right to know collide when dealing with the children of celebrities. In general, the courts have been unsympathetic to celebrity privacy claims, noting their newsworthiness and self-promoted character. Unless the press violates news-gathering ethics or torts, the courts will often rule in favor of the media. However, the story becomes quite different when related to an infringement on the privacy of celebrities’ children. This paper argues that all children have a right to protect their privacy regardless of their parents’ social status. Children of celebrities should not be exempt from principles of privacy just because their parents are celebrities. Furthermore, they should not be exposed by the media without the voluntary consent of their legal guardians. That is, the right of the media to publish and the newsworthiness of children of celebrities must be acknowledged only in a restricted manner.

  13. An overview of human genetic privacy.

    Science.gov (United States)

    Shi, Xinghua; Wu, Xintao

    2017-01-01

    The study of human genomics is becoming a Big Data science, owing to recent biotechnological advances leading to availability of millions of personal genome sequences, which can be combined with biometric measurements from mobile apps and fitness trackers, and of human behavior data monitored from mobile devices and social media. With increasing research opportunities for integrative genomic studies through data sharing, genetic privacy emerges as a legitimate yet challenging concern that needs to be carefully addressed, not only for individuals but also for their families. In this paper, we present potential genetic privacy risks and relevant ethics and regulations for sharing and protecting human genomics data. We also describe the techniques for protecting human genetic privacy from three broad perspectives: controlled access, differential privacy, and cryptographic solutions. © 2016 New York Academy of Sciences.

  14. An overview of human genetic privacy

    Science.gov (United States)

    Shi, Xinghua; Wu, Xintao

    2016-01-01

    The study of human genomics is becoming a Big Data science, owing to recent biotechnological advances leading to availability of millions of personal genome sequences, which can be combined with biometric measurements from mobile apps and fitness trackers, and of human behavior data monitored from mobile devices and social media. With increasing research opportunities for integrative genomic studies through data sharing, genetic privacy emerges as a legitimate yet challenging concern that needs to be carefully addressed, not only for individuals but also for their families. In this paper, we present potential genetic privacy risks and relevant ethics and regulations for sharing and protecting human genomics data. We also describe the techniques for protecting human genetic privacy from three broad perspectives: controlled access, differential privacy, and cryptographic solutions. PMID:27626905

  15. Privacy-preserving digital rights management

    NARCIS (Netherlands)

    Conrado, C.; Petkovic, M.; Jonker, W.; Jonker, W.; Petkovic, M.

    2004-01-01

    DRM systems provide a means for protecting digital content, but at the same time they violate the privacy of users in a number of ways. This paper addresses privacy issues in DRM systems. The main challenge is how to allow a user to interact with the system in an anonymous/pseudonymous way, while

  16. Privacy-preserving Identity Management

    OpenAIRE

    Milutinovic, Milica

    2015-01-01

    With the technological advances and the evolution of online services, user privacy is becoming a crucial issue in the modern day society. Privacy in the general sense refers to individuals’ ability to protect information about themselves and selectively present it to other entities. This concept is nowadays strongly affected by everyday practices that assume personal data disclosure, such as online shopping and participation in loyalty schemes. This makes it difficult for an individual to con...

  17. The Privacy Problem: Although School Librarians Seldom Discuss It, Students' Privacy Rights Are under Attack

    Science.gov (United States)

    Adams, Helen R.

    2011-01-01

    Every day in school libraries nationwide, students' privacy rights are under attack, but many principals, teachers, parents, and community members do not know much about these rights. Even though school librarians are among the strongest proponents of privacy, the subject is rarely discussed, probably because state and federal laws can be…

  18. A framework to preserve the privacy of electronic health data streams.

    Science.gov (United States)

    Kim, Soohyung; Sung, Min Kyoung; Chung, Yon Dohn

    2014-08-01

    The anonymization of health data streams is important to protect these data against potential privacy breaches. A large number of research studies aiming at offering privacy in the context of data streams has been recently conducted. However, the techniques that have been proposed in these studies generate a significant delay during the anonymization process, since they concentrate on applying existing privacy models (e.g., k-anonymity and l-diversity) to batches of data extracted from data streams in a period of time. In this paper, we present delay-free anonymization, a framework for preserving the privacy of electronic health data streams. Unlike existing works, our method does not generate an accumulation delay, since input streams are anonymized immediately with counterfeit values. We further devise late validation for increasing the data utility of the anonymization results and managing the counterfeit values. Through experiments, we show the efficiency and effectiveness of the proposed method for the real-time release of data streams. Copyright © 2014 Elsevier Inc. All rights reserved.
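
    To make the idea of immediate, counterfeit-based release more concrete, the sketch below generalizes a quasi-identifier on arrival and pads under-populated groups with clearly marked counterfeit rows that a later validation step would reconcile. The age bins, the value of k, and the record fields are invented, and the real framework's late-validation logic is only hinted at; this is not the paper's algorithm.

        import random

        AGE_BINS = [(0, 20), (20, 40), (40, 60), (60, 120)]   # toy generalization hierarchy

        def generalize_age(age):
            return next(f"{lo}-{hi}" for lo, hi in AGE_BINS if lo <= age < hi)

        class DelayFreeAnonymizer:
            """Release each record immediately; if its generalized group has fewer than k
            true members so far, emit counterfeit rows now and reconcile them later."""
            def __init__(self, k=3):
                self.k = k
                self.true_counts = {}   # generalized QI -> true records seen
                self.fake_counts = {}   # generalized QI -> counterfeits already released
                self.pending = []       # counterfeits awaiting late validation

            def release(self, record):
                qi = generalize_age(record["age"])
                self.true_counts[qi] = self.true_counts.get(qi, 0) + 1
                out = [{"age": qi, "diagnosis": record["diagnosis"]}]
                missing = self.k - self.true_counts[qi] - self.fake_counts.get(qi, 0)
                for _ in range(max(0, missing)):
                    fake = {"age": qi, "diagnosis": random.choice(["A", "B", "C"]),
                            "counterfeit": True}
                    self.fake_counts[qi] = self.fake_counts.get(qi, 0) + 1
                    self.pending.append(fake)
                    out.append(fake)
                return out

        anon = DelayFreeAnonymizer(k=3)
        for rec in [{"age": 34, "diagnosis": "flu"}, {"age": 37, "diagnosis": "asthma"}]:
            print(anon.release(rec))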

  19. Towards context adaptive privacy decisions in ubiquitous computing

    NARCIS (Netherlands)

    Schaub, Florian; Könings, Bastian; Weber, M.; Kargl, Frank

    2012-01-01

    In ubiquitous systems control of privacy settings will be increasingly difficult due to the pervasive nature of sensing and communication capabilities. We identify challenges for privacy decisions in ubiquitous systems and propose a system for in situ privacy decision support. When context changes

  20. Privacy Challenges of Genomic Big Data.

    Science.gov (United States)

    Shen, Hong; Ma, Jian

    2017-01-01

    With the rapid advancement of high-throughput DNA sequencing technologies, genomics has become a big data discipline where large-scale genetic information of human individuals can be obtained efficiently with low cost. However, such massive amount of personal genomic data creates tremendous challenge for privacy, especially given the emergence of direct-to-consumer (DTC) industry that provides genetic testing services. Here we review the recent development in genomic big data and its implications on privacy. We also discuss the current dilemmas and future challenges of genomic privacy.

  1. Privacy in Online Social Networking Sites

    OpenAIRE

    M.Ida Evones

    2015-01-01

    There are more than 192 active social networking websites. Bringing every kind of social group together in one place and letting them interact is really a big thing indeed. Huge amounts of information are processed on these sites each day, making them vulnerable to attack. There is no systematic framework taking into account the importance of privacy. Increased privacy settings don’t always guarantee privacy when there is a loophole in the applications. Lack of user education results in over-sharing...

  2. A Secure and Privacy-Preserving Targeted Ad-System

    Science.gov (United States)

    Androulaki, Elli; Bellovin, Steven M.

    Thanks to its low product-promotion cost and its efficiency, targeted online advertising has become very popular. Unfortunately, being profile-based, online advertising methods violate consumers' privacy, which has engendered resistance to the ads. However, protecting privacy through anonymity seems to encourage click-fraud. In this paper, we define consumer's privacy and present a privacy-preserving, targeted ad system (PPOAd) which is resistant towards click fraud. Our scheme is structured to provide financial incentives to all entities involved.

  3. Security and privacy preserving approaches in the eHealth clouds with disaster recovery plan.

    Science.gov (United States)

    Sahi, Aqeel; Lai, David; Li, Yan

    2016-11-01

    Cloud computing was introduced as an alternative storage and computing model in the health sector as well as other sectors to handle large amounts of data. Many healthcare companies have moved their electronic data to the cloud in order to reduce in-house storage, IT development and maintenance costs. However, storing the healthcare records in a third-party server may cause serious storage, security and privacy issues. Therefore, many approaches have been proposed to preserve security as well as privacy in cloud computing projects. Cryptographic-based approaches were presented as one of the best ways to ensure the security and privacy of healthcare data in the cloud. Nevertheless, the cryptographic-based approaches which are used to transfer health records safely remain vulnerable regarding security, privacy, or the lack of any disaster recovery strategy. In this paper, we review the related work on security and privacy preserving as well as disaster recovery in the eHealth cloud domain. Then we propose two approaches, the Security-Preserving approach and the Privacy-Preserving approach, and a disaster recovery plan. The Security-Preserving approach is a robust means of ensuring the security and integrity of Electronic Health Records, and the Privacy-Preserving approach is an efficient authentication approach which protects the privacy of Personal Health Records. Finally, we discuss how the integrated approaches and the disaster recovery plan can ensure the reliability and security of cloud projects. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Computer-Aided Identification and Validation of Privacy Requirements

    Directory of Open Access Journals (Sweden)

    Rene Meis

    2016-05-01

    Full Text Available Privacy is a software quality that is closely related to security. The main difference is that security properties aim at the protection of assets that are crucial for the considered system, and privacy aims at the protection of personal data that are processed by the system. The identification of privacy protection needs in complex systems is a hard and error-prone task. Stakeholders whose personal data are processed might be overlooked, or the sensitivity and the need of protection of the personal data might be underestimated. The later personal data and the needs to protect them are identified during the development process, the more expensive it is to fix these issues, because the needed changes of the system-to-be often affect many functionalities. In this paper, we present a systematic method to identify the privacy needs of a software system based on a set of functional requirements by extending the problem-based privacy analysis (ProPAn) method. Our method is tool-supported and automated where possible to reduce the effort that has to be spent for the privacy analysis, which is especially important when considering complex systems. The contribution of this paper is a semi-automatic method to identify the relevant privacy requirements for a software-to-be based on its functional requirements. The considered privacy requirements address all dimensions of privacy that are relevant for software development. As our method is solely based on the functional requirements of the system to be, we enable users of our method to identify the privacy protection needs that have to be addressed by the software-to-be at an early stage of the development. As an initial evaluation of our method, we show its applicability on a small electronic health system scenario.

  5. Secure open cloud in data transmission using reference pattern and identity with enhanced remote privacy checking

    Science.gov (United States)

    Vijay Singh, Ran; Agilandeeswari, L.

    2017-11-01

    Handling the large amount of clients’ data in an open cloud raises many security issues that need to be addressed. A client’s private data should not be known to other group members without the data owner’s valid permission. Sometimes clients are also prevented from accessing open cloud servers due to certain restrictions. To overcome the security issues and restrictions related to storage, data sharing in an inter-domain network, and privacy checking, we propose a model based on identity-based cryptography for data transmission, an intermediate entity that holds the client’s reference and identity and controls data transmission in the open cloud environment, and an extended remote privacy-checking technique that operates on the admin side. Acting on the data owner’s authority, the proposed model offers secure cryptography for data transmission and remote privacy checking in private, public, or instructed mode. The hardness of the Computational Diffie-Hellman problem underlying the key exchange makes the proposed model more secure than existing models used in public cloud environments.
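
    The Computational Diffie-Hellman assumption mentioned above underlies ordinary Diffie-Hellman key agreement, which can be sketched with the Python cryptography package as follows. This only shows a generic key exchange between a data owner and a cloud server; the identity-based construction and the intermediate entity from the record above are not modelled, and the parameter sizes are illustrative.

        from cryptography.hazmat.primitives.asymmetric import dh
        from cryptography.hazmat.primitives.kdf.hkdf import HKDF
        from cryptography.hazmat.primitives import hashes

        parameters = dh.generate_parameters(generator=2, key_size=2048)  # shared public parameters

        owner_private = parameters.generate_private_key()    # data owner side
        cloud_private = parameters.generate_private_key()    # cloud server side

        # Each party combines its own private key with the other's public key; recovering
        # the shared secret from the public values alone would require solving CDH.
        owner_secret = owner_private.exchange(cloud_private.public_key())
        cloud_secret = cloud_private.exchange(owner_private.public_key())
        assert owner_secret == cloud_secret

        session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                           info=b"open-cloud-session").derive(owner_secret)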

  6. 43 CFR 2.47 - Records subject to Privacy Act.

    Science.gov (United States)

    2010-10-01

    ... 43 Public Lands: Interior 1 2010-10-01 2010-10-01 false Records subject to Privacy Act. 2.47 Section 2.47 Public Lands: Interior Office of the Secretary of the Interior RECORDS AND TESTIMONY; FREEDOM OF INFORMATION ACT Privacy Act § 2.47 Records subject to Privacy Act. The Privacy Act applies to all...

  7. Regulation of Unmanned Aerial Systems and Related Privacy Issues in Lithuania

    Directory of Open Access Journals (Sweden)

    Pūraitė Aurelija

    2017-12-01

    Full Text Available In the past few years the use of unmanned aerial vehicles in Lithuania has significantly increased. However, enjoying the advantages of this technology, which improves society’s socio-economical safety (public safety in a broad sense), raises some privacy concerns. This article analyses European Union and national legal regulations regarding the use of unmanned aerial vehicles as well as legal tools for defence of the right to privacy or prevention from its breaches in the Republic of Lithuania. Unmanned aerial vehicles have become popular only recently; thus, legislation regarding their use has not yet become a common topic among lawyers. Furthermore, case law of the Republic of Lithuania is silent about it. Thus, the authors model a situation of breach of privacy using an unmanned aerial vehicle and analyse possible defence mechanisms.

  8. Between Tradition and Modernity: Determining Spatial Systems of Privacy in the Domestic Architecture of Contemporary Iraq

    Directory of Open Access Journals (Sweden)

    Ali Al-Thahab

    2014-11-01

    Full Text Available The notion of privacy represents a central criterion for both indoor and outdoor social spaces in most traditional Arab settlements. This paper investigates privacy and everyday life as determinants of the physical properties of the built and urban fabric and will study their impact on traditional settlements and architecture of the home in the contemporary Iraqi city. It illustrates the relationship between socio-cultural aspects of public/private realms using the notion of the social sphere as an investigative tool of the concept of social space in Iraqi houses and local communities (Mahalla). This paper reports that, in spite of the impact of other factors in articulating built forms, privacy plays the primary role under the influence of Islamic rules, principles and culture. The crucial problem is the underestimation of traditional inherited values through opening social spaces to the outside, giving unlimited access to the indoor social environment and creating many problems with regard to privacy and communal social integration.

  9. Genetic privacy and non-discrimination.

    Science.gov (United States)

    Romeo Casabona, Carlos María

    2011-01-01

    The UN Inter-Agency Committee on Bioethics met for its tenth meeting at the UNESCO headquarters in Paris on 4-5th March 2011. Member organisations such as the WHO and UNESCO were in attendance alongside associate members such as the Council for Europe, the European Commission, the Organisation for Economic Co-operation and Development and the World Trade Organisation. Discussion centred on the theme "genetic privacy and nondiscrimination". The United Nations Economic and Social Council (ECOSOC) had previously considered, from a legal and ethical perspective, the implications of increasingly sophisticated technologies for genetic privacy and non-discrimination in fields such as medicine, employment and insurance. Thus, the ECOSOC requested that UNESCO report on relevant developments in the field of genetic privacy and non-discrimination. In parallel with a consultation process with member states, UNESCO launched a consultation with the UN Interagency Committee on Bioethics. This article analyses the report presented by the author concerning the analysis of the current contentions in the field and illustrates attempts at responding on a normative level to a perceived threat to genetic privacy and non-discrimination.

  10. 77 FR 32111 - Privacy Act System of Records

    Science.gov (United States)

    2012-05-31

    ... contacted in order to obtain that office's advice regarding obligations under the Privacy Act; 8. Breach... FEDERAL COMMUNICATIONS COMMISSION Privacy Act System of Records AGENCY: Federal Communications Commission. ACTION: Notice; one new Privacy Act system of records. SUMMARY: Pursuant to subsection (e)(4) of...

  11. CARAVAN: Providing Location Privacy for VANET

    National Research Council Canada - National Science Library

    Sampigethaya, Krishna; Huang, Leping; Li, Mingyan; Poovendran, Radha; Matsuura, Kanta; Sezaki, Kaoru

    2005-01-01

    .... This type of tracking leads to threats on the location privacy of the vehicle's user. In this paper, we study the problem of providing location privacy in VANET by allowing vehicles to prevent tracking of their broadcast communications...

  12. Privacy and confidentiality: perspectives of mental health consumers and carers in pharmacy settings.

    Science.gov (United States)

    Hattingh, Hendrika Laetitia; Knox, Kathy; Fejzic, Jasmina; McConnell, Denise; Fowler, Jane L; Mey, Amary; Kelly, Fiona; Wheeler, Amanda J

    2015-02-01

    The study aims to explore within the community pharmacy practice context the views of mental health stakeholders on: (1) current and past experiences of privacy, confidentiality and support; and (2) expectations and needs in relation to privacy and confidentiality. In-depth interviews and focus groups were conducted in three states in Australia, namely Queensland, the northern region of New South Wales and Western Australia, between December 2011 and March 2012. There were 98 participants consisting of consumers and carers (n = 74), health professionals (n = 13) and representatives from consumer organisations (n = 11). Participants highlighted a need for improved staff awareness. Consumers indicated a desire to receive information in a way that respects their privacy and confidentiality, in an appropriate space. Areas identified that require improved protection of privacy and confidentiality during pharmacy interactions were the number of staff having access to sensitive information, workflow models causing information exposure and pharmacies' layout not facilitating private discussions. Challenges experienced by carers created feelings of isolation which could impact on care. This study explored mental health stakeholders' experiences and expectations regarding privacy and confidentiality in the Australian community pharmacy context. A need for better pharmacy staff training about the importance of privacy and confidentiality and strategies to enhance compliance with national pharmacy practice requirements was identified. Findings provided insight into privacy and confidentiality needs and will assist in the development of pharmacy staff training material to better support consumers with sensitive conditions. © 2014 Royal Pharmaceutical Society.

  13. The War Against Terror and Transatlantic Information Sharing: Spillovers of Privacy or Spillovers of Security?

    Directory of Open Access Journals (Sweden)

    Maria Tzanou

    2015-02-01

    Full Text Available The EU-US Passenger Name Record (PNR) agreement has been among the most controversial instruments in the fight against terrorism that the EU negotiated with the US after the 9/11 terrorist attacks. The agreement has been heavily criticised for its implications regarding fundamental rights, in particular the rights to privacy and data protection. Nevertheless, the EU has put forward plans to develop its own PNR programme. The present article aims to examine the new dynamics concerning privacy that arise from the transatlantic fight against terrorism. It argues that, while attempts for the development of a transatlantic privacy protection framework have been made, ‘spillovers’ of security, taking the form of internalisation of external counter-terrorism measures, are prevalent in the era of the war against terror.

  14. Privacy and Open Government

    Directory of Open Access Journals (Sweden)

    Teresa Scassa

    2014-06-01

    Full Text Available The public-oriented goals of the open government movement promise increased transparency and accountability of governments, enhanced citizen engagement and participation, improved service delivery, economic development and the stimulation of innovation. In part, these goals are to be achieved by making more and more government information public in reusable formats and under open licences. This paper identifies three broad privacy challenges raised by open government. The first is how to balance privacy with transparency and accountability in the context of “public” personal information. The second challenge flows from the disruption of traditional approaches to privacy based on a collapse of the distinctions between public and private sector actors. The third challenge is that of the potential for open government data—even if anonymized—to contribute to the big data environment in which citizens and their activities are increasingly monitored and profiled.

  15. 5G Visions of User Privacy

    DEFF Research Database (Denmark)

    Sørensen, Lene Tolstrup; Khajuria, Samant; Skouby, Knud Erik

    2015-01-01

    Currently, discussions are ongoing about the elements and definition of 5G networks. One of the elements in this discussion is how to provide for user-controlled privacy for securing users' digital interaction. The purpose of this paper is to present elements of user-controlled privacy needed for the future 5G networks. The paper concludes that an ecosystem consisting of a Trusted Third Party between the end user and the service providers as a distributed system could be integrated to secure the perspective of user-controlled privacy for future systems...

  16. Personal Information Request Form

    International Development Research Centre (IDRC) Digital Library (Canada)

    PC Forms Inc. 834-4048

    To apply for information under the Privacy Act, complete this form or a written request mentioning the Act. Describe the information being sought and provide any relevant details necessary to help the International Development Research Centre (IDRC) find it. If you require assistance, refer to Info Source (Sources of ...

  17. 32 CFR 701.118 - Privacy, IT, and PIAs.

    Science.gov (United States)

    2010-07-01

    ...) Development. Privacy must be considered when requirements are being analyzed and decisions are being made...-347) directs agencies to conduct reviews of how privacy issues are considered when purchasing or... a PIA to effectively address privacy factors. Guidance is provided at http://www.doncio.navy.mil. (f...

  18. 36 CFR 902.56 - Protection of personal privacy.

    Science.gov (United States)

    2010-07-01

    ... privacy. 902.56 Section 902.56 Parks, Forests, and Public Property PENNSYLVANIA AVENUE DEVELOPMENT... Protection of personal privacy. (a) Any of the following personnel, medical, or similar records is within the... invasion of his personal privacy: (1) Personnel and background records personal to any officer or employee...

  19. 76 FR 51869 - Privacy Act Implementation

    Science.gov (United States)

    2011-08-19

    ... permanent residence. Maintain includes collect, use, disseminate, or control. Privacy Act means the Privacy... announces the creation, deletion, or amendment of one or more system of records. System of records notices... reference and university libraries or electronically at the...

  20. Measuring privacy compliance using fitness metrics

    NARCIS (Netherlands)

    Banescu, S.; Petkovic, M.; Zannone, N.; Barros, A.; Gal, A.; Kindler, E.

    2012-01-01

    Nowadays, repurposing of personal data is a major privacy issue. Detection of data repurposing requires a posteriori mechanisms able to determine how data have been processed. However, current a posteriori solutions for privacy compliance are often manual, allowing infringements to remain undetected.

  1. Defining Privacy Is Supposed to Be Easy

    DEFF Research Database (Denmark)

    Mödersheim, Sebastian Alexander; Gross, Thomas; Viganò, Luca

    2013-01-01

    Formally specifying privacy goals is not trivial. The most widely used approach in formal methods is based on the static equivalence of frames in the applied pi-calculus, basically asking whether or not the intruder is able to distinguish two given worlds. A subtle question is how we can be sure that we have specified all pairs of worlds to properly reflect our intuitive privacy goal. To address this problem, we introduce in this paper a novel and declarative way to specify privacy goals, called α-β privacy, and relate it to static equivalence. This new approach is based on specifying two formulae α and β in first-order logic with Herbrand universes, where α reflects the intentionally released information and β includes the actual cryptographic (“technical”) messages the intruder can see. Then α-β privacy means that the intruder cannot derive any “non-technical” statement from β that he cannot already derive from α.
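
    Read informally, the condition described above can be stated as follows (the notation is a simplified paraphrase of the abstract, not the paper's full formal definition, which carries additional side conditions on models and alphabets):

        % alpha-beta privacy, informally: everything non-technical the intruder can derive
        % from beta must already follow from the intentionally released information alpha.
        \[
          (\alpha,\beta)\text{-privacy} \quad\Longleftrightarrow\quad
          \forall \gamma \in \mathcal{L}(\Sigma_0):\;
          \beta \models \gamma \;\Rightarrow\; \alpha \models \gamma
        \]
        % where alpha and beta are first-order formulae over Herbrand universes, and
        % L(Sigma_0) denotes the "non-technical" statements over the payload alphabet Sigma_0.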

  2. 77 FR 61275 - Privacy Act of 1974: Implementation

    Science.gov (United States)

    2012-10-09

    ... (FBI) Privacy Act system of records titled FBI Data Warehouse System, JUSTICE/FBI- 022. This system is...)(G), (H), and (I), (5), and (8); (f); and (g) of the Privacy Act: (1) FBI Data Warehouse System... security; disclose information that would constitute an unwarranted invasion of another's personal privacy...

  3. 22 CFR 212.22 - Protection of personal privacy.

    Science.gov (United States)

    2010-04-01

    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Protection of personal privacy. 212.22 Section... Information for Public Inspection and Copying § 212.22 Protection of personal privacy. To the extent required to prevent a clearly unwarranted invasion of personal privacy, USAID may delete identifying details...

  4. Privacy Preservation in Distributed Subgradient Optimization Algorithms

    OpenAIRE

    Lou, Youcheng; Yu, Lean; Wang, Shouyang

    2015-01-01

    Privacy preservation is becoming an increasingly important issue in data mining and machine learning. In this paper, we consider the privacy preserving features of distributed subgradient optimization algorithms. We first show that a well-known distributed subgradient synchronous optimization algorithm, in which all agents make their optimization updates simultaneously at all times, is not privacy preserving in the sense that the malicious agent can learn other agents' subgradients asymptotic...

  5. Certificate Transparency with Privacy

    Directory of Open Access Journals (Sweden)

    Eskandarian Saba

    2017-10-01

    Full Text Available Certificate transparency (CT) is an elegant mechanism designed to detect when a certificate authority (CA) has issued a certificate incorrectly. Many CAs now support CT and it is being actively deployed in browsers. However, a number of privacy-related challenges remain. In this paper we propose practical solutions to two issues. First, we develop a mechanism that enables web browsers to audit a CT log without violating user privacy. Second, we extend CT to support non-public subdomains.

  6. 37 CFR 251.23 - FOIA and Privacy Act.

    Science.gov (United States)

    2010-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2010-07-01 2010-07-01 false FOIA and Privacy Act. 251.23 Section 251.23 Patents, Trademarks, and Copyrights COPYRIGHT OFFICE, LIBRARY OF CONGRESS COPYRIGHT... Access to and Inspection of Records § 251.23 FOIA and Privacy Act. Freedom of Information Act and Privacy...

  7. 32 CFR 806b.4 - Privacy Act complaints.

    Science.gov (United States)

    2010-07-01

    ... be identified, the local Privacy Act officer will assume these duties. Issues that cannot be resolved... 32 National Defense 6 2010-07-01 2010-07-01 false Privacy Act complaints. 806b.4 Section 806b.4 National Defense Department of Defense (Continued) DEPARTMENT OF THE AIR FORCE ADMINISTRATION PRIVACY ACT...

  8. Star-forming Filament Models

    International Nuclear Information System (INIS)

    Myers, Philip C.

    2017-01-01

    New models of star-forming filamentary clouds are presented in order to quantify their properties and to predict their evolution. These 2D axisymmetric models describe filaments that have no core, one low-mass core, and one cluster-forming core. They are based on Plummer-like cylinders and spheroids that are bounded by a constant-density surface of finite extent. In contrast to 1D Plummer-like models, they have specific values of length and mass, they approximate observed column density maps, and their distributions of column density (N-pdfs) are pole-free. Each model can estimate the star-forming potential of a core-filament system by identifying the zone of gas dense enough to form low-mass stars and by counting the number of enclosed thermal Jeans masses. This analysis suggests that the Musca central filament may be near the start of its star-forming life, with enough dense gas to make its first ∼3 protostars, while the Coronet filament is near the midpoint of its star formation, with enough dense gas to add ∼8 protostars to its ∼20 known stars. In contrast, L43 appears to be near the end of its star-forming life, since it lacks enough dense gas to add any new protostars to the two young stellar objects already known.
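
    For reference, the standard textbook forms of the quantities the abstract relies on are reproduced below; the exact parameterization, truncation, and Jeans-mass convention used in the paper may differ.

        % Plummer-like density profile (rho_0: central density, r_0: scale radius, p: power index),
        % bounded in these models by a constant-density surface of finite extent.
        \[
          \rho(r) \;=\; \frac{\rho_0}{\left[\,1 + (r/r_0)^2\,\right]^{p/2}}
        \]
        % Thermal Jeans mass of gas at temperature T and mass density rho; the number of
        % protostars a core can form is estimated as (dense-gas mass) / M_J.
        \[
          M_J \;\simeq\; \left(\frac{5\,k_B T}{G\,\mu\, m_{\mathrm{H}}}\right)^{3/2}
          \left(\frac{3}{4\pi\rho}\right)^{1/2}
        \]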

  9. Star-forming Filament Models

    Energy Technology Data Exchange (ETDEWEB)

    Myers, Philip C., E-mail: pmyers@cfa.harvard.edu [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States)

    2017-03-20

    New models of star-forming filamentary clouds are presented in order to quantify their properties and to predict their evolution. These 2D axisymmetric models describe filaments that have no core, one low-mass core, and one cluster-forming core. They are based on Plummer-like cylinders and spheroids that are bounded by a constant-density surface of finite extent. In contrast to 1D Plummer-like models, they have specific values of length and mass, they approximate observed column density maps, and their distributions of column density (N-pdfs) are pole-free. Each model can estimate the star-forming potential of a core-filament system by identifying the zone of gas dense enough to form low-mass stars and by counting the number of enclosed thermal Jeans masses. This analysis suggests that the Musca central filament may be near the start of its star-forming life, with enough dense gas to make its first ∼3 protostars, while the Coronet filament is near the midpoint of its star formation, with enough dense gas to add ∼8 protostars to its ∼20 known stars. In contrast, L43 appears to be near the end of its star-forming life, since it lacks enough dense gas to add any new protostars to the two young stellar objects already known.

  10. Investigating privacy attitudes and behavior in relation to personalization

    NARCIS (Netherlands)

    Garde - Perik, van de E.M.; Markopoulos, P.; Ruyter, de B.E.R.; Eggen, J.H.; IJsselsteijn, W.A.

    2008-01-01

    This article presents an experimental study of privacy-related attitudes and behaviors regarding a music recommender service based on two types of user modeling: personality traits and musical preferences. Contrary to prior expectations and attitudes reported by participants, personality traits are

  11. Privacy in the Sharing Economy

    DEFF Research Database (Denmark)

    Ranzini, Giulia; Etter, Michael; Lutz, Christoph

    ’s digital services through providing recommendations to Europe’s institutions. The initial stage of this research project involves a set of three literature reviews of the state of research on three core topics in relation to the sharing economy: participation (1), privacy (2), and power (3). This piece is a literature review on the topic of privacy. It addresses key privacy challenges for different stakeholders in the sharing economy. Throughout, we use the term "consumers" to refer to users on the receiving end (e.g., Airbnb guests, Uber passengers), "providers" to refer to users on the providing end (e.g., Airbnb hosts, Uber drivers) and "platforms" to refer to the mediating sites, apps and infrastructures matching consumers and providers (e.g., Airbnb, Uber)....

  12. AnonySense: Opportunistic and Privacy-Preserving Context Collection

    DEFF Research Database (Denmark)

    Triandopoulos, Nikolaos; Kapadia, Apu; Cornelius, Cory

    2008-01-01

    We propose AnonySense, a general-purpose architecture for leveraging users' mobile devices for measuring context, while maintaining the privacy of the users. AnonySense features multiple layers of privacy protection: a framework for nodes to receive tasks anonymously, a novel blurring mechanism based on tessellation and clustering to protect users' privacy against the system while reporting context, and k-anonymous report aggregation to improve the users' privacy against applications receiving the context. We outline the architecture and security properties of AnonySense, and focus on evaluating our...

  13. Privacy Data Decomposition and Discretization Method for SaaS Services

    Directory of Open Access Journals (Sweden)

    Changbo Ke

    2017-01-01

    Full Text Available In cloud computing, user functional requirements are satisfied through service composition. However, due to the process of interaction and sharing among SaaS services, user privacy data tend to be illegally disclosed to the service participants. In this paper, we propose a privacy data decomposition and discretization method for SaaS services. First, according to the logic between the data, we classify the privacy data into discrete privacy data and continuous privacy data. Next, in order to protect the user privacy information, continuous data chains are decomposed into discrete data chains, and discrete data chains are prevented from being synthesized into continuous data chains. Finally, we propose a protection framework for privacy data and demonstrate its correctness and feasibility with experiments.

  14. Privacy and human behavior in the age of information.

    Science.gov (United States)

    Acquisti, Alessandro; Brandimarte, Laura; Loewenstein, George

    2015-01-30

    This Review summarizes and draws connections between diverse streams of empirical research on privacy behavior. We use three themes to connect insights from social and behavioral sciences: people's uncertainty about the consequences of privacy-related behaviors and their own preferences over those consequences; the context-dependence of people's concern, or lack thereof, about privacy; and the degree to which privacy concerns are malleable—manipulable by commercial and governmental interests. Organizing our discussion by these themes, we offer observations concerning the role of public policy in the protection of privacy in the information age. Copyright © 2015, American Association for the Advancement of Science.

  15. Privacy preserving surveillance and the tracking-paradox

    OpenAIRE

    Greiner, S.; Birnstill, Pascal; Krempel, Erik; Beckert, B.; Beyerer, Jürgen

    2013-01-01

    Increasing capabilities of intelligent video surveillance systems impose new threats to privacy while, at the same time, offering opportunities for reducing the privacy invasiveness of surveillance measures as well as their selectivity. We show that aggregating more data about observed people does not necessarily lead to less privacy, but can increase the selectivity of surveillance measures. In case of video surveillance in a company environment, if we enable the system to authenticate emplo...

  16. 76 FR 30952 - Published Privacy Impact Assessments on the Web

    Science.gov (United States)

    2011-05-27

    ... DEPARTMENT OF HOMELAND SECURITY Office of the Secretary Published Privacy Impact Assessments on... the Department. These assessments were approved and published on the Privacy Office's web site between..., 2011 and March 31, 2011, the Chief Privacy Officer of the DHS approved and published sixteen Privacy...

  17. 76 FR 58814 - Published Privacy Impact Assessments on the Web

    Science.gov (United States)

    2011-09-22

    ... DEPARTMENT OF HOMELAND SECURITY Office of the Secretary Published Privacy Impact Assessments on... DHS. These assessments were approved and published on the Privacy Office's Web site between June 1... 31, 2011, the Chief Privacy Officer of the DHS approved and published twenty-six Privacy Impact...

  18. 76 FR 78934 - Published Privacy Impact Assessments on the Web

    Science.gov (United States)

    2011-12-20

    ... DEPARTMENT OF HOMELAND SECURITY Office of the Secretary Published Privacy Impact Assessments on.... These assessments were approved and published on the Privacy Office's web site between September 1, 2011... November 30, 2011, the Chief Privacy Officer of the DHS approved and published seven Privacy Impact...

  19. 76 FR 37823 - Published Privacy Impact Assessments on the Web

    Science.gov (United States)

    2011-06-28

    ... DEPARTMENT OF HOMELAND SECURITY Office of the Secretary Published Privacy Impact Assessments on... Department. These assessments were approved and published on the Privacy Office's Web site between March 31... 31, 2011, the Chief Privacy Officer of the DHS approved and published ten Privacy Impact Assessments...

  20. A Cross-Cultural Perspective on the Privacy Calculus

    Directory of Open Access Journals (Sweden)

    Sabine Trepte

    2017-01-01

    Full Text Available The “privacy calculus” approach to studying online privacy implies that willingness to engage in disclosures on social network sites (SNSs) depends on evaluation of the resulting risks and benefits. In this article, we propose that cultural factors influence the perception of privacy risks and social gratifications. Based on survey data collected from participants from five countries (Germany [n = 740], the Netherlands [n = 89], the United Kingdom [n = 67], the United States [n = 489], and China [n = 165]), we successfully replicated the privacy calculus. Furthermore, we found that culture plays an important role: As expected, people from cultures ranking high in individualism found it less important to generate social gratifications on SNSs as compared to people from collectivist-oriented countries. However, the latter placed greater emphasis on privacy risks—presumably to safeguard the collective. Furthermore, we identified uncertainty avoidance to be a cultural dimension crucially influencing the perception of SNS risks and benefits. As expected, people from cultures ranking high in uncertainty avoidance found privacy risks to be more important when making privacy-related disclosure decisions. At the same time, these participants ascribed lower importance to social gratifications—possibly because social encounters are perceived to be less controllable in the social media environment.

  1. 32 CFR 701.109 - Privacy Act (PA) appeals.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 5 2010-07-01 2010-07-01 false Privacy Act (PA) appeals. 701.109 Section 701... OF THE NAVY DOCUMENTS AFFECTING THE PUBLIC DON Privacy Program § 701.109 Privacy Act (PA) appeals. (a... commence when the appeal reaches the office of the review authority having jurisdiction over the record...

  2. Privacy Information Security Classification for Internet of Things Based on Internet Data

    OpenAIRE

    Lu, Xiaofeng; Qu, Zhaowei; Li, Qi; Hui, Pan

    2015-01-01

    A lot of privacy protection technologies have been proposed, but most of them are independent and aim at protecting some specific privacy. There is hardly enough deep study into the attributes of privacy. To minimize the damage and influence of the privacy disclosure, the important and sensitive privacy should be a priori preserved if all privacy pieces cannot be preserved. This paper focuses on studying the attributes of the privacy and proposes privacy information security classification (P...

  3. Differential privacy in intelligent transportation systems

    NARCIS (Netherlands)

    Kargl, Frank; Friedman, Arik; Boreli, Roksana

    2013-01-01

    In this paper, we investigate how the concept of differential privacy can be applied to Intelligent Transportation Systems (ITS), focusing on protection of Floating Car Data (FCD) stored and processed in central Traffic Data Centers (TDC). We illustrate an integration of differential privacy with
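
    As a generic illustration of the differential-privacy concept (not the specific FCD integration the authors describe), the sketch below releases a count query over floating car data with Laplace noise calibrated to the query's sensitivity. The data, predicate, and epsilon value are hypothetical.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) noise via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(records, predicate, epsilon: float) -> float:
    """Release a counting query with epsilon-differential privacy (a count has sensitivity 1)."""
    true_count = sum(1 for record in records if predicate(record))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical floating car data records: (vehicle_id, road_segment, speed_kmh).
fcd = [(i, "segment-42", 90 + (i % 30)) for i in range(1000)]
print(dp_count(fcd, lambda r: r[2] > 100, epsilon=0.5))
```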

  4. The privacy paradox : Investigating discrepancies between expressed privacy concerns and actual online behavior - A systematic literature review

    NARCIS (Netherlands)

    Barth, Susanne; de Jong, Menno D.T.

    2017-01-01

    Also known as the privacy paradox, recent research on online behavior has revealed discrepancies between user attitude and their actual behavior. More specifically: While users claim to be very concerned about their privacy, they nevertheless undertake very little to protect their personal data.

  5. Privacy Breach Analysis in Social Networks

    Science.gov (United States)

    Nagle, Frank

    This chapter addresses various aspects of analyzing privacy breaches in social networks. We first review literature that defines three types of privacy breaches in social networks: interactive, active, and passive. We then survey the various network anonymization schemes that have been constructed to address these privacy breaches. After exploring these breaches and anonymization schemes, we evaluate a measure for determining the level of anonymity inherent in a network graph based on its topological structure. Finally, we close by emphasizing the difficulty of anonymizing social network data while maintaining usability for research purposes and offering areas for future work.

  6. Documenting death: public access to government death records and attendant privacy concerns.

    Science.gov (United States)

    Boles, Jeffrey R

    2012-01-01

    This Article examines the contentious relationship between public rights to access government-held death records and privacy rights concerning the deceased, whose personal information is contained in those same records. This right of access dispute implicates core democratic principles and public policy interests. Open access to death records, such as death certificates and autopsy reports, serves the public interest by shedding light on government agency performance, uncovering potential government wrongdoing, providing data on public health trends, and aiding those investigating family history, for instance. Families of the deceased have challenged the release of these records on privacy grounds, as the records may contain sensitive and embarrassing information about the deceased. Legislatures and the courts addressing this dispute have collectively struggled to reconcile the competing open access and privacy principles. The Article demonstrates how a substantial portion of the resulting law in this area is haphazardly formed, significantly overbroad, and loaded with unintended consequences. The Article offers legal reforms to bring consistency and coherence to this currently disordered area of jurisprudence.

  7. Evaluating Common Privacy Vulnerabilities in Internet Service Providers

    Science.gov (United States)

    Kotzanikolaou, Panayiotis; Maniatis, Sotirios; Nikolouzou, Eugenia; Stathopoulos, Vassilios

    Privacy in electronic communications receives increased attention in both research and industry forums, stemming from both the users' needs and from legal and regulatory requirements in national or international contexts. Privacy in internet-based communications relies heavily on the level of security of the Internet Service Providers (ISPs), as well as on the security awareness of the end users. This paper discusses the role of the ISP in the privacy of the communications. Based on real security audits performed in nationwide ISPs, we illustrate privacy-specific threats and vulnerabilities that many providers fail to address when implementing their security policies. We subsequently provide and discuss specific security measures that the ISPs can implement in order to fine-tune their security policies in the context of privacy protection.

  8. 76 FR 63896 - Federal Acquisition Regulation; Privacy Training, 2010-013

    Science.gov (United States)

    2011-10-14

    ... should a breach occur; and (7) Any agency-specific privacy training requirements. (d) The contractor is... Acquisition Regulation; Privacy Training, 2010-013 AGENCY: Department of Defense (DoD), General Services... contractors to complete training that addresses the protection of privacy, in accordance with the Privacy Act...

  9. Protecting location privacy for outsourced spatial data in cloud storage.

    Science.gov (United States)

    Tian, Feng; Gui, Xiaolin; An, Jian; Yang, Pan; Zhao, Jianqiang; Zhang, Xuejun

    2014-01-01

    As cloud computing services and location-aware devices are fully developed, a large amount of spatial data needs to be outsourced to the cloud storage provider, so research on privacy protection for outsourced spatial data is getting increasing attention from academia and industry. As a kind of spatial transformation method, the Hilbert curve is widely used to protect location privacy for spatial data, but sufficient security analysis of the standard Hilbert curve (SHC) has seldom been performed. In this paper, we propose an index modification method for SHC (SHC(∗)) and a density-based space filling curve (DSC) to improve the security of SHC; they can partially violate the distance-preserving property of SHC, so as to achieve better security. We formally define the indistinguishability and attack model for measuring the privacy disclosure risk of spatial transformation methods. The evaluation results indicate that SHC(∗) and DSC are more secure than SHC, and DSC achieves the best index generation performance.
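
    For readers unfamiliar with the underlying transformation, the sketch below maps two-dimensional grid coordinates to indices on a standard Hilbert curve; it is a generic illustration only, and the SHC(∗) index-modification and DSC constructions evaluated in the paper are not reproduced. The grid size and sample points are hypothetical.

```python
def xy2d(n: int, x: int, y: int) -> int:
    """Map a point (x, y) on an n-by-n grid (n a power of two) to its Hilbert-curve index."""
    d, s = 0, n // 2
    while s > 0:
        rx = 1 if (x & s) > 0 else 0
        ry = 1 if (y & s) > 0 else 0
        d += s * s * ((3 * rx) ^ ry)
        if ry == 0:                        # rotate/flip the quadrant to keep the curve continuous
            if rx == 1:
                x, y = n - 1 - x, n - 1 - y
            x, y = y, x
        s //= 2
    return d

# Nearby points tend to receive nearby indices; it is exactly this distance-preserving
# tendency that SHC(*) and DSC deliberately weaken to lower the privacy-disclosure risk.
points = [(3, 5), (3, 6), (12, 1)]         # hypothetical locations on a 16 x 16 grid
print([xy2d(16, x, y) for x, y in points])
```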

  10. Collaborative eHealth Meets Security: Privacy-Enhancing Patient Profile Management.

    Science.gov (United States)

    Sanchez-Guerrero, Rosa; Mendoza, Florina Almenarez; Diaz-Sanchez, Daniel; Cabarcos, Patricia Arias; Lopez, Andres Marin

    2017-11-01

    Collaborative healthcare environments offer potential benefits, including enhancing the healthcare quality delivered to patients and reducing costs. As a direct consequence, sharing of electronic health records (EHRs) among healthcare providers has experienced noteworthy growth in recent years, since it enables physicians to remotely monitor patients' health and enables individuals to manage their own health data more easily. However, these scenarios face significant challenges regarding the security and privacy of the extremely sensitive information contained in EHRs. Thus, a flexible, efficient, and standards-based solution is indispensable to guarantee selective identity information disclosure and preserve patients' privacy. We propose a privacy-aware profile management approach that empowers the patient role, enabling him to bring together various healthcare providers as well as user-generated claims into a unique credential. User profiles are represented through an adaptive Merkle Tree, for which we formalize the underlying mathematical model. Furthermore, performance of the proposed solution is empirically validated through simulation experiments.
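
    The adaptive Merkle Tree is not specified in this record; as a rough, generic sketch of how provider-issued and user-generated claims might be bound into a single verifiable credential, the snippet below computes a plain (non-adaptive) Merkle root over hashed claims. The claim strings are hypothetical.

```python
import hashlib

def _sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(claims) -> bytes:
    """Fold a list of byte-string claims into a single Merkle root hash."""
    level = [_sha256(claim) for claim in claims]
    if not level:
        return _sha256(b"")
    while len(level) > 1:
        if len(level) % 2 == 1:
            level.append(level[-1])        # duplicate the last node on odd-sized levels
        level = [_sha256(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

claims = [b"hospital-A:blood-type=O+",     # hypothetical provider-issued claims
          b"clinic-B:allergy=penicillin",
          b"self-reported:weight=70kg"]    # hypothetical user-generated claim
print(merkle_root(claims).hex())
```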

  11. Syllabus for Privacy and Information Technology, Fall 2017, UCLA Information Studies

    OpenAIRE

    Borgman, Christine L.

    2017-01-01

    Privacy is a broad topic that covers many disciplines, stakeholders, and concerns. This course addresses the intersection of privacy and information technology, surveying a wide array of topics of concern for research and practice in the information fields. Among the topics covered are the history and changing contexts of privacy; privacy risks and harms; law, policies, and practices; privacy in searching for information, in reading, and in libraries; surveillance, networks, and privacy by de...

  12. Interpretation and Analysis of Privacy Policies of Websites in India

    DEFF Research Database (Denmark)

    Dhotre, Prashant Shantaram; Olesen, Henning; Khajuria, Samant

    2016-01-01

    the conditions specified in the policy document. So, ideally, privacy policies should be readable and provide sufficient information to empower users to make knowledgeable decisions. Thus, we have examined more than 50 privacy policies and discuss the content analysis in this paper. We discovered that the policies are not only unstructured but also described in complicated language. Our analysis shows that user data security measures are nonspecific and unsatisfactory in 57% of the privacy policies. In spite of the huge amount of information collected, the privacy policies do not give a clear description of information collection methods, purpose, the names of sharing entities, and data transit. In this study, only 11% of the privacy policies comply with privacy standards, which denotes that the other privacy policies are less committed to supporting transparency, choice, and accountability in the process of information collection...

  13. Hacking Facebook Privacy and Security

    Science.gov (United States)

    2012-08-28

    Report on hacking Facebook privacy and security (Dr. Jeff Duffany, advisor; Omar Galban). Subject terms: Facebook, Privacy, Security, Social Network. Only fragments of the abstract are recoverable: when people talk about hacking and social networks, they're... transmit personal information that many people dare not do personally... Facebook is a popular social networking...

  14. 77 FR 46100 - Published Privacy Impact Assessments on the Web

    Science.gov (United States)

    2012-08-02

    ... DEPARTMENT OF HOMELAND SECURITY Office of the Secretary Published Privacy Impact Assessments on... published on the Privacy Office's Web site between March 1, 2012 and May 31, 2012. DATES: The PIAs will be... approved and published fifteen Privacy Impact Assessments (PIAs) on the DHS Privacy Office Web site, www...

  15. Designing Privacy-aware Internet of Things Applications

    OpenAIRE

    Perera, Charith; Barhamgi, Mahmoud; Bandara, Arosha K.; Ajmal, Muhammad; Price, Blaine; Nuseibeh, Bashar

    2017-01-01

    Internet of Things (IoT) applications typically collect and analyse personal data that can be used to derive sensitive information about individuals. However, thus far, privacy concerns have not been explicitly considered in software engineering processes when designing IoT applications. In this paper, we explore how a Privacy-by-Design (PbD) framework, formulated as a set of guidelines, can help software engineers to design privacy-aware IoT applications. We studied the utility of our propos...

  16. Toward sensitive document release with privacy guarantees

    OpenAIRE

    David Sánchez; Montserrat Batet

    2017-01-01

    DOI: 10.1016/j.engappai.2016.12.013 URL: http://www.sciencedirect.com/science/article/pii/S0952197616302408 Privacy has become a serious concern for modern Information Societies. The sensitive nature of much of the data that are daily exchanged or released to untrusted parties requires that responsible organizations undertake appropriate privacy protection measures. Nowadays, much...

  17. Privacy Attitudes among Early Adopters of Emerging Health Technologies.

    Directory of Open Access Journals (Sweden)

    Cynthia Cheung

    Full Text Available Advances in health technology such as genome sequencing and wearable sensors now allow for the collection of highly granular personal health data from individuals. It is unclear how people think about privacy in the context of these emerging health technologies. An open question is whether early adopters of these advances conceptualize privacy in different ways than non-early adopters. This study sought to understand privacy attitudes of early adopters of emerging health technologies. Transcripts from in-depth, semi-structured interviews with early adopters of genome sequencing and health devices and apps were analyzed with a focus on participant attitudes and perceptions of privacy. Themes were extracted using inductive content analysis. Although interviewees were willing to share personal data to support scientific advancements, they still expressed concerns, as well as uncertainty about who has access to their data, and for what purpose. In short, they were not dismissive of privacy risks. Key privacy-related findings are organized into four themes as follows: first, personal data privacy; second, control over personal information; third, concerns about discrimination; and fourth, contributing personal data to science. Early adopters of emerging health technologies appear to have more complex and nuanced conceptions of privacy than might be expected based on their adoption of personal health technologies and participation in open science. Early adopters also voiced uncertainty about the privacy implications of their decisions to use new technologies and share their data for research. Though not representative of the general public, studies of early adopters can provide important insights into evolving attitudes toward privacy in the context of emerging health technologies and personal health data research.

  18. Privacy Attitudes among Early Adopters of Emerging Health Technologies.

    Science.gov (United States)

    Cheung, Cynthia; Bietz, Matthew J; Patrick, Kevin; Bloss, Cinnamon S

    2016-01-01

    Advances in health technology such as genome sequencing and wearable sensors now allow for the collection of highly granular personal health data from individuals. It is unclear how people think about privacy in the context of these emerging health technologies. An open question is whether early adopters of these advances conceptualize privacy in different ways than non-early adopters. This study sought to understand privacy attitudes of early adopters of emerging health technologies. Transcripts from in-depth, semi-structured interviews with early adopters of genome sequencing and health devices and apps were analyzed with a focus on participant attitudes and perceptions of privacy. Themes were extracted using inductive content analysis. Although interviewees were willing to share personal data to support scientific advancements, they still expressed concerns, as well as uncertainty about who has access to their data, and for what purpose. In short, they were not dismissive of privacy risks. Key privacy-related findings are organized into four themes as follows: first, personal data privacy; second, control over personal information; third, concerns about discrimination; and fourth, contributing personal data to science. Early adopters of emerging health technologies appear to have more complex and nuanced conceptions of privacy than might be expected based on their adoption of personal health technologies and participation in open science. Early adopters also voiced uncertainty about the privacy implications of their decisions to use new technologies and share their data for research. Though not representative of the general public, studies of early adopters can provide important insights into evolving attitudes toward privacy in the context of emerging health technologies and personal health data research.

  19. Biomedical databases: protecting privacy and promoting research.

    Science.gov (United States)

    Wylie, Jean E; Mineau, Geraldine P

    2003-03-01

    When combined with medical information, large electronic databases of information that identify individuals provide superlative resources for genetic, epidemiology and other biomedical research. Such research resources increasingly need to balance the protection of privacy and confidentiality with the promotion of research. Models that do not allow the use of such individual-identifying information constrain research; models that involve commercial interests raise concerns about what type of access is acceptable. Researchers, individuals representing the public interest and those developing regulatory guidelines must be involved in an ongoing dialogue to identify practical models.

  20. The study on privacy preserving data mining for information security

    Science.gov (United States)

    Li, Xiaohui

    2012-04-01

    Privacy-preserving data mining has developed rapidly within a few short years, but it still faces many challenges. First, the level of privacy is defined differently in different fields, so the measures by which privacy-preserving data mining technologies protect private information are not the same; presenting a unified privacy definition and measure is therefore an urgent issue. Second, most research in privacy-preserving data mining is presently confined to theoretical study.

  1. 32 CFR 701.119 - Privacy and the web.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 5 2010-07-01 2010-07-01 false Privacy and the web. 701.119 Section 701.119... THE NAVY DOCUMENTS AFFECTING THE PUBLIC DON Privacy Program § 701.119 Privacy and the web. DON activities shall consult SECNAVINST 5720.47B for guidance on what may be posted on a Navy Web site. ...

  2. Visual privacy by context: proposal and evaluation of a level-based visualisation scheme.

    Science.gov (United States)

    Padilla-López, José Ramón; Chaaraoui, Alexandros Andre; Gu, Feng; Flórez-Revuelta, Francisco

    2015-06-04

    Privacy in image and video data has become an important subject since cameras are being installed in an increasing number of public and private spaces. Specifically, in assisted living, intelligent monitoring based on computer vision can allow one to provide risk detection and support services that increase people's autonomy at home. In the present work, a level-based visualisation scheme is proposed to provide visual privacy when human intervention is necessary, such as at telerehabilitation and safety assessment applications. Visualisation levels are dynamically selected based on the previously modelled context. In this way, different levels of protection can be provided, maintaining the necessary intelligibility required for the applications. Furthermore, a case study of a living room, where a top-view camera is installed, is presented. Finally, the performed survey-based evaluation indicates the degree of protection provided by the different visualisation models, as well as the personal privacy preferences and valuations of the users.

  3. Unveiling consumer's privacy paradox behaviour in an economic exchange.

    Science.gov (United States)

    Motiwalla, Luvai F; Li, Xiao-Bai

    2016-01-01

    The privacy paradox is of great interest to IS researchers and firms gathering personal information. It has been studied independently from social, behavioural, and economic perspectives. However, prior research has not examined how much each of these perspectives contributes to the privacy paradox problem. We combine economic and behavioural perspectives in our study of the privacy paradox, eliciting a price valuation of personal information through an economic experiment alongside a behavioural study of the privacy paradox. Our goal is to reveal more insights into the privacy paradox through economic valuation of personal information. Results indicate that neither general privacy concerns nor individual disclosure concerns have a significant influence on the price valuation of personal information. Instead, prior disclosure behaviour in specific scenarios, such as with healthcare providers or social networks, is a better indicator of consumers' price valuations.

  4. Privacy and confidentiality in pragmatic clinical trials.

    Science.gov (United States)

    McGraw, Deven; Greene, Sarah M; Miner, Caroline S; Staman, Karen L; Welch, Mary Jane; Rubel, Alan

    2015-10-01

    With pragmatic clinical trials, an opportunity exists to answer important questions about the relative risks, burdens, and benefits of therapeutic interventions. However, concerns about protecting the privacy of this information are significant and must be balanced with the imperative to learn from the data gathered in routine clinical practice. Traditional privacy protections for research uses of identifiable information rely disproportionately on informed consent or authorizations, based on a presumption that this is necessary to fulfill ethical principles of respect for persons. But frequently, the ideal of informed consent is not realized in its implementation. Moreover, the principle of respect for persons—which encompasses their interests in health information privacy—can be honored through other mechanisms. Data anonymization also plays a role in protecting privacy but is not suitable for all research, particularly pragmatic clinical trials. In this article, we explore both the ethical foundation and regulatory framework intended to protect privacy in pragmatic clinical trials. We then review examples of novel approaches to respecting persons in research that may have the added benefit of honoring patient privacy considerations. © The Author(s) 2015.

  5. HIPAA privacy regulations: practical information for physicians.

    Science.gov (United States)

    McMahon, E B; Lee-Huber, T

    2001-07-01

    After much debate and controversy, the Bush administration announced on April 12, 2001, that it would implement the Health Insurance Portability and Accountability Act (HIPAA) privacy regulations issued by the Clinton administration in December of 2000. The privacy regulations became effective on April 14, 2001. Although the regulations are considered final, the Secretary of the Department of Health and Human Services has the power to modify the regulations at any time during the first year of implementation. These regulations affect how a patient's health information is used and disclosed, as well as how patients are informed of their privacy rights. As "covered entities," physicians have until April 14, 2003, to comply fully with the HIPAA privacy regulations, which are more than 1,500 pages in length. This article presents a basic overview of the new and complex regulations and highlights practical information about physicians' compliance with the regulations. However, this summary of the HIPAA privacy regulations should not be construed as legal advice or an opinion on specific situations. Please consult an attorney concerning your compliance with HIPAA and the regulations promulgated thereunder.

  6. Fourteen Reasons Privacy Matters: A Multidisciplinary Review of Scholarly Literature

    Science.gov (United States)

    Magi, Trina J.

    2011-01-01

    Librarians have long recognized the importance of privacy to intellectual freedom. As digital technology and its applications advance, however, efforts to protect privacy may become increasingly difficult. With some users behaving in ways that suggest they do not care about privacy and with powerful voices claiming that privacy is dead, librarians…

  7. Understanding Engagement with the Privacy Domain Through Design Research.

    OpenAIRE

    Vasalou, A.; Oostveen, A.; Bowers, Christopher; Beale, R.

    2015-01-01

    This paper reports findings from participatory design research aimed at uncovering how technological interventions can engage users in the domain of privacy. Our work was undertaken in the context of a new design concept “Privacy Trends” whose aspiration is to foster technology users’ digital literacy regarding ongoing privacy risks and elucidate how such risks fit within existing social, organizational and political systems, leading to a longer term privacy concern. Our study reveals two cha...

  8. Privacy Preserving Mapping Schemes Supporting Comparison

    NARCIS (Netherlands)

    Tang, Qiang

    2010-01-01

    To cater to the privacy requirements in cloud computing, we introduce a new primitive, namely Privacy Preserving Mapping (PPM) schemes supporting comparison. A PPM scheme enables a user to map data items into images in such a way that, with a set of images, any entity can determine the <, =, >
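
    The record does not describe the actual PPM construction, so the sketch below only shows a generic order-preserving mapping over a known finite domain: each value receives an image built from a keyed running sum of pseudorandom positive gaps, so that comparing images yields the same <, =, > outcome as comparing the underlying values. The domain and key are hypothetical.

```python
import random

def build_opm(domain, key):
    """Build an order-preserving mapping: x < y in the domain implies opm[x] < opm[y]."""
    rng = random.Random(key)               # keyed PRNG makes the mapping reproducible
    opm, image = {}, 0
    for value in sorted(domain):
        image += rng.randint(1, 1_000)     # strictly positive gap preserves the order
        opm[value] = image
    return opm

ages = range(0, 121)                       # hypothetical finite, ordered domain
opm = build_opm(ages, key="user-secret")
print(opm[30] < opm[45], opm[45] == opm[45], opm[60] > opm[45])   # True True True
```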

  9. Privacy een grondrecht, maar ook handelswaar

    NARCIS (Netherlands)

    Olsthoorn, P.

    2015-01-01

    Journalist Brenno de Winter comments scathingly on the speakers on privacy at the NLIGF congress 2015. He puts Bart Schermer, an adviser to companies and organisations, in his place. Schermer had just argued that privacy should not be a yoke (a 'chore') but should be the focus of innovation by

  10. A Privacy Preservation Model for Health-Related Social Networking Sites

    OpenAIRE

    Li, Jingquan

    2015-01-01

    The increasing use of social networking sites (SNS) in health care has resulted in a growing number of individuals posting personal health information online. These sites may disclose users' health information to many different individuals and organizations and mine it for a variety of commercial and research purposes, yet the revelation of personal health information to unauthorized individuals or entities brings a concomitant concern of greater risk for loss of privacy among users. Many use...

  11. Space in Space: Designing for Privacy in the Workplace

    Science.gov (United States)

    Akin, Jonie

    2015-01-01

    Privacy is cultural, socially embedded in the spatial, temporal, and material aspects of the lived experience. Definitions of privacy are as varied among scholars as they are among those who fight for their personal rights in the home and the workplace. Privacy in the workplace has become a topic of interest in recent years, as evident in discussions on Big Data as well as the shrinking office spaces in which people work. An article in The New York Times published in February of this year noted that "many companies are looking to cut costs, and one way to do that is by trimming personal space". Increasingly, organizations ranging from tech start-ups to large corporations are downsizing square footage and opting for open-office floorplans hoping to trim the budget and spark creative, productive communication among their employees. The question of how much is too much to trim when it comes to privacy, is one that is being actively addressed by the National Aeronautics and Space Administration (NASA) as they explore habitat designs for future space missions. NASA recognizes privacy as a design-related stressor impacting human health and performance. Given the challenges of sustaining life in an isolated, confined, and extreme environment such as Mars, NASA deems it necessary to determine the acceptable minimal amount for habitable volume for activities requiring at least some level of privacy in order to support optimal crew performance. Ethnographic research was conducted in 2013 to explore perceptions of privacy and privacy needs among astronauts living and working in space as part of a long-distance, long-duration mission. The allocation of space, or habitable volume, becomes an increasingly complex issue in outer space due to the costs associated with maintaining an artificial, confined environment bounded by limitations of mass while located in an extreme environment. Privacy in space, or space in space, provides a unique case study of the complex notions of

  12. The Influence of Security Statement, Technical Protection, and Privacy on Satisfaction and Loyalty; A Structural Equation Modeling

    Science.gov (United States)

    Peikari, Hamid Reza

    Customer satisfaction and loyalty have been cited as critical success factors for e-commerce, and various studies have been conducted to find the antecedent determinants of these concepts in online transactions. One of the variables suggested by some studies is perceived security. However, these studies have referred to security from a broad, general perspective, and no attempts have been made to study specific security-related variables. This paper studies the influence of security statements and technical protection on satisfaction, loyalty, and privacy. Data were collected from 337 respondents and, after reliability and validity tests, path analysis was applied to examine the hypotheses. The results suggest that loyalty is influenced by satisfaction and security statements, while no empirical support was found for an influence of technical protection or privacy on loyalty. Moreover, security statements and technical protection were found to have a positive, significant influence on satisfaction, while no significant effect was found for privacy. Furthermore, the analysis indicated that security statements have a positive, significant influence on technical protection, while technical protection has a significant negative impact on perceived privacy.

  13. SOCIO-SPATIAL INTEGRATION OF LANDSCAPE BACK LANE OF HOUSING AT BANDAR BARU NILAI: PRIVACY AND COMMUNITY

    Directory of Open Access Journals (Sweden)

    SITI F. M. LIAS

    2016-05-01

    Full Text Available Urban informal spaces in the form of back lanes tend to promote socio-spatial integration between neighbourhood communities. The Back Lane Planning Design Guidelines issued in 2014 by the Town and Country Planning Department of Malaysia identify the back lane as a place to encourage a communal lifestyle while remaining an area of the residents' own privacy. In reality, back lanes are often portrayed as wasted, unfavourable paths, and several social concerns arise around safety, security and health, as well as invasion of privacy and a deficient sense of community bonding. This study quantitatively analyses dwellers' perceptions, focusing on the level of visual privacy and the level of neighbourhood interaction, to assess the effectiveness of the newly landscaped back lane (LBL) in contemporary urban dwellings. Socio-spatial integration is compared between two types of back lane design in a grid-linear housing scheme: the pleasing greenery landscaped back lane (LBL) and the plain, empty, bare paved back lane (PBL) in the residential area of Kota Seriemas, Nilai, Negeri Sembilan. A structured questionnaire was distributed to 115 respondents to assess privacy and comfort levels, neighbourhood activities and communal lifestyle, back-lane usage, and residents' perceptions and expectations. The study finds that the landscaped back lane (LBL) ensures residents' own privacy but is lacking in promoting community interaction among the residents, owing to contemporary urban lifestyles.

  14. Overview of Privacy in Social Networking Sites (SNS)

    Science.gov (United States)

    Powale, Pallavi I.; Bhutkar, Ganesh D.

    2013-07-01

    Social Networking Sites (SNSs) have become an integral part of communication and lifestyle in today's world. Because of the wide range of services offered by SNSs, mostly free of cost, these sites attract the attention of virtually all Internet users; most importantly, users from all age groups have become members of SNSs. Since many users are not aware of the data thefts associated with information sharing, they freely share their personal information with SNSs. Therefore, SNSs may be used for investigating users' character and social habits by familiar or even unknown persons and agencies. This commercial and social scenario has led to a number of privacy and security threats. Though all major issues in SNSs need to be addressed by SNS providers, the privacy of SNS users is the most crucial. Therefore, in this paper, we focus our discussion on "privacy in SNSs". We discuss different ways in which Personally Identifiable Information (PII) leaks from SNSs, information revelation to third-party domains without user consent, and privacy-related threats associated with such information sharing. We expect that this comprehensive overview of privacy in SNSs will help raise user awareness about sharing data and managing privacy with SNSs. It will also help SNS providers rethink their privacy policies.

  15. Electronic Mail, Privacy, and the Electronic Communications Privacy Act of 1986: Technology in Search of Law.

    Science.gov (United States)

    Samoriski, Jan H.; And Others

    1996-01-01

    Attempts to clarify the status of e-mail privacy under the Electronic Communications Privacy Act of 1986 (ECPA). Examines current law and the paucity of definitive case law. A review of cases and literature suggests there is a gap in the existing ECPA that allows for potentially abusive electronic monitoring and interception of e-mail,…

  16. Millennials sex differences on Snapchat perceived privacy

    Directory of Open Access Journals (Sweden)

    Antonietta Rauzzino

    2017-07-01

    Full Text Available Snapchat offers a feature that distinguishes it from other social networks: its users control the visibility of the contents they share with others by defining how long these contents may be available. Snapchat is changing the way men and women perceive online information privacy and content management. This paper aims to illustrate the relevance of social representation theory for evaluating perceived privacy among Snapchat users, with a sample of 268 young adults residing in Bogotá. A survey method was employed for data collection purposes. The results reveal that Snapchat users are concerned about their networks’ privacy, with no significant sex differences, although men perceive Snapchat privacy as safer than women do. Finally, a discussion is presented as to the limitations and implications of these results for further studies.

  17. Preserving location and absence privacy in geo-social networks

    DEFF Research Database (Denmark)

    Freni, Dario; Vicente, Carmen Ruiz; Mascetti, Sergio

    2010-01-01

    The resulting geo-aware social networks (GeoSNs) pose privacy threats beyond those found in location-based services. Content published in a GeoSN is often associated with references to multiple users, without the publisher being aware of the privacy preferences of those users. Moreover, this content is often accessible to multiple users. This renders it difficult for GeoSN users to control which information about them is available and to whom it is available. This paper addresses two privacy threats that occur in GeoSNs: location privacy and absence privacy. The former concerns the availability of information about the presence of users in specific locations at given times, while the latter concerns the availability of information about the absence of an individual from specific locations during given periods of time. The challenge addressed is that of supporting privacy while still enabling useful services.

  18. Privacy Training Program

    Science.gov (United States)

    Recognizing that training and awareness are critical to protecting agency Personally Identifiable Information (PII), the EPA is developing online training for privacy contacts in its programs and regions.

  19. Redefining Genomic Privacy: Trust and Empowerment

    OpenAIRE

    Erlich, Yaniv; Williams, James B.; Glazer, David; Yocum, Kenneth; Farahany, Nita; Olson, Maynard; Narayanan, Arvind; Stein, Lincoln D.; Witkowski, Jan A.; Kain, Robert C.

    2014-01-01

    Fulfilling the promise of the genetic revolution requires the analysis of large datasets containing information from thousands to millions of participants. However, sharing human genomic data requires protecting subjects from potential harm. Current models rely on de-identification techniques in which privacy versus data utility becomes a zero-sum game. Instead, we propose the use of trust-enabling techniques to create a solution in which researchers and participants both win. To do so we int...

  20. 49 CFR 801.56 - Unwarranted invasion of personal privacy.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 7 2010-10-01 2010-10-01 false Unwarranted invasion of personal privacy. 801.56... Unwarranted invasion of personal privacy. Pursuant to 5 U.S.C. 552(b)(6), any personal, medical, or similar... a clearly unwarranted invasion of the person's personal privacy. ...

  1. Privacy na Babel : de vermeende ongrijpbaarheid van het privacybegrip

    NARCIS (Netherlands)

    Vedder, A.H.

    1998-01-01

    The widely held view that privacy is fundamentally undefinable, recently put forward again by Serge Gutwirth, is incorrect. To defend privacy as a value, one must assume that privacy is admittedly a vague and complex concept, which is partly determined by context,

  2. 20 CFR 401.30 - Privacy Act and other responsibilities.

    Science.gov (United States)

    2010-04-01

    ... information privacy issues, including those relating to the collection, use, sharing, and disclosure of... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false Privacy Act and other responsibilities. 401.30 Section 401.30 Employees' Benefits SOCIAL SECURITY ADMINISTRATION PRIVACY AND DISCLOSURE OF...

  3. Towards quantum-based privacy and voting

    International Nuclear Information System (INIS)

    Hillery, Mark; Ziman, Mario; Buzek, Vladimir; Bielikova, Martina

    2006-01-01

    The privacy of communicating participants is often of paramount importance, but in some situations it is an essential condition. A typical example is a fair (secret) voting. We analyze in detail communication privacy based on quantum resources, and we propose new quantum protocols. Possible generalizations that would lead to voting schemes are discussed

  4. The Privacy Attitude Questionnaire (PAQ): Initial Development and Validation

    OpenAIRE

    Chignell, Mark H.; Quan-Haase, Anabel; Gwizdka, Jacek

    2003-01-01

    Privacy has been identified as a key issue in a variety of domains, including electronic commerce and public policy. While there are many discussions of privacy issues from a legal and policy perspective, there is little information on the structure of privacy as a psychometric construct. Our goal is to develop a method for measuring attitudes towards privacy that can guide the design and personalization of services. This paper reports on the development of an initial version of the PAQ. Four...

  5. Privacy concerns in smart cities

    OpenAIRE

    van Zoonen, Liesbet

    2016-01-01

    In this paper a framework is constructed to hypothesize if and how smart city technologies and urban big data produce privacy concerns among the people in these cities (as inhabitants, workers, visitors, and otherwise). The framework is built on the basis of two recurring dimensions in research about people's concerns about privacy: one dimension represents that people perceive particular data as more personal and sensitive than others, the other dimension represents that people'...

  6. Privacy-Preserving Patient Similarity Learning in a Federated Environment: Development and Analysis.

    Science.gov (United States)

    Lee, Junghye; Sun, Jimeng; Wang, Fei; Wang, Shuang; Jun, Chi-Hyuck; Jiang, Xiaoqian

    2018-04-13

    There is an urgent need for the development of global analytic frameworks that can perform analyses in a privacy-preserving federated environment across multiple institutions without privacy leakage. A few studies on the topic of federated medical analysis have been conducted recently with the focus on several algorithms. However, none of them have solved similar patient matching, which is useful for applications such as cohort construction for cross-institution observational studies, disease surveillance, and clinical trials recruitment. The aim of this study was to present a privacy-preserving platform in a federated setting for patient similarity learning across institutions. Without sharing patient-level information, our model can find similar patients from one hospital to another. We proposed a federated patient hashing framework and developed a novel algorithm to learn context-specific hash codes to represent patients across institutions. The similarities between patients can be efficiently computed using the resulting hash codes of corresponding patients. To avoid security attack from reverse engineering on the model, we applied homomorphic encryption to patient similarity search in a federated setting. We used sequential medical events extracted from the Multiparameter Intelligent Monitoring in Intensive Care-III database to evaluate the proposed algorithm in predicting the incidence of five diseases independently. Our algorithm achieved averaged area under the curves of 0.9154 and 0.8012 with balanced and imbalanced data, respectively, in κ-nearest neighbor with κ=3. We also confirmed privacy preservation in similarity search by using homomorphic encryption. The proposed algorithm can help search similar patients across institutions effectively to support federated data analysis in a privacy-preserving manner. ©Junghye Lee, Jimeng Sun, Fei Wang, Shuang Wang, Chi-Hyuck Jun, Xiaoqian Jiang. Originally published in JMIR Medical Informatics (http
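
    Neither the learned context-specific hash function nor the homomorphic-encryption search is reproduced here; the sketch below only illustrates the final prediction step, κ-nearest-neighbor voting over binary patient hash codes under Hamming distance, with randomly generated codes and labels standing in for real data.

```python
import numpy as np

def knn_predict(query_code, codes, labels, k: int = 3) -> int:
    """Predict a binary outcome by majority vote of the k codes nearest in Hamming distance."""
    dists = np.count_nonzero(codes != query_code, axis=1)   # Hamming distance to every patient
    nearest = np.argsort(dists)[:k]
    return int(labels[nearest].mean() >= 0.5)

rng = np.random.default_rng(0)
codes = rng.integers(0, 2, size=(100, 32))   # hypothetical 32-bit codes for 100 patients
labels = rng.integers(0, 2, size=100)        # hypothetical disease-incidence labels
query = rng.integers(0, 2, size=32)          # hash code of the query patient
print(knn_predict(query, codes, labels, k=3))
```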

  7. Users or Students? Privacy in University MOOCS.

    Science.gov (United States)

    Jones, Meg Leta; Regner, Lucas

    2016-10-01

    Two terms, student privacy and Massive Open Online Courses, have received a significant amount of attention recently. Both represent interesting sites of change in entrenched structures, one educational and one legal. MOOCs represent something college courses have never been able to provide: universal access. Universities not wanting to miss the MOOC wave have started to build MOOC courses and integrate them into the university system in various ways. However, the design and scale of university MOOCs create tension for privacy laws intended to regulate information practices exercised by educational institutions. Are MOOCs part of the educational institutions these laws and policies aim to regulate? Are MOOC users students whose data are protected by aforementioned laws and policies? Many university researchers and faculty members are asked to participate as designers and instructors in MOOCs but may not know how to approach the issues proposed. While recent scholarship has addressed the disruptive nature of MOOCs, student privacy generally, and data privacy in the K-12 system, we provide an in-depth description and analysis of the MOOC phenomenon and the privacy laws and policies that guide and regulate educational institutions today. We offer privacy case studies of three major MOOC providers active in the market today to reveal inconsistencies among MOOC platform and the level and type of legal uncertainty surrounding them. Finally, we provide a list of organizational questions to pose internally to navigate the uncertainty presented to university MOOC teams.

  8. Unpicking the privacy paradox: can structuration theory help to explain location-based privacy decisions?

    OpenAIRE

    Zafeiropoulou, Aristea M.; Millard, David E.; Webber, Craig; O'Hara, Kieron

    2013-01-01

    Social Media and Web 2.0 tools have dramatically increased the amount of previously private data that users share on the Web; now with the advent of GPS-enabled smartphones users are also actively sharing their location data through a variety of applications and services. Existing research has explored people’s privacy attitudes, and shown that the way people trade their personal data for services of value can be inconsistent with their stated privacy preferences (a phenomenon known as the pr...

  9. Just in Time Research: Privacy Practices

    Science.gov (United States)

    Grama, Joanna Lyn

    2014-01-01

    The January 2014 edition of the ECAR Update subscriber newsletter included an informal poll on information privacy practices. The poll was intended to collect a quick snapshot of the higher education community's thoughts on this important topic during Data Privacy Month. Results of the poll will be used to inform EDUCAUSE research, programs,…

  10. Scalable privacy-preserving big data aggregation mechanism

    Directory of Open Access Journals (Sweden)

    Dapeng Wu

    2016-08-01

    Full Text Available As the massive sensor data generated by large-scale Wireless Sensor Networks (WSNs) has recently become an indispensable part of ‘Big Data’, the collection, storage, transmission and analysis of the big sensor data attract considerable attention from researchers. Targeting the privacy requirements of large-scale WSNs and focusing on the energy-efficient collection of big sensor data, a Scalable Privacy-preserving Big Data Aggregation (Sca-PBDA) method is proposed in this paper. Firstly, according to the pre-established gradient topology structure, sensor nodes in the network are divided into clusters. Secondly, sensor data is modified by each node according to the privacy-preserving configuration message received from the sink. Subsequently, intra- and inter-cluster data aggregation is employed during the big sensor data reporting phase to reduce energy consumption. Lastly, aggregated results are recovered by the sink to complete the privacy-preserving big data aggregation. Simulation results validate the efficacy and scalability of Sca-PBDA and show that the big sensor data generated by large-scale WSNs is efficiently aggregated to reduce network resource consumption and the sensor data privacy is effectively protected to meet the ever-growing application requirements.
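
    The sketch below shows one common way such privacy-preserving additive aggregation can work (not necessarily the exact Sca-PBDA construction): each node masks its reading with a pseudorandom offset derived from a secret shared with the sink, the masked values are summed in-network, and the sink removes the total offset to recover the true aggregate. All names and parameters are hypothetical.

```python
import random

def _offset(secret: str, node_id: int, epoch: int) -> int:
    """Pseudorandom masking offset that both a node and the sink derive from the shared secret."""
    return random.Random(f"{secret}:{node_id}:{epoch}").randint(0, 10_000)

def node_report(value: int, node_id: int, epoch: int, secret: str) -> int:
    """A sensor node hides its raw reading behind its offset before in-network aggregation."""
    return value + _offset(secret, node_id, epoch)

def sink_recover(aggregate: int, node_ids, epoch: int, secret: str) -> int:
    """The sink subtracts the sum of all offsets to recover the true aggregate."""
    return aggregate - sum(_offset(secret, nid, epoch) for nid in node_ids)

readings = {nid: 20 + (nid % 5) for nid in range(50)}      # hypothetical sensor readings
masked = [node_report(v, nid, 7, "shared-key") for nid, v in readings.items()]
aggregate = sum(masked)                                    # additive intra-/inter-cluster aggregation
print(sink_recover(aggregate, readings, 7, "shared-key") == sum(readings.values()))   # True
```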

  11. Reliable Collaborative Filtering on Spatio-Temporal Privacy Data

    Directory of Open Access Journals (Sweden)

    Zhen Liu

    2017-01-01

    Full Text Available Lots of multilayer information, such as spatio-temporal privacy check-in data, is accumulated in the location-based social network (LBSN). When using the collaborative filtering algorithm for LBSN location recommendation, one of the core issues is how to improve recommendation performance by combining the traditional algorithm with the multilayer information. The existing approaches of collaborative filtering use only the sparse user-item rating matrix. This entails high computational complexity and inaccurate results. A novel collaborative filtering-based location recommendation algorithm called LGP-CF, which takes spatio-temporal privacy information into account, is proposed in this paper. By mining users' check-in behavior patterns, the dataset is segmented semantically to reduce the data size that needs to be computed. Then the clustering algorithm is used to obtain and narrow the set of similar users. A user-location bipartite graph is modeled using the filtered similar user set. Then LGP-CF can quickly locate the location and trajectory of users through message propagation and aggregation over the graph. By calculating user similarity from spatio-temporal privacy data on the graph, we can finally calculate the rating of recommendable locations. Experimental results on the physical clusters indicate that, compared with the existing algorithms, the proposed LGP-CF algorithm can make recommendations more accurately.
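
    The semantic segmentation, clustering, and message propagation over the user-location bipartite graph are not reproduced here; as a minimal collaborative-filtering baseline of the kind LGP-CF builds on, the sketch below scores unvisited locations for a target user by a cosine-similarity-weighted sum over other users' check-in vectors, using a randomly generated check-in matrix.

```python
import numpy as np

def predict_scores(target: int, visits: np.ndarray) -> np.ndarray:
    """Score unvisited locations for `target` by a cosine-similarity-weighted sum of check-ins."""
    norms = np.linalg.norm(visits, axis=1)
    norms[norms == 0] = 1.0                        # guard against users with no check-ins
    sims = (visits @ visits[target]) / (norms * norms[target])
    sims[target] = 0.0                             # ignore the target's own vector
    scores = sims @ visits                         # aggregate similar users' check-ins
    scores[visits[target] > 0] = -np.inf           # never re-recommend visited places
    return scores

rng = np.random.default_rng(1)
visits = (rng.random((20, 8)) < 0.3).astype(float) # hypothetical 20-user x 8-location matrix
print(int(np.argmax(predict_scores(0, visits))))   # index of the top location for user 0
```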

  12. Privacy-Preserving Biometric Authentication: Challenges and Directions

    Directory of Open Access Journals (Sweden)

    Elena Pagnin

    2017-01-01

    Full Text Available An emerging direction for authenticating people is the adoption of biometric authentication systems. Biometric credentials are becoming increasingly popular as a means of authenticating people due to the wide range of advantages that they provide with respect to classical authentication methods (e.g., password-based authentication). The most characteristic feature of this authentication method is the naturally strong bond between a user and her biometric credentials. This very same advantageous property, however, raises serious security and privacy concerns in case the biometric trait gets compromised. In this article, we present the most challenging issues that need to be taken into consideration when designing secure and privacy-preserving biometric authentication protocols. More precisely, we describe the main threats against privacy-preserving biometric authentication systems and give directions on possible countermeasures in order to design secure and privacy-preserving biometric authentication protocols.

  13. Secure and Privacy-Preserving Body Sensor Data Collection and Query Scheme

    Directory of Open Access Journals (Sweden)

    Hui Zhu

    2016-02-01

    Full Text Available With the development of body sensor networks and the pervasiveness of smart phones, different types of personal data can be collected in real time by body sensors, and the potential value of massive personal data has attracted considerable interest recently. However, the privacy issues of sensitive personal data are still challenging today. Aiming at these challenges, in this paper, we focus on the threats from telemetry interface and present a secure and privacy-preserving body sensor data collection and query scheme, named SPCQ, for outsourced computing. In the proposed SPCQ scheme, users’ personal information is collected by body sensors in different types and converted into multi-dimension data, and each dimension is converted into the form of a number and uploaded to the cloud server, which provides a secure, efficient and accurate data query service, while the privacy of sensitive personal information and users’ query data is guaranteed. Specifically, based on an improved homomorphic encryption technology over composite order group, we propose a special weighted Euclidean distance contrast algorithm (WEDC) for multi-dimension vectors over encrypted data. With the SPCQ scheme, the confidentiality of sensitive personal data, the privacy of data users’ queries and accurate query service can be achieved in the cloud server. Detailed analysis shows that SPCQ can resist various security threats from telemetry interface. In addition, we also implement SPCQ on an embedded device, smart phone and laptop with a real medical database, and extensive simulation results demonstrate that our proposed SPCQ scheme is highly efficient in terms of computation and communication costs.

  14. Secure and Privacy-Preserving Body Sensor Data Collection and Query Scheme.

    Science.gov (United States)

    Zhu, Hui; Gao, Lijuan; Li, Hui

    2016-02-01

    With the development of body sensor networks and the pervasiveness of smart phones, different types of personal data can be collected in real time by body sensors, and the potential value of massive personal data has attracted considerable interest recently. However, the privacy issues of sensitive personal data are still challenging today. Aiming at these challenges, in this paper, we focus on the threats from telemetry interface and present a secure and privacy-preserving body sensor data collection and query scheme, named SPCQ, for outsourced computing. In the proposed SPCQ scheme, users' personal information is collected by body sensors in different types and converted into multi-dimension data, and each dimension is converted into the form of a number and uploaded to the cloud server, which provides a secure, efficient and accurate data query service, while the privacy of sensitive personal information and users' query data is guaranteed. Specifically, based on an improved homomorphic encryption technology over composite order group, we propose a special weighted Euclidean distance contrast algorithm (WEDC) for multi-dimension vectors over encrypted data. With the SPCQ scheme, the confidentiality of sensitive personal data, the privacy of data users' queries and accurate query service can be achieved in the cloud server. Detailed analysis shows that SPCQ can resist various security threats from telemetry interface. In addition, we also implement SPCQ on an embedded device, smart phone and laptop with a real medical database, and extensive simulation results demonstrate that our proposed SPCQ scheme is highly efficient in terms of computation and communication costs.
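
    The homomorphic encryption over a composite-order group is omitted here; the sketch below only shows the plaintext form of the weighted Euclidean distance that the WEDC algorithm is described as evaluating over encrypted multi-dimension records. The dimensions and weights are hypothetical.

```python
import math

def weighted_euclidean(u, v, w) -> float:
    """Weighted Euclidean distance between two multi-dimension personal-data vectors."""
    return math.sqrt(sum(wi * (ui - vi) ** 2 for ui, vi, wi in zip(u, v, w)))

# Hypothetical three-dimension records (heart rate, body temperature, activity level),
# already converted into numbers as the scheme requires.
record = (72.0, 36.6, 0.8)
query = (70.0, 36.9, 0.5)
weights = (1.0, 5.0, 2.0)                  # the querier weights temperature most heavily
print(round(weighted_euclidean(record, query, weights), 3))
```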

  15. Routes for breaching and protecting genetic privacy.

    Science.gov (United States)

    Erlich, Yaniv; Narayanan, Arvind

    2014-06-01

    We are entering an era of ubiquitous genetic information for research, clinical care and personal curiosity. Sharing these data sets is vital for progress in biomedical research. However, a growing concern is the ability to protect the genetic privacy of the data originators. Here, we present an overview of genetic privacy breaching strategies. We outline the principles of each technique, indicate the underlying assumptions, and assess their technological complexity and maturation. We then review potential mitigation methods for privacy-preserving dissemination of sensitive data and highlight different cases that are relevant to genetic applications.

  16. Blood rights: the body and information privacy.

    Science.gov (United States)

    Alston, Bruce

    2005-05-01

    Genetic and other medical technology makes blood, human tissue and other bodily samples an immediate and accessible source of comprehensive personal and health information about individuals. Yet, unlike medical records, bodily samples are not subject to effective privacy protection or other regulation to ensure that individuals have rights to control the collection, use and transfer of such samples. This article examines the existing coverage of privacy legislation, arguments in favour of baseline protection for bodily samples as sources of information and possible approaches to new regulation protecting individual privacy rights in bodily samples.

  17. Analysis of Privacy-Enhancing Identity Management Systems

    DEFF Research Database (Denmark)

    Adjei, Joseph K.; Olesen, Henning

    Privacy has become a major issue for policy makers. This has been impelled by the rapid development of technologies that facilitate collection, distribution, storage, and manipulation of personal information. Business organizations are finding new ways of leveraging the value derived from consumer...... is an attempt to understand the relationship between individuals’ intentions to disclose personal information, their actual personal information disclosure behaviours, and how these can be leveraged to develop privacy-enhancing identity management systems (IDMS) that users can trust. Legal, regulatory...... and technological aspects of privacy and technology adoption are also discussed....

  18. Big Data and Consumer Participation in Privacy Contracts: Deciding who Decides on Privacy

    Directory of Open Access Journals (Sweden)

    Michiel Rhoen

    2015-02-01

    Full Text Available Big data puts data protection to the test. Consumers granting permission to process their personal data are increasingly opening up their personal lives, thanks to the “datafication” of everyday life, indefinite data retention and the increasing sophistication of algorithms for analysis. The privacy implications of big data call for serious consideration of consumers’ opportunities to participate in decision-making processes about their contracts. If these opportunities are insufficient, the resulting rules may represent special interests rather than consumers’ needs, which may undermine the legitimacy of big data applications. This article argues that providing sufficient consumer participation in privacy matters requires choosing the best available decision-making mechanism. Is a consumer to negotiate his own privacy terms in the market, will lawmakers step in on his behalf, or is he to seek protection through the courts? Furthermore, is this a matter of national law or European law? These choices will affect the opportunities for achieving different policy goals associated with the possible benefits of the “big data revolution”.

  19. 32 CFR 806b.30 - Evaluating information systems for Privacy Act compliance.

    Science.gov (United States)

    2010-07-01

    ... privacy issues are unchanged. (d) The depth and content of the Privacy Impact Assessment should be... 32 National Defense 6 2010-07-01 2010-07-01 false Evaluating information systems for Privacy Act... FORCE ADMINISTRATION PRIVACY ACT PROGRAM Privacy Impact Assessments § 806b.30 Evaluating information...

  20. A Framework for Privacy-preserving Classification of Next-generation PHR data.

    Science.gov (United States)

    Koufi, Vassiliki; Malamateniou, Flora; Prentza, Andriana; Vassilacopoulos, George

    2014-01-01

    Personal Health Records (PHRs), integrated with data from various sources, such as social care data, Electronic Health Record data and genetic information, are envisaged as having a pivotal role in transforming healthcare. These data, lumped under the term 'big data', are usually complex, noisy, heterogeneous, longitudinal and voluminous, thus prohibiting their meaningful use by clinicians. Deriving value from these data requires innovative data analysis techniques, whose use may however be hindered by potential security and privacy breaches arising from improper release of personal health information. This paper presents a HIPAA-compliant machine learning framework that enables privacy-preserving classification of next-generation PHR data. The predictive models acquired can act as supporting tools for clinical practice by enabling more effective prevention, diagnosis and treatment of new incidents. The proposed framework has huge potential for complementing medical staff expertise, as it outperforms the manual inspection of PHR data while protecting patient privacy.
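
    The abstract does not spell out the framework's internals, so the sketch below only illustrates one common ingredient of HIPAA-oriented pipelines: dropping direct identifiers from a toy PHR-style table before fitting a classifier. The column names, values and scikit-learn model choice are assumptions for illustration, not the paper's method.

    # Loose illustration only: de-identify a toy PHR table, then train a classifier.
    import pandas as pd
    from sklearn.tree import DecisionTreeClassifier

    records = pd.DataFrame({
        "name":     ["Alice", "Bob", "Carol", "Dave"],   # direct identifier
        "ssn":      ["111", "222", "333", "444"],        # direct identifier
        "age":      [34, 58, 47, 29],
        "systolic": [118, 145, 132, 121],
        "smoker":   [0, 1, 1, 0],
        "incident": [0, 1, 1, 0],                        # label to predict
    })

    # De-identification step: remove columns that directly identify a person.
    features = records.drop(columns=["name", "ssn", "incident"])
    labels = records["incident"]

    model = DecisionTreeClassifier(max_depth=2, random_state=0).fit(features, labels)
    print(model.predict(features.iloc[[0]]))  # prediction for the first de-identified record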

  1. Enforcement of Security and Privacy in a Service-Oriented Smart Grid

    DEFF Research Database (Denmark)

    Mikkelsen, Søren Aagaard

    inhabitants. With this vision, it is therefore necessary to enforce privacy and security of the data in all phases of its life cycle, which runs from the moment the data is acquired until it is stored. Therefore, this dissertation follows a system-level and application-level approach to manage data with respect...... to privacy and security. This includes first the design of a service-oriented architecture that allows home-oriented and grid-oriented IASs to be deployed on a Home Energy Management System (HEMS) and in the cloud, respectively. Privacy and security of electricity data are addressed by letting...... the residential consumer control data dissemination in a two-stage process: first from the HEMS to the cloud, and then from the cloud to the IASs. The dissertation then focuses on the critical phases in securing the residential home as well as securing the cloud. It presents a system-level threat model of the HEMS...

  2. PRIVACY CONCERNS IN FACEBOOK SITE

    OpenAIRE

    Vandana Singh

    2014-01-01

    Today, social networking sites play an important role as an inexpensive way to maintain existing relationships and present oneself. However, the increasing use of online sites gives rise to privacy concerns and risks. All Internet sites are also under attack from phishers, fraudsters, and spammers, who aim to steal user information and expose users to unwanted spam, and who have many resources at their disposal. This paper studies the awareness of college students regarding privacy in Faceboo...

  3. Privacy Protection Method for Multiple Sensitive Attributes Based on Strong Rule

    Directory of Open Access Journals (Sweden)

    Tong Yi

    2015-01-01

    Full Text Available At present, most studies on data publishing consider only a single sensitive attribute, and work on multiple sensitive attributes remains scarce. Moreover, almost all existing studies on multiple sensitive attributes have not taken the inherent relationships between sensitive attributes into account, so an adversary can use background knowledge about these relationships to attack users' privacy. This paper presents an attack model based on the association rules between sensitive attributes and, accordingly, presents a data publication method for multiple sensitive attributes. Proof and analysis show that the new model can prevent an adversary from using background knowledge about association rules to attack privacy while still releasing high-quality information. Finally, the paper verifies these conclusions with experiments.
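
    The attack surface targeted here can be made concrete with a toy example: when a release exposes several sensitive attributes per group, an adversary can mine high-confidence ("strong") rules between them and infer one sensitive value from another. The records, attribute names and threshold below are invented for illustration and are not taken from the paper.

    # Illustrative only: measuring the strength of an association rule between two
    # sensitive attributes in a toy published table.
    records = [
        {"disease": "hepatitis", "medication": "entecavir"},
        {"disease": "hepatitis", "medication": "entecavir"},
        {"disease": "hepatitis", "medication": "entecavir"},
        {"disease": "flu",       "medication": "oseltamivir"},
        {"disease": "hepatitis", "medication": "tenofovir"},
    ]

    def rule_strength(rows, antecedent, consequent):
        """Support and confidence of the rule antecedent -> consequent."""
        n = len(rows)
        has_a  = [r for r in rows if all(r[k] == v for k, v in antecedent.items())]
        has_ab = [r for r in has_a if all(r[k] == v for k, v in consequent.items())]
        support = len(has_ab) / n
        confidence = len(has_ab) / len(has_a) if has_a else 0.0
        return support, confidence

    support, confidence = rule_strength(
        records, {"disease": "hepatitis"}, {"medication": "entecavir"})
    print(support, confidence)  # 0.6 0.75

    # A strong rule lets an adversary who learns one sensitive value infer the other;
    # the publication method in the paper aims to keep such rules below a threshold.
    if confidence >= 0.7:       # illustrative "strong rule" threshold
        print("disease=hepatitis strongly implies medication=entecavir")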

  4. PRIVACY IN CLOUD COMPUTING: A SURVEY

    OpenAIRE

    Arockiam L; Parthasarathy G; Monikandan S

    2012-01-01

    Various cloud computing models are used to increase an organization's profit. The cloud provides a convenient environment and additional advantages for business organizations to run their business, but it raises issues related to the privacy of data: users' data are stored and maintained outside the users' premises, and a failure of data protection can cause problems such as data theft, which affects the individual organization. Cloud users may be satisfied if their data are protected p...

  5. Assessing privacy risks in population health publications using a checklist-based approach.

    Science.gov (United States)

    O'Keefe, Christine M; Ickowicz, Adrien; Churches, Tim; Westcott, Mark; O'Sullivan, Maree; Khan, Atikur

    2017-11-10

    Recent growth in the number of population health researchers accessing detailed datasets, either on their own computers or through virtual data centers, has the potential to increase privacy risks. In response, a checklist for identifying and reducing privacy risks in population health analysis outputs has been proposed for use by researchers themselves. In this study we explore the usability and reliability of such an approach by investigating whether different users identify the same privacy risks on applying the checklist to a sample of publications. The checklist was applied to a sample of 100 academic population health publications distributed among 5 readers. Cohen's κ was used to measure interrater agreement. Of the 566 instances of statistical output types found in the 100 publications, the most frequently occurring were counts, summary statistics, plots, and model outputs. Application of the checklist identified 128 outputs (22.6%) with potential privacy concerns. Most of these were associated with the reporting of small counts. Among these identified outputs, the readers found no substantial actual privacy concerns when context was taken into account. Interrater agreement for identifying potential privacy concerns was generally good. This study has demonstrated that a checklist can be a reliable tool to assist researchers with anonymizing analysis outputs in population health research. This further suggests that such an approach may have the potential to be developed into a broadly applicable standard providing consistent confidentiality protection across multiple analyses of the same data.
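
    Interrater agreement of the kind reported here is usually summarised with Cohen's κ, which discounts the agreement expected by chance. The sketch below computes κ for two hypothetical readers flagging the same outputs; the flag vectors are invented and are not the study's data.

    # Cohen's kappa for two readers flagging outputs as a potential privacy concern
    # (1) or not (0). The two vectors below are invented for illustration.
    from collections import Counter

    reader_a = [1, 0, 0, 1, 0, 0, 1, 0, 0, 0]
    reader_b = [1, 0, 0, 0, 0, 0, 1, 0, 1, 0]

    def cohens_kappa(a, b):
        n = len(a)
        observed = sum(x == y for x, y in zip(a, b)) / n           # raw agreement p_o
        count_a, count_b = Counter(a), Counter(b)
        # Chance agreement p_e: probability both readers assign the same label at random.
        expected = sum((count_a[k] / n) * (count_b[k] / n) for k in set(a) | set(b))
        return (observed - expected) / (1 - expected)

    print(round(cohens_kappa(reader_a, reader_b), 3))  # 0.524 for these toy vectors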

  6. Privacy Concerns: The Effects of the Latest FERPA Changes

    Science.gov (United States)

    Cossler, Christine

    2010-01-01

    Privacy, something once taken for granted, has again become top-of-mind for public school districts thanks to technology's increasing reach, as well as new changes to privacy laws governing student information. Recently, educators have had to face important changes to the Family Educational Rights and Privacy Act (FERPA), originally signed into…

  7. A compressive sensing based secure watermark detection and privacy preserving storage framework.

    Science.gov (United States)

    Qia Wang; Wenjun Zeng; Jun Tian

    2014-03-01

    Privacy is a critical issue when the data owners outsource data storage or processing to a third party computing service, such as the cloud. In this paper, we identify a cloud computing application scenario that requires simultaneously performing secure watermark detection and privacy preserving multimedia data storage. We then propose a compressive sensing (CS)-based framework using secure multiparty computation (MPC) protocols to address such a requirement. In our framework, the multimedia data and secret watermark pattern are presented to the cloud for secure watermark detection in a CS domain to protect the privacy. During CS transformation, the privacy of the CS matrix and the watermark pattern is protected by the MPC protocols under the semi-honest security model. We derive the expected watermark detection performance in the CS domain, given the target image, watermark pattern, and the size of the CS matrix (but without the CS matrix itself). The correctness of the derived performance has been validated by our experiments. Our theoretical analysis and experimental results show that secure watermark detection in the CS domain is feasible. Our framework can also be extended to other collaborative secure signal processing and data-mining applications in the cloud.
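
    The measurement-domain detection step can be sketched on its own, leaving out the MPC layer that keeps the CS matrix and watermark secret in the paper's framework: project both the marked image and the watermark with the same random sensing matrix and detect by correlation, which random projections approximately preserve. The sizes, seed, embedding strength and threshold below are illustrative choices, not the paper's parameters.

    # Sketch of correlation-based watermark detection in a compressive-sensing domain.
    # The secure multiparty computation used by the actual framework is omitted.
    import numpy as np

    rng = np.random.default_rng(seed=7)
    n, m = 4096, 1024                         # signal length and number of CS measurements

    watermark = rng.choice([-1.0, 1.0], size=n)    # spread-spectrum watermark pattern
    host = rng.normal(0.0, 10.0, size=n)           # flattened host image (toy data)
    marked = host + 0.5 * watermark                # embed the watermark with strength 0.5

    phi = rng.normal(0.0, 1.0 / np.sqrt(m), size=(m, n))  # random Gaussian sensing matrix

    # The cloud only ever sees CS measurements, never the raw image or the watermark.
    y_img = phi @ marked
    y_wm  = phi @ watermark

    # Random projections roughly preserve inner products, so correlating the two
    # measurement vectors approximates correlating in the original pixel domain.
    detection_stat = (y_img @ y_wm) / n
    threshold = 0.25                               # illustrative decision threshold
    print(detection_stat, "watermark detected" if detection_stat > threshold else "not detected")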

  8. 32 CFR 505.3 - Privacy Act systems of records.

    Science.gov (United States)

    2010-07-01

    ... anticipated threats or hazards to the security or integrity of data, which could result in substantial harm... 32 National Defense 3 2010-07-01 2010-07-01 true Privacy Act systems of records. 505.3 Section 505... AND PUBLIC RELATIONS ARMY PRIVACY ACT PROGRAM § 505.3 Privacy Act systems of records. (a) Systems of...

  9. Exploring the Perceived Measures of Privacy: RFID in Public Applications

    Directory of Open Access Journals (Sweden)

    Mohammad Alamgir Hossain

    2014-06-01

    Full Text Available The purpose of this study is to explore the measures that may protect users' privacy in the context of RFID use in public applications. More specifically, this study investigates what users perceive as securing their privacy, particularly for RFID applications in public use. A qualitative research approach was utilised: the author conducted two focus-group discussion sessions and eight in-depth interviews in two countries, one from the Australasia region (Australia) and the other from Asia (Bangladesh), on the assumption that citizens' status, perceptions and tolerance regarding privacy issues differ between the two regions. The explored factors have been analysed from a privacy perspective. The findings show that, in the developed and the developing country, users' basic perceptions of privacy protection are complementary; however, privacy is a more serious concern in Australia than in Bangladesh. The data analysis proposed some attributes that may improve users' privacy perceptions when RFID is used in public applications. This study is the only initiative that focuses on the privacy of RFID users in a national-use context. As a practical implication, the proposed attributes can be exercised by the agencies that deploy RFID technology for citizens' use.

  10. Exercising privacy rights in medical science.

    Science.gov (United States)

    Hillmer, Michael; Redelmeier, Donald A

    2007-12-04

    Privacy laws are intended to preserve human well-being and improve medical outcomes. We used the Sportstats website, a repository of competitive athletic data, to test how easily these laws can be circumvented. We designed a haphazard, unrepresentative case-series analysis and applied unscientific methods based on an Internet connection and idle time. We found it both feasible and titillating to breach anonymity, stockpile personal information and generate misquotations. We extended our methods to snoop on celebrities, link to outside databases and uncover refusal to participate. Throughout our study, we evaded capture and public humiliation despite violating these 6 privacy fundamentals. We suggest that the legitimate principle of safeguarding personal privacy is undermined by the natural human tendency toward showing off.

  11. Problematic use of social network sites: the interactive relationship between gratifications sought and privacy concerns.

    Science.gov (United States)

    Chen, Hsuan-Ting; Kim, Yonghwan

    2013-11-01

    Problematic Internet use has long been a matter of concern; however, few studies extend this line of research from general Internet use to the use of social network sites (SNSs), or explicate the problematic use of SNSs by understanding what factors may enhance or reduce users' compulsive behaviors and excessive forms of use of SNSs. Building on literature that found a positive relationship between gratifications sought from the Internet and problematic Internet use, this study first explores the types of gratifications sought from SNSs and examines their relationship with problematic SNS use. It found that three types of gratifications (diversion, self-presentation, and relationship building) were positively related to problematic SNS use. In addition, with a growing body of research on SNS privacy, a moderating role of privacy concerns on SNSs has been proposed to understand how it can influence the relationship between gratifications sought from SNSs and problematic SNS use. The findings suggest that different subdimensions of privacy concerns interact with gratifications sought in different ways. In other words, privacy concerns, including unauthorized secondary use and improper access, play a more influential role in constraining the positive relationship between gratifications sought and problematic SNS use when individuals seek to build relationships on SNSs. However, if individuals seek diversion on SNSs, their privacy concerns will be overridden by the gratifications sought, which in turn leads to problematic SNS use. Implications of these findings for future research are discussed.
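
    Moderation effects of the kind described here are typically estimated with an interaction term in a regression model. The sketch below shows the general form using statsmodels on an invented data frame; the variable names, simulated data and coefficients are illustrative assumptions, not the study's measures or results.

    # Illustrative moderation (interaction) model, not the study's actual analysis.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(seed=3)
    n = 300
    df = pd.DataFrame({
        "relationship_building": rng.normal(0, 1, n),   # gratification sought (toy)
        "privacy_concern":       rng.normal(0, 1, n),   # e.g. improper-access concern (toy)
    })
    # Simulate an outcome in which privacy concern dampens the gratification effect.
    df["problematic_use"] = (0.5 * df["relationship_building"]
                             - 0.3 * df["relationship_building"] * df["privacy_concern"]
                             + rng.normal(0, 1, n))

    # The "*" in the formula adds both main effects and their interaction term.
    model = smf.ols("problematic_use ~ relationship_building * privacy_concern", data=df).fit()
    print(model.params)  # a negative interaction coefficient indicates the dampening effect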

  12. Uniting Legislation with RFID Privacy-Enhancing Technologies

    NARCIS (Netherlands)

    Rieback, M.R.; Crispo, B.; Tanenbaum, A.S.

    2005-01-01

    RFID is a popular identification and automation technology with serious security and privacy threats. Legislation expounds upon the actual security and privacy needs of people in RFID-enabled environments, while technology helps to ensure legal compliance. This paper examines the main aims of RFID

  13. Using genetic information while protecting the privacy of the soul.

    Science.gov (United States)

    Moor, J H

    1999-01-01

    Computing plays an important role in genetics (and vice versa). Theoretically, computing provides a conceptual model for the function and malfunction of our genetic machinery. Practically, contemporary computers and robots equipped with advanced algorithms make the revelation of the complete human genome imminent--computers are about to reveal our genetic souls for the first time. Ethically, computers help protect privacy by restricting access in sophisticated ways to genetic information. But the inexorable fact that computers will increasingly collect, analyze, and disseminate abundant amounts of genetic information made available through the genetic revolution, not to mention that inexpensive computing devices will make genetic information gathering easier, underscores the need for strong and immediate privacy legislation.

  14. Unveiling consumer’s privacy paradox behaviour in an economic exchange

    Science.gov (United States)

    Li, Xiao-Bai

    2015-01-01

    The privacy paradox is of great interest to IS researchers and to firms gathering personal information. It has been studied independently from social, behavioural, and economic perspectives; however, prior research has not examined how much each of these perspectives contributes to the privacy paradox problem. We combine economic and behavioural perspectives by pairing a price valuation of personal information, obtained through an economic experiment, with a behavioural study of the privacy paradox. Our goal is to reveal more insight into the privacy paradox through an economic valuation of personal information. Results indicate that neither general privacy concerns nor individual disclosure concerns have a significant influence on the price valuation of personal information. Instead, prior disclosure behaviour in specific scenarios, such as with healthcare providers or social networks, is a better indicator of consumers' price valuations. PMID:27708687

  15. Enhancing Privacy Education with a Technical Emphasis in IT Curriculum

    Directory of Open Access Journals (Sweden)

    Svetlana Peltsverger

    2015-12-01

    Full Text Available The paper describes the development of four learning modules that focus on technical details of how a person’s privacy might be compromised in real-world scenarios. The paper shows how students benefited from the addition of hands-on learning experiences of privacy and data protection to the existing information technology courses. These learning modules raised students’ awareness of potential breaches of privacy as a user as well as a developer. The demonstration of a privacy breach in action helped students to design, configure, and implement technical solutions to prevent privacy violations. The assessment results demonstrate the strength of the technical approach.

  16. Privacy After Snowden: Theoretical Developments and Public Opinion Perceptions of Privacy in Slovenia (Zasebnost po Snowdnu: novejša pojmovanja zasebnosti in odnos javnosti do le-te v Sloveniji

    Directory of Open Access Journals (Sweden)

    Aleš Završnik

    2014-10-01

    Full Text Available The article analyses recent theorizing of privacy arising from new technologies that allow constant and ubiquitous monitoring of our communication and movement. The theoretical part analyses Helen Nissenbaum’s theory of contextual integrity of privacy and pluralistic understanding of privacy by Daniel Solove. The empirical part presents the results of an online survey on the Slovenian public perceptions of privacy that includes questions on types and frequency of victimizations relating to the right to privacy; self-reported privacy violations; concern for the protection of one’s own privacy; perception of primary privacy offenders; the value of privacy; attitude towards data retention in public telecommunication networks; and acquaintance with the Information Commissioner of RS. Despite growing distrust of large internet corporations and – after Edward Snowden’s revelations – Intelligence agencies, the findings indicate a low degree of awareness and care for the protection of personal data.

  17. An Examination of Individual’s Perceived Security and Privacy of the Internet in Malaysia and the Influence of This on Their Intention to Use E-Commerce: Using An Extension of the Technology Acceptance Model

    OpenAIRE

    Muniruddeen Lallmahamood

    2007-01-01

    This study explores the impact of perceived security and privacy on the intention to use Internet banking. An extended version of the technology acceptance model (TAM) is used to examine the above perception. A survey was distributed; the 187 responses, mainly from urban cities in Malaysia, generally agreed that security and privacy are still the main concerns while using Internet banking. The research model explains over half of the variance of the intenti...

  18. Security and privacy in biometrics

    CERN Document Server

    Campisi, Patrizio

    2013-01-01

    This important text/reference presents the latest secure and privacy-compliant techniques in automatic human recognition. Featuring viewpoints from an international selection of experts in the field, the comprehensive coverage spans both theory and practical implementations, taking into consideration all ethical and legal issues. Topics and features: presents a unique focus on novel approaches and new architectures for unimodal and multimodal template protection; examines signal processing techniques in the encrypted domain, security and privacy leakage assessment, and aspects of standardizati

  19. Privacy for location-based services

    CERN Document Server

    Ghinita, Gabriel

    2013-01-01

    Sharing of location data enables numerous exciting applications, such as location-based queries, location-based social recommendations, monitoring of traffic and air pollution levels, etc. Disclosing exact user locations raises serious privacy concerns, as locations may give away sensitive information about individuals' health status, alternative lifestyles, political and religious affiliations, etc. Preserving location privacy is an essential requirement towards the successful deployment of location-based applications. These lecture notes provide an overview of the state-of-the-art in locatio

  20. Privacy Management Contracts And Economics, Using Service Level Agreements (Sla)

    NARCIS (Netherlands)

    L-F. Pau (Louis-François)

    2005-01-01

    Recognizing the importance of privacy management as a business process and a business support process, this paper proposes the use of service level agreements around privacy features, including qualitative and quantitative ones. It also casts privacy management into a business

  1. Because we care: Privacy Dashboard on Firefox OS

    OpenAIRE

    Piekarska, Marta; Zhou, Yun; Strohmeier, Dominik; Raake, Alexander

    2015-01-01

    In this paper we present the Privacy Dashboard -- a tool designed to inform and empower the people using mobile devices, by introducing features such as Remote Privacy Protection, Backup, Adjustable Location Accuracy, Permission Control and Secondary-User Mode. We have implemented our solution on FirefoxOS and conducted user studies to verify the usefulness and usability of our tool. The paper starts with a discussion of different aspects of mobile privacy, how users perceive it and how much ...

  2. Security, privacy, and confidentiality issues on the Internet

    OpenAIRE

    Kelly, Grant; McKenzie, Bruce

    2002-01-01

    We introduce the issues around protecting information about patients and related data sent via the Internet. We begin by reviewing three concepts necessary to any discussion about data security in a healthcare environment: privacy, confidentiality, and consent. We then give some advice on how to protect local data. Authentication and privacy of e-mail via encryption are offered by Pretty Good Privacy (PGP) and Secure Multipurpose Internet Mail Extensions (S/MIME). The de facto Internet standa...

  3. Autonomous Vehicles for Smart and Sustainable Cities: An In-Depth Exploration of Privacy and Cybersecurity Implications

    Directory of Open Access Journals (Sweden)

    Hazel Si Min Lim

    2018-04-01

    Full Text Available Amidst rapid urban development, sustainable transportation solutions are required to meet the increasing demands for mobility whilst mitigating the potentially negative social, economic, and environmental impacts. This study analyses autonomous vehicles (AVs) as a potential transportation solution for smart and sustainable development. We identified privacy and cybersecurity risks of AVs as crucial to the development of smart and sustainable cities and examined the steps taken by governments around the world to address these risks. We highlight the literature that supports why AVs are essential for smart and sustainable development. We then identify the aspects of privacy and cybersecurity in AVs that are important for smart and sustainable development. Lastly, we review the efforts taken by federal governments in the US, the UK, China, Australia, Japan, Singapore, South Korea, Germany, France, and the EU, and by US state governments to address AV-related privacy and cybersecurity risks in depth. Overall, the actions taken by governments to address privacy risks are mainly in the form of regulations or voluntary guidelines. To address cybersecurity risks, governments have mostly resorted to regulations that are not specific to AVs and are conducting research and fostering research collaborations with the private sector.

  4. Contemporary Privacy Theory Contributions to Learning Analytics

    Science.gov (United States)

    Heath, Jennifer

    2014-01-01

    With the continued adoption of learning analytics in higher education institutions, vast volumes of data are generated and "big data" related issues, including privacy, emerge. Privacy is an ill-defined concept and subject to various interpretations and perspectives, including those of philosophers, lawyers, and information systems…

  5. Not All Adware Is Badware: Towards Privacy-Aware Advertising

    Science.gov (United States)

    Haddadi, Hamed; Guha, Saikat; Francis, Paul

    Online advertising is a major economic force in the Internet today. A basic goal of any advertising system is to accurately target the ad to the recipient audience. While Internet technology brings the promise of extremely well-targeted ad placement, there have always been serious privacy concerns surrounding personalization. Today there is a constant battle between privacy advocates and advertisers, where advertisers try to push new personalization technologies, and privacy advocates try to stop them. As long as privacy advocates, however, are unable to propose an alternative personalization system that is private, this is a battle they are destined to lose. This paper presents the framework for such an alternative system, the Private Verifiable Advertising (Privad). We describe the privacy issues associated with today’s advertising systems, describe Privad, and discuss its pros and cons and the challenges that remain.

  6. "Everybody Knows Everybody Else's Business"-Privacy in Rural Communities.

    Science.gov (United States)

    Leung, Janni; Smith, Annetta; Atherton, Iain; McLaughlin, Deirdre

    2016-12-01

    Patients have a right to privacy in a health care setting. This involves conversational discretion, security of medical records, and the physical privacy of remaining unnoticed or unidentified when using health care services, except by those who need to know or whom the patient wishes to know. However, the privacy of cancer patients who live in rural areas is more difficult to protect due to the characteristics of rural communities. The purpose of this article is to reflect on concerns relating to the lack of privacy experienced by cancer patients and health care professionals in the rural health care setting. In addition, this article suggests future research directions to provide much-needed evidence for educating health care providers and guiding health care policies that can lead to better protection of privacy among cancer patients living in rural communities.

  7. Privacy and equality in diagnostic genetic testing.

    Science.gov (United States)

    Nyrhinen, Tarja; Hietala, Marja; Puukka, Pauli; Leino-Kilpi, Helena

    2007-05-01

    This study aimed to determine the extent to which the principles of privacy and equality were observed during diagnostic genetic testing according to views held by patients or child patients' parents (n = 106) and by staff (n = 162) from three Finnish university hospitals. The data were collected through a structured questionnaire and analysed using the SAS 8.1 statistical software. In general, the two principles were observed relatively satisfactorily in clinical practice. According to patients/parents, equality in the post-analytic phase and, according to staff, privacy in the pre-analytic phase, involved the greatest ethical problems. The two groups differed in their views concerning pre-analytic privacy. Although there were no major problems regarding the two principles, the differences between the testing phases require further clarification. To enhance privacy protection and equality, professionals need to be given more genetics/ethics training, and patients need individual counselling by genetics unit staff, with more consideration given to patients' world-view, the purpose of the test and the test result.

  8. Report from Dagstuhl Seminar 12331 Mobility Data Mining and Privacy

    OpenAIRE

    Clifton, Christopher W.; Kuijpers, Bart; Morik, Katharina; Saygin, Yucel

    2012-01-01

    This report documents the program and the outcomes of Dagstuhl Seminar 12331 “Mobility Data Mining and Privacy”. Mobility data mining aims to extract knowledge from the movement behaviour of people, but this data also poses novel privacy risks. This seminar gathered a multidisciplinary team for a conversation on how to balance the value in mining mobility data with privacy issues. The seminar focused on four key issues: privacy in vehicular data, privacy in cellular data, context-dependent privacy, and ...

  9. Privacy & Social Media in the Context of the Arab Gulf

    OpenAIRE

    Abokhodair, Norah; Vieweg, Sarah

    2016-01-01

    Theories of privacy and how it relates to the use of Information Communication Technology (ICT) have been a topic of research for decades. However, little attention has been paid to the perception of privacy from the perspective of technology users in the Middle East. In this paper, we delve into interpretations of privacy from the standpoint of Arab Gulf citizens. We consider how privacy is practiced and understood in technology-mediated environments among this population, paying particular at...

  10. Acoustic assessment of speech privacy curtains in two nursing units

    Science.gov (United States)

    Pope, Diana S.; Miller-Klein, Erik T.

    2016-01-01

    Hospitals have complex soundscapes that create challenges to patient care. Extraneous noise and high reverberation rates impair speech intelligibility, which leads to raised voices. In an unintended spiral, the increasing noise may result in diminished speech privacy, as people speak loudly to be heard over the din. The products available to improve hospital soundscapes include construction materials that absorb sound (acoustic ceiling tiles, carpet, wall insulation) and reduce reverberation rates. Enhanced privacy curtains are now available and offer potential for a relatively simple way to improve speech privacy and speech intelligibility by absorbing sound at the hospital patient's bedside. Acoustic assessments were performed over 2 days on two nursing units with a similar design in the same hospital. One unit was built with the 1970s' standard hospital construction and the other was newly refurbished (2013) with sound-absorbing features. In addition, we determined the effect of an enhanced privacy curtain versus standard privacy curtains using acoustic measures of speech privacy and speech intelligibility indexes. Privacy curtains provided auditory protection for the patients. In general, that protection was increased by the use of enhanced privacy curtains. On average, the enhanced curtain improved sound absorption from 20% to 30%; however, there was considerable variability, depending on the configuration of the rooms tested. Enhanced privacy curtains provide measurable improvement to the acoustics of patient rooms but cannot overcome larger acoustic design issues. To shorten reverberation time, additional absorption, and compact and more fragmented nursing unit floor plate shapes should be considered. PMID:26780959

  11. Acoustic assessment of speech privacy curtains in two nursing units.

    Science.gov (United States)

    Pope, Diana S; Miller-Klein, Erik T

    2016-01-01

    Hospitals have complex soundscapes that create challenges to patient care. Extraneous noise and high reverberation rates impair speech intelligibility, which leads to raised voices. In an unintended spiral, the increasing noise may result in diminished speech privacy, as people speak loudly to be heard over the din. The products available to improve hospital soundscapes include construction materials that absorb sound (acoustic ceiling tiles, carpet, wall insulation) and reduce reverberation rates. Enhanced privacy curtains are now available and offer potential for a relatively simple way to improve speech privacy and speech intelligibility by absorbing sound at the hospital patient's bedside. Acoustic assessments were performed over 2 days on two nursing units with a similar design in the same hospital. One unit was built with the 1970s' standard hospital construction and the other was newly refurbished (2013) with sound-absorbing features. In addition, we determined the effect of an enhanced privacy curtain versus standard privacy curtains using acoustic measures of speech privacy and speech intelligibility indexes. Privacy curtains provided auditory protection for the patients. In general, that protection was increased by the use of enhanced privacy curtains. On average, the enhanced curtain improved sound absorption from 20% to 30%; however, there was considerable variability, depending on the configuration of the rooms tested. Enhanced privacy curtains provide measurable improvement to the acoustics of patient rooms but cannot overcome larger acoustic design issues. To shorten reverberation time, additional absorption, and compact and more fragmented nursing unit floor plate shapes should be considered.

  12. Acoustic assessment of speech privacy curtains in two nursing units

    Directory of Open Access Journals (Sweden)

    Diana S Pope

    2016-01-01

    Full Text Available Hospitals have complex soundscapes that create challenges to patient care. Extraneous noise and high reverberation rates impair speech intelligibility, which leads to raised voices. In an unintended spiral, the increasing noise may result in diminished speech privacy, as people speak loudly to be heard over the din. The products available to improve hospital soundscapes include construction materials that absorb sound (acoustic ceiling tiles, carpet, wall insulation) and reduce reverberation rates. Enhanced privacy curtains are now available and offer potential for a relatively simple way to improve speech privacy and speech intelligibility by absorbing sound at the hospital patient's bedside. Acoustic assessments were performed over 2 days on two nursing units with a similar design in the same hospital. One unit was built with the 1970s' standard hospital construction and the other was newly refurbished (2013) with sound-absorbing features. In addition, we determined the effect of an enhanced privacy curtain versus standard privacy curtains using acoustic measures of speech privacy and speech intelligibility indexes. Privacy curtains provided auditory protection for the patients. In general, that protection was increased by the use of enhanced privacy curtains. On average, the enhanced curtain improved sound absorption from 20% to 30%; however, there was considerable variability, depending on the configuration of the rooms tested. Enhanced privacy curtains provide measurable improvement to the acoustics of patient rooms but cannot overcome larger acoustic design issues. To shorten reverberation time, additional absorption, and compact and more fragmented nursing unit floor plate shapes should be considered.

  13. Privacy as human flourishing: could a shift towards virtue ethics strengthen privacy protection in the age of Big Data?

    NARCIS (Netherlands)

    van der Sloot, B.

    2014-01-01

    Privacy is commonly seen as an instrumental value in relation to negative freedom, human dignity and personal autonomy. Article 8 ECHR, protecting the right to privacy, was originally coined as a doctrine protecting the negative freedom of citizens in vertical relations, that is between citizen and

  14. Privacy and Psychosomatic Stress: An Empirical Analysis.

    Science.gov (United States)

    Webb, Stephen D.

    1978-01-01

    Examines the supposition that insufficient privacy is stressful to the individual. Data were obtained from urban centers in New Zealand. Findings support the hypothesis that a perceived lack of privacy is associated with psychosomatic stress. The relationship is specified by measures of stress and sex of respondents. (Author)

  15. 46 CFR 14.105 - Disclosure and privacy.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 1 2010-10-01 2010-10-01 false Disclosure and privacy. 14.105 Section 14.105 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY MERCHANT MARINE OFFICERS AND SEAMEN SHIPMENT AND DISCHARGE OF MERCHANT MARINERS General § 14.105 Disclosure and privacy. The Coast Guard makes information...

  16. Location Privacy with Randomness Consistency

    Directory of Open Access Journals (Sweden)

    Wu Hao

    2016-10-01

    Full Text Available Location-Based Social Network (LBSN) applications that support geo-location-based posting and queries to provide location-relevant information to mobile users are increasingly popular, but pose a location-privacy risk to posts. We investigated existing LBSNs and location privacy mechanisms, and found a powerful potential attack that can accurately locate users with relatively few queries, even when location data is well secured and location noise is applied. Our technique defeats previously proposed solutions including fake-location detection and query rate limits.
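
    A simple way to see why per-query noise alone is fragile is that independent noise averages out over repeated queries. The sketch below is a generic noise-averaging illustration, not the paper's specific attack or its randomness-consistency defence; the coordinates, noise scale and query count are invented.

    # Generic illustration: averaging many independently noised location responses
    # converges on the true position (this is not the paper's specific technique).
    import numpy as np

    rng = np.random.default_rng(seed=1)
    true_location = np.array([51.5074, -0.1278])   # invented true position (lat, lon)
    noise_scale = 0.01                             # roughly 1 km of Gaussian noise per reply

    # Each of 200 queries returns the true position plus fresh, independent noise.
    responses = true_location + rng.normal(0.0, noise_scale, size=(200, 2))

    estimate = responses.mean(axis=0)
    print(estimate, np.linalg.norm(estimate - true_location))  # far tighter than any single reply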

  17. Anonymous communication networks protecting privacy on the web

    CERN Document Server

    Peng, Kun

    2014-01-01

    In today's interactive network environment, where various types of organizations are eager to monitor and track Internet use, anonymity is one of the most powerful resources available to counterbalance the threat of unknown spectators and to ensure Internet privacy. Addressing the demand for authoritative information on anonymous Internet usage, Anonymous Communication Networks: Protecting Privacy on the Web examines anonymous communication networks as a solution to Internet privacy concerns. It explains how anonymous communication networks make it possible for participants to communicate with

  18. Can the obstacles to privacy self-management be overcome? Exploring the consent intermediary approach

    Directory of Open Access Journals (Sweden)

    Tuukka Lehtiniemi

    2017-07-01

    Full Text Available In privacy self-management, people are expected to perform a cost–benefit analysis on the use of their personal data, and only consent when their subjective benefits outweigh the costs. However, the ubiquitous collection of personal data and Big Data analytics present increasing challenges to successful privacy management. A number of services and research initiatives have proposed similar solutions to provide people with more control over their data by consolidating consent decisions under a single interface. We have named this the ‘consent intermediary’ approach. In this paper, we first identify the eight obstacles to privacy self-management which make cost–benefit analysis conceptually and practically challenging. We then analyse to what extent consent intermediaries can help overcome the obstacles. We argue that simply bringing consent decisions under one interface offers limited help, but that the potential of this approach lies in leveraging the intermediary position to provide aids for privacy management. We find that, with suitable tools, some of the more practical obstacles can indeed become solvable, while others remain fundamentally insuperable within the individuated privacy self-management model. Attention should also be paid to how the consent intermediaries may take advantage of the power vested in the intermediary positions between users and other services.

  19. 77 FR 74851 - Privacy Act of 1974; System of Records

    Science.gov (United States)

    2012-12-18

    ... FEDERAL DEPOSIT INSURANCE CORPORATION Privacy Act of 1974; System of Records AGENCY: Federal Deposit Insurance Corporation. ACTION: Notice to Delete a System of Records. SUMMARY: In accordance with the requirements of the Privacy Act of 1974, as amended (Privacy Act), the Federal Deposit Insurance...

  20. How socially aware are social media privacy controls?

    OpenAIRE

    Misra, Gaurav; Such Aparicio, Jose Miguel

    2016-01-01

    Social media sites are key mediators of online communication. Yet the privacy controls for these sites are not fully socially aware, even when privacy management is known to be fundamental to successful social relationships.