WorldWideScience

Sample records for providing location privacy

  1. CARAVAN: Providing Location Privacy for VANET

    National Research Council Canada - National Science Library

    Sampigethaya, Krishna; Huang, Leping; Li, Mingyan; Poovendran, Radha; Matsuura, Kanta; Sezaki, Kaoru

    2005-01-01

    .... This type of tracking leads to threats on the location privacy of the vehicle's user. In this paper, we study the problem of providing location privacy in VANET by allowing vehicles to prevent tracking of their broadcast communications...

  2. From Data Privacy to Location Privacy

    Science.gov (United States)

    Wang, Ting; Liu, Ling

    Over the past decade, the research on data privacy has achieved considerable advancement in the following two aspects: First, a variety of privacy threat models and privacy principles have been proposed, aiming at providing sufficient protection against different types of inference attacks; Second, a plethora of algorithms and methods have been developed to implement the proposed privacy principles, while attempting to optimize the utility of the resulting data. The first part of the chapter presents an overview of data privacy research by closely examining the achievements in the above two aspects, with the objective of pinpointing individual research efforts on the grand map of data privacy protection. As a special form of data privacy, location privacy possesses its unique characteristics. In the second part of the chapter, we examine the research challenges and opportunities of location privacy protection, in a perspective analogous to data privacy. Our discussion attempts to answer the following three questions: (1) Is it sufficient to apply the data privacy models and algorithms developed to date for protecting location privacy? (2) What is the current state of the research on location privacy? (3) What are the open issues and technical challenges that demand further investigation? Through answering these questions, we intend to provide a comprehensive review of the state of the art in location privacy research.

  3. Privacy-Preserving Location-Based Services

    Science.gov (United States)

    Chow, Chi Yin

    2010-01-01

    Location-based services (LBS for short) providers require users' current locations to answer their location-based queries, e.g., range and nearest-neighbor queries. Revealing personal location information to potentially untrusted service providers could create privacy risks for users. To this end, our objective is to design a privacy-preserving…

  4. Location Privacy with Randomness Consistency

    Directory of Open Access Journals (Sweden)

    Wu Hao

    2016-10-01

    Full Text Available Location-Based Social Network (LBSN) applications that support geo-location-based posting and queries to provide location-relevant information to mobile users are increasingly popular, but pose a location-privacy risk to posts. We investigated existing LBSNs and location privacy mechanisms, and found a powerful potential attack that can accurately locate users with relatively few queries, even when location data is well secured and location noise is applied. Our technique defeats previously proposed solutions including fake-location detection and query rate limits.
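
    The record above does not reproduce the attack itself. As a rough, hypothetical illustration of why simple location noise alone fails against an adversary who can query repeatedly, the Python sketch below averages many independently perturbed reports of a fixed position; the estimate converges toward the true location as the number of queries grows. The noise model and parameters are assumptions for the example, not details taken from the paper.

        import random

        def noisy_report(true_pos, sigma=100.0):
            # Defender side: perturb the true position (metres) with Gaussian noise.
            x, y = true_pos
            return (x + random.gauss(0, sigma), y + random.gauss(0, sigma))

        def averaging_attack(true_pos, n_queries=200):
            # Attacker side: issue repeated queries and average the noisy answers.
            xs, ys = zip(*(noisy_report(true_pos) for _ in range(n_queries)))
            return (sum(xs) / n_queries, sum(ys) / n_queries)

        print(averaging_attack((5000.0, 3000.0)))   # close to (5000, 3000) despite per-query noise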

  5. Privacy for location-based services

    CERN Document Server

    Ghinita, Gabriel

    2013-01-01

    Sharing of location data enables numerous exciting applications, such as location-based queries, location-based social recommendations, monitoring of traffic and air pollution levels, etc. Disclosing exact user locations raises serious privacy concerns, as locations may give away sensitive information about individuals' health status, alternative lifestyles, political and religious affiliations, etc. Preserving location privacy is an essential requirement towards the successful deployment of location-based applications. These lecture notes provide an overview of the state-of-the-art in locatio

  6. Privacy vs. Reward in Indoor Location-Based Services

    Directory of Open Access Journals (Sweden)

    Fawaz Kassem

    2016-10-01

    Full Text Available With the advance of indoor localization technology, indoor location-based services (ILBS) are gaining popularity. They, however, come with privacy concerns. ILBS providers track the users’ mobility to learn more about their behavior, and then provide them with improved and personalized services. Our survey of 200 individuals highlighted their concerns about this tracking for potential leakage of their personal/private traits, but also showed their willingness to accept reduced tracking for improved service. In this paper, we propose PR-LBS (Privacy vs. Reward for Location-Based Service), a system that addresses these seemingly conflicting requirements by balancing the users’ privacy concerns and the benefits of sharing location information in indoor location tracking environments. PR-LBS relies on a novel location-privacy criterion to quantify the privacy risks pertaining to sharing indoor location information. It also employs a repeated play model to ensure that the received service is proportionate to the privacy risk. We implement and evaluate PR-LBS extensively with various real-world user mobility traces. Results show that PR-LBS has low overhead, protects the users’ privacy, and makes a good tradeoff between the quality of service for the users and the utility of shared location data for service providers.

  7. Location-Related Privacy in Geo-Social Networks

    DEFF Research Database (Denmark)

    Ruiz Vicente, Carmen; Freni, Dario; Bettini, Claudio

    2011-01-01

    ..."check-ins." However, this ability to reveal users' locations causes new privacy threats, which in turn call for new privacy-protection methods. The authors study four privacy aspects central to these social networks - location, absence, co-location, and identity privacy - and describe possible means of protecting privacy in these circumstances.

  8. A Hybrid Location Privacy Solution for Mobile LBS

    Directory of Open Access Journals (Sweden)

    Ruchika Gupta

    2017-01-01

    Full Text Available The prevalent usage of location based services, where getting any service is solely based on the user’s current location, has raised an extreme concern over location privacy of the user. Generalized approaches dealing with location privacy, referred to as cloaking and obfuscation, are mainly based on a trusted third party, in which all the data remain available at a central server and thus complete knowledge of the query exists at the central node. This is the major limitation of such approaches; on the other hand, in a trusted third-party-free framework clients collaborate with each other and freely communicate with the service provider without any third-party involvement. Measuring and evaluating trust among peers is a crucial aspect in the trusted third-party-free framework. This paper exploits the merits and mitigates the shortcomings of both of these approaches. We propose a hybrid solution, HYB, to achieve location privacy for the mobile users who use location services frequently. The proposed HYB scheme is based on the collaborative preprocessing of location data and utilizes the benefits of the homomorphic encryption technique. Location privacy is achieved at two levels, namely, at the proximity level and at the distant level. The proposed HYB solution preserves the user’s location privacy effectively under a specific, pull-based, sporadic query scenario.

  9. Location Privacy in RFID Applications

    Science.gov (United States)

    Sadeghi, Ahmad-Reza; Visconti, Ivan; Wachsmann, Christian

    RFID-enabled systems allow fully automatic wireless identification of objects and are rapidly becoming a pervasive technology with various applications. However, despite their benefits, RFID-based systems also pose challenging risks, in particular concerning user privacy. Indeed, improvident use of RFID can disclose sensitive information about users and their locations, allowing detailed user profiles. Hence, it is crucial to identify and to enforce appropriate security and privacy requirements of RFID applications (that are also compliant to legislation). This chapter first discusses security and privacy requirements for RFID-enabled systems, focusing in particular on location privacy issues. Then it explores the advances in RFID applications, stressing the security and privacy shortcomings of existing proposals. Finally, it presents new promising directions for privacy-preserving RFID systems, where as a case study we focus on electronic tickets (e-tickets) for public transportation.

  10. Location Privacy Techniques in Client-Server Architectures

    DEFF Research Database (Denmark)

    Jensen, Christian Søndergaard; Lu, Hua; Yiu, Man Lung

    2009-01-01

    A typical location-based service returns nearby points of interest in response to a user location. As such services are becoming increasingly available and popular, location privacy emerges as an important issue. In a system that does not offer location privacy, users must disclose their exact locations in order to receive the desired services. We view location privacy as an enabling technology that may lead to increased use of location-based services. In this chapter, we consider location privacy techniques that work in traditional client-server architectures without any trusted components other... Third, their effectiveness is independent of the distribution of other users, unlike the k-anonymity approach. The chapter characterizes the privacy models assumed by existing techniques and categorizes these according to their approach. The techniques are then covered in turn according...
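
    The chapter's own techniques are not reproduced in this record. As a minimal sketch of the client-side obfuscation family it surveys, the hypothetical Python snippet below coarsens a position before it leaves the device by snapping it to the centre of a grid cell; the cell size is an assumed parameter.

        import math

        def snap_to_grid(lat, lon, cell_deg=0.01):
            # Report only the centre of the grid cell containing the true position (roughly 1 km cells).
            cell_lat = math.floor(lat / cell_deg) * cell_deg + cell_deg / 2
            cell_lon = math.floor(lon / cell_deg) * cell_deg + cell_deg / 2
            return round(cell_lat, 6), round(cell_lon, 6)

        print(snap_to_grid(55.676098, 12.568337))   # coarse location sent to the LBS server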

  11. Privacy Protection in Participatory Sensing Applications Requiring Fine-Grained Locations

    DEFF Research Database (Denmark)

    Dong, Kai; Gu, Tao; Tao, Xianping

    2010-01-01

    The emerging participatory sensing applications have brought a privacy risk where users expose their location information. Most of the existing solutions preserve location privacy by generalizing a precise user location to a coarse-grained location, and hence they cannot be applied in those appli... provider is a trustworthy entity, making our solution more feasible for practical applications. We present and analyze our security model, and evaluate the performance and scalability of our system.

  12. A Location Privacy Aware Friend Locator

    DEFF Research Database (Denmark)

    Siksnys, Laurynas; Thomsen, Jeppe Rishede; Saltenis, Simonas

    2009-01-01

    to trade their location privacy for quality of service, limiting the attractiveness of the services. The challenge is to develop a communication-efficient solution such that (i) it detects proximity between a user and the user’s friends, (ii) any other party is not allowed to infer the location of the user...

  13. Privacy-Preserving Location-Based Service Scheme for Mobile Sensing Data

    Directory of Open Access Journals (Sweden)

    Qingqing Xie

    2016-11-01

    Full Text Available With the wide use of mobile sensing application, more and more location-embedded data are collected and stored in mobile clouds, such as iCloud, Samsung cloud, etc. Using these data, the cloud service provider (CSP) can provide location-based service (LBS) for users. However, the mobile cloud is untrustworthy. The privacy concerns force the sensitive locations to be stored on the mobile cloud in an encrypted form. However, this brings a great challenge to utilize these data to provide efficient LBS. To solve this problem, we propose a privacy-preserving LBS scheme for mobile sensing data, based on the RSA (for Rivest, Shamir and Adleman) algorithm and ciphertext policy attribute-based encryption (CP-ABE) scheme. The mobile cloud can perform location distance computing and comparison efficiently for authorized users, without location privacy leakage. In the end, theoretical security analysis and experimental evaluation demonstrate that our scheme is secure against the chosen plaintext attack (CPA) and efficient enough for practical applications in terms of user side computation overhead.

  14. Privacy-Preserving Location-Based Service Scheme for Mobile Sensing Data.

    Science.gov (United States)

    Xie, Qingqing; Wang, Liangmin

    2016-11-25

    With the wide use of mobile sensing application, more and more location-embedded data are collected and stored in mobile clouds, such as iCloud, Samsung cloud, etc. Using these data, the cloud service provider (CSP) can provide location-based service (LBS) for users. However, the mobile cloud is untrustworthy. The privacy concerns force the sensitive locations to be stored on the mobile cloud in an encrypted form. However, this brings a great challenge to utilize these data to provide efficient LBS. To solve this problem, we propose a privacy-preserving LBS scheme for mobile sensing data, based on the RSA (for Rivest, Shamir and Adleman) algorithm and ciphertext policy attribute-based encryption (CP-ABE) scheme. The mobile cloud can perform location distance computing and comparison efficiently for authorized users, without location privacy leakage. In the end, theoretical security analysis and experimental evaluation demonstrate that our scheme is secure against the chosen plaintext attack (CPA) and efficient enough for practical applications in terms of user side computation overhead.

  15. The privacy concerns in location based services: protection approaches and remaining challenges

    OpenAIRE

    Basiri, Anahid; Moore, Terry; Hill, Chris

    2016-01-01

    Despite the growth in the development of Location Based Services (LBS) applications, there are still several challenges remaining. One of the most important concerns about LBS, shared by many users and service providers, is privacy. Privacy has been considered a big threat to the adoption of LBS among many users and consequently to the growth of LBS markets. This paper discusses the privacy concerns associated with location data, and the current privacy protection approaches. It re...

  16. Privacy-Preserving Location-Based Service Scheme for Mobile Sensing Data †

    Science.gov (United States)

    Xie, Qingqing; Wang, Liangmin

    2016-01-01

    With the wide use of mobile sensing application, more and more location-embedded data are collected and stored in mobile clouds, such as iCloud, Samsung cloud, etc. Using these data, the cloud service provider (CSP) can provide location-based service (LBS) for users. However, the mobile cloud is untrustworthy. The privacy concerns force the sensitive locations to be stored on the mobile cloud in an encrypted form. However, this brings a great challenge to utilize these data to provide efficient LBS. To solve this problem, we propose a privacy-preserving LBS scheme for mobile sensing data, based on the RSA (for Rivest, Shamir and Adleman) algorithm and ciphertext policy attribute-based encryption (CP-ABE) scheme. The mobile cloud can perform location distance computing and comparison efficiently for authorized users, without location privacy leakage. In the end, theoretical security analysis and experimental evaluation demonstrate that our scheme is secure against the chosen plaintext attack (CPA) and efficient enough for practical applications in terms of user side computation overhead. PMID:27897984

  17. On Limitations of Existing Methods for Location Privacy

    DEFF Research Database (Denmark)

    Andersen, Mads Schaarup

    This paper argues that there are some limitations when applying location privacy methods developed for point-of-interest services to newer classes of location based services. We support the argument by categorizing methods for location privacy and identifying the issues. It is hypothesized...

  18. Location privacy and national security : Contradiction in terminus?

    NARCIS (Netherlands)

    Van Loenen, B.

    2010-01-01

    Location based services (LBS) potentially put the privacy of individuals at risk. The increased possibility to know people’s whereabouts is posing the question of possibility versus desirability with regard to location privacy. The central question that this article aims to answer is how location

  19. A Framework For Enhancing Privacy In Location Based Services Using K-Anonymity Model

    Directory of Open Access Journals (Sweden)

    Jane Mugi

    2015-08-01

    Full Text Available This paper presents a framework for enhancing privacy in Location Based Services using the K-anonymity model. Users of location based services have to reveal their location information in order to use these services; however, this threatens user privacy. The K-anonymity approach has been studied extensively in various forms. However, it is only effective when the user location is fixed. When a user moves and continuously sends their location information, the location service provider can approximate the user's trajectory, which poses a threat to the trajectory privacy of the user. This framework will ensure that user privacy is enhanced for both snapshot and continuous queries. The efficiency and effectiveness of the proposed framework were evaluated; the results indicate that the proposed framework has a high success rate and good run time performance.
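
    The framework itself is only described at a high level in this record. A common way to realize spatial K-anonymity for snapshot queries, sketched below under assumed data structures, is to grow a cloaking rectangle around the querying user until it covers at least k users and to send the rectangle to the provider instead of the exact point.

        def cloaking_region(user_pos, other_positions, k=5, step=0.001):
            # Expand a square around user_pos until it contains at least k users
            # (the querier plus k-1 others), then return the region instead of the point.
            half = step
            ux, uy = user_pos
            while True:
                box = (ux - half, uy - half, ux + half, uy + half)
                inside = sum(1 for (x, y) in other_positions
                             if box[0] <= x <= box[2] and box[1] <= y <= box[3])
                if inside + 1 >= k:
                    return box
                half += step

        others = [(55.671, 12.561), (55.675, 12.569), (55.678, 12.571), (55.680, 12.575)]
        print(cloaking_region((55.676, 12.568), others, k=4))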

  20. Preserving location and absence privacy in geo-social networks

    DEFF Research Database (Denmark)

    Freni, Dario; Vicente, Carmen Ruiz; Mascetti, Sergio

    2010-01-01

    ...The resulting geo-aware social networks (GeoSNs) pose privacy threats beyond those found in location-based services. Content published in a GeoSN is often associated with references to multiple users, without the publisher being aware of the privacy preferences of those users. Moreover, this content is often accessible to multiple users. This renders it difficult for GeoSN users to control which information about them is available and to whom it is available. This paper addresses two privacy threats that occur in GeoSNs: location privacy and absence privacy. The former concerns the availability of information about the presence of users in specific locations at given times, while the latter concerns the availability of information about the absence of an individual from specific locations during given periods of time. The challenge addressed is that of supporting privacy while still enabling useful services.

  1. A Strategy toward Collaborative Filter Recommended Location Service for Privacy Protection

    Science.gov (United States)

    Wang, Peng; Yang, Jing; Zhang, Jianpei

    2018-01-01

    A new collaborative filtered recommendation strategy was proposed for existing privacy and security issues in location services. In this strategy, every user establishes his/her own position profiles according to their daily position data, which is preprocessed using a density clustering method. Then, density prioritization was used to choose similar user groups as service request responders and the neighboring users in the chosen groups recommended appropriate location services using a collaborative filter recommendation algorithm. The two filter algorithms based on position profile similarity and position point similarity measures were designed in the recommendation, respectively. At the same time, the homomorphic encryption method was used to transfer location data for effective protection of privacy and security. A real location dataset was applied to test the proposed strategy and the results showed that the strategy provides better location service and protects users’ privacy. PMID:29751670
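
    The clustering and recommendation pipeline of the paper is not spelled out in the record. The sketch below only illustrates one plausible reading of the position-profile similarity measure: comparing two users' visit-count profiles with cosine similarity. The profile format is an assumption for the example.

        import math

        def cosine_similarity(profile_a, profile_b):
            # Profiles map a place identifier to a visit count, e.g. {"cafe_3": 12, "gym_1": 4}.
            places = set(profile_a) | set(profile_b)
            dot = sum(profile_a.get(p, 0) * profile_b.get(p, 0) for p in places)
            norm_a = math.sqrt(sum(v * v for v in profile_a.values()))
            norm_b = math.sqrt(sum(v * v for v in profile_b.values()))
            return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

        print(cosine_similarity({"cafe_3": 12, "gym_1": 4}, {"cafe_3": 7, "park_2": 9}))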

  2. A Strategy toward Collaborative Filter Recommended Location Service for Privacy Protection.

    Science.gov (United States)

    Wang, Peng; Yang, Jing; Zhang, Jianpei

    2018-05-11

    A new collaborative filtered recommendation strategy was proposed for existing privacy and security issues in location services. In this strategy, every user establishes his/her own position profiles according to their daily position data, which is preprocessed using a density clustering method. Then, density prioritization was used to choose similar user groups as service request responders and the neighboring users in the chosen groups recommended appropriate location services using a collaborative filter recommendation algorithm. The two filter algorithms based on position profile similarity and position point similarity measures were designed in the recommendation, respectively. At the same time, the homomorphic encryption method was used to transfer location data for effective protection of privacy and security. A real location dataset was applied to test the proposed strategy and the results showed that the strategy provides better location service and protects users' privacy.

  3. Location Privacy Protection Based on Improved K-Value Method in Augmented Reality on Mobile Devices

    Directory of Open Access Journals (Sweden)

    Chunyong Yin

    2017-01-01

    Full Text Available With the development of Augmented Reality technology, the application of location based service (LBS) is more and more popular, which provides enormous convenience to people’s life. User location information can be obtained anytime and anywhere, so user location privacy is exposed to huge threats. Therefore, it is crucial to pay attention to location privacy protection in LBS. Based on the architecture of the trusted third party (TTP), we analyzed the advantages and shortcomings of existing location privacy protection methods in LBS on mobile terminals. Then we proposed the improved K-value location privacy protection method according to privacy level, which combines the k-anonymity method with the pseudonym method. Through the simulation experiment, the results show that this improved method can anonymize all service requests effectively. The execution-time experiments further demonstrate that our proposed method can realize location privacy protection more efficiently.

  4. A Strategy toward Collaborative Filter Recommended Location Service for Privacy Protection

    Directory of Open Access Journals (Sweden)

    Peng Wang

    2018-05-01

    Full Text Available A new collaborative filtered recommendation strategy was proposed for existing privacy and security issues in location services. In this strategy, every user establishes his/her own position profiles according to their daily position data, which is preprocessed using a density clustering method. Then, density prioritization was used to choose similar user groups as service request responders and the neighboring users in the chosen groups recommended appropriate location services using a collaborative filter recommendation algorithm. The two filter algorithms based on position profile similarity and position point similarity measures were designed in the recommendation, respectively. At the same time, the homomorphic encryption method was used to transfer location data for effective protection of privacy and security. A real location dataset was applied to test the proposed strategy and the results showed that the strategy provides better location service and protects users’ privacy.

  5. A rural/urban comparison of privacy and confidentiality concerns associated with providing sensitive location information in epidemiologic research involving persons who use drugs.

    Science.gov (United States)

    Rudolph, Abby E; Young, April M; Havens, Jennifer R

    2017-11-01

    Analyses that link contextual factors with individual-level data can improve our understanding of the "risk environment"; however, the accuracy of information provided by participants about locations where illegal/stigmatized behaviors occur may be influenced by privacy/confidentiality concerns that may vary by setting and/or data collection approach. We recruited thirty-five persons who use drugs from a rural Appalachian town and a Mid-Atlantic city to participate in in-depth interviews. Through thematic analyses, we identified and compared privacy/confidentiality concerns associated with two survey methods that (1) collect self-reported addresses/cross-streets and (2) use an interactive web-based map to find/confirm locations in rural and urban settings. Concerns differed more by setting than between methods. For example, (1) rural participants valued interviewer rapport and protections provided by the Certificate of Confidentiality more; (2) locations considered to be sensitive differed in rural (i.e., others' homes) and urban (i.e., where drugs were used) settings; and (3) urban participants were more likely to view providing cross-streets as an acceptable alternative to providing exact addresses for sensitive locations and to prefer the web-based map approach. Rural-urban differences in privacy/confidentiality concerns reflect contextual differences (i.e., where drugs are used/purchased, population density, and prior drug-related arrests). Strategies to alleviate concerns include: (1) obtain a Certificate of Confidentiality, (2) collect geographic data at the scale necessary for proposed analyses, and (3) permit participants to provide intersections/landmarks in close proximity to actual locations rather than exact addresses or to skip questions where providing an intersection/landmark would not obfuscate the actual address.

  6. Location privacy online : China, the Netherlands and South Korea

    NARCIS (Netherlands)

    Broeder, Peter; Lee, Yujin

    2016-01-01

    The aim of the study is to explore cross-cultural differences in users’ location privacy behaviour on LBSNs (location-based social networks) in China, the Netherlands and Korea. The study suggests evidence that Chinese, Dutch and Korean users exhibit different location privacy concerns, attitudes to

  7. Towards a New Classification of Location Privacy Methods in Pervasive Computing

    DEFF Research Database (Denmark)

    Andersen, Mads Schaarup; Kjærgaard, Mikkel Baun

    2011-01-01

    Over the last decade many methods for location privacy have been proposed, but the mapping between classes of location based services and location privacy methods is not obvious. This entails confusion for developers, lack of usage of privacy methods, and an unclear road-map ahead for research... The classes of location based services considered are Point-of-Interest, Social Networking, Collaborative Sensing, and Route Tracing, and the high level location privacy method categories are Anonymization, Classical Security, Spatial Obfuscation, Temporal Obfuscation, and Protocol. It is found that little work exists on location privacy in the areas of Social Networking and Collaborative Sensing, and that insufficient work has been done in Route Tracing. It is concluded that none of the existing methods cover all applications of Route Tracing. It is, therefore, suggested that a new overall method should be proposed to solve the problem of location privacy in Route Tracing...

  8. Protection of Location Privacy Based on Distributed Collaborative Recommendations.

    Science.gov (United States)

    Wang, Peng; Yang, Jing; Zhang, Jian-Pei

    2016-01-01

    In the existing centralized location services system structure, the server is easily attacked and becomes the communication bottleneck, which can cause the disclosure of users' locations. To address this, we present a new distributed collaborative recommendation strategy based on a distributed system. In this strategy, each node establishes profiles of its own location information. When requests for location services appear, the user can obtain the corresponding location services according to the recommendation of the neighboring users' location information profiles. If no suitable recommended location service results are obtained, then the user can send a service request to the server according to the construction of a k-anonymous data set with a centroid position of the neighbors. In this strategy, we designed a new model of distributed collaborative recommendation location service based on the users' location information profiles and used generalization and encryption to ensure the safety of the user's location information privacy. Finally, we used a real location data set to make theoretical and experimental analyses. The results show that the strategy proposed in this paper is capable of reducing the frequency of access to the location server, providing better location services and better protecting the user's location privacy.
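
    As a rough illustration of the fallback described above, where a request is generalized using the centroid of a k-anonymous neighbor set, the hypothetical sketch below builds a k-member group around the requester and reports only the group centroid to the server. The data layout and the value of k are assumptions for the example.

        def k_anonymous_centroid(requester, neighbors, k=5):
            # Build a k-member set (requester plus nearest neighbors) and return its centroid.
            by_distance = sorted(neighbors,
                                 key=lambda p: (p[0] - requester[0]) ** 2 + (p[1] - requester[1]) ** 2)
            group = [requester] + by_distance[:k - 1]
            cx = sum(x for x, _ in group) / len(group)
            cy = sum(y for _, y in group) / len(group)
            return cx, cy

        neighbors = [(31.21, 121.45), (31.23, 121.47), (31.22, 121.43), (31.25, 121.44)]
        print(k_anonymous_centroid((31.22, 121.46), neighbors, k=4))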

  9. Extended Privacy in Crowdsourced Location-Based Services Using Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Jacques Bou Abdo

    2016-01-01

    Full Text Available Crowdsourcing mobile applications are of increasing importance due to their suitability in providing personalized and better matching replies. The competitive edge of crowdsourcing is twofold; the requestors can achieve better and/or cheaper responses while the crowd contributors can achieve extra money by utilizing their free time or resources. Crowdsourcing location-based services inherit the querying mechanism from their legacy predecessors and this is where the threat lies. In this paper, we are going to show that none of the advanced privacy notions found in the literature except for K-anonymity is suitable for crowdsourced location-based services. In addition, we are going to prove mathematically, using an attack we developed, that K-anonymity does not satisfy the privacy level needed by such services. To respond to this emerging threat, we will propose a new concept, totally different from existing resource consuming privacy notions, to handle user privacy using Mobile Cloud Computing.

  10. Protecting location privacy for outsourced spatial data in cloud storage.

    Science.gov (United States)

    Tian, Feng; Gui, Xiaolin; An, Jian; Yang, Pan; Zhao, Jianqiang; Zhang, Xuejun

    2014-01-01

    As cloud computing services and location-aware devices are fully developed, a large amount of spatial data needs to be outsourced to the cloud storage provider, so research on privacy protection for outsourced spatial data is getting increasing attention from academia and industry. As a kind of spatial transformation method, the Hilbert curve is widely used to protect the location privacy of spatial data, but sufficient security analysis for the standard Hilbert curve (SHC) has seldom been conducted. In this paper, we propose an index modification method for SHC (SHC*) and a density-based space filling curve (DSC) to improve the security of SHC; they can partially violate the distance-preserving property of SHC, so as to achieve better security. We formally define the indistinguishability and attack model for measuring the privacy disclosure risk of spatial transformation methods. The evaluation results indicate that SHC* and DSC are more secure than SHC, and DSC achieves the best index generation performance.
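
    The SHC* and DSC modifications are the paper's contribution and are not reproduced here. For readers unfamiliar with the baseline being hardened, the sketch below implements the standard Hilbert-curve index that maps a 2-D cell to its 1-D position along the curve; it is this distance-preserving mapping that the paper then perturbs to achieve better security.

        def hilbert_index(n, x, y):
            # Map cell (x, y) of an n x n grid (n a power of two) to its distance along
            # the Hilbert curve; nearby cells tend to receive nearby indices.
            d, s = 0, n // 2
            while s > 0:
                rx = 1 if (x & s) > 0 else 0
                ry = 1 if (y & s) > 0 else 0
                d += s * s * ((3 * rx) ^ ry)
                if ry == 0:                     # rotate/flip the quadrant to keep the curve continuous
                    if rx == 1:
                        x, y = n - 1 - x, n - 1 - y
                    x, y = y, x
                s //= 2
            return d

        print(hilbert_index(4, 2, 0))   # 14 on a 4 x 4 grid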

  11. Achieve Location Privacy-Preserving Range Query in Vehicular Sensing.

    Science.gov (United States)

    Kong, Qinglei; Lu, Rongxing; Ma, Maode; Bao, Haiyong

    2017-08-08

    Modern vehicles are equipped with a plethora of on-board sensors and large on-board storage, which enables them to gather and store various local-relevant data. However, the wide application of vehicular sensing has its own challenges, among which location-privacy preservation and data query accuracy are two critical problems. In this paper, we propose a novel range query scheme, which helps the data requester to accurately retrieve the sensed data from the distributive on-board storage in vehicular ad hoc networks (VANETs) with location privacy preservation. The proposed scheme exploits structured scalars to denote the locations of data requesters and vehicles, and achieves the privacy-preserving location matching with the homomorphic Paillier cryptosystem technique. Detailed security analysis shows that the proposed range query scheme can successfully preserve the location privacy of the involved data requesters and vehicles, and protect the confidentiality of the sensed data. In addition, performance evaluations are conducted to show the efficiency of the proposed scheme, in terms of computation delay and communication overhead. Specifically, the computation delay and communication overhead are not dependent on the length of the scalar, and they are only proportional to the number of vehicles.
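
    The structured scalars and the full matching protocol are specific to the paper; the sketch below only demonstrates the additive homomorphism of the Paillier cryptosystem that such privacy-preserving matching builds on, using the third-party phe library (an assumed dependency, not something named in the record).

        # pip install phe
        from phe import paillier

        public_key, private_key = paillier.generate_paillier_keypair()

        # A vehicle encrypts its (scaled, integer) coordinate; others never see it in clear.
        enc_vehicle_x = public_key.encrypt(5230)

        # Holding only the public key, a party can form an encrypted difference against
        # its own coordinate 5200 without learning the vehicle's value.
        enc_diff = enc_vehicle_x - 5200

        print(private_key.decrypt(enc_diff))   # 30, recoverable only by the key holder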

  12. Privacy implications of location and contextual data on the social web

    OpenAIRE

    Zafeiropoulou, Aristea-Maria; Millard, David; Webber, Craig; O'Hara, Kieron

    2011-01-01

    Location-based applications have recently begun to emerge on the Social Web. Their appearance has provoked numerous concerns regarding location privacy. However, these privacy concerns seem to have effects beyond location, as other contextual information can be inferred through location information. This research addresses these implications, which continue to grow on the Social Web.

  13. Game-Theoretic Model of Incentivizing Privacy-Aware Users to Consent to Location Tracking

    OpenAIRE

    Panaousis, Emmanouil; Laszka, Aron; Pohl, Johannes; Noack, Andreas; Alpcan, Tansu

    2016-01-01

    Nowadays, mobile users have a vast number of applications and services at their disposal. Each of these might impose some privacy threats on users' "Personally Identifiable Information" (PII). Location privacy is a crucial part of PII, and as such, privacy-aware users wish to maximize it. This privacy can be, for instance, threatened by a company, which collects users' traces and shares them with third parties. To maximize their location privacy, users can decide to get offline so that the co...

  14. GLPP: A Game-Based Location Privacy-Preserving Framework in Account Linked Mixed Location-Based Services

    Directory of Open Access Journals (Sweden)

    Zhuo Ma

    2018-01-01

    Full Text Available In Location-Based Services (LBSs) platforms, such as Foursquare and Swarm, the submitted position for a share or search leads to the exposure of users’ activities. Additionally, the cross-platform account linkage could aggravate this exposure, as the fusion of users’ information can enhance inference attacks on users’ next submitted location. Hence, in this paper, we propose GLPP, a personalized and continuous location privacy-preserving framework in account linked platforms with different LBSs (i.e., search-based LBSs and share-based LBSs). The key point of GLPP is to obfuscate every location submitted in search-based LBSs so as to defend against dynamic inference attacks. Specifically, first, possible inference attacks are listed through user behavioral analysis. Second, for each specific attack, an obfuscation model is proposed to minimize location privacy leakage under a given location distortion, which ensures submitted locations’ utility for search-based LBSs. Third, for dynamic attacks, a framework based on a zero-sum game is adopted to combine the specific obfuscations above and minimize the location privacy leakage to a balanced point. Experiments on a real dataset prove the effectiveness of our proposed attacks in Accuracy, Certainty, and Correctness and, meanwhile, also show the performance of our preserving solution in defense of attacks and guarantee of location utility.

  15. A Fine-Grained and Privacy-Preserving Query Scheme for Fog Computing-Enhanced Location-Based Service.

    Science.gov (United States)

    Yang, Xue; Yin, Fan; Tang, Xiaohu

    2017-07-11

    Location-based services (LBS), as one of the most popular location-awareness applications, have been further developed to achieve low latency with the assistance of fog computing. However, privacy issues remain a research challenge in the context of fog computing. Therefore, in this paper, we present a fine-grained and privacy-preserving query scheme for fog computing-enhanced location-based services, hereafter referred to as FGPQ. In particular, mobile users can obtain the fine-grained searching result satisfying not only the given spatial range but also the searching content. Detailed privacy analysis shows that our proposed scheme indeed achieves the privacy preservation for the LBS provider and mobile users. In addition, extensive performance analyses and experiments demonstrate that the FGPQ scheme can significantly reduce computational and communication overheads and ensure low latency, which outperforms existing state-of-the-art schemes. Hence, our proposed scheme is more suitable for real-time LBS searching.

  16. Location Privacy for Mobile Crowd Sensing through Population Mapping.

    Science.gov (United States)

    Shin, Minho; Cornelius, Cory; Kapadia, Apu; Triandopoulos, Nikos; Kotz, David

    2015-06-29

    Opportunistic sensing allows applications to "task" mobile devices to measure context in a target region. For example, one could leverage sensor-equipped vehicles to measure traffic or pollution levels on a particular street or users' mobile phones to locate (Bluetooth-enabled) objects in their vicinity. In most proposed applications, context reports include the time and location of the event, putting the privacy of users at increased risk: even if identifying information has been removed from a report, the accompanying time and location can reveal sufficient information to de-anonymize the user whose device sent the report. We propose and evaluate a novel spatiotemporal blurring mechanism based on tessellation and clustering to protect users' privacy against the system while reporting context. Our technique employs a notion of probabilistic k-anonymity; it allows users to perform local blurring of reports efficiently without an online anonymization server before the data are sent to the system. The proposed scheme can control the degree of certainty in location privacy and the quality of reports through a system parameter. We outline the architecture and security properties of our approach and evaluate our tessellation and clustering algorithm against real mobility traces.
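
    The tessellation and clustering pipeline is evaluated in the paper itself; the sketch below only shows the client-side blurring step in a simplified form, keeping a coarse cell identifier and a coarse time slot instead of the exact report. A regular grid and 15-minute slots are assumptions standing in for the paper's tessellation.

        import math
        import time

        def blur_report(lat, lon, timestamp, cell_deg=0.02, slot_seconds=900):
            # Replace an exact (lat, lon, time) report with a cell id and a 15-minute slot.
            cell = (math.floor(lat / cell_deg), math.floor(lon / cell_deg))
            slot = int(timestamp // slot_seconds)
            return {"cell": cell, "slot": slot}

        print(blur_report(43.7031, -72.2895, time.time()))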

  17. Location Privacy for Mobile Crowd Sensing through Population Mapping

    Directory of Open Access Journals (Sweden)

    Minho Shin

    2015-06-01

    Full Text Available Opportunistic sensing allows applications to “task” mobile devices to measure context in a target region. For example, one could leverage sensor-equipped vehicles to measure traffic or pollution levels on a particular street or users’ mobile phones to locate (Bluetooth-enabled) objects in their vicinity. In most proposed applications, context reports include the time and location of the event, putting the privacy of users at increased risk: even if identifying information has been removed from a report, the accompanying time and location can reveal sufficient information to de-anonymize the user whose device sent the report. We propose and evaluate a novel spatiotemporal blurring mechanism based on tessellation and clustering to protect users’ privacy against the system while reporting context. Our technique employs a notion of probabilistic k-anonymity; it allows users to perform local blurring of reports efficiently without an online anonymization server before the data are sent to the system. The proposed scheme can control the degree of certainty in location privacy and the quality of reports through a system parameter. We outline the architecture and security properties of our approach and evaluate our tessellation and clustering algorithm against real mobility traces.

  18. Evaluating Common Privacy Vulnerabilities in Internet Service Providers

    Science.gov (United States)

    Kotzanikolaou, Panayiotis; Maniatis, Sotirios; Nikolouzou, Eugenia; Stathopoulos, Vassilios

    Privacy in electronic communications receives increased attention in both research and industry forums, stemming from both the users' needs and from legal and regulatory requirements in national or international context. Privacy in internet-based communications heavily relies on the level of security of the Internet Service Providers (ISPs), as well as on the security awareness of the end users. This paper discusses the role of the ISP in the privacy of the communications. Based on real security audits performed in national-wide ISPs, we illustrate privacy-specific threats and vulnerabilities that many providers fail to address when implementing their security policies. We subsequently provide and discuss specific security measures that the ISPs can implement, in order to fine-tune their security policies in the context of privacy protection.

  19. Unpicking the privacy paradox: can structuration theory help to explain location-based privacy decisions?

    OpenAIRE

    Zafeiropoulou, Aristea M.; Millard, David E.; Webber, Craig; O'Hara, Kieron

    2013-01-01

    Social Media and Web 2.0 tools have dramatically increased the amount of previously private data that users share on the Web; now with the advent of GPS-enabled smartphones users are also actively sharing their location data through a variety of applications and services. Existing research has explored people’s privacy attitudes, and shown that the way people trade their personal data for services of value can be inconsistent with their stated privacy preferences (a phenomenon known as the pr...

  20. A Moving-Object Index for Efficient Query Processing with PeerWise Location Privacy

    DEFF Research Database (Denmark)

    Lin, Dan; Jensen, Christian S.; Zhang, Rui

    2011-01-01

    ...attention has been paid to enabling so-called peer-wise privacy—the protection of a user’s location from unauthorized peer users. This paper identifies an important efficiency problem in existing peer-privacy approaches that simply apply a filtering step to identify users that are located in a query range, but that do not want to disclose their location to the querying peer. To solve this problem, we propose a novel, privacy-policy enabled index called the PEB-tree that seamlessly integrates location proximity and policy compatibility. We propose efficient algorithms that use the PEB-tree for processing privacy-aware range and kNN queries. Extensive experiments suggest that the PEB-tree enables efficient query processing.

  1. Location-Based Services and Privacy in Airports

    DEFF Research Database (Denmark)

    Hansen, John Paulin; Alapetite, Alexandre; Andersen, Henning Boje

    2009-01-01

    This paper reports on a study of privacy concerns related to location-based services in an airport, where users who volunteer for the service will be tracked for a limited period and within a limited area. Reactions elicited from travellers at a field trial showed 60% feeling to some or to a larg...

  2. LPPS: A Distributed Cache Pushing Based K-Anonymity Location Privacy Preserving Scheme

    Directory of Open Access Journals (Sweden)

    Ming Chen

    2016-01-01

    Full Text Available Recent years have witnessed the rapid growth of location-based services (LBSs) for mobile social network applications. To enable location-based services, mobile users are required to report their location information to the LBS servers and receive answers of location-based queries. Location privacy leaks happen when such servers are compromised, which has been a primary concern for information security. To address this issue, we propose the Location Privacy Preservation Scheme (LPPS) based on distributed cache pushing. Unlike existing solutions, LPPS deploys distributed cache proxies to cover users’ most visited locations and proactively push cache content to mobile users, which can reduce the risk of leaking users’ location information. The proposed LPPS includes three major processes. First, we propose an algorithm to find the optimal deployment of proxies to cover popular locations. Second, we present cache strategies for location-based queries based on the Markov chain model and propose update and replacement strategies for cache content maintenance. Third, we introduce a privacy protection scheme which is proved to achieve a k-anonymity guarantee for location-based services. Extensive experiments illustrate that the proposed LPPS achieves decent service coverage ratio and cache hit ratio with lower communication overhead compared to existing solutions.
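
    The proxy deployment and cache replacement policies are the paper's own; the sketch below only illustrates the Markov-chain idea behind the cache strategy, estimating first-order transition probabilities from a visit sequence and ranking the cells whose content is worth pushing next. The trace format is an assumption for the example.

        from collections import defaultdict

        def transition_matrix(trace):
            # Estimate first-order transition probabilities from a sequence of visited cell ids.
            counts = defaultdict(lambda: defaultdict(int))
            for src, dst in zip(trace, trace[1:]):
                counts[src][dst] += 1
            return {src: {dst: c / sum(dsts.values()) for dst, c in dsts.items()}
                    for src, dsts in counts.items()}

        def likely_next(matrix, current, top=2):
            # Cells to push cache content for, ranked by transition probability.
            return sorted(matrix.get(current, {}).items(), key=lambda kv: -kv[1])[:top]

        trace = ["home", "cafe", "office", "cafe", "office", "gym", "home", "cafe", "office"]
        print(likely_next(transition_matrix(trace), "cafe"))   # [("office", 1.0)]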

  3. Expectation-Maximization Tensor Factorization for Practical Location Privacy Attacks

    Directory of Open Access Journals (Sweden)

    Murakami Takao

    2017-10-01

    Full Text Available Location privacy attacks based on a Markov chain model have been widely studied to de-anonymize or de-obfuscate mobility traces. An adversary can perform various kinds of location privacy attacks using a personalized transition matrix, which is trained for each target user. However, the amount of training data available to the adversary can be very small, since many users do not disclose much location information in their daily lives. In addition, many locations can be missing from the training traces, since many users do not disclose their locations continuously but rather sporadically. In this paper, we show that the Markov chain model can be a threat even in this realistic situation. Specifically, we focus on a training phase (i.e., mobility profile building phase) and propose Expectation-Maximization Tensor Factorization (EMTF), which alternates between computing a distribution of missing locations (E-step) and computing personalized transition matrices via tensor factorization (M-step). Since the time complexity of EMTF is exponential in the number of missing locations, we propose two approximate learning methods, one of which uses the Viterbi algorithm while the other uses the Forward Filtering Backward Sampling (FFBS) algorithm. We apply our learning methods to a de-anonymization attack and a localization attack, and evaluate them using three real datasets. The results show that our learning methods significantly outperform a random guess, even when there is only one training trace composed of 10 locations per user, and each location is missing with probability 80% (i.e., even when users hardly disclose two temporally-continuous locations).

  4. Location privacy protection in mobile networks

    CERN Document Server

    Liu, Xinxin

    2013-01-01

    This SpringerBrief analyzes the potential privacy threats in wireless and mobile network environments, and reviews some existing works. It proposes multiple privacy preserving techniques against several types of privacy threats that are targeting users in a mobile network environment. Depending on the network architecture, different approaches can be adopted. The first proposed approach considers a three-party system architecture where there is a trusted central authority that can be used to protect users' privacy. The second approach considers a totally distributed environment where users per

  5. Anonymous authentication and location privacy preserving schemes for LTE-A networks

    Directory of Open Access Journals (Sweden)

    Zaher Jabr Haddad

    2017-11-01

    Full Text Available Long Term Evolution-Advanced (LTE-A) is the third generation partnership project standard for cellular networks that allows subscribers to roam into networks (i.e., the Internet and wireless connections) using special-purpose base-stations, such as wireless access points and home node B. In such LTE-A based networks, neither base-stations, nor the Internet and wireless connections are trusted because base-stations are operated by un-trusted subscribers. Attackers may exploit these vulnerabilities to violate the privacy of the LTE-A subscribers. On the other hand, the tradeoff between privacy and authentication is another challenge in such networks. Therefore, in this paper, we propose two anonymous authentication schemes based on one-time pseudonyms and Schnorr Zero Knowledge Protocols. Instead of the international mobile subscriber identity, these schemes enable the user equipment, base-stations and mobility management entity to mutually authenticate each other and update the location of the user equipment without involving the home subscriber server. The security analysis demonstrates that the proposed schemes thwart security and privacy attacks, such as malicious attacks, international mobile subscriber identity catching, and tracking attacks. Additionally, our proposed schemes preserve the location privacy of user equipment since no entity except the mobility management entity and Gate-Way Mobile Location Center can link the pseudonyms to the international mobile subscriber identity. Also, attackers have no knowledge about the international mobile subscriber identity. Hence, the proposed schemes achieve backward/forward secrecy. Furthermore, the performance evaluation shows that the proposed handover schemes impose a small overhead on the mobile nodes and have smaller computation and communication overheads than those in other schemes.
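
    The LTE-A message flows and pseudonym management are specific to the schemes above; the sketch below only shows, with toy parameters, the Schnorr zero-knowledge identification step they build on: the prover convinces the verifier that it knows the secret behind a (pseudonymous) public key without revealing that secret. A real deployment would use standardized large groups or elliptic curves.

        import secrets

        # Toy Schnorr group: p = 2q + 1 with p, q prime; g generates the order-q subgroup.
        p, q, g = 2039, 1019, 4

        x = secrets.randbelow(q)          # prover's long-term secret
        y = pow(g, x, p)                  # pseudonymous public key

        r = secrets.randbelow(q)          # prover picks a random nonce
        t = pow(g, r, p)                  # and sends the commitment t

        c = secrets.randbelow(q)          # verifier's random challenge

        s = (r + c * x) % q               # prover's response

        # Verification: g^s == t * y^c (mod p) proves knowledge of x without revealing it.
        assert pow(g, s, p) == (t * pow(y, c, p)) % p
        print("proof verified")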

  6. Lattice Based Mix Network for Location Privacy in Mobile System

    Directory of Open Access Journals (Sweden)

    Kunwar Singh

    2015-01-01

    Full Text Available In 1981, David Chaum proposed a cryptographic primitive for privacy called the mix network (Mixnet). A mixnet is a cryptographic construction that establishes an anonymous communication channel through a set of servers. In 2004, Golle et al. proposed a new cryptographic primitive called universal reencryption, which takes as input encrypted messages under the public key of the recipients, not the public key of the universal mixnet. In Eurocrypt 2010, Gentry, Halevi, and Vaikuntanathan presented a cryptosystem which is additively homomorphic and multiplicatively homomorphic for only one multiplication. In MIST 2013, Singh et al. presented a lattice based universal reencryption scheme under the learning with errors (LWE) assumption. In this paper, we have improved Singh et al.’s scheme using Fairbrother’s idea. LWE is a hard lattice problem for which, so far, there is no polynomial time quantum algorithm. Wiangsripanawan et al. proposed a protocol for location privacy in mobile systems using universal reencryption whose security is reducible to the Decisional Diffie-Hellman assumption. Once quantum computers become a reality, universal reencryption can be broken in polynomial time by Shor’s algorithm. In post-quantum cryptography, our scheme can replace the universal reencryption scheme used in the Wiangsripanawan et al. scheme for location privacy in mobile systems.

  7. Privacy-Preserving Trajectory Collection

    DEFF Research Database (Denmark)

    Gidofalvi, Gyozo; Xuegang, Huang; Pedersen, Torben Bach

    2008-01-01

    In order to provide context-aware Location-Based Services, real location data of mobile users must be collected and analyzed by spatio-temporal data mining methods. However, the data mining methods need precise location data, while the mobile users want to protect their location privacy. To remedy this situation, this paper first formally defines novel location privacy requirements. Then, it briefly presents a system for privacy-preserving trajectory collection that meets these requirements. The system is composed of an untrusted server and clients communicating in a P2P network. Location data is anonymized in the system using data cloaking and data swapping techniques. Finally, the paper empirically demonstrates that the proposed system is effective and feasible.

  8. A privacy-preserving framework for outsourcing location-based services to the cloud

    OpenAIRE

    Zhu, Xiaojie; Ayday, Erman; Vitenberg, Roman

    2018-01-01

    Thanks to the popularity of mobile devices a large number of location-based services (LBS) have emerged. While a large number of privacy-preserving solutions for LBS have been proposed, most of these solutions do not consider the fact that LBS are typically cloud-based nowadays. Outsourcing data and computation to the cloud raises a number of significant challenges related to data confidentiality, user identity and query privacy, fine-grain access control, and query expressiveness. In this wo...

  9. Achieving Network Level Privacy in Wireless Sensor Networks†

    Science.gov (United States)

    Shaikh, Riaz Ahmed; Jameel, Hassan; d’Auriol, Brian J.; Lee, Heejo; Lee, Sungyoung; Song, Young-Jae

    2010-01-01

    Full network level privacy has often been categorized into four sub-categories: Identity, Route, Location and Data privacy. Achieving full network level privacy is a critical and challenging problem due to the constraints imposed by the sensor nodes (e.g., energy, memory and computation power), sensor networks (e.g., mobility and topology) and QoS issues (e.g., packet reach-ability and timeliness). In this paper, we proposed two new identity, route and location privacy algorithms and a data privacy mechanism that addresses this problem. The proposed solutions provide additional trustworthiness and reliability at modest cost of memory and energy. Also, we proved that our proposed solutions provide protection against various privacy disclosure attacks, such as eavesdropping and hop-by-hop trace back attacks. PMID:22294881

  10. Achieving Network Level Privacy in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Sungyoung Lee

    2010-02-01

    Full Text Available Full network level privacy has often been categorized into four sub-categories: Identity, Route, Location and Data privacy. Achieving full network level privacy is a critical and challenging problem due to the constraints imposed by the sensor nodes (e.g., energy, memory and computation power), sensor networks (e.g., mobility and topology) and QoS issues (e.g., packet reach-ability and timeliness). In this paper, we proposed two new identity, route and location privacy algorithms and a data privacy mechanism that addresses this problem. The proposed solutions provide additional trustworthiness and reliability at modest cost of memory and energy. Also, we proved that our proposed solutions provide protection against various privacy disclosure attacks, such as eavesdropping and hop-by-hop trace back attacks.

  11. Privacy Management on Facebook: Do Device Type and Location of Posting Matter?

    Directory of Open Access Journals (Sweden)

    Jennifer Jiyoung Suh

    2015-10-01

    Full Text Available People’s information sharing on Facebook often happens through mobile devices allowing for posting from different locations. Despite the potential contextual differences in content sharing, the literature on online privacy management rarely takes into consideration the type of device and the type of location from which people post content. Do these aspects of Facebook use affect how people share information online? Analyzing Facebook posts young adults shared from different devices and different locations, this article examines the effectiveness of users’ privacy management. By comparing the intended audience with the actual audience of each post, we find considerable mismatch between the two despite most participants expressing confidence in their ability to manage their information on the site. Posts that are accidentally shared with “public”—potentially anyone on the web—are more likely to be shared from non-mobile devices. Interview data reveal that this happens despite the fact that most participants consider non-mobile devices more reliable and convenient to use than mobile devices.

  12. Anonymity Preserving Routing In Location Privacy Schemes In Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    R Regin

    2015-12-01

    Full Text Available Location privacy measures need to be developed to prevent the opponent from determining the physical locations of source sensors and sinks. An opponent can easily intercept network traffic due to the use of a broadcast medium for routing packets and get detailed information such as packet transmission time and frequency to perform traffic analysis and infer the locations of monitored objects and data sinks. On the other hand, sensors usually have limited processing speed and energy supplies. It is very expensive to apply traditional anonymous communication techniques for hiding the communication between sensor nodes and sinks. Existing source-location privacy schemes protect the location of monitored objects so as to increase the number of messages sent by the source before the object is located by the attacker. The flooding technique has the source node send each packet through numerous paths to a sink, making it difficult for an opponent to trace the source. The locations of sinks can be protected from a local eavesdropper by hashing the ID field in the packet header, but an opponent can still track sinks by carrying out time correlation and rate monitoring attacks. Moreover, some source nodes transfer relatively large amounts of data in the existing system; as a result, these nodes run out of battery faster due to the improper positioning of nodes and sinks. Thus, in the proposed system, the sinks should be located as optimally as possible to reduce traffic flow and energy consumption for sensor nodes. Hence the sink placement problem is resolved for minimizing the delay as well as maximizing the lifetime of a WSN. Thus the proposed system is efficient in terms of overhead and functionality when compared to the existing system.

  13. Towards Privacy Management of Information Systems

    OpenAIRE

    Drageide, Vidar

    2009-01-01

    This master's thesis provides insight into the concept of privacy. It argues why privacy is important, and why developers and system owners should keep privacy in mind when developing and maintaining systems containing personal information. Following this, a strategy for evaluating the overall level of privacy in a system is defined. The strategy is then applied to parts of the cellphone system in an attempt to evaluate the privacy of traffic and location data in this system.

  14. Privacy Implications of Surveillance Systems

    DEFF Research Database (Denmark)

    Thommesen, Jacob; Andersen, Henning Boje

    2009-01-01

    This paper presents a model for assessing the privacy "cost" of a surveillance system. Surveillance systems collect and provide personal information or observations of people by means of surveillance technologies such as databases, video or location tracking. Such systems can be designed for various purposes, even as a service for those being observed, but in any case they will to some degree invade their privacy. The model provided here can indicate how invasive any particular system may be, and can be used to compare the invasiveness of different systems. Applying a functional approach, the model is established by first considering the social function of privacy in everyday life, which in turn lets us determine which different domains will be considered as private, and finally identify the different types of privacy invasion. This underlying model (function – domain – invasion) then serves...

  15. What Does The Crowd Say About You? Evaluating Aggregation-based Location Privacy

    Directory of Open Access Journals (Sweden)

    Pyrgelis Apostolos

    2017-10-01

    Full Text Available Information about people’s movements and the locations they visit enables an increasing number of mobility analytics applications, e.g., in the context of urban and transportation planning. In this setting, rather than collecting or sharing raw data, entities often use aggregation as a privacy protection mechanism, aiming to hide individual users’ location traces. Furthermore, to bound information leakage from the aggregates, they can perturb the input of the aggregation or its output to ensure that these are differentially private.

  16. Secure Mix-Zones for Privacy Protection of Road Network Location Based Services Users

    Directory of Open Access Journals (Sweden)

    Rubina S. Zuberi

    2016-01-01

    Full Text Available Privacy has been found to be the major impediment, and hence the area to be worked on, for the provision of Location Based Services in the wide sense. With the emergence of smart, easily portable, communicating devices, information acquisition is reaching new domains. The work presented here is an extension of ongoing work towards achieving privacy for present-day emerging communication techniques. It emphasizes one of the most effective real-time privacy enhancement techniques, called Mix-Zones. In this paper, we present a model of a secure road network with Mix-Zones that are activated on the basis of spatial as well as temporal factors, where the temporal factors are determined by the amount of traffic and its flow. The paper also discusses the importance of the number of Mix-Zones a user traverses and their mixing effectiveness. Using simulations, which are required for real-time treatment of the problem, we also show that the proposed transient Mix-Zones are part of a viable and robust solution towards the road-network privacy protection of the communicating moving objects of the present scenario.
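
    The record above describes mix-zones that are activated by spatial and temporal (traffic) conditions, inside which users' identifiers are mixed. Purely as an illustration of that idea, and not as the authors' implementation, the Python sketch below triggers a pseudonym change only when enough vehicles are mixing at once; the Vehicle class, the traffic threshold and the use of random hex pseudonyms are all invented for the example.

    ```python
    import secrets
    from dataclasses import dataclass, field

    @dataclass
    class Vehicle:
        pseudonym: str = field(default_factory=lambda: secrets.token_hex(8))

    class MixZone:
        """Toy mix-zone: pseudonyms are only swapped when enough vehicles
        are present at the same time (the temporal/traffic condition)."""

        def __init__(self, min_traffic=3):
            self.min_traffic = min_traffic  # hypothetical activation threshold
            self.inside = []

        def enter(self, vehicle):
            self.inside.append(vehicle)

        def leave_all(self):
            # Mixing only helps if several vehicles exit together; otherwise
            # the old and new pseudonyms would be trivially linkable.
            if len(self.inside) >= self.min_traffic:
                for v in self.inside:
                    v.pseudonym = secrets.token_hex(8)  # fresh, unlinkable ID
            self.inside.clear()

    if __name__ == "__main__":
        zone = MixZone(min_traffic=3)
        cars = [Vehicle() for _ in range(4)]
        before = [c.pseudonym for c in cars]
        for c in cars:
            zone.enter(c)
        zone.leave_all()
        print("all pseudonyms changed:",
              all(b != c.pseudonym for b, c in zip(before, cars)))
    ```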

  17. Protection of the Locational Privacy Using Mosaic Theory of Data (Varstvo lokacijske zasebnosti s pomočjo mozaične teorije podatkov)

    Directory of Open Access Journals (Sweden)

    Primož Križnar

    2016-12-01

    Full Text Available The individual’s right to privacy is one of the fundamental human rights. Part of this »embedded« right is a person’s ability to move between different points and locations with a reasonable expectation that the paths travelled, the stops made and the current location are not systematically recorded and stored for future use. Notwithstanding this, individuals often seem unaware of the capabilities of modern technology, which aggressively interferes with a wide spectrum of their privacy, part of which is locational privacy. As an essential component of privacy, locational privacy must therefore be given the necessary legal protection, which, at least for the time being, is reflected in the implementation of the mosaic theory in European legal traditions with the help of the established legal standards of the European Court of Human Rights regarding privacy.

  18. An Efficient and Privacy-Preserving Multiuser Cloud-Based LBS Query Scheme

    Directory of Open Access Journals (Sweden)

    Lu Ou

    2018-01-01

    Full Text Available Location-based services (LBSs) are increasingly popular in today’s society. People reveal their location information to LBS providers to obtain personalized services such as map directions, restaurant recommendations, and taxi reservations. Usually, LBS providers offer a user privacy protection statement to assure users that their private location information will not be given away. However, many LBSs run on third-party cloud infrastructures. It is challenging to guarantee user location privacy against curious cloud operators while still permitting users to query their own location information data. In this paper, we propose an efficient privacy-preserving cloud-based LBS query scheme for the multiuser setting. We encrypt LBS data and LBS queries with a hybrid encryption mechanism, which can efficiently implement privacy-preserving search over encrypted LBS data and is very suitable for the multiuser setting with secure and effective user enrollment and user revocation. This paper contains security analysis and performance experiments to demonstrate the privacy-preserving properties and efficiency of our proposed scheme.

  19. Mobile location-based advertising: how information privacy concerns influence consumers' attitude and acceptance

    NARCIS (Netherlands)

    Limpf, N.; Voorveld, H.A.M.

    2015-01-01

    This study investigates the effect of information privacy concerns on consumers' attitude toward and acceptance of mobile location-based advertising (LBA), and the moderating role of the type of mobile LBA, namely push versus pull. Using an online experiment (N = 224), it was found that consumers'

  20. Effective Privacy-Preserving Online Route Planning

    DEFF Research Database (Denmark)

    Vicente, Carmen Ruiz; Assent, Ira; Jensen, Christian S.

    2011-01-01

    An online Route Planning Service (RPS) computes a route from one location to another. Current RPSs such as Google Maps require the use of precise locations. However, some users may not want to disclose their source and destination locations due to privacy concerns. An approach that supplies fake [...] privacy. The solution re-uses a standard online RPS rather than replicate this functionality, and it needs no trusted third party. The solution is able to compute the exact results without leaking the exact locations to the RPS or un-trusted parties. In addition, we provide heuristics that reduce the number of times that the RPS needs to be queried, and we also describe how the accuracy and privacy requirements can be relaxed to achieve better performance. An empirical study offers insight into key properties of the approach.

  1. Entropy-Based Privacy against Profiling of User Mobility

    Directory of Open Access Journals (Sweden)

    Alicia Rodriguez-Carrion

    2015-06-01

    Full Text Available Location-based services (LBSs) flood mobile phones nowadays, but their use poses an evident privacy risk. The locations accompanying the LBS queries can be exploited by the LBS provider to build the user profile of visited locations, which might disclose sensitive data, such as work or home locations. The classic concept of entropy is widely used to evaluate privacy in these scenarios, where the information is represented as a sequence of independent samples of categorized data. However, since the LBS queries might be sent very frequently, location profiles can be improved by adding temporal dependencies, thus becoming mobility profiles, where location samples are not independent anymore and might disclose the user’s mobility patterns. Since the time dimension is factored in, the classic entropy concept falls short of evaluating the real privacy level, which depends also on the time component. Therefore, we propose to extend the entropy-based privacy metric to the use of the entropy rate to evaluate mobility profiles. Then, two perturbative mechanisms are considered to preserve locations and mobility profiles under gradual utility constraints. We further use the proposed privacy metric and compare it to classic ones to evaluate both synthetic and real mobility profiles when the perturbative methods proposed are applied. The results prove the usefulness of the proposed metric for mobility profiles and the need for tailoring the perturbative methods to the features of mobility profiles in order to improve privacy without completely losing utility.
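
    To make the metric discussed in this record concrete, the following sketch (not taken from the paper) contrasts the classic Shannon entropy of a visited-location histogram with a first-order Markov estimate of the entropy rate, which accounts for the temporal dependencies between consecutive locations; the trace used in the example is invented.

    ```python
    import math
    from collections import Counter

    def entropy(seq):
        """Classic Shannon entropy of the visited-location histogram (bits)."""
        counts = Counter(seq)
        n = len(seq)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    def entropy_rate_markov1(seq):
        """First-order Markov estimate of the entropy rate (bits per step):
        weighted average of the entropies of the next-location distributions."""
        transitions = Counter(zip(seq, seq[1:]))
        state_counts = Counter(seq[:-1])
        rate = 0.0
        for state, s_count in state_counts.items():
            p_state = s_count / (len(seq) - 1)
            cond = [c / s_count for (a, _), c in transitions.items() if a == state]
            h_cond = -sum(p * math.log2(p) for p in cond)
            rate += p_state * h_cond
        return rate

    if __name__ == "__main__":
        # Invented trace: a highly regular home/work commuting pattern.
        trace = ["home", "work", "home", "work", "home", "work", "home", "gym"]
        print("entropy     :", round(entropy(trace), 3))
        print("entropy rate:", round(entropy_rate_markov1(trace), 3))
    ```

    On the regular commuting trace above, the entropy rate comes out noticeably lower than the histogram entropy, which is exactly the gap the record argues a privacy metric for mobility profiles must capture.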

  2. Location Privacy on DVB-RCS using a “Spatial-Timing” Approach

    Directory of Open Access Journals (Sweden)

    A. Aggelis

    2011-09-01

    Full Text Available DVB-RCS synchronization scheme on the Return Channel requires the RCSTs to be programmed with their location coordinates with an accuracy of no more than a few kilometers. RCSTs use this location information in their ranging calculation to the servicing satellite. For certain users this location information disclosure to the network operator can be seen as a serious security event. Recent work of the authors overcame this requirement by cloaking the location of an RCST in such a way (based on "spatial/geometric" symmetries of the network that the respective ranging calculations are not affected. In this work we argue that timing tolerances in the Return Channel synchronization scheme, accepted by the DVB-RCS standard, can be used in combination to the "spatial" method, further enhancing the location privacy of an RCST. Theoretical findings of the proposed "spatial-timing" approach were used to develop a practical method that can be used by workers in the field. Finally this practical method was successfully tested on a real DVB-RCS system.

  3. Privacy Issues of the W3C Geolocation API

    OpenAIRE

    Doty, Nick; Mulligan, Deirdre K.; Wilde, Erik

    2010-01-01

    The W3C's Geolocation API may rapidly standardize the transmission of location information on the Web, but, in dealing with such sensitive information, it also raises serious privacy concerns. We analyze the manner and extent to which the current W3C Geolocation API provides mechanisms to support privacy. We propose a privacy framework for the consideration of location information and use it to evaluate the W3C Geolocation API, both the specification and its use in the wild, and recommend s...

  4. An Improved Privacy-Preserving Framework for Location-Based Services Based on Double Cloaking Regions with Supplementary Information Constraints

    Directory of Open Access Journals (Sweden)

    Li Kuang

    2017-01-01

    Full Text Available With the rapid development of location-based services in the field of mobile network applications, users enjoy the convenience of location-based services on one side, while being exposed to the risk of disclosure of privacy on the other side. An attacker can mount a fierce attack based on the query probability, map data, points of interest (POI), and other supplementary information. The existing location privacy protection techniques seldom consider the supplementary information held by attackers and usually only generate a single cloaking region according to the protected location point, and the query efficiency is relatively low. In this paper, we improve the existing LBSs system framework, in which we generate double cloaking regions by constraining the supplementary information, and k-anonymity is then achieved by the cooperation of the double cloaking regions; specifically, k dummy points at fixed dummy positions in the double cloaking regions are generated and the LBSs query is then performed. Finally, the effectiveness of the proposed method is verified by experiments on real datasets.
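
    As a rough illustration of the k-anonymity step described above, and not of the paper's actual algorithm (which fixes dummy positions inside double cloaking regions derived from supplementary-information constraints), the sketch below simply hides the protected location among k candidate points drawn inside a given cloaking rectangle; the rectangle bounds and the value of k are arbitrary.

    ```python
    import random

    def k_anonymous_query_points(true_loc, cloak_rect, k, seed=None):
        """Return k points inside the cloaking rectangle: the true location
        plus k-1 dummies, shuffled so their order reveals nothing.

        cloak_rect = (min_x, min_y, max_x, max_y)."""
        rng = random.Random(seed)
        min_x, min_y, max_x, max_y = cloak_rect
        assert min_x <= true_loc[0] <= max_x and min_y <= true_loc[1] <= max_y
        points = [true_loc]
        for _ in range(k - 1):
            points.append((rng.uniform(min_x, max_x), rng.uniform(min_y, max_y)))
        rng.shuffle(points)  # the LBS cannot tell which point is real
        return points

    if __name__ == "__main__":
        for p in k_anonymous_query_points((3.2, 4.1), (0, 0, 10, 10), k=5, seed=42):
            print(p)
    ```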

  5. Efficient spatial privacy preserving scheme for sensor network

    Science.gov (United States)

    Debnath, Ashmita; Singaravelu, Pradheepkumar; Verma, Shekhar

    2013-03-01

    The privacy of sensitive events observed by a wireless sensor network (WSN) needs to be protected. Adversaries with knowledge of the sensor deployment and network protocols can infer the location of a sensed event by monitoring the communication from the sensors, even when the messages are encrypted. Encryption provides confidentiality; however, the context of the event can be used to breach the privacy of sensed objects. An adversary can track the trajectory of a moving object or determine the location of the occurrence of a critical event to breach its privacy. In this paper, we propose using ring signatures to obfuscate the spatial information. Firstly, the extended region of the location of an event of interest, as estimated from a sensor communication, is presented. Then, the increase in this region of spatial uncertainty due to the effect of the ring signature is determined. We observe that ring signatures can effectively enhance the region of location uncertainty of a sensed event. As the event of interest can be situated anywhere in the enhanced region of uncertainty, its privacy against a local or global adversary is ensured. Both analytical and simulation results show that the induced delay and throughput overhead are insignificant, with negligible impact on the performance of a WSN.

  6. Big Brother’s Little Helpers: The Right to Privacy and the Responsibility of Internet Service Providers

    Directory of Open Access Journals (Sweden)

    Yael Ronen

    2015-02-01

    Full Text Available Following the 2013 revelations on the extent of intelligence gathering through internet service providers, this article concerns the responsibility of internet service providers (ISPs involved in disclosure of personal data to government authorities under the right to privacy, by reference to the developing, non-binding standards applied to businesses under the Protect, Respect and Remedy Framework. The article examines the manner in which the Framework applies to ISPs and looks at measures that ISPs can take to fulfil their responsibility to respect the right to privacy. It utilizes the challenges to the right to privacy to discuss some aspects of the extension of human rights responsibilities to corporations. These include the respective roles of government and non-state actors, the extent to which corporations may be required to act proactively in order to protect the privacy of clients, and the relevance of transnational activity.

  7. Reward-based spatial crowdsourcing with differential privacy preservation

    Science.gov (United States)

    Xiong, Ping; Zhang, Lefeng; Zhu, Tianqing

    2017-11-01

    In recent years, the popularity of mobile devices has transformed spatial crowdsourcing (SC) into a novel mode for performing complicated projects. Workers can perform tasks at specified locations in return for rewards offered by employers. Existing methods ensure the efficiency of their systems by submitting the workers' exact locations to a centralised server for task assignment, which can lead to privacy violations. Thus, implementing crowdsourcing applications while preserving the privacy of workers' locations is a key issue that needs to be tackled. We propose a reward-based SC method that achieves acceptable utility, as measured by task assignment success rates, while efficiently preserving privacy. A differential privacy model ensures a rigorous privacy guarantee, and Laplace noise is introduced to protect workers' exact locations. We then present a reward allocation mechanism that adjusts each piece of the reward for a task using the distribution of the workers' locations. Through experimental results, we demonstrate that this optimised-reward method is efficient for SC applications.
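
    The abstract above mentions Laplace noise for protecting workers' exact locations under a differential privacy model. A minimal sketch of per-coordinate Laplace perturbation is shown below; the epsilon and sensitivity values are placeholders, and the paper's actual mechanism and reward-allocation step may differ from this simplification.

    ```python
    import math
    import random

    def laplace_sample(scale, rng):
        """Draw Laplace(0, scale) noise via the inverse CDF."""
        u = rng.random() - 0.5
        return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

    def perturb_location(x, y, epsilon, sensitivity=1.0, rng=None):
        """Add independent Laplace noise to each coordinate.

        scale = sensitivity / epsilon is the standard Laplace-mechanism scale;
        epsilon and sensitivity here are illustrative placeholders."""
        rng = rng or random.Random()
        scale = sensitivity / epsilon
        return x + laplace_sample(scale, rng), y + laplace_sample(scale, rng)

    if __name__ == "__main__":
        rng = random.Random(7)
        print("reported location:", perturb_location(35.02, 135.76, epsilon=0.5, rng=rng))
    ```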

  8. Exploiting Proximity-Based Mobile Apps for Large-Scale Location Privacy Probing

    Directory of Open Access Journals (Sweden)

    Shuang Zhao

    2018-01-01

    Full Text Available Proximity-based apps have been changing the way people interact with each other in the physical world. To help people extend their social networks, proximity-based nearby-stranger (NS) apps that encourage people to make friends with nearby strangers have gained popularity recently. As another typical type of proximity-based apps, some ridesharing (RS) apps allowing drivers to search nearby passengers and get their ridesharing requests have also become popular due to their contribution to economy and emission reduction. In this paper, we concentrate on the location privacy of proximity-based mobile apps. By analyzing the communication mechanism, we find that many apps of this type are vulnerable to large-scale location spoofing attack (LLSA). We accordingly propose three approaches to performing LLSA. To evaluate the threat of LLSA posed to proximity-based mobile apps, we perform real-world case studies against an NS app named Weibo and an RS app called Didi. The results show that our approaches can effectively and automatically collect a huge volume of users’ locations or travel records, thereby demonstrating the severity of LLSA. We apply the LLSA approaches against nine popular proximity-based apps with millions of installations to evaluate the defense strength. We finally suggest possible countermeasures for the proposed attacks.
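
    The record does not spell out the probing mechanics, but one common way a large-scale location spoofing attack can pin down a user is to query the service from several spoofed positions and then trilaterate from the distances the app reports. The sketch below is purely illustrative (flat 2-D plane, exact distances, invented coordinates) and is not taken from the paper.

    ```python
    def trilaterate(anchors, dists):
        """Estimate a point on a 2-D plane from three (x, y) anchors and the
        distances reported from each spoofed position, by solving the linear
        system obtained when the first circle equation is subtracted from
        the other two."""
        (x1, y1), (x2, y2), (x3, y3) = anchors
        d1, d2, d3 = dists
        a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
        a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
        b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
        b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
        det = a11 * a22 - a12 * a21
        return (b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det

    if __name__ == "__main__":
        target = (4.0, 7.0)                       # unknown to the attacker
        anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]   # spoofed positions
        dists = [((target[0] - ax) ** 2 + (target[1] - ay) ** 2) ** 0.5
                 for ax, ay in anchors]
        print(trilaterate(anchors, dists))        # recovers ~ (4.0, 7.0)
    ```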

  9. A Systematic Review of Research Studies Examining Telehealth Privacy and Security Practices Used By Healthcare Providers

    Directory of Open Access Journals (Sweden)

    Valerie J.M. Watzlaf

    2017-11-01

    Full Text Available The objective of this systematic review was to systematically review papers in the United States that examine current practices in privacy and security when telehealth technologies are used by healthcare providers. A literature search was conducted using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses Protocols (PRISMA-P. PubMed, CINAHL and INSPEC from 2003 – 2016 were searched and returned 25,404 papers (after duplications were removed. Inclusion and exclusion criteria were strictly followed to examine title, abstract, and full text for 21 published papers which reported on privacy and security practices used by healthcare providers using telehealth.  Data on confidentiality, integrity, privacy, informed consent, access control, availability, retention, encryption, and authentication were all searched and retrieved from the papers examined. Papers were selected by two independent reviewers, first per inclusion/exclusion criteria and, where there was disagreement, a third reviewer was consulted. The percentage of agreement and Cohen’s kappa was 99.04% and 0.7331 respectively. The papers reviewed ranged from 2004 to 2016 and included several types of telehealth specialties. Sixty-seven percent were policy type studies, and 14 percent were survey/interview studies. There were no randomized controlled trials. Based upon the results, we conclude that it is necessary to have more studies with specific information about the use of privacy and security practices when using telehealth technologies as well as studies that examine patient and provider preferences on how data is kept private and secure during and after telehealth sessions. Keywords: Computer security, Health personnel, Privacy, Systematic review, Telehealth

  10. Parents' and providers' attitudes toward school-located provision and school-entry requirements for HPV vaccines.

    Science.gov (United States)

    Vercruysse, Jessica; Chigurupati, Nagasudha L; Fung, Leslie; Apte, Gauri; Pierre-Joseph, Natalie; Perkins, Rebecca B

    2016-06-02

    To determine parents' and providers' attitudes toward school-located provision and school-entry requirements for HPV vaccination. Parents/guardians of 11-17 y old girls and pediatric healthcare providers at one inner-city public clinic and three private practices completed semi-structured interviews in 2012-2013. Participants were asked open-ended questions regarding their attitudes toward school-located provision and school-entry requirements for HPV vaccination. Parents' answers were analyzed with relationship to whether their daughters had not initiated, initiated but not completed, or completed the HPV vaccine series. Qualitative analysis was used to identify themes related to shared views. 129 parents/guardians and 34 providers participated. 61% of parents supported providing HPV vaccinations in schools, citing reasons of convenience, improved access, and positive peer pressure. Those who opposed school-located provision raised concerns related to privacy and the capacity of school nurses to manage vaccine-related reactions. Parents whose daughters had not completed the series were more likely to intend to vaccinate their daughters in schools (70%) and support requirements (64%) than parents who had not initiated vaccination (42% would vaccinate at school, 46% support requirements) or completed the series (42% would vaccinate at school, 32% support requirements; p [...]). School-located provision thus appears most relevant to parents whose children have not completed the series, indicating that this venue might be a valuable addition to improve completion rates. Support for school-entry requirements was limited among both parents and healthcare providers.

  11. Enhanced Internet Mobility and Privacy Using Public Cloud

    Directory of Open Access Journals (Sweden)

    Ping Zhang

    2017-01-01

    Full Text Available Internet mobile users are increasingly concerned about their privacy nowadays, as both research and real-world incidents show that leaking of communication and location privacy can lead to serious consequences, and many research works have been done to anonymize individual users in aggregated location data. However, the communication itself between the mobile users and their peers or websites can leak considerable private information about the mobile users, such as location history, to other parties. In this paper, we investigate the potential privacy risk of mobile Internet users and propose a scalable system built on top of public cloud services that can hide a mobile user’s network location and traffic from communication peers. This system creates a dynamic distributed proxy network for each mobile user to minimize performance overhead and operation cost.

  12. Privacy and Ethics in Undergraduate GIS Curricula

    Science.gov (United States)

    Scull, Peter; Burnett, Adam; Dolfi, Emmalee; Goldfarb, Ali; Baum, Peter

    2016-01-01

    The development of location-aware technologies, such as smartphones, raises serious questions regarding locational privacy and the ethical use of geographic data. The degree to which these concepts are taught in undergraduate geographic information science (GISci) courses is unknown. A survey of GISci educators shows that issues of privacy and…

  13. Privacy vs security

    CERN Document Server

    Stalla-Bourdillon, Sophie; Ryan, Mark D

    2014-01-01

    Securing privacy in the current environment is one of the great challenges of today's democracies. Privacy vs. Security explores the issues of privacy and security and their complicated interplay, from a legal and a technical point of view. Sophie Stalla-Bourdillon provides a thorough account of the legal underpinnings of the European approach to privacy and examines their implementation through privacy, data protection and data retention laws. Joshua Philips and Mark D. Ryan focus on the technological aspects of privacy, in particular, on today's attacks on privacy by the simple use of today'

  14. PrivateRide: A Privacy-Enhanced Ride-Hailing Service

    Directory of Open Access Journals (Sweden)

    Pham Anh

    2017-04-01

    Full Text Available In the past few years, we have witnessed a rise in the popularity of ride-hailing services (RHSs, an online marketplace that enables accredited drivers to use their own cars to drive ride-hailing users. Unlike other transportation services, RHSs raise significant privacy concerns, as providers are able to track the precise mobility patterns of millions of riders worldwide. We present the first survey and analysis of the privacy threats in RHSs. Our analysis exposes high-risk privacy threats that do not occur in conventional taxi services. Therefore, we propose PrivateRide, a privacy-enhancing and practical solution that offers anonymity and location privacy for riders, and protects drivers’ information from harvesting attacks. PrivateRide lowers the high-risk privacy threats in RHSs to a level that is at least as low as that of many taxi services. Using real data-sets from Uber and taxi rides, we show that PrivateRide significantly enhances riders’ privacy, while preserving tangible accuracy in ride matching and fare calculation, with only negligible effects on convenience. Moreover, by using our Android implementation for experimental evaluations, we show that PrivateRide’s overhead during ride setup is negligible. In short, we enable privacy-conscious riders to achieve levels of privacy that are not possible in current RHSs and even in some conventional taxi services, thereby offering a potential business differentiator.

  15. A hybrid technique for private location-based queries with database protection

    KAUST Repository

    Ghinita, Gabriel

    2009-01-01

    Mobile devices with global positioning capabilities allow users to retrieve points of interest (POI) in their proximity. To protect user privacy, it is important not to disclose exact user coordinates to un-trusted entities that provide location-based services. Currently, there are two main approaches to protect the location privacy of users: (i) hiding locations inside cloaking regions (CRs) and (ii) encrypting location data using private information retrieval (PIR) protocols. Previous work focused on finding good trade-offs between privacy and performance of user protection techniques, but disregarded the important issue of protecting the POI dataset D. For instance, location cloaking requires large-sized CRs, leading to excessive disclosure of POIs (O(|D|) in the worst case). PIR, on the other hand, reduces this bound to [...], but at the expense of high processing and communication overhead. We propose a hybrid, two-step approach to private location-based queries, which provides protection for both the users and the database. In the first step, user locations are generalized to coarse-grained CRs which provide strong privacy. Next, a PIR protocol is applied with respect to the obtained query CR. To protect excessive disclosure of POI locations, we devise a cryptographic protocol that privately evaluates whether a point is enclosed inside a rectangular region. We also introduce an algorithm to efficiently support PIR on dynamic POI sub-sets. Our method discloses O(1) POI, orders of magnitude fewer than CR- or PIR-based techniques. Experimental results show that the hybrid approach is scalable in practice, and clearly outperforms the pure-PIR approach in terms of computational and communication overhead. © 2009 Springer Berlin Heidelberg.
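
    To illustrate the first step of the hybrid approach described above, namely generalizing a precise location to a coarse-grained cloaking region, the sketch below snaps a coordinate to the bounding box of a fixed grid cell; the grid size is arbitrary and the PIR step of the paper is not shown.

    ```python
    import math

    def cloak_to_grid_cell(lat, lon, cell_deg=0.05):
        """Generalize a precise (lat, lon) to the bounding box of the fixed
        grid cell containing it; coarser cells give stronger privacy but a
        larger query CR. cell_deg is an illustrative cell size in degrees."""
        row = math.floor(lat / cell_deg)
        col = math.floor(lon / cell_deg)
        return (row * cell_deg, col * cell_deg,              # south-west corner
                (row + 1) * cell_deg, (col + 1) * cell_deg)  # north-east corner

    if __name__ == "__main__":
        cr = cloak_to_grid_cell(47.6205, -122.3493)
        print("coarse-grained CR handed to the PIR step:", cr)
    ```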

  16. Conundrums with penumbras: the right to privacy encompasses non-gamete providers who create preembryos with the intent to become parents.

    Science.gov (United States)

    Dillon, Lainie M C

    2003-05-01

    To date, five state high courts have resolved disputes over frozen preembryos. These disputes arose during divorce proceedings between couples who had previously used assisted reproduction and cryopreserved excess preembryos. In each case, one spouse wished to have the preembryos destroyed, while the other wanted to be able to use or donate them in the future. The parties in these cases invoked the constitutional right to privacy to argue for dispositional control over the preembryos; two of the five cases were resolved by relying on this right. The constitutional right to privacy protects intimate decisions involving procreation, marriage, and family life. However, when couples use donated sperm or ova to create preembryos, a unique circumstance arises: one spouse--the gamete provider--is genetically related to the preembryos and the other is not. If courts resolve frozen preembryo disputes that involve non-gamete providers based on the constitutional right to privacy, they should find that the constitutional right to privacy encompasses the interests of both gamete and non-gamete providers. Individuals who create preembryos with the intent to become a parent have made an intimate decision involving procreation, marriage, and family life that falls squarely within the right to privacy. In such cases, the couple together made the decision to create a family through the use of assisted reproduction, and the preembryos would not exist but for that joint decision. Therefore, gamete and non-gamete providers should be afforded equal constitutional protection in disputes over frozen preembryos.

  17. Do Privacy Concerns Matter for Millennials?

    DEFF Research Database (Denmark)

    Fodor, Mark; Brem, Alexander

    2015-01-01

    [...] data have raised the question of whether location data are considered sensitive data by users. Thus, we use two privacy concern models, namely Concern for Information Privacy (CFIP) and Internet Users’ Information Privacy Concerns (IUIPC), to find out. Our sample comprises 235 individuals between 18 and 34 years (Generation C) from Germany. The results of this study indicate that the second-order factor IUIPC showed better fit for the underlying data than CFIP did. Overall privacy concerns have been found to have an impact on behavioral intentions of users for LBS adoption. Furthermore, other risk...

  18. Crowdsourcing for Context: Regarding Privacy in Beacon Encounters via Contextual Integrity

    Directory of Open Access Journals (Sweden)

    Bello-Ogunu Emmanuel

    2016-07-01

    Full Text Available Research shows that context is important to the privacy perceptions associated with technology. With Bluetooth Low Energy beacons, one of the latest technologies for providing proximity and indoor tracking, the current identifiers that characterize a beacon are not sufficient for ordinary users to make informed privacy decisions about the location information that could be shared. One solution would be to have standardized category and privacy labels, produced by beacon providers or an independent third party. An alternative solution is to find an approach driven by users, for users. In this paper, we propose a novel crowdsourcing-based approach to introduce elements of context in beacon encounters. We demonstrate the effectiveness of this approach through a user study, where participants use a crowd-based mobile app designed to collect beacon category and privacy information as a scavenger hunt game. Results show that our approach was effective in helping users label beacons according to the specific context of a given beacon encounter, as well as the privacy perceptions associated with it. This labeling was done with an accuracy of 92%, and with an acceptance rate of 82% of all recommended crowd labels. Lastly, we conclusively show how crowdsourcing for context can be used towards a user-centric framework for privacy management during beacon encounters.

  19. Health Records and the Cloud Computing Paradigm from a Privacy Perspective

    Directory of Open Access Journals (Sweden)

    Christian Stingl

    2011-01-01

    Full Text Available With the advent of cloud computing, the realization of highly available electronic health records providing location-independent access seems very promising. However, cloud computing raises major security issues that need to be addressed, particularly within the health care domain, and the protection of the privacy of individuals often seems to be left on the sidelines. For instance, common protection against malicious insiders, i.e., non-disclosure agreements, is purely organizational. Clearly, such measures cannot prevent misuse but can at least discourage it. In this paper, we present an approach to storing highly sensitive health data in the cloud in which the protection of patients' privacy is based exclusively on technical measures, so that users and providers of health records do not need to trust the cloud provider with privacy-related issues. Our technical measures comprise anonymous communication and authentication, anonymous yet authorized transactions, and pseudonymization of databases.

  20. User Privacy in RFID Networks

    Science.gov (United States)

    Singelée, Dave; Seys, Stefaan

    Wireless RFID networks are getting deployed at a rapid pace and have already entered the public space on a massive scale: public transport cards, the biometric passport, office ID tokens, customer loyalty cards, etc. Although RFID technology offers interesting services to customers and retailers, it could also endanger the privacy of the end-users. The lack of protection mechanisms being deployed could potentially result in a privacy leakage of personal data. Furthermore, there is the emerging threat of location privacy. In this paper, we show some practical attack scenarios and illustrate some of them with cases that have received press coverage. We present the main challenges of enhancing privacy in RFID networks and evaluate some solutions proposed in the literature. The main advantages and shortcomings are briefly discussed. Finally, we give an overview of some academic and industrial research initiatives on RFID privacy.

  1. Using Extracted Behavioral Features to Improve Privacy for Shared Route Tracks

    DEFF Research Database (Denmark)

    Andersen, Mads Schaarup; Kjærgaard, Mikkel Baun; Grønbæk, Kaj

    2012-01-01

    [...] In this paper, we present the concept of privacy by substitution that addresses the problem without degrading service quality by substituting location tracks with less privacy invasive behavioral data extracted from raw tracks of location data or other sensing data. We explore this concept by designing and implementing TracM, a track-based community service for runners to share and compare their running performance. We show how such a service can be implemented by substituting location tracks with less privacy invasive behavioral data. Furthermore, we discuss the lessons learned from building TracM and discuss...

  2. A Distance Bounding Protocol for Location-Cloaked Applications.

    Science.gov (United States)

    Molina-Martínez, Cristián; Galdames, Patricio; Duran-Faundez, Cristian

    2018-04-26

    Location-based services (LBSs) assume that users are willing to release trustworthy and useful details about their whereabouts. However, many location privacy concerns have arisen. For location privacy protection, several algorithms build a cloaking region to hide a user’s location. However, many applications may not operate adequately on cloaked locations. For example, a traditional distance bounding protocol (DBP)—which is run by two nodes called the prover and the verifier—may conclude an untight and useless distance between these two entities. An LBS (verifier) may use this distance as a metric of usefulness and trustworthiness of the location claimed by the user (prover). However, we show that if a tight distance is desired, traditional DBP can refine a user’s cloaked location and compromise its location privacy. To find a proper balance, we propose a location-privacy-aware DBP protocol. Our solution consists of adding some small delays before submitting any user’s response. We show that several issues arise when a certain delay is chosen, and we propose some solutions. The effectiveness of our techniques in balancing location refinement and utility is demonstrated through simulation.

  3. A Distance Bounding Protocol for Location-Cloaked Applications

    Directory of Open Access Journals (Sweden)

    Cristián Molina-Martínez

    2018-04-01

    Full Text Available Location-based services (LBSs) assume that users are willing to release trustworthy and useful details about their whereabouts. However, many location privacy concerns have arisen. For location privacy protection, several algorithms build a cloaking region to hide a user’s location. However, many applications may not operate adequately on cloaked locations. For example, a traditional distance bounding protocol (DBP)—which is run by two nodes called the prover and the verifier—may conclude an untight and useless distance between these two entities. An LBS (verifier) may use this distance as a metric of usefulness and trustworthiness of the location claimed by the user (prover). However, we show that if a tight distance is desired, traditional DBP can refine a user’s cloaked location and compromise its location privacy. To find a proper balance, we propose a location-privacy-aware DBP protocol. Our solution consists of adding some small delays before submitting any user’s response. We show that several issues arise when a certain delay is chosen, and we propose some solutions. The effectiveness of our techniques in balancing location refinement and utility is demonstrated through simulation.
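
    The balance proposed in this record rests on the prover deliberately delaying its response so that the verifier's round-trip-time estimate cannot be tighter than the cloaking region. The toy numeric sketch below (idealized speed-of-light ranging, invented radius and distance values, not the authors' protocol) shows how a chosen delay inflates the inferred distance up to the cloaking radius.

    ```python
    C = 299_792_458.0  # speed of light, m/s

    def measured_distance(true_distance_m, added_delay_s=0.0):
        """Distance the verifier infers from round-trip time in an ideal DBP:
        rtt = 2*d/c + delay, so the estimate grows by c*delay/2."""
        rtt = 2 * true_distance_m / C + added_delay_s
        return C * rtt / 2

    def delay_for_cloak_radius(true_distance_m, cloak_radius_m):
        """Smallest artificial delay so the inferred distance is no tighter
        than the cloaking radius, i.e. it does not refine the cloaked area."""
        slack = max(0.0, cloak_radius_m - true_distance_m)
        return 2 * slack / C

    if __name__ == "__main__":
        d_true, r_cloak = 120.0, 500.0            # metres (invented values)
        delay = delay_for_cloak_radius(d_true, r_cloak)
        print(f"added delay: {delay * 1e6:.3f} microseconds")
        print("inferred distance:", round(measured_distance(d_true, delay), 1), "m")
    ```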

  4. Location-Based Services and Privacy Protection Under Mobile Cloud Computing

    OpenAIRE

    Yan, Yan; Xiaohong, Hao; Wanjun, Wang

    2015-01-01

    Location-based services can provide personalized services based on location information of moving objects and have already been widely used in public safety services, transportation, entertainment and many other areas. With the rapid development of mobile communication technology and popularization of intelligent terminals, there will be great commercial prospects to provide location-based services under mobile cloud computing environment. However, the high adhesion degree of mobile terminals...

  5. Musings on privacy issues in health research involving disaggregate geographic data about individuals.

    Science.gov (United States)

    Boulos, Maged N Kamel; Curtis, Andrew J; Abdelmalik, Philip

    2009-07-20

    This paper offers a state-of-the-art overview of the intertwined privacy, confidentiality, and security issues that are commonly encountered in health research involving disaggregate geographic data about individuals. Key definitions are provided, along with some examples of actual and potential security and confidentiality breaches and related incidents that captured mainstream media and public interest in recent months and years. The paper then goes on to present a brief survey of the research literature on location privacy/confidentiality concerns and on privacy-preserving solutions in conventional health research and beyond, touching on the emerging privacy issues associated with online consumer geoinformatics and location-based services. The 'missing ring' (in many treatments of the topic) of data security is also discussed. Personal information and privacy legislations in two countries, Canada and the UK, are covered, as well as some examples of recent research projects and events about the subject. Select highlights from a June 2009 URISA (Urban and Regional Information Systems Association) workshop entitled 'Protecting Privacy and Confidentiality of Geographic Data in Health Research' are then presented. The paper concludes by briefly charting the complexity of the domain and the many challenges associated with it, and proposing a novel, 'one stop shop' case-based reasoning framework to streamline the provision of clear and individualised guidance for the design and approval of new research projects (involving geographical identifiers about individuals), including crisp recommendations on which specific privacy-preserving solutions and approaches would be suitable in each case.

  6. Privacy in Social Networks

    CERN Document Server

    Zheleva, Elena

    2012-01-01

    This synthesis lecture provides a survey of work on privacy in online social networks (OSNs). This work encompasses concerns of users as well as service providers and third parties. Our goal is to approach such concerns from a computer-science perspective, and building upon existing work on privacy, security, statistical modeling and databases to provide an overview of the technical and algorithmic issues related to privacy in OSNs. We start our survey by introducing a simple OSN data model and describe common statistical-inference techniques that can be used to infer potentially sensitive inf

  7. Gender and online privacy among teens: risk perception, privacy concerns, and protection behaviors.

    Science.gov (United States)

    Youn, Seounmi; Hall, Kimberly

    2008-12-01

    Survey data from 395 high school students revealed that girls perceive more privacy risks and have a higher level of privacy concerns than boys. Regarding privacy protection behaviors, boys tended to read unsolicited e-mail and register for Web sites while directly sending complaints in response to unsolicited e-mail. This study found girls to provide inaccurate information as their privacy concerns increased. Boys, however, refrained from registering to Web sites as their concerns increased.

  8. Context-Aware Generative Adversarial Privacy

    Directory of Open Access Journals (Sweden)

    Chong Huang

    2017-12-01

    Full Text Available Preserving the utility of published datasets while simultaneously providing provable privacy guarantees is a well-known challenge. On the one hand, context-free privacy solutions, such as differential privacy, provide strong privacy guarantees, but often lead to a significant reduction in utility. On the other hand, context-aware privacy solutions, such as information theoretic privacy, achieve an improved privacy-utility tradeoff, but assume that the data holder has access to dataset statistics. We circumvent these limitations by introducing a novel context-aware privacy framework called generative adversarial privacy (GAP). GAP leverages recent advancements in generative adversarial networks (GANs) to allow the data holder to learn privatization schemes from the dataset itself. Under GAP, learning the privacy mechanism is formulated as a constrained minimax game between two players: a privatizer that sanitizes the dataset in a way that limits the risk of inference attacks on the individuals’ private variables, and an adversary that tries to infer the private variables from the sanitized dataset. To evaluate GAP’s performance, we investigate two simple (yet canonical) statistical dataset models: (a) the binary data model; and (b) the binary Gaussian mixture model. For both models, we derive game-theoretically optimal minimax privacy mechanisms, and show that the privacy mechanisms learned from data (in a generative adversarial fashion) match the theoretically optimal ones. This demonstrates that our framework can be easily applied in practice, even in the absence of dataset statistics.

  9. Context-Aware Generative Adversarial Privacy

    Science.gov (United States)

    Huang, Chong; Kairouz, Peter; Chen, Xiao; Sankar, Lalitha; Rajagopal, Ram

    2017-12-01

    Preserving the utility of published datasets while simultaneously providing provable privacy guarantees is a well-known challenge. On the one hand, context-free privacy solutions, such as differential privacy, provide strong privacy guarantees, but often lead to a significant reduction in utility. On the other hand, context-aware privacy solutions, such as information theoretic privacy, achieve an improved privacy-utility tradeoff, but assume that the data holder has access to dataset statistics. We circumvent these limitations by introducing a novel context-aware privacy framework called generative adversarial privacy (GAP). GAP leverages recent advancements in generative adversarial networks (GANs) to allow the data holder to learn privatization schemes from the dataset itself. Under GAP, learning the privacy mechanism is formulated as a constrained minimax game between two players: a privatizer that sanitizes the dataset in a way that limits the risk of inference attacks on the individuals' private variables, and an adversary that tries to infer the private variables from the sanitized dataset. To evaluate GAP's performance, we investigate two simple (yet canonical) statistical dataset models: (a) the binary data model, and (b) the binary Gaussian mixture model. For both models, we derive game-theoretically optimal minimax privacy mechanisms, and show that the privacy mechanisms learned from data (in a generative adversarial fashion) match the theoretically optimal ones. This demonstrates that our framework can be easily applied in practice, even in the absence of dataset statistics.
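
    One common way to write the constrained minimax game described in the two GAP records above (our paraphrase under assumed notation, not a formula quoted from the papers) is the following, where a privatizer g releases a sanitized version g(X) of the data X, an adversary h tries to infer the private variable Y from it, \ell is the adversary's loss, d is a distortion measure, and D is a utility (distortion) budget:

    ```latex
    \min_{g}\ \max_{h}\ -\,\mathbb{E}_{X,Y}\bigl[\ell\bigl(h(g(X)),\,Y\bigr)\bigr]
    \qquad \text{subject to} \qquad \mathbb{E}_{X}\bigl[d\bigl(g(X),\,X\bigr)\bigr] \le D .
    ```

    The adversary maximizes its inference accuracy (minimizes its loss), while the privatizer chooses the sanitization that makes the best such adversary perform as poorly as possible within the distortion budget.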

  10. Privacy and internet services

    OpenAIRE

    Samec, Marek

    2010-01-01

    This thesis is focused on the privacy of internet service users. The goal of this thesis is to determine the level of user awareness of how their privacy is handled when they use internet services, and then to suggest procedures that will improve this awareness or lead to better control over individual privacy. In the theoretical part, I analyze general and legislative approaches to privacy, followed by an analysis of the behaviour of internet service users and providers. Part of this analysis deals with the usage of web cookies ...

  11. Users Behavior in Location-Aware Services: Digital Natives versus Digital Immigrants

    Directory of Open Access Journals (Sweden)

    Marco Furini

    2014-01-01

    Full Text Available Location-aware services may expose users to privacy risks as they usually attach the user’s location to the generated contents. Different studies have focused on privacy in location-aware services, but the results are often conflicting. Our hypothesis is that users are not fully aware of the features of the location-aware scenario and that this lack of knowledge affects the results. Hence, in this paper we present a different approach: the analysis is conducted on two different groups of users (digital natives and digital immigrants) and is divided into two steps: (i) understanding users’ knowledge of a location-aware scenario and (ii) investigating users’ opinion toward location-aware services after showing them an example of an effective location-aware service able to extract personal and sensitive information from contents publicly available in social media platforms. The analysis reveals that there is a relation between users’ knowledge and users’ concerns toward privacy in location-aware services and also reveals that digital natives are more interested in the location-aware scenario than digital immigrants. The analysis also discloses that users’ concerns toward these services may be ameliorated if these services ask for users’ authorization and provide benefits to users. Other interesting findings allow us to draw guidelines that might be helpful in developing effective location-aware services.

  12. Reliable Collaborative Filtering on Spatio-Temporal Privacy Data

    Directory of Open Access Journals (Sweden)

    Zhen Liu

    2017-01-01

    Full Text Available Lots of multilayer information, such as spatio-temporal privacy check-in data, is accumulated in location-based social networks (LBSNs). When using a collaborative filtering algorithm for LBSN location recommendation, one of the core issues is how to improve recommendation performance by combining the traditional algorithm with this multilayer information. The existing approaches to collaborative filtering use only the sparse user-item rating matrix, which entails high computational complexity and inaccurate results. A novel collaborative filtering-based location recommendation algorithm called LGP-CF, which takes spatio-temporal privacy information into account, is proposed in this paper. By mining users' check-in behavior patterns, the dataset is segmented semantically to reduce the data size that needs to be computed. Then a clustering algorithm is used to obtain and narrow the set of similar users. A user-location bipartite graph is modeled using the filtered similar user set, so that LGP-CF can quickly locate the locations and trajectories of users through message propagation and aggregation over the graph. By calculating user similarity from the spatio-temporal privacy data on the graph, we can finally calculate the ratings of recommendable locations. Experimental results on physical clusters indicate that, compared with the existing algorithms, the proposed LGP-CF algorithm can make recommendations more accurately.

  13. Musings on privacy issues in health research involving disaggregate geographic data about individuals

    Directory of Open Access Journals (Sweden)

    AbdelMalik Philip

    2009-07-01

    Full Text Available This paper offers a state-of-the-art overview of the intertwined privacy, confidentiality, and security issues that are commonly encountered in health research involving disaggregate geographic data about individuals. Key definitions are provided, along with some examples of actual and potential security and confidentiality breaches and related incidents that captured mainstream media and public interest in recent months and years. The paper then goes on to present a brief survey of the research literature on location privacy/confidentiality concerns and on privacy-preserving solutions in conventional health research and beyond, touching on the emerging privacy issues associated with online consumer geoinformatics and location-based services. The 'missing ring' (in many treatments of the topic) of data security is also discussed. Personal information and privacy legislations in two countries, Canada and the UK, are covered, as well as some examples of recent research projects and events about the subject. Select highlights from a June 2009 URISA (Urban and Regional Information Systems Association) workshop entitled 'Protecting Privacy and Confidentiality of Geographic Data in Health Research' are then presented. The paper concludes by briefly charting the complexity of the domain and the many challenges associated with it, and proposing a novel, 'one stop shop' case-based reasoning framework to streamline the provision of clear and individualised guidance for the design and approval of new research projects (involving geographical identifiers about individuals), including crisp recommendations on which specific privacy-preserving solutions and approaches would be suitable in each case.

  14. Damage Detection/Locating System Providing Thermal Protection

    Science.gov (United States)

    Woodard, Stanley E. (Inventor); Jones, Thomas W. (Inventor); Taylor, Bryant D. (Inventor); Qamar, A. Shams (Inventor)

    2010-01-01

    A damage locating system also provides thermal protection. An array of sensors substantially tiles an area of interest. Each sensor is a reflective-surface conductor having operatively coupled inductance and capacitance. A magnetic field response recorder is provided to interrogate each sensor before and after a damage condition. Changes in response are indicative of damage and a corresponding location thereof.

  15. Because we care: Privacy Dashboard on Firefox OS

    OpenAIRE

    Piekarska, Marta; Zhou, Yun; Strohmeier, Dominik; Raake, Alexander

    2015-01-01

    In this paper we present the Privacy Dashboard -- a tool designed to inform and empower the people using mobile devices, by introducing features such as Remote Privacy Protection, Backup, Adjustable Location Accuracy, Permission Control and Secondary-User Mode. We have implemented our solution on FirefoxOS and conducted user studies to verify the usefulness and usability of our tool. The paper starts with a discussion of different aspects of mobile privacy, how users perceive it and how much ...

  16. Where Dating Meets Data: Investigating Social and Institutional Privacy Concerns on Tinder

    Directory of Open Access Journals (Sweden)

    Christoph Lutz

    2017-03-01

    Full Text Available The widespread diffusion of location-based real-time dating or mobile dating apps, such as Tinder and Grindr, is changing dating practices. The affordances of these dating apps differ from those of “old school” dating sites, for example, by privileging picture-based selection, minimizing room for textual self-description, and drawing upon existing Facebook profile data. They might also affect users’ privacy perceptions as these services are location based and often include personal conversations and data. Based on a survey collected via Mechanical Turk, we assess how Tinder users perceive privacy concerns. We find that the users are more concerned about institutional privacy than social privacy. Moreover, different motivations for using Tinder—hooking up, relationship, friendship, travel, self-validation, and entertainment—affect social privacy concerns more strongly than institutional concerns. Finally, loneliness significantly increases users’ social and institutional privacy concerns, while narcissism decreases them.

  17. Privacy and Data-Based Research

    OpenAIRE

    Ori Heffetz; Katrina Ligett

    2013-01-01

    What can we, as users of microdata, formally guarantee to the individuals (or firms) in our dataset, regarding their privacy? We retell a few stories, well-known in data-privacy circles, of failed anonymization attempts in publicly released datasets. We then provide a mostly informal introduction to several ideas from the literature on differential privacy, an active literature in computer science that studies formal approaches to preserving the privacy of individuals in statistical databases...

  18. Private Sharing of User Location over Online Social Networks

    OpenAIRE

    Freudiger, Julien; Neu, Raoul; Hubaux, Jean-Pierre

    2010-01-01

    Online social networks increasingly allow mobile users to share their location with their friends. Much to the detriment of users’ privacy, this also means that social network operators collect users’ location. Similarly, third parties can learn users’ location from localization and location visualization services. Ideally, third-parties should not be given complete access to users’ location. To protect location privacy, we design and implement a platform-independent solution for users to s...

  19. Adding query privacy to robust DHTs

    DEFF Research Database (Denmark)

    Backes, Michael; Goldberg, Ian; Kate, Aniket

    2012-01-01

    [...] intermediate peers that (help to) route the queries towards their destinations. In this paper, we satisfy this requirement by presenting an approach for providing privacy for the keys in DHT queries. We use the concept of oblivious transfer (OT) in communication over DHTs to preserve query privacy without [...] privacy over robust DHTs. Finally, we compare the performance of our privacy-preserving protocols with their more privacy-invasive counterparts. We observe that there is no increase in the message complexity...

  20. Information privacy fundamentals for librarians and information professionals

    CERN Document Server

    Givens, Cherie L

    2014-01-01

    This book introduces library and information professionals to information privacy, provides an overview of information privacy in the library and information science context, U.S. privacy laws by sector, information privacy policy, and key considerations when planning and creating a privacy program.

  1. Big Brother’s Little Helpers: The Right to Privacy and the Responsibility of Internet Service Providers

    OpenAIRE

    Ronen, Yael

    2015-01-01

    Following the 2013 revelations on the extent of intelligence gathering through internet service providers, this article concerns the responsibility of internet service providers (ISPs) involved in disclosure of personal data to government authorities under the right to privacy, by reference to the developing, non-binding standards applied to businesses under the Protect, Respect and Remedy Framework. The article examines the manner in which the Framework applies to ISPs and looks at measures ...

  2. Privacy versus autonomy: a tradeoff model for smart home monitoring technologies.

    Science.gov (United States)

    Townsend, Daphne; Knoefel, Frank; Goubran, Rafik

    2011-01-01

    Smart homes are proposed as a new location for the delivery of healthcare services. They provide healthcare monitoring and communication services by using integrated sensor network technologies. We validate a hypothesis regarding older adults' adoption of home monitoring technologies by conducting a literature review of articles studying older adults' attitudes and perceptions of sensor technologies. Using current literature to support the hypothesis, this paper applies the tradeoff model to decisions about sensor acceptance. Older adults are willing to trade privacy (by accepting a monitoring technology) for autonomy. As the information captured by the sensor becomes more intrusive and the infringement on privacy increases, sensors are accepted if the loss in privacy is traded for autonomy. Even video cameras, the most intrusive sensor type, were accepted in exchange for the greatest gain in autonomy: being able to remain in one's own home.

  3. 76 FR 52320 - Privacy Act of 1974; System of Records

    Science.gov (United States)

    2011-08-22

    ... & Privacy, and DoD Information Assurance Regulations. Auditing: Audit trail records from all available.../JS Privacy Office, Freedom of Information Directorate, Washington Headquarters Services, 1155 Defense... Defense. DHA 23 System name: Pharmacy Data Transaction Service (PDTS). System location: Primary: Emdeon...

  4. When Differential Privacy Meets Randomized Perturbation: A Hybrid Approach for Privacy-Preserving Recommender System

    KAUST Repository

    Liu, Xiao

    2017-03-21

    Privacy risks of recommender systems have attracted increasing attention. Users’ private data are often collected by potentially untrusted recommender systems in order to provide high-quality recommendations. Meanwhile, malicious attackers may utilize recommendation results to make inferences about other users’ private data. Existing approaches focus either on keeping users’ private data protected during recommendation computation or on preventing the inference of any single user’s data from the recommendation result. However, none is designed for both hiding users’ private data and preventing privacy inference. To achieve this goal, we propose in this paper a hybrid approach for privacy-preserving recommender systems that combines differential privacy (DP) with randomized perturbation (RP). We theoretically show that the noise added by RP has limited effect on recommendation accuracy and that the noise added by DP can be well controlled based on the sensitivity analysis of functions on the perturbed data. Extensive experiments on three large-scale real-world datasets show that the hybrid approach generally provides more privacy protection with acceptable recommendation accuracy loss, and surprisingly sometimes achieves better privacy without sacrificing accuracy, thus validating its feasibility in practice.
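    The abstract does not reproduce the algorithm itself. The rough Python sketch below only illustrates the general idea of combining user-side randomized perturbation with a Laplace (differential-privacy) release of an aggregate; the rating bounds, noise parameters, and the simplified sensitivity estimate are illustrative assumptions, not the paper's construction:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def randomized_perturbation(ratings, sigma=0.5):
        # RP step: mask each private rating with zero-mean Gaussian noise
        # before it leaves the user's device.
        return ratings + rng.normal(0.0, sigma, size=ratings.shape)

    def dp_item_average(perturbed_ratings, epsilon=1.0, rating_range=(1.0, 5.0)):
        # DP step: release per-item averages with Laplace noise. The sensitivity
        # bound below ignores the small RP noise tails; it is a rough illustration.
        n = perturbed_ratings.shape[0]
        sensitivity = (rating_range[1] - rating_range[0]) / n
        avg = perturbed_ratings.mean(axis=0)
        return avg + rng.laplace(0.0, sensitivity / epsilon, size=avg.shape)

    ratings = rng.integers(1, 6, size=(100, 5)).astype(float)  # 100 users x 5 items
    noisy_ratings = randomized_perturbation(ratings)
    print(dp_item_average(noisy_ratings))
    ```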

  5. Protecting patron privacy

    CERN Document Server

    Beckstrom, Matthew

    2015-01-01

    In a world where almost anyone with computer savvy can hack, track, and record the online activities of others, your library can serve as a protected haven for your visitors who rely on the Internet to conduct research-if you take the necessary steps to safeguard their privacy. This book shows you how to protect patrons' privacy while using the technology that your library provides, including public computers, Internet access, wireless networks, and other devices. Logically organized into two major sections, the first part of the book discusses why the privacy of your users is of paramount

  6. Privacy driven internet ecosystem

    OpenAIRE

    Trinh, Tuan Anh; Gyarmati, Laszlo

    2012-01-01

    The dominant business model of today's Internet is built upon advertisements; users can access Internet services while the providers show ads to them. Although significant efforts have been made to model and analyze the economic aspects of this ecosystem, the heart of the current status quo, namely privacy, has not received the attention of the research community yet. Accordingly, we propose an economic model of the privacy driven Internet ecosystem where privacy is handled as an asset that c...

  7. Trust information-based privacy architecture for ubiquitous health.

    Science.gov (United States)

    Ruotsalainen, Pekka Sakari; Blobel, Bernd; Seppälä, Antto; Nykänen, Pirkko

    2013-10-08

    Ubiquitous health is defined as a dynamic network of interconnected systems that offers health services independent of time and location to a data subject (DS). The network operates in an open and unsecured information space. It is created and managed by the DS, who sets rules that regulate the way personal health information is collected and used. Compared to traditional health care, in ubiquitous health it is impossible to assume the existence of a priori trust between the DS and service providers, or to produce privacy using static security services. In ubiquitous health, the features, business goals, and regulations that systems follow often remain unknown. Furthermore, health care-specific regulations do not govern the ways health data are processed and shared. To be successful, ubiquitous health requires a novel privacy architecture. The goal of this study was to develop a privacy management architecture that helps the DS to create and dynamically manage the network and to maintain information privacy. The architecture should enable the DS to dynamically define service- and system-specific rules that regulate the way subject data are processed. The architecture should provide the DS with reliable trust information about systems and assist in the formulation of privacy policies. Furthermore, the architecture should give feedback on how systems follow the policies of the DS and offer protection against privacy and trust threats existing in ubiquitous environments. A sequential method that combines methodologies used in system theory, systems engineering, requirement analysis, and system design was used in the study. In the first phase, principles, trust and privacy models, and viewpoints were selected. Thereafter, functional requirements and services were developed on the basis of a careful analysis of existing research published in journals and conference proceedings. Based on principles, models, and requirements, architectural components and their interconnections were developed using system

  8. Adding Query Privacy to Robust DHTs

    DEFF Research Database (Denmark)

    Backes, Michael; Goldberg, Ian; Kate, Aniket

    2011-01-01

    intermediate peers that (help to) route the queries towards their destinations. In this paper, we satisfy this requirement by presenting an approach for providing privacy for the keys in DHT queries. We use the concept of oblivious transfer (OT) in communication over DHTs to preserve query privacy without...... of obtaining query privacy over robust DHTs. Finally, we compare the performance of our privacy-preserving protocols with their more privacy-invasive counterparts. We observe that there is no increase in the message complexity and only a small overhead in the computational complexity....

  9. Privacy enhanced recommender system

    NARCIS (Netherlands)

    Erkin, Zekeriya; Erkin, Zekeriya; Beye, Michael; Veugen, Thijs; Lagendijk, Reginald L.

    2010-01-01

    Recommender systems are widely used in online applications since they enable personalized service to the users. The underlying collaborative filtering techniques work on users’ data, which are mostly privacy sensitive and can be misused by the service provider. To protect the privacy of the users, we

  10. Efficient task assignment in spatial crowdsourcing with worker and task privacy protection

    KAUST Repository

    Liu, An

    2017-08-01

    Spatial crowdsourcing (SC) outsources tasks to a set of workers who are required to physically move to specified locations and accomplish them. Recently, it has emerged as a promising tool for emergency management, as it enables efficient and cost-effective collection of critical information in emergencies such as earthquakes, when survivors in potential areas must be searched for and rescued. However, in current SC systems, task locations and worker locations are exposed publicly without any privacy protection, so SC systems, if attacked, face a potential risk of privacy leakage. In this paper, we propose a protocol for protecting the privacy of both workers and task requesters while maintaining the functionality of SC systems. The proposed protocol is built on partially homomorphic encryption schemes and can efficiently realize the complex operations required during task assignment over encrypted data through a well-designed computation strategy. We prove that the proposed protocol is privacy-preserving against semi-honest adversaries. Simulation on two real-world datasets shows that the proposed protocol is more effective than existing solutions and can achieve mutual privacy preservation with acceptable computation and communication costs.
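    The full protocol is not given in the record. As a hedged sketch of the partially homomorphic building block it names, the snippet below uses the python-paillier (phe) package (assumed installed via pip install phe) and a common trick: the worker pre-encrypts x, y and x²+y² so the requester can assemble an encrypted squared distance using only additions and plaintext multiplications. The variable names and the two-party split are illustrative assumptions, not the paper's design:

    ```python
    from phe import paillier

    # Worker side: generate keys and encrypt location features once.
    pub, priv = paillier.generate_paillier_keypair(n_length=2048)
    wx, wy = 3.0, 4.0
    enc_x, enc_y = pub.encrypt(wx), pub.encrypt(wy)
    enc_sq = pub.encrypt(wx * wx + wy * wy)

    # Requester side: compute an encrypted squared distance to the task
    # location (tx, ty) with additions and scalar multiplications only,
    # which is all an additively homomorphic scheme supports.
    tx, ty = 6.0, 8.0
    enc_dist_sq = enc_sq + enc_x * (-2 * tx) + enc_y * (-2 * ty) + (tx * tx + ty * ty)

    # Back at the key holder: decrypt only the distance needed for matching.
    print(priv.decrypt(enc_dist_sq))  # 25.0 = (3-6)^2 + (4-8)^2
    ```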

  11. Enhancing Privacy for Digital Rights Management

    NARCIS (Netherlands)

    Petkovic, M.; Conrado, C.; Schrijen, G.J.; Jonker, Willem

    2007-01-01

    This chapter addresses privacy issues in DRM systems. These systems provide a means of protecting digital content, but may violate the privacy of users in that the content they purchase and their actions in the system can be linked to specific users. The chapter proposes a privacy-preserving DRM

  12. Advanced research in data privacy

    CERN Document Server

    Torra, Vicenç

    2015-01-01

    This book provides an overview of the research work on data privacy and privacy enhancing technologies carried out by the participants of the ARES project. ARES (Advanced Research in Privacy and Security, CSD2007-00004) has been one of the most important research projects funded by the Spanish Government in the fields of computer security and privacy. It is part of the now extinct CONSOLIDER INGENIO 2010 program, a highly competitive program which aimed to advance knowledge and open new research lines among top Spanish research groups. The project started in 2007 and will finish in 2014. Composed of 6 research groups from 6 different institutions, it has gathered an important number of researchers during its lifetime. Among the work produced by the ARES project, one specific work package has been related to privacy. This book gathers works produced by members of the project related to data privacy and privacy enhancing technologies. The presented works not only summarize important research carried out in the proje...

  13. Privacy in the Sharing Economy

    DEFF Research Database (Denmark)

    Ranzini, Giulia; Etter, Michael; Lutz, Christoph

    ’s digital services through providing recommendations to Europe’s institutions. The initial stage of this research project involves a set of three literature reviews of the state of research on three core topics in relation to the sharing economy: participation (1), privacy (2), and power (3). This piece...... is a literature review on the topic of privacy. It addresses key privacy challenges for different stakeholders in the sharing economy. Throughout, we use the term "consumers" to refer to users on the receiving end (e.g., Airbnb guests, Uber passengers), "providers" to refer to users on the providing end (e.g., Airbnb hosts, Uber drivers) and "platforms" to refer to the mediating sites, apps and infrastructures matching consumers and providers (e.g., Airbnb, Uber)....

  14. Providing strong Security and high privacy in low-cost RFID networks

    DEFF Research Database (Denmark)

    David, Mathieu; Prasad, Neeli R.

    2009-01-01

    As the dissemination of Radio Frequency IDentification (RFID) tags is getting larger and larger, the requirement for strong security and privacy is also increasing. Low-cost and ultra-low-cost tags are being implemented on everyday products, and their limited resources constrain the security...

  15. Privacy-preserving distributed clustering

    DEFF Research Database (Denmark)

    Erkin, Zekeriya; Veugen, Thijs; Toft, Tomas

    2013-01-01

    with any other entity, including the service provider. Such privacy concerns lead to trust issues between entities, which clearly damages the functioning of the service and even blocks cooperation between entities with similar data sets. To enable joint efforts with private data, we propose a protocol......, or in some cases, information from different databases is pooled to enrich the data so that the merged database can improve the clustering effort. However, in either case, the content of the database may be privacy sensitive and/or commercially valuable such that the owners may not want to share their data...... provider with computations. Experimental results clearly indicate that the work we present is an efficient way of deploying a privacy-preserving clustering algorithm in a distributed manner....

  16. Smart Grid Privacy through Distributed Trust

    Science.gov (United States)

    Lipton, Benjamin

    Though the smart electrical grid promises many advantages in efficiency and reliability, the risks to consumer privacy have impeded its deployment. Researchers have proposed protecting privacy by aggregating user data before it reaches the utility, using techniques of homomorphic encryption to prevent exposure of unaggregated values. However, such schemes generally require users to trust in the correct operation of a single aggregation server. We propose two alternative systems based on secret sharing techniques that distribute this trust among multiple service providers, protecting user privacy against a misbehaving server. We also provide an extensive evaluation of the systems considered, comparing their robustness to privacy compromise, error handling, computational performance, and data transmission costs. We conclude that while all the systems should be computationally feasible on smart meters, the two methods based on secret sharing require much less computation while also providing better protection against corrupted aggregators. Building systems using these techniques could help defend the privacy of electricity customers, as well as customers of other utilities as they move to a more data-driven architecture.
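    As a minimal sketch of the secret-sharing idea described above (additive shares over a prime field, so that no single aggregation server ever sees an individual reading), with hypothetical readings and server counts rather than the exact protocols evaluated in the thesis:

    ```python
    import secrets

    PRIME = 2**61 - 1  # field modulus; comfortably larger than any meter total

    def share(reading, n_servers):
        """Split a meter reading into n_servers additive shares modulo PRIME."""
        shares = [secrets.randbelow(PRIME) for _ in range(n_servers - 1)]
        last = (reading - sum(shares)) % PRIME
        return shares + [last]

    def aggregate(shares_per_server):
        """Each server sums the shares it received; summing those partial sums
        reconstructs only the neighborhood total, never a single household."""
        partial = [sum(col) % PRIME for col in shares_per_server]
        return sum(partial) % PRIME

    readings = [12, 7, 30, 5]          # four households, arbitrary units
    n = 3                              # three non-colluding aggregation servers
    per_server = list(zip(*(share(r, n) for r in readings)))
    print(aggregate(per_server))       # 54, with no server seeing any reading
    ```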

  17. Online privacy: overview and preliminary research

    Directory of Open Access Journals (Sweden)

    Renata Mekovec

    2010-12-01

    Over the last decade, using the Internet for online shopping, information browsing and searching as well as for online communication has become part of everyday life. Although the Internet technology has a lot of benefits for users, one of the most important disadvantages is related to the increasing capacity for surveillance of users’ online activity. However, users are increasingly becoming aware of online surveillance methods, which results in their increased concern for privacy protection. Numerous factors influence the way in which individuals perceive the level of privacy protection when they are online. This article provides a review of factors that influence the privacy perception of Internet users. Previous online privacy research related to e-business was predominantly focused on the dimension of information privacy and concerned with the way users’ personal information is collected, saved and used by an online company. This article’s main aim is to provide an overview of numerous Internet users’ privacy perception elements across various privacy dimensions as well as their potential categorization. In addition, considering that e-banking and online shopping are among the most widely used e-services, an examination of the online privacy perception of e-banking/online shopping users was performed.

  18. Data Security and Privacy in Cloud Computing

    OpenAIRE

    Yunchuan Sun; Junsheng Zhang; Yongping Xiong; Guangyu Zhu

    2014-01-01

    Data security has consistently been a major issue in information technology. In the cloud computing environment, it becomes particularly serious because the data are located in different places, even across the globe. Data security and privacy protection are the two main factors of users’ concerns about cloud technology. Though many techniques on these topics in cloud computing have been investigated in both academia and industry, data security and privacy protection are becoming more impo...

  19. User Privacy and Empowerment: Trends, Challenges, and Opportunities

    DEFF Research Database (Denmark)

    Dhotre, Prashant Shantaram; Olesen, Henning; Khajuria, Samant

    2018-01-01

    to the service providers. Considering business models that are slanted towards service providers, privacy has become a crucial issue in today’s fast growing digital world. Hence, this paper elaborates personal information flow between users, service providers, and data brokers. We also discussed the significant...... privacy issues like present business models, user awareness about privacy and user control over personal data. To address such issues, this paper also identified challenges that comprise unavailability of effective privacy awareness or protection tools and the effortless way to study and see the flow...... of personal information and its management. Thus, empowering users and enhancing awareness are essential to comprehending the value of secrecy. This paper also introduced latest advances in the domain of privacy issues like User Managed Access (UMA) can state suitable requirements for user empowerment...

  20. Privacy-Preserving Data Publishing An Overview

    CERN Document Server

    Wong, Raymond Chi-Wing

    2010-01-01

    Privacy preservation has become a major issue in many data analysis applications. When a data set is released to other parties for data analysis, privacy-preserving techniques are often required to reduce the possibility of identifying sensitive information about individuals. For example, in medical data, sensitive information can be the fact that a particular patient suffers from HIV. In spatial data, sensitive information can be a specific location of an individual. In web surfing data, the information that a user browses certain websites may be considered sensitive. Consider a dataset conta

  1. For telehealth to succeed, privacy and security risks must be identified and addressed.

    Science.gov (United States)

    Hall, Joseph L; McGraw, Deven

    2014-02-01

    The success of telehealth could be undermined if serious privacy and security risks are not addressed. For example, sensors that are located in a patient's home or that interface with the patient's body to detect safety issues or medical emergencies may inadvertently transmit sensitive information about household activities. Similarly, routine data transmissions from an app or medical device, such as an insulin pump, may be shared with third-party advertisers. Without adequate security and privacy protections for underlying telehealth data and systems, providers and patients will lack trust in the use of telehealth solutions. Although some federal and state guidelines for telehealth security and privacy have been established, many gaps remain. No federal agency currently has authority to enact privacy and security requirements to cover the telehealth ecosystem. This article examines privacy risks and security threats to telehealth applications and summarizes the extent to which technical controls and federal law adequately address these risks. We argue for a comprehensive federal regulatory framework for telehealth, developed and enforced by a single federal entity, the Federal Trade Commission, to bolster trust and fully realize the benefits of telehealth.

  2. Privacy-preserving clinical decision support system using Gaussian kernel-based classification.

    Science.gov (United States)

    Rahulamathavan, Yogachandran; Veluru, Suresh; Phan, Raphael C-W; Chambers, Jonathon A; Rajarajan, Muttukrishnan

    2014-01-01

    A clinical decision support system forms a critical capability to link health observations with health knowledge to influence choices by clinicians for improved healthcare. Recent trends toward remote outsourcing can be exploited to provide efficient and accurate clinical decision support in healthcare. In this scenario, clinicians can use the health knowledge located in remote servers via the Internet to diagnose their patients. However, the fact that these servers are third party and therefore potentially not fully trusted raises possible privacy concerns. In this paper, we propose a novel privacy-preserving protocol for a clinical decision support system where the patients' data always remain in an encrypted form during the diagnosis process. Hence, the server involved in the diagnosis process is not able to learn any extra knowledge about the patient's data and results. Our experimental results on popular medical datasets from UCI-database demonstrate that the accuracy of the proposed protocol is up to 97.21% and the privacy of patient data is not compromised.

  3. An Effective Privacy Architecture to Preserve User Trajectories in Reward-Based LBS Applications

    Directory of Open Access Journals (Sweden)

    A S M Touhidul Hasan

    2018-02-01

    How can training performance data (e.g., running or walking routes) be collected, measured, and published in a mobile program while preserving user privacy? This question is becoming important in the context of the growing use of reward-based location-based service (LBS) applications, which aim to promote employee training activities and to share such data with insurance companies in order to reduce the healthcare insurance costs of an organization. One of the main concerns of such applications is the privacy of user trajectories, because the applications normally collect user locations over time together with identities. The leak of identified trajectories often results in personal privacy breaches. For instance, a trajectory can expose a user’s interest in places and behavior over time through inference and linking attacks. This information can be used for spam advertisements or individual-based assaults. To the best of our knowledge, no existing studies can be directly applied to solve the problem while keeping data utility. In this paper, we identify the personal privacy problem in a reward-based LBS application and propose a privacy architecture with a bounded perturbation technique to protect users’ trajectories from privacy breaches. Bounded perturbation uses a global location set (GLS) to anonymize the trajectory data. In addition, bounded perturbation will not generate any visiting points that are not possible to visit in real time. The experimental results on real-world datasets demonstrate that the proposed bounded perturbation can effectively anonymize location information while preserving data utility compared to the existing methods.
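    The GLS-based bounded perturbation is only sketched in the abstract. The toy Python snippet below illustrates one plausible reading: every released point is replaced by a real location drawn from a global location set within a distance bound, so no impossible visiting points are generated. The coordinates, bound, and helper names are illustrative assumptions, not parameters from the paper:

    ```python
    import math
    import random

    # Hypothetical global location set (GLS): plausible public places a user
    # could actually visit (coordinates are illustrative only).
    GLS = [(43.6510, -79.3470), (43.6532, -79.3832), (43.6629, -79.3957),
           (43.6452, -79.3806), (43.6677, -79.3948)]

    def _dist_km(a, b):
        # Rough planar approximation; adequate for a small-radius bound.
        dy = (a[0] - b[0]) * 111.0
        dx = (a[1] - b[1]) * 111.0 * math.cos(math.radians(a[0]))
        return math.hypot(dx, dy)

    def perturb_trajectory(trajectory, bound_km=2.0, rng=random):
        """Replace each recorded point by a random GLS point within bound_km,
        so every released point is a real, visitable location."""
        out = []
        for p in trajectory:
            candidates = [g for g in GLS if _dist_km(p, g) <= bound_km] or [p]
            out.append(rng.choice(candidates))
        return out

    route = [(43.6515, -79.3480), (43.6600, -79.3900)]
    print(perturb_trajectory(route))
    ```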

  4. PRIVACY AS A CULTURAL PHENOMENON

    Directory of Open Access Journals (Sweden)

    Garfield Benjamin

    2017-07-01

    Privacy remains both contentious and ever more pertinent in contemporary society. Yet it persists as an ill-defined term, not only within specific fields but in its various uses and implications between and across technical, legal and political contexts. This article offers a new critical review of the history of privacy in terms of two dominant strands of thinking: freedom and property. These two conceptions of privacy can be seen as successive historical epochs brought together under digital technologies, yielding increasingly complex socio-technical dilemmas. By simplifying the taxonomy to its socio-cultural function, the article provides a generalisable, interdisciplinary approach to privacy. Drawing on new technologies, historical trends, sociological studies and political philosophy, the article presents a discussion of the value of privacy as a term, before proposing a defense of the term cyber security as a mode of scalable cognitive privacy that integrates the relative needs of individuals, governments and corporations.

  5. 5G Visions of User Privacy

    DEFF Research Database (Denmark)

    Sørensen, Lene Tolstrup; Khajuria, Samant; Skouby, Knud Erik

    2015-01-01

    Discussions are currently ongoing about the elements and definition of 5G networks. One of the elements in this discussion is how to provide user-controlled privacy for securing users' digital interaction. The purpose of this paper is to present elements of user-controlled privacy needed...... for the future 5G networks. The paper concludes that an ecosystem consisting of a Trusted Third Party between the end user and the service providers as a distributed system could be integrated to secure the perspective of user-controlled privacy for future systems...

  6. Privacy in wireless sensor networks using ring signature

    Directory of Open Access Journals (Sweden)

    Ashmita Debnath

    2014-07-01

    The veracity of a message from a sensor node must be verified in order to avoid a false reaction by the sink. This verification requires the authentication of the source node. The authentication process must also preserve privacy, such that the node and the sensed object are not endangered. In this work, a ring signature was proposed to authenticate the source node while preserving its spatial privacy. However, the other nodes acting as signers, and their number, must be chosen to preclude the possibility of a traffic analysis attack by an adversary. The spatial uncertainty increases with the number of signers but requires larger memory size and communication overhead. This requirement can breach the privacy of the sensed object. To determine the effectiveness of the proposed scheme, the location estimate of a sensor node by an adversary and the enhancement in location uncertainty with a ring signature were evaluated. Using simulation studies, the ring signature was estimated to require approximately four members from the same neighboring region of the source node to sustain the privacy of the node. Furthermore, the ring signature was also determined to have a small overhead and not to adversely affect the performance of the sensor network.

  7. Toward Privacy-Preserving Personalized Recommendation Services

    Directory of Open Access Journals (Sweden)

    Cong Wang

    2018-02-01

    Recommendation systems are crucially important for the delivery of personalized services to users. With personalized recommendation services, users can enjoy a variety of targeted recommendations such as movies, books, ads, restaurants, and more. In addition, personalized recommendation services have become extremely effective revenue drivers for online business. Despite the great benefits, deploying personalized recommendation services typically requires the collection of users’ personal data for processing and analytics, which undesirably makes users susceptible to serious privacy violation issues. Therefore, it is of paramount importance to develop practical privacy-preserving techniques to maintain the intelligence of personalized recommendation services while respecting user privacy. In this paper, we provide a comprehensive survey of the literature related to personalized recommendation services with privacy protection. We present the general architecture of personalized recommendation systems, the privacy issues therein, and existing works that focus on privacy-preserving personalized recommendation services. We classify the existing works according to their underlying techniques for personalized recommendation and privacy protection, and thoroughly discuss and compare their merits and demerits, especially in terms of privacy and recommendation accuracy. We also identify some future research directions. Keywords: Privacy protection, Personalized recommendation services, Targeted delivery, Collaborative filtering, Machine learning

  8. Personalized privacy-preserving frequent itemset mining using randomized response.

    Science.gov (United States)

    Sun, Chongjing; Fu, Yan; Zhou, Junlin; Gao, Hui

    2014-01-01

    Frequent itemset mining is the important first step of association rule mining, which discovers interesting patterns from massive data. There are increasing concerns about the privacy problem in frequent itemset mining. Some works have been proposed to handle this kind of problem. In this paper, we introduce a personalized privacy problem, in which different attributes may need different levels of privacy protection. To solve this problem, we present a personalized privacy-preserving method based on the randomized response technique. By providing different privacy levels for different attributes, this method can achieve higher accuracy in frequent itemset mining than the traditional method that provides the same privacy level for all attributes. Finally, our experimental results show that our method achieves better results in frequent itemset mining while preserving personalized privacy.
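    As a small illustration of the randomized response primitive the method builds on (Warner-style reporting with a per-attribute truth probability; a sketch, not the paper's full mining algorithm):

    ```python
    import random

    def randomized_response(true_bit, p_truth):
        """Report the true bit with probability p_truth, otherwise report
        a uniformly random bit. Smaller p_truth gives stronger privacy."""
        if random.random() < p_truth:
            return true_bit
        return random.randint(0, 1)

    def estimate_support(reports, p_truth):
        """Unbiased estimate of the true fraction of 1s from noisy reports."""
        observed = sum(reports) / len(reports)
        return (observed - (1 - p_truth) * 0.5) / p_truth

    # Different attributes can use different p_truth values, giving each
    # attribute its own privacy level, as in the personalized scheme above.
    truth = [random.random() < 0.3 for _ in range(10000)]      # 30% true support
    reports = [randomized_response(int(t), p_truth=0.7) for t in truth]
    print(round(estimate_support(reports, 0.7), 3))            # close to 0.3
    ```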

  9. A Taxonomy of Privacy Constructs for Privacy-Sensitive Robotics

    OpenAIRE

    Rueben, Matthew; Grimm, Cindy M.; Bernieri, Frank J.; Smart, William D.

    2017-01-01

    The introduction of robots into our society will also introduce new concerns about personal privacy. In order to study these concerns, we must do human-subject experiments that involve measuring privacy-relevant constructs. This paper presents a taxonomy of privacy constructs based on a review of the privacy literature. Future work in operationalizing privacy constructs for HRI studies is also discussed.

  10. Efficient Dynamic Searchable Encryption with Forward Privacy

    Directory of Open Access Journals (Sweden)

    Etemad Mohammad

    2018-01-01

    Searchable symmetric encryption (SSE) enables a client to perform searches over its outsourced encrypted files while preserving the privacy of the files and queries. Dynamic schemes, where files can be added or removed, leak more information than static schemes. For dynamic schemes, forward privacy requires that a newly added file cannot be linked to previous searches. We present a new dynamic SSE scheme that achieves forward privacy by replacing the keys revealed to the server on each search. Our scheme is efficient and parallelizable and outperforms the best previous schemes providing forward privacy, and achieves competitive performance with dynamic schemes without forward privacy. We provide a full security proof in the random oracle model. In our experiments on the Wikipedia archive of about four million pages, the server takes one second to perform a search with 100,000 results.
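    The key-replacement construction is not spelled out in the abstract. The toy index below only illustrates the forward-privacy goal itself, namely that newly added entries cannot be linked to tokens the server has already seen, using per-keyword counters and an HMAC-based PRF; it is an illustrative stand-in, not the scheme proposed in the paper:

    ```python
    import hashlib
    import hmac
    import os

    def prf(key, msg):
        # Pseudorandom function instantiated with HMAC-SHA256.
        return hmac.new(key, msg, hashlib.sha256).digest()

    class ToyForwardPrivateIndex:
        """Toy forward-private index: the label of the i-th addition for a
        keyword is PRF(k_w, i), so labels of future additions look unrelated
        to anything previously sent to the server."""

        def __init__(self):
            self.server = {}   # what the server stores: label -> file id
            self.state = {}    # client state: keyword -> (key, counter)

        def add(self, keyword, file_id):
            key, c = self.state.get(keyword, (os.urandom(32), 0))
            label = prf(key, c.to_bytes(4, "big"))
            self.server[label] = file_id      # in a real scheme the id is encrypted
            self.state[keyword] = (key, c + 1)

        def search(self, keyword):
            key, c = self.state.get(keyword, (None, 0))
            if key is None:
                return []
            labels = (prf(key, i.to_bytes(4, "big")) for i in range(c))
            return [self.server[label] for label in labels]

    idx = ToyForwardPrivateIndex()
    idx.add("privacy", "doc1")
    idx.add("privacy", "doc7")
    print(idx.search("privacy"))    # ['doc1', 'doc7']
    ```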

  11. Disclosure of computerized health care information: provider privacy rights under supply side competition.

    Science.gov (United States)

    Watson, B L

    1981-01-01

    This Article explores the constitutional, statutory and common law privacy rights of physicians given the inescapable role of delivery data under supply side competition. The Article begins with a general review of the federal constitutional right of privacy. It then discusses the statutory protection given to physician-specific data under current federal law, and considers the insights gained from the controversy over physician data and the federal Freedom of Information Act. The remainder of the Article analyzes the usefulness of several common law causes of action to remedy the misuse of physician data, and concludes with recommendations which may obviate the need for litigation to protect against misuse of physician-specific data.

  12. Protecting multi-party privacy in location-aware social point-of-interest recommendation

    KAUST Repository

    Wang, Weiqi; Liu, An; Li, Zhixu; Zhang, Xiangliang; Li, Qing; Zhou, Xiaofang

    2018-01-01

    Point-of-interest (POI) recommendation has attracted much interest recently because of its significant business potential. Data used in POI recommendation (e.g., user-location check-in matrices) are much more sparse than those used in traditional item (e.g., book and movie) recommendation, which leads to a more serious cold start problem. Social POI recommendation has proved to be an effective solution, but most existing works assume that recommenders have access to all required data. This is very rare in practice because these data are generally owned by different entities who are not willing to share their data with others due to privacy and legal concerns. In this paper, we first propose PLAS, a protocol which enables effective POI recommendation without disclosing the sensitive data of any party involved in the recommendation. We formally show that PLAS is secure in the semi-honest adversary model. To improve its performance, we then adopt the technique of cloaking areas, by which expensive distance computation over encrypted data is replaced by cheap operations over plaintext. In addition, we utilize the sparsity of check-ins to selectively publish data, thus reducing encryption cost and avoiding unnecessary computation over ciphertext. Experiments on two real datasets show that our protocol is feasible and can scale to large POI recommendation problems in practice.
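    The cloaking-area idea can be illustrated in a few lines of Python: check-ins are snapped to coarse grid cells so that a cheap plaintext pre-filter selects candidates before any expensive computation over encrypted data. The cell size, coordinates, and helper names are illustrative assumptions, not parameters from the paper:

    ```python
    def cloak(lat, lon, cell_deg=0.01):
        """Return the coarse grid cell (roughly 1 km at mid-latitudes)
        containing a point; only the cell is shared in plaintext."""
        return (int(lat // cell_deg), int(lon // cell_deg))

    def same_or_adjacent_cell(a, b):
        """Cheap plaintext pre-filter: only nearby cells proceed to the
        expensive privacy-preserving distance computation."""
        return abs(a[0] - b[0]) <= 1 and abs(a[1] - b[1]) <= 1

    user_cell = cloak(31.2304, 121.4737)
    poi_cell = cloak(31.2355, 121.4800)
    print(same_or_adjacent_cell(user_cell, poi_cell))   # True -> run secure protocol
    ```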

  13. Protecting multi-party privacy in location-aware social point-of-interest recommendation

    KAUST Repository

    Wang, Weiqi

    2018-04-04

    Point-of-interest (POI) recommendation has attracted much interest recently because of its significant business potential. Data used in POI recommendation (e.g., user-location check-in matrices) are much more sparse than those used in traditional item (e.g., book and movie) recommendation, which leads to a more serious cold start problem. Social POI recommendation has proved to be an effective solution, but most existing works assume that recommenders have access to all required data. This is very rare in practice because these data are generally owned by different entities who are not willing to share their data with others due to privacy and legal concerns. In this paper, we first propose PLAS, a protocol which enables effective POI recommendation without disclosing the sensitive data of any party involved in the recommendation. We formally show that PLAS is secure in the semi-honest adversary model. To improve its performance, we then adopt the technique of cloaking areas, by which expensive distance computation over encrypted data is replaced by cheap operations over plaintext. In addition, we utilize the sparsity of check-ins to selectively publish data, thus reducing encryption cost and avoiding unnecessary computation over ciphertext. Experiments on two real datasets show that our protocol is feasible and can scale to large POI recommendation problems in practice.

  14. Consumer Attitudes and Perceptions on mHealth Privacy and Security: Findings From a Mixed-Methods Study.

    Science.gov (United States)

    Atienza, Audie A; Zarcadoolas, Christina; Vaughon, Wendy; Hughes, Penelope; Patel, Vaishali; Chou, Wen-Ying Sylvia; Pritts, Joy

    2015-01-01

    This study examined consumers' attitudes and perceptions regarding mobile health (mHealth) technology use in health care. Twenty-four focus groups with 256 participants were conducted in 5 geographically diverse locations. Participants were also diverse in age, education, race/ethnicity, gender, and rural versus urban settings. Several key themes emerged from the focus groups. Findings suggest that consumer attitudes regarding mHealth privacy/security are highly contextualized, with concerns depending on the type of information being communicated, where and when the information is being accessed, who is accessing or seeing the information, and for what reasons. Consumers frequently considered the tradeoffs between the privacy/security of using mHealth technologies and the potential benefits. Having control over mHealth privacy/security features and trust in providers were important issues for consumers. Overall, this study found significant diversity in attitudes regarding mHealth privacy/security both within and between traditional demographic groups. Thus, to address consumers' concerns regarding mHealth privacy and security, a one-size-fits-all approach may not be adequate. Health care providers and technology developers should consider tailoring mHealth technology according to how various types of information are communicated in the health care setting, as well as according to the comfort, skills, and concerns individuals may have with mHealth technology.

  15. Privacy under construction : A developmental perspective on privacy perception

    NARCIS (Netherlands)

    Steijn, W.M.P.; Vedder, A.H.

    2015-01-01

    We present a developmental perspective regarding the difference in perceptions toward privacy between young and old. Here, we introduce the notion of privacy conceptions, that is, the specific ideas that individuals have regarding what privacy actually is. The differences in privacy concerns often

  16. What's that, you say? Employee expectations of privacy when using employer-provided technology--and how employers can defeat them.

    Science.gov (United States)

    Herrin, Barry S

    2012-01-01

    Two 2010 court cases that determined the effectiveness of policies governing employees' use of employer-provided communication devices can be used to guide employers when constructing their own technology policies. In light of a policy that stated that "users should have no expectation of privacy or confidentiality," one case established that the employer was in the right. However, a separate case favored the employee due, in part, to an "unclear and ambiguous" policy. Ultimately, employers can restrict the use of employer-furnished technology by employees by: 1) clearly outlining that employees do not have a reasonable expectation of privacy in their use of company devices; 2) stating that any use of personal e-mail accounts using employer-provided technology will be subject to the policy; 3) detailing all technology used to monitor employees; 4) identifying company devices covered; 5) not exposing the content of employee communications; and 6) having employees sign and acknowledge the policy.

  17. Privacy Bridges: EU and US Privacy Experts In Search of Transatlantic Privacy Solutions

    NARCIS (Netherlands)

    Abramatic, J.-F.; Bellamy, B.; Callahan, M.E.; Cate, F.; van Eecke, P.; van Eijk, N.; Guild, E.; de Hert, P.; Hustinx, P.; Kuner, C.; Mulligan, D.; O'Connor, N.; Reidenberg, J.; Rubinstein, I.; Schaar, P.; Shadbolt, N.; Spiekermann, S.; Vladeck, D.; Weitzner, D.J.; Zuiderveen Borgesius, F.; Hagenauw, D.; Hijmans, H.

    2015-01-01

    The EU and US share a common commitment to privacy protection as a cornerstone of democracy. Following the Treaty of Lisbon, data privacy is a fundamental right that the European Union must proactively guarantee. In the United States, data privacy derives from constitutional protections in the

  18. A Generic Privacy Quantification Framework for Privacy-Preserving Data Publishing

    Science.gov (United States)

    Zhu, Zutao

    2010-01-01

    In recent years, concerns about the privacy of electronic data collected by government agencies, organizations, and industries have been increasing. These include individual privacy and knowledge privacy. Privacy-preserving data publishing is a research branch that preserves privacy while, at the same time, retaining useful information in…

  19. How Well Are We Respecting Patient Privacy in Medical Imaging? Lessons Learnt From a Departmental Audit.

    Science.gov (United States)

    Dilauro, Marc; Thornhill, Rebecca; Fasih, Najla

    2016-11-01

    Preservation of patient privacy and dignity are basic requirements for all patients visiting a hospital. The purpose of this study was to perform an audit of patients' satisfaction with privacy whilst in the Department of Medical Imaging (MI) at the Civic Campus of the Ottawa Hospital. Outpatients who underwent magnetic resonance imaging (MRI), computed tomography (CT), ultrasonography (US), and plain film (XR) examinations were provided with a survey on patient privacy. The survey asked participants to rank (on a 6-point scale ranging from 6 = excellent to 1 = no privacy) whether their privacy was respected in 5 key locations within the Department of MI. The survey was conducted over a consecutive 5-day period. A total of 502 surveys were completed. The survey response rate for each imaging modality was: 55% MRI, 42% CT, 45% US, and 47% XR. For each imaging modality, the total percentage of privacy scores greater than or equal to 5 were: 98% MRI, 96% CT, 94% US, and 92% XR. Privacy ratings for the MRI reception and waiting room areas were significantly higher in comparison to the other imaging modalities (P = .0025 and P = .0227, respectively). Overall, patient privacy was well respected within the Department of MI. Copyright © 2016 Canadian Association of Radiologists. Published by Elsevier Inc. All rights reserved.

  20. Extending SQL to Support Privacy Policies

    Science.gov (United States)

    Ghazinour, Kambiz; Pun, Sampson; Majedi, Maryam; Chinaci, Amir H.; Barker, Ken

    Increasing concerns over Internet applications that violate user privacy by exploiting (back-end) database vulnerabilities must be addressed to protect both customer privacy and to ensure corporate strategic assets remain trustworthy. This chapter describes an extension onto database catalogues and Structured Query Language (SQL) for supporting privacy in Internet applications, such as in social networks, e-health, e-government, etc. The idea is to introduce new predicates to SQL commands to capture common privacy requirements, such as purpose, visibility, generalization, and retention for both mandatory and discretionary access control policies. The contribution is that corporations, when creating the underlying databases, will be able to define what their mandatory privacy policies are with which all application users have to comply. Furthermore, each application user, when providing their own data, will be able to define their own privacy policies with which other users have to comply. The extension is supported with underlying catalogues and algorithms. The experiments demonstrate a very reasonable overhead for the extension. The result is a low-cost mechanism to create new systems that are privacy aware and also to transform legacy databases to their privacy-preserving equivalents. Although the examples are from social networks, one can apply the results to data security and user privacy of other enterprises as well.

  1. Choose Privacy Week: Educate Your Students (and Yourself) about Privacy

    Science.gov (United States)

    Adams, Helen R.

    2016-01-01

    The purpose of "Choose Privacy Week" is to encourage a national conversation to raise awareness of the growing threats to personal privacy online and in day-to-day life. The 2016 Choose Privacy Week theme is "respecting individuals' privacy," with an emphasis on minors' privacy. A plethora of issues relating to minors' privacy…

  2. 75 FR 63703 - Privacy Act of 1974; Privacy Act Regulation

    Science.gov (United States)

    2010-10-18

    ... FEDERAL RESERVE SYSTEM 12 CFR Part 261a [Docket No. R-1313] Privacy Act of 1974; Privacy Act... implementing the Privacy Act of 1974 (Privacy Act). The primary changes concern the waiver of copying fees... records under the Privacy Act; the amendment of special procedures for the release of medical records to...

  3. Privacy-preserving digital rights management

    NARCIS (Netherlands)

    Conrado, C.; Petkovic, M.; Jonker, W.; Jonker, W.; Petkovic, M.

    2004-01-01

    DRM systems provide a means for protecting digital content, but at the same time they violate the privacy of users in a number of ways. This paper addresses privacy issues in DRM systems. The main challenge is how to allow a user to interact with the system in an anonymous/pseudonymous way, while

  4. Designing Privacy for You : A User Centric Approach For Privacy

    OpenAIRE

    Senarath, Awanthika; Arachchilage, Nalin A. G.; Slay, Jill

    2017-01-01

    Privacy directly concerns the user as the data owner (data subject) and hence privacy in systems should be implemented in a manner which concerns the user (user-centered). There are many concepts and guidelines that support development of privacy and embedding privacy into systems. However, none of them approaches privacy in a user-centered manner. Through this research we propose a framework that would enable developers and designers to grasp privacy in a user-centered manner and implement...

  5. The Privacy Coach: Supporting customer privacy in the Internet of Things

    OpenAIRE

    Broenink, Gerben; Hoepman, Jaap-Henk; Hof, Christian van 't; van Kranenburg, Rob; Smits, David; Wisman, Tijmen

    2010-01-01

    The Privacy Coach is an application running on a mobile phone that supports customers in making privacy decisions when confronted with RFID tags. The approach we take to increase customer privacy is a radical departure from the mainstream research efforts that focus on implementing privacy enhancing technologies on the RFID tags themselves. Instead the Privacy Coach functions as a mediator between customer privacy preferences and corporate privacy policies, trying to find a match between the ...

  6. Users or Students? Privacy in University MOOCS.

    Science.gov (United States)

    Jones, Meg Leta; Regner, Lucas

    2016-10-01

    Two terms, student privacy and Massive Open Online Courses, have received a significant amount of attention recently. Both represent interesting sites of change in entrenched structures, one educational and one legal. MOOCs represent something college courses have never been able to provide: universal access. Universities not wanting to miss the MOOC wave have started to build MOOC courses and integrate them into the university system in various ways. However, the design and scale of university MOOCs create tension for privacy laws intended to regulate information practices exercised by educational institutions. Are MOOCs part of the educational institutions these laws and policies aim to regulate? Are MOOC users students whose data are protected by the aforementioned laws and policies? Many university researchers and faculty members are asked to participate as designers and instructors in MOOCs but may not know how to approach the issues posed. While recent scholarship has addressed the disruptive nature of MOOCs, student privacy generally, and data privacy in the K-12 system, we provide an in-depth description and analysis of the MOOC phenomenon and the privacy laws and policies that guide and regulate educational institutions today. We offer privacy case studies of three major MOOC providers active in the market today to reveal inconsistencies among MOOC platforms and the level and type of legal uncertainty surrounding them. Finally, we provide a list of organizational questions to pose internally to navigate the uncertainty presented to university MOOC teams.

  7. 32 CFR 701.118 - Privacy, IT, and PIAs.

    Science.gov (United States)

    2010-07-01

    ...) Development. Privacy must be considered when requirements are being analyzed and decisions are being made...-347) directs agencies to conduct reviews of how privacy issues are considered when purchasing or... a PIA to effectively address privacy factors. Guidance is provided at http://www.doncio.navy.mil. (f...

  8. Targeted advertising on the handset: Privacy and security challenges

    OpenAIRE

    Haddadi, Hamed; Hui, Pan; Henderson, Tristan; Brown, Ian

    2011-01-01

    Online advertising is currently a rich source of revenue for many Internet giants. With the ever-increasing number of smart phones, there is a fertile market for personalised and localised advertising. A key benefit of using mobile phones is to take advantage of the significant amount of information on phones — such as locations of interest to the user — in order to provide personalised advertisements. Preservation of user privacy, however, is essential for successful deployment of such a sys...

  9. Customer privacy on UK healthcare websites.

    Science.gov (United States)

    Mundy, Darren P

    2006-09-01

    Privacy has been and continues to be one of the key challenges of an age devoted to the accumulation, processing, and mining of electronic information. In particular, privacy of healthcare-related information is seen as a key issue as health organizations move towards the electronic provision of services. The aim of the research detailed in this paper has been to analyse privacy policies on popular UK healthcare-related websites to determine the extent to which consumer privacy is protected. The author has combined approaches (such as approaches focused on usability, policy content, and policy quality) used in studies by other researchers on e-commerce and US healthcare websites to provide a comprehensive analysis of UK healthcare privacy policies. The author identifies a wide range of issues related to the protection of consumer privacy through his research analysis using quantitative results. The main outcomes from the author's research are that only 61% of healthcare-related websites in their sample group posted privacy policies. In addition, most of the posted privacy policies had poor readability standards and included a variety of privacy vulnerability statements. Overall, the author's findings represent significant current issues in relation to healthcare information protection on the Internet. The hope is that raising awareness of these results will drive forward changes in the industry, similar to those experienced with information quality.

  10. Privacy transparency patterns

    NARCIS (Netherlands)

    Siljee B.I.J.

    2015-01-01

    This paper describes two privacy patterns for creating privacy transparency: the Personal Data Table pattern and the Privacy Policy Icons pattern, as well as a full overview of privacy transparency patterns. It is a first step in creating a full set of privacy design patterns, which will aid

  11. Spatio-Temporal Data Mining for Location-Based Services

    DEFF Research Database (Denmark)

    Gidofalvi, Gyozo

    The objectives of the presented thesis are three-fold. First, to extend popular data mining methods to the spatio-temporal domain. Second, to demonstrate the usefulness of the extended methods and the derived knowledge in promising LBS examples. Finally, to eliminate privacy concerns in connection with spatio-temporal data mining by devising systems for privacy-preserving location data collection and mining. ...... Location-Based Services (LBS) are continuously gaining popularity. Innovative LBSes integrate knowledge about the users into the service. Such knowledge can be derived by analyzing the location data of users. Such data contain two unique dimensions, space and time, which need to be analyzed...

  12. Space in Space: Designing for Privacy in the Workplace

    Science.gov (United States)

    Akin, Jonie

    2015-01-01

    Privacy is cultural, socially embedded in the spatial, temporal, and material aspects of the lived experience. Definitions of privacy are as varied among scholars as they are among those who fight for their personal rights in the home and the workplace. Privacy in the workplace has become a topic of interest in recent years, as evident in discussions on Big Data as well as the shrinking office spaces in which people work. An article in The New York Times published in February of this year noted that "many companies are looking to cut costs, and one way to do that is by trimming personal space". Increasingly, organizations ranging from tech start-ups to large corporations are downsizing square footage and opting for open-office floorplans hoping to trim the budget and spark creative, productive communication among their employees. The question of how much is too much to trim when it comes to privacy, is one that is being actively addressed by the National Aeronautics and Space Administration (NASA) as they explore habitat designs for future space missions. NASA recognizes privacy as a design-related stressor impacting human health and performance. Given the challenges of sustaining life in an isolated, confined, and extreme environment such as Mars, NASA deems it necessary to determine the acceptable minimal amount for habitable volume for activities requiring at least some level of privacy in order to support optimal crew performance. Ethnographic research was conducted in 2013 to explore perceptions of privacy and privacy needs among astronauts living and working in space as part of a long-distance, long-duration mission. The allocation of space, or habitable volume, becomes an increasingly complex issue in outer space due to the costs associated with maintaining an artificial, confined environment bounded by limitations of mass while located in an extreme environment. Privacy in space, or space in space, provides a unique case study of the complex notions of

  13. What was privacy?

    Science.gov (United States)

    McCreary, Lew

    2008-10-01

    Why is that question in the past tense? Because individuals can no longer feel confident that the details of their lives--from identifying numbers to cultural preferences--will be treated with discretion rather than exploited. Even as Facebook users happily share the names of their favorite books, movies, songs, and brands, they often regard marketers' use of that information as an invasion of privacy. In this wide-ranging essay, McCreary, a senior editor at HBR, examines numerous facets of the privacy issue, from Google searches, public shaming on the internet, and cell phone etiquette to passenger screening devices, public surveillance cameras, and corporate chief privacy officers. He notes that IBM has been a leader on privacy; its policy forswearing the use of employees' genetic information in hiring and benefits decisions predated the federal Genetic Information Nondiscrimination Act by three years. Now IBM is involved in an open-source project known as Higgins to provide users with transportable, potentially anonymous online presences. Craigslist, whose CEO calls it "as close to 100% user driven as you can get," has taken an extremely conservative position on privacy--perhaps easier for a company with a declared lack of interest in maximizing revenue. But TJX and other corporate victims of security breaches have discovered that retaining consumers' transaction information can be both costly and risky. Companies that underestimate the importance of privacy to their customers or fail to protect it may eventually face harsh regulation, reputational damage, or both. The best thing they can do, says the author, is negotiate directly with those customers over where to draw the line.

  14. Privacy Awareness: A Means to Solve the Privacy Paradox?

    Science.gov (United States)

    Pötzsch, Stefanie

    People are limited in their resources, i.e. they have limited memory capabilities, cannot pay attention to too many things at the same time, and forget much information after a while; computers do not suffer from these limitations. Thus, revealing personal data in electronic communication environments and being completely unaware of the impact of privacy might cause a lot of privacy issues later. Even if people are privacy aware in general, the so-called privacy paradox shows that they do not behave according to their stated attitudes. This paper discusses explanations for the existing dichotomy between the intentions of people towards disclosure of personal data and their behaviour. We present requirements on tools for privacy-awareness support in order to counteract the privacy paradox.

  15. Digital privacy in the marketplace perspectives on the information exchange

    CERN Document Server

    Milne, George

    2015-01-01

    Digital Privacy in the Marketplace focuses on the data exchanges between marketers and consumers, with special attention to the privacy challenges that are brought about by new information technologies. The purpose of this book is to provide a background source to help the reader think more deeply about the impact of privacy issues on both consumers and marketers. It covers topics such as: why privacy is needed, the technological, historical and academic theories of privacy, how market exchange affects privacy, what are the privacy harms and protections available, and what is the likely future of privacy.

  16. Privacy Policy

    Science.gov (United States)

    ... Home → NLM Privacy Policy URL of this page: https://medlineplus.gov/privacy.html NLM Privacy Policy To ... out of cookies in the most popular browsers, http://www.usa.gov/optout_instructions.shtml. Please note ...

  17. Concentrated Differential Privacy

    OpenAIRE

    Dwork, Cynthia; Rothblum, Guy N.

    2016-01-01

    We introduce Concentrated Differential Privacy, a relaxation of Differential Privacy enjoying better accuracy than both pure differential privacy and its popular "(epsilon,delta)" relaxation without compromising on cumulative privacy loss over multiple computations.
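
    As a hedged illustration only (not drawn from the cited paper), the Gaussian mechanism is the canonical mechanism analyzed under concentrated differential privacy; under the closely related zero-concentrated variant, adding Gaussian noise of standard deviation sigma to a query of sensitivity Delta yields a privacy parameter rho = Delta^2 / (2 * sigma^2) that simply accumulates over repeated computations, which is the kind of tight cumulative accounting the abstract refers to. A minimal Python sketch:

```python
import numpy as np

def gaussian_mechanism(true_value, sensitivity, sigma, rng=None):
    """Release true_value plus Gaussian noise of standard deviation sigma.

    Under zero-concentrated DP (a close relative of concentrated DP), one
    release satisfies rho-zCDP with rho = sensitivity**2 / (2 * sigma**2),
    and rho values add up over repeated computations."""
    rng = rng or np.random.default_rng()
    return true_value + rng.normal(0.0, sigma)

# Ten noisy answers to a counting query (sensitivity 1).
sigma = 5.0
rho_per_release = 1.0 ** 2 / (2 * sigma ** 2)   # 0.02 per release
print("cumulative rho after 10 releases:", 10 * rho_per_release)
print([round(gaussian_mechanism(42, 1.0, sigma), 2) for _ in range(10)])
```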

  18. A Secure and Privacy-Preserving Targeted Ad-System

    Science.gov (United States)

    Androulaki, Elli; Bellovin, Steven M.

    Thanks to its low product-promotion cost and its efficiency, targeted online advertising has become very popular. Unfortunately, being profile-based, online advertising methods violate consumers' privacy, which has engendered resistance to the ads. However, protecting privacy through anonymity seems to encourage click-fraud. In this paper, we define consumer's privacy and present a privacy-preserving, targeted ad system (PPOAd) which is resistant towards click fraud. Our scheme is structured to provide financial incentives to all entities involved.

  19. Health Records and the Cloud Computing Paradigm from a Privacy Perspective

    OpenAIRE

    Stingl, Christian; Slamanig, Daniel

    2011-01-01

    With the advent of cloud computing, the realization of highly available electronic health records providing location-independent access seems to be very promising. However, cloud computing raises major security issues that need to be addressed particularly within the health care domain. The protection of the privacy of individuals often seems to be left on the sidelines. For instance, common protection against malicious insiders, i.e., non-disclosure agreements, is purely organizational. Clea...

  20. Privacy Challenges of Genomic Big Data.

    Science.gov (United States)

    Shen, Hong; Ma, Jian

    2017-01-01

    With the rapid advancement of high-throughput DNA sequencing technologies, genomics has become a big data discipline where large-scale genetic information of human individuals can be obtained efficiently at low cost. However, such a massive amount of personal genomic data creates a tremendous challenge for privacy, especially given the emergence of the direct-to-consumer (DTC) industry that provides genetic testing services. Here we review the recent development in genomic big data and its implications for privacy. We also discuss the current dilemmas and future challenges of genomic privacy.

  1. Vehicular ad hoc network security and privacy

    CERN Document Server

    Lin, X

    2015-01-01

    Unlike any other book in this area, this book provides innovative solutions to security issues, making this book a must read for anyone working with or studying security measures. Vehicular Ad Hoc Network Security and Privacy mainly focuses on security and privacy issues related to vehicular communication systems. It begins with a comprehensive introduction to vehicular ad hoc network and its unique security threats and privacy concerns and then illustrates how to address those challenges in highly dynamic and large size wireless network environments from multiple perspectives. This book is richly illustrated with detailed designs and results for approaching security and privacy threats.

  2. Privacy Management and Networked PPD Systems - Challenges Solutions.

    Science.gov (United States)

    Ruotsalainen, Pekka; Pharow, Peter; Petersen, Francoise

    2015-01-01

    Modern personal portable health devices (PPDs) are increasingly becoming part of a larger, inhomogeneous information system. Information collected by sensors is stored and processed in global clouds. Services are often free of charge, but at the same time the service providers' business model is based on the disclosure of users' intimate health information. Health data processed in PPD networks is not regulated by health care specific legislation. In PPD networks, there is no guarantee that stakeholders share the same ethical principles as the user. Often service providers have their own security and privacy policies, and they rarely offer users the possibility to define their own, or adapt existing, privacy policies. This all raises huge ethical and privacy concerns. In this paper, the authors have analyzed privacy challenges in PPD networks from the users' viewpoint using a system modeling method and propose that the principle "Personal Health Data under Personal Control" must be generally accepted at the global level. Among possible implementations of this principle, the authors propose encryption, computer-understandable privacy policies, and privacy labels or trust-based privacy management methods. The latter can be realized using an infrastructural trust calculation and monitoring service. A first step is to require the protection of personal health information and to make the proposed principle internationally mandatory. This requires both regulatory and standardization activities, and the availability of open and certified software applications which all service providers can implement. One of those applications should be the independent Trust verifier.

  3. Privacy-Preserving Restricted Boltzmann Machine

    Directory of Open Access Journals (Sweden)

    Yu Li

    2014-01-01

    Full Text Available With the arrival of the big data era, it is predicted that distributed data mining will lead to an information technology revolution. To motivate different institutes to collaborate with each other, the crucial issue is to eliminate their concerns regarding data privacy. In this paper, we propose a privacy-preserving method for training a restricted Boltzmann machine (RBM). With our privacy-preserving method, the RBM can be trained without the institutes revealing their private data to each other. We provide a correctness and efficiency analysis of our algorithms. The comparative experiment shows that the accuracy is very close to the original RBM model.

  4. Privacy Policies

    NARCIS (Netherlands)

    Dekker, M.A.C.; Etalle, Sandro; den Hartog, Jeremy; Petkovic, M.; Jonker, W.; Jonker, Willem

    2007-01-01

    Privacy is a prime concern in today's information society. To protect the privacy of individuals, enterprises must follow certain privacy practices, while collecting or processing personal data. In this chapter we look at the setting where an enterprise collects private data on its website,

  5. A Privacy-Preserving Framework for Trust-Oriented Point-of-Interest Recommendation

    KAUST Repository

    Liu, An; Wang, Weiqi; Li, Zhixu; Liu, Guanfeng; Li, Qing; Zhou, Xiaofang; Zhang, Xiangliang

    2017-01-01

    Point-of-Interest (POI) recommendation has attracted much interest recently because of its significant potential for helping users explore new places and helping LBS providers carry out precision marketing. Compared with the user-item rating matrix in conventional recommender systems, the user-location check-in matrix in POI recommendation is usually much sparser, which makes the notorious cold start problem more prominent in POI recommendation. Trust-oriented recommendation is an effective way to deal with this problem, but it requires that the recommender has access to user check-in and trust data. In practice, however, these data are usually owned by different businesses that are not willing to share their data with the recommender, mainly due to privacy and legal concerns. In this paper, we propose a privacy-preserving framework to boost data owners' willingness to share their data with untrustworthy businesses. More specifically, we utilize partially homomorphic encryption to design two protocols for privacy-preserving trust-oriented POI recommendation. By offline encryption and parallel computing, these protocols can efficiently protect the private data of every party involved in the recommendation. We prove that the proposed protocols are secure against semi-honest adversaries. Experiments on both synthetic data and real data show that our protocols can achieve privacy preservation with acceptable computation and communication cost.

  6. A Privacy-Preserving Framework for Trust-Oriented Point-of-Interest Recommendation

    KAUST Repository

    Liu, An

    2017-10-23

    Point-of-Interest (POI) recommendation has attracted much interest recently because of its significant potential for helping users explore new places and helping LBS providers carry out precision marketing. Compared with the user-item rating matrix in conventional recommender systems, the user-location check-in matrix in POI recommendation is usually much sparser, which makes the notorious cold start problem more prominent in POI recommendation. Trust-oriented recommendation is an effective way to deal with this problem, but it requires that the recommender has access to user check-in and trust data. In practice, however, these data are usually owned by different businesses that are not willing to share their data with the recommender, mainly due to privacy and legal concerns. In this paper, we propose a privacy-preserving framework to boost data owners' willingness to share their data with untrustworthy businesses. More specifically, we utilize partially homomorphic encryption to design two protocols for privacy-preserving trust-oriented POI recommendation. By offline encryption and parallel computing, these protocols can efficiently protect the private data of every party involved in the recommendation. We prove that the proposed protocols are secure against semi-honest adversaries. Experiments on both synthetic data and real data show that our protocols can achieve privacy preservation with acceptable computation and communication cost.
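
    The protocols above rely on partially homomorphic encryption. As a hedged sketch only, the additively homomorphic property they build on can be illustrated with the third-party python-paillier library (`phe`, assumed available here); the check-in counts, trust weights, and weighted-sum query below are hypothetical and far simpler than the two protocols in the paper:

```python
# pip install phe   (python-paillier; a third-party library assumed available)
from phe import paillier

# The recommender generates the keypair and publishes only the public key.
public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

checkins = [3, 0, 5, 1]        # hypothetical per-POI check-in counts (owner A)
trust_weights = [2, 1, 3, 4]   # hypothetical trust weights (owner B)

# Owner A encrypts its data before handing it to owner B.
enc_checkins = [public_key.encrypt(c) for c in checkins]

# Additive homomorphism: ciphertexts may be added together and multiplied by
# plaintext scalars, so owner B computes a weighted score without decrypting.
enc_score = enc_checkins[0] * trust_weights[0]
for ct, w in zip(enc_checkins[1:], trust_weights[1:]):
    enc_score = enc_score + ct * w

# Only the key holder learns the aggregated recommendation score.
print(private_key.decrypt(enc_score))   # 3*2 + 0*1 + 5*3 + 1*4 = 25
```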

  7. Location, Location, Location: Does Place Provide the Opportunity for Differentiation for Universities?

    Science.gov (United States)

    Winter, Emma; Thompson-Whiteside, Helen

    2017-01-01

    The fiercely competitive HE market has led HEIs to invest significant resources in building a distinct identity. An HEI's location forms an inherent part of its identity and the uniqueness of location offers an opportunity to differentiate. However there has been limited examination of how location is used by HEIs and little consideration of how…

  8. Privacy policies

    NARCIS (Netherlands)

    Dekker, M.A.C.; Etalle, S.; Hartog, den J.I.; Petkovic, M.; Jonker, W.

    2007-01-01

    Privacy is a prime concern in today’s information society. To protect the privacy of individuals, enterprises must follow certain privacy practices while collecting or processing personal data. In this chapter we look at the setting where an enterprise collects private data on its website, processes

  9. Privacy amplification for quantum key distribution

    International Nuclear Information System (INIS)

    Watanabe, Yodai

    2007-01-01

    This paper examines classical privacy amplification using a universal family of hash functions. In quantum key distribution, the adversary's measurement can wait until the choice of hash functions is announced, and so the adversary's information may depend on the choice. Therefore the existing result on classical privacy amplification, which assumes the independence of the choice from the other random variables, is not applicable to this case. This paper provides a security proof of privacy amplification which is valid even when the adversary's information may depend on the choice of hash functions. The compression rate of the proposed privacy amplification can be taken to be the same as that of the existing one with an exponentially small loss in secrecy of a final key. (fast track communication)
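
    For orientation, the standard ingredient of such privacy amplification is a 2-universal hash family, for example binary Toeplitz matrices. The sketch below is an illustration of the mechanics only, not the paper's proof construction: the hash choice (the seed) is announced publicly, as in the setting the paper analyzes, and the raw key is compressed to a shorter final key.

```python
import secrets

def toeplitz_hash(raw_key, out_len, seed):
    """GF(2) Toeplitz-matrix hash: compresses raw_key (a list of 0/1 bits)
    to out_len bits. seed must contain len(raw_key) + out_len - 1 random
    bits; each seed defines one member of a 2-universal hash family."""
    n = len(raw_key)
    assert len(seed) == n + out_len - 1
    out = []
    for i in range(out_len):
        bit = 0
        for j in range(n):
            # Entry (i, j) of the Toeplitz matrix is seed[i - j + n - 1].
            bit ^= seed[i - j + n - 1] & raw_key[j]
        out.append(bit)
    return out

# Example: compress a 64-bit partially secret raw key down to 16 bits.
raw_key = [secrets.randbits(1) for _ in range(64)]
seed = [secrets.randbits(1) for _ in range(64 + 16 - 1)]   # announced publicly
final_key = toeplitz_hash(raw_key, 16, seed)
print(final_key)
```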

  10. PAVS: A New Privacy-Preserving Data Aggregation Scheme for Vehicle Sensing Systems.

    Science.gov (United States)

    Xu, Chang; Lu, Rongxing; Wang, Huaxiong; Zhu, Liehuang; Huang, Cheng

    2017-03-03

    Air pollution has become one of the most pressing environmental issues in recent years. According to a World Health Organization (WHO) report, air pollution has led to the deaths of millions of people worldwide. Accordingly, expensive and complex air-monitoring instruments have been exploited to measure air pollution. Comparatively, a vehicle sensing system (VSS), as it can be effectively used for many purposes and can bring huge financial benefits in reducing high maintenance and repair costs, has received considerable attention. However, the privacy issues of VSS including vehicles' location privacy have not been well addressed. Therefore, in this paper, we propose a new privacy-preserving data aggregation scheme, called PAVS, for VSS. Specifically, PAVS combines privacy-preserving classification and privacy-preserving statistics on both the mean E(·) and variance Var(·), which makes VSS more promising, as, with minimal privacy leakage, more vehicles are willing to participate in sensing. Detailed analysis shows that the proposed PAVS can achieve the properties of privacy preservation, data accuracy and scalability. In addition, the performance evaluations via extensive simulations also demonstrate its efficiency.
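
    PAVS itself combines privacy-preserving classification with privacy-preserving statistics. As a much simpler, hedged illustration of the statistical part only (not the PAVS construction), the sketch below shows how two non-colluding aggregators can learn the mean E(·) and variance Var(·) of vehicle readings from additive secret shares without ever seeing an individual value; the field size and readings are hypothetical.

```python
import random

PRIME = 2 ** 61 - 1          # field size; readings are small scaled integers

def share(value):
    """Split an integer into two additive shares modulo PRIME."""
    r = random.randrange(PRIME)
    return r, (value - r) % PRIME

def aggregate(readings):
    """Each vehicle shares x and x^2; aggregators A and B each sum one share,
    so only the combined totals (not individual readings) are revealed, from
    which the mean and variance follow."""
    sums_a = [0, 0]
    sums_b = [0, 0]
    for x in readings:
        for idx, v in enumerate((x, x * x)):
            a, b = share(v)
            sums_a[idx] = (sums_a[idx] + a) % PRIME
            sums_b[idx] = (sums_b[idx] + b) % PRIME
    n = len(readings)
    total_x = (sums_a[0] + sums_b[0]) % PRIME
    total_x2 = (sums_a[1] + sums_b[1]) % PRIME
    mean = total_x / n
    variance = total_x2 / n - mean ** 2
    return mean, variance

print(aggregate([12, 15, 9, 14, 10]))   # hypothetical pollution readings
```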

  11. 76 FR 64115 - Privacy Act of 1974; Privacy Act System of Records

    Science.gov (United States)

    2011-10-17

    ... NATIONAL AERONAUTICS AND SPACE ADMINISTRATION [Notice (11-092)] Privacy Act of 1974; Privacy Act... retirement of one Privacy Act system of records notice. SUMMARY: In accordance with the Privacy Act of 1974, NASA is giving notice that it proposes to cancel the following Privacy Act system of records notice...

  12. Genetic privacy.

    Science.gov (United States)

    Sankar, Pamela

    2003-01-01

    During the past 10 years, the number of genetic tests performed more than tripled, and public concern about genetic privacy emerged. The majority of states and the U.S. government have passed regulations protecting genetic information. However, research has shown that concerns about genetic privacy are disproportionate to known instances of information misuse. Beliefs in genetic determinacy explain some of the heightened concern about genetic privacy. Discussion of the debate over genetic testing within families illustrates the most recent response to genetic privacy concerns.

  13. Preliminary Analysis of Google+'s Privacy

    OpenAIRE

    Mahmood, Shah; Desmedt, Yvo

    2011-01-01

    In this paper we provide a preliminary analysis of Google+ privacy. We identified that Google+ shares photo metadata with users who can access the photograph and discuss its potential impact on privacy. We also identified that Google+ encourages the provision of other names including maiden name, which may help criminals performing identity theft. We show that Facebook lists are a superset of Google+ circles, both functionally and logically, even though Google+ provides a better user interfac...

  14. Privacy og selvbeskrivelse

    DEFF Research Database (Denmark)

    Rosengaard, Hans Ulrik

    2015-01-01

    A description of the field of privacy research, with particular focus on the significance of privacy for the ability to control one's own self-description.

  15. Service Outsourcing Character Oriented Privacy Conflict Detection Method in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Changbo Ke

    2014-01-01

    Full Text Available Cloud computing has provided services for users as a software paradigm. However, it is difficult to ensure privacy information security because of its openness, virtualization, and service outsourcing features. Therefore, how to protect user privacy information has become a research focus. In this paper, firstly, we model the service privacy policy and user privacy preference with description logic. Secondly, we use the Pellet reasoner to verify consistency and satisfiability, so as to detect privacy conflicts between services and the user. Thirdly, we present the algorithm for detecting privacy conflicts in the process of cloud service composition and prove the correctness and feasibility of this method by case study and experiment analysis. Our method can reduce the risk of user-sensitive privacy information being illegally used and propagated by outsourcing services. In the meantime, the method avoids exceptions caused by privacy conflicts in the process of service composition, and improves the trust degree of cloud service providers.

  16. Couldn't or wouldn't? The influence of privacy concerns and self-efficacy in privacy management on privacy protection.

    Science.gov (United States)

    Chen, Hsuan-Ting; Chen, Wenghong

    2015-01-01

    Sampling 515 college students, this study investigates how privacy protection behaviors, including profile visibility, self-disclosure, and friending, are influenced by privacy concerns and by efficacy regarding one's own ability to manage privacy settings, a factor that researchers have yet to give a great deal of attention to in the context of social networking sites (SNSs). The results of this study indicate an inconsistency in adopting strategies to protect privacy: limiting profile visibility and friending are disconnected from self-disclosure. More specifically, privacy concerns lead SNS users to limit their profile visibility and discourage them from expanding their network. However, they do not constrain self-disclosure. Similarly, while self-efficacy in privacy management encourages SNS users to limit their profile visibility, it facilitates self-disclosure. This suggests that if users are limiting their profile visibility and constraining their friending behaviors, it does not necessarily mean they will reduce self-disclosure on SNSs, because these behaviors are predicted by different factors. In addition, the study finds an interaction effect between privacy concerns and self-efficacy in privacy management on friending. It points to the potential problem of increased risk-taking behaviors resulting from high self-efficacy in privacy management and low privacy concerns.

  17. The privacy coach: Supporting customer privacy in the internet of things

    NARCIS (Netherlands)

    Broenink, E.G.; Hoepman, J.H.; Hof, C. van 't; Kranenburg, R. van; Smits, D.; Wisman, T.

    2010-01-01

    The Privacy Coach is an application running on a mobile phone that supports customers in making privacy decisions when confronted with RFID tags. The approach we take to increase customer privacy is a radical departure from the mainstream research efforts that focus on implementing privacy enhancing

  18. Semantic Security: Privacy Definitions Revisited

    OpenAIRE

    Jinfei Liu; Li Xiong; Jun Luo

    2013-01-01

    In this paper we illustrate a privacy framework named Indistinguishable Privacy. Indistinguishable privacy could be deemed as the formalization of the existing privacy definitions in privacy preserving data publishing as well as secure multi-party computation. We introduce three representative privacy notions in the literature, Bayes-optimal privacy for privacy preserving data publishing, differential privacy for statistical data release, and privacy w.r.t. semi-honest behavior in the secure...

  19. Towards Practical Privacy-Preserving Internet Services

    Science.gov (United States)

    Wang, Shiyuan

    2012-01-01

    Today's Internet offers people a vast selection of data centric services, such as online query services, the cloud, and location-based services, etc. These internet services bring people a lot of convenience, but at the same time raise privacy concerns, e.g., sensitive information revealed by the queries, sensitive data being stored and…

  20. "Everybody Knows Everybody Else's Business"-Privacy in Rural Communities.

    Science.gov (United States)

    Leung, Janni; Smith, Annetta; Atherton, Iain; McLaughlin, Deirdre

    2016-12-01

    Patients have a right to privacy in a health care setting. This involves conversational discretion, security of medical records and physical privacy of remaining unnoticed or unidentified when using health care services other than by those who need to know or whom the patient wishes to know. However, the privacy of cancer patients who live in rural areas is more difficult to protect due to the characteristics of rural communities. The purpose of this article is to reflect on concerns relating to the lack of privacy experienced by cancer patients and health care professionals in the rural health care setting. In addition, this article suggests future research directions to provide much needed evidence for educating health care providers and guiding health care policies that can lead to better protection of privacy among cancer patients living in rural communities.

  1. Comparative Approaches to Biobanks and Privacy.

    Science.gov (United States)

    Rothstein, Mark A; Knoppers, Bartha Maria; Harrell, Heather L

    2016-03-01

    Laws in the 20 jurisdictions studied for this project display many similar approaches to protecting privacy in biobank research. Although few have enacted biobank-specific legislation, many countries address biobanking within other laws. All provide for some oversight mechanisms for biobank research, even though the nature of that oversight varies between jurisdictions. Most have some sort of controlled access system in place for research with biobank specimens. While broad consent models facilitate biobanking, countries without national or federated biobanks have been slow to adopt broad consent. International guidelines have facilitated sharing and generally take a proportional risk approach, but many countries have provisions guiding international sharing and a few even limit international sharing. Although privacy laws may not prohibit international collaborations, the multi-prong approach to privacy unique to each jurisdiction can complicate international sharing. These symposium issues can serve as a resource for explaining the sometimes intricate privacy laws in each studied jurisdiction, outlining the key issues with regards to privacy and biobanking, and serving to describe a framework for the process of harmonization of privacy laws. © 2016 American Society of Law, Medicine & Ethics.

  2. Observing Privacy, Modesty and Hospitality in the Home Domain: Three Case Studies of Muslim Homes in Brisbane, Australia

    Directory of Open Access Journals (Sweden)

    Zulkeplee Othman

    2014-12-01

    Full Text Available A home embodies a sensorial space that is layered with personal memories and traces of history. The success of a home in providing a strong sense of place depends on various factors such as geographical location, climatic conditions, and occupants’ world-views and perceptions. This paper explores Muslims’ perceptions of privacy, modesty and hospitality within their homes through their lived experiences. This case study focuses on three Muslim families living in Australian designed homes within the same suburb of Brisbane, Australia. The study provides prefatory insight into the ways in which these families perform their daily activities and entertain their guests without jeopardizing their privacy needs. The study examines the significance of modesty in the design of Muslim homes as a means by which family members are able to achieve optimum privacy while simultaneously extending hospitality to guests inside and outside their homes. The findings of this study provide opportunities too, for expanding research into culturally adaptable housing systems to help meet the changing needs of Australian multicultural society.

  3. Tales from the dark side: Privacy dark strategies and privacy dark patterns

    DEFF Research Database (Denmark)

    Bösch, Christoph; Erb, Benjamin; Kargl, Frank

    2016-01-01

    Privacy strategies and privacy patterns are fundamental concepts of the privacy-by-design engineering approach. While they support a privacy-aware development process for IT systems, the concepts used by malicious, privacy-threatening parties are generally less understood and known. We argue that understanding the "dark side", namely how personal data is abused, is of equal importance. In this paper, we introduce the concept of privacy dark strategies and privacy dark patterns and present a framework that collects, documents, and analyzes such malicious concepts. In addition, we investigate from a psychological perspective why privacy dark strategies are effective. The resulting framework allows for a better understanding of these dark concepts, fosters awareness, and supports the development of countermeasures. We aim to contribute to an easier detection and successive removal of such approaches from...

  4. Overview of Privacy in Social Networking Sites (SNS)

    Science.gov (United States)

    Powale, Pallavi I.; Bhutkar, Ganesh D.

    2013-07-01

    Social Networking Sites (SNS) have become an integral part of communication and the lifestyle of people in today's world. Because of the wide range of services offered by SNSs, mostly free of cost, these sites are attracting the attention of all possible Internet users. Most importantly, users from all age groups have become members of SNSs. Since many of the users are not aware of the data thefts associated with information sharing, they freely share their personal information with SNSs. Therefore, SNSs may be used for investigating users' character and social habits by familiar or even unknown persons and agencies. Such commercial and social scenarios have led to a number of privacy and security threats. Though all major issues in SNSs need to be addressed by SNS providers, the privacy of SNS users is the most crucial. Therefore, in this paper, we have focused our discussion on "privacy in SNSs". We have discussed different ways in which Personally Identifiable Information (PII) leaks from SNSs, information revelation to third-party domains without user consent, and privacy-related threats associated with such information sharing. We expect that this comprehensive overview of privacy in SNSs will definitely help in raising user awareness about sharing data and managing privacy with SNSs. It will also help SNS providers to rethink their privacy policies.

  5. A Model for Calculated Privacy and Trust in pHealth Ecosystems.

    Science.gov (United States)

    Ruotsalainen, Pekka; Blobel, Bernd

    2018-01-01

    A pHealth ecosystem is a community of service users and providers. It is also a dynamic socio-technical system. One of its main goals is to help users to maintain their personal health status. Another goal is to give economic benefit to stakeholders which use personal health information existing in the ecosystem. In pHealth ecosystems, a huge amount of health related data is collected and used by service providers, such as data extracted from the regulated health record and information related to personal characteristics, genetics, lifestyle and environment. In pHealth ecosystems, there are different kinds of service providers such as regulated health care service providers, unregulated health service providers, ICT service providers, researchers and industrial organizations. This fact, together with the multidimensional personal health data used, raises serious privacy concerns. Privacy is a necessary enabler for successful pHealth, but it is also an elastic concept without any universally agreed definition. Regardless of what kind of privacy model is used in dynamic socio-technical systems, it is difficult for a service user to know the privacy level of services in real life situations. As privacy and trust are interrelated concepts, the authors have developed a hybrid solution where knowledge obtained from regulatory privacy requirements and publicly available privacy related documents is used for the calculation of a service provider's specific initial privacy value. This value is then used as an estimate for the initial trust score. In this solution, the total trust score is a combination of recommended trust, proposed trust and initial trust. The initial privacy level is a weighted arithmetic mean of knowledge items and user selected weights. The total trust score for any service provider in the ecosystem can be calculated by deploying either a beta trust model or the Fuzzy trust calculation method. The proposed solution is easy to use and to understand, and it can also be automated. It is
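
    As a hedged sketch under stated assumptions (the knowledge items, weights, and combination rule below are illustrative, not those of the cited model), an initial privacy value computed as a weighted arithmetic mean can be combined with a beta-model trust estimate roughly as follows:

```python
def initial_privacy_value(knowledge_scores, weights):
    """Weighted arithmetic mean of privacy knowledge items (each in [0, 1]);
    weights are user-selected and need not be normalised."""
    total_w = sum(weights)
    return sum(k * w for k, w in zip(knowledge_scores, weights)) / total_w

def beta_trust(positive, negative, prior=1.0):
    """Beta reputation estimate: expected trust given counts of positive and
    negative experiences with a provider."""
    return (positive + prior) / (positive + negative + 2 * prior)

def total_trust(initial, recommended, proposed, w_init=0.4, w_rec=0.3, w_prop=0.3):
    """Combine initial, recommended, and proposed trust into one score.
    The weights here are illustrative only."""
    return w_init * initial + w_rec * recommended + w_prop * proposed

# Hypothetical provider: policy transparency 0.8, regulatory compliance 0.6,
# data-sharing practices 0.4, with user-selected weights 3, 2, 1.
init = initial_privacy_value([0.8, 0.6, 0.4], [3, 2, 1])
rec = beta_trust(positive=18, negative=2)        # peer recommendations
print(round(init, 3), round(rec, 3), round(total_trust(init, rec, init), 3))
```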

  6. Privacy-preserving heterogeneous health data sharing.

    Science.gov (United States)

    Mohammed, Noman; Jiang, Xiaoqian; Chen, Rui; Fung, Benjamin C M; Ohno-Machado, Lucila

    2013-05-01

    Privacy-preserving data publishing addresses the problem of disclosing sensitive data when mining for useful information. Among existing privacy models, ε-differential privacy provides one of the strongest privacy guarantees and makes no assumptions about an adversary's background knowledge. All existing solutions that ensure ε-differential privacy handle the problem of disclosing relational and set-valued data in a privacy-preserving manner separately. In this paper, we propose an algorithm that considers both relational and set-valued data in differentially private disclosure of healthcare data. The proposed approach makes a simple yet fundamental switch in differentially private algorithm design: instead of listing all possible records (ie, a contingency table) for noise addition, records are generalized before noise addition. The algorithm first generalizes the raw data in a probabilistic way, and then adds noise to guarantee ε-differential privacy. We showed that the disclosed data could be used effectively to build a decision tree induction classifier. Experimental results demonstrated that the proposed algorithm is scalable and performs better than existing solutions for classification analysis. The resulting utility may degrade when the output domain size is very large, making it potentially inappropriate to generate synthetic data for large health databases. Unlike existing techniques, the proposed algorithm allows the disclosure of health data containing both relational and set-valued data in a differentially private manner, and can retain essential information for discriminative analysis.
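
    The proposed algorithm generalizes records probabilistically and then adds noise. As a hedged sketch of the noise-addition building block only (not the full generalization algorithm), the Laplace mechanism below answers a count query of sensitivity 1 under ε-differential privacy:

```python
import numpy as np

def laplace_count(true_count, epsilon, sensitivity=1.0, rng=None):
    """epsilon-DP count release: adding or removing one record changes a count
    by at most 1, so Laplace noise with scale sensitivity/epsilon suffices."""
    rng = rng or np.random.default_rng()
    return true_count + rng.laplace(0.0, sensitivity / epsilon)

# e.g. a noisy count for one generalized (age-range, diagnosis) cell
print(laplace_count(37, epsilon=0.5))
```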

  7. Display methods of electronic patient record screens: patient privacy concerns.

    Science.gov (United States)

    Niimi, Yukari; Ota, Katsumasa

    2013-01-01

    To provide adequate care, medical professionals have to collect not only medical information but also information that may be related to private aspects of the patient's life. With patients' increasing awareness of information privacy, healthcare providers have to pay attention to patients' right of privacy. This study aimed to clarify the requirements of the display method of electronic patient record (EPR) screens in consideration of both patients' information privacy concerns and health professionals' information needs. For this purpose, semi-structured group interviews were conducted with 78 medical professionals. They pointed out that partial concealment of information to meet patients' requests for privacy could result in challenges in (1) safety in healthcare, (2) information sharing, (3) collaboration, (4) hospital management, and (5) communication. They believed that EPRs should (1) meet the requirements of the therapeutic process, (2) have restricted access, (3) provide convenient access to necessary information, and (4) facilitate interprofessional collaboration. This study provides direction for the development of display methods that balance the sharing of vital information and protection of patient privacy.

  8. 75 FR 81205 - Privacy Act: Revision of Privacy Act Systems of Records

    Science.gov (United States)

    2010-12-27

    ... DEPARTMENT OF AGRICULTURE Office of the Secretary Privacy Act: Revision of Privacy Act Systems of Records AGENCY: Office of the Secretary, USDA. ACTION: Notice to Revise Privacy Act Systems of Records... two Privacy Act Systems of Records entitled ``Information on Persons Disqualified from the...

  9. Gain-Based Relief for Invasion of Privacy

    Directory of Open Access Journals (Sweden)

    Sirko Harder

    2013-11-01

    Full Text Available In many common law jurisdictions, some or all instances of invasion of privacy constitute a privacy-specific wrong either at common law (including equity or under statute. A remedy invariably available for such a wrong is compensation for loss. However, the plaintiff may instead seek to claim the profit the defendant has made from the invasion. This article examines when a plaintiff is, and should be, entitled to claim that profit, provided that invasion of privacy is actionable as such. After a brief overview of the relevant law in major common law jurisdictions, the article investigates how invasion of privacy fits into a general concept of what is called ‘restitution for wrongs’. It will be argued that the right to privacy is a right against the whole world and as such forms a proper basis of awarding gain-based relief for the unauthorised use of that right.

  10. The perceived impact of location privacy: A web-based survey of public health perspectives and requirements in the UK and Canada

    Directory of Open Access Journals (Sweden)

    Boulos Maged

    2008-05-01

    Full Text Available Abstract. Background: The "place-consciousness" of public health professionals is on the rise as spatial analyses and Geographic Information Systems (GIS) are rapidly becoming key components of their toolbox. However, "place" is most useful at its most precise, granular scale – which increases identification risks, thereby clashing with privacy issues. This paper describes the views and requirements of public health professionals in Canada and the UK on privacy issues and spatial data, as collected through a web-based survey. Methods: Perceptions on the impact of privacy were collected through a web-based survey administered between November 2006 and January 2007. The survey targeted government, non-government and academic GIS labs and research groups involved in public health, as well as public health units (Canada), ministries, and observatories (UK). Potential participants were invited to participate through personally addressed, standardised emails. Results: Of 112 invitees in Canada and 75 in the UK, 66 and 28 participated in the survey, respectively. The completion proportion for Canada was 91%, and 86% for the UK. No response differences were observed between the two countries. Ninety-three percent of participants indicated a requirement for personally identifiable data (PID) in their public health activities, including geographic information. Privacy was identified as an obstacle to public health practice by 71% of respondents. The overall self-rated median score for knowledge of privacy legislation and policies was 7 out of 10. Those who rated their knowledge of privacy as high (at the median or above) also rated it significantly more severe as an obstacle to research (P ...). Conclusion: The clash between PID requirements – including granular geography – and limitations imposed by privacy and its associated bureaucracy require immediate attention and solutions, particularly given the increasing utilisation of GIS in public health. Solutions

  11. 78 FR 40515 - Privacy Act of 1974; Privacy Act System of Records

    Science.gov (United States)

    2013-07-05

    ... NATIONAL AERONAUTICS AND SPACE ADMINISTRATION [Notice 13-071] Privacy Act of 1974; Privacy Act System of Records AGENCY: National Aeronautics and Space Administration (NASA). ACTION: Notice of Privacy Act system of records. SUMMARY: Each Federal agency is required by the Privacy Act of 1974 to publish...

  12. 78 FR 77503 - Privacy Act of 1974; Privacy Act System of Records

    Science.gov (United States)

    2013-12-23

    ... NATIONAL AERONAUTICS AND SPACE ADMINISTRATION [Notice 13-149] Privacy Act of 1974; Privacy Act... proposed revisions to existing Privacy Act systems of records. SUMMARY: Pursuant to the provisions of the Privacy Act of 1974 (5 U.S.C. 552a), the National Aeronautics and Space Administration is issuing public...

  13. Privacy Preserving Association Rule Mining Revisited: Privacy Enhancement and Resources Efficiency

    Science.gov (United States)

    Mohaisen, Abedelaziz; Jho, Nam-Su; Hong, Dowon; Nyang, Daehun

    Privacy preserving association rule mining algorithms have been designed for discovering the relations between variables in data while maintaining the data privacy. In this article we revise one of the recently introduced schemes for association rule mining using fake transactions (FS). In particular, our analysis shows that the FS scheme has exhaustive storage and high computation requirements for guaranteeing a reasonable level of privacy. We introduce a realistic definition of privacy that benefits from the average case privacy and motivates the study of a weakness in the structure of FS by fake transactions filtering. In order to overcome this problem, we improve the FS scheme by presenting a hybrid scheme that considers both privacy and resources as two concurrent guidelines. Analytical and empirical results show the efficiency and applicability of our proposed scheme.
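
    As a hedged toy illustration of the general fake-transaction idea only (not the FS scheme or the hybrid scheme analyzed in this article), a data owner can camouflage each real transaction among randomly generated ones before outsourcing the mining, remembering which positions are genuine so that the mined rules can be filtered later; the item catalogue and parameters below are hypothetical:

```python
import random

ITEM_UNIVERSE = list(range(100))     # hypothetical item ids

def camouflage(real_transactions, fakes_per_real=3, seed=None):
    """Mix each real transaction with randomly generated fake transactions of
    the same length, so the server cannot tell which transactions are genuine.
    Returns the mixed database and the positions of the real transactions."""
    rng = random.Random(seed)
    mixed, real_positions = [], []
    for t in real_transactions:
        fakes = [sorted(rng.sample(ITEM_UNIVERSE, len(t)))
                 for _ in range(fakes_per_real)]
        pos = rng.randrange(fakes_per_real + 1)
        block = fakes[:pos] + [sorted(t)] + fakes[pos:]
        real_positions.append(len(mixed) + pos)
        mixed.extend(block)
    return mixed, real_positions

transactions = [[1, 5, 9], [2, 5], [1, 2, 9, 40]]
mixed, positions = camouflage(transactions, seed=7)
print(len(mixed), positions)
```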

  14. Business Information Exchange System with Security, Privacy, and Anonymity

    Directory of Open Access Journals (Sweden)

    Sead Muftic

    2016-01-01

    Full Text Available Business Information Exchange is an Internet Secure Portal for secure management, distribution, sharing, and use of business e-mails, documents, and messages. It has three applications supporting three major types of information exchange systems: secure e-mail, secure instant messaging, and secure sharing of business documents. In addition to standard security services for e-mail letters, which are also applied to instant messages and documents, the system provides innovative features of privacy and full anonymity of users and their locations, actions, transactions, and exchanged resources. In this paper we describe design, implementation, and use of the system.

  15. 76 FR 67763 - Privacy Act of 1974; Privacy Act System of Records

    Science.gov (United States)

    2011-11-02

    ... NATIONAL AERONAUTICS AND SPACE ADMINISTRATION [Notice (11-109)] Privacy Act of 1974; Privacy Act... proposed revisions to an existing Privacy Act system of records. SUMMARY: Pursuant to the provisions of the Privacy Act of 1974 (5 U.S.C. 552a), the National Aeronautics and Space Administration is issuing public...

  16. 76 FR 64114 - Privacy Act of 1974; Privacy Act System of Records

    Science.gov (United States)

    2011-10-17

    ... NATIONAL AERONAUTICS AND SPACE ADMINISTRATION [Notice (11-093)] Privacy Act of 1974; Privacy Act... proposed revisions to an existing Privacy Act system of records. SUMMARY: Pursuant to the provisions of the Privacy Act of 1974 (5 U.S.C. 552a), the National Aeronautics and Space Administration is issuing public...

  17. 77 FR 69898 - Privacy Act of 1974; Privacy Act System of Records

    Science.gov (United States)

    2012-11-21

    ... NATIONAL AERONAUTICS AND SPACE ADMINISTRATION [Notice 12-100] Privacy Act of 1974; Privacy Act... proposed revisions to an existing Privacy Act system of records. SUMMARY: Pursuant to the provisions of the Privacy Act of 1974 (5 U.S.C. 552a), the National Aeronautics and Space Administration is issuing public...

  18. Privacy Practices of Health Social Networking Sites: Implications for Privacy and Data Security in Online Cancer Communities.

    Science.gov (United States)

    Charbonneau, Deborah H

    2016-08-01

    While online communities for social support continue to grow, little is known about the state of privacy practices of health social networking sites. This article reports on a structured content analysis of privacy policies and disclosure practices for 25 online ovarian cancer communities. All of the health social networking sites in the study sample provided privacy statements to users, yet privacy practices varied considerably across the sites. The majority of sites informed users that personal information was collected about participants and shared with third parties (96%, n = 24). Furthermore, more than half of the sites (56%, n = 14) stated that cookies technology was used to track user behaviors. Despite these disclosures, only 36% (n = 9) offered opt-out choices for sharing data with third parties. In addition, very few of the sites (28%, n = 7) allowed individuals to delete their personal information. Discussions about specific security measures used to protect personal information were largely missing. Implications for privacy, confidentiality, consumer choice, and data safety in online environments are discussed. Overall, nurses and other health professionals can utilize these findings to encourage individuals seeking online support and participating in social networking sites to build awareness of privacy risks to better protect their personal health information in the digital age.

  19. Privacy and Innovation

    OpenAIRE

    Avi Goldfarb; Catherine Tucker

    2011-01-01

    Information and communication technology now enables firms to collect detailed and potentially intrusive data about their customers both easily and cheaply. This means that privacy concerns are no longer limited to government surveillance and public figures' private lives. The empirical literature on privacy regulation shows that privacy regulation may affect the extent and direction of data-based innovation. We also show that the impact of privacy regulation can be extremely heterogeneous. T...

  20. Privacy-related context information for ubiquitous health.

    Science.gov (United States)

    Seppälä, Antto; Nykänen, Pirkko; Ruotsalainen, Pekka

    2014-03-11

    are regulated or in what kind of environment data can be processed. This study added to the vision of ubiquitous health by analyzing information processing from the viewpoint of an individual's privacy. We learned that health and wellness-related activities may happen in several environments and situations with multiple stakeholders, services, and systems. We have provided new knowledge regarding privacy-related context information and corresponding components by analyzing typical activities in ubiquitous health. With the identified components and their properties, individuals can define their personal preferences on information processing based on situational information, and privacy services can capture privacy-related context of the information-processing situation.

  1. Interpretation and Analysis of Privacy Policies of Websites in India

    DEFF Research Database (Denmark)

    Dhotre, Prashant Shantaram; Olesen, Henning; Khajuria, Samant

    2016-01-01

    the conditions specified in the policy document. So, ideally, privacy policies should be readable and provide sufficient information to empower users to make knowledgeable decisions. Thus, we have examined more than 50 privacy policies and discuss the content analysis in this paper. We discovered that the policies are not only unstructured but also described in complicated language. Our analysis shows that user data security measures are nonspecific and unsatisfactory in 57% of the privacy policies. In spite of the huge amount of information collected, the privacy policies do not have a clear description of the information collection methods, purpose, names of sharing entities, and data transit. In this study, only 11% of the privacy policies comply with privacy standards, which denotes that the other privacy policies are less committed to supporting transparency, choice, and accountability in the process of information collection.

  2. Privacy in the Genomic Era

    Science.gov (United States)

    NAVEED, MUHAMMAD; AYDAY, ERMAN; CLAYTON, ELLEN W.; FELLAY, JACQUES; GUNTER, CARL A.; HUBAUX, JEAN-PIERRE; MALIN, BRADLEY A.; WANG, XIAOFENG

    2015-01-01

    Genome sequencing technology has advanced at a rapid pace and it is now possible to generate highly-detailed genotypes inexpensively. The collection and analysis of such data has the potential to support various applications, including personalized medical services. While the benefits of the genomics revolution are trumpeted by the biomedical community, the increased availability of such data has major implications for personal privacy; notably because the genome has certain essential features, which include (but are not limited to) (i) an association with traits and certain diseases, (ii) identification capability (e.g., forensics), and (iii) revelation of family relationships. Moreover, direct-to-consumer DNA testing increases the likelihood that genome data will be made available in less regulated environments, such as the Internet and for-profit companies. The problem of genome data privacy thus resides at the crossroads of computer science, medicine, and public policy. While the computer scientists have addressed data privacy for various data types, there has been less attention dedicated to genomic data. Thus, the goal of this paper is to provide a systematization of knowledge for the computer science community. In doing so, we address some of the (sometimes erroneous) beliefs of this field and we report on a survey we conducted about genome data privacy with biomedical specialists. Then, after characterizing the genome privacy problem, we review the state-of-the-art regarding privacy attacks on genomic data and strategies for mitigating such attacks, as well as contextualizing these attacks from the perspective of medicine and public policy. This paper concludes with an enumeration of the challenges for genome data privacy and presents a framework to systematize the analysis of threats and the design of countermeasures as the field moves forward. PMID:26640318

  3. Privacy in the Genomic Era.

    Science.gov (United States)

    Naveed, Muhammad; Ayday, Erman; Clayton, Ellen W; Fellay, Jacques; Gunter, Carl A; Hubaux, Jean-Pierre; Malin, Bradley A; Wang, Xiaofeng

    2015-09-01

    Genome sequencing technology has advanced at a rapid pace and it is now possible to generate highly-detailed genotypes inexpensively. The collection and analysis of such data has the potential to support various applications, including personalized medical services. While the benefits of the genomics revolution are trumpeted by the biomedical community, the increased availability of such data has major implications for personal privacy; notably because the genome has certain essential features, which include (but are not limited to) (i) an association with traits and certain diseases, (ii) identification capability (e.g., forensics), and (iii) revelation of family relationships. Moreover, direct-to-consumer DNA testing increases the likelihood that genome data will be made available in less regulated environments, such as the Internet and for-profit companies. The problem of genome data privacy thus resides at the crossroads of computer science, medicine, and public policy. While the computer scientists have addressed data privacy for various data types, there has been less attention dedicated to genomic data. Thus, the goal of this paper is to provide a systematization of knowledge for the computer science community. In doing so, we address some of the (sometimes erroneous) beliefs of this field and we report on a survey we conducted about genome data privacy with biomedical specialists. Then, after characterizing the genome privacy problem, we review the state-of-the-art regarding privacy attacks on genomic data and strategies for mitigating such attacks, as well as contextualizing these attacks from the perspective of medicine and public policy. This paper concludes with an enumeration of the challenges for genome data privacy and presents a framework to systematize the analysis of threats and the design of countermeasures as the field moves forward.

  4. A secure data privacy preservation for on-demand

    Directory of Open Access Journals (Sweden)

    Dhasarathan Chandramohan

    2017-04-01

    Full Text Available This paper spotlights privacy and its obfuscation issues for intellectual, confidential information owned by the insurance and finance sectors. Privacy is at risk in the business era if authoritarians misuse secret information, and software interruptions enable stealing digital data in the name of third-party services. Liability for digital secrecy, business continuity, isolation, and mishandling that causes privacy breaches, and the means of preventing them, are pressing concerns in the cloud, where a huge amount of data is stored and maintained. In this developing IT world moving toward the cloud, users' privacy protection is becoming a big question. Albeit cloud computing has made changes in the computing field by increasing its effectiveness, efficiency and optimization of the service environment, cloud users' data and their identity, reliability, maintainability and privacy may vary for different CPs (cloud providers). The CP ensures that the user's proprietary information is maintained more secretly with current technologies. More remarkably, even the cloud provider does not have suggestions regarding the information and the digital data stored and maintained globally anywhere in the cloud. The proposed system addresses one of the obligatory research issues in cloud computing. We came forward by proposing the Privacy Preserving Model to Prevent Digital Data Loss in the Cloud (PPM–DDLC). This proposal helps the CR (cloud requester/users) to trust their proprietary information and data stored in the cloud.

  5. Internet and Privacy

    OpenAIRE

    Al-Fadhli, Meshal Shehab

    2007-01-01

    The concept of privacy is hard to understand and is not easy to define, because this concept is linked with several dimensions. Internet Privacy is associated with the use of the Internet and most likely appointed under communications privacy, involving the user of the Internet’s personal information and activities, and the disclosure of them online. This essay is going to present the meaning of privacy and the implications of it for Internet users. Also, this essay will demonstrate some of t...

  6. Unveiling consumer's privacy paradox behaviour in an economic exchange.

    Science.gov (United States)

    Motiwalla, Luvai F; Li, Xiao-Bai

    2016-01-01

    Privacy paradox is of great interest to IS researchers and firms gathering personal information. It has been studied from social, behavioural, and economic perspectives independently. However, prior research has not examined the degrees of influence these perspectives contribute to the privacy paradox problem. We combine both economic and behavioural perspectives in our study of the privacy paradox with a price valuation of personal information through an economic experiment combined with a behavioural study on privacy paradox. Our goal is to reveal more insights on the privacy paradox through economic valuation on personal information. Results indicate that general privacy concerns or individual disclosure concerns do not have a significant influence on the price valuation of personal information. Instead, prior disclosure behaviour in specific scenario, like with healthcare providers or social networks, is a better indicator of consumer price valuations.

  7. 75 FR 10554 - Privacy Act of 1974; System of Records Notice

    Science.gov (United States)

    2010-03-08

    ..., privacy and security objectives: Provide driver-related MCMIS crash and inspection data electronically... to submit a Freedom of Information Act (FOIA) request or Privacy Act request to FMCSA for the data..., privacy and security objectives are being met. The PSP system will only allow operator-applicants to...

  8. Enhancing source location protection in wireless sensor networks

    Science.gov (United States)

    Chen, Juan; Lin, Zhengkui; Wu, Di; Wang, Bailing

    2015-12-01

    Wireless sensor networks are widely deployed in the Internet of Things to monitor valuable objects. Once an object is monitored, the sensor nearest to the object, which is known as the source, informs the base station about the object's information periodically. It is obvious that attackers can capture the object successfully by localizing the source. Thus, many protocols have been proposed to secure the source location. However, in this paper, we show that typical source location protection protocols generate phantom locations that are not only near the source but also highly localized. As a result, attackers can trace the source easily from these phantom locations. To address these limitations, we propose a protocol to enhance the source location protection (SLE). With phantom locations far away from the source and widely distributed, SLE improves source location anonymity significantly. Theory analysis and simulation results show that our SLE provides strong source location privacy preservation and the average safety period increases by nearly one order of magnitude compared with existing work with low communication cost.
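
    As a hedged sketch of the phantom-routing idea in general (not the SLE protocol itself), the snippet below picks phantom locations that are both far from the real source and widely scattered over a hypothetical grid deployment; each report would then travel source → phantom → base station, so a back-tracing adversary converges on scattered phantoms rather than on the source:

```python
import math
import random

def pick_phantom(nodes, source, min_distance, rng=None):
    """Choose a phantom location uniformly among nodes that lie at least
    min_distance away from the real source, so successive phantoms are both
    far from the source and widely distributed."""
    rng = rng or random.Random()
    far_nodes = [n for n in nodes if math.dist(n, source) >= min_distance]
    return rng.choice(far_nodes)

# Hypothetical 20x20 grid deployment with the source near the centre.
nodes = [(x, y) for x in range(20) for y in range(20)]
source, base_station = (10, 10), (0, 0)
phantoms = [pick_phantom(nodes, source, min_distance=8.0) for _ in range(5)]
# Each message is routed source -> phantom -> base_station.
print(phantoms)
```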

  9. 76 FR 64112 - Privacy Act of 1974; Privacy Act System of Records Appendices

    Science.gov (United States)

    2011-10-17

    ... NATIONAL AERONAUTICS AND SPACE ADMINISTRATION [Notice (11-091)] Privacy Act of 1974; Privacy Act...: Revisions of NASA Appendices to Privacy Act System of Records. SUMMARY: Notice is hereby given that NASA is... Privacy Act of 1974. This notice publishes those amendments as set forth below under the caption...

  10. Privacy encounters in Teledialogue

    DEFF Research Database (Denmark)

    Andersen, Lars Bo; Bøge, Ask Risom; Danholt, Peter

    2017-01-01

    Privacy is a major concern when new technologies are introduced between public authorities and private citizens. What is meant by privacy, however, is often unclear and contested. Accordingly, this article utilises grounded theory to study privacy empirically in the research and design project Teledialogue, aimed at introducing new ways for public case managers and placed children to communicate through IT. The resulting argument is that privacy can be understood as an encounter, that is, as something that arises between implicated actors and entails some degree of friction and negotiation. An argument which is further qualified through the philosophy of Gilles Deleuze. The article opens with a review of privacy literature before continuing to present privacy as an encounter with five different foci: what technologies bring into the encounter; who is related to privacy by implication; what...

  11. Toward privacy-preserving JPEG image retrieval

    Science.gov (United States)

    Cheng, Hang; Wang, Jingyue; Wang, Meiqing; Zhong, Shangping

    2017-07-01

    This paper proposes a privacy-preserving retrieval scheme for JPEG images based on local variance. Three parties are involved in the scheme: the content owner, the server, and the authorized user. The content owner encrypts JPEG images for privacy protection by jointly using permutation cipher and stream cipher, and then, the encrypted versions are uploaded to the server. With an encrypted query image provided by an authorized user, the server may extract blockwise local variances in different directions without knowing the plaintext content. After that, it can calculate the similarity between the encrypted query image and each encrypted database image by a local variance-based feature comparison mechanism. The authorized user with the encryption key can decrypt the returned encrypted images with plaintext content similar to the query image. The experimental results show that the proposed scheme not only provides effective privacy-preserving retrieval service but also ensures both format compliance and file size preservation for encrypted JPEG images.
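
    For orientation, a plaintext sketch of the blockwise local-variance feature and the similarity comparison it supports is given below; the actual scheme computes these features from encrypted JPEG data, which is not reproduced here, and the image sizes and block size are hypothetical:

```python
import numpy as np

def block_variance_features(image, block=8):
    """Split a grayscale image into block x block tiles and return the local
    variance of each tile as a feature vector (plaintext illustration only)."""
    h, w = image.shape
    feats = []
    for r in range(0, h - h % block, block):
        for c in range(0, w - w % block, block):
            feats.append(image[r:r + block, c:c + block].var())
    return np.array(feats)

def similarity(query_feats, db_feats):
    """Smaller L1 distance between variance vectors means more similar images."""
    return float(np.abs(query_feats - db_feats).sum())

rng = np.random.default_rng(0)
query = rng.integers(0, 256, (64, 64)).astype(float)
candidate = query + rng.normal(0, 5, query.shape)          # similar image
unrelated = rng.integers(0, 256, (64, 64)).astype(float)   # different image
q = block_variance_features(query)
print(similarity(q, block_variance_features(candidate)) <
      similarity(q, block_variance_features(unrelated)))   # expected: True
```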

  12. Privacy-Related Context Information for Ubiquitous Health

    Science.gov (United States)

    Nykänen, Pirkko; Ruotsalainen, Pekka

    2014-01-01

    data can be processed or how components are regulated or in what kind of environment data can be processed. Conclusions This study added to the vision of ubiquitous health by analyzing information processing from the viewpoint of an individual’s privacy. We learned that health and wellness-related activities may happen in several environments and situations with multiple stakeholders, services, and systems. We have provided new knowledge regarding privacy-related context information and corresponding components by analyzing typical activities in ubiquitous health. With the identified components and their properties, individuals can define their personal preferences on information processing based on situational information, and privacy services can capture privacy-related context of the information-processing situation. PMID:25100084

  13. Acoustic assessment of speech privacy curtains in two nursing units

    Science.gov (United States)

    Pope, Diana S.; Miller-Klein, Erik T.

    2016-01-01

    Hospitals have complex soundscapes that create challenges to patient care. Extraneous noise and high reverberation rates impair speech intelligibility, which leads to raised voices. In an unintended spiral, the increasing noise may result in diminished speech privacy, as people speak loudly to be heard over the din. The products available to improve hospital soundscapes include construction materials that absorb sound (acoustic ceiling tiles, carpet, wall insulation) and reduce reverberation rates. Enhanced privacy curtains are now available and offer potential for a relatively simple way to improve speech privacy and speech intelligibility by absorbing sound at the hospital patient's bedside. Acoustic assessments were performed over 2 days on two nursing units with a similar design in the same hospital. One unit was built with the 1970s’ standard hospital construction and the other was newly refurbished (2013) with sound-absorbing features. In addition, we determined the effect of an enhanced privacy curtain versus standard privacy curtains using acoustic measures of speech privacy and speech intelligibility indexes. Privacy curtains provided auditory protection for the patients. In general, that protection was increased by the use of enhanced privacy curtains. On average, the enhanced curtain improved sound absorption from 20% to 30%; however, there was considerable variability, depending on the configuration of the rooms tested. Enhanced privacy curtains provide measurable improvement to the acoustics of patient rooms but cannot overcome larger acoustic design issues. To shorten reverberation time, additional absorption, and compact and more fragmented nursing unit floor plate shapes should be considered. PMID:26780959

  14. Acoustic assessment of speech privacy curtains in two nursing units.

    Science.gov (United States)

    Pope, Diana S; Miller-Klein, Erik T

    2016-01-01

    Hospitals have complex soundscapes that create challenges to patient care. Extraneous noise and high reverberation rates impair speech intelligibility, which leads to raised voices. In an unintended spiral, the increasing noise may result in diminished speech privacy, as people speak loudly to be heard over the din. The products available to improve hospital soundscapes include construction materials that absorb sound (acoustic ceiling tiles, carpet, wall insulation) and reduce reverberation rates. Enhanced privacy curtains are now available and offer potential for a relatively simple way to improve speech privacy and speech intelligibility by absorbing sound at the hospital patient's bedside. Acoustic assessments were performed over 2 days on two nursing units with a similar design in the same hospital. One unit was built with the 1970s' standard hospital construction and the other was newly refurbished (2013) with sound-absorbing features. In addition, we determined the effect of an enhanced privacy curtain versus standard privacy curtains using acoustic measures of speech privacy and speech intelligibility indexes. Privacy curtains provided auditory protection for the patients. In general, that protection was increased by the use of enhanced privacy curtains. On average, the enhanced curtain improved sound absorption from 20% to 30%; however, there was considerable variability, depending on the configuration of the rooms tested. Enhanced privacy curtains provide measurable improvement to the acoustics of patient rooms but cannot overcome larger acoustic design issues. To shorten reverberation time, additional absorption, and compact and more fragmented nursing unit floor plate shapes should be considered.

  15. Acoustic assessment of speech privacy curtains in two nursing units

    Directory of Open Access Journals (Sweden)

    Diana S Pope

    2016-01-01

    Full Text Available Hospitals have complex soundscapes that create challenges to patient care. Extraneous noise and high reverberation rates impair speech intelligibility, which leads to raised voices. In an unintended spiral, the increasing noise may result in diminished speech privacy, as people speak loudly to be heard over the din. The products available to improve hospital soundscapes include construction materials that absorb sound (acoustic ceiling tiles, carpet, wall insulation) and reduce reverberation rates. Enhanced privacy curtains are now available and offer potential for a relatively simple way to improve speech privacy and speech intelligibility by absorbing sound at the hospital patient's bedside. Acoustic assessments were performed over 2 days on two nursing units with a similar design in the same hospital. One unit was built with the 1970s' standard hospital construction and the other was newly refurbished (2013) with sound-absorbing features. In addition, we determined the effect of an enhanced privacy curtain versus standard privacy curtains using acoustic measures of speech privacy and speech intelligibility indexes. Privacy curtains provided auditory protection for the patients. In general, that protection was increased by the use of enhanced privacy curtains. On average, the enhanced curtain improved sound absorption from 20% to 30%; however, there was considerable variability, depending on the configuration of the rooms tested. Enhanced privacy curtains provide measurable improvement to the acoustics of patient rooms but cannot overcome larger acoustic design issues. To shorten reverberation time, additional absorption, and compact and more fragmented nursing unit floor plate shapes should be considered.

  16. Role Management in a Privacy-Enhanced Collaborative Environment

    Science.gov (United States)

    Lorenz, Anja; Borcea-Pfitzmann, Katrin

    2010-01-01

    Purpose: Facing the dilemma between collaboration and privacy is a continual challenge for users. In this setting, the purpose of this paper is to discuss issues of a highly flexible role management integrated in a privacy-enhanced collaborative environment (PECE). Design/methodology/approach: The general framework was provided by former findings…

  17. An examination of electronic health information privacy in older adults.

    Science.gov (United States)

    Le, Thai; Thompson, Hilaire; Demiris, George

    2013-01-01

    Older adults are the quickest growing demographic group and are key consumers of health services. As the United States health system transitions to electronic health records, it is important to understand older adult perceptions of privacy and security. We performed a secondary analysis of the Health Information National Trends Survey (2012, Cycle 1), to examine differences in perceptions of electronic health information privacy between older adults and the general population. We found differences in the level of importance placed on access to electronic health information (older adults placed greater emphasis on provider as opposed to personal access) and tendency to withhold information out of concerns for privacy and security (older adults were less likely to withhold information). We provide recommendations to alleviate some of these privacy concerns. This may facilitate greater use of electronic health communication between patient and provider, while promoting shared decision making.

  18. Privacy-Preserving Biometric Authentication: Challenges and Directions

    Directory of Open Access Journals (Sweden)

    Elena Pagnin

    2017-01-01

    Full Text Available An emerging direction for authenticating people is the adoption of biometric authentication systems. Biometric credentials are becoming increasingly popular as a means of authenticating people due to the wide range of advantages that they provide with respect to classical authentication methods (e.g., password-based authentication). The most characteristic feature of this authentication method is the naturally strong bond between a user and her biometric credentials. This very same advantageous property, however, raises serious security and privacy concerns in case the biometric trait gets compromised. In this article, we present the most challenging issues that need to be taken into consideration when designing secure and privacy-preserving biometric authentication protocols. More precisely, we describe the main threats against privacy-preserving biometric authentication systems and give directions on possible countermeasures in order to design secure and privacy-preserving biometric authentication protocols.

  19. Trajectory data privacy protection based on differential privacy mechanism

    Science.gov (United States)

    Gu, Ke; Yang, Lihao; Liu, Yongzhi; Liao, Niandong

    2018-05-01

    In this paper, we propose a trajectory data privacy protection scheme based on differential privacy mechanism. In the proposed scheme, the algorithm first selects the protected points from the user’s trajectory data; secondly, the algorithm forms the polygon according to the protected points and the adjacent and high frequent accessed points that are selected from the accessing point database, then the algorithm calculates the polygon centroids; finally, the noises are added to the polygon centroids by the differential privacy method, and the polygon centroids replace the protected points, and then the algorithm constructs and issues the new trajectory data. The experiments show that the running time of the proposed algorithms is fast, the privacy protection of the scheme is effective and the data usability of the scheme is higher.
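
    The centroid-perturbation step lends itself to a short sketch. The code below (a simplification, not the paper's exact calibration) averages a protected point with its adjacent frequently accessed points and adds Laplace noise to the resulting centroid; the epsilon, the sensitivity value and the coordinates are illustrative assumptions.

      import numpy as np

      def polygon_centroid(points):
          # Plain average of the polygon vertices (the protected point plus the
          # adjacent, frequently accessed points selected around it).
          return np.asarray(points, dtype=float).mean(axis=0)

      def dp_centroid(points, epsilon=0.5, sensitivity=0.01):
          # Laplace mechanism: noise with scale = sensitivity / epsilon is added
          # to each coordinate; the noisy centroid then replaces the protected
          # point in the published trajectory.
          centroid = polygon_centroid(points)
          noise = np.random.laplace(0.0, sensitivity / epsilon, size=centroid.shape)
          return centroid + noise

      # Toy usage: one protected GPS point with three frequent neighbours.
      polygon = [(39.9042, 116.4074), (39.9050, 116.4080),
                 (39.9035, 116.4069), (39.9047, 116.4062)]
      print(dp_centroid(polygon, epsilon=0.5))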

  20. Practical Privacy Assessment

    DEFF Research Database (Denmark)

    Peen, Søren; Jansen, Thejs Willem; Jensen, Christian D.

    2008-01-01

    This chapter proposes a privacy assessment model called the Operational Privacy Assessment Model that includes organizational, operational and technical factors for the protection of personal data stored in an IT system. The factors can be evaluated in a simple scale so that not only the resulting...... graphical depiction can be easily created for an IT system, but graphical comparisons across multiple IT systems are also possible. Examples of factors presented in a Kiviat graph are also presented. This assessment tool may be used to standardize privacy assessment criteria, making it less painful...... for the management to assess privacy risks on their systems....

  1. One Size Doesn’t Fit All: Measuring Individual Privacy in Aggregate Genomic Data

    Science.gov (United States)

    Simmons, Sean; Berger, Bonnie

    2017-01-01

    Even in the aggregate, genomic data can reveal sensitive information about individuals. We present a new model-based measure, PrivMAF, that provides provable privacy guarantees for aggregate data (namely minor allele frequencies) obtained from genomic studies. Unlike many previous measures that have been designed to measure the total privacy lost by all participants in a study, PrivMAF gives an individual privacy measure for each participant in the study, not just an average measure. These individual measures can then be combined to measure the worst case privacy loss in the study. Our measure also allows us to quantify the privacy gains achieved by perturbing the data, either by adding noise or binning. Our findings demonstrate that both perturbation approaches offer significant privacy gains. Moreover, we see that these privacy gains can be achieved while minimizing perturbation (and thus maximizing the utility) relative to stricter notions of privacy, such as differential privacy. We test PrivMAF using genotype data from the Wellcome Trust Case Control Consortium, providing a more nuanced understanding of the privacy risks involved in an actual genome-wide association study. Interestingly, our analysis demonstrates that the privacy implications of releasing MAFs from a study can differ greatly from individual to individual. An implementation of our method is available at http://privmaf.csail.mit.edu. PMID:29202050
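
    The two perturbation mechanisms mentioned above can be sketched as follows; this is not PrivMAF itself (which measures per-individual privacy loss), and the sensitivity of 1/n assumed for a cohort of n diploid participants is an illustrative simplification.

      import numpy as np

      def noisy_maf(maf, epsilon=1.0, n=1000):
          # Noise addition: each participant contributes at most 2 of the 2n
          # alleles, so one individual can shift a minor allele frequency by at
          # most 1/n; Laplace noise with scale 1/(n*epsilon) masks that change.
          noise = np.random.laplace(0.0, 1.0 / (n * epsilon), size=np.shape(maf))
          return np.clip(np.asarray(maf, dtype=float) + noise, 0.0, 1.0)

      def binned_maf(maf, bin_width=0.05):
          # Binning: report only the midpoint of the bin each MAF falls into.
          maf = np.asarray(maf, dtype=float)
          return (np.floor(maf / bin_width) + 0.5) * bin_width

      mafs = np.array([0.12, 0.031, 0.48])
      print(noisy_maf(mafs, epsilon=1.0, n=2000))
      print(binned_maf(mafs))   # -> [0.125 0.025 0.475]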

  2. Unveiling consumer’s privacy paradox behaviour in an economic exchange

    Science.gov (United States)

    Li, Xiao-Bai

    2015-01-01

    The privacy paradox is of great interest to IS researchers and firms gathering personal information. It has been studied from social, behavioural, and economic perspectives independently. However, prior research has not examined the degrees of influence these perspectives contribute to the privacy paradox problem. We combine both economic and behavioural perspectives in our study of the privacy paradox with a price valuation of personal information through an economic experiment combined with a behavioural study on the privacy paradox. Our goal is to reveal more insights on the privacy paradox through economic valuation of personal information. Results indicate that general privacy concerns or individual disclosure concerns do not have a significant influence on the price valuation of personal information. Instead, prior disclosure behaviour in specific scenarios, such as with healthcare providers or social networks, is a better indicator of consumer price valuations. PMID:27708687

  3. Neuroethics and Brain Privacy

    DEFF Research Database (Denmark)

    Ryberg, Jesper

    2017-01-01

    An introduction is presented in which the editor discusses various articles within the issue, on topics including ethical challenges concerning the importance of privacy for well-being, the impact of brain-reading on mind privacy, and neurotechnology.

  4. Protecting genetic privacy.

    Science.gov (United States)

    Roche, P A; Annas, G J

    2001-05-01

    This article outlines the arguments for and against new rules to protect genetic privacy. We explain why genetic information is different to other sensitive medical information, why researchers and biotechnology companies have opposed new rules to protect genetic privacy (and favour anti-discrimination laws instead), and discuss what can be done to protect privacy in relation to genetic-sequence information and to DNA samples themselves.

  5. Big Data and Consumer Participation in Privacy Contracts: Deciding who Decides on Privacy

    Directory of Open Access Journals (Sweden)

    Michiel Rhoen

    2015-02-01

    Full Text Available Big data puts data protection to the test. Consumers granting permission to process their personal data are increasingly opening up their personal lives, thanks to the “datafication” of everyday life, indefinite data retention and the increasing sophistication of algorithms for analysis. The privacy implications of big data call for serious consideration of consumers’ opportunities to participate in decision-making processes about their contracts. If these opportunities are insufficient, the resulting rules may represent special interests rather than consumers’ needs. This may undermine the legitimacy of big data applications. This article argues that providing sufficient consumer participation in privacy matters requires choosing the best available decision making mechanism. Is a consumer to negotiate his own privacy terms in the market, will lawmakers step in on his behalf, or is he to seek protection through courts? Furthermore, is this a matter of national law or European law? These choices will affect the opportunities for achieving different policy goals associated with the possible benefits of the “big data revolution”.

  6. Disentangling privacy from property: toward a deeper understanding of genetic privacy.

    Science.gov (United States)

    Suter, Sonia M

    2004-04-01

    With the mapping of the human genome, genetic privacy has become a concern to many. People care about genetic privacy because genes play an important role in shaping us--our genetic information is about us, and it is deeply connected to our sense of ourselves. In addition, unwanted disclosure of our genetic information, like a great deal of other personal information, makes us vulnerable to unwanted exposure, stigmatization, and discrimination. One recent approach to protecting genetic privacy is to create property rights in genetic information. This Article argues against that approach. Privacy and property are fundamentally different concepts. At heart, the term "property" connotes control within the marketplace and over something that is disaggregated or alienable from the self. "Privacy," in contrast, connotes control over access to the self as well as things close to, intimately connected to, and about the self. Given these different meanings, a regime of property rights in genetic information would impoverish our understanding of that information, ourselves, and the relationships we hope will be built around and through its disclosure. This Article explores our interests in genetic information in order to deepen our understanding of the ongoing discourse about the distinction between property and privacy. It develops a conception of genetic privacy with a strong relational component. We ordinarily share genetic information in the context of relationships in which disclosure is important to the relationship--family, intimate, doctor-patient, researcher-participant, employer-employee, and insurer-insured relationships. Such disclosure makes us vulnerable to and dependent on the person to whom we disclose it. As a result, trust is essential to the integrity of these relationships and our sharing of genetic information. Genetic privacy can protect our vulnerability in these relationships and enhance the trust we hope to have in them. Property, in contrast, by

  7. Public Opinion about the Importance of Privacy in Biobank Research

    Science.gov (United States)

    Kaufman, David J.; Murphy-Bollinger, Juli; Scott, Joan; Hudson, Kathy L.

    2009-01-01

    Concerns about privacy may deter people from participating in genetic research. Recruitment and retention of biobank participants requires understanding the nature and magnitude of these concerns. Potential participants in a proposed biobank were asked about their willingness to participate, their privacy concerns, informed consent, and data sharing. A representative survey of 4659 U.S. adults was conducted. Ninety percent of respondents would be concerned about privacy, 56% would be concerned about researchers having their information, and 37% would worry that study data could be used against them. However, 60% would participate in the biobank if asked. Nearly half (48%) would prefer to provide consent once for all research approved by an oversight panel, whereas 42% would prefer to provide consent for each project separately. Although 92% would allow academic researchers to use study data, 80% and 75%, respectively, would grant access to government and industry researchers. Concern about privacy was related to lower willingness to participate only when respondents were told that they would receive $50 for participation and would not receive individual research results back. Among respondents who were told that they would receive $200 or individual research results, privacy concerns were not related to willingness. Survey respondents valued both privacy and participation in biomedical research. Despite pervasive privacy concerns, 60% would participate in a biobank. Assuring research participants that their privacy will be protected to the best of researchers' abilities may increase participants' acceptance of consent for broad research uses of biobank data by a wide range of researchers. PMID:19878915

  8. Privacy-Enhanced and Multifunctional Health Data Aggregation under Differential Privacy Guarantees.

    Science.gov (United States)

    Ren, Hao; Li, Hongwei; Liang, Xiaohui; He, Shibo; Dai, Yuanshun; Zhao, Lian

    2016-09-10

    With the rapid growth of the health data scale, the limited storage and computation resources of wireless body area sensor networks (WBANs) are becoming a barrier to their development. Therefore, outsourcing the encrypted health data to the cloud has been an appealing strategy. However, data aggregation will become difficult. Some recently-proposed schemes try to address this problem. However, there are still some functions and privacy issues that are not discussed. In this paper, we propose a privacy-enhanced and multifunctional health data aggregation scheme (PMHA-DP) under differential privacy. Specifically, we achieve a new aggregation function, weighted average (WAAS), and design a privacy-enhanced aggregation scheme (PAAS) to protect the aggregated data from cloud servers. Besides, a histogram aggregation scheme with high accuracy is proposed. PMHA-DP supports fault tolerance while preserving data privacy. The performance evaluation shows that the proposal leads to less communication overhead than the existing one.
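
    A plaintext sketch of a weighted-average aggregate (the flavour of the WAAS function) under differential privacy is shown below; the actual PMHA-DP scheme additionally encrypts individual contributions so the cloud never sees raw readings, and the value range, weights, epsilon and the sensitivity bound used here are illustrative assumptions.

      import numpy as np

      def dp_weighted_average(values, weights, epsilon=1.0, value_range=(0.0, 200.0)):
          # Weighted average of health readings plus Laplace noise. Changing one
          # reading (bounded to value_range) moves the average by at most
          # max(weight) * range / sum(weights), which is used as the sensitivity.
          values = np.asarray(values, dtype=float)
          weights = np.asarray(weights, dtype=float)
          avg = np.average(values, weights=weights)
          sensitivity = weights.max() * (value_range[1] - value_range[0]) / weights.sum()
          return avg + np.random.laplace(0.0, sensitivity / epsilon)

      # Toy usage: heart-rate readings weighted by sensor confidence.
      print(dp_weighted_average([72, 80, 91], weights=[0.9, 0.6, 0.3], epsilon=0.8))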

  9. Privacy Verification Using Ontologies

    NARCIS (Netherlands)

    Kost, Martin; Freytag, Johann-Christoph; Kargl, Frank; Kung, Antonio

    2011-01-01

    As information systems extensively exchange information between participants, privacy concerns may arise from its potential misuse. A Privacy by Design (PbD) approach considers privacy requirements of different stakeholders during the design and the implementation of a system. Currently, a

  10. Identity management and privacy languages technologies: Improving user control of data privacy

    Science.gov (United States)

    García, José Enrique López; García, Carlos Alberto Gil; Pacheco, Álvaro Armenteros; Organero, Pedro Luis Muñoz

    Identity management solutions have the capability to bring confidence to internet services, but this confidence could be improved if users had more control over the privacy policies governing their attributes. Privacy languages can help with this task thanks to their capability to define privacy policies for data in a very flexible way. An integration problem therefore arises: making identity management and privacy languages work together. Although several proposals for accomplishing this have already been defined, this paper suggests some topics and improvements that could be considered.

  11. Privacy-Enhanced and Multifunctional Health Data Aggregation under Differential Privacy Guarantees

    Science.gov (United States)

    Ren, Hao; Li, Hongwei; Liang, Xiaohui; He, Shibo; Dai, Yuanshun; Zhao, Lian

    2016-01-01

    With the rapid growth of the health data scale, the limited storage and computation resources of wireless body area sensor networks (WBANs) are becoming a barrier to their development. Therefore, outsourcing the encrypted health data to the cloud has been an appealing strategy. However, data aggregation will become difficult. Some recently-proposed schemes try to address this problem. However, there are still some functions and privacy issues that are not discussed. In this paper, we propose a privacy-enhanced and multifunctional health data aggregation scheme (PMHA-DP) under differential privacy. Specifically, we achieve a new aggregation function, weighted average (WAAS), and design a privacy-enhanced aggregation scheme (PAAS) to protect the aggregated data from cloud servers. Besides, a histogram aggregation scheme with high accuracy is proposed. PMHA-DP supports fault tolerance while preserving data privacy. The performance evaluation shows that the proposal leads to less communication overhead than the existing one. PMID:27626417

  12. 48 CFR 39.105 - Privacy.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Privacy. 39.105 Section 39... CONTRACTING ACQUISITION OF INFORMATION TECHNOLOGY General 39.105 Privacy. Agencies shall ensure that contracts for information technology address protection of privacy in accordance with the Privacy Act (5 U.S.C...

  13. Privacy in domestic environments

    OpenAIRE

    Radics, Peter J; Gracanin, Denis

    2011-01-01

    non-peer-reviewed While there is a growing body of research on privacy, most of the work focuses on information privacy. Physical and psychological privacy issues receive little to no attention. However, the introduction of technology into our lives can cause problems with regard to these aspects of privacy. This is especially true when it comes to our homes, both as nodes of our social life and places for relaxation. This paper presents the results of a study intended to captu...

  14. Privacy is an essentially contested concept: a multi-dimensional analytic for mapping privacy

    Science.gov (United States)

    Koopman, Colin; Doty, Nick

    2016-01-01

    The meaning of privacy has been much disputed throughout its history in response to wave after wave of new technological capabilities and social configurations. The current round of disputes over privacy fuelled by data science has been a cause of despair for many commentators and a death knell for privacy itself for others. We argue that privacy’s disputes are neither an accidental feature of the concept nor a lamentable condition of its applicability. Privacy is essentially contested. Because it is, privacy is transformable according to changing technological and social conditions. To make productive use of privacy’s essential contestability, we argue for a new approach to privacy research and practical design, focused on the development of conceptual analytics that facilitate dissecting privacy’s multiple uses across multiple contexts. This article is part of the themed issue ‘The ethical impact of data science’. PMID:28336797

  15. Android Watchdog - A Privacy Preserving Android Application

    OpenAIRE

    Stenbro, Fredrik; Falk, Sigurd Hagen

    2015-01-01

    This study explores issues related to privacy, both in general, and especially on Android smartphones. Previous research indicates that people often are irrational when it comes to privacy. They state that they are in control of their digitally stored personal information, but their actions show the opposite. On Android devices, permissions are intended to provide users with information about the critical functionality an application can implement by requesting it on install-time. This vision...

  16. 77 FR 38597 - Multistakeholder Process To Develop Consumer Data Privacy Code of Conduct Concerning Mobile...

    Science.gov (United States)

    2012-06-28

    ... Global Digital Economy (the ``Privacy Blueprint'').\\1\\ The Privacy Blueprint directs NTIA to convene... companies providing applications and interactive services for mobile devices handle personal data.\\3\\ \\1\\ The Privacy Blueprint is available at http://www.whitehouse.gov/sites/default/files/privacy-final.pdf...

  17. Auditing cloud computing a security and privacy guide

    CERN Document Server

    Halpert, Ben

    2011-01-01

    The auditor's guide to ensuring correct security and privacy practices in a cloud computing environment Many organizations are reporting or projecting a significant cost savings through the use of cloud computing-utilizing shared computing resources to provide ubiquitous access for organizations and end users. Just as many organizations, however, are expressing concern with security and privacy issues for their organization's data in the "cloud." Auditing Cloud Computing provides necessary guidance to build a proper audit to ensure operational integrity and customer data protection, among othe

  18. Towards Territorial Privacy in Smart Environments

    NARCIS (Netherlands)

    Könings, Bastian; Schaub, Florian; Weber, M.; Kargl, Frank

    Territorial privacy is an old concept for privacy of the personal space dating back to the 19th century. Despite its former relevance, territorial privacy has been neglected in recent years, while privacy research and legislation mainly focused on the issue of information privacy. However, with the

  19. Cyber security challenges in Smart Cities: Safety, security and privacy

    Science.gov (United States)

    Elmaghraby, Adel S.; Losavio, Michael M.

    2014-01-01

    The world is experiencing an evolution of Smart Cities. These emerge from innovations in information technology that, while they create new economic and social opportunities, pose challenges to our security and expectations of privacy. Humans are already interconnected via smart phones and gadgets. Smart energy meters, security devices and smart appliances are being used in many cities. Homes, cars, public venues and other social systems are now on their path to the full connectivity known as the “Internet of Things.” Standards are evolving for all of these potentially connected systems. They will lead to unprecedented improvements in the quality of life. To benefit from them, city infrastructures and services are changing with new interconnected systems for monitoring, control and automation. Intelligent transportation, public and private, will access a web of interconnected data from GPS location to weather and traffic updates. Integrated systems will aid public safety, emergency responders and in disaster recovery. We examine two important and entangled challenges: security and privacy. Security includes illegal access to information and attacks causing physical disruptions in service availability. As digital citizens are more and more instrumented with data available about their location and activities, privacy seems to disappear. Privacy protecting systems that gather data and trigger emergency response when needed are technological challenges that go hand-in-hand with the continuous security challenges. Their implementation is essential for a Smart City in which we would wish to live. We also present a model representing the interactions between person, servers and things. Those are the major elements in the Smart City, and their interactions are what we need to protect. PMID:25685517

  20. Cyber security challenges in Smart Cities: Safety, security and privacy

    Directory of Open Access Journals (Sweden)

    Adel S. Elmaghraby

    2014-07-01

    Full Text Available The world is experiencing an evolution of Smart Cities. These emerge from innovations in information technology that, while they create new economic and social opportunities, pose challenges to our security and expectations of privacy. Humans are already interconnected via smart phones and gadgets. Smart energy meters, security devices and smart appliances are being used in many cities. Homes, cars, public venues and other social systems are now on their path to the full connectivity known as the “Internet of Things.” Standards are evolving for all of these potentially connected systems. They will lead to unprecedented improvements in the quality of life. To benefit from them, city infrastructures and services are changing with new interconnected systems for monitoring, control and automation. Intelligent transportation, public and private, will access a web of interconnected data from GPS location to weather and traffic updates. Integrated systems will aid public safety, emergency responders and in disaster recovery. We examine two important and entangled challenges: security and privacy. Security includes illegal access to information and attacks causing physical disruptions in service availability. As digital citizens are more and more instrumented with data available about their location and activities, privacy seems to disappear. Privacy protecting systems that gather data and trigger emergency response when needed are technological challenges that go hand-in-hand with the continuous security challenges. Their implementation is essential for a Smart City in which we would wish to live. We also present a model representing the interactions between person, servers and things. Those are the major elements in the Smart City, and their interactions are what we need to protect.

  1. Cyber security challenges in Smart Cities: Safety, security and privacy.

    Science.gov (United States)

    Elmaghraby, Adel S; Losavio, Michael M

    2014-07-01

    The world is experiencing an evolution of Smart Cities. These emerge from innovations in information technology that, while they create new economic and social opportunities, pose challenges to our security and expectations of privacy. Humans are already interconnected via smart phones and gadgets. Smart energy meters, security devices and smart appliances are being used in many cities. Homes, cars, public venues and other social systems are now on their path to the full connectivity known as the "Internet of Things." Standards are evolving for all of these potentially connected systems. They will lead to unprecedented improvements in the quality of life. To benefit from them, city infrastructures and services are changing with new interconnected systems for monitoring, control and automation. Intelligent transportation, public and private, will access a web of interconnected data from GPS location to weather and traffic updates. Integrated systems will aid public safety, emergency responders and in disaster recovery. We examine two important and entangled challenges: security and privacy. Security includes illegal access to information and attacks causing physical disruptions in service availability. As digital citizens are more and more instrumented with data available about their location and activities, privacy seems to disappear. Privacy protecting systems that gather data and trigger emergency response when needed are technological challenges that go hand-in-hand with the continuous security challenges. Their implementation is essential for a Smart City in which we would wish to live. We also present a model representing the interactions between person, servers and things. Those are the major elements in the Smart City, and their interactions are what we need to protect.

  2. Privacy and Library Records

    Science.gov (United States)

    Bowers, Stacey L.

    2006-01-01

    This paper summarizes the history of privacy as it relates to library records. It commences with a discussion of how the concept of privacy first originated through case law and follows the concept of privacy as it has affected library records through current day and the "USA PATRIOT Act."

  3. Trust and Privacy Solutions Based on Holistic Service Requirements

    Science.gov (United States)

    Sánchez Alcón, José Antonio; López, Lourdes; Martínez, José-Fernán; Rubio Cifuentes, Gregorio

    2015-01-01

    The products and services designed for Smart Cities provide the necessary tools to improve the management of modern cities in a more efficient way. These tools need to gather citizens’ information about their activity, preferences, habits, etc. opening up the possibility of tracking them. Thus, privacy and security policies must be developed in order to satisfy and manage the legislative heterogeneity surrounding the services provided and comply with the laws of the country where they are provided. This paper presents one of the possible solutions to manage this heterogeneity, bearing in mind these types of networks, such as Wireless Sensor Networks, have important resource limitations. A knowledge and ontology management system is proposed to facilitate the collaboration between the business, legal and technological areas. This will ease the implementation of adequate specific security and privacy policies for a given service. All these security and privacy policies are based on the information provided by the deployed platforms and by expert system processing. PMID:26712752

  4. Trust and Privacy Solutions Based on Holistic Service Requirements

    Directory of Open Access Journals (Sweden)

    José Antonio Sánchez Alcón

    2015-12-01

    Full Text Available The products and services designed for Smart Cities provide the necessary tools to improve the management of modern cities in a more efficient way. These tools need to gather citizens’ information about their activity, preferences, habits, etc. opening up the possibility of tracking them. Thus, privacy and security policies must be developed in order to satisfy and manage the legislative heterogeneity surrounding the services provided and comply with the laws of the country where they are provided. This paper presents one of the possible solutions to manage this heterogeneity, bearing in mind these types of networks, such as Wireless Sensor Networks, have important resource limitations. A knowledge and ontology management system is proposed to facilitate the collaboration between the business, legal and technological areas. This will ease the implementation of adequate specific security and privacy policies for a given service. All these security and privacy policies are based on the information provided by the deployed platforms and by expert system processing.

  5. Trust and Privacy Solutions Based on Holistic Service Requirements.

    Science.gov (United States)

    Sánchez Alcón, José Antonio; López, Lourdes; Martínez, José-Fernán; Rubio Cifuentes, Gregorio

    2015-12-24

    The products and services designed for Smart Cities provide the necessary tools to improve the management of modern cities in a more efficient way. These tools need to gather citizens' information about their activity, preferences, habits, etc. opening up the possibility of tracking them. Thus, privacy and security policies must be developed in order to satisfy and manage the legislative heterogeneity surrounding the services provided and comply with the laws of the country where they are provided. This paper presents one of the possible solutions to manage this heterogeneity, bearing in mind these types of networks, such as Wireless Sensor Networks, have important resource limitations. A knowledge and ontology management system is proposed to facilitate the collaboration between the business, legal and technological areas. This will ease the implementation of adequate specific security and privacy policies for a given service. All these security and privacy policies are based on the information provided by the deployed platforms and by expert system processing.

  6. Data privacy for the smart grid

    CERN Document Server

    Herold, Rebecca

    2015-01-01

    Contents include: The Smart Grid and Privacy; What Is the Smart Grid?; Changes from Traditional Energy Delivery; Smart Grid Possibilities; Business Model Transformations; Emerging Privacy Risks; The Need for Privacy Policies; Privacy Laws, Regulations, and Standards; Privacy-Enhancing Technologies; New Privacy Challenges; IoT; Big Data; What Is the Smart Grid?; Market and Regulatory Overview; Traditional Electricity Business Sector; The Electricity Open Market; Classifications of Utilities; Rate-Making Processes; Electricity Consumer

  7. Privacy Attitudes among Early Adopters of Emerging Health Technologies.

    Directory of Open Access Journals (Sweden)

    Cynthia Cheung

    Full Text Available Advances in health technology such as genome sequencing and wearable sensors now allow for the collection of highly granular personal health data from individuals. It is unclear how people think about privacy in the context of these emerging health technologies. An open question is whether early adopters of these advances conceptualize privacy in different ways than non-early adopters. This study sought to understand privacy attitudes of early adopters of emerging health technologies. Transcripts from in-depth, semi-structured interviews with early adopters of genome sequencing and health devices and apps were analyzed with a focus on participant attitudes and perceptions of privacy. Themes were extracted using inductive content analysis. Although interviewees were willing to share personal data to support scientific advancements, they still expressed concerns, as well as uncertainty about who has access to their data, and for what purpose. In short, they were not dismissive of privacy risks. Key privacy-related findings are organized into four themes as follows: first, personal data privacy; second, control over personal information; third, concerns about discrimination; and fourth, contributing personal data to science. Early adopters of emerging health technologies appear to have more complex and nuanced conceptions of privacy than might be expected based on their adoption of personal health technologies and participation in open science. Early adopters also voiced uncertainty about the privacy implications of their decisions to use new technologies and share their data for research. Though not representative of the general public, studies of early adopters can provide important insights into evolving attitudes toward privacy in the context of emerging health technologies and personal health data research.

  8. Privacy Attitudes among Early Adopters of Emerging Health Technologies.

    Science.gov (United States)

    Cheung, Cynthia; Bietz, Matthew J; Patrick, Kevin; Bloss, Cinnamon S

    2016-01-01

    Advances in health technology such as genome sequencing and wearable sensors now allow for the collection of highly granular personal health data from individuals. It is unclear how people think about privacy in the context of these emerging health technologies. An open question is whether early adopters of these advances conceptualize privacy in different ways than non-early adopters. This study sought to understand privacy attitudes of early adopters of emerging health technologies. Transcripts from in-depth, semi-structured interviews with early adopters of genome sequencing and health devices and apps were analyzed with a focus on participant attitudes and perceptions of privacy. Themes were extracted using inductive content analysis. Although interviewees were willing to share personal data to support scientific advancements, they still expressed concerns, as well as uncertainty about who has access to their data, and for what purpose. In short, they were not dismissive of privacy risks. Key privacy-related findings are organized into four themes as follows: first, personal data privacy; second, control over personal information; third, concerns about discrimination; and fourth, contributing personal data to science. Early adopters of emerging health technologies appear to have more complex and nuanced conceptions of privacy than might be expected based on their adoption of personal health technologies and participation in open science. Early adopters also voiced uncertainty about the privacy implications of their decisions to use new technologies and share their data for research. Though not representative of the general public, studies of early adopters can provide important insights into evolving attitudes toward privacy in the context of emerging health technologies and personal health data research.

  9. Designing Privacy-by-Design

    NARCIS (Netherlands)

    Rest, J.H.C. van; Boonstra, D.; Everts, M.H.; Rijn, M. van; Paassen, R.J.G. van

    2014-01-01

    The proposal for a new privacy regulation dated January 25th, 2012 introduces sanctions of up to 2% of the annual turnover of enterprises. This elevates the importance of mitigating privacy risks. This paper makes Privacy by Design more concrete, and positions it as the mechanism to mitigate these

  10. A Privacy-by-Design Contextual Suggestion System for Tourism

    Directory of Open Access Journals (Sweden)

    Pavlos S. Efraimidis

    2016-05-01

    Full Text Available We focus on personal data generated by the sensors and through the everyday usage of smart devices and take advantage of these data to build a non-invasive contextual suggestion system for tourism. The system, which we call Pythia, exploits the computational capabilities of modern smart devices to offer high quality personalized POI (point of interest) recommendations. To protect user privacy, we apply a privacy by design approach within all of the steps of creating Pythia. The outcome is a system that comprises important architectural and operational innovations. The system is designed to process sensitive personal data, such as location traces, browsing history and web searches (query logs), to automatically infer user preferences and build corresponding POI-based user profiles. These profiles are then used by a contextual suggestion engine to anticipate user choices and make POI recommendations for tourists. Privacy leaks are minimized by implementing an important part of the system functionality at the user side, either as a mobile app or as a client-side web application, and by taking additional precautions, like data generalization, wherever necessary. As a proof of concept, we present a prototype that implements the aforementioned mechanisms on the Android platform accompanied with certain web applications. Even though the current prototype focuses only on location data, the results from the evaluation of the contextual suggestion algorithms and the user experience feedback from volunteers who used the prototype are very positive.

  11. When Differential Privacy Meets Randomized Perturbation: A Hybrid Approach for Privacy-Preserving Recommender System

    KAUST Repository

    Liu, Xiao; Liu, An; Zhang, Xiangliang; Li, Zhixu; Liu, Guanfeng; Zhao, Lei; Zhou, Xiaofang

    2017-01-01

    result. However, none is designed for both hiding users’ private data and preventing privacy inference. To achieve this goal, we propose in this paper a hybrid approach for privacy-preserving recommender systems by combining differential privacy (DP
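
    The record above is truncated, so the following is only a generic sketch of how client-side randomized perturbation of individual ratings can be combined with differentially private release of aggregates; it is not the authors' hybrid algorithm, and every parameter and name is an illustrative assumption.

      import numpy as np

      rng = np.random.default_rng(42)

      def perturb_ratings(ratings, noise_std=0.5):
          # Randomized perturbation, done by each user before sharing data:
          # zero-mean noise hides the exact values of individual ratings.
          return ratings + rng.normal(0.0, noise_std, size=ratings.shape)

      def dp_item_means(perturbed, epsilon=1.0, rating_range=5.0):
          # Differential privacy, applied by the server to released aggregates:
          # Laplace noise calibrated to one user's maximum influence on a mean.
          n_users = perturbed.shape[0]
          scale = rating_range / (n_users * epsilon)
          return perturbed.mean(axis=0) + rng.laplace(0.0, scale, size=perturbed.shape[1])

      ratings = rng.integers(1, 6, size=(100, 4)).astype(float)   # 100 users, 4 items
      print(dp_item_means(perturb_ratings(ratings)))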

  12. K-Anonymity Based Privacy Risk Budgeting System for Interactive Record Linkage

    Directory of Open Access Journals (Sweden)

    Hye-Chung Kum

    2017-04-01

    The k-anonymity based privacy risk budgeting system provides a mechanism where we can concretely reason about the tradeoff between the privacy risks due to information disclosed, accuracy gained, and biases reduced during interactive record linkage.
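
    A minimal sketch of the quantity such a budgeting system can track, assuming a simple tabular view of the linked records: k is the smallest group size over the quasi-identifiers revealed so far, and it shrinks as more attributes are disclosed during interactive linkage. The field names and records below are made up.

      from collections import Counter

      def k_anonymity(records, quasi_identifiers):
          # Smallest equivalence-class size over the disclosed quasi-identifiers;
          # a risk budget can compare this k before and after each disclosure.
          groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
          return min(groups.values())

      records = [
          {"zip": "77840", "birth_year": 1971, "sex": "F"},
          {"zip": "77840", "birth_year": 1971, "sex": "F"},
          {"zip": "77843", "birth_year": 1985, "sex": "M"},
          {"zip": "77843", "birth_year": 1985, "sex": "F"},
      ]
      print(k_anonymity(records, ["zip", "birth_year"]))          # 2
      print(k_anonymity(records, ["zip", "birth_year", "sex"]))   # 1: revealing sex raises the risk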

  13. Privacy Protection in Cloud Using Rsa Algorithm

    OpenAIRE

    Amandeep Kaur; Manpreet Kaur

    2014-01-01

    The cloud computing architecture is in high demand nowadays. The cloud has prevailed over grid and distributed environments due to its cost and high reliability, along with strong security. However, research shows that cloud computing still has some security issues regarding privacy. Cloud brokers provide cloud services to the general public and ensure that data is protected; however, they sometimes fall short on security and privacy. Thus in this work...

  14. A Novel Quantum Solution to Privacy-Preserving Nearest Neighbor Query in Location-Based Services

    Science.gov (United States)

    Luo, Zhen-yu; Shi, Run-hua; Xu, Min; Zhang, Shun

    2018-04-01

    We present a cheating-sensitive quantum protocol for Privacy-Preserving Nearest Neighbor Query based on Oblivious Quantum Key Distribution and Quantum Encryption. Compared with related classical protocols, our proposed protocol has higher security, because its security rests on basic physical principles of quantum mechanics instead of computational difficulty assumptions. In particular, our protocol takes single photons as quantum resources and only needs to perform single-photon projective measurements. Therefore, it is feasible to implement this protocol with present technologies.

  15. Privacy information management for video surveillance

    Science.gov (United States)

    Luo, Ying; Cheung, Sen-ching S.

    2013-05-01

    The widespread deployment of surveillance cameras has raised serious privacy concerns. Many privacy-enhancing schemes have been proposed to automatically redact images of trusted individuals in the surveillance video. To identify these individuals for protection, the most reliable approach is to use biometric signals such as iris patterns as they are immutable and highly discriminative. In this paper, we propose a privacy data management system to be used in a privacy-aware video surveillance system. The privacy status of a subject is anonymously determined based on her iris pattern. For a trusted subject, the surveillance video is redacted and the original imagery is considered to be the privacy information. Our proposed system allows a subject to access her privacy information via the same biometric signal for privacy status determination. Two secure protocols, one for privacy information encryption and the other for privacy information retrieval are proposed. Error control coding is used to cope with the variability in iris patterns and efficient implementation is achieved using surrogate data records. Experimental results on a public iris biometric database demonstrate the validity of our framework.
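
    The anonymous privacy-status determination can be illustrated with a much simplified lookup: the system stores salted hashes of enrolled (trusted) iris codes rather than the raw biometrics. A real deployment needs fuzzy matching (the paper uses error control coding) because two scans of the same iris differ slightly; the exact-match table below is only an illustrative assumption.

      import hashlib
      import secrets

      SALT = secrets.token_bytes(16)   # stored alongside the table; illustrative only

      def enroll(iris_codes):
          # Build an anonymous table of trusted subjects from their iris codes.
          return {hashlib.sha256(SALT + code).hexdigest() for code in iris_codes}

      def should_redact(table, iris_code):
          # True if the subject is enrolled as trusted and must be redacted.
          return hashlib.sha256(SALT + iris_code).hexdigest() in table

      trusted = enroll([b"iris-code-alice", b"iris-code-bob"])
      print(should_redact(trusted, b"iris-code-alice"))   # True  -> redact in video
      print(should_redact(trusted, b"iris-code-eve"))     # False -> leave visible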

  16. Technical Privacy Metrics: a Systematic Survey

    OpenAIRE

    Wagner, Isabel; Eckhoff, David

    2018-01-01

    The goal of privacy metrics is to measure the degree of privacy enjoyed by users in a system and the amount of protection offered by privacy-enhancing technologies. In this way, privacy metrics contribute to improving user privacy in the digital world. The diversity and complexity of privacy metrics in the literature make an informed choice of metrics challenging. As a result, instead of using existing metrics, n...

  17. The awareness of Privacy issues in Ambient Intelligence

    Directory of Open Access Journals (Sweden)

    Mar LÓPEZ

    2015-03-01

    Full Text Available Ambient Intelligence (AmI) involves extensive and invisible integration of computer technologies in people's daily lives: Smart Sensors, Smart Phones, Tablets, Wireless Sensor Networks (Wi-Fi, Bluetooth, NFC, RFID, etc.), the Internet (Facebook, WhatsApp, Twitter, YouTube, Blogs, Cloud Computing, etc.). The Intelligent Environments (IE) collect and process a massive amount of person-related and sensitive information. The aim of this work is to show the awareness of privacy issues in AmI and to identify the relevant design issues that should be addressed in order to provide privacy in the design of Ambient Intelligence applications, focused on the user's domain and the technologies involved. We propose a conceptual framework to enforce privacy that takes care of the interaction between technologies and devices, users and the application's domain, with different modules that contain different steps relating to the privacy policies.

  18. 17 CFR 248.5 - Annual privacy notice to customers required.

    Science.gov (United States)

    2010-04-01

    ... customers required. 248.5 Section 248.5 Commodity and Securities Exchanges SECURITIES AND EXCHANGE... Safeguarding Personal Information Privacy and Opt Out Notices § 248.5 Annual privacy notice to customers required. (a)(1) General rule. You must provide a clear and conspicuous notice to customers that accurately...

  19. Can privacy concerns for insurance of connected cars be compensated?

    NARCIS (Netherlands)

    Derikx, S; de Reuver, G.A.; Kroesen, M.

    2015-01-01

    Internet-of-things technologies enable service providers such as insurance companies to collect vast amounts of privacy-sensitive data on car drivers. This paper studies whether and how privacy concerns of car owners can be compensated by offering monetary benefits. We study the case of usage based

  20. Privacy, Liveliness and Fairness for Reputation

    Science.gov (United States)

    Schiffner, Stefan; Clauß, Sebastian; Steinbrecher, Sandra

    In various Internet applications, reputation systems are typical means of collecting the experiences users have with each other. We present a reputation system that balances the security and privacy requirements of all users involved. Our system provides privacy in the form of information theoretic relationship anonymity w.r.t. users and the reputation provider. Furthermore, it preserves liveliness, i.e., all past ratings can influence the current reputation profile of a user. In addition, mutual ratings are forced to be simultaneous and self rating is prevented, which enforces fairness. What is more, without performing mock interactions - even if all users are colluding - users cannot forge ratings. As far as we know, this is the first protocol proposed that fulfills all these properties simultaneously.

  1. δ-dependency for privacy-preserving XML data publishing.

    Science.gov (United States)

    Landberg, Anders H; Nguyen, Kinh; Pardede, Eric; Rahayu, J Wenny

    2014-08-01

    An ever-increasing amount of medical data, such as electronic health records, is being collected, stored, shared and managed in large online health information systems and electronic medical record systems (EMR) (Williams et al., 2001; Virtanen, 2009; Huang and Liou, 2007) [1-3]. From such rich collections, data is often published in the form of census and statistical data sets for the purpose of knowledge sharing and enabling medical research. This brings with it an increasing need for protecting individual privacy, and it becomes an issue of great importance especially when information about patients is exposed to the public. While the concept of data privacy has been comprehensively studied for relational data, models and algorithms addressing the distinct differences and complex structure of XML data are yet to be explored. Currently, the common compromise method is to convert private XML data into relational data for publication. This ad hoc approach results in significant loss of useful semantic information previously carried in the private XML data. Health data often has very complex structure, which is best expressed in XML. In fact, XML is the standard format for exchanging (e.g. HL7 version 3(1)) and publishing health information. Lack of means to deal directly with data in XML format is inevitably a serious drawback. In this paper we propose a novel privacy protection model for XML, and an algorithm for implementing this model. We provide general rules, both for transforming a private XML schema into a published XML schema, and for mapping private XML data to the new privacy-protected published XML data. In addition, we propose a new privacy property, δ-dependency, which can be applied to both relational and XML data, and that takes into consideration the hierarchical nature of sensitive data (as opposed to "quasi-identifiers"). Lastly, we provide an implementation of our model, algorithm and privacy property, and perform an experimental analysis
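
    The sketch below is not the δ-dependency model itself; it only illustrates, with made-up rules and element names, the kind of private-to-published XML mapping the paper formalizes: direct identifiers are dropped and sensitive, hierarchical values are generalized in place rather than flattened into relational form.

      import xml.etree.ElementTree as ET

      PRIVATE_RECORD = """
      <patient>
        <name>Jane Doe</name>
        <birthdate>1978-04-02</birthdate>
        <diagnosis code="E11.9">Type 2 diabetes mellitus</diagnosis>
      </patient>
      """

      def publish(private_xml):
          # Illustrative mapping rules: drop the name, keep only the birth year,
          # and generalize the diagnosis code to its ICD chapter letter.
          src = ET.fromstring(private_xml)
          out = ET.Element("patient")
          ET.SubElement(out, "birthyear").text = src.findtext("birthdate")[:4]
          ET.SubElement(out, "diagnosis_group").text = src.find("diagnosis").get("code")[0]
          return ET.tostring(out, encoding="unicode")

      print(publish(PRIVATE_RECORD))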

  2. A privacy protection model to support personal privacy in relational databases.

    OpenAIRE

    2008-01-01

    The individual of today incessantly insists on more protection of his/her personal privacy than a few years ago. During the last few years, rapid technological advances, especially in the field of information technology, directed most attention and energy to the privacy protection of the Internet user. Research was done and is still being done covering a vast area to protect the privacy of transactions performed on the Internet. However, it was established that almost no research has been don...

  3. Internet privacy options for adequate realisation

    CERN Document Server

    2013-01-01

    A thorough multidisciplinary analysis of various perspectives on internet privacy was published as the first volume of a study, revealing the results of the acatech project "Internet Privacy - A Culture of Privacy and Trust on the Internet." The second publication from this project presents integrated, interdisciplinary options for improving privacy on the Internet utilising a normative, value-oriented approach. The ways in which privacy promotes and preconditions fundamental societal values and how privacy violations endanger the flourishing of said values are exemplified. The conditions which must be fulfilled in order to achieve a culture of privacy and trust on the internet are illuminated. This volume presents options for how policy-makers, educators, businesses and technology experts can facilitate solutions for more privacy on the Internet, and identifies further research requirements in this area.

  4. Cognitive Privacy for Personal Clouds

    Directory of Open Access Journals (Sweden)

    Milena Radenkovic

    2016-01-01

    Full Text Available This paper proposes a novel Cognitive Privacy (CogPriv) framework that improves privacy of data sharing between Personal Clouds for different application types and across heterogeneous networks. Depending on the behaviour of neighbouring network nodes, their estimated privacy levels, resource availability, and social network connectivity, each Personal Cloud may decide to use a different transmission network for different types of data and privacy requirements. CogPriv is fully distributed, uses complex graph contacts analytics and multiple implicit novel heuristics, and combines these with smart probing to identify presence and behaviour of privacy compromising nodes in the network. Based on sensed local context and through cooperation with remote nodes in the network, CogPriv is able to transparently and on-the-fly change the network in order to avoid transmissions when privacy may be compromised. We show that CogPriv achieves higher end-to-end privacy levels compared to both noncognitive cellular network communication and state-of-the-art strategies based on privacy-aware adaptive social mobile networks routing for a range of experiment scenarios based on real-world user and network traces. CogPriv is able to adapt to varying network connectivity and maintain high quality of service while managing to keep low data exposure for a wide range of privacy leakage levels in the infrastructure.
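
    A toy sketch of the per-message decision such a framework makes, with invented privacy-level and resource scores: a transmission network is used only if its estimated privacy level meets the requirement of the data type, otherwise the data is held back (the real framework derives these estimates from graph analytics and probing of neighbouring nodes).

      def choose_network(networks, required_privacy):
          # Keep only networks whose estimated privacy level satisfies the
          # requirement; defer transmission if none qualifies.
          ok = [n for n in networks if n["privacy_level"] >= required_privacy]
          if not ok:
              return None   # avoid transmitting while privacy may be compromised
          return max(ok, key=lambda n: n["resources"])   # prefer spare resources

      networks = [
          {"name": "cellular",      "privacy_level": 0.4, "resources": 0.9},
          {"name": "trusted-peer",  "privacy_level": 0.8, "resources": 0.5},
          {"name": "opportunistic", "privacy_level": 0.2, "resources": 0.7},
      ]
      print(choose_network(networks, required_privacy=0.7))    # -> trusted-peer
      print(choose_network(networks, required_privacy=0.95))   # -> None, hold the data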

  5. 76 FR 61761 - Privacy Act of 1974; System of Records

    Science.gov (United States)

    2011-10-05

    ... locations to grocery or drug stores, office supply stores, retail chains, and self-service kiosks. By... and to the Office of Management and Budget for their evaluation. The Postal Service does not expect this amended notice to have any adverse effect on individual privacy rights. The Postal Service...

  6. 75 FR 39500 - Privacy Act of 1974; System of Records

    Science.gov (United States)

    2010-07-09

    ... from the address above. The proposed system report, as required by 5 U.S.C. 552a(r) of the Privacy Act... location: Add to entry as last paragraph ``Defense Information Systems Agency (DISA) Mega Center, Building... plate number, drivers license number, vehicle make, model, year, color, drivers identification...

  7. Secure privacy-preserving biometric authentication scheme for telecare medicine information systems.

    Science.gov (United States)

    Li, Xuelei; Wen, Qiaoyan; Li, Wenmin; Zhang, Hua; Jin, Zhengping

    2014-11-01

    Healthcare delivery services via telecare medicine information systems (TMIS) can help patients to obtain their desired telemedicine services conveniently. However, information security and privacy protection are important issues and crucial challenges in healthcare information systems, where only authorized patients and doctors can employ telecare medicine facilities and access electronic medical records. Therefore, a secure authentication scheme is urgently required to achieve the goals of entity authentication, data confidentiality and privacy protection. This paper investigates a new biometric authentication with key agreement scheme, which focuses on patient privacy and medical data confidentiality in TMIS. The new scheme employs a hash function, a fuzzy extractor, a nonce and authenticated Diffie-Hellman key agreement as primitives. It provides patient privacy protection, e.g., preventing the identity from being stolen or tracked by unauthorized participants, and protecting the password and biometric template from being compromised by untrusted servers. Moreover, the key agreement supports secure transmission by symmetric encryption to protect the patient's medical data from being leaked. Finally, the analysis shows that our proposal provides more security and privacy protection for TMIS.
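    A minimal sketch of the primitives named above (a hash, a nonce, and an authenticated Diffie-Hellman key agreement) is given below; it is not the paper's protocol, and the use of X25519, HKDF, and a fuzzy-extractor output standing in for the biometric are assumptions made for illustration (requires the third-party `cryptography` package):

```python
# Sketch of the named primitives only (hash, nonce, authenticated Diffie-Hellman),
# not the paper's message flow; X25519, HKDF and the placeholder credential secret
# are assumptions. Requires the 'cryptography' package.
import os, hashlib
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

def session_key(my_priv, peer_pub, credential_secret, nonce):
    """Bind the session key to the DH shared secret AND a credential-derived value,
    so neither an eavesdropper nor a server without the credential can derive it."""
    shared = my_priv.exchange(peer_pub)
    auth = hashlib.sha256(credential_secret + nonce).digest()
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=nonce, info=auth).derive(shared)

patient_priv = X25519PrivateKey.generate()   # ephemeral key pair on the patient side
server_priv = X25519PrivateKey.generate()    # ephemeral key pair on the TMIS server
nonce = os.urandom(16)                       # fresh per session, prevents replay
secret = b"fuzzy-extractor-output"           # placeholder for R = Rep(biometric, helper data)

k_patient = session_key(patient_priv, server_priv.public_key(), secret, nonce)
k_server = session_key(server_priv, patient_priv.public_key(), secret, nonce)
assert k_patient == k_server                 # both ends derive the same symmetric key
```

    Deriving the key from both the ephemeral DH secret and a credential-derived value is what lets such schemes resist parties that hold only one of the two.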

  8. The Privacy Jungle:On the Market for Data Protection in Social Networks

    Science.gov (United States)

    Bonneau, Joseph; Preibusch, Sören

    We have conducted the first thorough analysis of the market for privacy practices and policies in online social networks. From an evaluation of 45 social networking sites using 260 criteria we find that many popular assumptions regarding privacy and social networking need to be revisited when considering the entire ecosystem instead of only a handful of well-known sites. Contrary to the common perception of an oligopolistic market, we find evidence of vigorous competition for new users. Despite observing many poor security practices, there is evidence that social network providers are making efforts to implement privacy enhancing technologies with substantial diversity in the amount of privacy control offered. However, privacy is rarely used as a selling point, and even then only as an auxiliary, nondecisive feature. Sites also failed to promote their existing privacy controls within the site. We similarly found great diversity in the length and content of formal privacy policies, but found an opposite promotional trend: though almost all policies are not accessible to ordinary users due to obfuscating legal jargon, they conspicuously vaunt the sites' privacy practices. We conclude that the market for privacy in social networks is dysfunctional in that there is significant variation in sites' privacy controls, data collection requirements, and legal privacy policies, but this is not effectively conveyed to users. Our empirical findings motivate us to introduce the novel model of a privacy communication game, where the economically rational choice for a site operator is to make privacy control available to evade criticism from privacy fundamentalists, while hiding the privacy control interface and privacy policy to maximize sign-up numbers and encourage data sharing from the pragmatic majority of users.

  9. Privacy and Property? Multi-level Strategies for Protecting Personal Interests in Genetic Material

    OpenAIRE

    Laurie, Graeme

    2003-01-01

    The paper builds on earlier medico-legal work by Laurie on privacy in relation to genetic material. In this chapter, the author discusses not only Laurie's 'pro-privacy' views but also the limitations of privacy, particularly once information, genetic or otherwise, enters the public sphere. The article draws on cases and laws in the UK, continental Europe, and the US to provide a comparative view and to suggest an alternative approach to privacy.

  10. Cybersecurity and Privacy

    DEFF Research Database (Denmark)

    The huge potential in future connected services has as a precondition that privacy and security needs are dealt with in order for new services to be accepted. This issue is increasingly on the agenda both at the company and at the individual level. Cybersecurity and Privacy – bridging the gap addresses two very complex fields of the digital world, i.e., cybersecurity and privacy. These multifaceted, multidisciplinary and complex issues are usually understood and valued differently by different individuals, data holders and legal bodies. But a change in one field immediately affects the others. Policies, frameworks, strategies, laws, tools, techniques, and technologies – all of these are tightly interwoven when it comes to security and privacy. This book is another attempt to bridge the gap between industry and academia. The book addresses the views from academia and industry on the subject...

  11. Privacy for Sale?

    DEFF Research Database (Denmark)

    Sørensen, Lene Tolstrup; Sørensen, Jannick Kirk; Khajuria, Samant

    Data brokers have become central players in the online collection of private user data. Data brokers' activities are, however, not very transparent or even known by users. Many users regard privacy as a central element when they use online services. Based on 12 short interviews with users, this paper analyses how users perceive the concept of online privacy with respect to data brokers' collection of private data, and particularly novel services that offer users the possibility to sell their private data. Two groups of users are identified: those who are considering selling their data under specific conditions, and those who reject the idea completely. Based on the literature we identify two positions on privacy: as an instrumental good, or as an intrinsic good. The paper positions various user perceptions on privacy that are relevant for future service development...

  12. Privacy on Hypothesis Testing in Smart Grids

    OpenAIRE

    Li, Zuxing; Oechtering, Tobias

    2015-01-01

    In this paper, we study the problem of privacy information leakage in a smart grid. The privacy risk is assumed to be caused by an unauthorized binary hypothesis test of the consumer's behaviour based on the smart meter readings of energy supplied by the energy provider. Additional energy supplies are produced by an alternative energy source. A controller equipped with an energy storage device manages the energy inflows to satisfy the energy demand of the consumer. We study the optimal ener...

  13. Security, privacy and trust in cloud systems

    CERN Document Server

    Nepal, Surya

    2013-01-01

    The book compiles technologies for enhancing and provisioning security, privacy and trust in cloud systems based on Quality of Service requirements. It is a timely contribution to a field that is gaining considerable research interest and momentum, and it provides comprehensive coverage of technologies related to cloud security, privacy and trust. In particular, the book includes - Cloud security fundamentals and related technologies to-date, with a comprehensive coverage of evolution, current landscape, and future roadmap. - A smooth organization with introductory, advanced and specialist content

  14. 78 FR 44931 - Privacy Act of 1974; System of Records

    Science.gov (United States)

    2013-07-25

    ...), as amended. This system will provide DLA installations with the ability to rapidly and effectively... Defense Privacy and Civil Liberties Web site at http://dpclo.defense.gov/privacy/SORNs/component/dla/index...: First name, last name, work email, work phone number, mobile phone number, short message service (SMS...

  15. 77 FR 37061 - DHS Data Privacy and Integrity Advisory Committee

    Science.gov (United States)

    2012-06-20

    .... Please note that the meeting may end early if the Committee has completed its business. ADDRESSES: The... draft report to the Department providing guidance on privacy protections for cybersecurity pilot... . Please note that the meeting may end early if all business is completed. Privacy Act Statement: DHS's Use...

  16. Aligning the Effective Use of Student Data with Student Privacy and Security Laws

    Science.gov (United States)

    Winnick, Steve; Coleman, Art; Palmer, Scott; Lipper, Kate; Neiditz, Jon

    2011-01-01

    This legal and policy guidance provides a summary framework for state policymakers as they work to use longitudinal data to improve student achievement while also protecting the privacy and security of individual student records. Summarizing relevant federal privacy and security laws, with a focus on the Family Educational Rights and Privacy Act…

  17. Review of the model of technological pragmatism considering privacy and security

    Directory of Open Access Journals (Sweden)

    Kovačević-Lepojević Marina M.

    2013-01-01

    Full Text Available The model of technological pragmatism assumes awareness that technological development involves both benefits and dangers. Most modern security technologies represent citizens' mass surveillance tools, which can lead to compromising a significant amount of personal data due to the lack of institutional monitoring and control. On the other hand, people are interested in improving crime control and reducing the fear of potential victimization, which provides a rational justification for the apparent loss of privacy, personal rights and freedoms. Citizens' perceptions of the categories of security and privacy, and their balancing, can provide the necessary guidelines to regulate the application of security technologies in the actual context. The aim of this paper is to analyze the attitudes of students at the University of Belgrade (N = 269) toward the application of security technology, and to identify its key dimensions. On the basis of the relevant research the authors have formed assumptions about the following dimensions: security, privacy, trust in institutions and concern about the misuse of security technology. The Prise Questionnaire on Security Technology and Privacy was used for data collection. Factor analysis extracted eight factors which together account for 58% of variance, with the highest loadings on the four factors identified as security, privacy, trust and concern. The authors propose a model of technological pragmatism considering security and privacy. The data also showed that students are willing to trade their privacy for the purpose of improving security, and vice versa.

  18. The benefits, risks and costs of privacy: patient preferences and willingness to pay.

    Science.gov (United States)

    Trachtenbarg, David E; Asche, Carl; Ramsahai, Shweta; Duling, Joy; Ren, Jinma

    2017-05-01

    Multiple surveys show that patients want medical privacy; however, there are costs to maintaining privacy. There are also risks if information is not shared. A review of previous surveys found that most surveys asked questions about patients' privacy concerns and willingness to share their medical information. We found only one study that asked about sharing medical information for better care and no survey that asked patients about the risk, cost or comparison between medical privacy and privacy in other areas. To fill this gap, we designed a survey to: (1) compare medical privacy preferences to privacy preferences in other areas; (2) measure willingness to pay the cost of additional privacy measures; and (3) measure willingness to accept the risks of not sharing information. A total of 834 patients attending physician offices at 14 sites completed all or part of an anonymous questionnaire. Over 95% of patients were willing to share all their medical information with their treating physicians. There was no difference in willingness to share between primary care and specialty sites including psychiatry and an HIV clinic. In our survey, there was no difference in sharing preference between standard medical information and information with additional legal protections including genetic testing, drug/alcohol treatment and HIV results. Medical privacy was ranked lower than sharing social security and credit card numbers, but was deemed more private than other information including tax returns and handgun purchases. There was no statistical difference for any questions by site except for HIV/AIDS clinic patients ranking privacy of the medical record more important than reducing high medical costs and risk of medical errors (p risks to keep medical information hidden. Patients were very willing to share medical information with their providers. They were able to see the importance of sharing medical information to provide the best possible care. They were unwilling to

  19. Do Smartphone Power Users Protect Mobile Privacy Better than Nonpower Users? Exploring Power Usage as a Factor in Mobile Privacy Protection and Disclosure.

    Science.gov (United States)

    Kang, Hyunjin; Shin, Wonsun

    2016-03-01

    This study examines how consumers' competence at using smartphone technology (i.e., power usage) affects their privacy protection behaviors. A survey conducted with smartphone users shows that power usage influences privacy protection behavior not only directly but also indirectly through privacy concerns and trust placed in mobile service providers. A follow-up experiment indicates that the effects of power usage on smartphone users' information management can be a function of content personalization. Users, high on power usage, are less likely to share personal information on personalized mobile sites, but they become more revealing when they interact with nonpersonalized mobile sites.

  20. Patient privacy and social media.

    Science.gov (United States)

    Hader, Amy L; Brown, Evan D

    2010-08-01

    Healthcare providers using social media must remain mindful of professional boundaries and patients' privacy rights. Facebook and other online postings must comply with the Health Insurance Portability and Accountability Act of 1996 (HIPAA), applicable facility policy, state law, and AANA's Code of Ethics.

  1. Privacy Protection in Personal Health Information and Shared Care Records

    Directory of Open Access Journals (Sweden)

    Roderick L B Neame

    2014-03-01

    Full Text Available Background: The protection of personal information privacy has become one of the most pressing security concerns for record keepers. Many institutions have yet to implement the essential infrastructure for data privacy protection and patient control when accessing and sharing data; even more have failed to instil a privacy and security awareness mindset and culture amongst their staff. Increased regulation, together with better compliance monitoring, has led to the imposition of increasingly significant monetary penalties for failures to protect privacy. Objective: There is growing pressure in clinical environments to deliver shared patient care and to support this with integrated information. This demands that more information passes between institutions and care providers without breaching patient privacy or autonomy. This can be achieved with relatively minor enhancements of existing infrastructures and does not require extensive investment in inter-operating electronic records: indeed, such investments to date have been shown not to materially improve data sharing. Requirements for Privacy: There is an ethical duty, as well as a legal obligation, on the part of care providers (and record keepers) to keep patient information confidential and to share it only with the authorisation of the patient. To achieve this, information storage and retrieval, and communication systems must be appropriately configured. Patients may consult clinicians anywhere and at any time; therefore their data must be available for recipient-driven retrieval under patient control and kept private.

  2. 76 FR 59073 - Privacy Act

    Science.gov (United States)

    2011-09-23

    ... CENTRAL INTELLIGENCE AGENCY 32 CFR Part 1901 Privacy Act AGENCY: Central Intelligence Agency. ACTION: Proposed rule. SUMMARY: Consistent with the Privacy Act (PA), the Central Intelligence Agency...-1379. SUPPLEMENTARY INFORMATION: Consistent with the Privacy Act (PA), the CIA has undertaken and...

  3. On privacy-preserving protocols for smart metering systems security and privacy in smart grids

    CERN Document Server

    Borges de Oliveira, Fábio

    2017-01-01

    This book presents current research in privacy-preserving protocols for smart grids. It contains several approaches and compares them analytically and by means of simulation. In particular, the book introduces asymmetric DC-Nets, which offer an ideal combination of performance and features in comparison with homomorphic encryption; data anonymization via cryptographic protocols; and data obfuscation by means of noise injection or by means of the installation of storage banks. The author shows that this theory can be leveraged into several application scenarios, and how asymmetric DC-Nets are generalizations of additive homomorphic encryption schemes and abstractions of symmetric DC-Nets. The book provides the reader with an understanding about smart grid scenarios, the privacy problem, and the mathematics and algorithms used to solve it.
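    The flavour of such protocols can be conveyed with a toy symmetric DC-Net round (an illustration, not the book's asymmetric DC-Net construction; the modulus and the in-memory exchange of pairwise masks are assumptions made for the example): each meter publishes only a masked reading, and the masks cancel so the aggregator learns just the total consumption.

```python
# Toy symmetric DC-Net round (illustrative only; not the book's asymmetric construction).
import secrets

P = 2**61 - 1   # public modulus, large enough that the true total never wraps

def dc_net_round(readings):
    """Each meter i publishes reading_i plus masks shared pairwise with every other
    meter; the masks cancel in the sum, so individual readings stay hidden."""
    n = len(readings)
    k = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            r = secrets.randbelow(P)
            k[i][j], k[j][i] = r, (P - r) % P      # k[i][j] + k[j][i] = 0 (mod P)
    return [(readings[i] + sum(k[i])) % P for i in range(n)]

readings = [12, 7, 30, 5]                          # individual consumption values
masked = dc_net_round(readings)
assert sum(masked) % P == sum(readings)            # aggregator learns only the total
```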

  4. Privacy-Preserving Data Aggregation Protocols for Wireless Sensor Networks: A Survey

    Directory of Open Access Journals (Sweden)

    Rabindra Bista

    2010-05-01

    Full Text Available Many wireless sensor network (WSN applications require privacy-preserving aggregation of sensor data during transmission from the source nodes to the sink node. In this paper, we explore several existing privacy-preserving data aggregation (PPDA protocols for WSNs in order to provide some insights on their current status. For this, we evaluate the PPDA protocols on the basis of such metrics as communication and computation costs in order to demonstrate their potential for supporting privacy-preserving data aggregation in WSNs. In addition, based on the existing research, we enumerate some important future research directions in the field of privacy-preserving data aggregation for WSNs.

  5. Privacy-preserving data aggregation protocols for wireless sensor networks: a survey.

    Science.gov (United States)

    Bista, Rabindra; Chang, Jae-Woo

    2010-01-01

    Many wireless sensor network (WSN) applications require privacy-preserving aggregation of sensor data during transmission from the source nodes to the sink node. In this paper, we explore several existing privacy-preserving data aggregation (PPDA) protocols for WSNs in order to provide some insights on their current status. For this, we evaluate the PPDA protocols on the basis of such metrics as communication and computation costs in order to demonstrate their potential for supporting privacy-preserving data aggregation in WSNs. In addition, based on the existing research, we enumerate some important future research directions in the field of privacy-preserving data aggregation for WSNs.

  6. Bridging the transatlantic divide in privacy

    Directory of Open Access Journals (Sweden)

    Paula Kift

    2013-08-01

    Full Text Available In the context of the US National Security Agency surveillance scandal, the transatlantic privacy divide has come back to the fore. In the United States, the right to privacy is primarily understood as a right to physical privacy, thus the protection from unwarranted government searches and seizures. In Germany on the other hand, it is also understood as a right to spiritual privacy, thus the right of citizens to develop into autonomous moral agents. The following article will discuss the different constitutional assumptions that underlie American and German attitudes towards privacy, namely privacy as an aspect of liberty or as an aspect of dignity. As data flows defy jurisdictional boundaries, however, policymakers across the Atlantic are faced with a conundrum: how can German and American privacy cultures be reconciled?

  7. Privacy Penetration Testing: How to Establish Trust in Your Cloud Provider

    DEFF Research Database (Denmark)

    Probst, Christian W.; Sasse, M. Angela; Pieters, Wolter

    2012-01-01

    In the age of cloud computing, IT infrastructure becomes virtualised and takes the form of services. This virtualisation results in an increasing de-perimeterisation, where the location of data and computation is irrelevant from a user's point of view. This irrelevance means that private and institutional users no longer have a concept of where their data is stored, or whether they can trust cloud providers to protect their data. In this chapter, we investigate methods for increasing customers' trust in cloud providers, and suggest a public penetration-testing agency as an essential component in a trustworthy cloud infrastructure.

  8. 12 CFR 716.6 - Information to be included in privacy notices.

    Science.gov (United States)

    2010-01-01

    ...) Financial service providers; (ii) Non-financial companies; and (iii) Others. (4) Disclosures under exception... CREDIT UNIONS PRIVACY OF CONSUMER FINANCIAL INFORMATION Privacy and Opt Out Notices § 716.6 Information... jointly with another financial institution, you satisfy the disclosure requirement of paragraph (a)(5) of...

  9. Privacy in an Ambient World

    NARCIS (Netherlands)

    Dekker, M.A.C.; Etalle, Sandro; den Hartog, Jeremy

    Privacy is a prime concern in today's information society. To protect the privacy of individuals, enterprises must follow certain privacy practices, while collecting or processing personal data. In this chapter we look at the setting where an enterprise collects private data on its website,

  10. Information Privacy Revealed

    Science.gov (United States)

    Lavagnino, Merri Beth

    2013-01-01

    Why is Information Privacy the focus of the January-February 2013 issue of "EDUCAUSE Review" and "EDUCAUSE Review Online"? Results from the 2012 annual survey of the International Association of Privacy Professionals (IAPP) indicate that "meeting regulatory compliance requirements continues to be the top perceived driver…

  11. Genome privacy: challenges, technical approaches to mitigate risk, and ethical considerations in the United States

    Science.gov (United States)

    Wang, Shuang; Jiang, Xiaoqian; Singh, Siddharth; Marmor, Rebecca; Bonomi, Luca; Fox, Dov; Dow, Michelle; Ohno-Machado, Lucila

    2016-01-01

    Accessing and integrating human genomic data with phenotypes is important for biomedical research. Making genomic data accessible for research purposes, however, must be handled carefully to avoid leakage of sensitive individual information to unauthorized parties and improper use of data. In this article, we focus on data sharing within the scope of data accessibility for research. Current common practices to gain biomedical data access are strictly rule based, without a clear and quantitative measurement of the risk of privacy breaches. In addition, several types of studies require privacy-preserving linkage of genotype and phenotype information across different locations (e.g., genotypes stored in a sequencing facility and phenotypes stored in an electronic health record) to accelerate discoveries. The computer science community has developed a spectrum of techniques for data privacy and confidentiality protection, many of which have yet to be tested on real-world problems. In this article, we discuss clinical, technical, and ethical aspects of genome data privacy and confidentiality in the United States, as well as potential solutions for privacy-preserving genotype–phenotype linkage in biomedical research. PMID:27681358

  12. A Survey of Privacy on Data Integration

    OpenAIRE

    Do Son, Thanh

    2015-01-01

    This survey is an integrated view of other surveys on privacy preserving for data integration. First, we review the database context and challenges and research questions. Second, we formulate the privacy problems for schema matching and data matching. Next, we introduce the elements of privacy models. Then, we summarize the existing privacy techniques and the analysis (proofs) of privacy guarantees. Finally, we describe the privacy frameworks and their applications.

  13. Privacy in social networking sites

    OpenAIRE

    Λεονάρδος, Γεώργιος; Leonardos, Giorgos

    2016-01-01

    The purpose of this study is to explore aspects of privacy in the use of social network web sites. More specifically, we will show the types of social networks, the privacy mechanisms that differ in each social network site, and the privacy options that are offered to users. We will report some serious privacy violation incidents involving the most popular social network sites, such as Facebook, Twitter and LinkedIn. Also, we will report some important surveys about social networks and pr...

  14. Health information: reconciling personal privacy with the public good of human health.

    Science.gov (United States)

    Gostin, L O

    2001-01-01

    The success of the health care system depends on the accuracy, correctness and trustworthiness of the information, and the privacy rights of individuals to control the disclosure of personal information. A national policy on health informational privacy should be guided by ethical principles that respect individual autonomy while recognizing the important collective interests in the use of health information. At present there are no adequate laws or constitutional principles to help guide a rational privacy policy. The laws are scattered and fragmented across the states. Constitutional law is highly general, without important specific safeguards. Finally, a case study is provided showing the important trade-offs that exist between public health and privacy. For a model public health law, see www.critpath.org/msphpa/privacy.

  15. Data security breaches and privacy in Europe

    CERN Document Server

    Wong, Rebecca

    2013-01-01

    Data Security Breaches and Privacy in Europe aims to consider data protection and cybersecurity issues; more specifically, it aims to provide a fruitful discussion on data security breaches. The European data protection framework is examined in detail. In particular, the Data Protection Directive 95/46/EC, the Directive on Privacy and Electronic Communications, and the proposed changes under the Data Protection Regulation (data breach notifications) and its implications are considered. This is followed by an examination of the Directive on Attacks against information systems a

  16. 77 FR 31371 - Public Workshop: Privacy Compliance Workshop

    Science.gov (United States)

    2012-05-25

    ... presentations, including the privacy compliance fundamentals, privacy and data security, and the privacy... DEPARTMENT OF HOMELAND SECURITY Office of the Secretary Public Workshop: Privacy Compliance... Homeland Security Privacy Office will host a public workshop, ``Privacy Compliance Workshop.'' DATES: The...

  17. Privacy Expectations in Online Contexts

    Science.gov (United States)

    Pure, Rebekah Abigail

    2013-01-01

    Advances in digital networked communication technology over the last two decades have brought the issue of personal privacy into sharper focus within contemporary public discourse. In this dissertation, I explain the Fourth Amendment and the role that privacy expectations play in the constitutional protection of personal privacy generally, and…

  18. Online Tracking Technologies and Web Privacy: Technologieën voor Online volgen en Web Privacy

    OpenAIRE

    Acar, Mustafa Gunes Can

    2017-01-01

    In my PhD thesis, I would like to study the problem of online privacy with a focus on Web and mobile applications. Key research questions to be addressed by my study are the following: How can we formalize and quantify web tracking? What are the threats presented against privacy by different tracking techniques such as browser fingerprinting and cookie based tracking? What kind of privacy enhancing technologies (PET) can be used to ensure privacy without degrading service quality? The stud...

  19. Privacy and medical information on the Internet.

    Science.gov (United States)

    Nelson, Steven B

    2006-02-01

    Health-care consumers are beginning to realize the presence and value of health-care information available on the Internet, but they need to be aware of risks that may be involved. In addition to delivering information, some Web sites collect information. Though not all of the information might be classified as protected health information, consumers need to realize what is collected and how it might be used. Consumers should know a Web site's privacy policy before divulging any personal information. Health-care providers have a responsibility to know what information they are collecting and why. Web servers may collect large amounts of visitor information by default, and they should be modified to limit data collection to only what is necessary. Providers need to be cognizant of the many regulations concerning collection and disclosure of information obtained from consumers. Providers should also provide an easily understood privacy policy for users.

  20. Patients want granular privacy control over health information in electronic medical records.

    Science.gov (United States)

    Caine, Kelly; Hanania, Rima

    2013-01-01

    To assess patients' desire for granular level privacy control over which personal health information should be shared, with whom, and for what purpose; and whether these preferences vary based on sensitivity of health information. A card task for matching health information with providers, questionnaire, and interview with 30 patients whose health information is stored in an electronic medical record system. Most patients' records contained sensitive health information. No patients reported that they would prefer to share all information stored in an electronic medical record (EMR) with all potential recipients. Sharing preferences varied by type of information (EMR data element) and recipient (eg, primary care provider), and overall sharing preferences varied by participant. Patients with and without sensitive records preferred less sharing of sensitive versus less-sensitive information. Patients expressed sharing preferences consistent with a desire for granular privacy control over which health information should be shared with whom and expressed differences in sharing preferences for sensitive versus less-sensitive EMR data. The pattern of results may be used by designers to generate privacy-preserving EMR systems including interfaces for patients to express privacy and sharing preferences. To maintain the level of privacy afforded by medical records and to achieve alignment with patients' preferences, patients should have granular privacy control over information contained in their EMR.
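    One way to picture the design implication (a sketch of my own, not an interface from the study) is a granular sharing policy keyed by EMR data element and recipient role, defaulting to non-disclosure:

```python
# Illustrative sketch of granular sharing preferences (not from the study).
class SharingPolicy:
    def __init__(self):
        self.rules = {}                                   # (element, role) -> bool

    def allow(self, element, role, permitted=True):
        self.rules[(element, role)] = permitted

    def may_share(self, element, role):
        return self.rules.get((element, role), False)     # unknown pairs default to deny

policy = SharingPolicy()
policy.allow("medications", "primary_care")               # routine data shared with the PCP
policy.allow("mental_health_notes", "primary_care", False)   # sensitive element withheld

print(policy.may_share("medications", "primary_care"))    # True
print(policy.may_share("mental_health_notes", "billing")) # False: no rule, so deny
```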

  1. Privacy protection for patients with substance use problems

    Directory of Open Access Journals (Sweden)

    Hu LL

    2011-12-01

    Full Text Available Lianne Lian Hu,1 Steven Sparenborg,2 Betty Tai2; 1Department of Preventive Medicine and Biometrics, Uniformed Services University of the Health Sciences; 2Center for the Clinical Trials Network, National Institute on Drug Abuse, National Institutes of Health, Bethesda, MD. Abstract: Many Americans with substance use problems will have opportunities to receive coordinated health care through the integration of primary care and specialty care for substance use disorders under the Patient Protection and Affordable Care Act of 2010. Sharing of patient health records among care providers is essential to realize the benefits of electronic health records. Health information exchange through meaningful use of electronic health records can improve health care safety, quality, and efficiency. Implementation of electronic health records and health information exchange presents great opportunities for health care integration, but also makes patient privacy potentially vulnerable. Privacy issues are paramount for patients with substance use problems. This paper discusses major differences between two federal privacy laws associated with health care for substance use disorders, identifies health care problems created by privacy policies, and describes potential solutions to these problems through technology innovation and policy improvement. Keywords: substance abuse, patient privacy, electronic health records, health information exchange

  2. 39 CFR 262.5 - Systems (Privacy).

    Science.gov (United States)

    2010-07-01

    ... 39 Postal Service 1 2010-07-01 2010-07-01 false Systems (Privacy). 262.5 Section 262.5 Postal... DEFINITIONS § 262.5 Systems (Privacy). (a) Privacy Act system of records. A Postal Service system containing... individual. (c) Computer matching program. A “matching program,” as defined in the Privacy Act, 5 U.S.C. 552a...

  3. Assessing the privacy policies in mobile personal health records.

    Science.gov (United States)

    Zapata, Belén Cruz; Hernández Niñirola, Antonio; Fernández-Alemán, José Luis; Toval, Ambrosio

    2014-01-01

    The huge increase in the number and use of smartphones and tablets has led health service providers to take an interest in mHealth. Popular mobile app markets like Apple App Store or Google Play contain thousands of health applications. Although mobile personal health records (mPHRs) have a number of benefits, important challenges appear in the form of adoption barriers. Security and privacy have been identified as part of these barriers and should be addressed. This paper analyzes and assesses a total of 24 free mPHRs for Android and iOS. Characteristics regarding privacy and security were extracted from the HIPAA. The results show important differences in both the mPHRs and the characteristics analyzed. A questionnaire containing six questions concerning privacy policies was defined. Our questionnaire may assist developers and stakeholders to evaluate the security and privacy of their mPHRs.

  4. Book Review: Online Privacy: Issues in the Digital Age

    Directory of Open Access Journals (Sweden)

    Darlene M Tester

    2011-09-01

    Full Text Available Currie, Stephen (2012): Online Privacy: Issues in the Digital Age, San Diego, CA, Reference Point Press, Inc. 96 pages, ISBN: 13-978-1-60152-194-1, US $27.95. Reviewed by Darlene M Tester, CISSP, CISM, ITIL, CHSS, JD, Metropolitan State University, Minnesota (nonsequitr60@gmail.com). This book is one of a series of books Currie has written about online areas of concern. This is the sixth book in the series. The purpose of the book is to act as a primer for people in the IT field who may need a point of reference for Internet issues such as gaming, security and privacy. The book takes a high-level look at the complexities of privacy online, from social networking to hackers, and provides insight into what the most pressing issues of privacy are online today. (See PDF for full review.)

  5. Is Electronic Privacy Achievable?

    National Research Council Canada - National Science Library

    Irvine, Cynthia E; Levin, Timothy E

    2000-01-01

    ... individuals. The purpose of this panel was to focus on how new technologies are affecting privacy. Technologies that might adversely affect privacy were identified by Rein Turn at previous symposia...

  6. A PhD abstract presentation on Personal Information Privacy System based on Proactive Design

    DEFF Research Database (Denmark)

    Dhotre, Prashant Shantaram; Olesen, Henning

    2014-01-01

    Providers and websites collect and make extensive use of personal information. Using different Big Data methods and techniques, knowledge and patterns are generated or extracted from the data. This can lead to serious privacy breaches. Hence, there is a need for embedding privacy in the design phase: this will be the basic principle on which data security can be provided and privacy protected. It will give the user more control and power over personal information.

  7. Privacy enhanced group communication in clinical environment

    Science.gov (United States)

    Li, Mingyan; Narayanan, Sreeram; Poovendran, Radha

    2005-04-01

    Privacy protection of medical records has always been an important issue and is mandated by the recent Health Insurance Portability and Accountability Act (HIPAA) standards. In this paper, we propose security architectures for a tele-referring system that allows electronic group communication among professionals for better quality treatments, while protecting patient privacy against unauthorized access. Although DICOM defines the much-needed guidelines for confidentiality of medical data during transmission, there is no provision in the existing medical security systems to guarantee patient privacy once the data has been received. In our design, we address this issue by enabling tracing back to the recipient whose received data is disclosed to outsiders, using a watermarking technique. We present the security architecture design of a tele-referring system using a distributed approach and a centralized web-based approach. The resulting tele-referring system (i) provides confidentiality during transmission and ensures integrity and authenticity of the received data, (ii) allows tracing of the recipient who has either distributed the data to outsiders or whose system has been compromised, (iii) provides proof of receipt or origin, and (iv) is easy to use and low-cost to employ in a clinical environment.
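    The tracing idea can be illustrated with a toy least-significant-bit watermark (an illustration only; the paper does not specify this embedding, and the image size and 32-bit identifier are assumptions): each released copy carries the recipient's identifier, so a leaked copy points back to its recipient.

```python
# Toy LSB watermark for recipient tracing (illustrative only, not the paper's scheme).
import numpy as np

def embed_recipient_id(pixels, recipient_id, n_bits=32):
    """Write recipient_id into the least-significant bits of the first n_bits pixels."""
    marked = pixels.ravel().copy()
    for i in range(n_bits):
        bit = (recipient_id >> i) & 1
        marked[i] = (int(marked[i]) & 0xFE) | bit
    return marked.reshape(pixels.shape)

def extract_recipient_id(pixels, n_bits=32):
    flat = pixels.ravel()
    return sum((int(flat[i]) & 1) << i for i in range(n_bits))

image = np.random.randint(0, 256, (64, 64), dtype=np.uint8)   # stand-in for an image frame
released = embed_recipient_id(image, recipient_id=4021)        # copy sent to recipient 4021
assert extract_recipient_id(released) == 4021                  # a leaked copy names its recipient
```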

  8. Privacy and security disclosures on telecardiology websites

    NARCIS (Netherlands)

    Dubbeld, L.

    2006-01-01

    This article discusses telemedicine providers' online privacy and security disclosures. It presents the results of an exploratory study of a number of telecardiology companies' Web sites, providing insight into some of the current strategies towards data protection and information security in the

  9. SoK: Privacy on Mobile Devices – It’s Complicated

    Directory of Open Access Journals (Sweden)

    Spensky Chad

    2016-07-01

    Full Text Available Modern mobile devices place a wide variety of sensors and services within the personal space of their users. As a result, these devices are capable of transparently monitoring many sensitive aspects of these users’ lives (e.g., location, health, or correspondences). Users typically trade access to this data for convenient applications and features, in many cases without a full appreciation of the nature and extent of the information that they are exposing to a variety of third parties. Nevertheless, studies show that users remain concerned about their privacy and vendors have similarly been increasing their utilization of privacy-preserving technologies in these devices. Still, despite significant efforts, these technologies continue to fail in fundamental ways, leaving users’ private data exposed.

  10. Teaching Information Privacy in Marketing Courses: Key Educational Issues for Principles of Marketing and Elective Marketing Courses

    Science.gov (United States)

    Peltier, James W.; Milne, George R.; Phelps, Joseph E.; Barrett, Jennifer T.

    2010-01-01

    An "information privacy gap" exists in marketing education, with little research addressing the state of information privacy and how appropriate privacy strategies and tactics should be communicated to students. The primary purpose of this article is to provide educators an understanding of information privacy and how they can incorporate this…

  11. Collecting location-based voice messages on a TalkingBadge

    DEFF Research Database (Denmark)

    Hansen, John Paulin; Glenstrup, Arne John; Wusheng, Wang

    2012-01-01

    and specific enough to distinguish between zones located just 20 meters apart. Finally, we played digitized voice messages to 11 participants walking into a zone. They received most of the messages well, but a majority of their comments were negative, expressing concerns for the potential infringement of privacy. We conclude that location-specific audio messaging works from a technical perspective, but requires careful consideration of social comfort.

  12. Disclosure 'downunder': misadventures in Australian genetic privacy law.

    Science.gov (United States)

    Bonython, Wendy; Arnold, Bruce

    2014-03-01

    Along with many jurisdictions, Australia is struggling with the unique issues raised by genetic information in the context of privacy laws and medical ethics. Although the consequences of disclosure of most private information are generally confined to individuals, disclosure of genetic information has far-reaching consequences, with a credible argument that genetic relatives have a right to know about potential medical conditions. In 2006, the Privacy Act was amended to permit disclosure of an individual's genetic information, without their consent, to genetic relatives, if it was to avoid or mitigate serious illness. Unfortunately, additional amendments required for operation of the disclosure amendment were overlooked. Public Interest Determinations (PIDs), delegated legislation issued by the privacy commissioner, have instead been used to exempt healthcare providers from provisions which would otherwise make disclosure unlawful. This paper critiques the PIDs using documents obtained under the Freedom of Information Act, focusing on the impact of both the PIDs and the disclosure amendment on patients and relatives, on confidentiality, and on the procedural validity of subordinate laws regulating medical privacy.

  13. Governance Through Privacy, Fairness, and Respect for Individuals.

    Science.gov (United States)

    Baker, Dixie B; Kaye, Jane; Terry, Sharon F

    2016-01-01

    Individuals have a moral claim to be involved in the governance of their personal data. Individuals' rights include privacy, autonomy, and the ability to choose for themselves how they want to manage risk, consistent with their own personal values and life situations. The Fair Information Practices principles (FIPPs) offer a framework for governance. Privacy-enhancing technology that complies with applicable law and FIPPs offers a dynamic governance tool for enabling the fair and open use of individual's personal data. Any governance model must protect against the risks posed by data misuse. Individual perceptions of risks are a subjective function involving individuals' values toward self, family, and society, their perceptions of trust, and their cognitive decision-making skills. Individual privacy protections and individuals' right to choose are codified in the HIPAA Privacy Rule, which attempts to strike a balance between the dual goals of information flow and privacy protection. The choices most commonly given individuals regarding the use of their health information are binary ("yes" or "no") and immutable. Recent federal recommendations and law recognize the need for granular, dynamic choices. Individuals expect that they will govern the use of their own health and genomic data. Failure to build and maintain individuals' trust increases the likelihood that they will refuse to grant permission to access or use their data. The "no surprises principle" asserts that an individual's personal information should never be collected, used, transmitted, or disclosed in a way that would surprise the individual were she to learn about it. The FIPPs provide a powerful framework for enabling data sharing and use, while maintaining trust. We introduce the eight FIPPs adopted by the Department of Health and Human Services, and provide examples of their interpretation and implementation. Privacy risk and health risk can be reduced by giving consumers control, autonomy, and

  14. The Impact of Privacy Concerns and Perceived Vulnerability to Risks on Users Privacy Protection Behaviors on SNS: A Structural Equation Model

    OpenAIRE

    Noora Sami Al-Saqer; Mohamed E. Seliaman

    2016-01-01

    This research paper investigates Saudi users’ awareness levels about privacy policies in Social Networking Sites (SNSs), their privacy concerns and their privacy protection measures. For this purpose, a research model that consists of five main constructs namely information privacy concern, awareness level of privacy policies of social networking sites, perceived vulnerability to privacy risks, perceived response efficacy, and privacy protecting behavior was developed. An online survey questi...

  15. Regulating Online Data Privacy

    OpenAIRE

    Paul Reid

    2004-01-01

    With existing data protection laws proving inadequate in the fight to protect online data privacy, and with the offline law of privacy in a state of change and uncertainty, the search for an alternative solution to the important problem of online data privacy should commence. Given the inherent problem of jurisdiction that the Internet presents, such a solution would best come from a multi-national body with the power to approximate laws in as many jurisdictions as possible, with a recognised au...

  16. Accountability as a Way Forward for Privacy Protection in the Cloud

    Science.gov (United States)

    Pearson, Siani; Charlesworth, Andrew

    The issue of how to provide appropriate privacy protection for cloud computing is important, and as yet unresolved. In this paper we propose an approach in which procedural and technical solutions are co-designed to demonstrate accountability as a path forward to resolving jurisdictional privacy and security risks within the cloud.

  17. Trust Me, I’m a Doctor: Examining Changes in How Privacy Concerns Affect Patient Withholding Behavior

    Science.gov (United States)

    Johnson, Tyler; Ford, Eric W; Huerta, Timothy R

    2017-01-01

    Background: As electronic health records (EHRs) become ubiquitous in the health care industry, privacy breaches are increasing and being made public. These breaches may make consumers wary of the technology, undermining its potential to improve care coordination and research. Objective: Given the developing concerns around privacy of personal health information stored in digital format, it is important for providers to understand how views on privacy and security may be associated with patient disclosure of health information. This study aimed to understand how privacy concerns may be shifting patient behavior. Methods: Using a pooled cross-section of data from the 2011 and 2014 cycles of the Health Information National Trends Survey (HINTS), we tested whether privacy and security concerns, as well as quality perceptions, are associated with the likelihood of withholding personal health information from a provider. A fully interacted multivariate model was used to compare associations between the 2 years, and interaction terms were used to evaluate trends in the factors that are associated with withholding behavior. Results: No difference was found regarding the effect of privacy and security concerns on withholding behavior between 2011 and 2014. Similarly, whereas perceived high quality of care was found to reduce the likelihood of withholding information from a provider in both 2011 (odds ratio [OR] 0.73, 95% confidence interval [CI] 0.56-0.94) and 2014 (OR 0.61, 95% CI 0.48-0.76), no difference was observed between years. Conclusions: These findings suggest that consumers’ beliefs about EHR privacy and security, the relationship between technology use and quality, and intentions to share information with their health care provider have not changed. These findings are counter to the ongoing discussions about the implications of security failures in other domains. Our results suggest that providers could ameliorate privacy and security by focusing on the care

  18. Explaining the Impact of Cloud Assurance Seals on Customers’ Perceived Privacy

    OpenAIRE

    Lang, Michael; Wiesche, Manuel; Krcmar, Helmut

    2018-01-01

    Privacy concerns inhibit professional cloud adoption. Assurance seals resulting from third-party certification are frequently used by cloud service providers to provide privacy assurance for their customers. However, empirical findings on the effectiveness of assurance seals focus on who issues them, even though customers also require information on why an assurance seal is valid and reliable. To fill this gap, we build on information integration theory and investigate the impact of ...

  19. Privacy-by-Design(PbD IoT Framework : A Case of Location Privacy Mitigation Strategies for Near Field Communication (NFC Tag Sensor

    Directory of Open Access Journals (Sweden)

    V.Ragunatha Nadarajah

    2017-01-01

    Full Text Available Near Field Communication (NFC) technology is a short-range standard (range about 10 cm) extended from the core Radio Frequency Identification (RFID) standard. These technologies are part of wireless communication technology. Although NFC technology benefits various fields, it is still exposed to multiple types of privacy attacks and threats, since communication occurs in an open environment. A filtering technique is performed on the tag in order to control access to the embedded information. As a solution based on tag filtering, the existing NFC Intent filtering is merged with Bloom filtering from RFID technology. This helps to eliminate duplicate tags and to verify received tags. Meanwhile, as content protection for the NFC Data Exchange Format (NDEF) messages transmitted over the communication channel, 128-bit Advanced Encryption Standard (AES) encryption is applied to the NDEF message. Bloom filtering performs its hashing operation using MD5 to verify that a user is registered with the NFC system, and the default Intent filtering then directs the user to the invocation registered on the tag after the Bloom filter verification. In addition, with AES applied to the NDEF message, cracking the key by brute force would take an estimated 80 trillion years or more. Communication between two legitimate entities is thus secured with AES encryption. Hence, secure user validation (filtering) combined with message encryption removes the possibility for a man-in-the-middle (MITM) attacker to retrieve sensitive or personal information. The overall framework provides a better security solution compared to the existing framework.
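    The two mechanisms named above can be sketched as follows (an illustration, not the paper's implementation; the filter size, hash count, and the choice of AES-GCM as the AES-128 mode are assumptions, and the example uses the third-party `cryptography` package): an MD5-based Bloom filter verifies that a tag is registered before its AES-encrypted NDEF payload is decrypted.

```python
# Sketch of Bloom-filter tag verification plus AES-128 protection of an NDEF payload
# (illustrative only; filter size, hash count and GCM mode are assumptions).
import hashlib, os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

M, K = 1024, 3                                    # Bloom filter size and hash count

def positions(tag_id: bytes):
    """K bit positions derived from MD5 digests of the tag identifier."""
    return [int.from_bytes(hashlib.md5(bytes([i]) + tag_id).digest(), "big") % M
            for i in range(K)]

bits = [0] * M

def register(tag_id):                             # add a registered tag to the filter
    for p in positions(tag_id):
        bits[p] = 1

def is_registered(tag_id):                        # may give false positives, never false negatives
    return all(bits[p] for p in positions(tag_id))

register(b"tag-0001")

key = AESGCM.generate_key(bit_length=128)         # AES-128 key shared with legitimate readers
nonce = os.urandom(12)
ndef_payload = b"https://example.org/asset/42"    # hypothetical NDEF record content
ciphertext = AESGCM(key).encrypt(nonce, ndef_payload, None)

if is_registered(b"tag-0001"):                    # verify the tag before acting on its content
    assert AESGCM(key).decrypt(nonce, ciphertext, None) == ndef_payload
```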

  20. Privacy and security in teleradiology

    International Nuclear Information System (INIS)

    Ruotsalainen, Pekka

    2010-01-01

    Teleradiology is probably the most successful eHealth service available today. Its business model is based on the remote transmission of radiological images (e.g. X-ray and CT images) over electronic networks, and on the interpretation of the transmitted images for diagnostic purposes. Two basic service models are commonly used in teleradiology today. The most common approach is based on the message paradigm (off-line model), but more developed teleradiology systems are based on the interactive use of PACS/RIS systems. Modern teleradiology is also increasingly a cross-organisational, or even cross-border, service between service providers having different jurisdictions and security policies. This paper defines the requirements needed to make different teleradiology models trusted. Those requirements include a common security policy that covers all partners and entities, common security and privacy protection principles and requirements, controlled contracts between partners, and the use of security controls and tools that support the common security policy. The security and privacy protection of any teleradiology system must be planned in advance, and the necessary security and privacy enhancing tools should be selected (e.g. strong authentication, data encryption, non-repudiation services and audit logs) based on a risk analysis and the requirements set by legislation. In any case the teleradiology system should fulfil ethical and regulatory requirements. Certification of the whole teleradiology service system, including security and privacy, is also proposed. In the future, teleradiology services will be an integrated part of pervasive eHealth. Security requirements for this environment, including dynamic and context-aware security services, are also discussed in this paper.

  1. Privacy and security in teleradiology

    Energy Technology Data Exchange (ETDEWEB)

    Ruotsalainen, Pekka [National Institute for Health and Welfare, Helsinki (Finland)], E-mail: pekka.ruotsalainen@THL.fi

    2010-01-15

    Teleradiology is probably the most successful eHealth service available today. Its business model is based on the remote transmission of radiological images (e.g. X-ray and CT images) over electronic networks, and on the interpretation of the transmitted images for diagnostic purposes. Two basic service models are commonly used in teleradiology today. The most common approach is based on the message paradigm (off-line model), but more developed teleradiology systems are based on the interactive use of PACS/RIS systems. Modern teleradiology is also increasingly a cross-organisational, or even cross-border, service between service providers having different jurisdictions and security policies. This paper defines the requirements needed to make different teleradiology models trusted. Those requirements include a common security policy that covers all partners and entities, common security and privacy protection principles and requirements, controlled contracts between partners, and the use of security controls and tools that support the common security policy. The security and privacy protection of any teleradiology system must be planned in advance, and the necessary security and privacy enhancing tools should be selected (e.g. strong authentication, data encryption, non-repudiation services and audit logs) based on a risk analysis and the requirements set by legislation. In any case the teleradiology system should fulfil ethical and regulatory requirements. Certification of the whole teleradiology service system, including security and privacy, is also proposed. In the future, teleradiology services will be an integrated part of pervasive eHealth. Security requirements for this environment, including dynamic and context-aware security services, are also discussed in this paper.

  2. Advertising and Invasion of Privacy.

    Science.gov (United States)

    Rohrer, Daniel Morgan

    The right of privacy as it relates to advertising and the use of a person's name or likeness is discussed in this paper. After an introduction that traces some of the history of invasion of privacy in court decisions, the paper examines cases involving issues such as public figures and newsworthy items, right of privacy waived, right of privacy…

  3. Balancing Health Information Exchange and Privacy Governance from a Patient-Centred Connected Health and Telehealth Perspective.

    Science.gov (United States)

    Kuziemsky, Craig E; Gogia, Shashi B; Househ, Mowafa; Petersen, Carolyn; Basu, Arindam

    2018-04-22

     Connected healthcare is an essential part of patient-centred care delivery. Technology such as telehealth is a critical part of connected healthcare. However, exchanging health information brings the risk of privacy issues. To better manage privacy risks we first need to understand the different patterns of patient-centred care in order to tailor solutions to address privacy risks.  Drawing upon published literature, we develop a business model to enable patient-centred care via telehealth. The model identifies three patient-centred connected health patterns. We then use the patterns to analyse potential privacy risks and possible solutions from different types of telehealth delivery.  Connected healthcare raises the risk of unwarranted access to health data and related invasion of privacy. However, the risk and extent of privacy issues differ according to the pattern of patient-centred care delivery and the type of particular challenge as they enable the highest degree of connectivity and thus the greatest potential for privacy breaches.  Privacy issues are a major concern in telehealth systems and patients, providers, and administrators need to be aware of these privacy issues and have guidance on how to manage them. This paper integrates patient-centred connected health care, telehealth, and privacy risks to provide an understanding of how risks vary across different patterns of patient-centred connected health and different types of telehealth delivery. Georg Thieme Verlag KG Stuttgart.

  4. Efficiency and Privacy Enhancement for a Track and Trace System of RFID-Based Supply Chains

    Directory of Open Access Journals (Sweden)

    Xunjun Chen

    2015-06-01

    Full Text Available One of the major applications of Radio Frequency Identification (RFID) technology is in supply chain management as it promises to provide real-time visibility based on the function of track and trace. However, such an RFID-based track and trace system raises new security and privacy challenges due to the restricted resource of tags. In this paper, we refine three privacy-related models (i.e., privacy, path unlinkability, and tag unlinkability) of RFID-based track and trace systems, and clarify the relations among these privacy models. Specifically, we have proven that privacy is equivalent to path unlinkability and tag unlinkability implies privacy. Our results simplify the privacy concept and protocol design for RFID-based track and trace systems. Furthermore, we propose an efficient track and trace scheme, Tracker+, which allows for authentic and private identification of RFID-tagged objects in supply chains. In the Tracker+, no computational ability is required for tags, but only a few bytes of storage (such as EPC Class 1 Gen 2 tags) are needed to store the tag state. Indeed, Tracker+ reduces the memory requirements for each tag by one group element compared to the Tracker presented in other literature. Moreover, Tracker+ provides privacy against supply chain inside attacks.

  5. Privacy and User Experience in 21st Century Library Discovery

    Directory of Open Access Journals (Sweden)

    Shayna Pekala

    2017-06-01

    Full Text Available Over the last decade, libraries have taken advantage of emerging technologies to provide new discovery tools to help users find information and resources more efficiently. In the wake of this technological shift in discovery, privacy has become an increasingly prominent and complex issue for libraries. The nature of the web, over which users interact with discovery tools, has substantially diminished the library’s ability to control patron privacy. The emergence of a data economy has led to a new wave of online tracking and surveillance, in which multiple third parties collect and share user data during the discovery process, making it much more difficult, if not impossible, for libraries to protect patron privacy. In addition, users are increasingly starting their searches with web search engines, diminishing the library’s control over privacy even further. While libraries have a legal and ethical responsibility to protect patron privacy, they are simultaneously challenged to meet evolving user needs for discovery. In a world where “search” is synonymous with Google, users increasingly expect their library discovery experience to mimic their experience using web search engines. However, web search engines rely on a drastically different set of privacy standards, as they strive to create tailored, personalized search results based on user data. Libraries are seemingly forced to make a choice between delivering the discovery experience users expect and protecting user privacy. This paper explores the competing interests of privacy and user experience, and proposes possible strategies to address them in the future design of library discovery tools.

  6. A Hierarchical Multitier Approach for Privacy Policies in e-Government Environments

    Directory of Open Access Journals (Sweden)

    Prokopios Drogkaris

    2015-12-01

    Full Text Available The appeal of e-Government users to retain control over their personal information, while making use of advanced governmental electronic services through interconnected and interoperable deployments, can be assisted by the incorporation of privacy policy and Preferences documents. This paper addresses the formulation of light-weight and accurate privacy policies, while preserving compliance with the underlying legal and regulatory framework. Through the exploitation of existing governmental hierarchies, a multitier approach is proposed that is able to support divergent data needs and processing requests imposed by service providers. The incorporation of this approach into e-Government environments will reduce the administrative workload, imposed by the inclusion of privacy policy documents, and promote the implementation and provision of user-centric and data privacy aware electronic services.

  7. Efficient Secure and Privacy-Preserving Route Reporting Scheme for VANETs

    Science.gov (United States)

    Zhang, Yuanfei; Pei, Qianwen; Dai, Feifei; Zhang, Lei

    2017-10-01

    Vehicular ad-hoc network (VANET) is a core component of intelligent traffic management systems which can provide various applications such as accident prediction, route reporting, etc. Due to the problems caused by traffic congestion, route reporting becomes a prospective application which can help a driver to get an optimal route and save her travel time. Before enjoying the convenience of route reporting, security and privacy-preserving issues need to be addressed. In this paper, we propose a new secure and privacy-preserving route reporting scheme for VANETs. In our scheme, only an authenticated vehicle can use the route reporting service provided by the traffic management center. Further, a vehicle may receive the response from the traffic management center with low latency and without violating the privacy of the vehicle. Experiment results show that our scheme is much more efficient than the existing one.

  8. A Certificate Authority (CA-based cryptographic solution for HIPAA privacy/security regulations

    Directory of Open Access Journals (Sweden)

    Sangram Ray

    2014-07-01

    Full Text Available The Health Insurance Portability and Accountability Act (HIPAA) passed by the US Congress establishes a number of privacy/security regulations for e-healthcare systems. These regulations support patients’ medical privacy and secure exchange of PHI (protected health information) among medical practitioners. Three existing HIPAA-based schemes have been studied but appear to be ineffective as patients’ PHI is stored in smartcards. Moreover, carrying a smartcard during a treatment session and accessing PHI from different locations results in restrictions. In addition, authentication of the smartcard presenter would not be possible if the PIN is compromised. In this context, we propose that an MCS (medical center server) be located at each hospital and accessed via the Internet for secure handling of patients’ PHI. All entities of the proposed e-health system register online with the MCS, and each entity negotiates a contributory registration key, where public-key certificates issued and maintained by CAs are used for authentication. Prior to a treatment session, a doctor negotiates a secret session key with MCS and uploads/retrieves patients’ PHI securely. The proposed scheme has five phases, which have been implemented in a secure manner for supporting HIPAA privacy/security regulations. Finally, the security aspects, computation and communication costs of the scheme are analyzed and compared with existing methods that display satisfactory performance.
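
    The contributory session-key idea mentioned above (two parties deriving a shared secret whose public halves would be carried in CA-issued certificates) can be pictured with a generic ECDH key agreement followed by HKDF key derivation. This is only a minimal sketch using the third-party Python cryptography package, not the phases of the proposed scheme; certificate issuance and validation are omitted, and the label "phi-session" is an illustrative placeholder.

```python
# pip install cryptography
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each side (e.g. doctor and medical center server) generates a key pair.
doctor_key = ec.generate_private_key(ec.SECP256R1())
mcs_key = ec.generate_private_key(ec.SECP256R1())

# In a CA-based scheme the exchanged public keys would be authenticated via
# certificates; that validation step is omitted in this sketch.
doctor_shared = doctor_key.exchange(ec.ECDH(), mcs_key.public_key())
mcs_shared = mcs_key.exchange(ec.ECDH(), doctor_key.public_key())


def session_key(shared_secret: bytes) -> bytes:
    """Derive a 256-bit session key from the raw ECDH shared secret."""
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"phi-session").derive(shared_secret)


assert session_key(doctor_shared) == session_key(mcs_shared)
print("contributory session key established")
```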

  9. A Privacy Preservation Model for Health-Related Social Networking Sites

    Science.gov (United States)

    2015-01-01

    The increasing use of social networking sites (SNS) in health care has resulted in a growing number of individuals posting personal health information online. These sites may disclose users' health information to many different individuals and organizations and mine it for a variety of commercial and research purposes, yet the revelation of personal health information to unauthorized individuals or entities brings a concomitant concern of greater risk for loss of privacy among users. Many users join multiple social networks for different purposes and enter personal and other specific information covering social, professional, and health domains into other websites. Integration of multiple online and real social networks makes the users vulnerable to unintentional and intentional security threats and misuse. This paper analyzes the privacy and security characteristics of leading health-related SNS. It presents a threat model and identifies the most important threats to users and SNS providers. Building on threat analysis and modeling, this paper presents a privacy preservation model that incorporates individual self-protection and privacy-by-design approaches and uses the model to develop principles and countermeasures to protect user privacy. This study paves the way for analysis and design of privacy-preserving mechanisms on health-related SNS. PMID:26155953

  10. A Privacy Preservation Model for Health-Related Social Networking Sites.

    Science.gov (United States)

    Li, Jingquan

    2015-07-08

    The increasing use of social networking sites (SNS) in health care has resulted in a growing number of individuals posting personal health information online. These sites may disclose users' health information to many different individuals and organizations and mine it for a variety of commercial and research purposes, yet the revelation of personal health information to unauthorized individuals or entities brings a concomitant concern of greater risk for loss of privacy among users. Many users join multiple social networks for different purposes and enter personal and other specific information covering social, professional, and health domains into other websites. Integration of multiple online and real social networks makes the users vulnerable to unintentional and intentional security threats and misuse. This paper analyzes the privacy and security characteristics of leading health-related SNS. It presents a threat model and identifies the most important threats to users and SNS providers. Building on threat analysis and modeling, this paper presents a privacy preservation model that incorporates individual self-protection and privacy-by-design approaches and uses the model to develop principles and countermeasures to protect user privacy. This study paves the way for analysis and design of privacy-preserving mechanisms on health-related SNS.

  11. UnLynx: A Decentralized System for Privacy-Conscious Data Sharing

    Directory of Open Access Journals (Sweden)

    Froelicher David

    2017-10-01

    Full Text Available Current solutions for privacy-preserving data sharing among multiple parties either depend on a centralized authority that must be trusted and provides only weakest-link security (e.g., the entity that manages private/secret cryptographic keys, or leverage on decentralized but impractical approaches (e.g., secure multi-party computation. When the data to be shared are of a sensitive nature and the number of data providers is high, these solutions are not appropriate. Therefore, we present UnLynx, a new decentralized system for efficient privacy-preserving data sharing. We consider m servers that constitute a collective authority whose goal is to verifiably compute on data sent from n data providers. UnLynx guarantees the confidentiality, unlinkability between data providers and their data, privacy of the end result and the correctness of computations by the servers. Furthermore, to support differentially private queries, UnLynx can collectively add noise under encryption. All of this is achieved through a combination of a set of new distributed and secure protocols that are based on homomorphic cryptography, verifiable shuffling and zero-knowledge proofs. UnLynx is highly parallelizable and modular by design as it enables multiple security/privacy vs. runtime tradeoffs. Our evaluation shows that UnLynx can execute a secure survey on 400,000 personal data records containing 5 encrypted attributes, distributed over 20 independent databases, for a total of 2,000,000 ciphertexts, in 24 minutes.
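
    The collective-authority idea can be illustrated, in a much-simplified form, with additive secret sharing: each data provider splits its value into one random share per server, so no single server ever sees the value, yet the per-server subtotals combine to the exact global sum. This is only a minimal sketch of the aggregation step under that assumption; the actual UnLynx system relies on homomorphic (ElGamal-based) encryption, verifiable shuffling and zero-knowledge proofs, none of which are reproduced here.

```python
import random

PRIME = 2**61 - 1  # all arithmetic is done modulo a public prime


def share(value, n_servers):
    """Split `value` into n_servers additive shares modulo PRIME."""
    shares = [random.randrange(PRIME) for _ in range(n_servers - 1)]
    last = (value - sum(shares)) % PRIME
    return shares + [last]


def aggregate(provider_values, n_servers=3):
    """Each provider sends one share to each server; servers only ever see
    random-looking shares, yet their subtotals reveal the global sum."""
    server_totals = [0] * n_servers
    for v in provider_values:
        for i, s in enumerate(share(v, n_servers)):
            server_totals[i] = (server_totals[i] + s) % PRIME
    # combining the per-server subtotals recovers the sum of all inputs
    return sum(server_totals) % PRIME


if __name__ == "__main__":
    survey_answers = [23, 17, 42, 8, 90]
    assert aggregate(survey_answers) == sum(survey_answers)
    print("secure sum:", aggregate(survey_answers))
```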

  12. Simple Peer-to-Peer SIP Privacy

    Science.gov (United States)

    Koskela, Joakim; Tarkoma, Sasu

    In this paper, we introduce a model for enhancing privacy in peer-to-peer communication systems. The model is based on data obfuscation, preventing intermediate nodes from tracking calls, while still utilizing the shared resources of the peer network. This increases security when moving between untrusted, limited and ad-hoc networks, when the user is forced to rely on peer-to-peer schemes. The model is evaluated using a Host Identity Protocol-based prototype on mobile devices, and is found to provide good privacy, especially when combined with a source address hiding scheme. The contribution of this paper is to present the model and results obtained from its use, including usability considerations.

  13. Redefining genomic privacy: trust and empowerment.

    Directory of Open Access Journals (Sweden)

    Yaniv Erlich

    2014-11-01

    Full Text Available Fulfilling the promise of the genetic revolution requires the analysis of large datasets containing information from thousands to millions of participants. However, sharing human genomic data requires protecting subjects from potential harm. Current models rely on de-identification techniques in which privacy versus data utility becomes a zero-sum game. Instead, we propose the use of trust-enabling techniques to create a solution in which researchers and participants both win. To do so we introduce three principles that facilitate trust in genetic research and outline one possible framework built upon those principles. Our hope is that such trust-centric frameworks provide a sustainable solution that reconciles genetic privacy with data sharing and facilitates genetic research.

  14. Redefining genomic privacy: trust and empowerment.

    Science.gov (United States)

    Erlich, Yaniv; Williams, James B; Glazer, David; Yocum, Kenneth; Farahany, Nita; Olson, Maynard; Narayanan, Arvind; Stein, Lincoln D; Witkowski, Jan A; Kain, Robert C

    2014-11-01

    Fulfilling the promise of the genetic revolution requires the analysis of large datasets containing information from thousands to millions of participants. However, sharing human genomic data requires protecting subjects from potential harm. Current models rely on de-identification techniques in which privacy versus data utility becomes a zero-sum game. Instead, we propose the use of trust-enabling techniques to create a solution in which researchers and participants both win. To do so we introduce three principles that facilitate trust in genetic research and outline one possible framework built upon those principles. Our hope is that such trust-centric frameworks provide a sustainable solution that reconciles genetic privacy with data sharing and facilitates genetic research.

  15. Privacy-invading technologies : safeguarding privacy, liberty & security in the 21st century

    NARCIS (Netherlands)

    Klitou, Demetrius

    2012-01-01

    With a focus on the growing development and deployment of the latest technologies that threaten privacy, the PhD dissertation argues that the US and UK legal frameworks, in their present form, are inadequate to defend privacy and other civil liberties against the intrusive capabilities of body

  16. SIED, a Data Privacy Engineering Framework

    OpenAIRE

    Mivule, Kato

    2013-01-01

    While a number of data privacy techniques have been proposed in the recent years, a few frameworks have been suggested for the implementation of the data privacy process. Most of the proposed approaches are tailored towards implementing a specific data privacy algorithm but not the overall data privacy engineering and design process. Therefore, as a contribution, this study proposes SIED (Specification, Implementation, Evaluation, and Dissemination), a conceptual framework that takes a holist...

  17. Comparison of two speech privacy measurements, articulation index (AI) and speech privacy noise isolation class (NIC'), in open workplaces

    Science.gov (United States)

    Yoon, Heakyung C.; Loftness, Vivian

    2002-05-01

    Lack of speech privacy has been reported to be the main dissatisfaction among occupants in open workplaces, according to workplace surveys. Two speech privacy measurements, Articulation Index (AI), standardized by the American National Standards Institute in 1969, and Speech Privacy Noise Isolation Class (NIC', Noise Isolation Class Prime), adapted from Noise Isolation Class (NIC) by U. S. General Services Administration (GSA) in 1979, have been claimed as objective tools to measure speech privacy in open offices. To evaluate which of them, normal privacy for AI or satisfied privacy for NIC', is a better tool in terms of speech privacy in a dynamic open office environment, measurements were taken in the field. AIs and NIC's in the different partition heights and workplace configurations have been measured following ASTM E1130 (Standard Test Method for Objective Measurement of Speech Privacy in Open Offices Using Articulation Index) and GSA test PBS-C.1 (Method for the Direct Measurement of Speech-Privacy Potential (SPP) Based on Subjective Judgments) and PBS-C.2 (Public Building Service Standard Method of Test Method for the Sufficient Verification of Speech-Privacy Potential (SPP) Based on Objective Measurements Including Methods for the Rating of Functional Interzone Attenuation and NC-Background), respectively.

  18. An Alternative View of Privacy on Facebook

    Directory of Open Access Journals (Sweden)

    Christian Fuchs

    2011-02-01

    Full Text Available The predominant analysis of privacy on Facebook focuses on personal information revelation. This paper is critical of this kind of research and introduces an alternative analytical framework for studying privacy on Facebook, social networking sites and web 2.0. This framework is connecting the phenomenon of online privacy to the political economy of capitalism—a focus that has thus far been rather neglected in research literature about Internet and web 2.0 privacy. Liberal privacy philosophy tends to ignore the political economy of privacy in capitalism that can mask socio-economic inequality and protect capital and the rich from public accountability. Facebook is in this paper analyzed with the help of an approach, in which privacy for dominant groups, in regard to the ability of keeping wealth and power secret from the public, is seen as problematic, whereas privacy at the bottom of the power pyramid for consumers and normal citizens is seen as a protection from dominant interests. Facebook’s privacy concept is based on an understanding that stresses self-regulation and on an individualistic understanding of privacy. The theoretical analysis of the political economy of privacy on Facebook in this paper is based on the political theories of Karl Marx, Hannah Arendt and Jürgen Habermas. Based on the political economist Dallas Smythe’s concept of audience commodification, the process of prosumer commodification on Facebook is analyzed. The political economy of privacy on Facebook is analyzed with the help of a theory of drives that is grounded in Herbert Marcuse’s interpretation of Sigmund Freud, which makes it possible to analyze Facebook based on the concept of play labor (= the convergence of play and labor).

  19. Efficient and privacy-preserving biometric identification in cloud

    Directory of Open Access Journals (Sweden)

    Changhee Hahn

    2016-09-01

    Full Text Available With the rapid growth in the development of smart devices equipped with biometric sensors, client identification systems using biometric traits are widely adopted across various applications. Among many biometric traits, fingerprint-based identification systems have been extensively studied and deployed. However, to adopt biometric identification systems in practical applications, two main obstacles in terms of efficiency and client privacy must be resolved simultaneously. That is, identification should be performed within an acceptable time, and only a client should have access to his/her biometric traits, which are not revocable if leaked. Until now, multiple studies have demonstrated successful protection of client biometric data; however, such systems lack efficiency, which leads to excessive time utilization for identification. The most recently researched scheme shows efficiency improvements but reveals client biometric traits to other entities such as the biometric database server. This violates client privacy. In this paper, we propose an efficient and privacy-preserving fingerprint identification scheme by using cloud systems. The proposed scheme extensively exploits the computation power of a cloud so that most of the laborious computations are performed by the cloud service provider. According to our experimental results on an Amazon EC2 cloud, the proposed scheme is faster than the existing schemes and guarantees client privacy by exploiting symmetric homomorphic encryption. Our security analysis shows that during identification, the client fingerprint data is not disclosed to the cloud service provider or fingerprint database server.

  20. An Alternative View of Privacy on Facebook

    OpenAIRE

    Christian Fuchs

    2011-01-01

    The predominant analysis of privacy on Facebook focuses on personal information revelation. This paper is critical of this kind of research and introduces an alternative analytical framework for studying privacy on Facebook, social networking sites and web 2.0. This framework is connecting the phenomenon of online privacy to the political economy of capitalism—a focus that has thus far been rather neglected in research literature about Internet and web 2.0 privacy. Liberal privacy philosophy ...

  1. Effective online privacy mechanisms with persuasive communication

    OpenAIRE

    Coopamootoo, P L

    2016-01-01

    This thesis contributes to research by taking a social psychological perspective to managing privacy online. The thesis proposes to support the effort to form a mental model that is required to evaluate a context with regards to privacy attitudes or to ease the effort by biasing activation of privacy attitudes. Privacy being a behavioural concept, the human-computer interaction design plays a major role in supporting and contributing to end users’ ability to manage their privacy online. Howev...

  2. PriBots: Conversational Privacy with Chatbots

    OpenAIRE

    Harkous, Hamza; Fawaz, Kassem; Shin, Kang G.; Aberer, Karl

    2016-01-01

    Traditional mechanisms for delivering notice and enabling choice have so far failed to protect users’ privacy. Users are continuously frustrated by complex privacy policies, unreachable privacy settings, and a multitude of emerging standards. The miniaturization trend of smart devices and the emergence of the Internet of Things (IoTs) will exacerbate this problem further. In this paper, we propose Conversational Privacy Bots (PriBots) as a new way of delivering notice and choice through a two...

  3. Scalable privacy-preserving data sharing methodology for genome-wide association studies.

    Science.gov (United States)

    Yu, Fei; Fienberg, Stephen E; Slavković, Aleksandra B; Uhler, Caroline

    2014-08-01

    The protection of privacy of individual-level information in genome-wide association study (GWAS) databases has been a major concern of researchers following the publication of "an attack" on GWAS data by Homer et al. (2008). Traditional statistical methods for confidentiality and privacy protection of statistical databases do not scale well to deal with GWAS data, especially in terms of guarantees regarding protection from linkage to external information. The more recent concept of differential privacy, introduced by the cryptographic community, is an approach that provides a rigorous definition of privacy with meaningful privacy guarantees in the presence of arbitrary external information, although the guarantees may come at a serious price in terms of data utility. Building on such notions, Uhler et al. (2013) proposed new methods to release aggregate GWAS data without compromising an individual's privacy. We extend the methods developed in Uhler et al. (2013) for releasing differentially-private χ²-statistics by allowing for an arbitrary number of cases and controls, and for releasing differentially-private allelic test statistics. We also provide a new interpretation by assuming the controls' data are known, which is a realistic assumption because some GWAS use publicly available data as controls. We assess the performance of the proposed methods through a risk-utility analysis on a real data set consisting of DNA samples collected by the Wellcome Trust Case Control Consortium and compare the methods with the differentially-private release mechanism proposed by Johnson and Shmatikov (2013). Copyright © 2014 Elsevier Inc. All rights reserved.
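
    As an illustration of the kind of release mechanism discussed above, the sketch below adds Laplace noise, scaled to a sensitivity bound divided by the privacy budget ε, to a χ² statistic computed from a 2×3 case/control genotype table. This is a generic Laplace-mechanism sketch rather than the authors' method; the sensitivity value is a placeholder parameter, not the bound derived by Uhler et al.

```python
import math
import random


def chi_square(table):
    """Pearson chi-square statistic for a 2x3 case/control genotype table."""
    row_sums = [sum(row) for row in table]
    col_sums = [sum(col) for col in zip(*table)]
    total = sum(row_sums)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_sums[i] * col_sums[j] / total
            stat += (observed - expected) ** 2 / expected
    return stat


def laplace_noise(scale):
    """Sample from Laplace(0, scale) via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))


def dp_chi_square(table, epsilon, sensitivity):
    """Release the statistic under epsilon-differential privacy by adding
    Laplace noise with scale sensitivity / epsilon (Laplace mechanism)."""
    return chi_square(table) + laplace_noise(sensitivity / epsilon)


if __name__ == "__main__":
    genotype_counts = [[120, 240, 140],   # cases:    AA, Aa, aa
                       [150, 230, 120]]   # controls: AA, Aa, aa
    print(dp_chi_square(genotype_counts, epsilon=1.0, sensitivity=4.0))
```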

  4. (IN-)PRIVACY IN MOBILE APPS. CUSTOMER OPPORTUNITIES

    Directory of Open Access Journals (Sweden)

    Yu.S. Chemerkina

    2016-01-01

    implement the missing privacy and security protection control and provide the privacy requirements (keeping the users informed about the possibility to avoid untrusted usage cases). Practical Relevance. Practical relevance of the received results is the following: first, the set of knowledge facts about each examined application to privacy score per application, per application category (IM, travel, etc.), per OS, etc.; second, the developed method under the forensics approach can be used to carry out analysis of the application data privacy in relation to the specified requirements including audit, reconfiguring EMM application policies and reasons for their commissioning.

  5. Defending Privacy: the Development and Deployment of a Darknet

    OpenAIRE

    McManamon, Conor; Mtenzi, Fredrick

    2010-01-01

    New measures imposed by governments, Internet service providers and other third parties which threaten the state of privacy are also opening new avenues to protecting it. The unwarranted scrutiny of legitimate services such as file hosters and the BitTorrent protocol, once relatively unknown to the casual Internet user, is becoming more obvious. The darknet is a rising contender against these new measures and will preserve the default right to privacy of Internet users. A darknet is defined i...

  6. Privacy Protection: Mandating New Arrangements to Implement and Assess Federal Privacy Policy and Practice

    National Research Council Canada - National Science Library

    Relyea, Harold C

    2004-01-01

    When Congress enacted the Privacy Act of 1974, it established a temporary national study commission to conduct a comprehensive assessment of privacy policy and practice in both the public and private...

  7. Better Together: Co-Location of Dental and Primary Care Provides Opportunities to Improve Oral Health.

    Science.gov (United States)

    Pourat, Nadereh; Martinez, Ana E; Crall, James J

    2015-09-01

    Community Health Centers (CHCs) are one of the principal safety-net providers of health care for low-income and uninsured populations. Co-locating dental services in primary care settings provides an opportunity to improve access to dental care. Yet this study of California CHCs that provide primary care services shows that only about one-third of them co-located primary and dental care services on-site. An additional one-third were members of multisite organizations in which at least one other site provided dental care. The remaining one-third of CHC sites had no dental care capacity. Policy options to promote co-location include requiring on-site availability of dental services, providing infrastructure funding to build and equip dental facilities, and offering financial incentives to provide dental care and recruit dental providers.

  8. 24 CFR 3280.107 - Interior privacy.

    Science.gov (United States)

    2010-04-01

    24 Housing and Urban Development, § 3280.107 (2010-04-01): Interior privacy. Regulations Relating to Housing and Urban Development (Continued)… Bathroom and toilet compartment doors shall be equipped with a privacy lock.

  9. 45 CFR 503.1 - Definitions-Privacy Act.

    Science.gov (United States)

    2010-10-01

    45 Public Welfare, § 503.1 (2010-10-01): Definitions—Privacy Act. …of the United States, Department of Justice, Rules of Practice, Privacy Act and Government in the Sunshine Regulations, Privacy Act Regulations. For the purpose of this part: Agency…

  10. 48 CFR 52.224-2 - Privacy Act.

    Science.gov (United States)

    2010-10-01

    48 Federal Acquisition Regulations System, § 52.224-2 (2010-10-01): Privacy Act. …and Forms, Solicitation Provisions and Contract Clauses, Text of Provisions and Clauses. …agency function: Privacy Act (APR 1984) (a) The Contractor agrees to— (1) Comply with the Privacy Act of…

  11. Virtue, Privacy and Self-Determination

    DEFF Research Database (Denmark)

    Stamatellos, Giannis

    2011-01-01

    The ethical problem of privacy lies at the core of computer ethics and cyber ethics discussions. The extensive use of personal data in digital networks poses a serious threat to the user’s right of privacy not only at the level of a user’s data integrity and security but also at the level of a user......’s identity and freedom. In normative ethical theory the need for an informational self-deterministic approach of privacy is stressed with greater emphasis on the control over personal data. However, scant attention has been paid on a virtue ethics approach of information privacy. Plotinus’ discussion of self......-determination is related to ethical virtue, human freedom and intellectual autonomy. The Plotinian virtue ethics approach of self-determination is not primarily related to the sphere of moral action, but to the quality of the self prior to moral practice. In this paper, it is argued that the problem of information privacy...

  12. Privacy and security in teleradiology.

    Science.gov (United States)

    Ruotsalainen, Pekka

    2010-01-01

    Teleradiology is probably the most successful eHealth service available today. Its business model is based on the remote transmission of radiological images (e.g. X-ray and CT-images) over electronic networks, and on the interpretation of the transmitted images for diagnostic purpose. Two basic service models are commonly used in teleradiology today. The most common approach is based on the message paradigm (off-line model), but more developed teleradiology systems are based on the interactive use of PACS/RIS systems. Modern teleradiology is also increasingly a cross-organisational or even cross-border service between service providers having different jurisdictions and security policies. This paper defines the requirements needed to make different teleradiology models trusted. Those requirements include a common security policy that covers all partners and entities, common security and privacy protection principles and requirements, controlled contracts between partners, and the use of security controls and tools that support the common security policy. The security and privacy protection of any teleradiology system must be planned in advance, and the necessary security and privacy enhancing tools should be selected (e.g. strong authentication, data encryption, non-repudiation services and audit-logs) based on the risk analysis and requirements set by the legislation. In any case, the teleradiology system should fulfil ethical and regulatory requirements. Certification of the whole teleradiology service system including security and privacy is also proposed. In the future, teleradiology services will be an integrated part of pervasive eHealth. Security requirements for this environment including dynamic and context aware security services are also discussed in this paper. Copyright (c) 2009 Elsevier Ireland Ltd. All rights reserved.

  13. Older and Wiser? Facebook Use, Privacy Concern, and Privacy Protection in the Life Stages of Emerging, Young, and Middle Adulthood

    Directory of Open Access Journals (Sweden)

    Evert Van den Broeck

    2015-11-01

    Full Text Available A large part of research conducted on privacy concern and protection on social networking sites (SNSs) concentrates on children and adolescents. Individuals in these developmental stages are often described as vulnerable Internet users. But how vulnerable are adults in terms of online informational privacy? This study applied a privacy boundary management approach and investigated Facebook use, privacy concern, and the application of privacy settings on Facebook by linking the results to Erikson’s three stages of adulthood: emerging, young, and middle adulthood. An online survey was distributed among 18- to 65-year-old Dutch-speaking adults (N = 508, 51.8% females). Analyses revealed clear differences between the three adult age groups in terms of privacy concern, Facebook use, and privacy protection. Results indicated that respondents in young adulthood and middle adulthood were more vulnerable in terms of privacy protection than emerging adults. Clear discrepancies were found between privacy concern and protection for these age groups. More particularly, the middle adulthood group was more concerned about their privacy in comparison to the emerging adulthood and young adulthood group. Yet, they reported to use privacy settings less frequently than the younger age groups. Emerging adults were found to be pragmatic and privacy conscious SNS users. Young adults occupied the intermediate position, suggesting a developmental shift. The impact of generational differences is discussed, as well as implications for education and governmental action.

  14. Enforcement of Privacy Policies over Multiple Online Social Networks for Collaborative Activities

    Science.gov (United States)

    Wu, Zhengping; Wang, Lifeng

    Our goal is to develop an enforcement architecture for privacy policies over multiple online social networks. It is used to solve the problem of privacy protection when several social networks build permanent or temporary collaborations. Theoretically, this idea is practical, especially since more and more social networks tend to support the open source framework “OpenSocial”. But, as we know, different social network websites may have the same privacy policy settings based on different enforcement mechanisms, and this would cause problems. In this case, we have to manually write code for both sides to make the privacy policy settings enforceable. We can imagine that this is a huge workload given the huge number of current social networks. So we focus on proposing a middleware which is used to automatically generate a privacy protection component for permanent integration or temporary interaction of social networks. This middleware provides functions such as collecting the privacy policy of each participant in the new collaboration, generating a standard policy model for each participant, and mapping all those standard policies to the different enforcement mechanisms of those participants.
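
    The mapping step described in that abstract (collect each network's settings, normalise them into one standard policy model, then translate that model back onto each site's own enforcement vocabulary) can be pictured as a small translation layer. Everything below is hypothetical and only illustrates the idea: the attribute names, the per-site vocabularies, and the enforce_on helper are not taken from the paper.

```python
# Standard policy model: one audience per data category, in a shared vocabulary.
STANDARD_POLICY = {"photos": "friends", "contact_info": "only_me", "posts": "public"}

# Per-site vocabularies: how each (hypothetical) network names the same audiences.
SITE_VOCABULARY = {
    "site_a": {"friends": "FRIENDS_ONLY", "only_me": "PRIVATE", "public": "EVERYONE"},
    "site_b": {"friends": "circle", "only_me": "self", "public": "anyone"},
}


def enforce_on(site: str, policy: dict) -> dict:
    """Translate the standard policy model into the site's own settings."""
    vocab = SITE_VOCABULARY[site]
    return {category: vocab[audience] for category, audience in policy.items()}


if __name__ == "__main__":
    for site in SITE_VOCABULARY:
        print(site, enforce_on(site, STANDARD_POLICY))
```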

  15. Current Trends and Challenges in Location-Based Services

    Directory of Open Access Journals (Sweden)

    Haosheng Huang

    2018-05-01

    Full Text Available Location-based services (LBS) are a growing area of research. This editorial paper introduces the key research areas within the scientific field of LBS, which consist of positioning, modelling, communication, applications, evaluation, analysis of LBS data, and privacy and ethical issues. After that, 18 original papers are presented, which provide a general picture of recent research activities on LBS, especially related to the research areas of positioning, modelling, applications, and LBS data analysis. This Special Issue, together with other recent events and publications concerning LBS, shows that the scientific field of LBS is rapidly evolving, and that LBS applications have become smarter and more ubiquitous in many aspects of our daily life.

  16. 49 CFR 10.13 - Privacy Officer.

    Science.gov (United States)

    2010-10-01

    49 Transportation, § 10.13 (2010-10-01): Privacy Officer. …Individuals, General. (a) To assist with implementation, evaluation, and administration issues, the Chief Information Officer appoints a principal coordinating official with the title Privacy…

  17. Web Security, Privacy & Commerce

    CERN Document Server

    Garfinkel, Simson

    2011-01-01

    Since the first edition of this classic reference was published, World Wide Web use has exploded and e-commerce has become a daily part of business and personal life. As Web use has grown, so have the threats to our security and privacy--from credit card fraud to routine invasions of privacy by marketers to web site defacements to attacks that shut down popular web sites. Web Security, Privacy & Commerce goes behind the headlines, examines the major security risks facing us today, and explains how we can minimize them. It describes risks for Windows and Unix, Microsoft Internet Exp

  18. An innovative privacy preserving technique for incremental datasets on cloud computing.

    Science.gov (United States)

    Aldeen, Yousra Abdul Alsahib S; Salleh, Mazleena; Aljeroudi, Yazan

    2016-08-01

    Cloud computing (CC) is a magnificent service-based delivery with gigantic computer processing power and data storage across connected communications channels. It imparted overwhelming technological impetus in the internet (web) mediated IT industry, where users can easily share private data for further analysis and mining. Furthermore, user-affable CC services make it possible to deploy sundry applications economically. Meanwhile, simple data sharing impelled various phishing attacks and malware assisted security threats. Some privacy sensitive applications like health services on cloud that are built with several economic and operational benefits necessitate enhanced security. Thus, absolute cyberspace security and mitigation against phishing blitz became mandatory to protect overall data privacy. Typically, diverse application datasets are anonymized with better privacy for owners without providing all secrecy requirements to the newly added records. Some proposed techniques emphasized this issue by re-anonymizing the datasets from scratch. The utmost privacy protection over incremental datasets on CC is far from being achieved. Certainly, the distribution of huge dataset volumes across multiple storage nodes limits privacy preservation. In this view, we propose a new anonymization technique to attain better privacy protection with high data utility over distributed and incremental datasets on CC. The proficiency of data privacy preservation and improved confidentiality requirements is demonstrated through performance evaluation. Copyright © 2016 Elsevier Inc. All rights reserved.
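
    One way to picture the incremental-anonymization problem is a k-anonymity check that is re-run whenever new records arrive: quasi-identifiers are generalized (age to a band, postcode to a prefix) and a batch is released only if every equivalence class still contains at least k records. This is a generic sketch of that idea under simple assumptions, not the anonymization technique proposed in the paper.

```python
from collections import Counter


def generalize(record):
    """Coarsen the quasi-identifiers: age -> 10-year band, postcode -> prefix."""
    band_start = record["age"] // 10 * 10
    return (f"{band_start}-{band_start + 9}", record["postcode"][:3])


def is_k_anonymous(records, k):
    classes = Counter(generalize(r) for r in records)
    return all(count >= k for count in classes.values())


def release_increment(published, new_batch, k=3):
    """Only publish the new batch if the combined dataset stays k-anonymous."""
    combined = published + new_batch
    if is_k_anonymous(combined, k):
        return combined
    raise ValueError("increment would break k-anonymity; generalize further")


if __name__ == "__main__":
    old = [{"age": 34, "postcode": "90210"}, {"age": 37, "postcode": "90211"},
           {"age": 31, "postcode": "90212"}]
    new = [{"age": 36, "postcode": "90213"}, {"age": 33, "postcode": "90214"},
           {"age": 38, "postcode": "90215"}]
    print(len(release_increment(old, new, k=3)), "records released")
```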

  19. Will you accept the government's friend request? Social networks and privacy concerns.

    Science.gov (United States)

    Siegel, David A

    2013-01-01

    Participating in social network websites entails voluntarily sharing private information, and the explosive growth of social network websites over the last decade suggests shifting views on privacy. Concurrently, new anti-terrorism laws, such as the USA Patriot Act, ask citizens to surrender substantial claim to privacy in the name of greater security. I address two important questions regarding individuals' views on privacy raised by these trends. First, how does prompting individuals to consider security concerns affect their views on government actions that jeopardize privacy? Second, does the use of social network websites alter the effect of prompted security concerns? I posit that prompting individuals to consider security concerns does lead to an increased willingness to accept government actions that jeopardize privacy, but that frequent users of websites like Facebook are less likely to be swayed by prompted security concerns. An embedded survey experiment provides support for both parts of my claim.

  20. 77 FR 33761 - Privacy Act of 1974; Notification to Update an Existing Privacy Act System of Records, “Grievance...

    Science.gov (United States)

    2012-06-07

    …of a data breach. (See also, on HUD's privacy Web site, Appendix I for other ways that the Privacy Act…) Department of Housing and Urban Development [Docket No. FR-5613-N-04], Privacy Act of 1974; Notification to Update an Existing Privacy Act System of Records, “Grievance Records”. AGENCY: Office of the…

  1. Patient Privacy in the Era of Big Data.

    Science.gov (United States)

    Kayaalp, Mehmet

    2018-01-20

    and responsibilities such as requesting and granting only the amount of health information that is necessary for the scientific study. On the other hand, developers of de-identification systems provide guidelines to use different modes of operations to maximize the effectiveness of their tools and the success of de-identification. Institutions with clinical repositories need to follow these rules and guidelines closely to successfully protect patient privacy. To open the gates of big data to scientific communities, healthcare institutions need to be supported in their de-identification and data sharing efforts by the public, scientific communities, and local, state, and federal legislators and government agencies.

  2. Online Privacy as a Corporate Social Responsibility

    DEFF Research Database (Denmark)

    Pollach, Irene

    2011-01-01

    Information technology and the Internet have added a new stakeholder concern to the corporate social responsibility agenda: online privacy. While theory suggests that online privacy is a corporate social responsibility, only very few studies in the business ethics literature have connected...... of the companies have comprehensive privacy programs, although more than half of them voice moral or relational motives for addressing online privacy. The privacy measures they have taken are primarily compliance measures, while measures that stimulate a stakeholder dialogue are rare. Overall, a wide variety...

  3. Trust Me, I'm a Doctor: Examining Changes in How Privacy Concerns Affect Patient Withholding Behavior.

    Science.gov (United States)

    Walker, Daniel M; Johnson, Tyler; Ford, Eric W; Huerta, Timothy R

    2017-01-04

    As electronic health records (EHRs) become ubiquitous in the health care industry, privacy breaches are increasing and being made public. These breaches may make consumers wary of the technology, undermining its potential to improve care coordination and research. Given the developing concerns around privacy of personal health information stored in digital format, it is important for providers to understand how views on privacy and security may be associated with patient disclosure of health information. This study aimed to understand how privacy concerns may be shifting patient behavior. Using a pooled cross-section of data from the 2011 and 2014 cycles of the Health Information and National Trends Survey (HINTS), we tested whether privacy and security concerns, as well as quality perceptions, are associated with the likelihood of withholding personal health information from a provider. A fully interacted multivariate model was used to compare associations between the 2 years, and interaction terms were used to evaluate trends in the factors that are associated with withholding behavior. No difference was found regarding the effect of privacy and security concerns on withholding behavior between 2011 and 2014. Similarly, whereas perceived high quality of care was found to reduce the likelihood of withholding information from a provider in both 2011 (odds ratio [OR] 0.73, 95% confidence interval [CI] 0.56-0.94) and 2014 (OR 0.61, 95% CI 0.48-0.76), no difference was observed between years. These findings suggest that consumers' beliefs about EHR privacy and security, the relationship between technology use and quality, and intentions to share information with their health care provider have not changed. These findings are counter to the ongoing discussions about the implications of security failures in other domains. Our results suggest that providers could ameliorate privacy and security by focusing on the care quality benefits EHRs provide. ©Daniel M Walker

  4. Privacy protection for personal health information and shared care records.

    Science.gov (United States)

    Neame, Roderick L B

    2014-01-01

    The protection of personal information privacy has become one of the most pressing security concerns for record keepers: this will become more onerous with the introduction of the European General Data Protection Regulation (GDPR) in mid-2014. Many institutions, both large and small, have yet to implement the essential infrastructure for data privacy protection and patient consent and control when accessing and sharing data; even more have failed to instil a privacy and security awareness mindset and culture amongst their staff. Increased regulation, together with better compliance monitoring, has led to the imposition of increasingly significant monetary penalties for failure to protect privacy: these too are set to become more onerous under the GDPR, increasing to a maximum of 2% of annual turnover. There is growing pressure in clinical environments to deliver shared patient care and to support this with integrated information. This demands that more information passes between institutions and care providers without breaching patient privacy or autonomy. This can be achieved with relatively minor enhancements of existing infrastructures and does not require extensive investment in inter-operating electronic records: indeed such investments to date have been shown not to materially improve data sharing. REQUIREMENTS FOR PRIVACY: There is an ethical duty as well as a legal obligation on the part of care providers (and record keepers) to keep patient information confidential and to share it only with the authorisation of the patient. To achieve this, information storage and retrieval and communication systems must be appropriately configured. There are many components of this, which are discussed in this paper. Patients may consult clinicians anywhere and at any time: therefore, their data must be available for recipient-driven retrieval (i.e. like the World Wide Web) under patient control and kept private: a method for delivering this is outlined.

  5. Governing the internet in the privacy arena

    Directory of Open Access Journals (Sweden)

    Carsten Ochs

    2016-09-01

    Full Text Available The surveillance disclosures triggered by Snowden have fueled the public re-negotiation of privacy. To follow resulting controversies we present a methodology that links social worlds theory to approaches asking for the democratic governance character of issue-centred arenas. After having outlined this approach it is put to the test. We analyse and compare two cases: the Schengen/National Routing, and the Parliamentary Committee investigating the NSA surveillance disclosures. The analysis reveals two oscillating governance modes at work in the privacy arena; their interplay results in an obstruction. Based on this observation we finally provide a diagnosis of possible future arena trajectories.

  6. A new privacy preserving technique for cloud service user endorsement using multi-agents

    Directory of Open Access Journals (Sweden)

    D. Chandramohan

    2016-01-01

    Full Text Available In data analysis the present focus on storage services are leveraged to attain its crucial part while user data get compromised. In recent years, service user’s valuable information has been utilized by unauthorized users and service providers. This paper examines the privacy awareness and importance of user’s secrecy preserving in the current cloud computing era. Gradually the information kept under the cloud environment gets increased due to its elasticity and availability. However, highly sensitive information is under serious attack from various sources. Once private information gets misused, the probability of privacy breaching increases which thereby reduces user’s trust on cloud providers. In the modern internet world, information management and maintenance is one among the most decisive tasks. Information stored in the cloud by the finance, healthcare, government sectors, etc. makes it all the more challenging since such tasks are to be handled globally. The present scenario therefore demands a new Petri-net Privacy Preserving Framework (PPPF) for safeguarding user’s privacy and providing consistent and breach-less services from the cloud. This paper illustrates the design of PPPF and mitigates the cloud provider’s trust among users. The proposed technique conveys and collaborates with Privacy Preserving Cohesion Technique (PPCT) to develop, validate, promote, adapt and also increase the need for data privacy. Moreover, this paper focuses on clinching and verification of unknown user intervention into the confidential data present in storage area and ensuring the performance of the cloud services. It also acts as an information preserving guard for high secrecy data storage areas.

  7. Privacy Metrics and Boundaries

    NARCIS (Netherlands)

    L-F. Pau (Louis-François)

    2005-01-01

    This paper aims at defining a set of privacy metrics (quantitative and qualitative) in the case of the relation between a privacy protector and an information gatherer. The aims with such metrics are: - to allow to assess and compare different user scenarios and their differences; for

  8. Privacy notice for dummies? Towards European guidelines on how to give "clear and comprehensive information" on the cookies' use in order to protect the internet users' right to online privacy

    NARCIS (Netherlands)

    Luzak, J.A.

    2014-01-01

    The reviewed ePrivacy Directive aims at ensuring internet users’ online privacy by requiring users to give informed consent to the gathering, storing, and processing of their data by internet service providers, e.g., through the cookies’ use. However, it is hardly possible to talk about an

  9. A model-driven privacy compliance decision support for medical data sharing in Europe.

    Science.gov (United States)

    Boussi Rahmouni, H; Solomonides, T; Casassa Mont, M; Shiu, S; Rahmouni, M

    2011-01-01

    Clinical practitioners and medical researchers often have to share health data with other colleagues across Europe. Privacy compliance in this context is very important but challenging. Automated privacy guidelines are a practical way of increasing users' awareness of privacy obligations and help eliminate unintentional breaches of privacy. In this paper we present an ontology-plus-rules based approach to privacy decision support for the sharing of patient data across European platforms. We use ontologies to model the required domain and context information about data sharing and privacy requirements. In addition, we use a set of Semantic Web Rule Language rules to reason about legal privacy requirements that are applicable to a specific context of data disclosure. We make the complete set invocable through the use of a semantic web application acting as an interactive privacy guideline system, which can then invoke the full model in order to provide decision support. When asked, the system will generate privacy reports applicable to a specific case of data disclosure described by the user. Also, reports showing guidelines per Member State may be obtained. The advantage of this approach lies in the expressiveness and extensibility of the modelling and inference languages adopted and the ability they confer to reason with complex requirements interpreted from high level regulations. However, the system cannot at this stage fully simulate the role of an ethics committee or review board.

  10. New Collaborative Filtering Algorithms Based on SVD++ and Differential Privacy

    Directory of Open Access Journals (Sweden)

    Zhengzheng Xian

    2017-01-01

    Full Text Available Collaborative filtering technology has been widely used in the recommender system, and its implementation is supported by the large amount of real and reliable user data from the big-data era. However, with the increase of the users’ information-security awareness, these data are reduced or the quality of the data becomes worse. Singular Value Decomposition (SVD) is one of the common matrix factorization methods used in collaborative filtering, which introduces the bias information of users and items and is realized by using algebraic feature extraction. The derivative model SVD++ of SVD achieves better predictive accuracy due to the addition of implicit feedback information. Differential privacy is defined very strictly and can be proved, which has become an effective measure to solve the problem of attackers indirectly deducing the personal privacy information by using background knowledge. In this paper, differential privacy is applied to the SVD++ model through three approaches: gradient perturbation, objective-function perturbation, and output perturbation. Through theoretical derivation and experimental verification, the new algorithms proposed can better protect the privacy of the original data on the basis of ensuring the predictive accuracy. In addition, an effective scheme is given that can measure the privacy protection strength and predictive accuracy, and a reasonable range for selection of the differential privacy parameter is provided.
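
    The gradient-perturbation approach named in that abstract can be sketched with a plain biased matrix-factorization model trained by SGD, where each per-rating error term is clipped and perturbed with Laplace noise before the update. This sketch omits the implicit-feedback term that distinguishes SVD++ and does not reproduce the paper's privacy accounting; the clipping bound and noise scale are illustrative parameters only.

```python
import numpy as np

rng = np.random.default_rng(0)


def dp_matrix_factorization(ratings, n_users, n_items, k=8, epochs=20,
                            lr=0.01, reg=0.05, clip=1.0, noise_scale=0.1):
    """Biased matrix factorization trained with SGD; each residual is clipped
    to `clip` and perturbed with Laplace noise (gradient-perturbation style),
    so any single rating has bounded influence on the learned factors."""
    mu = np.mean([r for _, _, r in ratings])
    bu, bi = np.zeros(n_users), np.zeros(n_items)
    P = 0.1 * rng.standard_normal((n_users, k))
    Q = 0.1 * rng.standard_normal((n_items, k))

    for _ in range(epochs):
        for u, i, r in ratings:
            err = r - (mu + bu[u] + bi[i] + P[u] @ Q[i])
            err = np.clip(err, -clip, clip)            # bound the gradient
            err += rng.laplace(0.0, noise_scale)       # perturb the gradient
            bu[u] += lr * (err - reg * bu[u])
            bi[i] += lr * (err - reg * bi[i])
            P[u], Q[i] = (P[u] + lr * (err * Q[i] - reg * P[u]),
                          Q[i] + lr * (err * P[u] - reg * Q[i]))
    return mu, bu, bi, P, Q


if __name__ == "__main__":
    data = [(0, 0, 5), (0, 1, 3), (1, 0, 4), (1, 2, 1), (2, 1, 2), (2, 2, 5)]
    mu, bu, bi, P, Q = dp_matrix_factorization(data, n_users=3, n_items=3)
    print("predicted r(0,2):", mu + bu[0] + bi[2] + P[0] @ Q[2])
```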

  11. To have or not to have: the true privacy question

    Directory of Open Access Journals (Sweden)

    Paula Kift

    2013-12-01

    Full Text Available In light of the recent US National Security Agency (NSA) surveillance scandals, the article reflects on the continued importance of privacy in the information age. Based on a taxonomy of privacy violations provided by Daniel Solove, it takes the reader on an imaginary journey to a world in which privacy has ceased to exist. What does it mean “to have or not to have privacy” in the information age? This essay, part academic, part call for action, explores this question by means of an analogy, focusing on the relationship between citizens and the state. It demonstrates that the invisible presence of the NSA should be a matter of great concern to us. There is no justification for blanket surveillance. The right to security is an illusion. Instead of fighting windmills, we should fight for our right to privacy. We need to have privacy; we need it to live and love, to make mistakes, and to grow. We need it as individuals and as a society. And we can have it if we press our legislators to return it to us. It is time to start fighting back.

  12. The Privacy Calculus: Mobile Apps and User Perceptions of Privacy and Security

    Directory of Open Access Journals (Sweden)

    Elizabeth Fife

    2012-07-01

    Full Text Available A continuing stream of new mobile data services are being released that rely upon the collection of personal data to support a business model. New technologies including facial recognition, sensors and Near Field Communications (NFC) will increasingly become a part of everyday services and applications that challenge traditional concepts of individual privacy. The average person as well as the “tech-savvy” mobile phone user may not yet be fully aware of the extent to which their privacy and security are being affected through their mobile activities and how comparable this situation is to personal computer usage. We investigate perceptions and usage of mobile data services that appear to have specific privacy and security sensitivities, specifically social networking, banking/payments and health-related activities. Our annual survey of smartphone users in the U.S. and Japan is presented from 2011. This nationally representative survey data is used to show demographic and cultural differences, and substantiate our hypotheses about the links between use and privacy concerns.

  13. Privacy Enforcement in a Cost-Effective Smart Grid

    DEFF Research Database (Denmark)

    Mikkelsen, Søren Aagaard

    In this technical report we present the current state of the research conducted during the first part of the PhD period. The PhD thesis “Privacy Enforcement in a Cost-Effective Smart Grid” focuses on ensuring privacy when generating market for energy service providers that develop web services...... for the residential domain in the envisaged smart grid. The PhD project is funded and associated to the EU project “Energy Demand Aware Open Services for Smart Grid Intelligent Automation” (Smart HG) and therefore introduces the project on a system-level. Based on this, we present some of the integration, security...... and privacy challenges that emerge when designing a system architecture and infrastructure. The resulting architecture is a consumer-centric and agent-based design and uses open Internet-based communication protocols for enabling interoperability while being cost-effective. Finally, the PhD report present...

  14. Privacy Protection Research of Mobile RFID

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Radio Frequency Identification (RFID) is one of the most controversial technologies at present. Because it is very difficult to detect who reads a tag incorporated into a product owned by a person, RFID systems raise significant privacy concerns. User privacy is a primary consideration for mobile RFID services, because most such services are end-user oriented. We propose a solution for user privacy protection, based on a modification of the EPC Class 1 Generation 2 protocol, and introduce a privacy protection scenario for mobile RFID services that uses this method.

  15. Analysis of Privacy on Social Networks

    OpenAIRE

    Tomandl, Luboš

    2015-01-01

    This thesis deals with the question of privacy in the context of social networks. The main substance of these services is the users' option to share information about their lives. This alone can be a problem for privacy. The first part of this thesis concentrates on the meaning of privacy as well as its value for both individuals and society. In the next part, the privacy threats on social networks, namely Facebook, are discussed. These threats are disclosed on four levels according to f...

  16. Genetic secrets: Protecting privacy and confidentiality in the genetic era

    Energy Technology Data Exchange (ETDEWEB)

    Rothstein, M.A. [ed.

    1998-07-01

    Few developments are likely to affect human beings more profoundly in the long run than the discoveries resulting from advances in modern genetics. Although the developments in genetic technology promise to provide many additional benefits, their application to genetic screening poses ethical, social, and legal questions, many of which are rooted in issues of privacy and confidentiality. The ethical, practical, and legal ramifications of these and related questions are explored in depth. The broad range of topics includes: the privacy and confidentiality of genetic information; the challenges to privacy and confidentiality that may be projected to result from the emerging genetic technologies; the role of informed consent in protecting the confidentiality of genetic information in the clinical setting; the potential uses of genetic information by third parties; the implications of changes in the health care delivery system for privacy and confidentiality; relevant national and international developments in public policies, professional standards, and laws; recommendations; and the identification of research needs.

  17. Privacy-Preserving Meter Report Protocol of Isolated Smart Grid Devices

    Directory of Open Access Journals (Sweden)

    Zhiwei Wang

    2017-01-01

    Full Text Available Smart grid aims to improve the reliability, efficiency, and security of the traditional grid, which allows two-way transmission and efficiency-driven response. However, a main concern of this new technique is that the fine-grained metering data may leak the personal privacy information of the customers. Thus, the data aggregation mechanism for privacy protection is required for the meter report protocol in smart grid. In this paper, we propose an efficient privacy-preserving meter report protocol for the isolated smart grid devices. Our protocol consists of an encryption scheme with additively homomorphic property and a linearly homomorphic signature scheme, where the linearly homomorphic signature scheme is suitable for privacy-preserving data aggregation. We also provide security analysis of our protocol in the context of some typical attacks in smart grid. The implementation of our protocol on the Intel Edison platform shows that our protocol is efficient enough for the physical constrained devices, like smart meters.
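
    The protocol's privacy comes from the additively homomorphic property of the encryption scheme: combining ciphertexts corresponds to adding the underlying readings. As a point of reference only, the toy Paillier-style sketch below (Python, insecure toy parameters, no linearly homomorphic signatures, and not necessarily the authors' actual construction) shows how an aggregator could sum encrypted meter readings without decrypting any individual one:

      # Toy Paillier-style additively homomorphic encryption (illustrative only:
      # tiny fixed primes, no signature scheme, not secure for real deployments).
      import random
      from math import gcd

      p, q = 293, 433              # toy primes; real deployments use ~2048-bit primes
      n = p * q
      n2 = n * n
      g = n + 1                    # standard simplification for Paillier
      lam = (p - 1) * (q - 1)

      def L(u):
          return (u - 1) // n

      mu = pow(L(pow(g, lam, n2)), -1, n)   # modular inverse (Python 3.8+)

      def encrypt(m):
          while True:
              r = random.randrange(1, n)
              if gcd(r, n) == 1:
                  break
          return (pow(g, m, n2) * pow(r, n, n2)) % n2

      def decrypt(c):
          return (L(pow(c, lam, n2)) * mu) % n

      def add(c1, c2):
          # Multiplying ciphertexts adds the underlying plaintexts.
          return (c1 * c2) % n2

      # Each meter encrypts its reading; the aggregator combines ciphertexts only.
      readings = [12, 7, 30]
      ciphertexts = [encrypt(m) for m in readings]
      aggregate = ciphertexts[0]
      for c in ciphertexts[1:]:
          aggregate = add(aggregate, c)

      assert decrypt(aggregate) == sum(readings)
      print("aggregate consumption:", decrypt(aggregate))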

  18. Privacy preservation and authentication on secure geographical routing in VANET

    Science.gov (United States)

    Punitha, A.; Manickam, J. Martin Leo

    2017-05-01

    Vehicular Ad hoc Networks (VANETs) play an important role in vehicle-to-vehicle communication as they offer a high level of safety and convenience to drivers. In order to increase the level of security and safety in VANETs, in this paper we propose a Privacy Preservation and Authentication on Secure Geographical Routing Protocol (PPASGR) for VANET. It provides security by detecting and preventing malicious nodes through two directional antennas, forward (f-antenna) and backward (b-antenna). Malicious nodes are detected through direction detection, consistency detection and conflict detection. The location of a trusted neighbour is identified using a TNT-based location verification scheme; after the Vehicle Tamper Proof Device (VTPD) is deployed, the Trusted Authority (TA) produces the anonymous credentials. Finally, the VTPD generates a pseudo-identity using the TA, through which the real identity of the sender can be retrieved. Through this approach, authentication, integrity and confidentiality for routing packets can be achieved. The simulation results show that the proposed approach reduces packet drops due to attacks and improves the packet delivery ratio.

  19. Privacy by design in personal health monitoring.

    Science.gov (United States)

    Nordgren, Anders

    2015-06-01

    The concept of privacy by design is becoming increasingly popular among regulators of information and communications technologies. This paper aims at analysing and discussing the ethical implications of this concept for personal health monitoring. I assume a privacy theory of restricted access and limited control. On the basis of this theory, I suggest a version of the concept of privacy by design that constitutes a middle road between what I call broad privacy by design and narrow privacy by design. The key feature of this approach is that it attempts to balance automated privacy protection and autonomously chosen privacy protection in a way that is context-sensitive. In personal health monitoring, this approach implies that in some contexts like medication assistance and monitoring of specific health parameters one single automatic option is legitimate, while in some other contexts, for example monitoring in which relatives are receivers of health-relevant information rather than health care professionals, a multi-choice approach stressing autonomy is warranted.

  20. Pre-Capture Privacy for Small Vision Sensors.

    Science.gov (United States)

    Pittaluga, Francesco; Koppal, Sanjeev Jagannatha

    2017-11-01

    The next wave of micro and nano devices will create a world with trillions of small networked cameras. This will lead to increased concerns about privacy and security. Most privacy preserving algorithms for computer vision are applied after image/video data has been captured. We propose to use privacy preserving optics that filter or block sensitive information directly from the incident light-field before sensor measurements are made, adding a new layer of privacy. In addition to balancing the privacy and utility of the captured data, we address trade-offs unique to miniature vision sensors, such as achieving high-quality field-of-view and resolution within the constraints of mass and volume. Our privacy preserving optics enable applications such as depth sensing, full-body motion tracking, people counting, blob detection and privacy preserving face recognition. While we demonstrate applications on macro-scale devices (smartphones, webcams, etc.) our theory has impact for smaller devices.

  1. Through Patients' Eyes: Regulation, Technology, Privacy, and the Future.

    Science.gov (United States)

    Petersen, Carolyn

    2018-04-22

    Privacy is commonly regarded as a regulatory requirement achieved via technical and organizational management practices. Those working in the field of informatics often play a role in privacy preservation as a result of their expertise in information technology, workflow analysis, implementation science, or related skills. Viewing privacy from the perspective of patients whose protected health information is at risk broadens the considerations to include the perceived duality of privacy; the existence of privacy within a context unique to each patient; the competing needs inherent within privacy management; the need for particular consideration when data are shared; and the need for patients to control health information in a global setting. With precision medicine, artificial intelligence, and other treatment innovations on the horizon, health care professionals need to think more broadly about how to preserve privacy in a health care environment driven by data sharing. Patient-reported privacy preferences, privacy portability, and greater transparency around privacy-preserving functionalities are potential strategies for ensuring that privacy regulations are met and privacy is preserved. Georg Thieme Verlag KG Stuttgart.

  2. Collaborative eHealth Meets Security: Privacy-Enhancing Patient Profile Management.

    Science.gov (United States)

    Sanchez-Guerrero, Rosa; Mendoza, Florina Almenarez; Diaz-Sanchez, Daniel; Cabarcos, Patricia Arias; Lopez, Andres Marin

    2017-11-01

    Collaborative healthcare environments offer potential benefits, including enhancing the healthcare quality delivered to patients and reducing costs. As a direct consequence, sharing of electronic health records (EHRs) among healthcare providers has experienced noteworthy growth in recent years, since it enables physicians to remotely monitor patients' health and enables individuals to manage their own health data more easily. However, these scenarios face significant challenges regarding security and privacy of the extremely sensitive information contained in EHRs. Thus, a flexible, efficient, and standards-based solution is indispensable to guarantee selective identity information disclosure and preserve patients' privacy. We propose a privacy-aware profile management approach that empowers the patient, enabling them to bring together claims from various healthcare providers as well as user-generated claims into a unique credential. User profiles are represented through an adaptive Merkle Tree, for which we formalize the underlying mathematical model. Furthermore, the performance of the proposed solution is empirically validated through simulation experiments.
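
    The abstract's credential is built on an adaptive Merkle tree. As background only, the sketch below (Python, hypothetical claim strings) builds an ordinary, non-adaptive Merkle tree over a few profile claims and shows how a single root commits to all of them while one claim is disclosed selectively with a membership proof; the paper's adaptive variant and its formal model are not reproduced here.

      # Minimal Merkle tree over hypothetical profile claims (illustrative only).
      import hashlib

      def h(data: bytes) -> bytes:
          return hashlib.sha256(data).digest()

      def build_levels(leaves):
          levels = [[h(leaf) for leaf in leaves]]
          while len(levels[-1]) > 1:
              cur = levels[-1]
              if len(cur) % 2:                       # duplicate last node on odd levels
                  cur = cur + [cur[-1]]
              levels.append([h(cur[i] + cur[i + 1]) for i in range(0, len(cur), 2)])
          return levels

      def proof(levels, index):
          path = []
          for level in levels[:-1]:
              if len(level) % 2:
                  level = level + [level[-1]]
              path.append((level[index ^ 1], index % 2))   # (sibling hash, node is right child)
              index //= 2
          return path

      def verify(leaf, path, root):
          node = h(leaf)
          for sibling, node_is_right in path:
              node = h(sibling + node) if node_is_right else h(node + sibling)
          return node == root

      claims = [b"provider:A|blood_type:O+", b"provider:B|allergy:penicillin",
                b"user:insurance_id=12345", b"provider:C|vaccination:flu-2017"]
      levels = build_levels(claims)
      root = levels[-1][0]                           # one credential committing to all claims

      # Disclose only claim 1 plus its proof; a verifier checks it against the root.
      print(verify(claims[1], proof(levels, 1), root))   # True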

  3. The privacy implications of Bluetooth

    OpenAIRE

    Kostakos, Vassilis

    2008-01-01

    A substantial amount of research, as well as media hype, has surrounded RFID technology and its privacy implications. Currently, researchers and the media focus on the privacy threats posed by RFID, while consumer groups choose to boycott products bearing RFID tags. At the same time, however, a very similar technology has quietly become part of our everyday lives: Bluetooth. In this paper we highlight the fact that Bluetooth is a widespread technology that has real privacy implications. Furthermor...

  4. BangA: An Efficient and Flexible Generalization-Based Algorithm for Privacy Preserving Data Publication

    Directory of Open Access Journals (Sweden)

    Adeel Anjum

    2017-01-01

    Full Text Available Privacy-Preserving Data Publishing (PPDP) has become a critical issue for companies and organizations that would release their data. k-Anonymization was proposed as a first generalization model to guarantee against identity disclosure of individual records in a data set. Point access methods (PAMs) are not well studied for the problem of data anonymization. In this article, we propose yet another approximation algorithm for anonymization, coined BangA, that combines useful features from Point Access Methods (PAMs) and clustering. Hence, it achieves fast computation and scalability as a PAM, and very high quality thanks to its density-based clustering step. Extensive experiments show the efficiency and effectiveness of our approach. Furthermore, we provide guidelines for extending BangA to achieve a relaxed form of differential privacy, which provides stronger privacy guarantees than traditional privacy definitions.
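
    BangA itself (point access methods plus density-based clustering) is not reproduced here; the sketch below only illustrates the k-anonymity property such generalization algorithms aim for, using hypothetical records and a naive generalization of the quasi-identifiers into age bands and truncated ZIP codes.

      # Minimal k-anonymity check with a naive generalization step (background only;
      # this is not the BangA algorithm).
      from collections import Counter

      records = [
          {"age": 23, "zip": "47677", "disease": "flu"},
          {"age": 27, "zip": "47602", "disease": "cold"},
          {"age": 29, "zip": "47678", "disease": "flu"},
          {"age": 45, "zip": "47905", "disease": "asthma"},
          {"age": 47, "zip": "47909", "disease": "flu"},
          {"age": 43, "zip": "47906", "disease": "cold"},
      ]

      def generalize(rec):
          """Coarsen the quasi-identifiers: 10-year age bands, 3-digit ZIP prefix."""
          lo = (rec["age"] // 10) * 10
          return (f"{lo}-{lo + 9}", rec["zip"][:3] + "**")

      def is_k_anonymous(rows, k):
          counts = Counter(generalize(r) for r in rows)
          return all(c >= k for c in counts.values())

      print(is_k_anonymous(records, 3))           # True: each generalized group has 3 rows
      for r in records:
          print(generalize(r), r["disease"])      # published view: generalized QIs + sensitive value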

  5. Big data privacy: The datafication of personal information

    DEFF Research Database (Denmark)

    Mai, Jens-Erik

    2016-01-01

    In the age of big data we need to think differently about privacy. We need to shift our thinking from definitions of privacy (characteristics of privacy) to models of privacy (how privacy works). Moreover, in addition to the existing models of privacy—the surveillance model and capture model—we need to also consider a new model: the datafication model presented in this article, wherein new personal information is deduced by employing predictive analytics on already-gathered data. These three models of privacy supplement each other; they are not competing understandings of privacy. This broadened approach will take our thinking beyond current preoccupation with whether or not individuals’ consent was secured for data collection to privacy issues arising from the development of new information on individuals' likely behavior through analysis of already collected data—this new information...

  6. Children's Privacy in the Big Data Era: Research Opportunities.

    Science.gov (United States)

    Montgomery, Kathryn C; Chester, Jeff; Milosevic, Tijana

    2017-11-01

    This article focuses on the privacy implications of advertising on social media, mobile apps, and games directed at children. Academic research on children's privacy has primarily focused on the safety risks involved in sharing personal information on the Internet, leaving market forces (such as commercial data collection) as a less discussed aspect of children's privacy. Yet, children's privacy in the digital era cannot be fully understood without examining marketing practices, especially in the context of "big data." As children increasingly consume content on an ever-expanding variety of digital devices, media and advertising industries are creating new ways to track their behaviors and target them with personalized content and marketing messages based on individual profiles. The advent of the so-called Internet of Things, with its ubiquitous sensors, is expanding these data collection and profiling practices. These trends raise serious concerns about digital dossiers that could follow young people into adulthood, affecting their access to education, employment, health care, and financial services. Although US privacy law provides some safeguards for children younger than 13 years old online, adolescents are afforded no such protections. Moreover, scholarship on children and privacy continues to lag behind the changes taking place in global media, advertising, and technology. This article proposes collaboration among researchers from a range of fields that will enable cross-disciplinary studies addressing not only the developmental issues related to different age groups but also the design of digital media platforms and the strategies used to influence young people. Copyright © 2017 by the American Academy of Pediatrics.

  7. Will you accept the government's friend request? Social networks and privacy concerns.

    Directory of Open Access Journals (Sweden)

    David A Siegel

    Full Text Available Participating in social network websites entails voluntarily sharing private information, and the explosive growth of social network websites over the last decade suggests shifting views on privacy. Concurrently, new anti-terrorism laws, such as the USA Patriot Act, ask citizens to surrender substantial claim to privacy in the name of greater security. I address two important questions regarding individuals' views on privacy raised by these trends. First, how does prompting individuals to consider security concerns affect their views on government actions that jeopardize privacy? Second, does the use of social network websites alter the effect of prompted security concerns? I posit that prompting individuals to consider security concerns does lead to an increased willingness to accept government actions that jeopardize privacy, but that frequent users of websites like Facebook are less likely to be swayed by prompted security concerns. An embedded survey experiment provides support for both parts of my claim.

  8. Social Media Users’ Legal Consciousness About Privacy

    Directory of Open Access Journals (Sweden)

    Katharine Sarikakis

    2017-02-01

    Full Text Available This article explores the ways in which the concept of privacy is understood in the context of social media and with regard to users’ awareness of privacy policies and laws in the ‘Post-Snowden’ era. In the light of presumably increased public exposure to privacy debates, generated partly due to the European “Right to be Forgotten” ruling and the Snowden revelations on mass surveillance, this article explores users’ meaning-making of privacy as a matter of legal dimension in terms of its violations and threats online and users’ ways of negotiating their Internet use, in particular social networking sites. Drawing on the concept of legal consciousness, this article explores through focus group interviews the ways in which social media users negotiate privacy violations and what role their understanding of privacy laws (or lack thereof might play in their strategies of negotiation. The findings are threefold: first, privacy is understood almost universally as a matter of controlling one’s own data, including information disclosure even to friends, and is strongly connected to issues about personal autonomy; second, a form of resignation with respect to control over personal data appears to coexist with a recognized need to protect one’s private data, while respondents describe conscious attempts to circumvent systems of monitoring or violation of privacy, and third, despite widespread coverage of privacy legal issues in the press, respondents’ concerns about and engagement in “self-protecting” tactics derive largely from being personally affected by violations of law and privacy.

  9. Hacktivism 1-2-3: how privacy enhancing technologies change the face of anonymous hacktivism

    NARCIS (Netherlands)

    Bodó, B.

    2014-01-01

    This short essay explores how the notion of hacktivism changes due to easily accessible, military grade Privacy Enhancing Technologies (PETs). Privacy Enhancing Technologies, technological tools which provide anonymous communications and protect users from online surveillance enable new forms of

  10. 31 CFR 0.216 - Privacy Act.

    Science.gov (United States)

    2010-07-01

    ... 31 Money and Finance: Treasury 1 2010-07-01 2010-07-01 false Privacy Act. 0.216 Section 0.216... RULES OF CONDUCT Rules of Conduct § 0.216 Privacy Act. Employees involved in the design, development, operation, or maintenance of any system of records or in maintaining records subject to the Privacy Act of...

  11. VOIP for Telerehabilitation: A Risk Analysis for Privacy, Security and HIPAA Compliance: Part II.

    Science.gov (United States)

    Watzlaf, Valerie J M; Moeini, Sohrab; Matusow, Laura; Firouzan, Patti

    2011-01-01

    In a previous publication the authors developed a privacy and security checklist to evaluate Voice over Internet Protocol (VoIP) videoconferencing software used between patients and therapists to provide telerehabilitation (TR) therapy. In this paper, the privacy and security checklist that was previously developed is used to perform a risk analysis of the top ten VoIP videoconferencing software to determine if their policies provide answers to the privacy and security checklist. Sixty percent of the companies claimed they do not listen into video-therapy calls unless maintenance is needed. Only 50% of the companies assessed use some form of encryption, and some did not specify what type of encryption was used. Seventy percent of the companies assessed did not specify any form of auditing on their servers. Statistically significant differences across company websites were found for sharing information outside of the country (p=0.010), encryption (p=0.006), and security evaluation (p=0.005). Healthcare providers considering use of VoIP software for TR services may consider using this privacy and security checklist before deciding to incorporate a VoIP software system for TR. Other videoconferencing software that is specific for TR with strong encryption, good access controls, and hardware that meets privacy and security standards should be considered for use with TR.

  12. A Distributed Ensemble Approach for Mining Healthcare Data under Privacy Constraints.

    Science.gov (United States)

    Li, Yan; Bai, Changxin; Reddy, Chandan K

    2016-02-10

    In recent years, electronic health records (EHRs) have been widely adopted at many healthcare facilities in an attempt to improve the quality of patient care and increase the productivity and efficiency of healthcare delivery. These EHRs can accurately diagnose diseases if utilized appropriately. While the EHRs can potentially resolve many of the existing problems associated with disease diagnosis, one of the main obstacles in effectively using them is the patient privacy and sensitivity of the medical information available in the EHR. Due to these concerns, even if the EHRs are available for storage and retrieval purposes, sharing of the patient records between different healthcare facilities has become a major concern and has hampered some of the effective advantages of using EHRs. Due to this lack of data sharing, most of the facilities aim at building clinical decision support systems using a limited amount of patient data from their own EHR systems to provide important diagnosis-related decisions. It becomes quite infeasible for a newly established healthcare facility to build a robust decision making system due to the lack of sufficient patient records. However, to make effective decisions from clinical data, it is indispensable to have large amounts of data to train the decision models. In this regard, there are conflicting objectives of preserving patient privacy and having sufficient data for modeling and decision making. To handle such disparate goals, we develop two adaptive distributed privacy-preserving algorithms based on a distributed ensemble strategy. The basic idea of our approach is to build an elegant model for each participating facility to accurately learn the data distribution, and then to transfer the useful healthcare knowledge acquired from these participants in the form of their decision models, without revealing or sharing the patient-level sensitive data, thus protecting patient privacy. We demonstrate that our
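
    The share-models-not-data idea can be sketched as follows (scikit-learn, synthetic stand-in data, plain majority voting); the paper's adaptive ensemble strategy and its privacy analysis are not reproduced here.

      # Each site trains locally; only the fitted models travel, never the records.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)

      def local_site_data(n):
          """Synthetic stand-in for one facility's private EHR-derived features."""
          X = rng.normal(size=(n, 5))
          y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n) > 0).astype(int)
          return X, y

      # Each participating facility fits a model on its own records.
      site_models = []
      for n in (200, 350, 150):
          X, y = local_site_data(n)
          site_models.append(LogisticRegression().fit(X, y))

      # A newly established facility combines the received models by majority vote.
      X_new, y_new = local_site_data(100)
      votes = np.stack([m.predict(X_new) for m in site_models])
      ensemble_pred = (votes.mean(axis=0) >= 0.5).astype(int)
      print("ensemble accuracy:", (ensemble_pred == y_new).mean())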

  13. 75 FR 28051 - Public Workshop: Pieces of Privacy

    Science.gov (United States)

    2010-05-19

    ... DEPARTMENT OF HOMELAND SECURITY Office of the Secretary Public Workshop: Pieces of Privacy AGENCY: Privacy Office, DHS. ACTION: Notice announcing public workshop. SUMMARY: The Department of Homeland Security Privacy Office will host a public workshop, ``Pieces of Privacy.'' DATES: The workshop will be...

  14. The interplay between decentralization and privacy: the case of blockchain technologies

    OpenAIRE

    De Filippi, Primavera

    2016-01-01

    International audience; Decentralized architectures are gaining popularity as a way to protect one's privacy against the ubiquitous surveillance of states and corporations. Yet, in spite of the obvious benefits they provide when it comes to data sovereignty, decentralized architectures also present certain characteristics that—if not properly accounted for—might ultimately impinge upon users' privacy. While they are capable of preserving the confidentiality of data, decentralized architecture...

  15. Scalable privacy-preserving data sharing methodology for genome-wide association studies: an application to iDASH healthcare privacy protection challenge.

    Science.gov (United States)

    Yu, Fei; Ji, Zhanglong

    2014-01-01

    In response to the growing interest in genome-wide association study (GWAS) data privacy, the Integrating Data for Analysis, Anonymization and SHaring (iDASH) center organized the iDASH Healthcare Privacy Protection Challenge, with the aim of investigating the effectiveness of applying privacy-preserving methodologies to human genetic data. This paper is based on a submission to the iDASH Healthcare Privacy Protection Challenge. We apply privacy-preserving methods that are adapted from Uhler et al. 2013 and Yu et al. 2014 to the challenge's data and analyze the data utility after the data are perturbed by the privacy-preserving methods. Major contributions of this paper include new interpretation of the χ2 statistic in a GWAS setting and new results about the Hamming distance score, a key component for one of the privacy-preserving methods.
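
    The perturbation methods adapted from Uhler et al. and Yu et al. are not reproduced here. As a generic illustration of output perturbation in this setting, the sketch below adds Laplace noise to a hypothetical case/control genotype table and recomputes the chi-square statistic; the sensitivity used is a coarse assumption (replacing one participant changes at most two cell counts by one each), whereas the cited works derive tighter, statistic-specific bounds.

      # Laplace mechanism on a 2 x 3 genotype count table (rows: cases/controls,
      # columns: genotypes AA, Aa, aa). Illustrative values and parameters only.
      import numpy as np
      from scipy.stats import chi2_contingency

      rng = np.random.default_rng(42)

      true_counts = np.array([[120.0, 60.0, 20.0],
                              [100.0, 75.0, 25.0]])

      epsilon = 1.0
      sensitivity = 2.0   # L1 change of the whole table when one participant is replaced

      noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon, size=true_counts.shape)
      noisy_counts = np.clip(true_counts + noise, a_min=1e-6, a_max=None)

      chi2_true, _, _, _ = chi2_contingency(true_counts)
      chi2_noisy, _, _, _ = chi2_contingency(noisy_counts)
      print(f"chi-square on true counts:  {chi2_true:.3f}")
      print(f"chi-square on noisy counts: {chi2_noisy:.3f}")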

  16. A hybrid technique for private location-based queries with database protection

    KAUST Repository

    Ghinita, Gabriel; Kalnis, Panos; Kantarcioǧlu, Murâ t; Bertino, Elisa

    2009-01-01

    on finding good trade-offs between privacy and performance of user protection techniques, but disregarded the important issue of protecting the POI dataset D. For instance, location cloaking requires large-sized CRs, leading to excessive disclosure of POIs (O

  17. Electronic health records: eliciting behavioral health providers' beliefs.

    Science.gov (United States)

    Shank, Nancy; Willborn, Elizabeth; Pytlikzillig, Lisa; Noel, Harmonijoie

    2012-04-01

    Interviews with 32 community behavioral health providers elicited perceived benefits and barriers of using electronic health records. Themes identified were (a) quality of care, (b) privacy and security, and (c) delivery of services. Benefits to quality of care were mentioned by 100% of the providers, and barriers by 59% of providers. Barriers involving privacy and security concerns were mentioned by 100% of providers, and benefits by 22%. Barriers to delivery of services were mentioned by 97% of providers, and benefits by 66%. Most providers (81%) expressed overall positive support for electronic behavioral health records.

  18. VOIP for Telerehabilitation: A Risk Analysis for Privacy, Security, and HIPAA Compliance

    Science.gov (United States)

    Watzlaf, Valerie J.M.; Moeini, Sohrab; Firouzan, Patti

    2010-01-01

    Voice over the Internet Protocol (VoIP) systems such as Adobe ConnectNow, Skype, ooVoo, etc. may include the use of software applications for telerehabilitation (TR) therapy that can provide voice and video teleconferencing between patients and therapists. Privacy and security applications as well as HIPAA compliance within these protocols have been questioned by information technologists, providers of care and other health care entities. This paper develops a privacy and security checklist that can be used within a VoIP system to determine if it meets privacy and security procedures and whether it is HIPAA compliant. Based on this analysis, specific HIPAA criteria that therapists and health care facilities should follow are outlined and discussed, and therapists must weigh the risks and benefits when deciding to use VoIP software for TR. PMID:25945172

  19. VOIP for Telerehabilitation: A Risk Analysis for Privacy, Security, and HIPAA Compliance.

    Science.gov (United States)

    Watzlaf, Valerie J M; Moeini, Sohrab; Firouzan, Patti

    2010-01-01

    Voice over the Internet Protocol (VoIP) systems such as Adobe ConnectNow, Skype, ooVoo, etc. may include the use of software applications for telerehabilitation (TR) therapy that can provide voice and video teleconferencing between patients and therapists. Privacy and security applications as well as HIPAA compliance within these protocols have been questioned by information technologists, providers of care and other health care entities. This paper develops a privacy and security checklist that can be used within a VoIP system to determine if it meets privacy and security procedures and whether it is HIPAA compliant. Based on this analysis, specific HIPAA criteria that therapists and health care facilities should follow are outlined and discussed, and therapists must weigh the risks and benefits when deciding to use VoIP software for TR.

  20. Predicting user concerns about online privacy in Hong Kong.

    Science.gov (United States)

    Yao, Mike Z; Zhang, Jinguang

    2008-12-01

    Empirical studies on people's online privacy concerns have largely been conducted in the West. The global threat of privacy violations on the Internet calls for similar studies to be done in non-Western regions. To fill this void, the current study develops a path model to investigate the influence of people's Internet use-related factors, their beliefs in the right to privacy, and psychological need for privacy on Hong Kong people's concerns about online privacy. Survey responses from 332 university students were analyzed. Results from this study show that people's belief in the right to privacy was the most important predictor of their online privacy concerns. It also significantly mediated the relationship between people's psychological need for privacy and their concerns with privacy violations online. Moreover, while frequent use of the Internet may increase concerns about online privacy issues, Internet use diversity may actually reduce such worries. The final model, well supported by the observed data, successfully explained 25% of the variability in user concerns about online privacy.

  1. Privacy Issues: Journalists Should Balance Need for Privacy with Need to Cover News.

    Science.gov (United States)

    Plopper, Bruce

    1998-01-01

    Notes that journalists have to balance their desire to print the news with personal rights to privacy. Argues that a working knowledge of ethics and law helps journalism students resolve such issues. Discusses ethical issues; legal aspects of privacy; and "training" administrators. Offers a list of questions to ask, six notable court…

  2. The Regulatory Framework for Privacy and Security

    Science.gov (United States)

    Hiller, Janine S.

    The internet enables the easy collection of massive amounts of personally identifiable information. Unregulated data collection causes distrust and conflicts with widely accepted principles of privacy. The regulatory framework in the United States for ensuring privacy and security in the online environment consists of federal, state, and self-regulatory elements. New laws have been passed to address technological and internet practices that conflict with privacy protecting policies. The United States and the European Union approaches to privacy differ significantly, and the global internet environment will likely cause regulators to face the challenge of balancing privacy interests with data collection for many years to come.

  3. A Privacy-Preserving NFC Mobile Pass for Transport Systems

    Directory of Open Access Journals (Sweden)

    Ghada Arfaoui

    2014-12-01

    Full Text Available The emergence of NFC (Near Field Communication) technology brings new capacities to the next generation of smartphones, but also new security and privacy challenges. Indeed, through its contactless interactions with external entities, the smartphone of an individual will become an essential authentication tool for service providers such as transport operators. However, from the point of view of the user, carrying a part of the service through his smartphone could be a threat to his privacy. Indeed, an external attacker or the service provider himself could be tempted to track the actions of the user. In this paper, we propose a privacy-preserving contactless mobile service, in which a user's identity cannot be linked to his actions when using the transport system. The security of our proposition relies on the combination of a secure element in the smartphone and a privacy-enhancing cryptographic protocol based on a variant of group signatures. In addition, although a user should remain anonymous and his actions unlinkable in his daily journeys, we designed a technique for lifting his anonymity in extreme circumstances. In order to guarantee the usability of our solution, we implemented a prototype demonstrating that our solution meets the major functional requirements for real transport systems: namely that the mobile pass can be validated at a gate in less than 300 ms, even if the battery of the smartphone is exhausted.

  4. Data Security and Privacy in Apps for Dementia: An Analysis of Existing Privacy Policies.

    Science.gov (United States)

    Rosenfeld, Lisa; Torous, John; Vahia, Ipsit V

    2017-08-01

    Despite tremendous growth in the number of health applications (apps), little is known about how well these apps protect their users' health-related data. This gap in knowledge is of particular concern for apps targeting people with dementia, whose cognitive impairment puts them at increased risk of privacy breaches. In this article, we determine how many dementia apps have privacy policies and how well they protect user data. Our analysis included all iPhone apps that matched the search terms "medical + dementia" or "health & fitness + dementia" and collected user-generated content. We evaluated all available privacy policies for these apps based on criteria that systematically measure how individual user data is handled. Seventy-two apps matched the above search terms and collected user data. Of these, only 33 (46%) had an available privacy policy. Nineteen of the 33 with policies (58%) were specific to the app in question, and 25 (76%) specified how individual-user as opposed to aggregate data would be handled. Among these, there was a preponderance of missing information, the majority acknowledged collecting individual data for internal purposes, and most admitted to instances in which they would share user data with outside parties. At present, the majority of health apps focused on dementia lack a privacy policy, and those that do exist lack clarity. Bolstering safeguards and improving communication about privacy protections will help facilitate consumer trust in apps, thereby enabling more widespread and meaningful use by people with dementia and those involved in their care. Copyright © 2017. Published by Elsevier Inc.

  5. Fuzzy Privacy Decision for Context-Aware Access Personal Information

    Institute of Scientific and Technical Information of China (English)

    ZHANG Qingsheng; QI Yong; ZHAO Jizhong; HOU Di; NIU Yujie

    2007-01-01

    A context-aware privacy protection framework was designed for context-aware services and for controlling access to personal information in pervasive environments. In the process of a user's privacy decision, fuzzy privacy decisions can be produced as personal information sensitivity and the trust in the personal information receiver change. An uncertain privacy decision model for personal information disclosure was proposed based on changes in receiver trust and information sensitivity, and a fuzzy privacy decision information system was designed according to this model. Personal privacy control policies can be extracted from this information system by using rough set theory, which also addresses the problem of learning privacy control policies for personal information disclosure.

  6. Privacy concerns, dead or misunderstood? : The perceptions of privacy amongst the young and old

    NARCIS (Netherlands)

    Steijn, Wouter; Vedder, Anton

    2015-01-01

    The concept of ‘privacy’ has become an important topic for academics and policy-makers. Ubiquitous computing and internet access raise new questions in relation to privacy in the virtual world, including individuals’ appreciation of privacy and how this can be safeguarded. This article contributes

  7. Security and privacy qualities of medical devices: an analysis of FDA postmarket surveillance.

    Science.gov (United States)

    Kramer, Daniel B; Baker, Matthew; Ransford, Benjamin; Molina-Markham, Andres; Stewart, Quinn; Fu, Kevin; Reynolds, Matthew R

    2012-01-01

    Medical devices increasingly depend on computing functions such as wireless communication and Internet connectivity for software-based control of therapies and network-based transmission of patients' stored medical information. These computing capabilities introduce security and privacy risks, yet little is known about the prevalence of such risks within the clinical setting. We used three comprehensive, publicly available databases maintained by the Food and Drug Administration (FDA) to evaluate recalls and adverse events related to security and privacy risks of medical devices. Review of weekly enforcement reports identified 1,845 recalls; 605 (32.8%) of these included computers, 35 (1.9%) stored patient data, and 31 (1.7%) were capable of wireless communication. Searches of databases specific to recalls and adverse events identified only one event with a specific connection to security or privacy. Software-related recalls were relatively common, and most (81.8%) mentioned the possibility of upgrades, though only half of these provided specific instructions for the update mechanism. Our review of recalls and adverse events from federal government databases reveals sharp inconsistencies with databases at individual providers with respect to security and privacy risks. Recalls related to software may increase security risks because of unprotected update and correction mechanisms. To detect signals of security and privacy problems that adversely affect public health, federal postmarket surveillance strategies should rethink how to effectively and efficiently collect data on security and privacy problems in devices that increasingly depend on computing systems susceptible to malware.

  8. Effective sharing of health records, maintaining privacy: a practical schema.

    Science.gov (United States)

    Neame, Roderick

    2013-01-01

    A principal goal of computerisation of medical records is to join up care services for patients, so that their records can follow them wherever they go and thereby reduce delays, duplications, risks and errors, and costs. Healthcare records are increasingly being stored electronically, which has created the necessary conditions for them to be readily sharable. However, simply driving the implementation of electronic medical records is not sufficient, as recent developments have demonstrated (1): there remain significant obstacles. The three main obstacles relate to (a) record accessibility (knowing where event records are and being able to access them), (b) maintaining privacy (ensuring that only those authorised by the patient can access and extract meaning from the records) and (c) assuring the functionality of the shared information (ensuring that the records can be shared non-proprietorially across platforms without loss of meaning, and that their authenticity and trustworthiness are demonstrable). These constitute a set of issues that need new thinking, since existing systems are struggling to deliver them. The solution to this puzzle lies in three main parts. Clearly there is only one environment suited to such widespread sharing, which is the World Wide Web, so this is the communications basis. Part one requires that a sharable synoptic record is created for each care event and stored in standard web-format and in readily accessible locations, on 'the web' or in 'the cloud'. To maintain privacy these publicly-accessible records must be suitably protected: either stripped of identifiers (names, addresses, dates, places, etc.) and/or encrypted. Either way, the record must be tagged with a tag that means nothing to anyone, but serves to identify and authenticate a specific record when retrieved. For ease of retrieval, patients must hold an index of care events, records and web locations (plus any associated information for each such as encryption keys, context etc
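
    The storage side of this schema can be sketched as follows: each care-event record is encrypted, tagged with an opaque identifier, and published, while the patient keeps a private index of tags, locations and keys. The field names, the in-memory "cloud" dictionary and the use of Fernet symmetric encryption (from the Python cryptography package) are illustrative assumptions; key management, record authenticity and standard web formats are left aside.

      # Publish encrypted, opaquely tagged event records; keep the index with the patient.
      import json
      import secrets
      from cryptography.fernet import Fernet

      cloud_store = {}      # stands in for publicly reachable web/cloud storage
      patient_index = []    # held by the patient: tag, location, key, context

      def publish_event(record: dict, context: str):
          key = Fernet.generate_key()
          tag = secrets.token_urlsafe(16)          # meaningless to everyone else
          cloud_store[tag] = Fernet(key).encrypt(json.dumps(record).encode())
          patient_index.append({"tag": tag, "location": "cloud_store",
                                "key": key, "context": context})

      def retrieve_event(entry: dict) -> dict:
          ciphertext = cloud_store[entry["tag"]]
          return json.loads(Fernet(entry["key"]).decrypt(ciphertext))

      publish_event({"event": "consultation", "summary": "routine check, BP normal"},
                    context="GP visit")
      print(retrieve_event(patient_index[0]))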

  9. Privacy-Preserving Billing Scheme against Free-Riders for Wireless Charging Electric Vehicles

    Directory of Open Access Journals (Sweden)

    Xingwen Zhao

    2017-01-01

    Full Text Available Recently, scientists in South Korea developed the on-line electric vehicle (OLEV), a kind of electric vehicle that can be charged wirelessly while it is moving on the road. The battery in the vehicle can absorb electric energy from the power transmitters buried under the road without any contact with them. Several billing schemes have been presented to offer privacy-preserving billing for OLEV owners. However, they did not consider the existence of free-riders. When some vehicles are being charged after showing their tokens, vehicles that are running ahead or behind can switch on their systems and drive close by for free charging. We describe a billing scheme against free-riders by using several cryptographic tools. Each vehicle should authenticate with a compensation-prepaid token before it can drive on the wireless-charging-enabled road. The service provider can obtain compensation if it can prove that a certain vehicle is a free-rider. Our scheme is privacy-preserving, so the charging will not disclose the locations and routine routes of each vehicle. In fact, our scheme is a fast authentication scheme that anonymously authenticates each user on accessing a sequence of services. Thus, it can be applied to sequential data-delivery services in future 5G systems.

  10. Privacy and the Connected Society

    DEFF Research Database (Denmark)

    Sørensen, Lene Tolstrup; Khajuria, Samant; Skouby, Knud Erik

    The vision of the 5G-enabled connected society is largely based on the evolution and implementation of the Internet of Things. This involves, amongst others, a significant rise in devices, sensors and communication in pervasive interconnections, as well as cooperation amongst devices and entities across...... the society. In enabling the vision of the connected society, researchers point to security and privacy as areas that challenge the vision. Using the Internet of Things reference model as well as the vision of the connected society, this paper identifies privacy of the individual with respect...... to three selected areas: shopping, connected cars and online gaming. The paper concludes that privacy is a complex issue within the connected-society vision and that there is a need for more privacy use cases to shed light on the challenge....

  11. Privacy-preserving Kruskal-Wallis test.

    Science.gov (United States)

    Guo, Suxin; Zhong, Sheng; Zhang, Aidong

    2013-10-01

    Statistical tests are powerful tools for data analysis. Kruskal-Wallis test is a non-parametric statistical test that evaluates whether two or more samples are drawn from the same distribution. It is commonly used in various areas. But sometimes, the use of the method is impeded by privacy issues raised in fields such as biomedical research and clinical data analysis because of the confidential information contained in the data. In this work, we give a privacy-preserving solution for the Kruskal-Wallis test which enables two or more parties to coordinately perform the test on the union of their data without compromising their data privacy. To the best of our knowledge, this is the first work that solves the privacy issues in the use of the Kruskal-Wallis test on distributed data. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
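
    The privacy-preserving protocol itself is not reproduced here. The short sketch below only shows the quantity the parties want to evaluate jointly, the Kruskal-Wallis H statistic, computed with SciPy on samples that are pooled in the clear purely for illustration; in the paper's setting each sample would stay with its owner.

      # What the parties jointly compute: the Kruskal-Wallis H statistic and p-value.
      from scipy.stats import kruskal

      party_a = [2.9, 3.0, 2.5, 2.6, 3.2]      # hypothetical per-party measurements
      party_b = [3.8, 2.7, 4.0, 2.4]
      party_c = [2.8, 3.4, 3.7, 2.2, 2.0]

      h_statistic, p_value = kruskal(party_a, party_b, party_c)
      print(f"H = {h_statistic:.3f}, p = {p_value:.3f}")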

  12. Negotiating privacy in surveillant welfare relations

    DEFF Research Database (Denmark)

    Andersen, Lars Bo; Lauritsen, Peter; Bøge, Ask Risom

    However, while privacy is central to debates of surveillance, it has proven less productive as an analytical resource for studying surveillance in practice. Consequently, this paper reviews different conceptualisations of privacy in relation to welfare and surveillance and argues for strengthening the analytical capacity of the concept by rendering it a situated and relational concept. The argument is developed through a research and design project called Teledialogue, meant to improve the relation between case managers and children placed at institutions or in foster families. Privacy in Teledialogue...... notion of privacy are discussed in relation to both research- and public debates on surveillance in a welfare setting....

  13. Visual privacy by context: proposal and evaluation of a level-based visualisation scheme.

    Science.gov (United States)

    Padilla-López, José Ramón; Chaaraoui, Alexandros Andre; Gu, Feng; Flórez-Revuelta, Francisco

    2015-06-04

    Privacy in image and video data has become an important subject since cameras are being installed in an increasing number of public and private spaces. Specifically, in assisted living, intelligent monitoring based on computer vision can allow one to provide risk detection and support services that increase people's autonomy at home. In the present work, a level-based visualisation scheme is proposed to provide visual privacy when human intervention is necessary, such as at telerehabilitation and safety assessment applications. Visualisation levels are dynamically selected based on the previously modelled context. In this way, different levels of protection can be provided, maintaining the necessary intelligibility required for the applications. Furthermore, a case study of a living room, where a top-view camera is installed, is presented. Finally, the performed survey-based evaluation indicates the degree of protection provided by the different visualisation models, as well as the personal privacy preferences and valuations of the users.
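
    The general idea of selecting a visualisation level from the modelled context can be sketched as follows; the level names, the example context attributes and the blur-based transforms (using OpenCV) are hypothetical illustrations, not the protection models evaluated in the paper.

      # Toy mapping from modelled context to a visualisation level, with one
      # possible image transform per level (illustrative only).
      import numpy as np
      import cv2

      LEVELS = {
          "relative_viewing":     lambda img: cv2.GaussianBlur(img, (31, 31), 0),  # heavy blur
          "caregiver_assessment": lambda img: cv2.GaussianBlur(img, (7, 7), 0),    # mild blur
          "emergency_response":   lambda img: img,                                 # raw view
      }

      def select_level(context: dict) -> str:
          if context.get("fall_detected"):
              return "emergency_response"
          if context.get("viewer_role") == "caregiver":
              return "caregiver_assessment"
          return "relative_viewing"

      frame = np.random.randint(0, 256, (240, 320, 3), dtype=np.uint8)  # stand-in camera frame
      context = {"viewer_role": "relative", "fall_detected": False}
      protected = LEVELS[select_level(context)](frame)
      print(select_level(context), protected.shape)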

  14. Economics of Privacy: Users' Attitudes and Economic Impact of Information Privacy Protection

    OpenAIRE

    Frik, Alisa

    2017-01-01

    This doctoral thesis consists of three essays within the field of the economics of information privacy, examined through the lens of behavioral and experimental economics. The rapid development and expansion of Internet, mobile and network technologies in recent decades has provided multitudinous opportunities and benefits to both business and society, offering customized services and personalized offers at relatively low price and high speed. However, such innovations and progress have al...

  15. Insights to develop privacy policy for organization in Indonesia

    Science.gov (United States)

    Rosmaini, E.; Kusumasari, T. F.; Lubis, M.; Lubis, A. R.

    2018-03-01

    Nowadays, the increased utilization of shared applications over networks not only dictates enhanced security but also emphasizes the need to balance privacy protection with ease of use. Meanwhile, the accessibility and availability demanded of organizational services make privacy obligations a more complex process to handle and control. Nonetheless, the underlying principles for privacy policy exist in current Indonesian law, even though they are spread across various regulations. Religion, the constitution, statutes, regulations, custom and cultural requirements still serve as the reference model to control data collection and information sharing accordingly. Moreover, as customers and organizations often misinterpret their responsibilities and rights at the business function, process and organizational level, it is essential for professionals to consider how to articulate clearly the rules that manage information gathering and distribution in a manner that translates into information system specifications and requirements for developers and managers. This study focuses on providing suggestions and recommendations for developing privacy policy, based on a descriptive analysis of 791 respondents on personal data protection in accordance with political and economic factors in Indonesia.

  16. 32 CFR 311.7 - OSD/JS Privacy Office Processes.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 2 2010-07-01 2010-07-01 false OSD/JS Privacy Office Processes. 311.7 Section...) PRIVACY PROGRAM OFFICE OF THE SECRETARY OF DEFENSE AND JOINT STAFF PRIVACY PROGRAM § 311.7 OSD/JS Privacy Office Processes. The OSD/JS Privacy Office shall: (a) Exercise oversight and administrative control of...

  17. 32 CFR 701.101 - Privacy program terms and definitions.

    Science.gov (United States)

    2010-07-01

    ... from a project on privacy issues, identifying and resolving the privacy risks, and approval by a... 32 National Defense 5 2010-07-01 2010-07-01 false Privacy program terms and definitions. 701.101... DEPARTMENT OF THE NAVY DOCUMENTS AFFECTING THE PUBLIC DON Privacy Program § 701.101 Privacy program terms and...

  18. An informational theory of privacy

    NARCIS (Netherlands)

    Schottmuller, C.; Jann, Ole

    2016-01-01

    We develop a theory that explains how and when privacy can increase welfare. Without privacy, some individuals misrepresent their preferences, because they will otherwise be statistically discriminated against. This "chilling effect" hurts them individually, and impairs information aggregation. The

  19. The role of health care experience and consumer information efficacy in shaping privacy and security perceptions of medical records: national consumer survey results.

    Science.gov (United States)

    Patel, Vaishali; Beckjord, Ellen; Moser, Richard P; Hughes, Penelope; Hesse, Bradford W

    2015-04-02

    Providers' adoption of electronic health records (EHRs) is increasing and consumers have expressed concerns about the potential effects of EHRs on privacy and security. Yet, we lack a comprehensive understanding regarding factors that affect individuals' perceptions regarding the privacy and security of their medical information. The aim of this study was to describe national perceptions regarding the privacy and security of medical records and identify a comprehensive set of factors associated with these perceptions. Using a nationally representative 2011-2012 survey, we reported on adults' perceptions regarding privacy and security of medical records and sharing of health information between providers, and whether adults withheld information from a health care provider due to privacy or security concerns. We used multivariable models to examine the association between these outcomes and sociodemographic characteristics, health and health care experience, information efficacy, and technology-related variables. Approximately one-quarter of American adults (weighted n=235,217,323; unweighted n=3959) indicated they were very confident (n=989) and approximately half indicated they were somewhat confident (n=1597) in the privacy of their medical records; we found similar results regarding adults' confidence in the security of medical records (very confident: n=828; somewhat confident: n=1742). In all, 12.33% (520/3904) withheld information from a health care provider and 59.06% (2100/3459) expressed concerns about the security of both faxed and electronic health information. Adjusting for other characteristics, adults who reported higher quality of care had significantly greater confidence in the privacy and security of their medical records and were less likely to withhold information from their health care provider due to privacy or security concerns. Adults with higher information efficacy had significantly greater confidence in the privacy and security of medical

  20. 45 CFR 503.2 - General policies-Privacy Act.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 3 2010-10-01 2010-10-01 false General policies-Privacy Act. 503.2 Section 503.2... THE UNITED STATES, DEPARTMENT OF JUSTICE RULES OF PRACTICE PRIVACY ACT AND GOVERNMENT IN THE SUNSHINE REGULATIONS Privacy Act Regulations § 503.2 General policies—Privacy Act. The Commission will protect the...

  1. Privacy and Personal Information Held by Government: A Comparative Study, Japan and New Zealand

    Science.gov (United States)

    Cullen, Rowena

    This chapter reports on the concepts of information privacy and trust in government among citizens in Japan and New Zealand in a transnational, crosscultural study. Data from both countries are presented, and cultural and other factors are sought that might explain differences in attitudes shown. In both countries, citizens display a range of views, not related to age or gender. New Zealand citizens express concern about information privacy in relation to information held by government, but show a higher level of trust in government overall, and most attribute breaches of privacy to incompetence, rather than deliberate malfeasance. Japanese citizens interviewed also indicated that they had major concerns about information privacy, and had considerably less trust in government than New Zealand respondents showed. They were more inclined to attribute breaches of privacy to lax behavior in individuals than government systems. In both countries citizens showed an awareness of the tradeoffs necessary between personal privacy and the needs of the state to hold information for the benefit of all citizens, but knew little about the protection offered by privacy legislation, and expressed overall concern about privacy practices in the modern state. The study also provides evidence of cultural differences that can be related to Hofstede's dimensions of culture.

  2. Sharing privacy-sensitive access to neuroimaging and genetics data: a review and preliminary validation.

    Science.gov (United States)

    Sarwate, Anand D; Plis, Sergey M; Turner, Jessica A; Arbabshirani, Mohammad R; Calhoun, Vince D

    2014-01-01

    The growth of data sharing initiatives for neuroimaging and genomics represents an exciting opportunity to confront the "small N" problem that plagues contemporary neuroimaging studies while further understanding the role genetic markers play in the function of the brain. When it is possible, open data sharing provides the most benefits. However, some data cannot be shared at all due to privacy concerns and/or risk of re-identification. Sharing other data sets is hampered by the proliferation of complex data use agreements (DUAs) which preclude truly automated data mining. These DUAs arise because of concerns about the privacy and confidentiality for subjects; though many do permit direct access to data, they often require a cumbersome approval process that can take months. An alternative approach is to only share data derivatives such as statistical summaries-the challenges here are to reformulate computational methods to quantify the privacy risks associated with sharing the results of those computations. For example, a derived map of gray matter is often as identifiable as a fingerprint. Thus alternative approaches to accessing data are needed. This paper reviews the relevant literature on differential privacy, a framework for measuring and tracking privacy loss in these settings, and demonstrates the feasibility of using this framework to calculate statistics on data distributed at many sites while still providing privacy.
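
    As a minimal illustration of computing a statistic over data held at several sites under differential privacy, the sketch below has each site release a Laplace-noised, bounded local mean and a coordinator combine the releases. The value bounds, epsilon and the weighting rule are assumptions made for illustration and do not reproduce the framework reviewed in the paper.

      # Each site privatizes its own summary; only noisy values are shared.
      import numpy as np

      rng = np.random.default_rng(7)

      def noisy_local_mean(values, lower, upper, epsilon):
          values = np.clip(values, lower, upper)
          sensitivity = (upper - lower) / len(values)  # replacing one record moves the mean at most this much
          return values.mean() + rng.laplace(scale=sensitivity / epsilon)

      # Hypothetical per-site gray-matter summary values, never shared directly.
      site_sizes = (80, 120, 60)
      sites = [rng.normal(0.55, 0.05, size=n) for n in site_sizes]
      releases = [noisy_local_mean(s, lower=0.0, upper=1.0, epsilon=0.5) for s in sites]

      # The coordinator combines the privatized site-level releases, weighted by site size.
      pooled = float(np.average(releases, weights=site_sizes))
      print("pooled noisy mean:", round(pooled, 4))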

  3. Sharing privacy-sensitive access to neuroimaging and genetics data: a review and preliminary validation

    Directory of Open Access Journals (Sweden)

    Anand D. Sarwate

    2014-04-01

    Full Text Available The growth of data sharing initiatives for neuroimaging and genomics represents an exciting opportunity to confront the “small N” problem that plagues contemporary neuroimaging studies while further understanding the role genetic markers play in the function of the brain. When it is possible, open data sharing provides the most benefits. However, some data cannot be shared at all due to privacy concerns and/or risk of re-identification. Sharing other data sets is hampered by the proliferation of complex data use agreements (DUAs) which preclude truly automated data mining. These DUAs arise because of concerns about the privacy and confidentiality for subjects; though many do permit direct access to data, they often require a cumbersome approval process that can take months. An alternative approach is to only share data derivatives such as statistical summaries -- the challenges here are to reformulate computational methods to quantify the privacy risks associated with sharing the results of those computations. For example, a derived map of gray matter is often as identifiable as a fingerprint. Thus alternative approaches to accessing data are needed. This paper reviews the relevant literature on differential privacy, a framework for measuring and tracking privacy loss in these settings, and demonstrates the feasibility of using this framework to calculate statistics on data distributed at many sites while still providing privacy.

  4. Sharing privacy-sensitive access to neuroimaging and genetics data: a review and preliminary validation

    Science.gov (United States)

    Sarwate, Anand D.; Plis, Sergey M.; Turner, Jessica A.; Arbabshirani, Mohammad R.; Calhoun, Vince D.

    2014-01-01

    The growth of data sharing initiatives for neuroimaging and genomics represents an exciting opportunity to confront the “small N” problem that plagues contemporary neuroimaging studies while further understanding the role genetic markers play in the function of the brain. When it is possible, open data sharing provides the most benefits. However, some data cannot be shared at all due to privacy concerns and/or risk of re-identification. Sharing other data sets is hampered by the proliferation of complex data use agreements (DUAs) which preclude truly automated data mining. These DUAs arise because of concerns about the privacy and confidentiality for subjects; though many do permit direct access to data, they often require a cumbersome approval process that can take months. An alternative approach is to only share data derivatives such as statistical summaries—the challenges here are to reformulate computational methods to quantify the privacy risks associated with sharing the results of those computations. For example, a derived map of gray matter is often as identifiable as a fingerprint. Thus alternative approaches to accessing data are needed. This paper reviews the relevant literature on differential privacy, a framework for measuring and tracking privacy loss in these settings, and demonstrates the feasibility of using this framework to calculate statistics on data distributed at many sites while still providing privacy. PMID:24778614
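
    The Laplace mechanism is the basic building block of the differential privacy framework reviewed in the records above. A minimal, illustrative Python sketch of releasing a site-level mean under that mechanism follows; the bounded-value assumption and the sensitivity and epsilon values are examples chosen here, not figures from the paper.

        import numpy as np

        def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float) -> float:
            """Add Laplace noise with scale sensitivity/epsilon, the standard
            construction for epsilon-differential privacy."""
            return true_value + np.random.laplace(loc=0.0, scale=sensitivity / epsilon)

        # Example: privately release the mean of a bounded measurement at one site.
        # Values are assumed to lie in [0, 1], so the sensitivity of the mean is 1/n.
        site_measurements = np.clip(np.random.rand(500), 0.0, 1.0)
        n = len(site_measurements)
        private_mean = laplace_mechanism(site_measurements.mean(),
                                         sensitivity=1.0 / n, epsilon=0.5)
        print(f"private site mean: {private_mean:.4f}")

    Each site can release such noisy statistics to a central analysis, which is one way the goal of computing on distributed data without sharing it, as described above, can be approached.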

  5. SmartPrivacy for the smart grid : embedding privacy into the design of electricity conservation

    Energy Technology Data Exchange (ETDEWEB)

    Cavoukian, A. [Ontario Information and Privacy Commissioner, Toronto, ON (Canada); Polonetsky, J.; Wolf, C. [Future of Privacy Forum, Washington, DC (United States)

    2009-11-15

    Modernization efforts are underway to make the current electrical grid smarter. The future of the Smart Grid will be capable of informing consumers of their day-to-day energy use, curbing greenhouse gas emissions, and reducing consumers' energy bills. However, the Smart Grid also brings with it the possibility of collecting detailed information on individual energy consumption use and patterns within peoples' homes. This paper discussed the Smart Grid and its benefits, as well as the questions that should be examined regarding privacy. The paper also outlined the concept of SmartPrivacy and discussed its application to the Smart Grid scenario. Privacy by design foundational principles and Smart Grid components were also presented in an appendix. It was concluded that the information collected on a Smart Grid will form a library of personal information. The mishandling of this information could be extremely invasive of consumer privacy. 46 refs., 1 fig., 2 appendices.

  6. 16 CFR 313.2 - Model privacy form and examples.

    Science.gov (United States)

    2010-01-01

    ... 16 Commercial Practices 1 2010-01-01 2010-01-01 false Model privacy form and examples. 313.2... PRIVACY OF CONSUMER FINANCIAL INFORMATION § 313.2 Model privacy form and examples. (a) Model privacy form..., although use of the model privacy form is not required. (b) Examples. The examples in this part are not...

  7. Preserving differential privacy under finite-precision semantics.

    Directory of Open Access Journals (Sweden)

    Ivan Gazeau

    2013-06-01

    Full Text Available The approximation introduced by finite-precision representation of continuous data can induce arbitrarily large information leaks even when the computation using exact semantics is secure. Such leakage can thus undermine design efforts aimed at protecting sensitive information. We focus here on differential privacy, an approach to privacy that emerged from the area of statistical databases and is now widely applied also in other domains. In this approach, privacy is protected by the addition of noise to a true (private) value. To date, this approach to privacy has been proved correct only in the ideal case in which computations are made using an idealized, infinite-precision semantics. In this paper, we analyze the situation at the implementation level, where the semantics is necessarily finite-precision, i.e. the representation of real numbers, and the operations on them, are rounded according to some level of precision. We show that in general there are violations of the differential privacy property, and we study the conditions under which we can still guarantee a limited (but, arguably, totally acceptable) variant of the property, under only a minor degradation of the privacy level. Finally, we illustrate our results on two cases of noise-generating distributions: the standard Laplacian mechanism commonly used in differential privacy, and a bivariate version of the Laplacian recently introduced in the setting of privacy-aware geolocation.

  8. Providing a USSD location based clinic finder in South Africa: did it work?

    Science.gov (United States)

    Parsons, Annie Neo; Timler, Dagmar

    2014-01-01

    A new mHealth service, Clinic Finder, was designed to provide a location-based service for any cellphone user in South Africa dialing a dedicated USSD string to find the nearest public primary health care facility. The service was funded by a European Union grant to Cell-Life to support the National Department of Health. Clinic Finder's aims were to provide a reliable and accurate service, and to assess both the most effective means of advertising the service as well as interest in the service. Users dialing the USSD string are asked to agree to geo-location (Vodacom and MTN users) or asked to enter their province, town and street (virtual network users and those choosing not to geo-locate). The service provider, AAT, sends the data to Cell-Life where an SMS with details of the nearest public primary health care facility is sent to the user by Cell-Life's open-source Communicate platform. The service was advertised on 3 days in 2014 using two different means: a newspaper ad on 20 May 2014 and Please Call Me ads on 30 July 2014 and 14 August 2014. 28.2% of unique users on 20 May 2014, 10.5% of unique users on 30 July 2014 and 92.8% of unique users on 14 August 2014 who agreed to geo-location successfully received SMSs. However, only 4.2%, 0.5%, and 2.4% of unique users responding to each advertisement who did not geo-locate then received an SMS. A small survey of users following the 20 May 2014 newspaper ad found overall interest in the idea of Clinic Finder, though unsuccessful users were more likely to dislike the service. The overall experience of using location based services and USSD for Clinic Finder suggests a need in the field of mHealth for wider availability of data on service usability and effectiveness.
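
    The service flow described above reduces to three steps: obtain coordinates (from network geo-location or a typed address), find the nearest facility, and send the result back by SMS. A hypothetical sketch of the nearest-facility step using the haversine distance is shown below; the clinic list and field names are invented for illustration and do not reflect Cell-Life's actual data or the AAT interface.

        from math import radians, sin, cos, asin, sqrt

        def haversine_km(lat1, lon1, lat2, lon2):
            """Great-circle distance between two points, in kilometres."""
            lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
            a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
            return 2 * 6371.0 * asin(sqrt(a))

        def nearest_clinic(user_lat, user_lon, clinics):
            """Return the facility closest to the user's coordinates."""
            return min(clinics, key=lambda c: haversine_km(user_lat, user_lon, c["lat"], c["lon"]))

        # Hypothetical facility list; a real deployment would query a national facility register.
        clinics = [
            {"name": "Hillbrow Community Health Centre", "lat": -26.188, "lon": 28.049},
            {"name": "Chiawelo Clinic", "lat": -26.285, "lon": 27.866},
        ]
        best = nearest_clinic(-26.204, 28.047, clinics)
        sms_body = f"Your nearest public clinic is {best['name']}."  # text of the reply SMS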

  9. 76 FR 24557 - Privacy Act of 1974; as Amended; Proposed Alteration to an Existing Privacy Act System of Records...

    Science.gov (United States)

    2011-05-02

    ... computer system that will house the data. We annually provide all our employees and contractors with... by the system: This system covers vocational experts, medical experts, other health care professional... Privacy Act System of Records, Housekeeping Changes, and New Routine Use AGENCY: Social Security...

  10. A Privacy Model for RFID Tag Ownership Transfer

    Directory of Open Access Journals (Sweden)

    Xingchun Yang

    2017-01-01

    Full Text Available The ownership of an RFID tag is often transferred from one owner to another in its life cycle. To address the privacy problem caused by tag ownership transfer, we propose a tag privacy model which captures the adversary’s abilities to get secret information inside readers, to corrupt tags, to authenticate tags, and to observe tag ownership transfer processes. This model gives formal definitions for tag forward privacy and backward privacy and can be used to measure the privacy property of a tag ownership transfer scheme. We also present a tag ownership transfer scheme, which is privacy-preserving under the proposed model and satisfies the other common security requirements, in addition to achieving better performance.

  11. 12 CFR 716.2 - Model privacy form and examples.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Model privacy form and examples. 716.2 Section... PRIVACY OF CONSUMER FINANCIAL INFORMATION § 716.2 Model privacy form and examples. (a) Model privacy form..., although use of the model privacy form is not required. (b) Examples. The examples in this part are not...

  12. VOIP for Telerehabilitation: A Risk Analysis for Privacy, Security, and HIPAA Compliance

    Directory of Open Access Journals (Sweden)

    Valerie J.M. Watzlaf

    2010-10-01

    Full Text Available Voice over the Internet Protocol (VoIP) systems such as Adobe ConnectNow, Skype, ooVoo, etc. may include the use of software applications for telerehabilitation (TR) therapy that can provide voice and video teleconferencing between patients and therapists. Privacy and security applications as well as HIPAA compliance within these protocols have been questioned by information technologists, providers of care, and other health care entities. This paper develops a privacy and security checklist that can be used within a VoIP system to determine if it meets privacy and security procedures and whether it is HIPAA compliant. Based on this analysis, specific HIPAA criteria that therapists and health care facilities should follow are outlined and discussed, and therapists must weigh the risks and benefits when deciding to use VoIP software for TR.

  13. Culture, Privacy Conception and Privacy Concern: Evidence from Europe before PRISM

    OpenAIRE

    Omrani, Nessrine; Soulié, Nicolas

    2017-01-01

    This article analyses individuals’ online privacy concerns between cultural country groups. We use a dataset of more than 14 000 Internet users collected by the European Union in 2010 in 26 EU countries. We use a probit model to examine the variables associated with the probability of being concerned about privacy, in order to draw policy and regulatory implications. The results show that women and poor people are more concerned than their counterparts. People who often use Internet are not p...

  14. A Privacy-Preserving Outsourcing Data Storage Scheme with Fragile Digital Watermarking-Based Data Auditing

    Directory of Open Access Journals (Sweden)

    Xinyue Cao

    2016-01-01

    Full Text Available Cloud storage has been recognized as a popular solution to the rising storage costs faced by IT enterprises and their users. However, outsourcing data to cloud service providers (CSPs) may leak sensitive private information, as the data is out of the user's control. How to ensure the integrity and privacy of outsourced data has therefore become a big challenge. Encryption and data auditing provide a solution to this challenge. In this paper, we propose a privacy-preserving and auditing-supporting outsourcing data storage scheme that uses encryption and digital watermarking. A logistic map-based chaotic cryptography algorithm is used to preserve the privacy of the outsourced data; it has a fast operation speed and a good encryption effect. A local histogram-shifting digital watermarking algorithm is used to protect data integrity; it has a high payload and allows the original image to be restored losslessly if the data is verified to be intact. Experiments show that our scheme is secure and feasible.
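
    For readers unfamiliar with chaotic ciphers, the core idea the scheme relies on is that a logistic map iterated from a secret starting point produces a hard-to-predict sequence that can serve as a keystream. A minimal, illustrative sketch of that idea follows; the parameter values are examples, the byte quantisation is a simplification, and a production design would need considerably more care (plain logistic-map keystreams have known cryptanalytic weaknesses).

        def logistic_keystream(x0: float, r: float, n_bytes: int) -> bytes:
            """Generate a keystream by iterating the logistic map x = r * x * (1 - x)."""
            x = x0
            out = bytearray()
            for _ in range(n_bytes):
                x = r * x * (1.0 - x)
                out.append(int(x * 256) % 256)  # quantise each iterate to one byte
            return bytes(out)

        def xor_with_keystream(data: bytes, keystream: bytes) -> bytes:
            """XOR data with the keystream; applying it twice restores the original."""
            return bytes(d ^ k for d, k in zip(data, keystream))

        secret = b"outsourced record"
        key = logistic_keystream(x0=0.3141592, r=3.99, n_bytes=len(secret))  # (x0, r) act as the key
        ciphertext = xor_with_keystream(secret, key)
        assert xor_with_keystream(ciphertext, key) == secret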

  15. Guaranteeing Privacy-Observing Data Exchange

    DEFF Research Database (Denmark)

    Probst, Christian W.

    2016-01-01

    Privacy is a major concern in large parts of the world when exchanging information. Ideally, we would like to be able to have fine-grained control over how information that we deem sensitive can be propagated and used. While privacy policy languages exist, it is not possible to control whether the entity that receives data is living up to its own policy specification. In this work we present our initial work on an approach that empowers data owners to specify their privacy preferences, and data consumers to specify their data needs. Using a static analysis of the two specifications, our approach then finds a communication scheme that complies with these preferences and needs. While applicable to online transactions, the same techniques can be used in development of IT systems dealing with sensitive data. To the best of our knowledge, no existing privacy policy language supports negotiation...

  16. 77 FR 46643 - Children's Online Privacy Protection Rule

    Science.gov (United States)

    2012-08-06

    ... providing notice to and obtaining consent from parents. Conversely, online services whose business models..., challenging others to gameplay, swapping digital collectibles, participating in monitored `chat' with... Digital Democracy (``CDD''), Consumers Union (``CU''), and the Electronic Privacy Information Center...

  17. Privacy and policy for genetic research.

    Science.gov (United States)

    DeCew, Judith Wagner

    2004-01-01

    I begin with a discussion of the value of privacy and what we lose without it. I then turn to the difficulties of preserving privacy for genetic information and other medical records in the face of advanced information technology. I suggest three alternative public policy approaches to the problem of protecting individual privacy and also preserving databases for genetic research: (1) governmental guidelines and centralized databases, (2) corporate self-regulation, and (3) my hybrid approach. None of these are unproblematic; I discuss strengths and drawbacks of each, emphasizing the importance of protecting the privacy of sensitive medical and genetic information as well as letting information technology flourish to aid patient care, public health and scientific research.

  18. Online access to doctors' notes: patient concerns about privacy.

    Science.gov (United States)

    Vodicka, Elisabeth; Mejilla, Roanne; Leveille, Suzanne G; Ralston, James D; Darer, Jonathan D; Delbanco, Tom; Walker, Jan; Elmore, Joann G

    2013-09-26

    Offering patients online access to medical records, including doctors' visit notes, holds considerable potential to improve care. However, patients may worry about loss of privacy when accessing personal health information through Internet-based patient portals. The OpenNotes study provided patients at three US health care institutions with online access to their primary care doctors' notes and then collected survey data about their experiences, including their concerns about privacy before and after participation in the intervention. To identify patients' attitudes toward privacy when given electronic access to their medical records, including visit notes. The design used a nested cohort study of patients surveyed at baseline and after a 1-year period during which they were invited to read their visit notes through secure patient portals. Participants consisted of 3874 primary care patients from Beth Israel Deaconess Medical Center (Boston, MA), Geisinger Health System (Danville, PA), and Harborview Medical Center (Seattle, WA) who completed surveys before and after the OpenNotes intervention. The measures were patient-reported levels of concern regarding privacy associated with online access to visit notes. 32.91% of patients (1275/3874 respondents) reported concerns about privacy at baseline versus 36.63% (1419/3874 respondents) post-intervention. Baseline concerns were associated with non-white race/ethnicity and lower confidence in communicating with doctors, but were not associated with choosing to read notes or desire for continued online access post-intervention (nearly all patients with notes available chose to read them and wanted continued access). While the level of concern among most participants did not change during the intervention, 15.54% (602/3874 respondents, excluding participants who responded "don't know") reported more concern post-intervention, and 12.73% (493/3874 respondents, excluding participants who responded "don't know") reported less

  19. Forensic DNA phenotyping: Developing a model privacy impact assessment.

    Science.gov (United States)

    Scudder, Nathan; McNevin, Dennis; Kelty, Sally F; Walsh, Simon J; Robertson, James

    2018-05-01

    Forensic scientists around the world are adopting new technology platforms capable of efficiently analysing a larger proportion of the human genome. Undertaking this analysis could provide significant operational benefits, particularly in giving investigators more information about the donor of genetic material, a particularly useful investigative lead. Such information could include predicting externally visible characteristics such as eye and hair colour, as well as biogeographical ancestry. This article looks at the adoption of this new technology from a privacy perspective, using this to inform and critique the application of a Privacy Impact Assessment to this emerging technology. Noting the benefits and limitations, the article develops a number of themes that would influence a model Privacy Impact Assessment as a contextual framework for forensic laboratories and law enforcement agencies considering implementing forensic DNA phenotyping for operational use. Copyright © 2018 Elsevier B.V. All rights reserved.

  20. “Jones-ing” for a Solution: Commercial Street Surveillance and Privacy Torts in Canada

    Directory of Open Access Journals (Sweden)

    Stuart Hargreaves

    2014-07-01

    Full Text Available While street surveillance technologies such as Google Street View are deployed with no discriminatory intent, there is selective scrutiny applied to the published imagery by the anonymous crowd. Disproportionately directed at women and members of ethnic minority groups, this scrutiny means the social risks of street surveillance are not equal. This paper considers the possibility of invasion of privacy actions in tort brought against the commercial service provider as a possible solution. Analysis suggests that Canadian law has evolved in a way such that it is exceedingly difficult to make a claim for “privacy” in tort when the plaintiff is located in public space. This evolution exists in order to ensure that innocuous behavior not be rendered actionable. Furthermore, conceptual reasons exist to suggest that actions in tort are unlikely to be the best solution to the problems posed by commercial street surveillance. While any individual case of embarrassment or nuisance matters, broader “macro-harms” that impact entire communities reflect perhaps the most serious problem associated with the selective scrutiny of street surveillance imagery. Yet, it seems difficult to justify attaching liability for those harms to the commercial providers. While limits need to be placed on the operation of these street surveillance programmes, it is unlikely that invasion of privacy actions are the most effective way to achieve that goal.

  1. A PRIVACY MANAGEMENT ARCHITECTURE FOR PATIENT-CONTROLLED PERSONAL HEALTH RECORD SYSTEM

    Directory of Open Access Journals (Sweden)

    MD. NURUL HUDA

    2009-06-01

    Full Text Available Patient-controlled personal health record systems can help make health care safer, cheaper, and more convenient by enabling patients to (1) grant any care provider access to their complete personal health records anytime from anywhere, (2) avoid repeated tests, and (3) control their privacy transparently. In this paper, we present the architecture of our Privacy-aware Patient-controlled Personal Health Record (P3HR) system, through which a patient can view her integrated health history and share her health information transparently with others (e.g., healthcare providers). Access to the health information of a particular patient is completely controlled by that patient. We also carry out an intuitive security and privacy analysis of the P3HR system architecture considering different types of security attacks. Finally, we describe a prototype implementation of the P3HR system that we developed reflecting the special view of Japanese society. The most important advantage of the P3HR system over other existing systems is that most likely the P3HR system provides complete privacy protection without losing data accuracy. Unlike traditional partially anonymous health records (e.g., using k-anonymity or l-diversity), the health records in P3HR are closer to complete anonymity, and yet preserve data accuracy. Our approach makes it very unlikely that patients could be identified by an attacker from their anonymous health records in the P3HR system.

  2. DECENTRALIZED SOCIAL NETWORK SERVICE USING THE WEB HOSTING SERVER FOR PRIVACY PRESERVATION

    Directory of Open Access Journals (Sweden)

    Yoonho Nam

    2013-10-01

    Full Text Available In recent years, the number of subscribers of social network services such as Facebook and Twitter has increased rapidly. With the increasing popularity of social network services, concerns about user privacy are also growing. Existing social network services have a centralized structure in which a service provider collects all of the user’s profile data and logs until the end of the connection. The information collected is typically useful for commercial purposes, but may lead to serious violations of user privacy. The user’s profile can be compromised for malicious purposes, and may even become an extreme tool of surveillance. In this paper, we remove the centralized structure to prevent the service provider from collecting all users’ information indiscriminately, and present a decentralized structure using web hosting servers. The service provider provides only the service applications to web hosting companies, and the user selects a web hosting company that he or she trusts. Thus, the user’s information is distributed, and the user’s privacy is protected from the service provider.

  3. New Technology "Clouds" Student Data Privacy

    Science.gov (United States)

    Krueger, Keith R.; Moore, Bob

    2015-01-01

    As technology has leaped forward to provide valuable learning tools, parents and policy makers have begun raising concerns about the privacy of student data that schools and systems have. Federal laws are intended to protect students and their families but they have not and will never be able to keep up with rapidly evolving technology. School…

  4. 78 FR 9678 - Multi-stakeholder Process To Develop a Voluntary Code of Conduct for Smart Grid Data Privacy

    Science.gov (United States)

    2013-02-11

    ... providing consumer energy use services. DATES: Tuesday, February 26, 2013 (9:30 a.m. to 4:30 p.m., Eastern... Privacy and Promoting Innovation in the Global Digital Economy \\2\\ (Privacy Blueprint). The Privacy Blueprint outlines a multi-stakeholder process for developing voluntary codes of conduct that, if adopted by...

  5. The Effect of Agile Workspace and Remote Working on Experiences of Privacy, Crowding and Satisfaction

    Directory of Open Access Journals (Sweden)

    Trevor Keeling

    2015-08-01

    Full Text Available Occupant density is an important and basic metric of space use efficiency. It affects user experience of privacy, crowding and satisfaction. The effect of agile working has been twofold. Firstly, offices have an increasing range of workspace settings such as break-out space, collaborative space and contemplative space, in contrast to the traditional workspace settings of assigned desks and formal meeting rooms. Secondly, office workers have become increasingly mobile as they are able to work from a greater variety of locations both in and out of their main place of work. This study asks whether workers who occupy agile workspaces and those with greater mobility experience privacy differently from workers with more conventional offices and work patterns. The experience of privacy can be considered in terms of retreat from people, control of information flow and control of interactions. Our results show that agile workspaces improve the ability to control information compared with open plan offices. It was also found that highly mobile workers are more sensitive to the negative effects of interacting with people. From this, a taxonomy of offices is defined in terms of the features that contribute to the experience of privacy.

  6. DESIGN AND IMPLEMENTATION OF A PRIVACY PRESERVED OFF-PREMISES CLOUD STORAGE

    OpenAIRE

    Sarfraz Nawaz Brohi; Mervat Adib Bamiah; Suriayati Chuprat; Jamalul-lail Ab Manan

    2014-01-01

    Despite the cost-effective and flexible characteristics of cloud computing, some clients are reluctant to adopt this paradigm due to emerging security and privacy concerns. Organizations such as those in the healthcare and payment card industries, where confidentiality of information is vital, are not ready to trust the security techniques and privacy policies offered by cloud service providers. Malicious attackers have violated cloud storages to steal, view, manipulate and tamper client&...

  7. Enhancing Privacy for Biometric Identification Cards

    Directory of Open Access Journals (Sweden)

    2009-01-01

    Full Text Available Most developed countries have started the implementation of biometric electronic identification cards, especially passports. The European Union and the United States of America struggle to introduce and standardize these electronic documents. Due to the personal nature of the biometric elements used for the generation of these cards, privacy issues were raised on both sides of the Atlantic Ocean, leading to civilian protests and concerns. The lack of transparency from the public authorities responsible for the implementation of such identification systems, and the poor technological approaches chosen by these authorities, are the main reasons for the negative popularity of the new identification methods. The following article shows an approach that provides all the benefits of modern technological advances in the fields of biometrics and cryptography, without sacrificing the privacy of those that will be the beneficiaries of the new system.

  8. FCJ-195 Privacy, Responsibility, and Human Rights Activism

    Directory of Open Access Journals (Sweden)

    Becky Kazansky

    2015-06-01

    Full Text Available In this article, we argue that many difficulties associated with the protection of digital privacy are rooted in the framing of privacy as a predominantly individual responsibility. We examine how models of privacy protection, such as Notice and Choice, contribute to the ‘responsibilisation’ of human rights activists who rely on the use of technologies for their work. We also consider how a group of human rights activists countered technology-mediated threats that this ‘responsibilisation’ causes by developing a collective approach to address their digital privacy and security needs. We conclude this article by discussing how technological tools used to maintain or counter the loss of privacy can be improved in order to support the privacy and digital security of human rights activists.

  9. A framework for privacy and security analysis of probe-based traffic information systems

    KAUST Repository

    Canepa, Edward S.; Claudel, Christian G.

    2013-01-01

    Most large-scale traffic information systems rely on fixed sensors (e.g. loop detectors, cameras) and user-generated data, the latter in the form of GPS traces sent by smartphones or GPS devices onboard vehicles. While this type of data is relatively inexpensive to gather, it can pose multiple security and privacy risks, even if the location tracks are anonymous. In particular, creating bogus location tracks and sending them to the system is relatively easy. This bogus data could perturb traffic flow estimates, and disrupt the transportation system whenever these estimates are used for actuation. In this article, we propose a new framework for solving a variety of privacy and cybersecurity problems arising in transportation systems. The state of traffic is modeled by the Lighthill-Whitham-Richards traffic flow model, which is a first order scalar conservation law with concave flux function. Given a set of traffic flow data, we show that the constraints resulting from this partial differential equation are mixed integer linear inequalities for some decision variable. The resulting framework is very flexible, and can in particular be used to detect spoofing attacks in real time, or carry out attacks on location tracks. Numerical implementations are performed on experimental data from the Mobile Century experiment to validate this framework. © 2013 ACM.

  10. 48 CFR 352.224-70 - Privacy Act.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Privacy Act. 352.224-70... SOLICITATION PROVISIONS AND CONTRACT CLAUSES Texts of Provisions and Clauses 352.224-70 Privacy Act. As prescribed in 324.103(b)(2), the Contracting Officer shall insert the following clause: Privacy Act (January...

  11. Access to Information and Privacy | IDRC - International ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    As a Crown corporation, IDRC is subject to Canada's laws on access to information and privacy protection. The following resources will help you learn more about IDRC and the access to information and privacy acts, including instructions for submitting an access to information or privacy act (ATIP) request. IDRC and ATIP ...

  12. The Convergence of Virtual Reality and Social Networks: Threats to Privacy and Autonomy.

    Science.gov (United States)

    O'Brolcháin, Fiachra; Jacquemard, Tim; Monaghan, David; O'Connor, Noel; Novitzky, Peter; Gordijn, Bert

    2016-02-01

    The rapid evolution of information, communication and entertainment technologies will transform the lives of citizens and ultimately transform society. This paper focuses on ethical issues associated with the likely convergence of virtual realities (VR) and social networks (SNs), hereafter VRSNs. We examine a scenario in which a significant segment of the world's population has a presence in a VRSN. Given the pace of technological development and the popularity of these new forms of social interaction, this scenario is plausible. However, it brings with it ethical problems. Two central ethical issues are addressed: those of privacy and those of autonomy. VRSNs pose threats to both privacy and autonomy. The threats to privacy can be broadly categorized as threats to informational privacy, threats to physical privacy, and threats to associational privacy. Each of these threats is further subdivided. The threats to autonomy can be broadly categorized as threats to freedom, to knowledge and to authenticity. Again, these three threats are divided into subcategories. Having categorized the main threats posed by VRSNs, a number of recommendations are provided so that policy-makers, developers, and users can make the best possible use of VRSNs.

  13. A New Heuristic Anonymization Technique for Privacy Preserved Datasets Publication on Cloud Computing

    Science.gov (United States)

    Aldeen Yousra, S.; Mazleena, Salleh

    2018-05-01

    Recent advancements in Information and Communication Technologies (ICT) have led many cloud services to share users’ private data. Data from various organizations are a vital information source for analysis and research. Generally, this sensitive or private data involves medical, census, voter registration, social network, and customer service records. A primary concern of cloud service providers in data publishing is to hide the sensitive information of individuals. One of the cloud services that addresses these confidentiality concerns is Privacy Preserving Data Mining (PPDM). The PPDM service in Cloud Computing (CC) enables data publishing with minimized distortion and absolute privacy. In this method, datasets are anonymized via generalization to accomplish the privacy requirements. However, the well-known privacy preserving data mining technique called K-anonymity suffers from several limitations. To surmount those shortcomings, I propose a new heuristic anonymization framework for preserving the privacy of sensitive datasets when publishing on the cloud. The advantages of the K-anonymity, L-diversity and (α, k)-anonymity methods for efficient information utilization and privacy protection are emphasized. Experimental results revealed the superiority of the developed technique over the K-anonymity, L-diversity, and (α, k)-anonymity measures.
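
    As a concrete reference point for the K-anonymity baseline the abstract compares against, the sketch below checks whether a generalised table satisfies k-anonymity over a chosen set of quasi-identifiers; the column names and generalisation levels are purely illustrative.

        from collections import Counter

        def satisfies_k_anonymity(rows, quasi_identifiers, k):
            """True if every combination of quasi-identifier values occurs at least k times."""
            counts = Counter(tuple(row[q] for q in quasi_identifiers) for row in rows)
            return all(c >= k for c in counts.values())

        # Illustrative generalised records: exact ages replaced by ranges, ZIP codes truncated.
        records = [
            {"age": "30-39", "zip": "481**", "diagnosis": "flu"},
            {"age": "30-39", "zip": "481**", "diagnosis": "asthma"},
            {"age": "40-49", "zip": "482**", "diagnosis": "flu"},
            {"age": "40-49", "zip": "482**", "diagnosis": "diabetes"},
        ]
        print(satisfies_k_anonymity(records, quasi_identifiers=("age", "zip"), k=2))  # True

    L-diversity and (α, k)-anonymity add further conditions on the distribution of sensitive values within each such group, which is what the proposed framework draws on.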

  14. PRIVACY PROTECTION PROBLEMS IN SOCIAL NETWORKS

    OpenAIRE

    OKUR, M. Cudi

    2011-01-01

    Protecting privacy has become a major concern for most social network users because of increased difficulties of controlling the online data. This article presents an assessment of the common privacy related risks of social networking sites. Open and hidden privacy risks of active and passive online profiles are examined and increasing share of social networking in these phenomena is discussed. Inadequacy of available legal and institutional protection is demonstrated and the effectiveness of...

  15. Facebook: Personality and privacy on profiles

    OpenAIRE

    Casado Riera, Carla; Oberst, Ursula; Carbonell, Xavier

    2015-01-01

    The aim of this study was to examine the possible relationship between the privacy settings in Facebook profiles and two personality dimensions, extraversion and neuroticism, in relation to gender. The Privacy on Facebook Questionnaire and the Eysenck Personality Inventory were applied to a sample of 92 women and 70 men, all users of Facebook. No significant relationship was found between extraversion or neuroticism and the privacy settings of Facebook profiles, but the results showed significant...

  16. Privacy enhancing techniques - the key to secure communication and management of clinical and genomic data.

    Science.gov (United States)

    De Moor, G J E; Claerhout, B; De Meyer, F

    2003-01-01

    To introduce some of the privacy protection problems related to genomics based medicine and to highlight the relevance of Trusted Third Parties (TTPs) and of Privacy Enhancing Techniques (PETs) in the restricted context of clinical research and statistics. Practical approaches based on two different pseudonymisation models, both for batch and interactive data collection and exchange, are described and analysed. The growing need of managing both clinical and genetic data raises important legal and ethical challenges. Protecting human rights in the realm of privacy, while optimising research potential and other statistical activities is a challenge that can easily be overcome with the assistance of a trust service provider offering advanced privacy enabling/enhancing solutions. As such, the use of pseudonymisation and other innovative Privacy Enhancing Techniques can unlock valuable data sources.
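
    A common building block behind pseudonymisation services of this kind is a keyed one-way transform held by the trust service provider, so that the same patient always maps to the same pseudonym while the mapping cannot be reversed or reproduced without the key. The sketch below illustrates that single building block under those assumptions; it is not the specific batch or interactive pseudonymisation models described in the paper.

        import hmac
        import hashlib

        def pseudonymise(identifier: str, ttp_key: bytes) -> str:
            """Derive a stable, non-reversible pseudonym from a patient identifier.

            HMAC-SHA256 keyed by a secret held at the trusted third party: the same
            identifier always yields the same pseudonym (so records can be linked),
            but without the key the original identifier cannot be recovered.
            """
            return hmac.new(ttp_key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

        ttp_key = b"example-secret-held-only-by-the-TTP"  # illustrative key material
        print(pseudonymise("patient-12345", ttp_key))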

  17. New threats to health data privacy.

    Science.gov (United States)

    Li, Fengjun; Zou, Xukai; Liu, Peng; Chen, Jake Y

    2011-11-24

    Along with the rapid digitalization of health data (e.g. Electronic Health Records), there is an increasing concern about maintaining data privacy while garnering the benefits, especially when the data are required to be published for secondary use. Most of the current research on protecting health data privacy is centered around data de-identification and data anonymization, which removes the identifiable information from the published health data to prevent an adversary from reasoning about the privacy of the patients. However, published health data is not the only source that the adversaries can count on: with a large amount of information that people voluntarily share on the Web, sophisticated attacks that join disparate information pieces from multiple sources against health data privacy become practical. Limited efforts have been devoted to studying these attacks yet. We study how patient privacy could be compromised with the help of today's information technologies. In particular, we show that private healthcare information could be collected by aggregating and associating disparate pieces of information from multiple online data sources including online social networks, public records and search engine results. We demonstrate a real-world case study to show user identity and privacy are highly vulnerable to the attribution, inference and aggregation attacks. We also show that people are highly identifiable to adversaries even with inaccurate information pieces about the target, with real data analysis. We claim that so much information has been made available electronically and online that people are very vulnerable without effective privacy protection.

  18. New threats to health data privacy

    Directory of Open Access Journals (Sweden)

    Li Fengjun

    2011-11-01

    Full Text Available Abstract Background Along with the rapid digitalization of health data (e.g. Electronic Health Records), there is an increasing concern about maintaining data privacy while garnering the benefits, especially when the data are required to be published for secondary use. Most of the current research on protecting health data privacy is centered around data de-identification and data anonymization, which removes the identifiable information from the published health data to prevent an adversary from reasoning about the privacy of the patients. However, published health data is not the only source that the adversaries can count on: with a large amount of information that people voluntarily share on the Web, sophisticated attacks that join disparate information pieces from multiple sources against health data privacy become practical. Limited efforts have been devoted to studying these attacks yet. Results We study how patient privacy could be compromised with the help of today’s information technologies. In particular, we show that private healthcare information could be collected by aggregating and associating disparate pieces of information from multiple online data sources including online social networks, public records and search engine results. We demonstrate a real-world case study to show user identity and privacy are highly vulnerable to the attribution, inference and aggregation attacks. We also show that people are highly identifiable to adversaries even with inaccurate information pieces about the target, with real data analysis. Conclusion We claim that so much information has been made available electronically and online that people are very vulnerable without effective privacy protection.

  19. Security and Privacy Qualities of Medical Devices: An Analysis of FDA Postmarket Surveillance

    Science.gov (United States)

    Kramer, Daniel B.; Baker, Matthew; Ransford, Benjamin; Molina-Markham, Andres; Stewart, Quinn; Fu, Kevin; Reynolds, Matthew R.

    2012-01-01

    Background Medical devices increasingly depend on computing functions such as wireless communication and Internet connectivity for software-based control of therapies and network-based transmission of patients’ stored medical information. These computing capabilities introduce security and privacy risks, yet little is known about the prevalence of such risks within the clinical setting. Methods We used three comprehensive, publicly available databases maintained by the Food and Drug Administration (FDA) to evaluate recalls and adverse events related to security and privacy risks of medical devices. Results Review of weekly enforcement reports identified 1,845 recalls; 605 (32.8%) of these included computers, 35 (1.9%) stored patient data, and 31 (1.7%) were capable of wireless communication. Searches of databases specific to recalls and adverse events identified only one event with a specific connection to security or privacy. Software-related recalls were relatively common, and most (81.8%) mentioned the possibility of upgrades, though only half of these provided specific instructions for the update mechanism. Conclusions Our review of recalls and adverse events from federal government databases reveals sharp inconsistencies with databases at individual providers with respect to security and privacy risks. Recalls related to software may increase security risks because of unprotected update and correction mechanisms. To detect signals of security and privacy problems that adversely affect public health, federal postmarket surveillance strategies should rethink how to effectively and efficiently collect data on security and privacy problems in devices that increasingly depend on computing systems susceptible to malware. PMID:22829874

  20. Sexiled: Privacy Acquisition Strategies of College Roommates

    Science.gov (United States)

    Erlandson, Karen

    2014-01-01

    This study sought to understand how roommates make privacy bids in college residence halls. The results indicate that privacy for sexual activity is a problem for students living in college residence halls, as almost all participants (82%) reported having dealt with this issue. Two sets of responses were collected and analyzed: privacy acquisition…

  1. System And Method For Monitoring Traffic While Preserving Personal Privacy

    KAUST Repository

    Canepa, Edward

    2015-08-06

    A traffic monitoring system and method for mapping traffic speed and density while preserving privacy. The system can include fixed stations that make up a network and mobile probes that are associated with vehicles. The system and method do not gather, store, or transmit any unique or identifying information, and thereby preserves the privacy of members of traffic. The system and method provide real-time traffic density and speed mapping. The system and method can further be integrated with a complementary flood monitoring system and method.

  2. Privacy and CHI : methodologies for studying privacy issues

    NARCIS (Netherlands)

    Patil, S.; Romero, N.A.; Karat, J.

    2006-01-01

    This workshop aims to reflect on methodologies to empirically study privacy issues related to advanced technology. The goal is to address methodological concerns by drawing upon both theoretical perspectives as well as practical experiences.

  3. Privacy Perspectives for Online Searchers: Confidentiality with Confidence?

    Science.gov (United States)

    Duberman, Josh; Beaudet, Michael

    2000-01-01

    Presents issues and questions involved in online privacy from the information professional's perspective. Topics include consumer concerns; query confidentiality; securing computers from intrusion; electronic mail; search engines; patents and intellectual property searches; government's role; Internet service providers; database mining; user…

  4. Patient Privacy in the Era of Big Data

    Directory of Open Access Journals (Sweden)

    Mehmet Kayaalp

    2018-02-01

    Full Text Available Protecting patient privacy requires various technical tools. It involves regulations for sharing, de-identifying, securely storing, transmitting and handling protected health information (PHI). It involves privacy laws and legal agreements. It requires establishing rules for monitoring privacy leaks, determining actions when they occur, and handling de-identified clinical narrative reports. Deidentification is one such indispensable instrument in this set of privacy tools.

  5. Biobanking and Privacy in India.

    Science.gov (United States)

    Chaturvedi, Sachin; Srinivas, Krishna Ravi; Muthuswamy, Vasantha

    2016-03-01

    Biobank-based research is not specifically addressed in Indian statutory law and therefore Indian Council for Medical Research guidelines are the primary regulators of biobank research in India. The guidelines allow for broad consent and for any level of identification of specimens. Although privacy is a fundamental right under the Indian Constitution, courts have limited this right when it conflicts with other rights or with the public interest. Furthermore, there is no established privacy test or actionable privacy right in the common law of India. In order to facilitate biobank-based research, both of these lacunae should be addressed by statutory law specifically addressing biobanking and more directly addressing the accompanying privacy concerns. A biobank-specific law should be written with international guidelines in mind, but harmonization with other laws should not be attempted until after India has created a law addressing biobank research within the unique legal and cultural environment of India. © 2016 American Society of Law, Medicine & Ethics.

  6. Maintaining the privacy of a minor's sexual orientation and gender identity in the medical environment.

    Science.gov (United States)

    Hyatt, Josh

    2015-01-01

    Dealing with self-identity, sexual orientation, and gender identity is often a struggle for minors. The potential negative outcomes minors face when their sexual orientation or gender identity is disclosed to others before they have an opportunity to address it in their own time has become more evident in the media. Because of the intimate nature of the provider-patient relationship, the healthcare provider may be the first person in whom they confide. If a minor receives a positive, nonjudgmental experience from his or her provider, it will often lead to a more positive self-image, whereas a negative, judgmental experience will often result in the opposite. Critical components of their experience are a sense of trust that the provider will keep the information confidential and the healthcare setting being organized in a manner that promotes privacy. Healthcare providers play a key role in developing and projecting a safe, comfortable environment where the minor can discretely discuss issues of sexual orientation and gender identity. Establishing this environment will usually facilitate a positive therapeutic relationship between the minor and the provider. Steps healthcare providers can take to achieve trust from minor patients and ensure confidentiality of sensitive information are understanding privacy laws, making privacy a priority, getting consent, training staff, and demonstrating privacy in the environment. © 2015 American Society for Healthcare Risk Management of the American Hospital Association.

  7. Provable Fair Document Exchange Protocol with Transaction Privacy for E-Commerce

    Directory of Open Access Journals (Sweden)

    Ren-Junn Hwang

    2015-04-01

    Full Text Available Transaction privacy has attracted a lot of attention in the e-commerce. This study proposes an efficient and provable fair document exchange protocol with transaction privacy. Using the proposed protocol, any untrusted parties can fairly exchange documents without the assistance of online, trusted third parties. Moreover, a notary only notarizes each document once. The authorized document owner can exchange a notarized document with different parties repeatedly without disclosing the origin of the document or the identities of transaction participants. Security and performance analyses indicate that the proposed protocol not only provides strong fairness, non-repudiation of origin, non-repudiation of receipt, and message confidentiality, but also enhances forward secrecy, transaction privacy, and authorized exchange. The proposed protocol is more efficient than other works.

  8. 10 CFR 1304.103 - Privacy Act inquiries.

    Science.gov (United States)

    2010-01-01

    ... writing may be sent to: Privacy Act Officer, U.S. Nuclear Waste Technical Review Board, 2300 Clarendon... NUCLEAR WASTE TECHNICAL REVIEW BOARD PRIVACY ACT OF 1974 § 1304.103 Privacy Act inquiries. (a) Requests... contains a record pertaining to him or her may file a request in person or in writing, via the internet, or...

  9. The role of privacy protection in healthcare information systems adoption.

    Science.gov (United States)

    Hsu, Chien-Lung; Lee, Ming-Ren; Su, Chien-Hui

    2013-10-01

    Privacy protection is an important issue and challenge in healthcare information systems (HISs). Recently, some privacy-enhanced HISs are proposed. Users' privacy perception, intention, and attitude might affect the adoption of such systems. This paper aims to propose a privacy-enhanced HIS framework and investigate the role of privacy protection in HISs adoption. In the proposed framework, privacy protection, access control, and secure transmission modules are designed to enhance the privacy protection of a HIS. An experimental privacy-enhanced HIS is also implemented. Furthermore, we proposed a research model extending the unified theory of acceptance and use of technology by considering perceived security and information security literacy and then investigate user adoption of a privacy-enhanced HIS. The experimental results and analyses showed that user adoption of a privacy-enhanced HIS is directly affected by social influence, performance expectancy, facilitating conditions, and perceived security. Perceived security has a mediating effect between information security literacy and user adoption. This study proposes several implications for research and practice to improve designing, development, and promotion of a good healthcare information system with privacy protection.

  10. A Model-Based Privacy Compliance Checker

    OpenAIRE

    Siani Pearson; Damien Allison

    2009-01-01

    Increasingly, e-business organisations are coming under pressure to be compliant to a range of privacy legislation, policies and best practice. There is a clear need for high-level management and administrators to be able to assess in a dynamic, customisable way the degree to which their enterprise complies with these. We outline a solution to this problem in the form of a model-driven automated privacy process analysis and configuration checking system. This system models privacy compliance ...

  11. Privacy Law and Print Photojournalism.

    Science.gov (United States)

    Dykhouse, Caroline Dow

    Reviews of publications about privacy law, of recent court actions, and of interviews with newspaper photographers and attorneys indicate that torts of privacy often conflict with the freedoms to publish and to gather news. Although some guidelines have already been established (about running distorted pictures, "stealing" pictures, taking…

  12. Privacy Act

    Science.gov (United States)

    Learn about the Privacy Act of 1974, the Electronic Government Act of 2002, the Federal Information Security Management Act, and other information about how the Environmental Protection Agency maintains its records.

  13. Can the obstacles to privacy self-management be overcome? Exploring the consent intermediary approach

    Directory of Open Access Journals (Sweden)

    Tuukka Lehtiniemi

    2017-07-01

    Full Text Available In privacy self-management, people are expected to perform cost–benefit analysis on the use of their personal data, and only consent when their subjective benefits outweigh the costs. However, the ubiquitous collection of personal data and Big Data analytics present increasing challenges to successful privacy management. A number of services and research initiatives have proposed similar solutions to provide people with more control over their data by consolidating consent decisions under a single interface. We have named this the 'consent intermediary' approach. In this paper, we first identify the eight obstacles to privacy self-management which make cost–benefit analysis conceptually and practically challenging. We then analyse to what extent consent intermediaries can help overcome the obstacles. We argue that simply bringing consent decisions under one interface offers limited help, but that the potential of this approach lies in leveraging the intermediary position to provide aids for privacy management. We find that with suitable tools, some of the more practical obstacles indeed can become solvable, while others remain fundamentally insuperable within the individuated privacy self-management model. Attention should also be paid to how the consent intermediaries may take advantage of the power vested in the intermediary positions between users and other services.

  14. Achieving Better Privacy for the 3GPP AKA Protocol

    Directory of Open Access Journals (Sweden)

    Fouque Pierre-Alain

    2016-10-01

    Full Text Available Proposed by the 3rd Generation Partnership Project (3GPP) as a standard for 3G and 4G mobile-network communications, the AKA protocol is meant to provide a mutually-authenticated key-exchange between clients and associated network servers. As a result AKA must guarantee the indistinguishability from random of the session keys (key-indistinguishability), as well as client- and server-impersonation resistance. A paramount requirement is also that of client privacy, which 3GPP defines in terms of: user identity confidentiality, service untraceability, and location untraceability. Moreover, since servers are sometimes untrusted (in the case of roaming), the AKA protocol must also protect clients with respect to these third parties. Following the description of client-tracking attacks, e.g. by using error messages or IMSI catchers, van den Broek et al. and Arapinis et al. each proposed a new variant of AKA, addressing such problems. In this paper we use the approach of provable security to show that these variants still fail to guarantee the privacy of mobile clients. We propose an improvement of AKA, which retains most of its structure and respects practical necessities such as key-management, but which provably attains security with respect to servers and Man-in-the-Middle (MiM) adversaries. Moreover, it is impossible to link client sessions in the absence of client-corruptions. Finally, we prove that any variant of AKA retaining its mutual authentication specificities cannot achieve client-unlinkability in the presence of corruptions. In this sense, our proposed variant is optimal.

  15. Privacy-preserving recommender systems in dynamic environments

    NARCIS (Netherlands)

    Erkin, Z.; Veugen, T.; Lagendijk, R.L.

    2013-01-01

    Recommender systems play a crucial role today in on-line applications as they improve customer satisfaction and, at the same time, increase the profit for the service provider. However, there are serious privacy concerns, as such systems rely on the personal data of the customers.

  16. Story Lab: Student Data Privacy

    Science.gov (United States)

    Herold, Benjamin

    2015-01-01

    Student data privacy is an increasingly high-profile--and controversial--issue that touches schools and families across the country. There are stories to tell in virtually every community. About three dozen states have passed legislation addressing student data privacy in the past two years, and eight different proposals were floating around…

  17. Ensuring privacy in the study of pathogen genetics.

    Science.gov (United States)

    Mehta, Sanjay R; Vinterbo, Staal A; Little, Susan J

    2014-08-01

    Rapid growth in the genetic sequencing of pathogens in recent years has led to the creation of large sequence databases. This aggregated sequence data can be very useful for tracking and predicting epidemics of infectious diseases. However, the balance between the potential public health benefit and the risk to personal privacy for individuals whose genetic data (personal or pathogen) are included in such work has been difficult to delineate, because neither the true benefit nor the actual risk to participants has been adequately defined. Existing approaches to minimise the risk of privacy loss to participants are based on de-identification of data by removal of a predefined set of identifiers. These approaches neither guarantee privacy nor protect the usefulness of the data. We propose a new approach to privacy protection that will quantify the risk to participants, while still maximising the usefulness of the data to researchers. This emerging standard in privacy protection and disclosure control, which is known as differential privacy, uses a process-driven rather than data-centred approach to protecting privacy. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. Privacy as virtue: searching for a new privacy paradigm in the age of Big Data

    NARCIS (Netherlands)

    van der Sloot, B.; Beyvers, E.; Helm, P.; Hennig, M.; Keckeis, C.; Kreknin, I.; Püschel, F.

    2017-01-01

    Originally, privacy was conceived primarily as a duty of the state not to abuse its powers. It could not, for example, enter a private house without legitimate reason or reasonable suspicion that the owner of the house had engaged in, for example, criminal conduct. Gradually, however, privacy has been

  19. Security measures required for HIPAA privacy.

    Science.gov (United States)

    Amatayakul, M

    2000-01-01

    HIPAA security requirements include administrative, physical, and technical services and mechanisms to safeguard confidentiality, availability, and integrity of health information. Security measures, however, must be implemented in the context of an organization's privacy policies. Because HIPAA's proposed privacy rules are flexible and scalable to account for the nature of each organization's business, size, and resources, each organization will be determining its own privacy policies within the context of the HIPAA requirements and its security capabilities. Security measures cannot be implemented in a vacuum.

  20. Protecting privacy in a clinical data warehouse.

    Science.gov (United States)

    Kong, Guilan; Xiao, Zhichun

    2015-06-01

    Peking University has several prestigious teaching hospitals in China. To make secondary use of massive medical data for research purposes, construction of a clinical data warehouse is imperative in Peking University. However, a big concern for clinical data warehouse construction is how to protect patient privacy. In this project, we propose to use a combination of symmetric block ciphers, asymmetric ciphers, and cryptographic hashing algorithms to protect patient privacy information. The novelty of our privacy protection approach lies in message-level data encryption, the key caching system, and the cryptographic key management system. The proposed privacy protection approach is scalable to clinical data warehouse construction with any size of medical data. With the composite privacy protection approach, the clinical data warehouse can be secure enough to keep the confidential data from leaking to the outside world. © The Author(s) 2014.
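
    As a small illustration of one ingredient of such an approach, the sketch below pseudonymises patient identifiers with a keyed hash (HMAC-SHA256), so records can still be linked inside the warehouse without exposing the original identifier. This is a simplified assumption: the full design described above also involves message-level encryption with symmetric and asymmetric ciphers, key caching, and key management, none of which is shown here.

    import hmac, hashlib, os

    PSEUDONYM_KEY = os.urandom(32)   # in practice supplied by the key-management system

    def pseudonymize(patient_id: str) -> str:
        # Deterministic: the same identifier always maps to the same pseudonym,
        # which preserves joinability across tables without revealing the identifier.
        return hmac.new(PSEUDONYM_KEY, patient_id.encode("utf-8"), hashlib.sha256).hexdigest()

    record = {"patient_id": "PKU-000123", "diagnosis": "I10"}   # hypothetical example record
    record["patient_id"] = pseudonymize(record["patient_id"])
    print(record)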

  1. 'Privacy lost - and found?' : the information value chain as a model to meet citizens' concerns

    NARCIS (Netherlands)

    van de Pas, John; van Bussel, Geert-Jan

    2015-01-01

    In this paper we explore the extent to which privacy enhancing technologies (PETs) could be effective in providing privacy to citizens. Rapid development of ubiquitous computing and ‘the internet of things’ are leading to Big Data and the application of Predictive Analytics, effectively merging the

  2. Kids Sell: Celebrity Kids’ Right to Privacy

    Directory of Open Access Journals (Sweden)

    Seong Choul Hong

    2016-04-01

    Full Text Available The lives of celebrities are often spotlighted in the media because of their newsworthiness; however, many celebrities argue that their right to privacy is often infringed upon. Concerns about celebrity privacy are not limited to the celebrities themselves and often extend to their children. As a result of their popularity, public interest has pushed paparazzi and journalists to pursue trivial and private details about the lives of both celebrities and their children. This paper investigates the areas of conflict where the right to privacy and the right to know collide when dealing with the children of celebrities. In general, the courts have been unsympathetic to celebrity privacy claims, noting their newsworthiness and self-promotional character. Unless the press violates news-gathering ethics or commits a tort, the courts will often rule in favor of the media. However, the story becomes quite different when the privacy of celebrities’ children is infringed. This paper argues that all children have a right to protect their privacy regardless of their parents’ social status. Children of celebrities should not be exempt from principles of privacy just because a parent is a celebrity. Furthermore, they should not be exposed by the media without the voluntary consent of their legal guardians. That is, the right of the media to publish and the newsworthiness of celebrities’ children must be acknowledged only within narrow limits.

  3. Disclosing genetic information to at-risk relatives: new Australian privacy principles, but uniformity still elusive.

    Science.gov (United States)

    Otlowski, Margaret F A

    2015-04-06

    There is growing understanding of the need for genetic information to be shared with genetic relatives in some circumstances. Since 2006, s 95AA of the Privacy Act 1988 (Cwlth) has permitted the disclosure of genetic information to genetic relatives without the patient's consent, provided that the health practitioner reasonably believes that disclosure is necessary to lessen or prevent a serious threat to the life, health or safety of the genetic relatives. Enabling guidelines were introduced in 2009. These were limited to the private sector, and excluded doctors working in the public sector at both Commonwealth and state and territory levels. Privacy legislation was amended in March 2014, and new Australian Privacy Principles, which replace the National Privacy Principles and Information Privacy Principles, now cover the collection and use of personal information. The Privacy Act and the Australian Privacy Principles now extend to practitioners employed by the Commonwealth but not to health practitioners working in state and territory public hospitals. In this article, I review these legislative developments and highlight the implications of the lack of uniformity and the consequent need for a collaborative, uniform approach by states and territories.

  4. An overview of human genetic privacy.

    Science.gov (United States)

    Shi, Xinghua; Wu, Xintao

    2017-01-01

    The study of human genomics is becoming a Big Data science, owing to recent biotechnological advances leading to availability of millions of personal genome sequences, which can be combined with biometric measurements from mobile apps and fitness trackers, and of human behavior data monitored from mobile devices and social media. With increasing research opportunities for integrative genomic studies through data sharing, genetic privacy emerges as a legitimate yet challenging concern that needs to be carefully addressed, not only for individuals but also for their families. In this paper, we present potential genetic privacy risks and relevant ethics and regulations for sharing and protecting human genomics data. We also describe the techniques for protecting human genetic privacy from three broad perspectives: controlled access, differential privacy, and cryptographic solutions. © 2016 New York Academy of Sciences.

  5. An overview of human genetic privacy

    Science.gov (United States)

    Shi, Xinghua; Wu, Xintao

    2016-01-01

    The study of human genomics is becoming a Big Data science, owing to recent biotechnological advances leading to availability of millions of personal genome sequences, which can be combined with biometric measurements from mobile apps and fitness trackers, and of human behavior data monitored from mobile devices and social media. With increasing research opportunities for integrative genomic studies through data sharing, genetic privacy emerges as a legitimate yet challenging concern that needs to be carefully addressed, not only for individuals but also for their families. In this paper, we present potential genetic privacy risks and relevant ethics and regulations for sharing and protecting human genomics data. We also describe the techniques for protecting human genetic privacy from three broad perspectives: controlled access, differential privacy, and cryptographic solutions. PMID:27626905

  6. 17 CFR 160.2 - Model privacy form and examples.

    Science.gov (United States)

    2010-04-01

    ... examples. 160.2 Section 160.2 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION PRIVACY OF CONSUMER FINANCIAL INFORMATION § 160.2 Model privacy form and examples. (a) Model privacy form..., although use of the model privacy form is not required. (b) Examples. The examples in this part are not...

  7. Privacy-preserving Identity Management

    OpenAIRE

    Milutinovic, Milica

    2015-01-01

    With the technological advances and the evolution of online services, user privacy is becoming a crucial issue in the modern day society. Privacy in the general sense refers to individuals’ ability to protect information about themselves and selectively present it to other entities. This concept is nowadays strongly affected by everyday practices that assume personal data disclosure, such as online shopping and participation in loyalty schemes. This makes it difficult for an individual to con...

  8. Legal Aspects of a Location-Based Mobile Advertising Platform

    DEFF Research Database (Denmark)

    Cleff, Evelyne Beatrix; Gidofalvi, Gyozo

    2008-01-01

    Recent advances in communication and information technology, such as the increasing accuracy of GPS technology and the miniaturization of wireless communication devices, pave the way for Location-Based Services. Among these services, mobile advertising is predicted to represent a high-yield revenue...... stream. In this article the possibilities of using a location-aware mobile messenger for the purpose of mobile advertising will be introduced. However, mobile advertising may become an extremely intrusive practice if the user's privacy is not taken into account. The objective of this article is therefore...

  9. The Privacy Problem: Although School Librarians Seldom Discuss It, Students' Privacy Rights Are under Attack

    Science.gov (United States)

    Adams, Helen R.

    2011-01-01

    Every day in school libraries nationwide, students' privacy rights are under attack, but many principals, teachers, parents, and community members do not know much about these rights. Even though school librarians are among the strongest proponents of privacy, the subject is rarely discussed, probably because state and federal laws can be…

  10. Genetic secrets: Protecting privacy and confidentiality in the genetic era. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Rothstein, M.A. [ed.

    1998-09-01

    Few developments are likely to affect human beings more profoundly in the long run than the discoveries resulting from advances in modern genetics. Although the developments in genetic technology promise to provide many additional benefits, their application to genetic screening poses ethical, social, and legal questions, many of which are rooted in issues of privacy and confidentiality. The ethical, practical, and legal ramifications of these and related questions are explored in depth. The broad range of topics includes: the privacy and confidentiality of genetic information; the challenges to privacy and confidentiality that may be projected to result from the emerging genetic technologies; the role of informed consent in protecting the confidentiality of genetic information in the clinical setting; the potential uses of genetic information by third parties; the implications of changes in the health care delivery system for privacy and confidentiality; relevant national and international developments in public policies, professional standards, and laws; recommendations; and the identification of research needs.

  11. Effectiveness of Anonymization Methods in Preserving Patients' Privacy: A Systematic Literature Review.

    Science.gov (United States)

    Langarizadeh, Mostafa; Orooji, Azam; Sheikhtaheri, Abbas

    2018-01-01

    The ever-growing application of electronic health records (EHRs) has improved healthcare providers' communications, facilitated access to data for secondary use, and promoted the quality of services. Patient privacy has become a major issue today, since EHRs contain large amounts of critical information. Therefore, many privacy-preservation techniques have been proposed, and anonymization is a common one. This study aimed to investigate the effectiveness of anonymization in preserving patients' privacy. Articles published between 2005 and 2016 were included. Pubmed, Cochrane, IEEE and ScienceDirect were searched with a variety of related keywords. Finally, 18 articles were included. In the present study, the relevant anonymization issues were investigated in four categories: secondary use of anonymized data, re-identification risk, the effect of anonymization on information extraction, and the inadequacy of current methods for different document types. The results revealed that although anonymization cannot reduce the risk of re-identification to zero, if implemented correctly it can help preserve patients' privacy.
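
    The review does not single out one technique, but a common baseline for reasoning about re-identification risk is k-anonymity. The hypothetical sketch below computes the smallest equivalence-class size over a chosen set of quasi-identifiers; a value of 1 means at least one released record is unique on those attributes and therefore re-identifiable.

    from collections import Counter

    def k_anonymity(records, quasi_identifiers):
        # Smallest group size when records are grouped by their quasi-identifier values.
        groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
        return min(groups.values())

    released = [
        {"age": "30-39", "zip": "941**", "diagnosis": "flu"},
        {"age": "30-39", "zip": "941**", "diagnosis": "asthma"},
        {"age": "40-49", "zip": "100**", "diagnosis": "flu"},
    ]
    print(k_anonymity(released, ["age", "zip"]))   # 1: the third record is unique on (age, zip)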

  12. Towards context adaptive privacy decisions in ubiquitous computing

    NARCIS (Netherlands)

    Schaub, Florian; Könings, Bastian; Weber, M.; Kargl, Frank

    2012-01-01

    In ubiquitous systems control of privacy settings will be increasingly difficult due to the pervasive nature of sensing and communication capabilities. We identify challenges for privacy decisions in ubiquitous systems and propose a system for in situ privacy decision support. When context changes

  13. Privacy in Online Social Networking Sites

    OpenAIRE

    M.Ida Evones

    2015-01-01

    There are more than 192 active social networking websites. Bringing every kind of social group together in one place and letting them interact is a significant development. Huge amounts of information are processed on these sites each day, which ends up making them vulnerable to attack. There is no systematic framework taking into account the importance of privacy. Increased privacy settings do not always guarantee privacy when there is a loophole in the applications. Lack of user education results in over sh...

  14. Anonymous Search Histories Featuring Personalized Advertisement - Balancing Privacy with Economic Interests

    OpenAIRE

    Thorben Burghardt; Klemens Bohm; Achim Guttmann; Chris Clifton

    2011-01-01

    Search engines are key to finding information on the web. Search is presently free for users, financed by targeted advertisement. Today, the current search terms determine ad placement. In the near future, search-engine providers will make use of detailed user profiles for better ad placement. This puts user privacy at risk. Anonymizing search histories, which is a solution in principle, gives rise to a trade-off between privacy and the usability of the data for ad placement. This paper stu...

  15. Computer-Aided Identification and Validation of Privacy Requirements

    Directory of Open Access Journals (Sweden)

    Rene Meis

    2016-05-01

    Full Text Available Privacy is a software quality that is closely related to security. The main difference is that security properties aim at the protection of assets that are crucial for the considered system, and privacy aims at the protection of personal data that are processed by the system. The identification of privacy protection needs in complex systems is a hard and error prone task. Stakeholders whose personal data are processed might be overlooked, or the sensitivity and the need of protection of the personal data might be underestimated. The later personal data and the needs to protect them are identified during the development process, the more expensive it is to fix these issues, because the needed changes of the system-to-be often affect many functionalities. In this paper, we present a systematic method to identify the privacy needs of a software system based on a set of functional requirements by extending the problem-based privacy analysis (ProPAn method. Our method is tool-supported and automated where possible to reduce the effort that has to be spent for the privacy analysis, which is especially important when considering complex systems. The contribution of this paper is a semi-automatic method to identify the relevant privacy requirements for a software-to-be based on its functional requirements. The considered privacy requirements address all dimensions of privacy that are relevant for software development. As our method is solely based on the functional requirements of the system to be, we enable users of our method to identify the privacy protection needs that have to be addressed by the software-to-be at an early stage of the development. As initial evaluation of our method, we show its applicability on a small electronic health system scenario.

  16. Ethics and Privacy Implications of Using the Internet and Social Media to Recruit Participants for Health Research: A Privacy-by-Design Framework for Online Recruitment

    Science.gov (United States)

    Cyr, Alaina B; Arbuckle, Luk; Ferris, Lorraine E

    2017-01-01

    Background The Internet and social media offer promising ways to improve the reach, efficiency, and effectiveness of recruitment efforts at a reasonable cost, but raise unique ethical dilemmas. We describe how we used social media to recruit cancer patients and family caregivers for a research study, the ethical issues we encountered, and the strategies we developed to address them. Objective Drawing on the principles of Privacy by Design (PbD), a globally recognized standard for privacy protection, we aimed to develop a PbD framework for online health research recruitment. Methods We proposed a focus group study on the dietary behaviors of cancer patients and their families, and the role of Web-based dietary self-management tools. Using an established blog on our hospital website, we proposed publishing a recruitment post and sharing the link on our Twitter and Facebook pages. The Research Ethics Board (REB) raised concern about the privacy risks associated with our recruitment strategy; by clicking on a recruitment post, an individual could inadvertently disclose personal health information to third-party companies engaged in tracking online behavior. The REB asked us to revise our social media recruitment strategy with the following questions in mind: (1) How will you inform users about the potential for privacy breaches and their implications? and (2) How will you protect users from privacy breaches or inadvertently sharing potentially identifying information about themselves? Results Ethical guidelines recommend a proportionate approach to ethics assessment, which advocates for risk mitigation strategies that are proportional to the magnitude and probability of risks. We revised our social media recruitment strategy to inform users about privacy risks and to protect their privacy, while at the same time meeting our recruitment objectives. We provide a critical reflection of the perceived privacy risks associated with our social media recruitment strategy and

  17. 12 CFR 573.2 - Model privacy form and examples.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 5 2010-01-01 2010-01-01 false Model privacy form and examples. 573.2 Section... FINANCIAL INFORMATION § 573.2 Model privacy form and examples. (a) Model privacy form. Use of the model... privacy form is not required. (b) Examples. The examples in this part are not exclusive. Compliance with...

  18. 12 CFR 332.2 - Model privacy form and examples.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 4 2010-01-01 2010-01-01 false Model privacy form and examples. 332.2 Section... POLICY PRIVACY OF CONSUMER FINANCIAL INFORMATION § 332.2 Model privacy form and examples. (a) Model... this part, although use of the model privacy form is not required. (b) Examples. The examples in this...

  19. 12 CFR 216.2 - Model privacy form and examples.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 2 2010-01-01 2010-01-01 false Model privacy form and examples. 216.2 Section... PRIVACY OF CONSUMER FINANCIAL INFORMATION (REGULATION P) § 216.2 Model privacy form and examples. (a... of this part, although use of the model privacy form is not required. (b) Examples. The examples in...

  20. 43 CFR 2.47 - Records subject to Privacy Act.

    Science.gov (United States)

    2010-10-01

    ... 43 Public Lands: Interior 1 2010-10-01 2010-10-01 false Records subject to Privacy Act. 2.47 Section 2.47 Public Lands: Interior Office of the Secretary of the Interior RECORDS AND TESTIMONY; FREEDOM OF INFORMATION ACT Privacy Act § 2.47 Records subject to Privacy Act. The Privacy Act applies to all...