WorldWideScience

Sample records for protocol analysis techniques

  1. The Network Protocol Analysis Technique in Snort

    Science.gov (United States)

    Wu, Qing-Xiu

Network protocol analysis is the technical means by which a network sniffer captures packets and makes them intelligible for further analysis. Sniffing intercepts the raw binary content of messages as assembled packets; to extract the information they contain, the analyzer must follow the TCP/IP protocol stack specifications and restore the format and content of each packet at every protocol layer, up to the actual data transferred at the application tier.
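The layer-by-layer restoration described above can be sketched in a few lines. This is a hedged illustration, not Snort's own decoder: the function names are ours, and only the fixed IPv4 and TCP header fields (per RFC 791 and RFC 793) are unpacked.

```python
# Minimal sketch of the layered decoding a sniffer performs: unpack the raw
# bytes of an IPv4 header, then the TCP header carried in its payload.
import struct

def parse_ipv4(raw: bytes) -> dict:
    """Decode the fixed 20-byte IPv4 header from captured bytes."""
    ver_ihl, tos, total_len, ident, flags_frag, ttl, proto, cksum, src, dst = \
        struct.unpack("!BBHHHBBH4s4s", raw[:20])
    ihl = (ver_ihl & 0x0F) * 4            # header length in bytes
    return {
        "version": ver_ihl >> 4,
        "header_len": ihl,
        "protocol": proto,                # 6 = TCP, 17 = UDP
        "src": ".".join(map(str, src)),
        "dst": ".".join(map(str, dst)),
        "payload": raw[ihl:],             # hand the rest to the next layer
    }

def parse_tcp(raw: bytes) -> dict:
    """Decode the fixed part of a TCP header from the IP payload."""
    src_port, dst_port, seq, ack, off_flags = struct.unpack("!HHIIH", raw[:14])
    return {"src_port": src_port, "dst_port": dst_port,
            "seq": seq, "data_offset": (off_flags >> 12) * 4}
```

A captured frame would be walked layer by layer: `parse_tcp(parse_ipv4(ip_bytes)["payload"])`.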

  2. Cryptographic protocol security analysis based on bounded constructing algorithm

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

An efficient approach to analyzing cryptographic protocols is to develop automatic analysis tools based on formal methods. However, the approach suffers from high computational complexity, because the participants of protocols are arbitrary, their message structures are complex, and their executions are concurrent. We propose an efficient automatic verification algorithm for analyzing cryptographic protocols based on the recently proposed Cryptographic Protocol Algebra (CPA) model, in which algebraic techniques are used to simplify the description of cryptographic protocols and their executions. Redundant states generated during analysis are greatly reduced by introducing a new algebraic technique called the Universal Polynomial Equation, and the algorithm can verify the correctness of protocols over an infinite state space. We have implemented an efficient automatic analysis tool for cryptographic protocols, called ACT-SPA, based on this algorithm, and used the tool to check more than 20 cryptographic protocols. The analysis results show that this tool is more efficient, and a previously unreported attack instance was detected with it.
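The state-space blow-up described above is commonly tamed by canonicalising states and pruning duplicates during exploration. The toy breadth-first search below is a hedged illustration of that idea only, not the CPA/ACT-SPA algorithm: the `step` relation and the state encoding are invented for the example.

```python
# Explore protocol states breadth-first, skipping any state already seen
# (up to a canonical form), so redundant interleavings are pruned.
from collections import deque

def explore(initial, step, canonical=frozenset, limit=10_000):
    """Return the list of distinct reachable states (bounded by `limit`)."""
    seen = {canonical(initial)}
    queue = deque([initial])
    visited = []
    while queue and len(visited) < limit:
        state = queue.popleft()
        visited.append(state)
        for nxt in step(state):
            key = canonical(nxt)
            if key not in seen:        # duplicate states are never re-queued
                seen.add(key)
                queue.append(nxt)
    return visited

# Toy transition relation: a state is the set of messages the intruder knows;
# at each step either of two nonces may be added (replays change nothing).
step = lambda s: [s | {m} for m in ("nonce_a", "nonce_b")]
states = explore(set(), step)
```

Without deduplication the replayed messages would generate an unbounded run; with it only the four distinct knowledge sets are visited.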

  3. Affinity biosensors: techniques and protocols

    National Research Council Canada - National Science Library

    Rogers, Kim R; Mulchandani, Ashok

    1998-01-01

    ..., and government to begin or expand their biosensors research. This volume, Methods in Biotechnology vol. 7: Affinity Biosensors: Techniques and Protocols, describes a variety of classical and emerging transduction technologies that have been interfaced to bioaffinity elements (e.g., antibodies and receptors). Some of the reas...

  4. Analysis of security protocols based on challenge-response

    Institute of Scientific and Technical Information of China (English)

    LUO JunZhou; YANG Ming

    2007-01-01

A security protocol is specified as a procedure of challenge and response, which uses applied cryptography to confirm the existence of other principals and to carry out data negotiation such as the exchange of session keys. Most existing analysis methods, which adopt either theorem-proving techniques such as state exploration or logic-reasoning techniques such as authentication logic, face a conflict between analysis power and operability. To solve this problem, a new efficient method is proposed that provides an SSM-semantics-based definition of secrecy and authentication goals and applies authentication logic as the fundamental analysis technique. Secrecy analysis is split into two parts, Explicit Information Leakage and Implicit Information Leakage, and correspondence analysis is reduced to analyzing the existence relationship of strands and the agreement of strand parameters. The new method combines the power of the Strand Space Model with the concision of authentication logic.
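As a concrete, simplified instance of the challenge-response pattern such analyses target, here is a shared-key nonce/HMAC exchange. The framing and key handling are our own minimal assumptions, not the paper's model: the verifier issues a fresh nonce, and the prover demonstrates possession of the key by returning a MAC over it.

```python
# Minimal challenge-response sketch: fresh nonce + keyed MAC.
import hmac, hashlib, secrets

def make_challenge() -> bytes:
    return secrets.token_bytes(16)           # fresh nonce: defeats replay

def respond(key: bytes, challenge: bytes) -> bytes:
    return hmac.new(key, challenge, hashlib.sha256).digest()

def verify(key: bytes, challenge: bytes, response: bytes) -> bool:
    expected = hmac.new(key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)   # constant-time compare

key = b"shared-secret"
c = make_challenge()
assert verify(key, c, respond(key, c))           # honest prover succeeds
assert not verify(key, c, respond(b"wrong", c))  # wrong key is rejected
```

Formal analyses of such exchanges ask precisely whether the nonce's freshness and the MAC's unforgeability together guarantee the correspondence properties discussed above.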

  5. RCRA groundwater data analysis protocol for the Hanford Site, Washington

    International Nuclear Information System (INIS)

    Chou, C.J.; Jackson, R.L.

    1992-04-01

The Resource Conservation and Recovery Act of 1976 (RCRA) groundwater monitoring program currently involves site-specific monitoring of 20 facilities on the Hanford Site in southeastern Washington. The program has collected abundant data on groundwater quality. These data are used to assess the impact of a facility on groundwater quality and to determine whether remediation efforts under RCRA corrective action programs are effective. Both evaluations rely on statistical analysis of groundwater monitoring data. The need for information on groundwater quality by regulators and environmental managers makes statistical analysis of monitoring data an important part of RCRA groundwater monitoring programs. The complexity of groundwater monitoring programs and the variabilities (spatial, temporal, and analytical) exhibited in groundwater quality variables indicate the need for a data analysis protocol to guide statistical analysis. Such a protocol was developed from the perspective of addressing regulatory requirements, data quality, and management information needs. It contains four elements: data handling methods; graphical evaluation techniques; statistical tests for trend, central tendency, and excursion analysis; and reporting procedures for presenting results to users.
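The protocol calls for statistical tests for trend on monitoring series. The record does not name a specific test here, but the Mann-Kendall S statistic is a standard nonparametric choice for groundwater data and is easy to sketch:

```python
# Mann-Kendall S statistic: sum of signs over all ordered pairs.
# S > 0 suggests an upward trend, S < 0 a downward one.
from itertools import combinations

def mann_kendall_s(series):
    sign = lambda d: (d > 0) - (d < 0)
    return sum(sign(xj - xi) for xi, xj in combinations(series, 2))

assert mann_kendall_s([1, 2, 3, 4, 5]) == 10    # strictly increasing: maximal S
assert mann_kendall_s([5, 4, 3, 2, 1]) == -10   # strictly decreasing: minimal S
```

In practice S would be compared against its null distribution (or normal approximation) before declaring a trend.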

  6. A methodological comparison of customer service analysis techniques

    Science.gov (United States)

    James Absher; Alan Graefe; Robert Burns

    2003-01-01

Techniques used to analyze customer service data need to be studied. Two primary analysis protocols, importance-performance analysis (IP) and gap score analysis (GA), are compared side by side using data from two major customer service research projects. A central concern is what conclusions, if any, might differ due solely to the analysis...
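Both protocols reduce to simple arithmetic on paired importance and performance ratings, which is why their conclusions can diverge: GA looks only at the difference, while IP positions each attribute on both axes. A sketch (attribute names, ratings, and cut-points invented for illustration):

```python
# Gap analysis: performance minus importance, per attribute.
def gap_scores(importance, performance):
    return {a: performance[a] - importance[a] for a in importance}

# Importance-performance analysis: place each attribute in a quadrant
# relative to cut-points on the two axes.
def ip_quadrant(importance, performance, i_cut=3.0, p_cut=3.0):
    labels = {(True, False): "concentrate here",
              (True, True): "keep up the good work",
              (False, False): "low priority",
              (False, True): "possible overkill"}
    return {a: labels[(importance[a] >= i_cut, performance[a] >= p_cut)]
            for a in importance}

imp = {"clean restrooms": 4.6, "gift shop": 2.1}
perf = {"clean restrooms": 3.9, "gift shop": 3.4}
```

Here GA flags "clean restrooms" (negative gap) while IP still places it in "keep up the good work", illustrating how the two protocols can point managers in different directions.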

  7. Toward Synthesis, Analysis, and Certification of Security Protocols

    Science.gov (United States)

    Schumann, Johann

    2004-01-01

Implemented security protocols are essentially pieces of software used to (a) authenticate the other communication partners, (b) establish a secure communication channel between them (over insecure communication media), and (c) transfer data between the communication partners in such a way that the data are available only to the desired receiver and to no one else. Such an implementation usually consists of the following components: the protocol engine, which controls the sequence in which the messages of the protocol are sent over the network and handles the assembly/disassembly and processing (e.g., decryption) of the data; the cryptographic routines that actually encrypt or decrypt the data (using given keys); and the interface to the operating system and to the application. For a security protocol to work correctly, all of these components must work flawlessly. Many formal-methods-based techniques for the analysis of security protocols have been developed. They range from specific logics (e.g., BAN logic [4] or higher-order logics [12]) to model-checking [2] approaches. In each approach, the analysis tries to prove that nobody (or at least no modeled intruder) can get access to secret data; otherwise, a scenario illustrating the attack may be produced. Despite the seeming simplicity of security protocols ("only" a few messages are sent between the protocol partners in order to ensure a secure communication), many flaws have been detected. Unfortunately, even a perfect protocol engine does not guarantee flawless working of a security protocol, as incidents show. Many break-ins and security vulnerabilities are caused by exploiting errors in the implementation of the protocol engine or the underlying operating system. Attacks using buffer overflows are a very common class of such attacks. Errors in the implementation of exception or error handling can open up additional vulnerabilities. For example, on a website with a log-in screen

  8. PERFORMANCE ANALYSIS OF DISTINCT SECURED AUTHENTICATION PROTOCOLS USED IN THE RESOURCE CONSTRAINED PLATFORM

    Directory of Open Access Journals (Sweden)

    S. Prasanna

    2014-03-01

Full Text Available Most of the e-commerce and m-commerce applications in the current e-business world have adopted asymmetric-key cryptography in their authentication protocols to provide efficient authentication of the involved parties. This paper presents a performance analysis of distinct authentication protocols that implement public-key cryptosystems such as RSA, ECC and HECC. The comparison is based on key generation, signature generation and signature verification. The results show that the HECC-based authentication protocol performs better than the ECC- and RSA-based authentication protocols.
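The comparison above times three operations per scheme. The cryptographic primitives themselves are outside the standard library, so this sketch shows only the measurement harness; the scheme name and the toy modular-exponentiation "operations" are placeholders, not the paper's implementations.

```python
# Generic benchmark harness: mean wall-clock time per operation per scheme.
import time

def benchmark(schemes, repeats=5):
    results = {}
    for name, ops in schemes.items():        # ops: dict of zero-arg callables
        results[name] = {}
        for op_name, op in ops.items():
            start = time.perf_counter()
            for _ in range(repeats):
                op()
            results[name][op_name] = (time.perf_counter() - start) / repeats
    return results

# Placeholder workloads standing in for keygen / sign / verify.
demo = {"toy-RSA-like": {"keygen": lambda: pow(3, 65537, 2**61 - 1),
                         "sign":   lambda: pow(7, 12345, 2**61 - 1),
                         "verify": lambda: pow(7, 65537, 2**61 - 1)}}
times = benchmark(demo)
```

Real RSA/ECC/HECC callables from a cryptographic library could be dropped into the same `schemes` dictionary to reproduce the paper's three comparison axes.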

  9. Protocol Analysis as a Method for Analyzing Conversational Data.

    Science.gov (United States)

    Aleman, Carlos G.; Vangelisti, Anita L.

    Protocol analysis, a technique that uses people's verbal reports about their cognitions as they engage in an assigned task, has been used in a number of applications to provide insight into how people mentally plan, assess, and carry out those assignments. Using a system of networked computers where actors communicate with each other over…

  10. Analysis of Security Protocols by Annotations

    DEFF Research Database (Denmark)

    Gao, Han

The trend in Information Technology is that distributed systems and networks are becoming increasingly important, as most of the services and opportunities that characterise the modern society are based on these technologies. Communication among agents over networks has therefore acquired a great deal of research interest. In order to provide effective and reliable means of communication, more and more communication protocols are invented, and for most of them, security is a significant goal. It has long been a challenge to determine conclusively whether a given protocol is secure or not... The development of formal techniques, e.g. control flow analyses, that can check various security properties, is an important tool to meet this challenge. This dissertation contributes to the development of such techniques. In this dissertation, security protocols are modelled in the process calculus LySa...

  11. Automata Techniques for Epistemic Protocol Synthesis

    Directory of Open Access Journals (Sweden)

    Guillaume Aucher

    2014-04-01

Full Text Available In this work we aim at applying automata techniques to problems studied in Dynamic Epistemic Logic, such as epistemic planning. To do so, we first remark that repeatedly executing a propositional event model ad infinitum from an initial epistemic model yields a relational structure that can be finitely represented with automata. This correspondence, together with recent results on uniform strategies, allows us to give an alternative decidability proof of the epistemic planning problem for propositional events, with, as by-products, accurate upper bounds on its time complexity and the possibility to synthesize a finite word automaton that describes the set of all solution plans. In fact, using automata techniques enables us to solve a much more general problem, which we introduce and call epistemic protocol synthesis.

  12. Novel Techniques with the Aid of a Staged CBCT Guided Surgical Protocol

    Directory of Open Access Journals (Sweden)

    Evdokia Chasioti

    2015-01-01

    Full Text Available The case report will present some novel techniques for using a “staged” protocol utilizing strategic periodontally involved teeth as transitional abutments in combination with CBCT guided implant surgery. Staging the case prevented premature loading of the grafted sites during the healing phase. A CBCT following a tenting screw guided bone regeneration procedure ensured adequate bone to place an implant fixture. Proper assessment of the CBCT allowed the surgeon to do an osteotome internal sinus lift in an optimum location. The depth of the bone needed for the osteotome sinus floor elevation was planned. The staged appliance allowed these sinus-augmented sites to heal for an extended period of time compared to implants, which were uncovered and loaded at an earlier time frame. The staged protocol and CBCT analysis enabled the immediate implants to be placed in proper alignment to the adjacent fixture. After teeth were extracted, the osseointegrated implants were converted to abutments for the transitional appliance. Finally, the staged protocol allowed for soft tissue enhancement in the implant and pontic areas prior to final insertion of the prosthesis.

  13. Energy neutral protocol based on hierarchical routing techniques for energy harvesting wireless sensor network

    Science.gov (United States)

    Muhammad, Umar B.; Ezugwu, Absalom E.; Ofem, Paulinus O.; Rajamäki, Jyri; Aderemi, Adewumi O.

    2017-06-01

Recently, researchers in the field of wireless sensor networks have resorted to energy harvesting techniques that allow energy to be harvested from the ambient environment to power sensor nodes. Using such energy harvesting techniques together with proper routing protocols, an energy-neutral state can be achieved in which sensor nodes run perpetually. In this paper, we propose an Energy Neutral LEACH routing protocol, an extension of the traditional LEACH protocol. The goal of the proposed protocol is to use a Gateway node in each cluster so as to reduce the data transmission ranges of cluster-head nodes. Simulation results show that the proposed routing protocol achieves a higher throughput and ensures the energy-neutral status of the entire network.
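The proposal above extends LEACH, whose cluster-head rotation is the part that is simple to show. In each round a node elects itself head when a random draw falls below the LEACH threshold T(n); this is standard LEACH, not the Gateway extension the paper proposes.

```python
# LEACH cluster-head election threshold:
#   T(n) = p / (1 - p * (r mod round(1/p)))
# for nodes that have not served as head in the current epoch;
# p = desired fraction of heads, r = current round number.
import random

def leach_threshold(p: float, r: int) -> float:
    return p / (1 - p * (r % round(1 / p)))

def is_cluster_head(p: float, r: int, rng=random.random) -> bool:
    """A node becomes head this round if its draw falls below T(n)."""
    return rng() < leach_threshold(p, r)

# With p = 0.1 the threshold rises over the 10-round epoch, so every node
# serves as head roughly once per epoch.
assert abs(leach_threshold(0.1, 0) - 0.1) < 1e-9
assert leach_threshold(0.1, 9) > 0.99   # near-certain election late in the epoch
```

The rising threshold is what spreads the energy cost of cluster-head duty evenly across nodes, the property the energy-neutral extension builds on.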

  14. Formal Analysis of SET and NSL Protocols Using the Interpretation Functions-Based Method

    Directory of Open Access Journals (Sweden)

    Hanane Houmani

    2012-01-01

Full Text Available Most applications on the Internet, such as e-banking and e-commerce, use the SET and the NSL protocols to protect the communication channel between the client and the server. It is therefore crucial to ensure that these protocols respect security properties such as confidentiality, authentication, and integrity. In this paper, we analyze the SET and the NSL protocols with respect to the confidentiality (secrecy) property. To perform this analysis, we use the interpretation-functions-based method. The main idea behind this technique is to give sufficient conditions that guarantee that a cryptographic protocol respects the secrecy property. The flexibility of the proposed conditions allows the verification of daily-life protocols such as SET and NSL. The method can also be used under different assumptions, such as a variety of intruder abilities, including algebraic properties of cryptographic primitives. The NSL protocol, for instance, is analyzed with and without the homomorphism property. We also show, using the SET protocol, the usefulness of this approach for correcting weaknesses and problems discovered during the analysis.

  15. Analyzing the effect of routing protocols on media access control protocols in radio networks

    Energy Technology Data Exchange (ETDEWEB)

    Barrett, C. L. (Christopher L.); Drozda, M. (Martin); Marathe, A. (Achla); Marathe, M. V. (Madhav V.)

    2002-01-01

We study the effect of routing protocols on the performance of media access control (MAC) protocols in wireless radio networks. Three well-known MAC protocols are considered: 802.11, CSMA, and MACA. Similarly, three recently proposed routing protocols are considered: AODV, DSR and LAR scheme 1. The experimental analysis was carried out using GloMoSim, a tool for simulating wireless networks. The main focus of our experiments was to study how the routing protocols affect the performance of the MAC protocols when the underlying network and traffic parameters are varied. The performance of the protocols was measured w.r.t. five important parameters: (i) number of received packets, (ii) average latency of each packet, (iii) throughput, (iv) long-term fairness and (v) number of control packets at the MAC layer level. Our results show that combinations of routing and MAC protocols yield varying performance under varying network topology and traffic situations. This result has an important implication: no combination of routing protocol and MAC protocol is best over all situations. Moreover, the performance of protocols at a given level in the protocol stack needs to be studied not locally in isolation but as a part of the complete protocol stack. A novel aspect of our work is the use of a statistical technique, ANOVA (Analysis of Variance), to characterize the effect of routing protocols on MAC protocols. This technique is of independent interest and can be utilized in several other simulation and empirical studies.
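In the one-way case, the ANOVA the authors apply reduces to an F statistic comparing between-group and within-group variance. A self-contained sketch follows; the throughput numbers are invented for illustration and are not the paper's data.

```python
# One-way ANOVA: F = (between-group mean square) / (within-group mean square).
def one_way_anova_f(groups):
    k = len(groups)                          # number of groups
    n = sum(len(g) for g in groups)          # total observations
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# e.g. throughput measurements under three routing protocols over one MAC layer:
aodv, dsr, lar = [10.1, 9.8, 10.0], [8.2, 8.5, 8.1], [9.0, 9.1, 8.9]
f = one_way_anova_f([aodv, dsr, lar])
```

A large F (compared against the F distribution with k-1 and n-k degrees of freedom) indicates that the routing protocol factor explains a significant share of the variance, which is the kind of effect characterization the paper reports.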

  16. Energy Reduction Multipath Routing Protocol for MANET Using Recoil Technique

    Directory of Open Access Journals (Sweden)

    Rakesh Kumar Sahu

    2018-04-01

Full Text Available In mobile ad hoc networks (MANET), power conservation and utilization is an acute problem and has received significant attention from academia and industry in recent years. Nodes in a MANET run on battery power, a scarce and limited energy resource, so its conservation and utilization should be done judiciously for effective functioning of the network. In this paper, a novel protocol, the Energy Reduction Multipath Routing Protocol for MANET using Recoil Technique (AOMDV-ER), is proposed, which conserves energy while optimizing network lifetime, routing overhead, packet delivery ratio and throughput. It performs better than other AODV-based algorithms because in AOMDV-ER the nodes transmit packets to their destination smartly, using a varying recoil-off-time technique based on their geographical location. This concept reduces the number of transmissions, which improves the network lifetime. In addition, local-level route maintenance reduces the additional routing overhead, and the prediction-based link lifetime of each node is estimated, which helps reduce packet loss in the network. The protocol has three subparts: an optimal route discovery algorithm combined with a residual energy and distance mechanism; a coordinated recoiled-nodes algorithm, which cuts down the number of transmissions in order to reduce data redundancy, redundant traffic, routing overhead and end-to-end delay and to enhance the network lifetime; and a link reckoning and route maintenance algorithm to improve the packet delivery ratio and link stability in the network. The experimental results show that the AOMDV-ER protocol saves at least 16% energy consumption and achieves a 12% reduction in routing overhead, along with significant gains in network lifetime and packet delivery ratio, compared with the Ad hoc on demand multipath distance vector routing protocol (AOMDV, Ad hoc on demand multipath distance vector routing protocol life

  17. [Professional divers: analysis of critical issues and proposal of a health protocol for work fitness].

    Science.gov (United States)

    Pedata, Paola; Corvino, Anna Rita; Napolitano, Raffaele Carmine; Garzillo, Elpidio Maria; Furfaro, Ciro; Lamberti, Monica

    2016-01-20

For many years now, thanks to the development of modern diving techniques, there has been a rapid spread of diving activities everywhere. Divers are ever more numerous, both among the Armed Forces and among civilians who dive for work, such as fishing, biological research and archaeology. The aim of the study was to propose a health protocol for the work fitness of professional divers, keeping in mind the peculiarities of the work activity, the existing Italian legislation, which is largely out of date, and the technical and scientific evolution in this occupational field. We analysed the diseases occurring most frequently among professional divers and the clinical investigation and imaging techniques used for their work fitness assessment. From the analysis of the health protocol recommended by D.M. 13 January 1979 (Ministerial Decree), the one most used by occupational health physicians, several critical issues emerged. Very often the clinical investigation and imaging techniques still in use are practically obsolete, while simple and inexpensive investigations that would be more useful for work fitness assessment are neglected. Considering the out-dated legislation concerning diving disciplines, it is necessary to draw up a common health protocol that takes into account the clinical and scientific knowledge and skills acquired in this area. The protocol's aim is to offer a useful tool for occupational health physicians who work in this sector.

  18. Results of a protocol of transfusion threshold and surgical technique on transfusion requirements in burn patients.

    Science.gov (United States)

    O'Mara, Michael S; Hayetian, Fernando; Slater, Harvey; Goldfarb, I William; Tolchin, Eric; Caushaj, Philip F

    2005-08-01

Blood loss and high rates of transfusion in burn centers remain areas of ongoing concern. Blood use brings the risk of infection, adverse reaction, and immunosuppression. A protocol to reduce blood loss and blood use was implemented. The analysis covered the 3-year periods before and after institution of the protocol. All patients were transfused for a hemoglobin below 8.0 g/dL. Operations per admission did not change between the two periods (0.78 in each). Overall units transfused per operation decreased from 1.56+/-0.06 to 1.25+/-0.14 after instituting the protocol. The reduction was greatest in burns of less than 20% surface area: in this group of smallest burns, transfusions declined from 386 to 46 units after protocol institution, from 0.37 to 0.04 units per admission, and from 0.79 to 0.08 units per operation. No change was noted in the larger burns. This study suggests that a defined protocol of hemostasis, surgical technique, and transfusion trigger should be implemented in the process of burn excision and grafting. It will especially help patients with the smallest burns, essentially eliminating transfusion need in that group.

  19. Split bolus technique in polytrauma: a prospective study on scan protocols for trauma analysis

    NARCIS (Netherlands)

    Beenen, Ludo F. M.; Sierink, Joanne C.; Kolkman, Saskia; Nio, C. Yung; Saltzherr, Teun Peter; Dijkgraaf, Marcel G. W.; Goslings, J. Carel

    2015-01-01

For the evaluation of severely injured trauma patients, a variety of total-body computed tomography (CT) scanning protocols exist. Frequently, multiple-pass protocols are used. A split-bolus contrast protocol can reduce the number of passes through the body, and thereby radiation exposure, in this

  20. Agricultural Soil Spectral Response and Properties Assessment: Effects of Measurement Protocol and Data Mining Technique

    Directory of Open Access Journals (Sweden)

    Asa Gholizadeh

    2017-10-01

Full Text Available Soil spectroscopy has been shown to be a fast, cost-effective, environmentally friendly, non-destructive, reproducible and repeatable analytical technique. Soil components, as well as types of instruments, protocols, sampling methods, sample preparation, spectral acquisition techniques and analytical algorithms, have a combined influence on the final performance. Therefore, it is important to characterize these differences and to introduce an effective approach in order to minimize the technical factors that alter reflectance spectra and the consequent predictions. To quantify this alteration, a joint project between the Czech University of Life Sciences Prague (CULS) and Tel-Aviv University (TAU) was conducted to estimate Cox, pH-H2O, pH-KCl and selected forms of Fe and Mn. Two different soil spectral measurement protocols and two data mining techniques were used to examine seventy-eight soil samples from five agricultural areas in different parts of the Czech Republic. Spectral measurements at both laboratories were made using different ASD spectroradiometers. The CULS protocol was based on a contact probe (CP) spectral measurement scheme, while the TAU protocol used a CP measurement method accompanied by the internal soil standard (ISS) procedure. The two spectral datasets, acquired under the different protocols, were both analyzed using the partial least squares regression (PLSR) technique as well as PARACUDA II®, a new data mining engine for optimizing PLSR models. The results showed that spectra based on the CULS setup (non-ISS) demonstrated significantly higher albedo intensity and reflectance values relative to the TAU setup with ISS. However, the majority of statistics using the TAU protocol were not noticeably better than with the CULS spectra. The paper also highlighted that under both measurement protocols, the PARACUDA II® engine proved to be a powerful tool for providing better results than PLSR alone. Such initiative is not only a way to

  1. A Logical Analysis of Quantum Voting Protocols

    Science.gov (United States)

    Rad, Soroush Rafiee; Shirinkalam, Elahe; Smets, Sonja

    2017-12-01

In this paper we provide a logical analysis of the Quantum Voting Protocol for Anonymous Surveying as developed by Horoshko and Kilin (Phys. Lett. A 375, 1172-1175, 2011). In particular we make use of the probabilistic logic of quantum programs developed in (Int. J. Theor. Phys. 53, 3628-3647, 2014) to provide a formal specification of the protocol and to derive its correctness. Our analysis is part of a wider program on the application of quantum logics to the formal verification of protocols in quantum communication and quantum computation.

  2. Authentication Test-Based the RFID Authentication Protocol with Security Analysis

    Directory of Open Access Journals (Sweden)

    Minghui Wang

    2014-08-01

Full Text Available Many recently proposed RFID authentication protocols have soon been found to contain security holes. We analyzed the main reason: the protocol designs are not rigorous, so the correctness of the protocols cannot be guaranteed. To this end, the authentication test method was adopted in this paper for the formal analysis and strict proof of the proposed RFID protocol. Authentication Test is a new analysis and design method for security protocols based on the Strand space model, and it can be used for most types of security protocols. The security analysis shows that the proposed protocol meets the RFID security demands: information confidentiality, data integrity and identity authentication.

  3. Static Validation of Security Protocols

    DEFF Research Database (Denmark)

    Bodei, Chiara; Buchholtz, Mikael; Degano, P.

    2005-01-01

We methodically expand protocol narrations into terms of a process algebra in order to specify some of the checks that need to be made in a protocol. We then apply static analysis technology to develop an automatic validation procedure for protocols. Finally, we demonstrate that these techniques suffice to identify several authentication flaws in symmetric and asymmetric key protocols such as Needham-Schroeder symmetric key, Otway-Rees, Yahalom, Andrew secure RPC, Needham-Schroeder asymmetric key, and Beller-Chang-Yacobi MSR...

  4. An Evaluation Methodology for Protocol Analysis Systems

    Science.gov (United States)

    2007-03-01

NS: Needham-Schroeder; NSL: Needham-Schroeder-Lowe; OCaml: Objective Caml; POSIX: Portable Operating System... methodology is needed. As with any field, there is a specialized language used within the protocol analysis community. ... Since ProVerif requires that Objective Caml (OCaml) be installed on the system, OCaml version 3.09.3 was installed.

  5. Optical code-division multiple-access protocol with selective retransmission

    Science.gov (United States)

    Mohamed, Mohamed A. A.; Shalaby, Hossam M. H.; El-Badawy, El-Sayed A.

    2006-05-01

    An optical code-division multiple-access (OCDMA) protocol based on selective retransmission technique is proposed. The protocol is modeled using a detailed state diagram and is analyzed using equilibrium point analysis (EPA). Both traditional throughput and average delay are used to examine its performance for several network parameters. In addition, the performance of the proposed protocol is compared to that of the R3T protocol, which is based on a go-back-n technique. Our results show that a higher performance is achieved by the proposed protocol at the expense of system complexity.
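The comparison above, selective retransmission versus the R3T protocol's go-back-n, comes down to how much traffic is resent after a loss. The toy transmission counts below are a hedged illustration of that difference only, not the equilibrium point analysis used in the paper; they assume each retransmission succeeds and a fixed sender window.

```python
# Count total frame transmissions for n frames under a given set of losses.

def selective_repeat_txs(n, lost):
    """Selective retransmission: only the lost frames are resent."""
    return n + len(lost)

def go_back_n_txs(n, lost, window=4):
    """Go-back-n: a loss also forces resending the in-flight successors."""
    extra = sum(min(window - 1, n - i - 1) for i in lost)
    return n + len(lost) + extra

# 10 frames, frames 2 and 7 lost, window of 4:
sr = selective_repeat_txs(10, {2, 7})   # 12 transmissions
gbn = go_back_n_txs(10, {2, 7})         # 17 transmissions
```

The gap between the two counts grows with the window size and the loss rate, which is why the selective scheme buys its higher throughput at the cost of receiver-side buffering and resequencing complexity.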

  6. Reliability and criterion validity of an observation protocol for working technique assessments in cash register work.

    Science.gov (United States)

    Palm, Peter; Josephson, Malin; Mathiassen, Svend Erik; Kjellberg, Katarina

    2016-06-01

We evaluated the intra- and inter-observer reliability and criterion validity of an observation protocol, developed in an iterative process involving practicing ergonomists, for assessing working technique during cash register work, with the aim of preventing upper extremity symptoms. Two ergonomists independently assessed 17 15-min videos of cash register work on two occasions each, as a basis for examining reliability. Criterion validity was assessed by comparing these assessments with meticulous video-based analyses by researchers. Intra-observer reliability was acceptable (i.e. proportional agreement >0.7 and kappa >0.4) for 10/10 questions. Inter-observer reliability was acceptable for only 3/10 questions. An acceptable inter-observer reliability combined with an acceptable criterion validity was obtained for only one working technique aspect, 'Quality of movements'. Thus, major elements of the cashiers' working technique could not be assessed with acceptable accuracy from short periods of observation by a single observer, as is often desired by practitioners. Practitioner Summary: We examined an observation protocol for assessing working technique in cash register work. It was feasible in use, but inter-observer reliability and criterion validity were generally not acceptable when working technique aspects were assessed from short periods of work. We recommend the protocol be used for educational purposes only.
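The two reliability criteria quoted above (proportional agreement >0.7 and kappa >0.4) are straightforward to compute from two raters' paired category labels. A minimal sketch with invented ratings; the protocol's actual questions are not reproduced here.

```python
# Proportional agreement and Cohen's kappa for two raters' categorical labels.
from collections import Counter

def proportional_agreement(a, b):
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    po = proportional_agreement(a, b)            # observed agreement
    ca, cb, n = Counter(a), Counter(b), len(a)
    pe = sum(ca[c] * cb[c] for c in ca) / n**2   # agreement expected by chance
    return (po - pe) / (1 - pe)

r1 = ["ok", "ok", "poor", "ok", "poor", "ok"]
r2 = ["ok", "poor", "poor", "ok", "poor", "ok"]
```

For these invented ratings the agreement is 5/6 and kappa is 2/3, so this hypothetical item would pass both of the article's thresholds.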

  7. Fundamentals of functional imaging II: emerging MR techniques and new methods of analysis.

    Science.gov (United States)

    Luna, A; Martín Noguerol, T; Mata, L Alcalá

    2018-05-01

Current multiparameter MRI protocols integrate structural, physiological, and metabolic information about cancer. Emerging techniques such as arterial spin-labeling (ASL), blood oxygen level dependent (BOLD) imaging, MR elastography, chemical exchange saturation transfer (CEST), and hyperpolarization provide new information and will likely be integrated into daily clinical practice in the near future. Furthermore, there is great interest in the study of tumor heterogeneity as a prognostic factor and in relation to resistance to treatment, and this interest is leading to the application of new methods of analysis of multiparametric protocols. In parallel, new oncologic biomarkers that integrate the information from MR with clinical, laboratory, genetic, and histologic findings are being developed, thanks to the application of big data and artificial intelligence. This review analyzes different emerging MR techniques that are able to evaluate the physiological, metabolic, and mechanical characteristics of cancer, as well as the main clinical applications of these techniques. In addition, it summarizes the most novel methods of analysis of functional radiologic information in oncology. Copyright © 2018 SERAM. Published by Elsevier España, S.L.U. All rights reserved.

  8. A Calculus for Control Flow Analysis of Security Protocols

    DEFF Research Database (Denmark)

    Buchholtz, Mikael; Nielson, Hanne Riis; Nielson, Flemming

    2004-01-01

    The design of a process calculus for analysing security protocols is governed by three factors: how to express the security protocol in a precise and faithful manner, how to accommodate the variety of attack scenarios, and how to utilise the strengths (and limit the weaknesses) of the underlying...... analysis methodology. We pursue an analysis methodology based on control flow analysis in flow logic style and we have previously shown its ability to analyse a variety of security protocols. This paper develops a calculus, LysaNS, that allows for much greater control and clarity in the description......

  9. Mac protocols for cyber-physical systems

    CERN Document Server

    Xia, Feng

    2015-01-01

    This book provides a literature review of various wireless MAC protocols and techniques for achieving real-time and reliable communications in the context of cyber-physical systems (CPS). The evaluation analysis of IEEE 802.15.4 for CPS therein will give insights into configuration and optimization of critical design parameters of MAC protocols. In addition, this book also presents the design and evaluation of an adaptive MAC protocol for medical CPS, which exemplifies how to facilitate real-time and reliable communications in CPS by exploiting IEEE 802.15.4 based MAC protocols. This book wil

  10. iLAP: a workflow-driven software for experimental protocol development, data acquisition and analysis

    Directory of Open Access Journals (Sweden)

    McNally James

    2009-01-01

    Full Text Available Abstract Background In recent years, the genome biology community has expended considerable effort to confront the challenges of managing heterogeneous data in a structured and organized way and developed laboratory information management systems (LIMS) for both raw and processed data. On the other hand, electronic notebooks were developed to record and manage scientific data, and facilitate data-sharing. Software which enables both management of large datasets and digital recording of laboratory procedures would serve a real need in laboratories using medium and high-throughput techniques. Results We have developed iLAP (Laboratory data management, Analysis, and Protocol development), a workflow-driven information management system specifically designed to create and manage experimental protocols, and to analyze and share laboratory data. The system combines experimental protocol development, wizard-based data acquisition, and high-throughput data analysis into a single, integrated system. We demonstrate the power and the flexibility of the platform using a microscopy case study based on a combinatorial multiple fluorescence in situ hybridization (m-FISH) protocol and 3D-image reconstruction. iLAP is freely available under the open source license AGPL from http://genome.tugraz.at/iLAP/. Conclusion iLAP is a flexible and versatile information management system, which has the potential to close the gap between electronic notebooks and LIMS and can therefore be of great value for a broad scientific community.

  11. Analyzing security protocols in hierarchical networks

    DEFF Research Database (Denmark)

    Zhang, Ye; Nielson, Hanne Riis

    2006-01-01

    Validating security protocols is a well-known hard problem even in a simple setting of a single global network. But a real network often consists of, besides the public-accessed part, several sub-networks and thereby forms a hierarchical structure. In this paper we first present a process calculus...... capturing the characteristics of hierarchical networks and describe the behavior of protocols on such networks. We then develop a static analysis to automate the validation. Finally we demonstrate how the technique can benefit the protocol development and the design of network systems by presenting a series...

  12. Protocol vulnerability detection based on network traffic analysis and binary reverse engineering.

    Science.gov (United States)

    Wen, Shameng; Meng, Qingkun; Feng, Chao; Tang, Chaojing

    2017-01-01

    Network protocol vulnerability detection plays an important role in many domains, including protocol security analysis, application security, and network intrusion detection. In this study, by analyzing the general fuzzing method of network protocols, we propose a novel approach that combines network traffic analysis with the binary reverse engineering method. For network traffic analysis, the block-based protocol description language is introduced to construct test scripts, while the binary reverse engineering method employs the genetic algorithm with a fitness function designed to focus on code coverage. This combination leads to a substantial improvement in fuzz testing for network protocols. We build a prototype system and use it to test several real-world network protocol implementations. The experimental results show that the proposed approach detects vulnerabilities more efficiently and effectively than general fuzzing methods such as SPIKE.
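
    The coverage-guided genetic search described above can be sketched as follows. This is a toy model, not the authors' implementation: the `coverage` function and its magic-header reward are stand-ins for real binary instrumentation of the target protocol implementation.

```python
import random

def coverage(test_case: bytes) -> int:
    """Stand-in fitness: a real fuzzer would execute the instrumented
    target binary on the input and count executed basic blocks."""
    score = 0
    # Hypothetical reward for hitting a deeper parsing path
    # (e.g. an input starting with a protocol magic header).
    if test_case[:2] == b"\x13\x37":
        score += 10
    score += len(set(test_case))  # byte diversity as a weak proxy
    return score

def mutate(tc: bytes) -> bytes:
    """Flip one random byte of a test case."""
    b = bytearray(tc)
    b[random.randrange(len(b))] = random.randrange(256)
    return bytes(b)

def evolve(pop, generations=50, keep=4):
    """Elitist genetic loop with coverage as the fitness function."""
    for _ in range(generations):
        pop.sort(key=coverage, reverse=True)
        parents = pop[:keep]
        pop = parents + [mutate(random.choice(parents))
                         for _ in range(len(pop) - keep)]
    return max(pop, key=coverage)

random.seed(1)
seed_pop = [bytes(random.randrange(256) for _ in range(8)) for _ in range(16)]
best = evolve(list(seed_pop))
print(coverage(best))
```

    Elitism guarantees that the best input's coverage score never decreases across generations; in a real fuzzer the surviving inputs would then be fed to the block-based test scripts derived from traffic analysis.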

  13. Analysis of Security Protocols in Embedded Systems

    DEFF Research Database (Denmark)

    Bruni, Alessandro

    Embedded real-time systems have been adopted in a wide range of safety-critical applications—including automotive, avionics, and train control systems—where the focus has long been on safety (i.e., protecting the external world from the potential damage caused by the system) rather than security (i.e., protecting the system from the external world). With increased connectivity of these systems to external networks the attack surface has grown, and consequently there is a need for securing the system from external attacks. Introducing security protocols in safety critical systems requires careful...... in this direction is to extend saturation-based techniques so that enough state information can be modelled and analysed. Finally, we present a methodology for proving the same security properties in the computational model, by means of typing protocol implementations....

  14. Protocol vulnerability detection based on network traffic analysis and binary reverse engineering.

    Directory of Open Access Journals (Sweden)

    Shameng Wen

    Full Text Available Network protocol vulnerability detection plays an important role in many domains, including protocol security analysis, application security, and network intrusion detection. In this study, by analyzing the general fuzzing method of network protocols, we propose a novel approach that combines network traffic analysis with the binary reverse engineering method. For network traffic analysis, the block-based protocol description language is introduced to construct test scripts, while the binary reverse engineering method employs the genetic algorithm with a fitness function designed to focus on code coverage. This combination leads to a substantial improvement in fuzz testing for network protocols. We build a prototype system and use it to test several real-world network protocol implementations. The experimental results show that the proposed approach detects vulnerabilities more efficiently and effectively than general fuzzing methods such as SPIKE.

  15. Emulation Platform for Cyber Analysis of Wireless Communication Network Protocols

    Energy Technology Data Exchange (ETDEWEB)

    Van Leeuwen, Brian P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eldridge, John M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-11-01

    Wireless networking and mobile communications are increasing around the world and in all sectors of our lives. With increasing use, the density and complexity of the systems increase, with more base stations and advanced protocols to enable higher data throughputs. The security of data transported over wireless networks must also evolve with the advances in technologies enabling more capable wireless networks. However, means for analysis of the effectiveness of security approaches and implementations used on wireless networks are lacking. More specifically, a capability to analyze the lower-layer protocols (i.e., Link and Physical layers) is a major challenge. An analysis approach that incorporates protocol implementations without the need for RF emissions is necessary. In this research paper, several emulation tools and custom extensions that enable an analysis platform to perform cyber security analysis of lower-layer wireless networks are presented. A use case of a published exploit in the 802.11 (i.e., WiFi) protocol family is provided to demonstrate the effectiveness of the described emulation platform.

  16. Applicability of contact angle techniques used in the analysis of contact lenses, part 1: comparative methodologies.

    Science.gov (United States)

    Campbell, Darren; Carnell, Sarah Maria; Eden, Russell John

    2013-05-01

    Contact angle, as a representative measure of surface wettability, is often employed to interpret contact lens surface properties. The literature is often contradictory and can lead to confusion. This literature review is part of a series regarding the analysis of hydrogel contact lenses using contact angle techniques. Here we present an overview of contact angle terminology, methodology, and analysis. Having discussed this background material, subsequent parts of the series will discuss the analysis of contact lens contact angles and evaluate differences in published laboratory results. The concepts of contact angle, wettability and wetting are presented as an introduction. Contact angle hysteresis is outlined and highlights the advantages in using dynamic analytical techniques over static methods. The surface free energy of a material illustrates how contact angle analysis is capable of providing supplementary surface characterization. Although single values are able to distinguish individual material differences, surface free energy and dynamic methods provide an improved understanding of material behavior. The frequently used sessile drop, captive bubble, and Wilhelmy plate techniques are discussed. Their use as both dynamic and static methods, along with the advantages and disadvantages of each technique, is explained. No single contact angle technique fully characterizes the wettability of a material surface, and the application of complementing methods allows increased characterization. At present, there is not an ISO standard method designed for soft materials. It is important that each contact angle technique has a standard protocol, as small protocol differences between laboratories often contribute to a variety of published data that are not easily comparable.
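
    As a rough illustration of the quantities discussed above, contact angle hysteresis and Young's relation can be computed directly; the advancing/receding angles below are hypothetical, not measured lens data:

```python
import math

GAMMA_WATER = 72.8  # surface tension of water at 20 °C, mN/m

def hysteresis(theta_adv_deg, theta_rec_deg):
    """Contact angle hysteresis: advancing minus receding angle (degrees)."""
    return theta_adv_deg - theta_rec_deg

def young_adhesion_tension(theta_deg, gamma_lv=GAMMA_WATER):
    """Young's equation rearranged: gamma_sv - gamma_sl = gamma_lv * cos(theta).
    Returns the adhesion tension in mN/m for a measured contact angle."""
    return gamma_lv * math.cos(math.radians(theta_deg))

# Illustrative (hypothetical) dynamic angles for a hydrogel surface.
adv, rec = 95.0, 55.0
print(hysteresis(adv, rec))                    # 40.0 degrees
print(round(young_adhesion_tension(rec), 1))   # adhesion tension at receding angle
```

    The large hysteresis in this invented example is the kind of information a single static sessile-drop value cannot convey, which is the review's argument for dynamic methods.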

  17. Mean-Field Analysis for the Evaluation of Gossip Protocols

    NARCIS (Netherlands)

    Bakshi, Rena; Cloth, L.; Fokkink, Wan; Haverkort, Boudewijn R.H.M.

    Gossip protocols are designed to operate in very large, decentralised networks. A node in such a network bases its decision to interact (gossip) with another node on its partial view of the global system. Because of the size of these networks, analysis of gossip protocols is mostly done using

  18. Mean-field analysis for the evaluation of gossip protocols

    NARCIS (Netherlands)

    Bakhshi, Rena; Cloth, L.; Fokkink, Wan; Haverkort, Boudewijn R.H.M.

    2008-01-01

    Gossip protocols are designed to operate in very large, decentralised networks. A node in such a network bases its decision to interact (gossip) with another node on its partial view of the global system. Because of the size of these networks, analysis of gossip protocols is mostly done using
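
    The mean-field approach named in these two records replaces the per-node state of a huge network with an ODE for the fraction of informed nodes. A minimal Euler-integration sketch under an assumed push-gossip (logistic) dynamic, not the paper's specific model:

```python
def mean_field_gossip(x0=0.01, rate=1.0, dt=0.01, steps=2000):
    """Mean-field ODE for push gossip: dx/dt = rate * x * (1 - x),
    where x is the fraction of nodes that have heard the rumour.
    Integrated with the forward Euler method."""
    x = x0
    traj = [x]
    for _ in range(steps):
        x += dt * rate * x * (1.0 - x)
        traj.append(x)
    return traj

traj = mean_field_gossip()
print(round(traj[-1], 3))  # approaches 1.0: eventually all nodes are informed
```

    The point of the mean-field reduction is that this scalar ODE is independent of network size, whereas an exact state-space analysis grows exponentially in the number of nodes.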

  19. Recursion vs. Replication in Simple Cryptographic Protocols

    DEFF Research Database (Denmark)

    Huttel, Hans; Srba, Jiri

    2005-01-01

    We use some recent techniques from process algebra to draw several conclusions about the well studied class of ping-pong protocols introduced by Dolev and Yao. In particular we show that all nontrivial properties, including reachability and equivalence checking wrt. the whole van Glabbeek's spect...... of messages in the sense of Amadio, Lugiez and Vanackere. We conclude by showing that reachability analysis for a replicative variant of the protocol becomes decidable....

  20. New protocol for construction of eyeglasses-supported provisional nasal prosthesis using CAD/CAM techniques.

    Science.gov (United States)

    Ciocca, Leonardo; Fantini, Massimiliano; De Crescenzio, Francesca; Persiani, Franco; Scotti, Roberto

    2010-01-01

    A new protocol for making an immediate provisional eyeglasses-supported nasal prosthesis is presented that uses laser scanning, computer-aided design/computer-aided manufacturing procedures, and rapid prototyping techniques, reducing time and costs while increasing the quality of the final product. With this protocol, the eyeglasses were digitized, and the relative position of the nasal prosthesis was planned and evaluated in a virtual environment without any try-in appointment. This innovative method saves time, reduces costs, and restores the patient's aesthetic appearance after a disfiguration caused by ablation of the nasal pyramid better than conventional restoration methods. Moreover, the digital model of the designed nasal epithesis can be used to develop a definitive prosthesis anchored to osseointegrated craniofacial implants.

  1. Communicating systems with UML 2 modeling and analysis of network protocols

    CERN Document Server

    Barrera, David Garduno

    2013-01-01

    This book gives a practical approach to modeling and analyzing communication protocols using UML 2. Network protocols are always presented with a point of view focusing on partial mechanisms and starting models. This book aims at giving the basis needed for anybody to model and validate their own protocols. It follows a practical approach and gives many examples for the description and analysis of well known basic network mechanisms for protocols. The book firstly shows how to describe and validate the main protocol issues (such as synchronization problems, client-server interactions, layer

  2. An approach to standardization of urine sediment analysis via suggestion of a common manual protocol.

    Science.gov (United States)

    Ko, Dae-Hyun; Ji, Misuk; Kim, Sollip; Cho, Eun-Jung; Lee, Woochang; Yun, Yeo-Min; Chun, Sail; Min, Won-Ki

    2016-01-01

    The results of urine sediment analysis have been reported semiquantitatively. However, as recent guidelines recommend quantitative reporting of urine sediment, and with the development of automated urine sediment analyzers, there is an increasing need for quantitative analysis of urine sediment. Here, we developed a protocol for urine sediment analysis and quantified the results. Based on questionnaires, various reports, guidelines, and experimental results, we developed a protocol for urine sediment analysis. The results of this new protocol were compared with those obtained with a standardized chamber and an automated sediment analyzer. Reference intervals were also estimated using the new protocol. We developed a protocol with centrifugation at 400 g for 5 min, with an average concentration factor of 30. The correlations between the quantitative results of urine sediment analysis, the standardized chamber, and the automated sediment analyzer were generally good. The conversion factor derived from the new protocol showed a better fit with the results of manual counting than the default conversion factor in the automated sediment analyzer. We developed a protocol for manual urine sediment analysis to quantitatively report the results. This protocol may provide a means for standardization of urine sediment analysis.
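
    The concentration factor of 30 stated above yields a one-line conversion from counts on the concentrated sediment back to the original urine; the 12 mL / 0.4 mL volume split in the comment is an illustrative assumption, not the protocol's stated volumes:

```python
def cells_per_ul_urine(cells_per_ul_sediment, concentration_factor=30):
    """Convert a chamber count on centrifuged sediment back to the
    concentration in uncentrifuged urine. A factor of 30 arises, for
    example, from spinning 12 mL of urine down to 0.4 mL of sediment."""
    return cells_per_ul_sediment / concentration_factor

print(cells_per_ul_urine(150))  # 5.0 cells/µL in the original urine
```

    Reporting in cells/µL of native urine is what makes results comparable across laboratories and against automated analyzers, which is the standardization goal of the abstract.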

  3. Establishment of a protocol for the gene expression analysis of laser microdissected rat kidney samples with affymetrix genechips

    International Nuclear Information System (INIS)

    Stemmer, Kerstin; Ellinger-Ziegelbauer, Heidrun; Lotz, Kerstin; Ahr, Hans-J.; Dietrich, Daniel R.

    2006-01-01

    Laser microdissection in conjunction with microarray technology allows selective isolation and analysis of specific cell populations, e.g., preneoplastic renal lesions. To date, only limited information is available on sample preparation and preservation techniques that result in both optimal histomorphological preservation of sections and high-quality RNA for microarray analysis. Furthermore, amplification of minute amounts of RNA from microdissected renal samples allowing analysis with genechips has only scantily been addressed to date. The objective of this study was therefore to establish a reliable and reproducible protocol for laser microdissection in conjunction with microarray technology using kidney tissue from Eker rats p.o. treated for 7 days and 6 months with 10 and 1 mg Aristolochic acid/kg bw, respectively. Kidney tissues were preserved in RNAlater or snap frozen. Cryosections were cut and stained with either H and E or cresyl violet for subsequent morphological and RNA quality assessment and laser microdissection. RNA quality was comparable in snap frozen and RNAlater-preserved samples, however, the histomorphological preservation of renal sections was much better following cryopreservation. Moreover, the different staining techniques in combination with sample processing time at room temperature can have an influence on RNA quality. Different RNA amplification protocols were shown to have an impact on gene expression profiles as demonstrated with Affymetrix Rat Genome 230 2.0 arrays. Considering all the parameters analyzed in this study, a protocol for RNA isolation from laser microdissected samples with subsequent Affymetrix chip hybridization was established that was also successfully applied to preneoplastic lesions laser microdissected from Aristolochic acid-treated rats.

  4. A security analysis of the 802.11s wireless mesh network routing protocol and its secure routing protocols.

    Science.gov (United States)

    Tan, Whye Kit; Lee, Sang-Gon; Lam, Jun Huy; Yoo, Seong-Moo

    2013-09-02

    Wireless mesh networks (WMNs) can act as a scalable backbone by connecting separate sensor networks and even by connecting WMNs to a wired network. The Hybrid Wireless Mesh Protocol (HWMP) is the default routing protocol for the 802.11s WMN. The routing protocol is one of the most important parts of the network, and it requires protection, especially in the wireless environment. The existing security protocols, such as the Broadcast Integrity Protocol (BIP), Counter with cipher block chaining message authentication code protocol (CCMP), Secure Hybrid Wireless Mesh Protocol (SHWMP), Identity Based Cryptography HWMP (IBC-HWMP), Elliptic Curve Digital Signature Algorithm HWMP (ECDSA-HWMP), and Watchdog-HWMP aim to protect the HWMP frames. In this paper, we have analyzed the vulnerabilities of the HWMP and developed security requirements to protect these identified vulnerabilities. We applied the security requirements to analyze the existing secure schemes for HWMP. The results of our analysis indicate that none of these protocols is able to satisfy all of the security requirements. We also present a quantitative complexity comparison among the protocols and an example of a security scheme for HWMP to demonstrate how the result of our research can be utilized. Our research results thus provide a tool for designing secure schemes for the HWMP.

  5. Protocol design and analysis for cooperative wireless networks

    CERN Document Server

    Song, Wei; Jin, A-Long

    2017-01-01

    This book focuses on the design and analysis of protocols for cooperative wireless networks, especially at the medium access control (MAC) layer and for crosslayer design between the MAC layer and the physical layer. It highlights two main points that are often neglected in other books: energy-efficiency and spatial random distribution of wireless devices. Effective methods in stochastic geometry for the design and analysis of wireless networks are also explored. After providing a comprehensive review of existing studies in the literature, the authors point out the challenges that are worth further investigation. Then, they introduce several novel solutions for cooperative wireless network protocols that reduce energy consumption and address spatial random distribution of wireless nodes. For each solution, the book offers a clear system model and problem formulation, details of the proposed cooperative schemes, comprehensive performance analysis, and extensive numerical and simulation results that validate th...

  6. Rethinking Protocol Analysis from a Cultural Perspective.

    Science.gov (United States)

    Smagorinsky, Peter

    2001-01-01

    Outlines a cultural-historical activity theory (CHAT) perspective that accounts for protocol analysis along three key dimensions: the relationship between thinking and speech from a representational standpoint; the social role of speech in research methodology; and the influence of speech on thinking and data collection. (Author/VWL)

  7. Efficient secure two-party protocols

    CERN Document Server

    Hazay, Carmit

    2010-01-01

    The authors present a comprehensive study of efficient protocols and techniques for secure two-party computation -- both general constructions that can be used to securely compute any functionality, and protocols for specific problems of interest. The book focuses on techniques for constructing efficient protocols and proving them secure. In addition, the authors study different definitional paradigms and compare the efficiency of protocols achieved under these different definitions. The book opens with a general introduction to secure computation and then presents definitions of security for a

  8. Ocean Optics Protocols for Satellite Ocean Color Sensor Validation, Revision 4, Volume IV: Inherent Optical Properties: Instruments, Characterizations, Field Measurements and Data Analysis Protocols

    Science.gov (United States)

    Mueller, J. L.; Fargion, G. S.; McClain, C. R. (Editor); Pegau, S.; Zanefeld, J. R. V.; Mitchell, B. G.; Kahru, M.; Wieland, J.; Stramska, M.

    2003-01-01

    This document stipulates protocols for measuring bio-optical and radiometric data for the Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) Project activities and algorithm development. The document is organized into 6 separate volumes as Ocean Optics Protocols for Satellite Ocean Color Sensor Validation, Revision 4. Volume I: Introduction, Background, and Conventions; Volume II: Instrument Specifications, Characterization and Calibration; Volume III: Radiometric Measurements and Data Analysis Methods; Volume IV: Inherent Optical Properties: Instruments, Characterization, Field Measurements and Data Analysis Protocols; Volume V: Biogeochemical and Bio-Optical Measurements and Data Analysis Methods; Volume VI: Special Topics in Ocean Optics Protocols and Appendices. The earlier version of Ocean Optics Protocols for Satellite Ocean Color Sensor Validation, Revision 3 is entirely superseded by the six volumes of Revision 4 listed above.

  9. Portable abdomen radiography. Moving to thickness-based protocols

    International Nuclear Information System (INIS)

    Sanchez, Adrian A.; Reiser, Ingrid; Baxter, Tina; Zhang, Yue; Finkle, Joshua H.; Lu, Zheng Feng; Feinstein, Kate A.

    2018-01-01

    Default pediatric protocols on many digital radiography systems are configured based on patient age. However, age does not adequately characterize patient size, which is the principal determinant of proper imaging technique. Use of default pediatric protocols by inexperienced technologists can result in patient overexposure, inadequate image quality, or repeated examinations. To ensure diagnostic image quality at a well-managed patient radiation exposure by transitioning to thickness-based protocols for pediatric portable abdomen radiography. We aggregated patient thickness data, milliamperes (mAs), kilovoltage peak (kVp), exposure index (EI), source-to-detector distance, and grid use for all portable abdomen radiographs performed in our pediatric hospital in a database with a combination of automated and manual data collection techniques. We then analyzed the database and used it as the basis to construct thickness-based protocols with consistent image quality across varying patient thicknesses, as determined by the EI. Retrospective analysis of pediatric portable exams performed at our adult-focused hospitals demonstrated substantial variability in EI relative to our pediatric hospital. Data collection at our pediatric hospital over 4 months accumulated roughly 800 portable abdomen exams, which we used to develop a thickness-based technique chart. Through automated retrieval of data in our systems' digital radiography exposure logs and recording of patient abdomen thickness, we successfully developed thickness-based techniques for portable abdomen radiography. (orig.)
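
    A thickness-based technique chart of the kind described can be sketched as a lookup from measured abdomen thickness to exposure factors; the kVp/mAs values below are hypothetical placeholders, not the authors' chart:

```python
# Hypothetical thickness-based technique chart (not the published values):
# (maximum thickness in cm) -> (kVp, mAs)
TECHNIQUE_CHART = [
    (10, (60, 1.0)),
    (15, (66, 1.6)),
    (20, (72, 2.5)),
    (25, (78, 4.0)),
]

def select_technique(thickness_cm):
    """Pick kVp/mAs from measured patient thickness rather than age."""
    for max_thickness, technique in TECHNIQUE_CHART:
        if thickness_cm <= max_thickness:
            return technique
    raise ValueError("thickness exceeds chart; set technique manually")

print(select_technique(13.5))  # (66, 1.6)
```

    Keying the chart on thickness rather than age is the paper's central point: two patients of the same age can differ widely in abdominal thickness, and it is thickness that determines attenuation and hence the exposure index achieved.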

  10. Portable abdomen radiography. Moving to thickness-based protocols

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez, Adrian A.; Reiser, Ingrid; Baxter, Tina; Zhang, Yue; Finkle, Joshua H.; Lu, Zheng Feng; Feinstein, Kate A. [University of Chicago Medical Center, Department of Radiology, Chicago, IL (United States)

    2018-02-15

    Default pediatric protocols on many digital radiography systems are configured based on patient age. However, age does not adequately characterize patient size, which is the principal determinant of proper imaging technique. Use of default pediatric protocols by inexperienced technologists can result in patient overexposure, inadequate image quality, or repeated examinations. To ensure diagnostic image quality at a well-managed patient radiation exposure by transitioning to thickness-based protocols for pediatric portable abdomen radiography. We aggregated patient thickness data, milliamperes (mAs), kilovoltage peak (kVp), exposure index (EI), source-to-detector distance, and grid use for all portable abdomen radiographs performed in our pediatric hospital in a database with a combination of automated and manual data collection techniques. We then analyzed the database and used it as the basis to construct thickness-based protocols with consistent image quality across varying patient thicknesses, as determined by the EI. Retrospective analysis of pediatric portable exams performed at our adult-focused hospitals demonstrated substantial variability in EI relative to our pediatric hospital. Data collection at our pediatric hospital over 4 months accumulated roughly 800 portable abdomen exams, which we used to develop a thickness-based technique chart. Through automated retrieval of data in our systems' digital radiography exposure logs and recording of patient abdomen thickness, we successfully developed thickness-based techniques for portable abdomen radiography. (orig.)

  11. Design and Analysis of Transport Protocols for Reliable High-Speed Communications

    NARCIS (Netherlands)

    Oláh, A.

    1997-01-01

    The design and analysis of transport protocols for reliable communications constitutes the topic of this dissertation. These transport protocols guarantee the sequenced and complete delivery of user data over networks which may lose, duplicate and reorder packets. Reliable transport services are

  12. Analysis of the LTE Access Reservation Protocol for Real-Time Traffic

    DEFF Research Database (Denmark)

    Thomsen, Henning; Kiilerich Pratas, Nuno; Stefanovic, Cedomir

    2013-01-01

    LTE is increasingly seen as a system for serving real-time Machine-to-Machine (M2M) communication needs. The asynchronous M2M user access in LTE is obtained through a two-phase access reservation protocol (contention and data phase). Existing analysis related to these protocols is based...... of the two-phase LTE reservation protocol and assess its performance, when assumptions (1) and (2) do not hold.
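
    A first-order view of the contention phase above is the probability that a device's randomly chosen preamble is not picked by any other contender in the same random-access opportunity; a sketch assuming the typical LTE configuration of 54 contention-based preambles (an assumption, not taken from the paper):

```python
def singleton_preamble_prob(n_devices, n_preambles=54):
    """Probability that a given device picks a random-access preamble
    that none of the other n_devices - 1 contenders picks.
    (54 of LTE's 64 preambles are typically contention-based.)"""
    return (1 - 1 / n_preambles) ** (n_devices - 1)

# Success probability degrades as M2M contenders pile into one opportunity.
for n in (5, 30, 100):
    print(n, round(singleton_preamble_prob(n), 3))
```

    The sharp drop with growing `n` is exactly why analyses assuming light, independent arrivals break down for synchronized M2M traffic, the case the abstract targets.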

  13. DNA repair protocols

    DEFF Research Database (Denmark)

    Bjergbæk, Lotte

    In its 3rd edition, this Methods in Molecular Biology(TM) book covers the eukaryotic response to genomic insult including advanced protocols and standard techniques in the field of DNA repair. Offers expert guidance for DNA repair, recombination, and replication. Current knowledge of the mechanisms...... that regulate DNA repair has grown significantly over the past years with technology advances such as RNA interference, advanced proteomics and microscopy as well as high throughput screens. The third edition of DNA Repair Protocols covers various aspects of the eukaryotic response to genomic insult including...... recent advanced protocols as well as standard techniques used in the field of DNA repair. Both mammalian and non-mammalian model organisms are covered in the book, and many of the techniques can be applied with only minor modifications to other systems than the one described. Written in the highly...

  14. Formal Security Analysis of the MaCAN Protocol

    DEFF Research Database (Denmark)

    Bruni, Alessandro; Sojka, Michal; Nielson, Flemming

    2014-01-01

    analysis identifies two flaws in the original protocol: one creates unavailability concerns during key establishment, and the other allows re-using authenticated signals for different purposes. We propose and analyse a modification that improves its behaviour while fitting the constraints of CAN bus...

  15. Performance Analysis of Untraceability Protocols for Mobile Agents Using an Adaptable Framework

    OpenAIRE

    LESZCZYNA RAFAL; GORSKI Janusz Kazimierz

    2006-01-01

    Recently we had proposed two untraceability protocols for mobile agents and began investigating their quality. We believe that quality evaluation of security protocols should extend beyond a sole validation of their security and cover other quality aspects, primarily their efficiency. Thus, after conducting a security analysis, we wanted to complement it with a performance analysis. For this purpose we developed a performance evaluation framework, which, as we realised, with certain adjustments, can ...

  16. An optimized protocol for handling and processing fragile acini cultured with the hanging drop technique.

    Science.gov (United States)

    Snyman, Celia; Elliott, Edith

    2011-12-15

    The hanging drop three-dimensional culture technique allows cultivation of functional three-dimensional mammary constructs without exogenous extracellular matrix. The fragile acini are, however, difficult to preserve during processing steps for advanced microscopic investigation. We describe adaptations to the protocol for handling of hanging drop cultures to include investigation using confocal, scanning, and electron microscopy, with minimal loss of cell culture components. Copyright © 2011 Elsevier Inc. All rights reserved.

  17. A markerless protocol for genetic analysis of Aggregatibacter actinomycetemcomitans

    Science.gov (United States)

    Cheng, Ya-An; Jee, Jason; Hsu, Genie; Huang, Yanyan; Chen, Casey; Lin, Chun-Pin

    2015-01-01

    Background/Purpose The genomes of different Aggregatibacter actinomycetemcomitans strains contain many strain-specific genes and genomic islands (defined as DNA found in some but not all strains) of unknown functions. Genetic analysis of the functions of these islands will be constrained by the limited availability of genetic markers and vectors for A. actinomycetemcomitans. In this study we tested a novel genetic approach of gene deletion and restoration in a naturally competent A. actinomycetemcomitans strain D7S-1. Methods Deletion mutants of specific genes, and mutants with the deleted genes restored, were constructed with a markerless loxP/Cre system. In mutants with sequential deletion of multiple genes, loxP sites with different spacer regions were used to avoid unwanted recombination between loxP sites. Results Eight single-gene deletion mutants, four multiple-gene deletion mutants, and two mutants with restored genes were constructed. No unintended non-specific deletion mutants were generated by this protocol. The protocol did not negatively affect the growth and biofilm formation of A. actinomycetemcomitans. Conclusion The protocol described in this study is efficient and specific for genetic manipulation of A. actinomycetemcomitans, and will be amenable for functional analysis of multiple genes in A. actinomycetemcomitans. PMID:24530245

  18. Bioinspired Security Analysis of Wireless Protocols

    DEFF Research Database (Denmark)

    Petrocchi, Marinella; Spognardi, Angelo; Santi, Paolo

    2016-01-01

work, this paper investigates the feasibility of adopting fraglets as a model for specifying security protocols and analysing their properties. In particular, we give concrete sample analyses of a secure RFID protocol, showing the evolution of the protocol run as chemical dynamics and simulating an adversary...

  19. Performance Analysis of an Enhanced PRMA-HS Protocol for LEO Satellite Communication

    Institute of Scientific and Technical Information of China (English)

    ZHUO Yong-ning; YAN Shao-hu; WU Shi-qi

    2005-01-01

The packet reservation multiple access with hindering state (PRMA-HS) protocol is suitable for LEO satellite mobile communication. Although it works well under a light system payload (number of user terminals), the protocol imposes heavy channel congestion on a system with a heavy payload, degrading the system's quality of service. To control the channel congestion, an enhanced PRMA-HS protocol is proposed, which aims to reduce collisions of voice packets by adopting an access-control mechanism. Through theoretical analysis, the system's mathematical model is presented and the packet drop probability of the scheme is derived. To verify the performance of the scheme, a simulation is performed, and the results support our analysis.

  20. Emotional Freedom Techniques for Anxiety: A Systematic Review With Meta-analysis.

    Science.gov (United States)

    Clond, Morgan

    2016-05-01

Emotional Freedom Technique (EFT) combines elements of exposure and cognitive therapies with acupressure for the treatment of psychological distress. Randomized controlled trials retrieved by literature search were assessed for quality using the criteria developed by the American Psychological Association's Division 12 Task Force on Empirically Validated Treatments. As of December 2015, 14 studies (n = 658) met inclusion criteria. Results were analyzed using an inverse variance weighted meta-analysis. The pre-post effect size for the EFT treatment group was 1.23 (95% confidence interval, 0.82-1.64; p < .001). Emotional freedom technique treatment demonstrated a significant decrease in anxiety scores, even when accounting for the effect size of control treatment. However, there were too few data available comparing EFT to standard-of-care treatments such as cognitive behavioral therapy, and further research is needed to establish the relative efficacy of EFT compared to established protocols.
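As a rough illustration of the inverse variance weighted approach named in the abstract, the sketch below pools per-study effect sizes into a fixed-effect estimate with a 95% confidence interval. The effect sizes and variances are invented for illustration; they are not the study's data.

```python
import math

def pool_fixed_effect(effects, variances):
    """Fixed-effect (inverse-variance weighted) pooling of study effect sizes."""
    weights = [1.0 / v for v in variances]                       # w_i = 1 / var_i
    pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))                           # SE of the pooled estimate
    return pooled, se, (pooled - 1.96 * se, pooled + 1.96 * se)  # 95% CI

# Hypothetical per-study standardized mean differences and their variances
effects = [1.10, 1.40, 0.95, 1.50]
variances = [0.04, 0.09, 0.05, 0.12]
d, se, (lo, hi) = pool_fixed_effect(effects, variances)
print(round(d, 2), round(lo, 2), round(hi, 2))
```

Studies with smaller variance (larger samples) pull the pooled estimate harder, which is the defining property of this weighting scheme.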

  1. Performance Analysis of On-Demand Routing Protocols in Wireless Mesh Networks

    Directory of Open Access Journals (Sweden)

    Arafatur RAHMAN

    2009-01-01

Wireless Mesh Networks (WMNs) have recently gained a lot of popularity due to their rapid deployment and instant communication capabilities. WMNs are dynamically self-organizing, self-configuring and self-healing, with the nodes in the network automatically establishing an ad hoc network and preserving the mesh connectivity. Designing a routing protocol for WMNs requires several aspects to be considered, such as wireless networks, fixed applications, mobile applications, scalability, better performance metrics, efficient routing within the infrastructure, load balancing, throughput enhancement, interference, robustness, etc. To support communication, various routing protocols have been designed for various networks (e.g., ad hoc, sensor, wired). However, these protocols are not all suitable for WMNs because of the architectural differences among the networks. In this paper, a detailed simulation-based performance study and analysis is performed on reactive routing protocols to verify their suitability for such networks. Ad Hoc On-Demand Distance Vector (AODV), Dynamic Source Routing (DSR) and Dynamic MANET On-demand (DYMO) are considered as representatives of reactive routing protocols. The performance differentials are investigated under varying traffic load and number of sources. Based on the simulation results, recommendations are also made on how the performance of each protocol can be improved.

  2. Micro-computed tomography and bond strength analysis of different root canal filling techniques

    Directory of Open Access Journals (Sweden)

    Juliane Nhata

    2014-01-01

Introduction: The aim of this study was to evaluate the quality and bond strength of three root filling techniques (lateral compaction, continuous wave of condensation and Tagger's Hybrid technique [THT]) using micro-computed tomography (CT) images and push-out tests, respectively. Materials and Methods: Thirty mandibular incisors were prepared using the same protocol and randomly divided into three groups (n = 10): lateral condensation technique (LCT), continuous wave of condensation technique (CWCT), and THT. All specimens were filled with gutta-percha (GP) cones and AH Plus sealer. Five specimens of each group were randomly chosen for micro-CT analysis, and all of them were sectioned into 1 mm slices and subjected to push-out tests. Results: Micro-CT analysis revealed fewer empty spaces when GP was heated within the root canals in CWCT and THT than in LCT. Push-out tests showed that LCT and THT had a significantly higher displacement resistance (P < 0.05) when compared to CWCT. Bond strength was lower in the apical and middle thirds than in the coronal thirds. Conclusions: LCT and THT were associated with higher bond strengths to intraradicular dentine than CWCT. However, LCT was associated with more empty voids than the other techniques.

  3. IPv4 and IPv6 protocol compatibility options analysis

    Directory of Open Access Journals (Sweden)

    Regina Misevičienė

    2013-09-01

The popularity of the internet has led to very rapid growth in the number of IPv4 (Internet Protocol version 4) users. This caused a shortage of IP addresses, so a new version, IPv6 (Internet Protocol version 6), was created. Currently the two versions, IPv4 and IPv6, coexist. Due to large differences in addressing, the IPv4 and IPv6 protocols are incompatible, so ways must be found to move from IPv4 to IPv6. To facilitate the transition from one version to the other, various mechanisms and strategies have been developed. In this work, a comparative analysis is performed for the dual stack, 6to4 tunnel and NAT64 mechanisms. The analysis reveals the shortcomings of these mechanisms and informs their selection when making implementation decisions.
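As a concrete illustration of one of the compared mechanisms: a 6to4 site derives its IPv6 /48 prefix by embedding its public IPv4 address into the reserved 2002::/16 block (RFC 3056). A minimal sketch using Python's standard `ipaddress` module:

```python
import ipaddress

def sixto4_prefix(ipv4: str) -> ipaddress.IPv6Network:
    """Derive the 6to4 /48 prefix (2002:VVVV:WWWW::/48) from a public IPv4 address.

    The 32 IPv4 bits sit directly after the fixed 16-bit 2002:: prefix."""
    v4 = ipaddress.IPv4Address(ipv4)
    prefix_int = (0x2002 << 112) | (int(v4) << 80)  # place 0x2002, then the IPv4 bits
    return ipaddress.IPv6Network((prefix_int, 48))

print(sixto4_prefix("192.0.2.1"))  # 2002:c000:201::/48
```

Because the IPv4 address is recoverable from the IPv6 prefix, 6to4 relay routers can tunnel packets back over IPv4 without any per-site configuration.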

  4. The use of crypto-analysis techniques for securing internet ...

    African Journals Online (AJOL)

    ... recommended to be combined with other techniques, such as client-side software, data transaction protocols, web server software, and the network server operating system involved in handling e-commerce, for securing internet transaction. This recommendation will invariable ensure that internet transaction is secured.

  5. Observing documentary reading by verbal protocol

    Directory of Open Access Journals (Sweden)

    Fujita Mariangela Spotti Lopes

    2003-01-01

Verifies the applicability of the process observation technique known as Verbal Protocol, or Thinking Aloud, to research on indexers' reading strategies. This interpretative-qualitative data collection technique allows the observation of different kinds of processes during the progress of different kinds of tasks. Presents a theoretical investigation into "reading" and into formal methodological procedures for observing reading processes. Describes in detail the methodological procedures adopted in five case studies, with analysis of data samples. The project adopted three kinds of parameters for data analysis: theoretical, normative, and empirical (the last derived from observations made in the first case study). The results are compared, and important conclusions regarding documentary reading are drawn.

  6. Performance Analysis of the Mobile IP Protocol (RFC 3344 and Related RFCS)

    Science.gov (United States)

    2006-12-01

Performance Analysis of the Mobile IP Protocol (RFC 3344 and Related RFCs), by Chin Chin Ng, December 2006. Master's thesis; co-advisors: George W. Dinolt and J. D. … [Garbled excerpt from the thesis body: a type field of 9 identifies the ICMP message as an advertisement; Mobile IP home agents and foreign agents use a code value of 16 to prevent nodes…]

  7. Analysis of transmission speed of AX.25 Protocol implemented in satellital earth station UPTC

    Directory of Open Access Journals (Sweden)

    Oscar Fernando Vera Cely

    2015-11-01

One of the important parameters for the proper functioning of the satellite ground station planned at the Pedagogical and Technological University of Colombia (UPTC) is the transmission-speed efficiency of the communications protocol. This paper shows the results of an analysis of the transmission speed of the AX.25 protocol implemented in the communication system of the UPTC satellite ground station. It begins with a brief description of the implemented hardware; the behavior of the transmission rate is then evaluated using a theoretical analysis based on equations that estimate this parameter during protocol operation; tests are then performed using the hardware of the UPTC satellite ground station; and finally, the conclusions are presented. Comparing the theoretical analysis with the results obtained experimentally, it became apparent that AX.25 protocol efficiency is higher when the number of frames is increased.
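The concluding observation, that AX.25 efficiency rises with the number of frames per transmission, follows from amortizing fixed per-transmission costs (transmitter key-up delay, acknowledgement turnaround) over more frames. The toy model below illustrates this; the frame sizes, TXDelay and ack times are assumed illustrative values, not the station's measured figures.

```python
def ax25_efficiency(n_frames, payload_bytes=256, header_bytes=20,
                    baud=1200, txdelay_s=0.3, ack_s=0.5):
    """Illustrative AX.25 link efficiency: fraction of total airtime spent
    on payload bits, where key-up (TXDelay) and the acknowledgement
    turnaround are paid once per multi-frame transmission."""
    data_time = n_frames * payload_bytes * 8 / baud                     # useful payload time
    frame_time = n_frames * (payload_bytes + header_bytes) * 8 / baud   # payload + per-frame overhead
    total_time = frame_time + txdelay_s + ack_s                         # + fixed per-burst costs
    return data_time / total_time

print(round(ax25_efficiency(1), 3), round(ax25_efficiency(7), 3))
```

Sending a full window of 7 frames before waiting for an acknowledgement yields a markedly higher efficiency than sending one frame at a time, matching the trend reported in the abstract.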

  8. National protocol framework for the inventory and monitoring of bees

    Science.gov (United States)

Droege, Sam; Engler, Joseph D.; Sellers, Elizabeth A.; O'Brien, Lee

    2016-01-01

This national protocol framework is a standardized tool for the inventory and monitoring of the approximately 4,200 native and non-native bee species that may be found within the National Wildlife Refuge System (NWRS) administered by the U.S. Fish and Wildlife Service (USFWS). However, this protocol framework may also be used by other organizations and individuals to monitor bees in any given habitat or location. Our goal is to provide USFWS stations within the NWRS (NWRS stations are land units managed by the USFWS, such as national wildlife refuges, national fish hatcheries, wetland management districts, conservation areas, leased lands, etc.) with techniques for developing an initial baseline inventory of the bee species present on their lands, and to provide an inexpensive, simple technique for monitoring bees continuously and for evaluating long-term population trends and management impacts. The latter long-term monitoring technique requires a minimal time burden for the individual station, yet can provide a good statistical sample of changing populations that can be investigated at the station, regional, and national levels within the USFWS' jurisdiction, and compared to other sites within the United States and Canada. This protocol framework was developed in cooperation with the United States Geological Survey (USGS), the USFWS, and a worldwide network of bee researchers who have investigated the techniques and methods for capturing bees and tracking population changes. The protocol framework evolved from field and lab-based investigations at the USGS Bee Inventory and Monitoring Laboratory at the Patuxent Wildlife Research Center in Beltsville, Maryland, starting in 2002, and was refined by a large number of USFWS, academic, and state groups. It includes a Protocol Introduction and a set of 8 Standard Operating Procedures (SOPs) and adheres to national standards of protocol content and organization. The Protocol Narrative

  9. A Formal Analysis of the Web Services Atomic Transaction Protocol with UPPAAL

    DEFF Research Database (Denmark)

    Ravn, Anders Peter; Srba, Jiri; Vighio, Saleem

    2010-01-01

    We present a formal analysis of the Web Services Atomic Transaction (WS-AT) protocol. WS-AT is a part of the WS-Coordination framework and describes an algorithm for reaching agreement on the outcome of a distributed transaction. The protocol is modelled and verified using the model checker UPPAAL...

  10. Comparison of radiation doses using weight-based protocol and dose modulation techniques for patients undergoing biphasic abdominal computed tomography examinations

    Directory of Open Access Journals (Sweden)

    Livingstone Roshan

    2009-01-01

Computed tomography (CT) of the abdomen contributes a substantial man-made radiation dose to patients, and use of this modality is increasing. This study compares radiation dose and image quality between dose modulation techniques and a weight-based protocol for selecting exposure parameters in biphasic abdominal CT. Using a six-slice CT scanner, a prospective study of 426 patients who underwent abdominal CT examinations was performed. Constant tube potentials of 90 kV and 120 kV were used for the arterial and portal venous phases, respectively. The tube current-time product for the weight-based protocol was optimized according to the patient's body weight; in the dose modulation techniques it was selected automatically. The effective dose using the weight-based protocol, angular dose modulation and z-axis dose modulation was 11.3 mSv, 9.5 mSv and 8.2 mSv, respectively, for patients weighing 40 to 60 kg. For patients weighing 60 to 80 kg, the effective doses were 13.2 mSv, 11.2 mSv and 10.6 mSv, respectively. The use of dose modulation techniques resulted in a reduction of 16 to 28% in radiation dose with acceptable diagnostic accuracy in comparison to the weight-based protocol settings.
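The reported dose reductions can be checked directly from the effective doses quoted in the abstract; a quick sketch:

```python
def pct_reduction(baseline, modulated):
    """Percentage dose reduction of a modulated protocol relative to baseline."""
    return 100 * (baseline - modulated) / baseline

# Effective doses (mSv) from the abstract: weight-based, angular, z-axis,
# for the 40-60 kg and 60-80 kg weight groups respectively.
for weight_based, angular, z_axis in [(11.3, 9.5, 8.2), (13.2, 11.2, 10.6)]:
    print(round(pct_reduction(weight_based, angular), 1),
          round(pct_reduction(weight_based, z_axis), 1))
```

The computed reductions span roughly 15% to 27%, consistent with the 16 to 28% range stated in the abstract.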

  11. The effect of personalized versus standard patient protocols for radiostereometric analysis (RSA)

    DEFF Research Database (Denmark)

    Muharemovic, O; Troelsen, A; Thomsen, M G

    2018-01-01

    INTRODUCTION: Increasing pressure in the clinic requires a more standardized approach to radiostereometric analysis (RSA) imaging. The aim of this study was to investigate whether implementation of personalized RSA patient protocols could increase image quality and decrease examination time...... imaging. Radiographers in the control group used a standard RSA protocol. RESULTS: At three months, radiographers in the case group significantly reduced (p .... No significant improvements were found in the control group at any time point. CONCLUSION: There is strong evidence that personalized RSA patient protocols have a positive effect on image quality and radiation dose savings. Implementation of personal patient protocols as a RSA standard will contribute...

  12. Fetal MRI: techniques and protocols

    International Nuclear Information System (INIS)

    Prayer, Daniela; Brugger, Peter Christian; Prayer, Lucas

    2004-01-01

    The development of ultrafast sequences has led to a significant improvement in fetal MRI. Imaging protocols have to be adjusted to the rapidly developing fetal central nervous system (CNS) and to the clinical question. Sequence parameters must be changed to cope with the respective developmental stage, to produce images free from motion artefacts and to provide optimum visualization of the region and focus of interest. In contrast to postnatal studies, every suspect fetal CNS abnormality requires examination of the whole fetus and the extrafetal intrauterine structures including the uterus. This approach covers both aspects of fetal CNS disorders: isolated and complex malformations and cerebral lesions arising from the impaired integrity of the feto-placental unit. (orig.)

  13. Fetal MRI: techniques and protocols

    Energy Technology Data Exchange (ETDEWEB)

    Prayer, Daniela [Department of Neuroradiology, University Clinics of Radiodiagnostics, Medical University Vienna, Waehringerguertel 18-10, 1090, Vienna (Austria); Brugger, Peter Christian [Department of Anatomy, Integrative Morphology Group, Medical University Vienna (Austria); Prayer, Lucas [Diagnosezentrum Urania, Vienna (Austria)

    2004-09-01

    The development of ultrafast sequences has led to a significant improvement in fetal MRI. Imaging protocols have to be adjusted to the rapidly developing fetal central nervous system (CNS) and to the clinical question. Sequence parameters must be changed to cope with the respective developmental stage, to produce images free from motion artefacts and to provide optimum visualization of the region and focus of interest. In contrast to postnatal studies, every suspect fetal CNS abnormality requires examination of the whole fetus and the extrafetal intrauterine structures including the uterus. This approach covers both aspects of fetal CNS disorders: isolated and complex malformations and cerebral lesions arising from the impaired integrity of the feto-placental unit. (orig.)

  14. Quantitative CT: technique dependence of volume estimation on pulmonary nodules

    Science.gov (United States)

    Chen, Baiyu; Barnhart, Huiman; Richard, Samuel; Colsher, James; Amurao, Maxwell; Samei, Ehsan

    2012-03-01

    Current estimation of lung nodule size typically relies on uni- or bi-dimensional techniques. While new three-dimensional volume estimation techniques using MDCT have improved size estimation of nodules with irregular shapes, the effect of acquisition and reconstruction parameters on accuracy (bias) and precision (variance) of the new techniques has not been fully investigated. To characterize the volume estimation performance dependence on these parameters, an anthropomorphic chest phantom containing synthetic nodules was scanned and reconstructed with protocols across various acquisition and reconstruction parameters. Nodule volumes were estimated by a clinical lung analysis software package, LungVCAR. Precision and accuracy of the volume assessment were calculated across the nodules and compared between protocols via a generalized estimating equation analysis. Results showed that the precision and accuracy of nodule volume quantifications were dependent on slice thickness, with different dependences for different nodule characteristics. Other parameters including kVp, pitch, and reconstruction kernel had lower impact. Determining these technique dependences enables better volume quantification via protocol optimization and highlights the importance of consistent imaging parameters in sequential examinations.

  15. A comparison of maximal bioenergetic enzyme activities obtained with commonly used homogenization techniques.

    Science.gov (United States)

    Grace, M; Fletcher, L; Powers, S K; Hughes, M; Coombes, J

    1996-12-01

Homogenization of tissue for analysis of bioenergetic enzyme activities is a common practice in studies examining the metabolic properties of skeletal muscle adaptation to disease, aging, inactivity or exercise. While numerous homogenization techniques are in use today, limited information exists concerning the efficacy of specific homogenization protocols. Therefore, the purpose of this study was to compare the efficacy of four commonly used approaches to homogenizing skeletal muscle for analysis of bioenergetic enzyme activity. The maximal enzyme activities (Vmax) of citrate synthase (CS) and lactate dehydrogenase (LDH) were measured from homogeneous muscle samples (N = 48 per homogenization technique) and used as indicators to determine which protocol had the highest efficacy. The homogenization techniques were: (1) glass-on-glass pestle; (2) a combination of a mechanical blender and a teflon pestle (Potter-Elvehjem); (3) a combination of the mechanical blender and a biological detergent; and (4) the combined use of a mechanical blender and a sonicator. The glass-on-glass pestle protocol produced significantly higher (P < 0.05) Vmax values than the other techniques, indicating that the glass-on-glass pestle homogenization protocol is the technique of choice for studying bioenergetic enzyme activity in skeletal muscle.

  16. Staged protocol for the treatment of chronic femoral shaft osteomyelitis with Ilizarov's technique followed by the use of intramedullary locked nail

    Directory of Open Access Journals (Sweden)

    Po-Hsin Chou

    2017-06-01

    Conclusion: In the treatment of chronic femur osteomyelitis, the staged protocol of Ilizarov distraction osteogenesis followed by intramedullary nailing was safe and successful, and allowed for union, realignment, reorientation, and leg-length restoration. With regard to the soft tissue, this technique provides a unique type of reconstructive closure for infected wounds. It is suggested that the staged protocol is reliable in providing successful simultaneous reconstruction for bone and soft tissue defects without flap coverage.

  17. IEEE 802.11 Wireless LANs: Performance Analysis and Protocol Refinement

    Directory of Open Access Journals (Sweden)

    Chatzimisios P.

    2005-01-01

The IEEE 802.11 protocol is emerging as a widely used standard and has become the most mature technology for wireless local area networks (WLANs). In this paper, we focus on the tuning of the IEEE 802.11 protocol parameters, taking into consideration, in addition to throughput efficiency, performance metrics such as the average packet delay, the probability of a packet being discarded when it reaches the maximum retransmission limit, the average time to drop a packet, and the packet interarrival time. We present an analysis, validated by simulation, that is based on a Markov chain model commonly used in the literature. We further study the improvement of these performance metrics when suitable protocol parameters are employed according to the specific communication needs of the IEEE 802.11 protocol, for both the basic access and RTS/CTS access schemes. We show that the use of a higher initial contention window size does not considerably degrade performance in small networks and performs significantly better in any other scenario. Moreover, we conclude that the combination of a lower maximum contention window size and a higher retry limit considerably improves performance. The results indicate that appropriate adjustment of the protocol parameters enhances performance and improves the services that the IEEE 802.11 protocol provides to various communication applications.
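The Markov chain model "commonly used in the literature" for saturated 802.11 DCF is typically Bianchi's, whose core is a fixed point between the per-slot transmission probability tau and the conditional collision probability p. The sketch below solves that fixed point by bisection; the default window and backoff-stage values are illustrative, not the tuned parameters studied in the paper.

```python
def bianchi_tau(n, W=32, m=5):
    """Solve Bianchi's saturation model for 802.11 DCF with n stations,
    minimum contention window W and m backoff stages. Returns (tau, p):
    tau = per-slot transmission probability, p = collision probability."""
    def residual(tau):
        p = 1 - (1 - tau) ** (n - 1)                 # collision prob. seen by a station
        expr = 2 * (1 - 2 * p) / (
            (1 - 2 * p) * (W + 1) + p * W * (1 - (2 * p) ** m))
        return tau - expr                            # zero at the fixed point

    lo, hi = 1e-9, 1 - 1e-9                          # residual changes sign on (0, 1)
    for _ in range(100):                             # bisection to machine precision
        mid = (lo + hi) / 2
        if residual(lo) * residual(mid) <= 0:
            hi = mid
        else:
            lo = mid
    tau = (lo + hi) / 2
    return tau, 1 - (1 - tau) ** (n - 1)
```

As expected from the model, tau shrinks and p grows as the number of contending stations increases, which is exactly the congestion effect the paper's parameter tuning aims to mitigate.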

  18. Technical Analysis of SSP-21 Protocol

    Energy Technology Data Exchange (ETDEWEB)

    Bromberger, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-06-09

    As part of the California Energy Systems for the Twenty-First Century (CES-21) program, in December 2016 San Diego Gas and Electric (SDG&E) contracted with Lawrence Livermore National Laboratory (LLNL) to perform an independent verification and validation (IV&V) of a white paper describing their Secure SCADA Protocol for the Twenty-First Century (SSP-21) in order to analyze the effectiveness and propriety of cryptographic protocol use within the SSP-21 specification. SSP-21 is designed to use cryptographic protocols to provide (optional) encryption, authentication, and nonrepudiation, among other capabilities. The cryptographic protocols to be used reflect current industry standards; future versions of SSP-21 will use other advanced technologies to provide a subset of security services.

  19. A Secure Simplification of the PKMv2 Protocol in IEEE 802.16e-2005

    DEFF Research Database (Denmark)

    Yuksel, Ender; Nielson, Hanne Riis; Nielsen, Christoffer Rosenkilde

    2007-01-01

    Static analysis is successfully used for automatically validating security properties of classical cryptographic protocols. In this paper, we shall employ the same technique to a modern security protocol for wireless networks, namely the latest version of the Privacy and Key Management protocol...... for IEEE 802.16e, PKMv2. This protocol seems to have an exaggerated mixture of security features. Thus, we iteratively investigate which components are necessary for upholding the security properties and which can be omitted safely. This approach is based on the LySa process calculus and employs...

  20. Network meta-analysis: a technique to gather evidence from direct and indirect comparisons

    Science.gov (United States)

    2017-01-01

Systematic reviews and pairwise meta-analyses of randomized controlled trials, at the intersection of clinical medicine, epidemiology and statistics, sit at the top of the evidence-based practice hierarchy. They are important tools for drug approvals, for formulating clinical protocols and guidelines, and for decision-making. However, this traditional technique only partially yields the information that clinicians, patients and policy-makers need to make informed decisions, since it usually compares only two interventions at a time. On the market, regardless of the clinical condition under evaluation, many interventions are usually available, and few of them have been studied in head-to-head trials. This scenario precludes drawing conclusions about the full profile of all interventions (e.g. efficacy and safety). The recent development and introduction of a new technique, usually referred to as network meta-analysis, indirect meta-analysis, or multiple or mixed treatment comparisons, has allowed the estimation of metrics for all possible comparisons in the same model, simultaneously gathering direct and indirect evidence. Over the last years this statistical tool has matured as a technique, with models available for all types of raw data, producing different pooled effect measures, using both frequentist and Bayesian frameworks, with different software packages. However, the conduct, reporting and interpretation of a network meta-analysis still pose multiple challenges that should be carefully considered, especially because this technique inherits all the assumptions of pairwise meta-analysis but with increased complexity. Thus, we aim to provide a basic explanation of how a network meta-analysis is conducted, highlighting its risks and benefits for evidence-based practice, including information on the evolution of the statistical methods, their assumptions, and the steps for performing the analysis. PMID:28503228
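The simplest building block of indirect comparison is the Bucher adjusted method: an A-versus-C estimate is formed from A-versus-B and C-versus-B trials that share the common comparator B, with the variances adding. A minimal sketch with invented numbers (the effect sizes below are hypothetical log odds ratios, not data from any real network):

```python
import math

def bucher_indirect(d_ab, var_ab, d_cb, var_cb):
    """Adjusted indirect comparison (Bucher): estimate A vs C from trials of
    A vs B and C vs B that share the common comparator B.

    d_AC = d_AB - d_CB, and the variances of the two direct estimates add."""
    d_ac = d_ab - d_cb
    se = math.sqrt(var_ab + var_cb)
    return d_ac, (d_ac - 1.96 * se, d_ac + 1.96 * se)  # point estimate, 95% CI

# Hypothetical log odds ratios versus the shared comparator B
d, (lo, hi) = bucher_indirect(-0.50, 0.04, -0.20, 0.05)
print(round(d, 2), round(lo, 2), round(hi, 2))
```

Note how the indirect estimate is less precise than either direct comparison, since the variances accumulate; full network meta-analysis generalizes this idea by combining direct and indirect evidence over the whole network in one model.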

  1. Security analysis of session initiation protocol

    OpenAIRE

    Dobson, Lucas E.

    2010-01-01

    Approved for public release; distribution is unlimited The goal of this thesis is to investigate the security of the Session Initiation Protocol (SIP). This was accomplished by researching previously discovered protocol and implementation vulnerabilities, evaluating the current state of security tools and using those tools to discover new vulnerabilities in SIP software. The CVSS v2 system was used to score protocol and implementation vulnerabilities to give them a meaning that was us...

  2. Security analysis of standards-driven communication protocols for healthcare scenarios.

    Science.gov (United States)

    Masi, Massimiliano; Pugliese, Rosario; Tiezzi, Francesco

    2012-12-01

    The importance of the Electronic Health Record (EHR), that stores all healthcare-related data belonging to a patient, has been recognised in recent years by governments, institutions and industry. Initiatives like the Integrating the Healthcare Enterprise (IHE) have been developed for the definition of standard methodologies for secure and interoperable EHR exchanges among clinics and hospitals. Using the requisites specified by these initiatives, many large scale projects have been set up for enabling healthcare professionals to handle patients' EHRs. The success of applications developed in these contexts crucially depends on ensuring such security properties as confidentiality, authentication, and authorization. In this paper, we first propose a communication protocol, based on the IHE specifications, for authenticating healthcare professionals and assuring patients' safety. By means of a formal analysis carried out by using the specification language COWS and the model checker CMC, we reveal a security flaw in the protocol thus demonstrating that to simply adopt the international standards does not guarantee the absence of such type of flaws. We then propose how to emend the IHE specifications and modify the protocol accordingly. Finally, we show how to tailor our protocol for application to more critical scenarios with no assumptions on the communication channels. To demonstrate feasibility and effectiveness of our protocols we have fully implemented them.

  3. Privacy-Preserving Meter Report Protocol of Isolated Smart Grid Devices

    Directory of Open Access Journals (Sweden)

    Zhiwei Wang

    2017-01-01

Smart grid aims to improve the reliability, efficiency, and security of the traditional grid, allowing two-way transmission and efficiency-driven response. However, a main concern with this new technique is that fine-grained metering data may leak customers' personal privacy information. Thus, a data aggregation mechanism for privacy protection is required for the meter report protocol in the smart grid. In this paper, we propose an efficient privacy-preserving meter report protocol for isolated smart grid devices. Our protocol consists of an encryption scheme with an additively homomorphic property and a linearly homomorphic signature scheme, where the linearly homomorphic signature scheme is suitable for privacy-preserving data aggregation. We also provide a security analysis of our protocol in the context of some typical attacks in the smart grid. The implementation of our protocol on the Intel Edison platform shows that it is efficient enough for physically constrained devices like smart meters.
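To illustrate the additively homomorphic property such a protocol relies on, here is a textbook Paillier sketch with deliberately insecure toy primes; it demonstrates only that multiplying ciphertexts sums the underlying meter readings, and does not reproduce the paper's actual scheme (which also uses linearly homomorphic signatures).

```python
import math
import random

# Tiny textbook Paillier keypair (demo-sized primes; NOT secure)
p, q = 293, 433
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)   # Carmichael lambda of n
g = n + 1                      # standard generator choice

def L(u):
    return (u - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)  # decryption constant

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

# Additive homomorphism: the product of ciphertexts decrypts to the
# sum of plaintexts, so an aggregator can total readings it cannot read.
a, b = 1234, 5678  # e.g., two meter readings
assert decrypt((encrypt(a) * encrypt(b)) % n2) == (a + b) % n
```

In an aggregation protocol, each meter would encrypt its reading and the aggregator would multiply the ciphertexts, learning only the total consumption after the holder of the private key decrypts.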

  4. 2014 Building America House Simulation Protocols

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, E. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Engebrecht-Metzger, C. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Horowitz, S. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Hendron, R. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2014-03-01

As Building America (BA) has grown to include a large and diverse cross-section of the home building and retrofit industries, it has become more important to develop accurate, consistent analysis techniques to measure progress towards the program's goals. The House Simulation Protocol (HSP) document provides guidance to program partners and managers so they can compare energy savings for new construction and retrofit projects. The HSP provides the program with analysis methods that are proven to be effective and reliable in investigating the energy use of advanced energy systems and of entire houses.

  5. 2014 Building America House Simulation Protocols

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, E. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Engebrecht, C. Metzger [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Horowitz, S. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hendron, R. [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2014-03-01

    As Building America has grown to include a large and diverse cross-section of the home building and retrofit industries, it has become more important to develop accurate, consistent analysis techniques to measure progress towards the program's goals. The House Simulation Protocol (HSP) document provides guidance to program partners and managers so they can compare energy savings for new construction and retrofit projects. The HSP provides the program with analysis methods that are proven to be effective and reliable in investigating the energy use of advanced energy systems and of entire houses.

  6. Techniques for incorporating operator expertise into intelligent decision aids and training

    International Nuclear Information System (INIS)

    Blackman, H.S.; Nelson, W.R.

    1988-01-01

    An experiment is presented that was designed to investigate the use of protocol analysis, during task performance, as a knowledge engineering technique that provides a direct tie between knowledge and performance. The technique is described, and problem-solving strategies that were found to correlate with optimal performance are presented. The results indicate that protocol analysis adds a dimension to more standard knowledge engineering approaches by providing a more complete picture of the expert's knowledge and a performance yardstick for determining optimal problem-solving strategies. Implications for the developers of expert systems and training programs are discussed. (author)

  7. Formal analysis of a fair payment protocol

    NARCIS (Netherlands)

    J.G. Cederquist; M.T. Dashti (Mohammad)

    2004-01-01

    We formally specify a payment protocol. This protocol is intended for fair exchange of time-sensitive data. Here the μCRL language is used to formalize the protocol. Fair exchange properties are expressed in the regular alternation-free μ-calculus. These properties are then verified

  8. The effect of personalized versus standard patient protocols for radiostereometric analysis (RSA).

    Science.gov (United States)

    Muharemovic, O; Troelsen, A; Thomsen, M G; Kallemose, T; Gosvig, K K

    2018-05-01

    Increasing pressure in the clinic requires a more standardized approach to radiostereometric analysis (RSA) imaging. The aim of this study was to investigate whether implementation of personalized RSA patient protocols could increase image quality and decrease examination time and the number of exposure repetitions. Forty patients undergoing primary total hip arthroplasty were equally randomized to either a case or a control group. Radiographers in the case group were assisted by personalized patient protocols containing information about each patient's post-operative RSA imaging. Radiographers in the control group used a standard RSA protocol. At three months, radiographers in the case group significantly reduced (p RSA patient protocols have a positive effect on image quality and radiation dose savings. Implementation of personal patient protocols as a RSA standard will contribute to the reduction of examination time, thus ensuring a cost benefit for department and patient safety. Copyright © 2017 The College of Radiographers. Published by Elsevier Ltd. All rights reserved.

  9. Formal Analysis of a Fair Payment Protocol

    NARCIS (Netherlands)

    Cederquist, J.G.; Dashti, M.T.

    2004-01-01

    We formally specify a payment protocol. This protocol is intended for fair exchange of time-sensitive data. Here the μCRL language is used to formalize the protocol. Fair exchange properties are expressed in the regular alternation-free μ-calculus. These properties are then verified using the finite

  10. I-123 IBZM-SPECT: improved acquisition protocol with HR-collimator

    International Nuclear Information System (INIS)

    Sandrock, D.; Ivancevic, V.; Dopichaj-Menge, U.; Gruber, D.; Munz, D.

    2002-01-01

    Aim: The aim of this study was to evaluate the impact of different acquisition and reconstruction protocols on the quality of SPECT diagnostics with I-123 IBZM in patients with diseases of the dopaminergic system. Material and Methods: Overall, 30 patients (19 men, 11 women, aged 22 - 80 years) were studied with SPECT after i.v. injection of 5 mCi (185 MBq) I-123 IBZM. Acquisition was performed 60 min p.i. (protocol A) using a medium energy collimator, a 64 x 64 matrix, 64 projections, step and shoot technique, 5.6° and 20 seconds per step with a MultiSPECT2 double-head gamma camera (Siemens). Immediately afterwards, a second acquisition (protocol B, 90 min p.i.) was performed using a high resolution collimator, a 128 x 128 matrix, 60 projections, step and shoot technique, 3° and 30 seconds per step. The reconstruction was done in filtered backprojection technique with a Butterworth filter of order 7, in protocol A using a cutoff of 0.5 and in protocol B of 0.4. Finally, the net count ratios of the basal ganglia to the frontal lobe(s) were calculated and compared with the clinical diagnosis. Results: The visual analysis yielded - as expected - a better image quality for protocol B (concordant impression of 3 independent observers) with more accurate delineation of the basal ganglia. The count ratios with protocol B were (mean) 0.19 higher (equivalent to 13%) than with protocol A. In the group of patients with count ratios > 1.55 there was the highest (and significant) difference between protocols A and B, with a mean count ratio difference of 0.32 (equivalent to 20%). Protocol B also allowed a better differentiation between patients clinically staged normal and abnormal. In patients with unilateral disease, this difference between normal and abnormal was more prominent (in comparison to the contralateral side as well as compared to the reference value [1.5]). The comparison of the count ratios with the clinical data revealed 7 patients with borderline results with

  11. XPS Protocol for the Characterization of Pristine and Functionalized Single Wall Carbon Nanotubes

    Science.gov (United States)

    Sosa, E. D.; Allada, R.; Huffman, C. B.; Arepalli, S.

    2009-01-01

    Recent interest in developing new applications for carbon nanotubes (CNTs) has fueled the need to use accurate macroscopic and nanoscopic techniques to characterize and understand their chemistry. X-ray photoelectron spectroscopy (XPS) has proved to be a useful analytical tool for nanoscale surface characterization of materials including carbon nanotubes. Recent nanotechnology research at NASA Johnson Space Center (NASA-JSC) helped to establish a characterization protocol for quality assessment for single wall carbon nanotubes (SWCNTs). Here, a review of some of the major factors of the XPS technique that can influence the quality of analytical data, suggestions for methods to maximize the quality of data obtained by XPS, and the development of a protocol for XPS characterization as a complementary technique for analyzing the purity and surface characteristics of SWCNTs are presented. The XPS protocol is then applied to a number of experiments including impurity analysis and the study of chemical modifications for SWCNTs.

  12. Formal Analysis of a Fair Payment Protocol

    NARCIS (Netherlands)

    Cederquist, J.G.; Dashti, Muhammad Torabi; Dimitrakos, Theo; Martinelli, Fabio

    We formally specify a payment protocol described by Vogt et al. This protocol is intended for fair exchange of time-sensitive data. Here the μCRL language is used to formalize the protocol. Fair exchange properties are expressed in the regular alternation-free μ-calculus. These properties are then

  13. Reliability of diagnostic imaging techniques in suspected acute appendicitis: proposed diagnostic protocol

    International Nuclear Information System (INIS)

    Cura del, J. L.; Oleaga, L.; Grande, D.; Vela, A. C.; Ibanez, A. M.

    2001-01-01

    To study the utility of ultrasound and computed tomography (CT) in cases of suspected appendicitis; to determine the diagnostic yield in terms of different clinical contexts and patient characteristics; and to assess the costs and benefits of introducing these techniques and propose a protocol for their use. Negative appendectomies, complications and length of hospital stay in a group of 152 patients with suspected appendicitis who underwent ultrasound and CT were compared with those of 180 patients who underwent appendectomy during the same time period but had not been selected for the first group. The costs for each group were calculated. In the first group, the diagnostic value of the clinical signs was also evaluated. The reliability of the clinical signs was limited, while the results with ultrasound and CT were excellent. The incidence of negative appendectomy was 9.6% in the study group and 12.2% in the control group. Moreover, there were fewer complications and a shorter hospital stay in the first group. Among men, however, the rate of negative appendectomy was lower in the control group. The cost of using ultrasound and CT in the management of appendicitis was only slightly higher than that of the control group. Although ultrasound and CT are not necessary in cases in which the probability of appendicitis is low or in men presenting clear clinical evidence, the use of these techniques is indicated in the remaining cases in which appendicitis is suspected. In children, ultrasound is the technique of choice. In all other patients, if negative results are obtained with one of the two techniques, the other should be performed. (Author) 49 refs

  14. Cost-utility analysis of an advanced pressure ulcer management protocol followed by trained wound, ostomy, and continence nurses.

    Science.gov (United States)

    Kaitani, Toshiko; Nakagami, Gojiro; Iizaka, Shinji; Fukuda, Takashi; Oe, Makoto; Igarashi, Ataru; Mori, Taketoshi; Takemura, Yukie; Mizokami, Yuko; Sugama, Junko; Sanada, Hiromi

    2015-01-01

    The high prevalence of severe pressure ulcers (PUs) is an important issue that needs to be highlighted in Japan. In a previous study, we devised an advanced PU management protocol to enable early detection of and intervention for deep tissue injury and critical colonization. This protocol was effective for preventing more severe PUs. The present study aimed to compare the cost-effectiveness of the care provided using an advanced PU management protocol, from a medical provider's perspective, implemented by trained wound, ostomy, and continence nurses (WOCNs), with that of conventional care provided by a control group of WOCNs. A Markov model was constructed for a 1-year time horizon to determine the incremental cost-effectiveness ratio of advanced PU management compared with conventional care. The number of quality-adjusted life-years gained, and the cost in Japanese yen (¥) ($US1 = ¥120; 2015) was used as the outcome. Model inputs for clinical probabilities and related costs were based on our previous clinical trial results. Univariate sensitivity analyses were performed. Furthermore, a Bayesian multivariate probability sensitivity analysis was performed using Monte Carlo simulations with advanced PU management. Two different models were created for initial cohort distribution. For both models, the expected effectiveness for the intervention group using advanced PU management techniques was high, with a low expected cost value. The sensitivity analyses suggested that the results were robust. Intervention by WOCNs using advanced PU management techniques was more effective and cost-effective than conventional care. © 2015 by the Wound Healing Society.
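The incremental cost-effectiveness calculation described in this abstract can be illustrated with a minimal Markov cohort sketch. All transition probabilities, state costs, and utility weights below are hypothetical placeholders, not values from the study; only the structure (health states, monthly cycles over a 1-year horizon, ICER = Δcost/ΔQALY) follows the standard pattern such analyses use.

```python
import numpy as np

def run_markov(transition, state_costs, state_utils, start, cycles=12):
    """Cohort simulation: accumulate expected cost and QALYs over monthly cycles."""
    dist = np.array(start, dtype=float)
    total_cost = 0.0
    total_qaly = 0.0
    for _ in range(cycles):
        dist = dist @ transition                   # advance the cohort one cycle
        total_cost += dist @ state_costs           # expected cost accrued this cycle
        total_qaly += dist @ (state_utils / 12.0)  # monthly share of annual utility
    return total_cost, total_qaly

# States: 0 = healed, 1 = superficial PU, 2 = severe PU (all numbers hypothetical)
conventional = np.array([[0.90, 0.08, 0.02],
                         [0.20, 0.65, 0.15],
                         [0.05, 0.25, 0.70]])
advanced     = np.array([[0.95, 0.04, 0.01],
                         [0.35, 0.58, 0.07],
                         [0.10, 0.35, 0.55]])
costs = np.array([0.0, 30000.0, 90000.0])   # yen per cycle spent in each state
utils = np.array([0.85, 0.70, 0.50])        # annual QALY weight of each state
start = [0.0, 0.8, 0.2]                     # initial cohort distribution

c0, e0 = run_markov(conventional, costs, utils, start)
c1, e1 = run_markov(advanced, costs, utils, start)
icer = (c1 - c0) / (e1 - e0)                # yen per QALY gained
print(f"conventional: cost={c0:.0f} yen, QALYs={e0:.3f}")
print(f"advanced:     cost={c1:.0f} yen, QALYs={e1:.3f}")
print(f"ICER = {icer:.0f} yen/QALY")
```

With these placeholder inputs the advanced strategy accrues both lower cost and higher effectiveness, so the ICER is negative and the intervention dominates, which mirrors the qualitative shape of the reported conclusion.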

  15. Tool Supported Analysis of Web Services Protocols

    DEFF Research Database (Denmark)

    Marques, Abinoam P.; Ravn, Anders Peter; Srba, Jiri

    2011-01-01

    We describe an abstract protocol model suitable for modelling of web services and other protocols communicating via unreliable, asynchronous communication channels. The model is supported by a tool chain where the first step translates tables with state/transition protocol descriptions, often used e.g. in the design of web services protocols, into an intermediate XML format. We further translate this format into a network of communicating state machines directly suitable for verification in the model checking tool UPPAAL. We introduce two types of communication media abstractions in order

  16. Fed-state gastric media and drug analysis techniques: Current status and points to consider.

    Science.gov (United States)

    Baxevanis, Fotios; Kuiper, Jesse; Fotaki, Nikoletta

    2016-10-01

    Gastric fed state conditions can have a significant effect on drug dissolution and absorption. In vitro dissolution tests with simple aqueous media cannot usually predict drugs' in vivo response, as several factors such as the meal content, the gastric emptying and possible interactions between food and drug formulations can affect a drug's pharmacokinetics. Good understanding of the effect of the in vivo fed gastric conditions on the drug is essential for the development of biorelevant dissolution media simulating the gastric environment after the administration of the standard high fat meal proposed by the FDA and the EMA in bioavailability/bioequivalence (BA/BE) studies. The analysis of drugs in fed state media can be quite challenging, as most analytical protocols currently employed are time-consuming and labour-intensive. In this review, an overview of the in vivo gastric conditions and the biorelevant media used for their in vitro simulation is presented. Furthermore, an analysis of the physicochemical properties of the drugs and the formulations related to food effect is given. In terms of drug analysis, the protocols currently used for fed state media sample treatment and analysis, and the analytical challenges and needs emerging for more efficient and time-saving techniques for a broad spectrum of compounds, are discussed. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. Dynamic Channel Slot Allocation Scheme and Performance Analysis of Cyclic Quorum Multichannel MAC Protocol

    Directory of Open Access Journals (Sweden)

    Xing Hu

    2017-01-01

    In high-diversity node situations, a multichannel MAC protocol can improve frequency efficiency, owing to fewer collisions compared with a single-channel MAC protocol. The performance of the cyclic quorum-based multichannel (CQM) MAC protocol is outstanding. Based on a cyclic quorum system and channel slot allocation, it can avoid the bottleneck that others suffer from and can be easily realized with only one transceiver. To obtain the accurate performance of the CQM MAC protocol, a Markov chain model, which combines the channel-hopping strategy of the CQM protocol and the IEEE 802.11 distributed coordination function (DCF), is proposed. The results of numerical analysis show that the optimal performance of the CQM protocol can be obtained in the saturation bound situation. We then obtain the saturation bound of the CQM system by the bird swarm algorithm. In addition, to improve the performance of the CQM protocol in the unsaturation situation, a dynamic channel slot allocation of CQM (DCQM) protocol is proposed, based on a wavelet neural network. Finally, the performance of the CQM protocol and the DCQM protocol is simulated on the Qualnet platform. The simulation results show that the analytic and simulation results match very well; the DCQM performs better in the unsaturation situation.
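Markov-chain performance analyses of 802.11-style MAC protocols typically build on Bianchi's fixed-point model of the DCF, in which the per-slot transmission probability tau and the conditional collision probability p solve two coupled equations. The sketch below is a generic illustration of that standard model, not the paper's extended CQM analysis; the contention-window parameters are illustrative.

```python
def bianchi_fixed_point(n, W=32, m=5, iters=2000):
    """Damped fixed-point iteration for Bianchi's DCF model:
       tau = 2(1-2p) / ((1-2p)(W+1) + p*W*(1-(2p)^m))
       p   = 1 - (1-tau)^(n-1)"""
    tau = 0.1
    for _ in range(iters):
        p = 1.0 - (1.0 - tau) ** (n - 1)
        tau_new = (2.0 * (1.0 - 2.0 * p)
                   / ((1.0 - 2.0 * p) * (W + 1) + p * W * (1.0 - (2.0 * p) ** m)))
        tau = 0.5 * tau + 0.5 * tau_new   # damping keeps the iteration stable
    return tau, p

results = {}
for n in (5, 10, 20):                      # number of contending stations
    tau, p = bianchi_fixed_point(n)
    p_tr = 1.0 - (1.0 - tau) ** n          # some station transmits in a slot
    p_s = n * tau * (1.0 - tau) ** (n - 1) / p_tr   # that transmission succeeds
    results[n] = (tau, p, p_s)
    print(f"n={n:2d}: tau={tau:.4f}, collision p={p:.4f}, P(success)={p_s:.4f}")
```

As the number of contenders grows, the collision probability rises and the per-station transmission probability falls, which is the saturation behaviour such models are used to quantify.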

  18. Analysis of the differential-phase-shift-keying protocol in the quantum-key-distribution system

    International Nuclear Information System (INIS)

    Rong-Zhen, Jiao; Chen-Xu, Feng; Hai-Qiang, Ma

    2009-01-01

    The analysis is based on the error rate and the secure communication rate as functions of distance for three quantum-key-distribution (QKD) protocols: the Bennett–Brassard 1984, the Bennett–Brassard–Mermin 1992, and the coherent differential-phase-shift keying (DPSK) protocols. We consider the secure communication rate of the DPSK protocol against an arbitrary individual attack, including the most commonly considered intercept-resend and photon-number splitting attacks, and conclude that the simple and efficient differential-phase-shift-keying protocol allows for more than 200 km of secure communication distance with high communication rates. (general)
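The distance dependence of the rates discussed above comes largely from fiber loss: transmittance decays exponentially with length, so detection rates (and ultimately secure key rates) fall with distance. A toy scaling sketch follows, with illustrative loss and detector parameters rather than the paper's DPSK security bound.

```python
def transmittance(km, loss_db_per_km=0.2):
    """Fiber transmittance eta = 10^(-alpha * L / 10) for attenuation alpha in dB/km."""
    return 10.0 ** (-loss_db_per_km * km / 10.0)

def detection_rate(km, pulse_rate=1e9, mu=0.2, eta_det=0.1):
    """Rough detection rate for weak coherent pulses of mean photon number mu."""
    return pulse_rate * mu * transmittance(km) * eta_det

for km in (50, 100, 200):
    print(f"{km:3d} km: eta = {transmittance(km):.2e}, "
          f"detections/s ~ {detection_rate(km):.2e}")
```

At the standard telecom attenuation of 0.2 dB/km, 200 km already costs 40 dB of channel loss, which is why protocol efficiency matters so much at that range.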

  19. Effectiveness of behavioral change techniques employed in eHealth interventions designed to improve glycemic control in persons with poorly controlled type 2 diabetes: a systematic review and meta-analysis protocol

    Directory of Open Access Journals (Sweden)

    Mihiretu Kebede

    2017-10-01

    Abstract Background The incorporation of Behavioral Change Techniques (BCTs) in eHealth interventions for the management of non-communicable diseases (NCDs), such as type 2 diabetes mellitus (T2DM), might be a promising approach to improve clinical and behavioral outcomes of NCDs in the long run. This paper reports a protocol for a systematic review that aims to (a) identify the effects of individual BCTs in eHealth interventions for lowering glycated hemoglobin (HbA1c) levels and (b) investigate which additional intervention features (duration of intervention, tailoring, theory base, and mode of delivery) affect levels of HbA1c in this population. The protocol follows the Preferred Reporting Items for Systematic review and Meta-Analysis Protocols (PRISMA-P) 2015 guideline. Methods/design To identify eligible studies, an extensive systematic database search (PubMed, Web of Science, and PsycINFO) using keywords will be conducted. This review will include randomized controlled trials examining the effects of eHealth interventions on HbA1c in persons with poorly controlled T2DM over a minimum follow-up period of 3 months. Relevant data will be extracted from the included studies using Microsoft Excel. The content of the interventions will be extracted from the description of interventions and will be classified according to the BCT taxonomy v1 tool. The quality of studies will be independently assessed by two reviewers using the Cochrane risk of bias tool. If the studies have adequate homogeneity, meta-analysis will be considered. The effect sizes of each BCT will be calculated using the random effect model. The quality of the synthesized evidence will be evaluated employing the Grading of Recommendations Assessment, Development and Evaluation (GRADE) criteria. Discussion This systematic review is one of the first to appraise the effectiveness of eHealth interventions employing BCTs aimed at improving glycemic control in persons with poorly

  20. Timing Analysis of the FlexRay Communication Protocol

    DEFF Research Database (Denmark)

    Pop, Traian; Pop, Paul; Eles, Petru

    2006-01-01

    FlexRay will very likely become the de-facto standard for in-vehicle communications. However, before it can be successfully used for safety-critical applications that require predictability, timing analysis techniques are necessary for providing bounds for the message communication times. In this paper, we propose techniques for determining the timing properties of messages transmitted in both the static (ST) and the dynamic (DYN) segments of a FlexRay communication cycle. The analysis techniques for messages are integrated in the context of a holistic schedulability analysis that computes
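Holistic schedulability analyses of this kind build on classic fixed-point response-time calculations: a message's worst-case response time is its own transmission time plus the interference from all higher-priority traffic, iterated to convergence. The sketch below is a generic illustration (standard fixed-priority response-time analysis, not the FlexRay-specific ST/DYN equations; the message set is hypothetical).

```python
import math

def response_time(msgs, i, limit=1000):
    """Fixed-point iteration R = C_i + sum over hp(i) of ceil(R / T_j) * C_j.
       msgs: list of (C, T) pairs, highest priority first; deadline = period."""
    C_i, T_i = msgs[i]
    R = C_i
    for _ in range(limit):
        R_new = C_i + sum(math.ceil(R / T) * C for C, T in msgs[:i])
        if R_new == R:
            return R          # converged: worst-case response time
        if R_new > T_i:
            return None       # misses its deadline: unschedulable
        R = R_new
    return None

# Hypothetical message set: (transmission time C, period T), in priority order
msgs = [(1, 4), (1, 6), (2, 10)]
for i in range(len(msgs)):
    print(f"message {i}: worst-case response time = {response_time(msgs, i)}")
```

For this toy set the iteration converges quickly; the lowest-priority message absorbs one preemption from each higher-priority message before its bound stabilises.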

  1. Implementation and Analysis of Real-Time Streaming Protocols.

    Science.gov (United States)

    Santos-González, Iván; Rivero-García, Alexandra; Molina-Gil, Jezabel; Caballero-Gil, Pino

    2017-04-12

    Communication media have become the primary way of interaction thanks to the discovery and innovation of many new technologies. One of the most widely used communication systems today is video streaming, which is constantly evolving. Such communications are a good alternative to face-to-face meetings, and are therefore very useful for coping with many problems caused by distance. However, they suffer from different issues such as bandwidth limitation, network congestion, energy efficiency, cost, reliability and connectivity. Hence, the quality of service and the quality of experience are considered the two most important issues for this type of communication. This work presents a complete comparative study of two of the most widely used video streaming protocols, the Real Time Streaming Protocol (RTSP) and Web Real-Time Communication (WebRTC). In addition, this paper proposes two new mobile applications that implement those protocols in Android, whose objective is to determine how they are influenced by the aspects that most affect streaming quality of service: the connection establishment time and the stream reception time. The new video streaming applications are also compared with the most popular video streaming applications for Android, and the experimental results of the analysis show that the developed WebRTC implementation improves the performance of the most popular video streaming applications with respect to the stream packet delay.

  2. Practical security analysis of a quantum stream cipher by the Yuen 2000 protocol

    International Nuclear Information System (INIS)

    Hirota, Osamu

    2007-01-01

    There exists a great gap between one-time pad with perfect secrecy and conventional mathematical encryption. The Yuen 2000 (Y00) protocol or αη scheme may provide a protocol which covers from the conventional security to the ultimate one, depending on implementations. This paper presents the complexity-theoretic security analysis on some models of the Y00 protocol with nonlinear pseudo-random-number-generator and quantum noise diffusion mapping (QDM). Algebraic attacks and fast correlation attacks are applied with a model of the Y00 protocol with nonlinear filtering like the Toyocrypt stream cipher as the running key generator, and it is shown that these attacks in principle do not work on such models even when the mapping between running key and quantum state signal is fixed. In addition, a security property of the Y00 protocol with QDM is clarified. Consequently, we show that the Y00 protocol has a potential which cannot be realized by conventional cryptography and that it goes beyond mathematical encryption with physical encryption

  3. Mobile Internet Protocol Analysis

    National Research Council Canada - National Science Library

    Brachfeld, Lawrence

    1999-01-01

    ...) and User Datagram Protocol (UDP). Mobile IP allows mobile computers to send and receive packets addressed with their home network IP address, regardless of the IP address of their current point of attachment on the Internet...

  4. Symbolic Analysis of Cryptographic Protocols

    DEFF Research Database (Denmark)

    Dahl, Morten

    We present our work on using abstract models for formally analysing cryptographic protocols: First, we present an efficient method for verifying trace-based authenticity properties of protocols using nonces, symmetric encryption, and asymmetric encryption. The method is based on a type system of Gordon et al., which we modify to support fully-automated type inference. Tests conducted via an implementation of our algorithm found it to be very efficient. Second, we show how privacy may be captured in a symbolic model using an equivalence-based property and give a formal definition. We formalise

  5. Specimen preparation, imaging, and analysis protocols for knife-edge scanning microscopy.

    Science.gov (United States)

    Choe, Yoonsuck; Mayerich, David; Kwon, Jaerock; Miller, Daniel E; Sung, Chul; Chung, Ji Ryang; Huffman, Todd; Keyser, John; Abbott, Louise C

    2011-12-09

    Major advances in high-throughput, high-resolution, 3D microscopy techniques have enabled the acquisition of large volumes of neuroanatomical data at submicrometer resolution. One of the first such instruments producing whole-brain-scale data is the Knife-Edge Scanning Microscope (KESM), developed and hosted in the authors' lab. KESM has been used to section and image whole mouse brains at submicrometer resolution, revealing the intricate details of the neuronal networks (Golgi), vascular networks (India ink), and cell body distribution (Nissl). The use of KESM is not restricted to the mouse nor the brain. We have successfully imaged the octopus brain, mouse lung, and rat brain. We are currently working on whole zebra fish embryos. Data like these can greatly contribute to connectomics research; to microcirculation and hemodynamic research; and to stereology research by providing an exact ground-truth. In this article, we will describe the pipeline, including specimen preparation (fixing, staining, and embedding), KESM configuration and setup, sectioning and imaging with the KESM, image processing, data preparation, and data visualization and analysis. The emphasis will be on specimen preparation and visualization/analysis of obtained KESM data. We expect the detailed protocol presented in this article to help broaden the access to KESM and increase its utilization.

  6. A Secure Key Establishment Protocol for ZigBee Wireless Sensor Networks

    DEFF Research Database (Denmark)

    Yuksel, Ender; Nielson, Hanne Riis; Nielson, Flemming

    2009-01-01

    ZigBee is a wireless sensor network standard that defines network and application layers on top of IEEE 802.15.4’s physical and medium access control layers. In the latest version of ZigBee, enhancements are prescribed for the security sublayer, but we show in this paper that problems persist. In particular we show that the End-to-End Application Key Establishment Protocol is flawed and we propose a secure protocol instead. We do so by using formal verification techniques based on static program analysis and process algebras. We present a way of using formal methods in wireless network security, and propose a secure key establishment protocol for ZigBee networks.

  7. Surface analysis the principal techniques

    CERN Document Server

    Vickerman, John C

    2009-01-01

    This completely updated and revised second edition of Surface Analysis: The Principal Techniques, deals with the characterisation and understanding of the outer layers of substrates, how they react, look and function which are all of interest to surface scientists. Within this comprehensive text, experts in each analysis area introduce the theory and practice of the principal techniques that have shown themselves to be effective in both basic research and in applied surface analysis. Examples of analysis are provided to facilitate the understanding of this topic and to show readers how they c

  8. A content analysis of posthumous sperm procurement protocols with considerations for developing an institutional policy.

    Science.gov (United States)

    Bahm, Sarah M; Karkazis, Katrina; Magnus, David

    2013-09-01

    To identify and analyze existing posthumous sperm procurement (PSP) protocols in order to outline central themes for institutions to consider when developing future policies. Qualitative content analysis. Large academic institutions across the United States. We performed a literature search and contacted 40 institutions to obtain nine full PSP protocols. We then performed a content analysis on these policies to identify major themes and factors to consider when developing a PSP protocol. Presence of a PSP policy. We identified six components of a thorough PSP protocol: Standard of Evidence, Terms of Eligibility, Sperm Designee, Restrictions on Use in Reproduction, Logistics, and Contraindications. We also identified two different approaches to policy structure. In the Limited Role approach, institutions have stricter consent requirements and limit their involvement to the time of procurement. In the Family-Centered approach, substituted judgment is permitted but a mandatory wait period is enforced before sperm use in reproduction. Institutions seeking to implement a PSP protocol will benefit from considering the six major building blocks of a thorough protocol and where they would like to fall on the spectrum from a Limited Role to a Family-Centered approach. Copyright © 2013 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  9. Using Job Analysis Techniques to Understand Training Needs for Promotores de Salud.

    Science.gov (United States)

    Ospina, Javier H; Langford, Toshiko A; Henry, Kimberly L; Nelson, Tristan Q

    2018-04-01

    Despite the value of community health worker programs, such as Promotores de Salud, for addressing health disparities in the Latino community, little consensus has been reached to formally define the unique roles and duties associated with the job, thereby creating unique job training challenges. Understanding the job tasks and worker attributes central to this work is a critical first step for developing the training and evaluation systems of promotores programs. Here, we present the process and findings of a job analysis conducted for promotores working for Planned Parenthood. We employed a systematic approach, the combination job analysis method, to define the job in terms of its work and worker requirements, identifying key job tasks, as well as the worker attributes necessary to effectively perform them. Our results suggest that the promotores' job encompasses a broad range of activities and requires an equally broad range of personal characteristics to perform. These results played an important role in the development of our training and evaluation protocols. In this article, we introduce the technique of job analysis, provide an overview of the results from our own application of this technique, and discuss how these findings can be used to inform a training and performance evaluation system. This article provides a template for other organizations implementing similar community health worker programs and illustrates the value of conducting a job analysis for clarifying job roles, developing and evaluating job training materials, and selecting qualified job candidates.

  10. Modelling the protocol stack in NCS with deterministic and stochastic petri net

    Science.gov (United States)

    Hui, Chen; Chunjie, Zhou; Weifeng, Zhu

    2011-06-01

    The protocol stack is the basis of networked control systems (NCS). Full or partial reconfiguration of the protocol stack offers both optimised communication service and system performance. Nowadays, field testing is impractical for determining the performance of a reconfigurable protocol stack, and the Petri net formal description technique offers the best combination of intuitive representation, tool support and analytical capabilities. Traditionally, separation between the different layers of the OSI model has been common practice. Nevertheless, such a layered modelling analysis framework of the protocol stack leads to a lack of global optimisation for protocol reconfiguration. In this article, we propose a general modelling analysis framework for NCS based on the cross-layer concept, which establishes an efficient system scheduling model by abstracting the time-constraint, task-interrelation, processor and bus sub-models from the upper and lower layers (application, data link and physical layers). Cross-layer design can help to overcome the inadequacy of global optimisation based on information sharing between protocol layers. To illustrate the framework, we take the controller area network (CAN) as a case study. The simulation results of the deterministic and stochastic Petri-net (DSPN) model can help us adjust the message scheduling scheme and obtain better system performance.

  11. Human Schedule Performance, Protocol Analysis, and the "Silent Dog" Methodology

    Science.gov (United States)

    Cabello, Francisco; Luciano, Carmen; Gomez, Inmaculada; Barnes-Holmes, Dermot

    2004-01-01

    The purpose of the current experiment was to investigate the role of private verbal behavior on the operant performances of human adults, using a protocol analysis procedure with additional methodological controls (the "silent dog" method). Twelve subjects were exposed to fixed ratio 8 and differential reinforcement of low rate 3-s schedules. For…

  12. Objective and automated protocols for the evaluation of biomedical search engines using No Title Evaluation protocols.

    Science.gov (United States)

    Campagne, Fabien

    2008-02-29

    The evaluation of information retrieval techniques has traditionally relied on human judges to determine which documents are relevant to a query and which are not. This protocol is used in the Text Retrieval Evaluation Conference (TREC), organized annually for the past 15 years, to support the unbiased evaluation of novel information retrieval approaches. The TREC Genomics Track has recently been introduced to measure the performance of information retrieval for biomedical applications. We describe two protocols for evaluating biomedical information retrieval techniques without human relevance judgments. We call these protocols No Title Evaluation (NT Evaluation). The first protocol measures performance for focused searches, where only one relevant document exists for each query. The second protocol measures performance for queries expected to have potentially many relevant documents per query (high-recall searches). Both protocols take advantage of the clear separation of titles and abstracts found in Medline. We compare the performance obtained with these evaluation protocols to results obtained by reusing the relevance judgments produced in the 2004 and 2005 TREC Genomics Track and observe significant correlations between performance rankings generated by our approach and TREC. Spearman's correlation coefficients in the range of 0.79-0.92 are observed comparing bpref measured with NT Evaluation or with TREC evaluations. For comparison, coefficients in the range 0.86-0.94 can be observed when evaluating the same set of methods with data from two independent TREC Genomics Track evaluations. We discuss the advantages of NT Evaluation over the TRels and the data fusion evaluation protocols introduced recently. Our results suggest that the NT Evaluation protocols described here could be used to optimize some search engine parameters before human evaluation. Further research is needed to determine if NT Evaluation or variants of these protocols can fully substitute
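The ranking agreement reported above is quantified with Spearman's rank correlation, which is simply the Pearson correlation of the two rank vectors. A self-contained sketch of the computation follows; the bpref scores are hypothetical, chosen only to illustrate comparing two evaluation protocols over the same set of systems.

```python
def rank(values):
    """Average ranks (1-based), with ties sharing the mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1            # mean of the tied positions, 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Pearson correlation of the rank vectors of x and y."""
    rx, ry = rank(x), rank(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# bpref scores of six hypothetical search engines under two evaluations
nt_eval   = [0.41, 0.35, 0.52, 0.28, 0.47, 0.33]
trec_eval = [0.44, 0.31, 0.55, 0.30, 0.49, 0.29]
print(f"Spearman rho = {spearman(nt_eval, trec_eval):.3f}")
```

With no ties this reduces to the familiar formula rho = 1 - 6*sum(d^2)/(n(n^2-1)); here only two adjacent ranks swap between the evaluations, so rho is close to 1, the kind of agreement the abstract reports between NT Evaluation and TREC rankings.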

  13. Analysis of a security protocol in ?CRL

    NARCIS (Netherlands)

    J. Pang

    2002-01-01

    Needham-Schroeder public-key protocol. With the growth and commercialization of the Internet, the security of communication between computers becomes a crucial point. A variety of security protocols based on cryptographic primitives are used to establish secure communication over

  14. Soil analysis. Modern instrumental technique

    International Nuclear Information System (INIS)

    Smith, K.A.

    1993-01-01

    This book covers traditional methods of analysis and specialist monographs on individual instrumental techniques, which are usually not written with soil or plant analysis specifically in mind. The principles of the techniques are combined with discussions of sample preparation and matrix problems, and critical reviews of applications in soil science and related disciplines. Individual chapters are processed separately for inclusion in the appropriate data bases

  15. Protocol for Microplastics Sampling on the Sea Surface and Sample Analysis

    Science.gov (United States)

    Kovač Viršek, Manca; Palatinus, Andreja; Koren, Špela; Peterlin, Monika; Horvat, Petra; Kržan, Andrej

    2016-01-01

    Microplastic pollution in the marine environment is a scientific topic that has received increasing attention over the last decade. The majority of scientific publications address microplastic pollution of the sea surface. The protocol below describes the methodology for sampling, sample preparation, separation and chemical identification of microplastic particles. A manta net fixed on an »A frame« attached to the side of the vessel was used for sampling. Microplastic particles caught in the cod end of the net were separated from samples by visual identification and use of stereomicroscopes. Particles were analyzed for their size using an image analysis program and for their chemical structure using ATR-FTIR and micro FTIR spectroscopy. The described protocol is in line with recommendations for microplastics monitoring published by the Marine Strategy Framework Directive (MSFD) Technical Subgroup on Marine Litter. This written protocol with video guide will support the work of researchers that deal with microplastics monitoring all over the world. PMID:28060297

  16. A class-chest for deriving transport protocols

    Energy Technology Data Exchange (ETDEWEB)

    Strayer, W.T.

    1996-10-01

    Development of new transport protocols or protocol algorithms suffers from the complexity of the environment in which they are intended to run. Modeling techniques attempt to avoid this by simulating the environment. Another approach to promoting rapid prototyping of protocols and protocol algorithms is to provide a pre-built infrastructure that is common to transport protocols, so that the focus is placed on the protocol-specific aspects. The Meta-Transport Library is a library of C++ base classes that implement or abstract out the mundane functions of a protocol; new protocol implementations are derived from these base classes. The result is a fully viable user-level transport protocol implementation, with emphasis on modularity. The collection of base classes forms a "class-chest" of tools from which protocols can be developed and studied with as little change to a normal UNIX environment as possible.
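The class-chest pattern can be illustrated in a few lines. This sketch is in Python rather than the library's C++, and all class and method names are hypothetical, not MTL's actual API: a base class owns the mundane bookkeeping, and a derived class supplies only protocol-specific policy.

```python
import abc

class TransportBase(abc.ABC):
    """Hypothetical 'class-chest' base: handles mundane bookkeeping
    (sequence numbering, delivery buffer) so subclasses add only policy."""

    def __init__(self):
        self.next_seq = 0
        self.delivered = []

    def send(self, payload):
        segment = {"seq": self.next_seq, "data": payload}
        self.next_seq += 1
        return self.on_send(segment)      # protocol-specific hook

    def receive(self, segment):
        if self.accept(segment):          # protocol-specific policy
            self.delivered.append(segment["data"])

    @abc.abstractmethod
    def on_send(self, segment): ...

    @abc.abstractmethod
    def accept(self, segment): ...

class AtMostOnce(TransportBase):
    """Derived protocol: drop segments with duplicate sequence numbers."""

    def __init__(self):
        super().__init__()
        self.seen = set()

    def on_send(self, segment):
        return segment                    # no retransmission logic here

    def accept(self, segment):
        if segment["seq"] in self.seen:
            return False
        self.seen.add(segment["seq"])
        return True

sender, receiver = AtMostOnce(), AtMostOnce()
seg = sender.send("hello")
receiver.receive(seg)
receiver.receive(seg)                     # duplicate is silently dropped
```

The derived class stays small because the base absorbs the common infrastructure, which is the point of the class-chest approach.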

  17. 75 FR 74007 - Federal Aquatic Nuisance Species Research Risk Analysis Protocol

    Science.gov (United States)

    2010-11-30

    ... site, http://anstaskforce.gov/documents.php . To obtain a hard copy of the Protocol, see Document... aquatic species that are the target of this risk analysis. Language used in the NANPCA differentiates...: http://anstaskforce.gov/documents.php Write: Susan Pasko, National Oceanic and Atmospheric...

  18. Two-party quantum key agreement protocol with four-particle entangled states

    Science.gov (United States)

    He, Yefeng; Ma, Wenping

    2016-09-01

    Based on four-particle entangled states and the delayed measurement technique, a two-party quantum key agreement protocol is proposed in this paper. In the protocol, two participants can deduce the measurement results of each other’s initial quantum states in terms of the measurement correlation property of four-particle entangled states. According to the corresponding initial quantum states deduced by themselves, two parties can extract the secret keys of each other by using the publicly announced value or by performing the delayed measurement, respectively. This guarantees the fair establishment of a shared key. Since each particle in the quantum channel is transmitted only once, the protocol is inherently immune to Trojan horse attacks. The security analysis shows that the protocol not only can resist both participant and outsider attacks but also has no information leakage problem. Moreover, it has high qubit efficiency.

  19. Comparison of Different Double Immunostaining Protocols for Paraffin Embedded Liver Tissue

    Directory of Open Access Journals (Sweden)

    Alexander Schütz

    1999-01-01

    Full Text Available Most of the double immunostaining protocols that have been introduced so far have been developed for application on fresh frozen material or based on different species antibodies. In liver tissue, general problems of double immunostaining techniques are further complicated by tissue‐specific difficulties, such as necrosis or high intracellular protein content. To assess a reliable double immunostaining protocol for archived, paraffin embedded liver tissue, different protocols based on the use of same species primary antibodies were evaluated in terms of sensitivity, specificity and non‐specific background staining in pathological liver specimens. We compared peroxidase–anti‐peroxidase, alkaline phosphatase–anti‐alkaline phosphatase (PAP/APAP, labelled‐avidin–biotin (LAB/LAB and digoxigenin–anti‐digoxigenin (dig–a‐dig/PAP techniques using different cytokeratin antibodies and an antibody against PCNA. Comparison of the double immunostaining techniques revealed a high sensitivity and specificity in all procedures. Sections, which were stained employing PAP/APAP‐technique, displayed a higher background staining compared to sections which were treated with the LAB/LAB or dig–a‐dig/PAP protocol. In contrast to the dig–a‐dig/PAP protocol, the LAB/LAB technique provides a better time/cost relationship. Therefore, we would like to recommend a modified LAB/LAB protocol for simultaneous detection of different antigens in archived liver tissue.

  20. Correlation dimension based nonlinear analysis of network traffics with different application protocols

    International Nuclear Information System (INIS)

    Wang Jun-Song; Yuan Jing; Li Qiang; Yuan Rui-Xi

    2011-01-01

    This paper uses a correlation dimension based nonlinear analysis approach to analyse the dynamics of network traffics with three different application protocols—HTTP, FTP and SMTP. First, the phase space is reconstructed and the embedding parameters are obtained by the mutual information method. Secondly, the correlation dimensions of the three different traffics are calculated, and the analysis demonstrates that the dynamics of the three application protocol traffics differ from each other in nature: HTTP and FTP traffics are chaotic, with the former more complex than the latter, whereas SMTP traffic is stochastic. It is shown that the correlation dimension approach is an efficient method to understand and to characterize the nonlinear dynamics of HTTP, FTP and SMTP protocol network traffics. This analysis provides insight into, and a more accurate understanding of, the nonlinear dynamics of Internet traffics, which are a complex mixture of chaotic and stochastic components. (general)
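Correlation dimension analysis of this kind is conventionally based on the Grassberger-Procaccia correlation sum. A minimal sketch, assuming a time-delay embedding and a two-radius slope estimate of D2 (the "traffic" series here is just synthetic random data, not a real trace):

```python
import math, random

def embed(series, dim, tau):
    # Time-delay embedding: x_i -> (x_i, x_{i+tau}, ..., x_{i+(dim-1)tau}).
    n = len(series) - (dim - 1) * tau
    return [tuple(series[i + k * tau] for k in range(dim)) for i in range(n)]

def correlation_sum(points, r):
    # C(r): fraction of point pairs closer than r (Euclidean distance).
    n, count = len(points), 0
    for i in range(n):
        for j in range(i + 1, n):
            if math.dist(points[i], points[j]) < r:
                count += 1
    return 2 * count / (n * (n - 1))

def correlation_dimension(points, r1, r2):
    # D2 estimated as the slope of log C(r) vs log r between two radii.
    c1, c2 = correlation_sum(points, r1), correlation_sum(points, r2)
    return (math.log(c2) - math.log(c1)) / (math.log(r2) - math.log(r1))

random.seed(0)
series = [random.random() for _ in range(400)]   # stand-in for a traffic series
pts = embed(series, dim=3, tau=1)
d2 = correlation_dimension(pts, 0.2, 0.4)
```

For a stochastic series like this one, the estimate keeps growing with the embedding dimension; for chaotic traffic (the HTTP and FTP cases in the abstract) it saturates at a finite value.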

  1. Performance analysis of routing protocols for IoT

    Science.gov (United States)

    Manda, Sridhar; Nalini, N.

    2018-04-01

    The Internet of Things (IoT) is an interdisciplinary collection of technologies used to achieve an effective combination of physical and digital things. With IoT, physical things can have their own virtual identities and participate in distributed computing. Realizing IoT requires the use of sensors appropriate to the sector in which IoT is deployed. For instance, in the healthcare domain, IoT needs to integrate with wearable sensors used by patients. As sensor devices produce huge amounts of data, often called big data, efficient routing protocols must be in place. As far as wireless networks are concerned, there are existing protocols such as OLSR, DSR and AODV. The paper also sheds light on the Trust-based Routing Protocol for Low-power and lossy networks (TRPL) for IoT. These are widely used wireless routing protocols. As IoT adoption is around the corner, it is essential to investigate routing protocols and evaluate their performance in terms of throughput, end-to-end delay, and routing overhead. The performance insights can help in making well-informed decisions when integrating wireless networks with IoT. In this paper, we analyzed different routing protocols and compared their performance. It is found that AODV showed better performance than the other aforementioned routing protocols.
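The three performance metrics named above are straightforward to compute from a packet trace. A minimal sketch over a hypothetical four-packet trace (the times, sizes, and control-packet count are made up for illustration):

```python
# Each record: (send_time_s, recv_time_s or None if dropped, size_bits).
trace = [
    (0.00, 0.012, 8000),
    (0.10, 0.115, 8000),
    (0.20, None,  8000),   # lost packet
    (0.30, 0.318, 8000),
]
control_packets = 25       # e.g., AODV RREQ/RREP/HELLO messages

delivered = [(s, r, b) for (s, r, b) in trace if r is not None]

duration = max(r for _, r, _ in delivered) - min(s for s, _, _ in trace)
throughput_bps = sum(b for _, _, b in delivered) / duration
avg_delay_s = sum(r - s for s, r, _ in delivered) / len(delivered)
overhead = control_packets / len(delivered)   # routing overhead ratio

print(f"throughput={throughput_bps:.0f} bps, "
      f"delay={avg_delay_s * 1000:.1f} ms, overhead={overhead:.2f}")
```

The same three quantities, extracted from simulator traces for each protocol, are what the paper's AODV/DSR/OLSR comparison rests on.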

  2. Advanced dementia pain management protocols.

    Science.gov (United States)

    Montoro-Lorite, Mercedes; Canalias-Reverter, Montserrat

    Pain management in advanced dementia is complex because of the neurological deficits present in these patients, and nurses are directly responsible for providing interventions for the evaluation, management and relief of pain for people suffering from this health problem. In order to facilitate and help decision-makers, pain experts recommend the use of standardized protocols to guide pain management, but in Spain, comprehensive pain management protocols have not yet been developed for advanced dementia. This article reflects the need for an integrated management of pain in advanced dementia. Based on a review and analysis of the most current and relevant studies in the literature, we summarize the scales for assessing pain in these patients, with the observational scale PAINAD being the most recommended for the hospital setting. In addition, we provide an overview for comprehensive management of pain in advanced dementia through the conceptual framework «a hierarchy of pain assessment techniques by McCaffery and Pasero» for the development and implementation of standardized protocols, including a four-phase cyclical process (evaluation, planning/performance, re-evaluation and recording), which can facilitate the correct management of pain in these patients. Copyright © 2017 Elsevier España, S.L.U. All rights reserved.

  3. Power Saving MAC Protocols for WSNs and Optimization of S-MAC Protocol

    Directory of Open Access Journals (Sweden)

    Simarpreet Kaur

    2012-11-01

    Full Text Available Low power MAC protocols have received a lot of consideration in the last few years because of their influence on the lifetime of wireless sensor networks, since sensors typically operate on batteries, whose replacement is often difficult. A lot of work has been done to minimize the energy expenditure and prolong the sensor lifetime through energy efficient designs, across layers. Meanwhile, the sensor network should be able to maintain a certain throughput in order to fulfill the QoS requirements of the end user, and to ensure the stability of the network. This paper introduces different types of MAC protocols used for WSNs and describes S-MAC, a medium-access control protocol designed for wireless sensor networks. S-MAC uses a few innovative techniques to reduce energy consumption and support self-configuration. A new protocol is suggested to improve the energy efficiency, latency and throughput of the existing MAC protocol for WSNs. A modification of the protocol is then proposed to eliminate the need for some nodes to stay awake longer than the other nodes, which improves the energy efficiency, latency and throughput and hence increases the life span of a wireless sensor network.
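The energy argument behind duty-cycled MAC protocols such as S-MAC can be made concrete with a back-of-the-envelope calculation. The power figures below are illustrative assumptions, not measurements from any particular radio:

```python
def energy_mj(duty_cycle, hours, p_listen_mw=60.0, p_sleep_mw=0.03):
    """Energy (mJ) drawn by a node under an S-MAC-style periodic
    listen/sleep schedule. Power figures are illustrative only."""
    seconds = hours * 3600
    avg_power_mw = duty_cycle * p_listen_mw + (1 - duty_cycle) * p_sleep_mw
    return avg_power_mw * seconds

always_on = energy_mj(1.0, hours=24)        # node listens continuously
smac_10pct = energy_mj(0.10, hours=24)      # 10% duty cycle

print(f"saving: {1 - smac_10pct / always_on:.1%}")
```

Because idle listening dominates a sensor radio's budget, cutting the duty cycle to 10% cuts energy use by roughly the same factor, which is the lifetime gain the abstract refers to.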

  4. Rehabilitation with 4 zygomatic implants with a new surgical protocol using ultrasonic technique.

    Science.gov (United States)

    Mozzati, Marco; Mortellaro, Carmen; Arata, Valentina; Gallesio, Giorgia; Previgliano, Valter

    2015-05-01

    When the residual bone crest cannot allow the placement of standard implants, the treatment for complete arch rehabilitation of severely atrophic maxillae can be performed with 4 zygomatic implants (ZIs) and immediate function with predictable results in terms of aesthetics, function, and comfort for the patient. However, even though ZI rehabilitations have shown a good success rate, this surgery is difficult and needs a skillful operator. Complications in this kind of rehabilitation are not uncommon; the main difficulties can be related to the reduced surgical visibility and instrument control in a critical anatomic area. All the surgical protocols described in the literature used drilling techniques. Furthermore, the use of ultrasonic instruments in implant surgery compared with drilling instruments has shown advantages in many aspects of surgical procedures, tissue management, enhancement of control, surgical visualization, and healing. The aim of this study was to report on the preliminary experience using the ultrasound technique for ZI surgery in terms of safety and technical improvement. Ten consecutive patients with severely atrophic maxillae were treated with 4 ZIs and immediate complete arch acrylic resin provisional prostheses. The patients were followed up from 30 to 32 months evaluating implant success, prosthetic success, and patient satisfaction with a questionnaire. No implants were lost during the study period, with a 100% implant and prosthetic success rate. Within the limitations of this preliminary study, these data indicate that ultrasonic implant site preparation for ZIs can be a good alternative to the drilling technique and an improvement for the surgeon.

  5. A Fair Cooperative MAC Protocol in IEEE 802.11 WLAN

    Directory of Open Access Journals (Sweden)

    Seyed Davoud Mousavi

    2018-05-01

    Full Text Available Cooperative communication techniques have recently enabled wireless technologies to overcome their challenges. The main objective of these techniques is to improve resource allocation. In this paper, we propose a new protocol in medium access control (MAC of the IEEE 802.11 standard. In our new protocol, which is called Fair Cooperative MAC (FC-MAC, every relay node participates in cooperation proportionally to its provided cooperation gain. This technique improves network resource allocation by exploiting the potential capacity of all relay candidates. Simulation results demonstrate that the FC-MAC protocol presents better performance in terms of throughput, fairness, and network lifetime.
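The proportional-cooperation rule can be sketched as a simple weighted allocation. The function below is an illustration of the idea only, not FC-MAC's actual algorithm, and the gain values are hypothetical:

```python
def cooperation_shares(relay_gains):
    """Hypothetical FC-MAC-style fairness rule: each relay's share of
    the cooperative transmission opportunities is proportional to the
    cooperation gain it contributes to the network."""
    total = sum(relay_gains.values())
    return {relay: gain / total for relay, gain in relay_gains.items()}

# Hypothetical cooperation gains (e.g., achievable rate improvements).
gains = {"R1": 3.0, "R2": 1.5, "R3": 1.5}
shares = cooperation_shares(gains)
```

Allocating participation in proportion to contributed gain is what lets every relay candidate's capacity be exploited while keeping the division fair.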

  6. Security Protocols in a Nutshell

    OpenAIRE

    Toorani, Mohsen

    2016-01-01

    Security protocols are building blocks in secure communications. They deploy some security mechanisms to provide certain security services. Security protocols are considered abstract when analyzed, but they can have extra vulnerabilities when implemented. This manuscript provides a holistic study on security protocols. It reviews foundations of security protocols, taxonomy of attacks on security protocols and their implementations, and different methods and models for security analysis of pro...

  7. Security Analysis of DTN Architecture and Bundle Protocol Specification for Space-Based Networks

    Science.gov (United States)

    Ivancic, William D.

    2009-01-01

    A Delay-Tolerant Network (DTN) Architecture (Request for Comment, RFC-4838) and Bundle Protocol Specification, RFC-5050, have been proposed for space and terrestrial networks. Additional security specifications have been provided via the Bundle Security Specification (currently a work in progress as an Internet Research Task Force internet-draft) and, for link-layer protocols applicable to Space networks, the Licklider Transport Protocol Security Extensions. This document provides a security analysis of the current DTN RFCs and proposed security related internet drafts with a focus on space-based communication networks, which is a rather restricted subset of DTN networks. Note that the original focus and motivation of DTN work was the Interplanetary Internet. This document does not address general store-and-forward network overlays, just the current work being done by the Internet Research Task Force (IRTF) and the Consultative Committee for Space Data Systems (CCSDS) Space Internetworking Services Area (SIS) - DTN working group under the DTN and Bundle umbrellas. However, much of the analysis is relevant to general store-and-forward overlays.

  8. Application of functional analysis techniques to supervisory systems

    International Nuclear Information System (INIS)

    Lambert, Manuel; Riera, Bernard; Martel, Gregory

    1999-01-01

    The aim of this paper is, first, to apply two interesting functional analysis techniques to the design of supervisory systems for complex processes and, second, to discuss the strengths and weaknesses of each. Two functional analysis techniques have been applied, SADT (Structured Analysis and Design Technique) and FAST (Functional Analysis System Technique), to an example of a Water Supply Process Control (WSPC) system. These techniques allow a functional description of industrial processes. The paper briefly discusses the functions of a supervisory system and some advantages of applying functional analysis to the design of a 'human'-centered supervisory system. Then the basic principles of the two techniques applied to the WSPC system are presented. Finally, the different results obtained from the two techniques are discussed

  9. Immunochemical protocols

    National Research Council Canada - National Science Library

    Pound, John D

    1998-01-01

    ... easy and important refinements often are not published. This much anticipated 2nd edition of Immunochemical Protocols therefore aims to provide a user-friendly up-to-date handbook of reliable techniques selected to suit the needs of molecular biologists. It covers the full breadth of the relevant established immunochemical methods, from protein blotting and immunoa...

  10. Sensitivity Analysis of Per-Protocol Time-to-Event Treatment Efficacy in Randomized Clinical Trials

    Science.gov (United States)

    Gilbert, Peter B.; Shepherd, Bryan E.; Hudgens, Michael G.

    2013-01-01

    Summary Assessing per-protocol treatment efficacy on a time-to-event endpoint is a common objective of randomized clinical trials. The typical analysis uses the same method employed for the intention-to-treat analysis (e.g., standard survival analysis) applied to the subgroup meeting protocol adherence criteria. However, due to potential post-randomization selection bias, this analysis may mislead about treatment efficacy. Moreover, while there is extensive literature on methods for assessing causal treatment effects in compliers, these methods do not apply to a common class of trials where a) the primary objective compares survival curves, b) it is inconceivable to assign participants to be adherent and event-free before adherence is measured, and c) the exclusion restriction assumption fails to hold. HIV vaccine efficacy trials including the recent RV144 trial exemplify this class, because many primary endpoints (e.g., HIV infections) occur before adherence is measured, and nonadherent subjects who receive some of the planned immunizations may be partially protected. Therefore, we develop methods for assessing per-protocol treatment efficacy for this problem class, considering three causal estimands of interest. Because these estimands are not identifiable from the observable data, we develop nonparametric bounds and semiparametric sensitivity analysis methods that yield estimated ignorance and uncertainty intervals. The methods are applied to RV144. PMID:24187408

  11. A Lightweight Continuous Authentication Protocol for the Internet of Things

    Directory of Open Access Journals (Sweden)

    Yo-Hsuan Chuang

    2018-04-01

    Full Text Available Modern societies are moving toward an information-oriented environment. To gather and utilize information around people’s modern life, tiny devices with all kinds of sensing devices and various sizes of gateways need to be deployed and connected with each other through the Internet or proxy-based wireless sensor networks (WSNs). Within this kind of Internet of Things (IoT) environment, how to authenticate each other between two communicating devices is a fundamental security issue. As a lot of IoT devices are powered by batteries and they need to transmit sensed data periodically, it is necessary for IoT devices to adopt a lightweight authentication protocol to reduce their energy consumption when a device wants to authenticate and transmit data to its targeted peer. In this paper, a lightweight continuous authentication protocol for sensing devices and gateway devices in general IoT environments is introduced. The concept of valid authentication time period is proposed to enhance robustness of authentication between IoT devices. To construct the proposed lightweight continuous authentication protocol, token technique and dynamic features of IoT devices are adopted in order to reach the design goals: the reduction of time consumption for consecutive authentications and energy saving for authenticating devices by reducing the computation complexity during session establishment of continuous authentication. Security analysis is conducted to evaluate security strength of the proposed protocol. In addition, performance analysis has shown the proposed protocol is a strong competitor among existing protocols for device-to-device authentication in IoT environments.
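A token bound to a valid authentication time period, as described above, can be sketched with a keyed MAC and an expiry timestamp. This is a simplified illustration under assumed message formats, not the paper's actual protocol:

```python
import hmac
import hashlib

def issue_token(key, device_id, now, valid_for_s=300):
    """Gateway issues a token binding device_id to an expiry time."""
    expiry = int(now) + valid_for_s
    msg = f"{device_id}|{expiry}".encode()
    tag = hmac.new(key, msg, hashlib.sha256).hexdigest()
    return f"{device_id}|{expiry}|{tag}"

def verify_token(key, token, now):
    """Accept only an untampered token inside its validity period."""
    device_id, expiry, tag = token.rsplit("|", 2)
    msg = f"{device_id}|{expiry}".encode()
    expected = hmac.new(key, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(tag, expected) and int(now) < int(expiry)

key = b"shared-device-gateway-key"     # assumed pre-shared key
tok = issue_token(key, "sensor-42", now=1000)
```

Within the validity window the device re-authenticates by presenting the cached token, which is far cheaper than re-running a full session establishment each time.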

  12. A Lightweight Continuous Authentication Protocol for the Internet of Things

    Science.gov (United States)

    Chuang, Yo-Hsuan; Yang, Cheng-Ying; Tang, Ssu-Wei

    2018-01-01

    Modern societies are moving toward an information-oriented environment. To gather and utilize information around people’s modern life, tiny devices with all kinds of sensing devices and various sizes of gateways need to be deployed and connected with each other through the Internet or proxy-based wireless sensor networks (WSNs). Within this kind of Internet of Things (IoT) environment, how to authenticate each other between two communicating devices is a fundamental security issue. As a lot of IoT devices are powered by batteries and they need to transmit sensed data periodically, it is necessary for IoT devices to adopt a lightweight authentication protocol to reduce their energy consumption when a device wants to authenticate and transmit data to its targeted peer. In this paper, a lightweight continuous authentication protocol for sensing devices and gateway devices in general IoT environments is introduced. The concept of valid authentication time period is proposed to enhance robustness of authentication between IoT devices. To construct the proposed lightweight continuous authentication protocol, token technique and dynamic features of IoT devices are adopted in order to reach the design goals: the reduction of time consumption for consecutive authentications and energy saving for authenticating devices by reducing the computation complexity during session establishment of continuous authentication. Security analysis is conducted to evaluate security strength of the proposed protocol. In addition, performance analysis has shown the proposed protocol is a strong competitor among existing protocols for device-to-device authentication in IoT environments. PMID:29621168

  13. A Lightweight Continuous Authentication Protocol for the Internet of Things.

    Science.gov (United States)

    Chuang, Yo-Hsuan; Lo, Nai-Wei; Yang, Cheng-Ying; Tang, Ssu-Wei

    2018-04-05

    Modern societies are moving toward an information-oriented environment. To gather and utilize information around people's modern life, tiny devices with all kinds of sensing devices and various sizes of gateways need to be deployed and connected with each other through the Internet or proxy-based wireless sensor networks (WSNs). Within this kind of Internet of Things (IoT) environment, how to authenticate each other between two communicating devices is a fundamental security issue. As a lot of IoT devices are powered by batteries and they need to transmit sensed data periodically, it is necessary for IoT devices to adopt a lightweight authentication protocol to reduce their energy consumption when a device wants to authenticate and transmit data to its targeted peer. In this paper, a lightweight continuous authentication protocol for sensing devices and gateway devices in general IoT environments is introduced. The concept of valid authentication time period is proposed to enhance robustness of authentication between IoT devices. To construct the proposed lightweight continuous authentication protocol, token technique and dynamic features of IoT devices are adopted in order to reach the design goals: the reduction of time consumption for consecutive authentications and energy saving for authenticating devices by reducing the computation complexity during session establishment of continuous authentication. Security analysis is conducted to evaluate security strength of the proposed protocol. In addition, performance analysis has shown the proposed protocol is a strong competitor among existing protocols for device-to-device authentication in IoT environments.

  14. A robust ECC based mutual authentication protocol with anonymity for session initiation protocol.

    Science.gov (United States)

    Mehmood, Zahid; Chen, Gongliang; Li, Jianhua; Li, Linsen; Alzahrani, Bander

    2017-01-01

    Over the past few years, the Session Initiation Protocol (SIP) has emerged as a substantial application-layer protocol for multimedia services. It is extensively used for managing, altering, terminating and distributing multimedia sessions. Authentication plays a pivotal role in the SIP environment. Recently, Lu et al. presented an authentication protocol for SIP and professed that the newly proposed protocol is protected against all the familiar attacks. However, detailed analysis shows that Lu et al.'s protocol is exposed to server masquerading and user masquerading attacks. Moreover, it also fails to protect the user's identity, and it possesses an incorrect login and authentication phase. In order to establish a suitable and efficient protocol able to overcome all these discrepancies, a robust ECC-based novel mutual authentication mechanism with anonymity for SIP is presented in this manuscript. The improved protocol contains an explicit parameter for the user to cope with the issues of security and correctness, and it is found to be more secure and relatively effective in protecting the user's privacy and resisting user and server masquerading, as verified through comprehensive formal and informal security analysis.
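The ECC primitive underlying such mutual authentication schemes is scalar multiplication on an elliptic curve. Below is a self-contained sketch of Diffie-Hellman-style key agreement on the textbook toy curve y^2 = x^3 + 2x + 2 over F_17 (generator G = (5, 1), group order 19); this illustrates the primitive only, not the scheme proposed in the manuscript, and a real deployment would use a standardized curve:

```python
# Toy curve y^2 = x^3 + 2x + 2 over F_17; G = (5, 1) has order 19.
P, A = 17, 2

def add(p1, p2):
    # Elliptic-curve point addition; None represents the point at infinity.
    if p1 is None:
        return p2
    if p2 is None:
        return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None                                   # inverse points
    if p1 == p2:
        m = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P   # tangent slope
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, P) % P          # chord slope
    x3 = (m * m - x1 - x2) % P
    return (x3, (m * (x1 - x3) - y1) % P)

def mul(k, point):
    # Double-and-add scalar multiplication: k * point.
    result = None
    while k:
        if k & 1:
            result = add(result, point)
        point = add(point, point)
        k >>= 1
    return result

G = (5, 1)
a_priv, b_priv = 3, 5                   # each party's secret scalar
a_pub, b_pub = mul(a_priv, G), mul(b_priv, G)
shared_a = mul(a_priv, b_pub)           # a * (b * G)
shared_b = mul(b_priv, a_pub)           # b * (a * G) -- same point
```

Both parties compute the same point because scalar multiplication commutes; mutual authentication protocols then bind this shared value into challenge-response messages.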

  15. A robust ECC based mutual authentication protocol with anonymity for session initiation protocol.

    Directory of Open Access Journals (Sweden)

    Zahid Mehmood

    Full Text Available Over the past few years, the Session Initiation Protocol (SIP) has emerged as a substantial application-layer protocol for multimedia services. It is extensively used for managing, altering, terminating and distributing multimedia sessions. Authentication plays a pivotal role in the SIP environment. Recently, Lu et al. presented an authentication protocol for SIP and professed that the newly proposed protocol is protected against all the familiar attacks. However, detailed analysis shows that Lu et al.'s protocol is exposed to server masquerading and user masquerading attacks. Moreover, it also fails to protect the user's identity, and it possesses an incorrect login and authentication phase. In order to establish a suitable and efficient protocol able to overcome all these discrepancies, a robust ECC-based novel mutual authentication mechanism with anonymity for SIP is presented in this manuscript. The improved protocol contains an explicit parameter for the user to cope with the issues of security and correctness, and it is found to be more secure and relatively effective in protecting the user's privacy and resisting user and server masquerading, as verified through comprehensive formal and informal security analysis.

  16. Effect of joint mobilization techniques for primary total knee arthroplasty: Study protocol for a randomized controlled trial.

    Science.gov (United States)

    Xu, Jiao; Zhang, Juan; Wang, Xue-Qiang; Wang, Xuan-Lin; Wu, Ya; Chen, Chan-Cheng; Zhang, Han-Yu; Zhang, Zhi-Wan; Fan, Kai-Yi; Zhu, Qiang; Deng, Zhi-Wei

    2017-12-01

    Total knee arthroplasty (TKA) has become the procedure most preferred by patients for the relief of pain caused by knee osteoarthritis. TKA patients aim for a speedy recovery after the surgery. Joint mobilization techniques for rehabilitation have been widely used to relieve pain and improve joint mobility. However, relevant randomized controlled trials showing the curative effect of these techniques remain lacking to date. Accordingly, this study aims to investigate whether joint mobilization techniques are valid for primary TKA. We will conduct a single-blind, prospective, randomized, controlled trial of 120 patients with unilateral TKA. Patients will be randomized into an intervention group, a physical modality therapy group, and a usual care group. The intervention group will undergo joint mobilization manipulation treatment once a day and regular training twice a day for a month. The physical modality therapy group will undergo physical therapy once a day and regular training twice a day for a month. The usual care group will perform regular training twice a day for a month. Primary outcome measures will be based on the visual analog scale, the knee joint Hospital for Special Surgery score, range of motion, surrounded degree, and adverse effect. Secondary indicators will include manual muscle testing, 36-Item Short Form Health Survey, Berg Balance Scale function evaluation, Pittsburgh Sleep Quality Index, proprioception, and muscle morphology. We will perform intention-to-treat analysis if a subject withdraws from the trial. The important features of this trial for joint mobilization techniques in primary TKA are randomization procedures, single-blind design, large sample size, and a standardized protocol. This study aims to investigate whether joint mobilization techniques are effective for early TKA patients. The result of this study may serve as a guide for TKA patients, medical personnel, and healthcare decision makers. It has been registered at http

  17. Information-theoretic security proof for quantum-key-distribution protocols

    International Nuclear Information System (INIS)

    Renner, Renato; Gisin, Nicolas; Kraus, Barbara

    2005-01-01

    We present a technique for proving the security of quantum-key-distribution (QKD) protocols. It is based on direct information-theoretic arguments and thus also applies if no equivalent entanglement purification scheme can be found. Using this technique, we investigate a general class of QKD protocols with one-way classical post-processing. We show that, in order to analyze the full security of these protocols, it suffices to consider collective attacks. Indeed, we give new lower and upper bounds on the secret-key rate which only involve entropies of two-qubit density operators and which are thus easy to compute. As an illustration of our results, we analyze the Bennett-Brassard 1984, the six-state, and the Bennett 1992 protocols with one-way error correction and privacy amplification. Surprisingly, the performance of these protocols is increased if one of the parties adds noise to the measurement data before the error correction. In particular, this additional noise makes the protocols more robust against noise in the quantum channel
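For BB84 with one-way error correction and privacy amplification, the secret-key-rate lower bound of this kind takes the well-known closed form r >= 1 - h(Q) - h(Q), where h is the binary entropy and Q the quantum bit error rate; it yields the familiar threshold of about 11%. A quick numeric check of that form (the formula here is the standard one-way BB84 bound, quoted for illustration):

```python
import math

def h(p):
    # Binary entropy in bits.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bb84_rate(q):
    """Lower bound on the BB84 secret-key rate with one-way error
    correction and privacy amplification: r >= 1 - 2 * h(Q)."""
    return 1 - 2 * h(q)

for q in (0.0, 0.05, 0.11, 0.12):
    print(f"QBER={q:.2f}  rate>={bb84_rate(q):+.4f}")
```

The bound crosses zero between Q = 0.11 and Q = 0.12, i.e. one-way post-processing tolerates roughly 11% channel noise, which is the regime the noise-addition trick in the abstract improves upon.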

  18. Information-theoretic security proof for quantum-key-distribution protocols

    Science.gov (United States)

    Renner, Renato; Gisin, Nicolas; Kraus, Barbara

    2005-07-01

    We present a technique for proving the security of quantum-key-distribution (QKD) protocols. It is based on direct information-theoretic arguments and thus also applies if no equivalent entanglement purification scheme can be found. Using this technique, we investigate a general class of QKD protocols with one-way classical post-processing. We show that, in order to analyze the full security of these protocols, it suffices to consider collective attacks. Indeed, we give new lower and upper bounds on the secret-key rate which only involve entropies of two-qubit density operators and which are thus easy to compute. As an illustration of our results, we analyze the Bennett-Brassard 1984, the six-state, and the Bennett 1992 protocols with one-way error correction and privacy amplification. Surprisingly, the performance of these protocols is increased if one of the parties adds noise to the measurement data before the error correction. In particular, this additional noise makes the protocols more robust against noise in the quantum channel.
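For BB84 with one-way post-processing and no added noise, the secret-key-rate bound discussed in these records reduces to the well-known Shor-Preskill expression r ≥ 1 − 2h(Q), where h is the binary entropy and Q the quantum bit error rate. A minimal numeric sketch:

```python
from math import log2

def h(p):
    """Binary (Shannon) entropy in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bb84_key_rate(q):
    """Lower bound on the one-way BB84 secret-key rate for QBER q."""
    return 1.0 - 2.0 * h(q)
```

Evaluating the bound shows the familiar threshold: the rate stays positive for Q just below 11% and turns negative by Q = 12%, which is the baseline that the noisy pre-processing mentioned in the abstract improves upon.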

  19. Addendum to the Building America House Simulation Protocols

    Energy Technology Data Exchange (ETDEWEB)

    Engebrecht-Metzger, C.; Wilson, E.; Horowitz, S.

    2012-12-01

    As Building America (BA) has grown to include a large and diverse cross-section of the home building and retrofit industries, it has become more important to develop accurate, consistent analysis techniques to measure progress towards the program's goals. The House Simulation Protocols (HSP) provides guidance to program partners and managers so that energy savings for new construction and retrofit projects can be compared alongside each other. The HSP provides the program with analysis methods that are proven to be effective and reliable in investigating the energy use of advanced energy systems and of entire houses.

  20. Addendum to the Building America House Simulation Protocols

    Energy Technology Data Exchange (ETDEWEB)

    Engebrecht, C. Metzger [National Renewable Energy Lab. (NREL), Golden, CO (United States); Wilson, E. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Horowitz, S. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2012-12-01

    As DOE's Building America program has grown to include a large and diverse cross-section of the home building and retrofit industries, it has become more important to develop accurate, consistent analysis techniques to measure progress towards the program’s goals. The House Simulation Protocols (HSP) provide guidance to program partners and managers so that energy savings for new construction and retrofit projects can be compared alongside each other. The HSP provides the program with analysis methods that are proven to be effective and reliable in investigating the energy use of advanced energy systems and of entire houses.

  1. Method-centered digital communities on protocols.io for fast-paced scientific innovation.

    Science.gov (United States)

    Kindler, Lori; Stoliartchouk, Alexei; Teytelman, Leonid; Hurwitz, Bonnie L

    2016-01-01

    The Internet has enabled online social interaction for scientists beyond physical meetings and conferences. Yet despite these innovations in communication, dissemination of methods is often relegated to just academic publishing. Further, these methods remain static, with subsequent advances published elsewhere and unlinked. For communities undergoing fast-paced innovation, researchers need new capabilities to share, obtain feedback, and publish methods at the forefront of scientific development. For example, a renaissance in virology is now underway given the new metagenomic methods to sequence viral DNA directly from an environment. Metagenomics makes it possible to "see" natural viral communities that could not be previously studied through culturing methods. Yet, the knowledge of specialized techniques for the production and analysis of viral metagenomes remains in a subset of labs.  This problem is common to any community using and developing emerging technologies and techniques. We developed new capabilities to create virtual communities in protocols.io, an open access platform, for disseminating protocols and knowledge at the forefront of scientific development. To demonstrate these capabilities, we present a virology community forum called VERVENet. These new features allow virology researchers to share protocols and their annotations and optimizations, connect with the broader virtual community to share knowledge, job postings, conference announcements through a common online forum, and discover the current literature through personalized recommendations to promote discussion of cutting edge research. Virtual communities in protocols.io enhance a researcher's ability to: discuss and share protocols, connect with fellow community members, and learn about new and innovative research in the field.  The web-based software for developing virtual communities is free to use on protocols.io. Data are available through public APIs at protocols.io.

  2. An improved method of studying user-system interaction by combining transaction log analysis and protocol analysis

    Directory of Open Access Journals (Sweden)

    Jillian R. Griffiths

    2002-01-01

    Full Text Available The paper reports a novel approach to studying user-system interaction that captures a complete record of the searcher's actions, the system responses and synchronised talk-aloud comments from the searcher. The data is recorded unobtrusively and is available for later analysis. The approach is set in context by a discussion of transaction logging and protocol analysis and examples of the search logging in operation are presented

  3. Variation in radiographic protocols in paediatric interventional cardiology

    International Nuclear Information System (INIS)

    McFadden, S L; Hughes, C M; Winder, R J

    2013-01-01

    The aim of this work is to determine current radiographic protocols in paediatric interventional cardiology (IC) in the UK and Ireland. To do this we investigated which imaging parameters/protocols are commonly used in IC in different hospitals, to identify if a standard technique is used and illustrate any variation in practice. A questionnaire was sent to all hospitals in the UK and Ireland which perform paediatric IC to obtain information on techniques used in each clinical department and on the range of clinical examinations performed. Ethical and research governance approval was sought from the Office for Research Ethics Committees Northern Ireland and the individual trusts. A response rate of 79% was achieved, and a wide variation in technique was found between hospitals. The main differences in technique involved variations in the use of an anti-scatter grid and the use of additional filtration to the radiation beam, frame rates for digital acquisition and pre-programmed projections/paediatric specific programming in the equipment. We conclude that there is no standard protocol for carrying out paediatric IC in the UK or Ireland. Each hospital carries out the IC procedure according to its own local protocols resulting in a wide variation in radiation dose. (paper)

  4. Variation in radiographic protocols in paediatric interventional cardiology.

    Science.gov (United States)

    McFadden, S L; Hughes, C M; Winder, R J

    2013-06-01

    The aim of this work is to determine current radiographic protocols in paediatric interventional cardiology (IC) in the UK and Ireland. To do this we investigated which imaging parameters/protocols are commonly used in IC in different hospitals, to identify if a standard technique is used and illustrate any variation in practice. A questionnaire was sent to all hospitals in the UK and Ireland which perform paediatric IC to obtain information on techniques used in each clinical department and on the range of clinical examinations performed. Ethical and research governance approval was sought from the Office for Research Ethics Committees Northern Ireland and the individual trusts. A response rate of 79% was achieved, and a wide variation in technique was found between hospitals. The main differences in technique involved variations in the use of an anti-scatter grid and the use of additional filtration to the radiation beam, frame rates for digital acquisition and pre-programmed projections/paediatric specific programming in the equipment. We conclude that there is no standard protocol for carrying out paediatric IC in the UK or Ireland. Each hospital carries out the IC procedure according to its own local protocols resulting in a wide variation in radiation dose.

  5. Symbolic Model Checking and Analysis for E-Commerce Protocol

    Institute of Scientific and Technical Information of China (English)

    WEN Jing-Hua; ZHANG Mei; LI Xiang

    2005-01-01

    A new approach is proposed for analyzing non-repudiation and fairness of e-commerce protocols. The authentication e-mail protocol CMP1 is modeled as a finite state machine and analyzed in two vital aspects, non-repudiation and fairness, using SMV. As a result, the CMP1 protocol is shown not to be fair, and we have improved it. This result shows that it is effective to analyze and check the new features of e-commerce protocols using the SMV model checker
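A model checker such as SMV verifies properties by exhaustively exploring a protocol's state space. As a language-neutral illustration (a toy exchange protocol, not the CMP1 model itself), the same idea can be sketched as explicit-state reachability: search all reachable states for one that violates a fairness invariant.

```python
from collections import deque

# Toy two-party exchange: a state is (A_has_receipt, B_has_item, done).
def successors(state):
    a, b, done = state
    if done:
        return []
    nxt = []
    if not b:
        nxt.append((a, True, False))   # A sends the item to B
    if b and not a:
        nxt.append((True, b, False))   # B sends the receipt back to A
    nxt.append((a, b, True))           # either side may abort/terminate early
    return nxt

def violates_fairness(state):
    a, b, done = state
    return done and b and not a        # B got the item, A got nothing

def check(initial):
    """BFS over the reachable state space; return a violating state or None."""
    seen, queue = {initial}, deque([initial])
    while queue:
        s = queue.popleft()
        if violates_fairness(s):
            return s
        for t in successors(s):
            if t not in seen:
                seen.add(t)
                queue.append(t)
    return None
```

Here the search finds the reachable state where B holds the item but terminates before returning a receipt, mirroring how SMV exposes the unfairness of a protocol as a concrete counterexample.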

  6. The comparative cost analysis of EAP Re-authentication Protocol and EAP TLS Protocol

    OpenAIRE

    Seema Mehla; Bhawna Gupta

    2010-01-01

    The Extensible Authentication Protocol (EAP) is a generic framework supporting multiple types of authentication methods. In systems where EAP is used for authentication, it is desirable not to repeat the entire EAP exchange with another authenticator. The EAP Re-authentication Protocol provides consistent, method-independent and low-latency re-authentication. It is an extension to the current EAP mechanism to support intra-domain handoff authentication. This paper analyzed the performance of the EAP r...

  7. Critical Response Protocol

    Science.gov (United States)

    Ellingson, Charlene; Roehrig, Gillian; Bakkum, Kris; Dubinsky, Janet M.

    2016-01-01

    This article introduces the Critical Response Protocol (CRP), an arts-based technique that engages students in equitable critical discourse and aligns with the "Next Generation Science Standards" vision for providing students opportunities for language learning while advancing science learning (NGSS Lead States 2013). CRP helps teachers…

  8. Protocol and the post-human performativity of security techniques.

    Science.gov (United States)

    O'Grady, Nathaniel

    2016-07-01

    This article explores the deployment of exercises by the United Kingdom Fire and Rescue Service. Exercises stage, simulate and act out potential future emergencies and in so doing help the Fire and Rescue Service prepare for future emergencies. Specifically, exercises operate to assess and develop protocol; sets of guidelines which plan out the actions undertaken by the Fire and Rescue Service in responding to a fire. In the article I outline and assess the forms of knowledge and technologies, what I call the 'aesthetic forces', by which the exercise makes present and imagines future emergencies. By critically engaging with Karen Barad's notion of post-human performativity, I argue that exercises provide a site where such forces can entangle with one another; creating a bricolage through which future emergencies are evoked sensually and representatively, ultimately making it possible to experience emergencies in the present. This understanding of exercises allows also for critical appraisal of protocol both as phenomena that are produced through the enmeshing of different aesthetic forces and as devices which premise the operation of the security apparatus on contingency.

  9. Authentication techniques for smart cards

    International Nuclear Information System (INIS)

    Nelson, R.A.

    1994-02-01

    Smart card systems are most cost efficient when implemented as a distributed system, which is a system without central host interaction or a local database of card numbers for verifying transaction approval. A distributed system, as such, presents special card and user authentication problems. Fortunately, smart cards offer processing capabilities that provide solutions to authentication problems, provided the system is designed with proper data integrity measures. Smart card systems maintain data integrity through a security design that controls data sources and limits data changes. A good security design is usually a result of a system analysis that provides a thorough understanding of the application needs. Once designers understand the application, they may specify authentication techniques that mitigate the risk of system compromise or failure. Current authentication techniques include cryptography, passwords, challenge/response protocols, and biometrics. The security design includes these techniques to help prevent counterfeit cards, unauthorized use, or information compromise. This paper discusses card authentication and user identity techniques that enhance security for microprocessor card systems. It also describes the analysis process used for determining proper authentication techniques for a system
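Of the authentication techniques the record lists, challenge/response is the easiest to sketch. The example below is illustrative, not the paper's design: it uses an HMAC over a random nonce with a key shared between card and terminal, and the function names are hypothetical (real smart cards typically compute the response with a symmetric cipher such as 3DES or AES).

```python
import hashlib
import hmac
import secrets

def issue_challenge():
    """Terminal side: generate an unpredictable 16-byte nonce."""
    return secrets.token_bytes(16)

def card_response(shared_key, challenge):
    """Card side: prove knowledge of the shared key without revealing it."""
    return hmac.new(shared_key, challenge, hashlib.sha256).digest()

def verify(shared_key, challenge, response):
    """Terminal side: recompute and compare in constant time."""
    expected = card_response(shared_key, challenge)
    return hmac.compare_digest(expected, response)
```

Because each challenge is fresh, a recorded response cannot be replayed later, which is exactly the property that lets a distributed system authenticate cards without contacting a central host.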

  10. Performance Analysis of a Cluster-Based MAC Protocol for Wireless Ad Hoc Networks

    Directory of Open Access Journals (Sweden)

    Jesús Alonso-Zárate

    2010-01-01

    Full Text Available An analytical model to evaluate the non-saturated performance of the Distributed Queuing Medium Access Control Protocol for Ad Hoc Networks (DQMAN) in single-hop networks is presented in this paper. DQMAN comprises a spontaneous, temporary, and dynamic clustering mechanism integrated with a near-optimum distributed queuing Medium Access Control (MAC) protocol. Clustering is executed in a distributed manner using a mechanism inspired by the Distributed Coordination Function (DCF) of the IEEE 802.11. Once a station seizes the channel, it becomes the temporary clusterhead of a spontaneous cluster and coordinates the peer-to-peer communications between the cluster members. Within each cluster, a near-optimum distributed queuing MAC protocol is executed. The theoretical performance analysis of DQMAN in single-hop networks under non-saturation conditions is presented in this paper. The approach integrates the analysis of the clustering mechanism into the MAC layer model. To the best of the authors' knowledge, this approach is novel in the literature. In addition, the performance of an ad hoc network using DQMAN is compared to that obtained when using the DCF of the IEEE 802.11 as a benchmark reference.

  11. Acoustic emissions and electric signal recordings, when cement mortar beams are subjected to three-point bending under various loading protocols

    Directory of Open Access Journals (Sweden)

    A. Kyriazopoulos

    2017-04-01

    Full Text Available Two experimental techniques are used to study the response of cement mortar beams subjected to three-point bending under various loading protocols. The techniques used are the detection of weak electric current emissions, known as Pressure Stimulated Currents (PSC), and the Acoustic Emissions (in particular, the cumulative AE energy and the b-value analysis). Patterns are detected that can be used to predict upcoming fracture, regardless of the adopted loading protocol in each experiment. The experimental results of the AE and PSC techniques lead to the conclusion that when the calculated Ib values decrease, the PSC starts increasing strongly.
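The b-value analysis mentioned here rests on the Gutenberg-Richter frequency-magnitude relation. As a minimal sketch (the improved Ib statistic used in the paper refines this), the standard Aki maximum-likelihood estimator for b from acoustic-emission magnitudes above a completeness threshold is:

```python
from math import e, log10

def b_value(magnitudes, completeness_mag):
    """Aki maximum-likelihood estimate: b = log10(e) / (mean(M) - Mc),
    using only magnitudes at or above the completeness magnitude Mc."""
    above = [m for m in magnitudes if m >= completeness_mag]
    mean_m = sum(above) / len(above)
    return log10(e) / (mean_m - completeness_mag)
```

A falling b-value indicates a growing proportion of large-amplitude events, which is why decreasing b (or Ib) values are read as precursors of macroscopic fracture.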

  12. Performance Analysis of TDMA Protocol in a Femtocell Network

    Directory of Open Access Journals (Sweden)

    Wanod Kumar

    2014-07-01

    Full Text Available In this paper, we evaluate the performance of the TDMA (Time Division Multiple Access) protocol using queuing theory in a femtocell network. Fair use of the wireless channel among the users of the network is achieved using the TDMA protocol. The arrivals of data packets from M communicating nodes form multiple Poisson processes. The time slots of the TDMA protocol represent c servers that communicate data packets coming from the communicating nodes to the input of the FAP (Femtocell Access Point). The service time of each server (time slot) is exponentially distributed. This complete communication scenario using the TDMA protocol is modeled as an M/M/c queue. The performance of the protocol is evaluated in terms of mean number in system, average system delay and utilization for varying traffic intensity
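The three M/M/c quantities named at the end of this abstract (mean number in system, average system delay, utilization) follow from the Erlang-C formula. A minimal sketch, with lam the aggregate packet arrival rate and mu the per-slot service rate (both illustrative parameter names):

```python
from math import factorial

def mmc_metrics(lam, mu, c):
    """Steady-state M/M/c metrics: (mean number in system, mean delay, utilization)."""
    a = lam / mu              # offered load in Erlangs
    rho = a / c               # per-server utilization; must be < 1 for stability
    assert rho < 1, "queue is unstable"
    tail = a**c / (factorial(c) * (1 - rho))
    # Erlang-C: probability an arriving packet must wait for a slot
    p_wait = tail / (sum(a**k / factorial(k) for k in range(c)) + tail)
    lq = p_wait * rho / (1 - rho)   # mean number waiting in queue
    l = lq + a                      # mean number in system
    w = l / lam                     # mean time in system, by Little's law
    return l, w, rho
```

For c = 1 the formulas collapse to the familiar M/M/1 results, e.g. L = rho / (1 - rho), which is a quick sanity check on the implementation.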

  13. Registered nurses' clinical reasoning in home healthcare clinical practice: A think-aloud study with protocol analysis.

    Science.gov (United States)

    Johnsen, Hege Mari; Slettebø, Åshild; Fossum, Mariann

    2016-05-01

    The home healthcare context can be unpredictable and complex, and requires registered nurses with a high level of clinical reasoning skills and professional autonomy. Thus, additional knowledge about registered nurses' clinical reasoning performance during patient home care is required. The aim of this study is to describe the cognitive processes and thinking strategies used by recently graduated registered nurses while caring for patients in home healthcare clinical practice. An exploratory qualitative think-aloud design with protocol analysis was used. Home healthcare visits to patients with stroke, diabetes, and chronic obstructive pulmonary disease in seven healthcare districts in southern Norway. A purposeful sample of eight registered nurses with one year of experience. Each nurse was interviewed using the concurrent think-aloud technique in three different patient home healthcare clinical practice visits. A total of 24 home healthcare visits occurred. Follow-up interviews were conducted with each participant. The think-aloud sessions were transcribed and analysed using three-step protocol analysis. Recently graduated registered nurses focused on both general nursing concepts and concepts specific to the domains required and tasks provided in home healthcare services as well as for different patient groups. Additionally, participants used several assertion types, cognitive processes, and thinking strategies. Our results showed that recently graduated registered nurses used both simple and complex cognitive processes involving both inductive and deductive reasoning. However, their reasoning was more reactive than proactive. The results may contribute to nursing practice in terms of developing effective nursing education programmes. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. Analysis of MD5 authentication in various routing protocols using simulation tools

    Science.gov (United States)

    Dinakaran, M.; Darshan, K. N.; Patel, Harsh

    2017-11-01

    Authentication is an important paradigm of security, and computer networks require secure paths that protect the flow of data through security protocols. MD-5 (Message Digest 5) provides data integrity for the data being sent and authentication for the network devices. This paper gives a brief introduction to MD-5 and simulates networks that include MD-5 authentication using various routing protocols such as OSPF, EIGRP and RIPv2. GNS3 is used to simulate the scenarios. Analysis of the MD-5 authentication is presented in the later sections of the paper.
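The keyed-MD5 authentication used by RIPv2 and OSPF appends a shared secret to the message before hashing, in the style of RFC 2082. A simplified sketch of the digest computation (not the full packet layout, and the function names are illustrative):

```python
import hashlib

def keyed_md5(packet: bytes, key: bytes) -> bytes:
    """RFC 2082-style keyed digest: MD5 over the packet with the secret key
    (padded to 16 bytes) appended where the digest will later be carried."""
    padded_key = key.ljust(16, b"\x00")[:16]
    return hashlib.md5(packet + padded_key).digest()

def verify_digest(packet: bytes, key: bytes, digest: bytes) -> bool:
    """Receiver recomputes the digest with its own copy of the key."""
    return keyed_md5(packet, key) == digest
```

A router that does not know the key cannot forge a valid digest, which is how neighbors reject spoofed routing updates; note that MD5 itself is no longer considered collision-resistant, so modern deployments prefer SHA-based variants.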

  15. An Evaluation of Protocols for UAV Science Applications

    Science.gov (United States)

    Ivancic, William D.; Stewart, David E.; Sullivan, Donald V.; Finch, Patrick E.

    2012-01-01

    This paper identifies data transport needs for current and future science payloads deployed on the NASA Global Hawk Unmanned Aeronautical Vehicle (UAV). The NASA Global Hawk communication system and operational constrains are presented. The Genesis and Rapid Intensification Processes (GRIP) mission is used to provide the baseline communication requirements as a variety of payloads were utilized in this mission. User needs and desires are addressed. Protocols are matched to the payload needs and an evaluation of various techniques and tradeoffs are presented. Such techniques include utilization rate-base selective negative acknowledgement protocols and possible use of protocol enhancing proxies. Tradeoffs of communication architectures that address ease-of-use and security considerations are also presented.

  16. Automatic Validation of Protocol Narration

    DEFF Research Database (Denmark)

    Bodei, Chiara; Buchholtz, Mikael; Degano, Pierpablo

    2003-01-01

    We perform a systematic expansion of protocol narrations into terms of a process algebra in order to make precise some of the detailed checks that need to be made in a protocol. We then apply static analysis technology to develop an automatic validation procedure for protocols. Finally, we...

  17. Staged protocol for the treatment of chronic femoral shaft osteomyelitis with Ilizarov's technique followed by the use of intramedullary locked nail.

    Science.gov (United States)

    Chou, Po-Hsin; Lin, Hsi-Hsien; Su, Yu-Pin; Chiang, Chao-Ching; Chang, Ming-Chau; Chen, Chuan-Mu

    2017-06-01

    Infected nonunion of the femoral shaft is uncommon, and usually presents with challenging therapeutic and reconstructive problems. Controversy remains over the treatment of infected nonunion of the femoral shaft. The purposes of this retrospective study were to review the treatment outcomes and describe a staged protocol for spontaneous wound healing. Six patients with chronic femoral shaft infected nonunion from October 2002 to September 2010 were included in this retrospective study. Serial plain films and triple films of the lower legs were performed to evaluate the alignment of the treated femoral shaft and bony union following our staged protocol of Ilizarov distraction osteogenesis and intramedullary nailing. An average bone defect of 7 cm was noted after staged osteotomy. Mean follow-up was 87.5 (range, 38-133) months. Union was achieved in all six patients, with an average external fixation time of 6.8 (range, 5-11) months. There was no reinfection. One complication of a 4-cm leg-length discrepancy was noted, with an initial shortening of 15 cm. The mean knee ranges of motion (ROM) before the staged protocol and at final follow-up were 64.2±8.6 (range, 60-75)° and 53.3±9.3 (range, 40-65)°, respectively. The ROM at the knee joint decreased significantly following the staged protocol. In the treatment of chronic femur osteomyelitis, the staged protocol of Ilizarov distraction osteogenesis followed by intramedullary nailing was safe and successful, and allowed for union, realignment, reorientation, and leg-length restoration. With regard to the soft tissue, this technique provides a unique type of reconstructive closure for infected wounds. It is suggested that the staged protocol is reliable in providing successful simultaneous reconstruction for bone and soft tissue defects without flap coverage. Copyright © 2017. Published by Elsevier Taiwan LLC.

  18. Bulk analysis using nuclear techniques

    International Nuclear Information System (INIS)

    Borsaru, M.; Holmes, R.J.; Mathew, P.J.

    1983-01-01

    Bulk analysis techniques developed for the mining industry are reviewed. Using penetrating neutron and γ-radiations, measurements are obtained directly from a large volume of sample (3-30 kg). γ-techniques were used to determine the grade of iron ore and to detect shale on conveyor belts. Thermal neutron irradiation was developed for the simultaneous determination of iron and aluminium in iron ore on a conveyor belt. Thermal-neutron activation analysis includes the determination of alumina in bauxite, and manganese and alumina in manganese ore. Fast neutron activation analysis is used to determine silicon in iron ores, and alumina and silica in bauxite. Fast and thermal neutron activation has been used to determine the soil content in shredded sugar cane. (U.K.)

  19. Analysis of the Implementation of Standardized Clinical Protocol «Diabetes Mellitus Type 2» by Quality Indicators in Institutions of Kyiv Region

    Directory of Open Access Journals (Sweden)

    V.I. Tkachenko

    2014-10-01

    Full Text Available In Ukraine, a standardized clinical protocol (SCP) for the provision of medical care in diabetes mellitus type 2 (order of the Ministry of Healthcare of Ukraine dated 21.12.2012 № 1118), which identifies 4 quality indicators, is being implemented. The objective of this research was to analyze the implementation of the SCP based on monitoring of quality indicators in the institutions of the Kyiv region. Materials and Methods. A technique for assessing the quality of diabetes care, one element of which is the monitoring of the quality indicators specified in the SCP, was developed and applied. Collection and analysis of information was carried out using forms of primary records № 025/030 and 030/o and forms of statistical reporting № 12 and 20. Statistical analysis was performed using Excel 2007 and SPSS. Results. Today, primary health care institutions in the Kyiv region have developed local protocols, which confirms the implementation of the first quality indicator in accordance with the desired indicator value under the SCP. The second indicator, the percentage of patients whose glycated hemoglobin level was measured in the reporting period, amounted to 12.2 %, which is higher than in 2012 (8.84 %) but remains low. The third quality indicator, the percentage of patients admitted to hospital for diabetes mellitus and its complications during the reporting period, amounted to 15.01 %, while in 2012 it stood at 8.66 %. For comparison, this figure in 2007 was 9.37 %. Conclusions. The quality of care at this early stage of implementation is insufficient, partly due to physicians' lack of awareness of the major provisions of the protocol, lack of equipment, the need for patients to pay for medical services specified in the protocol, physicians' limited understanding of the characteristics of different types of medical and technological documents, and difficulties in the development and implementation of local protocols in particular. The obtained results are

  20. Antibody engineering: methods and protocols

    National Research Council Canada - National Science Library

    Chames, Patrick

    2012-01-01

    "Antibody Engineering: Methods and Protocols, Second Edition was compiled to give complete and easy access to a variety of antibody engineering techniques, starting from the creation of antibody repertoires and efficient...

  1. Reliability analysis techniques in power plant design

    International Nuclear Information System (INIS)

    Chang, N.E.

    1981-01-01

    An overview of reliability analysis techniques is presented as applied to power plant design. The key terms, power plant performance, reliability, availability and maintainability are defined. Reliability modeling, methods of analysis and component reliability data are briefly reviewed. Application of reliability analysis techniques from a design engineering approach to improving power plant productivity is discussed. (author)

  2. Analysis of limiting information characteristics of quantum-cryptography protocols

    International Nuclear Information System (INIS)

    Sych, D V; Grishanin, Boris A; Zadkov, Viktor N

    2005-01-01

    The problem of increasing the critical error rate of quantum-cryptography protocols by varying a set of letters in a quantum alphabet for space of a fixed dimensionality is studied. Quantum alphabets forming regular polyhedra on the Bloch sphere and the continual alphabet equally including all the quantum states are considered. It is shown that, in the absence of basis reconciliation, a protocol with the tetrahedral alphabet has the highest critical error rate among the protocols considered, while after the basis reconciliation, a protocol with the continual alphabet possesses the highest critical error rate. (quantum optics and quantum computation)

  3. Allowing Students to Select Deliverables for Peer Review: Analysis of a Free-Selection Protocol

    DEFF Research Database (Denmark)

    Papadopoulos, Pantelis M.; Lagkas, Thomas; Demetriadis, Stavros

    2011-01-01

    This study analyzes the benefits and limitations of a “free-selection” peer assignment protocol by comparing them to the widely implemented “assigned-pair” protocol. The primary motivation was to circumvent the issues that often appear to the instructors implementing peer review activities with pre......-Selection, where students were able to explore and select peer work for review. Result analysis showed a very strong tendency in favor of the Free-Selection students regarding both domain specific (conceptual) and domain-general (reviewing) knowledge....

  4. MRI technique for the preoperative evaluation of deep infiltrating endometriosis: current status and protocol recommendation

    International Nuclear Information System (INIS)

    Schneider, C.; Oehmke, F.; Tinneberg, H.-R.; Krombach, G.A.

    2016-01-01

    Endometriosis is a common cause of chronic pelvic pain and infertility. It is defined as the occurrence of endometrial tissue outside the uterine cavity and can manifest as a peritoneal, ovarian or infiltrating form, the latter being referred to as deep infiltrating endometriosis (DIE). Surgery is essential in the treatment of DIE and depending on the severity of the disease, surgery can be difficult and extensive. Beside clinical examination and ultrasound, magnetic resonance imaging (MRI) has proven its value to provide useful information for planning surgery in patients with suspected DIE. To optimise the quality of MRI examinations, radiologists have to be familiar with the capabilities and also the limitations of this technique with respect to the assessment of DIE. MRI yields morphological information by using mainly T1- and T2-weighted sequences, but can also provide functional information by means of intravenous gadolinium, diffusion-weighted imaging or cine-MRI. In this article, these techniques and also adequate measures of patient preparation, which are indispensable for successful MRI imaging for the preoperative evaluation of DIE, are reviewed and a comprehensive protocol recommendation is provided.

  5. Nuclear analysis techniques and environmental sciences

    International Nuclear Information System (INIS)

    1997-10-01

    31 theses are collected in this book. It introduces molecular activation analysis, micro-PIXE and micro-probe analysis, x-ray fluorescence analysis and accelerator mass spectrometry. The applications of these nuclear analysis techniques to the environmental sciences are presented and reviewed

  6. Statistical evaluation of vibration analysis techniques

    Science.gov (United States)

    Milner, G. Martin; Miller, Patrice S.

    1987-01-01

    An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.
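Quantifying detection performance as probability of detection versus probability of false alarm, as this record describes, amounts to sweeping a threshold over detector scores. A minimal sketch, assuming labelled score samples from signal-present and noise-only trials (the function name and data are illustrative):

```python
def detection_curve(signal_scores, noise_scores, thresholds):
    """For each threshold, return (P_false_alarm, P_detection): the fraction of
    noise-only / signal-present scores at or above that threshold."""
    curve = []
    for t in thresholds:
        pd = sum(s >= t for s in signal_scores) / len(signal_scores)
        pfa = sum(s >= t for s in noise_scores) / len(noise_scores)
        curve.append((pfa, pd))
    return curve
```

Plotting these points over a fine threshold grid yields the receiver operating characteristic, which lets diverse vibration-analysis techniques be compared on a common footing regardless of their internal score scales.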

  7. Techniques for sensitivity analysis of SYVAC results

    International Nuclear Information System (INIS)

    Prust, J.O.

    1985-05-01

    Sensitivity analysis techniques may be required to examine the sensitivity of SYVAC model predictions to the input parameter values, to the subjective probability distributions assigned to the input parameters, and to the relationship between dose and the probability of fatal cancers plus serious hereditary disease in the first two generations of offspring of a member of the critical group. This report mainly considers techniques for determining the sensitivity of dose and risk to the variable input parameters. The performance of a sensitivity analysis technique may be improved by decomposing the model and data into subsets for analysis, making use of existing information on sensitivity, and concentrating sampling in regions of the parameter space that generate high doses or risks. A number of sensitivity analysis techniques are reviewed for their application to the SYVAC model, including four techniques tested in an earlier study by CAP Scientific for the SYVAC project. This report recommends the development now of a method for evaluating the derivative of dose with respect to parameter value, and extending the Kruskal-Wallis technique to test for interactions between parameters. It is also recommended that the sensitivity of the output of each sub-model of SYVAC to input parameter values should be examined. (author)
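The Kruskal-Wallis technique recommended in this record compares groups of samples via rank sums. A minimal sketch of the H statistic itself, with average ranks for ties (the chi-squared significance lookup is omitted):

```python
def kruskal_wallis_h(*groups):
    """Kruskal-Wallis H: rank all observations jointly, then compare per-group
    rank sums. H near 0 means the groups look alike; large H means they differ."""
    pooled = [(v, gi) for gi, g in enumerate(groups) for v in g]
    pooled.sort()
    n = len(pooled)
    # assign 1-based ranks, averaging over runs of tied values
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and pooled[j + 1][0] == pooled[i][0]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[k] = avg
        i = j + 1
    rank_sums = [0.0] * len(groups)
    for (v, gi), r in zip(pooled, ranks):
        rank_sums[gi] += r
    return 12.0 / (n * (n + 1)) * sum(
        rs * rs / len(g) for rs, g in zip(rank_sums, groups)
    ) - 3 * (n + 1)
```

In a sensitivity-analysis setting, the "groups" would be model outputs binned by an input parameter's value; a large H flags that parameter as influential without assuming any distributional form.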

  8. Techniques involving extreme environment, nondestructive techniques, computer methods in metals research, and data analysis

    International Nuclear Information System (INIS)

    Bunshah, R.F.

    1976-01-01

A number of different techniques spanning several aspects of materials research are covered in this volume. They are concerned with property evaluation at 4.0 K and below, surface characterization, coating techniques, techniques for the fabrication of composite materials, computer methods, data evaluation and analysis, statistical design of experiments, and non-destructive test techniques. Topics covered in this part include internal friction measurements; nondestructive testing techniques; statistical design of experiments and regression analysis in metallurgical research; and measurement of surfaces of engineering materials.

  9. Protocol: An updated integrated methodology for analysis of metabolites and enzyme activities of ethylene biosynthesis

    Directory of Open Access Journals (Sweden)

    Geeraerd Annemie H

    2011-06-01

Full Text Available Abstract Background The foundations for ethylene research were laid many years ago by researchers such as Lizada, Yang and Hoffman. Nowadays, most of the methods developed by them are still being used. Technological developments since then have led to small but significant improvements, contributing to a more efficient workflow. Despite this, many of these improvements have never been properly documented. Results This article provides an updated, integrated set of protocols suitable for the assembly of a complete picture of ethylene biosynthesis, including the measurement of ethylene itself. The original protocols for the metabolites 1-aminocyclopropane-1-carboxylic acid and 1-(malonylamino)cyclopropane-1-carboxylic acid have been updated and downscaled, while protocols to determine in vitro activities of the key enzymes 1-aminocyclopropane-1-carboxylate synthase and 1-aminocyclopropane-1-carboxylate oxidase have been optimised for efficiency, repeatability and accuracy. All the protocols described were optimised for apple fruit, but have been proven suitable for the analysis of tomato fruit as well. Conclusions This work collates an integrated set of detailed protocols for the measurement of components of the ethylene biosynthetic pathway, starting from well-established methods. These protocols have been optimised for smaller sample volumes, increased efficiency, repeatability and accuracy. The detailed protocols allow other scientists to rapidly implement these methods in their own laboratories in a consistent and efficient way.

  10. Applications of Electromigration Techniques: Applications of Electromigration Techniques in Food Analysis

    Science.gov (United States)

    Wieczorek, Piotr; Ligor, Magdalena; Buszewski, Bogusław

Electromigration techniques, including capillary electrophoresis (CE), are widely used for the separation and identification of compounds present in food products. These techniques may also be considered alternative and complementary to commonly used analytical techniques such as high-performance liquid chromatography (HPLC) or gas chromatography (GC). Applications of CE to the determination of high-molecular-weight compounds such as polyphenols (including flavonoids), pigments, vitamins, and food additives (preservatives, antioxidants, sweeteners, artificial pigments) are presented. Also, methods developed for the determination of proteins and peptides composed of amino acids, which are basic components of food products, are studied. Other substances such as carbohydrates, nucleic acids, biogenic amines, natural toxins, and other contaminants, including pesticides and antibiotics, are discussed. The possibility of applying CE in food control laboratories, where analyses of the composition of food and food products are conducted, is of great importance. The CE technique may be used during the control of technological processes in the food industry and for the identification of numerous compounds present in food. Due to its numerous advantages, the CE technique is successfully used in routine food analysis.

  11. Flow analysis techniques for phosphorus: an overview.

    Science.gov (United States)

    Estela, José Manuel; Cerdà, Víctor

    2005-04-15

A bibliographical review of the implementation and the results obtained with different flow analytical techniques for the determination of phosphorus is carried out. The sources, occurrence and importance of phosphorus, together with several aspects regarding the analysis and terminology used in the determination of this element, are briefly described. A classification and a brief description of the basis, advantages and disadvantages of the different existing flow techniques, namely segmented flow analysis (SFA), flow injection analysis (FIA), sequential injection analysis (SIA), all injection analysis (AIA), batch injection analysis (BIA), multicommutated FIA (MCFIA), multisyringe FIA (MSFIA) and multipumped FIA (MPFIA), is also provided. The most relevant manuscripts regarding the analysis of phosphorus by means of flow techniques are classified herein according to the instrumental detection technique used, with the aim of facilitating their study and providing an overall scope. Finally, the analytical characteristics of numerous flow methods reported in the literature are provided in the form of a table, and their applicability to samples with different matrices, namely water samples (marine, river, estuarine, waste, industrial, drinking, etc.), soil leachates, plant leaves, toothpaste, detergents, foodstuffs (wine, orange juice, milk), biological samples, sugars, fertilizer, hydroponic solutions, soil extracts and cyanobacterial biofilms, is tabulated.

  12. Techniques and Protocols for Dispersing Nanoparticle Powders in Aqueous Media-Is there a Rationale for Harmonization?

    Science.gov (United States)

    Hartmann, Nanna B; Jensen, Keld Alstrup; Baun, Anders; Rasmussen, Kirsten; Rauscher, Hubert; Tantra, Ratna; Cupi, Denisa; Gilliland, Douglas; Pianella, Francesca; Riego Sintes, Juan M

    2015-01-01

    Selecting appropriate ways of bringing engineered nanoparticles (ENP) into aqueous dispersion is a main obstacle for testing, and thus for understanding and evaluating, their potential adverse effects to the environment and human health. Using different methods to prepare (stock) dispersions of the same ENP may be a source of variation in the toxicity measured. Harmonization and standardization of dispersion methods applied in mammalian and ecotoxicity testing are needed to ensure a comparable data quality and to minimize test artifacts produced by modifications of ENP during the dispersion preparation process. Such harmonization and standardization will also enhance comparability among tests, labs, and studies on different types of ENP. The scope of this review was to critically discuss the essential parameters in dispersion protocols for ENP. The parameters are identified from individual scientific studies and from consensus reached in larger scale research projects and international organizations. A step-wise approach is proposed to develop tailored dispersion protocols for ecotoxicological and mammalian toxicological testing of ENP. The recommendations of this analysis may serve as a guide to researchers, companies, and regulators when selecting, developing, and evaluating the appropriateness of dispersion methods applied in mammalian and ecotoxicity testing. However, additional experimentation is needed to further document the protocol parameters and investigate to what extent different stock dispersion methods affect ecotoxicological and mammalian toxicological responses of ENP.

  13. Analysis of Intracellular Metabolites from Microorganisms: Quenching and Extraction Protocols.

    Science.gov (United States)

    Pinu, Farhana R; Villas-Boas, Silas G; Aggio, Raphael

    2017-10-23

Sample preparation is one of the most important steps in metabolome analysis. The challenges of determining the microbial metabolome have been well discussed within the research community, and many improvements have already been achieved in the last decade. The analysis of intracellular metabolites is particularly challenging. Environmental perturbations may considerably affect microbial metabolism, which results in intracellular metabolites being rapidly degraded or metabolized by enzymatic reactions. Therefore, quenching, i.e. the complete stop of cell metabolism, is a pre-requisite for accurate intracellular metabolite analysis. After quenching, metabolites need to be extracted from the intracellular compartment. The choice of the most suitable metabolite extraction method(s) is another crucial step. The literature indicates that specific classes of metabolites are better extracted by different extraction protocols. In this review, we discuss the technical aspects and advancements of quenching and extraction in intracellular metabolite analysis from microbial cells.

  14. The specification-based validation of reliable multicast protocol: Problem Report. M.S. Thesis

    Science.gov (United States)

    Wu, Yunqing

    1995-01-01

Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP multicasting. In this report, we develop formal models for RMP using existing automated verification systems and perform validation on the formal RMP specifications. The validation analysis helped identify some minor specification and design problems. We also use the formal models of RMP to generate a test suite for conformance testing of the implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress of the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding of the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  15. The preparation of Drosophila embryos for live-imaging using the hanging drop protocol.

    Science.gov (United States)

    Reed, Bruce H; McMillan, Stephanie C; Chaudhary, Roopali

    2009-03-13

    Green fluorescent protein (GFP)-based timelapse live-imaging is a powerful technique for studying the genetic regulation of dynamic processes such as tissue morphogenesis, cell-cell adhesion, or cell death. Drosophila embryos expressing GFP are readily imaged using either stereoscopic or confocal microscopy. A goal of any live-imaging protocol is to minimize detrimental effects such as dehydration and hypoxia. Previous protocols for preparing Drosophila embryos for live-imaging analysis have involved placing dechorionated embryos in halocarbon oil and sandwiching them between a halocarbon gas-permeable membrane and a coverslip. The introduction of compression through mounting embryos in this manner represents an undesirable complication for any biomechanical-based analysis of morphogenesis. Our method, which we call the hanging drop protocol, results in excellent viability of embryos during live imaging and does not require that embryos be compressed. Briefly, the hanging drop protocol involves the placement of embryos in a drop of halocarbon oil that is suspended from a coverslip, which is, in turn, fixed in position over a humid chamber. In addition to providing gas exchange and preventing dehydration, this arrangement takes advantage of the buoyancy of embryos in halocarbon oil to prevent them from drifting out of position during timelapse acquisition. This video describes in detail how to collect and prepare Drosophila embryos for live imaging using the hanging drop protocol. This protocol is suitable for imaging dechorionated embryos using stereomicroscopy or any upright compound fluorescence microscope.

  16. Studies on osteoporosis in Chile using isotope-related techniques

    International Nuclear Information System (INIS)

    Lobo, G.; Palma, T.; Cortes-Toro, E.

    1996-01-01

Several studies on bone densitometry measurements in healthy individuals have been performed in Chile. However, because different techniques and no uniform protocols have been used to select patients, the results obtained are not suitable as reference values for a normal Chilean population; therefore, foreign reference values are used. This study will select healthy normal individuals, typical urban Chilean residents, and measure bone density using the DEXA technique. The selection will be made according to a well-defined protocol. Serum osteocalcin, a marker of bone remodeling, will be measured in all subjects as a means of assessing bone metabolism. Bone trace element composition will be measured in selected subjects. Samples will be obtained by biopsy or through normal surgical procedures and will be analyzed by neutron activation analysis. (author)

  17. Dosimetric evaluation of cone beam computed tomography scanning protocols

    International Nuclear Information System (INIS)

    Soares, Maria Rosangela

    2015-01-01

Cone beam computed tomography (CBCT) scanning protocols were evaluated. CBCT was introduced in dental radiology at the end of the 1990s and quickly became a fundamental examination for various procedures. Its main characteristic, distinguishing it from medical CT, is the beam shape. This study aimed to calculate the absorbed dose in eight tissues/organs of the head and neck, and to estimate the effective dose in 13 protocols and two techniques (stitched FOV and single FOV) on 5 cone beam CT units from different manufacturers. For that purpose, a female anthropomorphic phantom representing a reference woman was used, into which thermoluminescent dosimeters were inserted at several points representing organs/tissues with the weighting values presented in ICRP Publication 103. The results were evaluated by comparing the dose according to the purpose of the tomographic image. Among the results, there is a difference of up to 325% in the effective dose between protocols with the same image goal. Regarding the image acquisition technique, the stitched FOV technique resulted in an effective dose 5.3 times greater than the single FOV technique for protocols with the same image goal. In terms of individual contribution, the salivary glands are responsible for 31% of the effective dose in CT exams; the remaining tissues also make a significant contribution, 36%. The results drew attention to the need to estimate the effective dose for the different equipment and protocols on the market, in addition to knowing the radiation parameters and the equipment engineering used to obtain the image. (author)

  18. TV content analysis techniques and applications

    CERN Document Server

    Kompatsiaris, Yiannis

    2012-01-01

    The rapid advancement of digital multimedia technologies has not only revolutionized the production and distribution of audiovisual content, but also created the need to efficiently analyze TV programs to enable applications for content managers and consumers. Leaving no stone unturned, TV Content Analysis: Techniques and Applications provides a detailed exploration of TV program analysis techniques. Leading researchers and academics from around the world supply scientifically sound treatment of recent developments across the related subject areas--including systems, architectures, algorithms,

  19. A Comparative Analysis of Transmission Control Protocol Improvement Techniques over Space-Based Transmission Media

    National Research Council Canada - National Science Library

    Lawson, Joseph M

    2006-01-01

    ... justification for the implementation of a given enhancement technique. The research questions were answered through model and simulation of a satellite transmission system via a Linux-based network topology...

  20. Evaluation of Extraction Protocols for Simultaneous Polar and Non-Polar Yeast Metabolite Analysis Using Multivariate Projection Methods

    Directory of Open Access Journals (Sweden)

    Nicolas P. Tambellini

    2013-07-01

Full Text Available Metabolomic and lipidomic approaches aim to measure metabolites or lipids in the cell. Metabolite extraction is a key step in obtaining useful and reliable data for successful metabolite studies. Significant efforts have been made to identify the optimal extraction protocol for various platforms and biological systems, for both polar and non-polar metabolites. Here we report an approach utilizing chemoinformatics for the systematic comparison of protocols to extract both from a single sample of the model yeast organism Saccharomyces cerevisiae. Three chloroform/methanol/water partitioning-based extraction protocols found in the literature were evaluated for their effectiveness at reproducibly extracting both polar and non-polar metabolites. Fatty acid methyl esters and methoxyamine/trimethylsilyl-derivatized aqueous compounds were analyzed by gas chromatography-mass spectrometry to evaluate non-polar and polar metabolite analysis, respectively. The comparative breadth and amount of recovered metabolites were evaluated using multivariate projection methods. This approach identified an optimal protocol, with 64 identified polar metabolites from 105 ion hits and 12 fatty acids recovered, and will potentially attenuate the error and variation associated with combining metabolite profiles from different samples for untargeted analysis with both polar and non-polar analytes. It also confirmed the value of using multivariate projection methods to compare established extraction protocols.

  1. The development of standard operating protocols for paediatric radiology

    International Nuclear Information System (INIS)

    Hardwick, J.; Mencik, C.; McLaren, C.; Young, C.; Scadden, S.; Mashford, P.; McHugh, K.; Beckett, M.; Calvert, M.; Marsden, P.J.

    2001-01-01

    This paper describes how the requirement for operating protocols for standard radiological practice was expanded to provide a comprehensive aide to the operator conducting a medical exposure. The protocols adopted now include justification criteria, patient preparation, radiographic technique, standard exposure charts, diagnostic reference levels and image quality criteria. In total, the protocols have been welcomed as a tool for ensuring that medical exposures are properly optimised. (author)

  2. A Comparative Analysis of Transmission Control Protocol Improvement Techniques over Space-Based Transmission Media

    National Research Council Canada - National Science Library

    Lawson, Joseph M

    2006-01-01

    The purpose of this study was to assess the throughput improvement afforded by the various TCP optimization techniques, with respect to a simulated geosynchronous satellite system, to provide a cost...

  3. Transgenic mouse - Methods and protocols, 2nd edition

    Directory of Open Access Journals (Sweden)

    Carlo Alberto Redi

    2011-09-01

Full Text Available Marten H. Hofner (from the Dept. of Pathology of Groningen University) and Jan M. van Deursen (from the Mayo College of Medicine at Rochester, MN, USA) provided us with the valuable second edition of Transgenic mouse: in fact, even though we are in the –omics era and already equipped with state-of-the-art techniques in every field, we still need gene function analysis data to understand common and complex diseases. Transgenesis is still an irreplaceable method, and protocols for performing it well are more than welcome. Here, how to obtain genetically modified mice (the quintessential model of so many human diseases, considering how much of the human genome is conserved in the mouse and the great blocks of genic synteny existing between the two genomes) is analysed in depth and presented in clearly detailed, step-by-step protocols.

  4. PHOTOGRAMMETRIC TECHNIQUES FOR ROAD SURFACE ANALYSIS

    Directory of Open Access Journals (Sweden)

    V. A. Knyaz

    2016-06-01

Full Text Available The quality and condition of a road surface are of great importance for the convenience and safety of driving. Investigations of the behaviour of road materials under laboratory conditions and monitoring of existing roads are therefore widely performed to control geometric parameters and detect defects in the road surface. Photogrammetry, as an accurate non-contact measuring method, provides powerful means for solving different tasks in road surface reconstruction and analysis. The range of dimensions involved in road surface analysis varies greatly, from tenths of a millimetre to hundreds of metres and more, so a set of techniques is needed to meet all requirements of road parameter estimation. Two photogrammetric techniques for road surface analysis are presented: one for accurate measurement of road pavement, and one for road surface reconstruction based on imagery obtained from an unmanned aerial vehicle. The first technique uses a photogrammetric system based on structured light for fast and accurate 3D surface reconstruction, allowing analysis of the characteristics of road texture and monitoring of pavement behaviour. The second technique provides a dense 3D road model suitable for estimating road macro parameters.

  5. New method development in prehistoric stone tool research: evaluating use duration and data analysis protocols.

    Science.gov (United States)

    Evans, Adrian A; Macdonald, Danielle A; Giusca, Claudiu L; Leach, Richard K

    2014-10-01

    Lithic microwear is a research field of prehistoric stone tool (lithic) analysis that has been developed with the aim to identify how stone tools were used. It has been shown that laser scanning confocal microscopy has the potential to be a useful quantitative tool in the study of prehistoric stone tool function. In this paper, two important lines of inquiry are investigated: (1) whether the texture of worn surfaces is constant under varying durations of tool use, and (2) the development of rapid objective data analysis protocols. This study reports on the attempt to further develop these areas of study and results in a better understanding of the complexities underlying the development of flexible analytical algorithms for surface analysis. The results show that when sampling is optimised, surface texture may be linked to contact material type, independent of use duration. Further research is needed to validate this finding and test an expanded range of contact materials. The use of automated analytical protocols has shown promise but is only reliable if sampling location and scale are defined. Results suggest that the sampling protocol reports on the degree of worn surface invasiveness, complicating the ability to investigate duration related textural characterisation. Copyright © 2014. Published by Elsevier Ltd.

  6. Analysis of Intracellular Metabolites from Microorganisms: Quenching and Extraction Protocols

    Directory of Open Access Journals (Sweden)

    Farhana R. Pinu

    2017-10-01

Full Text Available Sample preparation is one of the most important steps in metabolome analysis. The challenges of determining the microbial metabolome have been well discussed within the research community, and many improvements have already been achieved in the last decade. The analysis of intracellular metabolites is particularly challenging. Environmental perturbations may considerably affect microbial metabolism, which results in intracellular metabolites being rapidly degraded or metabolized by enzymatic reactions. Therefore, quenching, i.e. the complete stop of cell metabolism, is a pre-requisite for accurate intracellular metabolite analysis. After quenching, metabolites need to be extracted from the intracellular compartment. The choice of the most suitable metabolite extraction method(s) is another crucial step. The literature indicates that specific classes of metabolites are better extracted by different extraction protocols. In this review, we discuss the technical aspects and advancements of quenching and extraction in intracellular metabolite analysis from microbial cells.

  7. Investigation of the Study Characteristics Affecting Clinical Trial Quality Using the Protocol Deviations Leading to Exclusion of Subjects From the Per Protocol Set Data in Studies for New Drug Application: A Retrospective Analysis.

    Science.gov (United States)

    Kohara, Norihito; Kaneko, Masayuki; Narukawa, Mamoru

    2018-01-01

The concept of the risk-based approach has been introduced as an effort to secure the quality of clinical trials. In the risk-based approach, identification and evaluation of risk in advance are considered important. For recently completed clinical trials, we investigated the relationship between study characteristics and protocol deviations leading to the exclusion of subjects from the Per Protocol Set (PPS) efficacy analysis. New drugs approved in Japan in the fiscal year 2014-2015 were targeted in the research. The reasons for excluding subjects from the PPS efficacy analysis were described in 102 trials out of 492 in the summary of new drug application documents, which was publicly disclosed after the drug's regulatory approval. The authors extracted these reasons along with the numbers of cases and the study characteristics of each clinical trial. Then, direct comparison, univariate regression analysis, and multivariate regression analysis were carried out based on the exclusion rate. The study characteristics for which exclusion of subjects from the PPS efficacy analysis was frequently observed were: multiregional clinical trials (study region); inhalant and external use (administration route); and anti-infectives for systemic use, respiratory system, dermatologicals, and nervous system (therapeutic drug under the Anatomical Therapeutic Chemical Classification). In the multivariate regression analysis, the clinical trial variables of inhalant, respiratory system, or dermatologicals were selected as study characteristics leading to a higher exclusion rate. The characteristics of clinical trials likely to cause protocol deviations that affect efficacy analysis were thus suggested. These studies should be considered for specific attention and priority observation in the trial protocol or its monitoring plan and execution, such as a clear description of inclusion/exclusion criteria in the protocol, development of training materials to site staff, and

  8. Constrained principal component analysis and related techniques

    CERN Document Server

    Takane, Yoshio

    2013-01-01

    In multivariate data analysis, regression techniques predict one set of variables from another while principal component analysis (PCA) finds a subspace of minimal dimensionality that captures the largest variability in the data. How can regression analysis and PCA be combined in a beneficial way? Why and when is it a good idea to combine them? What kind of benefits are we getting from them? Addressing these questions, Constrained Principal Component Analysis and Related Techniques shows how constrained PCA (CPCA) offers a unified framework for these approaches.The book begins with four concre
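The combination the book describes, regression followed by PCA, can be sketched in a few lines: split the data into the part explained by external variables and a residual, then apply PCA to the explained part. The matrices below are synthetic and the variable names illustrative, not taken from the book:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: n samples, p variables, plus an external design matrix G.
n, p = 50, 6
G = rng.normal(size=(n, 2))                 # external (constraint) variables
Z = G @ rng.normal(size=(2, p)) + 0.1 * rng.normal(size=(n, p))

# Step 1 (regression): project Z onto the column space of G.
P = G @ np.linalg.pinv(G)                   # projector onto span(G)
Z_explained = P @ Z                         # part captured by the constraints
Z_residual = Z - Z_explained                # part orthogonal to them

# Step 2 (PCA): principal components of the constrained part via SVD.
U, s, Vt = np.linalg.svd(Z_explained, full_matrices=False)
scores = U[:, :2] * s[:2]                   # component scores
loadings = Vt[:2].T                         # component loadings
```

Because the projection has rank two, only the first two singular values of `Z_explained` are non-negligible; PCA of `Z_residual` would likewise summarize the variability the external variables cannot explain.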

  9. Time-series-analysis techniques applied to nuclear-material accounting

    International Nuclear Information System (INIS)

    Pike, D.H.; Morrison, G.W.; Downing, D.J.

    1982-05-01

    This document is designed to introduce the reader to the applications of Time Series Analysis techniques to Nuclear Material Accountability data. Time series analysis techniques are designed to extract information from a collection of random variables ordered by time by seeking to identify any trends, patterns, or other structure in the series. Since nuclear material accountability data is a time series, one can extract more information using time series analysis techniques than by using other statistical techniques. Specifically, the objective of this document is to examine the applicability of time series analysis techniques to enhance loss detection of special nuclear materials. An introductory section examines the current industry approach which utilizes inventory differences. The error structure of inventory differences is presented. Time series analysis techniques discussed include the Shewhart Control Chart, the Cumulative Summation of Inventory Differences Statistics (CUSUM) and the Kalman Filter and Linear Smoother
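Of the techniques listed, the CUSUM statistic is the simplest to illustrate. The sketch below runs a one-sided CUSUM over hypothetical inventory differences; the numbers and tuning constants are invented for illustration, not drawn from the report:

```python
# Hypothetical monthly inventory differences: measurement noise around zero
# for the first year, then a small sustained loss of material.
ids = [0.2, -0.5, 0.1, 0.4, -0.3, 0.0, -0.2, 0.3, -0.1, 0.2, 0.1, -0.4,
       1.5, 1.2, 1.8, 1.4, 1.6, 1.3, 1.7, 1.5, 1.2, 1.6, 1.4, 1.8]

# One-sided CUSUM: S_t = max(0, S_{t-1} + (x_t - k)); alarm when S_t > h.
k, h = 0.4, 5.0          # reference value and decision interval (illustrative)
s, alarms = 0.0, []
for t, x in enumerate(ids):
    s = max(0.0, s + (x - k))
    if s > h:
        alarms.append(t)
# The statistic first crosses the decision interval a few periods after the
# loss begins (index 12), showing how CUSUM accumulates small sustained
# shifts that a single-period inventory-difference test would miss.
```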

  10. Immunocytochemical methods and protocols

    National Research Council Canada - National Science Library

    Javois, Lorette C

    1999-01-01

    ... monoclonal antibodies to study cell differentiation during embryonic development. For a select few disciplines volumes have been published focusing on the specific application of immunocytochemical techniques to that discipline. What distinguished Immunocytochemical Methods and Protocols from earlier books when it was first published four years ago was i...

  11. A Clustering Routing Protocol for Mobile Ad Hoc Networks

    Directory of Open Access Journals (Sweden)

    Jinke Huang

    2016-01-01

Full Text Available The dynamic topology of a mobile ad hoc network poses a real challenge in the design of a hierarchical routing protocol, which combines proactive with reactive routing protocols and takes advantage of both. As an essential technique of hierarchical routing protocols, the clustering of nodes provides an efficient method of establishing a hierarchical structure in mobile ad hoc networks. In this paper, we designed a novel clustering algorithm and a corresponding hierarchical routing protocol for large-scale mobile ad hoc networks. Each cluster is composed of a cluster head, several cluster gateway nodes, several cluster guest nodes, and other cluster members. The proposed routing protocol uses a proactive protocol between nodes within individual clusters and a reactive protocol between clusters. Simulation results show that the proposed clustering algorithm and hierarchical routing protocol provide superior performance, with several advantages over existing clustering algorithms and routing protocols, respectively.
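As a point of comparison, the classic lowest-ID clustering heuristic, one of the existing algorithms such protocols are measured against, fits in a few lines. The topology below is a made-up six-node network, and the gateway/guest roles of the paper's own algorithm are not modelled:

```python
# Adjacency of a hypothetical ad hoc network (node ID -> one-hop neighbours).
neighbors = {1: {2, 3}, 2: {1, 3, 4}, 3: {1, 2}, 4: {2, 5}, 5: {4, 6}, 6: {5}}

# Lowest-ID heuristic: visiting IDs in ascending order, an unclustered node
# declares itself cluster head and absorbs its unclustered one-hop neighbours.
cluster_head = {}
for node in sorted(neighbors):
    if node not in cluster_head:
        cluster_head[node] = node
        for nb in neighbors[node]:
            cluster_head.setdefault(nb, node)
# Result for this topology: three clusters, headed by nodes 1, 4 and 6.
```

In a mobile network this election must be rerun, or repaired locally, as links appear and disappear, which is exactly the maintenance cost cluster-based protocols try to keep low.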

  12. Analysis of 213 currently used rehabilitation protocols in foot and ankle fractures.

    Science.gov (United States)

    Pfeifer, Christian G; Grechenig, Stephan; Frankewycz, Borys; Ernstberger, Antonio; Nerlich, Michael; Krutsch, Werner

    2015-10-01

Fractures of the ankle, hind- and midfoot are amongst the five most common fractures. Besides initial operative or non-operative treatment, rehabilitation of the patients plays a crucial role in fracture union and long-term functional outcome. Limited evidence is available with regard to what a rehabilitation regimen should include and what guidelines should be in place for the initial clinical course of these patients. This study therefore investigated current rehabilitation concepts after fractures of the ankle, hind- and midfoot. Written rehabilitation protocols provided by orthopedic and trauma surgery institutions, in terms of recommendations for weight bearing, range of motion (ROM), physiotherapy and choice of orthosis, were screened and analysed. All protocols for lateral ankle fractures type AO 44A1, AO 44B1 and AO 44C1, for calcaneal fractures and for fractures of the metatarsals, as well as others not specified, were included. Descriptive analysis was carried out and statistical analysis applied where appropriate. 209 rehabilitation protocols for ankle fractures type AO 44B1 and AO 44C1, 98 for AO 44A1, 193 for metatarsal fractures, 142 for calcaneal fractures, 107 for 5th metatarsal base fractures and 70 for 5th metatarsal Jones fractures were evaluated. The mean time recommended for orthosis treatment was 6.04 (SD 0.04) weeks. While the majority of protocols showed a trend towards increased weight bearing and increased ROM over time, the best consensus was noted for weight bearing recommendations. Our study shows that huge variability exists in the rehabilitation of fractures of the ankle, hind- and midfoot. This may be attributed to a lack of consensus (e.g. missing publication of guidelines), individualized patient care (e.g. in fragility fractures) or lack of specialization. This study might serve as a basis for prospective randomized controlled trials in order to optimize rehabilitation for these common fractures. Copyright © 2015 Elsevier Ltd

  13. Design and Analysis of an Enhanced Patient-Server Mutual Authentication Protocol for Telecare Medical Information System.

    Science.gov (United States)

    Amin, Ruhul; Islam, S K Hafizul; Biswas, G P; Khan, Muhammad Khurram; Obaidat, Mohammad S

    2015-11-01

    In order to access a remote medical server, patients generally use a smart card to log in. It has been observed that most user (patient) authentication protocols are vulnerable to stolen smart card attacks, meaning an attacker can mount several common attacks after extracting the smart card's information. Recently, Lu et al. proposed a session key agreement protocol between the patient and the remote medical server and claimed that it is secure against the relevant security attacks. However, this paper presents several attacks on Lu et al.'s protocol: an identity trace attack, a new smart card issue attack, a patient impersonation attack and a medical server impersonation attack. In order to fix these security pitfalls, including the stolen smart card attack, this paper proposes an efficient remote mutual authentication protocol using a smart card. We have simulated the proposed protocol using the widely accepted AVISPA simulation tool, whose results confirm that the protocol is secure against active and passive attacks, including replay and man-in-the-middle attacks. Moreover, rigorous security analysis shows that the proposed protocol provides strong protection against the relevant attacks, including the stolen smart card attack. We compare the proposed scheme with several related schemes in terms of computation cost, communication cost and security functionality, and find it comparatively better than the related existing schemes.
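The abstract does not specify the message flow of the proposed protocol. Purely as an illustration of the generic nonce-based mutual-authentication pattern discussed here (this is an assumed sketch, not Lu et al.'s scheme nor the authors' proposal), a challenge/response exchange with session-key derivation can look like:

```python
import hmac, hashlib, os

# Illustrative nonce-based mutual authentication with a pre-shared key.
# NOT the paper's protocol; it only shows the generic pattern of
# challenge/response in both directions plus session-key derivation.

def hmac_sha256(key: bytes, msg: bytes) -> bytes:
    return hmac.new(key, msg, hashlib.sha256).digest()

shared_key = os.urandom(32)      # provisioned on the smart card and the server

# Patient -> server: login request carrying a fresh nonce
n_patient = os.urandom(16)

# Server -> patient: its own nonce plus a proof that it knows the key
n_server = os.urandom(16)
server_proof = hmac_sha256(shared_key, n_patient + n_server + b"server")

# Patient verifies the server, then returns its own proof
assert hmac.compare_digest(
    server_proof, hmac_sha256(shared_key, n_patient + n_server + b"server"))
patient_proof = hmac_sha256(shared_key, n_server + n_patient + b"patient")

# Server verifies the patient
assert hmac.compare_digest(
    patient_proof, hmac_sha256(shared_key, n_server + n_patient + b"patient"))

# Both sides derive the same session key from the two fresh nonces,
# so a replayed transcript yields a different, useless key.
session_key = hmac_sha256(shared_key, b"session" + n_patient + n_server)
print("mutual authentication succeeded, session key established")
```

Because both proofs bind both nonces, neither side can be replayed against the other; this is the property tools such as AVISPA check mechanically.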

  14. TH-C-18A-08: A Management Tool for CT Dose Monitoring, Analysis, and Protocol Review

    International Nuclear Information System (INIS)

    Wang, J; Chan, F; Newman, B; Larson, D; Leung, A; Fleischmann, D; Molvin, L; Marsh, D; Zorich, C; Phillips, L

    2014-01-01

    Purpose: To develop a customizable tool for enterprise-wide management of CT protocols and analysis of radiation dose information of CT exams for a variety of quality control applications. Methods: All clinical CT protocols implemented on the 11 CT scanners at our institution were extracted in digital format. The original protocols had been preset by our CT management team. A commercial CT dose tracking software (DoseWatch, GE Healthcare, WI) was used to collect exam information (exam date, patient age, etc.), scanning parameters, and radiation doses for all CT exams. We developed a Matlab-based program (MathWorks, MA) with a graphical user interface that allows users to analyze the scanning protocols together with the actual dose estimates, and to compare the data to national (ACR, AAPM) and internal reference values for CT quality control. Results: The CT protocol review portion of our tool allows the user to look up the scanning and image reconstruction parameters of any protocol on any of the installed CT systems, among about 120 protocols per scanner. In the dose analysis tool, dose information of all CT exams (from 05/2013 to 02/2014) was stratified at the protocol level, and within a protocol down to the series level, i.e. each individual exposure event. This allows numerical and graphical review of dose information for any combination of scanner models, protocols and series. The key functions of the tool include: statistics of CTDI, DLP and SSDE; dose monitoring using user-set CTDI/DLP/SSDE thresholds; look-up of any CT exam's dose data; and CT protocol review. Conclusion: Our in-house CT management tool provides radiologists, technologists and administration first-hand, near real-time, enterprise-wide knowledge of CT dose levels for different exam types. Medical physicists use this tool to manage CT protocols, and to compare and optimize dose levels across different scanner models. It provides technologists feedback on CT scanning operation, and knowledge of important dose baselines and thresholds.
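The core dose-monitoring idea described above — stratify exam records by protocol and flag exposures above user-set thresholds — can be sketched briefly. The field names and threshold values below are illustrative assumptions, not the actual DoseWatch export format or the tool's internals:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical exam records as exported from a dose-tracking system.
exams = [
    {"protocol": "Abdomen Routine", "ctdi_vol": 10.2, "dlp": 480.0},
    {"protocol": "Abdomen Routine", "ctdi_vol": 25.9, "dlp": 1150.0},
    {"protocol": "Head Routine",    "ctdi_vol": 55.0, "dlp": 890.0},
]

# User-set, per-protocol CTDIvol thresholds (mGy) -- illustrative values.
thresholds = {"Abdomen Routine": {"ctdi_vol": 20.0},
              "Head Routine":    {"ctdi_vol": 75.0}}

# Stratify exams at the protocol level.
by_protocol = defaultdict(list)
for exam in exams:
    by_protocol[exam["protocol"]].append(exam)

# Summary statistics and threshold flags per protocol.
flags = {}
for protocol, records in by_protocol.items():
    avg_ctdi = mean(r["ctdi_vol"] for r in records)
    limit = thresholds[protocol]["ctdi_vol"]
    flags[protocol] = [r for r in records if r["ctdi_vol"] > limit]
    print(f"{protocol}: mean CTDIvol {avg_ctdi:.1f} mGy, "
          f"{len(flags[protocol])} exam(s) above {limit} mGy")
```

The same stratification extends naturally to DLP and SSDE, and down to the series level by adding a series key to each record.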

  15. Securing statically-verified communications protocols against timing attacks

    DEFF Research Database (Denmark)

    Buchholtz, Mikael; Gilmore, Stephen; Hillston, Jane

    2004-01-01

    We present a federated analysis of communication protocols which considers both security properties and timing. These are not entirely independent observations of a protocol; by using timing observations of an executing protocol it is possible to deduce derived information about the nature of the communication even in the presence of unbreakable encryption. Our analysis is based on expressing the protocol as a process algebra model and deriving from this process models analysable by the Imperial PEPA Compiler and the LySatool.

  16. Modelling and analysis of distributed simulation protocols with distributed graph transformation

    OpenAIRE

    Lara, Juan de; Taentzer, Gabriele

    2005-01-01

    J. de Lara, and G. Taentzer, "Modelling and analysis of distributed simulation protocols with distributed graph transformation...

  17. Field protocols for the genomic era

    Directory of Open Access Journals (Sweden)

    N Bulatova

    2009-08-01

    For many decades the karyotype was the only source of overall genomic information obtained from mammalian species. However, approaches have been developed in recent years to obtain molecular and ultimately genomic information directly from the extracted DNA of an organism, and molecular data have accumulated hugely for mammalian taxa. The growing volume of studies should motivate field researchers to collect samples suitable for molecular analysis from various species across their entire ranges. This is why we here include a molecular sampling procedure within a field work protocol that also covers more traditional (including cytogenetic) techniques. In this way we hope to foster the development of molecular and genomic studies in non-standard wild mammals.

  18. Improvement In MAODV Protocol Using Location Based Routing Protocol

    Directory of Open Access Journals (Sweden)

    Kaur Sharnjeet

    2016-01-01

    Energy saving is difficult in wireless sensor networks (WSN) due to limited resources. Each node in a WSN is constrained by its limited battery power, which is depleted over time by packet transmission and reception. Energy management techniques are therefore necessary to minimize the total power consumption of all nodes in the network and thereby maximize its life span. Our proposed protocol, Location Based Routing (LBR), aims to find a path that uses the minimum energy to transmit packets between the source and the destination. The energy required for transmission and reception of data is evaluated in MATLAB. LBR is implemented on top of the Multicast Ad hoc On Demand Distance Vector Routing Protocol (MAODV) to manage the energy consumed in transmitting and receiving data. Simulation results show that LBR reduces energy consumption.
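The abstract does not give LBR's route-selection algorithm; one common way to realize "a path which utilizes the minimum energy" is a shortest-path search with per-link energy cost as the edge weight, sketched below (the graph topology and energy values are invented for illustration):

```python
import heapq

def min_energy_path(graph, source, dest):
    """Dijkstra's algorithm with per-link energy (in mJ) as the edge weight."""
    dist = {source: 0.0}
    prev = {}
    heap = [(0.0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == dest:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        for neighbour, energy in graph[node]:
            nd = d + energy
            if nd < dist.get(neighbour, float("inf")):
                dist[neighbour] = nd
                prev[neighbour] = node
                heapq.heappush(heap, (nd, neighbour))
    # Reconstruct the route by walking the predecessor links backwards.
    path, node = [], dest
    while node != source:
        path.append(node)
        node = prev[node]
    path.append(source)
    return list(reversed(path)), dist[dest]

graph = {  # node -> [(neighbour, transmission+reception energy in mJ), ...]
    "S": [("A", 2.0), ("B", 5.0)],
    "A": [("B", 1.0), ("D", 6.0)],
    "B": [("D", 2.0)],
    "D": [],
}
path, energy = min_energy_path(graph, "S", "D")
print(path, energy)   # ['S', 'A', 'B', 'D'] 5.0
```

In a location-based protocol the edge weights would be derived from inter-node distance (radio energy grows with distance), but the path-selection step has this shape.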

  19. Perancangan dan Analisis Redistribution Routing Protocol OSPF dan EIGRP

    Directory of Open Access Journals (Sweden)

    DWI ARYANTA

    2014-07-01

    OSPF (Open Shortest Path First) and EIGRP (Enhanced Interior Gateway Routing Protocol) are two routing protocols widely used in computer networks. Differences between the characteristics of routing protocols pose a problem for the delivery of data packets; redistribution techniques are the solution for communication between routing protocols. Using Cisco Packet Tracer 5.3, this study simulated OSPF and EIGRP networks linked by redistribution, and compared their quality with single-protocol OSPF and EIGRP networks. The testing parameters in this study are delay and trace route. Trace route values based on direct calculation of cost and metric were compared with the simulation results. The results show that OSPF-EIGRP redistribution works: the delay under redistribution was 1% better than OSPF and 2-3% better than EIGRP, depending on traffic density. For the redistribution trace route, two calculations are done: the cost for the OSPF area and the metric for the EIGRP area. Primary and alternate paths are chosen based on packet delivery rate and the smallest cost and metric, which is confirmed by both calculation and simulation. Keywords: OSPF, EIGRP, Redistribution, Delay, Cost, Metric.
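The cost and metric calculations referred to above follow the two protocols' standard default formulas: OSPF interface cost is reference bandwidth divided by interface bandwidth (default reference 100 Mbps), and the classic EIGRP composite metric with default K values (K1=K3=1, K2=K4=K5=0) combines minimum path bandwidth and cumulative delay. A sketch with illustrative link values:

```python
def ospf_cost(bandwidth_bps, reference_bps=100_000_000):
    """OSPF interface cost = reference bandwidth / interface bandwidth (min 1)."""
    return max(1, reference_bps // bandwidth_bps)

def eigrp_metric(min_bandwidth_kbps, total_delay_usec):
    """Classic EIGRP composite metric with default K values (K1=K3=1)."""
    bw_term = 10_000_000 // min_bandwidth_kbps
    delay_term = total_delay_usec // 10   # delay is counted in tens of usec
    return 256 * (bw_term + delay_term)

# A path of two FastEthernet links (100 Mbps each, 100 usec delay per link):
path_cost = ospf_cost(100_000_000) + ospf_cost(100_000_000)
print("OSPF path cost:", path_cost)                  # 2

# The same path seen by EIGRP: min bandwidth 100,000 kbps, total delay 200 usec
print("EIGRP metric:", eigrp_metric(100_000, 200))   # 256 * (100 + 20) = 30720
```

OSPF costs are summed per interface along the path, while EIGRP takes the minimum bandwidth over the whole path, which is why the redistribution boundary needs a separate calculation for each area.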

  20. Performance analysis of simultaneous dense coding protocol under decoherence

    Science.gov (United States)

    Huang, Zhiming; Zhang, Cai; Situ, Haozhen

    2017-09-01

    The simultaneous dense coding (SDC) protocol is useful in designing quantum protocols. We analyze the performance of the SDC protocol under the influence of noisy quantum channels. Six kinds of paradigmatic Markovian noise along with one kind of non-Markovian noise are considered. The joint success probability of both receivers and the success probabilities of one receiver are calculated for three different locking operators. Some interesting properties have been found, such as invariance and symmetry. Among the three locking operators we consider, the SWAP gate is most resistant to noise and results in the same success probabilities for both receivers.

  1. A critical analysis of a locally agreed protocol for clinical practice

    International Nuclear Information System (INIS)

    Owen, A.; Hogg, P.; Nightingale, J.

    2004-01-01

    Within the traditional scope of radiographic practice (including advanced practice) there is a need to demonstrate effective patient care and management. Such practice should be set within a context of appropriate evidence and should also reflect peer practice. In order to achieve such practice the use of protocols is encouraged. Effective protocols can maximise care and management by minimising inter- and intra-professional variation; they can also allow for detailed procedural records to be kept in case of legal claims. However, whilst literature exists to encourage the use of protocols, there is little published material available to indicate how to create, manage and archive them. This article uses an analytical approach to propose a suitable method for protocol creation and archival; it also offers suggestions on the scope and content of a protocol. To achieve this, an existing clinical protocol for radiographer reporting of barium enemas is analysed to draw out the general issues. Proposals for protocol creation, management, and archival were identified. The clinical practice described or inferred in the protocol should be drawn from evidence, such as peer-reviewed material, national standards and peer practice. The protocol should include an explanation of how to proceed when the radiographers reach the limit of their ability. It should refer to the initial training required to undertake the clinical duties as well as the ongoing continual professional updating required to maintain competence. Audit of practice should be indicated, including the preferred audit methodology, and associated with this should be a clear statement about standards and what to do if standards are not adequately met. Protocols should be archived, in a paper-based form, for lengthy periods in case of legal claims, and the archived protocol should record the dates it was in clinical use.

  2. RFID protocol design, optimization, and security for the Internet of Things

    CERN Document Server

    Liu, Alex X; Liu, Xiulong; Li, Keqiu

    2017-01-01

    This book covers the topic of RFID protocol design and optimization and the authors aim to demystify complicated RFID protocols and explain in depth the principles, techniques, and practices in designing and optimizing them.

  3. Organ donation in the ICU: A document analysis of institutional policies, protocols, and order sets.

    Science.gov (United States)

    Oczkowski, Simon J W; Centofanti, John E; Durepos, Pamela; Arseneau, Erika; Kelecevic, Julija; Cook, Deborah J; Meade, Maureen O

    2018-04-01

    To better understand how local policies influence organ donation rates, we conducted a document analysis of our ICU organ donation policies, protocols and order sets. We used a systematic search of our institution's policy library to identify documents related to organ donation, Mindnode software to create a publication timeline, basic statistics to describe document characteristics, and qualitative content analysis to extract document themes. Documents were retrieved from Hamilton Health Sciences, an academic hospital system with a high volume of organ donation, from database inception to October 2015. We retrieved 12 active organ donation documents, including six protocols, two policies, two order sets, and two unclassified documents, the majority (75%) postdating the introduction of donation after circulatory death in 2006. Four major themes emerged: the organ donation process, quality of care, patient- and family-centred care, and the role of the institution. These themes indicate areas where documented institutional standards may be beneficial. Further research is necessary to determine the relationship of local policies, protocols, and order sets to actual organ donation practices, and to identify barriers and facilitators to improving donation rates. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. The UMTS-AKA Protocols for Intelligent Transportation Systems

    Directory of Open Access Journals (Sweden)

    Hwang Min-Shiang

    2009-01-01

    The integration of communication protocols into transport systems is an active research area today, and much seminal work has been reported on intelligent transportation systems (ITS) in recent years. Many advanced techniques have been applied to improve online communication and to promote the security, comfort, and efficiency of ITS. Of primary importance to the effective application of ITS is the communication protocol used. A notable development is that yesterday's Global System for Mobile Communication (GSM) protocol is being replaced by the Universal Mobile Telecommunication System (UMTS) protocol, the third-generation mobile technology. This article attempts to identify a suitable communication system for ITS applications. It is impracticable to substantially modify the original UMTS-IMS-AKA protocol currently in use, because doing so would disturb the operation of the current system, and thus we explore other possibilities through this research. We investigate a novel protocol that makes the original UMTS-IMS-AKA protocol compliant with ITS while remaining adaptable to the current UMTS protocol.

  5. Bone Healing Around Dental Implants: Simplified vs Conventional Drilling Protocols at Speed of 400 rpm.

    Science.gov (United States)

    Gil, Luiz Fernando; Sarendranath, Alvin; Neiva, Rodrigo; Marão, Heloisa F; Tovar, Nick; Bonfante, Estevam A; Janal, Malvin N; Castellano, Arthur; Coelho, Paulo G

    This study evaluated whether simplified drilling protocols would provide comparable histologic and histomorphometric results to conventional drilling protocols at a low rotational speed. A total of 48 alumina-blasted and acid-etched Ti-6Al-4V implants with two diameters (3.75 and 4.2 mm, n = 24 per group) were bilaterally placed in the tibiae of 12 dogs, under a low-speed protocol (400 rpm). Within the same diameter group, half of the implants were inserted after a simplified drilling procedure (pilot drill + final diameter drill), and the other half were placed using the conventional drilling procedure. After 3 and 5 weeks, the animals were euthanized, and the retrieved bone-implant samples were subjected to nondecalcified histologic sectioning. Histomorphology, bone-to-implant contact (BIC), and bone area fraction occupancy (BAFO) analysis were performed. Histology showed that new bone was formed around implants, and inflammation or bone resorption was not evident for both groups. Histomorphometrically, when all independent variables were collapsed over drilling technique, no differences were detected for BIC and BAFO; when drilling technique was analyzed as a function of time, the conventional groups reached statistically higher BIC and BAFO at 3 weeks, but comparable values between techniques were observed at 5 weeks; 4.2-mm implants obtained statistically higher BAFO relative to 3.75-mm implants. Based on the present methodology, the conventional technique improved bone formation at 3 weeks, and narrower implants were associated with less bone formation.

  6. Bidirectional Quantum Secure Direct Communication Network Protocol with Hyperentanglement

    International Nuclear Information System (INIS)

    Gu Bin; Chen Yulin; Huang Yugai; Fang Xia

    2011-01-01

    We propose a bidirectional quantum secure direct communication (QSDC) network protocol with hyperentanglement in both the spatial-mode and the polarization degrees of freedom of photon pairs, which can in principle be produced with a beta barium borate crystal. The secret message can be encoded on the photon pairs with unitary operations in these two degrees of freedom independently. Compared with other QSDC network protocols, ours has a higher capacity, as each photon pair can carry 4 bits of information. We also discuss the security of our QSDC network protocol and its feasibility with current techniques. (general)

  7. Design and analysis of communication protocols for quantum repeater networks

    International Nuclear Information System (INIS)

    Jones, Cody; Kim, Danny; Rakher, Matthew T; Ladd, Thaddeus D; Kwiat, Paul G

    2016-01-01

    We analyze how the performance of a quantum-repeater network depends on the protocol employed to distribute entanglement, and we find that the choice of repeater-to-repeater link protocol has a profound impact on entanglement-distribution rate as a function of hardware parameters. We develop numerical simulations of quantum networks using different protocols, where the repeater hardware is modeled in terms of key performance parameters, such as photon generation rate and collection efficiency. These parameters are motivated by recent experimental demonstrations in quantum dots, trapped ions, and nitrogen-vacancy centers in diamond. We find that a quantum-dot repeater with the newest protocol (‘MidpointSource’) delivers the highest entanglement-distribution rate for typical cases where there is low probability of establishing entanglement per transmission, and in some cases the rate is orders of magnitude higher than other schemes. Our simulation tools can be used to evaluate communication protocols as part of designing a large-scale quantum network. (paper)

  8. Performance Analysis of Secure and Private Billing Protocols for Smart Metering

    Directory of Open Access Journals (Sweden)

    Tom Eccles

    2017-11-01

    Traditional utility metering is to be replaced by smart metering, which enables fine-grained utility consumption measurements. These fine-grained measurements raise privacy concerns because of the lifestyle information that can be inferred from the precise times at which utilities were consumed. This paper outlines and compares two privacy-respecting time-of-use billing protocols for smart metering and investigates their performance on a variety of hardware. These protocols protect the privacy of customers by never transmitting the fine-grained utility readings outside of the customer's home network. One protocol places the complexity on the trusted smart meter hardware, while the other uses homomorphic commitments to offload computation to a third device. Both protocols are designed to operate on top of the existing cryptographic secure channel protocols in place on smart meters. Proof-of-concept software implementations of these protocols have been written, and their suitability for real-world application to low-performance smart meter hardware is discussed. These protocols may also have application to other privacy-conscious aggregation systems, such as electronic voting.
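The second protocol's reliance on homomorphic commitments can be illustrated with a toy Pedersen commitment over a small prime-order group. The property that matters for billing is that the product of commitments to individual readings is a commitment to their sum, so a bill can be verified without revealing per-interval consumption. The tiny parameters below are for illustration only; real deployments use cryptographic group sizes, and `h` must be generated so its discrete log with respect to `g` is unknown:

```python
# Toy Pedersen commitment: commit(m, r) = g^m * h^r mod p,
# over the order-q subgroup of Z_p* with p = 2q + 1.
q = 1019                  # prime order of the subgroup
p = 2 * q + 1             # 2039, also prime
g, h = 4, 9               # squares mod p, hence generators of the order-q subgroup

def commit(message, randomness):
    return (pow(g, message, p) * pow(h, randomness, p)) % p

m1, r1 = 13, 201          # e.g. two half-hourly meter readings (illustrative)
m2, r2 = 29, 555

c1, c2 = commit(m1, r1), commit(m2, r2)

# Homomorphic property: combining commitments commits to the summed reading,
# without the verifier ever seeing the individual readings.
combined = (c1 * c2) % p
assert combined == commit((m1 + m2) % q, (r1 + r2) % q)
print("commitment to the sum verified without revealing individual readings")
```

This is exactly the shape of computation that can be offloaded to an untrusted third device: it multiplies commitments but learns nothing about the readings inside them.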

  9. Gold analysis by the gamma absorption technique

    International Nuclear Information System (INIS)

    Kurtoglu, Arzu; Tugrul, A.B.

    2003-01-01

    Gold (Au) analyses are generally performed using destructive techniques. In this study, the Gamma Absorption Technique has been employed for gold analysis. A series of different gold alloys of known gold content were analysed and a calibration curve was obtained. This curve was then used for the analysis of unknown samples. Gold analyses can be made non-destructively, easily and quickly by the gamma absorption technique. The mass attenuation coefficients of the alloys were measured around the K-shell absorption edge of Au. Theoretical mass attenuation coefficient values were obtained using the WinXCom program and comparison of the experimental results with the theoretical values showed generally good and acceptable agreement
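The measurement principle behind the technique is the Beer-Lambert attenuation law, I = I0 · exp(-(μ/ρ) · ρ · x), where μ/ρ is the mass attenuation coefficient. A small sketch of both directions of the calculation; the coefficient and foil thickness below are assumed illustrative values, not the study's measurements:

```python
import math

def transmitted_fraction(mass_att_cm2_per_g, density_g_per_cm3, thickness_cm):
    """Fraction I/I0 of gamma intensity transmitted through an absorber."""
    return math.exp(-mass_att_cm2_per_g * density_g_per_cm3 * thickness_cm)

def mass_attenuation(i_over_i0, density_g_per_cm3, thickness_cm):
    """Invert the measurement: recover mu/rho from a transmission reading."""
    return -math.log(i_over_i0) / (density_g_per_cm3 * thickness_cm)

# Illustrative example: assumed mu/rho = 4.5 cm^2/g near the Au K-edge,
# pure gold (density 19.3 g/cm^3), 1 mm foil.
frac = transmitted_fraction(4.5, 19.3, 0.1)
recovered = mass_attenuation(frac, 19.3, 0.1)
print(f"transmitted fraction: {frac:.2e}, recovered mu/rho: {recovered:.2f}")
```

A calibration curve like the one described in the record is built by measuring `frac` for alloys of known gold content and then reading unknown samples off that curve.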

  10. An optimized DNA extraction protocol for benthic Didymosphenia geminata.

    Science.gov (United States)

    Uyua, Noelia Mariel; Manrique, Julieta Marina; Jones, Leandro Roberto

    2014-09-01

    Didymosphenia geminata mats display few cells relative to extracellular material and contain polysaccharides and heavy metals that interfere with molecular studies. We describe an optimized DNA extraction protocol that helps to overcome these difficulties and outperformed five previously described DNA extraction techniques. Copyright © 2014 Elsevier B.V. All rights reserved.

  11. Analysis of agreement between cardiac risk stratification protocols applied to participants of a center for cardiac rehabilitation

    Directory of Open Access Journals (Sweden)

    Ana A. S. Santos

    2016-01-01

    Background: Cardiac risk stratification is related to the risk of events induced by exercise. Although several protocols exist to calculate risk stratification, studies demonstrating agreement between these protocols are still lacking. Objective: To evaluate the agreement between existing protocols for cardiac risk rating in cardiac patients. Method: The records of 50 patients from a cardiac rehabilitation program were analyzed, from which the following information was extracted: age, sex, weight, height, clinical diagnosis, medical history, risk factors, associated diseases, and the results of the most recent laboratory and complementary tests performed. This information was used for risk stratification of the patients under the protocols of the American College of Sports Medicine, the Brazilian Society of Cardiology, the American Heart Association, the protocol designed by Frederic J. Pashkow, the American Association of Cardiovascular and Pulmonary Rehabilitation, the Société Française de Cardiologie, and the Sociedad Española de Cardiología. Descriptive statistics were used to characterize the sample, and agreement between the protocols was calculated using the Kappa coefficient, with a significance level of 5%. Results: Of the 21 agreement analyses, 12 were significant between the protocols used for risk classification, with nine classified as moderate and three as low; no agreement was classified as excellent. Different proportions were observed in each risk category, with significant differences between the protocols for all risk categories. Conclusion: The agreements between the protocols were low to moderate, and the risk proportions differed between protocols.
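The agreement statistic used above can be computed directly from paired classifications. A minimal Cohen's kappa for two protocols rating the same patients, with invented ratings purely for illustration:

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa = (p_observed - p_expected) / (1 - p_expected)."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    categories = set(ratings_a) | set(ratings_b)
    # Observed agreement: fraction of patients rated identically.
    p_observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Expected chance agreement from each rater's marginal frequencies.
    count_a, count_b = Counter(ratings_a), Counter(ratings_b)
    p_expected = sum(count_a[c] * count_b[c] for c in categories) / n ** 2
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical risk classes assigned to 8 patients by two protocols:
protocol_1 = ["low", "low", "moderate", "high", "moderate", "low", "high", "low"]
protocol_2 = ["low", "moderate", "moderate", "high", "low", "low", "high", "low"]
print(f"kappa = {cohens_kappa(protocol_1, protocol_2):.3f}")   # kappa = 0.600
```

By the conventional interpretation bands, 0.600 would fall in the "moderate" range that the study reports for most of its significant comparisons.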

  12. Design and Research of a New secure Authentication Protocol in GSM networks

    Directory of Open Access Journals (Sweden)

    Qi Ai-qin

    2016-01-01

    As the first line of defense in a security application system, authentication is an important security service. Its typical scheme is the challenge/response mechanism, which is simple in structure, easy to realize, and used worldwide. However, in GSM networks these protocols suffer from several problems, such as leakage of user identity privacy, lack of security protection between home registers and foreign registers, and information theft by malicious intruders. This paper presents an authentication protocol for GSM networks based on mathematical operations and the modular square root technique, together with an analysis of its security and performance. The results show that it is more robust and secure than previous protocols.
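The modular square root technique mentioned above underlies Rabin-style schemes: squaring modulo a number is easy, while extracting roots is the hard (or key-dependent) direction. For a prime p with p ≡ 3 (mod 4), a square root of a quadratic residue c is simply c^((p+1)/4) mod p. A toy sketch with a small prime (real schemes work modulo a composite n = pq whose factorization is the secret; the values here are illustrative):

```python
def sqrt_mod(c, p):
    """Square root of quadratic residue c modulo prime p, for p ≡ 3 (mod 4)."""
    assert p % 4 == 3
    r = pow(c, (p + 1) // 4, p)
    assert (r * r) % p == c % p, "c is not a quadratic residue mod p"
    return r

p = 2039                        # a small prime with p % 4 == 3 (toy size)
secret = 1234
challenge = pow(secret, 2, p)   # squaring is cheap for anyone...
root = sqrt_mod(challenge, p)   # ...but root extraction needs the identity above
print("root squares back to the challenge:", pow(root, 2, p) == challenge)
```

Modulo an odd prime, a residue has exactly two roots, `root` and `p - root`; an authentication scheme accepts either, since both square back to the challenge.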

  13. A simple protocol for NMR analysis of the enantiomeric purity of chiral hydroxylamines.

    Science.gov (United States)

    Tickell, David A; Mahon, Mary F; Bull, Steven D; James, Tony D

    2013-02-15

    A practically simple three-component chiral derivatization protocol for determining the enantiopurity of chiral hydroxylamines by (1)H NMR spectroscopic analysis is described, involving their treatment with 2-formylphenylboronic acid and enantiopure BINOL to afford a mixture of diastereomeric nitrono-boronate esters whose ratio is an accurate reflection of the enantiopurity of the parent hydroxylamine.

  14. INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Caescu Stefan Claudiu

    2011-12-01

    Theme: The situation analysis, as a separate component of strategic planning, involves collecting and analysing relevant types of information on the components of the marketing environment and their evolution on the one hand, and on the organization's resources and capabilities on the other. Objectives of the Research: The main purpose of studying the analysis techniques of the internal environment is to provide insight into those aspects that are of strategic importance to the organization. Literature Review: The marketing environment consists of two distinct components: the internal environment, made up of specific variables within the organization, and the external environment, made up of variables external to the organization. Although analysing the external environment is essential for corporate success, it is not enough unless it is backed by a detailed analysis of the internal environment of the organization. The internal environment includes all elements that are endogenous to the organization, which are influenced to a great extent and totally controlled by it. The study of the internal environment must answer all resource-related questions, solve all resource management issues, and represents the first step in drawing up the marketing strategy. Research Methodology: The present paper accomplished a documentary study of the main techniques used for the analysis of the internal environment. Results: The literature emphasizes that differences in performance from one organization to another depend primarily not on differences between fields of activity, but especially on differences between resources and capabilities and the ways these are capitalized on. The main methods of analysing the internal environment addressed in this paper are: the analysis of organizational resources, the performance analysis, the value chain analysis and the functional analysis. Implications: Basically such

  15. Comparative Analysis of Different Protocols to Manage Large Scale Networks

    OpenAIRE

    Anil Rao Pimplapure; Dr Jayant Dubey; Prashant Sen

    2013-01-01

    In recent years, the number, complexity and size of large-scale networks have increased. The best example of a large-scale network is the Internet, and more recently, data centers in cloud environments. In this context, management tasks such as traffic monitoring, security and performance optimization place a heavy burden on the network administrator. This research studies the different protocols, i.e. conventional protocols like the Simple Network Management Protocol and newer Gossip bas...

  16. Performance analysis of signaling protocols on OBS switches

    Science.gov (United States)

    Kirci, Pinar; Zaim, A. Halim

    2005-10-01

    In this paper, the Just-In-Time (JIT), Just-Enough-Time (JET) and Horizon signalling schemes for Optical Burst Switched (OBS) networks are presented. These signalling schemes run over a core dWDM network, and a network architecture based on optical burst switches is proposed to support IP, ATM and burst traffic. In IP and ATM traffic, several packets are assembled into a single packet called a burst, and burst contention is handled by burst dropping. The burst length distribution is arbitrary between 0 and 1 for IP traffic, fixed at 0.5 for ATM traffic, and arbitrary between 1 and 5 for burst traffic. The Setup and Setup ack length distributions are arbitrary. We apply the Poisson model with rate λ and the self-similar model with Pareto distribution shape α to generate inter-arrival times in these protocols. We consider a communication between a source client node and a destination client node over an ingress node and one or more intermediate switches, with buffering only in the ingress node. The communication is based on single-burst connections, in which the connection is set up just before sending a burst and closed as soon as the burst is sent. Our analysis accounts for several important parameters, including the burst setup, burst setup ack, keepalive messages and the optical switching protocol. We compare the performance of the three signalling schemes in terms of burst dropping probability under a range of network scenarios.
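The two traffic models named above differ in how burst inter-arrival times are drawn: Poisson arrivals have exponentially distributed gaps with mean 1/λ, while self-similar traffic is commonly modelled with heavy-tailed Pareto gaps. A small sketch contrasting the two (the parameter values are illustrative, not the paper's settings):

```python
import random
from statistics import mean

random.seed(7)   # fixed seed so the sketch is reproducible

lam = 2.0        # Poisson arrival rate (bursts per time unit)
alpha = 1.5      # Pareto shape; alpha <= 2 gives infinite variance (heavy tail)

# Exponential gaps -> Poisson arrival process.
poisson_gaps = [random.expovariate(lam) for _ in range(10_000)]

# Pareto gaps (scale 1) -> bursty, self-similar-style arrivals.
pareto_gaps = [random.paretovariate(alpha) for _ in range(10_000)]

print(f"mean exponential gap: {mean(poisson_gaps):.3f}  (theory: {1/lam:.3f})")
print(f"mean Pareto gap:      {mean(pareto_gaps):.3f}  "
      f"(theory: {alpha/(alpha-1):.3f})")
```

The practical difference for OBS switches is that Pareto gaps occasionally cluster many bursts together, which drives up dropping probability even when the long-run rates of the two models match.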

  17. An expert protocol for immunofluorescent detection of calcium channels in tsA-201 cells.

    Science.gov (United States)

    Koch, Peter; Herzig, Stefan; Matthes, Jan

    Pore-forming subunits of voltage-gated calcium channels (VGCC) are large membrane proteins (260 kDa) containing 24 transmembrane domains. Despite transfection with viral promoter-driven vectors, biochemical analysis of VGCC is often hampered by rather low expression levels in heterologous systems, rendering VGCC challenging targets. Calcium channels are especially demanding proteins in immunofluorescent detection. We provide an expert step-by-step protocol with adapted handling conditions (tsA-201 cell culture, transient transfection, incubation time and temperature at 28°C or 37°C, and immunostaining) to address the L-type calcium channel pore Cav1.2 in an immunofluorescent approach. We performed immunocytochemical analysis of Cav1.2 expression at the single-cell level in combination with detection of different markers for cellular organelles. We show confluency levels and shapes of tsA-201 cells at different time points during an experiment. Our experiments reveal sufficient levels of Cav1.2 protein and a correct Cav1.2 expression pattern in polygonal cells as early as 12 h after transfection. A sequence of elaborated protocol modifications allows subcellular localization analysis of Cav1.2 in an immunocytochemical approach. We provide a protocol that may be used to gain insight into physiological and pathophysiological processes involving voltage-gated calcium channels. Our protocol may be used for expression analysis of other challenging proteins, and efficient overexpression may be exploited in related biochemical techniques requiring immunolabels. Copyright © 2016 Elsevier Inc. All rights reserved.

  18. A novel porcine cell culture based protocol for the propagation of hepatitis E virus

    Directory of Open Access Journals (Sweden)

    Walter Chingwaru

    2016-08-01

    Full Text Available Objective: To present a comprehensive protocol for the processing of hepatitis E virus (HEV)-infected samples and propagation of the virus in primary cell cultures. Methods: HEV was extracted from porcine liver and faecal samples following standard protocols. The virus was then allowed to attach, in the presence of trypsin, to primary cells that included porcine and bovine intestinal epithelial cells and macrophages over a period of up to 3 h. The virus was propagated by rotational passaging through the cell cultures. Propagation was confirmed by immunoblotting. Results: We developed a comprehensive protocol to propagate HEV in a porcine cell model that includes (i) rotational culturing of the virus between porcine cell types, (ii) pre-incubation of infected cells for 210 min, (iii) use of a semi-complete cell culture medium supplemented with trypsin (0.33 µg/mL), and (iv) the use of a simple immunoblot technique to detect the amplified virus based on the open reading frame 2/3. Conclusions: This protocol opens doors towards systematic analysis of the mechanisms that underlie the pathogenesis of HEV in vitro. Using our protocol, one can complete the propagation process within 6 to 9 d.

  19. SU-E-I-68: Practical Considerations On Implementation of the Image Gently Pediatric CT Protocols

    International Nuclear Information System (INIS)

    Zhang, J; Adams, C; Lumby, C; Dillon, J; Woods, E; Richer, E

    2014-01-01

    Purpose: One limitation associated with the Image Gently pediatric CT protocols is practical implementation of the recommended manual techniques. Inconsistency resulting from differing practice among technologists is a possibility. An additional concern is the added risk of data-entry error that would result in over- or underexposure. Automatic Exposure Control (AEC) features automatically reduce radiation for children; however, they do not work efficiently for very small or relatively large patients. This study aims to implement the Image Gently pediatric CT protocols in a practical setting while maintaining the use of AEC features for pediatric patients of varying size. Methods: Anthropomorphic abdomen phantoms were scanned in a CT scanner using the Image Gently pediatric protocols, the AEC technique with a fixed adult baseline, and automatic protocols with various baselines. The baselines were adjusted according to patient age, weight, and posterior-anterior thickness to match the Image Gently pediatric CT manual techniques. CTDIvol was recorded for each examination. Image noise was measured and recorded for image quality comparison. Clinical images were evaluated by pediatric radiologists. Results: By adjusting the vendor default baselines used in the automatic techniques, radiation dose and image quality can match those of the Image Gently manual techniques. In practice, this can be achieved by dividing pediatric patients into three major groups for technologist reference: infant, small child, and large child. Further division can be done but will increase the number of CT protocols. For each group, AEC can efficiently adjust acquisition techniques for children. This implementation significantly overcomes the limitation of the Image Gently manual techniques. Conclusion: Considering the effectiveness in clinical practice, Image Gently pediatric CT protocols can be implemented in accordance with AEC techniques, with adjusted baselines, to
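
    As a toy illustration of the three-group idea, a technologist-facing lookup might map patient weight to a group and a scaled AEC baseline. The weight cut-offs and scale factors below are invented for illustration; real values are vendor-, scanner- and protocol-specific.

```python
# Hypothetical weight cut-offs (kg) and AEC baseline scale factors.
GROUPS = [
    (10.0, "infant", 0.4),
    (30.0, "small child", 0.6),
    (float("inf"), "large child", 0.8),
]

def aec_baseline(weight_kg, adult_baseline_mas=200.0):
    """Pick a pediatric group and scale an assumed adult AEC baseline for it."""
    for max_weight, group, factor in GROUPS:
        if weight_kg <= max_weight:
            return group, adult_baseline_mas * factor
    raise ValueError("unreachable: last group is unbounded")

print(aec_baseline(8.0))   # → ('infant', 80.0)
print(aec_baseline(45.0))  # → ('large child', 160.0)
```

    Keeping the table to three rows mirrors the paper's observation that finer divisions multiply the number of protocols technologists must manage.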

  20. Applications of neutron activation analysis technique

    International Nuclear Information System (INIS)

    Jonah, S. A.

    2000-07-01

    The technique was developed as far back as 1936 by G. Hevesy and H. Levy for the analysis of Dy using an isotopic source. Approximately 40 elements can be analyzed by the instrumental neutron activation analysis (INAA) technique with neutrons from a nuclear reactor. By applying radiochemical separation, the number of elements that can be analyzed may be increased to almost 70. Compared with other analytical methods used in environmental and industrial research, NAA has some unique features. These are multi-element capability, rapidity, reproducibility of results, complementarity to other methods, freedom from analytical blank, and independence of the chemical state of elements. There are several types of neutron sources, namely nuclear reactors, accelerator-based and radioisotope-based sources, but nuclear reactors with high fluxes of neutrons from the fission of 235U give the most intense irradiation, and hence the highest available sensitivities for NAA. In this paper, the applications of NAA of socio-economic importance are discussed. The benefits of using NAA and related nuclear techniques for on-line applications in industrial process control are highlighted. A brief description of the NAA set-ups at CERT is given. Finally, NAA is compared with other leading analytical techniques.
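
    The sensitivity of NAA rests on the standard activation equation A = N·σ·φ·(1 − e^(−λ·t_irr)), the activity induced after irradiating N target atoms of cross-section σ in a flux φ. A small sketch with invented numbers (hypothetical target, flux and half-life, not values from the paper):

```python
import math

def induced_activity(n_atoms, sigma_cm2, flux, half_life_s, t_irr_s):
    """Activity (Bq) induced at the end of irradiation:
    A = N * sigma * phi * (1 - exp(-lambda * t_irr))."""
    lam = math.log(2.0) / half_life_s  # decay constant from the half-life
    return n_atoms * sigma_cm2 * flux * (1.0 - math.exp(-lam * t_irr_s))

# Hypothetical case: 1e18 target atoms, 1-barn cross-section (1e-24 cm^2),
# 1e13 n/cm^2/s reactor flux, 1-hour half-life, 1-hour irradiation.
a = induced_activity(1e18, 1e-24, 1e13, 3600.0, 3600.0)
print(round(a))  # saturation factor (1 - 2**-1) = 0.5, so 5,000,000 Bq
```

    Irradiating for many half-lives drives the saturation factor toward 1, which is why irradiation time is traded off against sensitivity in practice.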

  1. Performance Evaluation of TDMA Medium Access Control Protocol in Cognitive Wireless Networks

    Directory of Open Access Journals (Sweden)

    Muhammed Enes Bayrakdar

    2017-02-01

    Full Text Available The cognitive radio paradigm has emerged as a new communication technology that shares channels in wireless networks. Channel assignment is a crucial issue in the field of cognitive wireless networks because of spectrum scarcity. In this work, we have evaluated the performance of the TDMA medium access control protocol. In our simulation scenarios, primary users and secondary users utilize TDMA as the medium access control protocol. We have designed a network environment in the Riverbed simulation software that consists of primary users, secondary users, and base stations. In our system model, secondary users sense the spectrum and inform the base station about empty channels. The base station then decides which secondary user may utilize each empty channel. The energy detection technique is employed for spectrum sensing because it requires no prior information about the primary user's signal. In addition, different numbers of users are selected in the simulation scenarios in order to obtain accurate delay and throughput results. Comparing the analytical model with simulation results, we show that the performance analysis of our system model is consistent and accurate.
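
    Energy detection, as used for spectrum sensing here, reduces to comparing the average signal energy in a window against a threshold. A minimal sketch with a synthetic idle channel (noise only) and a busy channel (an assumed sinusoidal primary-user signal plus noise); thresholds and signal shapes are illustrative only.

```python
import math
import random

def detect_energy(samples, threshold):
    """Energy detector: declare the channel busy when the average signal
    energy exceeds a threshold. Needs no knowledge of the PU waveform."""
    energy = sum(s * s for s in samples) / len(samples)
    return energy > threshold

rng = random.Random(7)
noise = [rng.gauss(0.0, 1.0) for _ in range(4000)]             # idle channel
busy = [n + 2.0 * math.sin(0.3 * i)                            # PU signal + noise
        for i, n in enumerate(noise)]

# Average noise energy is ~1.0; the sinusoid adds ~2.0, so 1.5 separates them.
print(detect_energy(noise, 1.5))  # → False (channel reported empty)
print(detect_energy(busy, 1.5))   # → True  (primary user detected)
```

    The threshold choice trades false alarms (lost transmission opportunities) against missed detections (interference with the primary user).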

  2. Elemental analysis techniques using proton microbeam

    International Nuclear Information System (INIS)

    Sakai, Takuro; Oikawa, Masakazu; Sato, Takahiro

    2005-01-01

    Proton microbeam is a powerful tool for two-dimensional elemental analysis. The analysis is based on the Particle Induced X-ray Emission (PIXE) and Particle Induced Gamma-ray Emission (PIGE) techniques. The paper outlines the principles and instruments, and describes the dental applications that have been carried out at JAERI Takasaki. (author)

  3. Dynamic Aggregation Protocol for Wireless Sensor Networks

    OpenAIRE

    Mounir Said , Adel; William Ibrahim , Ashraf; Soua , Ahmed; Afifi , Hossam

    2013-01-01

    International audience; Sensor networks suffer from limited capabilities such as bandwidth, low processing power, and memory size. There is therefore a need for protocols that deliver sensor data in an energy-efficient way to the sink. One such technique is data aggregation, which gathers sensors' data into a small packet suitable for transmission. In this paper, we propose a new Effective Data Aggregation Protocol (DAP) to reduce the energy consumption in Wireless Sensor Networks (WSNs), which prolongs the...
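
    The aggregation idea, packing many small readings into one radio frame to amortize per-packet overhead, can be sketched as follows. The header and per-entry byte costs are invented placeholders, not DAP's actual frame format.

```python
def aggregate(readings, max_payload=64):
    """Pack as many (sensor_id, value) readings as fit into one small
    packet, reducing per-packet radio overhead. Byte costs are assumed:
    2 bytes of header, 3 bytes per entry (1 id + 2 value)."""
    packet, size = [], 2
    for sensor_id, value in readings:
        if size + 3 > max_payload:
            break  # frame full; remaining readings wait for the next packet
        packet.append((sensor_id, round(value, 1)))
        size += 3
    return packet, size

readings = [(i, 20.0 + 0.1 * i) for i in range(30)]  # 30 temperature samples
packet, size = aggregate(readings)
print(len(packet), size)  # → 20 62
```

    Sending one 62-byte frame instead of twenty tiny frames is where the energy saving comes from: each transmission otherwise pays the radio's fixed startup and header cost.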

  4. Optimization of oligonucleotide arrays and RNA amplification protocols for analysis of transcript structure and alternative splicing.

    Science.gov (United States)

    Castle, John; Garrett-Engele, Phil; Armour, Christopher D; Duenwald, Sven J; Loerch, Patrick M; Meyer, Michael R; Schadt, Eric E; Stoughton, Roland; Parrish, Mark L; Shoemaker, Daniel D; Johnson, Jason M

    2003-01-01

    Microarrays offer a high-resolution means for monitoring pre-mRNA splicing on a genomic scale. We have developed a novel, unbiased amplification protocol that permits labeling of entire transcripts. Also, hybridization conditions, probe characteristics, and analysis algorithms were optimized for detection of exons, exon-intron edges, and exon junctions. These optimized protocols can be used to detect small variations and isoform mixtures, map the tissue specificity of known human alternative isoforms, and provide a robust, scalable platform for high-throughput discovery of alternative splicing.

  5. Development of evaluation method for software safety analysis techniques

    International Nuclear Information System (INIS)

    Huang, H.; Tu, W.; Shih, C.; Chen, C.; Yang, W.; Yih, S.; Kuo, C.; Chen, M.

    2006-01-01

    Full text: Following the massive adoption of digital Instrumentation and Control (I and C) systems for nuclear power plants (NPP), various Software Safety Analysis (SSA) techniques are used to evaluate NPP safety when adopting an appropriate digital I and C system, and thereby to reduce risk to an acceptable level. However, each technique has its specific advantages and disadvantages. If two or more techniques can be incorporated in a complementary way, the SSA combination becomes more acceptable. As a result, if proper evaluation criteria are available, the analyst can choose an appropriate combination of techniques to perform the analysis on the basis of available resources. This research evaluated the software safety analysis techniques in use today, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flowgraph Methodology (DFM), and simulation-based model analysis, and then determined indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. These indexes may help decision makers and software safety analysts choose the best SSA combination and arrange their own software safety plans. With this proposed method, analysts can evaluate various SSA combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (without transfer rate) combination is not competitive, owing to the difficulty of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for revealing the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio. However, their disadvantages are completeness and complexity.
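
    The index-based comparison described above can be sketched as a weighted score over per-combination ratings. Everything below (the 1-5 ratings, equal weights, and combination names) is invented for illustration; the paper's actual evaluation is richer than a weighted sum.

```python
INDEXES = ["dynamic capability", "completeness", "achievability",
           "detail", "signal/noise ratio", "complexity", "implementation cost"]

def score(ratings, weights):
    """Weighted sum of per-index ratings for one SSA technique combination."""
    return sum(ratings[i] * weights[i] for i in INDEXES)

weights = {i: 1.0 for i in INDEXES}  # assumed equal analyst priorities
combos = {
    "PHA+FMEA+FTA+Markov": dict(zip(INDEXES, [2, 3, 2, 3, 3, 2, 2])),
    "DFM":                 dict(zip(INDEXES, [4, 3, 4, 4, 4, 2, 3])),
    "Simulation-based":    dict(zip(INDEXES, [4, 3, 4, 4, 4, 2, 2])),
}
best = max(combos, key=lambda name: score(combos[name], weights))
print(best)  # → DFM
```

    Adjusting the weights to a project's resources is exactly the selection step the paper proposes: the same ratings can yield a different best combination under different priorities.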

  6. Analysis of archaeological pieces with nuclear techniques

    International Nuclear Information System (INIS)

    Tenorio, D.

    2002-01-01

    In this work, nuclear techniques such as Neutron Activation Analysis, PIXE, X-ray fluorescence analysis, metallography, uranium series, and Rutherford backscattering, for use in the analysis of archaeological specimens and materials, are described. Some published works and theses about the analysis of different Mexican and Mesoamerican archaeological sites are also referenced. (Author)

  7. Comparative Analysis between Podography and Radiography in the Management of Idiopathic Clubfeet by Ponseti Technique.

    Science.gov (United States)

    Trivedi, Vikas; Badhwar, Sumit; Dube, Abhay S

    2017-02-01

    Idiopathic clubfoot is one of the most common and oldest known congenital foot anomalies. There are controversies regarding its optimum management protocol and the methodologies to be employed for evaluating its functional outcome. This paper proposes a simple, reasonable and easily reproducible technique of podography for the clinical and functional evaluation of clubfoot treated by the popular Ponseti technique. The aim was to compare the Foot Bimalleolar (FBM) angle method (podography) and radiography in the management of idiopathic clubfoot by Ponseti's technique and its functional evaluation. Sixty feet of 48 patients with idiopathic clubfoot deformity were assessed in terms of FBM by podography (foot print on paper and FBM angle drawing) and radiologically, before starting treatment, after 6 weeks, and at 6-monthly intervals, with a maximum follow-up period of 4.8 years (range 1.2 to 4.8 years). Mean age at the start of treatment was 1.5 years (2 months to 2.5 years). Functional evaluation was done with Magone's scoring system. After treatment, 92 percent of patients had good correction (FBM greater than 70 degrees), which correlated well with a post-treatment Magone's score of greater than 80 (good to excellent) in nearly 85 percent of cases. Radiologically, talocalcaneal angles in both views improved in only 60 percent of cases. Radiological criteria show inconsistent correlation with functional outcome for feet treated by Ponseti's technique. Podography (FBM angle analysis) is a very simple, objective, cost-effective, radiation-free, easily reproducible and highly reliable clinical criterion for the assessment of deformity correction in clubfoot by Ponseti's technique, with excellent correlation with functional outcome.
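
    An angle such as FBM, measured between two lines drawn on a foot tracing, reduces computationally to the angle between two 2D vectors. The landmark coordinates below are invented to illustrate the geometry, not taken from the study.

```python
import math

def angle_between(p1, p2, q1, q2):
    """Angle in degrees between line p1-p2 and line q1-q2."""
    v = (p2[0] - p1[0], p2[1] - p1[1])
    w = (q2[0] - q1[0], q2[1] - q1[1])
    cos = (v[0] * w[0] + v[1] * w[1]) / (math.hypot(*v) * math.hypot(*w))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos))))

# Hypothetical landmarks on a tracing (cm): a foot axis from heel toward the
# second toe, and a bimalleolar axis between the two malleolar projections.
foot_axis = ((0.0, 0.0), (2.0, 10.0))
bimalleolar = ((-2.0, 2.5), (3.0, 1.5))
fbm = angle_between(*foot_axis, *bimalleolar)
print(round(fbm, 1))  # → 90.0 (well above the paper's 70-degree cut-off)
```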

  8. Evaluation of condyle defects using different reconstruction protocols of cone-beam computed tomography

    International Nuclear Information System (INIS)

    Bastos, Luana Costa; Campos, Paulo Sergio Flores; Ramos-Perez, Flavia Maria de Moraes; Pontual, Andrea dos Anjos; Almeida, Solange Maria

    2013-01-01

    This study was conducted to investigate how well cone-beam computed tomography (CBCT) can detect simulated cavitary defects in condyles, and to test the influence of the reconstruction protocols. Defects were created with spherical diamond burs (numbers 1013, 1016, 3017) in superior and/or posterior surfaces of twenty condyles. The condyles were scanned, and cross-sectional reconstructions were performed with nine different protocols, based on slice thickness (0.2, 0.6, 1.0 mm) and on the filters (original image, Sharpen Mild, S9) used. Two observers evaluated the defects, determining their presence and location. Statistical analysis was carried out using the simple Kappa coefficient and McNemar’s test to check inter- and intra-rater reliability. The chi-square test was used to compare the rater accuracy. Analysis of variance (Tukey's test) assessed the effect of the protocols used. Kappa values for inter- and intra-rater reliability demonstrate almost perfect agreement. The proportion of correct answers was significantly higher than that of errors for cavitary defects on both condyle surfaces (p < 0.01). Only in identifying the defects located on the posterior surface was it possible to observe the influence of the 1.0 mm protocol thickness and no filter, which showed a significantly lower value. Based on the results of the current study, the technique used was valid for identifying the existence of cavities in the condyle surface. However, the protocol of a 1.0 mm-thick slice and no filter proved to be the worst method for identifying the defects on the posterior surface. (author)
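
    Inter-rater reliability of the kind reported here is commonly quantified with Cohen's kappa, the observed agreement corrected for agreement expected by chance. A minimal sketch on invented present/absent ratings (not the study's data):

```python
def cohens_kappa(a, b):
    """Unweighted Cohen's kappa for two raters' label sequences."""
    assert len(a) == len(b)
    n = len(a)
    labels = sorted(set(a) | set(b))
    p_o = sum(x == y for x, y in zip(a, b)) / n           # observed agreement
    p_e = sum((a.count(l) / n) * (b.count(l) / n)          # chance agreement
              for l in labels)
    return (p_o - p_e) / (1.0 - p_e)

# Two raters marking a defect as present (1) or absent (0) on ten condyles:
r1 = [1, 1, 1, 0, 0, 1, 0, 1, 1, 0]
r2 = [1, 1, 1, 0, 1, 1, 0, 1, 1, 0]
print(round(cohens_kappa(r1, r2), 2))  # → 0.78
```

    On the conventional Landis-Koch scale, values above 0.8 are usually read as "almost perfect" agreement, the range the study reports.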

  9. Evaluation of condyle defects using different reconstruction protocols of cone-beam computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Bastos, Luana Costa; Campos, Paulo Sergio Flores, E-mail: bastosluana@ymail.com [Universidade Federal da Bahia (UFBA), Salvador, BA (Brazil). Fac. de Odontologia. Dept. de Radiologia Oral e Maxilofacial; Ramos-Perez, Flavia Maria de Moraes [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Fac. de Odontologia. Dept. de Clinica e Odontologia Preventiva; Pontual, Andrea dos Anjos [Universidade Federal de Pernambuco (UFPE), Camaragibe, PE (Brazil). Fac. de Odontologia. Dept. de Radiologia Oral; Almeida, Solange Maria [Universidade Estadual de Campinas (UNICAMP), Piracicaba, SP (Brazil). Fac. de Odontologia. Dept. de Radiologia Oral

    2013-11-15

    This study was conducted to investigate how well cone-beam computed tomography (CBCT) can detect simulated cavitary defects in condyles, and to test the influence of the reconstruction protocols. Defects were created with spherical diamond burs (numbers 1013, 1016, 3017) in superior and/or posterior surfaces of twenty condyles. The condyles were scanned, and cross-sectional reconstructions were performed with nine different protocols, based on slice thickness (0.2, 0.6, 1.0 mm) and on the filters (original image, Sharpen Mild, S9) used. Two observers evaluated the defects, determining their presence and location. Statistical analysis was carried out using the simple Kappa coefficient and McNemar’s test to check inter- and intra-rater reliability. The chi-square test was used to compare the rater accuracy. Analysis of variance (Tukey's test) assessed the effect of the protocols used. Kappa values for inter- and intra-rater reliability demonstrate almost perfect agreement. The proportion of correct answers was significantly higher than that of errors for cavitary defects on both condyle surfaces (p < 0.01). Only in identifying the defects located on the posterior surface was it possible to observe the influence of the 1.0 mm protocol thickness and no filter, which showed a significantly lower value. Based on the results of the current study, the technique used was valid for identifying the existence of cavities in the condyle surface. However, the protocol of a 1.0 mm-thick slice and no filter proved to be the worst method for identifying the defects on the posterior surface. (author)

  10. Processing and analysis techniques involving in-vessel material generation

    Science.gov (United States)

    Schabron, John F [Laramie, WY; Rovani, Jr., Joseph F.

    2012-09-25

    In at least one embodiment, the inventive technology relates to in-vessel generation of a material from a solution of interest as part of a processing and/or analysis operation. Preferred embodiments of in-vessel material generation (e.g., in-vessel solid material generation) include precipitation; in certain embodiments, analysis and/or processing of the solution of interest may include dissolution of the material, perhaps as part of a successive dissolution protocol using solvents of increasing dissolving ability. Applications include, but are by no means limited to, estimation of a coking onset and solution (e.g., oil) fractionation.

  11. Analysis of Security Protocols for Mobile Healthcare.

    Science.gov (United States)

    Wazid, Mohammad; Zeadally, Sherali; Das, Ashok Kumar; Odelu, Vanga

    2016-11-01

    Mobile Healthcare (mHealth) continues to improve because of significant advances in, and the decreasing costs of, Information Communication Technologies (ICTs). mHealth is a medical and public health practice supported by mobile devices (for example, smartphones) and patient monitoring devices (for example, various types of wearable sensors). An mHealth system enables healthcare experts and professionals to have ubiquitous access to a patient's health data, along with providing any ongoing medical treatment, at any time, any place, and from any device. It also helps patients requiring continuous medical monitoring to stay in touch with the appropriate medical staff and healthcare experts remotely. Thus, mHealth has become a major driving force in improving the health of citizens today. First, we discuss the security requirements, issues and threats to the mHealth system. We then present a taxonomy of recently proposed security protocols for mHealth systems based on features supported and possible attacks, computation cost, and communication cost. Our detailed taxonomy demonstrates the strengths and weaknesses of recently proposed security protocols for the mHealth system. Finally, we identify some of the challenges in the area of security protocols for mHealth systems that still need to be addressed in the future to enable cost-effective, secure and robust mHealth systems.

  12. The Virtual Insect Brain protocol: creating and comparing standardized neuroanatomy

    Science.gov (United States)

    Jenett, Arnim; Schindelin, Johannes E; Heisenberg, Martin

    2006-01-01

    Background In the fly Drosophila melanogaster, new genetic, physiological, molecular and behavioral techniques for the functional analysis of the brain are rapidly accumulating. These diverse investigations of the function of the insect brain use gene expression patterns that can be visualized and provide the means for manipulating groups of neurons as a common ground. To take advantage of these patterns one needs to know their typical anatomy. Results This paper describes the Virtual Insect Brain (VIB) protocol, a script suite for the quantitative assessment, comparison, and presentation of neuroanatomical data. It is based on the 3D-reconstruction and visualization software Amira, version 3.x (Mercury Inc.) [1]. Besides its backbone, a standardization procedure that aligns individual 3D images (series of virtual sections obtained by confocal microscopy) to a common coordinate system and computes average intensities for each voxel (volume pixel), the VIB protocol provides an elaborate data management system for data administration. The VIB protocol facilitates direct comparison of gene expression patterns and describes their interindividual variability. It provides volumetry of brain regions and helps to characterize the phenotypes of brain structure mutants. Using the VIB protocol does not require any programming skills, since all operations are carried out through an intuitively usable graphical user interface. Although the VIB protocol has been developed for the standardization of Drosophila neuroanatomy, the program structure can be used for the standardization of other 3D structures as well. Conclusion Standardizing brains and gene expression patterns is a new approach to biological shape and its variability. The VIB protocol provides a first set of tools supporting this endeavor in Drosophila. The script suite is freely available at [2]. PMID:17196102
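
    The core of the standardization step, averaging intensities voxel-by-voxel across individual images already aligned to a common coordinate system, can be sketched in a few lines. The tiny nested-list "stacks" below stand in for real confocal volumes; the VIB protocol itself works inside Amira, not in Python.

```python
def average_intensity(volumes):
    """Voxel-wise mean of several aligned 3D images (nested z/y/x lists):
    the arithmetic behind building a 'standard' brain from individuals."""
    n = len(volumes)
    zs = len(volumes[0])
    ys = len(volumes[0][0])
    xs = len(volumes[0][0][0])
    return [[[sum(v[z][y][x] for v in volumes) / n
              for x in range(xs)] for y in range(ys)] for z in range(zs)]

# Three tiny 2x2x2 'stacks' already registered to a common frame:
v1 = [[[1, 2], [3, 4]], [[5, 6], [7, 8]]]
v2 = [[[3, 2], [3, 4]], [[5, 6], [7, 8]]]
v3 = [[[2, 2], [3, 4]], [[5, 6], [7, 8]]]
avg = average_intensity([v1, v2, v3])
print(avg[0][0][0])  # → 2.0
```

    The averaging only makes sense after registration: without the alignment step, the same voxel index would refer to different anatomy in different individuals.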

  13. Advanced Techniques of Stress Analysis

    Directory of Open Access Journals (Sweden)

    Simion TATARU

    2013-12-01

    Full Text Available This article aims to check a stress analysis technique based on 3D models, making a comparison with the traditional technique in which the model is built directly in the stress analysis program. This comparison of the two methods is made with reference to the rear fuselage of the IAR-99 aircraft, a structure with a high degree of complexity which allows a meaningful evaluation of both approaches. Three updated databases are envisaged: the database with the idealized model obtained using ANSYS, working directly from documentation without automatic generation of nodes and elements (with few exceptions); the rear fuselage database (produced at this stage) obtained with Pro/ENGINEER; and the one obtained by using ANSYS with the second database. Then, each of the three databases will be used according to arising necessities. The main objective is to develop the parameterized model of the rear fuselage using the computer-aided design software Pro/ENGINEER. A review of research regarding the use of virtual reality with interactive analysis performed by the finite element method is made to show the state-of-the-art achieved in this field.

  14. Protocol Analysis of Group Problem Solving in Mathematics: A Cognitive-Metacognitive Framework for Assessment.

    Science.gov (United States)

    Artzt, Alice F.; Armour-Thomas, Eleanor

    The roles of cognition and metacognition were examined in the mathematical problem-solving behaviors of students as they worked in small groups. As an outcome, a framework that links the literature of cognitive science and mathematical problem solving was developed for protocol analysis of mathematical problem solving. Within this framework, each…

  15. Analysis of Pervasive Mobile Ad Hoc Routing Protocols

    Science.gov (United States)

    Qadri, Nadia N.; Liotta, Antonio

    Mobile ad hoc networks (MANETs) are a fundamental element of pervasive networks and, therefore, of pervasive systems that truly support pervasive computing, where users can communicate anywhere, anytime and on-the-fly. In fact, future advances in pervasive computing rely on advancements in mobile communication, which includes both infrastructure-based wireless networks and infrastructure-less MANETs. MANETs introduce a new communication paradigm which does not require a fixed infrastructure: they rely on wireless terminals for routing and transport services. Due to the highly dynamic topology, the absence of an established infrastructure for centralized administration, bandwidth-constrained wireless links, and limited resources in MANETs, it is challenging to design an efficient and reliable routing protocol. This chapter reviews the key studies carried out so far on the performance of mobile ad hoc routing protocols. We discuss performance issues and the metrics required for the evaluation of ad hoc routing protocols. This leads to a survey of existing work, which captures the performance of ad hoc routing algorithms and their behaviour from different perspectives and highlights avenues for future research.
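
    The evaluation metrics discussed above are typically computed from simulation traces. A minimal sketch of three standard ones (packet delivery ratio, average end-to-end delay, normalized routing load) on invented trace counts:

```python
def packet_delivery_ratio(sent, received):
    """Fraction of data packets that reached their destinations."""
    return received / sent

def average_end_to_end_delay(delays):
    """Mean latency (seconds) of successfully delivered packets."""
    return sum(delays) / len(delays)

def normalized_routing_load(control_packets, delivered_packets):
    """Routing-protocol overhead per delivered data packet."""
    return control_packets / delivered_packets

# Hypothetical trace: 1000 packets sent, 950 delivered, 300 control packets.
print(packet_delivery_ratio(1000, 950))                       # → 0.95
print(round(average_end_to_end_delay([0.02, 0.04, 0.06]), 3)) # → 0.04
print(round(normalized_routing_load(300, 950), 2))            # → 0.32
```

    Reporting all three together matters: a protocol can buy delivery ratio with flooding, which a high normalized routing load immediately exposes.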

  16. An Optimal Non-Interactive Message Authentication Protocol

    OpenAIRE

    Pasini, Sylvain; Vaudenay, Serge

    2006-01-01

    Vaudenay recently proposed a message authentication protocol which is interactive and based on short authenticated strings (SAS). We study here SAS-based non-interactive message authentication protocols (NIMAP). We start with the analysis of two popular non-interactive message authentication protocols. The first one is based on a collision-resistant hash function and was presented by Balfanz et al. The second protocol is based on a universal hash function family and was proposed by Gehrmann, Mi...

  17. A new analysis technique for microsamples

    International Nuclear Information System (INIS)

    Boyer, R.; Journoux, J.P.; Duval, C.

    1989-01-01

    For many decades, isotopic analysis of uranium or plutonium has been performed by mass spectrometry. The most recent analytical techniques, using the counting method or a plasma torch combined with a mass spectrometer (ICP-MS), have not yet reached a greater degree of precision than the older methods in this field. The two means of ionization for isotopic analysis, by electronic bombardment of atoms or molecules (gas ion source) and by thermal effect (thermoionic source), are compared, revealing some inconsistency between the quantity of sample necessary for analysis and the luminosity. In fact, the quantity of sample necessary for the gas source mass spectrometer is 10 to 20 times greater than that for the thermoionization spectrometer, while the sample consumption is between 10⁵ and 10⁶ times greater. This proves that almost the entire sample is not needed for the measurement; it is only required because of the introduction system of the gas spectrometer. The new analysis technique referred to as ''microfluorination'' corrects this anomaly and exploits the advantages of the electron bombardment method of ionization.

  18. Handbook of Qualitative Research Techniques and Analysis in Entrepreneurship

    DEFF Research Database (Denmark)

    One of the most challenging tasks in the research design process is choosing the most appropriate data collection and analysis techniques. This Handbook provides a detailed introduction to five qualitative data collection and analysis techniques pertinent to exploring entrepreneurial phenomena.

  19. Analiza protokola kvaliteta usluga telekomunikacionih mreža / Analysis of quality of service protocols in telecommunication networks

    Directory of Open Access Journals (Sweden)

    Milojko Jevtović

    2003-05-01

    Full Text Available Quality of Service (QoS) protocols of current and future telecommunication networks have been developed, among other goals, to support different classes of service (Class of Service, CoS), real-time communication, and the transfer of multimedia messages over packet-based IP (Internet Protocol) networks. The paper surveys the characteristics of these protocols and assesses their concrete capabilities for providing quality of service within a system ('top to bottom', i.e., vertically in the OSI architecture) as well as 'horizontally', i.e., end to end between source and destination. / Today's and future telecommunication networks must enable transmission throughout a heterogeneous environment, using different Quality of Service protocols. QoS protocols use a variety of complementary mechanisms to enable deterministic end-to-end data delivery. The analysis of these protocols and their efficiency in providing QoS and CoS is given in this paper.

  20. The interventional effect of new drugs combined with the Stupp protocol on glioblastoma: A network meta-analysis.

    Science.gov (United States)

    Li, Mei; Song, Xiangqi; Zhu, Jun; Fu, Aijun; Li, Jianmin; Chen, Tong

    2017-08-01

    New therapeutic agents in combination with the standard Stupp protocol (the temozolomide-plus-radiotherapy regimen for glioblastoma described by Stupp R in 2005) were assessed to evaluate whether they were superior to the Stupp protocol alone, and to determine the optimum treatment regimen for patients with newly diagnosed glioblastoma. We implemented a search strategy to identify studies in the following databases: PubMed, Cochrane Library, EMBASE, CNKI, CBM, Wanfang, and VIP, and assessed the quality of data extracted from the included trials. Statistical software was used to perform the network meta-analysis. The novel therapeutic agents used in combination with the Stupp protocol were all shown to be superior to the Stupp protocol alone for the treatment of newly diagnosed glioblastoma, ranked as follows: cilengitide 2000 mg/5/week, bevacizumab in combination with irinotecan, nimotuzumab, bevacizumab, cilengitide 2000 mg/2/week, cytokine-induced killer cell immunotherapy, and the Stupp protocol. In terms of serious adverse effects, the intervention group showed a 29% increase in the incidence of adverse events compared with the control group (patients treated with the Stupp protocol only), a statistically significant difference (RR=1.29; 95% CI 1.17-1.43; P<0.001). The most common adverse events were thrombocytopenia, lymphopenia, neutropenia, pneumonia, nausea, and vomiting, none of which differed significantly between the groups except for neutropenia, pneumonia, and embolism. All intervention drugs evaluated in our study were superior to the Stupp protocol alone when used in combination with it. However, we could not conclusively confirm whether cilengitide 2000 mg/5/week was the optimum regimen, as only one trial using this protocol was included in our study. Copyright © 2017. Published by Elsevier B.V.
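
    A pooled effect like the one reported above (RR = 1.29; 95% CI 1.17-1.43) comes from the standard log risk-ratio calculation. The sketch below applies that calculation to a single invented 2x2 table chosen to give RR = 1.29; it is not the meta-analysis data, and a real network meta-analysis pools many such tables.

```python
import math

def risk_ratio_ci(a, n1, c, n2, z=1.96):
    """Risk ratio and 95% CI via the log-RR normal approximation.
    a/n1: events/total in the intervention arm; c/n2: in the control arm."""
    rr = (a / n1) / (c / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n2)  # SE of log(RR)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Invented counts: 129/400 events with the new agent, 100/400 with Stupp alone.
rr, lo, hi = risk_ratio_ci(a=129, n1=400, c=100, n2=400)
print(round(rr, 2), round(lo, 2), round(hi, 2))  # → 1.29 1.03 1.61
```

    The interval here is wider than the paper's because a single small trial carries less information than the pooled evidence; pooling across trials is what narrows the CI to 1.17-1.43.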

  1. Families of quantum fingerprinting protocols

    Science.gov (United States)

    Lovitz, Benjamin; Lütkenhaus, Norbert

    2018-03-01

    We introduce several families of quantum fingerprinting protocols to evaluate the equality function on two n-bit strings in the simultaneous message passing model. The original quantum fingerprinting protocol uses a tensor product of a small number of O(log n)-qubit high-dimensional signals [H. Buhrman et al., Phys. Rev. Lett. 87, 167902 (2001), 10.1103/PhysRevLett.87.167902], whereas a recently proposed optical protocol uses a tensor product of O(n) single-qubit signals, while maintaining the O(log n) information leakage of the original protocol [J. M. Arazola and N. Lütkenhaus, Phys. Rev. A 89, 062305 (2014), 10.1103/PhysRevA.89.062305]. We find a family of protocols which interpolate between the original and optical protocols while maintaining the O(log n) information leakage, thus demonstrating a tradeoff between the number of signals sent and the dimension of each signal. There has been interest in experimental realization of the recently proposed optical protocol using coherent states [F. Xu et al., Nat. Commun. 6, 8735 (2015), 10.1038/ncomms9735; J.-Y. Guan et al., Phys. Rev. Lett. 116, 240502 (2016), 10.1103/PhysRevLett.116.240502], but as the required number of laser pulses grows linearly with the input size n, eventual challenges for the long-term stability of experimental setups arise. We find a coherent state protocol which reduces the number of signals by a factor of 1/2 while also reducing the information leakage. Our reduction makes use of a simple modulation scheme in optical phase space, and we find that more complex modulation schemes are not advantageous. Using a similar technique, we improve a recently proposed coherent state protocol for evaluating the Euclidean distance between two real unit vectors [N. Kumar et al., Phys. Rev. A 95, 032337 (2017), 10.1103/PhysRevA.95.032337] by reducing the number of signals by a factor of 1/2 and also reducing the information leakage.
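
    The flavor of fingerprinting for equality can be illustrated with a classical shared-randomness analogue (not the quantum protocol): each party sends short parities of its string against pseudo-random masks derived from shared seeds, and the referee compares the two fingerprints. All names and parameters below are illustrative.

```python
import random

def fingerprint(bits, seeds):
    """Short fingerprint: parities of the input string against shared
    pseudo-random masks (a classical stand-in for the quantum states)."""
    fp = []
    for seed in seeds:
        mask_rng = random.Random(seed)          # shared randomness
        mask = [mask_rng.randint(0, 1) for _ in bits]
        fp.append(sum(b & m for b, m in zip(bits, mask)) % 2)
    return fp

seeds = list(range(32))  # 32 shared seeds -> 32-bit fingerprints
x = [1, 0, 1, 1, 0, 1, 0, 0]
y = [1, 0, 1, 1, 0, 1, 0, 0]
z = [1, 0, 1, 1, 0, 1, 1, 0]

# The referee only sees the short fingerprints, never the full strings:
print(fingerprint(x, seeds) == fingerprint(y, seeds))  # → True
print(fingerprint(x, seeds) == fingerprint(z, seeds))  # matches only with prob ~2^-32
```

    The quantum protocols achieve the analogous guarantee without shared randomness, and with only O(log n) information leakage about the inputs.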

  2. Quality assurance techniques for activation analysis

    International Nuclear Information System (INIS)

    Becker, D.A.

    1984-01-01

    The principles and techniques of quality assurance are applied to the measurement method of activation analysis. Quality assurance is defined to include quality control and quality assessment. Plans for quality assurance include consideration of: personnel; facilities; analytical design; sampling and sample preparation; the measurement process; standards; and documentation. Activation analysis concerns include: irradiation; chemical separation; counting/detection; data collection and analysis; and calibration. Types of standards discussed include calibration materials and quality assessment materials.

  3. An Authentication Protocol for Mobile IPTV Users Based on an RFID-USB Convergence Technique

    Science.gov (United States)

    Jeong, Yoon-Su; Kim, Yong-Tae

    With the growing trend towards convergence in broadcast and communications media, Internet Protocol television (IPTV) that delivers real-time multimedia content over diverse types of communications networks (e.g., broadband Internet, cable TV, and satellite TV) has become a mainstream technology. Authenticating mobile IPTV subscribers who are continuously on the move is a challenge. A complex authentication process often impairs conditional access security or service quality by admitting illegal users and delaying service. This paper proposes an RFID-USB authentication protocol for mobile IPTV users that combines USIM-based personalized authentication with lightweight authentication, utilizing RFID-USB technology with an implanted agent module (called an "agent tag") that temporarily holds enhanced user status information. The proposed authentication protocol adopts a plug-and-play security agent module that is placed in both an RFID tag and an RFID-USB. The implanted security agents cooperate in such a way that multiple RFID tags are connected seamlessly to an RFID-USB.

  4. Protocol for VOC-Arid ID remediation performance characterization

    International Nuclear Information System (INIS)

    Tegner, B.J.; Hassig, N.L.; Last, G.V.

    1994-09-01

    The Volatile Organic Compound-Arid Integrated Demonstration (VOC-Arid ID) is a technology development program sponsored by the US Department of Energy's Office of Technology Development that is targeted to acquire, develop, demonstrate, and deploy new technologies for the remediation of VOC contaminants in the soils and groundwaters of arid DOE sites. Technologies cannot be adequately evaluated unless sufficient site characterization and technology performance data have been collected and analyzed. The responsibility for identifying these data needs has been placed largely on the Principal Investigators (PIs) developing the remediation technology, who usually are not experts in site characterization or in identification of appropriate sampling, analysis, and monitoring techniques to support the field testing. This document provides a protocol for planning the collection of data before, during, and after a test of a new technology. This generic protocol provides the PIs and project managers with a set of steps to follow. The protocol is based on a data collection planning process called the Data Quality Objectives (DQO) process, which was originally developed by the US Environmental Protection Agency and has been expanded by DOE to support site cleanup decisions. The DQO process focuses on the quality and quantity of data required to make decisions. Stakeholders in the decisions must negotiate such key inputs to the process as the decision rules that will be used and the acceptable probabilities of making decision errors.
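
    As a rough illustration of the trade-off the DQO process negotiates, the number of samples needed to keep both decision-error probabilities below agreed limits can be estimated with a standard normal-approximation formula; the function name and example numbers below are hypothetical, not from the protocol document:

```python
from math import ceil
from statistics import NormalDist

def dqo_sample_size(sigma: float, delta: float, alpha: float, beta: float) -> int:
    """Minimum samples so a one-sample mean test keeps the false-positive
    rate <= alpha and the false-negative rate <= beta when the true mean
    differs from the action level by delta (normal approximation)."""
    z_a = NormalDist().inv_cdf(1 - alpha)
    z_b = NormalDist().inv_cdf(1 - beta)
    return ceil(((z_a + z_b) * sigma / delta) ** 2)

# e.g. sd of 10 ppm, must detect a 5 ppm exceedance, 5% error rates both ways
print(dqo_sample_size(sigma=10.0, delta=5.0, alpha=0.05, beta=0.05))  # 44
```

    Tightening either error probability or the detectable difference delta drives the required sample count up, which is exactly the negotiation among stakeholders that the DQO process formalizes.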

  5. Development of oil hydrocarbon fingerprinting and identification techniques

    International Nuclear Information System (INIS)

    Wang Zhendi; Fingas, Merv F.

    2003-01-01

    Oil, refined product, and pyrogenic hydrocarbons are the most frequently discovered contaminants in the environment. Effectively determining the fate of spilled oil in the environment and successfully identifying the source(s) of spilled oil and petroleum products are therefore extremely important in many oil-related environmental studies and liability cases. This article briefly reviews the recent development of chemical analysis methodologies which are most frequently used in oil spill characterization and identification studies and environmental forensic investigations. The fingerprinting and data interpretation techniques discussed include oil spill identification protocol, tiered analytical approach, generic features and chemical composition of oils, effects of weathering on hydrocarbon fingerprinting, recognition of distribution patterns of petroleum hydrocarbons, oil type screening and differentiation, analysis of 'source-specific marker' compounds, determination of diagnostic ratios of specific oil constituents, stable isotopic analysis, application of various statistical and numerical analysis tools, and application of other analytical techniques. The issue of how biogenic and pyrogenic hydrocarbons are distinguished from petrogenic hydrocarbons is also addressed.
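
    The determination of diagnostic ratios, one of the techniques listed, can be sketched as follows; the peak areas, compound names, and the 5% match criterion are illustrative assumptions, not values from the article:

```python
# Hypothetical GC-MS peak areas for two oil samples; the compound names follow
# common petroleum biomarkers, but the numbers are invented for illustration.
samples = {
    "spill":  {"pristane": 820.0, "phytane": 410.0, "C17": 950.0, "C18": 880.0},
    "source": {"pristane": 800.0, "phytane": 405.0, "C17": 940.0, "C18": 870.0},
}

def diagnostic_ratios(peaks):
    """Ratios widely used to correlate a spill with a candidate source."""
    return {
        "Pr/Ph":  peaks["pristane"] / peaks["phytane"],
        "C17/Pr": peaks["C17"] / peaks["pristane"],
        "C18/Ph": peaks["C18"] / peaks["phytane"],
    }

def relative_difference(r1, r2):
    """Percent difference per ratio; small differences support a common source."""
    return {k: 100.0 * abs(r1[k] - r2[k]) / ((r1[k] + r2[k]) / 2) for k in r1}

spill, source = (diagnostic_ratios(samples[s]) for s in ("spill", "source"))
print(relative_difference(spill, source))
```

    In practice, weathering-resistant ratios are preferred, and a battery of such ratios is compared statistically rather than relying on any single value.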

  6. A Protocol for the Comprehensive Flow Cytometric Analysis of Immune Cells in Normal and Inflamed Murine Non-Lymphoid Tissues

    Science.gov (United States)

    Yu, Yen-Rei A.; O’Koren, Emily G.; Hotten, Danielle F.; Kan, Matthew J.; Kopin, David; Nelson, Erik R.; Que, Loretta; Gunn, Michael D.

    2016-01-01

    Flow cytometry is used extensively to examine immune cells in non-lymphoid tissues. However, a method of flow cytometric analysis that is both comprehensive and widely applicable has not been described. We developed a protocol for the flow cytometric analysis of non-lymphoid tissues, including methods of tissue preparation, a 10-fluorochrome panel for cell staining, and a standardized gating strategy, that allows the simultaneous identification and quantification of all major immune cell types in a variety of normal and inflamed non-lymphoid tissues. We demonstrate that our basic protocol minimizes cell loss, reliably distinguishes macrophages from dendritic cells (DC), and identifies all major granulocytic and mononuclear phagocytic cell types. This protocol is able to accurately quantify 11 distinct immune cell types, including T cells, B cells, NK cells, neutrophils, eosinophils, inflammatory monocytes, resident monocytes, alveolar macrophages, resident/interstitial macrophages, CD11b- DC, and CD11b+ DC, in normal lung, heart, liver, kidney, intestine, skin, eyes, and mammary gland. We also characterized the expression patterns of several commonly used myeloid and macrophage markers. This basic protocol can be expanded to identify additional cell types such as mast cells, basophils, and plasmacytoid DC, or perform detailed phenotyping of specific cell types. In examining models of primary and metastatic mammary tumors, this protocol allowed the identification of several distinct tumor associated macrophage phenotypes, the appearance of which was highly specific to individual tumor cell lines. This protocol provides a valuable tool to examine immune cell repertoires and follow immune responses in a wide variety of tissues and experimental conditions. PMID:26938654

  7. Analysis of the results of CAT of thorax with bronchiectasis protocol, period 2000-2001 Hospital Calderon Guardia

    International Nuclear Information System (INIS)

    Pacheco Segura, Maureen

    2003-01-01

    This investigation analyses the computerized axial tomography (CAT) of the thorax performed with a bronchiectasis protocol. It was carried out in the Servicio de Radiologia e Imagenes Medicas of the Hospital Calderon Guardia, Costa Rica. Bronchiectasis is the abnormal, permanent dilation of the bronchial tree; diagnosing it is important because affected patients can suffer pulmonary infections, which may be accompanied by increased bronchial blood flow and hemoptysis. When disseminated, bronchiectasis can be associated with significant airway obstruction; when focal, it can be confused with neoplasia and other diseases. Imaging methods used for its diagnosis include chest x-ray, bronchography, and thoracic CAT; the diagnosis is usually confirmed by CAT, which is the imaging modality of choice for establishing the presence and extent of bronchiectasis. In addition, this study analyzes the clinical-radiological correlation in the patients who underwent thoracic CAT with the bronchiectasis protocol and identifies the radiological technique most suitable for obtaining a satisfactory result with that protocol [es

  8. Techniques for the thermal/hydraulic analysis of LMFBR check valves

    International Nuclear Information System (INIS)

    Cho, S.M.; Kane, R.S.

    1979-01-01

    A thermal/hydraulic analysis of the check valves in liquid sodium service for LMFBR plants is required to provide temperature data for thermal stress analysis of the valves under specified transient conditions. Because of the complex three-dimensional flow pattern within the valve, the heat transfer analysis techniques for less complicated shapes could not be used. This paper discusses the thermal analysis techniques used to assure that the valve stress analysis is conservative. These techniques include a method for evaluating the recirculating flow patterns and for selecting appropriately conservative heat transfer correlations in various regions of the valve.
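
    As a sketch of how a conservative heat transfer correlation might be applied in such an analysis, the following uses a Lyon-type liquid-metal correlation for fully developed flow in a round tube; the correlation choice, property values, and geometry are illustrative assumptions, not taken from the paper:

```python
def nu_lyon(pe: float) -> float:
    """Lyon-type correlation for fully developed liquid-metal flow in a round
    tube with uniform heat flux: Nu = 7.0 + 0.025 * Pe**0.8.  This is an
    assumed stand-in; the valve analysis would use correlations vetted for
    its actual geometry and flow regime."""
    return 7.0 + 0.025 * pe ** 0.8

def htc(nu: float, k: float, d_h: float) -> float:
    """Convective coefficient h = Nu * k / D_h (W/m^2-K)."""
    return nu * k / d_h

# Illustrative sodium-like properties: k ~ 70 W/m-K, hydraulic diameter 0.3 m
pe = 500.0                       # Peclet number for the region of interest
h = htc(nu_lyon(pe), k=70.0, d_h=0.3)
print(round(h, 1))
```

    Evaluating several candidate correlations per region and carrying forward whichever yields the more severe thermal gradient is one way to keep the downstream stress analysis conservative.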

  9. Database communication protocol analyses and security detection

    International Nuclear Information System (INIS)

    Luo Qun; Liu Qiushi

    2003-01-01

    In this paper we introduce an analysis of the TDS protocol used in communication between client and server for SYBASE and MICROSOFT SQL SERVER, and describe tests of some bugs that exist in the protocol. (authors)

  10. Small area analysis using micro-diffraction techniques

    International Nuclear Information System (INIS)

    Goehner, Raymond P.; Tissot, Ralph G. Jr.; Michael, Joseph R.

    2000-01-01

    An overall trend toward smaller electronic packages and devices makes it increasingly important and difficult to obtain meaningful diffraction information from small areas. X-ray micro-diffraction, electron back-scattered diffraction (EBSD) and Kossel are micro-diffraction techniques used for crystallographic analysis including texture, phase identification and strain measurements. X-ray micro-diffraction primarily is used for phase analysis and residual strain measurements of areas between 10 μm and 100 μm. For areas this small, glass capillary optics are used to produce a usable collimated x-ray beam. These optics are designed to reflect x-rays below the critical angle, thereby allowing a larger solid acceptance angle at the x-ray source and resulting in brighter, smaller x-ray beams. The determination of residual strain using micro-diffraction techniques is very important to the semiconductor industry. Residual stresses have caused voiding of the interconnect metal, which then destroys electrical continuity. Being able to determine the residual stress helps industry to predict failures from the aging effects of interconnects due to this stress voiding. Stress measurements would be impossible using a conventional x-ray diffractometer; however, utilizing a 30 μm glass capillary, these small areas are readily accessible for analysis. Kossel produces a wide-angle diffraction pattern from fluorescent x-rays generated in the sample by an e-beam in a SEM. This technique can yield very precise lattice parameters for determining strain. Fig. 2 shows a Kossel pattern from a Ni specimen. Phase analysis on small areas is also possible using an energy dispersive spectrometer (EDS) and x-ray micro-diffraction techniques. EBSD has the advantage of allowing the user to observe the area of interest using the excellent imaging capabilities of the SEM. An EDS detector has been
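
    The strain determination described above ultimately rests on Bragg's law: a shift in diffraction peak position maps to a change in interplanar spacing. A minimal sketch, with an assumed Cu K-alpha wavelength and invented peak positions:

```python
from math import sin, radians

WAVELENGTH_CU_KA = 1.5406  # Angstrom, Cu K-alpha1 (a typical lab source)

def d_spacing(two_theta_deg: float, wavelength: float = WAVELENGTH_CU_KA) -> float:
    """Interplanar spacing from Bragg's law, n*lambda = 2*d*sin(theta), n = 1."""
    return wavelength / (2.0 * sin(radians(two_theta_deg / 2.0)))

def strain(d_measured: float, d_unstrained: float) -> float:
    """Elastic strain normal to the diffracting planes."""
    return (d_measured - d_unstrained) / d_unstrained

d0 = d_spacing(44.50)   # hypothetical unstrained peak position (2-theta, degrees)
d1 = d_spacing(44.45)   # peak shifted to lower angle -> larger d -> tensile strain
print(f"{strain(d1, d0):.2e}")
```

    A 0.05° peak shift at this angle corresponds to a strain on the order of 10^-3, which illustrates why the precise lattice parameters obtainable from Kossel patterns matter for stress-voiding predictions.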

  11. From human monocytes to genome-wide binding sites--a protocol for small amounts of blood: monocyte isolation/ChIP-protocol/library amplification/genome wide computational data analysis.

    Directory of Open Access Journals (Sweden)

    Sebastian Weiterer

    Full Text Available Chromatin immunoprecipitation in combination with genome-wide analysis via high-throughput sequencing is the state-of-the-art method to gain a genome-wide representation of histone modification or transcription factor binding profiles. However, chromatin immunoprecipitation analysis in the context of human experimental samples is limited, especially in the case of blood cells. The typically extremely low yields of precipitated DNA are usually not compatible with library amplification for next generation sequencing. We developed a highly reproducible protocol that provides a guideline from the first step of isolating monocytes from a blood sample through to analysing the distribution of histone modifications in a genome-wide manner. The protocol describes the whole workflow, from isolating monocytes from human blood samples to a high-sensitivity, small-scale chromatin immunoprecipitation assay, with guidance for generating libraries compatible with next generation sequencing from small amounts of immunoprecipitated DNA.

  12. A Weak Value Based QKD Protocol Robust Against Detector Attacks

    Science.gov (United States)

    Troupe, James

    2015-03-01

    We propose a variation of the BB84 quantum key distribution protocol that utilizes the properties of weak values to ensure the validity of the quantum bit error rate estimates used to detect an eavesdropper. The protocol is shown theoretically to be secure against recently demonstrated attacks utilizing detector blinding and control, and should also be robust against all detector-based hacking. Importantly, the new protocol promises to achieve this additional security without negatively impacting the secure key generation rate as compared to that originally promised by the standard BB84 scheme. Implementation of the weak measurements needed by the protocol should be very feasible using standard quantum optical techniques.

  13. Application of industrial hygiene techniques for work-place exposure assessment protocols related to petro-chemical exploration and production field activities

    International Nuclear Information System (INIS)

    Koehn, J.

    1995-01-01

    Standard industrial hygiene techniques for recognition, evaluation, and control can be directly applied to the development of technical protocols for workplace exposure assessment activities at a variety of field site locations. Categories of occupational hazards include chemical and physical agents. Examples of these types of hazards directly related to oil and gas exploration and production workplaces include hydrocarbons, benzene, oil mist, hydrogen sulfide, Naturally Occurring Radioactive Materials (NORM), asbestos-containing materials, and noise. Specific components of well process chemicals include potentially hazardous chemical substances such as methanol, acrolein, chlorine dioxide, and hydrochloric acid. Other types of exposure hazards may result from non-routine conduct of sandblasting and painting operations.

  14. Method-centered digital communities on protocols.io for fast-paced scientific innovation [version 2; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Lori Kindler

    2017-06-01

    Full Text Available The Internet has enabled online social interaction for scientists beyond physical meetings and conferences. Yet despite these innovations in communication, dissemination of methods is often relegated to just academic publishing. Further, these methods remain static, with subsequent advances published elsewhere and unlinked. For communities undergoing fast-paced innovation, researchers need new capabilities to share, obtain feedback, and publish methods at the forefront of scientific development. For example, a renaissance in virology is now underway given the new metagenomic methods to sequence viral DNA directly from an environment. Metagenomics makes it possible to "see" natural viral communities that could not be previously studied through culturing methods. Yet, the knowledge of specialized techniques for the production and analysis of viral metagenomes remains in a subset of labs. This problem is common to any community using and developing emerging technologies and techniques. We developed new capabilities to create virtual communities in protocols.io, an open access platform, for disseminating protocols and knowledge at the forefront of scientific development. To demonstrate these capabilities, we present a virology community forum called VERVENet. These new features allow virology researchers to share protocols and their annotations and optimizations, connect with the broader virtual community to share knowledge, job postings, and conference announcements through a common online forum, and discover the current literature through personalized recommendations that promote discussion of cutting edge research. Virtual communities in protocols.io enhance a researcher's ability to discuss and share protocols, connect with fellow community members, and learn about new and innovative research in the field. The web-based software for developing virtual communities is free to use on protocols.io. Data are available through public APIs at protocols.io.

  15. Development of a protocol for the kinematic analysis of movement in patients with total hip arthroplasty

    OpenAIRE

    Mateu Pla, Joan

    2015-01-01

    The aim of this final degree project is to study and analyze the kinematics of the lower limbs of the human body. First of all, it is extremely important to establish a protocol in order to compare two patients operated on with two different techniques of total hip arthroplasty. The three movements usually employed to make this comparison are gait, sit-to-stand and stair climbing. A three-dimensional full-body model is implemented and the kinematic parameters (angles) necessary for the st...
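
    The kinematic parameters mentioned, joint angles, are typically computed from 3-D marker positions. A minimal sketch of one such computation (the marker names and coordinates below are hypothetical, not from the project's data):

```python
import numpy as np

def joint_angle(proximal, joint, distal):
    """Angle (degrees) at `joint` between the two segments defined by 3-D
    marker positions, e.g. hip-knee-ankle for knee flexion/extension."""
    u = np.asarray(proximal, float) - np.asarray(joint, float)
    v = np.asarray(distal, float) - np.asarray(joint, float)
    cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))))

# Hypothetical marker frame during sit-to-stand (coordinates in metres)
hip, knee, ankle = (0.0, 0.0, 0.9), (0.05, 0.0, 0.5), (0.02, 0.0, 0.1)
print(round(joint_angle(hip, knee, ankle), 1))
```

    Applying this per frame across a recorded motion yields the angle-versus-time curves that allow the two surgical techniques to be compared across gait, sit-to-stand and stair climbing.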

  16. The Virtual Insect Brain protocol: creating and comparing standardized neuroanatomy

    Directory of Open Access Journals (Sweden)

    Schindelin Johannes E

    2006-12-01

    Full Text Available Abstract Background In the fly Drosophila melanogaster, new genetic, physiological, molecular and behavioral techniques for the functional analysis of the brain are rapidly accumulating. These diverse investigations of the function of the insect brain use gene expression patterns that can be visualized and provide the means for manipulating groups of neurons as a common ground. To take advantage of these patterns one needs to know their typical anatomy. Results This paper describes the Virtual Insect Brain (VIB) protocol, a script suite for the quantitative assessment, comparison, and presentation of neuroanatomical data. It is based on the 3D-reconstruction and visualization software Amira, version 3.x (Mercury Inc.). Besides its backbone, a standardization procedure which aligns individual 3D images (series of virtual sections obtained by confocal microscopy) to a common coordinate system and computes average intensities for each voxel (volume pixel), the VIB protocol provides an elaborate data management system for data administration. The VIB protocol facilitates direct comparison of gene expression patterns and describes their interindividual variability. It provides volumetry of brain regions and helps to characterize the phenotypes of brain structure mutants. Using the VIB protocol does not require any programming skills since all operations are carried out at an intuitively usable graphical user interface. Although the VIB protocol has been developed for the standardization of Drosophila neuroanatomy, the program structure can be used for the standardization of other 3D structures as well. Conclusion Standardizing brains and gene expression patterns is a new approach to biological shape and its variability. The VIB protocol provides a first set of tools supporting this endeavor in Drosophila. The script suite is freely available at http://www.neurofly.de
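
    The voxel-averaging step at the heart of the standardization procedure can be sketched in a few lines of NumPy; the registration to a common coordinate system is assumed to have been done already, and the toy data below is not VIB code:

```python
import numpy as np

def average_intensity(aligned_stacks):
    """Per-voxel mean over individual 3-D image stacks that have already been
    registered to a common coordinate system (the VIB standardization step)."""
    stacks = np.stack([np.asarray(s, dtype=np.float64) for s in aligned_stacks])
    return stacks.mean(axis=0)

# Three toy 'brains' of shape (2, 2, 2) standing in for confocal stacks
rng = np.random.default_rng(0)
brains = [rng.integers(0, 255, size=(2, 2, 2)) for _ in range(3)]
avg = average_intensity(brains)
print(avg.shape)  # (2, 2, 2)
```

    The same per-voxel reduction, with standard deviation in place of the mean, is what quantifies the interindividual variability the abstract mentions.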

  17. [Perinatal bioethics: euthanasia or end-of-life decisions? Analysis of the Groningen Protocol].

    Science.gov (United States)

    Halac, Jacobo; Halac, Eduardo; Moya, Martín P; Olmas, José M; Dopazo, Silvina L; Dolagaray, Nora

    2009-12-01

    The so-called "Groningen Protocol" was conceived as a framework to discuss euthanasia in neonates. Originally, it presents three groups of babies who might be candidates for this option. We analyzed the protocol in its original context and that of the Dutch society in which it was created. The analysis started with a careful reading of the protocol in both its English and Dutch versions, later translated into Spanish. The medical and nursing staff participated in discussing it, and a final consensus was reached. The Institutional Ethics Committee at our hospital discussed it freely and made recommendations for its application as a guideline to honestly discuss with parents the clinical condition of their babies, without permitting the option literally included in the word euthanasia. We selected four extremely ill infants. Their parents were interviewed at least twice daily, and three stages were identified: an initial one of promoting all possible treatments; a second one of guarded and cautious requests for the staff to evaluate "suffering"; and a last one of requests to reduce therapeutic efforts and provide a dignified death. A week after the death of their infants, the parents were presented with the facts of the protocol and the limits of our legal system. In all four cases the parents suggested that they would have chosen to end the life of their infants in order to spare them undue suffering. They clearly pointed out that this option emerged as a viable one to them once the ultimate outcome was evident. The protocol must not be viewed as a guideline for euthanasia in newborns, but rather as a means to discuss the critical condition of an infant with the parents. Its direct implementation in our setting remains difficult. A clear limitation to its overall application remains the definition of what is considered "unbearable suffering" in newborns, and how to certify when the infant has "no prospect".
We emphasize the benefits of securing the help of the Ethics

  18. The use of PCR techniques to detect genetic variations in Cassava (Manihot esculenta L. Crantz): minisatellite and RAPD analysis

    International Nuclear Information System (INIS)

    Pawlicki, N.; Sangwan, R.S.; Sangwan-Norreel, B.; Koffi Konan, N.

    1998-01-01

    Cassava is an important tuber crop grown in tropical and subtropical regions. Recently, we developed protocols for efficient somatic embryogenesis using zygotic embryos and nodal axillary meristems in order to reduce the genotype effect. Thereafter, flow cytophotometry and randomly amplified polymorphic DNA (RAPD) markers were used to assess the ploidy level and the genetic fidelity of cassava plants regenerated by somatic embryogenesis. No change in the ploidy level of the regenerated plants was observed in comparison with the control plants. Similarly, monomorphic RAPD profiles were obtained for the different cassava plants regenerated by somatic embryogenesis. The genetic analysis of calli showed only a few differences. Using two pairs of heterologous microsatellite primers developed in a wild African grass, a monomorphic pattern was also detected. Moreover, cultivars of different origins were also analysed using these PCR techniques. Our data from the RAPD and microsatellite analyses suggest that these techniques can be efficiently used to detect genetic variations in cassava. (author)
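
    Scoring RAPD profiles as presence/absence of bands and comparing them with a similarity coefficient is a common way to quantify the monomorphism reported above; the band sizes below are invented for illustration, not taken from the study:

```python
def jaccard(bands_a, bands_b):
    """Similarity between two RAPD band profiles scored as presence/absence."""
    a, b = set(bands_a), set(bands_b)
    return len(a & b) / len(a | b)

# Hypothetical band sizes (bp) scored for a regenerant and its mother plant
regenerant = {250, 420, 600, 780, 1100}
mother     = {250, 420, 600, 780, 1100}
variant    = {250, 420, 650, 780, 1100}

print(jaccard(regenerant, mother))   # 1.0 -> monomorphic, no detected variation
print(jaccard(regenerant, variant))  # < 1.0 -> a polymorphic band
```

    A similarity of 1.0 across all primer/plant combinations corresponds to the monomorphic patterns the authors observed for the regenerated plants.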

  19. Microextraction sample preparation techniques in biomedical analysis.

    Science.gov (United States)

    Szultka, Malgorzata; Pomastowski, Pawel; Railean-Plugaru, Viorica; Buszewski, Boguslaw

    2014-11-01

    Biologically active compounds are found in biological samples at relatively low concentration levels. The sample preparation of target compounds from biological, pharmaceutical, environmental, and food matrices is one of the most time-consuming steps in the analytical procedure, and here microextraction techniques are dominant. Metabolomic studies also require application of a proper analytical technique for the determination of endogenic metabolites present in a biological matrix at trace concentration levels. Owing to their reproducibility, precision, relatively low cost, simplicity, and the possibility of direct combination with other methods (both on-line and off-line), microextraction techniques have become the most widespread in routine determinations. Additionally, sample pretreatment procedures have to be more selective, cheap, quick, and environmentally friendly. This review summarizes the current achievements and applications of microextraction techniques. The main aim is to deal with the utilization of different types of sorbents for microextraction and emphasize the use of newly synthesized sorbents, as well as to bring together studies concerning a systematic approach to method development. This review is dedicated to the description of microextraction techniques and their application in biomedical analysis. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Performance Analysis of an Optical CDMA MAC Protocol With Variable-Size Sliding Window

    Science.gov (United States)

    Mohamed, Mohamed Aly A.; Shalaby, Hossam M. H.; Abdel-Moety El-Badawy, El-Sayed

    2006-10-01

    A media access control protocol for optical code-division multiple-access packet networks with variable-length data traffic is proposed. This protocol exhibits a sliding window with variable size. A model for interference-level fluctuation and an accurate analysis of channel usage are presented. Both multiple-access interference (MAI) and the photodetector's shot noise are considered, and both chip-level and correlation receivers are adopted. The system performance is evaluated using the traditional measures of average system throughput and average delay. Finally, in order to enhance the overall performance, error control codes (ECCs) are applied. The results indicate that the performance can be enhanced to reach its peak using an ECC with an optimum number of correctable errors. Furthermore, chip-level receivers are shown to give much higher performance than correlation receivers. Also, it is shown that MAI is the main source of signal degradation.
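
    The benefit of an ECC correcting up to t errors can be sketched with the standard binomial packet-success probability; the packet length and bit error probability below are illustrative placeholders, not values from the paper's MAI/shot-noise model:

```python
from math import comb

def packet_success(L: int, p: float, t: int) -> float:
    """Probability a length-L packet is accepted when each bit errs
    independently with probability p and the ECC corrects up to t errors."""
    return sum(comb(L, i) * p**i * (1 - p)**(L - i) for i in range(t + 1))

# Illustrative numbers: 1024-bit packets, error rate driven by MAI + shot noise
L, p = 1024, 1e-3
for t in (0, 1, 2, 4, 8):
    print(t, round(packet_success(L, p, t), 4))
```

    Throughput rises with t until the ECC's redundancy overhead outweighs the gain, which is the "optimum number of correctable errors" the abstract refers to; this sketch captures only the success-probability side of that trade-off.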

  1. A Standardized and Reproducible Urine Preparation Protocol for Cancer Biomarkers Discovery

    Directory of Open Access Journals (Sweden)

    Julia Beretov

    2014-01-01

    Full Text Available A suitable and standardized protein purification technique is essential to maintain consistency and to allow data comparison between proteomic studies for urine biomarker discovery. Ultimately, efforts should be made to standardize urine preparation protocols. The aim of this study was to develop an optimal analytical protocol to achieve maximal protein yield and to ensure that this method was applicable to examine urine protein patterns that distinguish disease and disease-free states. In this pilot study, we compared seven different urine sample preparation methods to remove salts, and to precipitate and isolate urinary proteins. Sodium dodecyl sulfate-polyacrylamide gel electrophoresis (SDS-PAGE profiles showed that the sequential preparation of urinary proteins by combining acetone and trichloroacetic acid (TCA alongside high speed centrifugation (HSC provided the best separation, and retained the most urinary proteins. Therefore, this approach is the preferred method for all further urine protein analysis.

  2. Development of a manualized protocol of massage therapy for clinical trials in osteoarthritis

    Directory of Open Access Journals (Sweden)

    Ali Ather

    2012-10-01

    Full Text Available Abstract Background Clinical trial design of manual therapies may be especially challenging as techniques are often individualized and practitioner-dependent. This paper describes our methods in creating a standardized Swedish massage protocol tailored to subjects with osteoarthritis of the knee while respectful of the individualized nature of massage therapy, as well as implementation of this protocol in two randomized clinical trials. Methods The manualization process involved a collaborative process between methodologic and clinical experts, with the explicit goals of creating a reproducible semi-structured protocol for massage therapy, while allowing some latitude for therapists’ clinical judgment and maintaining consistency with a prior pilot study. Results The manualized protocol addressed identical specified body regions with distinct 30- and 60-min protocols, using standard Swedish strokes. Each protocol specifies the time allocated to each body region. The manualized 30- and 60-min protocols were implemented in a dual-site 24-week randomized dose-finding trial in patients with osteoarthritis of the knee, and are currently being implemented in a three-site 52-week efficacy trial of manualized Swedish massage therapy. In the dose-finding study, therapists adhered to the protocols and significant treatment effects were demonstrated. Conclusions The massage protocol was manualized, using standard techniques, and made flexible for individual practitioner and subject needs. The protocol has been applied in two randomized clinical trials. This manualized Swedish massage protocol has real-world utility and can be readily utilized both in the research and clinical settings. Trial registration Clinicaltrials.gov NCT00970008 (18 August 2009)

  3. Kinematic analysis of the fouetté 720° technique in classical ballet.

    Directory of Open Access Journals (Sweden)

    Li Bo

    2011-07-01

    Full Text Available Athletics practice has shown that the more complex the element, the more difficult its technique. The fouetté at 720° is one of the most difficult types of fouetté, and its execution demands a high level of technique from the performer during rotation. Performing this element requires not only good physical condition in the dancer but also command of correct technique. On the basis of the corresponding kinematic theory, this study presents a qualitative analysis and quantitative assessment of fouettés at 720° as performed by the best Chinese dancers. The analysis was based on stereoscopic imaging combined with theoretical analysis.

  4. Greening the networks: a comparative analysis of different energy efficient techniques

    International Nuclear Information System (INIS)

    Arshad, M.J.; Saeed, S.S.

    2014-01-01

    From a room electric bulb to gigantic backbone networks, energy savings have now become a matter of considerable concern. Issues such as resource depletion, global warming, high energy consumption and environmental threats gave birth to the idea of green networking, and serious efforts have been made in this regard on a large scale in the ICT sector. In this work we first give an idea of how and why this modern technology emerged. We then formulate a precise definition of the term green technology. We then discuss some leading techniques which promise to produce green results when implemented on real-time network systems. These technologies are viewed from different perspectives, e.g. hardware implementations, software mechanisms and protocol changes. We then compare these techniques based on some pivotal points. The main conclusion is that a detailed comparison is needed when selecting a technology to implement on a network system. (author)

  5. A Ten Step Protocol and Plan for CCS Site Characterization, Based on an Analysis of the Rocky Mountain Region, USA

    Energy Technology Data Exchange (ETDEWEB)

    McPherson, Brian; Matthews, Vince

    2013-09-15

    This report expresses a Ten-Step Protocol for CO2 Storage Site Characterization, the final outcome of an extensive Site Characterization analysis of the Rocky Mountain region, USA. These ten steps include: (1) regional assessment and data gathering; (2) identification and analysis of appropriate local sites for characterization; (3) public engagement; (4) geologic and geophysical analysis of local site(s); (5) stratigraphic well drilling and coring; (6) core analysis and interpretation with other data; (7) database assembly and static model development; (8) storage capacity assessment; (9) simulation and uncertainty assessment; (10) risk assessment. While the results detailed here are primarily germane to the Rocky Mountain region, the intent of this protocol is to be portable or generally applicable for CO2 storage site characterization.

  6. A Framework for Security Analysis of Mobile Wireless Networks

    DEFF Research Database (Denmark)

    Nanz, Sebastian; Hankin, Chris

    2006-01-01

    We present a framework for specification and security analysis of communication protocols for mobile wireless networks. This setting introduces new challenges which are not being addressed by classical protocol analysis techniques. The main complication stems from the fact that the actions of intermediate nodes and their connectivity can no longer be abstracted into a single unstructured adversarial environment, as they form an inherent part of the system's security. In order to model this scenario faithfully, we present a broadcast calculus which makes a clear distinction between the protocol processes and the network's connectivity graph, which may change independently from protocol actions. We identify a property characterising an important aspect of security in this setting and express it using behavioural equivalences of the calculus. We complement this approach with a control flow analysis...

  7. Accuracy of molecular biology techniques for the diagnosis of Strongyloides stercoralis infection-A systematic review and meta-analysis.

    Science.gov (United States)

    Buonfrate, Dora; Requena-Mendez, Ana; Angheben, Andrea; Cinquini, Michela; Cruciani, Mario; Fittipaldo, Andrea; Giorli, Giovanni; Gobbi, Federico; Piubelli, Chiara; Bisoffi, Zeno

    2018-02-01

    Strongyloides stercoralis infection is a neglected tropical disease which can lead to severe symptoms and even death in immunosuppressed people. Unfortunately, its diagnosis is hampered by the lack of a gold standard, as the sensitivity of traditional parasitological tests (including microscopic examination of stool samples and coproculture) is low. Hence, alternative diagnostic methods, such as molecular biology techniques (mostly polymerase chain reaction, PCR), have been implemented. However, there are discrepancies in the reported accuracy of PCR. A systematic review with meta-analysis was conducted in order to evaluate the accuracy of PCR for the diagnosis of S. stercoralis infection. The protocol was registered with the PROSPERO International Prospective Register of Systematic Reviews (record: CRD42016054298). Fourteen studies, 12 of which evaluated real-time PCR, were included in the analysis. The specificity of the techniques was high (ranging from 93 to 95%, according to the reference test(s) used). When all molecular techniques were compared to parasitological methods, the sensitivity of PCR was assessed at 71.8% (95% CI 52.2-85.5), which decreased to 61.8% (95% CI 42.0-78.4) when serology was added among the reference tests. Similarly, the sensitivity of real-time PCR was 64.4% (95% CI 46.2-77.7) when compared to parasitological methods only, and 56.5% (95% CI 39.2-72.4) when serology was included. PCR might not be suitable for screening purposes, whereas it might have a role as a confirmatory test.
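
    The pooled estimates above aggregate per-study confusion matrices. As a minimal sketch of the underlying single-study calculation (with illustrative counts, not data from the review), sensitivity and specificity with Wilson score confidence intervals can be computed as:

    ```python
    import math

    def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
        """95% Wilson score interval for a binomial proportion."""
        if n == 0:
            return (0.0, 1.0)
        p = successes / n
        denom = 1 + z**2 / n
        centre = (p + z**2 / (2 * n)) / denom
        half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
        return (centre - half, centre + half)

    def sensitivity_specificity(tp: int, fn: int, tn: int, fp: int):
        """Sensitivity and specificity with 95% CIs from a 2x2 table."""
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        return sens, wilson_ci(tp, tp + fn), spec, wilson_ci(tn, tn + fp)

    # Illustrative counts only (not from the meta-analysis):
    sens, sens_ci, spec, spec_ci = sensitivity_specificity(tp=72, fn=28, tn=95, fp=5)
    print(f"sensitivity {sens:.1%} (95% CI {sens_ci[0]:.1%}-{sens_ci[1]:.1%})")
    print(f"specificity {spec:.1%} (95% CI {spec_ci[0]:.1%}-{spec_ci[1]:.1%})")
    ```

    A full meta-analysis would additionally pool such per-study estimates with a bivariate random-effects model, which is beyond this sketch.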

  8. Nuclear techniques for bulk and surface analysis of materials

    International Nuclear Information System (INIS)

    D'Agostino, M.D.; Kamykowski, E.A.; Kuehne, F.J.; Padawer, G.M.; Schneid, E.J.; Schulte, R.L.; Stauber, M.C.; Swanson, F.R.

    1978-01-01

    A review is presented summarizing several nondestructive bulk and surface analysis nuclear techniques developed in the Grumman Research Laboratories. Bulk analysis techniques include 14-MeV-neutron activation analysis and accelerator-based neutron radiography. The surface analysis techniques include resonant and non-resonant nuclear microprobes for the depth profile analysis of light elements (H, He, Li, Be, C, N, O and F) in the surface of materials. Emphasis is placed on the description and discussion of the unique nuclear microprobe analytical capabilities of immediate importance to a number of current problems facing materials specialists. The resolution and contrast of neutron radiography were illustrated with an operating heat pipe system. The figure shows that the neutron radiograph has a resolution of better than 0.04 cm, with sufficient contrast to indicate Freon 21 on the inner capillaries of the heat pipe and pooling of the liquid at the bottom. (T.G.)

  9. Authentication Protocols for Internet of Things: A Comprehensive Survey

    Directory of Open Access Journals (Sweden)

    Mohamed Amine Ferrag

    2017-01-01

    Full Text Available In this paper, a comprehensive survey of authentication protocols for the Internet of Things (IoT) is presented. Specifically, more than forty authentication protocols developed for, or applied in the context of, the IoT are selected and examined in detail. These protocols are categorized based on the target environment: (1) Machine-to-Machine Communications (M2M), (2) Internet of Vehicles (IoV), (3) Internet of Energy (IoE), and (4) Internet of Sensors (IoS). Threat models, countermeasures, and formal security verification techniques used in authentication protocols for the IoT are presented. In addition, a taxonomy and comparison of authentication protocols developed for the IoT in terms of network model, specific security goals, main processes, computation complexity, and communication overhead are provided. Based on the current survey, open issues are identified and future research directions are proposed.

  10. Event tree analysis using artificial intelligence techniques

    International Nuclear Information System (INIS)

    Dixon, B.W.; Hinton, M.F.

    1985-01-01

    Artificial Intelligence (AI) techniques used in Expert Systems and Object Oriented Programming are discussed as they apply to Event Tree Analysis. A SeQUence IMPortance calculator, SQUIMP, is presented to demonstrate the implementation of these techniques. Benefits of using AI methods include ease of programming, efficiency of execution, and flexibility of application. The importance of an appropriate user interface is stressed. 5 figs
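
    For context on what a sequence-importance calculator computes: an event tree enumerates accident sequences as combinations of branch successes and failures, and a sequence's frequency is the initiating-event frequency times the product of its branch probabilities. A minimal sketch of that core calculation over a hypothetical two-branch tree (this is an illustration of the technique, not SQUIMP itself):

    ```python
    from itertools import product

    def sequence_frequencies(init_freq: float, branch_success: dict) -> dict:
        """Enumerate event-tree sequences as tuples of top-event outcomes.

        branch_success maps top-event name -> probability of success;
        the failure probability is the complement. Returns a mapping
        {outcome_tuple: sequence_frequency}."""
        events = list(branch_success)
        seqs = {}
        for outcomes in product((True, False), repeat=len(events)):
            p = init_freq
            for ev, ok in zip(events, outcomes):
                p *= branch_success[ev] if ok else 1 - branch_success[ev]
            seqs[outcomes] = p
        return seqs

    # Hypothetical tree: initiating event at 1e-2/yr, mitigating
    # systems A and B with 0.99 and 0.95 success probabilities.
    seqs = sequence_frequencies(1e-2, {"A": 0.99, "B": 0.95})
    # Any sequence with at least one failed branch counts as a failure sequence.
    failure_freq = sum(p for outcome, p in seqs.items() if not all(outcome))
    print(f"total failure-sequence frequency: {failure_freq:.3e}/yr")
    ```

    Ranking the resulting sequences by frequency is the simplest form of the importance ordering such a tool produces.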

  11. Security and reliability analysis of diversity combining techniques in SIMO mixed RF/FSO with multiple users

    KAUST Repository

    Abd El-Malek, Ahmed H.; Salhab, Anas M.; Zummo, Salam A.; Alouini, Mohamed-Slim

    2016-01-01

    In this paper, we investigate the impact of different diversity combining techniques on the security and reliability analysis of a single-input multiple-output (SIMO) mixed radio frequency (RF)/free space optical (FSO) relay network with opportunistic multiuser scheduling. In this model, the user with the best channel among multiple users communicates with a multiple-antenna relay node over an RF link, and the relay node then employs the amplify-and-forward (AF) protocol to retransmit the user data to the destination over an FSO link. Moreover, the authorized transmission is assumed to be attacked by a single passive RF eavesdropper equipped with multiple antennas. Therefore, the system security-reliability trade-off is investigated. Closed-form expressions for the system outage probability and the system intercept probability are derived. The newly derived expressions are then simplified to their asymptotic formulas in the high signal-to-noise ratio (SNR) region. Numerical results are presented to validate the exact and asymptotic results and to illustrate the impact of various system parameters on the system performance. © 2016 IEEE.

  12. Security and reliability analysis of diversity combining techniques in SIMO mixed RF/FSO with multiple users

    KAUST Repository

    Abd El-Malek, Ahmed H.

    2016-07-26

    In this paper, we investigate the impact of different diversity combining techniques on the security and reliability analysis of a single-input multiple-output (SIMO) mixed radio frequency (RF)/free space optical (FSO) relay network with opportunistic multiuser scheduling. In this model, the user with the best channel among multiple users communicates with a multiple-antenna relay node over an RF link, and the relay node then employs the amplify-and-forward (AF) protocol to retransmit the user data to the destination over an FSO link. Moreover, the authorized transmission is assumed to be attacked by a single passive RF eavesdropper equipped with multiple antennas. Therefore, the system security-reliability trade-off is investigated. Closed-form expressions for the system outage probability and the system intercept probability are derived. The newly derived expressions are then simplified to their asymptotic formulas in the high signal-to-noise ratio (SNR) region. Numerical results are presented to validate the exact and asymptotic results and to illustrate the impact of various system parameters on the system performance. © 2016 IEEE.

  13. Demarcation of Security in Authentication Protocols

    DEFF Research Database (Denmark)

    Ahmed, Naveed; Jensen, Christian D.

    2011-01-01

    Security analysis of communication protocols is a slippery business; many “secure” protocols later turn out to be insecure. Among many, two complaints are most frequent: inadequate definition of security, and unstated assumptions in the security model. In our experience, one principal cause for such a state of affairs is an apparent overlap of security and correctness, which may lead to many sloppy security definitions and security models. Although there is no inherent need to separate security and correctness requirements, practically, such separation is significant: it makes security analysis easier and enables us to define security goals with a fine granularity. We present one such separation by introducing the notion of a binding sequence as a security primitive. A binding sequence, roughly speaking, is the only required security property of an authentication protocol. All other...

  14. Failure mode and effects analysis of witnessing protocols for ensuring traceability during IVF.

    Science.gov (United States)

    Rienzi, Laura; Bariani, Fiorenza; Dalla Zorza, Michela; Romano, Stefania; Scarica, Catello; Maggiulli, Roberta; Nanni Costa, Alessandro; Ubaldi, Filippo Maria

    2015-10-01

    Traceability of cells during IVF is a fundamental aspect of treatment, and involves witnessing protocols. Failure mode and effects analysis (FMEA) is a method of identifying real or potential breakdowns in processes, and allows strategies to mitigate risks to be developed. To examine the risks associated with witnessing protocols, an FMEA was carried out in a busy IVF centre, before and after implementation of an electronic witnessing system (EWS). A multidisciplinary team was formed and moderated by human factors specialists. Possible causes of failures, and their potential effects, were identified and risk priority number (RPN) for each failure calculated. A second FMEA analysis was carried out after implementation of an EWS. The IVF team identified seven main process phases, 19 associated process steps and 32 possible failure modes. The highest RPN was 30, confirming the relatively low risk that mismatches may occur in IVF when a manual witnessing system is used. The introduction of the EWS allowed a reduction in the moderate-risk failure mode by two-thirds (highest RPN = 10). In our experience, FMEA is effective in supporting multidisciplinary IVF groups to understand the witnessing process, identifying critical steps and planning changes in practice to enable safety to be enhanced. Copyright © 2015 Reproductive Healthcare Ltd. Published by Elsevier Ltd. All rights reserved.
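
    The RPN used in such an analysis is conventionally the product of three ordinal ratings: severity, occurrence and detectability. A minimal sketch with hypothetical failure modes and ratings (these are illustrative, not the study's actual FMEA entries):

    ```python
    def rpn(severity: int, occurrence: int, detectability: int) -> int:
        """Risk Priority Number: product of the three 1-10 ratings."""
        for r in (severity, occurrence, detectability):
            assert 1 <= r <= 10, "ratings are on a 1-10 ordinal scale"
        return severity * occurrence * detectability

    # Hypothetical failure modes in a witnessing process (illustrative ratings):
    failure_modes = {
        "dish mislabelled at fertilization": rpn(10, 1, 3),
        "manual double-check omitted":       rpn(7, 2, 2),
        "sample moved without witness":      rpn(8, 1, 2),
    }
    # Rank failure modes by risk, highest RPN first.
    for mode, score in sorted(failure_modes.items(), key=lambda kv: -kv[1]):
        print(f"RPN {score:3d}  {mode}")
    ```

    Mitigations (such as the electronic witnessing system) typically lower the occurrence or detectability ratings, which is how the post-intervention RPNs fall.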

  15. The development of human behavior analysis techniques

    International Nuclear Information System (INIS)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang.

    1997-07-01

    In this project, a study of man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for the assessment of task performance in control rooms using software simulation, and also developed human error analysis and application techniques. SACOM was developed to assess an operator's physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We also developed a trip analysis system, including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information and, for 79 cases induced by human error, time-lined the man-machine interactions. The INSTEC, a database system for our analysis results, was developed. The MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs

  16. The development of human behavior analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang

    1997-07-01

    In this project, a study of man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for the assessment of task performance in control rooms using software simulation, and also developed human error analysis and application techniques. SACOM was developed to assess an operator's physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We also developed a trip analysis system, including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information and, for 79 cases induced by human error, time-lined the man-machine interactions. The INSTEC, a database system for our analysis results, was developed. The MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs.

  17. Game-theoretic perspective of Ping-Pong protocol

    Science.gov (United States)

    Kaur, Hargeet; Kumar, Atul

    2018-01-01

    We analyse the Ping-Pong (PP) protocol from the point of view of a game. The analysis helps us understand the different strategies of a sender and an eavesdropper for gaining the maximum payoff in the game. The study presented here characterizes strategies that lead to different Nash equilibria. We further demonstrate the condition for Pareto optimality depending on the parameters used in the game. Moreover, we also analyse the LM05 protocol and compare it with the PP protocol from the point of view of a generic two-way QKD game with or without entanglement. Our results provide a deeper understanding of general two-way QKD protocols in terms of the security and payoffs of the different stakeholders in the protocol.

  18. A novel preparation technique of red (sparkling) wine for protein analysis

    Directory of Open Access Journals (Sweden)

    Elisabeth I. Vogt

    2016-06-01

    Full Text Available Despite their low concentration, proteins can influence several key enological parameters such as foam stability or haze formation in (sparkling) wine. Most studies focus on white (sparkling) wine, since the higher content of phenolic compounds in red wines impairs proteomic research. The aim of the study was the development of a method for the preparation of red (sparkling) wine proteins for proteomic analysis. Three methods of sample preparation were assessed on silver-stained SDS-PAGE gels and with MALDI-TOF MS. Our new method was highly suitable for the preparation of proteins for the aforementioned applications. The results showed a substantial increase in signal intensity with a simultaneous decrease in background noise. The preparation protocol consists of (i) dialysis and freeze drying of the sample, (ii) removal of phenolic compounds by water-saturated phenol and (iii) protein precipitation by addition of ammonium acetate. Employment of this method followed by SDS-PAGE analysis allowed for silver-stained gels with diminished background or streaking and clearly resolved protein bands. Analysis of spectra obtained from samples prepared according to the proposed protocol showed increased intensity and signal-to-noise ratio in MALDI-TOF MS. Furthermore, it was demonstrated that this method can be applied to various kinds of grape products.

  19. Spectroscopic analysis technique for arc-welding process control

    Science.gov (United States)

    Mirapeix, Jesús; Cobo, Adolfo; Conde, Olga; Quintela, María Ángeles; López-Higuera, José-Miguel

    2005-09-01

    The spectroscopic analysis of the light emitted by thermal plasmas has found many applications, from chemical analysis to the monitoring and control of industrial processes. In particular, it has been demonstrated that the analysis of the thermal plasma generated during arc or laser welding can supply information about the process and, thus, about the quality of the weld. In some critical applications (e.g. the aerospace sector), an early, real-time detection of defects in the weld seam (oxidation, porosity, lack of penetration, ...) is highly desirable as it can reduce expensive non-destructive testing (NDT). Among other techniques, full spectroscopic analysis of the plasma emission is known to offer rich information about the process itself, but it is also very demanding in terms of real-time implementation. In this paper, we propose a technique for the analysis of the plasma emission spectrum that is able to detect, in real time, changes in the process parameters that could lead to the formation of defects in the weld seam. It is based on the estimation of the electronic temperature of the plasma through the analysis of the emission peaks from multiple atomic species. Unlike traditional techniques, which usually involve peak fitting to Voigt functions using the Levenberg-Marquardt recursive method, we employ the LPO (Linear Phase Operator) sub-pixel algorithm to accurately estimate the central wavelength of the peaks (allowing an automatic identification of each atomic species) and cubic-spline interpolation of the noisy data to obtain the intensity and width of the peaks. Experimental tests on TIG welding, using fiber-optic capture of light and a low-cost CCD-based spectrometer, show that some typical defects can be easily detected and identified with this technique, whose typical processing time for multiple peak analysis is less than 20 ms running on a conventional PC.
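
    A standard way to turn two peak intensities into an electron temperature, once the peaks have been identified and measured, is the two-line Boltzmann ratio method, which assumes local thermodynamic equilibrium. A minimal sketch with illustrative Ar I line data (the atomic constants and intensities below are assumptions for the example, not values from the paper):

    ```python
    import math

    K_B_EV = 8.617333262e-5  # Boltzmann constant, eV/K

    def two_line_temperature(I1, I2, line1, line2):
        """Electron temperature (K) from the intensity ratio of two emission
        lines of the same species, assuming local thermodynamic equilibrium.
        Each line is (wavelength_nm, A_transition_prob, g_upper, E_upper_eV).
        Derived from I ~ (A*g/lambda) * exp(-E_upper / kT)."""
        lam1, A1, g1, E1 = line1
        lam2, A2, g2, E2 = line2
        ratio = (I1 * A2 * g2 * lam1) / (I2 * A1 * g1 * lam2)
        return (E2 - E1) / (K_B_EV * math.log(ratio))

    # Hypothetical Ar I lines (approximate literature-style constants):
    line_a = (696.5, 6.39e6, 3, 13.33)   # 696.5 nm
    line_b = (750.4, 4.45e7, 1, 13.48)   # 750.4 nm
    T = two_line_temperature(I1=0.54, I2=1.00, line1=line_a, line2=line_b)
    print(f"estimated plasma temperature: {T:.0f} K")
    ```

    In practice, a multi-line Boltzmann plot (a least-squares fit over many peaks) is more robust than a single line pair, which is sensitive to intensity-calibration errors.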

  20. Determining the Number of Factors in P-Technique Factor Analysis

    Science.gov (United States)

    Lo, Lawrence L.; Molenaar, Peter C. M.; Rovine, Michael

    2017-01-01

    Determining the number of factors is a critical first step in exploratory factor analysis. Although various criteria and methods for determining the number of factors have been evaluated in the usual between-subjects R-technique factor analysis, there is still question of how these methods perform in within-subjects P-technique factor analysis. A…

  1. Replication protocol analysis: a method for the study of real-world design thinking

    DEFF Research Database (Denmark)

    Galle, Per; Kovacs, L. B.

    1996-01-01

    Given the brief of an architectural competition on site planning, and the design awarded the first prize, the first author (trained as an architect but not a participant in the competition) produced a line of reasoning that might have led from brief to design. In the paper, such ‘design replication’ is refined into a method called ‘replication protocol analysis’ (RPA), and discussed from a methodological perspective of design research. It is argued that for the study of real-world design thinking this method offers distinct advantages over traditional ‘design protocol analysis’, which seeks to capture the designer’s authentic line of reasoning. To illustrate how RPA can be used, the site planning case is briefly presented, and part of the replicated line of reasoning analysed. One result of the analysis is a glimpse of a ‘logic of design’; another is an insight which sheds new light on Darke’s classical...

  2. 10th Australian conference on nuclear techniques of analysis. Proceedings

    International Nuclear Information System (INIS)

    1998-01-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis, hosted by the Australian National University in Canberra, Australia, from 24-26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume

  3. 10th Australian conference on nuclear techniques of analysis. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-06-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis, hosted by the Australian National University in Canberra, Australia, from 24-26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume.

  4. Contrast-Enhanced MR Angiography (CEMRA) in Peripheral Arterial Occlusive Disease (PAOD): conventional moving table technique versus hybrid technique

    International Nuclear Information System (INIS)

    Kalle, T. von; Gerlach, A.; Hatopp, A.; Klinger, S.; Prodehl, P.; Arlat, I.P.

    2004-01-01

    Patients and Methods: 80 patients (60 males, 20 females, median age 70 years, 27 diabetics) with PAOD were examined with a 1.5 T system (40 mT/m) using a dedicated phased-array peripheral vascular coil. Protocol A consisted of a single injection of Gd-BOPTA with consecutive craniocaudal image acquisition, and protocol B of two injections, with the first injection of Gd-BOPTA followed by image acquisition of the popliteocrural and pedal segments and the second injection followed by acquisition of the aortoiliac and femoral segments (hybrid technique). The evaluation of the arterial system was directed to the iliac, femoral, popliteocrural and pedal arteries. Results: The visualization of the entire aortopedal vascular system was of diagnostically good or satisfactory quality in 16 of 40 patients using protocol A and in 29 of 40 patients using protocol B (iliac 40 vs. 37, femoral 40 vs. 40, popliteocrural 35 vs. 37, pedal 16 vs. 29); without the pedal station, the number increased to 35 of 40 patients for both protocols. The reason for the diagnostic limitations was arteriovenous overlap in 24 of 80 cases, with 19 of 40 cases for protocol A and 5 of 40 for protocol B, located exclusively in the cruropedal region. Conclusion: Moving-table hybrid CEMRA is superior to the conventional craniocaudal technique, producing less venous overlap of arteries, and is especially more suitable for the diagnostic evaluation of the cruropedal region. (orig.)

  5. Medium Access Control Protocols for Cognitive Radio Ad Hoc Networks: A Survey

    Directory of Open Access Journals (Sweden)

    Mahdi Zareei

    2017-09-01

    Full Text Available New wireless network paradigms will demand higher spectrum use and availability to cope with emerging data-hungry devices. Traditional static spectrum allocation policies cause spectrum scarcity, and new paradigms such as Cognitive Radio (CR), together with new protocols and techniques, need to be developed in order to achieve efficient spectrum usage. Medium Access Control (MAC) protocols are accountable for recognizing free spectrum, scheduling available resources and coordinating the coexistence of heterogeneous systems and users. This paper provides an ample review of the state-of-the-art MAC protocols, mainly focusing on Cognitive Radio Ad Hoc Networks (CRAHN). First, a description of the fundamental cognitive radio functions is presented. Next, MAC protocols are divided into three groups based on their channel access mechanism, namely time-slotted protocols, random access protocols and hybrid protocols. For each group, a detailed and comprehensive explanation of the latest MAC protocols is presented, as well as the pros and cons of each protocol. A discussion on future challenges for CRAHN MAC protocols is included, with a comparison of the protocols from a functional perspective.

  6. Improving the efficiency of single and multiple teleportation protocols based on the direct use of partially entangled states

    Energy Technology Data Exchange (ETDEWEB)

    Fortes, Raphael; Rigolin, Gustavo, E-mail: rigolin@ifi.unicamp.br

    2013-09-15

    We push the limits of the direct use of partially entangled pure states to perform quantum teleportation by presenting several protocols, in many different scenarios, that achieve the optimal efficiency possible. We review and put into a single formalism the three major strategies known to date that allow one to use partially entangled states for direct quantum teleportation (no distillation strategies permitted) and compare their efficiencies in real-world implementations. We show how one can improve the efficiency of many direct teleportation protocols by combining these techniques. We then develop new teleportation protocols employing multipartite partially entangled states, where the three techniques are also used in order to achieve the highest efficiency possible. Finally, we prove the upper bound for the optimal success rate for protocols based on partially entangled Bell states and show that some of the protocols developed here achieve such a bound. -- Highlights: •Optimal direct teleportation protocols using partially entangled states directly. •We put all strategies of direct teleportation into a single formalism. •We extend these techniques to multipartite partially entangled states. •We give upper bounds for the optimal efficiency of these protocols.
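
    For orientation on what such an upper bound looks like: for a Bell-type channel a|00⟩ + b|11⟩ (a, b real and normalized), the known maximal probability of faithful probabilistic teleportation is 2·min(|a|², |b|²). The sketch below simply evaluates that bound (an illustration of the general result, not the authors' specific protocols):

    ```python
    import math

    def optimal_success_probability(a: float, b: float) -> float:
        """Upper bound on faithful teleportation success for the channel
        a|00> + b|11| with a, b real and a^2 + b^2 = 1: p = 2*min(a,b)^2."""
        assert abs(a * a + b * b - 1) < 1e-9, "state must be normalized"
        return 2 * min(a, b) ** 2

    # Maximally entangled channel: teleportation succeeds with certainty.
    p_bell = optimal_success_probability(1 / math.sqrt(2), 1 / math.sqrt(2))
    # Partially entangled channel with amplitudes sqrt(0.8), sqrt(0.2):
    p_partial = optimal_success_probability(math.sqrt(0.8), math.sqrt(0.2))
    print(p_bell, p_partial)
    ```

    The bound is reached, for example, by converting the partially entangled channel to a Bell state via the Procrustean (local filtering) method before the standard protocol.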

  7. Performance Analysis of the IEEE 802.11p Multichannel MAC Protocol in Vehicular Ad Hoc Networks.

    Science.gov (United States)

    Song, Caixia

    2017-12-12

    Vehicular Ad Hoc Networks (VANETs) employ multiple channels to provide a variety of safety and non-safety applications, based on the IEEE 802.11p and IEEE 1609.4 protocols. The safety applications require timely and reliable transmissions, while the non-safety applications require efficiency and high throughput. In the IEEE 1609.4 protocol, the operating interval is divided into alternating Control Channel (CCH) and Service Channel (SCH) intervals of identical length. During the CCH interval, nodes transmit safety-related messages and control messages, and the Enhanced Distributed Channel Access (EDCA) mechanism is employed to allow four Access Categories (ACs) within a station, with different priorities according to their criticality for the vehicle's safety. During the SCH interval, the non-safety messages are transmitted. An analytical model is proposed in this paper to evaluate the performance, reliability and efficiency of the IEEE 802.11p and IEEE 1609.4 protocols. The proposed model improves on existing work by taking several aspects and the character of multichannel switching into design consideration. Extensive performance evaluations based on analysis and simulation help to validate the accuracy of the proposed model, analyze the capabilities and limitations of the IEEE 802.11p and IEEE 1609.4 protocols, and provide enhancement suggestions.
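
    The alternating schedule described above divides each UTC-synchronized sync interval (100 ms in the standard's default) into a 50 ms CCH interval followed by a 50 ms SCH interval, each beginning with a short guard interval during which nodes retune (4 ms is a commonly cited value; treat the exact figures here as assumptions of the sketch). The resulting channel-state function can be sketched as:

    ```python
    SYNC_MS = 100   # sync interval: one CCH interval + one SCH interval
    CCH_MS = 50     # control channel interval (starts each sync interval)
    GUARD_MS = 4    # guard interval at the start of each channel interval

    def channel_state(t_ms: float) -> tuple[str, bool]:
        """Return (channel, in_guard) for a UTC-aligned time in milliseconds,
        following the alternating access scheme of IEEE 1609.4."""
        offset = t_ms % SYNC_MS
        if offset < CCH_MS:
            return "CCH", offset < GUARD_MS
        return "SCH", (offset - CCH_MS) < GUARD_MS

    for t in (2, 30, 52, 80):
        ch, guard = channel_state(t)
        print(t, ch, "guard" if guard else "usable")
    ```

    This time structure is what couples the two protocols in the analytical model: safety traffic contends under EDCA only during the usable part of the CCH interval.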

  8. Recursion Versus Replication in Simple Cryptographic Protocols

    DEFF Research Database (Denmark)

    Hüttel, Hans; Srba, Jiri

    2005-01-01

    We use some very recent techniques from process algebra to draw interesting conclusions about the well studied class of ping-pong protocols introduced by Dolev and Yao. In particular we show that all nontrivial properties, including reachability and equivalence checking wrt. the whole van Glabbee...

  9. Decidability Issues for Extended Ping-Pong Protocols

    DEFF Research Database (Denmark)

    Huttel, Hans; Srba, Jiri

    2006-01-01

    We use some recent techniques from process algebra to draw several conclusions about the well studied class of ping-pong protocols introduced by Dolev and Yao. In particular we show that all nontrivial properties, including reachability and equivalence checking wrt. the whole van Glabbeek...

  10. Review and classification of variability analysis techniques with clinical applications

    Science.gov (United States)

    2011-01-01

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature, promote a shared vocabulary that would improve the exchange of ideas, and the analyses of the results between different studies. We conclude with challenges for the evolving science of variability analysis. PMID:21985357

  11. Review and classification of variability analysis techniques with clinical applications.

    Science.gov (United States)

    Bravi, Andrea; Longtin, André; Seely, Andrew J E

    2011-10-10

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature, promote a shared vocabulary that would improve the exchange of ideas, and the analyses of the results between different studies. We conclude with challenges for the evolving science of variability analysis.

  12. Protocol optimization in chest CT scans of child

    Energy Technology Data Exchange (ETDEWEB)

    Abrao L, L. T.; Amaral de O, F.; Prata M, A. [Biomedical Engineering Center, Centro Federal de Educacao Tecnologica de Minas Gerais, 30421-169, Belo Horizonte, Minas Gerais (Brazil); Bustos F, M., E-mail: luanaabrao@gmail.com [Universidad Federal de Minas Gerais, Department of Nuclear Engineering, Av. Pres. Antonio Carlos 6627, Pampulha, 31270-901 Belo Horizonte, Minas Gerais (Brazil)

    2017-10-15

The dissemination of Computed Tomography (CT), a radiodiagnostic technique, has significantly increased patient dose. In recent years the use of this technique has grown strongly, driven by medical emergencies, neoplasms and pediatric traumas. Dose measurement is important for correlating with the deleterious effects of radiation on the organism; future radiation effects are related to stochastic risks arising from tissue radiosensitivity combined with the life expectancy of the child. In this work, a cylindrical phantom made of polymethylmethacrylate (PMMA), representing an adult chest, was used, and a newborn chest phantom with an oblong shape was developed based on the dimensions of a typical newborn. In a GE Discovery CT scanner with 64 channels, the central slice of each phantom was irradiated successively to obtain dose measurements with a pencil ionization chamber. From these measurements the volume CT dose index (CTDIvol) was calculated. The radiological service chest protocol, using a voltage of 120 kV, was applied to scan 10 cm of the central region of the adult and newborn phantoms in helical mode. Images were acquired with this service protocol for comparison with the optimized protocol. For the newborn phantom, optimized protocols at 120 and 80 kV were also used; the 80 kV voltage yielded the lowest dose index for the pediatric phantom. This work allowed comparison of the doses absorbed by the pediatric phantom as the X-ray tube voltage was changed, and the observed dose variation shows how important specific protocols are for children. (Author)

  13. Protocol optimization in chest CT scans of child

    International Nuclear Information System (INIS)

    Abrao L, L. T.; Amaral de O, F.; Prata M, A.; Bustos F, M.

    2017-10-01

The dissemination of Computed Tomography (CT), a radiodiagnostic technique, has significantly increased patient dose. In recent years the use of this technique has grown strongly, driven by medical emergencies, neoplasms and pediatric traumas. Dose measurement is important for correlating with the deleterious effects of radiation on the organism; future radiation effects are related to stochastic risks arising from tissue radiosensitivity combined with the life expectancy of the child. In this work, a cylindrical phantom made of polymethylmethacrylate (PMMA), representing an adult chest, was used, and a newborn chest phantom with an oblong shape was developed based on the dimensions of a typical newborn. In a GE Discovery CT scanner with 64 channels, the central slice of each phantom was irradiated successively to obtain dose measurements with a pencil ionization chamber. From these measurements the volume CT dose index (CTDIvol) was calculated. The radiological service chest protocol, using a voltage of 120 kV, was applied to scan 10 cm of the central region of the adult and newborn phantoms in helical mode. Images were acquired with this service protocol for comparison with the optimized protocol. For the newborn phantom, optimized protocols at 120 and 80 kV were also used; the 80 kV voltage yielded the lowest dose index for the pediatric phantom. This work allowed comparison of the doses absorbed by the pediatric phantom as the X-ray tube voltage was changed, and the observed dose variation shows how important specific protocols are for children. (Author)

  14. A numerical technique for reactor subchannel analysis

    International Nuclear Information System (INIS)

    Fath, Hassan E.S.

    1983-01-01

    A numerical technique is developed for the solution of the transient boundary layer equations with a moving liquid-vapour interface boundary. The technique uses the finite difference method with the velocity components defined over an Eulerian mesh. A system of interface massless markers is defined where the markers move with the flow field according to a simple kinematic relation between the interface geometry and the fluid velocity. Different applications of nuclear engineering interest are reported with some available results. The present technique is capable of predicting the interface profile near the wall which is important in the reactor subchannel analysis
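The kinematic marker update described above can be sketched as a forward-Euler step of dx/dt = u(x, y), dy/dt = v(x, y) for each massless marker. The shear velocity field and marker seeding below are illustrative assumptions, not the paper's boundary-layer solution:

```python
# Massless interface markers advected by the local fluid velocity.
def advect(markers, u, v, dt, steps):
    """Forward-Euler advection of (x, y) marker positions."""
    for _ in range(steps):
        markers = [(x + u(x, y) * dt, y + v(x, y) * dt) for x, y in markers]
    return markers

# Illustrative wall shear flow: velocity grows with distance from the wall.
u = lambda x, y: 0.1 * y
v = lambda x, y: 0.0

# Markers seeded along the interface near the wall (hypothetical positions).
markers = [(0.0, 0.1 * k) for k in range(5)]
print(advect(markers, u, v, dt=0.01, steps=100))
```

In the actual scheme the velocity components come from the finite-difference solution on the Eulerian mesh, interpolated to each marker position.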

  15. The Security Analysis of Two-Step Quantum Direct Communication Protocol in Collective-Rotation Noise Channel

    International Nuclear Information System (INIS)

    Li Jian; Sun Feng-Qi; Pan Ze-Shi; Nie Jin-Rui; Chen Yan-Hua; Yuan Kai-Guo

    2015-01-01

To analyze the security of the two-step quantum direct communication protocol (QDCP) using Einstein–Podolsky–Rosen pairs, proposed by Deng et al. [Phys. Rev. A 68 (2003) 042317], in a collective-rotation noise channel, a model of noise analysis is proposed. In the security analysis the entropy-theory method is introduced, and an error-rate threshold point Q0 (M : (Q0, 1.0)) is given. At different noise levels, if Eve wants to obtain the same amount of information, the error rate Q is distinguishable: the larger the noise level ϵ, the larger the error rate Q. When the noise level ϵ is lower than 11%, the highest error rate without eavesdropping is 0.153. Finally, the security of the proposed protocol is discussed. It turns out that the quantum channel is safe when Q < 0.153; conversely, if the error rate Q > 0.153 = Q0, the eavesdropping information I > 1, which means that eavesdroppers exist in the quantum channel and the channel is no longer safe. (paper)

  16. Analysis of protection spanning-tree protocol

    Directory of Open Access Journals (Sweden)

    Б.Я. Корнієнко

    2007-01-01

Full Text Available  The extraordinarily rapid development of IT gives rise to vulnerabilities and, consequently, to attacks that exploit them. This is why new information security systems must be devised, and existing ones improved, after the fact or even in advance. This article concerns the Spanning-Tree Protocol, a vivid example of a case in which the cure for one vulnerability creates a dozen new "weak spots".

  17. Multiparametric multidetector computed tomography scanning on suspicion of hyperacute ischemic stroke: validating a standardized protocol

    Directory of Open Access Journals (Sweden)

    Felipe Torres Pacheco

    2013-06-01

Full Text Available Multidetector computed tomography (MDCT) scanning has enabled the early diagnosis of hyperacute brain ischemia. We aimed at validating a standardized protocol to read and report MDCT techniques in a series of adult patients. The inter-observer agreement among the trained examiners was tested, and their results were compared with a standard reading. No false positives were observed, and an almost perfect agreement (Kappa>0.81 was documented when the CT angiography (CTA and cerebral perfusion CT (CPCT map data were added to the noncontrast CT (NCCT analysis. The inter-observer agreement was higher for highly trained readers, corroborating the need for specific training to interpret these modern techniques. The authors recommend adding CTA and CPCT to the NCCT analysis in order to clarify the global analysis of structural and hemodynamic brain abnormalities. Our structured report is suitable as a script for the reproducible analysis of the MDCT of patients with suspected ischemic stroke.

  18. Artificial intelligence techniques used in respiratory sound analysis--a systematic review.

    Science.gov (United States)

    Palaniappan, Rajkumar; Sundaraj, Kenneth; Sundaraj, Sebastian

    2014-02-01

    Artificial intelligence (AI) has recently been established as an alternative method to many conventional methods. The implementation of AI techniques for respiratory sound analysis can assist medical professionals in the diagnosis of lung pathologies. This article highlights the importance of AI techniques in the implementation of computer-based respiratory sound analysis. Articles on computer-based respiratory sound analysis using AI techniques were identified by searches conducted on various electronic resources, such as the IEEE, Springer, Elsevier, PubMed, and ACM digital library databases. Brief descriptions of the types of respiratory sounds and their respective characteristics are provided. We then analyzed each of the previous studies to determine the specific respiratory sounds/pathology analyzed, the number of subjects, the signal processing method used, the AI techniques used, and the performance of the AI technique used in the analysis of respiratory sounds. A detailed description of each of these studies is provided. In conclusion, this article provides recommendations for further advancements in respiratory sound analysis.

  19. Optimized exosome isolation protocol for cell culture supernatant and human plasma

    Directory of Open Access Journals (Sweden)

    Richard J. Lobb

    2015-07-01

Full Text Available Extracellular vesicles represent a rich source of novel biomarkers in the diagnosis and prognosis of disease. However, there is currently limited information elucidating the most efficient methods for obtaining high yields of pure exosomes, a subset of extracellular vesicles, from cell culture supernatant and complex biological fluids such as plasma. To this end, we comprehensively characterize a variety of exosome isolation protocols for their efficiency, yield and purity of isolated exosomes. Repeated ultracentrifugation steps can reduce the quality of exosome preparations leading to lower exosome yield. We show that concentration of cell culture conditioned media using ultrafiltration devices results in increased vesicle isolation when compared to traditional ultracentrifugation protocols. However, our data on using conditioned media isolated from the Non-Small-Cell Lung Cancer (NSCLC) SK-MES-1 cell line demonstrates that the choice of concentrating device can greatly impact the yield of isolated exosomes. We find that centrifuge-based concentrating methods are more appropriate than pressure-driven concentrating devices and allow the rapid isolation of exosomes from both NSCLC cell culture conditioned media and complex biological fluids. In fact, to date, no protocol detailing exosome isolation utilizing current commercial methods from both cells and patient samples has been described. Utilizing tunable resistive pulse sensing and protein analysis, we provide a comparative analysis of 4 exosome isolation techniques, indicating their efficacy and preparation purity. Our results demonstrate that current precipitation protocols for the isolation of exosomes from cell culture conditioned media and plasma provide the least pure preparations of exosomes, whereas size exclusion isolation is comparable to density gradient purification of exosomes. We have identified current shortcomings in common extracellular vesicle isolation methods and provide a

  20. Automated thermal mapping techniques using chromatic image analysis

    Science.gov (United States)

    Buck, Gregory M.

    1989-01-01

    Thermal imaging techniques are introduced using a chromatic image analysis system and temperature sensitive coatings. These techniques are used for thermal mapping and surface heat transfer measurements on aerothermodynamic test models in hypersonic wind tunnels. Measurements are made on complex vehicle configurations in a timely manner and at minimal expense. The image analysis system uses separate wavelength filtered images to analyze surface spectral intensity data. The system was initially developed for quantitative surface temperature mapping using two-color thermographic phosphors but was found useful in interpreting phase change paint and liquid crystal data as well.

  1. A fresh look at the freeze-all protocol: a SWOT analysis.

    Science.gov (United States)

    Blockeel, Christophe; Drakopoulos, Panagiotis; Santos-Ribeiro, Samuel; Polyzos, Nikolaos P; Tournaye, Herman

    2016-03-01

    The 'freeze-all' strategy with the segmentation of IVF treatment, namely with the use of a GnRH antagonist protocol, GnRH agonist triggering, the elective cryopreservation of all embryos by vitrification and a frozen-thawed embryo transfer in a subsequent cycle, has become more popular. However, the approach still encounters drawbacks. In this opinion paper, a SWOT (strengths, weaknesses, opportunities and threats) analysis sheds light on the different aspects of this strategy. © The Author 2016. Published by Oxford University Press on behalf of the European Society of Human Reproduction and Embryology. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  2. A secure key agreement protocol based on chaotic maps

    International Nuclear Information System (INIS)

    Wang Xing-Yuan; Luan Da-Peng

    2013-01-01

To guarantee the security of communication in the public channel, many key agreement protocols have been proposed. Recently, Gong et al. proposed a key agreement protocol based on chaotic maps with password sharing. In this paper, Gong et al.'s protocol is analyzed, and we find that it exhibits key management issues and potential security problems. Furthermore, the paper presents a new key agreement protocol based on enhanced Chebyshev polynomials to overcome these problems. Our analysis shows that our key agreement protocol not only provides mutual authentication and the ability to resist a variety of common attacks, but also solves the key management and security problems existing in Gong et al.'s protocol
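Chebyshev-polynomial key agreement rests on the semigroup property T_a(T_b(x)) = T_b(T_a(x)) = T_ab(x), which plays the role that commutative exponentiation plays in Diffie–Hellman. A minimal real-valued sketch (the enhanced protocols in this line of work operate over a finite field instead; the seed and private exponents below are arbitrary illustrative values):

```python
import math

def chebyshev(n, x):
    """Chebyshev polynomial of the first kind: T_n(x) = cos(n*arccos(x)), x in [-1, 1]."""
    return math.cos(n * math.acos(x))

x = 0.53          # public seed value (illustrative)
a, b = 37, 91     # private exponents of the two parties (illustrative)

ta = chebyshev(a, x)      # Alice sends T_a(x)
tb = chebyshev(b, x)      # Bob sends T_b(x)

k_alice = chebyshev(a, tb)   # Alice computes T_a(T_b(x))
k_bob = chebyshev(b, ta)     # Bob computes T_b(T_a(x))
assert abs(k_alice - k_bob) < 1e-9   # both arrive at T_{ab}(x)
print(round(k_alice, 6))
```

The real-valued form shown here is known to be insecure on its own (the keyspace is continuous and attackable), which is precisely why protocols such as the one in this paper move to enhanced Chebyshev polynomials over a finite field.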

  3. Modular techniques for dynamic fault-tree analysis

    Science.gov (United States)

    Patterson-Hine, F. A.; Dugan, Joanne B.

    1992-01-01

    It is noted that current approaches used to assess the dependability of complex systems such as Space Station Freedom and the Air Traffic Control System are incapable of handling the size and complexity of these highly integrated designs. A novel technique for modeling such systems which is built upon current techniques in Markov theory and combinatorial analysis is described. It enables the development of a hierarchical representation of system behavior which is more flexible than either technique alone. A solution strategy which is based on an object-oriented approach to model representation and evaluation is discussed. The technique is virtually transparent to the user since the fault tree models can be built graphically and the objects defined automatically. The tree modularization procedure allows the two model types, Markov and combinatoric, to coexist and does not require that the entire fault tree be translated to a Markov chain for evaluation. This effectively reduces the size of the Markov chain required and enables solutions with less truncation, making analysis of longer mission times possible. Using the fault-tolerant parallel processor as an example, a model is built and solved for a specific mission scenario and the solution approach is illustrated in detail.

  4. A model-guided symbolic execution approach for network protocol implementations and vulnerability detection.

    Science.gov (United States)

    Wen, Shameng; Meng, Qingkun; Feng, Chao; Tang, Chaojing

    2017-01-01

    Formal techniques have been devoted to analyzing whether network protocol specifications violate security policies; however, these methods cannot detect vulnerabilities in the implementations of the network protocols themselves. Symbolic execution can be used to analyze the paths of the network protocol implementations, but for stateful network protocols, it is difficult to reach the deep states of the protocol. This paper proposes a novel model-guided approach to detect vulnerabilities in network protocol implementations. Our method first abstracts a finite state machine (FSM) model, then utilizes the model to guide the symbolic execution. This approach achieves high coverage of both the code and the protocol states. The proposed method is implemented and applied to test numerous real-world network protocol implementations. The experimental results indicate that the proposed method is more effective than traditional fuzzing methods such as SPIKE at detecting vulnerabilities in the deep states of network protocol implementations.
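The model-guided idea can be illustrated with a toy version: abstract the protocol as a finite state machine, then search the model for the shortest message sequence that drives the implementation into a deep state, and let the symbolic executor replay that prefix concretely before forking symbolically. The states and messages below are hypothetical, not taken from the paper:

```python
from collections import deque

# Hypothetical FSM of a toy handshake protocol: (state, message) -> next state.
FSM = {
    ("INIT", "HELLO"): "NEGOTIATING",
    ("NEGOTIATING", "KEYX"): "AUTHENTICATING",
    ("AUTHENTICATING", "AUTH_OK"): "ESTABLISHED",
    ("ESTABLISHED", "DATA"): "ESTABLISHED",
    ("ESTABLISHED", "CLOSE"): "CLOSED",
}

def guide_sequence(start, target):
    """BFS over the FSM: shortest message sequence reaching the target state.
    A symbolic executor can replay this prefix concretely, then explore
    symbolically only from the deep state it reaches."""
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        state, path = queue.popleft()
        if state == target:
            return path
        for (s, msg), nxt in FSM.items():
            if s == state and nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + [msg]))
    return None

print(guide_sequence("INIT", "ESTABLISHED"))
```

This is only the guidance half; the coverage gains reported in the paper come from combining such model-derived prefixes with an actual symbolic execution engine on the implementation.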

  5. Key-space analysis of double random phase encryption technique

    Science.gov (United States)

    Monaghan, David S.; Gopinathan, Unnikrishnan; Naughton, Thomas J.; Sheridan, John T.

    2007-09-01

    We perform a numerical analysis on the double random phase encryption/decryption technique. The key-space of an encryption technique is the set of possible keys that can be used to encode data using that technique. In the case of a strong encryption scheme, many keys must be tried in any brute-force attack on that technique. Traditionally, designers of optical image encryption systems demonstrate only how a small number of arbitrary keys cannot decrypt a chosen encrypted image in their system. However, this type of demonstration does not discuss the properties of the key-space nor refute the feasibility of an efficient brute-force attack. To clarify these issues we present a key-space analysis of the technique. For a range of problem instances we plot the distribution of decryption errors in the key-space indicating the lack of feasibility of a simple brute-force attack.
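The scheme under analysis can be sketched in one dimension: the plaintext is multiplied by a random phase mask, Fourier transformed, multiplied by a second random mask, and inverse transformed; decryption reverses the steps. A pure-Python sketch with a naive DFT (the seeds standing in for keys are illustrative), showing that a wrong Fourier-plane key produces a large decryption error of the kind the key-space analysis maps out:

```python
import cmath
import random

N = 16  # signal length (illustrative)

def dft(x, inverse=False):
    """Naive discrete Fourier transform; inverse divides by N."""
    s = 1 if inverse else -1
    out = [sum(x[n] * cmath.exp(s * 2j * cmath.pi * k * n / N) for n in range(N))
           for k in range(N)]
    return [v / N for v in out] if inverse else out

def phase_mask(seed):
    """Random unit-modulus phase mask derived from a seed (the 'key')."""
    rng = random.Random(seed)
    return [cmath.exp(2j * cmath.pi * rng.random()) for _ in range(N)]

def encrypt(f, key1, key2):
    m1, m2 = phase_mask(key1), phase_mask(key2)
    spectrum = dft([fi * p for fi, p in zip(f, m1)])
    return dft([si * p for si, p in zip(spectrum, m2)], inverse=True)

def decrypt(c, key1, key2):
    m1, m2 = phase_mask(key1), phase_mask(key2)
    spectrum = dft(c)
    field = dft([si / p for si, p in zip(spectrum, m2)], inverse=True)
    return [abs(fi / p) for fi, p in zip(field, m1)]

f = [1.0 if 5 <= i < 11 else 0.0 for i in range(N)]  # toy plaintext
c = encrypt(f, key1=42, key2=7)
good = decrypt(c, 42, 7)
bad = decrypt(c, 42, 8)   # wrong Fourier-plane key
mse = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) / N
print(mse(f, good), mse(f, bad))
```

Plotting such decryption errors over many candidate keys, as the paper does for the full 2-D technique, characterizes the distribution of errors across the key-space.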

  6. [Analysis of palliative sedation in hospitalised elderly patients: Effectiveness of a protocol].

    Science.gov (United States)

    Mateos-Nozal, Jesús; García-Cabrera, Lorena; Montero Errasquín, Beatriz; Cruz-Jentoft, Alfonso José; Rexach Cano, Lourdes

    2016-01-01

To measure changes in the practice of palliative sedation during agony in hospitalised elderly patients before and after the implementation of a palliative sedation protocol. A retrospective before-after study was performed in hospitalised patients over 65 years old who received midazolam during hospital admission and died in the hospital in two 3-month periods, before and after the implementation of the protocol. Non-sedative uses of midazolam and patients in intensive care were excluded. Patient and admission characteristics, the consent process, withdrawal of life-sustaining treatments, and the sedation process (refractory symptom treated, drug doses, assessment and use of other drugs) were recorded. Association was analysed using the chi-squared and Student t tests. A total of 143 patients were included, with no significant differences between groups in demographic characteristics or symptoms. Do not resuscitate (DNR) orders were recorded in approximately 70% of the subjects of each group, and informed consent for sedation was recorded in 91% before vs. 84% after the protocol. Induction and maintenance doses of midazolam followed protocol recommendations in 1.3% before vs. 10.4% after the protocol was implemented (P=.02), and adequate rescue doses were used in 1.3% vs. 11.9% respectively (P=.01). Midazolam doses were significantly lower after the protocol (9.86 mg vs. 18.67 mg). A sedation score was used in 8% vs. 12% of cases, and the Palliative Care Team was involved in 35.5% and 16.4% of the cases (P=.008) before and after the protocol, respectively. Use of midazolam improved slightly after the implementation of a hospital protocol on palliative sedation. The percentage of adequate sedations and the general process of sedation were mostly unchanged by the protocol. More education and further assessment are needed to gauge the effect of these measures in the future. Copyright © 2015 SEGG. Published by Elsevier Espana. All rights reserved.

  7. TECHNIQUE OF THE STATISTICAL ANALYSIS OF INVESTMENT APPEAL OF THE REGION

    Directory of Open Access Journals (Sweden)

    А. А. Vershinina

    2014-01-01

Full Text Available This scientific article presents a technique for the statistical analysis of a region's investment appeal with respect to foreign direct investment. The technique of statistical analysis is defined, its stages are described, and the mathematical-statistical tools are considered.

  8. Real-Time QoS Routing Protocols in Wireless Multimedia Sensor Networks: Study and Analysis.

    Science.gov (United States)

    Alanazi, Adwan; Elleithy, Khaled

    2015-09-02

Many routing protocols have been proposed for wireless sensor networks. These routing protocols are almost always based on energy efficiency. However, recent advances in complementary metal-oxide semiconductor (CMOS) cameras and small microphones have led to the development of Wireless Multimedia Sensor Networks (WMSNs) as a class of wireless sensor networks that pose additional challenges. The transmission of imaging and video data needs routing protocols with both energy efficiency and Quality of Service (QoS) characteristics in order to guarantee the efficient use of the sensor nodes and effective access to the collected data. Moreover, with the integration of real-time applications in Wireless Sensor Networks (WSNs), the use of QoS routing protocols is not only becoming a significant topic but is also gaining the attention of researchers. In designing an efficient QoS routing protocol, guaranteeing reliability and end-to-end delay while conserving energy is critical. Thus, considerable research has focused on designing energy-efficient and robust QoS routing protocols. In this paper, we present a state-of-the-art survey of the real-time QoS routing protocols that have been proposed for WMSNs. This paper categorizes the real-time QoS routing protocols into probabilistic and deterministic protocols. In addition, both categories are classified into soft and hard real-time protocols, highlighting the QoS issues including the limitations and features of each protocol. Furthermore, we compare the performance of mobility-aware query-based real-time QoS routing protocols from each category using Network Simulator-2 (NS2). This paper also focuses on design challenges and future research directions, and highlights the characteristics of each QoS routing protocol.

  9. 75 FR 53273 - Federal Aquatic Nuisance Species Research Risk Analysis Protocol

    Science.gov (United States)

    2010-08-31

    ... Aquatic Nuisance Species Task Force (ANSTF). The Protocol is available for public review and comment... the draft revised Protocol are available on the ANSTF website, http://anstaskforce.gov/documents.php... nonindigenous species (ANS) and is designed to reduce the risk that research activities may cause introduction...

  10. Analysis of the radius and diameter protocols in terms of pricing telecommunication services

    Directory of Open Access Journals (Sweden)

    Vesna M. Radonjić

    2013-06-01

Full Text Available The accounting of telecommunication services is closely related to the functions of authentication and authorization. These functions are usually considered together and implemented within the same server using a common protocol. The best-known protocols for authentication, authorization and accounting are RADIUS and Diameter.   AAA functions and related protocols   This chapter presents the accounting management architecture developed by the IETF. It includes the interaction between network elements, accounting servers, and billing and charging servers. Accounting data can be used for management, planning and charging of users, as well as for other specific purposes. Authentication is the process of confirming a user's digital identity, usually through some type of identifier and related data. Authorization determines whether a particular entity is authorized to perform an activity.   Basic Functions of the RADIUS Protocol   The RADIUS architecture is based on a client-server model and uses UDP at the transport layer. Transactions between the client and the server are authenticated by means of a shared secret key that is never sent over the network. Given the limited resources available to network devices, RADIUS facilitates and centralizes the charging of end users, provides some protection against active attacks by unauthorized users, and enjoys broad support from network equipment vendors. Although RADIUS is a widely accepted protocol for the mechanisms of authentication, authorization and accounting, it has certain shortcomings that may be caused by the protocol itself or by poor implementations.   Architecture and Operation of the Diameter Protocol   Diameter is a scalable protocol designed by an IETF working group in order to eliminate the shortcomings and functional limitations of RADIUS and eventually replace it in the near future. Most of the basic Diameter mechanisms and its
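The RADIUS shared secret enters authentication through an MD5 construction: per RFC 2865, the Response Authenticator is the MD5 hash of the packet header (Code, Identifier, Length), the Request Authenticator, the attributes, and the secret, so the secret itself never crosses the network. A sketch (the packet field values are hypothetical):

```python
import hashlib
import struct

def response_authenticator(code, ident, length, request_auth, attributes, secret):
    """RFC 2865 Response Authenticator:
    MD5(Code + Identifier + Length + RequestAuth + Attributes + Secret)."""
    header = struct.pack("!BBH", code, ident, length)  # 4-byte RADIUS header
    return hashlib.md5(header + request_auth + attributes + secret).digest()

# Hypothetical values: an Access-Accept (code 2) with no attributes,
# so Length = 4 (header) + 16 (authenticator) = 20.
req_auth = bytes(range(16))          # Request Authenticator from the client
secret = b"shared-secret"            # never transmitted
auth = response_authenticator(2, 1, 20, req_auth, b"", secret)
print(auth.hex())
```

A RADIUS client recomputes this hash on every reply and drops packets whose authenticator does not match, which is the (limited) integrity protection the abstract alludes to when noting RADIUS's shortcomings.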

  11. Development of fault diagnostic technique using reactor noise analysis

    International Nuclear Information System (INIS)

    Park, Jin Ho; Kim, J. S.; Oh, I. S.; Ryu, J. S.; Joo, Y. S.; Choi, S.; Yoon, D. B.

    1999-04-01

The ultimate goal of this project is to establish an analysis technique for diagnosing the integrity of reactor internals using reactor noise. Reactor noise analysis techniques for PWR and CANDU NPPs (Nuclear Power Plants) were established, by which the dynamic characteristics of reactor internals and SPND instrumentation can be identified, and a noise database for each plant (both Korean and foreign) was constructed and compared. The changes in the dynamic characteristics of the Ulchin 1 and 2 reactor internals were also simulated under presumed fault conditions. Additionally, a portable reactor noise analysis system was developed so that real-time noise analysis can be performed directly at the plant site. The reactor noise analysis techniques developed, together with the database obtained from the fault simulations, can be used to establish a knowledge-based expert system for diagnosing abnormal NPP conditions, and the portable reactor noise analysis system may be utilized as a substitute for the plant IVMS (Internal Vibration Monitoring System). (author)

  12. Development of fault diagnostic technique using reactor noise analysis

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin Ho; Kim, J. S.; Oh, I. S.; Ryu, J. S.; Joo, Y. S.; Choi, S.; Yoon, D. B

    1999-04-01

The ultimate goal of this project is to establish an analysis technique for diagnosing the integrity of reactor internals using reactor noise. Reactor noise analysis techniques for PWR and CANDU NPPs (Nuclear Power Plants) were established, by which the dynamic characteristics of reactor internals and SPND instrumentation can be identified, and a noise database for each plant (both Korean and foreign) was constructed and compared. The changes in the dynamic characteristics of the Ulchin 1 and 2 reactor internals were also simulated under presumed fault conditions. Additionally, a portable reactor noise analysis system was developed so that real-time noise analysis can be performed directly at the plant site. The reactor noise analysis techniques developed, together with the database obtained from the fault simulations, can be used to establish a knowledge-based expert system for diagnosing abnormal NPP conditions, and the portable reactor noise analysis system may be utilized as a substitute for the plant IVMS (Internal Vibration Monitoring System). (author)

  13. Samples and Sampling Protocols for Scientific Investigations | Joel ...

    African Journals Online (AJOL)

... from sampling, through sample preparation, calibration to final measurement and reporting. This paper therefore offers practical guidance on sampling protocols in line with best practice and international standards. Keywords: Sampling, sampling protocols, chain of custody, analysis, documentation ...

  14. An Overview and Analysis of Mobile Internet Protocols in Cellular Environments.

    Science.gov (United States)

    Chao, Han-Chieh

    2001-01-01

Notes that cellular is the inevitable future architecture for the personal communication service system. Discusses the current cellular support based on Mobile Internet Protocol version 6 (IPv6) and points out the shortfalls of using Mobile IP. Highlights protocols especially for mobile management schemes which can optimize a high-speed mobile…

  15. Multivariate Analysis Techniques for Optimal Vision System Design

    DEFF Research Database (Denmark)

    Sharifzadeh, Sara

The present thesis considers optimization of the spectral vision systems used for quality inspection of food items. The relationship between food quality, vision based techniques and spectral signature are described. The vision instruments for food analysis as well as datasets of the food items...... used in this thesis are described. The methodological strategies are outlined including sparse regression and pre-processing based on feature selection and extraction methods, supervised versus unsupervised analysis and linear versus non-linear approaches. One supervised feature selection algorithm...... (SSPCA) and DCT based characterization of the spectral diffused reflectance images for wavelength selection and discrimination. These methods together with some other state-of-the-art statistical and mathematical analysis techniques are applied on datasets of different food items: meat, dairy, fruits...

  16. Chromatographic Techniques for Rare Earth Elements Analysis

    Science.gov (United States)

    Chen, Beibei; He, Man; Zhang, Huashan; Jiang, Zucheng; Hu, Bin

    2017-04-01

    The present capability of rare earth element (REE) analysis has been achieved by the development of two instrumental techniques. The efficiency of spectroscopic methods was extraordinarily improved for the detection and determination of REE traces in various materials. On the other hand, the determination of REEs very often depends on the preconcentration and separation of REEs, and chromatographic techniques are very powerful tools for the separation of REEs. By coupling with sensitive detectors, many ambitious analytical tasks can be fulfilled. Liquid chromatography is the most widely used technique. Different combinations of stationary phases and mobile phases could be used in ion exchange chromatography, ion chromatography, ion-pair reverse-phase chromatography and some other techniques. The application of gas chromatography is limited because only volatile compounds of REEs can be separated. Thin-layer and paper chromatography are techniques that cannot be directly coupled with suitable detectors, which limit their applications. For special demands, separations can be performed by capillary electrophoresis, which has very high separation efficiency.

  17. The Development of Korea Additional Protocol System

    International Nuclear Information System (INIS)

    Shim, Hye Won; Yeo, Jin Kyun

    2008-01-01

    The Agreement between the Republic of Korea (ROK) and the IAEA for the Application of Safeguards in Connection with the Treaty on the Non-Proliferation of Nuclear Weapons (the Safeguards Agreement) entered into force on 14 November 1975. The Additional Protocol to the Safeguards Agreement (the Additional Protocol) was signed on 21 June 1999 and entered into force on 19 February 2004. ROK has been submitting annual updated reports of its initial declaration on every May 15th since August 2004. Additional protocol reports were submitted through the Protocol Reporter provided by the IAEA. Annual declarations were simply uploaded and stored in the Accounting Information Treatment System of KINAC, which did not provide data analysis and management functions. There have been demands for improvement to handle ever-increasing information. KAPS (Korea Additional Protocol System) has been developed to support and administer the additional protocol related work effectively. The new system enables integrated management, including composition of the additional protocol report and version control, periodic updates of related information, and results of IAEA complementary access to each facility

  18. [Clinical outcomes and economic analysis of two ovulation induction protocols in patients undergoing repeated IVF/ICSI cycles].

    Science.gov (United States)

    Chen, Xiao; Geng, Ling; Li, Hong

    2014-04-01

    To compare the clinical outcomes and cost-effectiveness of the luteal phase down-regulation with gonadotrophin-releasing hormone (GnRH) agonist protocol and the GnRH antagonist protocol in patients undergoing repeated in vitro fertilization and intracytoplasmic sperm injection (IVF-ICSI) cycles. A retrospective analysis of clinical outcomes and costs was conducted among 198 patients undergoing repeated IVF-ICSI cycles, including 109 receiving the luteal phase down-regulation with GnRH agonist protocol (group A) and 89 receiving the GnRH antagonist protocol (group B). The numbers of oocytes retrieved and good embryos, clinical pregnancy rate, abortion rate, live birth rate, mean total cost, and cost-effectiveness ratio were compared between the two groups. In patients undergoing repeated IVF-ICSI cycles, the two protocols produced no significant differences in the number of good embryos, clinical pregnancy rate, abortion rate, or twin pregnancy rate. Compared with group B, group A had better clinical outcomes, though the difference was not statistically significant. The number of retrieved oocytes was significantly greater and the live birth rate significantly higher in group A than in group B (9.13±4.98 vs 7.11±4.74, and 20.2% vs 9.0%, respectively). Compared with group B, group A had a higher mean total cost per cycle but lower costs for each oocyte retrieved (2729.11 vs 3038.60 RMB yuan), each good embryo (8867.19 vs 9644.85 RMB yuan), and each clinical pregnancy (77598.06 vs 96139.85 RMB yuan). For patients undergoing repeated IVF/ICSI cycles, the luteal phase down-regulation with GnRH agonist protocol produces good clinical outcomes with good cost-effectiveness in spite of an unsatisfactory ovarian reserve.

  19. New View of Ping-Pong Protocol Security

    International Nuclear Information System (INIS)

    Zawadzki Piotr

    2012-01-01

    The ping-pong protocol offers confidential transmission of classical information without a prior key agreement. It is believed to be quasi-secure in lossless quantum channels. Serious doubts related to the analysis paradigm used so far are presented in this study, and the security of the protocol is reconsidered. (general)

  20. ANTERIOR CRUCIATE LIGAMENT RECONSTRUCTION USING THE DOUBLE-BUNDLE TECHNIQUE – EVALUATION IN THE BIOMECHANICS LABORATORY

    OpenAIRE

    D'Elia, Caio Oliveira; Bitar, Alexandre Carneiro; Castropil, Wagner; Garofo, Antônio Guilherme Padovani; Cantuária, Anita Lopes; Orselli, Maria Isabel Veras; Luques, Isabela Ugo; Duarte, Marcos

    2011-01-01

    Objective: The objective of this study was to describe the methodology of knee rotation analysis using biomechanics laboratory instruments and to present the preliminary results from a comparative study on patients who underwent anterior cruciate ligament (ACL) reconstruction using the double-bundle technique. Methods: The protocol currently used in our laboratory was described. Three-dimensional kinematic analysis was performed and knee rotation amplitude was measured on eight normal patient...

  1. Identification of a research protocol to study orthodontic tooth movement

    Directory of Open Access Journals (Sweden)

    Annalisa Dichicco

    2014-06-01

    Full Text Available Aim: Orthodontic movement is associated with a process of tissue remodeling together with the release of several chemical mediators in periodontal tissues. Each mediator is a potential marker of tooth movement and expresses biological processes such as tissue inflammation and bone remodeling. Different amounts of each mediator are present in several tissues and fluids of the oral cavity, so there are different sampling methods with varying degrees of invasiveness. Chemical mediators are also substances of different molecular natures, and multiple kinds of analysis methods allow their detection. The purpose of this study was to draft the best research protocol for an optimal study of orthodontic movement efficiency. Methods: An analysis of the international literature was made to identify the gold standard for each aspect of the protocol: type of mediator, source and method of sampling, and analysis method. Results: From the analysis of the international literature, an original research protocol was created for the study and assessment of orthodontic movement using biomarkers of tooth movement. Conclusions: The protocol created is based on the choice of the gold standard for every aspect already analyzed in the literature and in existing protocols for monitoring orthodontic tooth movement through markers of tooth movement. Clinical trials are required for the evaluation and validation of the protocol.

  2. A model-guided symbolic execution approach for network protocol implementations and vulnerability detection.

    Directory of Open Access Journals (Sweden)

    Shameng Wen

    Full Text Available Formal techniques have been devoted to analyzing whether network protocol specifications violate security policies; however, these methods cannot detect vulnerabilities in the implementations of the network protocols themselves. Symbolic execution can be used to analyze the paths of the network protocol implementations, but for stateful network protocols, it is difficult to reach the deep states of the protocol. This paper proposes a novel model-guided approach to detect vulnerabilities in network protocol implementations. Our method first abstracts a finite state machine (FSM model, then utilizes the model to guide the symbolic execution. This approach achieves high coverage of both the code and the protocol states. The proposed method is implemented and applied to test numerous real-world network protocol implementations. The experimental results indicate that the proposed method is more effective than traditional fuzzing methods such as SPIKE at detecting vulnerabilities in the deep states of network protocol implementations.
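The core idea, abstracting a finite state machine and using it to steer exploration into deep protocol states, can be illustrated with a toy sketch. The FSM below is hypothetical (not from the paper): a shortest message sequence to a target state is computed first, and that sequence would then seed the symbolic-execution engine before it explores paths at the deep state.

```python
from collections import deque

# Hypothetical FSM for a toy stateful protocol: (state, message) -> next state.
FSM = {
    ("INIT", "HELLO"): "NEGOTIATING",
    ("NEGOTIATING", "KEY_EXCHANGE"): "AUTHENTICATING",
    ("AUTHENTICATING", "AUTH_OK"): "ESTABLISHED",
    ("ESTABLISHED", "DATA"): "ESTABLISHED",
    ("ESTABLISHED", "CLOSE"): "CLOSED",
}

def guide_sequence(start, target):
    """BFS over the FSM for the shortest message sequence reaching `target`.

    In a model-guided symbolic executor, this sequence constrains the
    symbolic inputs so the engine reaches the deep state quickly instead of
    getting lost in shallow paths.
    """
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        state, msgs = queue.popleft()
        if state == target:
            return msgs
        for (s, msg), nxt in FSM.items():
            if s == state and nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, msgs + [msg]))
    return None

print(guide_sequence("INIT", "ESTABLISHED"))
# → ['HELLO', 'KEY_EXCHANGE', 'AUTH_OK']
```

A fuzzer such as SPIKE mutates messages without this state map, which is why it rarely reaches states like ESTABLISHED; the model-guided approach reaches them by construction and then lets symbolic execution enumerate the paths there.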

  3. A novel cognitive palatability assessment protocol for dogs.

    Science.gov (United States)

    Araujo, J A; Milgram, N W

    2004-07-01

    Assessment of canine palatability is important for both the pet food and pharmaceutical industries; however, current palatability assessment protocols are limited in their utility. The most common technique, the two-pan test, does not control for the satiating effects of food and may not be useful for long-term palatability analysis because nutritional or caloric characteristics of the diets may interfere with the results. Furthermore, the large quantities of food consumed may be detrimental to the health of animals that do not self-limit their food intake. The purpose of this study was to determine whether a cognitive protocol could be used to determine food palatability in dogs. Five beagle dogs were trained on a three-choice object-discrimination learning task. After establishing object preferences, the preferred object was associated with no reward, a second object was associated with the dog's normal laboratory diet (Purina Agribrands Canine Lab Chow No. 5006; Agribrands Purina Canada, Inc., Woodstock, ON, Canada), and the third object was associated with a commercial (Hill's P/D; Hill's Pet Nutrition Inc., Topeka, KS) diet. In the discrimination-training phase, dogs were trained until they learned to avoid the no-reward object. They were subsequently given an additional 20 test sessions, which were used to determine food preference. In the reversal phase, which involved reversal learning, the object-food associations were modified, such that the object that had previously been associated with the Hill's P/D diet was now associated with the normal laboratory diet and vice versa. Once the dogs learned to avoid the no-reward object, they were tested for an additional 20 sessions. All subjects learned to avoid the no-reward object during the initial learning, and the number of choices of the object associated with the Hill's P/D diet was greater than the number of choices of the objects associated with the dry laboratory diet; this preference was maintained after the food-choice associations were reversed.

  4. Security of the arbitrated quantum signature protocols revisited

    International Nuclear Information System (INIS)

    Kejia, Zhang; Dan, Li; Qi, Su

    2014-01-01

    Recently, much attention has been paid to the study of arbitrated quantum signature (AQS). Among these studies, the cryptanalysis of some AQS protocols and a series of improved ideas have been proposed. Compared with the previous analysis, we present a security criterion, which can judge whether an AQS protocol is able to prevent the receiver (i.e. one participant in the signature protocol) from forging a legal signature. According to our results, it can be seen that most AQS protocols which are based on the Zeng and Keitel (ZK) model are susceptible to a forgery attack. Furthermore, we present an improved idea of the ZK protocol. Finally, some supplement discussions and several interesting topics are provided. (paper)

  5. Neutron activation analysis: an emerging technique for conservation/preservation

    International Nuclear Information System (INIS)

    Sayre, E.V.

    1976-01-01

    The diverse applications of neutron activation in the analysis, preservation, and documentation of art works and artifacts are described, with illustrations for each application. The uses of this technique to solve problems of attribution and authentication, to reveal the inner structure and composition of art objects, and, in some instances, to recreate details of the objects are described. A brief discussion of the theory and techniques of neutron activation analysis is also included.

  6. The general protocol for the S10 technique

    Directory of Open Access Journals (Sweden)

    Mircea Constantin Șora

    2016-12-01

    Full Text Available Plastination is a process of preservation of anatomical specimens by a delicate method of forced impregnation with curable polymers such as silicone, epoxy or polyester resins, with vast applications in medical fields of study. In this process, water and lipids in biological tissues are replaced by curable polymers (silicone, epoxy, polyester) which are hardened, resulting in dry, odorless and durable specimens. Today, after more than 30 years of its development, plastination is being applied in more than 400 departments of anatomy, pathology, forensic sciences and biology all over the world. The standard S10 silicone technique produces flexible, resilient and opaque specimens. After fixation and dehydration, the specimens are impregnated with silicone S10, and in the end the specimens are cured. The key element in plastination is the impregnation step, and therefore, depending on the polymer used, the optical quality of the specimens differs. The S10 silicone technique is the most common technique used in plastination. It is used worldwide by beginners as well as experienced plastinators. S10-plastinated specimens can be easily stored at room temperature and are non-toxic and odorless. S10 specimens can be used successfully, especially in teaching, as they are easy to handle and display a realistic topography. Plastinated specimens are also used for displaying whole bodies or body parts in exhibitions.

  7. Energy efficient routing protocols for wireless sensor networks: comparison and future directions

    Directory of Open Access Journals (Sweden)

    Loganathan Murukesan

    2017-01-01

    Full Text Available Wireless sensor networks consist of nodes with limited resources. Hence, it is important to design protocols or algorithms which increase energy efficiency in order to improve the network lifetime. In this paper, techniques used in the network (routing) layer of the internet protocol stack to achieve energy efficiency are reviewed. Usually, routing protocols are classified into four main schemes: (1) Network Structure, (2) Communication Model, (3) Topology Based, and (4) Reliable Routing. In this work, only network-structure-based routing protocols are reviewed due to the page constraint. Besides, this type of protocol is popular among researchers, since such protocols are fairly simple to implement and produce good results, as presented in this paper. The pros and cons of each protocol are also presented. Finally, the paper concludes with possible further research directions.
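A canonical example of a network-structure-based (hierarchical) protocol is LEACH, in which each node elects itself cluster head with a probability derived from the threshold T(n) = P / (1 − P·(r mod 1/P)), rotating the energy-hungry head role across the epoch. The sketch below is a generic LEACH illustration with hypothetical parameters, not code from the reviewed paper.

```python
import random

def leach_threshold(P, r, eligible):
    """LEACH cluster-head election threshold T(n) for round r.

    P is the desired cluster-head fraction; a node is `eligible` if it has
    not served as cluster head in the current epoch of 1/P rounds.
    """
    if not eligible:
        return 0.0
    return P / (1 - P * (r % round(1 / P)))

def elect_cluster_heads(node_ids, P, r, history, rng):
    # Each eligible node independently compares a uniform draw to T(n).
    heads = []
    for n in node_ids:
        if rng.random() < leach_threshold(P, r, n not in history):
            heads.append(n)
            history.add(n)
    return heads

rng = random.Random(42)
history = set()
for r in range(10):  # one epoch with P = 0.1 and 100 nodes
    elect_cluster_heads(range(100), 0.1, r, history, rng)
print(len(history))  # → 100: every node has served as head once per epoch
```

Because T(n) grows toward 1 as the epoch progresses, nodes that have not yet served are guaranteed to be elected by the final round, which is what equalizes energy drain across the network.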

  8. Diffraction analysis of customized illumination technique

    Science.gov (United States)

    Lim, Chang-Moon; Kim, Seo-Min; Eom, Tae-Seung; Moon, Seung Chan; Shin, Ki S.

    2004-05-01

    Various enhancement techniques such as alternating PSM, chrome-less phase lithography, double exposure, etc. have been considered as driving forces to push the production k1 factor below 0.35. Among them, layer-specific optimization of the illumination mode, the so-called customized illumination technique, has recently received considerable attention from lithographers. A new approach for illumination customization based on diffraction spectrum analysis is suggested in this paper. The illumination pupil is divided into various diffraction domains by comparing the similarity of the confined diffraction spectrum. The singular imaging property of each diffraction domain makes it easier to build and understand the customized illumination shape. By comparing the goodness of the image in each domain, it was possible to achieve the customized shape of illumination. With the help of this technique, it was found that a layout change would not necessarily change the shape of the customized illumination mode.

  9. SDL-Based Protocol Validation for the Integrated Safety Communication Network in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kim, Jung-hun; Kim, Dong-hoon; Lee, Dong-young; Park, Sung-woo

    2006-01-01

    The communication protocol in nuclear power plants needs to be validated systematically to avoid critical situations that may be caused by its own faults. We establish a methodology to validate the protocol designed for the Integrated Safety Communication Networks (ISCN) of the Korea Nuclear Instrumentation and Control System (KNICS). The ISCN protocol is specified using the formal description technique called SDL. The validation of the ISCN protocol is done via the Simulator and Validator, both of which are main functions provided by SDL.

  10. Decision Analysis Technique

    Directory of Open Access Journals (Sweden)

    Hammad Dabo Baba

    2014-01-01

    Full Text Available One of the most significant steps in building structure maintenance decisions is the physical inspection of the facility to be maintained. The physical inspection involves a cursory assessment of the structure and ratings of the identified defects based on expert evaluation. The objective of this paper is to present a novel approach to prioritizing the criticality of physical defects in a residential building system using a multi-criteria decision analysis approach. A residential building constructed in 1985 was considered in this study. Four criteria were considered in the inspection: Physical Condition of the building system (PC), Effect on Asset (EA), Effect on Occupants (EO) and Maintenance Cost (MC). The building was divided into nine systems, regarded as alternatives. Expert Choice software was used in comparing the importance of the criteria against the main objective, whereas a structured proforma was used in quantifying the defects observed on all building systems against each criterion. The defect severity score of each building system was identified and later multiplied by the weight of the criteria, and the final hierarchy was derived. The final ranking indicates that the electrical system was the most critical system, with a risk value of 0.134, while the ceiling system scored the lowest risk value of 0.066. The technique is often used in prioritizing mechanical equipment for maintenance planning. However, the results of this study indicate that the technique could also be used in prioritizing building systems for maintenance planning.
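The ranking step described, multiplying each system's defect severity scores by the criteria weights and summing, is a weighted-sum aggregation. The weights and severity scores below are hypothetical placeholders (the study's actual weights came from pairwise comparisons in Expert Choice); only the general method follows the abstract.

```python
# Hypothetical criteria weights (PC, EA, EO, MC) and normalized defect
# severity scores for three of the building systems, for illustration only.
weights = {"PC": 0.40, "EA": 0.25, "EO": 0.20, "MC": 0.15}

severity = {
    "electrical": {"PC": 0.9, "EA": 0.8, "EO": 0.9, "MC": 0.6},
    "roof":       {"PC": 0.6, "EA": 0.7, "EO": 0.5, "MC": 0.8},
    "ceiling":    {"PC": 0.3, "EA": 0.2, "EO": 0.4, "MC": 0.3},
}

def risk_value(scores, weights):
    # Weighted-sum aggregation: severity score times criterion weight.
    return sum(weights[c] * scores[c] for c in weights)

ranking = sorted(severity, key=lambda s: risk_value(severity[s], weights),
                 reverse=True)
print(ranking)  # → ['electrical', 'roof', 'ceiling']
```

With these illustrative numbers the electrical system again tops the ranking, mirroring the study's finding that it was the most critical system.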

  11. Impersonation attack on a quantum secure direct communication and authentication protocol with improvement

    Science.gov (United States)

    Amerimehr, Ali; Hadain Dehkordi, Massoud

    2018-03-01

    We analyze the security of a quantum secure direct communication and authentication protocol based on single photons. We first give an impersonation attack on the protocol. The cryptanalysis shows that there is a gap in the authentication procedure of the protocol, so that an opponent can reveal the secret information by an undetectable attempt. We then propose an improvement for the protocol and show that it closes the gap by applying a mutual authentication procedure. In the improved protocol, single photons are transmitted once in a session, so it is as easy to implement as the primary protocol. Furthermore, we use a novel technique for secret order rearrangement of photons by which not only is quantum storage eliminated but a secret key can also be reused securely. The new protocol is therefore applicable in practical settings such as embedded system devices.

  12. Breaking Megrelishvili protocol using matrix diagonalization

    Science.gov (United States)

    Arzaki, Muhammad; Triantoro Murdiansyah, Danang; Adi Prabowo, Satrio

    2018-03-01

    In this article we conduct a theoretical security analysis of Megrelishvili protocol—a linear algebra-based key agreement between two participants. We study the computational complexity of Megrelishvili vector-matrix problem (MVMP) as a mathematical problem that strongly relates to the security of Megrelishvili protocol. In particular, we investigate the asymptotic upper bounds for the running time and memory requirement of the MVMP that involves diagonalizable public matrix. Specifically, we devise a diagonalization method for solving the MVMP that is asymptotically faster than all of the previously existing algorithms. We also found an important counterintuitive result: the utilization of primitive matrix in Megrelishvili protocol makes the protocol more vulnerable to attacks.
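In one common power-based formulation of the Megrelishvili key agreement, a public vector v and matrix M over Z_p are fixed; each party applies its secret power of M, and the keys agree because powers of the same matrix commute. The toy parameters below are illustrative, not from the paper; the paper's diagonalization attack exploits exactly this structure when M is diagonalizable.

```python
# Toy sketch of the Megrelishvili vector-matrix key agreement.
P_MOD = 1_000_003  # small illustrative prime modulus

def mat_mul(A, B, p=P_MOD):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) % p
             for j in range(n)] for i in range(n)]

def mat_pow(M, e, p=P_MOD):
    # Square-and-multiply; powers of the same matrix commute, which is
    # what makes the two parties' keys agree.
    n = len(M)
    R = [[int(i == j) for j in range(n)] for i in range(n)]
    while e:
        if e & 1:
            R = mat_mul(R, M, p)
        M = mat_mul(M, M, p)
        e >>= 1
    return R

def vec_mat(v, M, p=P_MOD):
    n = len(v)
    return [sum(v[k] * M[k][j] for k in range(n)) % p for j in range(n)]

M = [[2, 3], [1, 4]]   # public matrix
v = [5, 7]             # public vector
a, b = 123, 456        # secret exponents of Alice and Bob

uA = vec_mat(v, mat_pow(M, a))   # Alice -> Bob
uB = vec_mat(v, mat_pow(M, b))   # Bob -> Alice
kA = vec_mat(uB, mat_pow(M, a))  # Alice's key: v·M^(a+b)
kB = vec_mat(uA, mat_pow(M, b))  # Bob's key:   v·M^(a+b)
print(kA == kB)  # → True
```

Recovering a from the public pair (v, v·M^a) is the Megrelishvili vector-matrix problem (MVMP) the article analyzes; diagonalizing M reduces it to independent discrete logarithms in Z_p, which is why primitive (diagonalizable) matrices weaken the protocol.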

  13. A Hybrid Analysis for Security Protocols with State

    Science.gov (United States)

    2014-07-16


  14. Using concurrent think-aloud and protocol analysis to explore student nurses' social learning information communication technology knowledge and skill development.

    Science.gov (United States)

    Todhunter, Fern

    2015-06-01

    Observations obtained through concurrent think-aloud and protocol analysis offer new understanding about the influence of social learning on student nurses' acquisition of Information and Communication Technology (ICT) knowledge and skills. The software used provides a permanent record of the underpinning study method, events and analyses. The emerging themes reflect the dimensions of social engagement, and the characteristics of positive and negative reactions to ICT. The evidence shows that given the right conditions, stronger learners will support and guide their peers. To explore the use of concurrent think-aloud and protocol analysis as a method to examine how student nurses approach ICT. To identify the benefits and challenges of using observational technology to capture learning behaviours. To show the influence of small group arrangement and student interactions on their ICT knowledge and skills development. Previous studies examining social interaction between students show how they work together and respond to interactive problem solving. Social interaction has been shown to enhance skills in both ICT and collaborative decision making. Structured observational analysis using concurrent think-aloud and protocol analysis. Students displayed varying degrees of pastoral support and emotional need, leadership, reflection, suggestion and experimentation skills. Encouraging student nurses to work in small mixed ability groups can be conducive for social and ICT skill and knowledge development. Observational software gives a permanent record of the proceedings. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. 48 CFR 15.404-1 - Proposal analysis techniques.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 false Proposal analysis techniques. 15.404-1 Section 15.404-1 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION... assistance of other experts to ensure that an appropriate analysis is performed. (6) Recommendations or...

  16. Development of a dynamic quality assurance testing protocol for multisite clinical trial DCE-CT accreditation

    Energy Technology Data Exchange (ETDEWEB)

    Driscoll, B. [Department of Radiation Physics, Princess Margaret Cancer Center, 610 University Avenue, Toronto, Ontario M5G 2M9 (Canada); Keller, H. [Department of Radiation Physics, Princess Margaret Cancer Center, 610 University Avenue, Toronto, Ontario M5G 2M9, Canada and Department of Radiation Oncology, University of Toronto, 150 College Street, Toronto, Ontario M5S 3E2 (Canada); Jaffray, D.; Coolens, C. [Department of Radiation Physics, Princess Margaret Cancer Center, 610 University Avenue, Toronto, Ontario M5G 2M9 (Canada); Department of Radiation Oncology, University of Toronto, 150 College Street, Toronto, Ontario M5S 3E2 (Canada); Techna Institute, University Health Network, 124-100 College Street, Toronto, Ontario M5G 1L5 (Canada)

    2013-08-15

    Purpose: Credentialing can have an impact on whether or not a clinical trial produces useful quality data that are comparable between various institutions and scanners. With the recent increase in dynamic contrast enhanced-computed tomography (DCE-CT) usage as a companion biomarker in clinical trials, effective quality assurance and control methods are required to ensure there is minimal deviation in the results between different scanners and protocols at various institutions. This paper attempts to address this problem by utilizing a dynamic flow imaging phantom to develop and evaluate a DCE-CT quality assurance (QA) protocol. Methods: A previously designed flow phantom, capable of producing predictable and reproducible time-concentration curves from contrast injection, was fully validated and then utilized to design a DCE-CT QA protocol. The QA protocol involved a set of quantitative metrics including injected and total mass error, as well as goodness-of-fit comparison to the known truth concentration curves. An additional region of interest (ROI) sensitivity analysis was also developed to provide additional detail on intrascanner variability and to determine appropriate ROI sizes for quantitative analysis. Both the QA protocol and the ROI sensitivity analysis were utilized to test variations in DCE-CT results using different imaging parameters (tube voltage and current) as well as alternate reconstruction methods and imaging techniques. The developed QA protocol and ROI sensitivity analysis were then applied at three institutions that were part of a clinical trial involving DCE-CT, and results were compared. Results: The inherent specificity and robustness of the phantom were determined through calculation of the total intraday variability, found to be less than 2.2 ± 1.1% (total calculated output contrast mass error) with a goodness of fit (R²) greater than 0.99 ± 0.0035 (n = 10). The DCE-CT QA protocol was capable of detecting significant deviations from
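The two quantitative metrics named, total mass error against the phantom's known truth curve and a goodness-of-fit R², can be sketched as follows. The time-concentration curves here are synthetic stand-ins for the phantom data, and the gain/offset error on the "measured" curve is an invented example.

```python
import math

# Synthetic truth and "measured" time-concentration curves standing in for
# the flow phantom's known-truth DCE-CT data (arbitrary units).
times = [0.5 * i for i in range(20)]                 # seconds
truth = [t * math.exp(-t / 3.0) for t in times]      # gamma-variate-like bolus
measured = [1.02 * c + 0.01 for c in truth]          # small gain and offset error

def total_mass(curve, times):
    # Trapezoidal area under the curve, proportional to total contrast mass.
    return sum((curve[i] + curve[i + 1]) / 2.0 * (times[i + 1] - times[i])
               for i in range(len(times) - 1))

def mass_error_pct(measured, truth, times):
    m_t = total_mass(truth, times)
    return 100.0 * abs(total_mass(measured, times) - m_t) / m_t

def r_squared(measured, truth):
    # Goodness of fit of the measured curve against the known truth curve.
    mean_t = sum(truth) / len(truth)
    ss_res = sum((m - t) ** 2 for m, t in zip(measured, truth))
    ss_tot = sum((t - mean_t) ** 2 for t in truth)
    return 1.0 - ss_res / ss_tot

print(f"mass error: {mass_error_pct(measured, truth, times):.2f}%")
print(f"R^2: {r_squared(measured, truth):.4f}")
```

In a QA run these metrics would be computed per scanner and protocol, and flagged when they drift beyond thresholds such as the intraday-variability bounds reported above.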

  17. An Authentication Protocol for Future Sensor Networks.

    Science.gov (United States)

    Bilal, Muhammad; Kang, Shin-Gak

    2017-04-28

    Authentication is one of the essential security services in Wireless Sensor Networks (WSNs) for ensuring secure data sessions. Sensor node authentication ensures the confidentiality and validity of data collected by the sensor node, whereas user authentication guarantees that only legitimate users can access the sensor data. In a mobile WSN, sensor and user nodes move across the network and exchange data with multiple nodes, thus experiencing the authentication process multiple times. The integration of WSNs with Internet of Things (IoT) brings forth a new kind of WSN architecture along with stricter security requirements; for instance, a sensor node or a user node may need to establish multiple concurrent secure data sessions. With concurrent data sessions, the frequency of the re-authentication process increases in proportion to the number of concurrent connections. Moreover, to establish multiple data sessions, it is essential that a protocol participant have the capability of running multiple instances of the protocol run, which makes the security issue even more challenging. The currently available authentication protocols were designed for the autonomous WSN and do not account for the above requirements. Hence, ensuring a lightweight and efficient authentication protocol has become more crucial. In this paper, we present a novel, lightweight and efficient key exchange and authentication protocol suite called the Secure Mobile Sensor Network (SMSN) Authentication Protocol. In the SMSN a mobile node goes through an initial authentication procedure and receives a re-authentication ticket from the base station. Later a mobile node can use this re-authentication ticket when establishing multiple data exchange sessions and/or when moving across the network. This scheme reduces the communication and computational complexity of the authentication process. 
We proved the strength of our protocol with rigorous security analysis (including formal analysis using the BAN logic).
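The ticket-based re-authentication idea, a base station issuing a token that a mobile node can later present without repeating the full handshake, can be sketched with a MAC-protected ticket. The field layout, key handling, and helper names below are illustrative assumptions, not the SMSN wire format.

```python
import hashlib
import hmac
import json
import time

BS_KEY = b"hypothetical-base-station-master-key"  # known only to the base station

def issue_ticket(node_id, lifetime_s=3600, now=None):
    # Base station: bind the node id and an expiry under an HMAC it can
    # verify later, so re-authentication needs no full handshake.
    now = time.time() if now is None else now
    body = {"node": node_id, "exp": now + lifetime_s}
    payload = json.dumps(body, sort_keys=True).encode()
    tag = hmac.new(BS_KEY, payload, hashlib.sha256).hexdigest()
    return {"body": body, "tag": tag}

def verify_ticket(ticket, now=None):
    # Re-authentication: check integrity first, then expiry.
    now = time.time() if now is None else now
    payload = json.dumps(ticket["body"], sort_keys=True).encode()
    expected = hmac.new(BS_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, ticket["tag"]):
        return False
    return ticket["body"]["exp"] > now

t = issue_ticket("sensor-17", now=1000.0)
print(verify_ticket(t, now=2000.0))   # within lifetime → True
print(verify_ticket(t, now=10**6))    # expired → False
```

A tampered ticket fails the constant-time MAC comparison, which is the property that lets the node present the ticket across multiple concurrent sessions without rerunning the heavier initial authentication.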

  18. Quantitative assessment of in-solution digestion efficiency identifies optimal protocols for unbiased protein analysis

    DEFF Research Database (Denmark)

    Leon, Ileana R; Schwämmle, Veit; Jensen, Ole N

    2013-01-01

    a combination of qualitative and quantitative LC-MS/MS methods and statistical data analysis. In contrast to previous studies we employed both standard qualitative as well as data-independent quantitative workflows to systematically assess trypsin digestion efficiency and bias using mitochondrial protein...... conditions (buffer, RapiGest, deoxycholate, urea), and two methods for removal of detergents prior to analysis of peptides (acid precipitation or phase separation with ethyl acetate). Our data-independent quantitative LC-MS/MS workflow quantified over 3700 distinct peptides with 96% completeness between all...... protocols and replicates, with an average 40% protein sequence coverage and an average of 11 peptides identified per protein. Systematic quantitative and statistical analysis of physicochemical parameters demonstrated that deoxycholate-assisted in-solution digestion combined with phase transfer allows...

  19. Study protocol

    DEFF Research Database (Denmark)

    Smith, Benjamin E; Hendrick, Paul; Bateman, Marcus

    2017-01-01

    avoidance behaviours, catastrophising, self-efficacy, sport and leisure activity participation, and general quality of life. Follow-up will be 3 and 6 months. The analysis will focus on descriptive statistics and confidence intervals. The qualitative components will follow a thematic analysis approach....... DISCUSSION: This study will evaluate the feasibility of running a definitive large-scale trial on patients with patellofemoral pain, within the NHS in the UK. We will identify strengths and weaknesses of the proposed protocol and the utility and characteristics of the outcome measures. The results from...... this study will inform the design of a multicentre trial. TRIAL REGISTRATION: ISRCTN35272486....

  20. Comprehensive protocol of traceability during IVF: the result of a multicentre failure mode and effect analysis.

    Science.gov (United States)

    Rienzi, L; Bariani, F; Dalla Zorza, M; Albani, E; Benini, F; Chamayou, S; Minasi, M G; Parmegiani, L; Restelli, L; Vizziello, G; Costa, A Nanni

    2017-08-01

    Can traceability of gametes and embryos be ensured during IVF? The use of a simple and comprehensive traceability system that includes the most susceptible phases of the IVF process minimizes the risk of mismatches. Mismatches in IVF are very rare but unfortunately possible, with dramatic consequences for both patients and health care professionals. Traceability is thus a fundamental aspect of the treatment. A clear process of patient and cell identification involving witnessing protocols has to be in place in every unit. To identify potential failures in the traceability process and to develop strategies to mitigate the risk of mismatches, failure mode and effects analysis (FMEA) has previously been used effectively. The FMEA approach is, however, a subjective analysis, strictly related to specific protocols, and thus the results are not always widely applicable. To reduce subjectivity and to obtain a widely applicable, comprehensive protocol of traceability, a multicentre, centrally coordinated FMEA was performed. Seven representative Italian centres (three public and four private) were selected. The study had a duration of 21 months (from April 2015 to December 2016) and was centrally coordinated by a team of experts: a risk analysis specialist, an expert embryologist and a specialist in human factors. The principal investigators of each centre were first instructed in proactive risk assessment and FMEA methodology. A multidisciplinary team to perform the FMEA analysis was then formed in each centre. After mapping the traceability process, each team identified the possible causes of mistakes in their protocol. A risk priority number (RPN) for each identified potential failure mode was calculated. The results of the FMEA analyses were centrally investigated and consistent corrective measures suggested. The teams performed new FMEA analyses after the recommended implementations. In each centre, this study involved: the laboratory director, the Quality Control & Quality
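The risk priority number at the heart of the FMEA step is conventionally the product of severity, occurrence and detectability ratings for each failure mode, with the highest RPNs addressed first. The failure modes and 1-10 ratings below are hypothetical examples, not the study's data.

```python
# Hypothetical IVF-traceability failure modes with 1-10 ratings for
# severity (S), occurrence (O) and detectability (D).
failure_modes = [
    ("dish labelled with wrong patient name",     9, 2, 3),
    ("witnessing step skipped at fertilization", 10, 1, 5),
    ("sample tube barcode unreadable",            4, 4, 2),
]

def rpn(severity, occurrence, detectability):
    # Risk priority number: S x O x D, each on a 1-10 scale.
    return severity * occurrence * detectability

ranked = sorted(failure_modes, key=lambda fm: rpn(*fm[1:]), reverse=True)
for name, s, o, d in ranked:
    print(f"RPN {rpn(s, o, d):3d}  {name}")
```

After corrective measures are implemented, the ratings are re-scored and RPNs recomputed, which is exactly the before/after comparison the multicentre teams performed.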

  1. Pregnancy outcome of “delayed start” GnRH antagonist protocol versus GnRH antagonist protocol in poor responders: A clinical trial study

    Directory of Open Access Journals (Sweden)

    Abbas Aflatoonian

    2017-08-01

    Full Text Available Background: Management of poor-responding patients is still a major challenge in assisted reproductive techniques (ART). The delayed-start GnRH antagonist protocol has been recommended for these patients, but little is known in this regard. Objective: The goal of this study was to assess the delayed-start GnRH antagonist protocol in poor responders and its in vitro fertilization (IVF) outcomes. Materials and Methods: This randomized clinical trial included sixty infertile women meeting the Bologna criteria for poor ovarian response who were candidates for IVF. In the case group (n=30), the delayed-start GnRH antagonist protocol consisted of estrogen priming followed by early follicular-phase GnRH antagonist treatment for 7 days before ovarian stimulation with gonadotropin. The control group (n=30) was treated with an estrogen-priming antagonist protocol. Finally, endometrial thickness and the rates of oocyte maturation, embryo formation, and pregnancy were compared between the two groups. Results: Rates of implantation and chemical, clinical, and ongoing pregnancy were higher in delayed-start cycles, although the differences were not statistically significant. Endometrial thickness was significantly greater in the case group. There were no statistically significant differences in the rates of oocyte maturation, embryo formation, or IVF outcomes between the two groups. Conclusion: There is no significant difference between the delayed-start GnRH antagonist protocol and the GnRH antagonist protocol.

  2. Applications Of Binary Image Analysis Techniques

    Science.gov (United States)

    Tropf, H.; Enderle, E.; Kammerer, H. P.

    1983-10-01

    After discussing the conditions under which binary image analysis techniques can be used, three new applications of the fast binary image analysis system S.A.M. (Sensorsystem for Automation and Measurement) are reported: (1) The human view direction is measured at TV frame rate while the subject's head remains freely movable. (2) Industrial parts hanging on a moving conveyor are classified prior to spray painting by a robot. (3) In automotive wheel assembly, the eccentricity of the wheel is minimized by turning the tyre relative to the rim in order to balance the eccentricity of the components.

  3. Successful implementation of a perioperative glycemic control protocol in cardiac surgery: barrier analysis and intervention using lean six sigma.

    Science.gov (United States)

    Martinez, Elizabeth A; Chavez-Valdez, Raul; Holt, Natalie F; Grogan, Kelly L; Khalifeh, Katherine W; Slater, Tammy; Winner, Laura E; Moyer, Jennifer; Lehmann, Christoph U

    2011-01-01

    Although the evidence strongly supports perioperative glycemic control among cardiac surgical patients, there is scant literature to describe the practical application of such a protocol in the complex ICU environment. This paper describes the use of the Lean Six Sigma methodology to implement a perioperative insulin protocol in a cardiac surgical intensive care unit (CSICU) in a large academic hospital. A preintervention chart audit revealed that fewer than 10% of patients were admitted to the CSICU with glucose <200 mg/dL, prompting the initiation of the quality improvement project. Following protocol implementation, more than 90% of patients were admitted with a glucose <200 mg/dL. Key elements to success include barrier analysis and intervention, provider education, and broadening the project scope to address the intraoperative period.

  4. Successful Implementation of a Perioperative Glycemic Control Protocol in Cardiac Surgery: Barrier Analysis and Intervention Using Lean Six Sigma

    Science.gov (United States)

    Martinez, Elizabeth A.; Chavez-Valdez, Raul; Holt, Natalie F.; Grogan, Kelly L.; Khalifeh, Katherine W.; Slater, Tammy; Winner, Laura E.; Moyer, Jennifer; Lehmann, Christoph U.

    2011-01-01

    Although the evidence strongly supports perioperative glycemic control among cardiac surgical patients, there is scant literature to describe the practical application of such a protocol in the complex ICU environment. This paper describes the use of the Lean Six Sigma methodology to implement a perioperative insulin protocol in a cardiac surgical intensive care unit (CSICU) in a large academic hospital. A preintervention chart audit revealed that fewer than 10% of patients were admitted to the CSICU with glucose <200 mg/dL, prompting the initiation of the quality improvement project. Following protocol implementation, more than 90% of patients were admitted with a glucose <200 mg/dL. Key elements to success include barrier analysis and intervention, provider education, and broadening the project scope to address the intraoperative period. PMID:22091218

  5. Successful Implementation of a Perioperative Glycemic Control Protocol in Cardiac Surgery: Barrier Analysis and Intervention Using Lean Six Sigma

    Directory of Open Access Journals (Sweden)

    Elizabeth A. Martinez

    2011-01-01

    Full Text Available Although the evidence strongly supports perioperative glycemic control among cardiac surgical patients, there is scant literature to describe the practical application of such a protocol in the complex ICU environment. This paper describes the use of the Lean Six Sigma methodology to implement a perioperative insulin protocol in a cardiac surgical intensive care unit (CSICU) in a large academic hospital. A preintervention chart audit revealed that fewer than 10% of patients were admitted to the CSICU with glucose <200 mg/dL, prompting the initiation of the quality improvement project. Following protocol implementation, more than 90% of patients were admitted with a glucose <200 mg/dL. Key elements to success include barrier analysis and intervention, provider education, and broadening the project scope to address the intraoperative period.

  6. A routine PET/CT protocol with simple calculations for assessing cardiac amyloid using 18F-Florbetapir

    Directory of Open Access Journals (Sweden)

    Dustin Ryan Osborne

    2015-05-01

    Full Text Available Introduction: Cardiac amyloidosis is a rare condition characterized by the deposition of well-structured protein fibrils, proteoglycans, and serum proteins as amyloid. Recent work has shown that it may be possible to use 18F-Florbetapir to image cardiac amyloidosis, whereas current methods for assessment include invasive biopsy techniques. This work extends foundational work by Dorbala et al. by developing a routine imaging and analysis protocol using 18F-Florbetapir for cardiac amyloid assessment. Methods: Ten patients, 3 healthy controls and 7 amyloid-positive patients, were imaged using 18F-Florbetapir to assess cardiac amyloid burden. Four of the patients were also imaged using 82Rb-Chloride to evaluate possible 18F-Florbetapir retention due to reduced myocardial blood flow. Quantitative methods using modeling, SUVs, and SUV ratios were used to define a new streamlined clinical imaging protocol that could be used routinely and provide patient stratification. Results: Quantitative analysis of 18F-Florbetapir cardiac amyloid data was compiled from a 20-minute listmode protocol with data histogrammed into two static images at 0-5 min and either 10-15 min or 15-20 min. Data analysis indicated that SUVs, or ratios of SUVs, calculated from regions drawn in the septal wall were adequate to distinguish all healthy controls from amyloid-positive patients in this small cohort. Additionally, we found that it may be possible to use this method to differentiate patients suffering from AL vs. TTR amyloid. Conclusions: This work builds on the seminal work by Dorbala et al. by describing a short 18F-Florbetapir imaging protocol that is suitable for routine clinical use and uses a simple method for quantitative analysis of cardiac amyloid disease.

  7. Techniques and Protocols for Dispersing Nanoparticle Powders in Aqueous Media—is there a Rationale for Harmonization?

    DEFF Research Database (Denmark)

    Hartmann, Nanna B.; Jensen, Keld Alstrup; Baun, Anders

    2015-01-01

    Selecting appropriate ways of bringing engineered nanoparticles (ENP) into aqueous dispersion is a main obstacle for testing, and thus for understanding and evaluating, their potential adverse effects to the environment and human health. Using different methods to prepare (stock) dispersions of the same ENP may be a source of variation in the toxicity measured. Harmonization and standardization of dispersion methods applied in mammalian and ecotoxicity testing are needed to ensure a comparable data quality and to minimize test artifacts produced by modifications of ENP during the dispersion… …scientific studies and from consensus reached in larger scale research projects and international organizations. A step-wise approach is proposed to develop tailored dispersion protocols for ecotoxicological and mammalian toxicological testing of ENP. The recommendations of this analysis may serve as a guide…

  8. Thermal/optical methods for elemental carbon quantification in soils and urban dusts: equivalence of different analysis protocols.

    Directory of Open Access Journals (Sweden)

    Yongming Han

    Full Text Available Quantifying elemental carbon (EC) content in geological samples is challenging due to interferences of crustal, salt, and organic material. Thermal/optical analysis, combined with acid pretreatment, represents a feasible approach. However, the consistency of various thermal/optical analysis protocols for this type of samples has never been examined. In this study, urban street dust and soil samples from Baoji, China were pretreated with acids and analyzed with four thermal/optical protocols to investigate how analytical conditions and optical correction affect EC measurement. The EC values measured with reflectance correction (ECR) were found always higher and less sensitive to temperature program than the EC values measured with transmittance correction (ECT). A high-temperature method with extended heating times (STN120) showed the highest ECT/ECR ratio (0.86) while a low-temperature protocol (IMPROVE-550), with heating time adjusted for sample loading, showed the lowest (0.53). STN ECT was higher than IMPROVE ECT, in contrast to results from aerosol samples. A higher peak inert-mode temperature and extended heating times can elevate ECT/ECR ratios for pretreated geological samples by promoting pyrolyzed organic carbon (PyOC) removal over EC under trace levels of oxygen. Considering that PyOC within filter increases ECR while decreases ECT from the actual EC levels, simultaneous ECR and ECT measurements would constrain the range of EC loading and provide information on method performance. Further testing with standard reference materials of common environmental matrices supports the findings. Char and soot fractions of EC can be further separated using the IMPROVE protocol. The char/soot ratio was lower in street dusts (2.2 on average) than in soils (5.2 on average), most likely reflecting motor vehicle emissions. The soot concentrations agreed with EC from CTO-375, a pure thermal method.

  9. Thermal/optical methods for elemental carbon quantification in soils and urban dusts: equivalence of different analysis protocols.

    Science.gov (United States)

    Han, Yongming; Chen, Antony; Cao, Junji; Fung, Kochy; Ho, Fai; Yan, Beizhan; Zhan, Changlin; Liu, Suixin; Wei, Chong; An, Zhisheng

    2013-01-01

    Quantifying elemental carbon (EC) content in geological samples is challenging due to interferences of crustal, salt, and organic material. Thermal/optical analysis, combined with acid pretreatment, represents a feasible approach. However, the consistency of various thermal/optical analysis protocols for this type of samples has never been examined. In this study, urban street dust and soil samples from Baoji, China were pretreated with acids and analyzed with four thermal/optical protocols to investigate how analytical conditions and optical correction affect EC measurement. The EC values measured with reflectance correction (ECR) were found always higher and less sensitive to temperature program than the EC values measured with transmittance correction (ECT). A high-temperature method with extended heating times (STN120) showed the highest ECT/ECR ratio (0.86) while a low-temperature protocol (IMPROVE-550), with heating time adjusted for sample loading, showed the lowest (0.53). STN ECT was higher than IMPROVE ECT, in contrast to results from aerosol samples. A higher peak inert-mode temperature and extended heating times can elevate ECT/ECR ratios for pretreated geological samples by promoting pyrolyzed organic carbon (PyOC) removal over EC under trace levels of oxygen. Considering that PyOC within filter increases ECR while decreases ECT from the actual EC levels, simultaneous ECR and ECT measurements would constrain the range of EC loading and provide information on method performance. Further testing with standard reference materials of common environmental matrices supports the findings. Char and soot fractions of EC can be further separated using the IMPROVE protocol. The char/soot ratio was lower in street dusts (2.2 on average) than in soils (5.2 on average), most likely reflecting motor vehicle emissions. The soot concentrations agreed with EC from CTO-375, a pure thermal method.
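
    The two comparisons made in this record are simple ratios: ECT/ECR per protocol, and char/soot under the IMPROVE convention (char commonly taken as the first EC temperature fraction minus pyrolyzed OC, soot as the sum of the higher fractions). A minimal sketch; the per-sample EC values are hypothetical, chosen only to reproduce the ratios quoted in the abstract:

```python
# Sketch: comparing EC from reflectance (ECR) vs transmittance (ECT)
# corrections across thermal/optical protocols. The (ECT, ECR) pairs are
# hypothetical values chosen to reproduce the reported ECT/ECR ratios.

protocol_ec = {
    # protocol: (mean ECT, mean ECR), e.g. in ug/cm^2
    "STN120": (4.3, 5.0),       # ECT/ECR = 0.86
    "IMPROVE-550": (2.65, 5.0),  # ECT/ECR = 0.53
}

for name, (ect, ecr) in protocol_ec.items():
    print(f"{name}: ECT/ECR = {ect / ecr:.2f}")

def char_soot_ratio(ec1: float, ec2: float, ec3: float, pyoc: float) -> float:
    """Char/soot split under the IMPROVE convention:
    char = EC1 - PyOC, soot = EC2 + EC3 (inputs hypothetical)."""
    char = ec1 - pyoc
    soot = ec2 + ec3
    return char / soot
```

    With this split, a lower char/soot ratio (as in the street dusts) points toward combustion sources richer in soot, such as motor vehicle exhaust.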

  10. Sensitivity analysis of hybrid thermoelastic techniques

    Science.gov (United States)

    W.A. Samad; J.M. Considine

    2017-01-01

    Stress functions have been used as a complementary tool to support experimental techniques, such as thermoelastic stress analysis (TSA) and digital image correlation (DIC), in an effort to evaluate the complete and separate full-field stresses of loaded structures. The need for such coupling between experimental data and stress functions is due to the fact that...

  11. TU-H-207A-09: An Automated Technique for Estimating Patient-Specific Regional Imparted Energy and Dose From TCM CT Exams Across 13 Protocols

    International Nuclear Information System (INIS)

    Sanders, J; Tian, X; Segars, P; Boone, J; Samei, E

    2016-01-01

    Purpose: To develop an automated technique for estimating patient-specific regional imparted energy and dose from tube current modulated (TCM) computed tomography (CT) exams across a diverse set of head and body protocols. Methods: A library of 58 adult computational anthropomorphic extended cardiac-torso (XCAT) phantoms was used to model a patient population. A validated Monte Carlo program was used to simulate TCM CT exams on the entire library of phantoms for three head and 10 body protocols. The net imparted energy to the phantoms, normalized by dose length product (DLP), and the net tissue mass in each of the scan regions were computed. A knowledgebase containing relationships between normalized imparted energy and scanned mass was established. An automated computer algorithm was written to estimate the scanned mass from actual clinical CT exams. The scanned mass estimate, DLP of the exam, and knowledgebase were used to estimate the imparted energy to the patient. The algorithm was tested on 20 chest and 20 abdominopelvic TCM CT exams. Results: The normalized imparted energy increased with increasing kV for all protocols. However, the normalized imparted energy was relatively unaffected by the strength of the TCM. The average imparted energy was 681 ± 376 mJ for abdominopelvic exams and 274 ± 141 mJ for chest exams. Overall, the method was successful in providing patient-specific estimates of imparted energy for 98% of the cases tested. Conclusion: Imparted energy normalized by DLP increased with increasing tube potential. However, the strength of the TCM did not have a significant effect on the net amount of energy deposited to tissue. The automated program can be implemented into the clinical workflow to provide estimates of regional imparted energy and dose across a diverse set of clinical protocols.
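
    The estimation pipeline described above reduces to a lookup: the knowledgebase relates scanned tissue mass to imparted energy normalized by DLP, and a patient's imparted energy is the exam DLP times the interpolated normalized value. A minimal sketch under that reading; the knowledgebase entries and exam values are hypothetical, not data from the abstract:

```python
# Sketch of imparted-energy estimation from a mass -> E/DLP knowledgebase.
# All numeric values are hypothetical illustrations.

import bisect

# (scanned mass in kg, imparted energy per unit DLP in mJ/(mGy*cm))
knowledgebase = [(10.0, 0.8), (20.0, 1.2), (30.0, 1.5), (40.0, 1.7)]

def normalized_energy(mass_kg: float) -> float:
    """Linear interpolation in the mass -> E/DLP knowledgebase,
    clamped at the table ends."""
    masses = [m for m, _ in knowledgebase]
    i = bisect.bisect_left(masses, mass_kg)
    if i == 0:
        return knowledgebase[0][1]
    if i == len(masses):
        return knowledgebase[-1][1]
    (m0, e0), (m1, e1) = knowledgebase[i - 1], knowledgebase[i]
    return e0 + (e1 - e0) * (mass_kg - m0) / (m1 - m0)

def imparted_energy(dlp_mgy_cm: float, mass_kg: float) -> float:
    """Imparted energy in mJ: exam DLP times the interpolated E/DLP."""
    return dlp_mgy_cm * normalized_energy(mass_kg)

# Hypothetical exam: DLP of 400 mGy*cm, 25 kg of scanned tissue.
print(f"{imparted_energy(400.0, 25.0):.1f} mJ")
```

    In the actual method the scanned mass itself is estimated automatically from the clinical exam, and separate knowledgebase relationships would be needed per tube potential, since the abstract reports that normalized imparted energy rises with kV.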

  12. A Concise Protocol for the Validation of Language ENvironment Analysis (LENA) Conversational Turn Counts in Vietnamese

    Science.gov (United States)

    Ganek, Hillary V.; Eriks-Brophy, Alice

    2018-01-01

    The aim of this study was to present a protocol for the validation of the Language ENvironment Analysis (LENA) System's conversational turn count (CTC) for Vietnamese speakers. Ten families of children aged between 22 and 42 months, recruited near Ho Chi Minh City, participated in this project. Each child wore the LENA audio recorder for a full…
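
    A CTC validation protocol of this kind typically compares the automated counts against human-coded counts for the same recording segments. A minimal sketch using a Pearson correlation; the counts below are hypothetical, not data from the study:

```python
# Sketch: validating automated conversational turn counts (CTC) against
# human-transcribed counts via Pearson correlation. Counts are hypothetical.

import math

lena_ctc  = [12, 30, 18, 45, 22, 38, 15, 27, 33, 20]   # automated, per segment
human_ctc = [14, 28, 20, 43, 25, 40, 13, 26, 35, 19]   # human-coded, same segments

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

print(f"r = {pearson_r(lena_ctc, human_ctc):.3f}")
```

    A high correlation alone does not rule out systematic over- or under-counting, so validation studies usually pair it with an agreement analysis of the absolute differences.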

  13. Security Protocols: Specification, Verification, Implementation, and Composition

    DEFF Research Database (Denmark)

    Almousa, Omar

    An important aspect of Internet security is the security of the cryptographic protocols it deploys. We need to make sure that such protocols achieve their goals, whether in isolation or in composition, i.e., security protocols must not suffer from any flaw that enables hostile intruders to break their security. Among others, tools like OFMC [MV09b] and Proverif [Bla01] are quite efficient for the automatic formal verification of a large class of protocols. These tools use different approaches such as symbolic model checking or static analysis. Either approach has its own pros and cons, and therefore, we… …results. The most important generalization is the support for all security properties of the geometric fragment proposed by [Gut14]…

  14. Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015 statement

    Directory of Open Access Journals (Sweden)

    Mireia Estarli

    2016-02-01

    Full Text Available Systematic reviews should build on a protocol that describes the rationale, hypothesis, and planned methods of the review; few reviews report whether a protocol exists. Detailed, well-described protocols can facilitate the understanding and appraisal of the review methods, as well as the detection of modifications to methods and selective reporting in completed reviews. We describe the development of a reporting guideline, the Preferred Reporting Items for Systematic reviews and Meta-Analyses for Protocols 2015 (PRISMA-P 2015). PRISMA-P consists of a 17-item checklist intended to facilitate the preparation and reporting of a robust protocol for the systematic review. Funders and those commissioning reviews might consider mandating the use of the checklist to facilitate the submission of relevant protocol information in funding applications. Similarly, peer reviewers and editors can use the guidance to gauge the completeness and transparency of a systematic review protocol submitted for publication in a journal or other medium. Translation with permission of the authors. The original authors have not revised and verified the Spanish translation, and they do not necessarily endorse it.

  15. Insect Venom Immunotherapy: Analysis of the Safety and Tolerance of 3 Buildup Protocols Frequently Used in Spain.

    Science.gov (United States)

    Gutiérrez Fernández, D; Moreno-Ancillo, A; Fernández Meléndez, S; Domínguez-Noche, C; Gálvez Ruiz, P; Alfaya Arias, T; Carballada González, F; Alonso Llamazares, A; Marques Amat, L; Vega Castro, A; Antolín Amérigo, D; Cruz Granados, S; Ruiz León, B; Sánchez Morillas, L; Fernández Sánchez, J; Soriano Gomis, V; Borja Segade, J; Dalmau Duch, G; Guspi Bori, R; Miranda Páez, A

    2016-01-01

    Hymenoptera venom immunotherapy (VIT) is an effective treatment but not one devoid of risk, as both local and systemic adverse reactions may occur, especially in the initial phases. We compared the tolerance to 3 VIT buildup protocols and analyzed risk factors associated with adverse reactions during this phase. We enrolled 165 patients divided into 3 groups based on the buildup protocol used (3, 4, and 9 weeks). The severity of systemic reactions was evaluated according to the World Allergy Organization model. Results were analyzed using exploratory descriptive statistics, and variables were compared using analysis of variance. Adverse reactions were recorded in 53 patients (32%) (43 local and 10 systemic). Local reactions were immediate in 27 patients (63%) and delayed in 16 (37%). The severity of the local reaction was slight/moderate in 15 patients and severe in 13. Systemic reactions were grade 1-2. No significant association was found between the treatment modality and the onset of local or systemic adverse reactions or the type of local reaction. We only found a statistically significant association between severity of the local reaction and female gender. As for the risk factors associated with systemic reactions during the buildup phase, we found no significant differences in values depending on the protocol used or the insect responsible. The buildup protocols compared proved to be safe and did not differ significantly from one another. In the population studied, patients undergoing the 9-week schedule presented no systemic reactions. Therefore, this protocol can be considered the safest approach.

  16. Fault tree analysis: concepts and techniques

    International Nuclear Information System (INIS)

    Fussell, J.B.

    1976-01-01

    Concepts and techniques of fault tree analysis have been developed over the past decade, and predictions from this type of analysis are now important considerations in the design of many systems such as aircraft, ships and their electronic systems, missiles, and nuclear reactor systems. Routine, hardware-oriented fault tree construction can be automated; however, considerable effort is needed in this area to bring the methodology to production status. When this status is achieved, the entire analysis of hardware systems will be automated except for the system definition step. Automated analysis is not undesirable; on the contrary, once verified on adequately complex systems, automated analysis could well become routine. It could also provide an excellent start for a more in-depth fault tree analysis that includes environmental effects, common mode failure, and human errors. Automated analysis is extremely fast and frees the analyst from routine hardware-oriented fault tree construction, as well as eliminating logic errors and errors of oversight in this part of the analysis. It thus affords the analyst a powerful tool, allowing his prime efforts to be devoted to unearthing more subtle aspects of the modes of failure of the system.
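
    The routine, hardware-oriented evaluation step can be sketched as AND/OR gates over independent basic events, with the top-event probability computed bottom-up. The tree and the failure probabilities below are hypothetical illustrations:

```python
# Minimal fault-tree evaluation sketch: AND/OR gates over independent
# basic events. Event names and probabilities are hypothetical.

def AND(*ps):
    """All inputs must fail: product of failure probabilities."""
    out = 1.0
    for p in ps:
        out *= p
    return out

def OR(*ps):
    """Any input failing suffices: 1 - product of survival probabilities."""
    out = 1.0
    for p in ps:
        out *= 1.0 - p
    return 1.0 - out

# Hypothetical basic-event failure probabilities.
pump_a, pump_b, power, operator = 1e-3, 1e-3, 1e-4, 5e-3

# Top event: loss of cooling = (both pumps fail) OR power fails OR operator error.
p_top = OR(AND(pump_a, pump_b), power, operator)
print(f"P(top event) ~ {p_top:.2e}")
```

    The independence assumption is exactly what the abstract flags as insufficient: common-mode failures and human errors correlate basic events, which is why they call for a deeper analysis beyond the automated hardware-oriented pass.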

  17. A transferability study of the EPR-tooth-dosimetry technique

    International Nuclear Information System (INIS)

    Sholom, S.; Chumak, V.; Desrosiers, M.; Bouville, A.

    2006-01-01

    The transferability of a measurement protocol from one laboratory to another is an important feature of any mature, standardised protocol. The electron paramagnetic resonance (EPR) tooth dosimetry technique that was developed at the Scientific Center for Radiation Medicine, AMS (Ukraine) (SCRM) for routine dosimetry of Chernobyl liquidators has demonstrated consistent results in several inter-laboratory measurement comparisons. Transferability to the EPR dosimetry laboratory at the National Institute of Standards and Technology (NIST) was examined. Several approaches were used to test the technique, including dose reconstruction of SCRM-NIST inter-comparison samples. The study demonstrated full transferability of the technique and the possibility of reproducing results in a different laboratory environment. (authors)

  18. Establishing a protocol for element determination in human nail clippings by neutron activation analysis

    International Nuclear Information System (INIS)

    Sanches, Thalita Pinheiro; Saiki, Mitiko

    2011-01-01

    Human nail samples have been analyzed to evaluate occupational exposure, nutritional status and to diagnose certain diseases. However, sampling and washing protocols for nail analyses vary from study to study, preventing comparisons between studies. One of the difficulties in analyzing nail samples is eliminating surface contamination alone without removing elements of interest from the tissue. In the present study, a protocol was defined in order to obtain reliable results for element concentrations in human nail clippings. Nail clippings collected from all 10 fingers or toes were first pre-cleaned with an ethyl alcohol solution to eliminate microbes. The clippings were then cut into small pieces and washed with different reagents by shaking. Neutron activation analysis (NAA) was applied for the nail sample analysis, which consisted of irradiating aliquots of samples together with synthetic elemental standards in the IEA-R1 nuclear research reactor, followed by gamma-ray spectrometry. Comparisons between the results obtained for nails submitted to different cleaning reagents indicated that the procedure using acetone and Triton X100 solution is more effective than that using nitric acid solution. Analyses in triplicate of a nail sample gave results with relative standard deviations lower than 15% for most elements, showing the homogeneity of the prepared sample. Qualitative analyses of different nail polishes showed that the presence of the elements determined in the present study is negligible in these products. Quality control of the analytical results indicated that the applied NAA procedure is adequate for human nail analysis. (author)

  19. Protocol to Manage Heritage-Building Interventions Using Heritage Building Information Modelling (HBIM)

    Directory of Open Access Journals (Sweden)

    Isabel Jordan-Palomar

    2018-03-01

    Full Text Available The workflow in historic architecture projects presents problems related to the lack of clarity of processes, dispersion of information and the use of outdated tools. Different heritage organisations have shown interest in innovative methods to resolve those problems and improve cultural tourism for sustainable economic development. Building Information Modelling (BIM) has emerged as a suitable computerised system for improving heritage management. Its application to historic buildings is named Historic BIM (HBIM). The HBIM literature highlights the need for further research in terms of the overall processes of heritage projects, its practical implementation and a need for better cultural documentation. This work uses Design Science Research to develop a protocol to improve the workflow in heritage interdisciplinary projects. Research techniques used include documentary analysis, semi-structured interviews and focus groups. HBIM is proposed as a virtual model that will hold heritage data and articulate processes. As a result, a simple and visual HBIM protocol was developed and applied in a real case study. The protocol was named BIMlegacy and it is divided into eight phases: building registration, determine intervention options, develop design for intervention, planning the physical intervention, physical intervention, handover, maintenance and culture dissemination. It contemplates all the stakeholders involved.

  20. Nonlinear analysis techniques of block masonry walls in nuclear power plants

    International Nuclear Information System (INIS)

    Hamid, A.A.; Harris, H.G.

    1986-01-01

    Concrete masonry walls have been used extensively in nuclear power plants as non-load bearing partitions serving as pipe supports, fire walls, radiation shielding barriers, and similar heavy construction separations. When subjected to earthquake loads, these walls should maintain their structural integrity. However, some of the walls do not meet design requirements based on working stress allowables. Consequently, utilities have used non-linear analysis techniques, such as the arching theory and the energy balance technique, to qualify such walls. This paper presents a critical review of the applicability of non-linear analysis techniques for both unreinforced and reinforced block masonry walls under seismic loading. These techniques are critically assessed in light of the performance of walls from limited available test data. It is concluded that additional test data are needed to justify the use of nonlinear analysis techniques to qualify block walls in nuclear power plants. (orig.)

  1. A protocol for analysing thermal stress in insects using infrared thermography.

    Science.gov (United States)

    Gallego, Belén; Verdú, José R; Carrascal, Luis M; Lobo, Jorge M

    2016-02-01

    The study of insect responses to thermal stress has involved a variety of protocols and methodologies that hamper the ability to compare results between studies. For that reason, the development of a protocol to standardize thermal assays is necessary. Infrared thermography solves some of these problems by allowing us to take continuous temperature measurements without handling the individuals, an important consideration for cold-blooded organisms like insects. Here, we present a working protocol based on infrared thermography to estimate both cold and heat thermal stress in insects. We analyse both the change in the body temperature of individuals and their behavioural response. In addition, we used partial least squares regression for the statistical analysis of our data, a technique that solves the problem of having a large number of variables and few individuals, allowing us to work with rare or endemic species. To test our protocol, we chose two species of congeneric, narrowly distributed dung beetles that are endemic to the southeastern part of the Iberian Peninsula. With our protocol we obtained five variables in the response to cold and twelve in the response to heat. With this methodology we discriminated between the two flightless species of Jekelius through their thermal responses. In response to cold, Jekelius hernandezi showed a higher cooling rate and reached higher temperatures of stupor and haemolymph freezing than Jekelius punctatolineatus. Both species displayed similar thermoregulation ranges before reaching lethal body temperature under heat stress. Overall, we have demonstrated that infrared thermography is a suitable method to assess insect thermal responses with a high degree of sensitivity, allowing for discrimination between closely related species. Copyright © 2016 Elsevier Ltd. All rights reserved.
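
    The appeal of partial least squares here is that it stays stable when thermal-response variables outnumber individuals, by regressing on a latent score rather than on all predictors directly. A minimal single-component PLS1 fit in pure Python, NIPALS-style; the data (4 individuals, 6 thermal-response variables, one response) are hypothetical:

```python
# Single-component PLS1 regression sketch (NIPALS-style).
# Data are hypothetical: rows are individuals, columns thermal-response variables.

def pls1_one_component(X, y):
    """Fit a one-component PLS regression; returns (coefficients, y mean, x means)."""
    n, p = len(X), len(X[0])
    # Center predictors and response.
    xm = [sum(row[j] for row in X) / n for j in range(p)]
    ym = sum(y) / n
    Xc = [[row[j] - xm[j] for j in range(p)] for row in X]
    yc = [v - ym for v in y]
    # Weight vector: covariance of each predictor with the response, normalized.
    w = [sum(Xc[i][j] * yc[i] for i in range(n)) for j in range(p)]
    norm = sum(v * v for v in w) ** 0.5
    w = [v / norm for v in w]
    # Latent scores, then regression of y on the single score t.
    t = [sum(Xc[i][j] * w[j] for j in range(p)) for i in range(n)]
    b = sum(ti * yi for ti, yi in zip(t, yc)) / sum(ti * ti for ti in t)
    return [b * wj for wj in w], ym, xm

def predict(coefs, ym, xm, x):
    """Predict the response for one individual's variable vector x."""
    return ym + sum(c * (xj - mj) for c, xj, mj in zip(coefs, x, xm))

X = [[1.0, 2.1, 0.5, 3.2, 1.1, 0.2],
     [1.4, 2.0, 0.7, 3.0, 1.3, 0.1],
     [0.9, 2.4, 0.4, 3.6, 0.9, 0.3],
     [1.2, 2.2, 0.6, 3.3, 1.0, 0.2]]
y = [10.0, 11.5, 9.0, 10.5]

coefs, ym, xm = pls1_one_component(X, y)
```

    With 6 predictors and 4 individuals an ordinary least-squares fit is underdetermined, whereas the single latent component still yields a usable coefficient vector; the study's actual analysis would extract more components and cross-validate their number.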

  2. Protocol of measurement techniques - Project colored solar collectors

    Energy Technology Data Exchange (ETDEWEB)

    Schueler, A.; Chambrier, E. De; Roecker, Ch.; Scartezzini, J.-L.

    2004-08-15

    This illustrated annual report for the Swiss Federal Office of Energy (SFOE) takes a look at work done at the Swiss Federal Institute of Technology in Lausanne, Switzerland, on multi-layer, thin-film interference coatings for solar collector glazing. The correct combinations of refractive indices and film thickness are discussed. The authors state that corresponding multi-layered thin film stacks will have to be realised experimentally in a controlled and reproducible way. New thin film materials are to be tailored to exhibit optimised optical and ageing properties. The development of these coatings is to be based on various measurement techniques, such as spectro-photometry, measurements of total power throughput by means of a solar simulator, spectroscopic ellipsometry, scanning electron microscopy (SEM), X-ray diffraction (XRD) and X-ray photoelectron spectroscopy (XPS). The paper provides many examples of typical data and explains which film properties can be inferred from each method and thus describes both the function and purpose of the different measurement techniques.

  3. Solar Cell Calibration and Measurement Techniques

    Science.gov (United States)

    Bailey, Sheila; Brinker, Dave; Curtis, Henry; Jenkins, Phillip; Scheiman, Dave

    2004-01-01

    The increasing complexity of space solar cells and the growing international markets for both cells and arrays have resulted in workshops jointly sponsored by NASDA, ESA and NASA. These workshops are designed to obtain international agreement on standardized values for the AM0 spectrum and constant, recommend laboratory measurement practices and establish a set of protocols for international comparison of laboratory measurements. A working draft of an ISO standard, WD15387, "Requirements for Measurement and Calibration Procedures for Space Solar Cells", was discussed with a focus on the scope of the document, a definition of a primary standard cell, and the required error analysis for all measurement techniques. Working groups addressed the issues of the Air Mass Zero (AM0) solar constant and spectrum, laboratory measurement techniques, and the international round-robin methodology. A summary is presented of the current state of each area and the formulation of the ISO document.

  4. Development of environmental sample analysis techniques for safeguards

    International Nuclear Information System (INIS)

    Magara, Masaaki; Hanzawa, Yukiko; Esaka, Fumitaka

    1999-01-01

JAERI has been developing environmental sample analysis techniques for safeguards and preparing a clean chemistry laboratory with clean rooms. The methods under development are a bulk analysis and a particle analysis. In the bulk analysis, an Inductively Coupled Plasma Mass Spectrometer or a Thermal Ionization Mass Spectrometer is used to measure nuclear materials after chemical treatment of the sample. In the particle analysis, an Electron Probe Micro Analyzer and a Secondary Ion Mass Spectrometer are used for elemental analysis and isotopic analysis, respectively. The design of the clean chemistry laboratory has been carried out and construction will be completed by the end of March, 2001. (author)

  5. Blocking Optimality in Distributed Real-Time Locking Protocols

    Directory of Open Access Journals (Sweden)

    Björn Bernhard Brandenburg

    2014-09-01

Full Text Available Lower and upper bounds on the maximum priority inversion blocking (pi-blocking) that is generally unavoidable in distributed multiprocessor real-time locking protocols (where resources may be accessed only from specific synchronization processors) are established. Prior work on suspension-based shared-memory multiprocessor locking protocols (which require resources to be accessible from all processors) has established asymptotically tight bounds of Ω(m) and Ω(n) maximum pi-blocking under suspension-oblivious and suspension-aware analysis, respectively, where m denotes the total number of processors and n denotes the number of tasks. In this paper, it is shown that, in the case of distributed semaphore protocols, there exist two different task allocation scenarios that give rise to distinct lower bounds. In the case of co-hosted task allocation, where application tasks may also be assigned to synchronization processors (i.e., processors hosting critical sections), Ω(Φ · n) maximum pi-blocking is unavoidable for some tasks under any locking protocol under both suspension-aware and suspension-oblivious schedulability analysis, where Φ denotes the ratio of the maximum response time to the shortest period. In contrast, in the case of disjoint task allocation (i.e., if application tasks may not be assigned to synchronization processors), only Ω(m) and Ω(n) maximum pi-blocking is fundamentally unavoidable under suspension-oblivious and suspension-aware analysis, respectively, as in the shared-memory case. These bounds are shown to be asymptotically tight with the construction of two new distributed real-time locking protocols that ensure O(m) and O(n) maximum pi-blocking under suspension-oblivious and suspension-aware analysis, respectively.

  6. Application of pattern recognition techniques to crime analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bender, C.F.; Cox, L.A. Jr.; Chappell, G.A.

    1976-08-15

The initial goal was to evaluate the capabilities of current pattern recognition techniques when applied to existing computerized crime data. Performance was to be evaluated both in terms of the system's capability to predict crimes and its ability to optimize police manpower allocation. A relation was sought to predict a crime's susceptibility to solution, based on knowledge of the crime type, location, time, etc. The preliminary results of this work are discussed. They indicate that automatic crime analysis involving pattern recognition techniques is feasible, and that efforts to determine optimum variables and techniques are warranted. 47 figures (RWR)

  7. Ocean Optics Protocols for Satellite Ocean Color Sensor Validation. Revised

    Science.gov (United States)

    Fargion, Giulietta S.; Mueller, James L.

    2000-01-01

The document stipulates protocols for measuring bio-optical and radiometric data for the Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) Project activities and algorithm development. This document supersedes the earlier version (Mueller and Austin 1995) published as Volume 25 in the SeaWiFS Technical Report Series. This document marks a significant departure from, and improvement on, the format and content of Mueller and Austin (1995). The authorship of the protocols has been greatly broadened to include experts specializing in some key areas. New chapters have been added to provide detailed and comprehensive protocols for stability monitoring of radiometers using portable sources, above-water measurements of remote-sensing reflectance, spectral absorption measurements for discrete water samples, HPLC pigment analysis and fluorometric pigment analysis. Protocols were included in Mueller and Austin (1995) for each of these areas, but the new treatment makes significant advances in each topic area. There are also new chapters prescribing protocols for calibration of sun photometers and sky radiance sensors, sun photometer and sky radiance measurements and analysis, and data archival. These topic areas were barely mentioned in Mueller and Austin (1995).

  8. Application status of on-line nuclear techniques in analysis of coal quality

    International Nuclear Information System (INIS)

    Cai Shaohui

    1993-01-01

Nuclear techniques are well suited to continuous on-line analysis because they are fast and non-intrusive, and they can operate in the adverse conditions of the coal industry. The paper reviews the application status of on-line nuclear techniques in the analysis of coal quality and the economic benefits derived from such techniques in developed countries.

  9. Protocol Monitoring Energy Conservation; Protocol Monitoring Energiebesparing

    Energy Technology Data Exchange (ETDEWEB)

Boonekamp, P.G.M. [ECN Beleidsstudies, Petten (Netherlands); Mannaerts, H. [Centraal Planbureau CPB, Den Haag (Netherlands); Tinbergen, W. [Centraal Bureau voor de Statistiek CBS, Den Haag (Netherlands); Vreuls, H.H.J. [Nederlandse onderneming voor energie en milieu Novem, Utrecht (Netherlands); Wesselink, B. [Rijksinstituut voor Volksgezondheid en Milieuhygiene RIVM, Bilthoven (Netherlands)

    2001-12-01

On request of the Dutch Ministry of Economic Affairs, five institutes have collaborated to create a 'Protocol Monitoring Energy Conservation', a common method and database to calculate the amount of energy savings realised in past years. The institutes concerned are the Central Bureau of Statistics (CBS), the Netherlands Bureau for Economic Policy Analysis (CPB), the Energy research Centre of the Netherlands (ECN), the National Agency for Energy and Environment (Novem) and the Netherlands Institute of Public Health and the Environment (RIVM). The institutes have agreed upon a clear definition of energy use and energy savings. The demarcation with renewable energy, the saving effects of substitution between energy carriers and the role of import and export of energy have been elaborated. A decomposition method is used to split the observed change in energy use into a number of effects, on both a national and a sectoral level. This method includes an analysis of growth effects, effects of structural changes in production and consumption activities, and savings in end use or through more efficient conversion processes. To calculate these effects the total energy use is disaggregated as much as possible. For each segment a reference energy use is calculated according to the trend in a variable that is supposed to be representative of the use without savings. The difference from the actual energy use is taken as the savings realised. Results are given for the sectors households, industry, agriculture, services and government, transportation and the energy sector, as well as a national figure. A special feature of the protocol method is the application of primary energy use figures in the determination of savings for end users. This means that the use of each energy carrier is increased by a certain amount, according to the conversion losses caused elsewhere in the energy system. The losses concern the base year energy sector and losses abroad for imports of secondary

  10. Magnetic separation techniques in sample preparation for biological analysis: a review.

    Science.gov (United States)

    He, Jincan; Huang, Meiying; Wang, Dongmei; Zhang, Zhuomin; Li, Gongke

    2014-12-01

Sample preparation is a fundamental and essential step in almost all analytical procedures, especially for the analysis of complex samples such as biological and environmental samples. In recent decades, owing to their superparamagnetism, good biocompatibility and high binding capacity, functionalized magnetic materials have been widely applied in various processes of sample preparation for biological analysis. In this paper, recent advances in magnetic separation techniques based on magnetic materials in the field of sample preparation for biological analysis are reviewed. The strategy of magnetic separation techniques is summarized. The synthesis, stabilization and bio-functionalization of magnetic nanoparticles are reviewed in detail, and the characterization of magnetic materials is also summarized. Moreover, the applications of magnetic separation techniques to the enrichment of proteins, nucleic acids, cells and bioactive compounds and to the immobilization of enzymes are described. Finally, open problems and likely future trends of magnetic separation techniques for biological analysis are discussed. Copyright © 2014 Elsevier B.V. All rights reserved.

  11. Power-Controlled MAC Protocols with Dynamic Neighbor Prediction for Ad hoc Networks

    Institute of Scientific and Technical Information of China (English)

    LI Meng; ZHANG Lin; XIAO Yong-kang; SHAN Xiu-ming

    2004-01-01

Energy and bandwidth are the scarce resources in ad hoc networks because most mobile nodes are battery-powered and share an exclusive wireless medium. Integrating power control into the MAC protocol is a promising technique to fully exploit these precious resources of ad hoc wireless networks. In this paper, a new intelligent power-controlled Medium Access Control (iMAC) protocol with dynamic neighbor prediction is proposed. Through the elaborate design of the distributed transmit-receive strategy of mobile nodes, iMAC greatly outperforms the prevailing IEEE 802.11 MAC protocols in both energy conservation and network throughput. Using Dynamic Neighbor Prediction (DNP), iMAC also performs well in mobile scenarios. To the best of our knowledge, iMAC is the first protocol that considers the performance deterioration of power-controlled MAC protocols in mobile scenarios and proposes a solution. Simulation results indicate that DNP is important and necessary for power-controlled MAC protocols in mobile ad hoc networks.
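A common building block of power-controlled MAC protocols in this literature (not necessarily iMAC's exact rule, which the abstract does not specify) is to use the fixed-power RTS/CTS exchange to estimate channel loss and then send the data frame at just enough power. A minimal sketch, with illustrative dB values:

```python
def min_tx_power_dbm(p_tx_probe_dbm, p_rx_probe_dbm, rx_sensitivity_dbm, margin_db=3.0):
    """Minimum transmit power (dBm) for the data frame.

    The RTS/CTS probe, sent at a known power, reveals the channel path loss;
    the sender then transmits just above the receiver sensitivity plus a fade
    margin. (A generic power-controlled MAC idea; iMAC's rule may differ.)"""
    path_loss_db = p_tx_probe_dbm - p_rx_probe_dbm
    return rx_sensitivity_dbm + path_loss_db + margin_db

# Illustrative numbers: probe sent at 20 dBm arrives at -60 dBm;
# the receiver needs at least -82 dBm.
p = min_tx_power_dbm(20.0, -60.0, -82.0)
print(f"data frame can be sent at {p:.1f} dBm instead of 20 dBm")
```

With these numbers the 80 dB path loss plus a 3 dB margin lets the sender drop from 20 dBm to 1 dBm, which is where the energy savings of such protocols come from.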

  12. Performance analysis of clustering techniques over microarray data: A case study

    Science.gov (United States)

    Dash, Rasmita; Misra, Bijan Bihari

    2018-03-01

Handling big data is one of the major issues in the field of statistical data analysis, and in such investigations cluster analysis plays a vital role in dealing with large-scale data. There are many clustering techniques with different cluster analysis approaches, but which approach suits a particular dataset is difficult to predict. To deal with this problem, a grading approach is introduced over many clustering techniques to identify a stable technique. Because the grading depends on the characteristics of the dataset as well as on the validity indices, a two-stage grading approach is implemented. In this study the grading approach is applied to five clustering techniques: hybrid swarm based clustering (HSC), k-means, partitioning around medoids (PAM), vector quantization (VQ) and agglomerative nesting (AGNES). The experiments are conducted on five microarray datasets with seven validity indices. The finding of the grading approach that a clustering technique is significant is further established by the Nemenyi post-hoc hypothesis test.
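A minimal sketch of the ranking-plus-Nemenyi step: techniques are ranked per dataset by a validity index, average ranks are compared against the Nemenyi critical difference CD = q_α · sqrt(k(k+1)/(6N)). The scores below are made up for illustration (they are not the study's data); q_α = 2.728 is the standard value for k = 5 techniques at α = 0.05.

```python
import math
from itertools import combinations

# Illustrative validity-index scores (higher is better) for 5 clustering
# techniques over 5 datasets; the values are invented for this sketch.
scores = {
    "HSC":    [0.72, 0.65, 0.70, 0.68, 0.74],
    "kmeans": [0.60, 0.58, 0.62, 0.55, 0.61],
    "PAM":    [0.66, 0.60, 0.64, 0.59, 0.65],
    "VQ":     [0.58, 0.52, 0.57, 0.50, 0.56],
    "AGNES":  [0.63, 0.57, 0.60, 0.54, 0.60],
}

def average_ranks(scores):
    """Rank techniques per dataset (1 = best), then average over datasets."""
    names = list(scores)
    n_datasets = len(next(iter(scores.values())))
    ranks = {name: 0.0 for name in names}
    for j in range(n_datasets):
        ordered = sorted(names, key=lambda n: -scores[n][j])
        for r, name in enumerate(ordered, start=1):
            ranks[name] += r
    return {name: total / n_datasets for name, total in ranks.items()}

def nemenyi_cd(k, n, q_alpha=2.728):  # q_alpha for k = 5, alpha = 0.05
    """Critical difference of average ranks for the Nemenyi post-hoc test."""
    return q_alpha * math.sqrt(k * (k + 1) / (6.0 * n))

avg = average_ranks(scores)
cd = nemenyi_cd(k=len(scores), n=5)
for t1, t2 in combinations(avg, 2):
    if abs(avg[t1] - avg[t2]) > cd:
        print(f"{t1} vs {t2}: significant (|{avg[t1]:.2f}-{avg[t2]:.2f}| > CD={cd:.2f})")
```

Two techniques differ significantly only when their average ranks differ by more than CD; with these invented scores, HSC (rank 1.0) is significantly better than VQ (rank 5.0).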

  13. Techniques for Automated Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.

  14. Inter-comparison of NIOSH and IMPROVE protocols for OC and EC determination: implications for inter-protocol data conversion

    Science.gov (United States)

    Wu, Cheng; Huang, X. H. Hilda; Ng, Wai Man; Griffith, Stephen M.; Zhen Yu, Jian

    2016-09-01

Organic carbon (OC) and elemental carbon (EC) are operationally defined by analytical methods. As a result, OC and EC measurements are protocol dependent, leading to uncertainties in their quantification. In this study, more than 1300 Hong Kong samples were analyzed using both National Institute for Occupational Safety and Health (NIOSH) thermal optical transmittance (TOT) and Interagency Monitoring of Protected Visual Environments (IMPROVE) thermal optical reflectance (TOR) protocols to explore the cause of EC disagreement between the two protocols. The EC discrepancy mainly (83 %) arises from a difference in peak inert-mode temperature, which determines the allocation of OC4_NSH, while the rest (17 %) is attributed to a difference in the optical method (transmittance vs. reflectance) applied for the charring correction. Evidence shows that the magnitude of the EC discrepancy is positively correlated with the intensity of the biomass burning signal, whereby biomass burning increases the fraction of OC4_NSH and widens the disagreement in the inter-protocol EC determination. It is also found that the EC discrepancy is positively correlated with the abundance of metal oxides in the samples. Two approaches (M1 and M2) that translate NIOSH TOT OC and EC data into IMPROVE TOR OC and EC data are proposed. M1 uses a direct relationship between EC_NSH_TOT and EC_IMP_TOR for reconstruction: M1: EC_IMP_TOR = a × EC_NSH_TOT + b; while M2 deconstructs EC_IMP_TOR into several terms based on analysis principles and applies regression only to the unknown terms: M2: EC_IMP_TOR = AEC_NSH + OC4_NSH − (a × PC_NSH_TOR + b), where AEC_NSH, the apparent EC by the NIOSH protocol, is the carbon that evolves in the He-O2 analysis stage, OC4_NSH is the carbon that evolves at the fourth temperature step of the pure-helium analysis stage of NIOSH, and PC_NSH_TOR is the pyrolyzed carbon as determined by the NIOSH protocol. The implementation of M1 to all urban site data (without considering seasonal specificity
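The M1 conversion is an ordinary least-squares fit of one protocol's EC against the other's. A minimal sketch on synthetic data (the coefficients 2.2 and 0.1 and the noise level are invented to exercise the fit, not the values obtained from the Hong Kong samples):

```python
import random

def ols_fit(x, y):
    """Ordinary least-squares slope a and intercept b for y ≈ a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    b = my - a * mx
    return a, b

# Synthetic paired EC data (µg C/m^3): EC_IMP_TOR ≈ 2.2 * EC_NSH_TOT + 0.1
# plus measurement noise; coefficients are illustrative only.
random.seed(0)
ec_nsh_tot = [random.uniform(0.5, 8.0) for _ in range(200)]
ec_imp_tor = [2.2 * x + 0.1 + random.gauss(0, 0.05) for x in ec_nsh_tot]

a, b = ols_fit(ec_nsh_tot, ec_imp_tor)
print(f"M1 fit: EC_IMP_TOR = {a:.2f} * EC_NSH_TOT + {b:.2f}")
```

M2 would apply the same regression machinery, but only to the pyrolyzed-carbon term, with AEC_NSH and OC4_NSH entering as measured quantities.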

  15. Broadcast Expenses Controlling Techniques in Mobile Ad-hoc Networks: A Survey

    Directory of Open Access Journals (Sweden)

    Naeem Ahmad

    2016-07-01

Full Text Available The blind flooding of query packets during route discovery gives rise to the broadcast storm problem: it exponentially increases the energy consumption of intermediate nodes and congests the entire network. In such a congested network, the task of establishing a path between resources may become very complex and unwieldy. Extensive research has been done in this area to improve the route discovery phase of routing protocols by reducing broadcast expenses. The purpose of this study is to provide a comparative analysis of existing broadcasting techniques for the route discovery phase, in order to identify an efficient broadcasting technique that determines the route with the minimum number of conveying nodes in ad-hoc networks. The study is designed to highlight the collective merits and demerits of such broadcasting techniques, along with conclusions that would contribute to the choice of a broadcasting technique.
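A toy simulation contrasting blind flooding with one of the expense-reducing techniques from this literature, probabilistic (gossip) flooding. The random-graph model, node count, and rebroadcast probability are illustrative assumptions, not parameters from the survey:

```python
import random

def make_random_graph(n, p, seed=1):
    """Undirected Erdős–Rényi-style graph as an adjacency list."""
    rng = random.Random(seed)
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                adj[i].add(j)
                adj[j].add(i)
    return adj

def flood(adj, source, rebroadcast_prob=1.0, seed=2):
    """Simulate route-request dissemination; return (#nodes reached, #rebroadcasts).

    rebroadcast_prob = 1.0 is blind flooding; < 1.0 is probabilistic (gossip)
    flooding, where each intermediate node stays silent with some probability."""
    rng = random.Random(seed)
    reached = {source}
    frontier = [source]
    rebroadcasts = 0
    while frontier:
        nxt = []
        for node in frontier:
            if node != source and rng.random() > rebroadcast_prob:
                continue  # this node suppresses its rebroadcast
            rebroadcasts += 1
            for nb in adj[node]:
                if nb not in reached:
                    reached.add(nb)
                    nxt.append(nb)
        frontier = nxt
    return len(reached), rebroadcasts

adj = make_random_graph(60, 0.15)
cov_blind, tx_blind = flood(adj, 0, rebroadcast_prob=1.0)
cov_gossip, tx_gossip = flood(adj, 0, rebroadcast_prob=0.6)
print(f"blind:       {cov_blind} reached with {tx_blind} transmissions")
print(f"gossip(0.6): {cov_gossip} reached with {tx_gossip} transmissions")
```

Under blind flooding every reached node transmits exactly once, so the transmission count equals the coverage; gossip trades a (usually small) loss of coverage for fewer transmissions, which is exactly the merit/demerit balance such surveys weigh.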

  16. Prompt Gamma Activation Analysis (PGAA): Technique of choice for nondestructive bulk analysis of returned comet samples

    International Nuclear Information System (INIS)

    Lindstrom, D.J.; Lindstrom, R.M.

    1989-01-01

    Prompt gamma activation analysis (PGAA) is a well-developed analytical technique. The technique involves irradiation of samples in an external neutron beam from a nuclear reactor, with simultaneous counting of gamma rays produced in the sample by neutron capture. Capture of neutrons leads to excited nuclei which decay immediately with the emission of energetic gamma rays to the ground state. PGAA has several advantages over other techniques for the analysis of cometary materials: (1) It is nondestructive; (2) It can be used to determine abundances of a wide variety of elements, including most major and minor elements (Na, Mg, Al, Si, P, K, Ca, Ti, Cr, Mn, Fe, Co, Ni), volatiles (H, C, N, F, Cl, S), and some trace elements (those with high neutron capture cross sections, including B, Cd, Nd, Sm, and Gd); and (3) It is a true bulk analysis technique. Recent developments should improve the technique's sensitivity and accuracy considerably

  17. Detecting in situ copepod diet diversity using molecular technique: development of a copepod/symbiotic ciliate-excluding eukaryote-inclusive PCR protocol.

    Science.gov (United States)

    Hu, Simin; Guo, Zhiling; Li, Tao; Carpenter, Edward J; Liu, Sheng; Lin, Senjie

    2014-01-01

Knowledge of in situ copepod diet diversity is crucial for accurately describing pelagic food web structure but is challenging to achieve due to the lack of an easily applicable methodology. To enable analysis with whole copepod-derived DNAs, we developed a copepod-excluding 18S rDNA-based PCR protocol. Although it is effective in depressing amplification of copepod 18S rDNA, its applicability to detecting diverse eukaryotes in both mono- and mixed-species samples has not been demonstrated. Moreover, the protocol suffers from the overrepresentation of sequences from symbiotic ciliates in the retrieved 18S rDNA libraries. In this study, we designed a blocking primer to make a combined primer set (copepod/symbiotic ciliate-excluding eukaryote-common: CEEC) to depress PCR amplification of symbiotic ciliate sequences while maximizing the range of eukaryotes amplified. We first examined the specificity and efficacy of CEEC by PCR-amplifying DNAs from 16 copepod species, 37 representative organisms that are potential prey of copepods and a natural microplankton sample, and then evaluated its efficiency in reconstructing diet composition by detecting the food of both lab-reared and field-collected copepods. Our results showed that the CEEC primer set can successfully amplify 18S rDNA from a wide range of isolated species and mixed-species samples while depressing amplification of that from copepods and the targeted symbiotic ciliates, indicating the universality of CEEC in specifically detecting the prey of copepods. All of the predetermined food items offered to copepods in the laboratory were successfully retrieved, suggesting that the CEEC-based protocol can accurately reconstruct the diets of copepods without interference from copepod and associated ciliate sequences present in the DNA samples. Our initial application to analyzing the food composition of field-collected copepods uncovered diverse prey species, including those currently known, and those that are unsuspected, as copepod prey

  18. A review on applications of the wavelet transform techniques in spectral analysis

    International Nuclear Information System (INIS)

    Medhat, M.E.; Albdel-hafiez, A.; Hassan, M.F.; Ali, M.A.; Awaad, Z.

    2004-01-01

Since 1989, a technique known as the wavelet transform (WT) has been applied successfully to the analysis of different types of spectra. The WT offers certain advantages over the Fourier transform for signal analysis. A review of the use of this technique across different fields of elemental analysis is presented.
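As a sketch of why wavelets suit spectral analysis, a one-level orthonormal Haar transform (the simplest wavelet, not necessarily the one used in the reviewed work) separates a smooth baseline from sharp features: large detail coefficients localize a peak. The toy "spectrum" below is invented:

```python
import math

def haar_step(signal):
    """One level of the orthonormal Haar wavelet transform.

    Returns (approximation, detail) coefficients; large detail values flag
    sharp features such as peaks in a spectrum."""
    assert len(signal) % 2 == 0
    s = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal), 2)]
    return approx, detail

def haar_inverse_step(approx, detail):
    """Invert one Haar level (perfect reconstruction)."""
    s = math.sqrt(2.0)
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) / s, (a - d) / s])
    return out

spectrum = [2.0, 2.0, 2.0, 9.0, 8.0, 2.0, 2.0, 2.0]  # toy peak at channels 3-4
a, d = haar_step(spectrum)
rec = haar_inverse_step(a, d)
print("detail coefficients:", [round(x, 3) for x in d])
```

Unlike a Fourier coefficient, each detail coefficient is tied to a position in the spectrum, which is the localization advantage the review refers to; the inverse step confirms no information is lost.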

  19. A Guide to Writing a Qualitative Systematic Review Protocol to Enhance Evidence-Based Practice in Nursing and Health Care.

    Science.gov (United States)

    Butler, Ashleigh; Hall, Helen; Copnell, Beverley

    2016-06-01

    The qualitative systematic review is a rapidly developing area of nursing research. In order to present trustworthy, high-quality recommendations, such reviews should be based on a review protocol to minimize bias and enhance transparency and reproducibility. Although there are a number of resources available to guide researchers in developing a quantitative review protocol, very few resources exist for qualitative reviews. To guide researchers through the process of developing a qualitative systematic review protocol, using an example review question. The key elements required in a systematic review protocol are discussed, with a focus on application to qualitative reviews: Development of a research question; formulation of key search terms and strategies; designing a multistage review process; critical appraisal of qualitative literature; development of data extraction techniques; and data synthesis. The paper highlights important considerations during the protocol development process, and uses a previously developed review question as a working example. This paper will assist novice researchers in developing a qualitative systematic review protocol. By providing a worked example of a protocol, the paper encourages the development of review protocols, enhancing the trustworthiness and value of the completed qualitative systematic review findings. Qualitative systematic reviews should be based on well planned, peer reviewed protocols to enhance the trustworthiness of results and thus their usefulness in clinical practice. Protocols should outline, in detail, the processes which will be used to undertake the review, including key search terms, inclusion and exclusion criteria, and the methods used for critical appraisal, data extraction and data analysis to facilitate transparency of the review process. Additionally, journals should encourage and support the publication of review protocols, and should require reference to a protocol prior to publication of the

  20. Comparative Study on Various Authentication Protocols in Wireless Sensor Networks.

    Science.gov (United States)

    Rajeswari, S Raja; Seenivasagam, V

    2016-01-01

    Wireless sensor networks (WSNs) consist of lightweight devices with low cost, low power, and short-ranged wireless communication. The sensors can communicate with each other to form a network. In WSNs, broadcast transmission is widely used along with the maximum usage of wireless networks and their applications. Hence, it has become crucial to authenticate broadcast messages. Key management is also an active research topic in WSNs. Several key management schemes have been introduced, and their benefits are not recognized in a specific WSN application. Security services are vital for ensuring the integrity, authenticity, and confidentiality of the critical information. Therefore, the authentication mechanisms are required to support these security services and to be resilient to distinct attacks. Various authentication protocols such as key management protocols, lightweight authentication protocols, and broadcast authentication protocols are compared and analyzed for all secure transmission applications. The major goal of this survey is to compare and find out the appropriate protocol for further research. Moreover, the comparisons between various authentication techniques are also illustrated.

  2. Entanglement distillation protocols and number theory

    International Nuclear Information System (INIS)

    Bombin, H.; Martin-Delgado, M.A.

    2005-01-01

We show that the analysis of entanglement distillation protocols for qudits of arbitrary dimension D benefits from applying basic concepts from number theory, since the set Z_D^n associated with Bell-diagonal states is a module rather than a vector space. We find that a partition of Z_D^n into divisor classes characterizes the invariant properties of mixed Bell-diagonal states under local permutations. We construct a very general class of recursion protocols by means of unitary operations implementing these local permutations. We study these distillation protocols depending on whether we use twirling operations in the intermediate steps or not, and we study them both analytically and numerically with Monte Carlo methods. In the absence of twirling operations, we construct extensions of the quantum privacy algorithms valid for secure communication with qudits of any dimension D. When D is a prime number, we show that the distillation protocols are optimal both qualitatively and quantitatively

  3. Analysis technique for controlling system wavefront error with active/adaptive optics

    Science.gov (United States)

    Genberg, Victor L.; Michels, Gregory J.

    2017-08-01

    The ultimate goal of an active mirror system is to control system level wavefront error (WFE). In the past, the use of this technique was limited by the difficulty of obtaining a linear optics model. In this paper, an automated method for controlling system level WFE using a linear optics model is presented. An error estimate is included in the analysis output for both surface error disturbance fitting and actuator influence function fitting. To control adaptive optics, the technique has been extended to write system WFE in state space matrix form. The technique is demonstrated by example with SigFit, a commercially available tool integrating mechanical analysis with optical analysis.
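A minimal sketch of the core computation behind such control: with a linear optics model, an influence matrix A maps actuator commands to wavefront-error samples, and the commands minimizing the residual ||w0 + A·c||² solve the normal equations (AᵀA)c = −Aᵀw0. The 4-point, 2-actuator numbers below are invented for illustration; this is not a reproduction of SigFit's internals.

```python
def solve2(a11, a12, a21, a22, b1, b2):
    """Solve a 2x2 linear system by Cramer's rule."""
    det = a11 * a22 - a12 * a21
    return (b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det

# Hypothetical linear optics model: rows = wavefront sample points,
# columns = actuator influence functions (WFE change per unit command).
A = [[1.0, 0.2],
     [0.5, 1.0],
     [0.1, 0.4],
     [0.8, 0.3]]
w0 = [0.9, 0.8, 0.3, 0.7]  # disturbance WFE at the sample points

# Normal equations for min_c ||w0 + A c||^2:  (A^T A) c = -A^T w0
AtA = [[sum(A[i][r] * A[i][c] for i in range(4)) for c in range(2)] for r in range(2)]
Atw = [sum(A[i][r] * w0[i] for i in range(4)) for r in range(2)]
c1, c2 = solve2(AtA[0][0], AtA[0][1], AtA[1][0], AtA[1][1], -Atw[0], -Atw[1])

residual = [w0[i] + A[i][0] * c1 + A[i][1] * c2 for i in range(4)]
rms0 = (sum(w * w for w in w0) / 4) ** 0.5
rms1 = (sum(r * r for r in residual) / 4) ** 0.5
print(f"commands: ({c1:.3f}, {c2:.3f}); RMS WFE {rms0:.3f} -> {rms1:.3f}")
```

The residual after the fit is orthogonal to every influence function, which is the error-estimate property the analysis output can report: whatever WFE remains cannot be removed by these actuators.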

  4. Research on digital multi-channel pulse height analysis techniques

    International Nuclear Information System (INIS)

    Xiao Wuyun; Wei Yixiang; Ai Xianyun; Ao Qi

    2005-01-01

Multi-channel pulse height analysis techniques are developing in the direction of digitalization. Based on digital signal processing techniques, digital multi-channel analyzers are characterized by powerful pulse processing ability, high throughput, and improved stability and flexibility. This paper analyzes key techniques of digital nuclear pulse processing. The main algorithms, such as trapezoidal shaping, digital baseline estimation, digital pole-zero/zero-pole compensation, and pole/zero identification, are simulated with MATLAB software. The preliminary general scheme of a digital MCA is discussed, as well as other important aspects of its engineering design. All these lay the foundation for developing homemade digital nuclear spectrometers. (authors)
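As a sketch of the first of these algorithms, here is a minimal moving-average form of trapezoidal shaping (one standard digital formulation, not necessarily the recursive variant the authors simulated in MATLAB). The step input stands in for an idealized preamplifier edge with no exponential decay, so no pole-zero compensation is needed:

```python
def trapezoidal_shape(samples, rise, flat):
    """Moving-average trapezoidal shaper.

    The output is the difference of two length-`rise` moving sums separated
    by a `flat`-sample gap, normalized by `rise`; a step input produces a
    trapezoid with rise time `rise` and a flat top of roughly `flat` samples."""
    # Cumulative sum with a leading zero so cs[b] - cs[a] = sum(samples[a:b]).
    cs = [0.0]
    for v in samples:
        cs.append(cs[-1] + v)

    def msum(end):
        """Moving sum of up to `rise` samples ending at index `end`."""
        if end < 0:
            return 0.0
        return cs[end + 1] - cs[max(0, end - rise + 1)]

    return [(msum(n) - msum(n - rise - flat)) / rise for n in range(len(samples))]

# Idealized unit step (preamplifier edge, no decay) at sample 10.
step = [0.0] * 10 + [1.0] * 40
shaped = trapezoidal_shape(step, rise=8, flat=4)
print("peak height:", max(shaped))
```

The flat top is what makes the shape attractive for pulse-height analysis: its height equals the step amplitude independently of small timing jitter, and the peak sample can be histogrammed directly into an MCA channel.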

  5. Development of a Ground Test and Analysis Protocol for NASA's NextSTEP Phase 2 Habitation Concepts

    Science.gov (United States)

    Gernhardt, Michael L.; Beaton, Kara H.; Chappell, Steven P.; Bekdash, Omar S.; Abercromby, Andrew F. J.

    2018-01-01

    The NASA Next Space Technologies for Exploration Partnerships (NextSTEP) program is a public-private partnership model that seeks commercial development of deep space exploration capabilities to support human spaceflight missions around and beyond cislunar space. NASA first issued the Phase 1 NextSTEP Broad Agency Announcement to U.S. industries in 2014, which called for innovative cislunar habitation concepts that leveraged commercialization plans for low-Earth orbit. These habitats will be part of the Deep Space Gateway (DSG), the cislunar space station planned by NASA for construction in the 2020s. In 2016, Phase 2 of the NextSTEP program selected five commercial partners to develop ground prototypes. A team of NASA research engineers and subject matter experts (SMEs) have been tasked with developing the ground-test protocol that will serve as the primary means by which these Phase 2 prototypes will be evaluated. Since 2008, this core test team has successfully conducted multiple spaceflight analog mission evaluations utilizing a consistent set of operational tools, methods, and metrics to enable the iterative development, testing, analysis, and validation of evolving exploration architectures, operations concepts, and vehicle designs. The purpose of implementing a similar evaluation process for the Phase 2 Habitation Concepts is to consistently evaluate different commercial partner ground prototypes to provide data-driven, actionable recommendations for Phase 3. This paper describes the process by which the ground test protocol was developed and the objectives, methods, and metrics by which the NextSTEP Phase 2 Habitation Concepts will be rigorously and systematically evaluated. The protocol has been developed using both a top-down and bottom-up approach. Top-down development began with the Human Exploration and Operations Mission Directorate (HEOMD) exploration objectives and ISS Exploration Capability Study Team (IECST) candidate flight objectives. 
Strategic

  6. Effectiveness of a healthy lifestyle intervention for low back pain and osteoarthritis of the knee: protocol and statistical analysis plan for two randomised controlled trials

    Directory of Open Access Journals (Sweden)

    Kate M. O’Brien

    Full Text Available ABSTRACT Background These trials are the first randomised controlled trials of telephone-based weight management and healthy lifestyle interventions for low back pain and knee osteoarthritis. This article describes the protocol and statistical analysis plan. Method These trials are parallel randomised controlled trials that investigate and compare the effect of a telephone-based weight management and healthy lifestyle intervention for improving pain intensity in overweight or obese patients with low back pain or knee osteoarthritis. The analysis plan was finalised prior to initiation of analyses. All data collected as part of the trial were reviewed, without stratification by group, and classified by baseline characteristics, process of care and trial outcomes. Trial outcomes were classified as primary and secondary outcomes. Appropriate descriptive statistics and statistical testing of between-group differences, where relevant, have been planned and described. Conclusions A protocol for standard analyses was developed for the results of two randomised controlled trials. This protocol describes the data, and the pre-determined statistical tests of relevant outcome measures. The plan demonstrates transparent and verifiable use of the data collected. This a priori protocol will be followed to ensure rigorous standards of data analysis are strictly adhered to.

  7. Effectiveness of a healthy lifestyle intervention for low back pain and osteoarthritis of the knee: protocol and statistical analysis plan for two randomised controlled trials

    Science.gov (United States)

    O’Brien, Kate M.; Williams, Amanda; Wiggers, John; Wolfenden, Luke; Yoong, Serene; Campbell, Elizabeth; Kamper, Steven J.; McAuley, James; Attia, John; Oldmeadow, Chris; Williams, Christopher M.

    2016-01-01

    ABSTRACT Background These trials are the first randomised controlled trials of telephone-based weight management and healthy lifestyle interventions for low back pain and knee osteoarthritis. This article describes the protocol and statistical analysis plan. Method These trials are parallel randomised controlled trials that investigate and compare the effect of a telephone-based weight management and healthy lifestyle intervention for improving pain intensity in overweight or obese patients with low back pain or knee osteoarthritis. The analysis plan was finalised prior to initiation of analyses. All data collected as part of the trial were reviewed, without stratification by group, and classified by baseline characteristics, process of care and trial outcomes. Trial outcomes were classified as primary and secondary outcomes. Appropriate descriptive statistics and statistical testing of between-group differences, where relevant, have been planned and described. Conclusions A protocol for standard analyses was developed for the results of two randomised controlled trials. This protocol describes the data, and the pre-determined statistical tests of relevant outcome measures. The plan demonstrates transparent and verifiable use of the data collected. This a priori protocol will be followed to ensure rigorous standards of data analysis are strictly adhered to. PMID:27683839

  8. Techniques for incorporating operator expertise into intelligent decision aids and training

    International Nuclear Information System (INIS)

    Blackman, H.S.; Nelson, W.R.

    1987-01-01

    The objective of this work is to evaluate the potential for developing a complete model for training novices based upon a combination of rules for operation, and heuristics for application of the rules. The method used to investigate this potential is based upon the experimental evaluation of the response tree expert system. The present study used the low pressure injection system (LPIS) simulation developed for the response tree expert system evaluation, so that the rules of operation were already developed, and only the expert heuristics needed to be identified. The heuristics were abstracted from concurrent and recall protocols, taken from expert operators while attempting to solve transients on the LPIS, using protocol analysis techniques previously developed. This paper describes the experiment, and identifies the mental processes used by expert operators

  9. Characterization of decommissioned reactor internals: Monte Carlo analysis technique

    International Nuclear Information System (INIS)

    Reid, B.D.; Love, E.F.; Luksic, A.T.

    1993-03-01

    This study discusses computer analysis techniques for determining activation levels of irradiated reactor component hardware to yield data for the Department of Energy's Greater-Than-Class C Low-Level Radioactive Waste Program. The study recommends the Monte Carlo Neutron/Photon (MCNP) computer code as the best analysis tool for this application and compares the technique to direct sampling methodology. To implement the MCNP analysis, a computer model would be developed to reflect the geometry, material composition, and power history of an existing shutdown reactor. MCNP analysis would then be performed using the computer model, and the results would be validated by comparison to laboratory analysis results from samples taken from the shutdown reactor. The report estimates uncertainties for each step of the computational and laboratory analyses; the overall uncertainty of the MCNP results is projected to be ±35%. The primary source of uncertainty is identified as the material composition of the components, and research is suggested to address that uncertainty
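
    The report's overall ±35% figure is the kind of number obtained by combining the per-step uncertainties of the computational and laboratory analyses. A minimal sketch (with illustrative component values, not the study's actual estimates) of how independent relative uncertainties combine in quadrature:

```python
import math

def combined_relative_uncertainty(components):
    """Combine independent relative (1-sigma) uncertainties in quadrature."""
    return math.sqrt(sum(u ** 2 for u in components))

# Hypothetical per-step relative uncertainties for an activation
# calculation: material composition (the dominant term flagged in the
# study), flux modelling, cross sections, and laboratory counting.
steps = {
    "composition": 0.30,
    "flux_model": 0.12,
    "cross_sections": 0.10,
    "counting": 0.05,
}

total = combined_relative_uncertainty(steps.values())
print(f"overall relative uncertainty: +/-{total:.0%}")
```

    Because the terms add in quadrature, the largest component (here the assumed composition uncertainty) dominates the total, which is why the report singles it out for further research.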

  10. Application of Multivariable Statistical Techniques in Plant-wide WWTP Control Strategies Analysis

    DEFF Research Database (Denmark)

    Flores Alsina, Xavier; Comas, J.; Rodríguez-Roda, I.

    2007-01-01

    The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategies analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant analysis (DA) are applied to the evaluation matrix data set obtained by simulation of several control strategies applied to the plant-wide IWA Benchmark Simulation Model No 2 (BSM2). These techniques allow i) to determine natural groups or clusters of control strategies with a similar behaviour, ii) to find and interpret hidden, complex and causal relation features in the data set and iii) to identify important discriminant variables within the groups found by the cluster analysis. This study illustrates the usefulness of multivariable statistical techniques for both analysis and interpretation...
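
    The grouping step described in point i) can be illustrated with a small stand-in. The sketch below clusters hypothetical control-strategy evaluation vectors with a minimal k-means implementation (the study's actual CA/PCA/FA/DA toolchain is richer); the data values are invented for illustration:

```python
import math

def kmeans(points, k, iters=50):
    """Minimal k-means (deterministic first-k initialisation): groups
    evaluation vectors into k clusters of similar behaviour."""
    centroids = [list(p) for p in points[:k]]
    assign = [0] * len(points)
    for _ in range(iters):
        # assignment step: nearest centroid by Euclidean distance
        for i, p in enumerate(points):
            assign[i] = min(range(k), key=lambda c: math.dist(p, centroids[c]))
        # update step: centroid = mean of assigned points
        for c in range(k):
            members = [points[i] for i, a in enumerate(assign) if a == c]
            if members:
                centroids[c] = [sum(col) / len(members) for col in zip(*members)]
    return assign, centroids

# Hypothetical evaluation matrix: one row per control strategy,
# columns = (effluent quality index, operating cost index).
strategies = [(52.0, 9.1), (51.5, 9.3), (70.2, 6.0), (69.8, 6.2), (60.0, 7.5)]
labels, centres = kmeans(strategies, k=2)
```

    Strategies with similar evaluation profiles end up with the same label, which is the "natural groups" notion the abstract refers to.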

  11. Preconditioned conjugate gradient technique for the analysis of symmetric anisotropic structures

    Science.gov (United States)

    Noor, Ahmed K.; Peters, Jeanne M.

    1987-01-01

    An efficient preconditioned conjugate gradient (PCG) technique and a computational procedure are presented for the analysis of symmetric anisotropic structures. The technique is based on selecting the preconditioning matrix as the orthotropic part of the global stiffness matrix of the structure, with all the nonorthotropic terms set equal to zero. This particular choice of the preconditioning matrix results in reducing the size of the analysis model of the anisotropic structure to that of the corresponding orthotropic structure. The similarities between the proposed PCG technique and a reduction technique previously presented by the authors are identified and exploited to generate from the PCG technique direct measures for the sensitivity of the different response quantities to the nonorthotropic (anisotropic) material coefficients of the structure. The effectiveness of the PCG technique is demonstrated by means of a numerical example of an anisotropic cylindrical panel.
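
    The core idea, solving the anisotropic system with the orthotropic part of the stiffness matrix as preconditioner, follows the standard preconditioned conjugate gradient iteration. A generic PCG sketch (with a small made-up SPD system, and the diagonal standing in for the orthotropic part):

```python
import math

def pcg(A, b, solve_M, tol=1e-10, max_iter=200):
    """Preconditioned conjugate gradients for a symmetric positive
    definite A (dense, list-of-lists). solve_M(r) applies the inverse
    of the preconditioner, e.g. the orthotropic part of the global
    stiffness matrix with the non-orthotropic terms zeroed."""
    n = len(b)
    x = [0.0] * n
    r = list(b)                      # residual for x = 0
    z = solve_M(r)
    p = list(z)
    rz = sum(ri * zi for ri, zi in zip(r, z))
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rz / sum(pi * api for pi, api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        if math.sqrt(sum(ri * ri for ri in r)) < tol:
            break
        z = solve_M(r)
        rz_next = sum(ri * zi for ri, zi in zip(r, z))
        p = [zi + (rz_next / rz) * pi for zi, pi in zip(z, p)]
        rz = rz_next
    return x

# Small made-up SPD system; the diagonal stands in for the
# 'orthotropic part' used as preconditioner in the paper.
A = [[4.0, 1.0, 0.0],
     [1.0, 3.0, 1.0],
     [0.0, 1.0, 2.0]]
b = [1.0, 2.0, 3.0]
x = pcg(A, b, solve_M=lambda r: [ri / A[i][i] for i, ri in enumerate(r)])
```

    The better the preconditioner approximates A (in the paper, the closer the structure is to orthotropic), the fewer iterations PCG needs.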

  12. Technique Triangulation for Validation in Directed Content Analysis

    Directory of Open Access Journals (Sweden)

    Áine M. Humble PhD

    2009-09-01

    Full Text Available Division of labor in wedding planning varies for first-time marriages, with three types of couples—traditional, transitional, and egalitarian—identified, but nothing is known about wedding planning for remarrying individuals. Using semistructured interviews, the author interviewed 14 couples in which at least one person had remarried and used directed content analysis to investigate the extent to which the aforementioned typology could be transferred to this different context. In this paper she describes how a triangulation of analytic techniques provided validation for couple classifications and also helped with moving beyond “blind spots” in data analysis. Analytic approaches were the constant comparative technique, rank order comparison, and visual representation of coding, using MAXQDA 2007's tool called TextPortraits.

  13. SIMULATION AND ANALYSIS OF GREEDY ROUTING PROTOCOL IN VIEW OF ENERGY CONSUMPTION AND NETWORK LIFETIME IN THREE DIMENSIONAL UNDERWATER WIRELESS SENSOR NETWORK

    Directory of Open Access Journals (Sweden)

    SHEENA KOHLI

    2017-11-01

    Full Text Available Underwater Wireless Sensor Network (UWSN) comprises a number of miniature sensing devices deployed in the sea or ocean, connected to each other by acoustic links. The sensors sense the ambient conditions and transmit the data from one end to another. For transmission of data in any medium, routing protocols play a crucial role. Moreover, since the nodes are battery limited, network energy and network lifetime are unavoidable parameters in the operation and analysis of protocols. The paper discusses the greedy routing protocol for underwater wireless sensor networks. The simulation of this routing protocol also takes into consideration the characteristics of acoustic communication such as attenuation, transmission loss, signal to noise ratio, noise, and propagation delay. The results from these observations may be used to construct an accurate underwater communication model.
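
    The acoustic-channel characteristics mentioned above (attenuation, transmission loss) are commonly modelled with Thorp's empirical absorption formula plus a spreading-loss term. A sketch under those textbook assumptions; the spreading exponent of 1.5 ("practical spreading") is a common default, not a value from the paper:

```python
import math

def thorp_absorption_db_per_km(f_khz):
    """Thorp's empirical absorption coefficient for seawater (dB/km),
    frequency in kHz; a standard model for UWSN link budgets."""
    f2 = f_khz ** 2
    return (0.11 * f2 / (1 + f2)
            + 44.0 * f2 / (4100 + f2)
            + 2.75e-4 * f2
            + 0.003)

def transmission_loss_db(distance_km, f_khz, spreading=1.5):
    """Transmission loss = spreading term + absorption term, with the
    spreading term referenced to 1 m (distance converted to metres)."""
    return (spreading * 10 * math.log10(distance_km * 1000)
            + thorp_absorption_db_per_km(f_khz) * distance_km)
```

    At 10 kHz the absorption is on the order of 1 dB/km and grows rapidly with frequency, which is why long-range underwater links favour low carrier frequencies.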

  14. A shortened protocol for assessing cognitive bias in rats.

    Science.gov (United States)

    Brydges, Nichola M; Hall, Lynsey

    2017-07-15

    Reliable measurement of affective state in animals is a significant goal of animal welfare. Such measurements would also improve the validity of pre-clinical mental health research which relies on animal models. However, at present, affective states in animals are inaccessible to direct measurement. In humans, changes in cognitive processing can give reliable indications of emotional state. Therefore, similar techniques are increasingly being used to gain proxy measures of affective states in animals. In particular, the 'cognitive bias' assay has gained popularity in recent years. Major disadvantages of this technique include length of time taken for animals to acquire the task (typically several weeks), negative experiences associated with task training, and issues of motivation. Here we present a shortened cognitive bias protocol using only positive reinforcers which must actively be responded to. The protocol took an average of 4 days to complete, and produced similar results to previous, longer methods (minimum 30 days). Specifically, rats housed in standard laboratory conditions demonstrated negative cognitive biases when presented with ambiguous stimuli, and took longer to make a decision when faced with an ambiguous stimulus. Compared to previous methods, this protocol is significantly shorter (average 4 days vs. minimum 30 days), utilises only positive reinforcers to avoid inducing negative affective states, and requires active responses to all cues, avoiding potential confounds of motivational state. We have successfully developed a shortened cognitive bias protocol, suitable for use with laboratory rats. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.

  15. Real-Time Fault Tolerant Networking Protocols

    National Research Council Canada - National Science Library

    Henzinger, Thomas A

    2004-01-01

    We made significant progress in the areas of video streaming, wireless protocols, mobile ad-hoc and sensor networks, peer-to-peer systems, fault tolerant algorithms, dependability and timing analysis...

  16. Sensitivity analysis technique for application to deterministic models

    International Nuclear Information System (INIS)

    Ishigami, T.; Cazzoli, E.; Khatib-Rahbar, M.; Unwin, S.D.

    1987-01-01

    The characterization of severe accident source terms for light water reactors should include consideration of uncertainties. An important element of any uncertainty analysis is an evaluation of the sensitivity of the output probability distributions reflecting source term uncertainties to assumptions regarding the input probability distributions. Historically, response surface methods (RSMs) were developed to replace physical models with simplified models, using, for example, regression techniques, for extensive calculations. The purpose of this paper is to present a new method for sensitivity analysis that does not utilize RSM, but instead relies directly on the results obtained from the original computer code calculations. The merits of this approach are demonstrated by application of the proposed method to the suppression pool aerosol removal code (SPARC), and the results are compared with those obtained by sensitivity analysis with (a) the code itself, (b) a regression model, and (c) Iman's method
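
    To make the notion of output sensitivity to inputs concrete, the sketch below computes normalised one-at-a-time sensitivities by central finite differences. This is a generic illustration, not the paper's RSM-free method, and the toy model and its parameters are invented:

```python
def oat_sensitivities(model, baseline, rel_step=0.01):
    """Normalised one-at-a-time sensitivities d(ln y)/d(ln x_i),
    estimated by central finite differences around a baseline."""
    y0 = model(baseline)
    sens = {}
    for name, x0 in baseline.items():
        hi = dict(baseline, **{name: x0 * (1 + rel_step)})
        lo = dict(baseline, **{name: x0 * (1 - rel_step)})
        sens[name] = (model(hi) - model(lo)) / (2 * rel_step * y0)
    return sens

# Hypothetical power-law model (illustrative form only): a response
# growing with pool depth and bubble residence time.
def toy_model(p):
    return p["depth_m"] ** 1.5 * p["residence_s"] ** 0.5

s = oat_sensitivities(toy_model, {"depth_m": 4.0, "residence_s": 2.0})
```

    For a power-law model the normalised sensitivity recovers the exponent, so the method correctly ranks depth as the more influential input here.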

  17. Study of analysis techniques of thermoluminescent dosimeters response

    International Nuclear Information System (INIS)

    Castro, Walber Amorim

    2002-01-01

    The Personal Monitoring Service of the Centro Regional de Ciencias Nucleares uses the TLD 700 material in its dosemeters. The TLD analysis is carried out using a Harshaw-Bicron model 6600 automatic reading system. This system uses dry air instead of the traditional gaseous nitrogen. This innovation brought advantages to the service but introduced uncertainties in the readings of the detectors; one of these was observed for doses below 0.5 mSv. In this work different techniques of analysis of the TLD response, involving dose values in this interval, were investigated and compared. These techniques include thermal pre-treatment, and different kinds of glow curve analysis methods were investigated. The results obtained showed the necessity of developing specific software that permits automatic background subtraction from the glow curves for each dosemeter. This software was developed and has been tested. Preliminary results showed that the software increases the response reproducibility. (author)
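
    Automatic background subtraction of the kind such software performs can be sketched as follows; the flat baseline estimated from the first channels and the synthetic glow curve are illustrative assumptions, not the actual implementation:

```python
def net_glow_signal(glow_curve, bg_channels=10):
    """Integrate a TLD glow curve after subtracting a flat background
    estimated from the first (pre-heating) channels."""
    baseline = sum(glow_curve[:bg_channels]) / bg_channels
    return sum(max(c - baseline, 0.0) for c in glow_curve)

# Synthetic curve: 10 background channels followed by a glow peak.
curve = [5.0] * 10 + [5.0, 20.0, 50.0, 20.0, 5.0]
net = net_glow_signal(curve)
```

    Subtracting a per-dosemeter baseline matters most at low doses, where the background is a large fraction of the total signal, which matches the sub-0.5 mSv regime the record highlights.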

  18. Protocol Implementation Generator

    DEFF Research Database (Denmark)

    Carvalho Quaresma, Jose Nuno; Probst, Christian W.

    2010-01-01

    Users expect communication systems to guarantee, amongst others, privacy and integrity of their data. These can be ensured by using well-established protocols; the best protocol, however, is useless if not all parties involved in a communication have a correct implementation of the protocol and all necessary tools. In this paper, we present the Protocol Implementation Generator (PiG), a framework that can be used to add protocol generation to protocol negotiation, or to easily share and implement new protocols throughout a network. PiG enables the sharing, verification, and translation of protocols through a generator framework based on the LySatool and a translator from the LySa language into C or Java.

  19. Cine EPID evaluation of two non-commercial techniques for DIBH

    Energy Technology Data Exchange (ETDEWEB)

    Jensen, Christopher; Urribarri, Jaime; Cail, Daniel; Rottmann, Joerg; Mishra, Pankaj; Lingos, Tatiana; Niedermayr, Thomas; Berbeco, Ross, E-mail: rberbeco@lroc.harvard.edu [Brigham and Women's Hospital, Dana-Farber Cancer Institute, Harvard Medical School, Boston, Massachusetts 02115 (United States)

    2014-02-15

    Purpose: To evaluate the efficacy of two noncommercial techniques for deep inspiration breath hold (DIBH) treatment of left-sided breast cancer (LSBC) using cine electronic portal imaging device (EPID) images. Methods: 23 875 EPID images of 65 patients treated for LSBC at two different cancer treatment centers were retrieved. At the Milford Regional Cancer Center, DIBH stability was maintained by visual alignment of in-room lasers and patient skin tattoos (TAT). At the South Shore Hospital, a distance-measuring laser device (RTSSD) was implemented. For both centers, cine EPID images were acquired at least once per week during beam-on. Chest wall position relative to image boundary was measured and tracked over the course of treatment for every patient and treatment fraction for which data were acquired. Results: Median intrabeam chest motion was 0.31 mm for the TAT method and 0.37 mm for the RTSSD method. The maximum excursions exceeded our treatment protocol threshold of 3 mm in 0.3% of cases (TAT) and 1.2% of cases (RTSSD). The authors did not observe a clinically significant difference between the two datasets. Conclusions: Both noncommercial techniques for monitoring the DIBH location provided DIBH stability within the predetermined treatment protocol parameters (<3 mm). The in-treatment imaging offered by the EPID operating in cine mode facilitates retrospective analysis and validation of both techniques.

  20. Cine EPID evaluation of two non-commercial techniques for DIBH

    International Nuclear Information System (INIS)

    Jensen, Christopher; Urribarri, Jaime; Cail, Daniel; Rottmann, Joerg; Mishra, Pankaj; Lingos, Tatiana; Niedermayr, Thomas; Berbeco, Ross

    2014-01-01

    Purpose: To evaluate the efficacy of two noncommercial techniques for deep inspiration breath hold (DIBH) treatment of left-sided breast cancer (LSBC) using cine electronic portal imaging device (EPID) images. Methods: 23 875 EPID images of 65 patients treated for LSBC at two different cancer treatment centers were retrieved. At the Milford Regional Cancer Center, DIBH stability was maintained by visual alignment of in-room lasers and patient skin tattoos (TAT). At the South Shore Hospital, a distance-measuring laser device (RTSSD) was implemented. For both centers, cine EPID images were acquired at least once per week during beam-on. Chest wall position relative to image boundary was measured and tracked over the course of treatment for every patient and treatment fraction for which data were acquired. Results: Median intrabeam chest motion was 0.31 mm for the TAT method and 0.37 mm for the RTSSD method. The maximum excursions exceeded our treatment protocol threshold of 3 mm in 0.3% of cases (TAT) and 1.2% of cases (RTSSD). The authors did not observe a clinically significant difference between the two datasets. Conclusions: Both noncommercial techniques for monitoring the DIBH location provided DIBH stability within the predetermined treatment protocol parameters (<3 mm). The in-treatment imaging offered by the EPID operating in cine mode facilitates retrospective analysis and validation of both techniques
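
    The chest-wall tracking metrics reported in these records (median intrabeam motion, excursions beyond the 3 mm threshold) can be sketched per beam as follows; using the first frame as the reference position is an assumption made for illustration:

```python
from statistics import median

def dibh_stability(positions_mm, threshold_mm=3.0):
    """Summarise chest-wall motion within one beam from cine EPID
    frames: median absolute excursion from the first frame, plus a
    flag for any excursion beyond the protocol threshold."""
    ref = positions_mm[0]
    excursions = [abs(p - ref) for p in positions_mm[1:]]
    return median(excursions), max(excursions) > threshold_mm

# Synthetic per-frame chest-wall positions (mm) for one beam.
med, exceeded = dibh_stability([0.0, 0.2, 0.4, 0.3, 3.4])
```

    Aggregating the per-beam medians over all fractions, and counting the flagged beams, yields summary statistics of the same form as the 0.31 mm / 0.3% figures in the abstract.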

  1. Nuclear techniques of analysis in diamond synthesis and annealing

    Energy Technology Data Exchange (ETDEWEB)

    Jamieson, D. N.; Prawer, S.; Gonon, P.; Walker, R.; Dooley, S.; Bettiol, A.; Pearce, J. [Melbourne Univ., Parkville, VIC (Australia). School of Physics

    1996-12-31

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs.
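
    The "non-Rutherford" enhancements exploited in these measurements are departures from the pure Rutherford differential cross section, which can be computed directly. A sketch using the point-charge, centre-of-mass formula; the example projectile/target/geometry values are illustrative, not the paper's:

```python
import math

E2_MEV_FM = 1.44  # Coulomb constant e^2/(4*pi*eps0) in MeV*fm

def rutherford_dcs_mb_sr(z1, z2, energy_mev, theta_deg):
    """Rutherford differential cross section in the centre-of-mass
    frame, in mb/sr. Measured departures from this value at MeV
    energies on light nuclei are the 'non-Rutherford' elastic
    enhancements used to characterise thin films."""
    s = math.sin(math.radians(theta_deg) / 2.0)
    dcs_fm2_sr = (z1 * z2 * E2_MEV_FM / (4.0 * energy_mev)) ** 2 / s ** 4
    return 10.0 * dcs_fm2_sr  # 1 fm^2 = 10 mb

# Illustrative: 2 MeV protons backscattered from carbon at 170 deg.
dcs = rutherford_dcs_mb_sr(1, 6, 2.0, 170.0)
```

    The strong 1/E^2 and 1/sin^4(theta/2) dependences are why a measured cross-section database (as mentioned in the abstract) is needed once nuclear effects make the true cross section deviate from this formula.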

  2. Reliability analysis of large scaled structures by optimization technique

    International Nuclear Information System (INIS)

    Ishikawa, N.; Mihara, T.; Iizuka, M.

    1987-01-01

    This paper presents a reliability analysis based on an optimization technique using the PNET (Probabilistic Network Evaluation Technique) method for highly redundant structures having a large number of collapse modes. This approach makes the best use of the merit of the optimization technique, in which the idea of the PNET method is used. The analytical process involves the minimization of the safety index of the representative mode, subject to satisfaction of the mechanism condition and of positive external work. The procedure entails the sequential performance of a series of NLP (Nonlinear Programming) problems, where the correlation condition pertaining to the representative mode, following the idea of the PNET method, is taken as an additional constraint on the next analysis. Upon succeeding iterations, the final analysis is achieved when the collapse probability of the subsequent mode is much smaller than the value at the 1st mode. The approximate collapse probability of the structure is defined as the sum of the collapse probabilities of the representative modes classified by the extent of correlation. Then, in order to confirm the validity of the proposed method, the conventional Monte Carlo simulation is also revised by using the collapse load analysis. Finally, two fairly large structures were analyzed to illustrate the scope and application of the approach. (orig./HP)
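
    The quantity at the heart of the analysis, the collapse probability of a single mode with a given safety index, can be estimated by crude Monte Carlo as in the simulation used for validation. A sketch with invented resistance/load distributions (not the paper's structural models):

```python
import random

def mc_collapse_probability(n=100_000, seed=1):
    """Crude Monte Carlo estimate of a single-mode collapse
    probability P(R - S < 0), with normal resistance R ~ N(10, 1)
    and load S ~ N(6, 1). For these illustrative parameters the
    safety index is beta = 4/sqrt(2) ~ 2.83, so the exact value is
    Phi(-2.83) ~ 0.0023."""
    rng = random.Random(seed)
    failures = sum(
        1 for _ in range(n)
        if rng.gauss(10.0, 1.0) - rng.gauss(6.0, 1.0) < 0.0
    )
    return failures / n

p_f = mc_collapse_probability()
```

    For highly redundant structures the cost of such sampling over many collapse modes is what motivates the PNET-based selection of a few representative modes.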

  3. Nuclear techniques of analysis in diamond synthesis and annealing

    Energy Technology Data Exchange (ETDEWEB)

    Jamieson, D N; Prawer, S; Gonon, P; Walker, R; Dooley, S; Bettiol, A; Pearce, J [Melbourne Univ., Parkville, VIC (Australia). School of Physics

    1997-12-31

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs.

  4. Nuclear techniques of analysis in diamond synthesis and annealing

    International Nuclear Information System (INIS)

    Jamieson, D. N.; Prawer, S.; Gonon, P.; Walker, R.; Dooley, S.; Bettiol, A.; Pearce, J.

    1996-01-01

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs

  5. Introduction to basic immunological methods : Generalities, Principles, Protocols and Variants of basic protocols

    International Nuclear Information System (INIS)

    Mejri, Naceur

    2013-01-01

    This manuscript is dedicated to students of biological sciences. It provides the information necessary to perform the practical work most commonly used in immunology. During my doctoral and post-doctoral periods, a panoply of methods was employed across the diverse subjects of my research. The technical means used in my investigations were diverse enough that I could extract a set of techniques covering most of the basic immunological methods. Each chapter of this manuscript contains a fairly complete description of an immunological method. In each topic the basic protocol and its variants are preceded by background information provided in paragraphs concerning the principle and generalities. The emphasis is placed on describing situations in which each method and its variants were used. These basic immunological methods are useful for students and even researchers studying the immune system of humans, mice and other species. The different subjects include not only detailed protocols but also photos and/or schemas used as support to illustrate knowledge or practical know-how. I hope that students will find this manual interesting and easy to use, and that it contains the information necessary to acquire skills in immunological practice. (Author)

  6. Atlas-based analysis of cardiac shape and function: correction of regional shape bias due to imaging protocol for population studies.

    Science.gov (United States)

    Medrano-Gracia, Pau; Cowan, Brett R; Bluemke, David A; Finn, J Paul; Kadish, Alan H; Lee, Daniel C; Lima, Joao A C; Suinesiaputra, Avan; Young, Alistair A

    2013-09-13

    Cardiovascular imaging studies generate a wealth of data which is typically used only for individual study endpoints. By pooling data from multiple sources, quantitative comparisons can be made of regional wall motion abnormalities between different cohorts, enabling reuse of valuable data. Atlas-based analysis provides precise quantification of shape and motion differences between disease groups and normal subjects. However, subtle shape differences may arise due to differences in imaging protocol between studies. A mathematical model describing regional wall motion and shape was used to establish a coordinate system registered to the cardiac anatomy. The atlas was applied to data contributed to the Cardiac Atlas Project from two independent studies which used different imaging protocols: steady state free precession (SSFP) and gradient recalled echo (GRE) cardiovascular magnetic resonance (CMR). Shape bias due to imaging protocol was corrected using an atlas-based transformation which was generated from a set of 46 volunteers who were imaged with both protocols. Shape bias between GRE and SSFP was regionally variable, and was effectively removed using the atlas-based transformation. Global mass and volume bias was also corrected by this method. Regional shape differences between cohorts were more statistically significant after removing regional artifacts due to imaging protocol bias. Bias arising from imaging protocol can be both global and regional in nature, and is effectively corrected using an atlas-based transformation, enabling direct comparison of regional wall motion abnormalities between cohorts acquired in separate studies.
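
    The correction strategy, learning a bias from volunteers imaged with both protocols and removing it from one cohort, can be sketched with a simple additive per-coordinate model (the paper uses a full atlas-based transformation; the shape vectors here are invented):

```python
def protocol_bias_correction(paired_ssfp, paired_gre):
    """Learn a per-coordinate additive shape bias from volunteers
    imaged with both protocols, and return a corrector mapping GRE
    shape vectors into the SSFP frame. A simple additive stand-in
    for the paper's atlas-based transformation."""
    n = len(paired_ssfp)
    dim = len(paired_ssfp[0])
    bias = [sum(paired_gre[i][j] - paired_ssfp[i][j] for i in range(n)) / n
            for j in range(dim)]
    return lambda gre_shape: [g - b for g, b in zip(gre_shape, bias)]

# Toy data: two volunteers, 3-coordinate "shape vectors" (invented).
correct = protocol_bias_correction([[0.0, 0.0, 0.0], [2.0, 2.0, 2.0]],
                                   [[1.0, 0.0, 0.0], [3.0, 2.0, 2.0]])
```

    Because the bias is estimated per coordinate, a correction of this form can be regionally variable, matching the paper's observation that the GRE/SSFP bias differs by region.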

  7. Automated Verification of Quantum Protocols using MCMAS

    Directory of Open Access Journals (Sweden)

    F. Belardinelli

    2012-07-01

    Full Text Available We present a methodology for the automated verification of quantum protocols using MCMAS, a symbolic model checker for multi-agent systems. The method is based on the logical framework developed by D'Hondt and Panangaden for investigating epistemic and temporal properties, built on the model for Distributed Measurement-based Quantum Computation (DMC), an extension of the Measurement Calculus to distributed quantum systems. We describe the translation map from DMC to interpreted systems, the typical formalism for reasoning about time and knowledge in multi-agent systems. Then, we introduce dmc2ispl, a compiler into the input language of the MCMAS model checker. We demonstrate the technique by verifying the Quantum Teleportation Protocol, and discuss the performance of the tool.

  8. CRDM motion analysis using machine learning technique

    International Nuclear Information System (INIS)

    Nishimura, Takuya; Nakayama, Hiroyuki; Saitoh, Mayumi; Yaguchi, Seiji

    2017-01-01

    Magnetic jack type Control Rod Drive Mechanism (CRDM) for pressurized water reactor (PWR) plants operates control rods in response to electrical signals from the reactor control system. CRDM operability is evaluated by quantifying the armature's closed/opened response times, i.e. the intervals between the coil energizing/de-energizing points and the armature closed/opened points. MHI has already developed an automatic CRDM motion analysis and applied it to actual plants. However, CRDM operational data vary widely depending on characteristics such as plant condition, plant, etc. In the existing motion analysis, applying a single analysis technique to all plant conditions and plants raises an issue of analysis accuracy. In this study, MHI investigated motion analysis using machine learning (Random Forests), which flexibly accommodates CRDM operational data with wide variation and improves analysis accuracy. (author)
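
    The response-time quantification described above, the interval between a coil energizing point and an armature closed point, can be sketched from a sampled coil-current trace. This edge/dip detection is a simplified stand-in, not MHI's method, and the trace and thresholds are synthetic:

```python
def armature_response_time_ms(current_trace, dt_ms, energize_level, closed_dip):
    """Estimate armature closed time from a sampled coil current trace:
    the interval between the coil-energizing point (current first
    exceeds energize_level) and the armature-closed point, detected
    here as the first local minimum below closed_dip (the
    characteristic current dip as the moving armature changes the
    coil inductance). Returns None if no dip is found."""
    t_energize = next(i for i, c in enumerate(current_trace) if c > energize_level)
    for i in range(t_energize + 1, len(current_trace) - 1):
        c = current_trace[i]
        if c < closed_dip and c <= current_trace[i - 1] and c <= current_trace[i + 1]:
            return (i - t_energize) * dt_ms
    return None

# Synthetic coil-current samples (arbitrary units) with a dip at the
# armature-closed point.
trace = [0.0, 0.0, 0.5, 1.2, 1.8, 2.0, 1.4, 1.1, 1.6, 2.0, 2.0]
t_closed = armature_response_time_ms(trace, dt_ms=1.0,
                                     energize_level=1.0, closed_dip=1.3)
```

    Fixed thresholds like these are exactly what struggles across plants and operating conditions, which is the motivation the record gives for a learned (Random Forests) approach.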

  9. DNA-based techniques for authentication of processed food and food supplements.

    Science.gov (United States)

    Lo, Yat-Tung; Shaw, Pang-Chui

    2018-02-01

    Authentication of food or food supplements with medicinal values is important to avoid adverse toxic effects, provide consumer rights, as well as for certification purpose. Compared to morphological and spectrometric techniques, molecular authentication is found to be accurate, sensitive and reliable. However, DNA degradation and inclusion of inhibitors may lead to failure in PCR amplification. This paper reviews on the existing DNA extraction and PCR protocols, and the use of small size DNA markers with sufficient discriminative power for molecular authentication. Various emerging new molecular techniques such as isothermal amplification for on-site diagnosis, next-generation sequencing for high-throughput species identification, high resolution melting analysis for quick species differentiation, DNA array techniques for rapid detection and quantitative determination in food products are also discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.
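
    Several of the techniques listed (primer design for degraded DNA, high resolution melting analysis) hinge on simple sequence properties such as GC content and melting temperature. A sketch using the Wallace rule, which applies only to short oligonucleotides and is a rough first screen, not a replacement for nearest-neighbour models:

```python
def wallace_tm(primer):
    """Wallace rule melting temperature (deg C) for short primers
    (roughly <14 nt): Tm = 2*(A+T) + 4*(G+C). A quick first screen
    when choosing small, discriminative markers for degraded DNA."""
    p = primer.upper()
    return 2 * (p.count("A") + p.count("T")) + 4 * (p.count("G") + p.count("C"))

def gc_fraction(seq):
    """GC content, a main driver of melt-curve differences in high
    resolution melting (HRM) analysis."""
    s = seq.upper()
    return (s.count("G") + s.count("C")) / len(s)
```

    Amplicons that differ in GC content or length melt at measurably different temperatures, which is what lets HRM distinguish closely related species without sequencing.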

  10. Technique of sample preparation for analysis of gasoline and lubricating oils by X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Avila P, P.

    1990-03-01

    The X-ray fluorescence laboratory of the National Institute of Nuclear Research, not having a technique for the analysis of oils, has aimed with this work to develop a sample preparation technique for the analysis of the metals Pb, Cr, Ni, V and Mo in gasolines and oils by means of X-ray fluorescence spectrometry. The results obtained will be of great utility for the aforementioned laboratory. (Author)

  11. Diffusion MRI of the neonate brain: acquisition, processing and analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Pannek, Kerstin [University of Queensland, Centre for Clinical Research, Brisbane (Australia); University of Queensland, School of Medicine, Brisbane (Australia); University of Queensland, Centre for Advanced Imaging, Brisbane (Australia); Guzzetta, Andrea [IRCCS Stella Maris, Department of Developmental Neuroscience, Calambrone Pisa (Italy); Colditz, Paul B. [University of Queensland, Centre for Clinical Research, Brisbane (Australia); University of Queensland, Perinatal Research Centre, Brisbane (Australia); Rose, Stephen E. [University of Queensland, Centre for Clinical Research, Brisbane (Australia); University of Queensland, Centre for Advanced Imaging, Brisbane (Australia); University of Queensland Centre for Clinical Research, Royal Brisbane and Women's Hospital, Brisbane (Australia)

    2012-10-15

    Diffusion MRI (dMRI) is a popular noninvasive imaging modality for the investigation of the neonate brain. It enables the assessment of white matter integrity, and is particularly suited for studying white matter maturation in the preterm and term neonate brain. Diffusion tractography allows the delineation of white matter pathways and assessment of connectivity in vivo. In this review, we address the challenges of performing and analysing neonate dMRI. Of particular importance in dMRI analysis is adequate data preprocessing to reduce image distortions inherent to the acquisition technique, as well as artefacts caused by head movement. We present a summary of techniques that should be used in the preprocessing of neonate dMRI data, and demonstrate the effect of these important correction steps. Furthermore, we give an overview of available analysis techniques, ranging from voxel-based analysis of anisotropy metrics including tract-based spatial statistics (TBSS) to recently developed methods of statistical analysis addressing issues of resolving complex white matter architecture. We highlight the importance of resolving crossing fibres for tractography and outline several tractography-based techniques, including connectivity-based segmentation, the connectome and tractography mapping. These techniques provide powerful tools for the investigation of brain development and maturation. (orig.)

  12. Vertical Protocol Composition

    DEFF Research Database (Denmark)

    Groß, Thomas; Mödersheim, Sebastian Alexander

    2011-01-01

    The security of key exchange and secure channel protocols, such as TLS, has been studied intensively. However, only few works have considered what happens when the established keys are actually used—to run some protocol securely over the established “channel”. We call this a vertical protocol.......e., that the combination cannot introduce attacks that the individual protocols in isolation do not have. In this work, we prove a composability result in the symbolic model that allows for arbitrary vertical composition (including self-composition). It holds for protocols from any suite of channel and application...

  13. Dysphonia risk screening protocol

    Science.gov (United States)

    Nemr, Katia; Simões-Zenari, Marcia; da Trindade Duarte, João Marcos; Lobrigate, Karen Elena; Bagatini, Flavia Alves

    2016-01-01

    OBJECTIVE: To propose and test the applicability of a dysphonia risk screening protocol with score calculation in individuals with and without dysphonia. METHOD: This descriptive cross-sectional study included 365 individuals (41 children, 142 adult women, 91 adult men and 91 seniors) divided into a dysphonic group and a non-dysphonic group. The protocol consisted of 18 questions and a score was calculated using a 10-cm visual analog scale. The measured value on the visual analog scale was added to the overall score, along with other partial scores. Speech samples allowed for analysis/assessment of the overall degree of vocal deviation and initial definition of the respective groups and after six months, the separation of the groups was confirmed using an acoustic analysis. RESULTS: The mean total scores were different between the groups in all samples. Values ranged between 37.0 and 57.85 in the dysphonic group and between 12.95 and 19.28 in the non-dysphonic group, with overall means of 46.09 and 15.55, respectively. High sensitivity and specificity were demonstrated when discriminating between the groups with the following cut-off points: 22.50 (children), 29.25 (adult women), 22.75 (adult men), and 27.10 (seniors). CONCLUSION: The protocol demonstrated high sensitivity and specificity in differentiating groups of individuals with and without dysphonia in different sample groups and is thus an effective instrument for use in voice clinics. PMID:27074171
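    The group-specific cut-off points reported above lend themselves to a simple screening rule. A minimal sketch, assuming that scores at or above the cut-off indicate dysphonia risk (the group labels and function name are illustrative, not from the protocol itself):

    ```python
    # Cut-off points as reported in the abstract; the group keys are illustrative labels.
    CUTOFFS = {"children": 22.50, "adult_women": 29.25, "adult_men": 22.75, "seniors": 27.10}

    def at_risk(total_score: float, group: str) -> bool:
        """Flag dysphonia risk, assuming scores at or above the cut-off are positive."""
        return total_score >= CUTOFFS[group]

    print(at_risk(46.09, "adult_women"))  # overall dysphonic-group mean -> True
    print(at_risk(15.55, "adult_men"))    # overall non-dysphonic-group mean -> False
    ```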

  14. Dysphonia risk screening protocol

    Directory of Open Access Journals (Sweden)

    Katia Nemr

    2016-03-01

    Full Text Available OBJECTIVE: To propose and test the applicability of a dysphonia risk screening protocol with score calculation in individuals with and without dysphonia. METHOD: This descriptive cross-sectional study included 365 individuals (41 children, 142 adult women, 91 adult men and 91 seniors) divided into a dysphonic group and a non-dysphonic group. The protocol consisted of 18 questions and a score was calculated using a 10-cm visual analog scale. The measured value on the visual analog scale was added to the overall score, along with other partial scores. Speech samples allowed for analysis/assessment of the overall degree of vocal deviation and initial definition of the respective groups and after six months, the separation of the groups was confirmed using an acoustic analysis. RESULTS: The mean total scores were different between the groups in all samples. Values ranged between 37.0 and 57.85 in the dysphonic group and between 12.95 and 19.28 in the non-dysphonic group, with overall means of 46.09 and 15.55, respectively. High sensitivity and specificity were demonstrated when discriminating between the groups with the following cut-off points: 22.50 (children), 29.25 (adult women), 22.75 (adult men), and 27.10 (seniors). CONCLUSION: The protocol demonstrated high sensitivity and specificity in differentiating groups of individuals with and without dysphonia in different sample groups and is thus an effective instrument for use in voice clinics.

  15. Three-stage treatment protocol for recalcitrant distal femoral nonunion.

    Science.gov (United States)

    Ma, Ching-Hou; Chiu, Yen-Chun; Tu, Yuan-Kun; Yen, Cheng-Yo; Wu, Chin-Hsien

    2017-04-01

    In this study, we proposed a three-stage treatment protocol for recalcitrant distal femoral nonunion and aimed to analyze the clinical results. We retrospectively reviewed 12 consecutive patients with recalcitrant distal femoral nonunion who underwent our three-stage treatment protocol from January 2010 to December 2014 in our institute. The three-stage treatment protocol comprised debridement of the nonunion site, lengthening to eliminate leg length discrepancy, deformity correction, stabilization with a locked plate, filling of the defect with a cement spacer to induce membrane formation, and bone reconstruction using a cancellous bone autograft (Masquelet technique) or free vascularized fibular bone graft. The bone union time, wound complications, lower limb alignment, amount of lengthening, knee range of motion, and functional outcomes were evaluated. Osseous union was achieved, and the mean amount of lengthening was 5.88 cm (range 3.5-12 cm). Excellent or good outcomes were obtained in 9 patients. Although the current study involved only a small number of patients and the intervention comprised three stages, we believe that such a protocol may be a valuable alternative for the treatment of recalcitrant distal femoral nonunion.

  16. Connectivity-Based Reliable Multicast MAC Protocol for IEEE 802.11 Wireless LANs

    Directory of Open Access Journals (Sweden)

    Woo-Yong Choi

    2009-01-01

    Full Text Available We propose an efficient reliable multicast MAC protocol based on the connectivity information among the recipients. Enhancing the BMMM (Batch Mode Multicast MAC) protocol, the reliable multicast MAC protocol significantly reduces the number of RAK (Request for ACK) frame transmissions in a reasonable computational time and enhances the MAC performance. Through analytical performance analysis, the throughputs of the BMMM protocol and our proposed MAC protocol are derived. Numerical examples show that our proposed MAC protocol increases the reliable multicast MAC performance for IEEE 802.11 wireless LANs.

  17. EFFICACY OF DIFFERENT ENDODONTIC IRRIGATION PROTOCOLS IN CALCIUM HYDROXIDE REMOVAL

    Directory of Open Access Journals (Sweden)

    Elka N. Radeva

    2016-10-01

    Full Text Available Introduction: Calcium hydroxide is widely used in the field of endodontics as a temporary root canal filling. This medicament significantly increases pH and optimizes the treatment outcome. Its complete removal before final obturation is very important; otherwise it could compromise the hermetic filling and, respectively, endodontic success. Aim: To evaluate the most effective irrigation protocol for calcium hydroxide removal from root canals. Materials and methods: In this study 36 single-root-canal teeth were observed. They were randomly divided into three groups (n=10 each) according to the technique applied for calcium hydroxide removal (manual irrigation; irrigation and Revo-S rotary instrumentation; passive ultrasonic irrigation) and a control group (n=6, irrigation with distilled water only). After calcium hydroxide removal following the procedures above, teeth were separated longitudinally in a buccal-lingual direction and remnants of the medicament were observed in the apical, middle and coronal part of each tooth. All of the specimens were then examined using scanning electron microscopy and evaluated on a specified scale. The results underwent statistical analysis. Results: For calcium hydroxide in the apical and middle thirds, the highest average was found with Revo-S, followed by ultrasonic irrigation and manual irrigation. In the coronal part the highest average belonged to Revo-S, followed by manual irrigation and ultrasonic irrigation. Across all parts, the highest average overall was in the control group. Conclusion: There is no universal technique for the removal of intracanal medicaments, and applying more than one protocol is required.

  18. Flash Infrared Thermography Contrast Data Analysis Technique

    Science.gov (United States)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.
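    The two measurements named above (normalized contrast evolution and half-max edge detection) can be given a rough shape in code. This is a plausible reconstruction under stated assumptions, not the IR Contrast tool's actual implementation; the normalization formula in particular is an assumption:

    ```python
    import numpy as np

    def normalized_contrast(t_anomaly, t_ref):
        """One plausible normalization (assumed, not from the paper): the
        anomaly-minus-reference temperature difference, divided by the
        reference region's own temperature rise since the flash (frame 0)."""
        t_anomaly, t_ref = np.asarray(t_anomaly, float), np.asarray(t_ref, float)
        denom = t_ref - t_ref[0]
        denom[denom == 0] = np.nan  # undefined at the flash frame itself
        return (t_anomaly - t_ref) / denom

    def half_max_width(profile):
        """Half-max edge detection: width of the region at or above half the peak."""
        profile = np.asarray(profile, float)
        above = np.flatnonzero(profile >= profile.max() / 2.0)
        return int(above[-1] - above[0] + 1)

    print(half_max_width([0, 1, 3, 4, 3, 1, 0]))  # peak 4, half-max 2 -> width 3
    ```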

  19. Study protocol for examining job strain as a risk factor for severe unipolar depression in an individual participant meta-analysis of 14 European cohorts

    DEFF Research Database (Denmark)

    Madsen, Ida E H; Hannerz, Harald; Nyberg, Solja T

    2013-01-01

    BACKGROUND: Previous studies have shown that gainfully employed individuals with high work demands and low control at work (denoted "job strain") are at increased risk of common mental disorders, including depression. Most existing studies have, however, measured depression using self-rated symptom scales that do not necessarily correspond to clinically diagnosed depression. In addition, a meta-analysis from 2008 indicated publication bias in the field. METHODS: This study protocol describes the planned design and analyses of an individual participant data meta-analysis, to examine whether job ... using random effects meta-analysis. DISCUSSION: The planned analyses will help clarify whether job strain is associated with an increased risk of clinically diagnosed unipolar depression. As the analysis is based on pre-planned study protocols and an individual participant data meta-analysis, the pooled ...

  20. A Passive Testing Approach for Protocols in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Xiaoping Che

    2015-11-01

    Full Text Available Smart systems are increasingly developed today, with the number of wireless sensor devices drastically increasing. They are implemented within several contexts throughout our environment. Thus, the sensed data transported in ubiquitous systems are important, and the way to carry them must be efficient and reliable. For that purpose, several routing protocols have been proposed for wireless sensor networks (WSN). However, one stage that is often neglected before their deployment is the conformance testing process, a crucial and challenging step. Compared to active testing techniques commonly used in wired networks, passive approaches are more suitable to the WSN environment. While some works propose to specify the protocol with state models or to analyze it with simulators and emulators, we here propose a logic-based approach for formally specifying some functional requirements of a novel WSN routing protocol. We provide an algorithm to evaluate these properties on collected protocol execution traces. Further, we demonstrate the efficiency and suitability of our approach by applying it to common WSN functional properties, as well as specific ones designed for our own routing protocol. We provide relevant testing verdicts through a real indoor testbed and the implementation of our protocol. Furthermore, the flexibility, genericity and practicability of our approach have been proven by the experimental results.
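    The idea of passively evaluating a property on a collected execution trace can be made concrete in a few lines. The tuple-based trace encoding and event names below are assumptions for illustration, not the paper's actual formalism:

    ```python
    # A trace is assumed to be a list of (event, packet_id) tuples collected
    # from a running node; event names are illustrative.
    def eventually_answered(trace, request="send", response="ack"):
        """Passive check of a response property: every request event must be
        followed, later in the trace, by a matching response event."""
        pending = set()
        for event, pid in trace:
            if event == request:
                pending.add(pid)
            elif event == response:
                pending.discard(pid)
        return not pending  # verdict: True = property holds on this trace

    print(eventually_answered([("send", 1), ("ack", 1)]))               # True
    print(eventually_answered([("send", 1), ("ack", 1), ("send", 2)]))  # False
    ```

    Because the check only reads traces, it never injects traffic into the network, which is what makes passive approaches attractive for resource-constrained WSN nodes.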

  1. Conference on Techniques of Nuclear and Conventional Analysis and Applications

    International Nuclear Information System (INIS)

    2012-01-01

    Full text : With their wide scope, particularly in the areas of environment, geology, mining, industry and life sciences, analysis techniques are of great importance in both fundamental and applied research. The Conference on Techniques for Nuclear and Conventional Analysis and Applications (TANCA) is registered in the national strategy of opening the universities and national research centers at the local, national and international levels. This conference aims to: promote nuclear and conventional analytical techniques; contribute to the creation of synergy between the different players involved in these techniques, including universities, research organizations, regulatory authorities, economic operators, NGOs and others; inform and educate potential users about the performance of these techniques; strengthen exchanges and links between researchers, industry and policy makers; implement a program of inter-laboratory comparison between Moroccan laboratories on one hand and their foreign counterparts on the other; and contribute to the research training of doctoral students and postdoctoral scholars. Given the relevance and importance of issues related to the environment and its impact on cultural heritage, this fourth edition of TANCA is devoted to the application of conventional and nuclear analytical techniques to questions tied to the environment and its impact on cultural heritage.

  2. Validation of IT-based Data Communication Protocol for Nuclear Power Plant

    International Nuclear Information System (INIS)

    Jeong, K. I.; Kim, D. H.; Lee, J. C.

    2009-12-01

    The communication network designed to transmit control and processing signals in the digital Instrumentation and Control (I and C) systems of a Nuclear Power Plant (NPP) should provide a high level of safety and reliability. The communication networks of NPPs differ from commercial communication networks: safety and reliability, rather than the efficiency that drives commercial network design, are the most important factors. To develop a data communication protocol for NPPs, we analyzed the design criteria and performance requirements of existing commercial communication protocols based on Information Technology (IT), and examined their adaptability to the communication protocol of an NPP. Based on these results, we developed our own protocol (Nuclear power plant Safety Communication Protocol: NSCP) for NPP I and C, which meets the required specifications, by designing the overall protocol architecture and data frame format and defining the functional requirements and specifications. NSCP is the communication protocol designed for a safety-grade control network in the nuclear power plant. In this report, we specified the NSCP protocol with an FDT (Formal Description Technique) and established validation procedures based on the validation methodology. Specification errors, the validity of major functions, and the reachability of NSCP were confirmed by performing simulation and the validation process using the Telelogic Tau tool
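    The abstract does not disclose NSCP's actual frame format, but the kind of integrity checking that a safety-grade data frame design implies can be sketched generically. The two-byte length field and CRC-32 trailer below are illustrative assumptions, not NSCP's real layout:

    ```python
    import zlib

    def make_frame(payload: bytes) -> bytes:
        """Wrap a payload in an assumed frame: 2-byte length + payload + CRC-32."""
        return (len(payload).to_bytes(2, "big")
                + payload
                + zlib.crc32(payload).to_bytes(4, "big"))

    def check_frame(frame: bytes) -> bool:
        """Receiver-side validity check: length field and CRC must both match."""
        length = int.from_bytes(frame[:2], "big")
        payload = frame[2:2 + length]
        crc = int.from_bytes(frame[2 + length:2 + length + 4], "big")
        return len(payload) == length and zlib.crc32(payload) == crc

    frame = make_frame(b"trip signal")
    print(check_frame(frame))  # True

    corrupted = bytearray(frame)
    corrupted[2] ^= 0xFF  # flip bits in the first payload byte
    print(check_frame(bytes(corrupted)))  # False
    ```

    Rejecting any frame whose redundancy check fails, rather than attempting recovery, reflects the safety-over-efficiency priority described above.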

  3. Interactive verification of Markov chains: Two distributed protocol case studies

    Directory of Open Access Journals (Sweden)

    Johannes Hölzl

    2012-12-01

    Full Text Available Probabilistic model checkers like PRISM only check probabilistic systems of a fixed size. To guarantee the desired properties for an arbitrary size, mathematical analysis is necessary. We show for two case studies how this can be done in the interactive proof assistant Isabelle/HOL. The first case study is a detailed description of how we verified properties of the ZeroConf protocol, a decentralized address allocation protocol. The second case study shows the more involved verification of anonymity properties of the Crowds protocol, an anonymizing protocol.
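    To give a flavour of the parametric properties involved, a textbook simplification of the ZeroConf reliability question can be stated in a few lines: a host keeps a conflicting address only if the address collides and every probe announcing it is lost. This closed form is a hedged illustration, not the verified Isabelle/HOL model:

    ```python
    def undetected_collision_prob(p_collision: float, p_loss: float, n_probes: int) -> float:
        """Probability that a host keeps a conflicting address: the chosen address
        must collide AND all n probe messages must be lost independently."""
        return p_collision * p_loss ** n_probes

    # With a 1% collision chance, 10% message loss, and 4 probes:
    print(undetected_collision_prob(0.01, 0.1, 4))  # about 1e-06
    ```

    The point of the interactive proof is precisely that such a bound holds for every value of the parameters, not just the fixed instances a model checker can enumerate.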

  4. Low energy analysis techniques for CUORE

    Energy Technology Data Exchange (ETDEWEB)

    Alduino, C.; Avignone, F.T.; Chott, N.; Creswick, R.J.; Rosenfeld, C.; Wilson, J. [University of South Carolina, Department of Physics and Astronomy, Columbia, SC (United States); Alfonso, K.; Huang, H.Z.; Sakai, M.; Schmidt, J. [University of California, Department of Physics and Astronomy, Los Angeles, CA (United States); Artusa, D.R.; Rusconi, C. [University of South Carolina, Department of Physics and Astronomy, Columbia, SC (United States); INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Azzolini, O.; Camacho, A.; Keppel, G.; Palmieri, V.; Pira, C. [INFN-Laboratori Nazionali di Legnaro, Padua (Italy); Bari, G.; Deninno, M.M. [INFN-Sezione di Bologna, Bologna (Italy); Beeman, J.W. [Lawrence Berkeley National Laboratory, Materials Science Division, Berkeley, CA (United States); Bellini, F.; Cosmelli, C.; Ferroni, F.; Piperno, G. [Sapienza Universita di Roma, Dipartimento di Fisica, Rome (Italy); INFN-Sezione di Roma, Rome (Italy); Benato, G.; Singh, V. [University of California, Department of Physics, Berkeley, CA (United States); Bersani, A.; Caminata, A. [INFN-Sezione di Genova, Genoa (Italy); Biassoni, M.; Brofferio, C.; Capelli, S.; Carniti, P.; Cassina, L.; Chiesa, D.; Clemenza, M.; Faverzani, M.; Fiorini, E.; Gironi, L.; Gotti, C.; Maino, M.; Nastasi, M.; Nucciotti, A.; Pavan, M.; Pozzi, S.; Sisti, M.; Terranova, F.; Zanotti, L. [Universita di Milano-Bicocca, Dipartimento di Fisica, Milan (Italy); INFN-Sezione di Milano Bicocca, Milan (Italy); Branca, A.; Taffarello, L. [INFN-Sezione di Padova, Padua (Italy); Bucci, C.; Cappelli, L.; D' Addabbo, A.; Gorla, P.; Pattavina, L.; Pirro, S. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Canonica, L. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Massachusetts Institute of Technology, Cambridge, MA (United States); Cao, X.G.; Fang, D.Q.; Ma, Y.G.; Wang, H.W.; Zhang, G.Q. 
[Shanghai Institute of Applied Physics, Chinese Academy of Sciences, Shanghai (China); Cardani, L.; Casali, N.; Dafinei, I.; Morganti, S.; Mosteiro, P.J.; Tomei, C.; Vignati, M. [INFN-Sezione di Roma, Rome (Italy); Copello, S.; Di Domizio, S.; Marini, L.; Pallavicini, M. [INFN-Sezione di Genova, Genoa (Italy); Universita di Genova, Dipartimento di Fisica, Genoa (Italy); Cremonesi, O.; Ferri, E.; Giachero, A.; Pessina, G.; Previtali, E. [INFN-Sezione di Milano Bicocca, Milan (Italy); Cushman, J.S.; Davis, C.J.; Heeger, K.M.; Lim, K.E.; Maruyama, R.H. [Yale University, Department of Physics, New Haven, CT (United States); D' Aguanno, D.; Pagliarone, C.E. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Universita degli Studi di Cassino e del Lazio Meridionale, Dipartimento di Ingegneria Civile e Meccanica, Cassino (Italy); Dell' Oro, S. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); INFN-Gran Sasso Science Institute, L' Aquila (Italy); Di Vacri, M.L.; Santone, D. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Universita dell' Aquila, Dipartimento di Scienze Fisiche e Chimiche, L' Aquila (Italy); Drobizhev, A.; Hennings-Yeomans, R.; Kolomensky, Yu.G.; Wagaarachchi, S.L. [University of California, Department of Physics, Berkeley, CA (United States); Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Franceschi, M.A.; Ligi, C.; Napolitano, T. [INFN-Laboratori Nazionali di Frascati, Rome (Italy); Freedman, S.J. [University of California, Department of Physics, Berkeley, CA (United States); Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Fujikawa, B.K.; Mei, Y.; Schmidt, B.; Smith, A.R.; Welliver, B. [Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Giuliani, A.; Novati, V. [Universite Paris-Saclay, CSNSM, Univ. 
Paris-Sud, CNRS/IN2P3, Orsay (France); Gladstone, L.; Leder, A.; Ouellet, J.L.; Winslow, L.A. [Massachusetts Institute of Technology, Cambridge, MA (United States); Gutierrez, T.D. [California Polytechnic State University, Physics Department, San Luis Obispo, CA (United States); Haller, E.E. [Lawrence Berkeley National Laboratory, Materials Science Division, Berkeley, CA (United States); University of California, Department of Materials Science and Engineering, Berkeley, CA (United States); Han, K. [Shanghai Jiao Tong University, Department of Physics and Astronomy, Shanghai (China); Hansen, E. [University of California, Department of Physics and Astronomy, Los Angeles, CA (United States); Massachusetts Institute of Technology, Cambridge, MA (United States); Kadel, R. [Lawrence Berkeley National Laboratory, Physics Division, Berkeley, CA (United States); Martinez, M. [Sapienza Universita di Roma, Dipartimento di Fisica, Rome (Italy); INFN-Sezione di Roma, Rome (Italy); Universidad de Zaragoza, Laboratorio de Fisica Nuclear y Astroparticulas, Saragossa (Spain); Moggi, N.; Zucchelli, S. [INFN-Sezione di Bologna, Bologna (Italy); Universita di Bologna - Alma Mater Studiorum, Dipartimento di Fisica e Astronomia, Bologna (IT); Nones, C. [CEA/Saclay, Service de Physique des Particules, Gif-sur-Yvette (FR); Norman, E.B.; Wang, B.S. [Lawrence Livermore National Laboratory, Livermore, CA (US); University of California, Department of Nuclear Engineering, Berkeley, CA (US); O' Donnell, T. [Virginia Polytechnic Institute and State University, Center for Neutrino Physics, Blacksburg, VA (US); Sangiorgio, S.; Scielzo, N.D. [Lawrence Livermore National Laboratory, Livermore, CA (US); Wise, T. [Yale University, Department of Physics, New Haven, CT (US); University of Wisconsin, Department of Physics, Madison, WI (US); Woodcraft, A. [University of Edinburgh, SUPA, Institute for Astronomy, Edinburgh (GB); Zimmermann, S. 
[Lawrence Berkeley National Laboratory, Engineering Division, Berkeley, CA (US)

    2017-12-15

    CUORE is a tonne-scale cryogenic detector operating at the Laboratori Nazionali del Gran Sasso (LNGS) that uses tellurium dioxide bolometers to search for neutrinoless double-beta decay of {sup 130}Te. CUORE is also suitable to search for low energy rare events such as solar axions or WIMP scattering, thanks to its ultra-low background and large target mass. However, to conduct such sensitive searches requires improving the energy threshold to 10 keV. In this paper, we describe the analysis techniques developed for the low energy analysis of CUORE-like detectors, using the data acquired from November 2013 to March 2015 by CUORE-0, a single-tower prototype designed to validate the assembly procedure and new cleaning techniques of CUORE. We explain the energy threshold optimization, continuous monitoring of the trigger efficiency, data and event selection, and energy calibration at low energies in detail. We also present the low energy background spectrum of CUORE-0 below 60 keV. Finally, we report the sensitivity of CUORE to WIMP annual modulation using the CUORE-0 energy threshold and background, as well as an estimate of the uncertainty on the nuclear quenching factor from nuclear recoils in CUORE-0. (orig.)

  5. NMR and modelling techniques in structural and conformation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Abraham, R J [Liverpool Univ. (United Kingdom)

    1994-12-31

    The use of Lanthanide Induced Shifts (L.I.S.) and modelling techniques in conformational analysis is presented. The use of Co{sup III} porphyrins as shift reagents is discussed, with examples of their use in the conformational analysis of some heterocyclic amines. (author) 13 refs., 9 figs.

  6. The application of value analysis techniques for complex problems

    International Nuclear Information System (INIS)

    Chiquelin, W.R.; Cossel, S.C.; De Jong, V.J.; Halverson, T.W.

    1986-01-01

    This paper discusses the application of the Value Analysis technique to the transuranic package transporter (TRUPACT). A team representing five different companies or organizations with diverse technical backgrounds was formed to analyze and recommend improvements. The result was a 38% system-wide savings, if incorporated, and a shipping container that is volumetrically and payload efficient as well as user friendly. The Value Analysis technique is a proven tool widely used in many diverse areas, both in government and in the private sector. Value Analysis uses functional diagramming of a piece of equipment or process to discretely identify every facet of the item being analyzed. A standard set of questions is then asked: What is it? What does it do? What does it cost? What else will do the task? And what would that cost? Using logic and a disciplined approach, Value Analysis yields an item that performs the necessary functions at high quality and the lowest overall cost.

  7. Use of Atomic and Nuclear Techniques in Elemental and Isotopic Analysis

    International Nuclear Information System (INIS)

    2008-01-01

    This book is divided into four chapters, presented by six of the best Arab specialists, who have used atomic and nuclear techniques for a long time and recognized their importance and capabilities in scientific research. Atomic and nuclear techniques are very successful in the field of analysis because they are the only way to carry out the analysis with the required accuracy, and they are the cheapest at the same time. A number of these techniques were collected in this book on the basis of their accuracy and how widely they are used in the analysis of material components, especially when the elements of interest are present in insignificant percentages, as in the case of toxicology, archaeology, nutrition, medicine and other applications.

  8. Recent advances in hopanoids analysis: Quantification protocols overview, main research targets and selected problems of complex data exploration.

    Science.gov (United States)

    Zarzycki, Paweł K; Portka, Joanna K

    2015-09-01

    Pentacyclic triterpenoids, particularly hopanoids, are organism-specific compounds and are generally considered useful biomarkers that allow fingerprinting and classification of biological, environmental and geological samples. Simultaneous quantification of various hopanoids, together with a battery of related non-polar and low-molecular-mass compounds, may provide principal information for geochemical and environmental research focusing on both modern and ancient investigations. Target compounds can be derived from microbial biomass, water columns, sediments, coals, crude fossils or rocks. This creates a number of analytical problems due to the differing compositions of the analytical matrix and interfering compounds; therefore, proper optimization of quantification protocols for such biomarkers is still a challenge. In this work we summarize typical analytical protocols that were recently applied for the quantification of hopanoid-like compounds from different samples. The main steps, including extraction of the components of interest, pre-purification, fractionation, derivatization and quantification involving gas (1D and 2D) as well as liquid separation techniques (liquid-liquid extraction, solid-phase extraction, planar and low-resolution column chromatography, high-performance liquid chromatography), are described and discussed from a practical point of view, mainly based on the experimental papers published within the last two years, in which a significant increase in hopanoid research was noticed. The second aim of this review is to describe the latest research trends concerning the determination of hopanoids and related low-molecular-mass lipids analyzed in various samples, including sediments, rocks, coals, crude oils and plant fossils as well as stromatolites and microbial biomass cultivated under different conditions. It has been found that the majority of the most recent papers are based on a uni- or bivariate approach for complex data analysis. Data interpretation involves

  9. Prospective evaluation of a protocol for using transabdominal ultrasound to screen for short cervix.

    Science.gov (United States)

    Pandipati, Santosh; Combs, C Andrew; Fishman, Alan; Lee, Sarah Y; Mallory, Kimberly; Ianovich, Francesca

    2015-07-01

    We sought to evaluate a recently proposed protocol whereby transabdominal ultrasound of the cervix might be used as a prescreen to select women to undergo or to forgo measurement of cervical length via transvaginal ultrasound (CLvag). This was a prospective cohort study. Measurements of cervical length via transabdominal ultrasound (CLabd) and CLvag were made in women with singleton pregnancy during routine obstetrical ultrasound examination at 18(0/7) to 23(6/7) weeks of gestation. The transabdominal screen was considered positive if CLabd was ≤36 mm with the maternal bladder full or ≤35 mm with the bladder empty, or adequate imaging of the cervix could not be obtained. Sensitivity, specificity, predictive values, and likelihood ratios of a positive screen to detect a short cervix (CLvag ≤25 mm) were calculated. An interim analysis identified several technical problems with CLabd measurements, so the protocol was extensively revised. Under the revised protocol, 1580 women were included. Adequate views of the cervix were obtained via transabdominal imaging in 46% of subjects with the bladder empty and 56% with the bladder full. The correlation between CLabd and CLvag was poor (r = 0.38). Of the 17 patients with a short cervix, 15 had suboptimal transabdominal exams (screen positive) and 2 had CLabd ≤35 mm with bladder empty (screen positive). Sensitivity of the screen was 100% (95% confidence interval, 80.5-100%) but specificity was only 32.2% (95% confidence interval, 29.9-34.6%) and screen positive rate was 66.3%. Several technical problems and limitations of transabdominal imaging of the cervix are shown. Using modern, high-resolution ultrasound equipment, we were unable to adequately image the cervix via transabdominal ultrasound in half the cases. Although we confirmed that a CLabd cutoff value of 35-36 mm is appropriate for detection of short cervix, the technique for measuring CLabd is fraught with technical problems. Practitioners must validate the
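    The screen-positive rule and the headline metrics reported in this abstract can be written out directly. The thresholds (36 mm with bladder full, 35 mm with bladder empty) are as reported; the function names are illustrative:

    ```python
    def screen_positive(cl_abd_mm: float, bladder_full: bool, imaging_adequate: bool = True) -> bool:
        """Transabdominal prescreen rule from the abstract: positive if the cervix
        could not be adequately imaged, or if CLabd <= 36 mm (bladder full)
        or <= 35 mm (bladder empty)."""
        if not imaging_adequate:
            return True
        return cl_abd_mm <= (36.0 if bladder_full else 35.0)

    def sensitivity(tp: int, fn: int) -> float:
        """True positives over all condition-positive cases."""
        return tp / (tp + fn)

    # All 17 short-cervix cases screened positive (15 inadequate exams, 2 short CLabd):
    print(sensitivity(17, 0))                         # 1.0
    print(screen_positive(34.0, bladder_full=False))  # True
    ```

    With perfect sensitivity but a 66.3% screen-positive rate, the rule would spare only a third of women the transvaginal exam, which is why the authors emphasize the technical limitations rather than the cutoff itself.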

  10. Patient perspectives: Kundalini yoga meditation techniques for psycho-oncology and as potential therapies for cancer.

    Science.gov (United States)

    Shannahoff-Khalsa, David S

    2005-03-01

    The ancient system of Kundalini Yoga (KY) includes a vast array of meditation techniques. Some were discovered to be specific for treating psychiatric disorders and others are supposedly beneficial for treating cancers. To date, 2 clinical trials have been conducted for treating obsessive-compulsive disorder (OCD). The first was an open uncontrolled trial and the second a single-blinded randomized controlled trial (RCT) comparing a KY protocol against the Relaxation Response and Mindfulness Meditation (RRMM) techniques combined. Both trials showed efficacy on all psychological scales using the KY protocol; however, the RCT showed no efficacy on any scale with the RRMM control group. The KY protocol employed an OCD-specific meditation technique combined with other techniques that are individually specific for anxiety, low energy, fear, anger, meeting mental challenges, and turning negative thoughts into positive thoughts. In addition to OCD symptoms, other symptoms, including anxiety and depression, were also significantly reduced. Elements of the KY protocol other than the OCD-specific technique also may have applications for psycho-oncology patients and are described here. Two depression-specific KY techniques are described that also help combat mental fatigue and low energy. A 7-part protocol is described that would be used in KY practice to affect the full spectrum of emotions and distress that complicate a cancer diagnosis. In addition, there are KY techniques that practitioners have used in treating cancer. These techniques have not yet been subjected to formal clinical trials but are described here as potential adjunctive therapies. A case history demonstrating rapid onset of acute relief of intense fear in a terminal breast cancer patient using a KY technique specific for fear is presented. A second case history is reported for a surviving male diagnosed in 1988 with terminal prostate cancer who has used KY therapy long term as part of a self

  11. Infusing Reliability Techniques into Software Safety Analysis

    Science.gov (United States)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  12. A Study of Shared-Memory Mutual Exclusion Protocols Using CADP

    Science.gov (United States)

    Mateescu, Radu; Serwe, Wendelin

    Mutual exclusion protocols are an essential building block of concurrent systems: indeed, such a protocol is required whenever a shared resource has to be protected against concurrent non-atomic accesses. Hence, many variants of mutual exclusion protocols exist in the shared-memory setting, such as Peterson's or Dekker's well-known protocols. Although the functional correctness of these protocols has been studied extensively, relatively little attention has been paid to their non-functional aspects, such as their performance in the long run. In this paper, we report on experiments with the performance evaluation of mutual exclusion protocols using Interactive Markov Chains. Steady-state analysis provides an additional criterion for comparing protocols, which complements the verification of their functional properties. We also carefully re-examined the functional properties, whose accurate formulation as temporal logic formulas in the action-based setting turns out to be quite involved.
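
    Peterson's protocol mentioned above is short enough to sketch directly; the version below relies on CPython's global interpreter lock to supply the sequential consistency the algorithm assumes (on real hardware, memory fences would be needed):

```python
import threading

# Peterson's two-process mutual exclusion protocol (sketch).
flag = [False, False]   # flag[i]: process i wants to enter
turn = 0                # which process must wait on conflict
counter = 0             # shared resource protected by the protocol

def worker(i, iterations):
    global turn, counter
    other = 1 - i
    for _ in range(iterations):
        flag[i] = True
        turn = other                        # yield priority to the other process
        while flag[other] and turn == other:
            pass                            # busy-wait outside the critical section
        counter += 1                        # critical section
        flag[i] = False                     # exit protocol

threads = [threading.Thread(target=worker, args=(i, 2000)) for i in (0, 1)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 4000 if mutual exclusion held
```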

  13. Meta-analysis in a nutshell: Techniques and general findings

    DEFF Research Database (Denmark)

    Paldam, Martin

    2015-01-01

    The purpose of this article is to introduce the technique and main findings of meta-analysis to the reader, who is unfamiliar with the field and has the usual objections. A meta-analysis is a quantitative survey of a literature reporting estimates of the same parameter. The funnel showing...

  14. Energy-Aware RFID Anti-Collision Protocol.

    Science.gov (United States)

    Arjona, Laura; Simon, Hugo Landaluce; Ruiz, Asier Perallos

    2018-06-11

    The growing interest in mobile devices is transforming wireless identification technologies. Mobile and battery-powered Radio Frequency Identification (RFID) readers, such as hand readers and smart phones, are becoming increasingly attractive. These RFID readers require energy-efficient anti-collision protocols to minimize tag collisions and to extend the reader's battery life. Furthermore, there is an increasing interest in RFID sensor networks with a growing number of RFID sensor tags. Thus, RFID application developers must be mindful of tag anti-collision protocols. Energy-efficient protocols involve a low reader energy consumption per tag. This work presents a thorough study of the reader energy consumption per tag and analyzes the main factor that affects this metric: the frame size update strategy. Using the conclusions of this analysis, the anti-collision protocol Energy-Aware Slotted Aloha (EASA) is presented to decrease the energy consumption per tag. The frame size update strategy of EASA is configured to minimize the energy consumption per tag. As a result, EASA presents an energy-aware frame. The performance of the proposed protocol is evaluated and compared with several state-of-the-art Aloha-based anti-collision protocols based on the current RFID standard. Simulation results show that EASA, with an average of 15 mJ consumed per tag identified, achieves a 6% average improvement in the energy consumption per tag relative to the compared strategies.
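
    The abstract's key lever, the frame size update strategy, can be illustrated with a generic framed slotted Aloha simulation; the Schoute backlog estimate below is a standard textbook choice, not EASA's exact rule:

```python
import random

# One framed slotted Aloha inventory round with a backlog-driven
# frame-size update (a sketch of the kind of strategy EASA tunes).
def aloha_round(n_tags, frame_size, rng):
    slots = [0] * frame_size
    for _ in range(n_tags):
        slots[rng.randrange(frame_size)] += 1   # each tag picks a random slot
    identified = sum(1 for s in slots if s == 1)        # singleton slots succeed
    collided_slots = sum(1 for s in slots if s > 1)
    # Schoute's estimate: each collision slot hides ~2.39 tags on average.
    backlog_estimate = round(2.39 * collided_slots)
    return identified, backlog_estimate

rng = random.Random(42)
remaining, frame, rounds = 200, 128, 0
while remaining > 0 and rounds < 1000:
    found, backlog = aloha_round(remaining, frame, rng)
    remaining -= found
    frame = max(1, backlog)     # next frame sized to the estimated backlog
    rounds += 1
print(f"identified all tags in {rounds} rounds")
```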

  15. The surface elevation table and marker horizon technique: A protocol for monitoring wetland elevation dynamics

    Science.gov (United States)

    James C. Lynch,; Phillippe Hensel,; Cahoon, Donald R.

    2015-01-01

    The National Park Service, in response to the growing evidence and awareness of the effects of climate change on federal lands, determined that monitoring wetland elevation change is a top priority in North Atlantic Coastal parks (Stevens et al, 2010). As a result, the NPS Northeast Coastal and Barrier Network (NCBN) in collaboration with colleagues from the U.S. Geological Survey (USGS) and The National Oceanic and Atmospheric Administration (NOAA) have developed a protocol for monitoring wetland elevation change and other processes important for determining the viability of wetland communities. Although focused on North Atlantic Coastal parks, this document is applicable to all coastal and inland wetland regions. Wetlands exist within a narrow range of elevation which is influenced by local hydrologic conditions. For coastal wetlands in particular, local hydrologic conditions may be changing as sea levels continue to rise. As sea level rises, coastal wetland systems may respond by building elevation to maintain favorable hydrologic conditions for their survival. This protocol provides the reader with instructions and guidelines on designing a monitoring plan or study to: A) Quantify elevation change in wetlands with the Surface Elevation Table (SET). B) Understand the processes that influence elevation change, including vertical accretion (SET and Marker Horizon methods). C) Survey the wetland surface and SET mark to a common reference datum to allow for comparing sample stations to each other and to local tidal datums. D) Survey the SET mark to monitor its relative stability. This document is divided into two parts; the main body that presents an overview of all aspects of monitoring wetland elevation dynamics, and a collection of Standard Operating Procedures (SOP) that describes in detail how to perform or execute each step of the methodology. Detailed instruction on the installation, data collection, data management and analysis are provided in this report

  16. Contributions to fuzzy polynomial techniques for stability analysis and control

    OpenAIRE

    Pitarch Pérez, José Luis

    2014-01-01

    The present thesis employs fuzzy-polynomial control techniques in order to improve the stability analysis and control of nonlinear systems. Initially, it reviews the more extended techniques in the field of Takagi-Sugeno fuzzy systems, such as the more relevant results about polynomial and fuzzy polynomial systems. The basic framework uses fuzzy polynomial models by Taylor series and sum-of-squares techniques (semidefinite programming) in order to obtain stability guarantees...

  17. Analytical techniques for wine analysis: An African perspective; a review

    International Nuclear Information System (INIS)

    Villiers, André de; Alberts, Phillipus; Tredoux, Andreas G.J.; Nieuwoudt, Hélène H.

    2012-01-01

    Highlights: ► Analytical techniques developed for grape and wine analysis in Africa are reviewed. ► The utility of infrared spectroscopic methods is demonstrated. ► An overview of the separation of wine constituents by GC, HPLC and CE is presented. ► Novel sample preparation methods for LC and GC are presented. ► Emerging methods for grape and wine analysis in Africa are discussed. - Abstract: Analytical chemistry plays an increasingly important role in the global wine industry. Chemical analysis of wine is essential for ensuring product safety and conformity to the regulatory laws governing the international market, as well as for understanding the fundamental aspects of grape and wine production in order to improve manufacturing processes. Within this field, advanced instrumental analysis methods have been exploited more extensively in recent years. Important advances in instrumental analytical techniques have also found application in the wine industry. This review aims to highlight the most important developments in the field of instrumental wine and grape analysis in the African context. The focus of this overview is specifically on the application of advanced instrumental techniques, including spectroscopic and chromatographic methods. Recent developments in wine and grape analysis and their application in the African context are highlighted, and future trends are discussed in terms of their potential contribution to the industry.

  18. Analytical techniques for wine analysis: An African perspective; a review

    Energy Technology Data Exchange (ETDEWEB)

    Villiers, Andre de, E-mail: ajdevill@sun.ac.za [Department of Chemistry and Polymer Science, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa); Alberts, Phillipus [Department of Chemistry and Polymer Science, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa); Tredoux, Andreas G.J.; Nieuwoudt, Helene H. [Institute for Wine Biotechnology, Department of Viticulture and Oenology, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa)

    2012-06-12

    Highlights: ► Analytical techniques developed for grape and wine analysis in Africa are reviewed. ► The utility of infrared spectroscopic methods is demonstrated. ► An overview of the separation of wine constituents by GC, HPLC and CE is presented. ► Novel sample preparation methods for LC and GC are presented. ► Emerging methods for grape and wine analysis in Africa are discussed. - Abstract: Analytical chemistry plays an increasingly important role in the global wine industry. Chemical analysis of wine is essential for ensuring product safety and conformity to the regulatory laws governing the international market, as well as for understanding the fundamental aspects of grape and wine production in order to improve manufacturing processes. Within this field, advanced instrumental analysis methods have been exploited more extensively in recent years. Important advances in instrumental analytical techniques have also found application in the wine industry. This review aims to highlight the most important developments in the field of instrumental wine and grape analysis in the African context. The focus of this overview is specifically on the application of advanced instrumental techniques, including spectroscopic and chromatographic methods. Recent developments in wine and grape analysis and their application in the African context are highlighted, and future trends are discussed in terms of their potential contribution to the industry.

  19. Perancangan dan Analisis Redistribution Routing Protocol OSPF dan EIGRP

    Directory of Open Access Journals (Sweden)

    DWI ARYANTA

    2016-02-01

    Full Text Available OSPF (Open Shortest Path First) and EIGRP (Enhanced Interior Gateway Routing Protocol) are two routing protocols widely used in computer networks. Differences in characteristics between routing protocols cause problems in the delivery of data packets. The redistribution technique is the solution for communication between routing protocols. Using Cisco Packet Tracer 5.3, this study built a simulation of OSPF and EIGRP networks connected by redistribution, and then compared its quality with single-protocol EIGRP and OSPF networks. The test parameters in this study were time delay and trace route. Trace-route values based on direct calculation of cost and metric were compared with the simulation results. The results show that redistribution between OSPF and EIGRP can be performed. The delay of redistribution is about 1% better than OSPF and 2-3% below EIGRP, depending on traffic density. The trace-route calculation for redistribution involves two computations: cost for the OSPF area and metric for the EIGRP area. Primary and alternative packet-delivery paths are chosen by the smallest cost and metric values, as confirmed by both calculation and simulation. Keywords: OSPF, EIGRP, Redistribution, Delay, Cost, Metric.
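
    The cost and metric calculations the abstract compares against simulation use the standard default formulas; a sketch with illustrative two-hop path values:

```python
# Standard default formulas for OSPF path cost and the EIGRP composite
# metric (K1 = K3 = 1, K2 = K4 = K5 = 0); interface values are illustrative.
def ospf_cost(path_bandwidths_kbps, reference_bw_kbps=100_000):
    # OSPF path cost: sum of per-interface costs (reference / bandwidth, min 1).
    return sum(max(1, reference_bw_kbps // bw) for bw in path_bandwidths_kbps)

def eigrp_metric(path_bandwidths_kbps, path_delays_usec):
    # 256 * (10^7 / min bandwidth in kbps + total delay in tens of microseconds)
    bw_term = 10_000_000 // min(path_bandwidths_kbps)
    delay_term = sum(path_delays_usec) // 10
    return 256 * (bw_term + delay_term)

# Two FastEthernet hops (100 Mbps, 100 usec delay each):
print(ospf_cost([100_000, 100_000]))                  # 2
print(eigrp_metric([100_000, 100_000], [100, 100]))   # 256 * (100 + 20) = 30720
```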

  20. Comparison of lung tumor motion measured using a model-based 4DCT technique and a commercial protocol.

    Science.gov (United States)

    O'Connell, Dylan; Shaverdian, Narek; Kishan, Amar U; Thomas, David H; Dou, Tai H; Lewis, John H; Lamb, James M; Cao, Minsong; Tenn, Stephen; Percy, Lee P; Low, Daniel A

    2017-11-11

    To compare lung tumor motion measured with a model-based technique to commercial 4-dimensional computed tomography (4DCT) scans and describe a workflow for using model-based 4DCT as a clinical simulation protocol. Twenty patients were imaged using a model-based technique and commercial 4DCT. Tumor motion was measured on each commercial 4DCT dataset and was calculated on model-based datasets for 3 breathing amplitude percentile intervals: 5th to 85th, 5th to 95th, and 0th to 100th. Internal target volumes (ITVs) were defined on the 4DCT and 5th to 85th interval datasets and compared using Dice similarity. Images were evaluated for noise and rated by 2 radiation oncologists for artifacts. Mean differences in tumor motion magnitude between commercial and model-based images were 0.47 ± 3.0, 1.63 ± 3.17, and 5.16 ± 4.90 mm for the 5th to 85th, 5th to 95th, and 0th to 100th amplitude intervals, respectively. Dice coefficients between ITVs defined on commercial and 5th to 85th model-based images had a mean value of 0.77 ± 0.09. Single standard deviation image noise was 11.6 ± 9.6 HU in the liver and 6.8 ± 4.7 HU in the aorta for the model-based images compared with 57.7 ± 30 and 33.7 ± 15.4 for commercial 4DCT. Mean model error within the ITV regions was 1.71 ± 0.81 mm. Model-based images exhibited reduced presence of artifacts at the tumor compared with commercial images. Tumor motion measured with the model-based technique using the 5th to 85th percentile breathing amplitude interval corresponded more closely to commercial 4DCT than the 5th to 95th or 0th to 100th intervals, which showed greater motion on average. The model-based technique tended to display increased tumor motion when breathing amplitude intervals wider than 5th to 85th were used because of the influence of unusually deep inhalations. These results suggest that care must be taken in selecting the appropriate interval during image generation when using model-based 4DCT methods. Copyright © 2017
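
    The percentile-interval selection described above can be sketched on synthetic amplitude samples; the nearest-rank percentile and the Gaussian amplitudes are illustrative assumptions, not the authors' method:

```python
import random

# Selecting a breathing-amplitude percentile interval, as in the
# 5th-85th interval above (samples are synthetic, for illustration only).
def percentile(sorted_vals, p):
    # Simple nearest-rank percentile on an already-sorted sequence.
    k = max(0, min(len(sorted_vals) - 1, round(p / 100 * (len(sorted_vals) - 1))))
    return sorted_vals[k]

rng = random.Random(0)
amplitudes = sorted(rng.gauss(10.0, 3.0) for _ in range(1000))  # mm
lo, hi = percentile(amplitudes, 5), percentile(amplitudes, 85)
in_interval = [a for a in amplitudes if lo <= a <= hi]
# Images would be reconstructed only from breathing states inside [lo, hi],
# excluding the unusually deep inhalations above the 85th percentile.
```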

  1. ANTERIOR CRUCIATE LIGAMENT RECONSTRUCTION USING THE DOUBLE-BUNDLE TECHNIQUE - EVALUATION IN THE BIOMECHANICS LABORATORY.

    Science.gov (United States)

    D'Elia, Caio Oliveira; Bitar, Alexandre Carneiro; Castropil, Wagner; Garofo, Antônio Guilherme Padovani; Cantuária, Anita Lopes; Orselli, Maria Isabel Veras; Luques, Isabela Ugo; Duarte, Marcos

    2011-01-01

    The objective of this study was to describe the methodology of knee rotation analysis using biomechanics laboratory instruments and to present the preliminary results from a comparative study on patients who underwent anterior cruciate ligament (ACL) reconstruction using the double-bundle technique. The protocol currently used in our laboratory was described. Three-dimensional kinematic analysis was performed and knee rotation amplitude was measured on eight normal patients (control group) and 12 patients who were operated using the double-bundle technique, by means of three tasks in the biomechanics laboratory. No significant differences between operated and non-operated sides were shown in relation to the mean amplitudes of gait, gait with change in direction or gait with change in direction when going down stairs (p > 0.13). The preliminary results did not show any difference in the double-bundle ACL reconstruction technique in relation to the contralateral side and the control group.

  2. Postselection technique for quantum channels with applications to quantum cryptography.

    Science.gov (United States)

    Christandl, Matthias; König, Robert; Renner, Renato

    2009-01-16

    We propose a general method for studying properties of quantum channels acting on an n-partite system, whose action is invariant under permutations of the subsystems. Our main result is that, in order to prove that a certain property holds for an arbitrary input, it is sufficient to consider the case where the input is a particular de Finetti-type state, i.e., a state which consists of n identical and independent copies of an (unknown) state on a single subsystem. Our technique can be applied to the analysis of information-theoretic problems. For example, in quantum cryptography, we get a simple proof for the fact that security of a discrete-variable quantum key distribution protocol against collective attacks implies security of the protocol against the most general attacks. The resulting security bounds are tighter than previously known bounds obtained with help of the exponential de Finetti theorem.

  3. Energy-Efficient Boarder Node Medium Access Control Protocol for Wireless Sensor Networks

    OpenAIRE

    Razaque, Abdul; Elleithy, Khaled M.

    2014-01-01

    This paper introduces the design, implementation, and performance analysis of the scalable and mobility-aware hybrid protocol named boarder node medium access control (BN-MAC) for wireless sensor networks (WSNs), which leverages the characteristics of scheduled and contention-based MAC protocols. Like contention-based MAC protocols, BN-MAC achieves high channel utilization, network adaptability under heavy traffic and mobility, and low latency and overhead. Like schedule-based MAC protocols,...

  4. Field Monitoring Protocol. Heat Pump Water Heaters

    Energy Technology Data Exchange (ETDEWEB)

    Sparn, B. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Earle, L. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Christensen, D. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Maguire, J. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Wilson, E. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Hancock, C. E. [Mountain Energy Partnership, Longmont, CO (United States)

    2013-02-01

    This document provides a standard field monitoring protocol for evaluating the installed performance of Heat Pump Water Heaters in residential buildings. The report is organized to be consistent with the chronology of field test planning and execution. Research questions are identified first, followed by a discussion of analysis methods, and then the details of measuring the required information are laid out. A field validation of the protocol at a house near the NREL campus is included for reference.

  5. Field Monitoring Protocol: Heat Pump Water Heaters

    Energy Technology Data Exchange (ETDEWEB)

    Sparn, B.; Earle, L.; Christensen, D.; Maguire, J.; Wilson, E.; Hancock, E.

    2013-02-01

    This document provides a standard field monitoring protocol for evaluating the installed performance of Heat Pump Water Heaters in residential buildings. The report is organized to be consistent with the chronology of field test planning and execution. Research questions are identified first, followed by a discussion of analysis methods, and then the details of measuring the required information are laid out. A field validation of the protocol at a house near the NREL campus is included for reference.

  6. Reliability analysis techniques for the design engineer

    International Nuclear Information System (INIS)

    Corran, E.R.; Witt, H.H.

    1982-01-01

    This paper describes a fault tree analysis package that eliminates most of the housekeeping tasks involved in proceeding from the initial construction of a fault tree to the final stage of presenting a reliability analysis in a safety report. It is suitable for designers with relatively little training in reliability analysis and computer operation. Users can rapidly investigate the reliability implications of various options at the design stage and evolve a system which meets specified reliability objectives. Later independent review is thus unlikely to reveal major shortcomings necessitating modification and project delays. The package operates interactively, allowing the user to concentrate on the creative task of developing the system fault tree, which may be modified and displayed graphically. For preliminary analysis, system data can be derived automatically from a generic data bank. As the analysis proceeds, improved estimates of critical failure rates and test and maintenance schedules can be inserted. The technique is applied to the reliability analysis of the recently upgraded HIFAR Containment Isolation System. (author)
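
    The core computation such a package automates, propagating basic-event probabilities through AND/OR gates under an independence assumption, can be sketched as follows (the tree below is illustrative, not the HIFAR system model):

```python
# Minimal fault-tree evaluation: top-event probability from basic-event
# probabilities, assuming independent events.
def evaluate(node, p):
    kind = node[0]
    if kind == "basic":
        return p[node[1]]
    probs = [evaluate(child, p) for child in node[1]]
    if kind == "and":                       # all inputs must fail
        out = 1.0
        for q in probs:
            out *= q
        return out
    if kind == "or":                        # any input failing suffices
        out = 1.0
        for q in probs:
            out *= (1.0 - q)
        return 1.0 - out
    raise ValueError(kind)

# Top event: both redundant pumps fail, OR the isolation valve fails.
tree = ("or", [("and", [("basic", "pump_a"), ("basic", "pump_b")]),
               ("basic", "valve")])
p = {"pump_a": 1e-2, "pump_b": 1e-2, "valve": 1e-3}
print(evaluate(tree, p))  # 1e-4 + 1e-3 - 1e-7 ≈ 1.0999e-3
```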

  7. Performance Analysis of the Enhanced DSR Routing Protocol for the Short Time Disconnected MANET to the OPNET Modeler

    Directory of Open Access Journals (Sweden)

    PAPAJ Ján

    2013-05-01

    Full Text Available Disconnected mobile ad-hoc networks (MANET) are a very important area of research. In this article, a performance analysis of the enhanced dynamic source routing protocol (OPP_DSR) is introduced. This modification enables the routing process when there are no connections to other mobile nodes. It also enables routing when the routes selected by the routing mechanisms are disconnected for some time. The disconnection can be short, and the standard DSR routing protocol cannot react to this situation. The main idea is based on opportunistic forwarding, where nodes not only forward data but also store it in a cache for a long time. The network parameters throughput and routing load are analysed.
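
    The opportunistic store-and-forward idea can be sketched as a bounded packet cache that is flushed when a route reappears; the class and method names here are hypothetical, not the OPP_DSR implementation:

```python
import collections

# Sketch of opportunistic store-and-forward: when no route exists,
# packets wait in a bounded cache instead of being dropped, and are
# flushed in order once a route is restored.
class OpportunisticForwarder:
    def __init__(self, cache_size=64):
        self.cache = collections.deque(maxlen=cache_size)
        self.sent = []

    def send(self, packet, route_available):
        if route_available:
            self.sent.append(packet)
        else:
            self.cache.append(packet)       # store during disconnection

    def on_route_restored(self):
        while self.cache:                   # flush cached packets in order
            self.sent.append(self.cache.popleft())

node = OpportunisticForwarder()
node.send("p1", route_available=True)
node.send("p2", route_available=False)      # link temporarily down
node.send("p3", route_available=False)
node.on_route_restored()
print(node.sent)  # ['p1', 'p2', 'p3']
```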

  8. An operator expansion technique for path integral analysis

    International Nuclear Information System (INIS)

    Tsvetkov, I.V.

    1995-01-01

    A new method of path integral analysis in the framework of a power series technique is presented. The method is based on the operator expansion of an exponential. A regular procedure to calculate the correction terms is found. (orig.)

  9. Application of nuclear analysis techniques in ancient chinese porcelain

    International Nuclear Information System (INIS)

    Feng Songlin; Xu Qing; Feng Xiangqian; Lei Yong; Cheng Lin; Wang Yanqing

    2005-01-01

    Ancient ceramics were fired from porcelain clay and therefore contain a variety of provenance information and age characteristics. Analyzing ancient ceramics with modern analytical methods is the scientific foundation of the study of Chinese porcelain. The function and application of nuclear analysis techniques in this field are discussed according to their properties. (authors)

  10. Application of energy dispersive x-ray techniques for water analysis

    International Nuclear Information System (INIS)

    Funtua, I. I.

    2000-07-01

    Energy dispersive x-ray fluorescence (EDXRF) is a class of emission spectroscopic techniques that depends upon the emission of characteristic x-rays following excitation of the atomic electron energy levels by tube or isotopic source x-rays. The technique has found a wide range of applications that include the determination of chemical elements in water and water pollutants. Three EDXRF systems, the isotopic source, secondary target and total reflection (TXRF), are available at the Centre for Energy Research and Training. These systems have been applied for the analysis of sediments, suspensions, ground water, river and rainwater. The isotopic source is based on 55Fe, 109Cd and 241Am excitation, while the secondary target and total reflection systems use a Mo x-ray tube. Sample preparation requirements for water analysis range from physical and chemical pre-concentration steps to direct analysis, and elements from Al to U can be determined with these systems. The EDXRF techniques, TXRF in particular with its multielement capability, low detection limit and possibility of direct analysis of water, have a competitive edge over the traditional methods of atomic absorption and flame photometry.

  11. The Recoverability of P-Technique Factor Analysis

    Science.gov (United States)

    Molenaar, Peter C. M.; Nesselroade, John R.

    2009-01-01

    It seems that just when we are about to lay P-technique factor analysis finally to rest as obsolete because of newer, more sophisticated multivariate time-series models using latent variables--dynamic factor models--it rears its head to inform us that an obituary may be premature. We present the results of some simulations demonstrating that even…

  12. Multiple access protocol for supporting multimedia services in wireless ATM networks

    DEFF Research Database (Denmark)

    Liu, Hong; Dittmann, Lars; Gliese, Ulrik Bo

    1999-01-01

    The future broadband wireless asynchronous transfer mode (ATM) networks must provide seamless extension of multimedia services from the wireline ATM networks. This requires an efficient wireless access protocol to fulfill the varying Quality-of-Service (QoS) requirements of multimedia applications. In this paper, we propose a multiple access protocol using centralized and distributed channel access control techniques to provide QoS guarantees for multimedia services by taking advantage of the characteristics of different kinds of ATM traffic. Multimedia traffic, including constant bit rate (CBR

  13. Quantitative comparison of performance analysis techniques for modular and generic network-on-chip

    Directory of Open Access Journals (Sweden)

    M. C. Neuenhahn

    2009-05-01

    Full Text Available NoC-specific parameters have a huge impact on the performance and implementation costs of a NoC. Hence, performance and cost evaluation of these parameter-dependent NoCs is crucial in different design stages, but the requirements on performance analysis differ from stage to stage. In an early design stage, an analysis technique featuring reduced complexity and limited accuracy can be applied, whereas in subsequent design stages more accurate techniques are required.

    In this work several performance analysis techniques at different levels of abstraction are presented and quantitatively compared. These techniques include a static performance analysis using timing models, a Colored Petri Net-based approach, VHDL- and SystemC-based simulators, and an FPGA-based emulator. From NoC experiments with sizes from 9 to 36 functional units and various traffic patterns, characteristics of these techniques concerning accuracy, complexity and effort are derived.

    The performance analysis techniques discussed here are quantitatively evaluated and finally assigned to the appropriate design-stages in an automated NoC-design-flow.

  14. DATA ANALYSIS TECHNIQUES IN SERVICE QUALITY LITERATURE: ESSENTIALS AND ADVANCES

    Directory of Open Access Journals (Sweden)

    Mohammed naved Khan

    2013-05-01

    Full Text Available Academic and business researchers have long debated the most appropriate data analysis techniques for conducting empirical research in the domain of services marketing. On the basis of an exhaustive review of the literature, the present paper attempts to provide a concise and schematic portrayal of the data analysis techniques generally followed in the service quality literature. Collectively, the extant literature suggests a growing trend among researchers to rely on higher-order multivariate techniques, viz. confirmatory factor analysis, structural equation modeling, etc., to generate and analyze complex models, while at times ignoring very basic yet powerful procedures such as the mean, t-test, ANOVA and correlation. The marked shift in the orientation of researchers towards sophisticated analytical techniques can largely be attributed to competition within the community of researchers in the social sciences in general, and those working in the area of service quality in particular, as well as to the growing demands of journal reviewers. From a pragmatic viewpoint, it is expected that the paper will serve as a useful source of information and provide deeper insights to academic researchers, consultants, and practitioners interested in modelling patterns of service quality and arriving at optimal solutions to increasingly complex management problems.

  15. Generating Models of Infinite-State Communication Protocols Using Regular Inference with Abstraction

    Science.gov (United States)

    Aarts, Fides; Jonsson, Bengt; Uijen, Johan

    In order to facilitate model-based verification and validation, effort is underway to develop techniques for generating models of communication system components from observations of their external behavior. Most previous such work has employed regular inference techniques which generate modest-size finite-state models. They typically suppress parameters of messages, although these have a significant impact on control flow in many communication protocols. We present a framework, which adapts regular inference to include data parameters in messages and states for generating components with large or infinite message alphabets. A main idea is to adapt the framework of predicate abstraction, successfully used in formal verification. Since we are in a black-box setting, the abstraction must be supplied externally, using information about how the component manages data parameters. We have implemented our techniques by connecting the LearnLib tool for regular inference with the protocol simulator ns-2, and generated a model of the SIP component as implemented in ns-2.

  16. Final report for the protocol extensions for ATM Security Laboratory Directed Research and Development Project

    Energy Technology Data Exchange (ETDEWEB)

    Tarman, T.D.; Pierson, L.G.; Brenkosh, J.P. [and others

    1996-03-01

    This is the summary report for the Protocol Extensions for Asynchronous Transfer Mode project, funded under Sandia's Laboratory Directed Research and Development program. During this one-year effort, techniques were examined for integrating security enhancements within standard ATM protocols, and mechanisms were developed to validate these techniques and to provide a basic set of ATM security assurances. Based on our experience during this project, recommendations were presented to the ATM Forum (a world-wide consortium of ATM product developers, service providers, and users) to assist with the development of security-related enhancements to their ATM specifications. As a result of this project, Sandia has taken a leading role in the formation of the ATM Forum's Security Working Group, and has gained valuable alliances and leading-edge experience with emerging ATM security technologies and protocols.

  17. Maximum entropy technique in the doublet structure analysis

    International Nuclear Information System (INIS)

    Belashev, B.Z.; Panebrattsev, Yu.A.; Shakhaliev, Eh.I.; Soroko, L.M.

    1998-01-01

    The Maximum Entropy Technique (MENT) for the solution of inverse problems is explained. An effective computer program for solving the system of nonlinear equations encountered in MENT has been developed and tested. The capabilities of MENT are demonstrated on the doublet structure analysis of noisy experimental data. A comparison of the MENT results with those of the Fourier algorithm technique without regularization is presented. The tolerable noise level is 30% for MENT but only 0.1% for the Fourier algorithm
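A minimal illustration of the maximum-entropy principle underlying MENT (not the paper's doublet-deconvolution algorithm): on a finite grid, the distribution of maximum entropy subject to a single measured-mean constraint has the exponential form p_i ∝ exp(−λx_i), and the multiplier λ can be found by simple bisection:

```python
import numpy as np

# The maximum-entropy distribution on a finite grid subject to a single
# mean constraint has the form p_i ∝ exp(-lam * x_i); we solve for lam
# by bisection. This only illustrates the MaxEnt principle; it is not
# the paper's doublet-structure algorithm.

x = np.arange(10.0)          # grid
target_mean = 3.0            # "measured" constraint (illustrative)

def mean_of(lam):
    w = np.exp(-lam * x)
    p = w / w.sum()
    return (p * x).sum()

lo, hi = -10.0, 10.0         # mean_of(lam) is decreasing in lam
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if mean_of(mid) > target_mean:
        lo = mid             # mean still too high -> increase lam
    else:
        hi = mid
lam = 0.5 * (lo + hi)
p = np.exp(-lam * x)
p /= p.sum()                 # max-entropy distribution matching the mean
```

With more constraints (several measured moments, or full data with a response kernel) one multiplier per constraint appears and a nonlinear system must be solved, which is the situation the abstract's "nonlinear equations system" refers to.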

  18. Motor current and leakage flux signature analysis technique for condition monitoring

    International Nuclear Information System (INIS)

    Pillai, M.V.; Moorthy, R.I.K.; Mahajan, S.C.

    1994-01-01

    Until recently, analysis of vibration signals was the only means available to predict the state of health of plant equipment. Motor current and leakage magnetic flux signature analysis is acquiring importance as a technique for the detection of incipient damage in electrical machines and as a supplementary technique for the diagnostics of driven equipment such as centrifugal and reciprocating pumps. The state of health of the driven equipment is assessed by analysing the time signal, the frequency spectrum and trends. For example, the pump vane frequency, piston stroke frequency, gear frequency and bearing frequencies are indicated in the current and flux spectra. By maintaining a periodic record of the amplitudes of various frequency lines in the spectra, it is possible to follow the trend of deterioration of parts and components of the pump. Problems arising out of inappropriate mechanical alignment of vertical pumps are easily identified by a combined analysis of current, flux and vibration signals. Current signature analysis is found to be a sufficient method in itself for assessing the state of health of reciprocating pumps and compressors. (author). 10 refs., 4 figs
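The core of such signature analysis, extracting the amplitude of a known fault-related line from the current spectrum so it can be trended over time, can be sketched as follows (sampling rate, frequencies and amplitudes are illustrative assumptions, not values from the paper):

```python
import numpy as np

# Synthetic motor-current signal: a 50 Hz supply component plus a small
# "vane frequency" line whose amplitude would be trended over time.
# All frequencies and amplitudes are illustrative.
fs = 2000.0                      # sampling rate, Hz
t = np.arange(0, 2.0, 1 / fs)    # 2 s record
supply = np.sin(2 * np.pi * 50 * t)
vane = 0.05 * np.sin(2 * np.pi * 137 * t)   # hypothetical vane line
noise = 0.01 * np.random.default_rng(1).standard_normal(t.size)
signal = supply + vane + noise

# Single-sided amplitude spectrum
spec = np.abs(np.fft.rfft(signal)) * 2 / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)

def line_amplitude(f0, bw=2.0):
    """Peak amplitude within +/- bw Hz of f0 - the value one would trend."""
    sel = (freqs > f0 - bw) & (freqs < f0 + bw)
    return spec[sel].max()

amp_50 = line_amplitude(50.0)     # supply line, ~1.0
amp_vane = line_amplitude(137.0)  # fault-related line, ~0.05
```

Storing `amp_vane` from each periodic measurement and watching its growth is the trend analysis the abstract describes.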

  19. Protein blotting protocol for beginners.

    Science.gov (United States)

    Petrasovits, Lars A

    2014-01-01

    The transfer and immobilization of biological macromolecules onto solid nitrocellulose or nylon (polyvinylidene difluoride (PVDF)) membranes subsequently followed by specific detection is referred to as blotting. DNA blots are called Southerns after the inventor of the technique, Edwin Southern. By analogy, RNA blots are referred to as northerns and protein blots as westerns (Burnette, Anal Biochem 112:195-203, 1981). With few exceptions, western blotting involves five steps, namely, sample collection, preparation, separation, immobilization, and detection. In this chapter, protocols for the entire process from sample collection to detection are described.

  20. Artery-only fingertip replantations using a controlled nailbed bleeding protocol.

    Science.gov (United States)

    Erken, H Yener; Takka, Semih; Akmaz, Ibrahim

    2013-11-01

    We report our experience, treatment protocol, and 2-year follow-up results of 24 fingertip replantations treated using the artery-only technique without vein or nerve repair. We performed a retrospective review of 24 patients who had undergone fingertip replantation at the same center between 2005 and 2011. All patients in this study had complete fingertip amputation at or distal to the distal interphalangeal joint of the fingers or interphalangeal joint of the thumb. Patients with incomplete and complete amputations who had undergone vein and/or nerve repair along with artery repair were excluded. All patients received the same protocol including removal of the nail at the surgery and intravenous heparin 70 U/kg administered at the time of arterial anastomosis. After surgery, the nailbed was mechanically made to bleed with a sterile needle and mechanically scrubbed with a heparin-saline gauze. All patients received the same postoperative medical treatment protocol until physiological outflow was restored. Successful replantation was confirmed with clinical observation. Twenty-one of the 24 fingertip replantations (88%) were successful. The mean length of hospital stay was 7 days (range, 4-9 d). Fifteen of 22 patients required blood transfusion. The average amount of blood transfusion was 1.2 U (range, 0-3 U). This study shows that the described technique and protocol reconstructed circulation without vein anastomosis and with a high success rate. Furthermore, adequate sensory recovery without any nerve repair had occurred by the 2-year follow-up. Therapeutic IV. Copyright © 2013 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.

  1. Teaching Integrity in Empirical Research: A Protocol for Documenting Data Management and Analysis

    Science.gov (United States)

    Ball, Richard; Medeiros, Norm

    2012-01-01

    This article describes a protocol the authors developed for teaching undergraduates to document their statistical analyses for empirical research projects so that their results are completely reproducible and verifiable. The protocol is guided by the principle that the documentation prepared to accompany an empirical research project should be…

  2. Nuclear analysis techniques as a component of thermoluminescence dating

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, J R; Hutton, J T; Habermehl, M A [Adelaide Univ., SA (Australia); Van Moort, J [Tasmania Univ., Sandy Bay, TAS (Australia)

    1997-12-31

    In luminescence dating, an age is found by first measuring dose accumulated since the event being dated, then dividing by the annual dose rate. Analyses of minor and trace elements performed by nuclear techniques have long formed an essential component of dating. Results from some Australian sites are reported to illustrate the application of nuclear techniques of analysis in this context. In particular, a variety of methods for finding dose rates are compared, an example of a site where radioactive disequilibrium is significant and a brief summary is given of a problem which was not resolved by nuclear techniques. 5 refs., 2 tabs.
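The age equation stated in the abstract is simple enough to state in code; the values below are purely illustrative, not results from the reported Australian sites:

```python
# Age = accumulated (palaeo)dose since the dated event, divided by the
# annual dose rate, as stated in the abstract. Values are illustrative.

def luminescence_age(palaeodose_gy, dose_rate_gy_per_ka):
    """Age in ka from equivalent dose (Gy) and dose rate (Gy/ka)."""
    return palaeodose_gy / dose_rate_gy_per_ka

age_ka = luminescence_age(25.0, 2.5)   # e.g. 25 Gy at 2.5 Gy/ka
```

The nuclear analysis techniques discussed in the record enter through the denominator: minor- and trace-element concentrations (U, Th, K) determine the annual dose rate.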

  3. Nuclear analysis techniques as a component of thermoluminescence dating

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, J.R.; Hutton, J.T.; Habermehl, M.A. [Adelaide Univ., SA (Australia); Van Moort, J. [Tasmania Univ., Sandy Bay, TAS (Australia)

    1996-12-31

    In luminescence dating, an age is found by first measuring dose accumulated since the event being dated, then dividing by the annual dose rate. Analyses of minor and trace elements performed by nuclear techniques have long formed an essential component of dating. Results from some Australian sites are reported to illustrate the application of nuclear techniques of analysis in this context. In particular, a variety of methods for finding dose rates are compared, an example of a site where radioactive disequilibrium is significant and a brief summary is given of a problem which was not resolved by nuclear techniques. 5 refs., 2 tabs.

  4. Using Machine Learning Techniques in the Analysis of Oceanographic Data

    Science.gov (United States)

    Falcinelli, K. E.; Abuomar, S.

    2017-12-01

    Acoustic Doppler Current Profilers (ADCPs) are oceanographic tools capable of collecting large amounts of current profile data. Using unsupervised machine learning techniques such as principal component analysis, fuzzy c-means clustering, and self-organizing maps, patterns and trends in an ADCP dataset are found. Cluster validity algorithms such as visual assessment of cluster tendency and clustering index are used to determine the optimal number of clusters in the ADCP dataset. These techniques prove to be useful in analysis of ADCP data and demonstrate potential for future use in other oceanographic applications.
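A hedged sketch of this style of analysis, using synthetic two-regime "current profiles", PCA via SVD, and plain k-means standing in for the fuzzy c-means and SOM variants named above (numpy only; nothing here reproduces the actual ADCP dataset):

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic "current profiles": two regimes of 10-bin velocity profiles
shape1 = np.linspace(1.0, 0.2, 10)       # surface-intensified flow
shape2 = np.linspace(0.2, 1.0, 10)       # bottom-intensified flow
X = np.vstack([s + 0.05 * rng.standard_normal((50, 10))
               for s in (shape1, shape2)])

# PCA via SVD of the mean-centred data matrix
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                   # project onto first 2 PCs

# Plain k-means (k=2) in PC space; fuzzy c-means or a SOM would follow
# the same pattern with soft memberships or a neuron grid.
centres = scores[[0, 50]]                # one seed from each regime
for _ in range(20):
    d = np.linalg.norm(scores[:, None] - centres[None], axis=2)
    labels = d.argmin(axis=1)
    centres = np.array([scores[labels == k].mean(axis=0) for k in range(2)])
```

Cluster-validity measures such as those mentioned in the abstract would then be computed over `labels` for a range of k to choose the optimal number of clusters.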

  5. Protocols and guidelines for mobile chest radiography in Irish public hospitals

    International Nuclear Information System (INIS)

    Kelly, Amanda; Toomey, Rachel

    2015-01-01

    Background: The mobile chest radiograph is a highly variable examination, in both technique and setting. Protocols and guidelines are one method by which examinations can be standardised, and they provide information when one is unsure how to proceed. This study was undertaken to investigate the protocols and guidelines available for the mobile chest radiograph, to establish their nature and to compare them under a variety of headings. Methodology: A postal survey was administered to the Radiography Service Managers in the public hospitals under the governance of the Health Service Executive (HSE) in Ireland. The survey contained questions regarding hospital demographics, the contents of existing protocols or guidelines, and, where applicable, why a protocol or guideline was not in place. Results: The response rate to the survey was 62% (n = 24). Those that had a specific protocol in place amounted to 63% (n = 15), 71% (n = 17) had a specific guideline, and 63% (n = 15) had both. Twenty-nine percent (n = 7) had no specific protocol/guideline in place. Scientific research (88%, n = 15) and radiographer experience (82%, n = 14) were the most common sources used to inform protocols and guidelines. Conclusions: There are protocols and guidelines available to radiographers for mobile chest radiography in the majority of public hospitals in Ireland. The nature of the protocols and guidelines generally coincides with the HSE guidance regarding what sources of information should be used and how often they should be updated

  6. Redactions in protocols for drug trials: what industry sponsors concealed.

    Science.gov (United States)

    Marquardsen, Mikkel; Ogden, Michelle; Gøtzsche, Peter C

    2018-04-01

    Objective: To describe the redactions in contemporary protocols for industry-sponsored randomised drug trials with patient-relevant outcomes, and to evaluate whether there was a legitimate rationale for the redactions. Design: Cohort study. Under the Freedom of Information Act, we requested access to trial protocols approved by a research ethics committee in Denmark from October 2012 to March 2013. We received 17 consecutive protocols, which had been redacted before we received them, and nine protocols without redactions. In five additional cases, the companies refused to let the committees give us access, and in three other cases, documents were missing. Participants: Not applicable. Setting: Not applicable. Main outcome measure: Amount and nature of redactions in 22 predefined key protocol variables. Results: The redactions were most widespread in those sections of the protocol where there is empirical evidence of substantial problems with the trustworthiness of published drug trials: data analysis, handling of missing data, detection and analysis of adverse events, definition of the outcomes, interim analyses and premature termination of the study, sponsor's access to incoming data while the study is running, ownership of the data, and investigators' publication rights. The parts of the text that were redacted differed widely, both between companies and within the same company. Conclusions: We could not identify any legitimate rationale for the redactions. The current mistrust in industry-sponsored drug trials can only change if the industry offers unconditional access to its trial protocols and other relevant documents and data.

  7. A General Approach to Access Morphologies of Polyoxometalates in Solution by Using SAXS: An Ab Initio Modeling Protocol.

    Science.gov (United States)

    Li, Mu; Wang, Weiyu; Yin, Panchao

    2018-05-02

    Herein, we report a general protocol for an ab initio modeling approach that deduces structural information on polyoxometalates (POMs) in solution from scattering data collected with the small-angle X-ray scattering (SAXS) technique. To validate the protocol, the morphologies of a series of known POMs in either aqueous or organic solvents were analyzed. The obtained particle morphologies were compared with, and confirmed against, previously reported crystal structures. To extend the protocol to an unknown system, aqueous solutions of Na2MoO4 with pH ranging from -1 to 8.35 were examined, and the formation of {Mo36} clusters was probed, identified, and confirmed by SAXS. The approach was further optimized with multi-processing capability to achieve fast analysis of experimental data, thereby facilitating in situ studies of the formation of POMs in solution. The advantage of this approach is that it generates intuitive 3D models of POMs in solution without requiring prior constraints such as symmetries or expected sizes. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
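Although the paper's ab initio procedure is not reproduced here, the forward step common to SAXS bead modeling, the orientationally averaged Debye equation I(q) = Σᵢⱼ sin(q·rᵢⱼ)/(q·rᵢⱼ) for identical scatterers, is easy to sketch for a made-up bead model (the geometry below is a random stand-in, not a real POM):

```python
import numpy as np

# Debye equation for the orientationally averaged scattering of a bead
# model with identical scatterers: I(q) = sum_ij sin(q r_ij) / (q r_ij).
# The bead geometry is a made-up stand-in, not a real POM cluster.

rng = np.random.default_rng(3)
beads = rng.uniform(-1.0, 1.0, (30, 3))          # bead coordinates, nm

diff = beads[:, None, :] - beads[None, :, :]
r = np.sqrt((diff ** 2).sum(-1))                 # pairwise distances

def debye_intensity(q):
    qr = q * r
    # sinc with the r_ii = 0 diagonal handled explicitly (limit = 1)
    sinc = np.where(qr == 0.0, 1.0, np.sin(qr) / np.where(qr == 0.0, 1.0, qr))
    return sinc.sum()

q_grid = np.linspace(1e-3, 5.0, 50)              # q in 1/nm
I = np.array([debye_intensity(q) for q in q_grid])
# As q -> 0, I approaches N^2 (here 900); the decay encodes size/shape
```

An ab initio fit inverts this: candidate bead arrangements are perturbed until the computed I(q) matches the measured curve, which is also why the multi-processing optimization mentioned in the abstract pays off.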

  8. Air pollution studies in Tianjing city using neutron activation analysis techniques

    International Nuclear Information System (INIS)

    Ni Bangfa; Tian Weizhi; Nie Nuiling; Wang Pingsheng

    1999-01-01

    Airborne particulate samples were collected at two sites in Tianjing city, one industrial and one residential, during February and June using a PM-10 sampler, and were analyzed by NAA techniques. A comparison of air pollution between urban and rural areas of Tianjing city was made using neutron activation analysis and other data-analysis techniques. (author)

  9. FDTD technique based crosstalk analysis of bundled SWCNT interconnects

    International Nuclear Information System (INIS)

    Duksh, Yograj Singh; Kaushik, Brajesh Kumar; Agarwal, Rajendra P.

    2015-01-01

    The equivalent electrical circuit model of bundled single-walled carbon nanotube based distributed RLC interconnects is employed for the crosstalk analysis. Accurate time-domain and crosstalk analysis of VLSI interconnects has emerged as an essential design requirement. This paper presents a brief description of the numerical finite difference time domain (FDTD) technique, which is intended for the estimation of voltages and currents on coupled transmission lines. For the FDTD implementation, the stability of the proposed model is strictly restricted by the Courant condition. The method is used for the estimation of crosstalk-induced propagation delay and peak voltage in lossy RLC interconnects. Both functional and dynamic crosstalk effects are analyzed in the coupled transmission line. The effect of line resistance on crosstalk-induced delay and peak voltage under dynamic and functional crosstalk is also evaluated. The FDTD analysis and the SPICE simulations are carried out at the 32 nm technology node for global interconnects. It is observed that the analytical results obtained using the FDTD technique are in good agreement with the SPICE simulation results. The crosstalk-induced delay, propagation delay, and peak voltage obtained using the FDTD technique show average errors of 4.9%, 3.4% and 0.46%, respectively, in comparison to SPICE. (paper)
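The FDTD scheme such analyses build on is the leapfrog update of the telegrapher's equations on a staggered grid. A single-conductor sketch with illustrative per-unit-length parameters (not 32 nm interconnect data), honoring the Courant condition mentioned above, looks like this:

```python
import numpy as np

# 1-D FDTD (leapfrog, staggered grid) for the telegrapher's equations
# of a lossy line: dV/dx = -R*I - L*dI/dt, dI/dx = -G*V - C*dV/dt.
# Per-unit-length values are illustrative, not the paper's 32 nm data.
R, L, C, G = 5.0, 250e-9, 100e-12, 0.0   # ohm/m, H/m, F/m, S/m
nz, dz = 200, 1e-3                        # 200 cells of 1 mm
dt = 0.9 * dz * np.sqrt(L * C)            # Courant: dt <= dz*sqrt(L*C)

V = np.zeros(nz + 1)                      # voltages at nodes
I = np.zeros(nz)                          # currents between nodes

for n in range(200):
    # voltage update from the spatial difference of the staggered currents
    V[1:-1] -= dt / (C * dz) * (I[1:] - I[:-1]) + dt * G / C * V[1:-1]
    V[0] = 1.0                            # unit-step source (hard voltage)
    V[-1] = V[-2]                         # crude far-end boundary
    # current update from the spatial difference of the voltages
    I -= dt / (L * dz) * (V[1:] - V[:-1]) + dt * R / L * I

peak_v = V.max()
```

For the paper's coupled bundles, V and I become vectors per cell and R, L, C, G become matrices whose off-diagonal terms carry the mutual (crosstalk) coupling; the update structure is unchanged.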

  10. A standardised protocol for texture feature analysis of endoscopic images in gynaecological cancer

    Directory of Open Access Journals (Sweden)

    Pattichis Marios S

    2007-11-01

    Background: In the development of tissue classification methods, classifiers rely on significant differences between texture features extracted from normal and abnormal regions. Yet, significant differences can arise due to variations in the image acquisition method. For endoscopic imaging of the endometrium, we propose a standardized image acquisition protocol to eliminate significant statistical differences due to variations in: (i) the distance from the tissue (panoramic vs. close-up), (ii) differences in viewing angle, and (iii) color correction. Methods: We investigate texture feature variability for a variety of targets encountered in clinical endoscopy. All images were captured at clinically optimum illumination and focus using 720 × 576 pixels and 24-bit color for: (i) a variety of testing targets from a color palette with a known color distribution, (ii) different viewing angles, and (iii) two different distances from a calf endometrium and from a chicken cavity. Human images from the endometrium were also captured and analysed. For texture feature analysis, three different sets were considered: (i) Statistical Features (SF), (ii) Spatial Gray Level Dependence Matrices (SGLDM), and (iii) Gray Level Difference Statistics (GLDS). All images were gamma corrected, and the extracted texture feature values were compared against those extracted from the uncorrected images. Statistical tests were applied to compare images from different viewing conditions so as to determine any significant differences. Results: For the proposed acquisition procedure, results indicate that there is no significant difference in texture features between the panoramic and close-up views or between angles. For a calibrated target image, gamma correction provided an acquired image that was a significantly better approximation to the original target image.
    In turn, this implies that the texture features extracted from the corrected images provided for better
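One of the three feature sets named in this record, Gray Level Difference Statistics (GLDS), is compact enough to sketch: a histogram of absolute gray-level differences at a fixed displacement, from which mean, contrast, energy and entropy features are derived (the image and displacement below are synthetic stand-ins):

```python
import numpy as np

# Gray Level Difference Statistics (GLDS): histogram of absolute gray-
# level differences at a fixed displacement (dx, dy), from which the
# standard mean/contrast/energy/entropy features follow. Synthetic image.
rng = np.random.default_rng(4)
img = rng.integers(0, 8, (64, 64))          # 3-bit image, small histogram

def glds_features(image, dx=1, dy=0, levels=8):
    h, w = image.shape
    a = image[0:h - dy, 0:w - dx]
    b = image[dy:h, dx:w]                   # displaced copy
    diff = np.abs(a - b).ravel()
    p = np.bincount(diff, minlength=levels).astype(float)
    p /= p.sum()                            # difference histogram
    d = np.arange(levels)
    return {
        "mean": (p * d).sum(),
        "contrast": (p * d ** 2).sum(),
        "energy": (p ** 2).sum(),           # angular second moment
        "entropy": -(p[p > 0] * np.log2(p[p > 0])).sum(),
    }

feat = glds_features(img)
```

It is feature vectors like this, computed on gamma-corrected versus uncorrected images and under different viewing conditions, that the study's statistical tests compare.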

  11. [The intervention mapping protocol: A structured process to develop, implement and evaluate health promotion programs].

    Science.gov (United States)

    Fassier, J-B; Lamort-Bouché, M; Sarnin, P; Durif-Bruckert, C; Péron, J; Letrilliart, L; Durand, M-J

    2016-02-01

    Health promotion programs are expected to improve population health and reduce social inequalities in health. However, their theoretical foundations are frequently ill-defined, and their implementation faces many obstacles. The aim of this article is to describe the intervention mapping protocol for planning health promotion programs, used recently in several countries. The challenges of planning health promotion programs are presented, and the six steps of the intervention mapping protocol are described with an example. Based on a literature review, the use of this protocol, its requirements and its potential limitations are discussed. The intervention mapping protocol has four essential characteristics: an ecological perspective (person-environment), a participative approach, the use of theoretical models from the human and social sciences, and the use of scientific evidence. It comprises six steps: conduct a health needs assessment; define change objectives; select theory-based change techniques and practical applications; organize techniques and applications into an intervention program (logic model); plan for program adoption, implementation, and sustainability; and generate an evaluation plan. This protocol has been used in different countries and domains such as obesity, tobacco, physical activity, cancer and occupational health. Although its utilization requires resources and a critical stance, this protocol has been used to develop interventions whose efficacy was demonstrated. The intervention mapping protocol is an integrated process that fits the scientific and practical challenges of health promotion. It could be tested in France as it was used in other countries, in particular to reduce social inequalities in health. Copyright © 2016. Published by Elsevier Masson SAS.

  12. Colour and shape analysis techniques for weed detection in cereal fields

    DEFF Research Database (Denmark)

    Pérez, A.J; López, F; Benlloch, J.V.

    2000-01-01

    Information on weed distribution within the field is necessary to implement spatially variable herbicide application. This paper deals with the development of near-ground image capture and processing techniques in order to detect broad-leaved weeds in cereal crops under actual field conditions. The proposed methods use colour information to discriminate between vegetation and background, whilst shape analysis techniques are applied to distinguish between crop and weeds. The determination of crop row position helps to reduce the number of objects to which shape analysis techniques are applied. The performance of the algorithms was assessed by comparing the results with a human classification, providing an acceptable success rate. The study has shown that despite the difficulties in accurately determining the number of seedlings (as in visual surveys), it is feasible to use image processing techniques...
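As a sketch of the colour-based vegetation/background discrimination step (the paper's exact colour features are not specified in this record; the excess-green index on chromaticity coordinates is a common stand-in, and the two-tone "scene" below is synthetic):

```python
import numpy as np

# Vegetation/background discrimination via the excess-green index
# ExG = 2g - r - b on chromaticity coordinates. A common choice for
# this task, used here as an assumed stand-in for the paper's colour
# features. The "image" is a synthetic two-region scene.
rng = np.random.default_rng(5)
h, w = 40, 40
img = np.empty((h, w, 3))
img[:, :w // 2] = (0.2, 0.6, 0.2)        # left half: plant-like green
img[:, w // 2:] = (0.5, 0.4, 0.3)        # right half: soil-like brown
img += 0.02 * rng.standard_normal(img.shape)

s = img.sum(axis=2)
r, g, b = (img[..., i] / s for i in range(3))   # chromaticity coordinates
exg = 2 * g - r - b
vegetation = exg > 0.1                    # simple fixed threshold

frac_left = vegetation[:, :w // 2].mean()   # ~1: plant region detected
frac_right = vegetation[:, w // 2:].mean()  # ~0: soil rejected
```

Shape analysis would then operate only on the connected components of the `vegetation` mask, restricted further by the detected crop-row positions as the abstract describes.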

  13. Controlled Delegation Protocol in Mobile RFID Networks

    Directory of Open Access Journals (Sweden)

    Yang MingHour

    2010-01-01

    To achieve off-line delegation for mobile readers, we propose a delegation protocol for mobile RFID that allows readers access to specific tags through the back-end server. That is to say, reader-tag mutual authentication can be performed without the readers being connected to the back-end server, and readers are allowed off-line access to tags' data. Compared with other delegation protocols, our scheme uniquely enables the back-end server to limit each reader's reading times during delegation. Even in a multi-reader situation, our protocol can limit the reading times and reading time periods for each of them, and therefore makes the back-end server's delegation more flexible. Besides, our protocol prevents authorized readers from transferring their authority to the unauthorized, declining invalid access to tags. Our scheme is proved viable and secure with GNY logic; it resists certain security threats, such as replay attacks, denial-of-service (DoS) attacks, man-in-the-middle attacks, counterfeit tags, and breaches of location and data privacy. The performance analysis of our protocol also shows that current tags can afford the computation load required in this scheme.
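The distinctive feature here, delegation with a bounded number of reading times, can be caricatured with standard-library HMACs; the key derivation, message formats and counter handling below are invented for illustration and are not the scheme's actual protocol messages:

```python
import hmac
import hashlib

# Toy sketch of count-limited delegation: the back-end derives per-tag
# keys from a master key and hands a reader the keys for specific tags
# plus a read budget; the reader can then verify tag responses offline
# until the budget is exhausted. All formats here are illustrative.

MASTER_KEY = b"backend-master-key"   # known only to the back-end server

def tag_key(tag_id: bytes) -> bytes:
    return hmac.new(MASTER_KEY, tag_id, hashlib.sha256).digest()

def delegate(tag_ids, max_reads):
    """Back-end issues an offline credential for specific tags,
    bounded to max_reads reading times."""
    return {"keys": {t: tag_key(t) for t in tag_ids}, "reads_left": max_reads}

def tag_respond(tid: bytes, challenge: bytes) -> bytes:
    """The tag answers a challenge with its own key."""
    return hmac.new(tag_key(tid), challenge, hashlib.sha256).digest()

def offline_verify(cred, tid, challenge, response):
    """Reader verifies a tag without contacting the back-end."""
    if cred["reads_left"] <= 0 or tid not in cred["keys"]:
        return False                  # budget exhausted or unauthorized tag
    cred["reads_left"] -= 1
    expected = hmac.new(cred["keys"][tid], challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

cred = delegate([b"tag-42"], max_reads=2)
ok1 = offline_verify(cred, b"tag-42", b"n1", tag_respond(b"tag-42", b"n1"))
ok2 = offline_verify(cred, b"tag-42", b"n2", tag_respond(b"tag-42", b"n2"))
denied = offline_verify(cred, b"tag-42", b"n3", tag_respond(b"tag-42", b"n3"))
```

The real scheme must also stop a reader from resetting its own counter or re-delegating the keys, which is exactly what the paper's protocol design and GNY-logic proof address.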

  14. A Secured Authentication Protocol for SIP Using Elliptic Curves Cryptography

    Science.gov (United States)

    Chen, Tien-Ho; Yeh, Hsiu-Lien; Liu, Pin-Chuan; Hsiang, Han-Chen; Shih, Wei-Kuan

    Session Initiation Protocol (SIP) is a technology regularly employed in Internet telephony, and Hyper Text Transport Protocol (HTTP) digest authentication is one of the major methods for the SIP authentication mechanism. In 2005, Yang et al. pointed out that HTTP digest authentication could not resist server spoofing attacks or off-line guessing attacks, and proposed a secure authentication scheme based on the Diffie-Hellman concept. In 2009, Tsai proposed a nonce-based authentication protocol for SIP. In this paper, we demonstrate that their protocol cannot resist the password guessing attack and the insider attack. Furthermore, we propose an ECC-based authentication mechanism to solve these issues and present a security analysis of our protocol to show that it is suitable for applications with higher security requirements.
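To make the ECC ingredient concrete, here is a toy elliptic-curve Diffie-Hellman exchange over the textbook curve y² = x³ + 2x + 2 (mod 17) with base point G = (5, 1). Real deployments use standardized curves (e.g. P-256), and this sketch shows only the shared-secret mechanics, not the paper's authentication protocol:

```python
# Toy elliptic-curve Diffie-Hellman over the textbook curve
# y^2 = x^3 + 2x + 2 (mod 17), base point G = (5, 1), group order 19.
# Illustrative only; real SIP authentication would use a standard curve.
P, A = 17, 2
G = (5, 1)
INF = None                      # point at infinity

def add(p1, p2):
    if p1 is INF:
        return p2
    if p2 is INF:
        return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return INF              # inverse points sum to infinity
    if p1 == p2:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P) % P
    x3 = (lam * lam - x1 - x2) % P
    return (x3, (lam * (x1 - x3) - y1) % P)

def mul(k, pt):                 # double-and-add scalar multiplication
    acc = INF
    while k:
        if k & 1:
            acc = add(acc, pt)
        pt = add(pt, pt)
        k >>= 1
    return acc

alice_priv, bob_priv = 3, 10    # private scalars
alice_pub = mul(alice_priv, G)
bob_pub = mul(bob_priv, G)
shared_a = mul(alice_priv, bob_pub)
shared_b = mul(bob_priv, alice_pub)   # both sides derive the same point
```

An authenticated scheme additionally binds these exchanged points to the user's credential so that an attacker cannot verify password guesses offline, which is the property at stake in the attacks discussed above.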

  15. Resistance to compression of weakened roots subjected to different root reconstruction protocols

    Directory of Open Access Journals (Sweden)

    Lucas Villaça Zogheib

    2011-12-01

    OBJECTIVE: This study evaluated, in vitro, the fracture resistance of human non-vital teeth restored with different reconstruction protocols. MATERIAL AND METHODS: Forty human anterior roots of similar shape and dimensions were assigned to four groups (n=10), according to the root reconstruction protocol: Group I (control): non-weakened roots with glass fiber post; Group II: roots reconstructed with composite resin by the incremental technique and a glass fiber post; Group III: roots reconstructed with accessory glass fiber posts and a glass fiber post; and Group IV: roots reconstructed with the anatomic glass fiber post technique. Following post cementation and core reconstruction, the roots were embedded in chemically activated acrylic resin and submitted to fracture resistance testing, with a compressive load applied at an angle of 45º in relation to the long axis of the root at a speed of 0.5 mm/min until fracture. All data were statistically analyzed with the bilateral Dunnett's test (α=0.05). RESULTS: Group I presented higher mean fracture resistance values than the three experimental groups, which, in turn, presented similar fracture resistance to one another. None of the techniques of root reconstruction with intraradicular posts improved root strength, and the incremental technique was suggested as the most recommendable, since the type of fracture that occurred allowed the remaining dental structure to be repaired. CONCLUSION: The results of this in vitro study suggest that healthy remaining radicular dentin is more important for increasing fracture resistance than the root reconstruction protocol.

  16. Analysis of quality control protocol implementation of equipment in radiotherapy services

    International Nuclear Information System (INIS)

    Calcina, Carmen S. Guzman; Lima, Luciana P. de; Rubo, Rodrigo A.; Ferraz, Eduardo; Almeida, Adelaide de

    2000-01-01

    Considering the importance of quality assurance in radiotherapy services, tests for the quality control of cobalt units, linear accelerators and simulators were evaluated, classified and compared. The proposed work is a suggestion that can serve as a tool both for medical physicists who are starting to work in the radiotherapy area and for more experienced ones. The discussions were based on a compilation of local tests and official protocols, resulting in a suggested minimum protocol for routine work, with emphasis on the periodicity and tolerance level of each of the tests. (author)

  17. Protocol for sampling and analysis of bone specimens

    International Nuclear Information System (INIS)

    Aras, N.K.

    2000-01-01

    The iliac crest of the hip bone was chosen as the most suitable sampling site for several reasons: local variation in elemental concentration along the iliac crest is minimal; iliac crest biopsies are commonly taken clinically on patients; the cortical part of the sample is small (∼2 mm) and can be separated easily from the trabecular bone; and the use of the trabecular part of the iliac crest for trace element analysis has the advantage of rapidly reflecting changes in the composition of bone due to external parameters, including medication. Biopsy studies, although in some ways more difficult than autopsy studies because of the need to obtain the informed consent of the subjects, are potentially more useful: many problems of postmortem migration of elements can be avoided, and reliable dietary and other data can be collected simultaneously. Subjects should be selected among patients undergoing orthopedic surgery for any reason other than osteoporosis, and an established protocol should be followed to obtain the bone biopsies. Patients undergoing surgery should fill in the 'Osteoporosis Project Questionnaire Form', including information on lifestyle variables, dietary intakes, the reason for surgery, etc. If possible, bone mineral density (BMD) should be measured prior to removal of the biopsy sample; however, it may not be possible to have BMD results for all subjects because of the difficulty of DEXA measurement after an accident

  18. Nuclear techniques for on-line analysis in the mineral and energy industries

    International Nuclear Information System (INIS)

    Sowerby, B.D.; Watt, J.S.

    1994-01-01

    Nuclear techniques are the basis of many on-line analysis systems which are now widely used in the mineral and energy industries. Some of the systems developed by the CSIRO depend entirely on nuclear techniques; others use a combination of nuclear techniques and microwave, capacitance, or ultrasonic techniques. The continuous analysis and rapid response of these CSIRO systems has led to improved control of mining, processing and blending operations, with increased productivity valued at A$50 million per year to Australia, and $90 million per year worldwide. This paper reviews developments in nuclear on-line analysis systems by the On-Line Analysis Group in CSIRO at Lucas Heights. Commercialised systems based on this work analyse mineral and coal slurries and determine the ash and moisture contents of coal and coke on conveyors. This paper also reviews two on-line nuclear analysis systems recently developed and licensed to industry, firstly for the determination of the mass flow rates of oil/water/gas mixtures in pipelines, and secondly for the determination of the moisture, specific energy, ash and fouling index in low rank coals. 8 refs., 3 tabs., 4 figs

  19. Quality-assurance techniques used with automated analysis of gamma-ray spectra

    International Nuclear Information System (INIS)

    Killian, E.W.; Koeppen, L.D.; Femec, D.A.

    1994-01-01

    In the course of developing gamma-ray spectrum analysis algorithms for use by the Radiation Measurements Laboratory at the Idaho National Engineering Laboratory (INEL), several techniques have been developed that enhance and verify the quality of the analytical results. The use of these quality-assurance techniques is critical when gamma-ray analysis results from low-level environmental samples are used in risk assessment or site restoration and cleanup decisions. This paper describes four of the quality-assurance techniques that are in routine use at the laboratory. They are used for all types of samples, from reactor effluents to environmental samples. The techniques include: (1) the use of precision pulsers (with subsequent removal) to validate the correct operation of the spectrometer electronics for each and every spectrum acquired, (2) the use of naturally occurring and cosmically induced radionuclides in samples to help verify that the data acquisition and analysis were performed properly, (3) the use of an ambient background correction technique that involves superimposing ("mapping") sample photopeak fitting parameters onto multiple background spectra for accurate and more consistent quantification of the background activities, (4) the use of interactive, computer-driven graphics to review the automated locating and fitting of photopeaks and to allow for manual fitting of photopeaks

  20. A new technique for quantitative analysis of hair loss in mice using grayscale analysis.

    Science.gov (United States)

    Ponnapakkam, Tulasi; Katikaneni, Ranjitha; Gulati, Rohan; Gensure, Robert

    2015-03-09

    Alopecia is a common form of hair loss which can occur in many different conditions, including male-pattern hair loss, polycystic ovarian syndrome, and alopecia areata. Alopecia can also occur as a side effect of chemotherapy in cancer patients. In this study, our goal was to develop a consistent and reliable method to quantify hair loss in mice, which will allow investigators to accurately assess and compare new therapeutic approaches for these various forms of alopecia. The method utilizes a standard gel imager to obtain and process images of mice, measuring the light absorption, which occurs in rough proportion to the amount of black (or gray) hair on the mouse. Data that has been quantified in this fashion can then be analyzed using standard statistical techniques (e.g., ANOVA, t-test). This methodology was tested in mouse models of chemotherapy-induced alopecia, alopecia areata and alopecia from waxing. In this report, the detailed protocol is presented for performing these measurements, including validation data from C57BL/6 and C3H/HeJ strains of mice. This new technique offers a number of advantages, including relative simplicity of application, reliance on equipment which is readily available in most research laboratories, and applying an objective, quantitative assessment which is more robust than subjective evaluations. Improvements in quantification of hair growth in mice will improve the study of alopecia models and facilitate evaluation of promising new therapies in preclinical studies.
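The measurement idea, mean light absorption over the imaged region as a proxy for dark-hair coverage, can be sketched with synthetic 8-bit stand-ins (the cohorts and intensity ranges below are invented for illustration, not the paper's data):

```python
import numpy as np

# Mean darkness over a region of interest as a proxy for dark-hair
# coverage, yielding one number per animal that ordinary statistics
# can compare across groups. Images are synthetic 8-bit stand-ins.
rng = np.random.default_rng(6)

def hair_score(image):
    """Mean darkness (0 = all white, 1 = all black) over the region."""
    return 1.0 - image.astype(float).mean() / 255.0

# Hypothetical cohorts: furred mice image darker than waxed mice
furred = [rng.integers(20, 80, (64, 64)) for _ in range(6)]    # dark coats
waxed = [rng.integers(150, 230, (64, 64)) for _ in range(6)]   # bare skin

scores_f = np.array([hair_score(im) for im in furred])
scores_w = np.array([hair_score(im) for im in waxed])
diff = scores_f.mean() - scores_w.mean()   # effect size for a t-test
```

The per-animal scores would then feed a t-test or ANOVA exactly as the abstract describes.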

  1. Consensus problem in directed networks of multi-agents via nonlinear protocols

    International Nuclear Information System (INIS)

    Liu Xiwei; Chen Tianping; Lu Wenlian

    2009-01-01

    In this Letter, the consensus problem via distributed nonlinear protocols for directed networks is investigated. The agents' dynamics are described by ordinary differential equations (ODEs). Based on graph theory, matrix theory and the Lyapunov direct method, some sufficient conditions on the nonlinear protocols guaranteeing asymptotical or exponential consensus are presented and rigorously proved. The main contribution of this work is that, for nonlinearly coupled networks, we generalize the results for undirected networks to directed networks. Consensus under the pinning control technique is also developed here. Simulations are given to illustrate the validity of the theory.
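    A minimal numerical illustration of nonlinear consensus on a directed network (a generic sketch, not the Letter's actual protocol or conditions): each agent adjusts its state through a nonlinear coupling function, here tanh, of the differences to its in-neighbours, and Euler integration of the ODEs shows the states converging to a common value.

```python
import math

# Directed ring 0 -> 1 -> 2 -> 3 -> 0: strongly connected, so it contains a
# directed spanning tree and consensus is reachable.  adj[i] lists the
# in-neighbours of agent i; all coupling weights are 1 (assumed values).
adj = {0: [3], 1: [0], 2: [1], 3: [2]}
x = {0: 4.0, 1: -2.0, 2: 1.0, 3: 7.0}

def step(x, dt=0.05):
    """One Euler step of  dx_i/dt = sum_j tanh(x_j - x_i)  over in-neighbours j."""
    return {i: xi + dt * sum(math.tanh(x[j] - xi) for j in adj[i])
            for i, xi in x.items()}

for _ in range(5000):
    x = step(x)

spread = max(x.values()) - min(x.values())  # shrinks toward 0 at consensus
```

    The tanh coupling is one admissible nonlinearity; the paper's sufficient conditions characterize which nonlinear protocols guarantee this kind of asymptotic or exponential convergence.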

  2. A Proof-checked Verification of a Real-Time Communication Protocol

    NARCIS (Netherlands)

    Polak, I.

    We present an analysis of a protocol developed by Philips to connect several components of an audio-system. The verification of the protocol is carried out using the timed I/O-automata model of Lynch and Vaandrager. The verification has been partially proof-checked with the interactive proof

  3. An adaptive transmission protocol for managing dynamic shared states in collaborative surgical simulation.

    Science.gov (United States)

    Qin, J; Choi, K S; Ho, Simon S M; Heng, P A

    2008-01-01

    A force prediction algorithm is proposed to facilitate virtual-reality (VR) based collaborative surgical simulation by reducing the effect of network latencies. State regeneration is used to correct the estimated prediction. This algorithm is incorporated into an adaptive transmission protocol in which auxiliary features such as view synchronization and coupling control are provided to ensure system consistency. We implemented this protocol using a multi-threaded technique on a cluster-based network architecture.
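    The prediction-plus-regeneration idea can be sketched as follows: between network updates the receiver extrapolates the remote force linearly, and whenever an authoritative state arrives the predictor "regenerates", snapping to the received value and refreshing its extrapolation rate. This is a generic dead-reckoning-style sketch under assumed linear dynamics, not the paper's specific algorithm.

```python
class ForcePredictor:
    """Linear extrapolation of a remote force value with state regeneration."""

    def __init__(self):
        self.force = 0.0  # last authoritative force value
        self.rate = 0.0   # estimated rate of change
        self.t = 0.0      # timestamp of the last authoritative update

    def regenerate(self, t, force):
        """An authoritative update arrived: correct the shared state and
        refresh the extrapolation rate from the last two updates."""
        if t > self.t:
            self.rate = (force - self.force) / (t - self.t)
        self.force, self.t = force, t

    def predict(self, t):
        """Estimate the force at time t, masking network latency."""
        return self.force + self.rate * (t - self.t)

p = ForcePredictor()
p.regenerate(0.0, 0.0)     # first authoritative update
p.regenerate(1.0, 2.0)     # force rising at 2 units/s
predicted = p.predict(1.5)  # extrapolated while the next packet is in flight
```

    Because each regeneration overwrites the predicted state with the authoritative one, prediction errors cannot accumulate beyond a single update interval, which is what keeps the dynamic shared state consistent across collaborators.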

  4. Analysis of rocks involving the x-ray diffraction, infrared and thermal gravimetric techniques

    International Nuclear Information System (INIS)

    Ikram, M.; Rauf, M.A.; Munir, N.

    1998-01-01

    Chemical analysis of rocks and minerals are usually obtained by a number of analytical techniques. The purpose of present work is to investigate the chemical composition of the rock samples and also to find that how far the results obtained by different instrumental methods are closely related. Chemical tests wee performed before using the instrumental techniques in order to determined the nature of these rocks. The chemical analysis indicated mainly the presence of carbonate and hence the carbonate nature of these rocks. The x-ray diffraction, infrared spectroscopy and thermal gravimetric analysis techniques were used for the determination of chemical composition of these samples. The results obtained by using these techniques have shown a great deal of similarities. (author)

  5. Uncertainty analysis techniques

    International Nuclear Information System (INIS)

    Marivoet, J.; Saltelli, A.; Cadelli, N.

    1987-01-01

    The origin of the uncertainty affecting Performance Assessments, as well as its propagation to dose and risk results, is discussed. The analysis is focused essentially on the uncertainties introduced by the input parameters, the values of which may range over several orders of magnitude and may be given as probability distribution functions. The paper briefly reviews the existing sampling techniques used for Monte Carlo simulations and the methods for characterizing the output curves, determining their convergence and confidence limits. Annual doses, expectation values of the doses and risks are computed for a particular case of a possible repository in clay, in order to illustrate the significance of such output characteristics as the mean, the logarithmic mean and the median, as well as their ratios. The report concludes that, provisionally, due to its better robustness, an estimator such as the 90th percentile may be substituted for the arithmetic mean when comparing the estimated doses with acceptance criteria. In any case, the results obtained through uncertainty analyses must be interpreted with caution as long as the input data distribution functions are not derived from experiments reasonably reproducing the situation in a well characterized repository and site.
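    The output characteristics discussed above are easy to reproduce with a toy Monte Carlo run. The dose model and lognormal parameters below are invented for illustration; the point is that for a skewed output distribution the median, mean and 90th percentile differ substantially, which is why a robust upper percentile can be preferable to the arithmetic mean.

```python
import random
import statistics

random.seed(1)

def annual_dose(transfer_coeff):
    """Toy dose model: dose scales linearly with one uncertain transfer
    coefficient (hypothetical; a real performance assessment model is
    far more complex)."""
    return 1.0e-6 * transfer_coeff  # Sv/yr

# Input parameter ranging over orders of magnitude: lognormal sampling
samples = sorted(annual_dose(random.lognormvariate(0.0, 1.5))
                 for _ in range(20000))

mean_dose = statistics.mean(samples)
median_dose = statistics.median(samples)
p90_dose = statistics.quantiles(samples, n=10)[-1]  # 90th percentile
```

    For such a heavy-tailed output the arithmetic mean is dominated by a few extreme realizations and converges slowly, while the 90th percentile is stable across runs; this mirrors the report's provisional recommendation.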

  6. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    OpenAIRE

    Rodica IVORSCHI

    2012-01-01

    SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by matching its strengths to opportunities, minimizing risks and eliminating weaknesses.

  7. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Rodica IVORSCHI

    2012-06-01

    Full Text Available SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by matching its strengths to opportunities, minimizing risks and eliminating weaknesses.

  8. Integrating usability testing and think-aloud protocol analysis with "near-live" clinical simulations in evaluating clinical decision support.

    Science.gov (United States)

    Li, Alice C; Kannry, Joseph L; Kushniruk, Andre; Chrimes, Dillon; McGinn, Thomas G; Edonyabo, Daniel; Mann, Devin M

    2012-11-01

    Usability evaluations can improve the usability and workflow integration of clinical decision support (CDS). Traditional usability testing using scripted scenarios with think-aloud protocol analysis provides a useful but incomplete assessment of how new CDS tools interact with users and clinical workflow. "Near-live" clinical simulations are a newer usability evaluation tool that more closely mimics clinical workflow and allows for a complementary evaluation of CDS usability as well as impact on workflow. This study employed two phases of testing of a new CDS tool that embedded clinical prediction rules (an evidence-based medicine tool) into primary care workflow within a commercial electronic health record. Phase I applied usability testing involving "think-aloud" protocol analysis of 8 primary care providers encountering several scripted clinical scenarios. Phase II used "near-live" clinical simulations of 8 providers interacting with video clips of standardized trained patient actors enacting the clinical scenario. In both phases, all sessions were audiotaped and had screen-capture software activated for onscreen recordings. Transcripts were coded using qualitative analysis methods. In Phase I, the impact of the CDS on navigation and workflow was associated with the largest volume of negative comments (accounting for over 90% of user-raised issues), while the overall usability and the content of the CDS were associated with the most positive comments. However, usability had a positive-to-negative comment ratio of only 0.93, reflecting mixed perceptions about the usability of the CDS. In Phase II, the duration of encounters with simulated patients was approximately 12 min, with 71% of the clinical prediction rules being activated after half of the visit had already elapsed. Upon activation, providers accepted the CDS tool pathway in 82% of the cases in which it was offered and completed all of its elements in 53% of all simulation cases.
Only 12.2% of encounter time was spent using the

  9. Mac protocols for wireless sensor network (wsn): a comparative study

    International Nuclear Information System (INIS)

    Arshad, J.; Akram, Q.; Saleem, Y.

    2014-01-01

    Data communication between nodes is carried out under a Medium Access Control (MAC) protocol, which is defined at the data link layer. MAC protocols are responsible for communication and coordination between nodes according to the defined standards in WSNs (Wireless Sensor Networks). The design of a MAC protocol should also address the issues of energy efficiency and transmission efficiency. A number of MAC protocols for WSNs exist in the literature. In this paper, nine MAC protocols for WSNs, namely S-MAC, T-MAC, Wise-MAC, Mu-MAC, Z-MAC, A-MAC, D-MAC, B-MAC and B-MAC+, have been explored, studied and analyzed. These nine protocols are classified into contention-based and hybrid (combination of contention- and schedule-based) MAC protocols. The goal of this comparative study is to provide a basis for MAC protocols and to highlight the different mechanisms used, with respect to parameters for the evaluation of energy and transmission efficiency in WSNs. This study also aims to give the reader a better understanding of the concepts, processes and flow of information used in these MAC protocols for WSNs. A comparison with respect to energy reservation scheme, idle listening avoidance, latency, fairness, data synchronization and throughput maximization is presented. It was found that contention-based MAC protocols are less energy efficient than hybrid MAC protocols. From the analysis of contention-based MAC protocols in terms of energy consumption, it was observed that protocols based on preamble sampling consume less energy than protocols based on static or dynamic sleep schedules. (author)
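    The final observation, that preamble sampling beats static sleep schedules at low traffic, can be illustrated with a toy energy model. All power and timing figures below are invented for illustration and are not taken from the cited protocols:

```python
P_LISTEN = 0.06  # W: radio power while listening (illustrative figure)

def energy_scheduled(duty_cycle, total_time):
    """Static sleep schedule (S-MAC-like): the radio listens for a fixed
    fraction of every frame, whether or not there is traffic."""
    return P_LISTEN * duty_cycle * total_time

def energy_preamble(sample_time, check_interval, total_time, n_msgs, recv_time):
    """Preamble sampling (B-MAC-like): the radio wakes only for brief
    channel samples and stays on just long enough to receive real traffic."""
    n_checks = total_time / check_interval
    return P_LISTEN * (n_checks * sample_time + n_msgs * recv_time)

T = 100.0  # seconds of operation with light traffic: 5 messages
e_sched = energy_scheduled(duty_cycle=0.10, total_time=T)
e_pream = energy_preamble(sample_time=0.002, check_interval=0.1,
                          total_time=T, n_msgs=5, recv_time=0.05)
```

    Under these assumed figures the scheduled listener spends 0.6 J while the preamble sampler spends far less, because idle listening, not transmission, dominates the energy budget at low traffic; the advantage shrinks as the message rate grows.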

  10. The tree identify protocol of IEEE 1394 in uCRL

    NARCIS (Netherlands)

    C. Shankland; M.B. van der Zwaag

    1998-01-01

    We specify the tree identify protocol of the IEEE 1394 high performance serial multimedia bus at three different levels of detail using μCRL. We use the cones and foci verification technique of Groote and Springintveld to show that the descriptions are equivalent under branching

  11. A Novel Process Audit for Standardized Perioperative Handoff Protocols.

    Science.gov (United States)

    Pallekonda, Vinay; Scholl, Adam T; McKelvey, George M; Amhaz, Hassan; Essa, Deanna; Narreddy, Spurthy; Tan, Jens; Templonuevo, Mark; Ramirez, Sasha; Petrovic, Michelle A

    2017-11-01

    A perioperative handoff protocol provides a standardized delivery of communication during a handoff that occurs from the operating room to the postanesthesia care unit or ICU. The protocol's success is dependent, in part, on its continued proper use over time. A novel process audit was developed to help ensure that a perioperative handoff protocol is used accurately and appropriately over time. The Audit Observation Form is used for the Audit Phase of the process audit, while the Audit Averages Form is used for the Data Analysis Phase. Employing minimal resources and using quantitative methods, the process audit provides the necessary means to evaluate the proper execution of any perioperative handoff protocol. Copyright © 2017 The Joint Commission. Published by Elsevier Inc. All rights reserved.

  12. Imaging techniques and investigation protocols in pediatric emergency imaging

    International Nuclear Information System (INIS)

    Scharitzer, M.; Hoermann, M.; Puig, S.; Prokop, M.

    2002-01-01

    Paediatric emergencies demand quick and efficient radiological investigation, with special attention to adjustments related to patient age and to radiation protection. Imaging modalities are improving rapidly and make it possible to diagnose childhood diseases and injuries more quickly, accurately and safely. This article provides an overview of imaging techniques adjusted to the age of the child and of imaging strategies for common paediatric emergencies. Optimising the imaging parameters (digital radiography, different screen-film systems, exposure specifications) allows a substantial reduction of the radiation dose. Spiral and multislice CT reduce scan time and enable a considerable reduction of radiation exposure if scanning parameters (pitch setting, tube current) are properly adjusted. MRI is still mainly used for neurological or spinal emergencies, despite the advent of fast imaging sequences. The radiologist's task is to select an appropriate imaging strategy according to the expected differential diagnosis and to adjust the imaging techniques to the individual patient. (orig.) [de

  13. A BWR 24-month cycle analysis using multicycle techniques

    International Nuclear Information System (INIS)

    Hartley, K.D.

    1993-01-01

    Boiling water reactor (BWR) fuel cycle design analyses have become increasingly challenging in the past several years. As utilities continue to seek improved capacity factors, reduced power generation costs, and reduced outage costs, longer cycle lengths and fuel design optimization become important considerations. Accurate multicycle analysis techniques are necessary to determine the viability of fuel designs and cycle operating strategies to meet reactor operating requirements, e.g., meet thermal and reactivity margin constraints, while minimizing overall fuel cycle costs. Siemens Power Corporation (SPC), Nuclear Division, has successfully employed multicycle analysis techniques with realistic rodded cycle depletions to demonstrate equilibrium fuel cycle performance in 24-month cycles. Analyses have been performed for a BWR/5 reactor, at both rated and uprated power conditions.

  14. Techniques and environments for big data analysis parallel, cloud, and grid computing

    CERN Document Server

    Dehuri, Satchidananda; Kim, Euiwhan; Wang, Gi-Name

    2016-01-01

    This volume is aimed at a wide range of readers and researchers in the area of Big Data, presenting recent advances in the field of Big Data Analysis as well as the techniques and tools used to analyze it. The book includes 10 distinct chapters providing a concise introduction to Big Data Analysis and to recent techniques and environments for Big Data Analysis. It gives insight into how the expensive fitness evaluation of evolutionary learning can play a vital role in big data analysis by adopting Parallel, Grid, and Cloud computing environments.

  15. The cardiovascular responses of male subjects to kung fu techniques. Expert/novice paradigm.

    Science.gov (United States)

    Jones, M A; Unnithan, V B

    1998-12-01

    The primary aim was to assess the cardiovascular responses of expert and novice subjects to kung fu techniques. It was hypothesised that experienced subjects would demonstrate improved economy of movement during the techniques, evidenced by reduced exercise intensity. A comparative design was established utilising two groups: experienced (group E) and novice (group N). The experimentation took place under laboratory conditions, but was designed to maximise external validity. The only preselection variables were regular attendance at training and experience. Nine experienced males (group E, exp 9.5 +/- 5.2 yrs) and nine novice males (group N, exp 1.2 +/- 0.1 yrs) participated. The only exclusion guidelines were contraindications to participation in a maximal test; no subjects were excluded on this basis. Each subject participated in three kung fu protocols (forms, kicking and punching). Each protocol, randomly allocated, consisted of ten work (30 sec) and ten rest periods (30 sec). Measures taken during the protocols were heart rate (HR) and oxygen consumption (VO2), expressed as a percentage of maximal values to reflect exercise intensity. During both the form protocol and the punching protocol, group E was found to be working at a significantly (p …) lower exercise intensity, indicating that the cardiovascular responses to kung fu techniques differ depending upon experience level. It is difficult to relate this directly to improved economy, since work output could not be accurately quantified. It was also found that the kung fu protocols elicited exercise intensities within the cardiovascular training zone.

  16. Comparative Performance Analysis of Machine Learning Techniques for Software Bug Detection

    OpenAIRE

    Saiqa Aleem; Luiz Fernando Capretz; Faheem Ahmed

    2015-01-01

    Machine learning techniques can be used to analyse data from different perspectives and enable developers to retrieve useful information. Machine learning techniques have proven useful for software bug prediction. In this paper, a comparative performance analysis of different machine learning techniques is explored for software bug prediction on publicly available data sets. Results showed most of the machine ...
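    A comparative evaluation of this kind boils down to training several models on module metrics and comparing a score such as accuracy. The sketch below uses a synthetic metrics set and compares a trivial majority-class baseline against a 1-nearest-neighbour classifier; both are illustrative stand-ins, not the specific techniques evaluated in the paper, and the metric distributions are invented.

```python
import math
import random

random.seed(7)

def make_module(buggy):
    """Synthetic software metrics (lines of code, cyclomatic complexity):
    buggy modules are drawn larger and more complex by construction."""
    if buggy:
        return (random.uniform(300, 600), random.uniform(10, 20), 1)
    return (random.uniform(50, 200), random.uniform(1, 5), 0)

data = [make_module(i % 2 == 0) for i in range(200)]
train, test = data[:150], data[150:]

def majority_baseline(train, test):
    """Always predict the most common training label; report test accuracy."""
    counts = [sum(1 for *_, y in train if y == c) for c in (0, 1)]
    guess = counts.index(max(counts))
    return sum(1 for *_, y in test if y == guess) / len(test)

def one_nn(train, test):
    """1-nearest-neighbour on (LOC, complexity); report test accuracy."""
    def predict(loc, cc):
        return min(train, key=lambda m: math.hypot(m[0] - loc, m[1] - cc))[2]
    return sum(1 for loc, cc, y in test if predict(loc, cc) == y) / len(test)

base_acc = majority_baseline(train, test)
nn_acc = one_nn(train, test)
```

    On real, publicly available defect data sets the classes are far less separable and feature scaling matters, but the evaluation loop, train on one split, score each technique on a held-out split, has the same shape.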

  17. A Simple XML Producer-Consumer Protocol

    Science.gov (United States)

    Smith, Warren; Gunter, Dan; Quesnel, Darcy; Biegel, Bryan (Technical Monitor)

    2001-01-01

    There are many different projects from government, academia, and industry that provide services for delivering events in distributed environments. The problem with these event services is that they are not general enough to support all uses and they speak different protocols so that they cannot interoperate. We require such interoperability when we, for example, wish to analyze the performance of an application in a distributed environment. Such an analysis might require performance information from the application, computer systems, networks, and scientific instruments. In this work we propose and evaluate a standard XML-based protocol for the transmission of events in distributed systems. One recent trend in government and academic research is the development and deployment of computational grids. Computational grids are large-scale distributed systems that typically consist of high-performance compute, storage, and networking resources. Examples of such computational grids are the DOE Science Grid, the NASA Information Power Grid (IPG), and the NSF Partnerships for Advanced Computing Infrastructure (PACIs). The major effort to deploy these grids is in the area of developing the software services to allow users to execute applications on these large and diverse sets of resources. These services include security, execution of remote applications, managing remote data, access to information about resources and services, and so on. There are several toolkits for providing these services such as Globus, Legion, and Condor. As part of these efforts to develop computational grids, the Global Grid Forum is working to standardize the protocols and APIs used by various grid services. This standardization will allow interoperability between the client and server software of the toolkits that are providing the grid services. 
The goal of the Performance Working Group of the Grid Forum is to standardize protocols and representations related to the storage and distribution of
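    At its core, such a protocol is just an agreed XML representation of an event plus producer and consumer routines that any toolkit can implement. The element and attribute names below, and the host name, are invented for illustration; the actual schema proposed to the Grid Forum is not reproduced here.

```python
import xml.etree.ElementTree as ET

def produce_event(name, source, timestamp, data):
    """Producer side: serialize one monitoring event as XML text."""
    ev = ET.Element("event", name=name, source=source, timestamp=str(timestamp))
    for key, value in data.items():
        ET.SubElement(ev, "datum", name=key).text = str(value)
    return ET.tostring(ev, encoding="unicode")

def consume_event(text):
    """Consumer side: parse the XML text back into native values."""
    ev = ET.fromstring(text)
    data = {d.get("name"): d.text for d in ev.findall("datum")}
    return ev.get("name"), ev.get("source"), float(ev.get("timestamp")), data

# Round trip: a producer on one host emits a CPU-load event; any consumer
# that speaks the same XML format can decode it, regardless of toolkit.
msg = produce_event("cpu.load", "node42.example.grid", 1234.5,
                    {"load1": "0.87", "ncpus": "8"})
event = consume_event(msg)
```

    Because the wire format is plain XML rather than a toolkit-specific binary encoding, events from applications, computer systems, networks, and instruments can all be carried by the same channel and correlated by a single performance-analysis consumer.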

  18. SHOT PUT O’BRIAN TECHNIQUE, EXTENDING THE ANALYSIS OF TECHNIQUE FROM FOUR TO SIX PHASES WITH THE DESCRIPTION

    Directory of Open Access Journals (Sweden)

    Zlatan Saračević

    2011-09-01

    Full Text Available Due to the complexity of the motion, the shot put technique is described in phases for easier analysis, easier learning of the technique and error correction. The technique is complete in the sense that, in its execution, the transition from phase to phase is not noticeable. In the aforementioned and described phases of the O'Brian shot put technique, a large distance, emptiness and disconnection appear between the initial-position phase and the phase of overtaking the device. For the training methods and technique instruction in primary and secondary education, as well as for students and beginner shot put athletes, this represents a major problem regarding connecting, training and advancing the technique. Therefore, this work is aimed at facilitating the teaching of the shot put technique by extending the analysis from four to six phases, which have been described and cover the complete O'Brian technique.

  19. Improvement of burn pain management through routine pain monitoring and pain management protocol.

    Science.gov (United States)

    Yang, Hyeong Tae; Hur, Giyeun; Kwak, In-Suk; Yim, Haejun; Cho, Yong Suk; Kim, Dohern; Hur, Jun; Kim, Jong Hyun; Lee, Boung Chul; Seo, Cheong Hoon; Chun, Wook

    2013-06-01

    Pain management is an important aspect of burn management. We developed a routine pain monitoring system and pain management protocol for burn patients. The purpose of this study is to evaluate the effectiveness of our new pain management system. From May 2011 to November 2011, a prospective study was performed with 107 burn patients. We first performed control group (n=58) data analysis and then developed the pain management protocol and monitoring system. Next, we applied our protocol to patients, performed protocol group (n=49) data analysis, and compared this to the control group data. Data analysis was performed using the Numeric Rating Scale (NRS) of background pain and procedural pain, the Clinician-Administered PTSD Scale (CAPS), the Hamilton Depression Rating Scale (HDRS), the State-Trait Anxiety Inventory Scale (STAIS), and the Holmes and Rahe Stress Scale (HRSS). The NRS of background pain for the protocol group was significantly decreased compared to the control group (2.8±2.0 versus 3.9±1.9), and the NRS of procedural pain of the protocol group was significantly decreased compared to the control group (3.7±2.5 versus 4.8±2.8). CAPS and HDRS were decreased in the protocol group, but without statistical significance. STAIS and HRSS were decreased in the protocol group, but only the STAIS reached statistical significance. Our new pain management system was effective in burn pain management. However, adequate pain management can only be accomplished by a continuous and thorough effort. Therefore, pain control protocols and pain monitoring systems need to be under constant revision and improvement using creative ideas and approaches. Copyright © 2012 Elsevier Ltd and ISBI. All rights reserved.

  20. Static Validation of a Voting Protocol

    DEFF Research Database (Denmark)

    Nielsen, Christoffer Rosenkilde; Andersen, Esben Heltoft; Nielson, Hanne Riis

    2005-01-01

    is formalised in an extension of the LySa process calculus with blinding signatures. The analysis, which is fully automatic, pinpoints previously undiscovered flaws related to verifiability and accuracy and we suggest modifications of the protocol needed for validating these properties....