WorldWideScience

Sample records for protocol analysis techniques

  1. The Network Protocol Analysis Technique in Snort

    Science.gov (United States)

    Wu, Qing-Xiu

Network protocol analysis is the technical means by which a network sniffer captures packets for further analysis and understanding. Network sniffing intercepts packets and reassembles the original message content from its binary format. To obtain the information they contain, the packets must be restored according to the protocol specifications of the TCP/IP protocol stack, recovering the protocol format and content at each protocol layer, up to the actual data transferred at the application tier.
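
    As a rough illustration of the layer-by-layer restoration described above, the sketch below decodes a hand-crafted IPv4/TCP frame with Python's struct module. The byte values, field subset, and function names are invented for the example; this is not Snort code.

```python
# A minimal sketch of layer-by-layer packet decoding, assuming a raw IPv4/TCP
# frame captured without the Ethernet header. All values are hypothetical.
import struct

def parse_ipv4(data: bytes):
    """Decode the fixed 20-byte IPv4 header per the protocol specification."""
    version_ihl, tos, total_len, ident, flags_frag, ttl, proto, checksum = \
        struct.unpack("!BBHHHBBH", data[:12])
    src, dst = data[12:16], data[16:20]
    ihl = (version_ihl & 0x0F) * 4          # header length in bytes
    return {"proto": proto, "src": ".".join(map(str, src)),
            "dst": ".".join(map(str, dst))}, data[ihl:]

def parse_tcp(data: bytes):
    """Decode the TCP header and return the application-layer payload."""
    src_port, dst_port, seq, ack, offset_flags = struct.unpack("!HHIIH", data[:14])
    data_offset = (offset_flags >> 12) * 4  # TCP header length in bytes
    return {"src_port": src_port, "dst_port": dst_port}, data[data_offset:]

# Hand-crafted frame: 20-byte IP header + 20-byte TCP header + payload.
raw = (bytes([0x45, 0, 0, 45, 0, 0, 0, 0, 64, 6]) + b"\x00\x00"
       + bytes([10, 0, 0, 1]) + bytes([10, 0, 0, 2])
       + struct.pack("!HHIIHHHH", 12345, 80, 0, 0, 5 << 12, 0, 0, 0)
       + b"GET /")
ip, rest = parse_ipv4(raw)
tcp, payload = parse_tcp(rest)
print(ip, tcp, payload)   # the application-tier data is recovered last
```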

  2. Affinity biosensors: techniques and protocols

    National Research Council Canada - National Science Library

    Rogers, Kim R; Mulchandani, Ashok

    1998-01-01

    ..., and government to begin or expand their biosensors research. This volume, Methods in Biotechnology vol. 7: Affinity Biosensors: Techniques and Protocols, describes a variety of classical and emerging transduction technologies that have been interfaced to bioaffinity elements (e.g., antibodies and receptors). Some of the reas...

  3. Split bolus technique in polytrauma: a prospective study on scan protocols for trauma analysis

    NARCIS (Netherlands)

    Beenen, Ludo F. M.; Sierink, Joanne C.; Kolkman, Saskia; Nio, C. Yung; Saltzherr, Teun Peter; Dijkgraaf, Marcel G. W.; Goslings, J. Carel

    2015-01-01

    For the evaluation of severely injured trauma patients a variety of total body computed tomography (CT) scanning protocols exist. Frequently multiple pass protocols are used. A split bolus contrast protocol can reduce the number of passes through the body, and thereby radiation exposure, in this

  4. A Comparative Analysis of Transmission Control Protocol Improvement Techniques over Space-Based Transmission Media

    National Research Council Canada - National Science Library

    Lawson, Joseph M

    2006-01-01

    ... justification for the implementation of a given enhancement technique. The research questions were answered through model and simulation of a satellite transmission system via a Linux-based network topology...

  5. A Comparative Analysis of Transmission Control Protocol Improvement Techniques over Space-Based Transmission Media

    National Research Council Canada - National Science Library

    Lawson, Joseph M

    2006-01-01

    The purpose of this study was to assess the throughput improvement afforded by the various TCP optimization techniques, with respect to a simulated geosynchronous satellite system, to provide a cost...

  6. Fetal MRI: techniques and protocols

    International Nuclear Information System (INIS)

    Prayer, Daniela; Brugger, Peter Christian; Prayer, Lucas

    2004-01-01

    The development of ultrafast sequences has led to a significant improvement in fetal MRI. Imaging protocols have to be adjusted to the rapidly developing fetal central nervous system (CNS) and to the clinical question. Sequence parameters must be changed to cope with the respective developmental stage, to produce images free from motion artefacts and to provide optimum visualization of the region and focus of interest. In contrast to postnatal studies, every suspected fetal CNS abnormality requires examination of the whole fetus and the extrafetal intrauterine structures including the uterus. This approach covers both aspects of fetal CNS disorders: isolated and complex malformations and cerebral lesions arising from the impaired integrity of the feto-placental unit. (orig.)

  7. Fetal MRI: techniques and protocols

    Energy Technology Data Exchange (ETDEWEB)

    Prayer, Daniela [Department of Neuroradiology, University Clinics of Radiodiagnostics, Medical University Vienna, Waehringerguertel 18-10, 1090, Vienna (Austria); Brugger, Peter Christian [Department of Anatomy, Integrative Morphology Group, Medical University Vienna (Austria); Prayer, Lucas [Diagnosezentrum Urania, Vienna (Austria)

    2004-09-01

    The development of ultrafast sequences has led to a significant improvement in fetal MRI. Imaging protocols have to be adjusted to the rapidly developing fetal central nervous system (CNS) and to the clinical question. Sequence parameters must be changed to cope with the respective developmental stage, to produce images free from motion artefacts and to provide optimum visualization of the region and focus of interest. In contrast to postnatal studies, every suspected fetal CNS abnormality requires examination of the whole fetus and the extrafetal intrauterine structures including the uterus. This approach covers both aspects of fetal CNS disorders: isolated and complex malformations and cerebral lesions arising from the impaired integrity of the feto-placental unit. (orig.)

  8. Mobile Internet Protocol Analysis

    National Research Council Canada - National Science Library

    Brachfeld, Lawrence

    1999-01-01

    ...) and User Datagram Protocol (UDP). Mobile IP allows mobile computers to send and receive packets addressed with their home network IP address, regardless of the IP address of their current point of attachment on the Internet...

  9. Automata Techniques for Epistemic Protocol Synthesis

    Directory of Open Access Journals (Sweden)

    Guillaume Aucher

    2014-04-01

    Full Text Available In this work we aim at applying automata techniques to problems studied in Dynamic Epistemic Logic, such as epistemic planning. To do so, we first remark that repeatedly executing a propositional event model ad infinitum from an initial epistemic model yields a relational structure that can be finitely represented with automata. This correspondence, together with recent results on uniform strategies, allows us to give an alternative decidability proof of the epistemic planning problem for propositional events, with, as by-products, accurate upper bounds on its time complexity and the possibility of synthesizing a finite word automaton that describes the set of all solution plans. In fact, using automata techniques enables us to solve a much more general problem, which we introduce and call epistemic protocol synthesis.

  10. Cryptographic protocol security analysis based on bounded constructing algorithm

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    An efficient approach to analyzing cryptographic protocols is to develop automatic analysis tools based on formal methods. However, this approach has encountered the problem of high computational complexity, because protocol participants are arbitrary, their message structures are complex, and their executions are concurrent. We propose an efficient automatic verifying algorithm for analyzing cryptographic protocols based on the recently proposed Cryptographic Protocol Algebra (CPA) model, in which algebraic techniques are used to simplify the description of cryptographic protocols and their executions. Redundant states generated in the analysis processes are much reduced by introducing a new algebraic technique called Universal Polynomial Equation, and the algorithm can be used to verify the correctness of protocols in an infinite state space. We have implemented an efficient automatic analysis tool for cryptographic protocols, called ACT-SPA, based on this algorithm, and used the tool to check more than 20 cryptographic protocols. The analysis results show that this tool is more efficient, and an attack instance not previously reported was found by using this tool.

  11. Energy Reduction Multipath Routing Protocol for MANET Using Recoil Technique

    Directory of Open Access Journals (Sweden)

    Rakesh Kumar Sahu

    2018-04-01

    Full Text Available In Mobile Ad-hoc Networks (MANET), power conservation and utilization is an acute problem that has received significant attention from academia and industry in recent years. Nodes in a MANET function on battery power, which is a scarce and limited energy resource; hence, its conservation and utilization should be done judiciously for the effective functioning of the network. In this paper, a novel protocol, namely the Energy Reduction Multipath Routing Protocol for MANET using Recoil Technique (AOMDV-ER), is proposed, which conserves energy along with optimal network lifetime, routing overhead, packet delivery ratio and throughput. It performs better than other AODV-based algorithms, as in AOMDV-ER the nodes transmit packets to their destination smartly by using a varying recoil off time technique based on their geographical location. This concept reduces the number of transmissions, which results in an improvement of network lifetime. In addition, the local-level route maintenance reduces the additional routing overhead. Lastly, the prediction-based link lifetime of each node is estimated, which helps in reducing the packet loss in the network. This protocol has three subparts: an optimal route discovery algorithm amalgamated with a residual energy and distance mechanism; a coordinated recoiled-nodes algorithm, which reduces the number of transmissions in order to reduce data redundancy, redundant traffic, routing overhead and end-to-end delay, and to enhance the network lifetime; and a link reckoning and route maintenance algorithm to improve the packet delivery ratio and link stability in the network. The experimental results show that the AOMDV-ER protocol saves at least 16% energy consumption and achieves a 12% reduction in routing overhead, with significant gains in network lifetime and packet delivery ratio compared with the Ad hoc on demand multipath distance vector routing protocol (AOMDV), Ad hoc on demand multipath distance vector routing protocol life
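
    The abstract does not give AOMDV-ER's exact formula, so the sketch below shows one plausible reading of a location-based recoil off time: nodes that make more geographic progress toward the destination back off for less time, so the best-placed relay forwards first and its peers suppress their own transmissions. All parameter names and values are hypothetical.

```python
# Hypothetical distance-weighted recoil back-off; not the authors' formula.
import math, random

def recoil_off_time(node_xy, dest_xy, radio_range, t_max=0.05):
    """Nodes positioned farther along toward the destination wait less before
    forwarding, so redundant transmissions by worse-placed nodes 'recoil'."""
    dist = math.dist(node_xy, dest_xy)
    progress = max(0.0, 1.0 - dist / radio_range)   # 1.0 = best positioned
    jitter = random.uniform(0, 0.1 * t_max)         # break ties between peers
    return t_max * (1.0 - progress) + jitter

print(recoil_off_time((0, 0), (80, 60), radio_range=250))
```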

  12. Symbolic Analysis of Cryptographic Protocols

    DEFF Research Database (Denmark)

    Dahl, Morten

    We present our work on using abstract models for formally analysing cryptographic protocols: First, we present an efficient method for verifying trace-based authenticity properties of protocols using nonces, symmetric encryption, and asymmetric encryption. The method is based on a type system... of Gordon et al., which we modify to support fully-automated type inference. Tests conducted via an implementation of our algorithm found it to be very efficient. Second, we show how privacy may be captured in a symbolic model using an equivalence-based property and give a formal definition. We formalise...

  13. A Logical Analysis of Quantum Voting Protocols

    Science.gov (United States)

    Rad, Soroush Rafiee; Shirinkalam, Elahe; Smets, Sonja

    2017-12-01

    In this paper we provide a logical analysis of the Quantum Voting Protocol for Anonymous Surveying as developed by Horoshko and Kilin in (Phys. Lett. A 375, 1172-1175, 2011). In particular we make use of the probabilistic logic of quantum programs as developed in (Int. J. Theor. Phys. 53, 3628-3647, 2014) to provide a formal specification of the protocol and to derive its correctness. Our analysis is part of a wider program on the application of quantum logics to the formal verification of protocols in quantum communication and quantum computation.

  14. An Evaluation Methodology for Protocol Analysis Systems

    Science.gov (United States)

    2007-03-01

    Main Memory Requirement; NS: Needham-Schroeder; NSL: Needham-Schroeder-Lowe; OCaml: Objective Caml; POSIX: Portable Operating System... methodology is needed. As with any field, there is a specialized language used within the protocol analysis community. ... ProVerif requires that Objective Caml (OCaml) be installed on the system; OCaml version 3.09.3 was installed.

  15. Analysis of Security Protocols by Annotations

    DEFF Research Database (Denmark)

    Gao, Han

    The trend in Information Technology is that distributed systems and networks are becoming increasingly important, as most of the services and opportunities that characterise the modern society are based on these technologies. Communication among agents over networks has therefore acquired a great deal of research interest. In order to provide effective and reliable means of communication, more and more communication protocols are invented, and for most of them, security is a significant goal. It has long been a challenge to determine conclusively whether a given protocol is secure or not. The development of formal techniques, e.g. control flow analyses, that can check various security properties, is an important tool to meet this challenge. This dissertation contributes to the development of such techniques. In this dissertation, security protocols are modelled in the process calculus LySa...

  16. Protocol Analysis as a Method for Analyzing Conversational Data.

    Science.gov (United States)

    Aleman, Carlos G.; Vangelisti, Anita L.

    Protocol analysis, a technique that uses people's verbal reports about their cognitions as they engage in an assigned task, has been used in a number of applications to provide insight into how people mentally plan, assess, and carry out those assignments. Using a system of networked computers where actors communicate with each other over…

  17. Rethinking Protocol Analysis from a Cultural Perspective.

    Science.gov (United States)

    Smagorinsky, Peter

    2001-01-01

    Outlines a cultural-historical activity theory (CHAT) perspective that accounts for protocol analysis along three key dimensions: the relationship between thinking and speech from a representational standpoint; the social role of speech in research methodology; and the influence of speech on thinking and data collection. (Author/VWL)

  18. INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Caescu Stefan Claudiu

    2011-12-01

    Full Text Available Theme The situation analysis, as a separate component of strategic planning, involves collecting and analysing relevant types of information on the components of the marketing environment and their evolution on the one hand, and on the organization’s resources and capabilities on the other. Objectives of the Research The main purpose of the study of the analysis techniques of the internal environment is to provide insight into those aspects that are of strategic importance to the organization. Literature Review The marketing environment consists of two distinct components: the internal environment, which is made from specific variables within the organization, and the external environment, which is made from variables external to the organization. Although analysing the external environment is essential for corporate success, it is not enough unless it is backed by a detailed analysis of the internal environment of the organization. The internal environment includes all elements that are endogenous to the organization, which are influenced to a great extent and totally controlled by it. The study of the internal environment must answer all resource-related questions, solve all resource management issues and represents the first step in drawing up the marketing strategy. Research Methodology The present paper accomplished a documentary study of the main techniques used for the analysis of the internal environment. Results The specialist literature emphasizes that differences in performance from one organization to another are primarily dependent not on the differences between fields of activity, but especially on the differences between resources and capabilities and the ways these are capitalized on. The main methods of analysing the internal environment addressed in this paper are: the analysis of the organizational resources, the performance analysis, the value chain analysis and the functional analysis. Implications Basically such

  19. Analysis of security protocols based on challenge-response

    Institute of Scientific and Technical Information of China (English)

    LUO JunZhou; YANG Ming

    2007-01-01

    A security protocol is specified as a challenge-response procedure, which uses applied cryptography to confirm the existence of other principals and to fulfill some data negotiation such as session keys. Most of the existing analysis methods, which either adopt theorem proving techniques such as state exploration or logic reasoning techniques such as authentication logic, face a conflict between analysis power and operability. To solve this problem, a new efficient method is proposed that provides an SSM semantics-based definition of secrecy and authentication goals and applies authentication logic as the fundamental analysis technique, in which secrecy analysis is split into two parts, Explicit-Information-Leakage and Implicit-Information-Leakage, and correspondence analysis is concluded as the analysis of the existence relationship of Strands and the agreement of Strand parameters. This new method combines the power of the Strand Space Model with the concision of authentication logic.
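
    For readers unfamiliar with the challenge-response pattern the paper formalizes, here is a generic nonce-based exchange sketched with a shared key and an HMAC; the message format and key handling are invented for illustration and are not taken from the paper.

```python
# Generic challenge-response: the verifier's fresh nonce prevents replay, and
# a correct MAC proves the responder holds the shared key.
import hmac, hashlib, secrets

shared_key = secrets.token_bytes(32)          # assumed established out of band

def challenge():
    """Verifier sends a fresh random nonce to confirm the peer's existence."""
    return secrets.token_bytes(16)

def respond(key, nonce):
    """Prover demonstrates knowledge of the key by MACing the nonce."""
    return hmac.new(key, nonce, hashlib.sha256).digest()

nonce = challenge()
response = respond(shared_key, nonce)
assert hmac.compare_digest(response, respond(shared_key, nonce))
print("principal authenticated; a replayed response fails on a fresh nonce")
```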

  20. Effectiveness of behavioral change techniques employed in eHealth interventions designed to improve glycemic control in persons with poorly controlled type 2 diabetes: a systematic review and meta-analysis protocol

    Directory of Open Access Journals (Sweden)

    Mihiretu Kebede

    2017-10-01

    Full Text Available Abstract Background The incorporation of Behavioral Change Techniques (BCTs) in eHealth interventions for the management of non-communicable diseases (NCDs), such as type 2 diabetes mellitus (T2DM), might be a promising approach to improve clinical and behavioral outcomes of NCDs in the long run. This paper reports a protocol for a systematic review that aims to (a) identify the effects of individual BCTs in eHealth interventions for lowering glycated hemoglobin (HbA1c) levels and (b) investigate which additional intervention features (duration of intervention, tailoring, theory-base, and mode of delivery) affect levels of HbA1c in this population. The protocol follows the Preferred Reporting Items for Systematic review and Meta-Analysis Protocols (PRISMA-P 2015) guideline. Methods/design To identify eligible studies, an extensive systematic database search (PubMed, Web of Science, and PsycINFO) using keywords will be conducted. This review will include randomized controlled trials examining the effects of eHealth interventions on HbA1c in persons with poorly controlled T2DM over a minimum follow-up period of 3 months. Relevant data will be extracted from the included studies using Microsoft Excel. The content of the interventions will be extracted from the description of interventions and will be classified according to the BCT taxonomy v1 tool. The quality of studies will be independently assessed by two reviewers using the Cochrane risk of bias tool. If the studies have adequate homogeneity, meta-analysis will be considered. The effect sizes of each BCT will be calculated using the random effect model. The quality of the synthesized evidence will be evaluated employing the Grading of Recommendations Assessment, Development and Evaluation (GRADE) criteria. Discussion This systematic review is one of the first to appraise the effectiveness of eHealth interventions employing BCTs which aim at improving glycemic control in persons with poorly
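
    The protocol names random-effects pooling of per-BCT effect sizes; a generic DerSimonian-Laird computation of that step looks like the sketch below. The effect sizes and variances are invented, not data from the review.

```python
# Random-effects pooling (DerSimonian-Laird) with hypothetical HbA1c
# mean differences; illustrates the method only.
import math

effects = [-0.50, -0.30, -0.80, -0.10]        # invented study effect sizes
variances = [0.04, 0.09, 0.06, 0.05]          # invented within-study variances

w = [1 / v for v in variances]                 # fixed-effect weights
fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
df = len(effects) - 1
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - df) / c)                  # between-study variance

w_star = [1 / (v + tau2) for v in variances]   # random-effects weights
pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
se = math.sqrt(1 / sum(w_star))
print(f"pooled effect {pooled:.2f}, "
      f"95% CI ({pooled - 1.96 * se:.2f}, {pooled + 1.96 * se:.2f})")
```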

  1. Analysis of Security Protocols in Embedded Systems

    DEFF Research Database (Denmark)

    Bruni, Alessandro

    Embedded real-time systems have been adopted in a wide range of safety-critical applications—including automotive, avionics, and train control systems—where the focus has long been on safety (i.e., protecting the external world from the potential damage caused by the system) rather than security (i.e., protecting the system from the external world). With increased connectivity of these systems to external networks the attack surface has grown, and consequently there is a need for securing the system from external attacks. Introducing security protocols in safety critical systems requires careful... in this direction is to extend saturation-based techniques so that enough state information can be modelled and analysed. Finally, we present a methodology for proving the same security properties in the computational model, by means of typing protocol implementations.

  2. Decision Analysis Technique

    Directory of Open Access Journals (Sweden)

    Hammad Dabo Baba

    2014-01-01

    Full Text Available One of the most significant steps in building structure maintenance decisions is the physical inspection of the facility to be maintained. The physical inspection involves cursory assessment of the structure and ratings of the identified defects based on expert evaluation. The objective of this paper is to present a novel approach to prioritizing the criticality of physical defects in a residential building system using a multi-criteria decision analysis approach. A residential building constructed in 1985 was considered in this study. Four criteria are considered in the inspection: Physical Condition of the building system (PC), Effect on Asset (EA), Effect on Occupants (EO) and Maintenance Cost (MC). The building was divided into nine systems, regarded as alternatives. Expert Choice software was used in comparing the importance of the criteria against the main objective, whereas a structured proforma was used in quantifying the defects observed on all building systems against each criterion. The defect severity score of each building system was identified and later multiplied by the weight of the criteria, and the final hierarchy was derived. The final ranking indicates that the electrical system was considered the most critical system, with a risk value of 0.134, while the ceiling system scored the lowest risk value of 0.066. The technique is often used in prioritizing mechanical equipment for maintenance planning. However, the results of this study indicate that the technique could be used in prioritizing building systems for maintenance planning.
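
    The final aggregation step (criterion weight times defect severity, summed per building system) can be sketched as below. The weights and scores are invented; in the paper they come from pairwise comparisons in Expert Choice and from the structured proforma.

```python
# Weighted-sum prioritization over hypothetical criterion weights and scores.
criteria_weights = {"PC": 0.40, "EA": 0.25, "EO": 0.20, "MC": 0.15}

# invented defect-severity scores per building system against each criterion
systems = {
    "electrical": {"PC": 0.5, "EA": 0.4, "EO": 0.5, "MC": 0.3},
    "ceiling":    {"PC": 0.2, "EA": 0.1, "EO": 0.2, "MC": 0.2},
    "roofing":    {"PC": 0.4, "EA": 0.3, "EO": 0.3, "MC": 0.4},
}

risk = {name: sum(criteria_weights[c] * s for c, s in scores.items())
        for name, scores in systems.items()}

for name, value in sorted(risk.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name:12s} risk = {value:.3f}")   # highest risk = maintain first
```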

  3. Results of a protocol of transfusion threshold and surgical technique on transfusion requirements in burn patients.

    Science.gov (United States)

    O'Mara, Michael S; Hayetian, Fernando; Slater, Harvey; Goldfarb, I William; Tolchin, Eric; Caushaj, Philip F

    2005-08-01

    Blood loss and high rates of transfusion in burn centers remain an area of ongoing concern. Blood use brings the risk of infection, adverse reaction, and immunosuppression. A protocol to reduce blood loss and blood use was implemented. Analysis included 3-year periods before and after institution of the protocol. All patients were transfused for a hemoglobin below 8.0 gm/dL. Operations per admission did not change during the two time periods (0.78 in each). Overall units transfused per operation decreased from 1.56+/-0.06 to 1.25+/-0.14 units after instituting the protocol (p < 0.05). The decrease was most pronounced in burns of less than 20% surface area, declining from 386 to 46 units after protocol institution, from 0.37 to 0.04 units per admission, and from 0.79 to 0.08 units per operation in this group of smallest burns. There was no change noted in the larger burns. This study suggests that a defined protocol of hemostasis, technique, and transfusion trigger should be implemented in the process of burn excision and grafting. This will help especially those patients with the smallest burns, essentially eliminating transfusion need in that group.

  4. Uncertainty analysis techniques

    International Nuclear Information System (INIS)

    Marivoet, J.; Saltelli, A.; Cadelli, N.

    1987-01-01

    The origin of the uncertainty affecting Performance Assessments, as well as its propagation to dose and risk results, is discussed. The analysis is focused essentially on the uncertainties introduced by the input parameters, the values of which may range over some orders of magnitude and may be given as probability distribution functions. The paper briefly reviews the existing sampling techniques used for Monte Carlo simulations and the methods for characterizing the output curves, determining their convergence and confidence limits. Annual doses, expectation values of the doses and risks are computed for a particular case of a possible repository in clay, in order to illustrate the significance of such output characteristics as the mean, the logarithmic mean and the median, as well as their ratios. The report concludes that, provisionally, due to its better robustness, an estimate such as the 90th percentile may be substituted for the arithmetic mean when comparing the estimated doses with acceptance criteria. In any case, the results obtained through Uncertainty Analyses must be interpreted with caution as long as input data distribution functions are not derived from experiments reasonably reproducing the situation in a well characterized repository and site.
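
    A toy version of the workflow described above (sample inputs spanning orders of magnitude, push them through the model, compare the mean against the more robust percentile estimate) is sketched below with a placeholder dose model; it is not the repository model from the report.

```python
# Monte Carlo propagation with log-uniform inputs and percentile summaries.
import random, statistics

def dose_model(k_sorption, flow_rate):
    return 1e-3 * flow_rate / k_sorption       # hypothetical transfer function

samples = []
for _ in range(10_000):
    k = 10 ** random.uniform(0, 3)             # spans 3 orders of magnitude
    q = 10 ** random.uniform(-1, 1)
    samples.append(dose_model(k, q))

samples.sort()
print("mean  :", statistics.fmean(samples))
print("median:", statistics.median(samples))
print("p90   :", samples[int(0.9 * len(samples))])   # the more robust summary
```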

  5. RCRA groundwater data analysis protocol for the Hanford Site, Washington

    International Nuclear Information System (INIS)

    Chou, C.J.; Jackson, R.L.

    1992-04-01

    The Resource Conservation and Recovery Act of 1976 (RCRA) groundwater monitoring program currently involves site-specific monitoring of 20 facilities on the Hanford Site in southeastern Washington. The RCRA groundwater monitoring program has collected abundant data on groundwater quality. These data are used to assess the impact of a facility on groundwater quality or whether remediation efforts under RCRA corrective action programs are effective. Both evaluations rely on statistical analysis of groundwater monitoring data. The need for information on groundwater quality by regulators and environmental managers makes statistical analysis of monitoring data an important part of RCRA groundwater monitoring programs. The complexity of groundwater monitoring programs and variabilities (spatial, temporal, and analytical) exhibited in groundwater quality variables indicate the need for a data analysis protocol to guide statistical analysis. A data analysis protocol was developed from the perspective of addressing regulatory requirements, data quality, and management information needs. This data analysis protocol contains four elements: data handling methods; graphical evaluation techniques; statistical tests for trend, central tendency, and excursion analysis; and reporting procedures for presenting results to users
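
    The abstract does not list the protocol's exact test battery, but a nonparametric trend test of the kind such protocols prescribe for monitoring-well time series is Mann-Kendall; a bare-bones version (no tie correction) with invented quarterly data is sketched below.

```python
# Mann-Kendall trend test, simplified: no correction for tied values.
import math

def mann_kendall(series):
    """Return the S statistic and its normal-approximation z-score."""
    n = len(series)
    s = sum((series[j] > series[i]) - (series[j] < series[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

nitrate = [2.1, 2.3, 2.2, 2.6, 2.8, 2.7, 3.1, 3.4]   # invented mg/L samples
s, z = mann_kendall(nitrate)
print(f"S = {s}, z = {z:.2f}  (|z| > 1.96 suggests a trend at the 5% level)")
```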

  6. Toward Synthesis, Analysis, and Certification of Security Protocols

    Science.gov (United States)

    Schumann, Johann

    2004-01-01

    Implemented security protocols are basically pieces of software which are used to (a) authenticate the other communication partners, (b) establish a secure communication channel between them (using insecure communication media), and (c) transfer data between the communication partners in such a way that these data are only available to the desired receiver, but not to anyone else. Such an implementation usually consists of the following components: the protocol engine, which controls the sequence in which the messages of the protocol are sent over the network and which controls the assembly/disassembly and processing (e.g., decryption) of the data; the cryptographic routines to actually encrypt or decrypt the data (using given keys); and the interface to the operating system and to the application. For a correct working of such a security protocol, all of these components must work flawlessly. Many formal-methods based techniques for the analysis of security protocols have been developed. They range from using specific logics (e.g., BAN-logic [4] or higher order logics [12]) to model checking [2] approaches. In each approach, the analysis tries to prove that no one (or at least no modeled intruder) can get access to secret data. Otherwise, a scenario illustrating the attack may be produced. Despite the seeming simplicity of security protocols ("only" a few messages are sent between the protocol partners in order to ensure a secure communication), many flaws have been detected. Unfortunately, even a perfect protocol engine does not guarantee flawless working of a security protocol, as incidents show. Many break-ins and security vulnerabilities are caused by exploiting errors in the implementation of the protocol engine or the underlying operating system. Attacks using buffer-overflows are a very common class of such attacks. Errors in the implementation of exception or error handling can open up additional vulnerabilities. For example, on a website with a log-in screen

  7. Analysis and analytical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Batuecas Rodriguez, T [Department of Chemistry and Isotopes, Junta de Energia Nuclear, Madrid (Spain)

    1967-01-01

    The technology associated with the use of organic coolants in nuclear reactors depends to a large extent on the determination and control of their physical and chemical properties, and particularly on the viability, speed, sensitivity, precision and accuracy (depending on the intended usage) of the methods employed in detection and analytical determination. This has led to the study and development of numerous techniques, some specially designed for the extreme conditions involved in working with the types of product in question and others adapted from existing techniques. In the specific case of polyphenyl and hydropolyphenyl mixtures, which have been the principal subjects of study to date and offer greatest promise, the analytical problems are broadly as follows: Composition of initial product or virgin coolant: composition of macro components and amounts of organic and inorganic impurities; Coolant during and after operation: determination of gases and organic compounds produced by pyrolysis and radiolysis (degradation and polymerization products); Control of systems for purifying and regenerating the coolant after use: dissolved pressurization gases; Detection of intermediate products during decomposition: these are generally very unstable (free radicals); Degree of fouling and film formation: tests to determine potential formation of films; Corrosion of structural elements and canning materials; Health and safety: toxicity, inflammability and impurities that can be activated. Although some of the above problems are closely interrelated and entail similar techniques, they vary as to degree of difficulty. Another question is the difficulty of distinguishing clearly between techniques for determining physical and physico-chemical properties, on one hand, and analytical techniques on the other. Any classification is therefore somewhat arbitrary (for example, in the case of dosimetry and techniques for determining mean molecular weights or electrical conductivity

  8. A methodological comparison of customer service analysis techniques

    Science.gov (United States)

    James Absher; Alan Graefe; Robert Burns

    2003-01-01

    Techniques used to analyze customer service data need to be studied. Two primary analysis protocols, importance-performance analysis (IP) and gap score analysis (GA), are compared in a side-by-side comparison using data from two major customer service research projects. A central concern is what, if any, conclusion might be different due solely to the analysis...
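
    To make the methodological contrast concrete: gap analysis reports performance minus importance per attribute, while importance-performance analysis places each attribute in a quadrant. The sketch below uses invented ratings and scale midpoints purely for illustration.

```python
# IP quadrants vs. gap scores over hypothetical (importance, performance) pairs.
attributes = {            # rated on a 1-5 scale; values are invented
    "clean restrooms": (4.8, 3.2),
    "signage":         (3.1, 4.0),
    "parking":         (4.2, 4.3),
}

midpoint = 3.0
for name, (imp, perf) in attributes.items():
    gap = perf - imp                                   # gap score (GA)
    quadrant = ("concentrate here" if imp > midpoint and perf <= midpoint else
                "keep up the good work" if imp > midpoint else
                "low priority" if perf <= midpoint else
                "possible overkill")
    print(f"{name:16s} gap={gap:+.1f}  IP quadrant: {quadrant}")
```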

  9. Bioinspired Security Analysis of Wireless Protocols

    DEFF Research Database (Denmark)

    Petrocchi, Marinella; Spognardi, Angelo; Santi, Paolo

    2016-01-01

    work, this paper investigates feasibility of adopting fraglets as model for specifying security protocols and analysing their properties. In particular, we give concrete sample analyses over a secure RFID protocol, showing evolution of the protocol run as chemical dynamics and simulating an adversary...

  10. Formal analysis of a fair payment protocol

    NARCIS (Netherlands)

    J.G. Cederquist; M.T. Dashti (Mohammad)

    2004-01-01

    We formally specify a payment protocol. This protocol is intended for fair exchange of time-sensitive data. Here the μCRL language is used to formalize the protocol. Fair exchange properties are expressed in the regular alternation-free μ-calculus. These properties are then verified

  11. Formal Analysis of a Fair Payment Protocol

    NARCIS (Netherlands)

    Cederquist, J.G.; Dashti, M.T.

    2004-01-01

    We formally specify a payment protocol. This protocol is intended for fair exchange of time-sensitive data. Here the μCRL language is used to formalize the protocol. Fair exchange properties are expressed in the regular alternation-free μ-calculus. These properties are then verified using the finite

  12. Formal Analysis of a Fair Payment Protocol

    NARCIS (Netherlands)

    Cederquist, J.G.; Dashti, Muhammad Torabi; Dimitrakos, Theo; Martinelli, Fabio

    We formally specify a payment protocol described by Vogt et al. This protocol is intended for fair exchange of time-sensitive data. Here the mCRL language is used to formalize the protocol. Fair exchange properties are expressed in the regular alternation-free mu-calculus. These properties are then

  13. Multivariate analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bendavid, Josh [European Organization for Nuclear Research (CERN), Geneva (Switzerland); Fisher, Wade C. [Michigan State Univ., East Lansing, MI (United States); Junk, Thomas R. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2016-01-01

    The end products of experimental data analysis are designed to be simple and easy to understand: hypothesis tests and measurements of parameters. But the experimental data themselves are voluminous and complex. Furthermore, in modern collider experiments, many petabytes of data must be processed in search of rare new processes which occur together with much more copious background processes that are of less interest to the task at hand. The systematic uncertainties on the background may be larger than the expected signal in many cases. The statistical power of an analysis and its sensitivity to systematic uncertainty can therefore usually both be improved by separating signal events from background events with higher efficiency and purity.
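
    The separation idea can be illustrated with a small classifier on synthetic data (scikit-learn assumed available); production analyses use boosted decision trees or neural networks over many physics variables, but the purity gain from cutting on the classifier score is the same in spirit.

```python
# Toy signal/background separation on synthetic three-variable events.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
background = rng.normal(loc=0.0, scale=1.0, size=(5000, 3))   # copious
signal = rng.normal(loc=1.0, scale=1.0, size=(500, 3))        # rare process
X = np.vstack([background, signal])
y = np.array([0] * len(background) + [1] * len(signal))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = GradientBoostingClassifier().fit(X_tr, y_tr)

# keeping only events with a high classifier score raises signal purity
scores = clf.predict_proba(X_te)[:, 1]
cut = scores > 0.5
print("selected:", int(cut.sum()), "signal fraction:", float(y_te[cut].mean()))
```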

  14. Novel Techniques with the Aid of a Staged CBCT Guided Surgical Protocol

    Directory of Open Access Journals (Sweden)

    Evdokia Chasioti

    2015-01-01

    Full Text Available The case report will present some novel techniques for using a “staged” protocol utilizing strategic periodontally involved teeth as transitional abutments in combination with CBCT guided implant surgery. Staging the case prevented premature loading of the grafted sites during the healing phase. A CBCT following a tenting screw guided bone regeneration procedure ensured adequate bone to place an implant fixture. Proper assessment of the CBCT allowed the surgeon to do an osteotome internal sinus lift in an optimum location. The depth of the bone needed for the osteotome sinus floor elevation was planned. The staged appliance allowed these sinus-augmented sites to heal for an extended period of time compared to implants, which were uncovered and loaded at an earlier time frame. The staged protocol and CBCT analysis enabled the immediate implants to be placed in proper alignment to the adjacent fixture. After teeth were extracted, the osseointegrated implants were converted to abutments for the transitional appliance. Finally, the staged protocol allowed for soft tissue enhancement in the implant and pontic areas prior to final insertion of the prosthesis.

  15. PERFORMANCE ANALYSIS OF DISTINCT SECURED AUTHENTICATION PROTOCOLS USED IN THE RESOURCE CONSTRAINED PLATFORM

    Directory of Open Access Journals (Sweden)

    S. Prasanna

    2014-03-01

    Full Text Available Most of the e-commerce and m-commerce applications in the current e-business world have adopted asymmetric key cryptography techniques in their authentication protocols to provide efficient authentication of the involved parties. This paper exhibits the performance analysis of distinct authentication protocols which implement public key cryptography like RSA, ECC and HECC. The comparison is made based on key generation, sign generation and sign verification processes. The results prove that the performance achieved through the HECC-based authentication protocol is better than the ECC- and RSA-based authentication protocols.
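
    The three benchmark dimensions (key generation, sign generation, sign verification) can be timed for RSA and ECC with the Python 'cryptography' package, as sketched below; HECC is not available in mainstream libraries, so it is omitted, and the key sizes are illustrative choices rather than the paper's test configuration.

```python
# Rough timing of keygen/sign/verify for RSA-2048 and ECDSA P-256.
import time
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, ec, padding

msg = b"transaction payload"

def bench(label, keygen, sign, verify):
    t0 = time.perf_counter(); key = keygen(); t1 = time.perf_counter()
    sig = sign(key);                          t2 = time.perf_counter()
    verify(key, sig);                         t3 = time.perf_counter()
    print(f"{label}: keygen {t1-t0:.4f}s  sign {t2-t1:.4f}s  verify {t3-t2:.4f}s")

bench("RSA-2048",
      lambda: rsa.generate_private_key(public_exponent=65537, key_size=2048),
      lambda k: k.sign(msg, padding.PKCS1v15(), hashes.SHA256()),
      lambda k, s: k.public_key().verify(s, msg, padding.PKCS1v15(),
                                         hashes.SHA256()))
bench("ECC-P256",
      lambda: ec.generate_private_key(ec.SECP256R1()),
      lambda k: k.sign(msg, ec.ECDSA(hashes.SHA256())),
      lambda k, s: k.public_key().verify(s, msg, ec.ECDSA(hashes.SHA256())))
```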

  16. Protocol and the post-human performativity of security techniques.

    Science.gov (United States)

    O'Grady, Nathaniel

    2016-07-01

    This article explores the deployment of exercises by the United Kingdom Fire and Rescue Service. Exercises stage, simulate and act out potential future emergencies and in so doing help the Fire and Rescue Service prepare for future emergencies. Specifically, exercises operate to assess and develop protocol; sets of guidelines which plan out the actions undertaken by the Fire and Rescue Service in responding to a fire. In the article I outline and assess the forms of knowledge and technologies, what I call the 'aesthetic forces', by which the exercise makes present and imagines future emergencies. By critically engaging with Karen Barad's notion of post-human performativity, I argue that exercises provide a site where such forces can entangle with one another; creating a bricolage through which future emergencies are evoked sensually and representatively, ultimately making it possible to experience emergencies in the present. This understanding of exercises allows also for critical appraisal of protocol both as phenomena that are produced through the enmeshing of different aesthetic forces and as devices which premise the operation of the security apparatus on contingency.

  17. Security analysis of session initiation protocol

    OpenAIRE

    Dobson, Lucas E.

    2010-01-01

    Approved for public release; distribution is unlimited The goal of this thesis is to investigate the security of the Session Initiation Protocol (SIP). This was accomplished by researching previously discovered protocol and implementation vulnerabilities, evaluating the current state of security tools and using those tools to discover new vulnerabilities in SIP software. The CVSS v2 system was used to score protocol and implementation vulnerabilities to give them a meaning that was us...

  18. Tool Supported Analysis of Web Services Protocols

    DEFF Research Database (Denmark)

    Marques, Abinoam P.; Ravn, Anders Peter; Srba, Jiri

    2011-01-01

    We describe an abstract protocol model suitable for modelling of web services and other protocols communicating via unreliable, asynchronous communication channels. The model is supported by a tool chain where the first step translates tables with state/transition protocol descriptions, often used...... e.g. in the design of web services protocols, into an intermediate XML format. We further translate this format into a network of communicating state machines directly suitable for verification in the model checking tool UPPAAL. We introduce two types of communication media abstractions in order...
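
    The first tool-chain step (turning a state/transition table into an executable communicating state machine) can be pictured as below; the table content is invented for illustration and is far simpler than a real web services protocol description.

```python
# A transition-table-driven state machine with a lossy-channel timeout edge.
transitions = {
    # (current state, message) -> next state
    ("IDLE",      "request"): "PENDING",
    ("PENDING",   "ack"):     "CONFIRMED",
    ("PENDING",   "timeout"): "IDLE",        # unreliable channel: message lost
    ("CONFIRMED", "close"):   "IDLE",
}

def run(trace, state="IDLE"):
    for msg in trace:
        if (state, msg) not in transitions:
            raise ValueError(f"protocol violation: {msg!r} in state {state}")
        state = transitions[(state, msg)]
    return state

print(run(["request", "timeout", "request", "ack", "close"]))   # -> IDLE
```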

  19. Energy neutral protocol based on hierarchical routing techniques for energy harvesting wireless sensor network

    Science.gov (United States)

    Muhammad, Umar B.; Ezugwu, Absalom E.; Ofem, Paulinus O.; Rajamäki, Jyri; Aderemi, Adewumi O.

    2017-06-01

    Recently, researchers in the field of wireless sensor networks have resorted to energy harvesting techniques that allow energy to be harvested from the ambient environment to power sensor nodes. Using such energy harvesting techniques together with proper routing protocols, an energy neutral state can be achieved so that sensor nodes can run perpetually. In this paper, we propose an Energy Neutral LEACH routing protocol, which is an extension of the traditional LEACH protocol. The goal of the proposed protocol is to use a gateway node in each cluster so as to reduce the data transmission ranges of cluster head nodes. Simulation results show that the proposed routing protocol achieves a higher throughput and ensures the energy neutral status of the entire network.
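
    The abstract does not specify how the gateway node is chosen, so the sketch below shows only the classic LEACH cluster-head election that the proposed protocol extends, using the round-based threshold of Heinzelman et al.

```python
# Baseline LEACH cluster-head election over rounds.
import random

def leach_threshold(p, round_no):
    """Threshold T(n) for a node not yet elected head in the current epoch
    of 1/p rounds: T(n) = p / (1 - p * (r mod 1/p))."""
    return p / (1 - p * (round_no % int(1 / p)))

p = 0.05                       # desired fraction of cluster heads per round
for r in range(3):             # three rounds, ten eligible nodes
    heads = [n for n in range(10) if random.random() < leach_threshold(p, r)]
    print(f"round {r}: cluster heads {heads}")
```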

  20. Soil analysis. Modern instrumental technique

    International Nuclear Information System (INIS)

    Smith, K.A.

    1993-01-01

    This book covers traditional methods of analysis and specialist monographs on individual instrumental techniques, which are usually not written with soil or plant analysis specifically in mind. The principles of the techniques are combined with discussions of sample preparation and matrix problems, and critical reviews of applications in soil science and related disciplines. Individual chapters are processed separately for inclusion in the appropriate data bases

  1. Analysis of a security protocol in μCRL

    NARCIS (Netherlands)

    J. Pang

    2002-01-01

    Needham-Schroeder public-key protocol. With the growth and commercialization of the Internet, the security of communication between computers becomes a crucial point. A variety of security protocols based on cryptographic primitives are used to establish secure communication over

  2. Surface analysis the principal techniques

    CERN Document Server

    Vickerman, John C

    2009-01-01

    This completely updated and revised second edition of Surface Analysis: The Principal Techniques deals with the characterisation and understanding of the outer layers of substrates: how they react, look and function, all of which are of interest to surface scientists. Within this comprehensive text, experts in each analysis area introduce the theory and practice of the principal techniques that have shown themselves to be effective in both basic research and in applied surface analysis. Examples of analysis are provided to facilitate the understanding of this topic and to show readers how they c

  3. Imaging techniques and investigation protocols in pediatric emergency imaging

    International Nuclear Information System (INIS)

    Scharitzer, M.; Hoermann, M.; Puig, S.; Prokop, M.

    2002-01-01

    Paediatric emergencies demand a quick and efficient radiological investigation with special attention to specific adjustments related to patient age and radiation protection. Imaging modalities are improving rapidly and make it possible to diagnose childhood diseases and injuries more quickly, accurately and safely. This article provides an overview of imaging techniques adjusted to the age of the child and an overview of imaging strategies for common paediatric emergencies. Optimising the imaging parameters (digital radiography, different screen-film systems, exposure specifications) allows for a substantial reduction of radiation dose. Spiral and multislice CT reduce scan time and enable a considerable reduction of radiation exposure if scanning parameters (pitch setting, tube current) are properly adjusted. MRI is still mainly used for neurological or spinal emergencies despite the advent of fast imaging sequences. The radiologist's task is to select an appropriate imaging strategy according to the expected differential diagnosis and to adjust the imaging techniques to the individual patient. (orig.) [de]

  4. The general protocol for the S10 technique

    Directory of Open Access Journals (Sweden)

    Mircea Constantin Șora

    2016-12-01

    Full Text Available Plastination is a process of preservation of anatomical specimens by a delicate method of forced impregnation with curable polymers like silicone, epoxy or polyester resins, with vast applications in medical fields of study. In this process, water and lipids in biological tissues are replaced by curable polymers (silicone, epoxy, polyester) which are hardened, resulting in dry, odorless and durable specimens. Today, after more than 30 years of its development, plastination is applied in more than 400 departments of anatomy, pathology, forensic sciences and biology all over the world. The standard S10 silicone technique produces flexible, resilient and opaque specimens. After fixation and dehydration, the specimens are impregnated with silicone S10, and in the end the specimens are cured. The key element in plastination is the impregnation step, and therefore the optical quality of the specimens differs depending on the polymer used. The S10 silicone technique is the most common technique used in plastination; it is used worldwide by beginners as well as experienced plastinators. S10-plastinated specimens can easily be stored at room temperature and are non-toxic and odorless. The S10 specimens can be successfully used, especially in teaching, as they are easy to handle and display a realistic topography. Plastinated specimens are also used for displaying whole bodies or body parts in exhibitions.

  5. Technical Analysis of SSP-21 Protocol

    Energy Technology Data Exchange (ETDEWEB)

    Bromberger, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-06-09

    As part of the California Energy Systems for the Twenty-First Century (CES-21) program, in December 2016 San Diego Gas and Electric (SDG&E) contracted with Lawrence Livermore National Laboratory (LLNL) to perform an independent verification and validation (IV&V) of a white paper describing their Secure SCADA Protocol for the Twenty-First Century (SSP-21) in order to analyze the effectiveness and propriety of cryptographic protocol use within the SSP-21 specification. SSP-21 is designed to use cryptographic protocols to provide (optional) encryption, authentication, and nonrepudiation, among other capabilities. The cryptographic protocols to be used reflect current industry standards; future versions of SSP-21 will use other advanced technologies to provide a subset of security services.

  6. Protocol of measurement techniques - Project colored solar collectors

    Energy Technology Data Exchange (ETDEWEB)

    Schueler, A.; Chambrier, E. De; Roecker, Ch.; Scartezzini, J.-L.

    2004-08-15

    This illustrated annual report for the Swiss Federal Office of Energy (SFOE) takes a look at work done at the Swiss Federal Institute of Technology in Lausanne, Switzerland, on multi-layer, thin-film interference coatings for solar collector glazing. The correct combinations of refractive indices and film thickness are discussed. The authors state that corresponding multi-layered thin film stacks will have to be realised experimentally in a controlled and reproducible way. New thin film materials are to be tailored to exhibit optimised optical and ageing properties. The development of these coatings is to be based on various measurement techniques, such as spectro-photometry, measurements of total power throughput by means of a solar simulator, spectroscopic ellipsometry, scanning electron microscopy (SEM), X-ray diffraction (XRD) and X-ray photoelectron spectroscopy (XPS). The paper provides many examples of typical data and explains which film properties can be inferred from each method and thus describes both the function and purpose of the different measurement techniques.

  7. Mean-Field Analysis for the Evaluation of Gossip Protocols

    NARCIS (Netherlands)

    Bakshi, Rena; Cloth, L.; Fokkink, Wan; Haverkort, Boudewijn R.H.M.

    Gossip protocols are designed to operate in very large, decentralised networks. A node in such a network bases its decision to interact (gossip) with another node on its partial view of the global system. Because of the size of these networks, analysis of gossip protocols is mostly done using

  8. Mean-field analysis for the evaluation of gossip protocols

    NARCIS (Netherlands)

    Bakhshi, Rena; Cloth, L.; Fokkink, Wan; Haverkort, Boudewijn R.H.M.

    2008-01-01

    Gossip protocols are designed to operate in very large, decentralised networks. A node in such a network bases its decision to interact (gossip) with another node on its partial view of the global system. Because of the size of these networks, analysis of gossip protocols is mostly done using

  9. A Calculus for Control Flow Analysis of Security Protocols

    DEFF Research Database (Denmark)

    Buchholtz, Mikael; Nielson, Hanne Riis; Nielson, Flemming

    2004-01-01

    The design of a process calculus for analysing security protocols is governed by three factors: how to express the security protocol in a precise and faithful manner, how to accommodate the variety of attack scenarios, and how to utilise the strengths (and limit the weaknesses) of the underlying analysis methodology. We pursue an analysis methodology based on control flow analysis in flow logic style, and we have previously shown its ability to analyse a variety of security protocols. This paper develops a calculus, LySaNS, that allows for much greater control and clarity in the description

  10. Bulk analysis using nuclear techniques

    International Nuclear Information System (INIS)

    Borsaru, M.; Holmes, R.J.; Mathew, P.J.

    1983-01-01

    Bulk analysis techniques developed for the mining industry are reviewed. Using penetrating neutron and β-radiations, measurements are obtained directly from a large volume of sample (3-30 kg). β-techniques were used to determine the grade of iron ore and to detect shale on conveyor belts. Thermal neutron irradiation was developed for the simultaneous determination of iron and aluminium in iron ore on a conveyor belt. Thermal-neutron activation analysis includes the determination of alumina in bauxite, and manganese and alumina in manganese ore. Fast neutron activation analysis is used to determine silicon in iron ores, and alumina and silica in bauxite. Fast and thermal neutron activation has been used to determine the soil in shredded sugar cane. (U.K.)

  11. Analysis of Security Protocols for Mobile Healthcare.

    Science.gov (United States)

    Wazid, Mohammad; Zeadally, Sherali; Das, Ashok Kumar; Odelu, Vanga

    2016-11-01

    Mobile Healthcare (mHealth) continues to improve because of significant improvements in and the decreasing costs of Information Communication Technologies (ICTs). mHealth is a medical and public health practice which is supported by mobile devices (for example, smartphones) and patient monitoring devices (for example, various types of wearable sensors, etc.). An mHealth system enables healthcare experts and professionals to have ubiquitous access to a patient's health data along with providing any ongoing medical treatment at any time, any place, and from any device. It also helps the patient requiring continuous medical monitoring to stay in touch with the appropriate medical staff and healthcare experts remotely. Thus, mHealth has become a major driving force in improving the health of citizens today. First, we discuss the security requirements, issues and threats to the mHealth system. We then present a taxonomy of recently proposed security protocols for the mHealth system based on features supported and possible attacks, computation cost and communication cost. Our detailed taxonomy demonstrates the strengths and weaknesses of recently proposed security protocols for the mHealth system. Finally, we identify some of the challenges in the area of security protocols for mHealth systems that still need to be addressed in the future to enable cost-effective, secure and robust mHealth systems.

  12. Analysis of protection spanning-tree protocol

    Directory of Open Access Journals (Sweden)

    Б.Я. Корнієнко

    2007-01-01

    Full Text Available The extraordinarily sweeping development of IT causes vulnerabilities and, thereafter, attacks that use these vulnerabilities. That is why the invention of new information security systems, as well as the development of existing ones, must be sped up, after the fact or even in advance. This article concerns the Spanning-Tree Protocol, a vivid example of a case in which the cure for one vulnerability creates a dozen new "weak spots".

  13. Reliability and criterion validity of an observation protocol for working technique assessments in cash register work.

    Science.gov (United States)

    Palm, Peter; Josephson, Malin; Mathiassen, Svend Erik; Kjellberg, Katarina

    2016-06-01

    We evaluated the intra- and inter-observer reliability and criterion validity of an observation protocol, developed in an iterative process involving practicing ergonomists, for assessment of working technique during cash register work for the purpose of preventing upper extremity symptoms. Two ergonomists independently assessed 17 15-min videos of cash register work on two occasions each, as a basis for examining reliability. Criterion validity was assessed by comparing these assessments with meticulous video-based analyses by researchers. Intra-observer reliability was acceptable (i.e. proportional agreement >0.7 and kappa >0.4) for 10/10 questions. Inter-observer reliability was acceptable for only 3/10 questions. An acceptable inter-observer reliability combined with an acceptable criterion validity was obtained only for one working technique aspect, 'Quality of movements'. Thus, major elements of the cashiers' working technique could not be assessed with an acceptable accuracy from short periods of observations by one observer, such as often desired by practitioners. Practitioner Summary: We examined an observation protocol for assessing working technique in cash register work. It was feasible in use, but inter-observer reliability and criterion validity were generally not acceptable when working technique aspects were assessed from short periods of work. We recommend the protocol to be used for educational purposes only.
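
    For reference, the two reliability statistics quoted above (proportional agreement > 0.7, kappa > 0.4) can be computed as in the sketch below for one yes/no protocol question; the observer ratings are invented.

```python
# Proportional agreement and Cohen's kappa for two observers, one binary item.
obs1 = ["yes", "yes", "no", "yes", "no", "no", "yes", "yes", "no", "yes"]
obs2 = ["yes", "no",  "no", "yes", "no", "yes", "yes", "yes", "no", "yes"]

agree = sum(a == b for a, b in zip(obs1, obs2)) / len(obs1)

# chance-corrected agreement: kappa = (p_o - p_e) / (1 - p_e)
p_yes1 = obs1.count("yes") / len(obs1)
p_yes2 = obs2.count("yes") / len(obs2)
p_e = p_yes1 * p_yes2 + (1 - p_yes1) * (1 - p_yes2)
kappa = (agree - p_e) / (1 - p_e)

print(f"proportional agreement {agree:.2f} (acceptable > 0.7)")
print(f"kappa {kappa:.2f} (acceptable > 0.4)")
```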

  14. Advanced Techniques of Stress Analysis

    Directory of Open Access Journals (Sweden)

    Simion TATARU

    2013-12-01

    Full Text Available This article aims to validate a stress analysis technique based on 3D models, making a comparison with the traditional technique, which utilizes a model built directly in the stress analysis program. This comparison of the two methods is made with reference to the rear fuselage of the IAR-99 aircraft, a structure with a high degree of complexity which allows a meaningful evaluation of both approaches. Three updated databases are envisaged: the database containing the idealized model obtained using ANSYS and working directly on documentation, without automatic generation of nodes and elements (with few exceptions); the rear fuselage database (produced at this stage) obtained with Pro/ENGINEER; and the one obtained by using ANSYS with the second database. Then, each of the three databases will be used according to arising necessities. The main objective is to develop the parameterized model of the rear fuselage using the computer-aided design software Pro/ENGINEER. A review of research regarding the use of virtual reality with interactive analysis performed by the finite element method is made to show the state-of-the-art achieved in this field.

  15. Techniques for Automated Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.

  16. Formal Analysis of SET and NSL Protocols Using the Interpretation Functions-Based Method

    Directory of Open Access Journals (Sweden)

    Hanane Houmani

    2012-01-01

    Full Text Available Most applications on the Internet, such as e-banking and e-commerce, use the SET and the NSL protocols to protect the communication channel between the client and the server. It is therefore crucial to ensure that these protocols respect some security properties such as confidentiality, authentication, and integrity. In this paper, we analyze the SET and the NSL protocols with respect to the confidentiality (secrecy) property. To perform this analysis, we use the interpretation functions-based method. The main idea behind the interpretation functions-based technique is to give sufficient conditions that guarantee that a cryptographic protocol respects the secrecy property. The flexibility of the proposed conditions allows the verification of daily-life protocols such as SET and NSL. Also, this method can be used under different assumptions, such as a variety of intruder abilities, including algebraic properties of cryptographic primitives. The NSL protocol, for instance, is analyzed with and without the homomorphism property. We also show, using the SET protocol, the usefulness of this approach for correcting weaknesses and problems discovered during the analysis.

  17. Agricultural Soil Spectral Response and Properties Assessment: Effects of Measurement Protocol and Data Mining Technique

    Directory of Open Access Journals (Sweden)

    Asa Gholizadeh

    2017-10-01

    Full Text Available Soil spectroscopy has been shown to be a fast, cost-effective, environmentally friendly, non-destructive, reproducible and repeatable analytical technique. Soil components, as well as types of instruments, protocols, sampling methods, sample preparation, spectral acquisition techniques and analytical algorithms, have a combined influence on the final performance. Therefore, it is important to characterize these differences and to introduce an effective approach in order to minimize the technical factors that alter reflectance spectra and consequent prediction. To quantify this alteration, a joint project between the Czech University of Life Sciences Prague (CULS) and Tel-Aviv University (TAU) was conducted to estimate Cox, pH-H2O, pH-KCl and selected forms of Fe and Mn. Two different soil spectral measurement protocols and two data mining techniques were used to examine seventy-eight soil samples from five agricultural areas in different parts of the Czech Republic. Spectral measurements at both laboratories were made using different ASD spectroradiometers. The CULS protocol was based on employing a contact probe (CP) spectral measurement scheme, while the TAU protocol was carried out using a CP measurement method accompanied by the internal soil standard (ISS) procedure. The two spectral datasets, acquired from the different protocols, were both analyzed using the partial least squares regression (PLSR) technique as well as PARACUDA II®, a new data mining engine for optimizing PLSR models. The results showed that spectra based on the CULS setup (non-ISS) demonstrated significantly higher albedo intensity and reflectance values relative to the TAU setup with ISS. However, the majority of statistics using the TAU protocol were not noticeably better than the CULS spectra. The paper also highlighted that under both measurement protocols, the PARACUDA II® engine proved to be a powerful tool for providing better results than PLSR. Such initiative is not only a way to
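
    PARACUDA II® is proprietary, but the baseline PLSR step used in both pipelines can be sketched with scikit-learn. Everything below is a synthetic stand-in (random "spectra" and a contrived soil property), meant only to show the shape of such a calibration, not the study's data:

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_score

    # Synthetic stand-ins: 78 samples x 500 spectral bands, one soil property
    rng = np.random.default_rng(0)
    X = rng.normal(size=(78, 500))                         # reflectance spectra
    y = X[:, :10].sum(axis=1) + rng.normal(0.0, 0.5, 78)   # e.g. Cox content

    # PLSR compresses the collinear bands into a few latent variables
    pls = PLSRegression(n_components=8)                    # component count to tune
    scores = cross_val_score(pls, X, y, cv=5, scoring="r2")
    print("cross-validated R^2: %.2f +/- %.2f" % (scores.mean(), scores.std()))
    ```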

  18. An optimized protocol for handling and processing fragile acini cultured with the hanging drop technique.

    Science.gov (United States)

    Snyman, Celia; Elliott, Edith

    2011-12-15

    The hanging drop three-dimensional culture technique allows cultivation of functional three-dimensional mammary constructs without exogenous extracellular matrix. The fragile acini are, however, difficult to preserve during processing steps for advanced microscopic investigation. We describe adaptations to the protocol for handling of hanging drop cultures to include investigation using confocal, scanning, and electron microscopy, with minimal loss of cell culture components. Copyright © 2011 Elsevier Inc. All rights reserved.

  19. Formal Security Analysis of the MaCAN Protocol

    DEFF Research Database (Denmark)

    Bruni, Alessandro; Sojka, Michal; Nielson, Flemming

    2014-01-01

    Our analysis identifies two flaws in the original protocol: one creates unavailability concerns during key establishment, and the other allows re-using authenticated signals for different purposes. We propose and analyse a modification that improves its behaviour while fitting the constraints of CAN bus...

  20. Emulation Platform for Cyber Analysis of Wireless Communication Network Protocols

    Energy Technology Data Exchange (ETDEWEB)

    Van Leeuwen, Brian P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eldridge, John M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-11-01

    Wireless networking and mobile communications are increasing around the world and in all sectors of our lives. With increasing use, the density and complexity of the systems increase, with more base stations and advanced protocols to enable higher data throughputs. The security of data transported over wireless networks must also evolve with the advances in technologies enabling more capable wireless networks. However, means for analyzing the effectiveness of security approaches and implementations used on wireless networks are lacking. More specifically, a capability to analyze the lower-layer protocols (i.e., Link and Physical layers) is a major challenge. An analysis approach that incorporates protocol implementations without the need for RF emissions is necessary. In this research paper, several emulation tools and custom extensions that enable an analysis platform to perform cyber security analysis of lower-layer wireless networks are presented. A use case of a published exploit in the 802.11 (i.e., WiFi) protocol family is provided to demonstrate the effectiveness of the described emulation platform.

  1. Human Schedule Performance, Protocol Analysis, and the "Silent Dog" Methodology

    Science.gov (United States)

    Cabello, Francisco; Luciano, Carmen; Gomez, Inmaculada; Barnes-Holmes, Dermot

    2004-01-01

    The purpose of the current experiment was to investigate the role of private verbal behavior on the operant performances of human adults, using a protocol analysis procedure with additional methodological controls (the "silent dog" method). Twelve subjects were exposed to fixed ratio 8 and differential reinforcement of low rate 3-s schedules. For…

  2. New protocol for construction of eyeglasses-supported provisional nasal prosthesis using CAD/CAM techniques.

    Science.gov (United States)

    Ciocca, Leonardo; Fantini, Massimiliano; De Crescenzio, Francesca; Persiani, Franco; Scotti, Roberto

    2010-01-01

    A new protocol for making an immediate provisional eyeglasses-supported nasal prosthesis is presented that uses laser scanning, computer-aided design/computer-aided manufacturing procedures, and rapid prototyping techniques, reducing time and costs while increasing the quality of the final product. With this protocol, the eyeglasses were digitized, and the relative position of the nasal prosthesis was planned and evaluated in a virtual environment without any try-in appointment. This innovative method saves time, reduces costs, and restores the patient's aesthetic appearance after a disfiguration caused by ablation of the nasal pyramid better than conventional restoration methods. Moreover, the digital model of the designed nasal epithesis can be used to develop a definitive prosthesis anchored to osseointegrated craniofacial implants.

  3. [Professional divers: analysis of critical issues and proposal of a health protocol for work fitness].

    Science.gov (United States)

    Pedata, Paola; Corvino, Anna Rita; Napolitano, Raffaele Carmine; Garzillo, Elpidio Maria; Furfaro, Ciro; Lamberti, Monica

    2016-01-20

    For many years now, thanks to the development of modern diving techniques, there has been a rapid spread of diving activities everywhere. In fact, divers are ever more numerous, both among the Armed Forces and among civilians who dive for work, such as fishing, biological research and archeology. The aim of the study was to propose a health protocol for work fitness of professional divers, keeping in mind the peculiar work activity, the existing Italian legislation that is almost out of date, and the technical and scientific evolution in this occupational field. We performed an analysis of the diseases occurring most frequently among professional divers and of the clinical investigation and imaging techniques used for the work fitness assessment of professional divers. From analysis of the health protocol recommended by D.M. 13 January 1979 (Ministerial Decree), the one most used by occupational health physicians, several critical issues emerged. Very often the clinical investigation and imaging techniques still used are almost obsolete, neglecting the execution of simple and inexpensive investigations that are more useful for work fitness assessment. Considering the out-dated legislation concerning diving disciplines, it is necessary to draw up a common health protocol that takes into account the clinical and scientific knowledge and skills acquired in this area. This protocol aims to offer a useful tool for occupational health physicians who work in this sector.

  4. IPv4 and IPv6 protocol compatibility options analysis

    Directory of Open Access Journals (Sweden)

    Regina Misevičienė

    2013-09-01

    Full Text Available The popularity of the internet has led to very rapid growth in the number of IPv4 (Internet Protocol v4) users. This caused a shortage of IP addresses, so a new version was created – IPv6 (Internet Protocol v6). Currently, two versions of IP are in use: IPv4 and IPv6. Due to large differences in addressing, the IPv4 and IPv6 protocols are incompatible. It is therefore necessary to find ways to move from IPv4 to IPv6. To facilitate the transition from one version to the other, various mechanisms and strategies have been developed. In this work, a comparative analysis of the dual stack, 6to4 tunnel and NAT64 mechanisms is carried out. It has helped to reveal the shortcomings of these mechanisms and to guide their application when selecting implementation decisions.
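
    As a minimal illustration of the dual stack idea (not the paper's test setup), the standard-library sketch below asks the resolver for both address families and connects over whichever one works first; the host name is a placeholder:

    ```python
    import socket

    def dual_stack_connect(host: str, port: int) -> socket.socket:
        """Try every address returned for the host: IPv6 (AF_INET6) and IPv4 (AF_INET)."""
        last_err = None
        for family, type_, proto, _, addr in socket.getaddrinfo(
                host, port, socket.AF_UNSPEC, socket.SOCK_STREAM):
            try:
                s = socket.socket(family, type_, proto)
                s.connect(addr)
                return s                      # first protocol version that works wins
            except OSError as err:
                last_err = err
        raise last_err

    conn = dual_stack_connect("example.com", 80)   # placeholder host
    print("connected via", "IPv6" if conn.family == socket.AF_INET6 else "IPv4")
    conn.close()
    ```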

  5. Protocol design and analysis for cooperative wireless networks

    CERN Document Server

    Song, Wei; Jin, A-Long

    2017-01-01

    This book focuses on the design and analysis of protocols for cooperative wireless networks, especially at the medium access control (MAC) layer and for crosslayer design between the MAC layer and the physical layer. It highlights two main points that are often neglected in other books: energy-efficiency and spatial random distribution of wireless devices. Effective methods in stochastic geometry for the design and analysis of wireless networks are also explored. After providing a comprehensive review of existing studies in the literature, the authors point out the challenges that are worth further investigation. Then, they introduce several novel solutions for cooperative wireless network protocols that reduce energy consumption and address spatial random distribution of wireless nodes. For each solution, the book offers a clear system model and problem formulation, details of the proposed cooperative schemes, comprehensive performance analysis, and extensive numerical and simulation results that validate th...

  6. Reliability analysis techniques in power plant design

    International Nuclear Information System (INIS)

    Chang, N.E.

    1981-01-01

    An overview of reliability analysis techniques is presented as applied to power plant design. The key terms, power plant performance, reliability, availability and maintainability are defined. Reliability modeling, methods of analysis and component reliability data are briefly reviewed. Application of reliability analysis techniques from a design engineering approach to improving power plant productivity is discussed. (author)
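
    As a minimal illustration of the availability figure such analyses revolve around (not an example taken from the paper), the sketch below combines per-component steady-state availabilities for a series system; all MTBF/MTTR values are invented:

    ```python
    # Steady-state availability from mean time between failures (MTBF) and mean
    # time to repair (MTTR); all component values here are illustrative only.
    def availability(mtbf_h: float, mttr_h: float) -> float:
        return mtbf_h / (mtbf_h + mttr_h)

    # Series system: all components must work, so availabilities multiply
    pump, valve, controller = 8000.0, 12000.0, 20000.0   # MTBF in hours
    repair = 24.0                                        # common MTTR in hours
    A = 1.0
    for mtbf in (pump, valve, controller):
        A *= availability(mtbf, repair)
    print(f"system availability: {A:.4f}")
    ```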

  7. A security analysis of the 802.11s wireless mesh network routing protocol and its secure routing protocols.

    Science.gov (United States)

    Tan, Whye Kit; Lee, Sang-Gon; Lam, Jun Huy; Yoo, Seong-Moo

    2013-09-02

    Wireless mesh networks (WMNs) can act as a scalable backbone by connecting separate sensor networks and even by connecting WMNs to a wired network. The Hybrid Wireless Mesh Protocol (HWMP) is the default routing protocol for the 802.11s WMN. The routing protocol is one of the most important parts of the network, and it requires protection, especially in the wireless environment. The existing security protocols, such as the Broadcast Integrity Protocol (BIP), Counter with cipher block chaining message authentication code protocol (CCMP), Secure Hybrid Wireless Mesh Protocol (SHWMP), Identity Based Cryptography HWMP (IBC-HWMP), Elliptic Curve Digital Signature Algorithm HWMP (ECDSA-HWMP), and Watchdog-HWMP aim to protect the HWMP frames. In this paper, we have analyzed the vulnerabilities of the HWMP and developed security requirements to protect these identified vulnerabilities. We applied the security requirements to analyze the existing secure schemes for HWMP. The results of our analysis indicate that none of these protocols is able to satisfy all of the security requirements. We also present a quantitative complexity comparison among the protocols and an example of a security scheme for HWMP to demonstrate how the result of our research can be utilized. Our research results thus provide a tool for designing secure schemes for the HWMP.

  8. A markerless protocol for genetic analysis of Aggregatibacter actinomycetemcomitans

    Science.gov (United States)

    Cheng, Ya-An; Jee, Jason; Hsu, Genie; Huang, Yanyan; Chen, Casey; Lin, Chun-Pin

    2015-01-01

    Background/Purpose The genomes of different Aggregatibacter actinomycetemcomitans strains contain many strain-specific genes and genomic islands (defined as DNA found in some but not all strains) of unknown functions. Genetic analysis of the functions of these islands will be constrained by the limited availability of genetic markers and vectors for A. actinomycetemcomitans. In this study we tested a novel genetic approach of gene deletion and restoration in a naturally competent A. actinomycetemcomitans strain, D7S-1. Methods Deletion mutants of specific genes, and mutants in which the deleted genes were restored, were constructed with a markerless loxP/Cre system. In mutants with sequential deletion of multiple genes, loxP sites with different spacer regions were used to avoid unwanted recombination between loxP sites. Results Eight single-gene deletion mutants, four multiple-gene deletion mutants, and two mutants with restored genes were constructed. No unintended non-specific deletion mutants were generated by this protocol. The protocol did not negatively affect the growth and biofilm formation of A. actinomycetemcomitans. Conclusion The protocol described in this study is efficient and specific for genetic manipulation of A. actinomycetemcomitans, and will be amenable to functional analysis of multiple genes in A. actinomycetemcomitans. PMID:24530245

  9. Timing Analysis of the FlexRay Communication Protocol

    DEFF Research Database (Denmark)

    Pop, Traian; Pop, Paul; Eles, Petru

    2006-01-01

    FlexRay will very likely become the de-facto standard for in-vehicle communications. However, before it can be successfully used for safety-critical applications that require predictability, timing analysis techniques are necessary for providing bounds for the message communication times. In this paper, we propose techniques for determining the timing properties of messages transmitted in both the static (ST) and the dynamic (DYN) segments of a FlexRay communication cycle. The analysis techniques for messages are integrated in the context of a holistic schedulability analysis that computes...
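
    For the static segment, a deliberately simplified bound (not the paper's holistic analysis) already shows what such a timing property looks like: a frame that becomes ready just after its assigned slot has started must wait almost a full communication cycle before transmitting. A sketch with invented cycle and slot lengths:

    ```python
    # Simplified worst-case response time for a static-segment message: if the
    # frame becomes ready just after its slot starts, it waits up to one full
    # cycle before transmitting. All times in ms; values are illustrative.
    def static_wcrt(cycle_ms: float, slot_len_ms: float) -> float:
        return cycle_ms + slot_len_ms        # worst-case wait + transmission

    for msg, slot_len in [("brake", 0.2), ("steer", 0.2), ("status", 0.4)]:
        print(msg, "WCRT <=", static_wcrt(cycle_ms=5.0, slot_len_ms=slot_len), "ms")
    ```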

  10. Nuclear analysis techniques and environmental sciences

    International Nuclear Information System (INIS)

    1997-10-01

    31 theses are collected in this book. It introduces molecular activation analysis, micro-PIXE and micro-probe analysis, X-ray fluorescence analysis and accelerator mass spectrometry. The applications of these nuclear analysis techniques in the environmental sciences are presented and reviewed.

  11. Analysis of archaeological pieces with nuclear techniques

    International Nuclear Information System (INIS)

    Tenorio, D.

    2002-01-01

    In this work, nuclear techniques such as Neutron Activation Analysis, PIXE, X-ray fluorescence analysis, Metallography, Uranium series and Rutherford Backscattering, for use in the analysis of archaeological specimens and materials, are described. Some published works and theses on the analysis of different Mexican and Mesoamerican archaeological sites are also cited. (Author)

  12. Rehabilitation with 4 zygomatic implants with a new surgical protocol using ultrasonic technique.

    Science.gov (United States)

    Mozzati, Marco; Mortellaro, Carmen; Arata, Valentina; Gallesio, Giorgia; Previgliano, Valter

    2015-05-01

    When the residual bone crest does not allow the placement of standard implants, treatment for complete arch rehabilitation of severely atrophic maxillae can be performed with 4 zygomatic implants (ZIs) and immediate function, with predictable results in terms of aesthetics, function, and comfort for the patient. However, even though ZI rehabilitations have shown a good success rate, this surgery is difficult and needs a skilled operator. Complications in this kind of rehabilitation are not uncommon; the main difficulties relate to the reduced surgical visibility and instrument control in a critical anatomic area. All the surgical protocols described in the literature used drilling techniques. Furthermore, the use of ultrasonic instruments in implant surgery, compared with drilling instruments, has shown advantages in many aspects of surgical procedures, tissue management, enhancement of control, surgical visualization, and healing. The aim of this study was to report on preliminary experience using the ultrasound technique for ZI surgery in terms of safety and technical improvement. Ten consecutive patients with severely atrophic maxillae were treated with 4 ZIs and immediate complete arch acrylic resin provisional prostheses. The patients were followed up for 30 to 32 months, evaluating implant success, prosthetic success, and patient satisfaction with a questionnaire. No implants were lost during the study period, giving a 100% implant and prosthetic success rate. Within the limitations of this preliminary study, these data indicate that ultrasonic implant site preparation for ZIs can be a good alternative to the drilling technique and an improvement for the surgeon.

  13. Chemical analysis by nuclear techniques

    International Nuclear Information System (INIS)

    Sohn, S. C.; Kim, W. H.; Park, Y. J.; Song, B. C.; Jeon, Y. S.; Jee, K. Y.; Pyo, H. Y.

    2002-01-01

    This state-of-the-art report consists of four parts: production of micro-particles, analysis of boron, the alpha tracking method, and development of a neutron induced prompt gamma ray spectroscopy (NIPS) system. The various methods for the production of micro-particles, such as the mechanical, electrolysis, chemical and spray methods, are described in the first part. The second part covers sample treatment, separation and concentration, analytical methods, and applications of boron analysis. The third part covers the characteristics of alpha tracks, track detectors, pretreatment of samples, neutron irradiation, etching conditions for various detectors, observation of tracks on the detector, etc. The last part covers basic theory, the neutron source, collimator, neutron shields, calibration of the NIPS system, and its applications.

  15. Implementation and Analysis of Real-Time Streaming Protocols.

    Science.gov (United States)

    Santos-González, Iván; Rivero-García, Alexandra; Molina-Gil, Jezabel; Caballero-Gil, Pino

    2017-04-12

    Communication media have become the primary way of interaction thanks to the discovery and innovation of many new technologies. One of the most widely used communication systems today is video streaming, which is constantly evolving. Such communications are a good alternative to face-to-face meetings, and are therefore very useful for coping with many problems caused by distance. However, they suffer from different issues such as bandwidth limitation, network congestion, energy efficiency, cost, reliability and connectivity. Hence, the quality of service and the quality of experience are considered the two most important issues for this type of communication. This work presents a complete comparative study of two of the most used video streaming protocols, the Real Time Streaming Protocol (RTSP) and Web Real-Time Communication (WebRTC). In addition, this paper proposes two new mobile applications that implement those protocols in Android, with the objective of determining how they are influenced by the two aspects that most affect streaming quality of service: the connection establishment time and the stream reception time. The new video streaming applications are also compared with the most popular video streaming applications for Android, and the experimental results of the analysis show that the developed WebRTC implementation improves on the performance of the most popular video streaming applications with respect to stream packet delay.
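
    The first of the quoted metrics, connection establishment time, can be measured generically with a monotonic clock. The sketch below times a plain TCP handshake (the transport underneath RTSP); the host name is a placeholder and this is not the code of the paper's Android applications:

    ```python
    import socket
    import time

    def connect_time_ms(host: str, port: int, tries: int = 5) -> float:
        """Average TCP connection-establishment time, as a proxy for session setup cost."""
        total = 0.0
        for _ in range(tries):
            t0 = time.perf_counter()
            with socket.create_connection((host, port), timeout=5):
                total += time.perf_counter() - t0
        return 1000 * total / tries

    # 554 is the conventional RTSP port; replace the host with a reachable server.
    print(f"{connect_time_ms('media.example.com', 554):.1f} ms")
    ```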

  16. iLAP: a workflow-driven software for experimental protocol development, data acquisition and analysis

    Directory of Open Access Journals (Sweden)

    McNally James

    2009-01-01

    Full Text Available Abstract Background In recent years, the genome biology community has expended considerable effort to confront the challenges of managing heterogeneous data in a structured and organized way, and has developed laboratory information management systems (LIMS) for both raw and processed data. On the other hand, electronic notebooks were developed to record and manage scientific data, and to facilitate data-sharing. Software which enables both management of large datasets and digital recording of laboratory procedures would serve a real need in laboratories using medium and high-throughput techniques. Results We have developed iLAP (Laboratory data management, Analysis, and Protocol development), a workflow-driven information management system specifically designed to create and manage experimental protocols, and to analyze and share laboratory data. The system combines experimental protocol development, wizard-based data acquisition, and high-throughput data analysis into a single, integrated system. We demonstrate the power and the flexibility of the platform using a microscopy case study based on a combinatorial multiple fluorescence in situ hybridization (m-FISH) protocol and 3D-image reconstruction. iLAP is freely available under the open source license AGPL from http://genome.tugraz.at/iLAP/. Conclusion iLAP is a flexible and versatile information management system, which has the potential to close the gap between electronic notebooks and LIMS and can therefore be of great value for a broad scientific community.

  17. Analysis of Intracellular Metabolites from Microorganisms: Quenching and Extraction Protocols.

    Science.gov (United States)

    Pinu, Farhana R; Villas-Boas, Silas G; Aggio, Raphael

    2017-10-23

    Sample preparation is one of the most important steps in metabolome analysis. The challenges of determining the microbial metabolome have been well discussed within the research community and many improvements have already been achieved in the last decade. The analysis of intracellular metabolites is particularly challenging. Environmental perturbations may considerably affect microbial metabolism, which results in intracellular metabolites being rapidly degraded or metabolized by enzymatic reactions. Therefore, quenching, i.e. the complete stop of cell metabolism, is a pre-requisite for accurate intracellular metabolite analysis. After quenching, metabolites need to be extracted from the intracellular compartment. The choice of the most suitable metabolite extraction method/s is another crucial step. The literature indicates that specific classes of metabolites are better extracted by different extraction protocols. In this review, we discuss the technical aspects and advancements of quenching and extraction in intracellular metabolite analysis from microbial cells.

  19. Reliability of diagnostic imaging techniques in suspected acute appendicitis: proposed diagnostic protocol

    International Nuclear Information System (INIS)

    Cura del, J. L.; Oleaga, L.; Grande, D.; Vela, A. C.; Ibanez, A. M.

    2001-01-01

    To study the utility of ultrasound and computed tomography (CT) in cases of suspected appendicitis; to determine the diagnostic yield in different clinical contexts and patient characteristics; and to assess the costs and benefits of introducing these techniques and propose a protocol for their use. Negative appendectomies, complications and length of hospital stay in a group of 152 patients with suspected appendicitis who underwent ultrasound and CT were compared with those of 180 patients who underwent appendectomy during the same time period but had not been selected for the first group; the costs for each group were calculated. In the first group, the diagnostic value of the clinical signs was also evaluated. The reliability of the clinical signs was limited, while the results with ultrasound and CT were excellent. The incidence of negative appendectomy was 9.6% in the study group and 12.2% in the control group. Moreover, there were fewer complications and a shorter hospital stay in the first group. Among men, however, the rate of negative appendectomy was lower in the control group. The cost of using ultrasound and CT in the management of appendicitis was only slightly higher than that of the control group. Although ultrasound and CT are not necessary in cases in which the probability of appendicitis is low or in men presenting clear clinical evidence, the use of these techniques is indicated in the remaining cases in which appendicitis is suspected. In children, ultrasound is the technique of choice. In all other patients, if negative results are obtained with one of the two techniques, the other should be performed. (Author) 49 refs

  20. MRI technique for the preoperative evaluation of deep infiltrating endometriosis: current status and protocol recommendation

    International Nuclear Information System (INIS)

    Schneider, C.; Oehmke, F.; Tinneberg, H.-R.; Krombach, G.A.

    2016-01-01

    Endometriosis is a common cause of chronic pelvic pain and infertility. It is defined as the occurrence of endometrial tissue outside the uterine cavity and can manifest as a peritoneal, ovarian or infiltrating form, the latter being referred to as deep infiltrating endometriosis (DIE). Surgery is essential in the treatment of DIE and, depending on the severity of the disease, surgery can be difficult and extensive. Besides clinical examination and ultrasound, magnetic resonance imaging (MRI) has proven its value in providing useful information for planning surgery in patients with suspected DIE. To optimise the quality of MRI examinations, radiologists have to be familiar with the capabilities and also the limitations of this technique with respect to the assessment of DIE. MRI yields morphological information by using mainly T1- and T2-weighted sequences, but can also provide functional information by means of intravenous gadolinium, diffusion-weighted imaging or cine-MRI. In this article, these techniques and also adequate measures of patient preparation, which are indispensable for successful MRI imaging for the preoperative evaluation of DIE, are reviewed and a comprehensive protocol recommendation is provided.

  1. Event tree analysis using artificial intelligence techniques

    International Nuclear Information System (INIS)

    Dixon, B.W.; Hinton, M.F.

    1985-01-01

    Artificial Intelligence (AI) techniques used in Expert Systems and Object Oriented Programming are discussed as they apply to Event Tree Analysis. A SeQUence IMPortance calculator, SQUIMP, is presented to demonstrate the implementation of these techniques. Benefits of using AI methods include ease of programming, efficiency of execution, and flexibility of application. The importance of an appropriate user interface is stressed. 5 figs

  2. TV content analysis techniques and applications

    CERN Document Server

    Kompatsiaris, Yiannis

    2012-01-01

    The rapid advancement of digital multimedia technologies has not only revolutionized the production and distribution of audiovisual content, but also created the need to efficiently analyze TV programs to enable applications for content managers and consumers. Leaving no stone unturned, TV Content Analysis: Techniques and Applications provides a detailed exploration of TV program analysis techniques. Leading researchers and academics from around the world supply scientifically sound treatment of recent developments across the related subject areas--including systems, architectures, algorithms,

  3. Statistical evaluation of vibration analysis techniques

    Science.gov (United States)

    Milner, G. Martin; Miller, Patrice S.

    1987-01-01

    An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.

  4. Constrained principal component analysis and related techniques

    CERN Document Server

    Takane, Yoshio

    2013-01-01

    In multivariate data analysis, regression techniques predict one set of variables from another while principal component analysis (PCA) finds a subspace of minimal dimensionality that captures the largest variability in the data. How can regression analysis and PCA be combined in a beneficial way? Why and when is it a good idea to combine them? What kind of benefits are we getting from them? Addressing these questions, Constrained Principal Component Analysis and Related Techniques shows how constrained PCA (CPCA) offers a unified framework for these approaches. The book begins with four concre
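
    A minimal numpy sketch of the core CPCA idea in its usual formulation (external analysis: split the data into a part explained by known external variables and a residual; internal analysis: PCA of each part). The data are random placeholders:

    ```python
    import numpy as np

    # Hypothetical data: Z (n samples x p variables), G (n x q external variables)
    rng = np.random.default_rng(0)
    Z = rng.normal(size=(100, 6))
    G = rng.normal(size=(100, 2))

    # External analysis: project Z onto the column space of G
    P = G @ np.linalg.pinv(G)        # projector onto span(G)
    Z_hat = P @ Z                    # part of Z explained by G
    Z_res = Z - Z_hat                # residual part

    # Internal analysis: PCA (via SVD) of each part separately
    for name, part in [("explained by G", Z_hat), ("residual", Z_res)]:
        U, s, Vt = np.linalg.svd(part - part.mean(axis=0), full_matrices=False)
        var = s**2 / np.sum(s**2)
        print(f"{name}: leading component explains {100 * var[0]:.1f}% of its variance")
    ```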

  5. Symbolic Model Checking and Analysis for E-Commerce Protocol

    Institute of Scientific and Technical Information of China (English)

    WEN Jing-Hua; ZHANG Mei; LI Xiang

    2005-01-01

    A new approach is proposed for analyzing the non-repudiation and fairness of e-commerce protocols. The authentication e-mail protocol CMP1 is modeled as a finite state machine and analyzed in two vital aspects, non-repudiation and fairness, using SMV. As a result, the CMP1 protocol is found not to be fair, and we have improved it. This result shows that it is effective to analyze and check the new features of e-commerce protocols using the SMV model checker

  6. Elemental analysis techniques using proton microbeam

    International Nuclear Information System (INIS)

    Sakai, Takuro; Oikawa, Masakazu; Sato, Takahiro

    2005-01-01

    Proton microbeam is a powerful tool for two-dimensional elemental analysis. The analysis is based on the Particle Induced X-ray Emission (PIXE) and Particle Induced Gamma-ray Emission (PIGE) techniques. The paper outlines the principles and instruments, and describes the dental applications that have been carried out at JAERI Takasaki. (author)

  7. The comparative cost analysis of EAP Re-authentication Protocol and EAP TLS Protocol

    OpenAIRE

    Seema Mehla; Bhawna Gupta

    2010-01-01

    The Extensible Authentication Protocol (EAP) is a generic framework supporting multiple types of authentication methods. In systems where EAP is used for authentication, it is desirable not to repeat the entire EAP exchange with another authenticator. The EAP Re-authentication Protocol provides consistent, method-independent and low-latency re-authentication. It is an extension to the current EAP mechanism to support intra-domain handoff authentication. This paper analyzes the performance of the EAP r...

  8. Techniques for sensitivity analysis of SYVAC results

    International Nuclear Information System (INIS)

    Prust, J.O.

    1985-05-01

    Sensitivity analysis techniques may be required to examine the sensitivity of SYVAC model predictions to the input parameter values, to the subjective probability distributions assigned to the input parameters, and to the relationship between dose and the probability of fatal cancers plus serious hereditary disease in the first two generations of offspring of a member of the critical group. This report mainly considers techniques for determining the sensitivity of dose and risk to the variable input parameters. The performance of a sensitivity analysis technique may be improved by decomposing the model and data into subsets for analysis, making use of existing information on sensitivity, and concentrating sampling in regions of the parameter space that generate high doses or risks. A number of sensitivity analysis techniques are reviewed for their application to the SYVAC model, including four techniques tested in an earlier study by CAP Scientific for the SYVAC project. This report recommends the development now of a method for evaluating the derivative of dose with respect to parameter value, and the extension of the Kruskal-Wallis technique to test for interactions between parameters. It is also recommended that the sensitivity of the output of each sub-model of SYVAC to input parameter values should be examined. (author)
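
    The Kruskal-Wallis idea from the recommendations can be sketched with SciPy: bin one sampled input parameter and test whether the dose values drawn in each bin differ in distribution. The parameter, the dose model and the bin count below are all synthetic:

    ```python
    import numpy as np
    from scipy.stats import kruskal

    rng = np.random.default_rng(0)
    param = rng.uniform(0, 1, 1000)                     # sampled input parameter
    dose = np.exp(3 * param) + rng.normal(0, 1, 1000)   # synthetic model output

    # Split doses into groups by parameter tercile and test for a difference
    bins = np.quantile(param, [1 / 3, 2 / 3])
    groups = [dose[param <= bins[0]],
              dose[(param > bins[0]) & (param <= bins[1])],
              dose[param > bins[1]]]
    H, p = kruskal(*groups)
    print(f"H = {H:.1f}, p = {p:.3g}  (small p: dose is sensitive to this parameter)")
    ```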

  9. Flow analysis techniques for phosphorus: an overview.

    Science.gov (United States)

    Estela, José Manuel; Cerdà, Víctor

    2005-04-15

    A bibliographical review of the implementation and the results obtained in the use of different flow analytical techniques for the determination of phosphorus is carried out. The sources, occurrence and importance of phosphorus, together with several aspects regarding the analysis and terminology used in the determination of this element, are briefly described. A classification as well as a brief description of the basis, advantages and disadvantages of the different existing flow techniques, namely segmented flow analysis (SFA), flow injection analysis (FIA), sequential injection analysis (SIA), all injection analysis (AIA), batch injection analysis (BIA), multicommutated FIA (MCFIA), multisyringe FIA (MSFIA) and multipumped FIA (MPFIA), is also given. The most relevant manuscripts regarding the analysis of phosphorus by means of flow techniques are herein classified according to the instrumental detection technique used, with the aim of facilitating their study and providing an overall view. Finally, the analytical characteristics of numerous flow methods reported in the literature are provided in the form of a table, and their applicability to samples with different matrixes, namely water samples (marine, river, estuarine, waste, industrial, drinking, etc.), soil leachates, plant leaves, toothpaste, detergents, foodstuffs (wine, orange juice, milk), biological samples, sugars, fertilizer, hydroponic solutions, soil extracts and cyanobacterial biofilms, is tabulated.

  10. Quality assurance techniques for activation analysis

    International Nuclear Information System (INIS)

    Becker, D.A.

    1984-01-01

    The principles and techniques of quality assurance are applied to the measurement method of activation analysis. Quality assurance is defined to include quality control and quality assessment. Plans for quality assurance include consideration of: personnel; facilities; analytical design; sampling and sample preparation; the measurement process; standards; and documentation. Activation analysis concerns include: irradiation; chemical separation; counting/detection; data collection, and analysis; and calibration. Types of standards discussed include calibration materials and quality assessment materials

  11. A numerical technique for reactor subchannel analysis

    International Nuclear Information System (INIS)

    Fath, Hassan E.S.

    1983-01-01

    A numerical technique is developed for the solution of the transient boundary layer equations with a moving liquid-vapour interface boundary. The technique uses the finite difference method with the velocity components defined over an Eulerian mesh. A system of interface massless markers is defined where the markers move with the flow field according to a simple kinematic relation between the interface geometry and the fluid velocity. Different applications of nuclear engineering interest are reported with some available results. The present technique is capable of predicting the interface profile near the wall which is important in the reactor subchannel analysis
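
    The kinematic marker scheme described above can be sketched in a few lines: the markers carry no mass and are simply advected by the local fluid velocity, dx/dt = u. The velocity field and step sizes below are illustrative placeholders, not the report's boundary-layer solver:

    ```python
    import numpy as np

    def velocity(x, y):
        """Illustrative stand-in flow field (a single vortex)."""
        return -y, x

    def advect_markers(markers, dt, steps):
        """Forward-Euler advection of massless interface markers: dx/dt = u."""
        markers = np.asarray(markers, dtype=float)
        for _ in range(steps):
            u, v = velocity(markers[:, 0], markers[:, 1])
            markers[:, 0] += dt * u      # markers simply follow the flow
            markers[:, 1] += dt * v
        return markers

    # Markers initially placed along a straight interface segment
    initial = np.column_stack([np.linspace(0.5, 1.0, 11), np.zeros(11)])
    print(advect_markers(initial, dt=1e-3, steps=1000))
    ```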

  12. Authentication Test-Based the RFID Authentication Protocol with Security Analysis

    Directory of Open Access Journals (Sweden)

    Minghui Wang

    2014-08-01

    Full Text Available Many recently proposed RFID authentication protocols have soon been found to contain security holes. We analyzed the main reason, which is that protocol design is often not rigorous, so the correctness of the protocol cannot be guaranteed. To this end, the authentication test method was adopted in this paper for the formal analysis and strict proof of the proposed RFID protocol. The authentication test is a new type of analysis and design method for security protocols based on the strand space model, and it can be used for most types of security protocols. The security analysis shows that the proposed protocol can meet the RFID security demands: information confidentiality, data integrity and identity authentication.

  13. Comparative Analysis of Different Protocols to Manage Large Scale Networks

    OpenAIRE

    Anil Rao Pimplapure; Dr Jayant Dubey; Prashant Sen

    2013-01-01

    In recent years, the number, complexity and size of large-scale networks have increased. The best example of a large-scale network is the Internet; more recent examples are data centers in cloud environments. Managing such networks involves several tasks, such as traffic monitoring, security and performance optimization, which together constitute a major burden for the network administrator. This research report studies different protocols, i.e. conventional protocols like the Simple Network Management Protocol and the newer Gossip bas...

  14. Performance analysis of signaling protocols on OBS switches

    Science.gov (United States)

    Kirci, Pinar; Zaim, A. Halim

    2005-10-01

    In this paper, the Just-In-Time (JIT), Just-Enough-Time (JET) and Horizon signalling schemes for Optical Burst Switched (OBS) networks are presented. These signalling schemes run over a core dWDM network, and a network architecture based on Optical Burst Switches (OBS) is proposed to support IP, ATM and burst traffic. In IP and ATM traffic, several packets are assembled into a single unit called a burst, and burst contention is handled by burst dropping. The burst length distribution in IP traffic is arbitrary between 0 and 1, and is fixed in ATM traffic at 0.5. Burst traffic, on the other hand, is arbitrary between 1 and 5. The Setup and Setup ack length distributions are arbitrary. We apply the Poisson model with rate λ and the self-similar model with Pareto-distributed inter-arrival times (shape α) to model arrivals in these protocols. We consider communication between a source client node and a destination client node over an ingress switch and one or more intermediate switches. We use buffering only in the ingress node. The communication is based on single-burst connections, in which the connection is set up just before sending a burst and then closed as soon as the burst is sent. Our analysis accounts for several important parameters, including the burst setup, burst setup ack, keepalive messages and the optical switching protocol. We compare the performance of the three signalling schemes in terms of burst dropping probability under a range of network scenarios.
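
    The contrast between the two arrival models can be made concrete with a few lines of numpy; the λ and α values below are illustrative only. The heavy tail of the Pareto gaps is what makes self-similar traffic harder on the switches:

    ```python
    import numpy as np

    # Sketch of the two traffic models named above; lam (Poisson rate) and
    # alpha (Pareto shape) are illustrative values, not taken from the paper.
    rng = np.random.default_rng(1)
    lam, alpha, n = 2.0, 1.5, 100_000

    poisson_gaps = rng.exponential(1.0 / lam, size=n)  # Poisson arrivals: exponential gaps
    pareto_gaps = rng.pareto(alpha, size=n) + 1.0      # classical Pareto gaps (xm = 1)

    for name, gaps in [("Poisson", poisson_gaps), ("Pareto", pareto_gaps)]:
        print(f"{name}: mean gap {gaps.mean():.3f}, "
              f"99.9th pct {np.quantile(gaps, 0.999):.1f}")
    ```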

  15. Analysis of limiting information characteristics of quantum-cryptography protocols

    International Nuclear Information System (INIS)

    Sych, D V; Grishanin, Boris A; Zadkov, Viktor N

    2005-01-01

    The problem of increasing the critical error rate of quantum-cryptography protocols by varying a set of letters in a quantum alphabet for space of a fixed dimensionality is studied. Quantum alphabets forming regular polyhedra on the Bloch sphere and the continual alphabet equally including all the quantum states are considered. It is shown that, in the absence of basis reconciliation, a protocol with the tetrahedral alphabet has the highest critical error rate among the protocols considered, while after the basis reconciliation, a protocol with the continual alphabet possesses the highest critical error rate. (quantum optics and quantum computation)

  16. Gold analysis by the gamma absorption technique

    International Nuclear Information System (INIS)

    Kurtoglu, Arzu; Tugrul, A.B.

    2003-01-01

    Gold (Au) analyses are generally performed using destructive techniques. In this study, the Gamma Absorption Technique has been employed for gold analysis. A series of different gold alloys of known gold content were analysed and a calibration curve was obtained. This curve was then used for the analysis of unknown samples. Gold analyses can be made non-destructively, easily and quickly by the gamma absorption technique. The mass attenuation coefficients of the alloys were measured around the K-shell absorption edge of Au. Theoretical mass attenuation coefficient values were obtained using the WinXCom program and comparison of the experimental results with the theoretical values showed generally good and acceptable agreement
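
    The measurement principle is ordinary Beer-Lambert attenuation. The sketch below, with invented count rates and a nominal alloy density, shows how an experimental mass attenuation coefficient would be extracted for comparison with WinXCom-style theoretical values; a calibration curve of this coefficient against known Au fractions then lets the content of an unknown sample be read off:

    ```python
    import numpy as np

    # Sketch of the principle, not the authors' procedure: a narrow gamma beam
    # of incident intensity I0 passes through a sample of thickness t (cm) and
    # density rho (g/cm^3); the transmitted intensity I follows
    # I = I0 * exp(-(mu/rho) * rho * t), which is solved for mu/rho below.
    def mass_attenuation(I0, I, rho, t):
        return np.log(I0 / I) / (rho * t)   # cm^2/g

    # Illustrative numbers only: counts with and without a 0.1 cm alloy sample
    print(mass_attenuation(I0=50_000, I=8_200, rho=17.5, t=0.1))
    ```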

  17. The surface elevation table and marker horizon technique: A protocol for monitoring wetland elevation dynamics

    Science.gov (United States)

    James C. Lynch,; Phillippe Hensel,; Cahoon, Donald R.

    2015-01-01

    The National Park Service, in response to the growing evidence and awareness of the effects of climate change on federal lands, determined that monitoring wetland elevation change is a top priority in North Atlantic Coastal parks (Stevens et al, 2010). As a result, the NPS Northeast Coastal and Barrier Network (NCBN) in collaboration with colleagues from the U.S. Geological Survey (USGS) and The National Oceanic and Atmospheric Administration (NOAA) have developed a protocol for monitoring wetland elevation change and other processes important for determining the viability of wetland communities. Although focused on North Atlantic Coastal parks, this document is applicable to all coastal and inland wetland regions. Wetlands exist within a narrow range of elevation which is influenced by local hydrologic conditions. For coastal wetlands in particular, local hydrologic conditions may be changing as sea levels continue to rise. As sea level rises, coastal wetland systems may respond by building elevation to maintain favorable hydrologic conditions for their survival. This protocol provides the reader with instructions and guidelines on designing a monitoring plan or study to: A) Quantify elevation change in wetlands with the Surface Elevation Table (SET). B) Understand the processes that influence elevation change, including vertical accretion (SET and Marker Horizon methods). C) Survey the wetland surface and SET mark to a common reference datum to allow for comparing sample stations to each other and to local tidal datums. D) Survey the SET mark to monitor its relative stability. This document is divided into two parts; the main body that presents an overview of all aspects of monitoring wetland elevation dynamics, and a collection of Standard Operating Procedures (SOP) that describes in detail how to perform or execute each step of the methodology. Detailed instruction on the installation, data collection, data management and analysis are provided in this report

  18. Sensitivity analysis of hybrid thermoelastic techniques

    Science.gov (United States)

    W.A. Samad; J.M. Considine

    2017-01-01

    Stress functions have been used as a complementary tool to support experimental techniques, such as thermoelastic stress analysis (TSA) and digital image correlation (DIC), in an effort to evaluate the complete and separate full-field stresses of loaded structures. The need for such coupling between experimental data and stress functions is due to the fact that...

  19. Communicating systems with UML 2 modeling and analysis of network protocols

    CERN Document Server

    Barrera, David Garduno

    2013-01-01

    This book gives a practical approach to modeling and analyzing communication protocols using UML 2. Network protocols are usually presented from a point of view focusing on partial mechanisms and starting models. This book aims at giving the basis needed for anybody to model and validate their own protocols. It follows a practical approach and gives many examples for the description and analysis of well-known basic network mechanisms for protocols. The book firstly shows how to describe and validate the main protocol issues (such as synchronization problems, client-server interactions, layer

  20. Effect of joint mobilization techniques for primary total knee arthroplasty: Study protocol for a randomized controlled trial.

    Science.gov (United States)

    Xu, Jiao; Zhang, Juan; Wang, Xue-Qiang; Wang, Xuan-Lin; Wu, Ya; Chen, Chan-Cheng; Zhang, Han-Yu; Zhang, Zhi-Wan; Fan, Kai-Yi; Zhu, Qiang; Deng, Zhi-Wei

    2017-12-01

    Total knee arthroplasty (TKA) has become the procedure most preferred by patients for the relief of pain caused by knee osteoarthritis. TKA patients aim for a speedy recovery after the surgery. Joint mobilization techniques for rehabilitation have been widely used to relieve pain and improve joint mobility. However, relevant randomized controlled trials showing the curative effect of these techniques remain lacking to date. Accordingly, this study aims to investigate whether joint mobilization techniques are effective for primary TKA. We will conduct a single-blind, prospective, randomized, controlled trial of 120 patients with unilateral TKA. Patients will be randomized into an intervention group, a physical modality therapy group, and a usual care group. The intervention group will undergo joint mobilization manipulation treatment once a day and regular training twice a day for a month. The physical modality therapy group will undergo physical therapy once a day and regular training twice a day for a month. The usual care group will perform regular training twice a day for a month. Primary outcome measures will be based on the visual analog scale, the knee joint Hospital for Special Surgery score, range of motion, surrounded degree, and adverse effects. Secondary indicators will include manual muscle testing, the 36-Item Short Form Health Survey, Berg Balance Scale function evaluation, the Pittsburgh Sleep Quality Index, proprioception, and muscle morphology. We will conduct intention-to-treat analysis if a subject withdraws from the trial. The important features of this trial for joint mobilization techniques in primary TKA are the randomization procedures, single blinding, large sample size, and standardized protocol. This study aims to investigate whether joint mobilization techniques are effective for early TKA patients. The result of this study may serve as a guide for TKA patients, medical personnel, and healthcare decision makers. It has been registered at http

  1. Microextraction sample preparation techniques in biomedical analysis.

    Science.gov (United States)

    Szultka, Malgorzata; Pomastowski, Pawel; Railean-Plugaru, Viorica; Buszewski, Boguslaw

    2014-11-01

    Biologically active compounds are found in biological samples at relatively low concentration levels. The sample preparation of target compounds from biological, pharmaceutical, environmental, and food matrices is one of the most time-consuming steps in the analytical procedure. Here, microextraction techniques are dominant. Metabolomic studies also require the application of a proper analytical technique for the determination of endogenous metabolites present in biological matrices at trace concentration levels. Due to the reproducibility of data, precision, relatively low cost of the appropriate analysis, simplicity of the determination, and the possibility of directly combining those techniques with other methods (both on-line and off-line), they have become the most widespread in routine determinations. Additionally, sample pretreatment procedures have to be more selective, cheap, quick, and environmentally friendly. This review summarizes the current achievements and applications of microextraction techniques. The main aim is to deal with the utilization of different types of sorbents for microextraction and emphasize the use of newly synthesized sorbents, as well as to bring together studies concerning a systematic approach to method development. This review is dedicated to the description of microextraction techniques and their application in biomedical analysis. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. CRDM motion analysis using machine learning technique

    International Nuclear Information System (INIS)

    Nishimura, Takuya; Nakayama, Hiroyuki; Saitoh, Mayumi; Yaguchi, Seiji

    2017-01-01

    The magnetic jack type Control Rod Drive Mechanism (CRDM) for pressurized water reactor (PWR) plants operates control rods in response to electrical signals from the reactor control system. CRDM operability is evaluated by quantifying the armature's closed/opened response times, i.e. the intervals between the coil energizing/de-energizing points and the armature closed/opened points. MHI has already developed an automatic CRDM motion analysis and applied it to actual plants. However, CRDM operational data show wide variation depending on characteristics such as plant condition, plant, etc. In the existing motion analysis, there is an issue of analysis accuracy when a single analysis technique is applied to all plant conditions, plants, etc. In this study, MHI investigated motion analysis using machine learning (Random Forests), which flexibly accommodates CRDM operational data with wide variation and improves analysis accuracy. (author)
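
    As a rough sketch of how Random Forests could be applied to such data (not MHI's implementation), the example below fits a regressor that predicts an armature response time from waveform and plant-condition features; every feature name and number is a synthetic stand-in:

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split

    # Hypothetical features of CRDM operational records (all values synthetic)
    rng = np.random.default_rng(0)
    n = 2000
    X = np.column_stack([
        rng.normal(5.0, 0.5, n),    # e.g. coil current amplitude
        rng.normal(60.0, 5.0, n),   # e.g. coil temperature proxy
        rng.integers(0, 3, n),      # e.g. encoded plant condition
    ])
    # Synthetic armature closed time (ms) with noise
    y = 30 + 2.0 * X[:, 0] + 0.1 * X[:, 1] + rng.normal(0, 1, n)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    print("R^2 on held-out data:", round(model.score(X_te, y_te), 3))
    ```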

  3. Performance Analysis of TDMA Protocol in a Femtocell Network

    Directory of Open Access Journals (Sweden)

    Wanod Kumar

    2014-07-01

    Full Text Available In this paper, we evaluate the performance of the TDMA (Time Division Multiple Access) protocol using queuing theory in a femtocell network. Fair use of the wireless channel among the users of the network is achieved using the TDMA protocol. The arrivals of data packets from M communicating nodes form a combined Poisson process. The time slots of the TDMA protocol represent c servers that carry data packets coming from the communicating nodes to the input of the FAP (Femtocell Access Point). The service time of each server (time slot) is exponentially distributed. This complete communication scenario using the TDMA protocol is modeled as an M/M/c queue. The performance of the protocol is evaluated in terms of mean number in system, average system delay and utilization for varying traffic intensity
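
    The three quoted metrics follow from standard M/M/c formulas (Erlang C plus Little's law); the sketch below is a textbook computation with illustrative parameters, not the authors' code:

    ```python
    import math

    # lam = total packet arrival rate, mu = service rate per TDMA slot,
    # c = number of slots acting as servers. Values below are illustrative.
    def mmc_metrics(lam, mu, c):
        rho = lam / (c * mu)                      # utilization, must be < 1
        a = lam / mu                              # offered load in Erlangs
        p0 = 1.0 / (sum(a**k / math.factorial(k) for k in range(c))
                    + a**c / (math.factorial(c) * (1 - rho)))
        erlang_c = (a**c / (math.factorial(c) * (1 - rho))) * p0   # P(wait)
        Lq = erlang_c * rho / (1 - rho)           # mean queue length
        L = Lq + a                                # mean number in system
        W = L / lam                               # average system delay (Little's law)
        return rho, L, W

    print(mmc_metrics(lam=8.0, mu=1.0, c=10))
    ```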

  4. PHOTOGRAMMETRIC TECHNIQUES FOR ROAD SURFACE ANALYSIS

    Directory of Open Access Journals (Sweden)

    V. A. Knyaz

    2016-06-01

    Full Text Available The quality and condition of a road surface are of great importance for the convenience and safety of driving. Investigations of the behaviour of road materials under laboratory conditions, and monitoring of existing roads, are therefore widely carried out to control geometric parameters and detect defects in the road surface. Photogrammetry, as an accurate non-contact measuring method, provides powerful means for solving different tasks in road surface reconstruction and analysis. The range of dimensions of concern in road surface analysis varies greatly, from tenths of a millimetre to hundreds of metres and more, so a set of techniques is needed to meet all the requirements of road parameter estimation. Two photogrammetric techniques for road surface analysis are presented: one for accurate measurement of road pavement, and one for road surface reconstruction based on imagery obtained from an unmanned aerial vehicle. The first technique uses a photogrammetric system based on structured light for fast and accurate 3D surface reconstruction, and it allows analysing the characteristics of road texture and monitoring pavement behaviour. The second technique provides a dense 3D road model suitable for estimating road macro-parameters.

  5. Protocol vulnerability detection based on network traffic analysis and binary reverse engineering.

    Science.gov (United States)

    Wen, Shameng; Meng, Qingkun; Feng, Chao; Tang, Chaojing

    2017-01-01

    Network protocol vulnerability detection plays an important role in many domains, including protocol security analysis, application security, and network intrusion detection. In this study, by analyzing the general fuzzing method of network protocols, we propose a novel approach that combines network traffic analysis with the binary reverse engineering method. For network traffic analysis, the block-based protocol description language is introduced to construct test scripts, while the binary reverse engineering method employs the genetic algorithm with a fitness function designed to focus on code coverage. This combination leads to a substantial improvement in fuzz testing for network protocols. We build a prototype system and use it to test several real-world network protocol implementations. The experimental results show that the proposed approach detects vulnerabilities more efficiently and effectively than general fuzzing methods such as SPIKE.
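
    The block-based idea can be illustrated in a few lines (a SPIKE-style sketch, not the authors' tool): a message template separates fixed protocol framing from mutable fields, and only the mutable blocks are fuzzed before the test case is reassembled:

    ```python
    import random

    # Hypothetical template: fixed protocol keywords kept intact, mutable
    # fields fuzzed. The "LOGIN" line protocol here is purely illustrative.
    TEMPLATE = [
        ("fixed",   b"LOGIN "),
        ("mutable", b"alice"),
        ("fixed",   b"\r\n"),
    ]

    def mutate(data: bytes) -> bytes:
        choice = random.randrange(3)
        if choice == 0:                              # bit flip
            i = random.randrange(len(data))
            return data[:i] + bytes([data[i] ^ 0xFF]) + data[i + 1:]
        if choice == 1:                              # overlong field
            return data * random.randint(2, 64)
        return bytes(random.randrange(256) for _ in range(len(data)))  # random bytes

    def next_testcase(template) -> bytes:
        return b"".join(mutate(v) if kind == "mutable" else v for kind, v in template)

    for _ in range(3):
        print(next_testcase(TEMPLATE)[:40])
    ```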

  7. Performance analysis of routing protocols for IoT

    Science.gov (United States)

    Manda, Sridhar; Nalini, N.

    2018-04-01

    The Internet of Things (IoT) is an interdisciplinary arrangement of technologies used to achieve an effective combination of physical and digital things. With IoT, physical things can have personal virtual identities and participate in distributed computing. Realizing IoT requires the use of sensors appropriate to the sector in which IoT is deployed. For instance, in the healthcare domain, IoT needs to be integrated with wearable sensors used by patients. As sensor devices produce huge amounts of data, often called big data, efficient routing protocols must be in place. As far as wireless systems are concerned, there are existing protocols such as OLSR, DSR and AODV; the paper also throws light on the trust-based routing protocol for low-power and lossy networks (TRPL) for IoT. These are widely used wireless routing protocols. As IoT adoption is around the corner, it is essential to investigate routing protocols and evaluate their performance in terms of throughput, end-to-end delay, and routing overhead. These performance insights can help in making well-informed decisions when integrating wireless networks with IoT. In this paper, we analyzed different routing protocols and compared their performance. It was found that AODV performed better than the other routing protocols mentioned above.
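
    The three metrics named above are straightforward to compute from a simulation trace. A minimal sketch (the event-record layout is an assumption, not any particular simulator's format):

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class PacketEvent:
    kind: str                 # "data" or "control" (routing traffic)
    size: int                 # bytes
    sent_at: float            # seconds
    recv_at: Optional[float]  # None if the packet was dropped

def metrics(events: List[PacketEvent], duration: float):
    delivered = [e for e in events if e.kind == "data" and e.recv_at is not None]
    control = [e for e in events if e.kind == "control"]
    throughput = 8 * sum(e.size for e in delivered) / duration             # bit/s
    avg_delay = sum(e.recv_at - e.sent_at for e in delivered) / len(delivered)
    overhead = len(control) / max(len(delivered), 1)  # control pkts per delivered pkt
    return throughput, avg_delay, overhead

trace = [PacketEvent("data", 512, 0.00, 0.02),
         PacketEvent("data", 512, 0.10, None),       # dropped
         PacketEvent("control", 64, 0.05, 0.06)]
print(metrics(trace, duration=1.0))
```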

  8. Diffraction analysis of customized illumination technique

    Science.gov (United States)

    Lim, Chang-Moon; Kim, Seo-Min; Eom, Tae-Seung; Moon, Seung Chan; Shin, Ki S.

    2004-05-01

    Various enhancement techniques, such as alternating PSM, chrome-less phase lithography, double exposure, etc., have been considered as driving forces to push the production k1 factor below 0.35. Among them, layer-specific optimization of the illumination mode, the so-called customized illumination technique, has recently received considerable attention from lithographers. A new approach to illumination customization based on diffraction spectrum analysis is suggested in this paper. The illumination pupil is divided into various diffraction domains by comparing the similarity of the confined diffraction spectrum. The singular imaging property of each diffraction domain makes it easier to build and understand the customized illumination shape. By comparing the goodness of the image in each domain, it was possible to achieve the customized illumination shape. With the help of this technique, it was found that a layout change would not change the shape of the customized illumination mode.

  9. Fault tree analysis: concepts and techniques

    International Nuclear Information System (INIS)

    Fussell, J.B.

    1976-01-01

    Concepts and techniques of fault tree analysis have been developed over the past decade, and predictions from this type of analysis are now important considerations in the design of many systems such as aircraft, ships and their electronic systems, missiles, and nuclear reactor systems. Routine, hardware-oriented fault tree construction can be automated; however, considerable effort is needed in this area to bring the methodology to production status. When this status is achieved, the entire analysis of hardware systems will be automated except for the system definition step. Automated analysis is not undesirable; to the contrary, when verified on adequately complex systems, automated analysis could well become routine. It could also provide an excellent start for a more in-depth fault tree analysis that includes environmental effects, common mode failure, and human errors. Automated analysis is extremely fast and frees the analyst from routine hardware-oriented fault tree construction, as well as eliminating logic errors and errors of oversight in this part of the analysis. Automated analysis thus affords the analyst a powerful tool, allowing his prime efforts to be devoted to unearthing more subtle aspects of the modes of failure of the system.
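
    Since a fault tree bottoms out in AND/OR gates over basic-event probabilities, the quantitative step admits a tiny illustration. A minimal sketch (the toy tree and failure probabilities are invented for illustration; real analyses also handle repeated events and common-cause failures via minimal cut sets):

```python
def or_gate(probs):
    # P(at least one of several independent events)
    q = 1.0
    for p in probs:
        q *= (1.0 - p)
    return 1.0 - q

def and_gate(probs):
    # P(all independent events occur together)
    p = 1.0
    for x in probs:
        p *= x
    return p

# Toy tree: TOP = OR(pump fails, AND(valve stuck, sensor fails))
basic = {"pump": 1e-3, "valve": 5e-3, "sensor": 2e-2}
top = or_gate([basic["pump"], and_gate([basic["valve"], basic["sensor"]])])
print(f"Top event probability ~ {top:.3e}")
```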

  10. Protocol for sampling and analysis of bone specimens

    International Nuclear Information System (INIS)

    Aras, N.K.

    2000-01-01

    The iliac crest of the hip bone was chosen as the most suitable sampling site for several reasons: local variation in the elemental concentration along the iliac crest is minimal; iliac crest biopsies are commonly taken clinically on patients; the cortical part of the sample is small (∼2 mm) and can be separated easily from the trabecular bone; and the use of the trabecular part of the iliac crest for trace element analysis has the advantage of rapidly reflecting changes in the composition of bone due to external parameters, including medication. Biopsy studies, although in some ways more difficult than autopsy studies because of the need to obtain the informed consent of the subjects, are potentially more useful: many problems of postmortem migration of elements can be avoided, and reliable dietary and other data can be collected simultaneously. Subjects should be selected among patients undergoing orthopedic surgery for any reason other than osteoporosis. An established protocol should be followed to obtain bone biopsies. Patients undergoing surgery should fill in the 'Osteoporosis Project Questionnaire Form', including information on lifestyle variables, dietary intakes, the reason for surgery, etc. If possible, the bone mineral density (BMD) should be measured prior to removal of the biopsy sample. However, it may not be possible to have BMD results for all subjects because of the difficulty of DEXA measurement after an accident.

  11. Performance analysis of simultaneous dense coding protocol under decoherence

    Science.gov (United States)

    Huang, Zhiming; Zhang, Cai; Situ, Haozhen

    2017-09-01

    The simultaneous dense coding (SDC) protocol is useful in designing quantum protocols. We analyze the performance of the SDC protocol under the influence of noisy quantum channels. Six kinds of paradigmatic Markovian noise along with one kind of non-Markovian noise are considered. The joint success probability of both receivers and the success probabilities of one receiver are calculated for three different locking operators. Some interesting properties have been found, such as invariance and symmetry. Among the three locking operators we consider, the SWAP gate is most resistant to noise and results in the same success probabilities for both receivers.
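
    To make the noise analysis concrete, here is a minimal numerical sketch in plain numpy. It applies only a single-qubit depolarizing channel to one half of a Bell pair (one of the paradigmatic noise models mentioned above) and omits the locking operators entirely, so it is an illustration of the machinery, not the paper's calculation.

```python
import numpy as np

# Bell state |Phi+> = (|00> + |11>)/sqrt(2), the shared dense-coding resource
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)
rho = np.outer(phi, phi)

I = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]])

def depolarize(rho, p, qubit=0):
    # Single-qubit depolarizing channel on one half of a two-qubit state.
    out = (1 - p) * rho.astype(complex)
    for P in (X, Y, Z):
        K = np.kron(P, I) if qubit == 0 else np.kron(I, P)
        out += (p / 3) * (K @ rho @ K.conj().T)
    return out

# Probability of still decoding the intended Bell state (message fidelity)
for p in (0.0, 0.05, 0.2):
    noisy = depolarize(rho, p)
    print(f"p = {p:.2f}: success probability = {np.real(phi @ noisy @ phi):.3f}")
```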

  12. Applications of neutron activation analysis technique

    International Nuclear Information System (INIS)

    Jonah, S. A.

    2000-07-01

    The technique was developed as far back as 1936 by G. Hevesy and H. Levy for the analysis of Dy using an isotopic source. Approximately 40 elements can be analyzed by the instrumental neutron activation analysis (INAA) technique with neutrons from a nuclear reactor. By applying radiochemical separation, the number of elements that can be analysed may be increased to almost 70. Compared with other analytical methods used in environmental and industrial research, NAA has some unique features. These are multi-element capability, rapidity, reproducibility of results, complementarity to other methods, freedom from analytical blank, and independence of the chemical state of the elements. There are several types of neutron sources, namely nuclear reactors, accelerator-based sources and radioisotope-based sources, but nuclear reactors, with their high fluxes of neutrons from the fission of 235U, give the most intense irradiation and hence the highest available sensitivities for NAA. In this paper, applications of NAA of socio-economic importance are discussed. The benefits of using NAA and related nuclear techniques for on-line applications in industrial process control are highlighted. A brief description of the NAA set-ups at CERT is given. Finally, NAA is compared with other leading analytical techniques.

  13. Chromatographic Techniques for Rare Earth Elements Analysis

    Science.gov (United States)

    Chen, Beibei; He, Man; Zhang, Huashan; Jiang, Zucheng; Hu, Bin

    2017-04-01

    The present capability of rare earth element (REE) analysis has been achieved through the development of two instrumental techniques. The efficiency of spectroscopic methods has been extraordinarily improved for the detection and determination of REE traces in various materials. On the other hand, the determination of REEs very often depends on their preconcentration and separation, and chromatographic techniques are very powerful tools for the separation of REEs. By coupling with sensitive detectors, many ambitious analytical tasks can be fulfilled. Liquid chromatography is the most widely used technique. Different combinations of stationary phases and mobile phases can be used in ion exchange chromatography, ion chromatography, ion-pair reverse-phase chromatography and some other techniques. The application of gas chromatography is limited because only volatile compounds of REEs can be separated. Thin-layer and paper chromatography are techniques that cannot be directly coupled with suitable detectors, which limits their applications. For special demands, separations can be performed by capillary electrophoresis, which has very high separation efficiency.

  14. Artificial Intelligence techniques for big data analysis

    OpenAIRE

    Aditya Khatri

    2017-01-01

    During my stay in Salamanca (Spain), I was fortunate enough to participate in the BISITE Research Group of the University of Salamanca. The University of Salamanca is the oldest university in Spain and in 2018 it celebrates its 8th centenary. As a computer science researcher, I participated in one of the many active international projects of the research group, especially in big data analysis using Artificial Intelligence (AI) techniques. AI is one of BISITE's main lines of rese...

  15. An approach to standardization of urine sediment analysis via suggestion of a common manual protocol.

    Science.gov (United States)

    Ko, Dae-Hyun; Ji, Misuk; Kim, Sollip; Cho, Eun-Jung; Lee, Woochang; Yun, Yeo-Min; Chun, Sail; Min, Won-Ki

    2016-01-01

    The results of urine sediment analysis have traditionally been reported semiquantitatively. However, as recent guidelines recommend quantitative reporting of urine sediment, and with the development of automated urine sediment analyzers, there is an increasing need for quantitative analysis of urine sediment. Here, we developed a protocol for urine sediment analysis and quantified the results. Based on questionnaires, various reports, guidelines, and experimental results, we developed a protocol for urine sediment analysis. The results of this new protocol were compared with those obtained with a standardized chamber and an automated sediment analyzer. Reference intervals were also estimated using the new protocol. We developed a protocol with centrifugation at 400 g for 5 min, with an average concentration factor of 30. The correlation between the quantitative results of urine sediment analysis, the standardized chamber, and the automated sediment analyzer was generally good. The conversion factor derived from the new protocol showed a better fit with the results of manual counting than the default conversion factor in the automated sediment analyzer. We developed a protocol for manual urine sediment analysis that reports results quantitatively. This protocol may provide a means for the standardization of urine sediment analysis.

  16. Applications Of Binary Image Analysis Techniques

    Science.gov (United States)

    Tropf, H.; Enderle, E.; Kammerer, H. P.

    1983-10-01

    After discussing the conditions under which binary image analysis techniques can be used, three new applications of the fast binary image analysis system S.A.M. (Sensorsystem for Automation and Measurement) are reported: (1) The human view direction is measured at TV frame rate while the subject's head remains freely movable. (2) Industrial parts hanging on a moving conveyor are classified prior to spray painting by robot. (3) In automotive wheel assembly, the eccentricity of the wheel is minimized by turning the tyre relative to the rim in order to balance out the eccentricity of the components.

  17. Infusing Reliability Techniques into Software Safety Analysis

    Science.gov (United States)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  18. The development of human behavior analysis techniques

    International Nuclear Information System (INIS)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang.

    1997-07-01

    In this project, which studies man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for assessing task performance in control rooms using software simulation, and we also developed human error analysis and application techniques. SACOM was developed to assess an operator's physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We developed a trip analysis system, including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information, and for 79 cases induced by human errors we time-lined the man-machine interactions. INSTEC, a database system for our analysis results, was developed. MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs.

  19. The development of human behavior analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang

    1997-07-01

    In this project, which studies man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for assessing task performance in control rooms using software simulation, and we also developed human error analysis and application techniques. SACOM was developed to assess an operator's physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We developed a trip analysis system, including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information, and for 79 cases induced by human errors we time-lined the man-machine interactions. INSTEC, a database system for our analysis results, was developed. MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs.

  20. Analysis of the LTE Access Reservation Protocol for Real-Time Traffic

    DEFF Research Database (Denmark)

    Thomsen, Henning; Kiilerich Pratas, Nuno; Stefanovic, Cedomir

    2013-01-01

    LTE is increasingly seen as a system for serving real-time Machine-to-Machine (M2M) communication needs. The asynchronous M2M user access in LTE is obtained through a two-phase access reservation protocol (contention and data phase). Existing analysis related to these protocols is based... of the two-phase LTE reservation protocol and assess its performance, when assumptions (1) and (2) do not hold...
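
    The contention phase invites a quick back-of-the-envelope model. As a minimal sketch (a one-shot slotted model with uniformly random preamble choice; it is exactly this kind of simplifying assumption that the paper's analysis relaxes), the expected number of devices whose preamble collides with no other device's is:

```python
def expected_singletons(n: int, m: int) -> float:
    """Expected number of devices that pick a preamble no other device
    picks (i.e., survive the contention phase), with n devices choosing
    uniformly at random among m preambles."""
    return n * (1 - 1 / m) ** (n - 1)

# LTE PRACH commonly offers 54 contention preambles (assumed here)
for n in (5, 10, 30, 54):
    print(f"{n:2d} devices -> {expected_singletons(n, 54):.1f} expected successes")
```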

  1. A new analysis technique for microsamples

    International Nuclear Information System (INIS)

    Boyer, R.; Journoux, J.P.; Duval, C.

    1989-01-01

    For many decades, isotopic analysis of uranium or plutonium has been performed by mass spectrometry. The most recent analytical techniques, using the counting method or a plasma torch combined with a mass spectrometer (ICP-MS), have yet to reach a greater degree of precision than the older methods in this field. The two means of ionization for isotopic analysis - electronic bombardment of atoms or molecules (gas ion source) and thermal effect (thermoionic source) - are compared, revealing some inconsistency between the quantity of sample necessary for analysis and the luminosity. In fact, the quantity of sample necessary for the gas source mass spectrometer is 10 to 20 times greater than that for the thermoionization spectrometer, while the sample consumption is between 10^5 and 10^6 times greater. This proves that almost the entire sample is not necessary for the measurement; it is only required because of the introduction system of the gas spectrometer. The new analysis technique, referred to as ''microfluorination'', corrects this anomaly and exploits the advantages of the electron bombardment method of ionization.

  2. Flash Infrared Thermography Contrast Data Analysis Technique

    Science.gov (United States)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.
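
    The normalized-contrast extraction itself is straightforward to sketch. A minimal illustration (not NASA's implementation; the synthetic cooling curves, array layout and pixel choices are all invented):

```python
import numpy as np

def normalized_contrast(frames, defect_px, ref_px):
    """Normalized contrast evolution from a flash-thermography video cube.

    frames    : ndarray (time, rows, cols) of pixel intensities/temperatures
    defect_px : (row, col) over the suspected anomaly
    ref_px    : (row, col) over sound material
    """
    t_def = frames[:, defect_px[0], defect_px[1]].astype(float)
    t_ref = frames[:, ref_px[0], ref_px[1]].astype(float)
    return (t_def - t_ref) / t_ref   # one contrast value per frame

# Toy data: the anomaly cools more slowly than the reference after the flash
time = np.arange(1, 100)
cube = np.empty((99, 2, 2))
cube[:, 0, 0] = 30 + 60 / np.sqrt(time)          # over the anomaly
cube[:, 1, 1] = 30 + 50 / np.sqrt(time)          # sound region
cube[:, 0, 1] = cube[:, 1, 0] = cube[:, 1, 1]
print(normalized_contrast(cube, (0, 0), (1, 1))[:5])
```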

  3. Design and analysis of communication protocols for quantum repeater networks

    International Nuclear Information System (INIS)

    Jones, Cody; Kim, Danny; Rakher, Matthew T; Ladd, Thaddeus D; Kwiat, Paul G

    2016-01-01

    We analyze how the performance of a quantum-repeater network depends on the protocol employed to distribute entanglement, and we find that the choice of repeater-to-repeater link protocol has a profound impact on entanglement-distribution rate as a function of hardware parameters. We develop numerical simulations of quantum networks using different protocols, where the repeater hardware is modeled in terms of key performance parameters, such as photon generation rate and collection efficiency. These parameters are motivated by recent experimental demonstrations in quantum dots, trapped ions, and nitrogen-vacancy centers in diamond. We find that a quantum-dot repeater with the newest protocol (‘MidpointSource’) delivers the highest entanglement-distribution rate for typical cases where there is low probability of establishing entanglement per transmission, and in some cases the rate is orders of magnitude higher than other schemes. Our simulation tools can be used to evaluate communication protocols as part of designing a large-scale quantum network. (paper)

  4. Reliability analysis techniques for the design engineer

    International Nuclear Information System (INIS)

    Corran, E.R.; Witt, H.H.

    1982-01-01

    This paper describes a fault tree analysis package that eliminates most of the housekeeping tasks involved in proceeding from the initial construction of a fault tree to the final stage of presenting a reliability analysis in a safety report. It is suitable for designers with relatively little training in reliability analysis and computer operation. Users can rapidly investigate the reliability implications of various options at the design stage and evolve a system which meets specified reliability objectives. Later independent review is thus unlikely to reveal major shortcomings necessitating modification and project delays. The package operates interactively, allowing the user to concentrate on the creative task of developing the system fault tree, which may be modified and displayed graphically. For preliminary analysis, system data can be derived automatically from a generic data bank. As the analysis proceeds, improved estimates of critical failure rates and test and maintenance schedules can be inserted. The technique is applied to the reliability analysis of the recently upgraded HIFAR Containment Isolation System. (author)

  5. Dynamic Channel Slot Allocation Scheme and Performance Analysis of Cyclic Quorum Multichannel MAC Protocol

    Directory of Open Access Journals (Sweden)

    Xing Hu

    2017-01-01

    Full Text Available In situations with high node diversity, a multichannel MAC protocol can improve frequency efficiency, owing to fewer collisions compared with a single-channel MAC protocol. The performance of the cyclic quorum-based multichannel (CQM) MAC protocol is outstanding: based on the cyclic quorum system and channel slot allocation, it avoids the bottleneck that other protocols suffer from and can be easily realized with only one transceiver. To obtain the accurate performance of the CQM MAC protocol, a Markov chain model, which combines the channel-hopping strategy of the CQM protocol with the IEEE 802.11 distributed coordination function (DCF), is proposed. The results of numerical analysis show that the optimal performance of the CQM protocol is obtained in the saturation bound situation, and we then obtain the saturation bound of the CQM system by a bird swarm algorithm. In addition, to improve the performance of the CQM protocol in the unsaturated situation, a dynamic channel slot allocation CQM (DCQM) protocol is proposed, based on a wavelet neural network. Finally, the performance of the CQM protocol and the DCQM protocol is simulated on the Qualnet platform. The simulation results show that the analytic and simulation results match very well, and that DCQM performs better in the unsaturated situation.
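
    The per-channel contention component of such a model can be illustrated with the classic two-equation DCF fixed point. The sketch below is a generic Bianchi-style saturation model solved by damped iteration; the contention-window parameters and the coupling to CQM channel hopping are assumptions, not the paper's full model.

```python
def dcf_fixed_point(n, W=32, m=5, iters=2000):
    """Bianchi-style saturation fixed point for IEEE 802.11 DCF:
    tau = per-slot transmission probability of a station,
    p   = conditional collision probability it sees."""
    tau = 0.05
    for _ in range(iters):
        p = 1 - (1 - tau) ** (n - 1)
        new_tau = (2 * (1 - 2 * p)
                   / ((1 - 2 * p) * (W + 1) + p * W * (1 - (2 * p) ** m)))
        tau = 0.5 * tau + 0.5 * new_tau   # damping keeps the iteration stable
    return tau, p

for n in (5, 10, 20):
    tau, p = dcf_fixed_point(n)
    print(f"n = {n:2d}: tau = {tau:.4f}, collision prob = {p:.4f}")
```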

  6. The effect of personalized versus standard patient protocols for radiostereometric analysis (RSA)

    DEFF Research Database (Denmark)

    Muharemovic, O; Troelsen, A; Thomsen, M G

    2018-01-01

    INTRODUCTION: Increasing pressure in the clinic requires a more standardized approach to radiostereometric analysis (RSA) imaging. The aim of this study was to investigate whether implementation of personalized RSA patient protocols could increase image quality and decrease examination time...... imaging. Radiographers in the control group used a standard RSA protocol. RESULTS: At three months, radiographers in the case group significantly reduced (p .... No significant improvements were found in the control group at any time point. CONCLUSION: There is strong evidence that personalized RSA patient protocols have a positive effect on image quality and radiation dose savings. Implementation of personal patient protocols as a RSA standard will contribute...

  7. Analysis of the differential-phase-shift-keying protocol in the quantum-key-distribution system

    International Nuclear Information System (INIS)

    Rong-Zhen, Jiao; Chen-Xu, Feng; Hai-Qiang, Ma

    2009-01-01

    The analysis is based on the error rate and the secure communication rate as functions of distance for three quantum-key-distribution (QKD) protocols: the Bennett–Brassard 1984, the Bennett–Brassard–Mermin 1992, and the coherent differential-phase-shift keying (DPSK) protocols. We consider the secure communication rate of the DPSK protocol against an arbitrary individual attack, including the most commonly considered intercept-resend and photon-number splitting attacks, and conclude that the simple and efficient differential-phase-shift-keying protocol allows for more than 200 km of secure communication distance with high communication rates. (general)
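
    The distance dependence comes essentially from fiber loss. As a heavily simplified sketch (all numbers - attenuation, mean photon number, repetition rate, detector efficiency - are assumed placeholders, and the security penalty terms against individual attacks analyzed in the paper are omitted):

```python
def dpsk_raw_rate(distance_km, alpha_db_km=0.2, mu=0.2,
                  rep_rate=1e9, detector_eff=0.1):
    """Toy sifted-rate estimate for a DPSK QKD link:
    rate ~ pulse rate x mean photon number x channel transmittance
           x detector efficiency (security corrections omitted)."""
    transmittance = 10 ** (-alpha_db_km * distance_km / 10)
    return rep_rate * mu * transmittance * detector_eff

for d in (50, 100, 200):
    print(f"{d:3d} km: ~{dpsk_raw_rate(d):.0f} bit/s before security corrections")
```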

  8. Interferogram analysis using the Abel inversion technique

    International Nuclear Information System (INIS)

    Yusof Munajat; Mohamad Kadim Suaidi

    2000-01-01

    A high-speed, high-resolution optical detection system was used to capture images of acoustic wave propagation. The frozen image, in the form of an interferogram, was analysed to calculate the transient pressure profile of the acoustic waves. The interferogram analysis was based on the fringe shift and the application of the Abel inversion technique. An easier approach was taken by using the MathCAD program as the programming tool, which is nevertheless powerful enough to perform the calculations, plotting and file transfers. (Author)

  9. A Formal Analysis of the Web Services Atomic Transaction Protocol with UPPAAL

    DEFF Research Database (Denmark)

    Ravn, Anders Peter; Srba, Jiri; Vighio, Saleem

    2010-01-01

    We present a formal analysis of the Web Services Atomic Transaction (WS-AT) protocol. WS-AT is a part of the WS-Coordination framework and describes an algorithm for reaching agreement on the outcome of a distributed transaction. The protocol is modelled and verified using the model checker UPPAAL...

  10. Design and Analysis of Transport Protocols for Reliable High-Speed Communications

    NARCIS (Netherlands)

    Oláh, A.

    1997-01-01

    The design and analysis of transport protocols for reliable communications constitutes the topic of this dissertation. These transport protocols guarantee the sequenced and complete delivery of user data over networks which may lose, duplicate and reorder packets. Reliable transport services are

  11. An Authentication Protocol for Mobile IPTV Users Based on an RFID-USB Convergence Technique

    Science.gov (United States)

    Jeong, Yoon-Su; Kim, Yong-Tae

    With the growing trend towards convergence of broadcast and communications media, Internet Protocol television (IPTV), which delivers real-time multimedia content over diverse types of communications networks (e.g., broadband Internet, cable TV, and satellite TV), has become a mainstream technology. Authenticating mobile IPTV subscribers who are continuously on the move is a challenge. A complex authentication process often impairs conditional access security or service quality by increasing the number of illegal users and delaying service. This paper proposes an RFID-USB authentication protocol for mobile IPTV users, combining USIM-based personalized authentication with lightweight authentication that utilizes RFID-USB technology with an implanted agent module (called an "agent tag") which temporarily enhances user status information. The proposed authentication protocol adopts a plug-and-play security agent module that is placed in both an RFID tag and an RFID-USB. The implanted security agents cooperate in such a way that multiple RFID tags are connected seamlessly to an RFID-USB.

  12. A Hybrid Analysis for Security Protocols with State

    Science.gov (United States)

    2014-07-16


  13. Low energy analysis techniques for CUORE

    Energy Technology Data Exchange (ETDEWEB)

    Alduino, C.; Avignone, F.T.; Chott, N.; Creswick, R.J.; Rosenfeld, C.; Wilson, J. [University of South Carolina, Department of Physics and Astronomy, Columbia, SC (United States); Alfonso, K.; Huang, H.Z.; Sakai, M.; Schmidt, J. [University of California, Department of Physics and Astronomy, Los Angeles, CA (United States); Artusa, D.R.; Rusconi, C. [University of South Carolina, Department of Physics and Astronomy, Columbia, SC (United States); INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Azzolini, O.; Camacho, A.; Keppel, G.; Palmieri, V.; Pira, C. [INFN-Laboratori Nazionali di Legnaro, Padua (Italy); Bari, G.; Deninno, M.M. [INFN-Sezione di Bologna, Bologna (Italy); Beeman, J.W. [Lawrence Berkeley National Laboratory, Materials Science Division, Berkeley, CA (United States); Bellini, F.; Cosmelli, C.; Ferroni, F.; Piperno, G. [Sapienza Universita di Roma, Dipartimento di Fisica, Rome (Italy); INFN-Sezione di Roma, Rome (Italy); Benato, G.; Singh, V. [University of California, Department of Physics, Berkeley, CA (United States); Bersani, A.; Caminata, A. [INFN-Sezione di Genova, Genoa (Italy); Biassoni, M.; Brofferio, C.; Capelli, S.; Carniti, P.; Cassina, L.; Chiesa, D.; Clemenza, M.; Faverzani, M.; Fiorini, E.; Gironi, L.; Gotti, C.; Maino, M.; Nastasi, M.; Nucciotti, A.; Pavan, M.; Pozzi, S.; Sisti, M.; Terranova, F.; Zanotti, L. [Universita di Milano-Bicocca, Dipartimento di Fisica, Milan (Italy); INFN-Sezione di Milano Bicocca, Milan (Italy); Branca, A.; Taffarello, L. [INFN-Sezione di Padova, Padua (Italy); Bucci, C.; Cappelli, L.; D' Addabbo, A.; Gorla, P.; Pattavina, L.; Pirro, S. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Canonica, L. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Massachusetts Institute of Technology, Cambridge, MA (United States); Cao, X.G.; Fang, D.Q.; Ma, Y.G.; Wang, H.W.; Zhang, G.Q. [Shanghai Institute of Applied Physics, Chinese Academy of Sciences, Shanghai (China); Cardani, L.; Casali, N.; Dafinei, I.; Morganti, S.; Mosteiro, P.J.; Tomei, C.; Vignati, M. [INFN-Sezione di Roma, Rome (Italy); Copello, S.; Di Domizio, S.; Marini, L.; Pallavicini, M. [INFN-Sezione di Genova, Genoa (Italy); Universita di Genova, Dipartimento di Fisica, Genoa (Italy); Cremonesi, O.; Ferri, E.; Giachero, A.; Pessina, G.; Previtali, E. [INFN-Sezione di Milano Bicocca, Milan (Italy); Cushman, J.S.; Davis, C.J.; Heeger, K.M.; Lim, K.E.; Maruyama, R.H. [Yale University, Department of Physics, New Haven, CT (United States); D' Aguanno, D.; Pagliarone, C.E. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Universita degli Studi di Cassino e del Lazio Meridionale, Dipartimento di Ingegneria Civile e Meccanica, Cassino (Italy); Dell' Oro, S. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); INFN-Gran Sasso Science Institute, L' Aquila (Italy); Di Vacri, M.L.; Santone, D. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Universita dell' Aquila, Dipartimento di Scienze Fisiche e Chimiche, L' Aquila (Italy); Drobizhev, A.; Hennings-Yeomans, R.; Kolomensky, Yu.G.; Wagaarachchi, S.L. [University of California, Department of Physics, Berkeley, CA (United States); Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Franceschi, M.A.; Ligi, C.; Napolitano, T. [INFN-Laboratori Nazionali di Frascati, Rome (Italy); Freedman, S.J. 
[University of California, Department of Physics, Berkeley, CA (United States); Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Fujikawa, B.K.; Mei, Y.; Schmidt, B.; Smith, A.R.; Welliver, B. [Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Giuliani, A.; Novati, V. [Universite Paris-Saclay, CSNSM, Univ. Paris-Sud, CNRS/IN2P3, Orsay (France); Gladstone, L.; Leder, A.; Ouellet, J.L.; Winslow, L.A. [Massachusetts Institute of Technology, Cambridge, MA (United States); Gutierrez, T.D. [California Polytechnic State University, Physics Department, San Luis Obispo, CA (United States); Haller, E.E. [Lawrence Berkeley National Laboratory, Materials Science Division, Berkeley, CA (United States); University of California, Department of Materials Science and Engineering, Berkeley, CA (United States); Han, K. [Shanghai Jiao Tong University, Department of Physics and Astronomy, Shanghai (China); Hansen, E. [University of California, Department of Physics and Astronomy, Los Angeles, CA (United States); Massachusetts Institute of Technology, Cambridge, MA (United States); Kadel, R. [Lawrence Berkeley National Laboratory, Physics Division, Berkeley, CA (United States); Martinez, M. [Sapienza Universita di Roma, Dipartimento di Fisica, Rome (Italy); INFN-Sezione di Roma, Rome (Italy); Universidad de Zaragoza, Laboratorio de Fisica Nuclear y Astroparticulas, Saragossa (Spain); Moggi, N.; Zucchelli, S. [INFN-Sezione di Bologna, Bologna (Italy); Universita di Bologna - Alma Mater Studiorum, Dipartimento di Fisica e Astronomia, Bologna (IT); Nones, C. [CEA/Saclay, Service de Physique des Particules, Gif-sur-Yvette (FR); Norman, E.B.; Wang, B.S. [Lawrence Livermore National Laboratory, Livermore, CA (US); University of California, Department of Nuclear Engineering, Berkeley, CA (US); O' Donnell, T. [Virginia Polytechnic Institute and State University, Center for Neutrino Physics, Blacksburg, VA (US); Sangiorgio, S.; Scielzo, N.D. [Lawrence Livermore National Laboratory, Livermore, CA (US); Wise, T. [Yale University, Department of Physics, New Haven, CT (US); University of Wisconsin, Department of Physics, Madison, WI (US); Woodcraft, A. [University of Edinburgh, SUPA, Institute for Astronomy, Edinburgh (GB); Zimmermann, S. [Lawrence Berkeley National Laboratory, Engineering Division, Berkeley, CA (US)

    2017-12-15

    CUORE is a tonne-scale cryogenic detector operating at the Laboratori Nazionali del Gran Sasso (LNGS) that uses tellurium dioxide bolometers to search for neutrinoless double-beta decay of {sup 130}Te. CUORE is also well suited to searches for low energy rare events, such as solar axions or WIMP scattering, thanks to its ultra-low background and large target mass. However, conducting such sensitive searches requires lowering the energy threshold to 10 keV. In this paper, we describe the analysis techniques developed for the low energy analysis of CUORE-like detectors, using the data acquired from November 2013 to March 2015 by CUORE-0, a single-tower prototype designed to validate the assembly procedure and new cleaning techniques of CUORE. We explain in detail the energy threshold optimization, the continuous monitoring of the trigger efficiency, the data and event selection, and the energy calibration at low energies. We also present the low energy background spectrum of CUORE-0 below 60 keV. Finally, we report the sensitivity of CUORE to WIMP annual modulation using the CUORE-0 energy threshold and background, as well as an estimate of the uncertainty on the nuclear quenching factor from nuclear recoils in CUORE-0. (orig.)

  14. Machine monitoring via current signature analysis techniques

    International Nuclear Information System (INIS)

    Smith, S.F.; Castleberry, K.N.; Nowlin, C.H.

    1992-01-01

    A significant need in the effort to provide increased production quality is improved plant equipment monitoring capability. Unfortunately, in today's tight economy, even such monitoring instrumentation must be implemented in a recognizably cost-effective manner. By analyzing the electric current drawn by motors, actuators, and other line-powered industrial equipment, significant insights into the operation of the movers, the driven equipment, and even the power source can be obtained. The generic term 'current signature analysis' (CSA) has been coined to describe several techniques for extracting useful equipment or process monitoring information from the electrical power feed system. A patented method developed at Oak Ridge National Laboratory is described which recognizes the presence of line-current modulation produced by motors and actuators driving varying loads. The in-situ application of applicable linear demodulation techniques to the analysis of numerous motor-driven systems is also discussed. The use of high-quality amplitude- and angle-demodulation circuitry has permitted remote status monitoring of several types of medium- and high-power gas compressors in US DOE facilities, driven by 3-phase induction motors rated from 100 to 3,500 hp, both with and without intervening speed increasers. Flow characteristics of the compressors, including various forms of abnormal behavior such as surging and rotating stall, produce at the output of the specialized detectors specific time and frequency signatures which can be easily identified for monitoring, control, and fault-prevention purposes. The resultant data are similar in form to information obtained via standard vibration-sensing techniques and can be analyzed using essentially identical methods. In addition, other machinery such as refrigeration compressors, brine pumps, vacuum pumps, fans, and electric motors have been characterized.

  15. The use of crypto-analysis techniques for securing internet ...

    African Journals Online (AJOL)

    ... recommended to be combined with other techniques, such as client-side software, data transaction protocols, web server software, and the network server operating system involved in handling e-commerce, for securing internet transactions. This recommendation will invariably ensure that internet transactions are secured.

  16. Time Error Analysis of SOE System Using Network Time Protocol

    International Nuclear Information System (INIS)

    Keum, Jong Yong; Park, Geun Ok; Park, Heui Youn

    2005-01-01

    To determine the accuracy of time in the fully digitalized SOE (Sequence of Events) system, we used a formal specification of the Network Time Protocol (NTP) Version 3, which is used to synchronize timekeeping among a set of distributed computers. By constructing a simple experimental environment and experimenting with internet time synchronization, we analyzed the time errors of the local clocks of the SOE system synchronized with a time server via computer networks.
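
    The NTPv3 offset and delay estimates that underlie such an error analysis follow directly from the four packet timestamps. A minimal sketch (the timestamp values are invented for illustration):

```python
def ntp_offset_delay(t1, t2, t3, t4):
    """Standard NTP clock-filter quantities (RFC 1305 / NTPv3):
    t1 = client send, t2 = server receive,
    t3 = server send, t4 = client receive."""
    offset = ((t2 - t1) + (t3 - t4)) / 2   # estimated client clock offset
    delay = (t4 - t1) - (t3 - t2)          # round-trip network delay
    return offset, delay

# Example: client clock 15 ms behind the server, 40 ms one-way delay
print(ntp_offset_delay(0.000, 0.055, 0.056, 0.081))
# -> (0.015, 0.080): offset 15 ms, round-trip delay 80 ms
```

    The offset estimate is exact only when the forward and return delays are symmetric; asymmetry translates directly into residual time error, which is one source of the errors analyzed above.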

  17. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  18. Population estimation techniques for routing analysis

    International Nuclear Information System (INIS)

    Sathisan, S.K.; Chagari, A.K.

    1994-01-01

    A number of on-site and off-site factors affect the potential siting of a radioactive materials repository at Yucca Mountain, Nevada. Transportation-related issues such as route selection and design are among them. These involve the evaluation of potential risks and impacts, including those related to population. Population characteristics (total population and density) are critical factors in risk assessment, emergency preparedness and response planning, and ultimately in route designation. This paper presents an application of Geographic Information System (GIS) technology to facilitate such analyses. Specifically, techniques to estimate critical population information are presented. A case study using the highway network in Nevada is used to illustrate the analyses. TIGER coverages are used as the basis for population information at the block level. The data are then synthesized at the tract, county and state levels of aggregation. Of particular interest are population estimates for various corridor widths along transport corridors - ranging from 0.5 miles to 20 miles in this paper. A sensitivity analysis based on the level of data aggregation is also presented. The results of these analyses indicate that specific characteristics of the area and its population could be used as indicators to aggregate data appropriately for the analysis.

  19. Analysis of Pervasive Mobile Ad Hoc Routing Protocols

    Science.gov (United States)

    Qadri, Nadia N.; Liotta, Antonio

    Mobile ad hoc networks (MANETs) are a fundamental element of pervasive networks and therefore, of pervasive systems that truly support pervasive computing, where user can communicate anywhere, anytime and on-the-fly. In fact, future advances in pervasive computing rely on advancements in mobile communication, which includes both infrastructure-based wireless networks and non-infrastructure-based MANETs. MANETs introduce a new communication paradigm, which does not require a fixed infrastructure - they rely on wireless terminals for routing and transport services. Due to highly dynamic topology, absence of established infrastructure for centralized administration, bandwidth constrained wireless links, and limited resources in MANETs, it is challenging to design an efficient and reliable routing protocol. This chapter reviews the key studies carried out so far on the performance of mobile ad hoc routing protocols. We discuss performance issues and metrics required for the evaluation of ad hoc routing protocols. This leads to a survey of existing work, which captures the performance of ad hoc routing algorithms and their behaviour from different perspectives and highlights avenues for future research.

  20. Techniques for Analysis of Plant Phenolic Compounds

    Directory of Open Access Journals (Sweden)

    Thomas H. Roberts

    2013-02-01

    Full Text Available Phenolic compounds are well-known phytochemicals found in all plants. They consist of simple phenols, benzoic and cinnamic acid, coumarins, tannins, lignins, lignans and flavonoids. Substantial developments in research focused on the extraction, identification and quantification of phenolic compounds as medicinal and/or dietary molecules have occurred over the last 25 years. Organic solvent extraction is the main method used to extract phenolics. Chemical procedures are used to detect the presence of total phenolics, while spectrophotometric and chromatographic techniques are utilized to identify and quantify individual phenolic compounds. This review addresses the application of different methodologies utilized in the analysis of phenolic compounds in plant-based products, including recent technical developments in the quantification of phenolics.

  1. Radio-analysis. Definitions and techniques

    International Nuclear Information System (INIS)

    Bourrel, F.; Courriere, Ph.

    2003-01-01

    This paper presents the different steps of the radio-labelling of a molecule for two purposes, radio-immuno-analysis and auto-radiography: 1 - definitions, radiations and radioprotection: activity of a radioactive source; half-life; radioactivity (alpha, beta and gamma radioactivity, internal conversion); radioprotection (irradiation, contamination); 2 - radionuclides used in medical biology and the obtention of labelled molecules: gamma emitters (125I, 57Co); beta emitters; obtention of labelled molecules (general principles, high specific activity and choice of the tracer, molecule to be labelled); main labelling techniques (iodination, tritium); purification of the labelled compound (dialysis, gel filtration or molecular exclusion chromatography, high performance liquid chromatography); quality estimation of the labelled compound (labelling efficiency calculation, immuno-reactivity conservation, stability and preservation). (J.S.)

  2. Performance Analysis of Untraceability Protocols for Mobile Agents Using an Adaptable Framework

    OpenAIRE

    LESZCZYNA RAFAL; GORSKI Janusz Kazimierz

    2006-01-01

    Recently we proposed two untraceability protocols for mobile agents and began investigating their quality. We believe that quality evaluation of security protocols should extend beyond a sole validation of their security and cover other quality aspects, primarily their efficiency. Thus, after conducting a security analysis, we wanted to complement it with a performance analysis. For this purpose we developed a performance evaluation framework, which, as we realised, with certain adjustments, can ...

  3. Performance Analysis of the Mobile IP Protocol (RFC 3344 and Related RFCS)

    Science.gov (United States)

    2006-12-01

    Performance Analysis of the Mobile IP Protocol (RFC 3344 and Related RFCs). Master's Thesis by Chin Chin Ng, December 2006; Thesis Co-Advisors: George W. Dinolt, J. D... A type field of 9 identifies the ICMP message as an advertisement; Mobile IP home agents and foreign agents use the code value of 16 to prevent any nodes...

  4. Ocean Optics Protocols for Satellite Ocean Color Sensor Validation, Revision 4, Volume IV: Inherent Optical Properties: Instruments, Characterizations, Field Measurements and Data Analysis Protocols

    Science.gov (United States)

    Mueller, J. L.; Fargion, G. S.; McClain, C. R. (Editor); Pegau, S.; Zanefeld, J. R. V.; Mitchell, B. G.; Kahru, M.; Wieland, J.; Stramska, M.

    2003-01-01

    This document stipulates protocols for measuring bio-optical and radiometric data for the Sensor Intercomparision and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) Project activities and algorithm development. The document is organized into 6 separate volumes as Ocean Optics Protocols for Satellite Ocean Color Sensor Validation, Revision 4. Volume I: Introduction, Background, and Conventions; Volume II: Instrument Specifications, Characterization and Calibration; Volume III: Radiometric Measurements and Data Analysis Methods; Volume IV: Inherent Optical Properties: Instruments, Characterization, Field Measurements and Data Analysis Protocols; Volume V: Biogeochemical and Bio-Optical Measurements and Data Analysis Methods; Volume VI: Special Topics in Ocean Optics Protocols and Appendices. The earlier version of Ocean Optics Protocols for Satellite Ocean Color Sensor Validation, Revision 3 is entirely superseded by the six volumes of Revision 4 listed above.

  5. Applicability of contact angle techniques used in the analysis of contact lenses, part 1: comparative methodologies.

    Science.gov (United States)

    Campbell, Darren; Carnell, Sarah Maria; Eden, Russell John

    2013-05-01

    Contact angle, as a representative measure of surface wettability, is often employed to interpret contact lens surface properties. The literature is often contradictory and can lead to confusion. This literature review is part of a series regarding the analysis of hydrogel contact lenses using contact angle techniques. Here we present an overview of contact angle terminology, methodology, and analysis. Having discussed this background material, subsequent parts of the series will discuss the analysis of contact lens contact angles and evaluate differences in published laboratory results. The concepts of contact angle, wettability and wetting are presented as an introduction. Contact angle hysteresis is outlined, highlighting the advantages of dynamic analytical techniques over static methods. The surface free energy of a material illustrates how contact angle analysis is capable of providing supplementary surface characterization. Although single values are able to distinguish individual material differences, surface free energy and dynamic methods provide an improved understanding of material behavior. The frequently used sessile drop, captive bubble, and Wilhelmy plate techniques are discussed. Their use as both dynamic and static methods, along with the advantages and disadvantages of each technique, is explained. No single contact angle technique fully characterizes the wettability of a material surface, and the application of complementary methods allows increased characterization. At present, there is no ISO standard method designed for soft materials. It is important that each contact angle technique has a standard protocol, as small protocol differences between laboratories often contribute to a variety of published data that are not easily comparable.

  6. Protocol for chromosome-specific probe construction using PRINS, micromanipulation and DOP-PCR techniques

    Directory of Open Access Journals (Sweden)

    PAULO Z. PASSAMANI

    2017-12-01

    Full Text Available Chromosome-specific probes have been widely used in molecular cytogenetics and can be obtained with different methods. In this study, a reproducible protocol for the construction of chromosome-specific probes is proposed which combines in situ amplification (PRINS), micromanipulation and degenerate oligonucleotide-primed PCR (DOP-PCR). Human lymphocyte cultures were used to obtain metaphases from male and female individuals. The chromosomes were amplified via PRINS, and subcentromeric fragments of the X chromosome were microdissected using microneedles coupled to a phase contrast microscope. The fragments were amplified by DOP-PCR and labeled with tetramethyl-rhodamine-5-dUTP. The probes were used in a fluorescence in situ hybridization (FISH) procedure to highlight these specific regions in the metaphases. The results show one fluorescent red spot in males and two in females, on the X chromosomes and in interphase nuclei.

  7. The effect of personalized versus standard patient protocols for radiostereometric analysis (RSA).

    Science.gov (United States)

    Muharemovic, O; Troelsen, A; Thomsen, M G; Kallemose, T; Gosvig, K K

    2018-05-01

    Increasing pressure in the clinic requires a more standardized approach to radiostereometric analysis (RSA) imaging. The aim of this study was to investigate whether implementation of personalized RSA patient protocols could increase image quality and decrease examination time and the number of exposure repetitions. Forty patients undergoing primary total hip arthroplasty were equally randomized to either a case or a control group. Radiographers in the case group were assisted by personalized patient protocols containing information about each patient's post-operative RSA imaging. Radiographers in the control group used a standard RSA protocol. At three months, radiographers in the case group significantly reduced (p ... RSA patient protocols have a positive effect on image quality and radiation dose savings. Implementation of personal patient protocols as an RSA standard will contribute to the reduction of examination time, thus ensuring a cost benefit for department and patient safety. Copyright © 2017 The College of Radiographers. Published by Elsevier Ltd. All rights reserved.

  8. Flame analysis using image processing techniques

    Science.gov (United States)

    Her Jie, Albert Chang; Zamli, Ahmad Faizal Ahmad; Zulazlan Shah Zulkifli, Ahmad; Yee, Joanne Lim Mun; Lim, Mooktzeng

    2018-04-01

    This paper presents image processing techniques using fuzzy logic and a neural network approach to perform flame analysis. Flame diagnostics are important in industry for extracting relevant information from flame images. Experimental tests were carried out in a model industrial burner with different flow rates. Flame features such as luminous and spectral parameters are extracted using image processing and the Fast Fourier Transform (FFT). Flame images are acquired using a FLIR infrared camera. Non-linearities such as thermoacoustic oscillations and background noise affect the stability of the flame. Flame velocity is one of the important characteristics that determine flame stability. In this paper, an image processing method is proposed to determine flame velocity. A power spectral density (PSD) graph is a good tool for vibration analysis, from which flame stability can be approximated. However, a more intelligent diagnostic system is needed to automatically determine flame stability. In this paper, flame features at different flow rates are compared and analyzed. The selected flame features are used as inputs to the proposed fuzzy inference system to determine flame stability. A neural network is used to test the performance of the fuzzy inference system.
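
    The PSD step is easy to illustrate. A minimal sketch (the synthetic luminosity signal and all parameters - frame rate, oscillation frequency, noise level - are invented, and scipy's Welch estimator stands in for whatever estimator the authors used):

```python
import numpy as np
from scipy.signal import welch

fs = 1000.0                          # IR camera frame rate in Hz (assumed)
t = np.arange(0, 2.0, 1 / fs)
# Toy luminous-intensity signal: a 120 Hz thermoacoustic oscillation + noise
signal = 1.0 + 0.3 * np.sin(2 * np.pi * 120 * t) + 0.05 * np.random.randn(t.size)

f, psd = welch(signal, fs=fs, nperseg=512)   # Welch PSD estimate
peak = f[np.argmax(psd[1:]) + 1]             # skip the DC bin
print(f"Dominant flame oscillation near {peak:.0f} Hz")
```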

  9. Analysis of transmission speed of AX.25 Protocol implemented in satellital earth station UPTC

    Directory of Open Access Journals (Sweden)

    Oscar Fernando Vera Cely

    2015-11-01

    Full Text Available One of the important parameters for the proper functioning of the satellite ground station planned at the Pedagogical and Technological University of Colombia (UPTC) is the efficiency of the transmission speed of the communications protocol. This paper shows the results of an analysis of the transmission speed of the AX.25 protocol implemented in the communication system of the UPTC satellite ground station. It begins with a brief description of the implemented hardware; the behaviour of the transmission rate is then evaluated using a theoretical analysis based on equations that estimate this parameter in the operation of the protocol; tests are performed using the hardware of the UPTC satellite ground station; and finally, the conclusions are presented. Based on a comparison of the theoretical analysis with the results obtained experimentally, it became apparent that AX.25 protocol efficiency is higher when the number of frames is increased.
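
    The protocol-efficiency side of such an analysis reduces to frame-overhead arithmetic. A rough sketch (UI-frame field sizes per the AX.25 specification; bit stuffing, acknowledgements and transmitter key-up delays are ignored, and the 1200 baud figure is an assumption typical of amateur packet radio):

```python
def ax25_efficiency(info_bytes: int, baud: int = 1200):
    """Rough efficiency of a single AX.25 UI frame:
    flag (1) + addresses (14) + control (1) + PID (1)
    + info field + FCS (2) + flag (1) bytes."""
    overhead = 1 + 14 + 1 + 1 + 2 + 1
    efficiency = info_bytes / (overhead + info_bytes)
    return efficiency, efficiency * baud   # effective payload bit/s

for n in (32, 128, 256):
    eff, rate = ax25_efficiency(n)
    print(f"{n:3d} info bytes: efficiency {eff:.2f}, ~{rate:.0f} bit/s payload")
```

    The same arithmetic shows why efficiency rises with larger (or more fully packed) frames: the 20-byte fixed overhead is amortized over more payload, consistent with the experimental trend reported above.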

  10. Analysis of obsidians by PIXE technique

    International Nuclear Information System (INIS)

    Nuncio Q, A.E.

    1998-01-01

    This work presents the characterization of obsidian samples from different mineral sites in Mexico, undertaken by an ion beam analysis technique, PIXE (Proton Induced X-ray Emission). As part of an intensive investigation of obsidian in Mesoamerica by anthropologists from Mexico's National Institute of Anthropology and History, 818 samples were collected from different volcanic sources in central Mexico for the purpose of establishing a data bank of element concentrations for each source. Part of this collection was analyzed by neutron activation analysis, and most of the important element concentrations were reported. In this work, a non-destructive IBA technique (PIXE) is used to analyze obsidian samples. The application of this technique was carried out at the laboratories of the ININ Nuclear Center facilities. The samples consisted of obsidians from ten different volcanic sources. The pieces were mounted on a sample holder designed for the purpose of exposing each sample to the proton beam. The PIXE analysis was carried out with an ET Tandem Accelerator at the ININ. X-ray spectrometry was carried out with an external beam facility employing a Si(Li) detector set at 52.5 degrees relative to the target normal (parallel to the beam direction) and 4.2 cm away from the target center. A filter was set in front of the detector to determine the best attenuation conditions for obtaining most of the elements, taking into account that X-ray spectra from obsidians are dominated by intense major-element lines. Thus, a 28 μm-thick aluminium foil absorber was selected and used to reduce the intensity of the major lines as well as pile-up effects. The mean proton energy was 2.62 MeV, and the beam profile was about 4 mm in diameter. As a result, elemental concentrations were found for a set of samples from ten different sources: Altotonga (Veracruz), Penjamo (Guanajuato), Otumba (Mexico), Zinapecuaro (Michoacan), Ucareo (Michoacan), Tres Cabezas (Puebla), Sierra Navajas (Hidalgo), Zaragoza

  11. Handbook of Qualitative Research Techniques and Analysis in Entrepreneurship

    DEFF Research Database (Denmark)

    One of the most challenging tasks in the research design process is choosing the most appropriate data collection and analysis techniques. This Handbook provides a detailed introduction to five qualitative data collection and analysis techniques pertinent to exploring entrepreneurial phenomena...

  12. Performance Analysis of On-Demand Routing Protocols in Wireless Mesh Networks

    Directory of Open Access Journals (Sweden)

    Arafatur RAHMAN

    2009-01-01

    Full Text Available Wireless Mesh Networks (WMNs) have recently gained a lot of popularity due to their rapid deployment and instant communication capabilities. WMNs are dynamically self-organizing, self-configuring and self-healing, with the nodes in the network automatically establishing an ad hoc network and preserving the mesh connectivity. Designing a routing protocol for WMNs requires several aspects to be considered, such as wireless networks, fixed applications, mobile applications, scalability, better performance metrics, efficient routing within the infrastructure, load balancing, throughput enhancement, interference, robustness, etc. To support communication, various routing protocols have been designed for various networks (e.g., ad hoc, sensor, wired, etc.). However, not all of these protocols are suitable for WMNs, because of the architectural differences among the networks. In this paper, a detailed simulation-based performance study and analysis is performed on reactive routing protocols to verify their suitability for such networks. Ad Hoc On-Demand Distance Vector (AODV), Dynamic Source Routing (DSR) and Dynamic MANET On-demand (DYMO) are considered as representatives of the reactive routing protocols. The performance differentials are investigated under varying traffic load and number of sources. Based on the simulation results, recommendations are also made on how the performance of each protocol can be improved.

  13. Techniques and Protocols for Dispersing Nanoparticle Powders in Aqueous Media-Is there a Rationale for Harmonization?

    Science.gov (United States)

    Hartmann, Nanna B; Jensen, Keld Alstrup; Baun, Anders; Rasmussen, Kirsten; Rauscher, Hubert; Tantra, Ratna; Cupi, Denisa; Gilliland, Douglas; Pianella, Francesca; Riego Sintes, Juan M

    2015-01-01

    Selecting appropriate ways of bringing engineered nanoparticles (ENP) into aqueous dispersion is a main obstacle for testing, and thus for understanding and evaluating, their potential adverse effects to the environment and human health. Using different methods to prepare (stock) dispersions of the same ENP may be a source of variation in the toxicity measured. Harmonization and standardization of dispersion methods applied in mammalian and ecotoxicity testing are needed to ensure a comparable data quality and to minimize test artifacts produced by modifications of ENP during the dispersion preparation process. Such harmonization and standardization will also enhance comparability among tests, labs, and studies on different types of ENP. The scope of this review was to critically discuss the essential parameters in dispersion protocols for ENP. The parameters are identified from individual scientific studies and from consensus reached in larger scale research projects and international organizations. A step-wise approach is proposed to develop tailored dispersion protocols for ecotoxicological and mammalian toxicological testing of ENP. The recommendations of this analysis may serve as a guide to researchers, companies, and regulators when selecting, developing, and evaluating the appropriateness of dispersion methods applied in mammalian and ecotoxicity testing. However, additional experimentation is needed to further document the protocol parameters and investigate to what extent different stock dispersion methods affect ecotoxicological and mammalian toxicological responses of ENP.

  14. Techniques and Applications of Urban Data Analysis

    KAUST Repository

    AlHalawani, Sawsan N.

    2016-05-26

    Digitization and characterization of urban spaces are essential components as we move to an ever-growing ’always connected’ world. Accurate analysis of such digital urban spaces has become more important as we continue to get spatial and social context-aware feedback and recommendations in our daily activities. Modeling and reconstruction of urban environments have thus gained unprecedented importance in the last few years. Such analysis typically spans multiple disciplines, such as computer graphics and computer vision, as well as architecture, geoscience, and remote sensing. Reconstructing an urban environment usually requires an entire pipeline consisting of different tasks, in which data analysis plays a strong role in acquiring meaningful insights from the raw data. This dissertation primarily focuses on the analysis of various forms of urban data and proposes a set of techniques to extract useful information, which is then used for different applications. The first part of this dissertation presents a semi-automatic framework to analyze facade images to recover individual windows along with their functional configurations, such as open or (partially) closed states. The main advantage of recovering both the repetition patterns of windows and their individual deformation parameters is to produce a factored facade representation. Such a factored representation enables a range of applications including interactive facade images, improved multi-view stereo reconstruction, facade-level change detection, and novel image editing possibilities. The second part of this dissertation demonstrates the effect of a layout configuration on its performance. As a specific application scenario, I investigate the interior layout of warehouses, where the goal is to assign items to storage locations while reducing flow congestion and enhancing the speed of order-picking processes. The third part of the dissertation proposes a method to classify cities

  15. Numerical modeling techniques for flood analysis

    Science.gov (United States)

    Anees, Mohd Talha; Abdullah, K.; Nawawi, M. N. M.; Ab Rahman, Nik Norulaini Nik; Piah, Abd. Rahni Mt.; Zakaria, Nor Azazi; Syakir, M. I.; Mohd. Omar, A. K.

    2016-12-01

    Topographic and climatic changes are the main causes of abrupt flooding in tropical areas, and there is a need to determine the exact causes and effects of these changes. Numerical modeling techniques play a vital role in such studies due to their use of hydrological parameters, which are strongly linked with topographic changes. In this review, some of the widely used models utilizing hydrological and river modeling parameters, and the estimation of those parameters in data-sparse regions, are discussed. Shortcomings of 1D and 2D numerical models and possible improvements over these models through 3D modeling are also discussed. It is found that the HEC-RAS and FLO 2D models are best in terms of economical and accurate flood analysis for river and floodplain modeling, respectively. Limitations of FLO 2D in floodplain modeling, mainly floodplain elevation differences and vertical roughness within grid cells, were found; these can be improved through a 3D model. Therefore, a 3D model was found to be more suitable than 1D and 2D models in terms of vertical accuracy in grid cells. It was also found that 3D models for open channel flows have been developed recently, but not for floodplains. Hence, it is suggested that a 3D model for floodplains should be developed, considering all the hydrological and high-resolution topographic parameter models discussed in this review, to enhance the findings on the causes and effects of flooding.

  16. Analyzing the effect of routing protocols on media access control protocols in radio networks

    Energy Technology Data Exchange (ETDEWEB)

    Barrett, C. L. (Christopher L.); Drozda, M. (Martin); Marathe, A. (Achla); Marathe, M. V. (Madhav V.)

    2002-01-01

    We study the effect of routing protocols on the performance of media access control (MAC) protocols in wireless radio networks. Three well-known MAC protocols are considered: 802.11, CSMA, and MACA. Similarly, three recently proposed routing protocols are considered: AODV, DSR and LAR scheme 1. The experimental analysis was carried out using GloMoSim, a tool for simulating wireless networks. The main focus of our experiments was to study how the routing protocols affect the performance of the MAC protocols when the underlying network and traffic parameters are varied. The performance of the protocols was measured w.r.t. five important parameters: (i) number of received packets, (ii) average latency of each packet, (iii) throughput, (iv) long-term fairness and (v) number of control packets at the MAC layer level. Our results show that combinations of routing and MAC protocols yield varying performance under varying network topology and traffic situations. The result has an important implication: no combination of routing protocol and MAC protocol is the best over all situations. Also, the performance of protocols at a given level in the protocol stack needs to be studied not locally in isolation but as a part of the complete protocol stack. A novel aspect of our work is the use of a statistical technique, ANOVA (Analysis of Variance), to characterize the effect of routing protocols on MAC protocols. This technique is of independent interest and can be utilized in several other simulation and empirical studies.
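
    A sketch of the ANOVA step using scipy's one-way test on per-replicate throughput for three routing protocols; the sample values are invented for illustration.

      from scipy.stats import f_oneway

      aodv = [412, 398, 405, 420, 388]   # throughput per simulation replicate
      dsr = [375, 362, 380, 371, 369]
      lar = [401, 415, 396, 408, 392]

      f_stat, p_value = f_oneway(aodv, dsr, lar)
      print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
      # A small p-value indicates that the choice of routing protocol has a
      # statistically significant effect on the MAC-layer metric.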

  17. A Survey of Congestion Control Techniques and Data Link Protocols in Satellite Networks

    OpenAIRE

    Fahmy, Sonia; Jain, Raj; Lu, Fang; Kalyanaraman, Shivkumar

    1998-01-01

    Satellite communication systems are the means of realizing a global broadband integrated services digital network. Due to the statistical nature of the integrated services traffic, the resulting rate fluctuations and burstiness render congestion control a complicated, yet indispensable function. The long propagation delay of the earth-satellite link further imposes severe demands and constraints on the congestion control schemes, as well as the media access control techniques and retransmissi...

  18. Ultrasound assisted extraction of food and natural products. Mechanisms, techniques, combinations, protocols and applications. A review.

    Science.gov (United States)

    Chemat, Farid; Rombaut, Natacha; Sicaire, Anne-Gaëlle; Meullemiestre, Alice; Fabiano-Tixier, Anne-Sylvie; Abert-Vian, Maryline

    2017-01-01

    This review presents a complete picture of current knowledge on ultrasound-assisted extraction (UAE) in food ingredients and products, nutraceutics, cosmetic, pharmaceutical and bioenergy applications. It provides the necessary theoretical background and some details about extraction by ultrasound, the techniques and their combinations, the mechanisms (fragmentation, erosion, capillarity, detexturation, and sonoporation), applications from laboratory to industry, security, and environmental impacts. In addition, the ultrasound extraction procedures and the important parameters influencing performance are also included, together with the advantages and drawbacks of each UAE technique. Ultrasound-assisted extraction is a research topic that affects several fields of modern plant-based chemistry. All the reported applications have shown that ultrasound-assisted extraction is a green and economically viable alternative to conventional techniques for food and natural products. The main benefits are decreases in extraction and processing time, the amount of energy and solvents used, unit operations, and CO2 emissions. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Analysis of NASA communications (Nascom) II network protocols and performance

    Science.gov (United States)

    Omidyar, Guy C.; Butler, Thomas E.

    1991-01-01

    The NASA Communications (Nascom) Division of the Mission Operations and Data Systems Directorate is to undertake a major initiative to develop the Nascom II (NII) network to achieve its long-range service objectives for operational data transport to support the Space Station Freedom Program, the Earth Observing System, and other projects. NII is the Nascom ground communications network being developed to accommodate the operational traffic of the mid-1990s and beyond. The authors describe various baseline protocol architectures based on current and evolving technologies. They address the internetworking issues suggested for reliable transfer of data over heterogeneous segments. They also describe the NII architecture, topology, system components, and services. A comparative evaluation of the current and evolving technologies was made, and suggestions for further study are described. It is shown that the direction of the NII configuration and the subsystem component design will clearly depend on the advances made in the area of broadband integrated services.

  20. Sylvia Plath: a protocol analysis of her last poems.

    Science.gov (United States)

    Leenaars, A A; Wenckstern, S

    1998-01-01

    Personal documents have a significant place in psychological research. Suicide notes, diaries, novels, poems, and so on allow us to better understand the suicidal mind. The works of Sylvia Plath--a poet who killed herself at age 30--are prime examples for such protocol study. This article examines the last 6 months of Plath's poetry, revealing a suicidal malaise. Associating the results to the lives of Cesare Pavese and the case study of Natalie, a Terman-Shneidman subject of the intellectually gifted, the study shows a unit thema that facilitates the process of death. The poems reveal such themes as unbearable pain, loss, and abandonment that likely contributed significantly to death becoming the only solution.

  1. Sentinel Lymph Node Biopsy (SLNB) for Breast Cancer (BC) - Validation Protocol of the Technique

    International Nuclear Information System (INIS)

    Blidaru, A.; Bordea, C.I.; Condrea, Ileana; Albert, Paul

    2006-01-01

    Full text: The sentinel ganglion concept originates in the assumption that the primary tumor drains into a specific ganglionar area and then runs through the lymphatic nodes in an orderly, sequential mode. When neoplastic dissemination along the lymphatic pathway occurs, there is an initial invasion of a specific lymph node (rarely more than one) located on the drainage route. That first lymph node has been identified as the sentinel node, which mirrors the regional ganglionar status. In order to establish the indication for lymphadenectomy and avoid situations in which such a surgical procedure would be of no use (N-), the only correct method consists in the identification and biopsy of the sentinel node. Radioactive tracing and/or the use of vital staining enable the identification of the regional ganglionar group towards which the primary lesion is draining. The technique of sentinel lymph node identification and biopsy by means of radioactive tracing includes: - pre-surgical lymphoscintigraphy, - identification of the sentinel lymph node and its excisional biopsy, - intra-operative histopathological examination and immunohistochemical staining of the sentinel lymph node. Regional lymphadenectomy serves two major purposes: diagnostic (axillary lymph node invasion represents an important prognostic factor) and therapeutic (to ensure local control of the disease). Regional lymph node invasion in breast cancer is directly related to the primary tumour size. In the less advanced stages (T1), as there is rarely invasion of the axillary lymph nodes, lymphadenectomy can be avoided in most cases. The paper presents the refinement of the technique and the validation of the method for the identification and biopsy of the sentinel lymph node in breast cancer, using Tc99 and the intra-operative use of a NEOPROBE 2000 gamma probe at the 'Prof. Dr. Alexandru Trestioreanu' Oncological Institute in Bucharest. 93 patients with primary breast cancer (T1, T2, N0

  2. Protocol: An updated integrated methodology for analysis of metabolites and enzyme activities of ethylene biosynthesis

    Directory of Open Access Journals (Sweden)

    Geeraerd Annemie H

    2011-06-01

    Full Text Available Abstract Background The foundations for ethylene research were laid many years ago by researchers such as Lizada, Yang and Hoffman. Nowadays, most of the methods developed by them are still being used. Technological developments since then have led to small but significant improvements, contributing to a more efficient workflow. Despite this, many of these improvements have never been properly documented. Results This article provides an updated, integrated set of protocols suitable for the assembly of a complete picture of ethylene biosynthesis, including the measurement of ethylene itself. The original protocols for the metabolites 1-aminocyclopropane-1-carboxylic acid and 1-(malonylamino)cyclopropane-1-carboxylic acid have been updated and downscaled, while protocols to determine in vitro activities of the key enzymes 1-aminocyclopropane-1-carboxylate synthase and 1-aminocyclopropane-1-carboxylate oxidase have been optimised for efficiency, repeatability and accuracy. All the protocols described were optimised for apple fruit, but have been proven to be suitable for the analysis of tomato fruit as well. Conclusions This work collates an integrated set of detailed protocols for the measurement of components of the ethylene biosynthetic pathway, starting from well-established methods. These protocols have been optimised for smaller sample volumes, increased efficiency, repeatability and accuracy. The detailed protocols allow other scientists to rapidly implement these methods in their own laboratories in a consistent and efficient way.

  3. Quantitative blood flow analysis with digital techniques

    International Nuclear Information System (INIS)

    Forbes, G.

    1984-01-01

    The general principles of digital techniques in quantitating absolute blood flow during arteriography are described. Results are presented for a phantom constructed to correlate digitally calculated absolute flow with direct flow measurements. The clinical use of digital techniques in cerebrovascular angiography is briefly described. (U.K.)

  4. Staged protocol for the treatment of chronic femoral shaft osteomyelitis with Ilizarov's technique followed by the use of intramedullary locked nail

    Directory of Open Access Journals (Sweden)

    Po-Hsin Chou

    2017-06-01

    Conclusion: In the treatment of chronic femur osteomyelitis, the staged protocol of Ilizarov distraction osteogenesis followed by intramedullary nailing was safe and successful, and allowed for union, realignment, reorientation, and leg-length restoration. With regard to the soft tissue, this technique provides a unique type of reconstructive closure for infected wounds. It is suggested that the staged protocol is reliable in providing successful simultaneous reconstruction for bone and soft tissue defects without flap coverage.

  5. Specimen preparation, imaging, and analysis protocols for knife-edge scanning microscopy.

    Science.gov (United States)

    Choe, Yoonsuck; Mayerich, David; Kwon, Jaerock; Miller, Daniel E; Sung, Chul; Chung, Ji Ryang; Huffman, Todd; Keyser, John; Abbott, Louise C

    2011-12-09

    Major advances in high-throughput, high-resolution, 3D microscopy techniques have enabled the acquisition of large volumes of neuroanatomical data at submicrometer resolution. One of the first such instruments producing whole-brain-scale data is the Knife-Edge Scanning Microscope (KESM), developed and hosted in the authors' lab. KESM has been used to section and image whole mouse brains at submicrometer resolution, revealing the intricate details of the neuronal networks (Golgi), vascular networks (India ink), and cell body distribution (Nissl). The use of KESM is restricted to neither the mouse nor the brain: we have successfully imaged the octopus brain, mouse lung, and rat brain, and are currently working on whole zebrafish embryos. Data like these can greatly contribute to connectomics research, to microcirculation and hemodynamic research, and to stereology research by providing exact ground truth. In this article, we describe the pipeline, including specimen preparation (fixing, staining, and embedding), KESM configuration and setup, sectioning and imaging with the KESM, image processing, data preparation, and data visualization and analysis. The emphasis is on specimen preparation and visualization/analysis of the obtained KESM data. We expect the detailed protocol presented in this article to help broaden access to KESM and increase its utilization.

  6. Optimization of DNA isolation and PCR protocol for RAPD analysis ...

    African Journals Online (AJOL)

    The method involves a modified CTAB extraction employing polyvinyl... The technique is ideal for isolation of DNA from different plant species and... The tubes were incubated at 65°C in a hot air oven or water bath for 60-90 min with intermittent shaking...

  7. Imaging techniques and investigation protocols in pediatric emergency imaging; Aufnahmetechnik und Untersuchungsprotokolle beim paediatrischen Notfall

    Energy Technology Data Exchange (ETDEWEB)

    Scharitzer, M.; Hoermann, M.; Puig, S.; Prokop, M. [Universitaetsklinik fuer Radiodiagnostik, Wien (Austria)

    2002-03-01

    Paediatric emergencies demand a quick and efficient radiological investigation, with special attention to specific adjustments related to patient age and radiation protection. Imaging modalities are improving rapidly and enable childhood diseases and injuries to be diagnosed more quickly, accurately and safely. This article provides an overview of imaging techniques adjusted to the age of the child and an overview of imaging strategies for common paediatric emergencies. Optimising the imaging parameters (digital radiography, different screen-film systems, exposure specifications) allows a substantial reduction of the radiation dose. Spiral and multislice CT reduce scan time and enable a considerable reduction of radiation exposure if scanning parameters (pitch setting, tube current) are properly adjusted. MRI is still mainly used for neurological or spinal emergencies despite the advent of fast imaging sequences. The radiologist's task is to select an appropriate imaging strategy according to the expected differential diagnosis and to adjust the imaging techniques to the individual patient. (orig.) [German] The acutely ill child requires rapid radiological work-up, with particular attention to the adjusted examination parameters and, at the same time, high demands on radiation protection. High-resolution transducers, multislice CT and fast MR sequences allow the examination methods to be better adapted to the needs of paediatric radiology. The aim of this article is to give an overview of the various radiological examination techniques, their adaptation to paediatric requirements, and examination algorithms for the most common paediatric emergencies. In projection radiography, optimisation of the imaging technique (digital radiography, different classes of screen-film systems, exposure parameters) allows a marked reduction of the radiation dose at diagnostically sufficient quality. Spiral or

  8. Comparative analysis of protocols for DNA extraction from soybean caterpillars.

    Science.gov (United States)

    Palma, J; Valmorbida, I; da Costa, I F D; Guedes, J V C

    2016-04-07

    Genomic DNA extraction is crucial for molecular research, including diagnostics and genome characterization of different organisms. The aim of this study was to comparatively analyze protocols of DNA extraction based on cell lysis by sarcosyl, cetyltrimethylammonium bromide, and sodium dodecyl sulfate, and to determine the most efficient method applicable to soybean caterpillars. DNA was extracted from specimens of Chrysodeixis includens and Spodoptera eridania using the aforementioned three methods. DNA quantification was performed using spectrophotometry and high molecular weight DNA ladders. The purity of the extracted DNA was determined by calculating the A260/A280 ratio. Cost and time for each DNA extraction method were estimated and analyzed statistically. The amount of DNA extracted by all three methods was sufficient for PCR amplification. The sarcosyl method yielded DNA of higher purity, because it generated a clearer pellet without viscosity, and yielded high-quality amplification products of the COI gene. The sarcosyl method showed the lowest cost per extraction and did not differ from the other methods with respect to preparation time. Cell lysis by sarcosyl represents the best method for DNA extraction in terms of yield, quality, and cost-effectiveness.
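
    A hypothetical helper for the spectrophotometric checks described above: concentration from A260 and purity from the A260/A280 ratio. The 50 ng/uL conversion factor for double-stranded DNA and the roughly 1.7-2.0 acceptance window are standard laboratory conventions, not values reported in this study.

      def dna_quality(a260, a280, dilution=1.0):
          """Return (concentration in ng/uL for dsDNA, A260/A280 ratio, ok?)."""
          concentration = a260 * 50.0 * dilution
          ratio = a260 / a280
          return concentration, ratio, 1.7 <= ratio <= 2.0

      conc, ratio, ok = dna_quality(0.65, 0.35)
      print(f"{conc:.0f} ng/uL, A260/A280 = {ratio:.2f}, acceptable: {ok}")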

  9. A content analysis of posthumous sperm procurement protocols with considerations for developing an institutional policy.

    Science.gov (United States)

    Bahm, Sarah M; Karkazis, Katrina; Magnus, David

    2013-09-01

    To identify and analyze existing posthumous sperm procurement (PSP) protocols in order to outline central themes for institutions to consider when developing future policies. Qualitative content analysis. Large academic institutions across the United States. We performed a literature search and contacted 40 institutions to obtain nine full PSP protocols. We then performed a content analysis on these policies to identify major themes and factors to consider when developing a PSP protocol. Presence of a PSP policy. We identified six components of a thorough PSP protocol: Standard of Evidence, Terms of Eligibility, Sperm Designee, Restrictions on Use in Reproduction, Logistics, and Contraindications. We also identified two different approaches to policy structure. In the Limited Role approach, institutions have stricter consent requirements and limit their involvement to the time of procurement. In the Family-Centered approach, substituted judgment is permitted but a mandatory wait period is enforced before sperm use in reproduction. Institutions seeking to implement a PSP protocol will benefit from considering the six major building blocks of a thorough protocol and where they would like to fall on the spectrum from a Limited Role to a Family-Centered approach. Copyright © 2013 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  10. Static Validation of Security Protocols

    DEFF Research Database (Denmark)

    Bodei, Chiara; Buchholtz, Mikael; Degano, P.

    2005-01-01

    We methodically expand protocol narrations into terms of a process algebra in order to specify some of the checks that need to be made in a protocol. We then apply static analysis technology to develop an automatic validation procedure for protocols. Finally, we demonstrate that these techniques ... suffice to identify several authentication flaws in symmetric and asymmetric key protocols such as Needham-Schroeder symmetric key, Otway-Rees, Yahalom, Andrew secure RPC, Needham-Schroeder asymmetric key, and Beller-Chang-Yacobi MSR...

  11. Real analysis modern techniques and their applications

    CERN Document Server

    Folland, Gerald B

    1999-01-01

    An in-depth look at real analysis and its applications, now expanded and revised. This new edition of the widely used analysis book continues to cover real analysis in greater detail and at a more advanced level than most books on the subject. Encompassing several subjects that underlie much of modern analysis, the book focuses on measure and integration theory, point set topology, and the basics of functional analysis. It illustrates the use of the general theories and introduces readers to other branches of analysis such as Fourier analysis, distribution theory, and probability theory. This edi

  12. Study of an optimization protocol for radiographic techniques in computerized radiology

    International Nuclear Information System (INIS)

    Abrantes, Marcos Eugenio Silva

    2015-01-01

    This work aims to produce information for improving image quality for deployment in a radiology department, based on prior review of images through acceptance and quality questionnaires and identification of the parameters used in chest radiographic techniques. The data collected were divided by sex (male, female), PA and LAT thickness, body mass index, biotype, and anthropomorphic parameters, together with the constant voltage and additional filtration used. The results show a predominance of constants of 35 and 40 with additional filtration of 0.5 to 1.5 mmAl; voltages for males (PA and LAT): 86-92 kV and 96-112 kV; for females: 85-98 kV and 96-112 kV. The tube charge for males (PA and LAT): 5-10 mA.s and 5-16 mA.s; for females (PA and LAT): 6.3-8 mA.s and 9-14 mA.s. Absorbed doses for males (PA and LAT): 0.04-0.17 mGy and 0.03-0.19 mGy, and for females (PA and LAT): 0.03-0.22 mGy and 0.04-0.17 mGy. This procedure can be used in a radiology department for the implementation and acceptance of image quality. (author)

  13. Techniques involving extreme environment, nondestructive techniques, computer methods in metals research, and data analysis

    International Nuclear Information System (INIS)

    Bunshah, R.F.

    1976-01-01

    A number of different techniques spanning several different aspects of materials research are covered in this volume. They are concerned with property evaluation at 4 K and below, surface characterization, coating techniques, techniques for the fabrication of composite materials, computer methods, data evaluation and analysis, statistical design of experiments, and non-destructive test techniques. Topics covered in this part include internal friction measurements; nondestructive testing techniques; statistical design of experiments and regression analysis in metallurgical research; and measurement of surfaces of engineering materials.

  14. Application of functional analysis techniques to supervisory systems

    International Nuclear Information System (INIS)

    Lambert, Manuel; Riera, Bernard; Martel, Gregory

    1999-01-01

    The aim of this paper is, first, to apply two functional analysis techniques to the design of supervisory systems for complex processes and, second, to discuss the strengths and weaknesses of each of them. Two functional analysis techniques, SADT (Structured Analysis and Design Technique) and FAST (Functional Analysis System Technique), have been applied to an example process, a Water Supply Process Control (WSPC) system. These techniques allow a functional description of industrial processes. The paper briefly discusses the functions of a supervisory system and some advantages of applying functional analysis to the design of a 'human'-centered supervisory system. The basic principles of the two techniques applied to the WSPC system are then presented. Finally, the different results obtained from the two techniques are discussed.

  15. IMAGE ANALYSIS BASED ON EDGE DETECTION TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    A method that incorporates an edge detection technique, Markov Random Fields (MRF), watershed segmentation, and merging techniques is presented for performing image segmentation and edge detection tasks. Edge detection is first applied to obtain a Difference In Strength (DIS) map. An initial segmentation is obtained using K-means clustering and the minimum-distance criterion. The region process is then modeled by an MRF to obtain an image containing regions of different intensity. Gradient values are calculated and the watershed technique is applied. The DIS value is computed for each pixel to identify all edges (weak or strong) in the image, yielding the DIS map. This serves as prior knowledge about possible region segmentations for the next step (MRF), which produces an image containing all edge and region information. In the MRF model, the gray level l at pixel location i in an image X depends on the gray levels of neighboring pixels. The segmentation results are improved using the watershed algorithm. After all pixels of the segmented regions have been processed, a map of primitive regions with edges is generated. The edge map is obtained through a merging process based on averaged intensity mean values. Common edge detectors that work on the MRF-segmented image are applied and the results compared. The final segmentation and edge detection result is one closed boundary per actual region in the image.
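
    A minimal sketch of an edge-driven watershed segmentation in the spirit of the pipeline described (edge-strength map, marker seeding, region flooding), using scikit-image; the marker thresholds are illustrative assumptions.

      import numpy as np
      from skimage import data, filters, measure, segmentation

      image = data.coins()                       # stand-in grayscale image
      edges = filters.sobel(image)               # gradient map (edge strength)

      markers = np.zeros_like(image, dtype=int)  # seed background/foreground
      markers[image < 30] = 1
      markers[image > 150] = 2

      labels = segmentation.watershed(edges, markers)  # flood from the seeds
      regions = measure.label(labels == 2)             # individual regions
      print(f"{regions.max()} segmented regions")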

  16. MCNP perturbation technique for criticality analysis

    International Nuclear Information System (INIS)

    McKinney, G.W.; Iverson, J.L.

    1995-01-01

    The differential operator perturbation technique has been incorporated into the Monte Carlo N-Particle transport code MCNP and will become a standard feature of future releases. This feature includes first- and/or second-order terms of the Taylor series expansion for response perturbations related to cross-section data (e.g., density, composition, etc.). Criticality analyses can benefit from this technique in that predicted changes in the track-length tally estimator of keff may be obtained for multiple perturbations in a single run. A key advantage of this method is that a precise estimate of a small change in response (i.e., < 1%) is easily obtained. This technique can also offer acceptable accuracy, to within a few percent, for up to 20-30% changes in a response.
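
    In generic form (not MCNP's internal estimator notation), the perturbed response R, e.g. the track-length estimate of keff, is predicted from the first two Taylor terms in the varied cross-section parameter c:

      \Delta R \;\approx\; \frac{\partial R}{\partial c}\,\Delta c
                \;+\; \frac{1}{2}\,\frac{\partial^{2} R}{\partial c^{2}}\,(\Delta c)^{2}

    The first-order term suffices for very small perturbations; the second-order term extends the usable range, consistent with the 20-30% figure quoted above.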

  17. Upper Midwest Gap Analysis Program, Image Processing Protocol

    National Research Council Canada - National Science Library

    Lillesand, Thomas

    1998-01-01

    This document presents a series of technical guidelines by which land cover information is being extracted from Landsat Thematic Mapper data as part of the Upper Midwest Gap Analysis Program (UMGAP...

  18. Cost analysis of hybrid adaptive routing protocol for heterogeneous ...

    Indian Academy of Sciences (India)

    NONITA SHARMA

    Keywords: event detection; wireless sensor networks; hybrid routing; cost benefit analysis; proactive routing; reactive routing.

  19. Simple Public Key Infrastructure Protocol Analysis and Design

    National Research Council Canada - National Science Library

    Vidergar, Alexander G

    2005-01-01

    ...). This thesis aims at proving the applicability of the Simple Public Key Infrastructure (SPKI) as a means of PKC. The strand space approach of Guttman and Thayer is used to provide an appropriate model for analysis...

  20. Data Analysis Techniques for Physical Scientists

    Science.gov (United States)

    Pruneau, Claude A.

    2017-10-01

    Preface; How to read this book; 1. The scientific method; Part I. Foundation in Probability and Statistics: 2. Probability; 3. Probability models; 4. Classical inference I: estimators; 5. Classical inference II: optimization; 6. Classical inference III: confidence intervals and statistical tests; 7. Bayesian inference; Part II. Measurement Techniques: 8. Basic measurements; 9. Event reconstruction; 10. Correlation functions; 11. The multiple facets of correlation functions; 12. Data correction methods; Part III. Simulation Techniques: 13. Monte Carlo methods; 14. Collision and detector modeling; List of references; Index.

  1. Surface analysis and techniques in biology

    CERN Document Server

    Smentkowski, Vincent S

    2014-01-01

    This book highlights state-of-the-art surface analytical instrumentation, advanced data analysis tools, and the use of complementary surface analytical instrumentation to perform a complete analysis of biological systems.

  2. Practical security analysis of a quantum stream cipher by the Yuen 2000 protocol

    International Nuclear Information System (INIS)

    Hirota, Osamu

    2007-01-01

    There exists a great gap between the one-time pad with perfect secrecy and conventional mathematical encryption. The Yuen 2000 (Y00) protocol or αη scheme may provide a protocol that covers the range from conventional security to the ultimate one, depending on the implementation. This paper presents a complexity-theoretic security analysis of some models of the Y00 protocol with a nonlinear pseudo-random-number generator and quantum noise diffusion mapping (QDM). Algebraic attacks and fast correlation attacks are applied to a model of the Y00 protocol that uses nonlinear filtering, like the Toyocrypt stream cipher, as the running-key generator, and it is shown that these attacks in principle do not work on such models even when the mapping between running key and quantum state signal is fixed. In addition, a security property of the Y00 protocol with QDM is clarified. Consequently, we show that the Y00 protocol has a potential that cannot be realized by conventional cryptography and that it goes beyond mathematical encryption with physical encryption.

  3. IEEE 802.11 Wireless LANs: Performance Analysis and Protocol Refinement

    Directory of Open Access Journals (Sweden)

    Chatzimisios P.

    2005-01-01

    Full Text Available The IEEE 802.11 protocol is emerging as a widely used standard and has become the most mature technology for wireless local area networks (WLANs. In this paper, we focus on the tuning of the IEEE 802.11 protocol parameters taking into consideration, in addition to throughput efficiency, performance metrics such as the average packet delay, the probability of a packet being discarded when it reaches the maximum retransmission limit, the average time to drop a packet, and the packet interarrival time. We present an analysis, which has been validated by simulation that is based on a Markov chain model commonly used in the literature. We further study the improvement on these performance metrics by employing suitable protocol parameters according to the specific communication needs of the IEEE 802.11 protocol for both basic access and RTS/CTS access schemes. We show that the use of a higher initial contention window size does not considerably degrade performance in small networks and performs significantly better in any other scenario. Moreover, we conclude that the combination of a lower maximum contention window size and a higher retry limit considerably improves performance. Results indicate that the appropriate adjustment of the protocol parameters enhances performance and improves the services that the IEEE 802.11 protocol provides to various communication applications.
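
    The Markov chain analysis referenced above leads, in Bianchi's widely used formulation, to a fixed point coupling the per-slot transmission probability tau with the conditional collision probability p. A small iterative solver, with illustrative rather than paper-specific parameters (W initial contention window, m maximum backoff stage, n stations):

      def dcf_fixed_point(n=20, W=32, m=5, iters=200):
          tau = 0.1                                  # initial guess
          for _ in range(iters):
              p = 1.0 - (1.0 - tau) ** (n - 1)       # collision probability
              tau = (2.0 * (1.0 - 2.0 * p)) / (
                  (1.0 - 2.0 * p) * (W + 1) + p * W * (1.0 - (2.0 * p) ** m)
              )
          return tau, p

      tau, p = dcf_fixed_point()
      print(f"tau = {tau:.4f}, p = {p:.4f}")

    Sweeping W, m, and the retry limit through such a model is one way to reproduce the kind of parameter-tuning study the paper reports.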

  4. Establishment of a protocol for the gene expression analysis of laser microdissected rat kidney samples with affymetrix genechips

    International Nuclear Information System (INIS)

    Stemmer, Kerstin; Ellinger-Ziegelbauer, Heidrun; Lotz, Kerstin; Ahr, Hans-J.; Dietrich, Daniel R.

    2006-01-01

    Laser microdissection in conjunction with microarray technology allows selective isolation and analysis of specific cell populations, e.g., preneoplastic renal lesions. To date, only limited information is available on sample preparation and preservation techniques that result in both optimal histomorphological preservation of sections and high-quality RNA for microarray analysis. Furthermore, amplification of minute amounts of RNA from microdissected renal samples allowing analysis with genechips has only scantily been addressed to date. The objective of this study was therefore to establish a reliable and reproducible protocol for laser microdissection in conjunction with microarray technology using kidney tissue from Eker rats p.o. treated for 7 days and 6 months with 10 and 1 mg Aristolochic acid/kg bw, respectively. Kidney tissues were preserved in RNAlater or snap frozen. Cryosections were cut and stained with either H and E or cresyl violet for subsequent morphological and RNA quality assessment and laser microdissection. RNA quality was comparable in snap frozen and RNAlater-preserved samples, however, the histomorphological preservation of renal sections was much better following cryopreservation. Moreover, the different staining techniques in combination with sample processing time at room temperature can have an influence on RNA quality. Different RNA amplification protocols were shown to have an impact on gene expression profiles as demonstrated with Affymetrix Rat Genome 230 2.0 arrays. Considering all the parameters analyzed in this study, a protocol for RNA isolation from laser microdissected samples with subsequent Affymetrix chip hybridization was established that was also successfully applied to preneoplastic lesions laser microdissected from Aristolochic acid-treated rats

  5. Performance Analysis of an Enhanced PRMA-HS Protocol for LEO Satellite Communication

    Institute of Scientific and Technical Information of China (English)

    ZHUO Yong-ning; YAN Shao-hu; WU Shi-qi

    2005-01-01

    The packet reservation multiple access with hindering state (PRMA-HS) protocol is suitable for LEO satellite mobile communication. Although it works well with a light system payload (number of user terminals), the protocol imposes high channel congestion on a heavily loaded system, thus degrading the system's quality of service. To control the channel congestion, an enhanced PRMA-HS protocol is proposed, which aims to reduce the collision of voice packets by adopting an access control mechanism. Through theoretical analysis, the system's mathematical model is presented and the packet drop probability of the scheme is derived. To verify the performance of the scheme, a simulation was performed, and the results support our analysis.

  6. Techniques and Protocols for Dispersing Nanoparticle Powders in Aqueous Media—is there a Rationale for Harmonization?

    DEFF Research Database (Denmark)

    Hartmann, Nanna B.; Jensen, Keld Alstrup; Baun, Anders

    2015-01-01

    Selecting appropriate ways of bringing engineered nanoparticles (ENP) into aqueous dispersion is a main obstacle for testing, and thus for understanding and evaluating, their potential adverse effects to the environment and human health. Using different methods to prepare (stock) dispersions... of the same ENP may be a source of variation in the toxicity measured. Harmonization and standardization of dispersion methods applied in mammalian and ecotoxicity testing are needed to ensure a comparable data quality and to minimize test artifacts produced by modifications of ENP during the dispersion... The parameters are identified from individual scientific studies and from consensus reached in larger scale research projects and international organizations. A step-wise approach is proposed to develop tailored dispersion protocols for ecotoxicological and mammalian toxicological testing of ENP. The recommendations of this analysis may serve as a guide...

  7. Survey of immunoassay techniques for biological analysis

    International Nuclear Information System (INIS)

    Burtis, C.A.

    1986-10-01

    Immunoassay is a very specific, sensitive, and widely applicable analytical technique. Recent advances in genetic engineering have led to the development of monoclonal antibodies, which further improve the specificity of immunoassays. Originally, radioisotopes were used to label the antigens and antibodies used in immunoassays. However, in the last decade, numerous types of immunoassays have been developed that utilize enzymes and fluorescent dyes as labels. Given the technical, safety, health, and disposal problems associated with using radioisotopes, immunoassays that utilize enzyme and fluorescent labels are rapidly replacing those using radioisotope labels. These newer techniques are just as sensitive, are easily automated, have stable reagents, and do not have a disposal problem. 6 refs., 1 fig., 2 tabs

  8. Hybrid chemical and nondestructive-analysis technique

    International Nuclear Information System (INIS)

    Hsue, S.T.; Marsh, S.F.; Marks, T.

    1982-01-01

    A hybrid chemical/NDA technique has been applied at the Los Alamos National Laboratory to the assay of plutonium in ion-exchange effluents. Typical effluent solutions contain low concentrations of plutonium and high concentrations of americium. A simple trioctylphosphine oxide (TOPO) separation can remove 99.9% of the americium. The organic phase that contains the separated plutonium can be accurately assayed by monitoring the uranium L x-ray intensities

  9. Data analysis techniques for gravitational wave observations

    Indian Academy of Sciences (India)

    Astrophysical sources of gravitational waves fall broadly into three categories: (i) transient and bursts, (ii) periodic or continuous wave and (iii) stochastic. Each type of source requires a different type of data analysis strategy. In this talk various data analysis strategies will be reviewed. Optimal filtering is used for extracting ...
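
    For the transient and burst category, the optimal filtering mentioned above is conventionally the matched filter; in its standard form (not specific to this talk), the signal-to-noise ratio of data s against a template h, given one-sided noise spectral density S_n(f), is

      \rho = \frac{\langle s, h\rangle}{\sqrt{\langle h, h\rangle}},
      \qquad
      \langle a, b\rangle = 4\,\operatorname{Re}\int_{0}^{\infty}
          \frac{\tilde a(f)\,\tilde b^{*}(f)}{S_n(f)}\,\mathrm{d}f .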

  10. Understanding context in knowledge translation: a concept analysis study protocol.

    Science.gov (United States)

    Squires, Janet E; Graham, Ian D; Hutchinson, Alison M; Linklater, Stefanie; Brehaut, Jamie C; Curran, Janet; Ivers, Noah; Lavis, John N; Michie, Susan; Sales, Anne E; Fiander, Michelle; Fenton, Shannon; Noseworthy, Thomas; Vine, Jocelyn; Grimshaw, Jeremy M

    2015-05-01

    To conduct a concept analysis of clinical practice contexts (work environments) that facilitate or militate against the uptake of research evidence by healthcare professionals in clinical practice. This will involve developing a clear definition of context by describing its features, domains and defining characteristics. The context where clinical care is delivered influences that care. While research shows that context is important to knowledge translation (implementation), we lack conceptual clarity on what is context, which contextual factors probably modify the effect of knowledge translation interventions (and hence should be considered when designing interventions) and which contextual factors themselves could be targeted as part of a knowledge translation intervention (context modification). Concept analysis. The Walker and Avant concept analysis method, comprised of eight systematic steps, will be used: (1) concept selection; (2) determination of aims; (3) identification of uses of context; (4) determination of defining attributes of context; (5) identification/construction of a model case of context; (6) identification/construction of additional cases of context; (7) identification/construction of antecedents and consequences of context; and (8) definition of empirical referents of context. This study is funded by the Canadian Institutes of Health Research (January 2014). This study will result in a much needed framework of context for knowledge translation, which identifies specific elements that, if assessed and used to tailor knowledge translation activities, will result in increased research use by nurses and other healthcare professionals in clinical practice, ultimately leading to better patient care. © 2014 John Wiley & Sons Ltd.

  11. A systematic review protocol: social network analysis of tobacco use.

    Science.gov (United States)

    Maddox, Raglan; Davey, Rachel; Lovett, Ray; van der Sterren, Anke; Corbett, Joan; Cochrane, Tom

    2014-08-08

    Tobacco use is the single most preventable cause of death in the world. Evidence indicates that behaviours such as tobacco use can influence social networks, and that social network structures can influence behaviours. Social network analysis provides a set of analytic tools to undertake methodical analysis of social networks. We will undertake a systematic review to provide a comprehensive synthesis of the literature regarding social network analysis and tobacco use. The review will answer the following research questions: among participants who use tobacco, does social network structure/position influence tobacco use? Does tobacco use influence peer selection? Does peer selection influence tobacco use? We will follow the Preferred Reporting Items for Systemic Reviews and Meta-Analyses (PRISMA) guidelines and search the following databases for relevant articles: CINAHL (Cumulative Index to Nursing and Allied Health Literature); Informit Health Collection; PsycINFO; PubMed/MEDLINE; Scopus/Embase; Web of Science; and the Wiley Online Library. Keywords include tobacco; smoking; smokeless; cigarettes; cigar and 'social network' and reference lists of included articles will be hand searched. Studies will be included that provide descriptions of social network analysis of tobacco use.Qualitative, quantitative and mixed method data that meets the inclusion criteria for the review, including methodological rigour, credibility and quality standards, will be synthesized using narrative synthesis. Results will be presented using outcome statistics that address each of the research questions. This systematic review will provide a timely evidence base on the role of social network analysis of tobacco use, forming a basis for future research, policy and practice in this area. This systematic review will synthesise the evidence, supporting the hypothesis that social network structures can influence tobacco use. This will also include exploring the relationship between social

  12. 75 FR 74007 - Federal Aquatic Nuisance Species Research Risk Analysis Protocol

    Science.gov (United States)

    2010-11-30

    ... site, http://anstaskforce.gov/documents.php . To obtain a hard copy of the Protocol, see Document... aquatic species that are the target of this risk analysis. Language used in the NANPCA differentiates...: http://anstaskforce.gov/documents.php Write: Susan Pasko, National Oceanic and Atmospheric...

  13. A simple protocol for NMR analysis of the enantiomeric purity of chiral hydroxylamines.

    Science.gov (United States)

    Tickell, David A; Mahon, Mary F; Bull, Steven D; James, Tony D

    2013-02-15

    A practically simple three-component chiral derivatization protocol for determining the enantiopurity of chiral hydroxylamines by (1)H NMR spectroscopic analysis is described, involving their treatment with 2-formylphenylboronic acid and enantiopure BINOL to afford a mixture of diastereomeric nitrono-boronate esters whose ratio is an accurate reflection of the enantiopurity of the parent hydroxylamine.
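
    A hypothetical helper for the protocol's final step: the enantiomeric excess (ee) of the parent hydroxylamine computed from the 1H NMR integrals of the two diastereomeric nitrono-boronate esters.

      def enantiomeric_excess(integral_major, integral_minor):
          """ee (%) from the NMR integrals of the two diastereomers."""
          total = integral_major + integral_minor
          return 100.0 * (integral_major - integral_minor) / total

      print(f"ee = {enantiomeric_excess(95.0, 5.0):.1f}%")   # -> ee = 90.0%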

  14. Protocol Analysis of Group Problem Solving in Mathematics: A Cognitive-Metacognitive Framework for Assessment.

    Science.gov (United States)

    Artzt, Alice F.; Armour-Thomas, Eleanor

    The roles of cognition and metacognition were examined in the mathematical problem-solving behaviors of students as they worked in small groups. As an outcome, a framework that links the literature of cognitive science and mathematical problem solving was developed for protocol analysis of mathematical problem solving. Within this framework, each…

  15. A Concise Protocol for the Validation of Language ENvironment Analysis (LENA) Conversational Turn Counts in Vietnamese

    Science.gov (United States)

    Ganek, Hillary V.; Eriks-Brophy, Alice

    2018-01-01

    The aim of this study was to present a protocol for the validation of the Language ENvironment Analysis (LENA) System's conversational turn count (CTC) for Vietnamese speakers. Ten families of children aged between 22 and 42 months, recruited near Ho Chi Minh City, participated in this project. Each child wore the LENA audio recorder for a full…

  16. Visualization techniques for malware behavior analysis

    Science.gov (United States)

    Grégio, André R. A.; Santos, Rafael D. C.

    2011-06-01

    Malware spread via Internet is a great security threat, so studying their behavior is important to identify and classify them. Using SSDT hooking we can obtain malware behavior by running it in a controlled environment and capturing interactions with the target operating system regarding file, process, registry, network and mutex activities. This generates a chain of events that can be used to compare them with other known malware. In this paper we present a simple approach to convert malware behavior into activity graphs and show some visualization techniques that can be used to analyze malware behavior, individually or grouped.
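
    A minimal sketch of turning a captured event chain into an activity graph as described; the event tuples and the use of networkx are illustrative assumptions, not the authors' implementation.

      import networkx as nx

      events = [  # (subject, action, object) captured via SSDT hooking
          ("malware.exe", "writes", r"C:\evil.dll"),
          ("malware.exe", "creates", "svchost.exe"),
          ("svchost.exe", "sets", r"HKLM\Software\...\Run"),
          ("svchost.exe", "connects", "10.0.0.66:8080"),
      ]

      g = nx.DiGraph()
      for subject, action, obj in events:
          g.add_edge(subject, obj, action=action)

      # Graphs from two samples can then be compared (e.g., by shared-edge
      # ratio) to group behaviorally similar malware.
      print(g.number_of_nodes(), "nodes,", g.number_of_edges(), "edges")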

  17. INVERSE FILTERING TECHNIQUES IN SPEECH ANALYSIS

    African Journals Online (AJOL)

    Dr Obe

    Inverse filtering may be performed in the time domain or in the frequency domain... The application of computers to speech analysis led to important elaborations... a tool for the estimation of formant trajectories (10)... Linear prediction in effect determines the filter...
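
    The linear-prediction analysis the abstract names is standard; a minimal sketch (the order of 12 is an arbitrary assumption) estimates the predictor coefficients from the Yule-Walker equations and thereby determines the inverse filter.

      import numpy as np
      from scipy.linalg import solve_toeplitz

      def lpc(signal, order=12):
          """Linear-prediction coefficients via the Yule-Walker equations."""
          x = signal - np.mean(signal)
          r = np.correlate(x, x, mode="full")[len(x) - 1:]  # autocorrelation
          return solve_toeplitz(r[:order], r[1:order + 1])  # coefficients a_k

    Applying the resulting inverse filter A(z) = 1 - sum(a_k z^-k) to the speech signal removes the vocal-tract resonances (formants), leaving an estimate of the glottal source.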

  18. Techniques for Intelligence Analysis of Networks

    National Research Council Canada - National Science Library

    Cares, Jeffrey R

    2005-01-01

    ...) there are significant intelligence analysis manifestations of these properties; and (4) a more satisfying theory of Networked Competition than currently exists for NCW/NCO is emerging from this research...

  19. Uncertainty analysis technique for OMEGA Dante measurements

    Science.gov (United States)

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-10-01

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.
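
    An illustrative reduction of the Monte Carlo parameter variation described: perturb each channel voltage by its one-sigma Gaussian error, rerun the unfold, and take the spread of the resulting fluxes as the error bar. The unfold function below is a placeholder, not the actual Dante algorithm.

      import numpy as np

      rng = np.random.default_rng(0)

      def unfold(voltages):            # placeholder for the real unfold
          return voltages.sum()        # e.g., a total-flux estimate

      def flux_uncertainty(voltages, sigmas, n_trials=1000):
          fluxes = [unfold(rng.normal(voltages, sigmas))
                    for _ in range(n_trials)]
          return np.mean(fluxes), np.std(fluxes)

      v = np.ones(18)                  # 18 Dante channels
      mean_flux, err = flux_uncertainty(v, 0.05 * v)
      print(f"flux = {mean_flux:.2f} +/- {err:.2f}")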

  20. Uncertainty analysis technique for OMEGA Dante measurements

    International Nuclear Information System (INIS)

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-01-01

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.

  1. Uncertainty Analysis Technique for OMEGA Dante Measurements

    International Nuclear Information System (INIS)

    May, M.J.; Widmann, K.; Sorce, C.; Park, H.; Schneider, M.

    2010-01-01

    The Dante is an 18 channel X-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g. hohlraums, etc.) at X-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the X-ray diodes, filters and mirrors and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte-Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.

  2. Analysis of MD5 authentication in various routing protocols using simulation tools

    Science.gov (United States)

    Dinakaran, M.; Darshan, K. N.; Patel, Harsh

    2017-11-01

    Authentication is an important paradigm of security, and computer networks require secure paths to make the flow of data even more secure through security protocols. MD-5 (Message Digest 5) helps in providing data integrity for the data sent through it and authentication for the network devices. This paper gives a brief introduction to MD-5 and simulates networks with MD-5 authentication using various routing protocols such as OSPF, EIGRP and RIPv2. GNS3 is used to simulate the scenarios. Analysis of the MD-5 authentication is done in the later sections of the paper.
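
    For illustration, a minimal Python sketch of keyed-MD5 authentication in the style used by RIPv2 (in the spirit of RFC 2082), where the shared secret, padded to 16 bytes, is appended to the routing update before hashing. The packet contents and key are hypothetical.

        import hashlib

        def md5_auth_digest(packet: bytes, key: bytes) -> bytes:
            # RIPv2-style keyed MD5: the secret, padded to 16 bytes, is
            # appended to the update and the whole buffer is hashed.
            return hashlib.md5(packet + key.ljust(16, b"\x00")[:16]).digest()

        update = b"RIPv2 route: 10.0.0.0/8 metric 1"   # hypothetical routing update
        key = b"s3cret"

        sent_digest = md5_auth_digest(update, key)
        # The receiver recomputes the digest with its configured key; a mismatch
        # means the update was forged or corrupted in transit.
        assert sent_digest == md5_auth_digest(update, key)
        print(sent_digest.hex())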

  3. Optimization of oligonucleotide arrays and RNA amplification protocols for analysis of transcript structure and alternative splicing.

    Science.gov (United States)

    Castle, John; Garrett-Engele, Phil; Armour, Christopher D; Duenwald, Sven J; Loerch, Patrick M; Meyer, Michael R; Schadt, Eric E; Stoughton, Roland; Parrish, Mark L; Shoemaker, Daniel D; Johnson, Jason M

    2003-01-01

    Microarrays offer a high-resolution means for monitoring pre-mRNA splicing on a genomic scale. We have developed a novel, unbiased amplification protocol that permits labeling of entire transcripts. Also, hybridization conditions, probe characteristics, and analysis algorithms were optimized for detection of exons, exon-intron edges, and exon junctions. These optimized protocols can be used to detect small variations and isoform mixtures, map the tissue specificity of known human alternative isoforms, and provide a robust, scalable platform for high-throughput discovery of alternative splicing.

  4. Allowing Students to Select Deliverables for Peer Review: Analysis of a Free-Selection Protocol

    DEFF Research Database (Denmark)

    Papadopoulos, Pantelis M.; Lagkas, Thomas; Demetriadis, Stavros

    2011-01-01

    This study analyzes the benefits and limitations of a “free-selection” peer assignment protocol by comparing them to the widely implemented “assigned-pair” protocol. The primary motivation was to circumvent the issues that often appear to the instructors implementing peer review activities with pre......-Selection, where students were able to explore and select peer work for review. Result analysis showed a very strong tendency in favor of the Free-Selection students regarding both domain specific (conceptual) and domain-general (reviewing) knowledge....

  5. Comparison of lung tumor motion measured using a model-based 4DCT technique and a commercial protocol.

    Science.gov (United States)

    O'Connell, Dylan; Shaverdian, Narek; Kishan, Amar U; Thomas, David H; Dou, Tai H; Lewis, John H; Lamb, James M; Cao, Minsong; Tenn, Stephen; Percy, Lee P; Low, Daniel A

    2017-11-11

    To compare lung tumor motion measured with a model-based technique to commercial 4-dimensional computed tomography (4DCT) scans and describe a workflow for using model-based 4DCT as a clinical simulation protocol. Twenty patients were imaged using a model-based technique and commercial 4DCT. Tumor motion was measured on each commercial 4DCT dataset and was calculated on model-based datasets for 3 breathing amplitude percentile intervals: 5th to 85th, 5th to 95th, and 0th to 100th. Internal target volumes (ITVs) were defined on the 4DCT and 5th to 85th interval datasets and compared using Dice similarity. Images were evaluated for noise and rated by 2 radiation oncologists for artifacts. Mean differences in tumor motion magnitude between commercial and model-based images were 0.47 ± 3.0, 1.63 ± 3.17, and 5.16 ± 4.90 mm for the 5th to 85th, 5th to 95th, and 0th to 100th amplitude intervals, respectively. Dice coefficients between ITVs defined on commercial and 5th to 85th model-based images had a mean value of 0.77 ± 0.09. Single standard deviation image noise was 11.6 ± 9.6 HU in the liver and 6.8 ± 4.7 HU in the aorta for the model-based images compared with 57.7 ± 30 and 33.7 ± 15.4 for commercial 4DCT. Mean model error within the ITV regions was 1.71 ± 0.81 mm. Model-based images exhibited reduced presence of artifacts at the tumor compared with commercial images. Tumor motion measured with the model-based technique using the 5th to 85th percentile breathing amplitude interval corresponded more closely to commercial 4DCT than the 5th to 95th or 0th to 100th intervals, which showed greater motion on average. The model-based technique tended to display increased tumor motion when breathing amplitude intervals wider than 5th to 85th were used because of the influence of unusually deep inhalations. These results suggest that care must be taken in selecting the appropriate interval during image generation when using model-based 4DCT methods. Copyright © 2017
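
    A hedged sketch of the amplitude-interval calculation behind these comparisons: given a breathing-amplitude surrogate, tumor displacement over an interval such as the 5th to 85th percentile can be estimated from the amplitude percentiles, assuming a motion model linear in amplitude. The sampling rate, trace, and the 8 mm-per-unit motion coefficient below are illustrative, not taken from the study.

        import numpy as np

        rng = np.random.default_rng(1)

        # Hypothetical breathing-amplitude surrogate: 5 minutes sampled at 25 Hz
        t = np.arange(0.0, 300.0, 0.04)
        amplitude = np.sin(2.0 * np.pi * t / 4.0) + 0.1 * rng.normal(size=t.size)

        def motion_over_interval(amp, lo_pct, hi_pct, mm_per_unit=8.0):
            # Assumed linear motion model: displacement = mm_per_unit * amplitude;
            # the interval endpoints are percentiles of the amplitude signal.
            lo, hi = np.percentile(amp, [lo_pct, hi_pct])
            return mm_per_unit * (hi - lo)

        for lo, hi in [(5, 85), (5, 95), (0, 100)]:
            print(f"{lo}-{hi}th interval: {motion_over_interval(amplitude, lo, hi):.1f} mm")

    Because the 0th to 100th interval reaches the extremes of unusually deep breaths (the noise tails in this toy trace), it reports the largest motion, mirroring the trend in the abstract.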

  6. Reliability analysis techniques for the design engineer

    International Nuclear Information System (INIS)

    Corran, E.R.; Witt, H.H.

    1980-01-01

    A fault tree analysis package is described that eliminates most of the housekeeping tasks involved in proceeding from the initial construction of a fault tree to the final stage of presenting a reliability analysis in a safety report. It is suitable for designers with relatively little training in reliability analysis and computer operation. Users can rapidly investigate the reliability implications of various options at the design stage, and evolve a system which meets specified reliability objectives. Later independent review is thus unlikely to reveal major shortcomings necessitating modification and project delays. The package operates interactively, allowing the user to concentrate on the creative task of developing the system fault tree, which may be modified and displayed graphically. For preliminary analysis, system data can be derived automatically from a generic data bank. As the analysis proceeds, improved estimates of critical failure rates and test and maintenance schedules can be inserted. The computations are standard: identification of minimal cut-sets, estimation of reliability parameters, and ranking of the effect of the individual component failure modes and system failure modes on these parameters. The user can vary the fault trees and data on-line, and print selected data for preferred systems in a form suitable for inclusion in safety reports. A case history is given - that of the HIFAR containment isolation system. (author)
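
    The core computation named here - identification of minimal cut-sets - can be sketched compactly. The Python fragment below expands a small fault tree into its cut sets and minimizes them by subset elimination; the gate structure and event names are hypothetical, loosely echoing the containment-isolation example, and production packages use more scalable algorithms such as MOCUS.

        from itertools import product

        # A fault tree as nested tuples: ('AND'|'OR', child, ...);
        # leaves are basic-event names (all hypothetical).
        TREE = ('OR',
                ('AND', 'valve_A_fails', 'valve_B_fails'),
                ('AND', 'power_loss', ('OR', 'diesel_fails', 'battery_fails')))

        def cut_sets(node):
            if isinstance(node, str):
                return [frozenset([node])]
            gate, *children = node
            child_sets = [cut_sets(c) for c in children]
            if gate == 'OR':
                return [cs for sets in child_sets for cs in sets]
            # AND gate: union one cut set from each child, in all combinations
            return [frozenset().union(*combo) for combo in product(*child_sets)]

        def minimal(sets):
            # Drop any cut set that strictly contains another cut set
            return [s for s in sets if not any(o < s for o in sets)]

        for cs in minimal(cut_sets(TREE)):
            print(sorted(cs))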

  7. Laparoscopic colorectal surgery in learning curve: Role of implementation of a standardized technique and recovery protocol. A cohort study

    Science.gov (United States)

    Luglio, Gaetano; De Palma, Giovanni Domenico; Tarquini, Rachele; Giglio, Mariano Cesare; Sollazzo, Viviana; Esposito, Emanuela; Spadarella, Emanuela; Peltrini, Roberto; Liccardo, Filomena; Bucci, Luigi

    2015-01-01

    Background Despite the proven benefits, laparoscopic colorectal surgery is still under utilized among surgeons. A steep learning is one of the causes of its limited adoption. Aim of the study is to determine the feasibility and morbidity rate after laparoscopic colorectal surgery in a single institution, “learning curve” experience, implementing a well standardized operative technique and recovery protocol. Methods The first 50 patients treated laparoscopically were included. All the procedures were performed by a trainee surgeon, supervised by a consultant surgeon, according to the principle of complete mesocolic excision with central vascular ligation or TME. Patients underwent a fast track recovery programme. Recovery parameters, short-term outcomes, morbidity and mortality have been assessed. Results Type of resections: 20 left side resections, 8 right side resections, 14 low anterior resection/TME, 5 total colectomy and IRA, 3 total panproctocolectomy and pouch. Mean operative time: 227 min; mean number of lymph-nodes: 18.7. Conversion rate: 8%. Mean time to flatus: 1.3 days; Mean time to solid stool: 2.3 days. Mean length of hospital stay: 7.2 days. Overall morbidity: 24%; major morbidity (Dindo–Clavien III): 4%. No anastomotic leak, no mortality, no 30-days readmission. Conclusion Proper laparoscopic colorectal surgery is safe and leads to excellent results in terms of recovery and short term outcomes, even in a learning curve setting. Key factors for better outcomes and shortening the learning curve seem to be the adoption of a standardized technique and training model along with the strict supervision of an expert colorectal surgeon. PMID:25859386

  8. Protocol for Microplastics Sampling on the Sea Surface and Sample Analysis

    Science.gov (United States)

    Kovač Viršek, Manca; Palatinus, Andreja; Koren, Špela; Peterlin, Monika; Horvat, Petra; Kržan, Andrej

    2016-01-01

    Microplastic pollution in the marine environment is a scientific topic that has received increasing attention over the last decade. The majority of scientific publications address microplastic pollution of the sea surface. The protocol below describes the methodology for sampling, sample preparation, separation and chemical identification of microplastic particles. A manta net fixed on an »A frame« attached to the side of the vessel was used for sampling. Microplastic particles caught in the cod end of the net were separated from samples by visual identification and use of stereomicroscopes. Particles were analyzed for their size using an image analysis program and for their chemical structure using ATR-FTIR and micro FTIR spectroscopy. The described protocol is in line with recommendations for microplastics monitoring published by the Marine Strategy Framework Directive (MSFD) Technical Subgroup on Marine Litter. This written protocol with video guide will support the work of researchers that deal with microplastics monitoring all over the world. PMID:28060297

  9. Nuclear reactor seismic safety analysis techniques

    International Nuclear Information System (INIS)

    Cummings, G.E.; Wells, J.E.; Lewis, L.C.

    1979-04-01

    In order to provide insights into the seismic safety requirements for nuclear power plants, a probabilistic based systems model and computational procedure have been developed. This model and computational procedure will be used to identify where data and modeling uncertainties need to be decreased by studying the effect of these uncertainties on the probability of radioactive release and the probability of failure of various structures, systems, and components. From the estimates of failure and release probabilities and their uncertainties the most sensitive steps in the seismic methodologies can be identified. In addition, the procedure will measure the uncertainty due to random occurrences, e.g. seismic event probabilities, material property variability, etc. The paper discusses the elements of this systems model and computational procedure, the event-tree/fault-tree development, and the statistical techniques to be employed

  10. Analysis of Jordanian Cigarettes Using XRF Techniques

    International Nuclear Information System (INIS)

    Kullab, M.; Ismail, A.; AL-kofahi, M.

    2002-01-01

    Sixteen brands of Jordanian cigarettes were analyzed using X-ray Fluorescence (XRF) techniques. These cigarettes were found to contain the elements: Si, S, Cl, K, Ca, P, Ti, Mn, Fe, Cu, Zn, Br, Rb and Sr. The major elements with concentrations of more than 1% by weight were Cl, K and Ca. The elements with minor concentrations, between 0.1 and 1% by weight, were Si, S and P. The trace elements with concentrations below 0.1% by weight were Ti, Mn, Fe, Cu, Zn, Br, Rb and Sr. The toxicity of some trace elements, like Br, Rb, and Sr, which are present in some brands of Jordanian cigarettes, is discussed. (Author's) 24 refs., 1 tab., 1 fig

  11. Security analysis of standards-driven communication protocols for healthcare scenarios.

    Science.gov (United States)

    Masi, Massimiliano; Pugliese, Rosario; Tiezzi, Francesco

    2012-12-01

    The importance of the Electronic Health Record (EHR), that stores all healthcare-related data belonging to a patient, has been recognised in recent years by governments, institutions and industry. Initiatives like the Integrating the Healthcare Enterprise (IHE) have been developed for the definition of standard methodologies for secure and interoperable EHR exchanges among clinics and hospitals. Using the requisites specified by these initiatives, many large scale projects have been set up for enabling healthcare professionals to handle patients' EHRs. The success of applications developed in these contexts crucially depends on ensuring such security properties as confidentiality, authentication, and authorization. In this paper, we first propose a communication protocol, based on the IHE specifications, for authenticating healthcare professionals and assuring patients' safety. By means of a formal analysis carried out by using the specification language COWS and the model checker CMC, we reveal a security flaw in the protocol thus demonstrating that to simply adopt the international standards does not guarantee the absence of such type of flaws. We then propose how to emend the IHE specifications and modify the protocol accordingly. Finally, we show how to tailor our protocol for application to more critical scenarios with no assumptions on the communication channels. To demonstrate feasibility and effectiveness of our protocols we have fully implemented them.

  12. Decentralized control using compositional analysis techniques

    NARCIS (Netherlands)

    Kerber, F.; van der Schaft, A. J.

    2011-01-01

    Decentralized control strategies aim at achieving a global control target by means of distributed local controllers acting on individual subsystems of the overall plant. In this sense, decentralized control is a dual problem to compositional analysis where a global verification task is decomposed

  13. Techniques and Applications of Urban Data Analysis

    KAUST Repository

    AlHalawani, Sawsan

    2016-01-01

    Digitization and characterization of urban spaces are essential components as we move to an ever-growing ’always connected’ world. Accurate analysis of such digital urban spaces has become more important as we continue to get spatial and social

  14. Evaluating Dynamic Analysis Techniques for Program Comprehension

    NARCIS (Netherlands)

    Cornelissen, S.G.M.

    2009-01-01

    Program comprehension is an essential part of software development and software maintenance, as software must be sufficiently understood before it can be properly modified. One of the common approaches in getting to understand a program is the study of its execution, also known as dynamic analysis.

  15. Magnetic Resonance and Ultrasound Image Fusion Supported Transperineal Prostate Biopsy Using the Ginsburg Protocol: Technique, Learning Points, and Biopsy Results.

    Science.gov (United States)

    Hansen, Nienke; Patruno, Giulio; Wadhwa, Karan; Gaziev, Gabriele; Miano, Roberto; Barrett, Tristan; Gnanapragasam, Vincent; Doble, Andrew; Warren, Anne; Bratt, Ola; Kastner, Christof

    2016-08-01

    Prostate biopsy supported by transperineal image fusion has recently been developed as a new method to improve the accuracy of prostate cancer detection. To describe the Ginsburg protocol for transperineal prostate biopsy supported by multiparametric magnetic resonance imaging (mpMRI) and transrectal ultrasound (TRUS) image fusion, provide learning points for its application, and report biopsy results. The article is supplemented by a Surgery in Motion video. This single-centre retrospective outcome study included 534 patients from March 2012 to October 2015. A total of 107 had no previous prostate biopsy, 295 had benign TRUS-guided biopsies, and 159 were on active surveillance for low-risk cancer. A Likert scale reported mpMRI for suspicion of cancer from 1 (no suspicion) to 5 (cancer highly likely). Transperineal biopsies were obtained under general anaesthesia using BiopSee fusion software (Medcom, Darmstadt, Germany). All patients had systematic biopsies, two cores from each of 12 anatomic sectors. Likert 3-5 lesions were targeted with a further two cores per lesion. Any cancer and Gleason score 7-10 cancer on biopsy were noted. Descriptive statistics and positive predictive values (PPVs) and negative predictive values (NPVs) were calculated. The detection rate of Gleason score 7-10 cancer was similar across clinical groups. Likert scale 3-5 MRI lesions were reported in 378 (71%) of the patients. Cancer was detected in 249 (66%) and Gleason score 7-10 cancer was noted in 157 (42%) of these patients. PPV for detecting 7-10 cancer was 0.15 for Likert score 3, 0.43 for score 4, and 0.63 for score 5. NPV of Likert 1-2 findings was 0.87 for Gleason score 7-10 and 0.97 for Gleason score ≥4+3=7 cancer. Limitations include lack of data on complications. Transperineal prostate biopsy supported by MRI/TRUS image fusion using the Ginsburg protocol yielded high detection rates of Gleason score 7-10 cancer. Because the NPV for excluding Gleason score 7-10 cancer was very

  16. 10th Australian conference on nuclear techniques of analysis. Proceedings

    International Nuclear Information System (INIS)

    1998-01-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis hosted by the Australian National University in Canberra, Australia from 24-26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume

  17. 10th Australian conference on nuclear techniques of analysis. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-06-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis hosted by the Australian National University in Canberra, Australia from 24-26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume.

  18. Nuclear techniques for analysis of environmental samples

    International Nuclear Information System (INIS)

    1986-12-01

    The main purposes of this meeting were to establish the state-of-the-art in the field, to identify new research and development that is required to provide an adequate framework for analysis of environmental samples and to assess needs and possibilities for international cooperation in problem areas. This technical report was prepared on the subject based on the contributions made by the participants. A separate abstract was prepared for each of the 9 papers

  19. Application of activation techniques to biological analysis

    International Nuclear Information System (INIS)

    Bowen, H.J.M.

    1981-01-01

    Applications of activation analysis in the biological sciences are reviewed for the period of 1970 to 1979. The stages and characteristics of activation analysis are described, and its advantages and disadvantages enumerated. Most applications involve activation by thermal neutrons followed by either radiochemical or instrumental determination. Relatively little use has been made of activation by fast neutrons, photons, or charged particles. In vivo analyses are included, but those based on prompt gamma or x-ray emission are not. Major applications include studies of reference materials, and the elemental analysis of plants, marine biota, animal and human tissues, diets, and excreta. Relatively little use of it has been made in biochemistry, microbiology, and entomology, but it has become important in toxicology and environmental science. The elements most often determined are Ag, As, Au, Br, Ca, Cd, Cl, Co, Cr, Cs, Cu, Fe, Hg, I, K, Mn, Mo, Na, Rb, Sb, Sc, Se, and Zn, while few or no determinations of B, Be, Bi, Ga, Gd, Ge, H, In, Ir, Li, Nd, Os, Pd, Pr, Pt, Re, Rh, Ru, Te, Tl, or Y have been made in biological materials

  20. New analytical techniques for cuticle chemical analysis

    International Nuclear Information System (INIS)

    Schulten, H.R.

    1994-01-01

    1) The analytical methodology of pyrolysis-gas chromatography/mass spectrometry (Py-GC/MS) and direct pyrolysis-mass spectrometry (Py-MS) using soft ionization techniques by high electric fields (FI) is briefly described. Recent advances of Py-GC/MS and Py-FIMS for the analyses of complex organic matter such as plant materials, humic substances, dissolved organic matter in water (DOM) and soil organic matter (SOM) in agricultural and forest soils are given to illustrate the potential and limitations of the applied methods. 2) Novel applications of Py-GC/MS and Py-MS in combination with conventional analytical data in an integrated, chemometric approach to investigate the dynamics of plant lipids are reported. This includes multivariate statistical investigations on maturation, senescence, humus genesis, and environmental damages in spruce ecosystems. 3) The focal point is the author's integrated investigations on emission-induced changes of selected conifer plant constituents. Pattern recognition of Py-MS data of desiccated spruce needles provides a method for distinguishing needles damaged in different ways and determining the cause. Spruce needles were collected from both controls and trees treated with sulphur dioxide (acid rain), nitrogen dioxide, and ozone under controlled conditions. Py-MS and chemometric data evaluation are employed to characterize and classify leaves and their epicuticular waxes. Preliminary mass spectrometric evaluations of isolated cuticles of different plants such as spruce, ivy, holly, and philodendron, as well as ivy cuticles treated in vivo with air pollutants such as surfactants and pesticides are given. (orig.)

  1. A Simplified Whole-Organ CT Perfusion Technique with Biphasic Acquisition: Preliminary Investigation of Accuracy and Protocol Feasibility in Kidneys.

    Science.gov (United States)

    Yuan, XiaoDong; Zhang, Jing; Quan, ChangBin; Tian, Yuan; Li, Hong; Ao, GuoKun

    2016-04-01

    To determine the feasibility and accuracy of a protocol for calculating whole-organ renal perfusion (renal blood flow [RBF]) and regional perfusion on the basis of biphasic computed tomography (CT), with concurrent dynamic contrast material-enhanced (DCE) CT perfusion serving as the reference standard. This prospective study was approved by the institutional review board, and written informed consent was obtained from all patients. Biphasic CT of the kidneys, including precontrast and arterial phase imaging, was integrated with a first-pass dynamic volume CT protocol and performed and analyzed in 23 patients suspected of having renal artery stenosis. The perfusion value derived from biphasic CT was calculated as CT number enhancement divided by the area under the arterial input function and compared with the DCE CT perfusion data by using the paired t test, correlation analysis, and Bland-Altman plots. Correlation analysis was made between the RBF and the extent of renal artery stenosis. All postprocessing was independently performed by two observers and then averaged as the final result. Mean ± standard deviation biphasic and DCE CT perfusion data for RBF were 425.62 mL/min ± 124.74 and 419.81 mL/min ± 121.13, respectively (P = .53), and for regional perfusion they were 271.15 mL/min per 100 mL ± 82.21 and 266.33 mL/min per 100 mL ± 74.40, respectively (P = .31). Good correlation and agreement were shown between biphasic and DCE CT perfusion for RBF (r = 0.93; ±10% variation from mean perfusion data [P < .001]) and for regional perfusion (r = 0.90; ±13% variation from mean perfusion data [P < .001]). The extent of renal artery stenosis was negatively correlated with RBF with biphasic CT perfusion (r = -0.81, P = .012). Biphasic CT perfusion is clinically feasible and provides perfusion data comparable to DCE CT perfusion data at both global and regional levels in the kidney. Online supplemental material is available for this article.
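
    The perfusion arithmetic described - enhancement divided by the area under the arterial input function - is compact enough to sketch in Python. The AIF shape, sampling, and tissue enhancement below are invented numbers for illustration; only the formula structure follows the abstract.

        import numpy as np

        # Hypothetical arterial input function (AIF) sampled once per second, in HU
        t = np.arange(0.0, 40.0, 1.0)
        aif = 300.0 * np.exp(-((t - 15.0) / 5.0) ** 2)   # idealized bolus curve

        # Area under the arterial input function (trapezoidal rule), in HU*s
        auc = float(np.sum(0.5 * (aif[1:] + aif[:-1]) * np.diff(t)))

        delta_hu = 45.0   # assumed arterial-phase enhancement over precontrast, HU

        # Perfusion = enhancement / AUC, rescaled to mL/min per 100 mL of tissue
        perfusion = delta_hu / auc * 60.0 * 100.0
        print(f"regional perfusion = {perfusion:.0f} mL/min per 100 mL")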

  2. A technique for human error analysis (ATHEANA)

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W. [and others]

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  3. A technique for human error analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W.

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions

  4. Cost-utility analysis of an advanced pressure ulcer management protocol followed by trained wound, ostomy, and continence nurses.

    Science.gov (United States)

    Kaitani, Toshiko; Nakagami, Gojiro; Iizaka, Shinji; Fukuda, Takashi; Oe, Makoto; Igarashi, Ataru; Mori, Taketoshi; Takemura, Yukie; Mizokami, Yuko; Sugama, Junko; Sanada, Hiromi

    2015-01-01

    The high prevalence of severe pressure ulcers (PUs) is an important issue that needs to be highlighted in Japan. In a previous study, we devised an advanced PU management protocol to enable early detection of and intervention for deep tissue injury and critical colonization. This protocol was effective for preventing more severe PUs. The present study aimed to compare the cost-effectiveness of the care provided using an advanced PU management protocol, from a medical provider's perspective, implemented by trained wound, ostomy, and continence nurses (WOCNs), with that of conventional care provided by a control group of WOCNs. A Markov model was constructed for a 1-year time horizon to determine the incremental cost-effectiveness ratio of advanced PU management compared with conventional care. The number of quality-adjusted life-years gained and the cost in Japanese yen (¥) ($US1 = ¥120; 2015) were used as the outcomes. Model inputs for clinical probabilities and related costs were based on our previous clinical trial results. Univariate sensitivity analyses were performed. Furthermore, a Bayesian multivariate probability sensitivity analysis was performed using Monte Carlo simulations with advanced PU management. Two different models were created for initial cohort distribution. For both models, the expected effectiveness for the intervention group using advanced PU management techniques was high, with a low expected cost value. The sensitivity analyses suggested that the results were robust. Intervention by WOCNs using advanced PU management techniques was more effective and cost-effective than conventional care. © 2015 by the Wound Healing Society.
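
    A minimal sketch of the incremental cost-effectiveness calculation with a Monte Carlo probabilistic sensitivity analysis, in Python. All costs, QALY values, and distributions below are invented for illustration; the study's actual model inputs came from its clinical trial.

        import numpy as np

        rng = np.random.default_rng(42)
        N = 10_000   # Monte Carlo draws for the probabilistic sensitivity analysis

        # Hypothetical per-patient 1-year costs (yen) and QALYs; the means and
        # distributions are invented, not taken from the study's trial data.
        cost_adv  = rng.normal(210_000, 30_000, N)   # advanced PU management
        cost_conv = rng.normal(260_000, 35_000, N)   # conventional care
        qaly_adv  = rng.normal(0.62, 0.05, N)
        qaly_conv = rng.normal(0.58, 0.05, N)

        d_cost = cost_adv - cost_conv
        d_qaly = qaly_adv - qaly_conv

        # ICER = incremental cost / incremental effectiveness (yen per QALY);
        # a negative ICER with positive incremental QALYs means the intervention
        # dominates: it is both cheaper and more effective.
        icer = d_cost.mean() / d_qaly.mean()
        p_dominant = float(np.mean((d_cost < 0) & (d_qaly > 0)))
        print(f"ICER = {icer:,.0f} yen/QALY; P(dominant) = {p_dominant:.2f}")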

  5. Correlation dimension based nonlinear analysis of network traffics with different application protocols

    International Nuclear Information System (INIS)

    Wang Jun-Song; Yuan Jing; Li Qiang; Yuan Rui-Xi

    2011-01-01

    This paper uses a correlation dimension based nonlinear analysis approach to analyse the dynamics of network traffics with three different application protocols—HTTP, FTP and SMTP. First, the phase space is reconstructed and the embedding parameters are obtained by the mutual information method. Secondly, the correlation dimensions of three different traffics are calculated, and the results of the analysis demonstrate that the dynamics of the three different application protocol traffics differ from each other in nature, i.e. HTTP and FTP traffics are chaotic, and furthermore, the former is more complex than the latter; on the other hand, SMTP traffic is stochastic. It is shown that the correlation dimension approach is an efficient method to understand and to characterize the nonlinear dynamics of HTTP, FTP and SMTP protocol network traffics. This analysis provided insight into and a more accurate understanding of the nonlinear dynamics of internet traffics, which have a complex mixture of chaotic and stochastic components. (general)
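
    A compact sketch of the correlation dimension estimate in the Grassberger-Procaccia style, in Python: reconstruct the phase space by time delays, compute the correlation sum C(r) over a range of radii, and read D2 off the log-log slope. The embedding parameters, radii, and stand-in series below are illustrative; the paper derives its delay from mutual information.

        import numpy as np

        def embed(x, dim, tau):
            # Time-delay phase-space reconstruction
            n = len(x) - (dim - 1) * tau
            return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

        def correlation_dimension(x, dim=5, tau=3):
            pts = embed(x, dim, tau)
            d = np.sqrt(((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1))
            dists = d[np.triu_indices_from(d, k=1)]
            radii = np.logspace(-0.8, 0.0, 8)
            c = np.array([(dists < r).mean() for r in radii])    # correlation sums C(r)
            slope, _ = np.polyfit(np.log(radii), np.log(c), 1)   # D2 ~ d log C / d log r
            return slope

        rng = np.random.default_rng(7)
        x = np.sin(0.9 * np.arange(800)) + 0.05 * rng.normal(size=800)  # stand-in series
        print(f"estimated correlation dimension D2 = {correlation_dimension(x):.2f}")

    A low, stable slope across scales suggests deterministic (chaotic) structure; a slope that keeps growing with embedding dimension is the signature of a stochastic series such as the SMTP traffic described above.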

  6. Sensitivity Analysis of Per-Protocol Time-to-Event Treatment Efficacy in Randomized Clinical Trials

    Science.gov (United States)

    Gilbert, Peter B.; Shepherd, Bryan E.; Hudgens, Michael G.

    2013-01-01

    Summary Assessing per-protocol treatment effcacy on a time-to-event endpoint is a common objective of randomized clinical trials. The typical analysis uses the same method employed for the intention-to-treat analysis (e.g., standard survival analysis) applied to the subgroup meeting protocol adherence criteria. However, due to potential post-randomization selection bias, this analysis may mislead about treatment efficacy. Moreover, while there is extensive literature on methods for assessing causal treatment effects in compliers, these methods do not apply to a common class of trials where a) the primary objective compares survival curves, b) it is inconceivable to assign participants to be adherent and event-free before adherence is measured, and c) the exclusion restriction assumption fails to hold. HIV vaccine efficacy trials including the recent RV144 trial exemplify this class, because many primary endpoints (e.g., HIV infections) occur before adherence is measured, and nonadherent subjects who receive some of the planned immunizations may be partially protected. Therefore, we develop methods for assessing per-protocol treatment efficacy for this problem class, considering three causal estimands of interest. Because these estimands are not identifiable from the observable data, we develop nonparametric bounds and semiparametric sensitivity analysis methods that yield estimated ignorance and uncertainty intervals. The methods are applied to RV144. PMID:24187408

  7. Processing and analysis techniques involving in-vessel material generation

    Science.gov (United States)

    Schabron, John F [Laramie, WY; Rovani, Jr., Joseph F.

    2012-09-25

    In at least one embodiment, the inventive technology relates to in-vessel generation of a material from a solution of interest as part of a processing and/or analysis operation. Preferred embodiments of the in-vessel material generation (e.g., in-vessel solid material generation) include precipitation; in certain embodiments, analysis and/or processing of the solution of interest may include dissolution of the material, perhaps as part of a successive dissolution protocol using solvents of increasing ability to dissolve. Applications include, but are by no means limited to estimation of a coking onset and solution (e.g., oil) fractionating.

  8. Fundamentals of functional imaging II: emerging MR techniques and new methods of analysis.

    Science.gov (United States)

    Luna, A; Martín Noguerol, T; Mata, L Alcalá

    2018-05-01

    Current multiparameter MRI protocols integrate structural, physiological, and metabolic information about cancer. Emerging techniques such as arterial spin-labeling (ASL), blood oxygen level dependent (BOLD), MR elastography, chemical exchange saturation transfer (CEST), and hyperpolarization provide new information and will likely be integrated into daily clinical practice in the near future. Furthermore, there is great interest in the study of tumor heterogeneity as a prognostic factor and in relation to resistance to treatment, and this interest is leading to the application of new methods of analysis of multiparametric protocols. In parallel, new oncologic biomarkers that integrate the information from MR with clinical, laboratory, genetic, and histologic findings are being developed, thanks to the application of big data and artificial intelligence. This review analyzes different emerging MR techniques that are able to evaluate the physiological, metabolic, and mechanical characteristics of cancer, as well as the main clinical applications of these techniques. In addition, it summarizes the most novel methods of analysis of functional radiologic information in oncology. Copyright © 2018 SERAM. Published by Elsevier España, S.L.U. All rights reserved.

  9. Development of chemical analysis techniques: pt. 3

    International Nuclear Information System (INIS)

    Kim, K.J.; Chi, K.Y.; Choi, G.C.

    1981-01-01

    For the purpose of determining trace rare earths, a spectrofluorimetric method has been studied. Except for Ce and Tb, the fluorescence intensities are not high enough to allow satisfactory analysis. Complexing agents such as tungstate and hexafluoroacetylacetone should be employed to increase fluorescence intensities. As a preliminary experiment for the separation of individual rare earth elements and uranium, the distribution coefficients, % S here, are obtained on the Dowex 50 W against HCl concentration by a batch method. These % S data are utilized to obtain elution curves. The % S data showed a minimum at around 4 M HCl. To understand this previously known phenomenon, the adsorption of Cl- on Dowex 50 W is examined as a function of HCl concentration and found to be decreasing while the % S of rare earths increases. It is interpreted that Cl- and rare earth ions are moved into the resin phase separately and that the charge and the charge densities of these ions are responsible for the different % S curves. Dehydration appears to play an important role in the upturn of the % S curves at higher HCl concentrations

  10. Contributions to fuzzy polynomial techniques for stability analysis and control

    OpenAIRE

    Pitarch Pérez, José Luis

    2014-01-01

    The present thesis employs fuzzy-polynomial control techniques in order to improve the stability analysis and control of nonlinear systems. Initially, it reviews the more extended techniques in the field of Takagi-Sugeno fuzzy systems, such as the more relevant results about polynomial and fuzzy polynomial systems. The basic framework uses fuzzy polynomial models by Taylor series and sum-of-squares techniques (semidefinite programming) in order to obtain stability guarantees...

  11. An operator expansion technique for path integral analysis

    International Nuclear Information System (INIS)

    Tsvetkov, I.V.

    1995-01-01

    A new method of path integral analysis in the framework of a power series technique is presented. The method is based on the operator expansion of an exponential. A regular procedure to calculate the correction terms is found. (orig.)

  12. Search for the top quark using multivariate analysis techniques

    International Nuclear Information System (INIS)

    Bhat, P.C.

    1994-08-01

    The D0 collaboration is developing top search strategies using multivariate analysis techniques. We report here on applications of the H-matrix method to the eμ channel and neural networks to the e+jets channel

  13. Modelling and analysis of distributed simulation protocols with distributed graph transformation

    OpenAIRE

    Lara, Juan de; Taentzer, Gabriele

    2005-01-01

    Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. J. de Lara, and G. Taentzer, "Modelling and analysis of distributed simulation protocols with distributed graph transformation...

  14. Neutron activation analysis: an emerging technique for conservation/preservation

    International Nuclear Information System (INIS)

    Sayre, E.V.

    1976-01-01

    The diverse applications of neutron activation in analysis, preservation, and documentation of art works and artifacts are described with illustrations for each application. The uses of this technique to solve problems of attribution and authentication, to reveal the inner structure and composition of art objects, and, in some instances to recreate details of the objects are described. A brief discussion of the theory and techniques of neutron activation analysis is also included

  15. Development of evaluation method for software safety analysis techniques

    International Nuclear Information System (INIS)

    Huang, H.; Tu, W.; Shih, C.; Chen, C.; Yang, W.; Yih, S.; Kuo, C.; Chen, M.

    2006-01-01

    Full text: Following the massive adoption of digital Instrumentation and Control (I and C) systems for nuclear power plants (NPPs), various Software Safety Analysis (SSA) techniques are used to evaluate NPP safety when adopting an appropriate digital I and C system, and thereby to reduce risk to an acceptable level. However, each technique has its specific advantages and disadvantages. If two or more techniques can be complementarily incorporated, the SSA combination would be more acceptable. As a result, if proper evaluation criteria are available, the analyst can choose an appropriate technique combination to perform the analysis on the basis of resources. This research evaluated the applicable software safety analysis techniques available nowadays, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flowgraph Methodology (DFM), and simulation-based model analysis; and then determined indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. These indexes may help the decision makers and the software safety analysts to choose the best SSA combination and arrange their own software safety plan. By this proposed method, the analysts can evaluate various SSA combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (without transfer rate) combination is not competitive due to the dilemma of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for realizing the software fault structure. The system centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio. However, their disadvantages are completeness and complexity

  16. An improved method of studying user-system interaction by combining transaction log analysis and protocol analysis

    Directory of Open Access Journals (Sweden)

    Jillian R. Griffiths

    2002-01-01

    Full Text Available The paper reports a novel approach to studying user-system interaction that captures a complete record of the searcher's actions, the system responses and synchronised talk-aloud comments from the searcher. The data is recorded unobtrusively and is available for later analysis. The approach is set in context by a discussion of transaction logging and protocol analysis and examples of the search logging in operation are presented

  17. Development of a protocol for the kinematic analysis of movement in patients with total hip arthroplasty

    OpenAIRE

    Mateu Pla, Joan

    2015-01-01

    The aim of this final degree project is to study and analyze the kinematics of the human body's lower limbs. First of all, it is extremely important to establish a protocol in order to compare two patients operated on with two different techniques of total hip arthroplasty. The three usual movements that are employed to make this comparison are gait, sit-to-stand and stair climbing. A three-dimensional full body model is implemented and the kinematic parameters (angles) necessary for the st...

  18. Comparison of radiation doses using weight-based protocol and dose modulation techniques for patients undergoing biphasic abdominal computed tomography examinations

    Directory of Open Access Journals (Sweden)

    Livingstone Roshan

    2009-01-01

    Full Text Available Computed tomography (CT) of the abdomen contributes a substantial amount of man-made radiation dose to patients, and use of this modality is on the increase. This study intends to compare radiation dose and image quality using dose modulation techniques and weight-based protocol exposure parameters for biphasic abdominal CT. Using a six-slice CT scanner, a prospective study of 426 patients who underwent abdominal CT examinations was performed. Constant tube potentials of 90 kV and 120 kV were used for all arterial and portal venous phases, respectively. The tube current-time product for the weight-based protocol was optimized according to the patient's body weight; this was automatically selected in dose modulation. The effective dose using the weight-based protocol, angular and z-axis dose modulation was 11.3 mSv, 9.5 mSv and 8.2 mSv respectively for patients' body weights ranging from 40 to 60 kg. For patients of body weights ranging from 60 to 80 kg, the effective doses were 13.2 mSv, 11.2 mSv and 10.6 mSv respectively. The use of dose modulation techniques resulted in a reduction of 16 to 28% in radiation dose with acceptable diagnostic accuracy in comparison to the use of weight-based protocol settings.
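
    A small Python sketch of how a weight-based protocol selects exposure parameters. The mAs values and weight bands below are hypothetical placeholders (the abstract reports resulting doses, not the mAs table); only the structure - fixed kV per phase, tube current-time chosen by weight band - reflects the described protocol.

        def weight_based_mas(weight_kg: float, phase: str) -> int:
            # Hypothetical tube current-time (mAs) table per weight band;
            # the study's actual values are not given in the abstract.
            bands = {
                (40, 60): {"arterial": 120, "portal": 160},
                (60, 80): {"arterial": 160, "portal": 200},
            }
            for (lo, hi), mas in bands.items():
                if lo <= weight_kg < hi:
                    return mas[phase]
            raise ValueError("weight outside protocol range")

        # Tube potential is fixed per phase (90 kV arterial, 120 kV portal venous);
        # only the mAs is adapted to the patient's weight band.
        print(weight_based_mas(55, "arterial"), weight_based_mas(72, "portal"))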

  19. New method development in prehistoric stone tool research: evaluating use duration and data analysis protocols.

    Science.gov (United States)

    Evans, Adrian A; Macdonald, Danielle A; Giusca, Claudiu L; Leach, Richard K

    2014-10-01

    Lithic microwear is a research field of prehistoric stone tool (lithic) analysis that has been developed with the aim to identify how stone tools were used. It has been shown that laser scanning confocal microscopy has the potential to be a useful quantitative tool in the study of prehistoric stone tool function. In this paper, two important lines of inquiry are investigated: (1) whether the texture of worn surfaces is constant under varying durations of tool use, and (2) the development of rapid objective data analysis protocols. This study reports on the attempt to further develop these areas of study and results in a better understanding of the complexities underlying the development of flexible analytical algorithms for surface analysis. The results show that when sampling is optimised, surface texture may be linked to contact material type, independent of use duration. Further research is needed to validate this finding and test an expanded range of contact materials. The use of automated analytical protocols has shown promise but is only reliable if sampling location and scale are defined. Results suggest that the sampling protocol reports on the degree of worn surface invasiveness, complicating the ability to investigate duration related textural characterisation. Copyright © 2014. Published by Elsevier Ltd.

  20. Research on digital multi-channel pulse height analysis techniques

    International Nuclear Information System (INIS)

    Xiao Wuyun; Wei Yixiang; Ai Xianyun; Ao Qi

    2005-01-01

    Multi-channel pulse height analysis techniques are developing in the direction of digitalization. Based on digital signal processing techniques, digital multi-channel analyzers are characterized by powerful pulse processing ability, high throughput, improved stability and flexibility. This paper analyzes key techniques of digital nuclear pulse processing. With MATLAB software, main algorithms are simulated, such as trapezoidal shaping, digital baseline estimation, digital pole-zero/zero-pole compensation, poles and zeros identification. The preliminary general scheme of digital MCA is discussed, as well as some other important techniques about its engineering design. All these lay the foundation of developing homemade digital nuclear spectrometers. (authors)
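
    Of the algorithms listed, trapezoidal shaping is the easiest to make concrete. Below is a hedged Python sketch of a Jordanov-style recursive trapezoidal filter (rise time k samples, flat top m samples, pulse decay constant tau in samples); the pulse and parameter values are illustrative, not taken from the paper's MATLAB simulations.

        import numpy as np

        def trapezoidal_shaper(x, k=40, m=10, tau=120.0):
            # Recursive trapezoidal filter in the style of Jordanov: a
            # four-term difference followed by two accumulators.
            x = np.asarray(x, dtype=float)
            span = k + m
            d = x.copy()
            d[k:] -= x[:-k]
            d[span:] -= x[:-span]
            d[k + span:] += x[:-(k + span)]
            p = np.cumsum(d)                  # first accumulator
            s = np.cumsum(p + tau * d)        # second accumulator -> trapezoid
            return s / (tau * k)              # normalize flat top to pulse amplitude

        rng = np.random.default_rng(3)
        n = np.arange(600)
        pulse = np.where(n >= 100, np.exp(-(n - 100) / 120.0), 0.0)   # detector pulse
        out = trapezoidal_shaper(pulse + 0.01 * rng.normal(size=n.size))
        print(f"trapezoid peak = {out.max():.3f} (true amplitude 1.0)")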

  1. Sensitivity analysis and related analysis : A survey of statistical techniques

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1995-01-01

    This paper reviews the state of the art in five related types of analysis, namely (i) sensitivity or what-if analysis, (ii) uncertainty or risk analysis, (iii) screening, (iv) validation, and (v) optimization. The main question is: when should which type of analysis be applied; which statistical

  2. Development of environmental sample analysis techniques for safeguards

    International Nuclear Information System (INIS)

    Magara, Masaaki; Hanzawa, Yukiko; Esaka, Fumitaka

    1999-01-01

    JAERI has been developing environmental sample analysis techniques for safeguards and preparing a clean chemistry laboratory with clean rooms. Methods to be developed are a bulk analysis and a particle analysis. In the bulk analysis, Inductively-Coupled Plasma Mass Spectrometer or Thermal Ionization Mass Spectrometer are used to measure nuclear materials after chemical treatment of sample. In the particle analysis, Electron Probe Micro Analyzer and Secondary Ion Mass Spectrometer are used for elemental analysis and isotopic analysis, respectively. The design of the clean chemistry laboratory has been carried out and construction will be completed by the end of March, 2001. (author)

  3. Key-space analysis of double random phase encryption technique

    Science.gov (United States)

    Monaghan, David S.; Gopinathan, Unnikrishnan; Naughton, Thomas J.; Sheridan, John T.

    2007-09-01

    We perform a numerical analysis on the double random phase encryption/decryption technique. The key-space of an encryption technique is the set of possible keys that can be used to encode data using that technique. In the case of a strong encryption scheme, many keys must be tried in any brute-force attack on that technique. Traditionally, designers of optical image encryption systems demonstrate only how a small number of arbitrary keys cannot decrypt a chosen encrypted image in their system. However, this type of demonstration does not discuss the properties of the key-space nor refute the feasibility of an efficient brute-force attack. To clarify these issues we present a key-space analysis of the technique. For a range of problem instances we plot the distribution of decryption errors in the key-space indicating the lack of feasibility of a simple brute-force attack.
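
    A minimal numerical sketch of double random phase encoding and a key-space probe, in Python with FFTs standing in for the 4f optical system. Encryption multiplies the image by a random input-plane phase, applies a random Fourier-plane phase, and transforms back; decryption with the exact Fourier-plane key recovers the image, while a slightly perturbed key leaves a large residual error - the kind of measurement a key-space analysis plots across many trial keys. The image and perturbation size are arbitrary.

        import numpy as np

        rng = np.random.default_rng(5)
        N = 64
        img = rng.random((N, N))                          # stand-in plaintext image

        phi1 = np.exp(2j * np.pi * rng.random((N, N)))    # input-plane phase key
        phi2 = np.exp(2j * np.pi * rng.random((N, N)))    # Fourier-plane phase key

        def encrypt(f):
            return np.fft.ifft2(np.fft.fft2(f * phi1) * phi2)

        def decrypt(c, key2):
            # Undo the Fourier-plane key, then the input-plane key
            return np.fft.ifft2(np.fft.fft2(c) * np.conj(key2)) * np.conj(phi1)

        c = encrypt(img)

        def nrms_error(key2):
            rec = np.abs(decrypt(c, key2))
            return np.linalg.norm(rec - img) / np.linalg.norm(img)

        print(f"correct key:   error = {nrms_error(phi2):.3e}")
        # A brute-force attacker trying a slightly wrong Fourier-plane key:
        wrong = phi2 * np.exp(2j * np.pi * 0.05 * rng.random((N, N)))
        print(f"perturbed key: error = {nrms_error(wrong):.3e}")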

  4. [Perinatal bioethics: euthanasia or end-of-life decisions? Analysis of the Groningen Protocol].

    Science.gov (United States)

    Halac, Jacobo; Halac, Eduardo; Moya, Martín P; Olmas, José M; Dopazo, Silvina L; Dolagaray, Nora

    2009-12-01

    The so called "Groningen Protocol" was conceived as a framework to discuss the euthanasia in neonates. Originally, it presents three groups of babies who might be candidates to this option. We analyzed the protocol in its original context and that of the Dutch society in which it was created. The analysis started with a careful reading of the protocol in both English and Dutch versions, translated later into Spanish. The medical and nursing staff participated in discussing it. A final consensus was reached. The Institutional Ethics Committee at our hospital discussed it freely and made recommendations for its application as a guideline to honestly discuss with parents the clinical condition of their babies, without permitting the option included literally in the word euthanasia. We selected four extremely ill infants. Their parents were interviewed at least twice daily: three stages were identified: the initial one of promoting all possible treatments; a second one of guarded and cautious request for the staff to evaluate "suffering", and a last one where requests were made to reduce therapeutic efforts to provide dignified death. A week after the death of their infants, they were presented with the facts of the protocol and the limits of our legal system. In all four cases the parents suggested that they would have chosen ending the life of their infants, in order to avoid them undue suffering. They clearly pointed out that this option emerged as a viable one to them once the ultimate outcome was evident. The protocol must not be viewed as a guideline for euthanasia in newborns, but rather as a mean to discuss the critical condition of an infant with the parents. Its direct implementation in our setting remains difficult. As a clear limitation for its overall application remains the definition of what is considered "unbearable suffering" in newborns, and how to certify when the infant has "no prospect". We emphasize the benefits of securing the help of the Ethics

  5. Nuclear techniques for bulk and surface analysis of materials

    International Nuclear Information System (INIS)

    D'Agostino, M.D.; Kamykowski, E.A.; Kuehne, F.J.; Padawer, G.M.; Schneid, E.J.; Schulte, R.L.; Stauber, M.C.; Swanson, F.R.

    1978-01-01

    A review is presented summarizing several nondestructive bulk and surface analysis nuclear techniques developed in the Grumman Research Laboratories. Bulk analysis techniques include 14-MeV-neutron activation analysis and accelerator-based neutron radiography. The surface analysis techniques include resonant and non-resonant nuclear microprobes for the depth profile analysis of light elements (H, He, Li, Be, C, N, O and F) in the surface of materials. Emphasis is placed on the description and discussion of the unique nuclear microprobe analytical capabilities of immediate importance to a number of current problems facing materials specialists. The resolution and contrast of neutron radiography were illustrated with an operating heat pipe system. The figure shows that the neutron radiograph has a resolution of better than 0.04 cm with sufficient contrast to indicate Freon 21 on the inner capillaries of the heat pipe and pooling of the liquid at the bottom. (T.G.)

  6. Security Analysis of DTN Architecture and Bundle Protocol Specification for Space-Based Networks

    Science.gov (United States)

    Ivancic, William D.

    2009-01-01

    A Delay-Tolerant Network (DTN) Architecture (Request for Comment, RFC-4838) and Bundle Protocol Specification, RFC-5050, have been proposed for space and terrestrial networks. Additional security specifications have been provided via the Bundle Security Specification (currently a work in progress as an Internet Research Task Force internet-draft) and, for link-layer protocols applicable to Space networks, the Licklider Transport Protocol Security Extensions. This document provides a security analysis of the current DTN RFCs and proposed security related internet drafts with a focus on space-based communication networks, which is a rather restricted subset of DTN networks. Note, the original focus and motivation of DTN work was for the Interplanetary Internet. This document does not address general store-and-forward network overlays, just the current work being done by the Internet Research Task Force (IRTF) and the Consultative Committee for Space Data Systems (CCSDS) Space Internetworking Services Area (SIS) - DTN working group under the DTN and Bundle umbrellas. However, much of the analysis is relevant to general store-and-forward overlays.

  7. Performance Analysis of a Cluster-Based MAC Protocol for Wireless Ad Hoc Networks

    Directory of Open Access Journals (Sweden)

    Jesús Alonso-Zárate

    2010-01-01

    Full Text Available An analytical model to evaluate the non-saturated performance of the Distributed Queuing Medium Access Control Protocol for Ad Hoc Networks (DQMAN) in single-hop networks is presented in this paper. DQMAN is comprised of a spontaneous, temporary, and dynamic clustering mechanism integrated with a near-optimum distributed queuing Medium Access Control (MAC) protocol. Clustering is executed in a distributed manner using a mechanism inspired by the Distributed Coordination Function (DCF) of the IEEE 802.11. Once a station seizes the channel, it becomes the temporary clusterhead of a spontaneous cluster and it coordinates the peer-to-peer communications between the clustermembers. Within each cluster, a near-optimum distributed queuing MAC protocol is executed. The theoretical performance analysis of DQMAN in single-hop networks under non-saturation conditions is presented in this paper. The approach integrates the analysis of the clustering mechanism into the MAC layer model. To the best of the authors' knowledge, this approach is novel in the literature. In addition, the performance of an ad hoc network using DQMAN is compared to that obtained when using the DCF of the IEEE 802.11, as a benchmark reference.

  8. Organ donation in the ICU: A document analysis of institutional policies, protocols, and order sets.

    Science.gov (United States)

    Oczkowski, Simon J W; Centofanti, John E; Durepos, Pamela; Arseneau, Erika; Kelecevic, Julija; Cook, Deborah J; Meade, Maureen O

    2018-04-01

    To better understand how local policies influence organ donation rates. We conducted a document analysis of our ICU organ donation policies, protocols and order sets. We used a systematic search of our institution's policy library to identify documents related to organ donation. We used Mindnode software to create a publication timeline, basic statistics to describe document characteristics, and qualitative content analysis to extract document themes. Documents were retrieved from Hamilton Health Sciences, an academic hospital system with a high volume of organ donation, from database inception to October 2015. We retrieved 12 active organ donation documents, including six protocols, two policies, two order sets, and two unclassified documents, a majority (75%) after the introduction of donation after circulatory death in 2006. Four major themes emerged: organ donation process, quality of care, patient and family-centred care, and the role of the institution. These themes indicate areas where documented institutional standards may be beneficial. Further research is necessary to determine the relationship of local policies, protocols, and order sets to actual organ donation practices, and to identify barriers and facilitators to improving donation rates. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Micro-computed tomography and bond strength analysis of different root canal filling techniques

    Directory of Open Access Journals (Sweden)

    Juliane Nhata

    2014-01-01

    Full Text Available Introduction: The aim of this study was to evaluate the quality and bond strength of three root filling techniques (lateral compaction, continuous wave of condensation and Tagger's hybrid technique [THT]) using micro-computed tomography (micro-CT) images and push-out tests, respectively. Materials and Methods: Thirty mandibular incisors were prepared using the same protocol and randomly divided into three groups (n = 10): lateral condensation technique (LCT), continuous wave of condensation technique (CWCT), and THT. All specimens were filled with gutta-percha (GP) cones and AH Plus sealer. Five specimens of each group were randomly chosen for micro-CT analysis, and all of them were sectioned into 1 mm slices and subjected to push-out tests. Results: Micro-CT analysis revealed fewer empty spaces when GP was heated within the root canals in CWCT and THT than in LCT. Push-out tests showed that LCT and THT had significantly higher displacement resistance (P < 0.05) than CWCT. Bond strength was lower in the apical and middle thirds than in the coronal thirds. Conclusions: It can be concluded that LCT and THT were associated with higher bond strengths to intraradicular dentine than CWCT. However, LCT was associated with more empty voids than the other techniques.

  10. Replication protocol analysis: a method for the study of real-world design thinking

    DEFF Research Database (Denmark)

    Galle, Per; Kovacs, L. B.

    1996-01-01

    Given the brief of an architectural competition on site planning, and the design awarded the first prize, the first author (trained as an architect but not a participant in the competition) produced a line of reasoning that might have led from brief to design. In the paper, such ‘design replication’ is refined into a method called ‘replication protocol analysis’ (RPA), and discussed from a methodological perspective of design research. It is argued that for the study of real-world design thinking this method offers distinct advantages over traditional ‘design protocol analysis’, which seeks to capture the designer’s authentic line of reasoning. To illustrate how RPA can be used, the site planning case is briefly presented, and part of the replicated line of reasoning analysed. One result of the analysis is a glimpse of a ‘logic of design’; another is an insight which sheds new light on Darke’s classical...

  11. Performance Analysis of an Optical CDMA MAC Protocol With Variable-Size Sliding Window

    Science.gov (United States)

    Mohamed, Mohamed Aly A.; Shalaby, Hossam M. H.; Abdel-Moety El-Badawy, El-Sayed

    2006-10-01

    A medium access control protocol for optical code-division multiple-access packet networks with variable-length data traffic is proposed. The protocol exhibits a sliding window of variable size. A model for interference-level fluctuation and an accurate analysis of channel usage are presented. Both multiple-access interference (MAI) and the photodetector's shot noise are considered, and both chip-level and correlation receivers are adopted. System performance is evaluated using traditional average system throughput and average delay. Finally, in order to enhance overall performance, error control codes (ECCs) are applied. The results indicate that performance can be enhanced to reach its peak using an ECC with an optimum number of correctable errors. Furthermore, chip-level receivers are shown to give much higher performance than correlation receivers. Also, MAI is shown to be the main source of signal degradation.
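
    The reported optimum in the number of correctable errors can be reproduced with a back-of-the-envelope model: a t-error-correcting code over an n-bit packet succeeds when at most t bits fail, but each extra correctable error costs parity bits and therefore code rate. The numbers below (packet size, bit-error probability, parity cost per correctable error) are illustrative assumptions, not values from the paper.

    ```python
    from math import comb

    def packet_success(n_bits, p_err, t):
        """P(packet decodes) for a t-error-correcting code over n_bits,
        assuming independent bit errors with probability p_err."""
        return sum(comb(n_bits, i) * p_err**i * (1 - p_err)**(n_bits - i)
                   for i in range(t + 1))

    def normalized_throughput(n_bits, p_err, t, parity_per_t=8):
        """Toy throughput: success probability times code rate, where each
        extra correctable error is assumed to cost parity_per_t parity bits."""
        rate = (n_bits - parity_per_t * t) / n_bits
        return max(rate, 0.0) * packet_success(n_bits, p_err, t)

    best = max(range(11), key=lambda t: normalized_throughput(200, 0.01, t))
    print("optimum number of correctable errors:", best)  # interior optimum (4 here)
    ```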

  12. Meta-analysis in a nutshell: Techniques and general findings

    DEFF Research Database (Denmark)

    Paldam, Martin

    2015-01-01

    The purpose of this article is to introduce the technique and main findings of meta-analysis to the reader, who is unfamiliar with the field and has the usual objections. A meta-analysis is a quantitative survey of a literature reporting estimates of the same parameter. The funnel showing...

  13. 48 CFR 15.404-1 - Proposal analysis techniques.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Proposal analysis techniques. 15.404-1 Section 15.404-1 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION... assistance of other experts to ensure that an appropriate analysis is performed. (6) Recommendations or...

  14. NMR and modelling techniques in structural and conformation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Abraham, R J [Liverpool Univ. (United Kingdom)]

    1994-12-31

    The use of Lanthanide Induced Shifts (L.I.S.) and modelling techniques in conformational analysis is presented. The use of Co(III) porphyrins as shift reagents is discussed, with examples of their use in the conformational analysis of some heterocyclic amines. (author) 13 refs., 9 figs.

  15. Application of nuclear analysis techniques in ancient chinese porcelain

    International Nuclear Information System (INIS)

    Feng Songlin; Xu Qing; Feng Xiangqian; Lei Yong; Cheng Lin; Wang Yanqing

    2005-01-01

    Ancient ceramics were fired from porcelain clay and contain a variety of provenance information and age characteristics. Analyzing ancient ceramics with modern analytical methods is the scientific foundation of the study of Chinese porcelain. Based on the properties of nuclear analysis techniques, their functions and applications in this field are discussed. (authors)

  16. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    OpenAIRE

    Rodica IVORSCHI

    2012-01-01

    SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by adapting its strengths to opportunities, minimizing risks and eliminating weaknesses.

  17. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Rodica IVORSCHI

    2012-06-01

    Full Text Available SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by adapting its strengths to opportunities, minimizing risks and eliminating weaknesses.

  18. Kinematics analysis technique fouettes 720° classic ballet.

    Directory of Open Access Journals (Sweden)

    Li Bo

    2011-07-01

    Full Text Available Athletic practice has shown that the more complex the element, the more difficult the technique of its execution. The fouetté at 720° is one of the most difficult types of fouetté, and its execution depends on a high level of technique during the performer's rotation. Performing this element requires not only good physical condition but also mastery of the correct technique by the dancer. On the basis of the corresponding kinematic theory, this study presents a qualitative analysis and quantitative assessment of fouettés at 720° performed by the best Chinese dancers, using the method of stereoscopic images and theoretical analysis.

  19. Staged protocol for the treatment of chronic femoral shaft osteomyelitis with Ilizarov's technique followed by the use of intramedullary locked nail.

    Science.gov (United States)

    Chou, Po-Hsin; Lin, Hsi-Hsien; Su, Yu-Pin; Chiang, Chao-Ching; Chang, Ming-Chau; Chen, Chuan-Mu

    2017-06-01

    Infected nonunion of the femoral shaft is uncommon, and usually presents challenging therapeutic and reconstructive problems. There are still controversies over treating infected nonunion of the femoral shaft. The purposes of this retrospective study were to review the treatment outcomes and describe a staged protocol allowing spontaneous wound healing. Six patients with chronic infected nonunion of the femoral shaft treated from October 2002 to September 2010 were included in this retrospective study. Serial plain films and triple films of the lower legs were performed to evaluate the alignment of the treated femoral shaft and bony union following our staged protocol of Ilizarov distraction osteogenesis and intramedullary nailing. An average bone defect of 7 cm was noted after staged osteotomy. Mean follow-up was 87.5 (range, 38-133) months. Union was achieved in all six patients, with an average external fixation time of 6.8 (range, 5-11) months. There was no reinfection. One complication of a 4-cm leg-length discrepancy was noted, with an initial shortening of 15 cm. The mean knee ranges of motion (ROM) before the staged protocol and at final follow-up were 64.2±8.6 (range, 60-75)° and 53.3±9.3 (range, 40-65)°, respectively; the ROM at the knee joint decreased significantly following the staged protocol. In the treatment of chronic femoral osteomyelitis, the staged protocol of Ilizarov distraction osteogenesis followed by intramedullary nailing was safe and successful, and allowed for union, realignment, reorientation, and leg-length restoration. With regard to the soft tissue, this technique provides a unique type of reconstructive closure for infected wounds. It is suggested that the staged protocol is reliable in providing successful simultaneous reconstruction for bone and soft tissue defects without flap coverage. Copyright © 2017. Published by Elsevier Taiwan LLC.

  20. Analysis of 213 currently used rehabilitation protocols in foot and ankle fractures.

    Science.gov (United States)

    Pfeifer, Christian G; Grechenig, Stephan; Frankewycz, Borys; Ernstberger, Antonio; Nerlich, Michael; Krutsch, Werner

    2015-10-01

    Fractures of the ankle, hind- and midfoot are amongst the five most common fractures. Besides initial operative or non-operative treatment, rehabilitation of the patients plays a crucial role in fracture union and long-term functional outcome. Limited evidence is available as to what a rehabilitation regimen should include and what guidelines should be in place for the initial clinical course of these patients. This study therefore investigated current rehabilitation concepts after fractures of the ankle, hind- and midfoot. Written rehabilitation protocols provided by orthopedic and trauma surgery institutions, in terms of recommendations for weight bearing, range of motion (ROM), physiotherapy and choice of orthosis, were screened and analysed. All protocols for lateral ankle fractures type AO 44A1, AO 44B1 and AO 44C1, for calcaneal fractures and for fractures of the metatarsals, as well as others not specified, were included. Descriptive analysis was carried out and statistical analysis applied where appropriate. 209 rehabilitation protocols for ankle fractures type AO 44B1 and AO 44C1, 98 for AO 44A1, 193 for metatarsal fractures, 142 for calcaneal fractures, 107 for 5th metatarsal base fractures and 70 for 5th metatarsal Jones fractures were evaluated. The mean time recommended for orthosis treatment was 6.04 (SD 0.04) weeks. While the majority of protocols showed a trend towards increased weight bearing and increased ROM over time, the best consensus was noted for weight-bearing recommendations. Our study shows that there is huge variability in rehabilitation of fractures of the ankle, hind- and midfoot. This may be attributed to a lack of consensus (e.g. missing publication of guidelines), individualized patient care (e.g. in fragility fractures) or lack of specialization. This study might serve as a basis for prospective randomized controlled trials in order to optimize rehabilitation for these common fractures. Copyright © 2015 Elsevier Ltd

  1. Immunochemical protocols

    National Research Council Canada - National Science Library

    Pound, John D

    1998-01-01

    ... easy and important refinements often are not published. This much anticipated 2nd edition of Immunochemical Protocols therefore aims to provide a user-friendly, up-to-date handbook of reliable techniques selected to suit the needs of molecular biologists. It covers the full breadth of the relevant established immunochemical methods, from protein blotting and immunoa...

  2. TH-C-18A-08: A Management Tool for CT Dose Monitoring, Analysis, and Protocol Review

    International Nuclear Information System (INIS)

    Wang, J; Chan, F; Newman, B; Larson, D; Leung, A; Fleischmann, D; Molvin, L; Marsh, D; Zorich, C; Phillips, L

    2014-01-01

    Purpose: To develop a customizable tool for enterprise-wide management of CT protocols and analysis of radiation dose information of CT exams for a variety of quality control applications. Methods: All clinical CT protocols implemented on the 11 CT scanners at our institution were extracted in digital format. The original protocols had been preset by our CT management team. A commercial CT dose tracking software (DoseWatch, GE Healthcare, WI) was used to collect exam information (exam date, patient age, etc.), scanning parameters, and radiation doses for all CT exams. We developed a Matlab-based program (MathWorks, MA) with a graphical user interface which allows the user to analyze the scanning protocols together with the actual dose estimates, and to compare the data to national (ACR, AAPM) and internal reference values for CT quality control. Results: The CT protocol review portion of the tool allows the user to look up the scanning and image reconstruction parameters of any protocol on any of the installed CT systems, among about 120 protocols per scanner. In the dose analysis tool, dose information of all CT exams (from 05/2013 to 02/2014) was stratified at the protocol level, and within a protocol down to series level, i.e., each individual exposure event. This allows numerical and graphical review of dose information for any combination of scanner models, protocols and series. The key functions of the tool include: statistics of CTDI, DLP and SSDE; dose monitoring using user-set CTDI/DLP/SSDE thresholds; look-up of any CT exam dose data; and CT protocol review. Conclusion: Our in-house CT management tool provides radiologists, technologists and administration first-hand, near real-time, enterprise-wide knowledge of CT dose levels of different exam types. Medical physicists use this tool to manage CT protocols and to compare and optimize dose levels across different scanner models. It provides technologists feedback on CT scanning operation, and knowledge of important dose baselines and thresholds.
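
    A minimal sketch of the dose-monitoring function described above (flagging exams whose dose indices exceed user-set thresholds), written in Python rather than the authors' Matlab; the protocol names, dose values and thresholds are made up for illustration.

    ```python
    # Hypothetical exam records as (protocol, CTDIvol in mGy, DLP in mGy*cm);
    # the thresholds per protocol are illustrative, not ACR/AAPM values.
    exams = [
        ("head_routine", 55.0, 980.0),
        ("chest_routine", 12.0, 420.0),
        ("head_routine", 78.0, 1350.0),
    ]
    thresholds = {"head_routine": (75.0, 1300.0), "chest_routine": (21.0, 700.0)}

    def flag_exams(exams, thresholds):
        """Return exams whose CTDIvol or DLP exceeds its protocol threshold,
        mirroring the user-set CTDI/DLP alert function described above."""
        flagged = []
        for protocol, ctdi, dlp in exams:
            ctdi_max, dlp_max = thresholds[protocol]
            if ctdi > ctdi_max or dlp > dlp_max:
                flagged.append((protocol, ctdi, dlp))
        return flagged

    for exam in flag_exams(exams, thresholds):
        print("review:", exam)
    ```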

  3. Fed-state gastric media and drug analysis techniques: Current status and points to consider.

    Science.gov (United States)

    Baxevanis, Fotios; Kuiper, Jesse; Fotaki, Nikoletta

    2016-10-01

    Gastric fed-state conditions can have a significant effect on drug dissolution and absorption. In vitro dissolution tests with simple aqueous media cannot usually predict drugs' in vivo response, as several factors, such as the meal content, gastric emptying and possible interactions between food and drug formulations, can affect a drug's pharmacokinetics. A good understanding of the effect of in vivo fed gastric conditions on the drug is essential for the development of biorelevant dissolution media simulating the gastric environment after administration of the standard high-fat meal proposed by the FDA and the EMA in bioavailability/bioequivalence (BA/BE) studies. The analysis of drugs in fed-state media can be quite challenging, as most analytical protocols currently employed are time consuming and labour intensive. In this review, an overview of the in vivo gastric conditions and the biorelevant media used for their in vitro simulation is given. Furthermore, an analysis of the physicochemical properties of the drugs and the formulations related to food effect is presented. In terms of drug analysis, the protocols currently used for fed-state media sample treatment and analysis are discussed, along with the analytical challenges and the need for more efficient and time-saving techniques for a broad spectrum of compounds. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Emotional Freedom Techniques for Anxiety: A Systematic Review With Meta-analysis.

    Science.gov (United States)

    Clond, Morgan

    2016-05-01

    Emotional Freedom Technique (EFT) combines elements of exposure and cognitive therapies with acupressure for the treatment of psychological distress. Randomized controlled trials retrieved by literature search were assessed for quality using the criteria developed by the American Psychological Association's Division 12 Task Force on Empirically Validated Treatments. As of December 2015, 14 studies (n = 658) met inclusion criteria. Results were analyzed using an inverse-variance weighted meta-analysis. The pre-post effect size for the EFT treatment group was 1.23 (95% confidence interval, 0.82-1.64; p < 0.001). Emotional freedom technique treatment demonstrated a significant decrease in anxiety scores, even when accounting for the effect size of control treatment. However, there were too few data available comparing EFT to standard-of-care treatments such as cognitive behavioral therapy, and further research is needed to establish the relative efficacy of EFT to established protocols.
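
    For readers unfamiliar with the pooling step, a minimal sketch of an inverse-variance weighted (fixed-effect) meta-analysis follows; the per-study effect sizes and standard errors are illustrative, not the 14 trials analysed above.

    ```python
    import math

    def fixed_effect_meta(effects, ses):
        """Inverse-variance weighted pooled effect (fixed-effect model).

        effects: per-study effect sizes; ses: their standard errors.
        """
        weights = [1.0 / se**2 for se in ses]        # precision weights
        pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
        se_pooled = math.sqrt(1.0 / sum(weights))
        ci = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)
        return pooled, ci

    d, ci = fixed_effect_meta([1.1, 1.4, 0.9], [0.25, 0.30, 0.20])
    print(f"pooled d = {d:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
    ```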

  5. Nuclear analysis techniques as a component of thermoluminescence dating

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, J.R.; Hutton, J.T.; Habermehl, M.A. [Adelaide Univ., SA (Australia)]; Van Moort, J. [Tasmania Univ., Sandy Bay, TAS (Australia)]

    1996-12-31

    In luminescence dating, an age is found by first measuring dose accumulated since the event being dated, then dividing by the annual dose rate. Analyses of minor and trace elements performed by nuclear techniques have long formed an essential component of dating. Results from some Australian sites are reported to illustrate the application of nuclear techniques of analysis in this context. In particular, a variety of methods for finding dose rates are compared, an example of a site where radioactive disequilibrium is significant and a brief summary is given of a problem which was not resolved by nuclear techniques. 5 refs., 2 tabs.

  6. Nuclear analysis techniques as a component of thermoluminescence dating

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, J R; Hutton, J T; Habermehl, M A [Adelaide Univ., SA (Australia)]; Van Moort, J [Tasmania Univ., Sandy Bay, TAS (Australia)]

    1997-12-31

    In luminescence dating, an age is found by first measuring dose accumulated since the event being dated, then dividing by the annual dose rate. Analyses of minor and trace elements performed by nuclear techniques have long formed an essential component of dating. Results from some Australian sites are reported to illustrate the application of nuclear techniques of analysis in this context. In particular, a variety of methods for finding dose rates are compared, an example of a site where radioactive disequilibrium is significant and a brief summary is given of a problem which was not resolved by nuclear techniques. 5 refs., 2 tabs.
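
    Both records reduce luminescence dating to a single relation, which can be written compactly; the symbols below are the conventional ones, not notation taken from the records.

    ```latex
    \mathrm{Age} \;=\; \frac{D_e}{\dot{D}}
    ```

    Here $D_e$ is the accumulated (equivalent) dose measured by luminescence and $\dot{D}$ is the annual dose rate, the quantity that the minor- and trace-element analyses feed into.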

  7. Application of pattern recognition techniques to crime analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bender, C.F.; Cox, L.A. Jr.; Chappell, G.A.

    1976-08-15

    The initial goal was to evaluate the capabilities of current pattern recognition techniques when applied to existing computerized crime data. Performance was to be evaluated both in terms of the system's capability to predict crimes and to optimize police manpower allocation. A relation was sought to predict the crime's susceptibility to solution, based on knowledge of the crime type, location, time, etc. The preliminary results of this work are discussed. They indicate that automatic crime analysis involving pattern recognition techniques is feasible, and that efforts to determine optimum variables and techniques are warranted. 47 figures (RWR)
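
    A modern, minimal stand-in for the idea described above (predicting a crime's susceptibility to solution from type, location and time) might look as follows. The data are synthetic, the features are raw integer codes (a real system would one-hot encode them), and logistic regression is simply one convenient classifier, not the technique the 1976 report used.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 300
    crime_type = rng.integers(0, 8, n)     # coded crime type
    district = rng.integers(0, 10, n)      # coded location
    hour = rng.integers(0, 24, n)          # time of day
    X = np.column_stack([crime_type, district, hour]).astype(float)
    # Toy labeling rule: daytime crimes in low-numbered districts get solved.
    y = ((hour >= 8) & (hour <= 18) & (district < 5)).astype(int)

    model = LogisticRegression().fit(X, y)
    print("P(solved) for [type=3, district=2, hour=10]:",
          model.predict_proba([[3.0, 2.0, 10.0]])[0, 1])
    ```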

  8. Applications of Electromigration Techniques: Applications of Electromigration Techniques in Food Analysis

    Science.gov (United States)

    Wieczorek, Piotr; Ligor, Magdalena; Buszewski, Bogusław

    Electromigration techniques, including capillary electrophoresis (CE), are widely used for separation and identification of compounds present in food products. These techniques may also be considered as alternate and complementary with respect to commonly used analytical techniques, such as high-performance liquid chromatography (HPLC), or gas chromatography (GC). Applications of CE concern the determination of high-molecular compounds, like polyphenols, including flavonoids, pigments, vitamins, food additives (preservatives, antioxidants, sweeteners, artificial pigments) are presented. Also, the method developed for the determination of proteins and peptides composed of amino acids, which are basic components of food products, are studied. Other substances such as carbohydrates, nucleic acids, biogenic amines, natural toxins, and other contaminations including pesticides and antibiotics are discussed. The possibility of CE application in food control laboratories, where analysis of the composition of food and food products are conducted, is of great importance. CE technique may be used during the control of technological processes in the food industry and for the identification of numerous compounds present in food. Due to the numerous advantages of the CE technique it is successfully used in routine food analysis.

  9. Review and classification of variability analysis techniques with clinical applications.

    Science.gov (United States)

    Bravi, Andrea; Longtin, André; Seely, Andrew J E

    2011-10-10

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine, and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made the interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, which often necessitates a mathematical transform of the time-series. Our aims are to summarize a broad literature, to promote a shared vocabulary that would improve the exchange of ideas, and to facilitate the comparison of results between different studies. We conclude with challenges for the evolving science of variability analysis.

  10. Review and classification of variability analysis techniques with clinical applications

    Science.gov (United States)

    2011-01-01

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine, and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made the interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, which often necessitates a mathematical transform of the time-series. Our aims are to summarize a broad literature, to promote a shared vocabulary that would improve the exchange of ideas, and to facilitate the comparison of results between different studies. We conclude with challenges for the evolving science of variability analysis. PMID:21985357
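
    As a concrete taste of the statistical domain in this classification, the sketch below computes two elementary variability measures for a time-series; the R-R interval values are invented for illustration.

    ```python
    import numpy as np

    def variability_summary(x):
        """Two elementary 'statistical domain' variability measures for a
        time-series (e.g. R-R intervals): the standard deviation and RMSSD,
        the root mean square of successive differences."""
        x = np.asarray(x, dtype=float)
        sd = x.std(ddof=1)
        rmssd = np.sqrt(np.mean(np.diff(x) ** 2))
        return sd, rmssd

    rr_ms = [812, 790, 805, 830, 798, 815, 802]   # illustrative R-R intervals (ms)
    sd, rmssd = variability_summary(rr_ms)
    print(f"SDNN={sd:.1f} ms, RMSSD={rmssd:.1f} ms")
    ```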

  11. Automated thermal mapping techniques using chromatic image analysis

    Science.gov (United States)

    Buck, Gregory M.

    1989-01-01

    Thermal imaging techniques are introduced using a chromatic image analysis system and temperature sensitive coatings. These techniques are used for thermal mapping and surface heat transfer measurements on aerothermodynamic test models in hypersonic wind tunnels. Measurements are made on complex vehicle configurations in a timely manner and at minimal expense. The image analysis system uses separate wavelength filtered images to analyze surface spectral intensity data. The system was initially developed for quantitative surface temperature mapping using two-color thermographic phosphors but was found useful in interpreting phase change paint and liquid crystal data as well.
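
    The two-colour principle mentioned above, relating the ratio of two wavelength-filtered intensity images to surface temperature through a calibration curve, can be sketched as follows; the calibration values and images are placeholders, not real phosphor data.

    ```python
    import numpy as np

    def temperature_from_ratio(img_band1, img_band2, calib):
        """Map the per-pixel intensity ratio of two wavelength-filtered
        images to temperature via a monotonic calibration curve.
        calib is (ratios, temps), e.g. from a phosphor calibration."""
        ratio = img_band1 / np.clip(img_band2, 1e-6, None)
        ratios, temps = calib
        return np.interp(ratio, ratios, temps)

    # Placeholder calibration: ratio 0.2 -> 300 K up to ratio 2.0 -> 900 K.
    calib = (np.linspace(0.2, 2.0, 10), np.linspace(300.0, 900.0, 10))
    band1 = np.random.default_rng(1).uniform(0.5, 1.5, (4, 4))
    band2 = np.full((4, 4), 1.0)
    print(temperature_from_ratio(band1, band2, calib))
    ```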

  12. Using Machine Learning Techniques in the Analysis of Oceanographic Data

    Science.gov (United States)

    Falcinelli, K. E.; Abuomar, S.

    2017-12-01

    Acoustic Doppler Current Profilers (ADCPs) are oceanographic tools capable of collecting large amounts of current profile data. Using unsupervised machine learning techniques such as principal component analysis, fuzzy c-means clustering, and self-organizing maps, patterns and trends in an ADCP dataset are found. Cluster validity algorithms such as visual assessment of cluster tendency and clustering index are used to determine the optimal number of clusters in the ADCP dataset. These techniques prove to be useful in analysis of ADCP data and demonstrate potential for future use in other oceanographic applications.
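
    A minimal sketch of the dimensionality-reduction-plus-clustering pipeline described above, assuming a profiles-by-depth-bins matrix. Fuzzy c-means is not available in scikit-learn, so ordinary k-means stands in for the clustering step; the data are synthetic.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    # Rows = current profiles (one per time step), columns = depth bins.
    # Synthetic stand-in data; a real ADCP record would be loaded instead.
    rng = np.random.default_rng(42)
    profiles = rng.normal(size=(500, 30)).cumsum(axis=1)

    scores = PCA(n_components=3).fit_transform(profiles)   # pattern extraction
    labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)
    print("cluster sizes:", np.bincount(labels))
    ```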

  13. Mac protocols for cyber-physical systems

    CERN Document Server

    Xia, Feng

    2015-01-01

    This book provides a literature review of various wireless MAC protocols and techniques for achieving real-time and reliable communications in the context of cyber-physical systems (CPS). The evaluation analysis of IEEE 802.15.4 for CPS therein will give insights into configuration and optimization of critical design parameters of MAC protocols. In addition, this book also presents the design and evaluation of an adaptive MAC protocol for medical CPS, which exemplifies how to facilitate real-time and reliable communications in CPS by exploiting IEEE 802.15.4 based MAC protocols. This book wil

  14. Windows forensic analysis toolkit advanced analysis techniques for Windows 7

    CERN Document Server

    Carvey, Harlan

    2012-01-01

    Now in its third edition, Harlan Carvey has updated "Windows Forensic Analysis Toolkit" to cover Windows 7 systems. The primary focus of this edition is on analyzing Windows 7 systems and on processes using free and open-source tools. The book covers live response, file analysis, malware detection, timeline, and much more. The author presents real-life experiences from the trenches, making the material realistic and showing the why behind the how. New to this edition, the companion and toolkit materials are now hosted online. This material consists of electronic printable checklists, cheat sheets, free custom tools, and walk-through demos. This edition complements "Windows Forensic Analysis Toolkit, 2nd Edition", (ISBN: 9781597494229), which focuses primarily on XP. It includes complete coverage and examples on Windows 7 systems. It contains Lessons from the Field, Case Studies, and War Stories. It features companion online material, including electronic printable checklists, cheat sheets, free custom tools, ...

  15. A fresh look at the freeze-all protocol: a SWOT analysis.

    Science.gov (United States)

    Blockeel, Christophe; Drakopoulos, Panagiotis; Santos-Ribeiro, Samuel; Polyzos, Nikolaos P; Tournaye, Herman

    2016-03-01

    The 'freeze-all' strategy with the segmentation of IVF treatment, namely with the use of a GnRH antagonist protocol, GnRH agonist triggering, the elective cryopreservation of all embryos by vitrification and a frozen-thawed embryo transfer in a subsequent cycle, has become more popular. However, the approach still encounters drawbacks. In this opinion paper, a SWOT (strengths, weaknesses, opportunities and threats) analysis sheds light on the different aspects of this strategy. © The Author 2016. Published by Oxford University Press on behalf of the European Society of Human Reproduction and Embryology. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  16. Conference on Techniques of Nuclear and Conventional Analysis and Applications

    International Nuclear Information System (INIS)

    2012-01-01

    Full text: With their wide scope, particularly in the areas of environment, geology, mining, industry and the life sciences, analysis techniques are of great importance in both fundamental and applied research. The Conferences on Techniques for Nuclear and Conventional Analysis and Applications (TANCA) are registered in the national strategy of opening the university and national research centres at the local, national and international levels. This conference aims to: promote nuclear and conventional analytical techniques; contribute to the creation of synergy between the different players involved in these techniques, including universities, research organizations, regulatory authorities, economic operators, NGOs and others; inform and educate potential users about the performance of these techniques; strengthen exchanges and links between researchers, industry and policy makers; implement a programme of inter-laboratory comparison between Moroccan laboratories on the one hand, and their foreign counterparts on the other; and contribute to the research training of doctoral students and postdoctoral scholars. Given the relevance and importance of issues related to the environment and its impact on cultural heritage, this fourth edition of TANCA is devoted to the application of conventional and nuclear analytical techniques to questions tied to the environment and its impact on cultural heritage.

  17. Design and Analysis of an Enhanced Patient-Server Mutual Authentication Protocol for Telecare Medical Information System.

    Science.gov (United States)

    Amin, Ruhul; Islam, S K Hafizul; Biswas, G P; Khan, Muhammad Khurram; Obaidat, Mohammad S

    2015-11-01

    In order to access a remote medical server, patients generally use a smart card to log in to the server. It has been observed that most user (patient) authentication protocols suffer from smart-card stolen attack, meaning that an attacker can mount several common attacks after extracting the smart card information. Recently, Lu et al. proposed a session key agreement protocol between the patient and the remote medical server and claimed that the protocol is secure against the relevant security attacks. However, this paper presents several security attacks on Lu et al.'s protocol, such as identity trace attack, new smart card issue attack, patient impersonation attack and medical server impersonation attack. In order to fix the mentioned security pitfalls, including the smart-card stolen attack, this paper proposes an efficient remote mutual authentication protocol using smart cards. We have then simulated the proposed protocol using the widely accepted AVISPA simulation tool, whose results confirm that the protocol is secure against active and passive attacks, including replay and man-in-the-middle attacks. Moreover, rigorous security analysis proves that the proposed protocol provides strong protection against the relevant security attacks, including the smart-card stolen attack. We compare the proposed scheme with several related schemes in terms of computation cost and communication cost as well as security functionalities, and observe that the proposed scheme is comparatively better than related existing schemes.
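
    The paper's own scheme is not reproduced here, but the general shape of a nonce-based mutual authentication exchange, in which each party proves knowledge of a shared long-term key, can be sketched as follows. This is a generic illustration, not Lu et al.'s protocol nor the one proposed in the record.

    ```python
    import hmac, hashlib, secrets

    K = secrets.token_bytes(32)      # long-term key shared by card and server

    def respond(key, nonce_peer, nonce_own):
        # Each side proves knowledge of K by MACing both nonces; binding
        # both values prevents straightforward replay of old responses.
        return hmac.new(key, nonce_peer + nonce_own, hashlib.sha256).digest()

    # Card -> server: fresh nonce Nc; server -> card: nonce Ns plus proof.
    Nc, Ns = secrets.token_bytes(16), secrets.token_bytes(16)
    server_proof = respond(K, Nc, Ns)
    assert hmac.compare_digest(server_proof, respond(K, Nc, Ns))  # card checks server
    card_proof = respond(K, Ns, Nc)
    assert hmac.compare_digest(card_proof, respond(K, Ns, Nc))    # server checks card
    print("mutual authentication succeeded; a session key could be derived from K, Nc, Ns")
    ```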

  18. Establishing a protocol for element determination in human nail clippings by neutron activation analysis

    International Nuclear Information System (INIS)

    Sanches, Thalita Pinheiro; Saiki, Mitiko

    2011-01-01

    Human nail samples have been analyzed to evaluate occupational exposure and nutritional status and to diagnose certain diseases. However, sampling and washing protocols for nail analyses vary from study to study, not allowing comparisons between studies. One of the difficulties in analyzing nail samples is to eliminate only surface contamination without removing elements of interest from this tissue. In the present study, a protocol was defined in order to obtain reliable results for element concentrations in human nail clippings. Nail clippings collected from all 10 fingers or toes were first pre-cleaned using an ethyl alcohol solution to eliminate microbes. Then, the clippings were cut into small pieces and submitted to different washing reagents with shaking. Neutron activation analysis (NAA) was applied for nail sample analysis, which consisted of irradiating aliquots of samples together with synthetic elemental standards in the IEA-R1 nuclear research reactor, followed by gamma-ray spectrometry. Comparisons between the results obtained for nails submitted to different cleaning reagents indicated that the procedure using acetone and Triton X100 solution is more effective than that using nitric acid solution. Analyses in triplicate of a nail sample gave results with relative standard deviations lower than 15% for most elements, showing the homogeneity of the prepared sample. Qualitative analyses of different nail polishes showed that the presence of the elements determined in the present study is negligible in these products. Quality control of the analytical results indicated that the applied NAA procedure is adequate for human nail analysis. (author)

  19. Network meta-analysis: a technique to gather evidence from direct and indirect comparisons

    Science.gov (United States)

    2017-01-01

    Systematic reviews and pairwise meta-analyses of randomized controlled trials, at the intersection of clinical medicine, epidemiology and statistics, are positioned at the top of the evidence-based practice hierarchy. They are important tools for drug approval, for the formulation of clinical protocols and guidelines, and for decision-making. However, this traditional technique only partially yields the information that clinicians, patients and policy-makers need to make informed decisions, since it usually compares only two interventions at a time. In the market, regardless of the clinical condition under evaluation, many interventions are usually available and few of them have been studied head-to-head. This scenario precludes conclusions being drawn about the full profile (e.g. efficacy and safety) of all interventions. The recent development and introduction of a new technique – usually referred to as network meta-analysis, indirect meta-analysis, or multiple or mixed treatment comparisons – has allowed the estimation of metrics for all possible comparisons in the same model, simultaneously gathering direct and indirect evidence. Over the last years this statistical tool has matured as a technique, with models available for all types of raw data, producing different pooled effect measures, using both Frequentist and Bayesian frameworks, with different software packages. However, the conduct, reporting and interpretation of network meta-analysis still pose multiple challenges that should be carefully considered, especially because this technique inherits all assumptions from pairwise meta-analysis but with increased complexity. Thus, we aim to provide a basic explanation of how network meta-analysis is conducted, highlighting its risks and benefits for evidence-based practice, including information on the evolution of statistical methods, assumptions and steps for performing the analysis. PMID:28503228
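
    The core step that lets indirect evidence enter such models can be stated in one line. With a common comparator A, trials of A versus B and of A versus C yield an indirect estimate of C versus B; the notation below is generic, not taken from the paper.

    ```latex
    \hat{d}_{BC}^{\,\mathrm{ind}} \;=\; \hat{d}_{AC} - \hat{d}_{AB},
    \qquad
    \operatorname{Var}\!\bigl(\hat{d}_{BC}^{\,\mathrm{ind}}\bigr)
      \;=\; \operatorname{Var}\!\bigl(\hat{d}_{AB}\bigr)
      + \operatorname{Var}\!\bigl(\hat{d}_{AC}\bigr)
    ```

    The variance sum makes explicit why indirect estimates are less precise than head-to-head data of the same size; network meta-analysis combines both sources across the whole network of comparisons.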

  20. The application of value analysis techniques for complex problems

    International Nuclear Information System (INIS)

    Chiquelin, W.R.; Cossel, S.C.; De Jong, V.J.; Halverson, T.W.

    1986-01-01

    This paper discusses the application of the Value Analysis technique to the transuranic package transporter (TRUPACT). A team representing five different companies or organizations with diverse technical backgrounds was formed to analyze and recommend improvements. The results were a 38% system-wide savings, if incorporated, and a shipping container which is volumetrically and payload efficient as well as user friendly. The Value Analysis technique is a proven tool widely used in many diverse areas, both in government and in the private sector. Value Analysis uses functional diagramming of a piece of equipment or process to discretely identify every facet of the item being analyzed. A standard set of questions is then asked: What is it? What does it do? What does it cost? What else will do the task? And what would that cost? Using logic and a disciplined approach, Value Analysis yields a solution that performs the necessary functions at high quality and the lowest overall cost.

  1. A comparative analysis of soft computing techniques for gene prediction.

    Science.gov (United States)

    Goel, Neelam; Singh, Shailendra; Aseri, Trilok Chand

    2013-07-01

    The rapid growth of genomic sequence data for both human and nonhuman species has made analyzing these sequences, especially predicting genes in them, very important, and this is currently the focus of many research efforts. Besides its scientific interest to the molecular biology and genomics community, gene prediction is of considerable importance in human health and medicine. A variety of gene prediction techniques have been developed for eukaryotes over the past few years. This article reviews and analyzes the application of certain soft computing techniques in gene prediction. First, the problem of gene prediction and its challenges are described, followed by different soft computing techniques and their application to gene prediction. In addition, a comparative analysis of different soft computing techniques for gene prediction is given. Finally, some limitations of the current research activities and future research directions are provided. Copyright © 2013 Elsevier Inc. All rights reserved.

  2. Comprehensive protocol of traceability during IVF: the result of a multicentre failure mode and effect analysis.

    Science.gov (United States)

    Rienzi, L; Bariani, F; Dalla Zorza, M; Albani, E; Benini, F; Chamayou, S; Minasi, M G; Parmegiani, L; Restelli, L; Vizziello, G; Costa, A Nanni

    2017-08-01

    Can traceability of gametes and embryos be ensured during IVF? The use of a simple and comprehensive traceability system that covers the most susceptible phases of the IVF process minimizes the risk of mismatches. Mismatches in IVF are very rare but unfortunately possible, with dramatic consequences for both patients and health care professionals. Traceability is thus a fundamental aspect of treatment, and a clear process of patient and cell identification involving witnessing protocols has to be in place in every unit. To identify potential failures in the traceability process and to develop strategies to mitigate the risk of mismatches, failure mode and effects analysis (FMEA) has previously been used effectively. The FMEA approach is, however, a subjective analysis, strictly related to specific protocols, and thus the results are not always widely applicable. To reduce subjectivity and to obtain a widespread comprehensive protocol of traceability, a multicentre, centrally coordinated FMEA was performed. Seven representative Italian centres (three public and four private) were selected. The study had a duration of 21 months (from April 2015 to December 2016) and was centrally coordinated by a team of experts: a risk analysis specialist, an expert embryologist and a specialist in human factors. Principal investigators of each centre were first instructed about proactive risk assessment and FMEA methodology. A multidisciplinary team to perform the FMEA analysis was then formed in each centre. After mapping the traceability process, each team identified the possible causes of mistakes in their protocol. A risk priority number (RPN) for each identified potential failure mode was calculated. The results of the FMEA analyses were centrally investigated and consistent corrective measures suggested. The teams performed new FMEA analyses after the recommended implementations. In each centre, this study involved: the laboratory director, the Quality Control & Quality

  3. Quantitative assessment of in-solution digestion efficiency identifies optimal protocols for unbiased protein analysis

    DEFF Research Database (Denmark)

    Leon, Ileana R; Schwämmle, Veit; Jensen, Ole N

    2013-01-01

    A combination of qualitative and quantitative LC-MS/MS methods and statistical data analysis was used. In contrast to previous studies, we employed both standard qualitative as well as data-independent quantitative workflows to systematically assess trypsin digestion efficiency and bias using mitochondrial protein ... conditions (buffer, RapiGest, deoxycholate, urea), and two methods for removal of detergents prior to analysis of peptides (acid precipitation or phase separation with ethyl acetate). Our data-independent quantitative LC-MS/MS workflow quantified over 3700 distinct peptides with 96% completeness between all ... protocols and replicates, with an average 40% protein sequence coverage and an average of 11 peptides identified per protein. Systematic quantitative and statistical analysis of physicochemical parameters demonstrated that deoxycholate-assisted in-solution digestion combined with phase transfer allows...

  4. Comparing dynamical systems concepts and techniques for biomechanical analysis

    OpenAIRE

    van Emmerik, Richard E.A.; Ducharme, Scott W.; Amado, Avelino C.; Hamill, Joseph

    2016-01-01

    Traditional biomechanical analyses of human movement are generally derived from linear mathematics. While these methods can be useful in many situations, they do not describe behaviors in human systems that are predominately nonlinear. For this reason, nonlinear analysis methods based on a dynamical systems approach have become more prevalent in recent literature. These analysis techniques have provided new insights into how systems (1) maintain pattern stability, (2) transition into new stat...

  5. Reliability Analysis Techniques for Communication Networks in Nuclear Power Plant

    International Nuclear Information System (INIS)

    Lim, T. J.; Jang, S. C.; Kang, H. G.; Kim, M. C.; Eom, H. S.; Lee, H. J.

    2006-09-01

    The objectives of this project are to investigate and study existing reliability analysis techniques for communication networks in order to develop reliability analysis models for nuclear power plants' safety-critical networks. A comprehensive survey of current methodologies for communication network reliability is necessary. The major outputs of this study are the design characteristics of safety-critical communication networks, efficient algorithms for quantifying the reliability of communication networks, and preliminary models for assessing the reliability of safety-critical communication networks
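
    One of the listed outputs is algorithms for quantifying network reliability, and the brute-force reference algorithm is easy to state. The sketch below computes the exact two-terminal reliability of a toy network by enumerating edge states; this is feasible only for small networks, and the topology and availability values are invented for illustration.

    ```python
    from itertools import product

    def two_terminal_reliability(edges, p_up, s, t):
        """Exact source-terminal reliability of a small network by
        enumerating all up/down edge states (exponential in the number
        of edges, so suitable only as a reference algorithm)."""
        def connected(up_edges):
            seen, stack = {s}, [s]
            while stack:
                u = stack.pop()
                for a, b in up_edges:
                    for x, y in ((a, b), (b, a)):
                        if x == u and y not in seen:
                            seen.add(y)
                            stack.append(y)
            return t in seen

        rel = 0.0
        for state in product([True, False], repeat=len(edges)):
            prob, up = 1.0, []
            for e, is_up in zip(edges, state):
                prob *= p_up[e] if is_up else (1.0 - p_up[e])
                if is_up:
                    up.append(e)
            if connected(up):
                rel += prob
        return rel

    edges = [("A", "B"), ("B", "C"), ("A", "C")]   # a redundant triangle
    p_up = {e: 0.9 for e in edges}                 # per-link availability
    print(two_terminal_reliability(edges, p_up, "A", "C"))  # 0.981
    ```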

  6. Registered nurses' clinical reasoning in home healthcare clinical practice: A think-aloud study with protocol analysis.

    Science.gov (United States)

    Johnsen, Hege Mari; Slettebø, Åshild; Fossum, Mariann

    2016-05-01

    The home healthcare context can be unpredictable and complex, and requires registered nurses with a high level of clinical reasoning skills and professional autonomy. Thus, additional knowledge about registered nurses' clinical reasoning performance during patient home care is required. The aim of this study is to describe the cognitive processes and thinking strategies used by recently graduated registered nurses while caring for patients in home healthcare clinical practice. An exploratory qualitative think-aloud design with protocol analysis was used. The setting comprised home healthcare visits to patients with stroke, diabetes, and chronic obstructive pulmonary disease in seven healthcare districts in southern Norway, with a purposeful sample of eight registered nurses with one year of experience. Each nurse was interviewed using the concurrent think-aloud technique during three different home healthcare visits, giving a total of 24 visits. Follow-up interviews were conducted with each participant. The think-aloud sessions were transcribed and analysed using three-step protocol analysis. Recently graduated registered nurses focused on both general nursing concepts and concepts specific to the domains and tasks of home healthcare services as well as to different patient groups. Additionally, participants used several assertion types, cognitive processes, and thinking strategies. Our results showed that recently graduated registered nurses used both simple and complex cognitive processes involving both inductive and deductive reasoning. However, their reasoning was more reactive than proactive. The results may contribute to nursing practice in terms of developing effective nursing education programmes. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Analytical techniques for wine analysis: An African perspective; a review

    International Nuclear Information System (INIS)

    Villiers, André de; Alberts, Phillipus; Tredoux, Andreas G.J.; Nieuwoudt, Hélène H.

    2012-01-01

    Highlights: ► Analytical techniques developed for grape and wine analysis in Africa are reviewed. ► The utility of infrared spectroscopic methods is demonstrated. ► An overview of separation of wine constituents by GC, HPLC, CE is presented. ► Novel LC and GC sample preparation methods for LC and GC are presented. ► Emerging methods for grape and wine analysis in Africa are discussed. - Abstract: Analytical chemistry is playing an ever-increasingly important role in the global wine industry. Chemical analysis of wine is essential in ensuring product safety and conformity to regulatory laws governing the international market, as well as understanding the fundamental aspects of grape and wine production to improve manufacturing processes. Within this field, advanced instrumental analysis methods have been exploited more extensively in recent years. Important advances in instrumental analytical techniques have also found application in the wine industry. This review aims to highlight the most important developments in the field of instrumental wine and grape analysis in the African context. The focus of this overview is specifically on the application of advanced instrumental techniques, including spectroscopic and chromatographic methods. Recent developments in wine and grape analysis and their application in the African context are highlighted, and future trends are discussed in terms of their potential contribution to the industry.

  8. Analytical techniques for wine analysis: An African perspective; a review

    Energy Technology Data Exchange (ETDEWEB)

    Villiers, Andre de, E-mail: ajdevill@sun.ac.za [Department of Chemistry and Polymer Science, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa)]; Alberts, Phillipus [Department of Chemistry and Polymer Science, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa)]; Tredoux, Andreas G.J.; Nieuwoudt, Helene H. [Institute for Wine Biotechnology, Department of Viticulture and Oenology, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa)]

    2012-06-12

    Highlights: ► Analytical techniques developed for grape and wine analysis in Africa are reviewed. ► The utility of infrared spectroscopic methods is demonstrated. ► An overview of separation of wine constituents by GC, HPLC, CE is presented. ► Novel LC and GC sample preparation methods for LC and GC are presented. ► Emerging methods for grape and wine analysis in Africa are discussed. - Abstract: Analytical chemistry is playing an ever-increasingly important role in the global wine industry. Chemical analysis of wine is essential in ensuring product safety and conformity to regulatory laws governing the international market, as well as understanding the fundamental aspects of grape and wine production to improve manufacturing processes. Within this field, advanced instrumental analysis methods have been exploited more extensively in recent years. Important advances in instrumental analytical techniques have also found application in the wine industry. This review aims to highlight the most important developments in the field of instrumental wine and grape analysis in the African context. The focus of this overview is specifically on the application of advanced instrumental techniques, including spectroscopic and chromatographic methods. Recent developments in wine and grape analysis and their application in the African context are highlighted, and future trends are discussed in terms of their potential contribution to the industry.

  9. Evolution of the sedimentation technique for particle size distribution analysis

    International Nuclear Information System (INIS)

    Maley, R.

    1998-01-01

    After an introduction on the significance of particle size measurements, sedimentation methods are described, with emphasis on the evolution of the gravitational approach. The gravitational technique based on mass determination by X-ray absorption allows fast analysis through automation and easy data handling, in addition to providing the accuracy required by quality control and research applications.

  10. Comparative Analysis of Some Techniques in the Biological ...

    African Journals Online (AJOL)

    The experiments involved the simulation of conditions of a major spill by pouring crude oil on the cells from perforated cans and the in-situ bioremediation of the polluted soils using the techniques that consisted in the manipulation of different variables within the soil environment. The analysis of soil characteristics after a ...

  11. Tailored Cloze: Improved with Classical Item Analysis Techniques.

    Science.gov (United States)

    Brown, James Dean

    1988-01-01

    The reliability and validity of a cloze procedure used as an English-as-a-second-language (ESL) test in China were improved by applying traditional item analysis and selection techniques. The 'best' test items were chosen on the basis of item facility and discrimination indices, and were administered as a 'tailored cloze.' 29 references listed.
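
    The two indices named above are simple to compute. A sketch follows, with a synthetically generated response matrix and an arbitrary selection rule standing in for the study's actual cut-offs.

    ```python
    import numpy as np

    def item_stats(responses):
        """responses: examinees x items matrix of 0/1 scores.
        Returns item facility (proportion correct) and discrimination
        (point-biserial r between the item and the rest-of-test total)."""
        R = np.asarray(responses, dtype=float)
        facility = R.mean(axis=0)
        disc = []
        for j in range(R.shape[1]):
            rest = R.sum(axis=1) - R[:, j]    # total excluding the item itself
            disc.append(np.corrcoef(R[:, j], rest)[0, 1])
        return facility, np.array(disc)

    # Synthetic examinees: ability plus noise against graded item difficulty.
    rng = np.random.default_rng(3)
    ability = rng.normal(size=100)
    difficulty = np.linspace(-1.5, 1.5, 20)
    data = (ability[:, None] + rng.normal(size=(100, 20)) > difficulty).astype(int)

    fac, disc = item_stats(data)
    keep = (fac > 0.3) & (fac < 0.7) & (disc > 0.2)   # one common selection rule
    print(f"{keep.sum()} of 20 items retained for the tailored cloze")
    ```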

  12. The Recoverability of P-Technique Factor Analysis

    Science.gov (United States)

    Molenaar, Peter C. M.; Nesselroade, John R.

    2009-01-01

    It seems that just when we are about to lay P-technique factor analysis finally to rest as obsolete because of newer, more sophisticated multivariate time-series models using latent variables--dynamic factor models--it rears its head to inform us that an obituary may be premature. We present the results of some simulations demonstrating that even…

  13. Spectroscopic analysis technique for arc-welding process control

    Science.gov (United States)

    Mirapeix, Jesús; Cobo, Adolfo; Conde, Olga; Quintela, María Ángeles; López-Higuera, José-Miguel

    2005-09-01

    The spectroscopic analysis of the light emitted by thermal plasmas has found many applications, from chemical analysis to the monitoring and control of industrial processes. In particular, it has been demonstrated that the analysis of the thermal plasma generated during arc or laser welding can supply information about the process and, thus, about the quality of the weld. In some critical applications (e.g. the aerospace sector), early, real-time detection of defects in the weld seam (oxidation, porosity, lack of penetration, ...) is highly desirable, as it can reduce expensive non-destructive testing (NDT). Among other techniques, full spectroscopic analysis of the plasma emission is known to offer rich information about the process itself, but it is also very demanding in terms of real-time implementation. In this paper, we propose a technique for the analysis of the plasma emission spectrum that is able to detect, in real time, changes in the process parameters that could lead to the formation of defects in the weld seam. It is based on the estimation of the electronic temperature of the plasma through the analysis of the emission peaks from multiple atomic species. Unlike traditional techniques, which usually involve peak fitting to Voigt functions using the Levenberg-Marquardt recursive method, we employ the LPO (Linear Phase Operator) sub-pixel algorithm to accurately estimate the central wavelength of the peaks (allowing automatic identification of each atomic species) and cubic-spline interpolation of the noisy data to obtain the intensity and width of the peaks. Experimental tests on TIG welding using fibre-optic capture of light and a low-cost CCD-based spectrometer show that some typical defects can be easily detected and identified with this technique, whose typical processing time for multiple peak analysis is less than 20 ms running on a conventional PC.
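
    Electronic temperature estimation from emission peaks is commonly done with a two-line Boltzmann method; whether the paper uses exactly this variant is not stated, so the sketch below is a generic illustration with invented spectroscopic constants, not values from the study.

    ```python
    import math

    K_B = 8.617333262e-5  # Boltzmann constant in eV/K

    def two_line_temperature(I1, I2, line1, line2):
        """Excitation temperature from the relative intensity of two
        emission lines of the same species (two-line Boltzmann method).
        Each line is (wavelength_nm, A_ul, g_u, E_u_eV)."""
        lam1, A1, g1, E1 = line1
        lam2, A2, g2, E2 = line2
        ratio = (I1 * A2 * g2 * lam1) / (I2 * A1 * g1 * lam2)
        return (E2 - E1) / (K_B * math.log(ratio))

    # Illustrative argon-like numbers, not real spectroscopic constants.
    line1 = (696.5, 6.4e6, 3, 13.3)
    line2 = (706.7, 3.8e6, 5, 13.5)
    print(f"T_e ≈ {two_line_temperature(1.0, 0.8, line1, line2):.0f} K")
    ```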

  14. Failure mode and effects analysis of witnessing protocols for ensuring traceability during IVF.

    Science.gov (United States)

    Rienzi, Laura; Bariani, Fiorenza; Dalla Zorza, Michela; Romano, Stefania; Scarica, Catello; Maggiulli, Roberta; Nanni Costa, Alessandro; Ubaldi, Filippo Maria

    2015-10-01

    Traceability of cells during IVF is a fundamental aspect of treatment, and involves witnessing protocols. Failure mode and effects analysis (FMEA) is a method of identifying real or potential breakdowns in processes, and allows strategies to mitigate risks to be developed. To examine the risks associated with witnessing protocols, an FMEA was carried out in a busy IVF centre, before and after implementation of an electronic witnessing system (EWS). A multidisciplinary team was formed and moderated by human factors specialists. Possible causes of failures, and their potential effects, were identified and risk priority number (RPN) for each failure calculated. A second FMEA analysis was carried out after implementation of an EWS. The IVF team identified seven main process phases, 19 associated process steps and 32 possible failure modes. The highest RPN was 30, confirming the relatively low risk that mismatches may occur in IVF when a manual witnessing system is used. The introduction of the EWS allowed a reduction in the moderate-risk failure mode by two-thirds (highest RPN = 10). In our experience, FMEA is effective in supporting multidisciplinary IVF groups to understand the witnessing process, identifying critical steps and planning changes in practice to enable safety to be enhanced. Copyright © 2015 Reproductive Healthcare Ltd. Published by Elsevier Ltd. All rights reserved.
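
    The RPN figures quoted above come from the usual FMEA product of three ratings. A minimal sketch follows; the 1-10 scales are the common convention, and the example decomposition of an RPN of 30 is hypothetical, not taken from the paper.

    ```python
    def rpn(severity, occurrence, detection):
        """Risk priority number as commonly computed in FMEA: the product
        of severity, occurrence and detection scores (each typically
        rated on a 1-10 scale)."""
        for score in (severity, occurrence, detection):
            if not 1 <= score <= 10:
                raise ValueError("scores are expected on a 1-10 scale")
        return severity * occurrence * detection

    # Hypothetical decomposition of a top manual-witnessing score of 30:
    # catastrophic consequence, very rare occurrence, moderately detectable.
    print(rpn(severity=10, occurrence=1, detection=3))   # -> 30
    ```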

  15. Analysis of agreement between cardiac risk stratification protocols applied to participants of a center for cardiac rehabilitation

    Directory of Open Access Journals (Sweden)

    Ana A. S. Santos

    2016-01-01

    Full Text Available ABSTRACT Background: Cardiac risk stratification is related to the risk of the occurrence of events induced by exercise. Despite the existence of several protocols for calculating risk stratification, it remains unknown whether these protocols produce similar classifications. Objective: To evaluate the agreement between existing protocols on cardiac risk rating in cardiac patients. Method: The records of 50 patients from a cardiac rehabilitation program were analyzed, from which the following information was extracted: age, sex, weight, height, clinical diagnosis, medical history, risk factors, associated diseases, and the results from the most recent laboratory and complementary tests performed. This information was used for risk stratification of the patients with the protocols of the American College of Sports Medicine, the Brazilian Society of Cardiology, the American Heart Association, the protocol designed by Frederic J. Pashkow, the American Association of Cardiovascular and Pulmonary Rehabilitation, the Société Française de Cardiologie, and the Sociedad Española de Cardiología. Descriptive statistics were used to characterize the sample, and the agreement between the protocols was calculated using the Kappa coefficient, with a significance level of 5%. Results: Of the 21 agreement analyses between the protocols used for risk classification, 12 were considered significant, with nine classified as moderate and three as low. No agreements were classified as excellent. Different proportions were observed in each risk category, with significant differences between the protocols for all risk categories. Conclusion: The agreements between the protocols were considered low and moderate, and the risk proportions differed between protocols.
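
    The agreement statistic used in the study, Cohen's kappa, compares observed agreement with the agreement expected by chance. A minimal sketch with hypothetical classifications of ten patients by two protocols:

    ```python
    import numpy as np

    def cohens_kappa(r1, r2, categories):
        """Cohen's kappa for agreement between two risk-stratification
        protocols applied to the same patients."""
        r1, r2 = np.asarray(r1), np.asarray(r2)
        po = np.mean(r1 == r2)                        # observed agreement
        pe = sum(np.mean(r1 == c) * np.mean(r2 == c)  # chance agreement
                 for c in categories)
        return (po - pe) / (1 - pe)

    a = ["low", "mod", "high", "low", "mod", "high", "low", "low", "mod", "high"]
    b = ["low", "mod", "mod",  "low", "mod", "high", "low", "mod", "mod", "high"]
    print(f"kappa = {cohens_kappa(a, b, ['low', 'mod', 'high']):.2f}")  # 0.70
    ```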

  16. Study of analysis techniques of thermoluminescent dosimeters response

    International Nuclear Information System (INIS)

    Castro, Walber Amorim

    2002-01-01

The Personal Monitoring Service of the Centro Regional de Ciencias Nucleares uses TLD-700 material in its dosemeters. TLD analysis is carried out using a Harshaw-Bicron model 6600 automatic reading system. This system uses dry air instead of the traditional gaseous nitrogen. This innovation brought advantages to the service but introduced uncertainties in the reference readings of the detectors; one of these was observed for doses below 0.5 mSv. In this work, different techniques for analysing the TLD response were investigated and compared for dose values in this interval. These techniques include thermal pre-treatment and several glow curve analysis methods. The results showed the need for specific software that performs automatic background subtraction of the glow curve for each dosemeter. This software was developed and tested; preliminary results showed that it increases response reproducibility. (author)
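
    A minimal sketch of the kind of automatic background subtraction described, assuming the background can be approximated by a baseline fitted to the signal-free channels at the ends of the readout (the channel ranges and synthetic curve are invented, not taken from the 6600 reader):

```python
# Hedged sketch of per-dosemeter background subtraction from a TL glow
# curve: fit a straight baseline to the edge channels where no dosimetric
# TL peak is expected, then subtract it from the whole curve.
import numpy as np

def subtract_background(glow_curve, n_edge=20):
    channels = np.arange(len(glow_curve))
    edge = np.r_[channels[:n_edge], channels[-n_edge:]]
    coeffs = np.polyfit(edge, glow_curve[edge], deg=1)
    return glow_curve - np.polyval(coeffs, channels)

# Synthetic example: a Gaussian TL peak sitting on a sloping background.
ch = np.arange(200)
curve = 50 + 0.1 * ch + 400 * np.exp(-((ch - 120) / 15) ** 2)
net = subtract_background(curve)
print(f"net peak area = {net.sum():.0f}")
```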

  17. Sensitivity analysis technique for application to deterministic models

    International Nuclear Information System (INIS)

    Ishigami, T.; Cazzoli, E.; Khatib-Rahbar, M.; Unwin, S.D.

    1987-01-01

The characterization of severe accident source terms for light water reactors should include consideration of uncertainties. An important element of any uncertainty analysis is an evaluation of the sensitivity of the output probability distributions, reflecting source term uncertainties, to assumptions regarding the input probability distributions. Historically, response surface methods (RSMs) were developed to replace physical models with simplified models, using, for example, regression techniques, for extensive calculations. The purpose of this paper is to present a new method for sensitivity analysis that does not utilize an RSM but instead relies directly on the results obtained from the original computer code calculations. The merits of this approach are demonstrated by application of the proposed method to the suppression pool aerosol removal code (SPARC), and the results are compared with those obtained by sensitivity analysis with (a) the code itself, (b) a regression model, and (c) Iman's method.
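
    The appeal of avoiding a response surface can be illustrated with a generic sketch: when only the assumed input distribution changes, stored code results can be reweighted rather than recomputed. This illustrates the general idea under invented distributions; it is not the specific procedure of the paper:

```python
# Hedged illustration of RSM-free sensitivity analysis: reuse existing
# Monte Carlo runs of an expensive code and reweight them when the input
# distribution assumption changes. Generic sketch, not the paper's method.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Pretend these are stored results y_i = code(x_i) for inputs sampled
# from a baseline distribution p0 (both distributions are invented).
p0 = stats.lognorm(s=0.5, scale=1.0)
x = p0.rvs(5000, random_state=rng)
y = np.sqrt(x) + 0.1 * rng.normal(size=x.size)   # stand-in for the code

# Alternative input assumption p1: reweight instead of re-running.
p1 = stats.lognorm(s=0.8, scale=1.0)
w = p1.pdf(x) / p0.pdf(x)
w /= w.sum()

print(f"baseline mean of y:   {y.mean():.3f}")
print(f"reweighted mean of y: {np.sum(w * y):.3f}")
```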

  18. Multivariate Analysis Techniques for Optimal Vision System Design

    DEFF Research Database (Denmark)

    Sharifzadeh, Sara

The present thesis considers optimization of the spectral vision systems used for quality inspection of food items. The relationship between food quality, vision based techniques and spectral signature is described. The vision instruments for food analysis as well as datasets of the food items...... used in this thesis are described. The methodological strategies are outlined including sparse regression and pre-processing based on feature selection and extraction methods, supervised versus unsupervised analysis and linear versus non-linear approaches. One supervised feature selection algorithm...... (SSPCA) and DCT based characterization of the spectral diffused reflectance images for wavelength selection and discrimination. These methods together with some other state-of-the-art statistical and mathematical analysis techniques are applied on datasets of different food items; meat, dairy, fruits...

  19. Multiple predictor smoothing methods for sensitivity analysis: Description of techniques

    International Nuclear Information System (INIS)

    Storlie, Curtis B.; Helton, Jon C.

    2008-01-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (i) locally weighted regression (LOESS), (ii) additive models, (iii) projection pursuit regression, and (iv) recursive partitioning regression. Then, in the second and concluding part of this presentation, the indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present
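
    As a flavour of the first technique in the list, a LOESS smooth of model output against one sampled input can be used to judge that input's importance by the variance its fit explains. This is a generic sketch, not the authors' implementation:

```python
# Hedged sketch: LOESS (locally weighted regression) of output vs one
# input from a sampling-based analysis; the variance explained by the
# smooth serves as a nonparametric sensitivity indicator.
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

rng = np.random.default_rng(1)
x = rng.uniform(0, 1, 500)                            # sampled model input
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, 500)   # model output

smooth = lowess(y, x, frac=0.3)       # returns sorted (x, fitted y) pairs
x_s, y_s = smooth[:, 0], smooth[:, 1]

# Fraction of output variance explained by the nonparametric fit.
y_hat = np.interp(x, x_s, y_s)
r2 = 1 - np.var(y - y_hat) / np.var(y)
print(f"variance explained by LOESS smooth of x: {r2:.2f}")
```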

  20. DATA ANALYSIS TECHNIQUES IN SERVICE QUALITY LITERATURE: ESSENTIALS AND ADVANCES

    Directory of Open Access Journals (Sweden)

    Mohammed naved Khan

    2013-05-01

Academic and business researchers have long debated the most appropriate data analysis techniques that can be employed in conducting empirical research in the domain of services marketing. On the basis of an exhaustive review of the literature, the present paper attempts to provide a concise and schematic portrayal of generally followed data analysis techniques in the field of service quality literature. Collectively, the extant literature suggests that there is a growing trend among researchers to rely on higher order multivariate techniques, viz. confirmatory factor analysis, structural equation modeling etc., to generate and analyze complex models, while at times ignoring very basic and yet powerful procedures such as mean, t-Test, ANOVA and correlation. The marked shift in orientation of researchers towards using sophisticated analytical techniques can largely be attributed to the competition within the community of researchers in social sciences in general, and those working in the area of service quality in particular, as well as the growing demands of reviewers of journals. From a pragmatic viewpoint, it is expected that the paper will serve as a useful source of information and provide deeper insights to academic researchers, consultants, and practitioners interested in modelling patterns of service quality and arriving at optimal solutions to increasingly complex management problems.

  1. Application of industrial hygiene techniques for work-place exposure assessment protocols related to petro-chemical exploration and production field activities

    International Nuclear Information System (INIS)

    Koehn, J.

    1995-01-01

Standard industrial hygiene techniques for recognition, evaluation, and control can be directly applied to the development of technical protocols for workplace exposure assessment activities at a variety of field site locations. Categories of occupational hazards include chemical and physical agents. Examples of these types of hazards directly related to oil and gas exploration and production workplaces include hydrocarbons, benzene, oil mist, hydrogen sulfide, Naturally Occurring Radioactive Materials (NORM), asbestos-containing materials, and noise. Specific components of well process chemicals include potentially hazardous substances such as methanol, acrolein, chlorine dioxide, and hydrochloric acid. Other types of exposure hazards may result from non-routine sandblasting and painting operations

  2. Practical applications of activation analysis and other nuclear techniques

    International Nuclear Information System (INIS)

    Lyon, W.S.

    1982-01-01

Neutron activation analysis (NAA) is a versatile, sensitive, multielement, usually nondestructive analytical technique used to determine elemental concentrations in a variety of materials. Samples are irradiated with neutrons in a nuclear reactor and removed, and, for the nondestructive technique, the induced radioactivity is measured. This measurement of γ rays emitted from specific radionuclides makes possible the quantitative determination of the elements present. The method is described, advantages and disadvantages are listed, and a number of examples of its use are given. Two other nuclear methods, particle-induced x-ray emission and synchrotron-produced x-ray fluorescence, are also briefly discussed

  3. Maximum entropy technique in the doublet structure analysis

    International Nuclear Information System (INIS)

    Belashev, B.Z.; Panebrattsev, Yu.A.; Shakhaliev, Eh.I.; Soroko, L.M.

    1998-01-01

The Maximum Entropy Technique (MENT) for the solution of inverse problems is explained. An efficient computer program for solving the system of nonlinear equations encountered in the MENT has been developed and tested. The capabilities of the MENT are demonstrated on the example of doublet structure analysis of noisy experimental data. A comparison of the MENT results with the results of the Fourier algorithm technique without regularization is presented. The tolerable noise level is 30% for the MENT and only 0.1% for the Fourier algorithm
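
    A hedged sketch of the MENT idea on a doublet: restore a spectrum from a noisy, instrument-broadened measurement by minimising chi-squared minus a scaled entropy term. This is a generic illustration, not the authors' program; the kernel, noise level and regularization weight are invented and may need tuning:

```python
# Generic maximum-entropy restoration sketch for a doublet hidden in a
# blurred, noisy spectrum. Positivity is enforced by optimising log(f).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
n = 64
x = np.arange(n)
kernel = np.exp(-0.5 * ((x - n // 2) / 4.0) ** 2)
kernel /= kernel.sum()

def blur(f):
    # Instrumental broadening as convolution with the known kernel.
    return np.convolve(f, kernel, mode="same")

# True doublet and a noisy measurement of its blurred image.
truth = (np.exp(-0.5 * ((x - 28) / 1.5) ** 2)
         + np.exp(-0.5 * ((x - 36) / 1.5) ** 2))
data = blur(truth) + 0.05 * rng.normal(size=n)

def objective(log_f, alpha=0.01):
    f = np.exp(log_f)                          # positivity built in
    chi2 = np.sum((blur(f) - data) ** 2)       # fidelity to the data
    entropy = -np.sum(f * np.log(f + 1e-12))   # entropy of the map
    return chi2 - alpha * entropy

res = minimize(objective, np.zeros(n), method="L-BFGS-B")
restored = np.exp(res.x)
print("restored spectrum around the doublet:", np.round(restored[26:39], 2))
```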

  4. Using Job Analysis Techniques to Understand Training Needs for Promotores de Salud.

    Science.gov (United States)

    Ospina, Javier H; Langford, Toshiko A; Henry, Kimberly L; Nelson, Tristan Q

    2018-04-01

    Despite the value of community health worker programs, such as Promotores de Salud, for addressing health disparities in the Latino community, little consensus has been reached to formally define the unique roles and duties associated with the job, thereby creating unique job training challenges. Understanding the job tasks and worker attributes central to this work is a critical first step for developing the training and evaluation systems of promotores programs. Here, we present the process and findings of a job analysis conducted for promotores working for Planned Parenthood. We employed a systematic approach, the combination job analysis method, to define the job in terms of its work and worker requirements, identifying key job tasks, as well as the worker attributes necessary to effectively perform them. Our results suggest that the promotores' job encompasses a broad range of activities and requires an equally broad range of personal characteristics to perform. These results played an important role in the development of our training and evaluation protocols. In this article, we introduce the technique of job analysis, provide an overview of the results from our own application of this technique, and discuss how these findings can be used to inform a training and performance evaluation system. This article provides a template for other organizations implementing similar community health worker programs and illustrates the value of conducting a job analysis for clarifying job roles, developing and evaluating job training materials, and selecting qualified job candidates.

  5. Nuclear techniques of analysis in diamond synthesis and annealing

    Energy Technology Data Exchange (ETDEWEB)

    Jamieson, D. N.; Prawer, S.; Gonon, P.; Walker, R.; Dooley, S.; Bettiol, A.; Pearce, J. [Melbourne Univ., Parkville, VIC (Australia). School of Physics

    1996-12-31

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs.

  6. Reliability analysis of large scaled structures by optimization technique

    International Nuclear Information System (INIS)

    Ishikawa, N.; Mihara, T.; Iizuka, M.

    1987-01-01

This paper presents a reliability analysis based on an optimization technique using the PNET (Probabilistic Network Evaluation Technique) method for highly redundant structures having a large number of collapse modes. The approach exploits the merits of the optimization technique while incorporating the idea of the PNET method. The analytical process involves minimizing the safety index of the representative mode, subject to satisfaction of the mechanism condition and of positive external work. The procedure entails the sequential solution of a series of NLP (Nonlinear Programming) problems, where the correlation condition of the PNET method pertaining to the representative mode is taken as an additional constraint in the next analysis. Upon successive iterations, the final analysis is reached when the collapse probability of the subsequent mode is negligible compared with that of the first mode. The approximate collapse probability of the structure is defined as the sum of the collapse probabilities of the representative modes, classified by the extent of correlation. To confirm the validity of the proposed method, the conventional Monte Carlo simulation is also carried out using collapse load analysis. Finally, two fairly large structures were analyzed to illustrate the scope and application of the approach. (orig./HP)

  8. Development of fault diagnostic technique using reactor noise analysis

    International Nuclear Information System (INIS)

    Park, Jin Ho; Kim, J. S.; Oh, I. S.; Ryu, J. S.; Joo, Y. S.; Choi, S.; Yoon, D. B.

    1999-04-01

The ultimate goal of this project is to establish an analysis technique to diagnose the integrity of reactor internals using reactor noise. Reactor noise analysis techniques for PWR and CANDU NPPs (Nuclear Power Plants) were established, by which the dynamic characteristics of reactor internals and SPND instrumentation can be identified, and noise databases for each plant (both Korean and foreign) were constructed and compared. The changes in the dynamic characteristics of the Ulchin 1 and 2 reactor internals were also simulated under presumed fault conditions. Additionally, a portable reactor noise analysis system was developed so that real-time noise analysis can be performed directly at the plant site. The reactor noise analysis techniques developed, together with the database obtained from the fault simulations, can be used to establish a knowledge-based expert system for diagnosing abnormal NPP conditions, and the portable reactor noise analysis system may be utilized as a substitute for a plant IVMS (Internal Vibration Monitoring System). (author)
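
    A basic building block of reactor noise analysis is the power spectral density (PSD) of a detector signal. The hedged sketch below uses an invented signal; in practice, a shift of a structural resonance peak between a baseline PSD and a later one would flag a change in internals vibration:

```python
# Generic PSD estimation for a neutron-detector (e.g. SPND) noise signal
# using Welch's method. All signal parameters are illustrative.
import numpy as np
from scipy.signal import welch

fs = 200.0                          # sampling rate in Hz (assumed)
t = np.arange(0, 60, 1 / fs)        # one minute of signal
rng = np.random.default_rng(3)

# Synthetic detector signal: broadband noise plus an 8 Hz line standing
# in for a structural vibration mode of the internals.
signal = rng.normal(size=t.size) + 0.5 * np.sin(2 * np.pi * 8.0 * t)

freqs, psd = welch(signal, fs=fs, nperseg=4096)
mask = freqs > 1.0                  # ignore the very-low-frequency bins
peak = freqs[mask][np.argmax(psd[mask])]
print(f"dominant resonance near {peak:.1f} Hz")
```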

  9. Noble Gas Measurement and Analysis Technique for Monitoring Reprocessing Facilities

    International Nuclear Information System (INIS)

    William S. Charlton

    1999-01-01

    An environmental monitoring technique using analysis of stable noble gas isotopic ratios on-stack at a reprocessing facility was developed. This technique integrates existing technologies to strengthen safeguards at reprocessing facilities. The isotopic ratios are measured using a mass spectrometry system and are compared to a database of calculated isotopic ratios using a Bayesian data analysis method to determine specific fuel parameters (e.g., burnup, fuel type, fuel age, etc.). These inferred parameters can be used by investigators to verify operator declarations. A user-friendly software application (named NOVA) was developed for the application of this technique. NOVA included a Visual Basic user interface coupling a Bayesian data analysis procedure to a reactor physics database (calculated using the Monteburns 3.01 code system). The integrated system (mass spectrometry, reactor modeling, and data analysis) was validated using on-stack measurements during the reprocessing of target fuel from a U.S. production reactor and gas samples from the processing of EBR-II fast breeder reactor driver fuel. These measurements led to an inferred burnup that matched the declared burnup with sufficient accuracy and consistency for most safeguards applications. The NOVA code was also tested using numerous light water reactor measurements from the literature. NOVA was capable of accurately determining spent fuel type, burnup, and fuel age for these experimental results. Work should continue to demonstrate the robustness of this system for production, power, and research reactor fuels
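
    The Bayesian comparison step can be sketched on a one-dimensional grid. The ratio-versus-burnup curve and uncertainties below are invented for illustration; the actual NOVA database came from the Monteburns calculations described:

```python
# Hedged sketch: infer burnup from a measured noble-gas isotopic ratio
# by Bayesian comparison against a precomputed (here, invented) database.
import numpy as np

burnup_grid = np.linspace(1, 50, 200)        # GWd/tU grid
db_ratio = 0.2 + 0.01 * burnup_grid          # assumed ratio-vs-burnup model

measured, sigma = 0.45, 0.02                 # measured ratio and its error

prior = np.ones_like(burnup_grid)            # flat prior over burnup
likelihood = np.exp(-0.5 * ((db_ratio - measured) / sigma) ** 2)
posterior = prior * likelihood
posterior /= posterior.sum()                 # normalize over the grid

mean = np.sum(burnup_grid * posterior)
std = np.sqrt(np.sum((burnup_grid - mean) ** 2 * posterior))
print(f"inferred burnup: {mean:.1f} +/- {std:.1f} GWd/tU")
```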

  12. New trends in sample preparation techniques for environmental analysis.

    Science.gov (United States)

    Ribeiro, Cláudia; Ribeiro, Ana Rita; Maia, Alexandra S; Gonçalves, Virgínia M F; Tiritan, Maria Elizabeth

    2014-01-01

Environmental samples include a wide variety of complex matrices, with low concentrations of analytes and the presence of several interferences. Sample preparation is a critical step and the main source of uncertainty in the analysis of environmental samples, and it is usually laborious, costly, time-consuming, and polluting. In this context, there is increasing interest in developing faster, cost-effective, and environmentally friendly sample preparation techniques. Recently, new methods have been developed and optimized in order to miniaturize extraction steps, to reduce solvent consumption or become solventless, and to automate systems. This review attempts to present an overview of the fundamentals, procedure, and application of the most recently developed sample preparation techniques for the extraction, cleanup, and concentration of organic pollutants from environmental samples. These techniques include: solid phase microextraction, on-line solid phase extraction, microextraction by packed sorbent, dispersive liquid-liquid microextraction, and QuEChERS (Quick, Easy, Cheap, Effective, Rugged and Safe).

  13. Model order reduction techniques with applications in finite element analysis

    CERN Document Server

    Qu, Zu-Qing

    2004-01-01

Despite the continued rapid advance in computing speed and memory, the increase in the complexity of models used by engineers persists in outpacing them. Even where there is access to the latest hardware, simulations are often extremely computationally intensive and time-consuming when full-blown models are under consideration. The computational cost involved in dealing with high-order/many-degree-of-freedom models can, however, be offset by adroit computation. In this light, model-reduction methods have become a major goal of simulation and modeling research. Model reduction can also ameliorate problems in the correlation of widely used finite-element analyses and test analysis models produced by excessive system complexity. Model Order Reduction Techniques explains and compares such methods, focusing mainly on recent work in dynamic condensation techniques: - Compares the effectiveness of static, exact, dynamic, SEREP and iterative-dynamic condensation techniques in producing valid reduced-order mo...
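
    The simplest of the condensation methods compared, static (Guyan) condensation, fits in a few lines: partition the stiffness matrix into master and slave degrees of freedom and eliminate the slaves. A minimal sketch with an arbitrary small example:

```python
# Static (Guyan) condensation sketch: K_red = K_mm - K_ms K_ss^{-1} K_sm.
# The 4-DOF spring-chain stiffness matrix below is an arbitrary example.
import numpy as np

def guyan_reduce(K, masters):
    n = K.shape[0]
    slaves = [i for i in range(n) if i not in masters]
    Kmm = K[np.ix_(masters, masters)]
    Kms = K[np.ix_(masters, slaves)]
    Ksm = K[np.ix_(slaves, masters)]
    Kss = K[np.ix_(slaves, slaves)]
    # Eliminate the slave DOFs exactly (for static loads on masters).
    return Kmm - Kms @ np.linalg.solve(Kss, Ksm)

K = np.array([[ 2, -1,  0,  0],
              [-1,  2, -1,  0],
              [ 0, -1,  2, -1],
              [ 0,  0, -1,  1]], dtype=float)

K_red = guyan_reduce(K, masters=[0, 3])
print(K_red)   # static behaviour at DOFs 0 and 3 is preserved exactly
```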

  14. Novel technique for coal pyrolysis and hydrogenation production analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pfefferle, L.D.

    1990-01-01

    The overall objective of this study is to establish vacuum ultraviolet photoionization-MS and VUV pulsed EI-MS as useful tools for a simpler and more accurate direct mass spectrometric measurement of a broad range of hydrocarbon compounds in complex mixtures for ultimate application to the study of the kinetics of coal hydrogenation and pyrolysis processes. The VUV-MS technique allows ionization of a broad range of species with minimal fragmentation. Many compounds of interest can be detected with the 118 nm wavelength, but additional compound selectivity is achievable by tuning the wavelength of the photo-ionization source in the VUV. Resonant four wave mixing techniques in Hg vapor will allow near continuous tuning from about 126 to 106 nm. This technique would facilitate the scientific investigation of coal upgrading processes such as pyrolysis and hydrogenation by allowing accurate direct analysis of both stable and intermediate reaction products.

  15. TU-H-207A-09: An Automated Technique for Estimating Patient-Specific Regional Imparted Energy and Dose From TCM CT Exams Across 13 Protocols

    International Nuclear Information System (INIS)

    Sanders, J; Tian, X; Segars, P; Boone, J; Samei, E

    2016-01-01

Purpose: To develop an automated technique for estimating patient-specific regional imparted energy and dose from tube current modulated (TCM) computed tomography (CT) exams across a diverse set of head and body protocols. Methods: A library of 58 adult computational anthropomorphic extended cardiac-torso (XCAT) phantoms was used to model a patient population. A validated Monte Carlo program was used to simulate TCM CT exams on the entire library of phantoms for three head and 10 body protocols. The net imparted energy to the phantoms, normalized by dose length product (DLP), and the net tissue mass in each of the scan regions were computed. A knowledgebase containing relationships between normalized imparted energy and scanned mass was established. An automated computer algorithm was written to estimate the scanned mass from actual clinical CT exams. The scanned mass estimate, the DLP of the exam, and the knowledgebase were used to estimate the imparted energy to the patient. The algorithm was tested on 20 chest and 20 abdominopelvic TCM CT exams. Results: The normalized imparted energy increased with increasing kV for all protocols but was relatively unaffected by the strength of the TCM. The average imparted energy was 681 ± 376 mJ for abdominopelvic exams and 274 ± 141 mJ for chest exams. Overall, the method was successful in providing patient-specific estimates of imparted energy for 98% of the cases tested. Conclusion: Imparted energy normalized by DLP increased with increasing tube potential, but the strength of the TCM did not have a significant effect on the net amount of energy deposited in tissue. The automated program can be implemented into the clinical workflow to provide estimates of regional imparted energy and dose across a diverse set of clinical protocols.
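
    The estimation step reduces to a lookup: interpolate the knowledgebase curve of imparted energy per DLP at the estimated scanned mass, then scale by the exam DLP. A sketch with placeholder curve values (the real knowledgebase was built from the Monte Carlo simulations described):

```python
# Hedged sketch of the final estimation step. The knowledgebase values
# below are invented placeholders, not the study's simulated data.
import numpy as np

# Knowledgebase: scanned mass (kg) vs imparted energy per DLP (mJ per mGy*cm).
kb_mass = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
kb_energy_per_dlp = np.array([0.60, 0.90, 1.10, 1.25, 1.35])

def imparted_energy(scanned_mass_kg, dlp_mgy_cm):
    # Interpolate the normalized curve at the patient's scanned mass,
    # then scale by the exam's DLP to get imparted energy in mJ.
    norm = np.interp(scanned_mass_kg, kb_mass, kb_energy_per_dlp)
    return norm * dlp_mgy_cm

print(f"estimated imparted energy: {imparted_energy(27.0, 450.0):.0f} mJ")
```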

  16. A standardised protocol for texture feature analysis of endoscopic images in gynaecological cancer

    Directory of Open Access Journals (Sweden)

    Pattichis Marios S

    2007-11-01

Abstract Background: In the development of tissue classification methods, classifiers rely on significant differences between texture features extracted from normal and abnormal regions. Yet, significant differences can arise due to variations in the image acquisition method. For endoscopic imaging of the endometrium, we propose a standardized image acquisition protocol to eliminate significant statistical differences due to variations in: (i) the distance from the tissue (panoramic vs close up), (ii) differences in viewing angles, and (iii) color correction. Methods: We investigate texture feature variability for a variety of targets encountered in clinical endoscopy. All images were captured at clinically optimum illumination and focus using 720 × 576 pixels and 24-bit color for: (i) a variety of testing targets from a color palette with a known color distribution, (ii) different viewing angles, and (iii) two different distances from a calf endometrium and from a chicken cavity. Human images from the endometrium were also captured and analysed. For texture feature analysis, three different sets were considered: (i) Statistical Features (SF), (ii) Spatial Gray Level Dependence Matrices (SGLDM), and (iii) Gray Level Difference Statistics (GLDS). All images were gamma corrected, and the extracted texture feature values were compared against the texture feature values extracted from the uncorrected images. Statistical tests were applied to compare images from different viewing conditions so as to determine any significant differences. Results: For the proposed acquisition procedure, results indicate that there is no significant difference in texture features between the panoramic and close up views and between angles. For a calibrated target image, gamma correction provided an acquired image that was a significantly better approximation to the original target image. In turn, this implies that the texture features extracted from the corrected images provided for better
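
    One of the SGLDM (gray-level co-occurrence) features whose stability the protocol tests can be computed directly. A minimal sketch of co-occurrence contrast, before and after gamma correction; the 8-level quantization and single pixel offset are simplifying assumptions, not the paper's settings:

```python
# Hedged sketch: gray-level co-occurrence "contrast" feature, compared on
# an image before and after standard gamma correction.
import numpy as np

def gamma_correct(image, gamma=2.2):
    # Standard gamma correction of an 8-bit image.
    return (255.0 * (image / 255.0) ** (1.0 / gamma)).astype(np.uint8)

def glcm_contrast(image, levels=8, dx=1, dy=0):
    # Quantize to a small number of gray levels.
    q = (image.astype(int) * levels // 256).clip(0, levels - 1)
    h, w = q.shape
    a = q[0:h - dy, 0:w - dx]          # reference pixels
    b = q[dy:h, dx:w]                  # neighbours at offset (dx, dy)
    glcm = np.zeros((levels, levels))
    for i, j in zip(a.ravel(), b.ravel()):
        glcm[i, j] += 1
    glcm /= glcm.sum()
    i_idx, j_idx = np.indices(glcm.shape)
    return float(np.sum(glcm * (i_idx - j_idx) ** 2))

rng = np.random.default_rng(4)
img = rng.integers(0, 256, (64, 64), dtype=np.uint8)   # stand-in image
print(glcm_contrast(img))
print(glcm_contrast(gamma_correct(img)))
```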

  17. Small area analysis using micro-diffraction techniques

    International Nuclear Information System (INIS)

    Goehner, Raymond P.; Tissot, Ralph G. Jr.; Michael, Joseph R.

    2000-01-01

An overall trend toward smaller electronic packages and devices makes it increasingly important and difficult to obtain meaningful diffraction information from small areas. X-ray micro-diffraction, electron back-scattered diffraction (EBSD) and Kossel are micro-diffraction techniques used for crystallographic analysis including texture, phase identification and strain measurements. X-ray micro-diffraction is primarily used for phase analysis and residual strain measurements of areas between 10 microm and 100 microm. For areas this small, glass capillary optics are used for producing a usable collimated x-ray beam. These optics are designed to reflect x-rays below the critical angle, therefore allowing a larger solid acceptance angle at the x-ray source and resulting in brighter, smaller x-ray beams. The determination of residual strain using micro-diffraction techniques is very important to the semiconductor industry. Residual stresses have caused voiding of the interconnect metal, which then destroys electrical continuity. Being able to determine the residual stress helps industry to predict failures from the aging effects of interconnects due to this stress voiding. Stress measurements would be impossible using a conventional x-ray diffractometer; however, utilizing a 30 microm glass capillary, these small areas are readily accessible for analysis. Kossel produces a wide-angle diffraction pattern from fluorescent x-rays generated in the sample by an e-beam in an SEM. This technique can yield very precise lattice parameters for determining strain. Fig. 2 shows a Kossel pattern from a Ni specimen. Phase analysis on small areas is also possible using electron back-scattered diffraction (EBSD) and x-ray micro-diffraction techniques. EBSD has the advantage of allowing the user to observe the area of interest using the excellent imaging capabilities of the SEM. An EDS detector has been

  18. HISTOPATHOLOGICAL AND CYTOLOGICAL ANALYSIS OF TRANSMISSIBLE VENEREAL TUMOR IN DOGS AFTER TWO TREATMENT PROTOCOLS

    Directory of Open Access Journals (Sweden)

    Fabiana Aguena Sales Lapa

    2012-06-01

The transmissible venereal tumor (TVT) is a contagious neoplasm of round cells that frequently affects dogs. Treatment consists of chemotherapy, with vincristine alone being the most effective; however, the emergence of resistance to this agent, due to multidrug resistance mediated by P-glycoprotein (P-gp), a transporter protein encoded by the MDR1 gene, has led to its combination with other drugs. Recent studies demonstrated the antitumoral effect of the avermectins when associated with vincristine in the treatment of some neoplasms. Therefore, the objective of the present study was to compare the effectiveness of the standard treatment of TVT with vincristine alone against combined treatment with vincristine and ivermectin, evaluated through the number of applications required by the two protocols and through histopathological and cytological analysis of 50 dogs diagnosed with TVT during the period 2007 to 2010. The combined protocol significantly reduced the number of applications, and the cytological and histopathological findings corroborate the hypothesis that the combination of vincristine and ivermectin promotes faster healing than vincristine alone. Combination treatment with vincristine and ivermectin could in the future be an excellent therapeutic alternative for the treatment of TVT, probably reducing resistance to vincristine while simultaneously reducing the cost of TVT treatment and promoting faster recovery of the dog.

  19. Modular techniques for dynamic fault-tree analysis

    Science.gov (United States)

    Patterson-Hine, F. A.; Dugan, Joanne B.

    1992-01-01

    It is noted that current approaches used to assess the dependability of complex systems such as Space Station Freedom and the Air Traffic Control System are incapable of handling the size and complexity of these highly integrated designs. A novel technique for modeling such systems which is built upon current techniques in Markov theory and combinatorial analysis is described. It enables the development of a hierarchical representation of system behavior which is more flexible than either technique alone. A solution strategy which is based on an object-oriented approach to model representation and evaluation is discussed. The technique is virtually transparent to the user since the fault tree models can be built graphically and the objects defined automatically. The tree modularization procedure allows the two model types, Markov and combinatoric, to coexist and does not require that the entire fault tree be translated to a Markov chain for evaluation. This effectively reduces the size of the Markov chain required and enables solutions with less truncation, making analysis of longer mission times possible. Using the fault-tolerant parallel processor as an example, a model is built and solved for a specific mission scenario and the solution approach is illustrated in detail.
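
    The combinatorial half of such a hybrid can be sketched as a bottom-up evaluation of independent static gates, with any sequence-dependent module solved separately (e.g. as a Markov chain) and its result plugged in as a single basic event. A generic illustration, not the paper's tool:

```python
# Hedged fault-tree evaluation sketch: AND/OR gates over independent
# basic events; a Markov-solved module enters as a plain probability.
from dataclasses import dataclass, field
from typing import List, Union

@dataclass
class Gate:
    kind: str                                   # "AND" or "OR"
    children: List[Union["Gate", float]] = field(default_factory=list)

def probability(node) -> float:
    if isinstance(node, float):                 # basic event or solved module
        return node
    probs = [probability(c) for c in node.children]
    p = 1.0
    if node.kind == "AND":                      # all children fail
        for q in probs:
            p *= q
        return p
    for q in probs:                             # OR: 1 - product of survivals
        p *= (1.0 - q)
    return 1.0 - p

# Example: top = (A AND B) OR C, with C the result of a Markov-solved module.
tree = Gate("OR", [Gate("AND", [0.01, 0.02]), 0.005])
print(f"top event probability = {probability(tree):.6f}")
```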

  20. A review of residual stress analysis using thermoelastic techniques

    Energy Technology Data Exchange (ETDEWEB)

    Robinson, A F; Dulieu-Barton, J M; Quinn, S [University of Southampton, School of Engineering Sciences, Highfield, Southampton, SO17 1BJ (United Kingdom); Burguete, R L [Airbus UK Ltd., New Filton House, Filton, Bristol, BS99 7AR (United Kingdom)

    2009-08-01

    Thermoelastic Stress Analysis (TSA) is a full-field technique for experimental stress analysis that is based on infra-red thermography. The technique has proved to be extremely effective for studying elastic stress fields and is now well established. It is based on the measurement of the temperature change that occurs as a result of a stress change. As residual stress is essentially a mean stress it is accepted that the linear form of the TSA relationship cannot be used to evaluate residual stresses. However, there are situations where this linear relationship is not valid or departures in material properties due to manufacturing procedures have enabled evaluations of residual stresses. The purpose of this paper is to review the current status of using a TSA based approach for the evaluation of residual stresses and to provide some examples of where promising results have been obtained.
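
    For reference, the linear TSA relationship referred to here is commonly written in its standard first-order form (this is the textbook relation, not an equation reproduced from the paper); its independence from the mean stress is precisely why residual stress is invisible to the linear form:

```latex
% First-order (linear) thermoelastic relation under adiabatic, reversible
% loading: the temperature change is proportional to the change in the
% sum of the principal surface stresses, independent of the mean stress.
\Delta T = -\frac{\alpha}{\rho C_p} \, T_0 \, \Delta(\sigma_1 + \sigma_2)
% alpha: coefficient of thermal expansion, rho: density,
% C_p: specific heat at constant pressure, T_0: absolute temperature,
% sigma_1, sigma_2: principal surface stresses.
```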

  2. Technique Triangulation for Validation in Directed Content Analysis

    Directory of Open Access Journals (Sweden)

    Áine M. Humble PhD

    2009-09-01

Division of labor in wedding planning varies for first-time marriages, with three types of couples (traditional, transitional, and egalitarian) identified, but nothing is known about wedding planning for remarrying individuals. Using semistructured interviews, the author interviewed 14 couples in which at least one person had remarried and used directed content analysis to investigate the extent to which the aforementioned typology could be transferred to this different context. In this paper she describes how a triangulation of analytic techniques provided validation for couple classifications and also helped with moving beyond "blind spots" in data analysis. Analytic approaches were the constant comparative technique, rank order comparison, and visual representation of coding, using MAXQDA 2007's tool called TextPortraits.

  3. A BWR 24-month cycle analysis using multicycle techniques

    International Nuclear Information System (INIS)

    Hartley, K.D.

    1993-01-01

Boiling water reactor (BWR) fuel cycle design analyses have become increasingly challenging in the past several years. As utilities continue to seek improved capacity factors, reduced power generation costs, and reduced outage costs, longer cycle lengths and fuel design optimization become important considerations. Accurate multicycle analysis techniques are necessary to determine the viability of fuel designs and cycle operating strategies that meet reactor operating requirements, e.g., thermal and reactivity margin constraints, while minimizing overall fuel cycle costs. Siemens Power Corporation (SPC), Nuclear Division, has successfully employed multicycle analysis techniques with realistic rodded cycle depletions to demonstrate equilibrium fuel cycle performance in 24-month cycles. Analyses have been performed for a BWR/5 reactor, at both rated and uprated power conditions

  4. The interventional effect of new drugs combined with the Stupp protocol on glioblastoma: A network meta-analysis.

    Science.gov (United States)

    Li, Mei; Song, Xiangqi; Zhu, Jun; Fu, Aijun; Li, Jianmin; Chen, Tong

    2017-08-01

New therapeutic agents in combination with the standard Stupp protocol (temozolomide combined with radiotherapy for glioblastoma, described by Stupp R. in 2005) were assessed to evaluate whether they were superior to the Stupp protocol alone, to determine the optimum treatment regimen for patients with newly diagnosed glioblastoma. We implemented a search strategy to identify studies in the following databases: PubMed, Cochrane Library, EMBASE, CNKI, CBM, Wanfang, and VIP, and assessed the quality of data extracted from the included trials. Statistical software was used to perform the network meta-analysis. The novel therapeutic agents in combination with the Stupp protocol were all shown to be superior to the Stupp protocol alone for the treatment of newly diagnosed glioblastoma, ranked as follows: cilengitide 2000 mg/5/week, bevacizumab in combination with irinotecan, nimotuzumab, bevacizumab, cilengitide 2000 mg/2/week, cytokine-induced killer cell immunotherapy, and the Stupp protocol. In terms of serious adverse effects, the intervention group showed a 29% increase in the incidence of adverse events compared with the control group (patients treated with the Stupp protocol only), a statistically significant difference (RR=1.29; 95% CI 1.17-1.43; P<0.001). The most common adverse events were thrombocytopenia, lymphopenia, neutropenia, pneumonia, nausea, and vomiting, none of which differed significantly between the groups except for neutropenia, pneumonia, and embolism. All intervention drugs evaluated in our study were superior to the Stupp protocol alone when used in combination with it. However, we could not conclusively confirm whether cilengitide 2000 mg/5/week was the optimum regimen, as only one trial using this protocol was included in our study.

  5. Ion beam analysis techniques applied to large scale pollution studies

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D D; Bailey, G; Martin, J; Garton, D; Noorman, H; Stelcer, E; Johnson, P [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1994-12-31

Ion Beam Analysis (IBA) techniques are ideally suited to analysing the thousands of filter papers a year that may originate from a large-scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO, in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie, has established a large-area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 µm particle diameter cut-off and runs for 24 hours every Sunday and Wednesday using one Gillman 25 mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator-based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator-based IBA techniques are used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs.

  7. Analysis of Cell Phone Usage Using Correlation Techniques

    OpenAIRE

    T S R MURTHY; D. SIVA RAMA KRISHNA

    2011-01-01

The present paper is a sample survey analysis, examined using correlation techniques. The usage of mobile phones is almost unavoidable these days, and as such the authors have made a systematic survey, through a well-prepared questionnaire, of the use of mobile phones. The samples cover various economic groups across a population of over one lakh people. The results are scientifically categorized and interpreted to match the ground reality.

  8. Analysis of diagnostic calorimeter data by the transfer function technique

    Energy Technology Data Exchange (ETDEWEB)

    Delogu, R. S., E-mail: rita.delogu@igi.cnr.it; Pimazzoni, A.; Serianni, G. [Consorzio RFX, Corso Stati Uniti, 35127 Padova (Italy); Poggi, C.; Rossi, G. [Università degli Studi di Padova, Via 8 Febbraio 1848, 35122 Padova (Italy)

    2016-02-15

    This paper describes the analysis procedure applied to the thermal measurements on the rear side of a carbon fibre composite calorimeter with the purpose of reconstructing the energy flux due to an ion beam colliding on the front side. The method is based on the transfer function technique and allows a fast analysis by means of the fast Fourier transform algorithm. Its efficacy has been tested both on simulated and measured temperature profiles: in all cases, the energy flux features are well reproduced and beamlets are well resolved. Limits and restrictions of the method are also discussed, providing strategies to handle issues related to signal noise and digital processing.
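
    The core of the transfer-function technique can be sketched as frequency-domain deconvolution: if the rear-side temperature is the front-side flux convolved with the calorimeter's thermal impulse response, division in Fourier space (with some regularization against noise) recovers the flux. The impulse response and all numbers below are invented, not the diagnostic's calibration:

```python
# Hedged sketch of FFT-based deconvolution via a transfer function.
import numpy as np

fs = 100.0                                   # sampling rate, Hz (assumed)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(5)

h = np.exp(-t / 0.5)                         # assumed thermal impulse response
h /= h.sum()
flux = ((t > 1.0) & (t < 1.2)).astype(float) # true beam pulse (synthetic)
temp = np.convolve(flux, h)[: t.size] + 0.001 * rng.normal(size=t.size)

# Wiener-style division in the frequency domain: the small eps term
# prevents noise amplification where the transfer function is weak.
H = np.fft.rfft(h)
T = np.fft.rfft(temp)
eps = 1e-3 * np.abs(H).max()
flux_rec = np.fft.irfft(T * np.conj(H) / (np.abs(H) ** 2 + eps**2), n=t.size)

print(f"recovered pulse peak at t = {t[np.argmax(flux_rec)]:.2f} s")
```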

  9. FDTD technique based crosstalk analysis of bundled SWCNT interconnects

    International Nuclear Information System (INIS)

    Duksh, Yograj Singh; Kaushik, Brajesh Kumar; Agarwal, Rajendra P.

    2015-01-01

The equivalent electrical circuit model of bundled single-walled carbon nanotube based distributed RLC interconnects is employed for the crosstalk analysis. Accurate time-domain analysis of crosstalk effects in VLSI interconnects has emerged as an essential design criterion. This paper presents a brief description of the finite-difference time-domain (FDTD) numerical technique, which is intended for estimation of voltages and currents on coupled transmission lines. For the FDTD implementation, the stability of the proposed model is strictly restricted by the Courant condition. This method is used for the estimation of crosstalk-induced propagation delay and peak voltage in lossy RLC interconnects. Both functional and dynamic crosstalk effects are analyzed in the coupled transmission line. The effect of line resistance on crosstalk-induced delay and peak voltage under dynamic and functional crosstalk is also evaluated. The FDTD analysis and the SPICE simulations are carried out at the 32 nm technology node for global interconnects. It is observed that the analytical results obtained using the FDTD technique are in good agreement with the SPICE simulation results. The crosstalk-induced delay, propagation delay, and peak voltage obtained using the FDTD technique show average errors of 4.9%, 3.4% and 0.46%, respectively, in comparison to SPICE. (paper)
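
    A single-line flavour of the method: the telegrapher's equations for one lossy RLC line solved with a leapfrog FDTD update, the time step obeying the Courant condition noted above. The bundled-SWCNT case extends the scalars to vectors with coupling matrices; all line parameters here are illustrative:

```python
# Hedged 1D FDTD sketch of the telegrapher's equations for a lossy line:
#   dI/dt = -(1/L)(dV/dz + R*I),   dV/dt = -(1/C) dI/dz
import numpy as np

nz, nt = 200, 2000
L, C, R = 1e-6, 1e-10, 10.0            # per-unit-length H/m, F/m, ohm/m
dz = 1e-3                              # spatial step, m
dt = 0.9 * dz * np.sqrt(L * C)         # Courant-limited time step

V = np.zeros(nz)                       # voltages at integer nodes
I = np.zeros(nz - 1)                   # currents at half nodes

for n in range(nt):
    # Leapfrog: update currents from the voltage gradient (with loss R).
    I += -(dt / L) * ((V[1:] - V[:-1]) / dz + R * I)
    # Update voltages from the current divergence.
    V[1:-1] += -(dt / C) * (I[1:] - I[:-1]) / dz
    V[-1] += -(dt / C) * (0.0 - I[-1]) / dz   # open-circuited far end
    # Near end driven by a ramped unit step (hard voltage source).
    V[0] = min(1.0, n * dt / 50e-12)

print(f"far-end voltage after {nt * dt * 1e9:.1f} ns: {V[-1]:.3f} V")
```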

  10. Characterization of decommissioned reactor internals: Monte Carlo analysis technique

    International Nuclear Information System (INIS)

    Reid, B.D.; Love, E.F.; Luksic, A.T.

    1993-03-01

    This study discusses computer analysis techniques for determining activation levels of irradiated reactor component hardware to yield data for the Department of Energy's Greater-Than-Class C Low-Level Radioactive Waste Program. The study recommends the Monte Carlo Neutron/Photon (MCNP) computer code as the best analysis tool for this application and compares the technique to direct sampling methodology. To implement the MCNP analysis, a computer model would be developed to reflect the geometry, material composition, and power history of an existing shutdown reactor. MCNP analysis would then be performed using the computer model, and the results would be validated by comparison to laboratory analysis results from samples taken from the shutdown reactor. The report estimates uncertainties for each step of the computational and laboratory analyses; the overall uncertainty of the MCNP results is projected to be ±35%. The primary source of uncertainty is identified as the material composition of the components, and research is suggested to address that uncertainty

  11. Analysis of the End-by-Hop Protocol for Secure Aggregation in Sensor Networks

    DEFF Research Database (Denmark)

    Zenner, Erik

In order to save bandwidth and thus battery power, sensor network measurements are sometimes aggregated en-route while being reported back to the querying server. Authentication of the measurements then becomes a challenge if message integrity is important for the application. At ESAS 2007, the End-by-Hop protocol for securing in-network aggregation for sensor nodes was presented. The solution was claimed to be secure and efficient and to provide the possibility of trading off bandwidth against computation time on the server. In this paper, we disprove these claims. We describe several attacks against the proposed solution and point out shortcomings in the original complexity analysis. In particular, we show that the proposed solution is inferior to a naive solution without in-network aggregation both in security and in efficiency.

  12. Is there a social gradient of sarcopenia? A meta-analysis and systematic review protocol.

    Science.gov (United States)

    Green, Darci; Duque, Gustavo; Fredman, Nick; Rizvi, Aoun; Brennan-Olsen, Sharon Lee

    2018-01-13

Sarcopenia (or loss of muscle mass and function) is a relatively new area within the field of musculoskeletal research and medicine. Investigating whether there is a social gradient, including occupation type and income level, of sarcopenia, as observed for other diseases, will contribute significantly to the limited evidence base for this disease. This new information may inform the prevention and management of sarcopenia and widen the evidence base to support existing and future health campaigns. We will conduct a systematic search of the databases PubMed, Ovid, CINAHL, Scopus and EMBASE to identify articles that investigate associations between social determinants of health and sarcopenia in adults aged 50 years and older. Eligibility of the selected studies will be determined by two independent reviewers. The methodological quality of eligible studies will be assessed according to predetermined criteria. Established statistical methods to identify and control for heterogeneity will be used, and where appropriate, we will conduct a meta-analysis. In the event that heterogeneity prevents numerical synthesis, a best evidence analysis will be employed. This systematic review protocol adheres to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses Protocols reporting guidelines and will be registered with the International Prospective Register of Systematic Reviews (PROSPERO). This systematic review will use published data, thus ethical permissions will not be required. In addition to peer-reviewed publication, our results will be presented at (inter)national conferences relevant to the field of sarcopenia, ageing and/or musculoskeletal health and disseminated both electronically and in print. CRD42017072253.

  13. Different techniques of multispectral data analysis for vegetation fraction retrieval

    Science.gov (United States)

    Kancheva, Rumiana; Georgiev, Georgi

    2012-07-01

Vegetation monitoring is one of the most important applications of remote sensing technologies. In respect to farmlands, the assessment of crop condition constitutes the basis of monitoring growth, development, and yield processes. Plant condition is defined by a set of biometric variables, such as density, height, biomass amount, leaf area index, etc. The canopy cover fraction is closely related to these variables and is indicative of the state of the growth process. At the same time, it is a defining factor of the soil-vegetation system's spectral signatures. That is why spectral mixture decomposition is a primary objective in remotely sensed data processing and interpretation, specifically in agricultural applications. The actual usefulness of the applied methods depends on their prediction reliability. The goal of this paper is to present and compare different techniques for quantitative endmember extraction from the reflectance of soil-crop patterns. These techniques include: linear spectral unmixing, two-dimensional spectra analysis, spectral ratio analysis (vegetation indices), spectral derivative analysis (red edge position), and colorimetric analysis (tristimulus values sum, chromaticity coordinates and dominant wavelength). The objective is to reveal their potential, accuracy and robustness for plant fraction estimation from multispectral data. Regression relationships have been established between crop canopy cover and various spectral estimators.
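
    The first technique listed, linear spectral unmixing, reduces to constrained least squares on endmember spectra. A minimal sketch with invented endmembers (the band values are placeholders, not the paper's measurements):

```python
# Hedged sketch of linear spectral unmixing: solve
#   reflectance = F_veg * vegetation + F_soil * soil
# for non-negative fractions that sum to one.
import numpy as np
from scipy.optimize import nnls

# Endmember reflectance in 4 bands (green, red, red-edge, NIR); invented.
soil = np.array([0.10, 0.15, 0.20, 0.25])
vegetation = np.array([0.08, 0.05, 0.30, 0.50])

mixed = 0.35 * vegetation + 0.65 * soil      # synthetic observed pixel

E = np.column_stack([vegetation, soil])
fractions, _ = nnls(E, mixed)                # non-negative abundances
fractions /= fractions.sum()                 # sum-to-one normalisation
print(f"vegetation fraction = {fractions[0]:.2f}")
```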

  14. Real-Time QoS Routing Protocols in Wireless Multimedia Sensor Networks: Study and Analysis.

    Science.gov (United States)

    Alanazi, Adwan; Elleithy, Khaled

    2015-09-02

Many routing protocols have been proposed for wireless sensor networks. These routing protocols are almost always based on energy efficiency. However, recent advances in complementary metal-oxide semiconductor (CMOS) cameras and small microphones have led to the development of Wireless Multimedia Sensor Networks (WMSN) as a class of wireless sensor networks which pose additional challenges. The transmission of imaging and video data needs routing protocols with both energy efficiency and Quality of Service (QoS) characteristics in order to guarantee the efficient use of the sensor nodes and effective access to the collected data. Also, with the integration of real-time applications in Wireless Sensor Networks (WSNs), the use of QoS routing protocols is not only becoming a significant topic but is also gaining the attention of researchers. In designing an efficient QoS routing protocol, guaranteeing reliability and end-to-end delay while conserving energy is critical. Thus, considerable research has been focused on designing energy-efficient and robust QoS routing protocols. In this paper, we present state-of-the-art research work based on real-time QoS routing protocols for WMSNs that have already been proposed. This paper categorizes the real-time QoS routing protocols into probabilistic and deterministic protocols. In addition, both categories are classified into soft and hard real-time protocols by highlighting the QoS issues, including the limitations and features of each protocol. Furthermore, we have compared the performance of mobility-aware query-based real-time QoS routing protocols from each category using Network Simulator-2 (NS2). This paper also focuses on design challenges and future research directions, as well as highlighting the characteristics of each QoS routing protocol.

  15. Gas chromatographic isolation technique for compound-specific radiocarbon analysis

    International Nuclear Information System (INIS)

    Uchida, M.; Kumamoto, Y.; Shibata, Y.; Yoneda, M.; Morita, M.; Kawamura, K.

    2002-01-01

    Full text: We present here a gas chromatographic isolation technique for the compound-specific radiocarbon analysis of biomarkers from marine sediments. The biomarkers (fatty acids, hydrocarbons and sterols) were isolated in amounts sufficient for radiocarbon analysis using a preparative capillary gas chromatograph (PCGC) system. The PCGC system used here is composed of an HP 6890 GC with FID, a cooled injection system (CIS, Gerstel, Germany), a zero-dead-volume effluent splitter, and a cryogenic preparative collection device (PFC, Gerstel). For AMS analysis, we need to separate and recover a sufficient quantity of each target compound (>50 μgC). Yields of target compounds, from C14 to C40 n-alkanes, were sufficient for AMS measurement, and were approximately 80% for higher-molecular-weight compounds above C30 n-alkanes. Compound-specific radiocarbon analysis of organic compounds, as well as compound-specific stable isotope analysis, provides valuable information on origins and carbon cycling in marine systems. Using the above PCGC conditions, we applied compound-specific radiocarbon analysis to marine sediments from the western North Pacific; the results showed its possibilities as a chronology tool for estimating sediment age from organic matter in paleoceanographic studies, in areas where amounts of planktonic foraminifera sufficient for radiocarbon analysis by accelerator mass spectrometry (AMS) are difficult to obtain due to dissolution of calcium carbonate. (author)

  16. Image Analysis Technique for Material Behavior Evaluation in Civil Structures

    Science.gov (United States)

    Moretti, Michele; Rossi, Gianluca

    2017-01-01

    The article presents a hybrid monitoring technique for the measurement of the deformation field. The goal is to obtain information about crack propagation in existing structures, for the purpose of monitoring their state of health. The measurement technique is based on the capture and analysis of a set of digital images. Special markers that can be removed without damaging existing structures, such as historical masonry, were applied to the surfaces. The digital image analysis was done using software specifically designed in Matlab to track the markers and determine the evolution of the deformation state. The method can be used on any type of structure, but is particularly suitable when the surface of the structure must not be damaged. A series of experiments carried out on masonry walls of the Oliverian Museum (Pesaro, Italy) and Palazzo Silvi (Perugia, Italy) allowed validation of the elaborated procedure by comparing the results with those derived from traditional measuring techniques. PMID:28773129

  17. Laparoscopic colorectal surgery in learning curve: Role of implementation of a standardized technique and recovery protocol. A cohort study

    Directory of Open Access Journals (Sweden)

    Gaetano Luglio

    2015-06-01

    Conclusion: Proper laparoscopic colorectal surgery is safe and leads to excellent results in terms of recovery and short term outcomes, even in a learning curve setting. Key factors for better outcomes and shortening the learning curve seem to be the adoption of a standardized technique and training model along with the strict supervision of an expert colorectal surgeon.

  18. Fault tree technique: advances in probabilistic and logical analysis

    International Nuclear Information System (INIS)

    Clarotti, C.A.; Amendola, A.; Contini, S.; Squellati, G.

    1982-01-01

    Fault tree reliability analysis is used for assessing the risk associated with systems of increasing complexity (phased mission systems, systems with multistate components, systems with non-monotonic structure functions). Much care must be taken to ensure that the fault tree technique is not used beyond its valid range of application. To this end, a critical review of the mathematical foundations of fault tree reliability analysis is carried out. Limitations are highlighted and potential solutions to open problems are suggested. Moreover, an overview is given of the most recent developments in the implementation of integrated software (the SALP-MP, SALP-NOT and SALP-CAFT codes) for the analysis of a wide class of systems.
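
    As a minimal illustration of the probabilistic side of the fault tree technique (not of the SALP codes, whose internals the record does not describe), the sketch below evaluates the top-event probability of a small coherent tree with independent basic events, using the standard gate rules P(AND) = Π p_i and P(OR) = 1 − Π(1 − p_i):

    ```python
    # Minimal coherent fault-tree evaluation with independent basic events.
    # AND gate: product of input probabilities; OR gate: 1 - prod(1 - p).
    # Tree structure and probabilities are invented for the illustration.
    from math import prod

    def p_and(*probs):
        """AND gate: all independent inputs must fail."""
        return prod(probs)

    def p_or(*probs):
        """OR gate: at least one independent input fails."""
        return 1.0 - prod(1.0 - p for p in probs)

    # Hypothetical basic-event probabilities.
    pump_a, pump_b = 1e-3, 1e-3        # redundant pumps
    grid, diesel = 1e-4, 1e-2          # main and backup power

    # TOP = (both pumps fail) OR (both power sources fail)
    top = p_or(p_and(pump_a, pump_b), p_and(grid, diesel))
    print(f"top-event probability = {top:.3e}")
    ```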

  19. Temperature analysis of laser ignited metalized material using spectroscopic technique

    Science.gov (United States)

    Bassi, Ishaan; Sharma, Pallavi; Daipuriya, Ritu; Singh, Manpreet

    2018-05-01

    The temperature measurement of a laser-ignited aluminized nano-energetic mixture using spectroscopy has great scope in analysing material characteristics and combustion. Spectroscopic analysis permits an in-depth study of the combustion of materials that is difficult to achieve using standard pyrometric methods. Laser ignition was used because it consumes less energy compared to electric ignition, while the ignited material dissipates the same energy, with the same impact, as with electric ignition. The presented research is primarily focused on the temperature analysis of an energetic material comprising explosive material mixed with nano-material and ignited with the help of a laser. A spectroscopic technique is used to estimate the temperature during the ignition process. The nano-energetic mixture used in this research does not contain any material that is sensitive to high impact.

  20. Improvement and verification of fast reactor safety analysis techniques

    International Nuclear Information System (INIS)

    Jackson, J.F.

    1975-01-01

    An initial analysis of the KIWI-TNT experiment using the VENUS-II disassembly code has been completed. The calculated fission energy release agreed with the experimental value to within about 3 percent. An initial model for analyzing the SNAPTRAN-2 core disassembly experiment was also developed along with an appropriate equation-of-state. The first phase of the VENUS-II/PAD comparison study was completed through the issuing of a preliminary report describing the results. A new technique to calculate a P-V-work curve as a function of the degree of core expansion following a disassembly excursion has been developed. The technique provides results that are consistent with the ANL oxide-fuel equation-of-state in VENUS-II. Evaluation and check-out of this new model are currently in progress

  1. On discriminant analysis techniques and correlation structures in high dimensions

    DEFF Research Database (Denmark)

    Clemmensen, Line Katrine Harder

    This paper compares several recently proposed techniques for performing discriminant analysis in high dimensions, and illustrates that the various sparse methods differ in prediction abilities depending on their underlying assumptions about the correlation structures in the data. The techniques ... the methods in two: those that assume independence between the variables and thus use a diagonal estimate of the within-class covariance matrix, and those that assume dependence between the variables and thus use an estimate of the within-class covariance matrix which also estimates the correlations between ... variables. The two groups of methods are compared and the pros and cons are exemplified using different cases of simulated data. The results illustrate that the estimate of the covariance matrix is an important factor with respect to choice of method, and the choice of method should thus be driven by the nature ...

  2. Some problems of calibration technique in charged particle activation analysis

    International Nuclear Information System (INIS)

    Krasnov, N.N.; Zatolokin, B.V.; Konstantinov, I.O.

    1977-01-01

    It is shown that three different approaches to calibration technique, based on the use of average cross-section, equivalent target thickness and thick target yield, are adequate. Using the concept of thick target yield, a convenient charged particle activation equation is obtained. The possibility of the simultaneous determination of two impurities, from which the same isotope is formed, is pointed out. The use of the thick target yield concept facilitates the derivation of a simple formula for absolute and comparative methods of analysis. The methodical error does not exceed 10%. Calibration and determination of the expected sensitivity based on the thick target yield concept are also very convenient, because experimental determination of thick target yield values is a much simpler procedure than obtaining an activation curve or excitation function. (T.G.)
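
    The record does not reproduce the activation equation itself. For orientation only, one standard form of a thick-target activation relation (an assumed textbook form, not necessarily the exact formula the authors derive) is:

    ```latex
    % One standard thick-target activation relation (assumed form, for
    % orientation; the authors' exact equation is not given in the record).
    % A(t_irr): activity at end of irradiation; I: beam current; q: charge of
    % the incident particle; m: mass fraction of the analyte; Y(E): thick
    % target yield (product nuclei per incident particle) at beam energy E;
    % lambda: decay constant; t_irr: irradiation time.
    \[
      A(t_{\mathrm{irr}}) \;=\; \frac{I}{q}\, m\, Y(E)\,
      \bigl(1 - e^{-\lambda t_{\mathrm{irr}}}\bigr)
    \]
    ```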

  3. Ion beam analysis and spectrometry techniques for Cultural Heritage studies

    International Nuclear Information System (INIS)

    Beck, L.

    2013-01-01

    The implementation of experimental techniques for the characterisation of Cultural Heritage materials has to take certain requirements into account. The complexity of these past materials requires the development of new techniques of examination and analysis, or the transfer of technologies developed for the study of advanced materials. In addition, because of the precious nature of artworks, it is also necessary to use non-destructive methods that respect the integrity of objects. It is for this reason that methods using radiation and/or particles have played an important role in the scientific study of art history and archaeology since their discovery. X-ray and γ-ray spectrometry as well as ion beam analysis (IBA) are analytical tools at the service of Cultural Heritage. This report mainly presents experimental developments for IBA: PIXE, RBS/EBS and NRA. These developments were applied to the study of archaeological composite materials: layered materials or mixtures composed of organic and inorganic phases. Three examples are shown: the evolution of silvering techniques for the production of counterfeit coinage during the Roman Empire and in the 16th century, and the characterization of composite or mixed mineral/organic compounds such as bone and paint. In these last two cases, the combination of techniques gave original results on the proportion of the two phases: apatite/collagen in bone, pigment/binder in paintings. Another part of this report is dedicated to the non-invasive/non-destructive characterization of prehistoric pigments, in situ, for rock art studies in caves and in the laboratory. Finally, the perspectives of this work are presented. (author) [fr]

  4. Biomechanical reposition techniques in anterior shoulder dislocation: a randomised multicentre clinical trial - the BRASD-trial protocol.

    Science.gov (United States)

    Baden, David N; Roetman, Martijn H; Boeije, Tom; Roodheuvel, Floris; Mullaart-Jansen, Nieke; Peeters, Suzanne; Burg, Mike D

    2017-07-20

    Glenohumeral (shoulder) dislocations are the most common large-joint dislocations seen in the emergency department (ED). They cause pain, often severe, and require timely intervention to minimise discomfort and tissue damage. Commonly used reposition or relocation techniques often involve traction and/or leverage. These techniques have high success rates but may be painful and time consuming. They may also cause complications. Recently, other techniques, the biomechanical reposition techniques (BRTs), have become more popular, since they may cause less pain, require less time and cause fewer complications. To our knowledge, no research exists comparing the various BRTs. Our objective is to establish which BRT or BRT combination is fastest, least painful and associated with the lowest complication rate for adult ED patients with anterior glenohumeral dislocations (AGDs). Adults presenting to the participating EDs with isolated AGDs, as determined by radiographs, will be randomised to one of three BRTs: Cunningham, modified Milch or scapular manipulation. Main study parameters/endpoints are ED length of stay and patients' self-reported pain. Secondary study parameters/endpoints are procedure times, need for analgesic and/or sedative medications, iatrogenic complications and rates of successful reduction. Non-biomechanical AGD repositioning techniques based on traction and/or leverage are inherently painful and potentially harmful. We believe that the three BRTs used in this study are more physiological, more patient friendly, less likely to cause pain, more time efficient and less likely to produce complications. By comparing these three techniques, we hope to improve the care provided to adults with acute AGDs by reducing their ED length of stay and minimising pain and procedure-related complications. We also hope to define which of the three BRTs is quickest, most likely to be successful and least likely to require sedative or analgesic medications to achieve reduction.

  5. Comparative Analysis between Podography and Radiography in the Management of Idiopathic Clubfeet by Ponseti Technique.

    Science.gov (United States)

    Trivedi, Vikas; Badhwar, Sumit; Dube, Abhay S

    2017-02-01

    Idiopathic clubfoot is one of the most common and oldest known congenital foot anomalies, and there are controversies regarding its optimum management protocol and the methodologies for evaluating its functional outcome. This paper proposes a simple, reasonable and easily reproducible technique, podography, for the clinical and functional evaluation of clubfoot treated by the popular Ponseti technique. The aim was to compare the foot bimalleolar (FBM) angle method (podography) and radiography in the management of idiopathic clubfoot by Ponseti's technique and its functional evaluation. Sixty feet of 48 patients with idiopathic clubfoot deformity were assessed in terms of FBM by podography (foot print on paper and FBM angle drawing) and radiologically: before starting treatment, after 6 weeks, and at 6-monthly intervals, with a maximum follow-up of 4.8 years (range 1.2 to 4.8 years). The mean age at the start of treatment was 1.5 years (2 months to 2.5 years). Functional evaluation was done with Magone's scoring system. After treatment, 92 percent of patients had good correction (FBM greater than 70 degrees), which correlated well with a post-treatment Magone's score of greater than 80 (good to excellent) in nearly 85 percent of cases. Radiologically, talocalcaneal angles in both views improved in only 60 percent of cases. Radiological criteria show inconsistent correlation with functional outcome for feet treated by Ponseti's technique. Podography (FBM angle analysis) is a simple, objective, cost-effective, radiation-free, easily reproducible and highly reliable clinical criterion for the assessment of deformity correction in clubfoot treated by Ponseti's technique, with an excellent correlation with functional outcome.
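
    The record defines the FBM angle only as a tracing-based measurement; purely as a hypothetical illustration, an FBM-style angle can be computed from digitized 2D landmarks as below (all landmark names and coordinates are invented):

    ```python
    # Hypothetical computation of a foot-bimalleolar (FBM)-style angle from
    # digitized 2D landmarks on a foot print; coordinates are invented.
    import numpy as np

    def angle_between(u, v):
        """Angle in degrees between two 2D vectors."""
        cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
        return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

    # Landmarks (x, y) in cm: malleoli and heel / second-toe points.
    medial_malleolus = np.array([2.0, 10.0])
    lateral_malleolus = np.array([6.0, 9.0])
    heel = np.array([4.0, 0.0])
    second_toe = np.array([3.5, 18.0])

    bimalleolar_axis = lateral_malleolus - medial_malleolus
    foot_axis = second_toe - heel

    print(f"FBM-style angle: {angle_between(bimalleolar_axis, foot_axis):.1f} deg")
    ```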

  6. Development of flow injection analysis technique for uranium estimation

    International Nuclear Information System (INIS)

    Paranjape, A.H.; Pandit, S.S.; Shinde, S.S.; Ramanujam, A.; Dhumwad, R.K.

    1991-01-01

    Flow injection analysis is increasingly used as a process control analytical technique in many industries. It involves injection of the sample at a constant rate into a steadily flowing stream of reagent and passing this mixture through a suitable detector. This paper describes the development of such a system for the analysis of uranium (VI) and (IV) and its gross gamma activity. It is amenable to on-line or automated off-line monitoring of uranium and its activity in process streams. The sample injection port is suitable for automated injection of radioactive samples. The performance of the system has been tested for the colorimetric response of U(VI) samples at 410 nm in the range of 35 to 360 mg/ml in nitric acid medium, using a Metrohm 662 photometer and a recorder as the detector assembly. The precision of the method is found to be better than +/- 0.5%. This technique, with certain modifications, is used for the analysis of U(VI) in the range 0.1-3 mg/aliquot by the alcoholic thiocyanate procedure within +/- 1.5% precision. Similarly, the precision for the determination of U(IV) in the range 15-120 mg at 650 nm is found to be better than 5%. With a NaI well-type detector in the flow line, the gross gamma counting of the solution under flow is found to be within a precision of +/- 5%. (author). 4 refs., 2 figs., 1 tab
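
    As a hedged sketch of the data-reduction step implied by the colorimetric response test (the record gives no algorithm), a linear calibration of detector absorbance against concentration, inverted for unknowns, might look as follows; all numbers are invented:

    ```python
    # Hypothetical linear calibration for a colorimetric flow-injection
    # detector: fit absorbance vs. concentration, then invert the fit.
    import numpy as np

    conc = np.array([35., 100., 180., 270., 360.])               # standards (invented)
    absorbance = np.array([0.071, 0.204, 0.365, 0.548, 0.731])   # detector response

    slope, intercept = np.polyfit(conc, absorbance, 1)           # least-squares line

    def concentration(a: float) -> float:
        """Invert the calibration line for an unknown absorbance reading."""
        return (a - intercept) / slope

    unknown = 0.42
    print(f"estimated concentration: {concentration(unknown):.1f}")
    ```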

  7. Burnout prediction using advance image analysis coal characterization techniques

    Energy Technology Data Exchange (ETDEWEB)

    Edward Lester; Dave Watts; Michael Cloke [University of Nottingham, Nottingham (United Kingdom). School of Chemical Environmental and Mining Engineering

    2003-07-01

    The link between petrographic composition and burnout has been investigated previously by the authors. However, these predictions were based on 'bulk' properties of the coal, including the proportion of each maceral or the reflectance of the macerals in the whole sample. Combustion studies relating burnout with microlithotype analysis, or similar, remain less common, partly because the technique is more complex than maceral analysis. Despite this, it is likely that any burnout prediction based on petrographic characteristics will become more accurate if it includes information about the maceral associations and the size of each particle. Chars from 13 coals, 106-125 micron size fractions, were prepared using a Drop Tube Furnace (DTF) at 1300°C, 200 ms residence time and 1% oxygen. These chars were then refired in the DTF at 1300°C, 5% oxygen and residence times of 200, 400 and 600 ms. The progressive burnout of each char was compared with the characteristics of the initial coals. This paper presents an extension of previous studies in that it relates combustion behaviour to coals that have been characterized on a particle-by-particle basis using advanced image analysis techniques. 13 refs., 7 figs.

  8. An analysis of moderate sedation protocols used in dental specialty programs: a retrospective observational study.

    Science.gov (United States)

    Setty, Madhavi; Montagnese, Thomas A; Baur, Dale; Aminoshariae, Anita; Mickel, Andre

    2014-09-01

    Pain and anxiety control is critical in dental practice. Moderate sedation is a useful adjunct in managing a variety of conditions that make it difficult or impossible for some people to undergo certain dental procedures. The purpose of this study was to analyze the sedation protocols used in 3 dental specialty programs at the Case Western Reserve University School of Dental Medicine, Cleveland, OH. A retrospective analysis was performed using dental school records of patients receiving moderate sedation in the graduate endodontic, periodontic, and oral surgery programs from January 1, 2010, to December 31, 2012. Information was gathered and the data compiled regarding the reasons for sedation, age, sex, pertinent medical conditions, American Society of Anesthesiologists physical status classifications, routes of administration, drugs, dosages, failures, complications, and other information that was recorded. The reasons for the use of moderate sedation were anxiety (54%), local anesthesia failures (15%), fear of needles (15%), severe gag reflex (8%), and claustrophobia with the rubber dam (8%). The most common medical conditions were hypertension (17%), asthma (15%), and bipolar disorder (8%). Most patients were classified as American Society of Anesthesiologists class II. More women (63.1%) were treated than men (36.9%). The mean age was 45 years. Monitoring and drugs varied among the programs. The most common tooth treated in the endodontic program was the mandibular molar. There are differences in the moderate sedation protocols used in the endodontic, periodontic, and oral surgery programs regarding monitoring, drugs used, and record keeping. Copyright © 2014 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  9. Hierarchical evaluation of electrical stimulation protocols for chronic wound healing: An effect size meta-analysis.

    Science.gov (United States)

    Khouri, Charles; Kotzki, Sylvain; Roustit, Matthieu; Blaise, Sophie; Gueyffier, Francois; Cracowski, Jean-Luc

    2017-09-01

    Electrical stimulation (ES) has been tested for decades to improve chronic wound healing. However, uncertainty remains about the magnitude of the efficacy and the best applicable protocol. We conducted an effect size meta-analysis to assess the overall efficacy of ES on wound healing, to compare the efficacy of the different modalities of electrical stimulation, and to determine whether efficacy differs depending on the etiology, size, and age of the chronic wound. Twenty-nine randomized clinical trials with 1,510 patients and 1,753 ulcers were selected. The overall efficacy of ES on wound healing was an SMD of 0.72 (95% CI: 0.48, 1.00), corresponding to a moderate to large effect size. We found that unidirectional high voltage pulsed current (HVPC) with the active electrode over the wound was the best evidence-based protocol to improve wound healing, with an SMD of 0.8 (95% CI: 0.38, 1.21), while evaluation of the efficacy of direct current was limited by the small number of studies. ES was more effective on pressure ulcers than on venous and diabetic ulcers, and efficacy tended to be inversely associated with wound size and duration. This study confirms the overall efficacy of ES in enhancing the healing of chronic wounds and highlights the superiority of HVPC over other types of current; HVPC is more effective on pressure ulcers, and its efficacy is inversely associated with wound size and duration. This will help standardize future ES practices. © 2017 by the Wound Healing Society.
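
    A minimal sketch of the kind of effect-size pooling described above, using the DerSimonian-Laird random-effects estimator (a standard choice; the record does not state which estimator the authors used), with invented study data:

    ```python
    # Minimal DerSimonian-Laird random-effects pooling of standardized mean
    # differences (SMDs); study effect sizes and variances are invented.
    import numpy as np

    smd = np.array([0.9, 0.5, 1.1, 0.4, 0.8])       # per-study effect sizes
    var = np.array([0.10, 0.08, 0.15, 0.05, 0.12])  # per-study variances

    w = 1.0 / var                                   # fixed-effect weights
    mu_fe = np.sum(w * smd) / np.sum(w)
    Q = np.sum(w * (smd - mu_fe) ** 2)              # heterogeneity statistic
    k = len(smd)
    tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

    w_re = 1.0 / (var + tau2)                       # random-effects weights
    mu_re = np.sum(w_re * smd) / np.sum(w_re)
    se_re = np.sqrt(1.0 / np.sum(w_re))
    lo, hi = mu_re - 1.96 * se_re, mu_re + 1.96 * se_re
    print(f"pooled SMD = {mu_re:.2f} (95% CI {lo:.2f}, {hi:.2f})")
    ```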

  10. Analysis of Cultural Heritage by Accelerator Techniques and Analytical Imaging

    Science.gov (United States)

    Ide-Ektessabi, Ari; Toque, Jay Arre; Murayama, Yusuke

    2011-12-01

    In this paper we present the results of experimental investigations using two very important accelerator techniques: (1) synchrotron radiation XRF and XAFS; and (2) accelerator mass spectrometry and multispectral analytical imaging for the investigation of cultural heritage. We also introduce a complementary approach to the investigation of artworks which is noninvasive and nondestructive and can be applied in situ. Four major projects are discussed to illustrate the potential applications of these accelerator and analytical imaging techniques: (1) investigation of Mongolian textiles (Genghis Khan and Kublai Khan period) using XRF, AMS and electron microscopy; (2) XRF studies of pigments collected from Korean Buddhist paintings; (3) creation of a database of the elemental composition and spectral reflectance of more than 1000 Japanese pigments which have been used for traditional Japanese paintings; and (4) visible light-near infrared spectroscopy and multispectral imaging of degraded malachite and azurite. The XRF measurements of the Japanese and Korean pigments could be used to complement the results of pigment identification by analytical imaging through spectral reflectance reconstruction. On the other hand, analysis of the Mongolian textiles revealed that they were produced between the 12th and 13th centuries. Elemental analysis of the samples showed that they contained traces of gold, copper, iron and titanium. Based on the age and trace elements in the samples, it was concluded that the textiles were produced during the height of power of the Mongol empire, which makes them a valuable cultural heritage. Finally, the analysis of the degraded and discolored malachite and azurite demonstrates how multispectral analytical imaging can complement the results of high-energy-based techniques.

  11. Recursion vs. Replication in Simple Cryptographic Protocols

    DEFF Research Database (Denmark)

    Huttel, Hans; Srba, Jiri

    2005-01-01

    We use some recent techniques from process algebra to draw several conclusions about the well-studied class of ping-pong protocols introduced by Dolev and Yao. In particular, we show that all nontrivial properties, including reachability and equivalence checking with respect to the whole of van Glabbeek's spectrum, ... of messages in the sense of Amadio, Lugiez and Vanackere. We conclude by showing that reachability analysis for a replicative variant of the protocol becomes decidable.

  12. Symbolic manipulation techniques for vibration analysis of laminated elliptic plates

    Science.gov (United States)

    Andersen, C. M.; Noor, A. K.

    1977-01-01

    A computational scheme is presented for the free vibration analysis of laminated composite elliptic plates. The scheme is based on Hamilton's principle, the Rayleigh-Ritz technique and symmetry considerations, and is implemented with the aid of the MACSYMA symbolic manipulation system. The MACSYMA system, through differentiation, integration, and simplification of analytic expressions, produces highly efficient FORTRAN code for the evaluation of the stiffness and mass coefficients. Multiple use is made of this code to obtain not only the frequencies and mode shapes of the plate, but also the derivatives of the frequencies with respect to various material and geometric parameters.
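
    MACSYMA itself is not shown in the record; as a modern stand-in for the same differentiate-integrate-simplify-then-emit-FORTRAN workflow, a SymPy sketch might look as follows (the trial function and integrand are invented, not the paper's plate functional):

    ```python
    # Sketch of the symbolic-manipulation-to-FORTRAN workflow using SymPy as
    # a modern stand-in for MACSYMA; the energy expression is invented.
    import sympy as sp

    x, y, a, b = sp.symbols('x y a b', positive=True)
    w = sp.sin(sp.pi * x / a) * sp.sin(sp.pi * y / b)   # trial deflection shape

    # "Stiffness-like" coefficient: integrate a curvature term symbolically.
    k = sp.integrate(sp.integrate(sp.diff(w, x, 2)**2, (x, 0, a)), (y, 0, b))
    k = sp.simplify(k)

    # Emit Fortran code for the simplified coefficient.
    print(sp.fcode(k, assign_to='kcoef', standard=95))
    ```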

  13. Data Analysis Techniques for a Lunar Surface Navigation System Testbed

    Science.gov (United States)

    Chelmins, David; Sands, O. Scott; Swank, Aaron

    2011-01-01

    NASA is interested in finding new methods of surface navigation to allow astronauts to navigate on the lunar surface. In support of the Vision for Space Exploration, the NASA Glenn Research Center developed the Lunar Extra-Vehicular Activity Crewmember Location Determination System and performed testing at the Desert Research and Technology Studies event in 2009. A significant amount of sensor data was recorded during nine tests performed with six test subjects. This paper provides the procedure, formulas, and techniques for data analysis, as well as commentary on applications.

  14. The application of radiotracer technique for preconcentration neutron activation analysis

    International Nuclear Information System (INIS)

    Wang Xiaolin; Chen Yinliang; Sun Ying; Fu Yibei

    1995-01-01

    The application of the radiotracer technique to preconcentration neutron activation analysis (Pre-NAA) is studied, and a method for determining the chemical yield in Pre-NAA is developed. The method has been applied to the determination of gold, iridium and rhenium in steel and rock samples, with noble metal contents in the range of 1-20 ng·g⁻¹ (sample). In addition, the difference in accuracy between RNAA and Pre-NAA caused by the determination of chemical yield is also discussed.

  15. Nonactivation interaction techniques in the analysis of environmental samples

    International Nuclear Information System (INIS)

    Tolgyessy, J.

    1986-01-01

    Nonactivation interaction analytical methods are based on the interaction of nuclear and X-ray radiation with a sample, leading to absorption and backscattering, to the ionization of gases, or to the excitation of fluorescent X-rays, but not to the activation of the determined elements. From the point of view of environmental analysis, the most useful nonactivation interaction techniques are X-ray fluorescence with photon or charged-particle excitation, ionization of gases by nuclear radiation, elastic scattering of charged particles, and backscattering of beta radiation. The significant advantage of these methods is that they are nondestructive. (author)

  16. Prompt Gamma Activation Analysis (PGAA): Technique of choice for nondestructive bulk analysis of returned comet samples

    International Nuclear Information System (INIS)

    Lindstrom, D.J.; Lindstrom, R.M.

    1989-01-01

    Prompt gamma activation analysis (PGAA) is a well-developed analytical technique. The technique involves irradiation of samples in an external neutron beam from a nuclear reactor, with simultaneous counting of gamma rays produced in the sample by neutron capture. Capture of neutrons leads to excited nuclei which decay immediately with the emission of energetic gamma rays to the ground state. PGAA has several advantages over other techniques for the analysis of cometary materials: (1) It is nondestructive; (2) It can be used to determine abundances of a wide variety of elements, including most major and minor elements (Na, Mg, Al, Si, P, K, Ca, Ti, Cr, Mn, Fe, Co, Ni), volatiles (H, C, N, F, Cl, S), and some trace elements (those with high neutron capture cross sections, including B, Cd, Nd, Sm, and Gd); and (3) It is a true bulk analysis technique. Recent developments should improve the technique's sensitivity and accuracy considerably

  17. Evaluation of Extraction Protocols for Simultaneous Polar and Non-Polar Yeast Metabolite Analysis Using Multivariate Projection Methods

    Directory of Open Access Journals (Sweden)

    Nicolas P. Tambellini

    2013-07-01

    Full Text Available Metabolomic and lipidomic approaches aim to measure metabolites or lipids in the cell. Metabolite extraction is a key step in obtaining useful and reliable data for successful metabolite studies. Significant efforts have been made to identify the optimal extraction protocol for various platforms and biological systems, for both polar and non-polar metabolites. Here we report an approach utilizing chemoinformatics for the systematic comparison of protocols to extract both from a single sample of the model yeast organism Saccharomyces cerevisiae. Three chloroform/methanol/water partitioning-based extraction protocols found in the literature were evaluated for their effectiveness at reproducibly extracting both polar and non-polar metabolites. Fatty acid methyl esters and methoxyamine/trimethylsilyl-derivatized aqueous compounds were analyzed by gas chromatography mass spectrometry to evaluate non-polar and polar metabolite analysis, respectively. The comparative breadth and amount of recovered metabolites were evaluated using multivariate projection methods. This approach identified an optimal protocol, with 64 identified polar metabolites from 105 ion hits and 12 fatty acids recovered, and will potentially attenuate the error and variation associated with combining metabolite profiles from different samples for untargeted analysis of both polar and non-polar analytes. It also confirmed the value of using multivariate projection methods to compare established extraction protocols.

  18. Recent advances in hopanoids analysis: Quantification protocols overview, main research targets and selected problems of complex data exploration.

    Science.gov (United States)

    Zarzycki, Paweł K; Portka, Joanna K

    2015-09-01

    Pentacyclic triterpenoids, particularly hopanoids, are organism-specific compounds and are generally considered useful biomarkers that allow fingerprinting and classification of biological, environmental and geological samples. Simultaneous quantification of various hopanoids, together with a battery of related non-polar and low-molecular-mass compounds, may provide principal information for geochemical and environmental research focusing on both modern and ancient investigations. Target compounds can be derived from microbial biomass, water columns, sediments, coals, crude fossils or rocks. This creates a number of analytical problems due to the differing compositions of the analytical matrix and interfering compounds; therefore, proper optimization of quantification protocols for such biomarkers is still a challenge. In this work we summarize typical analytical protocols that have recently been applied for the quantification of hopanoid-like compounds from different samples. The main steps, including extraction of the components of interest, pre-purification, fractionation, derivatization and quantification, involving gas (1D and 2D) as well as liquid separation techniques (liquid-liquid extraction, solid-phase extraction, planar and low-resolution column chromatography, high-performance liquid chromatography), are described and discussed from a practical point of view, mainly based on experimental papers published within the last two years, during which a significant increase in hopanoids research was noticed. The second aim of this review is to describe the latest research trends concerning the determination of hopanoids and related low-molecular-mass lipids analyzed in various samples, including sediments, rocks, coals, crude oils and plant fossils as well as stromatolites and microbial biomass cultivated under different conditions. It has been found that the majority of the most recent papers are based on a uni- or bivariate approach to complex data analysis. Data interpretation involves ...

  19. Mobile Health Technology Interventions for Suicide Prevention: Protocol for a Systematic Review and Meta-Analysis.

    Science.gov (United States)

    Melia, Ruth; Francis, Kady; Duggan, Jim; Bogue, John; O'Sullivan, Mary; Chambers, Derek; Young, Karen

    2018-01-26

    Previous research has reported that two of the major barriers to help-seeking for individuals at risk of suicide are stigma and geographical isolation. Mobile technology offers a potential means of delivering evidence-based interventions with greater specificity to the individual, and at the time they are needed. Despite documented motivation by at-risk individuals to use mobile technology to track mental health and to support psychological interventions, there is a shortfall of outcome data on the efficacy of mobile health (mHealth) technology for suicide-specific outcomes. The objective of this study is to develop a protocol for a systematic review and meta-analysis that aims to evaluate the effectiveness of mobile technology-based interventions for suicide prevention. The search includes the Cochrane Central Register of Controlled Trials (CENTRAL: The Cochrane Library), MEDLINE, Embase, PsycINFO, CRESP and relevant sources of gray literature. Studies that have evaluated psychological or nonpsychological interventions delivered via mobile computing and communication technology, and that have suicidality as an outcome measure, will be included. Two authors will independently extract data and assess study suitability in accordance with the Cochrane Collaboration Risk of Bias Tool. Studies will be included if they measure at least one suicide outcome variable (i.e., suicidal ideation, suicidal intent, nonsuicidal self-injurious behavior, suicidal behavior). Secondary outcomes will be measures of symptoms of depression. Where studies are sufficiently homogeneous and reported outcomes are amenable to pooled synthesis, a meta-analysis will be performed. A narrative synthesis will be conducted if the data are unsuitable for a meta-analysis. The review is in progress, with findings expected by summer 2018. To date, evaluations of mobile technology-based interventions in suicide prevention have focused on evaluating content as opposed to efficacy. Indeed, previous research has ...

  20. Macro elemental analysis of food samples by nuclear analytical technique

    Science.gov (United States)

    Syahfitri, W. Y. N.; Kurniawati, S.; Adventini, N.; Damastuti, E.; Lestiani, D. D.

    2017-06-01

    Energy-dispersive X-ray fluorescence (EDXRF) spectrometry is a non-destructive, rapid, multi-elemental, accurate, and environmentally friendly analytical technique compared with other detection methods, which makes it applicable to food inspection. The macro elements calcium and potassium constitute important nutrients required by the human body for optimal physiological function, so the determination of Ca and K content in various foods is needed. The aim of this work is to demonstrate the applicability of EDXRF to food analysis. The analytical performance of non-destructive EDXRF was compared with that of other analytical techniques: neutron activation analysis (NAA) and atomic absorption spectrometry (AAS). The comparison of methods served as a cross-check of the analysis results and a way to overcome the limitations of the three methods. The results showed that Ca found in food using EDXRF and AAS was not significantly different (p-value 0.9687), whereas the p-value for K between EDXRF and NAA was 0.6575. The correlation between the results was also examined: the Pearson correlations for Ca and K were 0.9871 and 0.9558, respectively. Method validation using SRM NIST 1548a Typical Diet was also applied. The results showed good agreement between methods; therefore the EDXRF method can be used as an alternative method for the determination of Ca and K in food samples.
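
    A sketch of this kind of cross-method comparison, pairing a paired t-test with a Pearson correlation between two methods measuring the same samples; the values are invented, not the study's data:

    ```python
    # Cross-method comparison on invented data: paired t-test (difference
    # between methods) and Pearson correlation (agreement between methods).
    import numpy as np
    from scipy import stats

    edxrf = np.array([812., 640., 955., 720., 880., 610.])  # Ca, invented units
    aas   = np.array([805., 652., 948., 731., 872., 603.])

    t_stat, p_value = stats.ttest_rel(edxrf, aas)   # paired comparison
    r, r_p = stats.pearsonr(edxrf, aas)             # correlation of methods

    print(f"paired t-test p = {p_value:.4f}; Pearson r = {r:.4f}")
    ```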

  1. A critical analysis of a locally agreed protocol for clinical practice

    International Nuclear Information System (INIS)

    Owen, A.; Hogg, P.; Nightingale, J.

    2004-01-01

    Within the traditional scope of radiographic practice (including advanced practice) there is a need to demonstrate effective patient care and management. Such practice should be set within a context of appropriate evidence and should also reflect peer practice. To achieve this, the use of protocols is encouraged. Effective protocols can maximise care and management by minimising inter- and intra-professional variation; they can also allow detailed procedural records to be kept in case of legal claims. However, whilst literature exists to encourage the use of protocols, there is little published material available to indicate how to create, manage and archive them. This article uses an analytical approach to propose a suitable method for protocol creation and archival; it also offers suggestions on the scope and content of a protocol. To achieve this, an existing clinical protocol for radiographer reporting of barium enemas is analysed to draw out the general issues. Proposals for protocol creation, management and archival were identified. The clinical practice described or inferred in the protocol should be drawn from evidence, such as peer-reviewed material, national standards and peer practice. The protocol should include an explanation of how to proceed when radiographers reach the limit of their ability. It should refer to the initial training required to undertake the clinical duties as well as the ongoing continual professional updating required to maintain competence. Audit of practice should be indicated, including the preferred audit methodology, and associated with this should be a clear statement about standards and what to do if standards are not adequately met. Protocols should be archived, in paper-based form, for lengthy periods in case of legal claims, and the dates during which the protocol was in clinical use should be recorded on the archived copy.

  2. Measuring caloric response: comparison of different analysis techniques.

    Science.gov (United States)

    Mallinson, A I; Longridge, N S; Pace-Asciak, P; Ngo, R

    2010-01-01

    Electronystagmography (ENG) testing has been supplanted by newer techniques of measuring eye movement with infrared cameras (VNG). Most techniques for quantifying caloric-induced nystagmus measure the slow-phase velocity in some manner. Although our analysis is carried out by very experienced assessors, some systems have computer algorithms that have been "taught" to locate and quantify maximum responses. We wondered what differences in measurement might show up when measuring calorics using different techniques and systems; the relevance is that if there were a change in measured slow-phase velocity between ENG and VNG testing, then normative data would have to be changed. There are also some subjective but important aspects of ENG interpretation that comment on the nature of the response (e.g. responses that might be "sporadic" or "scant"). Our experiment compared caloric responses in 100 patients, analyzed four different ways: each caloric was analyzed by our old ENG system, our new VNG system, an inexperienced assessor and the computer algorithm, and the data were compared. All four systems made similar measurements, but our inexperienced assessor failed to recognize responses as sporadic or scant; we feel this is a limitation to be kept in mind in the rural setting, as it is an important aspect of assessment in complex patients. Assessment of complex VNGs should be left to an experienced assessor.

  3. Mechanisms of subsidence for induced damage and techniques for analysis

    International Nuclear Information System (INIS)

    Drumm, E.C.; Bennett, R.M.; Kane, W.F.

    1988-01-01

    Structural damage due to mining induced subsidence is a function of the nature of the structure and its position on the subsidence profile. A point on the profile may be in the tensile zone, the compressive zone, or the no-deformation zone at the bottom of the profile. Damage to structures in the tension zone is primarily due to a reduction of support during vertical displacement of the ground surface, and to shear stresses between the soil and structure resulting from horizontal displacements. The damage mechanisms due to tension can be investigated effectively using a two-dimensional plane stress analysis. Structures in the compression zone are subjected to positive moments in the footing and large compressive horizontal stresses in the foundation walls. A plane strain analysis of the foundation wall is utilized to examine compression zone damage mechanisms. The structural aspects affecting each mechanism are identified and potential mitigation techniques are summarized

  4. Service Interaction Flow Analysis Technique for Service Personalization

    DEFF Research Database (Denmark)

    Korhonen, Olli; Kinnula, Marianne; Syrjanen, Anna-Liisa

    2017-01-01

    Service interaction flows are difficult to capture, analyze, outline, and represent for research and design purposes. We examine how variation in personalized service flows in technology-mediated service interaction can be modeled and analyzed to provide information on how service personalization ... could support interaction. We have analyzed service interaction cases in the context of a technology-mediated car rental service. With the analysis technique we propose, inspired by the Interaction Analysis method, we were able to capture and model situational service interaction. Our contribution regarding ... technology-mediated service interaction design is twofold: first, with the increased understanding of the role of personalization in managing variation in technology-mediated service interaction, our study contributes to designing service management information systems and human-computer interfaces ...

  5. [Applications of spectral analysis technique to monitoring grasshoppers].

    Science.gov (United States)

    Lu, Hui; Han, Jian-guo; Zhang, Lu-da

    2008-12-01

    Grasshopper monitoring is of great significance in protecting the environment and reducing economic losses; however, how to predict grasshoppers accurately and effectively has long been a difficult problem. In the present paper, the importance of forecasting grasshoppers and their habitat is expounded, and developments in monitoring grasshopper populations and the common algorithms of spectral analysis techniques are illustrated. Meanwhile, the traditional methods are compared with the spectral technology. Remote sensing has been applied in monitoring the living, growing and breeding habitats of grasshopper populations, and can be used to develop a forecast model combined with GIS. NDVI values can be derived from remote sensing data and used in grasshopper forecasting. Hyperspectral remote sensing techniques, which can monitor grasshoppers more exactly, have advantages in measuring the degree of damage and classifying grasshopper damage areas, so they can be adopted to monitor the spatial distribution dynamics of rangeland grasshopper populations. Differential smoothing can be used to reflect the relations between the characteristic parameters of hyperspectra and leaf area index (LAI), and to indicate the intensity of grasshopper damage. Near-infrared reflectance spectroscopy has been employed in judging grasshopper species, examining species occurrences and monitoring hatching places by measuring soil humidity and nutrients, and can be used to investigate and observe grasshoppers in sample research. It is concluded that spectral analysis techniques could be used as quick and exact tools in monitoring and forecasting grasshopper infestations, and will become an important means for such research given their advantages in determining spatial orientation and in information extraction and processing. With the rapid development of spectral analysis methodology, the goal of sustainable monitoring ...

  6. Comparison of correlation analysis techniques for irregularly sampled time series

    Directory of Open Access Journals (Sweden)

    K. Rehfeld

    2011-06-01

    Full Text Available Geoscientific measurements often provide time series with irregular time sampling, requiring either data reconstruction (interpolation) or sophisticated methods to handle the irregular sampling. We compare the linear interpolation technique with different approaches for analyzing the correlation functions and persistence of irregularly sampled time series, such as the Lomb-Scargle Fourier transformation and kernel-based methods. In a thorough benchmark test we investigate the performance of these techniques.

    All methods have comparable root mean square errors (RMSEs) for low skewness of the inter-observation time distribution. For high skewness, i.e. very irregular data, interpolation bias and RMSE increase strongly. We find a 40% lower RMSE for the lag-1 autocorrelation function (ACF) for the Gaussian kernel method vs. the linear interpolation scheme in the analysis of highly irregular time series. For the cross-correlation function (CCF) the RMSE is lower by 60%. The application of the Lomb-Scargle technique gave results comparable to the kernel methods in the univariate case, but poorer results in the bivariate case. Especially the high-frequency components of the signal, where classical methods show a strong bias in ACF and CCF magnitude, are preserved when using the kernel methods.

    We illustrate the performance of interpolation vs. the Gaussian kernel method by applying both to paleo-data from four locations, reflecting late Holocene Asian monsoon variability as derived from speleothem δ18O measurements. Cross-correlation results are similar for both methods, which we attribute to the long time scales of the common variability. The persistence time (memory) is strongly overestimated when using the standard, interpolation-based approach. Hence, the Gaussian kernel is a reliable and more robust estimator with significant advantages compared to other techniques, and is suitable for large-scale application to paleo-data.
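
    A minimal sketch of the kernel idea described above (the general recipe, not the authors' exact implementation): weight every observation pair by a Gaussian kernel on the mismatch between the pair's time separation and the desired lag:

    ```python
    # Gaussian-kernel estimator of the cross-correlation of two irregularly
    # sampled series at a given lag; a sketch of the general idea, on
    # synthetic data.
    import numpy as np

    def kernel_xcorr(tx, x, ty, y, lag, h):
        """Kernel-weighted cross-correlation of (tx, x) and (ty, y) at `lag`.

        h is the kernel bandwidth; pairs whose time separation is close
        to `lag` receive the largest weight.
        """
        xs = (x - x.mean()) / x.std()
        ys = (y - y.mean()) / y.std()
        dt = ty[None, :] - tx[:, None]            # all pairwise separations
        w = np.exp(-0.5 * ((dt - lag) / h) ** 2)  # Gaussian weights
        return np.sum(w * xs[:, None] * ys[None, :]) / np.sum(w)

    rng = np.random.default_rng(0)
    tx = np.sort(rng.uniform(0, 100, 200))        # irregular sampling times
    ty = np.sort(rng.uniform(0, 100, 200))
    x = np.sin(0.3 * tx) + 0.2 * rng.standard_normal(200)
    y = np.sin(0.3 * (ty - 5.0)) + 0.2 * rng.standard_normal(200)  # lagged copy

    print(f"xcorr at lag 5: {kernel_xcorr(tx, x, ty, y, lag=5.0, h=1.0):.2f}")
    ```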

  7. Modelling, Verification, and Comparative Performance Analysis of the B.A.T.M.A.N. Protocol

    NARCIS (Netherlands)

    Chaudhary, Kaylash; Fehnker, Ansgar; Mehta, Vinay; Hermanns, Holger; Höfner, Peter

    2017-01-01

    This paper considers a network routing protocol known as Better Approach to Mobile Ad hoc Networks (B.A.T.M.A.N.). The protocol serves two aims: first, to discover all bidirectional links, and second, to identify the best next hop for every other node in the network. A key element is that each ...

  8. An Overview and Analysis of Mobile Internet Protocols in Cellular Environments.

    Science.gov (United States)

    Chao, Han-Chieh

    2001-01-01

    Notes that cellular is the inevitable future architecture for the personal communication service system. Discusses current cellular support based on Mobile Internet Protocol version 6 (IPv6) and points out the shortfalls of using Mobile IP. Highlights protocols, especially mobility management schemes, that can optimize a high-speed mobile…

  9. 75 FR 53273 - Federal Aquatic Nuisance Species Research Risk Analysis Protocol

    Science.gov (United States)

    2010-08-31

    ... Aquatic Nuisance Species Task Force (ANSTF). The Protocol is available for public review and comment... the draft revised Protocol are available on the ANSTF website, http://anstaskforce.gov/documents.php... nonindigenous species (ANS) and is designed to reduce the risk that research activities may cause introduction...

  10. Teaching Integrity in Empirical Research: A Protocol for Documenting Data Management and Analysis

    Science.gov (United States)

    Ball, Richard; Medeiros, Norm

    2012-01-01

    This article describes a protocol the authors developed for teaching undergraduates to document their statistical analyses for empirical research projects so that their results are completely reproducible and verifiable. The protocol is guided by the principle that the documentation prepared to accompany an empirical research project should be…

  11. Modified Scoring, Traditional Item Analysis, and Sato's Caution Index Used To Investigate the Reading Recall Protocol.

    Science.gov (United States)

    Deville, Craig W.; Chalhoub-Deville, Micheline

    A study demonstrated the utility of item analyses to investigate which items function well or poorly in a second language reading recall protocol instrument. Data were drawn from a larger study of 56 learners of German as a second language at various proficiency levels. Pausal units of scored recall protocols were analyzed using both classical…

  12. SURVEY ON CRIME ANALYSIS AND PREDICTION USING DATA MINING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    H Benjamin Fredrick David

    2017-04-01

    Full Text Available Data mining is the procedure of evaluating and examining large pre-existing databases in order to generate new information that may be essential to an organization; the extraction of new information is predicted using the existing datasets. Many approaches to analysis and prediction in data mining have been developed, but few efforts have been made in the field of criminology, and fewer still have compared the information these approaches produce. Police stations and other criminal justice agencies hold many large databases of information that can be used to predict or analyze criminal movements and involvement in criminal activity in society. Criminals can also be predicted based on crime data. The main aim of this work is to survey the supervised and unsupervised learning techniques that have been applied to criminal identification. This paper presents a survey on crime analysis and crime prediction using several data mining techniques.
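
    To make the two families concrete, the sketch below clusters synthetic incident coordinates with an unsupervised method and fits a supervised classifier on toy labels; this is a generic illustration, not a system from any surveyed paper:

    ```python
    # Generic illustration of the two technique families named above, on
    # synthetic data: unsupervised clustering of incident coordinates and
    # a supervised classifier predicting an incident label.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(42)

    # Synthetic incident coordinates around three "hotspots".
    centers = np.array([[0, 0], [5, 5], [0, 6]])
    coords = np.vstack([c + rng.normal(0, 0.7, (50, 2)) for c in centers])

    hotspots = KMeans(n_clusters=3, n_init=10, random_state=0).fit(coords)
    print("hotspot centers:\n", hotspots.cluster_centers_.round(2))

    # Synthetic labeled incidents: features = (hour of day, hotspot id).
    X = np.column_stack([rng.integers(0, 24, 150), hotspots.labels_])
    y = (X[:, 0] > 18).astype(int)   # toy label: "evening incident"

    clf = DecisionTreeClassifier(max_depth=3).fit(X, y)
    print("training accuracy:", clf.score(X, y))
    ```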

  13. Image-analysis techniques for investigating localized corrosion processes

    International Nuclear Information System (INIS)

    Quinn, M.J.; Bailey, M.G.; Ikeda, B.M.; Shoesmith, D.W.

    1993-12-01

    We have developed a procedure for determining the mode and depth of penetration of localized corrosion by combining metallography and image analysis of corroded coupons. Two techniques, involving either a face-profiling or an edge-profiling procedure, have been developed. In the face-profiling procedure, successive surface grindings and image analyses were performed until corrosion was no longer visible. In this manner, the distribution of corroded sites on the surface and the total area of the surface corroded were determined as a function of depth into the specimen. In the edge-profiling procedure, surface grinding exposed successive cross sections of the corroded region. Image analysis of the cross section quantified the distribution of depths across the corroded section, and a three-dimensional distribution of penetration depths was obtained. To develop these procedures, we used artificially creviced Grade-2 titanium specimens that were corroded in saline solutions containing various amounts of chloride maintained at various fixed temperatures (105 to 150 degrees C) using a previously developed galvanic-coupling technique. We discuss some results from these experiments to illustrate how the procedures developed can be applied to a real corroded system. (author). 6 refs., 4 tabs., 21 figs

  14. Hospitals Productivity Measurement Using Data Envelopment Analysis Technique.

    Science.gov (United States)

    Torabipour, Amin; Najarzadeh, Maryam; Arab, Mohammad; Farzianpour, Freshteh; Ghasemzadeh, Roya

    2014-11-01

    This study aimed to measure hospital productivity using the data envelopment analysis (DEA) technique and Malmquist indices. This is a cross-sectional study using panel data over the 4-year period from 2007 to 2010. The research was implemented in 12 teaching and non-teaching hospitals of Ahvaz County. The data envelopment analysis technique and the Malmquist indices with an input-orientation approach were used to analyze the data and estimate productivity. Data were analyzed using the SPSS 18 and DEAP 2 software. Six hospitals (50%) had a value lower than 1, which represents an increase in total productivity; the other hospitals were non-productive. The average total productivity factor (TPF) was 1.024 for all hospitals, which represents a decrease in efficiency of 2.4% from 2007 to 2010. The average technical, technological, scale and managerial efficiency changes were 0.989, 1.008, 1.028, and 0.996, respectively. There was no significant difference in mean productivity changes between teaching and non-teaching hospitals (P>0.05), except in 2009. Productivity rates of the hospitals generally followed an increasing trend; however, the overall average productivity decreased. Among the components of total productivity, variation in technological efficiency had the highest impact on the reduction of the overall average productivity.
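
    For orientation, the sketch below solves the input-oriented CCR efficiency score of one decision-making unit (hospital) as a linear program; this is a minimal textbook formulation with invented data, not the study's model, software, or dataset:

    ```python
    # Input-oriented CCR DEA efficiency for one DMU as a linear program:
    # minimize theta s.t. sum_j lam_j * x_j <= theta * x_o,
    # sum_j lam_j * y_j >= y_o, lam >= 0. Data are invented.
    import numpy as np
    from scipy.optimize import linprog

    X = np.array([[20., 30., 40., 25.],      # inputs  (rows) x DMUs (cols)
                  [ 5.,  8.,  6.,  4.]])
    Y = np.array([[100., 130., 160., 90.]])  # outputs (rows) x DMUs (cols)

    def ccr_efficiency(o: int) -> float:
        m, n = X.shape
        s = Y.shape[0]
        c = np.r_[1.0, np.zeros(n)]                 # minimize theta
        # Input rows:  -theta*x_io + sum_j lam_j x_ij <= 0
        A_in = np.hstack([-X[:, [o]], X])
        # Output rows: -sum_j lam_j y_rj <= -y_ro
        A_out = np.hstack([np.zeros((s, 1)), -Y])
        res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[np.zeros(m), -Y[:, o]],
                      bounds=[(0, None)] * (1 + n))
        return res.fun

    for o in range(X.shape[1]):
        print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")
    ```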

  15. Comparative analysis of face recognition techniques with illumination variation

    International Nuclear Information System (INIS)

    Jondhale, K C; Waghmare, L M

    2010-01-01

    Illumination variation is one of the major challenges in face recognition. To deal with this problem, this paper presents a comparative analysis of three different techniques. First, the DCT is employed to compensate for illumination variations in the logarithm domain. Since illumination variation lies mainly in the low-frequency band, an appropriate number of DCT coefficients are truncated to reduce the variations under different lighting conditions. The nearest-neighbor classifier based on Euclidean distance is employed for classification. Second, the performance of PCA is checked on normalized images; PCA is a technique used to reduce multidimensional data sets to a lower dimension for analysis. Third, LDA-based methods give satisfactory results under controlled lighting conditions, but their performance under large illumination variation is not satisfactory, so the performance of LDA is also checked on normalized images. Experimental results on the Yale B and ORL databases show that applying PCA and LDA to the normalized dataset improves performance significantly for face images with large illumination variations.
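
    A sketch of the DCT-based normalization described above: transform the image in the logarithm domain, zero a block of low-frequency DCT coefficients (keeping the DC term so overall brightness survives), and invert; the coefficient count is an assumed parameter:

    ```python
    # DCT-based illumination normalization: suppress low-frequency content,
    # which carries most of the illumination variation, in the log domain.
    # The number of truncated coefficients is an assumed parameter.
    import numpy as np
    from scipy.fftpack import dct, idct

    def dct2(a):  return dct(dct(a, axis=0, norm='ortho'), axis=1, norm='ortho')
    def idct2(a): return idct(idct(a, axis=0, norm='ortho'), axis=1, norm='ortho')

    def normalize_illumination(img: np.ndarray, n_coeffs: int = 5) -> np.ndarray:
        """Suppress low-frequency (illumination) content of a grayscale image."""
        log_img = np.log1p(img.astype(float))   # logarithm domain
        C = dct2(log_img)
        mean_level = C[0, 0]                    # remember overall brightness
        C[:n_coeffs, :n_coeffs] = 0.0           # discard low-frequency block
        C[0, 0] = mean_level                    # restore the DC term
        return np.expm1(idct2(C))

    face = np.random.default_rng(1).uniform(0, 255, (64, 64))  # stand-in image
    print(normalize_illumination(face).shape)
    ```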

  16. The role of contrast-enhanced ultrasound (CEUS) in visualizing atherosclerotic carotid plaque vulnerability: Which injection protocol? Which scanning technique?

    Energy Technology Data Exchange (ETDEWEB)

    Iezzi, Roberto, E-mail: roberto.iezzi@rm.unicatt.it [Department of Bioimaging and Radiological Sciences, Institute of Radiology, “A. Gemelli” Hospital—Catholic University, L.go A Gemelli 8, 00168 Rome (Italy); Petrone, Gianluigi [Institute of Pathology, “A. Gemelli” Hospital—Catholic University, L.go A Gemelli 8, 00168, Rome (Italy); Ferrante, Angela [Department of Vascular Surgery, “A. Gemelli” Hospital—Catholic University, L.go A Gemelli 8, 00168 Rome (Italy); Lauriola, Libero [Institute of Pathology, “A. Gemelli” Hospital—Catholic University, L.go A Gemelli 8, 00168, Rome (Italy); Vincenzoni, Claudio [Department of Vascular Surgery, “A. Gemelli” Hospital—Catholic University, L.go A Gemelli 8, 00168 Rome (Italy); Torre, Michele Fabio la [Department of Bioimaging and Radiological Sciences, Institute of Radiology, “A. Gemelli” Hospital—Catholic University, L.go A Gemelli 8, 00168 Rome (Italy); Snider, Francesco [Department of Vascular Surgery, “A. Gemelli” Hospital—Catholic University, L.go A Gemelli 8, 00168 Rome (Italy); Rindi, Guido [Institute of Pathology, “A. Gemelli” Hospital—Catholic University, L.go A Gemelli 8, 00168, Rome (Italy); Bonomo, Lorenzo [Department of Bioimaging and Radiological Sciences, Institute of Radiology, “A. Gemelli” Hospital—Catholic University, L.go A Gemelli 8, 00168 Rome (Italy)

    2015-05-15

    Highlights: • CEUS is a safe and efficacious technique for the identification and characterization of carotid plaque. • CEUS represents a diagnostic tool for the management of patients with carotid plaque, particularly asymptomatic patients. • Improved diagnostic performance is achieved with the injection of a 4 mL bolus of contrast medium. • Improved diagnostic performance is achieved with the use of Dynamic Imaging rather than late-phase imaging. - Abstract: Purpose: To correlate the degree of plaque vulnerability as determined by contrast-enhanced ultrasound (CEUS) with histological findings. Secondary objectives were to optimize the CEUS acquisition technique and image evaluation methods. Materials and methods: Fifty consecutive patients, both symptomatic and asymptomatic, referred to our department for carotid endarterectomy (TEA), were enrolled. Each patient provided informed consent before undergoing CEUS. Ultrasound examination was performed using a high-frequency (8–14 MHz) linear probe and a non-linear pulse inversion technique (mechanical index: 0.09–1.3). A double contrast-media injection (Sonovue, 2 mL and 4 mL; Bracco, Italy) was performed. Two videotapes were recorded for every injection: an early “dynamic” phase and a late “flash” phase, performed with 6 high-mechanical-index impulses. The movies were evaluated both quantitatively and qualitatively, and both evaluations were statistically compared to the immunohistological diagnosis of vulnerable plaque, considered the gold standard. Results: Qualitative CEUS evaluation showed strong agreement with the immunohistological results, with a sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and diagnostic accuracy of 94%, 68%, 87%, 85% and 86%, respectively; these values became higher when considering only asymptomatic patients, with an NPV of 91%. Nevertheless, quantitative software evaluation proved less

  17. Analyzing security protocols in hierarchical networks

    DEFF Research Database (Denmark)

    Zhang, Ye; Nielson, Hanne Riis

    2006-01-01

    Validating security protocols is a well-known hard problem even in the simple setting of a single global network. But a real network often consists of, besides the publicly accessible part, several sub-networks, and thereby forms a hierarchical structure. In this paper we first present a process calculus...... capturing the characteristics of hierarchical networks and describe the behavior of protocols on such networks. We then develop a static analysis to automate the validation. Finally we demonstrate how the technique can benefit the protocol development and the design of network systems by presenting a series...

  18. BATMAN: Bayesian Technique for Multi-image Analysis

    Science.gov (United States)

    Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.

    2017-04-01

    This paper describes the Bayesian Technique for Multi-image Analysis (BATMAN), a novel image-segmentation technique based on Bayesian statistics that characterizes any astronomical data set containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e., identical signal within the errors). We illustrate its operation and performance with a set of test cases including both synthetic and real integral-field spectroscopic data. The output segmentations adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. The quality of the recovered signal represents an improvement with respect to the input, especially in regions with low signal-to-noise ratio. However, the algorithm may be sensitive to small-scale random fluctuations, and its performance in the presence of spatial gradients is limited. Due to these effects, errors may be underestimated by as much as a factor of 2. Our analysis reveals that the algorithm prioritizes conservation of all the statistically significant information over noise reduction, and that the precise choice of the input data has a crucial impact on the results. Hence, the philosophy of BaTMAn is not to be used as a 'black box' to improve the signal-to-noise ratio, but as a new approach to characterize spatially resolved data prior to its analysis. The source code is publicly available at http://astro.ft.uam.es/SELGIFS/BaTMAn.
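
    BaTMAn's actual merging rule is Bayesian; the following deliberately simplified sketch (Python, 1-D, with a k-sigma consistency test of my own choosing) only illustrates the core idea of iteratively merging adjacent elements whose signals agree within the errors.

        import numpy as np

        def consistent(s1, e1, s2, e2, k=2.0):
            # Two measurements are treated as carrying the same information
            # if they agree within k combined standard errors.
            return abs(s1 - s2) <= k * np.hypot(e1, e2)

        def merge_elements(signal, error, k=2.0):
            # signal/error: sequences of measurements with positive errors.
            segments = list(zip(signal, error))
            merged = True
            while merged:
                merged = False
                out, i = [], 0
                while i < len(segments):
                    if i + 1 < len(segments) and consistent(*segments[i], *segments[i + 1], k):
                        (s1, e1), (s2, e2) = segments[i], segments[i + 1]
                        w1, w2 = 1.0 / e1**2, 1.0 / e2**2
                        # Replace the pair by its inverse-variance weighted mean.
                        out.append(((w1 * s1 + w2 * s2) / (w1 + w2), (w1 + w2) ** -0.5))
                        i += 2
                        merged = True
                    else:
                        out.append(segments[i])
                        i += 1
                segments = out
            return segments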

  19. Successful implementation of a perioperative glycemic control protocol in cardiac surgery: barrier analysis and intervention using lean six sigma.

    Science.gov (United States)

    Martinez, Elizabeth A; Chavez-Valdez, Raul; Holt, Natalie F; Grogan, Kelly L; Khalifeh, Katherine W; Slater, Tammy; Winner, Laura E; Moyer, Jennifer; Lehmann, Christoph U

    2011-01-01

    Although the evidence strongly supports perioperative glycemic control among cardiac surgical patients, there is scant literature to describe the practical application of such a protocol in the complex ICU environment. This paper describes the use of the Lean Six Sigma methodology to implement a perioperative insulin protocol in a cardiac surgical intensive care unit (CSICU) in a large academic hospital. A preintervention chart audit revealed that fewer than 10% of patients were admitted to the CSICU with glucose <200 mg/dL, prompting the initiation of the quality improvement project. Following protocol implementation, more than 90% of patients were admitted with a glucose <200 mg/dL. Key elements to success include barrier analysis and intervention, provider education, and broadening the project scope to address the intraoperative period.

  20. Successful Implementation of a Perioperative Glycemic Control Protocol in Cardiac Surgery: Barrier Analysis and Intervention Using Lean Six Sigma

    Science.gov (United States)

    Martinez, Elizabeth A.; Chavez-Valdez, Raul; Holt, Natalie F.; Grogan, Kelly L.; Khalifeh, Katherine W.; Slater, Tammy; Winner, Laura E.; Moyer, Jennifer; Lehmann, Christoph U.

    2011-01-01

    Although the evidence strongly supports perioperative glycemic control among cardiac surgical patients, there is scant literature to describe the practical application of such a protocol in the complex ICU environment. This paper describes the use of the Lean Six Sigma methodology to implement a perioperative insulin protocol in a cardiac surgical intensive care unit (CSICU) in a large academic hospital. A preintervention chart audit revealed that fewer than 10% of patients were admitted to the CSICU with glucose <200 mg/dL, prompting the initiation of the quality improvement project. Following protocol implementation, more than 90% of patients were admitted with a glucose <200 mg/dL. Key elements to success include barrier analysis and intervention, provider education, and broadening the project scope to address the intraoperative period. PMID:22091218

  1. Successful Implementation of a Perioperative Glycemic Control Protocol in Cardiac Surgery: Barrier Analysis and Intervention Using Lean Six Sigma

    Directory of Open Access Journals (Sweden)

    Elizabeth A. Martinez

    2011-01-01

    Full Text Available Although the evidence strongly supports perioperative glycemic control among cardiac surgical patients, there is scant literature to describe the practical application of such a protocol in the complex ICU environment. This paper describes the use of the Lean Six Sigma methodology to implement a perioperative insulin protocol in a cardiac surgical intensive care unit (CSICU) in a large academic hospital. A preintervention chart audit revealed that fewer than 10% of patients were admitted to the CSICU with glucose <200 mg/dL, prompting the initiation of the quality improvement project. Following protocol implementation, more than 90% of patients were admitted with a glucose <200 mg/dL. Key elements to success include barrier analysis and intervention, provider education, and broadening the project scope to address the intraoperative period.

  2. Insect Venom Immunotherapy: Analysis of the Safety and Tolerance of 3 Buildup Protocols Frequently Used in Spain.

    Science.gov (United States)

    Gutiérrez Fernández, D; Moreno-Ancillo, A; Fernández Meléndez, S; Domínguez-Noche, C; Gálvez Ruiz, P; Alfaya Arias, T; Carballada González, F; Alonso Llamazares, A; Marques Amat, L; Vega Castro, A; Antolín Amérigo, D; Cruz Granados, S; Ruiz León, B; Sánchez Morillas, L; Fernández Sánchez, J; Soriano Gomis, V; Borja Segade, J; Dalmau Duch, G; Guspi Bori, R; Miranda Páez, A

    2016-01-01

    Hymenoptera venom immunotherapy (VIT) is an effective treatment but not one devoid of risk, as both local and systemic adverse reactions may occur, especially in the initial phases. We compared the tolerance to 3 VIT buildup protocols and analyzed risk factors associated with adverse reactions during this phase. We enrolled 165 patients divided into 3 groups based on the buildup protocol used (3, 4, and 9 weeks). The severity of systemic reactions was evaluated according to the World Allergy Organization model. Results were analyzed using exploratory descriptive statistics, and variables were compared using analysis of variance. Adverse reactions were recorded in 53 patients (32%) (43 local and 10 systemic). Local reactions were immediate in 27 patients (63%) and delayed in 16 (37%). The severity of the local reaction was slight/moderate in 15 patients and severe in 13. Systemic reactions were grade 1-2. No significant association was found between the treatment modality and the onset of local or systemic adverse reactions or the type of local reaction. We only found a statistically significant association between severity of the local reaction and female gender. As for the risk factors associated with systemic reactions during the buildup phase, we found no significant differences in values depending on the protocol used or the insect responsible. The buildup protocols compared proved to be safe and did not differ significantly from one another. In the population studied, patients undergoing the 9-week schedule presented no systemic reactions. Therefore, this protocol can be considered the safest approach.

  3. A Ten Step Protocol and Plan for CCS Site Characterization, Based on an Analysis of the Rocky Mountain Region, USA

    Energy Technology Data Exchange (ETDEWEB)

    McPherson, Brian; Matthews, Vince

    2013-09-15

    This report expresses a Ten-Step Protocol for CO2 Storage Site Characterization, the final outcome of an extensive Site Characterization analysis of the Rocky Mountain region, USA. These ten steps include: (1) regional assessment and data gathering; (2) identification and analysis of appropriate local sites for characterization; (3) public engagement; (4) geologic and geophysical analysis of local site(s); (5) stratigraphic well drilling and coring; (6) core analysis and interpretation with other data; (7) database assembly and static model development; (8) storage capacity assessment; (9) simulation and uncertainty assessment; (10) risk assessment. While the results detailed here are primarily germane to the Rocky Mountain region, the intent of this protocol is to be portable or generally applicable for CO2 storage site characterization.

  4. Pharmacotherapies for fatigue in chronic liver disease (CLD): a systematic review and meta-analysis (protocol).

    Science.gov (United States)

    Effiong, Andem; Kumari, Prerna

    2018-02-14

    This is the protocol for a systematic review (and meta-analysis) of an intervention. The primary objective of this systematic review will be to assess the benefits and harms of pharmacological therapies (pharmacotherapies) for the management of fatigue in adults with CLD of any etiology. The effects of pharmacological therapies on fatigue in CLD will be compared against those of placebo, no intervention, or non-pharmacological interventions. Specifically, this review will examine whether pharmacological therapies improve CLD-associated fatigue, and if they do, what key elements are associated with their effectiveness. The results of this systematic review will assist clinicians, policy-makers, researchers, and people with CLD in decision-making on how best to manage fatigue and its associated symptoms. MEDLINE, SCOPUS, EMBASE, EU Clinical Trials Register, WHO International Clinical Trials Registry Platform, CENTRAL (The Cochrane Library), ClinicalTrials.gov, reference lists of articles and conference proceedings will be searched for relevant studies. No language or date restrictions will be applied. Eligible studies will include adults with CLD of any etiology. Included studies will be randomized controlled trials. From included studies, data on participant characteristics, study design, setting, research ethics compliance, and intervention outcomes will be extracted. Risk of bias in included studies will be assessed using the Cochrane Risk of Bias Tool. A random-effects meta-analysis will be conducted. If substantial or considerable levels of heterogeneity are detected, analysis will be limited to a narrative synthesis. This systematic review will examine the effectiveness of pharmacological therapies on fatigue reduction in people with CLD. Such therapies may be more effective than non-pharmacological interventions in treating fatigue symptoms in CLD. Evidence derived from the findings of this study will guide future practice, policy, and research. PROSPERO, CRD

  5. A Protocol for the Comprehensive Flow Cytometric Analysis of Immune Cells in Normal and Inflamed Murine Non-Lymphoid Tissues

    Science.gov (United States)

    Yu, Yen-Rei A.; O’Koren, Emily G.; Hotten, Danielle F.; Kan, Matthew J.; Kopin, David; Nelson, Erik R.; Que, Loretta; Gunn, Michael D.

    2016-01-01

    Flow cytometry is used extensively to examine immune cells in non-lymphoid tissues. However, a method of flow cytometric analysis that is both comprehensive and widely applicable has not been described. We developed a protocol for the flow cytometric analysis of non-lymphoid tissues, including methods of tissue preparation, a 10-fluorochrome panel for cell staining, and a standardized gating strategy, that allows the simultaneous identification and quantification of all major immune cell types in a variety of normal and inflamed non-lymphoid tissues. We demonstrate that our basic protocol minimizes cell loss, reliably distinguishes macrophages from dendritic cells (DC), and identifies all major granulocytic and mononuclear phagocytic cell types. This protocol is able to accurately quantify 11 distinct immune cell types, including T cells, B cells, NK cells, neutrophils, eosinophils, inflammatory monocytes, resident monocytes, alveolar macrophages, resident/interstitial macrophages, CD11b- DC, and CD11b+ DC, in normal lung, heart, liver, kidney, intestine, skin, eyes, and mammary gland. We also characterized the expression patterns of several commonly used myeloid and macrophage markers. This basic protocol can be expanded to identify additional cell types such as mast cells, basophils, and plasmacytoid DC, or perform detailed phenotyping of specific cell types. In examining models of primary and metastatic mammary tumors, this protocol allowed the identification of several distinct tumor associated macrophage phenotypes, the appearance of which was highly specific to individual tumor cell lines. This protocol provides a valuable tool to examine immune cell repertoires and follow immune responses in a wide variety of tissues and experimental conditions. PMID:26938654

  6. Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015 statement

    Directory of Open Access Journals (Sweden)

    Mireia Estarli

    2016-02-01

    Full Text Available Systematic reviews should build on a protocol that describes the rationale, hypothesis, and planned methods of the review; few reviews report whether a protocol exists. Detailed, well-described protocols can facilitate the understanding and appraisal of the review methods, as well as the detection of modifications to methods and selective reporting in completed reviews. We describe the development of a reporting guideline, the Preferred Reporting Items for Systematic reviews and Meta-Analyses for Protocols 2015 (PRISMA-P 2015). PRISMA-P consists of a 17-item checklist intended to facilitate the preparation and reporting of a robust protocol for the systematic review. Funders and those commissioning reviews might consider mandating the use of the checklist to facilitate the submission of relevant protocol information in funding applications. Similarly, peer reviewers and editors can use the guidance to gauge the completeness and transparency of a systematic review protocol submitted for publication in a journal or other medium. Translation with permission of the authors. The original authors have not revised and verified the Spanish translation, and they do not necessarily endorse it.

  7. Performance Analysis of Secure and Private Billing Protocols for Smart Metering

    Directory of Open Access Journals (Sweden)

    Tom Eccles

    2017-11-01

    Full Text Available Traditional utility metering is to be replaced by smart metering. Smart metering enables fine-grained utility consumption measurements, which raise privacy concerns due to the lifestyle information that can be inferred from the precise time at which utilities were consumed. This paper outlines and compares two privacy-respecting time-of-use billing protocols for smart metering and investigates their performance on a variety of hardware. These protocols protect the privacy of customers by never transmitting the fine-grained utility readings outside of the customer’s home network. One protocol concentrates computation on the trusted smart-meter hardware, while the other uses homomorphic commitments to offload computation to a third device. Both protocols are designed to operate on top of existing cryptographic secure-channel protocols in place on smart meters. Proof-of-concept software implementations of these protocols have been written, and their suitability for real-world application to low-performance smart meter hardware is discussed. These protocols may also have application to other privacy-conscious aggregation systems, such as electronic voting.
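
    The record does not name the commitment scheme; Pedersen commitments are a standard additively homomorphic choice for such billing protocols, and the sketch below (Python) shows the homomorphism over a toy group whose parameters are illustrative and far too small to be secure.

        import secrets

        p, q, g = 467, 233, 4   # toy safe-prime group: q = (p - 1) / 2 is prime
        h = pow(g, 101, p)      # second generator; in practice its discrete log must be unknown

        def commit(value, blind):
            # C(v, r) = g^v * h^r mod p: hiding (via the blinding r) and binding.
            return (pow(g, value % q, p) * pow(h, blind % q, p)) % p

        # Commitments to individual readings multiply to a commitment to the
        # billing-period total, so a bill can be checked without revealing
        # the fine-grained readings themselves.
        r1, r2 = secrets.randbelow(q), secrets.randbelow(q)
        assert (commit(3, r1) * commit(5, r2)) % p == commit(3 + 5, (r1 + r2) % q)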

  8. TU-EF-BRD-02: Indicators and Technique Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Carlone, M. [Princess Margaret Hospital (Canada)

    2015-06-15

    Research related to quality and safety has been a staple of medical physics academic activities for a long time. From very early on, medical physicists have developed new radiation measurement equipment and analysis techniques, created increasingly accurate dose calculation models, and have vastly improved imaging, planning, and delivery techniques. These and other areas of interest have improved the quality and safety of radiotherapy for our patients. With the advent of TG-100, quality and safety is an area that will garner even more research interest in the future. As medical physicists pursue quality and safety research in greater numbers, it is worthwhile to consider what actually constitutes research on quality and safety. For example, should the development of algorithms for real-time EPID-based in-vivo dosimetry be defined as “quality and safety” research? How about the clinical implementation of such a system? Surely the application of failure modes and effects analysis to a clinical process would be considered quality and safety research, but is this the type of research that should be included in the medical physics peer-reviewed literature? The answers to such questions are of critical importance to set researchers in a direction that will provide the greatest benefit to our field and the patients we serve. The purpose of this symposium is to consider what constitutes research in the arena of quality and safety and differentiate it from other research directions. The key distinction here is developing the tool itself (e.g. algorithms for EPID dosimetry) vs. studying the impact of the tool with some quantitative metric. Only the latter would I call quality and safety research. Issues of ‘basic’ versus ‘applied’ quality and safety research will be covered as well as how the research results should be structured to provide increasing levels of support that a quality and safety intervention is effective and sustainable. Examples from existing

  9. TU-EF-BRD-02: Indicators and Technique Analysis

    International Nuclear Information System (INIS)

    Carlone, M.

    2015-01-01

    Research related to quality and safety has been a staple of medical physics academic activities for a long time. From very early on, medical physicists have developed new radiation measurement equipment and analysis techniques, created increasingly accurate dose calculation models, and have vastly improved imaging, planning, and delivery techniques. These and other areas of interest have improved the quality and safety of radiotherapy for our patients. With the advent of TG-100, quality and safety is an area that will garner even more research interest in the future. As medical physicists pursue quality and safety research in greater numbers, it is worthwhile to consider what actually constitutes research on quality and safety. For example, should the development of algorithms for real-time EPID-based in-vivo dosimetry be defined as “quality and safety” research? How about the clinical implementation of such a system? Surely the application of failure modes and effects analysis to a clinical process would be considered quality and safety research, but is this the type of research that should be included in the medical physics peer-reviewed literature? The answers to such questions are of critical importance to set researchers in a direction that will provide the greatest benefit to our field and the patients we serve. The purpose of this symposium is to consider what constitutes research in the arena of quality and safety and differentiate it from other research directions. The key distinction here is developing the tool itself (e.g. algorithms for EPID dosimetry) vs. studying the impact of the tool with some quantitative metric. Only the latter would I call quality and safety research. Issues of ‘basic’ versus ‘applied’ quality and safety research will be covered as well as how the research results should be structured to provide increasing levels of support that a quality and safety intervention is effective and sustainable. Examples from existing

  10. Pithy Review on Routing Protocols in Wireless Sensor Networks and Least Routing Time Opportunistic Technique in WSN

    Science.gov (United States)

    Salman Arafath, Mohammed; Rahman Khan, Khaleel Ur; Sunitha, K. V. N.

    2018-01-01

    Nowadays most telecommunication standards-development organizations are focusing on device-to-device communication to provide proximity-based and add-on services on top of the available cellular infrastructure, and opportunistic networks (Oppnets) and wireless sensor networks play a prominent role here. Routing in these networks is significant in fields such as traffic management and packet delivery, and it remains a prodigious research area with diverse unresolved issues. This paper first discusses the importance and concept of opportunistic routing, then shifts focus to a prime aspect, the packet reception ratio, one of the most important QoS-awareness parameters. The paper then discusses two important routing functions in wireless sensor networks (WSN), namely route selection using the least routing time algorithm (LRTA) and data forwarding using a clustering technique. Finally, the simulation results reveal that LRTA performs relatively better than the existing system in terms of average packet reception ratio and connectivity.

  11. Constructing Benchmark Databases and Protocols for Medical Image Analysis: Diabetic Retinopathy

    Directory of Open Access Journals (Sweden)

    Tomi Kauppi

    2013-01-01

    Full Text Available We address the performance evaluation practices for developing medical image analysis methods, in particular, how to establish and share databases of medical images with verified ground truth and solid evaluation protocols. Such databases support the development of better algorithms, execution of profound method comparisons, and, consequently, technology transfer from research laboratories to clinical practice. For this purpose, we propose a framework consisting of reusable methods and tools for the laborious task of constructing a benchmark database. We provide a software tool for medical image annotation, which helps collect the class label, spatial span, and expert's confidence for each lesion, and a method to appropriately combine the manual segmentations from multiple experts. The tool and all necessary functionality for method evaluation are provided as public software packages. As a case study, we utilized the framework and tools to establish the DiaRetDB1 V2.1 database for benchmarking diabetic retinopathy detection algorithms. The database contains a set of retinal images, ground truth based on information from multiple experts, and a baseline algorithm for the detection of retinopathy lesions.

  12. A critical comparison of systematic calibration protocols for activated sludge models: a SWOT analysis.

    Science.gov (United States)

    Sin, Gürkan; Van Hulle, Stijn W H; De Pauw, Dirk J W; van Griensven, Ann; Vanrolleghem, Peter A

    2005-07-01

    Modelling activated sludge systems has gained an increasing momentum after the introduction of activated sludge models (ASMs) in 1987. Application of dynamic models for full-scale systems essentially requires a calibration of the chosen ASM to the case under study. Numerous full-scale model applications have been performed so far, mostly based on ad hoc approaches and expert knowledge. Further, each modelling study has followed a different calibration approach: e.g. different influent wastewater characterization methods, different kinetic parameter estimation methods, different selection of parameters to be calibrated, different priorities within the calibration steps, etc. In short, there was no standard approach to performing the calibration study, which makes it difficult, if not impossible, to (1) compare different calibrations of ASMs with each other and (2) perform internal quality checks for each calibration study. To address these concerns, systematic calibration protocols have recently been proposed to bring guidance to the modeling of activated sludge systems and in particular to the calibration of full-scale models. In this contribution four existing calibration approaches (BIOMATH, HSG, STOWA and WERF) will be critically discussed using a SWOT (Strengths, Weaknesses, Opportunities, Threats) analysis. It will also be assessed how these approaches can be further developed with a view to further improving the quality of ASM calibration.

  13. Electroacupuncture for women with stress urinary incontinence: Protocol for a systematic review and meta-analysis.

    Science.gov (United States)

    Huang, Weixin; Li, Xiaohui; Wang, Yuanping; Yan, Xia; Wu, Siping

    2017-12-01

    Stress urinary incontinence (SUI) is a widespread complaint among adult women. Electroacupuncture has been widely applied in the treatment of SUI, but its efficacy has not been evaluated scientifically and systematically. We therefore provide a systematic review protocol to assess the effectiveness and safety of electroacupuncture treatment in women with SUI. The databases to be searched comprise 3 English-language databases, namely PubMed, Embase, and the Cochrane Library, and 3 Chinese-language databases, namely the Chinese Biomedical Literature Database (CBM), the China National Knowledge Infrastructure (CNKI), and the Wanfang Database. Randomized controlled trials (RCTs) of electroacupuncture treatment in women with SUI will be searched in the above-mentioned databases from the date each database was established to December 2017. The change from baseline in the amount of urine leakage measured by the 1-hour pad test will be the primary outcome. We will use RevMan V.5.3 software to perform the data synthesis when a meta-analysis is feasible. This study will provide a high-quality synthesis assessing the effectiveness and safety of electroacupuncture treatment in women with SUI, and the conclusion of our systematic review will provide evidence to judge whether electroacupuncture is an effective intervention for women with SUI. PROSPERO CRD42017070947.

  14. Use of decision analysis techniques to determine Hanford cleanup priorities

    International Nuclear Information System (INIS)

    Fassbender, L.; Gregory, R.; Winterfeldt, D. von; John, R.

    1992-01-01

    In January 1991, the U.S. Department of Energy (DOE) Richland Field Office, Westinghouse Hanford Company, and the Pacific Northwest Laboratory initiated the Hanford Integrated Planning Process (HIPP) to ensure that technically sound and publicly acceptable decisions are made that support the environmental cleanup mission at Hanford. One of the HIPP's key roles is to develop an understanding of the science and technology (S and T) requirements to support the cleanup mission. This includes conducting an annual systematic assessment of the S and T needs at Hanford to support a comprehensive technology development program and a complementary scientific research program. Basic to success is a planning and assessment methodology that is defensible from a technical perspective and acceptable to the various Hanford stakeholders. Decision analysis techniques were used to help identify and prioritize problems and S and T needs at Hanford. The approach used structured elicitations to bring many Hanford stakeholders into the process. Decision analysis, which is based on the axioms and methods of utility and probability theory, is especially useful in problems characterized by uncertainties and multiple objectives. Decision analysis addresses uncertainties by laying out a logical sequence of decisions, events, and consequences and by quantifying event and consequence probabilities on the basis of expert judgments

  15. Comparing dynamical systems concepts and techniques for biomechanical analysis

    Directory of Open Access Journals (Sweden)

    Richard E.A. van Emmerik

    2016-03-01

    Full Text Available Traditional biomechanical analyses of human movement are generally derived from linear mathematics. While these methods can be useful in many situations, they do not describe behaviors in human systems that are predominately nonlinear. For this reason, nonlinear analysis methods based on a dynamical systems approach have become more prevalent in recent literature. These analysis techniques have provided new insights into how systems (1) maintain pattern stability, (2) transition into new states, and (3) are governed by short- and long-term (fractal) correlational processes at different spatio-temporal scales. These different aspects of system dynamics are typically investigated using concepts related to variability, stability, complexity, and adaptability. The purpose of this paper is to compare and contrast these different concepts and demonstrate that, although related, these terms represent fundamentally different aspects of system dynamics. In particular, we argue that variability should not uniformly be equated with stability or complexity of movement. In addition, current dynamic stability measures based on nonlinear analysis methods (such as the finite maximal Lyapunov exponent) can reveal local instabilities in movement dynamics, but the degree to which these local instabilities relate to global postural and gait stability and the ability to resist external perturbations remains to be explored. Finally, systematic studies are needed to relate observed reductions in complexity with aging and disease to the adaptive capabilities of the movement system and how complexity changes as a function of different task constraints.
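
    As a pointer to what the finite maximal Lyapunov exponent mentioned above involves, the following sketch (Python) estimates it from a scalar time series in the spirit of Rosenstein's algorithm; the embedding and neighbour parameters are illustrative and must be tuned to the data.

        import numpy as np

        def max_lyapunov(x, dim=5, tau=10, min_sep=50, t_max=30):
            # Time-delay embedding of the scalar series.
            n = len(x) - (dim - 1) * tau
            emb = np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])
            # Nearest neighbour of each point, excluding temporally close points.
            d = np.linalg.norm(emb[:, None] - emb[None, :], axis=2)
            idx = np.arange(n)
            d[np.abs(idx[:, None] - idx[None, :]) < min_sep] = np.inf
            nn = d.argmin(axis=1)
            # Average log-divergence of neighbour pairs as time evolves.
            div = []
            for t in range(1, t_max):
                ok = (idx + t < n) & (nn + t < n)
                dist = np.linalg.norm(emb[idx[ok] + t] - emb[nn[ok] + t], axis=1)
                div.append(np.mean(np.log(dist[dist > 0])))
            # The slope of the divergence curve approximates the exponent
            # (in units of inverse samples).
            return np.polyfit(np.arange(1, t_max), div, 1)[0]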

  16. Comparing dynamical systems concepts and techniques for biomechanical analysis

    Institute of Scientific and Technical Information of China (English)

    Richard E.A. van Emmerik; Scott W. Ducharme; Avelino C. Amado; Joseph Hamill

    2016-01-01

    Traditional biomechanical analyses of human movement are generally derived from linear mathematics. While these methods can be useful in many situations, they do not describe behaviors in human systems that are predominately nonlinear. For this reason, nonlinear analysis methods based on a dynamical systems approach have become more prevalent in recent literature. These analysis techniques have provided new insights into how systems (1) maintain pattern stability, (2) transition into new states, and (3) are governed by short- and long-term (fractal) correlational processes at different spatio-temporal scales. These different aspects of system dynamics are typically investigated using concepts related to variability, stability, complexity, and adaptability. The purpose of this paper is to compare and contrast these different concepts and demonstrate that, although related, these terms represent fundamentally different aspects of system dynamics. In particular, we argue that variability should not uniformly be equated with stability or complexity of movement. In addition, current dynamic stability measures based on nonlinear analysis methods (such as the finite maximal Lyapunov exponent) can reveal local instabilities in movement dynamics, but the degree to which these local instabilities relate to global postural and gait stability and the ability to resist external perturbations remains to be explored. Finally, systematic studies are needed to relate observed reductions in complexity with aging and disease to the adaptive capabilities of the movement system and how complexity changes as a function of different task constraints.

  17. Computational techniques for inelastic analysis and numerical experiments

    International Nuclear Information System (INIS)

    Yamada, Y.

    1977-01-01

    A number of formulations have been proposed for inelastic analysis, particularly for the thermal elastic-plastic creep analysis of nuclear reactor components. In the elastic-plastic regime, which principally concerns time-independent behavior, numerical techniques based on the finite element method have been well exploited and computations have become routine work. With respect to problems in which time-dependent behavior is significant, it is desirable to incorporate a procedure that works with mechanical-model formulations as well as with the equation-of-state methods proposed so far. A computer program should also take into account the strain-dependent and/or time-dependent microstructural changes which often occur during the operation of structural components at increasingly high temperatures for long periods of time. Special considerations are crucial if the analysis is to be extended to the large-strain regime where geometric nonlinearities predominate. The present paper introduces a rational updated formulation and a computer program under development that take the various requisites stated above into account. (Auth.)

  18. Analysis of the Implementation of Standardized Clinical Protocol «Diabetes Mellitus Type 2» by Quality Indicators in Institutions of Kyiv Region

    Directory of Open Access Journals (Sweden)

    V.I. Tkachenko

    2014-10-01

    Full Text Available In Ukraine, a standardized clinical protocol (SCP) for the provision of medical care in diabetes mellitus type 2 (order of the Ministry of Healthcare of Ukraine dated 21.12.2012 № 1118), which identifies 4 quality indicators, is being implemented. The objective of this research was to analyze the implementation of the SCP based on monitoring of its quality indicators in the institutions of the Kyiv region. Materials and Methods. A technique for assessing the quality of diabetes care, one element of which is the monitoring of the quality indicators specified in the SCP, was developed and applied. Collection and analysis of information were carried out using primary record forms № 025/030 and 030/o and statistical reporting forms № 12 and 20. Statistical analysis was performed using Excel 2007 and SPSS. Results. Today, primary health care institutions in the Kyiv region have developed local protocols, which confirms the implementation of the first quality indicator in accordance with the desired indicator value in the SCP. The second indicator, the percentage of patients whose glycated hemoglobin level was measured in the reporting period, amounted to 12.2 %, which is higher than in 2012 (8.84 %) but remains low. The third quality indicator, the percentage of patients admitted to hospital for diabetes mellitus and its complications during the reporting period, amounted to 15.01 %, whereas in 2012 it stood at 8.66 % (for comparison, this figure was 9.37 % in 2007). Conclusions. The quality of care at this early stage of implementation is insufficient, partly owing to physicians' limited awareness of the major provisions of the protocol, lack of equipment, the need for patients to pay for medical services specified in the protocol, physicians' limited understanding of the characteristics of the different types of medical-technological documents, and difficulties in developing and implementing local protocols in particular. The obtained results are

  19. Quantitative Analysis of TDLUs using Adaptive Morphological Shape Techniques.

    Science.gov (United States)

    Rosebrock, Adrian; Caban, Jesus J; Figueroa, Jonine; Gierach, Gretchen; Linville, Laura; Hewitt, Stephen; Sherman, Mark

    2013-03-29

    Within the complex branching system of the breast, terminal duct lobular units (TDLUs) are the anatomical location where most cancer originates. With aging, TDLUs undergo physiological involution, reflected in a loss of structural components (acini) and a reduction in total number. Data suggest that women undergoing benign breast biopsies that do not show age-appropriate involution are at increased risk of developing breast cancer. To date, TDLU assessments have generally been made by qualitative visual assessment rather than by objective quantitative analysis. This paper introduces a technique to automatically estimate a set of quantitative measurements and use those variables to more objectively describe and classify TDLUs. To validate the accuracy of our system, we computed the morphological properties of 51 TDLUs in breast tissues donated for research by volunteers in the Susan G. Komen Tissue Bank and compared the results to those of a pathologist, demonstrating 70% agreement. Second, to show that our method is applicable to a wider range of datasets, we analyzed 52 TDLUs from biopsies performed for clinical indications in the National Cancer Institute's Breast Radiology Evaluation and Study of Tissues (BREAST) Stamp Project and obtained 82% correlation with visual assessment. Lastly, we demonstrate the ability to uncover novel measures when researching the structural properties of the acini by applying machine learning and clustering techniques. Through our study we found that while the number of acini per TDLU increases exponentially with the TDLU diameter, the average elongation and roundness remain constant.
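
    The paper's exact shape definitions are not given in the record; the sketch below (Python, scikit-image) computes two standard descriptors, circularity ("roundness") and elongation, for each candidate acinus in a binary segmentation mask, as one plausible starting point.

        import numpy as np
        from skimage import measure

        def acinus_shape_stats(mask):
            stats = []
            for r in measure.regionprops(measure.label(mask)):
                # 4*pi*A/P^2 equals 1.0 for a perfect circle and decreases
                # for irregular shapes.
                roundness = 4 * np.pi * r.area / max(r.perimeter, 1e-9) ** 2
                # Ratio of the fitted ellipse axes; 1.0 means no elongation.
                elongation = r.major_axis_length / max(r.minor_axis_length, 1e-9)
                stats.append((r.label, r.area, roundness, elongation))
            return stats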

  20. Techniques of production and analysis of polarized synchrotron radiation

    International Nuclear Information System (INIS)

    Mills, D.M.

    1992-01-01

    The use of the unique polarization properties of synchrotron radiation in the hard x-ray spectral region (E > 3 keV) is becoming increasingly important to many synchrotron radiation researchers. The radiation emitted from bending magnets and conventional (planar) insertion devices (IDs) is highly linearly polarized in the plane of the particle's orbit. Elliptically polarized x-rays can also be obtained by going off axis on a bending magnet source, albeit with considerable loss of flux. The polarization properties of synchrotron radiation can be further tailored to the researcher's specific needs through the use of specialized insertion devices such as helical and crossed undulators and asymmetrical wigglers. Even with the possibility of producing a specific polarization, there is still the need to develop x-ray optical components which can manipulate the polarization for both analysis and further modification of the polarization state. A survey of techniques for producing and analyzing both linearly and circularly polarized x-rays will be presented, with emphasis on those techniques which rely on single-crystal optical components.

  1. Novel technique for coal pyrolysis and hydrogenation product analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pfefferle, L.D.; Boyle, J.

    1993-03-15

    A microjet reactor coupled to a VUV photoionization time-of-flight mass spectrometer has been used to obtain species measurements during high temperature pyrolysis and oxidation of a wide range of hydrocarbon compounds ranging from allene and acetylene to cyclohexane, benzene and toluene. Initial work focused on calibration of the technique, optimization of ion collection and detection, and characterization of limitations. Using the optimized technique with 118 nm photoionization, intermediate species profiles were obtained for analysis of the hydrocarbon pyrolysis and oxidation mechanisms. The "soft" ionization, yielding predominantly molecular ions, allowed the study of reaction pathways in these high temperature systems where both sampling and detection challenges are severe. Work has focused on the pyrolysis and oxidative pyrolysis of aliphatic and aromatic hydrocarbon mixtures representative of coal pyrolysis and hydropyrolysis products. The detailed mass spectra obtained during pyrolysis and oxidation of hydrocarbon mixtures are especially important because of the complex nature of the product mixture even at short residence times and low primary reactant conversions. The combustion community has advanced detailed modeling of pyrolysis and oxidation to the C4 hydrocarbon level, but in general above that size uncertainties in rate constant and thermodynamic data do not allow us to a priori predict products from mixed hydrocarbon pyrolyses using a detailed chemistry model. For pyrolysis of mixtures of coal-derived liquid fractions with a large range of compound structures and molecular weights in the hundreds of amu, the modeling challenge is severe. Lumped models are possible from stable product data.

  2. Analysis techniques for background rejection at the Majorana Demonstrator

    Energy Technology Data Exchange (ETDEWEB)

    Cuestra, Clara [University of Washington; Rielage, Keith Robert [Los Alamos National Laboratory; Elliott, Steven Ray [Los Alamos National Laboratory; Xu, Wenqin [Los Alamos National Laboratory; Goett, John Jerome III [Los Alamos National Laboratory

    2015-06-11

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.
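
    One widely used pulse-shape discriminator for point-contact HPGe detectors is the A/E parameter (maximum current amplitude over event energy), sketched below in Python; whether this matches the Demonstrator's production cuts, and any threshold value, are assumptions here.

        import numpy as np

        def a_over_e(charge_waveform, energy, smooth=5):
            # Current pulse = derivative of the charge waveform, lightly smoothed.
            current = np.convolve(np.gradient(charge_waveform),
                                  np.ones(smooth) / smooth, mode="same")
            # Multi-site (background-like) events spread their energy over
            # several charge arrivals, lowering peak current relative to energy.
            return current.max() / energy

        def passes_cut(charge_waveform, energy, threshold):
            # The threshold is detector- and calibration-specific (assumed input).
            return a_over_e(charge_waveform, energy) >= threshold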

  3. Advances in zymography techniques and patents regarding protease analysis.

    Science.gov (United States)

    Wilkesman, Jeff; Kurz, Liliana

    2012-08-01

    Detection of enzymatic activity by gel electrophoresis, namely zymography, is a technique that has received increasing attention in the last 10 years, judging by the number of articles published. A growing number of enzymes, mainly proteases, are now routinely detected by zymography. Detailed analytical studies are beginning to be published, and new patents have been developed. This article updates the information covered in our last review, condensing the recent publications dealing with the identification of proteolytic enzymes on electrophoretic gel supports and the variations of the technique. The new advances of this method are basically focused on two-dimensional zymography and transfer zymography. Though comparatively fewer patents have been published, they largely coincide in the study of matrix metalloproteases. The field is foreseen to be very productive in the area of zymoproteomics, combining electrophoresis and mass spectrometry for the analysis of proteases.

  4. Assembly homogenization techniques for light water reactor analysis

    International Nuclear Information System (INIS)

    Smith, K.S.

    1986-01-01

    Recent progress in development and application of advanced assembly homogenization methods for light water reactor analysis is reviewed. Practical difficulties arising from conventional flux-weighting approximations are discussed and numerical examples given. The mathematical foundations for homogenization methods are outlined. Two methods, Equivalence Theory and Generalized Equivalence Theory which are theoretically capable of eliminating homogenization error are reviewed. Practical means of obtaining approximate homogenized parameters are presented and numerical examples are used to contrast the two methods. Applications of these techniques to PWR baffle/reflector homogenization and BWR bundle homogenization are discussed. Nodal solutions to realistic reactor problems are compared to fine-mesh PDQ calculations, and the accuracy of the advanced homogenization methods is established. Remaining problem areas are investigated, and directions for future research are suggested. (author)
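
    As background (standard textbook definitions, not reproduced from the review itself): the conventional flux-weighted homogenized cross section for reaction x and energy group g over an assembly volume V is

        \hat{\Sigma}_x^g = \frac{\int_V \Sigma_x^g(\mathbf{r})\, \phi^g(\mathbf{r})\, dV}{\int_V \phi^g(\mathbf{r})\, dV}

    and Generalized Equivalence Theory supplements such parameters with assembly-surface discontinuity factors f^g = \phi^g_{het} / \phi^g_{hom}, which relax flux continuity at nodal interfaces so that the homogenized nodal solution can reproduce the heterogeneous reference solution.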

  5. New approaches in intelligent image analysis techniques, methodologies and applications

    CERN Document Server

    Nakamatsu, Kazumi

    2016-01-01

    This book presents an introduction and 11 independent chapters devoted to various new approaches to intelligent image processing and analysis. The book presents new methods, algorithms and applied systems for intelligent image processing, on the following basic topics: Methods for Hierarchical Image Decomposition; Intelligent Digital Signal Processing and Feature Extraction; Data Clustering and Visualization via Echo State Networks; Clustering of Natural Images in Automatic Image Annotation Systems; Control System for Remote Sensing Image Processing; Tissue Segmentation of MR Brain Images Sequence; Kidney Cysts Segmentation in CT Images; Audio Visual Attention Models in Mobile Robots Navigation; Local Adaptive Image Processing; Learning Techniques for Intelligent Access Control; Resolution Improvement in Acoustic Maps. Each chapter is self-contained with its own references. Some of the chapters are devoted to the theoretical aspects while the others are presenting the practical aspects and the...

  6. ANALYSIS OF ANDROID VULNERABILITIES AND MODERN EXPLOITATION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Himanshu Shewale

    2014-03-01

    Full Text Available Android is an operating system based on the Linux kernel. It is the most widely used and popular operating system among smartphones and portable devices. Its programmable and open nature attracts attackers to take undue advantage. The Android platform allows developers to freely access and modify source code, but at the same time this increases the security issue. A user is likely to download and install malicious applications written by software hackers. This paper focuses on understanding and analyzing the vulnerabilities present in the Android platform. We first study the Android architecture and analyze the existing threats and security weaknesses. Then we identify various exploit-mitigation techniques to mitigate known vulnerabilities. A detailed analysis helps to identify the existing loopholes and gives strategic direction for making the Android operating system more secure.

  7. SHOT PUT O’BRIAN TECHNIQUE, EXTENDING THE ANALYSIS OF TECHNIQUE FROM FOUR TO SIX PHASES WITH THE DESCRIPTION

    Directory of Open Access Journals (Sweden)

    Zlatan Saračević

    2011-09-01

    Full Text Available Due to the complexity of the motion, shot put technique is described in phases for easier analysis, easier learning of the technique, and error correction. The technique is continuous, so that in its execution the transition from phase to phase is not noticeable. In the aforementioned and described phases of the O'Brian spinal shot put technique, a large distance, emptiness and disconnection appear between the initial-position phase and the phase of overtaking the device, which represents a major problem for connecting, training and advancing the technique in the training methods used in primary and secondary education, as well as for students and athletes who are beginners in the shot put. Therefore, this work is aimed at facilitating the methods of training the shot put technique by extending the analysis from four to six phases, which are described and cover the complete O'Brian technique.

  8. Detecting in situ copepod diet diversity using molecular technique: development of a copepod/symbiotic ciliate-excluding eukaryote-inclusive PCR protocol.

    Science.gov (United States)

    Hu, Simin; Guo, Zhiling; Li, Tao; Carpenter, Edward J; Liu, Sheng; Lin, Senjie

    2014-01-01

    Knowledge of in situ copepod diet diversity is crucial for accurately describing pelagic food web structure but is challenging to achieve due to the lack of an easily applicable methodology. To enable analysis with whole copepod-derived DNAs, we developed a copepod-excluding 18S rDNA-based PCR protocol. Although it is effective in depressing amplification of copepod 18S rDNA, its applicability to detect diverse eukaryotes in both mono- and mixed-species samples has not been demonstrated. In addition, the protocol suffers from the problem that sequences from symbiotic ciliates are overrepresented in the retrieved 18S rDNA libraries. In this study, we designed a blocking primer to make a combined primer set (copepod/symbiotic ciliate-excluding eukaryote-common: CEEC) to depress PCR amplification of symbiotic ciliate sequences while maximizing the range of eukaryotes amplified. We first examined the specificity and efficacy of CEEC by PCR-amplifying DNAs from 16 copepod species, 37 representative organisms that are potential prey of copepods and a natural microplankton sample, and then evaluated the efficiency in reconstructing diet composition by detecting the food of both lab-reared and field-collected copepods. Our results showed that the CEEC primer set can successfully amplify 18S rDNA from a wide range of isolated species and mixed-species samples while depressing amplification of that from copepods and the targeted symbiotic ciliates, indicating the universality of CEEC in specifically detecting the prey of copepods. All the predetermined food items offered to copepods in the laboratory were successfully retrieved, suggesting that the CEEC-based protocol can accurately reconstruct the diets of copepods without interference from the copepods and their associated ciliates present in the DNA samples. Our initial application to analyzing the food composition of field-collected copepods uncovered diverse prey species, including those currently known, and those that are unsuspected, as copepod prey

  9. The Heliospheric Cataloguing, Analysis and Techniques Service (HELCATS) project

    Science.gov (United States)

    Barnes, D.; Harrison, R. A.; Davies, J. A.; Perry, C. H.; Moestl, C.; Rouillard, A.; Bothmer, V.; Rodriguez, L.; Eastwood, J. P.; Kilpua, E.; Gallagher, P.; Odstrcil, D.

    2017-12-01

    Understanding solar wind evolution is fundamental to advancing our knowledge of energy and mass transport in the solar system, whilst also being crucial to space weather and its prediction. The advent of truly wide-angle heliospheric imaging has revolutionised the study of solar wind evolution, by enabling direct and continuous observation of both transient and background components of the solar wind as they propagate from the Sun to 1 AU and beyond. The recently completed, EU-funded FP7 Heliospheric Cataloguing, Analysis and Techniques Service (HELCATS) project (1st May 2014 - 30th April 2017) combined European expertise in heliospheric imaging, built up over the last decade in particular through leadership of the Heliospheric Imager (HI) instruments aboard NASA's STEREO mission, with expertise in solar and coronal imaging as well as the interpretation of in-situ and radio diagnostic measurements of solar wind phenomena. HELCATS involved: (1) the cataloguing of transient (coronal mass ejections) and background (stream/corotating interaction regions) solar wind structures observed by the STEREO/HI instruments, including estimates of their kinematic properties based on a variety of modelling techniques; (2) the verification of these kinematic properties through comparison with solar source observations and in-situ measurements at multiple points throughout the heliosphere; (3) the assessment of the potential for initialising numerical models based on the derived kinematic properties of transient and background solar wind components; and (4) the assessment of the complementarity of radio observations (Type II radio bursts and interplanetary scintillation) in the detection and analysis of heliospheric structure in combination with heliospheric imaging observations. In this presentation, we provide an overview of the HELCATS project emphasising, in particular, the principal achievements and legacy of this unprecedented project.

  10. Optical code-division multiple-access protocol with selective retransmission

    Science.gov (United States)

    Mohamed, Mohamed A. A.; Shalaby, Hossam M. H.; El-Badawy, El-Sayed A.

    2006-05-01

    An optical code-division multiple-access (OCDMA) protocol based on selective retransmission technique is proposed. The protocol is modeled using a detailed state diagram and is analyzed using equilibrium point analysis (EPA). Both traditional throughput and average delay are used to examine its performance for several network parameters. In addition, the performance of the proposed protocol is compared to that of the R3T protocol, which is based on a go-back-n technique. Our results show that a higher performance is achieved by the proposed protocol at the expense of system complexity.
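
    The record does not reproduce the EPA equations. The classical idealized ARQ efficiencies below, with packet-error probability p and roughly N packets in flight per round trip, indicate why a selective-retransmission protocol can outperform a go-back-n one such as R3T (a textbook approximation, not the paper's analysis):

        \eta_{SR} \approx 1 - p, \qquad \eta_{GBN} \approx \frac{1 - p}{1 - p + Np}

    Selective retransmission resends only the erroneous packets, while go-back-n also resends the up-to-N correctly received packets that follow each error; the price is the more complex receiver buffering reflected in the system complexity noted above.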

  11. Group differences in physician responses to handheld presentation of clinical evidence: a verbal protocol analysis

    Directory of Open Access Journals (Sweden)

    Pavlovic Nada J

    2007-07-01

    Full Text Available Abstract. Background: To identify individual differences in physicians' needs for the presentation of evidence resources and preferences for mobile devices. Methods: Within-groups analysis of responses to semi-structured interviews. Interviews consisted of using prototypes in response to task-based scenarios. The prototypes were implemented on two different form factors: a tablet-style PC and a pocketPC. Participants were from three user groups: general internists, family physicians and medicine residents, and from two different settings: urban and semi-urban. Verbal protocol analysis, which consists of coding utterances, was conducted on the transcripts of the testing sessions. Statistical relationships were investigated between staff physicians' and residents' background variables, self-reported experiences with the interfaces, and verbal code frequencies. Results: 47 physicians were recruited from general internal medicine, family practice clinics and a residency training program. The mean age of participants was 42.6 years. Physician specialty had a greater effect on device and information-presentation preferences than gender, age, setting or previous technical experience. Family physicians preferred the screen size of the tablet computer and were less concerned about its portability. Residents liked the screen size of the tablet but preferred the portability of the pocketPC. Internists liked the portability of the pocketPC but saw less advantage in the large screen of the tablet computer (F[2,44] = 4.94, p = .012). Conclusion: Different types of physicians have different needs and preferences for evidence-based resources and handheld devices. This study shows how user testing can be incorporated into the process of design to inform group-based customization.

  12. An optimized protocol for generation and analysis of Ion Proton sequencing reads for RNA-Seq.

    Science.gov (United States)

    Yuan, Yongxian; Xu, Huaiqian; Leung, Ross Ka-Kit

    2016-05-26

    Previous studies have compared the running cost, time, and other performance measures of popular sequencing platforms. However, a comprehensive assessment of library construction and analysis protocols for the Proton sequencing platform remains unexplored. Unlike reads from Illumina sequencing platforms, Proton reads are heterogeneous in length and quality, and combining sequencing data from different platforms can produce reads of various lengths. Whether the commonly used software handles such data satisfactorily is unknown. Using universal human reference RNA as the initial material, the RNaseIII and chemical fragmentation methods of library construction showed similar results in the number of genes and junctions discovered and in the accuracy of expression-level estimates. In contrast, sequencing quality, read length and the choice of software affected the mapping rate to a much larger extent. The unspliced aligner TMAP attained the highest mapping rate (97.27 % to genome, 86.46 % to transcriptome), though 47.83 % of mapped reads were clipped. Long reads could paradoxically reduce mapping at junctions. With a reference annotation guide, the mapping rate of TopHat2 significantly increased from 75.79 to 92.09 %, especially for long (>150 bp) reads. Sailfish, a k-mer based gene expression quantifier, attained results highly consistent with those of the TaqMan array, as well as the highest sensitivity. We provide, for the first time, reference statistics on library preparation methods, gene detection and quantification, and junction discovery for RNA-Seq on the Ion Proton platform. Chemical fragmentation performed as well as the enzyme-based method. The optimal Ion Proton sequencing options and analysis software have been evaluated.

  13. Analysis of the RADIUS and Diameter protocols in terms of pricing telecommunication services

    Directory of Open Access Journals (Sweden)

    Vesna M. Radonjić

    2013-06-01

    Full Text Available Accounting of telecommunication services is closely related to the functions of authentication and authorization. These functions are usually considered together and implemented within the same server using a common protocol. The most renowned protocols for authentication, authorization and accounting are the RADIUS and Diameter protocols. AAA functions and related protocols: In this chapter, the accounting management architecture developed by the IETF is presented. It includes the interaction between network elements, accounting servers, and billing and charging servers. Accounting data can be used for management, planning and charging users, as well as for other (specific) purposes. Authentication is the process of confirming a user's digital identity, usually through some type of identifier and related data. Authorization determines whether a particular entity is authorized to perform an activity. Basic functions of the RADIUS protocol: The RADIUS architecture is based on a client-server model. It uses UDP on the transport layer. Transactions between the client and the server are authenticated through the use of a common secret key that is never sent over the network. Given the limited resources available to network devices, RADIUS facilitates and centralizes the charging of end users, provides some protection against active attacks by unauthorized users, and has broad support from different network equipment vendors. Although RADIUS is a widely accepted protocol for the mechanisms of authentication, authorization and accounting, it has certain shortcomings that may be caused by the protocol itself or by its poor implementation. Architecture and operation of the Diameter protocol: Diameter is a scalable protocol designed by the IETF working group in order to eliminate the shortcomings and functional limitations of the RADIUS protocol and eventually to replace it in the near future. Most of the basic Diameter mechanisms and its …
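    As a concrete illustration of the RADIUS property mentioned above (the shared secret authenticates transactions without ever travelling over the network), below is a minimal sketch of the Response Authenticator computation defined in RFC 2865, section 3. The packet values used here are hypothetical placeholders.

```python
import hashlib
import struct

def response_authenticator(code: int, identifier: int, attributes: bytes,
                           request_authenticator: bytes, secret: bytes) -> bytes:
    """RADIUS Response Authenticator (RFC 2865): MD5 over
    Code + ID + Length + RequestAuthenticator + Attributes + SharedSecret.
    The shared secret itself is never transmitted."""
    length = 20 + len(attributes)          # 20-byte header plus attributes
    header = struct.pack("!BBH", code, identifier, length)
    return hashlib.md5(header + request_authenticator + attributes + secret).digest()

# Hypothetical Access-Accept (code 2) answering a request with identifier 42
req_auth = bytes(range(16))                # placeholder Request Authenticator
print(response_authenticator(2, 42, b"", req_auth, b"shared-secret").hex())
```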

  14. [Analysis of palliative sedation in hospitalised elderly patients: Effectiveness of a protocol].

    Science.gov (United States)

    Mateos-Nozal, Jesús; García-Cabrera, Lorena; Montero Errasquín, Beatriz; Cruz-Jentoft, Alfonso José; Rexach Cano, Lourdes

    2016-01-01

    To measure changes in the practice of palliative sedation during agony in hospitalised elderly patients before and after the implementation of a palliative sedation protocol. A retrospective before-after study was performed in hospitalised patients over 65 years old who received midazolam during hospital admission and died in the hospital, in two 3-month periods before and after the implementation of the protocol. Non-sedative uses of midazolam and patients in intensive care were excluded. Patient and admission characteristics, the consent process, withdrawal of life-sustaining treatments, and the sedation process (refractory symptom treated, drug doses, assessment and use of other drugs) were recorded. Association was analysed using the chi-square and Student t tests. A total of 143 patients were included, with no significant differences between groups in demographic characteristics or symptoms. Do-not-resuscitate (DNR) orders were recorded in approximately 70% of the subjects in each group, and informed consent for sedation was recorded in 91% before vs. 84% after the protocol. Induction and maintenance doses of midazolam followed protocol recommendations in 1.3% before vs. 10.4% after the protocol was implemented (P=.02), and adequate rescue doses were used in 1.3% vs. 11.9%, respectively (P=.01). Midazolam doses were significantly lower after implementation (9.86 mg vs. 18.67 mg). A sedation score was used in 8% vs. 12%, and the Palliative Care Team was involved in 35.5% and 16.4% of the cases (P=.008) before and after the protocol, respectively. Use of midazolam slightly improved after the implementation of a hospital protocol on palliative sedation. The percentage of adequate sedations and the general process of sedation were mostly unchanged by the protocol. More education and further assessment are needed to gauge the effect of these measures in the future. Copyright © 2015 SEGG. Published by Elsevier España. All rights reserved.
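    A minimal sketch of the kind of chi-square comparison reported above, using hypothetical before/after adherence counts shaped like the 1.3% vs. 10.4% finding; the study's actual data are not reproduced here.

```python
from scipy.stats import chi2_contingency

# Hypothetical counts: sedations whose doses followed protocol recommendations,
# before vs after implementation (roughly 1/77 vs 7/67).
table = [[1, 76],    # before: adherent, non-adherent
         [7, 60]]    # after : adherent, non-adherent

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p:.3f}, dof={dof}")
```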

  15. Analysis and Verification of a Key Agreement Protocol over Cloud Computing Using Scyther Tool

    OpenAIRE

    Hazem A Elbaz

    2015-01-01

    Most cloud computing authentication mechanisms use public key infrastructure (PKI). Hierarchical Identity Based Cryptography (HIBC) has several advantages that align well with the demands of cloud computing. The main objectives of cloud computing authentication protocols are security and efficiency. In this paper, we describe the Hierarchical Identity Based Authentication Key Agreement (HIB-AKA) protocol, which provides a lightweight key management approach for cloud computing users. Then, we...

  16. Study protocol

    DEFF Research Database (Denmark)

    Smith, Benjamin E; Hendrick, Paul; Bateman, Marcus

    2017-01-01

    … avoidance behaviours, catastrophising, self-efficacy, sport and leisure activity participation, and general quality of life. Follow-up will be at 3 and 6 months. The analysis will focus on descriptive statistics and confidence intervals. The qualitative components will follow a thematic analysis approach. DISCUSSION: This study will evaluate the feasibility of running a definitive large-scale trial on patients with patellofemoral pain within the NHS in the UK. We will identify strengths and weaknesses of the proposed protocol and the utility and characteristics of the outcome measures. The results from this study will inform the design of a multicentre trial. TRIAL REGISTRATION: ISRCTN35272486.

  17. [Clinical outcomes and economic analysis of two ovulation induction protocols in patients undergoing repeated IVF/ICSI cycles].

    Science.gov (United States)

    Chen, Xiao; Geng, Ling; Li, Hong

    2014-04-01

    To compare the clinical outcomes and cost-effectiveness of the luteal phase down-regulation with gonadotrophin-releasing hormone (GnRH) agonist protocol and the GnRH antagonist protocol in patients undergoing repeated in vitro fertilization and intracytoplasmic sperm injection (IVF-ICSI) cycles. A retrospective analysis of clinical outcomes and costs was conducted among 198 patients undergoing repeated IVF-ICSI cycles, including 109 receiving the luteal phase down-regulation with GnRH agonist protocol (group A) and 89 receiving the GnRH antagonist protocol (group B). The numbers of oocytes retrieved and good embryos, clinical pregnancy rate, abortion rate, live birth rate, mean total cost, and the cost-effectiveness ratio were compared between the two groups. In patients undergoing repeated IVF-ICSI cycles, the two protocols produced no significant differences in the number of good embryos, clinical pregnancy rate, abortion rate, or twin pregnancy rate. Compared with group B, group A had better clinical outcomes, though not all differences were statistically significant: the number of retrieved oocytes was significantly greater and the live birth rate significantly higher in group A than in group B (9.13±4.98 vs. 7.11±4.74, and 20.2% vs. 9.0%, respectively). Compared with group B, group A had a higher mean total cost per cycle but lower costs per oocyte retrieved (2729.11 vs. 3038.60 RMB yuan), per good embryo (8867.19 vs. 9644.85 RMB yuan), and per clinical pregnancy (77598.06 vs. 96139.85 RMB yuan). For patients undergoing repeated IVF/ICSI cycles, the luteal phase down-regulation with GnRH agonist protocol produces good clinical outcomes with good cost-effectiveness, despite an unsatisfactory ovarian reserve.
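    A minimal sketch of the cost-effectiveness ratios compared above (mean cost per cycle and cost per unit of outcome). All figures below are hypothetical placeholders, not the study's data.

```python
def cost_effectiveness(total_cost, n_cycles, outcomes):
    """Mean cost per cycle, plus cost per unit outcome (e.g. per pregnancy)."""
    per_cycle = total_cost / n_cycles
    ratios = {name: total_cost / count for name, count in outcomes.items() if count}
    return per_cycle, ratios

# Hypothetical totals shaped like the group comparison in the abstract
per_cycle, ratios = cost_effectiveness(1_500_000, 109,
                                       {"oocyte": 550, "pregnancy": 19})
print(f"{per_cycle:.0f} RMB/cycle, {ratios['oocyte']:.0f} RMB/oocyte, "
      f"{ratios['pregnancy']:.0f} RMB/pregnancy")
```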

  18. A protocol for the development of a critical thinking assessment tool for nurses using a Delphi technique.

    Science.gov (United States)

    Jacob, Elisabeth; Duffield, Christine; Jacob, Darren

    2017-08-01

    The aim of this study was to develop an assessment tool to measure the critical thinking ability of nurses. As increasing numbers of complex patients are admitted to hospitals, it is ever more important that nurses recognise changes in health status and detect deterioration. Detecting early signs of complications requires critical thinking skills. Registered Nurses are expected to commence their clinical careers with the critical thinking skills necessary to ensure safe nursing practice. Currently, there is no published tool for assessing critical thinking skills that is context-specific to Australian nurses. A modified Delphi study will be used for the project. This study will develop a series of unfolding case scenarios, using national health data, with multiple-choice questions to assess critical thinking. Face validity of the scenarios will be determined by an expert reference group of clinical and academic nurses. A Delphi study will determine the answers to the scenario questions. Panel members will be expert clinicians and educators from two states in Australia. Rasch analysis of the questionnaire will assess the validity and reliability of the tool. Funding for the study and Research Ethics Committee approval were obtained in March and November 2016, respectively. Patient outcomes and safety are directly linked to nurses' critical thinking skills. This study will develop an assessment tool to provide a standardized method of measuring nurses' critical thinking skills across Australia. This will give healthcare providers greater confidence in the critical thinking level of graduate Registered Nurses. © 2017 John Wiley & Sons Ltd.

  19. Structural reliability analysis based on the cokriging technique

    International Nuclear Information System (INIS)

    Zhao Wei; Wang Wei; Dai Hongzhe; Xue Guofeng

    2010-01-01

    Approximation methods are widely used in structural reliability analysis because they are simple to create and provide explicit functional relationships between the responses and variables instead of the implicit limit state function. Recently, the kriging method, a semi-parametric interpolation technique that can be used for deterministic optimization and structural reliability, has gained popularity. However, to fully exploit the kriging method, especially in high-dimensional problems, a large number of sample points must be generated to fill the design space, which can be very expensive and even impractical in practical engineering analysis. Therefore, in this paper, a new method, the cokriging method, which is an extension of kriging, is proposed for calculating structural reliability. Cokriging approximation incorporates secondary information, such as the values of the gradients of the function being approximated. This paper explores the use of the cokriging method for structural reliability problems by comparing it with the kriging method on some numerical examples. The results indicate that the cokriging procedure described in this work can generate approximation models that improve both the accuracy and the efficiency of structural reliability analysis, and is a viable alternative to kriging.
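    Off-the-shelf libraries implement plain kriging (Gaussian process regression) rather than the gradient-enhanced cokriging proposed here, so the sketch below shows the baseline the paper extends: a kriging surrogate fitted to a hypothetical limit state function g(x), then sampled by Monte Carlo to estimate a failure probability. The function, sample sizes and kernel are all assumptions for illustration.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def g(x):
    """Hypothetical limit state function; failure when g(x) < 0."""
    return 3.0 - x[:, 0] ** 2 - 0.5 * x[:, 1]

rng = np.random.default_rng(0)
X_train = rng.uniform(-3, 3, size=(40, 2))      # design-of-experiments points
y_train = g(X_train)

# Kriging (Gaussian process) surrogate of the implicit limit state
surrogate = GaussianProcessRegressor(kernel=RBF(length_scale=1.0)).fit(X_train, y_train)

# Monte Carlo on the cheap surrogate instead of the expensive model
X_mc = rng.standard_normal((100_000, 2))
pf = np.mean(surrogate.predict(X_mc) < 0.0)
print(f"estimated failure probability: {pf:.4f}")
```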

  20. SPI Trend Analysis of New Zealand Applying the ITA Technique

    Directory of Open Access Journals (Sweden)

    Tommaso Caloiero

    2018-03-01

    Full Text Available A natural temporary imbalance of water availability, consisting of persistent lower-than-average or higher-than-average precipitation, can cause extreme dry and wet conditions that adversely impact agricultural yields, water resources, infrastructure, and human systems. In this study, dry and wet periods in New Zealand were expressed using the Standardized Precipitation Index (SPI). First, both the short-term (3- and 6-month) and the long-term (12- and 24-month) SPI were estimated, and then possible trends in the SPI values were detected by means of a new graphical technique, the Innovative Trend Analysis (ITA), which allows the trend identification of the low, medium, and high values of a series. Results show that, in every area currently subject to drought, an increase in this phenomenon can be expected. Specifically, the results of this paper highlight that agricultural regions on the eastern side of the South Island, as well as the north-eastern regions of the North Island, are the most consistently vulnerable areas. In fact, in these regions, the trend analysis mainly showed a general reduction in all the values of the SPI: that is, a tendency toward heavier droughts and weaker wet periods.
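    A minimal sketch of the ITA idea described above (Şen's graphical method): split the series into two halves, sort both, and compare them pairwise; points above the 1:1 line indicate an increasing trend for that part of the distribution. The slope indicator 2(mean2 - mean1)/n summarizes the overall tendency. The SPI series below is synthetic, not New Zealand data.

```python
import numpy as np

def innovative_trend_analysis(series):
    """Return the sorted half-series (for a 1:1 scatter plot) and the
    ITA slope indicator 2 * (mean of 2nd half - mean of 1st half) / n."""
    x = np.asarray(series, dtype=float)
    half = len(x) // 2
    first, second = np.sort(x[:half]), np.sort(x[half:2 * half])
    slope = 2.0 * (second.mean() - first.mean()) / len(x)
    return first, second, slope

# Synthetic monthly SPI values with a drying tendency in the second half
rng = np.random.default_rng(1)
spi = rng.normal(0.0, 1.0, 240) - np.linspace(0.0, 0.5, 240)

first, second, slope = innovative_trend_analysis(spi)
print(f"ITA slope indicator: {slope:+.4f} (negative = tendency toward drought)")
```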

  1. Optimized inspection techniques and structural analysis in lifetime management

    International Nuclear Information System (INIS)

    Aguado, M.T.; Marcelles, I.

    1993-01-01

    Preservation of the option of extending the service lifetime of a nuclear power plant beyond its normal design lifetime requires correct remaining-lifetime management from the very beginning of plant operation. The methodology used in plant remaining-lifetime management is essentially based on the use of standard inspections, surveillance and monitoring programs, and calculations such as thermal-stress and fracture mechanics analyses. The inspection techniques should be continuously optimized in order to detect and dimension existing defects with the highest possible degree of accuracy. The information obtained during inspection is combined with the historical data of the components: design, quality, operation, maintenance, and transients, and with the results of destructive testing, fracture mechanics and thermal fatigue analyses. These data are used to estimate the remaining lifetime of nuclear power plant components, systems and structures with the highest possible degree of accuracy. The use of this methodology allows component repairs and replacements to be reduced or avoided and increases the safety levels and availability of the nuclear power plant. Use of this strategy avoids the need for heavy investments at the end of the licensing period.

  2. Machine Learning Techniques for Arterial Pressure Waveform Analysis

    Directory of Open Access Journals (Sweden)

    João Cardoso

    2013-05-01

    Full Text Available The Arterial Pressure Waveform (APW) can provide essential information about arterial wall integrity and arterial stiffness. Most APW analysis frameworks process each hemodynamic parameter individually and do not evaluate inter-dependencies in the overall pulse morphology. The key contribution of this work is the use of machine learning algorithms to deal with vectorized features extracted from the APW. To this end, we follow a five-step evaluation methodology: (1) a custom-designed, non-invasive, electromechanical device was used for data collection from 50 subjects; (2) the acquired position and amplitude of the onset, Systolic Peak (SP), Point of Inflection (Pi) and Dicrotic Wave (DW) were used for the computation of several morphological attributes; (3) pre-processing work on the datasets was performed in order to reduce the number of input features and increase the model accuracy by selecting the most relevant ones; (4) classification of the dataset was carried out using four different machine learning algorithms: Random Forest, BayesNet (probabilistic), J48 (decision tree) and RIPPER (rule-based induction); and (5) we evaluated the trained models, using a majority-voting system, against the respective calculated Augmentation Index (AIx). The classification algorithms proved to be efficient; in particular, Random Forest showed good accuracy (96.95%) and a high area under the Receiver Operating Characteristic (ROC) curve (AUC = 0.961). Finally, during validation tests, a correlation between high-risk labels retrieved from the multi-parametric approach and positive AIx values was verified. This approach allows the design of new hemodynamic morphology vectors and techniques for multiple APW analysis, thus improving the understanding of the arterial pulse, especially when compared to traditional single-parameter analysis, where failure in one parameter measurement component, such as Pi, can jeopardize the whole evaluation.
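    A minimal sketch of step (5) above, majority voting over several classifiers, on synthetic morphology vectors. BayesNet, J48 and RIPPER have no direct scikit-learn ports, so Gaussian naive Bayes and a decision tree stand in for them alongside Random Forest; the feature construction and labels are hypothetical.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical morphology vectors: [SP amplitude, Pi position, DW amplitude, ...]
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 6))
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(0.3, 1.0, 50) > 0).astype(int)  # risk label

# Hard majority vote across stand-ins for the study's learners
vote = VotingClassifier([
    ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
    ("nb", GaussianNB()),
    ("tree", DecisionTreeClassifier(max_depth=4, random_state=0)),
], voting="hard")

print("cross-validated accuracy:", cross_val_score(vote, X, y, cv=5).mean())
```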

  3. The analysis of gastric function using computational techniques

    International Nuclear Information System (INIS)

    Young, Paul

    2002-01-01

    The work presented in this thesis was carried out at the Magnetic Resonance Centre, Department of Physics and Astronomy, University of Nottingham, between October 1996 and June 2000. This thesis describes the application of computerised techniques to the analysis of gastric function, in relation to Magnetic Resonance Imaging data. The implementation of a computer program enabling the measurement of motility in the lower stomach is described in Chapter 6. This method allowed the dimensional reduction of multi-slice image data sets into a 'Motility Plot', from which the motility parameters - the frequency, velocity and depth of contractions - could be measured. The technique was found to be simple, accurate and involved substantial time savings, when compared to manual analysis. The program was subsequently used in the measurement of motility in three separate studies, described in Chapter 7. In Study 1, four different meal types of varying viscosity and nutrient value were consumed by 12 volunteers. The aim of the study was (i) to assess the feasibility of using the motility program in a volunteer study and (ii) to determine the effects of the meals on motility. The results showed that the parameters were remarkably consistent between the 4 meals. However, for each meal, velocity and percentage occlusion were found to increase as contractions propagated along the antrum. The first clinical application of the motility program was carried out in Study 2. Motility from three patients was measured, after they had been referred to the Magnetic Resonance Centre with gastric problems. The results showed that one of the patients displayed an irregular motility, compared to the results of the volunteer study. This result had not been observed using other investigative techniques. In Study 3, motility was measured in Low Viscosity and High Viscosity liquid/solid meals, with the solid particulate consisting of agar beads of varying breakdown strength. The results showed that

  4. Nuclear fuel cycle cost analysis using a probabilistic simulation technique

    International Nuclear Information System (INIS)

    Won, Il Ko; Jong, Won Choi; Chul, Hyung Kang; Jae, Sol Lee; Kun, Jai Lee

    1998-01-01

    A simple approach is described for incorporating the Monte Carlo simulation technique into a fuel cycle cost estimate. As a case study, the once-through and recycle fuel cycle options were tested with some alternatives (i.e., changes to the distribution types of input parameters), and the simulation results were compared with the values calculated by a deterministic method. A three-estimate approach was used to convert cost inputs into the statistical parameters of the assumed probability distributions. Monte Carlo simulation by Latin Hypercube Sampling, followed by sensitivity analyses, proved useful for examining the uncertainty propagation of fuel cycle costs, and could provide information to decision makers more efficiently than a deterministic method. Varying the distribution types of input parameters showed that the values calculated by the deterministic method sat around the 40th to 50th percentile of the output distribution function calculated by probabilistic simulation. Assuming lognormal distributions of inputs, however, the values calculated by the deterministic method sat around the 85th percentile of the output distribution function. The results of the sensitivity analysis also indicated that the front-end components were generally more sensitive than the back-end components, of which the uranium purchase cost was the most important factor of all. The discount rate also contributed substantially to the fuel cycle cost, ranking third to fifth among all components. The results of this study could be useful in applications to other options, such as the DUPIC (Direct Use of PWR spent fuel In CANDU reactors) cycle, which has high cost uncertainty.
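    A minimal sketch of the propagation step described above: a Latin Hypercube design mapped through triangular distributions built from three-estimate (low/mode/high) inputs. The cost components and figures are hypothetical, and triangular distributions are one common reading of a three-estimate approach; the paper also considers lognormal inputs.

```python
import numpy as np
from scipy.stats import qmc, triang

# Hypothetical three-estimate (low, mode, high) unit costs, arbitrary units
components = {
    "uranium":    (30.0,  50.0,  90.0),
    "conversion": ( 4.0,   6.0,  10.0),
    "enrichment": (80.0, 100.0, 140.0),
}

sampler = qmc.LatinHypercube(d=len(components), seed=42)
u = sampler.random(n=10_000)                    # stratified uniform [0,1) design

costs = np.zeros_like(u)
for j, (low, mode, high) in enumerate(components.values()):
    c = (mode - low) / (high - low)             # scipy's triangular shape parameter
    costs[:, j] = triang.ppf(u[:, j], c, loc=low, scale=high - low)

total = costs.sum(axis=1)
print(f"mean={total.mean():.1f}, "
      f"5th/95th pct={np.percentile(total, 5):.1f}/{np.percentile(total, 95):.1f}")
```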

  5. Measuring situation awareness of operation teams in NPPs using a verbal protocol analysis

    International Nuclear Information System (INIS)

    Lee, Seung Woo; Park, Jinkyun; Kim, Ar ryum; Seong, Poong Hyun

    2012-01-01

    Highlights: A method for measuring team situation awareness is developed; verbal protocol analysis is adopted in this method; the method resolves uncertainties in conventional methods; it can be used in evaluating human-system interfaces. Abstract: Situation awareness (SA) continues to receive a considerable amount of attention from the ergonomics community, given that the need for operators to maintain SA is frequently cited as a key to effective and efficient performance. Although complex and dynamic environments such as the main control room (MCR) of a nuclear power plant (NPP) are operated by teams, and while team situation awareness (TSA) is also cited as an important factor, research has largely been limited to individual SA. However, understanding TSA can provide a window onto the characteristics of team skill acquisition as well as the performance of a complex skill. Such knowledge can therefore be valuable in diagnosing team performance successes and failures. Moreover, training and design interventions can target the cognitive underpinnings of team performance, with implications for the design of technological aids to improve team performance. Despite these advantages and the importance of understanding TSA, measures and methods targeting TSA are sparse and fail to address it properly. In this study, an objective TSA measurement method is developed in an effort to understand TSA. First, key considerations for developing such a method are derived. Based on these considerations, the proposed method is developed, focusing mainly on creating logical connections between team communications and TSA. A speech act coding scheme is implemented to analyze team communications. The TSA measurement method developed in this study provides a measure for each level of TSA. A preliminary study revealed that this TSA measurement method is feasible for measuring TSA to a fair extent, and useful insight into TSA was also derived.
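    A minimal sketch of the core mechanic, counting speech-act-coded utterances by SA level (perception, comprehension, projection, after Endsley's three-level model). The coding scheme, codes and transcript below are hypothetical stand-ins for the paper's actual scheme.

```python
from collections import Counter

# Hypothetical mapping from speech-act codes to SA levels
# (1 = perception, 2 = comprehension, 3 = projection)
CODE_TO_LEVEL = {"report_indication": 1, "confirm_state": 2,
                 "explain_cause": 2, "predict_trend": 3}

# Hypothetical coded transcript: (speaker role, speech-act code)
transcript = [("RO", "report_indication"), ("SRO", "confirm_state"),
              ("RO", "explain_cause"), ("SRO", "predict_trend"),
              ("TO", "report_indication")]

levels = Counter(CODE_TO_LEVEL[code] for _, code in transcript)
total = sum(levels.values())
for level in (1, 2, 3):
    print(f"SA level {level}: {levels[level] / total:.0%} of coded utterances")
```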

  6. Analysis of the Sheltered Instruction Observation Protocol Model on Academic Performance of English Language Learners

    Science.gov (United States)

    Ingram, Sandra W.

    This quantitative comparative descriptive study involved analyzing archival data from end-of-course (EOC) test scores in biology of English language learners (ELLs) taught or not taught using the Sheltered Instruction Observation Protocol (SIOP) model. The study includes descriptions and explanations of the benefits of the SIOP model to ELLs, especially in content area subjects such as biology. Researchers have shown that ELLs in high school lag behind their peers in academic achievement in content area subjects. Much of the research on the SIOP model took place in elementary and middle school, and more research was necessary at the high school level. This study involved analyzing student records from archival data to describe and explain whether the SIOP model had an effect on the EOC test scores of ELLs taught or not taught using it. The sample consisted of 527 Hispanic students (283 females and 244 males) from Grades 9-12. An independent-samples t-test determined whether a significant difference existed in the mean EOC test scores of ELLs taught using the SIOP model as opposed to ELLs not taught using the SIOP model. The results indicated that a significant difference existed between the EOC test scores of ELLs taught using the SIOP model and ELLs not taught using it (p = .02). A regression analysis, controlling for free and reduced-price lunch status (p = .001), indicated that the SIOP model was a significant predictor of passing scores on the EOC test in biology at the school level. When free and reduced-price lunch status was analyzed together with the SIOP data, the combination was not significant (p = .175) in predicting passing scores on the EOC test in high school biology. Future researchers should repeat the study with student-level data as opposed to school-level data, and data should span at least three years.
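    A minimal sketch of the independent-samples t-test used above, run on synthetic scores with the study's group sizes; the means and spreads are hypothetical, not the study's data.

```python
import numpy as np
from scipy.stats import ttest_ind

# Synthetic EOC biology scores for ELLs taught with / without the SIOP model
rng = np.random.default_rng(0)
siop = rng.normal(72, 10, 283)       # hypothetical mean 72, n = 283
non_siop = rng.normal(69, 10, 244)   # hypothetical mean 69, n = 244

t, p = ttest_ind(siop, non_siop)
print(f"t={t:.2f}, p={p:.3f}")
```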

  7. Extraction and Analysis of Information Related to Research & Development Declared Under an Additional Protocol

    International Nuclear Information System (INIS)

    Idinger, J.; Labella, R.; Rialhe, A.; Teller, N.

    2015-01-01

    The additional protocol (AP) provides important tools to strengthen and improve the effectiveness and efficiency of the safeguards system. Safeguards are designed to verify that States comply with their international commitments not to use nuclear material or to engage in nuclear-related activities for the purpose of developing nuclear weapons or other nuclear explosive devices. Under an AP based on INFCIRC/540, a State must provide to the IAEA additional information about, and inspector access to, all parts of its nuclear fuel cycle. In addition, the State has to supply information about its nuclear fuel cycle-related research and development (R&D) activities. The majority of States declare their R&D activities under the AP Articles 2.a.(i), 2.a.(x), and 2.b.(i) as part of initial declarations and their annual updates under the AP. In order to verify consistency and completeness of information provided under the AP by States, the Agency has started to analyze declared R&D information by identifying interrelationships between States in different R&D areas relevant to safeguards. The paper outlines the quality of R&D information provided by States to the Agency, describes how the extraction and analysis of relevant declarations are currently carried out at the Agency and specifies what kinds of difficulties arise during evaluation in respect to cross-linking international projects and finding gaps in reporting. In addition, the paper tries to elaborate how the reporting quality of AP information with reference to R&D activities and the assessment process of R&D information could be improved. (author)

  8. Protocol: optimising hydroponic growth systems for nutritional and physiological analysis of Arabidopsis thaliana and other plants

    Science.gov (United States)

    2013-01-01

    Background: Hydroponic growth systems are a convenient platform for studying whole-plant physiology. However, we found through trialling systems as they are described in the literature that our experiments were frequently confounded by factors that affected plant growth, including algal contamination and hypoxia. We also found that the way in which the plants were grown made them poorly amenable to a number of common physiological assays. Results: The drivers for the development of this hydroponic system were: 1) the exclusion of light from the growth solution; 2) simplified handling of individual plants; and 3) growth of the plant in a form that allows easy implementation of multiple assays. These aims were all met by the use of pierced lids of black microcentrifuge tubes. Seed was germinated on a lid filled with an agar-containing germination medium immersed in the same solution. Following germination, the liquid growth medium was exchanged with the experimental solution, and after 14-21 days seedlings were transferred to larger tanks with aerated solution, where they remained until experimentation. We provide details of the protocol, including the composition of the basal growth solution and of separate solutions with altered calcium, magnesium, potassium or sodium supply that maintain the activity of the majority of other ions. We demonstrate the adaptability of this system for: gas exchange measurement on single leaves and whole plants; qRT-PCR to probe the transcriptional response of roots or shoots to altered nutrient composition in the growth solution (demonstrated using high and low calcium supply); producing highly competent mesophyll protoplasts; and accelerating the screening of Arabidopsis transformants. This system is also ideal for manipulating plants for micropipette techniques such as electrophysiology or SiCSA. Conclusions: We present an optimised plant hydroponic culture system that can be quickly and cheaply constructed, and produces plants with similar

  9. Automatic Satellite Telemetry Analysis for SSA using Artificial Intelligence Techniques

    Science.gov (United States)

    Stottler, R.; Mao, J.

    In April 2016, General Hyten, commander of Air Force Space Command, announced the Space Enterprise Vision (SEV) (http://www.af.mil/News/Article-Display/Article/719941/hyten-announces-space-enterprise-vision/). The SEV addresses increasing threats to space-related systems. The vision includes an integrated approach across all mission areas (communications, positioning, navigation and timing, missile warning, and weather data) and emphasizes improved access to data across the entire enterprise and the ability to protect space-related assets and capabilities. "The future space enterprise will maintain our nation's ability to deliver critical space effects throughout all phases of conflict," Hyten said. Satellite telemetry is going to become available to a new audience. While that telemetry information should be valuable for achieving Space Situational Awareness (SSA), these new consumers of satellite telemetry data will not know how to utilize it. We were tasked with applying AI techniques to build an infrastructure for processing satellite telemetry into higher-abstraction-level, symbolic space situational awareness, and with initially populating that infrastructure with useful data analysis methods. We are working with two organizations, Montana State University (MSU) and the Air Force Academy, both of whom control satellites and therefore currently analyze satellite telemetry to assess the health and circumstances of their satellites. The design that has resulted from our knowledge elicitation and cognitive task analysis is a hybrid approach which combines the symbolic processing techniques of Case-Based Reasoning (CBR) and Behavior Transition Networks (BTNs) with current machine learning approaches. BTNs are used to represent the process and associated formulas for checking telemetry values against anticipated problems and issues. CBR is used to represent and retrieve BTNs that describe an investigative process to be applied to the telemetry in certain circumstances
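    A minimal sketch of the BTN idea as described above: a toy state machine whose states check telemetry values and transition until a diagnosis is reached. The states, thresholds and telemetry fields are hypothetical illustrations, not the system's actual checks or formalism.

```python
# Hypothetical behavior transition network: each non-terminal state inspects
# one telemetry condition and names the next state to visit.
BTN = {
    "start":      lambda t: "check_batt" if t["eclipse"] else "nominal",
    "check_batt": lambda t: "check_temp" if t["batt_v"] < 11.5 else "nominal",
    "check_temp": lambda t: "deep_discharge" if t["batt_temp_c"] > 35 else "sensor_fault",
}
TERMINAL = {"nominal", "deep_discharge", "sensor_fault"}

def run_btn(telemetry, state="start"):
    """Walk the network from 'start' until a terminal diagnosis is reached."""
    while state not in TERMINAL:
        state = BTN[state](telemetry)
    return state

print(run_btn({"eclipse": True, "batt_v": 11.2, "batt_temp_c": 41}))  # deep_discharge
```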

  10. A new technique for quantitative analysis of hair loss in mice using grayscale analysis.

    Science.gov (United States)

    Ponnapakkam, Tulasi; Katikaneni, Ranjitha; Gulati, Rohan; Gensure, Robert

    2015-03-09

    Alopecia is a common form of hair loss which can occur in many different conditions, including male-pattern hair loss, polycystic ovarian syndrome, and alopecia areata. Alopecia can also occur as a side effect of chemotherapy in cancer patients. In this study, our goal was to develop a consistent and reliable method to quantify hair loss in mice, which will allow investigators to accurately assess and compare new therapeutic approaches for these various forms of alopecia. The method utilizes a standard gel imager to obtain and process images of mice, measuring the light absorption, which occurs in rough proportion to the amount of black (or gray) hair on the mouse. Data that has been quantified in this fashion can then be analyzed using standard statistical techniques (i.e., ANOVA, T-test). This methodology was tested in mouse models of chemotherapy-induced alopecia, alopecia areata and alopecia from waxing. In this report, the detailed protocol is presented for performing these measurements, including validation data from C57BL/6 and C3H/HeJ strains of mice. This new technique offers a number of advantages, including relative simplicity of application, reliance on equipment which is readily available in most research laboratories, and applying an objective, quantitative assessment which is more robust than subjective evaluations. Improvements in quantification of hair growth in mice will improve study of alopecia models and facilitate evaluation of promising new therapies in preclinical studies.
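    A minimal sketch of the measurement and analysis pipeline described above: mean image darkness as a proxy for dark-hair coverage, compared across groups with ANOVA. Synthetic arrays stand in for gel-imager frames, and the intensity ranges are hypothetical.

```python
import numpy as np
from scipy.stats import f_oneway

def mean_absorption(gray_image):
    """Mean darkness (0 = white, 1 = black) of an 8-bit grayscale image,
    a rough proxy for the amount of dark hair in the imaged region."""
    return 1.0 - np.asarray(gray_image, dtype=float).mean() / 255.0

# Synthetic stand-ins for imager frames: treated mice imaged lighter (less hair)
rng = np.random.default_rng(0)
control = [mean_absorption(rng.integers(40, 90,  (300, 300))) for _ in range(8)]
treated = [mean_absorption(rng.integers(90, 160, (300, 300))) for _ in range(8)]

stat, p = f_oneway(control, treated)      # ANOVA across groups, as in the protocol
print(f"F={stat:.1f}, p={p:.2e}")
```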

  11. Analysis of quality of service protocols in telecommunication networks

    Directory of Open Access Journals (Sweden)

    Milojko Jevtović

    2003-05-01

    Full Text Available Quality of Service (QoS) protocols in current and future telecommunication networks have been developed, among other things, to support different Classes of Service (CoS), real-time communication, and the transfer of multimedia messages over packet-switched IP (Internet Protocol) networks. This paper gives an overview of the characteristics of these protocols and an assessment of their concrete capabilities for providing quality of service within a system ('top to bottom', i.e. vertically through the OSI architecture) as well as 'horizontally', i.e. end to end, between source and destination. Today's and future telecommunication networks must enable transmission across heterogeneous environments using different QoS protocols, which employ a variety of complementary mechanisms to enable deterministic end-to-end data delivery; the paper analyses these protocols and their efficiency in providing QoS and CoS.

  12. Performance Analysis of the IEEE 802.11p Multichannel MAC Protocol in Vehicular Ad Hoc Networks.

    Science.gov (United States)

    Song, Caixia

    2017-12-12

    Vehicular Ad Hoc Networks (VANETs) employ multiple channels to provide a variety of safety and non-safety applications, based on the IEEE 802.11p and IEEE 1609.4 protocols. The safety applications require timely and reliable transmission, while the non-safety applications require efficient and high throughput. In the IEEE 1609.4 protocol, the operating interval is divided into alternating Control Channel (CCH) and Service Channel (SCH) intervals of identical length. During the CCH interval, nodes transmit safety-related messages and control messages, and the Enhanced Distributed Channel Access (EDCA) mechanism is employed to allow four Access Categories (ACs) within a station, with different priorities according to their criticality for the vehicle's safety. During the SCH interval, the non-safety messages are transmitted. An analytical model is proposed in this paper to evaluate the performance, reliability and efficiency of the IEEE 802.11p and IEEE 1609.4 protocols. The proposed model improves on existing work by taking several aspects and the character of multichannel switching into design consideration. Extensive performance evaluations based on analysis and simulation help to validate the accuracy of the proposed model, to analyze the capabilities and limitations of the IEEE 802.11p and IEEE 1609.4 protocols, and to give enhancement suggestions.
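    A minimal sketch of the EDCA prioritization mentioned above: each Access Category waits AIFS = SIFS + AIFSN x slot before its random backoff, so a smaller AIFSN and CWmin mean earlier channel access for safety-critical traffic. The timing and per-AC values below are illustrative figures commonly cited for 802.11p 10 MHz channels, not normative constants.

```python
# Illustrative 802.11p timing (seconds) and per-AC EDCA parameters (AIFSN, CWmin)
SIFS, SLOT = 32e-6, 13e-6
EDCA = {"AC_VO": (2, 3), "AC_VI": (3, 7), "AC_BE": (6, 15), "AC_BK": (9, 15)}

for ac, (aifsn, cw_min) in EDCA.items():
    aifs = SIFS + aifsn * SLOT              # fixed wait before backoff starts
    mean_backoff = (cw_min / 2) * SLOT      # mean initial backoff (approximation)
    print(f"{ac}: AIFS={aifs * 1e6:.0f} us, "
          f"mean access delay ~{(aifs + mean_backoff) * 1e6:.0f} us")
```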

  13. Application of Behavior Change Techniques in a Personalized Nutrition Electronic Health Intervention Study: Protocol for the Web-Based Food4Me Randomized Controlled Trial

    Science.gov (United States)

    Macready, Anna L; Fallaize, Rosalind; Butler, Laurie T; Ellis, Judi A; Kuznesof, Sharron; Frewer, Lynn J; Celis-Morales, Carlos; Livingstone, Katherine M; Araújo-Soares, Vera; Fischer, Arnout RH; Stewart-Knox, Barbara J; Mathers, John C

    2018-01-01

    Background To determine the efficacy of behavior change techniques applied in dietary and physical activity intervention studies, it is first necessary to record and describe techniques that have been used during such interventions. Published frameworks used in dietary and smoking cessation interventions undergo continuous development, and most are not adapted for Web-based delivery. The Food4Me study (N=1607) provided the opportunity to use existing frameworks to describe standardized Web-based techniques employed in a large-scale, internet-based intervention to change dietary behavior and physical activity. Objective The aims of this study were (1) to describe techniques embedded in the Food4Me study design and explain the selection rationale and (2) to demonstrate the use of behavior change technique taxonomies, develop standard operating procedures for training, and identify strengths and limitations of the Food4Me framework that will inform its use in future studies. Methods The 6-month randomized controlled trial took place simultaneously in seven European countries, with participants receiving one of four levels of personalized advice (generalized, intake-based, intake+phenotype–based, and intake+phenotype+gene–based). A three-phase approach was taken: (1) existing taxonomies were reviewed and techniques were identified a priori for possible inclusion in the Food4Me study, (2) a standard operating procedure was developed to maintain consistency in the use of methods and techniques across research centers, and (3) the Food4Me behavior change technique framework was reviewed and updated post intervention. An analysis of excluded techniques was also conducted. Results Of 46 techniques identified a priori as being applicable to Food4Me, 17 were embedded in the intervention design; 11 were from a dietary taxonomy, and 6 from a smoking cessation taxonomy. In addition, the four-category smoking cessation framework structure was adopted for clarity of

  14. Improved mesh based photon sampling techniques for neutron activation analysis

    International Nuclear Information System (INIS)

    Relson, E.; Wilson, P. P. H.; Biondo, E. D.

    2013-01-01

    The design of fusion power systems requires analysis of neutron activation of large, complex volumes, and of the resulting particles emitted from these volumes. Structured mesh-based discretization of these problems allows for improved modeling in these activation analysis problems. Finer discretization of these problems results in large computational costs, which drives the investigation of more efficient methods. Within an ad hoc subroutine of the Monte Carlo transport code MCNP, we implement sampling of voxels and photon energies for volumetric sources using the alias method. The alias method enables efficient sampling of a discrete probability distribution, and operates in O(1) time, whereas the simpler direct discrete method requires O(log(n)) time. By using the alias method, voxel sampling becomes a viable alternative to sampling space with the O(1) approach of uniformly sampling the problem volume. Additionally, with voxel sampling it is straightforward to introduce biasing of volumetric sources, and we implement this biasing of voxels as an additional variance reduction technique that can be applied. We verify our implementation and compare the alias method, with and without biasing, to direct discrete sampling of voxels, and to uniform sampling. We study the behavior of source biasing in a second set of tests and find trends between improvements and source shape, material, and material density. Overall, however, the magnitude of improvements from source biasing appears to be limited. Future work will benefit from the implementation of efficient voxel sampling, particularly with conformal unstructured meshes, where the uniform sampling approach cannot be applied. (authors)
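    A minimal sketch of the alias method itself (Vose's construction): an O(n) table build followed by O(1) sampling per draw, which is what makes per-voxel sampling cheap. The example weights are arbitrary, and the probabilities are assumed to sum to one.

```python
import random

def build_alias(probs):
    """Vose alias tables for O(1) sampling of a discrete distribution."""
    n = len(probs)
    scaled = [p * n for p in probs]                 # rescale so 'fair' cells = 1
    small = [i for i, p in enumerate(scaled) if p < 1.0]
    large = [i for i, p in enumerate(scaled) if p >= 1.0]
    prob, alias = [0.0] * n, [0] * n
    while small and large:
        s, l = small.pop(), large.pop()
        prob[s], alias[s] = scaled[s], l            # fill cell s's deficit from l
        scaled[l] -= 1.0 - scaled[s]
        (small if scaled[l] < 1.0 else large).append(l)
    for i in small + large:                         # leftovers are numerically ~1
        prob[i] = 1.0
    return prob, alias

def sample(prob, alias, rng=random):
    i = rng.randrange(len(prob))                    # pick a column uniformly ...
    return i if rng.random() < prob[i] else alias[i]  # ... then flip its coin

# Example: biased voxel weights for a volumetric source
prob, alias = build_alias([0.5, 0.3, 0.15, 0.05])
counts = [0] * 4
for _ in range(100_000):
    counts[sample(prob, alias)] += 1
print(counts)   # roughly [50000, 30000, 15000, 5000]
```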

  15. Trends in grazing emission x-ray analysis techniques

    International Nuclear Information System (INIS)

    Grieken, R. van; Tsuji, K.; Injuk, J.

    2000-01-01

    …then, the detection limits imposed by the semiconductor industry roadmap can probably not be obtained by tube-excited GEXRF. The perspectives for tube-excited GEXRF are thus rather poor. Future developments imply the combination of GEXRF with synchrotron radiation excitation. Grazing-emission particle-induced X-ray emission (GE-PIXE) suffers from similar quantification problems for material deposited on a carrier, but it makes PIXE a surface-sensitive technique, while normally the protons penetrate some tens of μm into the sample. Similarly, grazing-emission electron probe micro-analysis (GE-EPMA) allows particles on a flat carrier to be analyzed selectively, allows surface sensitivities in the nm rather than the μm range, and yields, in principle, a spatial resolution for chemical analysis similar to the size of the impinging electron beam, rather than of the electron-excited volume. Both GE-PIXE and GE-EPMA need to be explored more fully in the near future. (author)

  16. Romanian medieval earring analysis by X-ray fluorescence technique

    International Nuclear Information System (INIS)

    Therese, Laurent; Guillot, Philippe; Muja, Cristina

    2011-01-01

    Full text: Several instrumental techniques of elemental analysis are now used for the characterization of archaeological materials. The combination of archaeological and analytical information can provide significant knowledge about the origin of the constituting material, heritage authentication and restoration, provenance, migration, social interaction and exchange. Surface mapping techniques such as X-ray fluorescence have become a powerful tool for obtaining qualitative and semi-quantitative information about the chemical composition of cultural heritage materials, including metallic archaeological objects. In this study, the material comes from the Middle Age cemetery of Feldioara (Romania). The excavation of the site, located between the evangelical church and the parsonage, led to the discovery of several funeral artifacts in 18 graves among a total of 127 excavated. Although the inventory was quite poor, some of the objects helped in establishing the chronology. Six anonymous Hungarian denarii (silver coins) were attributed to Geza II (1141-1161) and Stefan III (1162-1172), placing the cemetery in the second half of the XII century. This period was also confirmed by three loop-shaped earrings with the end in 'S' form (one small and two large earrings). The small earring was found during the excavation in grave number 86, while the two others were discovered together in grave number 113. The anthropological study showed that the skeletons excavated from graves 86 and 113 belonged respectively to a child (1 individual, medium level of preservation, 9 months +/- 3 months) and to an adult (1 individual). In this work, elemental maps were obtained by the X-ray fluorescence (XRF) technique with a Jobin Yvon Horiba XGT-5000 instrument offering detailed elemental images with a spatial resolution of 100 μm. The analysis revealed that the earrings were composed of copper, zinc and tin as major elements. Minor elements were also determined. The comparison between the two large earrings

  17. Romanian medieval earring analysis by X-ray fluorescence technique

    Energy Technology Data Exchange (ETDEWEB)

    Therese, Laurent; Guillot, Philippe, E-mail: philippe.guillot@univ-jfc.fr [Laboratoire Diagnostics des Plasmas, CUFR J.F.C, Albi (France); Muja, Cristina [Laboratoire Diagnostics des Plasmas, CUFR J.F.C, Albi (France); Faculty of Biology, University of Bucharest (Romania); Vasile Parvan Institute of Archaeology, Bucharest, (Romania)

    2011-07-01

    Full text: Several instrumental techniques of elemental analysis are now used for the characterization of archaeological materials. The combination of archaeological and analytical information can provide significant knowledge about the origin of the constituting material, heritage authentication and restoration, provenance, migration, social interaction and exchange. Surface mapping techniques such as X-ray fluorescence have become a powerful tool for obtaining qualitative and semi-quantitative information about the chemical composition of cultural heritage materials, including metallic archaeological objects. In this study, the material comes from the Middle Age cemetery of Feldioara (Romania). The excavation of the site, located between the evangelical church and the parsonage, led to the discovery of several funeral artifacts in 18 graves among a total of 127 excavated. Although the inventory was quite poor, some of the objects helped in establishing the chronology. Six anonymous Hungarian denarii (silver coins) were attributed to Geza II (1141-1161) and Stefan III (1162-1172), placing the cemetery in the second half of the XII century. This period was also confirmed by three loop-shaped earrings with the end in 'S' form (one small and two large earrings). The small earring was found during the excavation in grave number 86, while the two others were discovered together in grave number 113. The anthropological study showed that the skeletons excavated from graves 86 and 113 belonged respectively to a child (1 individual, medium level of preservation, 9 months +/- 3 months) and to an adult (1 individual). In this work, elemental maps were obtained by the X-ray fluorescence (XRF) technique with a Jobin Yvon Horiba XGT-5000 instrument offering detailed elemental images with a spatial resolution of 100 μm. The analysis revealed that the earrings were composed of copper, zinc and tin as major elements. Minor elements were also determined. The comparison between the two

  18. [Professor Xu Fu-song's traditional Chinese medicine protocols for male diseases: A descriptive analysis].

    Science.gov (United States)

    Liu, Cheng-yong; Xu, Fu-song

    2015-04-01

    To analyze the efficacy and medication principles of Professor Xu Fu-song's traditional Chinese medicine (TCM) protocols for male diseases. We reviewed and descriptively analyzed the unpublished complete medical records of 100 male cases treated by Professor Xu Fu-song with his TCM protocols from 1978 to 1992. The 100 cases involved 32 male diseases, most of which were difficult and complicated cases. The drug compliance was 95%. Each prescription was made up of 14 traditional Chinese drugs on average. The cure rate was 32%, and the effective rate was 85%. Professor Xu Fu-song advanced and proved some new theories and therapeutic methods. Professor Xu Fu-song's TCM protocols can be applied to a wide range of male diseases, mostly complicated ones, and are characterized by accurate differentiation of symptoms and signs, high drug compliance, and excellent therapeutic efficacy.

  19. Elemental analysis of brazing alloy samples by neutron activation technique

    International Nuclear Information System (INIS)

    Eissa, E.A.; Rofail, N.B.; Hassan, A.M.; El-Shershaby, A.; Walley El-Dine, N.

    1996-01-01

    Two brazing alloy samples (CP2 and CP3) have been investigated by the neutron activation analysis (NAA) technique in order to identify and estimate their constituent elements. The pneumatic irradiation rabbit system (PIRS), installed at the first Egyptian research reactor (ETRR-1), was used for short-time irradiation (30 s) with a thermal neutron flux of 1.6 × 10¹¹ n/cm²/s in the reactor reflector, where the thermal to epithermal neutron flux ratio is 106. Long-time irradiation (48 hours) was performed at the reactor core periphery with a thermal neutron flux of 3.34 × 10¹² n/cm²/s and a thermal to epithermal neutron flux ratio of 79. Activation by epithermal neutrons was taken into account for the (1/v) and resonance neutron absorption in both methods. A hyper-pure germanium detection system was used for gamma-ray acquisitions. The concentration values of Al, Cr, Fe, Co, Cu, Zn, Se, Ag and Sb were estimated as percentages of the sample weight and compared with reported values. 1 tab.

  20. A novel preconcentration technique for the PIXE analysis of water

    International Nuclear Information System (INIS)

    Savage, J.M.; Fernandez, R.F.; Zhang, W.; Robertson, J.D.; Majidi, V.

    1995-01-01

    The potential of using dried algae as a novel preconcentration technique for the analysis of water samples by PIXE was examined. The algae cells were found to contain significant levels of P and S, indicative of phosphorus- and sulfur-containing groups on the cell wall or inside the algae cells, which may serve as potential binding sites for metal ions. When C. vulgaris was used on mixed metal solutions, linear responses were observed for Ag⁺, Ba²⁺, and Cd²⁺ in the concentration range from 10 ng/g to 1 μg/g; for Cu²⁺ and Pb²⁺ from 10 ng/g to 5 μg/g; and for Hg²⁺ from 10 ng/g to 10 μg/g. When S. bacillaris was used, linear responses were observed from 10 ng/g up to 10 μg/g for all of the metal cations investigated. The PIXE results demonstrated that metal binding at low concentrations involves replacement of sodium on the cell wall and that at high concentrations magnesium was also replaced. Competitive binding studies indicate that the metal ions Ag⁺, Ba²⁺, Cd²⁺, Cu²⁺, and Pb²⁺ share common binding sites, with binding efficiencies varying in the sequence Pb²⁺ > Cu²⁺ > Ag⁺ > Cd²⁺ > Ba²⁺. The binding of Hg²⁺ involved a different binding site, with an increase in binding efficiency in the presence of Ag⁺. (orig.)

  1. A novel preconcentration technique for the PIXE analysis of water

    International Nuclear Information System (INIS)

    Savage, J.M.; Robertson, J.D.; Majidi, V.

    1994-01-01

    The potential of using dried algae as a novel preconcentration technique for the analysis of water samples by PIXE was examined. 5 mg of dried algae powder were mixed with 5 mL of single- and multi-metal solutions. The algae cells were then collected by filtration on 0.6 μm polycarbonate membranes and analyzed by PIXE using a dual-energy irradiation. When C. vulgaris was used on mixed metal solutions, linear responses were observed for Ag⁺, Ba²⁺, and Cd²⁺ in the concentration range from 10 ng/g to 1 μg/g; for Cu²⁺ and Pb²⁺ from 10 ng/g to 5 μg/g; and for Hg²⁺ from 10 ng/g to 10 μg/g. When S. bacillaris was used, linear responses were observed from 10 ng/g up to 10 μg/g for all of the metal cations investigated. The PIXE results demonstrated that metal binding at low concentrations involves replacement of sodium on the cell wall and that at high concentrations magnesium is also replaced
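    Both PIXE records above rest on calibration over a linear response range; below is a minimal sketch of that step, fitting peak area against spiked concentration and inverting the fit to quantify an unknown. All numbers are hypothetical, not the papers' data.

```python
import numpy as np

# Hypothetical calibration: spiked concentrations (ng/g) vs. background-corrected
# PIXE peak areas for one element, over the linear response range.
conc = np.array([10, 50, 100, 500, 1000], dtype=float)     # ng/g
peak = np.array([21, 103, 198, 1015, 1990], dtype=float)   # counts

slope, intercept = np.polyfit(conc, peak, 1)
r = np.corrcoef(conc, peak)[0, 1]
print(f"sensitivity = {slope:.3f} counts per ng/g, r^2 = {r**2:.4f}")

# Invert the fit to quantify an unknown sample from its peak area
unknown_peak = 640.0
print(f"estimated concentration: {(unknown_peak - intercept) / slope:.0f} ng/g")
```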

  2. Seismic margin analysis technique for nuclear power plant structures

    International Nuclear Information System (INIS)

    Seo, Jeong Moon; Choi, In Kil

    2001-04-01

    In general, Seismic Probabilistic Risk Assessment (SPRA) and Seismic Margin Assessment (SMA) are used for the evaluation of the realistic seismic capacity of nuclear power plant structures. Seismic PRA is a systematic process for evaluating the seismic safety of a nuclear power plant. In our country, SPRA has been used to perform probabilistic safety assessments for the earthquake event. SMA is a simple and cost-effective way to quantify the seismic margin of individual structural elements. This study was performed to improve the reliability of SMA results and to confirm the assessment procedure. To achieve this goal, a review of the current status of the techniques and procedures was performed. Two methodologies were developed for the seismic margin review of NPP structures: CDFM (Conservative Deterministic Failure Margin), sponsored by the NRC, and FA (Fragility Analysis), sponsored by EPRI. The FA method was originally developed for Seismic PRA. The CDFM approach is more amenable to use by experienced design engineers, including utility staff design engineers. In this study, a detailed review of the procedures of the CDFM and FA methodologies was performed.
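    A minimal sketch of the fragility quantities these methodologies work with, using the standard lognormal fragility model: the HCLPF (High Confidence of Low Probability of Failure) capacity derived from a median capacity and the randomness/uncertainty log-standard deviations, and a mean fragility curve with the composite beta. The parameter values are hypothetical.

```python
import math
from statistics import NormalDist

def hclpf(a_m, beta_r, beta_u):
    """HCLPF capacity from fragility parameters: median capacity a_m and
    log-standard deviations for randomness (beta_r) and uncertainty (beta_u).
    The 1.645 factor places the estimate at 95% confidence of less than
    5% failure probability."""
    return a_m * math.exp(-1.645 * (beta_r + beta_u))

def fragility(a, a_m, beta_c):
    """Mean fragility curve: P(failure | PGA = a) with composite beta."""
    return NormalDist().cdf(math.log(a / a_m) / beta_c)

# Hypothetical component: median capacity 0.9 g, beta_r = 0.25, beta_u = 0.35
print(f"HCLPF = {hclpf(0.9, 0.25, 0.35):.2f} g")
print(f"P(fail | 0.3 g) = {fragility(0.3, 0.9, math.hypot(0.25, 0.35)):.3f}")
```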

  3. Analysis of Program Obfuscation Schemes with Variable Encoding Technique

    Science.gov (United States)

    Fukushima, Kazuhide; Kiyomoto, Shinsaku; Tanaka, Toshiaki; Sakurai, Kouichi

    Program analysis techniques have improved steadily over the past several decades, and software obfuscation schemes have come to be used in many commercial programs. A software obfuscation scheme transforms an original program or a binary file into an obfuscated program that is more complicated and difficult to analyze, while preserving its functionality. However, the security of obfuscation schemes has not been properly evaluated. In this paper, we analyze obfuscation schemes in order to clarify the advantages of our scheme, the XOR-encoding scheme. First, we more clearly define five types of attack models that we defined previously, and define quantitative resistance to these attacks. Then, we compare the security, functionality and efficiency of three obfuscation schemes with encoding variables: (1) Sato et al.'s scheme with linear transformation, (2) our previous scheme with affine transformation, and (3) the XOR-encoding scheme. We show that the XOR-encoding scheme is superior with regard to the following two points: (1) the XOR-encoding scheme is more secure against a data-dependency attack and a brute force attack than our previous scheme, and is as secure against an information-collecting attack and an inverse transformation attack as our previous scheme, (2) the XOR-encoding scheme does not restrict the calculable ranges of programs and the loss of efficiency is less than in our previous scheme.
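    A minimal sketch of variable encoding by XOR, the scheme analyzed above: a variable lives in memory only in encoded form and is decoded around the operations that need the plain value. The key and values are arbitrary toys; real schemes pick keys per variable at obfuscation time and rewrite expressions to operate on encoded values where possible.

```python
KEY = 0xA5  # hypothetical per-variable XOR key fixed at obfuscation time

def enc(v: int) -> int:
    return v ^ KEY

def dec(v: int) -> int:
    return v ^ KEY   # XOR is its own inverse

# Original program:   total = a + b
# Obfuscated version: values are stored encoded, decoded only around the add
a_enc, b_enc = enc(17), enc(25)
total_enc = enc(dec(a_enc) + dec(b_enc))

assert dec(total_enc) == 42
print(hex(a_enc), hex(b_enc), hex(total_enc))
```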

  4. Analysis of Biomechanical Structure and Passing Techniques in Basketball

    Directory of Open Access Journals (Sweden)

    Ricardo E. Izzo

    2011-06-01

    Full Text Available Basketball is a complex sport which these days has become increasingly linked to its psychophysical aspects rather than to the technical ones. It is therefore important to make a thorough study of passing techniques from the point of view of the type of pass and its biomechanics. In terms of the types of passes used, the most frequent is the two-handed chest pass, with a frequency of 39.9%. This is followed by one-handed passes (the baseball, with 20.9%), the two-handed over-the-head pass (18.2%), and finally one- or two-handed indirect passes (bounces), with 11.2% and 9.8%. Considering the most used pass in basketball from the biomechanical point of view, the muscles involved in the correct movement include all the muscles of the upper extremity, together with the shoulder muscles and the body fixators (abdominals, hip flexors, knee extensors, and dorsal flexors of the foot). The technical and conditional analysis considers the throwing speed, the throw height and the air resistance. In conclusion, the aim of this study is to give some guidelines to improve the mechanical execution of the movements in training, without neglecting the importance of the harmony of the movements themselves.

  5. Stratified source-sampling techniques for Monte Carlo eigenvalue analysis

    International Nuclear Information System (INIS)

    Mohamed, A.

    1998-01-01

    In 1995, at a conference on criticality safety, a special session was devoted to the Monte Carlo "Eigenvalue of the World" problem. Argonne presented a paper at that session in which the anomalies originally observed in that problem were reproduced in a much simplified model-problem configuration, and removed by a version of stratified source-sampling. In this paper, stratified source-sampling techniques are generalized and applied to three different Eigenvalue of the World configurations which take into account real-world statistical noise sources not included in the model problem, but which differ in the amount of neutronic coupling among the constituents of each configuration. It is concluded that, in Monte Carlo eigenvalue analysis of loosely coupled arrays, the use of stratified source-sampling reduces the probability of encountering an anomalous result relative to conventional source-sampling methods. However, this gain in reliability is substantially less than that observed in the model-problem results.
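
    To illustrate the difference between conventional and stratified source-sampling, here is a toy Python sketch. The region weights, function names, and the floor-plus-remainder allocation are assumptions for illustration, not the ANL implementation.

        import random
        from collections import Counter

        def conventional_sample(weights, n):
            """Draw n source sites independently; a weakly coupled region
            can by chance receive no sites at all."""
            regions = list(weights)
            total = sum(weights.values())
            probs = [weights[r] / total for r in regions]
            return [random.choices(regions, probs)[0] for _ in range(n)]

        def stratified_sample(weights, n):
            """Guarantee each region floor(n * w_r / W) sites, then distribute
            the remainder randomly; no region can be starved of source sites."""
            total = sum(weights.values())
            counts = {r: int(n * w / total) for r, w in weights.items()}
            remainder = n - sum(counts.values())
            probs = [w / total for w in weights.values()]
            for r in random.choices(list(weights), probs, k=remainder):
                counts[r] += 1
            return counts

        # Three loosely coupled units with unequal source importance:
        weights = {"unit_A": 0.90, "unit_B": 0.08, "unit_C": 0.02}
        print(Counter(conventional_sample(weights, 500)))  # counts fluctuate freely
        print(stratified_sample(weights, 500))             # proportional shares guaranteed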

  6. Elemental analysis of brazing alloy samples by neutron activation technique

    Energy Technology Data Exchange (ETDEWEB)

    Eissa, E A; Rofail, N B; Hassan, A M [Reactor and Neutron physics Department, Nuclear Research Centre, Atomic Energy Authority, Cairo (Egypt); El-Shershaby, A; Walley El-Dine, N [Physics Department, Faculty of Girls, Ain Shams Universty, Cairo (Egypt)

    1997-12-31

    Two brazing alloy samples (CP2 and CP3) have been investigated by the neutron activation analysis (NAA) technique in order to identify and estimate their constituent elements. The pneumatic irradiation rabbit system (PIRS), installed at the first Egyptian research reactor (ETRR-1), was used for short-time irradiation (30 s) with a thermal neutron flux of 1.6 × 10¹¹ n/cm²/s in the reactor reflector, where the thermal-to-epithermal neutron flux ratio is 106. Long-time irradiation (48 hours) was performed at the reactor core periphery with a thermal neutron flux of 3.34 × 10¹² n/cm²/s and a thermal-to-epithermal neutron flux ratio of 79. Activation by epithermal neutrons was taken into account for the (1/v) and resonance neutron absorption in both methods. A hyper-pure germanium detection system was used for gamma-ray acquisitions. The concentration values of Al, Cr, Fe, Co, Cu, Zn, Se, Ag and Sb were estimated as percentages of the sample weight and compared with reported values.

  7. Modelling and Analysis of a Collision Avoidance Protocol using SPIN and UPPAAL

    DEFF Research Database (Denmark)

    Skou, Arne; Larsen, Kim Guldstrand; Jensen, Henrik Ejersbo

    1997-01-01

    This paper compares the tools SPIN and UPPAAL by modelling and verifying a Collision Avoidance Protocol for an Ethernet-like medium. We find that SPIN is well suited for modelling the untimed aspects of the protocol processes and for expressing the relevant (untimed) properties. However, the modelling of the media becomes awkward due to the lack of broadcast communication in the PROMELA language. On the other hand, we find it easy to model the timed aspects using the UPPAAL tool. Especially, the notion of committed locations supports the modelling of broadcast communication. However...

  8. Replication protocol analysis: a method for the study of real-world design thinking

    DEFF Research Database (Denmark)

    Galle, Per; Kovacs, L. B.

    1996-01-01

    Given the brief of an architectural competition on site planning, and the design awarded the first prize, the first author (trained as an architect but not a participant in the competition) produced a line of reasoning that might have led from brief to design. In the paper, such 'design replication' is refined into a method called 'replication protocol analysis' (RPA) and discussed from a methodological perspective of design research. It is argued that for the study of real-world design thinking this method offers distinct advantages over traditional 'design protocol analysis', which seeks to capture...

  9. Analysis of quality control protocol implementation of equipment in radiotherapy services

    International Nuclear Information System (INIS)

    Calcina, Carmen S. Guzman; Lima, Luciana P. de; Rubo, Rodrigo A.; Ferraz, Eduardo; Almeida, Adelaide de

    2000-01-01

    Considering the importance of quality assurance in radiotherapy services, tests used for the quality control of cobalt units, linear accelerators and simulators were evaluated, classified and compared. The proposed work is a suggestion that can serve as a tool both for medical physicists who are starting to work in radiotherapy and for more experienced ones. The discussions were based on a compilation of local tests and official protocols, resulting in a suggested minimum protocol for routine work, emphasizing the periodicity and tolerance level of each of the tests. (author)

  10. Multivariate analysis of remote LIBS spectra using partial least squares, principal component analysis, and related techniques

    Energy Technology Data Exchange (ETDEWEB)

    Clegg, Samuel M [Los Alamos National Laboratory; Barefield, James E [Los Alamos National Laboratory; Wiens, Roger C [Los Alamos National Laboratory; Sklute, Elizabeth [MT HOLYOKE COLLEGE; Dyare, Melinda D [MT HOLYOKE COLLEGE

    2008-01-01

    Quantitative analysis with LIBS traditionally employs calibration curves that are complicated by chemical matrix effects. These chemical matrix effects influence the LIBS plasma and the ratio of elemental composition to elemental emission line intensity. Consequently, LIBS calibration typically requires a priori knowledge of the unknown, in order for a series of calibration standards similar to the unknown to be employed. In this paper, three new Multivariate Analysis (MVA) techniques are employed to analyze the LIBS spectra of 18 disparate igneous and highly metamorphosed rock samples. Partial Least Squares (PLS) analysis is used to generate a calibration model from which unknown samples can be analyzed. Principal Components Analysis (PCA) and Soft Independent Modeling of Class Analogy (SIMCA) are employed to generate a model and predict the rock type of the samples. These MVA techniques appear to exploit the matrix effects associated with the chemistries of these 18 samples.
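
    For readers who want to reproduce this kind of workflow, the sketch below runs PLS calibration and PCA projection on synthetic spectra with scikit-learn. The array shapes, the SiO2 target, and all parameter choices are illustrative assumptions, not the processing used by the authors.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(0)
        spectra = rng.random((18, 2048))    # 18 rock samples x 2048 spectral channels (synthetic)
        sio2_wt = rng.uniform(40, 75, 18)   # assumed target: SiO2 weight percent

        # PLS: regress composition on the full spectra, sidestepping univariate calibration curves
        pls = PLSRegression(n_components=5).fit(spectra, sio2_wt)
        predicted = pls.predict(spectra).ravel()

        # PCA: project the spectra onto a few components for rock-type classification
        scores = PCA(n_components=3).fit_transform(spectra)
        print(predicted[:3], scores.shape)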

  11. Analysis of the results of CAT of thorax with bronchiectasis protocol, period 2000-2001 Hospital Calderon Guardia

    International Nuclear Information System (INIS)

    Pacheco Segura, Maureen

    2003-01-01

    This investigation analyzes computerized axial tomography (CAT) of the thorax with a bronchiectasis protocol. It was carried out in the Servicio de Radiologia e Imagenes Medicas of the Hospital Calderon Guardia, Costa Rica. Bronchiectasis is the abnormal permanent dilation of the bronchial tree, and diagnosing it is important because the patient can suffer pulmonary infections, which can be accompanied by increased bronchial blood flow and hemoptysis. When disseminated, bronchiectasis can be associated with significant obstruction of the airways; when focal, it can be confused with neoplasia and other diseases. For the diagnosis of bronchiectasis, imaging methods such as thorax x-ray, bronchography and computerized axial tomography (CAT) of the thorax are used; usually the diagnosis is confirmed by means of CAT, which is the imaging modality of choice to establish the presence and extension of bronchiectasis. In addition, this study analyzes the radiological-clinical relation in the patients who underwent thorax CAT with the bronchiectasis protocol, and identifies the most suitable radiological technique to obtain a satisfactory result. [es]

  12. Taxonomy and Analysis of IP Micro-Mobility Protocols in Single and Simultaneous Movements Scenarios

    Directory of Open Access Journals (Sweden)

    G. De Marco

    2007-01-01

    Full Text Available Micro-mobility is an important aspect of mobile communications, where applications are anywhere and used anytime. One of the problems of micro-mobility is hand-off latency. In this paper, we analyse two solutions for IP micro-mobility by means of a general taxonomy. The first one is based on the Stream Control Transmission Protocol (SCTP), which allows the dynamic address configuration of an association. The second one is based on the Session Initiation Protocol (SIP), which is the most popular protocol for multimedia communications over IP networks. We show that for the SCTP solution there is room for further optimisation of the hand-off latency by adding slight changes to the protocol. However, as a full end-to-end solution, SCTP is not able to handle simultaneous movement of hosts, whose probability in general cannot be neglected. On the other hand, SIP can handle both single and simultaneous movement cases, although the hand-off latency can increase with respect to the SCTP solution. We show that for a correct and fast hand-off, the SIP server should be stateful.

  13. Timing Analysis of Rate Constrained Traffic for the TTEthernet Communication Protocol

    DEFF Research Database (Denmark)

    Tamas-Selicean, Domitian; Pop, Paul; Steiner, Wilfried

    2015-01-01

    Ethernet is a low-cost communication solution offering high transmission speeds. Although its applications extend beyond computer networking, Ethernet is not suitable for real-time and safety-critical systems. To alleviate this, several real-time Ethernet-based communication protocols have been...

  14. Feature-Driven Domain Analysis of Session Layer Protocols of Internet of Things

    NARCIS (Netherlands)

    Köksal, Omer; Tekinerdogan, B.

    2017-01-01

    The Internet of Things (IoT) architecture is defined as a layered structure in which each layer represents a coherent set of services. For supporting the communication among the different IoT entities many different communication protocols are now available in practice. For practitioners, it is

  15. Comparative Analysis of the Dark Ground Buffy Coat Technique (DG ...

    African Journals Online (AJOL)

    The prevalence of trypanosome infection in 65 cattle reared under an extensive system of management was determined using the dark ground buffy coat (DG) technique and the enzyme-linked immunosorbent assay (ELISA). The DG technique showed that there were 18 positive cases (27.69% of the total number of animals), made ...

  16. A novel preconcentration technique for the PIXE analysis of water

    Energy Technology Data Exchange (ETDEWEB)

    Savage, J.M. [Element Analysis Corp., Lexington, KY (United States); Fernandez, R.F. [Element Analysis Corp., Lexington, KY (United States); Zhang, W. [Department of Chemistry, University of Kentucky, Lexington, KY 40506-0055 (United States); Robertson, J.D. [Department of Chemistry, University of Kentucky, Lexington, KY 40506-0055 (United States); Majidi, V. [Department of Chemistry, University of Kentucky, Lexington, KY 40506-0055 (United States)

    1995-05-01

    The potential of using dried algae as a novel preconcentration technique for the analysis of water samples by PIXE was examined. The algae cells were found to contain significant levels of P and S, indicative of phosphorus- and sulfur-containing groups on the cell wall or inside the algae cells, which may serve as potential binding sites for metal ions. When C. vulgaris was used on mixed metal solutions, linear responses were observed for Ag⁺, Ba²⁺, and Cd²⁺ in the concentration range from 10 ng/g to 1 µg/g; for Cu²⁺ and Pb²⁺ from 10 ng/g to 5 µg/g; and for Hg²⁺ from 10 ng/g to 10 µg/g. When S. bacillaris was used, linear responses were observed from 10 ng/g up to 10 µg/g for all of the metal cations investigated. The PIXE results demonstrated that metal binding at low concentrations involves replacement of sodium on the cell wall and that at high concentrations magnesium was also replaced. Competitive binding studies indicate that the metal ions Ag⁺, Ba²⁺, Cd²⁺, Cu²⁺, and Pb²⁺ share common binding sites, with binding efficiencies varying in the sequence Pb²⁺ > Cu²⁺ > Ag⁺ > Cd²⁺ > Ba²⁺. The binding of Hg²⁺ involved a different binding site, with an increase in binding efficiency in the presence of Ag⁺. (orig.)

  17. Performance Analysis of the Enhanced DSR Routing Protocol for the Short Time Disconnected MANET to the OPNET Modeler

    Directory of Open Access Journals (Sweden)

    PAPAJ Ján

    2013-05-01

    Full Text Available Disconnected mobile ad-hoc networks (MANETs) are a very important area of research. In this article, a performance analysis of the enhanced dynamic source routing protocol (OPP_DSR) is introduced. This modification enables the routing process when there are no connections to other mobile nodes, and also when the routes selected by the routing mechanisms are disconnected for some time. Disconnections can be short-lived, and the standard DSR routing protocol cannot react to this situation. The main idea is based on opportunistic forwarding, where nodes not only forward data but also store it in a cache for a long time. The network parameters throughput and routing load are analysed.
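
    The opportunistic store-and-forward idea can be sketched in a few lines of Python. The class name, cache policy, and method structure below are assumptions made for illustration, not the OPP_DSR implementation.

        from collections import deque

        class OpportunisticNode:
            def __init__(self, node_id, cache_limit=100):
                self.node_id = node_id
                self.cache = deque(maxlen=cache_limit)  # packets held while no route exists

            def handle_packet(self, packet, route_available):
                if route_available:
                    return f"forwarded {packet}"
                self.cache.append(packet)               # store instead of dropping
                return f"cached {packet} ({len(self.cache)} pending)"

            def on_route_restored(self):
                """Flush cached packets once a route reappears after a short disconnection."""
                flushed = list(self.cache)
                self.cache.clear()
                return flushed

        node = OpportunisticNode("n1")
        print(node.handle_packet("pkt-1", route_available=False))
        print(node.on_route_restored())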

  18. What the drivers do and do not tell you: using verbal protocol analysis to investigate driver behaviour in emergency situations.

    Science.gov (United States)

    Banks, Victoria A; Stanton, Neville A; Harvey, Catherine

    2014-01-01

    Although task analysis of pedestrian detection can provide us with useful insights into how a driver may behave in emergency situations, the cognitive elements of driver decision-making are less well understood. To assist in the design of future Advanced Driver Assistance Systems, such as Autonomous Emergency Brake systems, it is essential that the cognitive elements of the driving task are better understood. This paper uses verbal protocol analysis in an exploratory fashion to uncover the thought processes underlying behavioural outcomes represented by hard data collected using the Southampton University Driving Simulator.

  19. Security Protocols in a Nutshell

    OpenAIRE

    Toorani, Mohsen

    2016-01-01

    Security protocols are building blocks in secure communications. They deploy some security mechanisms to provide certain security services. Security protocols are considered abstract when analyzed, but they can have extra vulnerabilities when implemented. This manuscript provides a holistic study on security protocols. It reviews foundations of security protocols, taxonomy of attacks on security protocols and their implementations, and different methods and models for security analysis of pro...

  1. Thermal/optical methods for elemental carbon quantification in soils and urban dusts: equivalence of different analysis protocols.

    Science.gov (United States)

    Han, Yongming; Chen, Antony; Cao, Junji; Fung, Kochy; Ho, Fai; Yan, Beizhan; Zhan, Changlin; Liu, Suixin; Wei, Chong; An, Zhisheng

    2013-01-01

    Quantifying elemental carbon (EC) content in geological samples is challenging due to interferences of crustal, salt, and organic material. Thermal/optical analysis, combined with acid pretreatment, represents a feasible approach. However, the consistency of various thermal/optical analysis protocols for this type of samples has never been examined. In this study, urban street dust and soil samples from Baoji, China were pretreated with acids and analyzed with four thermal/optical protocols to investigate how analytical conditions and optical correction affect EC measurement. The EC values measured with reflectance correction (ECR) were found always higher and less sensitive to temperature program than the EC values measured with transmittance correction (ECT). A high-temperature method with extended heating times (STN120) showed the highest ECT/ECR ratio (0.86) while a low-temperature protocol (IMPROVE-550), with heating time adjusted for sample loading, showed the lowest (0.53). STN ECT was higher than IMPROVE ECT, in contrast to results from aerosol samples. A higher peak inert-mode temperature and extended heating times can elevate ECT/ECR ratios for pretreated geological samples by promoting pyrolyzed organic carbon (PyOC) removal over EC under trace levels of oxygen. Considering that PyOC within filter increases ECR while decreases ECT from the actual EC levels, simultaneous ECR and ECT measurements would constrain the range of EC loading and provide information on method performance. Further testing with standard reference materials of common environmental matrices supports the findings. Char and soot fractions of EC can be further separated using the IMPROVE protocol. The char/soot ratio was lower in street dusts (2.2 on average) than in soils (5.2 on average), most likely reflecting motor vehicle emissions. The soot concentrations agreed with EC from CTO-375, a pure thermal method.

  2. Application status of on-line nuclear techniques in analysis of coal quality

    International Nuclear Information System (INIS)

    Cai Shaohui

    1993-01-01

    Nuclear techniques are well suited to continuous on-line analysis because they are fast and non-intrusive, and they can be used in the adverse circumstances of the coal industry. The paper reviews the application status of on-line nuclear techniques in the analysis of coal quality and the economic benefits derived from such techniques in developed countries.

  3. Rapid analysis of steels using laser-based techniques

    International Nuclear Information System (INIS)

    Cremers, D.A.; Archuleta, F.L.; Dilworth, H.C.

    1985-01-01

    Based on the data obtained by this study, we conclude that laser-based techniques can be used to provide at least semi-quantitative information about the elemental composition of molten steel. Of the two techniques investigated here, the Sample-Only method appears preferable to the LIBS (laser-induced breakdown spectroscopy) method because of its superior analytical performance. In addition, the Sample-Only method would probably be easier to incorporate into a steel plant environment. However, before either technique can be applied to steel monitoring, additional research is needed

  4. Implant loading protocols for edentulous patients with fixed prostheses: a systematic review and meta-analysis.

    Science.gov (United States)

    Papaspyridakos, Panos; Chen, Chun-Jung; Chuang, Sung-Kiang; Weber, Hans-Peter

    2014-01-01

    To report on the effect of immediate implant loading with fixed prostheses compared to early and conventional loading on implant and prosthesis survival, failure, and complications. An electronic and manual search was conducted to identify randomized controlled clinical trials (RCTs) as well as prospective and retrospective studies involving rough surface implants and implant fixed complete dental prostheses for edentulous patients. The 62 studies that fulfilled the inclusion criteria featured 4 RCTs, 2 prospective case-control studies, 34 prospective cohort studies, and 22 retrospective cohort studies. These studies yielded data from 2,695 patients (2,757 edentulous arches) with 13,653 implants. Studies were grouped according to the loading protocol applied; 45 studies reported on immediate loading, 8 on early loading, and 11 on conventional loading. For the immediate loading protocol with flap surgery, the implant and prosthesis survival rates ranged from 90.1% to 100% and 93.75% to 100%, respectively (range of follow-up, 1 to 10 years). When immediate loading was combined with guided flapless implant placement, the implant survival rates ranged from 90% to 99.4%. For the early loading protocol, the implant and prosthesis survival rates ranged from 94.74% to 100% and 93.75% to 100%, respectively (range of follow-up, 1 to 10 years). For the conventional loading protocol, the implant and prosthesis survival rates ranged from 94.95% to 100% and 87.5% to 100%, respectively (range of follow-up, 2 to 15 years). No difference was identified between maxilla and mandible. When selecting cases carefully and using dental implants with a rough surface, immediate loading with fixed prostheses in edentulous patients results in similar implant and prosthesis survival and failure rates as early and conventional loading. For immediate loading, most of the studies recommended a minimal insertion torque of 30 Ncm. The estimated 1-year implant survival was above 99% with all three

  5. Analysis of Piezoelectric Structural Sensors with Emergent Computing Techniques

    Science.gov (United States)

    Ramers, Douglas L.

    2005-01-01

    pressurizing the bottle on a test stand, and running sweeps of excitation frequencies for each of the piezo sensors and recording the resulting impedance. The sweeps were limited to 401 points by the available analyzer, and it was decided to perform individual sweeps at five different excitation frequency ranges. The frequency ranges used for the PZTs differed in two of the five ranges from those used for the SCP. The bottles were pressurized to empty (no water), 0 psig, 77 psig, 155 psig, and 227 psig, in nearly uniform increments of about 77 psi. One of each of the two types of piezo sensors was fastened onto the bottle surface at two locations: about midway between the ends on the cylindrical portion of the bottle, and at the very edge of one of the end domes. The data were collected in files by sensor type (2 cases), location (2 cases), frequency range (5 cases), and pressure (5 cases) to produce 100 data sets of 401 impedances. After familiarization with the piezo sensing technology and obtaining the data, the team developed a set of questions to try to answer regarding the data and made assignments of responsibilities. The next section lists the questions, and the remainder of the report describes the data analysis work performed by Dr. Ramers. This includes a discussion of the data, the approach to answering the questions using statistical techniques, the use of an emergent system to investigate the data where statistical techniques were not usable, conclusions regarding the data, and recommendations.

  6. Comparative analysis of data mining techniques for business data

    Science.gov (United States)

    Jamil, Jastini Mohd; Shaharanee, Izwan Nizal Mohd

    2014-12-01

    Data mining is the process of employing one or more computer learning techniques to automatically analyze and extract knowledge from data contained within a database. Companies are using this tool to further understand their customers, to design targeted sales and marketing campaigns, to predict what products customers will buy and the frequency of purchase, and to spot trends in customer preferences that can lead to new product development. In this paper, we take a systematic approach to exploring several data mining techniques in business applications. The experimental results reveal that all of the data mining techniques accomplish their goals, but each technique has its own characteristics and specifications that determine its accuracy, proficiency and preferred uses.
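
    A minimal sketch of the kind of side-by-side comparison described above, using scikit-learn on a synthetic dataset standing in for business data; the dataset, model choices and parameters are assumptions, not the authors' experiment.

        from sklearn.datasets import make_classification
        from sklearn.model_selection import cross_val_score
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.naive_bayes import GaussianNB
        from sklearn.neighbors import KNeighborsClassifier

        X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
        models = {
            "decision tree": DecisionTreeClassifier(random_state=0),
            "naive Bayes": GaussianNB(),
            "k-NN": KNeighborsClassifier(),
        }
        for name, model in models.items():
            acc = cross_val_score(model, X, y, cv=5).mean()
            print(f"{name}: {acc:.3f}")  # each technique shows its own accuracy profile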

  7. ANALYSIS OF RELATIONS BETWEEN JUDO TECHNIQUES AND SPECIFIC MOTOR ABILITIES

    Directory of Open Access Journals (Sweden)

    Patrik Drid

    2006-06-01

    Full Text Available Specific physical preparation affects the development of the motor abilities required for the execution of specific movements in judo. When selecting proper judo-specific exercises for a target motor ability, it is necessary first to study the structure of specific judo techniques and the activity of the individual muscle groups engaged in executing each technique. On this basis, one can understand which muscles are most engaged during the realization of individual techniques, which serves as a starting point for selecting the particular complex of specific exercises that produces the highest effects. In addition to developing particular muscle groups, the means of specific preparation affect the development of those motor abilities judged indispensable for developing the particular qualities characteristic of judo. This paper analyses the relationship between judo techniques and specific motor abilities.

  8. A Comparative Analysis of Machine Learning Techniques for Credit Scoring

    OpenAIRE

    Nwulu, Nnamdi; Oroja, Shola; İlkan, Mustafa

    2012-01-01

    Abstract Credit scoring has become an oft-researched topic in light of the increasing volatility of the global economy and the recent world financial crisis. Amidst the many methods used for credit scoring, machine learning techniques are becoming increasingly popular due to their efficient and accurate nature and relative simplicity. Furthermore, machine learning techniques minimize the risk of human bias and error and maximize speed, as they are able to perform computation...

  9. Impact of the Kyoto Protocol on the Iberian Electricity Market: A scenario analysis

    International Nuclear Information System (INIS)

    Reneses, Javier; Centeno, Efraim

    2008-01-01

    This paper presents an assessment of the impact of the Kyoto Protocol on the Iberian Electricity Market during two periods: the first phase (2005-2007) and the second phase (2008-2012). A market-equilibrium model is used to analyze the different conditions faced by generation companies. Scenarios involving CO2-emission prices, hydro conditions, demand, fuel prices and renewable generation are considered. This valuation shows the significance of CO2-emission prices as regards Spanish and Portuguese electricity prices, the generation mix, utilities' profits and total CO2 emissions. Furthermore, the results illustrate how the energy policies implemented by regulators are critical for Spain and Portugal in order to mitigate the negative impact of the Kyoto Protocol. In conclusion, the Iberian electricity system will not be able to reach the Kyoto targets except under very favorable conditions (CO2-emission prices over €15/ton and the implementation of very efficient energy policies).

  10. An Analysis of Error Reconciliation Protocols for use in Quantum Key Distribution

    Science.gov (United States)

    2012-02-01

    ...of the messages passed, and that the time to prepare or separate the message information is negligible. Finally, for this experiment all errors...of interactions becomes negligible. In fact, of the three protocols, experiments performed here have shown that Winnow produces the highest average...

  11. Design and Analysis of Secure Routing Protocol for Wireless Sensor Networks

    Science.gov (United States)

    Wang, Jiong; Zhang, Hua

    2017-09-01

    In recent years, with the development of science and technology, China's wireless network technology has become increasingly widespread, and it plays an important role in social production and life. In this context, in order to further enhance the stability and security of wireless network data transmission, attention must be paid to routing security. On this basis, this paper analyzes the design of wireless sensor networks based on a secure routing protocol.

  12. Design and Analysis of a secure multi-party communication protocol

    OpenAIRE

    Herberth, Klaus

    2016-01-01

    In the past years digital communication has become an important aspect of everyday life. Everything is shared and discussed in groups of friends, family or business partners without a proper way to protect that information. This master thesis introduces the first secure robust multi-party communication protocol which mimics a physical conversation with the help of a Diffie-Hellman key tree and social behaviours. Robustness against offline group members is reached by taking advantage of trans...
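
    The basic Diffie-Hellman step on which such a key tree is built can be sketched in a few lines. This is a toy with deliberately simple parameters (a Mersenne prime modulus, base 3), not the thesis's protocol; a real deployment would use vetted groups and authenticated exchanges.

        import secrets

        P = 2**127 - 1   # a Mersenne prime; toy modulus, not a vetted DH group
        G = 3

        # Leaf secrets of two group members
        a = secrets.randbelow(P - 2) + 1
        b = secrets.randbelow(P - 2) + 1
        pub_a, pub_b = pow(G, a, P), pow(G, b, P)

        # Each member combines its own secret with the other's public value;
        # the shared value can seed the parent node of a key tree.
        shared_a = pow(pub_b, a, P)
        shared_b = pow(pub_a, b, P)
        assert shared_a == shared_b  # both derive the same parent node key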

  13. Performance Analysis of Routing Protocols in Ad-hoc and Sensor Networking Environments

    Directory of Open Access Journals (Sweden)

    L. Gavrilovska

    2009-06-01

    Full Text Available Ad-hoc and sensor networks have lately become increasingly popular wireless networking concepts. This paper analyzes and compares prominent routing schemes in these networking environments. The knowledge obtained can help users better understand short-range wireless network solutions, leading to options for implementation in various scenarios. In addition, it should aid researchers in developing reliable protocol improvements for the technologies of interest.

  14. MUMAL: Multivariate analysis in shotgun proteomics using machine learning techniques

    Directory of Open Access Journals (Sweden)

    Cerqueira Fabio R

    2012-10-01

    Full Text Available Abstract Background The shotgun strategy (liquid chromatography coupled with tandem mass spectrometry is widely applied for identification of proteins in complex mixtures. This method gives rise to thousands of spectra in a single run, which are interpreted by computational tools. Such tools normally use a protein database from which peptide sequences are extracted for matching with experimentally derived mass spectral data. After the database search, the correctness of obtained peptide-spectrum matches (PSMs needs to be evaluated also by algorithms, as a manual curation of these huge datasets would be impractical. The target-decoy database strategy is largely used to perform spectrum evaluation. Nonetheless, this method has been applied without considering sensitivity, i.e., only error estimation is taken into account. A recently proposed method termed MUDE treats the target-decoy analysis as an optimization problem, where sensitivity is maximized. This method demonstrates a significant increase in the retrieved number of PSMs for a fixed error rate. However, the MUDE model is constructed in such a way that linear decision boundaries are established to separate correct from incorrect PSMs. Besides, the described heuristic for solving the optimization problem has to be executed many times to achieve a significant augmentation in sensitivity. Results Here, we propose a new method, termed MUMAL, for PSM assessment that is based on machine learning techniques. Our method can establish nonlinear decision boundaries, leading to a higher chance to retrieve more true positives. Furthermore, we need few iterations to achieve high sensitivities, strikingly shortening the running time of the whole process. Experiments show that our method achieves a considerably higher number of PSMs compared with standard tools such as MUDE, PeptideProphet, and typical target-decoy approaches. Conclusion Our approach not only enhances the computational performance, and
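
    To make the contrast between linear and nonlinear decision boundaries concrete, here is a small scikit-learn sketch on synthetic two-dimensional data standing in for PSM score features. It is not MUMAL's code; the dataset, network size and all parameters are assumptions for illustration.

        from sklearn.datasets import make_moons
        from sklearn.neural_network import MLPClassifier
        from sklearn.model_selection import train_test_split

        # Two interleaved classes emulate score distributions that no straight
        # line separates well, the case where linear boundaries fall short.
        X, y = make_moons(n_samples=2000, noise=0.25, random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

        clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000,
                            random_state=0).fit(X_tr, y_tr)
        # A nonlinear boundary gives a higher chance of recovering true positives
        print(f"held-out accuracy: {clf.score(X_te, y_te):.3f}")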

  15. Investigation of the Study Characteristics Affecting Clinical Trial Quality Using the Protocol Deviations Leading to Exclusion of Subjects From the Per Protocol Set Data in Studies for New Drug Application: A Retrospective Analysis.

    Science.gov (United States)

    Kohara, Norihito; Kaneko, Masayuki; Narukawa, Mamoru

    2018-01-01

    The concept of the risk-based approach has been introduced as an effort to secure the quality of clinical trials. In the risk-based approach, the identification and evaluation of risk in advance are considered important. For recently completed clinical trials, we investigated the relationship between study characteristics and protocol deviations leading to the exclusion of subjects from the Per Protocol Set (PPS) efficacy analysis. New drugs approved in Japan in the fiscal years 2014-2015 were targeted in the research. The reasons for excluding subjects from the PPS efficacy analysis were described in 102 trials out of 492 in the summary of new drug application documents, which is publicly disclosed after a drug's regulatory approval. The author extracted these reasons along with the numbers of cases and the study characteristics of each clinical trial. Then, direct comparison, univariate regression analysis, and multivariate regression analysis were carried out based on the exclusion rate. The study characteristics for which exclusion of subjects from the PPS efficacy analysis was frequently observed were: multiregional clinical trials (study region); inhalant and external use (administration route); and Anti-infectives for systemic use, Respiratory system, Dermatologicals, and Nervous system (therapeutic drug under the Anatomical Therapeutic Chemical Classification). In the multivariate regression analysis, the clinical trial variables of inhalant, Respiratory system, or Dermatologicals were selected as study characteristics leading to a higher exclusion rate. Characteristics of clinical trials likely to cause protocol deviations affecting the efficacy analysis were thus suggested. These studies should be considered for specific attention and priority observation in the trial protocol or its monitoring plan and execution, such as a clear description of inclusion/exclusion criteria in the protocol, development of training materials for site staff, and

  16. Analysis of energy efficient routing protocols for implementation of a ubiquitous health system

    Science.gov (United States)

    Kwon, Jongwon; Park, Yongman; Koo, Sangjun; Ayurzana, Odgeral; Kim, Hiesik

    2007-12-01

    The innovative Ubiquitous-Health (U-Health) concept was born through the convergence of medical services with up-to-date information technologies and ubiquitous IT. U-Health can be applied in a variety of special situations to manage the functions of each medical center efficiently. This paper focuses on the evaluation of various routing protocols for the implementation of a U-Health monitoring system. In order to facilitate wireless communication over the network, a routing protocol on the network layer is used to establish a precise and efficient route between sensor nodes so that information acquired from sensors may be delivered in a timely manner. Route establishment should minimize overhead, data loss and power consumption, because wireless networks for U-Health are organized by a large number of sensor nodes which are small in size and have limited processing power, memory and battery life. This paper gives an overview of commonly known wireless sensor network technologies and evaluates three multi-hop routing protocols, namely flooding, gossiping and a modified low-energy adaptive clustering hierarchy (LEACH), using the TOSSIM simulator. As a result of the evaluation, an integrated wireless sensor board was developed. The board is an embedded device based on the AVR128, porting TinyOS. It also employs a biosensor that measures blood pressure and pulse frequency, and a ZigBee module for wireless communication. This paper accelerates the digital convergence age through continual research and development of U-Health-related technologies.
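
    For orientation, LEACH's standard cluster-head election rule from the LEACH literature can be sketched as below; the paper's modified variant is not reproduced here, and the parameter values are illustrative.

        import random

        def leach_threshold(p, r, was_head_recently):
            """Election threshold T(n) for node n in round r, with desired
            cluster-head fraction p; nodes that served recently sit out."""
            if was_head_recently:
                return 0.0
            return p / (1 - p * (r % round(1 / p)))

        p, r = 0.05, 7  # 5% of nodes as cluster heads, round 7 (assumed values)
        node_becomes_head = random.random() < leach_threshold(p, r, was_head_recently=False)
        print(node_becomes_head)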

  17. Development of a protocol for sampling and analysis of ballast water in Jamaica

    Directory of Open Access Journals (Sweden)

    Achsah A Mitchell

    2014-09-01

    Full Text Available The transfer of ballast water by the international shipping industry has negatively impacted the environment. To design a ballast water sampling and analysis protocol for the area, the ballast water tanks of seven bulk cargo vessels entering a Jamaican port were sampled between January 28, 2010 and August 17, 2010. Vessels originated from five ports and used three main routes, some of which conducted ballast water exchange. Twenty-six preserved and 22 live replicate zooplankton samples were obtained. Abundance and richness were higher than at temperate ports. Exchange did not alter the biotic composition but reduced the abundance. Two of the live sample replicates, containing 31.67 and 16.75 viable individuals m⁻³, were non-compliant with the International Convention for the Control and Management of Ships' Ballast Water and Sediments. Approximately 12% of the species identified in the ballast water were present in the waters nearest the port in 1995, and 11% were present in the entire bay in 2005. The protocol designed in this study can be used to aid the establishment of a ballast water management system in the Caribbean, or as a foundation for the development of further protocols.

  18. Design and Analysis of Optimization Algorithms to Minimize Cryptographic Processing in BGP Security Protocols.

    Science.gov (United States)

    Sriram, Vinay K; Montgomery, Doug

    2017-07-01

    The Internet is subject to attacks due to vulnerabilities in its routing protocols. One proposed approach to attain greater security is to cryptographically protect network reachability announcements exchanged between Border Gateway Protocol (BGP) routers. This study proposes and evaluates the performance and efficiency of various optimization algorithms for validation of digitally signed BGP updates. In particular, this investigation focuses on the BGPSEC (BGP with SECurity extensions) protocol, currently under consideration for standardization in the Internet Engineering Task Force. We analyze three basic BGPSEC update processing algorithms: Unoptimized, Cache Common Segments (CCS) optimization, and Best Path Only (BPO) optimization. We further propose and study cache management schemes to be used in conjunction with the CCS and BPO algorithms. The performance metrics used in the analyses are: (1) routing table convergence time after BGPSEC peering reset or router reboot events and (2) peak-second signature verification workload. Both analytical modeling and detailed trace-driven simulation were performed. Results show that the BPO algorithm is 330% to 628% faster than the unoptimized algorithm for routing table convergence in a typical Internet core-facing provider edge router.
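
    The caching idea behind the CCS optimization can be illustrated with a short sketch: memoize the expensive signature verification, keyed by the signed segment bytes, so a path segment shared by many updates is verified only once. The class name, hashing key and stand-in verification callback are our assumptions, not the BGPSEC specification's data structures.

        import hashlib

        class SegmentVerificationCache:
            def __init__(self):
                self._verified = {}

            def verify(self, segment_bytes, crypto_verify):
                """Return the cached result for a previously seen signed segment;
                otherwise run the expensive signature check once and remember it."""
                key = hashlib.sha256(segment_bytes).digest()
                if key not in self._verified:
                    self._verified[key] = crypto_verify(segment_bytes)
                return self._verified[key]

        cache = SegmentVerificationCache()
        fake_check = lambda seg: True  # stand-in for an actual ECDSA verification
        print(cache.verify(b"AS65001->AS65002|sig", fake_check))  # computed once
        print(cache.verify(b"AS65001->AS65002|sig", fake_check))  # cache hit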

  19. Analysis of the new code stroke protocol in Asturias after one year. Experience at one hospital.

    Science.gov (United States)

    García-Cabo, C; Benavente, L; Martínez-Ramos, J; Pérez-Álvarez, Á; Trigo, A; Calleja, S

    2018-03-01

    Prehospital code stroke (CS) systems have proven effective in improving access to specialised medical care in acute stroke cases. They also improve the prognosis of this disease, which is one of the leading causes of death and disability in our setting. The aim of this study is to analyse the results one year after implementation of the new code stroke protocol at one hospital in Asturias. We prospectively included patients admitted to our tertiary care centre under the code stroke protocol over a period of one year. We analysed 363 patients. Mean age was 69 years, and 54% of the cases were men. During the same period in the previous year, there were 236 non-hospital CS activations. One hundred forty-seven recanalisation treatments were performed (66 fibrinolyses and 81 mechanical thrombectomies or combined treatments), representing a 25% increase with respect to the previous year. Recent advances in the management of acute stroke call for coordinated code stroke protocols adapted to the needs of each specific region. This may result in an increased number of patients receiving early care as well as revascularisation treatments.

  20. Evaluation of nuclear reactor based activation analysis techniques

    International Nuclear Information System (INIS)

    Obrusnik, I.; Kucera, J.

    1977-09-01

    A survey is presented of the basic types of activation analysis applied in environmental control. Reactor neutron activation analysis is described, including the reactor as a neutron source, sample activation in the reactor, the methodology of neutron activation analysis, sample transport into the reactor and sample packaging after irradiation, instrumental activation analysis with radiochemical separation, data measurement and evaluation, and sampling and sample preparation. Sources of environmental contamination with trace elements, sampling, and sample analysis by neutron activation are described, covering the analysis of soils, waters and biological materials. Methods of evaluating neutron activation analysis results and of interpreting them for purposes of environmental control are shown. (J.B.)

  1. Dose-modeling study to compare external beam techniques from protocol NSABP B-39/RTOG 0413 for patients with highly unfavorable cardiac anatomy

    International Nuclear Information System (INIS)

    Hiatt, Jessica R.; Evans, Suzanne B.; Price, Lori Lyn; Cardarelli, Gene A.; Di Petrillo, Thomas A.; Wazer, David E.

    2006-01-01

    Purpose: The aim of this study was to select patients with heart anatomy that is specifically unfavorable for tangential irradiation in whole-breast radiotherapy (WBRT), to be used as an experimental cohort to compare cardiac dosimetric and radiobiological parameters of three-dimensional conformal external beam accelerated partial breast irradiation (3D-CRT APBI) to WBRT with techniques as defined by the National Surgical Adjuvant Breast and Bowel Project (NSABP) B-39/Radiation Therapy Oncology Group (RTOG) 0413 clinical trial. Methods and Materials: A dosimetric modeling study that compared WBRT and 3D-CRT APBI was performed on CT planning data from 8 patients with left-sided breast cancer. Highly unfavorable cardiac anatomy was defined by the measured contact of the myocardium with the anterior chest wall in the axial and para-sagittal planes. Treatment plans of WBRT and 3D-CRT APBI were generated for each patient in accordance with the NSABP B-39/RTOG 0413 protocol. Dose-volume relationships of the heart, including the V5min (minimum dose delivered to 5% of the cardiac volume), the biological effective dose (BED) of the V5min, and the normal tissue complication probability (NTCP), were analyzed and compared. Results: Despite expected anatomic variation, significantly large differences favoring 3D-CRT APBI were found in the cumulative dose-volume histograms, in the V5min (mean difference, 24.53 Gy) and in the BED of the V5min (85%, p < 0.01). Conclusions: Use of 3D-CRT APBI can demonstrate improved sparing of the heart in select patients with highly unfavorable cardiac anatomy for WBRT, and may result in reduced risk of cardiac morbidity and mortality.

  2. Comparative analysis of five DNA isolation protocols and three drying methods for leaves samples of Nectandra megapotamica (Spreng. Mez

    Directory of Open Access Journals (Sweden)

    Leonardo Severo da Costa

    2016-06-01

    Full Text Available The aim of this study was to establish a DNA isolation protocol for Nectandra megapotamica (Spreng.) Mez able to obtain samples of high yield and quality for use in genomic analysis. A commercial kit and four classical methods of DNA extraction were tested, including three cetyltrimethylammonium bromide (CTAB)-based methods and one sodium dodecyl sulfate (SDS)-based method. Three drying methods for leaf samples were also evaluated: drying at room temperature (RT), in an oven at 40ºC (S40), and in a microwave oven (FMO). The DNA solutions obtained from the different types of leaf samples using the five protocols were assessed in terms of cost, execution time, and quality and yield of extracted DNA. The commercial kit did not extract DNA of sufficient quantity or quality for successful PCR reactions. Among the classical methods, only the protocols of Dellaporta and of Khanuja yielded DNA extractions for all three types of foliar samples that resulted in successful PCR reactions and subsequent enzyme restriction assays. Based on the evaluated variables, the most appropriate DNA extraction method for Nectandra megapotamica (Spreng.) Mez was that of Dellaporta, regardless of the method used to dry the samples. The selected method has a relatively low cost and total execution time. Moreover, the quality and quantity of DNA extracted using this method was sufficient for DNA sequence amplification using PCR reactions and for obtaining restriction fragments.

  3. Bioremediation protocols

    National Research Council Canada - National Science Library

    Sheehan, David

    1997-01-01

    Contents (excerpt): 2. Granular Sludge Consortia for Bioremediation (Nina Christiansen, Indra M. Mathrani, and Birgitte K. Ahring), p. 23; Part II: Protocols ...

  4. Testing Behavior Change Techniques to Encourage Primary Care Physicians to Access Cancer Screening Audit and Feedback Reports: Protocol for a Factorial Randomized Experiment of Email Content.

    Science.gov (United States)

    Vaisson, Gratianne; Witteman, Holly O; Bouck, Zachary; Bravo, Caroline A; Desveaux, Laura; Llovet, Diego; Presseau, Justin; Saragosa, Marianne; Taljaard, Monica; Umar, Shama; Grimshaw, Jeremy M; Tinmouth, Jill; Ivers, Noah M

    2018-02-16

    how to communicate effectively with primary care providers by email and identify which behavior change techniques tested are most effective at encouraging engagement with an audit and feedback report. ClinicalTrials.gov NCT03124316; https://clinicaltrials.gov/ct2/show/NCT03124316 (Archived by WebCite at http://www.webcitation.org/6w2MqDWGu).

  5. Radon remedial techniques in buildings - analysis of French actual cases

    International Nuclear Information System (INIS)

    Dupuis, M.

    2004-01-01

    The IRSN has compiled a collection of solutions from data provided by the various decentralised government services in 31 French departments. Contributors were asked to provide a description of the building, as well as details of measured radon levels, the type of reduction technique adopted and the cost. Illustrative layouts, technical drawings and photographs were also requested when available. Of the cases recorded, 85% are establishments open to the public (schools (70%), city halls (4%) and combined city halls and schoolhouses (26%)), 11% are houses and 4% industrial buildings. IRSN obtained 27 real cases of remedial techniques in use. The data were presented in the form of fact sheets. The primary aim of this exercise was to illustrate each of the radon reduction techniques that can be used in the different building types (with basement, ground-bearing slab, or crawl space). This investigation not only enabled us to show that combining passive and active techniques reduces the operating cost of the installation, but above all that it considerably improves the efficiency. A passive technique reduces the amount of radon in the building and thus reduces the necessary ventilation rate, which directly affects the cost of operating the installation. For the 27 cases recorded, we noted: (a) the application of 7 passive techniques: sealing of floors and semi-buried walls, together with improved aeration by installing ventilation openings or ventilation strips in the windows; radon concentrations were reduced on average by a factor of 4.7, and no measurement in excess of 400 Bq·m⁻³ (the limit recommended by the French public authorities) was obtained following completion of the works; (b) the application of 15 active techniques: depressurization of the underlying ground, crawl space or basement and/or pressurization of the building; radon concentrations were reduced on average by a factor of 13.8, and concentrations over 400 Bq·m⁻³ were measured in only 4 cases.

  6. Improved streaming analysis technique: spherical harmonics expansion of albedo data

    International Nuclear Information System (INIS)

    Albert, T.E.; Simmons, G.L.

    1979-01-01

    An improved albedo scattering technique was implemented with a three-dimensional Monte Carlo transport code for use in analyzing radiation streaming problems. The improvement was based on a shifted spherical Harmonics expansion of the doubly differential albedo data base. The result of the improvement was a factor of 3 to 10 reduction in data storage requirements and approximately a factor of 3 to 6 increase in computational speed. Comparisons of results obtained using the technique with measurements are shown for neutron streaming in one- and two-legged square concrete ducts
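
    As background, a generic (unshifted) spherical-harmonics expansion of a doubly differential albedo has the form below; the paper's "shifted" variant and its variable conventions are not reproduced here, so this is only an orienting sketch:

        \alpha(E_0,\mu_0;\,E,\mu,\varphi) \;\approx\; \sum_{l=0}^{L} \sum_{m=-l}^{l} a_{lm}(E_0,\mu_0,E)\, Y_{lm}(\mu,\varphi)

    Truncating the sum at a low order L is what yields the kind of storage reduction the abstract reports, since only the coefficients a_lm need to be tabulated rather than the full doubly differential data base.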

  7. Undesirable effects of covariance matrix techniques for error analysis

    International Nuclear Information System (INIS)

    Seibert, D.

    1994-01-01

    Regression with χ² constructed from covariance matrices should not be used for some combinations of covariance matrices and fitting functions. Using the technique for unsuitable combinations can amplify systematic errors. This amplification is uncontrolled, and can produce arbitrarily inaccurate results that might not be ruled out by a χ² test. In addition, this technique can give incorrect (artificially small) errors for fit parameters. I give a test for this instability and a more robust (but computationally more intensive) method for fitting correlated data.
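
    The construction the abstract refers to can be written out concretely. The following numpy sketch (synthetic data and a one-parameter linear model, both assumed for illustration) builds χ² = rᵀC⁻¹r from a covariance matrix and scans for its minimum.

        import numpy as np

        y = np.array([1.02, 1.98, 3.05])            # correlated measurements (synthetic)
        C = np.array([[0.04, 0.02, 0.00],
                      [0.02, 0.04, 0.02],
                      [0.00, 0.02, 0.04]])          # covariance matrix of y
        x = np.array([1.0, 2.0, 3.0])

        def chi2(theta):
            r = y - theta * x                        # residuals for the model f(x) = theta * x
            return r @ np.linalg.solve(C, r)         # r^T C^{-1} r without forming C^{-1}

        # Scan theta; with unsuitable covariance/model combinations this minimum
        # can be pulled away from the truth (the instability discussed above).
        thetas = np.linspace(0.8, 1.2, 401)
        best = thetas[np.argmin([chi2(t) for t in thetas])]
        print(best)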

  8. Current trends in nuclear borehole logging techniques for elemental analysis

    International Nuclear Information System (INIS)

    1988-06-01

    This report is the result of a consultants' meeting organized by the IAEA and held in Ottawa, Canada, 2-6 November 1987, to assess the present technical status of nuclear borehole logging techniques, to identify the well-established applications, and to determine development trends. It contains a summary report giving a comprehensive overview of the techniques and applications, and a collection of research papers describing work done in industrial institutes. A separate abstract was prepared for each of these 9 papers. Refs, figs and tabs

  9. A review on applications of the wavelet transform techniques in spectral analysis

    International Nuclear Information System (INIS)

    Medhat, M.E.; Albdel-hafiez, A.; Hassan, M.F.; Ali, M.A.; Awaad, Z.

    2004-01-01

    Starting in 1989, a new technique known as the wavelet transform (WT) has been applied successfully to the analysis of different types of spectra. The WT offers certain advantages over the Fourier transform for the analysis of signals. A review of the use of this technique in different fields of elemental analysis is presented.
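
    A small PyWavelets example of the decomposition step typically used in wavelet-based spectral analysis; the synthetic peak, wavelet family, level and threshold are generic choices, not tied to the review's specific spectra.

        import numpy as np
        import pywt

        t = np.linspace(0, 1, 1024)
        # A narrow spectral peak buried in noise (synthetic)
        spectrum = (np.exp(-((t - 0.3) / 0.01) ** 2)
                    + 0.05 * np.random.default_rng(0).standard_normal(t.size))

        # Multilevel discrete wavelet transform: approximation + detail coefficients
        coeffs = pywt.wavedec(spectrum, "db4", level=4)
        # Denoise by soft-thresholding the detail coefficients, then reconstruct
        coeffs = [coeffs[0]] + [pywt.threshold(c, value=0.1, mode="soft")
                                for c in coeffs[1:]]
        denoised = pywt.waverec(coeffs, "db4")
        print(denoised.shape)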

  10. Determining the Number of Factors in P-Technique Factor Analysis

    Science.gov (United States)

    Lo, Lawrence L.; Molenaar, Peter C. M.; Rovine, Michael

    2017-01-01

    Determining the number of factors is a critical first step in exploratory factor analysis. Although various criteria and methods for determining the number of factors have been evaluated in the usual between-subjects R-technique factor analysis, there is still the question of how these methods perform in within-subjects P-technique factor analysis. A…
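
    For readers unfamiliar with the P-technique setup, the sketch below factor-analyzes one subject's multivariate time series (occasions as rows, variables as columns) with scikit-learn; the synthetic data, factor counts and use of the model log-likelihood as a selection heuristic are assumptions for illustration, not the article's method.

        import numpy as np
        from sklearn.decomposition import FactorAnalysis

        rng = np.random.default_rng(0)
        n_occasions, n_vars, n_factors = 200, 8, 2

        # Latent daily-state factors drive the observed variables over time
        latent = rng.standard_normal((n_occasions, n_factors))
        loadings = rng.standard_normal((n_factors, n_vars))
        observed = latent @ loadings + 0.5 * rng.standard_normal((n_occasions, n_vars))

        # Fit candidate models; a scree-like flattening of the average
        # log-likelihood suggests the number of factors to retain
        for k in range(1, 5):
            fa = FactorAnalysis(n_components=k).fit(observed)
            print(k, round(fa.score(observed), 3))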

  11. TECHNIQUE OF THE STATISTICAL ANALYSIS OF INVESTMENT APPEAL OF THE REGION

    Directory of Open Access Journals (Sweden)

    А. А. Vershinina

    2014-01-01

    Full Text Available This scientific article presents a technique for the statistical analysis of the investment appeal of a region for foreign direct investment. A definition of the statistical analysis technique is given, the stages of the analysis are described, and the mathematical-statistical tools are considered.

  12. Accuracy of molecular biology techniques for the diagnosis of Strongyloides stercoralis infection-A systematic review and meta-analysis.

    Science.gov (United States)

    Buonfrate, Dora; Requena-Mendez, Ana; Angheben, Andrea; Cinquini, Michela; Cruciani, Mario; Fittipaldo, Andrea; Giorli, Giovanni; Gobbi, Federico; Piubelli, Chiara; Bisoffi, Zeno

    2018-02-01

    Strongyloides stercoralis infection is a neglected tropical disease which can lead to severe symptoms and even death in immunosuppressed people. Unfortunately, its diagnosis is hampered by the lack of a gold standard, as the sensitivity of traditional parasitological tests (including microscopic examination of stool samples and coproculture) is low. Hence, alternative diagnostic methods, such as molecular biology techniques (mostly polymerase chain reaction, PCR), have been implemented. However, there are discrepancies in the reported accuracy of PCR. A systematic review with meta-analysis was conducted in order to evaluate the accuracy of PCR for the diagnosis of S. stercoralis infection. The protocol was registered with the PROSPERO International Prospective Register of Systematic Reviews (record: CRD42016054298). Fourteen studies, 12 of which evaluated real-time PCR, were included in the analysis. The specificity of the techniques was high (ranging from 93 to 95%, according to the reference test(s) used). When all molecular techniques were compared to parasitological methods, the sensitivity of PCR was assessed at 71.8% (95% CI 52.2-85.5), which decreased to 61.8% (95% CI 42.0-78.4) when serology was added among the reference tests. Similarly, the sensitivity of real-time PCR was 64.4% (95% CI 46.2-77.7) when compared to parasitological methods only, and 56.5% (95% CI 39.2-72.4) including serology. PCR might not be suitable for screening purposes, whereas it might have a role as a confirmatory test.

  13. Research review and development trends of human reliability analysis techniques

    International Nuclear Information System (INIS)

    Li Pengcheng; Chen Guohua; Zhang Li; Dai Licao

    2011-01-01

    Human reliability analysis (HRA) methods are reviewed. The theoretical basis of human reliability analysis, human error mechanisms, the key elements of HRA methods, and the existing HRA methods are introduced and assessed. Their shortcomings, current research hotspots and difficult problems are identified. Finally, the trends of human reliability analysis methods are examined. (authors)

  14. Spacecraft Electrical Power System (EPS) generic analysis tools and techniques

    Science.gov (United States)

    Morris, Gladys M.; Sheppard, Mark A.

    1992-01-01

    An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described, and the EPS analysis tools are surveyed.

  15. Experimental Analysis of Temperature Differences During Implant Site Preparation: Continuous Drilling Technique Versus Intermittent Drilling Technique.

    Science.gov (United States)

    Di Fiore, Adolfo; Sivolella, Stefano; Stocco, Elena; Favero, Vittorio; Stellini, Edoardo

    2018-02-01

    Implant site preparation through drilling procedures may cause bone thermonecrosis. The aim of this in vitro study was to evaluate, using a thermal probe, overheating at implant sites during osteotomies performed with 2 different drilling methods (continuous drilling technique versus intermittent drilling technique) and with irrigation at different temperatures. Five implant sites 13 mm in length were prepared in each of 16 blocks (fresh bovine ribs), for a total of 80 implant sites. The PT-100 thermal probe was positioned 5 mm from each site. Two physiological refrigerant solutions were used: one at 23.7°C and one at 6.0°C. Four experimental groups were considered: group A (continuous drilling with physiological solution at 23.7°C), group B (intermittent drilling with physiological solution at 23.7°C), group C (continuous drilling with physiological solution at 6.0°C), and group D (intermittent drilling with physiological solution at 6.0°C). The Wilcoxon rank-sum test (2-tailed) was used to compare groups. While there was no difference between group A and group B (W = 86; P = .45), statistically significant differences were observed between experimental groups A and C (W = 0; P = .0001), B and D (W = 45; P = .0005), and C and D (W = 41; P = .003). The drilling technique did not affect the overheating of the bone, whereas statistically significant differences were found between the refrigerant solutions. With both irrigating solutions, bone temperature did not exceed 47°C.
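
    The group comparison reported above is a standard two-sided Wilcoxon rank-sum test; a minimal sketch using SciPy follows. The temperature readings are invented stand-ins, not the study's measurements.

      # Sketch: two-sided Wilcoxon rank-sum test between two drilling groups.
      from scipy.stats import ranksums

      group_a = [36.2, 37.1, 35.8, 36.9, 37.4]  # e.g., coolant at 23.7°C
      group_c = [31.0, 30.4, 31.8, 30.9, 31.3]  # e.g., coolant at 6.0°C

      stat, p_value = ranksums(group_a, group_c, alternative='two-sided')
      print(f"statistic = {stat:.2f}, p = {p_value:.4f}")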

  16. Critical analysis of procurement techniques in construction management sectors

    Science.gov (United States)

    Tiwari, Suman Tiwari Suresh; Chan, Shiau Wei; Faraz Mubarak, Muhammad

    2018-04-01

    Over the last three decades, numerous procurement techniques have been among the highlights of construction management (CM), covering ventures, management contracting, project management, as well as design and construct. Owing to the development and utilization of these techniques, various researchers have explored the criteria for their selection and their performance in terms of time, cost, and quality. Nevertheless, little account has been given of the relationship between procurement techniques and related emerging issues, for example, supply chain, sustainability, innovation and technology development, lean construction, constructability, value management, Building Information Modelling (BIM), as well as e-procurement. Through papers selected from reputable CM-related academic journals, the specified scopes of these issues are systematically assessed with the objective of exploring the status and trends of procurement-related research. The result of this paper contributes theoretically as well as practically, helping researchers and industrialists to be aware of and appreciate the development of procurement techniques.

  17. Protease analysis by zymography: a review on techniques and patents.

    Science.gov (United States)

    Wilkesman, Jeff; Kurz, Liliana

    2009-01-01

    Zymography, the detection of enzymatic activity on an electrophoresis gel, is a technique that has been described in the literature for at least the past 50 years. Although a diverse array of enzymes, especially proteases, have been detected, advances and improvements have been slower in comparison with other molecular biology, biotechnology, and chromatography techniques. Most of the reviews and patents published focus on the technique as an element for enzymatic testing, but detailed analytical studies are scarce. Patents referring to zymography per se are few, and the technique itself is hardly an important issue in the titles or keywords of many scientific publications. This review condenses the works published so far dealing with the identification of proteolytic enzymes in electrophoretic gel supports and variations of the technique such as 2-D zymography, real-time zymography, and in-situ zymography. Moreover, an outlook is given on the new tendencies of this method regarding the substrates used and the visualization of activity. What to expect from zymography in the near future is also approached.

  18. Dynamic Analysis Techniques for the Reconstruction of Architectural Views

    NARCIS (Netherlands)

    Cornelissen, B.

    2007-01-01

    Gaining an understanding of software systems is an important discipline in many software engineering contexts. It is essential that software engineers are assisted as much as possible during this task, e.g., by using tools and techniques that provide architectural views on the software at hand. This

  19. Analysis of ISO 26262 Compliant Techniques for the Automotive Domain

    NARCIS (Netherlands)

    M. S. Kannan; Y. Dajsuren (Yanjindulam); Y. Luo; I. Barosan

    2015-01-01

    The ISO 26262 standard defines functional safety for automotive E/E systems. Since the publication of the first edition of this standard in 2011, many different safety techniques complying with the ISO 26262 have been developed. However, it is not clear which parts and (sub-) phases of the

  20. Analysis of ISO 26262 compliant techniques for the automotive domain

    NARCIS (Netherlands)

    S., Manoj Kannan; Dajsuren, Y.; Luo, Y.; Barosan, I.; Antkiewicz, M.; Atlee, J.; Dingel, J.; S, R.

    2015-01-01

    The ISO 26262 standard defines functional safety for automotive E/E systems. Since the publication of the first edition of this standard in 2011, many different safety techniques complying with the ISO 26262 have been developed. However, it is not clear which parts and (sub-) phases of the standard