WorldWideScience

Sample records for protocol combining analytical

  1. Analytical protocols for characterisation of sulphur-free lignin

    NARCIS (Netherlands)

    Gosselink, R.J.A.; Abächerli, A.; Semke, H.; Malherbe, R.; Käuper, P.; Nadif, A.; Dam, van J.E.G.

    2004-01-01

    Interlaboratory tests for chemical characterisation of sulphur-free lignins were performed by five laboratories to develop useful analytical protocols, which are lacking, and identify quality-related properties. Protocols have been established for reproducible determination of the chemical

  2. Navigating the Benford Labyrinth: A big-data analytic protocol illustrated using the academic library context

    Directory of Open Access Journals (Sweden)

    Michael Halperin

    2016-03-01

    Objective: Big Data Analytics is a panoply of techniques whose principal intention is to ferret out dimensions or factors from certain data streamed or available over the WWW. We offer a subset or “second” stage protocol of Big Data Analytics (BDA) that uses these dimensional datasets as benchmarks for profiling related data. We call this Specific Context Benchmarking (SCB). Method: In effecting this benchmarking objective, we have elected to use a Digital Frequency Profiling (DFP) technique based upon the work of Newcomb and Benford, who developed a profiling benchmark based upon the Log10 function. We illustrate the various stages of the SCB protocol using the data produced by the Academic Research Libraries to enhance insights regarding the details of the operational benchmarking context and so offer generalizations needed to encourage adoption of SCB across other functional domains. Results: An illustration of the SCB protocol is offered using the recently developed Benford Practical Profile as the Conformity Benchmarking Measure. ShareWare: We have developed a Decision Support System called SpecificContextAnalytics (SCA:DSS) to create the various information sets presented in this paper. The SCA:DSS, programmed in Excel VBA, is available from the corresponding author as a free download without restriction on its use. Conclusions: We note that SCB effected using the DFPs is an enhancement to, not a replacement for, the usual statistical and analytic techniques and fits very well in the BDA milieu.
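
    The abstract's Digital Frequency Profiling rests on the Newcomb–Benford law, which predicts a first-digit proportion of log10(1 + 1/d) for digit d. A minimal sketch of that benchmark comparison is given below; the function names, the mean-absolute-deviation conformity measure, and the example data are illustrative assumptions, not the SCA:DSS implementation.

```python
import math
from collections import Counter

def benford_expected():
    """Expected first-digit proportions under the Newcomb-Benford law."""
    return {d: math.log10(1 + 1 / d) for d in range(1, 10)}

def first_digit_profile(values):
    """Observed first-digit proportions for a collection of nonzero numbers."""
    digits = [int(f"{abs(v):.15e}"[0]) for v in values]   # leading significant digit
    counts = Counter(digits)
    return {d: counts.get(d, 0) / len(digits) for d in range(1, 10)}

def mad_conformity(values):
    """Mean absolute deviation between observed and Benford proportions,
    a simple stand-in for a conformity benchmarking measure."""
    expected, observed = benford_expected(), first_digit_profile(values)
    return sum(abs(observed[d] - expected[d]) for d in range(1, 10)) / 9

# Hypothetical library expenditure figures profiled against the Benford benchmark.
spending = [1234, 2045, 987, 1560, 3102, 118, 4509, 2750, 1999, 860]
print(round(mad_conformity(spending), 4))
```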

  3. Valid analytical performance specifications for combined analytical bias and imprecision for the use of common reference intervals.

    Science.gov (United States)

    Hyltoft Petersen, Per; Lund, Flemming; Fraser, Callum G; Sandberg, Sverre; Sölétormos, György

    2018-01-01

    Background: Many clinical decisions are based on comparison of patient results with reference intervals. Therefore, an estimation of the analytical performance specifications for the quality that would be required to allow sharing common reference intervals is needed. The International Federation of Clinical Chemistry (IFCC) recommended a minimum of 120 reference individuals to establish reference intervals. This number implies a certain level of quality, which could then be used for defining analytical performance specifications as the maximum combination of analytical bias and imprecision required for sharing common reference intervals, the aim of this investigation. Methods: Two methods were investigated for defining the maximum combination of analytical bias and imprecision that would give the same quality of common reference intervals as the IFCC recommendation. Method 1 is based on a formula for the combination of analytical bias and imprecision, and Method 2 is based on the Microsoft Excel formula NORMINV, including the fractional probability of reference individuals outside each limit and the Gaussian variables of mean and standard deviation. The combinations of normalized bias and imprecision are illustrated for both methods. The formulae are identical for Gaussian and log-Gaussian distributions. Results: Method 2 gives the correct results with a constant percentage of 4.4% for all combinations of bias and imprecision. Conclusion: The Microsoft Excel formula NORMINV is useful for the estimation of analytical performance specifications for both Gaussian and log-Gaussian distributions of reference intervals.
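
    A hedged sketch of the Method 2 idea, using SciPy's normal distribution in place of Excel's NORMINV: for a given normalized bias and imprecision, compute the fraction of a Gaussian reference population whose results fall outside the central 95% reference limits. The ±1.96 limits, the expression of imprecision as the ratio of analytical SD to reference-population SD, and the example values are assumptions for illustration, not the paper's exact formulae.

```python
from scipy.stats import norm

def fraction_outside(bias, imprecision):
    """Fraction of results outside the central 95% reference limits (z = +/-1.96)
    for a Gaussian analyte, given a normalized bias (in reference-population SD
    units) and a normalized imprecision (analytical SD / reference-population SD).
    Illustrative only; the paper's Method 2 builds the equivalent calculation
    from Excel's NORMINV."""
    total_sd = (1 + imprecision ** 2) ** 0.5        # combined biological + analytical spread
    below = norm.cdf((-1.96 - bias) / total_sd)     # tail below the lower limit
    above = norm.sf((1.96 - bias) / total_sd)       # tail above the upper limit
    return below + above

print(round(fraction_outside(0.0, 0.0), 3))   # 0.05 with no bias and no added imprecision
print(round(fraction_outside(0.25, 0.5), 3))  # larger fraction once bias/imprecision grow
```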

  4. The combination of four analytical methods to explore skeletal muscle metabolomics: Better coverage of metabolic pathways or a marketing argument?

    Science.gov (United States)

    Bruno, C; Patin, F; Bocca, C; Nadal-Desbarats, L; Bonnier, F; Reynier, P; Emond, P; Vourc'h, P; Joseph-Delafont, K; Corcia, P; Andres, C R; Blasco, H

    2018-01-30

    Metabolomics is an emerging science based on diverse high-throughput methods that are rapidly evolving to improve metabolic coverage of biological fluids and tissues. Technical progress has led researchers to combine several analytical methods without reporting the impact of such a strategy on metabolic coverage. The objective of our study was to develop and validate several analytical techniques (mass spectrometry coupled to gas or liquid chromatography and nuclear magnetic resonance) for the metabolomic analysis of small muscle samples and to evaluate the impact of combining methods for more exhaustive metabolite coverage. We evaluated the muscle metabolome from the same pool of mouse muscle samples after 2 metabolite extraction protocols. Four analytical methods were used: targeted flow injection analysis coupled with mass spectrometry (FIA-MS/MS), gas chromatography coupled with mass spectrometry (GC-MS), liquid chromatography coupled with high-resolution mass spectrometry (LC-HRMS), and nuclear magnetic resonance (NMR) analysis. We evaluated the global variability of each compound, i.e., analytical (from quality controls) and extraction variability (from muscle extracts). We determined the best extraction method and we reported the common and distinct metabolites identified, based on the number and identity of the compounds detected with low analytical variability (variation coefficient <30%) for each method. Finally, we assessed the coverage of muscle metabolic pathways obtained. Methanol/chloroform/water and water/methanol were the best extraction solvents for muscle metabolome analysis by NMR and MS, respectively. We identified 38 metabolites by nuclear magnetic resonance, 37 by FIA-MS/MS, 18 by GC-MS, and 80 by LC-HRMS. The combination led us to identify a total of 132 metabolites with low variability, partitioned into 58 metabolic pathways, such as amino acid, nitrogen, purine, and pyrimidine metabolism, and the citric acid cycle. This combination also showed
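
    As a sketch of the two selection steps described above, the 30% coefficient-of-variation filter and the union of metabolites across platforms, here is a small illustration; the dict-of-replicate-intensities layout and the function names are assumptions, not the authors' pipeline.

```python
import statistics as stats

def low_variability(metabolite_intensities, cv_threshold=0.30):
    """Keep metabolites whose coefficient of variation across replicate
    measurements (QC injections or extracts) is below the threshold.
    Expects a dict mapping metabolite name -> list of >= 2 intensities."""
    kept = {}
    for name, values in metabolite_intensities.items():
        mean = stats.mean(values)
        cv = stats.stdev(values) / mean if mean else float("inf")
        if cv < cv_threshold:
            kept[name] = round(cv, 3)
    return kept

def combined_coverage(per_method_metabolites):
    """Union of reliably detected metabolites across platforms, mirroring how
    combining FIA-MS/MS, GC-MS, LC-HRMS and NMR widens metabolome coverage."""
    combined = set()
    for metabolites in per_method_metabolites.values():
        combined |= set(metabolites)
    return combined

nmr = low_variability({"lactate": [10.2, 10.8, 9.9], "alanine": [4.1, 6.9, 2.2]})
lc = low_variability({"lactate": [1.1e6, 1.0e6, 1.2e6], "carnosine": [3.3e5, 3.1e5, 3.4e5]})
print(combined_coverage({"NMR": nmr, "LC-HRMS": lc}))
```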

  5. An Analytical Tire Model with Flexible Carcass for Combined Slips

    Directory of Open Access Journals (Sweden)

    Nan Xu

    2014-01-01

    The tire mechanical characteristics under combined cornering and braking/driving situations have significant effects on vehicle directional control. The objective of this paper is to present an analytical tire model with a flexible carcass for combined slip situations, which can describe tire behavior well and can also be used for studying vehicle dynamics. The tire forces and moments come mainly from the shear stress and sliding friction at the tread-road interface. In order to describe complicated tire characteristics and tire-road friction, some key factors are considered in this model: arbitrary pressure distribution; translational, bending, and twisting compliance of the carcass; dynamic friction coefficient; anisotropic stiffness properties. The analytical tire model can describe tire forces and moments accurately under combined slip conditions. Some important properties induced by the flexible carcass can also be reflected. The structural parameters of a tire can be identified from tire measurements, and the computational results using the analytical model show good agreement with test data.
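
    For orientation, a minimal combined-slip brush model with a rigid carcass, isotropic tread stiffness and a parabolic pressure distribution, i.e. the classical baseline that the flexible-carcass model above extends. The formula and parameter values below are the textbook ones and are given only as a hedged reference point, not as the authors' model.

```python
import math

def brush_combined_slip(kappa, alpha, mu, Fz, c_p, a):
    """Classical brush tire model under combined slip (rigid carcass).
    kappa: longitudinal slip ratio, alpha: slip angle [rad], mu: friction
    coefficient, Fz: vertical load [N], c_p: tread stiffness per unit
    length [N/m^2], a: half contact length [m]."""
    sx = kappa / (1.0 + kappa)              # theoretical longitudinal slip
    sy = math.tan(alpha) / (1.0 + kappa)    # theoretical lateral slip
    sigma = math.hypot(sx, sy)
    if sigma == 0.0:
        return 0.0, 0.0
    theta = 2.0 * c_p * a ** 2 / (3.0 * mu * Fz)
    ts = theta * sigma
    if ts < 1.0:                            # adhesion and sliding regions coexist
        F = 3.0 * mu * Fz * ts * (1.0 - ts + ts ** 2 / 3.0)
    else:                                   # full sliding over the contact patch
        F = mu * Fz
    # the resultant friction force opposes the slip direction
    return -F * sx / sigma, -F * sy / sigma

print(brush_combined_slip(kappa=0.05, alpha=math.radians(3.0), mu=1.0,
                          Fz=4000.0, c_p=2.0e6, a=0.07))
```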

  6. Analytical approach to cross-layer protocol optimization in wireless sensor networks

    Science.gov (United States)

    Hortos, William S.

    2008-04-01

    In the distributed operations of route discovery and maintenance, strong interaction occurs across mobile ad hoc network (MANET) protocol layers. Quality of service (QoS) requirements of multimedia service classes must be satisfied by the cross-layer protocol, along with minimization of the distributed power consumption at nodes and along routes, subject to battery-limited energy constraints. In previous work by the author, cross-layer interactions in the MANET protocol are modeled in terms of a set of concatenated design parameters and associated resource levels by multivariate point processes (MVPPs). Determination of the "best" cross-layer design is carried out using the optimal control of martingale representations of the MVPPs. In contrast to the competitive interaction among nodes in a MANET for multimedia services using limited resources, the interaction among the nodes of a wireless sensor network (WSN) is distributed and collaborative, based on the processing of data from a variety of sensors at nodes to satisfy common mission objectives. Sensor data originates at the nodes at the periphery of the WSN, is successively transported to other nodes for aggregation based on information-theoretic measures of correlation, and is ultimately sent as information to one or more destination (decision) nodes. The "multimedia services" in the MANET model are replaced by multiple types of sensors, e.g., audio, seismic, imaging, thermal, etc., at the nodes; the QoS metrics associated with MANETs become those associated with the quality of fused information flow, i.e., throughput, delay, packet error rate, data correlation, etc. Significantly, the essential analytical approach to MANET cross-layer optimization, now based on the MVPPs for discrete random events occurring in the WSN, can be applied to develop the stochastic characteristics and optimality conditions for cross-layer designs of sensor network protocols. Functional dependencies of WSN performance metrics are described in

  7. The evaluation of an analytical protocol for the determination of substances in waste for hazard classification

    Energy Technology Data Exchange (ETDEWEB)

    Hennebert, Pierre, E-mail: pierre.hennebert@ineris.fr [INERIS – Institut National de l’Environnement Industriel et des Risques, Domaine du Petit Arbois BP33, F-13545 Aix-en-Provence (France); Papin, Arnaud [INERIS, Parc Technologique ALATA, BP No. 2, 60550 Verneuil en Halatte (France); Padox, Jean-Marie [INERIS – Institut National de l’Environnement Industriel et des Risques, Domaine du Petit Arbois BP33, F-13545 Aix-en-Provence (France); Hasebrouck, Benoît [INERIS, Parc Technologique ALATA, BP No. 2, 60550 Verneuil en Halatte (France)

    2013-07-15

    Highlights: • Knowledge of the substances in wastes will be necessary to assess HP1–HP15 hazard properties. • A new analytical protocol is proposed for this and tested by two service laboratories on 32 samples. • Sixty-three percent of the samples have a satisfactory analytical balance between 90% and 110%. • Eighty-four percent of the samples were classified identically (Seveso Directive) for their hazardousness by the two laboratories. • The method, still in progress, is being standardized in France and is to be proposed to CEN. - Abstract: The classification of waste as hazardous could soon be assessed in Europe using largely the hazard properties of its constituents, according to the Classification, Labelling and Packaging (CLP) regulation. Comprehensive knowledge of the component constituents of a given waste will therefore be necessary. An analytical protocol for determining waste composition is proposed, which includes using inductively coupled plasma (ICP) screening methods to identify major elements and gas chromatography/mass spectrometry (GC–MS) screening techniques to measure organic compounds. The method includes a gross or indicator measure of ‘pools’ of higher molecular weight organic substances that are taken to be less bioactive and less hazardous, and of unresolved ‘mass’ during the chromatography of volatile and semi-volatile compounds. The concentration of some elements and specific compounds that are linked to specific hazard properties and are subject to specific regulation (examples include: heavy metals, chromium(VI), cyanides, organo-halogens, and PCBs) are determined by classical quantitative analysis. To check the consistency of the analysis, the sum of the concentrations (including unresolved ‘pools’) should give a mass balance between 90% and 110%. Thirty-two laboratory samples comprising different industrial wastes (liquids and solids) were tested by two routine service laboratories, to give circa 7000 parameter

  8. Abbreviated Combined MR Protocol: A New Faster Strategy for Characterizing Breast Lesions.

    Science.gov (United States)

    Moschetta, Marco; Telegrafo, Michele; Rella, Leonarda; Stabile Ianora, Amato Antonio; Angelelli, Giuseppe

    2016-06-01

    The use of an abbreviated magnetic resonance (MR) protocol has been recently proposed for cancer screening. The aim of our study is to evaluate the diagnostic accuracy of an abbreviated MR protocol combining short TI inversion recovery (STIR), turbo-spin-echo (TSE)-T2 sequences, a pre-contrast T1, and a single intermediate (3 minutes after contrast injection) post-contrast T1 sequence for characterizing breast lesions. A total of 470 patients underwent breast MR examination for screening, problem solving, or preoperative staging. Two experienced radiologists evaluated both standard and abbreviated protocols in consensus. Sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and diagnostic accuracy for both protocols were calculated (with the histological findings and 6-month ultrasound follow-up as the reference standard) and compared with the McNemar test. The post-processing and interpretation times for the MR images were compared with the paired t test. In 177 of 470 (38%) patients, the MR sequences detected 185 breast lesions. The standard and abbreviated protocols obtained sensitivity, specificity, diagnostic accuracy, PPV, and NPV values of 92%, 92%, 92%, 68%, and 98% and of 89%, 91%, 91%, 64%, and 98%, respectively, with no statistically significant difference (P < .0001). The mean post-processing and interpretation times were, respectively, 7 ± 1 minutes and 6 ± 3.2 minutes for the standard protocol and 1 ± 1.2 minutes and 2 ± 1.2 minutes for the abbreviated protocol, with a statistically significant difference (P < .01). An abbreviated combined MR protocol represents a time-saving tool for radiologists and patients, with the same diagnostic potential as the standard protocol in patients undergoing breast MRI for screening, problem solving, or preoperative staging. Copyright © 2016 Elsevier Inc. All rights reserved.
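
    The comparison above rests on standard 2x2 diagnostic statistics and a paired McNemar test. A brief sketch follows, using the normal approximation of McNemar and hypothetical counts rather than the study's raw data.

```python
from math import erf, sqrt

def diagnostic_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, PPV, NPV and accuracy from a 2x2 table."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }

def mcnemar_p(b, c):
    """Two-sided McNemar test (normal approximation, no continuity correction)
    for paired protocols; b and c are the discordant pair counts."""
    if b + c == 0:
        return 1.0
    z = abs(b - c) / sqrt(b + c)
    return 2.0 * (1.0 - 0.5 * (1.0 + erf(z / sqrt(2.0))))

print(diagnostic_metrics(tp=55, fp=26, tn=374, fn=5))   # hypothetical lesion counts
print(round(mcnemar_p(b=6, c=3), 3))                    # hypothetical discordant reads
```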

  9. Experimental and analytical combined thermal approach for local tribological understanding in metal cutting

    International Nuclear Information System (INIS)

    Artozoul, Julien; Lescalier, Christophe; Dudzinski, Daniel

    2015-01-01

    Metal cutting is a highly complex thermo-mechanical process. Knowledge of the temperature in the chip forming zone is essential to understand it. Conventional experimental methods such as thermocouples only provide global information, which is incompatible with the high stress and temperature gradients met in the chip forming zone. Field measurements are essential to understand the localized thermo-mechanical problem. An experimental protocol has been developed using advanced infrared imaging in order to measure the temperature distribution in both the tool and the chip during an orthogonal or oblique cutting operation. It also provides information on the chip formation process, such as geometrical characteristics (tool-chip contact length, chip thickness, primary shear angle) and thermo-mechanical quantities (heat flux dissipated in the deformation zone, local interface heat partition ratio). A study is carried out on the effects of cutting conditions, i.e. cutting speed, feed and depth of cut, on the temperature distribution along the contact zone for an elementary operation. An analytical thermal model has been developed to process the experimental data and access further information, i.e. the local stress or heat flux distribution. - Highlights: • A thermal analytical model is proposed for the orthogonal cutting process. • IR thermography is used during cutting tests. • Combined experimental and modeling approaches are applied. • Heat flux and stress distribution at the tool-chip interface are determined. • The decomposition into sticking and sliding zones is defined.

  10. The interventional effect of new drugs combined with the Stupp protocol on glioblastoma: A network meta-analysis.

    Science.gov (United States)

    Li, Mei; Song, Xiangqi; Zhu, Jun; Fu, Aijun; Li, Jianmin; Chen, Tong

    2017-08-01

    New therapeutic agents in combination with the standard Stupp protocol (a protocol combining temozolomide with radiotherapy for glioblastoma, reported by Stupp R in 2005) were assessed to evaluate whether they were superior to the Stupp protocol alone, to determine the optimum treatment regimen for patients with newly diagnosed glioblastoma. We implemented a search strategy to identify studies in the following databases: PubMed, Cochrane Library, EMBASE, CNKI, CBM, Wanfang, and VIP, and assessed the quality of extracted data from the trials included. Statistical software was used to perform network meta-analysis. The use of novel therapeutic agents in combination with the Stupp protocol was shown to be superior to the Stupp protocol alone for the treatment of newly diagnosed glioblastoma, ranked as follows: cilengitide 2000mg/5/week, bevacizumab in combination with irinotecan, nimotuzumab, bevacizumab, cilengitide 2000mg/2/week, cytokine-induced killer cell immunotherapy, and the Stupp protocol. In terms of serious adverse effects, the intervention group showed a 29% increase in the incidence of adverse events compared with the control group (patients treated only with the Stupp protocol), with a statistically significant difference (RR=1.29; 95%CI 1.17-1.43; P<0.001). The most common adverse events were thrombocytopenia, lymphopenia, neutropenia, pneumonia, nausea, and vomiting, none of which differed significantly between the groups except for neutropenia, pneumonia, and embolism. All intervention drugs evaluated in our study were superior to the Stupp protocol alone when used in combination with it. However, we could not conclusively confirm whether cilengitide 2000mg/5/week was the optimum regimen, as only one trial using this protocol was included in our study. Copyright © 2017. Published by Elsevier B.V.
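
    A small sketch of how a risk ratio and its 95% confidence interval are computed on the log scale, the effect measure behind the RR=1.29 (95% CI 1.17-1.43) reported above; the event counts in the example are made up for illustration.

```python
import math

def risk_ratio(events_tx, n_tx, events_ctrl, n_ctrl, z=1.96):
    """Risk ratio of adverse events with a Wald-type 95% CI on the log scale."""
    risk_tx, risk_ctrl = events_tx / n_tx, events_ctrl / n_ctrl
    rr = risk_tx / risk_ctrl
    se_log = math.sqrt(1 / events_tx - 1 / n_tx + 1 / events_ctrl - 1 / n_ctrl)
    lower = math.exp(math.log(rr) - z * se_log)
    upper = math.exp(math.log(rr) + z * se_log)
    return rr, lower, upper

# hypothetical counts: 310/600 with adverse events in the intervention arms vs 240/600 controls
print(tuple(round(x, 2) for x in risk_ratio(310, 600, 240, 600)))
```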

  11. Combining Vertex-centric Graph Processing with SPARQL for Large-scale RDF Data Analytics

    KAUST Repository

    Abdelaziz, Ibrahim

    2017-06-27

    Modern applications, such as drug repositioning, require sophisticated analytics on RDF graphs that combine structural queries with generic graph computations. Existing systems support either declarative SPARQL queries or generic graph processing, but not both. We bridge the gap by introducing Spartex, a versatile framework for complex RDF analytics. Spartex extends SPARQL to support programs that seamlessly combine generic graph algorithms (e.g., PageRank, Shortest Paths, etc.) with SPARQL queries. Spartex builds on existing vertex-centric graph processing frameworks, such as Graphlab or Pregel. It implements a generic SPARQL operator as a vertex-centric program that interprets SPARQL queries and executes them efficiently using a built-in optimizer. In addition, any graph algorithm implemented in the underlying vertex-centric framework can be executed in Spartex. We present various scenarios where our framework significantly simplifies the implementation of complex RDF data analytics programs. We demonstrate that Spartex scales to datasets with billions of edges, and show that our core SPARQL engine is at least as fast as the state-of-the-art specialized RDF engines. For complex analytical tasks that combine generic graph processing with SPARQL, Spartex is at least an order of magnitude faster than existing alternatives.

  12. Valid analytical performance specifications for combined analytical bias and imprecision for the use of common reference intervals

    DEFF Research Database (Denmark)

    Hyltoft Petersen, Per; Lund, Flemming; Fraser, Callum G

    2018-01-01

    for the combination of analytical bias and imprecision and Method 2 is based on the Microsoft Excel formula NORMINV including the fractional probability of reference individuals outside each limit and the Gaussian variables of mean and standard deviation. The combinations of normalized bias and imprecision...... are illustrated for both methods. The formulae are identical for Gaussian and log-Gaussian distributions. Results Method 2 gives the correct results with a constant percentage of 4.4% for all combinations of bias and imprecision. Conclusion The Microsoft Excel formula NORMINV is useful for the estimation...

  13. Combined Protocol for Acute Malnutrition Study (ComPAS) in rural South Sudan and urban Kenya

    DEFF Research Database (Denmark)

    Bailey, Jeanette; Lelijveld, Natasha; Marron, Bethany

    2018-01-01

    Background: Acute malnutrition is a continuum condition, but severe and moderate forms are treated separately, with different protocols and therapeutic products, managed by separate United Nations agencies. The Combined Protocol for Acute Malnutrition Study (ComPAS) aims to simplify and unify...... the treatment of uncomplicated severe and moderate acute malnutrition (SAM and MAM) for children 6-59 months into one protocol in order to improve the global coverage, quality, continuity of care and cost-effectiveness of acute malnutrition treatment in resource-constrained settings.  Methods/design: This study...... is a multi-site, cluster randomized non-inferiority trial with 12 clusters in Kenya and 12 clusters in South Sudan. Participants are 3600 children aged 6-59 months with uncomplicated acute malnutrition. This study will evaluate the impact of a simplified and combined protocol for the treatment of SAM and MAM...

  14. A Protocol Layer Trust-Based Intrusion Detection Scheme for Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Jian Wang

    2017-05-01

    This article proposes a protocol layer trust-based intrusion detection scheme for wireless sensor networks. Unlike existing work, the trust value of a sensor node is evaluated according to the deviations of key parameters at each protocol layer, considering that attacks initiated at different protocol layers will inevitably affect the parameters of the corresponding protocol layers. For simplicity, the paper mainly considers three aspects of trustworthiness, namely physical layer trust, media access control layer trust and network layer trust. The per-layer trust metrics are then combined to determine the overall trust metric of a sensor node. The performance of the proposed intrusion detection mechanism is then analyzed using the t-distribution to derive analytical results for the false positive and false negative probabilities. Numerical analytical results, validated by simulation results, are presented for different attack scenarios. It is shown that the proposed protocol layer trust-based intrusion detection scheme outperforms a state-of-the-art scheme in terms of detection probability and false positive probability, demonstrating its usefulness for detecting cross-layer attacks.
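
    An illustrative sketch of the per-layer trust combination and a t-distribution threshold in the spirit of the scheme; the deviation-based scoring, the equal layer weights and the confidence level are assumptions, not the paper's exact formulation.

```python
from scipy.stats import t as student_t

def layer_trust(observed, expected, tolerance):
    """Per-layer trust in [0, 1] that decays with the normalized deviation of a
    key parameter (e.g. RSSI, collision rate, forwarding ratio) from its
    expected value."""
    deviation = abs(observed - expected) / tolerance
    return max(0.0, 1.0 - deviation)

def overall_trust(phy, mac, net, weights=(1 / 3, 1 / 3, 1 / 3)):
    """Weighted combination of physical-, MAC- and network-layer trust."""
    return weights[0] * phy + weights[1] * mac + weights[2] * net

def trust_threshold(mean_trust, std_trust, n, alpha=0.05):
    """Lower confidence bound on a node's historical trust using the
    t-distribution; current trust falling below it flags a possible intrusion."""
    margin = student_t.ppf(1 - alpha, df=n - 1) * std_trust / n ** 0.5
    return mean_trust - margin

phy = layer_trust(observed=-71.0, expected=-65.0, tolerance=15.0)  # RSSI in dBm
mac = layer_trust(observed=0.12, expected=0.05, tolerance=0.20)    # collision rate
net = layer_trust(observed=0.82, expected=0.95, tolerance=0.30)    # forwarding ratio
print(round(overall_trust(phy, mac, net), 3), round(trust_threshold(0.80, 0.10, n=30), 3))
```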

  15. Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention

    Directory of Open Access Journals (Sweden)

    Samar Al-Hajj

    2017-09-01

    Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods: Inspired by the Delphi method, we introduced a novel methodology—group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders’ observations, audio and video recordings, questionnaires, and follow up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results: The GA methodology triggered the emergence of ‘common ground’ among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders’ verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve ‘common ground’ among diverse stakeholders about health data and their implications.

  16. Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention.

    Science.gov (United States)

    Al-Hajj, Samar; Fisher, Brian; Smith, Jennifer; Pike, Ian

    2017-09-12

    Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods: Inspired by the Delphi method, we introduced a novel methodology-group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders' observations, audio and video recordings, questionnaires, and follow up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results: The GA methodology triggered the emergence of 'common ground' among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders' verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve 'common ground' among diverse stakeholders about health data and their implications.

  17. Investigation on bonding defects in ITER first wall beryllium armour components by combining analytical and experimental methods

    Energy Technology Data Exchange (ETDEWEB)

    Pérez, Germán, E-mail: german.perez.pichel@gmail.com; Mitteau, Raphaël; Eaton, Russell; Raffray, René

    2015-12-15

    Highlights: • Bonding defects at the ITER first wall beryllium armour are studied. • Experimental and analytical methods are combined. • Models supporting test results interpretation are proposed. • Guidelines for new experimental protocols are suggested. • Contribution to the definition of defects acceptance criteria. - Abstract: The reliability of the plasma facing components (PFCs) is essential for efficient plasma operation in a fusion machine. This concerns especially the bond between the armour tiles facing the plasma and the heat sink material (copper alloy). The different thermal expansions of the bonded materials cause a stress distribution in the bond, which peaks at the bond edge. Under cyclic heat flux and accounting for the possible presence of bonding defects, this stress could reach a level where the component might be jeopardised. Because of the complexity of describing realistically by analyses and models the stress evolution in the bond, “design by experiments” is the main procedure for defining and qualifying the armour joint. Most of the existing plasma operation know-how on actively cooled PFCs has been obtained with carbon composite armour tiles. In ITER, the tiles of the first wall are made out of beryllium, which means that the know-how is progressively adapted to this specific bimetallic pair. Nonetheless, analyses are still performed to support the R&D experimental programme. This paper: explores methods for combining experimental results with finite element and statistical analyses; benchmarks test results; proposes hypotheses and rationales consistent with the interpretation of test results; suggests guidelines for defining possible further experimental protocols; and contributes to the definition of defects acceptance criteria.

  18. Investigation on bonding defects in ITER first wall beryllium armour components by combining analytical and experimental methods

    International Nuclear Information System (INIS)

    Pérez, Germán; Mitteau, Raphaël; Eaton, Russell; Raffray, René

    2015-01-01

    Highlights: • Bonding defects at the ITER first wall beryllium armour are studied. • Experimental and analytical methods are combined. • Models supporting test results interpretation are proposed. • Guidelines for new experimental protocols are suggested. • Contribution to the definition of defects acceptance criteria. - Abstract: The reliability of the plasma facing components (PFCs) is essential for efficient plasma operation in a fusion machine. This concerns especially the bond between the armour tiles facing the plasma and the heat sink material (copper alloy). The different thermal expansions of the bonded materials cause a stress distribution in the bond, which peaks at the bond edge. Under cyclic heat flux and accounting for the possible presence of bonding defects, this stress could reach a level where the component might be jeopardised. Because of the complexity of describing realistically by analyses and models the stress evolution in the bond, “design by experiments” is the main procedure for defining and qualifying the armour joint. Most of the existing plasma operation know-how on actively cooled PFCs has been obtained with carbon composite armour tiles. In ITER, the tiles of the first wall are made out of beryllium, which means that the know-how is progressively adapted to this specific bimetallic pair. Nonetheless, analyses are still performed to support the R&D experimental programme. This paper: explores methods for combining experimental results with finite element and statistical analyses; benchmarks test results; proposes hypotheses and rationales consistent with the interpretation of test results; suggests guidelines for defining possible further experimental protocols; and contributes to the definition of defects acceptance criteria.

  19. Microsystems for liquid-liquid extraction of radionuclides in the analytical protocols

    International Nuclear Information System (INIS)

    Helle, Gwendolyne

    2014-01-01

    Radiochemical analyses are necessary at numerous steps of nuclear waste management and of environmental monitoring. An analytical protocol generally includes different steps of chemical separations, which are lengthy, manual and complicated to implement because of their confinement in glove boxes and because of the hostile chemical and radiochemical media. It is therefore very important to propose innovative and robust solutions to automate these steps and also to reduce the volumes of radioactive and chemical wastes at the end of the analytical cycle. One solution consists in the miniaturization of the analyses through the use of lab-on-chip devices. The objective of this thesis work was to propose a rational approach to the design of separative microsystems for the liquid-liquid extraction of radionuclides. To achieve this, the hydrodynamic behavior as well as the extraction performances have been investigated in one chip for three different chemical systems: Eu(III)-HNO3/DMDBTDMA, Eu(III)-AcO(H,Na)-HNO3/HDEHP and U(VI)-HCl/Aliquat336. A methodology has been developed for the implementation of liquid-liquid extraction in a micro-system for each chemical system. The influence of various geometric parameters such as channel length or specific interfacial area has been studied, and the comparison of the liquid-liquid extraction performances has highlighted the influence of the ratio of the phase viscosities on the flows. Thanks to the modeling of both hydrodynamics and mass transfer in the micro-system, the criteria related to the physical and kinetic properties of the chemical systems have been distinguished in order to propose a rational design of tailor-made chips. Finally, several examples of the implementation of liquid-liquid extraction in micro-systems have been described for analytical applications in the nuclear field: U/Co separation by Aliquat336, Eu/Sm separation by DMDBTDMA or even the coupling between a liquid-liquid extraction chip and the system of

  20. In-depth characterization of prebiotic galactooligosaccharides by a combination of analytical techniques

    NARCIS (Netherlands)

    Coulier, L.; Timmermans, J.; Richard, B.; Dool, R. van den; Haaksman, I.; Klarenbeek, B.; Slaghek, T.; Dongen, W. van

    2009-01-01

    A commercial prebiotic galacto-oligosaccharide mixture (Vivinal GOS) was extensively characterized using a combination of analytical techniques. The different techniques were integrated to give complementary information on specific characteristics of the oligosaccharide mixture, ranging from global

  1. The evaluation of an analytical protocol for the determination of substances in waste for hazard classification.

    Science.gov (United States)

    Hennebert, Pierre; Papin, Arnaud; Padox, Jean-Marie; Hasebrouck, Benoît

    2013-07-01

    The classification of waste as hazardous could soon be assessed in Europe using largely the hazard properties of its constituents, according to the Classification, Labelling and Packaging (CLP) regulation. Comprehensive knowledge of the component constituents of a given waste will therefore be necessary. An analytical protocol for determining waste composition is proposed, which includes using inductively coupled plasma (ICP) screening methods to identify major elements and gas chromatography/mass spectrometry (GC-MS) screening techniques to measure organic compounds. The method includes a gross or indicator measure of 'pools' of higher molecular weight organic substances that are taken to be less bioactive and less hazardous, and of unresolved 'mass' during the chromatography of volatile and semi-volatile compounds. The concentration of some elements and specific compounds that are linked to specific hazard properties and are subject to specific regulation (examples include: heavy metals, chromium(VI), cyanides, organo-halogens, and PCBs) are determined by classical quantitative analysis. To check the consistency of the analysis, the sum of the concentrations (including unresolved 'pools') should give a mass balance between 90% and 110%. Thirty-two laboratory samples comprising different industrial wastes (liquids and solids) were tested by two routine service laboratories, to give circa 7000 parameter results. Despite discrepancies in some parameters, a satisfactory sum of estimated or measured concentrations (analytical balance) of 90% was reached for 20 samples (63% of the overall total) during this first test exercise, with identified reasons for most of the unsatisfactory results. Regular use of this protocol (which is now included in the French legislation) has enabled service laboratories to reach a 90% mass balance for nearly all the solid samples tested, and most of the liquid samples (difficulties were caused in some samples from polymers in solution and
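
    A minimal sketch of the 90-110% analytical (mass) balance check described above, assuming constituent concentrations expressed in mg/kg so that 1,000,000 mg/kg corresponds to 100%; the sample composition shown is hypothetical.

```python
def mass_balance(concentrations_mg_per_kg):
    """Sum the measured constituent concentrations (elements, quantified organics
    and the unresolved organic 'pools') and check whether the total falls inside
    the protocol's 90-110% window."""
    total_percent = sum(concentrations_mg_per_kg.values()) / 1_000_000 * 100
    return round(total_percent, 1), 90.0 <= total_percent <= 110.0

sample = {                       # hypothetical industrial waste, mg/kg
    "mineral matrix": 620_000,
    "hydrocarbon pool C10-C40": 310_000,
    "water": 55_000,
    "heavy metals": 1_200,
    "PCBs": 0.4,
}
print(mass_balance(sample))      # (98.6, True)
```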

  2. Protocols for the analytical characterization of therapeutic monoclonal antibodies. II - Enzymatic and chemical sample preparation.

    Science.gov (United States)

    Bobaly, Balazs; D'Atri, Valentina; Goyon, Alexandre; Colas, Olivier; Beck, Alain; Fekete, Szabolcs; Guillarme, Davy

    2017-08-15

    The analytical characterization of therapeutic monoclonal antibodies and related proteins usually incorporates various sample preparation methodologies. Indeed, quantitative and qualitative information can be enhanced by simplifying the sample, thanks to the removal of sources of heterogeneity (e.g. N-glycans) and/or by decreasing the molecular size of the tested protein by enzymatic or chemical fragmentation. These approaches make the sample more suitable for chromatographic and mass spectrometric analysis. Structural elucidation and quality control (QC) analysis of biopharmaceuticals are usually performed at the intact, subunit and peptide levels. In this paper, general sample preparation approaches used to attain peptide-, subunit- and glycan-level analysis are overviewed. Protocols are described to perform tryptic proteolysis, IdeS and papain digestion, and reduction, as well as deglycosylation by PNGase F and EndoS2 enzymes. Both historical and modern sample preparation methods were compared and evaluated using rituximab and trastuzumab, two reference therapeutic mAb products approved by the Food and Drug Administration (FDA) and the European Medicines Agency (EMA). The described protocols may help analysts to develop sample preparation methods in the field of therapeutic protein analysis. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Combining Vertex-centric Graph Processing with SPARQL for Large-scale RDF Data Analytics

    KAUST Repository

    Abdelaziz, Ibrahim; Al-Harbi, Mohammad Razen; Salihoglu, Semih; Kalnis, Panos

    2017-01-01

    , but not both. We bridge the gap by introducing Spartex, a versatile framework for complex RDF analytics. Spartex extends SPARQL to support programs that combine seamlessly generic graph algorithms (e.g., PageRank, Shortest Paths, etc.) with SPARQL queries

  4. The Protocol of Fixed Reconstruction for Severely Worn Teeth Combined with Anterior Deep Bite

    Directory of Open Access Journals (Sweden)

    Ya-Wen Zhao

    2017-01-01

    Full mouth reconstruction is one of the most effective methods to restore severely worn teeth that have suffered a reduced vertical dimension. Although the use of an overlay splint restoration for a trial period, allowing the patient to adapt to an increased vertical dimension, is the recognized method, the specific protocol for moving from the transitional splint to the fixed reconstruction is yet to be established. This case report describes a 50-year-old female patient who has severely worn teeth combined with an anterior deep bite and chewing pain. The protocol of the treatment process is described.

  5. Combination of Cyclodextrin and Ionic Liquid in Analytical Chemistry: Current and Future Perspectives.

    Science.gov (United States)

    Hui, Boon Yih; Raoov, Muggundha; Zain, Nur Nadhirah Mohamad; Mohamad, Sharifah; Osman, Hasnah

    2017-09-03

    The growing popularity of cyclodextrins (CDs) and ionic liquids (ILs) as promising materials in the field of analytical chemistry has resulted in an exponential increase in their exploitation and production. CDs belong to the family of cyclic oligosaccharides composed of α-(1,4)-linked glucopyranose subunits and possess a cage-like supramolecular structure. This structure enables chemical reactions to proceed between interacting ions, radicals or molecules in the absence of covalent bonds. Conversely, ILs are ionic fluids comprising only cations and anions, often with negligible vapor pressure, which makes them green or designer solvents. The cooperative effect between CDs and ILs, arising from their fascinating properties, has nowadays left its footprint on developments in analytical chemistry. This comprehensive review gives an overview of some of the recent studies and the analytical trends in the application of CDs combined with ILs, which have beneficial and remarkable effects in analytical chemistry, including their use in various sample preparation techniques such as solid phase extraction, magnetic solid phase extraction, cloud point extraction and microextraction, and in separation techniques including gas chromatography, high-performance liquid chromatography and capillary electrophoresis, as well as applications in electrochemical sensors as electrode modifiers, with references to recent applications. This review highlights the nature of the interactions and synergic effects between CDs, ILs, and analytes. It is hoped that this review will stimulate further research in analytical chemistry.

  6. Quantification of theobromine and caffeine in saliva, plasma and urine via liquid chromatography-tandem mass spectrometry: a single analytical protocol applicable to cocoa intervention studies.

    Science.gov (United States)

    Ptolemy, Adam S; Tzioumis, Emma; Thomke, Arjun; Rifai, Sami; Kellogg, Mark

    2010-02-01

    Targeted analyses of clinically relevant metabolites in human biofluids often require extensive sample preparation (e.g., desalting, protein removal and/or preconcentration) prior to quantitation. In this report, a single ultra-centrifugation based sample pretreatment combined with a designed liquid chromatography-tandem mass spectrometry (LC-MS/MS) protocol provides selective quantification of 3,7-dimethylxanthine (theobromine) and 1,3,7-trimethylxanthine (caffeine) in human saliva, plasma and urine samples. The optimized chromatography permitted elution of both analytes within 1.3 min of the applied gradient. Positive-mode electrospray ionization and a triple quadrupole MS/MS instrument operated in multiple reaction monitoring mode were used for detection. (13)C(3) isotopically labeled caffeine was included as an internal standard to improve accuracy and precision. Implementing a 20-fold dilution of the isolated low MW biofluid fraction prior to injection effectively minimized the deleterious contributions of all three matrices to quantitation. The assay was linear over a 160-fold concentration range from 2.5 to 400 micromol L(-1) for both theobromine (average R(2) 0.9968) and caffeine (average R(2) 0.9997). Analyte peak area variations for 2.5 micromol L(-1) caffeine and theobromine in saliva, plasma and urine ranged from 5 and 10% (intra-day, N=10) to 9 and 13% (inter-day, N=25) respectively. The intra- and inter-day precision of theobromine and caffeine elution times was below 3%, and theobromine recoveries ranged from 114 to 118% and 99 to 105% at concentration levels of 10 and 300 micromol L(-1), respectively. This validated protocol also permitted the relative saliva, plasma and urine distribution of both theobromine and caffeine to be quantified following a cocoa intervention. 2009 Elsevier B.V. All rights reserved.
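
    A brief sketch of the internal-standard calibration and the 20-fold dilution correction mentioned above. The assumption that calibrators are measured at the post-dilution level, along with every number in the example, is illustrative rather than taken from the paper.

```python
import numpy as np

def calibrate(conc, analyte_area, istd_area):
    """Least-squares fit of the analyte/internal-standard peak-area ratio
    (e.g. theobromine / 13C3-caffeine) against calibrator concentration."""
    ratios = np.asarray(analyte_area) / np.asarray(istd_area)
    slope, intercept = np.polyfit(conc, ratios, 1)
    r2 = np.corrcoef(conc, ratios)[0, 1] ** 2
    return slope, intercept, r2

def quantify(sample_ratio, slope, intercept, dilution_factor=20):
    """Back-calculate the biofluid concentration, correcting for the 20-fold
    pre-injection dilution (assumes calibrators were run at the diluted level)."""
    return (sample_ratio - intercept) / slope * dilution_factor

conc = [2.5, 10, 50, 100, 200, 400]              # micromol/L, hypothetical calibrators
areas = [0.021, 0.085, 0.44, 0.86, 1.75, 3.48]   # analyte peak areas (arbitrary units)
istd = [1.0] * len(conc)                         # internal-standard peak areas
m, b, r2 = calibrate(conc, areas, istd)
print(round(r2, 4), round(quantify(sample_ratio=0.50, slope=m, intercept=b), 1))
```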

  7. Gender-Specific Combination HIV Prevention for Youth in High-Burden Settings: The MP3 Youth Observational Pilot Study Protocol.

    Science.gov (United States)

    Buttolph, Jasmine; Inwani, Irene; Agot, Kawango; Cleland, Charles M; Cherutich, Peter; Kiarie, James N; Osoti, Alfred; Celum, Connie L; Baeten, Jared M; Nduati, Ruth; Kinuthia, John; Hallett, Timothy B; Alsallaq, Ramzi; Kurth, Ann E

    2017-03-08

    Nearly three decades into the epidemic, sub-Saharan Africa (SSA) remains the region most heavily affected by human immunodeficiency virus (HIV), with nearly 70% of the 34 million people living with HIV globally residing in the region. In SSA, female and male youth (15 to 24 years) are at a disproportionately high risk of HIV infection compared to adults. As such, there is a need to target HIV prevention strategies to youth and to tailor them to a gender-specific context. This protocol describes the process for the multi-staged approach in the design of the MP3 Youth pilot study, a gender-specific, combination, HIV prevention intervention for youth in Kenya. The objective of this multi-method protocol is to outline a rigorous and replicable methodology for a gender-specific combination HIV prevention pilot study for youth in high-burden settings, illustrating the triangulated methods undertaken to ensure that age, sex, and context are integral in the design of the intervention. The mixed-methods, cross-sectional, longitudinal cohort pilot study protocol was developed by first conducting a systematic review of the literature, which shaped focus group discussions around prevention package and delivery options, and that also informed age- and sex- stratified mathematical modeling. The review, qualitative data, and mathematical modeling created a triangulated evidence base of interventions to be included in the pilot study protocol. To design the pilot study protocol, we convened an expert panel to select HIV prevention interventions effective for youth in SSA, which will be offered in a mobile health setting. The goal of the pilot study implementation and evaluation is to apply lessons learned to more effective HIV prevention evidence and programming. The combination HIV prevention package in this protocol includes (1) offering HIV testing and counseling for all youth; (2) voluntary medical circumcision and condoms for males; (3) pre-exposure prophylaxis (Pr

  8. Dynamic Channel Slot Allocation Scheme and Performance Analysis of Cyclic Quorum Multichannel MAC Protocol

    Directory of Open Access Journals (Sweden)

    Xing Hu

    2017-01-01

    In high node-diversity situations, a multichannel MAC protocol can improve frequency efficiency, owing to fewer collisions compared with a single-channel MAC protocol. The performance of the cyclic quorum-based multichannel (CQM) MAC protocol is outstanding. Based on a cyclic quorum system and channel slot allocation, it can avoid the bottleneck that others suffer from and can be easily realized with only one transceiver. To obtain the accurate performance of the CQM MAC protocol, a Markov chain model, which combines the channel-hopping strategy of the CQM protocol and the IEEE 802.11 distributed coordination function (DCF), is proposed. The results of numerical analysis show that the optimal performance of the CQM protocol can be obtained in the saturation-bound situation. We then obtain the saturation bound of the CQM system by a bird swarm algorithm. In addition, to improve the performance of the CQM protocol in the unsaturated situation, a dynamic channel slot allocation CQM (DCQM) protocol is proposed, based on a wavelet neural network. Finally, the performance of the CQM protocol and the DCQM protocol is simulated on the Qualnet platform. The simulation results show that the analytical and simulation results match very well; the DCQM performs better in the unsaturated situation.

  9. Business Analytics and Performance Management: A Small Data Example Combining TD-ABC and BSC for Simulation and Optimization

    DEFF Research Database (Denmark)

    Nielsen, Steen

    The purpose of this paper is twofold: first, it discusses the potential of combining performance management with the concept and methodology of business analytics. The inspiration for this stems from the intensified discussion and use of business analytics and performance in organizations by both

  10. An analytical solution for two-dimensional vacuum preloading combined with electro-osmosis consolidation using EKG electrodes

    Science.gov (United States)

    Qiu, Chenchen; Li, Yande

    2017-01-01

    China is a country with vast territory, but economic development and population growth have reduced the usable land resources in recent years. Therefore, reclamation by pumping and filling is carried out in eastern coastal regions of China in order to meet the needs of urbanization. However, large areas of reclaimed land need rapid drainage consolidation treatment. Based on past research on how to improve the treatment efficiency of soft clay using vacuum preloading combined with electro-osmosis, a two-dimensional drainage plane model was proposed according to the Terzaghi and Esrig consolidation theory. However, an analytical solution based on this two-dimensional plane model had not previously been derived, and existing analytical solutions cannot provide a thorough theoretical analysis of practical engineering or give relevant guidance. Considering the smearing effect and the rectangular arrangement pattern, an analytical solution is derived to describe the behavior of pore water and the consolidation process when using EKG (electro-kinetic geosynthetics) materials. The functions of EKG materials include drainage, electric conduction and corrosion resistance. Comparison with test results is carried out to verify the analytical solution. It is found that the measured value is larger than the applied vacuum degree because of the stacking effect of the vacuum preloading and electro-osmosis. The trends of the mean measured values and the mean analytical values are comparable. Therefore, the consolidation model can accurately assess the change in pore-water pressure and the consolidation process during vacuum preloading combined with electro-osmosis. PMID:28771496

  11. An analytical solution for two-dimensional vacuum preloading combined with electro-osmosis consolidation using EKG electrodes.

    Directory of Open Access Journals (Sweden)

    Yang Shen

    China is a country with vast territory, but economic development and population growth have reduced the usable land resources in recent years. Therefore, reclamation by pumping and filling is carried out in eastern coastal regions of China in order to meet the needs of urbanization. However, large areas of reclaimed land need rapid drainage consolidation treatment. Based on past research on how to improve the treatment efficiency of soft clay using vacuum preloading combined with electro-osmosis, a two-dimensional drainage plane model was proposed according to the Terzaghi and Esrig consolidation theory. However, an analytical solution based on this two-dimensional plane model had not previously been derived, and existing analytical solutions cannot provide a thorough theoretical analysis of practical engineering or give relevant guidance. Considering the smearing effect and the rectangular arrangement pattern, an analytical solution is derived to describe the behavior of pore water and the consolidation process when using EKG (electro-kinetic geosynthetics) materials. The functions of EKG materials include drainage, electric conduction and corrosion resistance. Comparison with test results is carried out to verify the analytical solution. It is found that the measured value is larger than the applied vacuum degree because of the stacking effect of the vacuum preloading and electro-osmosis. The trends of the mean measured values and the mean analytical values are comparable. Therefore, the consolidation model can accurately assess the change in pore-water pressure and the consolidation process during vacuum preloading combined with electro-osmosis.

  12. Pilot studies for the North American Soil Geochemical Landscapes Project - Site selection, sampling protocols, analytical methods, and quality control protocols

    Science.gov (United States)

    Smith, D.B.; Woodruff, L.G.; O'Leary, R. M.; Cannon, W.F.; Garrett, R.G.; Kilburn, J.E.; Goldhaber, M.B.

    2009-01-01

    In 2004, the US Geological Survey (USGS) and the Geological Survey of Canada sampled and chemically analyzed soils along two transects across Canada and the USA in preparation for a planned soil geochemical survey of North America. This effort was a pilot study to test and refine sampling protocols, analytical methods, quality control protocols, and field logistics for the continental survey. A total of 220 sample sites were selected at approximately 40-km intervals along the two transects. The ideal sampling protocol at each site called for a sample from a depth of 0-5 cm and a composite of each of the O, A, and C horizons. The <2-mm fraction of each sample was analyzed for Ca, Fe, K, Mg, Na, S, Ti, Ag, As, Ba, Be, Bi, Cd, Ce, Co, Cr, Cs, Cu, Ga, In, La, Li, Mn, Mo, Nb, Ni, P, Pb, Rb, Sb, Sc, Sn, Sr, Te, Th, Tl, U, V, W, Y, and Zn by inductively coupled plasma-mass spectrometry and inductively coupled plasma-atomic emission spectrometry following a near-total digestion in a mixture of HCl, HNO3, HClO4, and HF. Separate methods were used for Hg, Se, total C, and carbonate-C on this same size fraction. Only Ag, In, and Te had a large percentage of concentrations below the detection limit. Quality control (QC) of the analyses was monitored at three levels: the laboratory performing the analysis, the USGS QC officer, and the principal investigator for the study. This level of review resulted in an average of one QC sample for every 20 field samples, which proved to be minimally adequate for such a large-scale survey. Additional QC samples should be added to monitor within-batch quality, to the extent that no more than 10 samples are analyzed between QC samples. Only Cr (77%), Y (82%), and Sb (80%) fell outside the acceptable limits of accuracy (% recovery between 85 and 115%) because of likely residence in mineral phases resistant to the acid digestion. A separate sample of 0-5-cm material was collected at each site for determination of organic compounds. A subset of 73 of these samples was analyzed for a suite of
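
    Two small helpers sketching the quality-control logic above: the 85-115% recovery window and the one-QC-per-20-field-samples insertion rate. The function names and batching scheme are assumptions for illustration.

```python
def recovery_ok(measured, certified, low=85.0, high=115.0):
    """Percent recovery against a certified reference value, with a pass/fail
    flag based on the survey's 85-115% accuracy window."""
    recovery = measured / certified * 100.0
    return round(recovery, 1), low <= recovery <= high

def qc_samples_needed(n_field_samples, samples_per_qc=20):
    """Number of QC samples implied by inserting one QC per batch of field samples."""
    return -(-n_field_samples // samples_per_qc)   # ceiling division

print(recovery_ok(measured=61.0, certified=79.0))  # e.g. a low Cr recovery -> (77.2, False)
print(qc_samples_needed(220))                      # 11 QC samples for the 220 pilot sites
```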

  13. A combined analytic-numeric approach for some boundary-value problems

    Directory of Open Access Journals (Sweden)

    Mustafa Turkyilmazoglu

    2016-02-01

    A combined analytic-numeric approach is undertaken in the present work for the solution of boundary-value problems in finite or semi-infinite domains. The equations treated arise specifically from the boundary layer analysis of some two- and three-dimensional flows in fluid mechanics. The purpose is to find quick but sufficiently accurate solutions. Taylor expansions at one of the boundaries are computed and are then matched to the other asymptotic or exact boundary conditions. The technique is applied to the well-known Blasius as well as Karman flows. Solutions obtained in terms of series compare favorably with the existing ones in the literature.
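
    In the same spirit as the analytic-numeric matching described above, here is a hedged sketch for the Blasius boundary-layer problem: the unknown wall curvature f''(0) is found by requiring the numerically integrated profile to satisfy the far-field condition f'(inf) = 1. This uses standard shooting with SciPy rather than the authors' Taylor-expansion matching.

```python
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

# Blasius equation 2 f''' + f f'' = 0 with f(0) = f'(0) = 0 and f'(inf) = 1.

def blasius_rhs(eta, y):
    f, fp, fpp = y
    return [fp, fpp, -0.5 * f * fpp]

def far_field_residual(s, eta_max=10.0):
    """Mismatch between f'(eta_max) and 1 when the wall value f''(0) = s."""
    sol = solve_ivp(blasius_rhs, (0.0, eta_max), [0.0, 0.0, s], rtol=1e-8)
    return sol.y[1, -1] - 1.0

s_wall = brentq(far_field_residual, 0.1, 1.0)   # matching step
print(f"f''(0) = {s_wall:.5f}")                 # approx. 0.33206
```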

  14. Development and validation of simple RP-HPLC-PDA analytical protocol for zileuton assisted with Design of Experiments for robustness determination

    OpenAIRE

    Saurabh B. Ganorkar; Dinesh M. Dhumal; Atul A. Shirkhedkar

    2017-01-01

    A simple, rapid, sensitive, robust, stability-indicating RP-HPLC-PDA analytical protocol was developed and validated for the analysis of zileuton racemate in bulk and in tablet formulation. Method development and resolution of degradation products from forced hydrolytic (acidic, basic, neutral), oxidative, photolytic (acidic, basic, neutral, solid state) and thermal (dry heat) degradation were achieved on a LC – GC Qualisil BDS C18 column (250 mm × 4.6 mm × 5 μm) by isocratic mode at ambie...

  15. Vertical Protocol Composition

    DEFF Research Database (Denmark)

    Groß, Thomas; Mödersheim, Sebastian Alexander

    2011-01-01

    The security of key exchange and secure channel protocols, such as TLS, has been studied intensively. However, only a few works have considered what happens when the established keys are actually used—to run some protocol securely over the established “channel”. We call this a vertical protocol composition ... i.e., that the combination cannot introduce attacks that the individual protocols in isolation do not have. In this work, we prove a composability result in the symbolic model that allows for arbitrary vertical composition (including self-composition). It holds for protocols from any suite of channel and application

  16. Building America House Simulation Protocols

    Energy Technology Data Exchange (ETDEWEB)

    Hendron, Robert [National Renewable Energy Lab. (NREL), Golden, CO (United States); Engebrecht, Cheryn [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2010-09-01

    The House Simulation Protocol document was developed to track and manage progress toward Building America's multi-year, average whole-building energy reduction research goals for new construction and existing homes, using a consistent analytical reference point. This report summarizes the guidelines for developing and reporting these analytical results in a consistent and meaningful manner for all home energy uses using standard operating conditions.

  17. A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.

    Science.gov (United States)

    Płotka-Wasylka, J

    2018-05-01

    A new means for assessing analytical protocols relating to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was created using such tools as the National Environmental Methods Index (NEMI) or the Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams is used to evaluate and quantify the environmental impact of each step of an analytical methodology, colour-coded from green through yellow to red to depict low, medium, and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples and in polycyclic aromatic hydrocarbon determination by EPA methods. The GAPI tool not only provides an immediately perceptible perspective to the user/reader but also offers exhaustive information on the evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.

  18. Modelling the protocol stack in NCS with deterministic and stochastic petri net

    Science.gov (United States)

    Hui, Chen; Chunjie, Zhou; Weifeng, Zhu

    2011-06-01

    The protocol stack is the basis of networked control systems (NCS). Full or partial reconfiguration of the protocol stack offers both optimised communication services and improved system performance. Nowadays, field testing is unrealistic for determining the performance of a reconfigurable protocol stack, and the Petri net formal description technique offers the best combination of intuitive representation, tool support and analytical capabilities. Traditionally, separation between the different layers of the OSI model has been common practice. Nevertheless, such a layered modelling and analysis framework for the protocol stack leads to a lack of global optimisation for protocol reconfiguration. In this article, we propose a general modelling and analysis framework for NCS based on the cross-layer concept, which establishes an efficient system scheduling model by abstracting the time-constraint, task-interrelation, processor and bus sub-models from the upper and lower layers (application, data link and physical layers). Cross-layer design can help to overcome the inadequacy of global optimisation through information sharing between protocol layers. To illustrate the framework, we take the controller area network (CAN) as a case study. The simulation results of the deterministic and stochastic Petri net (DSPN) model can help us adjust the message scheduling scheme and obtain better system performance.

  19. The effect on reliability and sensitivity to level of training of combining analytic and holistic rating scales for assessing communication skills in an internal medicine resident OSCE.

    Science.gov (United States)

    Daniels, Vijay John; Harley, Dwight

    2017-07-01

    Although previous research has compared checklists to rating scales for assessing communication, the purpose of this study was to compare the effect on reliability and sensitivity to level of training of an analytic, a holistic, and a combined analytic-holistic rating scale in assessing communication skills. The University of Alberta Internal Medicine Residency runs OSCEs for postgraduate year (PGY) 1 and 2 residents and another for PGY-4 residents. Communication stations were scored with an analytic scale (empathy, non-verbal skills, verbal skills, and coherence subscales) and a holistic scale. The authors analyzed the reliability of the individual and combined scales using generalizability theory and evaluated each scale's sensitivity to level of training. For the analytic, holistic, and combined scales, 12, 12, and 11 stations, respectively, yielded a Phi of 0.8 for the PGY-1,2 cohort, and 16, 16, and 14 stations yielded a Phi of 0.8 for the PGY-4 cohort. PGY-4 residents scored higher on the combined scale, the analytic rating scale, and the non-verbal and coherence subscales. A combined analytic-holistic rating scale increased score reliability and was sensitive to level of training. Given increased validity evidence, OSCE developers should consider combining analytic and holistic scales when assessing communication skills. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Chapter 23: Combined Heat and Power Evaluation Protocol. The Uniform Methods Project: Methods for Determining Energy Efficiency Savings for Specific Measures

    Energy Technology Data Exchange (ETDEWEB)

    Kurnik, Charles W. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Simons, George [Itron, Davis, CA (United States); Barsun, Stephan [Itron, Davis, CA (United States)

    2017-11-06

    The main focus of most evaluations is to determine the energy-savings impacts of the installed measure. This protocol defines a combined heat and power (CHP) measure as a system that sequentially generates both electrical energy and useful thermal energy from one fuel source at a host customer's facility or residence. This protocol is aimed primarily at regulators and administrators of ratepayer-funded CHP programs; however, project developers may find the protocol useful to understand how CHP projects are evaluated.

  1. Analytical study of the conjecture rule for the combination of multipole effects in LHC

    CERN Document Server

    Guignard, Gilbert

    1997-01-01

    This paper summarizes an analytical investigation of the conjecture law, found by tracking, for the effect on the dynamic aperture of combining two multipoles of various orders. A one-dimensional model leading to an integrable system has been used to find closed formulae for the dynamic aperture associated with a fully distributed multipole. The combination has then been studied and the resulting expression compared with the assumed conjecture law. For integrated multipoles that are small with respect to the focusing strength, the conjecture appears to hold, though with an exponent different from the one expected by crude reasoning.

  2. A New Anaesthetic Protocol for Adult Zebrafish (Danio rerio): Propofol Combined with Lidocaine.

    Directory of Open Access Journals (Sweden)

    Ana M Valentim

    Full Text Available The increasing use of the zebrafish model has not been accompanied by the evolution of proper anaesthesia for this species in research. The most used anaesthetic in fish, MS222, may induce aversion, reduction of heart rate, and consequently high mortality, especially during long exposures. Therefore, we aimed to explore new anaesthetic protocols for zebrafish by studying the quality of anaesthesia and recovery induced by different concentrations of propofol alone and in combination with different concentrations of lidocaine. In experiment A, eighty-three AB zebrafish were randomly assigned to 7 different groups: control; 2.5 (2.5P), 5 (5P) or 7.5 μg/ml (7.5P) of propofol; and 2.5 μg/ml of propofol combined with 50 (P/50L), 100 (P/100L) or 150 μg/ml (P/150L) of lidocaine. Zebrafish were placed in an anaesthetic water bath, and the time to lose equilibrium, the reflex to touch, the reflex to a tail pinch, and the respiratory rate were measured. The time to regain equilibrium was also assessed in a clean tank. Five and 24 hours after recovery from anaesthesia, zebrafish were evaluated for activity and reactivity. Afterwards, in a second phase of experiments (experiment B), the best protocol from experiment A was compared with a new group of 8 fish treated with 100 mg/L of MS222 (100M). In experiment A, only the different concentrations of the propofol/lidocaine combination induced full anaesthesia in all animals; thus only these groups were compared with a standard dose of MS222 in experiment B. Propofol/lidocaine induced a quicker loss of equilibrium and loss of response to light and painful stimuli compared with MS222. However, zebrafish treated with MS222 recovered more quickly than those treated with propofol/lidocaine. In conclusion, the propofol/lidocaine combination and MS222 have advantages in different situations. MS222 is ideal for minor procedures when a quick recovery is important, while propofol/lidocaine is best for inducing quick and complete anaesthesia.

  3. Analyzing the effect of routing protocols on media access control protocols in radio networks

    Energy Technology Data Exchange (ETDEWEB)

    Barrett, C. L. (Christopher L.); Drozda, M. (Martin); Marathe, A. (Achla); Marathe, M. V. (Madhav V.)

    2002-01-01

    We study the effect of routing protocols on the performance of media access control (MAC) protocols in wireless radio networks. Three well-known MAC protocols are considered: 802.11, CSMA, and MACA. Similarly, three recently proposed routing protocols are considered: AODV, DSR and LAR scheme 1. The experimental analysis was carried out using GloMoSim, a tool for simulating wireless networks. The main focus of our experiments was to study how the routing protocols affect the performance of the MAC protocols when the underlying network and traffic parameters are varied. The performance of the protocols was measured w.r.t. five important parameters: (i) number of received packets, (ii) average latency of each packet, (iii) throughput, (iv) long-term fairness and (v) number of control packets at the MAC layer level. Our results show that combinations of routing and MAC protocols yield varying performance under varying network topology and traffic situations. The result has an important implication: no combination of routing protocol and MAC protocol is the best over all situations. Also, the performance of protocols at a given level in the protocol stack needs to be studied not in isolation but as part of the complete protocol stack. A novel aspect of our work is the use of a statistical technique, ANOVA (Analysis of Variance), to characterize the effect of routing protocols on MAC protocols. This technique is of independent interest and can be utilized in several other simulation and empirical studies.
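
    As a minimal illustration of the ANOVA step described above, the sketch below runs a one-way analysis of variance on hypothetical per-run throughput values grouped by routing protocol. The numbers are placeholders, not GloMoSim output, and the full study would also account for the MAC protocol and the network and traffic parameters.

        # One-way ANOVA: does the routing protocol affect throughput?
        # Throughput values below are hypothetical placeholders, not simulation results.
        from scipy.stats import f_oneway

        throughput = {                      # packets/s per simulation run (made up)
            "AODV": [51.2, 49.8, 52.4, 50.1],
            "DSR":  [47.9, 48.5, 46.7, 47.2],
            "LAR1": [53.0, 54.1, 52.6, 53.8],
        }

        f_stat, p_value = f_oneway(*throughput.values())
        print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
        if p_value < 0.05:
            print("Routing protocol has a statistically significant effect on throughput.")
        else:
            print("No significant effect detected at the 5% level.")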

  4. Analytical protocol to study the food safety of (multiple-)recycled high-density polyethylene (HDPE) and polypropylene (PP) crates: Influence of recycling on the migration and formation of degradation products

    NARCIS (Netherlands)

    Coulier, L.; Orbons, H.G.M.; Rijk, R.

    2007-01-01

    An analytical protocol was set up and successfully applied to study the food safety of recycled HDPE and PP crates. A worst-case scenario was applied that focused not only on overall migration and specific migration of accepted starting materials but also on migratable degradation products of

  5. Building America House Simulation Protocols (Revised)

    Energy Technology Data Exchange (ETDEWEB)

    Hendron, R.; Engebrecht, C.

    2010-10-01

    The House Simulation Protocol document was developed to track and manage progress toward Building America's multi-year, average whole-building energy reduction research goals for new construction and existing homes, using a consistent analytical reference point. This report summarizes the guidelines for developing and reporting these analytical results in a consistent and meaningful manner for all home energy uses using standard operating conditions.

  6. A tiered analytical protocol for the characterization of heavy oil residues at petroleum-contaminated hazardous waste sites

    International Nuclear Information System (INIS)

    Pollard, S.J.T.; Kenefick, S.L.; Hrudey, S.E.; Fuhr, B.J.; Holloway, L.R.; Rawluk, M.

    1994-01-01

    The analysis of hydrocarbon-contaminated soils from abandoned refinery sites in Alberta, Canada is used to illustrate a tiered analytical approach to the characterization of complex hydrocarbon wastes. Soil extracts isolated from heavy oil- and creosote-contaminated sites were characterized by thin layer chromatography with flame ionization detection (TLC-FID), ultraviolet fluorescence, simulated distillation (GC-SIMDIS) and chemical ionization GC-MS analysis. The combined screening and detailed analytical methods provided information essential to remedial technology selection including the extent of contamination, the class composition of soil extracts, the distillation profile of component classes and the distribution of individual class components within various waste fractions. Residual contamination was characteristic of heavy, degraded oils, consistent with documented site operations and length of hydrocarbon exposure at the soil surface

  7. A combined experimental and analytical approach for interface fracture parameters of dissimilar materials in electronic packages

    International Nuclear Information System (INIS)

    Kay, N.R.; Ghosh, S.; Guven, I.; Madenci, E.

    2006-01-01

    This study concerns the development of a combined experimental and analytical technique to determine the critical values of fracture parameters for interfaces between dissimilar materials in electronic packages. This technique utilizes specimens from post-production electronic packages. The mechanical testing is performed inside a scanning electron microscope while the measurements are achieved by means of digital image correlation. The measured displacements around the crack tip are used as the boundary conditions for the analytical model to compute the energy release rate. The critical energy release rate values obtained from post-production package specimens were found to be lower than those of laboratory specimens.

  8. Recolonization of the oral cavity by Streptococcus mutans after a combined mechanical/chemical antisepsis protocol.

    Science.gov (United States)

    Farina, R; Squarzoni, M A; Calura, G; Trombelli, L

    2009-06-01

    The bacterial colonization of teeth by Streptococcus mutans (StrepM) represents a major risk factor for the development of dental caries. At present, no clinical studies have explored the effect of a combined mechanical-chemical antisepsis protocol in a periodontally-healthy population and the pattern of recolonization of StrepM in subjects whose StrepM infection was successfully eradicated. The present study was designed in order to 1) determine the salivary and plaque changes in StrepM content after a combined mechanical/chemical antisepsis protocol; and 2) evaluate the pattern of recolonization when StrepM was successfully eradicated from saliva and plaque. Thirty-five periodontally-healthy and caries-susceptible subjects successfully entered and concluded the study. At baseline, non-surgical periodontal therapy was performed according to the principles of full mouth disinfection. Adjunctive home-based rinsing with a 0.2% chlorhexidine mouthrinse was requested for the following week. StrepM concentration was assessed in saliva and plaque at the initial contact appointment, at baseline, and at the 1-week, 1-month, 3-month and 6-month follow-up. A significant effect of "time" on StrepM concentration in saliva and plaque was observed (P<0.000). In subjects with successful eradication of StrepM at 1 week (N=17 plaque samples), StrepM infection recurrence occurred within 3-6 months. The results of the present study demonstrated that 1) the application of the investigated mechanical/chemical antisepsis protocol can effectively reduce StrepM colonies in saliva and plaque of periodontally healthy subjects; and 2) in plaque samples, StrepM infection recurrence tends to occur within 3-6 months.

  9. On privacy-preserving protocols for smart metering systems security and privacy in smart grids

    CERN Document Server

    Borges de Oliveira, Fábio

    2017-01-01

    This book presents current research in privacy-preserving protocols for smart grids. It contains several approaches and compares them analytically and by means of simulation. In particular, the book introduces asymmetric DC-Nets, which offer an ideal combination of performance and features in comparison with homomorphic encryption; data anonymization via cryptographic protocols; and data obfuscation by means of noise injection or by means of the installation of storage banks. The author shows that this theory can be leveraged into several application scenarios, and how asymmetric DC-Nets are generalizations of additive homomorphic encryption schemes and abstractions of symmetric DC-Nets. The book provides the reader with an understanding about smart grid scenarios, the privacy problem, and the mathematics and algorithms used to solve it.
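
    To make the DC-net idea concrete, the sketch below shows a toy symmetric DC-net style aggregation in which every pair of meters shares a random pad that cancels out in the sum, so the aggregator learns only the total consumption. It is an illustrative simplification, not the asymmetric DC-net construction developed in the book, and the readings are placeholder values.

        # Toy symmetric DC-net aggregation: pairwise pads cancel in the sum,
        # so only the total of all meter readings is revealed.
        import random

        MOD = 2**32                           # arithmetic modulo a large constant
        readings = [13, 27, 8, 42]            # hypothetical meter readings (Wh)
        n = len(readings)

        # Each unordered pair (i, j) shares a random pad; meter i adds it, meter j subtracts it.
        pads = {(i, j): random.randrange(MOD) for i in range(n) for j in range(i + 1, n)}

        def blinded(i):
            value = readings[i]
            for (a, b), pad in pads.items():
                if a == i:
                    value += pad
                elif b == i:
                    value -= pad
            return value % MOD

        published = [blinded(i) for i in range(n)]   # what each meter sends
        total = sum(published) % MOD                 # pads cancel pairwise
        print("published values:", published)
        print("recovered total :", total, "expected:", sum(readings))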

  10. Multi-site study of additive genetic effects on fractional anisotropy of cerebral white matter: comparing meta and mega analytical approaches for data pooling

    Science.gov (United States)

    Kochunov, Peter; Jahanshad, Neda; Sprooten, Emma; Nichols, Thomas E.; Mandl, René C.; Almasy, Laura; Booth, Tom; Brouwer, Rachel M.; Curran, Joanne E.; de Zubicaray, Greig I.; Dimitrova, Rali; Duggirala, Ravi; Fox, Peter T.; Hong, L. Elliot; Landman, Bennett A.; Lemaitre, Hervé; Lopez, Lorna; Martin, Nicholas G.; McMahon, Katie L.; Mitchell, Braxton D.; Olvera, Rene L.; Peterson, Charles P.; Starr, John M.; Sussmann, Jessika E.; Toga, Arthur W.; Wardlaw, Joanna M.; Wright, Margaret J.; Wright, Susan N.; Bastin, Mark E.; McIntosh, Andrew M.; Boomsma, Dorret I.; Kahn, René S.; den Braber, Anouk; de Geus, Eco JC; Deary, Ian J.; Hulshoff Pol, Hilleke E.; Williamson, Douglas E.; Blangero, John; van ’t Ent, Dennis; Thompson, Paul M.; Glahn, David C.

    2014-01-01

    Combining datasets across independent studies can boost statistical power by increasing the numbers of observations and can achieve more accurate estimates of effect sizes. This is especially important for genetic studies where a large number of observations are required to obtain sufficient power to detect and replicate genetic effects. There is a need to develop and evaluate methods for joint-analytical analyses of rich datasets collected in imaging genetics studies. The ENIGMA-DTI consortium is developing and evaluating approaches for obtaining pooled estimates of heritability through meta- and mega-genetic analytical approaches, to estimate the general additive genetic contributions to the intersubject variance in fractional anisotropy (FA) measured from diffusion tensor imaging (DTI). We used the ENIGMA-DTI data harmonization protocol for uniform processing of DTI data from multiple sites. We evaluated this protocol in five family-based cohorts providing data from a total of 2248 children and adults (ages: 9–85) collected with various imaging protocols. We used the imaging genetics analysis tool, SOLAR-Eclipse, to combine twin and family data from Dutch, Australian and Mexican-American cohorts into one large “mega-family”. We showed that heritability estimates may vary from one cohort to another. We used two meta-analytical (the sample-size and standard-error weighted) approaches and a mega-genetic analysis to calculate heritability estimates across populations. We performed a leave-one-out analysis of the joint estimates of heritability, removing a different cohort each time to understand the estimate variability. Overall, meta- and mega-genetic analyses of heritability produced robust estimates of heritability. PMID:24657781
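
    The two meta-analytical weighting schemes mentioned above (sample-size weighted and standard-error weighted) reduce to simple weighted averages. The sketch below computes both pooled heritability estimates from hypothetical per-cohort values; the cohort names and numbers are placeholders, not ENIGMA-DTI results.

        # Pooled heritability via sample-size and inverse-variance (standard-error) weighting.
        # Cohort values below are hypothetical placeholders.
        cohorts = [
            # (name, h2 estimate, standard error, sample size)
            ("cohort_A", 0.55, 0.06, 820),
            ("cohort_B", 0.48, 0.09, 410),
            ("cohort_C", 0.62, 0.07, 600),
        ]

        n_weighted = sum(h2 * n for _, h2, _, n in cohorts) / sum(n for *_, n in cohorts)

        inv_var_weights = [1.0 / se**2 for _, _, se, _ in cohorts]
        se_weighted = sum(w * h2 for w, (_, h2, _, _) in zip(inv_var_weights, cohorts)) / sum(inv_var_weights)

        print(f"sample-size weighted h2     : {n_weighted:.3f}")
        print(f"inverse-variance weighted h2: {se_weighted:.3f}")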

  11. Combined microfluidization and ultrasonication: a synergistic protocol for high-efficient processing of SWCNT dispersions with high quality

    Energy Technology Data Exchange (ETDEWEB)

    Luo, Sida, E-mail: s.luo@buaa.edu.cn [Beihang University, School of Mechanical Engineering and Automation (China); Liu, Tao, E-mail: tliu@fsu.edu [Florida State University, High-Performance Materials Institute (United States); Wang, Yong; Li, Liuhe [Beihang University, School of Mechanical Engineering and Automation (China); Wang, Guantao; Luo, Yun [China University of Geosciences, Center of Safety Research, School of Engineering and Technology (China)

    2016-08-15

    High-efficiency, large-scale production of high-quality CNT dispersions is necessary for meeting the future needs of developing various CNT-based electronic devices. Herein, we have designed novel processing protocols by combining a conventional ultrasonication process with a new microfluidization technique to produce high-quality SWCNT dispersions with improved processing efficiency. To judge the quality of SWCNT dispersions, one critical factor is the degree of exfoliation, which can be quantified by both the geometrical dimensions of the exfoliated nanotubes and the percentage of individual tubes in a given dispersion. In this paper, the synergistic effect of the combined protocols was systematically investigated by evaluating SWCNT dispersions with newly developed characterization techniques, namely the preparative ultracentrifuge method (PUM) and simultaneous Raman scattering and photoluminescence spectroscopy (SRSPL). The results of both techniques lead to similar conclusions: compared with either process operated separately, low-pass microfluidization followed by a reasonable duration of ultrasonication could substantially improve the processing efficiency, producing high-quality SWCNT dispersions with average particle length and diameter as small as ~600 and ~2 nm, respectively.

  12. A Standardized and Reproducible Urine Preparation Protocol for Cancer Biomarkers Discovery

    Directory of Open Access Journals (Sweden)

    Julia Beretov

    2014-01-01

    Full Text Available A suitable and standardized protein purification technique is essential to maintain consistency and to allow data comparison between proteomic studies for urine biomarker discovery. Ultimately, efforts should be made to standardize urine preparation protocols. The aim of this study was to develop an optimal analytical protocol to achieve maximal protein yield and to ensure that this method was applicable for examining urine protein patterns that distinguish disease and disease-free states. In this pilot study, we compared seven different urine sample preparation methods to remove salts, and to precipitate and isolate urinary proteins. Sodium dodecyl sulfate-polyacrylamide gel electrophoresis (SDS-PAGE) profiles showed that the sequential preparation of urinary proteins by combining acetone and trichloroacetic acid (TCA) alongside high speed centrifugation (HSC) provided the best separation and retained the most urinary proteins. Therefore, this approach is the preferred method for all further urine protein analysis.

  13. NEREC, an effective brain mapping protocol for combined language and long-term memory functions.

    Science.gov (United States)

    Perrone-Bertolotti, Marcela; Girard, Cléa; Cousin, Emilie; Vidal, Juan Ricardo; Pichat, Cédric; Kahane, Philippe; Baciu, Monica

    2015-12-01

    Temporal lobe epilepsy can induce functional plasticity in temporoparietal networks involved in language and long-term memory processing. Previous studies in healthy subjects have revealed the relative difficulty for these networks to respond effectively across different experimental designs, as compared to more reactive regions such as the frontal lobes. For a protocol to be optimal for clinical use, it first has to show robust effects in a healthy cohort. In this study, we developed a novel experimental paradigm entitled NEREC, which is able to reveal the robust participation of temporoparietal networks in a uniquely combined language and memory task, validated in an fMRI study with healthy subjects. Concretely, NEREC is composed of two runs: (a) an intermixed language-memory task (confrontation naming associated with encoding of nonverbal items, NE) to map language (i.e., word retrieval and lexico-semantic processes) combined with simultaneous long-term verbal memory encoding (NE items named but also explicitly memorized) and (b) a memory retrieval task of items encoded during NE (word recognition, REC) intermixed with new items. Word recognition is based on both perceptual-semantic familiarity (feeling of 'know') and accessing stored memory representations (remembering). In order to maximize remembering and the recruitment of medial temporal lobe structures, we increased REC difficulty by changing the modality of stimulus presentation (from nonverbal during NE to verbal during REC). We report that (a) temporoparietal activation during NE was attributable to both lexico-semantic (language) and memory (episodic encoding and semantic retrieval) processes; that (b) encoding activated the left hippocampus, bilateral fusiform, and bilateral inferior temporal gyri; and that (c) the recognition task (recollection) activated the right hippocampus and the bilateral but predominantly left fusiform gyrus. The novelty of this protocol consists of (a) combining two tasks in one (language

  14. Clinical results after different protocols of combined local heat and radiation

    International Nuclear Information System (INIS)

    Arcangeli, G.; Cividalli, A.; Nervi, C.; Lovisolo, G.

    1983-01-01

    Since 1977, 69 patients with 138 multiple lesions have been treated with combined radiotherapy and hyperthermia, according to 3 protocols. Firstly, radiotherapy was given following a thrice-a-day fractionation scheme of 1.5 to 2 Gy/fraction, up to 60 Gy. Hyperthermia (42.5 °C/45 min) was applied every other day, immediately after the 2nd radiation fraction. The immediate response was significantly higher in the combined group (76% clearance in comparison with 46% after radiotherapy alone). Secondly, tumors received 40 Gy/8 fractions, twice a week, and hyperthermia (42.5 °C/45 min) was applied with each radiotherapy fraction, either immediately after irradiation (simultaneously) or 4 h later (sequentially). A remarkable improvement of the radiation response was obtained, especially with the simultaneous treatment. Thirdly, tumors received 30 Gy/6 fractions, twice a week. Hyperthermia (45 °C/30 min) was applied simultaneously with each radiotherapy fraction and the surrounding skin was cooled. Complete tumor clearance was achieved in 88% of lesions in comparison with 31% after radiotherapy alone. As expected, the incidence of thermal damage on uncooled skin was also increased. In conclusion, the best therapeutic ratio was obtained with low fractional radiotherapy doses and low-temperature hyperthermia. (orig.)

  15. [New protocol combining orthodontics and implant therapy for partially edentulous adult patients. Part I: Description of the Decker protocol].

    Science.gov (United States)

    Davarpanah, K; Decker, A; Sache, M P; Deffrennes, D; Demurashvili, G; Szmukler-Moncler, S

    2014-12-01

    The treatment of adult malocclusion is usually complex and multidisciplinary, and its prognosis is not reliable. We present a new clinical protocol to improve the management and the final result. We use a specific software package (Simplant®, OMS®, Materialise Dental) and its accessory modules. It allows visualization of the expected final clinical result of the orthodontic treatment. Combined with guided surgery, it allows implants to be placed at the beginning of treatment in a position that is compatible with their final position. The implants serve as absolute anchorage for dental movements during the orthodontic step and are also used to support the final prosthesis. The treatment is thus optimized and its prognosis is improved. Finally, the reversed surgical sequence shortens the treatment, thus promoting patient compliance. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  16. Risk of Deep vein thrombosis in neurosurgery: State of the art on prophylaxis protocols and best clinical practices.

    Science.gov (United States)

    Ganau, Mario; Prisco, Lara; Cebula, Helene; Todeschi, Julien; Abid, Houssem; Ligarotti, Gianfranco; Pop, Raoul; Proust, Francois; Chibbaro, Salvatore

    2017-11-01

    To analytically discuss some protocols for deep vein thrombosis (DVT)/pulmonary embolism (PE) prophylaxis currently in use in neurosurgical departments around the world. Analysis of the prophylaxis protocols in the English literature: an analytical and narrative review of the literature concerning DVT prophylaxis protocols in neurosurgery was conducted through a PubMed search (back to 1978). 80 abstracts were reviewed, and 74 articles were extracted. The majority of DVTs seem to develop within the first week after a neurosurgical procedure, and a linear correlation between the duration of surgery and DVT occurrence has been highlighted. The incidence of DVT seems greater for cranial (7.7%) than spinal procedures (1.5%). Although intermittent pneumatic compression (IPC) devices provided adequate reduction of DVT/PE in some cranial and combined cranial/spinal series, low-dose subcutaneous unfractionated heparin (UFH) or low molecular-weight heparin (LMWH) further reduced the incidence, not always of DVT, but of PE. Nevertheless, low-dose heparin-based prophylaxis in cranial and spinal series risks minor and major postoperative haemorrhages: 2-4% in cranial series, 3.4% minor and 3.4% major haemorrhages in combined cranial/spinal series, and a 0.7% incidence of major/minor haemorrhages in spinal series. This analysis showed that most of the available articles are case series and case reports. Until clear guidelines are defined and universally applied to this diverse group of patients, any prophylaxis for DVT and PE should be tailored to the individual patient, with careful assessment of benefits versus risks. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Big data analytics : predicting traffic flow regimes from simulated connected vehicle messages using data analytics and machine learning.

    Science.gov (United States)

    2016-12-25

    The key objectives of this study were to: 1. Develop advanced analytical techniques that make use of a dynamically configurable connected vehicle message protocol to predict traffic flow regimes in near-real time in a virtual environment and examine ...

  18. Agricultural Soil Spectral Response and Properties Assessment: Effects of Measurement Protocol and Data Mining Technique

    Directory of Open Access Journals (Sweden)

    Asa Gholizadeh

    2017-10-01

    Full Text Available Soil spectroscopy has been shown to be a fast, cost-effective, environmentally friendly, non-destructive, reproducible and repeatable analytical technique. Soil components, as well as types of instruments, protocols, sampling methods, sample preparation, spectral acquisition techniques and analytical algorithms, have a combined influence on the final performance. Therefore, it is important to characterize these differences and to introduce an effective approach in order to minimize the technical factors that alter reflectance spectra and the consequent predictions. To quantify this alteration, a joint project between the Czech University of Life Sciences Prague (CULS) and Tel-Aviv University (TAU) was conducted to estimate Cox, pH-H2O, pH-KCl and selected forms of Fe and Mn. Two different soil spectral measurement protocols and two data mining techniques were used to examine seventy-eight soil samples from five agricultural areas in different parts of the Czech Republic. Spectral measurements at both laboratories were made using different ASD spectroradiometers. The CULS protocol was based on a contact probe (CP) spectral measurement scheme, while the TAU protocol was carried out using a CP measurement method accompanied by the internal soil standard (ISS) procedure. The two spectral datasets, acquired from the different protocols, were both analyzed using the partial least square regression (PLSR) technique as well as PARACUDA II®, a new data mining engine for optimizing PLSR models. The results showed that spectra based on the CULS setup (non-ISS) demonstrated significantly higher albedo intensity and reflectance values relative to the TAU setup with ISS. However, the majority of statistics using the TAU protocol were not noticeably better than those for the CULS spectra. The paper also highlighted that under both measurement protocols, the PARACUDA II® engine proved to be a powerful tool for providing better results than PLSR alone. Such an initiative is not only a way to
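
    For readers unfamiliar with the PLSR step referred to above, the sketch below fits a partial least squares regression on a small synthetic spectra-like matrix and reports cross-validated R². It is a generic scikit-learn illustration on made-up data, not the CULS/TAU processing chain or the PARACUDA II engine.

        # Generic PLSR fit with cross-validation on synthetic "spectra" (illustration only).
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n_samples, n_bands, n_latent = 78, 200, 5
        scores = rng.normal(size=(n_samples, n_latent))             # latent spectral factors
        loadings = rng.normal(size=(n_latent, n_bands))
        X = scores @ loadings + 0.05 * rng.normal(size=(n_samples, n_bands))       # synthetic spectra
        y = scores @ rng.normal(size=n_latent) + 0.1 * rng.normal(size=n_samples)  # a soil property, e.g. Cox

        pls = PLSRegression(n_components=5)
        r2 = cross_val_score(pls, X, y, cv=5, scoring="r2")
        print("cross-validated R2 per fold:", np.round(r2, 2))
        print("mean R2:", round(float(r2.mean()), 2))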

  19. Pre-analytical and post-analytical evaluation in the era of molecular diagnosis of sexually transmitted diseases: cellularity control and internal control

    Directory of Open Access Journals (Sweden)

    Loria Bianchi

    2014-06-01

    Full Text Available Background. The increase in molecular tests performed on DNA extracted from various biological materials should not occur without adequate standardization of the pre-analytical and post-analytical phases. Materials and Methods. The aim of this study was to evaluate the role of an internal control (IC) in standardizing the pre-analytical phase and the role of a cellularity control (CC) in evaluating the suitability of biological matrices, and their influence on false negative results. 120 cervical swabs (CS) were pre-treated and extracted following 3 different protocols. Extraction performance was evaluated by amplification of: the IC, added to each extraction mix; the human gene HPRT1 (CC) with RT-PCR to quantify sample cellularity; and the L1 region of HPV with SPF10 primers. 135 urine samples, 135 urethral swabs, 553 CS and 332 ThinPrep swabs (TP) were tested for C. trachomatis (CT) and U. parvum (UP) with RT-PCR and for HPV by endpoint-PCR. Samples were also tested for cellularity. Results. The extraction protocol with the highest average cellularity (Ac) per sample showed the lowest number of samples with inhibitors; the highest HPV positivity was achieved by the protocol with the greatest Ac per PCR. CS and TP under 300,000 cells/sample showed a significant decrease of UP (P<0.01) and HPV (P<0.005) positivity. Female urine under 40,000 cells/mL was inadequate to detect UP (P<0.05). Conclusions. Our data show that IC and CC allow optimization of the pre-analytical phase, with an increase in analytical quality. Cellularity per sample allows better evaluation of sample adequacy, which is crucial to avoid false negative results, while cellularity per PCR allows better optimization of PCR amplification. Further data are required to define the optimal cut-off for result normalization.

  20. Connectivity-Based Reliable Multicast MAC Protocol for IEEE 802.11 Wireless LANs

    Directory of Open Access Journals (Sweden)

    Woo-Yong Choi

    2009-01-01

    Full Text Available We propose an efficient reliable multicast MAC protocol based on the connectivity information among the recipients. Enhancing the BMMM (Batch Mode Multicast MAC) protocol, the reliable multicast MAC protocol significantly reduces the RAK (Request for ACK) frame transmissions in a reasonable computational time and enhances the MAC performance. Through analytical performance analysis, the throughputs of the BMMM protocol and our proposed MAC protocol are derived. Numerical examples show that our proposed MAC protocol increases the reliable multicast MAC performance for IEEE 802.11 wireless LANs.

  1. Development of a fast and simple gas chromatographic protocol based on the combined use of alkyl chloroformate and solid phase microextraction for the assay of polyamines in human urine.

    Science.gov (United States)

    Naccarato, Attilio; Elliani, Rosangela; Cavaliere, Brunella; Sindona, Giovanni; Tagarelli, Antonio

    2018-05-11

    Polyamines are aliphatic amines with low molecular weight that are widely recognized as one of the most important classes of cancer biomarkers for early diagnosis and treatment. The goal of the work presented herein is the development of a rapid and simple method for the quantification of free polyamines (i.e., putrescine, cadaverine, spermidine, spermine) and N-monoacetylated polyamines (i.e., N1-acetylspermidine, N8-acetylspermidine, and N1-acetylspermine) in human urine. A preliminary derivatization with propyl chloroformate combined with the use of solid phase microextraction (SPME) allowed for an easy and automatable protocol involving minimal sample handling and no consumption of organic solvents. The affinity of the analytes toward five commercial SPME coatings was evaluated in univariate mode, and the best result in terms of analyte extraction was achieved using the divinylbenzene/carboxen/polydimethylsiloxane fiber. The variables affecting the performance of the SPME analysis were optimized by a multivariate design-of-experiments approach, in particular using a central composite design (CCD). The optimal working conditions in terms of response values are the following: extraction temperature 40 °C, extraction time 15 min, and no addition of NaCl. Analyses were carried out by gas chromatography-triple quadrupole mass spectrometry (GC-QqQ-MS) in selected reaction monitoring (SRM) acquisition mode. The developed method was validated according to the guidelines issued by the Food and Drug Administration (FDA). The satisfactory performance reached in terms of linearity, sensitivity (LOQs between 0.01 and 0.1 μg/mL), matrix effect (68-121%), accuracy and precision (inter-day values between -24% and +16%, and in the range 3.3-28.4%, respectively) makes the proposed protocol suitable for the quantification of these important biomarkers in urine samples. Copyright © 2018 Elsevier B.V. All rights reserved.
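
    The central composite design mentioned above can be written out explicitly for the three factors studied (extraction temperature, extraction time, NaCl addition). The sketch below generates a generic face-centred CCD matrix in coded units and maps it to illustrative factor ranges; the ranges are assumptions for illustration, not the exact levels used by the authors.

        # Face-centred central composite design (CCD) for three factors, coded -1/0/+1.
        from itertools import product

        factors = {                 # illustrative (low, high) ranges, not the paper's exact levels
            "temperature_C": (30, 60),
            "time_min":      (5, 30),
            "nacl_percent":  (0, 20),
        }
        names = list(factors)

        factorial = list(product([-1, 1], repeat=3))                  # 8 cube points
        axial = [tuple(a if i == j else 0 for j in range(3))          # 6 face-centred axial points
                 for i in range(3) for a in (-1, 1)]
        center = [(0, 0, 0)] * 3                                      # replicated centre points
        design_coded = factorial + axial + center

        def decode(point):
            return {n: lo + (c + 1) / 2 * (hi - lo)
                    for n, c, (lo, hi) in zip(names, point, factors.values())}

        for run, point in enumerate(design_coded, start=1):
            print(run, decode(point))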

  2. Pre-analytical and analytical factors influencing Alzheimer's disease cerebrospinal fluid biomarker variability.

    Science.gov (United States)

    Fourier, Anthony; Portelius, Erik; Zetterberg, Henrik; Blennow, Kaj; Quadrio, Isabelle; Perret-Liaudet, Armand

    2015-09-20

    A panel of cerebrospinal fluid (CSF) biomarkers, including total Tau (t-Tau), Tau protein phosphorylated at residue 181 (p-Tau) and β-amyloid peptides (Aβ42 and Aβ40), is frequently used as an aid in Alzheimer's disease (AD) diagnosis for young patients with cognitive impairment, for predicting prodromal AD in mild cognitive impairment (MCI) subjects, for AD discrimination in atypical clinical phenotypes, and for inclusion/exclusion and stratification of patients in clinical trials. Due to variability in absolute levels between laboratories, there is no consensus on a medical cut-off value for the CSF AD signature. Thus, for full implementation of this core AD biomarker panel in clinical routine, this issue has to be solved. Variability can be explained by both pre-analytical and analytical factors. For example, the plastic tubes used for CSF collection and storage, the lack of reference material and the variability of the analytical protocols were identified as important sources of variability. The aim of this review is to highlight these pre-analytical and analytical factors and to describe the efforts made to counteract them in order to establish cut-off values for core CSF AD biomarkers. This review gives the current state of recommendations. Copyright © 2015. Published by Elsevier B.V.

  3. Biofluid infrared spectro-diagnostics: pre-analytical considerations for clinical applications.

    Science.gov (United States)

    Lovergne, L; Bouzy, P; Untereiner, V; Garnotel, R; Baker, M J; Thiéfin, G; Sockalingum, G D

    2016-06-23

    Several proof-of-concept studies on the vibrational spectroscopy of biofluids have demonstrated that the methodology has promising potential as a clinical diagnostic tool. However, these studies also show that there is a lack of a standardised protocol in sample handling and preparation prior to spectroscopic analysis. One of the most important sources of analytical errors is the pre-analytical phase. For the technique to be translated into clinics, it is clear that a very strict protocol needs to be established for such biological samples. This study focuses on some of the aspects of the pre-analytical phase in the development of the high-throughput Fourier Transform Infrared (FTIR) spectroscopy of some of the most common biofluids such as serum, plasma and bile. Pre-analytical considerations that can impact either the samples (solvents, anti-coagulants, freeze-thaw cycles…) and/or spectroscopic analysis (sample preparation such as drying, deposit methods, volumes, substrates, operators dependence…) and consequently the quality and the reproducibility of spectral data will be discussed in this report.

  4. A fast semi-analytical model for the slotted structure of induction motors with 36/28 stator/rotor slot combination

    NARCIS (Netherlands)

    Sprangers, R.L.J.; Paulides, J.J.H.; Gysen, B.L.J.; Lomonova, E.A.

    2014-01-01

    A fast, semi-analytical model for induction motors (IMs) with 36/28 stator/rotor slot combination is presented. In comparison to traditional analytical models for IMs, such as lumped parameter, magnetic equivalent circuit and anisotropic layer models, the presented model calculates a continuous

  5. Security of Semi-Device-Independent Random Number Expansion Protocols.

    Science.gov (United States)

    Li, Dan-Dan; Wen, Qiao-Yan; Wang, Yu-Kun; Zhou, Yu-Qian; Gao, Fei

    2015-10-27

    Semi-device-independent random number expansion (SDI-RNE) protocols require some truly random numbers to generate fresh ones, while making no assumptions on the internal working of the quantum devices except for the dimension of the Hilbert space. The generated randomness is certified by non-classical correlation in the prepare-and-measure test. Until now, the analytical relations between the amount of generated randomness and the degree of non-classical correlation, which are crucial for evaluating the security of SDI-RNE protocols, have not been clear under either the ideal condition or practical ones. In this paper, we first give the analytical relation between the above two factors under the ideal condition. We also derive the analytical relation under practical conditions, where the devices' behavior is not independent and identical in each round and there is deviation in estimating the non-classical behavior of the devices. Furthermore, we choose a different randomness extractor (i.e., a two-universal random function) and give the security proof.

  6. Packet reversed packet combining scheme

    International Nuclear Information System (INIS)

    Bhunia, C.T.

    2006-07-01

    The packet combining scheme is a well-defined, simple error correction scheme that works on erroneous copies of a packet at the receiver. Combined with ARQ protocols, it offers higher throughput in networks than basic ARQ protocols. However, the packet combining scheme fails to correct errors when the errors occur in the same bit locations of the two erroneous copies. In the present work, we propose a scheme that corrects errors even when they occur at the same bit locations of the erroneous copies. The proposed scheme, when combined with an ARQ protocol, offers higher throughput. (author)
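
    As background to the scheme sketched in this abstract, basic packet combining compares two received copies, treats the bit positions where they disagree as candidate error locations, and searches for the flip pattern that restores a valid checksum. The sketch below illustrates that baseline idea only; the reversal trick the author proposes for handling errors at identical positions is hinted at in a comment, since the abstract does not give its details.

        # Baseline packet combining: brute-force candidate error positions where two copies disagree,
        # accepting the candidate whose CRC matches. (Illustration only; the paper's "reversed" variant
        # would transmit one copy bit-reversed so channel errors at identical positions no longer
        # coincide after un-reversing.)
        import zlib
        from itertools import chain, combinations

        def crc(bits):
            return zlib.crc32(bytes(bits))

        original = [1, 0, 1, 1, 0, 0, 1, 0]
        checksum = crc(original)

        copy1 = original.copy(); copy1[2] ^= 1           # first copy corrupted at position 2
        copy2 = original.copy(); copy2[5] ^= 1           # second copy corrupted at position 5

        disagree = [i for i, (a, b) in enumerate(zip(copy1, copy2)) if a != b]

        def candidates(base, positions):
            for subset in chain.from_iterable(combinations(positions, r) for r in range(len(positions) + 1)):
                cand = base.copy()
                for i in subset:
                    cand[i] ^= 1
                yield cand

        recovered = next(c for c in candidates(copy1, disagree) if crc(c) == checksum)
        print("recovered:", recovered, "matches original:", recovered == original)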

  7. Analytical treatment of the nonlinear electron cloud effect and the combined effects with beam-beam and space charge nonlinear forces in storage rings

    International Nuclear Information System (INIS)

    Gao Jie

    2009-01-01

    In this paper we first treat analytically some nonlinear beam dynamics problems in storage rings, such as dynamic apertures due to magnetic multipoles, wigglers, beam-beam effects and the nonlinear space charge effect, and then the nonlinear electron cloud effect combined with beam-beam and space charge effects. This analytical treatment is applied to BEPC II. The corresponding analytical expressions developed in this paper are useful both in understanding the physics behind these problems and in making quick practical hand estimations. (author)

  8. Asymptotic performance modelling of DCF protocol with prioritized channel access

    Science.gov (United States)

    Choi, Woo-Yong

    2017-11-01

    Recently, a modification of the DCF (Distributed Coordination Function) protocol using prioritized channel access was proposed to resolve the problem that DCF performance worsens exponentially as more nodes exist in IEEE 802.11 wireless LANs. In this paper, an asymptotic analytical performance model is presented to analyze the MAC performance of the DCF protocol with prioritized channel access.

  9. Entanglement distillation protocols and number theory

    International Nuclear Information System (INIS)

    Bombin, H.; Martin-Delgado, M.A.

    2005-01-01

    We show that the analysis of entanglement distillation protocols for qudits of arbitrary dimension D benefits from applying basic concepts from number theory, since the set Z_D^n associated with Bell diagonal states is a module rather than a vector space. We find that a partition of Z_D^n into divisor classes characterizes the invariant properties of mixed Bell diagonal states under local permutations. We construct a very general class of recursion protocols by means of unitary operations implementing these local permutations. We study these distillation protocols depending on whether we use twirling operations in the intermediate steps or not, and we study them both analytically and numerically with Monte Carlo methods. In the absence of twirling operations, we construct extensions of the quantum privacy algorithms valid for secure communications with qudits of any dimension D. When D is a prime number, we show that the distillation protocols are optimal both qualitatively and quantitatively

  10. Developing a learning analytics tool

    DEFF Research Database (Denmark)

    Wahl, Christian; Belle, Gianna; Clemmensen, Anita Lykke

    This poster describes how learning analytics and collective intelligence can be combined in order to develop a tool for providing support and feedback to learners and teachers regarding students' self-initiated learning activities.

  11. Coded Splitting Tree Protocols

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Stefanovic, Cedomir; Popovski, Petar

    2013-01-01

    This paper presents a novel approach to multiple access control called coded splitting tree protocol. The approach builds on the known tree splitting protocols, code structure and successive interference cancellation (SIC). Several instances of the tree splitting protocol are initiated, each... instance is terminated prematurely and subsequently iterated. The combined set of leaves from all the tree instances can then be viewed as a graph code, which is decodable using belief propagation. The main design problem is determining the order of splitting, which enables successful decoding as early...

  12. Development of Characterization Protocol for Mixed Liquid Radioactive Waste Classification

    International Nuclear Information System (INIS)

    Norasalwa Zakaria; Syed Asraf Wafa; Wo, Y.M.; Sarimah Mahat; Mohamad Annuar Assadat Husain

    2017-01-01

    Mixed organic liquid waste generated from health-care and research activities, containing tritium, carbon-14, and other radionuclides, poses specific challenges for its management. Often, this waste becomes legacy waste in many nuclear facilities and is considered 'problematic' waste. One of the most important recommendations made by the IAEA is to perform multistage processes aiming at declassification of the waste. At this moment, approximately 3000 bottles of mixed liquid waste, with an estimated volume of 6000 litres, are currently stored at the National Radioactive Waste Management Centre, Malaysia, and some have been stored for more than 25 years. The aim of this study is to develop a characterization protocol towards reclassification of these wastes. The characterization protocol entails waste identification, waste screening and segregation, and analytical radionuclide profiling using procedures involving gross alpha/beta counting and gamma spectrometry. The results obtained from the characterization protocol are used to establish criteria for speedy classification of the waste. (author)

  13. Development of characterization protocol for mixed liquid radioactive waste classification

    Energy Technology Data Exchange (ETDEWEB)

    Zakaria, Norasalwa, E-mail: norasalwa@nuclearmalaysia.gov.my [Waste Technology Development Centre, Malaysian Nuclear Agency, 43000 Kajang, Selangor (Malaysia); Wafa, Syed Asraf [Radioisotop Technology and Innovation, Malaysian Nuclear Agency, 43000 Kajang, Selangor (Malaysia); Wo, Yii Mei [Radiochemistry and Environment, Malaysian Nuclear Agency, 43000 Kajang, Selangor (Malaysia); Mahat, Sarimah [Material Technology Group, Malaysian Nuclear Agency, 43000 Kajang, Selangor (Malaysia)

    2015-04-29

    Mixed liquid organic waste generated from health-care and research activities, containing tritium, carbon-14, and other radionuclides, poses specific challenges for its management. Often, these wastes become legacy waste in many nuclear facilities and are considered ‘problematic’ waste. One of the most important recommendations made by the IAEA is to perform multistage processes aiming at declassification of the waste. At this moment, approximately 3000 bottles of mixed liquid waste, with an estimated volume of 6000 litres, are currently stored at the National Radioactive Waste Management Centre, Malaysia, and some have been stored for more than 25 years. The aim of this study is to develop a characterization protocol towards reclassification of these wastes. The characterization protocol entails waste identification, waste screening and segregation, and analytical radionuclide profiling using various analytical procedures including gross alpha/gross beta counting, gamma spectrometry, and the LSC method. The results obtained from the characterization protocol are used to establish criteria for speedy classification of the waste.

  14. Protocol for the process evaluation of interventions combining performance-based financing with health equity in Burkina Faso.

    Science.gov (United States)

    Ridde, Valéry; Turcotte-Tremblay, Anne-Marie; Souares, Aurélia; Lohmann, Julia; Zombré, David; Koulidiati, Jean Louis; Yaogo, Maurice; Hien, Hervé; Hunt, Matthew; Zongo, Sylvie; De Allegri, Manuela

    2014-10-12

    The low quality of healthcare and the presence of user fees in Burkina Faso contribute to low utilization of healthcare and elevated levels of mortality. To improve access to high-quality healthcare and equity, national authorities are testing different intervention arms that combine performance-based financing with community-based health insurance and pro-poor targeting. There is a need to evaluate the implementation of these unique approaches. We developed a research protocol to analyze the conditions that led to the emergence of these intervention arms, the fidelity between the activities initially planned and those conducted, the implementation and adaptation processes, the sustainability of the interventions, the possibilities for scaling them up, and their ethical implications. The study adopts a longitudinal multiple case study design with several embedded levels of analyses. To represent the diversity of contexts where the intervention arms are carried out, we will select three districts. Within districts, we will select both primary healthcare centers (n =18) representing different intervention arms and the district or regional hospital (n =3). We will select contrasted cases in relation to their initial performance (good, fair, poor). Over a period of 18 months, we will use quantitative and qualitative data collection and analytical tools to study these cases including in-depth interviews, participatory observation, research diaries, and questionnaires. We will give more weight to qualitative methods compared to quantitative methods. Performance-based financing is expanding rapidly across low- and middle-income countries. The results of this study will enable researchers and decision makers to gain a better understanding of the factors that can influence the implementation and the sustainability of complex interventions aiming to increase healthcare quality as well as equity.

  15. HACCP and SACCP protocols for ready-to-eat meals pasteurised with ionising radiation

    Energy Technology Data Exchange (ETDEWEB)

    Haruvy, Y [Soreq Nuclear Research Centre, Israel Atomic Energy Commission, (Israel)

    2002-07-01

    The rationale is the necessity to identify, ab initio, critical potential failure modes (hazardous, sensory, as well as economic) and to assign Critical Control Points to detect and/or eliminate them in due time and course. A preliminary SACCP could be orchestrated along the existing HACCP lines with relatively minor modifications. Care should be taken to avoid overlooking high-risk failure modes, as well as overdoing the analysis to yield numerous CCPs with small marginal value. A combined H-S-E ACCP protocol should provide a facile analytical tool for the projects in question. Primary fields of failure modes seem to be non-uniformity of ingredients, processing and testing.

  16. Framework for pedagogical learning analytics

    OpenAIRE

    Heilala, Ville

    2018-01-01

    Learning analytics is an emergent technological practice and a multidisciplinary scientific discipline whose goal is to facilitate effective learning and knowledge of learning. In this design science research, I combine the knowledge discovery process, the concept of pedagogical knowledge, the ethics of learning analytics and a microservice architecture. The result is a framework for pedagogical learning analytics. The framework is applied and evaluated in the context of agency analytics. The framework ...

  17. Evaluation of Extraction Protocols for Simultaneous Polar and Non-Polar Yeast Metabolite Analysis Using Multivariate Projection Methods

    Directory of Open Access Journals (Sweden)

    Nicolas P. Tambellini

    2013-07-01

    Full Text Available Metabolomic and lipidomic approaches aim to measure metabolites or lipids in the cell. Metabolite extraction is a key step in obtaining useful and reliable data for successful metabolite studies. Significant efforts have been made to identify the optimal extraction protocol for various platforms and biological systems, for both polar and non-polar metabolites. Here we report an approach utilizing chemoinformatics for the systematic comparison of protocols to extract both polar and non-polar metabolites from a single sample of the model yeast Saccharomyces cerevisiae. Three chloroform/methanol/water partitioning-based extraction protocols found in the literature were evaluated for their effectiveness at reproducibly extracting both polar and non-polar metabolites. Fatty acid methyl esters and methoxyamine/trimethylsilyl-derivatized aqueous compounds were analyzed by gas chromatography-mass spectrometry to evaluate non-polar and polar metabolite analysis, respectively. The comparative breadth and amount of recovered metabolites were evaluated using multivariate projection methods. This approach identified an optimal protocol, recovering 64 identified polar metabolites from 105 ion hits and 12 fatty acids, and will potentially attenuate the error and variation associated with combining metabolite profiles from different samples for untargeted analysis of both polar and non-polar analytes. It also confirmed the value of using multivariate projection methods to compare established extraction protocols.

  18. Supplier Selection for Food Industry: A Combination of Taguchi Loss Function and Fuzzy Analytical Hierarchy Process

    OpenAIRE

    Renna Magdalena

    2012-01-01

    Supplier selection is an important part of the supply chain management process by which firms identify, evaluate, and establish contracts with suppliers. Deciding on the right supplier can be a complex task, and various criteria must be taken into account to choose the best supplier. This study focused on supplier selection for the packaging division of a food company in Denpasar, Bali. A combination of the Taguchi Loss Function and fuzzy-AHP (Analytical Hierarchy Process Fuzzy Linear Programming) was used ...
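
    To illustrate the Taguchi loss part of such a combination, the sketch below scores hypothetical suppliers with a smaller-the-better quadratic loss on each criterion and aggregates the losses with assumed criterion weights. In the study itself the weights would come from the fuzzy AHP step, so everything below (criteria, tolerance limits, weights, measured values) is a placeholder.

        # Smaller-the-better Taguchi loss per criterion: L(y) = k * y^2, with k chosen so that
        # the loss equals 1.0 at a stated tolerance limit. Weights stand in for fuzzy-AHP output.
        criteria = {                       # criterion: (tolerance limit, assumed weight)
            "defect_rate_pct": (2.0, 0.5),
            "late_delivery_pct": (5.0, 0.3),
            "price_deviation_pct": (10.0, 0.2),
        }

        suppliers = {                      # hypothetical measured values per criterion
            "supplier_A": {"defect_rate_pct": 1.2, "late_delivery_pct": 3.0, "price_deviation_pct": 6.0},
            "supplier_B": {"defect_rate_pct": 0.8, "late_delivery_pct": 4.5, "price_deviation_pct": 9.0},
        }

        def weighted_loss(values):
            total = 0.0
            for name, (tolerance, weight) in criteria.items():
                k = 1.0 / tolerance**2     # loss normalised to 1.0 at the tolerance limit
                total += weight * k * values[name] ** 2
            return total

        for name, values in suppliers.items():
            print(f"{name}: weighted Taguchi loss = {weighted_loss(values):.3f}")
        # The supplier with the lowest weighted loss would be preferred.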

  19. Combination Protocol of Low-Frequency rTMS and Intensive Occupational Therapy for Post-stroke Upper Limb Hemiparesis: a 6-year Experience of More Than 1700 Japanese Patients.

    Science.gov (United States)

    Kakuda, Wataru; Abo, Masahiro; Sasanuma, Jinichi; Shimizu, Masato; Okamoto, Takatsugu; Kimura, Chikou; Kakita, Kiyohito; Hara, Hiroyoshi

    2016-06-01

    Several years ago, we proposed a combination protocol of repetitive transcranial magnetic stimulation (rTMS) and intensive occupational therapy (OT) for upper limb hemiparesis after stroke. Subsequently, the number of patients treated with the protocol has increased in Japan. We aimed to present the latest data on our proposed combination protocol for post-stroke upper limb hemiparesis as a result of a multi-institutional study. After confirming that a patient met the inclusion criteria for the protocol, they were scheduled to receive the 15-day inpatient protocol. In the protocol, two sessions of 20-min rTMS and 120-min occupational therapy were provided daily, except for Sundays and the days of admission/discharge. Motor function of the affected upper limb was evaluated by the Fugl-Meyer assessment (FMA) and Wolf motor function test (WMFT) at admission/discharge and at 4 weeks after discharge if possible. A total of 1725 post-stroke patients were studied (mean age at admission 61.4 ± 13.0 years). The scheduled 15-day protocol was completed by all patients. At discharge, the increase in FMA score, shortening in performance time of WMFT, and increase in functional ability scale (FAS) score of WMFT were significant (FMA score 46.8 ± 12.2 to 50.9 ± 11.4 points, p hemiparesis after stroke, although its efficacy should be confirmed in a randomized controlled study.

  20. Validation of an analytical methodology for the quantitative analysis of petroleum hydrocarbons in marine sediment samples

    Directory of Open Access Journals (Sweden)

    Eloy Yordad Companioni Damas

    2009-01-01

    Full Text Available This work describes the validation of an analytical procedure for the analysis of petroleum hydrocarbons in marine sediment samples. The proposed protocol is able to measure n-alkanes and polycyclic aromatic hydrocarbons (PAH) in samples at concentrations as low as 30 ng/g, with a precision better than 15% for most of the analytes. The extraction efficiency for fortified sediments varied from 65.1 to 105.6% and from 59.7 to 97.8% for n-alkanes and PAH in the ranges C16 - C32 and fluoranthene - benzo(a)pyrene, respectively. The analytical protocol was applied to determine petroleum hydrocarbons in sediments collected from a marine coastal zone.

  1. Development of the Diabetes Technology Society Blood Glucose Monitor System Surveillance Protocol.

    Science.gov (United States)

    Klonoff, David C; Lias, Courtney; Beck, Stayce; Parkes, Joan Lee; Kovatchev, Boris; Vigersky, Robert A; Arreaza-Rubin, Guillermo; Burk, Robert D; Kowalski, Aaron; Little, Randie; Nichols, James; Petersen, Matt; Rawlings, Kelly; Sacks, David B; Sampson, Eric; Scott, Steve; Seley, Jane Jeffrie; Slingerland, Robbert; Vesper, Hubert W

    2016-05-01

    Inaccurate blood glucose monitoring systems (BGMSs) can lead to adverse health effects. The Diabetes Technology Society (DTS) Surveillance Program for cleared BGMSs is intended to protect people with diabetes from inaccurate, unreliable BGMS products that are currently on the market in the United States. The Surveillance Program will provide an independent assessment of the analytical performance of cleared BGMSs. The DTS BGMS Surveillance Program Steering Committee included experts in glucose monitoring, surveillance testing, and regulatory science. Over one year, the committee engaged in meetings and teleconferences aiming to describe how to conduct BGMS surveillance studies in a scientifically sound manner that is in compliance with good clinical practice and all relevant regulations. A clinical surveillance protocol was created that contains performance targets and analytical accuracy-testing studies with marketed BGMS products conducted by qualified clinical and laboratory sites. This protocol, entitled "Protocol for the Diabetes Technology Society Blood Glucose Monitor System Surveillance Program", is attached as supplementary material. This program is needed because currently, once a BGMS product has been cleared for use by the FDA, no systematic postmarket Surveillance Program exists that can monitor analytical performance and detect potential problems. This protocol will allow identification of inaccurate and unreliable BGMSs currently available on the US market. The DTS Surveillance Program will provide BGMS manufacturers a benchmark to understand the postmarket analytical performance of their products. Furthermore, patients, health care professionals, payers, and regulatory agencies will be able to use the results of the study to make informed decisions to, respectively, select, prescribe, finance, and regulate BGMSs on the market. © 2015 Diabetes Technology Society.

  2. Development of the Diabetes Technology Society Blood Glucose Monitor System Surveillance Protocol

    Science.gov (United States)

    Klonoff, David C.; Lias, Courtney; Beck, Stayce; Parkes, Joan Lee; Kovatchev, Boris; Vigersky, Robert A.; Arreaza-Rubin, Guillermo; Burk, Robert D.; Kowalski, Aaron; Little, Randie; Nichols, James; Petersen, Matt; Rawlings, Kelly; Sacks, David B.; Sampson, Eric; Scott, Steve; Seley, Jane Jeffrie; Slingerland, Robbert; Vesper, Hubert W.

    2015-01-01

    Background: Inaccurate blood glucose monitoring systems (BGMSs) can lead to adverse health effects. The Diabetes Technology Society (DTS) Surveillance Program for cleared BGMSs is intended to protect people with diabetes from inaccurate, unreliable BGMS products that are currently on the market in the United States. The Surveillance Program will provide an independent assessment of the analytical performance of cleared BGMSs. Methods: The DTS BGMS Surveillance Program Steering Committee included experts in glucose monitoring, surveillance testing, and regulatory science. Over one year, the committee engaged in meetings and teleconferences aiming to describe how to conduct BGMS surveillance studies in a scientifically sound manner that is in compliance with good clinical practice and all relevant regulations. Results: A clinical surveillance protocol was created that contains performance targets and analytical accuracy-testing studies with marketed BGMS products conducted by qualified clinical and laboratory sites. This protocol, entitled “Protocol for the Diabetes Technology Society Blood Glucose Monitor System Surveillance Program”, is attached as supplementary material. Conclusion: This program is needed because currently, once a BGMS product has been cleared for use by the FDA, no systematic postmarket Surveillance Program exists that can monitor analytical performance and detect potential problems. This protocol will allow identification of inaccurate and unreliable BGMSs currently available on the US market. The DTS Surveillance Program will provide BGMS manufacturers a benchmark to understand the postmarket analytical performance of their products. Furthermore, patients, health care professionals, payers, and regulatory agencies will be able to use the results of the study to make informed decisions to, respectively, select, prescribe, finance, and regulate BGMSs on the market. PMID:26481642

  3. A Flexible Multidose GnRH Antagonist versus a Microdose Flare-Up GnRH Agonist Combined with a Flexible Multidose GnRH Antagonist Protocol in Poor Responders to IVF

    Directory of Open Access Journals (Sweden)

    Gayem İnayet Turgay Çelik

    2015-01-01

    Full Text Available Objective. To compare the effectiveness of a flexible multidose gonadotropin-releasing hormone (GnRH) antagonist against the effectiveness of a microdose flare-up GnRH agonist combined with a flexible multidose GnRH antagonist protocol in poor responders to in vitro fertilization (IVF). Study Design. A retrospective study at Akdeniz University, Faculty of Medicine, Department of Obstetrics and Gynecology, IVF Center, of 131 poor responders in the intracytoplasmic sperm injection-embryo transfer (ICSI-ET) program between January 2006 and November 2012. The groups were compared with respect to patient characteristics, controlled ovarian stimulation (COH) results, and laboratory results. Results. The combination protocol was applied to 46 patients (group 1), and a single protocol was applied to 85 patients (group 2). In group 1, the duration of the treatment was longer and the dose of FSH was higher. The cycle cancellation rate was significantly higher in group 2 (26.1% versus 38.8%). A significant difference was not observed with respect to the number and quality of oocytes and embryos or to the number of embryos transferred. There were no statistically significant differences in the hCG positivity (9.5% versus 9.4%) or the clinical pregnancy rates (7.1% versus 10.6%). Conclusion. The combination protocol does not provide additional efficacy.

  4. A Flexible Multidose GnRH Antagonist versus a Microdose Flare-Up GnRH Agonist Combined with a Flexible Multidose GnRH Antagonist Protocol in Poor Responders to IVF.

    Science.gov (United States)

    Çelik, Gayem İnayet Turgay; Sütçü, Havva Kömür; Akpak, Yaşam Kemal; Akar, Münire Erman

    2015-01-01

    To compare the effectiveness of a flexible multidose gonadotropin-releasing hormone (GnRH) antagonist against the effectiveness of a microdose flare-up GnRH agonist combined with a flexible multidose GnRH antagonist protocol in poor responders to in vitro fertilization (IVF). A retrospective study at Akdeniz University, Faculty of Medicine, Department of Obstetrics and Gynecology, IVF Center, of 131 poor responders in the intracytoplasmic sperm injection-embryo transfer (ICSI-ET) program between January 2006 and November 2012. The groups were compared with respect to patient characteristics, controlled ovarian stimulation (COH) results, and laboratory results. The combination protocol was applied to 46 patients (group 1), and a single protocol was applied to 85 patients (group 2). In group 1, the duration of the treatment was longer and the dose of FSH was higher. The cycle cancellation rate was significantly higher in group 2 (26.1% versus 38.8%). A significant difference was not observed with respect to the number and quality of oocytes and embryos or to the number of embryos transferred. There were no statistically significant differences in the hCG positivity (9.5% versus 9.4%) or the clinical pregnancy rates (7.1% versus 10.6%). The combination protocol does not provide additional efficacy.

  5. The Pirate group intervention protocol description and a case report of a modified constraint-induced movement therapy combined with bimanual training for young children with unilateral spastic cerebral palsy

    NARCIS (Netherlands)

    Aarts, Pauline B.; van Hartingsveldt, Margo; Anderson, Patricia G.; van den Tillaar, Ingrid; van der Burg, Jan; Geurts, Alexander C.

    The purpose of this article was to describe a child-friendly modified constraint-induced movement therapy protocol that is combined with goal-directed task-specific bimanual training (mCIMT-BiT). This detailed description elucidates the approach and supports various research reports. This protocol

  6. An On-Target Desalting and Concentration Sample Preparation Protocol for MALDI-MS and MS/MS Analysis

    DEFF Research Database (Denmark)

    Zhang, Xumin; Wang, Quanhui; Lou, Xiaomin

    2012-01-01

    2DE coupled with MALDI-MS is one of the most widely used and powerful analytic technologies in proteomics study. The MALDI sample preparation method has been developed and optimized towards the combination of simplicity, sample-cleaning, and sample concentration since its introduction. Here we present a protocol of the so-called Sample loading, Matrix loading, and on-target Wash (SMW) method which fulfills the three criteria by taking advantage of the AnchorChip™ targets. Our method is extremely simple and no pre-desalting or concentration is needed when dealing with samples prepared from 2DE...

  7. A slotted access control protocol for metropolitan WDM ring networks

    Science.gov (United States)

    Baziana, P. A.; Pountourakis, I. E.

    2009-03-01

    In this study we focus on the serious scalability problems that many access protocols for WDM ring networks introduce due to the use of a dedicated wavelength per access node for either transmission or reception. We propose an efficient slotted MAC protocol suitable for WDM ring metropolitan area networks. The proposed network architecture employs a separate wavelength for control information exchange prior to the data packet transmission. Each access node is equipped with a pair of tunable transceivers for data communication and a pair of fixed-tuned transceivers for control information exchange. Also, each access node includes a set of fixed delay lines for synchronization purposes, i.e., to buffer the data packets while the control information is processed. An efficient access algorithm is applied to avoid both data-wavelength collisions and receiver collisions. In our protocol, each access node is capable of transmitting and receiving over any of the data wavelengths, which addresses the scalability issues. Two different slot reuse schemes are assumed: the source and the destination stripping schemes. For both schemes, performance measures are evaluated via an analytical model. The analytical results are validated by a discrete event simulation model that uses Poisson traffic sources. Simulation results show that the proposed protocol achieves efficient bandwidth utilization, especially under high load. Also, comparative simulation results prove that our protocol achieves significant performance improvement as compared with other WDMA protocols which restrict transmission to a dedicated data wavelength. Finally, performance measures are evaluated for diverse buffer sizes and numbers of access nodes and data wavelengths.

  8. The machine in multimedia analytics

    NARCIS (Netherlands)

    Zahálka, J.

    2017-01-01

    This thesis investigates the role of the machine in multimedia analytics, a discipline that combines visual analytics with multimedia analysis algorithms in order to unlock the potential of multimedia collections as sources of knowledge in scientific and applied domains. Specifically, the central

  9. Hanford analytical sample projections FY 1998 - FY 2002

    International Nuclear Information System (INIS)

    Joyce, S.M.

    1998-01-01

    Analytical Services projections are compiled for the Hanford site based on inputs from the major programs for the years 1998 through 2002. Projections are categorized by radiation level, protocol, sample matrix and program. Analyses requirements are also presented. This document summarizes the Hanford sample projections for fiscal years 1998 to 2002. Sample projections are based on inputs submitted to Analytical Services covering Environmental Restoration, Tank Waste Remediation Systems (TWRS), Solid Waste, Liquid Effluents, Spent Nuclear Fuels, Transition Projects, Site Monitoring, Industrial Hygiene, Analytical Services and miscellaneous Hanford support activities. In addition, details on laboratory scale technology (development) work, Sample Management, and Data Management activities are included. This information will be used by Hanford Analytical Services (HAS) and the Sample Management Working Group (SMWG) to assure that laboratories and resources are available and effectively utilized to meet these documented needs

  10. Hanford analytical sample projections FY 1998--FY 2002

    Energy Technology Data Exchange (ETDEWEB)

    Joyce, S.M.

    1998-02-12

    Analytical Services projections are compiled for the Hanford site based on inputs from the major programs for the years 1998 through 2002. Projections are categorized by radiation level, protocol, sample matrix and program. Analyses requirements are also presented. This document summarizes the Hanford sample projections for fiscal years 1998 to 2002. Sample projections are based on inputs submitted to Analytical Services covering Environmental Restoration, Tank Waste Remediation Systems (TWRS), Solid Waste, Liquid Effluents, Spent Nuclear Fuels, Transition Projects, Site Monitoring, Industrial Hygiene, Analytical Services and miscellaneous Hanford support activities. In addition, details on laboratory scale technology (development) work, Sample Management, and Data Management activities are included. This information will be used by Hanford Analytical Services (HAS) and the Sample Management Working Group (SMWG) to assure that laboratories and resources are available and effectively utilized to meet these documented needs.

  11. Advertisement-Based Energy Efficient Medium Access Protocols for Wireless Sensor Networks

    Science.gov (United States)

    Ray, Surjya Sarathi

    One of the main challenges that prevents the large-scale deployment of Wireless Sensor Networks (WSNs) is providing the applications with the required quality of service (QoS) given the sensor nodes' limited energy supplies. WSNs are an important tool in supporting applications ranging from environmental and industrial monitoring, to battlefield surveillance and traffic control, among others. Most of these applications require sensors to function for long periods of time without human intervention and without battery replacement. Therefore, energy conservation is one of the main goals of protocols for WSNs. Energy conservation can be performed in different layers of the protocol stack. In particular, as the medium access control (MAC) layer can access and control the radio directly, large energy savings are possible through intelligent MAC protocol design. To maximize the network lifetime, MAC protocols for WSNs aim to minimize idle listening of the sensor nodes, packet collisions, and overhearing. Several approaches such as duty cycling and low power listening have been proposed at the MAC layer to achieve energy efficiency. In this thesis, I explore the possibility of further energy savings through the advertisement of data packets in the MAC layer. In the first part of my research, I propose Advertisement-MAC or ADV-MAC, a new MAC protocol for WSNs that utilizes the concept of advertising for data contention. This technique lets nodes listen dynamically to any desired transmission and sleep during transmissions not of interest. This minimizes the energy lost in idle listening and overhearing while maintaining an adaptive duty cycle to handle variable loads. Additionally, ADV-MAC enables energy efficient MAC-level multicasting. An analytical model for the packet delivery ratio and the energy consumption of the protocol is also proposed. The analytical model is verified with simulations and is used to choose an optimal value of the advertisement period.
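
    The kind of energy accounting such MAC analyses build on can be sketched with a simple duty-cycle budget; the radio power figures and the timing split below are illustrative assumptions, not the ADV-MAC analytical model itself.

```python
# Illustrative duty-cycle energy budget for a sensor node (power figures and
# state times are assumptions, not the ADV-MAC analytical model).
P_TX, P_RX, P_IDLE, P_SLEEP = 52.2e-3, 59.1e-3, 59.1e-3, 3e-6  # W (typical low-power radio values)

def cycle_energy(t_tx, t_rx, t_idle, t_sleep):
    """Energy (J) spent in one duty cycle, split by radio state."""
    return P_TX * t_tx + P_RX * t_rx + P_IDLE * t_idle + P_SLEEP * t_sleep

# Advertising lets a node sleep through transmissions not addressed to it,
# trading a short advertisement-listening window for reduced overhearing/idle time.
baseline = cycle_energy(t_tx=0.005, t_rx=0.010, t_idle=0.100, t_sleep=0.885)  # one 1 s cycle
with_adv = cycle_energy(t_tx=0.006, t_rx=0.012, t_idle=0.020, t_sleep=0.962)
print(f"baseline: {baseline*1e3:.2f} mJ/cycle, with advertisements: {with_adv*1e3:.2f} mJ/cycle")
```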

  12. Combining analytical frameworks to assess livelihood vulnerability to climate change and analyse adaptation options.

    Science.gov (United States)

    Reed, M S; Podesta, G; Fazey, I; Geeson, N; Hessel, R; Hubacek, K; Letson, D; Nainggolan, D; Prell, C; Rickenbach, M G; Ritsema, C; Schwilch, G; Stringer, L C; Thomas, A D

    2013-10-01

    Experts working on behalf of international development organisations need better tools to assist land managers in developing countries maintain their livelihoods, as climate change puts pressure on the ecosystem services that they depend upon. However, current understanding of livelihood vulnerability to climate change is based on a fractured and disparate set of theories and methods. This review therefore combines theoretical insights from sustainable livelihoods analysis with other analytical frameworks (including the ecosystem services framework, diffusion theory, social learning, adaptive management and transitions management) to assess the vulnerability of rural livelihoods to climate change. This integrated analytical framework helps diagnose vulnerability to climate change, whilst identifying and comparing adaptation options that could reduce vulnerability, following four broad steps: i) determine likely level of exposure to climate change, and how climate change might interact with existing stresses and other future drivers of change; ii) determine the sensitivity of stocks of capital assets and flows of ecosystem services to climate change; iii) identify factors influencing decisions to develop and/or adopt different adaptation strategies, based on innovation or the use/substitution of existing assets; and iv) identify and evaluate potential trade-offs between adaptation options. The paper concludes by identifying interdisciplinary research needs for assessing the vulnerability of livelihoods to climate change.

  13. Data Intensive Architecture for Scalable Cyber Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Olsen, Bryan K.; Johnson, John R.; Critchlow, Terence J.

    2011-11-15

    events, we utilized multidimensional OLAP data cubes. The data cube structure supports interactive analysis of summary data across multiple dimensions, such as location, time, and protocol. Cube technology also allows the analyst to drill down into the underlying data set when events of interest are identified and detailed analysis is required. Unfortunately, when creating these cubes, we ran into significant performance issues with our initial architecture, caused by a combination of the data volume and attribute characteristics. Overcoming these issues required us to develop a novel, data intensive computing infrastructure. In particular, we ended up combining a Netezza Twin Fin data warehouse appliance, a solid state Fusion IO ioDrive, and the Tableau Desktop business intelligence analytic software. Using this architecture, we were able to analyze a month's worth of flow records comprising 4.9B records, totaling approximately 600GB of data. This paper describes our architecture, the challenges that we encountered, and the work that remains to deploy a fully generalized cyber analytical infrastructure.
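
    A cube-style roll-up over dimensions such as location, time, and protocol can be mimicked at small scale with a group-by aggregation. The sketch below uses pandas on synthetic flow records; the column names and data are assumptions, and it stands in for, rather than reproduces, the Netezza/Fusion IO/Tableau stack described above.

```python
# Sketch of an OLAP-style roll-up on synthetic flow records with pandas
# (column names and data are assumed; this is not the production pipeline itself).
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 10_000
flows = pd.DataFrame({
    "hour":     rng.integers(0, 24, n),
    "location": rng.choice(["siteA", "siteB", "siteC"], n),
    "protocol": rng.choice(["tcp", "udp", "icmp"], n, p=[0.7, 0.25, 0.05]),
    "bytes":    rng.exponential(1500, n).astype(int),
})

# "Cube" summary across the three dimensions; drilling down = filtering the raw frame.
cube = (flows.groupby(["location", "protocol", "hour"])["bytes"]
             .agg(flow_count="size", bytes_total="sum")
             .reset_index())
print(cube.head())

# Drill-down into one cell of interest, back to the underlying records.
detail = flows[(flows.location == "siteA") & (flows.protocol == "udp") & (flows.hour == 3)]
print(len(detail), "matching raw flow records")
```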

  14. Analytical calculation of geometric and chromatic aberrations in a bi-potential electrostatic and bell-shaped magnetic combined lens

    International Nuclear Information System (INIS)

    Ximen Jiye; Liu Zhixiong

    2000-01-01

    In the present paper, the Gaussian optical properties of the bi-potential electrostatic and bell-shaped magnetic combined lens - a new theoretical model first proposed in electron optics - have been thoroughly studied. Meanwhile, based on electron optical canonical aberration theory, analytical formulas for the third-order geometrical and first-order chromatic aberration coefficients and their computational results have first been derived for this bi-potential electrostatic and bell-shaped magnetic combined lens. It is to be emphasized that this theoretical study can be used to estimate third-order geometric and first-order chromatic aberrations and to provide a theoretical criterion for numerical computation in a rotationally symmetric electromagnetic lens.

  15. Combined multi-analytical approach for study of pore system in bricks: How much porosity is there?

    Energy Technology Data Exchange (ETDEWEB)

    Coletti, Chiara, E-mail: chiara.coletti@studenti.unipd.it [Department of Geosciences, University of Padova, Via G. Gradenigo 6, 35131 Padova (Italy); Department of Mineralogy and Petrology, Faculty of Science, University of Granada, Avda. Fuentenueva s/n, 18002 Granada (Spain); Cultrone, Giuseppe [Department of Mineralogy and Petrology, Faculty of Science, University of Granada, Avda. Fuentenueva s/n, 18002 Granada (Spain); Maritan, Lara; Mazzoli, Claudio [Department of Geosciences, University of Padova, Via G. Gradenigo 6, 35131 Padova (Italy)

    2016-11-15

    During the firing of bricks, mineralogical and textural transformations produce an artificial aggregate characterised by significant porosity. Particularly as regards pore-size distribution and the interconnection model, porosity is an important parameter for evaluating and predicting the durability of bricks. The pore system is in fact the main element correlating building materials with their environment (especially in cases of aggressive weathering, e.g., salt crystallisation and freeze-thaw cycles) and determining their durability. Four industrial bricks with differing compositions and firing temperatures were analysed with “direct” and “indirect” techniques, traditional methods (mercury intrusion porosimetry, hydric tests, nitrogen adsorption) and new analytical approaches based on digital image reconstruction of 2D and 3D models (back-scattered electrons and computerised X-ray micro-Tomography, respectively). The comparison of results from different analytical methods in the “overlapping ranges” of porosity and the careful reconstruction of a cumulative curve allowed their specific limitations to be overcome and provided better knowledge of the pore system of bricks. - Highlights: •Pore-size distribution and structure of the pore system in four commercial bricks •A multi-analytical approach combining “direct” and “indirect” techniques •Traditional methods vs. new approaches based on 2D/3D digital image reconstruction •The use of “overlapping ranges” to overcome the limitations of various techniques.

  16. Throughput and Fairness of Collision Avoidance Protocols in Ad Hoc Networks

    National Research Council Canada - National Science Library

    Garcia-Luna-Aceves, J. J; Wang, Yu

    2004-01-01

    .... In Section 1, the authors present an analytical model to derive the saturation throughput of these sender-initiated collision avoidance protocols in multi-hop ad hoc networks with nodes randomly...

  17. Heavy element stable isotope ratios. Analytical approaches and applications

    International Nuclear Information System (INIS)

    Tanimizu, Masaharu; Sohrin, Yoshiki; Hirata, Takafumi

    2013-01-01

    Continuous developments in inorganic mass spectrometry techniques, including a combination of an inductively coupled plasma ion source and a magnetic sector-based mass spectrometer equipped with a multiple-collector array, have revolutionized the precision of isotope ratio measurements, and applications of inorganic mass spectrometry for biochemistry, geochemistry, and marine chemistry are beginning to appear on the horizon. A series of pioneering studies has revealed that natural stable isotope fractionations of many elements heavier than S (e.g., Fe, Cu, Zn, Sr, Ce, Nd, Mo, Cd, W, Tl, and U) are common on Earth, and it is now widely recognized that most physicochemical reactions or biochemical processes induce mass-dependent isotope fractionation. The variations in isotope ratios of the heavy elements can provide new insights into past and present biochemical and geochemical processes. To achieve this, the analytical community is actively solving problems such as spectral interference, mass discrimination drift, chemical separation and purification, and reduction of the contamination of analytes. This article describes data calibration and standardization protocols to allow interlaboratory comparisons or to maintain traceability of data, and basic principles of isotope fractionation in nature, together with high-selectivity and high-yield chemical separation and purification techniques for stable isotope studies.

  18. Design Protocols and Analytical Strategies that Incorporate Structural Reliability Models

    Science.gov (United States)

    Duffy, Stephen F.

    1997-01-01

    Ceramic matrix composites (CMC) and intermetallic materials (e.g., single crystal nickel aluminide) are high performance materials that exhibit attractive mechanical, thermal and chemical properties. These materials are critically important in advancing certain performance aspects of gas turbine engines. From an aerospace engineer's perspective, the new generation of ceramic composites and intermetallics offers a significant potential for raising the thrust/weight ratio and reducing NO(x) emissions of gas turbine engines. These aspects have increased interest in utilizing these materials in the hot sections of turbine engines. However, as these materials evolve and their performance characteristics improve, a persistent need exists for state-of-the-art analytical methods that predict the response of components fabricated from CMC and intermetallic material systems. This need provided the motivation for the technology developed under this research effort. Continuous ceramic fiber composites exhibit an increase in work of fracture, which allows for "graceful" rather than catastrophic failure. When loaded in the fiber direction, these composites retain substantial strength capacity beyond the initiation of transverse matrix cracking despite the fact that neither of their constituents would exhibit such behavior if tested alone. As additional load is applied beyond first matrix cracking, the matrix tends to break in a series of cracks bridged by the ceramic fibers. Any additional load is borne increasingly by the fibers until the ultimate strength of the composite is reached. Thus modeling efforts supported under this research effort have focused on predicting this sort of behavior. For single crystal intermetallics the issues that motivated the technology development involved questions relating to material behavior and component design. Thus the research effort supported by this grant had to determine the statistical nature and source of fracture in a high strength, Ni

  19. The Kyoto Protocol. An economic appraisal

    International Nuclear Information System (INIS)

    Grubb, M.

    2000-05-01

    This paper examines the overall economics of the Kyoto Protocol on climate change, in three main parts. The first part explores the structure of the Protocol and how this matches against classical economic criteria of an 'optimal' climate change agreement. This discussion also considers the nature of and reasons for shortcomings, and the prospects for its evolution. Given the various flexibilities in the agreement, the Kyoto Protocol is far more economically efficient in its structure than any previous global environmental agreement. The central conclusion is that, from an economic perspective, the Protocol's structure for industrialised country commitments is as good as could reasonably be expected. The second part of the paper explores more closely the economics of the commitments themselves and how they combine with the various flexibilities, briefly reviewing the available literature and using a simple spreadsheet model of how the commitments might combine with trading mechanisms under a range of assumptions. Flexibility is intrinsic and necessary, but it is argued that the allocations to Russia and Ukraine in particular mean that unlimited flexibility could render the Protocol's commitments weaker in their impacts than is economically desirable to address climate change. It is argued that, should this prove to be the case, access to the large surplus in the transition economies could be used as a control valve to limit the costs of the Protocol to within acceptable limits. Finally, the paper considers the issues of developing country involvement in the Kyoto Protocol, and the Protocol's longer-term impact and evolution, including its impact on technological evolution and dissemination and the evolution of future commitments. It is argued that taking account of such issues critically affects views of the Protocol

  20. A Business Evaluation Of The Next Generation Ipv6 Protocol In Fixed And Mobile Communication Services

    NARCIS (Netherlands)

    L-F. Pau (Louis-François)

    2002-01-01

    This paper gives an analytical business model of the Internet IPv4 and IPv6 protocols, focusing on the business implications of intrinsic technical properties of these protocols. The technical properties modeled in business terms are: address space, payload, autoconfiguration, IP

  1. [Protocol for the study of bone tumours and standardization of pathology reports].

    Science.gov (United States)

    Machado, Isidro; Pozo, José Juan; Marcilla, David; Cruz, Julia; Tardío, Juan C; Astudillo, Aurora; Bagué, Sílvia

    Primary bone neoplasms represent a rare and heterogeneous group of mesenchymal tumours. The prevalence of benign and malignant tumours varies; the latter (sarcomas) account for less than 0.2% of all malignant tumours. Primary bone neoplasms are usually diagnosed and classified according to the criteria established and published by the World Health Organization (WHO 2013). These criteria are a result of advances in molecular pathology, which complements the histopathological diagnosis. Bone tumours should be diagnosed and treated in referral centers by a multidisciplinary team including pathologists, radiologists, orthopedic surgeons and oncologists. We analyzed different national and international protocols in order to provide a guide of recommendations for the improvement of pathological evaluation and management of bone tumours. We include specific recommendations for the pre-analytical, analytical, and post-analytical phases, as well as protocols for gross and microscopic pathology. Copyright © 2016 Sociedad Española de Anatomía Patológica. Published by Elsevier España, S.L.U. All rights reserved.

  2. Indirect MR venography: contrast medium protocols, postprocessing and combination in diagnosing pulmonary emboli with MRI

    International Nuclear Information System (INIS)

    Kluge, A.; Bachmann, G.; Rominger, M.; Schoenburg, M.

    2004-01-01

    Purpose: Integration of MR venography in a comprehensive MR imaging protocol in patients with suspected pulmonary embolism (PE) and evaluation of contrast media dosage, timing and postprocessing for diagnostic accuracy. Materials and Methods: Forty-eight consecutive inpatients with suspected PE or deep vein thrombosis were examined by MR venography according to one of the following protocols: protocol I: MR venography only, 0.25 mmol/kg bodyweight (BW) Gadopentate dimeglumine (Gd-DTPA) as single dose, bolus timing; protocol II: MR angiography of pulmonary arteries with a cumulative dosage of 0.25 mmol/kg contrast media, modification of coil setting for MR venography without further contrast media application; protocol III: as protocol II but with 0.125 mmol/kg BW, followed by MR venography. Signal-to-noise ratio, contrast-to-noise ratio, number of definable vascular segments and image quality were evaluated. The results were compared to conventional bilateral venography. Results: All MR venography examinations were of diagnostic quality and the examination time was below 10 min. MR venography could be performed in all 48 patients compared to 43 of 48 patients for conventional venography. Significantly more superficial and deep veins of the leg could be visualized by MR venography (94% compared to 83% for conventional venography). Sensitivity and specificity were 100% and 92%, respectively. Quality differed significantly between 0.125 mmol/kg (protocol III) and 0.25 mmol/kg Gd-DTPA (protocols I and II) while timing did not influence quality (protocol I vs. II). (orig.)

  3. XPS Protocol for the Characterization of Pristine and Functionalized Single Wall Carbon Nanotubes

    Science.gov (United States)

    Sosa, E. D.; Allada, R.; Huffman, C. B.; Arepalli, S.

    2009-01-01

    Recent interest in developing new applications for carbon nanotubes (CNT) has fueled the need to use accurate macroscopic and nanoscopic techniques to characterize and understand their chemistry. X-ray photoelectron spectroscopy (XPS) has proved to be a useful analytical tool for nanoscale surface characterization of materials including carbon nanotubes. Recent nanotechnology research at NASA Johnson Space Center (NASA-JSC) helped to establish a characterization protocol for quality assessment of single wall carbon nanotubes (SWCNTs). Here, a review of some of the major factors of the XPS technique that can influence the quality of analytical data, suggestions for methods to maximize the quality of data obtained by XPS, and the development of a protocol for XPS characterization as a complementary technique for analyzing the purity and surface characteristics of SWCNTs are presented. The XPS protocol is then applied to a number of experiments including impurity analysis and the study of chemical modifications for SWCNTs.

  4. THE QuEChERS ANALYTICAL METHOD COMBINED WITH LOW ...

    African Journals Online (AJOL)

    The method has also been applied to different cereal samples and satisfactory average recoveries ... Analysis of multiclass pesticide residues in foods is a challenging task because of the ... compounds set by regulatory bodies. ..... analytes were used to evaluate the influences of the selected factors on performance of the.

  5. Vibronic interactions proceeding from combined analytical and numerical considerations: Covalent functionalization of graphene by benzene, distortions, electronic transitions

    Energy Technology Data Exchange (ETDEWEB)

    Krasnenko, V.; Boltrushko, V.; Hizhnyakov, V. [Institute of Physics, University of Tartu, W. Ostwaldi Str 1, 50411 Tartu (Estonia)

    2016-04-07

    Chemically bound states of benzene molecules with graphene are studied both analytically and numerically. The states are formed by switching off intrabonds of π-electrons in C6 rings to interbonds. A number of different undistorted and distorted structures are established both with aligned and with transversal mutual orientation of benzene and graphene. The vibronic interactions causing distortions of bound states are found by using a combination of analytical and numerical considerations. This allows one to determine all electronic transitions of π-electrons without explicit numerical calculations of excited states, to find the conical intersections of potentials, and to show that the mechanism of distortions is the pseudo-Jahn-Teller effect. It is found that the aligned distorted benzene molecule placed between two graphene sheets makes a chemical bond with both of them, which may be used for fastening graphene sheets together.

  6. Robust Multi-unit Auction Protocol against False-name Bids

    Science.gov (United States)

    Yokoo, Makoto; Sakurai, Yuko; Matsubara, Shigeo

    This paper presents a new multi-unit auction protocol (IR protocol) that is robust against false-name bids. Internet auctions have become an integral part of Electronic Commerce and a promising field for applying agent and Artificial Intelligence technologies. Although the Internet provides an excellent infrastructure for executing auctions, the possibility of a new type of cheating called false-name bids has been pointed out. A false-name bid is a bid submitted under a fictitious name. A protocol called LDS has been developed for combinatorial auctions of multiple different items and has proven to be robust against false-name bids. Although we can modify the LDS protocol to handle multi-unit auctions, in which multiple units of an identical item are auctioned, the protocol is complicated and requires the auctioneer to carefully pre-determine the combination of bundles to obtain a high social surplus or revenue. For the auctioneer, our newly developed IR protocol is easier to use than the LDS, since the combination of bundles is automatically determined in a flexible manner according to the declared evaluation values of agents. The evaluation results show that the IR protocol can obtain a better social surplus than that obtained by the LDS protocol.

  7. A Dialogue Game Protocol for Agent Purchase Negotiations

    NARCIS (Netherlands)

    McBurney, P.; Eijk, R.M. van; Parsons, S.; Amgoud, L.

    2003-01-01

    We propose a dialogue game protocol for purchase negotiation dialogues which identifies appropriate speech acts, defines constraints on their utterances, and specifies the different sub-tasks agents need to perform in order to engage in dialogues according to this protocol. Our formalism combines

  8. Analytical validation of a reference laboratory ELISA for the detection of feline leukemia virus p27 antigen.

    Science.gov (United States)

    Buch, Jesse S; Clark, Genevieve H; Cahill, Roberta; Thatcher, Brendon; Smith, Peter; Chandrashekar, Ramaswamy; Leutenegger, Christian M; O'Connor, Thomas P; Beall, Melissa J

    2017-09-01

    Feline leukemia virus (FeLV) is an oncogenic retrovirus of cats. Immunoassays for the p27 core protein of FeLV aid in the detection of FeLV infections. Commercial microtiter-plate ELISAs have rapid protocols and visual result interpretation, limiting their usefulness in high-throughput situations. The purpose of our study was to validate the PetChek FeLV 15 ELISA, which is designed for the reference laboratory, and incorporates sequential, orthogonal screening and confirmatory protocols. A cutoff for the screening assay was established with 100% accuracy using 309 feline samples (244 negative, 65 positive) defined by the combined results of FeLV PCR and an independent reference p27 antigen ELISA. Precision of the screening assay was measured using a panel of 3 samples (negative, low-positive, and high-positive). The intra-assay coefficient of variation (CV) was 3.9-7.9%; the inter-assay CV was 6.0-8.6%. For the confirmatory assay, the intra-assay CV was 3.0-4.7%, and the inter-assay CV was 7.4-9.7%. The analytical sensitivity for p27 antigen was 3.7 ng/mL for inactivated whole FeLV and 1.2 ng/mL for purified recombinant FeLV p27. Analytical specificity was demonstrated based on the absence of cross-reactivity to related retroviruses. No interference was observed for samples containing added bilirubin, hemoglobin, or lipids. Based on these results, the new high-throughput design of the PetChek FeLV 15 ELISA makes it suitable for use in reference laboratory settings and maintains overall analytical performance.

  9. GLONASS orbit/clock combination in VNIIFTRI

    Science.gov (United States)

    Bezmenov, I.; Pasynok, S.

    2015-08-01

    An algorithm and a program for GLONASS satellite orbit/clock combination based on daily precise orbits submitted by several Analytic Centers were developed. Some theoretical estimates for the RMS of the combined orbit positions were derived. It was shown that, provided the RMS values of the satellite orbits supplied by the Analytic Centers over a long time interval are commensurable, the RMS of the combined orbit positions is no greater than the RMS of the satellite positions estimated by any of the Analytic Centers.
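
    The basic idea of such a combination can be illustrated with an inverse-variance weighted mean of per-center positions for one satellite at one epoch; the sketch below uses made-up inputs and is a generic illustration, not the VNIIFTRI algorithm.

```python
# Generic sketch: combining one satellite's position from several analysis centers
# by inverse-variance weighting (illustrative inputs; not the VNIIFTRI algorithm).
import numpy as np

# Hypothetical X/Y/Z positions (km) and per-center RMS (m) for one satellite, one epoch.
positions = np.array([
    [25_507.123, 10_312.456, -3_201.789],
    [25_507.121, 10_312.459, -3_201.791],
    [25_507.125, 10_312.454, -3_201.786],
])
center_rms_m = np.array([0.03, 0.05, 0.04])

weights = 1.0 / center_rms_m**2
combined = (weights[:, None] * positions).sum(axis=0) / weights.sum()

# For independent solutions, the combined formal RMS is never worse than the best contributor.
combined_rms_m = 1.0 / np.sqrt(weights.sum())
print("combined position (km):", combined.round(6))
print(f"combined formal RMS: {combined_rms_m:.3f} m (best single center: {center_rms_m.min():.3f} m)")
```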

  10. Two analytical models for evaluating performance of Gigabit Ethernet Hosts

    International Nuclear Information System (INIS)

    Salah, K.

    2006-01-01

    Two analytical models are developed to study the impact of interrupt overhead on operating system performance of network hosts when subjected to Gigabit network traffic. Under heavy network traffic, the system performance will be negatively affected due to interrupt overhead caused by incoming traffic. In particular, excessive latency and significant degradation in system throughput can be experienced. Also, user applications may livelock as the CPU power is mostly consumed by interrupt handling and protocol processing. In this paper we present and compare two analytical models that capture host behavior and evaluate its performance. The first model is based on Markov processes and queueing theory, while the second, which is more accurate but more complex, is a pure Markov process. For the most part both models give mathematically-equivalent closed-form solutions for a number of important system performance metrics. These metrics include throughput, latency, stability condition, CPU utilization of interrupt handling and protocol processing, and CPU availability for user applications. The analysis yields insight into understanding and predicting the impact of system and network choices on the performance of interrupt-driven systems when subjected to light and heavy network loads. More importantly, our analytical work can also be valuable in improving host performance. The paper gives guidelines and recommendations to address design and implementation issues. Simulation and reported experimental results show that our analytical models are valid and give a good approximation. (author)
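
    The flavour of these models can be conveyed with a toy capacity argument: if every arriving packet costs the CPU a fixed interrupt-handling time, only the remaining CPU time is available for protocol processing, so delivered throughput first saturates and then collapses (receive livelock). The per-packet costs below are assumptions, and the sketch is not the paper's Markov or queueing model.

```python
# Toy receive-livelock model (assumed per-packet costs; not the paper's exact models).
# Each arriving packet costs the CPU t_int seconds of interrupt handling; only the
# CPU time left over can be spent on protocol processing at t_proto per packet.
t_int, t_proto = 5e-6, 10e-6   # seconds per packet (illustrative values)

def delivered_throughput(arrival_rate_pps):
    cpu_for_interrupts = min(1.0, arrival_rate_pps * t_int)   # fraction of CPU burned by ISRs
    cpu_left = 1.0 - cpu_for_interrupts
    max_processed = cpu_left / t_proto                        # packets/s the stack can complete
    return min(arrival_rate_pps, max_processed)

for rate in (10_000, 50_000, 80_000, 120_000, 200_000):
    print(f"{rate:>7} pps in -> {delivered_throughput(rate):>9.0f} pps delivered")
```

    With these illustrative numbers, delivered throughput peaks and then falls to zero as the offered load grows, which is the livelock behaviour the abstract describes.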

  11. Improving Learning Analytics--Combining Observational and Self-Report Data on Student Learning

    Science.gov (United States)

    Ellis, Robert A.; Han, Feifei; Pardo, Abelardo

    2017-01-01

    The field of education technology is embracing a use of learning analytics to improve student experiences of learning. Along with exponential growth in this area is an increasing concern of the interpretability of the analytics from the student experience and what they can tell us about learning. This study offers a way to address some of the…

  12. Simultaneous grouping and ranking with combination of SOM and TOPSIS for selection of preferable analytical procedure for furan determination in food.

    Science.gov (United States)

    Jędrkiewicz, Renata; Tsakovski, Stefan; Lavenu, Aurore; Namieśnik, Jacek; Tobiszewski, Marek

    2018-02-01

    A novel methodology for grouping and ranking with application of self-organizing maps and multicriteria decision analysis is presented. The dataset consists of 22 objects that are analytical procedures applied to furan determination in food samples. They are described by 10 variables referring to their analytical performance and environmental and economic aspects. Multivariate statistical analysis allows the amount of input data for the ranking analysis to be limited. Assessment results show that the most beneficial procedures are based on microextraction techniques with GC-MS final determination. It is shown how the information obtained from the two tools complements each other. The applicability of the combination of grouping and ranking is also discussed. Copyright © 2017 Elsevier B.V. All rights reserved.
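
    The TOPSIS step of such a grouping-and-ranking workflow can be sketched in a few lines; the decision matrix, weights, and benefit/cost labels below are hypothetical (the study itself uses 22 procedures described by 10 variables).

```python
# Minimal TOPSIS sketch (hypothetical decision matrix and weights).
import numpy as np

# Rows = analytical procedures, columns = criteria (e.g., LOD, recovery %, solvent use, cost).
X = np.array([
    [0.5, 95.0, 10.0, 3.0],
    [1.2, 88.0,  2.0, 1.5],
    [0.8, 92.0,  5.0, 2.0],
], dtype=float)
weights = np.array([0.4, 0.3, 0.2, 0.1])
benefit = np.array([False, True, False, False])   # True = larger is better

R = X / np.linalg.norm(X, axis=0)                  # vector-normalized matrix
V = R * weights                                    # weighted normalized matrix
ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))

d_plus  = np.linalg.norm(V - ideal, axis=1)
d_minus = np.linalg.norm(V - anti, axis=1)
closeness = d_minus / (d_plus + d_minus)           # higher = closer to the ideal procedure

print("closeness coefficients:", closeness.round(3))
print("ranking (best first):", (np.argsort(-closeness) + 1).tolist())
```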

  13. A model based security testing method for protocol implementation.

    Science.gov (United States)

    Fu, Yu Long; Xin, Xiao Long

    2014-01-01

    The security of protocol implementations is important and hard to verify. Since penetration testing is usually based on the experience of the security tester and the specific protocol specifications, a formal and automatic verification method is always required. In this paper, we propose an extended model of IOLTS to describe the legal roles and intruders of security protocol implementations, and then combine them to generate suitable test cases to verify the security of a protocol implementation.

  14. On Equivalence between Critical Probabilities of Dynamic Gossip Protocol and Static Site Percolation

    Science.gov (United States)

    Ishikawa, Tetsuya; Hayakawa, Tomohisa

    The relationship between the critical probability of the gossip protocol on the square lattice and the critical probability of site percolation on the square lattice is discussed. Specifically, these two critical probabilities are analytically shown to be equal to each other. Furthermore, we present a way of evaluating the critical probability of site percolation by approximating the saturation of the gossip protocol. Finally, we provide numerical results which support the theoretical analysis.
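
    The site-percolation side of this equivalence can be probed numerically with a brute-force spanning-cluster test. The sketch below is a generic Monte Carlo estimator (the lattice size, trial count, and BFS test are implementation choices, not the paper's analysis); the spanning probability should rise steeply around the known threshold p_c ≈ 0.5927.

```python
# Monte Carlo sketch: probability that an open-site cluster spans an L x L square
# lattice from top to bottom, as a function of the site-occupation probability p.
import numpy as np
from collections import deque

def spans(grid):
    """True if open sites (True cells) connect the top row to the bottom row."""
    L = grid.shape[0]
    seen = np.zeros_like(grid, dtype=bool)
    q = deque((0, j) for j in range(L) if grid[0, j])
    for cell in q:
        seen[cell] = True
    while q:
        i, j = q.popleft()
        if i == L - 1:
            return True
        for ni, nj in ((i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)):
            if 0 <= ni < L and 0 <= nj < L and grid[ni, nj] and not seen[ni, nj]:
                seen[ni, nj] = True
                q.append((ni, nj))
    return False

rng = np.random.default_rng(0)
L, trials = 40, 200
for p in (0.50, 0.55, 0.59, 0.63, 0.70):
    hits = sum(spans(rng.random((L, L)) < p) for _ in range(trials))
    print(f"p = {p:.2f}: spanning probability ~ {hits / trials:.2f}")
```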

  15. Hybrid Long-Distance Entanglement Distribution Protocol

    DEFF Research Database (Denmark)

    Brask, J.B.; Rigas, I.; Polzik, E.S.

    2010-01-01

    We propose a hybrid (continuous-discrete variable) quantum repeater protocol for long-distance entanglement distribution. Starting from states created by single-photon detection, we show how entangled coherent state superpositions can be generated by means of homodyne detection. We show that near-deterministic entanglement swapping with such states is possible using only linear optics and homodyne detectors, and we evaluate the performance of our protocol combining these elements....

  16. Hopping control channel MAC protocol for opportunistic spectrum access networks

    Institute of Scientific and Technical Information of China (English)

    FU Jing-tuan; JI Hong; MAO Xu

    2010-01-01

    Opportunistic spectrum access (OSA) is considered as a promising approach to mitigate spectrum scarcity by allowing unlicensed users to exploit spectrum opportunities in licensed frequency bands. Derived from the existing channel-hopping multiple access (CHMA) protocol, we introduce a hopping control channel medium access control (MAC) protocol in the context of OSA networks. In our proposed protocol, all nodes in the network follow a common channel-hopping sequence; every frequency channel can be used as both a control channel and a data channel. Considering primary users' occupancy of the channel, we use a primary user (PU) detection model to calculate the channel availability for unlicensed users' access. Then, a discrete Markov chain analytical model is applied to describe the channel states and deduce the system throughput. Through simulation, we present numerical results to demonstrate the throughput performance of our protocol and thus validate our work.
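
    The channel-availability ingredient of such an analysis is commonly modelled with a two-state ON/OFF Markov chain for primary-user activity; a minimal sketch with assumed transition rates is given below (it is not the paper's full discrete Markov chain throughput model).

```python
# Minimal sketch: channel availability for secondary users under a two-state
# ON/OFF (busy/idle) Markov model of primary-user activity (rates are assumed).
def channel_availability(rate_on_to_off, rate_off_to_on):
    """Stationary probability that the PU is OFF, i.e. the channel is available."""
    return rate_on_to_off / (rate_on_to_off + rate_off_to_on)

# Example: the PU occupies the channel for 0.2 s on average and stays idle 0.5 s.
p_avail = channel_availability(rate_on_to_off=5.0, rate_off_to_on=2.0)
print(f"per-channel availability: {p_avail:.3f}")

# With several independently occupied channels, hopping nodes see at least one
# free channel with much higher probability.
n_channels = 4
print(f"P(at least one of {n_channels} channels free): {1 - (1 - p_avail)**n_channels:.3f}")
```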

  17. Protocol vulnerability detection based on network traffic analysis and binary reverse engineering.

    Science.gov (United States)

    Wen, Shameng; Meng, Qingkun; Feng, Chao; Tang, Chaojing

    2017-01-01

    Network protocol vulnerability detection plays an important role in many domains, including protocol security analysis, application security, and network intrusion detection. In this study, by analyzing the general fuzzing method of network protocols, we propose a novel approach that combines network traffic analysis with the binary reverse engineering method. For network traffic analysis, the block-based protocol description language is introduced to construct test scripts, while the binary reverse engineering method employs the genetic algorithm with a fitness function designed to focus on code coverage. This combination leads to a substantial improvement in fuzz testing for network protocols. We build a prototype system and use it to test several real-world network protocol implementations. The experimental results show that the proposed approach detects vulnerabilities more efficiently and effectively than general fuzzing methods such as SPIKE.
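
    The genetic-algorithm side of this combination can be caricatured in a few lines: a population of protocol messages is mutated and selected on a code-coverage fitness signal. Everything in the sketch below — the message format, the mutation operator, and especially the coverage_of oracle — is a placeholder; a real tool would obtain the fitness from an instrumented protocol implementation.

```python
# Caricature of coverage-guided fuzzing with a genetic algorithm.
# `coverage_of` is a placeholder oracle; real tools measure code coverage
# of an instrumented protocol implementation executing the message.
import random

random.seed(0)
VERBS = [b"GET", b"PUT", b"DEL"]   # hypothetical protocol verbs

def random_message():
    return random.choice(VERBS) + b" " + bytes(random.randrange(256) for _ in range(8))

def mutate(msg):
    b = bytearray(msg)
    b[random.randrange(len(b))] = random.randrange(256)
    return bytes(b)

def coverage_of(msg):
    # Placeholder fitness: pretend odder, more diverse messages reach more basic blocks.
    return sum(1 for x in msg if x > 0x7f) + len(set(msg))

population = [random_message() for _ in range(20)]
for generation in range(30):
    population.sort(key=coverage_of, reverse=True)
    parents = population[:10]                                   # truncation selection
    population = parents + [mutate(random.choice(parents)) for _ in range(10)]

best = max(population, key=coverage_of)
print("best fitness:", coverage_of(best), "message:", best[:16])
```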

  18. Analytical Solution of Interface Effect on the Strength of Combined Model Composed of Different Geologic Bodies

    Directory of Open Access Journals (Sweden)

    Zeng-hui Zhao

    2014-01-01

    Full Text Available According to the special combined structure of the surrounding rock in the western mining area of China, a micromechanical model with variable parameters containing a contact interface was first proposed. Then, the derived stresses in coal and rock near the interface were analyzed on the basis of the harmonized strain relation, and analytical solutions for the stress states near the interface were drawn up. The triaxial compressive strength of coal and rock was further determined for the case in which the contact interface is horizontal. Moreover, the effects of stiffness ratio, interface angle, and stress level on the strength of the two bodies near the contact area were expounded in detail. Results indicate that additional stresses, which have a significant effect on the strength of the combined model, arise from the adhesive effect of the contact interface and the lithological differences between the geologic bodies located on both sides. The interface effect on the strength of the combined body is mainly associated with the stiffness ratio, the interface angle, and the stress level. These conclusions are also basically valid for the three-body model and even for multibody models, and they lay an important theoretical foundation for guiding stability studies of soft strata composed of different geologic bodies.

  19. Surface analytical characterization of Streptavidin/poly(3-hexylthiophene) bilayers for bio-electronic applications

    Science.gov (United States)

    Sportelli, M. C.; Picca, R. A.; Manoli, K.; Re, M.; Pesce, E.; Tapfer, L.; Di Franco, C.; Cioffi, N.; Torsi, L.

    2017-10-01

    The analytical performance of bioelectronic devices is highly influenced by their fabrication methods. In particular, the final architecture of field-effect transistor biosensors combining a spin-cast poly(3-hexylthiophene) (P3HT) film and a biomolecule interlayer deposited on a SiO2/Si substrate can lead to the development of highly performing sensing systems, as in the case of streptavidin (SA) used for biotin sensing. To gain a better understanding of the quality of the interfacial area, it is critical to assess the morphological features characteristic of the adopted biolayer deposition protocol, namely the layer-by-layer (LbL) approach and the spin-coating technique. The present study relies on a combined surface spectroscopic and morphological characterization. Specifically, X-ray photoelectron spectroscopy operated in the parallel angle-resolved mode allowed the non-destructive investigation of the in-depth chemical composition of the SA film, alone or in the presence of the P3HT overlayer. Spectroscopic data were supported and corroborated by the results of scanning electron and helium ion microscopy investigations performed on the SA layer, which provided relevant information on the protein structural arrangement and its surface morphology. Clear differences emerged between the SA layers prepared by the two approaches, with the layer-by-layer deposition resulting in a smoother and better defined bio-electronic interface. Such findings support the superior analytical performance shown by bioelectronic devices based on LbL-deposited protein layers over spin-coated ones.

  20. Combining Mental Training and Physical Training With Goal-Oriented Protocols in Stroke Rehabilitation: A Feasibility Case Study

    Directory of Open Access Journals (Sweden)

    Xin Zhang

    2018-04-01

    Full Text Available Stroke is one of the leading causes of permanent disability in adults. The literature suggests that rehabilitation is key to early motor recovery. However, conventional therapy is labor and cost intensive. Robotic and functional electrical stimulation (FES) devices can provide a high dose of repetitions and as such may provide an alternative, or an adjunct, to conventional rehabilitation therapy. Brain-computer interfaces (BCI) could augment neuroplasticity by introducing mental training. However, mental training alone is not enough; combining mental with physical training could boost outcomes. In the current case study, a portable rehabilitative platform and goal-oriented supporting training protocols were introduced and tested with a chronic stroke participant. A novel training method was introduced with the proposed rehabilitative platform. A 37-year-old individual with chronic stroke participated in 6 weeks of training (18 sessions in total, 3 sessions a week, and 1 h per session). In this case study, we show that an individual with chronic stroke can tolerate a 6-week training bout with our system and protocol. The participant was actively engaged throughout the training. Changes in the Wolf Motor Function Test (WMFT) suggest that the training positively affected arm motor function (12% improvement in WMFT score).

  1. Analytical and pre-analytical performance characteristics of a novel cartridge-type blood gas analyzer for point-of-care and laboratory testing.

    Science.gov (United States)

    Oyaert, Matthijs; Van Maerken, Tom; Bridts, Silke; Van Loon, Silvi; Laverge, Heleen; Stove, Veronique

    2018-03-01

    Point-of-care blood gas test results may benefit therapeutic decision making by their immediate impact on patient care. We evaluated the (pre-)analytical performance of a novel cartridge-type blood gas analyzer, the GEM Premier 5000 (Werfen), for the determination of pH, partial carbon dioxide pressure (pCO2), partial oxygen pressure (pO2), sodium (Na+), potassium (K+), chloride (Cl-), ionized calcium (iCa2+), glucose, lactate, and total hemoglobin (tHb). Total imprecision was estimated according to the CLSI EP5-A2 protocol. The estimated total error was calculated based on the mean of the range claimed by the manufacturer. Based on the CLSI EP9-A2 evaluation protocol, a method comparison with the Siemens RapidPoint 500 and Abbott i-STAT CG8+ was performed. The obtained data were compared against preset quality specifications. Interference of potential pre-analytical confounders on co-oximetry and electrolyte concentrations was studied. The analytical performance was acceptable for all parameters tested. Method comparison demonstrated good agreement with the RapidPoint 500 and i-STAT CG8+, except for some parameters (RapidPoint 500: pCO2, K+, lactate and tHb; i-STAT CG8+: pO2, Na+, iCa2+ and tHb) for which significant differences between analyzers were recorded. No interference of lipemia or methylene blue on CO-oximetry results was found. In contrast, significant interference of benzalkonium and hemolysis on electrolyte measurements was found, for which the user is notified by an interferent-specific flag. The identification of sample errors from pre-analytical sources, such as interferences, and the automatic corrective actions, along with the analytical performance, ease of use and low maintenance time of the instrument, make the evaluated instrument a suitable blood gas analyzer for both POCT and laboratory use. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
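
    The imprecision figures such evaluations report (within-run and total CV) can be computed from a replicate design with simple variance components. The sketch below assumes a simplified 20-day, duplicate-measurement layout with simulated data; it is an approximation of, not a substitute for, the full CLSI EP5-A2 nested design.

```python
# Simplified variance-components sketch for within-run and total imprecision
# (two replicates per day; a simplification of the full CLSI EP5-A2 design).
import numpy as np

rng = np.random.default_rng(2)
days, true_value = 20, 7.40                      # e.g. 20 days of duplicate pH-like measurements
daily_bias = rng.normal(0, 0.01, size=days)      # simulated between-day component
data = true_value + daily_bias[:, None] + rng.normal(0, 0.02, size=(days, 2))  # within-run noise

within_var = np.mean(np.var(data, axis=1, ddof=1))                  # pooled within-day variance
between_var = max(np.var(data.mean(axis=1), ddof=1) - within_var / 2, 0.0)
total_sd = np.sqrt(within_var + between_var)

mean = data.mean()
print(f"within-run CV: {100 * np.sqrt(within_var) / mean:.2f}%")
print(f"total CV:      {100 * total_sd / mean:.2f}%")
```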

  2. Group Analytic Psychotherapy in Brazil.

    Science.gov (United States)

    Penna, Carla; Castanho, Pablo

    2015-10-01

    Group analytic practice in Brazil began quite early. Highly influenced by the Argentinean Pichon-Rivière, it enjoyed a major development from the 1950s to the early 1980s. Beginning in the 1970s, different factors undermined its development and eventually led to its steep decline. From the mid 1980s on, the number of people looking for either group analytic psychotherapy or group analytic training decreased considerably. Group analytic psychotherapy societies struggled to survive and most of them had to close their doors in the 1990s and the following decade. Psychiatric reform and the new public health system have stimulated a new demand for groups in Brazil. Developments in the public and not-for-profit sectors, combined with theoretical and practical research in universities, present promising new perspectives for group analytic psychotherapy in Brazil nowadays.

  3. An analytical model for the performance of geographical multi-hop broadcast

    NARCIS (Netherlands)

    Klein Wolterink, W.; Heijenk, G.; Berg, J.L. van den

    2012-01-01

    In this paper we present an analytical model accurately describing the behaviour of a multi-hop broadcast protocol. Our model covers the scenario in which a message is forwarded over a straight road and inter-node distances are distributed exponentially. Intermediate forwarders draw a small random

  4. Analytical strategies for phosphoproteomics

    DEFF Research Database (Denmark)

    Thingholm, Tine E; Jensen, Ole N; Larsen, Martin R

    2009-01-01

    sensitive and specific strategies. Today, most phosphoproteomic studies are conducted by mass spectrometric strategies in combination with phospho-specific enrichment methods. This review presents an overview of different analytical strategies for the characterization of phosphoproteins. Emphasis...

  5. Persistent RCSMA: A MAC Protocol for a Distributed Cooperative ARQ Scheme in Wireless Networks

    Directory of Open Access Journals (Sweden)

    J. Alonso-Zárate

    2008-05-01

    Full Text Available The persistent relay carrier sensing multiple access (PRCSMA) protocol is presented in this paper as a novel medium access control (MAC) protocol that allows for the execution of a distributed cooperative automatic retransmission request (ARQ) scheme in IEEE 802.11 wireless networks. The underlying idea of the PRCSMA protocol is to modify the basic rules of the IEEE 802.11 MAC protocol to execute a distributed cooperative ARQ scheme in wireless networks in order to enhance their performance and to extend coverage. A closed-form expression for the average packet transmission delay of the distributed cooperative ARQ scheme in a saturated network is derived in the paper. The analytical equations are then used to evaluate the performance of the protocol under different network configurations. Both the accuracy of the analysis and the performance evaluation of the protocol are supported and validated through computer simulations.

  6. Protocol vulnerability detection based on network traffic analysis and binary reverse engineering.

    Directory of Open Access Journals (Sweden)

    Shameng Wen

    Full Text Available Network protocol vulnerability detection plays an important role in many domains, including protocol security analysis, application security, and network intrusion detection. In this study, by analyzing the general fuzzing method of network protocols, we propose a novel approach that combines network traffic analysis with the binary reverse engineering method. For network traffic analysis, the block-based protocol description language is introduced to construct test scripts, while the binary reverse engineering method employs the genetic algorithm with a fitness function designed to focus on code coverage. This combination leads to a substantial improvement in fuzz testing for network protocols. We build a prototype system and use it to test several real-world network protocol implementations. The experimental results show that the proposed approach detects vulnerabilities more efficiently and effectively than general fuzzing methods such as SPIKE.
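
    As a hedged illustration of the coverage-driven fitness idea described above (a generic sketch, not the authors' implementation), the snippet below scores candidate test inputs by how many previously unseen basic blocks they exercise and evolves them with simple mutation; run_and_trace is a hypothetical stand-in for executing the target protocol implementation under instrumentation.

```python
import random

# Hypothetical stand-in: execute the target with `data` and return the set of
# basic-block identifiers that were covered. A real setup would use binary
# instrumentation; here we derive fake "blocks" from the bytes themselves.
def run_and_trace(data: bytes) -> set[int]:
    return {b % 64 for b in data}

def mutate(data: bytes) -> bytes:
    out = bytearray(data)
    i = random.randrange(len(out))
    out[i] ^= 1 << random.randrange(8)
    return bytes(out)

def evolve(seeds, generations=50, pop_size=20):
    seen_blocks: set[int] = set()
    population = list(seeds)
    for _ in range(generations):
        # Fitness = number of basic blocks not covered by any earlier input.
        scored = [(len(run_and_trace(c) - seen_blocks), c) for c in population]
        scored.sort(key=lambda t: t[0], reverse=True)
        best = [c for _, c in scored[: pop_size // 2]]
        for c in best:
            seen_blocks |= run_and_trace(c)
        population = best + [mutate(random.choice(best)) for _ in range(pop_size - len(best))]
    return seen_blocks

if __name__ == "__main__":
    print("blocks covered:", len(evolve([b"GET / HTTP/1.0\r\n\r\n"])))
```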

  7. Dose-Response Assessment of Four Genotoxic Chemicals in a Combined Mouse and Rat Micronucleus and Comet Assay Protocol

    Science.gov (United States)

    Recio, Leslie; Hobbs, Cheryl; Caspary, William; Witt, Kristine L.

    2012-01-01

    The in vivo micronucleus (MN) assay has proven to be an effective measure of genotoxicity potential. However, sampling a single tissue (bone marrow) for a single indicator of genetic damage using the MN assay provides a limited genotoxicity profile. The in vivo alkaline (pH>13) Comet assay, which detects a broad spectrum of DNA damage, can be applied to a variety of rodent tissues following administration of test agents. To determine if the Comet assay is a useful supplement to the in vivo MN assay, a combined test protocol (MN/Comet assay) was conducted in male B6C3F1 mice and F344/N rats using four model genotoxicants: ethyl methanesulfonate (EMS), acrylamide (ACM), cyclophosphamide (CP), and vincristine sulfate (VS). Test compounds were administered on 4 consecutive days at 24-hour intervals (VS was administered to rats for 3 days); animals were euthanized 4 hours after the last administration. All compounds induced significant increases in micronucleated reticulocytes (MN-RET) in the peripheral blood of mice, and all but ACM induced MN-RET in rats. EMS and ACM induced significant increases in DNA damage, measured by the Comet assay, in multiple tissues of mice and rats. CP-induced DNA damage was detected in leukocytes and duodenum cells. VS, a spindle fiber disrupting agent, was negative in the Comet assay. Based on these results, the MN/Comet assay holds promise for providing more comprehensive assessments of potential genotoxicants, and the National Toxicology Program is presently using this combined protocol in its overall evaluation of the genotoxicity of substances of public health concern. PMID:20371966

  8. Systematic assessment of benefits and risks: study protocol for a multi-criteria decision analysis using the Analytic Hierarchy Process for comparative effectiveness research.

    Science.gov (United States)

    Maruthur, Nisa M; Joy, Susan; Dolan, James; Segal, Jodi B; Shihab, Hasan M; Singh, Sonal

    2013-01-01

    Regulatory decision-making involves assessment of risks and benefits of medications at the time of approval or when relevant safety concerns arise with a medication. The Analytic Hierarchy Process (AHP) facilitates decision-making in complex situations involving tradeoffs by considering risks and benefits of alternatives. The AHP allows a more structured method of synthesizing and understanding evidence in the context of importance assigned to outcomes. Our objective is to evaluate the use of an AHP in a simulated committee setting selecting oral medications for type 2 diabetes.  This study protocol describes the AHP in five sequential steps using a small group of diabetes experts representing various clinical disciplines. The first step will involve defining the goal of the decision and developing the AHP model. In the next step, we will collect information about how well alternatives are expected to fulfill the decision criteria. In the third step, we will compare the ability of the alternatives to fulfill the criteria and judge the importance of eight criteria relative to the decision goal of the optimal medication choice for type 2 diabetes. We will use pairwise comparisons to sequentially compare the pairs of alternative options regarding their ability to fulfill the criteria. In the fourth step, the scales created in the third step will be combined to create a summary score indicating how well the alternatives met the decision goal. The resulting scores will be expressed as percentages and will indicate the alternative medications' relative abilities to fulfill the decision goal. The fifth step will consist of sensitivity analyses to explore the effects of changing the estimates. We will also conduct a cognitive interview and process evaluation.  Multi-criteria decision analysis using the AHP will aid, support and enhance the ability of decision makers to make evidence-based informed decisions consistent with their values and preferences.
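
    As a hedged sketch of the pairwise-comparison step described in this protocol (generic AHP arithmetic, not the study's actual model or criteria), the code below derives criterion weights from a reciprocal comparison matrix via the principal eigenvector and reports a consistency ratio; the 3x3 matrix shown is hypothetical.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three criteria on Saaty's 1-9 scale;
# entry [i, j] states how much more important criterion i is than criterion j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Principal eigenvector -> priority weights.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency ratio (random index 0.58 for n = 3, from Saaty's table).
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
cr = ci / 0.58

print("weights:", np.round(weights, 3))
print("consistency ratio:", round(cr, 3))
```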

  9. The Nagoya Protocol: Fragmentation or Consolidation?

    Directory of Open Access Journals (Sweden)

    Carmen Richerzhagen

    2014-02-01

    Full Text Available In October, 2010, a protocol on access and benefit-sharing (ABS) of genetic resources was adopted, the so-called Nagoya Protocol on Access to Genetic Resources and the Fair and Equitable Sharing of Benefits Arising from their Utilization to the Convention on Biological Diversity. Before the adoption of the Nagoya Protocol, the governance architecture of ABS was already characterized by a multifaceted institutional environment. The use of genetic resources is confronted with many issues (conservation, research and development, intellectual property rights, food security, health issues, climate change) that are governed by different institutions and agreements. The Nagoya Protocol contributes to increased fragmentation. However, the question arises whether this new regulatory framework can help to advance the implementation of the ABS provisions of the Convention on Biological Diversity (CBD). This paper attempts to find an answer to that question by following three analytical steps. First, it analyzes the causes of change against the background of theories of institutional change. Second, it aims to assess the typology of the architecture in order to find out if this new set of rules will contribute to a more synergistic, cooperative or conflictive architecture of ABS governance. Third, the paper looks at the problem of “fit” and identifies criteria that can be used to assess the new ABS governance architecture with regard to its effectiveness.

  10. A Clustering Routing Protocol for Mobile Ad Hoc Networks

    Directory of Open Access Journals (Sweden)

    Jinke Huang

    2016-01-01

    Full Text Available The dynamic topology of a mobile ad hoc network poses a real challenge in the design of a hierarchical routing protocol, which combines proactive with reactive routing protocols and takes advantage of both. As an essential technique of hierarchical routing, clustering of nodes provides an efficient method of establishing a hierarchical structure in mobile ad hoc networks. In this paper, we designed a novel clustering algorithm and a corresponding hierarchical routing protocol for large-scale mobile ad hoc networks. Each cluster is composed of a cluster head, several cluster gateway nodes, several cluster guest nodes, and other cluster members. The proposed routing protocol uses a proactive protocol between nodes within individual clusters and a reactive protocol between clusters. Simulation results show that the proposed clustering algorithm and hierarchical routing protocol provide superior performance, with several advantages over existing clustering algorithms and routing protocols, respectively.

  11. Preoperative vestibular assessment protocol of cochlear implant surgery: an analytical descriptive study.

    Science.gov (United States)

    Bittar, Roseli Saraiva Moreira; Sato, Eduardo Setsuo; Ribeiro, Douglas Jósimo Silva; Tsuji, Robinson Koji

    Cochlear implants are undeniably an effective method for the recovery of hearing function in patients with hearing loss. To describe the preoperative vestibular assessment protocol in subjects who will be submitted to cochlear implants. Our institutional protocol provides the vestibular diagnosis through six simple tests: Romberg and Fukuda tests, assessment for spontaneous nystagmus, Head Impulse Test, evaluation for Head Shaking Nystagmus and caloric test. 21 patients were evaluated with a mean age of 42.75±14.38 years. Only 28% of the sample had all normal test results. The presence of asymmetric vestibular information was documented through the caloric test in 32% of the sample and spontaneous nystagmus was an important clue for the diagnosis. Bilateral vestibular areflexia was present in four subjects, unilateral areflexia in three and bilateral hyporeflexia in two. The Head Impulse Test was a significant indicator for the diagnosis of areflexia in the tested ear (p=0.0001). The sensitized Romberg test using a foam pad was able to diagnose severe vestibular function impairment (p=0.003). The six clinical tests were able to identify the presence or absence of vestibular function and function asymmetry between the ears of the same individual. Copyright © 2016 Associação Brasileira de Otorrinolaringologia e Cirurgia Cérvico-Facial. Published by Elsevier Editora Ltda. All rights reserved.

  12. Effective dose comparison between protocols stitched and usual protocols in dental cone beam CT for complete arcade

    International Nuclear Information System (INIS)

    Soares, M. R.; Maia, A. F.; Batista, W. O. G.; Lara, P. A.

    2014-08-01

    To visualize the complete dental arch, dental cone beam CT offers two distinct approaches: [1] protocols with a field of view (Fov) whose diameter encompasses the entire arch (single Fov) or [2] protocols with multiple fields of view that together encompass the entire arch (stitched Fov s). The objective of this study is to evaluate effective dose values for complete dental arcade examination protocols available on different units with these two options. For this, a female anthropomorphic phantom manufactured by Radiology Support Devices, with twenty-six thermoluminescent dosimeters inserted at relevant organs and positions, was used. The phantom was irradiated under clinical conditions. The following protocols were evaluated and compared: [a] 14.0 cm x 8.5 cm and [b] 8.5 cm x 8.5 cm (Gendex Tomography GXCB 500), [c] protocol stitched for the jaw, a combination of three volumes of 5.0 cm x 3.7 cm (Kodak 9000 3D scanner), [d] protocol stitched with Fov s of 5.0 cm x 8.0 cm (Planmeca Pro Max 3D) and [e] single Fov technique of 14 cm x 8 cm (i-CAT Classical). The effective dose ranged between 43.1 and 111.1 micro Sv for the single Fov technique and between 44.5 and 236.2 micro Sv for the stitched Fov s technique. Protocol [d] presented the highest estimated effective dose, while the lowest was registered for protocol [a]. These results demonstrate that the stitched Fov protocol performed on the Kodak 9000 3D unit and applied to the upper dental arch yields an effective dose practically equal to that obtained with the extended-diameter protocol [a], which evaluates the upper and lower arcades in a single image. They also show that protocol [d] gives an estimate five times higher than protocol [a]. Thus, we conclude that, in practical terms, the stitched Fov s protocol [c] presents no dosimetric advantage over the other protocols. (Author)

  13. Effective dose comparison between protocols stitched and usual protocols in dental cone beam CT for complete arcade

    Energy Technology Data Exchange (ETDEWEB)

    Soares, M. R.; Maia, A. F. [Universidade Federal de Sergipe, Departamento de Fisica, Cidade Universitaria Prof. Jose Aloisio de Campos, Marechal Rondon s/n, Jardim Rosa Elze, 49-100000 Sao Cristovao, Sergipe (Brazil); Batista, W. O. G. [Instituto Federal da Bahia, Rua Emidio dos Santos s/n, Barbalho, Salvador, 40301015 Bahia (Brazil); Lara, P. A., E-mail: wilsonottobatista@gmail.com [Instituto de Pesquisas Energeticas e Nucleares / CNEN, Av. Lineu Prestes 2242, Cidade Universitaria, 05508-000 Sao Paulo (Brazil)

    2014-08-15

    To visualize the complete dental arch, dental cone beam CT offers two distinct approaches: [1] protocols with a field of view (Fov) whose diameter encompasses the entire arch (single Fov) or [2] protocols with multiple fields of view that together encompass the entire arch (stitched Fov s). The objective of this study is to evaluate effective dose values for complete dental arcade examination protocols available on different units with these two options. For this, a female anthropomorphic phantom manufactured by Radiology Support Devices, with twenty-six thermoluminescent dosimeters inserted at relevant organs and positions, was used. The phantom was irradiated under clinical conditions. The following protocols were evaluated and compared: [a] 14.0 cm x 8.5 cm and [b] 8.5 cm x 8.5 cm (Gendex Tomography GXCB 500), [c] protocol stitched for the jaw, a combination of three volumes of 5.0 cm x 3.7 cm (Kodak 9000 3D scanner), [d] protocol stitched with Fov s of 5.0 cm x 8.0 cm (Planmeca Pro Max 3D) and [e] single Fov technique of 14 cm x 8 cm (i-CAT Classical). The effective dose ranged between 43.1 and 111.1 micro Sv for the single Fov technique and between 44.5 and 236.2 micro Sv for the stitched Fov s technique. Protocol [d] presented the highest estimated effective dose, while the lowest was registered for protocol [a]. These results demonstrate that the stitched Fov protocol performed on the Kodak 9000 3D unit and applied to the upper dental arch yields an effective dose practically equal to that obtained with the extended-diameter protocol [a], which evaluates the upper and lower arcades in a single image. They also show that protocol [d] gives an estimate five times higher than protocol [a]. Thus, we conclude that, in practical terms, the stitched Fov s protocol [c] presents no dosimetric advantage over the other protocols. (Author)
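
    For context, the effective dose reported in studies like the two records above is typically computed as a tissue-weighted sum of organ equivalent doses, E = sum over tissues of w_T * H_T. The sketch below is a hedged, generic illustration using a few ICRP 103-style weighting factors and hypothetical TLD-derived organ doses; it is not the dosimetry workflow of the cited study.

```python
# Hedged sketch: effective dose as a tissue-weighted sum of organ equivalent doses.
# Organ doses (microsievert) are hypothetical; the weights follow ICRP 103 values
# for the tissues listed, but only a subset of tissues is included for illustration.

tissue_weights = {
    "red_bone_marrow": 0.12,
    "thyroid": 0.04,
    "salivary_glands": 0.01,
    "brain": 0.01,
    "skin": 0.01,
}

organ_dose_uSv = {  # hypothetical TLD readings converted to equivalent dose
    "red_bone_marrow": 120.0,
    "thyroid": 300.0,
    "salivary_glands": 900.0,
    "brain": 80.0,
    "skin": 60.0,
}

effective_dose = sum(tissue_weights[t] * organ_dose_uSv[t] for t in tissue_weights)
print(f"effective dose (partial, illustrative): {effective_dose:.1f} uSv")
```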

  14. Efficient MAC Protocol for Hybrid Wireless Network with Heterogeneous Sensor Nodes

    Directory of Open Access Journals (Sweden)

    Md. Nasre Alam

    2016-01-01

    Full Text Available Although several Directional Medium Access Control (DMAC) protocols have been designed for use with homogeneous networks, it can take a substantial amount of time to replace sensor nodes equipped with an omnidirectional antenna by sensor nodes with a directional antenna. Thus, we require a novel MAC protocol for use with an intermediate wireless network that consists of heterogeneous sensor nodes equipped with either an omnidirectional antenna or a directional antenna. The MAC protocols that have been designed for use in homogeneous networks are not suitable for use in a hybrid network due to deaf, hidden, and exposed nodes. Therefore, we propose a MAC protocol that exploits the characteristics of a directional antenna and can also work efficiently with omnidirectional nodes in a hybrid network. In order to address the deaf, hidden, and exposed node problems, we define RTS/CTS for the neighbor (RTSN/CTSN) and Neighbor Information (NIP) packets. The performance of the proposed MAC protocol is evaluated through a numerical analysis using a Markov model. In addition, the analytical results of the MAC protocol are verified through an OPNET simulation.

  15. WEB ANALYTICS COMBINED WITH EYE TRACKING FOR SUCCESSFUL USER EXPERIENCE DESIGN: A CASE STUDY

    OpenAIRE

    Magdalena BORYS; Monika CZWÓRNÓG; Tomasz RATAJCZYK

    2016-01-01

    The authors propose a new approach for the mobile user experience design process by means of web analytics and eye-tracking. The proposed method was applied to design the LUT mobile website. In the method, to create the mobile website design, data of various users and their behaviour were gathered and analysed using the web analytics tool. Next, based on the findings from web analytics, the mobile prototype for the website was created and validated in eye-tracking usability testing. The analy...

  16. A protocol for the delivery of cannabidiol (CBD) and combined CBD and ∆9-tetrahydrocannabinol (THC) by vaporisation.

    Science.gov (United States)

    Solowij, Nadia; Broyd, Samantha J; van Hell, Hendrika H; Hazekamp, Arno

    2014-10-16

    Significant interest has emerged in the therapeutic and interactive effects of different cannabinoids. Cannabidiol (CBD) has been shown to have anxiolytic and antipsychotic effects with high doses administered orally. We report a series of studies conducted to determine the vaporisation efficiency of high doses of CBD, alone and in combination with ∆9-tetrahydrocannabinol (THC), to achieve faster onset effects in experimental and clinical trials and emulate smoked cannabis. Purified THC and CBD (40 mg/ml and 100 mg/ml respectively) were loaded onto a liquid absorbing pad in a Volcano vaporiser, vaporised and the vapours quantitatively analysed. Preliminary studies determined 200 mg CBD to be the highest dose effectively vaporised at 230 ° C, yielding an availability of approximately 40% in the vapour phase. Six confirmatory studies examined the quantity of each compound delivered when 200 mg or 4 mg CBD was loaded together with 8 mg of THC. THC showed 55% availability when vaporised alone or with low dose CBD, while large variation in the availability of high dose CBD impacted upon the availability of THC when co-administered, with each compound affecting the vaporisation efficiency of the other in a dynamic and dose-dependent manner. We describe optimised protocols that enable delivery of 160 mg CBD through vaporisation. While THC administration by vaporisation is increasingly adopted in experimental studies, often with oral predosing with CBD to examine interactive effects, no studies to date have reported the administration of CBD by vaporisation. We report the detailed methodology aimed at optimising the efficiency of delivery of therapeutic doses of CBD, alone and in combination with THC, by vaporisation. These protocols provide a technical advance that may inform methodology for clinical trials in humans, especially for examining interactions between THC and CBD and for therapeutic applications of CBD. Current Controlled Trials ISRCTN24109245.

  17. Live birth in a 46-year-old woman using microdose GnRH agonist flare-up protocol combined with GnRH antagonist: a case report.

    Science.gov (United States)

    Zhang, Hong; Liu, Ya-Qiong; Lu, Guang-Xiu; Gong, Fei

    2016-12-01

    Few successful pregnancies after age 45 years with low ovarian reserve have been reported. We report a 46-year-old woman with a basal FSH of 20.36 mIU/mL and an antral follicle count of four who obtained two embryos and delivered a healthy infant with IVF using a microdose GnRH-a flare-up protocol combined with GnRH-ant.

  18. Hybrid protocols plus natural treatments for inflammatory conditions.

    Science.gov (United States)

    1998-01-01

    Hybrid protocols combine one, two, or three pharmaceutical drugs with several nutritional or immune-based therapies. These protocols are not limited solely to FDA-approved drugs or strictly to alternative therapies. The rationale for using a hybrid protocol is to find an effective antiviral regimen that also restores immune function. The goal is to obtain the benefits of protease inhibitors without viral resistance and side effects which include problems with fat metabolism and cholesterol levels. Natural treatments for inflammatory conditions are also described. Options include licorice root, ginger root, and slippery elm.

  19. An Energy-Efficient Link Layer Protocol for Reliable Transmission over Wireless Networks

    Directory of Open Access Journals (Sweden)

    Iqbal Adnan

    2009-01-01

    Full Text Available In multihop wireless networks, hop-by-hop reliability is generally achieved through positive acknowledgments at the MAC layer. However, positive acknowledgments introduce significant energy inefficiencies on battery-constrained devices. This inefficiency becomes particularly significant on high error rate channels. We propose to reduce the energy consumption during retransmissions using a novel protocol that localizes bit-errors at the MAC layer. The proposed protocol, referred to as Selective Retransmission using Virtual Fragmentation (SRVF), requires simple modifications to the positive-ACK-based reliability mechanism but provides substantial improvements in energy efficiency. The main premise of the protocol is to localize bit-errors by performing partial checksums on disjoint parts or virtual fragments of a packet. In case of error, only the corrupted virtual fragments are retransmitted. We develop stochastic models of the Simple Positive-ACK-based reliability, the previously-proposed Packet Length Optimization (PLO) protocol, and the SRVF protocol operating over an arbitrary-order Markov wireless channel. Our analytical models show that SRVF provides significant theoretical improvements in energy efficiency over existing protocols. We then use bit-error traces collected over different real networks to empirically compare the proposed and existing protocols. These experimental results further substantiate that SRVF provides considerably better energy efficiency than Simple Positive-ACK and Packet Length Optimization protocols.
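
    As a hedged sketch of the virtual-fragmentation idea summarized above (generic code, not the authors' implementation), the snippet below splits a packet into fixed-size virtual fragments, attaches a per-fragment CRC, and lets the receiver identify exactly which fragments were corrupted so that only those need retransmission.

```python
import zlib

FRAG_SIZE = 16  # bytes per virtual fragment (illustrative choice)

def fragments(packet: bytes):
    return [packet[i:i + FRAG_SIZE] for i in range(0, len(packet), FRAG_SIZE)]

def partial_checksums(packet: bytes):
    # Sender side: one CRC32 per virtual fragment.
    return [zlib.crc32(f) for f in fragments(packet)]

def corrupted_fragments(received: bytes, sent_checksums):
    # Receiver side: indices of fragments whose checksum does not match.
    return [i for i, f in enumerate(fragments(received))
            if zlib.crc32(f) != sent_checksums[i]]

if __name__ == "__main__":
    original = bytes(range(64))
    checksums = partial_checksums(original)

    corrupted = bytearray(original)
    corrupted[20] ^= 0xFF  # flip bits inside the second fragment
    bad = corrupted_fragments(bytes(corrupted), checksums)
    print("retransmit only fragments:", bad)  # -> [1]
```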

  20. Using generalizability theory to develop clinical assessment protocols.

    Science.gov (United States)

    Preuss, Richard A

    2013-04-01

    Clinical assessment protocols must produce data that are reliable, with a clinically attainable minimal detectable change (MDC). In a reliability study, generalizability theory has 2 advantages over classical test theory. These advantages provide information that allows assessment protocols to be adjusted to match individual patient profiles. First, generalizability theory allows the user to simultaneously consider multiple sources of measurement error variance (facets). Second, it allows the user to generalize the findings of the main study across the different study facets and to recalculate the reliability and MDC based on different combinations of facet conditions. In doing so, clinical assessment protocols can be chosen based on minimizing the number of measures that must be taken to achieve a realistic MDC, using repeated measures to minimize the MDC, or simply based on the combination that best allows the clinician to monitor an individual patient's progress over a specified period of time.
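
    A hedged numerical sketch of the idea above (generic G-theory arithmetic, not the article's data): given assumed variance components for patients, raters and trials, the code recomputes the absolute standard error of measurement, the 95% minimal detectable change, and a dependability coefficient for different numbers of raters and repeated trials.

```python
import math

# Hypothetical variance components from a G-study (squared measurement units).
var_patient  = 4.0   # object of measurement
var_rater    = 0.6   # rater facet
var_trial    = 1.0   # trial facet
var_residual = 1.4   # residual / interactions

def dstudy(n_raters: int, n_trials: int):
    # Absolute error variance when scores are averaged over raters and trials.
    error_var = (var_rater / n_raters
                 + var_trial / n_trials
                 + var_residual / (n_raters * n_trials))
    sem = math.sqrt(error_var)
    mdc95 = 1.96 * math.sqrt(2) * sem               # 95% minimal detectable change
    phi = var_patient / (var_patient + error_var)   # dependability coefficient
    return mdc95, phi

for raters in (1, 2):
    for trials in (1, 3, 5):
        mdc, phi = dstudy(raters, trials)
        print(f"raters={raters}, trials={trials}: MDC95={mdc:.2f}, phi={phi:.2f}")
```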

  1. Understanding protocol performance: impact of test performance.

    Science.gov (United States)

    Turner, Robert G

    2013-01-01

    This is the second of two articles that examine the factors that determine protocol performance. The objective of these articles is to provide a general understanding of protocol performance that can be used to estimate performance, establish limits on performance, decide if a protocol is justified, and ultimately select a protocol. The first article was concerned with protocol criterion and test correlation. It demonstrated the advantages and disadvantages of different criterion when all tests had the same performance. It also examined the impact of increasing test correlation on protocol performance and the characteristics of the different criteria. To examine the impact on protocol performance when individual tests in a protocol have different performance. This is evaluated for different criteria and test correlations. The results of the two articles are combined and summarized. A mathematical model is used to calculate protocol performance for different protocol criteria and test correlations when there are small to large variations in the performance of individual tests in the protocol. The performance of the individual tests that make up a protocol has a significant impact on the performance of the protocol. As expected, the better the performance of the individual tests, the better the performance of the protocol. Many of the characteristics of the different criteria are relatively independent of the variation in the performance of the individual tests. However, increasing test variation degrades some criteria advantages and causes a new disadvantage to appear. This negative impact increases as test variation increases and as more tests are added to the protocol. Best protocol performance is obtained when individual tests are uncorrelated and have the same performance. In general, the greater the variation in the performance of tests in the protocol, the more detrimental this variation is to protocol performance. Since this negative impact is increased as
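
    A hedged sketch related to the modeling described above (a simplified calculation assuming statistically independent tests, not the author's model): protocol sensitivity and specificity under a "positive on any test" (loose) versus a "positive on all tests" (strict) criterion, for tests with differing individual performance.

```python
from math import prod

# Hypothetical individual test performance (sensitivity, specificity).
tests = [(0.90, 0.85), (0.80, 0.90), (0.70, 0.95)]

sens = [s for s, _ in tests]
spec = [p for _, p in tests]

# Loose criterion: protocol positive if ANY test is positive.
loose_sens = 1 - prod(1 - s for s in sens)
loose_spec = prod(spec)

# Strict criterion: protocol positive only if ALL tests are positive.
strict_sens = prod(sens)
strict_spec = 1 - prod(1 - p for p in spec)

print(f"loose : sens={loose_sens:.3f}, spec={loose_spec:.3f}")
print(f"strict: sens={strict_sens:.3f}, spec={strict_spec:.3f}")
```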

  2. Analytic solutions of hydrodynamics equations

    International Nuclear Information System (INIS)

    Coggeshall, S.V.

    1991-01-01

    Many similarity solutions have been found for the equations of one-dimensional (1-D) hydrodynamics. These special combinations of variables allow the partial differential equations to be reduced to ordinary differential equations, which must then be solved to determine the physical solutions. Usually, these reduced ordinary differential equations are solved numerically. In some cases it is possible to solve these reduced equations analytically to obtain explicit solutions. In this work a collection of analytic solutions of the 1-D hydrodynamics equations is presented. These can be used for a variety of purposes, including (i) numerical benchmark problems, (ii) as a basis for analytic models, and (iii) to provide insight into more complicated solutions

  3. Analytical applications of ICP-FTS

    International Nuclear Information System (INIS)

    Faires, L.M.; Palmer, B.A.; Cunningham, P.T.

    1986-01-01

    The Analytical Chemistry Group of the Chemistry Division at Los Alamos National Laboratory has been investigating the analytical utility of the inductively coupled plasma (ICP) - Fourier transform spectrometer (FTS) combination. While a new state-of-the-art FTS facility is under construction at Los Alamos, preliminary data has been obtained on the one-meter FTS at the National Solar Observatory at Kitt Peak, Arizona. This paper presents an update of the Los Alamos FTS facility, which is expected to be completed in 1986, and presents data showing the analytical potential of an ICP-FTS system. Some of the potential problems of the multiplex disadvantage are discussed, and the advantages of the high resolution obtainable with the FTS are illustrated

  4. Security Property Validation of the Sensor Network Encryption Protocol (SNEP)

    Directory of Open Access Journals (Sweden)

    Salekul Islam

    2015-07-01

    Full Text Available Since wireless sensor networks (WSNs) have been designed to be deployed in an unsecured, public environment, secured communication is really vital for their wide-spread use. Among all of the communication protocols developed for WSN, the Security Protocols for Sensor Networks (SPINS) is exceptional, as it has been designed with security as a goal. SPINS is composed of two building blocks: Secure Network Encryption Protocol (SNEP) and the “micro” version of the Timed Efficient Streaming Loss-tolerant Authentication (TESLA), named μTESLA. From the inception of SPINS, a number of efforts have been made to validate its security properties. In this paper, we have validated the security properties of SNEP by using an automated security protocol validation tool, named AVISPA. Using the protocol specification language, HLPSL, we model two combined scenarios—node to node key agreement and counter exchange protocols—followed by data transmission. Next, we validate the security properties of these combined protocols, using different AVISPA back-ends. AVISPA reports the models we have developed free from attacks. However, by analyzing the key distribution sub-protocol, we find one threat of a potential DoS attack that we have demonstrated by modeling in AVISPA. Finally, we propose a modification, and AVISPA reports this modified version free from the potential DoS attack.

  5. An efficient multi-carrier position-based packet forwarding protocol for wireless sensor networks

    KAUST Repository

    Bader, Ahmed

    2012-01-01

    Beaconless position-based forwarding protocols have recently evolved as a promising solution for packet forwarding in wireless sensor networks. However, as the node density grows, the overhead incurred in the process of relay selection grows significantly. As such, end-to-end performance in terms of energy and latency is adversely impacted. With the motivation of developing a packet forwarding mechanism that is tolerant to variation in node density, an alternative position-based protocol is proposed in this paper. In contrast to existing beaconless protocols, the proposed protocol is designed such that it eliminates the need for potential relays to undergo a relay selection process. Rather, any eligible relay may decide to forward the packet ahead, thus significantly reducing the underlying overhead. The operation of the proposed protocol is empowered by exploiting favorable features of orthogonal frequency division multiplexing (OFDM) at the physical layer. The end-to-end performance of the proposed protocol is evaluated against existing beaconless position-based protocols analytically and as well by means of simulations. The proposed protocol is demonstrated in this paper to be more efficient. In particular, it is shown that for the same amount of energy the proposed protocol transports one bit from source to destination much quicker. © 2012 IEEE.

  6. Exact analysis of Packet Reversed Packet Combining Scheme and Modified Packet Combining Scheme; and a combined scheme

    International Nuclear Information System (INIS)

    Bhunia, C.T.

    2007-07-01

    Packet combining is a well-defined, simple error correction scheme for the detection and correction of errors at the receiver. Although it permits a higher throughput when compared to other basic ARQ protocols, the packet combining (PC) scheme fails to correct errors when errors occur in the same bit locations of the copies. In a previous work, a scheme known as the Packet Reversed Packet Combining (PRPC) scheme, which corrects errors that occur at the same bit location of erroneous copies, was studied; however, PRPC does not handle the situation where a packet has more than one erroneous bit. The Modified Packet Combining (MPC) scheme, which can correct double or higher bit errors, was studied elsewhere. Both the PRPC and MPC schemes were believed to offer higher throughput in previous studies; however, neither adequate investigation nor exact analysis was done to substantiate this claim. In this work, an exact analysis of both PRPC and MPC is carried out and the results reported. A combined protocol (PRPC and MPC) is proposed, and the analysis shows that it is capable of offering even higher throughput and better error correction capability at high bit error rate (BER) and larger packet size. (author)
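
    For readers unfamiliar with the basic packet-combining step that PRPC and MPC build on, here is a hedged, schematic sketch based only on the description in the abstract (not the author's analysis or protocol code): the receiver XOR-compares two erroneous copies to locate candidate error bits and flips subsets of them until the CRC passes. PRPC's contribution, per the abstract, is that the retransmitted copy is bit-reversed, so channel errors hitting the same bit position of both copies no longer cancel out in the comparison.

```python
import zlib
from itertools import chain, combinations

def crc_ok(payload: bytes, crc: int) -> bool:
    return zlib.crc32(payload) == crc

def combine(copy1: bytes, copy2: bytes, crc: int):
    """Basic packet-combining step: bit positions where the two erroneous
    copies differ are candidate error locations; try flipping subsets of them
    in copy1 until the CRC check passes."""
    diff = [i for i in range(len(copy1) * 8)
            if (copy1[i // 8] >> (7 - i % 8)) & 1 != (copy2[i // 8] >> (7 - i % 8)) & 1]
    for subset in chain.from_iterable(combinations(diff, r) for r in range(len(diff) + 1)):
        candidate = bytearray(copy1)
        for bit in subset:
            candidate[bit // 8] ^= 1 << (7 - bit % 8)
        if crc_ok(bytes(candidate), crc):
            return bytes(candidate)
    return None  # fails, e.g. when both copies are hit in the same bit position

if __name__ == "__main__":
    packet = b"protocol combining demo!"
    crc = zlib.crc32(packet)

    c1 = bytearray(packet); c1[3] ^= 0b00000100   # error in copy 1
    c2 = bytearray(packet); c2[10] ^= 0b01000000  # error in copy 2 (different position)
    print(combine(bytes(c1), bytes(c2), crc))     # recovers the original packet
```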

  7. WEB ANALYTICS COMBINED WITH EYE TRACKING FOR SUCCESSFUL USER EXPERIENCE DESIGN: A CASE STUDY

    Directory of Open Access Journals (Sweden)

    Magdalena BORYS

    2016-12-01

    Full Text Available The authors propose a new approach for the mobile user experience design process by means of web analytics and eye-tracking. The proposed method was applied to design the LUT mobile website. In the method, to create the mobile website design, data of various users and their behaviour were gathered and analysed using the web analytics tool. Next, based on the findings from web analytics, the mobile prototype for the website was created and validated in eye-tracking usability testing. The analysis of participants’ behaviour during eye-tracking sessions allowed improvements of the prototype.

  8. Semi-analytical MBS Pricing

    DEFF Research Database (Denmark)

    Rom-Poulsen, Niels

    2007-01-01

    This paper presents a multi-factor valuation model for fixed-rate callable mortgage backed securities (MBS). The model yields semi-analytic solutions for the value of MBS in the sense that the MBS value is found by solving a system of ordinary differential equations. Instead of modelling the cond...... interest rate model. However, if the pool size is specified in a way that makes the expectations solvable using transform methods, semi-analytic pricing formulas are achieved. The affine and quadratic pricing frameworks are combined to get flexible and sophisticated prepayment functions. We show...

  9. Paper-Based Analytical Device for Zinc Ion Quantification in Water Samples with Power-Free Analyte Concentration

    Directory of Open Access Journals (Sweden)

    Hiroko Kudo

    2017-04-01

    Full Text Available Insufficient sensitivity is a general issue of colorimetric paper-based analytical devices (PADs) for trace analyte detection, such as metal ions, in environmental water. This paper demonstrates the colorimetric detection of zinc ions (Zn2+) on a paper-based analytical device with an integrated analyte concentration system. Concentration of Zn2+ ions from an enlarged sample volume (1 mL) has been achieved with the aid of a colorimetric Zn2+ indicator (Zincon) electrostatically immobilized onto a filter paper substrate in combination with highly water-absorbent materials. Analyte concentration as well as sample pretreatment, including pH adjustment and interferent masking, has been elaborated. The resulting device enables colorimetric quantification of Zn2+ in environmental water samples (tap water, river water) from a single sample application. The achieved detection limit of 0.53 μM is a significant improvement over that of a commercial colorimetric Zn2+ test paper (9.7 μM), demonstrating the efficiency of the developed analyte concentration system not requiring any equipment.
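
    For readers wondering how a colorimetric detection limit like the 0.53 μM figure above is typically derived, here is a hedged, generic sketch (not the authors' calibration data or exact method): fit a linear calibration of color intensity versus concentration and apply the common LOD = 3.3·σ_blank/slope rule.

```python
import numpy as np

# Hypothetical calibration data: Zn2+ standards (uM) vs. measured color intensity.
conc = np.array([0.0, 1.0, 2.0, 5.0, 10.0])
signal = np.array([0.02, 0.11, 0.20, 0.49, 0.97])

slope, intercept = np.polyfit(conc, signal, 1)

# Standard deviation of replicate blank measurements (hypothetical values).
blank_replicates = np.array([0.020, 0.023, 0.018, 0.021, 0.019])
sigma_blank = blank_replicates.std(ddof=1)

lod = 3.3 * sigma_blank / slope  # one common IUPAC-style definition
print(f"slope = {slope:.3f} a.u./uM, LOD ~ {lod:.2f} uM")
```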

  10. An ultra low-power and traffic-adaptive medium access control protocol for wireless body area network.

    Science.gov (United States)

    Ullah, Sana; Kwak, Kyung Sup

    2012-06-01

    Wireless Body Area Network (WBAN) consists of low-power, miniaturized, and autonomous wireless sensor nodes that enable physicians to remotely monitor vital signs of patients and provide real-time feedback with medical diagnosis and consultations. It is the most reliable and cheapest way to take care of patients suffering from chronic diseases such as asthma, diabetes and cardiovascular diseases. Some of the most important attributes of a WBAN are low power consumption and delay. These can be achieved by introducing flexible duty cycling techniques on the energy-constrained sensor nodes. Stated otherwise, low duty cycle nodes should not receive frequent synchronization and control packets if they have no data to send/receive. In this paper, we introduce a Traffic-adaptive MAC protocol (TaMAC) that takes into account the traffic information of the sensor nodes. The protocol dynamically adjusts the duty cycle of the sensor nodes according to their traffic patterns, thus solving the idle listening and overhearing problems. The traffic patterns of all sensor nodes are organized and maintained by the coordinator. The TaMAC protocol is supported by a wakeup radio that is used to accommodate emergency and on-demand events in a reliable manner. The wakeup radio uses a separate control channel along with the data channel and therefore has considerably low power consumption requirements. Analytical expressions are derived to analyze and compare the performance of the TaMAC protocol with the well-known beacon-enabled IEEE 802.15.4 MAC, WiseMAC, and SMAC protocols. The analytical derivations are further validated by simulation results. It is shown that the TaMAC protocol outperforms all other protocols in terms of power consumption and delay.
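
    A hedged sketch of traffic-adaptive duty cycling in the spirit of the protocol described above (not the TaMAC specification itself): the coordinator tracks each node's observed packet inter-arrival times and assigns a wake-up interval proportional to them, clamped between configurable bounds; all parameter values are illustrative.

```python
from statistics import mean

MIN_WAKEUP_S = 0.25   # emergency-capable lower bound (illustrative)
MAX_WAKEUP_S = 30.0   # idle-listening upper bound (illustrative)

def wakeup_interval(arrival_times_s):
    """Coordinator-side estimate: wake the node roughly once per expected packet."""
    if len(arrival_times_s) < 2:
        return MAX_WAKEUP_S  # no traffic pattern yet -> sleep as much as allowed
    gaps = [b - a for a, b in zip(arrival_times_s, arrival_times_s[1:])]
    return min(MAX_WAKEUP_S, max(MIN_WAKEUP_S, mean(gaps)))

# Example: an ECG node reporting every ~2 s vs. a temperature node every ~60 s.
ecg_arrivals = [0.0, 2.1, 4.0, 6.2, 8.1]
temp_arrivals = [0.0, 61.0, 119.5]
print(f"ECG node wake-up interval:  {wakeup_interval(ecg_arrivals):.2f} s")
print(f"Temp node wake-up interval: {wakeup_interval(temp_arrivals):.2f} s")
```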

  11. The management of cornea blindness from severe corneal scarring, with the Athens Protocol (transepithelial topography-guided PRK therapeutic remodeling, combined with same-day, collagen cross-linking)

    Directory of Open Access Journals (Sweden)

    Kanellopoulos AJ

    2012-02-01

    Full Text Available Anastasios John Kanellopoulos, Laservision.gr Institute, Athens, Greece; Manhattan Eye, Ear and Throat Hospital, New York, NY, USA; New York University Medical School, New York, NY, USA. Purpose: To evaluate the safety and efficacy of combined transepithelial topography-guided photorefractive keratectomy (PRK) therapeutic remodeling, combined with same-day, collagen cross-linking (CXL). This protocol was used for the management of cornea blindness due to severe corneal scarring. Methods: A 57-year-old man had severe corneal blindness in both eyes. Both corneas had significant central scars attributed to a firework explosion 45 years ago, when the patient was 12 years old. Corrected distance visual acuity (CDVA) was 20/100 in both eyes (OU) with refraction: +4.00, –4.50 at 135° in the right eye and +3.50, –1.00 at 55° in the left. Respective keratometries were: 42.3, 60.4 at 17° and 35.8, 39.1 at 151.3°. Cornea transplantation was the recommendation by multiple cornea specialists as the treatment of choice. We decided, prior to considering a transplant, to employ the Athens Protocol (combined topography-guided partial PRK and CXL) in the right eye in February 2010 and in the left eye in September 2010. The treatment plan for both eyes was designed on the topography-guided wavelight excimer laser platform. Results: Fifteen months after the right eye treatment, the right cornea had improved translucency and was topographically stable with uncorrected distance visual acuity (UDVA) 20/50 and CDVA 20/40 with refraction +0.50, –2.00 at 5°. We noted a similar outcome after similar treatment applied in the left eye, with UDVA 20/50 and CDVA 20/40 with –0.50, –2.00 at 170° at the 8-month follow-up. Conclusion: In this case, the introduction of successful management of severe cornea abnormalities and scarring with the Athens Protocol may provide an effective alternative to other existing surgical or medical options. Keywords: Athens Protocol, collagen cross

  12. The management of cornea blindness from severe corneal scarring, with the Athens Protocol (transepithelial topography-guided PRK therapeutic remodeling, combined with same-day, collagen cross-linking).

    Science.gov (United States)

    Kanellopoulos, Anastasios John

    2012-01-01

    To evaluate the safety and efficacy of combined transepithelial topography-guided photorefractive keratectomy (PRK) therapeutic remodeling, combined with same-day, collagen cross-linking (CXL). This protocol was used for the management of cornea blindness due to severe corneal scarring. A 57-year-old man had severe corneal blindness in both eyes. Both corneas had significant central scars attributed to a firework explosion 45 years ago, when the patient was 12 years old. Corrected distance visual acuity (CDVA) was 20/100 both eyes (OU) with refraction: +4.00, -4.50 at 135° in the right eye and +3.50, -1.00 at 55° in the left. Respective keratometries were: 42.3, 60.4 at 17° and 35.8, 39.1 at 151.3°. Cornea transplantation was the recommendation by multiple cornea specialists as the treatment of choice. We decided prior to considering a transplant to employ the Athens Protocol (combined topography-guided partial PRK and CXL) in the right eye in February 2010 and in the left eye in September 2010. The treatment plan for both eyes was designed on the topography-guided wavelight excimer laser platform. Fifteen months after the right eye treatment, the right cornea had improved translucency and was topographically stable with uncorrected distance visual acuity (UDVA) 20/50 and CDVA 20/40 with refraction +0.50, -2.00 at 5°. We noted a similar outcome after similar treatment applied in the left eye with UDVA 20/50 and CDVA 20/40 with -0.50, -2.00 at 170° at the 8-month follow-up. In this case, the introduction of successful management of severe cornea abnormalities and scarring with the Athens Protocol may provide an effective alternative to other existing surgical or medical options.

  13. OT-Combiners Via Secure Computation

    DEFF Research Database (Denmark)

    Harnik, Danny; Ishai, Yuval; Kushilevitz, Eyal

    2008-01-01

    An OT-combiner implements a secure oblivious transfer (OT) protocol using oracle access to n OT-candidates of which at most t may be faulty. We introduce a new general approach for combining OTs by making a simple and modular use of protocols for secure computation. Specifically, we obtain an OT......, strengthen the security, and improve the efficiency of previous OT-combiners. In particular, we obtain the first constant-rate OT-combiners in which the number of secure OTs being produced is a constant fraction of the total number of calls to the OT-candidates, while still tolerating a constant fraction of faulty candidates (t = Ω(n)). Previous OT-combiners required either ω(n) or poly(k) calls to the n candidates, where k is a security parameter, and produced only a single secure OT. We demonstrate the usefulness of the latter result by presenting several applications that are of independent interest......

  14. SPECT/CT workflow and imaging protocols

    Energy Technology Data Exchange (ETDEWEB)

    Beckers, Catherine [University Hospital of Liege, Division of Nuclear Medicine and Oncological Imaging, Department of Medical Physics, Liege (Belgium); Hustinx, Roland [University Hospital of Liege, Division of Nuclear Medicine and Oncological Imaging, Department of Medical Physics, Liege (Belgium); Domaine Universitaire du Sart Tilman, Service de Medecine Nucleaire et Imagerie Oncologique, CHU de Liege, Liege (Belgium)

    2014-05-15

    Introducing a hybrid imaging method such as single photon emission computed tomography (SPECT)/CT greatly alters the routine in the nuclear medicine department. It requires designing new workflow processes and the revision of original scheduling process and imaging protocols. In addition, the imaging protocol should be adapted for each individual patient, so that performing CT is fully justified and the CT procedure is fully tailored to address the clinical issue. Such refinements often occur before the procedure is started but may be required at some intermediate stage of the procedure. Furthermore, SPECT/CT leads in many instances to a new partnership with the radiology department. This article presents practical advice and highlights the key clinical elements which need to be considered to help understand the workflow process of SPECT/CT and optimise imaging protocols. The workflow process using SPECT/CT is complex in particular because of its bimodal character, the large spectrum of stakeholders, the multiplicity of their activities at various time points and the need for real-time decision-making. With help from analytical tools developed for quality assessment, the workflow process using SPECT/CT may be separated into related, but independent steps, each with its specific human and material resources to use as inputs or outputs. This helps identify factors that could contribute to failure in routine clinical practice. At each step of the process, practical aspects to optimise imaging procedure and protocols are developed. A decision-making algorithm for justifying each CT indication as well as the appropriateness of each CT protocol is the cornerstone of routine clinical practice using SPECT/CT. In conclusion, implementing hybrid SPECT/CT imaging requires new ways of working. It is highly rewarding from a clinical perspective, but it also proves to be a daily challenge in terms of management. (orig.)

  15. SPECT/CT workflow and imaging protocols

    International Nuclear Information System (INIS)

    Beckers, Catherine; Hustinx, Roland

    2014-01-01

    Introducing a hybrid imaging method such as single photon emission computed tomography (SPECT)/CT greatly alters the routine in the nuclear medicine department. It requires designing new workflow processes and the revision of original scheduling process and imaging protocols. In addition, the imaging protocol should be adapted for each individual patient, so that performing CT is fully justified and the CT procedure is fully tailored to address the clinical issue. Such refinements often occur before the procedure is started but may be required at some intermediate stage of the procedure. Furthermore, SPECT/CT leads in many instances to a new partnership with the radiology department. This article presents practical advice and highlights the key clinical elements which need to be considered to help understand the workflow process of SPECT/CT and optimise imaging protocols. The workflow process using SPECT/CT is complex in particular because of its bimodal character, the large spectrum of stakeholders, the multiplicity of their activities at various time points and the need for real-time decision-making. With help from analytical tools developed for quality assessment, the workflow process using SPECT/CT may be separated into related, but independent steps, each with its specific human and material resources to use as inputs or outputs. This helps identify factors that could contribute to failure in routine clinical practice. At each step of the process, practical aspects to optimise imaging procedure and protocols are developed. A decision-making algorithm for justifying each CT indication as well as the appropriateness of each CT protocol is the cornerstone of routine clinical practice using SPECT/CT. In conclusion, implementing hybrid SPECT/CT imaging requires new ways of working. It is highly rewarding from a clinical perspective, but it also proves to be a daily challenge in terms of management. (orig.)

  16. A Secure and Efficient Handover Authentication Protocol for Wireless Networks

    Directory of Open Access Journals (Sweden)

    Weijia Wang

    2014-06-01

    Full Text Available Handover authentication protocol is a promising access control technology in the fields of WLANs and mobile wireless sensor networks. In this paper, we first review an efficient handover authentication protocol, named PairHand, and its existing security attacks and improvements. Then, we present an improved key recovery attack by using the linear combining method and reanalyze its feasibility on the improved PairHand protocol. Finally, we present a new handover authentication protocol, which not only achieves the same desirable efficiency features of PairHand, but also enjoys provable security in the random oracle model.

  17. Supplier Selection for Food Industry: A Combination of Taguchi Loss Function and Fuzzy Analytical Hierarchy Process

    Directory of Open Access Journals (Sweden)

    Renna Magdalena

    2012-09-01

    Full Text Available Supplier selection is an important part of the supply chain management process by which firms identify, evaluate, and establish contracts with suppliers. Deciding on the right supplier can be a complex task, and various criteria must be taken into account to choose the best supplier. This study focused on supplier selection for the packaging division of a food industry in Denpasar, Bali. A combination of the Taguchi loss function and fuzzy AHP (Analytical Hierarchy Process) with fuzzy linear programming was used to determine the best supplier. In this analysis, several supplier criteria were considered, namely quality, delivery, completeness, quality loss and environmental management. By maximizing the suppliers' performances on each criterion and aggregating the suppliers' performances over all criteria, the best supplier was determined. Keywords: supplier selection, Taguchi loss function, AHP, fuzzy linear programming, environment
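
    As a hedged, simplified sketch of how a Taguchi loss function can be combined with criterion weights from an AHP (generic arithmetic only, not the study's data, fuzzy scales or linear program), the code below converts each supplier's measured criterion values into quality losses and ranks suppliers by their weighted total loss; all numbers and targets are hypothetical.

```python
# Hedged sketch: Taguchi loss L(y) = k * (y - m)^2 per criterion, aggregated with
# AHP-style weights. Targets, loss coefficients, weights and supplier data are
# hypothetical, and the fuzzy/linear-programming stage of the study is omitted.

criteria = {
    # name: (target m, loss coefficient k, AHP weight)
    "quality_defect_pct":  (0.0, 4.0, 0.40),
    "delivery_delay_days": (0.0, 1.0, 0.35),
    "completeness_pct":    (100.0, 0.02, 0.25),
}

suppliers = {
    "Supplier A": {"quality_defect_pct": 1.0, "delivery_delay_days": 2.0, "completeness_pct": 98.0},
    "Supplier B": {"quality_defect_pct": 0.5, "delivery_delay_days": 4.0, "completeness_pct": 99.5},
}

def weighted_loss(measures):
    total = 0.0
    for name, (target, k, weight) in criteria.items():
        loss = k * (measures[name] - target) ** 2  # Taguchi quadratic loss
        total += weight * loss
    return total

ranked = sorted(suppliers, key=lambda s: weighted_loss(suppliers[s]))
for s in ranked:
    print(f"{s}: weighted loss = {weighted_loss(suppliers[s]):.2f}")
print("best supplier:", ranked[0])
```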

  18. Decision support for environmental management of industrial non-hazardous secondary materials: New analytical methods combined with simulation and optimization modeling.

    Science.gov (United States)

    Little, Keith W; Koralegedara, Nadeesha H; Northeim, Coleen M; Al-Abed, Souhail R

    2017-07-01

    Non-hazardous solid materials from industrial processes, once regarded as waste and disposed in landfills, offer numerous environmental and economic advantages when put to beneficial uses (BUs). Proper management of these industrial non-hazardous secondary materials (INSM) requires estimates of their probable environmental impacts among disposal as well as BU options. The U.S. Environmental Protection Agency (EPA) has recently approved new analytical methods (EPA Methods 1313-1316) to assess leachability of constituents of potential concern in these materials. These new methods are more realistic for many disposal and BU options than historical methods, such as the toxicity characteristic leaching protocol. Experimental data from these new methods are used to parameterize a chemical fate and transport (F&T) model to simulate long-term environmental releases from flue gas desulfurization gypsum (FGDG) when disposed of in an industrial landfill or beneficially used as an agricultural soil amendment. The F&T model is also coupled with optimization algorithms, the Beneficial Use Decision Support System (BUDSS), under development by EPA to enhance INSM management. Published by Elsevier Ltd.

  19. A combined static-dynamic single-dose imaging protocol to compare quantitative dynamic SPECT with static conventional SPECT.

    Science.gov (United States)

    Sciammarella, Maria; Shrestha, Uttam M; Seo, Youngho; Gullberg, Grant T; Botvinick, Elias H

    2017-08-03

    SPECT myocardial perfusion imaging (MPI) is a clinical mainstay that is typically performed with static imaging protocols and visually or semi-quantitatively assessed for perfusion defects based upon the relative intensity of myocardial regions. Dynamic cardiac SPECT presents a new imaging technique based on time-varying information of radiotracer distribution, which permits the evaluation of regional myocardial blood flow (MBF) and coronary flow reserve (CFR). In this work, a preliminary feasibility study was conducted in a small patient sample designed to implement a unique combined static-dynamic single-dose one-day visit imaging protocol to compare quantitative dynamic SPECT with static conventional SPECT for improving the diagnosis of coronary artery disease (CAD). Fifteen patients (11 males, four females, mean age 71 ± 9 years) were enrolled for a combined dynamic and static SPECT (Infinia Hawkeye 4, GE Healthcare) imaging protocol with a single dose of 99m Tc-tetrofosmin administered at rest and a single dose administered at stress in a one-day visit. Out of 15 patients, eleven had selective coronary angiography (SCA), 8 within 6 months and the rest within 24 months of SPECT imaging, without intervening symptoms or interventions. The extent and severity of perfusion defects in each myocardial region was graded visually. Dynamically acquired data were also used to estimate the MBF and CFR. Both visually graded images and estimated CFR were tested against SCA as a reference to evaluate the validity of the methods. Overall, conventional static SPECT was normal in ten patients and abnormal in five patients, dynamic SPECT was normal in 12 patients and abnormal in three patients, and CFR from dynamic SPECT was normal in nine patients and abnormal in six patients. Among those 11 patients with SCA, conventional SPECT was normal in 5, 3 with documented CAD on SCA with an overall accuracy of 64%, sensitivity of 40% and specificity of 83%. Dynamic SPECT image

  20. Analytical protocols for the determination of sulphur compounds characteristic of the metabolism of Chlorobium limicola

    Directory of Open Access Journals (Sweden)

    A. Aliboni

    2015-09-01

    Full Text Available Chlorobium limicola belongs to the green sulphur bacteria and has potential for technological applications such as biogas clean-up, oxidising hydrogen sulphide to elemental sulphur through a photosynthetic process. In the present work, analytical methods are described for the determination of different sulphur species in C. limicola cultures: sulphide by GC-FPD, sulphate by ionic HPLC and elemental sulphur by RP HPLC. The latter method eliminates the need for chloroform extraction of water suspensions of elemental sulphur. Data from the sulphide and elemental sulphur analyses have been compared with those obtained from more traditional analytical methodologies.

  1. A Business Evaluation Of The Next Generation Ipv6 Protocol In Fixed And Mobile Communication Services

    OpenAIRE

    Pau, Louis-François

    2002-01-01

    This paper gives an analytical business model of the Internet IPv4 and IPv6 protocols, focussing on the business implications of intrinsic technical properties of these protocols. The technical properties modeled in business terms are: address space, payload, autoconfiguration, IP mobility, security, and flow label. Three operational cash flow focussed performance indexes are defined for respectively an Internet operator or ISP, for the address domain owner, and for the end user...

  2. An assessment of the economic and environmental implications for Canada of the Kyoto Protocol

    International Nuclear Information System (INIS)

    2000-11-01

    The National Climate Change Process was launched in April 1998 to examine the feasibility and implications of Canada's commitment to the Kyoto Protocol. The Analysis Modelling Group (AMG) was designated to assess the economic and environmental consequences for Canada in achieving the target. This report summarizes the analytical approach, the assumptions, the results and the main findings of the AMG's efforts to analyse the macro-/micro-economic, social, health and environmental implications of the Kyoto Protocol. The role of the AMG was to provide policymakers with guidance on some issues such as the economic implications of different broad policy approaches, the potential costs of greater access to the Kyoto flexibility mechanisms, the sectoral and regional distributions of emissions reductions, and the degree to which Canada's competitive position could be affected by the achievement of the Protocol. The relative importance of greenhouse gas reduction was also discussed along with a review of actions that offer significant potential for emissions reductions. The AMG examined five policy packages or Paths which are differentiated by different degrees of reliance on specific measures and tradable permit systems and by the imposition of sectoral versus national targets. It was concluded that at the national level, attainment of the target results in sustained, long-term negative economic impacts. In the long run, the reduction in gross domestic product (GDP) relative to the business-as-usual case, ranges from 0 to 3 per cent depending on the path-scenario combination. It was emphasized that it is important to provide perspective on these estimates. 37 refs., 64 figs

  3. A critical analysis of a locally agreed protocol for clinical practice

    International Nuclear Information System (INIS)

    Owen, A.; Hogg, P.; Nightingale, J.

    2004-01-01

    Within the traditional scope of radiographic practice (including advanced practice) there is a need to demonstrate effective patient care and management. Such practice should be set within a context of appropriate evidence and should also reflect peer practice. In order to achieve such practice the use of protocols is encouraged. Effective protocols can maximise care and management by minimising inter- and intra-professional variation; they can also allow for detailed procedural records to be kept in case of legal claims. However, whilst literature exists to encourage the use of protocols, there is little published material available to indicate how to create, manage and archive them. This article uses an analytical approach to propose a suitable method for protocol creation and archival; it also offers suggestions on the scope and content of a protocol. To achieve this, an existing clinical protocol for radiographer reporting of barium enemas is analysed to draw out the general issues. Proposals for protocol creation, management, and archival were identified. The clinical practice described or inferred in the protocol should be drawn from evidence; such evidence could include peer-reviewed material, national standards and peer practice. The protocol should include an explanation of how to proceed when the radiographers reach the limit of their ability. It should refer to the initial training required to undertake the clinical duties as well as the ongoing continual professional updating required to maintain competence. Audit of practice should be indicated, including the preferred audit methodology, and associated with this should be a clear statement about standards and what to do if standards are not adequately met. Protocols should be archived, in a paper-based form, for lengthy periods in case of legal claims. The date the protocol was in clinical use should be included on the archived copy.

  4. Advances in analytical tools for high throughput strain engineering

    DEFF Research Database (Denmark)

    Marcellin, Esteban; Nielsen, Lars Keld

    2018-01-01

    The emergence of inexpensive, base-perfect genome editing is revolutionising biology. Modern industrial biotechnology exploits the advances in genome editing in combination with automation, analytics and data integration to build high-throughput automated strain engineering pipelines, also known as biofoundries. Biofoundries replace the slow and inconsistent artisanal processes used to build microbial cell factories with an automated design–build–test cycle, considerably reducing the time needed to deliver commercially viable strains. Testing and hence learning remains relatively shallow, but recent advances in analytical chemistry promise to increase the depth of characterization possible. Analytics combined with models of cellular physiology in automated systems biology pipelines should enable deeper learning and hence a steeper pitch of the learning cycle. This review explores the progress…

  5. A Context Aware Recommender System for Mobile Phone Selection Using Combination of Elimination Method and Analytic Hierarchy Processing

    Directory of Open Access Journals (Sweden)

    Jalal Rezaeenour

    2017-09-01

    Full Text Available Recommender systems suggest proper items to customers based on their preferences and needs. The time needed to search is reduced and the quality of the customer's choice is increased using recommender systems. Context information such as time, location and user behaviour can enhance the quality of recommendations and customer satisfaction in such systems. In this paper a context-aware recommender system is designed and implemented on Android smartphones to help customers select mobile phones. The system removes ineffective criteria on the user's purchases using sensor data from the customer's mobile phone, then creates an analytic hierarchy processing tree and computes the weights. Finally, the recommender system recommends a suitable mobile phone to the user, selecting phones using a combination of the elimination method and analytic hierarchy processing (AHP). The context-aware recommender system was used by mobile phone customers to assess recommendation satisfaction and user interface design satisfaction. In addition, a traditional non-context-aware recommender system was used to compare the recommendation results of the two systems. The article concludes that using context information can improve recommendation quality and user satisfaction. Because of the reduced number of criteria and pairwise comparisons, user interface design satisfaction also improves slightly.
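
    The abstract does not detail how the AHP weights are computed, so the sketch below shows one standard way of deriving criterion weights from a pairwise comparison matrix (the geometric-mean method with a consistency-ratio check). The criteria and judgement values are invented for illustration and are not taken from the paper.

```python
import numpy as np

# Minimal AHP weight calculation (geometric-mean method) for a pairwise
# comparison matrix of three hypothetical criteria, e.g. price, battery, camera.
# The matrix values are invented for illustration; the paper's actual
# criteria and judgements are not reproduced here.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Criterion weights: normalised geometric means of the matrix rows.
gm = np.prod(A, axis=1) ** (1.0 / A.shape[0])
weights = gm / gm.sum()

# Consistency check via the principal eigenvalue (lambda_max).
lambda_max = np.max(np.real(np.linalg.eigvals(A)))
n = A.shape[0]
ci = (lambda_max - n) / (n - 1)          # consistency index
cr = ci / 0.58                           # random index for n=3 is ~0.58
print("weights:", np.round(weights, 3), "CR:", round(cr, 3))
```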

  6. Publication trends of study protocols in rehabilitation.

    Science.gov (United States)

    Jesus, Tiago S; Colquhoun, Heather L

    2017-09-04

    Growing evidence points to the need to publish study protocols in the health field. The aim was to observe whether the growing interest in publishing study protocols in the broader health field has been translated into increased publications of rehabilitation study protocols. Observational study using publication data and its indexation in PubMed. PubMed was searched with appropriate combinations of Medical Subject Headings up to December 2014. The effective presence of study protocols was manually screened. Regression models analyzed the yearly growth of publications. Two-sample Z-tests analyzed whether the proportion of Systematic Reviews (SRs) and Randomized Controlled Trials (RCTs) among study protocols differed from that of the same designs for the broader rehabilitation research. Up to December 2014, 746 publications of rehabilitation study protocols were identified, with an exponential growth since 2005 (r2 = 0.981; p < 0.001). RCT protocols were the most common among rehabilitation study protocols (83%), and RCTs were significantly more prevalent among study protocols than among the broader rehabilitation research (83% vs. 35.8%; p < 0.001). For SRs, the picture was reversed: they were significantly less common among study protocols (2.8% vs. 9.3%; p < 0.001). Funding was more often reported by rehabilitation study protocols than by the broader rehabilitation research (90% vs. 53.1%; p < 0.001). Rehabilitation journals published a significantly lower share of rehabilitation study protocols than they did of the broader rehabilitation research (1.8% vs. 16.7%; p < 0.001). Identifying the reasons for these discrepancies and reversing unwarranted disparities (e.g. the low rate of publication of rehabilitation SR protocols) are likely new avenues for rehabilitation research and its publication. SRs, particularly those aggregating RCT results, are considered the best standard of evidence to guide rehabilitation clinical practice; however, that standard can be improved

  7. A new perspective on adolescent athletes’ transition into upper secondary school: A longitudinal mixed methods study protocol

    Directory of Open Access Journals (Sweden)

    Tatiana V. Ryba

    2016-12-01

    Full Text Available The challenge of combining elite sport and education into a dual career pathway remains a source of concern for many high-performance athletes. Previous research findings suggest that committed participation in both domains is highly demanding and success in one pursuit often comes at the expense of the other. There are emergent studies, however, that argue for the beneficial and complementary nature of dual career pathways. Consequently, we emphasize the importance of understanding the processes underlying differences in the development of athletes’ life trajectories. This article presents a study protocol to explore new methodological and analytical approaches that may extend current understandings of the ways psychological and sociocultural processes are interconnected in the construction of adolescent athletes’ identities, motivation, well-being, and career aspirations in the transitory social world.

  8. Development and validation of simple RP-HPLC-PDA analytical protocol for zileuton assisted with Design of Experiments for robustness determination

    Directory of Open Access Journals (Sweden)

    Saurabh B. Ganorkar

    2017-02-01

    Full Text Available A simple, rapid, sensitive, robust, stability-indicating RP-HPLC-PDA analytical protocol was developed and validated for the analysis of zileuton racemate in bulk and in tablet formulation. Development of the method and resolution of degradation products from forced hydrolytic (acidic, basic, neutral), oxidative, photolytic (acidic, basic, neutral, solid state) and thermal (dry heat) degradation were achieved on an LC-GC Qualisil BDS C18 column (250 mm × 4.6 mm × 5 μm) in isocratic mode at ambient temperature, employing a mobile phase of methanol and orthophosphoric acid (0.2%, v/v) in a ratio of 80:20 (v/v) at a flow rate of 1.0 mL min−1 with detection at 260 nm. 'Design of Experiments' (DOE) employing a 'Central Composite Design' (CCD) and 'Response Surface Methodology' (RSM) were applied as an advancement over the traditional 'One Variable at a Time' (OVAT) approach to evaluate the effects of variations in selected factors (methanol content, flow rate, concentration of orthophosphoric acid) for a graphical interpretation of robustness, and statistical interpretation was achieved with Multiple Linear Regression (MLR) and ANOVA. The method met the validation parameters: linearity, precision, accuracy, limit of detection, limit of quantitation, and robustness. The method was applied effectively for the analysis of in-house zileuton tablets.
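
    For readers unfamiliar with how a central composite design is laid out for a three-factor robustness study, the following sketch builds the design points with NumPy. The factor ranges are hypothetical placeholders around plausible nominal settings; they are not the validated method's levels.

```python
import itertools
import numpy as np

# Sketch of a face-centred central composite design (CCD) for the three
# robustness factors named in the abstract. The nominal levels and ranges
# below are hypothetical placeholders, not the validated method settings.
factors = {
    "methanol_pct": (78.0, 82.0),   # low, high
    "flow_mL_min":  (0.9, 1.1),
    "opa_pct":      (0.18, 0.22),
}

names = list(factors)
lows = np.array([factors[n][0] for n in names])
highs = np.array([factors[n][1] for n in names])
centres = (lows + highs) / 2
half = (highs - lows) / 2

# Coded design: 2^3 factorial corners, 6 axial (face-centred, alpha=1) points,
# and a centre point.
corners = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)
axial = np.vstack([v * sign for v in np.eye(3) for sign in (-1, 1)])
centre = np.zeros((1, 3))
coded = np.vstack([corners, axial, centre])

runs = centres + coded * half            # back-transform to real units
for row in runs:
    print(dict(zip(names, np.round(row, 3))))
```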

  9. Quantum cryptography: individual eavesdropping with the knowledge of the error-correcting protocol

    International Nuclear Information System (INIS)

    Horoshko, D B

    2007-01-01

    The quantum key distribution protocol BB84 combined with the repetition protocol for error correction is analysed from the point of view of its security against individual eavesdropping relying on quantum memory. It is shown that the mere knowledge of the error-correcting protocol changes the optimal attack and provides the eavesdropper with additional information on the distributed key. (Fifth Seminar in Memory of D.N. Klyshko)

  10. Medical management of early pregnancy failure (EPF): a retrospective analysis of a combined protocol of mifepristone and misoprostol used in clinical practice.

    Science.gov (United States)

    Colleselli, Valeria; Schreiber, Courtney A; D'Costa, Elisabeth; Mangesius, Stephanie; Wildt, Ludwig; Seeber, Beata E

    2014-06-01

    To evaluate the efficacy of a combined protocol of mifepristone and misoprostol in the management of early pregnancy failure (EPF), the average time to expulsion of tissue and the rate of side effects. Retrospective chart review of all consecutive women treated with primary medical management for EPF at our institution from 2006 to 2012. 168 patients were included in the present study. The overall success rate, defined as the absence of the need for surgical intervention, was 61 % and did not differ by calendar year. There was no difference in success rate grouped by diagnosis [intrauterine embryonic/fetal demise (IUED/IUFD) vs. anembryonic gestation; p = 0.30] or gestational age; success rates differed by the total misoprostol dose required, being higher when no more than 800 μg was needed (68 vs. 50 %, p = 0.029). Of the possible predictive factors of success, only the dose of misoprostol required was a significant independent negative predictor. Mean and median time to tissue expulsion after the first dose of misoprostol were 8.4 and 5.5 h, respectively. The incidence of side effects was low, with no blood transfusions required. The success rate in this study is markedly below published data. This can possibly be attributed to the retrospective study design, allowing for physician subjectivity and patients' wishes in the absence of strict study requirements. The protocol was well tolerated with a paucity of side effects. We make suggestions for enhancing success rates in the clinical setting by optimizing medication protocols, establishing precise treatment guidelines and training physicians in the accurate interpretation of treatment outcomes.

  11. Energy Efficient MANET Routing Using a Combination of Span and BECA/AFECA

    DEFF Research Database (Denmark)

    Kristensen, Mads Darø; Bouvin, Niels Olof

    2008-01-01

    This paper presents some novel approaches for energy efficient routing in mobile ad-hoc networks. Two known energy preserving techniques, Span and BECA/AFECA, are combined with a well-known reactive routing protocol, AODV, to create a new energy efficient routing protocol. Furthermore, the proto…

  12. Utilizing distributional analytics and electronic records to assess timeliness of inpatient blood glucose monitoring in non-critical care wards

    Directory of Open Access Journals (Sweden)

    Ying Chen

    2016-04-01

    Full Text Available Background Regular and timely monitoring of blood glucose (BG) levels in hospitalized patients with diabetes mellitus is crucial to optimizing inpatient glycaemic control. However, methods to quantify timeliness as a measurement of quality of care are lacking. We propose an analytical approach that utilizes BG measurements from electronic records to assess adherence to an inpatient BG monitoring protocol in hospital wards. Methods We applied our proposed analytical approach to electronic records obtained from 24 non-critical care wards in November and December 2013 from a tertiary care hospital in Singapore. We applied distributional analytics to evaluate daily adherence to BG monitoring timings. A one-sample Kolmogorov-Smirnov (1S-KS) test was performed to test daily BG timings against non-adherence represented by the uniform distribution. This test was performed among wards with high power, determined through simulation. The 1S-KS test was coupled with visualization via the cumulative distribution function (cdf) plot and a two-sample Kolmogorov-Smirnov (2S-KS) test, enabling comparison of the BG timing distributions between two consecutive days. We also applied mixture modelling to identify the key features in daily BG timings. Results We found that 11 out of the 24 wards had high power. Among these wards, the 1S-KS test with cdf plots indicated adherence to BG monitoring protocols. Integrating both 1S-KS and 2S-KS information within a moving window consisting of two consecutive days did not suggest frequent potential change from or towards non-adherence to protocol. From mixture modelling among wards with high power, we consistently identified four components with high concentration of BG measurements taken before mealtimes and around bedtime. This agnostic analysis provided additional evidence that the wards were adherent to BG monitoring protocols. Conclusions We demonstrated the utility of our proposed analytical approach as a monitoring
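
    A minimal sketch of the two Kolmogorov-Smirnov tests described above, using SciPy on simulated measurement times for one hypothetical ward: the one-sample test compares a day's BG timings against a uniform spread over 24 h (the non-adherence reference), and the two-sample test compares two consecutive days.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated BG measurement times (hours after midnight) for two consecutive
# days on one hypothetical ward: clustered around 07:00, 12:00, 18:00, 22:00,
# roughly mimicking pre-meal and bedtime checks.
def simulate_day(n_per_slot=15):
    slots = [7, 12, 18, 22]
    return np.clip(np.concatenate(
        [rng.normal(loc=s, scale=0.5, size=n_per_slot) for s in slots]), 0, 24)

day1, day2 = simulate_day(), simulate_day()

# 1S-KS: are the day's timings indistinguishable from a uniform spread over
# 24 h (i.e. consistent with non-adherence to any schedule)?
d1, p1 = stats.kstest(day1, "uniform", args=(0, 24))

# 2S-KS: did the timing distribution change between the two days?
d2, p2 = stats.ks_2samp(day1, day2)

print(f"1S-KS vs uniform: D={d1:.2f}, p={p1:.3g}")
print(f"2S-KS day1 vs day2: D={d2:.2f}, p={p2:.3g}")
```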

  13. Development of a micro total analytic system based on isotachophoresis for the separation and characterization of lanthanides

    International Nuclear Information System (INIS)

    Vio, L.

    2010-01-01

    The accurate and reproducible characterization of radioactive solutions in terms of isotope composition and concentration is an essential topic for analytical laboratories in the nuclear field. In order to reduce manipulation time in glove boxes and the production of contaminated waste, it is necessary to propose innovative and efficient solutions for these analyses. For several years, microchips have been a major field of development in analytical chemistry, and such devices could provide a solution that fits the needs of the nuclear industry. The aim of this work is to design a disposable analytical micro-device devoted to lanthanide separation from spent nuclear fuel before analysis by mass spectrometry. Designed to be used in place of a liquid chromatography separation step within a three-step protocol, the new protocol based on isotachophoresis (ITP) remains compatible with the other two steps. The complete separation of lanthanides by ITP was obtained using a single, rigorously selected chelating compound: 2-hydroxy-2-methylbutyric acid (HMBA). The main parameters involved in solute resolution were defined from the theoretical models of ITP, and experimental studies of the influence of these parameters allowed the geometry of the system to be optimized and its performance to be improved. To eliminate cleaning of the system and, consequently, to strongly reduce both the liquid waste volume and the handling of radioactive material, the ITP protocol was transferred to a polymeric (COC) disposable microchip specially developed for this purpose. (author) [fr]

  14. Analytical and Experimental Feasibility Study of Combined OTEC on NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Jeongtae; Oh, Kyemin; Heo, Gyunyoung [Kyung Hee Univ., Yongin (Korea, Republic of); Jung, Hoon [KEPCO Research Institute, Daejeon (Korea, Republic of)

    2013-05-15

    The concept of Combined Ocean Thermal Energy Conversion (Combined OTEC) needs to be studied. Combined OTEC uses exhaust steam from Nuclear Power Plants (NPPs) as its heat source instead of surface water. Exhaust steam extracted from the condenser evaporates the working fluid of the Combined OTEC cycle at a heat exchanger (Hx-W). The essential calculations for the conceptual design of Combined OTEC were already performed and presented previously. However, it remained unclear whether sufficient exhaust steam could be extracted from the high-vacuum condenser and supplied to Hx-W, an issue that is significant for continuing a demonstration programme. In this study, therefore, we calculated the rate of extracted steam using the RELAP code to evaluate whether sufficient steam can be extracted. For the implementation of Combined OTEC, confirmation of a sufficient flow of exhaust steam into Hx-W is the starting point of the research. As a result of the RELAP calculation, we confirmed that exhaust steam would flow into Hx-W. Considering the amount of exhaust steam in an NPP rated at 1000 MWe with 36 % efficiency, a 9 % flow rate to Hx-W means that about 160 MWt of heat is available as the heat source of Combined OTEC. Using this, it would be possible to improve the efficiency of aged NPPs and to compensate for the power loss caused by the increase of circulating water temperature, particularly in the summer season.
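
    The 160 MWt figure follows from a simple heat balance. The sketch below reproduces that back-of-the-envelope arithmetic only; it is not the RELAP calculation, and the plant parameters are taken as stated in the abstract.

```python
# Back-of-the-envelope heat balance implied by the abstract (a sketch, not
# the RELAP calculation itself).
electric_output_MW = 1000.0      # net electrical output of the NPP
efficiency = 0.36                # stated thermal efficiency
diverted_fraction = 0.09         # fraction of exhaust steam routed to Hx-W

thermal_input_MW = electric_output_MW / efficiency
rejected_heat_MW = thermal_input_MW - electric_output_MW   # condenser duty
heat_to_hx_w_MW = diverted_fraction * rejected_heat_MW

print(f"thermal input : {thermal_input_MW:7.0f} MWt")
print(f"rejected heat : {rejected_heat_MW:7.0f} MWt")
print(f"to Hx-W (9 %) : {heat_to_hx_w_MW:7.0f} MWt")   # ~160 MWt, as stated
```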

  15. Green Degree Comprehensive Evaluation of Elevator Based on Fuzzy Analytic Hierarchy Process

    Directory of Open Access Journals (Sweden)

    Lizhen

    2015-01-01

    Full Text Available The green design of an elevator has many characteristics, involving many factors and a combination of qualitative and quantitative information. In view of the fuzziness of the evaluation index information, the fuzzy analytic hierarchy process and a fuzzy comprehensive evaluation model are combined to evaluate the green degree of an elevator. In this method, the weights of the indexes at each level are calculated using the fuzzy analytic hierarchy process, and the green degree is then obtained with the fuzzy comprehensive evaluation model. The feasibility of the method is verified by taking the green degree evaluation of an elevator system as an example.
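
    As an illustration of the evaluation step, the sketch below combines criterion weights (assumed here to come from a fuzzy AHP) with a fuzzy membership matrix using the standard fuzzy comprehensive evaluation operator. All weights, grades and membership values are invented; the paper's actual index system is not reproduced.

```python
import numpy as np

# Minimal sketch of the fuzzy comprehensive evaluation step: criterion
# weights W (here assumed to come from a fuzzy AHP) are combined with a
# membership matrix R whose rows give each criterion's membership degrees
# in the evaluation grades. All numbers are invented for illustration.
grades = ["excellent", "good", "fair", "poor"]
criteria = ["energy use", "materials", "noise"]

W = np.array([0.5, 0.3, 0.2])            # weights, sum to 1
R = np.array([                            # rows: criteria, cols: grades
    [0.4, 0.4, 0.2, 0.0],
    [0.2, 0.5, 0.2, 0.1],
    [0.1, 0.3, 0.4, 0.2],
])

B = W @ R                                 # weighted-average operator
B = B / B.sum()                           # normalise
verdict = grades[int(np.argmax(B))]       # maximum-membership principle
print(dict(zip(grades, np.round(B, 3))), "->", verdict)
```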

  16. Analytical solutions for a single vertical drain with time-dependent vacuum combined surcharge preloading in membrane and membraneless systems

    International Nuclear Information System (INIS)

    Geng, X Y; Indraratna, B; Rujikiatkamjorn, C

    2010-01-01

    Vertical drains combined with vacuum pressure and surcharge preloading are widely used to accelerate the consolidation process of soft clay, in order to decrease the pore pressure as well as to increase the effective stress. Currently there are two types of vacuum preloading systems commercially available: (a) the membrane system, with an airtight membrane over the drainage layer, and (b) the membraneless system, where a vacuum line is connected to each individual drain. Their effectiveness varies from site to site depending on the type of soil treated and the characteristics of the drain-vacuum system. This study presents analytical solutions for vertical drains with vacuum preloading for both membrane and membraneless systems. In accordance with field and laboratory observations, the vacuum in the membraneless system was assumed to decrease along the drain, whereas in the membrane system it was maintained at a constant level. The model was verified using the measured settlements and excess pore pressures obtained from large-scale laboratory testing and case studies in Australia. The analytical solutions improved the accuracy of predicting the dissipation of pore water pressure and the associated settlement. The effects of the permeability of the sand blanket in a membrane system and the possible loss of vacuum are also discussed.

  17. New method development in prehistoric stone tool research: evaluating use duration and data analysis protocols.

    Science.gov (United States)

    Evans, Adrian A; Macdonald, Danielle A; Giusca, Claudiu L; Leach, Richard K

    2014-10-01

    Lithic microwear is a research field of prehistoric stone tool (lithic) analysis that has been developed with the aim to identify how stone tools were used. It has been shown that laser scanning confocal microscopy has the potential to be a useful quantitative tool in the study of prehistoric stone tool function. In this paper, two important lines of inquiry are investigated: (1) whether the texture of worn surfaces is constant under varying durations of tool use, and (2) the development of rapid objective data analysis protocols. This study reports on the attempt to further develop these areas of study and results in a better understanding of the complexities underlying the development of flexible analytical algorithms for surface analysis. The results show that when sampling is optimised, surface texture may be linked to contact material type, independent of use duration. Further research is needed to validate this finding and test an expanded range of contact materials. The use of automated analytical protocols has shown promise but is only reliable if sampling location and scale are defined. Results suggest that the sampling protocol reports on the degree of worn surface invasiveness, complicating the ability to investigate duration related textural characterisation. Copyright © 2014. Published by Elsevier Ltd.

  18. RCRA groundwater data analysis protocol for the Hanford Site, Washington

    International Nuclear Information System (INIS)

    Chou, C.J.; Jackson, R.L.

    1992-04-01

    The Resource Conservation and Recovery Act of 1976 (RCRA) groundwater monitoring program currently involves site-specific monitoring of 20 facilities on the Hanford Site in southeastern Washington. The RCRA groundwater monitoring program has collected abundant data on groundwater quality. These data are used to assess the impact of a facility on groundwater quality or whether remediation efforts under RCRA corrective action programs are effective. Both evaluations rely on statistical analysis of groundwater monitoring data. The need for information on groundwater quality by regulators and environmental managers makes statistical analysis of monitoring data an important part of RCRA groundwater monitoring programs. The complexity of groundwater monitoring programs and variabilities (spatial, temporal, and analytical) exhibited in groundwater quality variables indicate the need for a data analysis protocol to guide statistical analysis. A data analysis protocol was developed from the perspective of addressing regulatory requirements, data quality, and management information needs. This data analysis protocol contains four elements: data handling methods; graphical evaluation techniques; statistical tests for trend, central tendency, and excursion analysis; and reporting procedures for presenting results to users
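
    The protocol calls for statistical tests for trend, but the abstract does not name a specific test. As one example of the kind of nonparametric trend test commonly applied to groundwater quality series, the sketch below implements a bare-bones Mann-Kendall test (no tie or autocorrelation correction) on a hypothetical concentration series.

```python
import numpy as np
from scipy import stats

def mann_kendall(x):
    """Minimal Mann-Kendall trend test (no tie or autocorrelation correction)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # S statistic: sum of signs of all pairwise later-minus-earlier differences
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    p = 2 * (1 - stats.norm.cdf(abs(z)))   # two-sided p-value
    return s, z, p

# Hypothetical quarterly concentration series for one well (not Hanford data).
series = [4.1, 4.3, 4.2, 4.6, 4.8, 4.7, 5.0, 5.2, 5.1, 5.4]
s, z, p = mann_kendall(series)
print(f"S={s:.0f}, Z={z:.2f}, p={p:.4f}")
```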

  19. Dynamic QoS management in Differentiated Services using bandwidth brokers, RSVP aggregation and load control protocols

    NARCIS (Netherlands)

    Westberg, Lars; Eriksson, Anders; Karagiannis, Georgios; Heijenk, Geert; Rexhepi, Vlora; Partain, David

    2001-01-01

    A method and network subsystem for providing on demand end to end Quality of Service (QoS) in a dynamic manner, use a combination of Resource Reservation Protocol (RSVP), load control protocol (and its successors) and Bandwidth Brokers (BBs)(1106) which communicate using a predetermined protocol.

  20. Dynamic QoS management in Differentiated Services using bandwidth brokers, RSVP aggregation and load control protocols

    NARCIS (Netherlands)

    Westberg, Lars; Eriksson, Anders; Karagiannis, Georgios; Heijenk, Geert; Rexhepi, Vlora; Partain, David

    2009-01-01

    A method and network subsystem for providing on demand end to end Quality of Service (QoS) in a dynamic manner, use a combination of Resource Reservation Protocol (RSVP), load control protocol (and its successors) and Bandwidth Brokers (BBs)(1106) which communicate using a predetermined protocol.

  1. Metrology and analytical chemistry: Bridging the cultural gap

    International Nuclear Information System (INIS)

    King, Bernard

    2002-01-01

    Metrology in general and issues such as traceability and measurement uncertainty in particular are new to most analytical chemists and many remain to be convinced of their value. There is a danger of the cultural gap between metrologists and analytical chemists widening with unhelpful consequences and it is important that greater collaboration and cross-fertilisation is encouraged. This paper discusses some of the similarities and differences in the approaches adopted by metrologists and analytical chemists and indicates how these approaches can be combined to establish a unique metrology of chemical measurement which could be accepted by both cultures. (author)

  2. Multi-criteria approach with linear combination technique and analytical hierarchy process in land evaluation studies

    Directory of Open Access Journals (Sweden)

    Orhan Dengiz

    2018-01-01

    Full Text Available Land evaluation analysis is a prerequisite to achieving optimum utilization of the available land resources. A lack of knowledge of the best combination of factors suited to crop production has contributed to low yields. The aim of this study was to determine the most suitable areas for agricultural use. For that reason, in order to determine the land suitability classes of the study area, a multi-criteria approach was used with the linear combination technique and the analytical hierarchy process, taking into consideration land and soil physico-chemical characteristics such as slope, texture, depth, drainage, stoniness, erosion, pH, EC, CaCO3 and organic matter. These data and the land mapping units were taken from a detailed digital soil map at a scale of 1:5,000. In addition, GIS software was used to produce a land suitability map of the study area. This study was carried out at the Mahmudiye, Karaamca, Yazılı, Çiçeközü, Orhaniye and Akbıyık villages in the Yenişehir district of Bursa province. The total study area is 7059 ha; 6890 ha is used for irrigated agriculture, dry farming and pasture, while 169 ha is used for non-agricultural purposes such as settlements, roads and water bodies. The average annual temperature and precipitation of the study area are 16.1 °C and 1039.5 mm, respectively. Finally, after determination of the land suitability classes of the study area, it was found that 15.0% of the study area is highly (S1) or moderately (S2) suitable, while 85% is marginally suitable or unsuitable (coded S3 and N). Relations were also examined by comparing the results of the linear combination technique with other approaches such as the Land Use Capability Classification and the Suitability Class for Agricultural Use methods.
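
    The core of the linear combination technique is a weighted sum of normalised criterion scores that is then binned into suitability classes. The sketch below illustrates that step for a single mapping unit; the weights, scores and class thresholds are invented for illustration and are not the study's values.

```python
import numpy as np

# Sketch of the weighted linear combination step for one mapping unit.
# Criterion scores are assumed to be pre-normalised to 0-1 suitability
# (1 = best); weights are assumed to come from an AHP run. All values are
# invented for illustration, not the study's data.
criteria = ["slope", "texture", "depth", "drainage", "pH", "organic_matter"]
weights  = np.array([0.25, 0.20, 0.20, 0.15, 0.10, 0.10])   # sum to 1
scores   = np.array([0.80, 0.60, 0.90, 0.70, 0.85, 0.50])

suitability_index = float(weights @ scores)

# Simple class breaks (hypothetical thresholds).
if suitability_index >= 0.75:
    cls = "S1 (highly suitable)"
elif suitability_index >= 0.50:
    cls = "S2 (moderately suitable)"
elif suitability_index >= 0.25:
    cls = "S3 (marginally suitable)"
else:
    cls = "N (not suitable)"

print(f"index = {suitability_index:.2f} -> {cls}")
```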

  3. Performance analysis of routing protocols for IoT

    Science.gov (United States)

    Manda, Sridhar; Nalini, N.

    2018-04-01

    Internet of Things (IoT) is an arrangement of technologies that is interdisciplinary in nature. It is used to achieve an effective combination of physical and digital things. With IoT, physical things can have personal virtual identities and participate in distributed computing. Realization of IoT needs the use of sensors appropriate to the sector in which IoT is deployed. For instance, in the healthcare domain, IoT needs to be integrated with wearable sensors used by patients. As sensor devices produce huge amounts of data, often called big data, efficient routing protocols should be in place. As far as wireless networks are concerned, there are existing protocols such as OLSR, DSR and AODV. The paper additionally throws light on the trust-based routing protocol for low-power and lossy networks (TRPL) for IoT. These are widely used wireless routing protocols. As IoT is developing rapidly, it is essential to investigate routing protocols and evaluate their performance with respect to throughput, end-to-end delay, and routing overhead. The performance insights can help in making well-informed decisions when integrating wireless networks with IoT. In this paper, we analyzed different routing protocols and compared their performance. It was found that AODV showed better performance than the other routing protocols considered.

  4. Mac protocols for wireless sensor network (wsn): a comparative study

    International Nuclear Information System (INIS)

    Arshad, J.; Akram, Q.; Saleem, Y.

    2014-01-01

    Data communication between nodes is carried out under a Medium Access Control (MAC) protocol, which is defined at the data link layer. MAC protocols are responsible for communication and coordination between nodes according to the standards defined for WSN (Wireless Sensor Networks). The design of a MAC protocol should also address the issues of energy efficiency and transmission efficiency. A number of MAC protocols for WSN exist in the literature. In this paper, nine MAC protocols, namely S-MAC, T-MAC, Wise-MAC, Mu-MAC, Z-MAC, A-MAC, D-MAC, B-MAC and B-MAC+, have been explored, studied and analyzed. These nine protocols are classified into contention-based and hybrid (combination of contention- and schedule-based) MAC protocols. The goal of this comparative study is to provide a basis for MAC protocols and to highlight the different mechanisms used, with respect to parameters for the evaluation of energy and transmission efficiency in WSN. This study also aims to give the reader a better understanding of the concepts, processes and flow of information used in these MAC protocols for WSN. A comparison with respect to energy reservation scheme, idle listening avoidance, latency, fairness, data synchronization, and throughput maximization is presented. It was found that contention-based MAC protocols are less energy efficient than hybrid MAC protocols. From the analysis of contention-based MAC protocols in terms of energy consumption, it was observed that protocols based on preamble sampling consume less energy than protocols based on static or dynamic sleep schedules. (author)
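
    Much of the energy difference between contention-based and duty-cycled protocols comes from idle listening. The crude model below illustrates that effect; the radio currents, voltage and duty cycle are typical-looking placeholder values rather than measurements for any of the nine protocols compared.

```python
# Crude energy model contrasting an always-on MAC with a duty-cycled one.
# Radio parameters are typical-looking placeholder values, not measurements
# from any of the nine protocols compared in the paper.
V = 3.0                  # supply voltage (V)
I_RX = 0.020             # receive/idle-listen current (A)
I_SLEEP = 0.00002        # sleep current (A)
PERIOD_S = 3600.0        # one hour
DUTY_CYCLE = 0.02        # 2 % awake time for the duty-cycled protocol

def energy_joules(listen_fraction):
    listen_t = listen_fraction * PERIOD_S
    sleep_t = PERIOD_S - listen_t
    return V * (I_RX * listen_t + I_SLEEP * sleep_t)

always_on = energy_joules(1.0)
duty_cycled = energy_joules(DUTY_CYCLE)
print(f"always-on  : {always_on:7.1f} J per hour")
print(f"duty-cycled: {duty_cycled:7.1f} J per hour "
      f"({duty_cycled / always_on:.1%} of always-on)")
```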

  5. Simple and Accurate Analytical Solutions of the Electrostatically Actuated Curled Beam Problem

    KAUST Repository

    Younis, Mohammad I.

    2014-01-01

    We present analytical solutions of the electrostatically actuated initially deformed cantilever beam problem. We use a continuous Euler-Bernoulli beam model combined with a single-mode Galerkin approximation. We derive simple analytical expressions

  6. New nuclear facilities and their analytical applications in China

    International Nuclear Information System (INIS)

    Zhang, Z.Y.; He, X.; Ma, Y.H.; Ding, Y.Y.; Chai, Z.F.

    2014-01-01

    Nuclear analytical techniques are a family of modern analytical methods that are based on nuclear reactions, nuclear effects, nuclear radiations, nuclear spectroscopy, nuclear parameters, and nuclear facilities. Because of their combined characteristics of sensitivity and selectivity, they are widely used in projects ranging from life sciences to deep-space exploration. In this review article, new nuclear facilities and their analytical applications in China are selectively reviewed, covering the following aspects: large scientific facilities, national demands, and key scientific issues with the emphasis on the new achievements. (orig.)

  7. Assessment of Grade of Dysphonia and Correlation With Quality of Life Protocol.

    Science.gov (United States)

    Spina, Ana Lúcia; Crespo, Agrício Nubiato

    2017-03-01

    The main objective of this study is to check the correlation between vocal self-assessment and the results of the Voice-Related Quality of Life (V-RQOL) protocol, and whether there is a correlation between perceptual vocal assessment made by voice therapists and the results from the V-RQOL protocol. The study included 245 subjects with vocal complaints. This was a prospective analytical clinical study. Vocal perceptual assessment of each subject with dysphonia was made by three voice therapists, followed by self-assessment made by the subjects themselves, and the application of the V-RQOL protocol. The results showed a poor level of agreement between the vocal assessment made by the voice therapists and the self-assessment made by the subjects. The statistical analysis indicated that the results of the V-RQOL protocol showed significant correlation with the vocal assessment made by the voice therapists and the self-assessment by the subjects. The agreement between the assessments was low and variable; age, gender, professional voice use, and clinical laryngoscopic diagnosis did not influence the agreement level. The V-RQOL protocol is sensitive to vocal assessment made by the voice therapists and self-assessment made by the patient. Copyright © 2017 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  8. Safe bunker designing for the 18 MV Varian 2100 Clinac: a comparison between Monte Carlo simulation based upon data and new protocol recommendations.

    Science.gov (United States)

    Beigi, Manije; Afarande, Fatemeh; Ghiasi, Hosein

    2016-01-01

    The aim of this study was to compare two bunkers, one designed using only protocol recommendations and one designed using Monte Carlo (MC) based data derived for an 18 MV Varian 2100 Clinac accelerator. High energy radiation therapy is associated with fast and thermal photoneutrons. Adequate shielding against the contaminant neutrons has been recommended by the new IAEA and NCRP protocols. The latest protocols released by the IAEA (Safety Report No. 47) and NCRP Report No. 151 were used for the bunker design calculations, and an MC based data set was also derived. Two bunkers, designed from the protocols and from the MC based data, are presented and discussed. Regarding door thickness, the doors designed by MC simulation and by the Wu-McGinley analytical method were close in both BPE and lead thickness. In the case of the primary and secondary barriers, MC simulation resulted in 440.11 mm of ordinary concrete, and a total concrete thickness of 1709 mm was required. Calculating the same parameters with the recommended analytical methods resulted in a required thickness of 1762 mm, using the recommended TVL of 445 mm for concrete. Additionally, for the secondary barrier a thickness of 752.05 mm was obtained. Our results showed that MC simulation and the protocol recommendations are in good agreement for the radiation contamination dose calculation. The differences between the analytical and MC simulation methods revealed that applying only one method to bunker design may lead to underestimation or overestimation in dose and shielding calculations.
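
    For orientation, the analytical barrier calculation of the kind recommended in NCRP Report No. 151 proceeds from a required transmission factor to a number of tenth-value layers (TVLs). The sketch below shows that chain of formulas with placeholder inputs; the workload, distances and TVLs are assumptions for illustration, not the values used in the paper.

```python
import math

# Sketch of the TVL-based primary-barrier calculation of the kind NCRP
# Report No. 151 describes. All input values below are placeholders for
# illustration; they are not the workload or geometry used in the paper.
P = 0.10e-3        # shielding design goal at the occupied point (Sv/week)
W = 500.0          # workload at 1 m (Gy/week)
U = 0.25           # use factor for this barrier
T = 1.0            # occupancy factor
d = 6.0            # target-to-occupied-point distance (m)

TVL1 = 0.47        # first tenth-value layer for concrete at 18 MV (m), assumed
TVLe = 0.43        # equilibrium tenth-value layer (m), assumed

B = P * d**2 / (W * U * T)          # required transmission factor
n_tvl = -math.log10(B)              # number of tenth-value layers
thickness = TVL1 + (n_tvl - 1) * TVLe

print(f"B = {B:.2e}, n_TVL = {n_tvl:.2f}, barrier ≈ {thickness:.2f} m concrete")
```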

  9. Vertical equilibrium with sub-scale analytical methods for geological CO2 sequestration

    KAUST Repository

    Gasda, S. E.; Nordbotten, J. M.; Celia, M. A.

    2009-01-01

    The vertical equilibrium with sub-scale analytical method (VESA) combines the flexibility of a numerical method, allowing for heterogeneous and geologically complex systems, with the efficiency and accuracy of an analytical method, thereby eliminating expensive grid…

  10. A Secure Routing Protocol for Wireless Sensor Networks Considering Secure Data Aggregation

    Directory of Open Access Journals (Sweden)

    Triana Mugia Rahayu

    2015-06-01

    Full Text Available The commonly unattended and hostile deployments of WSNs and their resource-constrained sensor devices have led to an increasing demand for secure energy-efficient protocols. Routing and data aggregation receive the most attention since they are among the daily network routines. With the awareness of such demand, we found that so far there has been no work that lays out a secure routing protocol as the foundation for a secure data aggregation protocol. We argue that the secure routing role would be rendered useless if the data aggregation scheme built on it is not secure. Conversely, the secure data aggregation protocol needs a secure underlying routing protocol as its foundation in order to be effectively optimal. As an attempt at a solution, we devise an energy-aware protocol based on LEACH and ESPDA that combines a secure routing protocol and a secure data aggregation protocol. We then evaluate its security effectiveness and its energy-efficiency aspects, knowing that there is always a trade-off between the two.
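
    The proposed protocol builds on LEACH, whose defining mechanism is a randomised cluster-head election threshold. The sketch below shows that base mechanism only, with illustrative parameters; the authors' secure routing and ESPDA-style secure aggregation extensions are not reproduced here.

```python
import random

# Sketch of LEACH's randomised cluster-head election (the base protocol the
# paper builds on); the secure routing/aggregation extensions are not shown.
# Parameters are illustrative.
P = 0.05   # desired fraction of cluster heads per round

def leach_threshold(round_no, was_head_recently):
    """Election threshold T(n) for a node in a given round."""
    if was_head_recently:   # nodes that served in the last 1/P rounds sit out
        return 0.0
    return P / (1 - P * (round_no % round(1 / P)))

def elects_itself(round_no, was_head_recently, rng=random.random):
    return rng() < leach_threshold(round_no, was_head_recently)

random.seed(1)
heads = [n for n in range(100) if elects_itself(round_no=3, was_head_recently=False)]
print(f"round 3: {len(heads)} of 100 eligible nodes elect themselves cluster head")
```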

  11. Development of the protocol for purification of artemisinin based on combination of commercial and computationally designed adsorbents.

    Science.gov (United States)

    Piletska, Elena V; Karim, Kal; Cutler, Malcolm; Piletsky, Sergey A

    2013-01-01

    A polymeric adsorbent for extraction of the antimalarial drug artemisinin from Artemisia annua L. was computationally designed. This polymer demonstrated a high capacity for artemisinin (120 mg g−1), quantitative recovery (87%) and was found to be an effective material for purification of artemisinin from the complex plant matrix. Artemisinin quantification was conducted using an optimised HPLC-MS protocol, which was characterised by high precision and linearity in the concentration range between 0.05 and 2 μg mL−1. Optimisation of the purification protocol also involved screening of commercial adsorbents for the removal of waxes and other interfering natural compounds, which inhibit the crystallisation of artemisinin. As a result of the two-step purification protocol, crystals of artemisinin were obtained, and the artemisinin purity was evaluated as 75%. By performing the second stage of purification twice, the purity of artemisinin can be further improved to 99%. The developed protocol produced high-purity artemisinin using only a few purification steps, which makes it suitable for a large-scale industrial manufacturing process. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Combining bleach and mild predigestion improves ancient DNA recovery from bones

    DEFF Research Database (Denmark)

    Boessenkool, Sanne; Hanghøj, Kristian Ebbesen; Nistelberger, Heidi M.

    2017-01-01

    …aimed to improve ancient DNA recovery before library amplification have recently been developed. Here, we test the effects of combining two of such protocols, a bleach wash and a predigestion step, on 12 bone samples of Atlantic cod and domestic horse aged 750-1350 cal. years before present. Using high… library characteristics, such as DNA damage profiles or the composition of microbial communities, are little affected by the pre-extraction protocols. Application of the combined protocol presented in this study will facilitate the genetic analysis of an increasing number of ancient remains…

  13. Big Data Analytics in Healthcare.

    Science.gov (United States)

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  14. Analytical and Numerical Studies of Sloshing in Tanks

    Energy Technology Data Exchange (ETDEWEB)

    Solaas, F

    1996-12-31

    For oil cargo ship tanks and liquid natural gas carriers, the dimensions of the tanks are often such that the highest resonant sloshing periods and the ship motions are in the same period range, which may cause violent resonant sloshing of the liquid. In this doctoral thesis, linear and non-linear analytical potential theory solutions of the sloshing problem are studied for a two-dimensional rectangular tank and a vertical circular cylindrical tank, using perturbation technique for the non-linear case. The tank is forced to oscillate harmonically with small amplitudes of sway with frequency in the vicinity of the lowest natural frequency of the fluid inside the tank. The method is extended to other tank shapes using a combined analytical and numerical method. A boundary element numerical method is used to determine the eigenfunctions and eigenvalues of the problem. These are used in the non-linear analytical free surface conditions, and the velocity potential and free surface elevation for each boundary value problem in the perturbation scheme are determined by the boundary element method. Both the analytical method and the combined analytical and numerical method are restricted to tanks with vertical walls in the free surface. The suitability of a commercial programme, FLOW-3D, to estimate sloshing is studied. It solves the Navier-Stokes equations by the finite difference method. The free surface as function of time is traced using the fractional volume of fluid method. 59 refs., 54 figs., 37 tabs.

  15. Analytical and Numerical Studies of Sloshing in Tanks

    Energy Technology Data Exchange (ETDEWEB)

    Solaas, F.

    1995-12-31

    For oil cargo ship tanks and liquid natural gas carriers, the dimensions of the tanks are often such that the highest resonant sloshing periods and the ship motions are in the same period range, which may cause violent resonant sloshing of the liquid. In this doctoral thesis, linear and non-linear analytical potential theory solutions of the sloshing problem are studied for a two-dimensional rectangular tank and a vertical circular cylindrical tank, using perturbation technique for the non-linear case. The tank is forced to oscillate harmonically with small amplitudes of sway with frequency in the vicinity of the lowest natural frequency of the fluid inside the tank. The method is extended to other tank shapes using a combined analytical and numerical method. A boundary element numerical method is used to determine the eigenfunctions and eigenvalues of the problem. These are used in the non-linear analytical free surface conditions, and the velocity potential and free surface elevation for each boundary value problem in the perturbation scheme are determined by the boundary element method. Both the analytical method and the combined analytical and numerical method are restricted to tanks with vertical walls in the free surface. The suitability of a commercial programme, FLOW-3D, to estimate sloshing is studied. It solves the Navier-Stokes equations by the finite difference method. The free surface as function of time is traced using the fractional volume of fluid method. 59 refs., 54 figs., 37 tabs.

  16. SPARTex: A Vertex-Centric Framework for RDF Data Analytics

    KAUST Repository

    Abdelaziz, Ibrahim

    2015-08-31

    A growing number of applications require combining SPARQL queries with generic graph search on RDF data. However, the lack of procedural capabilities in SPARQL makes it inappropriate for graph analytics. Moreover, RDF engines focus on SPARQL query evaluation whereas graph management frameworks perform only generic graph computations. In this work, we bridge the gap by introducing SPARTex, an RDF analytics framework based on the vertex-centric computation model. In SPARTex, user-defined vertex centric programs can be invoked from SPARQL as stored procedures. SPARTex allows the execution of a pipeline of graph algorithms without the need for multiple reads/writes of input data and intermediate results. We use a cost-based optimizer for minimizing the communication cost. SPARTex evaluates queries that combine SPARQL and generic graph computations orders of magnitude faster than existing RDF engines. We demonstrate a real system prototype of SPARTex running on a local cluster using real and synthetic datasets. SPARTex has a real-time graphical user interface that allows the participants to write regular SPARQL queries, use our proposed SPARQL extension to declaratively invoke graph algorithms or combine/pipeline both SPARQL querying and generic graph analytics.
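
    SPARTex's own API is not reproduced here; as a generic illustration of the vertex-centric model the abstract refers to, the sketch below runs a Pregel-style connected-components computation (label propagation with a vote-to-halt loop) on a toy graph in plain Python.

```python
# Generic Pregel-style vertex-centric sketch (connected components via label
# propagation). This illustrates the programming model only; it does not use
# or reproduce SPARTex's actual API.
edges = [("a", "b"), ("b", "c"), ("d", "e")]   # tiny undirected toy graph

# Build adjacency lists and initialise each vertex's label to its own id.
neighbours = {}
for u, v in edges:
    neighbours.setdefault(u, set()).add(v)
    neighbours.setdefault(v, set()).add(u)
label = {v: v for v in neighbours}

# Superstep loop: every vertex adopts the smallest label seen among its
# neighbours' messages; iterate until no label changes (vote to halt).
changed = True
while changed:
    changed = False
    messages = {v: [label[u] for u in neighbours[v]] for v in neighbours}
    for v, incoming in messages.items():
        best = min(incoming + [label[v]])
        if best < label[v]:
            label[v] = best
            changed = True

print(label)   # {'a': 'a', 'b': 'a', 'c': 'a', 'd': 'd', 'e': 'd'}
```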

  17. Thermal/optical methods for elemental carbon quantification in soils and urban dusts: equivalence of different analysis protocols.

    Directory of Open Access Journals (Sweden)

    Yongming Han

    Full Text Available Quantifying elemental carbon (EC) content in geological samples is challenging due to interferences of crustal, salt, and organic material. Thermal/optical analysis, combined with acid pretreatment, represents a feasible approach. However, the consistency of various thermal/optical analysis protocols for this type of samples has never been examined. In this study, urban street dust and soil samples from Baoji, China were pretreated with acids and analyzed with four thermal/optical protocols to investigate how analytical conditions and optical correction affect EC measurement. The EC values measured with reflectance correction (ECR) were found always higher and less sensitive to temperature program than the EC values measured with transmittance correction (ECT). A high-temperature method with extended heating times (STN120) showed the highest ECT/ECR ratio (0.86) while a low-temperature protocol (IMPROVE-550), with heating time adjusted for sample loading, showed the lowest (0.53). STN ECT was higher than IMPROVE ECT, in contrast to results from aerosol samples. A higher peak inert-mode temperature and extended heating times can elevate ECT/ECR ratios for pretreated geological samples by promoting pyrolyzed organic carbon (PyOC) removal over EC under trace levels of oxygen. Considering that PyOC within filter increases ECR while decreases ECT from the actual EC levels, simultaneous ECR and ECT measurements would constrain the range of EC loading and provide information on method performance. Further testing with standard reference materials of common environmental matrices supports the findings. Char and soot fractions of EC can be further separated using the IMPROVE protocol. The char/soot ratio was lower in street dusts (2.2 on average) than in soils (5.2 on average), most likely reflecting motor vehicle emissions. The soot concentrations agreed with EC from CTO-375, a pure thermal method.

  18. Thermal/optical methods for elemental carbon quantification in soils and urban dusts: equivalence of different analysis protocols.

    Science.gov (United States)

    Han, Yongming; Chen, Antony; Cao, Junji; Fung, Kochy; Ho, Fai; Yan, Beizhan; Zhan, Changlin; Liu, Suixin; Wei, Chong; An, Zhisheng

    2013-01-01

    Quantifying elemental carbon (EC) content in geological samples is challenging due to interferences of crustal, salt, and organic material. Thermal/optical analysis, combined with acid pretreatment, represents a feasible approach. However, the consistency of various thermal/optical analysis protocols for this type of samples has never been examined. In this study, urban street dust and soil samples from Baoji, China were pretreated with acids and analyzed with four thermal/optical protocols to investigate how analytical conditions and optical correction affect EC measurement. The EC values measured with reflectance correction (ECR) were found always higher and less sensitive to temperature program than the EC values measured with transmittance correction (ECT). A high-temperature method with extended heating times (STN120) showed the highest ECT/ECR ratio (0.86) while a low-temperature protocol (IMPROVE-550), with heating time adjusted for sample loading, showed the lowest (0.53). STN ECT was higher than IMPROVE ECT, in contrast to results from aerosol samples. A higher peak inert-mode temperature and extended heating times can elevate ECT/ECR ratios for pretreated geological samples by promoting pyrolyzed organic carbon (PyOC) removal over EC under trace levels of oxygen. Considering that PyOC within filter increases ECR while decreases ECT from the actual EC levels, simultaneous ECR and ECT measurements would constrain the range of EC loading and provide information on method performance. Further testing with standard reference materials of common environmental matrices supports the findings. Char and soot fractions of EC can be further separated using the IMPROVE protocol. The char/soot ratio was lower in street dusts (2.2 on average) than in soils (5.2 on average), most likely reflecting motor vehicle emissions. The soot concentrations agreed with EC from CTO-375, a pure thermal method.

  19. Surface enhanced raman spectroscopy analytical, biophysical and life science applications

    CERN Document Server

    Schlücker, Sebastian

    2013-01-01

    Covering everything from the basic theoretical and practical knowledge to new exciting developments in the field with a focus on analytical and life science applications, this monograph shows how to apply surface-enhanced Raman scattering (SERS) for solving real world problems. From the contents: * Theory and practice of SERS * Analytical applications * SERS combined with other analytical techniques * Biophysical applications * Life science applications including various microscopies Aimed at analytical, surface and medicinal chemists, spectroscopists, biophysicists and materials scientists. Includes a Foreword by the renowned Raman spectroscopist Professor Wolfgang Kiefer, the former Editor-in-Chief of the Journal of Raman Spectroscopy.

  20. Optical trapping for analytical biotechnology.

    Science.gov (United States)

    Ashok, Praveen C; Dholakia, Kishan

    2012-02-01

    We describe the exciting advances of using optical trapping in the field of analytical biotechnology. This technique has opened up opportunities to manipulate biological particles at the single cell or even at subcellular levels which has allowed an insight into the physical and chemical mechanisms of many biological processes. The ability of this technique to manipulate microparticles and measure pico-Newton forces has found several applications such as understanding the dynamics of biological macromolecules, cell-cell interactions and the micro-rheology of both cells and fluids. Furthermore we may probe and analyse the biological world when combining trapping with analytical techniques such as Raman spectroscopy and imaging. Copyright © 2011 Elsevier Ltd. All rights reserved.

  1. A Novel k-out-of-n Oblivious Transfer Protocol from Bilinear Pairing

    Directory of Open Access Journals (Sweden)

    Jue-Sam Chou

    2012-01-01

    Full Text Available Oblivious transfer (OT protocols mainly contain three categories: 1-out-of-2 OT, 1-out-of-n OT, and k-out-of-n OT. In most cases, they are treated as cryptographic primitives and are usually executed without consideration of possible attacks that might frequently occur in an open network, such as an impersonation, replaying, or man-in-the-middle attack. Therefore, when used in certain applications, such as mental poker games and fair contract signings, some extra mechanisms must be combined to ensure the security of the protocol. However, after a combination, we found that very few of the resulting schemes are efficient enough in terms of communicational cost, which is a significant concern for generic commercial transactions. Therefore, we propose a novel k-out-of-n oblivious transfer protocol based on bilinear pairing, which not only satisfies the requirements of a k-out-of-n OT protocol, but also provides mutual authentication to resist malicious attacks. Meanwhile, it is efficient in terms of communication cost.

  2. Early pain management after periodontal treatment in dogs – comparison of single and combined analgesic protocols

    Directory of Open Access Journals (Sweden)

    Petr Raušer

    2015-01-01

    Full Text Available The aim of this study was to assess the analgesic effectiveness of three analgesic protocols in dogs undergoing periodontal treatment. The study was performed as a prospective, randomized, "double blind" clinical study. A total of 45 client-owned dogs scheduled for periodontal treatment were included. Dogs of Group C received carprofen (4 mg·kg-1), dogs of Group B received bupivacaine (1 mg·kg-1) and dogs of Group CB received a combination of carprofen (4 mg·kg-1) and bupivacaine (1 mg·kg-1). Carprofen was administered subcutaneously 30 min before anaesthesia; bupivacaine was administered by nerve blocks in anaesthetized dogs. Painful periodontal treatment lasting up to one hour was performed in all patients. The modified University of Melbourne Pain Score (UMPS), Visual Analogue Scale for pain assessment (VAS), plasma glucose and serum cortisol levels were assessed 30 min before administration of analgesics (C-0, B-0, CB-0) and 2 h after recovery from anaesthesia (C-2, B-2, CB-2). For statistical analysis the Friedman test, Mann-Whitney U-test, ANOVA and Fisher exact test were used (P < 0.05). Modified UMPS values decreased significantly in CB-2 compared to CB-0, and UMPS values in CB-2 were significantly lower than in C-2 or B-2. VAS values were significantly increased in C-2 compared to C-0 and in B-2 compared to B-0, and were significantly lower in CB-2 than in C-2 or B-2. Significantly increased plasma glucose concentrations were found in C-2 compared to C-0 and in B-2 compared to B-0. No other significant differences were detected. Administration of carprofen, bupivacaine or their combination is sufficient for early postoperative analgesia following periodontal treatment. The carprofen-bupivacaine combination is superior to carprofen or bupivacaine administered separately.

  3. Automated extraction protocol for quantification of SARS-Coronavirus RNA in serum: an evaluation study

    Directory of Open Access Journals (Sweden)

    Lui Wing-bong

    2006-02-01

    Full Text Available Background We have previously developed a test for the diagnosis and prognostic assessment of the severe acute respiratory syndrome (SARS) based on the detection of the SARS-coronavirus RNA in serum by real-time quantitative reverse transcriptase polymerase chain reaction (RT-PCR). In this study, we evaluated the feasibility of automating the serum RNA extraction procedure in order to increase the throughput of the assay. Methods An automated nucleic acid extraction platform using the MagNA Pure LC instrument (Roche Diagnostics) was evaluated. We developed a modified protocol in compliance with the recommended biosafety guidelines from the World Health Organization based on the use of the MagNA Pure total nucleic acid large volume isolation kit for the extraction of SARS-coronavirus RNA. The modified protocol was compared with a column-based extraction kit (QIAamp viral RNA mini kit, Qiagen) for quantitative performance, analytical sensitivity and precision. Results The newly developed automated protocol was shown to be free from carry-over contamination and to have comparable performance with other standard protocols and kits designed for the MagNA Pure LC instrument. However, the automated method was found to be less sensitive, less precise and led to consistently lower serum SARS-coronavirus concentrations when compared with the column-based extraction method. Conclusion As the diagnostic efficiency and prognostic value of the serum SARS-CoV RNA RT-PCR test is critically associated with the analytical sensitivity and quantitative performance contributed both by the RNA extraction and RT-PCR components of the test, we recommend the use of the column-based manual RNA extraction method.

  4. Using the Technology of the Confessional as an Analytical Resource: Four Analytical Stances Towards Research Interviews in Discourse Analysis

    Directory of Open Access Journals (Sweden)

    Brendan K. O'Rourke

    2007-05-01

    Full Text Available Among the various approaches that have developed from FOUCAULT's work is an Anglophone discourse analysis that has attempted to combine FOUCAULTian insights with the techniques of Conversation Analysis. An important current methodological issue in this discourse analytical approach is its theoretical preference for "naturally occurring" rather than research interview data. A FOUCAULTian perspective on the interview as a research instrument questions the idea of "naturally occurring discourse". The "technology of the confessional" operates not only within research interviews but permeates other interactions as well. Drawing on FOUCAULT does not dismiss the problems of the interview as a research instrument; rather, it shows that they cannot be escaped by simply switching to more "natural" interactions. Combining these insights with recent developments within discourse analysis can provide analytical resources for, rather than barriers to, the discourse analysis of research interviews. To aid such an approach, we develop a four-way categorisation of analytical stances towards the research interview in discourse analysis. A demonstration of how a research interview might be subjected to a discourse analysis using elements of this approach is then provided. URN: urn:nbn:de:0114-fqs070238

  5. A multigear protocol for sampling crayfish assemblages in Gulf of Mexico coastal streams

    Science.gov (United States)

    William R. Budnick; William E. Kelso; Susan B. Adams; Michael D. Kaller

    2018-01-01

    Identifying an effective protocol for sampling crayfish in streams that vary in habitat and physical/chemical characteristics has proven problematic. We evaluated an active, combined-gear (backpack electrofishing and dipnetting) sampling protocol in 20 Coastal Plain streams in Louisiana. Using generalized linear models and rarefaction curves, we evaluated environmental...

  6. Introduction on microbiological and biological methods and their possible combination with other analytical techniques for the detection of irradiated food

    International Nuclear Information System (INIS)

    Leonardi, M.

    1991-01-01

    Food irradiation is a physical method of processing and preserving food. One of the main purposes of applying this technology to food is to obtain specific biological effects in the treated foodstuff. Typical examples of these treatment effects are listed in the article. A whole range of techniques is at the disposal of the analyst to assure the Quality Control (QC) of various foodstuffs. They are based on microbiological, organoleptical, chemical, biochemical, immunological and/or physical methods. In the case of irradiation-preserved food, the author's opinion is that very often only a combination of analytical methods can solve the problem of detecting irradiated foodstuffs; in most cases this combination would pair a biological or microbiological method with a chemical or physical one. The rationale for such a combination of techniques is manifold: it combines the advantages of a rapid screening method with those of a more refined and reliable, even if more time-consuming, one, and it offers the possibility of carrying out the analysis for the control of irradiated foodstuffs in different kinds of food control laboratories, which are often equipped differently. These methods are briefly explained. At present, no method seems promising for the quantitative determination of the irradiation dose. Moreover, some of the proposed methods can only give a good presumption of the irradiation treatment applied to particular foodstuffs. (18 refs)

  7. An analytical model for the assessment of airline expansion strategies

    Directory of Open Access Journals (Sweden)

    Mauricio Emboaba Moreira

    2014-01-01

    Full Text Available Purpose: The purpose of this article is to develop an analytical model to assess airline expansion strategies by combining generic business strategy models with airline business models. Methodology and approach: A number of airline business models are examined, as are Porter's (1983) five industry forces that drive competition, complemented by Nalebuff/Brandenburger's (1996) sixth force, and the basic elements of the general environment in which the expansion process takes place. A system of points and weights is developed to create a score among the 904,736 possible combinations considered. The model's outputs are generic expansion strategies with quantitative assessments for each specific combination of elements inputted. Originality and value: The analytical model developed is original because it is the first to explicitly combine elements of the general environment, the industry environment, airline business models and the generic expansion strategy types. Besides, it creates a system of scores that may be used to drive the decision process toward the choice of a specific strategic expansion path. Research implications: The analytical model may be adapted to other industries apart from the airline industry by substituting the element "airline business model" with the corresponding elements of those industries' specific business models.
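    A rough sketch of the points-and-weights idea is given below; the element names, points and weights are purely illustrative assumptions, and the article's actual scoring scale for its 904,736 combinations is not reproduced here.

```python
def expansion_strategy_score(element_points, weights):
    """Weighted sum of the points assigned to each strategic element of one
    candidate combination (general environment, industry forces, business model, ...)."""
    return sum(weights[name] * points for name, points in element_points.items())

# Hypothetical scoring of a single combination of elements
combination = {"general_environment": 3, "industry_forces": 4, "airline_business_model": 5}
weights = {"general_environment": 0.20, "industry_forces": 0.35, "airline_business_model": 0.45}
score = expansion_strategy_score(combination, weights)
```

    Ranking all candidate combinations by such a score is what allows a model of this kind to point toward a preferred expansion path.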

  8. An automated baseline correction protocol for infrared spectra of atmospheric aerosols collected on polytetrafluoroethylene (Teflon) filters

    Science.gov (United States)

    Kuzmiakova, Adele; Dillner, Ann M.; Takahama, Satoshi

    2016-06-01

    A growing body of research on statistical applications for characterization of atmospheric aerosol Fourier transform infrared (FT-IR) samples collected on polytetrafluoroethylene (PTFE) filters (e.g., Russell et al., 2011; Ruthenburg et al., 2014) and a rising interest in analyzing FT-IR samples collected by air quality monitoring networks call for an automated PTFE baseline correction solution. The existing polynomial technique (Takahama et al., 2013) is not scalable to a project with a large number of aerosol samples because it contains many parameters and requires expert intervention. Therefore, the question of how to develop an automated method for baseline correcting hundreds to thousands of ambient aerosol spectra given the variability in both environmental mixture composition and PTFE baselines remains. This study approaches the question by detailing the statistical protocol, which allows for the precise definition of analyte and background subregions, applies nonparametric smoothing splines to reproduce sample-specific PTFE variations, and integrates performance metrics from atmospheric aerosol and blank samples alike in the smoothing parameter selection. Referencing 794 atmospheric aerosol samples from seven Interagency Monitoring of PROtected Visual Environment (IMPROVE) sites collected during 2011, we start by identifying key FT-IR signal characteristics, such as non-negative absorbance or analyte segment transformation, to capture sample-specific transitions between background and analyte. While referring to qualitative properties of PTFE background, the goal of smoothing splines interpolation is to learn the baseline structure in the background region to predict the baseline structure in the analyte region. We then validate the model by comparing smoothing splines baseline-corrected spectra with uncorrected and polynomial baseline (PB)-corrected equivalents via three statistical applications: (1) clustering analysis, (2) functional group quantification
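    A loose sketch of the spline step described above is shown here, not the authors' implementation: a smoothing spline is fitted to background-only subregions and evaluated over the full spectrum to predict the baseline under the analyte bands. The wavenumber windows, smoothing factor and synthetic PTFE-like spectrum are illustrative assumptions.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

def spline_baseline(wavenumber, absorbance, background_windows, smoothing=None):
    """Fit a smoothing spline to background-only subregions and return the
    predicted baseline over the full spectrum (illustrative sketch only)."""
    mask = np.zeros_like(wavenumber, dtype=bool)
    for lo, hi in background_windows:
        mask |= (wavenumber >= lo) & (wavenumber <= hi)
    order = np.argsort(wavenumber[mask])          # spline needs increasing x
    spline = UnivariateSpline(wavenumber[mask][order],
                              absorbance[mask][order], s=smoothing)
    return spline(wavenumber)

# Synthetic example: a broad PTFE-like baseline plus one analyte band near 2920 cm-1
wn = np.linspace(1300, 4000, 1400)
baseline_true = 0.02 + 1e-8 * (wn - 1300) ** 2
spectrum = baseline_true + 0.05 * np.exp(-((wn - 2920) / 30) ** 2)
background = [(1300, 2500), (3500, 4000)]          # assumed analyte-free windows
corrected = spectrum - spline_baseline(wn, spectrum, background, smoothing=1e-4)
```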

  9. Combined approach of grey relational analysis and analytic hierarchy process for ARCAL/IAEA strategic actions prioritization

    International Nuclear Information System (INIS)

    Silva, Pedro Maffia da; Martins, Eduardo Ferraz; Rondinelli Junior, Francisco; Garcia, Pauli Adriano de Almada

    2015-01-01

    The IAEA technical cooperation (TC) programme is the main mechanism through which the IAEA delivers technical services to its Member States. Through the programme, the IAEA helps Member States to build, strengthen and maintain capacities in the safe, peaceful and secure use of nuclear technology in support of sustainable socioeconomic development. The Regional Cooperation Agreement for the Promotion of Nuclear Science and Technology in Latin America and the Caribbean (ARCAL) is a TC agreement between most IAEA member states in the Latin America and the Caribbean region for technical and economic cooperation to promote the use of nuclear techniques for peace and development. The present study aims to propose a combined approach to prioritize the needs and problems of the ARCAL region. To do that, this paper considers the concepts of Grey Relational Analysis and the Analytic Hierarchy Process for data treatment, standardization and ranking of those needs and problems. In other words, the proposition intends to reduce the biases that may be introduced during the needs-and-problems assessment stage of the regional strategic profile formulation. (author)
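    The mechanics of combining the two methods can be sketched generically as below; the scores, AHP-derived weights and distinguishing coefficient ζ = 0.5 are illustrative assumptions, not ARCAL data.

```python
import numpy as np

def grey_relational_grades(X, weights, zeta=0.5):
    """Generic Grey Relational Analysis: rows are alternatives (needs/problems),
    columns are benefit-type criteria; weights would come from an AHP pairwise
    comparison matrix."""
    # Grey relational generating: scale each criterion to [0, 1]
    norm = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
    delta = np.abs(1.0 - norm)                   # distance to the ideal sequence
    coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    return coeff @ weights                       # weighted grey relational grade

# Illustrative data: 4 regional needs scored on 3 criteria, AHP weights assumed
scores = np.array([[7, 5, 9], [6, 8, 4], [9, 6, 6], [5, 9, 7]], dtype=float)
ahp_weights = np.array([0.54, 0.30, 0.16])       # priorities summing to 1
ranking = np.argsort(-grey_relational_grades(scores, ahp_weights))
```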

  10. Combined approach of grey relational analysis and analytic hierarchy process for ARCAL/IAEA strategic actions prioritization

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Pedro Maffia da; Martins, Eduardo Ferraz; Rondinelli Junior, Francisco, E-mail: pmsilva@cnen.gov.br, E-mail: efmartins@cnen.gov.br, E-mail: rondinel@cnen.gov.br [Comissao Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil); Garcia, Pauli Adriano de Almada, E-mail: pauliadriano@id.uff.br [Universidade Federal Fluminense (UFF), Volta Redonda, RJ (Brazil)

    2015-07-01

    The IAEA technical cooperation (TC) programme is the main mechanism through which the IAEA delivers technical services to its Member States. Through the programme, the IAEA helps Member States to build, strengthen and maintain capacities in the safe, peaceful and secure use of nuclear technology in support of sustainable socioeconomic development. The Regional Cooperation Agreement for the Promotion of Nuclear Science and Technology in Latin America and the Caribbean (ARCAL) is a TC agreement between most IAEA member states in the Latin America and the Caribbean region for technical and economic cooperation to promote the use of nuclear techniques for peace and development. The present study aims to propose a combined approach to prioritize the needs and problems of the ARCAL region. To do that, this paper considers the concepts of Grey Relational Analysis and the Analytic Hierarchy Process for data treatment, standardization and ranking of those needs and problems. In other words, the proposition intends to reduce the biases that may be introduced during the needs-and-problems assessment stage of the regional strategic profile formulation. (author)

  11. Interlaboratory test comparison among Environmental Radioactivity Laboratories using the ISO/IUPAC/AOAC Protocol

    International Nuclear Information System (INIS)

    Romero, L.; Ramos, L.; Salas, R.

    1998-01-01

    World-wide acceptance of results from radiochemical analyses requires reliable and comparable measurements traceable to SI units, particularly when data sets generated by laboratories are to contribute to the evaluation of data from environmental pollution research and monitoring programmes. The Spanish Nuclear Safety Council (CSN) organizes, in collaboration with CIEMAT, periodical interlaboratory test comparisons for environmental radioactivity laboratories aiming to provide them with the necessary means to assess the quality of their results. This paper presents data from the most recent exercise which, for the first time, was evaluated following the procedure recommended in the ISO/IUPAC/AOAC Harmonized Protocol for the proficiency testing of analytical laboratories (1). The test sample was a Reference Material provided by the IAEA-AQCS, a lake sediment containing the following radionuclides: K-40, Ra-226, Ac-228, Cs-137, Sr-90, Pu-(239+240). The results of the proficiency test were computed for the 28 participating laboratories using the z-score approach; the evaluation of the exercise is presented in the paper. The use of a z-score classification has been demonstrated to provide laboratories with a more objective means of assessing and demonstrating the reliability of the data they are producing. Analytical proficiency of the participating laboratories was found to be satisfactory in 57 to 100 percent of cases. (1) The international harmonized protocol for the proficiency testing of (chemical) analytical laboratories. Pure Appl. Chem., Vol. 65, No. 9, pp. 2123-2144, 1993, IUPAC. GB (Author) 3 refs
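    For reference, the z-score of the harmonized protocol compares each laboratory result with an assigned value and a target standard deviation chosen by the organizer; a minimal sketch is given below (the assigned values and σp used in this particular exercise are not reproduced here).

```python
def z_score(result, assigned_value, sigma_p):
    """ISO/IUPAC/AOAC harmonized-protocol z-score for one laboratory result."""
    return (result - assigned_value) / sigma_p

# Conventional interpretation: |z| <= 2 satisfactory, 2 < |z| < 3 questionable,
# |z| >= 3 unsatisfactory.
```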

  12. Estimation of the Thurstonian model for the 2-AC protocol

    DEFF Research Database (Denmark)

    Christensen, Rune Haubo Bojesen; Lee, Hye-Seong; Brockhoff, Per B.

    2012-01-01

    The 2-AC protocol is a 2-AFC protocol with a “no-difference” option and is technically identical to the paired preference test with a “no-preference” option. The Thurstonian model for the 2-AC protocol is parameterized by δ and a decision parameter τ, the estimates of which can be obtained by fairly simple well-known methods. In this paper we describe how standard errors of the parameters can be obtained and how exact power computations can be performed. We also show how the Thurstonian model for the 2-AC protocol is closely related to a statistical model known as a cumulative probit model. This relationship makes it possible to extract estimates and standard errors of δ and τ from general statistical software, and furthermore, it makes it possible to combine standard regression modelling with the Thurstonian model for the 2-AC protocol. A model for replicated 2-AC data is proposed using cumulative…
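    For orientation, one common Thurstonian parameterization of the 2-AC response probabilities has the cumulative probit structure shown below; the unit-variance scaling of the latent difference is an assumption made here for simplicity (other scalings only rescale δ and τ), with the sign convention that positive δ favours the second sample.

```latex
% Trinomial 2-AC probabilities for a latent difference distributed N(delta, 1)
% with decision thresholds at -tau and +tau (Phi = standard normal CDF):
P(\text{prefer first})  = \Phi(-\tau - \delta), \qquad
P(\text{no difference}) = \Phi(\tau - \delta) - \Phi(-\tau - \delta), \qquad
P(\text{prefer second}) = 1 - \Phi(\tau - \delta)
```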

  13. Combined positron emission tomography/computed tomography (PET/CT) for clinical oncology: technical aspects and acquisition protocols

    International Nuclear Information System (INIS)

    Beyer, T.

    2004-01-01

    Combined PET/CT imaging is a non-invasive means of reviewing both the anatomy and the molecular pathways of a patient during a quasi-simultaneous examination. Since the introduction of the prototype PET/CT in 1998, this imaging technology has developed rapidly. The incorporation of fast PET detector technology into PET/CT designs and the routine use of the CT transmission images for attenuation correction of the PET allow anato-metabolic whole-body examinations to be completed in less than 30 min. Thus, PET/CT imaging offers a logistical advantage to both the patient and the clinicians, since the two complementary exams - whenever clinically indicated - can be performed almost at the same time and a single integrated report can be created. Nevertheless, a number of pitfalls, arising primarily from the use of CT-based attenuation correction, have been identified and are being addressed through optimized acquisition protocols. It is fair to say that PET/CT has been integrated into the diagnostic imaging arena, and in many cases has led to a close collaboration between different, yet complementary, diagnostic and therapeutic medical disciplines. (orig.)

  14. Acute effects of high-intensity interval, resistance or combined exercise protocols on testosterone - cortisol responses in inactive overweight individuals.

    Science.gov (United States)

    Velasco-Orjuela, Gina P; Domínguez-Sanchéz, María A; Hernández, Enrique; Correa-Bautista, Jorge E; Triana-Reina, Héctor R; García-Hermoso, Antonio; Peña-Ibagon, Jhonatan C; Izquierdo, Mikel; Cadore, Eduardo L; Hackney, Anthony C; Ramírez-Vélez, Robinson

    2018-06-22

    The purpose of this study was to compare the hormonal responses to one session of high-intensity interval training (HIIT, 4 × 4 min intervals at 85-95% maximum heart rate [HRmax], interspersed with 4 min of recovery at 75-85% HRmax), resistance training (RT at 50-70% of one repetition maximum, 12-15 repetitions per set with 60 s of recovery) or both (HIIT+RT) in a cohort of physically inactive, overweight adults (aged 18-30 years). In this randomized, parallel-group clinical trial, fifty-one physically inactive (i.e., 6 months) men (23.6 ± 3.5 yr; 83.5 ± 7.8 kg; 28.0 ± 1.9 kg/m2) with abdominal obesity (waist circumference ≥90 cm) or body mass index ≥25 and ≤30 kg/m2 were randomized to the following 4 groups: high-intensity interval training (HIIT, n = 14), resistance training (RT, n = 12), combined high-intensity interval and resistance training (HIIT+RT, n = 13), or non-exercising control (CON, n = 12). Cortisol, total and free testosterone and the total-testosterone/cortisol ratio (T/C) were assessed in serum before (pre) and 1 min post-exercise for each protocol session. Decreases in cortisol levels were -57.08 (95%CI, -75.58 to -38.58; P = 0.001; η2 = 0.61) and -37.65 (95%CI, -54.36 to -20.93; P = 0.001; η2 = 0.51) in the HIIT and control group, respectively. Increases in T/C ratio were 0.022 (95%CI, 0.012 to 0.031; P = 0.001; η2 = 0.49) and 0.015 (95%CI, 0.004 to 0.025; P = 0.007; η2 = 0.29) in the HIIT and control group, respectively. Per-protocol analyses revealed a significant change in cortisol levels [interaction effect F(7.777), η2 = 0.33] and T/C ratio [interaction effect F(5.298), η2 = 0.25] between groups over time. Additionally, we showed that in both the intention-to-treat (ITT) and per-protocol analyses, HIIT+RT did not change serum cortisol, total or free testosterone. The present

  15. Optimizing a Treadmill Ramp Protocol to Evaluate Aerobic Capacity of Hemiparetic Poststroke Patients.

    Science.gov (United States)

    Bernardes, Wendell L; Montenegro, Rafael A; Monteiro, Walace D; de Almeida Freire, Raul; Massaferri, Renato; Farinatti, Paulo

    2018-03-01

    Bernardes, WL, Montenegro, RA, Monteiro, WD, de Almeida Freire, R, Massaferri, R, and Farinatti, P. Optimizing a treadmill ramp protocol to evaluate aerobic capacity of hemiparetic poststroke patients. J Strength Cond Res 32(3): 876-884, 2018. A correct assessment of cardiopulmonary capacity is important for aerobic training within motor rehabilitation of poststroke hemiparetic patients (PSHPs). However, specific cardiopulmonary exercise tests (CPET) for these patients are scarce. We proposed adaptations to a protocol originally developed for PSHPs by Ovando et al. (CPET1). We hypothesized that our adapted protocol (CPET2) would improve the original test by preventing early fatigue and increasing patients' peak performance. Eleven PSHPs (52 ± 14 years, 10 men) performed both protocols. CPET2 integrated changes in final speed (100-120% vs. 140% of maximal speed in the 10-m walking test), treadmill inclination (final inclination of 5 vs. 10%), and estimated test duration (10 vs. 8 minutes) to smooth the rate of workload increment of CPET1. Peak oxygen uptake (V̇O2peak) (20.3 ± 6.1 vs. 18.6 ± 5.0 ml·kg-1·min-1; p = 0.04), V̇O2 at the gas exchange transition (V̇O2-GET) (11.5 ± 2.9 vs. 9.8 ± 2.0 ml·kg-1·min-1; p = 0.04), and time to exhaustion (10 ± 3 vs. 6 ± 2 minutes; p < …) were higher in CPET2 than in CPET1. Slopes and intercepts of the regressions describing the relationships between V̇O2 vs. workload, heart rate vs. workload, and V̇O2 vs. heart rate were similar between CPETs. However, standard errors of estimates obtained for the regressions between heart rate vs. workload (3.0 ± 1.3 vs. 3.8 ± 1.0 b·min-1; p = 0.004) and V̇O2 vs. heart rate (6.0 ± 2.1 vs. 4.8 ± 2.4 ml·kg-1·min-1; p = 0.05) were lower in CPET2 than in CPET1. In conclusion, the present adaptations to Ovando's CPET protocol increased exercise tolerance of PSHPs, eliciting higher V̇O2peak

  16. 5 keys to business analytics program success

    CERN Document Server

    Boyer, John; Green, Brian; Harris, Tracy; Van De Vanter, Kay

    2012-01-01

    With business analytics becoming increasingly strategic to all types of organizations, and with many companies struggling to create a meaningful impact with this emerging technology, this work, based on the combined experience of 10 organizations that display excellence and expertise on the subject, shares best practices, discusses the management aspects and sociology that drive success, and uncovers the five key aspects behind the success of some of the top business analytics programs in the industry. Readers will learn about numerous topics, including how to create and manage a changing

  17. Energy Efficient Medium Access Control Protocol for Clustered Wireless Sensor Networks with Adaptive Cross-Layer Scheduling.

    Science.gov (United States)

    Sefuba, Maria; Walingo, Tom; Takawira, Fambirai

    2015-09-18

    This paper presents an Energy Efficient Medium Access Control (MAC) protocol for clustered wireless sensor networks that aims to improve energy efficiency and delay performance. The proposed protocol employs adaptive cross-layer intra-cluster scheduling and inter-cluster relay selection diversity. The scheduling is based on the available data packets and the remaining energy level of the source node (SN). This helps to minimize idle listening on nodes without data to transmit as well as reducing control packet overhead. The relay selection diversity is carried out between clusters, by the cluster head (CH), and the base station (BS). The diversity helps to improve network reliability and prolong the network lifetime. Relay selection is determined based on the communication distance, the remaining energy and the channel quality indicator (CQI) of the relay cluster head (RCH). An analytical framework for energy consumption and transmission delay for the proposed MAC protocol is presented in this work. The performance of the proposed MAC protocol is evaluated in terms of transmission delay, energy consumption, and network lifetime. The results obtained indicate that the proposed MAC protocol provides improved performance compared with traditional cluster-based MAC protocols.

  18. Hyphenated analytical techniques for materials characterisation

    International Nuclear Information System (INIS)

    Armstrong, Gordon; Kailas, Lekshmi

    2017-01-01

    This topical review will provide a survey of the current state of the art in ‘hyphenated’ techniques for characterisation of bulk materials, surface, and interfaces, whereby two or more analytical methods investigating different properties are applied simultaneously to the same sample to better characterise the sample than can be achieved by conducting separate analyses in series using different instruments. It is intended for final year undergraduates and recent graduates, who may have some background knowledge of standard analytical techniques, but are not familiar with ‘hyphenated’ techniques or hybrid instrumentation. The review will begin by defining ‘complementary’, ‘hybrid’ and ‘hyphenated’ techniques, as there is not a broad consensus among analytical scientists as to what each term means. The motivating factors driving increased development of hyphenated analytical methods will also be discussed. This introduction will conclude with a brief discussion of gas chromatography-mass spectroscopy and energy dispersive x-ray analysis in electron microscopy as two examples, in the context that combining complementary techniques for chemical analysis were among the earliest examples of hyphenated characterisation methods. The emphasis of the main review will be on techniques which are sufficiently well-established that the instrumentation is commercially available, to examine physical properties including physical, mechanical, electrical and thermal, in addition to variations in composition, rather than methods solely to identify and quantify chemical species. Therefore, the proposed topical review will address three broad categories of techniques that the reader may expect to encounter in a well-equipped materials characterisation laboratory: microscopy based techniques, scanning probe-based techniques, and thermal analysis based techniques. Examples drawn from recent literature, and a concluding case study, will be used to explain the

  19. Hyphenated analytical techniques for materials characterisation

    Science.gov (United States)

    Armstrong, Gordon; Kailas, Lekshmi

    2017-09-01

    This topical review will provide a survey of the current state of the art in ‘hyphenated’ techniques for characterisation of bulk materials, surface, and interfaces, whereby two or more analytical methods investigating different properties are applied simultaneously to the same sample to better characterise the sample than can be achieved by conducting separate analyses in series using different instruments. It is intended for final year undergraduates and recent graduates, who may have some background knowledge of standard analytical techniques, but are not familiar with ‘hyphenated’ techniques or hybrid instrumentation. The review will begin by defining ‘complementary’, ‘hybrid’ and ‘hyphenated’ techniques, as there is not a broad consensus among analytical scientists as to what each term means. The motivating factors driving increased development of hyphenated analytical methods will also be discussed. This introduction will conclude with a brief discussion of gas chromatography-mass spectroscopy and energy dispersive x-ray analysis in electron microscopy as two examples, in the context that combining complementary techniques for chemical analysis were among the earliest examples of hyphenated characterisation methods. The emphasis of the main review will be on techniques which are sufficiently well-established that the instrumentation is commercially available, to examine physical properties including physical, mechanical, electrical and thermal, in addition to variations in composition, rather than methods solely to identify and quantify chemical species. Therefore, the proposed topical review will address three broad categories of techniques that the reader may expect to encounter in a well-equipped materials characterisation laboratory: microscopy based techniques, scanning probe-based techniques, and thermal analysis based techniques. Examples drawn from recent literature, and a concluding case study, will be used to explain the

  20. Improving Wind Turbine Drivetrain Reliability Using a Combined Experimental, Computational, and Analytical Approach

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Y.; van Dam, J.; Bergua, R.; Jove, J.; Campbell, J.

    2015-03-01

    Nontorque loads induced by the wind turbine rotor overhang weight and aerodynamic forces can greatly affect drivetrain loads and responses. If not addressed properly, these loads can result in a decrease in gearbox component life. This work uses analytical modeling, computational modeling, and experimental data to evaluate a unique drivetrain design that minimizes the effects of nontorque loads on gearbox reliability: the Pure Torque(R) drivetrain developed by Alstom. The drivetrain has a hub-support configuration that transmits nontorque loads directly into the tower rather than through the gearbox as in other design approaches. An analytical model of Alstom's Pure Torque drivetrain provides insight into the relationships among turbine component weights, aerodynamic forces, and the resulting drivetrain loads. Main shaft bending loads are orders of magnitude lower than the rated torque and are hardly affected by wind conditions and turbine operations.

  1. Evaluation of forensic DNA mixture evidence: protocol for evaluation, interpretation, and statistical calculations using the combined probability of inclusion.

    Science.gov (United States)

    Bieber, Frederick R; Buckleton, John S; Budowle, Bruce; Butler, John M; Coble, Michael D

    2016-08-31

    The evaluation and interpretation of forensic DNA mixture evidence faces greater interpretational challenges due to increasingly complex mixture evidence. Such challenges include: casework involving low-quantity or degraded evidence leading to allele and locus dropout; allele sharing of contributors leading to allele stacking; and differentiation of PCR stutter artifacts from true alleles. There is variation in the statistical approaches used to evaluate the strength of the evidence when inclusion of a specific known individual(s) is determined, and the approaches used must be supportable. There are concerns that methods utilized for interpretation of complex forensic DNA mixtures may not be implemented properly in some casework. Similar questions are being raised in a number of U.S. jurisdictions, leading to some confusion about mixture interpretation for current and previous casework. Key elements necessary for the interpretation and statistical evaluation of forensic DNA mixtures are described. The most common method for statistical evaluation of DNA mixtures in many parts of the world, including the USA, is the Combined Probability of Inclusion/Exclusion (CPI/CPE); exposition and elucidation of this method and a protocol for its use are the focus of this article. Formulae and other supporting materials are provided. Guidance and details of a DNA mixture interpretation protocol are provided for application of the CPI/CPE method in the analysis of more complex forensic DNA mixtures. This description, in turn, should help reduce the variability of interpretation with application of this methodology and thereby improve the quality of DNA mixture interpretation throughout the forensic community.
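    The core CPI/CPE arithmetic is compact: at each usable locus the population frequencies of all alleles observed in the mixture are summed and squared, and the per-locus values are multiplied together; CPE = 1 − CPI. The allele frequencies below are hypothetical, and the article's protocol adds conditions (for example, excluding loci with possible dropout) that this sketch does not enforce.

```python
from math import prod

def combined_probability_of_inclusion(locus_allele_freqs):
    """CPI: per locus, square the sum of the frequencies of all alleles
    observed in the mixture, then multiply across loci. CPE = 1 - CPI."""
    cpi = prod(sum(freqs) ** 2 for freqs in locus_allele_freqs.values())
    return cpi, 1.0 - cpi

# Illustrative (hypothetical) allele frequencies at two loci
mixture = {"D8S1179": [0.10, 0.21, 0.30], "TH01": [0.19, 0.24]}
cpi, cpe = combined_probability_of_inclusion(mixture)
```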

  2. Acute Effect of Different Combined Stretching Methods on Acceleration and Speed in Soccer Players

    Directory of Open Access Journals (Sweden)

    Amiri-Khorasani Mohammadtaghi

    2016-04-01

    Full Text Available The purpose of this study was to investigate the acute effect of different stretching methods, used during a warm-up, on the acceleration and speed of soccer players. The acceleration performance of 20 collegiate soccer players (body height: 177.25 ± 5.31 cm; body mass: 65.10 ± 5.62 kg; age: 16.85 ± 0.87 years; BMI: 20.70 ± 5.54; experience: 8.46 ± 1.49 years) was evaluated after different warm-up procedures, using 10 and 20 m tests. Subjects performed five types of warm-up: static, dynamic, combined static + dynamic, combined dynamic + static, and no-stretching. Subjects were divided into five groups. Each group performed five different warm-up protocols on five non-consecutive days. The warm-up protocol used for each group was randomly assigned. The protocols consisted of 4 min of jogging, a 1 min stretching program (except for the no-stretching protocol), and 2 min of rest, followed by the 10 and 20 m sprint tests on the same day. The current findings showed significant differences in the 10 and 20 m tests after dynamic stretching compared with the static, combined, and no-stretching protocols. There were also significant differences between the combined stretching protocols compared with the static and no-stretching protocols. We concluded that soccer players performed better with respect to acceleration and speed after dynamic and combined stretching, as they were able to produce more force for a faster execution.

  3. Diverging Trade Strategies in Latin America: An Analytical Framework

    OpenAIRE

    Aggarwal, Vinod K.; Espach, Ralph H.

    2003-01-01

    Although there is increasing divergence among the trade policies of various Latin American nations, overall the last twenty years have seen a dramatic shift away from protectionism towards liberalization. Focusing on case studies of four Latin American nations — Brazil, Mexico, Chile and Argentina — the authors use an analytical framework to explain the rationales behind divergent policies. The analytical approach used considers the combination of economic, political and strategic objectives ...

  4. Rational Selection, Criticality Assessment, and Tiering of Quality Attributes and Test Methods for Analytical Similarity Evaluation of Biosimilars.

    Science.gov (United States)

    Vandekerckhove, Kristof; Seidl, Andreas; Gutka, Hiten; Kumar, Manish; Gratzl, Gyöngyi; Keire, David; Coffey, Todd; Kuehne, Henriette

    2018-05-10

    Leading regulatory agencies recommend biosimilar assessment to proceed in a stepwise fashion, starting with a detailed analytical comparison of the structural and functional properties of the proposed biosimilar and reference product. The degree of analytical similarity determines the degree of residual uncertainty that must be addressed through downstream in vivo studies. Substantive evidence of similarity from comprehensive analytical testing may justify a targeted clinical development plan, and thus enable a shorter path to licensing. The importance of a careful design of the analytical similarity study program therefore should not be underestimated. Designing a state-of-the-art analytical similarity study meeting current regulatory requirements in regions such as the USA and EU requires a methodical approach, consisting of specific steps that far precede the work on the actual analytical study protocol. This white paper discusses scientific and methodological considerations on the process of attribute and test method selection, criticality assessment, and subsequent assignment of analytical measures to US FDA's three tiers of analytical similarity assessment. Case examples of selection of critical quality attributes and analytical methods for similarity exercises are provided to illustrate the practical implementation of the principles discussed.

  5. A novel protocol for dispatcher assisted CPR improves CPR quality and motivation among rescuers-A randomized controlled simulation study.

    Science.gov (United States)

    Rasmussen, Stinne Eika; Nebsbjerg, Mette Amalie; Krogh, Lise Qvirin; Bjørnshave, Katrine; Krogh, Kristian; Povlsen, Jonas Agerlund; Riddervold, Ingunn Skogstad; Grøfte, Thorbjørn; Kirkegaard, Hans; Løfgren, Bo

    2017-01-01

    Emergency dispatchers use protocols to instruct bystanders in cardiopulmonary resuscitation (CPR). Studies changing one element in the dispatcher's protocol report improved CPR quality. Whether several changes interact is unknown, and the effect of combining multiple changes previously reported to improve CPR quality into one protocol remains to be investigated. We hypothesize that a novel dispatch protocol, combining multiple beneficial elements, improves CPR quality compared with a standard protocol. A novel dispatch protocol was designed including wording on chest compressions, use of a metronome, regular encouragement and a 10-s rest each minute. In a simulated cardiac arrest scenario, laypersons were randomized to perform single-rescuer CPR guided by the novel or the standard protocol. The primary outcome was a composite endpoint of time to first compression, hand position, compression depth and rate, and hands-off time (maximum score: 22 points). Afterwards participants answered a questionnaire evaluating the dispatcher assistance. The novel protocol (n=61) improved the CPR quality score compared with the standard protocol (n=64) (mean (SD): 18.6 (1.4) points vs. 17.5 (1.7) points, p < …). A novel bundle-of-care protocol improved the CPR quality score and motivation among rescuers. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  6. Enabling Analytics on Sensitive Medical Data with Secure Multi-Party Computation.

    Science.gov (United States)

    Veeningen, Meilof; Chatterjea, Supriyo; Horváth, Anna Zsófia; Spindler, Gerald; Boersma, Eric; van der Spek, Peter; van der Galiën, Onno; Gutteling, Job; Kraaij, Wessel; Veugen, Thijs

    2018-01-01

    While there is a clear need to apply data analytics in the healthcare sector, this is often difficult because it requires combining sensitive data from multiple data sources. In this paper, we show how the cryptographic technique of secure multi-party computation can enable such data analytics by performing analytics without the need to share the underlying data. We discuss the issue of compliance to European privacy legislation; report on three pilots bringing these techniques closer to practice; and discuss the main challenges ahead to make fully privacy-preserving data analytics in the medical sector commonplace.
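    As a toy illustration of the underlying principle only (additive secret sharing for a joint sum), and not of the pilots' actual framework or its legal safeguards, three institutions can compute a total without revealing their individual inputs; the modulus below is an arbitrary assumption.

```python
import random

PRIME = 2_147_483_647  # field modulus for additive secret sharing (assumed)

def share(value, n_parties):
    """Split an integer into n additive shares that sum to value mod PRIME."""
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def secure_sum(per_party_values):
    """Each party shares its private value; parties locally add the shares they
    hold and only the recombined total is revealed."""
    n = len(per_party_values)
    all_shares = [share(v, n) for v in per_party_values]
    partials = [sum(col) % PRIME for col in zip(*all_shares)]  # one per party
    return sum(partials) % PRIME

# Three hospitals jointly compute a total count without exposing their own
assert secure_sum([120, 87, 301]) == 508
```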

  7. A Group Neighborhood Average Clock Synchronization Protocol for Wireless Sensor Networks

    Science.gov (United States)

    Lin, Lin; Ma, Shiwei; Ma, Maode

    2014-01-01

    Clock synchronization is a very important issue for the applications of wireless sensor networks. The sensors need to keep a strict clock so that users can know exactly what happens in the monitored area at the same time. This paper proposes a novel internal distributed clock synchronization solution using group neighborhood averaging. Each sensor node collects the offset and skew rate of its neighbors. Group averages of the offset and skew rate values are calculated instead of using the conventional point-to-point averaging method. The sensor node then returns the compensated values back to the neighbors. The propagation delay is considered and compensated for. An analytical treatment of offset and skew compensation is presented. Simulation results validate the effectiveness of the protocol and reveal that the protocol allows sensor networks to quickly establish a consensus clock and maintain a small deviation from the consensus clock. PMID:25120163
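    A highly simplified sketch of one averaging round is given below; the exact quantities exchanged and the delay handling in the paper's protocol may differ, so treat the function as an assumption-laden illustration rather than the published algorithm.

```python
def group_average_update(my_offset, my_skew, neighbor_offsets, neighbor_skews,
                         propagation_delays):
    """One synchronization round: average offset and skew over the neighborhood
    (including self), compensating each reported offset by its estimated
    one-way propagation delay."""
    offsets = [my_offset] + [o - d for o, d in zip(neighbor_offsets,
                                                   propagation_delays)]
    skews = [my_skew] + list(neighbor_skews)
    return sum(offsets) / len(offsets), sum(skews) / len(skews)

# A node with three neighbors nudges its clock toward the group consensus
new_offset, new_skew = group_average_update(
    0.004, 1.00002, [0.006, 0.001, 0.003], [1.00001, 0.99998, 1.00003],
    [0.0005, 0.0004, 0.0006])
```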

  8. Inter-comparison of NIOSH and IMPROVE protocols for OC and EC determination: implications for inter-protocol data conversion

    Science.gov (United States)

    Wu, Cheng; Huang, X. H. Hilda; Ng, Wai Man; Griffith, Stephen M.; Zhen Yu, Jian

    2016-09-01

    Organic carbon (OC) and elemental carbon (EC) are operationally defined by analytical methods. As a result, OC and EC measurements are protocol dependent, leading to uncertainties in their quantification. In this study, more than 1300 Hong Kong samples were analyzed using both National Institute for Occupational Safety and Health (NIOSH) thermal optical transmittance (TOT) and Interagency Monitoring of Protected Visual Environment (IMPROVE) thermal optical reflectance (TOR) protocols to explore the cause of EC disagreement between the two protocols. EC discrepancy mainly (83 %) arises from a difference in peak inert mode temperature, which determines the allocation of OC4NSH, while the rest (17 %) is attributed to a difference in the optical method (transmittance vs. reflectance) applied for the charring correction. Evidence shows that the magnitude of the EC discrepancy is positively correlated with the intensity of the biomass burning signal, whereby biomass burning increases the fraction of OC4NSH and widens the disagreement in the inter-protocol EC determination. It is also found that the EC discrepancy is positively correlated with the abundance of metal oxide in the samples. Two approaches (M1 and M2) that translate NIOSH TOT OC and EC data into IMPROVE TOR OC and EC data are proposed. M1 uses direct relationship between ECNSH_TOT and ECIMP_TOR for reconstruction: M1 : ECIMP_TOR = a × ECNSH_TOT + b; while M2 deconstructs ECIMP_TOR into several terms based on analysis principles and applies regression only on the unknown terms: M2 : ECIMP_TOR = AECNSH + OC4NSH - (a × PCNSH_TOR + b), where AECNSH, apparent EC by the NIOSH protocol, is the carbon that evolves in the He-O2 analysis stage, OC4NSH is the carbon that evolves at the fourth temperature step of the pure helium analysis stage of NIOSH, and PCNSH_TOR is the pyrolyzed carbon as determined by the NIOSH protocol. The implementation of M1 to all urban site data (without considering seasonal specificity
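    The two conversion formulas quoted above translate directly into code; the regression coefficients a and b must be fitted from collocated NIOSH/IMPROVE measurements, and the paper's fitted values are not reproduced here.

```python
def ec_improve_from_niosh_m1(ec_nsh_tot, a, b):
    """M1: direct regression between the two protocols' EC values,
    EC_IMP_TOR = a * EC_NSH_TOT + b."""
    return a * ec_nsh_tot + b

def ec_improve_from_niosh_m2(aec_nsh, oc4_nsh, pc_nsh_tor, a, b):
    """M2: reconstruct IMPROVE-TOR EC from NIOSH terms; only the pyrolyzed-
    carbon correction (a, b) is fitted,
    EC_IMP_TOR = AEC_NSH + OC4_NSH - (a * PC_NSH_TOR + b)."""
    return aec_nsh + oc4_nsh - (a * pc_nsh_tor + b)
```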

  9. Superhydrophobic analyte concentration utilizing colloid-pillar array SERS substrates.

    Science.gov (United States)

    Wallace, Ryan A; Charlton, Jennifer J; Kirchner, Teresa B; Lavrik, Nickolay V; Datskos, Panos G; Sepaniak, Michael J

    2014-12-02

    The ability to detect a few molecules present in a large sample is of great interest for the detection of trace components in both medicinal and environmental samples. Surface enhanced Raman spectroscopy (SERS) is a technique that can be utilized to detect molecules at very low absolute numbers. However, detection at trace concentration levels in real samples requires properly designed delivery and detection systems. The following work involves superhydrophobic surfaces that have as a framework deterministic or stochastic silicon pillar arrays formed by lithographic or metal-dewetting protocols, respectively. In order to generate the necessary plasmonic substrate for SERS detection, a simple and flow-stable Ag colloid was added to the functionalized pillar array system via soaking. Native pillars and pillars with hydrophobic modification are used. The pillars provide a means to concentrate analyte via superhydrophobic droplet evaporation effects. A ≥100-fold concentration of analyte was estimated, with a limit of detection of 2.9 × 10-12 M for mitoxantrone dihydrochloride. Additionally, analytes were delivered to the surface via a multiplex approach in order to demonstrate the ability to control droplet size and placement for scaled-up use in real-world applications. Finally, a concentration process involving transport and sequestration based on surface-treatment-selective wicking is demonstrated.

  10. Multispectral analytical image fusion

    International Nuclear Information System (INIS)

    Stubbings, T.C.

    2000-04-01

    With new and advanced analytical imaging methods emerging, the limits of physical analysis capabilities, and furthermore of data acquisition quantities, are constantly pushed, placing high demands on the field of scientific data processing and visualisation. Physical analysis methods like Secondary Ion Mass Spectrometry (SIMS) or Auger Electron Spectroscopy (AES) and others are capable of delivering high-resolution multispectral two-dimensional and three-dimensional image data; usually this multispectral data is available in the form of n separate image files, each showing one element or another singular aspect of the sample. There is a great need for digital image processing methods enabling the analytical scientist, routinely confronted with such amounts of data, to gain rapid insight into the composition of the sample examined, to filter the relevant data and to integrate the information of numerous separate multispectral images to get the complete picture. Sophisticated image processing methods like classification and fusion provide possible approaches to this challenge. Classification is a treatment by multivariate statistical means in order to extract analytical information. Image fusion, on the other hand, denotes a process where images obtained from various sensors or at different moments in time are combined to provide a more complete picture of a scene or object under investigation. Both techniques are important for the task of information extraction and integration, and often one technique depends on the other. Therefore, the overall aim of this thesis is to evaluate the possibilities of both techniques regarding the task of analytical image processing and to find solutions for the integration and condensation of multispectral analytical image data in order to facilitate the interpretation of the enormous amounts of data routinely acquired by modern physical analysis instruments. (author)

  11. BAVP: Blockchain-Based Access Verification Protocol in LEO Constellation Using IBE Keys

    OpenAIRE

    Wei, Songjie; Li, Shuai; Liu, Peilong; Liu, Meilin

    2018-01-01

    LEO constellation has received intensive research attention in the field of satellite communication. The existing centralized authentication protocols traditionally used for MEO/GEO satellite networks cannot accommodate LEO satellites with frequent user connection switching. This paper proposes a fast and efficient access verification protocol named BAVP by combining identity-based encryption and blockchain technology. Two different key management schemes with IBE and blockchain, respectively...

  12. Radioimmunoassay. A revolution in the analytic procedure

    Energy Technology Data Exchange (ETDEWEB)

    Strecker, H; Eckert, H G [Farbwerke Hoechst A.G., Frankfurt am Main (Germany, F.R.). Radiochemisches Lab.

    1978-04-01

    Radioimmunoassay is an analytic method which combines the sensitivity of radioactive measurement and the specificity of the antigen-antibody reaction. Substances down to a concentration of a few picograms per ml of serum (or biological material) can be measured in the presence of a millionfold excess of otherwise interfering substances. The method is easy to carry out (test tube chemistry). The main field of application at the moment is endocrinology; further possibilities of application are in pharmaceutical research, environmental protection, forensic medicine, and for general analytic purposes. Radioactive sources are used only in vitro in the nanocurie range, i.e. radiation exposure is negligible.

  13. service line analytics in the new era.

    Science.gov (United States)

    Spence, Jay; Seargeant, Dan

    2015-08-01

    To succeed under the value-based business model, hospitals and health systems require effective service line analytics that combine inpatient and outpatient data and that incorporate quality metrics for evaluating clinical operations. When developing a framework for collection, analysis, and dissemination of service line data, healthcare organizations should focus on five key aspects of effective service line analytics: Updated service line definitions. Ability to analyze and trend service line net patient revenues by payment source. Access to accurate service line cost information across multiple dimensions with drill-through capabilities. Ability to redesign key reports based on changing requirements. Clear assignment of accountability.

  14. Development of Taiwanese government’s climate policy after the Kyoto protocol: Applying policy network theory as an analytical framework

    International Nuclear Information System (INIS)

    Shyu, Chian-Woei

    2014-01-01

    Given its limited involvement in and recognition by international organizations, Taiwan is not presently a signatory to the United Nations Framework Convention on Climate Change (UNFCCC) or the Kyoto Protocol. The objective of this study is to analyze how and the extent to which changes in an exogenous factor, namely the Kyoto Protocol and Post-Kyoto climate negotiations, affect and ultimately lead to the formulation of and changes in the Taiwanese government's climate policy. This study applies policy network theory to examine the development of and changes in the Taiwanese government's climate policy. The results demonstrate that international climate agreements and negotiations play a key role in the development of, changes to, and transformation of Taiwan's climate policy. Scarce evidence was found in this study to demonstrate that domestic or internal factors affect climate change policy. Despite its lack of participation in the UNFCCC and the Kyoto Protocol, Taiwan has adopted national climate change strategies, action plans, and programs to reduce greenhouse gas emissions. However, these climate policies and measures are fairly passive and aim to only conform to the minimal requirements for developing countries under international climate agreements and negotiations. This process results in inconsistent and variable climate policies, targets, and regulations. - Highlights: • Taiwan is not a signatory to the UNFCCC or its Kyoto Protocol. • International climate agreements strongly affected Taiwan's climate policy. • Little evidence was found that domestic factors affect Taiwan's climate policy. • New climate policies, regulations, and laws are formulated and implemented. • Climate policies, targets, and regulations change frequently and are inconsistent

  15. Code Generation for Protocols from CPN models Annotated with Pragmatics

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars Michael; Kindler, Ekkart

    … modelling languages, MDE further has the advantage that models are amenable to model checking which allows key behavioural properties of the software design to be verified. The combination of formally verified models and automated code generation contributes to a high degree of assurance that the resulting software implementation satisfies the properties verified for the model. Coloured Petri Nets (CPNs) have been widely used to model and verify protocol software, but limited work exists on using CPN models of protocol software as a basis for automated code generation. In this report, we present an approach for generating protocol software from a restricted class of CPN models. The class of CPN models considered aims at being descriptive in that the models are intended to be helpful in understanding and conveying the operation of the protocol. At the same time, a descriptive model is close to a verifiable version…

  16. On accuracy problems for semi-analytical sensitivity analyses

    DEFF Research Database (Denmark)

    Pedersen, P.; Cheng, G.; Rasmussen, John

    1989-01-01

    The semi-analytical method of sensitivity analysis combines ease of implementation with computational efficiency. A major drawback of this method, however, is that severe accuracy problems have recently been reported. A complete error analysis for a beam problem with changing length is carried out… pseudo loads in order to obtain general load equilibrium with rigid body motions. Such a method would be readily applicable for any element type, whether analytical expressions for the element stiffnesses are available or not. This topic is postponed for a future study.
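    For readers new to the method, the generic structure (not specific to the cited beam problem) is that the discrete equilibrium equations K u = P are differentiated analytically while the stiffness derivative is replaced by a forward finite difference, which is where the reported accuracy problems for shape-type design variables originate:

```latex
% Generic semi-analytical design sensitivity for static FE equilibrium K u = P;
% the pseudo-load right-hand side is formed analytically, but dK/da is
% approximated by a forward finite difference.
\mathbf{K}\,\frac{\partial \mathbf{u}}{\partial a}
  = \frac{\partial \mathbf{P}}{\partial a}
  - \frac{\partial \mathbf{K}}{\partial a}\,\mathbf{u},
\qquad
\frac{\partial \mathbf{K}}{\partial a}
  \approx \frac{\mathbf{K}(a+\Delta a)-\mathbf{K}(a)}{\Delta a}
```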

  17. Protocol Implementation Generator

    DEFF Research Database (Denmark)

    Carvalho Quaresma, Jose Nuno; Probst, Christian W.

    2010-01-01

    Users expect communication systems to guarantee, amongst others, privacy and integrity of their data. These can be ensured by using well-established protocols; the best protocol, however, is useless if not all parties involved in a communication have a correct implementation of the protocol and the necessary tools. In this paper, we present the Protocol Implementation Generator (PiG), a framework that can be used to add protocol generation to protocol negotiation, or to easily share and implement new protocols throughout a network. PiG enables the sharing, verification, and translation … Generator framework based on the LySatool and a translator from the LySa language into C or Java.

  18. Evaluation of a reduced centrifugation time and higher centrifugal force on various general chemistry and immunochemistry analytes in plasma and serum.

    Science.gov (United States)

    Møller, Mette F; Søndergaard, Tove R; Kristensen, Helle T; Münster, Anna-Marie B

    2017-09-01

    Background Centrifugation of blood samples is an essential preanalytical step in the clinical biochemistry laboratory. Centrifugation settings are often altered to optimize sample flow and turnaround time. Few studies have addressed the effect of altering centrifugation settings on analytical quality, and almost all studies have been done using collection tubes with gel separator. Methods In this study, we compared a centrifugation time of 5 min at 3000 ×  g to a standard protocol of 10 min at 2200 ×  g. Nine selected general chemistry and immunochemistry analytes and interference indices were studied in lithium heparin plasma tubes and serum tubes without gel separator. Results were evaluated using mean bias, difference plots and coefficient of variation, compared with maximum allowable bias and coefficient of variation used in laboratory routine quality control. Results For all analytes except lactate dehydrogenase, the results were within the predefined acceptance criteria, indicating that the analytical quality was not compromised. Lactate dehydrogenase showed higher values after centrifugation for 5 min at 3000 ×  g, mean bias was 6.3 ± 2.2% and the coefficient of variation was 5%. Conclusions We found that a centrifugation protocol of 5 min at 3000 ×  g can be used for the general chemistry and immunochemistry analytes studied, with the possible exception of lactate dehydrogenase, which requires further assessment.
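    The acceptance check described, mean bias and coefficient of variation compared against the laboratory's allowable limits, can be sketched as follows; the LDH values are hypothetical and the allowable limits are laboratory-specific assumptions.

```python
import statistics

def mean_relative_bias(test, reference):
    """Mean relative bias (%) of the alternative protocol vs. the reference,
    from paired results on the same samples."""
    return statistics.mean(100.0 * (t - r) / r for t, r in zip(test, reference))

def coefficient_of_variation(replicates):
    """CV (%) of replicate results obtained with a single protocol."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

# Hypothetical paired LDH results (U/L): 5 min at 3000 g vs. 10 min at 2200 g
bias = mean_relative_bias([212, 198, 240, 225], [200, 186, 226, 214])
cv = coefficient_of_variation([212, 198, 240, 225])
# Compare bias and cv against the laboratory's maximum allowable limits.
```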

  19. The Spectrum of Learning Analytics

    Directory of Open Access Journals (Sweden)

    Gerd Kortemeyer

    2017-06-01

    Full Text Available “Learning Analytics” became a buzzword during the hype surrounding the advent of “big data” MOOCs; however, the concept has been around for over two decades. When the first online courses became available, it was used as a tool to increase student success in particular courses, frequently combined with the hope of conducting educational research. In recent years, the same term started to be used at the institutional level to increase retention and decrease time-to-degree. These two applications, within particular courses on the one hand and at the institutional level on the other, are at the two extremes of the spectrum of Learning Analytics – and they frequently appear to be worlds apart. The survey describes affordances, theories and approaches in these two categories.

  20. Utilization of Neurophysiological Protocols to Characterize Soldier Response to Irritant Gases. Phase 1.

    Science.gov (United States)

    1990-02-15

    DAMM7-89-C-9136. Northeast Research Institute, Inc., Suite A-100, 309 Farmington Avenue. … is no widely accepted methodology or protocol for the assessment of human toxicity induced by exposure to irritant gases. Most procedures used by the … employing the appropriate analytical methodologies necessary to more precisely characterize the complex mixture of low-boiling volatiles, aerosols, and

  1. A round-robin gamma stereotactic radiosurgery dosimetry interinstitution comparison of calibration protocols

    Energy Technology Data Exchange (ETDEWEB)

    Drzymala, R. E., E-mail: drzymala@wustl.edu [Department of Radiation Oncology, Washington University, St. Louis, Missouri 63110 (United States); Alvarez, P. E. [Imaging and Radiation Oncology Core Houston, UT MD Anderson Cancer Center, Houston, Texas 77030 (United States); Bednarz, G. [Radiation Oncology Department, University of Pittsburgh Medical Center, Pittsburgh, Pennsylvania 15232 (United States); Bourland, J. D. [Department of Radiation Oncology, Wake Forest University, Winston-Salem, North Carolina 27157 (United States); DeWerd, L. A. [Department of Medical Physics, University of Wisconsin-Madison, Madison, Wisconsin 53705 (United States); Ma, L. [Department of Radiation Oncology, University California San Francisco, San Francisco, California 94143 (United States); Meltsner, S. G. [Department of Radiation Oncology, Duke University Medical Center, Durham, North Carolina 27710 (United States); Neyman, G. [Department of Radiation Oncology, The Cleveland Clinic Foundation, Cleveland, Ohio 44195 (United States); Novotny, J. [Medical Physics Department, Hospital Na Homolce, Prague 15030 (Czech Republic); Petti, P. L. [Gamma Knife Center, Washington Hospital Healthcare System, Fremont, California 94538 (United States); Rivard, M. J. [Department of Radiation Oncology, Tufts University School of Medicine, Boston, Massachusetts 02111 (United States); Shiu, A. S. [Department of Radiation Oncology, University of Southern California, Los Angeles, California 90033 (United States); Goetsch, S. J. [San Diego Medical Physics, Inc., La Jolla, California 92037 (United States)

    2015-11-15

    Purpose: Absorbed dose calibration for gamma stereotactic radiosurgery is challenging due to the unique geometric conditions, dosimetry characteristics, and nonstandard field size of these devices. Members of the American Association of Physicists in Medicine (AAPM) Task Group 178 on Gamma Stereotactic Radiosurgery Dosimetry and Quality Assurance have participated in a round-robin exchange of calibrated measurement instrumentation and phantoms exploring two approved and two proposed calibration protocols or formalisms on ten gamma radiosurgery units. The objectives of this study were to benchmark and compare new formalisms to existing calibration methods, while maintaining traceability to U.S. primary dosimetry calibration laboratory standards. Methods: Nine institutions made measurements using ten gamma stereotactic radiosurgery units in three different 160 mm diameter spherical phantoms [acrylonitrile butadiene styrene (ABS) plastic, Solid Water, and liquid water] and in air using a positioning jig. Two calibrated miniature ionization chambers and one calibrated electrometer were circulated for all measurements. Reference dose-rates at the phantom center were determined using the well-established AAPM TG-21 or TG-51 dose calibration protocols and using two proposed dose calibration protocols/formalisms: an in-air protocol and a formalism proposed by the International Atomic Energy Agency (IAEA) working group for small and nonstandard radiation fields. Each institution’s results were normalized to the dose-rate determined at that institution using the TG-21 protocol in the ABS phantom. Results: Percentages of dose-rates within 1.5% of the reference dose-rate (TG-21 + ABS phantom) for the eight chamber-protocol-phantom combinations were the following: 88% for TG-21, 70% for TG-51, 93% for the new IAEA nonstandard-field formalism, and 65% for the new in-air protocol. Averages and standard deviations for dose-rates over all measurements relative to the TG-21 + ABS

  2. A Composed Protocol of Quantum Identity Authentication Plus Quantum Key Distribution Based on Squeezed States

    International Nuclear Information System (INIS)

    Zhang Sheng; Wang Jian; Tang Chaojing; Zhang Quan

    2011-01-01

    In practice, a quantum cryptography protocol usually operates alongside other cryptographic systems, such as an authentication system. However, few protocols have been proposed on how to combine two or more quantum protocols. To fill this gap, we propose a composed quantum protocol, containing both quantum identity authentication and quantum key distribution, using squeezed states. Hence, not only can the identity be verified, but a new private key can also be generated by our new protocol. We also analyze the security under an optimal attack, as well as the efficiency, which is defined by the threshold of the tolerable error rate, using the Gaussian error function. (general)

  3. Cheating and Anti-Cheating in Gossip-Based Protocol: An Experimental Investigation

    Science.gov (United States)

    Xiao, Xin; Shi, Yuanchun; Tang, Yun; Zhang, Nan

    During recent years, there has been a rapid growth in the deployment of gossip-based protocols in many multicast applications. In a typical gossip-based protocol, each node acts in the dual roles of receiver and sender, independently exchanging data with its neighbors to facilitate scalability and resilience. However, most previous work in this literature has seldom considered cheating by end users, which is important given that mutual cooperation inherently determines overall system performance. In this paper, we investigate dishonest behaviors in decentralized gossip-based protocols through an extensive experimental study. Our contributions are twofold. In the first part, a study of cheating, we analytically discuss two typical cheating strategies, namely intentionally inflating subscription requests and untruthfully calculating the forwarding probability, and further evaluate their negative impacts. The results indicate that more attention should be paid to defending against cheating behaviors in gossip-based protocols. In the second part, a study of anti-cheating, we propose a receiver-driven measurement mechanism, which evaluates individual forwarding traffic from the perspective of receivers and thus identifies cheating nodes with a high incoming/outgoing ratio. Furthermore, we extend our mechanism by introducing a reliability factor to further improve its accuracy. Experiments under various conditions show that it performs quite well in cases of serious cheating and achieves considerable performance in other cases.
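
    The receiver-driven check described above can be illustrated with a short sketch. The counter fields and the threshold below are illustrative assumptions, not values taken from the paper; the idea is simply to flag neighbors whose incoming/outgoing ratio is abnormally high.

```python
# Illustrative sketch of a receiver-driven cheat check for a gossip overlay.
# The counter fields and the threshold are assumptions, not values from the paper.
from dataclasses import dataclass

@dataclass
class TrafficRecord:
    received_from: int   # data packets this receiver obtained from the neighbor
    forwarded_by: int    # data packets the neighbor was observed to forward

def suspected_cheaters(records: dict, ratio_threshold: float = 3.0) -> list:
    """Flag neighbors whose incoming/outgoing ratio is abnormally high,
    i.e. nodes that receive far more than they forward."""
    flagged = []
    for node_id, rec in records.items():
        outgoing = max(rec.forwarded_by, 1)     # avoid division by zero
        if rec.received_from / outgoing > ratio_threshold:
            flagged.append(node_id)
    return flagged

# Example: node 7 receives a lot but barely forwards anything.
stats = {3: TrafficRecord(120, 110), 7: TrafficRecord(150, 12)}
print(suspected_cheaters(stats))   # -> [7]
```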

  4. A CAD system and quality assurance protocol for bone age assessment utilizing digital hand atlas

    Science.gov (United States)

    Gertych, Arakadiusz; Zhang, Aifeng; Ferrara, Benjamin; Liu, Brent J.

    2007-03-01

    Bone age assessment (BAA) in pediatric radiology is a task based on detailed analysis of a patient's left-hand X-ray. The current standard used in clinical practice relies on a subjective comparison of the hand with patterns in a book atlas. The computerized approach to BAA (CBAA) uses automatic analysis of the regions of interest in the hand image. This procedure is followed by extraction of quantitative features sensitive to skeletal development, which are then converted to a bone age value using knowledge from the digital hand atlas (DHA). This also allows BAA results to be presented in a form resembling the current clinical approach. All developed methodologies have been combined into one CAD module with a graphical user interface (GUI). CBAA can also improve statistical and analytical accuracy based on a clinical workflow analysis. For this purpose a quality assurance protocol (QAP) has been developed. Implementation of the QAP helped to make the CAD more robust and to find images that cannot meet the conditions required by DHA standards. Moreover, the entire CAD-DHA system may gain further benefits if the clinical acquisition protocol is modified. The goal of this study is to present the performance improvement of the overall CAD-DHA system with the QAP and the comparison of the CAD results with the chronological age of 1390 normal subjects from the DHA. The CAD workstation can process images from a local image database or from a PACS server.

  5. Analytical method for the identification and assay of 12 phthalates in cosmetic products: application of the ISO 12787 international standard "Cosmetics-Analytical methods-Validation criteria for analytical results using chromatographic techniques".

    Science.gov (United States)

    Gimeno, Pascal; Maggio, Annie-Françoise; Bousquet, Claudine; Quoirez, Audrey; Civade, Corinne; Bonnet, Pierre-Antoine

    2012-08-31

    Esters of phthalic acid, more commonly named phthalates, may be present in cosmetic products as ingredients or contaminants. Their presence as contaminants can be due to the manufacturing process, to the raw materials used, or to the migration of phthalates from packaging when plastic (polyvinyl chloride, PVC) is used. Eight phthalates (DBP, DEHP, BBP, DMEP, DnPP, DiPP, DPP, and DiBP), classified H360 or H361, are forbidden in cosmetics according to the European regulation on cosmetics 1223/2009. A GC/MS method was developed for the assay of 12 phthalates in cosmetics, including the 8 regulated phthalates. Analyses are carried out on a GC/MS system in electron impact ionization mode (EI). The separation of phthalates is obtained on a cross-linked 5%-phenyl/95%-dimethylpolysiloxane capillary column, 30 m × 0.25 mm (i.d.) × 0.25 μm film thickness, using a temperature gradient. Phthalate quantification is performed by external calibration using an internal standard. Validation elements obtained on standard solutions highlight satisfactory system conformity (resolution > 1.5), a common quantification limit of 0.25 ng injected, acceptable linearity between 0.5 μg mL⁻¹ and 5.0 μg mL⁻¹, as well as precision and accuracy in agreement with in-house specifications. Cosmetic samples ready for analytical injection are analyzed after dilution in ethanol, whereas more complex cosmetic matrices, like milks and creams, are assayed after a liquid/liquid extraction using tert-butyl methyl ether (TBME). Depending on the type of cosmetic analyzed, the common limits of quantification for the 12 phthalates were set at 0.5 or 2.5 μg g⁻¹. All samples were assayed using the analytical approach described in the ISO 12787 international standard "Cosmetics-Analytical methods-Validation criteria for analytical results using chromatographic techniques". This analytical protocol is particularly adapted when it is not possible to make reconstituted sample matrices. Copyright © 2012
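
    As a rough sketch of the quantification step (external calibration with an internal standard), the analyte/IS peak-area ratio can be regressed against standard concentrations and the fit inverted for the sample; all numerical values below are invented for illustration and are not taken from the study.

```python
# Hedged sketch of internal-standard quantification for a GC/MS assay.
# All numbers are invented; only the calculation pattern reflects common practice.
import numpy as np

# Calibration standards: concentration (ug/mL) vs. analyte/IS peak-area ratio.
conc_std = np.array([0.5, 1.0, 2.0, 3.0, 5.0])
ratio_std = np.array([0.11, 0.21, 0.43, 0.63, 1.05])
slope, intercept = np.polyfit(conc_std, ratio_std, 1)   # linear calibration fit

def sample_concentration(area_analyte, area_internal_std, dilution_factor=1.0):
    """Back-calculate the concentration in the injected solution and correct
    for any dilution or extraction factor applied during sample preparation."""
    ratio = area_analyte / area_internal_std
    return (ratio - intercept) / slope * dilution_factor

print(round(sample_concentration(5.2e4, 1.0e5, dilution_factor=10.0), 2))
```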

  6. Analytical Model based on Green Criteria for Optical Backbone Network Interconnection

    DEFF Research Database (Denmark)

    Gutierrez Lopez, Jose Manuel; Riaz, M. Tahir; Pedersen, Jens Myrup

    2011-01-01

    Key terms such as Global Warming, Greenhouse Gas emissions, or Energy Efficiency are currently within the scope of scientific research. In telecommunications networks, wireless applications, routing protocols, etc. are being designed following this new “Green” trend. This work contributes...... to the evaluation of the environmental impact of networks from the physical interconnection point of view. Network deployment, usage, and disposal are analyzed as contributing elements to ICT’s (Information and Communications Technology) CO2 emissions. This paper presents an analytical model for evaluating...

  7. Combined staging at one stop using MR mammography. Evaluation of an extended protocol to screen for distant metastasis in primary breast cancer. Initial results and diagnostic accuracy in a prospective study

    Energy Technology Data Exchange (ETDEWEB)

    Dietzel, M.; Zoubi, R.; Burmeister, H.P.; Kaiser, W.A.; Baltzer, P.A.T. [Jena Univ. (Germany). Inst. of Diagnostic and Interventional Radiology; Runnebaum, I.B. [University Hospital Jena (Germany). Dept. of Gynecology and Obstetrics

    2012-07-15

    Purpose: Accurate staging of primary breast cancer is essential for the therapeutic approach. Modern whole-body MR scanners would allow local and distant staging during a single examination. Accordingly, we designed a dedicated protocol for this purpose and prospectively evaluated the diagnostic accuracy. Materials and Methods: 65 consecutive breast cancer patients underwent pre-therapeutic MRI (1.5 T). A bilateral breast protocol (axial: T1w/GRE dynamic contrast-enhanced, T2w/TSE; TA: 10 min) was extended to screen for distant metastasis at one stop without repositioning (coronal: T2w/HASTE, T1w/VIBE; FOV: thorax, abdomen and spine; TA: 90 sec; multichannel surface coils). The standard of reference was S3 guideline-compliant staging examinations. Global assessment regarding the presence of distant metastasis was performed independently by two experienced and blinded radiologists (five-level confidence score). Inter-rater agreement (weighted kappa) and observer scoring were analyzed (contingency tables). Results: The prevalence of synchronous metastases was 7.7 % (n = 5). The protocol enabled global assessment regarding the presence of distant metastasis with high accuracy (sensitivity: 100 %; specificity: 98.3 %) and inter-rater agreement (kappa: 0.92). Conclusion: Applying the extended MRI protocol, accurate screening for distant metastasis was possible in combination with a dedicated breast examination. (orig.)

  8. Analytical software design : introduction and industrial experience report

    NARCIS (Netherlands)

    Osaiweran, A.A.H.; Boosten, M.; Mousavi, M.R.

    2010-01-01

    Analytical Software Design (ASD) is a design approach that combines formal and empirical methods for developing mathematically verified software systems. Unlike conventional design methods, the design phase is extended with more formal techniques, so that flaws are detected earlier, thereby reducing

  9. Analytical solutions of the electrostatically actuated curled beam problem

    KAUST Repository

    Younis, Mohammad I.

    2014-01-01

    This work presents analytical expressions for the electrostatically actuated, initially deformed cantilever beam problem. The formulation is based on the continuous Euler-Bernoulli beam model combined with a single-mode Galerkin approximation. We

  10. Intelligent routing protocol for ad hoc wireless network

    Science.gov (United States)

    Peng, Chaorong; Chen, Chang Wen

    2006-05-01

    A novel routing scheme for mobile ad hoc networks (MANETs), which combines hybrid and multi-inter-routing path properties with a distributed topology discovery route mechanism using control agents, is proposed in this paper. In recent years, a variety of hybrid routing protocols for mobile ad hoc wireless networks (MANETs) have been developed. These protocols proactively maintain routing information for a local neighborhood, while reactively acquiring routes to destinations beyond that zone. The hybrid approach reduces route discovery latency and end-to-end delay by providing high connectivity without requiring much of the scarce network capacity. On the other hand, hybrid routing protocols in MANETs, such as the Zone Routing Protocol, still need route "re-discovery" time when an inter-zone route link breaks, since the topology update information needs to be broadcast as a routing request within the local zone. Due to this delay, such routing protocols may not be applicable for real-time data and multimedia communication. We utilize the advantages of a clustering organization and multiple routing paths in the routing protocol to achieve several goals at the same time. Firstly, IRP efficiently saves network bandwidth and reduces route reconstruction time when a routing path fails. The IRP protocol does not require global periodic routing advertisements; local control agents automatically monitor and repair broken links. Secondly, it efficiently reduces congestion and traffic bottlenecks for cluster heads in the clustered network. Thirdly, it reduces the significant overheads associated with maintaining clusters. Fourthly, it improves cluster stability under frequently changing dynamic topologies. In this paper, we present the Intelligent Routing Protocol. First, we discuss the problem of routing in ad hoc networks and the motivation for IRP. We describe the hierarchical architecture of IRP. We describe the routing process and illustrate it with an example. Further, we describe the control manage

  11. Combining motivational and volitional strategies to promote unsupervised walking in patients with fibromyalgia: study protocol for a randomized controlled trial.

    Science.gov (United States)

    Pastor, María-Ángeles; López-Roig, Sofía; Lledó, Ana; Peñacoba, Cecilia; Velasco, Lilian; Schweiger-Gallo, Inge; Cigarán, Margarita; Ecija, Carmen; Limón, Ramón; Sanz, Yolanda

    2014-04-11

    Fibromyalgia patients are often advised to engage in regular low- to moderate-intensity physical exercise. The need of fibromyalgia patients to walk has been stressed in previous research. Behavioral self-regulation theories suggest that a combination of motivational aspects (to develop or strengthen a behavioral intention: Theory of Planned Behavior) and volitional aspects (engagement of intention in behavior: implementation intentions) is more effective than a single intervention. In this paper, we describe a protocol for identifying the motivational processes (using the Theory of Planned Behavior) involved in the practice of walking (phase I) and for studying the efficacy of an intervention that combines motivational and volitional contents to enhance the acquisition and continuation of this exercise behavior (phase II). The paper also shows the characteristics of eligible individuals (women who do not walk) and ineligible populations (women who walk or do not walk because of comorbidity without medical recommendation to walk). Both groups consist of members of any of four patients' associations in Spain who are between 18 and 70 years of age and meet the London Fibromyalgia Epidemiology Study Screening Questionnaire criteria for fibromyalgia. Furthermore, using this study protocol, we will explore the characteristics of participants (eligible women who agreed to participate in the study) and nonparticipants (eligible women who refused to participate). Two studies will be conducted: Phase I will be a cross-sectional study, and phase II will be a triple-blind, randomized longitudinal study with two treatment groups and one active control group. The questionnaires were sent to a total of 2,227 members of four patients' associations in Spain. A total of 920 participants with fibromyalgia returned the questionnaires, and 582 were ultimately selected to participate. The first data gathered have allowed us to identify the characteristics of the study population and

  12. An Accurate Link Correlation Estimator for Improving Wireless Protocol Performance

    Science.gov (United States)

    Zhao, Zhiwei; Xu, Xianghua; Dong, Wei; Bu, Jiajun

    2015-01-01

    Wireless link correlation has shown significant impact on the performance of various sensor network protocols. Many works have been devoted to exploiting link correlation for protocol improvements. However, the effectiveness of these designs heavily relies on the accuracy of link correlation measurement. In this paper, we investigate state-of-the-art link correlation measurement and analyze the limitations of existing works. We then propose a novel lightweight and accurate link correlation estimation (LACE) approach based on the reasoning of link correlation formation. LACE combines both long-term and short-term link behaviors for link correlation estimation. We implement LACE as a stand-alone interface in TinyOS and incorporate it into both routing and flooding protocols. Simulation and testbed results show that LACE: (1) achieves more accurate and lightweight link correlation measurements than the state-of-the-art work; and (2) greatly improves the performance of protocols exploiting link correlation. PMID:25686314
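
    Link correlation is commonly expressed as a conditional packet-reception probability between receivers of the same broadcasts. The sketch below is a loose illustration of blending short-term and long-term link behavior; the window size, blending weight and update rule are assumptions, not the LACE algorithm itself.

```python
# Loose illustration of estimating pairwise link correlation as the conditional
# reception probability P(B receives | A receives), blending a short-term window
# with a long-term running value. Window, weight and update rule are assumptions.
from collections import deque

class LinkCorrelationEstimator:
    def __init__(self, window=32, alpha=0.2):
        self.recent = deque(maxlen=window)   # (rx_a, rx_b) per overheard broadcast
        self.long_term = 0.5                 # neutral prior
        self.alpha = alpha                   # weight given to the short-term estimate

    def record(self, rx_a: bool, rx_b: bool):
        self.recent.append((rx_a, rx_b))

    def estimate(self) -> float:
        both = sum(1 for a, b in self.recent if a and b)
        a_received = sum(1 for a, _ in self.recent if a)
        if a_received:
            short_term = both / a_received
            self.long_term = (1 - self.alpha) * self.long_term + self.alpha * short_term
        return self.long_term

est = LinkCorrelationEstimator()
for a, b in [(True, True), (True, False), (True, True), (False, False)]:
    est.record(a, b)
print(round(est.estimate(), 3))
```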

  13. Developing a protocol for managing the biophysical condition of a ...

    African Journals Online (AJOL)

    Their function will focus on the overall management of water resources on a ... for the integrated management of the biophysical component of a catchment, with ... and implement a protocol which will combine and integrate the knowledge of ...

  14. Improved assessment of mediastinal and pulmonary pathologies in combined staging CT examinations using a fast-speed acquisition dual-source CT protocol

    Energy Technology Data Exchange (ETDEWEB)

    Braun, Franziska M.; Holzner, Veronica; Meinel, Felix G.; Armbruster, Marco; Brandlhuber, Martina; Ertl-Wagner, Birgit; Sommer, Wieland H. [University Hospital Munich, Institute for Clinical Radiology, Munich (Germany)

    2017-12-15

    To demonstrate the feasibility of fast Dual-Source CT (DSCT) and to evaluate the clinical utility in chest/abdomen/pelvis staging CT studies. 45 cancer patients with two follow-up combined chest/abdomen/pelvis staging CT examinations (maximally ±10 kV difference in tube potential) were included. The first scan had to be performed with our standard protocol (fixed pitch 0.6), the second one using a novel fast-speed DSCT protocol (fixed pitch 1.55). Effective doses (ED) were calculated, noise measurements performed. Scan times were compared, motion artefacts and the diagnostic confidence rated in consensus reading. ED for the standard and fast-speed scans was 9.1 (7.0-11.1) mSv and 9.2 (7.4-12.8) mSv, respectively (P = 0.075). Image noise was comparable (abdomen; all P > 0.05) or reduced for fast-speed CTs (trachea, P = 0.001; ascending aorta, P < 0.001). Motion artefacts of the heart/the ascending aorta (all P < 0.001) and breathing artefacts (P < 0.031) were reduced in fast DSCT. The diagnostic confidence for the evaluation of mediastinal (P < 0.001) and pulmonary (P = 0.008) pathologies was improved for fast DSCT. Fast DSCT for chest/abdomen/pelvis staging CT examinations is performed within 2 seconds scan time and eliminates relevant intrathoracic motion/breathing artefacts. Mediastinal/pulmonary pathologies can thus be assessed with high diagnostic confidence. Abdominal image quality remains excellent. (orig.)

  15. Pre-exposure and postexposure prophylaxes and the combination HIV prevention methods (The Combine! Study): protocol for a pragmatic clinical trial at public healthcare clinics in Brazil.

    Science.gov (United States)

    Grangeiro, Alexandre; Couto, Márcia Thereza; Peres, Maria Fernanda; Luiz, Olinda; Zucchi, Eliana Miura; de Castilho, Euclides Ayres; Estevam, Denize Lotufo; Alencar, Rosa; Wolffenbüttel, Karina; Escuder, Maria Mercedes; Calazans, Gabriela; Ferraz, Dulce; Arruda, Érico; Corrêa, Maria da Gloria; Amaral, Fabiana Rezende; Santos, Juliane Cardoso Villela; Alvarez, Vivian Salles; Kietzmann, Tiago

    2015-08-25

    Few results from programmes based on combination prevention methods are available. We propose to analyse the degree of protection provided by postexposure prophylaxis (PEP) for consensual sexual activity at healthcare clinics, its compensatory effects on sexual behaviour; and the effectiveness of combination prevention methods and pre-exposure prophylaxis (PrEP), compared with exclusively using traditional methods. A total of 3200 individuals aged 16 years or older presenting for PEP at 5 sexually transmitted disease (STD)/HIV clinics in 3 regions of Brazil will be allocated to one of two groups: the PEP group-individuals who come to the clinic within 72 h after a sexual exposure and start PEP; and the non-PEP group-individuals who come after 72 h but within 30 days of exposure and do not start PEP. Clinical follow-up will be conducted initially for 6 months and comprise educational interventions based on information and counselling for using prevention methods, including PrEP. In the second study phase, individuals who remain HIV negative will be regrouped according to the reported use of prevention methods and observed for 18 months: only traditional methods; combined methods; and PrEP. Effectiveness will be analysed according to the incidence of HIV, syphilis and hepatitis B and C and protected sexual behaviour. A structured questionnaire will be administered to participants at baseline and every 6 months thereafter. Qualitative methods will be employed to provide a comprehensive understanding of PEP-seeking behaviour, preventive choices and exposure to HIV. This study will be conducted in accordance with the resolution of the School of Medicine Research Ethics Commission of Universidade de São Paulo (protocol no. 251/14). The databases will be available for specific studies, after management committee approval. Findings will be presented to researchers, health managers and civil society members by means of newspapers, electronic media and scientific journals

  16. Past and Future of the Kyoto Protocol. Final report

    International Nuclear Information System (INIS)

    Wijen, F.; Zoeteman, K.

    2004-01-01

    The present report reflects findings from a study on the realization of and prospects for the Kyoto Protocol. The purpose of the study was (1) to obtain insights into the factors that enabled the realization of the Kyoto Protocol, in particular the interactions among the major parties involved; (2) to assess the future opportunities and threats of the Kyoto Protocol, in particular against the backdrop of an increasingly globalised world. The study was conducted from February to December 2003 by (a) reviewing the literature, especially publications on the negotiation history of the Kyoto process, the social interactions enabling the realization of the Protocol, analyses of strengths and weaknesses, and future climate regimes; (b) conducting a series of interviews with representatives from government, academia, non-governmental organisations, and business, who have been - directly or indirectly - involved in the Kyoto process; (c) internal discussions, brainstorming, and analysis of the Protocol's strengths and weaknesses, possible future scenarios (including policy options), and the management of a possible failure of the Kyoto Protocol. The present report reflects and integrates these different sources. The first section deals with the past and the present. It discusses how the Kyoto Protocol could be realized despite the divergent interests, reflects on its architecture, and analyses major strengths and weaknesses. In the second section, we present possible future scenarios. We explore how different combinations of domestic and international commitment provide possible realities that national governments may face when crafting climate policy. The third section provides an in-depth analysis of the possible event that the Kyoto Protocol fails. We discuss its definition and policy implications. The final section is reserved for overall conclusions and policy recommendations

  17. Quorum system and random based asynchronous rendezvous protocol for cognitive radio ad hoc networks

    Directory of Open Access Journals (Sweden)

    Sylwia Romaszko

    2013-12-01

    This paper proposes a rendezvous protocol for cognitive radio ad hoc networks, RAC2E-gQS, which utilizes (1) the asynchronous and randomness properties of the RAC2E protocol, and (2) a channel mapping protocol based on a grid Quorum System (gQS) and taking into account channel heterogeneity and asymmetric channel views. We show that the combination of the RAC2E protocol with the grid-quorum-based channel mapping can yield a powerful RAC2E-gQS rendezvous protocol for asynchronous operation in a distributed environment, assuring a rapid rendezvous between cognitive radio nodes having either symmetric or asymmetric channel views. We also propose an enhancement of the protocol, which uses a torus QS for slot allocation, dealing with the worst-case scenario of a large number of channels with opposite ranking lists.
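
    The grid-quorum idea underlying the channel mapping can be sketched as follows: if every node picks one full row and one full column of the same n × n slot grid, any two such quorums must intersect, which is what guarantees overlap between unsynchronized hopping sequences. The code is only a minimal illustration; RAC2E-gQS's actual mapping rules are richer.

```python
# Minimal illustration of a grid-quorum slot selection: one full row plus one
# full column of an n x n grid. Any two such quorums intersect, which is the
# property that guarantees overlap between unsynchronized nodes.
def grid_quorum(n: int, row: int, col: int) -> set:
    """Slot indices (0 .. n*n-1) covered by the chosen row and column."""
    row_slots = {row * n + c for c in range(n)}
    col_slots = {r * n + col for r in range(n)}
    return row_slots | col_slots

q1 = grid_quorum(5, row=1, col=3)
q2 = grid_quorum(5, row=4, col=0)
print(sorted(q1 & q2))   # non-empty intersection, here slots 5 and 23
```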

  18. Analytic Reflected Lightcurves for Exoplanets

    Science.gov (United States)

    Haggard, Hal M.; Cowan, Nicolas B.

    2018-04-01

    The disk-integrated reflected brightness of an exoplanet changes as a function of time due to orbital and rotational motion coupled with an inhomogeneous albedo map. We have previously derived analytic reflected lightcurves for spherical harmonic albedo maps in the special case of a synchronously-rotating planet on an edge-on orbit (Cowan, Fuentes & Haggard 2013). In this letter, we present analytic reflected lightcurves for the general case of a planet on an inclined orbit, with arbitrary spin period and non-zero obliquity. We do so for two different albedo basis maps: bright points (δ-maps), and spherical harmonics (Y_l^m-maps). In particular, we use Wigner D-matrices to express a harmonic lightcurve for an arbitrary viewing geometry as a non-linear combination of harmonic lightcurves for the simpler edge-on, synchronously rotating geometry. These solutions will enable future exploration of the degeneracies and information content of reflected lightcurves, as well as fast calculation of lightcurves for mapping exoplanets based on time-resolved photometry. To these ends we make available Exoplanet Analytic Reflected Lightcurves (EARL), a simple open-source code that allows rapid computation of reflected lightcurves.

  19. A Group Neighborhood Average Clock Synchronization Protocol for Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Lin Lin

    2014-08-01

    Clock synchronization is a very important issue for applications of wireless sensor networks. The sensors need to keep strictly synchronized clocks so that users can know exactly what happens in the monitoring area at any given time. This paper proposes a novel internal distributed clock synchronization solution using a group neighborhood average. Each sensor node collects the offset and skew rate of its neighbors. Group averages of the offset and skew-rate values are calculated instead of using the conventional point-to-point averaging method. The sensor node then returns the compensated values back to its neighbors. The propagation delay is considered and compensated for. An analytical analysis of the offset and skew compensation is presented. Simulation results validate the effectiveness of the protocol and reveal that the protocol allows sensor networks to quickly establish a consensus clock and maintain a small deviation from the consensus clock.
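
    A minimal sketch of the group-averaging step is given below. The field names and the plain arithmetic mean are assumptions based on the abstract; the published protocol's compensation details may differ.

```python
# Sketch of one group-averaging compensation step (field names and the plain
# arithmetic mean are assumptions based on the abstract).
from dataclasses import dataclass

@dataclass
class NeighborReport:
    offset: float      # measured clock offset to this neighbor (s)
    skew: float        # measured relative skew rate (ppm)
    prop_delay: float  # estimated propagation delay (s)

def group_average_correction(reports):
    """Average offsets (propagation-delay compensated) and skew rates over the
    whole neighborhood at once, instead of pairwise point-to-point averaging."""
    if not reports:
        return 0.0, 0.0
    avg_offset = sum(r.offset - r.prop_delay for r in reports) / len(reports)
    avg_skew = sum(r.skew for r in reports) / len(reports)
    return avg_offset, avg_skew

offset_corr, skew_corr = group_average_correction(
    [NeighborReport(0.0021, 12.0, 0.0004), NeighborReport(-0.0008, -5.0, 0.0003)]
)
print(offset_corr, skew_corr)   # the node slews its clock and rate by these amounts
```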

  20. Glycan characterization of the NIST RM monoclonal antibody using a total analytical solution: From sample preparation to data analysis.

    Science.gov (United States)

    Hilliard, Mark; Alley, William R; McManus, Ciara A; Yu, Ying Qing; Hallinan, Sinead; Gebler, John; Rudd, Pauline M

    Glycosylation is an important attribute of biopharmaceutical products to monitor from development through production. However, glycosylation analysis has traditionally been a time-consuming process with long sample preparation protocols and manual interpretation of the data. To address the challenges associated with glycan analysis, we developed a streamlined analytical solution that covers the entire process from sample preparation to data analysis. In this communication, we describe the complete analytical solution that begins with a simplified and fast N-linked glycan sample preparation protocol that can be completed in less than 1 hr. The sample preparation includes labelling with RapiFluor-MS tag to improve both fluorescence (FLR) and mass spectral (MS) sensitivities. Following HILIC-UPLC/FLR/MS analyses, the data are processed and a library search based on glucose units has been included to expedite the task of structural assignment. We then applied this total analytical solution to characterize the glycosylation of the NIST Reference Material mAb 8761. For this glycoprotein, we confidently identified 35 N-linked glycans and all three major classes, high mannose, complex, and hybrid, were present. The majority of the glycans were neutral and fucosylated; glycans featuring N-glycolylneuraminic acid and those with two galactoses connected via an α1,3-linkage were also identified.

  1. The Integration of a Small Thermal Desorption (TD) System for Air Monitoring into a Mobile Analytical Laboratory in France Used by the NRBC Emergency First Responder Police Organization

    International Nuclear Information System (INIS)

    Roberts, G. M.

    2007-01-01

    A mobile analytical laboratory has been developed in France by Thales Security Systems in conjunction with the French department of defense (DGA) to rapidly identify the composition of toxic substances released accidentally or by terrorist activity at a location of high civilian population density. Accurate and fast identification of toxic material is critical for first responder teams that attend an incident site. Based on this analysis, defined decontamination protocols for contaminated people can be implemented, and specific medical treatment can be administered to those worst affected. Analysing samples with high-technology instrumentation close to the point of release is therefore highly advantageous and is only possible with mobile analytical platforms. Transporting samples back to a central laboratory for analysis is not realistic due to time limitations. This paper looks at one particular aspect of analysis performed in this mobile multi-technique laboratory, namely air monitoring for CW or TIC compounds. Air sampling and pre-concentration are achieved using a small, innovative thermal desorption system (Unity™) in combination with a gas chromatography-mass spectrometry system for the detection and identification of specific analytes. Implementation of the Unity TD system in the confines of this small mobile environment will be reviewed in this paper. (author)

  2. A review of analytics and clinical informatics in health care.

    Science.gov (United States)

    Simpao, Allan F; Ahumada, Luis M; Gálvez, Jorge A; Rehman, Mohamed A

    2014-04-01

    Federal investment in health information technology has incentivized the adoption of electronic health record systems by physicians and health care organizations; the result has been a massive rise in the collection of patient data in electronic form (i.e. "Big Data"). Health care systems have leveraged Big Data for quality and performance improvements using analytics-the systematic use of data combined with quantitative as well as qualitative analysis to make decisions. Analytics have been utilized in various aspects of health care including predictive risk assessment, clinical decision support, home health monitoring, finance, and resource allocation. Visual analytics is one example of an analytics technique with an array of health care and research applications that are well described in the literature. The proliferation of Big Data and analytics in health care has spawned a growing demand for clinical informatics professionals who can bridge the gap between the medical and information sciences.

  3. Analytical model for nonlinear piezoelectric energy harvesting devices

    International Nuclear Information System (INIS)

    Neiss, S; Goldschmidtboeing, F; M Kroener; Woias, P

    2014-01-01

    In this work we propose analytical expressions for the jump-up and jump-down point of a nonlinear piezoelectric energy harvester. In addition, analytical expressions for the maximum power output at optimal resistive load and the 3 dB-bandwidth are derived. So far, only numerical models have been used to describe the physics of a piezoelectric energy harvester. However, this approach is not suitable to quickly evaluate different geometrical designs or piezoelectric materials in the harvester design process. In addition, the analytical expressions could be used to predict the jump-frequencies of a harvester during operation. In combination with a tuning mechanism, this would allow the design of an efficient control algorithm to ensure that the harvester is always working on the oscillator's high energy attractor. (paper)

  4. Landslide susceptibility mapping by combining the three methods Fuzzy Logic, Frequency Ratio and Analytical Hierarchy Process in Dozain basin

    Directory of Open Access Journals (Sweden)

    E. Tazik

    2014-10-01

    Landslides are among the most important natural hazards that lead to modification of the environment, so the study of this phenomenon is important in many areas. Given the climatic, geologic, and geomorphologic characteristics of the region, the purpose of this study was landslide hazard assessment using Fuzzy Logic, the frequency ratio, and the Analytical Hierarchy Process (AHP) method in the Dozein basin, Iran. First, landslides that had occurred in the Dozein basin were identified using aerial photos and field studies. The landslide-conditioning parameters used in this study, including slope, aspect, elevation, lithology, precipitation, land cover, distance from faults, distance from roads, and distance from rivers, were obtained from different sources and maps. Using these factors and the identified landslides, fuzzy membership values were calculated by the frequency ratio. Then, to account for the importance of each factor in landslide susceptibility, the weight of each factor was determined based on a questionnaire and the AHP method. Finally, the fuzzy map of each factor was multiplied by the weight obtained using the AHP method. Lastly, to compute prediction accuracy, the produced map was verified by comparison with existing landslide locations. The results indicate that combining the three methods Fuzzy Logic, Frequency Ratio and Analytical Hierarchy Process gives a relatively good estimate of landslide susceptibility in the study area. According to the landslide susceptibility map, about 51% of the occurred landslides fall into the high and very high susceptibility zones, whereas approximately 26% of them are located in the low and very low susceptibility zones.
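
    The overlay logic can be sketched as follows: frequency ratios give per-class fuzzy memberships, AHP gives per-factor weights, and the weighted maps are combined into a susceptibility index. Factor names, weights and the weighted-sum combination below are illustrative assumptions, not the study's exact parameters.

```python
# Illustrative overlay: frequency-ratio-based fuzzy memberships weighted by AHP.
# Factor names, weights and the weighted-sum combination are assumptions.
import numpy as np

def frequency_ratio(landslide_px_in_class, class_px, landslide_px_total, px_total):
    """FR = (share of landslide pixels in the class) / (share of all pixels in the class)."""
    return (landslide_px_in_class / landslide_px_total) / (class_px / px_total)

print(frequency_ratio(30, 500, 120, 10000))   # -> 5.0 for this toy class

def susceptibility(fuzzy_maps, ahp_weights):
    """Multiply each factor's fuzzy-membership raster (0..1, e.g. normalized FR
    values) by its AHP weight and sum the products into a susceptibility index."""
    factors = list(fuzzy_maps)
    index = np.zeros_like(fuzzy_maps[factors[0]], dtype=float)
    for f in factors:
        index += ahp_weights[f] * fuzzy_maps[f]
    return index

rng = np.random.default_rng(1)
maps = {"slope": rng.random((4, 4)), "lithology": rng.random((4, 4))}   # toy rasters
weights = {"slope": 0.6, "lithology": 0.4}                              # AHP weights
print(np.round(susceptibility(maps, weights), 2))
```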

  5. Separation and Simultaneous Determination of 14 Fungicides with the Combination of Multi-Analyte Methods and HPLC Detection

    Energy Technology Data Exchange (ETDEWEB)

    Canping, Pan [Department of Applied Chemistry, China Agricultural University, Beijing (China)

    2009-07-15

    The separation and simultaneous HPLC-MS determination of a series of fungicide products is reported. Multi-analyte methods were applied on a Chromolith RP-18e monolithic column, which has low flow resistance and enables high flow rates and short analysis times with very good separation power. Details and analytical conditions are described, with chromatograms illustrating the results and the work done. (author)

  6. Comparison of Haloperidol Alone and in Combination with Midazolam for the Treatment of Acute Agitation in an Inpatient Palliative Care Service.

    Science.gov (United States)

    Ferraz Gonçalves, José António; Almeida, Ana; Costa, Isabel; Silva, Paula; Carneiro, Rui

    2016-12-01

    Agitation is a very distressing problem that must be controlled as quickly as possible, but using a safe method. The authors conducted a comparison of two protocols: a combination of haloperidol and midazolam, and haloperidol alone. The combination protocol controlled 101 out of 121 (84%) episodes of agitation with only the first dose, whereas the haloperidol-alone protocol controlled 47 out of 74 (64%) episodes. This difference is statistically significant (P = .002), with a post hoc analyzed power of 0.88. The median time from the first dose to the control of agitation was 15 minutes (range: 5-210) with the combination and 60 minutes (range: 10-430) with the other protocol. The combination of haloperidol and midazolam is effective and safe for the control of agitation in palliative care, and it is more effective than haloperidol alone. Therefore, the combination should be adopted as the preferred protocol. It would be helpful if the usefulness of this protocol were confirmed by others.
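
    The reported proportions and their comparison can be checked with a few lines; the chi-square test used here is only an illustration, and the trial's own statistical procedure may have differed.

```python
# Quick numerical check of the reported first-dose response rates and their
# comparison; the chi-square test is illustrative, not necessarily the study's test.
from scipy.stats import chi2_contingency

controlled = [101, 47]   # episodes controlled with the first dose
totals = [121, 74]       # combination protocol, haloperidol alone
print([round(c / t, 3) for c, t in zip(controlled, totals)])   # -> [0.835, 0.635]

table = [[101, 121 - 101],   # combination: controlled / not controlled
         [47, 74 - 47]]      # haloperidol alone
chi2, p, dof, expected = chi2_contingency(table)
print(round(p, 3))           # close to the reported P = .002
```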

  7. Provenancing Flower Bulbs by Analytical Fingerprinting: Convallaria Majalis

    NARCIS (Netherlands)

    Ruth, van S.M.; Visser, de R.

    2015-01-01

    The origin of agricultural products is gaining in appreciation, while often being hard to determine for various reasons. Geographical origin may be resolved using a combination of chemical and physical analytical technologies. In the present case of Lily of the Valley (Convallaria majalis) rhizomes, we

  8. FOG: Fighting the Achilles' Heel of Gossip Protocols with Fountain Codes

    Science.gov (United States)

    Champel, Mary-Luc; Kermarrec, Anne-Marie; Le Scouarnec, Nicolas

    Gossip protocols are well known to provide reliable and robust dissemination in highly dynamic systems. Yet, they suffer from high redundancy in the last phase of the dissemination. In this paper, we combine fountain codes (rateless erasure-correcting codes) with gossip protocols for robust and fast content dissemination in large-scale dynamic systems. The use of fountain codes eliminates the unnecessary redundancy of gossip protocols. We propose the design of FOG, which fully exploits the first, exponential-growth phase (where the data is disseminated exponentially fast) of gossip protocols while avoiding the need for the shrinking phase by using fountain codes. FOG voluntarily increases the number of disseminations but limits those disseminations to the exponential growth phase. In addition, FOG creates a split-graph overlay that splits the peers into encoders and forwarders. Forwarder peers become encoders as soon as they have received the whole content. In order to benefit even further and more quickly from encoders, FOG biases the dissemination towards the most advanced peers to make them complete earlier.
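
    A toy sketch of the fountain-coding idea that FOG builds on is shown below: each coded symbol XORs a random subset of source blocks, so receivers need only enough coded symbols rather than every specific block. The uniform degree choice is a simplification; practical fountain codes and FOG's encoder/forwarder overlay are more elaborate.

```python
# Toy sketch of a rateless (fountain-style) encoder: each coded symbol XORs a
# random subset of source blocks. Uniform random degree is a simplification.
import random

def encode_symbol(blocks, rng):
    """Pick a random subset of equally sized source blocks and XOR them together."""
    degree = rng.randint(1, len(blocks))
    chosen = rng.sample(range(len(blocks)), degree)
    symbol = bytearray(len(blocks[0]))
    for i in chosen:
        symbol = bytearray(a ^ b for a, b in zip(symbol, blocks[i]))
    return chosen, bytes(symbol)

rng = random.Random(42)
source = [b"AAAA", b"BBBB", b"CCCC", b"DDDD"]
indices, coded = encode_symbol(source, rng)
print(indices, coded)   # which blocks were combined, and the coded payload
```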

  9. Mixed Initiative Visual Analytics Using Task-Driven Recommendations

    Energy Technology Data Exchange (ETDEWEB)

    Cook, Kristin A.; Cramer, Nicholas O.; Israel, David; Wolverton, Michael J.; Bruce, Joseph R.; Burtner, Edwin R.; Endert, Alexander

    2015-12-07

    Visual data analysis is composed of a collection of cognitive actions and tasks to decompose, internalize, and recombine data to produce knowledge and insight. Visual analytic tools provide interactive visual interfaces to data to support tasks involved in discovery and sensemaking, including forming hypotheses, asking questions, and evaluating and organizing evidence. Myriad analytic models can be incorporated into visual analytic systems, at the cost of increasing complexity in the analytic discourse between user and system. Techniques exist to increase the usability of interacting with such analytic models, such as inferring data models from user interactions to steer the underlying models of the system via semantic interaction, shielding users from having to do so explicitly. Such approaches are often also referred to as mixed-initiative systems. Researchers studying the sensemaking process have called for development of tools that facilitate analytic sensemaking through a combination of human and automated activities. However, design guidelines do not exist for mixed-initiative visual analytic systems to support iterative sensemaking. In this paper, we present a candidate set of design guidelines and introduce the Active Data Environment (ADE) prototype, a spatial workspace supporting the analytic process via task recommendations invoked by inferences on user interactions within the workspace. ADE recommends data and relationships based on a task model, enabling users to co-reason with the system about their data in a single, spatial workspace. This paper provides an illustrative use case, a technical description of ADE, and a discussion of the strengths and limitations of the approach.

  10. Protocol for Short- and Longer-term Spatial Learning and Memory in Mice

    Directory of Open Access Journals (Sweden)

    Emily F. Willis

    2017-10-01

    Studies on the role of the hippocampus in higher cognitive functions such as spatial learning and memory in rodents rely upon robust and objective behavioral tests. This protocol describes one such test—the active place avoidance (APA) task. This behavioral task involves the mouse continuously integrating visual cues to orient itself within a rotating arena in order to actively avoid a shock zone, the location of which remains constant relative to the room. This protocol details the step-by-step procedures for a novel paradigm of the hippocampal-dependent APA task, measuring acquisition of spatial learning during a single 20-min trial (i.e., short-term memory), with spatial memory encoding and retrieval (i.e., long-term memory) assessed by trials conducted over consecutive days. Using the APA task, cognitive flexibility can be assessed with the reversal learning paradigm, as this increases the cognitive load required for efficient performance in the task. In addition to a detailed experimental protocol, this paper also describes the range of its possible applications, the expected key results, the analytical methods used to assess the data, and pitfalls/troubleshooting measures. The protocol described herein is highly robust and produces replicable results, thus presenting an important paradigm that enables the assessment of subtle short-term changes in spatial learning and memory, such as those observed for many experimental interventions.

  11. Formal Test Automation: The Conference protocol with TGV/TorX

    NARCIS (Netherlands)

    Ural, Hasan; Du Bousquet, Lydie; Ramangalahy, Solofo; Probert, Robert L.; von Bochmann, Gregor; Simon, Severine; Viho, Cesar; Belinfante, Axel; de Vries, R.G.

    We present an experiment of automated formal conformance testing of the Conference Protocol Entity as reported in [2]. Our approach differs from other experiments, since it investigates the combination of the tools TGV for abstract test generation and TorX for test execution.

  12. Combining project based learning with exercises in problem solving in order to train analytical mathematical skills

    DEFF Research Database (Denmark)

    Friesel, Anna

    2013-01-01

    This paper presents the contents and the teaching methods used in the fourth-semester course - REG4E - which covers an important subject in engineering, namely Control Theory and Dynamical Systems. Control Theory courses in engineering education are usually related to exercises in the laboratory or to projects.... However, in order to understand the complexity of control systems, the students need to possess an analytical understanding of abstract mathematical problems. Our main goal is to illustrate the theory through the robot project, but at the same time we force our students to train their analytical skills...

  13. Analytical solutions for tomato peeling with combined heat flux and convective boundary conditions

    Science.gov (United States)

    Cuccurullo, G.; Giordano, L.; Metallo, A.

    2017-11-01

    Peeling of tomatoes by radiative heating is a valid alternative to steam or lye, which are expensive and polluting methods. Suitable energy densities are required in order to realize short-time operations, which involve only a thin layer under the tomato surface. This paper aims to predict the temperature field in rotating tomatoes exposed to the source irradiation. Therefore, a 1D unsteady analytical model is presented, which involves a semi-infinite slab subjected to time-dependent heating while convective heat transfer takes place on the exposed surface. In order to account for the tomato rotation, the heat source is described as the positive half-wave of a sinusoidal function. The problem being linear, the solution is derived following the Laplace transform method. In addition, an easy-to-handle solution for the problem at hand is presented, which assumes a differentiable function for approximating the source while neglecting convective cooling, the latter contribution turning out to be negligible for the context at hand. A satisfactory agreement between the two analytical solutions is found; therefore, an easy procedure for a proper design of the dry heating system can be set up, avoiding the use of numerical simulations.
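
    One plausible mathematical statement of the model described above is sketched below; the paper's exact formulation, symbols, and nondimensionalization may differ.

```latex
% 1D transient conduction in a semi-infinite slab with a half-wave sinusoidal
% radiative flux and convective cooling at the exposed surface (illustrative form).
\begin{align}
  \frac{\partial T}{\partial t} &= \alpha\,\frac{\partial^{2} T}{\partial x^{2}},
      \qquad x > 0,\; t > 0, \\
  -k\,\left.\frac{\partial T}{\partial x}\right|_{x=0}
      &= q(t) - h\,\bigl[T(0,t) - T_{\infty}\bigr],
      \qquad q(t) = q_{0}\,\max\bigl(0,\,\sin \omega t\bigr), \\
  T(x,0) &= T_{\infty}, \qquad \lim_{x \to \infty} T(x,t) = T_{\infty}.
\end{align}
```

    Here α is the thermal diffusivity, k the conductivity, h the convective coefficient, ω the rotation rate, and q₀ the peak absorbed flux; a Laplace transform in time then yields the temperature field, with the convective term dropped in the simplified solution.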

  14. Case Study : Visual Analytics in Software Product Assessments

    NARCIS (Netherlands)

    Telea, Alexandru; Voinea, Lucian; Lanza, M; Storey, M; Muller, H

    2009-01-01

    We present how a combination of static source code analysis, repository analysis, and visualization techniques has been used to effectively get and communicate insight into the development and project management problems of a large industrial code base. This study is an example of how visual analytics

  15. Mapping debris flow susceptibility using analytical network process ...

    Indian Academy of Sciences (India)

    Evangelin Ramani Sujatha

    2017-11-23

    Nov 23, 2017 ... A method known as the analytical network process (ANP) is used to map the debris flow susceptibility ...

  16. Bio- and chemiluminescence imaging in analytical chemistry

    International Nuclear Information System (INIS)

    Roda, Aldo; Guardigli, Massimo; Pasini, Patrizia; Mirasoli, Mara; Michelini, Elisa; Musiani, Monica

    2005-01-01

    Bio- and chemiluminescence imaging techniques combine the high sensitivity of bio- and chemiluminescence detection with the ability of current light imaging devices to localize and quantify light emission down to the single-photon level. These techniques have been successfully exploited for the development of sensitive analytical methods relying on the evaluation of the spatial distribution of the light emitted from a target sample. In this paper, we report on recent applications of bio- and chemiluminescence imaging for in vitro and in vivo assays, including: quantitative assays performed in various analytical formats, such as microtiter plates, microarrays and miniaturized analytical devices, used in the pharmaceutical, clinical, diagnostic and environmental fields; luminescence imaging microscopy based on enzymatic, immunohistochemical and in situ hybridization reactions for the localization of metabolites, enzymes, antigens and gene sequences in cells and tissues; whole-body luminescence imaging in live animals for evaluating biological and pathological processes and for pharmacological studies

  17. Archetypes of Supply Chain Analytics Initiatives—An Exploratory Study

    Directory of Open Access Journals (Sweden)

    Tino T. Herden

    2018-05-01

    While Big Data and Analytics are arguably rising stars of competitive advantage, their application is often presented and investigated as an overall approach. A plethora of methods and technologies combined with a variety of objectives creates a barrier for managers deciding how to act, while researchers investigating the impact of Analytics oftentimes neglect this complexity when generalizing their results. Based on a cluster analysis applied to 46 case studies of Supply Chain Analytics (SCA), we propose 6 archetypes of initiatives in SCA to provide orientation for managers as a means to overcome barriers and build competitive advantage. Further, the derived archetypes present a distinction of SCA for researchers seeking to investigate the effects of SCA on organizational performance.

  18. Impact of the Injection Protocol on an Impurity's Stationary State

    Science.gov (United States)

    Gamayun, Oleksandr; Lychkovskiy, Oleg; Burovski, Evgeni; Malcomson, Matthew; Cheianov, Vadim V.; Zvonarev, Mikhail B.

    2018-06-01

    We examine stationary-state properties of an impurity particle injected into a one-dimensional quantum gas. We show that the value of the impurity's end velocity lies between zero and the speed of sound in the gas and is determined by the injection protocol. This way, the impurity's constant motion is a dynamically emergent phenomenon whose description goes beyond accounting for the kinematic constraints of the Landau approach to superfluidity. We provide exact analytic results in the thermodynamic limit and perform finite-size numerical simulations to demonstrate that the predicted phenomena are within the reach of the ultracold gas experiments.

  19. IEEE 802.11 Wireless LANs: Performance Analysis and Protocol Refinement

    Directory of Open Access Journals (Sweden)

    Chatzimisios P.

    2005-01-01

    The IEEE 802.11 protocol is emerging as a widely used standard and has become the most mature technology for wireless local area networks (WLANs). In this paper, we focus on the tuning of the IEEE 802.11 protocol parameters taking into consideration, in addition to throughput efficiency, performance metrics such as the average packet delay, the probability of a packet being discarded when it reaches the maximum retransmission limit, the average time to drop a packet, and the packet interarrival time. We present an analysis, validated by simulation, that is based on a Markov chain model commonly used in the literature. We further study the improvement of these performance metrics obtained by employing suitable protocol parameters according to the specific communication needs of the IEEE 802.11 protocol for both the basic access and RTS/CTS access schemes. We show that the use of a higher initial contention window size does not considerably degrade performance in small networks and performs significantly better in any other scenario. Moreover, we conclude that the combination of a lower maximum contention window size and a higher retry limit considerably improves performance. Results indicate that the appropriate adjustment of the protocol parameters enhances performance and improves the services that the IEEE 802.11 protocol provides to various communication applications.
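
    The Markov-chain analysis referred to above is commonly reduced to a two-equation fixed point (a Bianchi-style DCF model). The sketch below solves that fixed point numerically; the parameter values are illustrative and the paper's exact model and refinements are not reproduced here.

```python
# Fixed point of a Bianchi-style DCF model: tau is a station's per-slot
# transmission probability, p its conditional collision probability, W the
# initial contention window, m the maximum backoff stage. Values are illustrative.
def bianchi_tau(p, W, m):
    num = 2.0 * (1.0 - 2.0 * p)
    den = (1.0 - 2.0 * p) * (W + 1) + p * W * (1.0 - (2.0 * p) ** m)
    return num / den

def solve_dcf(n, W=32, m=5):
    """Solve p = 1-(1-tau)^(n-1) and tau = bianchi_tau(p) by bisection on tau
    (the mapping tau -> bianchi_tau(p(tau)) is monotonically decreasing)."""
    lo, hi = 1e-6, 0.999
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        p = 1.0 - (1.0 - mid) ** (n - 1)
        if bianchi_tau(p, W, m) > mid:
            lo = mid
        else:
            hi = mid
    tau = 0.5 * (lo + hi)
    return tau, 1.0 - (1.0 - tau) ** (n - 1)

tau, p = solve_dcf(n=20)
p_tr = 1.0 - (1.0 - tau) ** 20               # some station transmits in a slot
p_s = 20 * tau * (1.0 - tau) ** 19 / p_tr    # that transmission is collision-free
print(round(tau, 4), round(p, 4), round(p_s, 3))
```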

  20. A Family of ACO Routing Protocols for Mobile Ad Hoc Networks

    Science.gov (United States)

    Rupérez Cañas, Delfín; Sandoval Orozco, Ana Lucila; García Villalba, Luis Javier; Kim, Tai-hoon

    2017-01-01

    In this work, an ACO routing protocol for mobile ad hoc networks based on AntHocNet is specified. Like its predecessor, this new protocol, called AntOR, is hybrid in the sense that it contains elements from both reactive and proactive routing. Specifically, it combines a reactive route setup process with a proactive route maintenance and improvement process. Key aspects of the AntOR protocol are the disjoint-link and disjoint-node routes, the separation between the regular pheromone and the virtual pheromone in the diffusion process, and the exploration of routes taking into consideration the number of hops in the best routes. In this work, a family of ACO routing protocols based on AntOR is also specified. These protocols are based on successive protocol refinements. We also present a parallelized version of AntOR that we call PAntOR. Using multiprocessor programming based on shared memory, PAntOR allows tasks to run in parallel using threads. This parallelization is applicable in the route setup phase, the route local repair process, and link failure notification. In addition, a variant of PAntOR with more than one interface, which we call PAntOR-MI (PAntOR-Multiple Interface), is specified. This approach parallelizes the sending of broadcast messages per interface through threads. PMID:28531159
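
    As a generic illustration of the pheromone bookkeeping such ACO protocols rely on (not AntOR's exact update rules), a routing table might keep separate regular and virtual pheromone entries, reinforce them from ant feedback, and evaporate them over time:

```python
# Generic ACO-style pheromone bookkeeping (an illustration of the mechanism,
# not AntOR's exact rules); the regular/virtual split mirrors the description above.
class PheromoneTable:
    def __init__(self, evaporation=0.1):
        self.regular = {}   # (destination, next_hop) -> value reinforced by real ants
        self.virtual = {}   # (destination, next_hop) -> value diffused from neighbors
        self.rho = evaporation

    def reinforce(self, dest, next_hop, quality, virtual=False):
        table = self.virtual if virtual else self.regular
        key = (dest, next_hop)
        table[key] = (1.0 - self.rho) * table.get(key, 0.0) + self.rho * quality

    def evaporate(self):
        for table in (self.regular, self.virtual):
            for key in table:
                table[key] *= 1.0 - self.rho

    def best_next_hop(self, dest):
        candidates = {nh: v for (d, nh), v in self.regular.items() if d == dest}
        return max(candidates, key=candidates.get) if candidates else None

t = PheromoneTable()
t.reinforce("D", "B", quality=0.9)
t.reinforce("D", "C", quality=0.4)
print(t.best_next_hop("D"))   # -> "B"
```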

  1. A Family of ACO Routing Protocols for Mobile Ad Hoc Networks.

    Science.gov (United States)

    Rupérez Cañas, Delfín; Sandoval Orozco, Ana Lucila; García Villalba, Luis Javier; Kim, Tai-Hoon

    2017-05-22

    In this work, an ACO routing protocol for mobile ad hoc networks based on AntHocNet is specified. Like its predecessor, this new protocol, called AntOR, is hybrid in the sense that it contains elements from both reactive and proactive routing. Specifically, it combines a reactive route setup process with a proactive route maintenance and improvement process. Key aspects of the AntOR protocol are the disjoint-link and disjoint-node routes, the separation between the regular pheromone and the virtual pheromone in the diffusion process, and the exploration of routes taking into consideration the number of hops in the best routes. In this work, a family of ACO routing protocols based on AntOR is also specified. These protocols are based on successive protocol refinements. We also present a parallelized version of AntOR that we call PAntOR. Using multiprocessor programming based on shared memory, PAntOR allows tasks to run in parallel using threads. This parallelization is applicable in the route setup phase, the route local repair process, and link failure notification. In addition, a variant of PAntOR with more than one interface, which we call PAntOR-MI (PAntOR-Multiple Interface), is specified. This approach parallelizes the sending of broadcast messages per interface through threads.

  2. Protocol for the combined immunosuppression & radiotherapy in thyroid eye disease (CIRTED trial: A multi-centre, double-masked, factorial randomised controlled trial

    Directory of Open Access Journals (Sweden)

    Kingston Laura

    2008-01-01

    Background Medical management of thyroid eye disease remains controversial due to a paucity of high-quality evidence on long-term treatment outcomes. Glucocorticoids are known to be effective initially but have significant side effects with long-term use, and recrudescence can occur on cessation. Current evidence is conflicting on the efficacy of radiotherapy and non-steroid systemic immunosuppression, and the majority of previous studies have been retrospective, uncontrolled, small or poorly designed. The Combined Immunosuppression and Radiotherapy in Thyroid Eye Disease (CIRTED) trial was designed to investigate the efficacy of radiotherapy and azathioprine in combination with a standard course of oral prednisolone in patients with active thyroid eye disease. Methods/design Patients with active thyroid eye disease will be randomised to receive (i) azathioprine or oral placebo and (ii) radiotherapy or sham radiotherapy in this multi-centre, factorial randomised controlled trial. The primary outcome is improvement in disease severity (assessed using a composite binary measure) at 12 months, and secondary end-points include quality-of-life scores and health economic measures. Discussion The CIRTED trial is the first study to evaluate the role of radiotherapy and azathioprine as part of a long-term, combination immunosuppressive treatment regime for thyroid eye disease. It will provide evidence for the role of radiotherapy and prolonged immunosuppression in the management of this condition, as well as pilot data on their use in combination. We have paid particular attention in the trial design to establishing (a) robust placebo controls and masking protocols which are effective and safe for both radiotherapy and the systemic administration of an antiproliferative drug; (b) effective inclusion and exclusion criteria to select for active disease; and (c) pragmatic outcome measures. Trial registration Current controlled trials

  3. A graph algebra for scalable visual analytics.

    Science.gov (United States)

    Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V

    2012-01-01

    Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increased data demands on computing require redesigning VA tools to consider performance and reliability in the context of analysis of exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.
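
    The two atomic operators named above can be illustrated on a toy attributed graph; the paper's algebra is considerably richer, and the data structures below are only an assumption for the example.

```python
# Toy illustration of the two atomic operators named above, selection and
# aggregation, on a small attributed graph; the paper's algebra is richer.
from collections import defaultdict

nodes = {1: {"type": "person", "age": 34}, 2: {"type": "person", "age": 51},
         3: {"type": "org"}, 4: {"type": "person", "age": 29}}
edges = [(1, 3), (2, 3), (4, 3), (1, 2)]

def select(nodes, edges, predicate):
    """Keep only nodes satisfying the predicate, and the edges between them."""
    keep = {n: a for n, a in nodes.items() if predicate(a)}
    return keep, [(u, v) for u, v in edges if u in keep and v in keep]

def aggregate(nodes, edges, key):
    """Collapse nodes sharing the same key into super-nodes, counting merged edges."""
    group = {n: key(a) for n, a in nodes.items()}
    super_edges = defaultdict(int)
    for u, v in edges:
        if group[u] != group[v]:
            super_edges[(group[u], group[v])] += 1
    return set(group.values()), dict(super_edges)

print(select(nodes, edges, lambda a: a.get("type") == "person"))
print(aggregate(nodes, edges, key=lambda a: a.get("type")))
```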

  4. Evaluation of the analytic performance of laboratories: inter-laboratorial study of the spectroscopy of atomic absorption

    International Nuclear Information System (INIS)

    Wong Wong, S. M.

    1996-01-01

    The author carried out an inter-laboratory study with the participation of 18 national laboratories equipped with atomic absorption spectrophotometers, to evaluate the methods of analysis of lead, sodium, potassium, calcium, magnesium, zinc, copper, manganese and iron at the mg/l level. The samples, distributed to the laboratories in four rounds, were prepared from primary standards and deionized, distilled water. The study evaluated their homogeneity and stability and verified their concentration using Inductively Coupled Plasma (ICP) emission spectrometry as the reference method. The characteristics of analytical performance were obtained by applying the ASTM E 691 standard, and the analytical performance was evaluated using the harmonized protocol of the International Union of Pure and Applied Chemistry (IUPAC). The study found that 29% of the laboratories had a satisfactory analytical performance, 9% a questionable performance and 62% an unsatisfactory performance, according to the IUPAC criteria. The values of the performance characteristics of the methods show that there is no intercomparability between the laboratories, which is attributed to the different analytical methodologies used. (S. Grainger)
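
    The IUPAC harmonized proficiency-testing approach mentioned here is conventionally expressed through z-scores. The snippet below is a generic illustration of that scoring, not the study's actual data: the laboratory results, assigned value and sigma_p are hypothetical numbers.

    ```python
    def z_score(result, assigned_value, sigma_p):
        """Proficiency z-score: z = (x - X) / sigma_p, where X is the assigned value
        (here the ICP reference) and sigma_p the standard deviation for proficiency
        assessment."""
        return (result - assigned_value) / sigma_p

    def classify(z):
        # Conventional interpretation bands used with harmonized protocols.
        if abs(z) <= 2:
            return "satisfactory"
        if abs(z) < 3:
            return "questionable"
        return "unsatisfactory"

    # Hypothetical lead results (mg/l) from three laboratories against an ICP
    # reference value of 1.00 mg/l and sigma_p = 0.05 mg/l.
    for lab, x in {"Lab 1": 1.02, "Lab 2": 1.13, "Lab 3": 1.21}.items():
        z = z_score(x, 1.00, 0.05)
        print(lab, round(z, 1), classify(z))
    ```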

  5. A Statistical Mechanics Approach to Approximate Analytical Bootstrap Averages

    DEFF Research Database (Denmark)

    Malzahn, Dorthe; Opper, Manfred

    2003-01-01

    We apply the replica method of Statistical Physics combined with a variational method to the approximate analytical computation of bootstrap averages for estimating the generalization error. We demonstrate our approach on regression with Gaussian processes and compare our results with averages...

  6. Low-dose X-ray computed tomography image reconstruction with a combined low-mAs and sparse-view protocol.

    Science.gov (United States)

    Gao, Yang; Bian, Zhaoying; Huang, Jing; Zhang, Yunwan; Niu, Shanzhou; Feng, Qianjin; Chen, Wufan; Liang, Zhengrong; Ma, Jianhua

    2014-06-16

    To realize low-dose imaging in X-ray computed tomography (CT) examinations, lowering the milliampere-seconds (low-mAs) or reducing the required number of projection views (sparse-view) per rotation around the body has been widely studied as an easy and effective approach. In this study, we focus on low-dose CT image reconstruction from sinograms acquired with a combined low-mAs and sparse-view protocol and propose a two-step image reconstruction strategy. Specifically, to suppress significant statistical noise in the noisy and insufficient sinograms, an adaptive sinogram restoration (ASR) method is first proposed with consideration of the statistical property of sinogram data, and then, to further acquire a high-quality image, a total variation based projection onto convex sets (TV-POCS) method is adopted with a slight modification. For simplicity, the present reconstruction strategy is termed "ASR-TV-POCS." To evaluate the present ASR-TV-POCS method, both qualitative and quantitative studies were performed on a physical phantom. Experimental results have demonstrated that the present ASR-TV-POCS method can achieve promising gains over other existing methods in terms of noise reduction, contrast-to-noise ratio, and edge detail preservation.
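
    As a rough illustration of the second step only (the TV-POCS idea of alternating a data-consistency projection with total-variation descent), the sketch below runs on a tiny synthetic system; it omits the adaptive sinogram restoration step and is not the authors' ASR-TV-POCS implementation. The system matrix, step sizes and iteration counts are arbitrary assumptions.

    ```python
    import numpy as np

    def art_pass(x, A, b, relax=1.0):
        """One ART sweep: project x towards each hyperplane {x : a_i . x = b_i} in turn."""
        for a_i, b_i in zip(A, b):
            x = x + relax * (b_i - a_i @ x) / (a_i @ a_i) * a_i
        return x

    def tv_gradient(img, eps=1e-8):
        """Gradient of a smoothed total-variation term for a 2-D image."""
        dx = np.diff(img, axis=1, append=img[:, -1:])
        dy = np.diff(img, axis=0, append=img[-1:, :])
        mag = np.sqrt(dx ** 2 + dy ** 2 + eps)
        div = np.diff(dx / mag, axis=1, prepend=0.0) + np.diff(dy / mag, axis=0, prepend=0.0)
        return -div

    rng = np.random.default_rng(0)
    n = 8                                    # 8x8 toy image
    truth = np.zeros((n, n)); truth[2:6, 2:6] = 1.0
    A = rng.random((40, n * n))              # stand-in for an under-sampled system matrix
    b = A @ truth.ravel()                    # "sinogram" (noise-free here)

    x = np.zeros(n * n)
    for _ in range(50):
        x = art_pass(x, A, b)                        # data-consistency step (POCS projection)
        img = x.reshape(n, n)
        for _ in range(5):
            img = img - 0.02 * tv_gradient(img)      # total-variation descent step
        x = np.clip(img, 0.0, None).ravel()          # non-negativity constraint
    print(np.round(x.reshape(n, n), 2))
    ```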

  7. Self-Adaptive Contention Aware Routing Protocol for Intermittently Connected Mobile Networks

    KAUST Repository

    Elwhishi, Ahmed; Ho, Pin-Han; Naik, K.; Shihada, Basem

    2013-01-01

    This paper introduces a novel multicopy routing protocol, called Self-Adaptive Utility-based Routing Protocol (SAURP), for Delay Tolerant Networks (DTNs) that are possibly composed of a vast number of devices in miniature such as smart phones of heterogeneous capacities in terms of energy resources and buffer spaces. SAURP is characterized by the ability of identifying potential opportunities for forwarding messages to their destinations via a novel utility function-based mechanism, in which a suite of environment parameters, such as wireless channel condition, nodal buffer occupancy, and encounter statistics, are jointly considered. Thus, SAURP can reroute messages around nodes experiencing high-buffer occupancy, wireless interference, and/or congestion, while taking a considerably small number of transmissions. The developed utility function in SAURP is proved to be able to achieve optimal performance, which is further analyzed via a stochastic modeling approach. Extensive simulations are conducted to verify the developed analytical model and compare the proposed SAURP with a number of recently reported encounter-based routing approaches in terms of delivery ratio, delivery delay, and the number of transmissions required for each message delivery. The simulation results show that SAURP outperforms all the counterpart multicopy encounter-based routing protocols considered in the study.
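
    A minimal sketch of the utility-based forwarding decision described above is given below; the weighting of encounter statistics, buffer occupancy and channel condition, and the forwarding margin, are assumptions for illustration and not SAURP's actual utility function.

    ```python
    def utility(encounter_prob, buffer_free_ratio, channel_quality,
                w_encounter=0.6, w_buffer=0.25, w_channel=0.15):
        """Combine encounter statistics, buffer occupancy and channel condition
        into a single score in [0, 1] (weights are assumed, not from the paper)."""
        return (w_encounter * encounter_prob
                + w_buffer * buffer_free_ratio
                + w_channel * channel_quality)

    def should_forward(my_state, peer_state, dest, margin=0.1):
        # Hand over a message copy only if the peer looks noticeably better placed.
        return utility(**peer_state[dest]) > utility(**my_state[dest]) + margin

    me   = {"D": dict(encounter_prob=0.2, buffer_free_ratio=0.9, channel_quality=0.8)}
    peer = {"D": dict(encounter_prob=0.7, buffer_free_ratio=0.4, channel_quality=0.6)}
    print(should_forward(me, peer, "D"))   # True: the peer meets destination D far more often
    ```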

  8. Self-Adaptive Contention Aware Routing Protocol for Intermittently Connected Mobile Networks

    KAUST Repository

    Elwhishi, Ahmed

    2013-07-01

    This paper introduces a novel multicopy routing protocol, called Self-Adaptive Utility-based Routing Protocol (SAURP), for Delay Tolerant Networks (DTNs) that are possibly composed of a vast number of devices in miniature such as smart phones of heterogeneous capacities in terms of energy resources and buffer spaces. SAURP is characterized by the ability of identifying potential opportunities for forwarding messages to their destinations via a novel utility function-based mechanism, in which a suite of environment parameters, such as wireless channel condition, nodal buffer occupancy, and encounter statistics, are jointly considered. Thus, SAURP can reroute messages around nodes experiencing high-buffer occupancy, wireless interference, and/or congestion, while taking a considerably small number of transmissions. The developed utility function in SAURP is proved to be able to achieve optimal performance, which is further analyzed via a stochastic modeling approach. Extensive simulations are conducted to verify the developed analytical model and compare the proposed SAURP with a number of recently reported encounter-based routing approaches in terms of delivery ratio, delivery delay, and the number of transmissions required for each message delivery. The simulation results show that SAURP outperforms all the counterpart multicopy encounter-based routing protocols considered in the study.

  9. Validation of Analytical Damping Ratio by Fatigue Stress Limit

    Science.gov (United States)

    Foong, Faruq Muhammad; Chung Ket, Thein; Beng Lee, Ooi; Aziz, Abdul Rashid Abdul

    2018-03-01

    The optimisation process of a vibration energy harvester is usually restricted to experimental approaches due to the lack of an analytical equation to describe the damping of a system. This study derives an analytical equation that describes the first-mode damping ratio of a clamp-free cantilever beam under harmonic base excitation by combining the transverse equation of motion of the beam with the damping-stress equation. This equation, as opposed to other common damping determination methods, is independent of experimental inputs or finite element simulations and can be solved using a simple iterative convergence method. The derived equation was determined to be correct for cases when the maximum bending stress in the beam is below the fatigue limit stress of the beam. However, an increasing trend in the error between the experimental and the analytical results was observed at high stress levels. Hence, the fatigue limit stress was used as a parameter to define the validity of the analytical equation.
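
    The "simple iterative convergence method" mentioned above can be illustrated generically as a fixed-point loop in which the damping ratio and the bending stress are updated in turn until they agree. The closed-form relations used below are placeholders chosen only so the loop runs; the paper's actual damping-stress and transverse-vibration equations are not reproduced here.

    ```python
    def stress_from_damping(zeta, excitation):
        # Placeholder: resonant response amplitude (and hence stress) ~ 1/(2*zeta).
        return excitation / (2.0 * zeta)

    def damping_from_stress(stress):
        # Placeholder linear damping-stress law; purely illustrative, not physical.
        return 1e-3 + 1e-6 * stress

    def solve_damping_ratio(excitation, tol=1e-8, max_iter=200):
        zeta = 0.01                                   # initial guess
        for _ in range(max_iter):
            stress = stress_from_damping(zeta, excitation)
            zeta_new = damping_from_stress(stress)
            if abs(zeta_new - zeta) < tol:
                return zeta_new                       # converged self-consistent value
            zeta = zeta_new
        raise RuntimeError("did not converge")

    print(solve_damping_ratio(excitation=10.0))
    ```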

  10. Improving Anomaly Detection for Text-Based Protocols by Exploiting Message Structures

    Directory of Open Access Journals (Sweden)

    Christian M. Mueller

    2010-12-01

    Full Text Available Service platforms using text-based protocols need to be protected against attacks. Machine-learning algorithms with pattern matching can be used to detect even previously unknown attacks. In this paper, we present an extension to known Support Vector Machine (SVM based anomaly detection algorithms for the Session Initiation Protocol (SIP. Our contribution is to extend the amount of different features used for classification (feature space by exploiting the structure of SIP messages, which reduces the false positive rate. Additionally, we show how combining our approach with attribute reduction significantly improves throughput.
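
    The idea of exploiting message structure to enlarge the feature space can be sketched with a one-class SVM over per-header character n-grams, as below. This is a hedged illustration, not the authors' feature set or training data; the SIP messages and the library choice (scikit-learn) are assumptions.

    ```python
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.svm import OneClassSVM

    def structured_features(msg):
        """Split a SIP message into request line and selected headers so that
        n-grams from different structural parts do not mix."""
        lines = msg.split("\r\n")
        headers = {l.split(":", 1)[0].lower(): l.split(":", 1)[1].strip()
                   for l in lines[1:] if ":" in l}
        parts = ["REQ " + lines[0]]
        for name in ("via", "from", "to", "contact", "user-agent"):
            parts.append(f"{name.upper()} {headers.get(name, '')}")
        return " | ".join(parts)

    benign = [
        "INVITE sip:bob@example.com SIP/2.0\r\nVia: SIP/2.0/UDP pc.example.com\r\n"
        "From: alice <sip:alice@example.com>\r\nTo: bob <sip:bob@example.com>\r\n",
        "REGISTER sip:example.com SIP/2.0\r\nVia: SIP/2.0/UDP pc.example.com\r\n"
        "From: alice <sip:alice@example.com>\r\nTo: alice <sip:alice@example.com>\r\n",
    ]
    suspect = ["INVITE sip:bob@example.com SIP/2.0\r\nVia: " + "A" * 400 + "\r\n"]

    vec = CountVectorizer(analyzer="char", ngram_range=(2, 3))
    X_train = vec.fit_transform([structured_features(m) for m in benign])
    clf = OneClassSVM(nu=0.1, kernel="rbf", gamma="scale").fit(X_train)

    X_test = vec.transform([structured_features(m) for m in suspect])
    print(clf.predict(X_test))   # -1 flags a message as anomalous, +1 as conforming
    ```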

  11. Optimized energy-delay sub-network routing protocol development and implementation for wireless sensor networks

    International Nuclear Information System (INIS)

    Fonda, James W; Zawodniok, Maciej; Jagannathan, S; Watkins, Steve E

    2008-01-01

    The development and the implementation issues of a reactive optimized energy-delay sub-network routing (OEDSR) protocol for wireless sensor networks (WSN) are introduced and its performance is contrasted with the popular ad hoc on-demand distance vector (AODV) routing protocol. Analytical results illustrate the performance of the proposed OEDSR protocol, while experimental results utilizing a hardware testbed under various scenarios demonstrate improvements in energy efficiency of the OEDSR protocol. A hardware platform constructed at the University of Missouri-Rolla (UMR), now the Missouri University of Science and Technology (MST), based on the Generation 4 Smart Sensor Node (G4-SSN) prototyping platform is also described. Performance improvements are shown in terms of end-to-end (E2E) delay, throughput, route-set-up time and drop rates, and energy usage is given for three topologies, including a mobile topology. Additionally, results from the hardware testbed provide valuable lessons for network deployments. Under testing, OEDSR provides a factor of ten improvement in the energy used in the routing session and extends network lifetime compared to AODV. Depletion experiments show that the time until the first node failure is extended by a factor of three and that network lifetime is extended by 6.7%.

  12. Design and Analysis of Optimization Algorithms to Minimize Cryptographic Processing in BGP Security Protocols.

    Science.gov (United States)

    Sriram, Vinay K; Montgomery, Doug

    2017-07-01

    The Internet is subject to attacks due to vulnerabilities in its routing protocols. One proposed approach to attain greater security is to cryptographically protect network reachability announcements exchanged between Border Gateway Protocol (BGP) routers. This study proposes and evaluates the performance and efficiency of various optimization algorithms for validation of digitally signed BGP updates. In particular, this investigation focuses on the BGPSEC (BGP with SECurity extensions) protocol, currently under consideration for standardization in the Internet Engineering Task Force. We analyze three basic BGPSEC update processing algorithms: Unoptimized, Cache Common Segments (CCS) optimization, and Best Path Only (BPO) optimization. We further propose and study cache management schemes to be used in conjunction with the CCS and BPO algorithms. The performance metrics used in the analyses are: (1) routing table convergence time after BGPSEC peering reset or router reboot events and (2) peak-second signature verification workload. Both analytical modeling and detailed trace-driven simulation were performed. Results show that the BPO algorithm is 330% to 628% faster than the unoptimized algorithm for routing table convergence in a typical Internet core-facing provider edge router.
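
    The Cache Common Segments idea can be illustrated, in a very reduced form, as memoizing per-segment verification results so that segments shared by successive updates are verified only once. The sketch below uses a fake hash-based "signature" purely to stay self-contained; real BGPSEC validation uses ECDSA over the signed path data, and the actual CCS algorithm and cache management schemes are more involved.

    ```python
    import hashlib

    def fake_signature(segment_bytes):
        # Placeholder "signature" so the example is self-contained.
        return hashlib.sha256(segment_bytes).hexdigest()[:8]

    def expensive_verify(segment_bytes, signature):
        # Stand-in for an ECDSA verification; in practice this is the costly step.
        return fake_signature(segment_bytes) == signature

    class SegmentVerifier:
        """Memoizes per-segment verification results, in the spirit of caching
        common path segments across successive UPDATEs (not the paper's exact
        CCS algorithm)."""

        def __init__(self):
            self.cache = {}
            self.crypto_calls = 0

        def verify_path(self, signed_segments):
            for seg in signed_segments:          # seg = (segment_bytes, signature)
                if seg not in self.cache:
                    self.crypto_calls += 1
                    self.cache[seg] = expensive_verify(*seg)
                if not self.cache[seg]:
                    return False
            return True

    seg_a = (b"AS65001 AS65002 10.0.0.0/8", fake_signature(b"AS65001 AS65002 10.0.0.0/8"))
    seg_b = (b"AS65002 AS65003 10.0.0.0/8", fake_signature(b"AS65002 AS65003 10.0.0.0/8"))

    v = SegmentVerifier()
    v.verify_path([seg_a])             # first UPDATE: 1 verification
    v.verify_path([seg_a, seg_b])      # longer path sharing seg_a: only 1 new verification
    print(v.crypto_calls)              # 2
    ```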

  13. Ultrasound assisted nucleation and growth characteristics of glycine polymorphs--a combined experimental and analytical approach.

    Science.gov (United States)

    Renuka Devi, K; Raja, A; Srinivasan, K

    2015-05-01

    For the first time, the effect of ultrasound in the diagnostic frequency range of 1-10 MHz on the nucleation and growth characteristics of glycine has been explored. The investigation employing the ultrasonic interferometer was carried out at a constant insonation time over a wide range of relative supersaturation from σ=-0.09 to 0.76 in the solution. Ultrasound promotes only α nucleation and completely inhibits both the β and γ nucleation in the system. The propagation of ultrasound-assisted mass transport facilitates nucleation even at very low supersaturation levels in the solution. The presence of ultrasound exhibits a profound effect on nucleation and growth characteristics in terms of a decrease in induction period, an increase in nucleation rate and a decrease in crystal size compared with its absence in the solution. With an increase in the frequency of ultrasound, a further decrease in induction period, increase in nucleation rate and decrease in the size of the crystal is noticed even at the same relative supersaturation levels. The increase in the nucleation rate explains the combined dominating effects of both the ultrasound frequency and the supersaturation in the solution. Analytically, the nucleation parameters of the nucleated polymorph have been deduced at different ultrasonic frequencies based on the classical nucleation theory and correlations with the experimental results have been obtained. Structural affirmation of the nucleated polymorph has been ascertained by powder X-ray diffraction. Copyright © 2014 Elsevier B.V. All rights reserved.

  14. Prepubertal gonadectomy in cats: different injectable anaesthetic combinations and comparison with gonadectomy at traditional age.

    Science.gov (United States)

    Porters, Nathalie; de Rooster, Hilde; Moons, Christel P H; Duchateau, Luc; Goethals, Klara; Bosmans, Tim; Polis, Ingeborgh

    2015-06-01

    Anaesthetic and analgesic effects of three different injectable anaesthetic combinations for prepubertal gonadectomy (PPG) in cats were studied. One anaesthetic protocol was compared with a similar one for gonadectomy at traditional age (TAG). Kittens were randomly assigned to PPG or TAG. For PPG, three different protocols were compared: (1) intramuscular (IM) administration of 60 μg/kg dexmedetomidine plus 20 μg/kg buprenorphine followed by an IM injection of the anaesthetic agent (20 mg/kg ketamine) (DB-IM protocol); (2) oral transmucosal (OTM) administration of 80 μg/kg dexmedetomidine plus 20 μg/kg buprenorphine followed by an IM injection of 20 mg/kg ketamine combined with 20 µg/kg dexmedetomidine (DB-OTM protocol); (3) IM injection of a 40 μg/kg medetomidine-20 μg/kg buprenorphine-20 mg/kg ketamine combination (MBK-IM protocol). For TAG, a DB-IM protocol was used, but with different doses for dexmedetomidine (40 μg/kg) and ketamine (5 mg/kg). All cats (PPG and TAG) received a non-steroidal anti-inflammatory before surgery. Anaesthetic and analgesic effects were assessed pre- and postoperatively (until 6 h). Cumulative logit, linear and logistic regression models were used for statistical analysis. Compared with the DB-OTM protocol, the DB-IM and MBK-IM protocols provided better anaesthesia with fewer adverse effects in PPG cats. Postoperative pain was not significantly different between anaesthetic protocols. PPG and TAG cats anaesthetised with the two DB-IM protocols differed significantly only for sedation and pain scores, but sedation and pain scores were generally low. Although there were no anaesthesia-related mortalities in the present study and all anaesthetic protocols for PPG in cats provided a surgical plane of anaesthesia and analgesia up to 6 h postoperatively, our findings were in favour of the intramuscular (DB-IM and MBK-IM) protocols. © ISFM and AAFP 2014.

  15. Parent Training for Families With a Child With ASD: A Naturalistic Systemic Behavior Analytic Model

    Directory of Open Access Journals (Sweden)

    Angeliki Gena

    2016-03-01

    Full Text Available The great challenges that the treatment of children with Autism Spectrum Disorder (ASD) presents to therapists and to parents alike arise not only from the severity of this disability, but also from two other factors: the continuously increasing prevalence of ASD and the serious financial restraints imposed by the recent economic hardships that the Western World faces. Thus, the need for parent-training practices is more prevalent than ever. The purpose of the present study was to identify parent-training practices that encompass child-related, parent-related and parent-child-interaction related variables as a means of addressing the difficulties that arise during parent-child interactions in a systemic and systematic way. Complex phenomena, such as the parent-child interaction, need to be treated with multi-focused interventions that produce generalized, systemic outcomes that are of clinical or social significance. The changes achieved in this intervention, which was conducted within a naturalistic context, were multiple and systemic since they involve child-related (e.g., on-task behavior), parent-related (e.g., provision of reinforcement), and parent-child-interaction related variables (e.g., joint attention). Those changes were obtained through the use of behavior analytic techniques, such as modeling and systematic, direct parent training. Most importantly, those changes were spread to response categories for which training was not provided, generalized to novel settings and maintained through time. We may conclude that the combination of systemic and behavior-analytic approaches and methodologies may provide a highly beneficial perspective toward designing parent-training research protocols that may also lead to improved clinical practices.

  16. Effects of A Combined Treatment Protocol in Chronic Regional Pain Syndrome

    Directory of Open Access Journals (Sweden)

    Ali Asghar Jameh-Bozorgi

    2011-01-01

    Full Text Available Objective: Chronic regional pain syndrome (CRPS) is one of the most important and most severe disorders of the peripheral nervous system, especially in the upper extremity. The aim of this study was to determine the effect of a combined rehabilitation program in the treatment of patients with CRPS type I. Materials & Methods: In this quasi-experimental before-after study, 20 patients with chronic regional pain syndrome were selected by simple sampling, and their pain, range of motion, edema and muscular strength were examined and recorded before the intervention. Patients then underwent a combined treatment program that included modalities from physical and occupational therapy, attending the clinic for 20 therapeutic sessions at one-day intervals. Finally, data were analyzed using the paired t-test. Results: After treatment, pain and edema were decreased and range of motion and grip strength were increased significantly (P < 0.05). Conclusion: The current study demonstrated that early, combined physical and occupational therapy is effective in the treatment of patients suffering from CRPS type I. This combined program can relieve pain and edema and increase ROM and grip strength.

  17. ISS protocol for EPR tooth dosimetry

    International Nuclear Information System (INIS)

    Onori, S.; Aragno, D.; Fattibene, P.; Petetti, E.; Pressello, M.C.

    2000-01-01

    The accuracy in Electron Paramagnetic Resonance (EPR) dose reconstruction with tooth enamel is affected by sample preparation, dosimetric signal amplitude evaluation and unknown dose estimation. Worldwide efforts in the field of EPR dose reconstruction with tooth enamel are focused on the optimization of the three mentioned steps in dose assessment. In the present work, the protocol implemented at ISS in the framework of the European Community Nuclear Fission Safety project 'Dose Reconstruction' is presented. A combined mechanical-chemical procedure for ground enamel sample preparation is used. The signal intensity evaluation is carried out with a powder spectra simulation program. Finally, the unknown dose is evaluated individually for each sample with the additive dose method. The unknown dose is obtained by subtracting a mean native dose from the back-extrapolated dose. As an example of the capability of the ISS protocol in unknown dose evaluation, the results obtained in the framework of the 2nd International Intercomparison on EPR tooth enamel dosimetry are reported.

  18. Web Analytics

    Science.gov (United States)

    EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.

  19. Local appearance features for robust MRI brain structure segmentation across scanning protocols

    DEFF Research Database (Denmark)

    Achterberg, H.C.; Poot, Dirk H. J.; van der Lijn, Fedde

    2013-01-01

    Segmentation of brain structures in magnetic resonance images is an important task in neuro image analysis. Several papers on this topic have shown the benefit of supervised classification based on local appearance features, often combined with atlas-based approaches. These methods require...... a representative annotated training set and therefore often do not perform well if the target image is acquired on a different scanner or with a different acquisition protocol than the training images. Assuming that the appearance of the brain is determined by the underlying brain tissue distribution...... with substantially different imaging protocols and on different scanners. While a combination of conventional appearance features trained on data from a different scanner with multiatlas segmentation performed poorly with an average Dice overlap of 0.698, the local appearance model based on the new acquisition...

  20. A New MAC Protocol with Pseudo-TDMA Behavior for Supporting Quality of Service in 802.11 Wireless LANs

    Directory of Open Access Journals (Sweden)

    2006-01-01

    Full Text Available A new medium access control (MAC) protocol is proposed for quality-of-service (QoS) support in wireless local area networks (WLAN). The protocol is an alternative to the recent enhancement 802.11e. A new priority policy provides the system with better performance by simulating time division multiple access (TDMA) functionality. Collisions are reduced and starvation of low-priority classes is prevented by a distributed admission control algorithm. The model performance is derived analytically, extending previous work on this topic. The results show that a better organization of resources is achieved through this scheme. Throughput analysis is verified with OPNET simulations.

  1. Design of a 1-chip IBM-3270 protocol handler

    NARCIS (Netherlands)

    Spaanenburg, L.

    1989-01-01

    The single-chip design of a 20 MHz IBM-3270 coax protocol handler in a conventional 3 μm CMOS process technology is discussed. The harmonious combination of CMOS circuit tricks and high-level design disciplines allows the 50k-transistor design to be compiled and optimized into a 35 mm² chip in 4

  2. Gaseous analytes of concern at Hanford Tank Farms. Topical report

    International Nuclear Information System (INIS)

    1996-01-01

    Large amounts of toxic and radioactive waste materials are stored in underground tanks at DOE sites. When the vapors in the tank headspaces vent to the open atmosphere, a potentially dangerous situation can occur for personnel in the area. An open-path atmospheric pollution monitor is being developed for DOE to monitor the open air space above these tanks. In developing this monitor it is important to know what hazardous gases are most likely to be found in dangerous concentrations. These gases are called the Analytes of Concern. At the present time, measurements in eight tanks have detected thirty-one analytes in at least two tanks and fifteen analytes in only one tank. In addition to these gases, carbon tetrachloride is considered to be an Analyte of Concern because it permeates the ground around the tanks. These analytes are described and ranked according to a Hazard Index which combines their vapor pressure, density, and approximate danger level. The top sixteen ranked analytes which have been detected in at least two tanks comprise an "Analytes of Concern Test List" for determining the system performance of the atmospheric pollution monitor under development. A preliminary examination of the infrared spectra, barring atmospheric interferences, indicates that the pollution monitor will detect all forty-seven analytes!
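
    The report's Hazard Index formula is not reproduced in this record, so the sketch below only illustrates the general idea of ranking analytes by a score that combines volatility, vapor density and an exposure threshold; the combination rule and the rough, order-of-magnitude property values in the table are assumptions for illustration, not figures from the report.

    ```python
    # Approximate, illustrative property values only (not from the topical report):
    # name: (vapor pressure [mmHg, ~25 C], vapor density [air = 1], exposure limit [ppm])
    analytes = {
        "ammonia":              (7500, 0.6, 25),
        "carbon tetrachloride": (115,  5.3, 10),
        "acetonitrile":         (73,   1.4, 40),
    }

    def hazard_index(vapor_pressure, vapor_density, exposure_limit_ppm):
        # Assumed form: more volatile and denser vapors with lower safe limits rank higher.
        return vapor_pressure * vapor_density / exposure_limit_ppm

    ranked = sorted(analytes, key=lambda a: hazard_index(*analytes[a]), reverse=True)
    print(ranked)   # e.g. ['ammonia', 'carbon tetrachloride', 'acetonitrile']
    ```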

  3. The SHIP: A SIP to HTTP Interaction Protocol

    Science.gov (United States)

    Zeiß, Joachim; Gabner, Rene; Bessler, Sandford; Happenhofer, Marco

    IMS is capable of providing a wide range of services. As a result, terminal software becomes more and more complex to deliver network intelligence to user applications. Currently mobile terminal software needs to be permanently updated so that the latest network services and functionality can be delivered to the user. In the Internet, browser based user interfaces assure that an interface is made available to the user which offers the latest services in the net immediately. Our approach combines the benefits of the Session Initiation Protocol (SIP) and those of the HTTP protocol to bring the same type of user interfacing to IMS. SIP (IMS) realizes authentication, session management, charging and Quality of Service (QoS), HTTP provides access to Internet services and allows the user interface of an application to run on a mobile terminal while processing and orchestration is done on the server. A SHIP enabled IMS client only needs to handle data transport and session management via SIP, HTTP and RTP and render streaming media, HTML and Javascript. SHIP allows new kinds of applications, which combine audio, video and data within a single multimedia session.

  4. [Impact of the implementation of a protocol for the adequate and safe use of tumor markers].

    Science.gov (United States)

    Mérida de la Torre, Francisco Javier; Moreno Campoy, Elvira Eva; Martos Crespo, Francisco

    2015-12-21

    Improper clinical use of tumor markers (TM) may cause unnecessary additional studies to confirm or refute a positive result. After observing 2 adverse events due to incorrect use of TM, a protocol for improving their use was implemented. The objective of this study was to determine the impact of the implementation of this protocol. This was a pre-post intervention study in which requests for carcinoembryonic antigen, CA15.3, CA19.9 and CA125 over one year were analyzed in patients who were not being followed up for a known neoplasia. A protocol was implemented and physicians were trained following the recommendations of the European Group on Tumor Markers, limiting the use of TM to monitoring of the disease and its treatment. The study period was 2010-2014. The total number of requests dropped by 50.81%, and the percentage of appropriate TM requests increased each year, from 31.03% to 77.91%. The implementation of a protocol for the proper use of TM contributes to their safer use, avoiding incorrect studies and unnecessary and potentially harmful tests for the patient. Copyright © 2015 Elsevier España, S.L.U. All rights reserved.

  5. Performance Analysis of a Cluster-Based MAC Protocol for Wireless Ad Hoc Networks

    Directory of Open Access Journals (Sweden)

    Jesús Alonso-Zárate

    2010-01-01

    Full Text Available An analytical model to evaluate the non-saturated performance of the Distributed Queuing Medium Access Control Protocol for Ad Hoc Networks (DQMAN) in single-hop networks is presented in this paper. DQMAN comprises a spontaneous, temporary, and dynamic clustering mechanism integrated with a near-optimum distributed queuing Medium Access Control (MAC) protocol. Clustering is executed in a distributed manner using a mechanism inspired by the Distributed Coordination Function (DCF) of the IEEE 802.11. Once a station seizes the channel, it becomes the temporary clusterhead of a spontaneous cluster and it coordinates the peer-to-peer communications between the clustermembers. Within each cluster, a near-optimum distributed queuing MAC protocol is executed. The theoretical performance analysis of DQMAN in single-hop networks under non-saturation conditions is presented in this paper. The approach integrates the analysis of the clustering mechanism into the MAC layer model. To the best of the authors' knowledge, this approach is novel in the literature. In addition, the performance of an ad hoc network using DQMAN is compared to that obtained when using the DCF of the IEEE 802.11, as a benchmark reference.

  6. Antioxidant phytochemicals in fresh produce: exploitation of genotype variation and advancements in analytical protocols

    Science.gov (United States)

    Manganaris, George A.; Goulas, Vlasios; Mellidou, Ifigeneia; Drogoudi, Pavlina

    2017-12-01

    Horticultural commodities (fruit and vegetables) are the major dietary source of several bioactive compounds of high nutraceutical value for humans, including polyphenols, carotenoids and vitamins. The aim of the current review was dual. Firstly, towards the eventual enhancement of horticultural crops with bio-functional compounds, the natural genetic variation in antioxidants found in different species and cultivar/genotypes is underlined. Notably, some landraces and/or traditional cultivars have been characterized by substantially higher phytochemical content, i.e. small tomato of Santorini island (cv. ‘Tomataki Santorinis’) possesses appreciably high amounts of ascorbic acid. The systematic screening of key bioactive compounds in a wide range of germplasm for the identification of promising genotypes and the restoration of key gene fractions from wild species and landraces may help in reducing the loss of agro-biodiversity, creating a healthier ‘gene pool’ as the basis of future adaptation. Towards this direction, large scale comparative studies in different cultivars/genotypes of a given species provide useful insights about the ones of higher nutritional value. Secondly, the advancements in the employment of analytical techniques to determine the antioxidant potential through a convenient, easy and fast way are outlined. Such analytical techniques include electron paramagnetic resonance (EPR) and infrared (IR) spectroscopy, electrochemical and chemometric methods, flow injection analysis (FIA), optical sensors and high resolution screening (HRS). Taking into consideration that fruits and vegetables are complex mixtures of water- and lipid-soluble antioxidants, the exploitation of chemometrics to develop “omics” platforms (i.e. metabolomics, foodomics) is a promising tool for researchers to decode and/or predict antioxidant activity of fresh produce. For industry, the use of cheap and rapid optical sensors and IR spectroscopy is recommended to

  7. Antioxidant Phytochemicals in Fresh Produce: Exploitation of Genotype Variation and Advancements in Analytical Protocols

    Directory of Open Access Journals (Sweden)

    George A. Manganaris

    2018-02-01

    Full Text Available Horticultural commodities (fruit and vegetables) are the major dietary source of several bioactive compounds of high nutraceutical value for humans, including polyphenols, carotenoids and vitamins. The aim of the current review was dual. Firstly, toward the eventual enhancement of horticultural crops with bio-functional compounds, the natural genetic variation in antioxidants found in different species and cultivars/genotypes is underlined. Notably, some landraces and/or traditional cultivars have been characterized by substantially higher phytochemical content, i.e., small tomato of Santorini island (cv. “Tomataki Santorinis”) possesses appreciably high amounts of ascorbic acid (AsA). The systematic screening of key bioactive compounds in a wide range of germplasm for the identification of promising genotypes and the restoration of key gene fractions from wild species and landraces may help in reducing the loss of agro-biodiversity, creating a healthier “gene pool” as the basis of future adaptation. Toward this direction, large scale comparative studies in different cultivars/genotypes of a given species provide useful insights about the ones of higher nutritional value. Secondly, the advancements in the employment of analytical techniques to determine the antioxidant potential through a convenient, easy and fast way are outlined. Such analytical techniques include electron paramagnetic resonance (EPR) and infrared (IR) spectroscopy, electrochemical, and chemometric methods, flow injection analysis (FIA), optical sensors, and high resolution screening (HRS). Taking into consideration that fruits and vegetables are complex mixtures of water- and lipid-soluble antioxidants, the exploitation of chemometrics to develop “omics” platforms (i.e., metabolomics, foodomics) is a promising tool for researchers to decode and/or predict antioxidant activity of fresh produce. For industry, the use of optical sensors and IR spectroscopy is recommended to

  8. Treatment of Middle East Respiratory Syndrome with a combination of lopinavir-ritonavir and interferon-β1b (MIRACLE trial): study protocol for a randomized controlled trial.

    Science.gov (United States)

    Arabi, Yaseen M; Alothman, Adel; Balkhy, Hanan H; Al-Dawood, Abdulaziz; AlJohani, Sameera; Al Harbi, Shmeylan; Kojan, Suleiman; Al Jeraisy, Majed; Deeb, Ahmad M; Assiri, Abdullah M; Al-Hameed, Fahad; AlSaedi, Asim; Mandourah, Yasser; Almekhlafi, Ghaleb A; Sherbeeni, Nisreen Murad; Elzein, Fatehi Elnour; Memon, Javed; Taha, Yusri; Almotairi, Abdullah; Maghrabi, Khalid A; Qushmaq, Ismael; Al Bshabshe, Ali; Kharaba, Ayman; Shalhoub, Sarah; Jose, Jesna; Fowler, Robert A; Hayden, Frederick G; Hussein, Mohamed A

    2018-01-30

    It has been more than 5 years since the first case of Middle East Respiratory Syndrome coronavirus infection (MERS-CoV) was recorded, but no specific treatment has been investigated in randomized clinical trials. Results from in vitro and animal studies suggest that a combination of lopinavir/ritonavir and interferon-β1b (IFN-β1b) may be effective against MERS-CoV. The aim of this study is to investigate the efficacy of treatment with a combination of lopinavir/ritonavir and recombinant IFN-β1b provided with standard supportive care, compared to treatment with placebo provided with standard supportive care in patients with laboratory-confirmed MERS requiring hospital admission. The protocol is prepared in accordance with the SPIRIT (Standard Protocol Items: Recommendations for Interventional Trials) guidelines. Hospitalized adult patients with laboratory-confirmed MERS will be enrolled in this recursive, two-stage, group sequential, multicenter, placebo-controlled, double-blind randomized controlled trial. The trial is initially designed to include 2 two-stage components. The first two-stage component is designed to adjust sample size and determine futility stopping, but not efficacy stopping. The second two-stage component is designed to determine efficacy stopping and possibly readjustment of sample size. The primary outcome is 90-day mortality. This will be the first randomized controlled trial of a potential treatment for MERS. The study is sponsored by King Abdullah International Medical Research Center, Riyadh, Saudi Arabia. Enrollment for this study began in November 2016, and thirteen patients had been enrolled as of 24 January 2018. ClinicalTrials.gov, ID: NCT02845843. Registered on 27 July 2016.

  9. Portable abdomen radiography. Moving to thickness-based protocols

    International Nuclear Information System (INIS)

    Sanchez, Adrian A.; Reiser, Ingrid; Baxter, Tina; Zhang, Yue; Finkle, Joshua H.; Lu, Zheng Feng; Feinstein, Kate A.

    2018-01-01

    Default pediatric protocols on many digital radiography systems are configured based on patient age. However, age does not adequately characterize patient size, which is the principal determinant of proper imaging technique. Use of default pediatric protocols by inexperienced technologists can result in patient overexposure, inadequate image quality, or repeated examinations. To ensure diagnostic image quality at a well-managed patient radiation exposure by transitioning to thickness-based protocols for pediatric portable abdomen radiography. We aggregated patient thickness data, milliamperes (mAs), kilovoltage peak (kVp), exposure index (EI), source-to-detector distance, and grid use for all portable abdomen radiographs performed in our pediatric hospital in a database with a combination of automated and manual data collection techniques. We then analyzed the database and used it as the basis to construct thickness-based protocols with consistent image quality across varying patient thicknesses, as determined by the EI. Retrospective analysis of pediatric portable exams performed at our adult-focused hospitals demonstrated substantial variability in EI relative to our pediatric hospital. Data collection at our pediatric hospital over 4 months accumulated roughly 800 portable abdomen exams, which we used to develop a thickness-based technique chart. Through automated retrieval of data in our systems' digital radiography exposure logs and recording of patient abdomen thickness, we successfully developed thickness-based techniques for portable abdomen radiography. (orig.)

  10. Portable abdomen radiography. Moving to thickness-based protocols

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez, Adrian A.; Reiser, Ingrid; Baxter, Tina; Zhang, Yue; Finkle, Joshua H.; Lu, Zheng Feng; Feinstein, Kate A. [University of Chicago Medical Center, Department of Radiology, Chicago, IL (United States)

    2018-02-15

    Default pediatric protocols on many digital radiography systems are configured based on patient age. However, age does not adequately characterize patient size, which is the principal determinant of proper imaging technique. Use of default pediatric protocols by inexperienced technologists can result in patient overexposure, inadequate image quality, or repeated examinations. To ensure diagnostic image quality at a well-managed patient radiation exposure by transitioning to thickness-based protocols for pediatric portable abdomen radiography. We aggregated patient thickness data, milliamperes (mAs), kilovoltage peak (kVp), exposure index (EI), source-to-detector distance, and grid use for all portable abdomen radiographs performed in our pediatric hospital in a database with a combination of automated and manual data collection techniques. We then analyzed the database and used it as the basis to construct thickness-based protocols with consistent image quality across varying patient thicknesses, as determined by the EI. Retrospective analysis of pediatric portable exams performed at our adult-focused hospitals demonstrated substantial variability in EI relative to our pediatric hospital. Data collection at our pediatric hospital over 4 months accumulated roughly 800 portable abdomen exams, which we used to develop a thickness-based technique chart. Through automated retrieval of data in our systems' digital radiography exposure logs and recording of patient abdomen thickness, we successfully developed thickness-based techniques for portable abdomen radiography. (orig.)

  11. Protocol for physical assessment in patients with fibromyalgia syndrome.

    Science.gov (United States)

    dos Santos, Michele R; Moro, Claudia M C; Vosgerau, Dilmeire S R

    2014-01-01

    Fibromyalgia syndrome (FMS) is a chronic disease that causes pain and fatigue and has a negative impact on quality of life. Exercise helps to maintain physical fitness and directly influences the improvement of quality of life. The objective was to develop a protocol for health-related physical fitness assessment of patients with FMS with tests that are feasible and appropriate for this population. An exploratory and analytical literature review was performed to determine the tests used by the scientific community, searching the virtual library databases PubMed, Bireme, Banco de Teses e Dissertações da Capes and Biblioteca Digital Brasileira de Teses e Dissertações for publications between 1992 and 2012. A variety of tests was found; the following stood out by number of citations: Body Mass Index (BMI) and bioimpedance; 6-minute walk; handgrip strength (dynamometer, 1RM [Repetition Maximum]); sit-and-reach and shoulder flexibility; Foot Up and Go; and Flamingo balance. These are the tests that should make up the protocol for the physical evaluation of FMS patients, emphasizing their ease of use.

  12. BAVP: Blockchain-Based Access Verification Protocol in LEO Constellation Using IBE Keys

    Directory of Open Access Journals (Sweden)

    Songjie Wei

    2018-01-01

    Full Text Available LEO constellation has received intensive research attention in the field of satellite communication. The existing centralized authentication protocols traditionally used for MEO/GEO satellite networks cannot accommodate LEO satellites with frequent user connection switching. This paper proposes a fast and efficient access verification protocol named BAVP by combining identity-based encryption and blockchain technology. Two different key management schemes with IBE and blockchain, respectively, are investigated, which further enhance the authentication reliability and efficiency in LEO constellation. Experiments on OPNET simulation platform evaluate and demonstrate the effectiveness, reliability, and fast-switching efficiency of the proposed protocol. For LEO networks, BAVP surpasses the well-known existing solutions with significant advantages in both performance and scalability which are supported by theoretical analysis and simulation results.

  13. Toward Open Data Blockchain Analytics: A Bitcoin Perspective

    OpenAIRE

    McGinn, D.; McIlwraith, D; Guo, Y.

    2018-01-01

    Bitcoin is the first implementation of what has become known as a 'public permissionless' blockchain. Guaranteeing security and protocol conformity through its elegant combination of cryptographic assurances and game theoretic economic incentives, it permits censorship resistant public read-write access to its append-only blockchain database without the need for any mediating central authority. Not until its advent has such a trusted, transparent, comprehensive and granular data set of digita...

  14. Methods for CT automatic exposure control protocol translation between scanner platforms.

    Science.gov (United States)

    McKenney, Sarah E; Seibert, J Anthony; Lamba, Ramit; Boone, John M

    2014-03-01

    An imaging facility with a diverse fleet of CT scanners faces considerable challenges when propagating CT protocols with consistent image quality and patient dose across scanner makes and models. Although some protocol parameters can comfortably remain constant among scanners (eg, tube voltage, gantry rotation time), the automatic exposure control (AEC) parameter, which selects the overall mA level during tube current modulation, is difficult to match among scanners, especially from different CT manufacturers. Objective methods for converting tube current modulation protocols among CT scanners were developed. Three CT scanners were investigated, a GE LightSpeed 16 scanner, a GE VCT scanner, and a Siemens Definition AS+ scanner. Translation of the AEC parameters such as noise index and quality reference mAs across CT scanners was specifically investigated. A variable-diameter poly(methyl methacrylate) phantom was imaged on the 3 scanners using a range of AEC parameters for each scanner. The phantom consisted of 5 cylindrical sections with diameters of 13, 16, 20, 25, and 32 cm. The protocol translation scheme was based on matching either the volumetric CT dose index or image noise (in Hounsfield units) between two different CT scanners. A series of analytic fit functions, corresponding to different patient sizes (phantom diameters), were developed from the measured CT data. These functions relate the AEC metric of the reference scanner, the GE LightSpeed 16 in this case, to the AEC metric of a secondary scanner. When translating protocols between different models of CT scanners (from the GE LightSpeed 16 reference scanner to the GE VCT system), the translation functions were linear. However, a power-law function was necessary to convert the AEC functions of the GE LightSpeed 16 reference scanner to the Siemens Definition AS+ secondary scanner, because of differences in the AEC functionality designed by these two companies. Protocol translation on the basis of
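
    The fitting step described above (a linear translation function between the two GE scanners, a power-law one toward the Siemens scanner) can be sketched with an ordinary least-squares fit, as below. The phantom data points are invented for illustration; only the functional forms follow the text.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Hypothetical matched settings for one phantom diameter: reference-scanner
    # noise index vs. the secondary scanner's quality reference mAs giving the
    # same measured image noise (or CTDIvol).
    noise_index_ref = np.array([10.0, 14.0, 18.0, 22.0, 26.0])
    qref_mas_secondary = np.array([300.0, 160.0, 100.0, 70.0, 52.0])

    def linear(x, a, b):
        return a * x + b

    def power_law(x, a, b):
        return a * np.power(x, b)

    p_lin, _ = curve_fit(linear, noise_index_ref, qref_mas_secondary)
    p_pow, _ = curve_fit(power_law, noise_index_ref, qref_mas_secondary, p0=(30000.0, -2.0))

    for name, f, p in (("linear", linear, p_lin), ("power law", power_law, p_pow)):
        rmse = np.sqrt(np.mean((f(noise_index_ref, *p) - qref_mas_secondary) ** 2))
        print(f"{name}: params={np.round(p, 2)}, rmse={rmse:.1f}")

    # Translating a protocol that uses noise index 20 on the reference scanner:
    print("suggested quality reference mAs:", round(power_law(20.0, *p_pow)))
    ```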

  15. Recovery of environmental analytes from clays and soils by supercritical fluid extracting/gas chromatography

    International Nuclear Information System (INIS)

    Emery, A.P.; Chesler, S.N.; MacCrehan, W.A.

    1992-01-01

    This paper reports on Supercritical Fluid Extraction (SFE), which promises to provide rapid extractions of organic analytes from environmental sample types without the use of hazardous solvents. In addition, SFE protocols using commercial instrumentation can be automated, lowering analysis costs. Because of these benefits, we are investigating SFE as an alternative to the solvent extraction (e.g. Soxhlet and sonication) techniques required in many EPA test procedures. SFE, using non-polar carbon dioxide as well as more polar supercritical fluids, was used to determine n-alkane hydrocarbons and polynuclear aromatic hydrocarbons (PAHs) in solid samples. The extraction behavior of these analyte classes from environmentally-contaminated soil matrices and model soil and clay matrices was investigated using a SFE apparatus in which the extracted analytes were collected on a solid phase trap and then selectively eluted with a solvent. The SFE conditions for quantitative recovery of n-alkane hydrocarbons in diesel fuel from a series of clays and soils were determined using materials prepared at the 0.02% level with diesel fuel oil in order to simplify analyte collection and analysis after extraction. The effect of extraction parameters, including temperature, fluid flow rate and modifier addition, was investigated by monitoring the amount of diesel fuel extracted as a function of time.

  16. On the Performance of the Cache Coding Protocol

    Directory of Open Access Journals (Sweden)

    Behnaz Maboudi

    2018-03-01

    Full Text Available Network coding approaches typically consider an unrestricted recoding of coded packets in the relay nodes to increase performance. However, this can expose the system to pollution attacks that cannot be detected during transmission, until the receivers attempt to recover the data. To prevent these attacks while allowing for the benefits of coding in mesh networks, the cache coding protocol was proposed. This protocol only allows recoding at the relays when the relay has received enough coded packets to decode an entire generation of packets. At that point, the relay node recodes and signs the recoded packets with its own private key, allowing the system to detect and minimize the effect of pollution attacks and making the relays accountable for changes on the data. This paper analyzes the delay performance of cache coding to understand the security-performance trade-off of this scheme. We introduce an analytical model for the case of two relays in an erasure channel relying on an absorbing Markov chain and an approximate model to estimate the performance in terms of the number of transmissions before successfully decoding at the receiver. We confirm our analysis using simulation results. We show that cache coding can overcome the security issues of unrestricted recoding with only a moderate decrease in system performance.
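
    The absorbing-Markov-chain calculation that underlies such a delay model is standard: with Q the transient-to-transient transition matrix, the expected number of steps to absorption from each transient state is t = (I - Q)^{-1} 1. The toy transition probabilities below are invented for illustration and are not the paper's model parameters.

    ```python
    import numpy as np

    # Toy chain: states 0 and 1 are transient (e.g., how much of the generation the
    # relays currently hold), state 2 is absorbing ("receiver has decoded").
    P = np.array([
        [0.5, 0.4, 0.1],   # from state 0
        [0.0, 0.6, 0.4],   # from state 1
        [0.0, 0.0, 1.0],   # absorbing state
    ])

    Q = P[:2, :2]                                  # transient part
    N = np.linalg.inv(np.eye(2) - Q)               # fundamental matrix
    expected_steps = N @ np.ones(2)                # expected transmissions to absorption
    print(expected_steps)                          # [4.  2.5] for this toy chain
    ```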

  17. [Chinese Protocol of Diagnosis and Treatment of Colorectal Cancer].

    Science.gov (United States)

    2018-04-01

    Colorectal cancer is one of the most common malignant tumors in China. In 2012, about 1.36 million cases of colorectal cancer were diagnosed worldwide, of which 253,000 were diagnosed in China (accounting for 18.6%). China has the largest number of new cases of colorectal cancer in the world, and colorectal cancer has become a serious threat to the health of Chinese residents. In 2010, the National Ministry of Health organized colorectal cancer experts of the Chinese Medical Association to write the "Chinese Protocol of Diagnosis and Treatment of Colorectal Cancer" (2010 edition) and published it publicly. In recent years, the National Health and Family Planning Commission has organized experts to revise the protocol twice: first in 2015 and again in 2017. The revised parts of the "Chinese Protocol of Diagnosis and Treatment of Colorectal Cancer" (2017 edition) cover new progress in the fields of imaging examination, pathological evaluation, surgery, chemotherapy and radiotherapy. The 2017 edition of the protocol not only refers to the contents of international guidelines, but also reflects the specific national conditions and clinical practice in China, and includes much recent evidence-based clinical data from China. The 2017 edition of the protocol will further promote the standardization of diagnosis and treatment of colorectal cancer in China, improve the survival and prognosis of patients, and benefit millions of patients with colorectal cancer and their families.

  18. A joint research protocol for music therapy in dementia care

    DEFF Research Database (Denmark)

    Ridder, Hanne Mette Ochsner; Stige, Brynjulf

    2011-01-01

    Agitation is a major challenge within institutions of care for the elderly. The effect of music therapy on agitation and quality of life is investigated in practice-relevant research that combines a Randomized Controlled Trial with multicentre research. The research protocol is developed...... in dialogue with practicing music therapists....

  19. Importance of implementing an analytical quality control system in a core laboratory.

    Science.gov (United States)

    Marques-Garcia, F; Garcia-Codesal, M F; Caro-Narros, M R; Contreras-SanFeliciano, T

    2015-01-01

    The aim of the clinical laboratory is to provide useful information for screening, diagnosis and monitoring of disease. The laboratory should ensure the quality of the extra-analytical and analytical processes, based on set criteria. To do this, it develops and implements a system of internal quality control designed to detect errors, and compares its data with those of other laboratories through external quality control. In this way, the laboratory has a tool to verify whether the objectives set are being met and, in case of errors, to take corrective actions and ensure the reliability of the results. This article sets out to describe the design and implementation of an internal quality control protocol, as well as its periodic assessment intervals (6 months) to determine compliance with pre-determined specifications (Stockholm Consensus(1)). A total of 40 biochemical and 15 immunochemical methods were evaluated using three different control materials. Next, a standard operation procedure was planned to develop a system of internal quality control that included calculating the error of the analytical process, setting quality specifications, and verifying compliance. The quality control data were then statistically depicted as means, standard deviations, and coefficients of variation, as well as systematic, random, and total errors. The quality specifications were then fixed and the operational rules to apply in the analytical process were calculated. Finally, our data were compared with those of other laboratories through an external quality assurance program. The development of an analytical quality control system is a highly structured process. This should be designed to detect errors that compromise the stability of the analytical process. The laboratory should review its quality indicators (systematic, random and total error) at regular intervals to ensure that they meet the pre-determined specifications and, if not, apply the appropriate corrective actions.
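
    The routine calculations described (imprecision, bias and total error checked against a specification) can be sketched as below on invented control data; the control results, target value, total-error specification and the TE = |bias| + 1.65 x CV convention are illustrative assumptions rather than the laboratory's actual figures.

    ```python
    import statistics

    control_results = [4.9, 5.1, 5.0, 5.2, 4.8, 5.1, 5.0, 4.9, 5.3, 5.0]   # mmol/L (invented)
    target_value = 5.0            # assigned value of the control material (assumed)
    te_specification = 8.0        # allowable total error, % (assumed specification)

    mean = statistics.mean(control_results)
    sd = statistics.stdev(control_results)
    cv_pct = 100 * sd / mean                                   # random error (imprecision)
    bias_pct = 100 * (mean - target_value) / target_value      # systematic error
    total_error_pct = abs(bias_pct) + 1.65 * cv_pct            # common total-error estimate

    print(f"mean={mean:.2f}, CV={cv_pct:.1f}%, bias={bias_pct:.1f}%, TE={total_error_pct:.1f}%")
    print("meets specification" if total_error_pct <= te_specification
          else "corrective action needed")
    ```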

  20. Model-based Engineering for the Integration of Manufacturing Systems with Advanced Analytics

    OpenAIRE

    Lechevalier , David; Narayanan , Anantha; Rachuri , Sudarsan; Foufou , Sebti; Lee , Y Tina

    2016-01-01

    Part 3: Interoperability and Systems Integration; International audience; To employ data analytics effectively and efficiently on manufacturing systems, engineers and data scientists need to collaborate closely to bring their domain knowledge together. In this paper, we introduce a domain-specific modeling approach to integrate a manufacturing system model with advanced analytics, in particular neural networks, to model predictions. Our approach combines a set of meta-models and transformatio...

  1. Hanford transuranic analytical capability

    International Nuclear Information System (INIS)

    McVey, C.B.

    1995-01-01

    With the current DOE focus on ER/WM programs, an increase in the quantity of waste samples that requires detailed analysis is forecasted. One of the prime areas of growth is the demand for DOE environmental protocol analyses of TRU waste samples. Currently there is no laboratory capacity to support analysis of TRU waste samples in excess of 200 nCi/gm. This study recommends that an interim solution be undertaken to provide these services. By adding two glove boxes in room 11A of 222S the interim waste analytical needs can be met for a period of four to five years or until a front end facility is erected at or near the 222-S facility. The yearly average of samples is projected to be approximately 600 samples. The figure has changed significantly due to budget changes and has been downgraded from 10,000 samples to the 600 level. Until these budget and sample projection changes become firmer, a long term option is not recommended at this time. A revision to this document is recommended by March 1996 to review the long term option and sample projections

  2. Development and implementation of the Dutch protocol for rehabilitative management in amyotrophic lateral sclerosis.

    Science.gov (United States)

    van den Berg, J P; de Groot, I J M; Joha, B C; van Haelst, J M; van Gorcom, P; Kalmijn, S

    2004-12-01

    In the Netherlands, rehabilitation care plays an important role in the symptomatic and palliative treatment of ALS patients. However, until 1999 there were no guidelines or practice parameters available for the management of ALS. Therefore, the Dutch protocol for rehabilitative management in ALS was developed. We describe the development process, the outcome and implementation of the protocol. A concept management protocol was written and the Delphi method was selected to develop the protocol further. This method comprises repetitive discussion sessions from postulates, using a combination of written questionnaires and work-conferences. Between 80 and 90 persons (rehabilitation team members of different professional backgrounds and neurologists) were involved in this process. The protocol was implemented by sending it to all consultants in rehabilitation medicine in the Netherlands; they were asked to inform all the treatment team members about the final protocol and to implement it in their treatment of ALS patients. The protocol was developed in 1999, implemented in 2000 and evaluated in 2001. Recommendations for improvement were made during the evaluation and improvements are currently being developed by an expert group. The protocol is widely used (88.9%) by consultants in rehabilitation medicine and their treatment teams in the Netherlands. The Dutch protocol for rehabilitative management was developed to provide an optimal and adequate care plan for patients with ALS. It is widely used in the Netherlands.

  3. A Comparison Between Inter-Asterisk eXchange Protocol and Jingle Protocol: Session Time

    Directory of Open Access Journals (Sweden)

    H. S. Haj Aliwi

    2016-08-01

    Full Text Available Over the last few years, many multimedia conferencing and Voice over Internet Protocol (VoIP) applications have been developed due to the use of signaling protocols in providing video, audio and text chatting services between at least two participants. This paper compares two widely used signaling protocols: the Inter-Asterisk eXchange Protocol (IAX) and the extension of the eXtensible Messaging and Presence Protocol (Jingle), in terms of delay time during call setup, call teardown, and media sessions.

  4. Intuitive versus analytical decision making modulates trust in e-commerce

    Directory of Open Access Journals (Sweden)

    Paola Iannello

    2014-11-01

    Full Text Available The hypothesis that intuition and analytical processes affect trust in e-commerce differently was tested. Participants were offered products by a series of sellers via the Internet. In the intuitive condition, pictures of the sellers were followed by neutral descriptions and participants had less time to decide whether to trust the seller. In the analytical condition, participants were given an informative description of the seller and had a longer time to decide. Interactions among condition, price and trust emerged in behavioral and psychophysiological responses. EMG signals increased during analytical processing, suggesting a cognitive effort, whereas higher cardiovascular measures mirrored the emotional involvement when faced with untrustworthy sellers. The study supported the fruitful application of the intuitive vs. analytical approach to e-commerce and of the combination of different sources of information about the buyers while they have to choose whether to trust the seller in a financial transaction over the Internet.

  5. Analytical chemistry instrumentation

    International Nuclear Information System (INIS)

    Laing, W.R.

    1986-01-01

    In nine sections, 48 chapters cover: 1) analytical chemistry and the environment; 2) environmental radiochemistry; 3) automated instrumentation; 4) advances in analytical mass spectrometry; 5) Fourier transform spectroscopy; 6) analytical chemistry of plutonium; 7) nuclear analytical chemistry; 8) chemometrics; and 9) nuclear fuel technology.

  6. Rethinking Visual Analytics for Streaming Data Applications

    Energy Technology Data Exchange (ETDEWEB)

    Crouser, R. Jordan; Franklin, Lyndsey; Cook, Kris

    2017-01-01

    In the age of data science, the use of interactive information visualization techniques has become increasingly ubiquitous. From online scientific journals to the New York Times graphics desk, the utility of interactive visualization for both storytelling and analysis has become ever more apparent. As these techniques have become more readily accessible, the appeal of combining interactive visualization with computational analysis continues to grow. Arising out of a need for scalable, human-driven analysis, the primary objective of visual analytics systems is to capitalize on the complementary strengths of human and machine analysis, using interactive visualization as a medium for communication between the two. These systems leverage developments from the fields of information visualization, computer graphics, machine learning, and human-computer interaction to support insight generation in areas where purely computational analyses fall short. Over the past decade, visual analytics systems have generated remarkable advances in many historically challenging analytical contexts. These include areas such as modeling political systems [Crouser et al. 2012], detecting financial fraud [Chang et al. 2008], and cybersecurity [Harrison et al. 2012]. In each of these contexts, domain expertise and human intuition are necessary components of the analysis. This intuition is essential to building trust in the analytical products, as well as supporting the translation of evidence into actionable insight. In addition, each of these examples also highlights the need for scalable analysis. In each case, it is infeasible for a human analyst to manually assess the raw information unaided, and the communication overhead to divide the task between a large number of analysts makes simple parallelism intractable. Regardless of the domain, visual analytics tools strive to optimize the allocation of human analytical resources and to streamline the sensemaking process on data that is massive.

  7. Analytical solution of Luedeking-Piret equation for a batch fermentation obeying Monod growth kinetics.

    Science.gov (United States)

    Garnier, Alain; Gaillet, Bruno

    2015-12-01

    Not so many fermentation mathematical models allow analytical solutions of batch process dynamics. The most widely used is the combination of the logistic microbial growth kinetics with Luedeking-Piret bioproduct synthesis relation. However, the logistic equation is principally based on formalistic similarities and only fits a limited range of fermentation types. In this article, we have developed an analytical solution for the combination of Monod growth kinetics with Luedeking-Piret relation, which can be identified by linear regression and used to simulate batch fermentation evolution. Two classical examples are used to show the quality of fit and the simplicity of the method proposed. A solution for the combination of Haldane substrate-limited growth model combined with Luedeking-Piret relation is also provided. These models could prove useful for the analysis of fermentation data in industry as well as academia. © 2015 Wiley Periodicals, Inc.
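
    As a point of reference for the model structure described above, the relations commonly written for Monod growth combined with Luedeking-Piret product formation are sketched below in standard textbook notation; the closed-form solution derived by the authors is not reproduced here.

    \frac{dX}{dt} = \mu_{\max}\,\frac{S}{K_S + S}\,X, \qquad \frac{dS}{dt} = -\frac{1}{Y_{X/S}}\,\frac{dX}{dt}, \qquad \frac{dP}{dt} = \alpha\,\frac{dX}{dt} + \beta X

    where X, S and P are the biomass, substrate and product concentrations, \mu_{\max} is the maximum specific growth rate, K_S the half-saturation constant, Y_{X/S} the biomass yield on substrate, and \alpha, \beta the growth- and non-growth-associated Luedeking-Piret coefficients.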

  8. Abbreviated MRI protocols for detecting breast cancer in women with dense breasts

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Shung Qing; Huang, Min; Shen, Yu Ying; Liu, Chen Lu; Xu, Chuan Xiao [The Affiliated Suzhou Hospital, Nanjing Medical University, Suzhou (China)

    2017-06-15

    To evaluate the validity of two abbreviated protocols (AP) of MRI in breast cancer screening of dense breast tissue. This was a retrospective study in 356 participants with dense breast tissue and negative mammography results. The study was approved by the Nanjing Medical University Ethics Committee. Patients were imaged with a full diagnostic protocol (FDP) of MRI. Two APs (AP-1 consisting of the first post-contrast subtracted [FAST] and maximum-intensity projection [MIP] images, and AP-2 consisting of AP-1 combined with diffusion-weighted imaging [DWI]) and FDP images were analyzed separately, and the sensitivities and specificities of breast cancer detection were calculated. Of the 356 women, 67 lesions were detected in 67 women (18.8%) by the standard MR protocol, and histological examination revealed 14 malignant lesions and 53 benign lesions. The average interpretation times of AP-1 and AP-2 were 37 seconds and 54 seconds, respectively, while the average interpretation time of the FDP was 3 minutes and 25 seconds. The sensitivities of AP-1, AP-2, and FDP were 92.9, 100, and 100%, respectively, and the specificities of the three MR protocols were 86.5, 95.0, and 96.8%, respectively. There was no significant difference among the three MR protocols in the diagnosis of breast cancer (p > 0.05). However, the specificity of AP-1 was significantly lower than that of AP-2 (p = 0.031) and FDP (p = 0.035), while there was no difference between AP-2 and FDP (p > 0.05). The AP may be efficient in the breast cancer screening of dense breast tissue. FAST and MIP images combined with DWI of MRI help to improve the specificity of breast cancer detection.

  9. Flow cytometry protocol to evaluate ionizing radiation effects on P-glycoprotein activity

    International Nuclear Information System (INIS)

    Santos, Neyliane Goncalves dos; Amaral, Ademir; Cavalcanti, Mariana Brayner; Neves, Maria Amelia Batista; Machado, Cintia Gonsalves de Faria

    2008-01-01

    The aim of this work was to establish a protocol to evaluate ionizing radiation effects on P-glycoprotein (P-gp) activity. For this, human peripheral blood samples were irradiated in vitro with different doses and P-gp activity was analyzed for CD4 and CD8 T lymphocytes through rhodamine123-efflux assay by flow cytometry. By simultaneous employment of percentage and mean fluorescence index parameters, subject-by-subject analysis pointed out changes in P-gp activity for some individuals and irradiated samples. Based on this work, the proposed protocol was considered adequate for evaluating P-gp activity on cells after radioactive stress. Besides, this research suggests that P-gp activity could be an important factor to define patient-specific protocols in combined chemo- and radiotherapy, particularly when radiation exposure precedes chemical treatment. (author)

  10. Flow cytometry protocol to evaluate ionizing radiation effects on P-glycoprotein activity

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Neyliane Goncalves dos; Amaral, Ademir; Cavalcanti, Mariana Brayner [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Dept. de Energia Nuclear]. E-mail: neylisantos@yahoo.com.br; Neves, Maria Amelia Batista; Machado, Cintia Gonsalves de Faria [Fundacao de Hematologia e Hemoterapia de Pernambuco, Recife, PE (Brazil). Unidade de Laboratorios Especializados. Lab. de Imunofenotipagem

    2008-12-15

    The aim of this work was to establish a protocol to evaluate ionizing radiation effects on P-glycoprotein (P-gp) activity. For this, human peripheral blood samples were irradiated in vitro with different doses and P-gp activity was analyzed for CD4 and CD8 T lymphocytes through rhodamine123-efflux assay by flow cytometry. By simultaneous employment of percentage and mean fluorescence index parameters, subject-by-subject analysis pointed out changes in P-gp activity for some individuals and irradiated samples. Based on this work, the proposed protocol was considered adequate for evaluating P-gp activity on cells after radioactive stress. Besides, this research suggests that P-gp activity could be an important factor to define patient-specific protocols in combined chemo- and radiotherapy, particularly when radiation exposure precedes chemical treatment. (author)

  11. Experimenting with new combinations of old ideas

    NARCIS (Netherlands)

    Linde, J.

    2012-01-01

    Many important (economic) decisions involve risk. At the same time every decision is made within a social context. So far (experimental) research studies these two aspects in isolation from each other. Jona Linde therefore combines existing experimental protocols to examine the influence of social

  12. Modified Aggressive Packet Combining Scheme

    International Nuclear Information System (INIS)

    Bhunia, C.T.

    2010-06-01

    In this letter, a few schemes are presented to improve the performance of the aggressive packet combining scheme (APC). To combat error in computer/data communication networks, ARQ (Automatic Repeat Request) techniques are used. Several modifications to improve the performance of ARQ are suggested by recent research and are found in the literature. The important modifications are the majority packet combining scheme (MjPC proposed by Wicker), packet combining scheme (PC proposed by Chakraborty), modified packet combining scheme (MPC proposed by Bhunia), and packet reversed packet combining (PRPC proposed by Bhunia) scheme. These modifications are appropriate for improving the throughput of conventional ARQ protocols. Leung proposed the idea of APC for error control in wireless networks with the basic objective of error control in uplink wireless data networks. We suggest a few modifications of APC to improve its performance in terms of higher throughput, lower delay and higher error correction capability. (author)
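
    To make the packet-combining idea concrete, the sketch below implements plain bit-wise majority voting across several erroneous copies of the same packet, which is the notion underlying the majority packet combining scheme mentioned above; it is illustrative only and does not reproduce the specific APC modifications proposed in the letter.

    # Illustrative bit-wise majority voting across received copies of a packet.
    # This sketches the idea behind majority packet combining, not the APC
    # variants proposed in the letter.
    def majority_combine(copies):
        """Combine an odd number of equal-length packet copies by per-bit majority vote."""
        assert len(copies) % 2 == 1, "use an odd number of copies to avoid ties"
        combined = bytearray(len(copies[0]))
        for i in range(len(combined)):
            for bit in range(8):
                ones = sum((c[i] >> bit) & 1 for c in copies)
                if ones > len(copies) // 2:
                    combined[i] |= 1 << bit
        return bytes(combined)

    # Three noisy copies of b"DATA", each with a different single-bit error,
    # are combined back into the original packet.
    original = b"DATA"
    copies = [bytearray(original) for _ in range(3)]
    copies[0][0] ^= 0x01
    copies[1][1] ^= 0x10
    copies[2][3] ^= 0x04
    assert majority_combine(copies) == original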

  13. Implementing an ultrasound-based protocol for diagnosing appendicitis while maintaining diagnostic accuracy

    International Nuclear Information System (INIS)

    Van Atta, Angela J.; Baskin, Henry J.; Maves, Connie K.; Dansie, David M.; Rollins, Michael D.; Bolte, Robert G.; Mundorff, Michael B.; Andrews, Seth P.

    2015-01-01

    The use of ultrasound to diagnose appendicitis in children is well-documented but not universally employed outside of pediatric academic centers, especially in the United States. Various obstacles make it difficult for institutions and radiologists to abandon a successful and accurate CT-based imaging protocol in favor of a US-based protocol. To describe how we overcame barriers to implementing a US-based appendicitis protocol among a large group of nonacademic private-practice pediatric radiologists while maintaining diagnostic accuracy and decreasing medical costs. A multidisciplinary team of physicians (pediatric surgery, pediatric emergency medicine and pediatric radiology) approved an imaging protocol using US as the primary modality to evaluate suspected appendicitis with CT for equivocal cases. The protocol addressed potential bias against US and accommodated for institutional limitations of radiologist and sonographer experience and availability. Radiologists coded US reports according to the probability of appendicitis. Radiology reports were compared with clinical outcomes to assess diagnostic accuracy. During the study period, physicians from each group were apprised of the interim US protocol accuracy results. Problematic cases were discussed openly. A total of 512 children were enrolled and underwent US for evaluation of appendicitis over a 30-month period. Diagnostic accuracy was comparable to published results for combined US/CT protocols. Comparing the first 12 months to the last 12 months of the study period, the proportion of children achieving an unequivocal US result increased from 30% (51/169) to 53% (149/282) and the proportion of children undergoing surgery based solely on US findings increased from 55% (23/42) to 84% (92/109). Overall, 63% (325/512) of patients in the protocol did not require a CT. Total patient costs were reduced by $30,182 annually. We overcame several barriers to implementing a US protocol. During the study period our

  14. Implementing an ultrasound-based protocol for diagnosing appendicitis while maintaining diagnostic accuracy

    Energy Technology Data Exchange (ETDEWEB)

    Van Atta, Angela J. [University of Utah School of Medicine, Salt Lake City, UT (United States); Baskin, Henry J.; Maves, Connie K.; Dansie, David M. [Primary Children' s Hospital, Department of Radiology, Salt Lake City, UT (United States); Rollins, Michael D. [University of Utah School of Medicine, Department of Surgery, Division of Pediatric Surgery, Salt Lake City, UT (United States); Bolte, Robert G. [University of Utah School of Medicine, Department of Pediatrics, Division of Pediatric Emergency Medicine, Salt Lake City, UT (United States); Mundorff, Michael B.; Andrews, Seth P. [Primary Children' s Hospital, Systems Improvement, Salt Lake City, UT (United States)

    2015-05-01

    The use of ultrasound to diagnose appendicitis in children is well-documented but not universally employed outside of pediatric academic centers, especially in the United States. Various obstacles make it difficult for institutions and radiologists to abandon a successful and accurate CT-based imaging protocol in favor of a US-based protocol. To describe how we overcame barriers to implementing a US-based appendicitis protocol among a large group of nonacademic private-practice pediatric radiologists while maintaining diagnostic accuracy and decreasing medical costs. A multidisciplinary team of physicians (pediatric surgery, pediatric emergency medicine and pediatric radiology) approved an imaging protocol using US as the primary modality to evaluate suspected appendicitis with CT for equivocal cases. The protocol addressed potential bias against US and accommodated for institutional limitations of radiologist and sonographer experience and availability. Radiologists coded US reports according to the probability of appendicitis. Radiology reports were compared with clinical outcomes to assess diagnostic accuracy. During the study period, physicians from each group were apprised of the interim US protocol accuracy results. Problematic cases were discussed openly. A total of 512 children were enrolled and underwent US for evaluation of appendicitis over a 30-month period. Diagnostic accuracy was comparable to published results for combined US/CT protocols. Comparing the first 12 months to the last 12 months of the study period, the proportion of children achieving an unequivocal US result increased from 30% (51/169) to 53% (149/282) and the proportion of children undergoing surgery based solely on US findings increased from 55% (23/42) to 84% (92/109). Overall, 63% (325/512) of patients in the protocol did not require a CT. Total patient costs were reduced by $30,182 annually. We overcame several barriers to implementing a US protocol. During the study period our

  15. The French dosimetry protocol

    International Nuclear Information System (INIS)

    Dutreix, A.

    1985-01-01

    After a general introduction the protocol is divided into five sections dealing with: determination of the quality of X-ray, γ-ray and electron beams; the measuring instrument; calibration of the reference instrument; determination of the reference absorbed dose in the user's beams; and determination of the absorbed dose in water at other points, in other conditions. The French protocol is not essentially different from the Nordic protocol and it is based on the experience gained in using both the American and the Nordic protocols. Therefore, only the main differences from the published protocols are discussed. (Auth.)

  16. Approximate analytical solution of two-dimensional multigroup P-3 equations

    International Nuclear Information System (INIS)

    Matausek, M.V.; Milosevic, M.

    1981-01-01

    Iterative solution of multigroup spherical harmonics equations reduces, in the P-3 approximation and in two-dimensional geometry, to a problem of solving an inhomogeneous system of eight ordinary first order differential equations. With appropriate boundary conditions, these equations have to be solved for each energy group and in each iteration step. The general solution of the corresponding homogeneous system of equations is known in analytical form. The present paper shows how the right-hand side of the system can be approximated in order to derive a particular solution and thus an approximate analytical expression for the general solution of the inhomogeneous system. This combined analytical-numerical approach was shown to have certain advantages compared to the finite-difference method or the Lie-series expansion method, which have been used to solve similar problems. (author)

  17. Let's Talk... Analytics

    Science.gov (United States)

    Oblinger, Diana G.

    2012-01-01

    Talk about analytics seems to be everywhere. Everyone is talking about analytics. Yet even with all the talk, many in higher education have questions about--and objections to--using analytics in colleges and universities. In this article, the author explores the use of analytics in, and all around, higher education. (Contains 1 note.)

  18. Implementing Operational Analytics using Big Data Technologies to Detect and Predict Sensor Anomalies

    Science.gov (United States)

    Coughlin, J.; Mital, R.; Nittur, S.; SanNicolas, B.; Wolf, C.; Jusufi, R.

    2016-09-01

    Operational analytics, when combined with Big Data technologies and predictive techniques, has been shown to be valuable in detecting mission critical sensor anomalies that might be missed by conventional analytical techniques. Our approach helps analysts and leaders make informed and rapid decisions by analyzing large volumes of complex data in near real-time and presenting it in a manner that facilitates decision making. It provides cost savings by being able to alert and predict when sensor degradations pass a critical threshold and impact mission operations. Operational analytics, which uses Big Data tools and technologies, can process very large data sets containing a variety of data types to uncover hidden patterns, unknown correlations, and other relevant information. When combined with predictive techniques, it provides a mechanism to monitor and visualize these data sets and provide insight into degradations encountered in large sensor systems such as the space surveillance network. In this study, data from a notional sensor is simulated and we use big data technologies, predictive algorithms and operational analytics to process the data and predict sensor degradations. This study uses data products that would commonly be analyzed at a site, and it builds on a big data architecture that has previously been proven valuable in detecting anomalies. This paper outlines our methodology of implementing an operational analytic solution through data discovery, learning and training of data modeling and predictive techniques, and deployment. Through this methodology, we implement a functional architecture focused on exploring available big data sets and determining practical analytic, visualization, and predictive technologies.
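
    As a rough illustration of the alert-when-a-threshold-is-passed idea described above, the sketch below flags readings that drift far from a rolling baseline on a simulated sensor stream; it is a notional example and does not reproduce the Big Data pipeline or the predictive models used in the study.

    # Notional threshold-based detection of sensor degradation on a simulated
    # stream; not the study's actual architecture or algorithms.
    import random

    def detect_anomalies(readings, window=50, k=3.0):
        """Flag readings more than k standard deviations from a rolling mean."""
        flags, history = [], []
        for x in readings:
            if len(history) >= window:
                recent = history[-window:]
                mean = sum(recent) / window
                std = (sum((v - mean) ** 2 for v in recent) / window) ** 0.5
                flags.append(std > 0 and abs(x - mean) > k * std)
            else:
                flags.append(False)
            history.append(x)
        return flags

    # Simulated sensor: nominal noise plus one injected degradation spike.
    random.seed(0)
    stream = [10.0 + random.gauss(0, 0.1) for _ in range(200)]
    stream[150] += 5.0
    print(sum(detect_anomalies(stream)), "anomalous readings flagged")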

  19. Analytical Model for Fictitious Crack Propagation in Concrete Beams

    DEFF Research Database (Denmark)

    Ulfkjær, J. P.; Krenk, S.; Brincker, Rune

    An analytical model for load-displacement curves of unreinforced notched and un-notched concrete beams is presented. The load-displacement curve is obtained by combining two simple models. The fracture is modelled by a fictitious crack in an elastic layer around the mid-section of the beam. Outside the elastic layer the deformations are modelled by the Timoshenko beam theory. The state of stress in the elastic layer is assumed to depend bi-linearly on local elongation, corresponding to a linear softening relation for the fictitious crack. For different beam sizes, results from the analytical model are compared with results from a more accurate model based on numerical methods. The analytical model is shown to be in good agreement with the numerical results if the thickness of the elastic layer is taken as half the beam depth. Several general results are obtained. It is shown that the point on the load...
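
    For orientation, the linear softening relation referred to above is commonly written as follows; this is the standard textbook form with generic symbols, not notation taken from the paper.

    \sigma(w) = f_t \left(1 - \frac{w}{w_c}\right), \qquad 0 \le w \le w_c

    where \sigma is the stress transferred across the fictitious crack, w the crack opening (local elongation), f_t the tensile strength and w_c the critical opening at which the crack becomes stress-free; the area under this curve equals the fracture energy, G_F = f_t w_c / 2.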

  20. Fall risk screening protocol for older hearing clinic patients.

    Science.gov (United States)

    Criter, Robin E; Honaker, Julie A

    2017-10-01

    The primary purposes of this study were (1) to describe measures that may distinguish audiology patients who fall from those who do not fall and (2) to evaluate the clinical performance of measures that could be easily used for fall risk screening in a mainstream audiology hearing clinic. Cross-sectional study. Study sample: Thirty-six community-dwelling audiology patient participants and 27 community-dwelling non-audiology patients over 60 years of age. The Hearing Handicap Inventory for the Elderly (HHIE) most accurately identified patients with a recent fall (sensitivity: 76.0%), while the Dizziness Handicap Inventory (DHI) most accurately identified patients without a recent fall (specificity: 90.9%). A combination of measures used in a protocol (including the HHIE, DHI, number of medications, and the Timed Up and Go test) resulted in good, accurate identification of patients with or without a recent history of falls (92.0% sensitivity, 100% specificity). This study reports good sensitivity and excellent specificity for identifying patients with and without a recent history of falls when measures were combined into a screening protocol. Despite previously reported barriers, effective fall risk screenings may be performed in hearing clinic settings with measures often readily accessible to audiologists.
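
    A minimal sketch of how such a multi-measure screen could be combined is given below; the cut-off values are placeholders invented for illustration, not the criteria validated in the study.

    # Hypothetical combination of the screening measures named above (HHIE,
    # DHI, number of medications, Timed Up and Go). All cut-offs below are
    # assumptions for illustration, not the study's validated criteria.
    def fall_risk_flag(hhie, dhi, n_medications, tug_seconds,
                       hhie_cut=8, dhi_cut=10, meds_cut=4, tug_cut=12.0):
        """Return True if any measure meets or exceeds its (assumed) cut-off."""
        return (hhie >= hhie_cut or dhi >= dhi_cut
                or n_medications >= meds_cut or tug_seconds >= tug_cut)

    # Example patient: elevated hearing handicap score, otherwise unremarkable.
    print(fall_risk_flag(hhie=14, dhi=2, n_medications=2, tug_seconds=9.5))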

  1. Analytics for Education

    Science.gov (United States)

    MacNeill, Sheila; Campbell, Lorna M.; Hawksey, Martin

    2014-01-01

    This article presents an overview of the development and use of analytics in the context of education. Using Buckingham Shum's three levels of analytics, the authors present a critical analysis of current developments in the domain of learning analytics, and contrast the potential value of analytics research and development with real world…

  2. Second-line treatment of recurrent HNSCC: tumor debulking in combination with high-dose-rate brachytherapy and a simultaneous cetuximab-paclitaxel protocol

    International Nuclear Information System (INIS)

    Ritter, M.; Teudt, I. U.; Meyer, J. E.; Schröder, U.; Kovács, G.; Wollenberg, B.

    2016-01-01

    After the failure of first-line treatment, the clinical prognosis in head and neck cancer (HNSCC) deteriorates. Effective therapeutic strategies are limited due to the toxicity of previous treatments and the diminished tolerance of surrounding normal tissue. This study demonstrates a promising second-line regimen, with function-preserving surgical tumor debulking followed by a combination of postoperative interstitial brachytherapy and a simultaneous protocol of cetuximab and taxol. From January 2006 to May 2013, 197 patients with HNSCC were treated with brachytherapy at the University Hospital Schleswig-Holstein Campus Lübeck, including 94 patients due to recurrent cancer. Within these, 18 patients were referred to our clinic because of early progressive disease following first- or second-line treatment failure. They received the new palliative regimen. A matched-pair analysis including recurrent tumor stage, status of resection margins, tissue invasion and previous therapy was performed to evaluate this treatment retrospectively. Overall survival (OS), disease-free survival (DFS), functional outcome and treatment toxicity were analyzed on the basis of medical records and follow-up data. DFS and OS of the study group were 8.7 and 14.8 months, whereas DFS and OS of the control group, treated only by function-preserving tumor debulking and brachytherapy, were 3.9 and 6.1 months, respectively. This demonstrates a positive trend through the additional use of the cetuximab-taxane protocol. Furthermore, no increase in therapy-induced toxicity was observed. Pre-treated patients with a further relapse benefit from the 'cetuximab-taxane recurrency scheme'. It seems to be a valuable complement to interdisciplinary and multimodal tumor therapy, which improves OS and results in acceptable toxicity. The online version of this article (doi:10.1186/s13014-016-0583-0) contains supplementary material, which is available to authorized users.

  3. Novel protocol for highly efficient gas-phase chemical derivatization of surface amine groups using trifluoroacetic anhydride

    Science.gov (United States)

    Duchoslav, Jiri; Kehrer, Matthias; Hinterreiter, Andreas; Duchoslav, Vojtech; Unterweger, Christoph; Fürst, Christian; Steinberger, Roland; Stifter, David

    2018-06-01

    In the current work, chemical derivatization of amine (NH2) groups with trifluoroacetic anhydride (TFAA) as an analytical method to improve the information scope of X-ray photoelectron spectroscopy (XPS) is investigated. TFAA is known to successfully label hydroxyl (OH) groups. With the introduction of a newly developed gas-phase derivatization protocol, conducted at ambient pressure and using a catalyst, NH2 groups can now also be labelled efficiently, with a high yield and without the formation of unwanted by-products. By establishing a comprehensive and self-consistent database of reference binding energies for XPS, a promising approach for distinguishing hydroxyl from amine groups is presented. The protocol was verified on different polymers, including poly(allylamine), poly(ethyleneimine), poly(vinylalcohol) and chitosan, the latter containing both types of the addressed chemical groups.

  4. Performance Analysis of the IEEE 802.11p Multichannel MAC Protocol in Vehicular Ad Hoc Networks.

    Science.gov (United States)

    Song, Caixia

    2017-12-12

    Vehicular Ad Hoc Networks (VANETs) employ multiple channels to provide a variety of safety and non-safety applications, based on the IEEE 802.11p and IEEE 1609.4 protocols. The safety applications require timely and reliable transmissions, while the non-safety applications require efficient and high throughput. In the IEEE 1609.4 protocol, the operating interval is divided into alternating Control Channel (CCH) and Service Channel (SCH) intervals of identical length. During the CCH interval, nodes transmit safety-related messages and control messages, and the Enhanced Distributed Channel Access (EDCA) mechanism is employed to allow four Access Categories (ACs) within a station with different priorities according to their criticality for the vehicle's safety. During the SCH interval, the non-safety messages are transmitted. An analytical model is proposed in this paper to evaluate the performance, reliability and efficiency of the IEEE 802.11p and IEEE 1609.4 protocols. The proposed model improves on existing work by taking several aspects and the character of multichannel switching into design consideration. Extensive performance evaluations based on analysis and simulation help to validate the accuracy of the proposed model and analyze the capabilities and limitations of the IEEE 802.11p and IEEE 1609.4 protocols, and enhancement suggestions are given.
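
    The alternating channel schedule described above can be pictured with the small sketch below; the interval lengths follow the commonly cited IEEE 1609.4 defaults (a 100 ms synchronization interval split evenly between CCH and SCH, each beginning with a short guard interval) and should be read as nominal values, not a normative implementation of the standard.

    # Sketch of the alternating CCH/SCH schedule; interval lengths are the
    # commonly cited IEEE 1609.4 defaults, treated here as nominal values.
    SYNC_INTERVAL_MS = 100
    CCH_INTERVAL_MS = 50
    GUARD_MS = 4

    def channel_at(t_ms):
        """Return which interval a time (ms since a sync boundary) falls into."""
        offset = t_ms % SYNC_INTERVAL_MS
        interval = "CCH" if offset < CCH_INTERVAL_MS else "SCH"
        within = offset if interval == "CCH" else offset - CCH_INTERVAL_MS
        return interval + " (guard)" if within < GUARD_MS else interval

    for t in (2, 25, 52, 75):
        print(t, "ms ->", channel_at(t))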

  5. Perspectives on making big data analytics work for oncology.

    Science.gov (United States)

    El Naqa, Issam

    2016-12-01

    Oncology, with its unique combination of clinical, physical, technological, and biological data, provides an ideal case study for applying big data analytics to improve cancer treatment safety and outcomes. An oncology treatment course such as chemoradiotherapy can generate a large pool of information carrying the 5Vs hallmarks of big data. This data comprises a heterogeneous mixture of patient demographics, radiation/chemo dosimetry, multimodality imaging features, and biological markers generated over a treatment period that can span a few days to several weeks. Efforts using commercial and in-house tools are underway to facilitate data aggregation, ontology creation, sharing, visualization and varying analytics in a secure environment. However, open questions related to proper data structure representation and effective analytics tools to support oncology decision-making need to be addressed. It is recognized that oncology data constitute a mix of structured (tabulated) and unstructured (electronic documents) sources that need to be processed to facilitate searching and subsequent knowledge discovery from relational or NoSQL databases. In this context, methods based on advanced analytics and image feature extraction for oncology applications will be discussed. On the other hand, the classical p (variables) ≫ n (samples) inference problem of statistical learning is challenged in the Big data realm, and this is particularly true for oncology applications where p-omics is witnessing exponential growth while the number of cancer incidences has generally plateaued over the past 5 years, leading to a quasi-linear growth in samples per patient. Within the Big data paradigm, this kind of phenomenon may yield undesirable effects such as echo chamber anomalies, Yule-Simpson reversal paradox, or misleading ghost analytics. In this work, we will present these effects as they pertain to oncology and engage small thinking methodologies to counter these effects, ranging from...

  6. Methods of analytical check for highly pure tungsten

    International Nuclear Information System (INIS)

    Miklin, D.G.; Karpov, Yu.A.; Orlova, V.A.

    1993-01-01

    The review is devoted to methods for the analysis of high-purity tungsten. Current trends in the development of this branch of analytical chemistry are considered. Instrumental mass-spectrometric analysis, together with optical-spectral and activation methods and inductively coupled plasma mass spectrometry combined with preliminary separation of the matrix and preconcentration of impurities, is expected to be the most promising.

  7. The Journal of Learning Analytics: Supporting and Promoting Learning Analytics Research

    OpenAIRE

    Siemens, George

    2014-01-01

    The paper gives a brief overview of the main activities for the development of the emerging field of learning analytics led by the Society for Learning Analytics Research (SoLAR). The place of the Journal of Learning Analytics is identified. Analytics is the most significant new initiative of SoLAR.

  8. The "Journal of Learning Analytics": Supporting and Promoting Learning Analytics Research

    Science.gov (United States)

    Siemens, George

    2014-01-01

    The paper gives a brief overview of the main activities for the development of the emerging field of learning analytics led by the Society for Learning Analytics Research (SoLAR). The place of the "Journal of Learning Analytics" is identified. Analytics is the most significant new initiative of SoLAR.

  9. Test set of gaseous analytes at Hanford tank farms

    International Nuclear Information System (INIS)

    1997-01-01

    DOE has stored toxic and radioactive waste materials in large underground tanks. When the vapors in the tank headspaces vent to the open atmosphere, a potentially dangerous situation can occur for personnel in the area. An open-path atmospheric pollution monitor is being developed to monitor the open air space above these tanks. In developing this infrared spectra monitor as a safety alert instrument, it is important to know what hazardous gases, called the Analytes of Concern, are most likely to be found in dangerous concentrations. The monitor must consider other gases which could interfere with measurements of the Analytes of Concern. The total list of gases, called the Test Set Analytes, forms the basis for testing the pollution monitor. Prior measurements in 54 tank headspaces have detected 102 toxic air pollutants (TAPs) and over 1000 other analytes. The hazardous Analytes are ranked herein by a Hazardous Atmosphere Rating, which combines their measured concentration, their density relative to air, and the concentration at which they become dangerous. The top 20 toxic air pollutants, as ranked by the Hazardous Atmosphere Rating, and the top 20 other analytes, in terms of measured concentrations, are analyzed for possible inclusion in the Test Set Analytes. Of these 40 gases, 20 are selected. To these 20 gases are added the 6 omnipresent atmospheric gases with the highest concentrations, since their spectra could interfere with measurements of the other spectra. The 26 Test Set Analytes are divided into a Primary Set and a Secondary Set. The Primary Set, gases which must be detectable by the monitor, includes the 6 atmospheric gases and the 6 hazardous gases which have been measured at dangerous concentrations. The Secondary Set gases need not be monitored at this time. The infrared spectra indicate that the pollution monitor will detect all 26 Test Set Analytes by thermal emission and will detect 15 Test Set Analytes by laser absorption.
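
    To illustrate the kind of ranking described above, the sketch below scores a few gases by combining measured concentration, density relative to air, and the concentration at which the gas becomes dangerous; both the scoring formula and the example values are assumptions made for illustration, since the report's actual Hazardous Atmosphere Rating and data are not given in this record.

    # Illustrative ranking of analytes by a score combining the three factors
    # named above. The formula and the numbers are assumptions, not the
    # report's actual rating or measurements.
    analytes = [
        # (name, measured ppm, dangerous ppm, density relative to air)
        ("ammonia",       150.0,  300.0, 0.6),
        ("nitrous oxide", 400.0, 1000.0, 1.5),
        ("benzene",         2.0,   50.0, 2.7),
    ]

    def hazard_score(measured, dangerous, rel_density):
        """Assumed score: fraction of the dangerous level, weighted by density."""
        return (measured / dangerous) * rel_density

    for name, measured, dangerous, density in sorted(
            analytes, key=lambda a: hazard_score(*a[1:]), reverse=True):
        print(f"{name:14s} score = {hazard_score(measured, dangerous, density):.2f}")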

  10. Evaluation of image quality and dose in renal colic: comparison of different spiral-CT protocols

    International Nuclear Information System (INIS)

    Rimondini, A.; Mucelli, R.P.; Dalla Palma, L.; De Denaro, M.; Bregant, P.

    2001-01-01

    The aim of this study was to test different technical spiral-CT parameters to obtain optimal image quality with reduced X-ray dose. Images were acquired with a spiral-CT system Philips Tomoscan AVE1, using 250 mA, 120 kV, and 1-s rotational time. Three protocols were tested: protocol A with 5-mm thickness, pitch 1.6, slice reconstruction every 2.5 mm; protocol B with 3-mm thickness, pitch 1.6, slice reconstruction every 1.5 mm; and protocol C with 3-mm thickness, pitch 2, slice reconstruction every 1.5 mm. Two phantoms were employed to evaluate the image quality. Axial images were acquired, then sagittal and coronal images were reconstructed. Finally, the absorbed X-ray dose for each protocol was measured. Regarding image quality, 5-mm-thick images (protocol A) showed greater spatial resolution and lower noise compared with 3-mm-thick images (protocols B and C) on the axial plane; 3-mm reconstructed sagittal and coronal images (protocols B and C) showed an improved image quality compared with 5-mm reformatted images (protocol A). Concerning X-ray dose, the mean dose was: protocol A 19.6±0.8 mGy; protocol B 14.4±0.6 mGy; protocol C 12.5±1.0 mGy. Our study supports the use of thin slices (3 mm) combined with pitch of 1.6 or 2 in renal colic for X-ray dose reduction to the patient and good image quality. (orig.)

  11. The HPA photon protocol and proposed electron protocol

    International Nuclear Information System (INIS)

    Pitchford, W.G.

    1985-01-01

    The Hospital Physicists' Association (HPA) photon dosimetry protocol was produced and published in 1983. Revised values of some components of Cλ and refinements introduced into the theory in the last few years have enabled new Cλ values to be produced. The proposed HPA electron protocol is at present in draft form and will be published shortly. Both protocols are discussed. (Auth.)

  12. Generating regionalized neuronal cells from pluripotency, a step-by-step protocol

    Directory of Open Access Journals (Sweden)

    Agnete eKirkeby

    2013-01-01

    Full Text Available Human pluripotent stem cells possess the potential to generate cells for regenerative therapies in patients with neurodegenerative diseases, and constitute an excellent cell source for studying human neural development and disease modeling. Protocols for neural differentiation of human pluripotent stem cells have undergone significant progress during recent years, allowing for rapid and synchronized neural conversion. Differentiation procedures can further be combined with accurate and efficient positional patterning to yield regionalized neural progenitors and subtype-specific neurons corresponding to different parts of the developing human brain. Here, we present a step-by-step protocol for neuralization and regionalization of human pluripotent cells for transplantation studies or in vitro analysis.

  13. Treatment of feline lymphoma using a 12-week, maintenance-free combination chemotherapy protocol in 26 cats.

    Science.gov (United States)

    Limmer, S; Eberle, N; Nerschbach, V; Nolte, I; Betz, D

    2016-08-01

    The aim of this prospective clinical trial was to investigate the efficacy and toxicity of a short-term, maintenance-free chemotherapy protocol in feline lymphoma. Twenty-six cats with confirmed diagnosis of high-/intermediate-grade lymphoma were treated with a 12-week protocol consisting of cyclic administration of l-asparaginase, vincristine, cyclophosphamide, doxorubicin and prednisolone. Complete (CR) and partial remission (PR) rates were 46 and 27%, respectively. Median duration of first CR was 394 days compared with a median PR duration of 41 days. No factor was identified to significantly influence the likelihood to reach CR. Overall survival amounted to 78 days (range: 9-2230 days). Median survival in CR cats was 454 days and in PR cats was 82 days. Toxicosis was mainly low grade with anorexia seen most frequently. In cats achieving CR, maintenance-free chemotherapy may be sufficient to attain long-term remission and survival. Factors aiding in prognosticating the likelihood for CR, strategies enhancing response and targeting chemotherapy-induced anorexia need to be identified in future. © 2014 John Wiley & Sons Ltd.

  14. A comparison of two analytical evaluation methods for educational computer games for young children

    NARCIS (Netherlands)

    Bekker, M.M.; Baauw, E.; Barendregt, W.

    2008-01-01

    In this paper we describe a comparison of two analytical methods for educational computer games for young children. The methods compared in the study are the Structured Expert Evaluation Method (SEEM) and the Combined Heuristic Evaluation (HE) (based on a combination of Nielsen’s HE and the

  15. System Architecture Development for Energy and Water Infrastructure Data Management and Geovisual Analytics

    Science.gov (United States)

    Berres, A.; Karthik, R.; Nugent, P.; Sorokine, A.; Myers, A.; Pang, H.

    2017-12-01

    Building an integrated data infrastructure that can meet the needs of sustainable energy-water resource management requires a robust data management and geovisual analytics platform, capable of cross-domain scientific discovery and knowledge generation. Such a platform can facilitate the investigation of diverse complex research and policy questions for emerging priorities in Energy-Water Nexus (EWN) science areas. Using advanced data analytics, machine learning techniques, multi-dimensional statistical tools, and interactive geovisualization components, such a multi-layered federated platform, the Energy-Water Nexus Knowledge Discovery Framework (EWN-KDF), is being developed. This platform utilizes several enterprise-grade software design concepts and standards, such as extensible service-oriented architecture, open standard protocols, an event-driven programming model, an enterprise service bus, and adaptive user interfaces, to provide strategic value to the integrative computational and data infrastructure. EWN-KDF is built on the Compute and Data Environment for Science (CADES) environment at Oak Ridge National Laboratory (ORNL).

  16. Large-scale retrieval for medical image analytics: A comprehensive review.

    Science.gov (United States)

    Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting

    2018-01-01

    Over the past decades, medical image analytics has been greatly facilitated by the explosion of digital imaging techniques, where huge amounts of medical images were produced with ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they are not capable of tackling the huge amount of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval and summarize the challenges and opportunities of medical image analytics at a large scale. Then, we provide a comprehensive review of algorithms and techniques relevant to major processes in the pipeline, including feature representation, feature indexing, searching, etc. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, with a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval, which can further improve the performance of medical image analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
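
    For intuition on the retrieval step of the pipeline outlined above, the sketch below performs brute-force cosine-similarity search over precomputed feature vectors; large-scale systems instead rely on learned features and approximate indexing (hashing, inverted files, product quantization), so this is only a minimal illustration.

    # Brute-force nearest-neighbour retrieval over precomputed feature vectors;
    # a minimal stand-in for the indexing/searching stages discussed above.
    import numpy as np

    def retrieve(query_vec, db_vecs, top_k=5):
        """Return indices of the top_k database vectors by cosine similarity."""
        q = query_vec / np.linalg.norm(query_vec)
        db = db_vecs / np.linalg.norm(db_vecs, axis=1, keepdims=True)
        return np.argsort(-(db @ q))[:top_k]

    rng = np.random.default_rng(0)
    database = rng.normal(size=(1000, 128))          # 1000 images, 128-D features
    query = database[42] + 0.05 * rng.normal(size=128)
    print(retrieve(query, database))                 # index 42 should rank first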

  17. Satellite Communications Using Commercial Protocols

    Science.gov (United States)

    Ivancic, William D.; Griner, James H.; Dimond, Robert; Frantz, Brian D.; Kachmar, Brian; Shell, Dan

    2000-01-01

    NASA Glenn Research Center has been working with industry, academia, and other government agencies in assessing commercial communications protocols for satellite and space-based applications. In addition, NASA Glenn has been developing and advocating new satellite-friendly modifications to existing communications protocol standards. This paper summarizes recent research into the applicability of various commercial standard protocols for use over satellite and space-based communications networks as well as expectations for future protocol development. It serves as a reference point from which the detailed work can be readily accessed. Areas that will be addressed include asynchronous-transfer-mode quality of service; completed and ongoing work of the Internet Engineering Task Force; data-link-layer protocol development for unidirectional link routing; and protocols for aeronautical applications, including mobile Internet protocol routing for wireless/mobile hosts and the aeronautical telecommunications network protocol.

  18. Security Protocols in a Nutshell

    OpenAIRE

    Toorani, Mohsen

    2016-01-01

    Security protocols are building blocks in secure communications. They deploy some security mechanisms to provide certain security services. Security protocols are considered abstract when analyzed, but they can have extra vulnerabilities when implemented. This manuscript provides a holistic study on security protocols. It reviews foundations of security protocols, taxonomy of attacks on security protocols and their implementations, and different methods and models for security analysis of protocols.

  19. Cumulative effective dose and cancer risk for pediatric population in repetitive full spine follow-up imaging: How micro dose is the EOS microdose protocol?

    Science.gov (United States)

    Law, Martin; Ma, Wang-Kei; Lau, Damian; Cheung, Kenneth; Ip, Janice; Yip, Lawrance; Lam, Wendy

    2018-04-01

    To evaluate and to obtain an analytic formulation for the calculation of the effective dose and associated cancer risk using the EOS microdose protocol for scoliotic pediatric patients undergoing full spine imaging at different ages of exposure; to demonstrate that the microdose protocol is capable of delivering a lower radiation dose and hence of further reducing induced cancer risk when compared with the EOS low dose protocol; to obtain the cumulative effective dose and cancer risk for scoliotic pediatric patients of both genders in the US and Hong Kong populations using the microdose protocol. Organ absorbed doses of full spine exposed scoliotic pediatric patients have been simulated with the EOS microdose protocol imaging parameters input to the Monte Carlo software PCXMC. Gender- and age-specific effective dose has been calculated from the simulated organ absorbed dose using the ICRP-103 approach. The associated radiation-induced cancer risk, expressed as lifetime attributable risk (LAR), has been estimated according to the method introduced in the Biological Effects of Ionizing Radiation VII report. Values of LAR have been estimated for scoliotic patients exposed repetitively during their follow-up period at different ages for the US and Hong Kong populations. The effective doses of full spine imaging with simultaneous posteroanterior and lateral projection for patients exposed at the age between 5 and 18 years using the EOS microdose protocol have been calculated within the range of 2.54-14.75 μSv. The corresponding LAR for the US and Hong Kong populations ranged between 0.04 × 10⁻⁶ and 0.84 × 10⁻⁶. The cumulative effective dose and cancer risk during the follow-up period can be estimated using these results and provide useful information to patients and their parents. With the use of computer simulation and analytic formulation, we obtained the cumulative effective dose and cancer risk at any age of exposure for pediatric patients of the US and Hong Kong populations undergoing repetitive full spine imaging.
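
    The cumulative quantities mentioned above amount to summing per-examination values over the follow-up schedule; the sketch below does exactly that with placeholder per-exam values chosen inside the reported ranges (2.54-14.75 μSv and 0.04 × 10⁻⁶ to 0.84 × 10⁻⁶ per exam). The study's gender- and age-specific formulation is not reproduced here.

    # Cumulative effective dose and LAR over a follow-up schedule. The per-exam
    # values are placeholders inside the reported ranges, not the study's data.
    follow_up = [
        # (age at exposure, effective dose in microsievert, LAR per exam)
        (10, 6.0, 0.50e-6),
        (11, 6.5, 0.45e-6),
        (12, 7.0, 0.40e-6),
        (13, 7.5, 0.35e-6),
    ]

    cumulative_dose = sum(dose for _, dose, _ in follow_up)
    cumulative_lar = sum(lar for _, _, lar in follow_up)
    print(f"cumulative effective dose: {cumulative_dose:.1f} microsievert")
    print(f"cumulative LAR: {cumulative_lar:.2e}")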

  20. Dose evaluation using multiple-aliquot quartz OSL: Test of methods and a new protocol for improved accuracy and precision

    DEFF Research Database (Denmark)

    Jain, M.; Bøtter-Jensen, L.; Singhvi, A.K.

    2003-01-01

    -dose-dependent sensitivity changes during the pre-heat, and fundamental variability in the shapes of quartz OSL (blue-green or blue-light stimulated luminescence) decay forms. A new protocol using a combination of 'elevated temperature IR cleaning' (ETIR) and 'component-specific dose normalisation' (CSDN) has been developed. CSDN accounts for variability in the OSL decay forms and absorbs such sensitivity changes. A combination of the ETIR and CSDN protocol increased palaeodose precision from ±100% to ±4% in quartz separates from the fluvially transported sands in the Thar desert. A comparison with palaeodose estimates...

  1. An approximate analytical approach to resampling averages

    DEFF Research Database (Denmark)

    Malzahn, Dorthe; Opper, M.

    2004-01-01

    Using a novel reformulation, we develop a framework to compute approximate resampling data averages analytically. The method avoids multiple retraining of statistical models on the samples. Our approach uses a combination of the replica "trick" of statistical physics and the TAP approach for approximate Bayesian inference. We demonstrate our approach on regression with Gaussian processes. A comparison with averages obtained by Monte-Carlo sampling shows that our method achieves good accuracy.

  2. Vertical equilibrium with sub-scale analytical methods for geological CO2 sequestration

    KAUST Repository

    Gasda, S. E.

    2009-04-23

    Large-scale implementation of geological CO2 sequestration requires quantification of risk and leakage potential. One potentially important leakage pathway for the injected CO2 involves existing oil and gas wells. Wells are particularly important in North America, where more than a century of drilling has created millions of oil and gas wells. Models of CO2 injection and leakage will involve large uncertainties in parameters associated with wells, and therefore a probabilistic framework is required. These models must be able to capture both the large-scale CO2 plume associated with the injection and the small-scale leakage problem associated with localized flow along wells. Within a typical simulation domain, many hundreds of wells may exist. One effective modeling strategy combines both numerical and analytical models with a specific set of simplifying assumptions to produce an efficient numerical-analytical hybrid model. The model solves a set of governing equations derived by vertical averaging with assumptions of a macroscopic sharp interface and vertical equilibrium. These equations are solved numerically on a relatively coarse grid, with an analytical model embedded to solve for wellbore flow occurring at the sub-gridblock scale. This vertical equilibrium with sub-scale analytical method (VESA) combines the flexibility of a numerical method, allowing for heterogeneous and geologically complex systems, with the efficiency and accuracy of an analytical method, thereby eliminating expensive grid refinement for sub-scale features. Through a series of benchmark problems, we show that VESA compares well with traditional numerical simulations and with a semi-analytical model which applies to appropriately simple systems. We believe that the VESA model provides the necessary accuracy and efficiency for applications of risk analysis in many CO2 sequestration problems. © 2009 Springer Science+Business Media B.V.

  3. Two-dimensional analytical solution for nodal calculation of nuclear reactors

    International Nuclear Information System (INIS)

    Silva, Adilson C.; Pessoa, Paulo O.; Silva, Fernando C.; Martinez, Aquilino S.

    2017-01-01

    Highlights: • A proposal for a coarse mesh nodal method is presented. • The proposal uses the analytical solution of the two-dimensional neutron diffusion equation. • The solution is performed on homogeneous nodes with the dimensions of the fuel assembly. • The solution uses four average fluxes on the node surfaces as boundary conditions. • The results show good accuracy and efficiency. - Abstract: In this paper, the two-dimensional (2D) neutron diffusion equation is analytically solved for two energy groups (2G). The spatial domain of the reactor core is divided into a set of nodes with uniform nuclear parameters. To determine iteratively the multiplication factor and the neutron flux in the reactor, we combine the analytical solution of the neutron diffusion equation with an iterative method known as the power method. The analytical solution is obtained for the different types of regions that compose the reactor, such as fuel and reflector regions. Four average fluxes on the node surfaces are used as boundary conditions for the analytical solution. Discontinuity factors on the node surfaces derived from the homogenization process are applied to maintain average reaction rates and the net current in the fuel assembly (FA). To validate the results obtained by the analytical solution, a relative power density distribution in the FAs is determined from the neutron flux distribution and compared with the reference values. The results show good accuracy and efficiency.
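
    The power method referred to above can be illustrated on a generic matrix stand-in for the discretized operator; the sketch below is not the authors' two-group nodal implementation, just the basic iteration for the dominant eigenvalue.

    # Generic power iteration for the dominant eigenvalue (multiplication
    # factor) of a discretized operator; the 2x2 matrix is an arbitrary
    # stand-in, not the paper's nodal operator.
    import numpy as np

    def power_method(A, tol=1e-10, max_iter=1000):
        """Return the dominant eigenvalue and eigenvector of A."""
        x = np.ones(A.shape[0])
        k = 1.0
        for _ in range(max_iter):
            y = A @ x
            k_new = np.linalg.norm(y) / np.linalg.norm(x)
            x = y / np.linalg.norm(y)
            if abs(k_new - k) < tol:
                break
            k = k_new
        return k_new, x

    A = np.array([[1.10, 0.05],
                  [0.30, 0.90]])
    k_eff, flux = power_method(A)
    print(f"dominant eigenvalue ~ {k_eff:.6f}")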

  4. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    Energy Technology Data Exchange (ETDEWEB)

    Ekechukwu, A

    2009-05-27

    Method validation is the process of evaluating whether an analytical method is acceptable for its intended purpose. For pharmaceutical methods, guidelines from the United States Pharmacopeia (USP), the International Conference on Harmonisation (ICH), and the United States Food and Drug Administration (USFDA) provide a framework for performing such validations. In general, methods for regulatory compliance must include studies on specificity, linearity, accuracy, precision, range, detection limit, quantitation limit, and robustness. Elements of these guidelines are readily adapted to the issue of validation for beryllium sampling and analysis. This document provides a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers and books reviewed is given in the Appendix. Available validation documents and guides are listed therein; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of the validation process at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all referenced documents were published in English.

  5. Approximate analytical solution of two-dimensional multigroup P-3 equations

    International Nuclear Information System (INIS)

    Matausek, M.V.; Milosevic, M.

    1981-01-01

    Iterative solution of multigroup spherical harmonics equations reduces, in the P-3 approximation and in two-dimensional geometry, to a problem of solving an inhomogeneous system of eight ordinary first order differential equations. With appropriate boundary conditions, these equations have to be solved for each energy group and in each iteration step. The general solution of the corresponding homogeneous system of equations is known in analytical form. The present paper shows how the right-hand side of the system can be approximated in order to derive a particular solution and thus an approximate analytical expression for the general solution of the inhomogeneous system. This combined analytical-numerical approach was shown to have certain advantages compared to the finite-difference method or the Lie-series expansion method, which have been used to solve similar problems. (orig./RW)

  6. Plasma-cavity ringdown spectroscopy for analytical measurement: Progress and prospectives

    Science.gov (United States)

    Zhang, Sida; Liu, Wei; Zhang, Xiaohe; Duan, Yixiang

    2013-07-01

    Plasma-cavity ringdown spectroscopy is a powerful absorption technique for analytical measurement. It combines the inherent advantages of high sensitivity, absolute measurement, and relative insensitivity to light source intensity fluctuations of the cavity ringdown technique with use of plasma as an atomization/ionization source. In this review, we briefly describe the background and principles of plasma-cavity ringdown spectroscopy (CRDS) technology, the instrumental components, and various applications. The significant developments of the plasma sources, lasers, and cavity optics are illustrated. Analytical applications of plasma-CRDS for elemental detection and isotopic measurement in atomic spectrometry are outlined in this review. Plasma-CRDS is shown to have a promising future for various analytical applications, while some further efforts are still needed in fields such as cavity design, plasma source design, instrumental improvement and integration, as well as potential applications in radical and molecular measurements.
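
    The ringdown principle behind this insensitivity to source fluctuations can be summarized by the standard relations below; this is the generic textbook form, not tied to any particular instrument described in the review.

    I(t) = I_0\, e^{-t/\tau}, \qquad \alpha = \frac{1}{c}\left(\frac{1}{\tau} - \frac{1}{\tau_0}\right)

    where \tau and \tau_0 are the ringdown times measured with and without the absorbing species in the cavity, c is the speed of light and \alpha is the absorption coefficient of the sample; because only decay times are compared, the result does not depend on the source intensity I_0.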

  7. Big data and visual analytics in anaesthesia and health care.

    Science.gov (United States)

    Simpao, A F; Ahumada, L M; Rehman, M A

    2015-09-01

    Advances in computer technology, patient monitoring systems, and electronic health record systems have enabled rapid accumulation of patient data in electronic form (i.e. big data). Organizations such as the Anesthesia Quality Institute and Multicenter Perioperative Outcomes Group have spearheaded large-scale efforts to collect anaesthesia big data for outcomes research and quality improvement. Analytics--the systematic use of data combined with quantitative and qualitative analysis to make decisions--can be applied to big data for quality and performance improvements, such as predictive risk assessment, clinical decision support, and resource management. Visual analytics is the science of analytical reasoning facilitated by interactive visual interfaces, and it can facilitate performance of cognitive activities involving big data. Ongoing integration of big data and analytics within anaesthesia and health care will increase demand for anaesthesia professionals who are well versed in both the medical and the information sciences. © The Author 2015. Published by Oxford University Press on behalf of the British Journal of Anaesthesia. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  8. Analytical Model for Fictitious Crack Propagation in Concrete Beams

    DEFF Research Database (Denmark)

    Ulfkjær, J. P.; Krenk, Steen; Brincker, Rune

    1995-01-01

    An analytical model for load-displacement curves of concrete beams is presented. The load-displacement curve is obtained by combining two simple models. The fracture is modeled by a fictitious crack in an elastic layer around the midsection of the beam. Outside the elastic layer the deformations...... are modeled by beam theory. The state of stress in the elastic layer is assumed to depend bilinearly on local elongation corresponding to a linear softening relation for the fictitious crack. Results from the analytical model are compared with results from a more detailed model based on numerical methods...... for different beam sizes. The analytical model is shown to be in agreement with the numerical results if the thickness of the elastic layer is taken as half the beam depth. It is shown that the point on the load-displacement curve where the fictitious crack starts to develop and the point where the real crack...

  9. Analytical solutions of the electrostatically actuated curled beam problem

    KAUST Repository

    Younis, Mohammad I.

    2014-07-24

    This work presents analytical expressions for the electrostatically actuated initially deformed cantilever beam problem. The formulation is based on the continuous Euler-Bernoulli beam model combined with a single-mode Galerkin approximation. We derive simple analytical expressions for two commonly observed deformed beam configurations: the curled and tilted configurations. The derived analytical formulas are validated by comparing their results to experimental data and numerical results of a multi-mode reduced order model. The derived expressions do not involve any complicated integrals or complex terms and can be conveniently used by designers for quick, yet accurate, estimations. The formulas are found to yield accurate results for most commonly encountered microbeams with initial tip deflections of a few microns. For largely deformed beams, we found that these formulas yield less accurate results due to the limitations of the single-mode approximation. In such cases, multi-mode reduced order models are shown to yield accurate results. © 2014 Springer-Verlag Berlin Heidelberg.
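
    A feel for the reduced-order approach can be had from the simplest lumped (single-degree-of-freedom) electrostatic actuator model, in which a linear spring balances the parallel-plate force. The sketch below uses that lumped stand-in with placeholder stiffness, gap and electrode area; it is not the paper's single-mode Galerkin expressions for curled or tilted beams.

```python
# Sketch: static deflection of a lumped (single-degree-of-freedom) electrostatic
# actuator -- a stand-in for a single-mode reduced-order model, not the paper's
# closed-form expressions. k, A and g are illustrative placeholders.
import numpy as np
from scipy.optimize import brentq

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def deflection(V, k=1.0, A=100e-6 * 100e-6, g=2e-6):
    """Stable static deflection x solving k*x = eps0*A*V^2 / (2*(g-x)^2)."""
    f = lambda x: k * x - EPS0 * A * V**2 / (2.0 * (g - x) ** 2)
    # the stable equilibrium branch lies below the pull-in deflection g/3
    return brentq(f, 0.0, g / 3.0 - 1e-12)

# lumped-model pull-in voltage V_PI = sqrt(8*k*g^3 / (27*eps0*A)), same parameters
V_pull_in = np.sqrt(8.0 * 1.0 * (2e-6) ** 3 / (27.0 * EPS0 * 100e-6 * 100e-6))
for V in (1.0, 2.0, 0.95 * V_pull_in):
    print(f"V = {V:.2f} V -> x = {deflection(V)*1e6:.3f} um")
print(f"lumped-model pull-in voltage ~ {V_pull_in:.2f} V")
```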

  10. Business protocol in integrated Europe

    OpenAIRE

    Pavelová, Nina

    2009-01-01

    The first chapter is devoted to definitions of basic terms such as protocol and business protocol, to the differences between protocol and etiquette, and between social etiquette and business etiquette. The second chapter focuses on the factors influencing European business protocol. The third chapter is devoted to the etiquette of business protocol in the European countries. It touches on topics such as punctuality and planning of business appointments, greetings, business cards, dress and appear...

  11. Accuracy of prehospital triage protocols in selecting severely injured patients: A systematic review.

    Science.gov (United States)

    van Rein, Eveline A J; Houwert, R Marijn; Gunning, Amy C; Lichtveld, Rob A; Leenen, Luke P H; van Heijl, Mark

    2017-08-01

    Prehospital trauma triage ensures proper transport of patients at risk of severe injury to hospitals with an appropriate corresponding level of trauma care. Incorrect triage results in undertriage and overtriage. The American College of Surgeons Committee on Trauma recommends an undertriage rate below 5% and an overtriage rate below 50% for prehospital trauma triage protocols. To find the most accurate prehospital trauma triage protocol, a clear overview of all currently available protocols and corresponding outcomes is necessary. The aim of this systematic review was to evaluate the current literature on all available prehospital trauma triage protocols and determine the accuracy of protocol-based triage in terms of sensitivity and specificity. A search of Pubmed, Embase, and Cochrane Library databases was performed to identify all studies describing prehospital trauma triage protocols before November 2016. The search terms included "trauma," "trauma center," or "trauma system" combined with "triage," "undertriage," or "overtriage." All studies describing protocol-based triage quality were reviewed. To assess the quality of these types of studies, a new critical appraisal tool was developed. In this review, 21 articles were included, with numbers of patients ranging from 130 to over 1 million. Significant predictors for severe injury were: vital signs, suspicion of certain anatomic injuries, mechanism of injury, and age. Sensitivity ranged from 10% to 100%; specificity from 9% to 100%. Nearly all protocols had a low sensitivity, thereby failing to identify severely injured patients. Additionally, the critical appraisal showed poor quality of the majority of included studies. This systematic review shows that nearly all protocols are incapable of identifying severely injured patients. Future studies of high methodological quality should be performed to improve prehospital trauma triage protocols. Systematic review, level III.
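
    The accuracy figures quoted above follow directly from a two-by-two table of injury severity against triage destination. The sketch below computes sensitivity, specificity and, under one common convention, undertriage and overtriage from such counts; the counts themselves are illustrative, and other definitions (e.g. the Cribari matrix) use different denominators.

```python
# Sketch: triage accuracy metrics from a 2x2 table of (severely injured?) x
# (triaged to a trauma centre?). Under one common convention, undertriage is
# the fraction of severely injured patients not sent to a trauma centre
# (1 - sensitivity) and overtriage the fraction of non-severely injured
# patients who were (1 - specificity); other conventions exist, so treat
# these as one possible choice rather than the review's exact definitions.
def triage_metrics(tp, fn, fp, tn):
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return {
        "sensitivity": sensitivity,
        "specificity": specificity,
        "undertriage": 1.0 - sensitivity,
        "overtriage": 1.0 - specificity,
    }

# illustrative counts only
print(triage_metrics(tp=180, fn=20, fp=400, tn=1400))
```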

  12. MODULAR ANALYTICS: A New Approach to Automation in the Clinical Laboratory.

    Science.gov (United States)

    Horowitz, Gary L; Zaman, Zahur; Blanckaert, Norbert J C; Chan, Daniel W; Dubois, Jeffrey A; Golaz, Olivier; Mensi, Noury; Keller, Franz; Stolz, Herbert; Klingler, Karl; Marocchi, Alessandro; Prencipe, Lorenzo; McLawhon, Ronald W; Nilsen, Olaug L; Oellerich, Michael; Luthe, Hilmar; Orsonneau, Jean-Luc; Richeux, Gérard; Recio, Fernando; Roldan, Esther; Rymo, Lars; Wicktorsson, Anne-Charlotte; Welch, Shirley L; Wieland, Heinrich; Grawitz, Andrea Busse; Mitsumaki, Hiroshi; McGovern, Margaret; Ng, Katherine; Stockmann, Wolfgang

    2005-01-01

    MODULAR ANALYTICS (Roche Diagnostics) (MODULAR ANALYTICS, Elecsys and Cobas Integra are trademarks of a member of the Roche Group) represents a new approach to automation for the clinical chemistry laboratory. It consists of a control unit, a core unit with a bidirectional multitrack rack transportation system, and three distinct kinds of analytical modules: an ISE module, a P800 module (44 photometric tests, throughput of up to 800 tests/h), and a D2400 module (16 photometric tests, throughput up to 2400 tests/h). MODULAR ANALYTICS allows customised configurations for various laboratory workloads. The performance and practicability of MODULAR ANALYTICS were evaluated in an international multicentre study at 16 sites. Studies included precision, accuracy, analytical range, carry-over, and workflow assessment. More than 700 000 results were obtained during the course of the study. Median between-day CVs were typically less than 3% for clinical chemistries and less than 6% for homogeneous immunoassays. Median recoveries for nearly all standardised reference materials were within 5% of assigned values. Method comparisons versus current existing routine instrumentation were clinically acceptable in all cases. During the workflow studies, the work from three to four single workstations was transferred to MODULAR ANALYTICS, which offered over 100 possible methods, with reduction in sample splitting, handling errors, and turnaround time. Typical sample processing time on MODULAR ANALYTICS was less than 30 minutes, an improvement from the current laboratory systems. By combining multiple analytic units in flexible ways, MODULAR ANALYTICS met diverse laboratory needs and offered improvement in workflow over current laboratory situations. It increased overall efficiency while maintaining (or improving) quality.

  13. A stepwise protocol for drug permeation assessment that combines heat-separated porcine ear epidermis and vertical diffusion cells

    Directory of Open Access Journals (Sweden)

    Pantelić Ivana

    2018-01-01

    Full Text Available After a decades-long absence of an official consensus on the most appropriate evaluation method for in vitro skin performance of topical semisolid drugs, the United States Pharmacopoeia (USP 39) finally suggested three types of testing equipment; however, all of these provide data on drug release using inert synthetic membranes. Considering the need for a readily available membrane that would be more structurally similar to human skin, this paper provides a detailed protocol of a method for drug permeation assessment that uses heat-separated porcine ear epidermis and modified Franz diffusion cells. Phases that were shown to be critical for the variability of the results are identified (e.g., membrane preparation), and process parameters are optimized. Applicability of the method was tested on four cream samples loaded with aceclofenac as a model drug. Sample compositions were designed in such a way as to provide "large" variations (variation of the main stabilizer: natural-origin versus synthetic emulsifier) and relatively "minor" variations (co-solvent variation: none/isopropanol/glycerol). The developed protocol is a straightforward and reliable in vitro test for the evaluation of the rate and extent of drug delivery into/through the skin. Moreover, this protocol may be routinely applied even in averagely equipped laboratories during formulation development or preliminary bioequivalence assessment of generic topical semisolids. [Project of the Serbian Ministry of Education, Science and Technological Development, Grant no. TR34031]

  14. Analytical quality by design: a tool for regulatory flexibility and robust analytics.

    Science.gov (United States)

    Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDAs) with regulatory flexibility for a quality by design (QbD)-based analytical approach. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results owing to the robustness of the method within this region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses the different views of analytical scientists on the implementation of AQbD in the pharmaceutical quality system and also relates it to product quality by design and process analytical technology (PAT).

  15. Modal instability of rod fiber amplifiers: a semi-analytic approach

    DEFF Research Database (Denmark)

    Jørgensen, Mette Marie; Hansen, Kristian Rymann; Laurila, Marko

    2013-01-01

    The modal instability (MI) threshold is estimated for four rod fiber designs by combining a semi-analytic model with the finite element method. The thermal load due to the quantum defect is calculated and used to numerically determine the mode distributions on which the expression for the onset o...

  16. A combined HM-PCR/SNuPE method for high sensitive detection of rare DNA methylation

    Directory of Open Access Journals (Sweden)

    Tierling Sascha

    2010-06-01

    Full Text Available Abstract Background DNA methylation changes are widely used as early molecular markers in cancer detection. Sensitive detection and classification of rare methylation changes in DNA extracted from circulating body fluids or complex tissue samples is crucial for the understanding of tumor etiology, clinical diagnosis and treatment. In this paper, we describe a combined method to monitor the presence of methylated tumor DNA in an excess of unmethylated background DNA of non-tumorous cells. The method combines heavy methyl-PCR, which favors preferential amplification of methylated marker sequence from bisulfite-treated DNA with a methylation-specific single nucleotide primer extension monitored by ion-pair, reversed-phase, high-performance liquid chromatography separation. Results This combined method allows detection of 14 pg (that is, four to five genomic copies of methylated chromosomal DNA in a 2000-fold excess (that is, 50 ng of unmethylated chromosomal background, with an analytical sensitivity of > 90%. We outline a detailed protocol for the combined assay on two examples of known cancer markers (SEPT9 and TMEFF2 and discuss general aspects of assay design and data interpretation. Finally, we provide an application example for rapid testing on tumor methylation in plasma DNA derived from a small cohort of patients with colorectal cancer. Conclusion The method allows unambiguous detection of rare DNA methylation, for example in body fluid or DNA isolates from cells or tissues, with very high sensitivity and accuracy. The application combines standard technologies and can easily be adapted to any target region of interest. It does not require costly reagents and can be used for routine screening of many samples.

  17. A robust ECC based mutual authentication protocol with anonymity for session initiation protocol.

    Science.gov (United States)

    Mehmood, Zahid; Chen, Gongliang; Li, Jianhua; Li, Linsen; Alzahrani, Bander

    2017-01-01

    Over the past few years, the Session Initiation Protocol (SIP) has emerged as a substantial application-layer protocol for multimedia services. It is extensively used for managing, altering, terminating and distributing multimedia sessions. Authentication plays a pivotal role in the SIP environment. Recently, Lu et al. presented an authentication protocol for SIP and professed that the newly proposed protocol is protected against all the familiar attacks. However, a detailed analysis shows that Lu et al.'s protocol is exposed to server masquerading and user masquerading attacks. Moreover, it also fails to protect the user's identity, and it possesses an incorrect login and authentication phase. In order to establish a suitable and efficient protocol with the ability to overcome all these discrepancies, a robust ECC-based novel mutual authentication mechanism with anonymity for SIP is presented in this manuscript. The improved protocol contains an explicit parameter for the user to cope with the issues of security and correctness, and it is found to be more secure and relatively effective in protecting the user's privacy and in resisting user and server masquerading, as verified through comprehensive formal and informal security analysis.
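
    The kind of primitive such schemes are built on can be illustrated with a generic elliptic-curve Diffie-Hellman handshake with mutual key confirmation. The sketch below is not the protocol analysed or proposed in the paper; the pre-shared secret, identities and transcript layout are simplifying assumptions, and it uses the Python cryptography package for the curve operations.

```python
# Sketch: a generic ECDH handshake with key confirmation, illustrating the kind
# of elliptic-curve primitive ECC-based SIP authentication schemes build on.
# This is NOT the protocol from the paper; PSK, identities and message layout
# are simplifying assumptions.
import hmac, hashlib
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

PSK = b"pre-shared password or verifier"   # assumed out-of-band secret

def ephemeral():
    key = ec.generate_private_key(ec.SECP256R1())
    pub = key.public_key().public_bytes(
        serialization.Encoding.X962, serialization.PublicFormat.UncompressedPoint)
    return key, pub

def session_key(own_priv, peer_pub_bytes):
    peer = ec.EllipticCurvePublicKey.from_encoded_point(ec.SECP256R1(), peer_pub_bytes)
    shared = own_priv.exchange(ec.ECDH(), peer)
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=PSK, info=b"sip-auth").derive(shared)

def tag(key, *parts):
    return hmac.new(key, b"|".join(parts), hashlib.sha256).digest()

# user <-> server handshake: exchange ephemeral public keys, derive the same key
u_priv, u_pub = ephemeral()
s_priv, s_pub = ephemeral()
k_user, k_server = session_key(u_priv, s_pub), session_key(s_priv, u_pub)

# mutual key confirmation: each side proves knowledge of the derived key
assert hmac.compare_digest(tag(k_user, b"user", u_pub, s_pub),
                           tag(k_server, b"user", u_pub, s_pub))
assert hmac.compare_digest(tag(k_server, b"server", s_pub, u_pub),
                           tag(k_user, b"server", s_pub, u_pub))
print("handshake complete, 256-bit session key established")
```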

  18. A robust ECC based mutual authentication protocol with anonymity for session initiation protocol.

    Directory of Open Access Journals (Sweden)

    Zahid Mehmood

    Full Text Available Over the past few years, the Session Initiation Protocol (SIP) has emerged as a substantial application-layer protocol for multimedia services. It is extensively used for managing, altering, terminating and distributing multimedia sessions. Authentication plays a pivotal role in the SIP environment. Recently, Lu et al. presented an authentication protocol for SIP and professed that the newly proposed protocol is protected against all the familiar attacks. However, a detailed analysis shows that Lu et al.'s protocol is exposed to server masquerading and user masquerading attacks. Moreover, it also fails to protect the user's identity, and it possesses an incorrect login and authentication phase. In order to establish a suitable and efficient protocol with the ability to overcome all these discrepancies, a robust ECC-based novel mutual authentication mechanism with anonymity for SIP is presented in this manuscript. The improved protocol contains an explicit parameter for the user to cope with the issues of security and correctness, and it is found to be more secure and relatively effective in protecting the user's privacy and in resisting user and server masquerading, as verified through comprehensive formal and informal security analysis.

  19. Performance Evaluation of TDMA Medium Access Control Protocol in Cognitive Wireless Networks

    Directory of Open Access Journals (Sweden)

    Muhammed Enes Bayrakdar

    2017-02-01

    Full Text Available The cognitive radio paradigm has emerged as a new communication technology that shares channels in wireless networks. Channel assignment is a crucial issue in the field of cognitive wireless networks because of spectrum scarcity. In this work, we have evaluated the performance of the TDMA medium access control protocol. In our simulation scenarios, primary users and secondary users utilize TDMA as the medium access control protocol. We have designed a network environment in the Riverbed simulation software that consists of primary users, secondary users, and base stations. In our system model, secondary users sense the spectrum and inform the base station about empty channels. Then, the base station decides accordingly which secondary user may utilize the empty channel. The energy detection technique is employed for spectrum sensing because it performs best when little prior information about the primary user's signal is available. In addition, different numbers of users are selected in the simulation scenarios in order to obtain accurate delay and throughput results. Comparing the analytical model with the simulation results, we show that the performance analysis of our system model is consistent and accurate.
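
    For orientation, a textbook first-order TDMA model already captures the qualitative trade-off such simulations explore: per-user capacity shrinks and access delay grows as the number of users per frame increases. The sketch below implements that simple model with placeholder slot lengths and arrival rates; it is not the Riverbed scenario or the paper's analytical model.

```python
# Sketch: a textbook first-order TDMA model (N users, one slot per user per
# frame). Per-user saturation throughput is one packet per frame; under light
# load the mean access delay is roughly half a frame of waiting plus one slot
# of transmission. Slot length and arrival rate are placeholders.
def tdma_model(n_users, slot_s, arrival_rate_pkt_s):
    frame = n_users * slot_s
    capacity = 1.0 / frame                     # packets/s each user can drain
    utilisation = arrival_rate_pkt_s / capacity
    delay = frame / 2.0 + slot_s               # light-load approximation, seconds
    return {"frame_s": frame, "per_user_capacity_pkt_s": capacity,
            "utilisation": utilisation, "mean_access_delay_s": delay}

for n in (5, 10, 20):
    print(n, tdma_model(n, slot_s=2e-3, arrival_rate_pkt_s=10))
```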

  20. Toward Synthesis, Analysis, and Certification of Security Protocols

    Science.gov (United States)

    Schumann, Johann

    2004-01-01

    ... multiple tries with invalid passwords caused the expected error message (too many retries) but nevertheless let the user pass. Finally, security can be compromised by silly implementation bugs or design decisions. In a commercial VPN software, all calls to the encryption routines were incidentally replaced by stubs, probably during factory testing. The product worked nicely, and the error (an open VPN) would have gone undetected if a team member had not inspected the low-level traffic out of curiosity. Also, the use of secret proprietary encryption routines can backfire, because such algorithms often exhibit weaknesses which can be exploited easily (see, e.g., DVD encoding). Summarizing, there is a large number of possibilities to make errors which can compromise the security of a protocol. In today's world, with short time-to-market and the use of security protocols in open and hostile networks for safety-critical applications (e.g., power or air-traffic control), such slips could lead to catastrophic situations. Thus, formal methods and automatic reasoning techniques should not be used just for the formal proof of the absence of an attack, but they ought to be used to provide an end-to-end tool-supported framework for security software. With such an approach, all required artifacts (code, documentation, test cases), formal analyses, and reliable certification will be generated automatically, given a single, high-level specification. By a combination of program synthesis, formal protocol analysis, certification, and proof-carrying code, this goal is within practical reach, since all the important technologies for such an approach actually exist and only need to be assembled in the right way.

  1. Power Saving MAC Protocols for WSNs and Optimization of S-MAC Protocol

    Directory of Open Access Journals (Sweden)

    Simarpreet Kaur

    2012-11-01

    Full Text Available Low power MAC protocols have received a lot of consideration in the last few years because of their influence on the lifetime of wireless sensor networks, since sensors typically operate on batteries whose replacement is often difficult. A lot of work has been done to minimize the energy expenditure and prolong the sensor lifetime through energy-efficient designs across layers. Meanwhile, the sensor network should be able to maintain a certain throughput in order to fulfill the QoS requirements of the end user and to ensure the constancy of the network. This paper introduces different types of MAC protocols used for WSNs and describes S-MAC, a Medium-Access Control protocol designed for Wireless Sensor Networks. S-MAC uses a few innovative techniques to reduce energy consumption and support self-configuration. A new protocol is suggested to improve the energy efficiency, latency and throughput of the existing MAC protocol for WSNs. A modification of the protocol is then proposed to eliminate the need for some nodes to stay awake longer than the other nodes, which improves the energy efficiency, latency and throughput and hence increases the life span of a wireless sensor network.
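
    The energy argument behind duty-cycled MAC protocols such as S-MAC can be made concrete with a back-of-the-envelope lifetime estimate: the average current is the duty-cycle-weighted mix of active and sleep currents. The sketch below uses placeholder current draws and battery capacity, not values from the paper.

```python
# Sketch: why a low duty cycle extends node lifetime. Current draws and the
# battery capacity are placeholder figures, not values from the paper.
def lifetime_days(duty_cycle, i_active_ma=20.0, i_sleep_ma=0.02,
                  battery_mah=2400.0):
    avg_current_ma = duty_cycle * i_active_ma + (1.0 - duty_cycle) * i_sleep_ma
    return battery_mah / avg_current_ma / 24.0

for dc in (1.0, 0.10, 0.01):          # always-on vs 10% vs 1% duty cycle
    print(f"duty cycle {dc:>5.0%}: ~{lifetime_days(dc):7.1f} days")
```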

  2. Analytic trigonometry

    CERN Document Server

    Bruce, William J; Maxwell, E A; Sneddon, I N

    1963-01-01

    Analytic Trigonometry details the fundamental concepts and underlying principle of analytic geometry. The title aims to address the shortcomings in the instruction of trigonometry by considering basic theories of learning and pedagogy. The text first covers the essential elements from elementary algebra, plane geometry, and analytic geometry. Next, the selection tackles the trigonometric functions of angles in general, basic identities, and solutions of equations. The text also deals with the trigonometric functions of real numbers. The fifth chapter details the inverse trigonometric functions

  3. How to conduct External Quality Assessment Schemes for the pre-analytical phase?

    Science.gov (United States)

    Kristensen, Gunn B B; Aakre, Kristin Moberg; Kristoffersen, Ann Helen; Sandberg, Sverre

    2014-01-01

    In laboratory medicine, several studies have described the most frequent errors in the different phases of the total testing process, and a large proportion of these errors occur in the pre-analytical phase. Schemes for registration of errors and subsequent feedback to the participants have been conducted for decades concerning the analytical phase by External Quality Assessment (EQA) organizations operating in most countries. The aim of the paper is to present an overview of different types of EQA schemes for the pre-analytical phase, and give examples of some existing schemes. So far, very few EQA organizations have focused on the pre-analytical phase, and most EQA organizations do not offer pre-analytical EQA schemes (EQAS). It is more difficult to perform and standardize pre-analytical EQAS and also, accreditation bodies do not ask the laboratories for results from such schemes. However, some ongoing EQA programs for the pre-analytical phase do exist, and some examples are given in this paper. The methods used can be divided into three different types; collecting information about pre-analytical laboratory procedures, circulating real samples to collect information about interferences that might affect the measurement procedure, or register actual laboratory errors and relate these to quality indicators. These three types have different focus and different challenges regarding implementation, and a combination of the three is probably necessary to be able to detect and monitor the wide range of errors occurring in the pre-analytical phase.

  4. Analytical method for predicting plastic flow in notched fiber composite materials

    International Nuclear Information System (INIS)

    Flynn, P.L.; Ebert, L.J.

    1977-01-01

    An analytical system was developed for prediction of the onset and progress of plastic flow of oriented fiber composite materials in which both externally applied complex stress states and stress raisers were present. The predictive system was a unique combination of two numerical systems, the "SAAS II" finite element analysis system and a micromechanics finite element program. The SAAS II system was used to generate the three-dimensional stress distributions, which were used as the input into the finite element micromechanics program. Appropriate yielding criteria were then applied to this latter program. The accuracy of the analytical system was demonstrated by the agreement between the analytically predicted and the experimentally measured flow values of externally notched tungsten wire reinforced copper oriented fiber composites, in which the fiber fraction was 50 vol pct.

  5. Investigating nurse practitioners in the private sector: a theoretically informed research protocol.

    Science.gov (United States)

    Adams, Margaret; Gardner, Glenn; Yates, Patsy

    2017-06-01

    To report a study protocol and the theoretical framework normalisation process theory that informs this protocol for a case study investigation of private sector nurse practitioners. Most research evaluating nurse practitioner service is focused on public, mainly acute care environments where nurse practitioner service is well established with strong structures for governance and sustainability. Conversely, there is lack of clarity in governance for emerging models in the private sector. In a climate of healthcare reform, nurse practitioner service is extending beyond the familiar public health sector. Further research is required to inform knowledge of the practice, operational framework and governance of new nurse practitioner models. The proposed research will use a multiple exploratory case study design to examine private sector nurse practitioner service. Data collection includes interviews, surveys and audits. A sequential mixed method approach to analysis of each case will be conducted. Findings from within-case analysis will lead to a meta-synthesis across all four cases to gain a holistic understanding of the cases under study, private sector nurse practitioner service. Normalisation process theory will be used to guide the research process, specifically coding and analysis of data using theory constructs and the relevant components associated with those constructs. This article provides a blueprint for the research and describes a theoretical framework, normalisation process theory in terms of its flexibility as an analytical framework. Consistent with the goals of best research practice, this study protocol will inform the research community in the field of primary health care about emerging research in this field. Publishing a study protocol ensures researcher fidelity to the analysis plan and supports research collaboration across teams. © 2016 John Wiley & Sons Ltd.

  6. Validation of the Tensoval Duo Control II blood pressure monitor for clinic use and self-measurement according to the British Hypertension Society protocol and the European Society of Hypertension International Protocol Revision 2010.

    Science.gov (United States)

    de Greeff, Annemarie; Shennan, Andrew H

    2013-06-01

    The Tensoval Duo Control II is an automated upper arm device that uses a combination of oscillometric and auscultatory technology to determine blood pressure noninvasively. The accuracy of this device was assessed according to the British Hypertension Society (BHS) protocol and the European Society of Hypertension International Protocol revision 2010 (ESH-IP2) in an adult population. Ethical approval was obtained. Eighty-five and 33 adult individuals, respectively, were recruited to fulfil the requirements of each protocol. Trained observers took nine sequential same-arm measurements alternating between a mercury sphygmomanometer and the device. The device had to achieve at least a B grade for both systolic and diastolic pressures to pass the BHS protocol and had to fulfil the criteria of all three phases of the ESH-IP2 protocol to receive recommendation. The device achieved an A/A grading for the BHS protocol and passed all three phases of the ESH-IP2 protocol. The mean difference±SD for the BHS/ESH protocols, respectively, was -1.8±6.5/-0.7±5.7 mmHg for systolic pressure and 1.9±5.1/2.4±4.5 mmHg for diastolic pressure. The device maintained its A/A grading throughout the low-pressure, medium-pressure and high-pressure ranges. The Tensoval Duo Control II device is recommended for clinical and home use according to both the BHS and the ESH-IP2 standard.
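
    Validations of this kind are summarized by the mean difference ± SD between device and reference readings and, for ESH-IP2-style criteria, by counts of absolute differences within 5, 10 and 15 mmHg. The sketch below computes those summaries from synthetic readings; the numbers are illustrative, not study data.

```python
# Sketch: descriptive statistics commonly reported in such validations -- the
# mean difference +/- SD between device and mercury readings, plus counts of
# absolute differences within 5, 10 and 15 mmHg (ESH-IP2-style thresholds).
# Readings below are synthetic, not data from this study.
import numpy as np

rng = np.random.default_rng(7)
reference = rng.normal(130, 20, 99)                 # 99 paired readings (33 subjects x 3)
device = reference + rng.normal(-1.8, 6.5, 99)      # simulated device readings

diff = device - reference
within = {t: int(np.sum(np.abs(diff) <= t)) for t in (5, 10, 15)}
print(f"mean difference = {diff.mean():.1f} +/- {diff.std(ddof=1):.1f} mmHg")
print("absolute differences within 5/10/15 mmHg:", within)
```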

  7. Cross-Layer Protocol as a Better Option in Wireless Mesh Network with Respect to Layered-Protocol

    OpenAIRE

    Ahmed Abdulwahab Al-Ahdal; Dr. V. P. Pawar; G. N. Shinde

    2014-01-01

    The optimal way to improve Wireless Mesh Network (WMN) performance is to use a better network protocol, but whether layered-protocol design or cross-layer design is the better option for optimizing protocol performance in WMNs is still an on-going research topic. In this paper, we focus on the cross-layer protocol as the better option with respect to the layered protocol. The layered protocol architecture (OSI) model divides networking tasks into layers and defines a set of services for each layer to b...

  8. Analytic geometry

    CERN Document Server

    Burdette, A C

    1971-01-01

    Analytic Geometry covers several fundamental aspects of analytic geometry needed for advanced subjects, including calculus.This book is composed of 12 chapters that review the principles, concepts, and analytic proofs of geometric theorems, families of lines, the normal equation of the line, and related matters. Other chapters highlight the application of graphing, foci, directrices, eccentricity, and conic-related topics. The remaining chapters deal with the concept polar and rectangular coordinates, surfaces and curves, and planes.This book will prove useful to undergraduate trigonometric st

  9. IDMA-Based MAC Protocol for Satellite Networks with Consideration on Channel Quality

    Directory of Open Access Journals (Sweden)

    Gongliang Liu

    2014-01-01

    Full Text Available In order to overcome the shortcomings of existing medium access control (MAC) protocols based on TDMA or CDMA in satellite networks, the interleave division multiple access (IDMA) technique is introduced into satellite communication networks. A novel wide-band IDMA MAC protocol based on channel quality is therefore proposed in this paper, consisting of a dynamic power allocation algorithm, a rate adaptation algorithm, and a call admission control (CAC) scheme. Firstly, the power allocation algorithm, combining the technique of IDMA SINR evolution with channel quality prediction, is developed to guarantee high power efficiency even in poor channel conditions. Secondly, an effective rate adaptation algorithm, based on accurate per-timeslot channel information and realized by means of rate degradation, is presented. Moreover, based on channel quality prediction, a CAC scheme combining the new power allocation algorithm, rate scheduling, and buffering strategies is proposed for the emerging IDMA systems; it can support a variety of traffic types, offering quality of service (QoS) guarantees corresponding to different priority levels. Simulation results show that the new wide-band IDMA MAC protocol can accurately estimate the available resources, taking into account the effect of multiuser detection (MUD) and the QoS requirements of multimedia traffic, leading to low outage probability as well as high overall system throughput.

  10. Understanding Business Analytics

    Science.gov (United States)

    2015-01-05

    Analytics have been used in organizations for a variety of reasons for quite some time, ranging from the simple (generating and understanding business analytics) ... How well these two components are orchestrated will determine the level of success an organization has in ...

  11. [Immunocytochemical demonstration of astrocytes in brain sections combined with Nissl staining].

    Science.gov (United States)

    Korzhevskiĭ, D E; Otellin, V A

    2004-01-01

    The aim of the present study was to develop an easy and reliable protocol for combined staining of preparations, uniting the advantages of immunocytochemical demonstration of astrocytes with the ability to evaluate the functional state of neurons provided by the Nissl technique. The presented protocol for processing paraffin sections retains high-quality tissue structure and provides selective demonstration of astrocytes using monoclonal antibodies against glial fibrillary acidic protein together with contrast Nissl staining of cells. The protocol can be used without any changes for processing brain sections obtained from humans and other mammals, with the exception of mice and rabbits.

  12. The Earth Data Analytic Services (EDAS) Framework

    Science.gov (United States)

    Maxwell, T. P.; Duffy, D.

    2017-12-01

    Faced with unprecedented growth in earth data volume and demand, NASA has developed the Earth Data Analytic Services (EDAS) framework, a high-performance big data analytics framework built on Apache Spark. This framework enables scientists to execute data processing workflows combining common analysis operations close to the massive data stores at NASA. The data is accessed in standard (NetCDF, HDF, etc.) formats in a POSIX file system and processed using vetted earth data analysis tools (ESMF, CDAT, NCO, etc.). EDAS utilizes a dynamic caching architecture, a custom distributed array framework, and a streaming parallel in-memory workflow for efficiently processing huge datasets within limited memory spaces with interactive response times. EDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be accessed using direct web service calls, a Python script, a Unix-like shell client, or a JavaScript-based web application. New analytic operations can be developed in Python, Java, or Scala (with support for other languages planned). Client packages in Python, Java/Scala, or JavaScript contain everything needed to build and submit EDAS requests. The EDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits. It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service enables decision makers to compare multiple reanalysis datasets and investigate trends, variability, and anomalies in earth system dynamics around the globe.
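
    A WPS Execute request is, at bottom, an HTTP call carrying an operation identifier and a description of the domain and variable to operate on. The sketch below shows what such a call could look like; the endpoint URL, operation identifier and datainputs string are hypothetical placeholders, so the actual EDAS/ESGF Compute request schema should be taken from the project documentation.

```python
# Sketch: a WPS-style Execute call against an EDAS-like endpoint. The URL,
# identifier and datainputs string are hypothetical placeholders -- consult
# the EDAS / ESGF Compute Working Team documentation for the real schema.
import requests

ENDPOINT = "https://example.org/wps"      # hypothetical service URL

params = {
    "service": "WPS",
    "request": "Execute",
    "version": "1.0.0",
    "identifier": "temporal.average",      # hypothetical operation name
    "datainputs": ('[domain=[{"lat":{"start":-10,"end":10},'
                   '"time":{"start":"1990-01-01","end":"1999-12-31"}}];'
                   'variable=[{"uri":"collection://merra2","name":"tas"}]]'),
}

response = requests.get(ENDPOINT, params=params, timeout=120)
print(response.status_code)
print(response.text[:500])                 # WPS responses are XML status/result documents
```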

  13. The presentation of explicit analytical solutions of a class of nonlinear evolution equations

    International Nuclear Information System (INIS)

    Feng Jinshun; Guo Mingpu; Yuan Deyou

    2009-01-01

    In this paper, we introduce a function set Ω_m. There is a conjecture that an arbitrary explicit travelling-wave analytical solution of a real constant-coefficient nonlinear evolution equation is necessarily a linear (or nonlinear) combination of products of some elements in Ω_m. A widely applicable approach for solving a class of nonlinear evolution equations is established. The new analytical solutions to two kinds of nonlinear evolution equations are described with the aid of this conjecture.

  14. Management of corneal ectasia after LASIK with combined, same-day, topography-guided partial transepithelial PRK and collagen cross-linking: the athens protocol.

    Science.gov (United States)

    Kanellopoulos, Anastasios John; Binder, Perry S

    2011-05-01

    To evaluate a series of patients with corneal ectasia after LASIK that underwent the Athens Protocol: combined topography-guided photorefractive keratectomy (PRK) to reduce or eliminate induced myopia and astigmatism followed by sequential, same-day ultraviolet A (UVA) corneal collagen cross-linking (CXL). Thirty-two consecutive corneal ectasia cases underwent transepithelial PRK (WaveLight ALLEGRETTO) immediately followed by CXL (3 mW/cm(2)) for 30 minutes using 0.1% topical riboflavin sodium phosphate. Uncorrected distance visual acuity (UDVA), corrected distance visual acuity (CDVA), manifest refraction spherical equivalent, keratometry, central ultrasonic pachymetry, corneal tomography (Oculus Pentacam), and endothelial cell counts were analyzed. Mean follow-up was 27 months (range: 6 to 59 months). Twenty-seven of 32 eyes had an improvement in UDVA and CDVA of 20/45 or better (2.25 logMAR) at last follow-up. Four eyes showed some topographic improvement but no improvement in CDVA. One of the treated eyes required a subsequent penetrating keratoplasty. Corneal haze grade 2 was present in 2 eyes. Combined, same-day, topography-guided PRK and CXL appeared to offer tomographic stability, even after long-term follow-up. Only 2 of 32 eyes had corneal ectasia progression after the intervention. Seventeen of 32 eyes appeared to have improvement in UDVA and CDVA with follow-up >1.5 years. This technique may offer an alternative in the management of iatrogenic corneal ectasia. Copyright 2011, SLACK Incorporated.

  15. [Multidisciplinary protocol for computed tomography imaging and angiographic embolization of splenic injury due to trauma: assessment of pre-protocol and post-protocol outcomes].

    Science.gov (United States)

    Koo, M; Sabaté, A; Magalló, P; García, M A; Domínguez, J; de Lama, M E; López, S

    2011-11-01

    To assess conservative treatment of splenic injury due to trauma, following a protocol for computed tomography (CT) and angiographic embolization. To quantify the predictive value of CT for detecting bleeding and need for embolization. The care protocol developed by the multidisciplinary team consisted of angiography with embolization of lesions revealed by contrast extravasation under CT as well as embolization of grade III-V injuries observed, or grade I-II injuries causing hemodynamic instability and/or need for blood transfusion. We collected data on demographic variables, injury severity score (ISS), angiographic findings, and injuries revealed by CT. Pre-protocol and post-protocol outcomes were compared. The sensitivity and specificity of CT findings were calculated for all patients who required angiographic embolization. Forty-four and 30 angiographies were performed in the pre- and post-protocol periods, respectively. The mean (SD) ISSs in the two periods were 25 (11) and 26 (12), respectively. A total of 24 (54%) embolizations were performed in the pre-protocol period and 28 (98%) after implementation of the protocol. Two and 7 embolizations involved the spleen in the 2 periods, respectively; abdominal laparotomies numbered 32 and 25, respectively, and 10 (31%) vs 4 (16%) splenectomies were performed. The specificity and sensitivity values for contrast extravasation found on CT and followed by embolization were 77.7% and 79.5%. The implementation of this multidisciplinary protocol using CT imaging and angiographic embolization led to a decrease in the number of splenectomies. The protocol allows us to take a more conservative treatment approach.

  16. Analyticity without Differentiability

    Science.gov (United States)

    Kirillova, Evgenia; Spindler, Karlheinz

    2008-01-01

    In this article we derive all salient properties of analytic functions, including the analytic version of the inverse function theorem, using only the most elementary convergence properties of series. Not even the notion of differentiability is required to do so. Instead, analytical arguments are replaced by combinatorial arguments exhibiting…

  17. Using boolean and fuzzy logic combined with analytic hierarchy process for hazardous waste landfill site selection: A case study from Hormozgan province, Iran

    Directory of Open Access Journals (Sweden)

    Mahdieh Saadat Foomani

    2017-01-01

    Full Text Available Hazardous wastes include numerous kinds of discarded chemicals and other wastes generated from industrial, commercial, and institutional activities. These types of waste present immediate or long-term risks to humans, animals, plants, or the environment and therefore require special handling for safe disposal. Landfills that can accept hazardous wastes are excavated or engineered sites where these special types of waste can be disposed of securely. Since landfills are permanent sites, special attention must be paid to selecting the location. This paper investigated the use of Boolean theory and Fuzzy logic in combination with Analytic Hierarchy Process (AHP) methods, applying GIS and IDRISI software, for the selection of a hazardous waste landfill site in the Iranian province of Hormozgan. The best location was determined via the Fuzzy and the Boolean methodologies. By comparing the areas selected for the hazardous waste landfill, this study found that Fuzzy logic with an AND operator provided the best options for this purpose. In the end, the most suitable area for a hazardous waste landfill was about 1.6 km², obtained by employing Fuzzy logic in combination with AHP and an AND operator. In addition, all the fundamental criteria affecting the landfill location were considered.
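
    The overlay logic is easy to illustrate outside a GIS package: each criterion is rescaled to a fuzzy membership between 0 and 1, the AHP weights combine the layers linearly, and the fuzzy AND corresponds to a cell-wise minimum. The sketch below uses toy layers and weights, not the study's criteria or priority vector.

```python
# Sketch: combining fuzzy criterion layers with AHP weights and an AND
# (minimum) operator, in the spirit of the raster overlay performed in
# GIS/IDRISI. Layers, membership breakpoints and weights are toy values.
import numpy as np

def linear_membership(layer, lo, hi):
    """Rescale a raster layer to fuzzy membership in [0, 1]."""
    return np.clip((layer - lo) / (hi - lo), 0.0, 1.0)

rng = np.random.default_rng(3)
dist_to_city = linear_membership(rng.uniform(0, 50e3, (100, 100)), 5e3, 30e3)
dist_to_fault = linear_membership(rng.uniform(0, 20e3, (100, 100)), 2e3, 10e3)
gentle_slope = linear_membership(30.0 - rng.uniform(0, 30, (100, 100)), 0.0, 20.0)

weights = np.array([0.5, 0.3, 0.2])      # AHP-derived priority vector (toy)
layers = np.stack([dist_to_city, dist_to_fault, gentle_slope])

fuzzy_and = layers.min(axis=0)                       # fuzzy AND = minimum operator
ahp_overlay = np.tensordot(weights, layers, axes=1)  # AHP-weighted linear combination
for name, s in (("fuzzy AND", fuzzy_and), ("AHP overlay", ahp_overlay)):
    print(name, "share of cells with suitability > 0.8:", float((s > 0.8).mean()))
```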

  18. Network protocols. Special issue; Netwerkprotocollen. Themanummer

    Energy Technology Data Exchange (ETDEWEB)

    Jansen, G.A. [RTB Van Heugten, Nijmegen (Netherlands); Rooijakkers, G.W.J. [GTI Building Automation, Amsterdam (Netherlands); Peterse, A. [Regel Partners, Hoevelaken (Netherlands); Smits, P. [Konnex Nederland, Valkenswaard (Netherlands); Hamers, E.P. [Van Dorp Installaties, Breda (Netherlands); Van der Velden, J.A.J. [Kropman, Rijswijk (Netherlands); Van Lingen, G.; Wijn, D.M. [Engineer Johnson Controls, Gorinchem (Netherlands); Deckere, W.J.M.A. [Deerns raadgevende ingenieurs, Rijswijk (Netherlands); Driessen, B. [Saia Burgess, Gouda (Netherlands); Van Olst, K. [K en R Consultants, Deventer (Netherlands); Mosterman, F. [Wago Building Technology, Harderwijk (Netherlands); Staub, R. [BUS-House, Zuerich (Switzerland); Meiring, O.B.; Hut, W.H. [Sauter Building Control Nederland, Amsterdam (Netherlands); Tukker, A. [Webeasy Products, Sliedrecht (Netherlands); Bakker, L.G.; Soethout, L.L.; Elkhuizen, P.A. [TNO Bouw en Ondergrond, Delft (Netherlands); Haeseler, U. [TAC GmbH, Berlin (Germany); Kerdel, J.F. [Siemens Building Technologies, Zoetermeer (Netherlands); Lugt, G.L.; Draijer, G.W.

    2007-11-15

    In 20 articles attention is paid to several aspects of network protocols by means of which building automation systems can exchange data: building automation and management, the history of technical installation management, the open communication standard BACnet (Building Automation and Control network), the so-called ISO/IEC domotics and communication standard KNX or Konnex, the integration of electrotechnical and mechanical installations by means of the LonWorks technology, other standard protocols such as Modbus, M-bus and OPC (OLE for Process Control), an outline of TCP/IP, smart design of networks, building owners' views on automation and networks, the use of BACnet and Ethernet in a renovated office building, the use of an open management network in buildings, wireless open integrated systems, terminology in network communication, the use of BACnet in combination with KNX, the impact of BACnet on building automation, the role of the installation sector in the ICT environment, knowledge of building automation and management, regulations with respect to building automation, and BACnet MS/TP (Master-Slave/Token-Passing).

  19. Hanford environmental analytical methods: Methods as of March 1990

    International Nuclear Information System (INIS)

    Goheen, S.C.; McCulloch, M.; Daniel, J.L.

    1993-05-01

    This paper from the analytical laboratories at Hanford describes the method used to measure pH of single-shell tank core samples. Sludge or solid samples are mixed with deionized water. The pH electrode used combines both a sensor and reference electrode in one unit. The meter amplifies the input signal from the electrode and displays the pH visually

  20. PAAPPAS community trial protocol: a randomized study of obesity prevention for adolescents combining school with household intervention

    Directory of Open Access Journals (Sweden)

    Michele R. Sgambato

    2016-08-01

    Full Text Available Abstract Background The prevalence of childhood obesity is increasing at a high rate in Brazil, making prevention a health priority. Schools are the central focus of interventions aiming at the prevention and treatment of childhood obesity; however, randomized trials and cohort studies have not yet provided clear evidence of strategies to reduce the prevalence of obesity. The aim of this study is to present a protocol to evaluate the efficacy of combining school- and household-level interventions to reduce excessive weight gain among students. Methods The intervention targets fifth and sixth graders from 18 public schools (9 intervention and 9 control) in the municipality of Duque de Caxias, in the metropolitan area of Rio de Janeiro, Brazil. A sample of 2500 students will be evaluated at school for their weight status, and those from the intervention group who are overweight or obese will be followed monthly at home by community health agents. Demographic, socioeconomic, anthropometric, eating behavior and food consumption data will be collected at school using a standardized questionnaire programmed on a personal digital assistant. At school, all students from the intervention group will be encouraged to change eating habits and food consumption, to increase physical activity and to reduce sedentary behavior. Discussion This study will provide evidence on whether integrating schools with primary health care can prevent excessive weight gain among adolescents. Positive results will inform a sustainable strategy to be disseminated in the health care system in Brazil. Trial registration ClinicalTrials.gov, NCT02711488. Date of registration: March 11, 2016.

  1. Security bound of two-basis quantum-key-distribution protocols using qudits

    International Nuclear Information System (INIS)

    Nikolopoulos, Georgios M.; Alber, Gernot

    2005-01-01

    We investigate the security bounds of quantum-cryptographic protocols using d-level systems. In particular, we focus on schemes that use two mutually unbiased bases, thus extending the Bennett-Brassard 1984 quantum-key-distribution scheme to higher dimensions. Under the assumption of general coherent attacks, we derive an analytic expression for the ultimate upper security bound of such quantum-cryptography schemes. This bound is well below the predictions of optimal cloning machines. The possibility of extraction of a secret key beyond entanglement distillation is discussed. In the case of qutrits we argue that any eavesdropping strategy is equivalent to a symmetric one. For higher dimensions such an equivalence is generally no longer valid

  2. Compact tokamak reactors. Part 1 (analytic results)

    International Nuclear Information System (INIS)

    Wootton, A.J.; Wiley, J.C.; Edmonds, P.H.; Ross, D.W.

    1996-01-01

    We discuss the possible use of tokamaks for thermonuclear power plants, in particular tokamaks with low aspect ratio and copper toroidal field coils. Three approaches are presented. First we review and summarize the existing literature. Second, using simple analytic estimates, the size of the smallest tokamak to produce an ignited plasma is derived. This steady state energy balance analysis is then extended to determine the smallest tokamak power plant, by including the power required to drive the toroidal field, and considering two extremes of plasma current drive efficiency. The analytic results will be augmented by a numerical calculation which permits arbitrary plasma current drive efficiency; the results of which will be presented in Part II. Third, a scaling from any given reference reactor design to a copper toroidal field coil device is discussed. Throughout the paper the importance of various restrictions is emphasized, in particular plasma current drive efficiency, plasma confinement, plasma safety factor, plasma elongation, plasma beta, neutron wall loading, blanket availability and recirculating electric power. We conclude that the latest published reactor studies, which show little advantage in using low aspect ratio unless remarkably high efficiency plasma current drive and low safety factor are combined, can be reproduced with the analytic model
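
    The flavour of such steady-state energy-balance estimates can be conveyed by the textbook D-T ignition condition expressed as a triple product, n·T·τE of roughly 3×10²¹ keV·s·m⁻³. The sketch below is only that rough screen with illustrative inputs; the paper's sizing analysis accounts for current drive, recirculating power and the other constraints listed above.

```python
# Sketch: the textbook D-T ignition condition expressed as a triple product,
# n*T*tau_E >~ 3e21 keV*s/m^3 -- a rough screen, far cruder than the paper's
# analytic sizing arguments. Inputs are illustrative, not a design point.
TRIPLE_PRODUCT_IGNITION = 3e21     # keV * s / m^3, approximate D-T threshold

def ignition_margin(density_m3, temperature_kev, tau_e_s):
    return density_m3 * temperature_kev * tau_e_s / TRIPLE_PRODUCT_IGNITION

# e.g. n = 1e20 m^-3, T = 15 keV, tau_E = 2 s
print(f"margin = {ignition_margin(1e20, 15.0, 2.0):.2f}  (>1 means above the rough threshold)")
```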

  3. Two-Layer Hierarchy Optimization Model for Communication Protocol in Railway Wireless Monitoring Networks

    Directory of Open Access Journals (Sweden)

    Xiaoping Ma

    2018-01-01

    Full Text Available Wireless monitoring systems in railways are often compromised by the limited energy of their sensors. Hence, how to optimize the communication protocol and extend the system lifetime is crucial to ensure the stability of the system. However, the existing studies focused primarily on cluster-based or multihop protocols individually, which are ineffective in coping with the complex communication scenarios in the railway wireless monitoring system (RWMS). This study proposes a hybrid protocol which combines the cluster-based and multihop protocols (CMCP) to minimize and balance the energy consumption in different sections of the RWMS. In the first hierarchy, the total energy consumption is minimized by optimizing the cluster quantities in the cluster-based protocol and the number of hops and the corresponding hop distances in the multihop protocol. In the second hierarchy, the energy consumption is balanced through rotating the cluster head (CH) in the subnetworks and further optimizing the hops and the corresponding hop distances in the backbone network. On this basis, the system lifetime is maximized with minimum and balanced energy consumption among the sensors. Furthermore, hybrid particle swarm optimization and genetic algorithm (PSO-GA) is adopted to optimize the energy consumption across the two-layer hierarchy. Finally, the effectiveness of the proposed CMCP is verified in simulation. The performance of the proposed CMCP in terms of system lifetime, residual energy, and the corresponding variance is superior to that of the LEACH protocol widely applied in previous research. The effective protocol proposed in this study can facilitate the application of wireless monitoring networks in the railway system and enhance the safe operation of the railway.
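
    The energy trade-off the two hierarchies exploit is usually reasoned about with the first-order radio model, in which each transmitted bit costs electronics energy plus an amplifier term growing with distance. The sketch below uses the customary illustrative parameter values and a fixed route; it does not perform the paper's PSO-GA optimisation or cluster-head rotation.

```python
# Sketch: the first-order radio model often used to reason about cluster-based
# versus multihop energy costs (E_elec per bit for electronics, eps_fs*d^2 for
# the free-space amplifier). Parameter values are the usual illustrative ones,
# not the paper's; CMCP additionally optimises hop counts and rotates cluster
# heads, which this sketch does not do.
E_ELEC = 50e-9        # J/bit
EPS_FS = 10e-12       # J/bit/m^2 (free-space amplifier)

def e_tx(bits, d):
    return bits * (E_ELEC + EPS_FS * d**2)

def e_rx(bits):
    return bits * E_ELEC

def multihop(bits, d_total, hops):
    d_hop = d_total / hops
    # every hop transmits; intermediate nodes also receive
    return hops * e_tx(bits, d_hop) + (hops - 1) * e_rx(bits)

bits, d = 4000, 400.0
for h in (1, 2, 4, 8):
    print(f"{h} hop(s) over {d:.0f} m: {multihop(bits, d, h)*1e3:.3f} mJ")
```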

  4. Successful treatment of a large oral verrucous hyperplasia with photodynamic therapy combined with cryotherapy

    Directory of Open Access Journals (Sweden)

    Yu-Chao Chang

    2013-03-01

    Full Text Available Studies have shown that topical 5-aminolevulinic acid-mediated photodynamic therapy (ALA-PDT) can be used successfully for the treatment of oral verrucous hyperplasia (OVH). Studies have also demonstrated that cryotherapy could be used as a treatment modality for OVH lesions. In this case report, we tested the efficacy of topical ALA-PDT, combined with cryogun cryotherapy, for an extensive OVH lesion on the right buccal mucosa of a 65-year-old male areca quid chewer. The tumor was cleared after six treatments of combined topical ALA-PDT and cryogun cryotherapy. No recurrence of the lesion was found after a follow-up period of 18 months. We suggest that our combined treatment protocol may be effective in treating OVH lesions. The treatment course may be slightly shortened with this combined protocol and was well tolerated by the patient.

  5. Deep Random based Key Exchange protocol resisting unlimited MITM

    OpenAIRE

    de Valroger, Thibault

    2018-01-01

    We present a protocol enabling two legitimate partners sharing an initial secret to mutually authenticate and to exchange an encryption session key. The opponent is an active Man In The Middle (MITM) with unlimited computation and storage capacities. The resistance to unlimited MITM is obtained through the combined use of Deep Random secrecy, formerly introduced and proved as unconditionally secure against passive opponent for key exchange, and universal hashing techniques. We prove the resis...

  6. Cryptographic Protocols:

    DEFF Research Database (Denmark)

    Geisler, Martin Joakim Bittel

    cryptography was thus concerned with message confidentiality and integrity. Modern cryptography covers a much wider range of subjects, including the area of secure multiparty computation, which will be the main topic of this dissertation. Our first contribution is a new protocol for secure comparison, presented...... implemented the comparison protocol in Java and benchmarks show that it is highly competitive and practical. The biggest contribution of this dissertation is a general framework for secure multiparty computation. Instead of making new ad hoc implementations for each protocol, we want a single and extensible...... in Chapter 2. Comparisons play a key role in many systems such as online auctions and benchmarks — it is not unreasonable to say that when parties come together for a multiparty computation, it is because they want to make decisions that depend on private information. Decisions depend on comparisons. We have...

  7. Fiber Laser Component Testing for Space Qualification Protocol Development

    Science.gov (United States)

    Falvey, S.; Buelow, M.; Nelson, B.; Starcher, Y.; Thienel, L.; Rhodes, C.; Tull, Jackson; Drape, T.; Westfall, C.

    A test protocol for the space qualifying of Ytterbium-doped diode-pumped fiber laser (DPFL) components was developed under the Bright Light effort, sponsored by AFRL/VSE. A literature search was performed and summarized in an AMOS 2005 conference paper that formed the building blocks for the development of the test protocol. The test protocol was developed from the experience of the Bright Light team, the information in the literature search, and the results of a study of the Telcordia standards. Based on this protocol developed, test procedures and acceptance criteria for a series of vibration, thermal/vacuum, and radiation exposure tests were developed for selected fiber laser components. Northrop Grumman led the effort in vibration and thermal testing of these components at the Aerospace Engineering Facility on Kirtland Air Force Base, NM. The results of the tests conducted have been evaluated. This paper discusses the vibration and thermal testing that was executed to validate the test protocol. The lessons learned will aid in future assessments and definition of space qualification protocols. Components representative of major items within a Ytterbium-doped diode-pumped fiber laser were selected for testing; including fibers, isolators, combiners, fiber Bragg gratings, and laser diodes. Selection of the components was based on guidelines to test multiple models of typical fiber laser components. A goal of the effort was to test two models (i.e. different manufacturers) of each type of article selected, representing different technologies for the same type of device. The test articles did not include subsystems or systems. These components and parts may not be available commercial-off-the-shelf (COTS), and, in fact, many are custom articles, or newly developed by the manufacturer. The primary goal for this effort is a completed taxonomy that lists all relevant laser components, modules, subsystems, and interfaces, and cites the documentation for space

  8. An improved method of studying user-system interaction by combining transaction log analysis and protocol analysis

    Directory of Open Access Journals (Sweden)

    Jillian R. Griffiths

    2002-01-01

    Full Text Available The paper reports a novel approach to studying user-system interaction that captures a complete record of the searcher's actions, the system responses and synchronised talk-aloud comments from the searcher. The data is recorded unobtrusively and is available for later analysis. The approach is set in context by a discussion of transaction logging and protocol analysis and examples of the search logging in operation are presented

  9. A new approach combining analytical methods for workplace exposure assessment of inhalable multi-walled carbon nanotubes

    NARCIS (Netherlands)

    Tromp, P.C.; Kuijpers, E.; Bekker, C.; Godderis, L.; Lan, Q.; Jedynska, A.D.; Vermeulen, R.; Pronk, A.

    2017-01-01

    To date there is no consensus about the most appropriate analytical method for measuring carbon nanotubes (CNTs), hampering the assessment and limiting the comparison of data. The goal of this study is to develop an approach for the assessment of the level and nature of inhalable multi-wall CNTs

  10. A Very Low Power MAC (VLPM Protocol for Wireless Body Area Networks

    Directory of Open Access Journals (Sweden)

    Kyung Sup Kwak

    2011-03-01

    Full Text Available Wireless Body Area Networks (WBANs) consist of a limited number of battery operated nodes that are used to monitor the vital signs of a patient over long periods of time without restricting the patient’s movements. They are an easy and fast way to diagnose the patient’s status and to consult the doctor. Device as well as network lifetime are among the most important factors in a WBAN. Prolonging the lifetime of the WBAN strongly depends on controlling the energy consumption of sensor nodes. To achieve energy efficiency, low duty cycle MAC protocols are used, but for medical applications, especially in the case of pacemakers where data have time-limited relevance, these protocols increase latency, which is highly undesirable and leads to system instability. In this paper, we propose a low power MAC protocol (VLPM) based on existing wake-up radio approaches, which reduces energy consumption as well as improving the response time of a node. We categorize the traffic into uplink and downlink traffic. The nodes are equipped with both a low power wake-up transmitter and receiver. The low power wake-up receiver monitors the channel activity at all times with very low power and keeps the MCU (Micro Controller Unit) along with the main radio in sleep mode. When a node [BN or BNC (BAN Coordinator)] wants to communicate with another node, it uses the low-power radio to send a wake-up packet, which will prompt the receiver to power up its primary radio to listen for the message that follows shortly. The wake-up packet contains the desired node’s ID along with some other information to let the targeted node wake up and take part in the communication and let all other nodes quickly go back to sleep mode. The VLPM protocol is proposed for applications having low traffic conditions. For high traffic rates, optimization is needed. Analytical results show that the proposed protocol outperforms both synchronized and unsynchronized MAC protocols like T-MAC, SCP-MAC, B
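
    The wake-up sequence described above (an always-on, low-power receiver that filters wake-up packets by node ID before powering up the MCU and main radio) can be sketched as simple event-driven logic. The sketch below is illustrative only; the class and field names (WakeupPacket, target_id, etc.) are assumptions, not part of the VLPM specification.

```python
# Minimal sketch of the wake-up-radio idea described in the VLPM abstract.
# All names (WakeupPacket, Node, ...) are illustrative; the actual protocol
# defines its own packet formats and state machine.
from dataclasses import dataclass

@dataclass
class WakeupPacket:
    target_id: int      # ID of the node that should wake up
    info: str = ""      # any extra control information

class Node:
    def __init__(self, node_id: int):
        self.node_id = node_id
        self.main_radio_on = False   # MCU and main radio start in sleep mode

    def on_wakeup_packet(self, pkt: WakeupPacket) -> None:
        """Handler running on the always-on, low-power wake-up receiver."""
        if pkt.target_id == self.node_id:
            # Targeted node: power up the MCU and main radio to receive the
            # data message that follows shortly.
            self.main_radio_on = True
        else:
            # All other nodes stay in (or quickly return to) sleep mode.
            self.main_radio_on = False

# Example: the BAN coordinator wakes node 3 only.
nodes = [Node(i) for i in range(1, 5)]
packet = WakeupPacket(target_id=3)
for n in nodes:
    n.on_wakeup_packet(packet)
print([(n.node_id, n.main_radio_on) for n in nodes])
```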

  11. Enabling analytics on sensitive medical data with secure multi-party computation

    NARCIS (Netherlands)

    M. Veeningen (Meilof); S. Chatterjea (Supriyo); A.Z. Horváth (Anna Zsófia); G. Spindler (Gerald); E. Boersma (Eric); P. van der Spek (Peter); O. van der Galiën (Onno); J. Gutteling (Job); W. Kraaij (Wessel); P.J.M. Veugen (Thijs)

    2018-01-01

    textabstractWhile there is a clear need to apply data analytics in the healthcare sector, this is often difficult because it requires combining sensitive data from multiple data sources. In this paper, we show how the cryptographic technique of secure multiparty computation can enable such data

  12. A big data geospatial analytics platform - Physical Analytics Integrated Repository and Services (PAIRS)

    Science.gov (United States)

    Hamann, H.; Jimenez Marianno, F.; Klein, L.; Albrecht, C.; Freitag, M.; Hinds, N.; Lu, S.

    2015-12-01

    A major challenge in leveraging big geospatial data sets is the ability to quickly integrate multiple data sources into physical and statistical models and to run these models in real time. A geospatial data platform called Physical Analytics Integrated Repository and Services (PAIRS) has been developed on top of an open source hardware and software stack to manage terabytes of data. A new data interpolation and re-gridding scheme is implemented in which any geospatial data layer can be associated with a set of global grids whose resolution doubles for consecutive layers. Each pixel on the PAIRS grid has an index that is a combination of location and time stamp. The indexing allows quick access to data sets that are part of global data layers and makes it possible to retrieve only the data of interest. PAIRS takes advantage of a parallel processing framework (Hadoop) in a cloud environment to digest, curate, and analyze the data sets while being very robust and stable. The data are stored in a distributed NoSQL database (HBase) across multiple servers; data upload and retrieval are parallelized, with the original analytics task broken up into smaller areas/volumes, analyzed independently, and then reassembled for the original geographical area. The differentiating aspect of PAIRS is the ability to accelerate model development across large geographical regions and spatial resolutions ranging from 0.1 m up to hundreds of kilometers. System performance is benchmarked on real-time automated data ingestion and retrieval of MODIS and Landsat data layers. The data layers are curated for sensor error, verified for correctness, and analyzed statistically to detect local anomalies. Multi-layer queries enable PAIRS to filter different data
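
    The combined location/time pixel index described in the abstract can be illustrated with a small sketch. The key layout below is an assumption chosen purely for illustration; the actual PAIRS/HBase row-key format is not given in the record.

```python
# Illustrative sketch of a combined location/time pixel key of the kind the
# PAIRS abstract describes.  The actual PAIRS/HBase row-key layout is not
# given in the record; this encoding is an assumption for illustration only.
from datetime import datetime, timezone

def pixel_key(layer_level: int, lat: float, lon: float, ts: datetime,
              resolution_deg: float) -> str:
    """Quantize a lat/lon to the layer's grid and append a time stamp."""
    row = int((lat + 90.0) / resolution_deg)
    col = int((lon + 180.0) / resolution_deg)
    return f"{layer_level:02d}:{row:08d}:{col:08d}:{int(ts.timestamp())}"

# Consecutive layers double the grid resolution, as described in the abstract.
base_resolution = 1.0  # degrees, illustrative
when = datetime(2015, 7, 1, tzinfo=timezone.utc)
for level in range(4):
    res = base_resolution / (2 ** level)
    print(level, res, pixel_key(level, 40.7, -74.0, when, res))
```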

  13. Combined ESR and U-series isochron dating of fossil tooth from Longgupo cave

    International Nuclear Information System (INIS)

    Han Fei; Yin Gongming; Liu Chunru; Jean-Jacques Bahain

    2012-01-01

    Background: In ESR and luminescence archaeological dating, the assessment of the external radiation dose rate is one of the constant sources of uncertainty, because it varied over past time and cannot be determined accurately using present-day measurements. Purpose: The ESR isochron protocol was proposed to resolve this uncertainty for tooth samples. This protocol is applicable wherever multiple samples with different internal doses have all experienced a common external dose. The variable uranium concentration of tooth samples makes it possible to plot the equivalent dose versus the internal dose rate of each sample, and the slope of the isochron line hence gives the age. For isochron dating of teeth, combined ESR/U-series dating analysis must be done together with the isochron protocol. Methods: In this study, we applied the combined ESR/U-series isochron method to 5 tooth samples collected from an immediately adjacent square in layer C Ⅲ'6 of the Longgupo archaeological site, Chongqing, China. Combined ESR/U-series analysis with the in situ external dose rate shows recent uranium uptake in all the samples. Results: The time-averaged external dose rate was iteratively calculated by the isochron protocol and gives an isochron age of 1.77±0.09 Ma for layer C Ⅲ'6, which is consistent, within error, with the mean US-ESR age of the 5 samples (1.64+0.16/-0.21 Ma). The calculated time-averaged external dose rate (∼807 μGy/a) was basically in agreement with the in situ measured gamma dose rate value (8.50 μGy/a) in 2006, indicating that geochemical alterations may not have occurred or did not appreciably affect the environmental dose rate during the burial history. Conclusions: This study indicates the potential of solving both the internal and external dose rate problems of ESR dating of fossil teeth by combining it with U-series analysis and the isochron protocol. (authors)
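
    In its simplest form, the isochron idea described above reduces to a straight-line fit: for teeth sharing a common external dose rate, the equivalent dose is D_E = T*(D_int + D_ext), where T is the age and D_int, D_ext are the internal and external dose rates, so plotting D_E against D_int gives a line whose slope is T and whose intercept is T*D_ext. The sketch below uses invented numbers and ignores the iterative uranium-uptake modelling of the full US-ESR procedure.

```python
# Simplified sketch of the isochron fit described above; all numbers are
# invented and the iterative uranium-uptake modelling of the full US-ESR
# procedure is deliberately ignored.
import numpy as np

rng = np.random.default_rng(1)
internal_rate = np.array([200.0, 450.0, 700.0, 950.0, 1200.0])  # uGy/a, invented
true_age_ma = 1.77                                              # Ma
true_ext_rate = 807.0                                           # uGy/a
# 1 uGy/a accumulated over 1 Ma equals 1 Gy, so D_E [Gy] = age [Ma] * rate [uGy/a]:
d_e = true_age_ma * (internal_rate + true_ext_rate) + rng.normal(0, 20, 5)

slope, intercept = np.polyfit(internal_rate, d_e, 1)  # the isochron line
print(f"isochron age       ~ {slope:.2f} Ma")
print(f"external dose rate ~ {intercept / slope:.0f} uGy/a")
```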

  14. Analytic theory of curvature effects for wave problems with general boundary conditions

    DEFF Research Database (Denmark)

    Willatzen, Morten; Gravesen, Jens; Voon, L. C. Lew Yan

    2010-01-01

    A formalism based on a combination of differential geometry and perturbation theory is used to obtain analytic expressions for confined eigenmode changes due to general curvature effects. In cases of circular-shaped and helix-shaped structures, where alternative analytic solutions can be found......, the perturbative solution is shown to yield the same result. The present technique allows the generalization of earlier results to arbitrary boundary conditions. The power of the method is illustrated using examples based on Maxwell’s and Schrödinger’s equations for applications in photonics and nanoelectronics....

  15. New methodology for analytical calculation of resonance integrals in an heterogeneous medium

    International Nuclear Information System (INIS)

    Campos, T.P.R. de; Martinez, A.S.

    1986-01-01

    A new methodology for the analytical calculation of the Resonance Integral in a typical fuel cell is presented. The expression obtained for the Resonance Integral presents the advantage of being analytical. Its constituent terms are combinations of the well-known function J(ξ,β) and its partial derivatives with respect to β. This is a general expression for all types of resonance. The parameters used in this method depend on the resonance type and are obtained as a function of the parameter lambda. A simple expression, depending on the resonance parameters, is proposed for this variable. (Author) [pt
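
    For reference, the J function referred to here is, in the standard resonance-integral literature, usually defined through the Doppler broadening function ψ. The definitions below are the commonly quoted ones and are not reproduced from the paper itself.

```latex
% Commonly used definitions (standard literature, not taken from the paper):
J(\xi,\beta) = \int_{0}^{\infty} \frac{\psi(\xi,x)}{\beta + \psi(\xi,x)}\, dx,
\qquad
\psi(\xi,x) = \frac{\xi}{2\sqrt{\pi}} \int_{-\infty}^{\infty}
  \frac{\exp\!\left[-\tfrac{1}{4}\xi^{2}(x-y)^{2}\right]}{1+y^{2}}\, dy .
```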

  16. Effectiveness of couch height-based patient set-up and an off-line correction protocol in prostate cancer radiotherapy

    International Nuclear Information System (INIS)

    Lin, Emile N.J.Th. van; Nijenhuis, Edwin; Huizenga, Henk; Vight, Lisette van der; Visser, Andries

    2001-01-01

    Purpose: To investigate the set-up improvement achieved by applying a couch height-based patient set-up method in combination with a technologist-driven off-line correction protocol in nonimmobilized radiotherapy of prostate patients. Methods and Materials: A three-dimensional shrinking action level correction protocol is applied in two consecutive patient cohorts with different set-up methods: the traditional 'laser set-up' group (n=43) and the 'couch height set-up' group (n=112). For all directions, left-right, ventro-dorsal, and cranio-caudal, random and systematic set-up deviations were measured. Results: The couch height set-up method improves patient positioning compared to the laser set-up method. Without application of the correction protocol, systematic and random errors were reduced to 2.2-2.4 mm (1 SD) and 1.7-2.2 mm (1 SD), respectively. By using the correction protocol, systematic errors were further reduced to 1.3-1.6 mm (1 SD). One-dimensional deviations were within 5 mm for >90% of the measured fractions. The required number of corrections per patient in the off-line correction protocol was reduced significantly during the course of treatment, from 1.1 to 0.6, by the couch height set-up method. The treatment time was not prolonged by application of the correction protocol. Conclusions: The couch height set-up method improves the set-up significantly, especially in the ventro-dorsal direction. Combination of this set-up method with an off-line correction strategy, executed by technologists, reduces the number of set-up corrections required

  17. Performance Analysis of Selective Decode-and-Forward Multinode Incremental Relaying with Maximal Ratio Combining

    KAUST Repository

    Hadjtaieb, Amir

    2013-09-12

    In this paper, we propose an incremental multinode relaying protocol with arbitrary N relay nodes that allows an efficient use of the channel spectrum. The destination combines the received signals from the source and the relays using maximal ratio combining (MRC). The transmission ends successfully once the accumulated signal-to-noise ratio (SNR) exceeds a predefined threshold. The number of relays participating in the transmission is adapted to the channel conditions based on the feedback from the destination. The use of incremental relaying allows obtaining a higher spectral efficiency. Moreover, the symbol error probability (SEP) performance is enhanced by using MRC at the relays. The use of MRC at the relays implies that each relay overhears the signals from the source and all previous relays and combines them using MRC. The proposed protocol differs from most existing relaying protocols in that it combines both incremental relaying and MRC at the relays for a multinode topology. Our analyses for a decode-and-forward mode show that: (i) compared to existing multinode relaying schemes, the proposed scheme can essentially achieve the same SEP performance but with a smaller average number of time slots, (ii) compared to schemes without MRC at the relays, the proposed scheme can approximately achieve a 3 dB gain.
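
    The stopping rule described above (relays are activated one at a time and the destination MRC-combines branches until the accumulated SNR crosses a threshold) can be sketched as follows. Under MRC the combined SNR is the sum of the per-branch SNRs; the function name and the numbers below are illustrative only.

```python
# Sketch of the incremental-relaying stopping rule described in the abstract:
# the destination MRC-combines the source signal and successive relay signals,
# stopping as soon as the accumulated SNR exceeds a threshold.  Under MRC the
# combined SNR is the sum of the per-branch SNRs.  All values are illustrative.
def incremental_relaying(source_snr, relay_snrs, threshold):
    """Return (accumulated SNR, number of relays actually used)."""
    accumulated = source_snr            # direct source-to-destination branch
    relays_used = 0
    for snr in relay_snrs:              # relays are activated one at a time
        if accumulated >= threshold:
            break                       # destination feeds back "enough"
        accumulated += snr              # MRC: branch SNRs add up
        relays_used += 1
    return accumulated, relays_used

acc, used = incremental_relaying(source_snr=2.0,
                                 relay_snrs=[1.5, 3.0, 2.2, 0.8],
                                 threshold=6.0)
print(f"accumulated SNR = {acc:.1f}, relays used = {used}")
```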

  18. Multimode Communication Protocols Enabling Reconfigurable Radios

    Directory of Open Access Journals (Sweden)

    Berlemann Lars

    2005-01-01

    Full Text Available This paper focuses on the realization and application of a generic protocol stack for reconfigurable wireless communication systems. This focus extends the field of software-defined radios which usually concentrates on the physical layer. The generic protocol stack comprises common protocol functionality and behavior which are extended through specific parts of the targeted radio access technology. This paper considers parameterizable modules of basic protocol functions residing in the data link layer of the ISO/OSI model. System-specific functionality of the protocol software is realized through adequate parameterization and composition of the generic modules. The generic protocol stack allows an efficient realization of reconfigurable protocol software and enables a completely reconfigurable wireless communication system. It is a first step from side-by-side realized, preinstalled modes in a terminal towards a dynamic reconfigurable anymode terminal. The presented modules of the generic protocol stack can also be regarded as a toolbox for the accelerated and cost-efficient development of future communication protocols.

  19. Ancestors protocol for scalable key management

    Directory of Open Access Journals (Sweden)

    Dieter Gollmann

    2010-06-01

    Full Text Available Group key management is an important functional building block for any secure multicast architecture; it has therefore been extensively studied in the literature. The main proposed protocol is Adaptive Clustering for Scalable Group Key Management (ASGK). According to the ASGK protocol, the multicast group is divided into clusters, where each cluster consists of areas of members. Each cluster uses its own Traffic Encryption Key (TEK). These clusters are updated periodically depending on the dynamism of the members during the secure session. A modified protocol has been proposed based on ASGK, with some modifications to balance the number of affected members and the encryption/decryption overhead for any number of areas when a member joins or leaves the group. This modified protocol is called the Ancestors protocol. According to the Ancestors protocol, every area receives the dynamism of the members from its parents. The main objective of the modified protocol is to reduce the number of affected members when members leave or join, so that the 1-affects-n overhead is reduced. A comparative study has been done between the ASGK protocol and the modified protocol. According to the comparative results, the modified protocol always outperforms the ASGK protocol.

  20. Recent developments in computer vision-based analytical chemistry: A tutorial review.

    Science.gov (United States)

    Capitán-Vallvey, Luis Fermín; López-Ruiz, Nuria; Martínez-Olmos, Antonio; Erenas, Miguel M; Palma, Alberto J

    2015-10-29

    Chemical analysis based on colour changes recorded with imaging devices is gaining increasing interest. This is due to its several significant advantages, such as simplicity of use, and the fact that it is easily combinable with portable and widely distributed imaging devices, resulting in friendly analytical procedures in many areas that demand out-of-lab applications for in situ and real-time monitoring. This tutorial review covers computer vision-based analytical (CVAC) procedures and systems from 2005 to 2015, a period of time when 87.5% of the papers on this topic were published. The background regarding colour spaces and recent analytical system architectures of interest in analytical chemistry is presented in the form of a tutorial. Moreover, issues regarding images, such as the influence of illuminants, and the most relevant techniques for processing and analysing digital images are addressed. Some of the most relevant applications are then detailed, highlighting their main characteristics. Finally, our opinion about future perspectives is discussed. Copyright © 2015 Elsevier B.V. All rights reserved.

  1. Research Data Alliance: Understanding Big Data Analytics Applications in Earth Science

    Science.gov (United States)

    Riedel, Morris; Ramachandran, Rahul; Baumann, Peter

    2014-01-01

    The Research Data Alliance (RDA) enables data to be shared across barriers through focused working groups and interest groups, formed of experts from around the world - from academia, industry and government. Its Big Data Analytics (BDA) interest group seeks to develop community-based recommendations on feasible data analytics approaches to address the scientific community's needs for utilizing large quantities of data. BDA seeks to analyze different scientific domain applications (e.g. earth science use cases) and their potential use of various big data analytics techniques. These techniques range from hardware deployment models to various algorithms (e.g. machine learning algorithms such as support vector machines for classification). A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. This contribution will outline initial parts of such a classification and recommendations in the specific context of the field of Earth Sciences. The lessons learned and experiences presented are based on a survey of use cases, and a few use cases are also discussed in detail.

  2. Towards Personalized Medicine: Leveraging Patient Similarity and Drug Similarity Analytics

    Science.gov (United States)

    Zhang, Ping; Wang, Fei; Hu, Jianying; Sorrentino, Robert

    2014-01-01

    The rapid adoption of electronic health records (EHR) provides a comprehensive source for exploratory and predictive analytics to support clinical decision-making. In this paper, we investigate how to utilize EHR to tailor treatments to individual patients based on their likelihood to respond to a therapy. We construct a heterogeneous graph which includes two domains (patients and drugs) and encodes three relationships (patient similarity, drug similarity, and patient-drug prior associations). We describe a novel approach for performing a label propagation procedure to spread the label information representing the effectiveness of different drugs for different patients over this heterogeneous graph. The proposed method has been applied to a real-world EHR dataset to help identify personalized treatments for hypercholesterolemia. The experimental results demonstrate the effectiveness of the approach and suggest that the combination of appropriate patient similarity and drug similarity analytics could lead to actionable insights for personalized medicine. Particularly, by leveraging drug similarity in combination with patient similarity, our method could perform well even on new or rarely used drugs for which there are few records of known past performance. PMID:25717413
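
    The label propagation step described above can be illustrated in miniature: known effectiveness labels are spread over a similarity-weighted graph until convergence. The update rule F <- alpha*S*F + (1-alpha)*Y and the tiny matrices below follow the generic label-propagation literature and are assumptions, not the paper's exact formulation.

```python
# Miniature of generic graph label propagation of the kind described in the
# abstract: known drug-effectiveness labels (Y) are spread over a similarity
# graph (W) until convergence.  The update F <- alpha*S*F + (1-alpha)*Y follows
# the standard label-propagation literature, not necessarily the paper's exact
# formulation; the tiny graph and labels are invented.
import numpy as np

W = np.array([[0, 1, 1, 0],          # similarity graph over 4 patient/drug nodes
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)
Y = np.array([1.0, 0.0, 0.0, 0.0])   # node 0 carries a known "effective" label

degree = W.sum(axis=1)
S = W / np.sqrt(np.outer(degree, degree))   # symmetric normalization
alpha, F = 0.8, Y.copy()
for _ in range(100):                        # iterate to (near) convergence
    F = alpha * S @ F + (1 - alpha) * Y

print(np.round(F, 3))                       # propagated effectiveness scores
```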

  3. Summarizing documentation of the laboratory automation system RADAR for the analytical services of a nuclear fuel reprocessing facility

    International Nuclear Information System (INIS)

    Brandenburg, G.; Brocke, W.; Brodda, B.G.; Buerger, K.; Halling, H.; Heer, H.; Puetz, K.; Schaedlich, W.; Watzlawik, K.H.

    1981-12-01

    The essential tasks of the system are on-line open-loop process control based on in-line measurements and automation of the off-line analytical laboratory. The in-line measurements (at 55 tanks of the chemical process area) provide density, liquid-level, and temperature values. The concentration value of a single component may easily be determined if the solution consists of no more than two phases. The automation of the off-line analytical laboratory comprises laboratory organization, including sample management and data organization, as well as computer-aided sample transportation control, data acquisition and data processing at chemical and nuclear analytical devices. The computer system consists of two subsystems: a front-end system for central sample registration and in-line process control, and a central system for the off-line analytical tasks. The organization of the application-oriented system uses a centralized data base. Similar data processing functions concerning different analytical management tasks are structured into the following subsystems: man-machine interface, interrupt and data acquisition system, data base, protocol service and data processing. The procedures for the laboratory management (organization and experiment sequences) are defined by application data bases. Following the project phases, the engineering requirements, design, assembly, start-up and test run phases are described. In addition, figures on expenditure and experiences are given and the system concept is discussed. (orig./HP) [de

  4. Improved online δ18O measurements of nitrogen- and sulfur-bearing organic materials and a proposed analytical protocol

    Science.gov (United States)

    Qi, H.; Coplen, T.B.; Wassenaar, L.I.

    2011-01-01

    It is well known that N2 in the ion source of a mass spectrometer interferes with the CO background during the δ18O measurement of carbon monoxide. A similar problem arises with the high-temperature conversion (HTC) analysis of nitrogenous O-bearing samples (e.g. nitrates and keratins) to CO for δ18O measurement, where the sample introduces a significant N2 peak before the CO peak, making determination of accurate oxygen isotope ratios difficult. Although using a gas chromatography (GC) column longer than that commonly provided by manufacturers (0.6 m) can improve the efficiency of separation of CO and N2 and using a valve to divert nitrogen and prevent it from entering the ion source of a mass spectrometer improved measurement results, biased δ18O values could still be obtained. A careful evaluation of the performance of the GC separation column was carried out. With optimal GC columns, the δ18O reproducibility of human hair keratins and other keratin materials was better than ±0.15 ‰ (n = 5; for the internal analytical reproducibility), and better than ±0.10 ‰ (n = 4; for the external analytical reproducibility).

  5. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    Energy Technology Data Exchange (ETDEWEB)

    Ekechukwu, A.

    2008-12-17

    This document proposes to provide a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers, and books reviewed is given in Appendix 1. Available validation documents and guides are listed in the appendix; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of validation at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all documents were published in English.

  6. Cross-layer protocol design for QoS optimization in real-time wireless sensor networks

    Science.gov (United States)

    Hortos, William S.

    2010-04-01

    The metrics of quality of service (QoS) for each sensor type in a wireless sensor network can be associated with metrics for multimedia that describe the quality of fused information, e.g., throughput, delay, jitter, packet error rate, information correlation, etc. These QoS metrics are typically set at the highest, or application, layer of the protocol stack to ensure that performance requirements for each type of sensor data are satisfied. Application-layer metrics, in turn, depend on the support of the lower protocol layers: session, transport, network, data link (MAC), and physical. The dependencies of the QoS metrics on the performance of the higher layers of the Open System Interconnection (OSI) reference model of the WSN protocol, together with that of the lower three layers, are the basis for a comprehensive approach to QoS optimization for multiple sensor types in a general WSN model. The cross-layer design accounts for the distributed power consumption along energy-constrained routes and their constituent nodes. Following the author's previous work, the cross-layer interactions in the WSN protocol are represented by a set of concatenated protocol parameters and enabling resource levels. The "best" cross-layer designs to achieve optimal QoS are established by applying the general theory of martingale representations to the parameterized multivariate point processes (MVPPs) for discrete random events occurring in the WSN. Adaptive control of network behavior through the cross-layer design is realized through the parametric factorization of the stochastic conditional rates of the MVPPs. The cross-layer protocol parameters for optimal QoS are determined in terms of solutions to stochastic dynamic programming conditions derived from models of transient flows for heterogeneous sensor data and aggregate information over a finite time horizon. Markov state processes, embedded within the complex combinatorial history of WSN events, are more computationally

  7. Plasma-cavity ringdown spectroscopy for analytical measurement: Progress and prospectives

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Sida; Liu, Wei [Research Center of Analytical Instrumentation, Analytical and Testing Center, College of Chemistry, Sichuan University, Chengdu (China); Zhang, Xiaohe [College of Water Resources and Hydropower, Sichuan University, Chengdu (China); Duan, Yixiang, E-mail: yduan@scu.edu.cn [Research Center of Analytical Instrumentation, Analytical and Testing Center, College of Chemistry, Sichuan University, Chengdu (China)

    2013-07-01

    Plasma-cavity ringdown spectroscopy is a powerful absorption technique for analytical measurement. It combines the inherent advantages of high sensitivity, absolute measurement, and relative insensitivity to light source intensity fluctuations of the cavity ringdown technique with use of plasma as an atomization/ionization source. In this review, we briefly describe the background and principles of plasma-cavity ringdown spectroscopy (CRDS) technology, the instrumental components, and various applications. The significant developments of the plasma sources, lasers, and cavity optics are illustrated. Analytical applications of plasma-CRDS for elemental detection and isotopic measurement in atomic spectrometry are outlined in this review. Plasma-CRDS is shown to have a promising future for various analytical applications, while some further efforts are still needed in fields such as cavity design, plasma source design, instrumental improvement and integration, as well as potential applications in radical and molecular measurements. - Highlights: • Plasma-based cavity ringdown spectroscopy • High sensitivity and high resolution • Elemental and isotopic measurements.

  8. HISTOPATHOLOGICAL AND CYTOLOGICAL ANALYSIS OF TRANSMISSIBLE VENEREAL TUMOR IN DOGS AFTER TWO TREATMENT PROTOCOLS

    Directory of Open Access Journals (Sweden)

    Fabiana Aguena Sales Lapa

    2012-06-01

    Full Text Available The transmissible venereal tumor (TVT) is a contagious neoplasm of round cells that frequently affects dogs. Treatment consists of chemotherapy, with vincristine alone being the most effective; however, the emergence of resistance to this agent, due to multidrug resistance mediated by P-glycoprotein (P-gp), a transporter protein encoded by the MDR1 gene, has led to its association with other drugs. Recent studies demonstrated the antitumoral effect of the avermectins when associated with vincristine in the treatment of some neoplasms. Therefore, the objective of the present study was to compare the effectiveness of the standard treatment of TVT with vincristine alone against combined treatment with vincristine and ivermectin, evaluated through the number of applications of the two protocols and histopathological and cytological analyses of 50 dogs diagnosed with TVT during the period from 2007 to 2010. The combined protocol significantly reduced the number of applications, and the cytological and histopathological findings corroborate the hypothesis that the combination of vincristine and ivermectin promotes faster healing than the use of vincristine alone. Combination treatment with vincristine and ivermectin could in the future be an excellent therapeutic alternative for the treatment of TVT, probably reducing the resistance to vincristine while simultaneously reducing the cost of TVT treatment and promoting a faster recovery of the dog.

  9. Preoperative magnetic resonance imaging protocol for endoscopic cranial base image-guided surgery.

    Science.gov (United States)

    Grindle, Christopher R; Curry, Joseph M; Kang, Melissa D; Evans, James J; Rosen, Marc R

    2011-01-01

    Despite the increasing utilization of image-guided surgery, no radiology protocols for obtaining magnetic resonance (MR) imaging of adequate quality are available in the current literature. At our institution, more than 300 endonasal cranial base procedures including pituitary, extended pituitary, and other anterior skullbase procedures have been performed in the past 3 years. To facilitate and optimize preoperative evaluation and assessment, there was a need to develop a magnetic resonance protocol. Retrospective Technical Assessment was performed. Through a collaborative effort between the otolaryngology, neurosurgery, and neuroradiology departments at our institution, a skull base MR image-guided (IGS) protocol was developed with several ends in mind. First, it was necessary to generate diagnostic images useful for the more frequently seen pathologies to improve work flow and limit the expense and inefficiency of case specific MR studies. Second, it was necessary to generate sequences useful for IGS, preferably using sequences that best highlight that lesion. Currently, at our institution, all MR images used for IGS are obtained using this protocol as part of preoperative planning. The protocol that has been developed allows for thin cut precontrast and postcontrast axial cuts that can be used to plan intraoperative image guidance. It also obtains a thin cut T2 axial series that can be compiled separately for intraoperative imaging, or may be fused with computed tomographic images for combined modality. The outlined protocol obtains image sequences effective for diagnostic and operative purposes for image-guided surgery using both T1 and T2 sequences. Copyright © 2011 Elsevier Inc. All rights reserved.

  10. Epistemic Protocols for Distributed Gossiping

    Directory of Open Access Journals (Sweden)

    Krzysztof R. Apt

    2016-06-01

    Full Text Available Gossip protocols aim at arriving, by means of point-to-point or group communications, at a situation in which all the agents know each other's secrets. We consider distributed gossip protocols which are expressed by means of epistemic logic. We provide an operational semantics of such protocols and set up an appropriate framework to argue about their correctness. Then we analyze specific protocols for complete graphs and for directed rings.
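
    The goal state of gossiping (after a sequence of point-to-point calls, every agent knows every secret) can be illustrated with a tiny simulation. The fixed, centrally chosen call schedule below is purely for illustration; the protocols in the paper are distributed and select calls on the basis of the agents' epistemic state, which is not modelled here.

```python
# Tiny simulation of the gossip goal state described above: in each call the
# two agents pool all the secrets they know, and the run succeeds when every
# agent knows every secret.  The fixed call schedule below is purely for
# illustration; the paper's protocols are distributed and epistemic.
def gossip(n, calls):
    knows = [{i} for i in range(n)]          # agent i initially knows secret i
    for a, b in calls:
        union = knows[a] | knows[b]          # a call exchanges all known secrets
        knows[a] = knows[b] = union
    return knows

# Classic 2n-4 call schedule for n = 4 agents.
result = gossip(4, [(0, 1), (2, 3), (0, 2), (1, 3)])
print(all(k == {0, 1, 2, 3} for k in result))   # True: everyone knows everything
```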

  11. Evaluating Protocol Lifecycle Time Intervals in HIV/AIDS Clinical Trials

    Science.gov (United States)

    Schouten, Jeffrey T.; Dixon, Dennis; Varghese, Suresh; Cope, Marie T.; Marci, Joe; Kagan, Jonathan M.

    2014-01-01

    present for a specific study phase may have been masked by combining protocols into phase groupings. Presence of informative censoring, such as withdrawal of some protocols from development if they began showing signs of lost interest among investigators, complicates interpretation of Kaplan-Meier estimates. Because this study constitutes a retrospective examination over an extended period of time, it does not allow for the precise identification of relative factors impacting timing. Conclusions Delays not only increase the time and cost to complete clinical trials, but they also diminish their usefulness by failing to answer research questions in time. We believe that research analyzing the time spent traversing defined intervals across the clinical trial protocol development and implementation continuum can stimulate business process analyses and reengineering efforts that could lead to reductions in the time from clinical trial concept to results, thereby accelerating progress in clinical research. PMID:24980279

  12. Forecasting Hotspots-A Predictive Analytics Approach.

    Science.gov (United States)

    Maciejewski, R; Hafen, R; Rudolph, S; Larew, S G; Mitchell, M A; Cleveland, W S; Ebert, D S

    2011-04-01

    Current visual analytics systems provide users with the means to explore trends in their data. Linked views and interactive displays provide insight into correlations among people, events, and places in space and time. Analysts search for events of interest through statistical tools linked to visual displays, drill down into the data, and form hypotheses based upon the available information. However, current systems stop short of predicting events. In spatiotemporal data, analysts are searching for regions of space and time with unusually high incidences of events (hotspots). In the cases where hotspots are found, analysts would like to predict how these regions may grow in order to plan resource allocation and preventative measures. Furthermore, analysts would also like to predict where future hotspots may occur. To facilitate such forecasting, we have created a predictive visual analytics toolkit that provides analysts with linked spatiotemporal and statistical analytic views. Our system models spatiotemporal events through the combination of kernel density estimation for event distribution and seasonal trend decomposition by loess smoothing for temporal predictions. We provide analysts with estimates of error in our modeling, along with spatial and temporal alerts to indicate the occurrence of statistically significant hotspots. Spatial data are distributed based on a modeling of previous event locations, thereby maintaining a temporal coherence with past events. Such tools allow analysts to perform real-time hypothesis testing, plan intervention strategies, and allocate resources to correspond to perceived threats.
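
    The modelling combination named in the abstract, kernel density estimation for the spatial event distribution plus seasonal-trend decomposition by loess for the temporal component, can be sketched with standard libraries. The scipy and statsmodels calls below are stand-ins chosen for illustration, not the toolkit's actual implementation, and the data are random.

```python
# Sketch of the modelling combination named in the abstract: kernel density
# estimation for the spatial distribution of past events plus seasonal-trend
# decomposition by loess for the temporal component.  scipy/statsmodels are
# stand-ins for illustration, not the paper's toolkit; the data are random.
import numpy as np
from scipy.stats import gaussian_kde
from statsmodels.tsa.seasonal import STL

rng = np.random.default_rng(0)

# Spatial part: KDE over past event locations (x, y).
events_xy = rng.normal(loc=[5.0, 5.0], scale=1.0, size=(200, 2))
kde = gaussian_kde(events_xy.T)
print("density near the hotspot:", round(kde([[5.0], [5.0]])[0], 3))

# Temporal part: daily counts with a weekly season, decomposed with STL (loess).
days = 180
counts = 20 + 5 * np.sin(2 * np.pi * np.arange(days) / 7) + rng.normal(0, 1, days)
decomposition = STL(counts, period=7).fit()
print("latest trend value:", round(float(np.asarray(decomposition.trend)[-1]), 2))
```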

  13. Using semantics for representing experimental protocols.

    Science.gov (United States)

    Giraldo, Olga; García, Alexander; López, Federico; Corcho, Oscar

    2017-11-13

    An experimental protocol is a sequence of tasks and operations executed to perform experimental research in biological and biomedical areas, e.g. biology, genetics, immunology, neurosciences, virology. Protocols often include references to equipment, reagents, descriptions of critical steps, troubleshooting and tips, as well as any other information that researchers deem important for facilitating the reusability of the protocol. Although experimental protocols are central to reproducibility, the descriptions are often cursory. There is the need for a unified framework with respect to the syntactic structure and the semantics for representing experimental protocols. In this paper we present the "SMART Protocols ontology", an ontology for representing experimental protocols. Our ontology represents the protocol as a workflow with domain-specific knowledge embedded within a document. We also present the Sample Instrument Reagent Objective (SIRO) model, which represents the minimal common information shared across experimental protocols. SIRO was conceived in the same realm as the Patient Intervention Comparison Outcome (PICO) model that supports search, retrieval and classification purposes in evidence-based medicine. We evaluate our approach against a set of competency questions modeled as SPARQL queries and processed against a set of published and unpublished protocols modeled with the SP Ontology and the SIRO model. Our approach makes it possible to answer queries such as "Which protocols use tumor tissue as a sample?". Improving reporting structures for experimental protocols requires collective efforts from authors, peer reviewers, editors and funding bodies. The SP Ontology is a contribution towards this goal. We build upon previous experiences, bringing together the views of researchers managing protocols in their laboratory work. Website: https://smartprotocols.github.io/ .

  14. Enhanced tumor responses through therapies combining CCNU, MISO and radiation

    International Nuclear Information System (INIS)

    Siemann, D.W.; Hill, S.A.

    1984-01-01

    Studies were performed to determine whether the radiation sensitizer misonidazole (MISO) could enhance the tumor control probability in a treatment strategy combining radiation and the nitrosourea 1-(2-chloroethyl)-3-cyclohexyl-1-nitrosourea (CCNU). In initial experiments KHT sarcoma-bearing mice were injected with 1.0 mg/g of MISO simultaneously with a 20 mg/kg dose of CCNU 30-40 min prior to irradiation (1500 rad). With this treatment protocol approximately 60% of the mice were found to be tumor-free 100 days post treatment. By comparison, all two-agent combinations led to 0% cures. To evaluate the relative importance of chemopotentiation versus radiosensitization in the three-agent protocol, tumors were treated with MISO plus one anti-tumor agent (either radiation or CCNU) and then, at times ranging from 0 to 24 hr later, exposed to the other agent. When the time between treatments was 0 to 6 hr, a 60 to 80% tumor control rate was achieved for both MISO plus radiation followed by CCNU and MISO plus CCNU followed by radiation. However, if the time interval was increased to 18 or 24 hr, the cure rate in the former treatment regimen dropped to 10% while that of the latter remained high at 40%. The data therefore indicate that (1) improved tumor responses may be achieved when MISO is added to a radiation-chemotherapy combination and (2) MISO may be more effective in such a protocol when utilized as a chemopotentiator

  15. The Nature and Extent of Instructors' Use of Learning Analytics in Higher Education to Inform Teaching and Learning

    Science.gov (United States)

    King, Janet L.

    2017-01-01

    The utilization of learning analytics to support teaching and learning has emerged as a newer phenomenon combining instructor-oriented action research, the mining of educational data, and the analyses of statistics and patterns. Learning analytics have documented, quantified and graphically displayed students' interactions, engagement, and…

  16. The influence of referral protocols on the utilization of magnetic resonance imaging: evidence from Manitoba

    Energy Technology Data Exchange (ETDEWEB)

    Mustard, C.A.; McClarty, B.M.; MacEwan, D.W. [Manitoba Univ., Winnipeg, MB (Canada)

    1994-04-01

    The influence of referral protocols on the utilization of magnetic resonance imaging (MRI) services was studied. Three neuroradiologists and one radiologist reviewed the indications for MRI for 198 referrals to a facility in Winnipeg, selected at random from patients seen in 1991 for suspected disorders of the brain or the spine. Out-of-province referrals had not been subject to referral protocols, whereas those from within Manitoba had been subject to such protocols. At least three of the four radiologists agreed on whether an examination was appropriate in 88.4 % of the cases. Out-of-province referrals were significantly more likely to be considered inappropriate for MRI (24 %) than referrals from within Manitoba (10 %). It was estimated that the combined effect of instituting protocols and reviewing each referral before the examination could result in a 16 % to 31 % reduction in the demand for MRI services without compromising diagnostic information. 18 refs., 3 tabs.

  17. MAC Protocol for Ad Hoc Networks Using a Genetic Algorithm

    Science.gov (United States)

    Elizarraras, Omar; Panduro, Marco; Méndez, Aldo L.

    2014-01-01

    The problem of obtaining the transmission rate in an ad hoc network consists in adjusting the power of each node so that the signal-to-interference ratio (SIR) and the energy required to transmit from one node to another are obtained at the same time. Therefore, an optimal transmission rate for each node in a medium access control (MAC) protocol based on CSMA-CDMA (carrier sense multiple access-code division multiple access) for ad hoc networks can be obtained using evolutionary optimization. This work proposes a genetic algorithm for transmission rate selection assuming perfect power control; the proposed approach achieves an improvement of 10% compared with the scheme that uses the handshaking phase to adjust the transmission rate. Furthermore, this paper proposes a genetic algorithm that solves the problem of power combining, interference, data rate, and energy while ensuring the signal-to-interference ratio in an ad hoc network. The proposed genetic algorithm performs better (by 15%) than the CSMA-CDMA protocol without optimization. Therefore, we show by simulation the effectiveness of the proposed protocol in terms of the throughput. PMID:25140339

  18. MAC Protocol for Ad Hoc Networks Using a Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Omar Elizarraras

    2014-01-01

    Full Text Available The problem of obtaining the transmission rate in an ad hoc network consists in adjusting the power of each node so that the signal-to-interference ratio (SIR) and the energy required to transmit from one node to another are obtained at the same time. Therefore, an optimal transmission rate for each node in a medium access control (MAC) protocol based on CSMA-CDMA (carrier sense multiple access-code division multiple access) for ad hoc networks can be obtained using evolutionary optimization. This work proposes a genetic algorithm for transmission rate selection assuming perfect power control; the proposed approach achieves an improvement of 10% compared with the scheme that uses the handshaking phase to adjust the transmission rate. Furthermore, this paper proposes a genetic algorithm that solves the problem of power combining, interference, data rate, and energy while ensuring the signal-to-interference ratio in an ad hoc network. The proposed genetic algorithm performs better (by 15%) than the CSMA-CDMA protocol without optimization. Therefore, we show by simulation the effectiveness of the proposed protocol in terms of the throughput.
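
    The kind of search described in the two records above can be illustrated with a toy genetic algorithm: candidate per-node power levels are evolved to maximize a rate proxy while keeping every node's SIR above a target. The encoding, fitness function and channel model below are simplified inventions, not the authors' algorithm.

```python
# Toy genetic algorithm illustrating the kind of search described in the two
# records above: candidate per-node power levels are evolved to maximize a
# total-rate proxy while every node's signal-to-interference ratio (SIR) stays
# above a target.  Encoding, fitness and channel model are simplified inventions.
import random

N_NODES, SIR_MIN, NOISE = 4, 2.0, 0.1
GAINS = [[1.0 if i == j else 0.1 for j in range(N_NODES)] for i in range(N_NODES)]

def sir(powers, i):
    interference = sum(GAINS[i][j] * powers[j] for j in range(N_NODES) if j != i)
    return GAINS[i][i] * powers[i] / (interference + NOISE)

def fitness(powers):
    sirs = [sir(powers, i) for i in range(N_NODES)]
    if min(sirs) < SIR_MIN:            # infeasible: SIR constraint violated
        return 0.0
    return sum(sirs)                   # crude proxy for achievable total rate

def evolve(pop_size=30, generations=60):
    pop = [[random.uniform(0.1, 1.0) for _ in range(N_NODES)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]           # keep the fitter half
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_NODES)   # one-point crossover
            child = a[:cut] + b[cut:]
            k = random.randrange(N_NODES)        # small Gaussian mutation
            child[k] = min(1.0, max(0.1, child[k] + random.gauss(0, 0.05)))
            children.append(child)
        pop = parents + children
    best = max(pop, key=fitness)
    return best, fitness(best)

best, best_fitness = evolve()
print("best power vector:", [round(p, 2) for p in best],
      "fitness:", round(best_fitness, 2))
```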

  19. A rapid tattoo removal technique using a combination of pulsed Er:YAG and Q-Switched Nd:YAG in a split lesion protocol.

    Science.gov (United States)

    Sardana, Kabir; Ranjan, Rashmi; Kochhar, Atul M; Mahajan, Khushbu Goel; Garg, Vijay K

    2015-01-01

    Tattoo removal has evolved over the years and, though the Q-switched laser is the 'workhorse' laser, it invariably requires multiple sittings, which are dependent on numerous factors, including the skin colour, location of the tattoo, age of the tattoo, colour of pigment used, associated fibrosis and the kind of tattoo treated. Though ablative lasers, both pulsed CO2 and Er:YAG, have been used for recalcitrant tattoos, very few studies have been done comparing them with pigment-specific lasers. Our study was based on the premise that ablating the epidermis overlying the tattoo pigment with Er:YAG could help in gaining better access to the pigment, which would enable the Q-switched laser to work effectively with less beam scattering. A study of a rapid tattoo removal (RTR) technique using a combination of pulsed Er:YAG and Q-switched Nd:YAG in a split-lesion protocol. This prospective study was undertaken during 2010-13 at a laser clinic in the Maulana Azad Medical College, New Delhi. A total of 10 patients were recruited, 5 with amateur tattoos and 5 with professional tattoos. After informed consent, each tattoo was arbitrarily 'split' into two parts. One part was treated with the QS Nd:YAG laser (1064 nm) and the other part with the Er:YAG laser immediately followed by the QS Nd:YAG. The laser treatments were repeated at 6-week intervals until the tattoo pigment had cleared. On the combination side, in subsequent sittings only the QS Nd:YAG was used, to minimize repetitive ablation. To ensure consistency in the intervention methods, a trained dermatologist who was independent of the treatment delivery randomly rated 10% of the procedures. The mean improvement achieved by the Q-switched laser (2.93) was less than that on the combination laser side (3.85) (p = 0.001), and more sessions were needed (3.8 vs. 1.6; p = 0.001). There was a statistically significant difference in the improvement on the combination side until the second session. On the combination side patients required a maximum of 2 sessions

  20. Emerging Cyber Infrastructure for NASA's Large-Scale Climate Data Analytics

    Science.gov (United States)

    Duffy, D.; Spear, C.; Bowen, M. K.; Thompson, J. H.; Hu, F.; Yang, C. P.; Pierce, D.

    2016-12-01

    The resolution of NASA climate and weather simulations has grown dramatically over the past few years, with the highest-fidelity models reaching down to 1.5 km global resolution. With each doubling of the resolution, the resulting data sets grow by a factor of eight in size. As the climate and weather models push the envelope even further, a new infrastructure to store data and provide large-scale data analytics is necessary. The NASA Center for Climate Simulation (NCCS) has deployed the Data Analytics Storage Service (DASS), which combines scalable storage with the ability to perform in-situ analytics. Within this system, large, commonly used data sets are stored in a POSIX file system (write once/read many); examples of data stored include Landsat, MERRA2, observing system simulation experiments, and high-resolution downscaled reanalysis. The total size of this repository is on the order of 15 petabytes of storage. In addition to the POSIX file system, the NCCS has deployed file system connectors to enable emerging analytics built on top of the Hadoop Distributed File System (HDFS) to run on the same storage servers within the DASS. Coupled with a custom spatiotemporal indexing approach, users can now run emerging analytical operations built on MapReduce and Spark on the same data files stored within the POSIX file system without having to make additional copies. This presentation will discuss the architecture of this system and present benchmark performance measurements ranging from traditional TeraSort and Wordcount to large-scale climate analytical operations on NetCDF data.

  1. Degenerate Fermi gas in a combined harmonic-lattice potential

    International Nuclear Information System (INIS)

    Blakie, P. B.; Bezett, A.; Buonsante, P.

    2007-01-01

    In this paper we derive an analytic approximation to the density of states for atoms in a combined optical lattice and harmonic trap potential as used in current experiments with quantum degenerate gases. We compare this analytic density of states to numerical solutions and demonstrate its validity regime. Our work explicitly considers the role of higher bands and when they are important in quantitative analysis of this system. Applying our density of states to a degenerate Fermi gas, we consider how adiabatic loading from a harmonic trap into the combined harmonic-lattice potential affects the degeneracy temperature. Our results suggest that occupation of excited bands during loading should lead to more favorable conditions for realizing degenerate Fermi gases in optical lattices

  2. Combined and controlled remote implementations of partially unknown quantum operations of multiqubits using Greenberger-Horne-Zeilinger states

    International Nuclear Information System (INIS)

    Wang Anmin

    2007-01-01

    We propose and prove protocols of combined and controlled remote implementations of partially unknown quantum operations belonging to the restricted sets [A. M. Wang, Phys. Rev. A 74, 032317 (2006)] using Greenberger-Horne-Zeilinger (GHZ) states. We present the protocols in detail in the cases of one qubit, with two senders and with one controller, respectively. Then we study the variations of protocols with many senders, or with many controllers, or with both many senders and controllers using a multipartite GHZ state. Furthermore, we extend these protocols to the cases of multiqubits. Because our protocols have to request that the senders work together and transfer the information in turn or receive the repertoire of extra supercontrollers, or/and the controller(s) open the quantum channel and distribute the passwords in different ways, they definitely have the strong security in remote quantum information processing and communications. Moreover, the combined protocol with many senders is helpful to arrive at the power of remote implementations of quantum operations to the utmost extent in theory, since the different senders may have different operational resources and different operational rights in practice, and the controlled protocol with many controllers is able to enhance security and increase applications of remote implementations of quantum operations in engineering, since it has some common features in a controlled process

  3. Elementary mechanics using Matlab a modern course combining analytical and numerical techniques

    CERN Document Server

    Malthe-Sørenssen, Anders

    2015-01-01

    This book – specifically developed as a novel textbook on elementary classical mechanics – shows how analytical and numerical methods can be seamlessly integrated to solve physics problems. This approach allows students to solve more advanced and applied problems at an earlier stage and equips them to deal with real-world examples well beyond the typical special cases treated in standard textbooks. Another advantage of this approach is that students are brought closer to the way physics is actually discovered and applied, as they are introduced right from the start to a more exploratory way of understanding phenomena and of developing their physical concepts. While not a requirement, it is advantageous for the reader to have some prior knowledge of scientific programming with a scripting-type language. This edition of the book uses Matlab, and a chapter devoted to the basics of scientific programming with Matlab is included. A parallel edition using Python instead of Matlab is also available. Last but not...

  4. Elementary mechanics using Python a modern course combining analytical and numerical techniques

    CERN Document Server

    Malthe-Sørenssen, Anders

    2015-01-01

    This book – specifically developed as a novel textbook on elementary classical mechanics – shows how analytical and numerical methods can be seamlessly integrated to solve physics problems. This approach allows students to solve more advanced and applied problems at an earlier stage and equips them to deal with real-world examples well beyond the typical special cases treated in standard textbooks. Another advantage of this approach is that students are brought closer to the way physics is actually discovered and applied, as they are introduced right from the start to a more exploratory way of understanding phenomena and of developing their physical concepts. While not a requirement, it is advantageous for the reader to have some prior knowledge of scientific programming with a scripting-type language. This edition of the book uses Python, and a chapter devoted to the basics of scientific programming with Python is included. A parallel edition using Matlab instead of Python is also available. Last but not...

  5. Addressing the need for biomarker liquid chromatography/mass spectrometry assays: a protocol for effective method development for the bioanalysis of endogenous compounds in cerebrospinal fluid.

    Science.gov (United States)

    Benitex, Yulia; McNaney, Colleen A; Luchetti, David; Schaeffer, Eric; Olah, Timothy V; Morgan, Daniel G; Drexler, Dieter M

    2013-08-30

    Research on disorders of the central nervous system (CNS) has shown that an imbalance in the levels of specific endogenous neurotransmitters may underlie certain CNS diseases. These alterations in neurotransmitter levels may provide insight into pathophysiology, but can also serve as disease and pharmacodynamic biomarkers. To measure these potential biomarkers in vivo, the relevant sample matrix is cerebrospinal fluid (CSF), which is in equilibrium with the brain's interstitial fluid and circulates through the ventricular system of the brain and spinal cord. Accurate analysis of these potential biomarkers can be challenging due to low CSF sample volume, low analyte levels, and potential interferences from other endogenous compounds. A protocol has been established for effective method development of bioanalytical assays for endogenous compounds in CSF. Database searches and standard-addition experiments are employed to qualify sample preparation and specificity of the detection thus evaluating accuracy and precision. This protocol was applied to the study of the histaminergic neurotransmitter system and the analysis of histamine and its metabolite 1-methylhistamine in rat CSF. The protocol resulted in a specific and sensitive novel method utilizing pre-column derivatization ultra high performance liquid chromatography/tandem mass spectrometry (UHPLC/MS/MS), which is also capable of separating an endogenous interfering compound, identified as taurine, from the analytes of interest. Copyright © 2013 John Wiley & Sons, Ltd.

  6. Irregular analytical errors in diagnostic testing - a novel concept.

    Science.gov (United States)

    Vogeser, Michael; Seger, Christoph

    2018-02-23

    In laboratory medicine, routine periodic analyses for internal and external quality control measurements interpreted by statistical methods are mandatory for batch clearance. Data analysis of these process-oriented measurements allows for insight into random analytical variation and systematic calibration bias over time. However, in such a setting, any individual sample is not under individual quality control. The quality control measurements act only at the batch level. Quantitative or qualitative data derived for many effects and interferences associated with an individual diagnostic sample can compromise any analyte. It is obvious that a process for a quality-control-sample-based approach of quality assurance is not sensitive to such errors. To address the potential causes and nature of such analytical interference in individual samples more systematically, we suggest the introduction of a new term called the irregular (individual) analytical error. Practically, this term can be applied in any analytical assay that is traceable to a reference measurement system. For an individual sample an irregular analytical error is defined as an inaccuracy (which is the deviation from a reference measurement procedure result) of a test result that is so high it cannot be explained by measurement uncertainty of the utilized routine assay operating within the accepted limitations of the associated process quality control measurements. The deviation can be defined as the linear combination of the process measurement uncertainty and the method bias for the reference measurement system. Such errors should be coined irregular analytical errors of the individual sample. The measurement result is compromised either by an irregular effect associated with the individual composition (matrix) of the sample or an individual single sample associated processing error in the analytical process. Currently, the availability of reference measurement procedures is still highly limited, but LC
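
    One possible way to write down the criterion sketched in the abstract (flag a result as an irregular analytical error when its deviation from the reference measurement procedure exceeds what the routine assay's uncertainty and bias can explain) is the following; the coverage factor k and the exact combination rule are assumptions, not the authors' definition.

```latex
% One possible formalization (the coverage factor k and the combination rule
% are assumptions, not the authors' exact definition):
\Delta = x_{\text{routine}} - x_{\text{reference}}, \qquad
\text{irregular analytical error if}\;\;
\lvert \Delta \rvert \;>\; k\, u_{\text{process}} + \lvert b_{\text{method}} \rvert
```

    Here u_process denotes the measurement uncertainty of the routine assay operating within its accepted process quality control limits, and b_method its bias relative to the reference measurement system.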

  7. Static Validation of Security Protocols

    DEFF Research Database (Denmark)

    Bodei, Chiara; Buchholtz, Mikael; Degano, P.

    2005-01-01

    We methodically expand protocol narrations into terms of a process algebra in order to specify some of the checks that need to be made in a protocol. We then apply static analysis technology to develop an automatic validation procedure for protocols. Finally, we demonstrate that these techniques ...... suffice to identify several authentication flaws in symmetric and asymmetric key protocols such as Needham-Schroeder symmetric key, Otway-Rees, Yahalom, Andrew secure RPC, Needham-Schroeder asymmetric key, and Beller-Chang-Yacobi MSR...

  8. Analysis of steps adapted protocol in cardiac rehabilitation in the hospital phase

    Science.gov (United States)

    Winkelmann, Eliane Roseli; Dallazen, Fernanda; Bronzatti, Angela Beerbaum Steinke; Lorenzoni, Juliara Cristina Werner; Windmöller, Pollyana

    2015-01-01

    Objective To analyze an adapted cardiac rehabilitation protocol used by physical therapy during the postoperative hospital phase of cardiac surgery in a high-complexity service, with regard to the prevalence of complications and mortality and to length of hospital stay. Methods This is an observational, cross-sectional, retrospective and analytical study of 99 patients who underwent cardiac surgery for coronary artery bypass grafting, heart valve replacement or a combination of both. The step program adapted for rehabilitation after cardiac surgery was analyzed under the direction of the physiotherapy team. Results On average, a patient stays for two days in the Intensive Care Unit and three to four days in the hospital ward, totaling six days of hospitalization. Fatalities occurred in a higher percentage during hospitalization (5.1%) and within the two-year period (8.6%) than within 30 days after hospital discharge (1.1%). Among the postoperative complications, hemodynamic (63.4%) and respiratory (42.6%) complications were the most prevalent; 36-42% of complications occurred between the immediate postoperative period and the second postoperative day. Hospital discharge started from the fifth postoperative day. We observed that patients progressed through the Steps on each successive day, with Step 3 being the most used during rehabilitation phase I. Conclusion This step-based progression program can guide in-hospital physical rehabilitation of patients after cardiac surgery. PMID:25859866

  9. Analytical applications of spectroscopy

    International Nuclear Information System (INIS)

    Creaser, C.S.

    1988-01-01

    This book provides an up to date overview of recent developments in analytical spectroscopy, with a particular emphasis on the common themes of chromatography - spectroscopy combinations, Fourier transform methods, and data handling techniques, which have played an increasingly important part in the development of all spectroscopic techniques. The book contains papers originally presented at a conference entitled 'Spectroscopy Across The Spectrum' held jointly with the first 'International Near Infrared Spectroscopy Conference' at the University of East Anglia, Norwich, UK, in July 1987, which have been edited and rearranged with some additional material. Each section includes reviews of key areas of current research as well as short reports of new developments. The fields covered are: Near Infrared Spectroscopy; Infrared Spectroscopy; Mass Spectroscopy; NMR Spectroscopy; Atomic and UV/Visible Spectroscopy; Chemometrics and Data Analysis. (author)

  10. Analytical models of optical response in one-dimensional semiconductors

    International Nuclear Information System (INIS)

    Pedersen, Thomas Garm

    2015-01-01

    The quantum mechanical description of the optical properties of crystalline materials typically requires extensive numerical computation. Including excitonic and non-perturbative field effects adds to the complexity. In one dimension, however, the analysis simplifies and optical spectra can be computed exactly. In this paper, we apply the Wannier exciton formalism to derive analytical expressions for the optical response in four cases of increasing complexity. Thus, we start from free carriers and, in turn, switch on electrostatic fields and electron–hole attraction and, finally, analyze the combined influence of these effects. In addition, the optical response of impurity-localized excitons is discussed. - Highlights: • Optical response of one-dimensional semiconductors including excitons. • Analytical model of excitonic Franz–Keldysh effect. • Computation of optical response of impurity-localized excitons

  11. A class-chest for deriving transport protocols

    Energy Technology Data Exchange (ETDEWEB)

    Strayer, W.T.

    1996-10-01

    Development of new transport protocols or protocol algorithms suffers from the complexity of the environment in which they are intended to run. Modeling techniques attempt to avoid this by simulating the environment. Another approach to promoting rapid prototyping of protocols and protocol algorithms is to provide a pre-built infrastructure that is common to transport protocols, so that the focus is placed on the protocol-specific aspects. The Meta-Transport Library is a library of C++ base classes that implement or abstract out the mundane functions of a protocol; new protocol implementations are derived from these base classes. The result is a fully viable user-level transport protocol implementation, with emphasis on modularity. The collection of base classes forms a ``class-chest`` of tools from which protocols can be developed and studied with as little change to a normal UNIX environment as possible.
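
    The base-class idea described above can be illustrated outside C++ as well. The sketch below is a hypothetical Python analogue, not the Meta-Transport Library API: a base class supplies the mundane bookkeeping (sequence numbering, retransmission records, a send path), and a derived class adds only the protocol-specific segment format.

    ```python
    # Hypothetical analogue of the "class-chest" idea (not the Meta-Transport
    # Library API): a base class abstracts out mundane transport functions,
    # a derived class supplies only protocol-specific behaviour.

    class TransportBase:
        def __init__(self):
            self.next_seq = 0          # sequence numbering handled once, here
            self.unacked = {}          # retransmission bookkeeping handled here

        def send(self, payload: bytes) -> int:
            seq = self.next_seq
            self.next_seq += 1
            segment = self.build_segment(seq, payload)   # protocol-specific part
            self.unacked[seq] = segment
            self.transmit(segment)
            return seq

        def transmit(self, segment: bytes) -> None:
            print(f"wire <- {segment!r}")                # stand-in for a socket

        def build_segment(self, seq: int, payload: bytes) -> bytes:
            raise NotImplementedError("derived protocols define their header")


    class ToyStopAndWait(TransportBase):
        def build_segment(self, seq: int, payload: bytes) -> bytes:
            return b"SAW|" + str(seq).encode() + b"|" + payload


    ToyStopAndWait().send(b"hello")
    ```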

  12. Recommendations for sampling for prevention of hazards in civil defense. On analytics of chemical, biological and radioactive contaminations. Brief instruction for the CBRN (chemical, biological, radioactive, nuclear) sampling

    International Nuclear Information System (INIS)

    Bachmann, Udo; Biederbick, Walter; Derakshani, Nahid

    2010-01-01

    The recommendation for sampling for prevention of hazards in civil defense describes the analytics of chemical, biological and radioactive contaminations and includes detailed information on the sampling, protocol preparation and documentation procedures. The volume includes a separate brief instruction for CBRN (chemical, biological, radioactive, nuclear) sampling.

  13. Croatian Analytical Terminology

    Directory of Open Access Journals (Sweden)

    Kastelan-Macan; M.

    2008-04-01

    Full Text Available Results of analytical research are necessary in all human activities. They are inevitable in making decisions in environmental chemistry, agriculture, forestry, veterinary medicine, the pharmaceutical industry, and biochemistry. Without analytical measurements the quality of materials and products cannot be assessed, so that analytical chemistry is an essential part of technical sciences and disciplines. Establishing a Croatian language of science, and of analytical chemistry within it, was one of the goals of our predecessors. Due to the political situation, they did not succeed entirely, but for the scientists in independent Croatia this is a duty, because language is one of the most important features of the Croatian identity. The awareness of the need to introduce Croatian terminology was systematically developed in the second half of the 19th century, along with the founding of scientific societies and the wish of scientists to write their scientific works in Croatian, so that the results of their research could be applied in the economy. Many authors of textbooks from the 19th and the first half of the 20th century contributed to Croatian analytical terminology (F. Rački, B. Šulek, P. Žulić, G. Pexidr, J. Domac, G. Janeček, F. Bubanović, V. Njegovan and others). M. Deželić published the first systematic chemical terminology in 1940, adjusted to the IUPAC recommendations. In the second half of the 20th century, textbooks in classic analytical chemistry were written by V. Marjanović-Krajovan, M. Gyiketta-Ogrizek, S. Žilić and others. I. Filipović wrote the General and Inorganic Chemistry textbook and the Laboratory Handbook (in collaboration with P. Sabioncello) and contributed greatly to establishing the terminology of instrumental analytical methods. The sources of Croatian nomenclature in modern analytical chemistry today are translated textbooks by Skoog, West and Holler, as well as by Günzler and Gremlich, and original textbooks by S. Turina, Z.

  14. Whatever works: a systematic user-centered training protocol to optimize brain-computer interfacing individually.

    Directory of Open Access Journals (Sweden)

    Elisabeth V C Friedrich

    Full Text Available This study implemented a systematic user-centered training protocol for a 4-class brain-computer interface (BCI). The goal was to optimize the BCI individually in order to achieve high performance within few sessions for all users. Eight able-bodied volunteers, who were initially naïve to the use of a BCI, participated in 10 sessions over a period of about 5 weeks. In an initial screening session, users were asked to perform the following seven mental tasks while multi-channel EEG was recorded: mental rotation, word association, auditory imagery, mental subtraction, spatial navigation, motor imagery of the left hand and motor imagery of both feet. Out of these seven mental tasks, the best 4-class combination as well as the most reactive frequency band (between 8 and 30 Hz) was selected individually for online control. Classification was based on common spatial patterns and Fisher's linear discriminant analysis. The number and time of classifier updates varied individually. Selection speed was increased by reducing trial length. To minimize differences in brain activity between sessions with and without feedback, sham feedback was provided in the screening and calibration runs in which usually no real-time feedback is shown. Selected task combinations and frequency ranges differed between users. The tasks that were included in the 4-class combination most often were (1) motor imagery of the left hand, (2) one brain-teaser task (word association or mental subtraction), (3) the mental rotation task and (4) one more dynamic imagery task (auditory imagery, spatial navigation, or imagery of the feet). Participants achieved mean performances over sessions of 44-84% and peak performances in single sessions of 58-93% in this user-centered 4-class BCI protocol. This protocol is highly adjustable to individual users and thus could increase the percentage of users who can gain and maintain BCI control. A high priority for future work is to examine this protocol with severely
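
    The per-user selection step described above (choosing the best 4-class task combination from screening data) can be sketched as an exhaustive search with cross-validated linear discriminant analysis. The snippet below is a simplified illustration on synthetic feature vectors, not the study's CSP-based EEG pipeline; the task names are taken from the abstract, while the data shapes and feature generation are assumptions.

    ```python
    # Simplified illustration of selecting the best 4-class task combination
    # per user, assuming one feature vector per screening trial. This is not
    # the study's CSP pipeline; data shapes and feature values are synthetic.
    from itertools import combinations
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    tasks = ["rotation", "words", "audio", "subtraction", "navigation", "hand", "feet"]
    # Synthetic screening data: 40 trials x 16 features per mental task.
    features = {t: rng.normal(loc=i, scale=3.0, size=(40, 16)) for i, t in enumerate(tasks)}

    best_combo, best_acc = None, 0.0
    for combo in combinations(tasks, 4):                  # all 35 four-class subsets
        X = np.vstack([features[t] for t in combo])
        y = np.repeat(np.arange(4), 40)
        acc = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5).mean()
        if acc > best_acc:
            best_combo, best_acc = combo, acc

    print(best_combo, round(best_acc, 2))
    ```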

  15. Automatic Validation of Protocol Narration

    DEFF Research Database (Denmark)

    Bodei, Chiara; Buchholtz, Mikael; Degano, Pierpablo

    2003-01-01

    We perform a systematic expansion of protocol narrations into terms of a process algebra in order to make precise some of the detailed checks that need to be made in a protocol. We then apply static analysis technology to develop an automatic validation procedure for protocols. Finally, we...

  16. Efficient secure two-party protocols

    CERN Document Server

    Hazay, Carmit

    2010-01-01

    The authors present a comprehensive study of efficient protocols and techniques for secure two-party computation -- both general constructions that can be used to securely compute any functionality, and protocols for specific problems of interest. The book focuses on techniques for constructing efficient protocols and proving them secure. In addition, the authors study different definitional paradigms and compare the efficiency of protocols achieved under these different definitions.The book opens with a general introduction to secure computation and then presents definitions of security for a

  17. Augmented Quadruple-Phase Contrast Media Administration and Triphasic Scan Protocol Increases Image Quality at Reduced Radiation Dose During Computed Tomography Urography.

    Science.gov (United States)

    Saade, Charbel; Mohamad, May; Kerek, Racha; Hamieh, Nadine; Alsheikh Deeb, Ibrahim; El-Achkar, Bassam; Tamim, Hani; Abdul Razzak, Farah; Haddad, Maurice; Abi-Ghanem, Alain S; El-Merhi, Fadi

    The aim of this article was to investigate the opacification of the renal vasculature and the urogenital system during computed tomography urography by using quadruple-phase contrast media in a triphasic scan protocol. A total of 200 patients with possible urinary tract abnormalities were equally divided between 2 protocols. Protocol A used the conventional single bolus and quadruple-phase scan protocol (pre, arterial, venous, and delayed), retrospectively. Protocol B included a quadruple-phase contrast media injection with a triphasic scan protocol (pre, arterial and combined venous, and delayed), prospectively. Each protocol used 100 mL of contrast and saline at a flow rate of 4.5 mL/s. Attenuation profiles and contrast-to-noise ratios of the renal arteries, veins, and urogenital tract were measured. Effective radiation dose calculation, data analysis by independent sample t test, receiver operating characteristic, and visual grading characteristic analyses were performed. In arterial circulation, only the inferior interlobular arteries in both protocols showed statistical significance (P < 0.05). Protocol B showed a higher contrast-to-noise ratio than protocol A (protocol B: 22.68 ± 13.72; protocol A: 14.75 ± 5.76; P < 0.05). Quadruple-phase contrast media and triphasic scan protocol usage increases image quality at a reduced radiation dose.

  18. Improving the efficiency of single and multiple teleportation protocols based on the direct use of partially entangled states

    Energy Technology Data Exchange (ETDEWEB)

    Fortes, Raphael; Rigolin, Gustavo, E-mail: rigolin@ifi.unicamp.br

    2013-09-15

    We push the limits of the direct use of pure partially entangled states to perform quantum teleportation by presenting several protocols in many different scenarios that achieve the optimal efficiency possible. We review and put in a single formalism the three major strategies known to date that allow one to use partially entangled states for direct quantum teleportation (no distillation strategies permitted) and compare their efficiencies in real-world implementations. We show how one can improve the efficiency of many direct teleportation protocols by combining these techniques. We then develop new teleportation protocols employing multipartite partially entangled states. The three techniques are also used here in order to achieve the highest efficiency possible. Finally, we prove the upper bound for the optimal success rate for protocols based on partially entangled Bell states and show that some of the protocols developed here achieve such a bound. -- Highlights: •Optimal direct teleportation protocols using partially entangled states directly. •We put in a single formalism all strategies of direct teleportation. •We extend these techniques to multipartite partially entangled states. •We give upper bounds for the optimal efficiency of these protocols.

  19. Linear transceiver design for nonorthogonal amplify-and-forward protocol using a bit error rate criterion

    KAUST Repository

    Ahmed, Qasim Zeeshan

    2014-04-01

    The ever-growing demand for higher data rates can now be addressed by exploiting cooperative diversity. This form of diversity has become a fundamental technique for achieving spatial diversity by exploiting the presence of idle users in the network. This has led to new challenges in terms of designing new protocols and detectors for cooperative communications. Among various amplify-and-forward (AF) protocols, the half duplex non-orthogonal amplify-and-forward (NAF) protocol is superior to other AF schemes in terms of error performance and capacity. However, this superiority is achieved at the cost of higher receiver complexity. Furthermore, in order to exploit the full diversity of the system an optimal precoder is required. In this paper, an optimal joint linear transceiver is proposed for the NAF protocol. This transceiver operates on the principle of minimum bit error rate (BER), and is referred to as the joint bit error rate (JBER) detector. The BER performance of the JBER detector is superior to that of all previously proposed linear detectors, such as channel inversion, maximal ratio combining, biased maximum likelihood, and minimum mean square error detectors. The proposed transceiver also outperforms previous precoders designed for the NAF protocol. © 2002-2012 IEEE.

  20. Using Ovsynch protocol versus Cosynch protocol in dairy cows

    Directory of Open Access Journals (Sweden)

    Ion Valeriu Caraba

    2013-10-01

    Full Text Available As research on the reproductive physiology and endocrinology surrounding the estrous cycle in dairy cattle has been compiled, several estrous synchronization programs have been developed for use with dairy cows. These include several programs that facilitate the mass breeding of all animals at a predetermined time (timed AI) rather than after the detection of estrus. We studied 15 dairy cows that were synchronized by the Ovsynch and Cosynch programs. The estrus response for cows in the Ovsynch protocol was 63%. Pregnancy per insemination at 60 days was 25%. The estrus response for cows in the Cosynch protocol was 57%. Pregnancy per insemination at 60 days was 57%. Synchronization of ovulation using Ovsynch protocols can provide an effective way to manage reproduction in lactating dairy cows by eliminating the need for estrus detection. These are efficient management programs for TAI of dairy cows that reduce both labour costs and the extra handling required for daily estrus detection and AI.

  1. Methodological challenges and analytic opportunities for modeling and interpreting Big Healthcare Data.

    Science.gov (United States)

    Dinov, Ivo D

    2016-01-01

    Managing, processing and understanding big healthcare data is challenging, costly and demanding. Without a robust fundamental theory for representation, analysis and inference, a roadmap for uniform handling and analyzing of such complex data remains elusive. In this article, we outline various big data challenges, opportunities, modeling methods and software techniques for blending complex healthcare data, advanced analytic tools, and distributed scientific computing. Using imaging, genetic and healthcare data we provide examples of processing heterogeneous datasets using distributed cloud services, automated and semi-automated classification techniques, and open-science protocols. Despite substantial advances, new innovative technologies need to be developed that enhance, scale and optimize the management and processing of large, complex and heterogeneous data. Stakeholder investments in data acquisition, research and development, computational infrastructure and education will be critical to realize the huge potential of big data, to reap the expected information benefits and to build lasting knowledge assets. Multi-faceted proprietary, open-source, and community developments will be essential to enable broad, reliable, sustainable and efficient data-driven discovery and analytics. Big data will affect every sector of the economy and their hallmark will be 'team science'.

  2. Intelligent Local Avoided Collision (iLAC) MAC Protocol for Very High Speed Wireless Network

    Science.gov (United States)

    Hieu, Dinh Chi; Masuda, Akeo; Rabarijaona, Verotiana Hanitriniala; Shimamoto, Shigeru

    Future wireless communication systems aim at very high data rates. As the medium access control (MAC) protocol plays the central role in determining the overall performance of the wireless system, designing a suitable MAC protocol is critical to fully exploit the benefit of the high speed transmission that the physical layer (PHY) offers. In the latest 802.11n standard [2], the problem of long overhead has been addressed adequately, but the issue of excessive colliding transmissions, especially in congested situations, remains untouched. The procedure for setting the backoff value is at the heart of collision avoidance in the 802.11 distributed coordination function (DCF), in which each station makes its own decision on how to avoid a collision in its next transmission. However, collision avoidance is a problem that cannot be solved by a single station. In this paper, we introduce a new MAC protocol called Intelligent Local Avoided Collision (iLAC) that redefines individual rationality in choosing the backoff counter value to avoid a colliding transmission. The distinguishing feature of iLAC is that it fundamentally changes this decision-making process from collision avoidance to collaborative collision prevention. As a result, stations can avoid colliding transmissions with much greater precision. The analytical solution confirms the validity of this proposal, and simulation results show that the proposed algorithm outperforms the conventional algorithms by a large margin.

  3. Analytic integration of real-virtual counterterms in NNLO jet cross sections II

    Energy Technology Data Exchange (ETDEWEB)

    Bolzoni, Paolo; Moch, Sven-Olaf [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Somogyi, Gabor [Zurich Univ. (Switzerland). Inst. fuer Theoretische Physik; Trocsanyi, Zoltan [Debrecen Univ. (Hungary); Hungarian Academy of Sciences, Debrecen (Hungary). Inst. of Nuclear Research

    2009-05-15

    We present analytic expressions of all integrals required to complete the explicit evaluation of the real-virtual integrated counterterms needed to define a recently proposed subtraction scheme for jet cross sections at next-to-next-to-leading order in QCD. We use the Mellin-Barnes representation of these integrals in 4 - 2ε dimensions to obtain the coefficients of their Laurent expansions around ε = 0. These coefficients are given by linear combinations of multidimensional Mellin-Barnes integrals. We compute the coefficients of such expansions in ε both numerically and analytically by complex integration over the Mellin-Barnes contours. (orig.)

  4. Analytic integration of real-virtual counterterms in NNLO jet cross sections II

    International Nuclear Information System (INIS)

    Bolzoni, Paolo; Moch, Sven-Olaf; Somogyi, Gabor; Trocsanyi, Zoltan

    2009-01-01

    We present analytic expressions of all integrals required to complete the explicit evaluation of the real-virtual integrated counterterms needed to define a recently proposed subtraction scheme for jet cross sections at next-to-next-to-leading order in QCD. We use the Mellin-Barnes representation of these integrals in 4 - 2ε dimensions to obtain the coefficients of their Laurent expansions around ε = 0. These coefficients are given by linear combinations of multidimensional Mellin-Barnes integrals. We compute the coefficients of such expansions in ε both numerically and analytically by complex integration over the Mellin-Barnes contours.

  5. Analytic integration of real-virtual counterterms in NNLO jet cross sections II

    Science.gov (United States)

    Bolzoni, Paolo; Moch, Sven-Olaf; Somogyi, Gábor; Trócsányi, Zoltán

    2009-08-01

    We present analytic expressions of all integrals required to complete the explicit evaluation of the real-virtual integrated counterterms needed to define a recently proposed subtraction scheme for jet cross sections at next-to-next-to-leading order in QCD. We use the Mellin-Barnes representation of these integrals in 4 - 2ε dimensions to obtain the coefficients of their Laurent expansions around ε = 0. These coefficients are given by linear combinations of multidimensional Mellin-Barnes integrals. We compute the coefficients of such expansions in ε both numerically and analytically by complex integration over the Mellin-Barnes contours.

  6. Analytic integration of real-virtual counterterms in NNLO jet cross sections II

    International Nuclear Information System (INIS)

    Bolzoni, Paolo; Moch, Sven-Olaf; Somogyi, Gabor; Trocsanyi, Zoltan; Hungarian Academy of Sciences, Debrecen

    2009-05-01

    We present analytic expressions of all integrals required to complete the explicit evaluation of the real-virtual integrated counterterms needed to define a recently proposed subtraction scheme for jet cross sections at next-to-next-to-leading order in QCD. We use the Mellin-Barnes representation of these integrals in 4-2ε dimensions to obtain the coefficients of their Laurent expansions around ε=0. These coefficients are given by linear combinations of multidimensional Mellin-Barnes integrals. We compute the coefficients of such expansions in ε both numerically and analytically by complex integration over the Mellin-Barnes contours. (orig.)

  7. Sunfall: a collaborative visual analytics system for astrophysics

    Energy Technology Data Exchange (ETDEWEB)

    Aragon, Cecilia R.; Aragon, Cecilia R.; Bailey, Stephen J.; Poon, Sarah; Runge, Karl; Thomas, Rollin C.

    2008-07-07

    Computational and experimental sciences produce and collect ever-larger and complex datasets, often in large-scale, multi-institution projects. The inability to gain insight into complex scientific phenomena using current software tools is a bottleneck facing virtually all endeavors of science. In this paper, we introduce Sunfall, a collaborative visual analytics system developed for the Nearby Supernova Factory, an international astrophysics experiment and the largest data volume supernova search currently in operation. Sunfall utilizes novel interactive visualization and analysis techniques to facilitate deeper scientific insight into complex, noisy, high-dimensional, high-volume, time-critical data. The system combines novel image processing algorithms, statistical analysis, and machine learning with highly interactive visual interfaces to enable collaborative, user-driven scientific exploration of supernova image and spectral data. Sunfall is currently in operation at the Nearby Supernova Factory; it is the first visual analytics system in production use at a major astrophysics project.

  8. Exploring the design space of immersive urban analytics

    Directory of Open Access Journals (Sweden)

    Zhutian Chen

    2017-06-01

    Full Text Available Recent years have witnessed the rapid development and wide adoption of immersive head-mounted devices, such as HTC VIVE, Oculus Rift, and Microsoft HoloLens. These immersive devices have the potential to significantly extend the methodology of urban visual analytics by providing critical 3D context information and creating a sense of presence. In this paper, we propose a theoretical model to characterize the visualizations in immersive urban analytics. Furthermore, based on our comprehensive and concise model, we contribute a typology of combination methods of 2D and 3D visualizations that distinguishes between linked views, embedded views, and mixed views. We also propose a supporting guideline to assist users in selecting a proper view under certain circumstances by considering visual geometry and spatial distribution of the 2D and 3D visualizations. Finally, based on existing work, possible future research opportunities are explored and discussed.

  9. Sunfall: a collaborative visual analytics system for astrophysics

    International Nuclear Information System (INIS)

    Aragon, C R; Bailey, S J; Poon, S; Runge, K; Thomas, R C

    2008-01-01

    Computational and experimental sciences produce and collect ever-larger and complex datasets, often in large-scale, multi-institution projects. The inability to gain insight into complex scientific phenomena using current software tools is a bottleneck facing virtually all endeavors of science. In this paper, we introduce Sunfall, a collaborative visual analytics system developed for the Nearby Supernova Factory, an international astrophysics experiment and the largest data volume supernova search currently in operation. Sunfall utilizes novel interactive visualization and analysis techniques to facilitate deeper scientific insight into complex, noisy, high-dimensional, high-volume, time-critical data. The system combines novel image processing algorithms, statistical analysis, and machine learning with highly interactive visual interfaces to enable collaborative, user-driven scientific exploration of supernova image and spectral data. Sunfall is currently in operation at the Nearby Supernova Factory; it is the first visual analytics system in production use at a major astrophysics project

  10. Sunfall: a collaborative visual analytics system for astrophysics

    Energy Technology Data Exchange (ETDEWEB)

    Aragon, C R; Bailey, S J; Poon, S; Runge, K; Thomas, R C [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States)], E-mail: CRAragon@lbl.gov

    2008-07-15

    Computational and experimental sciences produce and collect ever-larger and complex datasets, often in large-scale, multi-institution projects. The inability to gain insight into complex scientific phenomena using current software tools is a bottleneck facing virtually all endeavors of science. In this paper, we introduce Sunfall, a collaborative visual analytics system developed for the Nearby Supernova Factory, an international astrophysics experiment and the largest data volume supernova search currently in operation. Sunfall utilizes novel interactive visualization and analysis techniques to facilitate deeper scientific insight into complex, noisy, high-dimensional, high-volume, time-critical data. The system combines novel image processing algorithms, statistical analysis, and machine learning with highly interactive visual interfaces to enable collaborative, user-driven scientific exploration of supernova image and spectral data. Sunfall is currently in operation at the Nearby Supernova Factory; it is the first visual analytics system in production use at a major astrophysics project.

  11. MASTERS OF ANALYTICAL TRADECRAFT: CERTIFYING THE STANDARDS AND ANALYTIC RIGOR OF INTELLIGENCE PRODUCTS

    Science.gov (United States)

    2016-04-01

    This Air Command and Staff College (Air University) research paper proposes establishing unit-level certified Masters of Analytic Tradecraft (MAT) analysts to be trained and entrusted to evaluate and rate the standards and ... cues) ideally should meet or exceed effective rigor (based on analytical process). To accomplish this, decision makers should not be left to their

  12. A family of multi-party authentication protocols

    NARCIS (Netherlands)

    Cremers, C.J.F.; Mauw, S.

    2006-01-01

    We introduce a family of multi-party authentication protocols and discuss six novel protocols, which are members of this family. The first three generalize the well-known Needham-Schroeder-Lowe public-key protocol, the Needham-Schroeder private-key protocol, and the Bilateral Key Exchange protocol.

  13. Critical factors for assembling a high volume of DNA barcodes

    Science.gov (United States)

    Hajibabaei, Mehrdad; deWaard, Jeremy R; Ivanova, Natalia V; Ratnasingham, Sujeevan; Dooh, Robert T; Kirk, Stephanie L; Mackie, Paula M; Hebert, Paul D.N

    2005-01-01

    Large-scale DNA barcoding projects are now moving toward activation while the creation of a comprehensive barcode library for eukaryotes will ultimately require the acquisition of some 100 million barcodes. To satisfy this need, analytical facilities must adopt protocols that can support the rapid, cost-effective assembly of barcodes. In this paper we discuss the prospects for establishing high volume DNA barcoding facilities by evaluating key steps in the analytical chain from specimens to barcodes. Alliances with members of the taxonomic community represent the most effective strategy for provisioning the analytical chain with specimens. The optimal protocols for DNA extraction and subsequent PCR amplification of the barcode region depend strongly on their condition, but production targets of 100K barcode records per year are now feasible for facilities working with compliant specimens. The analysis of museum collections is currently challenging, but PCR cocktails that combine polymerases with repair enzyme(s) promise future success. Barcode analysis is already a cost-effective option for species identification in some situations and this will increasingly be the case as reference libraries are assembled and analytical protocols are simplified. PMID:16214753

  14. An Energy-efficient Clock Synchronization Protocol for Wireless Sensor Networks

    OpenAIRE

    Albu, Roxana; Labit, Yann; Thierry, Gayraud; Pascal, Berthou

    2010-01-01

    5p.; International audience; The behavior of Wireless Sensor Networks (WSN) is nowadays widely analyzed. One of the most important issues is related to their energy consumption, as this has a major impact on the network lifetime. Another important application requirement is to ensure data sensing synchronization, which leads to additional energy consumption as a high number of messages is sent and received at each node. Our proposal consists in implementing a combined synchronization protocol...

  15. Analytic treatment of distributions of lithium neutrals and ions in linear devices

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Kyu-Sun, E-mail: kschung@hanyang.ac.kr [Department of Electrical Engineering, Hanyang University, Seoul (Korea, Republic of); Hirooka, Yoshi; Ashikawa, Naoko [National Institute for Fusion Science, Toki (Japan); Cho, Soon Gook; Choi, Heung Gyoon; Kang, In Je [Department of Electrical Engineering, Hanyang University, Seoul (Korea, Republic of); Tsuchiya, Hayato [National Institute for Fusion Science, Toki (Japan)

    2017-06-15

    Neutral lithium (Li) has been used for the mitigation of heat flux to the plasma-facing components and for the control of hydrogen in fusion plasmas. Radial and axial variations of the densities of Li neutrals and ions are obtained analytically for a cylindrical chamber by assuming classical diffusion with or without a magnetic field (B). The densities of neutrals and of ions without B can be expressed as a linear combination of the modified Bessel functions of order zero (I₀ and K₀), while the densities of ions with B are expressed as the square root of such combinations. Analytical solutions for Li neutral densities with Dirichlet and Neumann boundary conditions are compared to those obtained using Monte Carlo simulation and to experimental values from the LIGHT-1 (Lithium Injection Gettering of Hydrogen and its Transport experiments) device. Proper combinations of the relaxation length and the size of the source produce well-fitted profiles similar to those observed experimentally and those obtained using Monte Carlo codes.
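
    As a numerical companion to the analytic form mentioned above, the sketch below evaluates a radial neutral-density profile written as a linear combination of the modified Bessel functions I₀ and K₀. The coefficients, relaxation length and radii are placeholder assumptions, not values from the LIGHT-1 analysis.

    ```python
    # Sketch: radial neutral-density profile as a linear combination of modified
    # Bessel functions of order zero. Coefficients, relaxation length and radii
    # are placeholder assumptions, not LIGHT-1 values.
    import numpy as np
    from scipy.special import i0, k0

    def neutral_density(r, a=1.0, b=0.2, relaxation_length=0.05):
        """n(r) = a*I0(r/lambda) + b*K0(r/lambda), with lambda the relaxation length."""
        x = np.asarray(r, dtype=float) / relaxation_length
        x = np.where(x == 0, 1e-12, x)   # K0 diverges at r = 0; keep the axis finite
        return a * i0(x) + b * k0(x)

    r = np.linspace(0.001, 0.05, 5)      # radii in metres (illustrative)
    print(neutral_density(r))
    ```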

  16. The Impact of Combining a Low-Tube Voltage Acquisition with Iterative Reconstruction on Total Iodine Dose in Coronary CT Angiography

    Directory of Open Access Journals (Sweden)

    Toon Van Cauteren

    2017-01-01

    Full Text Available Objectives. To assess the impact of combining low-tube voltage acquisition with iterative reconstruction (IR techniques on the iodine dose in coronary CTA. Methods. Three minipigs underwent CCTA to compare a standard of care protocol with two alternative study protocols combining low-tube voltage and low iodine dose with IR. Image quality was evaluated objectively by the CT value, signal-to-noise ratio (SNR, and contrast-to-noise ratio (CNR in the main coronary arteries and aorta and subjectively by expert reading. Statistics were performed by Mann–Whitney U test and Chi-square analysis. Results. Despite reduced iodine dose, both study protocols maintained CT values, SNR, and CNR compared to the standard of care protocol. Expert readings confirmed these findings; all scans were perceived to be of at least diagnostically acceptable quality on all evaluated parameters allowing image interpretation. No statistical differences were observed (all p values > 0.11, except for streak artifacts (p=0.02 which were considered to be more severe, although acceptable, with the 80 kVp protocol. Conclusions. Reduced tube voltage in combination with IR allows a total iodine dose reduction between 37 and 50%, by using contrast media with low iodine concentrations of 200 and 160 mg I/mL, while maintaining image quality.
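
    The objective image-quality measures used above (SNR and CNR from region-of-interest statistics) follow widely used definitions. The sketch below shows one common way to compute them from ROI means and noise standard deviation; it is a generic illustration with synthetic Hounsfield-unit values, not the study's exact measurement procedure.

    ```python
    # Generic ROI-based image-quality measures as commonly defined in CT studies;
    # not the study's exact measurement procedure. ROI values are synthetic.
    import numpy as np

    def snr(roi: np.ndarray) -> float:
        """Signal-to-noise ratio: mean attenuation over its standard deviation."""
        return roi.mean() / roi.std()

    def cnr(roi: np.ndarray, background: np.ndarray) -> float:
        """Contrast-to-noise ratio: attenuation difference over background noise."""
        return (roi.mean() - background.mean()) / background.std()

    rng = np.random.default_rng(1)
    vessel = rng.normal(350, 20, size=500)   # HU values inside a contrast-filled vessel
    muscle = rng.normal(50, 15, size=500)    # HU values in a background ROI
    print(round(snr(vessel), 1), round(cnr(vessel, muscle), 1))
    ```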

  17. Analytic Model Predictive Control of Uncertain Nonlinear Systems: A Fuzzy Adaptive Approach

    Directory of Open Access Journals (Sweden)

    Xiuyan Peng

    2015-01-01

    Full Text Available A fuzzy adaptive analytic model predictive control method is proposed in this paper for a class of uncertain nonlinear systems. Specifically, invoking standard results on the Moore-Penrose inverse of a matrix, the mismatch problem that commonly exists between the input and output dimensions of systems is first solved. Then, resorting to an analytic model predictive control law combined with a fuzzy adaptive approach, the synthesis of a fuzzy adaptive predictive controller for the underlying systems is developed. To further reduce the impact of the fuzzy approximation error on the system and improve its robustness, a robust compensation term is introduced. It is shown that, by applying the fuzzy adaptive analytic model predictive controller, the rudder roll stabilization system is uniformly ultimately bounded in the H-infinity sense. Finally, simulation results demonstrate the effectiveness of the proposed method.
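
    The role of the Moore-Penrose inverse in handling mismatched input/output dimensions can be illustrated numerically. The sketch below uses the pseudoinverse to map a desired output correction onto a minimum-norm control input for a non-square gain matrix; the matrices are arbitrary examples, not the paper's rudder-roll model.

    ```python
    # Sketch: using the Moore-Penrose pseudoinverse to map a desired output
    # correction onto control inputs when the gain matrix is not square.
    # The numbers are arbitrary; this is not the paper's rudder-roll model.
    import numpy as np

    G = np.array([[1.0, 0.5, 0.2],
                  [0.3, 1.2, 0.1]])        # 2 outputs, 3 inputs (non-square)
    desired_output_change = np.array([0.4, -0.1])

    u = np.linalg.pinv(G) @ desired_output_change   # minimum-norm least-squares input
    print(u, G @ u)                                  # G @ u reproduces the target
    ```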

  18. Promoting teamwork and surgical optimization: combining TeamSTEPPS with a specialty team protocol.

    Science.gov (United States)

    Tibbs, Sheila Marie; Moss, Jacqueline

    2014-11-01

    This quality improvement project was a 300-day descriptive preintervention and postintervention comparison consisting of a convenience sample of 18 gynecology surgical team members. We administered the Team Strategies & Tools to Enhance Performance and Patient Safety (TeamSTEPPS®) Teamwork Perception Questionnaire to measure the perception of teamwork. In addition, we collected data regarding rates of compliance (ie, huddle, time out) and measurable surgical procedure times. Results showed a statistically significant increase in the mean number of team members present for each procedure (2.34 before compared with 2.61 after, P = .038) and in final time-out (FTO) compliance as a result of a clarification of the definition of the FTO (mean 1.05 before compared with 1.18 after, P = .004). Additionally, there was improvement in staff members' perception of teamwork. The implementation of team training, protocols, and algorithms can enhance surgical optimization, communication, and work relationships. Copyright © 2014 AORN, Inc. Published by Elsevier Inc. All rights reserved.

  19. Who needs inpatient detox? Development and implementation of a hospitalist protocol for the evaluation of patients for alcohol detoxification.

    Science.gov (United States)

    Stephens, John R; Liles, E Allen; Dancel, Ria; Gilchrist, Michael; Kirsch, Jonathan; DeWalt, Darren A

    2014-04-01

    Clinicians caring for patients seeking alcohol detoxification face many challenges, including lack of evidence-based guidelines for treatment and high recidivism rates. To develop a standardized protocol for determining which alcohol dependent patients seeking detoxification need inpatient versus outpatient treatment, and to study the protocol's implementation. Review of best evidence by ad hoc task force and subsequent creation of standardized protocol. Prospective observational evaluation of initial protocol implementation. Patients presenting for alcohol detoxification. Development and implementation of a protocol for evaluation and treatment of patients requesting alcohol detoxification. Number of admissions per month with primary alcohol related diagnosis (DRG), 30-day readmission rate, and length of stay, all measured before and after protocol implementation. We identified one randomized clinical trial and three cohort studies to inform the choice of inpatient versus outpatient detoxification, along with one prior protocol in this population, and combined that data with clinical experience to create an institutional protocol. After implementation, the average number of alcohol related admissions was 15.9 per month, compared with 18.9 per month before implementation (p = 0.037). There was no difference in readmission rate or length of stay. Creation and utilization of a protocol led to standardization of care for patients requesting detoxification from alcohol. Initial evaluation of protocol implementation showed a decrease in number of admissions.

  20. Evaluation of sample preparation protocols for spider venom profiling by MALDI-TOF MS.

    Science.gov (United States)

    Bočánek, Ondřej; Šedo, Ondrej; Pekár, Stano; Zdráhal, Zbyněk

    2017-07-01

    Spider venoms are highly complex mixtures containing biologically active substances with potential for use in biotechnology or pharmacology. Fingerprinting of venoms by Matrix-Assisted Laser Desorption/Ionization Time-of-Flight Mass Spectrometry (MALDI-TOF MS) is a thriving technology, enabling the rapid detection of peptide/protein components that can provide comparative information. In this study, we evaluated the effects of sample preparation procedures on MALDI-TOF mass spectral quality to establish a protocol providing the most reliable analytical outputs. We adopted initial sample preparation conditions from studies already published in this field. Three different MALDI matrices, three matrix solvents, two sample deposition methods, and different acid concentrations were tested. As a model sample, venom from Brachypelma albopilosa was used. The mass spectra were evaluated on the basis of absolute and relative signal intensities, and signal resolution. By conducting three series of analyses at three weekly intervals, the reproducibility of the mass spectra was assessed as a crucial factor in the selection of optimum conditions. A sample preparation protocol based on the use of an HCCA matrix dissolved in 50% acetonitrile with 2.5% TFA deposited onto the target by the dried-droplet method was found to provide the best results in terms of information yield and repeatability. We propose that this protocol should be followed as a standard procedure, enabling the comparative assessment of MALDI-TOF MS spider venom fingerprints. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. STADIC: a computer code for combining probability distributions

    International Nuclear Information System (INIS)

    Cairns, J.J.; Fleming, K.N.

    1977-03-01

    The STADIC computer code uses a Monte Carlo simulation technique for combining probability distributions. The specific function for combining the input distributions is defined by the user by introducing the appropriate FORTRAN statements into the appropriate subroutine. The code generates a Monte Carlo sampling from each of the input distributions and combines these according to the user-supplied function to provide, in essence, a random sampling of the combined distribution. When the desired number of samples is obtained, the output routine calculates the mean, standard deviation, and confidence limits for the resultant distribution. This method of combining probability distributions is particularly useful in cases where analytical approaches are either too difficult or undefined
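
    The procedure described above translates naturally into a few lines of code: sample each input distribution, apply the user-supplied combination function, then summarize the result. The sketch below is a modern stand-in for the FORTRAN subroutine mechanism; the combination function, the lognormal/normal inputs and the 90% interval are arbitrary illustrative choices, not STADIC defaults.

    ```python
    # Sketch of the STADIC idea: Monte Carlo combination of input probability
    # distributions through a user-supplied function, then summary statistics.
    # The combination function and input distributions are arbitrary examples.
    import numpy as np

    def combine(samples: dict) -> np.ndarray:
        """User-supplied combination function (here: failure rate * exposure time)."""
        return samples["rate"] * samples["time"]

    rng = np.random.default_rng(42)
    n = 100_000
    samples = {
        "rate": rng.lognormal(mean=np.log(1e-3), sigma=0.5, size=n),
        "time": rng.normal(loc=100.0, scale=10.0, size=n),
    }

    combined = combine(samples)
    lo, hi = np.percentile(combined, [5, 95])
    print(f"mean={combined.mean():.3e}  sd={combined.std():.3e}  "
          f"90% interval=({lo:.3e}, {hi:.3e})")
    ```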

  2. Hybrid Analytical and Data-Driven Modeling for Feed-Forward Robot Control.

    Science.gov (United States)

    Reinhart, René Felix; Shareef, Zeeshan; Steil, Jochen Jakob

    2017-02-08

    Feed-forward model-based control relies on models of the controlled plant, e.g., in robotics on accurate knowledge of manipulator kinematics or dynamics. However, mechanical and analytical models do not capture all aspects of a plant's intrinsic properties, and there remain unmodeled dynamics due to varying parameters, unmodeled friction or soft materials. In this context, machine learning is an alternative suitable technique to extract non-linear plant models from data. However, fully data-based models suffer from inaccuracies as well and are inefficient if they include learning of well-known analytical models. This paper thus argues that feed-forward control based on hybrid models comprising an analytical model and a learned error model can significantly improve modeling accuracy. Hybrid modeling here serves to combine the best of both modeling worlds. The hybrid modeling methodology is described and the approach is demonstrated for two typical problems in robotics, i.e., inverse kinematics control and computed torque control. The former is performed for a redundant soft robot and the latter for a rigid industrial robot with redundant degrees of freedom, where a complete analytical model is not available for either platform.
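
    The hybrid idea described above, an analytical model corrected by a learned error model, can be sketched in a few lines. Below, a deliberately incomplete analytical torque model is corrected by a regressor trained only on its residuals; the models, data and polynomial regressor are toy assumptions, not the paper's robot platforms or learning method.

    ```python
    # Toy sketch of hybrid feed-forward modeling: an analytical model corrected by
    # a learned residual (error) model. Models and data are illustrative assumptions.
    import numpy as np
    from sklearn.linear_model import Ridge
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import PolynomialFeatures

    def analytical_torque(q):
        return 2.0 * np.sin(q)                        # simplified, incomplete model

    def true_torque(q):
        return analytical_torque(q) + 0.6 * q**3 + 0.3 * np.tanh(5 * q)  # "reality"

    rng = np.random.default_rng(3)
    q_train = rng.uniform(-1.5, 1.5, size=(300, 1))   # joint positions
    residual = true_torque(q_train) - analytical_torque(q_train)

    # Learn only the error of the analytical model, not the full dynamics.
    error_model = make_pipeline(PolynomialFeatures(degree=5), Ridge(alpha=1e-3))
    error_model.fit(q_train, residual)

    def hybrid_torque(q):
        return analytical_torque(q) + error_model.predict(q)

    q_test = np.linspace(-1.5, 1.5, 50).reshape(-1, 1)
    print("analytical mean |error|:",
          float(np.abs(true_torque(q_test) - analytical_torque(q_test)).mean()))
    print("hybrid     mean |error|:",
          float(np.abs(true_torque(q_test) - hybrid_torque(q_test)).mean()))
    ```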

  3. Evaluation and standardisation of fast analytical techniques for destructive radwaste control

    International Nuclear Information System (INIS)

    De Simone, A.; Troiani, F.

    2001-01-01

    The document describes the work programme carried out by the Laboratorio Nazionale per la 'Caratterizzazione dei Rifiuti Radioattivi' in the frame of the European research project Destructive Radwaste Control. The main tasks of the research work were the evaluation of fast sample pre-treatment procedures and the development of chromatographic methods coupled to fast nuclide detection by Liquid Scintillation Counting. In order to test the High Performance Ion Chromatograph (HPIC) coupled to the Liquid Scintillation Counter (LSC) on high-salt-content solutions, synthetic cement solutions were prepared and spiked with several β-emitters that are hard to measure by non-destructive analysis, along with other radionuclides important for the determination of the radiological inventory of radwastes. As the validation tests for the new analytical methods involved the manipulation of radioactive solutions, a remote area for the HPIC-LSC apparatus was designed and set up in order to operate in safe conditions. According to the research programme, fast analytical methods for the chemical separation and detection of the radionuclides of interest were developed and qualified. From the results of the work, some protocols of analysis have been defined: they contain all information about the operating conditions of the HPIC-LSC apparatus, the field of applicability, and the chemical and radioactive detection limits [it

  4. Effects of general, specific and combined warm-up on explosive muscular performance

    Directory of Open Access Journals (Sweden)

    David Cristobal Andrade

    2015-02-01

    Full Text Available The purpose of this study was to compare the acute effects of general, specific and combined warm-up (WU) on explosive performance. Healthy male subjects (n = 10) participated in six WU protocols in a crossover randomized study design. Protocols were: passive rest (PR; 15 min of passive rest), running (Run; 5 min of running at 70% of maximum heart rate), stretching (STR; 5 min of static stretching exercise), jumping [Jump; 5 min of jumping exercises – 3x8 countermovement jumps (CMJ) and 3x8 drop jumps from 60 cm (DJ60)], and combined (COM; protocols Run + STR + Jump combined). Immediately before and after each WU, subjects were assessed for explosive concentric-only (i.e. squat jump, SJ), slow stretch-shortening cycle (i.e. CMJ), fast stretch-shortening cycle (i.e. DJ60) and contact time (CT) muscle performance. PR significantly reduced SJ performance (p = 0.007). Run increased SJ (p = 0.0001) and CMJ (p = 0.002). STR increased CMJ (p = 0.048). Specific WU (i.e. Jump) increased SJ (p = 0.001), CMJ (p = 0.028) and DJ60 (p = 0.006) performance. COM increased CMJ performance (p = 0.006). Jump was superior in SJ performance vs. PR (p = 0.001). Jump reduced CT in DJ60 (p = 0.03). In conclusion, general, specific and combined WU increase slow stretch-shortening cycle (SSC) muscle performance, but only specific WU increases fast SSC muscle performance. Therefore, to increase fast SSC performance, specific fast SSC muscle actions must be included during the WU.

  5. A Cryptographic Moving-Knife Cake-Cutting Protocol

    Directory of Open Access Journals (Sweden)

    Yoshifumi Manabe

    2012-02-01

    Full Text Available This paper proposes a cake-cutting protocol using cryptography when the cake is a heterogeneous good that is represented by an interval on a real line. Although the Dubins-Spanier moving-knife protocol with one knife achieves simple fairness, all players must execute the protocol synchronously. Thus, the protocol cannot be executed on asynchronous networks such as the Internet. We show that the moving-knife protocol can be executed asynchronously by a discrete protocol using a secure auction protocol. The number of cuts is n-1 where n is the number of players, which is the minimum.
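
    For intuition, the non-cryptographic discrete analogue of the Dubins-Spanier moving-knife procedure can be simulated directly: each remaining player marks the leftmost point at which the piece to the left is worth 1/n of their whole-cake value, and the smallest mark wins that piece. The paper's contribution is to realize the minimum-selection step with a secure auction so the protocol can run asynchronously; the sketch below omits all cryptography, and the grid resolution and player valuations are arbitrary assumptions.

    ```python
    # Non-cryptographic simulation of the discrete Dubins-Spanier moving-knife idea:
    # each remaining player marks the leftmost point worth 1/n of the whole cake to
    # them; the smallest mark wins that piece. The paper replaces the public minimum
    # selection with a secure auction. Valuations and resolution are arbitrary.
    import numpy as np

    GRID = np.linspace(0.0, 1.0, 1001)                 # cake = interval [0, 1]
    DX = GRID[1] - GRID[0]

    def make_valuation(weights):
        """Cumulative value V(x) of [0, x] for a piecewise-constant density, total 1."""
        d = np.repeat(np.asarray(weights, float), len(GRID) // len(weights) + 1)[:len(GRID)]
        cum = np.concatenate([[0.0], np.cumsum(0.5 * (d[1:] + d[:-1]) * DX)])
        return cum / cum[-1]

    valuations = {                                      # three players, arbitrary tastes
        "A": make_valuation([3, 1, 1, 1]),
        "B": make_valuation([1, 1, 3, 1]),
        "C": make_valuation([1, 1, 1, 3]),
    }

    players, left_idx, n = list(valuations), 0, len(valuations)
    while len(players) > 1:
        marks = {}
        for p in players:
            gained = valuations[p] - valuations[p][left_idx]   # value of [left, x]
            marks[p] = int(np.searchsorted(gained, 1.0 / n))   # leftmost x worth 1/n
        winner = min(marks, key=marks.get)                     # smallest mark wins the piece
        print(f"{winner} receives [{GRID[left_idx]:.3f}, {GRID[marks[winner]]:.3f}]")
        left_idx = marks[winner]
        players.remove(winner)
    print(f"{players[0]} receives [{GRID[left_idx]:.3f}, 1.000]")
    ```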

  6. Implementing the Kyoto protocol in Europe: Interactions between international and Community controls

    International Nuclear Information System (INIS)

    Tabau, Anne-Sophie

    2011-07-01

    This bibliographical note presents a book that discusses the coexistence of the Kyoto Protocol and a regional regime within the European Union, and the control mechanisms required for the actual application of their rules. The international regime implements continuous monitoring that combines conventional techniques with more intrusive procedures. The European Community introduced a non-contentious mechanism with a broad and strong legal basis and the ability to impose sanctions. The author assesses the ability of the monitoring system as a whole to ensure the very credibility of the Protocol. She also assesses the reliability of the international and Community economic tools that aim to reduce greenhouse gas emissions at minimum cost. Finally, she discusses the desirable evolutions of the regime for combating climate change.

  7. The Simplest Protocol for Oblivious Transfer

    DEFF Research Database (Denmark)

    Chou, Tung; Orlandi, Claudio

    2015-01-01

    Oblivious Transfer (OT) is the fundamental building block of cryptographic protocols. In this paper we describe the simplest and most efficient protocol for 1-out-of-n OT to date, which is obtained by tweaking the Diffie-Hellman key-exchange protocol. The protocol achieves UC-security against...... active and adaptive corruptions in the random oracle model. Due to its simplicity, the protocol is extremely efficient and it allows to perform m 1-out-of-n OTs using only: - Computation: (n+1)m+2 exponentiations (mn for the receiver, mn+2 for the sender) and - Communication: 32(m+1) bytes (for the group...... optimizations) is at least one order of magnitude faster than previous work. Category / Keywords: cryptographic protocols / Oblivious Transfer, UC Security, Elliptic Curves, Efficient Implementation...
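
    To make the "tweaked Diffie-Hellman" idea concrete, the toy sketch below follows the general shape of such a 1-out-of-2 OT in a multiplicative group modulo a prime: the receiver's message either is or is not blinded by the sender's public value, and the sender derives one key per possible choice. This is an illustration of the idea only; it is not the paper's exact elliptic-curve construction, it is not UC-secure, and the group parameters are toy values chosen for readability.

    ```python
    # Toy illustration of a Diffie-Hellman-style 1-out-of-2 OT (the general idea
    # behind such protocols), over integers modulo a Mersenne prime with SHA-256.
    # NOT the paper's elliptic-curve construction and NOT secure as written.
    # Requires Python 3.8+ (negative exponents in pow() for modular inverses).
    import hashlib
    import secrets

    p = 2**127 - 1        # Mersenne prime; a toy modulus, far from secure practice
    g = 3

    def H(x: int) -> bytes:
        return hashlib.sha256(str(x).encode()).digest()

    # Sender: publish A = g^a
    a = secrets.randbelow(p - 2) + 1
    A = pow(g, a, p)

    # Receiver with choice bit c: blind the reply with A only when c == 1
    c = 1
    b = secrets.randbelow(p - 2) + 1
    B = (pow(A, c, p) * pow(g, b, p)) % p
    k_receiver = H(pow(A, b, p))

    # Sender derives one key per possible choice; only k_keys[c] matches k_receiver
    k_keys = [H(pow(B * pow(A, -i, p) % p, a, p)) for i in (0, 1)]

    assert k_keys[c] == k_receiver and k_keys[1 - c] != k_receiver
    print("receiver can decrypt only the message encrypted under key", c)
    ```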

  8. Playing With Population Protocols

    Directory of Open Access Journals (Sweden)

    Xavier Koegler

    2009-06-01

    Full Text Available Population protocols have been introduced as a model of sensor networks consisting of very limited mobile agents with no control over their own movement: A collection of anonymous agents, modeled by finite automata, interact in pairs according to some rules. Predicates on the initial configurations that can be computed by such protocols have been characterized under several hypotheses. We discuss here whether and when the rules of interactions between agents can be seen as a game from game theory. We do so by discussing several basic protocols.

  9. The Convergence of High Performance Computing and Large Scale Data Analytics

    Science.gov (United States)

    Duffy, D.; Bowen, M. K.; Thompson, J. H.; Yang, C. P.; Hu, F.; Wills, B.

    2015-12-01

    As the combinations of remote sensing observations and model outputs have grown, scientists are increasingly burdened with both the necessity and complexity of large-scale data analysis. Scientists are increasingly applying traditional high performance computing (HPC) solutions to solve their "Big Data" problems. While this approach has the benefit of limiting data movement, the HPC system is not optimized to run analytics, which can create problems that permeate throughout the HPC environment. To solve these issues and to alleviate some of the strain on the HPC environment, the NASA Center for Climate Simulation (NCCS) has created the Advanced Data Analytics Platform (ADAPT), which combines both HPC and cloud technologies to create an agile system designed for analytics. Large, commonly used data sets are stored in this system in a write once/read many file system, such as Landsat, MODIS, MERRA, and NGA. High performance virtual machines are deployed and scaled according to the individual scientist's requirements specifically for data analysis. On the software side, the NCCS and GMU are working with emerging commercial technologies and applying them to structured, binary scientific data in order to expose the data in new ways. Native NetCDF data is being stored within a Hadoop Distributed File System (HDFS) enabling storage-proximal processing through MapReduce while continuing to provide accessibility of the data to traditional applications. Once the data is stored within HDFS, an additional indexing scheme is built on top of the data and placed into a relational database. This spatiotemporal index enables extremely fast mappings of queries to data locations to dramatically speed up analytics. These are some of the first steps toward a single unified platform that optimizes for both HPC and large-scale data analysis, and this presentation will elucidate the resulting and necessary exascale architectures required for future systems.

  10. Model Additional Protocol

    International Nuclear Information System (INIS)

    Rockwood, Laura

    2001-01-01

    Since the end of the cold war a series of events has changed the circumstances and requirements of the safeguards system. The discovery of a clandestine nuclear weapons program in Iraq, the continuing difficulty in verifying the initial report of Democratic People's Republic of Korea upon entry into force of their safeguards agreement, and the decision of the South African Government to give up its nuclear weapons program and join the Treaty on the Non-Proliferation of Nuclear Weapons have all played a role in an ambitious effort by IAEA Member States and the Secretariat to strengthen the safeguards system. A major milestone in this effort was reached in May 1997 when the IAEA Board of Governors approved a Model Protocol Additional to Safeguards Agreements. The Model Additional Protocol was negotiated over a period of less than a year by an open-ended committee of the Board involving some 70 Member States and two regional inspectorates. The IAEA is now in the process of negotiating additional protocols, State by State, and implementing them. These additional protocols will provide the IAEA with rights of access to information about all activities related to the use of nuclear material in States with comprehensive safeguards agreements and greatly expanded physical access for IAEA inspectors to confirm or verify this information. In conjunction with this, the IAEA is working on the integration of these measures with those provided for in comprehensive safeguards agreements, with a view to maximizing the effectiveness and efficiency, within available resources, the implementation of safeguards. Details concerning the Model Additional Protocol are given. (author)

  11. Asymptotic adaptive bipartite entanglement-distillation protocol

    International Nuclear Information System (INIS)

    Hostens, Erik; Dehaene, Jeroen; De Moor, Bart

    2006-01-01

    We present an asymptotic bipartite entanglement-distillation protocol that outperforms all existing asymptotic schemes. This protocol is based on the breeding protocol with the incorporation of two-way classical communication. Like breeding, the protocol starts with an infinite number of copies of a Bell-diagonal mixed state. Breeding can be carried out as successive stages of partial information extraction, yielding the same result: one bit of information is gained at the cost (measurement) of one pure Bell state pair (ebit). The basic principle of our protocol is at every stage to replace measurements on ebits by measurements on a finite number of copies, whenever there are two equiprobable outcomes. In that case, the entropy of the global state is reduced by more than one bit. Therefore, every such replacement results in an improvement of the protocol. We explain how our protocol is organized as to have as many replacements as possible. The yield is then calculated for Werner states

  12. Business analytics a practitioner's guide

    CERN Document Server

    Saxena, Rahul

    2013-01-01

    This book provides a guide to businesses on how to use analytics to help drive from ideas to execution. Analytics used in this way provides "full lifecycle support" for business and helps during all stages of management decision-making and execution. The framework presented in the book enables the effective interplay of business, analytics, and information technology (business intelligence) both to leverage analytics for competitive advantage and to embed the use of business analytics into the business culture. It lays out an approach for analytics, describes the processes used, and provides gu

  13. eAnalytics: Dynamic Web-based Analytics for the Energy Industry

    Directory of Open Access Journals (Sweden)

    Paul Govan

    2016-11-01

    Full Text Available eAnalytics is a web application built on top of R that provides dynamic data analytics to energy industry stakeholders. The application allows users to dynamically manipulate chart data and style through the Shiny package’s reactive framework. eAnalytics currently supports a number of features including interactive datatables, dynamic charting capabilities, and the ability to save, download, or export information for further use. Going forward, the goal for this project is that it will serve as a research hub for discovering new relationships in the data. The application is illustrated with a simple tutorial of the user interface design.

  14. Artist - analytical RT inspection simulation tool

    International Nuclear Information System (INIS)

    Bellon, C.; Jaenisch, G.R.

    2007-01-01

    The computer simulation of radiography is applicable for different purposes in NDT, such as the qualification of NDT systems, the prediction of their reliability, the optimization of system parameters, feasibility analysis, model-based data interpretation, the education and training of NDT/NDE personnel, and others. Within the framework of the integrated project FilmFree, the radiographic testing (RT) simulation software developed by BAM is being further developed to meet practical requirements for inspection planning in digital industrial radiology. It combines analytical modelling of the RT inspection process with a CAD-orientated object description applicable to various industrial sectors such as power generation, railways and others. (authors)
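
    The record does not detail the analytical model itself. Purely as a generic illustration of the attenuation-law core on which analytical RT simulation builds (not code from ARTIST or FilmFree, and with placeholder coefficients), the sketch below compares the transmitted intensity of a ray through a defect-free wall and through a wall containing a small void.

```python
import math

def transmitted_intensity(i0, segments):
    """Transmitted intensity of one ray through a stack of materials using the
    exponential attenuation law I = I0 * exp(-sum(mu_i * t_i)).
    segments: list of (linear_attenuation_coefficient_per_mm, thickness_mm)."""
    exponent = sum(mu * t for mu, t in segments)
    return i0 * math.exp(-exponent)

# Placeholder values: a ray crossing 20 mm of steel, with and without a 2 mm void.
steel_mu = 0.05                                  # per mm, illustrative only
defect_free = [(steel_mu, 20.0)]
with_void = [(steel_mu, 18.0), (0.0, 2.0)]       # the void does not attenuate

i0 = 1.0
print("no defect :", transmitted_intensity(i0, defect_free))
print("with void :", transmitted_intensity(i0, with_void))
```

    The intensity difference between the two rays is what produces the flaw contrast in the simulated radiograph; a full simulator traces such rays through the CAD-described part for every detector pixel.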

  15. Novel Techniques with the Aid of a Staged CBCT Guided Surgical Protocol

    Directory of Open Access Journals (Sweden)

    Evdokia Chasioti

    2015-01-01

    Full Text Available This case report presents some novel techniques for using a "staged" protocol that utilizes strategic periodontally involved teeth as transitional abutments in combination with CBCT-guided implant surgery. Staging the case prevented premature loading of the grafted sites during the healing phase. A CBCT scan following a tenting-screw guided bone regeneration procedure ensured adequate bone to place an implant fixture. Proper assessment of the CBCT allowed the surgeon to perform an osteotome internal sinus lift in an optimum location, and the depth of bone needed for the osteotome sinus floor elevation was planned. The staged appliance allowed these sinus-augmented sites to heal for an extended period compared with implants that were uncovered and loaded earlier. The staged protocol and CBCT analysis enabled the immediate implants to be placed in proper alignment with the adjacent fixture. After the teeth were extracted, the osseointegrated implants were converted to abutments for the transitional appliance. Finally, the staged protocol allowed for soft tissue enhancement in the implant and pontic areas prior to final insertion of the prosthesis.

  16. Analytical determination of thermal conductivity of W-UO2 and W-UN CERMET nuclear fuels

    Science.gov (United States)

    Webb, Jonathan A.; Charit, Indrajit

    2012-08-01

    The thermal conductivity of tungsten-based CERMET fuels containing UO2 and UN fuel particles is determined as a function of particle geometry, stabilizer fraction, and fuel volume fraction, using a combination of an analytical approach and experimental data collected from the literature. Thermal conductivity is estimated using the Bruggeman-Fricke model. This study demonstrates that the thermal conductivities of various CERMET fuels can be analytically predicted to values that are very close to the experimentally determined ones.
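
    The Bruggeman-Fricke model used in the paper accounts for particle geometry; the sketch below is a simpler, assumption-laden illustration that solves only the symmetric Bruggeman effective-medium equation for spherical inclusions, with placeholder conductivities rather than the paper's data, to show how an effective conductivity follows from the phase conductivities and volume fractions.

```python
def bruggeman_keff(phases, lo=1e-6, hi=None, tol=1e-10):
    """Solve the symmetric Bruggeman equation
       sum_i v_i * (k_i - k_eff) / (k_i + 2*k_eff) = 0
    for spherical inclusions by bisection.
    phases: list of (volume_fraction, thermal_conductivity) pairs."""
    if hi is None:
        hi = max(k for _, k in phases)
    def f(keff):
        return sum(v * (k - keff) / (k + 2.0 * keff) for v, k in phases)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        # f decreases with keff, so keep the half-interval containing the root
        if f(mid) > 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Placeholder conductivities (W/m-K), not measured data: W matrix, UO2 particles.
k_w, k_uo2 = 100.0, 4.0
for uo2_fraction in (0.3, 0.4, 0.5, 0.6):
    keff = bruggeman_keff([(1.0 - uo2_fraction, k_w), (uo2_fraction, k_uo2)])
    print(f"UO2 volume fraction {uo2_fraction:.1f}: k_eff ~ {keff:6.1f} W/m-K")
```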

  17. ASSESSMENT OF RIP-V1 AND OSPF-V2 PROTOCOL WITH CONSIDERATION OF CONVERGENCE CRITERIA AND SENDING PROTOCOLS TRAFFIC

    Directory of Open Access Journals (Sweden)

    Hamed Jelodar

    2014-03-01

    Full Text Available Routing protocols are a fundamental component of networks such as the Internet and transport and mobile networks. A routing protocol comprises a set of rules and algorithms that evaluate routing metrics and select the best path for delivering data packets from origin to destination. Dynamic routing protocols adapt to changes in network topology. RIP and OSPF are both dynamic routing protocols; in this work we assess RIP version 1 and OSPF version 2 against criteria such as convergence and the traffic generated by the protocols themselves. From tests carried out in the OPNET simulator, we found the OSPF protocol to be more efficient than the RIP protocol.
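
    The assessment itself is simulation based; purely as a reminder of the algorithmic difference behind the two protocols (a simplified illustration, not OPNET code), the sketch below computes routes over one small invented topology in both styles: a RIP-like distance-vector iteration using hop count with the 15-hop limit, and an OSPF-like Dijkstra computation over link costs.

```python
import heapq

# Small example topology: {node: {neighbor: link_cost}} (placeholder values).
graph = {
    "A": {"B": 1, "C": 4},
    "B": {"A": 1, "C": 2, "D": 7},
    "C": {"A": 4, "B": 2, "D": 3},
    "D": {"B": 7, "C": 3},
}

def rip_style_distances(graph, source, max_hops=15):
    """Distance-vector (Bellman-Ford) iteration with a hop-count metric,
    mirroring RIP's 15-hop limit; returns hop counts from the source."""
    dist = {node: float("inf") for node in graph}
    dist[source] = 0
    for _ in range(max_hops):
        updated = False
        for node, neighbors in graph.items():
            for nbr in neighbors:
                if dist[node] + 1 < dist[nbr]:
                    dist[nbr] = dist[node] + 1
                    updated = True
        if not updated:          # converged: no distance vector changed
            break
    return dist

def ospf_style_distances(graph, source):
    """Link-state shortest paths (Dijkstra) over link costs, as OSPF computes
    them from its link-state database."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue
        for nbr, cost in graph[node].items():
            nd = d + cost
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                heapq.heappush(heap, (nd, nbr))
    return dist

print("RIP-style hop counts from A :", rip_style_distances(graph, "A"))
print("OSPF-style path costs from A:", ospf_style_distances(graph, "A"))
```

    The two metrics can pick different paths (hop count prefers A-C directly, link cost prefers A-B-C), which is one reason convergence behaviour and generated traffic differ between the protocols in simulation.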

  18. Using an innovative combination of quality-by-design and green analytical chemistry approaches for the development of a stability indicating UHPLC method in pharmaceutical products.

    Science.gov (United States)

    Boussès, Christine; Ferey, Ludivine; Vedrines, Elodie; Gaudin, Karen

    2015-11-10

    An innovative combination of green chemistry and quality by design (QbD) approaches is presented through the development of a UHPLC method for the analysis of the main degradation products of dextromethorphan hydrobromide. The QbD strategy was integrated with green analytical chemistry to improve method understanding while assuring quality and minimizing environmental impact and analyst exposure. The analytical method was thoroughly evaluated by applying risk assessment and multivariate analysis tools. After a scouting phase aimed at selecting a suitable stationary phase and an organic solvent in accordance with green chemistry principles, quality risk assessment tools were applied to determine the critical process parameters (CPPs). The effects of the CPPs on the critical quality attributes (CQAs), i.e., resolutions, efficiencies, and solvent consumption, were then evaluated by means of a screening design. A response surface methodology was carried out to model the CQAs as functions of the selected CPPs, and the optimal separation conditions were determined through a desirability analysis. The resulting contour plots enabled the design space (DS), the method operable design region, to be established, within which all CQAs fulfilled the requirements. Experimental validation of the DS proved that quality within the DS was guaranteed; therefore, no further robustness study was required before validation. Finally, this UHPLC method was validated using the concept of total error and was used to analyze a pharmaceutical drug product. Copyright © 2015 Elsevier B.V. All rights reserved.
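
    The published work used dedicated design-of-experiments software; purely to illustrate the desirability step, the sketch below assumes two invented quadratic response-surface models over two coded CPPs (a resolution to maximize and a solvent consumption to minimize) and locates the most desirable operating point by a coarse grid search using Derringer-type desirability functions combined through a geometric mean.

```python
import itertools

# Hypothetical quadratic response-surface models in coded factors x1, x2
# (e.g. gradient time and column temperature); coefficients are invented
# for illustration only.
def resolution(x1, x2):
    return 2.0 + 0.6 * x1 - 0.3 * x2 - 0.4 * x1 * x1 - 0.1 * x1 * x2

def solvent_ml(x1, x2):
    return 8.0 - 1.5 * x1 + 0.8 * x2 + 0.5 * x1 * x1

def d_larger_is_better(y, low, target):
    """Derringer-type desirability for a response to maximize."""
    if y <= low:
        return 0.0
    if y >= target:
        return 1.0
    return (y - low) / (target - low)

def d_smaller_is_better(y, target, high):
    """Derringer-type desirability for a response to minimize."""
    if y <= target:
        return 1.0
    if y >= high:
        return 0.0
    return (high - y) / (high - target)

best = None
grid = [i / 10.0 for i in range(-10, 11)]          # coded levels in [-1, 1]
for x1, x2 in itertools.product(grid, grid):
    d1 = d_larger_is_better(resolution(x1, x2), low=1.5, target=2.2)
    d2 = d_smaller_is_better(solvent_ml(x1, x2), target=6.0, high=9.0)
    overall = (d1 * d2) ** 0.5                      # geometric mean
    if best is None or overall > best[0]:
        best = (overall, x1, x2)

print(f"Most desirable setting: x1={best[1]:+.1f}, x2={best[2]:+.1f}, D={best[0]:.3f}")
```

    In a QbD workflow, the region of factor settings where the overall desirability stays acceptable is what defines the design space; operating anywhere inside it keeps every CQA within its requirement.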

  19. Using research literature to develop a perceptual retraining treatment protocol.

    Science.gov (United States)

    Neistadt, M E

    1994-01-01

    Treatment protocols derived from research literature can help therapists provide more rigorous treatment and more systematic assessment of client progress. This study applied research findings about the influence of task, subject, and feedback parameters on adult performance with block designs to an occupational therapy treatment protocol for parquetry block assembly--an activity occupational therapists use to remediate constructional deficits. Task parameter research suggests that parquetry tasks can be graded according to the features of the design cards, with cards having all block boundaries drawn in being easier than those with some block boundaries omitted. Subject parameter findings suggest that clients' lesions and initial constructional competence can influence their approaches to parquetry tasks. Feedback parameter research suggests that a combination of perceptual and planning cues is most effective for parquetry tasks. Methods to help clients transfer constructional skills from parquetry to functional tasks are also discussed.

  20. Green sample preparation for liquid chromatography and capillary electrophoresis of anionic and cationic analytes.

    Science.gov (United States)

    Wuethrich, Alain; Haddad, Paul R; Quirino, Joselito P

    2015-04-21

    A sample preparation device for the simultaneous enrichment and separation of cationic and anionic analytes was designed and implemented in an eight-channel configuration. The device is based on the use of an electric field to transfer the analytes from a large volume of sample into small volumes of electrolyte suspended in two glass micropipettes by a conductive hydrogel. This simple, economical, fast, and green (no organic solvent required) sample preparation scheme was evaluated using cationic and anionic herbicides as test analytes in water. The analytical figures of merit and ecological aspects were evaluated against the state-of-the-art sample preparation technique, solid-phase extraction. A drastic reduction in both sample preparation time (94% faster) and resources (99% fewer consumables used) was observed. Finally, the technique, in combination with high-performance liquid chromatography and capillary electrophoresis, was applied to the analysis of quaternary ammonium and phenoxypropionic acid herbicides in fortified river water as well as drinking water (at levels relevant to Australian guidelines). The presented sustainable sample preparation approach could easily be applied to other charged analytes or adopted by other laboratories.