WorldWideScience

Sample records for test set analytes

  1. Test set of gaseous analytes at Hanford tank farms

    International Nuclear Information System (INIS)

    1997-01-01

    DOE has stored toxic and radioactive waste materials in large underground tanks. When the vapors in the tank headspaces vent to the open atmosphere, a potentially dangerous situation can arise for personnel in the area. An open-path atmospheric pollution monitor is being developed to monitor the open air space above these tanks. In developing this infrared spectral monitor as a safety alert instrument, it is important to know which hazardous gases, called the Analytes of Concern, are most likely to be found in dangerous concentrations. The monitor must also consider other gases that could interfere with measurements of the Analytes of Concern. The total list of gases, called the Test Set Analytes, forms the basis for testing the pollution monitor. Prior measurements in 54 tank headspaces have detected 102 toxic air pollutants (TAPs) and over 1000 other analytes. The hazardous analytes are ranked herein by a Hazardous Atmosphere Rating, which combines their measured concentration, their density relative to air, and the concentration at which they become dangerous. The top 20 toxic air pollutants, as ranked by the Hazardous Atmosphere Rating, and the top 20 other analytes, in terms of measured concentrations, are analyzed for possible inclusion in the Test Set Analytes. Of these 40 gases, 20 are selected. To these 20 gases are added the 6 omnipresent atmospheric gases with the highest concentrations, since their spectra could interfere with measurements of the other spectra. The 26 Test Set Analytes are divided into a Primary Set and a Secondary Set. The Primary Set, gases which must be detectable by the monitor, includes the 6 atmospheric gases and the 6 hazardous gases which have been measured at dangerous concentrations. The Secondary Set gases need not be monitored at this time. The infrared spectra indicate that the pollution monitor will detect all 26 Test Set Analytes by thermal emission and 15 Test Set Analytes by laser absorption.
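
    The abstract describes the ranking but not the rating formula itself. The Python sketch below only illustrates the idea: analytes are scored by how close the measured concentration is to the dangerous level, weighted by density relative to air (heavier-than-air gases linger near personnel). The names, numbers, and the formula are all hypothetical stand-ins.

    ```python
    # Illustrative ranking of analytes by a hazard-style rating. The exact
    # Hazardous Atmosphere Rating formula is not given in the abstract; this
    # sketch simply assumes rating = (measured / dangerous) * relative_density.
    from dataclasses import dataclass

    @dataclass
    class Analyte:
        name: str
        measured_ppm: float      # measured headspace concentration
        dangerous_ppm: float     # concentration considered dangerous
        relative_density: float  # density relative to air (>1 sinks toward personnel)

    def hazard_rating(a: Analyte) -> float:
        return (a.measured_ppm / a.dangerous_ppm) * a.relative_density

    analytes = [  # hypothetical values for illustration only
        Analyte("ammonia", 400.0, 300.0, 0.59),
        Analyte("nitrous oxide", 150.0, 1000.0, 1.53),
        Analyte("benzene", 2.0, 1.0, 2.70),
    ]

    for a in sorted(analytes, key=hazard_rating, reverse=True):
        print(f"{a.name:15s} rating = {hazard_rating(a):.2f}")
    ```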

  2. Analytic webs support the synthesis of ecological data sets.

    Science.gov (United States)

    Ellison, Aaron M; Osterweil, Leon J; Clarke, Lori; Hadley, Julian L; Wise, Alexander; Boose, Emery; Foster, David R; Hanson, Allen; Jensen, David; Kuzeja, Paul; Riseman, Edward; Schultz, Howard

    2006-06-01

    A wide variety of data sets produced by individual investigators are now synthesized to address ecological questions that span a range of spatial and temporal scales. It is important to facilitate such syntheses so that "consumers" of data sets can be confident that both input data sets and synthetic products are reliable. Necessary documentation to ensure the reliability and validation of data sets includes both familiar descriptive metadata and formal documentation of the scientific processes used (i.e., process metadata) to produce usable data sets from collections of raw data. Such documentation is complex and difficult to construct, so it is important to help "producers" create reliable data sets and to facilitate their creation of required metadata. We describe a formal representation, an "analytic web," that aids both producers and consumers of data sets by providing complete and precise definitions of scientific processes used to process raw and derived data sets. The formalisms used to define analytic webs are adaptations of those used in software engineering, and they provide a novel and effective support system for both the synthesis and the validation of ecological data sets. We illustrate the utility of an analytic web as an aid to producing synthetic data sets through a worked example: the synthesis of long-term measurements of whole-ecosystem carbon exchange. Analytic webs are also useful validation aids for consumers because they support the concurrent construction of a complete, Internet-accessible audit trail of the analytic processes used in the synthesis of the data sets. Finally, we describe our early efforts to evaluate these ideas through the use of a prototype software tool, SciWalker. We indicate how this tool has been used to create analytic webs tailored to specific data-set synthesis and validation activities, and suggest extensions to it that will support additional forms of validation. The process metadata created by SciWalker is

  3. An analytic data analysis method for oscillatory slug tests.

    Science.gov (United States)

    Chen, Chia-Shyun

    2006-01-01

    An analytical data analysis method is developed for slug tests in partially penetrating wells in confined or unconfined aquifers of high hydraulic conductivity. As adapted from the van der Kamp method, the determination of the hydraulic conductivity is based on the occurrence times and the displacements of the extreme points measured from the oscillatory data, and on their theoretical counterparts available in the literature. This method is applied to two sets of slug test response data presented by Butler et al.: one set shows slow damping with seven discernible extreme points, and the other shows rapid damping with three extreme points. The estimates of the hydraulic conductivity obtained by the analytic method are in good agreement with those determined by an available curve-matching technique.
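
    As a small illustration of working with such oscillatory records, the Python sketch below extracts the damped angular frequency and damping coefficient from the times and displacements of successive extreme points. The van der Kamp relations that convert these quantities into a hydraulic conductivity are not reproduced here, and the data points are hypothetical.

    ```python
    # Minimal sketch: estimate damped frequency and damping coefficient from
    # the extreme points of an oscillatory slug-test record.
    import numpy as np

    # hypothetical (time [s], displacement [m]) extreme points
    extremes = np.array([(2.1, 0.42), (4.8, -0.26), (7.5, 0.16), (10.2, -0.10)])

    t, h = extremes[:, 0], extremes[:, 1]
    half_periods = np.diff(t)                 # successive extremes are half a period apart
    omega = np.pi / half_periods.mean()       # damped angular frequency [rad/s]
    log_decrements = np.log(np.abs(h[:-1] / h[1:]))
    gamma = (log_decrements / half_periods).mean()  # damping coefficient [1/s]

    print(f"omega = {omega:.3f} rad/s, gamma = {gamma:.3f} 1/s")
    ```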

  4. Analytical one parameter method for PID motion controller settings

    NARCIS (Netherlands)

    van Dijk, Johannes; Aarts, Ronald G.K.M.

    2012-01-01

    In this paper, analytical expressions for the settings of PID controllers for electromechanical motion systems are presented. It will be shown that, by an adequate frequency-domain-oriented parametrization, the parameters of a PID controller depend analytically on one variable only, the cross-over
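
    The paper's own expressions are not included in this truncated record. As a hedged stand-in, the sketch below parametrizes all three PID gains by the single cross-over frequency wc for a dominantly inertial plant P(s) = 1/(m s^2), using a generic symmetric-lead rule of thumb; the spread factor alpha and the gain rule are illustrative assumptions, not the authors' formulas.

    ```python
    # One-knob PID tuning sketch for a motion system modeled as a pure
    # inertia P(s) = 1/(m s^2). Pure PID, without the usual high-frequency
    # low-pass, is assumed.
    import math

    def pid_gains(m: float, wc: float, alpha: float = 3.0):
        """Return (kp, ki, kd) for C(s) = kp + ki/s + kd*s.

        m     -- moving mass [kg] (assumed plant knowledge)
        wc    -- desired cross-over frequency [rad/s], the single tuning knob
        alpha -- spread between integrator and lead corner frequencies
        """
        tau_d = math.sqrt(alpha) / wc        # lead zero placed below wc
        tau_i = alpha * tau_d                # integrator corner well below wc
        kp = m * wc**2 / math.sqrt(alpha)    # rough choice so |C(jwc)P(jwc)| is near 1
        return kp, kp / tau_i, kp * tau_d

    kp, ki, kd = pid_gains(m=2.5, wc=2 * math.pi * 20)
    print(f"kp={kp:.1f}, ki={ki:.1f}, kd={kd:.2f}")
    ```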

  5. GenoSets: visual analytic methods for comparative genomics.

    Directory of Open Access Journals (Sweden)

    Aurora A Cain

    Many important questions in biology are, fundamentally, comparative, and this extends to our analysis of a growing number of sequenced genomes. Existing genomic analysis tools are often organized around literal views of genomes as linear strings. Even when information is highly condensed, these views grow cumbersome as larger numbers of genomes are added. Data aggregation and summarization methods from the field of visual analytics can provide abstracted comparative views, suitable for sifting large multi-genome datasets to identify critical similarities and differences. We introduce a software system for visual analysis of comparative genomics data. The system automates the process of data integration, and provides the analysis platform to identify and explore features of interest within these large datasets. GenoSets borrows techniques from business intelligence and visual analytics to provide a rich interface of interactive visualizations supported by a multi-dimensional data warehouse. In GenoSets, visual analytic approaches are used to enable querying based on orthology, functional assignment, and taxonomic or user-defined groupings of genomes. GenoSets links this information together with coordinated, interactive visualizations for both detailed and high-level categorical analysis of summarized data. GenoSets has been designed to simplify the exploration of multiple genome datasets and to facilitate reasoning about genomic comparisons. Case examples are included showing the use of this system in the analysis of 12 Brucella genomes. GenoSets software and the case study dataset are freely available at http://genosets.uncc.edu. We demonstrate that the integration of genomic data using a coordinated multiple view approach can simplify the exploration of large comparative genomic data sets, and facilitate reasoning about comparisons and features of interest.

  6. Irregular analytical errors in diagnostic testing - a novel concept.

    Science.gov (United States)

    Vogeser, Michael; Seger, Christoph

    2018-02-23

    In laboratory medicine, routine periodic analyses for internal and external quality control measurements interpreted by statistical methods are mandatory for batch clearance. Data analysis of these process-oriented measurements allows for insight into random analytical variation and systematic calibration bias over time. However, in such a setting, any individual sample is not under individual quality control; the quality control measurements act only at the batch level. Many effects and interferences associated with an individual diagnostic sample, whether quantitative or qualitative, can compromise any analyte. It is obvious that a quality-control-sample-based approach to quality assurance is not sensitive to such errors. To address the potential causes and nature of such analytical interference in individual samples more systematically, we suggest the introduction of a new term, the irregular (individual) analytical error. Practically, this term can be applied in any analytical assay that is traceable to a reference measurement system. For an individual sample, an irregular analytical error is defined as an inaccuracy (the deviation from a reference measurement procedure result) of a test result that is so large that it cannot be explained by the measurement uncertainty of the utilized routine assay operating within the accepted limitations of the associated process quality control measurements. The deviation can be defined as the linear combination of the process measurement uncertainty and the method bias for the reference measurement system. Such errors should be termed irregular analytical errors of the individual sample. The measurement result is compromised either by an irregular effect associated with the individual composition (matrix) of the sample or by an individual, single-sample-associated processing error in the analytical process. Currently, the availability of reference measurement procedures is still highly limited, but LC

  7. RUPTURES IN THE ANALYTIC SETTING AND DISTURBANCES IN THE TRANSFORMATIONAL FIELD OF DREAMS.

    Science.gov (United States)

    Brown, Lawrence J

    2015-10-01

    This paper explores some implications of Bleger's (1967, 2013) concept of the analytic situation, which he views as comprising the analytic setting and the analytic process. The author discusses Bleger's idea of the analytic setting as the depositary for projected painful aspects of either the analyst or the patient or both, affects that are then rendered as nonprocess. In contrast, the contents of the analytic process are subject to an incessant process of transformation (Green 2005). The author goes on to enumerate various components of the analytic setting: the nonhuman, the object-relational, and the analyst's "person" (including mental functioning). An extended clinical vignette is offered as an illustration. © 2015 The Psychoanalytic Quarterly, Inc.

  8. Setting analytical performance specifications based on outcome studies - is it possible?

    NARCIS (Netherlands)

    Horvath, Andrea Rita; Bossuyt, Patrick M. M.; Sandberg, Sverre; John, Andrew St; Monaghan, Phillip J.; Verhagen-Kamerbeek, Wilma D. J.; Lennartz, Lieselotte; Cobbaert, Christa M.; Ebert, Christoph; Lord, Sarah J.

    2015-01-01

    The 1st Strategic Conference of the European Federation of Clinical Chemistry and Laboratory Medicine proposed a simplified hierarchy for setting analytical performance specifications (APS). The top two levels of the 1999 Stockholm hierarchy, i.e., evaluation of the effect of analytical performance

  9. Quality management and accreditation in a mixed research and clinical hair testing analytical laboratory setting-a review.

    Science.gov (United States)

    Fulga, Netta

    2013-06-01

    Quality management and accreditation in the analytical laboratory setting are developing rapidly and becoming the standard worldwide. Quality management refers to all the activities used by organizations to ensure product or service consistency. Accreditation is a formal recognition by an authoritative regulatory body that a laboratory is competent to perform examinations and report results. The Motherisk Drug Testing Laboratory is licensed to operate at the Hospital for Sick Children in Toronto, Ontario. The laboratory performs toxicology tests of hair and meconium samples for research and clinical purposes. Most of the samples are involved in chain-of-custody cases. Establishing a quality management system and achieving accreditation have been mandatory by legislation for all Ontario clinical laboratories since 2003. The Ontario Laboratory Accreditation program is based on International Organization for Standardization (ISO) 15189, Medical laboratories - Particular requirements for quality and competence, an international standard that has been adopted as a national standard in Canada. The implementation of a quality management system involves management commitment, planning and staff education, documentation of the system, validation of processes, and assessment against the requirements. The maintenance of a quality management system requires control and monitoring of the entire laboratory path of workflow. The process of transforming a research/clinical laboratory into an accredited laboratory, and the benefits of maintaining an effective quality management system, are presented in this article.

  10. Analytical validation of an ultra low-cost mobile phone microplate reader for infectious disease testing.

    Science.gov (United States)

    Wang, Li-Ju; Naudé, Nicole; Demissie, Misganaw; Crivaro, Anne; Kamoun, Malek; Wang, Ping; Li, Lei

    2018-07-01

    Most mobile health (mHealth) diagnostic devices for laboratory tests analyze only one sample at a time, which is not suitable for large-volume serology testing, especially in low-resource settings with a shortage of health professionals. In this study, we developed an ultra-low-cost, clinically accurate mobile phone microplate reader (mReader) and clinically validated this optical device for 12 infectious disease tests. The mReader optically reads 96 samples on a microplate at one time. 771 de-identified patient samples were tested for 12 serology assays for bacterial/viral infections. The mReader and the clinical instrument blindly read and analyzed all tests in parallel. The analytical accuracy and the diagnostic performance of the mReader were evaluated across the clinical reportable categories by comparison with clinical laboratory testing results. The mReader exhibited 97.59-99.90% analytical accuracy. We envision that the mReader can benefit underserved areas/populations and low-resource settings in rural clinics/hospitals at a low cost (~$50 USD) with clinical-level analytical quality. It has the potential to improve health access, speed up healthcare delivery, and reduce health disparities and education disparities by providing access to a low-cost spectrophotometer. Copyright © 2018 Elsevier B.V. All rights reserved.

  11. Track 4: basic nuclear science variance reduction for Monte Carlo criticality simulations. 2. Assessment of MCNP Statistical Analysis of k_eff Eigenvalue Convergence with an Analytical Criticality Verification Test Set

    International Nuclear Information System (INIS)

    Sood, Avnet; Forster, R. Arthur; Parsons, D. Kent

    2001-01-01

    Monte Carlo simulations of nuclear criticality eigenvalue problems are often performed by general purpose radiation transport codes such as MCNP. MCNP performs detailed statistical analysis of the criticality calculation and provides feedback to the user with warning messages, tables, and graphs. The purpose of the analysis is to provide the user with sufficient information to assess spatial convergence of the eigenfunction and thus the validity of the criticality calculation. As a test of this statistical analysis package in MCNP, analytic criticality verification benchmark problems have been used for the first time to assess the performance of the criticality convergence tests in MCNP. The MCNP statistical analysis capability has been recently assessed using the 75-problem multigroup criticality verification analytic test set. MCNP was verified with these problems at the 10^-4 to 10^-5 statistical error level using 40,000 histories per cycle and 2,000 active cycles. In all cases, the final boxed combined k_eff answer was given with the standard deviation and three confidence intervals that contained the analytic k_eff. To test the effectiveness of the statistical analysis checks in identifying poor eigenfunction convergence, ten problems from the test set were deliberately run incorrectly using 1,000 histories per cycle, 200 active cycles, and 10 inactive cycles. Six problems with large dominance ratios were chosen from the test set because they do not achieve the normal spatial mode in the beginning of the calculation. To further stress the convergence tests, these problems were also started with an initial fission source point 1 cm from the boundary, thus increasing the likelihood of a poorly converged initial fission source distribution. The final combined k_eff confidence intervals for these deliberately ill-posed problems did not include the analytic k_eff value. In no case did a bad confidence interval go undetected. Warning messages were given signaling that
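
    For flavor, the Python sketch below reproduces the kind of statistic being checked: a k_eff estimate with the standard deviation of the mean and the three confidence intervals MCNP reports. The cycle estimates here are synthetic; MCNP's actual combined estimator merges the collision, absorption, and track-length estimators, which this sketch does not attempt.

    ```python
    # Combined k_eff summary from active-cycle estimates (synthetic data).
    import numpy as np

    rng = np.random.default_rng(42)
    keff_cycles = rng.normal(1.0000, 0.0015, size=2000)  # 2000 active cycles

    mean = keff_cycles.mean()
    sdom = keff_cycles.std(ddof=1) / np.sqrt(len(keff_cycles))  # std. dev. of the mean

    for label, z in (("68%", 1.0), ("95%", 1.96), ("99%", 2.576)):
        print(f"{label} CI: {mean - z*sdom:.5f} .. {mean + z*sdom:.5f}")
    ```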

  12. saltPAD: A New Analytical Tool for Monitoring Salt Iodization in Low Resource Settings

    Directory of Open Access Journals (Sweden)

    Nicholas M. Myers

    2016-03-01

    We created a paper test card that measures a common iodizing agent, iodate, in salt. To test the analytical metrics, usability, and robustness of the paper test card when it is used in low-resource settings, the South African Medical Research Council (SAMRC) and GroundWork performed independent validation studies of the device. The accuracy and precision metrics from both studies were comparable. In the SAMRC study, more than 90% of the test results (n = 1704) were correctly classified as corresponding to adequately or inadequately iodized salt. The cards are suitable for market and household surveys to determine whether salt is adequately iodized. Further development of the cards will improve their utility for monitoring salt iodization during production.

  13. Some applications of fuzzy sets and the analytical hierarchy process to decision making

    OpenAIRE

    Castro, Alberto Rosas

    1984-01-01

    Approved for public release; distribution unlimited. This thesis examines the use of fuzzy set theory and the analytic hierarchy process in decision making. It begins by reviewing the insights of psychologists, social scientists, and computer scientists into the decision-making process. The Operations Research-Systems Analysis approach is discussed, followed by a presentation of the basics of fuzzy set theory and the analytic hierarchy process. Two applications of these meth...

  14. Setting Learning Analytics in Context: Overcoming the Barriers to Large-Scale Adoption

    Science.gov (United States)

    Ferguson, Rebecca; Macfadyen, Leah P.; Clow, Doug; Tynan, Belinda; Alexander, Shirley; Dawson, Shane

    2014-01-01

    A core goal for most learning analytics projects is to move from small-scale research towards broader institutional implementation, but this introduces a new set of challenges because institutions are stable systems, resistant to change. To avoid failure and maximize success, implementation of learning analytics at scale requires explicit and…

  15. Dispersant testing : a study on analytical test procedures

    International Nuclear Information System (INIS)

    Fingas, M.F.; Fieldhouse, B.; Wang, Z.; Environment Canada, Ottawa, ON

    2004-01-01

    Crude oil is a complex mixture of hydrocarbons, ranging from small, volatile compounds to very large, non-volatile ones. Analysis of the dispersed oil is crucial. This paper describes Environment Canada's ongoing studies on various traits of dispersants. In particular, it describes small studies related to dispersant effectiveness and methods to improve analytical procedures. The study also re-evaluated the analytical procedure for the Swirling Flask Test, which is now part of the ASTM standard procedure. There are new and improved methods for analyzing oil-in-water using gas chromatography (GC). The methods could be further enhanced by integrating the entire chromatogram rather than just the peaks; this would reduce the maximum variation from 5 per cent to about 2 per cent. For oil-dispersant studies, the surfactant-dispersed oil hydrocarbons consist of two parts: GC-resolved hydrocarbons and GC-unresolved hydrocarbons. The study also examined a second feature of the Swirling Flask Test, the side spout, which was tested and compared with a new vessel having a septum port instead of a side spout. This decreased the variability as well as the energy and mixing in the vessel. Rather than being a variation of the Swirling Flask Test, it was suggested that the spoutless vessel might be considered a completely separate test. 7 refs., 2 tabs., 4 figs.
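
    The reported gain comes from integrating the whole chromatogram instead of summing resolved peaks only. The Python sketch below illustrates the two quantities being compared on a synthetic GC trace; the signal shape, peak positions, and peak-detection threshold are all made up.

    ```python
    # Whole-chromatogram area vs. resolved-peak area on a synthetic trace.
    import numpy as np

    t = np.linspace(0.0, 30.0, 3000)                 # retention time [min]
    dt = t[1] - t[0]
    rng = np.random.default_rng(7)
    hump = 0.4 * np.exp(-((t - 15.0) / 8.0) ** 2)    # unresolved hydrocarbon "hump"
    peaks = sum(np.exp(-((t - c) / 0.05) ** 2) for c in (5.0, 9.0, 14.0, 21.0))
    signal = hump + peaks + 0.01 * rng.standard_normal(t.size)

    total_area = signal.sum() * dt                                # whole chromatogram
    peaks_only = np.where(peaks > 0.05, signal, 0.0).sum() * dt   # resolved peaks only
    print(f"whole-chromatogram area = {total_area:.2f}, peak-only area = {peaks_only:.2f}")
    ```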

  16. Cost effectiveness of ovarian reserve testing in in vitro fertilization : a Markov decision-analytic model

    NARCIS (Netherlands)

    Moolenaar, Lobke M.; Broekmans, Frank J. M.; van Disseldorp, Jeroen; Fauser, Bart C. J. M.; Eijkemans, Marinus J. C.; Hompes, Peter G. A.; van der Veen, Fulco; Mol, Ben Willem J.

    2011-01-01

    Objective: To compare the cost effectiveness of ovarian reserve testing in in vitro fertilization (IVF). Design: A Markov decision model based on data from the literature and original patient data. Setting: Decision analytic framework. Patient(s): Computer-simulated cohort of subfertile women aged

  17. Performance specifications for the extra-analytical phases of laboratory testing: Why and how.

    Science.gov (United States)

    Plebani, Mario

    2017-07-01

    An important priority in the current healthcare scenario should be to address errors in laboratory testing, which account for a significant proportion of diagnostic errors. Efforts made in laboratory medicine to enhance the diagnostic process have been directed toward improving technology, achieving greater volumes and more accurate laboratory tests, but data collected in the last few years highlight the need to re-evaluate the total testing process (TTP) as the unique framework for improving quality and patient safety. Valuable quality indicators (QIs) and extra-analytical performance specifications are required for guidance in improving all TTP steps. Yet no data are available in the literature on extra-analytical performance specifications based on outcomes, nor is it possible to set any specification using calculations involving biological variability. The collection of data representing the state of the art based on quality indicators is, therefore, underway. The adoption of a harmonized set of QIs, a common data collection, and a standardised reporting method is mandatory, as it will not only allow the accreditation of clinical laboratories according to the International Standard, but also provide guidance for promoting improvement processes and guaranteeing quality care to patients. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  18. Analytical challenges in sports drug testing.

    Science.gov (United States)

    Thevis, Mario; Krug, Oliver; Geyer, Hans; Walpurgis, Katja; Baume, Norbert; Thomas, Andreas

    2018-03-01

    Analytical chemistry represents a central aspect of doping controls. Routine sports drug testing approaches are primarily designed to address the question whether a prohibited substance is present in a doping control sample and whether prohibited methods (for example, blood transfusion or sample manipulation) have been conducted by an athlete. As some athletes have availed themselves of the substantial breadth of research and development in the pharmaceutical arena, proactive and preventive measures are required, such as the early implementation of new drug candidates and corresponding metabolites into routine doping control assays, even though these drug candidates are to date not approved for human use. Beyond this, analytical data are also cornerstones of investigations into atypical or adverse analytical findings, where the overall picture provides ample reason for follow-up studies. Such studies have been of the most diverse nature, and tailored approaches have been required to probe hypotheses and scenarios reported by the involved parties concerning the plausibility and consistency of statements and (analytical) facts. In order to outline the variety of challenges that doping control laboratories are facing besides providing optimal detection capabilities and analytical comprehensiveness, selected case vignettes involving the follow-up of unconventional adverse analytical findings, urine sample manipulation, drug/food contamination issues, and unexpected biotransformation reactions are thematized.

  19. Analytical support for the preparation of bundle test QUENCH-10 on air ingress

    International Nuclear Information System (INIS)

    Birchley, J.; Haste, T.; Homann, C.; Hering, W.

    2005-07-01

    Bundle test QUENCH-10 is dedicated to the study of air ingress with subsequent water quench during a postulated accident in a spent fuel storage tank. It was proposed by AEKI, Budapest, Hungary, and was performed on 21 July 2004 in the QUENCH facility at Forschungszentrum Karlsruhe. Preparation of the test was based on joint analytical work at Forschungszentrum Karlsruhe and the Paul Scherrer Institut, Villigen, Switzerland, mainly with the severe accident codes SCDAP/RELAP5 and MELCOR, to derive the protocol for the essential test phases, namely the pre-oxidation, air ingress, and quench phases. For issues that could not be tackled by this computational work, suggestions for the test conduct were made and applied during the test. Improvements to the experimental set-up and the test conduct were also suggested and largely applied. In SCDAP/RELAP5, an error was found: for thick oxide scales, the output value of the oxide scale is appreciably underestimated. For the aims of the test preparation, its consequences could be taken into account. Together with the related computational and other analytical support by the engaged institutions, the test is co-financed as test QUENCH-L1 by the European Community under the Euratom Fifth Framework Programme on Nuclear Fission Safety 1998-2002 (LACOMERA Project, contract No. FIR1-CT2002-40158). (orig.)

  20. A new MCNP™ test set

    International Nuclear Information System (INIS)

    Brockhoff, R.C.; Hendricks, J.S.

    1994-09-01

    The MCNP test set is used to test the MCNP code after installation on various computer platforms. For MCNP4 and MCNP4A, this test set included 25 test problems designed to test as many features of the MCNP code as possible. A new and better test set has been devised to increase coverage of the code from 85% to 97% with 28 problems. The new test set is as fast as, and shorter than, the MCNP4A test set. The authors describe the methodology for devising the new test set, the features that were not covered in the MCNP4A test set, and the changes in the MCNP4A test set that have been made for MCNP4B and its developmental versions. Finally, new bugs uncovered by the new test set and a compilation of all known MCNP4A bugs are presented.

  1. Emergency analytical testing: things to consider

    CSIR Research Space (South Africa)

    Pretorius, Cecilia J

    2017-07-01

    Circumstances may dictate that samples from mining operations are analysed for unknown compounds that are potentially harmful to humans. These circumstances may be out of the ordinary, unique or isolated incidents. Emergency analytical testing may...

  2. A pilot analytic study of a research-level, lower-cost human papillomavirus 16, 18, and 45 test.

    Science.gov (United States)

    Yang, Hannah P; Walmer, David K; Merisier, Delson; Gage, Julia C; Bell, Laura; Rangwala, Sameera; Shrestha, Niwashin; Kobayashi, Lori; Eder, Paul S; Castle, Philip E

    2011-09-01

    The analytic performance of a low-cost, research-stage DNA test for the most carcinogenic human papillomavirus (HPV) genotypes (HPV16, HPV18, and HPV45) in aggregate was evaluated among carcinogenic HPV-positive women; such a test might be used to decide who needs immediate colposcopy in low-resource settings (a "triage test"). We found that the HPV16/18/45 test agreed well with two DNA tests, a GP5+/6+ genotyping assay (Kappa = 0.77) and a quantitative PCR assay at a cutpoint of 5000 viral copies (Kappa = 0.87). DNA sequencing on a subset of 16 HPV16/18/45-positive and 16 HPV16/18/45-negative specimens verified the analytic specificity of the research test. It is concluded that the HPV16/18/45 assay is a promising triage test with a minimum detection of approximately 5000 viral copies, the clinically relevant threshold. Published by Elsevier B.V.
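
    The agreement figures quoted above are Cohen's kappa values. The Python sketch below shows the computation from a 2x2 table of paired positive/negative calls; the counts are hypothetical, chosen to land near the quoted 0.87.

    ```python
    # Cohen's kappa from a 2x2 agreement table of two assays.
    import numpy as np

    def cohens_kappa(table: np.ndarray) -> float:
        """table[i, j] = number of samples called i by test A and j by test B."""
        n = table.sum()
        p_observed = np.trace(table) / n
        p_expected = (table.sum(axis=1) @ table.sum(axis=0)) / n**2
        return (p_observed - p_expected) / (1 - p_expected)

    # hypothetical counts: rows = research test (+/-), cols = reference test (+/-)
    table = np.array([[46, 5],
                      [4, 120]])
    print(f"kappa = {cohens_kappa(table):.2f}")   # ~0.87 for these counts
    ```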

  3. Analytical study on model tests of soil-structure interaction

    International Nuclear Information System (INIS)

    Odajima, M.; Suzuki, S.; Akino, K.

    1987-01-01

    Since nuclear power plant (NPP) structures are stiff, heavy, and partly embedded, the behavior of these structures during an earthquake depends on the vibrational characteristics not only of the structure but also of the soil. Accordingly, seismic response analyses considering the effects of soil-structure interaction (SSI) are extremely important for the seismic design of NPP structures. Many studies have been conducted on analytical techniques concerning SSI, and various analytical models and approaches have been proposed. Based on these studies, SSI analytical codes (computer programs) for NPP structures have been improved at JINS (Japan Institute of Nuclear Safety), one of the departments of NUPEC (Nuclear Power Engineering Test Center) in Japan. These codes are a soil-spring lumped-mass code (SANLUM), a finite element code (SANSSI), and a thin-layered element code (SANSOL). In proceeding with the improvement of the analytical codes, in-situ large-scale forced-vibration SSI tests were performed using models simulating light water reactor buildings, and simulation analyses were performed to verify the codes. This paper presents an analytical study demonstrating the usefulness of the codes.

  4. The Generalized Higher Criticism for Testing SNP-Set Effects in Genetic Association Studies

    Science.gov (United States)

    Barnett, Ian; Mukherjee, Rajarshi; Lin, Xihong

    2017-01-01

    It is of substantial interest to study the effects of genes, genetic pathways, and networks on the risk of complex diseases. These genetic constructs each contain multiple SNPs, which are often correlated and function jointly, and might be large in number. However, only a sparse subset of SNPs in a genetic construct is generally associated with the disease of interest. In this article, we propose the generalized higher criticism (GHC) to test for the association between an SNP set and a disease outcome. The higher criticism is a test traditionally used in high-dimensional signal detection settings when marginal test statistics are independent and the number of parameters is very large. However, these assumptions do not always hold in genetic association studies, due to linkage disequilibrium among SNPs and the finite number of SNPs in an SNP set in each genetic construct. The proposed GHC overcomes the limitations of the higher criticism by allowing for arbitrary correlation structures among the SNPs in an SNP-set, while performing accurate analytic p-value calculations for any finite number of SNPs in the SNP-set. We obtain the detection boundary of the GHC test. We compared empirically using simulations the power of the GHC method with existing SNP-set tests over a range of genetic regions with varied correlation structures and signal sparsity. We apply the proposed methods to analyze the CGEM breast cancer genome-wide association study. Supplementary materials for this article are available online. PMID:28736464
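
    The classical higher criticism statistic that the GHC generalizes can be computed from ordered p-values in a few lines, as in the Python sketch below. The GHC's key modification, replacing the null variance with one that accounts for correlation among the SNP test statistics, is not reproduced here; the p-values are simulated.

    ```python
    # Classical higher criticism (HC) statistic from ordered p-values.
    import numpy as np

    def higher_criticism(pvals: np.ndarray) -> float:
        p = np.sort(pvals)
        n = len(p)
        i = np.arange(1, n + 1)
        hc = np.sqrt(n) * (i / n - p) / np.sqrt(p * (1 - p))
        return hc[: n // 2].max()   # conventionally maximized over the smaller p-values

    rng = np.random.default_rng(0)
    null_p = rng.uniform(size=100)                                   # no signal
    sparse_p = np.concatenate([rng.uniform(size=97), [1e-5, 3e-5, 1e-4]])  # sparse signal
    print(f"HC under null  : {higher_criticism(null_p):.2f}")
    print(f"HC with signal : {higher_criticism(sparse_p):.2f}")
    ```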

  5. Analytic tests and their relation to jet fuel thermal stability

    Energy Technology Data Exchange (ETDEWEB)

    Heneghan, S.P.; Kauffman, R.E. [Univ. of Dayton Research Institute, OH (United States)

    1995-05-01

    The evaluation of jet fuel thermal stability (TS) by simple analytic procedures has long been a goal of fuels chemists. The reason is obvious: if the analytic chemist can determine which types of material cause his test to respond, the refiners will know which materials to remove to improve stability. Complicating this quest is the lack of an acceptable quantitative TS test with which to compare any analytic procedures. To circumvent this problem, we recently compiled the results of TS tests for 12 fuels using six separate test procedures. The results covering a range of flow and temperature conditions show that TS is not as dependent on test conditions as previously thought. Also, comparing the results from these tests with several analytic procedures shows that either a measure of the number of phenols or the total sulfur present in jet fuels is strongly indicative of the TS. The phenols have been measured using a cyclic voltammetry technique and the polar material by gas chromatography (atomic emission detection) following a solid phase extraction on silica gel. The polar material has been identified as mainly phenols (by mass spectrometry identification). Measures of the total acid number or peroxide concentration have little correlation with TS.

  6. Analytical evaluation on loss of off-site electric power simulation of the High Temperature Engineering Test Reactor

    International Nuclear Information System (INIS)

    Takeda, Takeshi; Nakagawa, Shigeaki; Tachibana, Yukio; Takada, Eiji; Kunitomi, Kazuhiko

    2000-03-01

    A rise-to-power test of the High Temperature Engineering Test Reactor (HTTR) started on September 28, 1999, for establishing and upgrading the technological basis for the high-temperature gas-cooled reactor (HTGR). A loss of off-site electric power test of the HTTR from normal operation at 15 and 30 MW thermal power will be carried out during the rise-to-power test. Analytical evaluations of the transient behavior of the reactor and plant during the loss of off-site electric power were conducted. These evaluations are proposed as benchmark problems for the IAEA coordinated research program on 'Evaluation of HTGR Performance'. This report describes the event scenario of the transient during the loss of off-site electric power, the outline of the major components and systems, the detailed thermal and nuclear data set for these problems, and pre-estimation results of the benchmark problems obtained with 'ACCORD', an analytical code for in-core and plant dynamics of the HTGR. (author)

  7. Social Set Visualizer

    DEFF Research Database (Denmark)

    Flesch, Benjamin; Hussain, Abid; Vatrapu, Ravi

    2015-01-01

    This paper presents a state-of-the-art visual analytics dashboard, Social Set Visualizer (SoSeVi), of approximately 90 million Facebook actions from 11 different companies that have been mentioned in the traditional media in relation to garment factory accidents in Bangladesh. The dashboard draws on cutting-edge open source visual analytics libraries from D3.js and the creation of new visualizations (actor mobility across time, conversational comets, etc.). Evaluation of the dashboard, consisting of technical testing, usability testing, and domain-specific testing with CSR students, yielded positive results. The enterprise...

  8. A pilot analytic study of a research-level, lower-cost human papillomavirus 16, 18, and 45 test

    OpenAIRE

    Yang, Hannah P.; Walmer, David K.; Merisier, Delson; Gage, Julia C.; Bell, Laura; Rangwala, Sameera; Shrestha, Niwashin; Kobayashi, Lori; Eder, Paul S.; Castle, Philip E.

    2011-01-01

    The analytic performance of a low-cost, research-stage DNA test for the most carcinogenic human papillomavirus (HPV) genotypes (HPV16, HPV18, and HPV45) in aggregate was evaluated among carcinogenic HPV-positive women, which might be used to decide who needs immediate colposcopy in low-resource settings (“triage test”). We found that HPV16/18/45 test agreed well with two DNA tests, a GP5+/6+ genotyping assay (Kappa = 0.77) and a quantitative PCR assay (at a cutpoint of 5000 viral copies) (Kap...

  9. Geodesics of electrically and magnetically charged test particles in the Reissner-Nordström space-time: Analytical solutions

    International Nuclear Information System (INIS)

    Grunau, Saskia; Kagramanova, Valeria

    2011-01-01

    We present the full set of analytical solutions of the geodesic equations of charged test particles in the Reissner-Nordström space-time in terms of the Weierstrass ℘, σ, and ζ elliptic functions. Based on the study of the polynomials in the θ and r equations, we characterize the motion of test particles and discuss their properties. The motion of charged test particles in the Reissner-Nordström space-time is compared with the motion of neutral test particles in the field of a gravitomagnetic monopole. Electrically or magnetically charged particles in the Reissner-Nordström space-time with magnetic or electric charges, respectively, move on cones, similar to neutral test particles in the Taub-NUT space-times.
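
    The abstract names the Weierstrass functions without formulas. The LaTeX fragment below sketches the standard reduction that makes ℘ appear in such solutions; the coefficients are generic placeholders, not the paper's specific polynomials.

    ```latex
    % Sketch: a radial equation with a cubic right-hand side,
    %   (dr/dgamma)^2 = a_3 r^3 + a_2 r^2 + a_1 r + a_0,
    % is mapped by the substitution r = (4y - a_2/3)/a_3 to the Weierstrass
    % normal form, solved by the elliptic function wp:
    \[
      \left(\frac{dy}{d\gamma}\right)^{2} = 4y^{3} - g_{2}\,y - g_{3},
      \qquad
      y(\gamma) = \wp(\gamma - \gamma_{0};\, g_{2}, g_{3}),
    \]
    % where g_2 and g_3 are built from the constants of motion (energy,
    % angular momentum, particle and black-hole charges) and gamma_0 is
    % fixed by the initial values.
    ```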

  10. Towards a Set Theoretical Approach to Big Data Analytics

    DEFF Research Database (Denmark)

    Mukkamala, Raghava Rao; Hussain, Abid; Vatrapu, Ravi

    2014-01-01

    Formal methods, models and tools for social big data analytics are largely limited to graph theoretical approaches such as social network analysis (SNA) informed by relational sociology. There are no other unified modeling approaches to social big data that integrate the conceptual, formal and software realms. In this paper, we first present and discuss a theory and conceptual model of social data. Second, we outline a formal model based on set theory and discuss the semantics of the formal model with a real-world social data example from Facebook. Third, we briefly present and discuss the application of this technique to the data analysis of big social data collected from the Facebook page of the fast fashion company, H&M.
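
    As a toy illustration of the set-theoretic view (not the paper's formal model), plain set algebra already answers simple analytics questions once each engagement window of a page is modeled as the set of acting user IDs. The IDs and window names below are hypothetical.

    ```python
    # Set algebra over hypothetical per-week sets of acting Facebook users.
    week_1 = {"u01", "u02", "u03", "u07"}
    week_2 = {"u02", "u03", "u05", "u09"}

    returning = week_1 & week_2   # actors active in both windows
    churned   = week_1 - week_2   # active in week 1 but not in week 2
    newcomers = week_2 - week_1   # first seen in week 2
    reach     = week_1 | week_2   # anyone active in either window

    print(f"returning={sorted(returning)}, churned={sorted(churned)}")
    print(f"newcomers={sorted(newcomers)}, total reach={len(reach)}")
    ```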

  11. 105-KE Basin isolation barrier leak rate test analytical development. Revision 1

    International Nuclear Information System (INIS)

    Irwin, J.J.

    1995-01-01

    This document provides an analytical development in support of the proposed leak rate test of the 105-KE Basin. The analytical basis upon which the K-Basin leak test results will be used to determine the basin leakage rates is developed in this report. The leakage of the K-Basin isolation barriers under postulated accident conditions will be determined from the test results. There are two fundamental flow regimes that may exist in the postulated K-Basin leakage: viscous laminar flow and turbulent flow. An analytical development is presented for each flow regime. The basic geometry and nomenclature of the postulated leak paths are described.

  12. Microlocal analysis of quantum fields on curved space-times: Analytic wave front sets and Reeh-Schlieder theorems

    International Nuclear Information System (INIS)

    Strohmaier, Alexander; Verch, Rainer; Wollenberg, Manfred

    2002-01-01

    We show in this article that the Reeh-Schlieder property holds for states of quantum fields on real analytic curved space-times if they satisfy an analytic microlocal spectrum condition. This result holds in the setting of general quantum field theory, i.e., without assuming the quantum field to obey a specific equation of motion. Moreover, quasifree states of the Klein-Gordon field are further investigated in the present work and the (analytic) microlocal spectrum condition is shown to be equivalent to simpler conditions. We also prove that any quasifree ground or KMS state of the Klein-Gordon field on a stationary real analytic space-time fulfills the analytic microlocal spectrum condition

  13. Test set for initial value problem solvers

    NARCIS (Netherlands)

    W.M. Lioen (Walter); J.J.B. de Swart (Jacques)

    1998-01-01

    The CWI test set for IVP solvers presents a collection of Initial Value Problems to test solvers for implicit differential equations. This test set can both decrease the effort for the code developer to test his software in a reliable way, and cross the bridge between the application
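
    The Robertson chemical kinetics problem (ROBER) is a classic stiff problem of the kind collected in such test sets. The Python sketch below solves it with SciPy's implicit BDF method; the tolerances and integration horizon are illustrative choices, not the test set's prescribed values.

    ```python
    # Robertson chemical kinetics: a stiff IVP solved with an implicit method.
    from scipy.integrate import solve_ivp

    def rober(t, y):
        y1, y2, y3 = y
        return [-0.04 * y1 + 1e4 * y2 * y3,
                0.04 * y1 - 1e4 * y2 * y3 - 3e7 * y2**2,
                3e7 * y2**2]

    sol = solve_ivp(rober, (0.0, 1e5), [1.0, 0.0, 0.0],
                    method="BDF", rtol=1e-8, atol=1e-10)
    print("y(1e5) =", sol.y[:, -1])   # the three components sum to 1 (mass conservation)
    ```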

  14. Gene set analysis using variance component tests.

    Science.gov (United States)

    Huang, Yen-Tsung; Lin, Xihong

    2013-06-28

    Gene set analyses have become increasingly important in genomic research, as many complex diseases are contributed jointly by alterations of numerous genes. Genes often coordinate together as a functional repertoire, e.g., a biological pathway/network, and are highly correlated. However, most of the existing gene set analysis methods do not fully account for the correlation among the genes. Here we propose to tackle this important feature of a gene set to improve statistical power in gene set analyses. We propose to model the effects of an independent variable, e.g., exposure/biological status (yes/no), on multiple gene expression values in a gene set using a multivariate linear regression model, where the correlation among the genes is explicitly modeled using a working covariance matrix. We develop TEGS (Test for the Effect of a Gene Set), a variance component test for the gene set effects by assuming a common distribution for regression coefficients in multivariate linear regression models, and calculate the p-values using permutation and a scaled chi-square approximation. We show using simulations that type I error is protected under different choices of working covariance matrices and power is improved as the working covariance approaches the true covariance. The global test is a special case of TEGS when correlation among genes in a gene set is ignored. Using both simulation data and a published diabetes dataset, we show that our test outperforms the commonly used approaches, the global test and gene set enrichment analysis (GSEA). We develop a gene set analysis method (TEGS) under the multivariate regression framework, which directly models the interdependence of the expression values in a gene set using a working covariance. TEGS outperforms two widely used methods, GSEA and global test in both simulation and a diabetes microarray data.
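
    A hedged sketch of a permutation-based gene set test in the spirit of TEGS follows. The published statistic is a variance component score test with a working covariance and a scaled chi-square approximation; here a simpler sum of squared per-gene t-like statistics stands in for it, with significance obtained by permuting the exposure labels. All data are simulated.

    ```python
    # Permutation-based gene set association test (simplified stand-in for TEGS).
    import numpy as np

    def set_score(expr: np.ndarray, status: np.ndarray) -> float:
        """expr: (samples, genes); status: 0/1 exposure per sample."""
        g0, g1 = expr[status == 0], expr[status == 1]
        diff = g1.mean(axis=0) - g0.mean(axis=0)
        pooled_se = np.sqrt(g1.var(axis=0, ddof=1) / len(g1) +
                            g0.var(axis=0, ddof=1) / len(g0))
        return float(np.sum((diff / pooled_se) ** 2))

    rng = np.random.default_rng(1)
    n, p = 40, 25                          # 40 samples, 25 genes in the set
    status = np.repeat([0, 1], n // 2)
    expr = rng.normal(size=(n, p))
    expr[status == 1, :5] += 0.8           # modest signal in 5 of the 25 genes

    observed = set_score(expr, status)
    perm = [set_score(expr, rng.permutation(status)) for _ in range(999)]
    pval = (1 + sum(s >= observed for s in perm)) / (1 + len(perm))
    print(f"observed score = {observed:.1f}, permutation p = {pval:.3f}")
    ```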

  15. Review of Pre-Analytical Errors in Oral Glucose Tolerance Testing in a Tertiary Care Hospital.

    Science.gov (United States)

    Nanda, Rachita; Patel, Suprava; Sahoo, Sibashish; Mohapatra, Eli

    2018-03-13

    The pre-pre-analytical and pre-analytical phases account for a major share of the errors in a laboratory. To identify pre-pre-analytical errors, this study considered a very common procedure, the oral glucose tolerance test. Quality indicators provide evidence of quality, support accountability, and help in the decision making of laboratory personnel. The aim of this research is to evaluate the pre-analytical performance of the oral glucose tolerance test procedure. An observational study was conducted over a period of three months in the phlebotomy and accessioning unit of our laboratory, using a questionnaire that examined the pre-pre-analytical errors through a scoring system. The pre-analytical phase was analyzed for each sample collected as per seven quality indicators. About 25% of the population gave a wrong answer to the question that tested their knowledge of patient preparation. QI-1, the appropriateness of the test result, had the most errors. Although QI-5, for sample collection, had a low error rate, it is a very important indicator, as any wrongly collected sample can alter the test result. Evaluating the pre-analytical and pre-pre-analytical phases is essential and must be conducted routinely on a yearly basis to identify errors, take corrective action, and facilitate their gradual introduction into routine practice.

  16. Preliminary results of testing bioassay analytical performance standards

    International Nuclear Information System (INIS)

    Fisher, D.R.; Robinson, A.V.; Hadley, R.T.

    1983-08-01

    The analytical performance of both in vivo and in vitro bioassay laboratories is being studied to determine the capability of these laboratories to meet the minimum criteria for accuracy and precision specified in the draft ANSI Standard N13.30, Performance Criteria for Radiobioassay. This paper presents preliminary results of the first round of testing.

  17. Continuous Analytical Performances Monitoring at the On-Site Laboratory through Proficiency, Inter-Laboratory Testing and Inter-Comparison Analytical Methods

    International Nuclear Information System (INIS)

    Duhamel, G.; Decaillon, J.-G.; Dashdondog, S.; Kim, C.-K.; Toervenyi, A.; Hara, S.; Kato, S.; Kawaguchi, T.; Matsuzawa, K.

    2015-01-01

    Since 2008, as one measure to strengthen its quality management system, the On-Site Laboratory (OSL) for nuclear safeguards at the Rokkasho Reprocessing Plant has increased its participation in domestic and international proficiency and inter-laboratory testing, both to determine analytical method accuracy, precision, and robustness and to support method development and improvement. This paper provides a description of the testing and its scheduling. It presents the way the testing was optimized to cover most of the analytical methods at the OSL. The paper presents the methodology used for the evaluation of the obtained results, based on analysis of variance (ANOVA). Results are discussed with respect to random, systematic, and long-term systematic error. (author)

  18. Analytical quality control of neutron activation analysis by interlaboratory comparison and proficiency test

    International Nuclear Information System (INIS)

    Kim, S. H.; Moon, J. H.; Jeong, Y. S.

    2002-01-01

    Two air filters (V-50, P-50) artificially loaded with urban dust were provided by the IAEA, and trace elements were determined non-destructively by instrumental neutron activation analysis for an inter-laboratory comparison and proficiency test. A standard reference material from the National Institute of Standards and Technology (Urban Particulate Matter, NIST SRM 1648) was used for internal analytical quality control. About 20 elements were determined in each loaded filter sample. Our analytical data were compared with statistical results obtained using neutron activation analysis, particle induced X-ray emission spectrometry, inductively coupled plasma mass spectroscopy, etc., which were collected from 49 laboratories in 40 countries. After statistical re-treatment of the reported values, the Z-scores of our analytical values were within ±2. In addition, the proficiency test was passed, and the accuracy and precision of the analytical values are reliable. Consequently, the analytical quality control for the analysis of air dust samples proved adequate.

  19. Sampling analytical tests and destructive tests for quality assurance

    International Nuclear Information System (INIS)

    Saas, A.; Pasquini, S.; Jouan, A.; Angelis, de; Hreen Taywood, H.; Odoj, R.

    1990-01-01

    In the context of the third programme of the European Communities on the monitoring of radioactive waste, various methods have been developed for the performance of sampling and measuring tests on encapsulated waste of low and medium level activity, on the one hand, and of high level activity, on the other hand. The purpose was to provide better quality assurance for products to be stored on an interim or long-term basis. Various sampling means are proposed, such as: - sampling of raw waste before conditioning and determination of the representative aliquot, - sampling of encapsulated waste on process output, - sampling of core specimens subjected to measurement before and after cutting. Equipment suitable for these sampling procedures has been developed and, in the case of core samples, a comparison of techniques has been made. The results are described for the various analytical tests carried out on the samples, such as: - mechanical tests, - radiation resistance, - fire resistance, - lixiviation, - determination of free water, - biodegradation, - water resistance, - chemical and radiochemical analysis. Whenever possible, these tests were compared with non-destructive tests on full-scale packages, and some correlations are given. This work has made it possible to improve and clarify sample optimization, to refine sampling techniques and methodologies, and to draw up characterization procedures. It also provided an occasion for a first collaboration between the laboratories responsible for these studies, which will be furthered in the scope of the 1990-1994 programme.

  20. Annual banned-substance review: analytical approaches in human sports drug testing.

    Science.gov (United States)

    Thevis, Mario; Kuuranne, Tiia; Geyer, Hans; Schänzer, Wilhelm

    2017-01-01

    There has been an immense amount of visibility of doping issues on the international stage over the past 12 months, with the complexity of doping controls reiterated on various occasions. Hence, analytical test methods that are continuously updated, expanded, and improved to provide specific, sensitive, and comprehensive test results in line with the World Anti-Doping Agency's (WADA) 2016 Prohibited List represent one of several critical cornerstones of doping controls. This enterprise necessitates expediting the (combined) exploitation of newly generated information on novel and/or superior target analytes for sports drug testing assays, drug elimination profiles, alternative test matrices, and recent advances in instrumental developments. This paper is a continuation of the series of annual banned-substance reviews, appraising the literature published between October 2015 and September 2016 concerning human sports drug testing in the context of WADA's 2016 Prohibited List. Copyright © 2016 John Wiley & Sons, Ltd.

  1. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data.

    Science.gov (United States)

    Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-03-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is, however, our limited understanding of the biological and clinical contexts associated with the data. Prominent among the emerging remedies to this challenge are gene set enrichment tools. This study reports on GeneAnalytics™ (geneanalytics.genecards.org), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface, and proprietary unified data. GeneAnalytics employs the LifeMap Science GeneCards suite, including GeneCards®, the human gene database; MalaCards, the human disease database; and PathCards, the biological pathways database. Expression-based analysis in GeneAnalytics relies on LifeMap Discovery®, the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling an advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as the visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analysis and interpretation tool for the translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics

  2. Analytic evaluation of a new glucose meter system in 15 different critical care settings.

    Science.gov (United States)

    Mitsios, John V; Ashby, Lori A; Haverstick, Doris M; Bruns, David E; Scott, Mitchell G

    2013-09-01

    Maintaining appropriate glycemic control in critically ill patients reduces morbidity and mortality. The use of point-of-care (POC) glucose devices is necessary to obtain rapid results at the patient's bedside. However, the devices should be thoroughly tested in the intended population before implementation. The use of POC glucose meters in critically ill patients has been questioned both in the literature and by regulatory agencies. The aim of this study was to determine if the ACCU-CHEK® Inform II system (Roche Diagnostics) POC glucose meter demonstrated the desired accuracy and precision, as defined by Clinical and Laboratory Standards Institute guideline POCT12-A3, in a large number of critically ill patients from multiple intensive care settings at two academic medical centers. A total of 1200 whole blood meter results from 600 patients were compared with central laboratory plasma values. Whole blood aliquots from venous samples were used to obtain duplicate meter results, with the remaining sample being processed to obtain plasma for central laboratory testing within 5 min of meter testing. A total of 1185 (98.8%) of the new meter's glucose values were within ±12.5% of the comparative laboratory glucose values for values ≥100 mg/dl (±12 mg/dl for values <100 mg/dl), and 1198 (99.8%) were within ±20% (±20 mg/dl for values <100 mg/dl). The meter system appears to have sufficient analytic accuracy for use in critically ill patients. © 2013 Diabetes Technology Society.
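
    A minimal Python sketch of that acceptance criterion, assuming the POCT12-A3-style rule stated above (a percentage tolerance at or above 100 mg/dl, an absolute tolerance below it); the paired readings are hypothetical.

    ```python
    # Check meter results against a lab reference with a two-regime tolerance.
    def within_tolerance(meter: float, lab: float,
                         pct: float = 0.125, abs_mgdl: float = 12.0) -> bool:
        if lab >= 100.0:
            return abs(meter - lab) <= pct * lab   # percentage rule at high glucose
        return abs(meter - lab) <= abs_mgdl        # absolute rule at low glucose

    pairs = [(182, 175), (94, 88), (260, 300)]     # hypothetical (meter, lab) mg/dl
    acceptable = sum(within_tolerance(m, l) for m, l in pairs)
    print(f"{acceptable}/{len(pairs)} within tolerance "
          f"({100 * acceptable / len(pairs):.1f}%)")
    ```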

  3. Test Program Set (TPS) Lab

    Data.gov (United States)

    Federal Laboratory Consortium — The ARDEC TPS Laboratory provides an organic Test Program Set (TPS) development, maintenance, and life cycle management capability for DoD LCMC materiel developers....

  4. Appropriate criteria set for personnel promotion across organizational levels using analytic hierarchy process (AHP)

    Directory of Open Access Journals (Sweden)

    Charles Noven Castillo

    2017-01-01

    Currently, only a limited set of established, specific criteria exists for personnel promotion to each level of the organization. This study was conducted in order to develop a personnel promotion strategy by identifying specific sets of criteria for each level of the organization. The complexity of identifying the criteria set, along with the subjectivity of these criteria, requires the use of a multi-criteria decision-making approach, particularly the analytic hierarchy process (AHP). Results show different sets of criteria for each management level, which are consistent with several frameworks in the literature. These criteria sets would help avoid mismatches between employees' skills and competencies and their jobs, and at the same time eliminate issues in personnel promotion such as favouritism, the glass ceiling, and preferences based on gender and physical attractiveness. This work also shows that personality and traits, job satisfaction, and experience and skills are more critical than social capital across different organizational levels. The contribution of this work is in identifying relevant criteria for developing a personnel promotion strategy across organizational levels.
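
    The core AHP computation behind such criteria weighting fits in a few lines: derive weights from a pairwise comparison matrix via its principal eigenvector and check Saaty's consistency ratio. In the Python sketch below, the criteria names come from the abstract, but the pairwise judgments are hypothetical.

    ```python
    # AHP priority vector and consistency check for three criteria.
    import numpy as np

    criteria = ["personality and traits", "job satisfaction", "experience and skills"]
    # A[i, j] = how much more important criterion i is than j (Saaty's 1-9 scale)
    A = np.array([[1.0, 3.0, 2.0],
                  [1/3, 1.0, 1/2],
                  [1/2, 2.0, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()                  # normalized priority vector

    n = len(A)
    ci = (eigvals.real[k] - n) / (n - 1)      # consistency index
    cr = ci / 0.58                            # random index RI = 0.58 for n = 3

    for name, w in zip(criteria, weights):
        print(f"{name:25s} weight = {w:.3f}")
    print(f"consistency ratio = {cr:.3f} (acceptable if < 0.10)")
    ```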

  5. Annual banned-substance review: analytical approaches in human sports drug testing.

    Science.gov (United States)

    Thevis, Mario; Kuuranne, Tiia; Walpurgis, Katja; Geyer, Hans; Schänzer, Wilhelm

    2016-01-01

    The aim of improving anti-doping efforts is predicated on several different pillars, including, amongst others, optimized analytical methods. These commonly result from exploiting most recent developments in analytical instrumentation as well as research data on elite athletes' physiology in general, and pharmacology, metabolism, elimination, and downstream effects of prohibited substances and methods of doping, in particular. The need for frequent and adequate adaptations of sports drug testing procedures has been incessant, largely due to the uninterrupted emergence of new chemical entities but also due to the apparent use of established or even obsolete drugs for reasons other than therapeutic means, such as assumed beneficial effects on endurance, strength, and regeneration capacities. Continuing the series of annual banned-substance reviews, literature concerning human sports drug testing published between October 2014 and September 2015 is summarized and reviewed in reference to the content of the 2015 Prohibited List as issued by the World Anti-Doping Agency (WADA), with particular emphasis on analytical approaches and their contribution to enhanced doping controls. Copyright © 2016 John Wiley & Sons, Ltd.

  6. Set-up for differential manometers testing

    International Nuclear Information System (INIS)

    Ratushnyj, M.I.; Galkin, Yu.V.; Nechaj, A.G.

    1985-01-01

    The characteristics of a set-up for checking and testing the metrological characteristics of TPP and NPP differential manometers with a maximum pressure drop of up to 250 kPa are briefly described. The set-up provides automatic and manual assignment of gauge air pressure values with errors of 0.1 and 0.25%, respectively. The set-up is supplied with standard equipment to measure output signals and is powered from a single-phase 220 V alternating current circuit. Air is supplied by a pneumatic system at a pressure of 0.4-0.6 MPa. Application of the set-up increases operating efficiency 5 times when checking and tuning differential manometers

  7. A Meta-Analytic Review of Tactile-Cued Self-Monitoring Interventions Used by Students in Educational Settings

    Science.gov (United States)

    McDougall, Dennis; Ornelles, Cecily; Mersberg, Kawika; Amona, Kekama

    2015-01-01

    In this meta-analytic review, we critically evaluate procedures and outcomes from nine intervention studies in which students used tactile-cued self-monitoring in educational settings. Findings suggest that most tactile-cued self-monitoring interventions have moderate to strong effects, have emerged only recently, and have not yet achieved the…

  8. Teaching Analytical Method Transfer through Developing and Validating Then Transferring Dissolution Testing Methods for Pharmaceuticals

    Science.gov (United States)

    Kimaru, Irene; Koether, Marina; Chichester, Kimberly; Eaton, Lafayette

    2017-01-01

    Analytical method transfer (AMT) and dissolution testing are important topics required in industry that should be taught in analytical chemistry courses. Undergraduate students in senior level analytical chemistry laboratory courses at Kennesaw State University (KSU) and St. John Fisher College (SJFC) participated in development, validation, and…

  9. Complete set of homogeneous isotropic analytic solutions in scalar-tensor cosmology with radiation and curvature

    Science.gov (United States)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-10-01

    We study a model of a scalar field minimally coupled to gravity, with a specific potential energy for the scalar field, and include curvature and radiation as two additional parameters. Our goal is to obtain analytically the complete set of configurations of a homogeneous and isotropic universe as a function of time. This leads to a geodesically complete description of the Universe, including the passage through the cosmological singularities, at the classical level. We give all the solutions analytically without any restrictions on the parameter space of the model or initial values of the fields. We find that for generic solutions the Universe goes through a singular (zero-size) bounce by entering a period of antigravity at each big crunch and exiting from it at the following big bang. This happens cyclically again and again without violating the null-energy condition. There is a special subset of geodesically complete nongeneric solutions which perform zero-size bounces without ever entering the antigravity regime in all cycles. For these, initial values of the fields are synchronized and quantized but the parameters of the model are not restricted. There is also a subset of spatial curvature-induced solutions that have finite-size bounces in the gravity regime and never enter the antigravity phase. These exist only within a small continuous domain of parameter space without fine-tuning the initial conditions. To obtain these results, we identified 25 regions of a 6-parameter space in which the complete set of analytic solutions are explicitly obtained.

  10. Finite test sets development method for test execution of safety critical software

    International Nuclear Information System (INIS)

    Shin, Sung Min; Kim, Hee Eun; Kang, Hyun Gook; Lee, Sung Jiun

    2014-01-01

    The V and V method has been utilized for this safety critical software, while SRGM has difficulties because of the lack of failure occurrence data in the development phase. For safety critical software, however, failure data cannot be gathered after installation in a real plant when we consider the severe consequences. Therefore, to complement the V and V method, a test-based method needs to be developed. Some studies on test-based reliability quantification methods for safety critical software have been conducted in the nuclear field. These studies provide useful guidance on generating test sets. An important concept of the guidance is that the test sets represent 'trajectories' (a series of successive values for the input variables of a program that occur during the operation of the software over time) in the space of inputs to the software. Actually, the inputs to the software depend on the state of the plant at that time, and these inputs form a new internal state of the software by changing the values of some variables. In other words, the internal state of the software at a specific time depends on the history of past inputs. Here the internal state of the software which can be changed by past inputs is named the Context of Software (CoS). In a certain CoS, a software failure occurs when a fault is triggered by some inputs. To cover the failure occurrence mechanism of a software, preceding researches insist that the inputs should be in trajectory form. However, in this approach, there are two critical problems. One is the length of the trajectory input. The input trajectory should be long enough to cover the failure mechanism, but how long is enough is not clear. What is worse, to cover some accident scenarios, one set of inputs should represent dozens of hours of successive values. The other problem is the number of tests needed. To satisfy a target reliability with a reasonable confidence level, a very large number of test sets is required. Development of this number of test sets is a herculean
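    The trajectory-form input concept can be illustrated with a toy harness. Everything below is hypothetical, including the input variables, the random-walk plant signal, and the stand-in for the software under test; the point is only that each test case is a series of successive inputs, so the internal state (the CoS) depends on the input history.

        import random

        def make_trajectory(n_cycles: int, seed: int) -> list[dict]:
            """One test case = successive input vectors, one per scan cycle."""
            rng = random.Random(seed)
            traj, pressure = [], 100.0
            for _ in range(n_cycles):
                pressure += rng.gauss(0.0, 1.5)   # successive values, not i.i.d. samples
                traj.append({"pressure": pressure, "valve_open": rng.random() < 0.05})
            return traj

        def run_software(trajectory: list[dict]) -> bool:
            """Placeholder for the software under test; True means no failure."""
            state = 0.0                            # CoS built up from past inputs
            for inp in trajectory:
                state = 0.9 * state + 0.1 * inp["pressure"]
            return state < 150.0

        tests = [make_trajectory(n_cycles=1000, seed=s) for s in range(100)]
        failures = sum(not run_software(t) for t in tests)
        print(f"{failures} failures in {len(tests)} trajectory tests")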

  11. The wire optical test: a thorough analytical study in and out of caustic surface, and advantages of a dynamical adaptation

    Science.gov (United States)

    Alejandro Juárez-Reyes, Salvador; Sosa-Sánchez, Citlalli Teresa; Silva-Ortigoza, Gilberto; de Jesús Cabrera-Rosas, Omar; Espíndola-Ramos, Ernesto; Ortega-Vidals, Paula

    2018-03-01

    Among the best known non-interferometric optical tests are the wire test, the Foucault test and the Ronchi test with a low-frequency grating. Since the wire test is the seed for understanding the other ones, the aim of the present work is to do a thorough study of this test for a lens with symmetry of revolution, for any configuration of the object and detection planes in which the planes intersect two, one or no branches of the caustic region (including the marginal and paraxial foci). To this end, we calculated the vectorial representation of the caustic region, and we found the analytical expression for the pattern; we report that the analytical pattern explicitly depends on the magnitude of a branch of the caustic. With the analytical pattern we computed a set of simulations of a dynamical adaptation of the optical wire test. From those simulations, we have done a thorough analysis of the topological structure of the pattern; so we explain how the multiple image formation process and the image collapse process take place for each configuration, in particular, when both the wire and the detection planes are placed inside the caustic region, which has not been studied before. For the first time, we remark that not only are the intersections of the object and detection planes with the caustic important in the change of pattern topology, but so are the projection of the intersection between the caustic and the object plane mapped onto the detection plane, and the virtual projection of the intersection between the caustic and the detection plane mapped onto the object plane. We present that, for the new configurations of the optical system, the wire image consists of curves of the Tschirnhausen cubic, piriform and deformed eight-curve types.

  12. Process-Hardened, Multi-Analyte Sensor for Characterizing Rocket Plum Constituents Under Test Environment, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of the Phase II STTR project is to develop a prototype multi-analyte sensor system to detect gaseous analytes present in the test stands during...

  13. Analytical study for frequency effects on the EPRI/USNRC piping component tests. Part 1: Theoretical basis and model development

    International Nuclear Information System (INIS)

    Adams, T.M.; Branch, E.B.; Tagart, S.W. Jr.

    1994-01-01

    As part of the engineering effort for the Advanced Light Water Reactor, the Advanced Reactor Corporation formed a Piping Technical Core Group to develop a set of improved ASME Boiler and Pressure Vessel Code, Section III design rules and approaches for ALWR plant piping and support design. The technical basis for the proposed changes to the ASME Boiler and Pressure Vessel Code developed by the Technical Core Group for the design of piping relies heavily on the failure margins determined from the EPRI/USNRC piping component test program. The majority of the component tests forming the basis for the reported margins against failure were run with input frequency to natural frequency ratios (Ω/ω) in the range of 0.74 to 0.87. One concern investigated by the Technical Core Group was the effect which could exist on measured margins if the tests had been run at higher or lower frequency ratios than those in the limited frequency ratio range tested. Specifically, the concern investigated was that the proposed Technical Core Group Piping Stress Criteria will allow piping to be designed in the low frequency range (Ω/ω ≥ 2.0) for which there is little test data from the EPRI/USNRC test program. The purpose of this analytical study was to: (1) evaluate the potential for margin variation as a function of the frequency ratio (R_ω = Ω/ω, where Ω is the forcing frequency and ω is the natural component frequency), and (2) recommend a margin reduction factor (MRF) that could be applied to margins determined from the EPRI/USNRC test program to adjust those margins for potential margin variation with frequency ratio. Presented in this paper are the analytical approach and methodology, based on inelastic analysis, which formed the basis of the study. Also discussed are the development of the analytical model, the procedure used to benchmark the model against actual test results, and the various parameter studies conducted
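    The dependence of response on the frequency ratio can be illustrated with the elastic single-degree-of-freedom amplification factor. This is textbook intuition for why margins may vary with R_ω = Ω/ω, not the inelastic analysis methodology of the study, and the 5% damping ratio is an assumed value.

        import math

        def daf(r: float, zeta: float = 0.05) -> float:
            """Steady-state dynamic amplification of a damped SDOF oscillator."""
            return 1.0 / math.sqrt((1.0 - r * r) ** 2 + (2.0 * zeta * r) ** 2)

        for r in (0.5, 0.74, 0.87, 1.0, 2.0):
            print(f"R = {r:4.2f}  DAF = {daf(r):5.2f}")   # peaks near resonance, r = 1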

  14. GA-4/GA-9 honeycomb impact limiter tests and analytical model

    International Nuclear Information System (INIS)

    Koploy, M.A.; Taylor, C.S.

    1991-01-01

    General Atomics (GA) has a test program underway to obtain data on the behavior of a honeycomb impact limiter. The program includes testing of small samples to obtain basic information, as well as testing of complete 1/4-scale impact limiters to obtain load-versus-deflection curves for different crush orientations. GA has used the test results to aid in the development of an analytical model to predict the impact limiter loads. The results also helped optimize the design of the impact limiters for the GA-4 and GA-9 Casks

  15. Immunochemical faecal occult blood tests have superior stability and analytical performance characteristics over guaiac-based tests in a controlled in vitro study.

    LENUS (Irish Health Repository)

    Lee, Chun Seng

    2011-06-01

    The aims of this study were (1) to determine the measurement accuracy of a widely used guaiac faecal occult blood test (gFOBT) compared with an immunochemical faecal occult blood test (iFOBT) during in vitro studies, including their analytical stability over time at ambient temperature and at 4°C; and (2) to compare analytical imprecision and other characteristics between two commercially available iFOBT methods.

  16. Harnessing scientific literature reports for pharmacovigilance. Prototype software analytical tool development and usability testing.

    Science.gov (United States)

    Sorbello, Alfred; Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier

    2017-03-22

    We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers' capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. A prototype, open source, web-based, software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. All usability test participants cited the tool's ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool's automated literature search relative to a manual 'all fields' PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction.
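    The abstract does not state which disproportionality score the prototype computes, so the sketch below shows a generic, widely used one, the proportional reporting ratio (PRR), on a hypothetical 2x2 table of drug-event co-mention counts.

        import math

        #              event    not event
        # drug           a          b
        # other drugs    c          d
        def prr(a: int, b: int, c: int, d: int) -> tuple[float, float]:
            """Proportional reporting ratio and lower 95% CI bound."""
            value = (a / (a + b)) / (c / (c + d))
            # Normal approximation on ln(PRR).
            se = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
            lower = math.exp(math.log(value) - 1.96 * se)
            return value, lower

        score, lower = prr(a=40, b=960, c=200, d=98800)   # hypothetical counts
        print(f"PRR = {score:.1f}, 95% CI lower bound = {lower:.1f}")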

  17. Automotive RF immunity test set-up analysis

    NARCIS (Netherlands)

    Coenen, M.J.; Pues, H.; Bousquet, T.; Gillon, R.; Gielen, G.; Baric, A.

    2011-01-01

    Though the automotive RF emission and RF immunity requirements are highly justifiable, the application of those requirements in an non-intended manner leads to false conclusions and unnecessary redesigns for the electronics involved. When the test results become too dependent upon the test set-up

  18. Ideas for Testing of Planetary Gear Sets of Automotive Transmissions

    Directory of Open Access Journals (Sweden)

    Achtenová Gabriela

    2017-06-01

    The article describes the concept of a modular stand on which it is possible to test gear pairs with fixed axes from mechanical automotive gearboxes, as well as separate planetary sets from automatic gearboxes. Special attention in the article is paid to the variant dedicated to testing of planetary gear sets. This variant is particularly interesting because: (1) it is rarely described in the literature, and (2) this topology allows a big simplification with respect to the testing of standard gearwheels. In the planetary closed-loop stand, two identical planetary sets can be linked directly: without any bracing flange or other connecting clutches, shafts or gear sets, just two planetary sets assembled face-to-face and connected to the electric motor.

  19. Integrative set enrichment testing for multiple omics platforms

    Directory of Open Access Journals (Sweden)

    Poisson Laila M

    2011-11-01

    Background: Enrichment testing assesses the overall evidence of differential expression behavior of the elements within a defined set. When we have measured many molecular aspects, e.g., gene expression, metabolites, proteins, it is desirable to assess their differential tendencies jointly across platforms using an integrated set enrichment test. In this work we explore the properties of several methods for performing a combined enrichment test using gene expression and metabolomics as the motivating platforms. Results: Using two simulation models we explored the properties of several enrichment methods, including two novel methods: the logistic regression 2-degree-of-freedom Wald test and the 2-dimensional permutation p-value for the sum-of-squared-statistics test. In relation to their univariate counterparts we find that the joint tests can improve our ability to detect results that are marginal univariately. We also find that joint tests improve the ranking of associated pathways compared to their univariate counterparts. However, there is a risk of Type I error inflation with some methods, and self-contained methods lose specificity when the sets are not representative of underlying association. Conclusions: In this work we show that consideration of data from multiple platforms, in conjunction with summarization via a priori pathway information, leads to increased power in detection of genomic associations with phenotypes.
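    A sketch of the logistic regression 2-degree-of-freedom Wald test named above, run on simulated data: gene-set membership is regressed on per-gene statistics from two platforms, and the two platform coefficients are tested jointly. The data, effect sizes, and variable names are all hypothetical.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n_genes = 2000
        in_set = rng.random(n_genes) < 0.05                  # pathway membership
        # Per-gene evidence from two platforms (hypothetical z-like statistics).
        expr_z = rng.normal(0, 1, n_genes) + 0.5 * in_set
        metab_z = rng.normal(0, 1, n_genes) + 0.3 * in_set

        X = sm.add_constant(np.column_stack([expr_z, metab_z]))
        fit = sm.Logit(in_set.astype(float), X).fit(disp=0)

        # Joint 2-df Wald test that both platform coefficients are zero.
        wald = fit.wald_test(np.array([[0.0, 1.0, 0.0],
                                       [0.0, 0.0, 1.0]]))
        print("2-df Wald p-value:", float(wald.pvalue))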

  20. Analytical performances of food microbiology laboratories - critical analysis of 7 years of proficiency testing results.

    Science.gov (United States)

    Abdel Massih, M; Planchon, V; Polet, M; Dierick, K; Mahillon, J

    2016-02-01

    Based on the results of 19 food microbiology proficiency testing (PT) schemes, this study aimed to assess laboratory performances, to highlight the main sources of unsatisfactory analytical results and to suggest areas of improvement. The 2009-2015 results of REQUASUD and IPH PT, involving a total of 48 laboratories, were analysed. On average, the laboratories failed to detect or enumerate foodborne pathogens in 3.0% of the tests. Thanks to a close collaboration with the PT participants, the causes of outliers could be identified in 74% of the cases. The main causes of erroneous PT results were either pre-analytical (handling of the samples, timing of analysis), analytical (unsuitable methods, confusion of samples, errors in colony counting or confirmation) or post-analytical mistakes (calculation and encoding of results). PT schemes are a privileged observation post to highlight analytical problems, which would otherwise remain unnoticed. In this perspective, this comprehensive study of PT results provides insight into the sources of systematic errors encountered during the analyses. This study draws the attention of the laboratories to the main causes of analytical errors and suggests practical solutions to avoid them, with an educational purpose. The observations support the hypothesis that regular participation in PT, when followed by feedback and appropriate corrective actions, can play a key role in quality improvement and provide more confidence in the laboratory testing results. © 2015 The Society for Applied Microbiology.

  1. A Test Set for stiff Initial Value Problem Solvers in the open source software R: Package deTestSet

    NARCIS (Netherlands)

    Mazzia, F.; Cash, J.R.; Soetaert, K.

    2012-01-01

    In this paper we present the R package deTestSet that includes challenging test problems written as ordinary differential equations (ODEs), differential algebraic equations (DAEs) of index up to 3, and implicit differential equations (IDEs). In addition it includes 6 new codes to solve initial value

  2. Analytical bounds on SET charge sensitivity for qubit readout in a solid-state quantum computer

    International Nuclear Information System (INIS)

    Green, F.; Buehler, T.M.; Brenner, R.; Hamilton, A.R.; Dzurak, A.S.; Clark, R.G.

    2002-01-01

    Quantum computing promises processing powers orders of magnitude beyond what is possible in conventional silicon-based computers. It harnesses the laws of quantum mechanics directly, exploiting the built-in potential of a wave function for massively parallel information processing. Highly ordered and scalable arrays of single donor atoms (quantum bits, or qubits), embedded in Si, are especially promising; they are a very natural fit to the existing, highly sophisticated Si industry. The success of Si-based quantum computing depends on precisely initializing the quantum state of each qubit, and on precisely reading out its final form. In the Kane architecture the qubit states are read out by detecting the spatial distribution of the donor's electron cloud using a sensitive electrometer. The single-electron transistor (SET) is an attractive candidate readout device for this, since the capacitive, or charging, energy of a SET's metallic central island is exquisitely sensitive to its electronic environment. Use of SETs as high-performance electrometers is therefore a key technology for data transfer in a solid-state quantum computer. We present an efficient analytical method to obtain bounds on the charge sensitivity of a single-electron transistor. Our classic Green-function analysis provides reliable estimates of SET sensitivity, optimizing the design of the readout hardware. Typical calculations, and their physical meaning, are discussed. We compare them with the measured SET-response data

  3. [Quality Management and Quality Specifications of Laboratory Tests in Clinical Studies--Challenges in Pre-Analytical Processes in Clinical Laboratories].

    Science.gov (United States)

    Ishibashi, Midori

    2015-01-01

    Cost, speed, and quality are the three important factors recently indicated by the Ministry of Health, Labour and Welfare (MHLW) for the purpose of accelerating clinical studies. Against this background, the importance of laboratory tests is increasing, especially in the evaluation of clinical study participants' entry and safety, and drug efficacy. To assure the quality of laboratory tests, providing high-quality laboratory testing is mandatory. For providing adequate quality assurance in laboratory tests, quality control in the three fields of pre-analytical, analytical, and post-analytical processes is extremely important. There are, however, no detailed written requirements concerning specimen collection, handling, preparation, storage, and shipping. Most laboratory tests for clinical studies are performed onsite in a local laboratory; however, some laboratory tests are done in offsite central laboratories after specimen shipping. Individual and inter-individual variations are well-known factors affecting laboratory tests. Besides these factors, standardizing specimen collection, handling, preparation, storage, and shipping may improve and maintain the high quality of clinical studies in general. Furthermore, the analytical method, units, and reference interval are also important factors. It is concluded that, to overcome the problems derived from pre-analytical processes, it is necessary to standardize specimen handling in a broad sense.

  4. Finite test sets development method for test execution of safety critical software

    International Nuclear Information System (INIS)

    El-Bordany Ayman; Yun, Won Young

    2014-01-01

    It reads inputs, computes new states, and updates outputs for each scan cycle. Korea Nuclear Instrumentation and Control System (KNICS) has recently developed a fully digitalized Reactor Protection System (RPS) based on PLD. As a digital system, this RPS is equipped with dedicated software. The reliability of this software is crucial to NPP safety, where its malfunction may cause irreversible consequences and affect the whole system as a Common Cause Failure (CCF). To guarantee the reliability of the whole system, the reliability of this software needs to be quantified. There are three representative methods for software reliability quantification, namely the Verification and Validation (V and V) quality-based method, the Software Reliability Growth Model (SRGM), and the test-based method. An important concept of the guidance is that the test sets represent 'trajectories' (a series of successive values for the input variables of a program that occur during the operation of the software over time) in the space of inputs to the software. Actually, the inputs to the software depend on the state of the plant at that time, and these inputs form a new internal state of the software by changing the values of some variables. In other words, the internal state of the software at a specific time depends on the history of past inputs. Here the internal state of the software which can be changed by past inputs is named the Context of Software (CoS). In a certain CoS, a software failure occurs when a fault is triggered by some inputs. To cover the failure occurrence mechanism of a software, preceding researches insist that the inputs should be in trajectory form. However, in this approach, there are two critical problems. One is the length of the trajectory input. The input trajectory should be long enough to cover the failure mechanism, but how long is enough is not clear. What is worse, to cover some accident scenarios, one set of inputs should represent dozens of hours of successive values

  5. Rational Selection, Criticality Assessment, and Tiering of Quality Attributes and Test Methods for Analytical Similarity Evaluation of Biosimilars.

    Science.gov (United States)

    Vandekerckhove, Kristof; Seidl, Andreas; Gutka, Hiten; Kumar, Manish; Gratzl, Gyöngyi; Keire, David; Coffey, Todd; Kuehne, Henriette

    2018-05-10

    Leading regulatory agencies recommend biosimilar assessment to proceed in a stepwise fashion, starting with a detailed analytical comparison of the structural and functional properties of the proposed biosimilar and reference product. The degree of analytical similarity determines the degree of residual uncertainty that must be addressed through downstream in vivo studies. Substantive evidence of similarity from comprehensive analytical testing may justify a targeted clinical development plan, and thus enable a shorter path to licensing. The importance of a careful design of the analytical similarity study program therefore should not be underestimated. Designing a state-of-the-art analytical similarity study meeting current regulatory requirements in regions such as the USA and EU requires a methodical approach, consisting of specific steps that far precede the work on the actual analytical study protocol. This white paper discusses scientific and methodological considerations on the process of attribute and test method selection, criticality assessment, and subsequent assignment of analytical measures to US FDA's three tiers of analytical similarity assessment. Case examples of selection of critical quality attributes and analytical methods for similarity exercises are provided to illustrate the practical implementation of the principles discussed.

  6. Analytical validation of a novel multiplex test for detection of advanced adenoma and colorectal cancer in symptomatic patients.

    Science.gov (United States)

    Dillon, Roslyn; Croner, Lisa J; Bucci, John; Kairs, Stefanie N; You, Jia; Beasley, Sharon; Blimline, Mark; Carino, Rochele B; Chan, Vicky C; Cuevas, Danissa; Diggs, Jeff; Jennings, Megan; Levy, Jacob; Mina, Ginger; Yee, Alvin; Wilcox, Bruce

    2018-05-30

    Early detection of colorectal cancer (CRC) is key to reducing associated mortality. Despite the importance of early detection, approximately 40% of individuals in the United States between the ages of 50-75 have never been screened for CRC. The low compliance with colonoscopy and fecal-based screening may be addressed with a non-invasive alternative such as a blood-based test. We describe here the analytical validation of a multiplexed blood-based assay that measures the plasma concentrations of 15 proteins to assess advanced adenoma (AA) and CRC risk in symptomatic patients. The test was developed on an electrochemiluminescent immunoassay platform employing four multi-marker panels, to be implemented in the clinic as a laboratory developed test (LDT). Under the Clinical Laboratory Improvement Amendments (CLIA) and College of American Pathologists (CAP) regulations, a United States-based clinical laboratory utilizing an LDT must establish performance characteristics relating to analytical validity prior to releasing patient test results. This report describes a series of studies demonstrating the precision, accuracy, analytical sensitivity, and analytical specificity for each of the 15 assays, as required by CLIA/CAP. In addition, the report describes studies characterizing each of the assays' dynamic range, parallelism, tolerance to common interfering substances, spike recovery, and stability to sample freeze-thaw cycles. Upon completion of the analytical characterization, a clinical accuracy study was performed to evaluate concordance of AA and CRC classifier model calls using the analytical method intended for use in the clinic. Of 434 symptomatic patient samples tested, the percent agreement with original CRC and AA calls was 87% and 92% respectively. All studies followed CLSI guidelines and met the regulatory requirements for implementation of a new LDT. The results provide the analytical evidence to support the implementation of the novel multi-marker test as

  7. gsSKAT: Rapid gene set analysis and multiple testing correction for rare-variant association studies using weighted linear kernels.

    Science.gov (United States)

    Larson, Nicholas B; McDonnell, Shannon; Cannon Albright, Lisa; Teerlink, Craig; Stanford, Janet; Ostrander, Elaine A; Isaacs, William B; Xu, Jianfeng; Cooney, Kathleen A; Lange, Ethan; Schleutker, Johanna; Carpten, John D; Powell, Isaac; Bailey-Wilson, Joan E; Cussenot, Olivier; Cancel-Tassin, Geraldine; Giles, Graham G; MacInnis, Robert J; Maier, Christiane; Whittemore, Alice S; Hsieh, Chih-Lin; Wiklund, Fredrik; Catalona, William J; Foulkes, William; Mandal, Diptasri; Eeles, Rosalind; Kote-Jarai, Zsofia; Ackerman, Michael J; Olson, Timothy M; Klein, Christopher J; Thibodeau, Stephen N; Schaid, Daniel J

    2017-05-01

    Next-generation sequencing technologies have afforded unprecedented characterization of low-frequency and rare genetic variation. Due to low power for single-variant testing, aggregative methods are commonly used to combine observed rare variation within a single gene. Causal variation may also aggregate across multiple genes within relevant biomolecular pathways. Kernel-machine regression and adaptive testing methods for aggregative rare-variant association testing have been demonstrated to be powerful approaches for pathway-level analysis, although these methods tend to be computationally intensive at high-variant dimensionality and require access to complete data. An additional analytical issue in scans of large pathway definition sets is multiple testing correction. Gene set definitions may exhibit substantial genic overlap, and the impact of the resultant correlation in test statistics on Type I error rate control for large agnostic gene set scans has not been fully explored. Herein, we first outline a statistical strategy for aggregative rare-variant analysis using component gene-level linear kernel score test summary statistics as well as derive simple estimators of the effective number of tests for family-wise error rate control. We then conduct extensive simulation studies to characterize the behavior of our approach relative to direct application of kernel and adaptive methods under a variety of conditions. We also apply our method to two case-control studies, respectively, evaluating rare variation in hereditary prostate cancer and schizophrenia. Finally, we provide open-source R code for public use to facilitate easy application of our methods to existing rare-variant analysis results. © 2017 WILEY PERIODICALS, INC.
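    The paper derives its own simple estimators of the effective number of tests; those are not reproduced here. As a stand-in, the sketch below applies a common eigenvalue-based estimator (in the style of Nyholt, 2004) to a hypothetical correlation matrix of gene-set test statistics.

        import numpy as np

        def effective_tests(corr: np.ndarray) -> float:
            """Nyholt (2004)-style effective number of tests from eigenvalues."""
            lam = np.linalg.eigvalsh(corr)
            m = corr.shape[0]
            return 1.0 + (m - 1.0) * (1.0 - np.var(lam, ddof=1) / m)

        # Hypothetical correlation among four overlapping gene-set statistics.
        R = np.array([[1.0, 0.8, 0.1, 0.0],
                      [0.8, 1.0, 0.1, 0.0],
                      [0.1, 0.1, 1.0, 0.3],
                      [0.0, 0.0, 0.3, 1.0]])
        m_eff = effective_tests(R)
        print(f"M_eff = {m_eff:.2f}; family-wise alpha = {0.05 / m_eff:.4f}")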

  8. Testing the statistical compatibility of independent data sets

    International Nuclear Information System (INIS)

    Maltoni, M.; Schwetz, T.

    2003-01-01

    We discuss a goodness-of-fit method which tests the compatibility between statistically independent data sets. The method gives sensible results even in cases where the χ² minima of the individual data sets are very low or when several parameters are fitted to a large number of data points. In particular, it avoids the problem that a possible disagreement between data sets becomes diluted by data points which are insensitive to the crucial parameters. A formal derivation of the probability distribution function for the proposed test statistics is given, based on standard theorems of statistics. The application of the method is illustrated on data from neutrino oscillation experiments, and its complementarity to the standard goodness-of-fit is discussed
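    A numeric sketch of this compatibility test, assuming the usual parameter goodness-of-fit construction: the joint-fit χ² minimum minus the sum of the individual minima, referred to a χ² distribution whose degrees of freedom equal the summed per-set parameter counts minus the parameters of the joint fit. All numbers are hypothetical.

        from scipy import stats

        chi2_combined = 12.4          # minimum of the joint fit
        chi2_individual = [3.1, 2.8]  # minima of the two separate fits
        p_per_set = [2, 2]            # parameters each data set is sensitive to
        p_joint = 2                   # parameters adjusted in the joint fit

        chi2_pg = chi2_combined - sum(chi2_individual)
        dof = sum(p_per_set) - p_joint
        p_value = stats.chi2.sf(chi2_pg, dof)
        print(f"chi2_PG = {chi2_pg:.2f}, dof = {dof}, p = {p_value:.3f}")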

  9. VAP/VAT: video analytics platform and test bed for testing and deploying video analytics

    Science.gov (United States)

    Gorodnichy, Dmitry O.; Dubrofsky, Elan

    2010-04-01

    Deploying Video Analytics in operational environments is extremely challenging. This paper presents a methodological approach developed by the Video Surveillance and Biometrics Section (VSB) of the Science and Engineering Directorate (S&E) of the Canada Border Services Agency (CBSA) to resolve these problems. A three-phase approach to enable VA deployment within an operational agency is presented and the Video Analytics Platform and Testbed (VAP/VAT) developed by the VSB section is introduced. In addition to allowing the integration of third-party and in-house built VA codes into an existing video surveillance infrastructure, VAP/VAT also allows the agency to conduct an unbiased performance evaluation of the cameras and VA software available on the market. VAP/VAT consists of two components: EventCapture, which serves to automatically detect a "Visual Event", and EventBrowser, which serves to display and peruse the "Visual Details" captured at the "Visual Event". To deal with open-architecture as well as closed-architecture cameras, two video-feed capture mechanisms have been developed within the EventCapture component: IPCamCapture and ScreenCapture.

  10. Coagulation Tests and Selected Biochemical Analytes in Dairy Cows with Hepatic Lipidosis

    OpenAIRE

    S. Padilla-Arellanes; F. Constantino-Casas; L. Núnez-Ochoa; J. Doubek; C. Vega-Murguia; J. Bouda

    2007-01-01

    The aim of this study was to determine the values of and changes in conventional and optimised clotting tests, as well as in selected biochemical analytes, during hepatic lipidosis in postpartum dairy cows. Ten healthy Holstein cows and ten with hepatic lipidosis were selected based upon clinical history, clinical examination, liver biopsy, flotation test and histological analysis of hepatic tissue. Prothrombin time (PT) and partial thromboplastin time (PTT) were determined in non-diluted and dil...

  11. Urine specimen validity test for drug abuse testing in workplace and court settings.

    Science.gov (United States)

    Lin, Shin-Yu; Lee, Hei-Hwa; Lee, Jong-Feng; Chen, Bai-Hsiun

    2018-01-01

    In recent decades, urine drug testing in the workplace has become common in many countries in the world. There have been several studies concerning the use of the urine specimen validity test (SVT) for drug abuse testing administered in the workplace. However, very little data exists concerning the urine SVT on drug abuse tests from court specimens, including dilute, substituted, adulterated, and invalid tests. We investigated 21,696 urine drug test samples submitted for SVT from workplace and court settings in southern Taiwan over 5 years. All immunoassay screen-positive urine specimen drug tests were confirmed by gas chromatography/mass spectrometry. We found that the mean 5-year prevalence of tampering (dilute, substituted, or invalid tests) in urine specimens was 1.09% for the workplace and 3.81% for the court settings. The mean 5-year percentages of dilute, substituted, and invalid urine specimens from the workplace were 89.2%, 6.8%, and 4.1%, respectively. The mean 5-year percentages of dilute, substituted, and invalid urine specimens from the court were 94.8%, 1.4%, and 3.8%, respectively. No adulterated cases were found among the workplace or court samples. The most common drug identified in the workplace specimens was amphetamine, followed by opiates. The most common drug identified in the court specimens was ketamine, followed by amphetamine. We suggest that all urine specimens taken for drug testing from both workplace and court settings need to be tested for validity. Copyright © 2017. Published by Elsevier B.V.
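    The cutoffs used in the Taiwanese study are not given in the abstract. The sketch below classifies specimens with the commonly cited US HHS creatinine and specific-gravity thresholds, which may differ from those actually applied.

        def classify(creatinine: float, sg: float) -> str:
            """Specimen validity call from creatinine (mg/dl) and specific gravity."""
            if creatinine < 2.0 and (sg < 1.0010 or sg >= 1.0200):
                return "substituted"
            if 2.0 <= creatinine < 20.0 and 1.0010 <= sg < 1.0030:
                return "dilute"
            if creatinine < 2.0 or sg < 1.0010:
                return "invalid"       # the two criteria disagree with each other
            return "acceptable"

        # Hypothetical specimens: (creatinine, specific gravity).
        for spec in [(1.0, 1.0005), (10.0, 1.0020), (1.5, 1.0150), (80.0, 1.0180)]:
            print(spec, "->", classify(*spec))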

  12. An Analytical Framework for Delirium Research in Palliative Care Settings: Integrated Epidemiologic, Clinician-Researcher, and Knowledge User Perspectives

    Science.gov (United States)

    Ansari, Mohammed; Hosie, Annmarie; Kanji, Salmaan; Momoli, Franco; Bush, Shirley H.; Watanabe, Sharon; Currow, David C.; Gagnon, Bruno; Agar, Meera; Bruera, Eduardo; Meagher, David J.; de Rooij, Sophia E.J.A.; Adamis, Dimitrios; Caraceni, Augusto; Marchington, Katie; Stewart, David J.

    2014-01-01

    Context: Delirium often presents difficult management challenges in the context of goals of care in palliative care settings. Objectives: The aim was to formulate an analytical framework for further research on delirium in palliative care settings, prioritize the associated research questions, discuss the inherent methodological challenges associated with relevant studies, and outline the next steps in a program of delirium research. Methods: We combined multidisciplinary input from delirium researchers and knowledge users at an international delirium study planning meeting, relevant literature searches, focused input of epidemiologic expertise, and a meeting participant and coauthor survey to formulate a conceptual research framework and prioritize research questions. Results: Our proposed framework incorporates three main groups of research questions: the first was predominantly epidemiologic, such as delirium occurrence rates, risk factor evaluation, screening, and diagnosis; the second covers pragmatic management questions; and the third relates to the development of predictive models for delirium outcomes. Based on aggregated survey responses to each research question or domain, the combined modal ratings of “very” or “extremely” important confirmed their priority. Conclusion: Using an analytical framework to represent the full clinical care pathway of delirium in palliative care settings, we identified multiple knowledge gaps in relation to the occurrence rates, assessment, management, and outcome prediction of delirium in this population. The knowledge synthesis generated from adequately powered, multicenter studies to answer the framework’s research questions will inform decision making and policy development regarding delirium detection and management and thus help to achieve better outcomes for patients in palliative care settings. PMID:24726762

  13. New Tools to Prepare ACE Cross-section Files for MCNP Analytic Test Problems

    International Nuclear Information System (INIS)

    Brown, Forrest B.

    2016-01-01

    Monte Carlo calculations using one-group cross sections, multigroup cross sections, or simple continuous energy cross sections are often used to: (1) verify production codes against known analytical solutions, (2) verify new methods and algorithms that do not involve detailed collision physics, (3) compare Monte Carlo calculation methods with deterministic methods, and (4) teach fundamentals to students. In this work we describe 2 new tools for preparing the ACE cross-section files to be used by MCNP® for these analytic test problems, simple_ace.pl and simple_ace_mg.pl.

  14. 40 CFR 136.6 - Method modifications and analytical requirements.

    Science.gov (United States)

    2010-07-01

    ... modifications and analytical requirements. (a) Definitions of terms used in this section. (1) Analyst means the..., oil and grease, total suspended solids, total phenolics, turbidity, chemical oxygen demand, and.... Except as set forth in paragraph (b)(3) of this section, an analyst may modify an approved test procedure...

  15. Social Set Analysis

    DEFF Research Database (Denmark)

    Vatrapu, Ravi; Mukkamala, Raghava Rao; Hussain, Abid

    2016-01-01

    Current analytical approaches to big social data build on individual and aggregate units of analysis (e.g., social simulations based on automata and agent-based modeling). However, when it comes to organizational and societal units of analysis, there exists no approach to conceptualize, model, analyze, explain, and predict social media interactions as individuals' associations with ideas, values, identities, and so on. To address this gap, this paper presents conceptual and formal models of social data, and an analytical framework for combining big social data sets with organizational and societal data sets. Three empirical studies of big social data are presented to illustrate and demonstrate social set analysis in terms of fuzzy set-theoretical sentiment analysis, crisp set-theoretical interaction analysis, and event-studies-oriented set-theoretical visualizations. Implications for big data analytics, current limitations of the set-theoretical approach, and future directions are outlined.

  16. Analytical and pre-analytical performance characteristics of a novel cartridge-type blood gas analyzer for point-of-care and laboratory testing.

    Science.gov (United States)

    Oyaert, Matthijs; Van Maerken, Tom; Bridts, Silke; Van Loon, Silvi; Laverge, Heleen; Stove, Veronique

    2018-03-01

    Point-of-care blood gas test results may benefit therapeutic decision making by their immediate impact on patient care. We evaluated the (pre-)analytical performance of a novel cartridge-type blood gas analyzer, the GEM Premier 5000 (Werfen), for the determination of pH, partial carbon dioxide pressure (pCO2), partial oxygen pressure (pO2), sodium (Na+), potassium (K+), chloride (Cl-), ionized calcium (iCa2+), glucose, lactate, and total hemoglobin (tHb). Total imprecision was estimated according to the CLSI EP5-A2 protocol. The estimated total error was calculated based on the mean of the range claimed by the manufacturer. Based on the CLSI EP9-A2 evaluation protocol, a method comparison with the Siemens RapidPoint 500 and Abbott i-STAT CG8+ was performed. Obtained data were compared against preset quality specifications. Interference of potential pre-analytical confounders on co-oximetry and electrolyte concentrations was studied. The analytical performance was acceptable for all parameters tested. Method comparison demonstrated good agreement with the RapidPoint 500 and i-STAT CG8+, except for some parameters (RapidPoint 500: pCO2, K+, lactate and tHb; i-STAT CG8+: pO2, Na+, iCa2+ and tHb) for which significant differences between analyzers were recorded. No interference of lipemia or methylene blue on CO-oximetry results was found. In contrast, significant interference of benzalkonium and hemolysis on electrolyte measurements was found, for which the user is notified by an interferent-specific flag. Identification of sample errors from pre-analytical sources such as interferences, together with automatic corrective actions, along with the analytical performance, ease of use, and low maintenance time of the instrument, makes the evaluated instrument a suitable blood gas analyzer for both POCT and laboratory use. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  17. Staged-Fault Testing of Distance Protection Relay Settings

    Science.gov (United States)

    Havelka, J.; Malarić, R.; Frlan, K.

    2012-01-01

    In order to analyze the operation of the protection system during induced fault testing in the Croatian power system, a simulation using the CAPE software has been performed. The CAPE software (Computer-Aided Protection Engineering) is expert software intended primarily for relay protection engineers, which calculates current and voltage values during faults in the power system, so that relay protection devices can be properly set up. Once the accuracy of the simulation model had been confirmed, a series of simulations were performed in order to obtain the optimal fault location to test the protection system. The simulation results were used to specify the test sequence definitions for the end-to-end relay testing using advanced testing equipment with GPS synchronization for secondary injection in protection schemes based on communication. The objective of the end-to-end testing was to perform field validation of the protection settings, including verification of the circuit breaker operation, telecommunication channel time and the effectiveness of the relay algorithms. Once the end-to-end secondary injection testing had been completed, the induced fault testing was performed with three-end lines loaded and in service. This paper describes and analyses the test procedure, consisting of CAPE simulations, end-to-end test with advanced secondary equipment and staged-fault test of a three-end power line in the Croatian transmission system.

  18. S.E.T., CSNI Separate Effects Test Facility Validation Matrix

    International Nuclear Information System (INIS)

    1997-01-01

    1 - Description of test facility: The SET matrix of experiments is suitable for the developmental assessment of thermal-hydraulics transient system computer codes by selecting individual tests from selected facilities, relevant to each phenomenon. Test facilities differ from one another in geometrical dimensions, geometrical configuration and operating capabilities or conditions. Correlations between SET facilities and phenomena were evaluated on the basis of suitability for model validation (which means that a facility is designed in such a way as to simulate the phenomena assumed to occur in a plant and is sufficiently instrumented); limited suitability for model validation (which means that a facility is designed in such a way as to simulate the phenomena assumed to occur in a plant but has problems associated with imperfect scaling, different test fluids or insufficient instrumentation); and unsuitability for model validation. 2 - Description of test: Whereas integral experiments are usually designed to follow the behaviour of a reactor system in various off-normal or accident transients, separate effects tests focus on the behaviour of a single component, or on the characteristics of one thermal-hydraulic phenomenon. The construction of a separate effects test matrix is an attempt to collect together the best sets of openly available test data for code validation, assessment and improvement, from the wide range of experiments that have been carried out world-wide in the field of thermal hydraulics. In all, 2094 tests are included in the SET matrix

  19. An Independent Filter for Gene Set Testing Based on Spectral Enrichment

    NARCIS (Netherlands)

    Frost, H Robert; Li, Zhigang; Asselbergs, Folkert W; Moore, Jason H

    2015-01-01

    Gene set testing has become an indispensable tool for the analysis of high-dimensional genomic data. An important motivation for testing gene sets, rather than individual genomic variables, is to improve statistical power by reducing the number of tested hypotheses. Given the dramatic growth in

  20. Use of Strain Measurements from Acoustic Bench Tests of the Battleship Flowliner Test Articles To Link Analytical Model Results to In-Service Resonant Response

    Science.gov (United States)

    Frady, Greg; Smalley, Kurt; LaVerde, Bruce; Bishop, Jim

    2004-01-01

    The paper will discuss practical and analytical findings of a test program conducted to assist engineers in determining which analytical strain fields are most appropriate to describe the crack-initiating and crack-propagating stresses in thin-walled cylindrical hardware that serves as part of the Space Shuttle Main Engine's fuel system. In service the hardware is excited by fluctuating dynamic pressures in a cryogenic fuel that arise from turbulent flow/pump cavitation. A bench test using a simplified system was conducted using acoustic energy in air to excite the test articles. Strain measurements were used to reveal response characteristics of two Flowliner test articles that are assembled as a pair when installed in the engine feed system.

  1. Portland, Oregon Test Data Set Arterial Loop Detector Data

    Data.gov (United States)

    Department of Transportation — This set of data files was acquired under USDOT FHWA cooperative agreement DTFH61-11-H-00025 as one of the four test data sets acquired by the USDOT Data Capture and...

  2. Portland, Oregon Test Data Set Freeway Loop Detector Data

    Data.gov (United States)

    Department of Transportation — This set of data files was acquired under USDOT FHWA cooperative agreement DTFH61-11-H-00025 as one of the four test data sets acquired by the USDOT Data Capture and...

  3. Analytical Model of Coil Spring Damper Based on the Loading Test

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Sung Gook; Park, Woong Ki [INNOSE TECH Co. LTD, Incheon (Korea, Republic of); Furuya, Osamu [Tokyo City University, Tokyo (Japan); Kurabayashi, Hiroshi [Vibro-System, Tokyo (Japan)

    2016-05-15

    One way of solving such problems is to enhance and develop improved damping elements for base-isolation and response control systems. Cost reduction of dampers for large-scale structures is another important task for upgrading total response control capabilities in the near future. This study examined a response control device that uses the elastoplastic hysteresis damping of a metal material. The proposed damper is designed in a coil spring element shape to obtain uniform stress in the metal and to reduce low-cycle fatigue under large deformation, upgrading its repetitive strength during earthquake motions. By using SS400 general structural rolled steel as the metal material, the cost of the damping element is effectively reduced. The analytical model of the elasto-plastic coil spring damper (CSD) is introduced, and its basic mechanical properties are evaluated experimentally and analytically. The paper describes the design method of the elasto-plastic coil spring damper, the basic mechanical properties evaluated from the loading tests, and the analytical model of the damper. It was confirmed that the damping force and mechanical characteristics of the elasto-plastic coil spring damper almost satisfy the design specifications.

  4. Simplified Analytic Approach of Pole-to-Pole Faults in MMC-HVDC for AC System Backup Protection Setting Calculation

    Directory of Open Access Journals (Sweden)

    Tongkun Lan

    2018-01-01

    AC (alternating current) system backup protection setting calculation is an important basis for ensuring the safe operation of power grids. With the increasing integration of modular multilevel converter based high-voltage direct current (MMC-HVDC) into power grids, AC system backup protection setting calculation has become a big challenge, as MMC-HVDC lacks fault self-clearance capability under pole-to-pole faults. This paper focuses on pole-to-pole fault analysis for AC system backup protection setting calculation. The principles of pole-to-pole fault analysis are discussed first, according to the standard for AC system protection setting calculation. Then, the influence of fault resistance on the fault process is investigated. A simplified analytic approach to pole-to-pole faults in MMC-HVDC for AC system backup protection setting calculation is proposed. In the proposed approach, the derived expressions of the fundamental frequency current are applicable under arbitrary fault resistance. The accuracy of the proposed approach is demonstrated by PSCAD/EMTDC (Power Systems Computer-Aided Design/Electromagnetic Transients including DC) simulations.

  5. Analytical performance of centrifuge-based device for clinical chemistry testing.

    Science.gov (United States)

    Suk-Anake, Jamikorn; Promptmas, Chamras

    2012-01-01

    A centrifuge-based device has been introduced in the Samsung Blood Analyzer (SBA). Verification of this analyzer is essential to meet the ISO 15189 standard. Analytical performance was evaluated according to the NCCLS EP05-A method. The results of plasma samples were compared between the SBA and a Hitachi 917 analyzer according to the NCCLS EP09-A2-IR method. Percent recovery was determined via analysis of original control serum and spiked serum. Within-run precision was found to be 0.00-6.61% and 0.96-5.99% in normal- and abnormal-level assays, respectively, while between-run precision was 1.31-9.09% and 0.89-6.92%, respectively. The correlation coefficients (r) were > 0.990. The SBA presented analytical accuracy of 96.64 +/- 3.39% to 102.82 +/- 2.75% and 98.31 +/- 4.04% to 103.61 +/- 8.28% recovery, respectively. The results obtained verify that all 13 tests performed using the SBA demonstrate good and reliable precision suitable for use in a qualified clinical chemistry laboratory service.
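    A compact illustration of the within-run/between-run split follows, using a one-way ANOVA decomposition on hypothetical replicates. A full NCCLS/CLSI EP05-A experiment collects duplicates over many days and runs; only the variance-component arithmetic is shown here.

        import numpy as np

        # Replicate results grouped by run (hypothetical values).
        runs = np.array([[5.1, 5.2, 5.0],
                         [5.3, 5.4, 5.2],
                         [5.0, 5.1, 5.1]])
        k, n = runs.shape
        grand = runs.mean()
        # One-way ANOVA mean squares: within runs and between run means.
        ms_within = ((runs - runs.mean(axis=1, keepdims=True)) ** 2).sum() / (k * (n - 1))
        ms_between = n * ((runs.mean(axis=1) - grand) ** 2).sum() / (k - 1)
        sd_within = np.sqrt(ms_within)
        var_run = max((ms_between - ms_within) / n, 0.0)     # run-to-run component
        sd_total = np.sqrt(ms_within + var_run)
        print(f"within-run CV = {100 * sd_within / grand:.2f}%")
        print(f"total (between-run) CV = {100 * sd_total / grand:.2f}%")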

  6. Evaluation of analytical errors in a clinical chemistry laboratory: a 3 year experience.

    Science.gov (United States)

    Sakyi, As; Laing, Ef; Ephraim, Rk; Asibey, Of; Sadique, Ok

    2015-01-01

    Proficient laboratory service is the cornerstone of modern healthcare systems and has an impact on over 70% of medical decisions on admission, discharge, and medications. In recent years, there is an increasing awareness of the importance of errors in laboratory practice and their possible negative impact on patient outcomes. We retrospectively analyzed data spanning a period of 3 years on analytical errors observed in our laboratory. The data covered errors over the whole testing cycle including pre-, intra-, and post-analytical phases, and strategies pertinent to our settings to minimize their occurrence are discussed. We described the occurrence of pre-analytical, analytical, and post-analytical errors observed at the Komfo Anokye Teaching Hospital clinical biochemistry laboratory during a 3-year period from January 2010 to December 2012. Data were analyzed with GraphPad Prism 5 (GraphPad Software Inc., CA, USA). A total of 589,510 tests was performed on 188,503 outpatients and hospitalized patients. The overall error rate for the 3 years was 4.7% (27,520/58,950). Pre-analytical, analytical, and post-analytical errors contributed 3.7% (2210/58,950), 0.1% (108/58,950), and 0.9% (512/58,950), respectively. The number of tests decreased significantly over the 3-year period, but this did not correspond with a reduction in the overall error rate over the years (P = 0.90). Analytical errors are embedded within our total process setup, especially the pre-analytical and post-analytical phases. Strategic measures including quality assessment programs for staff involved in pre-analytical processes should be intensified.

  7. Investigation of clustering in sets of analytical data

    Energy Technology Data Exchange (ETDEWEB)

    Kajfosz, J [Institute of Nuclear Physics, Cracow (Poland)

    1993-04-01

    The foundations of the statistical method of cluster analysis are briefly presented, and its usefulness for the examination and evaluation of analytical data obtained from series of samples investigated by PIXE, PIGE or other methods is discussed. A simple program for fast examination of dissimilarities between samples within an investigated series is described. Useful information on clustering for several hundreds of samples can be obtained with minimal time and storage requirements. (author). 5 refs, 10 figs.
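    A dissimilarity screen of this kind can be outlined with standard tools. The sketch below uses hypothetical elemental concentrations; standardizing each column before computing Euclidean dissimilarities and average-linkage clustering is one common choice, not necessarily the cited program's.

        import numpy as np
        from scipy.spatial.distance import pdist
        from scipy.cluster.hierarchy import linkage, fcluster

        # Rows are samples, columns are elemental concentrations (hypothetical).
        X = np.array([[120.0, 3.1, 0.8],
                      [115.0, 2.9, 0.9],
                      [300.0, 9.5, 2.4],
                      [310.0, 9.8, 2.2]])
        Xz = (X - X.mean(axis=0)) / X.std(axis=0)    # standardize each element
        d = pdist(Xz, metric="euclidean")            # condensed dissimilarity matrix
        tree = linkage(d, method="average")
        print(fcluster(tree, t=2, criterion="maxclust"))   # e.g. [1 1 2 2]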

  8. Investigation of clustering in sets of analytical data

    International Nuclear Information System (INIS)

    Kajfosz, J.

    1993-04-01

    The foundations of the statistical method of cluster analysis are briefly presented, and its usefulness for the examination and evaluation of analytical data obtained from series of samples investigated by PIXE, PIGE or other methods is discussed. A simple program for fast examination of dissimilarities between samples within an investigated series is described. Useful information on clustering for several hundreds of samples can be obtained with minimal time and storage requirements. (author). 5 refs, 10 figs

  9. A Comparison of Two Approaches for the Ruggedness Testing of an Analytical Method

    International Nuclear Information System (INIS)

    Maestroni, Britt

    2016-01-01

    As part of an initiative under the “Red Analitica de Latino America y el Caribe” (RALACA) network, the FAO/IAEA Food and Environmental Protection Laboratory validated a multi-residue method for pesticides in potato. One of the parameters to be assessed was the intra-laboratory robustness, or ruggedness. The objective of this work was to implement a worked example for RALACA laboratories to test for the robustness (ruggedness) of an analytical method. As a conclusion to this study, it is evident that there is a need for harmonization of the definition of the terms robustness/ruggedness, the limits, the methodology and the statistical treatment of the generated data. A worked example for RALACA laboratories to test for the robustness (ruggedness) of an analytical method will soon be posted on the RALACA website (www.red-ralaca.net). This study was carried out with collaborators from LVA (Austria), the University of Antwerp (Belgium), the University of Leuven (Belgium), the Universidad de la Republica (Uruguay) and Agilent Technologies.

  10. Second International Workshop on Teaching Analytics

    DEFF Research Database (Denmark)

    Vatrapu, Ravi; Reimann, Peter; Halb, Wolfgang

    2013-01-01

    Teaching Analytics is conceived as a subfield of learning analytics that focuses on the design, development, evaluation, and education of visual analytics methods and tools for teachers in primary, secondary, and tertiary educational settings. The Second International Workshop on Teaching Analytics (IWTA) 2013 seeks to bring together researchers and practitioners in the fields of education, learning sciences, learning analytics, and visual analytics to investigate the design, development, use, evaluation, and impact of visual analytical methods and tools for teachers’ dynamic diagnostic decision...

  11. Automotive RF immunity test set-up analysis : why test results can't compare

    NARCIS (Netherlands)

    Coenen, Mart; Pues, H.; Bousquet, T.

    2011-01-01

    Though the automotive RF emission and RF immunity requirements are highly justifiable, the application of those requirements in an unintended manner leads to false conclusions and unnecessary redesigns of the electronics involved. When the test results become too dependent upon the test set-up

  12. Development of a grinding-specific performance test set-up

    DEFF Research Database (Denmark)

    Olesen, C. G.; Larsen, B. H.; Andresen, E. L.

    2015-01-01

    The aim of this study was to develop a performance test set-up for America's Cup grinders. The test set-up had to mimic the on-boat grinding activity and be capable of collecting data for analysis and evaluation of grinding performance. This study included a literature-based analysis of grinding demands and a test protocol developed to accommodate the necessary physiological loads. This study resulted in a test protocol consisting of 10 intervals of 20 revolutions each interspersed with active resting periods of 50 s. The 20 revolutions are a combination of both forward and backward grinding and an exponentially rising resistance. A custom-made grinding ergometer was developed with computer-controlled resistance and capable of collecting data during the test. The data collected can be used to find measures of grinding performance such as peak power, time to complete and the decline in repeated grinding performance.

  13. Development of a grinding-specific performance test set-up.

    Science.gov (United States)

    Olesen, C G; Larsen, B H; Andresen, E L; de Zee, M

    2015-01-01

    The aim of this study was to develop a performance test set-up for America's Cup grinders. The test set-up had to mimic the on-boat grinding activity and be capable of collecting data for analysis and evaluation of grinding performance. This study included a literature-based analysis of grinding demands and a test protocol developed to accommodate the necessary physiological loads. This study resulted in a test protocol consisting of 10 intervals of 20 revolutions each interspersed with active resting periods of 50 s. The 20 revolutions are a combination of both forward and backward grinding and an exponentially rising resistance. A custom-made grinding ergometer was developed with computer-controlled resistance and capable of collecting data during the test. The data collected can be used to find measures of grinding performance such as peak power, time to complete and the decline in repeated grinding performance.

  14. Automated statistical modeling of analytical measurement systems

    International Nuclear Information System (INIS)

    Jacobson, J.J.

    1992-01-01

    The statistical modeling of analytical measurement systems at the Idaho Chemical Processing Plant (ICPP) has been completely automated through computer software. The statistical modeling of analytical measurement systems is one part of a complete quality control program used by the Remote Analytical Laboratory (RAL) at the ICPP. The quality control program is an integration of automated data input, measurement system calibration, database management, and statistical process control. The quality control program and statistical modeling program meet the guidelines set forth by the American Society for Testing and Materials and the American National Standards Institute. A statistical model is a set of mathematical equations describing any systematic bias inherent in a measurement system and the precision of a measurement system. A statistical model is developed from data generated from the analysis of control standards. Control standards are samples which are made up at precise known levels by an independent laboratory and submitted to the RAL. The RAL analysts who process control standards do not know the values of those control standards. The object behind statistical modeling is to describe real process samples in terms of their bias and precision and to verify that a measurement system is operating satisfactorily.
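
    The abstract defines a statistical model as equations for systematic bias plus precision, fitted from blind control standards. A minimal Python sketch of that idea follows (assuming a simple linear bias model and invented control values; the ICPP software itself is not described in detail):

      import numpy as np

      # Invented control-standard data: known values submitted blind to the
      # laboratory and the corresponding measured results.
      known = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0])
      measured = np.array([1.04, 2.03, 5.11, 10.1, 20.5, 50.9])

      # Systematic bias modeled as measured = a + b * known (least squares).
      b, a = np.polyfit(known, measured, 1)
      residuals = measured - (a + b * known)
      precision = residuals.std(ddof=2)        # residual standard deviation

      print(f"bias model: measured = {a:.3f} + {b:.4f} * known")
      print(f"precision (1 sigma): {precision:.3f}")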

  15. Analytic treatment of leading-order parton evolution equations: Theory and tests

    International Nuclear Information System (INIS)

    Block, Martin M.; Durand, Loyal; McKay, Douglas W.

    2009-01-01

    We recently derived an explicit expression for the gluon distribution function G(x,Q^2) = xg(x,Q^2) in terms of the proton structure function F_2^{γp}(x,Q^2) in leading-order (LO) QCD by solving the LO Dokshitzer-Gribov-Lipatov-Altarelli-Parisi equation for the Q^2 evolution of F_2^{γp}(x,Q^2) analytically, using a differential-equation method. We showed that accurate experimental knowledge of F_2^{γp}(x,Q^2) in a region of Bjorken x and virtuality Q^2 is all that is needed to determine the gluon distribution in that region. We rederive and extend the results here using a Laplace-transform technique, and show that the singlet quark structure function F_S(x,Q^2) can be determined directly in terms of G from the Dokshitzer-Gribov-Lipatov-Altarelli-Parisi gluon evolution equation. To illustrate the method and check the consistency of existing LO quark and gluon distributions, we used the published values of the LO quark distributions from the CTEQ5L and MRST2001 LO analyses to form F_2^{γp}(x,Q^2), and then solved analytically for G(x,Q^2). We find that the analytic and fitted gluon distributions from MRST2001LO agree well with each other for all x and Q^2, while those from CTEQ5L differ significantly from each other for large x values, x ≳ 0.03-0.05, at all Q^2. We conclude that the published CTEQ5L distributions are incompatible in this region. Using a nonsinglet evolution equation, we obtain a sensitive test of quark distributions which holds in both LO and next-to-leading order perturbative QCD. We find in either case that the CTEQ5 quark distributions satisfy the tests numerically for small x, but fail the tests for x ≳ 0.03-0.05; their use could potentially lead to significant shifts in predictions of quantities sensitive to large x. We encountered no problems with the MRST2001LO distributions or later CTEQ distributions. We suggest caution in the use of the CTEQ5 distributions.
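
    For reference, the nonsinglet evolution test mentioned above rests on the LO DGLAP equation, quoted here in common textbook notation (this is the generic form, not necessarily the authors' exact conventions):

      \frac{\partial\, q_{\mathrm{NS}}(x,Q^2)}{\partial \ln Q^2}
        = \frac{\alpha_s(Q^2)}{2\pi} \int_x^1 \frac{dz}{z}\, P_{qq}(z)\,
          q_{\mathrm{NS}}\!\left(\frac{x}{z},\,Q^2\right),
      \qquad
      P_{qq}(z) = C_F\left[\frac{1+z^2}{(1-z)_+} + \frac{3}{2}\,\delta(1-z)\right],
      \quad C_F = \tfrac{4}{3}.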

  16. Pre-Analytical Conditions in Non-Invasive Prenatal Testing of Cell-Free Fetal RHD

    DEFF Research Database (Denmark)

    Clausen, Frederik Banch; Jakobsen, Tanja Roien; Rieneck, Klaus

    2013-01-01

    D positive fetus. Prophylaxis reduces the risk of immunization that may lead to hemolytic disease of the fetus and the newborn. The reliability of predicting the fetal RhD type depends on pre-analytical factors and assay sensitivity. We evaluated the testing setup in the Capital Region of Denmark, based...

  17. Proficiency Testing by Interlaboratory Comparison Performed in 2010-2015 for Neutron Activation Analysis and Other Analytical Techniques

    International Nuclear Information System (INIS)

    2017-12-01

    The IAEA supports its Member States in increasing the utilization of their research reactors. Small and medium-sized reactors are mostly used for neutron activation analysis (NAA). Although the markets for NAA laboratories have been identified, demonstration of valid analytical results and organizational quality of the work process are preconditions for expanding the stakeholder community, particularly in commercial routine application of this powerful technique. The IAEA has implemented a new mechanism for supporting NAA laboratories in demonstrating their analytical performance by participation in proficiency testing schemes by interlaboratory comparison. This activity makes it possible to identify deviations and non-conformities and their causes, and to implement effective approaches to eliminate them. Over 30 laboratories participated between 2010 and 2015 in consecutive proficiency tests organized by the IAEA in conjunction with the Wageningen Evaluating Programmes for Analytical Laboratories (WEPAL) to assess their analytical performance. This publication reports the findings and lessons learned from this activity. An attached CD-ROM contains many individual participating laboratory papers sharing their individual results and experience gained through this participation.

  18. Evaluating Diagnostic Point-of-Care Tests in Resource-Limited Settings

    Science.gov (United States)

    Drain, Paul K; Hyle, Emily P; Noubary, Farzad; Freedberg, Kenneth A; Wilson, Douglas; Bishai, William; Rodriguez, William; Bassett, Ingrid V

    2014-01-01

    Diagnostic point-of-care (POC) testing is intended to minimize the time to obtain a test result, thereby allowing clinicians and patients to make an expeditious clinical decision. As POC tests expand into resource-limited settings (RLS), the benefits must outweigh the costs. To optimize POC testing in RLS, diagnostic POC tests need rigorous evaluations focused on relevant clinical outcomes and operational costs, which differ from evaluations of conventional diagnostic tests. Here, we reviewed published studies on POC testing in RLS, and found no clearly defined metric for the clinical utility of POC testing. Therefore, we propose a framework for evaluating POC tests, and suggest and define the term “test efficacy” to describe a diagnostic test’s capacity to support a clinical decision within its operational context. We also propose revised criteria for an ideal diagnostic POC test in resource-limited settings. Through systematic evaluations, comparisons between centralized diagnostic testing and novel POC technologies can be more formalized, and health officials can better determine which POC technologies represent valuable additions to their clinical programs. PMID:24332389

  19. Semi-Analytical Benchmarks for MCNP6

    Energy Technology Data Exchange (ETDEWEB)

    Grechanuk, Pavel Aleksandrovi [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-11-07

    Code verification is an extremely important process that involves proving or disproving the validity of code algorithms by comparing them against analytical results of the underlying physics or mathematical theory on which the code is based. Monte Carlo codes such as MCNP6 must undergo verification and testing upon every release to ensure that the codes are properly simulating nature. Specifically, MCNP6 has multiple sets of problems with known analytic solutions that are used for code verification. Monte Carlo codes primarily specify either current boundary sources or a volumetric fixed source, either of which can be very complicated functions of space, energy, direction and time. Thus, most of the challenges with modeling analytic benchmark problems in Monte Carlo codes come from identifying the correct source definition to properly simulate the correct boundary conditions. The problems included in this suite all deal with mono-energetic neutron transport without energy loss, in a homogeneous material. The variables that differ between the problems are source type (isotropic/beam), medium dimensionality (infinite/semi-infinite), etc.
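
    A toy example of the verification idea, not an MCNP6 input deck: for a mono-energetic beam normally incident on a purely absorbing homogeneous slab, the transmitted (uncollided) fraction has the analytic value exp(-sigma_t * thickness), so a Monte Carlo estimate can be checked against it directly. A Python sketch:

      import numpy as np

      # Toy benchmark: mono-energetic beam on a homogeneous, purely absorbing
      # slab; no energy loss. Analytic transmission = exp(-sigma_t * thickness).
      sigma_t, thickness, n = 1.0, 2.0, 1_000_000
      rng = np.random.default_rng(42)

      # Sample free-flight distances; a particle is transmitted if its first
      # collision site lies beyond the slab.
      flights = rng.exponential(1.0 / sigma_t, n)
      mc = np.mean(flights > thickness)
      exact = np.exp(-sigma_t * thickness)
      print(f"MC = {mc:.5f} +/- {np.sqrt(mc * (1 - mc) / n):.5f}, analytic = {exact:.5f}")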

  20. An analytical approach for the Propagation Saw Test

    Science.gov (United States)

    Benedetti, Lorenzo; Fischer, Jan-Thomas; Gaume, Johan

    2016-04-01

    The Propagation Saw Test (PST) [1, 2] is an experimental in-situ technique that has been introduced to assess crack propagation propensity in weak snowpack layers buried below cohesive snow slabs. This test attracted the interest of a large number of practitioners, being relatively easy to perform and providing useful insights for the evaluation of snow instability. The PST procedure requires isolating a snow column 30 centimeters wide and at least 1 meter long in the downslope direction. Then, once the stratigraphy is known (e.g. from a manual snow profile), a saw is used to cut a weak layer which could fail, potentially leading to the release of a slab avalanche. If the length of the saw cut reaches the so-called critical crack length, the onset of crack propagation occurs. Furthermore, depending on snow properties, the crack in the weak layer can initiate the fracture and detachment of the overlying slab. Statistical studies over a large set of field data confirmed the relevance of the PST, highlighting the positive correlation between test results and the likelihood of avalanche release [3]. Recent works provided key information on the conditions for the onset of crack propagation [4] and on the evolution of slab displacement during the test [5]. In addition, experimental studies [6] and simplified models [7] focused on the qualitative description of snowpack properties leading to different failure types, namely full propagation or fracture arrest (with or without slab fracture). However, besides current numerical studies utilizing discrete element methods [8], little attention has been devoted to a detailed analytical description of the PST able to give a comprehensive mechanical framework for the sequence of processes involved in the test. Consequently, this work aims to give a quantitative tool for an exhaustive interpretation of the PST, focusing attention on the important parameters that influence the test outcomes. First, starting from a pure

  1. A multi-analyte biosensor for the simultaneous label-free detection of pathogens and biomarkers in point-of-need animal testing.

    Science.gov (United States)

    Ewald, Melanie; Fechner, Peter; Gauglitz, Günter

    2015-05-01

    For the first time, a multi-analyte biosensor platform has been developed using the label-free 1-lambda-reflectometry technique. This platform is the first that performs multi-analyte measurements without the use of imaging techniques. It is designed to be portable and cost-effective and therefore allows for point-of-need testing or on-site field-testing, with possible applications in diagnostics. This work highlights the application possibilities of this platform in the field of animal testing, but the approach is also relevant and transferable to human diagnostics. The performance of the platform has been evaluated using relevant reference systems like a biomarker (C-reactive protein) and serology (anti-Salmonella antibodies) as well as a panel of real samples (animal sera). The comparison of the working range and limit of detection shows no loss of performance when transferring the separate assays to the multi-analyte setup. Moreover, the new multi-analyte platform allows for discrimination between sera of animals infected with different Salmonella subtypes.

  2. Distinguished hyperbolic trajectories in time-dependent fluid flows: analytical and computational approach for velocity fields defined as data sets

    Directory of Open Access Journals (Sweden)

    K. Ide

    2002-01-01

    Full Text Available In this paper we develop analytical and numerical methods for finding special hyperbolic trajectories that govern the geometry of Lagrangian structures in time-dependent vector fields. The vector fields (or velocity fields) may have arbitrary time dependence and be realized only as data sets over finite time intervals, where space and time are discretized. While the notion of a hyperbolic trajectory is central to dynamical systems theory, much of the theoretical development for Lagrangian transport proceeds under the assumption that such a special hyperbolic trajectory exists. This brings in new mathematical issues that must be addressed in order for Lagrangian transport theory to be applicable in practice, i.e. how to determine whether or not such a trajectory exists and, if it does exist, how to identify it in a sequence of instantaneous velocity fields. We address these issues by developing the notion of a distinguished hyperbolic trajectory (DHT). We develop existence criteria for certain classes of DHTs in general time-dependent velocity fields, based on the time evolution of Eulerian structures that are observed in individual instantaneous fields over the entire time interval of the data set. We demonstrate the concept of DHTs in inhomogeneous (or "forced") time-dependent linear systems and develop a theory and analytical formula for computing DHTs. Throughout this work the notion of linearization is very important. This is not surprising since hyperbolicity is a "linearized" notion. To extend the analytical formula to more general nonlinear time-dependent velocity fields, we develop a series of coordinate transforms including a type of linearization that is not typically used in dynamical systems theory. We refer to it as Eulerian linearization, which is related to the frame independence of DHTs, as opposed to the Lagrangian linearization, which is typical in dynamical systems theory, which is used in the computation of Lyapunov exponents. We

  3. Experimental Implementation of a Kochen-Specker Set of Quantum Tests

    Directory of Open Access Journals (Sweden)

    Vincenzo D’Ambrosio

    2013-02-01

    Full Text Available The conflict between classical and quantum physics can be identified through a series of yes-no tests on quantum systems, without it being necessary that these systems be in special quantum states. Kochen-Specker (KS) sets of yes-no tests have this property and provide a quantum-versus-classical advantage that is free of the initialization problem that affects some quantum computers. Here, we report the first experimental implementation of a complete KS set that consists of 18 yes-no tests on four-dimensional quantum systems and show how to use the KS set to obtain a state-independent quantum advantage. We first demonstrate the unique power of this KS set for solving a task while avoiding the problem of state initialization. Such a demonstration is done by showing that, for 28 different quantum states encoded in the orbital-angular-momentum and polarization degrees of freedom of single photons, the KS set provides an impossible-to-beat solution. In a second experiment, we generate maximally contextual quantum correlations by performing compatible sequential measurements of the polarization and path of single photons. In this case, state independence is demonstrated for 15 different initial states. Maximum contextuality and state independence follow from the fact that the sequences of measurements project any initial quantum state onto one of the KS set’s eigenstates. Our results show that KS sets can be used for quantum-information processing and quantum computation and pave the way for future developments.

  4. Experimental Implementation of a Kochen-Specker Set of Quantum Tests

    Science.gov (United States)

    D'Ambrosio, Vincenzo; Herbauts, Isabelle; Amselem, Elias; Nagali, Eleonora; Bourennane, Mohamed; Sciarrino, Fabio; Cabello, Adán

    2013-01-01

    The conflict between classical and quantum physics can be identified through a series of yes-no tests on quantum systems, without it being necessary that these systems be in special quantum states. Kochen-Specker (KS) sets of yes-no tests have this property and provide a quantum-versus-classical advantage that is free of the initialization problem that affects some quantum computers. Here, we report the first experimental implementation of a complete KS set that consists of 18 yes-no tests on four-dimensional quantum systems and show how to use the KS set to obtain a state-independent quantum advantage. We first demonstrate the unique power of this KS set for solving a task while avoiding the problem of state initialization. Such a demonstration is done by showing that, for 28 different quantum states encoded in the orbital-angular-momentum and polarization degrees of freedom of single photons, the KS set provides an impossible-to-beat solution. In a second experiment, we generate maximally contextual quantum correlations by performing compatible sequential measurements of the polarization and path of single photons. In this case, state independence is demonstrated for 15 different initial states. Maximum contextuality and state independence follow from the fact that the sequences of measurements project any initial quantum state onto one of the KS set’s eigenstates. Our results show that KS sets can be used for quantum-information processing and quantum computation and pave the way for future developments.

  5. CALIBRATION OF SEMI-ANALYTIC MODELS OF GALAXY FORMATION USING PARTICLE SWARM OPTIMIZATION

    International Nuclear Information System (INIS)

    Ruiz, Andrés N.; Domínguez, Mariano J.; Yaryura, Yamila; Lambas, Diego García; Cora, Sofía A.; Vega-Martínez, Cristian A.; Gargiulo, Ignacio D.; Padilla, Nelson D.; Tecce, Tomás E.; Orsi, Álvaro; Muñoz Arancibia, Alejandra M.

    2015-01-01

    We present a fast and accurate method to select an optimal set of parameters in semi-analytic models of galaxy formation and evolution (SAMs). Our approach compares the results of a model against a set of observables applying a stochastic technique called Particle Swarm Optimization (PSO), a self-learning algorithm for localizing regions of maximum likelihood in multidimensional spaces that outperforms traditional sampling methods in terms of computational cost. We apply the PSO technique to the SAG semi-analytic model combined with merger trees extracted from a standard Lambda Cold Dark Matter N-body simulation. The calibration is performed using a combination of observed galaxy properties as constraints, including the local stellar mass function and the black hole to bulge mass relation. We test the ability of the PSO algorithm to find the best set of free parameters of the model by comparing the results with those obtained using an MCMC exploration. Both methods find the same maximum likelihood region; however, the PSO method requires one order of magnitude fewer evaluations. This new approach allows a fast estimation of the best-fitting parameter set in multidimensional spaces, providing a practical tool to test the consequences of including other astrophysical processes in SAMs.
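
    The abstract names the technique but not its update rule. Below is a minimal, generic PSO sketch in Python (the usual textbook formulation, not the SAG calibration code; the quadratic objective stands in for the model's real likelihood):

      import numpy as np

      def pso(objective, bounds, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5):
          """Minimal particle swarm optimizer over box-bounded parameters."""
          rng = np.random.default_rng(0)
          lo, hi = bounds[:, 0], bounds[:, 1]
          x = rng.uniform(lo, hi, (n_particles, len(lo)))
          v = np.zeros_like(x)
          pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
          g = pbest[pbest_f.argmin()].copy()
          for _ in range(n_iter):
              r1, r2 = rng.random(x.shape), rng.random(x.shape)
              # Velocity blends inertia, pull toward personal best, pull toward
              # the swarm's global best; positions stay clipped to the box.
              v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
              x = np.clip(x + v, lo, hi)
              f = np.array([objective(p) for p in x])
              better = f < pbest_f
              pbest[better], pbest_f[better] = x[better], f[better]
              g = pbest[pbest_f.argmin()].copy()
          return g, pbest_f.min()

      # Stand-in objective: quadratic "negative log-likelihood", optimum (1, -2).
      best, fbest = pso(lambda p: (p[0] - 1) ** 2 + (p[1] + 2) ** 2,
                        np.array([[-5.0, 5.0], [-5.0, 5.0]]))
      print(best, fbest)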

  6. CALIBRATION OF SEMI-ANALYTIC MODELS OF GALAXY FORMATION USING PARTICLE SWARM OPTIMIZATION

    Energy Technology Data Exchange (ETDEWEB)

    Ruiz, Andrés N.; Domínguez, Mariano J.; Yaryura, Yamila; Lambas, Diego García [Instituto de Astronomía Teórica y Experimental, CONICET-UNC, Laprida 854, X5000BGR, Córdoba (Argentina); Cora, Sofía A.; Vega-Martínez, Cristian A.; Gargiulo, Ignacio D. [Consejo Nacional de Investigaciones Científicas y Técnicas, Rivadavia 1917, C1033AAJ Buenos Aires (Argentina); Padilla, Nelson D.; Tecce, Tomás E.; Orsi, Álvaro; Muñoz Arancibia, Alejandra M., E-mail: andresnicolas@oac.uncor.edu [Instituto de Astrofísica, Pontificia Universidad Católica de Chile, Av. Vicuña Mackenna 4860, Santiago (Chile)

    2015-03-10

    We present a fast and accurate method to select an optimal set of parameters in semi-analytic models of galaxy formation and evolution (SAMs). Our approach compares the results of a model against a set of observables applying a stochastic technique called Particle Swarm Optimization (PSO), a self-learning algorithm for localizing regions of maximum likelihood in multidimensional spaces that outperforms traditional sampling methods in terms of computational cost. We apply the PSO technique to the SAG semi-analytic model combined with merger trees extracted from a standard Lambda Cold Dark Matter N-body simulation. The calibration is performed using a combination of observed galaxy properties as constraints, including the local stellar mass function and the black hole to bulge mass relation. We test the ability of the PSO algorithm to find the best set of free parameters of the model by comparing the results with those obtained using an MCMC exploration. Both methods find the same maximum likelihood region; however, the PSO method requires one order of magnitude fewer evaluations. This new approach allows a fast estimation of the best-fitting parameter set in multidimensional spaces, providing a practical tool to test the consequences of including other astrophysical processes in SAMs.

  7. Cosmic Bell Test: Measurement Settings from Milky Way Stars

    Science.gov (United States)

    Handsteiner, Johannes; Friedman, Andrew S.; Rauch, Dominik; Gallicchio, Jason; Liu, Bo; Hosp, Hannes; Kofler, Johannes; Bricher, David; Fink, Matthias; Leung, Calvin; Mark, Anthony; Nguyen, Hien T.; Sanders, Isabella; Steinlechner, Fabian; Ursin, Rupert; Wengerowsky, Sören; Guth, Alan H.; Kaiser, David I.; Scheidl, Thomas; Zeilinger, Anton

    2017-02-01

    Bell's theorem states that some predictions of quantum mechanics cannot be reproduced by a local-realist theory. That conflict is expressed by Bell's inequality, which is usually derived under the assumption that there are no statistical correlations between the choices of measurement settings and anything else that can causally affect the measurement outcomes. In previous experiments, this "freedom of choice" was addressed by ensuring that selection of measurement settings via conventional "quantum random number generators" was spacelike separated from the entangled particle creation. This, however, left open the possibility that an unknown cause affected both the setting choices and measurement outcomes as recently as mere microseconds before each experimental trial. Here we report on a new experimental test of Bell's inequality that, for the first time, uses distant astronomical sources as "cosmic setting generators." In our tests with polarization-entangled photons, measurement settings were chosen using real-time observations of Milky Way stars while simultaneously ensuring locality. Assuming fair sampling for all detected photons, and that each stellar photon's color was set at emission, we observe statistically significant ≳7.31 σ and ≳11.93 σ violations of Bell's inequality with estimated p values of ≲1.8 ×10-13 and ≲4.0 ×10-33, respectively, thereby pushing back by ˜600 years the most recent time by which any local-realist influences could have engineered the observed Bell violation.

  8. Bias-Free Chemically Diverse Test Sets from Machine Learning.

    Science.gov (United States)

    Swann, Ellen T; Fernandez, Michael; Coote, Michelle L; Barnard, Amanda S

    2017-08-14

    Current benchmarking methods in quantum chemistry rely on databases that are built using a chemist's intuition. It is not fully understood how diverse or representative these databases truly are. Multivariate statistical techniques like archetypal analysis and K-means clustering have previously been used to summarize large sets of nanoparticles; however, molecules are more diverse and not as easily characterized by descriptors. In this work, we compare three sets of descriptors based on the one-, two-, and three-dimensional structure of a molecule. Using data from the NIST Computational Chemistry Comparison and Benchmark Database and machine learning techniques, we demonstrate the functional relationship between these structural descriptors and the electronic energy of molecules. Archetypes and prototypes found with topological or Coulomb matrix descriptors can be used to identify smaller, statistically significant test sets that better capture the diversity of chemical space. We apply this same method to find a diverse subset of organic molecules to demonstrate how the methods can easily be reapplied to individual research projects. Finally, we use our bias-free test sets to assess the performance of density functional theory and quantum Monte Carlo methods.
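
    One simple way to extract a small, diverse test set from descriptor space, in the spirit of the prototypes described above, is to cluster the descriptor vectors and keep the molecule nearest each cluster centre. The Python sketch below is illustrative only (K-means via scikit-learn on invented descriptors; the paper's archetypal analysis is a different, related technique):

      import numpy as np
      from sklearn.cluster import KMeans

      # Invented molecular descriptor matrix (rows = molecules).
      rng = np.random.default_rng(3)
      X = rng.normal(size=(200, 8))

      k = 10                                   # desired test-set size
      km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X)

      # Pick the molecule closest to each cluster centre as its representative.
      reps = [int(np.argmin(((X - c) ** 2).sum(axis=1)))
              for c in km.cluster_centers_]
      print("diverse test-set indices:", sorted(reps))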

  9. Does rational selection of training and test sets improve the outcome of QSAR modeling?

    Science.gov (United States)

    Martin, Todd M; Harten, Paul; Young, Douglas M; Muratov, Eugene N; Golbraikh, Alexander; Zhu, Hao; Tropsha, Alexander

    2012-10-22

    Prior to using a quantitative structure-activity relationship (QSAR) model for external predictions, its predictive power should be established and validated. In the absence of a true external data set, the best way to validate the predictive ability of a model is to perform its statistical external validation. In statistical external validation, the overall data set is divided into training and test sets. Commonly, this splitting is performed using random division. Rational splitting methods can divide data sets into training and test sets in an intelligent fashion. The purpose of this study was to determine whether rational division methods lead to more predictive models compared to random division. A special data splitting procedure was used to facilitate the comparison between random and rational division methods. For each toxicity end point, the overall data set was divided into a modeling set (80% of the overall set) and an external evaluation set (20% of the overall set) using random division. The modeling set was then subdivided into a training set (80% of the modeling set) and a test set (20% of the modeling set) using rational division methods and by using random division. The Kennard-Stone, minimal test set dissimilarity, and sphere exclusion algorithms were used as the rational division methods. The hierarchical clustering, random forest, and k-nearest neighbor (kNN) methods were used to develop QSAR models based on the training sets. For kNN QSAR, multiple training and test sets were generated, and multiple QSAR models were built. The results of this study indicate that models based on rational division methods generate better statistical results for the test sets than models based on random division, but the predictive power of both types of models is comparable.
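
    Of the rational division methods named above, Kennard-Stone is the easiest to state: start from the two most distant points, then repeatedly add the candidate whose distance to the already-selected set is largest. A straightforward Python sketch (descriptor matrix invented; Euclidean distance assumed):

      import numpy as np
      from scipy.spatial.distance import cdist

      def kennard_stone(X, n_select):
          """Select n_select rows of X that span descriptor space (Kennard-Stone)."""
          D = cdist(X, X)
          # Start with the two most distant points.
          i, j = np.unravel_index(D.argmax(), D.shape)
          selected = [int(i), int(j)]
          while len(selected) < n_select:
              remaining = [k for k in range(len(X)) if k not in selected]
              # Each candidate's distance to its nearest already-selected point;
              # pick the candidate for which this distance is largest.
              dmin = D[np.ix_(remaining, selected)].min(axis=1)
              selected.append(remaining[int(dmin.argmax())])
          return selected

      X = np.random.default_rng(0).normal(size=(50, 5))
      train_idx = kennard_stone(X, 10)
      print(train_idx)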

  10. 40 CFR 141.74 - Analytical and monitoring requirements.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 22 2010-07-01 2010-07-01 false Analytical and monitoring requirements... Analytical and monitoring requirements. (a) Analytical requirements. Only the analytical method(s) specified... as set forth in the article “National Field Evaluation of a Defined Substrate Method for the...

  11. Social Set Visualizer (SoSeVi) II

    DEFF Research Database (Denmark)

    Flesch, Benjamin; Vatrapu, Ravi

    2016-01-01

    This paper reports the second iteration of the Social Set Visualizer (SoSeVi), a set theoretical visual analytics dashboard of big social data. In order to further demonstrate its usefulness in large-scale visual analytics tasks of individual and collective behavior of actors in social networks, the current iteration of the Social Set Visualizer (SoSeVi), in version II, builds on recent advancements in visualizing set intersections. The development of the SoSeVi dashboard involved cutting-edge open source visual analytics libraries (D3.js) and creation of new visualizations such as of actor mobility...

  12. Point-of-care testing in an organ procurement organization donor management setting.

    Science.gov (United States)

    Baier, K A; Markham, L E; Flaigle, S P; Nelson, P W; Shield, C F; Muruve, N A; Aeder, M I; Murillo, D; Bryan, C F

    2003-01-01

    Our organ procurement organization (OPO) evaluated the clinical and financial efficacy of point-of-care testing (POCT) in management of our deceased organ donors. Before we implemented point-of-care testing with the i-STAT into routine clinical donor management, we compared the i-STAT result with the result from the respective donor hospital lab (DHL) for certain analytes on 15 consecutive donors in our OPO from 26 March to 14 May 2001. The financial impact was studied by reviewing 77 donors from July 2001 to March 2002. There was a strong correlation for each analyte between the POC and DHL test results, with r-values as follows: pH = 0.86; PCO2 = 0.96; PO2 = 0.98; sodium = 0.98; potassium = 0.95; chloride = 0.94; BUN = 0.98; glucose = 0.92; haematocrit = 0.87 and creatinine = 0.95. Since our OPO coordinators began using the i-STAT in their routine clinical management of organ donors, they can now more quickly maximize oxygenation and fluid management of the donor and make extra-renal placement calls sooner. Finally, since we are no longer billed for the testing performed on the i-STAT, the average financial saving to our OPO is 733 US dollars per case. Point-of-care testing in management of our OPO donors provides a result that is equivalent to that of the donor hospital lab, with quicker turn-around time, allowing more immediate clinical management decisions to be made so that extra-renal offers may begin sooner.
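
    The agreement statistics quoted above are of the standard method-comparison kind. A minimal Python sketch with invented paired results (not the study data) computes the Pearson r together with the Bland-Altman bias and limits of agreement that such comparisons usually report:

      import numpy as np

      # Invented paired potassium results (mmol/L): point-of-care vs. lab.
      poc = np.array([3.6, 4.1, 4.4, 4.9, 5.3, 5.8, 6.1])
      lab = np.array([3.5, 4.2, 4.3, 5.0, 5.2, 5.9, 6.3])

      r = np.corrcoef(poc, lab)[0, 1]           # agreement as Pearson correlation
      bias = np.mean(poc - lab)                 # mean difference (Bland-Altman bias)
      loa = 1.96 * np.std(poc - lab, ddof=1)    # 95% limits of agreement
      print(f"r = {r:.3f}, bias = {bias:+.3f}, limits = +/-{loa:.3f}")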

  13. Transfer of test-enhanced learning: Meta-analytic review and synthesis.

    Science.gov (United States)

    Pan, Steven C; Rickard, Timothy C

    2018-05-07

    Attempting recall of information from memory, as occurs when taking a practice test, is one of the most potent training techniques known to learning science. However, does testing yield learning that transfers to different contexts? In the present article, we report the findings of the first comprehensive meta-analytic review into that question. Our review encompassed 192 transfer effect sizes extracted from 122 experiments and 67 published and unpublished articles (N = 10,382) that together comprise more than 40 years of research. A random-effects model revealed that testing can yield transferrable learning as measured relative to a nontesting reexposure control condition (d = 0.40, 95% CI [0.31, 0.50]). That transfer of learning is greatest across test formats, to application and inference questions, to problems involving medical diagnoses, and to mediator and related word cues; it is weakest to rearranged stimulus-response items, to untested materials seen during initial study, and to problems involving worked examples. Moderator analyses further indicated that response congruency and elaborated retrieval practice, as well as initial test performance, strongly influence the likelihood of positive transfer. In two assessments for publication bias using PET-PEESE and various selection methods, the moderator effect sizes were minimally affected. However, the intercept predictions were substantially reduced, often indicating no positive transfer when none of the aforementioned moderators are present. Overall, our results motivate a three-factor framework for transfer of test-enhanced learning and have practical implications for the effective use of practice testing in educational and other training contexts. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
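
    The pooled effect size reported above comes from a random-effects model. A minimal Python sketch of generic DerSimonian-Laird pooling follows (a standard random-effects estimator, stated here as an assumption about the general approach; the effect sizes and variances are invented, not the meta-analysis data):

      import numpy as np

      def dersimonian_laird(d, var):
          """Random-effects pooled effect size from study effects and variances."""
          w = 1.0 / var                                  # fixed-effect weights
          d_fe = np.sum(w * d) / np.sum(w)
          q = np.sum(w * (d - d_fe) ** 2)                # Cochran's Q
          c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
          tau2 = max(0.0, (q - (len(d) - 1)) / c)        # between-study variance
          w_re = 1.0 / (var + tau2)
          d_re = np.sum(w_re * d) / np.sum(w_re)
          se = np.sqrt(1.0 / np.sum(w_re))
          return d_re, (d_re - 1.96 * se, d_re + 1.96 * se)

      # Invented transfer effect sizes (Cohen's d) and their variances.
      d = np.array([0.52, 0.31, 0.45, 0.18, 0.60])
      var = np.array([0.02, 0.015, 0.03, 0.01, 0.025])
      print(dersimonian_laird(d, var))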

  14. Verification of the analytical fracture assessments methods by a large scale pressure vessel test

    Energy Technology Data Exchange (ETDEWEB)

    Keinanen, H; Oberg, T; Rintamaa, R; Wallin, K

    1988-12-31

    This document deals with the use of fracture mechanics for the assessment of reactor pressure vessels. Tests have been carried out to verify the analytical fracture assessment methods. The analysis is focused on flaw dimensions and the scatter band of material characteristics. Results are provided and compared with experimental ones. (TEC).

  15. Correlation Between Screening Mammography Interpretive Performance on a Test Set and Performance in Clinical Practice.

    Science.gov (United States)

    Miglioretti, Diana L; Ichikawa, Laura; Smith, Robert A; Buist, Diana S M; Carney, Patricia A; Geller, Berta; Monsees, Barbara; Onega, Tracy; Rosenberg, Robert; Sickles, Edward A; Yankaskas, Bonnie C; Kerlikowske, Karla

    2017-10-01

    Evidence is inconsistent about whether radiologists' interpretive performance on a screening mammography test set reflects their performance in clinical practice. This study aimed to estimate the correlation between test set and clinical performance and determine if the correlation is influenced by cancer prevalence or lesion difficulty in the test set. This institutional review board-approved study randomized 83 radiologists from six Breast Cancer Surveillance Consortium registries to assess one of four test sets of 109 screening mammograms each; 48 radiologists completed a fifth test set of 110 mammograms 2 years later. Test sets differed in number of cancer cases and difficulty of lesion detection. Test set sensitivity and specificity were estimated using woman-level and breast-level recall with cancer status and expert opinion as gold standards. Clinical performance was estimated using women-level recall with cancer status as the gold standard. Spearman rank correlations between test set and clinical performance with 95% confidence intervals (CI) were estimated. For test sets with fewer cancers (N = 15) that were more difficult to detect, correlations were weak to moderate for sensitivity (woman level = 0.46, 95% CI = 0.16, 0.69; breast level = 0.35, 95% CI = 0.03, 0.61) and weak for specificity (0.24, 95% CI = 0.01, 0.45) relative to expert recall. Correlations for test sets with more cancers (N = 30) were close to 0 and not statistically significant. Correlations between screening performance on a test set and performance in clinical practice are not strong. Test set performance more accurately reflects performance in clinical practice if cancer prevalence is low and lesions are challenging to detect. Copyright © 2017 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.

  16. Learning Analytics: drivers, developments and challenges

    Directory of Open Access Journals (Sweden)

    Rebecca Ferguson

    2014-12-01

    Full Text Available Learning analytics is a significant area of Technology-Enhanced Learning (TEL) that has emerged during the last decade. This review of the field begins with an examination of the technological, educational and political factors that have driven the development of analytics in educational settings. It goes on to chart the emergence of learning analytics, including their origins in the 20th century, the development of data-driven analytics, the rise of learning-focused perspectives and the influence of national economic concerns. It next focuses on the relationships between learning analytics, educational data mining and academic analytics. Finally, it examines developing areas of learning analytics research, and identifies a series of future challenges.

  17. Rorty, Pragmatism, and Analytic Philosophy

    Directory of Open Access Journals (Sweden)

    Cheryl Misak

    2013-07-01

    Full Text Available One of Richard Rorty's legacies is to have put a Jamesian version of pragmatism on the contemporary philosophical map. Part of his argument has been that pragmatism and analytic philosophy are set against each other, with pragmatism almost having been killed off by the reigning analytic philosophy. The argument of this paper is that there is a better and more interesting reading of both the history of pragmatism and the history of analytic philosophy.

  18. In situ impulse test: an experimental and analytical evaluation of data interpretation procedures

    International Nuclear Information System (INIS)

    1975-08-01

    Special experimental field testing and analytical studies were undertaken at Fort Lawton in Seattle, Washington, to study "close-in" wave propagation and evaluate data interpretation procedures for a new in situ impulse test. This test was developed to determine the shear wave velocity and dynamic modulus of soils underlying potential nuclear power plant sites. The test is different from conventional geophysical testing in that the velocity variation with strain is determined for each test. In general, strains between 10^-1 and 10^-3 percent are achieved. The experimental field work consisted of performing special tests in a large test sand fill to obtain detailed "close-in" data. Six recording transducers were placed at various points on the energy source, while approximately 37 different transducers were installed within the soil fill, all within 7 feet of the energy source. Velocity measurements were then taken simultaneously under controlled test conditions to study shear wave propagation phenomenology and help evaluate data interpretation procedures. Typical test data are presented along with detailed descriptions of the results.

  19. Analytical Study of High Concentration PCB Paint at the Heavy Water Components Test Reactor

    International Nuclear Information System (INIS)

    Lowry, N.J.

    1998-01-01

    This report provides results of an analytical study of high concentration PCB paint in a shutdown nuclear test reactor located at the US Department of Energy's Savannah River Site (SRS). The study was designed to obtain data relevant for an evaluation of potential hazards associated with the use of and exposure to such paints

  20. Effects of Analytical and Holistic Scoring Patterns on Scorer Reliability in Biology Essay Tests

    Science.gov (United States)

    Ebuoh, Casmir N.

    2018-01-01

    Literature revealed that the patterns/methods of scoring essay tests have been criticized for not being reliable, and this unreliability is likely to be greater in internal examinations than in external examinations. The purpose of this study is to find out the effects of analytical and holistic scoring patterns on scorer reliability in…

  1. Test for arsenic speciation in waters based on a paper-based analytical device with scanometric detection.

    Science.gov (United States)

    Pena-Pereira, Francisco; Villar-Blanco, Lorena; Lavilla, Isela; Bendicho, Carlos

    2018-06-29

    A rapid, simple and affordable method for arsenic speciation analysis is described in this work. The proposed methodology involves in situ arsine generation, transfer of the volatile to the headspace and its reaction with silver nitrate at the detection zone of a paper-based analytical device (PAD). Thus, silver nitrate acts as a recognition element for arsine in the paper-based sensor. The chemical reaction between the recognition element and the analyte derivative results in the formation of a colored product which can be detected by scanning the detection zone and data treatment with an image processing and analysis program. Detection and injection zones were defined in the paper substrate by formation of hydrophobic barriers, thus enabling the formation of the volatile derivative without affecting the chemical stability of the recognition element present in the PAD. Experimental parameters influencing the analytical performance of the methodology, namely color mode detection, composition of the paper-based sensor, and hydride generation and mass transfer conditions, were evaluated. Under optimal conditions, the proposed method showed limits of detection and quantification of 1.1 and 3.6 ng mL-1, respectively. Remarkably, the limit of detection of the method reported herein was much lower than the maximum contaminant levels set by both the World Health Organization and the US Environmental Protection Agency for arsenic in drinking water, unlike several commercially available arsenic test kits. The repeatability, expressed as relative standard deviation, was found to be 7.1% (n = 8). The method was validated against the European Reference Material ERM®-CA615 groundwater and successfully applied to the determination of As(III), As(V) and total inorganic As in different water samples. Furthermore, the method can be used for the screening analysis of total arsenic in waters when a cut-off level of 7 ng mL-1 is used. Copyright © 2018 Elsevier B.V.
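
    The scanometric read-out described above amounts to quantifying the color developed in the detection zone of the scanned device. The Python sketch below is illustrative only (Pillow assumed for image handling; the zone coordinates, calibration standards and signal values are invented):

      import numpy as np
      from PIL import Image

      def zone_intensity(path, box):
          """Mean grey signal of a scanned detection zone (box = l, t, r, b)."""
          zone = np.asarray(Image.open(path).convert("L").crop(box), dtype=float)
          return 255.0 - zone.mean()     # darker colored product -> larger signal

      # Invented calibration: signal vs. As(III) standards, then a linear fit.
      conc = np.array([0.0, 5.0, 10.0, 20.0, 40.0])        # ng/mL standards
      signal = np.array([4.1, 9.8, 16.2, 28.7, 55.3])      # invented readings
      slope, intercept = np.polyfit(conc, signal, 1)
      sample_signal = 22.0
      print((sample_signal - intercept) / slope, "ng/mL (estimated)")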

  2. The Computerized Table Setting Test for Detecting Unilateral Neglect.

    Directory of Open Access Journals (Sweden)

    Seok Jong Chung

    Full Text Available Patients with unilateral neglect fail to respond normally to stimuli on the left side. To facilitate the evaluation of unilateral spatial neglect, we developed a new application that runs on a tablet device and investigated its feasibility in stroke patients. We made the computerized table setting test (CTST) to run on the tablet computer. Forty acute ischemic stroke patients (20 patients with right hemispheric infarction with neglect, 10 patients with right hemispheric infarction without neglect, and 10 patients with left hemispheric infarction) and 10 healthy controls were prospectively enrolled to validate the CTST. The test requires subjects to set a table by dragging 12 dishes located below the table on the tablet screen. The horizontal deviation of the 12 dishes from the midline of the table, the selection tendency measured by the sequence of dish selection, and the elapsed time for table setting were calculated automatically. Parameters measured by the CTST were correlated with the results of conventional neglect tests. The horizontal deviation was significantly higher in patients with right hemispheric infarction with neglect compared with the other groups. The selection tendency and elapsed time also were significantly different in patients with right hemispheric infarction with neglect compared with the left hemispheric infarction and control groups, but were similar to those with right hemispheric infarction without neglect. The CTST is feasible to administer and comparable with conventional neglect tests. This new application may be useful for the initial diagnosis and follow-up of neglect patients.
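
    The three reported measures lend themselves to simple computations. The Python sketch below shows plausible forms for them; the dish coordinates are invented and the app's exact formulas are not given in the abstract, so these are assumptions:

      import numpy as np

      # Invented final dish placements (x, y) on a normalized table whose
      # midline is x = 0; negative x is the patient's left. Rows are listed
      # in the order the dishes were selected.
      dishes = np.array([[-0.1, 0.4], [0.5, 0.2], [0.7, 0.5], [0.3, 0.8],
                         [0.6, 0.1], [0.2, 0.3], [0.4, 0.6], [0.1, 0.7],
                         [0.8, 0.4], [0.5, 0.9], [0.3, 0.2], [0.6, 0.7]])

      horizontal_deviation = dishes[:, 0].mean()  # >0: placements pushed rightward

      # Selection tendency: do early picks favor one side? Correlate pick rank
      # with the x coordinate of each selected dish.
      rank = np.arange(1, len(dishes) + 1)
      selection_tendency = np.corrcoef(rank, dishes[:, 0])[0, 1]

      print(f"deviation = {horizontal_deviation:+.2f}, "
            f"tendency r = {selection_tendency:+.2f}")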

  3. Strategy for reduced calibration sets to develop quantitative structure-retention relationships in high-performance liquid chromatography

    Energy Technology Data Exchange (ETDEWEB)

    Andries, Jan P.M. [University of Professional Education, Department of Life Sciences, P.O. Box 90116, 4800 RA Breda (Netherlands); Claessens, Henk A. [University of Professional Education, Department of Life Sciences, P.O. Box 90116, 4800 RA Breda (Netherlands); Eindhoven University of Technology, Department of Chemical Engineering and Chemistry, Laboratory of Polymer Chemistry, P.O. Box 513 (Helix, STW 1.35), 5600 MB Eindhoven (Netherlands); Heyden, Yvan Vander [Department of Analytical Chemistry and Pharmaceutical Technology, Vrije Universiteit Brussel-VUB, Laarbeeklaan 103, B-1090 Brussels (Belgium); Buydens, Lutgarde M.C., E-mail: L.Buydens@science.ru.nl [Institute for Molecules and Materials, Radboud University Nijmegen, Toernooiveld 1, 6525 ED Nijmegen (Netherlands)

    2009-10-12

    In high-performance liquid chromatography, quantitative structure-retention relationships (QSRRs) are applied to model the relation between chromatographic retention and quantities derived from the molecular structure of analytes. Classically, a substantial number of test analytes is used to build QSRR models. This makes their application laborious and time consuming. In this work a strategy is presented to build QSRR models based on selected reduced calibration sets. The analytes in the reduced calibration sets are selected from larger sets of analytes by applying the algorithm of Kennard and Stone on the molecular descriptors used in the QSRR concerned. The strategy was applied on three QSRR models of different complexity, relating log k_w or log k with either: (i) log P, the n-octanol-water partition coefficient, (ii) calculated quantum chemical indices (QCI), or (iii) descriptors from the linear solvation energy relationship (LSER). Models were developed and validated for 76 reversed-phase high-performance liquid chromatography systems. From the results we can conclude that it is possible to develop log P models suitable for the future prediction of retentions with as few as seven analytes. For the QCI and LSER models we derived the rule that three selected analytes per descriptor are sufficient. Both the dependent variable space, formed by the retention values, and the independent variable space, formed by the descriptors, are covered well by the reduced calibration sets. Finally, guidelines to construct small calibration sets are formulated.

  4. Analytical Study of High Concentration PCB Paint at the Heavy Water Components Test Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, N.J.

    1998-10-21

    This report provides results of an analytical study of high concentration PCB paint in a shutdown nuclear test reactor located at the US Department of Energy's Savannah River Site (SRS). The study was designed to obtain data relevant for an evaluation of potential hazards associated with the use of and exposure to such paints.

  5. Cost effectiveness of ovarian reserve testing in in vitro fertilization: a Markov decision-analytic model

    NARCIS (Netherlands)

    Moolenaar, Lobke M.; Broekmans, Frank J. M.; van Disseldorp, Jeroen; Fauser, Bart C. J. M.; Eijkemans, Marinus J. C.; Hompes, Peter G. A.; van der Veen, Fulco; Mol, Ben Willem J.

    2011-01-01

    Objective: To compare the cost effectiveness of ovarian reserve testing in in vitro fertilization (IVF). Design: A Markov decision model based on data from the literature and original patient data. Setting: Decision analytic framework. Patients: Computer-simulated cohort of subfertile women aged 20 to 45 years who are eligible for IVF.

  6. Transformational Leadership and Organizational Citizenship Behavior: A Meta-Analytic Test of Underlying Mechanisms.

    Science.gov (United States)

    Nohe, Christoph; Hertel, Guido

    2017-01-01

    Based on social exchange theory, we examined and contrasted attitudinal mediators (affective organizational commitment, job satisfaction) and relational mediators (trust in leader, leader-member exchange; LMX) of the positive relationship between transformational leadership and organizational citizenship behavior (OCB). Hypotheses were tested using meta-analytic path models with correlations from published meta-analyses (761 samples with 227,419 individuals overall). When testing single-mediator models, results supported our expectations that each of the mediators explained the relationship between transformational leadership and OCB. When testing a multi-mediator model, LMX was the strongest mediator. When testing a model with a latent attitudinal mechanism and a latent relational mechanism, the relational mechanism was the stronger mediator of the relationship between transformational leadership and OCB. Our findings help to better understand the underlying mechanisms of the relationship between transformational leadership and OCB.

  7. Analytical Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The Analytical Lab specializes in Oil and Hydraulic Fluid Analysis, Identification of Unknown Materials, Engineering Investigations, Qualification Testing (to support...

  8. Development of an Automated LIBS Analytical Test System Integrated with Component Control and Spectrum Analysis Capabilities

    International Nuclear Information System (INIS)

    Ding Yu; Tian Di; Chen Feipeng; Chen Pengfei; Qiao Shujun; Yang Guang; Li Chunsheng

    2015-01-01

    The present paper proposes an automated Laser-Induced Breakdown Spectroscopy (LIBS) analytical test system, which consists of a LIBS measurement and control platform based on a modular design concept and LIBS qualitative spectrum analysis software developed in C#. The platform provides flexible interfacing and automated control; it is compatible with component models from different manufacturers and is constructed in modularized form for easy expandability. For peak identification, a more robust method with improved stability has been achieved by applying additional smoothing to the calculated slope before peaks are identified. For element identification, an improved main-lines analysis method is applied to the tested LIBS samples; it checks all elements at each spectral peak, to avoid omitting elements without strong spectral lines, and also increases identification speed. Actual applications have been carried out. According to tests, the analytical test system is compatible with components of various models made by different manufacturers. It can automatically control components to acquire experimental data and conduct filtering, peak identification, qualitative analysis, etc. on spectral data. (paper)
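
    The slope-smoothing idea described above can be sketched as follows: smooth the first derivative of the spectrum, then flag peaks where the smoothed slope crosses zero from positive to negative. The Python example below (a Savitzky-Golay filter from scipy on a synthetic two-line spectrum) illustrates the principle only; the original system is written in C# and its parameters are not public:

      import numpy as np
      from scipy.signal import savgol_filter

      # Synthetic LIBS-like spectrum: two emission lines on a noisy background.
      x = np.linspace(0, 100, 2000)
      rng = np.random.default_rng(0)
      y = (np.exp(-0.5 * ((x - 30) / 0.4) ** 2)
           + 0.6 * np.exp(-0.5 * ((x - 62) / 0.5) ** 2)
           + 0.02 * rng.normal(size=x.size))

      # Smooth the first derivative before the peak search: a peak is flagged
      # where the smoothed slope crosses zero from positive to negative and
      # the signal amplitude exceeds a threshold.
      slope = savgol_filter(y, window_length=31, polyorder=3, deriv=1)
      crossings = np.where((slope[:-1] > 0) & (slope[1:] <= 0) & (y[:-1] > 0.3))[0]
      print("peak positions:", np.round(x[crossings], 1))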

  9. Optimal testing input sets for reduced diagnosis time of nuclear power plant digital electronic circuits

    International Nuclear Information System (INIS)

    Kim, D.S.; Seong, P.H.

    1994-01-01

    This paper describes the optimal testing input sets required for the fault diagnosis of nuclear power plant digital electronic circuits. With complicated systems such as very large scale integration (VLSI) circuits, nuclear power plants (NPPs), and aircraft, testing is the major factor in the maintenance of the system. In particular, diagnosis time grows quickly with the complexity of the component. In this research, to reduce diagnosis time, the authors derived optimal testing sets, that is, the minimal testing sets required for detecting a failure and for locating the failed component. For reducing diagnosis time, the technique presented by Hayes fits this approach to testing-set generation best among many conventional methods. However, this method has the following disadvantages: (a) it considers only simple networks, and (b) it indicates only whether the system is in a failed state or not and does not provide a way to locate the failed component. Therefore the authors derived optimal testing input sets that resolve these problems while preserving the advantages of Hayes' method. When they applied the optimal testing sets to the automatic fault diagnosis system (AFDS), which incorporates an advanced artificial intelligence fault diagnosis method, they found that fault diagnosis using the optimal testing sets makes testing the digital electronic circuits much faster than using exhaustive testing input sets; when they applied them to test the Universal (UV) Card, a nuclear power plant digital input/output solid state protection system card, the testing time was reduced by a factor of about 100.
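
    Selecting a minimal set of test inputs that still detects every fault has the flavor of a set-cover problem. The Python sketch below is a greedy illustration of that idea over an invented fault-detection table; it is not the authors' derivation, which additionally addresses fault location:

      def greedy_test_set(detects):
          """Greedy cover: pick tests until every fault is detected by some test.

          detects[t] is the set of faults that test input t detects."""
          uncovered = set().union(*detects.values())
          chosen = []
          while uncovered:
              # Pick the test covering the most still-undetected faults.
              best = max(detects, key=lambda t: len(detects[t] & uncovered))
              chosen.append(best)
              uncovered -= detects[best]
          return chosen

      # Invented fault-detection table for a small digital card.
      detects = {
          "T1": {"f1", "f2", "f5"},
          "T2": {"f2", "f3"},
          "T3": {"f4", "f5"},
          "T4": {"f1", "f3", "f4", "f6"},
      }
      print(greedy_test_set(detects))   # ['T4', 'T1'] covers all six faults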

  10. Analytical Methodology for the Determination of Radium Isotopes in Environmental Samples

    International Nuclear Information System (INIS)

    2010-01-01

    Reliable, comparable and 'fit for purpose' results are an essential requirement for any decision based on analytical measurements. For the analyst, the availability of tested and validated analytical procedures is an extremely important tool for the production of such analytical measurements. For maximum utility, such procedures should be comprehensive, clearly formulated, and readily available to both the analyst and the customer for reference. Since 2004, the environment programme of the IAEA has included activities aimed at the development of a set of procedures for the determination of radionuclides in terrestrial environmental samples. Measurements of radium isotopes are important for radiological and environmental protection, geochemical and geochronological investigations, hydrology, etc. The suite of radium isotopes continues to stimulate the development of new methods for determination of radium in various media. In this publication, the four most routinely used analytical methods for radium determination in biological and environmental samples, i.e. alpha spectrometry, gamma spectrometry, liquid scintillation spectrometry and mass spectrometry, are reviewed.

  11. Analytical support for the B4C control rod test QUENCH-07

    International Nuclear Information System (INIS)

    Homann, C.; Hering, W.; Fernandez Benitez, J.A.; Ortega Bernardo, M.

    2003-04-01

    Degradation of B4C absorber rods during a beyond-design accident in a nuclear power reactor may be a safety concern. Among others, the integral test QUENCH-07 is performed in the FZK QUENCH facility and supported by analytical work within the Euratom Fifth Framework Programme on Nuclear Fission Safety to get a more profound database. Since the test differed substantially from previous QUENCH tests, much more work had to be done for pretest calculations than usual to guarantee the safety of the facility and to derive the test protocol. Several institutions shared in this work with different computer code systems, as used for nuclear reactor safety analyses. Due to this effort, problems could be identified and solved, leading to several modifications of the originally planned test conduct, until a feasible test protocol could be derived and recommended. All calculations showed the same trends. Especially the high temperatures, and hence the small safety margin for the facility, were a concern. In this report, contributions of the various authors engaged in this work are presented. The test QUENCH-07 and the related computational support by the engaged institutions were co-financed by the European Community under the Euratom Fifth Framework Programme on Nuclear Fission Safety 1998 - 2002 (COLOSS Project, contract No. FIKS-CT-1999-00002). (orig.)

  12. Analytic nuclear scattering theories

    International Nuclear Information System (INIS)

    Di Marzio, F.; University of Melbourne, Parkville, VIC

    1999-01-01

    A wide range of nuclear reactions are examined in an analytical version of the usual distorted wave Born approximation. This new approach provides either semi-analytic or fully analytic descriptions of the nuclear scattering processes. The resulting computational simplifications, when used within the limits of validity, allow very detailed tests of both nuclear interaction models and large-basis models of nuclear structure to be performed

  13. Predictive analytics can support the ACO model.

    Science.gov (United States)

    Bradley, Paul

    2012-04-01

    Predictive analytics can be used to rapidly spot hard-to-identify opportunities to better manage care--a key tool in accountable care. When considering analytics models, healthcare providers should: Make value-based care a priority and act on information from analytics models. Create a road map that includes achievable steps, rather than major endeavors. Set long-term expectations and recognize that the effectiveness of an analytics program takes time, unlike revenue cycle initiatives that may show a quick return.

  14. Process for nondestructively testing with radioactive gas using a chill set sealant

    International Nuclear Information System (INIS)

    Gibbons, C.B.

    1975-01-01

    An article surface is nondestructively tested for substantially invisible surface voids by adsorbing a radioactive gas thereon. The adsorbed radioactive gas is disproportionately retained on those surfaces presented by the substantially invisible surface voids as compared to the remaining surfaces of the article contacted by the radioactive gas. The radiation released by the radioactive gas remaining adsorbed is used to identify the substantially invisible voids. To immobilize the radioactive gas adjacent to or within the surface voids, a sealant composition is provided which is capable of being chill set. The temperatures of the article surface to be tested and of the sealant composition are then related so that the article surface is at a temperature below the chill set temperature of the sealant composition and the sealant composition is at a temperature above its chill set temperature. The article portion to be tested is then coated with the sealant composition to form a chill set coating thereon of substantially uniform thickness. (U.S.)

  15. An analytic method for S-expansion involving resonance and reduction

    Energy Technology Data Exchange (ETDEWEB)

    Ipinza, M.C.; Penafiel, D.M. [Departamento de Fisica, Universidad de Concepcion (Chile); DISAT, Politecnico di Torino (Italy); Istituto Nazionale di Fisica Nucleare (INFN), Sezione di Torino (Italy); Lingua, F. [DISAT, Politecnico di Torino (Italy); Ravera, L. [DISAT, Politecnico di Torino (Italy); Istituto Nazionale di Fisica Nucleare (INFN), Sezione di Torino (Italy)

    2016-11-15

    In this paper we describe an analytic method able to give the multiplication table(s) of the set(s) involved in an S-expansion process (with either resonance or 0_S-resonant reduction) for reaching a target Lie (super)algebra from a starting one, after having properly chosen the partitions over subspaces of the considered (super)algebras. This analytic method gives us a simple set of expressions to find the subset decomposition of the set(s) involved in the process. Then, we use the information coming from both the initial (super)algebra and the target one to reach the multiplication table(s) of the mentioned set(s). Finally, we check associativity with an auxiliary computational algorithm, in order to understand whether the obtained set(s) can describe semigroup(s) or just abelian set(s) connecting two (super)algebras. We also give some interesting examples of application, which check and corroborate our analytic procedure and also generalize some results already presented in the literature. (copyright 2016 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)
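
    To illustrate the final associativity check mentioned above, a minimal Python sketch is given below; the 3x3 multiplication table is a hypothetical example (elements encoded as indices 0..n-1), not one of the paper's S-expansion sets.

```python
import itertools

def is_associative(table):
    """Check associativity of a finite magma given as an n x n table.

    table[a][b] is the product a*b with elements labelled 0..n-1.
    Returns the first violating triple, or None if the table is associative.
    """
    n = len(table)
    for a, b, c in itertools.product(range(n), repeat=3):
        if table[table[a][b]][c] != table[a][table[b][c]]:
            return (a, b, c)
    return None

# Hypothetical table (Z_3 under addition, which is associative):
z3 = [[0, 1, 2], [1, 2, 0], [2, 0, 1]]
print(is_associative(z3))  # None, so this table defines a semigroup
```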

  16. Nuclear forensics: strategies and analytical techniques

    International Nuclear Information System (INIS)

    Marin, Rafael C.; Sarkis, Jorge E.S.; Pestana, Rafael C.B.

    2013-01-01

    The development of nuclear forensics as a field of science arose in response to international demand for methods to investigate the illicit trafficking of nuclear materials. After seizure, unknown nuclear material is collected and analyzed by a set of analytical methods. The fingerprints of these materials can be identified and used during the investigations. Data interpretation is an extensive process aiming to validate the hypotheses made by the experts, and can help confirm the origin of the seized nuclear materials at the end of the process or investigation. This work presents the set of measures and analytical methods that nuclear forensics has inherited from several fields of science. The main characteristics of these methods are evaluated, and the analytical techniques employed to determine the fingerprint of nuclear materials are described. (author)

  17. Germs of local automorphisms of real analytic CR structures and analytic dependence on the k-jets

    OpenAIRE

    ZAITSEV, DMITRI

    1997-01-01

    The topic of the paper is the study of germs of local holomorphisms f between C^n and C^(n') such that f(M) ⊂ M' and df(T^cM) = T^cM' for M ⊂ C^n and M' ⊂ C^(n') generic real-analytic CR submanifolds of arbitrary codimensions. It is proved that for M minimal and M' finitely nondegenerate, such germs depend analytically on their jets. As a corollary, an analytic structure on the set of all germs of this type is obtained.

  18. The development and validation of the Closed-set Mandarin Sentence (CMS) test.

    Science.gov (United States)

    Tao, Duo-Duo; Fu, Qian-Jie; Galvin, John J; Yu, Ya-Feng

    2017-09-01

    Matrix-styled sentence tests offer a closed-set paradigm that may be useful when evaluating speech intelligibility. Ideally, sentence test materials should reflect the distribution of phonemes within the target language. We developed and validated the Closed-set Mandarin Sentence (CMS) test to assess Mandarin speech intelligibility in noise. CMS test materials were selected to be familiar words and to represent the natural distribution of vowels, consonants, and lexical tones found in Mandarin Chinese. Ten key words in each of five categories (Name, Verb, Number, Color, and Fruit) were produced by a native Mandarin talker, resulting in a total of 50 words that could be combined to produce 100,000 unique sentences. Normative data were collected from 10 normal-hearing, adult Mandarin-speaking Chinese listeners using a closed-set test paradigm. Two test runs were conducted for each subject, with 20 sentences per run generated randomly while ensuring that each word was presented only twice per run. First, the levels of the words in each category were adjusted to produce equal intelligibility in noise. Test-retest reliability for word-in-sentence recognition was excellent according to Cronbach's alpha (0.952). After the category level adjustments, speech reception thresholds (SRTs) for sentences in noise, defined as the signal-to-noise ratio (SNR) that produced 50% correct whole-sentence recognition, were measured adaptively by adjusting the SNR according to the correctness of the response. The mean SRT was -7.9 (SE=0.41) and -8.1 (SE=0.34) dB for runs 1 and 2, respectively. The mean standard deviation across runs was 0.93 dB, and paired t-tests showed no significant difference between runs 1 and 2 (p=0.74), despite random sentences being generated for each run and each subject. The results suggest that the CMS provides a large stimulus set with which to repeatedly and reliably measure Mandarin-speaking listeners' speech understanding in noise using a closed-set paradigm.
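
    The adaptive SRT tracking described above (adjusting the SNR after each response) is commonly implemented as a one-down/one-up staircase, which converges on the 50%-correct point; a generic Python sketch follows. The step size, starting SNR, trial count, and reversal-averaging rule are hypothetical defaults, not the CMS parameters.

```python
def adaptive_srt(present_trial, start_snr=0.0, step=2.0, n_trials=20):
    """Generic one-down/one-up staircase targeting 50% correct.

    present_trial(snr) must run one sentence trial at the given SNR and
    return True if the whole sentence was repeated correctly. The SRT is
    estimated as the mean SNR over the reversal points.
    """
    snr, last_correct, reversals = start_snr, None, []
    for _ in range(n_trials):
        correct = present_trial(snr)
        if last_correct is not None and correct != last_correct:
            reversals.append(snr)              # track direction changes
        last_correct = correct
        snr += -step if correct else step      # harder if correct, easier if not
    return sum(reversals) / len(reversals) if reversals else snr
```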

  19. Helios: Understanding Solar Evolution Through Text Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Randazzese, Lucien [SRI International, Menlo Park, CA (United States)

    2016-12-02

    This proof-of-concept project focused on developing, testing, and validating a range of bibliometric, text analytic, and machine-learning based methods to explore the evolution of three photovoltaic (PV) technologies: Cadmium Telluride (CdTe), Dye-Sensitized solar cells (DSSC), and Multi-junction solar cells. The analytical approach to the work was inspired by previous work by the same team to measure and predict the scientific prominence of terms and entities within specific research domains. The goal was to create tools that could assist domain-knowledgeable analysts in investigating the history and path of technological developments in general, with a focus on analyzing step-function changes in performance, or “breakthroughs,” in particular. The text-analytics platform developed during this project was dubbed Helios. The project relied on computational methods for analyzing large corpora of technical documents. For this project we ingested technical documents from the following sources into Helios: Thomson Scientific Web of Science (papers), the U.S. Patent & Trademark Office (patents), the U.S. Department of Energy (technical documents), the U.S. National Science Foundation (project funding summaries), and a hand curated set of full-text documents from Thomson Scientific and other sources.

  20. Evaluation of analytical performance based on partial order methodology.

    Science.gov (United States)

    Carlsen, Lars; Bruggemann, Rainer; Kenessova, Olga; Erzhigitov, Erkin

    2015-01-01

    Classical measurements of performance are typically based on linear scales. However, in analytical chemistry a simple scale may not be sufficient to analyze analytical performance appropriately; here, partial order methodology can be helpful. In the context described here, partial order analysis can be seen as an ordinal analysis of data matrices, especially suited to simplifying the relative comparison of objects on the basis of their data profiles (the ordered set of values an object has). Hence, partial order methodology offers a unique possibility to evaluate analytical performance. In the present paper, data as provided by laboratories through interlaboratory comparisons or proficiency testing are used as an illustrative example; however, the presented scheme is likewise applicable to the comparison of analytical methods, or simply as a tool for the optimization of an analytical method. The methodology can be applied without presumptions or pretreatment of the analytical data in order to evaluate the analytical performance taking all indicators into account simultaneously, thus elucidating a "distance" from the true value. In the present illustrative example it is assumed that the laboratories analyze a given sample several times and subsequently report the mean value, the standard deviation and the skewness, which are used simultaneously for the evaluation of the analytical performance. The analysis leads to information concerning (1) a partial ordering of the laboratories, (2) a "distance" to the reference laboratory and (3) a classification based on the concept of "peculiar points". Copyright © 2014 Elsevier B.V. All rights reserved.
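
    As a hedged illustration of the ordinal idea (one laboratory is ranked above another only if it is at least as good on every indicator), the following Python sketch builds the dominance relation of a partial order from per-laboratory indicator profiles; the three example profiles are hypothetical.

```python
def dominates(p, q):
    """p dominates q if p is at least as close to the ideal (0) in every indicator."""
    return all(a <= b for a, b in zip(p, q)) and p != q

# Hypothetical (|bias|, sd, |skewness|) profiles per laboratory; smaller is better:
labs = {"A": (0.1, 0.5, 0.2), "B": (0.2, 0.6, 0.3), "C": (0.3, 0.4, 0.1)}
order = [(x, y) for x in labs for y in labs if dominates(labs[x], labs[y])]
print(order)  # [('A', 'B')]; A/C and B/C are incomparable, as in a partial order
```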

  1. Vibration Based Diagnosis for Planetary Gearboxes Using an Analytical Model

    Directory of Open Access Journals (Sweden)

    Liu Hong

    2016-01-01

    The application of conventional vibration-based diagnostic techniques to planetary gearboxes is challenging because of the complexity of the frequency components in the measured spectrum, which results from the relative motion between the rotating planets and the fixed accelerometer. In practice, since the fault signatures are usually contaminated by noise and by vibrations from other mechanical components of the gearbox, the diagnostic efficacy may deteriorate further. It is therefore essential to develop a novel vibration-based scheme to diagnose gear failures in planetary gearboxes. Following a brief literature review, the paper introduces an analytical model of planetary gear-sets developed by the authors in previous work, which can predict the distinct behavior of fault-induced sidebands. This analytical model is easy to implement because the only prerequisite information is the basic geometry of the planetary gear-set. Afterwards, an automated diagnostic scheme is proposed to cope with the challenges associated with the characteristic configuration of planetary gearboxes. The proposed vibration-based scheme integrates the analytical model, a denoising algorithm, and frequency-domain indicators into one synergistic system for the detection and identification of damaged gear teeth in planetary gearboxes. Its performance is validated with dynamic simulations and experimental data from a planetary gearbox test rig.
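
    The authors' sideband model itself is not reproduced in the abstract; as a hedged stand-in, the sketch below computes the textbook kinematic frequencies of a planetary stage with a fixed ring gear from its basic geometry (tooth counts and carrier speed), which is the kind of input the model requires. These are standard planetary kinematics relations, not the authors' full dynamic model.

```python
def planetary_frequencies(z_sun, z_ring, n_planets, f_carrier):
    """Textbook characteristic frequencies for a fixed-ring planetary stage.

    z_sun, z_ring: tooth counts; f_carrier: carrier rotation frequency [Hz].
    Returns the gear-mesh frequency and local-fault frequencies [Hz].
    """
    f_mesh = z_ring * f_carrier                     # ring gear stationary
    z_planet = (z_ring - z_sun) / 2                 # standard geometry constraint
    return {
        "mesh": f_mesh,
        "sun_fault": n_planets * f_mesh / z_sun,    # defective sun tooth
        "planet_fault": f_mesh / z_planet,          # defect on one planet face
        "ring_fault": n_planets * f_mesh / z_ring,  # defective ring tooth
    }

print(planetary_frequencies(z_sun=20, z_ring=70, n_planets=3, f_carrier=10.0))
```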

  2. Measuring myokines with cardiovascular functions: pre-analytical variables affecting the analytical output.

    Science.gov (United States)

    Lombardi, Giovanni; Sansoni, Veronica; Banfi, Giuseppe

    2017-08-01

    In the last few years, a growing number of molecules have been associated with an endocrine function of the skeletal muscle. Circulating myokine levels, in turn, have been associated with several pathophysiological conditions, including cardiovascular ones. However, data from different studies are often not completely comparable, or even discordant. This is due, at least in part, to the whole set of conditions related to the preparation of the patient prior to blood sampling, the blood sampling procedure, and sample processing and/or storage. This entire process constitutes the pre-analytical phase. The importance of the pre-analytical phase is often not considered; however, in routine diagnostics, about 70% of errors occur in this phase. Moreover, errors made during the pre-analytical phase are carried over into the analytical phase and affect the final output. In research, for example, when samples are collected over a long period and by different laboratories, standardized procedures for sample collection and correct procedures for sample storage are acknowledged to be essential. In this review, we discuss the pre-analytical variables potentially affecting the measurement of myokines with cardiovascular functions.

  3. A test on analytic continuation of thermal imaginary-time data

    International Nuclear Information System (INIS)

    Burnier, Y.; Laine, M.; Mether, L.

    2011-01-01

    Some time ago, Cuniberti et al. have proposed a novel method for analytically continuing thermal imaginary-time correlators to real time, which requires no model input and should be applicable with finite-precision data as well. Given that these assertions go against common wisdom, we report on a naive test of the method with an idealized example. We do encounter two problems, which we spell out in detail; this implies that systematic errors are difficult to quantify. On a more positive note, the method is simple to implement and allows for an empirical recipe by which a reasonable qualitative estimate for some transport coefficient may be obtained, if statistical errors of an ultraviolet-subtracted imaginary-time measurement can be reduced to roughly below the per mille level. (orig.)

  4. Analytic family of post-merger template waveforms

    Science.gov (United States)

    Del Pozzo, Walter; Nagar, Alessandro

    2017-06-01

    Building on the analytical description of the post-merger (ringdown) waveform of coalescing, nonprecessing, spinning binary black holes introduced by Damour and Nagar [Phys. Rev. D 90, 024054 (2014), 10.1103/PhysRevD.90.024054], we propose an analytic, closed-form, time-domain representation of the ℓ = m = 2 gravitational radiation mode emitted after merger. This expression is given as a function of the component masses and dimensionless spins (m_{1,2}, χ_{1,2}) of the two inspiraling objects, as well as of the mass M_BH and (complex) frequency σ_1 of the fundamental quasinormal mode of the remnant black hole. Our proposed template is obtained by fitting the post-merger waveform part of several publicly available numerical relativity simulations from the Simulating eXtreme Spacetimes (SXS) catalog and then suitably interpolating over (symmetric) mass ratio and spins. We show that this analytic expression accurately reproduces (~0.01 rad) the phasing of the post-merger data of other data sets not used in its construction. This is notably the case of the spin-aligned run SXS:BBH:0305, whose intrinsic parameters are consistent with the 90% credible intervals reported in the parameter-estimation follow-up of GW150914 by B.P. Abbott et al. [Phys. Rev. Lett. 116, 241102 (2016), 10.1103/PhysRevLett.116.241102]. Using SXS waveforms as "experimental" data, we further show that our template could be used on the actual GW150914 data to perform a new measurement of the complex frequency of the fundamental quasinormal mode so as to exploit the complete (high signal-to-noise-ratio) post-merger waveform. We assess the usefulness of our proposed template by analyzing, in a realistic setting, SXS full inspiral-merger-ringdown waveforms and constructing posterior probability distribution functions for the central frequency and damping time of the first overtone of the fundamental quasinormal mode as well as for the physical parameters of the systems. We also briefly explore the possibility

  5. The PLUS family: A set of computer programs to evaluate analytical solutions of the diffusion equation and thermoelasticity

    International Nuclear Information System (INIS)

    Montan, D.N.

    1987-02-01

    This report is intended to describe, document and provide instructions for the use of new versions of a set of computer programs commonly referred to as the PLUS family. These programs were originally designed to numerically evaluate simple analytical solutions of the diffusion equation. The new versions include linear thermo-elastic effects from thermal fields calculated by the diffusion equation. After the older versions of the PLUS family were documented a year ago, it was realized that the techniques employed in the programs were well suited to the addition of linear thermo-elastic phenomena. This has been implemented and this report describes the additions. 3 refs., 14 figs

  6. Final report on the proficiency test of the Analytical Laboratories for the Measurement of Environmental Radioactivity (ALMERA) network

    International Nuclear Information System (INIS)

    Shakhashiro, A.; Radecki, Z.; Trinkl, A.; Sansone, U.; Benesch, T.

    2005-08-01

    This report presents the statistical evaluation of results from the analysis of 12 radionuclides in 8 samples within the framework of the first proficiency test of the Analytical Laboratories for the Measurement of Environmental RAdioactivity (ALMERA) network, organized in 2001-2002 by the Chemistry Unit of the Agency's Laboratory in Seibersdorf. The results were evaluated using appropriate statistical means to assess laboratory analytical performance and to estimate the overall performance for the determination of each radionuclide. Evaluation of the analytical data for gamma emitting radionuclides showed that 68% of the data obtained a 'Passed' final score for both the trueness and precision criteria applied in this exercise; for transuranic radionuclides, however, only 58% of the data met the same criteria. (author)

  7. Uniform approximation is more appropriate for Wilcoxon Rank-Sum Test in gene set analysis.

    Directory of Open Access Journals (Sweden)

    Zhide Fang

    Gene set analysis is widely used to facilitate biological interpretation in analyses of differential expression from high-throughput profiling data. The Wilcoxon Rank-Sum (WRS) test is one of the commonly used methods in gene set enrichment analysis: it compares the ranks of genes in a gene set against those of genes outside the gene set. The method is easy to implement, and it eliminates the dichotomization of genes into significant and non-significant in a competitive hypothesis test. Because of the large number of genes being examined, it is impractical to calculate the exact null distribution of the WRS statistic, so the normal distribution is commonly used as an approximation. However, as we demonstrate in this paper, the normal approximation is problematic when a gene set with a relatively small number of genes is tested against the large number of genes in the complementary set. In this situation, a uniform approximation is substantially more powerful, more accurate, and less computationally intensive. We demonstrate the advantage of the uniform approximation in Gene Ontology (GO) term analysis using simulations and real data sets.
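
    One way to read the uniform idea (my gloss, not necessarily the paper's exact construction): when a gene set of size n is tiny relative to the genome, its scaled ranks behave approximately like n independent Uniform(0,1) draws, so the scaled rank-sum follows the Irwin-Hall distribution. A hedged Python sketch comparing the two tail approximations:

```python
import math

def irwin_hall_cdf(x, n):
    """CDF at x of the sum of n i.i.d. Uniform(0,1) variables."""
    total = sum((-1) ** k * math.comb(n, k) * (x - k) ** n
                for k in range(int(math.floor(x)) + 1))
    return total / math.factorial(n)

def normal_cdf(x, n):
    """Normal approximation with matching mean n/2 and variance n/12."""
    return 0.5 * (1 + math.erf((x - n / 2) / math.sqrt(n / 6)))

# Upper-tail p-values for a small gene set (n = 5) with scaled rank-sum 4.5:
n, s = 5, 4.5
print(1 - irwin_hall_cdf(s, n))  # ~2.6e-4
print(1 - normal_cdf(s, n))      # ~9.7e-4 (the normal tail is noticeably off)
```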

  8. Analytical Tools to Improve Optimization Procedures for Lateral Flow Assays

    Directory of Open Access Journals (Sweden)

    Helen V. Hsieh

    2017-05-01

    Immunochromatographic or lateral flow assays (LFAs) are inexpensive, easy-to-use, point-of-care medical diagnostic tests found in arenas ranging from a doctor's office in Manhattan to a rural medical clinic in low-resource settings. The simplicity of the LFA itself belies the complex task of optimization required to make the test sensitive, rapid and easy to use. Currently, manufacturers develop LFAs by empirical optimization of material components (e.g., analytical membranes, conjugate pads and sample pads), biological reagents (e.g., antibodies, blocking reagents and buffers) and the design of the delivery geometry. In this paper, we review conventional optimization and then focus on the latter, outlining analytical tools, such as dynamic light scattering and optical biosensors, as well as methods, such as microfluidic flow design and mechanistic models. We are applying these tools to find non-obvious optima of lateral flow assays for improved sensitivity, specificity and manufacturing robustness.

  9. A set of X-ray test objects for quality control in television fluoroscopy

    International Nuclear Information System (INIS)

    Hay, G.A.; Clarke, O.F.; Coleman, N.J.; Cowen, A.R.

    1985-01-01

    The history of performance testing in Leeds of television fluoroscopic systems is briefly outlined. Using the visual, physical and technological requirements as a basis, a set of nine test objects for quality control in television fluoroscopy is described. The factors measured by the test objects are listed in the introduction; the test objects and their function are fully described in the remainder of the paper. The test objects, in conjunction with a television oscilloscope, give both subjective and objective information about the X-ray system. Three of the test objects enable the physicist or engineer to adjust certain aspects of the performance of the X-ray system. The set of nine test objects is available commercially. (author)

  10. Analytic Coleman-de Luccia Geometries

    Energy Technology Data Exchange (ETDEWEB)

    Dong, Xi; /Stanford U., ITP /Stanford U., Phys. Dept. /SLAC; Harlow, Daniel; /Stanford U., ITP /Stanford U., Phys. Dept.

    2012-02-16

    We present the necessary and sufficient conditions for a Euclidean scale factor to be a solution of the Coleman-de Luccia equations for some analytic potential V(ψ), with a Lorentzian continuation describing the growth of a bubble of lower-energy vacuum surrounded by higher-energy vacuum. We then give a set of explicit examples that satisfy the conditions and thus are closed-form analytic examples of Coleman-de Luccia geometries.

  11. Extending Climate Analytics-as-a-Service to the Earth System Grid Federation

    Science.gov (United States)

    Tamkin, G.; Schnase, J. L.; Duffy, D.; McInerney, M.; Nadeau, D.; Li, J.; Strong, S.; Thompson, J. H.

    2015-12-01

    We are building three extensions to prior-funded work on climate analytics-as-a-service that will benefit the Earth System Grid Federation (ESGF) as it addresses the Big Data challenges of future climate research: (1) We are creating a cloud-based, high-performance Virtual Real-Time Analytics Testbed supporting a select set of climate variables from six major reanalysis data sets. This near real-time capability will enable advanced technologies like the Cloudera Impala-based Structured Query Language (SQL) query capabilities and Hadoop-based MapReduce analytics over native NetCDF files while providing a platform for community experimentation with emerging analytic technologies. (2) We are building a full-featured Reanalysis Ensemble Service comprising monthly means data from six reanalysis data sets. The service will provide a basic set of commonly used operations over the reanalysis collections. The operations will be made accessible through NASA's climate data analytics Web services and our client-side Climate Data Services (CDS) API. (3) We are establishing an Open Geospatial Consortium (OGC) WPS-compliant Web service interface to our climate data analytics service that will enable greater interoperability with next-generation ESGF capabilities. The CDS API will be extended to accommodate the new WPS Web service endpoints as well as ESGF's Web service endpoints. These activities address some of the most important technical challenges for server-side analytics and support the research community's requirements for improved interoperability and improved access to reanalysis data.

  12. Developing Learning Analytics Design Knowledge in the "Middle Space": The Student Tuning Model and Align Design Framework for Learning Analytics Use

    Science.gov (United States)

    Wise, Alyssa Friend; Vytasek, Jovita Maria; Hausknecht, Simone; Zhao, Yuting

    2016-01-01

    This paper addresses a relatively unexplored area in the field of learning analytics: how analytics are taken up and used as part of teaching and learning processes. Initial steps are taken towards developing design knowledge for this "middle space," with a focus on students as analytics users. First, a core set of challenges for…

  13. Determination of natural and depleted uranium in urine at the ppt level: an interlaboratory analytical exercise

    International Nuclear Information System (INIS)

    D'Agostino, P.A.; Ough, E.A.; Glover, S.E.; Vallerand, A.L.

    2002-10-01

    An analytical exercise was initiated in order to identify analytical procedures with the capacity to measure uranium isotope ratios (²³⁸U/²³⁵U) in urine samples containing less than 1 μg uranium/L urine. A host laboratory was tasked with the preparation of six sets (12 samples per set) of synthetic urine samples spiked with varying amounts of natural and depleted (0.2% ²³⁵U) uranium. The sets of samples contained total uranium in the range 25 ng U/L urine to 770 ng U/L urine, with isotope ratios (²³⁸U/²³⁵U) from 137.9 (natural uranium) to 215 (~50% depleted uranium). Sets of samples were shipped to five testing laboratories (four Canadian and one European) for total and isotopic assay. The techniques employed in the analyses included sector field inductively coupled plasma mass spectrometry (ICP-SF-MS), quadrupole inductively coupled plasma mass spectrometry (ICP-Q-MS), thermal ionization mass spectrometry (TIMS) and neutron activation analysis (NAA). Full results were obtained from three testing labs (ICP-SF-MS, ICP-Q-MS and TIMS); their results, plus partial results from the NAA lab, have been included in this report. Total uranium and isotope ratio results obtained by ICP-SF-MS and ICP-Q-MS were in good agreement with the host lab values. Neutron activation analysis and TIMS reported total uranium concentrations that differed from the host lab values. An incomplete set of isotope ratios was obtained from the NAA lab, with some results indicating enriched uranium (%²³⁵U > 0.7). Based on the reported results, the four analytical procedures were ranked: ICP-SF-MS (1), ICP-Q-MS (2), TIMS (3) and NAA (4). (author)
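
    As a sanity check on the quoted ratios (my own back-of-envelope arithmetic, neglecting the trace ²³⁴U contribution): natural uranium with 0.72 atom% ²³⁵U gives a ²³⁸U/²³⁵U ratio of about 138, and a 50:50 blend with depleted uranium at 0.2% ²³⁵U gives about 216, consistent with the stated range. In code:

```python
def ratio_238_235(frac_235):
    """238U/235U atom ratio, neglecting the trace 234U contribution."""
    return (1.0 - frac_235) / frac_235

nat, dep = 0.0072, 0.0020        # atom fractions of 235U
blend = 0.5 * nat + 0.5 * dep    # 50:50 natural/depleted mixture
print(ratio_238_235(nat))        # ~137.9 (natural uranium)
print(ratio_238_235(blend))      # ~216 (close to the quoted 215)
```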

  14. An active learning representative subset selection method using net analyte signal

    Science.gov (United States)

    He, Zhonghai; Ma, Zhenhe; Luan, Jingmin; Cai, Xi

    2018-05-01

    To guarantee accurate predictions, representative samples are needed when building a calibration model for spectroscopic measurements. However, it is generally not known whether a sample is representative before its concentration has been measured, which is both time-consuming and expensive. In this paper, a method to determine whether a sample should be selected into a calibration set is presented. The selection is based on the difference in the Euclidean norm of the net analyte signal (NAS) vector between the candidate and existing samples. First, the concentrations and spectra of a group of samples are used to compute the projection matrix, NAS vectors, and scalar values. Next, the NAS vectors of candidate samples are computed by multiplying the projection matrix with the spectra of the samples, and the scalar value of each NAS is obtained by computing its norm. The distance between the candidate set and the selected set is then computed, and the samples with the largest distance are added to the selected set sequentially. Finally, the concentration of the analyte is measured so that the sample can be used as a calibration sample. A validation test shows that the presented method is more efficient than random selection; as a result, the amount of time and money spent on reference measurements is greatly reduced.
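
    A hedged NumPy sketch of the selection idea follows; building the NAS space as the orthogonal complement of the interferent spectra is one standard construction, and the array names are illustrative, not the paper's notation.

```python
import numpy as np

def nas_select(candidates, selected, interferent_spectra, k):
    """Greedily pick k candidates whose NAS norms are farthest from those selected.

    candidates: (m, p) candidate spectra; selected: (s, p) spectra already in
    the calibration set (s >= 1); interferent_spectra: (q, p) spectra spanning
    the interferent space.
    """
    A = interferent_spectra.T                             # (p, q)
    P = np.eye(A.shape[0]) - A @ np.linalg.pinv(A)        # projector onto NAS space
    cand_norm = np.linalg.norm(candidates @ P, axis=1)    # NAS scalar values
    sel_norm = list(np.linalg.norm(selected @ P, axis=1))
    chosen = []
    for _ in range(k):
        # distance of each candidate norm to its nearest selected norm
        dist = [min(abs(c - s) for s in sel_norm) for c in cand_norm]
        i = int(np.argmax(dist))
        chosen.append(i)
        sel_norm.append(float(cand_norm[i]))              # its distance drops to zero
    return chosen
```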

  15. The detection of problem analytes in a single proficiency test challenge in the absence of the Health Care Financing Administration rule violations.

    Science.gov (United States)

    Cembrowski, G S; Hackney, J R; Carey, N

    1993-04-01

    The Clinical Laboratory Improvement Act of 1988 (CLIA 88) has dramatically changed proficiency testing (PT) practices, having mandated (1) satisfactory PT performance for certain analytes as a condition of laboratory operation, (2) fixed PT limits for many of these "regulated" analytes, and (3) an increased number of PT specimens (n = 5) for each testing cycle. For many of these analytes, the fixed limits are much broader than the previously employed Standard Deviation Index (SDI) criteria. Paradoxically, there may now be less incentive to identify and evaluate analytically significant outliers to improve the analytical process. Previously described "control rules" for evaluating these PT results are unworkable, as they consider only two or three results. We used Monte Carlo simulations of Kodak Ektachem analyzers participating in PT to determine optimal control rules for the identification of PT results that are inconsistent with those from other laboratories using the same methods. The analysis of three representative analytes (potassium, creatine kinase, and iron) was simulated with varying intra-instrument and inter-instrument standard deviations (s_i and s_g, respectively), obtained from College of American Pathologists (Northfield, Ill) Quality Assurance Services data and Proficiency Test data, respectively. Analytical errors were simulated for each of the analytes and evaluated in terms of multiples of the interlaboratory SDI. Simple control rules for detecting systematic and random error were evaluated with power function graphs, i.e. graphs of the probability of error detection versus the magnitude of error. Based on the simulation results, we recommend screening all analytes for the occurrence of two or more observations exceeding the same +/- 1 SDI limit. For any analyte satisfying this condition, the mean of the observations should be calculated. For analytes with s_g/s_i ratios between 1.0 and 1.5, a significant systematic error is signaled by the mean exceeding 1.0 SDI. Significant random error
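
    A minimal Python sketch of the recommended screening rule as I read it (flag an analyte when two or more of its PT results exceed the same ±1 SDI limit and the mean of the results exceeds 1.0 SDI, the threshold quoted for s_g/s_i between 1.0 and 1.5):

```python
def flag_systematic_error(sdi_values, limit=1.0, mean_threshold=1.0):
    """Screen one analyte's PT results (in SDI units) for systematic error.

    Flags the analyte if two or more results exceed the same +/- limit
    and the mean of all results exceeds mean_threshold in magnitude.
    """
    high = sum(1 for v in sdi_values if v > limit)
    low = sum(1 for v in sdi_values if v < -limit)
    if high < 2 and low < 2:
        return False
    mean = sum(sdi_values) / len(sdi_values)
    return abs(mean) > mean_threshold

print(flag_systematic_error([1.2, 1.4, 0.8, 1.1, 0.9]))  # True: mean = 1.08 SDI
```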

  16. Coagulation Tests and Selected Biochemical Analytes in Dairy Cows with Hepatic Lipidosis

    Directory of Open Access Journals (Sweden)

    S. Padilla-Arellanes

    2007-01-01

    The aim of this study was to determine the values of, and changes in, conventional and optimised clotting tests, as well as selected biochemical analytes, during hepatic lipidosis in postpartum dairy cows. Ten healthy Holstein cows and ten with hepatic lipidosis were selected based upon clinical history, clinical examination, liver biopsy, flotation test and histological analysis of hepatic tissue. Prothrombin time (PT) and partial thromboplastin time (PTT) were determined in non-diluted and diluted blood plasma samples. Clotting times determined in diluted plasma samples were prolonged in cows with hepatic lipidosis, and there was a difference in the PT value at both the 50% and 25% plasma dilutions between the two groups of animals (P = 0.004 and P = 0.001). Significant differences between healthy animals and cows with hepatic lipidosis were observed in blood serum values for free fatty acids (FFA), aspartate aminotransferase (AST) and triacylglycerols (P = 0.001, P = 0.007 and P = 0.044, respectively). FFA and liver biopsy are better diagnostic indicators for hepatic lipidosis than coagulation tests. The optimised PT is prolonged in cows with hepatic lipidosis and can detect this alteration, which cannot be appreciated using the conventional PT test.

  17. Dilution testing using rapid diagnostic tests in a HIV diagnostic algorithm: a novel alternative for confirmation testing in resource limited settings.

    Science.gov (United States)

    Shanks, Leslie; Siddiqui, M Ruby; Abebe, Almaz; Piriou, Erwan; Pearce, Neil; Ariti, Cono; Masiga, Johnson; Muluneh, Libsework; Wazome, Joseph; Ritmeijer, Koert; Klarkowski, Derryck

    2015-05-14

    Current WHO testing guidelines for resource-limited settings diagnose HIV on the basis of screening tests without a confirmation test, due to cost constraints. This carries a potential risk of false positive HIV diagnosis. In this paper, we evaluate the dilution test, a novel method for confirmation testing which is simple, rapid, and low cost. The principle of the dilution test is to alter the sensitivity of a rapid diagnostic test (RDT) by diluting the sample, in order to screen out the cross-reacting antibodies responsible for falsely positive RDT results. Participants were recruited from two testing centres in Ethiopia where a tiebreaker algorithm using 3 different RDTs in series is used to diagnose HIV. All samples positive on the initial screening RDT, and every 10th negative sample, underwent testing with the gold standard and the dilution test. Dilution testing was performed using the Determine™ rapid diagnostic test at 6 different dilutions. Results were compared to the gold standard of Western Blot; where Western Blot was indeterminate, PCR testing determined the final result. 2895 samples were recruited to the study; 247 were positive, giving a prevalence of 8.5% (247/2895). A total of 495 samples underwent dilution testing. The RDT diagnostic algorithm misclassified 18 samples as positive. Dilution at the level of 1/160 correctly identified all 18 false positives, at the cost of a single false negative result (sensitivity 99.6%, 95% CI 97.8-100; specificity 100%, 95% CI 98.5-100). Concordance between the gold standard and the 1/160 dilution strength was 99.8%. This study provides proof of concept for a new, low-cost method of confirming HIV diagnosis in resource-limited settings. It has potential for use as a supplementary test in a confirmatory algorithm, whereby double-positive RDT results undergo dilution testing, with positive results confirming HIV infection. Negative results require nucleic acid testing to rule out false

  18. Correlation of finite element free vibration predictions using random vibration test data. M.S. Thesis - Cleveland State Univ.

    Science.gov (United States)

    Chambers, Jeffrey A.

    1994-01-01

    Finite element analysis is regularly used during the engineering cycle of mechanical systems to predict the response to static, thermal, and dynamic loads. The finite element model (FEM) used to represent the system is often correlated with physical test results to determine the validity of the analytical results. Results from dynamic testing provide one means of performing this correlation. One of the most common methods of measuring accuracy is classical modal testing, whereby vibratory mode shapes are compared to mode shapes provided by the finite element analysis. The degree of correlation between the test and analytical mode shapes can be shown mathematically using the cross-orthogonality check. A great deal of time and effort can be expended in generating the set of test-acquired mode shapes needed for the cross-orthogonality check. In most situations, response data from vibration tests are digitally processed to generate the mode shapes from a combination of modal parameters, forcing functions, and recorded response data. An alternate method is proposed in which the same correlation of analytical and test-acquired mode shapes can be achieved without conducting the modal survey. Instead, a procedure is detailed in which a minimum of test information, specifically the acceleration response data from a random vibration test, is used to generate a set of equivalent local accelerations to be applied to the reduced analytical model at discrete points corresponding to the test measurement locations. The static solution of the analytical model then produces a set of deformations that, once normalized, can be used to represent the test-acquired mode shapes in the cross-orthogonality relation. The method proposed has been shown to provide accurate results for both a simple analytical model and a complex space flight structure.
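
    For reference, the cross-orthogonality check mentioned above is commonly computed as C = Φ_test^T · M · Φ_FEM using mass-normalized mode shapes, with diagonal terms near unity and small off-diagonal terms indicating good correlation. A hedged NumPy sketch follows; the matrix names and the 0.9/0.1 rule of thumb are illustrative conventions, not values from the thesis.

```python
import numpy as np

def cross_orthogonality(phi_test, phi_fem, mass):
    """Cross-orthogonality matrix C = phi_test^T * M * phi_fem.

    phi_test, phi_fem: (n_dof, n_modes) mass-normalized mode-shape matrices;
    mass: (n_dof, n_dof) reduced mass matrix. For a well-correlated model,
    |C[i, i]| is near 1 and |C[i, j]| is small for i != j.
    """
    return phi_test.T @ mass @ phi_fem

# Usage sketch: C = cross_orthogonality(phi_measured, phi_analysis, M_reduced)
# A common rule of thumb: diagonals > 0.9, off-diagonals < 0.1.
```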

  19. Features Students Really Expect from Learning Analytics

    Science.gov (United States)

    Schumacher, Clara; Ifenthaler, Dirk

    2016-01-01

    In higher education settings more and more learning is facilitated through online learning environments. To support and understand students' learning processes better, learning analytics offers a promising approach. The purpose of this study was to investigate students' expectations toward features of learning analytics systems. In a first…

  20. [The analytical setting of rotary speed of centrifuge rotor and centrifugation time in chemical, biochemical and microbiological practice].

    Science.gov (United States)

    Zolotarev, K V

    2012-08-01

    Researchers in chemical, biochemical and microbiological practice often have to deal with suspensions: disperse systems with a solid dispersed phase and a liquid dispersion medium, with dispersed-phase particle sizes > 100 nm (10^-7 m). Quite often it is necessary to separate the solid particles from the liquid. Precipitation in the gravitational field can make this process too slow, so an effective alternative is precipitation in a field of centrifugal forces, i.e. centrifugation. The rotary speed of the centrifuge rotor and the centrifugation time can be set analytically using the laws of general dynamics and hydrodynamics. To this end, the equations of Newton's first and second laws are written and transformed for a suspension particle subject to centrifugal forces and to the resistance of the liquid and the vessel wall. The force of liquid resistance depends on the regime of particle motion in the liquid, which is determined using the dimensionless Archimedes and Reynolds criteria. The article presents the results of these transformations as an analytical inverse dependence of centrifugation time on rotary speed. Calculating a series of "speed-time" data pairs permits choosing the optimal pair based on centrifuge capacity and practical reasonability. The results of the calculations are validated by experimental data, so the physical-mathematical apparatus can be considered effective. Since the procedure depends on both the regime parameter (the Reynolds criterion) and the calculation of the data series, the most convenient way to apply it is programmatically; the article proposes using Microsoft Excel and the VBA programming language for fast solutions.
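
    The paper's own derivation is not reproduced in the abstract; as a hedged illustration, in the laminar (Stokes) regime the settling of a small spherical particle in a centrifugal field gives the classical relation t = 18η·ln(r_f/r_0) / (Δρ·d²·ω²), which indeed makes centrifugation time inversely proportional to the square of the rotor speed. A Python sketch, valid only under the Stokes-regime assumption:

```python
import math

def centrifugation_time(rpm, d, rho_p, rho_l, eta, r0, rf):
    """Stokes-regime time [s] for a particle to settle from radius r0 to rf [m].

    rpm: rotor speed; d: particle diameter [m]; rho_p, rho_l: particle and
    liquid densities [kg/m^3]; eta: dynamic viscosity [Pa*s].
    Valid only for small Reynolds numbers (laminar settling).
    """
    omega = 2.0 * math.pi * rpm / 60.0
    return 18.0 * eta * math.log(rf / r0) / ((rho_p - rho_l) * d**2 * omega**2)

# 1 um particles in water at 5000 rpm, settling from 5 cm to 10 cm radius:
print(centrifugation_time(5000, 1e-6, 1100, 1000, 1e-3, 0.05, 0.10))  # ~455 s
```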

  1. Sociocultural determinants of anticipated oral cholera vaccine acceptance in three African settings: a meta-analytic approach.

    Science.gov (United States)

    Sundaram, Neisha; Schaetti, Christian; Merten, Sonja; Schindler, Christian; Ali, Said M; Nyambedha, Erick O; Lapika, Bruno; Chaignat, Claire-Lise; Hutubessy, Raymond; Weiss, Mitchell G

    2016-01-14

    Controlling cholera remains a significant challenge in Sub-Saharan Africa. In areas where access to safe water and sanitation are limited, oral cholera vaccine (OCV) can save lives. Establishment of a global stockpile for OCV reflects increasing priority for use of cholera vaccines in endemic settings. Community acceptance of vaccines, however, is critical and sociocultural features of acceptance require attention for effective implementation. This study identifies and compares sociocultural determinants of anticipated OCV acceptance across populations in Southeastern Democratic Republic of Congo, Western Kenya and Zanzibar. Cross-sectional studies were conducted using similar but locally-adapted semistructured interviews among 1095 respondents in three African settings. Logistic regression models identified sociocultural determinants of OCV acceptance from these studies in endemic areas of Southeastern Democratic Republic of Congo (SE-DRC), Western Kenya (W-Kenya) and Zanzibar. Meta-analytic techniques highlighted common and distinctive determinants in the three settings. Anticipated OCV acceptance was high in all settings. More than 93% of community respondents overall indicated interest in a no-cost vaccine. Higher anticipated acceptance was observed in areas with less access to public health facilities. In all settings awareness of cholera prevention methods (safe food consumption and garbage disposal) and relating ingestion to cholera causation were associated with greater acceptance. Higher age, larger households, lack of education, social vulnerability and knowledge of oral rehydration solution for self-treatment were negatively associated with anticipated OCV acceptance. Setting-specific determinants of acceptance included reporting a reliable income (W-Kenya and Zanzibar, not SE-DRC). In SE-DRC, intention to purchase an OCV appeared unrelated to ability to pay. Rural residents were less likely than urban counterparts to accept an OCV in W-Kenya, but more

  2. Evaluation of the separate effects tests (SET) validation matrix

    International Nuclear Information System (INIS)

    1996-11-01

    This work is the result of a one-year extended mandate given by the CSNI at the request of the PWG 2 and the Task Group on Thermal Hydraulic System Behaviour (TG THSB) in late 1994. The aim was to evaluate the SET validation matrix in order to define the real needs for further experimental work. The statistical evaluation tables of the SET matrix provide an overview of the database, including the parameter ranges covered for each phenomenon and selected parameters, together with questions posed to determine the need for additional experimental data with regard to the objective of nuclear power plant safety. A global view of the database is first presented, focusing on areas lacking data and on hot topics. A new systematic evaluation has then been carried out based on the authors' technical judgment, yielding evaluation tables containing global and indicative information. Four main parameters were chosen as the most important and relevant: a state parameter, given by the operating pressure of the tests; a flow parameter, expressed as the mass flux, mass flow rate or volumetric flow rate in the tests; a geometrical parameter, provided through a typical dimension expressed as a diameter, an equivalent (hydraulic or heated) diameter or a cross-sectional area of the test sections; and an energy or heat transfer parameter, given as the fluid temperature, the heat flux or the heat transfer surface temperature of the tests

  3. Labour market driven learning analytics

    NARCIS (Netherlands)

    Kobayashi, V.; Mol, S.T.; Kismihók, G.

    2014-01-01

    This paper briefly outlines a project about integrating labour market information in a learning analytics goal-setting application that provides guidance to students in their transition from education to employment.

  4. Labour Market Driven Learning Analytics

    Science.gov (United States)

    Kobayashi, Vladimer; Mol, Stefan T.; Kismihók, Gábor

    2014-01-01

    This paper briefly outlines a project about integrating labour market information in a learning analytics goal-setting application that provides guidance to students in their transition from education to employment.

  5. New nuclear data set ABBN-90 and its testing on macroscopic experiments

    International Nuclear Information System (INIS)

    Kosh'cheev, V.N.; Manturov, G.N.; Nikolaev, M.N.; Rineyskiy, A.A.; Sinitsa, V.V.; Tsyboolya, A.M.; Zabrodskaya, S.V.

    1993-01-01

    The new group constant set ABBN-90 has now been developed. It is based on the FOND-2 evaluated neutron data library processed with the code GRUCON. Some results of testing the ABBN-90 set against different macroscopic experiments are presented. (author)

  6. Information-analytical maintenance of AIC at regional level

    OpenAIRE

    MOYSEENKO I.P.

    2013-01-01

    Approaches to the information and analytical support of regional management are presented. A methodology for the information and analytical support of regional agribusiness management is formulated with regard to models of EHS and security settings. The nature and functions of monitoring the objects of study are described.

  7. The Case for Adopting Server-side Analytics

    Science.gov (United States)

    Tino, C.; Holmes, C. P.; Feigelson, E.; Hurlburt, N. E.

    2017-12-01

    The standard method for accessing Earth and space science data relies on a scheme developed decades ago: data residing in one or many data stores must be parsed out and shipped via internet lines or physical transport to the researcher, who in turn locally stores the data for analysis. The analysis tasks are varied and include visualization, parameterization, and comparison with or assimilation into physics models. In many cases this process is inefficient and unwieldy as the data sets become larger and the demands of the analysis tasks become more sophisticated and complex. For about a decade, several groups have explored an alternative to this model. The names applied to the paradigm include "data analytics", "climate analytics", and "server-side analytics". The general concept is that, in close network proximity to the data store, there will be a tailored processing capability appropriate to the type and use of the data served. The user of the server-side analytics will operate on the data with numerical procedures. The procedures can be accessed via canned code, a scripting processor, or an analysis package such as Matlab, IDL or R. Results of the analytics processes will then be relayed via the internet to the user. In practice, these results will be much lower in volume, easier to transport to and store locally by the user, and easier for the user to interoperate with data sets from other remote data stores. The user can also iterate on the processing call to tailor the results as needed. A major component of server-side analytics could be to provide sets of tailored results to end users in order to eliminate the repetitive preconditioning that is often required with these data sets and that drives much of the throughput challenge. NASA's Big Data Task Force studied this issue. This paper will present the results of this study, including examples of SSAs that are being developed and demonstrated, and suggestions for architectures that might be developed for

  8. The impact of repeat-testing of common chemistry analytes at critical concentrations.

    Science.gov (United States)

    Onyenekwu, Chinelo P; Hudson, Careen L; Zemlin, Annalise E; Erasmus, Rajiv T

    2014-12-01

    Early notification of critical values by the clinical laboratory to the treating physician is a requirement for accreditation and is essential for effective patient management. Many laboratories automatically repeat a critical value before reporting it to prevent possible misdiagnosis. Given today's advanced instrumentation and quality assurance practices, we questioned the validity of this approach. We performed an audit of repeat-testing in our laboratory to assess for significant differences between initial and repeated test results, estimate the delay caused by repeat-testing and to quantify the cost of repeating these assays. A retrospective audit of repeat-tests for sodium, potassium, calcium and magnesium in the first quarter of 2013 at Tygerberg Academic Laboratory was conducted. Data on the initial and repeat-test values and the time that they were performed was extracted from our laboratory information system. The Clinical Laboratory Improvement Amendment criteria for allowable error were employed to assess for significant difference between results. A total of 2308 repeated tests were studied. There was no significant difference in 2291 (99.3%) of the samples. The average delay ranged from 35 min for magnesium to 42 min for sodium and calcium. At least 2.9% of laboratory running costs for the analytes was spent on repeating them. The practice of repeating a critical test result appears unnecessary as it yields similar results, delays notification to the treating clinician and increases laboratory running costs.

  9. Identifying genetic marker sets associated with phenotypes via an efficient adaptive score test

    KAUST Repository

    Cai, T.

    2012-06-25

    In recent years, genome-wide association studies (GWAS) and gene-expression profiling have generated a large number of valuable datasets for assessing how genetic variations are related to disease outcomes. With such datasets, it is often of interest to assess the overall effect of a set of genetic markers, assembled based on biological knowledge. Genetic marker-set analyses have been advocated as more reliable and powerful approaches compared with the traditional marginal approaches (Curtis and others, 2005. Pathways to the analysis of microarray data. TRENDS in Biotechnology 23, 429-435; Efroni and others, 2007. Identification of key processes underlying cancer phenotypes using biologic pathway analysis. PLoS One 2, 425). Procedures for testing the overall effect of a marker-set have been actively studied in recent years. For example, score tests derived under an Empirical Bayes (EB) framework (Liu and others, 2007. Semiparametric regression of multidimensional genetic pathway data: least-squares kernel machines and linear mixed models. Biometrics 63, 1079-1088; Liu and others, 2008. Estimation and testing for the effect of a genetic pathway on a disease outcome using logistic kernel machine regression via logistic mixed models. BMC bioinformatics 9, 292-2; Wu and others, 2010. Powerful SNP-set analysis for case-control genome-wide association studies. American Journal of Human Genetics 86, 929) have been proposed as powerful alternatives to the standard Rao score test (Rao, 1948. Large sample tests of statistical hypotheses concerning several parameters with applications to problems of estimation. Mathematical Proceedings of the Cambridge Philosophical Society, 44, 50-57). The advantages of these EB-based tests are most apparent when the markers are correlated, due to the reduction in the degrees of freedom. In this paper, we propose an adaptive score test which up- or down-weights the contributions from each member of the marker-set based on the Z-scores of

  10. Negations in syllogistic reasoning: evidence for a heuristic-analytic conflict.

    Science.gov (United States)

    Stupple, Edward J N; Waterhouse, Eleanor F

    2009-08-01

    An experiment utilizing response time measures was conducted to test dominant processing strategies in syllogistic reasoning with the expanded quantifier set proposed by Roberts (2005). Through adding negations to existing quantifiers it is possible to change problem surface features without altering logical validity. Biases based on surface features such as atmosphere, matching, and the probability heuristics model (PHM; Chater & Oaksford, 1999; Wetherick & Gilhooly, 1995) would not be expected to show variance in response latencies, but participant responses should be highly sensitive to changes in the surface features of the quantifiers. In contrast, according to analytic accounts such as mental models theory and mental logic (e.g., Johnson-Laird & Byrne, 1991; Rips, 1994) participants should exhibit increased response times for negated premises, but not be overly impacted upon by the surface features of the conclusion. Data indicated that the dominant response strategy was based on a matching heuristic, but also provided evidence of a resource-demanding analytic procedure for dealing with double negatives. The authors propose that dual-process theories offer a stronger account of these data whereby participants employ competing heuristic and analytic strategies and fall back on a heuristic response when analytic processing fails.

  11. Analytic continuation of dual Feynman amplitudes

    International Nuclear Information System (INIS)

    Bleher, P.M.

    1981-01-01

    A notion of dual Feynman amplitude is introduced, and a theorem on the existence of an analytic continuation of this amplitude from the convergence domain to the whole complex space is proved. The case under consideration corresponds to massless power propagators, and the analytic continuation is constructed in the propagator powers. The poles of the analytic continuation and the singular set of external momenta are found explicitly. The proof of the theorem on the existence of the analytic continuation is based on the introduction of an α-representation for dual Feynman amplitudes. In the proof, the so-called 'trees formula' and 'trees-with-cycles formula' are established; these are dual in formulation to the trees and 2-trees formulae for usual Feynman amplitudes. (Auth.)

  12. Assessment of Random Assignment in Training and Test Sets using Generalized Cluster Analysis Technique

    Directory of Open Access Journals (Sweden)

    Sorana D. BOLBOACĂ

    2011-06-01

    Aim: The properness of the random assignment of compounds to training and validation sets was assessed using the generalized cluster technique. Material and Method: A quantitative structure-activity relationship model using the Molecular Descriptors Family on Vertices was evaluated in terms of the assignment of carboquinone derivatives to training and test sets during leave-many-out analysis. The assignment of compounds was investigated using five variables: the observed anticancer activity and four structure descriptors. Generalized cluster analysis with the K-means algorithm was applied in order to investigate whether the assignment of compounds was proper. The Euclidean distance and maximization of the initial distance, using cross-validation with a v-fold of 10, were applied. Results: All five variables included in the analysis proved to have a statistically significant contribution to the identification of clusters. Three clusters were identified, each of them containing carboquinone derivatives belonging both to the training and to the test sets. The observed activity of the carboquinone derivatives proved to be normally distributed within every cluster. The presence of training and test set compounds in all clusters identified by generalized cluster analysis with the K-means algorithm, and the distribution of observed activity within clusters, support a proper assignment of compounds to training and test sets. Conclusion: Generalized cluster analysis using the K-means algorithm proved to be a valid method for assessing the random assignment of carboquinone derivatives to training and test sets.
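
    A minimal sketch of this kind of check using scikit-learn (the data matrix and membership flags are hypothetical): cluster the pooled activity/descriptor matrix, then verify that every cluster contains members of both subsets.

```python
import numpy as np
from sklearn.cluster import KMeans

def check_split_by_clusters(X, is_train, n_clusters=3, seed=0):
    """Cluster pooled samples and report the train/test mix per cluster.

    X: (n_samples, n_features) matrix of activity plus descriptors;
    is_train: boolean array marking training-set membership.
    A split looks proper if every cluster contains both subsets.
    """
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit_predict(X)
    for c in range(n_clusters):
        in_c = labels == c
        n_tr = int(np.sum(in_c & is_train))
        n_te = int(np.sum(in_c & ~is_train))
        print(f"cluster {c}: {n_tr} train, {n_te} test")
```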

  13. Solving large sets of coupled equations iteratively by vector processing on the CYBER 205 computer

    International Nuclear Information System (INIS)

    Tolsma, L.D.

    1985-01-01

    The set of coupled linear second-order differential equations which has to be solved for the quantum-mechanical description of inelastic scattering of atomic and nuclear particles can be rewritten as an equivalent set of coupled integral equations. When a suitable class of functions is used as piecewise analytic reference solutions, the integrals that arise in this set can be evaluated analytically. The set of integral equations can then be solved iteratively. For the results reported here, an inward-outward iteration scheme was applied. A concept for the vectorization of coupled-channel Fortran programs, based on this integral method, is presented for use on the CYBER 205 computer. It turns out that, for two heavy-ion nuclear scattering test cases, this vector algorithm gives an overall speed-up of about a factor of 2 to 3 compared to a highly optimized scalar algorithm on a one-vector-pipeline computer.

  14. A big data geospatial analytics platform - Physical Analytics Integrated Repository and Services (PAIRS)

    Science.gov (United States)

    Hamann, H.; Jimenez Marianno, F.; Klein, L.; Albrecht, C.; Freitag, M.; Hinds, N.; Lu, S.

    2015-12-01

    A major challenge in leveraging big geospatial data sets is the ability to quickly integrate multiple data sources into physical and statistical models and to run these models in real time. A geospatial data platform called Physical Analytics Integrated Repository and Services (PAIRS) has been developed on top of an open-source hardware and software stack to manage terabytes of data. A new data interpolation and regridding scheme is implemented in which any geospatial data layer can be associated with a set of global grids whose resolution doubles for consecutive layers. Each pixel on the PAIRS grid has an index that is a combination of location and time stamp. The indexing allows quick access to data sets that are part of a global data layer, so that only the data of interest are retrieved. PAIRS takes advantage of a parallel processing framework (Hadoop) in a cloud environment to digest, curate, and analyze the data sets while remaining robust and stable. The data are stored in a distributed NoSQL database (HBase) across multiple servers; data upload and retrieval are parallelized, with the original analytics task broken up into smaller areas/volumes, analyzed independently, and then reassembled for the original geographical area. The differentiating aspect of PAIRS is the ability to accelerate model development across large geographical regions and spatial resolutions ranging from 0.1 m up to hundreds of kilometers. System performance is benchmarked on real-time automated data ingestion and retrieval of MODIS and Landsat data layers. The data layers are curated for sensor error, verified for correctness, and analyzed statistically to detect local anomalies. Multi-layer queries enable PAIRS to filter different data
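
    As an illustration of the indexing idea described above (a sketch only, not the actual PAIRS implementation; the function name, key layout and base grid size are all assumptions), a pixel key can combine a layer, a grid location, and a time stamp so that keys sort by location and time:

    ```python
    def pixel_key(lat, lon, layer, timestamp, base_deg=1.0):
        # Cell size halves (resolution doubles) at each consecutive layer;
        # base_deg (coarsest cell size) is an assumed value for illustration.
        cell = base_deg / (2 ** layer)
        row = int((lat + 90.0) / cell)
        col = int((lon + 180.0) / cell)
        # Combining location and time stamp lets range scans over one data
        # layer return only the pixels of interest.
        return (layer, row, col, timestamp)

    print(pixel_key(40.7, -73.9, layer=5, timestamp=1449878400))
    ```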

  15. Savannah River Site TEP-SET tests uncertainty report

    International Nuclear Information System (INIS)

    Taylor, D.J.N.

    1993-09-01

    This document presents a measurement uncertainty analysis for the instruments used in Phases I, II and III of the Savannah River One-Fourth Linear Scale, One-Sixth Sector, Tank/Muff/Pump (TMP) Separate Effects Tests (SET) Experiment Series. The Idaho National Engineering Laboratory conducted the tests for the Savannah River Site (SRS). The tests represented a range of hydraulic conditions and geometries that bound anticipated Large Break Loss of Coolant Accidents in the SRS reactors. Important hydraulic phenomena were identified from the experiments. In addition, code calculations will be benchmarked against these experiments. The experimental system includes the following measurement groups: coolant density; absolute and differential pressures; turbine flowmeters (liquid phase); thermal flowmeters (gas phase); ultrasonic liquid level meters; temperatures; pump torque; pump speed; moderator tank liquid inventory via a load cell measurement; and relative humidity meters. This document also analyzes the data acquisition system, including the presampling filters, as it relates to these measurements.

  16. Analytical Plan for Roman Glasses

    Energy Technology Data Exchange (ETDEWEB)

    Strachan, Denis M.; Buck, Edgar C.; Mueller, Karl T.; Schwantes, Jon M.; Olszta, Matthew J.; Thevuthasan, Suntharampillai; Heeren, Ronald M.

    2011-01-01

    Roman glasses that have been in the sea or underground for about 1800 years can serve as the independent “experiment” that is needed for validation of codes and models used in performance assessment. Two sets of Roman-era glasses have been obtained for this purpose. One set comes from the sunken vessel the Iulia Felix; the second from recently excavated glasses from a Roman villa in Aquileia, Italy. The specimens contain glass artifacts and attached sediment or soil. In the case of the Iulia Felix glasses, a substantial amount of analytical work has been completed at the University of Padova, but from an archaeological perspective. The glasses from Aquileia have not been so carefully analyzed, but they are similar to other Roman glasses. Both glass and sediment or soil need to be analyzed and are the subject of this analytical plan. The glasses need to be analyzed with the goal of validating the model used to describe glass dissolution. The sediment and soil need to be analyzed to determine the profile of elements released from the glass. The latter represents a significant analytical challenge because of the trace quantities that need to be analyzed. Both pieces of information will be useful in validating the glass dissolution model and the chemical transport code(s) used to determine the migration of elements once released from the glass. In this plan, we outline the analytical techniques that should be useful in obtaining the needed information and suggest a useful starting point for this analytical effort.

  17. MATLAB-SIMULINK BASED INFORMATION SUPPORT FOR DIGITAL OVERCURRENT PROTECTION TEST SETS

    Directory of Open Access Journals (Sweden)

    I. V. Novash

    2017-01-01

    Full Text Available The implementation of information support for PC-based and hardware-software based test sets for digital overcurrent protection devices and their models, using the MatLab-Simulink environment, is considered. It is demonstrated that the mathematical modeling of a part of the power system, viz. the generalized electric power object, can be based on rigid and flexible models. Rigid models, implemented on the basis of a mathematical description of the electrical and magnetic circuits of a power system, can serve as reference models against which simulation results obtained with other simulation systems are compared. It is proposed to implement flexible models of the generalized electric power object in the MatLab-Simulink environment, which includes the SimPowerSystems component library targeted at power system modeling. The calculation of the parameters of the SimPowerSystems library blocks from which the power system model is formed is considered. From standard Simulink blocks, models of wye-connected current transformers were composed, as well as a model of the digital overcurrent protection that is missing from the component library. A comparison of simulation results for one and the same generalized electric power object implemented in different PC-based software packages was undertaken. The divergence of simulation results did not exceed 3%, which allows us to recommend the MatLab-Simulink environment for creating information support for hardware-software based test sets for digital overcurrent protection devices. The structure of a hardware-software based set for digital overcurrent protection device testing using the Omicron CMC 356 has been suggested. A time-to-trip comparison between the real digital protection device МР 801 and a model whose parameters exactly match those of the prototype device was carried out using identical test inputs. The results of the tests

  18. Analytic functionals on the sphere

    CERN Document Server

    Morimoto, Mitsuo

    1998-01-01

    This book treats spherical harmonic expansion of real analytic functions and hyperfunctions on the sphere. Because a one-dimensional sphere is a circle, the simplest example of the theory is that of Fourier series of periodic functions. The author first introduces a system of complex neighborhoods of the sphere by means of the Lie norm. He then studies holomorphic functions and analytic functionals on the complex sphere. In the one-dimensional case, this corresponds to the study of holomorphic functions and analytic functionals on the annular set in the complex plane, relying on the Laurent series expansion. In this volume, it is shown that the same idea still works in a higher-dimensional sphere. The Fourier-Borel transformation of analytic functionals on the sphere is also examined; the eigenfunction of the Laplacian can be studied in this way.

  19. High-Activity ICP-AES Measurements in the ATALANTE Facility Applied to Analytical Monitoring of an Extraction Test

    International Nuclear Information System (INIS)

    Esbelin, E.; Boyer-Deslys, V.; Beres, A.; Viallesoubranne, C.

    2008-01-01

    The Material Analysis and Metrology Laboratory (LAMM) of the Cea's Atalante complex ensures analytical monitoring of enhanced separation tests. Certain fission products, actinides and lanthanides were assayed by ICP-AES (Inductively Coupled Plasma-Atomic Emission Spectroscopy) in the CBA shielded analysis line. These analyses were particularly effective for controlling the Diamex test, and contributed to its success. The Diamex process consists in extracting the actinides and lanthanides from a Purex raffinate using a diamide, DMDOHEMA, followed by stripping at low acidity. The major elements analyzed during the test were Am, Nd, Mo, Fe, and Zr

  20. Using Set Covering with Item Sampling to Analyze the Infeasibility of Linear Programming Test Assembly Models

    Science.gov (United States)

    Huitzing, Hiddo A.

    2004-01-01

    This article shows how set covering with item sampling (SCIS) methods can be used in the analysis and preanalysis of linear programming models for test assembly (LPTA). LPTA models can construct tests, fulfilling a set of constraints set by the test assembler. Sometimes, no solution to the LPTA model exists. The model is then said to be…

  1. Principles of Single-Laboratory Validation of Analytical Methods for Testing the Chemical Composition of Pesticides

    Energy Technology Data Exchange (ETDEWEB)

    Ambrus, A. [Hungarian Food Safety Office, Budapest (Hungary)

    2009-07-15

    Underlying theoretical and practical approaches towards pesticide formulation analysis are discussed, i.e. general principles, performance characteristics, applicability of validation data, verification of method performance, and adaptation of validated methods by other laboratories. The principles of single-laboratory validation of analytical methods for testing the chemical composition of pesticides are outlined. The theoretical background for performing pesticide formulation analysis as outlined in ISO, CIPAC/AOAC and IUPAC guidelines is also described, including methodological characteristics such as specificity, selectivity, linearity, accuracy, trueness, precision and bias. Appendices I–III give practical, worked examples of how to use the Horwitz approach and formulae for estimating the target standard deviation for acceptable analytical repeatability. The estimation of trueness and the establishment of typical within-laboratory reproducibility are treated in greater detail by means of worked-out examples. (author)

  2. An exponential combination procedure for set-based association tests in sequencing studies.

    Science.gov (United States)

    Chen, Lin S; Hsu, Li; Gamazon, Eric R; Cox, Nancy J; Nicolae, Dan L

    2012-12-07

    State-of-the-art next-generation-sequencing technologies can facilitate in-depth explorations of the human genome by investigating both common and rare variants. For the identification of genetic factors that are associated with disease risk or other complex phenotypes, methods have been proposed for jointly analyzing variants in a set (e.g., all coding SNPs in a gene). Variants in a properly defined set could be associated with risk or phenotype in a concerted fashion, and by accumulating information from them, one can improve power to detect genetic risk factors. Many set-based methods in the literature are based on statistics that can be written as the summation of variant statistics. Here, we propose taking the summation of the exponential of variant statistics as the set summary for association testing. From both Bayesian and frequentist perspectives, we provide theoretical justification for taking the sum of the exponential of variant statistics because it is particularly powerful for sparse alternatives, that is, when only relatively few of the large number of variants tested in a set are associated with disease risk, a distinctive feature of genetic data. We applied the exponential combination gene-based test to a sequencing study in anticancer pharmacogenomics and uncovered mechanistic insights into genes and pathways related to chemotherapeutic susceptibility for an important class of oncologic drugs. Copyright © 2012 The American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.
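
    A minimal sketch of the proposed set summary, assuming a hypothetical vector z of per-variant statistics: the set statistic is the sum of exp(z_i), which is dominated by the few large entries under a sparse alternative.

    ```python
    import numpy as np

    def exponential_combination(z):
        # Set-level summary: sum of exponentials rather than sum of statistics
        return np.sum(np.exp(z))

    # Toy illustration of sensitivity to a sparse signal: one strong variant
    # among many nulls moves the exponential sum far more than the plain sum.
    rng = np.random.default_rng(0)
    z_null = rng.standard_normal(50)
    z_sparse = z_null.copy()
    z_sparse[0] += 4.0  # a single associated variant
    print(np.sum(z_null), np.sum(z_sparse))                        # plain sums
    print(exponential_combination(z_null), exponential_combination(z_sparse))
    ```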

  3. Data for TROTS – The Radiotherapy Optimisation Test Set

    Directory of Open Access Journals (Sweden)

    Sebastiaan Breedveld

    2017-06-01

    Full Text Available The Radiotherapy Optimisation Test Set (TROTS) is an extensive set of problems originating from radiotherapy (radiation therapy) treatment planning. This dataset was created for two purposes: (1) to supply a large-scale dense dataset for measuring the performance and quality of mathematical solvers, and (2) to supply a dataset for investigating the multi-criteria optimisation and decision-making nature of the radiotherapy problem. The dataset contains 120 problems (patients), divided over 6 different treatment protocols/tumour types. Each problem contains numerical data, a configuration for the optimisation problem, and data required to visualise and interpret the results. The data are stored as HDF5-compatible Matlab files, and include scripts to work with the dataset.
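
    Because the files are HDF5-compatible, they can be inspected with generic HDF5 tools; a hedged sketch follows (the file and dataset names are hypothetical placeholders, not the dataset's documented layout):

    ```python
    import h5py

    # "Protocol1_01.mat" and "problem/matrix" are hypothetical names used only
    # to illustrate access to a Matlab v7.3 (HDF5) file.
    with h5py.File("Protocol1_01.mat", "r") as f:
        f.visit(print)              # list the objects actually stored
        # A dense matrix could then be read with, e.g.:
        # A = f["problem/matrix"][()]
    ```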

  4. Bell inequality, nonlocality and analyticity

    International Nuclear Information System (INIS)

    Socolovsky, M.

    2003-01-01

    The Bell and the Clauser-Horne-Shimony-Holt inequalities are shown to hold for both the cases of complex and real analytic nonlocality in the setting parameters of Einstein-Podolsky-Rosen-Bohm experiments for spin-1/2 particles and photons, in both the deterministic and stochastic cases. Therefore, the theoretical and experimental violation of the inequalities by quantum mechanics excludes all hidden variables theories with that kind of nonlocality. In particular, real analyticity leads to negative definite correlations, in contradiction with quantum mechanics.

  5. Bell inequality, nonlocality and analyticity

    Energy Technology Data Exchange (ETDEWEB)

    Socolovsky, M

    2003-09-15

    The Bell and the Clauser-Horne-Shimony-Holt inequalities are shown to hold for both the cases of complex and real analytic nonlocality in the setting parameters of Einstein-Podolsky-Rosen-Bohm experiments for spin-1/2 particles and photons, in both the deterministic and stochastic cases. Therefore, the theoretical and experimental violation of the inequalities by quantum mechanics excludes all hidden variables theories with that kind of nonlocality. In particular, real analyticity leads to negative definite correlations, in contradiction with quantum mechanics.

  6. Alternate superior Julia sets

    International Nuclear Information System (INIS)

    Yadav, Anju; Rani, Mamta

    2015-01-01

    Alternate Julia sets have been studied in Picard iterative procedures. The purpose of this paper is to study the quadratic and cubic maps using superior iterates to obtain Julia sets with different alternate structures. Analytically, graphically and computationally it has been shown that alternate superior Julia sets can be connected, disconnected and totally disconnected, and also fatter than the corresponding alternate Julia sets. A few examples have been studied by applying different types of alternate structures.

  7. The legal and ethical concerns that arise from using complex predictive analytics in health care.

    Science.gov (United States)

    Cohen, I Glenn; Amarasingham, Ruben; Shah, Anand; Xie, Bin; Lo, Bernard

    2014-07-01

    Predictive analytics, or the use of electronic algorithms to forecast future events in real time, makes it possible to harness the power of big data to improve the health of patients and lower the cost of health care. However, this opportunity raises policy, ethical, and legal challenges. In this article we analyze the major challenges to implementing predictive analytics in health care settings and make broad recommendations for overcoming challenges raised in the four phases of the life cycle of a predictive analytics model: acquiring data to build the model, building and validating it, testing it in real-world settings, and disseminating and using it more broadly. For instance, we recommend that model developers implement governance structures that include patients and other stakeholders starting in the earliest phases of development. In addition, developers should be allowed to use already collected patient data without explicit consent, provided that they comply with federal regulations regarding research on human subjects and the privacy of health information. Project HOPE—The People-to-People Health Foundation, Inc.

  8. Analytic cognitive style predicts religious and paranormal belief.

    Science.gov (United States)

    Pennycook, Gordon; Cheyne, James Allan; Seli, Paul; Koehler, Derek J; Fugelsang, Jonathan A

    2012-06-01

    An analytic cognitive style denotes a propensity to set aside highly salient intuitions when engaging in problem solving. We assess the hypothesis that an analytic cognitive style is associated with a history of questioning, altering, and rejecting (i.e., unbelieving) supernatural claims, both religious and paranormal. In two studies, we examined associations of God beliefs, religious engagement (attendance at religious services, praying, etc.), conventional religious beliefs (heaven, miracles, etc.) and paranormal beliefs (extrasensory perception, levitation, etc.) with performance measures of cognitive ability and analytic cognitive style. An analytic cognitive style negatively predicted both religious and paranormal beliefs when controlling for cognitive ability as well as religious engagement, sex, age, political ideology, and education. Participants more willing to engage in analytic reasoning were less likely to endorse supernatural beliefs. Further, an association between analytic cognitive style and religious engagement was mediated by religious beliefs, suggesting that an analytic cognitive style negatively affects religious engagement via lower acceptance of conventional religious beliefs. Results for types of God belief indicate that the association between an analytic cognitive style and God beliefs is more nuanced than mere acceptance and rejection, but also includes adopting less conventional God beliefs, such as Pantheism or Deism. Our data are consistent with the idea that two people who share the same cognitive ability, education, political ideology, sex, age and level of religious engagement can acquire very different sets of beliefs about the world if they differ in their propensity to think analytically. Copyright © 2012 Elsevier B.V. All rights reserved.

  9. Analytical support for the B{sub 4}C control rod test QUENCH-07

    Energy Technology Data Exchange (ETDEWEB)

    Homann, C.; Hering, W. [Forschungszentrum Karlsruhe GmbH Technik und Umwelt (Germany). Inst. fuer Reaktorsicherheit]|[Forschungszentrum Karlsruhe GmbH Technik und Umwelt (Germany). Programm Nukleare Sicherheitsforschung; Birchley, J. [Paul Scherrer Inst. (Switzerland); Fernandez Benitez, J.A.; Ortega Bernardo, M. [Univ. Politecnica de Madrid (Spain)

    2003-04-01

    Degradation of B{sub 4}C absorber rods during a beyond-design accident in a nuclear power reactor may be a safety concern. Among other experiments, the integral test QUENCH-07 was performed in the FZK QUENCH facility and supported by analytical work within the Euratom Fifth Framework Programme on Nuclear Fission Safety to establish a more profound database. Since the test differed substantially from previous QUENCH tests, much more pretest calculation work than usual had to be done to guarantee the safety of the facility and to derive the test protocol. Several institutions shared in this work, using different computer code systems of the kind employed for nuclear reactor safety analyses. Thanks to this effort, problems could be identified and solved, leading to several modifications of the originally planned test conduct, until a feasible test protocol could be derived and recommended. All calculations showed the same trends. In particular, the high temperatures, and hence the small safety margin for the facility, were a concern. In this report, contributions of the various authors engaged in this work are presented. The test QUENCH-07 and the related computational support by the engaged institutions were co-financed by the European Community under the Euratom Fifth Framework Programme on Nuclear Fission Safety 1998-2002 (COLOSS Project, contract No. FIKS-CT-1999-00002). (orig.)

  10. Analytical energy spectrum for hybrid mechanical systems

    International Nuclear Information System (INIS)

    Zhong, Honghua; Xie, Qiongtao; Lee, Chaohong; Guan, Xiwen; Gao, Kelin; Batchelor, Murray T

    2014-01-01

    We investigate the energy spectrum for hybrid mechanical systems described by non-parity-symmetric quantum Rabi models. A set of analytical solutions in terms of the confluent Heun functions and their analytical energy spectrum is obtained. The analytical energy spectrum includes regular and exceptional parts, which are both confirmed by direct numerical simulation. The regular part is determined by the zeros of the Wronskian for a pair of analytical solutions. The exceptional part is relevant to the isolated exact solutions and its energy eigenvalues are obtained by analyzing the truncation conditions for the confluent Heun functions. By analyzing the energy eigenvalues for exceptional points, we obtain the analytical conditions for the energy-level crossings, which correspond to two-fold energy degeneracy. (paper)

  11. The analytic renormalization group

    Directory of Open Access Journals (Sweden)

    Frank Ferrari

    2016-08-01

    Full Text Available Finite temperature Euclidean two-point functions in quantum mechanics or quantum field theory are characterized by a discrete set of Fourier coefficients G_k, k ∈ Z, associated with the Matsubara frequencies ν_k = 2πk/β. We show that analyticity implies that the coefficients G_k must satisfy an infinite number of model-independent linear equations that we write down explicitly. In particular, we construct “Analytic Renormalization Group” linear maps A_μ which, for any choice of cut-off μ, allow one to express the low-energy Fourier coefficients for |ν_k| < μ (with the possible exception of the zero mode G_0), together with the real-time correlators and spectral functions, in terms of the high-energy Fourier coefficients for |ν_k| ≥ μ. Using a simple numerical algorithm, we show that the exact universal linear constraints on G_k can be used to systematically improve any random approximate data set obtained, for example, from Monte-Carlo simulations. Our results are illustrated on several explicit examples.
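
    For reference, a hedged sketch of the standard conventions behind this notation (sign and normalization conventions vary between references): the Matsubara frequencies and the Fourier coefficients of a β-periodic Euclidean two-point function G(τ).

    ```latex
    \[
      \nu_k = \frac{2\pi k}{\beta}, \qquad
      G_k = \int_0^{\beta} d\tau \, e^{i\nu_k \tau}\, G(\tau), \qquad
      G(\tau) = \frac{1}{\beta} \sum_{k \in \mathbb{Z}} e^{-i\nu_k \tau}\, G_k .
    \]
    ```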

  12. Post-analytical stability of 23 common chemistry and immunochemistry analytes in incurred samples

    DEFF Research Database (Denmark)

    Nielsen, Betina Klint; Frederiksen, Tina; Friis-Hansen, Lennart

    2017-01-01

    BACKGROUND: Storage of blood samples after centrifugation, decapping and initial sampling allows ordering of additional blood tests. The pre-analytic stability of biochemistry and immunochemistry analytes has been studied in detail, but little is known about their post-analytical stability in incurred samples. METHODS: We examined the stability of 23 routine analytes on the Dimension Vista® (Siemens Healthineers, Denmark): 42-60 routine samples in lithium-heparin gel tubes (Vacutainer, BD, USA) were centrifuged at 3000×g for 10 min. Immediately after centrifugation, initial concentrations of the analytes were measured in duplicate (t=0). The tubes were stored decapped at room temperature and re-analyzed after 2, 4, 6, 8 and 10 h in singletons. The concentrations from reanalysis were normalized to the initial concentration (t=0). Internal acceptance criteria for bias and total error were used

  13. Introductory statistics and analytics a resampling perspective

    CERN Document Server

    Bruce, Peter C

    2014-01-01

    Concise, thoroughly class-tested primer that features basic statistical concepts in the context of analytics, resampling, and the bootstrap. A uniquely developed presentation of key statistical topics, Introductory Statistics and Analytics: A Resampling Perspective provides an accessible approach to statistical analytics, resampling, and the bootstrap for readers with various levels of exposure to basic probability and statistics. Originally class-tested at one of the first online learning companies in the discipline, www.statistics.com, the book primarily focuses on application

  14. Social Set Analysis

    DEFF Research Database (Denmark)

    Vatrapu, Ravi; Hussain, Abid; Buus Lassen, Niels

    2015-01-01

    This paper argues that the basic premise of Social Network Analysis (SNA), namely that social reality is constituted by dyadic relations and that social interactions are determined by structural properties of networks, is neither necessary nor sufficient for Big Social Data analytics of Facebook or Twitter data. However, there exists no other holistic computational social science approach beyond the relational sociology and graph theory of SNA. To address this limitation, this paper presents an alternative holistic approach to Big Social Data analytics called Social Set Analysis (SSA...

  15. Testing the effect of defaults on the thermostat settings of OECD employees

    International Nuclear Information System (INIS)

    Brown, Zachary; Johnstone, Nick; Haščič, Ivan; Vong, Laura; Barascud, Francis

    2013-01-01

    We describe a randomized controlled experiment in which the default settings on office thermostats in an OECD office building were manipulated during the winter heating season, and employees' chosen thermostat setting observed over a 6-week period. Using difference-in-differences, panel, and censored regression models (to control for maximum allowable thermostat settings), we find that a 1 °C decrease in the default caused a reduction in the chosen setting by 0.38 °C, on average. Sixty-five percent of this effect could be attributed to office occupant behavior (p-value = 0.044). The difference-in-differences models show that small decreases in the default (1°) led to a greater reduction in chosen settings than large decreases (2°). We also find that office occupants who were more apt to adjust their thermostats prior to the intervention were less susceptible to the default. We conclude that this kind of intervention can increase building-level energy efficiency, and discuss potential explanations and broader policy implications of our findings. - Highlights: • We conduct a randomized controlled trial to test if thermostat defaults affect agent behavior. • Two treatments (schedules of default settings) were tested against a control for 6 weeks at OECD. • Small changes in defaults had a greater effect on chosen settings than larger changes in defaults. • Occupants who frequently changed their thermostats in baseline were less affected by defaults. • Thermostat defaults in office environments can be manipulated to increase energy efficiency
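
    A minimal sketch of the difference-in-differences estimate described above, using synthetic placeholder data rather than the study's data (all column names are assumptions):

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Synthetic placeholder data: 'treated' marks offices whose default was
    # lowered, 'post' marks weeks after the change, and the outcome embeds
    # the reported -0.38 degree C effect purely for illustration.
    rng = np.random.default_rng(0)
    n = 400
    df = pd.DataFrame({
        "office_id": rng.integers(0, 40, n),
        "treated": rng.integers(0, 2, n),
        "post": rng.integers(0, 2, n),
    })
    df["setting"] = 21.0 - 0.38 * df["treated"] * df["post"] + rng.normal(0, 0.5, n)

    # OLS with an interaction term; standard errors clustered by office
    model = smf.ols("setting ~ treated + post + treated:post", data=df).fit(
        cov_type="cluster", cov_kwds={"groups": df["office_id"]})
    print(model.params["treated:post"])  # the difference-in-differences estimate
    ```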

  16. High-Activity ICP-AES Measurements in the ATALANTE Facility Applied to Analytical Monitoring of an Extraction Test

    Energy Technology Data Exchange (ETDEWEB)

    Esbelin, E.; Boyer-Deslys, V.; Beres, A.; Viallesoubranne, C. [CEA Marcoule, DEN/DRCP/SE2A/LAMM, BP17171, 30207 Bagnols-sur-Ceze (France)

    2008-07-01

    The Material Analysis and Metrology Laboratory (LAMM) of the Cea's Atalante complex ensures analytical monitoring of enhanced separation tests. Certain fission products, actinides and lanthanides were assayed by ICP-AES (Inductively Coupled Plasma-Atomic Emission Spectroscopy) in the CBA shielded analysis line. These analyses were particularly effective for controlling the Diamex test, and contributed to its success. The Diamex process consists in extracting the actinides and lanthanides from a Purex raffinate using a diamide, DMDOHEMA, followed by stripping at low acidity. The major elements analyzed during the test were Am, Nd, Mo, Fe, and Zr.

  17. Doubling immunochemistry laboratory testing efficiency with the cobas e 801 module while maintaining consistency in analytical performance.

    Science.gov (United States)

    Findeisen, P; Zahn, I; Fiedler, G M; Leichtle, A B; Wang, S; Soria, G; Johnson, P; Henzell, J; Hegel, J K; Bendavid, C; Collet, N; McGovern, M; Klopprogge, K

    2018-06-04

    The new immunochemistry cobas e 801 module (Roche Diagnostics) was developed to meet increasing demands on routine laboratories to further improve testing efficiency, while maintaining high quality and reliable data. During a non-interventional multicenter evaluation study, the overall performance, functionality and reliability of the new module was investigated under routine-like conditions. It was tested as a dedicated immunochemistry system at four sites and as a consolidator combined with clinical chemistry at three sites. We report on the testing efficiency and analytical performance of the new module. Evaluation of sample workloads with site-specific routine request patterns demonstrated increased speed and almost doubled throughput (a maximum of 300 tests per hour), revealing that one cobas e 801 module can replace two cobas e 602 modules while saving up to 44% floor space. Result stability was demonstrated by QC analysis per assay throughout the study. Precision testing over 21 days yielded excellent results within and between labs, and method comparison against routine results from the cobas e 602 module showed high consistency of results for all assays under study. In a practicability assessment of performance and handling, 99% of graded features met (44%) or exceeded (55%) laboratory expectations, with enhanced reagent management and loading during operation being highlighted. By nearly doubling immunochemistry testing efficiency on the same footprint as a cobas e 602 module, the new module has great potential to further consolidate and enhance laboratory testing while maintaining high-quality analytical performance with Roche platforms. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  18. Analytic Cognitive Style Predicts Religious and Paranormal Belief

    Science.gov (United States)

    Pennycook, Gordon; Cheyne, James Allan; Seli, Paul; Koehler, Derek J.; Fugelsang, Jonathan A.

    2012-01-01

    An analytic cognitive style denotes a propensity to set aside highly salient intuitions when engaging in problem solving. We assess the hypothesis that an analytic cognitive style is associated with a history of questioning, altering, and rejecting (i.e., unbelieving) supernatural claims, both religious and paranormal. In two studies, we examined…

  19. Student Perceptions of the Progress Test in Two Settings and the Implications for Test Deployment

    Science.gov (United States)

    Wade, Louise; Harrison, Chris; Hollands, James; Mattick, Karen; Ricketts, Chris; Wass, Val

    2012-01-01

    Background: The Progress Test (PT) was developed to assess student learning within integrated curricula. Whilst it is effective in promoting and rewarding deep approaches to learning in some settings, we hypothesised that implementation of the curriculum (design and assessment) may impact on students' preparation for the PT and their learning.…

  20. Delivering business analytics practical guidelines for best practice

    CERN Document Server

    Stubbs, Evan

    2013-01-01

    AVOID THE MISTAKES THAT OTHERS MAKE - LEARN WHAT LEADS TO BEST PRACTICE AND KICKSTART SUCCESS This groundbreaking resource provides comprehensive coverage across all aspects of business analytics, presenting proven management guidelines to drive sustainable differentiation. Through a rich set of case studies, author Evan Stubbs reviews solutions and examples to over twenty common problems spanning managing analytics assets and information, leveraging technology, nurturing skills, and defining processes. Delivering Business Analytics also outlines the Data Scientist's Code, fifteen principle

  1. Big data analytics in healthcare: promise and potential.

    Science.gov (United States)

    Raghupathi, Wullianallur; Raghupathi, Viju

    2014-01-01

    To describe the promise and potential of big data analytics in healthcare. The paper describes the nascent field of big data analytics in healthcare, discusses the benefits, outlines an architectural framework and methodology, describes examples reported in the literature, briefly discusses the challenges, and offers conclusions. The paper provides a broad overview of big data analytics for healthcare researchers and practitioners. Big data analytics in healthcare is evolving into a promising field for providing insight from very large data sets and improving outcomes while reducing costs. Its potential is great; however, there remain challenges to overcome.

  2. Different goodness of fit tests for Rayleigh distribution in ranked set sampling

    Directory of Open Access Journals (Sweden)

    Amer Al-Omari

    2016-03-01

    Full Text Available In this paper, different goodness-of-fit tests for the Rayleigh distribution are considered based on simple random sampling (SRS) and ranked set sampling (RSS) techniques. The performance of the suggested tests is evaluated in terms of their power by using Monte Carlo simulation. It is found that the suggested RSS tests perform better than their SRS counterparts.
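
    A minimal sketch of such a Monte Carlo power comparison, under the common perfect-ranking assumption for RSS and with an arbitrary alternative distribution (the paper's specific test statistics are not reproduced; a Kolmogorov-Smirnov test stands in for them):

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)

    def rss_sample(draw, m, cycles):
        # RSS with perfect ranking: for each rank i in a cycle, draw m units
        # and keep the i-th order statistic.
        out = []
        for _ in range(cycles):
            for i in range(m):
                out.append(np.sort(draw(m))[i])
        return np.array(out)

    def power(draw_alt, sampler, sigma0=1.0, reps=500, alpha=0.05):
        # Rejection rate of a KS test of the Rayleigh(sigma0) null when the
        # data actually come from draw_alt.
        null_cdf = stats.rayleigh(scale=sigma0).cdf
        hits = sum(stats.kstest(sampler(draw_alt), null_cdf).pvalue < alpha
                   for _ in range(reps))
        return hits / reps

    m, cycles = 5, 6                          # sample size n = 30 in both designs
    draw_alt = lambda n: rng.weibull(1.5, n)  # an arbitrary alternative
    print("SRS power:", power(draw_alt, lambda d: d(m * cycles)))
    print("RSS power:", power(draw_alt, lambda d: rss_sample(d, m, cycles)))
    ```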

  3. Analyticity and the Global Information Field

    Directory of Open Access Journals (Sweden)

    Evgeni A. Solov'ev

    2015-03-01

    Full Text Available The relation between analyticity in mathematics and the concept of a global information field in physics is reviewed. Mathematics is complete in the complex plane only. In the complex plane, a very powerful tool appears—analyticity. According to this property, if an analytic function is known on the countable set of points having an accumulation point, then it is known everywhere. This mysterious property has profound consequences in quantum physics. Analyticity allows one to obtain asymptotic (approximate results in terms of some singular points in the complex plane which accumulate all necessary data on a given process. As an example, slow atomic collisions are presented, where the cross-sections of inelastic transitions are determined by branch-points of the adiabatic energy surface at a complex internuclear distance. Common aspects of the non-local nature of analyticity and a recently introduced interpretation of classical electrodynamics and quantum physics as theories of a global information field are discussed.

  4. A semi-analytical solution for slug tests in an unconfined aquifer considering unsaturated flow

    Science.gov (United States)

    Sun, Hongbing

    2016-01-01

    A semi-analytical solution considering vertical unsaturated flow is developed in Laplace space for groundwater flow in response to a slug test in an unconfined aquifer. The new solution incorporates the effects of partial penetration, anisotropy, vertical unsaturated flow, and a moving water table boundary. Compared to the Kansas Geological Survey (KGS) model, the new solution significantly improves the fit of the modeled to the measured hydraulic heads at the late stage of slug tests in an unconfined aquifer, particularly when the slug well has a partially submerged screen and moisture drainage above the water table is significant. The radial hydraulic conductivities estimated with the new solution are comparable to those from the KGS, Bouwer and Rice, and Hvorslev methods. In addition, the new solution can also be used to examine the vertical conductivity, specific storage, specific yield, and the moisture retention parameters in an unconfined aquifer based on slug test data.
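
    The solution lives in Laplace space, so evaluating it in time requires numerical inversion; a hedged sketch of the widely used Gaver-Stehfest algorithm follows, demonstrated on a placeholder transform rather than the paper's actual slug-test solution:

    ```python
    import numpy as np
    from math import factorial

    def stehfest_weights(N):
        # Classical Gaver-Stehfest coefficients V_i (N must be even)
        V = np.zeros(N)
        for i in range(1, N + 1):
            s = 0.0
            for k in range((i + 1) // 2, min(i, N // 2) + 1):
                s += (k ** (N // 2) * factorial(2 * k)
                      / (factorial(N // 2 - k) * factorial(k) * factorial(k - 1)
                         * factorial(i - k) * factorial(2 * k - i)))
            V[i - 1] = (-1) ** (N // 2 + i) * s
        return V

    def stehfest_invert(F, t, N=12):
        # f(t) ~ (ln 2 / t) * sum_i V_i * F(i ln 2 / t)
        V = stehfest_weights(N)
        a = np.log(2.0) / t
        return a * sum(V[i] * F((i + 1) * a) for i in range(N))

    # Placeholder transform F(s) = 1/(s+1), whose exact inverse is exp(-t);
    # the Laplace-space slug-test solution would replace F in practice.
    F = lambda s: 1.0 / (s + 1.0)
    for t in (0.5, 1.0, 2.0):
        print(t, stehfest_invert(F, t), np.exp(-t))
    ```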

  5. EuroClonality/BIOMED-2 guidelines for interpretation and reporting of Ig/TCR clonality testing in suspected lymphoproliferations

    NARCIS (Netherlands)

    Langerak, A. W.; Groenen, P. J. T. A.; Brüggemann, M.; Beldjord, K.; Bellan, C.; Bonello, L.; Boone, E.; Carter, G. I.; Catherwood, M.; Davi, F.; Delfau-Larue, M.-H.; Diss, T.; Evans, P. A. S.; Gameiro, P.; Garcia Sanz, R.; Gonzalez, D.; Grand, D.; Håkansson, A.; Hummel, M.; Liu, H.; Lombardia, L.; Macintyre, E. A.; Milner, B. J.; Montes-Moreno, S.; Schuuring, E.; Spaargaren, M.; Hodges, E.; van Dongen, J. J. M.

    2012-01-01

    PCR-based immunoglobulin (Ig)/T-cell receptor (TCR) clonality testing in suspected lymphoproliferations has largely been standardized and has consequently become technically feasible in a routine diagnostic setting. Standardization of the pre-analytical and post-analytical phases is now essential to

  6. A support vector machine based test for incongruence between sets of trees in tree space

    Science.gov (United States)

    2012-01-01

    Background The increased use of multi-locus data sets for phylogenetic reconstruction has increased the need to determine whether a set of gene trees significantly deviates from the phylogenetic patterns of other genes. Such unusual gene trees may have been influenced by other evolutionary processes such as selection, gene duplication, or horizontal gene transfer. Results Motivated by this problem, we propose a nonparametric goodness-of-fit test for two empirical distributions of gene trees, and we developed the software GeneOut to estimate a p-value for the test. Our approach maps trees into a multi-dimensional vector space and then applies support vector machines (SVMs) to measure the separation between two sets of pre-defined trees. We use a permutation test to assess the significance of the SVM separation. To demonstrate the performance of GeneOut, we applied it to the comparison of gene trees simulated within different species trees across a range of species tree depths. Applied directly to sets of simulated gene trees with large sample sizes, GeneOut was able to detect very small differences between two sets of gene trees generated under different species trees. Our statistical test can also include tree reconstruction into its test framework through a variety of phylogenetic optimality criteria. When applied to DNA sequence data simulated from different sets of gene trees, results in the form of receiver operating characteristic (ROC) curves indicated that GeneOut performed well in the detection of differences between sets of trees with different distributions in a multi-dimensional space. Furthermore, it controlled false positive and false negative rates very well, indicating a high degree of accuracy. Conclusions The non-parametric nature of our statistical test provides fast and efficient analyses, and makes it an applicable test for any scenario where evolutionary or other factors can lead to trees with different multi-dimensional distributions. The
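
    A minimal sketch of the test's core idea, not the GeneOut implementation: measure how separable two sets of tree vectors are with an SVM, then assess significance by permuting the set labels (the tree vectors here are synthetic placeholders):

    ```python
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    # Synthetic placeholders for vectorised gene trees from two sets
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0.0, 1.0, (40, 10)),
                   rng.normal(0.4, 1.0, (40, 10))])
    y = np.repeat([0, 1], 40)

    def separation(X, y):
        # Cross-validated SVM accuracy as the separation statistic
        return cross_val_score(SVC(kernel="linear"), X, y, cv=5).mean()

    observed = separation(X, y)
    perm = [separation(X, rng.permutation(y)) for _ in range(200)]
    p_value = (1 + sum(s >= observed for s in perm)) / (1 + len(perm))
    print(observed, p_value)
    ```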

  7. Experience with Dismantling of the Analytic Cell in the JRTF Decommissioning Program

    International Nuclear Information System (INIS)

    Annoh, Akio; Nemoto, Koichi; Tajiri, Hideo; Saito, Keiichiro; Miyajima, Kazutoshi; Myodo, Masato

    2003-01-01

    The analytic cell was mainly used for process control analysis of the reprocessing process and for the measurement of fuel burn-up ratio in JAERI's Reprocessing Test Facility (JRTF). The analytic cell was heavily shielded and equipped with a conveyor, and was alpha- and beta(gamma)-contaminated. For dismantling of analytic cells, it is very important to establish a method to remove the heavy shield safely and to reduce the exposure. First, a greenhouse was set up to prevent the spread of contamination; next, the analytic cell was dismantled. Depending on the contamination conditions, the workers wore protective suits such as air-ventilated suits for prevention of internal exposure, and vinyl chloride aprons and lead aprons to reduce external exposure. From the work carried out, various data such as the manpower needed for the activities, the collective dose to workers from external exposure, the amount of radioactive waste, and the relation between the weight of the shield and its dismantling efficiency were obtained and entered into the database. The method of dismantling and the experience with the dismantling of the analytic cell in the JRTF, carried out during 2001 and 2002, are described in this paper.

  8. The Pacific Marine Energy Center - South Energy Test Site (PMEC-SETS)

    Energy Technology Data Exchange (ETDEWEB)

    Batten, Belinda [Oregon State Univ., Corvallis, OR (United States); Hellin, Dan [Oregon State Univ., Corvallis, OR (United States)

    2018-02-07

    The overall goal of this project was to build on existing progress to establish the Pacific Marine Energy Center South Energy Test Site (PMEC-SETS) as the nation's first fully permitted test site for wave energy converter arrays. Specifically, it plays an essential role in reducing the levelized cost of energy for the wave energy industry by providing both the facility and the resources to address the challenges of cost reduction.

  9. New nucleic acid testing devices to diagnose infectious diseases in resource-limited settings.

    Science.gov (United States)

    Maffert, P; Reverchon, S; Nasser, W; Rozand, C; Abaibou, H

    2017-10-01

    Point-of-care diagnosis based on nucleic acid testing aims to incorporate all the analytical steps, from sample preparation to nucleic acid amplification and detection, in a single device. This device needs to provide a low-cost, robust, sensitive, specific, and easily readable analysis. Microfluidics has great potential for handling small volumes of fluids on a single platform. Microfluidic technology has recently been applied to paper, which is already used in low-cost lateral flow tests. Nucleic acid extraction from a biological specimen usually requires cell filtration and lysis on specific membranes, while affinity matrices, such as chitosan or polydiacetylene, are well suited to concentrating nucleic acids for subsequent amplification. Access to electricity is often difficult in resource-limited areas, so the amplification step needs to be equipment-free. Consequently, the reaction has to be isothermal to alleviate the need for a thermocycler. LAMP, NASBA, HDA, and RPA are examples of the technologies available. Nucleic acid detection techniques are currently based on fluorescence, colorimetry, or chemiluminescence. For point-of-care diagnostics, the results should be readable with the naked eye. Nowadays, interpretation and communication of results to health professionals could rely on a smartphone, used as a telemedicine device. The major challenge of creating an "all-in-one" diagnostic test involves the design of an optimal solution and a sequence for each analytical step, as well as combining the execution of all these steps on a single device. This review provides an overview of available materials and technologies which seem to be adapted to point-of-care nucleic acid-based diagnosis, in low-resource areas.

  10. Set-up and Test Procedure for Suction Installation and Uninstallation of Bucket Foundation

    DEFF Research Database (Denmark)

    Koteras, Aleksandra Katarzyna

    This technical report describes the set-up and the test procedures for installation and uninstallation of a medium-scale model of a bucket foundation that can be performed in the geotechnical part of the laboratory at Aalborg University. The installation of the bucket foundation can be tested with the use of suction under the bucket lid or by applying additional force through the hydraulic piston, forcing the bucket to penetrate into the soil. Tests for uninstallation are also performed with the use of water pressure, as a reverse process to the suction installation. The set-up (...) and loading frame used for those tests have already been used for axially static and cyclic loading of piles (Thomassen, 2015a) and for axially static and cyclic loading of a bucket foundation (Vaitkunaite et al., 2015). Both installation and uninstallation tests

  11. Illness Perception and Depressive Symptoms among Persons with Type 2 Diabetes Mellitus: An Analytical Cross-Sectional Study in Clinical Settings in Nepal.

    Science.gov (United States)

    Joshi, Suira; Dhungana, Raja Ram; Subba, Usha Kiran

    2015-01-01

    Background. This study aimed to assess the relationship between illness perception and depressive symptoms among persons with diabetes. Method. This was an analytical cross-sectional study conducted among 379 type 2 diabetic patients from three major clinical settings of Kathmandu, Nepal. Results. The prevalence of depressive symptoms was 44.1% (95% CI: 39.1, 49.1). Females (p perception and depressive symptoms among diabetic patients. Study finding indicated that persons living with diabetes in Nepal need comprehensive diabetes education program for changing poor illness perception, which ultimately helps to prevent development of depressive symptoms.

  12. Comparison of EPRI safety valve test data with analytically determined hydraulic results

    International Nuclear Information System (INIS)

    Smith, L.C.; Howe, K.S.

    1983-01-01

    NUREG-0737 (November 1980) and all subsequent U.S. NRC generic follow-up letters require that all operating plant licensees and applicants verify the acceptability of plant-specific pressurizer safety valve piping systems for valve operation transients by testing. To aid in this verification process, the Electric Power Research Institute (EPRI) conducted an extensive testing program at the Combustion Engineering Test Facility. Pertinent tests simulating dynamic opening of the safety valves for representative upstream environments were carried out. Different models and sizes of safety valves were tested at the simulated operating conditions. Transducers placed at key points in the system monitored a variety of thermal, hydraulic and structural parameters. From these data, a more complete description of the transient can be made. The EPRI test configuration was analytically modeled using a one-dimensional thermal-hydraulic computer program that uses the method of characteristics approach to generate key fluid parameters as a function of space and time. The conservation equations are solved by applying both the implicit and explicit characteristic methods. Unbalanced or wave forces were determined for each straight run of pipe bounded on each side by a turn or elbow. Blowdown forces were included, where appropriate. Several parameters were varied to determine the effects on the pressure, hydraulic forces and timing of events. By comparing these quantities with the experimentally obtained data, an approximate picture of the flow dynamics is arrived at. Two cases in particular are presented. These are the hot and cold loop seal discharge tests made with the Crosby 6M6 spring-loaded safety valve. Included in the paper is a description of the hydraulic code, modeling techniques and assumptions, a comparison of the numerical results with experimental data, and a qualitative description of the factors which govern pipe support loading. (orig.)
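
    A hedged sketch of how an unbalanced force on one straight run might be formed from end-point quantities (a quasi-steady simplification that neglects the rate of change of fluid momentum within the run; all symbols are hypothetical inputs, not the report's variables):

    ```python
    def unbalanced_force(p_up, p_down, rho_up, rho_down, v_up, v_down, area):
        # Axial force on a straight run between two elbows: difference of
        # pressure forces plus difference of momentum fluxes at the two ends.
        pressure_term = (p_up - p_down) * area
        momentum_term = (rho_up * v_up**2 - rho_down * v_down**2) * area
        return pressure_term + momentum_term

    # Example with illustrative numbers (SI units: Pa, kg/m^3, m/s, m^2)
    print(unbalanced_force(5.2e6, 5.0e6, 800.0, 795.0, 12.0, 11.5, 0.01))
    ```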

  13. Efficient analytical implementation of the DOT Riemann solver for the de Saint Venant-Exner morphodynamic model

    Science.gov (United States)

    Carraro, F.; Valiani, A.; Caleffi, V.

    2018-03-01

    Within the framework of the de Saint Venant equations coupled with the Exner equation for morphodynamic evolution, this work presents a new efficient implementation of the Dumbser-Osher-Toro (DOT) scheme for non-conservative problems. The DOT path-conservative scheme is a robust upwind method based on a complete Riemann solver, but it has the drawback of requiring expensive numerical computations. Indeed, to compute the non-linear time evolution in each time step, the DOT scheme requires numerical computation of the flux matrix eigenstructure (the totality of eigenvalues and eigenvectors) several times at each cell edge. In this work, an analytical and compact formulation of the eigenstructure for the de Saint Venant-Exner (dSVE) model is introduced and tested in terms of numerical efficiency and stability. Using the original DOT and PRICE-C (a very efficient FORCE-type method) as reference methods, we present a convergence analysis (error against CPU time) to study the performance of the DOT method with our new analytical implementation of eigenstructure calculations (A-DOT). In particular, the numerical performance of the three methods is tested in three test cases: a movable bed Riemann problem with analytical solution; a problem with smooth analytical solution; a test in which the water flow is characterised by subcritical and supercritical regions. For a given target error, the A-DOT method is always the most efficient choice. Finally, two experimental data sets and different transport formulae are considered to test the A-DOT model in more practical case studies.
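
    For background, a hedged sketch of the de Saint Venant-Exner system in a common one-dimensional form (the paper's exact closure and notation may differ): water depth h, discharge q, bed elevation z, gravity g, bed porosity p, and a sediment transport law q_s(h, q).

    ```latex
    \[
      \frac{\partial h}{\partial t} + \frac{\partial q}{\partial x} = 0, \qquad
      \frac{\partial q}{\partial t}
        + \frac{\partial}{\partial x}\!\left(\frac{q^{2}}{h} + \frac{g h^{2}}{2}\right)
        + g h \frac{\partial z}{\partial x} = 0, \qquad
      \frac{\partial z}{\partial t}
        + \frac{1}{1 - p} \frac{\partial q_{s}}{\partial x} = 0 .
    \]
    ```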

  14. Foreign Language Optical Character Recognition, Phase II: Arabic and Persian Training and Test Data Sets

    National Research Council Canada - National Science Library

    Davidson, Robert

    1997-01-01

    .... Each data set is divided into a training set, which is made available to developers, and a carefully matched equal-sized set of closely analogous samples, which is reserved for testing of the developers' products...

  15. Strategy for design NIR calibration sets based on process spectrum and model space: An innovative approach for process analytical technology.

    Science.gov (United States)

    Cárdenas, V; Cordobés, M; Blanco, M; Alcalà, M

    2015-10-10

    The pharmaceutical industry is under stringent regulations on quality control of its products because this is critical for both the production process and consumer safety. According to the framework of "process analytical technology" (PAT), a complete understanding of the process and a stepwise monitoring of manufacturing are required. Near infrared spectroscopy (NIRS) combined with chemometrics has lately proved efficient, useful and robust for pharmaceutical analysis. One crucial step in developing effective NIRS-based methodologies is selecting an appropriate calibration set to construct models affording accurate predictions. In this work, we developed calibration models for a pharmaceutical formulation during its three manufacturing stages: blending, compaction and coating. A novel methodology, the "process spectrum", is proposed for selecting the calibration set; physical changes in the samples at each stage are algebraically incorporated into it. We also established a "model space" defined by Hotelling's T² and Q-residuals statistics for outlier identification (inside/outside the defined space) in order to objectively select the factors to be used in calibration set construction. The results obtained confirm the efficacy of the proposed methodology for stepwise pharmaceutical quality control, and the relevance of the study as a guideline for the implementation of this easy and fast methodology in the pharma industry. Copyright © 2015 Elsevier B.V. All rights reserved.
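
    A minimal sketch of such a "model space" check, assuming a hypothetical matrix X of preprocessed NIR spectra (samples × wavelengths) and simple empirical control limits in place of the parametric ones usually used:

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 200))          # placeholder spectra

    pca = PCA(n_components=3).fit(X)
    scores = pca.transform(X)

    # Hotelling's T2: squared scores scaled by each component's variance
    t2 = np.sum(scores**2 / scores.var(axis=0, ddof=1), axis=1)

    # Q residuals: squared distance from each sample to the PCA model plane
    q = np.sum((X - pca.inverse_transform(scores))**2, axis=1)

    # Simple empirical 95th-percentile limits, for illustration only
    outside = (t2 > np.percentile(t2, 95)) | (q > np.percentile(q, 95))
    print("samples outside the model space:", np.where(outside)[0])
    ```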

  16. Electromagnetic analysis of the Korean helium cooled ceramic reflector test blanket module set

    International Nuclear Information System (INIS)

    Lee, Youngmin; Ku, Duck Young; Lee, Dong Won; Ahn, Mu-Young; Park, Yi-Hyun; Cho, Seungyon

    2016-01-01

    The Korean helium cooled ceramic reflector (HCCR) test blanket module set (TBM-set) will be installed at equatorial port #18 of the Vacuum Vessel in ITER in order to test breeding blanket performance for a forthcoming fusion power plant. Since the ITER tokamak has a set of electromagnetic coils (the Central Solenoid, Poloidal Field and Toroidal Field coil set) around the Vacuum Vessel, the HCCR TBM-set, i.e. the TBM and associated shield, is greatly influenced by the magnetic field generated by these coils. In the case of fast transient electromagnetic events such as a major disruption, a vertical displacement event or a magnet fast discharge, the magnetic field and the induced eddy currents result in a huge electromagnetic load, known as the Lorentz load, on the HCCR TBM-set. In addition, the TBM-set experiences an electromagnetic load due to magnetization of the structural material not only during the fast transient events but also during normal operation, since the HCCR TBM adopts Reduced Activation Ferritic Martensitic (RAFM) steel as its structural material. This is known as the Maxwell load, which includes the Lorentz load as well as the load due to magnetization of the structural material. This paper presents electromagnetic analysis results for the HCCR TBM-set. For the analysis, a 20° sector finite element model was constructed considering the ITER configuration, including the Vacuum Vessel, ITER shield blankets, Central Solenoid, Poloidal Field and Toroidal Field coil set, as well as the HCCR TBM-set. Three major disruptions (an operational event, a likely event and a highly unlikely event) were selected for analysis based on the load specifications. ANSYS-EMAG was used as the calculation tool. The results of the EM analysis will be used as input data for the structural analysis.

  17. Electromagnetic analysis of the Korean helium cooled ceramic reflector test blanket module set

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Youngmin, E-mail: ymlee@nfri.re.kr [National Fusion Research Institute, Daejeon (Korea, Republic of); Ku, Duck Young [National Fusion Research Institute, Daejeon (Korea, Republic of); Lee, Dong Won [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Ahn, Mu-Young; Park, Yi-Hyun; Cho, Seungyon [National Fusion Research Institute, Daejeon (Korea, Republic of)

    2016-11-01

    The Korean helium cooled ceramic reflector (HCCR) test blanket module set (TBM-set) will be installed at equatorial port #18 of the Vacuum Vessel in ITER in order to test breeding blanket performance for a forthcoming fusion power plant. Since the ITER tokamak has a set of electromagnetic coils (the Central Solenoid, Poloidal Field and Toroidal Field coil set) around the Vacuum Vessel, the HCCR TBM-set, i.e. the TBM and associated shield, is greatly influenced by the magnetic field generated by these coils. In the case of fast transient electromagnetic events such as a major disruption, a vertical displacement event or a magnet fast discharge, the magnetic field and the induced eddy currents result in a huge electromagnetic load, known as the Lorentz load, on the HCCR TBM-set. In addition, the TBM-set experiences an electromagnetic load due to magnetization of the structural material not only during the fast transient events but also during normal operation, since the HCCR TBM adopts Reduced Activation Ferritic Martensitic (RAFM) steel as its structural material. This is known as the Maxwell load, which includes the Lorentz load as well as the load due to magnetization of the structural material. This paper presents electromagnetic analysis results for the HCCR TBM-set. For the analysis, a 20° sector finite element model was constructed considering the ITER configuration, including the Vacuum Vessel, ITER shield blankets, Central Solenoid, Poloidal Field and Toroidal Field coil set, as well as the HCCR TBM-set. Three major disruptions (an operational event, a likely event and a highly unlikely event) were selected for analysis based on the load specifications. ANSYS-EMAG was used as the calculation tool. The results of the EM analysis will be used as input data for the structural analysis.

  18. Analytical Validation of the ReEBOV Antigen Rapid Test for Point-of-Care Diagnosis of Ebola Virus Infection

    Science.gov (United States)

    Cross, Robert W.; Boisen, Matthew L.; Millett, Molly M.; Nelson, Diana S.; Oottamasathien, Darin; Hartnett, Jessica N.; Jones, Abigal B.; Goba, Augustine; Momoh, Mambu; Fullah, Mohamed; Bornholdt, Zachary A.; Fusco, Marnie L.; Abelson, Dafna M.; Oda, Shunichiro; Brown, Bethany L.; Pham, Ha; Rowland, Megan M.; Agans, Krystle N.; Geisbert, Joan B.; Heinrich, Megan L.; Kulakosky, Peter C.; Shaffer, Jeffrey G.; Schieffelin, John S.; Kargbo, Brima; Gbetuwa, Momoh; Gevao, Sahr M.; Wilson, Russell B.; Saphire, Erica Ollmann; Pitts, Kelly R.; Khan, Sheik Humarr; Grant, Donald S.; Geisbert, Thomas W.; Branco, Luis M.; Garry, Robert F.

    2016-01-01

    Background. Ebola virus disease (EVD) is a severe viral illness caused by Ebola virus (EBOV). The 2013–2016 EVD outbreak in West Africa is the largest recorded, with >11 000 deaths. Development of the ReEBOV Antigen Rapid Test (ReEBOV RDT) was expedited to provide a point-of-care test for suspected EVD cases. Methods. Recombinant EBOV viral protein 40 antigen was used to derive polyclonal antibodies for RDT and enzyme-linked immunosorbent assay development. ReEBOV RDT limits of detection (LOD), specificity, and interference were analytically validated on the basis of Food and Drug Administration (FDA) guidance. Results. The ReEBOV RDT specificity estimate was 95% for donor serum panels and 97% for donor whole-blood specimens. The RDT demonstrated sensitivity to 3 species of Ebolavirus (Zaire ebolavirus, Sudan ebolavirus, and Bundibugyo ebolavirus) associated with human disease, with no cross-reactivity by pathogens associated with non-EBOV febrile illness, including malaria parasites. Interference testing exhibited no reactivity by medications in common use. The LOD for antigen was 4.7 ng/test in serum and 9.4 ng/test in whole blood. Quantitative reverse transcription–polymerase chain reaction testing of nonhuman primate samples determined the range to be equivalent to 3.0 × 10⁵–9.0 × 10⁸ genomes/mL. Conclusions. The analytical validation presented here contributed to the ReEBOV RDT being the first antigen-based assay to receive FDA and World Health Organization emergency use authorization for this EVD outbreak, in February 2015. PMID:27587634

  19. Analytical model for advective-dispersive transport involving flexible boundary inputs, initial distributions and zero-order productions

    Science.gov (United States)

    Chen, Jui-Sheng; Li, Loretta Y.; Lai, Keng-Hsin; Liang, Ching-Ping

    2017-11-01

    A novel solution method is presented which leads to an analytical model for advective-dispersive transport in a semi-infinite domain involving a wide spectrum of boundary inputs, initial distributions, and zero-order productions. The method applies the Laplace transform in combination with the generalized integral transform technique (GITT) to obtain the generalized analytical solution. Based on this generalized analytical expression, we derive a comprehensive set of special-case solutions for some time-dependent boundary distributions and zero-order productions, described by the Dirac delta, constant, Heaviside, exponentially-decaying, or periodically sinusoidal functions, as well as some position-dependent initial conditions and zero-order productions specified by the Dirac delta, constant, Heaviside, or exponentially-decaying functions. The developed solutions are tested against an analytical solution from the literature. The excellent agreement between the analytical solutions confirms that the new model can serve as an effective tool for investigating transport behaviors under different scenarios. Several examples of applications are given to explore transport behaviors which are rarely noted in the literature. The results show that the concentration waves resulting from the periodically sinusoidal input are sensitive to the dispersion coefficient. The implication of this new finding is that a tracer test with a periodic input may provide additional information for identifying the dispersion coefficient. Moreover, the solution strategy presented in this study can be extended to derive analytical models for handling more complicated problems of solute transport in multi-dimensional media subjected to sequential decay chain reactions, for which analytical solutions are not currently available.
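
    The paper's exact governing equation is not reproduced in the abstract; a generic one-dimensional form of the class of problems described (advection, dispersion, and a zero-order production term γ on a semi-infinite domain) reads:

        \[ \frac{\partial C}{\partial t} = D\,\frac{\partial^2 C}{\partial x^2} - v\,\frac{\partial C}{\partial x} + \gamma(x,t), \qquad 0 \le x < \infty \]

    with C(x,0) set by the initial distribution and the inlet condition at x = 0 set by the boundary input (Dirac delta, constant, Heaviside, exponentially decaying, or sinusoidal, as listed above).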

  20. A primer of analytical mechanics

    CERN Document Server

    Strocchi, Franco

    2018-01-01

    This book presents the basic elements of Analytical Mechanics, starting from the physical motivations that favor it with respect to the Newtonian Mechanics in Cartesian coordinates. Rather than presenting Analytical Mechanics mainly as a formal development of Newtonian Mechanics, it highlights its effectiveness due to the following five important achievements: 1) the most economical description of time evolution in terms of the minimal set of coordinates, so that there are no constraint forces in their evolution equations; 2) the form invariance of the evolution equations, which automatically solves the problem of fictitious forces; 3) only one scalar function encodes the formulation of the dynamics, rather than the full set of vectors which describe the forces in Cartesian Newtonian Mechanics; 4) in the Hamiltonian formulation, the corresponding evolution equations are of first order in time and are fully governed by the Hamiltonian function (usually corresponding to the energy); 5) the emergence of the Hami...

  1. Interacting Brownian Swarms: Some Analytical Results

    Directory of Open Access Journals (Sweden)

    Guillaume Sartoretti

    2016-01-01

    We consider the dynamics of swarms of scalar Brownian agents subject to local imitation mechanisms implemented using mutual rank-based interactions. For appropriate values of the underlying control parameters, the swarm propagates tightly and the distances separating successive agents are iid exponential random variables. Implicitly, the implementation of rank-based mutual interactions requires that agents have infinite interaction ranges. Using the probabilistic size of the swarm’s support, we analytically estimate the critical interaction range below which flocked swarms cannot survive. In the second part of the paper, we consider the interactions between two flocked swarms of Brownian agents with finite interaction ranges. Both swarms travel with different barycentric velocities, and agents from both swarms indifferently interact with each other. For appropriate initial configurations, both swarms eventually collide (i.e., all agents interact). Depending on the values of the control parameters, one of the following patterns emerges after collision: (i) both swarms remain essentially flocked, or (ii) the swarms become ultimately quasi-free and recover their nominal barycentric speeds. We derive a set of analytical flocking conditions based on the generalized rank-based Brownian motion. An extensive set of numerical simulations corroborates our analytical findings.

  2. Designing a Marketing Analytics Course for the Digital Age

    Science.gov (United States)

    Liu, Xia; Burns, Alvin C.

    2018-01-01

    Marketing analytics is receiving great attention because of evolving technology and the radical changes in the marketing environment. This study aims to assist the design and implementation of a marketing analytics course. We assembled a rich data set from four sources: business executives, 400 employers' job postings, one million tweets about…

  3. Analytic Methods Used in Quality Control in a Compounding Pharmacy.

    Science.gov (United States)

    Allen, Loyd V

    2017-01-01

    Analytical testing will no doubt become a more important part of pharmaceutical compounding as the public and regulatory agencies demand increasing documentation of the quality of compounded preparations. Compounding pharmacists must decide what types of testing and what amount of testing to include in their quality-control programs, and whether testing should be done in-house or outsourced. Like pharmaceutical compounding, analytical testing should be performed only by those who are appropriately trained and qualified. This article discusses the analytical methods that are used in quality control in a compounding pharmacy. Copyright © by International Journal of Pharmaceutical Compounding, Inc.

  4. Recommendations for translation and reliability testing of International Spinal Cord Injury Data Sets.

    Science.gov (United States)

    Biering-Sørensen, F; Alexander, M S; Burns, S; Charlifue, S; DeVivo, M; Dietz, V; Krassioukov, A; Marino, R; Noonan, V; Post, M W M; Stripling, T; Vogel, L; Wing, P

    2011-03-01

    To provide recommendations regarding translation and reliability testing of International Spinal Cord Injury (SCI) Data Sets. The Executive Committee for the International SCI Standards and Data Sets. Translation of any specific International SCI Data Set can be accomplished by translating from the English version into the target language, followed by a back-translation into English to confirm that the original meaning has been preserved. Another approach is to have the initial translation performed by translators who have knowledge of SCI, and afterwards checked by other person(s) with the same kind of knowledge. The translation process includes both language translation and cultural adaptation, and therefore shall not be made word for word but shall strive for conceptual equivalence. At a minimum, the inter-rater reliability should be tested by no fewer than two independent observers, and preferably in multiple countries. Translations must include information on the name, role and background of everyone involved in the translation process, and shall be dated and noted with a version number. By following the proposed guidelines, translated data sets should assure comparability of data acquisition across countries and cultures. If the translation process identifies irregularities or misrepresentation in either the original English version or the target language, the working group for the particular International SCI Data Set shall revise the data set accordingly, which may include re-wording of the original English version in order to reach a compromise on the content of the data set.

  5. Monitoring training response in young Friesian dressage horses using two different standardised exercise tests (SETs)

    NARCIS (Netherlands)

    de Bruijn, Cornelis Marinus; Houterman, Willem; Ploeg, Margreet; Ducro, Bart; Boshuizen, Berit; Goethals, Klaartje; Verdegaal, Elisabeth-Lidwien; Delesalle, Catherine

    2017-01-01

    BACKGROUND: Most Friesian horses reach their anaerobic threshold during a standardized exercise test (SET) which requires lower intensity exercise than daily routine training. AIM: to study strengths and weaknesses of an alternative SET-protocol. Two different SETs (SETA and SETB) were applied

  6. Monitoring training response in young Friesian dressage horses using two different standardised exercise tests (SETs)

    NARCIS (Netherlands)

    Bruijn, de Cornelis Marinus; Houterman, Willem; Ploeg, Margreet; Ducro, Bart; Boshuizen, Berit; Goethals, Klaartje; Verdegaal, Elisabeth Lidwien; Delesalle, Catherine

    2017-01-01

    Background: Most Friesian horses reach their anaerobic threshold during a standardized exercise test (SET) which requires lower intensity exercise than daily routine training. Aim: to study strengths and weaknesses of an alternative SET-protocol. Two different SETs (SETA and SETB) were applied

  7. Application of system reliability analytical method, GO-FLOW

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi; Fukuto, Junji; Mitomo, Nobuo; Miyazaki, Keiko; Matsukura, Hiroshi; Kobayashi, Michiyuki

    1999-01-01

    The Ship Research Institute has been conducting a developmental study of the GO-FLOW method, a system reliability analysis method that occupies a central place in PSA (Probabilistic Safety Assessment), adding various advanced functionalities. The aims were to upgrade the functionality of the GO-FLOW method, to develop an analytical function integrated with a dynamic behavior analytical function covering physical behavior and probable subject transfer, and to prepare a function for picking out the main accident sequences. In fiscal year 1997, an analytical function was developed in the dynamic event-tree analytical system by adding dependency between headings. In the simulation analytical function for accident sequences, it became possible to fully cover the main accident sequences of the MRX improved ship propulsion reactor. In addition, a function was prepared that allows an analysis operator to easily set the input data for analysis. (G.K.)

  8. Editorial: Datasets for Learning Analytics

    NARCIS (Netherlands)

    Dietze, Stefan; George, Siemens; Davide, Taibi; Drachsler, Hendrik

    2018-01-01

    The European LinkedUp and LACE (Learning Analytics Community Exchange) projects have been responsible for setting up a series of data challenges at the LAK conferences 2013 and 2014 around the LAK dataset. The LAK dataset consists of a rich collection of full-text publications in the domain of

  9. System of Systems Analytic Workbench - 2017

    Science.gov (United States)

    2017-08-31

    Genetic Algorithm and Particle Swarm Optimization with Type-2 Fuzzy Sets for Generating Systems of Systems Architectures. Procedia Computer Science...The application effort involves modeling an existing messaging network to perform real-time situational awareness. The Analytical Workbench’s

  10. Association test based on SNP set: logistic kernel machine based test vs. principal component analysis.

    Directory of Open Access Journals (Sweden)

    Yang Zhao

    GWAS has greatly facilitated the discovery of risk SNPs associated with complex diseases. Traditional methods analyze SNPs individually and are limited by low power and reproducibility, since correction for multiple comparisons is necessary. Several methods have been proposed based on grouping SNPs into SNP sets using biological knowledge and/or genomic features. In this article, we compare the linear kernel machine based test (LKM) and the principal components analysis based approach (PCA) using simulated datasets under scenarios of 0 to 3 causal SNPs, as well as simple and complex linkage disequilibrium (LD) structures of the simulated regions. Our simulation study demonstrates that both LKM and PCA can control the type I error at the significance level of 0.05. If the causal SNP is in strong LD with the genotyped SNPs, both the PCA with a small number of principal components (PCs) and the LKM with a linear or identical-by-state kernel function are valid tests. However, if the LD structure is complex, such as several LD blocks in the SNP set, or when the causal SNP is not in the LD block in which most of the genotyped SNPs reside, more PCs should be included to capture the information of the causal SNP. Simulation studies also demonstrate the ability of LKM and PCA to combine information from multiple causal SNPs and to provide increased power over individual SNP analysis. We also apply LKM and PCA to analyze two SNP sets extracted from an actual GWAS dataset on non-small cell lung cancer.
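
    A minimal sketch of the PCA-based variant of such a SNP-set test, on simulated data (an illustration of the general approach, not the authors' code; the sample size, effect size and number of PCs are arbitrary choices):

        # PCA-based SNP-set association test on simulated genotypes.
        import numpy as np
        from sklearn.decomposition import PCA
        from scipy import stats
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n, p = 1000, 20                          # subjects, SNPs in the set
        G = rng.binomial(2, 0.3, size=(n, p))    # genotypes coded 0/1/2
        beta = np.zeros(p); beta[3] = 0.4        # one hypothetical causal SNP
        logit = -1 + G @ beta
        y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

        # Reduce the SNP set to a few principal components.
        k = 5
        pcs = PCA(n_components=k).fit_transform(G)

        # Likelihood-ratio test: logistic model with PCs vs. intercept only.
        full = sm.Logit(y, sm.add_constant(pcs)).fit(disp=0)
        null = sm.Logit(y, np.ones((n, 1))).fit(disp=0)
        lr = 2 * (full.llf - null.llf)
        p_value = stats.chi2.sf(lr, df=k)
        print(f"LR statistic = {lr:.2f}, p = {p_value:.3g}")

    The choice of k mirrors the trade-off discussed above: more PCs capture causal SNPs outside the dominant LD block, at the cost of extra degrees of freedom.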

  11. 8. All Polish Conference on Analytical Chemistry: Analytical Chemistry for the Community of the 21. Century

    International Nuclear Information System (INIS)

    Koscielniak, P.; Wieczorek, M.; Kozak, J.

    2010-01-01

    The Book of Abstracts contains short descriptions of the lectures, communications and posters presented during the 8th All Polish Conference on Analytical Chemistry (Cracow, 4-9.07.2010). The scientific programme covered: basic analytical problems, sample preparation, chemometrics and metrology, miniaturization of analytical procedures, environmental analysis, medical analyses, industrial analyses, food analyses, biochemical analyses, and analysis of relics of the past. Several posters were devoted to radiochemical separations, radiochemical analysis, the environmental behaviour of elements important for nuclear science, and proficiency tests.

  12. Analytical prediction of fuel assembly spacer grid loss coefficient

    International Nuclear Information System (INIS)

    Lim, J. S.; Nam, K. I.; Park, S. K.; Kwon, J. T.; Park, W. J.

    2002-01-01

    An analytical model for predicting the fuel assembly spacer grid pressure loss coefficient has been studied. The pressure loss in the gap between the test section wall and the spacer grid was separated from the current model, and a friction drag coefficient on the spacer straps different from that of the high Reynolds number region was used for the low Reynolds number region. The analytical model has been verified against hydraulic pressure drop test results for three types of spacer grids in 5x5 and 16x16 (or 17x17) arrays. The analytical model predicts the pressure loss coefficients obtained from the test results within maximum errors of 12% and 7% for the 5x5 test bundle and the full-size bundle, respectively, at a Reynolds number of 500,000, representative of the core operating condition. This result shows that the analytical model can be used for research on, and design changes of, the nuclear fuel assembly.
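
    For reference, the spacer grid pressure loss coefficient discussed here is conventionally defined from the measured pressure drop and the dynamic head (the exact reference velocity convention may differ in the paper):

        \[ K = \frac{\Delta p_{\mathrm{grid}}}{\tfrac{1}{2}\,\rho V^2} \]

    where Δp_grid is the pressure drop across the grid, ρ the coolant density, and V the bundle-average axial velocity.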

  13. Test set up description and performances for HAWAII-2RG detector characterization at ESTEC

    Science.gov (United States)

    Crouzet, P.-E.; ter Haar, J.; de Wit, F.; Beaufort, T.; Butler, B.; Smit, H.; van der Luijt, C.; Martin, D.

    2012-07-01

    In the framework of the European Space Agency's Cosmic Vision programme, the Euclid mission has the objective of mapping the geometry of the Dark Universe. Galaxies and clusters of galaxies will be observed in the visible and near-infrared wavelengths by an imaging and a spectroscopic channel. For the Near Infrared Spectrometer instrument (NISP), the state-of-the-art HAWAII-2RG detectors will be used, associated with the SIDECAR ASIC readout electronics which will perform the image frame acquisitions. To characterize and validate the performance of these detectors, a test bench has been designed, tested and validated. This publication describes the pre-tests performed to build the setup dedicated to dark current measurements and to tests requiring reasonably uniform light levels (such as conversion gain measurements). Successful cryogenic and vacuum tests on commercial LEDs and photodiodes are shown. An optimized stainless steel feedthrough with a V-groove to pot the flex cable connecting the SIDECAR ASIC to the room temperature board (JADE2) has been designed and tested. The test setup for quantum efficiency measurements, consisting of a lamp, a monochromator, an integrating sphere and a set of cold filters, is currently under construction and will ensure uniform illumination across the detector with variations lower than 2%. A dedicated spot projector for intra-pixel measurements has been designed and built to reach a spot diameter of 5 μm at 920 nm with 2 nm of bandwidth [1].
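
    The abstract does not spell out how conversion gain is derived, but the mean-variance (photon transfer) method is the usual reason uniform illumination is required; as a hedged sketch of that idea, for a shot-noise-limited flat field,

        \[ g\,[\mathrm{e^-/DN}] \approx \frac{\mu_{\mathrm{DN}} - \mu_{\mathrm{dark}}}{\sigma^2_{\mathrm{DN}}} \]

    where μ_DN is the mean signal, μ_dark the dark level, and σ²_DN the signal variance; in practice read noise and fixed-pattern contributions must first be subtracted from the variance.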

  14. Analytical Validation of the ReEBOV Antigen Rapid Test for Point-of-Care Diagnosis of Ebola Virus Infection.

    Science.gov (United States)

    Cross, Robert W; Boisen, Matthew L; Millett, Molly M; Nelson, Diana S; Oottamasathien, Darin; Hartnett, Jessica N; Jones, Abigal B; Goba, Augustine; Momoh, Mambu; Fullah, Mohamed; Bornholdt, Zachary A; Fusco, Marnie L; Abelson, Dafna M; Oda, Shunichiro; Brown, Bethany L; Pham, Ha; Rowland, Megan M; Agans, Krystle N; Geisbert, Joan B; Heinrich, Megan L; Kulakosky, Peter C; Shaffer, Jeffrey G; Schieffelin, John S; Kargbo, Brima; Gbetuwa, Momoh; Gevao, Sahr M; Wilson, Russell B; Saphire, Erica Ollmann; Pitts, Kelly R; Khan, Sheik Humarr; Grant, Donald S; Geisbert, Thomas W; Branco, Luis M; Garry, Robert F

    2016-10-15

    Background. Ebola virus disease (EVD) is a severe viral illness caused by Ebola virus (EBOV). The 2013-2016 EVD outbreak in West Africa is the largest recorded, with >11 000 deaths. Development of the ReEBOV Antigen Rapid Test (ReEBOV RDT) was expedited to provide a point-of-care test for suspected EVD cases. Methods. Recombinant EBOV viral protein 40 antigen was used to derive polyclonal antibodies for RDT and enzyme-linked immunosorbent assay development. ReEBOV RDT limits of detection (LOD), specificity, and interference were analytically validated on the basis of Food and Drug Administration (FDA) guidance. Results. The ReEBOV RDT specificity estimate was 95% for donor serum panels and 97% for donor whole-blood specimens. The RDT demonstrated sensitivity to 3 species of Ebolavirus (Zaire ebolavirus, Sudan ebolavirus, and Bundibugyo ebolavirus) associated with human disease, with no cross-reactivity by pathogens associated with non-EBOV febrile illness, including malaria parasites. Interference testing exhibited no reactivity by medications in common use. The LOD for antigen was 4.7 ng/test in serum and 9.4 ng/test in whole blood. Quantitative reverse transcription-polymerase chain reaction testing of nonhuman primate samples determined the range to be equivalent to 3.0 × 10⁵–9.0 × 10⁸ genomes/mL. Conclusions. The analytical validation presented here contributed to the ReEBOV RDT being the first antigen-based assay to receive FDA and World Health Organization emergency use authorization for this EVD outbreak, in February 2015. © The Author 2016. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail journals.permissions@oup.com.

  15. Learning analytics as a "middle space"

    NARCIS (Netherlands)

    Suthers, D.D.; Verbert, K.; Suthers, D.; Verbert, K.; Duval, E.; Ochoa, X.

    2013-01-01

    Learning Analytics, an emerging field concerned with analyzing the vast data "given off" by learners in technology supported settings to inform educational theory and practice, has from its inception taken a multidisciplinary approach that integrates studies of learning with technological

  16. From observational to analytical morphology of the stratum corneum: progress avoiding hazardous animal and human testings

    Directory of Open Access Journals (Sweden)

    Piérard GE

    2015-03-01

    Gérald E Piérard,1,2 Justine Courtois,1 Caroline Ritacco,1 Philippe Humbert,2,3 Ferial Fanian,3 Claudine Piérard-Franchimont1,4,5 1Laboratory of Skin Bioengineering and Imaging (LABIC), Department of Clinical Sciences, Liège University, Liège, Belgium; 2University of Franche-Comté, Besançon, France; 3Department of Dermatology, University Hospital Saint-Jacques, Besançon, France; 4Department of Dermatopathology, Unilab Lg, University Hospital of Liège, Liège, Belgium; 5Department of Dermatology, Regional Hospital of Huy, Huy, Belgium Background: In cosmetic science, noninvasive sampling of the upper part of the stratum corneum is conveniently performed using strippings with adhesive-coated discs (SACD) and cyanoacrylate skin surface strippings (CSSSs). Methods: Under controlled conditions, it is possible to scrutinize SACD and CSSS with objectivity using appropriate methods of analytical morphology. These procedures apply to a series of clinical conditions including xerosis grading, comedometry, corneodynamics, corneomelametry, corneosurfametry, corneoxenometry, and dandruff assessment. Results: With any of the analytical evaluations, SACD and CSSS provide specific salient information that is useful in the field of cosmetology. In particular, both methods appear valuable and complementary in assessing the human skin compatibility of personal skincare products. Conclusion: A set of quantitative analytical methods applicable to the minimally invasive and low-cost SACD and CSSS procedures allows for a sound assessment of cosmetic effects on the stratum corneum. Under regular conditions, both methods are painless and do not induce adverse events. Globally, CSSS appears more precise and informative than the regular SACD stripping. Keywords: irritation, morphometry, quantitative morphology, stripping

  17. Analytical Evaluation of Preliminary Drop Tests Performed to Develop a Robust Design for the Standardized DOE Spent Nuclear Fuel Canister

    International Nuclear Information System (INIS)

    Ware, A.G.; Morton, D.K.; Smith, N.L.; Snow, S.D.; Rahl, T.E.

    1999-01-01

    The Department of Energy (DOE) has developed a design concept for a set of standard canisters for the handling, interim storage, transportation, and disposal of DOE spent nuclear fuel (SNF) in the national repository. The standardized DOE SNF canister has to be capable of handling virtually all of the DOE SNF in a variety of potential storage and transportation systems. It must also be acceptable to the repository, based on current and anticipated future requirements. This expected usage mandates a robust design. The canister design has four unique geometries, with lengths of approximately 10 feet or 15 feet, and an outside nominal diameter of 18 inches or 24 inches. The canister has been developed to withstand a drop from 30 feet onto a rigid (flat) surface, sustaining only minor damage, but no rupture, to the pressure (containment) boundary. The majority of the end drop-induced damage is confined to the skirt and lifting/stiffening ring components, which can be removed if desired after an accidental drop. A canister with its skirt and stiffening ring removed after an accidental drop can continue to be used in service, with appropriate operational steps being taken. Features of the design concept have been proven through drop testing and finite element analyses of smaller test specimens. Finite element analyses also validated the canister design for drops onto a rigid (flat) surface for a variety of canister orientations at impact, from vertical to 45 degrees off vertical. Actual 30-foot drop testing has also been performed to verify the final design, though limited to just two full-scale test canister drops. In each case, the analytical models accurately predicted the canister response.

  18. On the Performance of Three In-Memory Data Systems for On Line Analytical Processing

    Directory of Open Access Journals (Sweden)

    Ionut HRUBARU

    2017-01-01

    In-memory database systems are among the most recent and most promising Big Data technologies, being developed and released either as brand new distributed systems or as extensions of old monolithic (centralized) database systems. As the name suggests, in-memory systems cache all the data in special memory structures. Many are part of the NewSQL strand and aim to bridge the gap between OLTP and OLAP in so-called Hybrid Transactional/Analytical Processing (HTAP) systems. This paper aims to test the performance of using such systems for TPC-H analytical workloads. Performance is analyzed in terms of data loading, memory footprint and execution time of the TPC-H query set for three in-memory data systems: Oracle, SQL Server and MemSQL. The tests are subsequently deployed on classical on-disk architectures and the results compared to the in-memory solutions. As in-memory is an enterprise edition feature, the associated costs are also considered.

  19. User-calibration of Mettler AT200 analytical balance

    International Nuclear Information System (INIS)

    Estill, J.

    1996-01-01

    The purpose of this technical implementing procedure (TIP) is to describe the calibration of the Mettler AT200 analytical balance or a similar type of balance (henceforth called the balance). This balance is used for activities of the Scientific Investigation Plan (SIP) "Metal Barrier Selection and Testing" (SIP-CM-01, WBS nr. 1.2.2.5.1). In particular, it will be used for Activity E-20-50, "Long-Term Corrosion Studies." The balance will be used for weighing test specimens and reagent chemicals; however, it is not limited to these uses. The calibration procedure consists of activating the internal (self) calibration of the apparatus, and weighing and recording traceable standards. The balance is equipped with self (internal) calibration and linearization capabilities. It has an internal (built-in) set of weights which are used for self calibration. The standard weights are traceable to the National Institute of Standards and Technology (NIST)

  20. An evaluation system of the setting up of predictive maintenance programmes

    International Nuclear Information System (INIS)

    Carnero, MaCarmen

    2006-01-01

    Predictive Maintenance can provide an increase in safety, quality and availability in industrial plants. However, setting up a Predictive Maintenance Programme is a strategic decision that until now has lacked analysis of the questions related to its setting up, management and control. In this paper, an evaluation system is proposed that supports decision making on the feasibility of setting up such a programme. The evaluation system uses a combination of operational research tools: the Analytic Hierarchy Process, decision rules and Bayesian tools. This system is a help tool available to the managers of Predictive Maintenance Programmes which can both increase the number of Predictive Maintenance Programmes set up and avoid the failure of these programmes. The evaluation system has been tested in a petrochemical plant and in a food-industry plant.
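
    As an illustration of the Analytic Hierarchy Process step such an evaluation system rests on, the sketch below computes priority weights and a consistency ratio from a pairwise comparison matrix; the criteria and judgment values are hypothetical, not taken from the paper.

        # AHP priority weights via the principal eigenvector of a
        # reciprocal pairwise comparison matrix (3 hypothetical criteria).
        import numpy as np

        A = np.array([[1,   3,   5],
                      [1/3, 1,   2],
                      [1/5, 1/2, 1]])

        eigvals, eigvecs = np.linalg.eig(A)
        i = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, i].real)
        w /= w.sum()                           # normalized priority vector

        n = A.shape[0]
        ci = (eigvals.real[i] - n) / (n - 1)   # consistency index
        ri = 0.58                              # Saaty's random index for n = 3
        print("weights:", np.round(w, 3), "CR =", round(ci / ri, 3))

    A consistency ratio (CR) below about 0.1 is the usual threshold for accepting the judgments as coherent.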

  1. Latent structure of the Wisconsin Card Sorting Test: a confirmatory factor analytic study.

    Science.gov (United States)

    Greve, Kevin W; Stickle, Timothy R; Love, Jeffrey M; Bianchini, Kevin J; Stanford, Matthew S

    2005-05-01

    The present study represents the first large scale confirmatory factor analysis of the Wisconsin Card Sorting Test (WCST). The results generally support the three factor solutions reported in the exploratory factor analysis literature. However, only the first factor, which reflects general executive functioning, is statistically sound. The secondary factors, while likely reflecting meaningful cognitive abilities, are less stable except when all subjects complete all 128 cards. It is likely that having two discontinuation rules for the WCST has contributed to the varied factor analytic solutions reported in the literature and early discontinuation may result in some loss of useful information. Continued multivariate research will be necessary to better clarify the processes underlying WCST performance and their relationships to one another.

  2. Preservatives and neutralizing substances in milk: analytical sensitivity of official specific and nonspecific tests, microbial inhibition effect, and residue persistence in milk

    Directory of Open Access Journals (Sweden)

    Livia Cavaletti Corrêa da Silva

    2015-09-01

    Milk fraud has been a recurring problem in Brazil; thus, it is important to know the effects of the most frequently used preservatives and neutralizing substances as well as the detection capability of the official tests. The objective of this study was to evaluate the analytical sensitivity of the legislation-described tests and of nonspecific microbial inhibition tests, and to investigate the effect of such substances on microbial growth inhibition and the persistence of detectable residues after 24/48 h of refrigeration. Batches of raw milk, free from any contaminant, were divided into aliquots and mixed with different concentrations of formaldehyde, hydrogen peroxide, sodium hypochlorite, chlorine, chlorinated alkaline detergent, or sodium hydroxide. The analytical sensitivity of the official tests was 0.005%, 0.003%, and 0.013% for formaldehyde, hydrogen peroxide, and hypochlorite, respectively. Chlorine and chlorinated alkaline detergent were not detected by the regulatory tests. In the tests for neutralizing substances, sodium hydroxide could not be detected when acidity was accurately neutralized. The yogurt culture test gave results similar to those obtained by the official tests for the detection of specific substances. Concentrations of 0.05% of formaldehyde, 0.003% of hydrogen peroxide and 0.013% of sodium hypochlorite significantly reduced (P

  3. Development of a test set for adjustment of residential furnaces and boilers. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1980-01-01

    A program was undertaken to design and develop a portable test set for simplified field adjustment of residential furnaces and boilers to achieve peak operating efficiency. Advanced technology was applied to provide continuous analysis of flue gases and the display of temperature, oxygen concentrations, smoke value and furnace efficiency. Prototype models were constructed and delivered to Brookhaven National Laboratory for further testing. A survey of furnace dealers was conducted, and a commercialization plan was developed based on survey responses and the status of the equipment developed under the program. Goals for a marketable test set and development steps to achieve a projected energy savings were determined and recommended. Recommendations for specific areas of further development are included.
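
    The report does not state the efficiency computation the test set displays; a common approximation implemented in flue gas combustion analyzers is the Siegert stack-loss formula, shown here as an assumed illustration:

        \[ \eta \approx 100\% - q_A, \qquad q_A = (T_{\mathrm{flue}} - T_{\mathrm{air}})\left(\frac{A_2}{21 - O_2} + B\right) \]

    where O₂ is the measured flue gas oxygen in volume percent and A₂, B are fuel-dependent constants (for light fuel oil, commonly quoted values are roughly 0.68 and 0.007). This ties the displayed efficiency directly to the measured flue temperature and oxygen concentration.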

  4. Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention

    Directory of Open Access Journals (Sweden)

    Samar Al-Hajj

    2017-09-01

    Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods: Inspired by the Delphi method, we introduced a novel methodology, group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders’ observations, audio and video recordings, questionnaires, and follow-up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results: The GA methodology triggered the emergence of ‘common ground’ among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders’ verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve ‘common ground’ among diverse stakeholders about health data and their implications.

  5. Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention.

    Science.gov (United States)

    Al-Hajj, Samar; Fisher, Brian; Smith, Jennifer; Pike, Ian

    2017-09-12

    Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods: Inspired by the Delphi method, we introduced a novel methodology, group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders' observations, audio and video recordings, questionnaires, and follow-up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results: The GA methodology triggered the emergence of 'common ground' among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders' verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve 'common ground' among diverse stakeholders about health data and their implications.

  6. Use of hot set and gel content test in QA/QC for irradiation of commercial products

    International Nuclear Information System (INIS)

    Ruzalina Baharin; Siti Aiasah Hashim; Zulkafli Ghazali; Zarina Mohd Noor

    2006-01-01

    Currently, irradiation of polyolefin heat-shrinkable tube is the main commercial activity at Alurtron. Other products include medical devices and cosmetic products. As required by the quality management system, ISO 9001:2000, Alurtron is responsible for ensuring that customers receive the irradiation dose they requested. At present, two test methods are employed, namely hot set and gel content measurement, to verify the received dose. The hot set test is a physical test that determines the longitudinal shrinkage of the irradiated products, whereas gel content measurement determines, to a certain extent, the degree of crosslinking that has occurred in the polymeric material of the customer products upon irradiation. Both tests are routinely used to cross-check the dose required by the customer. This paper outlines the correlation between hot set and gel content measurements of customer products. (Author)
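
    As conventionally defined (the abstract does not give Alurtron's exact procedures or acceptance limits), the two quantities are:

        \[ \mathrm{Gel\ content}\,(\%) = \frac{m_{\mathrm{gel}}}{m_0}\times 100, \qquad \mathrm{Shrinkage}\,(\%) = \frac{L_0 - L}{L_0}\times 100 \]

    where m_gel is the dried mass remaining after solvent extraction of an initial sample mass m_0, and L_0, L are the specimen lengths before and after the hot set exposure. A higher absorbed dose produces more crosslinking and hence a higher gel content, which is what makes both measurements usable as dose cross-checks.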

  7. A design for a high voltage magnet coil ringer test set

    International Nuclear Information System (INIS)

    Koska, W.; Sims, R.E.

    1992-04-01

    By discharging a bank of charged capacitors through a high power SCR switch into an SSC dipole magnet assembly, it is possible to "ring" the coil and develop a voltage stress of greater than 50 volts turn-to-turn, thereby verifying the insulation integrity. We present an overview of the test set design for a 2 kV isolated SCR firing circuit, including safety features, selectable capacitor banks, and a digital waveform storage system. Results from testing typical coils and magnets are included. Possible upgrades are also discussed.
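
    For orientation, a capacitor bank of capacitance C discharged into a coil of inductance L rings at roughly the natural frequency below; the turn-to-turn figure quoted above follows, to first order, from dividing the applied voltage across the winding's N turns (a simplification, since fast voltage fronts load the end turns disproportionately):

        \[ f \approx \frac{1}{2\pi\sqrt{LC}}, \qquad V_{\mathrm{turn}} \sim \frac{V_{\mathrm{bank}}}{N} \]

    Selectable capacitor banks thus let the operator tune the ring frequency and stress level to the coil under test.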

  8. Closing the brain-to-brain loop in laboratory testing.

    Science.gov (United States)

    Plebani, Mario; Lippi, Giuseppe

    2011-07-01

    The delivery of laboratory services was described 40 years ago and defined by the foremost concept of the "brain-to-brain turnaround time loop". This concept comprises several processes, including the final step, which is the action undertaken on the patient based on laboratory information. Unfortunately, the need for systematic feedback to improve the value of laboratory services has been poorly understood and, even more risky, poorly applied in daily laboratory practice. Currently, major problems arise from the unavailability of consensually accepted quality specifications for the extra-analytical phases of laboratory testing. This, in turn, does not allow clinical laboratories to calculate a budget for the "patient-related total error". The definition and use of the term "total error" refer only to the analytical phase, and it should be better defined as "total analytical error" to avoid any confusion and misinterpretation. According to the hierarchical approach to classifying strategies for setting analytical quality specifications, the "assessment of the effect of analytical performance on specific clinical decision-making" is at the top and should therefore be applied as much as possible to direct analytical efforts towards effective goals. In addition, an increasing number of laboratories worldwide are adopting risk management strategies such as FMEA, FRACAS, LEAN and Six Sigma, since these techniques allow the identification of the most critical steps in the total testing process and reduce the patient-related risk of error. As a matter of fact, an increasing number of laboratory professionals recognize the importance of understanding and monitoring every step in the total testing process, including the appropriateness of the test request as well as the appropriate interpretation and utilization of test results.
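
    As one concrete example of the quality tools mentioned, the sigma metric widely used alongside Six Sigma programmes in laboratory medicine relates the allowable total error (TEa) to the observed bias and imprecision (a standard formula, not one stated in the article):

        \[ \sigma = \frac{\mathrm{TEa} - |\mathrm{bias}|}{\mathrm{CV}} \]

    with all quantities expressed in percent at a given analyte concentration; a process at or above 6 sigma is considered world-class, while values below 3 flag assays needing tighter control.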

  9. Analytic cognitive style predicts religious and paranormal belief

    OpenAIRE

    Pennycook, Gordon; Cheyne, James Allan; Seli, Paul; Koehler, Derek J.; Fugelsang, Jonathan A.

    2012-01-01

    An analytic cognitive style denotes a propensity to set aside highly salient intuitions when engaging in problem solving. We assess the hypothesis that an analytic cognitive style is associated with a history of questioning, altering, and rejecting (i.e., unbelieving) supernatural claims, both religious and paranormal. In two studies, we examined associations of God beliefs, religious engagement (attendance at religious services, praying, etc.), conventional religious beliefs (heaven, miracle...

  10. Data mining and business analytics with R

    CERN Document Server

    Ledolter, Johannes

    2013-01-01

    Collecting, analyzing, and extracting valuable information from a large amount of data requires easily accessible, robust, computational and analytical tools. Data Mining and Business Analytics with R utilizes the open source software R for the analysis, exploration, and simplification of large high-dimensional data sets. As a result, readers are provided with the needed guidance to model and interpret complicated data and become adept at building powerful models for prediction and classification. Highlighting both underlying concepts and practical computational skills, Data Mining

  11. Analytical Music Therapy with Adults in Mental Health and in Counseling Work

    DEFF Research Database (Denmark)

    Pedersen, Inge Nygaard

    2002-01-01

    This chapter gives an overview of Analytical (oriented) Music Therapy applied in psychiatry and in counseling work. Definitions, setting, methods, counter-transference conditions, referral criteria, understanding of the musical structure and documentation are in focus.

  12. Comparison of in-plant performance test data with analytic prediction of reactor safety system injection transient (U)

    International Nuclear Information System (INIS)

    Roy, B.N.; Neill, C.H. Jr.

    1993-01-01

    This paper compares performance test data from injection transients for both subsystems of the Supplementary Safety System of the Savannah River Site production reactor with analytical predictions from an in-house thermal hydraulic computer code. The code was initially developed for design validation of the new Supplementary Safety System subsystem, but is shown to be equally capable of predicting the performance of the existing subsystem, even though the two subsystems' injection transients have marked differences. The code itself was discussed, and its validation using prototypic tests with simulated fluids was reported, in an earlier paper (Roy and Nomm 1991)

  13. Expanding the test set: Chemicals with potential to disrupt mammalian brain development

    Science.gov (United States)

    High-throughput test methods including molecular, cellular, and alternative species-based assays that examine critical events of normal brain development are being developed for detection of developmental neurotoxicants. As new assays are developed, a "training set" of chemicals i...

  14. Deriving Earth Science Data Analytics Requirements

    Science.gov (United States)

    Kempler, Steven J.

    2015-01-01

    Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of the increasing number and volume of Earth science data sets has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, goals which are very different from those in business and which require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide an early analysis of the data analytics tools/techniques requirements that would support specific ESDA goal types. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  15. Pre-analytical and analytical variation of drug determination in segmented hair using ultra-performance liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Nielsen, Marie Katrine Klose; Johansen, Sys Stybe; Linnet, Kristian

    2014-01-01

    Assessment of the total uncertainty of analytical methods for the measurement of drugs in human hair has mainly been derived from the analytical variation. However, in hair analysis several other sources of uncertainty contribute to the total uncertainty. Particularly in segmental hair analysis, pre-analytical variations associated with the sampling and segmentation may be significant factors in the assessment of the total uncertainty budget. The aim of this study was to develop and validate a method for the analysis of 31 common drugs in hair using ultra-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS), with focus on the assessment of both the analytical and pre-analytical sampling variations. The validated method was specific, accurate (80-120%), and precise (CV ≤ 20%) across a wide linear concentration range from 0.025-25 ng/mg for most compounds. The analytical variation was estimated to be less than 15% for almost all compounds. The method was successfully applied to 25 segmented hair specimens from deceased drug addicts showing a broad pattern of poly-drug use. The pre-analytical sampling variation was estimated from genuine duplicate measurements of two bundles of hair collected from each subject, after subtraction of the analytical component. For the most frequently detected analytes, the pre-analytical variation was estimated to be 26-69%. Thus, the pre-analytical variation was 3-7-fold larger than the analytical variation (7-13%) and hence the dominant component of the total variation (29-70%). The present study demonstrates the importance of including the pre-analytical variation in the assessment of the total uncertainty budget and in the setting of the 95% uncertainty interval (±2CV_T). Excluding the pre-analytical sampling variation could significantly affect the interpretation of results from segmental hair analysis. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
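
    The total variation reported above is consistent with adding the pre-analytical and analytical components in quadrature, the usual propagation rule for independent sources of variation:

        \[ \mathrm{CV}_T = \sqrt{\mathrm{CV}_{\mathrm{pre}}^2 + \mathrm{CV}_{\mathrm{ana}}^2}, \qquad \sqrt{26^2 + 13^2} \approx 29\%, \quad \sqrt{69^2 + 13^2} \approx 70\% \]

    which reproduces both ends of the reported total range (29-70%) and feeds directly into the ±2CV_T uncertainty interval.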

  16. Analytical Work in Support of the Design and Operation of Two Dimensional Self Streamlining Test Sections

    Science.gov (United States)

    Judd, M.; Wolf, S. W. D.; Goodyer, M. J.

    1976-01-01

    A method has been developed for accurately computing the imaginary flow fields outside a flexible walled test section, applicable to lifting and non-lifting models. The tolerances in the setting of the flexible walls introduce only small levels of aerodynamic interference at the model. While it is not possible to apply corrections for the interference effects, they may be reduced by improving the setting accuracy of the portions of wall immediately above and below the model. Interference effects of the truncation of the length of the streamlined portion of a test section are brought to an acceptably small level by the use of a suitably long test section with the model placed centrally.

  17. Medicaid Analytic eXtract (MAX) General Information

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicaid Analytic eXtract (MAX) data is a set of person-level data files on Medicaid eligibility, service utilization, and payments. The MAX data are created to...

  18. Clinical Neuropathology practice news 1-2014: Pyrosequencing meets clinical and analytical performance criteria for routine testing of MGMT promoter methylation status in glioblastoma

    Science.gov (United States)

    Preusser, Matthias; Berghoff, Anna S.; Manzl, Claudia; Filipits, Martin; Weinhäusel, Andreas; Pulverer, Walter; Dieckmann, Karin; Widhalm, Georg; Wöhrer, Adelheid; Knosp, Engelbert; Marosi, Christine; Hainfellner, Johannes A.

    2014-01-01

    Testing of the MGMT promoter methylation status in glioblastoma is relevant for clinical decision making and research applications. Two recent and independent phase III therapy trials confirmed a prognostic and predictive value of the MGMT promoter methylation status in elderly glioblastoma patients. Several methods for MGMT promoter methylation testing have been proposed, but seem to be of limited test reliability. Therefore, and also due to feasibility reasons, translation of MGMT methylation testing into routine use has been protracted so far. Pyrosequencing after prior DNA bisulfite modification has emerged as a reliable, accurate, fast and easy-to-use method for MGMT promoter methylation testing in tumor tissues (including formalin-fixed and paraffin-embedded samples). We performed an intra- and inter-laboratory ring trial which demonstrates a high analytical performance of this technique. Thus, pyrosequencing-based assessment of MGMT promoter methylation status in glioblastoma meets the criteria of high analytical test performance and can be recommended for clinical application, provided that strict quality control is performed. Our article summarizes clinical indications, practical instructions and open issues for MGMT promoter methylation testing in glioblastoma using pyrosequencing. PMID:24359605

  19. Summary report of the TC regional project on 'QA/QC of nuclear analytical techniques' RER-2-004 (1999-2001)

    International Nuclear Information System (INIS)

    Akgun, A. Fadil

    2002-01-01

    This report provides a summary of the Cekmece Nuclear Research and Training Centre's participation in the Project. The Project helped in setting up a quality assurance system at the Centre and resulted in progress in analytical proficiency, as shown by the proficiency test results. The main accomplishments are listed along with the tasks still to be done.

  20. Analytic information processing style in epilepsy patients.

    Science.gov (United States)

    Buonfiglio, Marzia; Di Sabato, Francesco; Mandillo, Silvia; Albini, Mariarita; Di Bonaventura, Carlo; Giallonardo, Annateresa; Avanzini, Giuliano

    2017-08-01

    Learning processing is relevant to the study of epileptogenesis, given the pivotal role that neuroplasticity assumes in both mechanisms. Recently, evoked potential analyses showed a link between analytic cognitive style and altered neural excitability in both migraine and healthy subjects, regardless of cognitive impairment or psychological disorders. In this study we evaluated the analytic/global and visual/auditory perceptual dimensions of cognitive style in patients with epilepsy. Twenty-five cryptogenic temporal lobe epilepsy (TLE) patients, matched with 25 idiopathic generalized epilepsy (IGE) sufferers and 25 healthy volunteers, were recruited and participated in three cognitive style tests: the "Sternberg-Wagner Self-Assessment Inventory", the C. Cornoldi test series called AMOS, and the Mariani Learning Style Questionnaire. Our results demonstrate a significant association between analytic cognitive style and both IGE and TLE, with a predominantly auditory analytic style in IGE and a predominantly visual analytic style in TLE (ANOVA: p values <0.0001). These findings should encourage further research to investigate information processing style and its neurophysiological correlates in epilepsy. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. A Visual Analytics Approach for Extracting Spatio-Temporal Urban Mobility Information from Mobile Network Traffic

    Directory of Open Access Journals (Sweden)

    Euro Beinat

    2012-11-01

    In this paper we present a visual analytics approach for deriving spatio-temporal patterns of collective human mobility from a vast mobile network traffic data set. More than 88 million movements between pairs of radio cells—so-called handovers—served as a proxy for more than two months of mobility within four urban test areas in Northern Italy. In contrast to previous work, our approach relies entirely on visualization and mapping techniques, implemented in several software applications. We purposefully avoid statistical or probabilistic modeling and, nonetheless, reveal characteristic and exceptional mobility patterns. The results show, for example, surprising similarities and symmetries amongst the total mobility and people flows between the test areas. Moreover, the exceptional patterns detected can be associated with real-world events such as soccer matches. We conclude that the visual analytics approach presented can shed new light on large-scale collective urban mobility behavior and thus helps to better understand the “pulse” of dynamic urban systems.
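
    As a small illustration of the kind of aggregation that precedes such visualizations, the sketch below bins handover records into hourly origin-destination flows; the file name and column layout are assumptions, not the authors' data format.

        # Turn raw handover records into an hourly origin-destination matrix.
        import pandas as pd

        df = pd.read_csv("handovers.csv")   # hypothetical: cell_from, cell_to, timestamp
        df["hour"] = pd.to_datetime(df["timestamp"]).dt.hour

        # Aggregate handovers into hourly flows between radio cells.
        flows = (df.groupby(["hour", "cell_from", "cell_to"])
                   .size()
                   .rename("count")
                   .reset_index())

        # Pivot one hour into a matrix suitable for a heat-map style visual.
        od = flows[flows["hour"] == 8].pivot(index="cell_from",
                                             columns="cell_to",
                                             values="count").fillna(0)
        print(od.head())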

  2. HTGR analytical methods and design verification

    International Nuclear Information System (INIS)

    Neylan, A.J.; Northup, T.E.

    1982-05-01

    Analytical methods for the high-temperature gas-cooled reactor (HTGR) include development, update, verification, documentation, and maintenance of all computer codes for HTGR design and analysis. This paper presents selected nuclear, structural mechanics, seismic, and systems analytical methods related to the HTGR core. This paper also reviews design verification tests in the reactor core, reactor internals, steam generator, and thermal barrier

  3. Experience With Routine Vaginal pH Testing in a Family Practice Setting

    Directory of Open Access Journals (Sweden)

    Adriana J. Pavletic

    2004-01-01

    Background: Despite recommendations by the Centers for Disease Control and the American College of Obstetricians and Gynecologists, pH testing is infrequently performed during the evaluation of vaginitis. Consequently, little information exists on its use in a primary care setting.

  4. Paper-Based Analytical Device for Zinc Ion Quantification in Water Samples with Power-Free Analyte Concentration

    Directory of Open Access Journals (Sweden)

    Hiroko Kudo

    2017-04-01

    Insufficient sensitivity is a general issue of colorimetric paper-based analytical devices (PADs) for trace analyte detection, such as metal ions, in environmental water. This paper demonstrates the colorimetric detection of zinc ions (Zn2+) on a paper-based analytical device with an integrated analyte concentration system. Concentration of Zn2+ ions from an enlarged sample volume (1 mL) has been achieved with the aid of a colorimetric Zn2+ indicator (Zincon) electrostatically immobilized onto a filter paper substrate, in combination with highly water-absorbent materials. Analyte concentration as well as sample pretreatment, including pH adjustment and interferent masking, has been elaborated. The resulting device enables colorimetric quantification of Zn2+ in environmental water samples (tap water, river water) from a single sample application. The achieved detection limit of 0.53 μM is a significant improvement over that of a commercial colorimetric Zn2+ test paper (9.7 μM), demonstrating the efficiency of the developed analyte concentration system, which does not require any equipment.
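
    The abstract does not state how the 0.53 μM detection limit was derived; the conventional definition for a colorimetric calibration, given here as an assumed illustration, is

        \[ \mathrm{LOD} = \frac{3\sigma_{\mathrm{blank}}}{S} \]

    where σ_blank is the standard deviation of the blank response and S is the slope of the calibration curve, so concentrating the analyte from a larger sample volume raises S and lowers the LOD.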

  5. Test strip and method for its use

    International Nuclear Information System (INIS)

    1981-01-01

    A test strip device is described which is useful in performing binding assays involving antigens, antibodies, hormones, vitamins, metabolites or pharmacological agents. The device is capable of application to analytical methods in which a set of sequential test reactions is involved and in which a minute sample size may be used. This test strip is particularly useful in radioimmunoassays. The use of the device is illustrated in radioimmunoassays for 1) thyroxine in serum, 2) the triiodothyronine binding capacity of serum and 3) folic acid and its analogues in serum. (U.K.)

  6. Analytical performance, agreement and user-friendliness of six point-of-care testing urine analysers for urinary tract infection in general practice

    NARCIS (Netherlands)

    Schot, Marjolein J C; van Delft, Sanne; Kooijman-Buiting, Antoinette M J; de Wit, Niek J; Hopstaken, Rogier M

    2015-01-01

    OBJECTIVE: Various point-of-care testing (POCT) urine analysers are commercially available for routine urine analysis in general practice. The present study compares analytical performance, agreement and user-friendliness of six different POCT urine analysers for diagnosing urinary tract infection

  7. 42 CFR 493.1250 - Condition: Analytic systems.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Condition: Analytic systems. 493.1250 Section 493.1250 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES... quality testing. The laboratory must monitor and evaluate the overall quality of the analytic systems and...

  8. Customer Intelligence Analytics on Social Networks

    Directory of Open Access Journals (Sweden)

    Brano MARKIĆ

    2016-08-01

    Full Text Available Discovering needs, habits and consumer behavior is the primary task of marketing analytics. It is necessary to integrate marketing and analytical skills with IT skills. Such knowledge integration allows access to data (structured and unstructured), their analysis, and the discovery of information about the opinions, attitudes, needs and behavior of customers. The paper sets out the hypothesis that software tools can collect data (messages) from social networks, analyze the content of those messages, and identify customer attitudes towards a product, service or tourist destination, with the ultimate goal of improving customer relations. Experimental results are based on the analysis of the content of the social network Facebook using packages and functions of the R language. The language showed satisfactory application and development power in the analysis of textual data on social networks for marketing analytics.
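
    The core of such content analysis is tokenizing messages and tallying attitude-bearing words. The authors worked in R; the sketch below shows the same idea in Python for illustration, with an invented message set and a deliberately crude sentiment lexicon.

    ```python
    from collections import Counter
    import re

    # Hypothetical customer messages harvested from a social network page.
    messages = [
        "Great service, loved the destination!",
        "Terrible support, never again.",
        "Loved the hotel, great view.",
    ]

    positive = {"great", "loved", "good"}
    negative = {"terrible", "never", "bad"}

    tokens = [w for m in messages for w in re.findall(r"[a-z']+", m.lower())]
    counts = Counter(tokens)

    # Crude lexicon-based attitude tally over all messages.
    score = sum(counts[w] for w in positive) - sum(counts[w] for w in negative)
    print(counts.most_common(5), "attitude score:", score)
    ```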

  9. Cross-sectional evaluation of an internet-based hearing screening test in an occupational setting

    NARCIS (Netherlands)

    Sheikh Rashid, Marya; Leensen, Monique Cj; de Laat, Jan Apm; Dreschler, Wouter A.

    2017-01-01

    Objectives The Occupational Earcheck (OEC) is an online test for detecting high-frequency hearing loss for the purposes of occupational hearing screening. In this study, we evaluated the OEC in an occupational setting in order to assess test sensitivity, specificity, and validity. Methods A

  10. Constraint-Referenced Analytics of Algebra Learning

    Science.gov (United States)

    Sutherland, Scot M.; White, Tobin F.

    2016-01-01

    The development of the constraint-referenced analytics tool for monitoring algebra learning activities presented here came from the desire to firstly, take a more quantitative look at student responses in collaborative algebra activities, and secondly, to situate those activities in a more traditional introductory algebra setting focusing on…

  11. MODULAR ANALYTICS: A New Approach to Automation in the Clinical Laboratory.

    Science.gov (United States)

    Horowitz, Gary L; Zaman, Zahur; Blanckaert, Norbert J C; Chan, Daniel W; Dubois, Jeffrey A; Golaz, Olivier; Mensi, Noury; Keller, Franz; Stolz, Herbert; Klingler, Karl; Marocchi, Alessandro; Prencipe, Lorenzo; McLawhon, Ronald W; Nilsen, Olaug L; Oellerich, Michael; Luthe, Hilmar; Orsonneau, Jean-Luc; Richeux, Gérard; Recio, Fernando; Roldan, Esther; Rymo, Lars; Wicktorsson, Anne-Charlotte; Welch, Shirley L; Wieland, Heinrich; Grawitz, Andrea Busse; Mitsumaki, Hiroshi; McGovern, Margaret; Ng, Katherine; Stockmann, Wolfgang

    2005-01-01

    MODULAR ANALYTICS (Roche Diagnostics) (MODULAR ANALYTICS, Elecsys and Cobas Integra are trademarks of a member of the Roche Group) represents a new approach to automation for the clinical chemistry laboratory. It consists of a control unit, a core unit with a bidirectional multitrack rack transportation system, and three distinct kinds of analytical modules: an ISE module, a P800 module (44 photometric tests, throughput of up to 800 tests/h), and a D2400 module (16 photometric tests, throughput up to 2400 tests/h). MODULAR ANALYTICS allows customised configurations for various laboratory workloads. The performance and practicability of MODULAR ANALYTICS were evaluated in an international multicentre study at 16 sites. Studies included precision, accuracy, analytical range, carry-over, and workflow assessment. More than 700 000 results were obtained during the course of the study. Median between-day CVs were typically less than 3% for clinical chemistries and less than 6% for homogeneous immunoassays. Median recoveries for nearly all standardised reference materials were within 5% of assigned values. Method comparisons versus current existing routine instrumentation were clinically acceptable in all cases. During the workflow studies, the work from three to four single workstations was transferred to MODULAR ANALYTICS, which offered over 100 possible methods, with reduction in sample splitting, handling errors, and turnaround time. Typical sample processing time on MODULAR ANALYTICS was less than 30 minutes, an improvement from the current laboratory systems. By combining multiple analytic units in flexible ways, MODULAR ANALYTICS met diverse laboratory needs and offered improvement in workflow over current laboratory situations. It increased overall efficiency while maintaining (or improving) quality.
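
    The between-day CVs quoted above follow from a standard calculation; a minimal sketch on invented daily control results:

    ```python
    import numpy as np

    # Hypothetical daily QC measurements of one analyte on the same control material.
    daily_means = np.array([4.02, 3.98, 4.05, 4.01, 3.97, 4.04, 4.00])

    cv_percent = daily_means.std(ddof=1) / daily_means.mean() * 100
    print(f"between-day CV = {cv_percent:.2f}%")  # < 3% meets the cited target
    ```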

  12. Analytical chemistry

    International Nuclear Information System (INIS)

    Anon.

    1985-01-01

    The division for Analytical Chemistry continued its efforts to develop an accurate method for the separation of trace amounts from mixtures that contain various other elements. Ion exchange chromatography is of special importance in this regard. New separation techniques were tried on certain trace amounts in South African standard rock materials and special ceramics. Methods were also tested for the separation of carrier-free radioisotopes from irradiated cyclotron discs.

  13. Importance of implementing an analytical quality control system in a core laboratory.

    Science.gov (United States)

    Marques-Garcia, F; Garcia-Codesal, M F; Caro-Narros, M R; Contreras-SanFeliciano, T

    2015-01-01

    The aim of the clinical laboratory is to provide useful information for screening, diagnosis and monitoring of disease. The laboratory should ensure the quality of the extra-analytical and analytical processes, based on set criteria. To do this, it develops and implements a system of internal quality control, designed to detect errors, and compares its data with those of other laboratories through external quality control. In this way, the laboratory has a tool to verify that the set objectives are being met and, in the case of errors, to take corrective actions that ensure the reliability of the results. This article describes the design and implementation of an internal quality control protocol, with periodic assessment intervals (6 months) to determine compliance with pre-determined specifications (Stockholm Consensus(1)). A total of 40 biochemical and 15 immunochemical methods were evaluated using three different control materials. Next, a standard operating procedure was planned to develop a system of internal quality control that included calculating the error of the analytical process, setting quality specifications, and verifying compliance. The quality control data were then statistically expressed as means, standard deviations, and coefficients of variation, as well as systematic, random, and total errors. The quality specifications were then fixed and the operational rules to apply in the analytical process were calculated. Finally, our data were compared with those of other laboratories through an external quality assurance program. The development of an analytical quality control system is a highly structured process. It should be designed to detect errors that compromise the stability of the analytical process. The laboratory should review its quality indicators (systematic, random and total error) at regular intervals, in order to ensure that they are meeting pre-determined specifications and, if not, apply the appropriate corrective actions.
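
    The error calculations described (systematic, random, and total error) follow a standard pattern. A sketch under the common convention TE = |bias| + 1.65 x CV, using invented control data and an invented quality specification:

    ```python
    import numpy as np

    # Hypothetical internal QC results against an assigned target value.
    results = np.array([5.1, 5.3, 5.0, 5.2, 5.4, 5.1, 5.2])
    target = 5.0

    mean = results.mean()
    bias_pct = (mean - target) / target * 100           # systematic error
    cv_pct = results.std(ddof=1) / mean * 100           # random error
    total_error_pct = abs(bias_pct) + 1.65 * cv_pct     # total error (95% one-sided)

    spec = 10.0  # quality specification (%), e.g. from the Stockholm hierarchy
    print(f"bias {bias_pct:.1f}%, CV {cv_pct:.1f}%, TE {total_error_pct:.1f}%",
          "PASS" if total_error_pct <= spec else "FAIL")
    ```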

  14. Multiplicity distributions of gluon and quark jets and a test of QCD analytic calculations

    International Nuclear Information System (INIS)

    Gary, J. William

    1999-01-01

    Gluon jets are identified in e+e- hadronic annihilation events by tagging two quark jets in the same hemisphere of an event. The gluon jet is defined inclusively as all the particles in the opposite hemisphere. Gluon jets defined in this manner have a close correspondence to gluon jets as they are defined for analytic calculations, and are almost independent of a jet finding algorithm. The mean and first few higher moments of the gluon jet charged particle multiplicity distribution are compared to the analogous results found for light quark (uds) jets, also defined inclusively. Large differences are observed between the mean, skew and kurtosis values of the gluon and quark jets, but not between their dispersions. The cumulant factorial moments of the distributions are also measured, and are used to test the predictions of QCD analytic calculations. A calculation which includes next-to-next-to-leading order corrections and energy conservation is observed to provide a much improved description of the separated gluon and quark jet cumulant moments compared to a next-to-leading order calculation without energy conservation. There is good quantitative agreement between the data and calculations for the ratios of the cumulant moments between gluon and quark jets. The data sample used is the LEP-1 sample of the OPAL experiment at LEP
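
    The factorial and cumulant factorial moments used in this comparison can be computed directly from a measured multiplicity distribution. A sketch with a toy Poisson distribution standing in for the data (for which all normalized factorial moments are exactly 1 and the cumulants vanish, a useful sanity check); real gluon-jet distributions are broader:

    ```python
    import numpy as np
    from math import factorial

    # Toy charged-particle multiplicity distribution P(n): a Poisson stand-in,
    # not OPAL data.
    mu = 10.0
    n = np.arange(0, 40)
    P = np.array([np.exp(-mu) * mu**k / factorial(k) for k in n])

    def falling(n, q):
        """n(n-1)...(n-q+1), the falling factorial."""
        out = np.ones_like(n, dtype=float)
        for i in range(q):
            out *= (n - i)
        return out

    mean = (P * n).sum()
    # Normalized factorial moments F_q = <n(n-1)...(n-q+1)> / <n>^q
    F = {q: (P * falling(n, q)).sum() / mean**q for q in (2, 3, 4)}
    # First cumulant factorial moments: K2 = F2 - 1, K3 = F3 - 3*F2 + 2
    K2 = F[2] - 1.0
    K3 = F[3] - 3.0 * F[2] + 2.0
    print(F, K2, K3)
    ```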

  15. Multiplicity distributions of gluon and quark jets and a test of QCD analytic calculations

    Energy Technology Data Exchange (ETDEWEB)

    Gary, J. William

    1999-03-01

    Gluon jets are identified in e{sup +}e{sup -} hadronic annihilation events by tagging two quark jets in the same hemisphere of an event. The gluon jet is defined inclusively as all the particles in the opposite hemisphere. Gluon jets defined in this manner have a close correspondence to gluon jets as they are defined for analytic calculations, and are almost independent of a jet finding algorithm. The mean and first few higher moments of the gluon jet charged particle multiplicity distribution are compared to the analogous results found for light quark (uds) jets, also defined inclusively. Large differences are observed between the mean, skew and kurtosis values of the gluon and quark jets, but not between their dispersions. The cumulant factorial moments of the distributions are also measured, and are used to test the predictions of QCD analytic calculations. A calculation which includes next-to-next-to-leading order corrections and energy conservation is observed to provide a much improved description of the separated gluon and quark jet cumulant moments compared to a next-to-leading order calculation without energy conservation. There is good quantitative agreement between the data and calculations for the ratios of the cumulant moments between gluon and quark jets. The data sample used is the LEP-1 sample of the OPAL experiment at LEP.

  16. Multiplicity distributions of gluon and quark jets and a test of QCD analytic calculations

    Energy Technology Data Exchange (ETDEWEB)

    Gary, J.W. [California Univ., Riverside, CA (United States). Dept. of Physics

    1999-03-01

    Gluon jets are identified in e{sup +}e{sup -} hadronic annihilation events by tagging two quark jets in the same hemisphere of an event. The gluon jet is defined inclusively as all the particles in the opposite hemisphere. Gluon jets defined in this manner have a close correspondence to gluon jets as they are defined for analytic calculations, and are almost independent of a jet finding algorithm. The mean and first few higher moments of the gluon jet charged particle multiplicity distribution are compared to the analogous results found for light quark (uds) jets, also defined inclusively. Large differences are observed between the mean, skew and kurtosis values of the gluon and quark jets, but not between their dispersions. The cumulant factorial moments of the distributions are also measured, and are used to test the predictions of QCD analytic calculations. A calculation which includes next-to-next-to-leading order corrections and energy conservation is observed to provide a much improved description of the separated gluon and quark jet cumulant moments compared to a next-to-leading order calculation without energy conservation. There is good quantitative agreement between the data and calculations for the ratios of the cumulant moments between gluon and quark jets. The data sample used is the LEP-1 sample of the OPAL experiment at LEP. (orig.) 6 refs.

  17. Multiplicity distributions of gluon and quark jets and a test of QCD analytic calculations

    International Nuclear Information System (INIS)

    Gary, J.W.

    1999-01-01

    Gluon jets are identified in e+e- hadronic annihilation events by tagging two quark jets in the same hemisphere of an event. The gluon jet is defined inclusively as all the particles in the opposite hemisphere. Gluon jets defined in this manner have a close correspondence to gluon jets as they are defined for analytic calculations, and are almost independent of a jet finding algorithm. The mean and first few higher moments of the gluon jet charged particle multiplicity distribution are compared to the analogous results found for light quark (uds) jets, also defined inclusively. Large differences are observed between the mean, skew and kurtosis values of the gluon and quark jets, but not between their dispersions. The cumulant factorial moments of the distributions are also measured, and are used to test the predictions of QCD analytic calculations. A calculation which includes next-to-next-to-leading order corrections and energy conservation is observed to provide a much improved description of the separated gluon and quark jet cumulant moments compared to a next-to-leading order calculation without energy conservation. There is good quantitative agreement between the data and calculations for the ratios of the cumulant moments between gluon and quark jets. The data sample used is the LEP-1 sample of the OPAL experiment at LEP. (orig.)

  18. Multiplicity distributions of gluon and quark jets and a test of QCD analytic calculations

    Science.gov (United States)

    Gary, J. William

    1999-03-01

    Gluon jets are identified in e+e- hadronic annihilation events by tagging two quark jets in the same hemisphere of an event. The gluon jet is defined inclusively as all the particles in the opposite hemisphere. Gluon jets defined in this manner have a close correspondence to gluon jets as they are defined for analytic calculations, and are almost independent of a jet finding algorithm. The mean and first few higher moments of the gluon jet charged particle multiplicity distribution are compared to the analogous results found for light quark (uds) jets, also defined inclusively. Large differences are observed between the mean, skew and kurtosis values of the gluon and quark jets, but not between their dispersions. The cumulant factorial moments of the distributions are also measured, and are used to test the predictions of QCD analytic calculations. A calculation which includes next-to-next-to-leading order corrections and energy conservation is observed to provide a much improved description of the separated gluon and quark jet cumulant moments compared to a next-to-leading order calculation without energy conservation. There is good quantitative agreement between the data and calculations for the ratios of the cumulant moments between gluon and quark jets. The data sample used is the LEP-1 sample of the OPAL experiment at LEP.

  19. Analytical Performance Characteristics of the Cepheid GeneXpert Ebola Assay for the Detection of Ebola Virus

    Science.gov (United States)

    Pinsky, Benjamin A.; Sahoo, Malaya K.; Sandlund, Johanna; Kleman, Marika; Kulkarni, Medha; Grufman, Per; Nygren, Malin; Kwiatkowski, Robert; Baron, Ellen Jo; Tenover, Fred; Denison, Blake; Higuchi, Russell; Van Atta, Reuel; Beer, Neil Reginald; Carrillo, Alda Celena; Naraghi-Arani, Pejman; Mire, Chad E.; Ranadheera, Charlene; Grolla, Allen; Lagerqvist, Nina; Persing, David H.

    2015-01-01

    Background The recently developed Xpert® Ebola Assay is a novel nucleic acid amplification test for simplified detection of Ebola virus (EBOV) in whole blood and buccal swab samples. The assay targets sequences in two EBOV genes, lowering the risk for new variants to escape detection in the test. The objective of this report is to present analytical characteristics of the Xpert® Ebola Assay on whole blood samples. Methods and Findings This study evaluated the assay’s analytical sensitivity, analytical specificity, inclusivity and exclusivity performance in whole blood specimens. EBOV RNA, inactivated EBOV, and infectious EBOV were used as targets. The dynamic range of the assay, the inactivation of virus, and specimen stability were also evaluated. The lower limit of detection (LoD) for the assay using inactivated virus was estimated to be 73 copies/mL (95% CI: 51–97 copies/mL). The LoD for infectious virus was estimated to be 1 plaque-forming unit/mL, and for RNA to be 232 copies/mL (95% CI 163–302 copies/mL). The assay correctly identified five different Ebola viruses, Yambuku-Mayinga, Makona-C07, Yambuku-Ecran, Gabon-Ilembe, and Kikwit-956210, and correctly excluded all non-EBOV isolates tested. The conditions used by Xpert® Ebola for inactivation of infectious virus reduced EBOV titer by ≥6 logs. Conclusion In summary, we found the Xpert® Ebola Assay to have high analytical sensitivity and specificity for the detection of EBOV in whole blood. It offers ease of use, fast turnaround time, and remote monitoring. The test has an efficient viral inactivation protocol, fulfills inclusivity and exclusivity criteria, and has specimen stability characteristics consistent with the need for decentralized testing. The simplicity of the assay should enable testing in a wide variety of laboratory settings, including remote laboratories that are not capable of performing highly complex nucleic acid amplification tests, and during outbreaks where time to detection

  20. The analyst's participation in the analytic process.

    Science.gov (United States)

    Levine, H B

    1994-08-01

    The analyst's moment-to-moment participation in the analytic process is inevitably and simultaneously determined by at least three sets of considerations. These are: (1) the application of proper analytic technique; (2) the analyst's personally-motivated responses to the patient and/or the analysis; (3) the analyst's use of him or herself to actualise, via fantasy, feeling or action, some aspect of the patient's conflicts, fantasies or internal object relationships. This formulation has relevance to our view of actualisation and enactment in the analytic process and to our understanding of a series of related issues that are fundamental to our theory of technique. These include the dialectical relationships that exist between insight and action, interpretation and suggestion, empathy and countertransference, and abstinence and gratification. In raising these issues, I do not seek to encourage or endorse wild analysis, the attempt to supply patients with 'corrective emotional experiences' or a rationalisation for acting out one's countertransferences. Rather, it is my hope that if we can better appreciate and describe these important dimensions of the analytic encounter, we can be better prepared to recognise, understand and interpret the continual streams of actualisation and enactment that are embedded in the analytic process. A deeper appreciation of the nature of the analyst's participation in the analytic process and the dimensions of the analytic process to which that participation gives rise may offer us a limited, although important, safeguard against analytic impasse.

  1. An analytical solution for the Marangoni mixed convection boundary layer flow

    DEFF Research Database (Denmark)

    Moghimi, M. A.; Kimiaeifar, Amin; Rahimpour, M.

    2010-01-01

    In this article, an analytical solution for a Marangoni mixed convection boundary layer flow is presented. A similarity transform reduces the Navier-Stokes equations to a set of nonlinear ordinary differential equations, which are solved analytically by means of the homotopy analysis method (HAM), with care taken to ensure the convergence of the solution. The numerical solution of the similarity equations is also developed, and the results are in good agreement with the analytical results based on the HAM.
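
    For readers who want the numerical side of such a comparison, a boundary value solver handles similarity equations of this type directly. The sketch below solves an illustrative Blasius-type third-order ODE with a Marangoni-style surface-shear condition using scipy; the specific equation and boundary conditions are assumptions for demonstration, not the exact system from the paper.

    ```python
    import numpy as np
    from scipy.integrate import solve_bvp

    # Illustrative similarity problem: f''' + f*f'' = 0 with f(0) = 0,
    # a Marangoni-style surface condition f''(0) = -1, and f'(inf) = 0.
    # (Assumed for demonstration only.)
    def rhs(eta, y):            # y = [f, f', f'']
        f, fp, fpp = y
        return np.vstack([fp, fpp, -f * fpp])

    def bc(ya, yb):
        return np.array([ya[0], ya[2] + 1.0, yb[1]])

    eta = np.linspace(0.0, 10.0, 200)
    guess = np.vstack([1 - np.exp(-eta), np.exp(-eta), -np.exp(-eta)])
    sol = solve_bvp(rhs, bc, eta, guess)
    print("converged:", sol.status == 0, " surface velocity f'(0) =", sol.y[1, 0])
    ```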

  2. Computer-facilitated rapid HIV testing in emergency care settings: provider and patient usability and acceptability.

    Science.gov (United States)

    Spielberg, Freya; Kurth, Ann E; Severynen, Anneleen; Hsieh, Yu-Hsiang; Moring-Parris, Daniel; Mackenzie, Sara; Rothman, Richard

    2011-06-01

    Providers in emergency care settings (ECSs) often face barriers to expanded HIV testing. We undertook formative research to understand the potential utility of a computer tool, "CARE," to facilitate rapid HIV testing in ECSs. Computer tool usability and acceptability were assessed among 35 adult patients, and provider focus groups were held, in two ECSs in Washington State and Maryland. The computer tool was usable by patients of varying computer literacy. Patients appreciated the tool's privacy and lack of judgment and their ability to reflect on HIV risks and create risk reduction plans. Staff voiced concerns regarding ECS-based HIV testing generally, including resources for follow-up of newly diagnosed people. Computer-delivered HIV testing support was acceptable and usable among low-literacy populations in two ECSs. Such tools may help circumvent some practical barriers associated with routine HIV testing in busy settings though linkages to care will still be needed.

  3. Evolutionary developments in x ray and electron energy loss microanalysis instrumentation for the analytical electron microscope

    Science.gov (United States)

    Zaluzec, Nester J.

    Developments in instrumentation for both X-ray Energy Dispersive and Electron Energy Loss Spectroscopy (XEDS/EELS) over the last ten years have given the experimentalist a greatly enhanced set of analytical tools for characterization. Microanalysts have waited for nearly two decades in the hope of getting a true analytical microscope, and the development of 300 to 400 kV instruments should have allowed us to attain this goal. Unfortunately, this has not generally been the case. While there have been some major improvements in the techniques, there has also been some devolution in the modern AEM (Analytical Electron Microscope). In XEDS, the majority of today's instruments are still plagued by the hole count effect, which was first described in detail over fifteen years ago. The magnitude of this problem can still reach the 20 percent level for medium atomic number species in a conventional off-the-shelf intermediate voltage AEM. This is an absurd situation and the manufacturers should be severely criticized. Part of the blame, however, also rests on the AEM community for not having come up with a universally agreed upon standard test procedure. Fortunately, such a test procedure is in the early stages of refinement. The proposed test specimen consists of an evaporated Cr film approximately 500 to 1000 Å thick supported on a 3 mm diameter molybdenum aperture with a 200 micron opening.

  4. Road Transportable Analytical Laboratory (RTAL) system

    International Nuclear Information System (INIS)

    1993-01-01

    The goal of this contractual effort is the development and demonstration of a Road Transportable Analytical Laboratory (RTAL) system to meet the unique needs of the Department of Energy (DOE) for rapid, accurate analysis of a wide variety of hazardous and radioactive contaminants in soil, groundwater, and surface waters. This laboratory system will be designed to provide the field and laboratory analytical equipment necessary to detect and quantify radionuclides, organics, heavy metals and other inorganics, and explosive materials. The planned laboratory system will consist of a set of individual laboratory modules deployable independently or as an interconnected group to meet each DOE site's specific needs

  5. Creating Web Area Segments with Google Analytics

    Science.gov (United States)

    Segments allow you to quickly access data for a predefined set of Sessions or Users, such as government or education users, or sessions in a particular state. You can then apply this segment to any report within the Google Analytics (GA) interface.

  6. Practical web analytics for user experience how analytics can help you understand your users

    CERN Document Server

    Beasley, Michael

    2013-01-01

    Practical Web Analytics for User Experience teaches you how to use web analytics to help answer the complicated questions facing UX professionals. Within this book, you'll find a quantitative approach for measuring a website's effectiveness and the methods for posing and answering specific questions about how users navigate a website. The book is organized according to the concerns UX practitioners face. Chapters are devoted to traffic, clickpath, and content use analysis, measuring the effectiveness of design changes, including A/B testing, building user profiles based on search hab

  7. Local analytic geometry

    CERN Document Server

    Abhyankar, Shreeram Shankar

    1964-01-01

    This book provides, for use in a graduate course or for self-study by graduate students, a well-motivated treatment of several topics, especially the following: (1) algebraic treatment of several complex variables; (2) geometric approach to algebraic geometry via analytic sets; (3) survey of local algebra; (4) survey of sheaf theory. The book has been written in the spirit of Weierstrass. Power series play the dominant role. The treatment, being algebraic, is not restricted to complex numbers, but remains valid over any complete-valued field. This makes it applicable to situations arising from

  8. Experimental analytical study on heat pipes

    International Nuclear Information System (INIS)

    Ismail, K.A.R.; Liu, C.Y.; Murcia, N.

    1981-01-01

    An analytical model is developed for optimizing the thickness distribution of the porous material in heat pipes. The method was used to calculate, design and construct heat pipes with internal geometrical changes. Ordinary pipes were also constructed and tested together with the modified ones. The results showed that the modified tubes are superior in performance and that the analytical model can predict their performance to within 1.5% precision. (Author) [pt

  9. Urine testing and urinary tract infections in febrile infants seen in office settings: the Pediatric Research in Office Settings' Febrile Infant Study.

    Science.gov (United States)

    Newman, Thomas B; Bernzweig, Jane A; Takayama, John I; Finch, Stacia A; Wasserman, Richard C; Pantell, Robert H

    2002-01-01

    To determine the predictors and results of urine testing of young febrile infants seen in office settings. Prospective cohort study. Offices of 573 pediatric practitioners from 219 practices in the American Academy of Pediatrics Pediatric Research in Office Settings' research network. A total of 3066 infants 3 months or younger with temperatures of 38°C or higher were evaluated and treated according to the judgment of their practitioners. Urine testing results, early and late urinary tract infections (UTIs), and UTIs with bacteremia. Fifty-four percent of the infants initially had urine tested, of whom 10% had a UTI. The height of the fever was associated with urine testing and a UTI among those tested (adjusted odds ratio per degree Celsius, 2.2 for both). Younger age, ill appearance, and lack of a fever source were associated with urine testing but not with a UTI, whereas lack of circumcision (adjusted odds ratio, 11.6), female sex (adjusted odds ratio, 5.4), and longer duration of fever (adjusted odds ratio, 1.8 for fever lasting ≥24 hours) were not associated with urine testing but were associated with a UTI. Bacteremia accompanied the UTI in 10% of the patients, including 17% of those younger than 1 month. Among 807 infants not initially tested or treated with antibiotics, only 2 had a subsequent documented UTI; both did well. Practitioners order urine tests selectively, focusing on younger and more ill-appearing infants and on those without an apparent fever source. Such selective urine testing, with close follow-up, was associated with few late UTIs in this large study. Urine testing should focus particularly on uncircumcised boys, girls, the youngest and sickest infants, and those with persistent fever.
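
    Adjusted odds ratios like those reported here come from multivariable logistic regression: exponentiating the fitted coefficients gives per-variable ORs. A sketch on synthetic data with statsmodels; the variable names and effect sizes are invented for illustration.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 1000
    # Synthetic predictors: female sex, uncircumcised male, fever >= 24 h.
    female = rng.integers(0, 2, n)
    uncirc = (1 - female) * rng.integers(0, 2, n)
    fever24 = rng.integers(0, 2, n)

    # Synthetic outcome generated with known log-odds effects.
    logit = -3.0 + 1.7 * female + 2.4 * uncirc + 0.6 * fever24
    uti = rng.random(n) < 1 / (1 + np.exp(-logit))

    X = sm.add_constant(np.column_stack([female, uncirc, fever24]))
    fit = sm.Logit(uti.astype(int), X).fit(disp=0)
    print("adjusted ORs:", np.exp(fit.params[1:]))  # per-variable odds ratios
    ```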

  10. Estimation of maximal oxygen uptake via submaximal exercise testing in sports, clinical, and home settings.

    Science.gov (United States)

    Sartor, Francesco; Vernillo, Gianluca; de Morree, Helma M; Bonomi, Alberto G; La Torre, Antonio; Kubis, Hans-Peter; Veicsteinas, Arsenio

    2013-09-01

    Assessment of the functional capacity of the cardiovascular system is essential in sports medicine. For athletes, the maximal oxygen uptake (VO2max) provides valuable information about their aerobic power. In the clinical setting, VO2max provides important diagnostic and prognostic information in several clinical populations, such as patients with coronary artery disease or heart failure. Likewise, VO2max assessment can be very important to evaluate fitness in asymptomatic adults. Although direct determination of VO2max is the most accurate method, it requires a maximal level of exertion, which brings a higher risk of adverse events in individuals with an intermediate to high risk of cardiovascular problems. Estimation of VO2max during submaximal exercise testing can offer a precious alternative. Over the past decades, many protocols have been developed for this purpose. The present review gives an overview of these submaximal protocols and aims to facilitate appropriate test selection in sports, clinical, and home settings. Several factors must be considered when selecting a protocol: (i) the population being tested and its specific needs in terms of safety, supervision, and accuracy and repeatability of the VO2max estimation; (ii) the parameters upon which the prediction is based (e.g. heart rate, power output, rating of perceived exertion [RPE]), as well as the need for additional clinically relevant parameters (e.g. blood pressure, ECG); (iii) the appropriate test modality, which should meet the above-mentioned requirements, be in line with the functional mobility of the target population, and suit the available equipment. In the sports setting, high repeatability is crucial to track training-induced seasonal changes. In the clinical setting, special attention must be paid to the test modality, because multiple physiological parameters often need to be measured during test execution. When estimating VO2max, one has
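
    Many submaximal protocols share one core idea: the near-linear relation between heart rate and oxygen uptake (or workload) is extrapolated to an estimated maximal heart rate. A sketch of that arithmetic with invented test data and the common, if crude, 220 minus age approximation:

    ```python
    import numpy as np

    age = 35
    hr_max_est = 220 - age  # common, if crude, approximation of maximal heart rate

    # Hypothetical submaximal test: heart rate (bpm) vs. measured VO2 (mL/kg/min).
    hr = np.array([100, 115, 130, 145])
    vo2 = np.array([14.0, 19.5, 25.0, 30.5])

    # Fit the linear HR-VO2 relation and extrapolate to the estimated HRmax.
    slope, intercept = np.polyfit(hr, vo2, 1)
    vo2max_est = slope * hr_max_est + intercept
    print(f"estimated VO2max ~ {vo2max_est:.1f} mL/kg/min")
    ```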

  11. Progress in engineering design of Indian LLCB TBM set for testing in ITER

    International Nuclear Information System (INIS)

    Chaudhuri, Paritosh; Ranjithkumar, S.; Sharma, Deepak; Danani, Chandan; Swami, H.L.; Bhattacharya, R.; Patel, Anita; Kumar, E. Rajendra; Vyas, K.N.

    2014-01-01

    Highlights: • The tritium breeding for the LLCB TBM has been evaluated by neutronic analysis. • Details of the thermal-hydraulic analyses performed for the FW and internal components of the LLCB TBM and shield block are provided. • The optimum dimensions of the CB zones and the Pb–Li flow have been selected so that the maximum temperatures of all components lie within their respective temperature windows. • The design and thermal analysis of the shield block and attachment system have been performed. - Abstract: The Indian Lead–Lithium Ceramic Breeder (LLCB) Test Blanket Module (TBM) is the Indian DEMO-relevant blanket module, part of the TBM program in ITER. The LLCB TBM will be tested from the first phase of ITER operation in one-half of ITER port no. 2. The LLCB TBM-set consists of the LLCB TBM module and a shield block, which are attached with the help of attachment systems. This LLCB TBM-set is inserted in a water-cooled stainless steel frame called the ‘TBM frame’, which also provides the separation from the neighboring TBM-set (the Chinese TBM-set) in port no. 2. In the LLCB TBM, high-pressure helium gas is used to cool the first wall (FW) structure, and lead–lithium eutectic (Pb–Li), flowing separately around the ceramic breeder (CB) pebble beds, cools the TBM internals, which are heated by volumetric neutron heating during plasma operation. Low-pressure helium is purged through the CB zones to extract the bred tritium. Thermal-structural analyses have been performed independently on the LLCB TBM and shield block for the TBM-set using ANSYS. This paper also describes the performance analysis of individual components of the LLCB TBM-set and their different configurations to optimize their performance.

  12. Identifying genetic marker sets associated with phenotypes via an efficient adaptive score test

    KAUST Repository

    Cai, T.; Lin, X.; Carroll, R. J.

    2012-01-01

    the overall effect of a marker-set have been actively studied in recent years. For example, score tests derived under an Empirical Bayes (EB) framework (Liu and others, 2007. Semiparametric regression of multidimensional genetic pathway data: least

  13. Thermo Techno Modern Analytical Equipment for Research and Industrial Laboratories

    Directory of Open Access Journals (Sweden)

    Khokhlov, S.V.

    2014-03-01

    Full Text Available A brief overview of some models of Thermo Techno analytical equipment and possible areas of their application is given. The Thermo Techno Company was created in 2000 as part of the representative office of the international corporation Thermo Fisher Scientific, a world leader in the manufacture of analytical equipment. Thermo Techno is unique in its integrated approach to solving user problems, which covers a series of steps: setting the analytical task, selecting effective analysis methods, sample delivery and preparation, and data transmission and archiving.

  14. Effects of fecal sampling on preanalytical and analytical phases in quantitative fecal immunochemical tests for hemoglobin.

    Science.gov (United States)

    Rapi, Stefano; Berardi, Margherita; Cellai, Filippo; Ciattini, Samuele; Chelazzi, Laura; Ognibene, Agostino; Rubeca, Tiziana

    2017-07-24

    Information on preanalytical variability is mandatory to bring laboratories up to ISO 15189 requirements. Fecal sampling is greatly affected by a lack of harmonization in laboratory medicine. The aims of this study were to obtain information on the devices used for fecal sampling and to explore the effect of different amounts of feces on the results from the fecal immunochemical test for hemoglobin (FIT-Hb). Four commercial sample collection devices for quantitative FIT-Hb measurements were investigated. The volume of interest (VOI) of the probes was calculated from their diameter and geometry. Quantitative measurements of the mass of feces were carried out by gravimetry. The effects of an increased amount of feces on the analytical environment were investigated by measuring Hb values with a single analytical method. The VOI was 8.22, 7.1 and 9.44 mm3 for probes that collected a target of 10 mg of feces, and 3.08 mm3 for one probe that targeted 2 mg of feces. The ratio between recovered and target amounts ranged from 56% to 121% across devices. Different changes in the measured Hb values were observed when adding increasing amounts of feces to commercial buffers. The amounts of collected material are related to the design of the probes. Three out of 4 manufacturers declare the same target amount while using different sampling volumes and obtaining different amounts of collected material. The introduction of standard probes to reduce preanalytical variability could be a useful step toward fecal test harmonization and fulfilling the ISO 15189 requirements.
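
    The VOI follows directly from the probe geometry, and the recovery ratio from gravimetry. A sketch of both calculations, with invented probe dimensions and weighings (a cylindrical groove is assumed for simplicity; real probes differ in geometry):

    ```python
    import math

    # Hypothetical probe: cylindrical grooves of diameter d (mm), total length L (mm).
    d, L = 1.2, 7.3
    voi_mm3 = math.pi * (d / 2) ** 2 * L          # volume of interest
    print(f"VOI = {voi_mm3:.2f} mm3")

    # Gravimetric check: mean collected mass vs. manufacturer's target amount.
    collected_mg = [9.1, 10.8, 11.5, 8.7]          # invented weighings
    target_mg = 10.0
    ratio = sum(collected_mg) / len(collected_mg) / target_mg * 100
    print(f"recovered/target = {ratio:.0f}%")
    ```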

  15. Developing Healthcare Data Analytics APPs with Open Data Science Tools.

    Science.gov (United States)

    Hao, Bibo; Sun, Wen; Yu, Yiqin; Xie, Guotong

    2017-01-01

    Recent advances in big data analytics provide more flexible, efficient, and open tools for researchers to gain insight from healthcare data. However, many tools require researchers to develop programs in languages such as Python or R, a skill set that many researchers in the healthcare data analytics area have not acquired. To make data science more approachable, we explored existing tools and developed a practice that can help data scientists convert existing analytics pipelines into user-friendly analytics APPs with rich interactions and real-time analysis features. With this practice, data scientists can develop customized analytics pipelines as APPs in Jupyter Notebook and disseminate them to other researchers easily, and researchers can benefit from the shared notebooks to perform analysis tasks or reproduce research results much more easily.

  16. Three-dimensional eddy current solution of a polyphase machine test model (abstract)

    Science.gov (United States)

    Pahner, Uwe; Belmans, Ronnie; Ostovic, Vlado

    1994-05-01

    This abstract describes a three-dimensional (3D) finite element solution of a test model that has been reported in the literature. The model is a basis for calculating the current redistribution effects in the end windings of turbogenerators. The aim of the study is to see whether the analytical results of the test model can be found using a general purpose finite element package, thus indicating that the finite element model is accurate enough to treat real end winding problems. The real end winding problems cannot be solved analytically, as the geometry is far too complicated. The model consists of a polyphase coil set, containing 44 individual coils. This set generates a two pole mmf distribution on a cylindrical surface. The rotating field causes eddy currents to flow in the inner massive and conducting rotor. In the analytical solution a perfect sinusoidal mmf distribution is put forward. The finite element model contains 85824 tetrahedra and 16451 nodes. A complex single scalar potential representation is used in the nonconducting parts. The computation time required was 3 h and 42 min. The flux plots show that the field distribution is acceptable. Furthermore, the induced currents are calculated and compared with the values found from the analytical solution. The distribution of the eddy currents is very close to the distribution of the analytical solution. The most important results are the losses, both local and global. The value of the overall losses is less than 2% away from those of the analytical solution. Also the local distribution of the losses is at any given point less than 7% away from the analytical solution. The deviations of the results are acceptable and are partially due to the fact that the sinusoidal mmf distribution was not modeled perfectly in the finite element method.

  17. Research on mechanical and sensoric set-up for high strain rate testing of high performance fibers

    Science.gov (United States)

    Unger, R.; Schegner, P.; Nocke, A.; Cherif, C.

    2017-10-01

    Within this research project, the tensile behavior of high performance fibers, such as carbon fibers, is investigated under high velocity loads. This contribution focuses on the clamp set-up of two testing machines. Based on a kinematic model, weight-optimized clamps are designed and evaluated. Analysis of the complex dynamic behavior of conventional high velocity testing machines has shown that the impact typically exhibits an elastic characteristic. This leads to barely predictable breaking speeds and fails at higher speeds, where the acceleration force exceeds the material specifications. Therefore, a plastic impact behavior has to be achieved, even at lower testing speeds. This type of impact behavior at lower speeds can be realized by means of some minor test set-up adaptations.

  18. Proficiency Test Program Involvement as a Tool for External Quality Control for Radiochemistry and Environmental Laboratory, Malaysian Nuclear Agency

    International Nuclear Information System (INIS)

    Nurrul Assyikeen Mohd Jaffary; Wo, Y.M.; Zal U'yun Wan Mahmood; Norfaizal Mohamed; Abdul Kadir Ishak; Noor Fadzilah Yusof; Jalal Sharib

    2016-01-01

    As the only laboratory in Malaysia under the IAEA Analytical Laboratories for the Measurement of Environmental Radioactivity (ALMERA) Network, the Radiochemistry and Environmental Laboratory (RAS) of the Malaysian Nuclear Agency participates in the proficiency test programmes organised by ALMERA to achieve mutual acceptance of analytical data. ALMERA has been providing quality support through proficiency tests using sets of different sample matrices and radionuclide levels typically encountered in environmental and food monitoring laboratories. The involvement of the RAS laboratory in the IAEA proficiency tests provides an opportunity to improve laboratory capability and personnel skills in the field of radioactivity testing. (author)

  19. Proactive Supply Chain Performance Management with Predictive Analytics

    Directory of Open Access Journals (Sweden)

    Nenad Stefanovic

    2014-01-01

    Full Text Available Today’s business climate requires supply chains to be proactive rather than reactive, which demands a new approach that incorporates data mining predictive analytics. This paper introduces a predictive supply chain performance management model which combines process modelling, performance measurement, data mining models, and web portal technologies into a unique model. It presents the supply chain modelling approach based on the specialized metamodel which allows modelling of any supply chain configuration and at different levels of detail. The paper also presents the supply chain semantic business intelligence (BI) model which encapsulates data sources and business rules and includes the data warehouse model with specific supply chain dimensions, measures, and KPIs (key performance indicators). Next, the paper describes two generic approaches for designing the KPI predictive data mining models based on the BI semantic model. KPI predictive models were trained and tested with a real-world data set. Finally, a specialized analytical web portal which offers collaborative performance monitoring and decision making is presented. The results show that these models give very accurate KPI projections and provide valuable insights into newly emerging trends, opportunities, and problems. This should lead to more intelligent, predictive, and responsive supply chains capable of adapting to the future business environment.
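
    The generic step of training a data mining model to project a KPI can be sketched in a few lines. The example below uses scikit-learn on synthetic data standing in for the paper's real-world data set; the feature names and KPI are invented for illustration.

    ```python
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import mean_absolute_error

    rng = np.random.default_rng(42)
    n = 500
    # Synthetic supply chain features: lead time, inventory level, demand forecast.
    X = rng.normal(size=(n, 3))
    # Synthetic KPI (e.g. on-time delivery rate) with noise.
    y = 0.9 - 0.05 * X[:, 0] + 0.03 * X[:, 1] - 0.02 * X[:, 2] \
        + rng.normal(0, 0.01, n)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = GradientBoostingRegressor().fit(X_tr, y_tr)
    print("MAE:", mean_absolute_error(y_te, model.predict(X_te)))
    ```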

  20. Proactive supply chain performance management with predictive analytics.

    Science.gov (United States)

    Stefanovic, Nenad

    2014-01-01

    Today's business climate requires supply chains to be proactive rather than reactive, which demands a new approach that incorporates data mining predictive analytics. This paper introduces a predictive supply chain performance management model which combines process modelling, performance measurement, data mining models, and web portal technologies into a unique model. It presents the supply chain modelling approach based on the specialized metamodel which allows modelling of any supply chain configuration and at different levels of detail. The paper also presents the supply chain semantic business intelligence (BI) model which encapsulates data sources and business rules and includes the data warehouse model with specific supply chain dimensions, measures, and KPIs (key performance indicators). Next, the paper describes two generic approaches for designing the KPI predictive data mining models based on the BI semantic model. KPI predictive models were trained and tested with a real-world data set. Finally, a specialized analytical web portal which offers collaborative performance monitoring and decision making is presented. The results show that these models give very accurate KPI projections and provide valuable insights into newly emerging trends, opportunities, and problems. This should lead to more intelligent, predictive, and responsive supply chains capable of adapting to the future business environment.

  1. Proactive Supply Chain Performance Management with Predictive Analytics

    Science.gov (United States)

    Stefanovic, Nenad

    2014-01-01

    Today's business climate requires supply chains to be proactive rather than reactive, which demands a new approach that incorporates data mining predictive analytics. This paper introduces a predictive supply chain performance management model which combines process modelling, performance measurement, data mining models, and web portal technologies into a unique model. It presents the supply chain modelling approach based on the specialized metamodel which allows modelling of any supply chain configuration and at different levels of detail. The paper also presents the supply chain semantic business intelligence (BI) model which encapsulates data sources and business rules and includes the data warehouse model with specific supply chain dimensions, measures, and KPIs (key performance indicators). Next, the paper describes two generic approaches for designing the KPI predictive data mining models based on the BI semantic model. KPI predictive models were trained and tested with a real-world data set. Finally, a specialized analytical web portal which offers collaborative performance monitoring and decision making is presented. The results show that these models give very accurate KPI projections and provide valuable insights into newly emerging trends, opportunities, and problems. This should lead to more intelligent, predictive, and responsive supply chains capable of adapting to the future business environment. PMID:25386605

  2. Measuring the bright side of being blue: a new tool for assessing analytical rumination in depression.

    Directory of Open Access Journals (Sweden)

    Skye P Barbic

    Full Text Available BACKGROUND: Diagnosis and management of depression occurs frequently in the primary care setting. Current diagnostic and treatment practices across clinical populations focus on eliminating signs and symptoms of depression. However, there is debate that some interventions may pathologize normal, adaptive responses to stressors. Analytical rumination (AR) is an example of an adaptive response of depression that is characterized by enhanced cognitive function to help an individual focus on, analyze, and solve problems. To date, research on AR has been hampered by the lack of theoretically-derived and psychometrically sound instruments. This study developed and tested a clinically meaningful measure of AR. METHODS: Using expert panels and an extensive literature review, we developed a conceptual framework for AR and 22 candidate items. Items were field tested with 579 young adults, 140 of whom completed the items at a second time point. We used Rasch measurement methods to construct and test the item set, and traditional psychometric analyses to compare items to existing rating scales. RESULTS: Data were of high quality (0.81), with evidence for divergent validity. Evidence of misfit for 2 items suggested that a 20-item scale with 4-point response categories best captured the concept of AR, fitting the Rasch model (χ2 = 95.26; df = 76, p = 0.07), with high reliability (rp = 0.86), an ordered response scale structure, and no item bias (gender, age, time). CONCLUSION: Our study provides evidence for a 20-item Analytical Rumination Questionnaire (ARQ) that can be used to quantify AR in adults who experience symptoms of depression. The ARQ is psychometrically robust and a clinically useful tool for the assessment and improvement of depression in the primary care setting. Future work is needed to establish the validity of this measure in people with major depression.
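
    Rasch measurement rests on a simple probabilistic model. For dichotomous items, the probability that a person of ability theta endorses an item of difficulty b is exp(theta - b) / (1 + exp(theta - b)); the ARQ's 4-point categories use a rating-scale extension of the same idea. A sketch of the dichotomous case with invented item difficulties:

    ```python
    import numpy as np

    def rasch_p(theta, b):
        """Probability of endorsing an item under the dichotomous Rasch model."""
        return 1.0 / (1.0 + np.exp(-(theta - b)))

    # Invented difficulties for a 20-item scale (in logits).
    b = np.linspace(-2.0, 2.0, 20)

    for theta in (-1.0, 0.0, 1.5):
        expected_score = rasch_p(theta, b).sum()
        print(f"theta = {theta:+.1f} -> expected raw score {expected_score:.1f} / 20")
    ```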

  3. Thermal calculations for the design, construction, operation, and evaluation of the Spent Fuel Test - Climax, Nevada Test Site

    International Nuclear Information System (INIS)

    Montan, D.N.; Patrick, W.C.

    1981-01-01

    The Spent Fuel Test-Climax (SFT-C) is a test of retrievable deep geologic storage of commercially generated spent nuclear reactor fuel in granitic rock. Eleven spent fuel assemblies, together with six electrical simulators and 20 guard heaters, are emplaced 420 m below the surface in the Climax granite at the US Department of Energy Nevada Test Site. On June 2, 1978 LLNL secured funding for the SFT-C, and completed spent fuel emplacement May 28, 1980. This report documents a series of thermal calculations that were performed in support of the SFT-C. Early calculations employed analytical solutions to address such design and construction issues as drift layout and emplacement hole spacings. Operational aspects of the test required more detailed numerical solutions dealing with ventilation and guard-heater power levels. The final set of calculations presented here provides temperature histories throughout the test facility for evaluation of the response of the SFT-C and for comparison of calculations with acquired data. This final set of calculations employs the as-built test geometry and best-available material properties

  4. Emerging Cyber Infrastructure for NASA's Large-Scale Climate Data Analytics

    Science.gov (United States)

    Duffy, D.; Spear, C.; Bowen, M. K.; Thompson, J. H.; Hu, F.; Yang, C. P.; Pierce, D.

    2016-12-01

    The resolution of NASA climate and weather simulations has grown dramatically over the past few years, with the highest-fidelity models reaching down to 1.5 km global resolution. With each doubling of the resolution, the resulting data sets grow by a factor of eight in size. As the climate and weather models push the envelope even further, a new infrastructure to store data and provide large-scale data analytics is necessary. The NASA Center for Climate Simulation (NCCS) has deployed the Data Analytics Storage Service (DASS), which combines scalable storage with the ability to perform in-situ analytics. Within this system, large, commonly used data sets are stored in a POSIX file system (write once/read many); examples of stored data include Landsat, MERRA2, observing system simulation experiments, and high-resolution downscaled reanalysis. The total size of this repository is on the order of 15 petabytes of storage. In addition to the POSIX file system, the NCCS has deployed file system connectors to enable emerging analytics built on top of the Hadoop Distributed File System (HDFS) to run on the same storage servers within the DASS. Coupled with a custom spatiotemporal indexing approach, users can now run emerging analytical operations built on MapReduce and Spark on the same data files stored within the POSIX file system without having to make additional copies. This presentation will discuss the architecture of this system and present benchmark performance measurements, from traditional TeraSort and Wordcount to large-scale climate analytical operations on NetCDF data.

  5. A Generic analytical solution for modelling pumping tests in wells intersecting fractures

    Science.gov (United States)

    Dewandel, Benoît; Lanini, Sandra; Lachassagne, Patrick; Maréchal, Jean-Christophe

    2018-04-01

    The behaviour of transient flow due to pumping in fractured rocks has been studied for at least the past 80 years. Analytical solutions were proposed for solving the issue of a well intersecting and pumping from one vertical, horizontal or inclined fracture in homogeneous aquifers, but their domain of application, even if covering various fracture geometries, was restricted to isotropic or anisotropic aquifers whose potential boundaries had to be parallel or orthogonal to the fracture direction. The issue thus remains unsolved for many field cases: for example, a well intersecting and pumping a fracture in a multilayer or a dual-porosity aquifer, where intersected fractures are not necessarily parallel or orthogonal to aquifer boundaries; where several fractures with various orientations intersect the well; or where pumping occurs not only in fractures, but also in the aquifer through the screened interval of the well. Using a mathematical demonstration, we show that integrating the well-known Theis analytical solution (Theis, 1935) along the fracture axis is identical to the equally well-known analytical solution of Gringarten et al. (1974) for a uniform-flux fracture fully penetrating a homogeneous aquifer. This result implies that any existing line- or point-source solution can be used for implementing one or more discrete fractures that are intersected by the well. Several theoretical examples are presented and discussed: a single vertical fracture in a dual-porosity aquifer or in a multi-layer system (with a partially intersecting fracture); one and two inclined fractures in a leaky-aquifer system with pumping either only from the fracture(s), or also from the aquifer between fracture(s) in the screened interval of the well. For the cases with several pumping sources, analytical solutions of the flowrate contribution from each individual source (fractures and well) are presented, and the drawdown behaviour according to the length of the pumped screened interval of
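
    The central identity (integrating Theis point sources along the fracture axis to recover a uniform-flux fracture response) is easy to verify numerically. A sketch with illustrative parameter values; the drawdown of a unit-rate Theis source is s = W(u)/(4πT) with u = r²S/(4Tt), and the fracture response distributes the total rate Q uniformly over the fracture length.

    ```python
    import numpy as np
    from scipy.special import exp1
    from scipy.integrate import quad

    # Aquifer and pumping parameters (illustrative values).
    T, S, Q = 1e-3, 1e-4, 1e-3      # transmissivity (m2/s), storativity, rate (m3/s)
    xf = 50.0                        # fracture half-length (m)

    def theis_point(r, t):
        """Theis drawdown at distance r for a unit-rate source; W(u) = exp1(u)."""
        u = r**2 * S / (4 * T * t)
        return exp1(u) / (4 * np.pi * T)

    def uniform_flux_fracture(x, y, t):
        """Drawdown from a uniform-flux vertical fracture along -xf..xf:
        Theis point sources integrated over the fracture axis."""
        integrand = lambda xp: theis_point(np.hypot(x - xp, y), t)
        val, _ = quad(integrand, -xf, xf)
        return Q / (2 * xf) * val

    # Drawdown 1 m off the fracture face after one day of pumping.
    print(uniform_flux_fracture(0.0, 1.0, t=86400.0))
    ```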

  6. Analytical performance specifications for external quality assessment - definitions and descriptions.

    Science.gov (United States)

    Jones, Graham R D; Albarede, Stephanie; Kesseler, Dagmar; MacKenzie, Finlay; Mammen, Joy; Pedersen, Morten; Stavelin, Anne; Thelen, Marc; Thomas, Annette; Twomey, Patrick J; Ventura, Emma; Panteghini, Mauro

    2017-06-27

    External Quality Assurance (EQA) is vital to ensure acceptable analytical quality in medical laboratories. A key component of an EQA scheme is an analytical performance specification (APS) for each measurand that a laboratory can use to assess the extent of deviation of the obtained results from the target value. A consensus conference held in Milan in 2014 has proposed three models to set APS and these can be applied to setting APS for EQA. A goal arising from this conference is the harmonisation of EQA APS between different schemes to deliver consistent quality messages to laboratories irrespective of location and the choice of EQA provider. At this time there are wide differences in the APS used in different EQA schemes for the same measurands. Contributing factors to this variation are that the APS in different schemes are established using different criteria, applied to different types of data (e.g. single data points, multiple data points), used for different goals (e.g. improvement of analytical quality; licensing), and with the aim of eliciting different responses from participants. This paper provides recommendations from the European Federation of Laboratory Medicine (EFLM) Task and Finish Group on Performance Specifications for External Quality Assurance Schemes (TFG-APSEQA) and on clear terminology for EQA APS. The recommended terminology covers six elements required to understand APS: 1) a statement on the EQA material matrix and its commutability; 2) the method used to assign the target value; 3) the data set to which APS are applied; 4) the applicable analytical property being assessed (i.e. total error, bias, imprecision, uncertainty); 5) the rationale for the selection of the APS; and 6) the type of the Milan model(s) used to set the APS. The terminology is required for EQA participants and other interested parties to understand the meaning of meeting or not meeting APS.
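
    Whichever Milan model sets the APS, the participant-side arithmetic in an EQA round usually reduces to a deviation from the target value compared against an allowable limit. A minimal sketch with invented numbers:

    ```python
    def eqa_assessment(result, target, allowable_pct):
        """Percent deviation from target vs. an analytical performance specification."""
        deviation_pct = (result - target) / target * 100
        return deviation_pct, abs(deviation_pct) <= allowable_pct

    # Hypothetical EQA sample: glucose, target 5.5 mmol/L, APS of +/- 8%.
    dev, ok = eqa_assessment(result=5.9, target=5.5, allowable_pct=8.0)
    print(f"deviation {dev:+.1f}% -> {'acceptable' if ok else 'unacceptable'}")
    ```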

  7. Toward analytic aids for standard setting in nuclear regulation

    International Nuclear Information System (INIS)

    Brown, R.V.; O'Connor, M.F.; Peterson, C.R.

    1979-05-01

    The US NRC promulgates standards for nuclear reprocessing and other facilities to safeguard against the diversion of nuclear material. Two broad tasks have been directed toward establishing performance criteria for standard setting: general-purpose modeling, and analysis specific to a particular performance criterion option. This report emphasizes work on the second task. Its purpose is to provide a framework for the evaluation of such options that organizes the necessary components in a way that provides for meaningful assessments with respect to required inputs

  8. Local properties of analytic functions and non-standard analysis

    International Nuclear Information System (INIS)

    O'Brian, N.R.

    1976-01-01

    This is an expository account which shows how the methods of non-standard analysis can be applied to prove the Nullstellensatz for germs of analytic functions. This method of proof was discovered originally by Abraham Robinson. The necessary concepts from model theory are described in some detail and the Nullstellensatz is proved by investigating the relation between the set of infinitesimal elements in the complex n-plane and the spectrum of the ring of germs of analytic functions. (author)

  9. Behavioral Analytic Approach to Placement of Patients in Community Settings.

    Science.gov (United States)

    Glickman, Henry S.; And Others

    Twenty adult psychiatric outpatients were assessed by their primary therapists on the Current Behavior Inventory prior to placing them in community settings. The diagnoses included schizophrenia, major affective disorder, dysthymic disorder, and atypical paranoid disorder. The inventory assessed behaviors in four areas: independent community…

  10. Testing a parametric function against a nonparametric alternative in IV and GMM settings

    DEFF Research Database (Denmark)

    Gørgens, Tue; Wurtz, Allan

    This paper develops a specification test for functional form for models identified by moment restrictions, including IV and GMM settings. The general framework is one where the moment restrictions are specified as functions of data, a finite-dimensional parameter vector, and a nonparametric real ...

  11. Implementing Operational Analytics using Big Data Technologies to Detect and Predict Sensor Anomalies

    Science.gov (United States)

    Coughlin, J.; Mital, R.; Nittur, S.; SanNicolas, B.; Wolf, C.; Jusufi, R.

    2016-09-01

    Operational analytics, when combined with Big Data technologies and predictive techniques, has been shown to be valuable in detecting mission-critical sensor anomalies that might be missed by conventional analytical techniques. Our approach helps analysts and leaders make informed and rapid decisions by analyzing large volumes of complex data in near real-time and presenting it in a manner that facilitates decision making. It provides cost savings by being able to alert and predict when sensor degradations pass a critical threshold and impact mission operations. Operational analytics, which uses Big Data tools and technologies, can process very large data sets containing a variety of data types to uncover hidden patterns, unknown correlations, and other relevant information. When combined with predictive techniques, it provides a mechanism to monitor and visualize these data sets and provide insight into degradations encountered in large sensor systems such as the space surveillance network. In this study, data from a notional sensor are simulated, and we use Big Data technologies, predictive algorithms and operational analytics to process the data and predict sensor degradations. This study uses data products that would commonly be analyzed at a site. This study builds on a big data architecture that has previously been proven valuable in detecting anomalies. This paper outlines our methodology of implementing an operational analytic solution through data discovery, learning and training of data modeling and predictive techniques, and deployment. Through this methodology, we implement a functional architecture focused on exploring available big data sets and determining practical analytic, visualization, and predictive technologies.
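    As a toy illustration of threshold-based degradation alerting (not the authors' system), the sketch below flags readings whose rolling z-score exceeds a critical value; the window size, the threshold and the synthetic data are assumptions.

```python
# Rolling z-score alerting on a synthetic sensor stream with a slow drift.
import numpy as np

rng = np.random.default_rng(0)
readings = rng.normal(0.0, 1.0, 500)
readings[400:] += np.linspace(0, 5, 100)   # simulated slow degradation

window, threshold = 50, 3.0
alerts = []
for i in range(window, len(readings)):
    hist = readings[i - window:i]
    z = (readings[i] - hist.mean()) / (hist.std() + 1e-9)
    if abs(z) > threshold:
        alerts.append(i)

print(f"first alert at sample {alerts[0]}" if alerts else "no alerts")
```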

  12. Negative hallucinations, dreams and hallucinations: The framing structure and its representation in the analytic setting.

    Science.gov (United States)

    Perelberg, Rosine Jozef

    2016-12-01

    This paper explores the meaning of a patient's hallucinatory experiences in the course of a five times a week analysis. I will locate my understanding within the context of André Green's ideas on the role of the framing structure and the negative hallucination in the structuring of the mind. The understanding of the transference and countertransference was crucial in the creation of meaning and enabling the transformations that took place in the analytic process. Through a detailed analysis of a clinical example the author examines Bion's distinction between hysterical hallucinations and psychotic hallucinations and formulates her own hypothesis about the distinctions between the two. The paper suggests that whilst psychotic hallucinations express a conflict between life and death, in the hysterical hallucination it is between love and hate. The paper also contains some reflections on the dramatic nature of the analytic encounter. Copyright © 2016 Institute of Psychoanalysis.

  13. Using meta-analytic path analysis to test theoretical predictions in health behavior: An illustration based on meta-analyses of the theory of planned behavior.

    Science.gov (United States)

    Hagger, Martin S; Chan, Derwin K C; Protogerou, Cleo; Chatzisarantis, Nikos L D

    2016-08-01

    Synthesizing research on social cognitive theories applied to health behavior is an important step in the development of an evidence base of psychological factors as targets for effective behavioral interventions. However, few meta-analyses of research on social cognitive theories in health contexts have conducted simultaneous tests of theoretically-stipulated pattern effects using path analysis. We argue that conducting path analyses of meta-analytic effects among constructs from social cognitive theories is important to test nomological validity, account for mediation effects, and evaluate unique effects of theory constructs independent of past behavior. We illustrate our points by conducting new analyses of two meta-analyses of a popular theory applied to health behaviors, the theory of planned behavior. We conducted meta-analytic path analyses of the theory in two behavioral contexts (alcohol and dietary behaviors) using data from the primary studies included in the original meta-analyses augmented to include intercorrelations among constructs and relations with past behavior missing from the original analysis. Findings supported the nomological validity of the theory and its hypotheses for both behaviors, confirmed important model processes through mediation analysis, demonstrated the attenuating effect of past behavior on theory relations, and provided estimates of the unique effects of theory constructs independent of past behavior. Our analysis illustrates the importance of conducting a simultaneous test of theory-stipulated effects in meta-analyses of social cognitive theories applied to health behavior. We recommend researchers adopt this analytic procedure when synthesizing evidence across primary tests of social cognitive theories in health. Copyright © 2016 Elsevier Inc. All rights reserved.
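    As an illustration of the path-analytic step described above, the sketch below recovers standardized path coefficients for a theory-of-planned-behavior-style model from a pooled correlation matrix by solving the normal equations; the correlation values are invented placeholders, not the meta-analytic estimates reported in the paper.

```python
# Standardized path coefficients from a pooled correlation matrix:
# beta = Rxx^{-1} rxy for the regression of intention on its predictors.
import numpy as np

vars_ = ["attitude", "norm", "pbc", "intention"]
R = np.array([[1.00, 0.35, 0.30, 0.50],
              [0.35, 1.00, 0.25, 0.35],
              [0.30, 0.25, 1.00, 0.40],
              [0.50, 0.35, 0.40, 1.00]])   # placeholder pooled correlations

Rxx, rxy = R[:3, :3], R[:3, 3]
beta = np.linalg.solve(Rxx, rxy)
for name, b in zip(vars_[:3], beta):
    print(f"{name} -> intention: {b:.3f}")
```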

  14. ANALYTIC FITS FOR PARTIAL PHOTOIONIZATION CROSS-SECTIONS

    NARCIS (Netherlands)

    VERNER, DA; YAKOVLEV, DG

    We present a compact, uniform and complete set of analytic fits to the partial Hartree-Dirac-Slater photoionization cross sections for the ground state shells of all atoms and ions of elements from H to Zn (Z less-than-or-equal-to 30). Comparison with experiment and theory demonstrates generally

  15. On spaces Cb(X) weakly K-analytic

    Czech Academy of Sciences Publication Activity Database

    Ferrando, J.C.; Kąkol, Jerzy; López-Pellicer, M.

    2017-01-01

    Vol. 290, No. 16 (2017), pp. 2612-2618 ISSN 0025-584X R&D Projects: GA ČR GF16-34860L Institutional support: RVO:67985840 Keywords: K-analytic space * pseudocompact space * Rainwater set Subject RIV: BA - General Mathematics OECD field: Pure mathematics Impact factor: 0.742, year: 2016

  16. Analytical techniques for thin films treatise on materials science and technology

    CERN Document Server

    Tu, K N

    1988-01-01

    Treatise on Materials Science and Technology, Volume 27: Analytical Techniques for Thin Films covers a set of analytical techniques developed for thin films and interfaces, all based on scattering and excitation phenomena and theories. The book discusses photon beam and X-ray techniques; electron beam techniques; and ion beam techniques. Materials scientists, materials engineers, chemical engineers, and physicists will find the book invaluable.

  17. Evaluation of the Standard Setting on the 2005 Grade 12 National Assessment of Educational Progress Mathematics Test

    Science.gov (United States)

    Sireci, Stephen G.; Hauger, Jeffrey B.; Wells, Craig S.; Shea, Christine; Zenisky, April L.

    2009-01-01

    The National Assessment Governing Board used a new method to set achievement level standards on the 2005 Grade 12 NAEP Math test. In this article, we summarize our independent evaluation of the process used to set these standards. The evaluation data included observations of the standard-setting meeting, observations of advisory committee meetings…

  18. Stress Testing Water Resource Systems at Regional and National Scales with Synthetic Drought Event Sets

    Science.gov (United States)

    Hall, J. W.; Mortazavi-Naeini, M.; Coxon, G.; Guillod, B. P.; Allen, M. R.

    2017-12-01

    Water resources systems can fail to deliver the services required by water users (and deprive the environment of flow requirements) in many different ways. In an attempt to make systems more resilient, they have also been made more complex, for example through a growing number of large-scale transfers, optimized storages and reuse plants. These systems may be vulnerable to complex variants of hydrological variability in space and time, and to behavioural adaptations by water users. In previous research we have used non-parametric stochastic streamflow generators to test the vulnerability of water resource systems. Here we use a very large ensemble of regional climate model outputs from the weather@home crowd-sourced citizen science project, which has generated more than 30,000 years of synthetic weather for present and future climates in the UK and western Europe, using the HadAM3P regional climate model. These simulations have been constructed so as to preserve prolonged drought characteristics, through treatment of long-memory processes in ocean circulations and soil moisture. The weather simulations have been propagated through the newly developed DynaTOP national hydrological model for Britain, in order to provide low-flow simulations at points of water withdrawal for public water supply, energy and agricultural abstractors. We have used the WATHNET water resource simulation model, set up for the Thames Basin and for all of the large water resource zones in England, to simulate the frequency, severity and duration of water shortages under all of these synthetic weather conditions. In particular, we have sought to explore systemic vulnerabilities associated with inter-basin transfers and the trade-offs between different water users. This analytical capability is providing the basis for (i) implementation of the Duty of Resilience, which has been placed upon the water industry in the 2014 Water Act and (ii) testing reformed abstraction arrangements which the UK government
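    A minimal sketch of the run-theory shortage statistics such a study computes from simulated series is given below; the synthetic flow series, the Q95 threshold choice and the event definition are assumptions for illustration (the study itself used the WATHNET model).

```python
# Frequency, severity and duration of below-threshold events in a flow series.
import numpy as np

rng = np.random.default_rng(1)
flow = rng.gamma(shape=2.0, scale=50.0, size=365 * 30)  # synthetic daily flow
threshold = np.percentile(flow, 5)                       # assumed Q95 threshold

events, duration, deficit = [], 0, 0.0
for f in flow:
    if f < threshold:
        duration += 1
        deficit += threshold - f
    elif duration:
        events.append((duration, deficit))
        duration, deficit = 0, 0.0
if duration:
    events.append((duration, deficit))

print(f"{len(events)} shortage events; "
      f"max duration {max(d for d, _ in events)} days; "
      f"max severity {max(s for _, s in events):.1f}")
```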

  19. Chemical/Biological Agent Resistance Test (CBART) Test Fixture System Verification and Analytical Monitioring System Development

    Science.gov (United States)

    2011-03-15

    Progress was made towards proportional integral derivative (PID) controller tuning. The CBART near real-time (NRT) analytical system was developed, moved, and replumbed. Abbreviations: MFC, mass flow controller; MS, mass spectrometer; MSD, mass selective detector; NRT, near real-time; PID, proportional integral derivative.

  20. Comparison of Video Head Impulse Test (vHIT) Gains Between Two Commercially Available Devices and by Different Gain Analytical Methods.

    Science.gov (United States)

    Lee, Sang Hun; Yoo, Myung Hoon; Park, Jun Woo; Kang, Byung Chul; Yang, Chan Joo; Kang, Woo Suk; Ahn, Joong Ho; Chung, Jong Woo; Park, Hong Ju

    2018-06-01

    To evaluate whether video head impulse test (vHIT) gains are dependent on the measuring device and method of analysis. Prospective study. vHIT was performed in 25 healthy subjects using two devices simultaneously. vHIT gains were compared between these instruments and using five different methods of comparing position and velocity gains during head movement intervals. The two devices produced different vHIT gain results with the same method of analysis. There were also significant differences in the vHIT gains measured using different analytical methods. The gain analytic method that compares the areas under the velocity curve (AUC) of the head and eye movements during head movements showed lower vHIT gains than a method that compared the peak velocities of the head and eye movements. The former method produced the vHIT gain with the smallest standard deviation among the five procedures tested in this study. vHIT gains differ in normal subjects depending on the device and method of analysis used, suggesting that it is advisable for each device to have its own normal values. Gain calculations that compare the AUC of the head and eye movements during the head movements show the smallest variance.
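    To make the two gain definitions concrete, the sketch below computes (a) the AUC-based gain, the ratio of the areas under the eye- and head-velocity curves during the head movement, and (b) the peak-velocity gain; the velocity traces are synthetic placeholders, not recorded data.

```python
# Two vHIT gain definitions: area-under-curve ratio vs peak-velocity ratio.
import numpy as np

t = np.linspace(0.0, 0.15, 151)               # 150 ms head impulse
head_vel = 250.0 * np.sin(np.pi * t / 0.15)   # deg/s, synthetic head trace
eye_vel = 0.9 * head_vel                      # synthetic VOR response

gain_auc = np.trapz(np.abs(eye_vel), t) / np.trapz(np.abs(head_vel), t)
gain_peak = np.abs(eye_vel).max() / np.abs(head_vel).max()
print(f"AUC gain {gain_auc:.2f}, peak-velocity gain {gain_peak:.2f}")
```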

  1. Analytic solutions of QCD motivated Hamiltonians at low energy

    International Nuclear Information System (INIS)

    Yepez, T.; Amor, A.; Hess, P.O.; Szczepaniak, A.; Civitarese, O.

    2011-01-01

    A model Hamiltonian, motivated by QCD, is investigated in order to study first the quark sector alone, then the gluon sector alone, and finally both together. Restricting to the pure quark sector and setting the mass of the quarks to zero, we find analytic solutions involving two to three orbitals. Allowing the mass of the quarks to be different from zero, we find semi-analytic solutions involving an arbitrary number of orbitals. Afterwards, we indicate how to incorporate gluons. (author)

  2. Collection of analytes from microneedle patches.

    Science.gov (United States)

    Romanyuk, Andrey V; Zvezdin, Vasiliy N; Samant, Pradnya; Grenader, Mark I; Zemlyanova, Marina; Prausnitz, Mark R

    2014-11-04

    Clinical medicine and public health would benefit from simplified acquisition of biological samples from patients that can be easily obtained at point of care, in the field, and by patients themselves. Microneedle patches are designed to serve this need by collecting dermal interstitial fluid containing biomarkers without the dangers, pain, or expertise needed to collect blood. This study presents novel methods to collect biomarker analytes from microneedle patches for analysis by integration into conventional analytical laboratory microtubes and microplates. Microneedle patches were made out of cross-linked hydrogel composed of poly(methyl vinyl ether-alt-maleic acid) and poly(ethylene glycol) prepared by micromolding. Microneedle patches were shown to swell with water up to 50-fold in volume, depending on degree of polymer cross-linking, and to collect interstitial fluid from the skin of rats. To collect analytes from microneedle patches, the patches were mounted within the cap of microcentrifuge tubes or formed the top of V-bottom multiwell microplates, and fluid was collected in the bottom of the tubes under gentle centrifugation. In another method, microneedle patches were attached to form the bottom of multiwell microplates, thereby enabling in situ analysis. The simplicity of biological sample acquisition using microneedle patches coupled with the simplicity of analyte collection from microneedles patches integrated into conventional analytical equipment could broaden the reach of future screening, diagnosis, and monitoring of biomarkers in healthcare and environmental/workplace settings.

  3. Test of a potential link between analytic and nonanalytic category learning and automatic, effortful processing.

    Science.gov (United States)

    Tracy, J I; Pinsk, M; Helverson, J; Urban, G; Dietz, T; Smith, D J

    2001-08-01

    The link between automatic and effortful processing and nonanalytic and analytic category learning was evaluated in a sample of 29 college undergraduates using declarative memory, semantic category search, and pseudoword categorization tasks. Automatic and effortful processing measures were hypothesized to be associated with nonanalytic and analytic categorization, respectively. Results suggested that, contrary to prediction, strong criterion-attribute (analytic) responding on the pseudoword categorization task was associated with strong automatic, implicit memory encoding of frequency-of-occurrence information. Data are discussed in terms of the possibility that criterion-attribute category knowledge, once established, may be expressed with few attentional resources. The data indicate that attention resource requirements, even for the same stimuli and task, vary depending on the category rule system utilized. Also, the automaticity emerging from familiarity with analytic category exemplars is very different from the automaticity arising from extensive practice on a semantic category search task. The data do not support any simple mapping of analytic and nonanalytic forms of category learning onto the automatic and effortful processing dichotomy and challenge simple models of brain asymmetries for such procedures. Copyright 2001 Academic Press.

  4. International Congress on Analytical Chemistry. Abstracts. V. 2

    International Nuclear Information System (INIS)

    1997-01-01

    The collection of materials from the International Congress on Analytical Chemistry, which took place in Moscow in June 1997, is presented. The main directions of investigation are elucidated in such areas of analytical chemistry as quantitative and qualitative chemical analysis, sample preparation, express test methods for environmental and biological materials, clinical analysis, and analysis of food and agricultural products

  5. International Congress on Analytical Chemistry. Abstracts. V. 2

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-12-31

    The collection of materials from the International Congress on Analytical Chemistry, which took place in Moscow in June 1997, is presented. The main directions of investigation are elucidated in such areas of analytical chemistry as quantitative and qualitative chemical analysis, sample preparation, express test methods for environmental and biological materials, clinical analysis, and analysis of food and agricultural products

  6. Quality system implementation for nuclear analytical techniques

    International Nuclear Information System (INIS)

    2004-01-01

    The international effort (UNIDO, ILAC, BIPM, etc.) to establish a functional infrastructure for metrology and accreditation in many developing countries needs to be complemented by assistance to implement high quality practices and high quality output by service providers and producers in the respective countries. Knowledge of how to approach QA systems that justify a formal accreditation is available in only a few countries, and the dissemination of know-how and development of skills is needed bottom-up from the working level of laboratories and institutes. Awareness building, convincing of management, introduction of good management practices, technical expertise and good documentation will lead to the creation of a quality culture that assures sustainability and inherent development of quality practices as a prerequisite of economic success. Quality assurance and quality control can be used as a valuable management tool and are a prerequisite for international trade and information exchange. This publication aims to assist quality managers, laboratory managers and staff involved in setting up a QA/QC system in a nuclear analytical laboratory to take appropriate action to start and complete the necessary steps for a successful quality system leading to national accreditation. This guidebook contributes to a better understanding of the basic ideas behind ISO/IEC 17025, the international standard for 'General requirements for the competence of testing and calibration laboratories'. It provides basic information and detailed explanation about the establishment of the QC system in analytical and nuclear analytical laboratories. It is suitable material for the training of trainers and familiarizes managers with QC management and implementation. This training material aims to facilitate the implementation of internationally accepted quality principles and to promote attempts by Member States' laboratories to obtain accreditation for nuclear analytical

  7. Numerical Test of Analytical Theories for Perpendicular Diffusion in Small Kubo Number Turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Heusen, M.; Shalchi, A., E-mail: husseinm@myumanitoba.ca, E-mail: andreasm4@yahoo.com [Department of Physics and Astronomy, University of Manitoba, Winnipeg, MB R3T 2N2 (Canada)

    2017-04-20

    In the literature, one can find various analytical theories for the perpendicular diffusion of energetic particles interacting with magnetic turbulence. Besides quasi-linear theory, there are different versions of the nonlinear guiding center (NLGC) theory and the unified nonlinear transport (UNLT) theory. For turbulence with high Kubo numbers, such as two-dimensional turbulence or noisy reduced magnetohydrodynamic turbulence, the aforementioned nonlinear theories provide similar results. For slab and small Kubo number turbulence, however, this is not the case. In the current paper, we compare different linear and nonlinear theories with each other and with test-particle simulations for a noisy slab model corresponding to small Kubo number turbulence. We show that UNLT theory agrees very well with all performed test-particle simulations. In the limit of long parallel mean free paths, the perpendicular mean free path approaches asymptotically the quasi-linear limit as predicted by the UNLT theory. For short parallel mean free paths we find a Rechester and Rosenbluth type of scaling, as predicted by UNLT theory as well. The original NLGC theory disagrees with all performed simulations regardless of the parallel mean free path. The random ballistic interpretation of the NLGC theory agrees much better with the simulations, but compared to UNLT theory the agreement is inferior. We conclude that for this type of small Kubo number turbulence, only the latter theory allows for an accurate description of perpendicular diffusion.

  8. Numerical Test of Analytical Theories for Perpendicular Diffusion in Small Kubo Number Turbulence

    International Nuclear Information System (INIS)

    Heusen, M.; Shalchi, A.

    2017-01-01

    In the literature, one can find various analytical theories for the perpendicular diffusion of energetic particles interacting with magnetic turbulence. Besides quasi-linear theory, there are different versions of the nonlinear guiding center (NLGC) theory and the unified nonlinear transport (UNLT) theory. For turbulence with high Kubo numbers, such as two-dimensional turbulence or noisy reduced magnetohydrodynamic turbulence, the aforementioned nonlinear theories provide similar results. For slab and small Kubo number turbulence, however, this is not the case. In the current paper, we compare different linear and nonlinear theories with each other and with test-particle simulations for a noisy slab model corresponding to small Kubo number turbulence. We show that UNLT theory agrees very well with all performed test-particle simulations. In the limit of long parallel mean free paths, the perpendicular mean free path approaches asymptotically the quasi-linear limit as predicted by the UNLT theory. For short parallel mean free paths we find a Rechester and Rosenbluth type of scaling, as predicted by UNLT theory as well. The original NLGC theory disagrees with all performed simulations regardless of the parallel mean free path. The random ballistic interpretation of the NLGC theory agrees much better with the simulations, but compared to UNLT theory the agreement is inferior. We conclude that for this type of small Kubo number turbulence, only the latter theory allows for an accurate description of perpendicular diffusion.

  9. Analytical modeling of post-tensioned precast beam-to-column connections

    International Nuclear Information System (INIS)

    Kaya, Mustafa; Arslan, A. Samet

    2009-01-01

    In this study, post-tensioned precast beam-to-column connections are tested experimentally at different stress levels and are modelled analytically using a 3D nonlinear finite element method; the ANSYS finite element software is used for this purpose. Nonlinear static analysis is used to determine the connection strength, behaviour and stiffness when subjected to cyclic inelastic loads simulating ground excitation during an earthquake. The results obtained from the analytical studies are compared with the test results. In terms of stiffness, it was seen that the initial stiffness of the analytical models was lower than that of the tested specimens. As a result, modelling these types of connection using 3D FEM can give crucial information beforehand, and overcome the disadvantages of the time-consuming workmanship and cost of experimental studies.

  10. Inkjet-printed point-of-care immunoassay on a nanoscale polymer brush enables subpicomolar detection of analytes in blood

    Science.gov (United States)

    Joh, Daniel Y.; Hucknall, Angus M.; Wei, Qingshan; Mason, Kelly A.; Lund, Margaret L.; Fontes, Cassio M.; Hill, Ryan T.; Blair, Rebecca; Zimmers, Zackary; Achar, Rohan K.; Tseng, Derek; Gordan, Raluca; Freemark, Michael; Ozcan, Aydogan; Chilkoti, Ashutosh

    2017-08-01

    The ELISA is the mainstay for sensitive and quantitative detection of protein analytes. Despite its utility, ELISA is time-consuming, resource-intensive, and infrastructure-dependent, limiting its availability in resource-limited regions. Here, we describe a self-contained immunoassay platform (the “D4 assay”) that converts the sandwich immunoassay into a point-of-care test (POCT). The D4 assay is fabricated by inkjet printing assay reagents as microarrays on nanoscale polymer brushes on glass chips, so that all reagents are “on-chip,” and these chips show durable storage stability without cold storage. The D4 assay can interrogate multiple analytes from a drop of blood, is compatible with a smartphone detector, and displays analytical figures of merit that are comparable to standard laboratory-based ELISA in whole blood. These attributes of the D4 POCT have the potential to democratize access to high-performance immunoassays in resource-limited settings without sacrificing their performance.

  11. A Cryogenic Test Set-Up for the Qualification of Pre-Series Test Cells for the LHC Cryogenic Distribution Line

    CERN Document Server

    Livran, J; Parente, C; Riddone, G; Rybkowski, D; Veillet, N

    2000-01-01

    Three pre-series Test Cells of the LHC Cryogenic Distribution Line (QRL) [1], manufactured by three European industrial companies, will be tested in the year 2000 to qualify the design chosen and verify the thermal and mechanical performances. A dedicated test stand (170 m x 13 m) has been built for extensive testing and performance assessment of the pre-series units in parallel. They will be fed with saturated liquid helium at 4.2 K supplied by a mobile helium dewar. In addition, LN2 cooled helium will be used for cool-down and thermal shielding. For each of the three pre-series units, a set of end boxes has been designed and manufactured at CERN. This paper presents the layout of the cryogenic system for the pre-series units, the calorimetric methods as well as the results of the thermal calculation of the end box test.

  12. Conceptual and analytical modeling of fracture zone aquifers in hard rock. Implications of pumping tests in the Pohjukansalo well field, east-central Finland

    International Nuclear Information System (INIS)

    Leveinen, J.

    2001-01-01

    Fracture zones with an interconnected network of open fractures can conduct significant groundwater flow and, as in the case of the Pohjukansalo well field in Leppaevirta, can yield enough water for small-scale municipal water supply. Glaciofluvial deposits comprising major aquifers commonly overlie fracture zones that can contribute to the water balance directly or indirectly by providing hydraulic interconnections between different formations. Fracture zones and fractures can also transport contaminants in a poorly predictable way. Consequently, hydrogeological research on fracture zones is important for the management and protection of soil aquifers in Finland. Hydraulic properties of aquifers are estimated in situ by well test analyses based on analytical models. Most analytical models rely on the concepts of radial flow and a horizontal slab aquifer. In Paper 1, pump test responses of fracture zones in the Pohjukansalo well field were characterised based on alternative analytical models developed for channelled flow cases. In Paper 2, the tests were analysed based on the generalised radial flow (GRF) model and the concept of a fracture network possessing a fractional flow dimension due to limited connectivity compared to ideal 2- or 3-dimensional systems. The analysis provides estimates of hydraulic properties in terms of parameters that do not have concrete meaning when the flow dimension of the aquifer has fractional values. Concrete estimates of hydraulic parameters were produced by making simplified assumptions and by using the composite model developed in Paper 3. In addition to estimates of hydraulic parameters, analysis of hydraulic tests provides qualitative information that is useful when the hydraulic connections in the fracture system are not well known. However, attention should be paid to the frequency of drawdown measurements, particularly for the application of derivative curves. In groundwater studies, analytical models have also been used to estimate
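    For concreteness, the sketch below evaluates the generalised radial flow (GRF) drawdown solution of Barker (1988) that underlies fractional flow dimension analysis; the parameter values are illustrative assumptions, not values fitted to the Pohjukansalo tests, and for flow dimension n = 2 the expression reduces to the Theis solution.

```python
# GRF drawdown s = Q r^{2v} / (4 pi^{1-v} K b^{3-n}) * Gamma(-v, u),
# with v = 1 - n/2 and u = Ss r^2 / (4 K t); n is the flow dimension.
import mpmath as mp

def grf_drawdown(r, t, Q, K, Ss, b, n):
    """Drawdown for flow dimension n (n=2 recovers the Theis solution)."""
    v = 1.0 - n / 2.0
    u = Ss * r**2 / (4.0 * K * t)
    prefac = Q * r**(2.0 * v) / (4.0 * mp.pi**(1.0 - v) * K * b**(3.0 - n))
    return prefac * mp.gammainc(-v, a=u)   # upper incomplete gamma

# flow dimension 1.5 (limited fracture connectivity) vs 2 (ideal radial flow)
for n in (1.5, 2.0):
    print(n, grf_drawdown(r=10.0, t=3600.0, Q=1e-3, K=1e-5, Ss=1e-6, b=10.0, n=n))
```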

  13. GSMA: Gene Set Matrix Analysis, An Automated Method for Rapid Hypothesis Testing of Gene Expression Data

    Directory of Open Access Journals (Sweden)

    Chris Cheadle

    2007-01-01

    Background: Microarray technology has become highly valuable for identifying complex global changes in gene expression patterns. The assignment of functional information to these complex patterns remains a challenging task in effectively interpreting data and correlating results from across experiments, projects and laboratories. Methods which allow the rapid and robust evaluation of multiple functional hypotheses increase the power of individual researchers to data mine gene expression data more efficiently. Results: We have developed gene set matrix analysis (GSMA) as a useful method for the rapid testing of group-wise up- or downregulation of gene expression simultaneously for multiple lists of genes (gene sets) against entire distributions of gene expression changes (datasets) for single or multiple experiments. The utility of GSMA lies in its flexibility to rapidly poll gene sets related by known biological function, or as designated solely by the end-user, against large numbers of datasets simultaneously. Conclusions: GSMA provides a simple and straightforward method for hypothesis testing in which genes are tested by groups across multiple datasets for patterns of expression enrichment.
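    The sketch below shows, under stated assumptions, the kind of group-wise test GSMA automates: comparing the expression changes of one gene set against the dataset-wide distribution. The data and set membership are synthetic, and the Mann-Whitney test stands in for whichever statistic GSMA actually applies.

```python
# Is a gene set's distribution of log fold-changes shifted upward
# relative to the rest of the dataset?
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
log_fc = dict(zip((f"g{i}" for i in range(5000)),
                  rng.normal(0.0, 1.0, 5000)))   # dataset-wide changes
gene_set = [f"g{i}" for i in range(40)]          # hypothetical gene set
for g in gene_set:
    log_fc[g] += 0.8                             # simulated upregulation

in_set = np.array([log_fc[g] for g in gene_set])
rest = np.array([v for g, v in log_fc.items() if g not in set(gene_set)])
stat, p = stats.mannwhitneyu(in_set, rest, alternative="greater")
print(f"U={stat:.0f}, one-sided p={p:.2e}")
```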

  14. MERRA Analytic Services

    Science.gov (United States)

    Schnase, J. L.; Duffy, D. Q.; McInerney, M. A.; Tamkin, G. S.; Thompson, J. H.; Gill, R.; Grieg, C. M.

    2012-12-01

    MERRA Analytic Services (MERRA/AS) is a cyberinfrastructure resource for developing and evaluating a new generation of climate data analysis capabilities. MERRA/AS supports OBS4MIP activities by reducing the time spent in the preparation of Modern Era Retrospective-Analysis for Research and Applications (MERRA) data used in data-model intercomparison. It also provides a testbed for experimental development of high-performance analytics. MERRA/AS is a cloud-based service built around the Virtual Climate Data Server (vCDS) technology that is currently used by the NASA Center for Climate Simulation (NCCS) to deliver Intergovernmental Panel on Climate Change (IPCC) data to the Earth System Grid Federation (ESGF). Crucial to its effectiveness, MERRA/AS's servers will use a workflow-generated realizable object capability to perform analyses over the MERRA data using the MapReduce approach to parallel storage-based computation. The results produced by these operations will be stored by the vCDS, which will also be able to host code sets for those who wish to explore the use of MapReduce for more advanced analytics. While the work described here focuses on the MERRA collection, these technologies can be used to publish other reanalysis, observational, and ancillary OBS4MIP data to ESGF and, importantly, offer an architectural approach to climate data services that can be generalized to applications and customers beyond the traditional climate research community. In this presentation, we describe our approach, experiences, lessons learned, and plans for the future. [Figure: (A) MERRA/AS software stack. (B) Example MERRA/AS interfaces.]
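    As a schematic of the MapReduce pattern mentioned above (and not of the MERRA/AS code itself), the sketch below computes per-month means from (month, value) records with explicit map and reduce steps.

```python
# MapReduce in miniature: map emits (key, (sum, count)) pairs,
# reduce merges partial sums per key, then means are derived.
from functools import reduce
from collections import defaultdict

records = [("2010-01", 250.3), ("2010-01", 251.1), ("2010-02", 249.8)]

def mapper(rec):
    month, value = rec
    return (month, (value, 1))

def reducer(acc, kv):
    key, (s, c) = kv
    total, count = acc[key]
    acc[key] = (total + s, count + c)
    return acc

partials = reduce(reducer, map(mapper, records),
                  defaultdict(lambda: (0.0, 0)))
means = {k: s / c for k, (s, c) in partials.items()}
print(means)
```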

  15. Mixed Initiative Visual Analytics Using Task-Driven Recommendations

    Energy Technology Data Exchange (ETDEWEB)

    Cook, Kristin A.; Cramer, Nicholas O.; Israel, David; Wolverton, Michael J.; Bruce, Joseph R.; Burtner, Edwin R.; Endert, Alexander

    2015-12-07

    Visual data analysis is composed of a collection of cognitive actions and tasks to decompose, internalize, and recombine data to produce knowledge and insight. Visual analytic tools provide interactive visual interfaces to data to support tasks involved in discovery and sensemaking, including forming hypotheses, asking questions, and evaluating and organizing evidence. Myriad analytic models can be incorporated into visual analytic systems, at the cost of increasing complexity in the analytic discourse between user and system. Techniques exist to increase the usability of interacting with such analytic models, such as inferring data models from user interactions to steer the underlying models of the system via semantic interaction, shielding users from having to do so explicitly. Such approaches are often also referred to as mixed-initiative systems. Researchers studying the sensemaking process have called for development of tools that facilitate analytic sensemaking through a combination of human and automated activities. However, design guidelines do not exist for mixed-initiative visual analytic systems to support iterative sensemaking. In this paper, we present a candidate set of design guidelines and introduce the Active Data Environment (ADE) prototype, a spatial workspace supporting the analytic process via task recommendations invoked by inferences on user interactions within the workspace. ADE recommends data and relationships based on a task model, enabling users to co-reason with the system about their data in a single, spatial workspace. This paper provides an illustrative use case, a technical description of ADE, and a discussion of the strengths and limitations of the approach.

  16. Integrated analytical assets aid botanical authenticity and adulteration management.

    Science.gov (United States)

    Simmler, Charlotte; Graham, James G; Chen, Shao-Nong; Pauli, Guido F

    2017-11-22

    This article reviews and develops a perspective for the meaning of authenticity in the context of quality assessment of botanical materials and the challenges associated with discerning adulterations vs. contaminations vs. impurities. Authentic botanicals are by definition non-adulterated, a mutually exclusive relationship that is confirmed through the application of a multilayered set of analytical methods designed to validate the (chemo)taxonomic identity of a botanical and certify that it is devoid of any adulteration. In practice, the ever-increasing sophistication in the process of intentional adulteration, as well as the growing number of botanicals entering the market, altogether necessitate a constant adaptation and reinforcement of authentication methods with new approaches, especially new technologies. This article summarizes the set of analytical methods - classical and contemporary - that can be employed in the authentication of botanicals. Particular emphasis is placed on the application of untargeted metabolomics and chemometrics. An NMR-based untargeted metabolomic model is proposed as a rapid, systematic, and complementary screening for the discrimination of authentic vs. potentially adulterated botanicals. Such analytical model can help advance the evaluation of botanical integrity in natural product research. Copyright © 2017. Published by Elsevier B.V.

  17. Quality-control analytical methods: endotoxins: essential testing for pyrogens in the compounding laboratory, part 3: a simplified endotoxin test method for compounded sterile preparations.

    Science.gov (United States)

    Cooper, James F

    2011-01-01

    The first two parts of the IJPC series on endotoxin testing explained the nature of pyrogenic contamination and described various Limulus amebocyte lysate methods for detecting and measuring endotoxin levels with the bacterial endotoxin test described in the United States Pharmacopeia. This third article in the series describes the endotoxin test that is simplest to perform for pharmacists who prefer to conduct an endotoxin assay at the time of compounding in the pharmacy setting.

  18. Advances in analytical tools for high throughput strain engineering

    DEFF Research Database (Denmark)

    Marcellin, Esteban; Nielsen, Lars Keld

    2018-01-01

    The emergence of inexpensive, base-perfect genome editing is revolutionising biology. Modern industrial biotechnology exploits the advances in genome editing in combination with automation, analytics and data integration to build high-throughput automated strain engineering pipelines, also known as biofoundries. Biofoundries replace the slow and inconsistent artisanal processes used to build microbial cell factories with an automated design–build–test cycle, considerably reducing the time needed to deliver commercially viable strains. Testing and hence learning remains relatively shallow, but recent advances in analytical chemistry promise to increase the depth of characterization possible. Analytics combined with models of cellular physiology in automated systems biology pipelines should enable deeper learning and hence a steeper pitch of the learning cycle. This review explores the progress

  19. Analytical purpose electron backscattering system

    International Nuclear Information System (INIS)

    Desdin, L.; Padron, I.; Laria, J.

    1996-01-01

    In this work, an analytical-purpose electron backscattering system developed at the Center of Applied Studies for Nuclear Development is described. This system can be applied for fast, exact and nondestructive testing of binary Al/Cu and Al/Ni alloys and for other applications

  20. Mobile Landing Platform with Core Capability Set (MLP w/CCS): Combined Initial Operational Test and Evaluation and Live Fire Test and Evaluation Report

    Science.gov (United States)

    2015-07-01

    Mobile Landing Platform with Core Capability Set (MLP w/CCS): Combined Initial Operational Test and Evaluation (IOT&E) and Live Fire Test and ... based on data from a series of integrated test events, a dedicated end-to-end Initial Operational Test and Evaluation (IOT&E), and two Marine Corps ... Internally Transportable Vehicles (ITVs). ... the LMSR to anchor within a few miles of the shore. Using MLP (CCS), the equipment is transported ashore

  1. [Pre-analytical stability before centrifugation of 7 biochemical analytes in whole blood].

    Science.gov (United States)

    Perrier-Cornet, Andreas; Moineau, Marie-Pierre; Narbonne, Valérie; Plee-Gautier, Emmanuelle; Le Saos, Fabienne; Carre, Jean-Luc

    2015-01-01

    The pre-analytical stability of 7 biochemical parameters (parathyroid hormone -PTH-, vitamins A, C, E and D, 1,25-dihydroxyvitamin D and insulin) at +4°C was studied on whole blood samples before centrifugation. The impact of freezing at -20°C was also analyzed for PTH and vitamin D. The differences in assay results for whole blood samples from 9 healthy adults, kept for different times between sampling and analysis, were compared using a Student t test. The 7 analytes investigated remained stable for up to 4 hours at +4°C in whole blood. This study showed that it is possible to accept uncentrifuged whole blood specimens kept at +4°C before analysis. PTH is affected by freezing whereas vitamin D is not.

  2. Evaluation of Calypte AWARE HIV-1/2 OMT antibody test as a screening test in an Indian setting

    Directory of Open Access Journals (Sweden)

    Ingole N

    2010-01-01

    Purpose: Integrated counselling and testing centres (ICTC) provide counselling and blood testing facilities for HIV diagnosis. Oral fluid tests provide an alternative for people who do not want blood to be drawn. Also, it avoids the risk of occupational exposure. The goal of this study was to evaluate the utility of the Calypte AWARE HIV-1/2 OMT antibody test as a screening test in an Indian setting. Materials and Methods: A cross-sectional study was carried out, after ethics committee approval, in 250 adult ICTC clients. Blood was collected from these clients and tested for HIV diagnosis as per routine policy, and the results were considered as the gold standard. Also, after another written informed consent, oral fluid was collected from the clients and tested for the presence of HIV antibodies. Twenty-five clients who had and 25 clients who had not completed their secondary school education (Group A and Group B, respectively) were also asked to perform and interpret the test on their own, and their findings and experiences were noted. Result: The sensitivity, specificity, PPV and NPV of the oral fluid antibody test were 100%, 98.51%, 94.11% and 100%, respectively. Seventy-six percent of clients preferred oral fluid testing. Group B found it more difficult to perform the test than Group A, and this difference was statistically significant (P ≤ 0.05). Conclusion: Oral fluid testing can be used as a screening test for HIV diagnosis; however, confirmation of reactive results by blood-based tests is a must.
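    The reported figures of merit follow directly from a 2x2 contingency table, as the sketch below shows; the counts are invented to be consistent with the reported percentages for 250 clients and are not taken from the paper.

```python
# Sensitivity, specificity, PPV and NPV from a 2x2 contingency table.
def diagnostics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "PPV": tp / (tp + fp),
        "NPV": tn / (tn + fn),
    }

# 48 true positives, 3 false positives, 0 false negatives, 199 true negatives
print(diagnostics(tp=48, fp=3, fn=0, tn=199))
```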

  3. Analytical Chemistry Division : annual report (for) 1985

    International Nuclear Information System (INIS)

    Mahadevan, N.

    1986-01-01

    An account of the various activities of the Analytical Chemistry Division of the Bhabha Atomic Research Centre, Bombay, during 1985 is presented. The main function of the Division is to provide chemical analysis support to India's atomic energy programme. In addition, the Division also offers its analytical services, mostly for measurement of concentrations at trace levels, to Indian industries and other research organizations in the country. A list of these determinations is given. The report also describes the research and development (R and D) activities, both completed and in progress, in the form of individual summaries. During the year, an ultra-trace analytical laboratory for analysis of critical samples without contamination was set up using indigenous material and technology. Publications and training activities of the staff, training of staff from other institutions, guidance by the staff for post-graduate degrees and invited talks by the staff are listed in the appendices at the end of the report. (M.G.B.)

  4. Testing a 1-D Analytical Salt Intrusion Model and the Predictive Equation in Malaysian Estuaries

    Science.gov (United States)

    Gisen, Jacqueline Isabella; Savenije, Hubert H. G.

    2013-04-01

    Little is known about the salt intrusion behaviour in Malaysian estuaries. Study of this topic sometimes requires large amounts of data, especially if 2-D or 3-D numerical models are used for analysis. In poor data environments, 1-D analytical models are more appropriate. For this reason, a fully analytical 1-D salt intrusion model, based on the theory of Savenije in 2005, was tested in three Malaysian estuaries (Bernam, Selangor and Muar) because it is simple and requires minimal data. To achieve this, site surveys were conducted in these estuaries during the dry season (June-August) at spring tide by the moving boat technique. Data on cross-sections, water levels and salinity were collected, and then analysed with the salt intrusion model. This paper demonstrates a good fit between the simulated and observed salinity distribution for all three estuaries. Additionally, the calibrated Van der Burgh's coefficient K, dispersion coefficient D0, and salt intrusion length L for the estuaries also displayed reasonable correlations with those calculated from the predictive equations. This indicates that not only the salt intrusion model but also the predictive model is valid for the case studies in Malaysia. Furthermore, the results from this study describe the current state of the estuaries, with which the Malaysian water authority can make decisions on limiting water abstraction or dredging. Keywords: salt intrusion, Malaysian estuaries, discharge, predictive model, dispersion
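    A hedged sketch of the steady-state salt intrusion curve in the general form used by Savenije (2005) for an exponentially converging estuary is shown below; all parameter values are illustrative assumptions, not the calibrated values for Bernam, Selangor or Muar.

```python
# Savenije-type steady-state intrusion: D/D0 = 1 - beta (exp(x/a) - 1),
# S = Sf + (S0 - Sf)(D/D0)^{1/K}, intrusion length L = a ln(1 + 1/beta).
import numpy as np

A0, a = 4000.0, 25000.0      # mouth cross-section (m^2), convergence length (m)
D0, K = 1500.0, 0.4          # dispersion at the mouth (m^2/s), Van der Burgh K
Q, S0, Sf = 50.0, 30.0, 0.5  # river discharge (m^3/s), mouth & river salinity

beta = K * a * Q / (D0 * A0)
L = a * np.log(1.0 + 1.0 / beta)         # salt intrusion length (m)

x = np.linspace(0.0, L, 6)
D_ratio = 1.0 - beta * (np.exp(x / a) - 1.0)
S = Sf + (S0 - Sf) * np.clip(D_ratio, 0.0, None) ** (1.0 / K)
for xi, si in zip(x, S):
    print(f"x = {xi/1000.0:6.1f} km  S = {si:5.2f} psu")
```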

  5. Field and analytical data relating to the 1972 and 1978 surveys of residual contamination of the Monte Bello Islands and Emu atomic weapons test sites

    International Nuclear Information System (INIS)

    Cooper, M.B.; Duggleby, J.C.

    1980-12-01

    Radiation surveys of the Monte Bello Islands test site in Western Australia, and the Emu test site in South Australia, were carried out in 1972 and 1978. The results have been published in ARL reports ARL/TR--010 and ARL/TR--012. The detailed field and analytical data which formed the basis of those publications are given

  6. Healthcare predictive analytics: An overview with a focus on Saudi Arabia.

    Science.gov (United States)

    Alharthi, Hana

    2018-03-08

    Despite a newfound wealth of data and information, the healthcare sector is lacking in actionable knowledge. This is largely because healthcare data, though plentiful, tends to be inherently complex and fragmented. Health data analytics, with an emphasis on predictive analytics, is emerging as a transformative tool that can enable more proactive and preventative treatment options. This review considers the ways in which predictive analytics has been applied in the for-profit business sector to generate well-timed and accurate predictions of key outcomes, with a focus on key features that may be applicable to healthcare-specific applications. Published medical research presenting assessments of predictive analytics technology in medical applications is reviewed, with particular emphasis on how hospitals have integrated predictive analytics into their day-to-day healthcare services to improve quality of care. This review also highlights the numerous challenges of implementing predictive analytics in healthcare settings and concludes with a discussion of current efforts to implement healthcare data analytics in the developing country, Saudi Arabia. Copyright © 2018 The Author. Published by Elsevier Ltd. All rights reserved.

  7. Setting up and performance of a laser enhanced ionisation spectrometer

    International Nuclear Information System (INIS)

    Chandola, L.C.; Khanna, P.P.; Razvi, M.A.N.

    1990-01-01

    A laser enhanced ionisation (LEI) spectrometer has been successfully set up around an excimer-laser-pumped dye laser. The performance of the spectrometer has been tested by analysing sodium in water solutions. A straight-line working curve has been obtained in the concentration range of 1-1000 ng/ml of Na. The effect of parameters such as laser power, ion collector electrode voltage and load resistance on LEI signals has been investigated. The spectrometer is useful not only for analytical purposes but also for laser spectroscopic studies of species formed in flames, the study of the phenomenon of combustion, etc. (author). 1 tab., 10 figs., 5 refs.
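    A straight-line working curve of this kind is conventionally verified by a log-log regression of signal against concentration, as in the minimal sketch below; the signal values are invented placeholders.

```python
# Log-log fit of an analytical working curve; a slope near 1 indicates
# a linear signal-concentration relationship over the range.
import numpy as np

conc = np.array([1.0, 10.0, 100.0, 1000.0])   # ng/ml Na, as in the range
signal = np.array([0.021, 0.20, 2.1, 19.8])   # invented LEI signals (a.u.)

slope, intercept = np.polyfit(np.log10(conc), np.log10(signal), 1)
print(f"log-log slope {slope:.2f} (≈1 indicates a linear working curve)")
```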

  8. Transfer of analytical procedures: a panel of strategies selected for risk management, with emphasis on an integrated equivalence-based comparative testing approach.

    Science.gov (United States)

    Agut, C; Caron, A; Giordano, C; Hoffman, D; Ségalini, A

    2011-09-10

    In 2001, a multidisciplinary team of analytical scientists and statisticians at Sanofi-aventis published a methodology which has governed, from that time, the transfers of release monographs from R&D sites to Manufacturing sites. This article provides an overview of the recent adaptations brought to this original methodology, taking advantage of our experience and the new regulatory framework, and, in particular, the risk management perspective introduced by ICH Q9. Although some alternative strategies have been introduced in our practices, the comparative testing strategy, based on equivalence testing as the statistical approach, remains the standard for assays bearing on very critical quality attributes. This is conducted so as to control the most important consumer's risks involved, at two levels, in analytical decisions in the frame of transfer studies: the risk, for the receiving laboratory, of taking poor release decisions with the analytical method, and the risk, for the sending laboratory, of accrediting a receiving laboratory despite insufficient performance with the method. Among the enhancements to the comparative studies, the manuscript presents the process settled within our company for a better integration of the transfer study into the method life-cycle, as well as proposals of generic acceptance criteria and designs for assay and related-substances methods. While maintaining the rigor and selectivity of the original approach, these improvements tend towards increased efficiency in transfer operations. Copyright © 2011 Elsevier B.V. All rights reserved.
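    A minimal sketch of the statistical core of such an equivalence-based comparison, the two one-sided tests (TOST), is given below; the data, the ±2% acceptance limits and the pooled-degrees-of-freedom simplification are assumptions for illustration, not the company's actual criteria.

```python
# TOST: declare equivalence if the inter-laboratory bias is shown to lie
# within pre-set acceptance limits (both one-sided tests significant).
import numpy as np
from scipy import stats

sending = np.array([99.8, 100.1, 99.9, 100.2, 100.0, 99.7])
receiving = np.array([100.4, 100.6, 100.1, 100.5, 100.3, 100.7])
low, high = -2.0, 2.0                    # acceptance limits (% of label claim)

diff = receiving.mean() - sending.mean()
se = np.sqrt(sending.var(ddof=1) / len(sending)
             + receiving.var(ddof=1) / len(receiving))
df = len(sending) + len(receiving) - 2   # simple pooled-df approximation

p_lower = 1.0 - stats.t.cdf((diff - low) / se, df)   # H0: diff <= low
p_upper = stats.t.cdf((diff - high) / se, df)        # H0: diff >= high
p_tost = max(p_lower, p_upper)
print(f"bias {diff:.2f}; TOST p = {p_tost:.4f} "
      f"({'equivalent' if p_tost < 0.05 else 'not shown equivalent'})")
```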

  9. MAGNETO-FRICTIONAL MODELING OF CORONAL NONLINEAR FORCE-FREE FIELDS. I. TESTING WITH ANALYTIC SOLUTIONS

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Y.; Keppens, R. [School of Astronomy and Space Science, Nanjing University, Nanjing 210023 (China); Xia, C. [Centre for mathematical Plasma-Astrophysics, Department of Mathematics, KU Leuven, B-3001 Leuven (Belgium); Valori, G., E-mail: guoyang@nju.edu.cn [University College London, Mullard Space Science Laboratory, Holmbury St. Mary, Dorking, Surrey RH5 6NT (United Kingdom)

    2016-09-10

    We report our implementation of the magneto-frictional method in the Message Passing Interface Adaptive Mesh Refinement Versatile Advection Code (MPI-AMRVAC). The method aims at applications where local adaptive mesh refinement (AMR) is essential to make follow-up dynamical modeling affordable. We quantify its performance in both domain-decomposed uniform grids and block-adaptive AMR computations, using all frequently employed force-free, divergence-free, and other vector comparison metrics. As test cases, we revisit the semi-analytic solution of Low and Lou in both Cartesian and spherical geometries, along with the topologically challenging Titov–Démoulin model. We compare different combinations of spatial and temporal discretizations, and find that the fourth-order central difference with a local Lax–Friedrichs dissipation term in a single-step marching scheme is an optimal combination. The initial condition is provided by the potential field, which is the potential field source surface model in spherical geometry. Various boundary conditions are adopted, ranging from fully prescribed cases where all boundaries are assigned with the semi-analytic models, to solar-like cases where only the magnetic field at the bottom is known. Our results demonstrate that all the metrics compare favorably to previous works in both Cartesian and spherical coordinates. Cases with several AMR levels perform in accordance with their effective resolutions. The magneto-frictional method in MPI-AMRVAC allows us to model a region of interest with high spatial resolution and large field of view simultaneously, as required by observation-constrained extrapolations using vector data provided with modern instruments. The applications of the magneto-frictional method to observations are shown in an accompanying paper.

  10. Analytical quality control [An IAEA service

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1973-07-01

    In analytical chemistry the determination of small or trace amounts of elements or compounds in different types of materials is increasingly important. The results of these findings have a great influence on different fields of science, and on human life. Their reliability, precision and accuracy must, therefore, be checked by analytical quality control measures. The International Atomic Energy Agency (IAEA) set up an Analytical Quality Control Service (AQCS) in 1962 to assist laboratories in Member States in the assessment of their reliability in radionuclide analysis, and in other branches of applied analysis in which radionuclides may be used as analytical implements. For practical reasons, most analytical laboratories are not in a position to check accuracy internally, as frequently resources are available for only one method; standardized sample material, particularly in the case of trace analysis, is not available and can be prepared by the institutes themselves only in exceptional cases; intercomparisons are organized rather seldom and many important types of analysis are so far not covered. AQCS assistance is provided by the shipment to laboratories of standard reference materials containing known quantities of different trace elements or radionuclides, as well as by the organization of analytical intercomparisons in which the participating laboratories are provided with aliquots of homogenized material of unknown composition for analysis. In the latter case the laboratories report their data to the Agency's laboratory, which calculates averages and distributions of results and advises each laboratory of its performance relative to all the others. Throughout the years several dozens of intercomparisons have been organized and many thousands of samples provided. The service offered, as a consequence, has grown enormously. The programme for 1973 and 1974, which is currently being distributed to Member States, will contain 31 different types of materials.

  11. Evaluation of farmed cod products by a trained sensory panel and consumers in different test settings

    NARCIS (Netherlands)

    Sveinsdottir, K.; Martinsdottir, E.; Thorsdottir, F.; Schelvis-Smit, A.A.M.; Kole, A.; Thorsdottir, I.

    2010-01-01

    Sensory characteristics of farmed cod exposed to low or conventional stress levels prior to slaughter were evaluated by a trained sensory panel. Consumers in two different settings, central location test (CLT) and home-use test (HUT), also tasted the products and rated them according to overall

  12. SM4AM: A Semantic Metamodel for Analytical Metadata

    DEFF Research Database (Denmark)

    Varga, Jovan; Romero, Oscar; Pedersen, Torben Bach

    2014-01-01

    Next generation BI systems emerge as platforms where traditional BI tools meet semi-structured and unstructured data coming from the Web. In these settings, the user-centric orientation represents a key characteristic for the acceptance and wide usage by numerous and diverse end users in their data... We present SM4AM, a Semantic Metamodel for Analytical Metadata created as an RDF formalization of the Analytical Metadata artifacts needed for user assistance exploitation purposes in next generation BI systems. We consider the Linked Data initiative and its relevance for user assistance...

  13. X-ray fluorescence (XRF) set-up with a low power X-ray tube

    International Nuclear Information System (INIS)

    Gupta, Sheenu; Deep, Kanan; Jain, Lalita; Ansari, M.A.; Mittal, Vijay Kumar; Mittal, Raj

    2010-01-01

    The X-ray fluorescence set-up with a 100 W X-ray tube comprises a computer-controlled system developed for remote operation and monitoring of the tube, and an adjustable, stable 3D arrangement to procure variable excitation energies with low scattered background. The system was tested at different filament currents/anode voltages. The MDL of the set-up at 0.05-1.00 mA/4-12 kV is found to be ∼(1-100) ppm for K and L excitations and ∼(200-700) ppm for M excitations of elements, and improves with filament current and anode voltage. Moreover, L measurements for Sm and Eu at five K X-ray energies of elements (Z=29-40) and analytical determinations in some synthetic samples were undertaken.

  14. Statistical correlation of structural mode shapes from test measurements and NASTRAN analytical values

    Science.gov (United States)

    Purves, L.; Strang, R. F.; Dube, M. P.; Alea, P.; Ferragut, N.; Hershfeld, D.

    1983-01-01

    The software and procedures of a system of programs used to generate a report of the statistical correlation between NASTRAN modal analysis results and physical test results from modal surveys are described. Topics discussed include: a mathematical description of statistical correlation, a user's guide for generating a statistical correlation report, a programmer's guide describing the organization and functions of individual programs leading to a statistical correlation report, and a set of examples, including complete listings of programs and input and output data.
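
    One standard way to quantify agreement between measured and NASTRAN mode shapes is the modal assurance criterion (MAC), offered here as context; the report's own correlation statistic may differ, and real mode shapes at matched degrees of freedom are assumed:

      import numpy as np

      def mac(phi_test, phi_fem):
          # MAC matrix between measured and analytical mode shapes.
          # Columns of each array are mode-shape vectors sampled at the same DOFs.
          num = np.abs(phi_test.T @ phi_fem) ** 2
          den = np.outer(np.sum(phi_test ** 2, axis=0), np.sum(phi_fem ** 2, axis=0))
          return num / den   # values near 1 on the diagonal indicate well-paired modes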

  15. Pre-analytical and analytical aspects affecting clinical reliability of plasma glucose results.

    Science.gov (United States)

    Pasqualetti, Sara; Braga, Federica; Panteghini, Mauro

    2017-07-01

    The measurement of plasma glucose (PG) plays a central role in recognizing disturbances in carbohydrate metabolism, with established decision limits that are globally accepted. This requires that PG results are reliable and unequivocally valid no matter where they are obtained. To control the pre-analytical variability of PG and prevent in vitro glycolysis, the use of citrate as rapidly effective glycolysis inhibitor has been proposed. However, the commercial availability of several tubes with studies showing different performance has created confusion among users. Moreover, and more importantly, studies have shown that tubes promptly inhibiting glycolysis give PG results that are significantly higher than tubes containing sodium fluoride only, used in the majority of studies generating the current PG cut-points, with a different clinical classification of subjects. From the analytical point of view, to be equivalent among different measuring systems, PG results should be traceable to a recognized higher-order reference via the implementation of an unbroken metrological hierarchy. In doing this, it is important that manufacturers of measuring systems consider the uncertainty accumulated through the different steps of the selected traceability chain. In particular, PG results should fulfil analytical performance specifications defined to fit the intended clinical application. Since PG has tight homeostatic control, its biological variability may be used to define these limits. Alternatively, given the central diagnostic role of the analyte, an outcome model showing the impact of analytical performance of test on clinical classifications of subjects can be used. Using these specifications, performance assessment studies employing commutable control materials with values assigned by reference procedure have shown that the quality of PG measurements is often far from desirable and that problems are exacerbated using point-of-care devices.

  16. Testing of the analytical anisotropic algorithm for photon dose calculation

    International Nuclear Information System (INIS)

    Esch, Ann van; Tillikainen, Laura; Pyykkonen, Jukka; Tenhunen, Mikko; Helminen, Hannu; Siljamaeki, Sami; Alakuijala, Jyrki; Paiusco, Marta; Iori, Mauro; Huyskens, Dominique P.

    2006-01-01

    The analytical anisotropic algorithm (AAA) was implemented in the Eclipse (Varian Medical Systems) treatment planning system to replace the single pencil beam (SPB) algorithm for the calculation of dose distributions for photon beams. AAA was developed to improve the dose calculation accuracy, especially in heterogeneous media. The total dose deposition is calculated as the superposition of the dose deposited by two photon sources (primary and secondary) and by an electron contamination source. The photon dose is calculated as a three-dimensional convolution of Monte-Carlo precalculated scatter kernels, scaled according to the electron density matrix. For the configuration of AAA, an optimization algorithm determines the parameters characterizing the multiple source model by optimizing the agreement between the calculated and measured depth dose curves and profiles for the basic beam data. We have combined the acceptance tests obtained in three different departments for 6, 15, and 18 MV photon beams. The accuracy of AAA was tested for different field sizes (symmetric and asymmetric) for open fields, wedged fields, and static and dynamic multileaf collimation fields. Depth dose behavior at different source-to-phantom distances was investigated. Measurements were performed on homogeneous, water equivalent phantoms, on simple phantoms containing cork inhomogeneities, and on the thorax of an anthropomorphic phantom. Comparisons were made among measurements, AAA, and SPB calculations. The optimization procedure for the configuration of the algorithm was successful in reproducing the basic beam data with an overall accuracy of 3%, 1 mm in the build-up region, and 1%, 1 mm elsewhere. Testing of the algorithm in more clinical setups showed comparable results for depth dose curves, profiles, and monitor units of symmetric open and wedged beams below dmax. The electron contamination model was found to be suboptimal to model the dose around dmax, especially for physical
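
    The kernel-superposition idea named above (dose as a convolution of precalculated scatter kernels with density scaling) can be sketched as follows. The FFT-based circular convolution, the crude density scaling of the energy-release term, and the array shapes are simplifying assumptions for illustration, not the AAA algorithm itself:

      import numpy as np

      def dose_convolution(terma, kernel, rel_electron_density):
          # terma: 3D energy released per unit mass; kernel: precomputed scatter kernel.
          # Real algorithms scale the kernels along ray lines and handle boundaries;
          # here density scaling is applied to the source term only, and the FFT
          # convolution wraps around periodically.
          t = terma * rel_electron_density
          D = np.fft.ifftn(np.fft.fftn(t) * np.fft.fftn(kernel, t.shape)).real
          return D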

  17. Sampling Large Graphs for Anticipatory Analytics

    Science.gov (United States)

    2015-05-15

    Random area sampling [8] is a "snowball" sampling method in which a set of random seed vertices are selected and areas... ...systems, greater human-in-the-loop involvement, or through complex algorithms. We are investigating the use of sampling to mitigate these challenges...
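
    The record above survives only as fragments, but the seed-and-expand ("snowball") idea it names is standard. A minimal sketch under the assumption of a dict-of-neighbours graph; the function name, parameters, and stopping rule are illustrative, not taken from the paper:

      import random

      def snowball_sample(adj, n_seeds=5, max_nodes=1000, seed=0):
          # adj: dict mapping vertex -> iterable of neighbours (labels assumed sortable)
          rng = random.Random(seed)
          seeds = rng.sample(sorted(adj), n_seeds)
          sampled = set(seeds)
          frontier = list(seeds)
          while frontier and len(sampled) < max_nodes:
              v = frontier.pop(0)                 # expand the area around each seed
              for u in adj[v]:
                  if u not in sampled and len(sampled) < max_nodes:
                      sampled.add(u)
                      frontier.append(u)
          return sampled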

  18. Aplikasi Analytical Hierarchy Process Pada Pemilihan Metode Analisis Zat Organik Dalam Air

    Directory of Open Access Journals (Sweden)

    Dino Rimantho

    2016-07-01

    Water is among the food products analyzed in water chemistry and environmental laboratories. One of the parameters analyzed is organic substances. A number of samples that is not commensurate with the available analytical capacity can cause delays in test results. The Analytical Hierarchy Process was applied to evaluate the analytical methods in use. The alternative methods tested comprised the titrimetric method, spectrophotometry, and total organic carbon (TOC). Respondents consisted of the deputy technical manager, the laboratory coordinator, and two senior analysts. The alternative selected was the TOC method. Based on the results obtained, the proposed improvement is the TOC method, with a 10-15 minute analysis time and the use of CRMs to ensure the validity of the analysis results.
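
    For context, the AHP machinery mentioned above reduces to extracting priorities from the principal eigenvector of a pairwise-comparison matrix and checking its consistency. The matrix entries below are illustrative, not the study's actual judgments:

      import numpy as np

      # Pairwise comparisons of three alternatives (e.g. TOC vs titrimetry vs
      # spectrophotometry); values are made up for illustration.
      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])

      w, v = np.linalg.eig(A)
      k = np.argmax(w.real)                       # principal eigenvalue lambda_max
      priorities = np.abs(v[:, k].real)
      priorities /= priorities.sum()              # normalized priority vector

      n = len(A)
      ci = (w[k].real - n) / (n - 1)              # consistency index
      cr = ci / 0.58                              # random index RI = 0.58 for n = 3
      print(priorities, cr)                       # CR < 0.1 is conventionally acceptable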

  19. The Yoccoz Combinatorial Analytic Invariant

    DEFF Research Database (Denmark)

    Petersen, Carsten Lunde; Roesch, Pascale

    2008-01-01

    In this paper we develop a combinatorial analytic encoding of the Mandelbrot set M. The encoding is implicit in Yoccoz' proof of local connectivity of M at any Yoccoz parameter, i.e. any at most finitely renormalizable parameter for which all periodic orbits are repelling. Using this encoding we ... reprove that the dyadic veins of M are arcs and that, more generally, any two Yoccoz parameters are joined by a unique ruled (in the sense of Douady-Hubbard) arc in M.

  20. Role of modern analytical techniques in the production of uranium metal

    International Nuclear Information System (INIS)

    Hareendran, K.N.; Roy, S.B.

    2009-01-01

    Production of nuclear grade uranium metal conforming to its stringent specification with respect to metallic and non-metallic impurities necessitates implementation of a comprehensive quality control regime. The founding members of the Uranium Metal Plant realised the importance of this aspect of metal production, and a quality control laboratory was set up as part of the production plant. In the initial stages of its existence, the laboratory mainly catered to the process control analysis of the plant process samples, while the Spectroscopy Division and Analytical Division of BARC provided analysis of trace metallic impurities in the intermediates as well as in the product uranium metal. This laboratory also provided invaluable R and D support for the optimization of the process involving both calciothermy and magnesiothermy. Prior to 1985, the analytical procedures used were limited to classical methods of analysis with minimal instrumental procedures. The first major analytical instrument, a flame AAS, was installed in 1985 and a beginning in trace analysis was made. During the last 15 years, however, the Quality Control Section has modernized the analytical set-up by acquiring appropriate instruments. Presently the facility has implemented a complete quality control and quality assurance programme covering all aspects of uranium metal production, viz. analysis of raw materials, process samples and waste disposal samples, and also determination of all the specification elements in uranium metal. The current analytical practices followed in QCS are presented here.

  1. Detection of sensor failures in nuclear plants using analytic redundancy

    International Nuclear Information System (INIS)

    Kitamura, M.

    1980-01-01

    A method for on-line, nonperturbative detection and identification of sensor failures in nuclear power plants was studied to determine its feasibility. This method is called analytic redundancy, or functional redundancy. Sensor failure has traditionally been detected by comparing multiple signals from redundant sensors, such as in two-out-of-three logic. In analytic redundancy, with the help of an assumed model of the physical system, the signals from a set of sensors are processed to reproduce the signals from all system sensors
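
    A minimal sketch of the analytic-redundancy idea described above: reproduce each sensor reading from an assumed model of the physical system and flag failures from the residuals. The linear measurement model H, the state estimate, and the threshold are illustrative assumptions:

      import numpy as np

      def check_sensors(y, H, x_hat, sigma=0.1, threshold=3.0):
          # y: measured sensor vector; H: measurement model matrix;
          # x_hat: current estimate of the physical state.
          residual = y - H @ x_hat                       # disagreement with the analytic model
          return np.abs(residual) > threshold * sigma    # per-sensor failure flags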

  2. HIV rapid testing in a Veterans Affairs hospital ED setting: a 5-year sustainability evaluation.

    Science.gov (United States)

    Knapp, Herschel; Hagedorn, Hildi; Anaya, Henry D

    2014-08-01

    Routine HIV testing in primary care settings is now recommended in the United States. The US Department of Veterans Affairs (VA) has increased the number of patients tested for HIV, but overall HIV testing rates in VA remain low. A proven strategy for increasing such testing involves nurse-initiated HIV rapid testing (HIV RT). The purpose of this work was to use a mixed methodology approach to evaluate the 5-year sustainability of an intervention that implemented HIV RT in a VA emergency department setting in a large, urban VA medical center to reduce missed diagnostic and treatment opportunities in this vulnerable patient population. In-person semistructured interviews were conducted with providers and stakeholders. Interview notes were qualitatively coded for emerging themes. Quarterly testing rates were evaluated for a 5-year time span starting from the launch in July 2008. Findings indicate that HIV RT was sustained by the enthusiasm of 2 clinical champions who oversaw the registered nurses responsible for conducting the testing. The departure of the clinical champions was correlated with a substantial drop-off in testing. Findings also indicate potential strategies for improving sustainability including engaging senior leadership in the project, engaging line staff in the implementation planning from the start to increase ownership over the innovation, incorporating information into initial training explaining the importance of the innovation to quality patient care, providing ongoing training to maintain skills, and providing routine progress reports to staff to demonstrate the ongoing impact of their efforts. Published by Elsevier Inc.

  3. Elasto-plastic strain analysis by a semi-analytical method

    Indian Academy of Sciences (India)

    ...deformation problems following a semi-analytical method, incorporating the com-... The set of equations in (8) is non-linear in nature and is solved by direct ... Here, [K] and [M] are the stiffness matrix and mass matrix, which are of the form ...

  4. Analytical solutions in the two-cavity coupling problem

    International Nuclear Information System (INIS)

    Ayzatsky, N.I.

    2000-01-01

    Analytical solutions of precise equations that describe the rf-coupling of two cavities through a co-axial cylindrical hole are given for various limiting cases. For their derivation we have used the method of solution of an infinite set of linear algebraic equations, based on its transformation into dual integral equations.

  5. Web Analytics

    Science.gov (United States)

    EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.

  6. Interacting steps with finite-range interactions: Analytical approximation and numerical results

    Science.gov (United States)

    Jaramillo, Diego Felipe; Téllez, Gabriel; González, Diego Luis; Einstein, T. L.

    2013-05-01

    We calculate an analytical expression for the terrace-width distribution P(s) for an interacting step system with nearest- and next-nearest-neighbor interactions. Our model is derived by mapping the step system onto a statistically equivalent one-dimensional system of classical particles. The validity of the model is tested with several numerical simulations and experimental results. We explore the effect of the range of interactions q on the functional form of the terrace-width distribution and pair correlation functions. For physically plausible interactions, we find modest changes when next-nearest neighbor interactions are included and generally negligible changes when more distant interactions are allowed. We discuss methods for extracting from simulated experimental data the characteristic scale-setting terms in assumed potential forms.
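
    For orientation, terrace-width distributions in this literature are most often fitted with the generalized Wigner surmise, quoted here as background only (the paper derives its own finite-range expression, which this form does not reproduce):

      P_\varrho(s) = a_\varrho \, s^{\varrho} \exp\left(-b_\varrho s^{2}\right)

    where \varrho parameterizes the dimensionless strength of the step-step repulsion and the constants a_\varrho, b_\varrho are fixed by normalization and unit mean of s.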

  7. REACH, non-testing approaches and the urgent need for a change in mind set

    NARCIS (Netherlands)

    Schaafsma, G.; Kroese, E.D.; Tielemans, E.L.J.P.; Sandt, J.J.M. van de; Leeuwen, C.J. van

    2009-01-01

    The objectives of REACH cannot be achieved under the current risk assessment approach. A change in mind set among all the relevant stakeholders is needed: risk assessment should move away from a labor-intensive and animal-consuming approach to intelligent and pragmatic testing, by combining exposure

  8. Library improvement through data analytics

    CERN Document Server

    Farmer, Lesley S J

    2017-01-01

    This book shows how to act on and make sense of data in libraries. Using a range of techniques, tools and methodologies it explains how data can be used to help inform decision making at every level. Sound data analytics is the foundation for making an evidence-based case for libraries, in addition to guiding myriad organizational decisions, from optimizing operations for efficiency to responding to community needs. Designed to be useful for beginners as well as those with a background in data, this book introduces the basics of a six-point framework that can be applied to a variety of library settings for effective system-based, data-driven management. Library Improvement Through Data Analytics includes: - the basics of statistical concepts - recommended data sources for various library functions and processes, and guidance for using census, university, or government data in analysis - techniques for cleaning data - matching data to appropriate data analysis methods - how to make descriptive statistics m...

  9. A survey of residual analysis and a new test of residual trend.

    Science.gov (United States)

    McDowell, J J; Calvin, Olivia L; Klapes, Bryan

    2016-05-01

    A survey of residual analysis in behavior-analytic research reveals that existing methods are problematic in one way or another. A new test for residual trends is proposed that avoids the problematic features of the existing methods. It entails fitting cubic polynomials to sets of residuals and comparing their effect sizes to those that would be expected if the sets of residuals were random. To this end, sampling distributions of effect sizes for fits of a cubic polynomial to random data were obtained by generating sets of random standardized residuals of various sizes, n. A cubic polynomial was then fitted to each set of residuals and its effect size was calculated. This yielded a sampling distribution of effect sizes for each n. To test for a residual trend in experimental data, the median effect size of cubic-polynomial fits to sets of experimental residuals can be compared to the median of the corresponding sampling distribution of effect sizes for random residuals using a sign test. An example from the literature, which entailed comparing mathematical and computational models of continuous choice, is used to illustrate the utility of the test. © 2016 Society for the Experimental Analysis of Behavior.
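
    The proposed test can be sketched directly from the description above. Here the effect size of a cubic fit is taken to be R-squared, a stand-in for the paper's exact measure, and the random sampling distribution is built by simulation:

      import numpy as np

      def cubic_r2(res):
          # Effect size of a cubic-polynomial fit to one set of residuals.
          x = np.arange(len(res))
          pred = np.polyval(np.polyfit(x, res, 3), x)
          ss_res = np.sum((res - pred) ** 2)
          ss_tot = np.sum((res - res.mean()) ** 2)
          return 1 - ss_res / ss_tot

      def sampling_median(n, reps=2000, rng=np.random.default_rng(0)):
          # Median effect size for cubic fits to random standardized residuals of size n.
          return np.median([cubic_r2(rng.standard_normal(n)) for _ in range(reps)])

    The median R-squared of cubic fits to the experimental residual sets would then be compared against sampling_median(n) with a sign test (for instance, scipy.stats.binomtest on the count of experimental values exceeding the random median).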

  10. The Role of Nanoparticle Design in Determining Analytical Performance of Lateral Flow Immunoassays.

    Science.gov (United States)

    Zhan, Li; Guo, Shuang-Zhuang; Song, Fayi; Gong, Yan; Xu, Feng; Boulware, David R; McAlpine, Michael C; Chan, Warren C W; Bischof, John C

    2017-12-13

    Rapid, simple, and cost-effective diagnostics are needed to improve healthcare at the point of care (POC). However, the most widely used POC diagnostic, the lateral flow immunoassay (LFA), is ∼1000 times less sensitive and has a smaller analytical range than laboratory tests, requiring a confirmatory test to establish truly negative results. Here, a rational and systematic strategy is used to design the LFA contrast label (i.e., gold nanoparticles) to improve the analytical sensitivity, analytical detection range, and antigen quantification of LFAs. Specifically, we discovered that the size (30, 60, or 100 nm) of the gold nanoparticles is a main contributor to the LFA analytical performance through both the degree of receptor interaction and the ultimate visual or thermal contrast signals. Using the optimal LFA design, we demonstrated the ability to improve the analytical sensitivity by 256-fold and expand the analytical detection range from 3 log10 to 6 log10 for diagnosing patients with inflammatory conditions by measuring C-reactive protein. This work demonstrates that, with appropriate design of the contrast label, a simple and commonly used diagnostic technology can compete with more expensive state-of-the-art laboratory tests.

  11. Interlaboratory analytical performance studies; a way to estimate measurement uncertainty

    Directory of Open Access Journals (Sweden)

    Elżbieta Łysiak-Pastuszak

    2004-09-01

    Comparability of data collected within collaborative programmes became the key challenge of analytical chemistry in the 1990s, including the monitoring of the marine environment. To obtain relevant and reliable data, the analytical process has to proceed under a well-established Quality Assurance (QA) system with external analytical proficiency tests as an inherent component. A programme called Quality Assurance in Marine Monitoring in Europe (QUASIMEME) was established in 1993 and evolved over the years as the major provider of QA proficiency tests for nutrients, trace metals and chlorinated organic compounds in marine environment studies. The article presents an evaluation of results obtained in QUASIMEME Laboratory Performance Studies by the monitoring laboratory of the Institute of Meteorology and Water Management (Gdynia, Poland) in exercises on nutrient determination in seawater. The measurement uncertainty estimated from routine internal quality control measurements and from the results of analytical performance exercises is also presented in the paper.

  12. Installation for analytic chemistry under irradiation

    International Nuclear Information System (INIS)

    Fradin, J.; Azoeuf, P.; Guillon, A.

    1966-01-01

    An installation has been set up for carrying out manipulations and chemical analyses of radioactive products. It is completely remote-controlled and of linear shape, 15 metres long; it is made up of three zones: an active zone containing the apparatus, a rear zone giving access to the active zone, and a forward zone, independent of the two others and completely shielded, from which the apparatus is remote-controlled. The whole assembly has been designed so that each apparatus corresponding to an analytical technique is set up in a sealed enclosure. The sealed enclosures are interconnected by a conveyor. After three years of operation, a critical review of the installation is now presented. (authors) [fr

  13. Pre-analytical and post-analytical evaluation in the era of molecular diagnosis of sexually transmitted diseases: cellularity control and internal control

    Directory of Open Access Journals (Sweden)

    Loria Bianchi

    2014-06-01

    Background. The increase in molecular tests performed on DNA extracted from various biological materials should not occur without adequate standardization of the pre-analytical and post-analytical phases. Materials and Methods. The aim of this study was to evaluate the role of an internal control (IC) in standardizing the pre-analytical phase and the role of a cellularity control (CC) in evaluating the suitability of biological matrices, and their influence on false negative results. 120 cervical swabs (CS) were pre-treated and extracted following 3 different protocols. Extraction performance was evaluated by amplification of: the IC, added to each extraction mix; the human gene HPRT1 (CC) with RT-PCR to quantify sample cellularity; and the L1 region of HPV with SPF10 primers. 135 urine samples, 135 urethral swabs, 553 CS and 332 ThinPrep swabs (TP) were tested for C. trachomatis (CT) and U. parvum (UP) with RT-PCR and for HPV by endpoint PCR. Samples were also tested for cellularity. Results. The extraction protocol with the highest average cellularity (Ac) per sample showed the lowest number of samples with inhibitors; the highest HPV positivity was achieved by the protocol with the greatest Ac per PCR. CS and TP below 300,000 cells/sample showed a significant decrease in UP (P<0.01) and HPV (P<0.005) positivity. Female urine below 40,000 cells/mL was inadequate to detect UP (P<0.05). Conclusions. Our data show that IC and CC allow optimization of the pre-analytical phase, with an increase in analytical quality. Cellularity per sample allows better evaluation of sample adequacy, crucial to avoid false negative results, while cellularity per PCR allows better optimization of PCR amplification. Further data are required to define the optimal cut-off for result normalization.

  14. Social Set Visualizer (SoSeVi) II

    DEFF Research Database (Denmark)

    Flesch, Benjamin; Vatrapu, Ravi; Mukkamala, Raghava Rao

    2016-01-01

    Current state-of-the-art in big social data analytics is largely limited to graph theoretical approaches such as social network analysis (SNA) informed by the social philosophical approach of relational sociology. This paper proposes and illustrates an alternate holistic approach to big social data... (SoSeVi). The development of the dashboard involved cutting-edge open source visual analytics libraries (D3.js) and creation of new visualizations such as visualizations of actor mobility across time and space, conversational comets, and more. Evaluation of the dashboard consisted of technical testing, usability testing...

  15. The Manifestation of Stopping Sets and Absorbing Sets as Deviations on the Computation Trees of LDPC Codes

    Directory of Open Access Journals (Sweden)

    Eric Psota

    2010-01-01

    The error mechanisms of iterative message-passing decoders for low-density parity-check codes are studied. A tutorial review is given of the various graphical structures, including trapping sets, stopping sets, and absorbing sets that are frequently used to characterize the errors observed in simulations of iterative decoding of low-density parity-check codes. The connections between trapping sets and deviations on computation trees are explored in depth using the notion of problematic trapping sets in order to bridge the experimental and analytic approaches to these error mechanisms. A new iterative algorithm for finding low-weight problematic trapping sets is presented and shown to be capable of identifying many trapping sets that are frequently observed during iterative decoding of low-density parity-check codes on the additive white Gaussian noise channel. Finally, a new method is given for characterizing the weight of deviations that result from problematic trapping sets.

  16. Analytical reasoning task reveals limits of social learning in networks.

    Science.gov (United States)

    Rahwan, Iyad; Krasnoshtan, Dmytro; Shariff, Azim; Bonnefon, Jean-François

    2014-04-06

    Social learning-by observing and copying others-is a highly successful cultural mechanism for adaptation, outperforming individual information acquisition and experience. Here, we investigate social learning in the context of the uniquely human capacity for reflective, analytical reasoning. A hallmark of the human mind is its ability to engage analytical reasoning, and suppress false associative intuitions. Through a set of laboratory-based network experiments, we find that social learning fails to propagate this cognitive strategy. When people make false intuitive conclusions and are exposed to the analytic output of their peers, they recognize and adopt this correct output. But they fail to engage analytical reasoning in similar subsequent tasks. Thus, humans exhibit an 'unreflective copying bias', which limits their social learning to the output, rather than the process, of their peers' reasoning-even when doing so requires minimal effort and no technical skill. In contrast to much recent work on observation-based social learning, which emphasizes the propagation of successful behaviour through copying, our findings identify a limit on the power of social networks in situations that require analytical reasoning.

  17. Analytic posteriors for Pearson's correlation coefficient.

    Science.gov (United States)

    Ly, Alexander; Marsman, Maarten; Wagenmakers, Eric-Jan

    2018-02-01

    Pearson's correlation is one of the most common measures of linear dependence. Recently, Bernardo (11th International Workshop on Objective Bayes Methodology, 2015) introduced a flexible class of priors to study this measure in a Bayesian setting. For this large class of priors, we show that the (marginal) posterior for Pearson's correlation coefficient and all of the posterior moments are analytic. Our results are available in the open-source software package JASP.

  18. Analytic posteriors for Pearson's correlation coefficient

    OpenAIRE

    Ly, A.; Marsman, M.; Wagenmakers, E.-J.

    2018-01-01

    Pearson's correlation is one of the most common measures of linear dependence. Recently, Bernardo (11th International Workshop on Objective Bayes Methodology, 2015) introduced a flexible class of priors to study this measure in a Bayesian setting. For this large class of priors, we show that the (marginal) posterior for Pearson's correlation coefficient and all of the posterior moments are analytic. Our results are available in the open‐source software package JASP.

  19. Analytic properties of Feynman diagrams in quantum field theory

    CERN Document Server

    Todorov, I T

    1971-01-01

    Analytic Properties of Feynman Diagrams in Quantum Field Theory deals with quantum field theory, particularly with the study of the analytic properties of Feynman graphs. This book is an elementary presentation of a self-contained exposition of the majorization method used in the study of these graphs. The author has taken an intermediate position between Eden et al., who assume the physics of the analytic properties of the S-matrix, presenting physical ideas and test results without using the proper mathematical methods, and Hwa and Teplitz, whose works are more mathematically inclined with a

  20. Pre-analytical issues in the haemostasis laboratory: guidance for the clinical laboratories.

    Science.gov (United States)

    Magnette, A; Chatelain, M; Chatelain, B; Ten Cate, H; Mullier, F

    2016-01-01

    Ensuring quality has become a daily requirement in laboratories. In haemostasis, even more than in other disciplines of biology, quality is determined by a pre-analytical step that encompasses all procedures, starting with the formulation of the medical question, and includes patient preparation, sample collection, handling, transportation, processing, and storage until time of analysis. This step, based on a variety of manual activities, is the most vulnerable part of the total testing process, is a major component of the reliability and validity of results in haemostasis, and constitutes the most important source of erroneous or un-interpretable results. Pre-analytical errors may occur throughout the testing process and arise from unsuitable, inappropriate or wrongly handled procedures. Problems may arise during the collection of blood specimens such as misidentification of the sample, use of inadequate devices or needles, incorrect order of draw, prolonged tourniquet placing, unsuccessful attempts to locate the vein, incorrect use of additive tubes, collection of samples unsuitable in quality or quantity, inappropriate mixing of a sample, etc. Some factors can alter the result of a sample constituent after collection during transportation, preparation and storage. Laboratory errors can often have serious adverse consequences. Lack of standardized procedures for sample collection accounts for most of the errors encountered within the total testing process. They can also have clinical consequences as well as a significant impact on patient care, especially those related to specialized tests, as these are often considered "diagnostic". Controlling pre-analytical variables is critical since this has a direct influence on the quality of results and on their clinical reliability. The accurate standardization of the pre-analytical phase is of pivotal importance for achieving reliable results of coagulation tests and should reduce the side effects of the influence

  1. Pros and cons of analytical methods to quantify surrogate contaminants from the challenge test in recycled polyethylene terephthalate

    Energy Technology Data Exchange (ETDEWEB)

    Felix, Juliana S., E-mail: jfelix@unizar.es [Department of Analytical Chemistry, Aragon Institute of Engineering Research (I3A), CPS, University of Zaragoza, Torres Quevedo Bldg., Maria de Luna St. 3, E-50018 Zaragoza (Spain); Alfaro, Pilar, E-mail: palfarot@unizar.es [Department of Analytical Chemistry, Aragon Institute of Engineering Research (I3A), CPS, University of Zaragoza, Torres Quevedo Bldg., Maria de Luna St. 3, E-50018 Zaragoza (Spain); Nerin, Cristina, E-mail: cnerin@unizar.es [Department of Analytical Chemistry, Aragon Institute of Engineering Research (I3A), CPS, University of Zaragoza, Torres Quevedo Bldg., Maria de Luna St. 3, E-50018 Zaragoza (Spain)

    2011-02-14

    Different analytical methods were optimized and applied to quantify certain surrogate contaminants (toluene, chlorobenzene, phenol, limonene and benzophenone) in samples of contaminated and recycled flakes and virgin pellets of polyethylene terephthalate (PET) coming from the industrial challenge test. A screening analysis of the PET samples was carried out by direct solid-phase microextraction (SPME) in headspace mode (HS). The methods developed and used for quantitative analysis were a) total dissolution of PET samples in dichloroacetic acid and analysis by HS-SPME coupled to gas chromatography-mass spectrometry (GC-MS) and b) dichloromethane extraction and analysis by GC-MS. The concentration of all surrogates in the contaminated PET flakes analyzed by the HS-SPME method was lower than expected according to the information provided by the supplier. Dichloroacetic acid interacted with the surrogates, resulting in a tremendous decrease in limonene concentration. The degradation compounds from limonene were identified. Dichloromethane extraction and GC-MS analysis evidenced the highest values of analytes in these PET samples. Based on the foregoing data, the efficiency of the recycling process was evaluated, whereby the removal of 99.9% of the surrogates originating from the contaminated flakes was confirmed.

  2. Pros and cons of analytical methods to quantify surrogate contaminants from the challenge test in recycled polyethylene terephthalate

    International Nuclear Information System (INIS)

    Felix, Juliana S.; Alfaro, Pilar; Nerin, Cristina

    2011-01-01

    Different analytical methods were optimized and applied to quantify certain surrogate contaminants (toluene, chlorobenzene, phenol, limonene and benzophenone) in samples of contaminated and recycled flakes and virgin pellets of polyethylene terephthalate (PET) coming from the industrial challenge test. A screening analysis of the PET samples was carried out by direct solid-phase microextraction (SPME) in headspace mode (HS). The methods developed and used for quantitative analysis were a) total dissolution of PET samples in dichloroacetic acid and analysis by HS-SPME coupled to gas chromatography-mass spectrometry (GC-MS) and b) dichloromethane extraction and analysis by GC-MS. The concentration of all surrogates in the contaminated PET flakes analyzed by the HS-SPME method was lower than expected according to the information provided by the supplier. Dichloroacetic acid interacted with the surrogates, resulting in a tremendous decrease in limonene concentration. The degradation compounds from limonene were identified. Dichloromethane extraction and GC-MS analysis evidenced the highest values of analytes in these PET samples. Based on the foregoing data, the efficiency of the recycling process was evaluated, whereby the removal of 99.9% of the surrogates originating from the contaminated flakes was confirmed.

  3. Effects on fatigue life of gate valves due to higher torque switch settings during operability testing

    International Nuclear Information System (INIS)

    Richins, W.D.; Snow, S.D.; Miller, G.K.; Russell, M.J.; Ware, A.G.

    1995-12-01

    Some motor operated valves now have higher torque switch settings due to regulatory requirements to ensure valve operability with appropriate margins at design basis conditions. Verifying operability with these settings imposes higher stem loads during periodic inservice testing. These higher test loads increase stresses in the various valve internal parts, which may in turn increase the fatigue usage factors. This increased fatigue is judged to be a concern primarily in the valve disks, seats, yokes, stems, and stem nuts. Although the motor operators may also have significantly increased loading, they are being evaluated by the manufacturers and are beyond the scope of this study. Two gate valves representative of both relatively weak and strong valves commonly used in commercial nuclear applications were selected for fatigue analyses. Detailed dimensional and test data were available for both valves from previous studies at the Idaho National Engineering Laboratory. Finite element models were developed to estimate maximum stresses in the internal parts of the valves and to identify the critical areas within the valves where fatigue may be a concern. Loads were estimated using industry standard equations for calculating torque switch settings prior and subsequent to the testing requirements of USNRC Generic Letter 89-10. Test data were used to determine both (1) the overshoot load between torque switch trip and final seating of the disk during valve closing and (2) the stem thrust required to open the valves. The ranges of peak stresses thus determined were then used to estimate the increase in the fatigue usage factors due to the higher stem thrust loads. The usages that would be accumulated by 100 base cycles plus one or eight test cycles per year over 40 and 60 years of operation were calculated.
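
    For context, the fatigue usage factors discussed above are cumulative damage sums of the Miner's-rule type; a minimal sketch with illustrative cycle counts and allowable-cycle values, not the valves' actual data:

      def usage_factor(cycles, allowable):
          # Miner's rule: usage = sum over load ranges of (applied cycles / allowable cycles);
          # usage >= 1.0 indicates the fatigue design limit is reached.
          return sum(n / N for n, N in zip(cycles, allowable))

      # e.g. 100 base cycles plus 8 test cycles/year over 40 years (illustrative allowables):
      u = usage_factor([100, 8 * 40], [1e4, 5e3])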

  4. Versatile electrophoresis-based self-test platform.

    Science.gov (United States)

    Guijt, Rosanne M

    2015-03-01

    Lab on a Chip technology offers the possibility to extract chemical information from a complex sample in a simple, automated way without the need for a laboratory setting. In the health care sector, this chemical information could be used as a diagnostic tool, for example to inform dosing. In this issue, the research underpinning a family of electrophoresis-based point-of-care devices for self-testing of ionic analytes in various sample matrices is described [Electrophoresis 2015, 36, 712-721]. Hardware, software, and methodological changes made to improve the overall analytical performance in terms of accuracy, precision, detection limit, and reliability are discussed. In addition to the main focus of lithium monitoring, new applications are included, covering the use of the platform for veterinary purposes, for sodium, and for creatinine measurements. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Destructive analytical services to safeguards: Activity report 1984-1987

    International Nuclear Information System (INIS)

    Bagliano, G.

    1987-08-01

    The report gives an evaluation of the volume, delays and quality of destructive analytical services provided in 84/01/01-87/06/30 in support of Agency Safeguards. General observations are also made. Problems and improvements are identified and trends indicated. The main features of the period considered are as follows: a) About 4,100 inspection samples were received and analyzed in the period 84/01/01-87/06/30 by SAL and NWAL. The samples represented 20 different types of materials originating from different nuclear fuel cycles. b) 496 out of 895 spent fuel samples analyzed in this period were distributed to NWAL. c) The chemical and isotopic analyses of Uranium, Thorium and Plutonium (and occasionally Americium) requested by the Inspectors resulted in the report of a total of about 14,200 analytical results. d) The calibration of on-site measurements, the certification of Working Reference Materials for Destructive Analysis and the maintenance and improvement of Destructive Analysis required additional analytical services, which were provided. e) At present, compared to 1983, the total verification delays by DA have been shortened by 16%, 57% and 55% respectively for Uranium, Plutonium and spent fuel materials. f) Timely detection of abrupt diversion is achievable for the low enriched Uranium samples and is close to being achievable for the Plutonium and Spent Fuel Samples. g) Concepts, Procedures, and Materials for Analytical Quality Control Programmes of the measurements performed at SAL and NWAL were prepared. The analyses performed can be adequately accurate. h) Problems continue, however, to be encountered in the quality of the overall DA verification system. i) Major upgrading of equipment and procedures was undertaken at SAL. j) Other selective chemical assays were being tested and Isotope Dilution Mass Spec assays have been successfully set up for the analysis of 3 mg-sized Plutonium product samples. k) Close contacts have been kept with NWAL via Consultants

  6. Strain accumulation in a prototypic LMFBR nozzle: Experimental and analytical correlation

    International Nuclear Information System (INIS)

    Woodward, W.S.; Dhalia, A.K.; Berton, P.A.

    1986-01-01

    At an early stage in the design of the primary inlet nozzle for the Intermediate Heat Exchanger (IHX) of the Fast Flux Test Facility (FFTF), it was predicted that the inelastic strain accumulation during elevated temperature operation (1050°F/566°C) would exceed the ASME Code design allowables. Therefore, a proof test of a prototypic FFTF IHX nozzle was performed in the Westinghouse Creep Ratcheting Test Facility (CRTF) to measure the ratchet strain increments during the most severe postulated FFTF plant thermal transients. In addition, analytical procedures similar to those used in the plant design were used to predict strain accumulation in the CRTF nozzle. This paper describes how the proof test was successfully completed, and it shows that both the test measurements and analytical predictions confirm that the FFTF IHX nozzle, subjected to postulated thermal and mechanical loadings, complies with the ASME Code strain limits. Also, these results provide a measure of validation for the analytical procedures used in the design of FFTF as well as demonstrate the structural adequacy of the FFTF IHX primary inlet nozzle

  7. On analytical justification of phase synchronization in different chaotic systems

    International Nuclear Information System (INIS)

    Erjaee, G.H.

    2009-01-01

    In analytical and numerical studies of the synchronization of coupled chaotic systems, phase synchronization has received less attention in the leading literature. This article is an attempt to find a sufficient analytical condition for the stability of phase synchronization in some coupled chaotic systems. The method of nonlinear feedback functions and the scheme of matrix measure have been used to justify this analytical stability, and tested numerically for the existence of phase synchronization in some coupled chaotic systems.

  8. The introduction of syphilis point of care tests in resource limited settings.

    Science.gov (United States)

    Marks, Michael; Mabey, David Cw

    2017-04-01

    Syphilis remains an important and preventable cause of stillbirth and neonatal mortality. About 1 million women with active syphilis become pregnant each year. Without treatment, 25% of them will deliver a stillborn baby and 33% a low birth weight baby with an increased chance of dying in the first month of life. Adverse pregnancy outcomes due to syphilis can be prevented by screening pregnant women, and treating those who test positive with a single dose of penicillin before 28 weeks' gestation. Areas covered: This manuscript covers the impact of syphilis on pregnancy outcome, the diagnosis of syphilis, with a special focus on point of care (POC) tests, and challenges to the introduction of POC tests, and their potential impact on the control and prevention of syphilis in resource limited settings. Expert commentary: POC tests for syphilis are available which meet the ASSURED criteria, and could make syphilis screening accessible to all women anywhere in the world who attend an antenatal clinic. High quality dual POC tests for HIV and syphilis could ensure that well-funded programmes for the prevention of mother to child transmission of HIV can contribute towards increased coverage of antenatal syphilis screening, and prevent more than 300,000 adverse pregnancy outcomes due to syphilis annually. Alongside investment to increase availability of syphilis POC tests, operational research is needed to understand how best to improve screening of pregnant women and to translate test availability into improved pregnancy outcomes.

  9. Effect of Caffeine on Attention and Alertness Measured in a Home-Setting, Using Web-Based Cognition Tests.

    Science.gov (United States)

    Pasman, Wilrike J; Boessen, Ruud; Donner, Yoni; Clabbers, Nard; Boorsma, André

    2017-09-07

    There is an increasing interest among nutritional researchers in performing lifestyle and nutritional intervention studies in a home setting instead of testing subjects in a clinical unit. The term used in other disciplines is 'ecological validity', stressing a realistic situation. This becomes more and more feasible because devices and self-tests that enable such studies are more commonly available. Here, we present such a study in which we reproduced the effect of caffeine on attention and alertness in an at-home setting. The study aimed to reproduce the effect of caffeine on attention and alertness using a Web-based study environment in which subjects, at home, performed different Web-based cognition tests. The study was designed as a randomized, placebo-controlled, double-blind, crossover study. Subjects were provided with coffee sachets (2 with and 2 without caffeine). They were also provided with written instructions for the test days. Healthy volunteers consumed a cup of coffee after an overnight fast. Each intervention was repeated once. Before and 1 hour after coffee consumption, subjects performed Web-based cognitive performance tests at home, which measured alertness and attention, established by 3 computerized tests provided by QuantifiedMind. Each test was performed for 5 minutes. Web-based recruitment was fast and efficient. Within 2 weeks, 102 subjects applied, of whom 70 were eligible. Of the 66 subjects who started the study, 53 completed all 4 test sessions (80%), indicating that they were able to perform the do-it-yourself tests, at home, correctly. The Go-No Go cognition test performed at home showed the same significant improvement in reaction time with caffeine as found in controlled studies in a metabolic ward (P=.02). For coding and N-back the second block was performed approximately 10% faster. No effect was seen on correctness. The study showed that the effects of caffeine consumption on a cognition test in an at-home setting revealed similar

  10. Process and results of analytical framework and typology development for POINT

    DEFF Research Database (Denmark)

    Gudmundsson, Henrik; Lehtonen, Markku; Bauler, Tom

    2009-01-01

    POINT is a project about how indicators are used in practice; to what extent and in what way indicators actually influence, support, or hinder policy and decision making processes, and what could be done to enhance the positive role of indicators in such processes. The project needs an analytical framework..., a set of core concepts and associated typologies, a series of proposed analytic schemes, and a number of research propositions and questions for the subsequent empirical work in POINT.

  11. Benchmark Tests to Develop Analytical Time-Temperature Limit for HANA-6 Cladding for Compliance with New LOCA Criteria

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Sung Yong; Jang, Hun; Lim, Jea Young; Kim, Dae Il; Kim, Yoon Ho; Mok, Yong Kyoon [KEPCO Nuclear Fuel Co. Ltd., Daejeon (Korea, Republic of)

    2016-10-15

    According to 10CFR50.46c, two analytical time and temperature limits, for breakaway oxidation and post-quench ductility (PQD), should be determined by an approved experimental procedure as described in NRC Regulatory Guides (RG) 1.222 and 1.223. According to RG 1.222 and 1.223, rigorous qualification requirements apply to the test system, such as thermal and weight-gain benchmarks. In order to meet these requirements, KEPCO NF has developed a new special facility to evaluate the LOCA performance of zirconium alloy cladding. In this paper, the qualification results for the test facility and the HT oxidation model for HANA-6 are summarized. The results of the thermal benchmark tests of the LOCA HT oxidation tester are summarized as follows. 1. The best-estimate HT oxidation model of HANA-6 was developed as the vendor proprietary HT oxidation model. 2. In accordance with RG 1.222 and 1.223, benchmark tests were performed using the LOCA HT oxidation tester. 3. The maximum axial and circumferential temperature differences are ±9°C and ±2°C at 1200°C, respectively; at the other temperature conditions, the temperature differences are smaller than those at 1200°C. The thermal benchmark test results meet the requirements of NRC RG 1.222 and 1.223.

  12. An experimental set-up to test heat-moisture exchangers

    NARCIS (Netherlands)

    N. Ünal (N.); J.C. Pompe (Jan); W.P. Holland (Wim); I. Gultuna; P.E.M. Huygen; K. Jabaaij (K.); C. Ince (Can); B. Saygin (B.); H.A. Bruining (Hajo)

    1995-01-01

    Objectives: The purpose of this study was to build an experimental set-up to assess continuously the humidification, heating and resistance properties of heat-moisture exchangers (HMEs) under clinical conditions. Design: The experimental set-up consists of a patient model, measurement

  13. Functionality of empirical model-based predictive analytics for the early detection of hemodynamic instabilty.

    Science.gov (United States)

    Summers, Richard L; Pipke, Matt; Wegerich, Stephan; Conkright, Gary; Isom, Kristen C

    2014-01-01

    Background. Monitoring cardiovascular hemodynamics in the modern clinical setting is a major challenge. Increasing amounts of physiologic data must be analyzed and interpreted in the context of the individual patient’s pathology and inherent biologic variability. Certain data-driven analytical methods are currently being explored for smart monitoring of data streams from patients as a first tier automated detection system for clinical deterioration. As a prelude to human clinical trials, an empirical multivariate machine learning method called Similarity-Based Modeling (“SBM”) was tested in an In Silico experiment using data generated with the aid of a detailed computer simulator of human physiology (Quantitative Circulatory Physiology or “QCP”) which contains complex control systems with realistic integrated feedback loops. Methods. SBM is a kernel-based, multivariate machine learning method that uses monitored clinical information to generate an empirical model of a patient’s physiologic state. This platform allows for the use of predictive analytic techniques to identify early changes in a patient’s condition that are indicative of a state of deterioration or instability. The integrity of the technique was tested through an In Silico experiment using QCP in which the output of computer simulations of a slowly evolving cardiac tamponade resulted in a progressive state of cardiovascular decompensation. Simulator outputs for the variables under consideration were generated at a 2-min data rate (0.083 Hz) with the tamponade introduced at a point 420 minutes into the simulation sequence. The functionality of the SBM predictive analytics methodology to identify clinical deterioration was compared to the thresholds used by conventional monitoring methods. Results. The SBM modeling method was found to closely track the normal physiologic variation as simulated by QCP. With the slow development of the tamponade, the SBM model is seen to disagree while the
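
    A minimal sketch of a kernel similarity-based estimate in the spirit of SBM: the expected state is a similarity-weighted blend of stored "normal" exemplars, and the residual drives the alert logic. The Gaussian kernel and its width are assumptions, not the proprietary SBM formulation:

      import numpy as np

      def sbm_estimate(x, memory, h=1.0):
          # x: current observation, shape (d,); memory: exemplar matrix of
          # known-normal states, shape (n, d); h: kernel width (assumed).
          d2 = np.sum((memory - x) ** 2, axis=1)
          w = np.exp(-d2 / (2 * h ** 2))            # Gaussian similarity to each exemplar
          w /= w.sum() + 1e-300                     # normalize (guard against underflow)
          x_hat = w @ memory                        # expected "normal" state
          return x_hat, x - x_hat                   # estimate and residual for alerting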

  14. Applications of nuclear analytical techniques to environmental studies

    International Nuclear Information System (INIS)

    Freitas, M.C.; Marques, A.P.; Reis, M.A.; Pacheco, A.M.G.; Barros, L.I.C.

    2001-01-01

    A few examples of application of nuclear-analytical techniques to biological monitors - natives and transplants - are given herein. Parmelia sulcata Taylor transplants were set up in a heavily industrialized area of Portugal - the Setubal peninsula, about 50 km south of Lisbon - where indigenous lichens are rare. The whole area was 10×15 km around an oil-fired power station, and a 2.5×2.5 km grid was used. In north-western Portugal, native thalli of the same epiphytes (Parmelia spp., mostly Parmelia sulcata Taylor) and bark from olive trees (Olea europaea) were sampled across an area of 50×50 km, using a 10×10 km grid. This area is densely populated and features a blend of rural, urban-industrial and coastal environments, together with the country's second-largest metro area (Porto). All biomonitors have been analyzed by INAA and PIXE. Results were put through nonparametric tests and factor analysis for trend significance and emission sources, respectively

  15. Sulfonylurea herbicides – methodological challenges in setting aquatic limit values

    DEFF Research Database (Denmark)

    Rosenkrantz, Rikke Tjørnhøj; Baun, Anders; Kusk, Kresten Ole

    according to the EU Water Framework Directive, the resulting Water Quality Standards (WQSs) are below the analytical quantification limit, making it difficult to verify compliance with the limit values. However, several methodological concerns may be raised in relation to the very low effect concentrations...... and rimsulfuron. The following parameters were varied during testing: pH, exposure duration, temperature and light/dark cycle. Preliminary results show that a decrease in pH causes an increase in toxicity for all compounds. Exposure to a high concentration for 24 hours caused a reduction in growth rate, from...... for setting limit values for SUs or if more detailed information should be gained by taking methodological considerations into account....

  16. Model-based Engineering for the Integration of Manufacturing Systems with Advanced Analytics

    OpenAIRE

    Lechevalier , David; Narayanan , Anantha; Rachuri , Sudarsan; Foufou , Sebti; Lee , Y Tina

    2016-01-01

    To employ data analytics effectively and efficiently on manufacturing systems, engineers and data scientists need to collaborate closely to bring their domain knowledge together. In this paper, we introduce a domain-specific modeling approach to integrate a manufacturing system model with advanced analytics, in particular neural networks, to model predictions. Our approach combines a set of meta-models and transformatio...

  17. On the analytic continuation of functions defined by Legendre series

    International Nuclear Information System (INIS)

    Grinstein, F.F.

    1981-07-01

    An infinite diagonal sequence of punctual Padé approximants is considered for the approximate analytical continuation of a function defined by a formal Legendre series. The technique is tested in the case of two series with exactly known analytical sums: the generating function for Legendre polynomials and the Coulombian scattering amplitude. (author)
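
    The abstract's theme, analytic continuation of a series beyond its convergence region via diagonal Padé approximants, can be illustrated for an ordinary power series; the paper's punctual Padé approximants act on Legendre series, which this sketch does not reproduce:

      import numpy as np
      from scipy.interpolate import pade

      # Taylor coefficients of log(1+x): 0, 1, -1/2, 1/3, ... (radius of convergence 1)
      an = [0.0] + [(-1) ** (k + 1) / k for k in range(1, 8)]
      p, q = pade(an, 3)                 # [4/3] Pade approximant (numerator p, denominator q)

      x = 1.5                            # outside the series' disc of convergence
      print(p(x) / q(x), np.log(1 + x))  # the approximant continues the function analytically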

  18. An analytical model for enantioseparation process in capillary electrophoresis

    Science.gov (United States)

    Ranzuglia, G. A.; Manzi, S. J.; Gomez, M. R.; Belardinelli, R. E.; Pereyra, V. D.

    2017-12-01

    An analytical model to explain the mobilities of an enantiomeric binary mixture in capillary electrophoresis experiments is proposed. The model consists of a set of kinetic equations describing the evolution of the populations of molecules involved in the enantioseparation process in capillary electrophoresis (CE). These equations take into account the asymmetric driven migration of the enantiomer molecules, the chiral selector, and the transient diastereomeric complexes, which are the products of the reversible reaction between the enantiomers and the chiral selector. The solution of these equations gives the spatial and temporal distribution of each species in the capillary, reproducing a typical electropherogram signal. The mobility, μ, of each species is obtained from the position of the maximum (main peak) of its distribution. Thereby, the apparent electrophoretic mobility difference, Δμ, as a function of chiral selector concentration, [ C ] , can be measured. The behaviour of Δμ versus [ C ] is compared with the phenomenological model introduced by Wren and Rowe in J. Chromatography 1992, 603, 235. To test the analytical model, a capillary electrophoresis experiment for the enantiomeric separation of the (±)-chlorpheniramine β-cyclodextrin (β-CD) system is used. These data, as well as others obtained from the literature, are in close agreement with those obtained by the model. All these results are also corroborated by kinetic Monte Carlo simulations.
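
    The Wren-Rowe phenomenological model cited above gives the apparent mobility of an enantiomer undergoing fast reversible complexation with the selector at concentration c; a direct transcription, with the mobilities and binding constants left as caller-supplied (illustrative) parameters:

      def apparent_mobility(c, mu_free, mu_complex, K):
          # Wren-Rowe: population-weighted average of free and complexed mobilities.
          return (mu_free + K * c * mu_complex) / (1 + K * c)

      def delta_mu(c, mu_free, mu_complex, K1, K2):
          # Mobility difference between the two enantiomers, which differ only
          # in their binding constants K1 and K2 to the chiral selector.
          return (apparent_mobility(c, mu_free, mu_complex, K1)
                  - apparent_mobility(c, mu_free, mu_complex, K2))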

  19. Transport and traffic analytics in smart cities

    OpenAIRE

    Semanjski, Ivana

    2016-01-01

    Vast generation of high-resolution spatial and temporal data, particularly in urban settings, has started a revolution in mobility- and human-behavior-related research. However, after the initial wave of data-oriented insights, their integration into ongoing, traditionally used planning and decision-making processes seems to be hindered by still-open challenges. These challenges suggest a need for stronger integration between data analytics and dedicated domain knowledg...

  20. Development of analytical and numerical models for the assessment and interpretation of hydrogeological field tests

    Energy Technology Data Exchange (ETDEWEB)

    Mironenko, V.A.; Rumynin, V.G.; Konosavsky, P.K. [St. Petersburg Mining Inst. (Russian Federation); Pozdniakov, S.P.; Shestakov, V.M. [Moscow State Univ. (Russian Federation); Roshal, A.A. [Geosoft-Eastlink, Moscow (Russian Federation)

    1994-07-01

    Mathematical models of the flow and tracer tests in fractured aquifers are being developed for the further study of radioactive waste migration in ground water at the Lake Area, which is associated with one of the waste disposal sites in Russia. The choice of testing methods, tracer types (chemical or thermal) and the appropriate models are determined by the nature of the ongoing ground-water pollution processes and the hydrogeological features of the site under consideration. Special importance is attached to the increased density of wastes as well as to the possible redistribution of solutes both in the liquid phase and in the absorbed state (largely, on fracture surfaces). This allows for studying physical-and-chemical (hydrogeochemical) interaction parameters which are hard to obtain (considering the fractured structure of the rock mass) in the laboratory. Moreover, a theoretical substantiation is being given to the field methods of studying the properties of a fractured stratum aimed at the further construction of the drainage system or the subsurface flow barrier (cutoff wall), as well as the monitoring system that will evaluate the reliability of these ground-water protection measures. The proposed mathematical models are based on a tight combination of analytical and numerical methods, the former being preferred in solving the principal (2D axisymmetrical) class of the problems. The choice of appropriate problems is based on the close feedback with subsequent field tests in the Lake Area. 63 refs.

  1. Development of analytical and numerical models for the assessment and interpretation of hydrogeological field tests

    International Nuclear Information System (INIS)

    Mironenko, V.A.; Rumynin, V.G.; Konosavsky, P.K.; Pozdniakov, S.P.; Shestakov, V.M.; Roshal, A.A.

    1994-07-01

    Mathematical models of the flow and tracer tests in fractured aquifers are being developed for the further study of radioactive waste migration in ground water at the Lake Area, which is associated with one of the waste disposal sites in Russia. The choice of testing methods, tracer types (chemical or thermal) and the appropriate models are determined by the nature of the ongoing ground-water pollution processes and the hydrogeological features of the site under consideration. Special importance is attached to the increased density of wastes as well as to the possible redistribution of solutes both in the liquid phase and in the absorbed state (largely, on fracture surfaces). This allows for studying physical-and-chemical (hydrogeochemical) interaction parameters which are hard to obtain (considering the fractured structure of the rock mass) in the laboratory. Moreover, a theoretical substantiation is being given to the field methods of studying the properties of a fractured stratum aimed at the further construction of the drainage system or the subsurface flow barrier (cutoff wall), as well as the monitoring system that will evaluate the reliability of these ground-water protection measures. The proposed mathematical models are based on a tight combination of analytical and numerical methods, the former being preferred in solving the principal (2D axisymmetrical) class of the problems. The choice of appropriate problems is based on the close feedback with subsequent field tests in the Lake Area. 63 refs

  2. Pilot Integration of HIV Screening and Healthcare Settings with Multi-Component Social Network and Partner Testing for HIV Detection.

    Science.gov (United States)

    Rentz, Michael F; Ruffner, Andrew H; Ancona, Rachel M; Hart, Kimberly W; Kues, John R; Barczak, Christopher M; Lindsell, Christopher J; Fichtenbaum, Carl J; Lyons, Michael S

    2017-11-23

    Healthcare settings screen broadly for HIV. Public health settings use social network and partner testing ("Transmission Network Targeting (TNT)") to select high-risk individuals based on their contacts. HIV screening and TNT systems are not integrated, and healthcare settings have not implemented TNT. The study aimed to evaluate pilot implementation of multi-component, multi-venue TNT in conjunction with HIV screening by a healthcare setting. Our urban, academic health center implemented a TNT program in collaboration with the local health department for five months during 2011. High-risk or HIV positive patients of the infectious diseases clinic and emergency department HIV screening program were recruited to access social and partner networks via compensated peer-referral, testing of companions present with them, and partner notification services. Contacts became the next-generation index cases in a snowball recruitment strategy. The pilot TNT program yielded 485 HIV tests for 482 individuals through eight generations of recruitment with five (1.0%; 95% CI = 0.4%, 2.3%) new diagnoses. Of these, 246 (51.0%; 95% CI = 46.6%, 55.5%) reported that they had not been tested for HIV within the last 12 months and 383 (79.5%; 95% CI = 75.7%, 82.9%) had not been tested by the existing ED screening program within the last five years. TNT complements population screening by more directly targeting high-risk individuals and by expanding the population receiving testing. Information from existing healthcare services could be used to seed TNT programs, or TNT could be implemented within healthcare settings. Research evaluating multi-component, multi-venue HIV detection is necessary to maximize complementary approaches while minimizing redundancy. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
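    The proportions and intervals quoted above (e.g., 5 new diagnoses among 482 individuals) can be reproduced approximately with a standard score interval; the authors' exact CI method is not stated, so the Python sketch below is only a plausible reconstruction.

        # Approximate reconstruction of the reported proportion and 95% CI;
        # the study's exact interval method is an assumption here.
        from statsmodels.stats.proportion import proportion_confint

        count, nobs = 5, 482
        low, high = proportion_confint(count, nobs, alpha=0.05, method="wilson")
        print(f"{count / nobs:.1%} (95% CI = {low:.1%}, {high:.1%})")

    A Wilson interval gives roughly (0.4%, 2.4%), close to the reported (0.4%, 2.3%); an exact or Jeffreys interval shifts the upper bound slightly.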

  3. Streaming Visual Analytics Workshop Report

    Energy Technology Data Exchange (ETDEWEB)

    Cook, Kristin A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Burtner, Edwin R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kritzstein, Brian P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Brisbois, Brooke R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Mitson, Anna E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-03-31

    How can we best enable users to understand complex emerging events and make appropriate assessments from streaming data? This was the central question addressed at a three-day workshop on streaming visual analytics. This workshop was organized by Pacific Northwest National Laboratory for a government sponsor. It brought together forty researchers and subject matter experts from government, industry, and academia. This report summarizes the outcomes from that workshop. It describes elements of the vision for a streaming visual analytic environment and a set of important research directions needed to achieve this vision. Streaming data analysis is in many ways the analysis and understanding of change. However, current visual analytics systems usually focus on static data collections, meaning that dynamically changing conditions are not appropriately addressed. The envisioned mixed-initiative streaming visual analytics environment creates a collaboration between the analyst and the system to support the analysis process. It raises the level of discourse from low-level data records to higher-level concepts. The system supports the analyst's rapid orientation and reorientation as situations change. It provides an environment to support the analyst's critical thinking. It infers tasks and interests based on the analyst's interactions. The system works as both an assistant and a devil's advocate, finding relevant data and alerts as well as considering alternative hypotheses. Finally, the system supports sharing of findings with others. Making such an environment a reality requires research in several areas. The workshop discussions focused on four broad areas: support for critical thinking, visual representation of change, mixed-initiative analysis, and the use of narratives for analysis and communication.

  4. Manual of selected physico-chemical analytical methods. IV

    International Nuclear Information System (INIS)

    Beran, M.; Klosova, E.; Krtil, J.; Sus, F.; Kuvik, V.; Vrbova, L.; Hamplova, M.; Lengyel, J.; Kelnar, L.; Zakouril, K.

    1990-11-01

    The Central Testing Laboratory of the Nuclear Research Institute at Rez has for a decade been participating in the development of analytical procedures and has been providing analyses of samples of different types and origin. The analytical procedures developed have been published in special journals and a number of them in the Manuals of analytical methods, in three parts. The 4th part of the Manual contains selected physico-chemical methods developed or modified by the Laboratory in the years 1986-1990 within the project "Development of physico-chemical analytical methods". In most cases, techniques are involved for non-nuclear applications. Some can find wider applications, especially in analyses of environmental samples. Others have been developed for specific cases of sample analyses or require special instrumentation (mass spectrometer), which partly restricts their applicability by other institutions. (author)

  5. Analytical study in 1D nuclear waste migration

    International Nuclear Information System (INIS)

    Perez Guerrero, Jesus S.; Heilbron Filho, Paulo L.; Romani, Zrinka V.

    1999-01-01

    The simulation of nuclear waste migration phenomena is governed mainly by a diffusive-convective equation that includes the effects of hydrodynamic dispersion (mechanical dispersion and molecular diffusion), radioactive decay and chemical interaction. For some special problems (depending on the boundary conditions and when the domain is considered infinite or semi-infinite), an analytical solution may be obtained using classical analytical methods such as the Laplace transform or separation of variables. The hybrid Generalized Integral Transform Technique (GITT) is a powerful tool that can be applied to solve linear diffusive-convective problems and obtain formal analytical solutions. The aim of this work is to illustrate that the GITT may be used to obtain a formal analytical solution for the study of migration of radioactive waste in saturated flow through porous media. A test case considering the 241Am radionuclide is presented. (author)
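    The GITT solution itself is not reproduced here, but a classic closed-form benchmark for the same governing equation (1D advection-dispersion with first-order decay in a semi-infinite column, constant inlet concentration) is easy to state and code; the transport parameters below are illustrative assumptions.

        # Van Genuchten & Alves-type closed-form solution for
        # dC/dt = D d2C/dx2 - v dC/dx - lam*C,  with C(0,t) = C0.
        import numpy as np
        from scipy.special import erfc

        def c_over_c0(x, t, v, D, lam):
            u = np.sqrt(v * v + 4.0 * lam * D)
            a = np.exp((v - u) * x / (2.0 * D)) * erfc((x - u * t) / (2.0 * np.sqrt(D * t)))
            b = np.exp((v + u) * x / (2.0 * D)) * erfc((x + u * t) / (2.0 * np.sqrt(D * t)))
            return 0.5 * (a + b)

        lam = np.log(2.0) / (432.2 * 365.25 * 86400.0)  # 241Am decay constant (1/s)
        # Illustrative transport values: x in m, t in s, v in m/s, D in m^2/s.
        print(c_over_c0(x=1.0, t=3.15e7, v=1e-7, D=1e-9, lam=lam))

    Because the 241Am half-life (432 y) is long compared with the one-year travel time assumed here, the printed relative concentration is close to 1; decay only matters at much larger time scales.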

  6. TEST BEAM COORDINATION: The 2004 Test Beam Calorimetry set-up in H8

    CERN Multimedia

    Aleksa, M; Di Girolamo, B; Ferrari, C; Giugni, D; Santoni, C; Wingerter, I

    A new table has been designed, built and finally mounted to position the LAr cryostat in front of the Tilecal modules. The new table has been connected to the existing Tilecal table to be able to move the full set-up along eta values between 0 and 1.2. The table was conceived by D. Giugni (INFN Milano and now CERN PH) and modeled by G. Braga (INFN Milano) in spring-summer 2003. The realization of the table was done by an Italian firm (MatecImpianti, Fenegrò, Como) under the supervision of S. Coelli (INFN Milano) starting August 2003. Figure 1 shows the table assembled at the firm (left). Figure 1: The Tilecal-LAr table: in Fenegrò (left) and at CERN (right). In November 2003 the table was delivered to CERN and put in temporary storage, to be assembled after the preparation of the Tilecal zone. In February 2004 two technicians from the firm and the team of technicians coordinated by C. Ferrari (CERN AB/ATB) assembled, tested and commissioned the table under the supervision of S. Coelli...

  7. Social Context of First Birth Timing in a Rapidly Changing Rural Setting

    Science.gov (United States)

    Ghimire, Dirgha J.

    2016-01-01

    This article examines the influence of social context on the rate of first birth. Drawing on socialization models, I develop a theoretical framework to explain how different aspects of social context (i.e., neighbors), may affect the rate of first birth. Neighbors, who in the study setting comprise individuals’ immediate social context, have an important influence on the rate of first birth. To test my hypotheses, I leverage a setting, measures and analytical techniques designed to study the impact of macro-level social contexts on micro-level individual behavior. The results show that neighbors’ age at first birth, travel to the capital city and media exposure tend to reduce the first birth rate, while neighbors’ non-family work experience increases first birth rate. These effects are independent of neighborhood characteristics and are robust against several key variations in model specifications. PMID:27886737

  8. How Dispositional Learning Analytics helps understanding the worked-example principle

    NARCIS (Netherlands)

    Tempelaar, Dirk; Sampson, Demetrios G.; Spector, J. Michael; Ifenthaler, Dirk; Isaías, Pedro

    2017-01-01

    This empirical study aims to demonstrate how Dispositional Learning Analytics can contribute in the investigation of the effectiveness of didactical scenarios in authentic settings, where previous research has mostly been laboratory based. Using a showcase based on learning processes of 1080

  9. Simplified analytical modeling of the normal hole erosion test; Modelado analítico simplificado del ensayo normal de erosión de tubo

    Energy Technology Data Exchange (ETDEWEB)

    Khamlichi, A.; Bezzazi, M.; El Bakkali, L.; Jabbouri, A.; Kissi, B.; Yakhlef, F.; Parron Vera, M. A.; Rubio Cintas, M. D.; Castillo Lopez, O.

    2009-07-01

    The hole erosion test was developed to study the erosion phenomenon that occurs in cracks appearing in hydraulic infrastructures such as dams. The test characterizes the erosive behaviour of soils experimentally by means of an index called the erosion rate and a critical shear stress that marks the threshold of surface erosion initiation. The objective of this work is to give a model of this experiment by means of a simplified analytical approach. The erosion law is derived by taking the flow regime into account. This law shows that the erosion occurring in the tube is governed by first-order dynamics in which only two parameters are involved: the characteristic time linked to the erosion rate and the shear stress threshold at which erosion begins to develop. (Author) 5 refs.
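    A minimal sketch of those first-order dynamics, assuming pressure-driven pipe flow so that the wall shear stress grows with the hole radius, τ(r) = ρ_w g J r / 2, and the usual erosion law dr/dt = (Ce/ρ_d)(τ − τ_c); every parameter value is an illustrative assumption, not a fitted value from the paper.

        # Hole erosion test dynamics: two parameters (erosion coefficient and
        # critical shear stress) control the first-order radius growth.
        import numpy as np

        rho_w, g, J = 1000.0, 9.81, 1.0   # water density, gravity, hydraulic gradient (assumed)
        rho_d = 1800.0                    # dry density of the soil (assumed)
        Ce, tau_c = 1e-4, 10.0            # erosion coefficient (s/m), critical shear (Pa), assumed
        r0, dt, n = 3e-3, 1.0, 3600       # initial radius (m), time step (s), steps

        r = np.empty(n); r[0] = r0
        for i in range(1, n):
            tau = rho_w * g * J * r[i - 1] / 2.0        # shear stress at current radius
            r[i] = r[i - 1] + dt * (Ce / rho_d) * max(tau - tau_c, 0.0)

        print(f"hole radius after {n * dt:.0f} s: {r[-1] * 1e3:.2f} mm")

    Because τ increases linearly with r, the linearized equation grows exponentially once τ exceeds τ_c, with characteristic time 2ρ_d/(Ce ρ_w g J); this is the "first-order dynamics" the abstract refers to.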

  10. Application of an analytical method for solution of thermal hydraulic conservation equations

    Energy Technology Data Exchange (ETDEWEB)

    Fakory, M.R. [Simulation, Systems & Services Technologies Company (S3 Technologies), Columbia, MD (United States)

    1995-09-01

    An analytical method has been developed and applied for solution of two-phase flow conservation equations. The test results for application of the model for simulation of BWR transients are presented and compared with the results obtained from application of the explicit method for integration of conservation equations. The test results show that with application of the analytical method for integration of conservation equations, the Courant limitation associated with explicit Euler method of integration was eliminated. The results obtained from application of the analytical method (with large time steps) agreed well with the results obtained from application of explicit method of integration (with time steps smaller than the size imposed by Courant limitation). The results demonstrate that application of the analytical approach significantly improves the numerical stability and computational efficiency.
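    The point about the Courant limit can be illustrated on a linear test equation (this is a generic sketch, not the S3 two-phase model): explicit Euler is stable only for sufficiently small steps, while an analytical (exact) update has no step-size restriction.

        # Explicit Euler vs analytical integration of dy/dt = -lam * y.
        # Stability of explicit Euler requires dt < 2 / lam.
        import numpy as np

        lam, y0, t_end = 50.0, 1.0, 1.0
        for dt in (0.01, 0.05):                    # 0.05 violates dt < 2/lam = 0.04
            n = int(round(t_end / dt))
            y_euler, y_exact = y0, y0
            for _ in range(n):
                y_euler += dt * (-lam * y_euler)   # explicit step
                y_exact *= np.exp(-lam * dt)       # analytical step
            print(f"dt={dt}: euler={y_euler:.3e}, analytical={y_exact:.3e}")

    With dt = 0.05 the explicit iterate oscillates and diverges while the analytical update stays bounded, mirroring the abstract's observation that large time steps become usable once the integration is done analytically.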

  11. THE QuEChERS ANALYTICAL METHOD COMBINED WITH LOW ...

    African Journals Online (AJOL)

    The method has also been applied to different cereal samples and satisfactory average recoveries ... Analysis of multiclass pesticide residues in foods is a challenging task because of the ... compounds set by regulatory bodies. ..... analytes were used to evaluate the influences of the selected factors on performance of the.

  12. Numerical and analytical investigation of steel beam subjected to four-point bending

    Science.gov (United States)

    Farida, F. M.; Surahman, A.; Sofwan, A.

    2018-03-01

    One type of bending test is the four-point bending test. The aim of this test is to investigate the properties and behavior of materials in structural applications. This study combines numerical and analytical approaches; results from both help to improve the experimental work. The purpose of the study is to predict the behavior of a steel beam subjected to a four-point bending test, analyzing the beam in flexure prior to experimental work. The main results are the locations of the strain gauges and the LVDT on the steel beam, based on the numerical study, manual calculation, and an analytical study using the linear elasticity theory of solid objects. The strain gauges are located between the two concentrated loads, on the top and bottom of the beam; the LVDT is located between the two concentrated loads.
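    For reference, the governing textbook formulas (linear elasticity, simply supported beam, two equal loads F each a distance a from a support over span L) can be evaluated directly; the numerical values below are illustrative assumptions, not the specimen of the study.

        # Four-point bending: constant bending moment (zero shear) between the
        # two loads, which is why the gauges are placed there.
        F = 50e3               # each point load (N), assumed
        L, a = 3.0, 1.0        # span and load offset from supports (m), assumed
        E, I = 200e9, 8.0e-5   # Young's modulus (Pa), second moment of area (m^4), assumed
        c = 0.15               # distance from neutral axis to extreme fibre (m), assumed

        M_max = F * a                                             # moment between the loads
        sigma_max = M_max * c / I                                 # extreme-fibre stress
        delta_mid = F * a * (3 * L**2 - 4 * a**2) / (24 * E * I)  # midspan deflection

        print(f"M_max = {M_max / 1e3:.1f} kN.m, sigma_max = {sigma_max / 1e6:.1f} MPa, "
              f"delta_mid = {delta_mid * 1e3:.2f} mm")

    The region between the loads carries constant moment and zero shear, which is why the strain gauges sit there and the LVDT measures the midspan deflection.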

  13. Construction of analytically solvable models for interacting species. [biological species competition

    Science.gov (United States)

    Rosen, G.

    1976-01-01

    The basic form of a model representation for systems of n interacting biological species is a set of essentially nonlinear autonomous ordinary differential equations. A generic canonical expression for the rate functions in the equations is reported which permits the analytical general solution to be obtained by elementary computation. It is shown that a general analytical solution is directly obtainable for models where the rate functions are prescribed by the generic canonical expression from the outset. Some illustrative examples are given which demonstrate that the generic canonical expression can be used to construct analytically solvable models for two interacting species with limit-cycle dynamics as well as for a three-species interdependence.

  14. Analytical chemistry instrumentation

    International Nuclear Information System (INIS)

    Laing, W.R.

    1986-01-01

    In nine sections, 48 chapters cover 1) analytical chemistry and the environment 2) environmental radiochemistry 3) automated instrumentation 4) advances in analytical mass spectrometry 5) fourier transform spectroscopy 6) analytical chemistry of plutonium 7) nuclear analytical chemistry 8) chemometrics and 9) nuclear fuel technology

  15. Prevalence of Pre-Analytical Errors in Clinical Chemistry Diagnostic Labs in Sulaimani City of Iraqi Kurdistan

    OpenAIRE

    Najat, Dereen

    2017-01-01

    Background Laboratory testing is roughly divided into three phases: a pre-analytical phase, an analytical phase and a post-analytical phase. Most analytical errors have been attributed to the analytical phase. However, recent studies have shown that up to 70% of analytical errors reflect the pre-analytical phase. The pre-analytical phase comprises all processes from the time a laboratory request is made by a physician until the specimen is analyzed at the lab. Generally, the pre-analytical ph...

  16. Process analytical chemistry applied to actinide waste streams

    International Nuclear Information System (INIS)

    Day, R.S.

    1994-01-01

    The Department of Energy is being called upon to clean up its legacy of waste from the nuclear complex generated during the cold war period. Los Alamos National Laboratory is actively involved in waste minimization and waste stream polishing activities associated with this clean up. The Advanced Testing Line for Actinide Separations (ATLAS) at Los Alamos serves as a developmental test bed for integrating flow sheet development of nitric acid waste streams with process analytical chemistry and process control techniques. The wastes require processing in glove boxes because of the radioactive components, thus adding to the difficulties of making analytical measurements. Process analytical chemistry methods provide real-time chemical analysis in support of existing waste stream operations and enhance the development of new waste stream polishing initiatives. The instrumentation and methods being developed on ATLAS are designed to supply near-real-time analyses of virtually all of the chemical parameters found in nitric acid processing of actinide waste. These measurements supply information on important processing parameters including actinide oxidation states, free acid concentration, interfering anions and metal impurities

  17. Development and pilot test of a new set of good practice indicators for chronic cancer pain management.

    Science.gov (United States)

    Saturno, P J; Martinez-Nicolas, I; Robles-Garcia, I S; López-Soriano, F; Angel-García, D

    2015-01-01

    Pain is among the most important symptoms in terms of prevalence and cause of distress for cancer patients and their families. However, there is a lack of clearly defined measures of quality pain management to identify problems and monitor changes in improvement initiatives. We built a comprehensive set of evidence-based indicators following a four-step model: (1) review and systematization of existing guidelines to list evidence-based recommendations; (2) review and systematization of existing indicators matching the recommendations; (3) development of new indicators to complete a set of measures for the identified recommendations; and (4) pilot test (in hospital and primary care settings) for feasibility, reliability (kappa), and usefulness for the identification of quality problems using the lot quality acceptance sampling (LQAS) method and estimates of compliance. Twenty-two indicators were eventually pilot tested. Seventeen were feasible in hospitals and 12 in all settings. Feasibility barriers included difficulties in identifying target patients, deficient clinical records and low prevalence of cases for some indicators. Reliability was mostly very good or excellent (κ > 0.8). Four indicators, all of them related to medication and prevention of side effects, had acceptable compliance at the 75%/40% LQAS level. Other important medication-related indicators (i.e., adjustment to pain intensity, prescription for breakthrough pain) and indicators concerning patient-centred care (i.e., attention to psychological distress and educational needs) had very low compliance, highlighting specific quality gaps. A set of good practice indicators has been built and pilot tested as a feasible, reliable and useful quality monitoring tool, underscoring particular and important areas for improvement. © 2014 European Pain Federation - EFIC®

  18. Sample diagnosis using indicator elements and non-analyte signals for inductively coupled plasma mass spectrometry

    International Nuclear Information System (INIS)

    Antler, Margaret; Ying Hai; Burns, David H.; Salin, Eric D.

    2003-01-01

    A sample diagnosis procedure that uses both non-analyte and analyte signals to estimate matrix effects in inductively coupled plasma-mass spectrometry is presented. Non-analyte signals are those of background species in the plasma (e.g. N+, ArO+), and changes in these signals can indicate changes in plasma conditions. Matrix effects of Al, Ba, Cs, K and Na on 19 non-analyte signals and 15 element signals were monitored. Multiple linear regression was used to build the prediction models, using a genetic algorithm for objective feature selection. Non-analyte elemental signals and non-analyte signals were compared for diagnosing matrix effects, and both were found to be suitable for estimating matrix effects. Individual analyte matrix effect estimation was compared with the overall matrix effect prediction, and models used to diagnose overall matrix effects were more accurate than individual analyte models. In previous work [Spectrochim. Acta Part B 57 (2002) 277], we tested models for analytical decision making. The current models were tested in the same way, and were able to successfully diagnose matrix effects with at least an 80% success rate

  19. Statistically qualified neuro-analytic failure detection method and system

    Science.gov (United States)

    Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.

    2002-03-02

    An apparatus and method for monitoring a process involve development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two stages: deterministic model adaption and stochastic model modification of the deterministic model adaptation. Deterministic model adaption involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation error minimization technique. Stochastic model modification involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system. Illustrative of the method and apparatus, the method is applied to a peristaltic pump system.
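    The core idea (an analytic model for the known physics plus a neural network for the unknown residual) can be sketched generically; the patented SQNA training scheme, the scaled equation-error minimization and the SPRT qualification step are not reproduced here, and all data below are synthetic.

        # Neuro-analytic sketch: train a small network on the residual between
        # measurements and the analytic model, then predict with their sum.
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)
        x = rng.uniform(0.0, 2.0, size=(400, 1))

        def analytic_model(x):                   # known process characteristics (assumed)
            return 3.0 * x

        truth = 3.0 * x + 0.4 * np.sin(5.0 * x)  # process with an unmodeled term
        y = truth + rng.normal(0.0, 0.02, size=x.shape)

        net = MLPRegressor(hidden_layer_sizes=(32,), max_iter=5000, random_state=0)
        net.fit(x, (y - analytic_model(x)).ravel())     # learn the residual only

        y_hat = analytic_model(x).ravel() + net.predict(x)
        print(f"RMS error: {np.sqrt(np.mean((y_hat - truth.ravel()) ** 2)):.4f}")

    Keeping the analytic part explicit means the network only has to capture the (small) unknown characteristics, which is what makes the combined model easier to qualify statistically.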

  20. Nationwide Multicenter Reference Interval Study for 28 Common Biochemical Analytes in China.

    Science.gov (United States)

    Xia, Liangyu; Chen, Ming; Liu, Min; Tao, Zhihua; Li, Shijun; Wang, Liang; Cheng, Xinqi; Qin, Xuzhen; Han, Jianhua; Li, Pengchang; Hou, Li'an; Yu, Songlin; Ichihara, Kiyoshi; Qiu, Ling

    2016-03-01

    A nationwide multicenter study was conducted in China to explore sources of variation of reference values and establish reference intervals for 28 common biochemical analytes, as part of the International Federation of Clinical Chemistry and Laboratory Medicine, Committee on Reference Intervals and Decision Limits (IFCC/C-RIDL) global study on reference values. A total of 3148 apparently healthy volunteers were recruited in 6 cities covering a wide area in China. Blood samples were tested in 2 central laboratories using Beckman Coulter AU5800 chemistry analyzers. Certified reference materials and a value-assigned serum panel were used for standardization of test results. Multiple regression analysis was performed to explore sources of variation. The need for partition of reference intervals was evaluated based on 3-level nested ANOVA. After secondary exclusion using the latent abnormal values exclusion method, reference intervals were derived by a parametric method using the modified Box-Cox formula. Test results of 20 analytes were made traceable to reference measurement procedures. By ANOVA, significant sex-related and age-related differences were observed in 12 and 12 analytes, respectively. A small regional difference was observed in the results for albumin, glucose, and sodium. Multiple regression analysis revealed BMI-related changes in the results of 9 analytes for men and 6 for women. Reference intervals of 28 analytes were computed, with 17 analytes partitioned by sex and/or age. In conclusion, reference intervals of 28 common chemistry analytes applicable to the Chinese Han population were established by use of the latest methodology. Reference intervals of 20 analytes traceable to reference measurement procedures can be used as common reference intervals, whereas the others can be used as assay system-specific reference intervals in China.
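    The final derivation step (a parametric interval via a Box-Cox transform) can be sketched as follows; the study's modified Box-Cox formula and latent abnormal values exclusion (LAVE) step are not reproduced, and the data are synthetic.

        # Parametric reference interval: Box-Cox transform, normal-theory
        # 2.5th/97.5th percentiles, then back-transform.
        import numpy as np
        from scipy import stats
        from scipy.special import inv_boxcox

        rng = np.random.default_rng(1)
        values = rng.lognormal(mean=3.0, sigma=0.25, size=500)  # synthetic analyte results

        transformed, lam = stats.boxcox(values)
        mu, sd = transformed.mean(), transformed.std(ddof=1)
        lower, upper = inv_boxcox(np.array([mu - 1.96 * sd, mu + 1.96 * sd]), lam)
        print(f"reference interval: {lower:.1f} - {upper:.1f}")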

  1. Power Users and Patchworking – an Analytical Approach to Critical Studies of Young People’s Learning with Digital Media

    DEFF Research Database (Denmark)

    Ryberg, Thomas; Dirckinck-Holmfeld, Lone

    2008-01-01

    This paper sets out to problematize generational categories such as ‘Power Users’ or ‘New Millennium Learners’ by discussing these in the light of recent research on youth and ICT. We then suggest analytic and conceptual pathways to engage in more critical and empirically founded studies of young...... people’s learning in technology and media-rich settings. Based on a study of a group of young ‘Power Users’ it is argued that conceptualising and analysing learning as a process of patchworking can enhance our knowledge of young people’s learning in such settings. We argue that the analytical approach...... gives us ways of critically investigating young people’s learning in technology and media-rich settings, and of studying whether these are processes of critical, reflexive enquiry where resources are creatively re-appropriated. Taking an analytical example as its point of departure, the paper presents the proposed metaphor...

  2. Analytical Propagation of Uncertainty in Life Cycle Assessment Using Matrix Formulation

    DEFF Research Database (Denmark)

    Imbeault-Tétreault, Hugues; Jolliet, Olivier; Deschênes, Louise

    2013-01-01

    with Monte Carlo results. The sensitivity and contribution of input parameters to output uncertainty were also analytically calculated. This article outlines an uncertainty analysis of the comparison between two case study scenarios. We conclude that the analytical method provides a good approximation...... on uncertainty calculation. This article shows the importance of the analytical method in uncertainty calculation, which could lead to a more complete uncertainty analysis in LCA practice....... uncertainty assessment is not a regular step in LCA. An analytical approach based on Taylor series expansion constitutes an effective means to overcome the drawbacks of the Monte Carlo method. This project aimed to test the approach on a real case study, and the resulting analytical uncertainty was compared...
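    The analytical approach mentioned (first-order Taylor series expansion) amounts to propagating input variances through the model gradient; below is a generic sketch with a stand-in function and assumed uncertainties, cross-checked against Monte Carlo as in the article's comparison.

        # First-order (Taylor) propagation: Var(y) ~ sum_i (df/dx_i * sigma_i)^2
        # for independent inputs; f is an illustrative stand-in, not an LCA model.
        import numpy as np

        def f(x):
            return x[0] * x[1] / x[2]

        x0 = np.array([2.0, 5.0, 4.0])     # parameter best estimates (assumed)
        sigma = np.array([0.2, 0.5, 0.1])  # parameter standard deviations (assumed)

        eps = 1e-6
        grad = np.array([(f(x0 + eps * np.eye(3)[i]) - f(x0)) / eps for i in range(3)])
        sd_taylor = np.sqrt(np.sum((grad * sigma) ** 2))

        rng = np.random.default_rng(2)
        samples = rng.normal(x0, sigma, size=(100000, 3))
        sd_mc = np.array([f(s) for s in samples]).std()

        print(f"y = {f(x0):.3f}, Taylor sd = {sd_taylor:.3f}, Monte Carlo sd = {sd_mc:.3f}")

    For mildly nonlinear models and modest input spreads the two standard deviations agree closely, which matches the article's conclusion that the analytical method is a good approximation.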

  3. HIV Rapid Testing in a VA Emergency Department Setting: Cost Analysis at 5 Years.

    Science.gov (United States)

    Knapp, Herschel; Chan, Kee

    2015-07-01

    To conduct a comprehensive cost-minimization analysis to understand the financial attributes of the first 5 years of an implementation wherein emergency department (ED) registered nurses administered HIV oral rapid tests to patients. A health science research implementation team coordinated with ED stakeholders and staff to provide training, implementation guidelines, and support to launch ED registered nurse-administered HIV oral rapid testing. Deidentified quantitative data were gathered from the electronic medical records detailing quarterly HIV rapid test rates in the ED setting spanning the first 5 years. Comprehensive cost analyses were conducted to evaluate the financial impact of this implementation. At 5 years, a total of 2,620 tests were conducted with a quarterly mean of 131 ± 81. Despite quarterly variability in testing rates, regression analysis revealed an average increase of 3.58 tests per quarter. Over the course of this implementation, Veterans Health Administration policy transitioned from written to verbal consent for HIV testing, serving to reduce the time and costs associated with the testing process. Our data indicated salient health outcome benefits for patients with respect to the potential for earlier detection, and associated long-run cost savings. Copyright © 2015. Published by Elsevier Inc.

  4. Characterization of Analytical Reference Glass-1 (ARG-1)

    International Nuclear Information System (INIS)

    Smith, G.L.

    1993-12-01

    High-level radioactive waste may be immobilized in borosilicate glass at the West Valley Demonstration Project, West Valley, New York, the Defense Waste Processing Facility (DWPF), Aiken, South Carolina, and the Hanford Waste Vitrification Project (HWVP), Richland, Washington. The vitrified waste form will be stored in stainless steel canisters before its eventual transfer to a geologic repository for long-term disposal. Waste Acceptance Product Specifications (WAPS) (DOE 1993), Section 1.1.2 requires that the waste form producers must report the measured chemical composition of the vitrified waste in their production records before disposal. Chemical analysis of glass waste forms is receiving increased attention due to qualification requirements of vitrified waste forms. The Pacific Northwest Laboratory (PNL) has been supporting the glass producers' analytical laboratories by a continuing program of multilaboratory analytical testing using interlaboratory "round robin" methods. At the PNL Materials Characterization Center Analytical Round Robin 4 workshop "Analysis of Nuclear Waste Glass and Related Materials," January 16-17, 1990, Pleasanton, California, the meeting attendees decided that simulated nuclear waste analytical reference glasses were needed for use as analytical standards. Use of common standard analytical reference materials would allow the glass producers' analytical laboratories to calibrate procedures and instrumentation, to control laboratory performance and conduct self-appraisals, and to help qualify their various waste forms

  5. An analytical solution for improved HIFU SAR estimation

    International Nuclear Information System (INIS)

    Dillon, C R; Vyas, U; Christensen, D A; Roemer, R B; Payne, A

    2012-01-01

    Accurate determination of the specific absorption rates (SARs) present during high intensity focused ultrasound (HIFU) experiments and treatments provides a solid physical basis for scientific comparison of results among HIFU studies and is necessary to validate and improve SAR predictive software, which will improve patient treatment planning, control and evaluation. This study develops and tests an analytical solution that significantly improves the accuracy of SAR values obtained from HIFU temperature data. SAR estimates are obtained by fitting the analytical temperature solution for a one-dimensional radial Gaussian heating pattern to the temperature versus time data following a step in applied power and evaluating the initial slope of the analytical solution. The analytical method is evaluated in multiple parametric simulations for which it consistently (except at high perfusions) yields maximum errors of less than 10% at the center of the focal zone compared with errors up to 90% and 55% for the commonly used linear method and an exponential method, respectively. For high perfusion, an extension of the analytical method estimates SAR with less than 10% error. The analytical method is validated experimentally by showing that the temperature elevations predicted using the analytical method's SAR values determined for the entire 3D focal region agree well with the experimental temperature elevations in a HIFU-heated tissue-mimicking phantom. (paper)
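    The generic step the abstract describes (fit the temperature rise after a power step, then convert the fitted initial slope to SAR via SAR = c·dT/dt at t = 0) can be sketched with an assumed simple rise model; the paper's 1D radial Gaussian solution is not reproduced here, and all values are illustrative.

        # SAR from the initial slope of a fitted temperature-rise curve.
        import numpy as np
        from scipy.optimize import curve_fit

        def rise(t, A, tau):                 # assumed exponential-approach model
            return A * (1.0 - np.exp(-t / tau))

        c = 3600.0                           # tissue specific heat (J/kg/K), assumed
        t = np.linspace(0.0, 10.0, 60)       # time after the power step (s)
        data = rise(t, 4.0, 3.0) + np.random.default_rng(3).normal(0.0, 0.05, t.size)

        (A, tau), _ = curve_fit(rise, t, data, p0=(1.0, 1.0))
        sar = c * A / tau                    # initial slope of the fit is A/tau
        print(f"SAR estimate: {sar:.0f} W/kg")

    Evaluating the slope of a fitted curve at t = 0, rather than differencing the first noisy samples, is what makes this family of estimates more robust than the linear method the abstract compares against.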

  6. Let's Talk... Analytics

    Science.gov (United States)

    Oblinger, Diana G.

    2012-01-01

    Talk about analytics seems to be everywhere. Everyone is talking about analytics. Yet even with all the talk, many in higher education have questions about--and objections to--using analytics in colleges and universities. In this article, the author explores the use of analytics in, and all around, higher education. (Contains 1 note.)

  7. Use of evidence in a categorization task: analytic and holistic processing modes.

    Science.gov (United States)

    Greco, Alberto; Moretti, Stefania

    2017-11-01

    Category learning performance can be influenced by many contextual factors, but the effects of these factors are not the same for all learners. The present study suggests that these differences can be due to the different ways evidence is used, according to two basic modalities of processing information: analytic and holistic. In order to test the impact of the information provided, an inductive rule-based task was designed, in which feature salience and comparison informativeness between examples of two categories were manipulated during the learning phases, by introducing and progressively reducing some perceptual biases. To gather data on processing modalities, we devised the Active Feature Composition task, a production task that does not require classifying new items but reproducing them by combining features. At the end, an explicit rating task was performed, which entailed assessing the accuracy of a set of possible categorization rules. A combined analysis of the data collected with these two different tests enabled profiling participants in regard to the kind of processing modality, the structure of representations and the quality of categorial judgments. Results showed that despite the fact that the information provided was the same for all participants, those who adopted analytic processing better exploited evidence and performed more accurately, whereas with holistic processing categorization is perfectly possible but inaccurate. Finally, the cognitive implications of the proposed procedure, with regard to the involved processes and representations, are discussed.

  8. ANALYTiC: An Active Learning System for Trajectory Classification.

    Science.gov (United States)

    Soares Junior, Amilcar; Renso, Chiara; Matwin, Stan

    2017-01-01

    The increasing availability and use of positioning devices has resulted in large volumes of trajectory data. However, semantic annotations for such data are typically added by domain experts, which is a time-consuming task. Machine-learning algorithms can help infer semantic annotations from trajectory data by learning from sets of labeled data. Specifically, active learning approaches can minimize the set of trajectories to be annotated while preserving good performance measures. The ANALYTiC web-based interactive tool visually guides users through this annotation process.
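    The active-learning loop behind such a tool can be sketched with uncertainty sampling: repeatedly train on the labeled pool, then ask the human to annotate the trajectory the classifier is least sure about. Everything below (features, model, loop size) is a synthetic stand-in, not the ANALYTiC implementation.

        # Uncertainty-sampling active learning on synthetic "trajectory features".
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(4)
        X = rng.normal(size=(500, 4))                   # feature vectors (synthetic)
        y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # hidden ground-truth labels

        labeled = list(rng.choice(len(X), 10, replace=False))
        for _ in range(20):                             # 20 annotation requests
            clf = LogisticRegression().fit(X[labeled], y[labeled])
            uncertainty = -np.abs(clf.predict_proba(X)[:, 1] - 0.5)
            uncertainty[labeled] = -np.inf              # never re-query labeled items
            labeled.append(int(np.argmax(uncertainty))) # "expert" labels this one

        print(f"accuracy with {len(labeled)} labels: {clf.score(X, y):.2f}")

    Querying near the decision boundary typically reaches a given accuracy with far fewer annotations than random labeling, which is the minimization property the abstract refers to.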

  9. Innovative technology summary report: Road Transportable Analytical Laboratory (RTAL)

    International Nuclear Information System (INIS)

    1998-10-01

    The Road Transportable Analytical Laboratory (RTAL) has been used in support of US Department of Energy (DOE) site and waste characterization and remediation planning at Fernald Environmental Management Project (FEMP) and is being considered for implementation at other DOE sites, including the Paducah Gaseous Diffusion Plant. The RTAL laboratory system consists of a set of individual laboratory modules deployable independently or as an interconnected group to meet each DOE site's specific analysis needs. The prototype RTAL, deployed at FEMP Operable Unit 1 Waste Pits, has been designed to be synergistic with existing analytical laboratory capabilities, thereby reducing the occurrence of unplanned rush samples that are disruptive to efficient laboratory operations

  10. Prevalence of Pre-Analytical Errors in Clinical Chemistry Diagnostic Labs in Sulaimani City of Iraqi Kurdistan.

    Science.gov (United States)

    Najat, Dereen

    2017-01-01

    Laboratory testing is roughly divided into three phases: a pre-analytical phase, an analytical phase and a post-analytical phase. Most analytical errors have been attributed to the analytical phase. However, recent studies have shown that up to 70% of analytical errors reflect the pre-analytical phase. The pre-analytical phase comprises all processes from the time a laboratory request is made by a physician until the specimen is analyzed at the lab. Generally, the pre-analytical phase includes patient preparation, specimen transportation, specimen collection and storage. In the present study, we report the first comprehensive assessment of the frequency and types of pre-analytical errors at the Sulaimani diagnostic labs in Iraqi Kurdistan. Over 2 months, 5500 venous blood samples were observed in 10 public diagnostic labs of Sulaimani City. The percentages of rejected samples and types of sample inappropriateness were evaluated. The percentage of each of the following pre-analytical errors were recorded: delay in sample transportation, clotted samples, expired reagents, hemolyzed samples, samples not on ice, incorrect sample identification, insufficient sample, tube broken in centrifuge, request procedure errors, sample mix-ups, communication conflicts, misinterpreted orders, lipemic samples, contaminated samples and missed physician's request orders. The difference between the relative frequencies of errors observed in the hospitals considered was tested using a proportional Z test. In particular, the survey aimed to discover whether analytical errors were recorded and examine the types of platforms used in the selected diagnostic labs. The analysis showed a high prevalence of improper sample handling during the pre-analytical phase. The percentage of inappropriate samples was as high as 39%. The major reasons for rejection were hemolyzed samples (9%), incorrect sample identification (8%) and clotted samples (6%). Most quality control schemes at Sulaimani

  11. Prevalence of Pre-Analytical Errors in Clinical Chemistry Diagnostic Labs in Sulaimani City of Iraqi Kurdistan.

    Directory of Open Access Journals (Sweden)

    Dereen Najat

    Full Text Available Laboratory testing is roughly divided into three phases: a pre-analytical phase, an analytical phase and a post-analytical phase. Most analytical errors have been attributed to the analytical phase. However, recent studies have shown that up to 70% of analytical errors reflect the pre-analytical phase. The pre-analytical phase comprises all processes from the time a laboratory request is made by a physician until the specimen is analyzed at the lab. Generally, the pre-analytical phase includes patient preparation, specimen transportation, specimen collection and storage. In the present study, we report the first comprehensive assessment of the frequency and types of pre-analytical errors at the Sulaimani diagnostic labs in Iraqi Kurdistan. Over 2 months, 5500 venous blood samples were observed in 10 public diagnostic labs of Sulaimani City. The percentages of rejected samples and types of sample inappropriateness were evaluated. The percentage of each of the following pre-analytical errors were recorded: delay in sample transportation, clotted samples, expired reagents, hemolyzed samples, samples not on ice, incorrect sample identification, insufficient sample, tube broken in centrifuge, request procedure errors, sample mix-ups, communication conflicts, misinterpreted orders, lipemic samples, contaminated samples and missed physician's request orders. The difference between the relative frequencies of errors observed in the hospitals considered was tested using a proportional Z test. In particular, the survey aimed to discover whether analytical errors were recorded and examine the types of platforms used in the selected diagnostic labs. The analysis showed a high prevalence of improper sample handling during the pre-analytical phase. The percentage of inappropriate samples was as high as 39%. The major reasons for rejection were hemolyzed samples (9%), incorrect sample identification (8%) and clotted samples (6%). Most quality control schemes

  12. Development and Validation of a Learning Analytics Framework: Two Case Studies Using Support Vector Machines

    Science.gov (United States)

    Ifenthaler, Dirk; Widanapathirana, Chathuranga

    2014-01-01

    Interest in collecting and mining large sets of educational data on student background and performance to conduct research on learning and instruction has developed as an area generally referred to as learning analytics. Higher education leaders are recognizing the value of learning analytics for improving not only learning and teaching but also…

  13. Analytics for Education

    Science.gov (United States)

    MacNeill, Sheila; Campbell, Lorna M.; Hawksey, Martin

    2014-01-01

    This article presents an overview of the development and use of analytics in the context of education. Using Buckingham Shum's three levels of analytics, the authors present a critical analysis of current developments in the domain of learning analytics, and contrast the potential value of analytics research and development with real world…

  14. Data Analytics of Hydraulic Fracturing Data

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Jovan Yang [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Viswanathan, Hari [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hyman, Jeffery [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Middleton, Richard [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-08-11

    These are a set of slides on the data analytics of hydraulic fracturing data. The conclusions from this research are the following: they proposed a permeability evolution as a new mechanism to explain hydraulic fracturing trends; they created a model to include this mechanism and it showed promising results; the paper from this research is ready for submission; they devised a way to identify and sort refractures in order to study their effects, and this paper is currently being written.

  15. The Journal of Learning Analytics: Supporting and Promoting Learning Analytics Research

    OpenAIRE

    Siemens, George

    2014-01-01

    The paper gives a brief overview of the main activities for the development of the emerging field of learning analytics led by the Society for Learning Analytics Research (SoLAR). The place of the Journal of Learning Analytics is identified. Analytics is the most significant new initiative of SoLAR.

  16. The "Journal of Learning Analytics": Supporting and Promoting Learning Analytics Research

    Science.gov (United States)

    Siemens, George

    2014-01-01

    The paper gives a brief overview of the main activities for the development of the emerging field of learning analytics led by the Society for Learning Analytics Research (SoLAR). The place of the "Journal of Learning Analytics" is identified. Analytics is the most significant new initiative of SoLAR.

  17. Methods, software and datasets to verify DVH calculations against analytical values: Twenty years late(r).

    Science.gov (United States)

    Nelms, Benjamin; Stambaugh, Cassandra; Hunt, Dylan; Tonner, Brian; Zhang, Geoffrey; Feygelman, Vladimir

    2015-08-01

    The authors designed data, methods, and metrics that can serve as a standard, independent of any software package, to evaluate dose-volume histogram (DVH) calculation accuracy and detect limitations. The authors use simple geometrical objects at different orientations combined with dose grids of varying spatial resolution with linear 1D dose gradients; when combined, ground truth DVH curves can be calculated analytically in closed form to serve as the absolute standards. DICOM RT structure sets containing a small sphere, cylinder, and cone were created programmatically with axial plane spacing varying from 0.2 to 3 mm. Cylinders and cones were modeled in two different orientations with respect to the IEC 1217 Y axis. The contours were designed to stringently but methodically test voxelation methods required for DVH. Synthetic RT dose files were generated with a 1D linear dose gradient and with grid resolution varying from 0.4 to 3 mm. Two commercial DVH algorithms, Pinnacle (Philips Radiation Oncology Systems) and PlanIQ (Sun Nuclear Corp.), were tested against analytical values using custom, noncommercial analysis software. In Test 1, axial contour spacing was constant at 0.2 mm while dose grid resolution varied. In Tests 2 and 3, the dose grid resolution was matched to varying subsampled axial contours with spacing of 1, 2, and 3 mm, and difference analysis and metrics were employed: (1) histograms of the accuracy of various DVH parameters (total volume, Dmax, Dmin, and doses to % volume: D99, D95, D5, D1, D0.03 cm³) and (2) volume errors extracted along the DVH curves were generated and summarized in tabular and graphical forms. In Test 1, Pinnacle produced 52 deviations (15%) while PlanIQ produced 5 (1.5%). In Test 2, Pinnacle and PlanIQ differed from analytical values by >3% in 93 (36%) and 18 (7%) instances, respectively. Excluding Dmin and Dmax as least clinically relevant would result in 32 (15%) vs 5 (2%) scored deviations for Pinnacle vs PlanIQ in Test 1, while Test 2
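    The kind of closed-form ground truth the authors describe is easy to reconstruct for one case: a sphere of radius R in a 1D linear dose gradient, where the volume receiving at least dose d is a spherical cap. The sketch below re-derives that cumulative DVH; geometry and gradient values are illustrative, not the paper's datasets.

        # Analytical cumulative DVH for a sphere in a linear dose gradient.
        import numpy as np

        R = 1.0               # sphere radius (cm), assumed
        d0, grad = 10.0, 2.0  # dose at the centre (Gy) and gradient (Gy/cm), assumed

        def volume_fraction(d):
            """Fraction of the sphere receiving dose >= d (spherical cap)."""
            z = np.clip((d - d0) / grad, -R, R)   # plane where dose equals d
            h = R - z                             # cap height on the high-dose side
            return (np.pi * h**2 * (3 * R - h) / 3) / (4 * np.pi * R**3 / 3)

        for d in (8.0, 10.0, 12.0):
            print(f"V(dose >= {d:4.1f} Gy) = {100 * volume_fraction(d):5.1f}%")

    At the centre dose the fraction is exactly 50%, and it runs from 100% to 0% across the gradient, giving fixed points against which a commercial DVH curve can be scored.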

  18. Pre-analytical conditions in non-invasive prenatal testing of cell-free fetal RHD.

    Directory of Open Access Journals (Sweden)

    Frederik Banch Clausen

    Full Text Available Non-invasive prenatal testing of cell-free fetal DNA (cffDNA) in maternal plasma can predict the fetal RhD type in D negative pregnant women. In Denmark, routine antenatal screening for the fetal RhD gene (RHD) directs the administration of antenatal anti-D prophylaxis only to women who carry an RhD positive fetus. Prophylaxis reduces the risk of immunization that may lead to hemolytic disease of the fetus and the newborn. The reliability of predicting the fetal RhD type depends on pre-analytical factors and assay sensitivity. We evaluated the testing setup in the Capital Region of Denmark, based on data from routine antenatal RHD screening. Blood samples were drawn at gestational age 25 weeks. DNA extracted from 1 mL of plasma was analyzed for fetal RHD using a duplex method for exon 7/10. We investigated the effect of blood sample transportation time (n = 110) and ambient outdoor temperatures (n = 1539) on the levels of cffDNA and total DNA. We compared two different quantification methods, the delta Ct method and a universal standard curve. PCR pipetting was compared on two systems (n = 104). The cffDNA level was unaffected by blood sample transportation for up to 9 days and by ambient outdoor temperatures ranging from -10 °C to 28 °C during transport. The universal standard curve was applicable for cffDNA quantification. Identical levels of cffDNA were observed using the two automated PCR pipetting systems. We detected a mean of 100 fetal DNA copies/mL at a median gestational age of 25 weeks (range 10-39, n = 1317). The setup for real-time PCR-based, non-invasive prenatal testing of cffDNA in the Capital Region of Denmark is very robust. Our findings regarding the transportation of blood samples demonstrate the high stability of cffDNA. The applicability of a universal standard curve facilitates easy cffDNA quantification.

  19. Architectural Considerations for Highly Scalable Computing to Support On-demand Video Analytics

    Science.gov (United States)

    2017-04-19

    research were used to implement a distributed on-demand video analytics system that was prototyped for the use of forensics investigators in law enforcement. Video analytics systems have been of tremendous interest... The system was tested in the wild using video files as well as a commercial Video Management System supporting more than 100 surveillance... Keywords: on-demand video intelligence; intelligent video system; video analytics platform.

  20. BIG DATA ANALYTICS AND PRECISION ANIMAL AGRICULTURE SYMPOSIUM: Data to decisions.

    Science.gov (United States)

    White, B J; Amrine, D E; Larson, R L

    2018-04-14

    Big data are frequently used in many facets of business and agronomy to enhance knowledge needed to improve operational decisions. Livestock operations collect data of sufficient quantity to perform predictive analytics. Predictive analytics can be defined as a methodology and suite of data evaluation techniques to generate a prediction for specific target outcomes. The objective of this manuscript is to describe the process of using big data and the predictive analytic framework to create tools to drive decisions in livestock production, health, and welfare. The predictive analytic process involves selecting a target variable, managing the data, partitioning the data, then creating algorithms, refining algorithms, and finally comparing accuracy of the created classifiers. The partitioning of the datasets allows model building and refining to occur prior to testing the predictive accuracy of the model with naive data to evaluate overall accuracy. Many different classification algorithms are available for predictive use and testing multiple algorithms can lead to optimal results. Application of a systematic process for predictive analytics using data that is currently collected or that could be collected on livestock operations will facilitate precision animal management through enhanced livestock operational decisions.
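    A minimal sketch of that workflow (partitioning, building competing classifiers, then scoring on naive data) with a generic tabular stand-in for livestock records:

        # Partition -> build/refine -> compare accuracy on naive (held-out) data.
        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.linear_model import LogisticRegression
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(5)
        X = rng.normal(size=(2000, 6))                  # operational measurements (synthetic)
        y = (X[:, 0] - X[:, 3] + rng.normal(0, 0.5, 2000) > 0).astype(int)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

        for model in (LogisticRegression(), RandomForestClassifier(random_state=0)):
            model.fit(X_tr, y_tr)
            print(f"{type(model).__name__}: accuracy on naive data = "
                  f"{model.score(X_te, y_te):.3f}")

    Testing several algorithm families on the same held-out partition, as here, is the "comparing accuracy of the created classifiers" step the abstract describes.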

  1. Safety and quality of food contact materials. Part 1: evaluation of analytical strategies to introduce migration testing into good manufacturing practice.

    Science.gov (United States)

    Feigenbaum, A; Scholler, D; Bouquant, J; Brigot, G; Ferrier, D; Franzl, R; Lillemarktt, L; Riquet, A M; Petersen, J H; van Lierop, B; Yagoubi, N

    2002-02-01

    The results of a research project (EU AIR Research Programme CT94-1025) aimed at introducing control of migration into good manufacturing practice and into enforcement work are reported. Representative polymer classes were defined on the basis of chemical structure, technological function, migration behaviour and market share. These classes were characterized by analytical methods. Analytical techniques were investigated for identification of potential migrants. High-temperature gas chromatography was shown to be a powerful method, and 1H nuclear magnetic resonance provided a convenient fingerprint of plastic materials. Volatile compounds were characterized by headspace techniques, where it was shown to be essential to differentiate volatile compounds desorbed from those generated during the thermal desorption itself. For metal trace analysis, microwave mineralization followed by atomic absorption was employed. These different techniques were introduced into a systematic testing scheme that is envisaged as being suitable both for industrial control and for enforcement laboratories. Guidelines will be proposed in the second part of this paper.

  2. Linking job demands and resources to employee engagement and burnout: a theoretical extension and meta-analytic test.

    Science.gov (United States)

    Crawford, Eean R; Lepine, Jeffery A; Rich, Bruce Louis

    2010-09-01

    We refine and extend the job demands-resources model with theory regarding appraisal of stressors to account for inconsistencies in relationships between demands and engagement, and we test the revised theory using meta-analytic structural modeling. Results indicate support for the refined and updated theory. First, demands and burnout were positively associated, whereas resources and burnout were negatively associated. Second, whereas relationships among resources and engagement were consistently positive, relationships among demands and engagement were highly dependent on the nature of the demand. Demands that employees tend to appraise as hindrances were negatively associated with engagement, and demands that employees tend to appraise as challenges were positively associated with engagement. Implications for future research are discussed. Copyright 2010 APA, all rights reserved

  3. Median of patient results as a tool for assessment of analytical stability

    DEFF Research Database (Denmark)

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft

    2015-01-01

    BACKGROUND: In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor the long-term analytical stability as a supplement to the internal control procedures is often needed. METHOD......: Patient data from daily internal control schemes were used for monthly appraisal of the analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable...... analytical bias based on biological variation. RESULTS: Seventy-five percent of the twenty analytes measured on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and with the desirable specifications for bias. DISCUSSION: Patient results applied in analytical quality performance...
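    The monitoring idea can be sketched in a few lines: compute monthly medians of patient results and flag months whose divergence from a baseline median exceeds the allowable bias derived from biological variation. All numbers below are illustrative, not from the study.

        # Monthly patient-result medians vs an allowable-bias limit.
        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(6)
        dates = pd.date_range("2024-01-01", periods=366, freq="D")
        results = pd.Series(rng.normal(140.0, 4.0, dates.size), index=dates)  # e.g. sodium
        results["2024-10":] += 1.5                       # simulated calibration drift

        target = results["2024-01":"2024-06"].median()   # baseline median
        allowable_bias = 0.9                             # allowable bias (assumed)

        monthly = results.resample("ME").median()
        flagged = (monthly - target).abs() > allowable_bias
        print(pd.DataFrame({"median": monthly.round(2), "flagged": flagged}))

    Because the median is robust to the outliers that pathological results introduce, it tracks slow calibration drift without requiring a separate healthy reference population.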

  4. Analytical electron microscopy examination of solid reaction products in long-term test of SRL 200 waste glasses

    International Nuclear Information System (INIS)

    Buck, E.C.; Fortner, J.A.; Bates, J.K.; Feng, X.; Dietz, N.L.; Bradley, C.R.; Tani, B.S.

    1993-01-01

    Alteration phases, found on the leached surfaces and present as colloids in the leachates of 200-based frit (fully active and simulated) nuclear waste glass, reacted under static test conditions at a surface area to leachate volume ratio of 20,000 m⁻¹ for 15 to 728 days, have been examined by analytical electron microscopy. The compositions of the secondary phases were determined using x-ray energy dispersive spectroscopy and electron energy loss spectroscopy, and structural analysis was accomplished by electron diffraction. Long-term samples of simulated glass, which had undergone an acceleration of reaction after 182 days, possessed a number of silicate secondary phases, including: smectite (iron silicate and potassium iron alumino-silicate), weeksite (uranium silicate), zeolite (calcium potassium alumino-silicate), tobermorite (calcium silicate), and a pure silica phase. However, uranium silicates and smectite have also been observed, in both the leachate and the leached layer, in tests that have not undergone the acceleration of reaction, suggesting that these phases are not responsible for the acceleration.

  5. Discussion and analytical test for inclusion of advanced field and boundary condition in theory of free electron lasers

    Science.gov (United States)

    Niknejadi, Pardis; Madey, John M. J.

    2017-09-01

    By the covariant statement of the distance in space-time separating transmitter and receiver, the emission and absorption of the retarded and advanced waves are all simultaneous. In other words, for signals carried on electromagnetic waves (advanced or retarded), the invariant interval (c dt)² − dr² between the emission of a wave and its absorption at the non-reflecting boundary is always identically zero. Utilizing this principle, we have previously explained the advantages of including the coherent radiation reaction force as part of the solution to the boundary value problem for FELs that radiate into "free space" (Self Amplified Spontaneous Emission (SASE) FELs), and we have discussed how the advanced field of the absorber can interact with the radiating particles at the time of emission. Here we present an analytical test which verifies that a multilayer mirror can act as a band-pass filter and can contribute to microbunching in the electron beam, and we discuss the motivation, conditions, requirements, and method for testing this effect.
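
    The null-interval condition invoked in this abstract is standard special relativity and can be stated compactly (a reference restatement of the textbook relation, not the paper's derivation):

    ```latex
    % Null interval between the emission and absorption events of a light
    % signal (dt and dr are the coordinate-time and spatial separations):
    ds^2 = (c\,dt)^2 - dr^2 = 0 \quad\Longrightarrow\quad dr = c\,dt
    % i.e., emission and absorption are connected by a vanishing invariant
    % interval, for retarded and advanced waves alike.
    ```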

  6. Assessing fitness for use: the expected value of spatial data sets

    NARCIS (Netherlands)

    Bruin, de S.; Bregt, A.K.; Ven, van de M.

    2001-01-01

    This paper proposes and illustrates a decision analytical approach to compare the value of alternative spatial data sets. In contrast to other work addressing value of information, its focus is on value of control. This is a useful concept when choosing the best data set for decision making under uncertainty.

  7. Scalable and Power Efficient Data Analytics for Hybrid Exascale Systems

    Energy Technology Data Exchange (ETDEWEB)

    Choudhary, Alok [Northwestern Univ., Evanston, IL (United States); Samatova, Nagiza [North Carolina State Univ., Raleigh, NC (United States); Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wu, Kesheng [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Liao, Wei-keng [Northwestern Univ., Evanston, IL (United States)

    2015-03-19

    This project developed a generic and optimized set of core data analytics functions. These functions organically consolidate a broad constellation of high-performance analytical pipelines. As the architectures of emerging HPC systems become inherently heterogeneous, there is a need to design algorithms for data analysis kernels accelerated on hybrid multi-node, multi-core HPC architectures comprising a mix of CPUs, GPUs, and SSDs. Furthermore, the power-aware trend drives the advances in our performance-energy tradeoff analysis framework, which enables our data analysis kernel algorithms and software to be parameterized so that users can choose the right power-performance optimizations.

  8. Analytical model for screening potential CO2 repositories

    Science.gov (United States)

    Okwen, R.T.; Stewart, M.T.; Cunningham, J.A.

    2011-01-01

    Assessing potential repositories for geologic sequestration of carbon dioxide using numerical models can be complicated, costly, and time-consuming, especially when faced with the challenge of selecting a repository from a multitude of potential repositories. This paper presents a set of simple analytical equations (model), based on the work of previous researchers, that could be used to evaluate the suitability of candidate repositories for subsurface sequestration of carbon dioxide. We considered the injection of carbon dioxide at a constant rate into a confined saline aquifer via a fully perforated vertical injection well. The validity of the analytical model was assessed via comparison with the TOUGH2 numerical model. The metrics used in comparing the two models include (1) spatial variations in formation pressure and (2) vertically integrated brine saturation profile. The analytical model and TOUGH2 show excellent agreement in their results when similar input conditions and assumptions are applied in both. The analytical model neglects capillary pressure and the pressure dependence of fluid properties. However, simulations in TOUGH2 indicate that little error is introduced by these simplifications. Sensitivity studies indicate that the agreement between the analytical model and TOUGH2 depends strongly on (1) the residual brine saturation, (2) the difference in density between carbon dioxide and resident brine (buoyancy), and (3) the relationship between relative permeability and brine saturation. The results achieved suggest that the analytical model is valid when the relationship between relative permeability and brine saturation is linear or quasi-linear and when the irreducible saturation of brine is zero or very small. © 2011 Springer Science+Business Media B.V.

  9. Radioimmunoassay. A revolution in the analytic procedure

    Energy Technology Data Exchange (ETDEWEB)

    Strecker, H; Eckert, H G [Farbwerke Hoechst A.G., Frankfurt am Main (Germany, F.R.). Radiochemisches Lab.

    1978-04-01

    Radioimmunoassay is an analytic method which combines the sensitivity of radioactive measurement and the specificity of the antigen-antibody reaction. Substances down to a concentration of some picograms per ml of serum (or biological material) can be measured in the presence of a millionfold excess of otherwise interfering substances. The method is easy to carry out (test tube chemistry). The main field of application at the moment is in endocrinology; further possibilities of application are in pharmaceutical research, environmental protection, forensic medicine, and general analytical work. Radioactive sources are used only in vitro in the nanocurie range, i.e. radiation exposure is negligible.

  10. European specialist porphyria laboratories: diagnostic strategies, analytical quality, clinical interpretation, and reporting as assessed by an external quality assurance program.

    Science.gov (United States)

    Aarsand, Aasne K; Villanger, Jørild H; Støle, Egil; Deybach, Jean-Charles; Marsden, Joanne; To-Figueras, Jordi; Badminton, Mike; Elder, George H; Sandberg, Sverre

    2011-11-01

    The porphyrias are a group of rare metabolic disorders whose diagnosis depends on identification of specific patterns of porphyrin precursor and porphyrin accumulation in urine, blood, and feces. Diagnostic tests for porphyria are performed by specialized laboratories in many countries. Data regarding the analytical and diagnostic performance of these laboratories are scarce. We distributed 5 sets of multispecimen samples from different porphyria patients accompanied by clinical case histories to 18-21 European specialist porphyria laboratories/centers as part of a European Porphyria Network organized external analytical and postanalytical quality assessment (EQA) program. The laboratories stated which analyses they would normally have performed given the case histories and reported results of all porphyria-related analyses available, interpretative comments, and diagnoses. Reported diagnostic strategies initially showed considerable diversity, but the number of laboratories applying adequate diagnostic strategies increased during the study period. We found an average interlaboratory CV of 50% (range 12%-152%) for analytes in absolute concentrations. Result normalization by forming ratios to the upper reference limits did not reduce this variation. Sixty-five percent of reported results were within biological variation-based analytical quality specifications. Clinical interpretation of the obtained analytical results was accurate, and most laboratories established the correct diagnosis in all distributions. Based on a case-based EQA scheme, variations were apparent in analytical and diagnostic performance between European specialist porphyria laboratories. Our findings reinforce the use of EQA schemes as an essential tool to assess both analytical and diagnostic processes and thereby to improve patient care in rare diseases.

  11. Examination of fast reactor fuels, FBR analytical quality assurance standards and methods, and analytical methods development: irradiation tests. Progress report, April 1--June 30, 1976, and FY 1976

    International Nuclear Information System (INIS)

    Baker, R.D.

    1976-08-01

    Characterization of unirradiated and irradiated LMFBR fuels by analytical chemistry methods will continue, and additional methods will be modified and mechanized for hot cell application. Macro- and microexaminations will be made on fuel and cladding using the shielded electron microprobe, emission spectrograph, radiochemistry, gamma scanner, mass spectrometers, and other analytical facilities. New capabilities will be developed in gamma scanning, analyses to assess spatial distributions of fuel and fission products, mass spectrometric measurements of burnup and fission gas constituents, and other chemical analyses. Microstructural analyses of unirradiated and irradiated materials will continue using optical and electron microscopy and autoradiographic and x-ray techniques. Analytical quality assurance standards tasks are designed to assure the quality of the chemical characterizations necessary to evaluate reactor components relative to specifications. Tasks include: (1) the preparation and distribution of calibration materials and quality control samples for use in quality assurance surveillance programs, (2) the development of and the guidance in the use of quality assurance programs for sampling and analysis, (3) the development of improved methods of analysis, and (4) the preparation of continuously updated analytical method manuals. Reliable analytical methods development for the measurement of burnup, oxygen-to-metal (O/M) ratio, and various gases in irradiated fuels is described.

  12. The body of the analyst and the analytic setting: reflections on the embodied setting and the symbiotic transference.

    Science.gov (United States)

    Lemma, Alessandra

    2014-04-01

    In this paper the author questions whether the body of the analyst may be helpfully conceptualized as an embodied feature of the setting and suggests that this may be especially helpful for understanding patients who develop a symbiotic transference and for whom any variance in the analyst's body is felt to be profoundly destabilizing. In such cases the patient needs to relate to the body of the analyst concretely and exclusively as a setting 'constant' and its meaning for the patient may thus remain inaccessible to analysis for a long time. When the separateness of the body of the analyst reaches the patient's awareness because of changes in the analyst's appearance or bodily state, it then mobilizes primitive anxieties in the patient. It is only when the body of the analyst can become a dynamic variable between them (i.e., part of the process) that it can be used by the patient to further the exploration of their own mind. Copyright © 2014 Institute of Psychoanalysis.

  13. Identifying Dwarfs Workloads in Big Data Analytics

    OpenAIRE

    Gao, Wanling; Luo, Chunjie; Zhan, Jianfeng; Ye, Hainan; He, Xiwen; Wang, Lei; Zhu, Yuqing; Tian, Xinhui

    2015-01-01

    Big data benchmarking is particularly important and provides applicable yardsticks for evaluating booming big data systems. However, the wide coverage and great complexity of big data computing pose major challenges for big data benchmarking. How can we construct a benchmark suite using a minimum set of units of computation to represent the diversity of big data analytics workloads? Big data dwarfs are abstractions that capture frequently appearing operations in big data computing. One dwarf represents…

  14. Vertical equilibrium with sub-scale analytical methods for geological CO2 sequestration

    KAUST Repository

    Gasda, S. E.

    2009-04-23

    Large-scale implementation of geological CO2 sequestration requires quantification of risk and leakage potential. One potentially important leakage pathway for the injected CO2 involves existing oil and gas wells. Wells are particularly important in North America, where more than a century of drilling has created millions of oil and gas wells. Models of CO2 injection and leakage will involve large uncertainties in parameters associated with wells, and therefore a probabilistic framework is required. These models must be able to capture both the large-scale CO2 plume associated with the injection and the small-scale leakage problem associated with localized flow along wells. Within a typical simulation domain, many hundreds of wells may exist. One effective modeling strategy combines both numerical and analytical models with a specific set of simplifying assumptions to produce an efficient numerical-analytical hybrid model. The model solves a set of governing equations derived by vertical averaging with assumptions of a macroscopic sharp interface and vertical equilibrium. These equations are solved numerically on a relatively coarse grid, with an analytical model embedded to solve for wellbore flow occurring at the sub-gridblock scale. This vertical equilibrium with sub-scale analytical method (VESA) combines the flexibility of a numerical method, allowing for heterogeneous and geologically complex systems, with the efficiency and accuracy of an analytical method, thereby eliminating expensive grid refinement for sub-scale features. Through a series of benchmark problems, we show that VESA compares well with traditional numerical simulations and to a semi-analytical model which applies to appropriately simple systems. We believe that the VESA model provides the necessary accuracy and efficiency for applications of risk analysis in many CO2 sequestration problems. © 2009 Springer Science+Business Media B.V.

  15. Two-dimensional analytical solution for nodal calculation of nuclear reactors

    International Nuclear Information System (INIS)

    Silva, Adilson C.; Pessoa, Paulo O.; Silva, Fernando C.; Martinez, Aquilino S.

    2017-01-01

    Highlights: • A proposal for a coarse mesh nodal method is presented. • The proposal uses the analytical solution of the two-dimensional neutron diffusion equation. • The solution is performed on homogeneous nodes with the dimensions of the fuel assembly. • The solution uses four average fluxes on the node surfaces as boundary conditions. • The results show good accuracy and efficiency. - Abstract: In this paper, the two-dimensional (2D) neutron diffusion equation is analytically solved for two energy groups (2G). The spatial domain of the reactor core is divided into a set of nodes with uniform nuclear parameters. To determine iteratively the multiplication factor and the neutron flux in the reactor, we combine the analytical solution of the neutron diffusion equation with an iterative method known as the power method. The analytical solution is obtained for the different types of regions that compose the reactor, such as fuel and reflector regions. Four average fluxes on the node surfaces are used as boundary conditions for the analytical solution. Discontinuity factors on the node surfaces derived from the homogenization process are applied to preserve the average reaction rates and the net current in the fuel assembly (FA). To validate the results obtained by the analytical solution, a relative power density distribution in the FAs is determined from the neutron flux distribution and compared with reference values. The results show good accuracy and efficiency.
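
    The power method mentioned in this abstract is the standard outer iteration for the multiplication factor. The following is a minimal Python sketch of that iteration, assuming the generic matrix form M·φ = (1/k)·F·φ; the two-group matrices here are illustrative placeholders, not the paper's nodal operators:

    ```python
    import numpy as np

    # Minimal sketch of the power method for the multiplication factor k_eff,
    # solving M*phi = (1/k)*F*phi. M (losses) and F (fission production) are
    # illustrative two-group matrices, not the paper's nodal operators.
    def power_method(M, F, tol=1e-8, max_iter=1000):
        phi = np.ones(M.shape[0])   # initial flux guess
        k = 1.0                     # initial multiplication factor guess
        for _ in range(max_iter):
            phi_new = np.linalg.solve(M, F @ phi / k)          # invert losses against fission source
            k_new = k * (F @ phi_new).sum() / (F @ phi).sum()  # rescale k_eff by source growth
            k, phi = k_new, phi_new
            if abs(k_new - k) < tol:
                break
        return k, phi / phi.max()

    M = np.array([[0.12, 0.00],
                  [-0.02, 0.10]])   # removal/leakage with downscatter coupling
    F = np.array([[0.005, 0.10],
                  [0.000, 0.00]])   # fission neutrons all born in group 1
    k_eff, flux = power_method(M, F)
    print(f"k_eff = {k_eff:.5f}")   # ~0.20833 for these made-up numbers
    ```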

  16. TideGrapher: Visual Analytics of Tactical Situations for Rugby Matches

    Directory of Open Access Journals (Sweden)

    Yusuke Ishikawa

    2018-03-01

    Various attempts at exploiting information visualization for sports have recently been reported in the literature, although it is still challenging to analyze continuous ball matches. In this paper, we propose a novel visual analytics system, called TideGrapher, to track the transition of tactical situations in a rugby match. With a particular focus on the side position of the ball, we designed a dedicated spatial substrate based on the spatio-temporal trajectory of the ball and provided a set of basic interactions. Quantitative analysis was strengthened by adding a new index, called initiative, to the commonly used possession (ball occupation) and territory (dominance of territory). The feasibility of the proposed visual analytics system was proven empirically through application to datasets from real amateur and professional matches. Keywords: Information visualization, Sports visualization, Quantitative analysis, Visual analytics

  17. Does the Cognitive Reflection Test actually capture heuristic versus analytic reasoning styles in older adults?

    Science.gov (United States)

    Hertzog, Christopher; Smith, R Marit; Ariel, Robert

    2018-01-01

    Background/Study Context: This study evaluated adult age differences in the original three-item Cognitive Reflection Test (CRT; Frederick, 2005, The Journal of Economic Perspectives, 19, 25-42) and an expanded seven-item version of that test (Toplak et al., 2013, Thinking and Reasoning, 20, 147-168). The CRT is a numerical problem-solving test thought to capture a disposition towards either rapid, intuition-based problem solving (Type I reasoning) or a more thoughtful, analytical problem-solving approach (Type II reasoning). Test items are designed to induce heuristically guided errors that can be avoided if using an appropriate numerical representation of the test problems. We evaluated differences between young adults and old adults in CRT performance and correlates of CRT performance. Older adults (ages 60 to 80) were paid volunteers who participated in experiments assessing age differences in self-regulated learning. Young adults (ages 17 to 35) were students participating for pay as part of a project assessing measures of critical thinking skills or as a young comparison group in the self-regulated learning study. There were age differences in the number of CRT correct responses in two independent samples. Results with the original three-item CRT found older adults to have a greater relative proportion of errors based on providing the intuitive lure. However, younger adults actually had a greater proportion of intuitive errors on the long version of the CRT, relative to older adults. Item analysis indicated a much lower internal consistency of CRT items for older adults. These outcomes do not offer full support for the argument that older adults are higher in the use of a "Type I" cognitive style. The evidence was also consistent with an alternative hypothesis that age differences were due to lower levels of numeracy in the older samples. Alternative process-oriented evaluations of how older adults solve CRT items will probably be needed to determine

  18. Developing automated analytical methods for scientific environments using LabVIEW.

    Science.gov (United States)

    Wagner, Christoph; Armenta, Sergio; Lendl, Bernhard

    2010-01-15

    The development of new analytical techniques often requires the building of specially designed devices, each requiring its own dedicated control software. Especially in the research and development phase, LabVIEW has proven to be a highly useful tool for developing this software. Yet, it is still common practice to develop individual solutions for different instruments. In contrast to this, we present here a single LabVIEW-based program that can be directly applied to various analytical tasks without having to change the program code. Driven by a set of simple script commands, it can control a whole range of instruments, from valves and pumps to full-scale spectrometers. Fluid sample (pre-)treatment and separation procedures can thus be flexibly coupled to a wide range of analytical detection methods. Here, the capabilities of the program have been demonstrated by using it to control both a sequential injection analysis - capillary electrophoresis (SIA-CE) system with UV detection, and an analytical setup for studying the inhibition of enzymatic reactions using a SIA system with FTIR detection.
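
    The script-command pattern described here can be sketched in a few lines of pseudo-driver code. The Python fragment below is a hypothetical illustration of the idea only; the command names and handlers are invented, and the actual system in the paper is implemented in LabVIEW:

    ```python
    # Hypothetical sketch of the script-command pattern described above: one
    # generic controller dispatches simple text commands to device handlers.
    # Command names and handlers are invented for illustration.

    def set_valve(args):
        print(f"valve -> position {args[0]}")

    def run_pump(args):
        rate, seconds = float(args[0]), float(args[1])
        print(f"pump at {rate} mL/min for {seconds} s")

    def acquire(args):
        print(f"acquire on detector '{args[0]}'")

    HANDLERS = {"VALVE": set_valve, "PUMP": run_pump, "ACQUIRE": acquire}

    SCRIPT = """\
    VALVE 2
    PUMP 0.5 30
    ACQUIRE UV
    """

    for line in SCRIPT.strip().splitlines():
        command, *args = line.split()
        HANDLERS[command](args)   # same controller, different instruments via handlers
    ```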

  19. SciDAC visualization and analytics center for enabling technology

    International Nuclear Information System (INIS)

    Bethel, E Wes; Johnson, Chris; Joy, Ken; Ahern, Sean; Pascucci, Valerio; Childs, Hank; Cohen, Jonathan; Duchaineau, Mark; Hamann, Bernd; Hansen, Charles; Laney, Dan; Lindstrom, Peter; Meredith, Jeremy; Ostrouchov, George; Parker, Steven; Silva, Claudio; Sanderson, Allen; Tricoche, Xavier

    2007-01-01

    The Visualization and Analytics Center for Enabling Technologies (VACET) focuses on leveraging scientific visualization and analytics software technology as an enabling technology for increasing scientific productivity and insight. Advances in computational technology have resulted in an 'information big bang,' which in turn has created a significant data understanding challenge. This challenge is widely acknowledged to be one of the primary bottlenecks in contemporary science. The vision of VACET is to adapt, extend, create when necessary, and deploy visual data analysis solutions that are responsive to the needs of DOE's computational and experimental scientists. Our center is engineered to be directly responsive to those needs and to deliver solutions for use in DOE's large open computing facilities. The research and development directly target data understanding problems provided by our scientific application stakeholders. VACET draws from a diverse set of visualization technology ranging from production quality applications and application frameworks to state-of-the-art algorithms for visualization, analysis, analytics, data manipulation, and data management.

  20. An Analysis of Rocket Propulsion Testing Costs

    Science.gov (United States)

    Ramirez-Pagan, Carmen P.; Rahman, Shamim A.

    2009-01-01

    The primary mission at NASA Stennis Space Center (SSC) is rocket propulsion testing. Such testing is generally performed within two arenas: (1) production testing for certification and acceptance, and (2) developmental testing for prototype or experimental purposes. The customer base consists of NASA programs, DOD programs, and commercial programs. Resources in place to perform on-site testing include both civil servants and contractor personnel, hardware and software including data acquisition and control, and 6 test stands with a total of 14 test positions/cells. For several business reasons there is a need to augment understanding of the test costs for all the various types of test campaigns. Historical propulsion test data were evaluated and analyzed in many different ways with the intent to find any correlation or statistics that could help produce more reliable and accurate cost estimates and projections. The analytical efforts included timeline trends, statistical curve fitting, average cost per test, cost per test second, test cost timeline, and test cost envelopes. Further, the analytical effort examined the test cost from the perspective of thrust level and test article characteristics. Some of the analytical approaches did not produce evidence strong enough for further analysis. Other analytical approaches yielded promising results and are candidates for further development and focused study. Information was organized into its elements: a Project Profile, a Test Cost Timeline, and a Cost Envelope. The Project Profile is a snapshot of the project life cycle in timeline fashion, which includes various statistical analyses. The Test Cost Timeline shows the cumulative average test cost, for each project, at each month where there was test activity. The Test Cost Envelope shows a range of cost for a given number of test(s). The supporting information upon which this study was performed came from diverse sources and thus it was necessary to

  1. Analytical quality by design: a tool for regulatory flexibility and robust analytics.

    Science.gov (United States)

    Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for quality by design (QbD) based analytical approaches. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results owing to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists on the implementation of AQbD in the pharmaceutical quality system and also relates it to product quality by design and process analytical technology (PAT).

  2. Challenges of privacy protection in big data analytics

    DEFF Research Database (Denmark)

    Jensen, Meiko

    2013-01-01

    The big data paradigm implies that almost every type of information eventually can be derived from sufficiently large datasets. However, in such terms, linkage of personal data of individuals poses a severe threat to privacy and civil rights. In this position paper, we propose a set of challenges that have to be addressed in order to perform big data analytics in a privacy-compliant way.

  3. Quantifying uncertainty in nuclear analytical measurements

    International Nuclear Information System (INIS)

    2004-07-01

    The lack of international consensus on the expression of uncertainty in measurements was recognised by the late 1970s and led, after the issuance of a series of rather generic recommendations, to the publication of a general guide, known as GUM, the Guide to the Expression of Uncertainty in Measurement. This publication, issued in 1993, was based on co-operation over several years by the Bureau International des Poids et Mesures, the International Electrotechnical Commission, the International Federation of Clinical Chemistry, the International Organization for Standardization (ISO), the International Union of Pure and Applied Chemistry, the International Union of Pure and Applied Physics and the Organisation internationale de metrologie legale. The purpose was to promote full information on how uncertainty statements are arrived at and to provide a basis for harmonized reporting and the international comparison of measurement results. The need to provide more specific guidance to different measurement disciplines was soon recognized, and the field of analytical chemistry was addressed by EURACHEM in 1995 in the first edition of a guidance report on Quantifying Uncertainty in Analytical Measurements, produced by a group of experts from the field. That publication translated the general concepts of the GUM into specific applications for analytical laboratories and illustrated the principles with a series of selected examples as a didactic tool. Based on feedback from actual practice, the EURACHEM publication was extensively reviewed in 1997-1999 under the auspices of the Co-operation on International Traceability in Analytical Chemistry (CITAC), and a second edition was published in 2000. Still, except for a single example on the measurement of radioactivity in GUM, the field of nuclear and radiochemical measurements was not covered. The explicit requirement of ISO standard 17025:1999, General Requirements for the Competence of Testing and Calibration Laboratories...
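
    The central prescription of the GUM, which the EURACHEM guidance translates into laboratory practice, is the law of propagation of uncertainty; for a result y = f(x1, …, xN) with uncorrelated inputs it reads:

    ```latex
    % GUM law of propagation of uncertainty (uncorrelated input quantities);
    % the expanded uncertainty is U = k\,u_c(y), typically with k = 2.
    u_c^2(y) = \sum_{i=1}^{N} \left(\frac{\partial f}{\partial x_i}\right)^{\!2} u^2(x_i)
    ```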

  4. Almera Proficiency Test Determination of Naturally Occurring Radionuclides in Phosphogypsum and Water

    International Nuclear Information System (INIS)

    2010-01-01

    Phosphogypsum is generated as a by-product of the phosphoric acid based fertilizer industry. The discharge of phosphogypsum on earth surface deposits is a potential source of enhanced natural radiation and heavy metals, and the resulting environmental impact should be considered carefully to ensure safety and compliance with environmental regulations. A reliable determination of technologically enhanced naturally occurring radioactive materials in phosphogypsum is necessary to comply with the radiation protection and environmental regulations. This proficiency test (PT) is one of the series of ALMERA network proficiency tests organised on a regular basis by the Chemistry Unit of the IAEA Terrestrial Environment Laboratory. These proficiency tests are designed to identify analytical problems, to support Member State laboratories in maintaining their preparedness and to provide rapid and reliable analytical results. In this PT, the test item set consisted of six samples: one phosphogypsum (the IAEA-434 reference material) and five water samples spiked with natural radionuclides. The main task of the participating laboratories was to identify and quantify the activity levels of radionuclides present in these matrices. The tasks of the IAEA were to prepare and distribute the samples to the participating laboratories, to collect and interpret analysis results and to compile a comprehensive report. The certified massic activity values of all radionuclides used in this PT fulfilled the requirements of metrological traceability to international standards of radioactivity. In this PT, 306 test items (reference materials) were prepared and distributed to 52 participants from 40 countries in November 2008. The deadline for receiving the results from the participants was set to 15 May 2009. For gross alpha/beta results the deadline was one working day from the date of sample delivery. The participating laboratories were requested to analyse Ra-226, U-234 and U-238 in water

  5. Pressure test behaviour of embalse nuclear power plant containment structure

    International Nuclear Information System (INIS)

    Bruschi, S.; Marinelli, C.

    1984-01-01

    The structural behaviour of the containment structure of the Embalse plant (CANDU type, 600 MW), made of prestressed concrete with an epoxy liner, during the pressure test is described. Displacement, strain, temperature, and pressure measurements of the containment structure of the Embalse Nuclear Power Plant are presented. The instrumentation set-up and measurement specifications are described for all variables of interest before, during and after the pressure test. The analytical models used to simulate the heat transfer due to solar heating and air convection and to predict the associated thermal strains and displacements are presented. (E.G.) [pt]

  6. Testing and Analysis of Sensor Ports

    Science.gov (United States)

    Zhang, M.; Frendi, A.; Thompson, W.; Casiano, M. J.

    2016-01-01

    This Technical Publication summarizes work focused on the testing and analysis of sensor ports. The tasks under this contract were divided into three areas: (1) development of an analytical model, (2) conducting a set of experiments, and (3) obtaining computational solutions. Experimental results for both short and long sensor ports were obtained using harmonic, random, and frequency-sweep plane acoustic waves. An amplification factor of the pressure signal between the port inlet and the back of the port was obtained and compared to models. Comparisons of model and experimental results showed very good agreement.
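
    A common first-cut picture of why such a port amplifies the pressure signal treats it as a gas-filled tube closed at the transducer end. The quarter-wave estimate below is textbook acoustics offered for orientation; it is not necessarily the analytical model developed in the report:

    ```latex
    % Quarter-wave resonances of a port of length L closed at the sensor end
    % (c is the speed of sound in the port gas); near f_1 the pressure at the
    % back of the port is amplified relative to the port inlet.
    f_n = \frac{(2n-1)\,c}{4L}, \qquad n = 1, 2, 3, \dots
    ```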

  7. On the Use of Accelerated Test Methods for Characterization of Advanced Composite Materials

    Science.gov (United States)

    Gates, Thomas S.

    2003-01-01

    A rational approach to the problem of accelerated testing for material characterization of advanced polymer matrix composites is discussed. The experimental and analytical methods provided should be viewed as a set of tools useful in the screening of material systems for long-term engineering properties in aerospace applications. Consideration is given to long-term exposure in extreme environments that include elevated temperature, reduced temperature, moisture, oxygen, and mechanical load. Analytical formulations useful for predictive models, based on the principles of time-based superposition, are presented. The need for reproducible mechanisms, indicator properties, and real-time data is outlined, as are the methodologies for determining specific aging mechanisms.
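
    Time-based superposition of the kind cited above is usually written in terms of a temperature shift factor a_T that maps time t at temperature T onto reduced time t/a_T at a reference temperature T_r. The WLF equation is the standard empirical form, shown here for orientation; the paper's exact formulation may differ:

    ```latex
    % WLF form of the time-temperature shift factor (C_1, C_2 are material
    % constants fitted about the reference temperature T_r):
    \log a_T = \frac{-C_1\,(T - T_r)}{C_2 + (T - T_r)}
    ```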

  8. Decentral gene expression analysis: analytical validation of the Endopredict genomic multianalyte breast cancer prognosis test

    Directory of Open Access Journals (Sweden)

    Kronenwett Ralf

    2012-10-01

    Background: EndoPredict (EP) is a clinically validated multianalyte gene expression test to predict distant metastasis in ER-positive, HER2-negative breast cancer treated with endocrine therapy alone. The test is based on the combined analysis of 12 genes in formalin-fixed, paraffin-embedded (FFPE) tissue by reverse transcription-quantitative real-time PCR (RT-qPCR). Recently, it was shown that EP is feasible for reliable decentralized assessment of gene expression. The aim of this study was the analytical validation of the performance characteristics of the assay and its verification in a molecular-pathological routine laboratory. Methods: Gene expression values to calculate the EP score were assayed by one-step RT-qPCR using RNA from FFPE tumor tissue. Limit of blank, limit of detection, linear range, and PCR efficiency were assessed for each of the 12 PCR assays using serial sample dilutions. Different breast cancer samples were used to evaluate RNA input range, precision and inter-laboratory variability. Results: PCR assays were linear up to Cq values between 35.1 and 37.2. Amplification efficiencies ranged from 75% to 101%. The RNA input range without considerable change of the EP score was between 0.16 and 18.5 ng/μl. Analysis of precision (variation of day, daytime, instrument, operator, reagent lots) resulted in a total noise (standard deviation) of 0.16 EP score units on a scale from 0 to 15. The major part of the total noise (SD 0.14) was caused by the replicate-to-replicate noise of the PCR assays (repeatability) and was not associated with different operating conditions (reproducibility). Performance characteristics established in the manufacturer's laboratory were verified in a routine molecular pathology laboratory. Comparison of 10 tumor samples analyzed in two different laboratories showed a Pearson coefficient of 0.995 and a mean deviation of 0.15 score units. Conclusions: The EP test showed reproducible performance...

  9. Decentral gene expression analysis: analytical validation of the Endopredict genomic multianalyte breast cancer prognosis test

    International Nuclear Information System (INIS)

    Kronenwett, Ralf; Brase, Jan C; Weber, Karsten E; Fisch, Karin; Müller, Berit M; Schmidt, Marcus; Filipits, Martin; Dubsky, Peter; Petry, Christoph; Dietel, Manfred; Denkert, Carsten; Bohmann, Kerstin; Prinzler, Judith; Sinn, Bruno V; Haufe, Franziska; Roth, Claudia; Averdick, Manuela; Ropers, Tanja; Windbergs, Claudia

    2012-01-01

    EndoPredict (EP) is a clinically validated multianalyte gene expression test to predict distant metastasis in ER-positive, HER2-negative breast cancer treated with endocrine therapy alone. The test is based on the combined analysis of 12 genes in formalin-fixed, paraffin-embedded (FFPE) tissue by reverse transcription-quantitative real-time PCR (RT-qPCR). Recently, it was shown that EP is feasible for reliable decentralized assessment of gene expression. The aim of this study was the analytical validation of the performance characteristics of the assay and its verification in a molecular-pathological routine laboratory. Gene expression values to calculate the EP score were assayed by one-step RT-qPCR using RNA from FFPE tumor tissue. Limit of blank, limit of detection, linear range, and PCR efficiency were assessed for each of the 12 PCR assays using serial sample dilutions. Different breast cancer samples were used to evaluate RNA input range, precision and inter-laboratory variability. PCR assays were linear up to Cq values between 35.1 and 37.2. Amplification efficiencies ranged from 75% to 101%. The RNA input range without considerable change of the EP score was between 0.16 and 18.5 ng/μl. Analysis of precision (variation of day, day time, instrument, operator, reagent lots) resulted in a total noise (standard deviation) of 0.16 EP score units on a scale from 0 to 15. The major part of the total noise (SD 0.14) was caused by the replicate-to-replicate noise of the PCR assays (repeatability) and was not associated with different operating conditions (reproducibility). Performance characteristics established in the manufacturer’s laboratory were verified in a routine molecular pathology laboratory. Comparison of 10 tumor samples analyzed in two different laboratories showed a Pearson coefficient of 0.995 and a mean deviation of 0.15 score units. The EP test showed reproducible performance characteristics with good precision and negligible laboratory...
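
    The amplification efficiencies quoted in both records are conventionally derived from serial-dilution standard curves. The sketch below shows the standard relation with made-up numbers, as a generic illustration rather than data from the study:

    ```python
    import numpy as np

    # Generic qPCR standard-curve efficiency estimate (conventional practice,
    # illustrated with made-up numbers; not data from the EndoPredict study).
    # Cq values are regressed on log10(input); efficiency E = 10^(-1/slope) - 1.
    log10_input = np.array([1.0, 0.0, -1.0, -2.0])   # log10 of relative RNA input
    cq = np.array([22.1, 25.4, 28.8, 32.2])          # hypothetical measured Cq values

    slope, intercept = np.polyfit(log10_input, cq, 1)
    efficiency = 10 ** (-1.0 / slope) - 1.0
    print(f"slope = {slope:.2f}, efficiency = {efficiency:.0%}")  # ~100% for slope near -3.32
    ```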

  10. Analytic trigonometry

    CERN Document Server

    Bruce, William J; Maxwell, E A; Sneddon, I N

    1963-01-01

    Analytic Trigonometry details the fundamental concepts and underlying principle of analytic geometry. The title aims to address the shortcomings in the instruction of trigonometry by considering basic theories of learning and pedagogy. The text first covers the essential elements from elementary algebra, plane geometry, and analytic geometry. Next, the selection tackles the trigonometric functions of angles in general, basic identities, and solutions of equations. The text also deals with the trigonometric functions of real numbers. The fifth chapter details the inverse trigonometric functions

  11. Development and testing of analytical models for the pebble bed type HTRs

    International Nuclear Information System (INIS)

    Huda, M.Q.; Obara, T.

    2008-01-01

    The pebble bed type gas cooled high temperature reactor (HTR) appears to be a good candidate for the next generation nuclear reactor technology. These reactors have unique characteristics in terms of the randomness in geometry, and require special techniques to analyze their systems. This study includes activities concerning the testing of computational tools and the qualification of models. Indeed, it is essential that validated analytical tools be available to the research community. From this viewpoint, codes like MCNP, ORIGEN and RELAP5, which have been used in the nuclear industry for many years, are selected to identify and develop new capabilities needed to support HTR analysis. The geometrical model of the full reactor is obtained by using lattice and universe facilities provided by MCNP. The coupled MCNP-ORIGEN code is used to estimate the burnup and the refuelling scheme. Results obtained from the Monte Carlo analysis are interfaced with RELAP5 to analyze the thermal hydraulics and safety characteristics of the reactor. New models and methodologies are developed for several past and present experimental and prototypical facilities that were based on HTR pebble bed concepts. The calculated results are compared with available experimental data and theoretical evaluations, showing very good agreement. The ultimate goal of the validation of the computer codes for pebble bed HTR applications is to acquire and reinforce the capability of these general purpose computer codes for performing HTR core design and optimization studies.

  12. Set up of analytical methods for evaluation of specifications of recombinant Hepatitis-B vaccine

    Directory of Open Access Journals (Sweden)

    Daram M

    2009-06-01

    Full Text Available "nBackground: Hepatitis B vaccination has been included in routine immunization of all individuals according to WHO recommendations since 1991. Despite successful coverage, 3-5% of recipients fail to mount a desirable protection level of Ab. Vaccine failure results from: emergence of mutation, immune failure of individuals, decrease in vaccine potency, and etc. The quality of Hepatitis B vaccine should be evaluated by a reliable method. "n"nMethods: The amount of vaccine antigen was measured through the in vitro assay of Hepatitis B vaccines which consists of multiple dilutions of the reference material and samples. The preparations were evaluated by Elisa to determine the amount of HBsAg. The data were analyzed by parallel-line analysis software. The in vivo assay was performed by inoculating multiple doses of the reference and sample preparations in Balb/c mice. A control group was also inoculated with vaccine matrix. Four weeks later, the mice sera were evaluated to determine the presence of antibodies against Hepatitis B by Elisa method. The data were analyzed by Probit analysis software. "n"nResults: Both methods were set up in our laboratory by which different batches of Hepatitis B vaccine were evaluated. It was observed that In vivo and In vitro methods provide comparable results. Therefore we can use the in vitro method for routine testing of HB vaccine quality control. "n"nConclusion: In vitro method can be used in place of In vivo method because of its time and cost-effectiveness. Moreover, since no animals are used in in vitro method, it complies well with the 3R concept (Reduction, Refinement, and Replacement of animal testing and the current tendency to use alternative method.

  13. Making advanced analytics work for you.

    Science.gov (United States)

    Barton, Dominic; Court, David

    2012-10-01

    Senior leaders who write off the move toward big data as a lot of big talk are making, well, a big mistake. So argue McKinsey's Barton and Court, who worked with dozens of companies to figure out how to translate advanced analytics into nuts-and-bolts practices that affect daily operations on the front lines. The authors offer a useful guide for leaders and managers who want to take a deliberative approach to big data-but who also want to get started now. First, companies must identify the right data for their business, seek to acquire the information creatively from diverse sources, and secure the necessary IT support. Second, they need to build analytics models that are tightly focused on improving performance, making the models only as complex as business goals demand. Third, and most important, companies must transform their capabilities and culture so that the analytical results can be implemented from the C-suite to the front lines. That means developing simple tools that everyone in the organization can understand and teaching people why the data really matter. Embracing big data is as much about changing mind-sets as it is about crunching numbers. Executed with the right care and flexibility, this cultural shift could have payoffs that are, well, bigger than you expect.

  14. Test cases for interface tracking methods: methodology and current status

    International Nuclear Information System (INIS)

    Lebaigue, O.; Jamet, D.; Lemonnier, E.

    2004-01-01

    Full text of publication follows: In the past decade, a large number of new methods have been developed to deal with interfaces in the numerical simulation of two-phase flows. We have collected a set of 36 test cases, which can be seen as a tool to help engineers and researchers select the most appropriate method(s) for their specific fields of application. This set can be used: - To perform an initial evaluation of the capabilities of available methods with regard to the specificity of the final application and the most important features to be recovered from the simulation. - To measure the maximum mesh size to be used for a given physical problem in order to obtain an accurate enough solution. - To assess and quantify the performances of a selected method equipped with its set of physical models. The computation of a well-documented test case allows estimating the error due to the numerical technique by comparison with reference solutions. This process is compulsory to gain confidence and credibility in the prediction capabilities of a numerical method and its physical models. - To broaden the capabilities of a given numerical technique. The test cases may be used to identify the need for improvement of the overall numerical scheme or to determine the physical part of the model which is responsible for the observed limitations. Each test case falls within one of the following categories: - Analytical solutions of well-known sets of equations corresponding to simple geometrical situations. - Reference numerical solutions of moderately complex problems, produced by accurate methods (e.g., boundary-fitted coordinate methods) on refined meshes. - Separate-effects analytical experiments. The presentation will suggest how to use the test cases for assessing the physical models and the numerical methods. The expected fallout of using test cases is indeed, on the one hand, to identify the merits of existing methods and, on the other hand, to orient further research towards

  15. On Bayesian Testing of Additive Conjoint Measurement Axioms Using Synthetic Likelihood.

    Science.gov (United States)

    Karabatsos, George

    2018-06-01

    This article introduces a Bayesian method for testing the axioms of additive conjoint measurement. The method is based on an importance sampling algorithm that performs likelihood-free, approximate Bayesian inference using a synthetic likelihood to overcome the analytical intractability of this testing problem. This new method improves upon previous methods because it provides an omnibus test of the entire hierarchy of cancellation axioms, beyond double cancellation. It does so while accounting for the posterior uncertainty that is inherent in the empirical orderings implied by these axioms taken together. The new method is illustrated through a test of the cancellation axioms on a classic survey data set and through the analysis of simulated data.
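
    The synthetic likelihood referred to above replaces the intractable likelihood with a Gaussian density over summary statistics estimated from model simulations. The generic Wood-style construction is sketched below; the article embeds this device in an importance sampling scheme:

    ```latex
    % Generic synthetic likelihood: simulate y_1,\dots,y_R at parameter \theta,
    % form summaries s_r = S(y_r), and estimate
    \hat{\mu}_\theta = \tfrac{1}{R}\sum_{r=1}^{R} s_r, \qquad
    \hat{\Sigma}_\theta = \tfrac{1}{R-1}\sum_{r=1}^{R}
        (s_r - \hat{\mu}_\theta)(s_r - \hat{\mu}_\theta)^{\top};
    % the likelihood of the observed summary s_{obs} is then replaced by
    \hat{L}_s(\theta) = \mathcal{N}\!\left(s_{\mathrm{obs}};\,
        \hat{\mu}_\theta,\, \hat{\Sigma}_\theta\right).
    ```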

  16. Realising point of care testing

    International Nuclear Information System (INIS)

    Braybrook, J

    2009-01-01

    Efforts to move molecular diagnostic technologies out of a centralised lab setting and closer to the patient have proved problematic. Early diagnosis of disease is often dependent upon detection of trace amounts of a molecular marker in a complex background. This challenging analytical scenario is compounded when testing is done in a rapid manner using miniaturized and portable instruments. Metrology will be fundamental to delivering high-quality and reliable clinical data with measurable sensitivity and robustness. Quality of the sample, integrity of the analyser, and ease of use, together with incorporation of appropriate QC standards and demonstration of 'fitness for purpose', will be key challenges.

  17. Synergistic relationships between Analytical Chemistry and written standards

    International Nuclear Information System (INIS)

    Valcárcel, Miguel; Lucena, Rafael

    2013-01-01

    Graphical abstract: -- Highlights: •Analytical Chemistry is influenced by international written standards. •Different relationships can be established between them. •Synergies can be generated when these standards are conveniently managed. -- Abstract: This paper describes the mutual impact of Analytical Chemistry and several international written standards (norms and guides) related to knowledge management (CEN-CWA 14924:2004), social responsibility (ISO 26000:2010), management of occupational health and safety (OHSAS 18001/2), environmental management (ISO 14001:2004), quality management systems (ISO 9001:2008) and requirements of the competence of testing and calibration laboratories (ISO 17025:2004). The intensity of this impact, based on a two-way influence, is quite different depending on the standard considered. In any case, a new and fruitful approach to Analytical Chemistry based on these relationships can be derived

  18. Synergistic relationships between Analytical Chemistry and written standards

    Energy Technology Data Exchange (ETDEWEB)

    Valcárcel, Miguel, E-mail: qa1vacam@uco.es; Lucena, Rafael

    2013-07-25

    Graphical abstract: -- Highlights: •Analytical Chemistry is influenced by international written standards. •Different relationships can be established between them. •Synergies can be generated when these standards are conveniently managed. -- Abstract: This paper describes the mutual impact of Analytical Chemistry and several international written standards (norms and guides) related to knowledge management (CEN-CWA 14924:2004), social responsibility (ISO 26000:2010), management of occupational health and safety (OHSAS 18001/2), environmental management (ISO 14001:2004), quality management systems (ISO 9001:2008) and requirements of the competence of testing and calibration laboratories (ISO 17025:2004). The intensity of this impact, based on a two-way influence, is quite different depending on the standard considered. In any case, a new and fruitful approach to Analytical Chemistry based on these relationships can be derived.

  19. Mars Analytical Microimager

    Science.gov (United States)

    Batory, Krzysztof J.; Govindjee; Andersen, Dale; Presley, John; Lucas, John M.; Sears, S. Kelly; Vali, Hojatollah

    Unambiguous detection of extraterrestrial nitrogenous hydrocarbon microbiology requires an instrument both to recognize potential biogenic specimens and to successfully discriminate them from geochemical settings. Such detection should ideally be in-situ and not jeopardize other experiments by altering samples. Taken individually most biomarkers are inconclusive. For example, since amino acids can be synthesized abiotically they are not always considered reliable biomarkers. An enantiomeric imbalance, which is characteristic of all terrestrial life, may be questioned because chirality can also be altered abiotically. However, current scientific understanding holds that aggregates of identical proteins or proteinaceous complexes, with their well-defined amino acid residue sequences, are indisputable biomarkers. Our paper describes the Mars Analytical Microimager, an instrument for the simultaneous imaging of generic autofluorescent biomarkers and overall morphology. Autofluorescence from ultraviolet to near-infrared is emitted by all known terrestrial biology, and often as consistent complex bands uncharacteristic of abiotic mineral luminescence. The MAM acquires morphology, and even sub-micron morphogenesis, at a 3-centimeter working distance with resolution approaching a laser scanning microscope. Luminescence is simultaneously collected via a 2.5-micron aperture, thereby permitting accurate correlation of multi-dimensional optical behavior with specimen morphology. A variable wavelength excitation source and photospectrometer serve to obtain steady-state and excitation spectra of biotic and luminescent abiotic sources. We believe this is the first time instrumentation for detecting hydrated or desiccated microbiology non-destructively in-situ has been demonstrated. We have obtained excellent preliminary detection of biota and inorganic matrix discrimination from terrestrial polar analogues, and perimetric morphology of individual magnetotactic bacteria. Proposed

  20. Cost effectiveness of ovarian reserve testing in in vitro fertilization: a Markov decision-analytic model.

    Science.gov (United States)

    Moolenaar, Lobke M; Broekmans, Frank J M; van Disseldorp, Jeroen; Fauser, Bart C J M; Eijkemans, Marinus J C; Hompes, Peter G A; van der Veen, Fulco; Mol, Ben Willem J

    2011-10-01

    To compare the cost effectiveness of ovarian reserve testing in in vitro fertilization (IVF). A Markov decision model based on data from the literature and original patient data. Decision analytic framework. Computer-simulated cohort of subfertile women aged 20 to 45 years who are eligible for IVF. [1] No treatment, [2] up to three cycles of IVF limited to women under 41 years and no ovarian reserve testing, [3] up to three cycles of IVF with dose individualization of gonadotropins according to ovarian reserve, and [4] up to three cycles of IVF with ovarian reserve testing and exclusion of expected poor responders after the first cycle, with the no-treatment scenario as the reference scenario. Cumulative live birth over 1 year, total costs, and incremental cost-effectiveness ratios. The cumulative live birth rate was 9.0% in the no-treatment scenario, 54.8% for scenario 2, 70.6% for scenario 3 and 51.9% for scenario 4. Absolute costs per woman for these scenarios were €0, €6,917, €6,678, and €5,892 for scenarios 1, 2, 3, and 4, respectively. Incremental cost-effectiveness ratios (ICER) for scenarios 2, 3, and 4 were €15,166, €10,837, and €13,743 per additional live birth. Sensitivity analysis showed the model to be robust over a wide range of values. Individualization of the follicle-stimulating hormone dose according to ovarian reserve is likely to be cost effective in women who are eligible for IVF, but this effectiveness needs to be confirmed in randomized clinical trials. Copyright © 2011 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.
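
    The incremental cost-effectiveness ratios above follow the standard definition against the no-treatment reference; a rough arithmetic check with the quoted figures confirms the reported values up to rounding:

    ```latex
    % ICER of scenario i against the no-treatment reference (scenario 1):
    \mathrm{ICER}_i = \frac{C_i - C_1}{E_i - E_1}
    % e.g. scenario 3: (6678 - 0)/(0.706 - 0.090) \approx 10\,841 euro per
    % additional live birth, consistent with the reported 10,837.
    ```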

  1. A Kolmogorov-Smirnov Based Test for Comparing the Predictive Accuracy of Two Sets of Forecasts

    Directory of Open Access Journals (Sweden)

    Hossein Hassani

    2015-08-01

    This paper introduces a complementary statistical test for distinguishing between the predictive accuracy of two sets of forecasts. We propose a non-parametric test founded upon the principles of the Kolmogorov-Smirnov (KS) test, referred to as the KS Predictive Accuracy (KSPA) test. The KSPA test serves two distinct purposes. First, it seeks to determine whether there exists a statistically significant difference between the distributions of the forecast errors; second, it exploits the principles of stochastic dominance to determine whether the forecasts with the lower error also report a stochastically smaller error than forecasts from a competing model, thereby enabling a distinction between the predictive accuracy of the forecasts. We perform a simulation study of the size and power of the proposed test and report the results for different noise distributions, sample sizes and forecasting horizons. The simulation results indicate that the KSPA test is correctly sized and robust in the face of varying forecasting horizons and sample sizes, with significant accuracy gains reported especially in the case of small sample sizes. Real-world applications are also considered to illustrate the applicability of the proposed KSPA test in practice.
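
    The first stage of the KSPA test, comparing the distributions of two sets of forecast errors, can be sketched with SciPy's two-sample KS routine. The errors below are simulated for illustration; this is not the authors' implementation:

    ```python
    import numpy as np
    from scipy import stats

    # Generic two-sample KS comparison of absolute forecast errors, illustrating
    # the first stage of a KSPA-style test with simulated data.
    rng = np.random.default_rng(42)
    errors_model_a = np.abs(rng.normal(0.0, 1.0, size=100))   # |errors| of model A
    errors_model_b = np.abs(rng.normal(0.0, 1.3, size=100))   # |errors| of model B

    stat, p_value = stats.ks_2samp(errors_model_a, errors_model_b)
    print(f"KS statistic = {stat:.3f}, p = {p_value:.4f}")
    # A small p suggests the two error distributions differ; stochastic-dominance
    # checks (the KSPA second stage) would then ask which model's errors are smaller.
    ```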

  2. Analytical Chemistry Core Capability Assessment - Preliminary Report

    International Nuclear Information System (INIS)

    Barr, Mary E.; Farish, Thomas J.

    2012-01-01

    The concept of 'core capability' can be a nebulous one. Even at a fairly specific level, where core capability equals maintaining essential services, it is highly dependent upon the perspective of the requester. Samples are submitted to analytical services because the requesters do not have the capability to conduct adequate analyses themselves. Some requests are for general chemical information in support of R and D, process control, or process improvement. Many analyses, however, are part of a product certification package and must comply with higher-level customer quality assurance requirements. So which services are essential to that customer - just those for product certification? Does the customer also (indirectly) need services that support process control and improvement? And what is the timeframe? Capability is often expressed in terms of the currently utilized procedures, and most programmatic customers can only plan a few years out, at best. But should core capability consider the long term, where new technologies, aging facilities, and personnel replacements must be considered? These questions, and a multitude of others, explain why attempts to gain long-term consensus on the definition of core capability have consistently failed. This preliminary report will not try to define core capability for any specific program or set of programs. Instead, it will try to address the underlying concerns that drive the desire to determine core capability. Essentially, programmatic customers want to be able to call upon analytical chemistry services to provide all the assays they need, and they don't want to pay for analytical chemistry services they don't currently use (or use infrequently). This report will focus on explaining how the current analytical capabilities and methods evolved to serve a variety of needs, with a focus on why some analytes have multiple analytical techniques, and what determines the infrastructure for these analyses. This information will be

  3. Analytical, Practical and Emotional Intelligence and Line Manager Competencies

    Directory of Open Access Journals (Sweden)

    Anna Baczyńska

    2015-12-01

    Purpose: The research objective was to examine to what extent line manager competencies are linked to intelligence, and more specifically, three types of intelligence: analytical (fluid), practical and emotional. Methodology: The research was carried out with line managers (N=98) who took part in 12 Assessment Centre sessions and completed tests measuring analytical, practical and emotional intelligence. The adopted hypotheses were tested using multiple regression. In the regression model, the dependent variable was a managerial competency (management and striving for results, social skills, openness to change, problem solving, employee development) and the explanatory variables were the three types of intelligence. Five models, each for a separate management competency, were tested in this way. Findings: In the study, it was hypothesized that practical intelligence relates to procedural tacit knowledge and is the strongest indicator of managerial competency. Analysis of the study results testing this hypothesis indicated that practical intelligence largely accounts for the level of competency used in managerial work (from 21% to 38%). The study findings suggest that practical intelligence is a better indicator of managerial competencies among line managers than traditionally measured IQ or emotional intelligence. Originality: This research fills an important gap in the literature on the subject, indicating the links between major contemporary selection indicators (i.e., analytical, practical and emotional intelligence) and managerial competencies presented in realistic work simulations measured using the Assessment Centre process.
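
    The analytic design described above, one regression per competency with the three intelligence scores as predictors, can be sketched as follows. The data are synthetic and the variable names are invented for illustration; this is not the authors' analysis code.

    ```python
    # One OLS model per managerial competency, with analytical, practical and
    # emotional intelligence as explanatory variables (synthetic data).
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 98
    df = pd.DataFrame({
        "analytical": rng.normal(size=n),
        "practical":  rng.normal(size=n),
        "emotional":  rng.normal(size=n),
    })
    competencies = ["management", "social_skills", "openness",
                    "problem_solving", "employee_development"]
    for c in competencies:
        # synthetic outcomes loaded mostly on practical intelligence
        df[c] = 0.5 * df["practical"] + 0.2 * df["analytical"] + rng.normal(size=n)

    for c in competencies:
        model = smf.ols(f"{c} ~ analytical + practical + emotional", data=df).fit()
        print(f"{c}: R^2 = {model.rsquared:.2f}")
    ```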

  4. Maritime Analytics Prototype: Phase 3 Validation

    Science.gov (United States)

    2014-01-01

    [Abstract not available. The indexed excerpt contains only fragments of user-feedback items from the validation sessions (e.g., a request for a flexible analysis-set hierarchy encoded as directories or groups, GUI improvements, and problems zooming and panning on the timeline) together with reference-list entries, including J. Thomas and K. Cook, Illuminating the Path: The Research and Development Agenda for Visual Analytics, IEEE, 2005.]

  5. Addressing the Analytic Challenges of Cross-Sectional Pediatric Pneumonia Etiology Data.

    Science.gov (United States)

    Hammitt, Laura L; Feikin, Daniel R; Scott, J Anthony G; Zeger, Scott L; Murdoch, David R; O'Brien, Katherine L; Deloria Knoll, Maria

    2017-06-15

    Despite tremendous advances in diagnostic laboratory technology, identifying the pathogen(s) causing pneumonia remains challenging because the infected lung tissue cannot usually be sampled for testing. Consequently, to obtain information about pneumonia etiology, clinicians and researchers test specimens distant to the site of infection. These tests may lack sensitivity (eg, blood culture, which is only positive in a small proportion of children with pneumonia) and/or specificity (eg, detection of pathogens in upper respiratory tract specimens, which may indicate asymptomatic carriage or a less severe syndrome, such as upper respiratory infection). While highly sensitive nucleic acid detection methods and testing of multiple specimens improve sensitivity, multiple pathogens are often detected and this adds complexity to the interpretation as the etiologic significance of results may be unclear (ie, the pneumonia may be caused by none, one, some, or all of the pathogens detected). Some of these challenges can be addressed by adjusting positivity rates to account for poor sensitivity or incorporating test results from controls without pneumonia to account for poor specificity. However, no classical analytic methods can account for measurement error (ie, sensitivity and specificity) for multiple specimen types and integrate the results of measurements for multiple pathogens to produce an accurate understanding of etiology. We describe the major analytic challenges in determining pneumonia etiology and review how the common analytical approaches (eg, descriptive, case-control, attributable fraction, latent class analysis) address some but not all challenges. We demonstrate how these limitations necessitate a new, integrated analytical approach to pneumonia etiology data. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America.

  6. Analytical protocols for characterisation of sulphur-free lignin

    NARCIS (Netherlands)

    Gosselink, R.J.A.; Abächerli, A.; Semke, H.; Malherbe, R.; Käuper, P.; Nadif, A.; Dam, van J.E.G.

    2004-01-01

    Interlaboratory tests for chemical characterisation of sulphur-free lignins were performed by five laboratories to develop useful analytical protocols, which are lacking, and identify quality-related properties. Protocols have been established for reproducible determination of the chemical

  7. Analytic geometry

    CERN Document Server

    Burdette, A C

    1971-01-01

    Analytic Geometry covers several fundamental aspects of analytic geometry needed for advanced subjects, including calculus. This book is composed of 12 chapters that review the principles, concepts, and analytic proofs of geometric theorems, families of lines, the normal equation of the line, and related matters. Other chapters highlight the application of graphing, foci, directrices, eccentricity, and conic-related topics. The remaining chapters deal with the concepts of polar and rectangular coordinates, surfaces and curves, and planes. This book will prove useful to undergraduate trigonometry students.

  8. Accuracy of urine circulating cathodic antigen (CCA) test for Schistosoma mansoni diagnosis in different settings of Côte d'Ivoire.

    Directory of Open Access Journals (Sweden)

    Jean T Coulibaly

    2011-11-01

    BACKGROUND: Promising results have been reported for a urine circulating cathodic antigen (CCA) test for the diagnosis of Schistosoma mansoni. We assessed the accuracy of a commercially available CCA cassette test (designated CCA-A) and an experimental formulation (CCA-B) for S. mansoni diagnosis. METHODOLOGY: We conducted a cross-sectional survey in three settings of Côte d'Ivoire: settings A and B are endemic for S. mansoni, whereas S. haematobium co-exists in setting C. Overall, 446 children, aged 8-12 years, submitted multiple stool and urine samples. For S. mansoni diagnosis, stool samples were examined with triplicate Kato-Katz, whereas urine samples were tested with CCA-A. The first stool and urine samples were additionally subjected to an ether-concentration technique and CCA-B, respectively. Urine samples were examined for S. haematobium using a filtration method, and for microhematuria using Hemastix dipsticks. PRINCIPAL FINDINGS: Considering nine Kato-Katz thick smears as the diagnostic 'gold' standard, the prevalence of S. mansoni in settings A, B and C was 32.9%, 53.1% and 91.8%, respectively. The sensitivity of triplicate Kato-Katz from the first stool and a single CCA-A test was 47.9% and 56.3% (setting A), 73.9% and 69.6% (setting B), and 94.2% and 89.6% (setting C), respectively. The respective sensitivity of a single CCA-B was 10.4%, 29.9% and 75.0%. The ether-concentration technique showed a low sensitivity for S. mansoni diagnosis (8.3-41.0%). The specificity of CCA-A was moderate (76.9-84.2%); that of CCA-B was high (96.7-100%). The likelihood of a CCA-A color reaction increased with higher S. mansoni fecal egg counts (odds ratio: 1.07, p<0.001). A concurrent S. haematobium infection or the presence of microhematuria did not influence the CCA-A test results for S. mansoni diagnosis. CONCLUSION/SIGNIFICANCE: CCA-A showed sensitivity similar to triplicate Kato-Katz for S. mansoni diagnosis, with no cross-reactivity to S. haematobium and microhematuria. The low sensitivity
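
    For readers unfamiliar with the accuracy measures quoted above, the sketch below shows how sensitivity and specificity fall out of a 2x2 comparison of an index test against a reference ('gold') standard. The counts are invented for illustration, not taken from the study.

    ```python
    # Sensitivity/specificity from a 2x2 table of index test vs gold standard.
    def sensitivity_specificity(tp, fn, tn, fp):
        sensitivity = tp / (tp + fn)   # true positives correctly detected
        specificity = tn / (tn + fp)   # true negatives correctly ruled out
        return sensitivity, specificity

    # hypothetical counts: 110 infected (90 detected), 95 uninfected (80 negative)
    sens, spec = sensitivity_specificity(tp=90, fn=20, tn=80, fp=15)
    print(f"sensitivity = {sens:.1%}, specificity = {spec:.1%}")
    ```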

  9. Multiplicity distributions of gluon and quark jets and tests of QCD analytic predictions

    CERN Document Server

    Ackerstaff, K; Allison, J; Altekamp, N; Anderson, K J; Anderson, S; Arcelli, S; Asai, S; Axen, D A; Azuelos, Georges; Ball, A H; Barberio, E; Barlow, R J; Bartoldus, R; Batley, J Richard; Baumann, S; Bechtluft, J; Beeston, C; Behnke, T; Bell, A N; Bell, K W; Bella, G; Bentvelsen, Stanislaus Cornelius Maria; Bethke, Siegfried; Biebel, O; Biguzzi, A; Bird, S D; Blobel, Volker; Bloodworth, Ian J; Bloomer, J E; Bobinski, M; Bock, P; Bonacorsi, D; Boutemeur, M; Bouwens, B T; Braibant, S; Brigliadori, L; Brown, R M; Burckhart, Helfried J; Burgard, C; Bürgin, R; Capiluppi, P; Carnegie, R K; Carter, A A; Carter, J R; Chang, C Y; Charlton, D G; Chrisman, D; Clarke, P E L; Cohen, I; Conboy, J E; Cooke, O C; Cuffiani, M; Dado, S; Dallapiccola, C; Dallavalle, G M; Davis, R; De Jong, S; del Pozo, L A; Desch, Klaus; Dienes, B; Dixit, M S; do Couto e Silva, E; Doucet, M; Duchovni, E; Duckeck, G; Duerdoth, I P; Eatough, D; Edwards, J E G; Estabrooks, P G; Evans, H G; Evans, M; Fabbri, Franco Luigi; Fanti, M; Faust, A A; Fiedler, F; Fierro, M; Fischer, H M; Fleck, I; Folman, R; Fong, D G; Foucher, M; Fürtjes, A; Futyan, D I; Gagnon, P; Gary, J W; Gascon, J; Gascon-Shotkin, S M; Geddes, N I; Geich-Gimbel, C; Geralis, T; Giacomelli, G; Giacomelli, P; Giacomelli, R; Gibson, V; Gibson, W R; Gingrich, D M; Glenzinski, D A; Goldberg, J; Goodrick, M J; Gorn, W; Grandi, C; Gross, E; Grunhaus, Jacob; Gruwé, M; Hajdu, C; Hanson, G G; Hansroul, M; Hapke, M; Hargrove, C K; Hart, P A; Hartmann, C; Hauschild, M; Hawkes, C M; Hawkings, R; Hemingway, Richard J; Herndon, M; Herten, G; Heuer, R D; Hildreth, M D; Hill, J C; Hillier, S J; Hobson, P R; Homer, R James; Honma, A K; Horváth, D; Hossain, K R; Howard, R; Hüntemeyer, P; Hutchcroft, D E; Igo-Kemenes, P; Imrie, D C; Ingram, M R; Ishii, K; Jawahery, A; Jeffreys, P W; Jeremie, H; Jimack, Martin Paul; Joly, A; Jones, C R; Jones, G; Jones, M; Jost, U; Jovanovic, P; Junk, T R; Karlen, D A; Kartvelishvili, V G; Kawagoe, K; Kawamoto, T; Kayal, P I; Keeler, Richard K; Kellogg, R G; Kennedy, B W; Kirk, J; Klier, A; Kluth, S; Kobayashi, T; Kobel, M; Koetke, D S; Kokott, T P; Kolrep, M; Komamiya, S; Kress, T; Krieger, P; Von Krogh, J; Kyberd, P; Lafferty, G D; Lahmann, R; Lai, W P; Lanske, D; Lauber, J; Lautenschlager, S R; Layter, J G; Lazic, D; Lee, A M; Lefebvre, E; Lellouch, Daniel; Letts, J; Levinson, L; Lloyd, S L; Loebinger, F K; Long, G D; Losty, Michael J; Ludwig, J; Macchiolo, A; MacPherson, A L; Mannelli, M; Marcellini, S; Markus, C; Martin, A J; Martin, J P; Martínez, G; Mashimo, T; Mättig, P; McDonald, W J; McKenna, J A; McKigney, E A; McMahon, T J; McPherson, R A; Meijers, F; Menke, S; Merritt, F S; Mes, H; Meyer, J; Michelini, Aldo; Mikenberg, G; Miller, D J; Mincer, A; Mir, R; Mohr, W; Montanari, A; Mori, T; Morii, M; Müller, U; Mihara, S; Nagai, K; Nakamura, I; Neal, H A; Nellen, B; Nisius, R; O'Neale, S W; Oakham, F G; Odorici, F; Ögren, H O; Oh, A; Oldershaw, N J; Oreglia, M J; Orito, S; Pálinkás, J; Pásztor, G; Pater, J R; Patrick, G N; Patt, J; Pearce, M J; Pérez-Ochoa, R; Petzold, S; Pfeifenschneider, P; Pilcher, J E; Pinfold, J L; Plane, D E; Poffenberger, P R; Poli, B; Posthaus, A; Rees, D L; Rigby, D; Robertson, S; Robins, S A; Rodning, N L; Roney, J M; Rooke, A M; Ros, E; Rossi, A M; Routenburg, P; Rozen, Y; Runge, K; Runólfsson, O; Ruppel, U; Rust, D R; Rylko, R; Sachs, K; Saeki, T; Sarkisyan-Grinbaum, E; Sbarra, C; Schaile, A D; Schaile, O; Scharf, F; Scharff-Hansen, P; Schenk, P; Schieck, J; Schleper, P; Schmitt, B; Schmitt, S; Schöning, A; 
Schröder, M; Schultz-Coulon, H C; Schumacher, M; Schwick, C; Scott, W G; Shears, T G; Shen, B C; Shepherd-Themistocleous, C H; Sherwood, P; Siroli, G P; Sittler, A; Skillman, A; Skuja, A; Smith, A M; Snow, G A; Sobie, Randall J; Söldner-Rembold, S; Springer, R W; Sproston, M; Stephens, K; Steuerer, J; Stockhausen, B; Stoll, K; Strom, D; Szymanski, P; Tafirout, R; Talbot, S D; Tanaka, S; Taras, P; Tarem, S; Teuscher, R; Thiergen, M; Thomson, M A; Von Törne, E; Towers, S; Trigger, I; Trócsányi, Z L; Tsur, E; Turcot, A S; Turner-Watson, M F; Utzat, P; Van Kooten, R; Verzocchi, M; Vikas, P; Vokurka, E H; Voss, H; Wäckerle, F; Wagner, A; Ward, C P; Ward, D R; Watkins, P M; Watson, A T; Watson, N K; Wells, P S; Wermes, N; White, J S; Wilkens, B; Wilson, G W; Wilson, J A; Wolf, G; Wyatt, T R; Yamashita, S; Yekutieli, G; Zacek, V; Zer-Zion, D

    1999-01-01

    Gluon jets are identified in e+e- hadronic annihilation events by tagging two quark jets in the same hemisphere of an event. The gluon jet is defined inclusively as all the particles in the opposite hemisphere. Gluon jets defined in this manner have a close correspondence to gluon jets as they are defined for analytic calculations, and are almost independent of a jet finding algorithm. The charged particle multiplicity distribution of the gluon jets is presented, and is analyzed for its mean, dispersion, skew, and kurtosis values, and for its factorial and cumulant moments. The results are compared to the analogous results found for a sample of light quark (uds) jets, also defined inclusively. We observe differences between the mean, skew and kurtosis values of gluon and quark jets, but not between their dispersions. The cumulant moment results are compared to the predictions of QCD analytic calculations. A calculation which includes next-to-next-to-leading order corrections and energy conservation is observe...
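
    The shape statistics and factorial moments mentioned above are straightforward to compute from a multiplicity sample. The sketch below uses a toy negative-binomial sample rather than the OPAL data; the distribution parameters are arbitrary.

    ```python
    # Mean, dispersion, skewness, kurtosis and factorial moments of a
    # simulated charged-particle multiplicity distribution.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    n = rng.negative_binomial(10, 0.4, size=100_000)  # toy multiplicity sample

    # q-th (unnormalized) factorial moment: F_q = <n (n-1) ... (n-q+1)>
    def factorial_moment(sample, q):
        prod = np.ones_like(sample, dtype=float)
        for k in range(q):
            prod *= sample - k
        return prod.mean()

    for q in range(1, 5):
        print(f"F_{q} = {factorial_moment(n, q):.3f}")
    print(f"mean={n.mean():.3f}  dispersion={n.std():.3f}  "
          f"skew={stats.skew(n):.3f}  kurtosis={stats.kurtosis(n):.3f}")
    ```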

  10. A GPU code for analytic continuation through a sampling method

    Directory of Open Access Journals (Sweden)

    Johan Nordström

    2016-01-01

    We here present a code for performing analytic continuation of fermionic Green's functions and self-energies as well as bosonic susceptibilities on a graphics processing unit (GPU). The code is based on the sampling method introduced by Mishchenko et al. (2000), and is written for the widely used CUDA platform from NVidia. Detailed scaling tests are presented for two different GPUs in order to highlight the advantages of this code with respect to standard CPU computations. Finally, as an example of possible applications, we provide the analytic continuation of model Gaussian functions, as well as more realistic test cases from many-body physics.

  11. SALE, Quality Control of Analytical Chemical Measurements

    International Nuclear Information System (INIS)

    Bush, W.J.; Gentillon, C.D.

    1985-01-01

    1 - Description of problem or function: The Safeguards Analytical Laboratory Evaluation (SALE) program is a statistical analysis program written to analyze the data received from laboratories participating in the SALE quality control and evaluation program. The system is aimed at identifying and reducing analytical chemical measurement errors. Samples of well-characterized materials are distributed to laboratory participants at periodic intervals for determination of uranium or plutonium concentration and isotopic distributions. The results of these determinations are statistically evaluated and participants are informed of the accuracy and precision of their results. 2 - Method of solution: Various statistical techniques produce the SALE output. Assuming an unbalanced nested design, an analysis of variance is performed, resulting in a test of significance for time and analyst effects. A trend test is performed. Both within-laboratory and between-laboratory standard deviations are calculated. 3 - Restrictions on the complexity of the problem: Up to 1500 pieces of data for each nuclear material sampled by a maximum of 75 laboratories may be analyzed.
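
    The within- and between-laboratory standard deviations that SALE reports can be estimated from a one-way random-effects ANOVA. The sketch below illustrates the classic moment estimators on simulated interlaboratory data; it is not the SALE code itself, and a balanced design is assumed for simplicity.

    ```python
    # Within- and between-laboratory SDs via one-way random-effects ANOVA.
    import numpy as np

    rng = np.random.default_rng(3)
    n_labs, n_rep = 8, 5
    lab_bias = rng.normal(scale=0.4, size=n_labs)            # true lab effects
    data = 10.0 + lab_bias[:, None] + rng.normal(scale=0.2, size=(n_labs, n_rep))

    lab_means = data.mean(axis=1)
    grand_mean = data.mean()

    ms_within = ((data - lab_means[:, None]) ** 2).sum() / (n_labs * (n_rep - 1))
    ms_between = n_rep * ((lab_means - grand_mean) ** 2).sum() / (n_labs - 1)

    s_within = np.sqrt(ms_within)
    s_between = np.sqrt(max((ms_between - ms_within) / n_rep, 0.0))
    print(f"within-lab SD ≈ {s_within:.3f}, between-lab SD ≈ {s_between:.3f}")
    ```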

  12. Understanding Business Analytics

    Science.gov (United States)

    2015-01-05

    [Abstract not available. The indexed excerpt contains only fragments of the report interleaved with its running header, noting that analytics have been used in organizations for a variety of reasons for quite some time, ranging from the simple to the complex, and that how well the components involved are orchestrated will determine the level of success an organization has.]

  13. The Earth Data Analytic Services (EDAS) Framework

    Science.gov (United States)

    Maxwell, T. P.; Duffy, D.

    2017-12-01

    Faced with unprecedented growth in earth data volume and demand, NASA has developed the Earth Data Analytic Services (EDAS) framework, a high performance big data analytics framework built on Apache Spark. This framework enables scientists to execute data processing workflows combining common analysis operations close to the massive data stores at NASA. The data is accessed in standard (NetCDF, HDF, etc.) formats in a POSIX file system and processed using vetted earth data analysis tools (ESMF, CDAT, NCO, etc.). EDAS utilizes a dynamic caching architecture, a custom distributed array framework, and a streaming parallel in-memory workflow for efficiently processing huge datasets within limited memory spaces with interactive response times. EDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be accessed using direct web service calls, a Python script, a Unix-like shell client, or a JavaScript-based web application. New analytic operations can be developed in Python, Java, or Scala (with support for other languages planned). Client packages in Python, Java/Scala, or JavaScript contain everything needed to build and submit EDAS requests. The EDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits. It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service enables decision makers to compare multiple reanalysis datasets and investigate trends, variability, and anomalies in earth system dynamics around the globe.

  14. Music analysis and point-set compression

    DEFF Research Database (Denmark)

    Meredith, David

    2015-01-01

    COSIATEC, SIATECCompress and Forth's algorithm are point-set compression algorithms developed for discovering repeated patterns in music, such as themes and motives that would be of interest to a music analyst. To investigate their effectiveness and versatility, these algorithms were evaluated on three analytical tasks that depend on the discovery of repeated patterns: classifying folk song melodies into tune families, discovering themes and sections in polyphonic music, and discovering subject and countersubject entries in fugues. Each algorithm computes a compressed encoding of a point-set representation of a musical object in the form of a list of compact patterns, each pattern being given with a set of vectors indicating its occurrences. However, the algorithms adopt different strategies in their attempts to discover encodings that maximize compression. The best-performing algorithm on the folk
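
    The core operation behind SIA-family algorithms such as those named above is the computation of maximal translatable patterns (MTPs) in a point-set representation of the music. The sketch below illustrates that idea on a toy (onset, pitch) point set; it is not an implementation of COSIATEC or SIATECCompress themselves.

    ```python
    # MTP(v) = the set of points p such that p + v is also in the point set.
    from collections import defaultdict

    # Toy (onset, pitch) points: a two-note figure and its transposition
    # up a fifth, four beats later.
    points = {(0, 60), (1, 62), (4, 67), (5, 69)}

    mtps = defaultdict(set)
    for p in points:
        for q in points:
            if p != q:
                v = (q[0] - p[0], q[1] - p[1])
                mtps[v].add(p)  # p is translatable by v within the set

    # Vectors whose MTP contains more than one point indicate repeated patterns.
    for v, pattern in sorted(mtps.items()):
        if len(pattern) > 1:
            print(f"vector {v}: pattern {sorted(pattern)}")
    ```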

  15. A Generalized Pivotal Quantity Approach to Analytical Method Validation Based on Total Error.

    Science.gov (United States)

    Yang, Harry; Zhang, Jianchun

    2015-01-01

    The primary purpose of method validation is to demonstrate that the method is fit for its intended use. Traditionally, an analytical method is deemed valid if its performance characteristics such as accuracy and precision are shown to meet prespecified acceptance criteria. However, these acceptance criteria are not directly related to the method's intended purpose, which is usually a guarantee that a high percentage of the test results of future samples will be close to their true values. Alternate "fit for purpose" acceptance criteria based on the concept of total error have been increasingly used. Such criteria allow for assessing method validity, taking into account the relationship between accuracy and precision. Although several statistical test methods have been proposed in the literature to test the "fit for purpose" hypothesis, the majority of the methods are not designed to protect against the risk of accepting unsuitable methods, thus having the potential to cause uncontrolled consumer's risk. In this paper, we propose a test method based on generalized pivotal quantity inference. Through simulation studies, the performance of the method is compared to five existing approaches. The results show that both the new method and the method based on the β-content tolerance interval with a confidence level of 90%, hereafter referred to as the β-content (0.9) method, control Type I error and thus consumer's risk, while the other existing methods do not. It is further demonstrated that the generalized pivotal quantity method is less conservative than the β-content (0.9) method when the analytical methods are biased, whereas it is more conservative when the analytical methods are unbiased. Therefore, selection of either the generalized pivotal quantity or β-content (0.9) method for an analytical method validation depends on the accuracy of the analytical method. It is also shown that the generalized pivotal quantity method has better asymptotic properties than all of the current
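
    As a reference point for the β-content (0.9) comparator discussed above, the sketch below implements a simple β-content tolerance-interval acceptance check using Howe's approximation for the two-sided tolerance factor. It is a rough illustration under invented acceptance limits, not the authors' generalized-pivotal-quantity method.

    ```python
    # β-content tolerance-interval check: accept the method if an interval
    # expected to contain at least β of future results (with the stated
    # confidence) lies within the total-error acceptance limits.
    import numpy as np
    from scipy.stats import norm, chi2

    def beta_content_interval(x, beta=0.9, confidence=0.9):
        n, nu = len(x), len(x) - 1
        z = norm.ppf((1 + beta) / 2)
        # Howe's approximation to the two-sided tolerance factor
        k = np.sqrt(nu * (1 + 1 / n) * z**2 / chi2.ppf(1 - confidence, nu))
        return x.mean() - k * x.std(ddof=1), x.mean() + k * x.std(ddof=1)

    rng = np.random.default_rng(4)
    results = rng.normal(loc=100.5, scale=1.0, size=30)  # slightly biased assay
    low, high = beta_content_interval(results)
    acceptance = (95.0, 105.0)  # hypothetical limits: true value 100 ± 5%
    print("valid" if acceptance[0] <= low and high <= acceptance[1] else "not valid")
    ```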

  16. The presentation of explicit analytical solutions of a class of nonlinear evolution equations

    International Nuclear Information System (INIS)

    Feng Jinshun; Guo Mingpu; Yuan Deyou

    2009-01-01

    In this paper, we introduce a function set Ωm. There is a conjecture that an arbitrary explicit travelling-wave analytical solution of a real constant-coefficient nonlinear evolution equation is necessarily a linear (or nonlinear) combination of products of elements of Ωm. A widely applicable approach for solving a class of nonlinear evolution equations is established. The new analytical solutions to two kinds of nonlinear evolution equations are described with the aid of this conjecture.

  17. Influence of centrifugation conditions on the results of 77 routine clinical chemistry analytes using standard vacuum blood collection tubes and the new BD-Barricor tubes.

    Science.gov (United States)

    Cadamuro, Janne; Mrazek, Cornelia; Leichtle, Alexander B; Kipman, Ulrike; Felder, Thomas K; Wiedemann, Helmut; Oberkofler, Hannes; Fiedler, Georg M; Haschke-Becher, Elisabeth

    2018-02-15

    Although centrifugation is performed on almost every blood sample, recommendations on duration and g-force are heterogeneous and mostly based on expert opinions. In order to unify this step in a fully automated laboratory, we aimed to evaluate different centrifugation settings and their influence on the results of routine clinical chemistry analytes. We collected blood from 41 healthy volunteers into BD Vacutainer PST II heparin-gel tubes (LiHepGel), BD Vacutainer SST II serum tubes, and BD Vacutainer Barricor heparin tubes with a mechanical separator (LiHepBar). Tubes were centrifuged at 2000xg for 10 minutes and at 3000xg for 7 and 5 minutes, respectively. Subsequently, 60 and 21 clinical chemistry analytes were measured in plasma and serum samples, respectively, using a Roche COBAS instrument. High-sensitivity troponin T, pregnancy-associated plasma protein A, β-human chorionic gonadotropin and rheumatoid factor had to be excluded from the statistical evaluation because many of the respective results were below the measuring range. Except for free haemoglobin (fHb) measurements, no analyte result was altered by the use of shorter centrifugation times at higher g-forces. Comparing LiHepBar to LiHepGel tubes at different centrifugation settings, we found higher lactate dehydrogenase (LD) values (P = 0.003). Samples may thus be centrifuged at higher speed (3000xg) for a shorter amount of time (5 minutes) without alteration of the analytes tested in this study. When using LiHepBar tubes for blood collection, a separate LD reference value might be needed.

  18. Analyticity without Differentiability

    Science.gov (United States)

    Kirillova, Evgenia; Spindler, Karlheinz

    2008-01-01

    In this article we derive all salient properties of analytic functions, including the analytic version of the inverse function theorem, using only the most elementary convergence properties of series. Not even the notion of differentiability is required to do so. Instead, analytical arguments are replaced by combinatorial arguments exhibiting…
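
    For reference, the series definition of analyticity that such an approach starts from (a standard definition, not quoted from the article itself):

    ```latex
    % f is analytic at x_0 if it equals a convergent power series
    % on some neighbourhood of x_0:
    f(x) = \sum_{n=0}^{\infty} a_n (x - x_0)^n ,
    \qquad |x - x_0| < r \ \text{for some } r > 0 .
    ```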

  19. Generic test procedure for the qualification of a nuclear emergency generator

    International Nuclear Information System (INIS)

    Simonis, J.C.; Bowman, S.W.

    1985-01-01

    This paper describes the proposed analytical and experimental procedures for the qualification of the standby generators and exciters used in the Emergency Diesel Generator Systems in a nuclear power plant. The components which require qualification are identified through a failure mode analysis of the systems, conducted using engineering drawings updated to include all field changes. The qualification of each component includes the margins given in IEEE Std. 323-1974, ''Qualifying Class 1E Equipment for Nuclear Power Generating Stations.'' These margins are combined with the plant-specific data to define an enveloping set of environmental parameters. This set of enveloping parameters, plus margin, forms the basis for the analysis or test qualification tasks. Proposed qualification of the composite electrical insulation systems used in the generator and exciter on the form- or random-wound coils is by traceable testing. However, before testing, the thermal and radiation degradation data used in the design of the generator and exciter are evaluated to identify whether these data are sufficiently traceable to eliminate the need for additional insulation tests. The required tests are guided by applicable standards.

  20. Clinical laboratory analytics: Challenges and promise for an emerging discipline

    Directory of Open Access Journals (Sweden)

    Brian H Shirts

    2015-01-01

    The clinical laboratory is a major source of health care data. Increasingly these data are being integrated with other data to inform health system-wide actions meant to improve diagnostic test utilization, service efficiency, and "meaningful use." The Academy of Clinical Laboratory Physicians and Scientists hosted a satellite meeting on clinical laboratory analytics in conjunction with its annual meeting on May 29, 2014 in San Francisco. There were 80 registrants for the clinical laboratory analytics meeting. The meeting featured short presentations on current trends in clinical laboratory analytics and several panel discussions on data science in laboratory medicine, laboratory data and its role in the larger healthcare system, integrating laboratory analytics, and data sharing for collaborative analytics. One main goal of the meeting was to provide an open forum for leaders who work with the "big data" that clinical laboratories produce. This article summarizes the proceedings of the meeting and the content discussed.