WorldWideScience

Sample records for reliable analytical tool

  1. Analytic tools for investigating the structure of network reliability measures with regard to observation correlations

    Science.gov (United States)

    Prószyński, W.; Kwaśniak, M.

    2018-03-01

    A global measure of observation correlations in a network is proposed, together with the auxiliary indices related to non-diagonal elements of the correlation matrix. Based on the above global measure, a specific representation of the correlation matrix is presented, being the result of rigorously proven theorem formulated within the present research. According to the theorem, each positive definite correlation matrix can be expressed by a scale factor and a so-called internal weight matrix. Such a representation made it possible to investigate the structure of the basic reliability measures with regard to observation correlations. Numerical examples carried out for two test networks illustrate the structure of those measures that proved to be dependent on global correlation index. Also, the levels of global correlation are proposed. It is shown that one can readily find an approximate value of the global correlation index, and hence the correlation level, for the expected values of auxiliary indices being the only knowledge about a correlation matrix of interest. The paper is an extended continuation of the previous study of authors that was confined to the elementary case termed uniform correlation. The extension covers arbitrary correlation matrices and a structure of correlation effect.
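    The decomposition described above can be illustrated with a small numerical sketch. The index and factorisation below are plausible assumptions rather than the paper's exact definitions: the global index is taken as the mean absolute off-diagonal correlation, and the scale factor as det(C)^(1/n), so that the internal weight matrix has unit determinant.

```python
import numpy as np

# Hypothetical global correlation index: mean absolute off-diagonal element
# of a positive definite correlation matrix C (illustrative definition only).
def global_correlation_index(C):
    n = C.shape[0]
    off = C[~np.eye(n, dtype=bool)]  # all off-diagonal entries
    return np.mean(np.abs(off))

C = np.array([[1.0, 0.4, 0.2],
              [0.4, 1.0, 0.3],
              [0.2, 0.3, 1.0]])

q = global_correlation_index(C)

# One possible "scale factor x internal weight matrix" factorisation:
# C = s * W with s = det(C)**(1/n), so that det(W) = 1.
s = np.linalg.det(C) ** (1.0 / C.shape[0])
W = C / s
```

    Any positive definite correlation matrix admits such a splitting, which is what makes a single scalar index a useful summary of the correlation level.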

  2. Social Data Analytics Tool

    DEFF Research Database (Denmark)

    Hussain, Abid; Vatrapu, Ravi

    2014-01-01

    This paper presents the design, development and demonstrative case studies of the Social Data Analytics Tool, SODATO. Adopting the Action Design Framework [1], the objective of SODATO [2] is to collect, store, analyze, and report big social data emanating from the social media engagement of, and social media conversations about, organizations. We report and discuss results from two demonstrative case studies that were conducted using SODATO, and conclude with implications and future work.

  3. Development of reliable analytical tools for evaluating the influence of reductive winemaking on the quality of Lugana wines.

    Science.gov (United States)

    Mattivi, Fulvio; Fedrizzi, Bruno; Zenato, Alberto; Tiefenthaler, Paolo; Tempesta, Silvano; Perenzoni, Daniele; Cantarella, Paolo; Simeoni, Federico; Vrhovsek, Urska

    2012-06-30

    This paper presents methods for the definition of important analytical tools, such as the development of sensitive and rapid methods for analysing reduced and oxidised glutathione (GSH and GSSG), hydroxycinnamic acids (HCA), bound thiols (GSH-3MH and Cys-3MH) and free thiols (3MH and 3MHA), and their first application to evaluate the effect of reductive winemaking on the composition of Lugana juices and wines. Lugana is a traditional white wine from the Lake Garda region (Italy), produced using a local grape variety, Trebbiano di Lugana. An innovative winemaking procedure based on preliminary cooling of grape berries followed by crushing in an inert environment was implemented and explored on a winery scale. The effects of these procedures on hydroxycinnamic acids, GSH, GSSG, free and bound thiols and flavanols content were investigated. The juices and wines produced using different protocols were examined. Moreover, wines aged in tanks for 1, 2 and 3 months were analysed. The high level of GSH found in Lugana grapes, which can act as a natural antioxidant and be preserved in must and young wines, thus reducing the need of exogenous antioxidants, was particularly interesting. Moreover, it was clear that polyphenol concentrations (hydroxycinnamic acids and catechins) were strongly influenced by winemaking and pressing conditions, which required fine tuning of pressing. Above-threshold levels of 3-mercaptohexan-1-ol (3MH) and 3-mercaptohexyl acetate (3MHA) were found in the wines and changed according to the winemaking procedure applied. Interestingly, the evolution during the first three months also varied depending on the procedure adopted. Organic synthesis of cysteine and glutathione conjugates was carried out and juices and wines were subjected to LC-MS/MS analysis. These two molecules appeared to be strongly affected by the winemaking procedure, but did not show any significant change during the first 3 months of post-bottling ageing. 

  4. Biosensors: Future Analytical Tools

    Directory of Open Access Journals (Sweden)

    Vikas

    2007-02-01

    Biosensors offer considerable promise for obtaining analytical information in a faster, simpler and cheaper manner than conventional assays. Biosensing approaches are advancing rapidly, and applications ranging from the detection of metabolites, biological/chemical warfare agents, food pathogens and adulterants to genetic screening and programmed drug delivery have been demonstrated. Innovative efforts coupling micromachining and nanofabrication may lead to even more powerful devices that would accelerate the realization of large-scale and routine screening. With the gradual increase in commercialization, a wide range of new biosensors is thus expected to reach the market in the coming years.

  5. Developing a learning analytics tool

    DEFF Research Database (Denmark)

    Wahl, Christian; Belle, Gianna; Clemmensen, Anita Lykke

    This poster describes how learning analytics and collective intelligence can be combined in order to develop a tool for providing support and feedback to learners and teachers regarding students' self-initiated learning activities.

  6. A reliability analysis tool for SpaceWire network

    Science.gov (United States)

    Zhou, Qiang; Zhu, Longjiang; Fei, Haidong; Wang, Xingyou

    2017-04-01

    SpaceWire is a standard for on-board satellite networks and the basis for future data-handling architectures. It is becoming increasingly popular in space applications because of its technical advantages, including reliability, low power and fault protection. High reliability is a vital issue for spacecraft, so it is very important to analyze and improve the reliability performance of a SpaceWire network. This paper deals with the problem of reliability modeling and analysis of SpaceWire networks. Following the functional division of a distributed network, a task-based reliability analysis method is proposed: the reliability analysis of each task yields a system reliability matrix, and the reliability of the network system is deduced by integrating all the reliability indexes in that matrix. Using this method, we developed a reliability analysis tool for SpaceWire networks based on VC, in which the computation schemes for the reliability matrix and for multi-path-task reliability are also implemented. With this tool we analyzed several cases on typical architectures, and the analytic results indicate that a redundant architecture has better reliability performance than a basic one. In practice, a dual-redundancy scheme has been adopted for some key units to improve the reliability index of the system or task. This reliability analysis tool should have a direct influence on both task division and topology selection in the design phase of a SpaceWire network.
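    In its simplest form, the redundancy comparison reported above reduces to elementary series/parallel reliability algebra. The sketch below uses assumed per-link reliabilities, not values from the paper.

```python
# Minimal sketch (not the paper's actual tool): compare a basic series path
# with a dual-redundant architecture for one task routed over SpaceWire links.
def series(reliabilities):
    # A series path works only if every link works.
    r = 1.0
    for x in reliabilities:
        r *= x
    return r

def parallel(r_a, r_b):
    # Dual redundancy: the task fails only if both paths fail.
    return 1.0 - (1.0 - r_a) * (1.0 - r_b)

link = [0.99, 0.99, 0.98]       # assumed per-link reliabilities
basic = series(link)            # single path
dual = parallel(basic, basic)   # two identical redundant paths
```

    Even with modest per-link reliabilities, the dual-path figure dominates the single path, which is the qualitative result the paper reports for redundant architectures.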

  7. Artist - analytical RT inspection simulation tool

    International Nuclear Information System (INIS)

    Bellon, C.; Jaenisch, G.R.

    2007-01-01

    The computer simulation of radiography is applicable for different purposes in NDT such as for the qualification of NDT systems, the prediction of its reliability, the optimization of system parameters, feasibility analysis, model-based data interpretation, education and training of NDT/NDE personnel, and others. Within the framework of the integrated project FilmFree the radiographic testing (RT) simulation software developed by BAM is being further developed to meet practical requirements for inspection planning in digital industrial radiology. It combines analytical modelling of the RT inspection process with the CAD-orientated object description applicable to various industrial sectors such as power generation, railways and others. (authors)

  8. Application of system reliability analytical method, GO-FLOW

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi; Fukuto, Junji; Mitomo, Nobuo; Miyazaki, Keiko; Matsukura, Hiroshi; Kobayashi, Michiyuki

    1999-01-01

    The Ship Research Institute has been conducting a developmental study of the GO-FLOW method, adding various advanced capabilities to this system reliability analysis method, which occupies a central part of PSA (Probabilistic Safety Assessment). The work aimed to upgrade the functionality of the GO-FLOW method, to develop an analysis capability integrating dynamic behavior analysis, physical behavior and stochastic state transitions, and to prepare a function for extracting the main accident sequences. In fiscal year 1997, an analytical function was added to the dynamic event-tree analysis system to handle dependency between headings. In the simulation analysis of accident sequences, it became possible to cover completely the main accident sequences of the MRX improved marine propulsion reactor. In addition, a function was prepared that allows an analyst to easily set up the input data for analysis. (G.K.)

  9. The reliability analysis of cutting tools in the HSM processes

    OpenAIRE

    W.S. Lin

    2008-01-01

    Purpose: This article mainly describes the reliability of cutting tools in high speed turning using a normal distribution model. Design/methodology/approach: A series of experimental tests was carried out to evaluate the reliability variation of the cutting tools. From the experimental results, the tool wear distribution and the tool life are determined, and the tool life distribution and the reliability function of the cutting tools are derived. Further, the reliability of cutting tools at any time for h...
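    A normal-distribution tool life model of the kind named above can be sketched in a few lines; the parameters are illustrative, not the article's data.

```python
import math

# If tool life T ~ N(mu, sigma), the reliability at cutting time t is
# R(t) = P(T > t) = 1 - Phi((t - mu) / sigma), with Phi the standard
# normal CDF (expressed here via math.erf).
def tool_reliability(t, mu, sigma):
    z = (t - mu) / sigma
    return 1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Assumed tool life: mean 30 min, standard deviation 5 min.
R = tool_reliability(t=20.0, mu=30.0, sigma=5.0)  # reliability after 20 min
```

    At 20 minutes of cutting the model gives roughly 0.977, i.e. about a 2.3% chance the tool is already worn out.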

  10. Selecting analytical tools for characterization of polymersomes in aqueous solution

    DEFF Research Database (Denmark)

    Habel, Joachim Erich Otto; Ogbonna, Anayo; Larsen, Nanna

    2015-01-01

    Of the analytical methods tested, cryo-transmission electron microscopy (Cryo-TEM) and atomic force microscopy (AFM) turned out to be advantageous for polymersomes with diameters smaller than 200 nm, whereas confocal microscopy is ideal for diameters >400 nm. Polymersomes in the intermediate diameter range can be characterized using freeze-fracture cryo-scanning electron microscopy (FF-Cryo-SEM) and nanoparticle tracking analysis (NTA). Small-angle X-ray scattering (SAXS) provides reliable data on bilayer thickness and internal structure, Cryo-TEM on multilamellarity. Taken together, these tools are valuable...

  11. Analytical Web Tool for CERES Products

    Science.gov (United States)

    Mitrescu, C.; Chu, C.; Doelling, D.

    2012-12-01

    The CERES project provides the community with climate-quality observed TOA fluxes, consistent cloud properties, and computed profile and surface fluxes. The 11-year data set proves invaluable to the remote sensing and climate modeling communities for studies of annual global mean energy, meridional heat transport, consistent clouds and fluxes, and climate trends. Moreover, a broader audience interested in Earth's radiative properties, such as green energy, health and environmental companies, has shown interest in CERES-derived products. A few years ago the CERES team started developing a new web-based Ordering Tool tailored to this wide diversity of users. Recognizing the potential that web-2.0 technologies offer for both Quality Control (QC) and scientific data visualization and manipulation, the CERES team began introducing a series of specialized functions that address the above. Presented in an attractive, easy-to-use, modern web-based format, the Ordering Tool added the following analytical functions: i) 1-D histograms that display the distribution of a data field to identify outliers, useful for QC purposes; ii) an "anomaly" map that shows the regional differences between the current month and the climatological monthly mean; iii) a 2-D histogram that can either identify potential problems with the data (i.e., a QC function) or provide a global view of trends and/or correlations between various CERES flux, cloud, aerosol, and atmospheric properties. The large volume and diversity of data, together with on-the-fly execution, were the main challenges that had to be tackled. Depending on the application, execution was done on either the browser side or the server side with the help of auxiliary files. Additional challenges came from the use of various open-source applications, the multitude of CERES products, and the need for a seamless transition from previous development. In the future, we plan on expanding the analytical capabilities of the
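    The "anomaly" map in item ii) amounts to subtracting a climatological monthly mean from the current month's field. A minimal NumPy sketch with synthetic data in place of the real CERES fields (array shapes and values are assumptions, not the CERES layout):

```python
import numpy as np

# Synthetic stand-in for 11 years of monthly TOA flux maps: 132 months on an
# illustrative 18 x 36 (lat x lon) grid, around a 240 W/m^2 mean.
rng = np.random.default_rng(0)
monthly = rng.normal(240.0, 5.0, size=(132, 18, 36))

month = 6                                      # pick one calendar month (0-based)
climatology = monthly[month::12].mean(axis=0)  # mean over all 11 instances of it
anomaly = monthly[120 + month] - climatology   # final year's month minus climatology
```

    The anomaly field has the same spatial grid as the input and highlights regions departing from their long-term monthly mean.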

  12. Selecting analytical tools for characterization of polymersomes in aqueous solution

    DEFF Research Database (Denmark)

    Habel, Joachim Erich Otto; Ogbonna, Anayo; Larsen, Nanna

    2015-01-01

    Selecting the appropriate analytical methods for characterizing the assembly and morphology of polymer-based vesicles, or polymersomes, is required for them to reach their full potential in biotechnology. This work presents and compares 17 different techniques for their ability to adequately report size... Of the analytical methods tested, cryo-transmission electron microscopy (Cryo-TEM) and atomic force microscopy (AFM) turned out to be advantageous for polymersomes with diameters smaller than 200 nm, whereas confocal microscopy is ideal for diameters >400 nm. Polymersomes in the intermediate diameter range can be characterized using freeze-fracture cryo-scanning electron microscopy (FF-Cryo-SEM) and nanoparticle tracking analysis (NTA). Small-angle X-ray scattering (SAXS) provides reliable data on bilayer thickness and internal structure, Cryo-TEM on multilamellarity. Taken together, these tools are valuable...

  13. Cryogenic Propellant Feed System Analytical Tool Development

    Science.gov (United States)

    Lusby, Brian S.; Miranda, Bruno M.; Collins, Jacob A.

    2011-01-01

    The Propulsion Systems Branch at NASA's Lyndon B. Johnson Space Center (JSC) has developed a parametric analytical tool to address the need to rapidly predict heat leak into propellant distribution lines based on insulation type, installation technique, line supports, penetrations, and instrumentation. The Propellant Feed System Analytical Tool (PFSAT) will also determine the optimum orifice diameter for an optional thermodynamic vent system (TVS) to counteract heat leak into the feed line and ensure temperature constraints at the end of the feed line are met. PFSAT was developed primarily in Fortran 90 because of its number-crunching power and its ability to directly access the real-fluid property subroutines of the Reference Fluid Thermodynamic and Transport Properties (REFPROP) database developed by NIST. A Microsoft Excel front end was implemented to provide convenient portability of PFSAT among a wide variety of potential users, taking advantage of a user-friendly graphical user interface (GUI) developed in Visual Basic for Applications (VBA). The focus of PFSAT is on-orbit reaction control systems and orbital maneuvering systems, but it may be used to predict heat leak into ground-based transfer lines as well. PFSAT is expected to be used for rapid initial design of cryogenic propellant distribution lines and thermodynamic vent systems. Once validated, PFSAT will support concept trades for a variety of cryogenic fluid transfer systems on spacecraft, including planetary landers, transfer vehicles, and propellant depots, as well as surface-based transfer systems. The details of the development of PFSAT, its user interface, and the program structure are presented.
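    PFSAT itself couples REFPROP fluid properties with detailed line models; a deliberately minimal sketch of the underlying heat-leak estimate, using one-dimensional steady conduction with assumed material values (none taken from PFSAT), is:

```python
# Illustrative 1-D conduction estimate of heat leak through a line support.
def heat_leak(k, area, t_hot, t_cold, length):
    """Q = k * A * (T_hot - T_cold) / L, in watts (constant conductivity)."""
    return k * area * (t_hot - t_cold) / length

# Assumed G-10 strut: k ~ 0.6 W/(m K), 1 cm^2 cross-section, 10 cm long,
# 300 K warm end, 90 K cryogen-side cold end.
q = heat_leak(k=0.6, area=1.0e-4, t_hot=300.0, t_cold=90.0, length=0.10)
```

    Summing such terms over supports, penetrations, and instrumentation lines gives a first-order total heat leak of the kind PFSAT parametrizes.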

  14. Analytical approach for confirming the achievement of LMFBR reliability goals

    International Nuclear Information System (INIS)

    Ingram, G.E.; Elerath, J.G.; Wood, A.P.

    1981-01-01

    The approach, recommended by GE-ARSD, for confirming the achievement of LMFBR reliability goals relies upon a comprehensive understanding of the physical and operational characteristics of the system and the environments to which the system will be subjected during its operational life. This kind of understanding is required for an approach based on system hardware testing or analyses, as recommended in this report. However, for a system as complex and expensive as the LMFBR, an approach which relies primarily on system hardware testing would be prohibitive both in cost and time to obtain the required system reliability test information. By using an analytical approach, results of tests (reliability and functional) at a low level within the specific system of interest, as well as results from other similar systems can be used to form the data base for confirming the achievement of the system reliability goals. This data, along with information relating to the design characteristics and operating environments of the specific system, will be used in the assessment of the system's reliability

  15. Reliability concepts applied to cutting tool change time

    Energy Technology Data Exchange (ETDEWEB)

    Patino Rodriguez, Carmen Elena, E-mail: cpatino@udea.edu.c [Department of Industrial Engineering, University of Antioquia, Medellin (Colombia); Department of Mechatronics and Mechanical Systems, Polytechnic School, University of Sao Paulo, Sao Paulo (Brazil); Francisco Martha de Souza, Gilberto [Department of Mechatronics and Mechanical Systems, Polytechnic School, University of Sao Paulo, Sao Paulo (Brazil)

    2010-08-15

    This paper presents a reliability-based analysis for calculating critical tool life in machining processes. It is possible to determine the running time for each tool involved in the process by obtaining the operations sequence for the machining procedure. Usually, the reliability of an operation depends on three independent factors: operator, machine-tool and cutting tool. The reliability of a part manufacturing process is mainly determined by the cutting time for each job and by the sequence of operations, defined by the series configuration. An algorithm is presented to define when the cutting tool must be changed. The proposed algorithm is used to evaluate the reliability of a manufacturing process composed of turning and drilling operations. The reliability of the turning operation is modeled based on data presented in the literature, and from experimental results, a statistical distribution of drilling tool wear was defined, and the reliability of the drilling process was modeled.
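    The tool-change rule described above can be sketched as follows, assuming a normal tool-life model and illustrative parameters (the paper models turning from literature data and drilling from experiment; none of these numbers are taken from it):

```python
import math

def op_reliability(cut_time, mu, sigma):
    # Normal tool-life model: R(t) = 1 - Phi((t - mu) / sigma).
    z = (cut_time - mu) / sigma
    return 1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def parts_before_change(cut_per_part, mu, sigma, target=0.90):
    # Change the tool before the first part whose completion would drop
    # the tool reliability below the target.
    t, parts = 0.0, 0
    while op_reliability(t + cut_per_part, mu, sigma) >= target:
        t += cut_per_part
        parts += 1
    return parts

# Assumed values: 2 min of cutting per part, tool life N(30, 5) minutes.
n = parts_before_change(cut_per_part=2.0, mu=30.0, sigma=5.0)
```

    For a process with several operations in series, the same check is applied with the product of the operation reliabilities, since the series configuration fails if any single operation fails.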

  16. Reliability concepts applied to cutting tool change time

    International Nuclear Information System (INIS)

    Patino Rodriguez, Carmen Elena; Francisco Martha de Souza, Gilberto

    2010-01-01

    This paper presents a reliability-based analysis for calculating critical tool life in machining processes. It is possible to determine the running time for each tool involved in the process by obtaining the operations sequence for the machining procedure. Usually, the reliability of an operation depends on three independent factors: operator, machine-tool and cutting tool. The reliability of a part manufacturing process is mainly determined by the cutting time for each job and by the sequence of operations, defined by the series configuration. An algorithm is presented to define when the cutting tool must be changed. The proposed algorithm is used to evaluate the reliability of a manufacturing process composed of turning and drilling operations. The reliability of the turning operation is modeled based on data presented in the literature, and from experimental results, a statistical distribution of drilling tool wear was defined, and the reliability of the drilling process was modeled.

  17. Application of analytical procedure on system reliability, GO-FLOW

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi; Fukuto, Junji; Mitomo, Nobuo; Miyazaki, Keiko; Matsukura, Hiroshi; Kobayashi, Michiyuki

    2000-01-01

    At the Ship Research Institute, research and development of the GO-FLOW procedure, a system reliability analysis method that occupies a central part of probabilistic safety assessment (PSA), has been promoted with various advanced functions. In this study, aiming at a fundamental upgrade of the GO-FLOW procedure as an important evaluation technique for carrying out PSA up to level 3, a safety assessment system using GO-FLOW was developed, together with an analysis capability coupling dynamic behavior analysis with the physical behavior of the system under stochastic phenomenon changes. In fiscal year 1998, various functions were prepared and verified, such as adding dependencies between headings, rearranging events in time order, assigning the same heading to plural positions, and calculating occurrence frequency over elapsed time. For the simulation analysis of accident sequences, it was confirmed that the analysis covers all of the main accident sequences in the reactor of the improved marine reactor MRX. In addition, a function for near-automatic generation of analysis input data was prepared. As a result, the problem that analytical results were not always easy to understand except by a PSA expert was solved, and understanding of accident phenomena, verification of the validity of an analysis, feedback to analysis, and feedback to design could be carried out easily. (G.K.)

  18. Evaluation of Linked Data tools for Learning Analytics

    NARCIS (Netherlands)

    Drachsler, Hendrik; Herder, Eelco; d'Aquin, Mathieu; Dietze, Stefan

    2013-01-01

    Drachsler, H., Herder, E., d'Aquin, M., & Dietze, S. (2013, 8-12 April). Evaluation of Linked Data tools for Learning Analytics. Presentation given in the tutorial on 'Using Linked Data for Learning Analytics' at LAK2013, the Third Conference on Learning Analytics and Knowledge, Leuven, Belgium.

  19. PIXE as an analytical/educational tool

    International Nuclear Information System (INIS)

    Williams, E.T.

    1991-01-01

    The advantages and disadvantages of PIXE as an analytical method will be summarized. The authors will also discuss the advantages of PIXE as a means of providing interesting and feasible research projects for undergraduate science students

  20. Method for effective usage of Google Analytics tools

    Directory of Open Access Journals (Sweden)

    Ирина Николаевна Егорова

    2016-01-01

    Modern Google Analytics tools were investigated with regard to identifying effective channels for attracting users and detecting bottlenecks. The investigation made it possible to propose a method for the effective use of Google Analytics tools. The method is based on the analysis of the main traffic indicators, as well as a deep analysis of goals and their consecutive tweaking. The method makes it possible to increase website conversion and may be useful for SEO and web analytics specialists.

  1. An integrated approach to human reliability analysis -- decision analytic dynamic reliability model

    International Nuclear Information System (INIS)

    Holmberg, J.; Hukki, K.; Norros, L.; Pulkkinen, U.; Pyy, P.

    1999-01-01

    The reliability of human operators in process control is sensitive to context, which many contemporary human reliability analysis (HRA) methods do not take sufficiently into account. This article argues that probabilistic and psychological approaches to human reliability should be integrated. This is achieved, first, by adopting methods that adequately reflect the essential features of the process control activity and, secondly, by carrying out an interactive HRA process. Description of the activity context, probabilistic modeling, and psychological analysis form an iterative interdisciplinary sequence of analysis in which the results of one sub-task may be input to another. The analysis of the context is carried out first with the help of a common set of conceptual tools. The resulting descriptions of the context support the probabilistic modeling, through which new results regarding the probabilistic dynamics can be achieved. These can be incorporated in the context descriptions used as reference in the psychological analysis of actual performance. The results also provide new knowledge of the constraints on activity, by providing information on the premises of the operator's actions. Finally, the stochastic marked point process model provides a tool by which psychological methodology may be interpreted and utilized for reliability analysis.

  2. STARS software tool for analysis of reliability and safety

    International Nuclear Information System (INIS)

    Poucet, A.; Guagnini, E.

    1989-01-01

    The STARS (Software Tool for the Analysis of Reliability and Safety) project aims at developing an integrated set of computer-aided reliability analysis tools for the various tasks involved in system safety and reliability analysis, including hazard identification, qualitative analysis, and logic model construction and evaluation. Expert system technology offers the most promising perspective for developing a computer-aided reliability analysis tool. Combined with graphics and analysis capabilities, it can provide a natural, engineering-oriented environment for computer-assisted reliability and safety modelling and analysis. For hazard identification and fault tree construction, a frame/rule-based expert system is used, in which the deductive (goal-driven) reasoning and the heuristics applied during manual fault tree construction are modelled. Expert systems can explain their reasoning, so that the analyst can become aware of why and how results are being obtained. Hence, the learning aspect involved in manual reliability and safety analysis can be maintained and improved.

  3. Analytical Tools for Space Suit Design

    Science.gov (United States)

    Aitchison, Lindsay

    2011-01-01

    As indicated by the implementation of multiple small project teams within the agency, NASA is adopting a lean approach to hardware development that emphasizes quick product realization and rapid response to shifting program and agency goals. Over the past two decades, space suit design has been evolutionary in approach, with emphasis on building prototypes and then testing them with the largest practical range of subjects possible. The results of these efforts show continuous improvement but make scaled design and performance predictions almost impossible with limited budgets and little time. Thus, in an effort to start changing the way NASA approaches space suit design and analysis, the Advanced Space Suit group has initiated the development of an integrated design and analysis tool. It is a multi-year, if not decadal, development effort that, when fully implemented, is envisioned to generate analysis of any given space suit architecture or, conversely, predictions of ideal space suit architectures given specific mission parameters. The master tool will exchange information to and from a set of five sub-tool groups in order to generate the desired output. The basic functions of each sub-tool group, the initial relationships between the sub-tools, and a comparison to state-of-the-art software and tools are discussed.

  4. Median of patient results as a tool for assessment of analytical stability.

    Science.gov (United States)

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft; Sölétormos, György

    2015-06-15

    In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor long-term analytical stability as a supplement to internal control procedures is often needed. Patient data from daily internal control schemes were used for monthly appraisal of analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. Seventy-five percent of the twenty analytes measured on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and desirable specifications for bias. Patient results applied in analytical quality performance control procedures are the most reliable source of material, as they represent the genuine substance of the measurements and therefore circumvent the problems associated with non-commutable materials in external assessment. Patient medians in the monthly monitoring of analytical stability in laboratory medicine are an inexpensive, simple and reliable tool to monitor the steadiness of analytical practice. Copyright © 2015 Elsevier B.V. All rights reserved.
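    The monthly-median check can be sketched in a few lines; the target value and allowable bias below are illustrative, not the paper's data.

```python
import statistics

# Compare a month's median of patient results against a stable target median,
# using an allowable-bias specification derived from biological variation.
def median_within_spec(results, target_median, allowable_bias_pct):
    m = statistics.median(results)
    deviation_pct = 100.0 * (m - target_median) / target_median
    return abs(deviation_pct) <= allowable_bias_pct, m

ok, m = median_within_spec(
    results=[5.0, 5.2, 4.9, 5.1, 5.3, 5.0, 4.8],  # e.g. glucose, mmol/L
    target_median=5.0,          # assumed long-term stable median
    allowable_bias_pct=2.2,     # assumed desirable bias specification
)
```

    A month whose median drifts outside the bias specification flags a potential loss of analytical stability before the routine control rules would necessarily catch it.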

  5. Pre-analytical and analytical aspects affecting clinical reliability of plasma glucose results.

    Science.gov (United States)

    Pasqualetti, Sara; Braga, Federica; Panteghini, Mauro

    2017-07-01

    The measurement of plasma glucose (PG) plays a central role in recognizing disturbances in carbohydrate metabolism, with established decision limits that are globally accepted. This requires that PG results be reliable and unequivocally valid no matter where they are obtained. To control the pre-analytical variability of PG and prevent in vitro glycolysis, the use of citrate as a rapidly effective glycolysis inhibitor has been proposed. However, the commercial availability of several tubes, with studies showing different performance, has created confusion among users. Moreover, and more importantly, studies have shown that tubes promptly inhibiting glycolysis give PG results that are significantly higher than those from tubes containing sodium fluoride only, which were used in the majority of the studies generating the current PG cut-points, leading to a different clinical classification of subjects. From the analytical point of view, to be equivalent among different measuring systems, PG results should be traceable to a recognized higher-order reference via the implementation of an unbroken metrological hierarchy. In doing this, it is important that manufacturers of measuring systems consider the uncertainty accumulated through the different steps of the selected traceability chain. In particular, PG results should fulfil analytical performance specifications defined to fit the intended clinical application. Since PG is under tight homeostatic control, its biological variability may be used to define these limits. Alternatively, given the central diagnostic role of the analyte, an outcome model showing the impact of the analytical performance of the test on the clinical classification of subjects can be used. Using these specifications, performance assessment studies employing commutable control materials with values assigned by a reference procedure have shown that the quality of PG measurements is often far from desirable and that problems are exacerbated with point-of-care devices.
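    The biological-variation route to performance specifications mentioned above is commonly expressed through Fraser's widely used formulas (desirable imprecision ≤ 0.5 CVI, desirable bias ≤ 0.25 (CVI² + CVG²)^1/2). The CV values below are rough textbook figures for glucose, not taken from this paper.

```python
import math

# Desirable analytical performance specifications from biological variation.
def desirable_imprecision(cv_i):
    # Maximum allowable analytical CV, as a fraction of within-subject variation.
    return 0.50 * cv_i

def desirable_bias(cv_i, cv_g):
    # Maximum allowable bias from within- and between-subject variation.
    return 0.25 * math.sqrt(cv_i**2 + cv_g**2)

cv_i, cv_g = 5.6, 7.5               # assumed within/between-subject CVs, %
imp = desirable_imprecision(cv_i)   # allowable assay CV, %
bias = desirable_bias(cv_i, cv_g)   # allowable bias, %
```

    With these assumed CVs the specifications come out at roughly 2.8% imprecision and 2.3% bias, illustrating how tight the glucose requirements are for the devices discussed above.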

  6. Seeking high reliability in primary care: Leadership, tools, and organization.

    Science.gov (United States)

    Weaver, Robert R

    2015-01-01

    Leaders in health care increasingly recognize that improving health care quality and safety requires developing an organizational culture that fosters high reliability and continuous process improvement. For various reasons, a reliability-seeking culture is lacking in most health care settings. Developing a reliability-seeking culture requires leaders' sustained commitment to reliability principles, using key mechanisms to embed those principles widely in the organization. The aim of this study was to examine how key mechanisms used by a primary care practice (PCP) might foster a reliability-seeking, system-oriented organizational culture. A case study approach was used to investigate the PCP's reliability culture. The study examined four cultural artifacts used to embed reliability-seeking principles across the organization: leadership statements, decision support tools, and two organizational processes. To decipher their effects on reliability, the study relied on observations of work patterns and the tools' use, interactions during morning huddles and process improvement meetings, interviews with clinical and office staff, and a "collective mindfulness" questionnaire. The five reliability principles framed the data analysis. Leadership statements articulated principles that oriented the PCP toward a reliability-seeking culture of care. Reliability principles became embedded in everyday discourse and actions through the use of "problem knowledge coupler" decision support tools and daily "huddles." Practitioners and staff were encouraged to report unexpected events or close calls, which often initiated a formal "process change" used to adjust routines and prevent adverse events from recurring. Activities that foster reliable patient care became part of the taken-for-granted routine at the PCP. The analysis illustrates the role leadership, tools, and organizational processes play in developing and embedding a reliability-seeking culture across an organization.

  7. Developing Healthcare Data Analytics APPs with Open Data Science Tools.

    Science.gov (United States)

    Hao, Bibo; Sun, Wen; Yu, Yiqin; Xie, Guotong

    2017-01-01

    Recent advances in big data analytics provide more flexible, efficient, and open tools for researchers to gain insight from healthcare data. Many tools, however, require researchers to develop programs in languages such as Python or R, a skill set that many researchers in the healthcare data analytics area have not acquired. To make data science more approachable, we explored existing tools and developed a practice that can help data scientists convert existing analytics pipelines into user-friendly analytics APPs with rich interactions and real-time analysis features. With this practice, data scientists can develop customized analytics pipelines as APPs in Jupyter Notebook and disseminate them to other researchers easily, and researchers can benefit from the shared notebooks to perform analysis tasks or reproduce research results much more easily.

  8. Reldata - a tool for reliability database management

    International Nuclear Information System (INIS)

    Vinod, Gopika; Saraf, R.K.; Babar, A.K.; Sanyasi Rao, V.V.S.; Tharani, Rajiv

    2000-01-01

    Component failure, repair and maintenance data are a very important element of any Probabilistic Safety Assessment study. The credibility of the results of such a study is enhanced if the data used are generated from the operating experience of similar power plants. Towards this objective, a computerised database is designed, with fields such as date and time of failure, component name, failure mode, failure cause, ways of failure detection, reactor operating power status, repair times, down time, etc. This leads to the evaluation of plant-specific failure rates and on-demand failure probabilities/unavailabilities for all components. Systematic data updating can provide real-time component reliability parameter statistics and trend analysis, and this helps in planning maintenance strategies. A software package, RELDATA, has been developed which incorporates the database management and data analysis methods. This report describes the software features and underlying methodology in detail. (author)
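
    The plant-specific parameters this abstract describes reduce to simple maximum-likelihood point estimates; the record counts below are hypothetical and not taken from RELDATA itself:

```python
def point_estimates(failures, op_hours, demand_failures, demands):
    """Plant-specific point estimates from operating-experience records.

    Maximum-likelihood estimators: a failure rate (per hour) for running
    failures and an on-demand failure probability for standby components.
    """
    rate = failures / op_hours            # lambda = n / T
    q_demand = demand_failures / demands  # q = k / N
    return rate, q_demand

# Hypothetical record counts for one component class over three years:
lam, q = point_estimates(failures=3, op_hours=26280.0,
                         demand_failures=2, demands=500)
print(f"lambda = {lam:.2e}/h, q = {q:.3f}")
```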

  9. Precision profiles and analytic reliability of radioimmunologic methods

    International Nuclear Information System (INIS)

    Yaneva, Z.; Popova, Yu.

    1991-01-01

    The aim of the present study is to investigate and compare some methods for creating 'precision profiles' (PP) and to clarify their possibilities for determining the analytical reliability of RIA. Only methods without complicated mathematical calculations have been used. The reproducibility in sera with concentrations of the determinable hormone over the whole range of the calibration curve has been studied. The radioimmunoassay was performed with a TSH-RIA set (ex East Germany), and comparative evaluations with commercial sets from HOECHST (Germany) and AMERSHAM (GB). Three methods for obtaining the relationship between concentration (IU/l) and reproducibility (C.V., %) are used, and a comparison is made of their corresponding profiles: a preliminary rough profile, the Rodbard PP and the Ekins PP. It is concluded that the creation of a precision profile is obligatory and that the method of its construction does not influence the course of the relationship. The PP allows determination of the concentration range giving stable results, which improves the efficiency of analytical work. 16 refs., 4 figs

  10. Suprahyoid Muscle Complex: A Reliable Neural Assessment Tool For Dysphagia?

    DEFF Research Database (Denmark)

    Kothari, Mohit; Stubbs, Peter William; Pedersen, Asger Roer

    be a non-invasive, reliable neural assessment tool for patients with dysphagia. Objective: To investigate the possibility of using surface electromyography (sEMG) of the suprahyoid muscle complex (SMC) to assess changes to neural pathways by determining the reliability of measurements in healthy

  11. Electronic tongue: An analytical gustatory tool

    Directory of Open Access Journals (Sweden)

    Rewanthwar Swathi Latha

    2012-01-01

    Full Text Available Taste is an important organoleptic property governing the acceptance of products for administration through the mouth, yet the majority of drugs available are bitter in taste. For patient acceptability and compliance, bitter-tasting drugs are masked by adding several flavoring agents, so taste assessment is an important quality control parameter for evaluating taste-masked formulations. The primary method for the taste measurement of drug substances and formulations is by human panelists. The use of sensory panelists in industry is difficult and problematic owing to the potential toxicity of drugs and the subjectivity of taste panelists; recruiting panelists, maintaining motivation and panel maintenance are also significantly difficult when working with unpleasant products. Furthermore, Food and Drug Administration (FDA)-unapproved molecules cannot be tested. Therefore, an analytical taste-sensing multichannel sensory system called the electronic tongue (e-tongue, or artificial tongue), which can assess taste, has been replacing sensory panelists. The e-tongue thus offers benefits such as reduced reliance on human panels. The present review focuses on the electrochemical concepts in instrumentation, performance qualification of the e-tongue, and applications in various fields.

  12. Design Protocols and Analytical Strategies that Incorporate Structural Reliability Models

    Science.gov (United States)

    Duffy, Stephen F.

    1997-01-01

    Ceramic matrix composites (CMC) and intermetallic materials (e.g., single crystal nickel aluminide) are high performance materials that exhibit attractive mechanical, thermal and chemical properties. These materials are critically important in advancing certain performance aspects of gas turbine engines. From an aerospace engineer's perspective the new generation of ceramic composites and intermetallics offers a significant potential for raising the thrust/weight ratio and reducing NO(x) emissions of gas turbine engines. These aspects have increased interest in utilizing these materials in the hot sections of turbine engines. However, as these materials evolve and their performance characteristics improve, a persistent need exists for state-of-the-art analytical methods that predict the response of components fabricated from CMC and intermetallic material systems. This need provided the motivation for the technology developed under this research effort. Continuous ceramic fiber composites exhibit an increase in work of fracture, which allows for "graceful" rather than catastrophic failure. When loaded in the fiber direction, these composites retain substantial strength capacity beyond the initiation of transverse matrix cracking despite the fact that neither of their constituents would exhibit such behavior if tested alone. As additional load is applied beyond first matrix cracking, the matrix tends to break in a series of cracks bridged by the ceramic fibers. Any additional load is borne increasingly by the fibers until the ultimate strength of the composite is reached. Thus the modeling efforts supported under this research effort have focused on predicting this sort of behavior. For single crystal intermetallics, the issues that motivated the technology development involved questions relating to material behavior and component design. Thus the research effort supported by this grant had to determine the statistical nature and source of fracture in a high-strength, NiAl single-crystal material.

  13. Analytical and numerical tools for vacuum systems

    CERN Document Server

    Kersevan, R

    2007-01-01

    Modern particle accelerators have reached a level of sophistication which requires a thorough analysis of all their sub-systems. Among the latter, the vacuum system is often a major contributor to the operating performance of a particle accelerator. The vacuum engineer nowadays has a large choice of computational schemes and tools for the correct analysis, design, and engineering of the vacuum system. This paper is a review of the different types of algorithms and methodologies that have been developed and employed in the field since the birth of vacuum technology. The different levels of detail, from simple back-of-the-envelope calculations to more complex numerical analyses, are discussed by means of comparisons. The domain of applicability of each method is discussed, together with its pros and cons.

  14. Guidance for the Design and Adoption of Analytic Tools.

    Energy Technology Data Exchange (ETDEWEB)

    Bandlow, Alisa [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-12-01

    The goal is to make software developers aware of common issues that can impede the adoption of analytic tools. This paper provides a summary of guidelines, lessons learned and existing research to explain what is currently known about what analysts want and how to better understand what tools they do and don't need.

  15. A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.

    Science.gov (United States)

    Płotka-Wasylka, J

    2018-05-01

    A new means of assessing analytical protocols with regard to green analytical chemistry attributes has been developed. The new tool, called the Green Analytical Procedure Index (GAPI), evaluates the green character of an entire analytical methodology, from sample collection to final determination. It was created with reference to such tools as the National Environmental Methods Index (NEMI) and the Analytical Eco-Scale, in order to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams is used to evaluate and quantify the environmental impact of each step of an analytical methodology, coloured from green through yellow to red to depict low, medium and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied to the determination of biogenic amines in wine samples and to polycyclic aromatic hydrocarbon determination by EPA methods. The GAPI tool not only provides an immediately perceptible overview to the user/reader but also offers exhaustive information on the evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.

  16. Web analytics tools and web metrics tools: An overview and comparative analysis

    Directory of Open Access Journals (Sweden)

    Ivan Bekavac

    2015-10-01

    Full Text Available The aim of the paper is to compare and analyze the impact of web analytics tools on measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and the proper choice of web tools for particular business models are also reviewed. The research is divided into two sections. The first takes a qualitative focus, reviewing web analytics tools to explore their functionalities and their ability to be integrated into the respective business model. Web analytics tools support the business analyst's efforts to obtain useful and relevant insights into market dynamics; generally speaking, selecting a web analytics and web metrics tool should therefore be based on an investigative approach, not a random decision. The second section takes a quantitative focus, shifting from theory to an empirical approach, and presents output data resulting from a study of perceived user satisfaction with web analytics tools. The empirical study was carried out on employees of 200 Croatian firms, from either the IT or the marketing branch. The paper contributes by highlighting the support for management that the web analytics and web metrics tools available on the market have to offer, based on the growing need to understand and predict global market trends.

  17. Analytical procedures for determining the impacts of reliability mitigation strategies.

    Science.gov (United States)

    2013-01-01

    Reliability of transport, especially the ability to reach a destination within a certain amount of time, is a regular concern of travelers and shippers. The definition of reliability used in this research is how travel time varies over time. The vari...

  18. Analytical modeling of nuclear power station operator reliability

    International Nuclear Information System (INIS)

    Sabri, Z.A.; Husseiny, A.A.

    1979-01-01

    The operator-plant interface is a critical component of power stations which requires the formulation of mathematical models to be applied in plant reliability analysis. The human model introduced here is based on cybernetic interactions and allows for use of available data from psychological experiments, hot and cold training and normal operation. The operator model is identified and integrated in the control and protection systems. The availability and reliability are given for different segments of the operator task and for specific periods of the operator life: namely, training, operation and vigilance or near retirement periods. The results can be easily and directly incorporated in system reliability analysis. (author)

  19. HiRel: Hybrid Automated Reliability Predictor (HARP) integrated reliability tool system, (version 7.0). Volume 1: HARP introduction and user's guide

    Science.gov (United States)

    Bavuso, Salvatore J.; Rothmann, Elizabeth; Dugan, Joanne Bechta; Trivedi, Kishor S.; Mittal, Nitin; Boyd, Mark A.; Geist, Robert M.; Smotherman, Mark D.

    1994-01-01

    The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. HiRel consists of interactive graphical input/output programs and four reliability/availability modeling engines that provide analytical and simulative solutions to a wide host of fault-tolerant system architectures; it is also applicable to electronic systems in general. The tool system was designed to be compatible with most computing platforms and operating systems, and some programs have been beta tested within the aerospace community for over 8 years. Volume 1 provides an introduction to the HARP program. Comprehensive information on HARP mathematical models can be found in the references.

  20. Analytical framework and tool kit for SEA follow-up

    International Nuclear Information System (INIS)

    Nilsson, Mans; Wiklund, Hans; Finnveden, Goeran; Jonsson, Daniel K.; Lundberg, Kristina; Tyskeng, Sara; Wallgren, Oskar

    2009-01-01

    Most Strategic Environmental Assessment (SEA) research and applications have so far neglected the ex post stages of the process, also called SEA follow-up. Tool kits and methodological frameworks for engaging effectively with SEA follow-up have been conspicuously missing. In particular, little has so far been learned from the much more mature evaluation literature although many aspects are similar. This paper provides an analytical framework and tool kit for SEA follow-up. It is based on insights and tools developed within programme evaluation and environmental systems analysis. It is also grounded in empirical studies into real planning and programming practices at the regional level, but should have relevance for SEA processes at all levels. The purpose of the framework is to promote a learning-oriented and integrated use of SEA follow-up in strategic decision making. It helps to identify appropriate tools and their use in the process, and to systematise the use of available data and knowledge across the planning organization and process. It distinguishes three stages in follow-up: scoping, analysis and learning, identifies the key functions and demonstrates the informational linkages to the strategic decision-making process. The associated tool kit includes specific analytical and deliberative tools. Many of these are applicable also ex ante, but are then used in a predictive mode rather than on the basis of real data. The analytical element of the framework is organized on the basis of programme theory and 'DPSIR' tools. The paper discusses three issues in the application of the framework: understanding the integration of organizations and knowledge; understanding planners' questions and analytical requirements; and understanding interests, incentives and reluctance to evaluate.

  1. The design and use of reliability data base with analysis tool

    Energy Technology Data Exchange (ETDEWEB)

    Doorepall, J.; Cooke, R.; Paulsen, J.; Hokstadt, P.

    1996-06-01

    With the advent of sophisticated computer tools, it is possible to give a distributed population of users direct access to reliability component operational histories. This allows the user a greater freedom in defining statistical populations of components and selecting failure modes. However, the reliability data analyst's current analytical instrumentarium is not adequate for this purpose. The terminology used in organizing and gathering reliability data is standardized, and the statistical methods used in analyzing this data are not always suitably chosen. This report attempts to establish a baseline with regard to terminology and analysis methods, to support the use of a new analysis tool. It builds on results obtained in several projects for the ESTEC and SKI on the design of reliability databases. Starting with component socket time histories, we identify a sequence of questions which should be answered prior to the employment of analytical methods. These questions concern the homogeneity and stationarity of (possible dependent) competing failure modes and the independence of competing failure modes. Statistical tests, some of them new, are proposed for answering these questions. Attention is given to issues of non-identifiability of competing risk and clustering of failure-repair events. These ideas have been implemented in an analysis tool for grazing component socket time histories, and illustrative results are presented. The appendix provides background on statistical tests and competing failure modes. (au) 4 tabs., 17 ills., 61 refs.
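
    One of the stationarity questions raised in this abstract can be answered with the classical Laplace trend test for a failure point process; the sketch below is a generic illustration, not necessarily one of the report's proposed tests, and the failure times are invented:

```python
import math

def laplace_trend_statistic(event_times, horizon):
    """Laplace test for trend in a failure point process on (0, horizon].

    Under a stationary (homogeneous Poisson) process the statistic is
    approximately standard normal; large positive values indicate
    deterioration (failures clustering late), negative values improvement.
    """
    n = len(event_times)
    mean_t = sum(event_times) / n
    return (mean_t - horizon / 2) / (horizon * math.sqrt(1 / (12 * n)))

# Evenly spread failure times (hours) show no trend:
u = laplace_trend_statistic([1000, 3000, 5000, 7000, 9000], horizon=10000)
print(f"U = {u:.3f}")  # near 0 => no evidence against stationarity
```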

  2. The design and use of reliability data base with analysis tool

    International Nuclear Information System (INIS)

    Doorepall, J.; Cooke, R.; Paulsen, J.; Hokstadt, P.

    1996-06-01

    With the advent of sophisticated computer tools, it is possible to give a distributed population of users direct access to reliability component operational histories. This allows the user a greater freedom in defining statistical populations of components and selecting failure modes. However, the reliability data analyst's current analytical instrumentarium is not adequate for this purpose. The terminology used in organizing and gathering reliability data is standardized, and the statistical methods used in analyzing this data are not always suitably chosen. This report attempts to establish a baseline with regard to terminology and analysis methods, to support the use of a new analysis tool. It builds on results obtained in several projects for the ESTEC and SKI on the design of reliability databases. Starting with component socket time histories, we identify a sequence of questions which should be answered prior to the employment of analytical methods. These questions concern the homogeneity and stationarity of (possible dependent) competing failure modes and the independence of competing failure modes. Statistical tests, some of them new, are proposed for answering these questions. Attention is given to issues of non-identifiability of competing risk and clustering of failure-repair events. These ideas have been implemented in an analysis tool for grazing component socket time histories, and illustrative results are presented. The appendix provides background on statistical tests and competing failure modes. (au) 4 tabs., 17 ills., 61 refs

  3. A Novel Analytic Technique for the Service Station Reliability in a Discrete-Time Repairable Queue

    Directory of Open Access Journals (Sweden)

    Renbin Liu

    2013-01-01

    Full Text Available This paper presents a decomposition technique for the service station reliability in a discrete-time repairable Geom^X/G/1 queueing system, in which the server takes exhaustive service and multiple adaptive delayed vacation discipline. Using such a novel analytic technique, some important reliability indices and reliability relation equations of the service station are derived. Furthermore, the structures of the service station indices are also found. Finally, special cases and numerical examples validate the derived results and show that our analytic technique is applicable to reliability analysis of some complex discrete-time repairable bulk arrival queueing systems.

  4. Ultrasonic trap as analytical tool; Die Ultraschallfalle als analytisches Werkzeug

    Energy Technology Data Exchange (ETDEWEB)

    Leiterer, Jork

    2009-11-30

    The ultrasonic trap offers an exceptional possibility for sample handling on the microlitre scale. Using acoustic levitation, the sample is positioned in a containerless gaseous environment and therefore evades the influence of solid surfaces. In this work, the possibilities of the ultrasonic trap for operation in analytics are investigated experimentally. In combination with typical contactless analytical methods, such as spectroscopy and X-ray scattering, the advantages of this levitation technique are demonstrated on several materials, including inorganic, organic and pharmaceutical substances as well as proteins and nano- and microparticles. It is shown that acoustic levitation enables reliable contactless sample handling for spectroscopic methods (LIF, Raman) as well as, for the first time, for X-ray scattering (EDXD, SAXS, WAXS) and X-ray fluorescence (XRF, XANES). For all these methods the containerless sample handling turns out to be advantageous. The results obtained are comparable with those from conventional sample holders and, moreover, partly surpass them with regard to data quality. A novel experimental approach was the integration of the acoustic levitator into the experimental set-up at the synchrotron. The application of the ultrasonic trap at BESSY was established during this work and now forms the basis of intensive interdisciplinary research. Additionally, the potential of the trap for enrichment was recognized and applied to study evaporation-controlled processes. Containerless, concentration-dependent analysis of the same sample over a concentration range spanning three orders of magnitude is a unique capability, and it contributed substantially to elucidating questions in several areas of research. These investigations are the first in situ studies of agglomeration in an acoustically levitated droplet, ranging from small (in)organic molecules over proteins up to

  5. Big Data Analytics Tools as Applied to ATLAS Event Data

    CERN Document Server

    Vukotic, Ilija; The ATLAS collaboration

    2016-01-01

    Big Data technologies have proven to be very useful for storage, processing and visualization of derived metrics associated with ATLAS distributed computing (ADC) services. Log file data, database records, and metadata from a diversity of systems have been aggregated and indexed to create an analytics platform for ATLAS ADC operations analysis. Dashboards, wide area data access cost metrics, user analysis patterns, and resource utilization efficiency charts are produced flexibly through queries against a powerful analytics cluster. Here we explore whether these techniques and this analytics ecosystem can be applied to add new modes of open, quick, and pervasive access to ATLAS event data so as to simplify access and broaden the reach of ATLAS public data to new communities of users. An ability to efficiently store, filter, search and deliver ATLAS data at the event and/or sub-event level in a widely supported format would enable or significantly simplify usage of big data, statistical and machine learning tools...

  6. Analytical Modelling Of Milling For Tool Design And Selection

    International Nuclear Information System (INIS)

    Fontaine, M.; Devillez, A.; Dudzinski, D.

    2007-01-01

    This paper presents an efficient analytical model which allows a large panel of milling operations to be simulated. A geometrical description of common end mills and of their engagement in the workpiece material is proposed. The internal radius of the rounded part of the tool envelope is used to define the considered type of mill. The cutting edge position is described for a constant lead helix and for a constant local helix angle. A thermomechanical approach to oblique cutting is applied to predict the forces acting on the tool, and these results are compared with experimental data obtained from milling tests on a 42CrMo4 steel for three classical types of mills. The influence of some of the tool's geometrical parameters on the predicted cutting forces is presented in order to propose optimisation criteria for the design and selection of cutting tools

  7. Applications of Spatial Data Using Business Analytics Tools

    Directory of Open Access Journals (Sweden)

    Anca Ioana ANDREESCU

    2011-12-01

    Full Text Available This paper addresses the possibilities of using spatial data in business analytics tools, with emphasis on SAS software. Various kinds of map data sets containing spatial data are presented and discussed. Examples of map charts illustrating macroeconomic parameters demonstrate the application of spatial data for the creation of map charts in SAS Enterprise Guide. Extended features of map charts are exemplified by producing charts via SAS programming procedures.

  8. Some developments in human reliability analysis approaches and tools

    Energy Technology Data Exchange (ETDEWEB)

    Hannaman, G W; Worledge, D H

    1988-01-01

    Since human actions have been recognized as an important contributor to the safety of operating plants in most industries, research has been performed to better understand and account for the way operators interact during accidents through the control room and equipment interface. This paper describes the integration of a series of research projects sponsored by the Electric Power Research Institute to strengthen the methods for performing the human reliability analysis portion of probabilistic safety studies. It focuses on the analytical framework used to guide the analysis, the development of the models for quantifying time-dependent actions, and the simulator experiments used to validate the models.
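
    The time-dependent operator-action models referenced here (such as the EPRI HCR correlation) describe the probability that a crew has not yet responded as a decreasing function of available time. A minimal sketch of such a Weibull-type non-response curve follows; the shape parameter and times are assumptions for illustration, not the calibrated EPRI coefficients:

```python
import math

def non_response_probability(t, t_median, beta):
    """Weibull-type crew non-response curve used in time-dependent HRA.

    t: time available; t_median: median crew response time (same units);
    beta: shape parameter (assumed here, not a calibrated EPRI value).
    Normalized so that exactly half the crews respond by t == t_median.
    """
    return math.exp(-math.log(2) * (t / t_median) ** beta)

# At t == t_median, half the crews have not yet responded:
print(non_response_probability(10.0, 10.0, beta=1.2))  # 0.5
```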

  9. Effects of Analytical and Holistic Scoring Patterns on Scorer Reliability in Biology Essay Tests

    Science.gov (United States)

    Ebuoh, Casmir N.

    2018-01-01

    The literature reveals that the patterns/methods of scoring essay tests have been criticized for not being reliable, and this unreliability is likely to be greater in internal examinations than in external examinations. The purpose of this study is to find out the effects of analytical and holistic scoring patterns on scorer reliability in…

  10. Micromechanical String Resonators: Analytical Tool for Thermal Characterization of Polymers

    DEFF Research Database (Denmark)

    Bose, Sanjukta; Schmid, Silvan; Larsen, Tom

    2014-01-01

    Resonant microstrings show promise as a new analytical tool for thermal characterization of polymers with only a few nanograms of sample. The detection of the glass transition temperature (Tg) of an amorphous poly(d,l-lactide) (PDLLA) and a semicrystalline poly(l-lactide) (PLLA) is investigated. The polymers are spray coated onto one side of the resonating microstrings. The resonance frequency and quality factor (Q) are measured simultaneously as a function of temperature. A change in the resonance frequency reflects a change in static tensile stress, which yields information about the Young's modulus of the polymer, and a change in Q reflects the change in damping of the polymer-coated string. The frequency response of the microstring is validated with an analytical model. From the frequency-independent tensile stress change, static Tg values of 40.6 and 57.6 °C were measured for PDLLA and PLLA, respectively.
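
    The link between tensile stress and resonance frequency that this abstract relies on is the standard string resonance formula, f_n = (n/2L)·√(σ/ρ); a softening polymer relaxes the stress and the frequency drops. The dimensions and material values below are illustrative assumptions, not the devices used in the study:

```python
import math

def string_resonance_frequency(length, stress, density, mode=1):
    """Resonance frequency of a tensile-stressed microstring.

    f_n = (n / 2L) * sqrt(sigma / rho): a drop in tensile stress
    (e.g., polymer softening at Tg) lowers the resonance frequency.
    SI units: length in m, stress in Pa, density in kg/m^3.
    """
    return (mode / (2 * length)) * math.sqrt(stress / density)

# Illustrative (assumed) values for a 200 um silicon-nitride string:
f_stiff = string_resonance_frequency(200e-6, 200e6, 3100.0)
f_soft = string_resonance_frequency(200e-6, 150e6, 3100.0)
print(f"{f_stiff/1e3:.0f} kHz -> {f_soft/1e3:.0f} kHz on stress relaxation")
```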

  11. Laser-induced plasma spectrometry: truly a surface analytical tool

    International Nuclear Information System (INIS)

    Vadillo, Jose M.; Laserna, J.

    2004-01-01

    For a long period, analytical applications of laser-induced plasma spectrometry (LIPS) were mainly restricted to overall quantitative determination of elemental composition in bulk solid samples. However, the introduction of new compact and reliable solid-state lasers and technological developments in multidimensional intensified detectors have made it possible to seek new analytical niches for LIPS where its analytical advantages (direct sampling from any material, irrespective of its conductive status, without sample preparation and with sensitivity adequate for many elements in different matrices) can be fully exploited. In this sense, the field of surface analysis could take advantage of these strengths, considering in addition the capability of LIPS for spot analysis, line scans, depth profiling, area analysis and compositional mapping with a single instrument, in air at atmospheric pressure. This review paper outlines the fundamental principles of laser-induced plasma emission relevant to sample surface studies, discusses the experimental parameters governing the spatial (lateral and in-depth) resolution in LIPS analysis and presents applications concerning surface examination

  12. A Tool Supporting Collaborative Data Analytics Workflow Design and Management

    Science.gov (United States)

    Zhang, J.; Bao, Q.; Lee, T. J.

    2016-12-01

    Collaborative experiment design could significantly enhance the sharing and adoption of the data analytics algorithms and models emerging in Earth science. Existing data-oriented workflow tools, however, are not suitable for supporting collaborative design of such workflows: among other gaps, they do not support real-time co-design; they cannot track how a workflow evolves over time based on changing designs contributed by multiple Earth scientists; and they do not capture and retrieve collaboration knowledge on workflow design (the discussions that lead to a design). To address these challenges, we have designed and developed a technique supporting collaborative data-oriented workflow composition and management, as a key component toward supporting big data collaboration through the Internet. Reproducibility and scalability are two major targets demanding fundamental infrastructural support. One outcome of the project is a software tool that supports an elastic number of groups of Earth scientists collaboratively designing and composing data analytics workflows through the Internet. Instead of recreating the wheel, we have extended the existing workflow tool VisTrails into an online collaborative environment as a proof of concept.

  13. Analytical Aerodynamic Simulation Tools for Vertical Axis Wind Turbines

    International Nuclear Information System (INIS)

    Deglaire, Paul

    2010-01-01

    Wind power is a renewable energy source that is today the fastest growing contribution to reducing CO2 emissions in the electric energy mix. The upwind horizontal axis wind turbine with three blades has been the preferred technical choice for more than two decades, and this horizontal axis concept is today clearly leading the market. The current PhD thesis covers an alternative type of wind turbine, with straight blades rotating about a vertical axis. A brief overview of the main differences between the horizontal and vertical axis concepts is given. However, the main focus of this thesis is the aerodynamics of the wind turbine blades. Making aerodynamically efficient turbines starts with efficient blades, and making efficient blades requires a good understanding of the physical phenomena and effective simulation tools to model them. The specific aerodynamics of straight-bladed vertical axis turbine flow are reviewed together with the standard aerodynamic simulation tools that have been used in the past by blade and rotor designers. A reasonably fast (regarding computer power) and accurate (regarding comparison with experimental results) simulation method was still lacking in the field prior to the current work; this thesis aims at designing such a method. Analytical methods can be used to model complex flow if the geometry is simple. Therefore, a conformal mapping method is derived to transform any set of sections into a set of standard circles. Analytical procedures are then generalized to simulate moving multibody sections in the complex vertical-axis flows and the forces experienced by the blades. Finally, the fast semi-analytical aerodynamic algorithm, boosted by fast multipole methods to handle a high number of vortices, is coupled with a simple structural model of the rotor to investigate potential aeroelastic instabilities. 
Together with these advanced simulation tools, a standard double multiple streamtube model has been developed and used to design several straight bladed

  14. An analytical framework for reliability growth of one-shot systems

    International Nuclear Information System (INIS)

    Hall, J. Brian; Mosleh, Ali

    2008-01-01

    In this paper, we introduce a new reliability growth methodology for one-shot systems that is applicable to the case where all corrective actions are implemented at the end of the current test phase. The methodology consists of four model equations for assessing: expected reliability; the expected number of failure modes observed in testing; the expected probability of discovering new failure modes; and the expected portion of system unreliability associated with repeat failure modes. These model equations provide an analytical framework with which reliability practitioners can estimate reliability improvement, address goodness-of-fit concerns, quantify programmatic risk, and assess the reliability maturity of one-shot systems. A numerical example is given to illustrate the value and utility of the presented approach. This methodology is useful to program managers and reliability practitioners interested in applying the techniques above in their reliability growth programs.
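
The flavor of such model equations can be conveyed by a simple projection sketch (hypothetical, not the paper's actual four equations): each observed failure mode i has an estimated occurrence probability p_i and a fix effectiveness factor d_i, and end-of-phase corrective actions remove the fraction d_i of each p_i:

```python
def projected_reliability(modes):
    """One-shot system reliability projected after corrective actions.

    modes: list of (p_i, d_i) pairs, where p_i is the estimated occurrence
    probability of observed failure mode i and d_i is the fraction of p_i
    removed by its fix (0 = no fix, 1 = perfect fix).
    """
    residual_unreliability = sum(p * (1.0 - d) for p, d in modes)
    return 1.0 - residual_unreliability

# Three observed modes; the last one is left unfixed (d = 0).
modes = [(0.05, 0.8), (0.02, 0.7), (0.01, 0.0)]
print(round(projected_reliability(modes), 3))  # → 0.974
```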

  15. Identity as an Analytical Tool to Explore Students’ Mathematical Writing

    DEFF Research Database (Denmark)

    Iversen, Steffen Møllegaard

    Learning to communicate in, with and about mathematics is a key part of learning mathematics (Niss & Højgaard, 2011). Therefore, understanding how students’ mathematical writing is shaped is important to mathematics education research. In this paper the notion of ‘writer identity’ (Ivanič, 1998; Burgess & Ivanič, 2010) is introduced and operationalised in relation to students’ mathematical writing, and through a case study it is illustrated how this analytic tool can facilitate valuable insights when exploring students’ mathematical writing.

  16. Analytical Tools to Improve Optimization Procedures for Lateral Flow Assays

    Directory of Open Access Journals (Sweden)

    Helen V. Hsieh

    2017-05-01

    Full Text Available Immunochromatographic or lateral flow assays (LFAs) are inexpensive, easy-to-use, point-of-care medical diagnostic tests that are found in arenas ranging from a doctor’s office in Manhattan to a rural medical clinic in low-resource settings. The simplicity of the LFA itself belies the complex task of optimization required to make the test sensitive, rapid and easy to use. Currently, manufacturers develop LFAs by empirical optimization of material components (e.g., analytical membranes, conjugate pads and sample pads), biological reagents (e.g., antibodies, blocking reagents and buffers) and the design of delivery geometry. In this paper, we review conventional optimization and then focus on the latter, outlining analytical tools, such as dynamic light scattering and optical biosensors, as well as methods, such as microfluidic flow design and mechanistic models. We are applying these tools to find non-obvious optima of lateral flow assays for improved sensitivity, specificity and manufacturing robustness.

  17. Reliability of the Hazelbaker Assessment Tool for Lingual Frenulum Function

    Directory of Open Access Journals (Sweden)

    James Jennifer P

    2006-03-01

    Full Text Available Abstract Background About 3% of infants are born with a tongue-tie, which may lead to breastfeeding problems such as ineffective latch, painful attachment or poor weight gain. The Hazelbaker Assessment Tool for Lingual Frenulum Function (HATLFF) has been developed to give a quantitative assessment of the tongue-tie and a recommendation about frenotomy (release of the frenulum). The aim of this study was to assess the inter-rater reliability of the HATLFF. Methods Fifty-eight infants referred to the Breastfeeding Education and Support Services (BESS) at The Royal Women's Hospital for assessment of tongue-tie and 25 control infants were assessed by two clinicians independently. Results The Appearance items received kappas between about 0.4 and 0.6, which represents "moderate" reliability. The first three Function items (lateralization, lift and extension of tongue) had kappa values over 0.65, which indicates "substantial" agreement. The four Function items relating to infant sucking (spread, cupping, peristalsis and snapback) received low kappa values with insignificant p values. There was 96% agreement between the two assessors on the recommendation for frenotomy (kappa 0.92, excellent agreement). The study found that the Function Score can be more simply assessed using only the first three function items (i.e., not scoring the sucking items), with a cut-off of ≤4 for recommendation of frenotomy. Conclusion We found that the HATLFF has high reliability in a study of infants with tongue-tie and control infants.
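
Cohen's kappa, the agreement statistic reported throughout the record above, corrects raw agreement for chance: kappa = (p_o − p_e) / (1 − p_e). A minimal sketch with invented ratings (not the study's data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters scoring the same items."""
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n  # observed agreement
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2  # chance agreement
    return (p_o - p_e) / (1 - p_e)

# Two assessors scoring ten infants on a yes/no frenotomy recommendation.
a = ["yes", "yes", "no", "no", "yes", "no", "yes", "yes", "no", "no"]
b = ["yes", "yes", "no", "no", "yes", "no", "yes", "no", "no", "no"]
print(round(cohens_kappa(a, b), 2))  # → 0.8
```

Here 90% raw agreement deflates to kappa = 0.8 once the 50% chance-agreement rate is removed, which by the usual benchmarks is "substantial" to "excellent" agreement.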

  18. 33 CFR 385.33 - Revisions to models and analytical tools.

    Science.gov (United States)

    2010-07-01

    33 CFR 385.33, Navigation and Navigable Waters, CORPS OF ENGINEERS, DEPARTMENT OF THE... Incorporating New Information Into the Plan. § 385.33 Revisions to models and analytical tools. (a) In carrying... determines on a case-by-case basis what documentation is appropriate for revisions to models and analytic tools...

  19. Distributed Engine Control Empirical/Analytical Verification Tools

    Science.gov (United States)

    DeCastro, Jonathan; Hettler, Eric; Yedavalli, Rama; Mitra, Sayan

    2013-01-01

    NASA's vision for an intelligent engine will be realized with the development of a truly distributed control system featuring highly reliable, modular, and dependable components capable of both surviving the harsh engine operating environment and providing decentralized functionality. A set of control system verification tools was developed and applied to a C-MAPSS40K engine model, and metrics were established to assess the stability and performance of these control systems on the same platform. A software tool was developed that allows designers to easily assemble a distributed control system in software and immediately assess the overall impacts of the system on the target (simulated) platform, allowing control system designers to converge rapidly on acceptable architectures with consideration of all required hardware elements. The software developed in this program will be installed on a distributed hardware-in-the-loop (DHIL) simulation tool to assist NASA and the Distributed Engine Control Working Group (DECWG) in integrating distributed engine control system (DCS) components onto existing and next-generation engines. The distributed engine control simulator blockset for MATLAB/Simulink and the hardware simulator provide the capability to simulate virtual subcomponents, as well as to swap in actual subcomponents for hardware-in-the-loop (HIL) analysis. Subcomponents can be the communication network, smart sensor or actuator nodes, or a centralized control system. The distributed engine control blockset for MATLAB/Simulink is a software development tool. The software includes an engine simulation, a communication network simulation, control algorithms, and analysis algorithms set up in a modular environment for rapid simulation of different network architectures; the hardware consists of an embedded device running parts of the C-MAPSS engine simulator and controlled through Simulink. The distributed engine control simulation, evaluation, and analysis technology provides unique

  20. Analytical Tools for the Routine Use of the UMAE

    International Nuclear Information System (INIS)

    D'Auria, F.; Eramo, A.; Gianotti, W.

    1998-01-01

    UMAE (Uncertainty Methodology based on Accuracy Extrapolation) is a methodology developed to calculate the uncertainties related to thermal-hydraulic code results in nuclear power plant transient analysis. Use of the methodology has shown the need for specific analytical tools to simplify some steps of its application and to make individual procedures adopted in its development clearer. Three of these have recently been completed and are illustrated in this paper. The first makes it possible to attribute ''weight factors'' to the experimental Integral Test Facilities; results are also shown. The second deals with the calculation of the accuracy of code results: the computer program concerned compares experimental and calculated trends of any quantity and gives as output the accuracy of the calculation. The third is a computer program suitable for obtaining continuous uncertainty bands from single-valued points. (author)
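
The second tool's job, comparing an experimental and a calculated trend and returning an accuracy figure, can be sketched with a simple discrepancy metric (a hypothetical stand-in; the record does not specify the actual UMAE accuracy measure):

```python
def trend_accuracy(measured, calculated):
    """Mean relative discrepancy between two trends; 0.0 = perfect agreement."""
    assert len(measured) == len(calculated)
    return sum(
        abs(c - m) / abs(m) for m, c in zip(measured, calculated) if m != 0
    ) / len(measured)

# Invented trend of some quantity (e.g. pressure) vs. its code prediction.
measured = [10.0, 9.0, 8.5, 8.0]
calculated = [10.5, 9.0, 8.0, 7.6]
print(round(trend_accuracy(measured, calculated), 3))  # → 0.04
```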

  1. HiRel: Hybrid Automated Reliability Predictor (HARP) integrated reliability tool system, (version 7.0). Volume 3: HARP Graphics Oriented (GO) input user's guide

    Science.gov (United States)

    Bavuso, Salvatore J.; Rothmann, Elizabeth; Mittal, Nitin; Koppen, Sandra Howell

    1994-01-01

    The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. HiRel consists of interactive graphical input/output programs and four reliability/availability modeling engines that provide analytical and simulative solutions to a wide host of highly reliable fault-tolerant system architectures and is also applicable to electronic systems in general. The tool system was designed at the outset to be compatible with most computing platforms and operating systems, and some programs have been beta tested within the aerospace community for over 8 years. This document is a user's guide for the HiRel graphical preprocessor Graphics Oriented (GO) program. GO is a graphical user interface for the HARP engine that enables the drawing of reliability/availability models on a monitor. A mouse is used to select fault tree gates or Markov graphical symbols from a menu for drawing.

  2. HiRel: Hybrid Automated Reliability Predictor (HARP) integrated reliability tool system, (version 7.0). Volume 4: HARP Output (HARPO) graphics display user's guide

    Science.gov (United States)

    Sproles, Darrell W.; Bavuso, Salvatore J.

    1994-01-01

    The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. HiRel consists of interactive graphical input/output programs and four reliability/availability modeling engines that provide analytical and simulative solutions to a wide host of highly reliable fault-tolerant system architectures and is also applicable to electronic systems in general. The tool system was designed at the outset to be compatible with most computing platforms and operating systems and some programs have been beta tested within the aerospace community for over 8 years. This document is a user's guide for the HiRel graphical postprocessor program HARPO (HARP Output). HARPO reads ASCII files generated by HARP. It provides an interactive plotting capability that can be used to display alternate model data for trade-off analyses. File data can also be imported to other commercial software programs.

  3. Towards early software reliability prediction for computer forensic tools (case study).

    Science.gov (United States)

    Abu Talib, Manar

    2016-01-01

    Versatility, flexibility and robustness are essential requirements for software forensic tools. Researchers and practitioners need to put more effort into assessing this type of tool. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component based system. It is used, for instance, to analyze the reliability of the state machines of real time reactive systems. This research extends the architecture-based software reliability prediction model for computer forensic tools, which is based on Markov chains and COSMIC-FFP. Basically, every part of the computer forensic tool is linked to a discrete time Markov chain. If this can be done, then a probabilistic analysis by Markov chains can be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study.
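
The component-to-Markov-chain linkage described above can be illustrated with a generic Cheung-style absorbing chain (a sketch, not the paper's COSMIC-FFP-based model): components are transient states, and tool reliability is the probability of absorption in a success state rather than a failure state:

```python
def absorption_probability(P, start, success, iters=1000):
    """Probability of ending in `success`, by power iteration over the
    transition matrix P given as a dict of {state: {next_state: prob}}."""
    probs = {s: 1.0 if s == start else 0.0 for s in P}
    for _ in range(iters):
        nxt = {s: 0.0 for s in P}
        for s, row in P.items():
            for t, p in row.items():
                nxt[t] += probs[s] * p
        probs = nxt
    return probs[success]

# Two tool components in series, each 0.99 reliable per invocation;
# "ok" and "fail" are absorbing states.
P = {
    "c1": {"c2": 0.99, "fail": 0.01},
    "c2": {"ok": 0.99, "fail": 0.01},
    "ok": {"ok": 1.0},
    "fail": {"fail": 1.0},
}
print(round(absorption_probability(P, "c1", "ok"), 4))  # → 0.9801
```

Swapping in branching or looping transitions between components is how alternative tool topologies would be compared for overall reliability.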

  4. Network analytical tool for monitoring global food safety highlights China.

    Directory of Open Access Journals (Sweden)

    Tamás Nepusz

    Full Text Available BACKGROUND: The Beijing Declaration on food safety and security was signed by over fifty countries with the aim of developing comprehensive programs for monitoring food safety and security on behalf of their citizens. Currently, comprehensive systems for food safety and security are absent in many countries, and the systems that are in place have been developed on different principles, allowing poor opportunities for integration. METHODOLOGY/PRINCIPAL FINDINGS: We have developed a user-friendly analytical tool based on network approaches for instant customized analysis of food alert patterns in the European dataset from the Rapid Alert System for Food and Feed. Data taken from alert logs between January 2003 and August 2008 were processed using network analysis to (i) capture complexity, (ii) analyze trends, and (iii) predict possible effects of interventions by identifying patterns of reporting activities between countries. The detector and transgressor relationships are readily identifiable between countries, which are ranked using (i) Google's PageRank algorithm and (ii) the HITS algorithm of Kleinberg. The program identifies Iran, China and Turkey as the transgressors with the largest number of alerts. However, when characterized by impact, counting the transgressor index and the number of countries involved, China predominates as a transgressor country. CONCLUSIONS/SIGNIFICANCE: This study reports the first development of a network analysis approach to inform countries on their transgressor and detector profiles as a user-friendly aid for the adoption of the Beijing Declaration. The ability to instantly access the country-specific components of the several thousand annual reports will enable each country to identify the major transgressors and detectors within its trading network. Moreover, the tool can be used to monitor trading countries for improved detector/transgressor ratios.
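
The ranking step can be reproduced in miniature with plain power-iteration PageRank over a hypothetical detector→transgressor alert network (toy edges, not the RASFF dataset):

```python
def pagerank(edges, damping=0.85, iters=100):
    """Plain power-iteration PageRank over a directed edge list."""
    nodes = {n for e in edges for n in e}
    out = {n: [t for s, t in edges if s == n] for n in nodes}
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iters):
        nxt = {n: (1 - damping) / len(nodes) for n in nodes}
        for n in nodes:
            if out[n]:
                share = damping * rank[n] / len(out[n])
                for t in out[n]:
                    nxt[t] += share
            else:  # dangling node: spread its rank evenly
                for t in nodes:
                    nxt[t] += damping * rank[n] / len(nodes)
        rank = nxt
    return rank

# Hypothetical "detector reports transgressor" alert edges.
edges = [("DE", "CN"), ("FR", "CN"), ("IT", "CN"), ("DE", "TR"), ("CN", "TR")]
r = pagerank(edges)
assert r["CN"] > r["DE"]  # many incoming alerts -> higher transgressor rank
```

In this toy network the node with the most incoming alert edges accumulates the most rank, which is the intuition behind ranking transgressor countries by impact rather than raw alert counts.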

  5. Laser: a Tool for Optimization and Enhancement of Analytical Methods

    Energy Technology Data Exchange (ETDEWEB)

    Preisler, Jan [Iowa State Univ., Ames, IA (United States)

    1997-01-01

    In this work, we use lasers to enhance the possibilities of laser desorption methods and to optimize a coating procedure for capillary electrophoresis (CE). We use several different instrumental arrangements to characterize matrix-assisted laser desorption (MALD) at atmospheric pressure and in vacuum. In imaging mode, a 488-nm argon-ion laser beam is deflected by two acousto-optic deflectors to scan, via absorption, plumes desorbed at atmospheric pressure. All absorbing species, including neutral molecules, are monitored. Interesting features are observed, e.g. differences between the initial plume and subsequent plumes desorbed from the same spot, or the formation of two plumes from one laser shot. Total plume absorbance can be correlated with the acoustic signal generated by the desorption event. A model equation for the plume velocity as a function of time is proposed. Alternatively, the use of a static laser beam for observation enables reliable determination of plume velocities even when they are very high. Static scattering detection reveals the negative influence of particle spallation on the MS signal. Ion formation during MALD was monitored using 193-nm light to photodissociate a portion of the insulin ion plume. These results define the optimal conditions for desorbing analytes from matrices, as opposed to achieving a compromise between efficient desorption and efficient ionization as is practiced in mass spectrometry. In the CE experiment, we examined changes in a poly(ethylene oxide) (PEO) coating by continuously monitoring the electroosmotic flow (EOF) in a fused-silica capillary during electrophoresis. An imaging CCD camera was used to follow the motion of a fluorescent neutral marker zone, excited by a 488-nm Ar-ion laser, along the length of the capillary. The PEO coating was shown to reduce the velocity of EOF by more than an order of magnitude compared to a bare capillary at pH 7.0. The coating protocol was important, especially at an intermediate pH of 7.7. 
The increase of p

  6. Using complaints to enhance quality improvement: developing an analytical tool.

    Science.gov (United States)

    Hsieh, Sophie Yahui

    2012-01-01

    This study aims to construct an instrument for identifying certain attributes or capabilities that might enable healthcare staff to use complaints to improve service quality. PubMed and ProQuest were searched, which in turn expanded access to other literature. Three paramount dimensions emerged for healthcare quality management systems: managerial, operational, and technical (MOT). The paper reveals that the managerial dimension relates to quality improvement program infrastructure; it contains strategy, structure, leadership, people and culture. The operational dimension relates to implementation processes: organizational changes and barriers when using complaints to enhance quality. The technical dimension emphasizes the skills, techniques or information systems required to successfully achieve continuous quality improvement. The MOT model was developed by drawing from the relevant literature. However, individuals have different training, interests and experiences and, therefore, there will be variance between researchers when generating the MOT model. The MOT components can be guidelines for examining whether patient complaints are used to improve service quality. However, the model needs testing and validating through further research before becoming a theory. Empirical studies on patient complaints did not identify any analytical tool that could be used to explore how complaints can drive quality improvement. This study developed an instrument for identifying certain attributes or capabilities that might enable healthcare professionals to use complaints and improve service quality.

  7. Evidential analytic hierarchy process dependence assessment methodology in human reliability analysis

    International Nuclear Information System (INIS)

    Chen, Lu Yuan; Zhou, Xinyi; Xiao, Fuyuan; Deng, Yong; Mahadevan, Sankaran

    2017-01-01

    In human reliability analysis, dependence assessment is an important issue in risky large complex systems, such as operation of a nuclear power plant. Many existing methods depend on an expert's judgment, which contributes to the subjectivity and restrictions of results. Recently, a computational method, based on the Dempster-Shafer evidence theory and analytic hierarchy process, has been proposed to handle the dependence in human reliability analysis. The model can deal with uncertainty in an analyst's judgment and reduce the subjectivity in the evaluation process. However, the computation is heavy and complicated to some degree. The most important issue is that the existing method considers only the positive aspect of dependence, which may cause an underestimation of the risk. In this study, a new evidential analytic hierarchy process dependence assessment methodology, based on the improvement of existing methods, has been proposed, which is expected to be easier and more effective.

  8. Evidential Analytic Hierarchy Process Dependence Assessment Methodology in Human Reliability Analysis

    Directory of Open Access Journals (Sweden)

    Luyuan Chen

    2017-02-01

    Full Text Available In human reliability analysis, dependence assessment is an important issue in risky large complex systems, such as operation of a nuclear power plant. Many existing methods depend on an expert's judgment, which contributes to the subjectivity and restrictions of results. Recently, a computational method, based on the Dempster–Shafer evidence theory and analytic hierarchy process, has been proposed to handle the dependence in human reliability analysis. The model can deal with uncertainty in an analyst's judgment and reduce the subjectivity in the evaluation process. However, the computation is heavy and complicated to some degree. The most important issue is that the existing method considers only the positive aspect of dependence, which may cause an underestimation of the risk. In this study, a new evidential analytic hierarchy process dependence assessment methodology, based on the improvement of existing methods, has been proposed, which is expected to be easier and more effective.

  9. Evidential analytic hierarchy process dependence assessment methodology in human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Lu Yuan; Zhou, Xinyi; Xiao, Fuyuan; Deng, Yong [School of Computer and Information Science, Southwest University, Chongqing (China); Mahadevan, Sankaran [School of Engineering, Vanderbilt University, Nashville (United States)

    2017-02-15

    In human reliability analysis, dependence assessment is an important issue in risky large complex systems, such as operation of a nuclear power plant. Many existing methods depend on an expert's judgment, which contributes to the subjectivity and restrictions of results. Recently, a computational method, based on the Dempster-Shafer evidence theory and analytic hierarchy process, has been proposed to handle the dependence in human reliability analysis. The model can deal with uncertainty in an analyst's judgment and reduce the subjectivity in the evaluation process. However, the computation is heavy and complicated to some degree. The most important issue is that the existing method considers only the positive aspect of dependence, which may cause an underestimation of the risk. In this study, a new evidential analytic hierarchy process dependence assessment methodology, based on the improvement of existing methods, has been proposed, which is expected to be easier and more effective.
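
The Dempster–Shafer combination step underlying the methodology in the records above can be sketched directly; the two mass functions below are invented toy numbers, not values from the paper:

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions whose focal
    elements are frozensets; conflicting mass is renormalized away."""
    combined, conflict = {}, 0.0
    for a, p in m1.items():
        for b, q in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + p * q
            else:
                conflict += p * q
    k = 1.0 - conflict  # normalization constant
    return {s: v / k for s, v in combined.items()}

# Two analysts' belief masses about a dependence level in {low, high}.
m1 = {frozenset({"low"}): 0.6, frozenset({"low", "high"}): 0.4}
m2 = {frozenset({"low"}): 0.5, frozenset({"high"}): 0.3,
      frozenset({"low", "high"}): 0.2}
m = dempster_combine(m1, m2)
print(round(m[frozenset({"low"})], 3))  # → 0.756
```

Combining evidence this way, rather than averaging point judgments, is what lets the method carry an analyst's uncertainty (mass on the whole frame {low, high}) through the assessment.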

  10. Big data analytics for the Future Circular Collider reliability and availability studies

    Science.gov (United States)

    Begy, Volodimir; Apollonio, Andrea; Gutleber, Johannes; Martin-Marquez, Manuel; Niemi, Arto; Penttinen, Jussi-Pekka; Rogova, Elena; Romero-Marin, Antonio; Sollander, Peter

    2017-10-01

    Responding to the European Strategy for Particle Physics update 2013, the Future Circular Collider study explores scenarios of circular frontier colliders for the post-LHC era. One branch of the study assesses industrial approaches to model and simulate the reliability and availability of the entire particle collider complex based on the continuous monitoring of CERN’s accelerator complex operation. The modelling is based on an in-depth study of the CERN injector chain and LHC, and is carried out as a cooperative effort with the HL-LHC project. The work so far has revealed that a major challenge is obtaining accelerator monitoring and operational data with sufficient quality, to automate the data quality annotation and calculation of reliability distribution functions for systems, subsystems and components where needed. A flexible data management and analytics environment that permits integrating the heterogeneous data sources, the domain-specific data quality management algorithms and the reliability modelling and simulation suite is a key enabler to complete this accelerator operation study. This paper describes the Big Data infrastructure and analytics ecosystem that has been put in operation at CERN, serving as the foundation on which reliability and availability analysis and simulations can be built. This contribution focuses on data infrastructure and data management aspects and presents case studies chosen for its validation.
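
One concrete instance of "calculation of reliability distribution functions" from operational data: fit an exponential time-between-failure model by maximum likelihood and evaluate its survival function (a toy sketch; the uptimes below are invented, not CERN monitoring data):

```python
import math

def exponential_mle_rate(uptimes_h):
    """MLE failure rate for an exponential model: lambda = n / total uptime."""
    return len(uptimes_h) / sum(uptimes_h)

def reliability(t_h, rate):
    """R(t) = exp(-lambda * t): probability of surviving t hours."""
    return math.exp(-rate * t_h)

# Hypothetical times between failures (hours) mined from operation logs.
uptimes = [120.0, 95.0, 210.0, 160.0, 75.0]
lam = exponential_mle_rate(uptimes)
print(round(reliability(24.0, lam), 3))  # → 0.834
```

In practice, data-quality annotation (as described in the record) decides which logged intervals are trustworthy enough to enter such a fit, and richer distributions (Weibull, log-normal) replace the exponential where the failure rate is not constant.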

  11. Neutron activation analysis as analytical tool of environmental issue

    International Nuclear Information System (INIS)

    Otoshi, Tsunehiko

    2004-01-01

    Neutron activation analysis (NAA) is applicable to samples from a wide range of research fields, such as material science, biology, geochemistry and so on. Given the advantages of NAA, samples available only in small amounts, or precious samples, are the most suitable for it, because NAA is capable of trace analysis and non-destructive determination. In this paper, among these fields, NAA of atmospheric particulate matter (PM) samples is discussed, with emphasis on the use of the obtained data as an analytical tool for environmental issues. The concentration of PM in air is usually very low, and it is not easy to collect a large amount of sample even using a high-volume air sampling device. Therefore, highly sensitive NAA is suitable for determining elements in PM samples. The main components of PM are crust-derived silicates and the like in rural/remote areas, whereas carbonaceous materials and heavy metals are concentrated in PM in urban areas because of automobile exhaust and other anthropogenic emission sources. The elemental pattern of PM reflects the condition of the air around the monitoring site, and trends in air pollution can be traced by periodic monitoring of PM by NAA. Elemental concentrations in air change by season: for example, crustal elements increase in the dry season, and sea salt components increase in concentration when wind from the sea is dominant. Elements emitted from anthropogenic sources are mainly contained in the fine fraction of PM and increase in concentration during the winter season, when emission from heating systems is high and the air is stable. For further analysis and understanding of environmental issues, indicator elements for various emission sources, elemental concentration ratios of some environmental samples, and source apportionment techniques are useful. (author)

  12. Factors Influencing Beliefs for Adoption of a Learning Analytics Tool: An Empirical Study

    Science.gov (United States)

    Ali, Liaqat; Asadi, Mohsen; Gasevic, Dragan; Jovanovic, Jelena; Hatala, Marek

    2013-01-01

    Present research and development offer various learning analytics tools providing insights into different aspects of learning processes. Adoption of a specific tool for practice is based on how its learning analytics are perceived by educators to support their pedagogical and organizational goals. In this paper, we propose and empirically validate…

  13. Analytical quality by design: a tool for regulatory flexibility and robust analytics.

    Science.gov (United States)

    Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDAs) with regulatory flexibility for a quality-by-design (QbD) based analytical approach. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results, owing to the robustness of the method within that region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists on the implementation of AQbD in the pharmaceutical quality system and also relates it to product quality by design and process analytical technology (PAT).

  14. Factors Influencing Attitudes Towards the Use of CRM’s Analytical Tools in Organizations

    Directory of Open Access Journals (Sweden)

    Šebjan Urban

    2016-02-01

    Full Text Available Background and Purpose: Information solutions for analytical customer relationship management (aCRM IS) that include the use of analytical tools are becoming increasingly important, due to organizations' need for knowledge of their customers and the ability to manage big data. The objective of the research is, therefore, to determine how the organizations' orientations (process, innovation, and technology), as critical organizational factors, affect the attitude towards the use of the analytical tools of aCRM IS.

  15. Improving Wind Turbine Drivetrain Reliability Using a Combined Experimental, Computational, and Analytical Approach

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Y.; van Dam, J.; Bergua, R.; Jove, J.; Campbell, J.

    2015-03-01

    Nontorque loads induced by the wind turbine rotor overhang weight and aerodynamic forces can greatly affect drivetrain loads and responses. If not addressed properly, these loads can result in a decrease in gearbox component life. This work uses analytical modeling, computational modeling, and experimental data to evaluate a unique drivetrain design that minimizes the effects of nontorque loads on gearbox reliability: the Pure Torque(R) drivetrain developed by Alstom. The drivetrain has a hub-support configuration that transmits nontorque loads directly into the tower rather than through the gearbox as in other design approaches. An analytical model of Alstom's Pure Torque drivetrain provides insight into the relationships among turbine component weights, aerodynamic forces, and the resulting drivetrain loads. Main shaft bending loads are orders of magnitude lower than the rated torque and are hardly affected by wind conditions and turbine operations.

  16. Advances in analytical tools for high throughput strain engineering

    DEFF Research Database (Denmark)

    Marcellin, Esteban; Nielsen, Lars Keld

    2018-01-01

    The emergence of inexpensive, base-perfect genome editing is revolutionising biology. Modern industrial biotechnology exploits the advances in genome editing in combination with automation, analytics and data integration to build high-throughput automated strain engineering pipelines also known as biofoundries. Biofoundries replace the slow and inconsistent artisanal processes used to build microbial cell factories with an automated design–build–test cycle, considerably reducing the time needed to deliver commercially viable strains. Testing and hence learning remains relatively shallow, but recent advances in analytical chemistry promise to increase the depth of characterization possible. Analytics combined with models of cellular physiology in automated systems biology pipelines should enable deeper learning and hence a steeper pitch of the learning cycle. This review explores the progress...

  17. Analytical Tools for Functional Assessment of Architectural Layouts

    Science.gov (United States)

    Bąkowski, Jarosław

    2017-10-01

    Functional layout of the building, understood as a layout or set of the facility's rooms (or groups of rooms) with a system of internal communication, creates an environment and a place of mutual relations between the occupants of the object. Achieving a spatial arrangement that is optimal from the occupants' point of view is possible through activities that often go beyond the stage of architectural design. A functional layout adopted in architectural design, most often through a trial-and-error process or on the basis of previous experience (evidence-based design), is subject to continuous evaluation and dynamic change from the beginning of its use. Such verification during the occupancy phase makes it possible to plan future transformations, as well as to develop model solutions for use in other settings. In broader terms, the research hypothesis is to examine whether and how the collected datasets concerning the facility and its utilization can be used to develop methods for assessing the functional layout of buildings; in other words, whether it is possible to develop an objective method of assessing functional layouts based on a set of the building's technical, technological and functional parameters, and whether the method allows the development of a set of tools enhancing the design methodology of functionally complex objects. By linking the design with the construction phase it is possible to build parametric models of functional layouts, especially in the context of sustainable design or lean design in every aspect: ecological (by reducing the property's impact on the environment), economic (by optimizing its cost) and social (through the implementation of a high-performance work environment). Parameterization of the size and functional connections of the facility becomes part of the analyses, as well as an element of model solutions. The "lean" approach means the process of analysing the existing scheme and, consequently, finding weak points as well as means for eliminating these

  18. Electrochemical sensors: a powerful tool in analytical chemistry

    Directory of Open Access Journals (Sweden)

    Stradiotto Nelson R.

    2003-01-01

    Full Text Available Potentiometric, amperometric and conductometric electrochemical sensors have found a number of interesting applications in the areas of environmental, industrial, and clinical analyses. This review presents a general overview of the three main types of electrochemical sensors, describing fundamental aspects, developments and their contribution to the area of analytical chemistry, relating relevant aspects of the development of electrochemical sensors in Brazil.

  19. A Valid and Reliable Tool to Assess Nursing Students' Clinical Performance

    OpenAIRE

    Mehrnoosh Pazargadi; Tahereh Ashktorab; Sharareh Khosravi; Hamid Alavi majd

    2013-01-01

    Background: The necessity of a valid and reliable assessment tool is one of the most frequently raised issues in nursing students' clinical evaluation, but it is believed that present tools are mostly not valid and cannot assess students' performance properly. Objectives: This study was conducted to design a valid and reliable assessment tool for evaluating nursing students' performance in clinical education. Methods: In this methodological study considering nursing students' performance definition; th...

  20. Reliability of the ECHOWS Tool for Assessment of Patient Interviewing Skills.

    Science.gov (United States)

    Boissonnault, Jill S; Evans, Kerrie; Tuttle, Neil; Hetzel, Scott J; Boissonnault, William G

    2016-04-01

    History taking is an important component of patient/client management. Assessment of student history-taking competency can be achieved via a standardized tool. The ECHOWS tool has been shown to be valid, with modest intrarater reliability in a previous study, but that study did not demonstrate sufficient power to definitively prove its stability. The purposes of this study were: (1) to assess the reliability of the ECHOWS tool for student assessment of patient interviewing skills and (2) to determine whether the tool discerns between novice and experienced skill levels. A reliability and construct validity assessment was conducted. Three faculty members from the United States and Australia scored videotaped histories from standardized patients taken by students and experienced clinicians from each of these countries. The tapes were scored twice, 3 to 6 weeks apart. Reliability was assessed using intraclass correlation coefficients (ICCs). Repeated-measures analysis of variance models assessed the ability of the tool to discern between novice and experienced skill levels. The ECHOWS tool showed excellent intrarater reliability (ICC [3,1]=.74-.89) and good interrater reliability (ICC [2,1]=.55) as a whole. The summary of performance (S) section showed poor interrater reliability (ICC [2,1]=.27). There was no statistical difference in performance on the tool between novice and experienced clinicians. A possible ceiling effect may occur when standardized patients are not coached to provide complex and obtuse responses to interviewer questions. Variation in familiarity with the ECHOWS tool and in use of the online training may have influenced scoring of the S section. The ECHOWS tool demonstrates excellent intrarater reliability and moderate interrater reliability. Sufficient training with the tool prior to student assessment is recommended. The S section must evolve in order to provide a more discerning measure of interviewing skills. © 2016 American Physical Therapy
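    The ICC figures quoted above can be illustrated with a minimal sketch. The ratings below are invented for illustration, and the formula is the standard Shrout–Fleiss ICC(3,1) (two-way mixed, consistency, single rater), not the study's own code.

```python
# Illustrative sketch: ICC(3,1) from a subjects-by-raters score table,
# computed from two-way ANOVA mean squares (Shrout & Fleiss convention).
def icc_3_1(scores):
    """scores: list of subjects, each a list of k ratings (one per rater)."""
    n = len(scores)      # number of subjects
    k = len(scores[0])   # number of raters
    grand = sum(sum(row) for row in scores) / (n * k)
    subj_means = [sum(row) / k for row in scores]
    rater_means = [sum(row[j] for row in scores) / n for j in range(k)]
    ss_total = sum((x - grand) ** 2 for row in scores for x in row)
    ss_subj = k * sum((m - grand) ** 2 for m in subj_means)
    ss_rater = n * sum((m - grand) ** 2 for m in rater_means)
    ss_err = ss_total - ss_subj - ss_rater
    ms_subj = ss_subj / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_subj - ms_err) / (ms_subj + (k - 1) * ms_err)

# Hypothetical scores: 4 students each rated by 3 raters.
ratings = [[8, 9, 8], [5, 6, 5], [9, 9, 10], [3, 4, 3]]
print(round(icc_3_1(ratings), 3))  # → 0.968
```

    Values above roughly .75 are conventionally read as excellent reliability, which is how the intrarater range .74-.89 above is interpreted.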

  1. Reliability Centered Maintenance as a tool for plant life extension

    International Nuclear Information System (INIS)

    Elliott, J.O.; Mulay, J.N.; Nakahara, Y.

    1991-01-01

    Currently in the nuclear industry there is a growing interest in lowering the cost and complexity of maintenance activities while at the same time improving plant reliability and safety in an effort to prepare for the technical and regulatory challenges of life extension. This seemingly difficult task is being aided by the introduction of a maintenance philosophy developed originally by the airline industry and subsequently applied with great success both in that industry and the U.S. military services. Reliability Centered Maintenance (RCM), in its basic form, may be described as a consideration of reliability and maintenance problems from a systems level approach, allowing a focus on preservation of system function as the aim of a maintenance program optimized for both safety and economics. It is this systematic view of plant maintenance, with the emphasis on overall functions rather than individual parts and components which sets RCM apart from past nuclear plant maintenance philosophies. It is also the factor which makes application of RCM an ideal first step in development of strategies for life extension, both for aging plants, and for plants just beginning their first license term. (J.P.N.)

  2. Stable isotope ratio analysis: A potential analytical tool for the authentication of South African lamb meat.

    Science.gov (United States)

    Erasmus, Sara Wilhelmina; Muller, Magdalena; van der Rijst, Marieta; Hoffman, Louwrens Christiaan

    2016-02-01

    Stable isotope ratios ((13)C/(12)C and (15)N/(14)N) of South African Dorper lambs from farms with different vegetation types were measured by isotope ratio mass spectrometry (IRMS), to evaluate it as a tool for the authentication of origin and feeding regime. Homogenised and defatted meat of the Longissimus lumborum (LL) muscle of lambs from seven different farms was assessed. The δ(13)C values were affected by the origin of the meat, mainly reflecting the diet. The Rûens and Free State farms had the lowest (p ⩽ 0.05) δ(15)N values, followed by the Northern Cape farms, with Hantam Karoo/Calvinia having the highest δ(15)N values. Discriminant analysis showed δ(13)C and δ(15)N differences as promising results for the use of IRMS as a reliable analytical tool for lamb meat authentication. The results suggest that diet, linked to origin, is an important factor to consider regarding region of origin classification for South African lamb. Copyright © 2015 Elsevier Ltd. All rights reserved.
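    The classification idea behind the discriminant analysis above can be sketched very simply as a nearest-centroid rule in the (δ13C, δ15N) plane. The centroid and sample values below are invented for illustration and are not measured data from the study.

```python
# Hypothetical sketch of region-of-origin classification from stable
# isotope ratios: assign a sample to the nearest regional centroid.
import math

# Made-up training centroids: mean (d13C, d15N) per region (per mil).
centroids = {
    "Rûens": (-27.0, 4.0),
    "Free State": (-16.5, 4.5),
    "Hantam Karoo": (-24.0, 9.5),
}

def classify(sample):
    """Return the region whose centroid is closest to the (d13C, d15N) pair."""
    return min(centroids, key=lambda r: math.dist(sample, centroids[r]))

print(classify((-23.5, 9.1)))  # → Hantam Karoo
```

    A proper discriminant analysis additionally accounts for within-group covariance, but the geometric intuition is the same.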

  3. Mapping healthcare systems: a policy relevant analytic tool.

    Science.gov (United States)

    Sekhri Feachem, Neelam; Afshar, Ariana; Pruett, Cristina; Avanceña, Anton L V

    2017-07-01

    In the past decade, an international consensus on the value of well-functioning systems has driven considerable health systems research. This research falls into two broad categories. The first provides conceptual frameworks that take complex healthcare systems and create simplified constructs of interactions and functions. The second focuses on granular inputs and outputs. This paper presents a novel translational mapping tool - the University of California, San Francisco mapping tool (the Tool) - which bridges the gap between these two areas of research, creating a platform for multi-country comparative analysis. Using the Murray-Frenk framework, we create a macro-level representation of a country's structure, focusing on how it finances and delivers healthcare. The map visually depicts the fundamental policy questions in healthcare system design: funding sources and amount spent through each source, purchasers, populations covered, provider categories; and the relationship between these entities. We use the Tool to provide a macro-level comparative analysis of the structure of India's and Thailand's healthcare systems. As part of the systems strengthening arsenal, the Tool can stimulate debate about the merits and consequences of different healthcare systems structural designs, using a common framework that fosters multi-country comparative analyses. © The Author 2017. Published by Oxford University Press on behalf of Royal Society of Tropical Medicine and Hygiene.

  4. Physics-Based Probabilistic Design Tool with System-Level Reliability Constraint, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The work proposed herein would develop a set of analytic methodologies and a computer tool suite enabling aerospace hardware designers to rapidly determine optimum...

  5. Reliability of stellar inclination estimated from asteroseismology: analytical criteria, mock simulations and Kepler data analysis

    Science.gov (United States)

    Kamiaka, Shoya; Benomar, Othman; Suto, Yasushi

    2018-05-01

    Advances in asteroseismology of solar-like stars now provide a unique method to estimate the stellar inclination i⋆. This makes it possible to evaluate the spin-orbit angle of transiting planetary systems, in a complementary fashion to the Rossiter-McLaughlin effect, a well-established method to estimate the projected spin-orbit angle λ. Although the asteroseismic method has been broadly applied to the Kepler data, its reliability has yet to be assessed intensively. In this work, we evaluate the accuracy of i⋆ from asteroseismology of solar-like stars using 3000 simulated power spectra. We find that the low signal-to-noise ratio of the power spectra induces a systematic under-estimate (over-estimate) bias for stars with high (low) inclinations. We derive analytical criteria for a reliable asteroseismic estimate, which indicate that reliable measurements are possible in the range of 20° ≲ i⋆ ≲ 80° only for stars with a high signal-to-noise ratio. We also analyse and measure the stellar inclination of 94 Kepler main-sequence solar-like stars, among which 33 are planetary hosts. According to our reliability criteria, a third of them (9 with planets, 22 without) have accurate stellar inclinations. Comparison of our asteroseismic estimates of v sin i⋆ against spectroscopic measurements indicates that the latter suffer from a large uncertainty, possibly due to the modeling of macro-turbulence, especially for stars with projected rotation speed v sin i⋆ ≲ 5 km/s. This reinforces earlier claims, and stellar inclinations estimated from the combination of spectroscopic measurements and photometric variation for slowly rotating stars need to be interpreted with caution.

  6. Web analytics as tool for improvement of website taxonomies

    DEFF Research Database (Denmark)

    Jonasen, Tanja Svarre; Ådland, Marit Kristine; Lykke, Marianne

    The poster examines how web analytics can be used to provide information about users and inform design and redesign of taxonomies. It uses a case study of the website Cancer.dk by the Danish Cancer Society. The society is a private organization with an overall goal to prevent the development...... provides information about e.g. subjects of interest, searching behaviour, browsing patterns in website structure as well as tag clouds, page views. The poster discusses benefits and challenges of the two web metrics, with a model of how to use search and tag data for the design of taxonomies, e.g. choice...

  7. [The analytical reliability of clinical laboratory information and role of the standards in its support].

    Science.gov (United States)

    Men'shikov, V V

    2012-12-01

    The article deals with the factors impacting the reliability of clinical laboratory information. The differences in quality of laboratory analysis tools produced by various manufacturers are discussed; these differences are a cause of discrepancies between the results of laboratory analyses of the same analyte. The role of the reference system in supporting the comparability of laboratory analysis results is demonstrated. A draft national standard is presented to regulate the requirements for standards and calibrators for the analysis of qualitative and non-metrical characteristics of components of biomaterials.

  8. A clinical assessment tool used for physiotherapy students--is it reliable?

    Science.gov (United States)

    Lewis, Lucy K; Stiller, Kathy; Hardy, Frances

    2008-01-01

    Educational institutions providing professional programs such as physiotherapy must provide high-quality student assessment procedures. To ensure that assessment is consistent, assessment tools should have an acceptable level of reliability. There is a paucity of research evaluating the reliability of clinical assessment tools used for physiotherapy students. This study evaluated the inter- and intrarater reliability of an assessment tool used for physiotherapy students during a clinical placement. Five clinical educators and one academic participated in the study. Each rater independently marked 22 student written assessments that had been completed by students after viewing a videotaped patient physiotherapy assessment. The raters repeated the marking process 7 weeks later, with the assessments provided in a randomised order. The interrater reliability (Intraclass Correlation Coefficient) for the total scores was 0.32, representing a poor level of reliability. A high level of intrarater reliability (percentage agreement) was found for the clinical educators, with a difference in section scores of one mark or less on 93.4% of occasions. Further research should be undertaken to reevaluate the reliability of this clinical assessment tool following training. The reliability of clinical assessment tools used in other areas of physiotherapy education should be formally measured rather than assumed.

  9. Enhancing Safeguards through Information Analysis: Business Analytics Tools

    International Nuclear Information System (INIS)

    Vincent, J.; Midwinter, J.

    2015-01-01

    For the past 25 years the IBM i2 Intelligence Analysis product portfolio has assisted over 4,500 organizations across law enforcement, defense, government agencies, and commercial private sector businesses to maximize the value of the mass of information to discover and disseminate actionable intelligence that can help identify, investigate, predict, prevent, and disrupt criminal, terrorist, and fraudulent acts; safeguarding communities, organizations, infrastructures, and investments. The collaborative Intelligence Analysis environment delivered by i2 is specifically designed to be: · scalable: supporting business needs as well as operational and end user environments · modular: an architecture which can deliver maximum operational flexibility with ability to add complimentary analytics · interoperable: integrating with existing environments and eases information sharing across partner agencies · extendable: providing an open source developer essential toolkit, examples, and documentation for custom requirements i2 Intelligence Analysis brings clarity to complex investigations and operations by delivering industry leading multidimensional analytics that can be run on-demand across disparate data sets or across a single centralized analysis environment. The sole aim is to detect connections, patterns, and relationships hidden within high-volume, all-source data, and to create and disseminate intelligence products in near real time for faster informed decision making. (author)

  10. Measurement of HDO Products Using GC-TCD: Towards Obtaining Reliable Analytical Data

    Directory of Open Access Journals (Sweden)

    Zuas Oman

    2018-03-01

    Full Text Available This paper reports the development and validation of a gas chromatography with thermal conductivity detector (GC-TCD) method for the measurement of the gaseous products of hydrodeoxygenation (HDO). The method validation parameters include selectivity, precision (repeatability and reproducibility), accuracy, linearity, limit of detection (LoD), limit of quantitation (LoQ), and robustness. The results showed that the developed method was able to separate the target components (H2, CO2, CH4 and CO) from their mixtures without any special sample treatment. The validated method was selective, precise, accurate, and robust. Application of the developed and validated GC-TCD method to the measurement of the by-product components of HDO of bio-oil showed good performance, with a relative standard deviation (RSD) of less than 1.0% for all target components, implying that the process of method development and validation provides a trustworthy way of obtaining reliable analytical data.
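    The precision criterion quoted above (RSD below 1.0%) is a standard calculation that can be sketched in a few lines; the replicate values below are invented for illustration, not data from the study.

```python
# Minimal sketch of a repeatability check: relative standard deviation
# (RSD, %) of replicate measurements of the same sample.
import statistics

def rsd_percent(replicates):
    """RSD (%) = sample standard deviation / mean * 100."""
    return statistics.stdev(replicates) / statistics.mean(replicates) * 100

# Hypothetical replicate peak areas for one target component (e.g. CH4).
areas = [1021.4, 1019.8, 1023.1, 1020.5, 1022.0]
print(rsd_percent(areas) < 1.0)  # repeatability criterion: RSD < 1.0%
```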

  11. An Analytical Study of Tools and Techniques for Movie Marketing

    Directory of Open Access Journals (Sweden)

    Garima Maik

    2014-08-01

    Full Text Available Abstract. Bollywood, the Hindi movie industry, is one of the fastest growing sectors in the media and entertainment space, creating numerous business and employment opportunities. Movies in India are a major source of entertainment for all sects of society. They face competition not only from other movie industries and movies but from other sources of entertainment such as adventure sports, amusement parks, theatre and drama, and pubs and discothèques. A lot of manpower, man-hours, creative brains, and money are put in to build a quality feature film, so it is important for the movie and production team to stand out and grab the attention of the maximum audience. Movie makers today employ various tools and techniques to market their movies, leaving no stone unturned: teasers, first looks, theatrical trailer releases, music launches, city tours, producer and director interviews, movie premieres, the release itself, and post-release follow-up, all to pull viewers to the cineplex. Today's audience, which consists mainly of youth, wants photos, videos, meet-ups, gossip, debate, collaboration and content creation, and these requirements are best fulfilled through digital platforms. However, traditional media like newspapers, radio, and television are not old school: they reach a mass audience and play a major role in effective marketing. This study aims at analysing these tools for their effectiveness. The objectives are fulfilled through a consumer survey. The study brings out the effectiveness and relative importance of the various tools employed by movie marketers to generate maximum returns on investment, using data reduction techniques such as factor analysis and statistical techniques such as the chi-square test, with data visualization using pie charts.

  12. Median of patient results as a tool for assessment of analytical stability

    DEFF Research Database (Denmark)

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft

    2015-01-01

    BACKGROUND: In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor long-term analytical stability as a supplement to the internal control procedures is often needed. METHOD: Patient data from daily internal control schemes were used for monthly appraisal of analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. RESULTS: Seventy-five percent of the twenty analytes measured on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and with the desirable specifications for bias. DISCUSSION: Patient results applied in analytical quality performance...
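    The monitoring idea described above can be sketched in a few lines: take the median of each month's patient results, express its deviation from a target as a percentage bias, and flag months exceeding the allowable bias. The analyte, target, limit, and values below are invented for illustration, not the study's data.

```python
# Sketch of long-term stability monitoring via monthly patient-result medians.
import statistics

def unstable_months(monthly_results, target, allowable_bias_pct):
    """Return (month, bias%) for months whose median deviates beyond the limit."""
    flagged = []
    for month, results in monthly_results.items():
        median = statistics.median(results)
        bias_pct = (median - target) / target * 100
        if abs(bias_pct) > allowable_bias_pct:
            flagged.append((month, round(bias_pct, 1)))
    return flagged

# Hypothetical sodium results (mmol/L); desirable bias limit ~0.3%.
data = {
    "Jan": [139, 140, 141, 140, 139],
    "Feb": [141, 142, 143, 142, 141],  # drifted high
}
print(unstable_months(data, target=140.0, allowable_bias_pct=0.3))
# → [('Feb', 1.4)]
```

    Using medians rather than means makes the monthly statistic robust to the occasional extreme patient result.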

  13. Improving web site performance using commercially available analytical tools.

    Science.gov (United States)

    Ogle, James A

    2010-10-01

    It is easy to accurately measure web site usage and to quantify key parameters such as page views, site visits, and more complex variables using commercially available tools that analyze web site log files and search engine use. This information can be used strategically to guide the design or redesign of a web site (templates, look-and-feel, and navigation infrastructure) to improve overall usability. The data can also be used tactically to assess the popularity and use of new pages and modules that are added and to rectify problems that surface. This paper describes software tools used to: (1) inventory search terms that lead to available content; (2) propose synonyms for commonly used search terms; (3) evaluate the effectiveness of calls to action; (4) conduct path analyses to targeted content. The American Academy of Orthopaedic Surgeons (AAOS) uses SurfRay's Behavior Tracking software (Santa Clara CA, USA, and Copenhagen, Denmark) to capture and archive the search terms that have been entered into the site's Google Mini search engine. The AAOS also uses Unica's NetInsight program to analyze its web site log files. These tools provide the AAOS with information that quantifies how well its web sites are operating and insights for making improvements to them. Although it is easy to quantify many aspects of an association's web presence, it also takes human involvement to analyze the results and then recommend changes. Without a dedicated resource to do this, the work often is accomplished only sporadically and on an ad hoc basis.

  14. Reliability Of Kraus-Weber Exercise Test As An Evaluation Tool In ...

    African Journals Online (AJOL)

    Reliability Of Kraus-Weber Exercise Test As An Evaluation Tool In Low Back ... strength and flexibility of the back, abdominal, psoas and hamstring muscles. ... Keywords: Kraus-Weber test, low back pain, muscle flexibility, muscle strength.

  15. A Hybrid Approach for Reliability Analysis Based on Analytic Hierarchy Process and Bayesian Network

    International Nuclear Information System (INIS)

    Zubair, Muhammad

    2014-01-01

    Using the analytic hierarchy process (AHP) and Bayesian networks (BN), the present research examines the technical and non-technical issues behind nuclear accidents. The study showed that technical faults were one major cause of these accidents; from another point of view, it becomes clear that human factors such as dishonesty, insufficient training, and selfishness also play a key role in causing them. In this study, a hybrid approach for reliability analysis based on AHP and BN to increase nuclear power plant (NPP) safety has been developed. Using AHP, the best alternatives for improving safety, design, and operation, and for allocating budget across all technical and non-technical factors related to nuclear safety, have been investigated. We use a special structure of BN based on the AHP method: the graph of the BN and the probabilities associated with its nodes are designed to translate the knowledge of experts on the selection of the best alternative. The results show that improvement in regulatory authorities will decrease failure probabilities and increase safety and reliability in the industrial area.
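    The AHP step described above turns expert pairwise judgments into priority weights. A common approximation is the row geometric-mean method, sketched below; the comparison matrix and alternative names are invented for illustration and are not the study's data.

```python
# Illustrative sketch: AHP priority weights from a pairwise comparison
# matrix, via the row geometric-mean approximation of the eigenvector.
import math

def ahp_weights(matrix):
    """Row geometric means of a reciprocal comparison matrix, normalised."""
    gms = [math.prod(row) ** (1 / len(row)) for row in matrix]
    total = sum(gms)
    return [g / total for g in gms]

# Hypothetical 3x3 comparisons of safety-improvement alternatives
# (design, operation, training) on Saaty's 1-9 scale.
pairwise = [
    [1,     3,     5],
    [1 / 3, 1,     3],
    [1 / 5, 1 / 3, 1],
]
weights = ahp_weights(pairwise)
print([round(w, 3) for w in weights])  # → [0.637, 0.258, 0.105]
```

    In a full AHP/BN hybrid, weights like these would parameterise the conditional probabilities of the network's decision nodes.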

  16. Development of reliability centered maintenance methods and tools

    International Nuclear Information System (INIS)

    Jacquot, J.P.; Dubreuil-Chambardel, A.; Lannoy, A.; Monnier, B.

    1992-12-01

    This paper recalls the development of the RCM (Reliability Centered Maintenance) approach in the nuclear industry and describes the trial study implemented by EDF in the context of the OMF (RCM) Project. The approach developed is currently being applied to about thirty systems (Industrial Project). In parallel, R and D efforts are being maintained to improve the selectivity of the analysis methods. These methods use Probabilistic Safety Study models, thereby guaranteeing better selectivity in the identification of safety-critical elements and enhancing consistency between maintenance and safety studies. They also offer more detailed analysis of operating feedback, invoking for example Bayesian methods combining expert judgement and feedback data. Finally, they propose a functional and material representation of the plant. This dual representation describes both the functions assured by maintenance provisions and the material elements required for their implementation. In the final chapter, the targets of the future OMF workstation are summarized and the latter's insertion in the EDF information system is briefly described. (authors). 5 figs., 2 tabs., 7 refs

  17. Reliability and validity of a tool to assess airway management skills in anesthesia trainees

    Directory of Open Access Journals (Sweden)

    Aliya Ahmed

    2016-01-01

    Conclusion: The tool designed to assess bag-mask ventilation and tracheal intubation skills in anesthesia trainees demonstrated excellent inter-rater reliability, fair test-retest reliability, and good construct validity. The authors recommend its use for formative and summative assessment of junior anesthesia trainees.

  18. Reliability review of the remote tool delivery system locomotor

    Energy Technology Data Exchange (ETDEWEB)

    Chesser, J.B.

    1999-04-01

    The locomotor being built by RedZone Robotics is designed to serve as a remote tool delivery (RTD) system for waste retrieval, tank cleaning, viewing, and inspection inside the high-level waste tanks 8D-1 and 8D-2 at West Valley Nuclear Services (WVNS). The RTD system is to be deployed through a tank riser. The locomotor portion of the RTD system is designed to be inserted into the tank and to be capable of moving around the tank by supporting itself and moving on the tank's internal structural columns. The locomotor will serve as a mounting platform for a dexterous manipulator arm. The complete RTD system consists of the locomotor, dexterous manipulator arm, cameras, lights, cables, hoses, cable/hose management system, power supply, and operator control station.

  19. Practical applications of surface analytic tools in tribology

    Science.gov (United States)

    Ferrante, J.

    1980-01-01

    Many of the currently widely used tools available for surface analysis are described. Those which have the highest applicability for giving elemental and/or compound analysis for problems of interest in tribology, and which are truly surface sensitive (that is, sampling fewer than 10 atomic layers), are presented. The latter group is evaluated in detail in terms of strengths and weaknesses. Emphasis is placed on post facto analysis of experiments performed under 'real' conditions (e.g., in air with lubricants). It is further indicated that such equipment could be used for screening and quality control.

  20. Microplasmas for chemical analysis: analytical tools or research toys?

    International Nuclear Information System (INIS)

    Karanassios, Vassili

    2004-01-01

    An overview of the activities of the research groups that have been involved in fabrication, development and characterization of microplasmas for chemical analysis over the last few years is presented. Microplasmas covered include: miniature inductively coupled plasmas (ICPs); capacitively coupled plasmas (CCPs); microwave-induced plasmas (MIPs); a dielectric barrier discharge (DBD); microhollow cathode discharge (MCHD) or microstructure electrode (MSE) discharges, other microglow discharges (such as those formed between 'liquid' electrodes); microplasmas formed in micrometer-diameter capillary tubes for gas chromatography (GC) or high-performance liquid chromatography (HPLC) applications, and a stabilized capacitive plasma (SCP) for GC applications. Sample introduction into microplasmas, in particular, into a microplasma device (MPD), battery operation of a MPD and of a mini- in-torch vaporization (ITV) microsample introduction system for MPDs, and questions of microplasma portability for use on site (e.g., in the field) are also briefly addressed using examples of current research. To emphasize the significance of sample introduction into microplasmas, some previously unpublished results from the author's laboratory have also been included. And an overall assessment of the state-of-the-art of analytical microplasma research is provided

  1. Harnessing scientific literature reports for pharmacovigilance. Prototype software analytical tool development and usability testing.

    Science.gov (United States)

    Sorbello, Alfred; Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier

    2017-03-22

    We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers' capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. A prototype, open-source, web-based software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. All usability test participants cited the tool's ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool's automated literature search relative to a manual 'all fields' PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction.
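
    The disproportionality scores mentioned above can be illustrated with a small sketch. The abstract does not say which statistic the prototype computes, so the proportional reporting ratio (PRR) below, and the citation counts fed to it, are assumptions chosen purely for illustration.

```python
# Sketch of a disproportionality signal score, assuming the commonly used
# proportional reporting ratio (PRR); the prototype's actual statistic is
# not specified in the abstract.

def prr(a, b, c, d):
    """PRR from a 2x2 contingency table of literature citation counts.

    a: citations indexed with both the drug and the adverse event
    b: citations with the drug but not the event
    c: citations with the event but not the drug
    d: citations with neither
    """
    rate_drug = a / (a + b)      # event rate among citations mentioning the drug
    rate_other = c / (c + d)     # event rate among all other citations
    return rate_drug / rate_other

# Hypothetical counts for one candidate drug-adverse event pair:
score = prr(a=20, b=380, c=100, d=9500)
```

    A score well above 1 flags a candidate pair for review; conventional screens often require PRR > 2 together with a minimum count of co-indexed citations.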

  2. Cluster Analysis as an Analytical Tool of Population Policy

    Directory of Open Access Journals (Sweden)

    Oksana Mikhaylovna Shubat

    2017-12-01

    Full Text Available The predicted negative trends in Russian demography (falling birth rates, population decline) actualize the need to strengthen measures of family and population policy. Our research purpose is to identify groups of Russian regions with similar characteristics in the family sphere using cluster analysis. The findings should make an important contribution to the field of family policy. We used hierarchical cluster analysis based on the Ward method and the Euclidean distance for segmentation of Russian regions. Clustering is based on four variables that allowed assessing the family institution in each region. The authors used data of the Federal State Statistics Service from 2010 to 2015. Clustering and profiling of each segment allowed the authors to form a model of Russian regions according to the features of the family institution in those regions. The authors revealed four clusters grouping regions with similar problems in the family sphere. This segmentation makes it possible to develop the most relevant family policy measures for each group of regions. Thus, the analysis has shown a high degree of differentiation of the family institution across the regions. This suggests that a unified approach to solving population problems is far from effective. To achieve greater results in the implementation of family policy, a differentiated approach is needed. Methods of multidimensional data classification can be successfully applied as a relevant analytical toolkit. Further research could develop the adaptation of multidimensional classification methods to the analysis of population problems in Russian regions. In particular, algorithms of nonparametric cluster analysis may be of relevance in future studies.
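
    The clustering step above can be sketched in miniature. The fragment below implements Ward-criterion agglomerative merging on Euclidean coordinates in plain Python; the five 2-D "regions" are hypothetical stand-ins, not the study's four family-sphere variables or its actual data.

```python
# Minimal sketch of Ward-linkage agglomerative clustering on Euclidean
# coordinates; data and cluster count are illustrative assumptions.

def ward_clusters(points, k):
    """Merge points bottom-up with the Ward criterion until k clusters remain."""
    clusters = [[p] for p in points]     # start with one cluster per point

    def centroid(c):
        dim = len(c[0])
        return [sum(p[i] for p in c) / len(c) for i in range(dim)]

    def ward_cost(a, b):
        # Increase in total within-cluster variance if a and b were merged
        ca, cb = centroid(a), centroid(b)
        d2 = sum((x - y) ** 2 for x, y in zip(ca, cb))
        return len(a) * len(b) / (len(a) + len(b)) * d2

    while len(clusters) > k:
        i, j = min(
            ((i, j) for i in range(len(clusters)) for j in range(i + 1, len(clusters))),
            key=lambda ij: ward_cost(clusters[ij[0]], clusters[ij[1]]),
        )
        clusters[i] = clusters[i] + clusters[j]
        del clusters[j]
    return clusters

# Two tight, well-separated groups of hypothetical 2-D "regions":
data = [(0.0, 0.1), (0.1, 0.0), (0.0, 0.0), (5.0, 5.1), (5.1, 5.0)]
groups = ward_clusters(data, k=2)
```

    Profiling each resulting cluster (its centroid on the clustering variables) is then what turns the segmentation into policy-relevant region groups.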

  3. Measuring the bright side of being blue: a new tool for assessing analytical rumination in depression.

    Directory of Open Access Journals (Sweden)

    Skye P Barbic

    Full Text Available BACKGROUND: Diagnosis and management of depression occur frequently in the primary care setting. Current diagnostic and treatment practices across clinical populations focus on eliminating signs and symptoms of depression. However, there is debate that some interventions may pathologize normal, adaptive responses to stressors. Analytical rumination (AR) is an example of an adaptive response in depression that is characterized by enhanced cognitive function to help an individual focus on, analyze, and solve problems. To date, research on AR has been hampered by the lack of theoretically derived and psychometrically sound instruments. This study developed and tested a clinically meaningful measure of AR. METHODS: Using expert panels and an extensive literature review, we developed a conceptual framework for AR and 22 candidate items. Items were field tested with 579 young adults, 140 of whom completed the items at a second time point. We used Rasch measurement methods to construct and test the item set, and traditional psychometric analyses to compare items to existing rating scales. RESULTS: Data were of high quality (0.81), with evidence for divergent validity. Evidence of misfit for 2 items suggested that a 20-item scale with 4-point response categories best captured the concept of AR, fitting the Rasch model (χ2 = 95.26; df = 76, p = 0.07), with high reliability (rp = 0.86), ordered response scale structure, and no item bias (gender, age, time). CONCLUSION: Our study provides evidence for a 20-item Analytical Rumination Questionnaire (ARQ) that can be used to quantify AR in adults who experience symptoms of depression. The ARQ is psychometrically robust and a clinically useful tool for the assessment and improvement of depression in the primary care setting. Future work is needed to establish the validity of this measure in people with major depression.

  4. Testing the reliability of the Fall Risk Screening Tool in an elderly ambulatory population.

    Science.gov (United States)

    Fielding, Susan J; McKay, Michael; Hyrkas, Kristiina

    2013-11-01

    To identify and test the reliability of a fall risk screening tool in an ambulatory outpatient clinic. The Fall Risk Screening Tool (Albert Lea Medical Center, MN, USA) was scripted for an interview format. Two interviewers separately screened a convenience sample of 111 patients (age ≥ 65 years) in an ambulatory outpatient clinic in a northeastern US city. The interviewers' scoring of fall risk categories was similar. There was good internal consistency (Cronbach's α = 0.834-0.889) and inter-rater reliability [intra-class correlation coefficients (ICC) = 0.824-0.881] for total, Risk Factor and Client's Health Status subscales. The Physical Environment scores indicated acceptable internal consistency (Cronbach's α = 0.742) and adequate reliability (ICC = 0.688). Two Physical Environment items (furniture and medical equipment condition) had low reliabilities [Kappa (K) = 0.323, P = 0.08; K = -0.078, P = 0.648, respectively]. The scripted Fall Risk Screening Tool demonstrated good reliability in this sample. Rewording two Physical Environment items will be considered. A reliable instrument such as the scripted Fall Risk Screening Tool provides a standardised assessment for identifying high fall risk patients. This tool is especially useful because it assesses personal, behavioural and environmental factors specific to community-dwelling patients; the interview format also facilitates patient-provider interaction. © 2013 John Wiley & Sons Ltd.

  5. Positron spectroscopy as an analytical tool in material sciences

    International Nuclear Information System (INIS)

    Pujari, P.K.

    2010-01-01

    Full text: Positron annihilation spectroscopy has emerged as a powerful tool in material sciences due to its ability to provide information about the electron momentum distribution and electron density in a given medium. These features help in identifying altered states of electronic rearrangement such as one encounters in phase transitions. In addition, positrons prefer regions of lower electron density, such as open volume defects, i.e., vacancies or vacancy clusters in metals, alloys and semiconductors, or free volumes in molecular solids. Their sensitivity to defects is extremely high; e.g., positrons can detect defects as small as a monovacancy at concentrations as low as parts per million (ppm). Innovative nuclear instrumentation has made it possible to obtain chemical specificity at the annihilation site; for example, precipitates, embedded nanoparticles or element-decorated vacancies can now be easily identified. This presentation is structured to introduce the technique and provide a global perspective on its areas of application. Specific examples will be given on defect characterization, nanostructure-property correlations in polymers, and the advantages of elemental specificity obtained by indexing the core electron momentum. In addition, slow-positron-beam studies on nanostructured materials as well as particle-accelerator-based positron spectroscopy for volumetric assay of defects in large engineering samples will be presented.

  6. Analytical Modeling Tool for Design of Hydrocarbon Sensitive Optical Fibers

    Directory of Open Access Journals (Sweden)

    Khalil Al Handawi

    2017-09-01

    Full Text Available Pipelines are the main transportation means for oil and gas products across large distances. Due to the severe conditions they operate in, they are regularly inspected using conventional Pipeline Inspection Gages (PIGs for corrosion damage. The motivation for researching a real-time distributed monitoring solution arose to mitigate costs and provide a proactive indication of potential failures. Fiber optic sensors with polymer claddings provide a means of detecting contact with hydrocarbons. By coating the fibers with a layer of metal similar in composition to that of the parent pipeline, corrosion of this coating may be detected when the polymer cladding underneath is exposed to the surrounding hydrocarbons contained within the pipeline. A Refractive Index (RI change occurs in the polymer cladding causing a loss in intensity of a traveling light pulse due to a reduction in the fiber’s modal capacity. Intensity losses may be detected using Optical Time Domain Reflectometry (OTDR while pinpointing the spatial location of the contact via time delay calculations of the back-scattered pulses. This work presents a theoretical model for the above sensing solution to provide a design tool for the fiber optic cable in the context of hydrocarbon sensing following corrosion of an external metal coating. Results are verified against the experimental data published in the literature.

  7. Analytical Modeling Tool for Design of Hydrocarbon Sensitive Optical Fibers.

    Science.gov (United States)

    Al Handawi, Khalil; Vahdati, Nader; Shiryayev, Oleg; Lawand, Lydia

    2017-09-28

    Pipelines are the main transportation means for oil and gas products across large distances. Due to the severe conditions they operate in, they are regularly inspected using conventional Pipeline Inspection Gages (PIGs) for corrosion damage. The motivation for researching a real-time distributed monitoring solution arose to mitigate costs and provide a proactive indication of potential failures. Fiber optic sensors with polymer claddings provide a means of detecting contact with hydrocarbons. By coating the fibers with a layer of metal similar in composition to that of the parent pipeline, corrosion of this coating may be detected when the polymer cladding underneath is exposed to the surrounding hydrocarbons contained within the pipeline. A Refractive Index (RI) change occurs in the polymer cladding causing a loss in intensity of a traveling light pulse due to a reduction in the fiber's modal capacity. Intensity losses may be detected using Optical Time Domain Reflectometry (OTDR) while pinpointing the spatial location of the contact via time delay calculations of the back-scattered pulses. This work presents a theoretical model for the above sensing solution to provide a design tool for the fiber optic cable in the context of hydrocarbon sensing following corrosion of an external metal coating. Results are verified against the experimental data published in the literature.
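
    The time-delay localization described in both abstracts reduces to a one-line relation: the event distance along the fiber is half the round-trip delay of the back-scattered pulse times the group velocity of light in the fiber. A minimal sketch, assuming a typical silica-fiber group index of 1.468 (the papers do not state a value):

```python
# OTDR event localization: distance = (c / n_group) * delay / 2.
# The group index below is a typical silica-fiber value, assumed here,
# not a parameter taken from the papers.

C_VACUUM = 299_792_458.0      # speed of light in vacuum, m/s
GROUP_INDEX = 1.468           # assumed group refractive index of the fiber

def event_distance_m(round_trip_delay_s):
    """Distance to the scattering event; the factor 2 accounts for the round trip."""
    group_velocity = C_VACUUM / GROUP_INDEX
    return group_velocity * round_trip_delay_s / 2.0

# A 10-microsecond round-trip delay places the event roughly 1 km down the fiber:
distance = event_distance_m(10e-6)
```

    In the sensing scheme above, the intensity drop in the OTDR trace marks the corroded, hydrocarbon-exposed section, and this relation converts its time position into a location along the pipeline.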

  8. Allied health clinicians using translational research in action to develop a reliable stroke audit tool.

    Science.gov (United States)

    Abery, Philip; Kuys, Suzanne; Lynch, Mary; Low Choy, Nancy

    2018-05-23

    To design and establish the reliability of a local stroke audit tool by engaging allied health clinicians within a privately funded hospital. Design: two-stage study involving a modified Delphi process to inform stroke audit tool development, followed by inter-tester reliability testing. Participants: allied health clinicians. A modified Delphi process was used to select stroke guideline recommendations for inclusion in the audit tool. Reliability study: one allied health representative from each discipline audited 10 clinical records with sequential admissions to acute and rehabilitation services. Recommendations were admitted to the audit tool when 70% agreement was reached, with 50% set as the reserve agreement. Inter-tester reliability was determined using intra-class correlation coefficients (ICCs) across the 10 clinical records. Twenty-two participants (92% female, 50% physiotherapists, 17% occupational therapists) completed the modified Delphi process. Across 6 voting rounds, 8 recommendations reached 70% agreement and 2 reached 50% agreement. Two recommendations (nutrition/hydration; goal setting) were added to ensure representation for all disciplines. Substantial consistency across raters was established for the audit tool applied in acute stroke (ICC .71; range .48 to .90) and rehabilitation (ICC .78; range .60 to .93) services. Allied health clinicians within a privately funded hospital generally agreed in an audit process to develop a reliable stroke audit tool. Allied health clinicians agreed on stroke guideline recommendations to inform a stroke audit tool. The stroke audit tool demonstrated substantial consistency supporting future use for service development. This process, which engages local clinicians, could be adopted by other facilities to design reliable audit tools to identify local service gaps to inform changes to clinical practice. © 2018 John Wiley & Sons, Ltd.

  9. MONITORING OF LARGE INSTABLE AREAS: system reliability and new tools.

    Science.gov (United States)

    Leandro, G.; Mucciarelli, M.; Pellicani, R.; Spilotro, G.

    2009-04-01

    The monitoring of unstable or potentially unstable areas is a necessary operation whenever the conditions of risk cannot be removed and mitigation measures must be applied. In the Italian Apennine regions there are many urban and extra-urban areas affected by instability for which removing the hazard conditions is impracticable because of size and cost. The technology that can be brought to bear on land-instability monitoring is evolving rapidly and allows warning systems that were unthinkable just a few years ago. However, monitoring unstable or potentially unstable areas requires detailed knowledge of the specific problems involved, without which the reliability of the system may be dangerously overestimated. Movement may arise in areas not covered by instrumentation, or covered by vegetation that prevents the acquisition both of reflected signals in multi-beam laser techniques and of radar signals. Environmental conditions (wind, concentrated light sources, temperature changes, presence of animals) may also degrade the accuracy of the measurements by introducing modulations or disturbances well above the alarm threshold, forcing the warning thresholds to be raised. The Authors have gained long experience in the observation and monitoring of some large landslides in the Southern Apennines (Aliano, Buoninventre, Calciano, Carlantino, etc.) and of unstable areas at regional scale. One of the most important lessons concerns landslides over extensive areas, where unstable and stable zones coexist along the transverse and longitudinal axes. In many of these cases accurate control of the movement at selected points is needed to evaluate the trend of displacement velocity, which can be achieved by means of a single-beam laser. The control of these movements, however, does not provide information on the stress pattern in the stable areas. Among the sensitive precursors, acoustic

  10. PredicForex. A tool for a reliable market. Playing with currencies.

    Directory of Open Access Journals (Sweden)

    C. Cortés Velasco

    2009-12-01

    Full Text Available The Forex market is a very interesting one, and a tool that could reliably forecast currency behaviour would be of great interest. A 100% reliable tool is almost impossible to find: this market, like any other, is unpredictable. We have nevertheless developed a tool that makes use of a web crawler, data mining and web services to offer forecasts and advice to any user or broker.

  11. Raising Reliability of Web Search Tool Research through Replication and Chaos Theory

    OpenAIRE

    Nicholson, Scott

    1999-01-01

    Because the World Wide Web is a dynamic collection of information, the Web search tools (or "search engines") that index the Web are dynamic. Traditional information retrieval evaluation techniques may not provide reliable results when applied to the Web search tools. This study is the result of ten replications of the classic 1996 Ding and Marchionini Web search tool research. It explores the effects that replication can have on transforming unreliable results from one iteration into replica...

  12. The Mental Disability Military Assessment Tool : A Reliable Tool for Determining Disability in Veterans with Post-traumatic Stress Disorder

    NARCIS (Netherlands)

    Fokkens, Andrea S.; Groothoff, Johan W.; van der Klink, Jac J. L.; Popping, Roel; Stewart, Roy E.; van de Ven, Lex; Brouwer, Sandra; Tuinstra, Jolanda

    Purpose An assessment tool was developed to assess disability in veterans who suffer from post-traumatic stress disorder (PTSD) due to a military mission. The objective of this study was to determine the reliability, intra-rater and inter-rater variation of the Mental Disability Military (MDM)

  13. The Mental Disability Military Assessment Tool : A reliable tool for determining disability in veterans with post-traumatic stress disorder

    NARCIS (Netherlands)

    Fokkens, A.S.; Groothoff, J.W.; van der Klink, J.J.L.; Popping, R.; Stewart, S.E.; van de Ven, L.; Brouwer, S.; Tuinstra, J.

    2015-01-01

    Purpose An assessment tool was developed to assess disability in veterans who suffer from post-traumatic stress disorder (PTSD) due to a military mission. The objective of this study was to determine the reliability, intra-rater and inter-rater variation of the Mental Disability Military (MDM)

  14. Analytical sensitivity analysis of geometric errors in a three axis machine tool

    International Nuclear Information System (INIS)

    Park, Sung Ryung; Yang, Seung Han

    2012-01-01

    In this paper, an analytical method is used to perform a sensitivity analysis of geometric errors in a three axis machine tool. First, an error synthesis model is constructed for evaluating the position volumetric error due to the geometric errors, and then an output variable is defined, such as the magnitude of the position volumetric error. Next, the global sensitivity analysis is executed using an analytical method. Finally, the sensitivity indices are calculated using the quantitative values of the geometric errors
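
    For a linear error synthesis model with independent inputs, the first-order sensitivity indices have a closed form, which is one reason a fully analytical treatment can be tractable. The sketch below assumes such a linear model y = Σ aᵢeᵢ; the gains and variances are hypothetical, not taken from the paper's three-axis error model.

```python
# First-order sensitivity indices for an assumed *linear* error synthesis
# model y = sum(a_i * e_i) with independent geometric errors e_i:
#   S_i = a_i^2 * var_i / sum_j(a_j^2 * var_j)
# Gains and variances below are illustrative placeholders.

def first_order_indices(gains, variances):
    """Fraction of output variance attributable to each input error source."""
    contributions = [a * a * v for a, v in zip(gains, variances)]
    total = sum(contributions)
    return [c / total for c in contributions]

# Three hypothetical geometric error sources (e.g. linear positioning,
# straightness, squareness) with unit variance but different gains:
S = first_order_indices(gains=[1.0, 2.0, 3.0], variances=[1.0, 1.0, 1.0])
```

    For independent inputs and a linear model the indices sum to 1, so they directly rank which geometric errors dominate the position volumetric error.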

  15. Reliability of Lactation Assessment Tools Applied to Overweight and Obese Women.

    Science.gov (United States)

    Chapman, Donna J; Doughty, Katherine; Mullin, Elizabeth M; Pérez-Escamilla, Rafael

    2016-05-01

    The interrater reliability of lactation assessment tools has not been evaluated in overweight/obese women. This study aimed to compare the interrater reliability of 4 lactation assessment tools in this population. A convenience sample of 45 women (body mass index > 27.0) was videotaped while breastfeeding (twice daily on days 2, 4, and 7 postpartum). Three International Board Certified Lactation Consultants independently rated each videotaped session using 4 tools (Infant Breastfeeding Assessment Tool [IBFAT], modified LATCH [mLATCH], modified Via Christi [mVC], and Riordan's Tool [RT]). For each day and tool, we evaluated interrater reliability with 1-way repeated-measures analyses of variance, intraclass correlation coefficients (ICCs), and percentage absolute agreement between raters. Analyses of variance showed significant differences between raters' scores on day 2 (all scales) and day 7 (RT). Intraclass correlation coefficient values reflected good (mLATCH) to excellent reliability (IBFAT, mVC, and RT) on days 2 and 7. All day 4 ICCs reflected good reliability. The ICC for mLATCH was significantly lower than all others on day 2 and was significantly lower than IBFAT (day 7). Percentage absolute interrater agreement for scale components ranged from 31% (day 2: observable swallowing, RT) to 92% (day 7: IBFAT, fixing; and mVC, latch time). Swallowing scores on all scales had the lowest levels of interrater agreement (31%-64%). We demonstrated differences in the interrater reliability of 4 lactation assessment tools when applied to overweight/obese women, with the lowest values observed on day 4. Swallowing assessment was particularly unreliable. Researchers and clinicians using these scales should be aware of the differences in their psychometric behavior. © The Author(s) 2015.
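
    As a reminder of what the reported ICC values measure, here is a minimal one-way random-effects ICC(1,1) in plain Python. The abstract does not specify which ICC model the study used, and the ratings below are invented for illustration (4 sessions, 3 raters), not the study's data.

```python
# One-way random-effects ICC(1,1) from a targets-by-raters table.
# The study's exact ICC form is not stated in the abstract; this is the
# simplest variant, shown as an illustration only.

def icc_oneway(table):
    """table[i] holds the k ratings of target i (same k for every target)."""
    n = len(table)          # targets (e.g. breastfeeding sessions)
    k = len(table[0])       # raters per target
    grand = sum(sum(row) for row in table) / (n * k)
    means = [sum(row) / k for row in table]
    # Between-target and within-target mean squares from one-way ANOVA:
    msb = k * sum((m - grand) ** 2 for m in means) / (n - 1)
    msw = sum((x - m) ** 2 for row, m in zip(table, means) for x in row) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Invented ratings: 4 sessions, 3 raters who agree closely:
icc = icc_oneway([[9, 9, 8], [6, 5, 6], [8, 8, 9], [2, 3, 2]])
```

    Values near 1 indicate that most score variance comes from genuine differences between sessions rather than disagreement between raters, which is how thresholds like "good" and "excellent" reliability are read.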

  16. Evaluating the reliability of an injury prevention screening tool: Test-retest study.

    Science.gov (United States)

    Gittelman, Michael A; Kincaid, Madeline; Denny, Sarah; Wervey Arnold, Melissa; FitzGerald, Michael; Carle, Adam C; Mara, Constance A

    2016-10-01

    A standardized injury prevention (IP) screening tool can identify family risks and allow pediatricians to address behaviors. To assess behavior changes on later screens, the tool must be reliable for an individual and ideally between household members. Little research has examined the reliability of safety screening tool questions. This study assessed the test-retest reliability of parent responses on an existing IP questionnaire and also compared responses between household parents. Investigators recruited parents of children 0 to 1 year of age during admission to a tertiary care children's hospital. When both parents were present, one was chosen as the "primary" respondent. Primary respondents completed the 30-question IP screening tool after consent, and they were re-screened approximately 4 hours later to test individual reliability. The "second" parent, when present, only completed the tool once. All participants received a 10-dollar gift card. Cohen's Kappa was used to estimate test-retest reliability and inter-rater agreement. Standard test-retest criteria consider Kappa values: 0.0 to 0.40 poor to fair, 0.41 to 0.60 moderate, 0.61 to 0.80 substantial, and 0.81 to 1.00 almost perfect reliability. One hundred five families participated, with five lost to follow-up. Thirty-two (30.5%) parent dyads completed the tool. Primary respondents were generally mothers (88%) and Caucasian (72%). Test-retest of the primary respondents showed their responses to be almost perfect; Kappa averaged 0.82 (SD = 0.13, range 0.49-1.00). Seventeen questions had almost perfect test-retest reliability and 11 had substantial reliability. However, inter-rater agreement between household members for 12 objective questions showed little agreement between responses, averaging 0.35 (SD = 0.34, range -0.19-1.00). One question had almost perfect inter-rater agreement and two had substantial inter-rater agreement. The IP screening tool used by a single individual had excellent
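
    The Cohen's kappa statistic used above for both test-retest reliability and inter-rater agreement corrects raw percent agreement for the agreement expected by chance. A minimal sketch with invented yes/no screening responses (not the study's data):

```python
# Cohen's kappa: (observed agreement - chance agreement) / (1 - chance agreement).
# The two response lists are invented for illustration.

def cohens_kappa(r1, r2):
    """Cohen's kappa for two paired lists of categorical responses."""
    n = len(r1)
    cats = set(r1) | set(r2)
    p_obs = sum(a == b for a, b in zip(r1, r2)) / n                   # raw agreement
    p_exp = sum((r1.count(c) / n) * (r2.count(c) / n) for c in cats)  # chance agreement
    return (p_obs - p_exp) / (1 - p_exp)

# Invented yes/no answers from the same parent at two time points:
t1 = ["y", "y", "n", "y", "n", "n", "y", "y", "n", "y"]
t2 = ["y", "y", "n", "y", "n", "y", "y", "y", "n", "n"]
kappa = cohens_kappa(t1, t2)
```

    Here raw agreement is 8/10, but kappa comes out near 0.58, which falls in the "moderate" band of the criteria quoted in the abstract; this is why kappa, not percent agreement, is the reported statistic.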

  17. Development of a quality-assessment tool for experimental bruxism studies: reliability and validity.

    Science.gov (United States)

    Dawson, Andreas; Raphael, Karen G; Glaros, Alan; Axelsson, Susanna; Arima, Taro; Ernberg, Malin; Farella, Mauro; Lobbezoo, Frank; Manfredini, Daniele; Michelotti, Ambra; Svensson, Peter; List, Thomas

    2013-01-01

    To combine empirical evidence and expert opinion in a formal consensus method in order to develop a quality-assessment tool for experimental bruxism studies in systematic reviews. Tool development comprised five steps: (1) preliminary decisions, (2) item generation, (3) face-validity assessment, (4) reliability and discriminative validity assessment, and (5) instrument refinement. The kappa value and phi coefficient were calculated to assess inter-observer reliability and discriminative ability, respectively. Following preliminary decisions and a literature review, a list of 52 items to be considered for inclusion in the tool was compiled. Eleven experts were invited to join a Delphi panel and 10 accepted. Four Delphi rounds reduced the preliminary tool, the Quality-Assessment Tool for Experimental Bruxism Studies (Qu-ATEBS), to 8 items: study aim, study sample, control condition or group, study design, experimental bruxism task, statistics, interpretation of results, and conflict of interest statement. Consensus among the Delphi panelists yielded good face validity. Inter-observer reliability was acceptable (k = 0.77). Discriminative validity was excellent (phi coefficient 1.0). Qu-ATEBS, intended for systematic reviews of experimental bruxism studies, exhibits face validity, excellent discriminative validity, and acceptable inter-observer reliability. Development of quality-assessment tools for many other topics in the orofacial pain literature is needed and may follow the described procedure.

  18. Enhancement of the reliability of automated ultrasonic inspections using tools of quantitative NDT

    International Nuclear Information System (INIS)

    Kappes, W.; Baehr, W.; Kroening, M.; Schmitz, V.

    1994-01-01

    To achieve reliable test results from automated ultrasonic inspection of safety related components, optimization and integral consideration of the various inspection stages - inspection planning, inspection performance and evaluation of results - are indispensable. For this purpose, a large potential of methods is available: advanced measurement techniques, mathematical-numerical modelling processes, artificial intelligence tools, data bases and CAD systems. The potential inherent in these methods to enhance inspection reliability is outlined by way of different applications. (orig.)

  19. Reliability

    OpenAIRE

    Condon, David; Revelle, William

    2017-01-01

    Separating the signal in a test from the irrelevant noise is a challenge for all measurement. Low test reliability limits test validity, attenuates important relationships, and can lead to regression artifacts. Multiple approaches to the assessment and improvement of reliability are discussed. The advantages and disadvantages of several different approaches to reliability are considered. Practical advice on how to assess reliability using open source software is provided.

  20. Knowledge modelling and reliability processing: presentation of the Figaro language and associated tools

    International Nuclear Information System (INIS)

    Bouissou, M.; Villatte, N.; Bouhadana, H.; Bannelier, M.

    1991-12-01

    EDF has been developing for several years an integrated set of knowledge-based and algorithmic tools for automating the reliability assessment of complex (especially sequential) systems. In this environment, the reliability expert has at his disposal powerful software tools for qualitative and quantitative processing, along with various means to generate the inputs for these tools automatically through the acquisition of graphical data. The development of these tools has been based on FIGARO, a specific language built to provide homogeneous system modelling. Various compilers and interpreters translate a FIGARO model into conventional models such as fault trees, Markov chains and Petri nets. In this report, we introduce the main basics of the FIGARO language, illustrating them with examples.

  1. Pilot testing of SHRP 2 reliability data and analytical products: Washington.

    Science.gov (United States)

    2014-07-30

    The second Strategic Highway Research Program (SHRP 2) addresses the challenges of moving people and goods efficiently and safely on the nation's highways. In its Reliability focus area, the research emphasizes improving the reliability of highway ...

  2. Process analytical technology (PAT) tools for the cultivation step in biopharmaceutical production

    NARCIS (Netherlands)

    Streefland, M.; Martens, D.E.; Beuvery, E.C.; Wijffels, R.H.

    2013-01-01

    The process analytical technology (PAT) initiative is now 10 years old. This has resulted in the development of many tools and software packages dedicated to PAT application on pharmaceutical processes. However, most applications are restricted to small molecule drugs, mainly for the relatively

  3. Current research relevant to the improvement of γ-ray spectroscopy as an analytical tool

    International Nuclear Information System (INIS)

    Meyer, R.A.; Tirsell, K.G.; Armantrout, G.A.

    1976-01-01

    Four areas of research that will have significant impact on the further development of γ-ray spectroscopy as an accurate analytical tool are considered: (1) automation; (2) accurate multigamma-ray sources; (3) the accuracy of the current and future γ-ray energy scale; and (4) new solid-state X- and γ-ray detectors.

  4. Perceptual attraction in tool use: evidence for a reliability-based weighting mechanism.

    Science.gov (United States)

    Debats, Nienke B; Ernst, Marc O; Heuer, Herbert

    2017-04-01

    Humans are well able to operate tools whereby their hand movement is linked, via a kinematic transformation, to a spatially distant object moving in a separate plane of motion. An everyday example is controlling a cursor on a computer monitor. Despite these separate reference frames, the perceived positions of the hand and the object were found to be biased toward each other. We propose that this perceptual attraction is based on the principles by which the brain integrates redundant sensory information about single objects or events, known as optimal multisensory integration. That is, (1) sensory information about the hand and the tool is weighted according to its relative reliability (i.e., inverse variance), and (2) the unisensory reliabilities sum in the integrated estimate. We assessed whether perceptual attraction is consistent with the predictions of the optimal multisensory integration model. We used a cursor-control tool-use task in which we manipulated the relative reliability of the unisensory hand and cursor position estimates. The perceptual biases shifted according to these relative reliabilities, with an additional bias due to contextual factors that were present in experiment 1 but not in experiment 2. The variances of the biased position judgments were, however, systematically larger than the predicted optimal variances. Our findings suggest that the perceptual attraction in tool use results from a reliability-based weighting mechanism similar to optimal multisensory integration, but that certain boundary conditions for optimality might not be satisfied. NEW & NOTEWORTHY Kinematic tool use is associated with a perceptual attraction between the spatially separated hand and the effective part of the tool. We provide a formal account of this phenomenon, showing that the process behind it is similar to optimal integration of sensory information relating to single objects. Copyright © 2017 the American Physiological Society.
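
    The two integration principles can be written in a few lines. This is the textbook minimum-variance (reliability-weighted) combination that the model predictions are based on; the position and variance values below are illustrative only, not the study's measurements.

```python
# Reliability-weighted (minimum-variance) integration of two position
# estimates: weights are inverse variances, and the integrated variance
# is the inverse of the summed reliabilities. Values are illustrative.

def integrate(estimates, variances):
    """Fuse unisensory estimates; returns (fused position, fused variance)."""
    reliabilities = [1.0 / v for v in variances]
    total = sum(reliabilities)
    fused = sum(r * x for r, x in zip(reliabilities, estimates)) / total
    fused_var = 1.0 / total
    return fused, fused_var

# Hand felt at 10.0 (variance 4.0), cursor seen at 12.0 (variance 1.0):
pos, var = integrate([10.0, 12.0], [4.0, 1.0])
```

    The fused estimate (11.6) is pulled toward the more reliable cursor estimate, and the fused variance (0.8) is smaller than either unisensory variance; the study's key observation is that measured variances exceeded this optimal prediction.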

  5. Validity and Reliability of Persian Version of Johns Hopkins Fall Risk Assessment Tool among Aged People

    Directory of Open Access Journals (Sweden)

    Hadi Hojati

    2018-04-01

    Full Text Available Background & Aim: It is crucial to identify aged patients at risk of falls in clinical settings. The Johns Hopkins Fall Risk Assessment Tool (JHFRAT) is one of the most widely applied international instruments for assessing elderly patients' risk of falls. The aim of this study was to evaluate the reliability and internal consistency of the JHFRAT. Methods & Materials: In this cross-sectional validation study, the WHO's standard protocol was applied for translation and back-translation of the tool. Face and content validity of the tool, and its applicability in clinical settings, were confirmed by ten expert faculty members. Inclusion criteria for this pilot study were being 60 years of age or older, having been hospitalized within the 8 hours prior to assessment, and being in proper cognitive condition as assessed by the MMSE. Subjects were elderly patients (n=70) newly hospitalized in Shahroud Emam Hossein Hospital. Data were analyzed using SPSS software, version 16. Internal consistency of the tool was calculated by Cronbach's alpha. Results: According to the results of the study, the Persian version of the JHFRAT was a valid tool for application in clinical settings, with a Cronbach's alpha of 0.733. Conclusion: Based on the findings of the current study, it can be concluded that the Persian version of the JHFRAT is a valid and reliable tool for assessing elderly patients on admission in any clinical setting.
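
Cronbach's alpha, the internal-consistency statistic reported above, can be computed directly from item and total-score variances; a minimal sketch with made-up item scores (not the JHFRAT data):

```python
# Hedged sketch of Cronbach's alpha for a k-item scale:
#   alpha = k/(k-1) * (1 - sum(item variances) / total-score variance)

def cronbach_alpha(scores):
    """scores: one row per respondent, each row a list of k item scores."""
    k = len(scores[0])

    def pvar(xs):  # population variance, as in the classic formula
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = [pvar([row[i] for row in scores]) for i in range(k)]
    total_var = pvar([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

data = [[3, 4, 3], [2, 2, 3], [4, 5, 4], [1, 2, 2]]  # hypothetical responses
print(round(cronbach_alpha(data), 3))  # → 0.939
```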

  6. The risk of bias in systematic reviews tool showed fair reliability and good construct validity.

    Science.gov (United States)

    Bühn, Stefanie; Mathes, Tim; Prengel, Peggy; Wegewitz, Uta; Ostermann, Thomas; Robens, Sibylle; Pieper, Dawid

    2017-11-01

    There is a movement from generic quality checklists toward a more domain-based approach in critical appraisal tools. This study aimed to report on a first experience with the newly developed risk of bias in systematic reviews (ROBIS) tool and to compare it with A Measurement Tool to Assess Systematic Reviews (AMSTAR), the most commonly used tool for assessing the methodological quality of systematic reviews, while assessing validity, reliability, and applicability. Validation study with four reviewers based on 16 systematic reviews in the field of occupational health. Interrater reliability (IRR) of all four raters was highest for domain 2 (Fleiss' kappa κ = 0.56) and lowest for domain 4 (κ = 0.04). For ROBIS, median IRR was κ = 0.52 (range 0.13-0.88) for the experienced pair of raters compared to κ = 0.32 (range 0.12-0.76) for the less experienced pair of raters. The percentage of "yes" scores of each review's ROBIS ratings was strongly correlated with the AMSTAR ratings (rs = 0.76; P = 0.01). ROBIS has fair reliability and good construct validity for assessing the risk of bias in systematic reviews. More validation studies are needed to investigate reliability and applicability in particular. Copyright © 2017 Elsevier Inc. All rights reserved.
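
Fleiss' kappa, the multi-rater agreement statistic used for the domain ratings above, can be sketched as follows (illustrative rating table, not the study's data):

```python
# Minimal stdlib sketch of Fleiss' kappa. table[i][j] = number of raters
# assigning subject i to category j; every row must sum to the same
# rater count n.

def fleiss_kappa(table):
    N = len(table)     # number of subjects
    n = sum(table[0])  # raters per subject
    k = len(table[0])  # number of categories
    # Mean observed per-subject agreement
    P_bar = sum(
        (sum(c * c for c in row) - n) / (n * (n - 1)) for row in table
    ) / N
    # Expected chance agreement from the marginal category proportions
    p = [sum(row[j] for row in table) / (N * n) for j in range(k)]
    P_e = sum(pj * pj for pj in p)
    return (P_bar - P_e) / (1 - P_e)

# 3 subjects rated by 4 raters into 2 categories
print(round(fleiss_kappa([[4, 0], [2, 2], [0, 4]]), 3))  # → 0.556
```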

  7. Monte Carlo simulation - a powerful tool to support experimental activities in structure reliability

    International Nuclear Information System (INIS)

    Yuritzinn, T.; Chapuliot, S.; Eid, M.; Masson, R.; Dahl, A.; Moinereau, D.

    2003-01-01

    Monte-Carlo Simulation (MCS) can have different uses in supporting structure reliability investigations and assessments. In this paper we focus our interest on the use of MCS as a numerical tool to support the fitting of the experimental data related to toughness experiments. (authors)
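
As a toy illustration of the idea (hypothetical distributions, not the paper's toughness data), Monte Carlo sampling can propagate experimental scatter into a derived failure probability:

```python
import random

# Toy Monte Carlo: sample scatter in a measured material property and an
# applied load, and estimate the probability that the load exceeds the
# property. All distribution parameters are hypothetical.

random.seed(42)

def simulate(n=100_000):
    failures = 0
    for _ in range(n):
        toughness = random.gauss(100.0, 10.0)  # hypothetical property
        load = random.gauss(70.0, 15.0)        # hypothetical demand
        if load > toughness:
            failures += 1
    return failures / n

print(simulate())
```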

  8. Reliable tool life measurements in turning - an application to cutting fluid efficiency evaluation

    DEFF Research Database (Denmark)

    Axinte, Dragos A.; Belluco, Walter; De Chiffre, Leonardo

    2001-01-01

    The paper proposes a method to obtain reliable measurements of tool life in turning, discussing aspects related to experimental procedure and measurement accuracy. The method (i) allows an experimental determination of the extended Taylor's equation with a limited set of experiments, and (ii) provides efficiency evaluation. Six cutting oils, five of which were formulated from vegetable basestock, were evaluated in turning. Experiments were run over a range of cutting parameters, according to a 2^(3-1) factorial design, machining AISI 316L stainless steel with coated carbide tools. Tool life...
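
The extended Taylor equation mentioned above has the form v·T^n·f^a·d^b = C; a minimal sketch of evaluating it for tool life (the constants below are hypothetical, not the paper's fitted values):

```python
# Hedged sketch of the extended Taylor tool-life equation
#   v * T**n * f**a * d**b = C,
# solved for life T given cutting speed v, feed f, and depth of cut d.
# All constants are hypothetical illustrations.

def tool_life(v, f, d, C, n, a, b):
    return (C / (v * f**a * d**b)) ** (1.0 / n)

# Raising cutting speed shortens tool life
print(round(tool_life(150.0, 0.2, 1.5, C=300.0, n=0.25, a=0.3, b=0.15), 1))
print(round(tool_life(180.0, 0.2, 1.5, C=300.0, n=0.25, a=0.3, b=0.15), 1))
```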

  9. RADYBAN: A tool for reliability analysis of dynamic fault trees through conversion into dynamic Bayesian networks

    International Nuclear Information System (INIS)

    Montani, S.; Portinale, L.; Bobbio, A.; Codetta-Raiteri, D.

    2008-01-01

    In this paper, we present RADYBAN (Reliability Analysis with DYnamic BAyesian Networks), a software tool which allows the analysis of a dynamic fault tree by converting it into a dynamic Bayesian network. The tool implements a modular algorithm for automatically translating a dynamic fault tree into the corresponding dynamic Bayesian network, and it exploits classical algorithms for inference on dynamic Bayesian networks in order to compute reliability measures. After describing the basic features of the tool, we show how it operates on a real-world example, and we compare the unreliability results it generates with those returned by other methodologies, in order to verify the correctness and consistency of the results obtained.
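
RADYBAN targets *dynamic* gates via Bayesian networks; the classical static AND/OR combination rules that underlie any fault-tree unreliability computation can be sketched as follows (illustrative probabilities only):

```python
# Static fault-tree combination rules for independent basic events.
# This only illustrates the classical AND/OR algebra, not RADYBAN's
# dynamic-gate handling. Probabilities are hypothetical.

def and_gate(probs):   # all inputs must fail
    out = 1.0
    for p in probs:
        out *= p
    return out

def or_gate(probs):    # at least one input fails
    out = 1.0
    for p in probs:
        out *= 1.0 - p
    return 1.0 - out

# Top event: (grid fails AND generator fails) OR breaker fails
top = or_gate([and_gate([0.1, 0.05]), 0.01])
print(round(top, 5))  # → 0.01495
```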

  10. Reliability Oriented Design Tool For the New Generation of Grid Connected PV-Inverters

    DEFF Research Database (Denmark)

    Sintamarean, Nicolae Cristian; Blaabjerg, Frede; Wang, Huai

    2015-01-01

    This paper introduces a reliability-oriented design tool for a new generation of grid-connected photovoltaic (PV) inverters. The proposed design tool consists of a real field mission profile (RFMP) model (for two operating regions: USA and Denmark), a PV panel model, a grid-connected PV inverter ... is achieved and is further used as an input to the lifetime model. The proposed reliability-oriented design tool is used to study the impact of mission profile (MP) variation and device degradation (aging) on the PV inverter lifetime. The obtained results indicate that the MP of the field where the PV inverter is operating has an important impact (up to 70%) on the converter lifetime expectation, and it should be considered in the design stage to better optimize the converter design margin. In order to have a correct lifetime estimation, it is crucial to also consider the device degradation feedback (in...

  11. A Turkish Version of the Critical-Care Pain Observation Tool: Reliability and Validity Assessment.

    Science.gov (United States)

    Aktaş, Yeşim Yaman; Karabulut, Neziha

    2017-08-01

    The study aim was to evaluate the validity and reliability of the Critical-Care Pain Observation Tool in critically ill patients. A repeated measures design was used for the study. A convenience sample of 66 patients who had undergone open-heart surgery in the cardiovascular surgery intensive care unit in Ordu, Turkey, was recruited for the study. The patients were evaluated using the Critical-Care Pain Observation Tool at rest, during a nociceptive procedure (suctioning), and 20 minutes after the procedure, while they were conscious and intubated after surgery. The Turkish version of the Critical-Care Pain Observation Tool has shown statistically acceptable levels of validity and reliability. Inter-rater reliability was supported by moderate-to-high weighted κ coefficients (weighted κ = 0.55 to 1.00). For concurrent validity, significant associations were found between scores on the Critical-Care Pain Observation Tool and the Behavioral Pain Scale. Discriminant validity was also supported by higher scores during suctioning (a nociceptive procedure) versus non-nociceptive procedures. The internal consistency of the Critical-Care Pain Observation Tool was 0.72 during a nociceptive procedure and 0.71 during a non-nociceptive procedure. The validity and reliability of the Turkish version of the Critical-Care Pain Observation Tool were determined to be acceptable for pain assessment in critical care, especially for patients who cannot communicate verbally. Copyright © 2016 American Society of PeriAnesthesia Nurses. Published by Elsevier Inc. All rights reserved.

  12. Hasse diagram as a green analytical metrics tool: ranking of methods for benzo[a]pyrene determination in sediments.

    Science.gov (United States)

    Bigus, Paulina; Tsakovski, Stefan; Simeonov, Vasil; Namieśnik, Jacek; Tobiszewski, Marek

    2016-05-01

    This study presents an application of the Hasse diagram technique (HDT) as an assessment tool for selecting the most appropriate analytical procedures according to their greenness or the best analytical performance. The dataset consists of analytical procedures for benzo[a]pyrene determination in sediment samples, described by 11 variables concerning their greenness and analytical performance. Two analyses with the HDT were performed: the first with metrological variables and the second with "green" variables as input data. The two HDT analyses ranked different analytical procedures as the most valuable, suggesting that green analytical chemistry is not in accordance with metrology when benzo[a]pyrene in sediment samples is determined. The HDT can be used as a good decision-support tool for choosing the proper analytical procedure with respect to green analytical chemistry principles and analytical performance merits.
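
The partial-order comparison at the heart of the Hasse diagram technique reduces to a dominance test; a minimal sketch with hypothetical procedure scores (lower = better):

```python
# Dominance test underlying the Hasse diagram technique: procedure x is
# ranked above y only if x is at least as good on every criterion and
# strictly better on at least one. Scores below are hypothetical.

def dominates(x, y):
    return all(a <= b for a, b in zip(x, y)) and any(a < b for a, b in zip(x, y))

procedures = {
    "A": (2, 1, 3),  # hypothetical scores on three criteria
    "B": (3, 2, 3),
    "C": (1, 4, 2),
}
# Non-dominated procedures form the top level of the diagram
maximal = [k for k in procedures
           if not any(dominates(procedures[j], procedures[k])
                      for j in procedures if j != k)]
print(sorted(maximal))  # → ['A', 'C']
```

Procedures A and C are incomparable (each wins on some criterion), which is exactly the situation the abstract describes between "green" and metrological rankings.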

  13. New Tools to Prepare ACE Cross-section Files for MCNP Analytic Test Problems

    International Nuclear Information System (INIS)

    Brown, Forrest B.

    2016-01-01

    Monte Carlo calculations using one-group cross sections, multigroup cross sections, or simple continuous-energy cross sections are often used to: (1) verify production codes against known analytical solutions, (2) verify new methods and algorithms that do not involve detailed collision physics, (3) compare Monte Carlo calculation methods with deterministic methods, and (4) teach fundamentals to students. In this work we describe two new tools for preparing the ACE cross-section files to be used by MCNP® for these analytic test problems: simple_ace.pl and simple_ace_mg.pl.

  14. Use of reliability engineering tools in safety and risk assessment of nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Raso, Amanda Laureano; Vasconcelos, Vanderley de; Marques, Raíssa Oliveira; Soares, Wellington Antonio; Mesquita, Amir Zacarias, E-mail: amandaraso@hotmail.com, E-mail: vasconv@cdtn.br, E-mail: raissaomarques@gmail.com, E-mail: soaresw@cdtn.br, E-mail: amir@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil). Serviço de Tecnologia de Reatores

    2017-07-01

    Safety, reliability and availability are fundamental criteria in the design, construction and operation of nuclear facilities such as nuclear power plants. Deterministic and probabilistic risk assessments of such facilities are required by regulatory authorities in order to meet licensing regulations, contributing to assuring safety as well as reducing costs and environmental impacts. Probabilistic Risk Assessment has become an important part of the licensing requirements of nuclear power plants in Brazil and in the world. Risk can be defined as a qualitative and/or quantitative assessment of accident sequence frequencies (or probabilities) and their consequences. Risk management is a systematic application of management policies, procedures and practices to identify, analyze, plan, implement, control, communicate and document risks. Several tools and computer codes must be combined in order to estimate both probabilities and consequences of accidents. Event Tree Analysis (ETA), Fault Tree Analysis (FTA), Reliability Block Diagrams (RBD), and Markov models are examples of evaluation tools that can support the safety and risk assessment, for analyzing process systems, identifying potential accidents, and estimating consequences. Because of the complexity of such analyses, specialized computer codes are required, such as the reliability engineering software developed by ReliaSoft® Corporation. BlockSim (FTA, RBD and Markov models), RENO (ETA and consequence assessment), Weibull++ (life data and uncertainty analysis), and Xfmea (qualitative risk assessment) are some codes that can be highlighted. This work describes an integrated approach using these tools and software to carry out reliability, safety, and risk assessment of nuclear facilities, as well as an application example. (author)
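
As a toy illustration of one of the tools named above, an event-tree sequence frequency is simply the initiating-event frequency multiplied by the branch probabilities along the path (all numbers hypothetical):

```python
# Toy event-tree quantification: each accident-sequence frequency is the
# initiating-event frequency multiplied by the success/failure
# probabilities along its branch path. All numbers are hypothetical.

INITIATOR = 1e-2         # initiating events per year
P_COOLING_FAILS = 1e-3   # branch probabilities
P_BACKUP_FAILS = 1e-2

sequences = {
    "core damage": INITIATOR * P_COOLING_FAILS * P_BACKUP_FAILS,
    "shutdown via backup": INITIATOR * P_COOLING_FAILS * (1 - P_BACKUP_FAILS),
    "no challenge": INITIATOR * (1 - P_COOLING_FAILS),
}
# The sequence frequencies partition the initiator frequency
print(sequences["core damage"])
```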

  15. The PRECIS-2 tool has good interrater reliability and modest discriminant validity.

    Science.gov (United States)

    Loudon, Kirsty; Zwarenstein, Merrick; Sullivan, Frank M; Donnan, Peter T; Gágyor, Ildikó; Hobbelen, Hans J S M; Althabe, Fernando; Krishnan, Jerry A; Treweek, Shaun

    2017-08-01

    PRagmatic Explanatory Continuum Indicator Summary (PRECIS)-2 is a tool that could improve design insight for trialists. Our aim was to validate the PRECIS-2 tool by testing its discriminant validity and interrater reliability, which had not been done for its predecessor. Over 80 international trialists, methodologists, clinicians, and policymakers created PRECIS-2, helping to ensure face validity and content validity. The interrater reliability of PRECIS-2 was measured using 19 experienced trialists who used PRECIS-2 to score a diverse sample of 15 randomized controlled trial protocols. Discriminant validity was tested by having two raters independently determine whether the trial protocols were more pragmatic or more explanatory, with scores from the 19 raters for the 15 trials as predictors of pragmatism. Interrater reliability was generally good, with seven of nine domains having an intraclass correlation coefficient over 0.65. Flexibility (adherence) and recruitment had wide confidence intervals, but raters found these difficult to rate and wanted more information. Each of the nine PRECIS-2 domains could be used to differentiate between trials taking more pragmatic or more explanatory approaches, with better-than-chance discrimination for all domains. We have assessed the validity and reliability of PRECIS-2. An elaboration study and web site provide guidance to help future users of the tool, which is continuing to be tested by trial teams, systematic reviewers, and funders. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. Use of reliability engineering tools in safety and risk assessment of nuclear facilities

    International Nuclear Information System (INIS)

    Raso, Amanda Laureano; Vasconcelos, Vanderley de; Marques, Raíssa Oliveira; Soares, Wellington Antonio; Mesquita, Amir Zacarias

    2017-01-01

    Safety, reliability and availability are fundamental criteria in the design, construction and operation of nuclear facilities such as nuclear power plants. Deterministic and probabilistic risk assessments of such facilities are required by regulatory authorities in order to meet licensing regulations, contributing to assuring safety as well as reducing costs and environmental impacts. Probabilistic Risk Assessment has become an important part of the licensing requirements of nuclear power plants in Brazil and in the world. Risk can be defined as a qualitative and/or quantitative assessment of accident sequence frequencies (or probabilities) and their consequences. Risk management is a systematic application of management policies, procedures and practices to identify, analyze, plan, implement, control, communicate and document risks. Several tools and computer codes must be combined in order to estimate both probabilities and consequences of accidents. Event Tree Analysis (ETA), Fault Tree Analysis (FTA), Reliability Block Diagrams (RBD), and Markov models are examples of evaluation tools that can support the safety and risk assessment, for analyzing process systems, identifying potential accidents, and estimating consequences. Because of the complexity of such analyses, specialized computer codes are required, such as the reliability engineering software developed by ReliaSoft® Corporation. BlockSim (FTA, RBD and Markov models), RENO (ETA and consequence assessment), Weibull++ (life data and uncertainty analysis), and Xfmea (qualitative risk assessment) are some codes that can be highlighted. This work describes an integrated approach using these tools and software to carry out reliability, safety, and risk assessment of nuclear facilities, as well as an application example. (author)

  17. Chemometrics-based process analytical technology (PAT) tools: applications and adaptation in pharmaceutical and biopharmaceutical industries.

    Science.gov (United States)

    Challa, Shruthi; Potumarthi, Ravichandra

    2013-01-01

    Process analytical technology (PAT) is used to monitor and control critical process parameters in raw materials and in-process products in order to maintain critical quality attributes and build quality into the product. Process analytical technology can be successfully implemented in pharmaceutical and biopharmaceutical industries not only to impart quality into the products but also to prevent out-of-specification results and improve productivity. PAT implementation eliminates the drawbacks of traditional methods, which involve excessive sampling, and facilitates rapid testing through direct sampling without any destruction of the sample. However, to successfully adapt PAT tools to pharmaceutical and biopharmaceutical environments, a thorough understanding of the process is needed, along with mathematical and statistical tools to analyze the large multidimensional spectral data generated by PAT tools. Chemometrics is a chemical discipline which incorporates both statistical and mathematical methods to obtain and analyze relevant information from PAT spectral tools. Applications of commonly used PAT tools in combination with appropriate chemometric methods, along with their advantages and working principles, are discussed. Finally, the systematic application of PAT tools in a biopharmaceutical environment to control critical process parameters for achieving product quality is diagrammatically represented.

  18. Reliability and criterion-related validity testing (construct) of the Endotracheal Suction Assessment Tool (ESAT©).

    Science.gov (United States)

    Davies, Kylie; Bulsara, Max K; Ramelet, Anne-Sylvie; Monterosso, Leanne

    2018-05-01

    To establish criterion-related construct validity and test-retest reliability for the Endotracheal Suction Assessment Tool© (ESAT©). Endotracheal tube suction performed in children can significantly affect clinical stability. Previously identified clinical indicators for endotracheal tube suction were used as criteria when designing the ESAT©. Content validity was reported previously. The final stages of psychometric testing are presented. Observational testing was used to measure construct validity and determine whether the ESAT© could guide "inexperienced" paediatric intensive care nurses' decision-making regarding endotracheal tube suction. Test-retest reliability of the ESAT© was performed at two time points. The researchers and paediatric intensive care nurse "experts" developed 10 hypothetical clinical scenarios with predetermined endotracheal tube suction outcomes. "Experienced" (n = 12) and "inexperienced" (n = 14) paediatric intensive care nurses were presented with the scenarios and the ESAT© guiding decision-making about whether to perform endotracheal tube suction for each scenario. Outcomes were compared with those predetermined by the "experts" (n = 9). Test-retest reliability of the ESAT© was measured at two consecutive time points (4 weeks apart) with "experienced" and "inexperienced" paediatric intensive care nurses using the same scenarios and tool to guide decision-making. No differences were observed between endotracheal tube suction decisions made by "experts" (n = 9), "inexperienced" (n = 14) and "experienced" (n = 12) nurses confirming the tool's construct validity. No differences were observed between groups for endotracheal tube suction decisions at T1 and T2. Criterion-related construct validity and test-retest reliability of the ESAT© were demonstrated. Further testing is recommended to confirm reliability in the clinical setting with the "inexperienced" nurse to guide decision-making related to endotracheal tube

  19. Design for Reliability and Robustness Tool Platform for Power Electronic Systems – Study Case on Motor Drive Applications

    DEFF Research Database (Denmark)

    Vernica, Ionut; Wang, Huai; Blaabjerg, Frede

    2018-01-01

    With the conventional approach, mainly based on failure statistics from the field, the reliability evaluation of the power devices is still a challenging task. In order to address the given problem, a MATLAB-based reliability assessment tool has been developed. The Design for Reliability and Robustness (DfR2) tool allows the user to easily investigate the reliability performance of the power electronic components (or sub-systems) under given input mission profiles and operating conditions. The main concept of the tool and its framework are introduced, highlighting the reliability assessment procedure for power semiconductor devices. Finally, a motor drive application is implemented and the reliability performance of the power devices is investigated with the help of the DfR2 tool, and the resulting reliability metrics are presented.

  20. Analytical and Empirical Modeling of Wear and Forces of CBN Tool in Hard Turning - A Review

    Science.gov (United States)

    Patel, Vallabh Dahyabhai; Gandhi, Anishkumar Hasmukhlal

    2017-08-01

    Machining of steel with hardness above 45 HRC (Hardness-Rockwell C) is referred to as hard turning. There are numerous models which should be scrutinized and implemented to gain optimum performance in hard turning. Various models of hard turning with cubic boron nitride tools have been reviewed, in an attempt to identify appropriate empirical and analytical models. Validation of the steady-state flank and crater wear model, Usui's wear model, forces according to oblique cutting theory, the extended Lee and Shaffer force model, and models of chip formation and progressive flank wear are depicted in this review paper. Effort has been made to understand the relationship between tool wear and tool force under different cutting conditions and tool geometries, so that an appropriate model can be selected according to user requirements in hard turning.

  1. Development of a Suite of Analytical Tools for Energy and Water Infrastructure Knowledge Discovery

    Science.gov (United States)

    Morton, A.; Piburn, J.; Stewart, R.; Chandola, V.

    2017-12-01

    Energy and water generation and delivery systems are inherently interconnected. With demand for energy growing, the energy sector is experiencing increasing competition for water. With increasing population and changing environmental, socioeconomic, and demographic scenarios, new technology and investment decisions must be made for optimized and sustainable energy-water resource management. This also requires novel scientific insights into the complex interdependencies of energy-water infrastructures across multiple space and time scales. To address this need, we've developed a suite of analytical tools to support an integrated data driven modeling, analysis, and visualization capability for understanding, designing, and developing efficient local and regional practices related to the energy-water nexus. This work reviews the analytical capabilities available along with a series of case studies designed to demonstrate the potential of these tools for illuminating energy-water nexus solutions and supporting strategic (federal) policy decisions.

  2. Analytics for smart energy management tools and applications for sustainable manufacturing

    CERN Document Server

    Oh, Seog-Chan

    2016-01-01

    This book introduces the issues and problems that arise when implementing smart energy management for sustainable manufacturing in the automotive manufacturing industry, and the analytical tools and applications to deal with them. It uses a number of illustrative examples to explain energy management in automotive manufacturing, which involves most types of manufacturing technology and various levels of energy consumption. It demonstrates how analytical tools can help improve energy management processes, including forecasting, consumption and performance analysis, identification of emerging technologies, and investment decisions for establishing smart energy consumption practices. It also details practical energy management systems, making it a valuable resource for professionals involved in real energy management processes, and allowing readers to implement the procedures and applications presented.

  3. Pilot testing of SHRP 2 reliability data and analytical products: Southern California.

    Science.gov (United States)

    2015-01-01

    The second Strategic Highway Research Program (SHRP 2) has been investigating the critical subject of travel time reliability for several years. As part of this research, SHRP 2 supported multiple efforts to develop products to evaluate travel time r...

  4. Pilot testing of SHRP 2 reliability data and analytical products: Florida.

    Science.gov (United States)

    2015-01-01

    Transportation agencies have realized the importance of performance estimation, measurement, and management. The Moving Ahead for Progress in the 21st Century Act legislation identifies travel time reliability as one of the goals of the federal highw...

  5. Pilot testing of SHRP 2 reliability data and analytical products: Southern California. [supporting datasets

    Science.gov (United States)

    2014-01-01

    The objective of this project was to develop system designs for programs to monitor travel time reliability and to prepare a guidebook that practitioners and others can use to design, build, operate, and maintain such systems. Generally, such travel ...

  6. Development of computer-based analytical tool for assessing physical protection system

    International Nuclear Information System (INIS)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

    Assessment of physical protection system effectiveness is the priority for ensuring optimum protection against unlawful acts targeting a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a solution for approximating the likely threat scenarios. There are several currently available tools that can be used instantly, such as EASI and SAPE; however, for our research purpose it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool by utilizing a network methodological approach for modelling the adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool has the capability to analyze the most critical path and quantify the probability of effectiveness of the system as a performance measure.
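
The critical-path idea described above can be sketched as follows (hypothetical layout and detection probabilities; the actual tool models richer detection/delay/response interactions):

```python
# Hedged sketch of adversary-path analysis: each path segment has a
# detection probability, and the most critical path is the one the
# protection system is least likely to detect. All numbers hypothetical.

P_DETECT = {"fence": 0.5, "door": 0.8, "window": 0.3, "vault": 0.9}

PATHS = [
    ["fence", "door", "vault"],
    ["fence", "window", "vault"],
]

def p_interruption(path):
    """Probability the adversary is detected at least once on the path."""
    miss = 1.0
    for seg in path:
        miss *= 1.0 - P_DETECT[seg]
    return 1.0 - miss

critical = min(PATHS, key=p_interruption)  # weakest path for the defender
print(critical, round(p_interruption(critical), 3))
```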

  7. Development of computer-based analytical tool for assessing physical protection system

    Energy Technology Data Exchange (ETDEWEB)

    Mardhi, Alim, E-mail: alim-m@batan.go.id [National Nuclear Energy Agency Indonesia, (BATAN), PUSPIPTEK area, Building 80, Serpong, Tangerang Selatan, Banten (Indonesia); Chulalongkorn University, Faculty of Engineering, Nuclear Engineering Department, 254 Phayathai Road, Pathumwan, Bangkok Thailand. 10330 (Thailand); Pengvanich, Phongphaeth, E-mail: ppengvan@gmail.com [Chulalongkorn University, Faculty of Engineering, Nuclear Engineering Department, 254 Phayathai Road, Pathumwan, Bangkok Thailand. 10330 (Thailand)

    2016-01-22

    Assessment of physical protection system effectiveness is the priority for ensuring optimum protection against unlawful acts targeting a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a solution for approximating the likely threat scenarios. There are several currently available tools that can be used instantly, such as EASI and SAPE; however, for our research purpose it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool by utilizing a network methodological approach for modelling the adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool has the capability to analyze the most critical path and quantify the probability of effectiveness of the system as a performance measure.

  8. Development of computer-based analytical tool for assessing physical protection system

    Science.gov (United States)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

    Assessment of physical protection system effectiveness is the priority for ensuring optimum protection against unlawful acts targeting a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a solution for approximating the likely threat scenarios. There are several currently available tools that can be used instantly, such as EASI and SAPE; however, for our research purpose it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool by utilizing a network methodological approach for modelling the adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool has the capability to analyze the most critical path and quantify the probability of effectiveness of the system as a performance measure.

  9. An analytical model for computation of reliability of waste management facilities with intermediate storages

    International Nuclear Information System (INIS)

    Kallweit, A.; Schumacher, F.

    1977-01-01

    High reliability is required of waste management facilities within the fuel cycle of nuclear power stations; this can be achieved by providing intermediate storage facilities and reserve capacities. In this report a model based on the theory of Markov processes is described, which allows computation of reliability characteristics of waste management facilities containing intermediate storage facilities. The application of the model is demonstrated by an example. (orig.) [de
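
For the simplest repairable unit in such a Markov model, the steady-state availability has a closed form, mu/(lam + mu); a minimal sketch with hypothetical rates:

```python
# Two-state Markov sketch for a repairable unit: failure rate lam and
# repair rate mu (hypothetical, per day). The steady-state availability
# follows from balancing probability flow between the working and failed
# states: lam * P(working) = mu * P(failed).

def steady_state_availability(lam, mu):
    return mu / (lam + mu)

print(round(steady_state_availability(lam=0.01, mu=0.5), 4))  # → 0.9804
```

An intermediate storage buffer, as in the abstract, effectively lets a downstream stage ride through short outages, raising the availability of the chain above that of its weakest unit alone.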

  10. The Construct Validity and Reliability of an Assessment Tool for Competency in Cochlear Implant Surgery

    Directory of Open Access Journals (Sweden)

    Patorn Piromchai

    2014-01-01

    Full Text Available Introduction. We introduce a rating tool that objectively evaluates the skills of surgical trainees performing cochlear implant surgery. Methods. Seven residents and seven experts performed cochlear implant surgery sessions from mastoidectomy to cochleostomy on a standardized virtual reality temporal bone. A total of twenty-eight assessment videos were recorded and two consultant otolaryngologists evaluated the performance of each participant using these videos. Results. Interrater reliability was calculated using the intraclass correlation coefficient for both the global and checklist components of the assessment instrument. The overall agreement was high. The construct validity of this instrument was strongly supported by the significantly higher scores in the expert group for both components. Conclusion. Our results indicate that the proposed assessment tool for cochlear implant surgery is reliable, accurate, and easy to use. This instrument can thus be used to provide objective feedback on overall and task-specific competency in cochlear implantation.

  11. Experiments with Analytic Centers: A confluence of data, tools and help in using them.

    Science.gov (United States)

    Little, M. M.; Crichton, D. J.; Hines, K.; Cole, M.; Quam, B. M.

    2017-12-01

    Traditional repositories have been primarily focused on data stewardship. Over the past two decades, data scientists have attempted to overlay a superstructure to make these repositories more amenable to analysis tasks, with limited success. This poster will summarize lessons learned and some realizations regarding what it takes to create an analytic center. As the volume of Earth Science data grows and the sophistication of analytic tools improves, a pattern has emerged indicating that different science communities uniquely apply a selection of tools to the data to produce scientific results. Infrequently do the experiences of one group help steer other groups. How can the information technology community seed these domains with tools that conform to the thought processes and experiences of a particular science group? What types of successful technology infusions have occurred, and how does technology get adopted? AIST has been experimenting with the management of this analytic center process; this paper will summarize the results and indicate a direction for future infusion attempts.

  12. A New Tool for Nutrition App Quality Evaluation (AQEL): Development, Validation, and Reliability Testing.

    Science.gov (United States)

    DiFilippo, Kristen Nicole; Huang, Wenhao; Chapman-Novakofski, Karen M

    2017-10-27

    The extensive availability and increasing use of mobile apps for nutrition-based health interventions make evaluation of the quality of these apps crucial for their integration into nutritional counseling. The goal of this research was the development, validation, and reliability testing of the app quality evaluation (AQEL) tool, an instrument for evaluating apps' educational quality and technical functionality. Items for evaluating app quality were adapted from website evaluations, with additional items added to evaluate the specific characteristics of apps, resulting in 79 initial items. Expert panels of nutrition and technology professionals and app users reviewed items for face and content validation. After recommended revisions, nutrition experts completed a second AQEL review to ensure clarity. On the basis of 150 sets of responses using the revised AQEL, principal component analysis was completed, reducing AQEL to 5 factors that underwent reliability testing, including internal consistency, split-half reliability, test-retest reliability, and interrater reliability (IRR). Two additional modifiable constructs for evaluating apps based on the age and needs of the target audience as selected by the evaluator were also tested for construct reliability. IRR testing using intraclass correlations (ICC) with all 7 constructs was conducted, with 15 dietitians evaluating one app. Development and validation resulted in the 51-item AQEL. These were reduced to 25 items in 5 factors after principal component analysis, plus 9 modifiable items in two constructs that were not included in principal component analysis. Internal consistency and split-half reliability of the constructs derived from principal component analysis were good (Cronbach alpha >.80, Spearman-Brown coefficient >.80): behavior change potential, support of knowledge acquisition, app function, and skill development. App purpose split-half reliability was .65. 
Test-retest reliability showed no
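
    The internal-consistency statistics named above are straightforward to compute. This sketch evaluates Cronbach's alpha and a Spearman-Brown-corrected split-half coefficient on a synthetic item-response matrix; the data are made up and stand in for, not reproduce, the AQEL responses.

```python
import numpy as np

# Cronbach's alpha and Spearman-Brown-corrected split-half reliability for a
# made-up item-response matrix (rows = respondents, columns = items).
rng = np.random.default_rng(0)
true_score = rng.normal(size=(60, 1))
items = true_score + 0.5 * rng.normal(size=(60, 6))   # 6 roughly parallel items

def cronbach_alpha(x):
    k = x.shape[1]
    item_var = x.var(axis=0, ddof=1).sum()
    total_var = x.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

def split_half(x):
    a, b = x[:, ::2].sum(axis=1), x[:, 1::2].sum(axis=1)  # odd/even item halves
    r = np.corrcoef(a, b)[0, 1]
    return 2 * r / (1 + r)          # Spearman-Brown correction to full length

print(round(cronbach_alpha(items), 3), round(split_half(items), 3))
```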

  13. The role of quality tools in assessing reliability of the internet for health information.

    Science.gov (United States)

    Hanif, Faisal; Read, Janet C; Goodacre, John A; Chaudhry, Afzal; Gibbs, Paul

    2009-12-01

    The Internet has made it possible for patients and their families to access vast quantities of information that previously would have been difficult for anyone but a physician or librarian to obtain. Health information websites, however, are recognised to differ widely in the quality and reliability of their content. This has led to the development of various codes of conduct or quality rating tools to assess the quality of health websites. However, the validity and reliability of these quality tools and their applicability to different health websites also vary. In principle, rating tools should be available to consumers, require a limited number of elements to be assessed, be assessable in all elements, be readable, and be able to gauge the readability and consistency of the information provided from a patient's viewpoint. This article reviews the literature on trends in Internet use for health and analyses the various codes of conduct/ethics or 'quality tools' available to monitor the quality of health websites from a patient perspective.

  14. The cognitive environment simulation as a tool for modeling human performance and reliability

    International Nuclear Information System (INIS)

    Woods, D.D.; Pople, H. Jr.; Roth, E.M.

    1990-01-01

    The US Nuclear Regulatory Commission is sponsoring a research program to develop improved methods to model the cognitive behavior of nuclear power plant (NPP) personnel. Under this program, a tool for simulating how people form intentions to act in NPP emergency situations was developed using artificial intelligence (AI) techniques. This tool is called the Cognitive Environment Simulation (CES). The Cognitive Reliability Assessment Technique (or CREATE) was also developed to specify how CES can be used to enhance the measurement of the human contribution to risk in probabilistic risk assessment (PRA) studies. The next step in the research program was to evaluate the modeling tool and the method for using the tool for Human Reliability Analysis (HRA) in PRAs. Three evaluation activities were conducted. First, a panel of highly distinguished experts in cognitive modeling, AI, PRA and HRA provided a technical review of the simulation development work. Second, based on panel recommendations, CES was exercised on a family of steam generator tube rupture incidents where empirical data on operator performance already existed. Third, a workshop with HRA practitioners was held to analyze a worked example of the CREATE method to evaluate the role of CES/CREATE in HRA. The results of all three evaluations indicate that CES/CREATE represents a promising approach to modeling operator intention formation during emergency operations.

  15. A clinical tool to measure plagiocephaly in infants using a flexicurve: a reliability study

    Directory of Open Access Journals (Sweden)

    Leung A

    2013-10-01

    (95% CI 0.897–0.983); and for interrater reliability, ICC (df = 17) = 0.874 (95% CI 0.696–0.951). Conclusion: The modified cranial vault asymmetry index using a flexicurve to measure plagiocephaly is a reliable assessment tool. It is economical and efficient for use in clinical settings. Keywords: plagiocephaly, modified cranial vault asymmetry index, infant, community health, reliability

  16. A Business Analytics Software Tool for Monitoring and Predicting Radiology Throughput Performance.

    Science.gov (United States)

    Jones, Stephen; Cournane, Seán; Sheehy, Niall; Hederman, Lucy

    2016-12-01

    Business analytics (BA) is increasingly being utilised by radiology departments to analyse and present data. It encompasses statistical analysis, forecasting and predictive modelling and is used as an umbrella term for decision support and business intelligence systems. The primary aim of this study was to determine whether utilising BA technologies could contribute towards improved decision support and resource management within radiology departments. A set of information technology requirements were identified with key stakeholders, and a prototype BA software tool was designed, developed and implemented. A qualitative evaluation of the tool was carried out through a series of semi-structured interviews with key stakeholders. Feedback was collated, and emergent themes were identified. The results indicated that BA software applications can provide visibility of radiology performance data across all time horizons. The study demonstrated that the tool could potentially assist with improving operational efficiencies and management of radiology resources.

  17. Photography by Cameras Integrated in Smartphones as a Tool for Analytical Chemistry Represented by an Butyrylcholinesterase Activity Assay.

    Science.gov (United States)

    Pohanka, Miroslav

    2015-06-11

    Smartphones are popular devices frequently equipped with sensitive sensors and great computational ability. Despite the widespread availability of smartphones, practical uses in analytical chemistry are limited, though some papers have proposed promising applications. In the present paper, a smartphone is used as a tool for the determination of cholinesterasemia, i.e., the determination of the biochemical marker butyrylcholinesterase (BChE). The work should demonstrate the suitability of a smartphone-integrated camera for analytical purposes. Paper strips soaked with indoxylacetate were used for the determination of BChE activity, while the standard Ellman's assay was used as a reference measurement. In the smartphone-based assay, BChE converted indoxylacetate to indigo blue and the coloration was photographed using the phone's integrated camera. An RGB color model was analyzed and color values for the individual color channels were determined. The assay was verified using plasma samples and samples containing pure BChE, and validated against Ellman's assay. The smartphone assay proved to be reliable and applicable for routine diagnoses where BChE serves as a marker (liver function tests; some poisonings, etc.). It can be concluded that the assay is expected to be of practical applicability because of the results' relevance.
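
    The colorimetric step of such a smartphone assay can be sketched as follows: average one RGB channel over the photographed strip, then map it to activity through a linear calibration. The pixel values, channel choice, and calibration points below are invented for illustration, not taken from the paper.

```python
import numpy as np

def channel_mean(pixels_rgb, channel):          # channel: 0=R, 1=G, 2=B
    """Mean intensity of one color channel over an image patch."""
    return float(np.asarray(pixels_rgb)[:, :, channel].mean())

# Hypothetical calibration: blue-channel mean vs known BChE activity (units
# arbitrary); a linear fit gives slope and intercept.
cal_blue = np.array([200.0, 170.0, 140.0, 110.0])
cal_activity = np.array([0.0, 30.0, 60.0, 90.0])
slope, intercept = np.polyfit(cal_blue, cal_activity, 1)

# Fake 10x10 photo patch of the reacted strip, uniform RGB = (90, 120, 155).
strip = np.full((10, 10, 3), (90, 120, 155), dtype=np.uint8)
activity = slope * channel_mean(strip, 2) + intercept
print(round(activity, 1))
```

    In a real workflow the patch would come from the camera image (e.g. a cropped region of the strip) rather than a synthetic array.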

  18. Photography by Cameras Integrated in Smartphones as a Tool for Analytical Chemistry Represented by an Butyrylcholinesterase Activity Assay

    Directory of Open Access Journals (Sweden)

    Miroslav Pohanka

    2015-06-01

    Full Text Available Smartphones are popular devices frequently equipped with sensitive sensors and great computational ability. Despite the widespread availability of smartphones, practical uses in analytical chemistry are limited, though some papers have proposed promising applications. In the present paper, a smartphone is used as a tool for the determination of cholinesterasemia, i.e., the determination of the biochemical marker butyrylcholinesterase (BChE). The work should demonstrate the suitability of a smartphone-integrated camera for analytical purposes. Paper strips soaked with indoxylacetate were used for the determination of BChE activity, while the standard Ellman's assay was used as a reference measurement. In the smartphone-based assay, BChE converted indoxylacetate to indigo blue and the coloration was photographed using the phone's integrated camera. An RGB color model was analyzed and color values for the individual color channels were determined. The assay was verified using plasma samples and samples containing pure BChE, and validated against Ellman's assay. The smartphone assay proved to be reliable and applicable for routine diagnoses where BChE serves as a marker (liver function tests; some poisonings, etc.). It can be concluded that the assay is expected to be of practical applicability because of the results' relevance.

  19. Towards reliable multi-hop broadcast in VANETs : An analytical approach

    NARCIS (Netherlands)

    Gholibeigi, M.; Baratchi, M.; Berg, J.L. van den; Heijenk, G.

    2017-01-01

    Intelligent Transportation Systems in the domain of vehicular networking, have recently been subject to rapid development. In vehicular ad hoc networks, data broadcast is one of the main communication types and its reliability is crucial for high performance applications. However, due to the lack of

  20. Towards Reliable Multi-Hop Broadcast in VANETs: An Analytical Approach

    NARCIS (Netherlands)

    Gholibeigi, Mozhdeh; Baratchi, Mitra; van den Berg, Hans Leo; Heijenk, Geert

    2016-01-01

    Intelligent Transportation Systems in the domain of vehicular networking, have recently been subject to rapid development. In vehicular ad hoc networks, data broadcast is one of the main communication types and its reliability is crucial for high performance applications. However, due to the lack of

  1. The analytic hierarchy process as a systematic approach to the identification of important parameters for the reliability assessment of passive systems

    International Nuclear Information System (INIS)

    Zio, E.; Cantarella, M.; Cammi, A.

    2003-01-01

    Passive systems play a crucial role in the development of future solutions for nuclear plant technology. A fundamental issue still to be resolved is the quantification of the reliability of such systems. In this paper, we first illustrate a systematic methodology to guide the definition of the failure criteria of a passive system and the evaluation of its probability of occurrence, through the identification of the relevant system parameters and the propagation of their associated uncertainties. Within this methodology, we propose the use of the analytic hierarchy process as a structured and reproducible tool for the decomposition of the problem and the identification of the dominant system parameters. An example of its application to a real passive system is illustrated in detail.
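
    The analytic hierarchy process step can be illustrated with a small sketch: weights for three hypothetical system parameters are derived from the principal eigenvector of a pairwise-comparison matrix, and Saaty's consistency ratio checks the judgements. The comparison values are invented, not taken from the paper's case study.

```python
import numpy as np

# AHP sketch: priority weights from the principal eigenvector of a
# pairwise-comparison matrix, plus the consistency ratio (should be < 0.1).
A = np.array([[1,   3,   5],
              [1/3, 1,   2],
              [1/5, 1/2, 1]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                 # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                    # normalize to sum to 1

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)        # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]         # Saaty's random index
cr = ci / ri                                # consistency ratio
print(np.round(weights, 3), round(cr, 3))
```

    The weights rank the candidate parameters by dominance; a consistency ratio above 0.1 would signal that the pairwise judgements should be revisited.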

  2. Reliable identification of deep sulcal pits: the effects of scan session, scanner, and surface extraction tool.

    Directory of Open Access Journals (Sweden)

    Kiho Im

    Full Text Available Sulcal pit analysis has been providing novel insights into brain function and development. The purpose of this study was to evaluate the reliability of sulcal pit extraction with respect to the effects of scan session, scanner, and surface extraction tool. Five subjects were scanned 4 times at 3 MRI centers, and the other 5 subjects were scanned 3 times at 2 MRI centers, including 1 test-retest session. Sulcal pits were extracted on the white matter surfaces reconstructed with both Montreal Neurological Institute and FreeSurfer pipelines. We estimated the similarity of the presence of sulcal pits, having a maximum value of 1, and their spatial difference within the same subject. The tests showed high similarity of sulcal pit presence and low spatial difference. The similarity was more than 0.90 and the spatial difference was less than 1.7 mm in most cases across different scan sessions or scanners, and more than 0.85 and about 2.0 mm across surface extraction tools. The reliability of sulcal pit extraction was more affected by the image processing-related factors than by the scan session or scanner factors. Moreover, the similarity of sulcal pit distribution appeared to be largely influenced by the presence or absence of sulcal pits on shallow and small folds. We suggest that our sulcal pit extraction from MRI is highly reliable and could be useful for clinical applications as an imaging biomarker.

  3. Reliable identification of deep sulcal pits: the effects of scan session, scanner, and surface extraction tool.

    Science.gov (United States)

    Im, Kiho; Lee, Jong-Min; Jeon, Seun; Kim, Jong-Heon; Seo, Sang Won; Na, Duk L; Grant, P Ellen

    2013-01-01

    Sulcal pit analysis has been providing novel insights into brain function and development. The purpose of this study was to evaluate the reliability of sulcal pit extraction with respect to the effects of scan session, scanner, and surface extraction tool. Five subjects were scanned 4 times at 3 MRI centers, and the other 5 subjects were scanned 3 times at 2 MRI centers, including 1 test-retest session. Sulcal pits were extracted on the white matter surfaces reconstructed with both Montreal Neurological Institute and FreeSurfer pipelines. We estimated the similarity of the presence of sulcal pits, having a maximum value of 1, and their spatial difference within the same subject. The tests showed high similarity of sulcal pit presence and low spatial difference. The similarity was more than 0.90 and the spatial difference was less than 1.7 mm in most cases across different scan sessions or scanners, and more than 0.85 and about 2.0 mm across surface extraction tools. The reliability of sulcal pit extraction was more affected by the image processing-related factors than by the scan session or scanner factors. Moreover, the similarity of sulcal pit distribution appeared to be largely influenced by the presence or absence of sulcal pits on shallow and small folds. We suggest that our sulcal pit extraction from MRI is highly reliable and could be useful for clinical applications as an imaging biomarker.
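
    The two measures described, a presence similarity with maximum value 1 and a spatial difference, can be sketched with a simple greedy matching. The matching rule, the 5 mm radius, and the coordinates below are assumptions for illustration, not the authors' algorithm.

```python
import math

def pit_similarity(pits_a, pits_b, radius=5.0):
    """Greedily match pits from two sessions within `radius` mm; return a
    Dice-style presence similarity (max 1) and mean distance of matches."""
    unmatched_b, dists = list(pits_b), []
    for p in pits_a:
        best = min(unmatched_b, key=lambda q: math.dist(p, q), default=None)
        if best is not None and math.dist(p, best) <= radius:
            dists.append(math.dist(p, best))
            unmatched_b.remove(best)
    similarity = 2 * len(dists) / (len(pits_a) + len(pits_b))
    mean_diff = sum(dists) / len(dists) if dists else float("nan")
    return similarity, mean_diff

# Made-up pit coordinates (mm) from two scan sessions of one subject.
a = [(0, 0, 0), (10, 0, 0), (0, 10, 0)]
b = [(1, 0, 0), (10, 1, 0), (30, 30, 30)]
print(pit_similarity(a, b))
```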

  4. ATTIRE (analytical tools for thermal infrared engineering): A sensor simulation and modeling package

    Science.gov (United States)

    Jaggi, S.

    1993-01-01

    The Advanced Sensor Development Laboratory (ASDL) at the Stennis Space Center develops, maintains and calibrates remote sensing instruments for the National Aeronautics & Space Administration (NASA). To perform system design trade-offs, analysis, and establish system parameters, ASDL has developed a software package for analytical simulation of sensor systems. This package called 'Analytical Tools for Thermal InfraRed Engineering' - ATTIRE, simulates the various components of a sensor system. The software allows each subsystem of the sensor to be analyzed independently for its performance. These performance parameters are then integrated to obtain system level information such as Signal-to-Noise Ratio (SNR), Noise Equivalent Radiance (NER), Noise Equivalent Temperature Difference (NETD) etc. This paper describes the uses of the package and the physics that were used to derive the performance parameters.
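
    One of the figures of merit mentioned, NETD, can be related to NER through the sensitivity of scene radiance to temperature. The sketch below, which is not ATTIRE's actual code, evaluates dL/dT from Planck's law at 10 µm and 300 K by finite difference and divides an assumed (hypothetical) noise-equivalent spectral radiance by it.

```python
import math

# NETD ~ NER / (dL/dT): noise-equivalent radiance divided by the change of
# scene spectral radiance with temperature, here at 10 um and 300 K.
H, C, KB = 6.626e-34, 2.998e8, 1.381e-23   # Planck, light speed, Boltzmann

def planck_radiance(lam, T):
    """Blackbody spectral radiance, W / (m^2 sr m)."""
    return (2 * H * C**2 / lam**5) / math.expm1(H * C / (lam * KB * T))

lam, T, dT = 10e-6, 300.0, 0.01
dL_dT = (planck_radiance(lam, T + dT) - planck_radiance(lam, T - dT)) / (2 * dT)

NER = 2.0e3          # hypothetical noise-equivalent spectral radiance
NETD = NER / dL_dT   # kelvin
print(round(NETD, 3), "K")
```

    A full sensor model would integrate over the spectral band and fold in optics and detector terms; this keeps only the single-wavelength relationship.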

  5. Can We Practically Bring Physics-based Modeling Into Operational Analytics Tools?

    Energy Technology Data Exchange (ETDEWEB)

    Granderson, Jessica [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bonvini, Marco [Whisker Labs, Oakland, CA (United States); Piette, Mary Ann [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Page, Janie [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Lin, Guanjing [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hu, R. Lilly [Univ. of California, Berkeley, CA (United States)

    2017-08-11

    Analytics software is increasingly used to improve and maintain operational efficiency in commercial buildings. Energy managers, owners, and operators are using a diversity of commercial offerings often referred to as Energy Information Systems, Fault Detection and Diagnostic (FDD) systems, or more broadly Energy Management and Information Systems, to cost-effectively enable savings on the order of ten to twenty percent. Most of these systems use data from meters and sensors, with rule-based and/or data-driven models to characterize system and building behavior. In contrast, physics-based modeling uses first-principles and engineering models (e.g., efficiency curves) to characterize system and building behavior. Historically, these physics-based approaches have been used in the design phase of the building life cycle or in retrofit analyses. Researchers have begun exploring the benefits of integrating physics-based models with operational data analytics tools, bridging the gap between design and operations. In this paper, we detail the development and operator use of a software tool that uses hybrid data-driven and physics-based approaches to cooling plant FDD and optimization. Specifically, we describe the system architecture, models, and FDD and optimization algorithms; advantages and disadvantages with respect to purely data-driven approaches; and practical implications for scaling and replicating these techniques. Finally, we conclude with an evaluation of the future potential for such tools and future research opportunities.

  6. Development of the oil spill response cost-effectiveness analytical tool

    International Nuclear Information System (INIS)

    Etkin, D.S.; Welch, J.

    2005-01-01

    Decision-making during oil spill response operations or contingency planning requires balancing the need to remove as much oil as possible from the environment with the desire to minimize the impact of response operations on the environment they are intended to protect. This paper discussed the creation of a computer tool developed to help in planning and decision-making during response operations. The Oil Spill Response Cost-Effectiveness Analytical Tool (OSRCEAT) was developed to compare the costs of response with the benefits of response in both hypothetical and actual oil spills. The computer-based analytical tool can assist responders and contingency planners in decision-making as well as serve as a basis of discussion in the evaluation of response options. Using inputs on spill parameters, location, and response options, OSRCEAT can calculate response cost and the costs of the environmental and socioeconomic impacts of the oil spill and of the response itself. Oil damages without any response are contrasted with oil damages with response, which are expected to be lower. Response damages are subtracted from the difference in damages with and without response in order to derive a more accurate response benefit. An OSRCEAT user can test various response options to compare potential benefits in order to maximize response benefit. OSRCEAT is best used to compare and contrast the relative benefits and costs of various response options. 50 refs., 19 tabs., 2 figs
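
    The benefit arithmetic described in the abstract reduces to a few lines; the dollar figures below are placeholders for illustration, not OSRCEAT outputs.

```python
# Net response benefit as described above. All figures are placeholders.
damages_no_response   = 12_000_000   # environmental + socioeconomic, no action
damages_with_response =  5_000_000   # residual damages after cleanup
response_damages      =  1_500_000   # impacts caused by the response itself
response_cost         =  2_000_000   # operational cost of the response

# Benefit = avoided damages, minus the damage the response itself causes.
response_benefit = (damages_no_response - damages_with_response) - response_damages
benefit_cost_ratio = response_benefit / response_cost
print(response_benefit, round(benefit_cost_ratio, 2))
```

    Repeating this for each candidate response option and comparing the results mirrors the tool's intended use of ranking options by net benefit.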

  7. The Clinical Research Tool: a high-performance microdialysis-based system for reliably measuring interstitial fluid glucose concentration.

    Science.gov (United States)

    Ocvirk, Gregor; Hajnsek, Martin; Gillen, Ralph; Guenther, Arnfried; Hochmuth, Gernot; Kamecke, Ulrike; Koelker, Karl-Heinz; Kraemer, Peter; Obermaier, Karin; Reinheimer, Cornelia; Jendrike, Nina; Freckmann, Guido

    2009-05-01

    A novel microdialysis-based continuous glucose monitoring system, the so-called Clinical Research Tool (CRT), is presented. The CRT was designed exclusively for investigational use to offer high analytical accuracy and reliability. The CRT was built to avoid signal artifacts due to catheter clogging, flow obstruction by air bubbles, and flow variation caused by inconstant pumping. For differentiation between physiological events and system artifacts, the sensor current, counter electrode and polarization voltage, battery voltage, sensor temperature, and flow rate are recorded at a rate of 1 Hz. In vitro characterization with buffered glucose solutions (c_glucose = 0–26 × 10⁻³ mol/liter) over 120 h yielded a mean absolute relative error (MARE) of 2.9 ± 0.9% and a recorded mean flow rate of 330 ± 48 nl/min with periodic flow rate variation amounting to 24 ± 7%. The first 120 h of in vivo testing was conducted with five type 1 diabetes subjects wearing two systems each. A mean flow rate of 350 ± 59 nl/min and a periodic variation of 22 ± 6% were recorded. Utilizing 3 blood glucose measurements per day and a physical lag time of 1980 s, retrospective calibration of the 10 in vivo experiments yielded a MARE value of 12.4 ± 5.7%. Clarke error grid analysis resulted in 81.0%, 16.6%, 0.8%, 1.6%, and 0% in regions A, B, C, D, and E, respectively. The CRT demonstrates exceptional reliability of system operation and very good measurement performance. The ability to differentiate between artifacts and physiological effects suggests the use of the CRT as a reference tool in clinical investigations. 2009 Diabetes Technology Society.
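
    The MARE metric used throughout the abstract is straightforward to compute: the mean of absolute sensor-versus-reference errors relative to the reference. The paired readings below are invented for illustration, not the study's data.

```python
import numpy as np

# Mean absolute relative error (MARE) of sensor glucose vs reference (mg/dL).
sensor    = np.array([102.0,  95.0, 140.0, 180.0,  76.0])
reference = np.array([100.0, 100.0, 150.0, 170.0,  80.0])

mare = float(np.mean(np.abs(sensor - reference) / reference)) * 100  # percent
print(round(mare, 1))
```

    In practice each sensor value would first be aligned to its reference by the physiological lag time before pairing.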

  8. Analytical Design Package (ADP2): A computer aided engineering tool for aircraft transparency design

    Science.gov (United States)

    Wuerer, J. E.; Gran, M.; Held, T. W.

    1994-01-01

    The Analytical Design Package (ADP2) is being developed as a part of the Air Force Frameless Transparency Program (FTP). ADP2 is an integrated design tool consisting of existing analysis codes and Computer Aided Engineering (CAE) software. The objective of the ADP2 is to develop and confirm an integrated design methodology for frameless transparencies, related aircraft interfaces, and their corresponding tooling. The application of this methodology will generate high confidence for achieving a qualified part prior to mold fabrication. ADP2 is a customized integration of analysis codes, CAE software, and material databases. The primary CAE integration tool for the ADP2 is P3/PATRAN, a commercial-off-the-shelf (COTS) software tool. The open architecture of P3/PATRAN allows customized installations with different applications modules for specific site requirements. Integration of material databases allows the engineer to select a material, and those material properties are automatically called into the relevant analysis code. The ADP2 materials database will be composed of four independent schemas: CAE Design, Processing, Testing, and Logistics Support. The design of ADP2 places major emphasis on the seamless integration of CAE and analysis modules with a single intuitive graphical interface. This tool is being designed to serve and be used by an entire project team, i.e., analysts, designers, materials experts, and managers. The final version of the software will be delivered to the Air Force in Jan. 1994. The Analytical Design Package (ADP2) will then be ready for transfer to industry. The package will be capable of a wide range of design and manufacturing applications.

  9. Children's Physical Activity While Gardening: Development of a Valid and Reliable Direct Observation Tool.

    Science.gov (United States)

    Myers, Beth M; Wells, Nancy M

    2015-04-01

    Gardens are a promising intervention to promote physical activity (PA) and foster health. However, because of the unique characteristics of gardening, no extant tool can capture the PA, postures, and motions that take place in a garden. The Physical Activity Research and Assessment tool for Garden Observation (PARAGON) was developed to assess children's PA levels, garden tasks, postures, motions, associations, and interactions while gardening. PARAGON uses momentary time sampling, in which a trained observer watches a focal child for 15 seconds and then records behavior for 15 seconds. Sixty-five children (38 girls, 27 boys) at 4 elementary schools in New York State were observed over 8 days. During the observation, children simultaneously wore ActiGraph GT3X+ accelerometers. The overall interrater reliability was 88% agreement, and Ebel was .97. Percent agreement values for activity level (93%), garden tasks (93%), motions (80%), associations (95%), and interactions (91%) also met acceptable criteria. Validity was established by previously validated PA codes and by expected convergent validity with accelerometry. PARAGON is a valid and reliable observation tool for assessing children's PA in the context of gardening.

  10. A tale of two tools: Reliability and feasibility of social media measurement tools examining e-cigarette twitter mentions

    Directory of Open Access Journals (Sweden)

    Amelia Burke-Garcia

    Full Text Available Given that 70% of Americans seek health information online, social media are becoming main sources of health-related information and discussions. Compounding rising trends in e-cigarette use in the US, there has been a rapid rise in e-cigarette marketing, much of which is happening on social media. Public health professionals seeking to understand consumer knowledge, attitudes, and beliefs about e-cigarettes should consider analyzing social media data, and to do so there are numerous free and paid tools available. However, each uses different sources and processes, which makes data validation challenging. This exploratory study sought to understand the reliability and feasibility of two social media data tools for analyzing e-cigarette tweets. Twitter mentions were pulled from two different industry-standard tools (GNIP and Radian6) and data were evaluated on six measures: Cost, Feasibility, Ease of Use, Poster Type (individual/organization), Context (tweet content analysis), and Valence (positive/negative). Findings included similarities amongst the data sets in terms of the content themes but differences in cost and ease of use of the tools themselves. These findings align with prior research, notably that e-cigarette marketing tweets are most common and public health-related content is noticeably absent. Findings from this exploratory study can inform future social media studies as well as communication campaigns seeking to address the emerging issue of e-cigarette use. Keywords: E-cigarettes, Vaping, Twitter, Tweets, Social media

  11. Comparison of Left Ventricular Hypertrophy by Electrocardiography and Echocardiography in Children Using Analytics Tool.

    Science.gov (United States)

    Tague, Lauren; Wiggs, Justin; Li, Qianxi; McCarter, Robert; Sherwin, Elizabeth; Weinberg, Jacqueline; Sable, Craig

    2018-05-17

    Left ventricular hypertrophy (LVH) is a common finding on pediatric electrocardiography (ECG), leading to many referrals for echocardiography (echo). This study utilizes a novel analytics tool that combines ECG and echo databases to evaluate ECG as a screening tool for LVH. A SQL Server 2012 data warehouse incorporated ECG and echo databases for all patients from a single institution from 2006 to 2016. Customized queries identified patients 0-18 years old with LVH on ECG and an echo performed within 24 h. Using data visualization (Tableau) and analytic (Stata 14) software, ECG and echo findings were compared. Of 437,699 encounters, 4637 met inclusion criteria. ECG had high sensitivity (≥ 90%) but poor specificity (43%) and low positive predictive value (< 20%) for echo abnormalities. ECG performed only 11-22% better than chance (AROC = 0.50). 83% of subjects with LVH on ECG had normal left ventricle (LV) structure and size on echo. African-Americans with LVH were least likely to have an abnormal echo. There was a low correlation between the V6 R wave on ECG and the echo-derived Z score of LV diastolic diameter (r = 0.14) and LV mass index (r = 0.24). The data analytics client was able to mine a database of ECG and echo reports, comparing LVH by ECG with LV measurements and qualitative findings by echo, and identifying an abnormal LV by echo in only 17% of cases with LVH on ECG. This novel tool is useful for rapid data mining for both clinical and research endeavors.

  12. Validation and inter-rater reliability of a three item falls risk screening tool

    Directory of Open Access Journals (Sweden)

    Catherine Maree Said

    2017-11-01

    Full Text Available Abstract Background Falls screening tools are routinely used in hospital settings, and the psychometric properties of tools should be examined in the setting in which they are used. The aim of this study was to explore the concurrent and predictive validity of the Austin Health Falls Risk Screening Tool (AHFRST), compared with The Northern Hospital Modified St Thomas's Risk Assessment Tool (TNH-STRATIFY), and the inter-rater reliability of the AHFRST. Methods A research physiotherapist used the AHFRST and TNH-STRATIFY to classify 130 participants admitted to Austin Health (five acute wards, n = 115; two subacute wards, n = 15; median length of stay 6 days, IQR 3–12) as 'High' or 'Low' falls risk. The AHFRST was also completed by nursing staff on patient admission. Falls data was collected from the hospital incident reporting system. Results Six falls occurred during the study period (fall rate of 4.6 falls per 1000 bed days). There was substantial agreement between the AHFRST and the TNH-STRATIFY (Kappa = 0.68, 95% CI 0.52–0.78). Both tools had poor predictive validity, with low specificity (AHFRST 46.0%, 95% CI 37.0–55.1; TNH-STRATIFY 34.7%, 95% CI 26.4–43.7) and positive predictive values (AHFRST 5.6%, 95% CI 1.6–13.8; TNH-STRATIFY 6.9%, 95% CI 2.6–14.4). The AHFRST showed moderate inter-rater reliability (Kappa = 0.54, 95% CI 0.36–0.67, p < 0.001), although 18 patients did not have the AHFRST completed by nursing staff. Conclusions There was an acceptable level of agreement between the 3-item AHFRST classification of falls risk and the longer, 9-item TNH-STRATIFY classification. However, both tools demonstrated limited predictive validity in the Austin Health population. The results highlight the importance of evaluating the validity of falls screening tools, and the clinical utility of these tools should be reconsidered.
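
    The metrics reported above (sensitivity, specificity, positive predictive value, and Cohen's kappa) all derive from 2 × 2 tables. This sketch uses illustrative counts, not the study's data.

```python
# Screening metrics from a 2x2 table: rows = tool says high/low risk,
# columns = fall / no fall. Counts are illustrative.
tp, fp, fn, tn = 8, 40, 2, 50

sensitivity = tp / (tp + fn)   # fallers flagged as high risk
specificity = tn / (tn + fp)   # non-fallers flagged as low risk
ppv = tp / (tp + fp)           # flagged patients who actually fell

def cohens_kappa(both_yes, a_only, b_only, both_no):
    """Chance-corrected agreement between two raters from a 2x2 table."""
    n = both_yes + a_only + b_only + both_no
    po = (both_yes + both_no) / n                     # observed agreement
    pe = ((both_yes + a_only) * (both_yes + b_only)
          + (both_no + a_only) * (both_no + b_only)) / n**2  # chance agreement
    return (po - pe) / (1 - pe)

print(round(sensitivity, 2), round(specificity, 2), round(ppv, 2))
print(round(cohens_kappa(40, 10, 10, 40), 2))
```

    A low PPV like the one reported in the study follows directly from the arithmetic: when falls are rare, even a tool with decent sensitivity flags mostly non-fallers.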

  13. ANALYTICAL MODEL FOR LATHE TOOL DISPLACEMENTS CALCULUS IN THE MANUFACTURING PROCESS

    Directory of Open Access Journals (Sweden)

    Catălin ROŞU

    2014-01-01

    Full Text Available In this paper, we present an analytical model for calculating lathe tool displacements in the manufacturing process. We present the displacement calculation methodology step by step, implement the resulting relations in a program for automatic calculation, and draw conclusions. Only the effects of the bending moments are taken into account, because these introduce the largest displacements. The simplifying assumptions and the calculation relations for the displacements (linear and angular ones) are presented in an original way.

  14. FIGURED WORLDS AS AN ANALYTIC AND METHODOLOGICAL TOOL IN PROFESSIONAL TEACHER DEVELOPMENT

    DEFF Research Database (Denmark)

    Møller, Hanne; Brok, Lene Storgaard

    Hasse (2015) and Holland (1998) have inspired our study; i.e., learning is conceptualized as a social phenomenon, implying that contexts of learning are decisive for learner identity. The concept of Figured Worlds is used to understand the development and the social constitution of emergent interactions (Holland et al., 1998, p. 52) and gives a framework for understanding meaning-making in particular pedagogical settings. We exemplify our use of the term Figured Worlds, both as an analytic and a methodological tool, for empirical studies in kindergarten and school. Based on data sources, such as field notes...

  15. Predictive analytics tools to adjust and monitor performance metrics for the ATLAS Production System

    CERN Document Server

    Barreiro Megino, Fernando Harald; The ATLAS collaboration

    2017-01-01

    Information such as an estimate of the processing time or the possibility of a system outage (abnormal behaviour) helps in monitoring system performance and predicting its next state. The current cyber-infrastructure presents computing conditions in which contention for resources among high-priority data analyses happens routinely and might lead to significant workload and data-handling interruptions. The lack of a way to monitor and predict the behaviour of the analysis process (its duration) and the state of the system itself motivated the design of built-in situational awareness analytic tools.

  16. Failure database and tools for wind turbine availability and reliability analyses. The application of reliability data for selected wind turbines

    DEFF Research Database (Denmark)

    Kozine, Igor; Christensen, P.; Winther-Jensen, M.

    2000-01-01

    The objective of this project was to develop and establish a database for collecting reliability and reliability-related data, for assessing the reliability of wind turbine components and subsystems and wind turbines as a whole, as well as for assessing wind turbine availability while ranking the contributions at both the component and system levels. The project resulted in a software package combining a failure database with programs for predicting wind turbine availability and the reliability of all the components and systems, especially the safety system. The database was established with the Microsoft Access Database Management System; the software for reliability and availability assessments was created with Visual Basic. The report consists of a description of the theoretical...

  17. Relative and Absolute Reliability of the Professionalism in Physical Therapy Core Values Self-Assessment Tool.

    Science.gov (United States)

    Furgal, Karen E; Norris, Elizabeth S; Young, Sonia N; Wallmann, Harvey W

    2018-01-01

    Development of professional behaviors in Doctor of Physical Therapy (DPT) students is an important part of professional education. The American Physical Therapy Association (APTA) has developed the Professionalism in Physical Therapy Core Values Self-Assessment (PPTCV-SA) tool to increase awareness of personal values in practice. The PPTCV-SA has been used to measure growth in professionalism following a clinical or educational experience. There are few studies reporting psychometric properties of the PPTCV-SA. The purpose of this study was to establish properties of relative reliability (intraclass correlation coefficient, ICC) and absolute reliability (standard error of measurement, SEM; minimal detectable change, MDC) of the PPTCV-SA. In this project, 29 first-year students in a DPT program were administered the PPTCV-SA on two occasions, 2 weeks apart. Paired t-tests were used to examine stability in PPTCV-SA scores on the two occasions. ICCs were calculated as a measure of relative reliability and for use in the calculation of the absolute reliability measures of SEM and MDC. Results of paired t-tests indicated that differences in the subscale scores between times 1 and 2 were non-significant, except for three subscales: Altruism (p=0.01), Excellence (p=0.05), and Social Responsibility (p=0.02). ICCs for test-retest reliability were moderate to good for all subscales, with SEMs ranging from 0.30 to 0.62, and MDC95 ranging from 0.83 to 1.71. These results can guide educators and researchers when determining the likelihood of true change in professionalism following a professional development activity.
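The absolute-reliability indices above follow from two standard formulas: SEM = SD·√(1 − ICC) and MDC95 = 1.96·√2·SEM. A brief sketch with illustrative numbers (not the paper's data):

```python
import math

# Illustrative values only (NOT the paper's estimates): pooled SD and
# test-retest ICC for one hypothetical PPTCV-SA subscale.
sd_pooled = 0.9   # pooled standard deviation of subscale scores
icc = 0.80        # test-retest intraclass correlation coefficient

sem = sd_pooled * math.sqrt(1 - icc)    # standard error of measurement
mdc95 = sem * 1.96 * math.sqrt(2)       # minimal detectable change at 95% confidence

print(f"SEM={sem:.2f}, MDC95={mdc95:.2f}")
```

A change smaller than the MDC95 on retest is indistinguishable from measurement noise, which is why the paper reports it alongside the ICC.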

  18. Kalisphera: an analytical tool to reproduce the partial volume effect of spheres imaged in 3D

    International Nuclear Information System (INIS)

    Tengattini, Alessandro; Andò, Edward

    2015-01-01

    In experimental mechanics, where 3D imaging is having a profound effect, spheres are commonly adopted for their simplicity and for the ease of their modeling. In this contribution we develop an analytical tool, ‘kalisphera’, to produce 3D raster images of spheres including their partial volume effect. This allows us to evaluate the metrological performance of existing image-based measurement techniques (knowing a priori the ground truth). An advanced application of ‘kalisphera’ is developed here to identify and accurately characterize spheres in real 3D x-ray tomography images with the objective of improving trinarization and contact detection. The effect of the common experimental imperfections is assessed and the overall performance of the tool tested on real images. (paper)

  19. Gearbox Reliability Collaborative Analytic Formulation for the Evaluation of Spline Couplings

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Yi [National Renewable Energy Lab. (NREL), Golden, CO (United States); Keller, Jonathan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Errichello, Robert [GEARTECH, Houston, TX (United States); Halse, Chris [Romax Technology, Nottingham (United Kingdom)

    2013-12-01

    Gearboxes in wind turbines have not been achieving their expected design life; however, they commonly meet and exceed the design criteria specified in current standards in the gear, bearing, and wind turbine industry as well as third-party certification criteria. The cost of gearbox replacements and rebuilds, as well as the down time associated with these failures, has elevated the cost of wind energy. The National Renewable Energy Laboratory (NREL) Gearbox Reliability Collaborative (GRC) was established by the U.S. Department of Energy in 2006; its key goal is to understand the root causes of premature gearbox failures and improve their reliability using a combined approach of dynamometer testing, field testing, and modeling. As part of the GRC program, this paper investigates the design of the spline coupling often used in modern wind turbine gearboxes to connect the planetary and helical gear stages. Aside from transmitting the driving torque, another common function of the spline coupling is to allow the sun to float between the planets. The amount the sun can float is determined by the spline design and the sun shaft flexibility subject to the operational loads. Current standards address spline coupling design requirements in varying detail. This report provides additional insight beyond these current standards to quickly evaluate spline coupling designs.

  20. The Surgical Safety Checklist and Teamwork Coaching Tools: a study of inter-rater reliability.

    Science.gov (United States)

    Huang, Lyen C; Conley, Dante; Lipsitz, Stu; Wright, Christopher C; Diller, Thomas W; Edmondson, Lizabeth; Berry, William R; Singer, Sara J

    2014-08-01

    To assess the inter-rater reliability (IRR) of two novel observation tools for measuring surgical safety checklist performance and teamwork. Surgical safety checklists can promote adherence to standards of care and improve teamwork in the operating room. Their use has been associated with reductions in mortality and other postoperative complications. However, checklist effectiveness depends on how well they are performed. Authors from the Safe Surgery 2015 initiative developed a pair of novel observation tools through literature review, expert consultation and end-user testing. In one South Carolina hospital participating in the initiative, two observers jointly attended 50 surgical cases and independently rated surgical teams using both tools. We used descriptive statistics to measure checklist performance and teamwork at the hospital. We assessed IRR by measuring percent agreement, Cohen's κ, and weighted κ scores. The overall percent agreement and κ between the two observers was 93% and 0.74 (95% CI 0.66 to 0.79), respectively, for the Checklist Coaching Tool and 86% and 0.84 (95% CI 0.77 to 0.90) for the Surgical Teamwork Tool. Percent agreement for individual sections of both tools was 79% or higher. Additionally, κ scores for six of eight sections on the Checklist Coaching Tool and for two of five domains on the Surgical Teamwork Tool achieved the desired 0.7 threshold. However, teamwork scores were high and variation was limited. There were no significant changes in the percent agreement or κ scores between the first 10 and last 10 cases observed. Both tools demonstrated substantial IRR and required limited training to use. These instruments may be used to observe checklist performance and teamwork in the operating room. However, further refinement and calibration of observer expectations, particularly in rating teamwork, could improve the utility of the tools. Published by the BMJ Publishing Group Limited.
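Cohen's κ, used above, corrects raw percent agreement for the agreement expected by chance. A small self-contained sketch (the ratings are invented, not the study's observations):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Unweighted Cohen's kappa for two raters over the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed proportion of agreement.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal category frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical per-case item ratings (invented, not the study's data):
a = ["yes", "yes", "no", "yes", "no", "no", "yes", "yes"]
b = ["yes", "no",  "no", "yes", "no", "yes", "yes", "yes"]
print(round(cohens_kappa(a, b), 3))
```

With these invented ratings, percent agreement is 75% but κ is far lower, illustrating why the study reports both figures.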

  1. Application of quantum dots as analytical tools in automated chemical analysis: A review

    International Nuclear Information System (INIS)

    Frigerio, Christian; Ribeiro, David S.M.; Rodrigues, S. Sofia M.; Abreu, Vera L.R.G.; Barbosa, João A.C.; Prior, João A.V.; Marques, Karine L.; Santos, João L.M.

    2012-01-01

    Highlights: ► Review on quantum dots application in automated chemical analysis. ► Automation by using flow-based techniques. ► Quantum dots in liquid chromatography and capillary electrophoresis. ► Detection by fluorescence and chemiluminescence. ► Electrochemiluminescence and radical generation. - Abstract: Colloidal semiconductor nanocrystals or quantum dots (QDs) are one of the most relevant developments in the fast-growing world of nanotechnology. Initially proposed as luminescent biological labels, they are finding new important fields of application in analytical chemistry, where their photoluminescent properties have been exploited in environmental monitoring, pharmaceutical and clinical analysis, and food quality control. Despite the enormous variety of applications that have been developed, the automation of QD-based analytical methodologies by resorting to automation tools such as continuous flow analysis and related techniques is hitherto very limited. Such automation would allow taking advantage of particular features of the nanocrystals, such as the versatile surface chemistry and ligand-binding ability, the aptitude to generate reactive species, and the possibility of encapsulation in different materials while retaining native luminescence, providing the means for the implementation of renewable chemosensors or even the utilisation of more drastic, stability-impairing reaction conditions. In this review, we provide insights into the analytical potential of quantum dots, focusing on prospects for their utilisation in automated flow-based and flow-related approaches and the future outlook of QD applications in chemical analysis.

  2. Procedures for treating common cause failures in safety and reliability studies: Analytical background and techniques

    International Nuclear Information System (INIS)

    Mosleh, A.; Fleming, K.N.; Parry, G.W.; Paula, H.M.; Worledge, D.H.; Rasmuson, D.M.

    1989-01-01

    Volume I of this report presents a framework for the inclusion of the impact of common cause failures in risk and reliability evaluations. Common cause failures are defined as that subset of dependent failures for which causes are not explicitly included in the logic model as basic events. The emphasis here is on providing procedures for a practical, systematic approach that can be used to perform and clearly document the analysis. The framework and the methods discussed for performing the different stages of the analysis integrate insights obtained from engineering assessments of the system and the historical evidence from multiple failure events into a systematic, reproducible, and defensible analysis. This document, Volume 2, contains a series of appendices that provide additional background and methodological detail on several important topics discussed in Volume I

  3. XCluSim: a visual analytics tool for interactively comparing multiple clustering results of bioinformatics data

    Science.gov (United States)

    2015-01-01

    Background Though cluster analysis has become a routine analytic task for bioinformatics research, it is still arduous for researchers to assess the quality of a clustering result. To select the best clustering method and its parameters for a dataset, researchers have to run multiple clustering algorithms and compare them. However, such a comparison task with multiple clustering results is cognitively demanding and laborious. Results In this paper, we present XCluSim, a visual analytics tool that enables users to interactively compare multiple clustering results based on the Visual Information Seeking Mantra. We build a taxonomy for categorizing existing techniques of clustering results visualization in terms of the Gestalt principles of grouping. Using the taxonomy, we choose the most appropriate interactive visualizations for presenting individual clustering results from different types of clustering algorithms. The efficacy of XCluSim is shown through case studies with a bioinformatician. Conclusions Compared to other relevant tools, XCluSim enables users to compare multiple clustering results in a more scalable manner. Moreover, XCluSim supports diverse clustering algorithms and dedicated visualizations and interactions for different types of clustering results, allowing more effective exploration of details on demand. Through case studies with a bioinformatics researcher, we received positive feedback on the functionalities of XCluSim, including its ability to help identify stably clustered items across multiple clustering results. PMID:26328893

  4. A study of lip prints and its reliability as a forensic tool

    Science.gov (United States)

    Verma, Yogendra; Einstein, Arouquiaswamy; Gondhalekar, Rajesh; Verma, Anoop K.; George, Jiji; Chandra, Shaleen; Gupta, Shalini; Samadi, Fahad M.

    2015-01-01

    Introduction: Lip prints, like fingerprints, are unique to an individual and can be easily recorded. Therefore, we compared direct and indirect lip print patterns in males and females of different age groups, studied the inter- and intraobserver bias in recording the data, and observed any changes in the lip print patterns over a period of time, thereby, assessing the reliability of lip prints as a forensic tool. Materials and Methods: Fifty females and 50 males in the age group of 15 to 35 years were selected for the study. Lips with any deformity or scars were not included. Lip prints were registered by direct and indirect methods and transferred to a preformed registration sheet. Direct method of lip print registration was repeated after a six-month interval. All the recorded data were analyzed statistically. Results: The predominant patterns were vertical and branched. More females showed the branched pattern and males revealed an equal prevalence of vertical and reticular patterns. There was an interobserver agreement, which was 95%, and there was no change in the lip prints over time. Indirect registration of lip prints correlated with direct method prints. Conclusion: Lip prints can be used as a reliable forensic tool, considering the consistency of lip prints over time and the accurate correlation of indirect prints to direct prints. PMID:26668449

  5. Validity and reliability of a new tool to evaluate handwriting difficulties in Parkinson's disease.

    Directory of Open Access Journals (Sweden)

    Evelien Nackaerts

    Full Text Available Handwriting in Parkinson's disease (PD) features specific abnormalities which are difficult to assess in clinical practice, since no specific tool for evaluation of spontaneous movement is currently available. This study aims to validate the 'Systematic Screening of Handwriting Difficulties' (SOS-test) in patients with PD. Handwriting performance of 87 patients and 26 healthy age-matched controls was examined using the SOS-test. Sixty-seven patients were tested a second time within a period of one month. Participants were asked to copy as much as possible of a text within 5 minutes, with the instruction to write as neatly and quickly as in daily life. Writing speed (letters in 5 minutes), size (mm) and quality of handwriting were compared. Correlation analysis was performed between SOS outcomes and other fine motor skill measurements and disease characteristics. Intrarater, interrater and test-retest reliability were assessed using the intraclass correlation coefficient (ICC) and Spearman correlation coefficient. Patients with PD had smaller (p = 0.043) and slower handwriting, and reliability coefficients exceeded 0.769 for both groups. The SOS-test is a short and effective tool to detect handwriting problems in PD with excellent reliability. It can therefore be recommended as a clinical instrument for standardized screening of handwriting deficits in PD.

  6. Breast MRI used as a problem-solving tool reliably excludes malignancy

    International Nuclear Information System (INIS)

    Spick, Claudio; Szolar, Dieter H.M.; Preidler, Klaus W.; Tillich, Manfred; Reittner, Pia; Baltzer, Pascal A.

    2015-01-01

    Highlights: • Breast MRI reliably excludes malignancy in conventional BI-RADS 0 cases (NPV: 100%). • Malignancy rate in the BI-RADS 0 population is substantial with 13.5%. • Breast MRI used as a problem-solving tool reliably excludes malignancy. - Abstract: Purpose: To evaluate the diagnostic performance of breast MRI if used as a problem-solving tool in BI-RADS 0 cases. Material and methods: In this IRB-approved, single-center study, 687 women underwent high-resolution-3D, dynamic contrast-enhanced breast magnetic resonance imaging (MRI) between January 2012 and December 2012. Of these, we analyzed 111 consecutive patients (mean age, 51 ± 12 years; range, 20–83 years) categorized as BI-RADS 0. Breast MRI findings were stratified by clinical presentations, conventional imaging findings, and breast density. MRI results were compared to the reference standard, defined as histopathology or an imaging follow-up of at least 1 year. Results: One hundred eleven patients with BI-RADS 0 conventional imaging findings revealed 30 (27%) mammographic masses, 57 (51.4%) mammographic architectural distortions, five (4.5%) mammographic microcalcifications, 17 (15.3%) ultrasound-only findings, and two palpable findings without imaging correlates. There were 15 true-positive, 85 true-negative, 11 false-positive, and zero false-negative breast MRI findings, resulting in a sensitivity, specificity, PPV, and NPV of 100% (15/15), 88.5% (85/96), 57.7% (15/26), and 100% (85/85), respectively. Breast density and reasons for referral had no significant influence on the diagnostic performance of breast MRI (p > 0.05). Conclusion: Breast MRI reliably excludes malignancy in conventional BI-RADS 0 cases, resulting in an NPV of 100% (85/85) and a PPV of 57.7% (15/26).

  7. The order progress diagram : A supportive tool for diagnosing delivery reliability performance in make-to-order companies

    NARCIS (Netherlands)

    Soepenberg, G.D.; Land, M.J.; Gaalman, G.J.C.

    This paper describes the development of a new tool for facilitating the diagnosis of logistic improvement opportunities in make-to-order (MTO) companies. Competitiveness of these companies increasingly imposes needs upon delivery reliability. In order to achieve high delivery reliability, both the

  8. The constant failure rate model for fault tree evaluation as a tool for unit protection reliability assessment

    International Nuclear Information System (INIS)

    Vichev, S.; Bogdanov, D.

    2000-01-01

    The purpose of this paper is to introduce the fault tree analysis method as a tool for unit protection reliability estimation. The constant failure rate model is applied for reliability assessment, and especially for availability assessment. For that purpose, an example unit primary equipment structure and a fault tree example for a simplified unit protection system are presented. (author)
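Under the constant failure rate model referenced above, component reliability over a mission time t is R(t) = e^(−λt), and steady-state availability is MTBF/(MTBF + MTTR). A minimal numeric sketch with illustrative parameters (not values from the paper):

```python
import math

# Constant failure rate model sketch; all numbers are illustrative.
lam = 2e-5   # failure rate, failures per hour
mttr = 8.0   # mean time to repair, hours

mtbf = 1 / lam                           # mean time between failures, hours
reliability_1yr = math.exp(-lam * 8760)  # probability of surviving one year
availability = mtbf / (mtbf + mttr)      # steady-state availability

print(f"MTBF={mtbf:.0f} h, R(1yr)={reliability_1yr:.4f}, A={availability:.6f}")
```

The same exponential survival terms become the basic-event probabilities when the fault tree for the protection system is quantified.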

  9. AN ANALYTICAL FRAMEWORK FOR ASSESSING RELIABLE NUCLEAR FUEL SERVICE APPROACHES: ECONOMIC AND NON-PROLIFERATION MERITS OF NUCLEAR FUEL LEASING

    International Nuclear Information System (INIS)

    Kreyling, Sean J.; Brothers, Alan J.; Short, Steven M.; Phillips, Jon R.; Weimar, Mark R.

    2010-01-01

    The goal of international nuclear policy since the dawn of nuclear power has been the peaceful expansion of nuclear energy while controlling the spread of enrichment and reprocessing technology. Numerous initiatives undertaken in the intervening decades to develop international agreements on providing nuclear fuel supply assurances, or reliable nuclear fuel services (RNFS), attempted to control the spread of sensitive nuclear materials and technology. In order to inform the international debate and the development of government policy, PNNL has been developing an analytical framework to holistically evaluate the economics and non-proliferation merits of alternative approaches to managing the nuclear fuel cycle (i.e., cradle-to-grave). This paper provides an overview of the analytical framework and discusses preliminary results of an economic assessment of one RNFS approach: full-service nuclear fuel leasing. The specific focus of this paper is the metrics under development to systematically evaluate the non-proliferation merits of fuel-cycle management alternatives. Also discussed is the utility of an integrated assessment of the economics and non-proliferation merits of nuclear fuel leasing.

  10. Online Analytical Processing (OLAP: A Fast and Effective Data Mining Tool for Gene Expression Databases

    Directory of Open Access Journals (Sweden)

    Alkharouf Nadim W.

    2005-01-01

    Full Text Available Gene expression databases contain a wealth of information, but current data mining tools are limited in their speed and effectiveness in extracting meaningful biological knowledge from them. Online analytical processing (OLAP can be used as a supplement to cluster analysis for fast and effective data mining of gene expression databases. We used Analysis Services 2000, a product that ships with SQLServer2000, to construct an OLAP cube that was used to mine a time series experiment designed to identify genes associated with resistance of soybean to the soybean cyst nematode, a devastating pest of soybean. The data for these experiments is stored in the soybean genomics and microarray database (SGMD. A number of candidate resistance genes and pathways were found. Compared to traditional cluster analysis of gene expression data, OLAP was more effective and faster in finding biologically meaningful information. OLAP is available from a number of vendors and can work with any relational database management system through OLE DB.

  11. Capturing student mathematical engagement through differently enacted classroom practices: applying a modification of Watson's analytical tool

    Science.gov (United States)

    Patahuddin, Sitti Maesuri; Puteri, Indira; Lowrie, Tom; Logan, Tracy; Rika, Baiq

    2018-04-01

    This study examined student mathematical engagement through the intended and enacted lessons taught by two teachers in two different middle schools in Indonesia. The intended lesson was developed using the ELPSA learning design to promote mathematical engagement. Based on the premise that students react to mathematical tasks in the form of words and actions, the analysis focused on identifying the types of mathematical engagement promoted through the intended lesson and performed by students during the lesson. Using a modified version of Watson's (2007) analytical tool, students' engagement was captured from what the participants did or said mathematically. We found that teachers' enacted practices had an influence on student mathematical engagement. The teacher who demonstrated content in explicit ways tended to limit the richness of the engagement, whereas the teacher who presented activities in an open-ended manner fostered engagement.

  12. Online analytical processing (OLAP): a fast and effective data mining tool for gene expression databases.

    Science.gov (United States)

    Alkharouf, Nadim W; Jamison, D Curtis; Matthews, Benjamin F

    2005-06-30

    Gene expression databases contain a wealth of information, but current data mining tools are limited in their speed and effectiveness in extracting meaningful biological knowledge from them. Online analytical processing (OLAP) can be used as a supplement to cluster analysis for fast and effective data mining of gene expression databases. We used Analysis Services 2000, a product that ships with SQLServer2000, to construct an OLAP cube that was used to mine a time series experiment designed to identify genes associated with resistance of soybean to the soybean cyst nematode, a devastating pest of soybean. The data for these experiments is stored in the soybean genomics and microarray database (SGMD). A number of candidate resistance genes and pathways were found. Compared to traditional cluster analysis of gene expression data, OLAP was more effective and faster in finding biologically meaningful information. OLAP is available from a number of vendors and can work with any relational database management system through OLE DB.
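An OLAP cube answers aggregate queries by rolling up a fact table along chosen dimensions. A toy illustration in plain Python of the kind of roll-up a cube performs on expression data (the values are invented; a real deployment would use an OLAP engine such as Analysis Services):

```python
from collections import defaultdict

# Toy fact table: (gene, timepoint, expression value). Invented data.
rows = [
    ("geneA", "0h", 1.2), ("geneA", "6h", 2.5), ("geneA", "12h", 3.1),
    ("geneB", "0h", 0.8), ("geneB", "6h", 0.9), ("geneB", "12h", 1.0),
]

# "Roll up" on the gene dimension, aggregating over all timepoints --
# the kind of pre-computed aggregate an OLAP cube serves instantly.
by_gene = defaultdict(float)
for gene, timepoint, value in rows:
    by_gene[gene] += value

print({g: round(v, 6) for g, v in by_gene.items()})
```

The speed advantage the abstract describes comes from the engine materializing such aggregates ahead of time, so queries slice pre-computed totals rather than rescanning raw expression records.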

  13. Reliability centered maintenance as an optimization tool for electrical power plants

    International Nuclear Information System (INIS)

    Jacquot, J.P.; Bryla, P.; Martin-Mattei, C.; Meuwisse, C.

    1997-08-01

    Seven years ago, Electricite de France launched a Reliability Centered Maintenance (RCM) pilot project to optimize preventive maintenance for its nuclear power plants. After a feasibility study, an RCM method was standardized. It is now applied on a large scale to the 50 EDF nuclear units. An RCM workstation based on this standardized method has been developed and is now used in each plant. The next step considers whether a risk-based approach can be included in this RCM process in order to analyze critical passive components such as pipes and supports. Given the potential advantages of these optimization techniques, a dedicated process has also been developed for the maintenance of future plants, gas turbines, and nuclear units. A survey of these different developments of methods and tools is presented. (author)

  14. Failure Modes Effects and Criticality Analysis, an Underutilized Safety, Reliability, Project Management and Systems Engineering Tool

    Science.gov (United States)

    Mullin, Daniel Richard

    2013-09-01

    The majority of space programs, whether manned or unmanned, for science or exploration, require that a Failure Modes Effects and Criticality Analysis (FMECA) be performed as part of their safety and reliability activities. This comes as no surprise given that FMECAs have been an integral part of the reliability engineer's toolkit since the 1950s. The reasons for performing a FMECA are well known, including identifying system single-point failures, system hazards, and critical components and functions. However, in the author's ten years' experience as a space systems safety and reliability engineer, the FMECA is often performed as an afterthought, simply to meet contract deliverable requirements, and is often started long after the system requirements allocation and preliminary design have been completed. Important qualitative and quantitative components are also often missing that can provide useful data to all project stakeholders. These include probability of occurrence, probability of detection, time to effect and time to detect, and, finally, the Risk Priority Number. This is unfortunate, as the FMECA is a powerful system design tool that, when used effectively, can help optimize system function while minimizing the risk of failure. When performed as early as possible, in conjunction with writing the top-level system requirements, the FMECA can provide instant feedback on the viability of the requirements while providing a valuable sanity check early in the design process. It can indicate which areas of the system will require redundancy and which areas are inherently the most risky from the onset. Based on historical and practical examples, it is this author's contention that FMECAs are an immense source of important information for all stakeholders in a given project and can provide several benefits, including efficient project management with respect to cost and schedule, systems engineering, and requirements management.
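The Risk Priority Number mentioned above is conventionally the product of severity, occurrence, and detection rankings, each typically scored on a 1–10 scale. A small sketch with invented failure modes (not from any real system):

```python
# FMECA Risk Priority Number sketch; failure modes and rankings are invented.
failure_modes = [
    {"mode": "seal leak",      "severity": 7, "occurrence": 4, "detection": 3},
    {"mode": "sensor drift",   "severity": 5, "occurrence": 6, "detection": 8},
    {"mode": "connector open", "severity": 9, "occurrence": 2, "detection": 4},
]

# RPN = severity x occurrence x detection (higher = riskier).
for fm in failure_modes:
    fm["rpn"] = fm["severity"] * fm["occurrence"] * fm["detection"]

# Rank highest-risk modes first to prioritize mitigation effort.
ranked = sorted(failure_modes, key=lambda fm: fm["rpn"], reverse=True)
print([(fm["mode"], fm["rpn"]) for fm in ranked])
```

Note how a hard-to-detect mode ("sensor drift") outranks a more severe but detectable one, which is exactly the kind of insight the author argues is lost when the detection component is omitted.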

  15. Is Google Trends a reliable tool for digital epidemiology? Insights from different clinical settings.

    Science.gov (United States)

    Cervellin, Gianfranco; Comelli, Ivan; Lippi, Giuseppe

    2017-09-01

    Internet-derived information has recently been recognized as a valuable tool for epidemiological investigation. Google Trends, a Google Inc. portal, generates data on geographical and temporal patterns according to specified keywords. The aim of this study was to compare the reliability of Google Trends in different clinical settings, both for common diseases with lower media coverage and for less common diseases attracting major media coverage. We carried out a search in Google Trends using the keywords "renal colic", "epistaxis", and "mushroom poisoning", selected on the basis of available and reliable epidemiological data. Besides this search, we carried out a second search for three clinical conditions (i.e., "meningitis", "Legionella Pneumophila pneumonia", and "Ebola fever"), which recently received major focus from the Italian media. In our analysis, no correlation was found between data captured from Google Trends and the epidemiology of renal colic, epistaxis or mushroom poisoning. Only when searching for the term "mushroom" alone did Google Trends generate a seasonal pattern that almost overlaps with the epidemiological profile, but this was probably due mostly to searches about harvesting and cooking rather than poisoning. The Google Trends data also failed to reflect the geographical and temporal patterns of disease for meningitis, Legionella Pneumophila pneumonia and Ebola fever. The results of our study confirm that Google Trends has modest reliability for defining the epidemiology of relatively common diseases with minor media coverage, or relatively rare diseases with higher audience. Overall, Google Trends seems to be more influenced by media clamor than by true epidemiological burden. Copyright © 2017 Ministry of Health, Saudi Arabia. Published by Elsevier Ltd. All rights reserved.

  16. Reliability and Validity of the Korean Cancer Pain Assessment Tool (KCPAT)

    Science.gov (United States)

    Kim, Jeong A; Lee, Juneyoung; Park, Jeanno; Lee, Myung Ah; Yeom, Chang Hwan; Jang, Se Kwon; Yoon, Duck Mi; Kim, Jun Suk

    2005-01-01

    The Korean Cancer Pain Assessment Tool (KCPAT), which was developed in 2003, consists of questions concerning the location of pain, the nature of pain, the present pain intensity, the symptoms associated with the pain, and psychosocial/spiritual pain assessments. This study was carried out to evaluate the reliability and validity of the KCPAT. A stratified, proportional-quota, clustered, systematic sampling procedure was used. The study population (903 cancer patients) was 1% of the target population (90,252 cancer patients). A total of 314 (34.8%) questionnaires were collected. The results showed that the average pain score (on a 5-point Likert scale) according to cancer type and the at-present average pain score (VAS, 0-10) were correlated (r=0.56, p<0.0001) and showed moderate agreement (kappa=0.364). The mean satisfaction score was 3.8 (on a 1-5 scale). The average time to complete the questionnaire was 8.9 min. In conclusion, the KCPAT is a reliable and valid instrument for assessing cancer pain in Koreans. PMID:16224166

  17. The effect on reliability and sensitivity to level of training of combining analytic and holistic rating scales for assessing communication skills in an internal medicine resident OSCE.

    Science.gov (United States)

    Daniels, Vijay John; Harley, Dwight

    2017-07-01

    Although previous research has compared checklists to rating scales for assessing communication, the purpose of this study was to compare the effect on reliability and sensitivity to level of training of an analytic, a holistic, and a combined analytic-holistic rating scale in assessing communication skills. The University of Alberta Internal Medicine Residency runs one OSCE for postgraduate year (PGY) 1 and 2 residents and another for PGY-4 residents. Communication stations were scored with an analytic scale (empathy, non-verbal skills, verbal skills, and coherence subscales) and a holistic scale. The authors analyzed the reliability of the individual and combined scales using generalizability theory and evaluated each scale's sensitivity to level of training. For the analytic, holistic, and combined scales, 12, 12, and 11 stations respectively yielded a Phi of 0.8 for the PGY-1,2 cohort, and 16, 16, and 14 stations yielded a Phi of 0.8 for the PGY-4 cohort. PGY-4 residents scored higher on the combined scale, the analytic rating scale, and the non-verbal and coherence subscales. A combined analytic-holistic rating scale increased score reliability and was sensitive to level of training. Given the increased validity evidence, OSCE developers should consider combining analytic and holistic scales when assessing communication skills. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Endoscopy nurse-administered propofol sedation performance. Development of an assessment tool and a reliability testing model

    DEFF Research Database (Denmark)

    Jensen, Jeppe Thue; Konge, Lars; Møller, Ann

    2014-01-01

    of training and for future certification. The aim of this study was to develop an assessment tool for measuring competency in propofol sedation and to explore the reliability and validity of the tool. MATERIAL AND METHODS: The nurse-administered propofol assessment tool (NAPSAT) was developed in a Delphi...... and good construct validity. This makes NAPSAT fit for formative assessment and proficiency feedback; however, high stakes and summative assessment cannot be advised....

  19. Puzzle test: A tool for non-analytical clinical reasoning assessment.

    Science.gov (United States)

    Monajemi, Alireza; Yaghmaei, Minoo

    2016-01-01

    Most contemporary clinical reasoning tests typically assess non-automatic thinking. Therefore, a test is needed to measure automatic reasoning or pattern recognition, which has been largely neglected in clinical reasoning tests. The Puzzle Test (PT) is dedicated to assessing automatic clinical reasoning in routine situations. This test was first introduced in 2009 by Monajemi et al in the Olympiad for Medical Sciences Students. PT is an item format that has gained acceptance in medical education, but no detailed guidelines exist for this test's format, construction and scoring. In this article, a format is described and the steps to prepare and administer valid and reliable PTs are presented. PT examines a specific clinical reasoning task: pattern recognition. PT does not replace other clinical reasoning assessment tools. However, it complements them in strategies for assessing comprehensive clinical reasoning.

  20. Algorithms for optimized maximum entropy and diagnostic tools for analytic continuation

    Science.gov (United States)

    Bergeron, Dominic; Tremblay, A.-M. S.

    2016-08-01

    Analytic continuation of numerical data obtained in imaginary time or frequency has become an essential part of many branches of quantum computational physics. It is, however, an ill-conditioned procedure and thus a hard numerical problem. The maximum-entropy approach, based on Bayesian inference, is the most widely used method to tackle that problem. Although the approach is well established and among the most reliable and efficient ones, useful developments of the method and of its implementation are still possible. In addition, while a few free software implementations are available, a well-documented, optimized, general purpose, and user-friendly software dedicated to that specific task is still lacking. Here we analyze all aspects of the implementation that are critical for accuracy and speed and present a highly optimized approach to maximum entropy. Original algorithmic and conceptual contributions include (1) numerical approximations that yield a computational complexity that is almost independent of temperature and spectrum shape (including sharp Drude peaks in a broad background, for example) while ensuring quantitative accuracy of the result whenever the precision of the data is sufficient, (2) a robust method of choosing the entropy weight α that follows from a simple consistency condition of the approach and the observation that information- and noise-fitting regimes can be identified clearly from the behavior of χ² with respect to α, and (3) several diagnostics to assess the reliability of the result. Benchmarks with test spectral functions of different complexity and an example with an actual physical simulation are presented. Our implementation, which covers most typical cases for fermions, bosons, and response functions, is available as an open source, user-friendly software.
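
    The role of the entropy weight α can be illustrated on a deliberately simplified problem. The sketch below is not the authors' algorithm: it replaces the entropy by a quadratic penalty and uses a one-parameter model with invented kernel values and synthetic noisy data, but it reproduces the qualitative behavior the abstract exploits, namely that the misfit χ² grows monotonically with α, separating a noise-fitting regime (small α) from an information-fitting regime (large α).

```python
# Toy regularized inversion (illustration only, not the maximum-entropy method):
# minimize chi2(x) + alpha * x**2 for a scalar model parameter x, then scan
# alpha and watch the misfit chi2 rise monotonically.
import random

random.seed(1)
true_x = 2.0
K = [0.5, 1.0, 1.5, 2.0]                              # invented kernel values
sigma = 0.1                                           # noise level of the data
data = [k * true_x + random.gauss(0.0, sigma) for k in K]

def solve(alpha):
    # closed-form minimizer of chi2(x) + alpha * x**2
    num = sum(k * d for k, d in zip(K, data)) / sigma**2
    den = sum(k * k for k in K) / sigma**2 + alpha
    return num / den

def chi2(x):
    return sum((d - k * x) ** 2 for k, d in zip(K, data)) / sigma**2

alphas = [10.0 ** e for e in range(-3, 4)]
misfits = [chi2(solve(a)) for a in alphas]
# small alpha over-fits the noise (lowest chi2); large alpha under-fits the data
assert misfits == sorted(misfits)
```

    In the paper's method the penalty is the entropy relative to a default model and α is fixed by a consistency condition on χ²(α); the monotone trade-off shown here is the structure that condition relies on.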

  1. CCTop: An Intuitive, Flexible and Reliable CRISPR/Cas9 Target Prediction Tool.

    Directory of Open Access Journals (Sweden)

    Manuel Stemmer

    Engineering of the CRISPR/Cas9 system has opened a plethora of new opportunities for site-directed mutagenesis and targeted genome modification. Fundamental to this is a stretch of twenty nucleotides at the 5' end of a guide RNA that provides specificity to the bound Cas9 endonuclease. Since a sequence of twenty nucleotides can occur multiple times in a given genome and some mismatches seem to be accepted by the CRISPR/Cas9 complex, an efficient and reliable in silico selection and evaluation of the targeting site is a key prerequisite for experimental success. Here we present the CRISPR/Cas9 target online predictor (CCTop, http://crispr.cos.uni-heidelberg.de) to overcome limitations of already available tools. CCTop provides an intuitive user interface with reasonable default parameters that can easily be tuned by the user. From a given query sequence, CCTop identifies and ranks all candidate sgRNA target sites according to their off-target quality and displays full documentation. CCTop was experimentally validated for gene inactivation, non-homologous end-joining as well as homology-directed repair. Thus, CCTop provides the bench biologist with a tool for the rapid and efficient identification of high-quality target sites.

  2. Ecotoxicity on a stick: A novel analytical tool for predicting the ecotoxicity of petroleum contaminated samples

    International Nuclear Information System (INIS)

    Parkerton, T.F.; Stone, M.A.

    1995-01-01

    Hydrocarbons generally elicit toxicity via a nonpolar narcotic mechanism. Recent research suggests that chemicals acting by this mode invoke ecotoxicity when the molar concentration in organism lipids exceeds a critical threshold. Since the ecotoxicity of nonpolar narcotic mixtures appears to be additive, the ecotoxicity of hydrocarbon mixtures thus depends upon: (1) the partitioning of the individual hydrocarbons comprising the mixture from the environment to lipids and (2) the total molar sum of the constituent hydrocarbons in lipids. These insights have led previous investigators to advance the concept of biomimetic extraction as a novel tool for assessing potential narcosis-type or baseline ecotoxicity in aqueous samples. Drawing from this earlier work, the authors have developed a method to quantify Bioavailable Petroleum Hydrocarbons (BPHs) in hydrocarbon-contaminated aqueous and soil/sediment samples. A sample is equilibrated with a solid-phase microextraction (SPME) fiber that serves as a surrogate for organism lipids. The total moles of hydrocarbons that partition to the SPME fiber are then quantified using a simple GC/FID procedure. Research conducted to support the development and initial validation of this method will be presented. Results suggest that BPH analyses provide a promising, cost-effective approach for predicting the ecotoxicity of environmental samples contaminated with hydrocarbon mixtures. Consequently, BPH analyses may provide a valuable analytical screening tool for ecotoxicity assessment in product and effluent testing, environmental monitoring and site remediation applications

  3. VisualUrText: A Text Analytics Tool for Unstructured Textual Data

    Science.gov (United States)

    Zainol, Zuraini; Jaymes, Mohd T. H.; Nohuddin, Puteri N. E.

    2018-05-01

    The growing amount of unstructured text on the Internet is tremendous. Text repositories come from Web 2.0, business intelligence and social networking applications. It is also believed that 80-90% of future data growth will be in the form of unstructured text databases that may potentially contain interesting patterns and trends. Text Mining is a well-known technique for discovering interesting patterns and trends, i.e. non-trivial knowledge, in massive unstructured text data. Text Mining covers multidisciplinary fields involving information retrieval (IR), text analysis, natural language processing (NLP), data mining, machine learning, statistics and computational linguistics. This paper discusses the development of a text analytics tool that is proficient in extracting, processing and analyzing unstructured text data and in visualizing the cleaned text in multiple forms such as a Document Term Matrix (DTM), frequency graph, network analysis graph, word cloud and dendrogram. This tool, VisualUrText, is developed to assist students and researchers in extracting interesting patterns and trends in document analyses.
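
    As a minimal sketch of the first of those outputs, the snippet below builds a Document Term Matrix from a two-document toy corpus in plain Python; the corpus, document names, and tokenizer are invented for illustration and are not taken from VisualUrText.

```python
# Build a Document Term Matrix (DTM): one row per document, one column per
# vocabulary term, cell values are raw term counts.
from collections import Counter

docs = {
    "d1": "text mining finds patterns in text",
    "d2": "unstructured text needs cleaning before mining",
}

def tokenize(text):
    # naive whitespace tokenizer; real tools also strip punctuation, stop words
    return text.lower().split()

vocab = sorted({t for doc in docs.values() for t in tokenize(doc)})
dtm = {name: Counter(tokenize(doc)) for name, doc in docs.items()}

# aligned matrix view: a row of counts per document, in vocabulary order
rows = {name: [counts[t] for t in vocab] for name, counts in dtm.items()}
print(vocab)
print(rows["d1"])
```

    The same counts feed the frequency graph and word cloud; a dendrogram would then cluster the rows by a distance measure between them.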

  4. Measurement of very low amounts of arsenic in soils and waters: is ICP-MS the indispensable analytical tool?

    Science.gov (United States)

    López-García, Ignacio; Marín-Hernández, Juan Jose; Perez-Sirvent, Carmen; Hernandez-Cordoba, Manuel

    2017-04-01

    The toxicity of arsenic and its wide distribution in nature need no particular emphasis nowadays, and the value of reliable analytical tools for arsenic determination at very low levels is clear. Leaving aside atomic fluorescence spectrometers specifically designed for this purpose, the task is currently carried out using inductively coupled plasma mass spectrometry (ICP-MS), a powerful but expensive technique that is not available in all laboratories. However, as the recent literature clearly shows, similar or even better analytical performance for the determination of several elements can be achieved by replacing the ICP-MS instrument with an AAS spectrometer (which is commonly present in any laboratory and involves low acquisition and maintenance costs) provided that a simple microextraction step is used to preconcentrate the sample. This communication reports the optimization and results of a new analytical procedure based on this idea and focused on the determination of very low concentrations of arsenic in waters and in extracts from soils and sediments. The procedure is based on a micro-solid-phase extraction process for the separation and preconcentration of arsenic that uses magnetic particles covered with silver nanoparticles functionalized with the sodium salt of 2-mercaptoethane-sulphonate (MESNa). This composite is easily obtained in the laboratory. After the sample is treated with a small amount (only a few milligrams) of the magnetic material, the solid phase is separated by means of a magnetic field and then introduced into an electrothermal atomizer (ETAAS) for arsenic determination. The preconcentration factor is close to 200, with a detection limit below 0.1 µg L-1 arsenic. Speciation of As(III) and As(V) can be achieved by means of two extractions carried out at different acidity. The results for total arsenic are verified using certified reference materials. The authors are grateful to the Comunidad Autonóma de la

  5. Lowering the Barrier to Reproducible Research by Publishing Provenance from Common Analytical Tools

    Science.gov (United States)

    Jones, M. B.; Slaughter, P.; Walker, L.; Jones, C. S.; Missier, P.; Ludäscher, B.; Cao, Y.; McPhillips, T.; Schildhauer, M.

    2015-12-01

    Scientific provenance describes the authenticity, origin, and processing history of research products and promotes scientific transparency by detailing the steps in computational workflows that produce derived products. These products include papers, findings, input data, software products to perform computations, and derived data and visualizations. The geosciences community values this type of information, and, at least theoretically, strives to base conclusions on computationally replicable findings. In practice, capturing detailed provenance is laborious and thus has been a low priority; beyond a lab notebook describing methods and results, few researchers capture and preserve detailed records of scientific provenance. We have built tools for capturing and publishing provenance that integrate into analytical environments that are in widespread use by geoscientists (R and Matlab). These tools lower the barrier to provenance generation by automating capture of critical information as researchers prepare data for analysis, develop, test, and execute models, and create visualizations. The `recordr` library in R and the `matlab-dataone` library in Matlab provide shared functions to capture provenance with minimal changes to normal working procedures. Researchers can capture both scripted and interactive sessions, tag and manage these executions as they iterate over analyses, and then prune and publish provenance metadata and derived products to the DataONE federation of archival repositories. Provenance traces conform to the ProvONE model extension of W3C PROV, enabling interoperability across tools and languages. The capture system supports fine-grained versioning of science products and provenance traces. By assigning global identifiers such as DOIs, researchers can cite the computational processes used to reach findings. And, finally, DataONE has built a web portal to search, browse, and clearly display provenance relationships between input data, the software

  6. Studying Behaviors Among Neurosurgery Residents Using Web 2.0 Analytic Tools.

    Science.gov (United States)

    Davidson, Benjamin; Alotaibi, Naif M; Guha, Daipayan; Amaral, Sandi; Kulkarni, Abhaya V; Lozano, Andres M

    Web 2.0 technologies (e.g., blogs, social networks, and wikis) are increasingly being used by medical schools and postgraduate training programs as tools for information dissemination. These technologies offer the unique opportunity to track metrics of user engagement and interaction. Here, we employ Web 2.0 tools to assess academic behaviors among neurosurgery residents. We performed a retrospective review of all educational lectures, part of the core Neurosurgery Residency curriculum at the University of Toronto, posted on our teaching website (www.TheBrainSchool.net). Our website was developed using publicly available Web 2.0 platforms. Lecture usage was assessed by the number of clicks, and associations were explored with lecturer academic position, timing of examinations, and lecture/subspecialty topic. The overall number of clicks on 77 lectures was 1079. Most of these clicks were occurring during the in-training examination month (43%). Click numbers were significantly higher on lectures presented by faculty (mean = 18.6, standard deviation ± 4.1) compared to those delivered by residents (mean = 8.4, standard deviation ± 2.1) (p = 0.031). Lectures covering topics in functional neurosurgery received the most clicks (47%), followed by pediatric neurosurgery (22%). This study demonstrates the value of Web 2.0 analytic tools in examining resident study behavior. Residents tend to "cram" by downloading lectures in the same month of training examinations and display a preference for faculty-delivered lectures. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  7. Basic Concepts in Classical Test Theory: Tests Aren't Reliable, the Nature of Alpha, and Reliability Generalization as a Meta-analytic Method.

    Science.gov (United States)

    Helms, LuAnn Sherbeck

    This paper discusses the fact that reliability is about scores and not tests and how reliability limits effect sizes. The paper also explores the classical reliability coefficients of stability, equivalence, and internal consistency. Stability is concerned with how stable test scores will be over time, while equivalence addresses the relationship…

  8. Web Analytics

    Science.gov (United States)

    EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.

  9. Emergency Severity Index version 4: a valid and reliable tool in pediatric emergency department triage.

    Science.gov (United States)

    Green, Nicole A; Durani, Yamini; Brecher, Deena; DePiero, Andrew; Loiselle, John; Attia, Magdy

    2012-08-01

    The Emergency Severity Index version 4 (ESI v.4) is the most recently implemented 5-level triage system. The validity and reliability of this triage tool in the pediatric population have not been extensively established. The goals of this study were to assess the validity of ESI v.4 in predicting hospital admission, emergency department (ED) length of stay (LOS), and number of resources utilized, as well as its reliability in a prospective cohort of pediatric patients. The first arm of the study was a retrospective chart review of 780 pediatric patients presenting to a pediatric ED to determine the validity of ESI v.4. Abstracted data included acuity level assigned by the triage nurse using the ESI v.4 algorithm, disposition (admission vs discharge), LOS, and number of resources utilized in the ED. To analyze the validity of ESI v.4, patients were divided into 2 groups for comparison: higher-acuity patients (ESI levels 1, 2, and 3) and lower-acuity patients (ESI levels 4 and 5). Pearson χ² analysis was performed for categorical variables. For continuous variables, we conducted a comparison of means based on the parametric distribution of variables. The second arm was a prospective cohort study to determine the interrater reliability of ESI v.4 among and between pediatric triage (PT) nurses and pediatric emergency medicine (PEM) physicians. Three raters (2 PT nurses and 1 PEM physician) independently assigned triage scores to 100 patients; kappa and the intraclass correlation coefficient were calculated among PT nurses and between the primary PT nurses and physicians. In the validity arm, the distribution of ESI score levels among the 780 cases was as follows: ESI 1: 2 (0.25%); ESI 2: 73 (9.4%); ESI 3: 289 (37%); ESI 4: 251 (32%); and ESI 5: 165 (21%). Hospital admission rates by ESI level were 1: 100%; 2: 42%; 3: 14.9%; 4: 1.2%; and 5: 0.6%. The admission rate of the higher-acuity group (76/364, 21%) was significantly greater than that of the lower-acuity group (4/415, 0.96%).
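
    The agreement statistic used in the reliability arm, Cohen's kappa, corrects raw agreement for agreement expected by chance. The sketch below computes it for two raters on made-up 5-level ESI scores; only the formula, not the data, mirrors the study.

```python
# Cohen's kappa for two raters: (observed agreement - chance agreement)
# divided by (1 - chance agreement).
from collections import Counter

rater_a = [2, 3, 3, 4, 5, 3, 4, 2, 3, 4]   # invented ESI scores
rater_b = [2, 3, 4, 4, 5, 3, 4, 3, 3, 4]

def cohen_kappa(a, b):
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    ca, cb = Counter(a), Counter(b)
    # chance agreement from each rater's marginal score frequencies
    expected = sum(ca[k] * cb[k] for k in set(a) | set(b)) / n**2
    return (observed - expected) / (1 - expected)

print(round(cohen_kappa(rater_a, rater_b), 3))
```

    With three raters the study also reports an intraclass correlation coefficient; kappa as written handles one rater pair at a time.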

  10. Electronic structure of BN-aromatics: Choice of reliable computational tools

    Science.gov (United States)

    Mazière, Audrey; Chrostowska, Anna; Darrigan, Clovis; Dargelos, Alain; Graciaa, Alain; Chermette, Henry

    2017-10-01

    The importance of having reliable calculation tools to interpret and predict the electronic properties of BN-aromatics is directly linked to the growing interest in these very promising new systems in the fields of materials science, biomedical research, and energy sustainability. Ionization energy (IE) is one of the most important parameters for approaching the electronic structure of molecules. It can be estimated theoretically, but in order to evaluate their persistence and propose the most reliable tools for the evaluation of different electronic properties of existing or only imagined BN-containing compounds, we took as reference experimental ionization energies provided by ultraviolet photoelectron spectroscopy (UV-PES) in the gas phase—the only technique giving access to the energy levels of filled molecular orbitals. Thus, a set of 21 aromatic molecules containing B-N bonds and B-N-B patterns has been assembled for a comparison between experimental IEs obtained by UV-PES and various theoretical approaches to their estimation. Time-Dependent Density Functional Theory (TD-DFT) methods using B3LYP and long-range corrected CAM-B3LYP functionals are used, combined with the ΔSCF approach, and compared with electron propagator theory such as outer valence Green's function (OVGF, P3) and symmetry adapted cluster-configuration interaction ab initio methods. Direct Kohn-Sham estimation and "corrected" Kohn-Sham estimation are also given. The deviation between experimental and theoretical values is computed for each molecule, and a statistical study is performed over the average and the root mean square for the whole set and sub-sets of molecules. It is shown that (i) ΔSCF+TDDFT(CAM-B3LYP), OVGF, and P3 are the most efficient routes to good agreement with UV-PES values, (ii) a CAM-B3LYP range-separated hybrid functional is significantly better than B3LYP for the purpose, especially for extended conjugated systems, and (iii) the "corrected" Kohn-Sham result is a

  11. Photogrammetry: an accurate and reliable tool to detect thoracic musculoskeletal abnormalities in preterm infants.

    Science.gov (United States)

    Davidson, Josy; dos Santos, Amelia Miyashiro N; Garcia, Kessey Maria B; Yi, Liu C; João, Priscila C; Miyoshi, Milton H; Goulart, Ana Lucia

    2012-09-01

    To analyse the accuracy and reproducibility of photogrammetry in detecting thoracic abnormalities in infants born prematurely. Cross-sectional study. The Premature Clinic at the Federal University of São Paulo. Fifty-eight infants born prematurely in their first year of life. Measurement of the manubrium/acromion/trapezius angle (degrees) and the deepest thoracic retraction (cm). Digitised photographs were analysed by two blinded physiotherapists using a computer program (SAPO; http://SAPO.incubadora.fapesp.br) to detect shoulder elevation and thoracic retraction. Physical examinations performed independently by two physiotherapists were used to assess the accuracy of the new tool. Thoracic alterations were detected in 39 (67%) and in 40 (69%) infants by Physiotherapists 1 and 2, respectively (kappa coefficient=0.80). Using a receiver operating characteristic curve, measurement of the manubrium/acromion/trapezius angle and the deepest thoracic retraction indicated accuracy of 0.79 and 0.91, respectively. For measurement of the manubrium/acromion/trapezius angle, the Bland and Altman limits of agreement were -6.22 to 7.22° [mean difference (d)=0.5] for repeated measures by one physiotherapist, and -5.29 to 5.79° (d=0.75) between two physiotherapists. For thoracic retraction, the intra-rater limits of agreement were -0.14 to 0.18 cm (d=0.02) and the inter-rater limits of agreement were -0.20 to -0.17 cm (d=0.02). SAPO provided an accurate and reliable tool for the detection of thoracic abnormalities in preterm infants. Copyright © 2011 Chartered Society of Physiotherapy. Published by Elsevier Ltd. All rights reserved.

  12. Development of innovative inspection tools for higher reliability of PHWR fuel

    International Nuclear Information System (INIS)

    Kamalesh Kumar, B.; Viswanathan, B.; Laxminarayana, B.; Ganguly, C.

    2003-01-01

    Advent of computer-aided manufacturing systems has led to very high production rates with greater reliability. Conventional inspection tools and systems, which are often manual, do not match the output of a highly automated production line. To overcome this deficiency, a strategic plan was developed for an automated inspection facility for the PHWR fuel assembly line. Laser-based systems, with their inherently high accuracy and quick response times, are favoured for metrology purposes. The non-contact nature of laser-based measurement ensures minimal contamination, low wear and tear, and good repeatability. So far two laser-based systems, viz. a pellet density measurement system and triangulation sensors, have been developed. A laser-based fuel pellet inspection system and a PHWR fuel bundle metric station are under development. Machine-vision-based systems have been developed to overcome certain limitations of manual inspection carried out on such a large scale. These deficiencies arise from limitations of resolution, accessibility, fatigue and the absence of quantification ability. These problems are further compounded in the inspection of fuel components because of their relatively small sizes, the close tolerances required and their reflective surfaces. A PC-based vision system has been developed for inspecting components and fuel assemblies. The paper touches upon the details of the various laser and vision systems that have been indigenously developed for PHWR fuel metrology and their impact on the assembly production line. (author)

  13. FURAX: assistance tools for the qualitative and quantitative analysis of systems reliability

    International Nuclear Information System (INIS)

    Moureau, R.

    1995-01-01

    FURAX is a set of tools for the qualitative and quantitative safety analysis of system functioning. It is particularly well adapted to the study of networks (fluid, electrical, etc.), i.e. systems in which a flux plays the central functional role. The analysis is based on modelling that privileges these fluxes (a skeleton representation of the system for a network, a functional diagram for a non-single-flux system) and on the representation of component support systems. Qualitative analyses are based on the search for possible flux paths and on knowledge of the technical domain. The results obtained correspond to a simplified failure mode analysis, to fault trees relative to the events specified by the user, and to minimal cut sets. The possible calculations on these models are: fault-tree calculations, Markov-diagram calculations of system reliability, and probabilistic calculation of a cut set viewed as a tree, as a well-ordered sequence of failures, or as the absorbing state of a Markov diagram. (J.S.). 6 refs
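
    The quantitative step, computing a top-event probability from minimal cut sets, can be sketched as follows under the usual assumption of independent component failures; the component names, failure probabilities, and cut sets below are invented, and this is not FURAX code.

```python
# Exact top-event probability from minimal cut sets via inclusion-exclusion
# over unions of cut-set events, assuming independent component failures.
from itertools import combinations
from math import prod

failure_p = {"pump": 0.01, "valve": 0.02, "power": 0.005}
cut_sets = [{"pump", "valve"}, {"power"}]   # top event = any cut set fails

def p_all_fail(components):
    return prod(failure_p[c] for c in components)

def top_event_probability(cuts):
    total = 0.0
    for r in range(1, len(cuts) + 1):
        for combo in combinations(cuts, r):
            union = set().union(*combo)
            total += (-1) ** (r + 1) * p_all_fail(union)
    return total

print(top_event_probability(cut_sets))
```

    For small failure probabilities the first-order (rare-event) approximation, the plain sum of the cut-set probabilities, is often used instead of full inclusion-exclusion.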

  14. Time-resolved fluorescence microscopy (FLIM) as an analytical tool in skin nanomedicine.

    Science.gov (United States)

    Alexiev, Ulrike; Volz, Pierre; Boreham, Alexander; Brodwolf, Robert

    2017-07-01

    The emerging field of nanomedicine provides new approaches for the diagnosis and treatment of diseases, for symptom relief, and for monitoring of disease progression. Topical application of drug-loaded nanoparticles for the treatment of skin disorders is a promising strategy for overcoming the stratum corneum, the upper layer of the skin, which represents an effective physical and biochemical barrier. Understanding drug penetration into skin, and the enhanced penetration facilitated by nanocarriers, requires analytical tools that ideally allow visualization of the skin, its morphology, the drug carriers, the drugs, their transport across the skin and possible interactions, as well as the effects of the nanocarriers within the different skin layers. Here, we review some recent developments in the field of fluorescence microscopy, namely fluorescence lifetime imaging microscopy (FLIM), for improved characterization of nanocarriers, their interactions and their penetration into skin. In particular, FLIM allows for the discrimination of target molecules, e.g. fluorescently tagged nanocarriers, against the autofluorescent tissue background and, owing to the environmental sensitivity of the fluorescence lifetime, also offers insights into the local environment of the nanoparticle and its interactions with other biomolecules. Thus, FLIM shows the potential to overcome several limits of intensity-based microscopy. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Fourier transform ion cyclotron resonance mass spectrometry: the analytical tool for heavy oil and bitumen characterization

    Energy Technology Data Exchange (ETDEWEB)

    Oldenburg, Thomas B.P; Brown, Melisa; Hsieh, Ben; Larter, Steve [Petroleum Reservoir Group (prg), Department of Geoscience, University of Calgary, Alberta (Canada)

    2011-07-01

    Fourier Transform Ion Cyclotron Resonance Mass Spectrometry (FTICRMS), developed in the 1970s by Marshall and Comisarow at the University of British Columbia, has become a commercially available tool capable of analyzing several hundred thousand components of a petroleum mixture at once. This analytical technology will probably usher in a dramatic revolution in geochemical capability, equal to or greater than the molecular revolution that occurred when GCMS technologies became cheaply available. The molecular resolution and information content of an FTICRMS petroleum analysis can be compared to the information in the human genome. With current GCMS-based petroleum geochemical protocols perhaps a few hundred components can be quantitatively determined; with FTICRMS, 1000 times this number of components could possibly be resolved. However, fluid and other properties depend on the interactions of this multitude of hydrocarbon and non-hydrocarbon components, not on the components themselves, and access to the full potential of this new petroleomics will depend on the definition of this interactome.

  16. The 'secureplan' bomb utility: A PC-based analytic tool for bomb defense

    International Nuclear Information System (INIS)

    Massa, R.J.

    1987-01-01

    This paper illustrates a recently developed, PC-based software system for simulating the effects of an infinite variety of hypothetical bomb blasts on structures and personnel in the immediate vicinity of such blasts. The system incorporates two basic rectangular geometries in which blast assessments can be made - an external configuration (highly vented) and an internal configuration (vented and unvented). A variety of explosives can be used - each is translated to an equivalent TNT weight. Provisions in the program account for bomb cases (person, satchel, case and vehicle), mixes of explosives and shrapnel aggregates, and detonation altitudes. The software permits architects, engineers, security personnel and facility managers, without specific knowledge of explosives, to incorporate realistic construction hardening, screening programs, barriers and stand-off provisions in the design and/or operation of diverse facilities. System outputs - generally represented as peak incident or reflected overpressures or impulses - are both graphic and analytic and integrate damage-threshold data for common construction materials including window glazing. The effects of bomb blasts on humans are estimated in terms of temporary and permanent hearing damage, lung damage (lethality) and whole-body translation injury. The software system has been used in the field in providing bomb defense services to a number of commercial clients since July of 1986. In addition to the design of siting, screening and hardening components of bomb defense programs, the software has proven very useful in post-incident analysis and repair scenarios and as a teaching tool for bomb defense training
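
    The TNT-equivalence step described above is conventionally paired with Hopkinson-Cranz cube-root scaling, which maps a charge weight and stand-off range to a scaled distance from which peak overpressure is read off empirical curves. The sketch below illustrates both conversions; the equivalence factors are illustrative round numbers and the code is not the 'secureplan' implementation.

```python
# TNT equivalence and Hopkinson-Cranz scaled distance (illustration only).
TNT_EQUIVALENCE = {"TNT": 1.0, "C4": 1.3, "ANFO": 0.8}   # assumed factors

def tnt_equivalent_kg(explosive, mass_kg):
    # translate a charge of a given explosive into an equivalent TNT weight
    return TNT_EQUIVALENCE[explosive] * mass_kg

def scaled_distance(range_m, explosive, mass_kg):
    # Z = R / W^(1/3); blast parameters such as peak overpressure are
    # tabulated against Z in empirical curves
    w = tnt_equivalent_kg(explosive, mass_kg)
    return range_m / w ** (1.0 / 3.0)

print(tnt_equivalent_kg("C4", 10.0))
print(round(scaled_distance(20.0, "C4", 10.0), 3))
```

    Because Z depends on the cube root of the charge weight, doubling the charge shrinks the effective stand-off distance by only about 21%.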

  17. Impact of population and economic growth on carbon emissions in Taiwan using an analytic tool STIRPAT

    Directory of Open Access Journals (Sweden)

    Jong-Chao Yeh

    2017-01-01

    Carbon emission has increasingly become an issue of global concern because of climate change. Unfortunately, Taiwan was listed among the top 20 countries for carbon emissions in 2014. In order to provide appropriate measures to control carbon emissions, there is an urgent need to address how factors such as population and economic growth impact the emission of carbon dioxide in a developing country. In addition to total population, both the percentage of the population living in urban areas (i.e., the urbanization percentage) and the percentage of non-dependent population may also serve as limiting factors. On the other hand, the total energy-driven gross domestic product (GDP) and the percentage of GDP generated by the manufacturing industries were assessed for their respective degrees of impact on carbon emission. Based on national data for Taiwan over the period 1990–2014, the analytic tool Stochastic Impacts by Regression on Population, Affluence and Technology (STIRPAT) was employed to examine how well the aforementioned factors describe their individual potential impacts on global warming, as measured by the total amount of carbon emitted into the atmosphere. Seven scenarios of the STIRPAT model were proposed and tested statistically for significance. As a result, two models were suggested to predict the impact of population and economic growth on carbon emission by the year 2025 in Taiwan.
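STIRPAT expresses environmental impact I as a power law in population P, affluence A and (optionally) technology T, which becomes linear after taking logarithms: ln I = a + b ln P + c ln A + e, so it can be fitted by ordinary least squares. A minimal pure-Python sketch of a two-driver fit, with invented series (none of the numbers below are Taiwan's data):

```python
import math

def fit_ols(X, y):
    """Ordinary least squares via the normal equations (Gauss-Jordan solve)."""
    n, k = len(X), len(X[0])
    XtX = [[sum(X[r][i] * X[r][j] for r in range(n)) for j in range(k)] for i in range(k)]
    Xty = [sum(X[r][i] * y[r] for r in range(n)) for i in range(k)]
    A = [XtX[i] + [Xty[i]] for i in range(k)]  # augmented matrix [X'X | X'y]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))  # partial pivoting
        A[col], A[piv] = A[piv], A[col]
        for r in range(k):
            if r != col:
                f = A[r][col] / A[col][col]
                A[r] = [a - f * b for a, b in zip(A[r], A[col])]
    return [A[i][k] / A[i][i] for i in range(k)]

# Invented series: population P, affluence A (GDP per capita), and emissions I
# generated exactly from ln I = a + b*ln P + c*ln A, so the fit must recover a, b, c.
P = [10.0, 12.0, 15.0, 18.0, 22.0, 26.0, 30.0]
A = [5.0, 8.0, 7.0, 12.0, 10.0, 18.0, 15.0]
a_true, b_true, c_true = 0.5, 1.2, 0.8
I = [math.exp(a_true + b_true * math.log(p) + c_true * math.log(x)) for p, x in zip(P, A)]

X = [[1.0, math.log(p), math.log(x)] for p, x in zip(P, A)]
y = [math.log(i) for i in I]
a, b, c = fit_ols(X, y)
print(f"intercept={a:.3f}, population elasticity={b:.3f}, affluence elasticity={c:.3f}")
```

With real series one would also test each fitted scenario for statistical significance, as the study does before selecting its two predictive models.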

  18. Surveillance of emerging drugs of abuse in Hong Kong: validation of an analytical tool.

    Science.gov (United States)

    Tang, Magdalene H Y; Ching, C K; Tse, M L; Ng, Carol; Lee, Caroline; Chong, Y K; Wong, Watson; Mak, Tony W L

    2015-04-01

    To validate a locally developed chromatography-based method to monitor emerging drugs of abuse whilst performing regular drug testing in abusers. Cross-sectional study. Eleven regional hospitals, seven social service units, and a tertiary level clinical toxicology laboratory in Hong Kong. A total of 972 drug abusers and high-risk individuals were recruited from acute, rehabilitation, and high-risk settings between 1 November 2011 and 31 July 2013. A subset of the participants was of South Asian ethnicity. In total, 2000 urine or hair specimens were collected. Proof of concept that surveillance of emerging drugs of abuse can be performed whilst conducting routine drug of abuse testing in patients. The method was successfully applied to 2000 samples with three emerging drugs of abuse detected in five samples: PMMA (paramethoxymethamphetamine), TFMPP [1-(3-trifluoromethylphenyl)piperazine], and methcathinone. The method also detected conventional drugs of abuse, with codeine, methadone, heroin, methamphetamine, and ketamine being the most frequently detected drugs. Other findings included the observation that South Asians had significantly higher rates of using opiates such as heroin, methadone, and codeine; and that ketamine and cocaine had significantly higher detection rates in acute subjects compared with the rehabilitation population. This locally developed analytical method is a valid tool for simultaneous surveillance of emerging drugs of abuse and routine drug monitoring of patients at minimal additional cost and effort. Continued, proactive surveillance and early identification of emerging drugs will facilitate prompt clinical, social, and legislative management.

  19. Raman-in-SEM, a multimodal and multiscale analytical tool: performance for materials and expertise.

    Science.gov (United States)

    Wille, Guillaume; Bourrat, Xavier; Maubec, Nicolas; Lahfid, Abdeltif

    2014-12-01

    The availability of Raman spectroscopy in a powerful analytical scanning electron microscope (SEM) allows morphological, elemental, chemical, physical and electronic analysis without moving the sample between instruments. This paper documents the metrological performance of the SEMSCA commercial Raman interface operated in a low-vacuum SEM. It provides multiscale and multimodal analyses such as Raman/EDS, Raman/cathodoluminescence or Raman/STEM (STEM: scanning transmission electron microscopy), as well as Raman spectroscopy on nanomaterials. Since Raman spectroscopy in a SEM can be influenced by several SEM-related phenomena, this paper first presents a comparison of this new tool with a conventional micro-Raman spectrometer. Then, some possible artefacts are documented, which are due to electron beam-induced contamination or to the cathodoluminescence contribution to the Raman spectra, especially with geological samples. These effects are easily overcome by changing or adapting the Raman spectrometer and SEM settings and methodology. The adverse effect of cathodoluminescence is eliminated by using a SEM beam shutter during Raman acquisition; conversely, the interface provides the ability to record the cathodoluminescence (CL) spectrum of a phase. In a second part, this study highlights the interest and efficiency of the coupling in characterizing micrometric phases at the same point. This multimodal approach is illustrated with various issues encountered in geosciences.

  20. Nottingham Prognostic Index in Triple-Negative Breast Cancer: a reliable prognostic tool?

    International Nuclear Information System (INIS)

    Albergaria, André; Ricardo, Sara; Milanezi, Fernanda; Carneiro, Vítor; Amendoeira, Isabel; Vieira, Daniella; Cameselle-Teijeiro, Jorge; Schmitt, Fernando

    2011-01-01

    A breast cancer prognostic tool should ideally be applicable to all types of invasive breast lesions. A number of studies have shown histopathological grade to be an independent prognostic factor in breast cancer, adding prognostic power to nodal stage and tumour size. The Nottingham Prognostic Index has been shown to accurately predict patient outcome in stratified groups with a follow-up period of 15 years after primary diagnosis of breast cancer. Clinically, breast tumours that lack the expression of Oestrogen Receptor, Progesterone Receptor and Human Epidermal growth factor Receptor 2 (HER2) are identified as presenting a 'triple-negative' phenotype, or as triple-negative breast cancers. These poor-outcome tumours represent an easily recognisable prognostic group of breast cancer with aggressive behaviour that currently lacks the benefit of available systemic therapy. There are conflicting results on the prevalence of lymph node metastasis at the time of diagnosis in triple-negative breast cancer patients, but it is currently accepted that triple-negative breast cancer does not metastasize to axillary nodes and bones as frequently as non-triple-negative carcinomas, favouring instead a preferentially haematogenous spread. Hypothetically, this particular tumour dissemination pattern would impair the reliability of the Nottingham Prognostic Index as a tool for triple-negative breast cancer prognostication. The present study tested the effectiveness of the Nottingham Prognostic Index in stratifying breast cancer patients of different subtypes, with special emphasis on a triple-negative breast cancer subset versus non-triple-negative breast cancer. We demonstrated that TNBC disseminates to axillary lymph nodes as frequently as luminal or HER2 tumours, and that TNBC are larger in size compared with other subtypes and almost all grade 3. Additionally, survival curves demonstrated that these prognostic factors are

  1. Extended second moment algebra as an efficient tool in structural reliability

    International Nuclear Information System (INIS)

    Ditlevsen, O.

    1982-01-01

    During the seventies, second moment structural reliability analysis was extensively discussed with respect to philosophy and method. One recent clarification into a consistent formalism is represented by the extended second moment reliability theory, with the generalized reliability index as its measure of safety. Its methods of formal failure probability calculation are useful independently of the opinion that one may adopt about the philosophy of the second moment reliability formalism. After an introduction to the historical development of the philosophy, the paper gives a short introductory review of the extended second moment structural reliability theory. (orig.)

  2. The Impact of a Mechanical Press on the Accuracy of Products and the Reliability of Tools in Cold Forging

    DEFF Research Database (Denmark)

    Krusic, V.; Arentoft, Mogens; Rodic, T.

    2005-01-01

    Cold extrusion is an economical process for the production of elements of complex forms and accurate dimensions. The first part of the article addresses the impact that a mechanical press has on the accuracy of products and the reliability of tools. There is a description of the mechanical press...

  3. Reliability of a Simple Physical Therapist Screening Tool to Assess Errors during Resistance Exercises for Musculoskeletal Pain

    DEFF Research Database (Denmark)

    Andersen, Kenneth Jay; Sundstrup, E.; Andersen, L. L.

    2014-01-01

    The main objective was to investigate the intra- and intertester reliability of a simple screening tool assessing errors in exercise execution by visual observation. 38 participants with no previous resistance exercise experience practiced four typical upper-limb exercises for two weeks using ela...

  4. Assessing Reliability and Validity of the "GroPromo" Audit Tool for Evaluation of Grocery Store Marketing and Promotional Environments

    Science.gov (United States)

    Kerr, Jacqueline; Sallis, James F.; Bromby, Erica; Glanz, Karen

    2012-01-01

    Objective: To evaluate reliability and validity of a new tool for assessing the placement and promotional environment in grocery stores. Methods: Trained observers used the "GroPromo" instrument in 40 stores to code the placement of 7 products in 9 locations within a store, along with other promotional characteristics. To test construct validity,…
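Inter-observer reliability for categorical audit codes such as these is commonly summarised with Cohen's kappa, which corrects raw agreement for agreement expected by chance. The abstract does not state which statistic the authors used, so the following is a generic sketch with invented placement codes:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    n = len(rater_a)
    po = sum(a == b for a, b in zip(rater_a, rater_b)) / n  # observed agreement
    ca, cb = Counter(rater_a), Counter(rater_b)
    # Chance agreement: product of each rater's marginal category frequencies.
    pe = sum(ca[c] * cb[c] for c in set(ca) | set(cb)) / n ** 2
    return (po - pe) / (1 - pe)

# Hypothetical codes from two trained observers for product placement
# (e.g. 1 = checkout, 2 = end-aisle display, 3 = regular aisle shelf):
obs1 = [1, 2, 2, 3, 1, 3, 2, 1, 3, 2, 1, 2]
obs2 = [1, 2, 2, 3, 1, 3, 3, 1, 3, 2, 1, 1]
print(round(cohens_kappa(obs1, obs2), 3))
```

Kappa of 1.0 means perfect agreement; 0 means agreement no better than chance.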

  5. Systematic evaluation of the teaching qualities of Obstetrics and Gynecology faculty: reliability and validity of the SETQ tools

    NARCIS (Netherlands)

    van der Leeuw, Renée; Lombarts, Kiki; Heineman, Maas Jan; Arah, Onyebuchi

    2011-01-01

    The importance of effective clinical teaching for the quality of future patient care is globally understood. Due to recent changes in graduate medical education, new tools are needed to provide faculty with reliable and individualized feedback on their teaching qualities. This study validates two

  6. Semi-structured interview is a reliable and feasible tool for selection of doctors for general practice specialist training

    DEFF Research Database (Denmark)

    Isaksen, Jesper; Hertel, Niels Thomas; Kjær, Niels Kristian

    2013-01-01

    In order to optimise the selection process for admission to specialist training in family medicine, we developed a new design for structured applications and selection interviews. The design contains semi-structured interviews, which combine individualised elements from the applications with standardised behaviour-based questions. This paper describes the design of the tool, and offers reflections concerning its acceptability, reliability and feasibility.

  7. A multisource feedback tool to assess ward round leadership skills of senior paediatric trainees: (2) Testing reliability and practicability.

    Science.gov (United States)

    Goodyear, Helen M; Lakshminarayana, Indumathy; Wall, David; Bindal, Taruna

    2015-05-01

    A five-domain multisource feedback (MSF) tool was previously developed in 2009-2010 by the authors to assess senior paediatric trainees' ward round leadership skills. To determine whether this MSF tool is practicable and reliable, whether individuals' feedback varies over time and trainees' views of the tool. The MSF tool was piloted (April-July 2011) and field tested (September 2011-February 2013) with senior paediatric trainees. A focus group held at the end of field testing obtained trainees' views of the tool. In field testing, 96/115 (84%) trainees returned 633 individual assessments from three different ward rounds over 18 months. The MSF tool had high reliability (Cronbach's α 0.84, G coefficient 0.8 for three raters). In all five domains, data were shifted to the right with scores of 3 (good) and 4 (excellent). Consultants gave significantly lower scores (p<0.001), as did trainees for self-assessment (p<0.001). There was no significant change in MSF scores over 18 months but comments showed that trainees' performance improved. Trainees valued these comments and the MSF tool but had concerns about time taken for feedback and confusion about tool use and the paediatric assessment strategy. A five-domain MSF tool was found to be reliable on pilot and field testing, practicable to use and liked by trainees. Comments on performance were more helpful than scores in giving trainees feedback.
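The Cronbach's α of 0.84 quoted above measures internal consistency across the five domains: it compares the summed variance of the individual domain scores with the variance of the total scores. A minimal pure-Python sketch, assuming ratings on a 1-4 scale (the numbers are invented, not the study's data):

```python
def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(scores):
    """Cronbach's alpha: scores has one row per assessor, one column per item."""
    k = len(scores[0])
    item_vars = [variance([row[i] for row in scores]) for i in range(k)]
    total_var = variance([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Hypothetical MSF ratings: 6 assessors x 5 ward-round leadership domains (1-4 scale).
ratings = [
    [3, 4, 3, 4, 3],
    [4, 4, 4, 3, 4],
    [2, 3, 2, 3, 2],
    [3, 3, 3, 3, 4],
    [4, 4, 3, 4, 4],
    [2, 2, 3, 2, 2],
]
print(round(cronbach_alpha(ratings), 2))
```

Values above roughly 0.8, as in the study, are conventionally read as high internal consistency.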

  8. Chemometrics in analytical chemistry-part I: history, experimental design and data analysis tools.

    Science.gov (United States)

    Brereton, Richard G; Jansen, Jeroen; Lopes, João; Marini, Federico; Pomerantsev, Alexey; Rodionova, Oxana; Roger, Jean Michel; Walczak, Beata; Tauler, Romà

    2017-10-01

    Chemometrics has achieved major recognition and progress in the analytical chemistry field. In the first part of this tutorial, major achievements and contributions of chemometrics to some of the more important stages of the analytical process, like experimental design, sampling, and data analysis (including data pretreatment and fusion), are summarised. The tutorial is intended to give a general updated overview of the chemometrics field to further contribute to its dissemination and promotion in analytical chemistry.

  9. Topological data analysis: A promising big data exploration tool in biology, analytical chemistry and physical chemistry.

    Science.gov (United States)

    Offroy, Marc; Duponchel, Ludovic

    2016-03-03

    An important feature of experimental science is that data of various kinds are being produced at an unprecedented rate. This is mainly due to the development of new instrumental concepts and experimental methodologies. It is also clear that the nature of acquired data is significantly different. Indeed, in every area of science, data take the form of ever bigger tables, where all but a few of the columns (i.e. variables) turn out to be irrelevant to the questions of interest, and further we do not necessarily know which coordinates are the interesting ones. Big data in a biology, analytical chemistry or physical chemistry lab is a future that might be closer than any of us suppose. It is in this sense that new tools have to be developed in order to explore and valorize such data sets. Topological data analysis (TDA) is one of these. It was developed recently by topologists who discovered that topological concepts could be useful for data analysis. The main objective of this paper is to answer the question of why topology is well suited for the analysis of big data sets in many areas, and even more efficient than conventional data analysis methods. Raman analysis of single bacteria provides a good opportunity to demonstrate the potential of TDA for the exploration of various spectroscopic data sets considering different experimental conditions (high noise level, with/without spectral preprocessing, wavelength shift, different spectral resolutions, missing data).

  10. Antifungal proteins from moulds: analytical tools and potential application to dry-ripened foods.

    Science.gov (United States)

    Delgado, Josué; Owens, Rebecca A; Doyle, Sean; Asensio, Miguel A; Núñez, Félix

    2016-08-01

    Moulds growing on the surface of dry-ripened foods contribute to their sensory qualities, but some of them are able to produce mycotoxins that pose a hazard to consumers. Small cysteine-rich antifungal proteins (AFPs) from moulds are highly stable to pH and proteolysis and exhibit a broad inhibition spectrum against filamentous fungi, providing new chances to control hazardous moulds in fermented foods. The analytical tools for characterizing the cellular targets and affected pathways are reviewed. Strategies currently employed to study these mechanisms of action include 'omics' approaches that have come to the forefront in recent years, developing in tandem with genome sequencing of relevant organisms. These techniques contribute to a better understanding of the response of moulds against AFPs, allowing the design of complementary strategies to maximize or overcome the limitations of using AFPs on foods. AFPs alter chitin biosynthesis, and some fungi react inducing cell wall integrity (CWI) pathway. However, moulds able to increase chitin content at the cell wall by increasing proteins in either CWI or calmodulin-calcineurin signalling pathways will resist AFPs. Similarly, AFPs increase the intracellular levels of reactive oxygen species (ROS), and moulds increasing G-protein complex β subunit CpcB and/or enzymes to efficiently produce glutathione may evade apoptosis. Unknown aspects that need to be addressed include the interaction with mycotoxin production by less sensitive toxigenic moulds. However, significant steps have been taken to encourage the use of AFPs in intermediate-moisture foods, particularly for mould-ripened cheese and meat products.

  11. Assessing Households Preparedness for Earthquakes: An Exploratory Study in the Development of a Valid and Reliable Persian-version Tool.

    Science.gov (United States)

    Ardalan, Ali; Sohrabizadeh, Sanaz

    2016-02-25

    Iran ranks among the countries suffering the highest numbers of earthquake casualties. Household preparedness, as one component of risk reduction efforts, is often promoted in quake-prone areas. In Iran, the lack of a valid and reliable household preparedness tool was reported by previous disaster studies. This study aims to fill this gap by developing a valid and reliable tool for assessing household preparedness in the event of an earthquake. The survey was conducted in three phases: a literature review and focus group discussions with the participation of eight key informants, validity measurement, and reliability measurement. Field investigation was completed with the participation of 450 households within three provinces of Iran. Content validity, construct validity using factor analysis, internal consistency using Cronbach's alpha coefficient, and test-retest reliability were assessed to develop the tool. Based on the CVIs, ranging from 0.80 to 0.100, and exploratory factor analysis with factor loadings of more than 0.5, all items were valid. The Cronbach's alpha value (0.7) and test-retest examination by Spearman correlations indicated that the scale was also reliable. The final instrument consisted of six categories and 18 questions covering actions at the time of an earthquake, nonstructural safety, structural safety, hazard maps, communications, drills, and safety skills. Using a Persian-version tool adjusted to socio-cultural determinants and the native language may yield more trustworthy information on earthquake preparedness. It is suggested that disaster managers and researchers apply this tool in their future household preparedness projects. Further research is needed to make effective policies and plans for transforming preparedness knowledge into behaviour.

  12. The Broad Application of Data Science and Analytics: Essential Tools for the Liberal Arts Graduate

    Science.gov (United States)

    Cárdenas-Navia, Isabel; Fitzgerald, Brian K.

    2015-01-01

    New technologies and data science are transforming a wide range of organizations into analytics-intensive enterprises. Despite the resulting demand for graduates with experience in the application of analytics, though, undergraduate education has been slow to change. The academic and policy communities have engaged in a decade-long conversation…

  13. Automation of analytical processes. A tool for higher efficiency and safety

    International Nuclear Information System (INIS)

    Groll, P.

    1976-01-01

    The analytical laboratory of a radiochemical facility is usually faced with the fact that numerous analyses of a similar type must be routinely carried out. Automation of such routine analytical procedures helps in increasing the efficiency and safety of the work. A review of the requirements for automation and its advantages is given and demonstrated on three examples. (author)

  14. Assessing communication skills in dietetic consultations: the development of the reliable and valid DIET-COMMS tool.

    Science.gov (United States)

    Whitehead, K A; Langley-Evans, S C; Tischler, V A; Swift, J A

    2014-04-01

    There is an increasing emphasis on the development of communication skills for dietitians but few evidence-based assessment tools are available. The present study aimed to develop a dietetic-specific, short, reliable and valid assessment tool for measuring communication skills in patient consultations: DIET-COMMS. A literature review and feedback from 15 qualified dietitians were used to establish face and content validity during the development of DIET-COMMS. In total, 113 dietetic students and qualified dietitians were video-recorded undertaking mock consultations, assessed using DIET-COMMS by the lead author, and used to establish intra-rater reliability, as well as construct and predictive validity. Twenty recorded consultations were reassessed by nine qualified dietitians to assess inter-rater reliability; eight of these assessors were interviewed to determine user evaluation. Significant improvements in DIET-COMMS scores were achieved as students and qualified staff progressed through their training and gained experience, demonstrating construct validity, and also by qualified staff attending a training course, indicating predictive validity (P < 0.05), although the feasibility of assessing communication skills in routine practice was questioned. DIET-COMMS is a short, user-friendly, reliable and valid tool for measuring communication skills in patient consultations with both pre- and post-registration dietitians. Additional work is required to develop a training package for assessors and to identify how DIET-COMMS assessment can acceptably be incorporated into practice.

  15. Analytical tools for thermal infrared engineering: a thermal sensor simulation package

    Science.gov (United States)

    Jaggi, Sandeep

    1992-09-01

    The Advanced Sensor Development Laboratory (ASDL) at the Stennis Space Center develops, maintains and calibrates remote sensing instruments for the National Aeronautics & Space Administration. To perform system design trade-offs and analyses, and to establish system parameters, ASDL has developed a software package for the analytical simulation of sensor systems. This package, called 'Analytical Tools for Thermal InfraRed Engineering' (ATTIRE), simulates the various components of a sensor system. The software allows each subsystem of the sensor to be analyzed independently for its performance. These performance parameters are then integrated to obtain system-level information such as SNR, NER, NETD etc. This paper describes the uses of the package and the physics that were used to derive the performance parameters. In addition, ATTIRE can be used as a tutorial for understanding the distribution of thermal flux or solar irradiance over selected bandwidths of the spectrum. This spectrally distributed incident flux can then be analyzed as it propagates through the subsystems that constitute the entire sensor. ATTIRE provides a variety of functions, ranging from plotting black-body curves for varying bandwidths and computing the integral flux, to performing transfer-function analysis of the sensor system. The package runs from a menu-driven interface in a PC-DOS environment. Each subsystem of the sensor is represented by windows and icons. A user-friendly mouse-controlled point-and-click interface allows the user to simulate various aspects of a sensor. The package can simulate a theoretical sensor system. Trade-off studies can be easily done by changing the appropriate parameters and monitoring the effect on system performance. The package can provide plots of system performance versus any system parameter. A parameter (such as the entrance aperture of the optics) could be varied and its effect on another parameter (e.g., NETD) can be plotted. A third parameter (e.g., the
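The band-integrated black-body flux computation described above (integrating spectrally distributed thermal flux over a selected bandwidth) can be sketched with Planck's law and a simple trapezoidal integral. This is a generic illustration, not ATTIRE's actual code; the 8-12 µm band and 300 K scene temperature are example choices:

```python
import math

# Physical constants (SI)
H = 6.62607015e-34      # Planck constant, J*s
C = 2.99792458e8        # speed of light, m/s
KB = 1.380649e-23       # Boltzmann constant, J/K
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/m^2/K^4

def planck_exitance(wl, T):
    """Spectral radiant exitance M_lambda of a blackbody, W/m^2 per metre of wavelength."""
    return (2 * math.pi * H * C ** 2 / wl ** 5) / math.expm1(H * C / (wl * KB * T))

def band_flux(T, wl_lo, wl_hi, n=2000):
    """Trapezoidal integral of M_lambda over the band [wl_lo, wl_hi]."""
    step = (wl_hi - wl_lo) / n
    total = 0.5 * (planck_exitance(wl_lo, T) + planck_exitance(wl_hi, T))
    for i in range(1, n):
        total += planck_exitance(wl_lo + i * step, T)
    return total * step

T = 300.0                                # K, a typical terrestrial scene temperature
flux_8_12 = band_flux(T, 8e-6, 12e-6)    # LWIR band flux, W/m^2
total = SIGMA * T ** 4                   # Stefan-Boltzmann total exitance, W/m^2
print(f"8-12 um flux: {flux_8_12:.1f} W/m^2 ({flux_8_12 / total:.0%} of total)")
```

Repeating the integral over different bands reproduces the "integral flux over selected bandwidths" curves the package plots.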

  16. Analytic solution to leading order coupled DGLAP evolution equations: A new perturbative QCD tool

    International Nuclear Information System (INIS)

    Block, Martin M.; Durand, Loyal; Ha, Phuoc; McKay, Douglas W.

    2011-01-01

    We have analytically solved the LO perturbative QCD singlet DGLAP equations [V. N. Gribov and L. N. Lipatov, Sov. J. Nucl. Phys. 15, 438 (1972)] [G. Altarelli and G. Parisi, Nucl. Phys. B126, 298 (1977)] [Y. L. Dokshitzer, Sov. Phys. JETP 46, 641 (1977)] using Laplace transform techniques. Newly developed, highly accurate numerical inverse Laplace transform algorithms [M. M. Block, Eur. Phys. J. C 65, 1 (2010)] [M. M. Block, Eur. Phys. J. C 68, 683 (2010)] allow us to write fully decoupled solutions for the singlet structure function F_s(x,Q^2) and G(x,Q^2) as F_s(x,Q^2) = F_s(F_{s0}(x_0), G_0(x_0)) and G(x,Q^2) = G(F_{s0}(x_0), G_0(x_0)), where the x_0 are the Bjorken x values at Q_0^2. Here F_s and G are known functions -- found using LO DGLAP splitting functions -- of the initial boundary conditions F_{s0}(x) ≡ F_s(x,Q_0^2) and G_0(x) ≡ G(x,Q_0^2), i.e., the chosen starting functions at the virtuality Q_0^2. For both G(x) and F_s(x), we are able to either devolve or evolve each separately and rapidly, with very high numerical accuracy -- a computational fractional precision of O(10^-9). Armed with this powerful new tool in the perturbative QCD arsenal, we compare our numerical results from the above equations with the published MSTW2008 and CTEQ6L LO gluon and singlet F_s distributions [A. D. Martin, W. J. Stirling, R. S. Thorne, and G. Watt, Eur. Phys. J. C 63, 189 (2009)], starting from their initial values at Q_0^2 = 1 GeV^2 and 1.69 GeV^2, respectively, using their choice of α_s(Q^2). This allows an important independent check on the accuracies of their evolution codes and, therefore, the computational accuracies of their published parton distributions. Our method completely decouples the two LO distributions, at the same time guaranteeing that both G and F_s satisfy the singlet coupled DGLAP equations. It also allows one to easily obtain the effects of the starting functions on the evolved gluon and singlet structure functions, as functions of both Q
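The LO singlet DGLAP system referred to above couples F_s and G through convolutions with the splitting functions; schematically (the notation here is a reconstruction for orientation, not the authors' exact equations):

```latex
\frac{\partial F_s(x,Q^2)}{\partial \ln Q^2}
  = \frac{\alpha_s(Q^2)}{2\pi}
    \bigl[ P_{qq}\otimes F_s + P_{qg}\otimes G \bigr](x,Q^2),
\qquad
\frac{\partial G(x,Q^2)}{\partial \ln Q^2}
  = \frac{\alpha_s(Q^2)}{2\pi}
    \bigl[ P_{gq}\otimes F_s + P_{gg}\otimes G \bigr](x,Q^2)
```

Writing v = ln(1/x), the convolutions ⊗ become ordinary products under a Laplace transform in v, so the coupled integro-differential equations reduce to linear algebraic relations in Laplace space that can be solved in closed form and then inverted numerically. This is the mechanism behind the decoupled solutions quoted above.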

  17. A Comparison of Result Reliability for Investigation of Milk Composition by Alternative Analytical Methods in Czech Republic

    Directory of Open Access Journals (Sweden)

    Oto Hanuš

    2014-01-01

    The reliability of milk analysis results is important for quality assurance of the foodstuff chain. There are direct and indirect methods for measuring milk composition (fat (F), protein (P), lactose (L) and solids non-fat (SNF) content). The goal was to evaluate some reference and routine milk analytical procedures on the basis of their results. The direct reference analyses were: F, fat content (Röse–Gottlieb method); P, crude protein content (Kjeldahl method); L, lactose (monohydrate, polarimetric method); SNF, solids non-fat (gravimetric method). F, P, L and SNF were also determined by various indirect methods: MIR (infrared (IR) technology with optical filters; 7 instruments in 4 labs); MIR–FT (IR spectroscopy with Fourier transformation; 10 in 6); the ultrasonic method (UM; 3 in 1); and analysis by the blue and red box (BRB; 1 in 1). Ten reference milk samples were used. The coefficient of determination (R²), correlation coefficient (r) and standard deviation of the mean of individual differences (MDsd, for n) were evaluated. All correlations (r; for all indirect and alternative methods and all milk components) were significant (P ≤ 0.001). MIR and MIR–FT (conventional methods) explained a considerably higher proportion of the variability in the reference results than the UM and BRB methods (alternative). All r average values (mean minus 1.64 × sd, for a 95% confidence interval) can be used as standards for calibration quality evaluation (MIR, MIR–FT, UM and BRB): for F, 0.997, 0.997, 0.99 and 0.995; for P, 0.986, 0.981, 0.828 and 0.864; for L, 0.968, 0.871, 0.705 and 0.761; for SNF, 0.992, 0.993, 0.911 and 0.872. Similarly for MDsd (mean plus 1.64 × sd): for F, 0.071, 0.068, 0.132 and 0.101%; for P, 0.051, 0.054, 0.202 and 0.14%; for L, 0.037, 0.074, 0.113 and 0.11%; for SNF, 0.052, 0.068, 0.141 and 0.204%.
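The two headline statistics of such a method comparison, the correlation r between reference and routine results and the standard deviation of the individual differences (the basis of MDsd), can be sketched in a few lines of Python. The fat-content values below are invented for illustration, not data from the study:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

def diff_sd(ref_vals, alt_vals):
    """Sample standard deviation of the individual differences (reference - routine)."""
    d = [r - a for r, a in zip(ref_vals, alt_vals)]
    md = sum(d) / len(d)
    return math.sqrt(sum((x - md) ** 2 for x in d) / (len(d) - 1))

# Hypothetical fat content (%) of 10 milk samples: reference (Röse–Gottlieb style)
# versus a routine instrument reading of the same samples.
ref = [3.12, 3.45, 3.78, 4.02, 3.33, 3.91, 4.20, 3.57, 3.66, 3.88]
mir = [3.10, 3.48, 3.75, 4.05, 3.30, 3.95, 4.18, 3.55, 3.70, 3.85]
print(round(pearson_r(ref, mir), 4), round(diff_sd(ref, mir), 4))
```

A high r with a small difference sd is what qualifies a routine method against the reference, which is exactly how the table of limits above is meant to be read.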

  18. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data.

    Science.gov (United States)

    Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-03-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is, however, our limited understanding of biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are the gene set enrichment tools. This study reports on GeneAnalytics™ ( geneanalytics.genecards.org ), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs the LifeMap Science's GeneCards suite, including the GeneCards®--the human gene database; the MalaCards-the human diseases database; and the PathCards--the biological pathways database. Expression-based analysis in GeneAnalytics relies on the LifeMap Discovery®--the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analyses and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics

  19. Effect of Micro Electrical Discharge Machining Process Conditions on Tool Wear Characteristics: Results of an Analytic Study

    DEFF Research Database (Denmark)

    Puthumana, Govindan; P., Rajeev

    2016-01-01

    Micro electrical discharge machining is one of the established techniques for manufacturing high aspect ratio features on electrically conductive materials. This paper presents the results and inferences of an analytical study estimating the effect of process conditions on tool electrode wear...... characteristics in the micro-EDM process. A new approach with two novel factors anticipated to directly control the material removal mechanism from the tool electrode is proposed, using a discharge energy factor (DEf) and a dielectric flushing factor (DFf). The results showed that the correlation between the tool wear rate...... (TWR) and the factors is poor. Thus, individual effects of each factor on TWR are analyzed. The factors selected for the study of individual effects are pulse on-time, discharge peak current, gap voltage and gap flushing pressure. The tool wear rate decreases linearly with an increase in the pulse on...

  20. The Rack-Gear Tool Generation Modelling. Non-Analytical Method Developed in CATIA, Using the Relative Generating Trajectories Method

    Science.gov (United States)

    Teodor, V. G.; Baroiu, N.; Susac, F.; Oancea, N.

    2016-11-01

    The modelling of a curl of surfaces associated with a pair of rolling centrodes, when the profile of the rack-gear's teeth is known by direct measurement as a coordinate matrix, aims to determine the generating quality for an imposed kinematics of the relative motion of the tool with respect to the blank. In this way, it is possible to determine the generating geometrical error as a component of the total error. The generation modelling allows highlighting the potential errors of the generating tool, in order to correct its profile before the tool is used in the machining process. A method developed in CATIA is proposed, based on a new approach, namely the method of "relative generating trajectories". The analytical foundation is presented, as well as some applications for known models of rack-gear type tools used on Maag teething machines.

  1. Learning analytics as a tool for closing the assessment loop in higher education

    OpenAIRE

    Karen D. Mattingly; Margaret C. Rice; Zane L. Berge

    2012-01-01

    This paper examines learning and academic analytics and their relevance to distance education in undergraduate and graduate programs as they impact students, teaching faculty, and academic institutions. The focus is to explore the measurement, collection, analysis, and reporting of data as predictors of student success and drivers of departmental process and program curriculum. Learning and academic analytics in higher education is used to predict student success by examining how and wha...

  2. Application-Driven Reliability Measures and Evaluation Tool for Fault-Tolerant Real-Time Systems

    National Research Council Canada - National Science Library

    Krishna, C

    2001-01-01

    .... The measure combines graphic-theoretic concepts in evaluating the underlying reliability of the network and other means to evaluate the ability of the network to support interprocessor traffic...

  3. Reliability

    African Journals Online (AJOL)

    eobe

    Corresponding author, Tel: +234-703. RELIABILITY .... V , , given by the code of practice. However, checks must .... an optimization procedure over the failure domain F corresponding .... of Concrete Members based on Utility Theory,. Technical ...

  4. Reliability and reproducibility analysis of the Cobb angle and assessing sagittal plane by computer-assisted and manual measurement tools.

    Science.gov (United States)

    Wu, Weifei; Liang, Jie; Du, Yuanli; Tan, Xiaoyi; Xiang, Xuanping; Wang, Wanhong; Ru, Neng; Le, Jinbo

    2014-02-06

    Although many studies on the reliability and reproducibility of measurement have been performed for the coronal Cobb angle, few results on reliability and reproducibility have been reported for sagittal alignment measurement including the pelvis. We usually use SurgimapSpine software to measure the Cobb angle in our studies; however, there are no reports to date on the reliability and reproducibility of its measurements. Sixty-eight standard standing posteroanterior whole-spine radiographs were reviewed. Three examiners carried out the measurements independently under the settings of manual measurement on X-ray radiographs and SurgimapSpine software on the computer. Parameters measured included pelvic incidence, sacral slope, pelvic tilt, lumbar lordosis (LL), thoracic kyphosis, and coronal Cobb angle. SPSS 16.0 software was used for statistical analyses. The means, standard deviations, intraclass and interclass correlation coefficients (ICC), and 95% confidence intervals (CI) were calculated. There was no notable difference between the two tools (P = 0.21) for the coronal Cobb angle. In the sagittal plane parameters, the ICC of intraobserver reliability for the manual measures varied from 0.65 (T2-T5 angle) to 0.95 (LL angle). Further, for the SurgimapSpine tool, the ICC ranged from 0.75 to 0.98. No significant difference in intraobserver reliability was found between the two measurements (P > 0.05). As for interobserver reliability, measurements with the SurgimapSpine tool had better ICC (0.71 to 0.98 vs 0.59 to 0.96) and Pearson's coefficient (0.76 to 0.99 vs 0.60 to 0.97). The reliability of SurgimapSpine measures was significantly higher for all parameters except the coronal Cobb angle, where the difference was not significant (P > 0.05). Although the differences between the two methods are very small, the results of this study indicate that the SurgimapSpine measurement is equivalent to the traditional manual tool for the coronal Cobb angle, but is advantageous in spino
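The record above reports intraclass correlation coefficients (ICC) for intra- and interobserver reliability. As a hedged illustration only (the study used SPSS, and the exact ICC form it applied is not stated in the abstract), a two-way random-effects, single-rater ICC(2,1) can be sketched from the ANOVA mean squares of an n-subjects-by-k-raters table; the function name and data below are hypothetical:

```python
import numpy as np

def icc2_1(ratings):
    """ICC(2,1), two-way random effects, single rater (Shrout-Fleiss form).

    ratings: array-like of shape (n_subjects, k_raters).
    """
    Y = np.asarray(ratings, dtype=float)
    n, k = Y.shape
    grand = Y.mean()
    row_means = Y.mean(axis=1)          # per-subject means
    col_means = Y.mean(axis=0)          # per-rater means
    SSR = k * ((row_means - grand) ** 2).sum()   # between-subject sum of squares
    SSC = n * ((col_means - grand) ** 2).sum()   # between-rater sum of squares
    SST = ((Y - grand) ** 2).sum()
    SSE = SST - SSR - SSC                        # residual sum of squares
    MSR = SSR / (n - 1)
    MSC = SSC / (k - 1)
    MSE = SSE / ((n - 1) * (k - 1))
    return (MSR - MSE) / (MSR + (k - 1) * MSE + k * (MSC - MSE) / n)
```

With identical rater columns the estimate is 1.0; a constant offset between raters lowers it, because ICC(2,1) penalizes systematic rater bias as well as random error.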

  5. EPIPOI: A user-friendly analytical tool for the extraction and visualization of temporal parameters from epidemiological time series

    Directory of Open Access Journals (Sweden)

    Alonso Wladimir J

    2012-11-01

    Full Text Available Abstract. Background: There is an increasing need for processing and understanding relevant information generated by the systematic collection of public health data over time. However, the analysis of those time series usually requires advanced modeling techniques, which are not necessarily mastered by staff, technicians and researchers working in public health and epidemiology. Here a user-friendly tool, EPIPOI, is presented that facilitates the exploration and extraction of parameters describing the trends, seasonality and anomalies that characterize epidemiological processes. It also enables the inspection of those parameters across geographic regions. Although the visual exploration and extraction of relevant parameters from time series data is crucial in epidemiological research, until now it had been largely restricted to specialists. Methods: EPIPOI is freely available software developed in Matlab (The MathWorks Inc.) that runs on both PC and Mac computers. Its friendly interface guides users intuitively through useful comparative analyses, including the comparison of spatial patterns in temporal parameters. Results: EPIPOI is able to handle complex analyses in an accessible way. A prototype has already been used to assist researchers in a variety of contexts, from didactic use in public health workshops to serving as the main analytical tool in published research. Conclusions: EPIPOI can assist public health officials and students to explore time series data using a broad range of sophisticated analytical and visualization tools. It also provides an analytical environment where even advanced users can benefit from a higher degree of control over model assumptions, such as those associated with detecting disease outbreaks and pandemics.

  6. Effect of standardized training on the reliability of the Cochrane risk of bias assessment tool: a prospective study.

    Science.gov (United States)

    da Costa, Bruno R; Beckett, Brooke; Diaz, Alison; Resta, Nina M; Johnston, Bradley C; Egger, Matthias; Jüni, Peter; Armijo-Olivo, Susan

    2017-03-03

    The Cochrane risk of bias tool is commonly criticized for having low reliability. We aimed to investigate whether training of raters, with objective and standardized instructions on how to assess risk of bias, can improve the reliability of the Cochrane risk of bias tool. In this pilot study, four raters inexperienced in risk of bias assessment were randomly allocated to minimal or intensive standardized training for risk of bias assessment of randomized trials of physical therapy treatments for patients with knee osteoarthritis pain. Two raters were experienced risk of bias assessors who served as reference. The primary outcome of our study was between-group reliability, defined as the agreement of the risk of bias assessments of inexperienced raters with the reference assessments of experienced raters. Consensus-based assessments were used for this purpose. The secondary outcome was within-group reliability, defined as the agreement of assessments within pairs of inexperienced raters. We calculated the chance-corrected weighted Kappa to quantify agreement within and between groups of raters for each of the domains of the risk of bias tool. A total of 56 trials were included in our analysis. The Kappa for the agreement of inexperienced raters with reference across items of the risk of bias tool ranged from 0.10 to 0.81 for the minimal training group and from 0.41 to 0.90 for the standardized training group. The Kappa values for the agreement within pairs of inexperienced raters across the items of the risk of bias tool ranged from 0 to 0.38 for the minimal training group and from 0.93 to 1 for the standardized training group. Between-group differences in Kappa for the agreement of inexperienced raters with reference always favored the standardized training group and were most pronounced for incomplete outcome data (difference in Kappa 0.52). Standardized training on risk of bias assessment may significantly improve the reliability of the Cochrane risk of bias tool.
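The agreement statistic in this record is a chance-corrected weighted Kappa. A minimal sketch with quadratic disagreement weights for two raters over ordered integer categories follows (function name and data are hypothetical; the study's own computation is not described in the abstract):

```python
import numpy as np

def weighted_kappa(r1, r2, n_cat):
    """Cohen's weighted kappa with quadratic disagreement weights.

    r1, r2: paired ratings coded as integers 0..n_cat-1.
    """
    r1, r2 = np.asarray(r1), np.asarray(r2)
    N = len(r1)
    O = np.zeros((n_cat, n_cat))            # observed contingency counts
    for a, b in zip(r1, r2):
        O[a, b] += 1
    # expected counts under chance agreement, from the marginals
    E = np.outer(O.sum(axis=1), O.sum(axis=0)) / N
    i, j = np.indices((n_cat, n_cat))
    w = (i - j) ** 2 / (n_cat - 1) ** 2     # quadratic disagreement weights
    return 1.0 - (w * O).sum() / (w * E).sum()
```

Perfect agreement yields kappa = 1 because all observed weight falls on the zero-weight diagonal; systematic disagreement drives the statistic negative.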

  7. Is a sphygmomanometer a valid and reliable tool to measure the isometric strength of hip muscles? A systematic review.

    Science.gov (United States)

    Toohey, Liam Anthony; De Noronha, Marcos; Taylor, Carolyn; Thomas, James

    2015-02-01

    Muscle strength measurement is a key component of physiotherapists' assessment and is frequently used as an outcome measure. A sphygmomanometer is an instrument commonly used to measure blood pressure that can be potentially used as a tool to assess isometric muscle strength. To systematically review the evidence on the reliability and validity of a sphygmomanometer for measuring isometric strength of hip muscles. A literature search was conducted across four databases. Studies were eligible if they presented data on reliability and/or validity, used a sphygmomanometer to measure isometric muscle strength of the hip region, and were peer reviewed. The individual studies were evaluated for quality using a standardized critical appraisal tool. A total of 644 articles were screened for eligibility, with five articles chosen for inclusion. The use of a sphygmomanometer to objectively assess isometric muscle strength of the hip muscles appears to be reliable with intraclass correlation coefficient values ranging from 0.66 to 0.94 in elderly and young populations. No studies were identified that have assessed the validity of a sphygmomanometer. The sphygmomanometer appears to be reliable for assessment of isometric muscle strength around the hip joint, but further research is warranted to establish its validity.

  8. BACHSCORE. A tool for evaluating efficiently and reliably the quality of large sets of protein structures

    Science.gov (United States)

    Sarti, E.; Zamuner, S.; Cossio, P.; Laio, A.; Seno, F.; Trovato, A.

    2013-12-01

    In protein structure prediction it is of crucial importance, especially at the refinement stage, to score large sets of models efficiently by selecting the ones that are closest to the native state. We here present a new computational tool, BACHSCORE, that allows its users to rank different structural models of the same protein according to their quality, evaluated by using the BACH++ (Bayesian Analysis Conformation Hunt) scoring function. The original BACH statistical potential was already shown to discriminate the protein native state with very good reliability in large sets of misfolded models of the same protein. BACH++ features a novel upgrade in the solvation potential of the scoring function, now computed by adapting the LCPO (Linear Combination of Pairwise Orbitals) algorithm. This change further enhances the already good performance of the scoring function. BACHSCORE can be accessed directly through the web server: bachserver.pd.infn.it. Catalogue identifier: AEQD_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEQD_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License version 3 No. of lines in distributed program, including test data, etc.: 130159 No. of bytes in distributed program, including test data, etc.: 24 687 455 Distribution format: tar.gz Programming language: C++. Computer: Any computer capable of running an executable produced by a g++ compiler (version 4.6.3). Operating system: Linux, Unix OS-es. RAM: 1 073 741 824 bytes Classification: 3. Nature of problem: Evaluate the quality of a protein structural model, taking into account the possible "a priori" knowledge of a reference primary sequence that may be different from the amino-acid sequence of the model; the native protein structure should be recognized as the best model. Solution method: The contact potential scores the occurrence of any given type of residue pair in 5 possible

  9. Development of a quality-assessment tool for experimental bruxism studies: reliability and validity

    NARCIS (Netherlands)

    Dawson, A.; Raphael, K.G.; Glaros, A.; Axelsson, S.; Arima, T.; Ernberg, M.; Farella, M.; Lobbezoo, F.; Manfredini, D.; Michelotti, A.; Svensson, P.; List, T.

    2013-01-01

    AIMS: To combine empirical evidence and expert opinion in a formal consensus method in order to develop a quality-assessment tool for experimental bruxism studies in systematic reviews. METHODS: Tool development comprised five steps: (1) preliminary decisions, (2) item generation, (3) face-validity

  10. Cross learning synergies between Operation Management content and the use of generic analytic tools

    Directory of Open Access Journals (Sweden)

    Frederic Marimon

    2017-06-01

    By presenting both objectives simultaneously, students are found to be more motivated to work deeply on both. Students know that the theoretical content will be put into practice through certain tools, which strengthens their interest in the conceptual issues of the chapter. In turn, because students know that they will use a generic tool in a known context, their interest in these tools is reinforced. The result is a cross-learning synergy.

  11. Semi-structured interview is a reliable and feasible tool for selection of doctors for general practice specialist training.

    Science.gov (United States)

    Isaksen, Jesper Hesselbjerg; Hertel, Niels Thomas; Kjær, Niels Kristian

    2013-09-01

    In order to optimise the selection process for admission to specialist training in family medicine, we developed a new design for structured applications and selection interviews. The design contains semi-structured interviews, which combine individualised elements from the applications with standardised behaviour-based questions. This paper describes the design of the tool and offers reflections concerning its acceptability, reliability and feasibility. We used a combined quantitative and qualitative evaluation method. Ratings obtained by the applicants in two selection rounds were analysed for reliability and generalisability using the GENOVA programme. Applicants and assessors were randomly selected for individual semi-structured in-depth interviews. The qualitative data were analysed in accordance with the grounded theory method. Quantitative analysis yielded a high Cronbach's alpha of 0.97 for the first round and 0.90 for the second round, and a G coefficient of 0.74 for the first round and 0.40 for the second round. Qualitative analysis demonstrated high acceptability and fairness, and the process improved the assessors' judgment. Applicants reported concerns about loss of personality and some anxiety. The applicants' ability to reflect on their competences was important. The developed selection tool demonstrated an acceptable level of reliability, but only moderate generalisability. The users found that the tool provided a high degree of acceptability; it is a feasible and useful tool for selection of doctors for specialist training if combined with work-based assessment. Studies on the benefits and drawbacks of this tool compared with other selection models are relevant.
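The quantitative analysis above reports Cronbach's alpha for each selection round. A hedged sketch of the standard formula, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores), for an n-applicants-by-k-items rating table (function name and data hypothetical; the G coefficient from generalizability theory is a separate analysis not sketched here):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency.

    items: array-like of shape (n_respondents, k_items).
    """
    X = np.asarray(items, dtype=float)
    k = X.shape[1]
    item_vars = X.var(axis=0, ddof=1).sum()     # sum of per-item variances
    total_var = X.sum(axis=1).var(ddof=1)       # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)
```

When items are perfectly correlated the item variances account for the minimum possible share of the total-score variance and alpha reaches 1.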

  12. Visualization and analytics tools for infectious disease epidemiology: a systematic review.

    Science.gov (United States)

    Carroll, Lauren N; Au, Alan P; Detwiler, Landon Todd; Fu, Tsung-Chieh; Painter, Ian S; Abernethy, Neil F

    2014-10-01

    A myriad of new tools and algorithms have been developed to help public health professionals analyze and visualize the complex data used in infectious disease control. To better understand approaches to meet these users' information needs, we conducted a systematic literature review focused on the landscape of infectious disease visualization tools for public health professionals, with a special emphasis on geographic information systems (GIS), molecular epidemiology, and social network analysis. The objectives of this review are to: (1) identify public health user needs and preferences for infectious disease information visualization tools; (2) identify existing infectious disease information visualization tools and characterize their architecture and features; (3) identify commonalities among approaches applied to different data types; and (4) describe tool usability evaluation efforts and barriers to the adoption of such tools. We identified articles published in English from January 1, 1980 to June 30, 2013 from five bibliographic databases. Articles with a primary focus on infectious disease visualization tools, needs of public health users, or usability of information visualizations were included in the review. A total of 88 articles met our inclusion criteria. Users were found to have diverse needs, preferences and uses for infectious disease visualization tools, and the existing tools are correspondingly diverse. The architecture of the tools was inconsistently described, and few tools in the review discussed the incorporation of usability studies or plans for dissemination. Many studies identified concerns regarding data sharing, confidentiality and quality. Existing tools offer a range of features and functions that allow users to explore, analyze, and visualize their data, but the tools are often for siloed applications. Commonly cited barriers to widespread adoption included lack of organizational support, access issues, and misconceptions about tool

  13. Auxiliary fields as a tool for computing analytical solutions of the Schroedinger equation

    International Nuclear Information System (INIS)

    Silvestre-Brac, Bernard; Semay, Claude; Buisseret, Fabien

    2008-01-01

    We propose a new method to obtain approximate solutions for the Schroedinger equation with an arbitrary potential that possesses bound states. This method, relying on the auxiliary field technique, allows one to find analytical solutions in many cases. It offers a convenient way to study the qualitative features of the energy spectrum of bound states in any potential. In particular, we illustrate our method by solving the case of central potentials of power-law and logarithmic form. For these types of potentials, we propose very accurate analytical energy formulae which greatly improve the corresponding formulae found in the literature

  14. Auxiliary fields as a tool for computing analytical solutions of the Schroedinger equation

    Energy Technology Data Exchange (ETDEWEB)

    Silvestre-Brac, Bernard [LPSC Universite Joseph Fourier, Grenoble 1, CNRS/IN2P3, Institut Polytechnique de Grenoble, Avenue des Martyrs 53, F-38026 Grenoble-Cedex (France); Semay, Claude; Buisseret, Fabien [Groupe de Physique Nucleaire Theorique, Universite de Mons-Hainaut, Academie universitaire Wallonie-Bruxelles, Place du Parc 20, B-7000 Mons (Belgium)], E-mail: silvestre@lpsc.in2p3.fr, E-mail: claude.semay@umh.ac.be, E-mail: fabien.buisseret@umh.ac.be

    2008-07-11

    We propose a new method to obtain approximate solutions for the Schroedinger equation with an arbitrary potential that possesses bound states. This method, relying on the auxiliary field technique, allows one to find analytical solutions in many cases. It offers a convenient way to study the qualitative features of the energy spectrum of bound states in any potential. In particular, we illustrate our method by solving the case of central potentials of power-law and logarithmic form. For these types of potentials, we propose very accurate analytical energy formulae which greatly improve the corresponding formulae found in the literature.

  15. Monte Carlo simulation: tool for the calibration in analytical determination of radionuclides

    International Nuclear Information System (INIS)

    Gonzalez, Jorge A. Carrazana; Ferrera, Eduardo A. Capote; Gomez, Isis M. Fernandez; Castro, Gloria V. Rodriguez; Ricardo, Niury Martinez

    2013-01-01

    This work shows how the traceability of analytical determinations is established using this calibration method. It highlights the advantages offered by Monte Carlo simulation for applying corrections for differences in chemical composition, density and height of the analyzed samples. Likewise, the results obtained by the LVRA in two exercises organized by the International Atomic Energy Agency (IAEA) are presented. In these exercises (an intercomparison and a proficiency test), all reported analytical results were obtained based on efficiency calibrations by Monte Carlo simulation using the DETEFF program
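Efficiency calibration by Monte Carlo simulation rests on sampling emission directions and particle histories through a modeled source-detector geometry. As a deliberately toy illustration (not the DETEFF program, which also models sample composition, density and height), the geometric efficiency of a point source on the axis of a circular detector face can be estimated by sampling isotropic directions:

```python
import numpy as np

rng = np.random.default_rng(42)

def geometric_efficiency(n, h, R):
    """Toy Monte Carlo estimate of the solid-angle fraction subtended by a
    disc of radius R whose axis passes through a point source a distance h
    above it.  On axis the azimuthal angle is irrelevant, so only
    cos(theta) of the isotropic emission direction needs sampling."""
    cos_t = rng.uniform(-1.0, 1.0, n)            # isotropic in cos(theta)
    # a downward photon hits the disc when cos(theta) <= -h / sqrt(h^2 + R^2)
    return np.mean(cos_t <= -h / np.hypot(h, R))
```

For h = R the estimate converges to the analytic solid-angle fraction (1 - 1/sqrt(2))/2 ≈ 0.146, which gives a sanity check on the sampling.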

  16. The reliability of three psoriasis assessment tools: Psoriasis area and severity index, body surface area and physician global assessment.

    Science.gov (United States)

    Bożek, Agnieszka; Reich, Adam

    2017-08-01

    A wide variety of psoriasis assessment tools have been proposed to evaluate the severity of psoriasis in clinical trials and daily practice. The most frequently used clinical instrument is the psoriasis area and severity index (PASI); however, none of the currently published severity scores used for psoriasis meets all the validation criteria required for an ideal score. The aim of this study was to compare and assess the reliability of 3 commonly used assessment instruments for psoriasis severity: the psoriasis area and severity index (PASI), body surface area (BSA) and physician global assessment (PGA). On the scoring day, 10 trained dermatologists evaluated 9 adult patients with plaque-type psoriasis using the PASI, BSA and PGA. All the subjects were assessed twice by each physician. Correlations between the assessments were analyzed using the Pearson correlation coefficient. The intraclass correlation coefficient (ICC) was calculated to analyze intra-rater reliability, and the coefficient of variation (CV) was used to assess inter-rater variability. Significant correlations were observed among the 3 scales in both assessments. In all 3 scales the ICCs were > 0.75, indicating high intra-rater reliability. The highest ICC was for the BSA (0.96) and the lowest one for the PGA (0.87). The CVs for the PGA and PASI were 29.3 and 36.9, respectively, indicating moderate inter-rater variability. The CV for the BSA was 57.1, indicating high inter-rater variability. Comparing the PASI, PGA and BSA, it was shown that the PGA had the highest inter-rater reliability, whereas the BSA had the highest intra-rater reliability. The PASI showed intermediate values in terms of inter- and intra-rater reliability. None of the 3 assessment instruments showed a significant advantage over the others. A reliable assessment of psoriasis severity requires the use of several independent evaluations simultaneously.
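Inter-rater variability in this record is expressed as a coefficient of variation (CV), i.e. 100 * SD / mean across the raters' scores for a subject. A one-line sketch (hypothetical function name; the study's exact computation and pooling across patients are not described in the abstract):

```python
import numpy as np

def inter_rater_cv(scores):
    """Coefficient of variation (%) across one subject's rater scores."""
    s = np.asarray(scores, dtype=float)
    return s.std(ddof=1) / s.mean() * 100.0
```

A CV of 0 means all raters agreed exactly; larger values indicate proportionally greater spread among raters relative to the mean score.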

  17. Systematic evaluation of the teaching qualities of Obstetrics and Gynecology faculty: reliability and validity of the SETQ tools.

    Directory of Open Access Journals (Sweden)

    Renée van der Leeuw

    Full Text Available BACKGROUND: The importance of effective clinical teaching for the quality of future patient care is globally understood. Due to recent changes in graduate medical education, new tools are needed to provide faculty with reliable and individualized feedback on their teaching qualities. This study validates two instruments underlying the System for Evaluation of Teaching Qualities (SETQ) aimed at measuring and improving the teaching qualities of obstetrics and gynecology faculty. METHODS AND FINDINGS: This cross-sectional multi-center questionnaire study was set in seven general teaching hospitals and two academic medical centers in the Netherlands. Seventy-seven residents and 114 faculty were invited to complete the SETQ instruments during one month between September 2008 and September 2009. To assess the reliability and validity of the instruments, we used exploratory factor analysis, inter-item correlation, reliability coefficient alpha and inter-scale correlations. We also compared composite scales from factor analysis to global ratings. Finally, the number of residents' evaluations needed per faculty member for reliable assessments was calculated. A total of 613 evaluations were completed by 66 residents (85.7% response rate). 99 faculty (86.8% response rate) participated in self-evaluation. Factor analysis yielded five scales with high reliability (Cronbach's alpha for residents and faculty, respectively): learning climate (0.86 and 0.75), professional attitude (0.89 and 0.81), communication of learning goals (0.89 and 0.82), evaluation of residents (0.87 and 0.79) and feedback (0.87 and 0.86). Item-total, inter-scale and scale-global rating correlation coefficients were significant (P<0.01). Four to six residents' evaluations are needed per faculty member (reliability coefficient 0.60-0.80). CONCLUSIONS: Both SETQ instruments were found reliable and valid for evaluating the teaching qualities of obstetrics and gynecology faculty. Future research should examine improvement of

  18. Development, initial reliability and validity testing of an observational tool for assessing technical skills of operating room nurses.

    Science.gov (United States)

    Sevdalis, Nick; Undre, Shabnam; Henry, Janet; Sydney, Elaine; Koutantji, Mary; Darzi, Ara; Vincent, Charles A

    2009-09-01

    The recent emergence of the Systems Approach to the safety and quality of surgical care has triggered individual and team skills training modules for surgeons and anaesthetists and relevant observational assessment tools have been developed. To develop an observational tool that captures operating room (OR) nurses' technical skill and can be used for assessment and training. The Imperial College Assessment of Technical Skills for Nurses (ICATS-N) assesses (i) gowning and gloving, (ii) setting up instrumentation, (iii) draping, and (iv) maintaining sterility. Three to five observable behaviours have been identified for each skill and are rated on 1-6 scales. Feasibility and aspects of reliability and validity were assessed in 20 simulation-based crisis management training modules for trainee nurses and doctors, carried out in a Simulated Operating Room. The tool was feasible to use in the context of simulation-based training. Satisfactory reliability (Cronbach alpha) was obtained across trainers' and trainees' scores (analysed jointly and separately). Moreover, trainer nurse's ratings of the four skills correlated positively, thus indicating adequate content validity. Trainer's and trainees' ratings did not correlate. Assessment of OR nurses' technical skill is becoming a training priority. The present evidence suggests that the ICATS-N could be considered for use as an assessment/training tool for junior OR nurses.

  19. The Efficacy of Violence Prediction: A Meta-Analytic Comparison of Nine Risk Assessment Tools

    Science.gov (United States)

    Yang, Min; Wong, Stephen C. P.; Coid, Jeremy

    2010-01-01

    Actuarial risk assessment tools are used extensively to predict future violence, but previous studies comparing their predictive accuracies have produced inconsistent findings as a result of various methodological issues. We conducted meta-analyses of the effect sizes of 9 commonly used risk assessment tools and their subscales to compare their…

  20. Test-Retest Reliability of an Experienced Global Trigger Tool Review Team

    DEFF Research Database (Denmark)

    Bjørn, Brian; Anhøj, Jacob; Østergaard, Mette

    2018-01-01

    and review 2 and between period 1 and period 2. The increase was solely in category E, minor temporary harm. CONCLUSIONS: The very experienced GTT team could not reproduce harm rates found in earlier reviews. We conclude that GTT in its present form is not a reliable measure of harm rate over time....

  1. Evaluation and Design Tools for the Reliability of Wind Power Converter System

    DEFF Research Database (Denmark)

    Ma, Ke; Zhou, Dao; Blaabjerg, Frede

    2015-01-01

    grid. As a result, the correct assessment of reliable performance for power electronics is a crucial and emerging need; the assessment is essential for design improvement, as well as for the extension of converter lifetime and reduction of energy cost. Unfortunately, there still exists a lack...

  2. Online Programs as Tools to Improve Parenting: A meta-analytic review

    NARCIS (Netherlands)

    Prof. Dr. Jo J.M.A Hermanns; Prof. Dr. Ruben R.G. Fukkink; dr. Christa C.C. Nieuwboer

    2013-01-01

    Background. A number of parenting programs, aimed at improving parenting competencies, have recently been adapted or designed with the use of online technologies. Although web-based services have been claimed to hold promise for parent support, a meta-analytic review of online parenting interventions

  3. Online programs as tools to improve parenting: A meta-analytic review

    NARCIS (Netherlands)

    Nieuwboer, C.C.; Fukkink, R.G.; Hermanns, J.M.A.

    2013-01-01

    Background: A number of parenting programs, aimed at improving parenting competencies, have recently been adapted or designed with the use of online technologies. Although web-based services have been claimed to hold promise for parent support, a meta-analytic review of online parenting

  4. Reliability of a computer software angle tool for measuring spine and pelvic flexibility during the sit-and-reach test.

    Science.gov (United States)

    Mier, Constance M; Shapiro, Belinda S

    2013-02-01

    The purpose of this study was to determine the reliability of a computer software angle tool that measures thoracic (T), lumbar (L), and pelvic (P) angles as a means of evaluating spine and pelvic flexibility during the sit-and-reach (SR) test. Thirty adults performed the SR twice on separate days. The SR test was captured on video and later analyzed for T, L, and P angles using the computer software angle tool. During the test, 3 markers were placed over T1, T12, and L5 vertebrae to identify T, L, and P angles. Intraclass correlation coefficient (ICC) indicated a very high internal consistency (between trials) for T, L, and P angles (0.95-0.99); thus, the average of trials was used for test-retest (between days) reliability. Mean (±SD) values did not differ between days for T (51.0 ± 14.3 vs. 52.3 ± 16.2°), L (23.9 ± 7.1 vs. 23.0 ± 6.9°), or P (98.4 ± 15.6 vs. 98.3 ± 14.7°) angles. Test-retest reliability (ICC) was high for T (0.96) and P (0.97) angles and moderate for L angle (0.84). Both intrarater and interrater reliabilities were high for T (0.95, 0.94) and P (0.97, 0.97) angles and moderate for L angle (0.87, 0.82). Thus, the computer software angle tool is a highly objective method for assessing spine and pelvic flexibility during a video-captured SR test.

  5. Moving standard deviation and moving sum of outliers as quality tools for monitoring analytical precision.

    Science.gov (United States)

    Liu, Jiakai; Tan, Chin Hon; Badrick, Tony; Loh, Tze Ping

    2018-02-01

    An increase in analytical imprecision (expressed as CVa) can introduce additional variability (i.e. noise) into patient results, which poses a challenge to the optimal management of patients. Relatively little work has been done to address the need for continuous monitoring of analytical imprecision. Through numerical simulations, we describe the use of the moving standard deviation (movSD) and the recently described moving sum of outliers (movSO) of patient results as means for detecting increased analytical imprecision, and compare their performance against internal quality control (QC) and average of normals (AoN) approaches. The power to detect an increase in CVa is suboptimal under routine internal QC procedures. The AoN technique almost always had the highest average number of patient results affected before error detection (ANPed), indicating that it generally had the worst capability for detecting an increased CVa. By contrast, the movSD and movSO approaches detected an increased CVa at significantly lower ANPed, particularly for measurands with a relatively small ratio of biological variation to CVa. CONCLUSION: The movSD and movSO approaches are effective in detecting an increase in CVa for high-risk measurands with small biological variation. Their performance is relatively poor when the biological variation is large. However, the clinical risks of an increase in analytical imprecision are attenuated for these measurands, as the increase adds only marginally to the total variation and is less likely to impact clinical care. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
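The movSD idea can be sketched in a few lines of numpy. Everything below is illustrative (simulated analyte values, an arbitrary window of 50 results, and a simple control limit); the paper's exact decision rules are not reproduced here:

```python
import numpy as np

def moving_sd(results, window=50):
    """Standard deviation over a sliding window of consecutive patient results."""
    results = np.asarray(results, dtype=float)
    return np.array([results[i - window:i].std(ddof=1)
                     for i in range(window, len(results) + 1)])

# Simulated result stream: stable imprecision, then imprecision doubles.
rng = np.random.default_rng(0)
baseline = rng.normal(100.0, 5.0, 500)    # in-control period
degraded = rng.normal(100.0, 10.0, 500)   # analytical imprecision has doubled
stream = np.concatenate([baseline, degraded])

movsd = moving_sd(stream, window=50)
limit = 1.5 * baseline.std(ddof=1)        # illustrative control limit
first_flag = int(np.argmax(movsd > limit))  # first window exceeding the limit
```

With these settings the movSD series crosses the limit shortly after the degraded results begin to dominate a window, which is the error-detection behavior the abstract quantifies via ANPed.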

  6. Benefits and limitations of using decision analytic tools to assess uncertainty and prioritize Landscape Conservation Cooperative information needs

    Science.gov (United States)

    Post van der Burg, Max; Cullinane Thomas, Catherine; Holcombe, Tracy R.; Nelson, Richard D.

    2016-01-01

    The Landscape Conservation Cooperatives (LCCs) are a network of partnerships throughout North America that are tasked with integrating science and management to support more effective delivery of conservation at a landscape scale. In order to achieve this integration, some LCCs have adopted the approach of providing their partners with better scientific information in an effort to facilitate more effective and coordinated conservation decisions. Taking this approach has led many LCCs to begin funding research to provide the information for improved decision making. To ensure that funding goes to research projects with the highest likelihood of leading to more integrated broad scale conservation, some LCCs have also developed approaches for prioritizing which information needs will be of most benefit to their partnerships. We describe two case studies in which decision analytic tools were used to quantitatively assess the relative importance of information for decisions made by partners in the Plains and Prairie Potholes LCC. The results of the case studies point toward a few valuable lessons in terms of using these tools with LCCs. Decision analytic tools tend to help shift focus away from research oriented discussions and toward discussions about how information is used in making better decisions. However, many technical experts do not have enough knowledge about decision making contexts to fully inform the latter type of discussion. When assessed in the right decision context, however, decision analyses can point out where uncertainties actually affect optimal decisions and where they do not. This helps technical experts understand that not all research is valuable in improving decision making. But perhaps most importantly, our results suggest that decision analytic tools may be more useful for LCCs as a way of developing integrated objectives for coordinating partner decisions across the landscape, rather than simply ranking research priorities.

  7. Cross-cultural adaptation, reliability, and validity of the Persian version of the Cumberland Ankle Instability Tool.

    Science.gov (United States)

    Hadadi, Mohammad; Ebrahimi Takamjani, Ismail; Ebrahim Mosavi, Mohammad; Aminian, Gholamreza; Fardipour, Shima; Abbasi, Faeze

    2017-08-01

    The purpose of the present study was to translate and cross-culturally adapt the Cumberland Ankle Instability Tool (CAIT) into the Persian language and to evaluate its psychometric properties. The International Quality of Life Assessment process was pursued to translate the CAIT into Persian. Two groups of Persian-speaking individuals, 105 participants with a history of ankle sprain and 30 participants with no history of ankle sprain, were asked to fill out the Persian version of the CAIT (CAIT-P), the Foot and Ankle Ability Measure (FAAM), and a Visual Analog Scale (VAS). Data obtained from the first administration of the CAIT were used to evaluate floor and ceiling effects, internal consistency, dimensionality, and criterion validity. To determine test-retest reliability, 45 individuals completed the CAIT-P again 5-7 days after the first session. Cronbach's alpha was over the cutoff point of 0.70 for both ankles and in both groups. The intra-class correlation coefficient was high for the right (0.95) and left (0.91) ankles. There was a strong correlation between each item and the total score of the CAIT-P. Although the CAIT-P had a strong correlation with the VAS, its correlation with both subscales of the FAAM was moderate. The CAIT-P has good validity and reliability, and it can be used by clinicians and researchers for the identification and investigation of functional ankle instability. Implications for Rehabilitation Chronic ankle instability is one of the most common consequences of acute ankle sprain. The Cumberland Ankle Instability Tool is an acceptable measure to determine functional ankle instability and its severity. The Persian version of the Cumberland Ankle Instability Tool is a valid and reliable tool for clinical and research purposes in Persian-speaking individuals.

  8. Development, Construct Validity, and Reliability of the Questionnaire on Infant Feeding: A Tool for Measuring Contemporary Infant-Feeding Behaviors.

    Science.gov (United States)

    O'Sullivan, Elizabeth J; Rasmussen, Kathleen M

    2017-12-01

    The breastfeeding surveillance tool in the United States, the National Immunization Survey, considers the maternal-infant dyad to be breastfeeding for as long as the infant consumes human milk (HM). However, many infants consume at least some HM from a bottle, which can lead to health outcomes different from those for at-the-breast feeding. Our aim was to develop a construct-valid questionnaire that categorizes infants by nutrition source (own mother's HM, another mother's HM, infant formula, or other) and feeding mode (at the breast or from a bottle), and to test the reliability of this questionnaire. The Questionnaire on Infant Feeding was developed through a literature review and modified based on qualitative research. Construct validity was assessed through cognitive interviews, and a test-retest reliability study was conducted among mothers who completed the questionnaire twice, 1 month apart. Cognitive interviews were conducted with ten mothers from upstate New York between September and December 2014. A test-retest reliability study was conducted among 44 mothers from across the United States between March and May 2015. The outcome measures were the equivalence of questions with continuous responses about the timing of starting and stopping various behaviors and the agreement between responses to questions with categorical responses on the two questionnaires completed 1 month apart. Reliability was assessed using paired-equivalence tests for questions about the timing of starting and stopping behaviors and weighted Cohen's κ for questions about the frequency and intensity of behaviors. Reliability of the Questionnaire on Infant Feeding was moderately high among mothers of infants aged 19 to 35 months, with most questions about the timing of starting and stopping behaviors equivalent to within 1 month. Weighted Cohen's κ for categorical questions indicated substantial agreement. The Questionnaire on Infant Feeding is a construct-valid tool to measure duration, intensity
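The weighted Cohen's κ used above for categorical questions can be illustrated with a small numpy implementation (hypothetical ratings, not the questionnaire's data; category labels are assumed to be ordinal integers 0..n_cat-1):

```python
import numpy as np

def weighted_kappa(r1, r2, n_cat, weights="quadratic"):
    """Weighted Cohen's kappa for two ratings over ordinal categories 0..n_cat-1."""
    observed = np.zeros((n_cat, n_cat))
    for a, b in zip(r1, r2):
        observed[a, b] += 1           # cross-tabulate the two ratings
    observed /= observed.sum()
    # Expected agreement if the two ratings were independent (product of marginals).
    expected = np.outer(observed.sum(axis=1), observed.sum(axis=0))
    i, j = np.indices((n_cat, n_cat))
    d = np.abs(i - j)
    w = d ** 2 if weights == "quadratic" else d   # disagreement weights
    return 1.0 - (w * observed).sum() / (w * expected).sum()
```

Perfect agreement gives κ = 1, and systematic reversal of a two-category rating gives κ = -1; values between 0.61 and 0.80 are conventionally read as "substantial agreement", the range the abstract reports.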

  9. Visualizing the Geography of the Diseases of China: Western Disease Maps from Analytical Tools to Tools of Empire, Sovereignty, and Public Health Propaganda, 1878-1929.

    Science.gov (United States)

    Hanson, Marta

    2017-09-01

    Argument: This article analyzes for the first time the earliest Western maps of diseases in China, spanning fifty years from the late 1870s to the end of the 1920s. The 24 featured disease maps present a visual history of the major transformations in modern medicine, from medical geography to laboratory medicine, wrought on Chinese soil. These medical transformations occurred within new political formations from the Qing dynasty (1644-1911) to colonialism in East Asia (Hong Kong, Taiwan, Manchuria, Korea) and hypercolonialism within China (Tianjin, Shanghai, Amoy), as well as the new Republican Chinese nation state (1912-49). As a subgenre of persuasive graphics, disease maps were marshaled by physicians for various rhetorical functions within these different political contexts. Disease maps in China changed from being mostly analytical tools to functioning as tools of empire, national sovereignty, and public health propaganda, legitimating new medical concepts, public health interventions, and political structures governing human and non-human populations.

  10. A Review on the Design Structure Matrix as an Analytical Tool for Product Development Management

    OpenAIRE

    Mokudai, Takefumi

    2006-01-01

    This article reviews fundamental concepts and analytical techniques of design structure matrix (DSM) as well as recent development of DSM studies. The DSM is a matrix representation of relationships between components of a complex system, such as products, development organizations and processes. Depending on targets of analysis, there are four basic types of DSM: Component-based DSM, Team-based DSM, Task-based DSM, and Parameter-based DSM. There are two streams of recent DSM studies: 1) ...
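To make the DSM idea concrete (an illustrative example, not from the article): a task-based DSM is a binary dependency matrix, and coupled task blocks, which require iteration, show up as groups of mutually reachable tasks under the transitive closure of the matrix:

```python
import numpy as np

# Hypothetical task-based DSM: entry [i, j] = 1 means task i needs output of task j.
tasks = ["spec", "design", "prototype", "test"]
dsm = np.array([
    [0, 0, 0, 0],   # spec depends on nothing
    [1, 0, 0, 1],   # design needs the spec and test feedback
    [0, 1, 0, 0],   # prototyping needs the design
    [0, 0, 1, 0],   # testing needs the prototype
])

n = dsm.shape[0]
reach = dsm.copy()
for _ in range(n):  # transitive closure: who depends on whom, directly or indirectly
    reach = ((reach + reach @ reach) > 0).astype(int)

coupled = (reach * reach.T) > 0   # mutual reachability marks an iteration loop
blocks = {frozenset(set(np.flatnonzero(coupled[i])) | {i}) for i in range(n)}
```

Here `blocks` separates "spec" from the coupled {design, prototype, test} loop created by the test-feedback dependency; DSM partitioning algorithms use exactly this kind of block structure to sequence and cluster tasks.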

  11. Earth Science Data Analytics: Bridging Tools and Techniques with the Co-Analysis of Large, Heterogeneous Datasets

    Science.gov (United States)

    Kempler, Steve; Mathews, Tiffany

    2016-01-01

    The continuum of ever-evolving data management systems affords great opportunities for the enhancement of knowledge and the facilitation of science research. To take advantage of these opportunities, it is essential to understand and develop methods that enable data relationships to be examined and the information to be manipulated. This presentation describes the efforts of the Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster to understand, define, and facilitate the implementation of ESDA to advance science research. Because little has been published on Earth science data analytics, the cluster has defined ESDA along with 10 goals to set the framework for a common understanding of the tools and techniques that are available, and still needed, to support ESDA.

  12. Electrochemical immunoassay using magnetic beads for the determination of zearalenone in baby food: an anticipated analytical tool for food safety.

    Science.gov (United States)

    Hervás, Miriam; López, Miguel Angel; Escarpa, Alberto

    2009-10-27

    In this work, an electrochemical immunoassay involving magnetic beads to determine zearalenone in selected food samples has been developed. The immunoassay scheme is based on a direct competitive immunoassay method in which antibody-coated magnetic beads were employed as the immobilisation support and horseradish peroxidase (HRP) was used as the enzymatic label. Amperometric detection was achieved through the addition of hydrogen peroxide as substrate and hydroquinone as mediator. The analytical performance of the electrochemical immunoassay has been evaluated by analysis of a maize certified reference material (CRM) and selected baby food samples. A detection limit (LOD) of 0.011 µg L(-1) and an EC(50) of 0.079 µg L(-1) were obtained, allowing detection of the mycotoxin zearalenone at relevant levels. In addition, excellent accuracy, with a high recovery yield ranging between 95 and 108%, has been obtained. These analytical features show the proposed electrochemical immunoassay to be a very powerful and timely screening tool for food safety.

  13. Learning analytics as a tool for closing the assessment loop in higher education

    Directory of Open Access Journals (Sweden)

    Karen D. Mattingly

    2012-09-01

    Full Text Available This paper examines learning and academic analytics and their relevance to distance education in undergraduate and graduate programs as they impact students, teaching faculty, and academic institutions. The focus is to explore the measurement, collection, analysis, and reporting of data as predictors of student success and drivers of departmental processes and program curricula. Learning and academic analytics in higher education are used to predict student success by examining how and what students learn and how success is supported by academic programs and institutions. The paper examines what is being done to support students, whether or not it is effective, and if not why, and what educators can do. The paper also examines how these data can be used to create new metrics and inform a continuous cycle of improvement. It presents examples of working models from a sample of institutions of higher education: the Graduate School of Medicine at the University of Wollongong, the University of Michigan, Purdue University, and the University of Maryland, Baltimore County. Finally, the paper identifies considerations and recommendations for using analytics and offers suggestions for future research.

  14. Effect of standardized training on the reliability of the Cochrane risk of bias assessment tool: a study protocol.

    Science.gov (United States)

    da Costa, Bruno R; Resta, Nina M; Beckett, Brooke; Israel-Stahre, Nicholas; Diaz, Alison; Johnston, Bradley C; Egger, Matthias; Jüni, Peter; Armijo-Olivo, Susan

    2014-12-13

    The Cochrane risk of bias (RoB) tool has been widely embraced by the systematic review community, but several studies have reported that its reliability is low. We aim to investigate whether training of raters, including objective and standardized instructions on how to assess risk of bias, can improve the reliability of this tool. We describe the methods that will be used in this investigation and present an intensive standardized training package for risk of bias assessment that could be used by contributors to the Cochrane Collaboration and other reviewers. This is a pilot study. We will first perform a systematic literature review to identify randomized clinical trials (RCTs) that will be used for risk of bias assessment. Using the identified RCTs, we will then do a randomized experiment, where raters will be allocated to two different training schemes: minimal training and intensive standardized training. We will calculate the chance-corrected weighted Kappa with 95% confidence intervals to quantify within- and between-group Kappa agreement for each of the domains of the risk of bias tool. To calculate between-group Kappa agreement, we will use risk of bias assessments from pairs of raters after resolution of disagreements. Between-group Kappa agreement will quantify the agreement between the risk of bias assessment of raters in the training groups and the risk of bias assessment of experienced raters. To compare agreement of raters under different training conditions, we will calculate differences between Kappa values with 95% confidence intervals. This study will investigate whether the reliability of the risk of bias tool can be improved by training raters using standardized instructions for risk of bias assessment. One group of inexperienced raters will receive intensive training on risk of bias assessment and the other will receive minimal training. By including a control group with minimal training, we will attempt to mimic what many review authors

  15. Potential of Near-Infrared Chemical Imaging as Process Analytical Technology Tool for Continuous Freeze-Drying.

    Science.gov (United States)

    Brouckaert, Davinia; De Meyer, Laurens; Vanbillemont, Brecht; Van Bockstal, Pieter-Jan; Lammens, Joris; Mortier, Séverine; Corver, Jos; Vervaet, Chris; Nopens, Ingmar; De Beer, Thomas

    2018-04-03

    Near-infrared chemical imaging (NIR-CI) is an emerging tool for process monitoring because it combines the chemical selectivity of vibrational spectroscopy with spatial information. Whereas traditional near-infrared spectroscopy is an attractive technique for water content determination and solid-state investigation of lyophilized products, chemical imaging opens up possibilities for assessing the homogeneity of these critical quality attributes (CQAs) throughout the entire product. In this contribution, we aim to evaluate NIR-CI as a process analytical technology (PAT) tool for at-line inspection of continuously freeze-dried pharmaceutical unit doses based on spin freezing. The chemical images of freeze-dried mannitol samples were resolved via multivariate curve resolution, allowing us to visualize the distribution of mannitol solid forms throughout the entire cake. Second, a mannitol-sucrose formulation was lyophilized with variable drying times for inducing changes in water content. Analyzing the corresponding chemical images via principal component analysis, vial-to-vial variations as well as within-vial inhomogeneity in water content could be detected. Furthermore, a partial least-squares regression model was constructed for quantifying the water content in each pixel of the chemical images. It was hence concluded that NIR-CI is inherently a most promising PAT tool for continuously monitoring freeze-dried samples. Although some practicalities are still to be solved, this analytical technique could be applied in-line for CQA evaluation and for detecting the drying end point.
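The principal component analysis step used above to expose within-vial variation can be sketched with a plain numpy SVD on synthetic "pixel spectra" (illustrative data and dimensions, not the study's measurements):

```python
import numpy as np

# Synthetic stack of pixel spectra: 500 pixels x 100 wavelengths.
rng = np.random.default_rng(1)
spectra = rng.normal(size=(500, 100))
# Simulate a water-related band offset in half of the pixels (an inhomogeneous cake).
spectra[:250] += np.linspace(0.0, 1.0, 100)

X = spectra - spectra.mean(axis=0)          # mean-center before PCA
U, s, Vt = np.linalg.svd(X, full_matrices=False)
scores = U * s                              # pixel scores on the principal components
explained = s**2 / (s**2).sum()             # fraction of variance per component
```

Reshaping the first-component scores back to the image grid yields a score map in which the two pixel populations separate, which is how chemical images reveal spatial inhomogeneity in a critical quality attribute such as water content.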

  16. Prognostic risk estimates of patients with multiple sclerosis and their physicians: comparison to an online analytical risk counseling tool.

    Directory of Open Access Journals (Sweden)

    Christoph Heesen

    Full Text Available BACKGROUND: Prognostic counseling in multiple sclerosis (MS) is difficult because of the high variability of disease progression. Simultaneously, patients and physicians are increasingly confronted with making treatment decisions at an early stage, which requires taking individual prognoses into account to strike a good balance between benefits and harms of treatments. It is therefore important to understand how patients and physicians estimate prognostic risk, and whether and how these estimates can be improved. An online analytical processing (OLAP) tool based on pooled data from placebo cohorts of clinical trials offers short-term prognostic estimates that can be used for individual risk counseling. OBJECTIVE: The aim of this study was to clarify if personalized prognostic information as presented by the OLAP tool is considered useful and meaningful by patients. Furthermore, we used the OLAP tool to evaluate patients' and physicians' risk estimates. Within this evaluation process we assessed short-time prognostic risk estimates of patients with MS (final n = 110) and their physicians (n = 6) and compared them with the estimates of OLAP. RESULTS: Patients rated the OLAP tool as understandable and acceptable, but to be only of moderate interest. It turned out that patients, physicians, and the OLAP tool ranked patients similarly regarding their risk of disease progression. Both patients' and physicians' estimates correlated most strongly with those disease covariates that the OLAP tool's estimates also correlated with most strongly. Exposure to the OLAP tool did not change patients' risk estimates. CONCLUSION: While the OLAP tool was rated understandable and acceptable, it was only of modest interest and did not change patients' prognostic estimates. The results suggest, however, that patients had some idea regarding their prognosis and which factors were most important in this regard. Future work with OLAP should assess long-term prognostic

  17. Prognostic risk estimates of patients with multiple sclerosis and their physicians: comparison to an online analytical risk counseling tool.

    Science.gov (United States)

    Heesen, Christoph; Gaissmaier, Wolfgang; Nguyen, Franziska; Stellmann, Jan-Patrick; Kasper, Jürgen; Köpke, Sascha; Lederer, Christian; Neuhaus, Anneke; Daumer, Martin

    2013-01-01

    Prognostic counseling in multiple sclerosis (MS) is difficult because of the high variability of disease progression. Simultaneously, patients and physicians are increasingly confronted with making treatment decisions at an early stage, which requires taking individual prognoses into account to strike a good balance between benefits and harms of treatments. It is therefore important to understand how patients and physicians estimate prognostic risk, and whether and how these estimates can be improved. An online analytical processing (OLAP) tool based on pooled data from placebo cohorts of clinical trials offers short-term prognostic estimates that can be used for individual risk counseling. The aim of this study was to clarify if personalized prognostic information as presented by the OLAP tool is considered useful and meaningful by patients. Furthermore, we used the OLAP tool to evaluate patients' and physicians' risk estimates. Within this evaluation process we assessed short-time prognostic risk estimates of patients with MS (final n = 110) and their physicians (n = 6) and compared them with the estimates of OLAP. Patients rated the OLAP tool as understandable and acceptable, but to be only of moderate interest. It turned out that patients, physicians, and the OLAP tool ranked patients similarly regarding their risk of disease progression. Both patients' and physicians' estimates correlated most strongly with those disease covariates that the OLAP tool's estimates also correlated with most strongly. Exposure to the OLAP tool did not change patients' risk estimates. While the OLAP tool was rated understandable and acceptable, it was only of modest interest and did not change patients' prognostic estimates. The results suggest, however, that patients had some idea regarding their prognosis and which factors were most important in this regard. Future work with OLAP should assess long-term prognostic estimates and clarify its usefulness for patients and physicians.

  18. Validity and Reliability of Knowledge, Attitude and Behavior Assessment Tool Among Vulnerable Women Concerning Sexually Transmitted Diseases

    Directory of Open Access Journals (Sweden)

    Zahra Boroumandfar

    2016-05-01

    Full Text Available Objective: The study aimed to design and evaluate the content and face validity, and the reliability, of a knowledge, attitude, and behavior questionnaire on preventive behaviors among vulnerable women concerning sexually transmitted diseases (STDs). Materials and methods: This cross-sectional study was carried out in the two phases of an action research project. In the first phase, to delineate the domains of STD prevention, 20 semi-structured interviews were conducted with vulnerable women residing in a women's prison and women referred to counseling centers. After content analysis of the interviews, three domains were identified: improving knowledge, modifying attitudes, and changing behaviors. In the second phase, the questionnaire was designed and tested in a pilot study. Its content validity was then evaluated, and face validity and reliability were assessed by the test-retest method and Cronbach's alpha, respectively. Results: The content validity index in each of the three domains of the questionnaire (knowledge, attitude, and behavior concerning STDs) was over 0.6, and the overall content validity index was 0.86 across all three domains. Cronbach's alpha, as the reliability of the questionnaire, was 0.80 for knowledge, 0.79 for attitude, and 0.85 for behavior. Conclusion: The results showed that the designed questionnaire is a valid and reliable tool to measure the knowledge, attitudes, and behaviors of vulnerable women predisposed to the risk of STDs.
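Cronbach's alpha, reported above per domain, is straightforward to compute from an item-score matrix. A minimal numpy sketch on synthetic responses (not the questionnaire's data):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha; items is a respondents x items matrix of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of the total score
    return k / (k - 1) * (1.0 - item_vars / total_var)

# Synthetic respondents answering 3 items driven by one latent trait.
rng = np.random.default_rng(0)
trait = rng.normal(size=200)
items = trait[:, None] + rng.normal(scale=0.5, size=(200, 3))
```

Perfectly redundant items give alpha = 1; the noisier the items are relative to the shared trait, the lower alpha falls, which is why values of 0.79-0.85 are read as acceptable internal consistency.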

  19. Magnetic particle separation technique: a reliable and simple tool for RIA/IRMA and quantitative PCR assay

    International Nuclear Information System (INIS)

    Shen Rongsen; Shen Decun

    1998-01-01

    Five types of magnetic particles, without or with aldehyde, amino, and carboxyl functional groups, respectively, were used to immobilize first or second antibody by three modes, i.e., physical adsorption, chemical coupling, and immuno-affinity, forming four types of magnetic particle antibodies. The second antibody immobilized on polyacrolein magnetic particles through aldehyde functional groups and the first antibodies immobilized on carboxylic polystyrene magnetic particles through carboxyl functional groups were recommended for application to RIAs and/or IRMAs. Streptavidin immobilized on commercial magnetic particles through amino functional groups was successfully applied to separating the specific PCR product for quantification of human cytomegalovirus. In the paper, typical data on the reliability of these magnetic particle ligands are reported, and the simplicity of the magnetic particle separation technique is discussed. The results showed that the technique is a reliable and simple tool for RIA/IRMA and quantitative PCR assays. (author)

  20. E-tool for business processes to improve travel time reliability.

    Science.gov (United States)

    2015-01-01

    The e-tool can be found on the TRB website by following this link: http://www.trb.org/Main/Blurbs/170579.aspx The research team developed an e-tool that can be used by practitioners for planning, implementing, integrating, and analyzing business proce...

  1. Autism detection in early childhood (ADEC): reliability and validity data for a Level 2 screening tool for autistic disorder.

    Science.gov (United States)

    Nah, Yong-Hwee; Young, Robyn L; Brewer, Neil; Berlingeri, Genna

    2014-03-01

    The Autism Detection in Early Childhood (ADEC; Young, 2007) was developed as a Level 2 clinician-administered autistic disorder (AD) screening tool that was time-efficient, suitable for children under 3 years, easy to administer, and suitable for persons with minimal training and experience with AD. A best estimate clinical Diagnostic and Statistical Manual of Mental Disorders (4th ed., text rev.; DSM-IV-TR; American Psychiatric Association, 2000) diagnosis of AD was made for 70 children using all available information and assessment results, except for the ADEC data. A screening study compared these children on the ADEC with 57 children with other developmental disorders and 64 typically developing children. Results indicated high internal consistency (α = .91). Interrater reliability and test-retest reliability of the ADEC were also adequate. ADEC scores reliably discriminated different diagnostic groups after controlling for nonverbal IQ and Vineland Adaptive Behavior Composite scores. Construct validity (using exploratory factor analysis) and concurrent validity using performance on the Autism Diagnostic Observation Schedule (Lord et al., 2000), the Autism Diagnostic Interview-Revised (Le Couteur, Lord, & Rutter, 2003), and DSM-IV-TR criteria were also demonstrated. Signal detection analysis identified the optimal ADEC cutoff score, with the ADEC identifying all children who had an AD (N = 70, sensitivity = 1.0) but overincluding children with other disabilities (N = 13, specificity ranging from .74 to .90). Together, the reliability and validity data indicate that the ADEC has potential to be established as a suitable and efficient screening tool for infants with AD.

  2. Program to develop analytical tools for environmental and safety assessment of nuclear material shipping container systems

    International Nuclear Information System (INIS)

    Butler, T.A.

    1978-11-01

    This paper describes a program for developing analytical techniques to evaluate the response of nuclear material shipping containers to severe accidents. Both lumped-mass and finite element techniques are employed to predict shipping container and shipping container-carrier response to impact. The general impact problem is computationally expensive because of its nonlinear, three-dimensional nature. This expense is minimized by using approximate models to parametrically identify critical cases before more exact analyses are performed. The computer codes developed for solving the problem are being experimentally substantiated with test data from full-scale and scale-model container drop tests. 6 figures, 1 table
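As a toy illustration of the lumped-mass approach mentioned above (all parameter values are invented, and real container models couple many masses and nonlinear elements), a one-degree-of-freedom impact can be integrated explicitly to estimate the peak contact force:

```python
# One-DOF lumped-mass impact sketch: container mass m strikes a rigid target
# through a linear contact spring of stiffness k (illustrative values only).
m = 5000.0      # kg, container mass
k = 2.0e7       # N/m, effective contact stiffness
v0 = 9.0        # m/s, impact velocity (roughly a 4 m drop)

dt = 1.0e-5     # s, explicit time step
x, v = 0.0, v0  # x = spring compression, v = velocity into the target
peak_force = 0.0
while True:
    a = -k * max(x, 0.0) / m        # spring pushes back while compressed
    v += a * dt                     # semi-implicit Euler update
    x += v * dt
    peak_force = max(peak_force, k * max(x, 0.0))
    if x <= 0.0:                    # compression returns to zero: rebound, contact ends
        break
```

For a linear contact spring the peak force should approach the analytic value v0*sqrt(k*m), a quick sanity check of the kind used before committing to expensive three-dimensional finite element runs.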

  3. Web-based tools can be used reliably to detect patients with major depressive disorder and subsyndromal depressive symptoms

    Directory of Open Access Journals (Sweden)

    Tsai Shih-Jen

    2007-04-01

    Full Text Available Abstract Background Although depression has been regarded as a major public health problem, many individuals with depression still remain undetected or untreated. Despite the potential for Internet-based tools to greatly improve the success rate of screening for depression, their reliability and validity have not been well studied. Therefore the aim of this study was to evaluate the test-retest reliability and criterion validity of a Web-based system, the Internet-based Self-assessment Program for Depression (ISP-D). Methods The ISP-D, which screens for major depressive disorder (MDD), minor depressive disorder (MinD), and subsyndromal depressive symptoms (SSD), was developed in traditional Chinese. Volunteers, 18 years and older, were recruited via the Internet and then assessed twice on the online ISP-D system to investigate the test-retest reliability of the test. They were subsequently prompted to schedule face-to-face interviews. The interviews were performed by the research psychiatrists using the Mini-International Neuropsychiatric Interview, and the diagnoses made according to DSM-IV diagnostic criteria were used for the statistics of criterion validity. Kappa (κ) values were calculated to assess test-retest reliability. Results A total of 579 volunteer subjects were administered the test. Most of the subjects were young (mean age: 26.2 ± 6.6 years), female (77.7%), single (81.6%), and well educated (61.9% college or higher). The distributions of MDD, MinD, SSD, and no depression specified were 30.9%, 7.4%, 15.2%, and 46.5%, respectively. The mean time to complete the ISP-D was 8.89 ± 6.77 min. One hundred and eighty-four of the respondents completed the retest (response rate: 31.8%). Our analysis revealed that the 2-week test-retest reliability for the ISP-D was excellent (weighted κ = 0.801). Fifty-five participants completed the face-to-face interview for the validity study. The sensitivity, specificity, positive, and negative predictive values for major

  4. Human reliability analysis as an evaluation tool of the emergency evacuation process on industrial installation

    International Nuclear Information System (INIS)

    Santos, Isaac J.A.L. dos; Grecco, Claudio H.S.; Mol, Antonio C.A.; Carvalho, Paulo V.R.; Oliveira, Mauro V.; Botelho, Felipe Mury

    2007-01-01

    Human reliability is the probability that a person correctly performs an activity required by the system within a required time period, and performs no extraneous activity that can degrade the system. Human reliability analysis (HRA) is the analysis, prediction and evaluation of work-oriented human performance using indices such as human error likelihood and probability of task accomplishment. The concept of human error must carry no connotation of guilt and punishment; it has to be treated as a natural consequence that emerges from the mismatch between human capacity and system demand. The majority of human errors are a consequence of the work situation, not of a lack of responsibility on the part of the worker. The anticipation and control of potentially adverse impacts of human action, or of interactions between humans and the system, are integral parts of process safety, where the factors that influence human performance must be recognized and managed. The aim of this paper is to propose a methodology to evaluate the emergency evacuation process on industrial installations, including SLIM-MAUD, a first-generation HRA method, and using virtual reality and simulation software to build and simulate the chosen emergency scenes. (author)
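
SLIM-MAUD derives human error probabilities (HEPs) from expert ratings of performance shaping factors (PSFs). A minimal sketch of the core SLIM calculation follows; the PSF names, weights, ratings and anchor probabilities are hypothetical, chosen only to illustrate the arithmetic:

```python
import math

def success_likelihood_index(weights, ratings):
    """SLI: weighted sum of PSF ratings (ratings scaled 0..1),
    with weights normalized to sum to 1."""
    total = sum(weights)
    return sum((w / total) * r for w, r in zip(weights, ratings))

def calibrate(sli_a, hep_a, sli_b, hep_b):
    """Fit log10(HEP) = a*SLI + b from two anchor tasks with known HEPs."""
    a = (math.log10(hep_a) - math.log10(hep_b)) / (sli_a - sli_b)
    b = math.log10(hep_a) - a * sli_a
    return a, b

def hep(sli, a, b):
    """Human error probability for a task with success likelihood index sli."""
    return 10.0 ** (a * sli + b)

# Hypothetical evacuation task: PSFs = stress, training, signage clarity
weights = [4.0, 3.0, 3.0]    # expert importance weights
ratings = [0.3, 0.8, 0.6]    # expert ratings of this scenario
sli = success_likelihood_index(weights, ratings)
a, b = calibrate(0.9, 1e-3, 0.1, 0.5)  # two hypothetical anchor tasks
task_hep = hep(sli, a, b)
```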

  5. Human reliability analysis as an evaluation tool of the emergency evacuation process on industrial installation

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Isaac J.A.L. dos; Grecco, Claudio H.S.; Mol, Antonio C.A.; Carvalho, Paulo V.R.; Oliveira, Mauro V.; Botelho, Felipe Mury [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)]. E-mail: luquetti@ien.gov.br; grecco@ien.gov.br; mol@ien.gov.br; paulov@ien.gov.br; mvitor@ien.gov.br; felipemury@superig.com.br

    2007-07-01

    Human reliability is the probability that a person correctly performs an activity required by the system within a required time period, and performs no extraneous activity that can degrade the system. Human reliability analysis (HRA) is the analysis, prediction and evaluation of work-oriented human performance using indices such as human error likelihood and probability of task accomplishment. The concept of human error must carry no connotation of guilt and punishment; it has to be treated as a natural consequence that emerges from the mismatch between human capacity and system demand. The majority of human errors are a consequence of the work situation, not of a lack of responsibility on the part of the worker. The anticipation and control of potentially adverse impacts of human action, or of interactions between humans and the system, are integral parts of process safety, where the factors that influence human performance must be recognized and managed. The aim of this paper is to propose a methodology to evaluate the emergency evacuation process on industrial installations, including SLIM-MAUD, a first-generation HRA method, and using virtual reality and simulation software to build and simulate the chosen emergency scenes. (author)

  6. Reliability and validity study of a tool to measure cancer stigma: Patient version

    Directory of Open Access Journals (Sweden)

    Medine Yilmaz

    2017-01-01

    Full Text Available Objective: The aim of this methodological study is to establish the validity and reliability of the Turkish version of “A Questionnaire for Measuring Attitudes toward Cancer (Cancer Stigma) - Patient version.” Methods: The sample comprised oncology patients who were under active cancer treatment. The construct validity was assessed using confirmatory and exploratory factor analysis. Results: The mean age of the participants was 54.9±12.3 years. In the confirmatory factor analysis, fit values were determined as comparative fit index = 0.93, goodness of fit index = 0.91, normed-fit index = 0.91, and root mean square error of approximation (RMSEA) = 0.09 (P<0.05; Kaiser–Meyer–Olkin = 0.88, χ² = 1084.41, df = 66, and Bartlett's test P<0.001). The first factor was “impossibility of recovery and experience of social discrimination” and the second factor was “stereotypes of cancer patients.” The two-factor structure accounted for 56.74% of the variance. The Cronbach's alpha value was determined as 0.88 for the two-factor scale. Conclusions: “A Questionnaire for Measuring Attitudes toward Cancer (Cancer Stigma) - Patient version” is a reliable and valid questionnaire to assess stigmatization of cancer in cancer patients.
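
Cronbach's alpha, reported above as 0.88 for the two-factor scale, can be computed directly from item-level responses. A minimal sketch with made-up data (not the study's):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency.

    items: one list per scale item, each holding the respondents' scores;
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score).
    """
    k = len(items)
    n = len(items[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Hypothetical responses of four patients to two items:
alpha = cronbach_alpha([[1, 2, 3, 4], [2, 2, 4, 4]])
```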

  7. Reliability and Validity Study of a Tool to Measure Cancer Stigma: Patient Version.

    Science.gov (United States)

    Yılmaz, Medine; Dişsiz, Gülçin; Demir, Filiz; Irız, Sibel; Alacacioglu, Ahmet

    2017-01-01

    The aim of this methodological study is to establish the validity and reliability of the Turkish version of "A Questionnaire for Measuring Attitudes toward Cancer (Cancer Stigma) - Patient version." The sample comprised oncology patients who were under active cancer treatment. The construct validity was assessed using confirmatory and exploratory factor analysis. The mean age of the participants was 54.9±12.3 years. In the confirmatory factor analysis, fit values were determined as comparative fit index = 0.93, goodness of fit index = 0.91, normed-fit index = 0.91, and root mean square error of approximation (RMSEA) = 0.09 (P < 0.05; Kaiser-Meyer-Olkin = 0.88, χ² = 1084.41, df = 66, and Bartlett's test P < 0.001). The first factor was "impossibility of recovery and experience of social discrimination" and the second factor was "stereotypes of cancer patients." The two-factor structure accounted for 56.74% of the variance. The Cronbach's alpha value was determined as 0.88 for the two-factor scale. "A Questionnaire for Measuring Attitudes toward Cancer (Cancer Stigma) - Patient version" is a reliable and valid questionnaire to assess stigmatization of cancer in cancer patients.

  8. Systematic evaluation of the teaching qualities of Obstetrics and Gynecology faculty: reliability and validity of the SETQ tools.

    Science.gov (United States)

    van der Leeuw, Renée; Lombarts, Kiki; Heineman, Maas Jan; Arah, Onyebuchi

    2011-05-03

    The importance of effective clinical teaching for the quality of future patient care is globally understood. Due to recent changes in graduate medical education, new tools are needed to provide faculty with reliable and individualized feedback on their teaching qualities. This study validates two instruments underlying the System for Evaluation of Teaching Qualities (SETQ) aimed at measuring and improving the teaching qualities of obstetrics and gynecology faculty. This cross-sectional multi-center questionnaire study was set in seven general teaching hospitals and two academic medical centers in the Netherlands. Seventy-seven residents and 114 faculty were invited to complete the SETQ instruments during a one-month window between September 2008 and September 2009. To assess reliability and validity of the instruments, we used exploratory factor analysis, inter-item correlation, reliability coefficient alpha and inter-scale correlations. We also compared composite scales from factor analysis to global ratings. Finally, the number of residents' evaluations needed per faculty for reliable assessments was calculated. A total of 613 evaluations were completed by 66 residents (85.7% response rate). 99 faculty (86.8% response rate) participated in self-evaluation. Factor analysis yielded five scales with high reliability (Cronbach's alpha for residents' and faculty's evaluations, respectively): learning climate (0.86 and 0.75), professional attitude (0.89 and 0.81), communication of learning goals (0.89 and 0.82), evaluation of residents (0.87 and 0.79) and feedback (0.87 and 0.86). Item-total, inter-scale and scale-global rating correlation coefficients were significant. The SETQ instruments appear reliable and valid for evaluating the teaching qualities of obstetrics and gynecology faculty. Future research should examine improvement of teaching qualities when using the SETQ.

  9. State of the art on nailfold capillaroscopy: a reliable diagnostic tool and putative biomarker in rheumatology?

    Science.gov (United States)

    Cutolo, Maurizio; Smith, Vanessa

    2013-11-01

    Capillaroscopy is a non-invasive and safe tool to morphologically study the microcirculation. In rheumatology it has a dual use. First, it has a role in the differential diagnosis of patients with RP. Second, it may have a role in the prediction of clinical complications in CTDs. In SSc, pilot studies have shown predictive associations with peripheral vascular and lung involvement, hinting at a role of capillaroscopy as a putative biomarker. Also, and logically, in SSc, microangiopathy, as assessed by capillaroscopy, has been associated with markers of the disease such as angiogenic/angiostatic factors and SSc-specific antibodies. Moreover, morphological assessments of the microcirculation (capillaroscopy) seem to correlate with functional assessments (such as laser Doppler). Because of its clinical and research role, efforts in Europe are geared towards expanding knowledge of this tool. Both the European League Against Rheumatism (EULAR) and the ACR are stepping forward to meet this need.

  10. Room temperature phosphorescence in the liquid state as a tool in analytical chemistry

    International Nuclear Information System (INIS)

    Kuijt, Jacobus; Ariese, Freek; Brinkman, Udo A.Th.; Gooijer, Cees

    2003-01-01

    A wide-ranging overview of room temperature phosphorescence in the liquid state (RTPL) is presented, with a focus on recent developments. RTPL techniques like micelle-stabilized (MS)-RTP, cyclodextrin-induced (CD)-RTP, and heavy-atom-induced (HAI)-RTP are discussed. These techniques are mainly applied in the stand-alone format, but coupling with some separation techniques appears to be feasible. Applications of direct, sensitized and quenched phosphorescence are also discussed. As regards sensitized and quenched RTP, emphasis is on the coupling with liquid chromatography (LC) and capillary electrophoresis (CE), but stand-alone applications are also reported. Further, the application of RTPL in immunoassays and in RTP optosensing - the optical sensing of analytes based on RTP - is reviewed. Next to the application of RTPL in quantitative analysis, its use for the structural probing of protein conformations and for time-resolved microscopy of labelled biomolecules is discussed. Finally, an overview is presented of the various analytical techniques which are based on the closely related phenomenon of long-lived lanthanide luminescence. The paper closes with a short evaluation of the state of the art in RTP and a discussion of future perspectives.

  11. A Probabilistic Approach for Reliability and Life Prediction of Electronics in Drilling and Evaluation Tools

    Science.gov (United States)

    2014-12-23

    The work concerns electronics in tools used for measurement while drilling (MWD) and logging while drilling (LWD); the OnTrak tool takes measurements such as resistivity, gamma ray, pressure and vibration.

  12. Reliability and validity of the KIPPPI: an early detection tool for psychosocial problems in toddlers.

    Directory of Open Access Journals (Sweden)

    Ingrid Kruizinga

    Full Text Available BACKGROUND: The KIPPPI (Brief Instrument Psychological and Pedagogical Problem Inventory) is a Dutch questionnaire that measures psychosocial and pedagogical problems in 2-year-olds and consists of a KIPPPI Total score, Wellbeing scale, Competence scale, and Autonomy scale. This study examined the reliability, validity, screening accuracy and clinical application of the KIPPPI. METHODS: Parents of 5959 2-year-old children in the Rotterdam area, the Netherlands, were invited to participate in the study. Parents of 3164 children (53.1% of all invited parents) completed the questionnaire. The internal consistency was evaluated and, in subsamples, the test-retest reliability and concurrent validity with regard to the Child Behavior Checklist (CBCL). Discriminative validity was evaluated by comparing scores of parents who worried about their child's upbringing and parents who did not. Screening accuracy of the KIPPPI was evaluated against the CBCL by calculating Receiver Operating Characteristic (ROC) curves. The clinical application was evaluated by the relation between KIPPPI scores and the clinical decision made by the child health professionals. RESULTS: Psychometric properties of the KIPPPI Total score, Wellbeing scale, Competence scale and Autonomy scale were, respectively: Cronbach's alphas: 0.88, 0.86, 0.83, 0.58; test-retest correlations: 0.80, 0.76, 0.73, 0.60. Concurrent validity was as hypothesised. The KIPPPI was able to discriminate between parents who worried about their child and parents who did not. Screening accuracy was high (>0.90) for the KIPPPI Total score and for the Wellbeing scale. The KIPPPI scale scores and the clinical decision of the child health professional were related (p<0.05), indicating good clinical application.
    CONCLUSION: The results in this large-scale study of a diverse general population sample support the reliability, validity and clinical application of the KIPPPI Total score, Wellbeing scale and Competence

  13. Heat as a groundwater tracer in shallow and deep heterogeneous media: Analytical solution, spreadsheet tool, and field applications

    Science.gov (United States)

    Kurylyk, Barret L.; Irvine, Dylan J.; Carey, Sean K.; Briggs, Martin A.; Werkema, Dale D.; Bonham, Mariah

    2017-01-01

    Groundwater flow advects heat, and thus the deviation of subsurface temperatures from an expected conduction-dominated regime can be analysed to estimate vertical water fluxes. A number of analytical approaches have been proposed for using heat as a groundwater tracer, and these have typically assumed a homogeneous medium. However, heterogeneous thermal properties are ubiquitous in subsurface environments, both at the scale of geologic strata and at finer scales in streambeds. Herein, we apply the analytical solution of Shan and Bodvarsson (2004), developed for estimating vertical water fluxes in layered systems, in 2 new environments distinct from previous vadose zone applications. The utility of the solution for studying groundwater-surface water exchange is demonstrated using temperature data collected from an upwelling streambed with sediment layers, and a simple sensitivity analysis using these data indicates the solution is relatively robust. Also, a deeper temperature profile recorded in a borehole in South Australia is analysed to estimate deeper water fluxes. The analytical solution is able to match observed thermal gradients, including the change in slope at sediment interfaces. Results indicate that not accounting for layering can yield errors in the magnitude and even direction of the inferred Darcy fluxes. A simple automated spreadsheet tool (Flux-LM) is presented to allow users to input temperature and layer data and solve the inverse problem to estimate groundwater flux rates from shallow (e.g., streambed) and deep (e.g., borehole) thermal regimes.
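
The conduction-advection principle behind such methods can be illustrated with the classic steady-state, homogeneous-medium solution of Bredehoeft and Papadopulos (1965); the layered Shan and Bodvarsson (2004) solution used in the paper generalizes this idea. The parameter values below are illustrative assumptions:

```python
import math

def temperature(z, L, T_top, T_bot, q, c_w=4.18e6, k_t=2.0):
    """Steady 1-D temperature between boundaries at depth 0 (T_top) and L (T_bot)
    with vertical Darcy flux q (m/s, positive downward).

    c_w: volumetric heat capacity of water (J/m^3/K)
    k_t: bulk thermal conductivity of the medium (W/m/K)
    """
    beta = c_w * q * L / k_t  # dimensionless Peclet number
    if abs(beta) < 1e-12:
        return T_top + (T_bot - T_top) * z / L  # pure conduction: linear profile
    return T_top + (T_bot - T_top) * math.expm1(beta * z / L) / math.expm1(beta)
```

Downward flow (q > 0) bends the profile toward the upper boundary temperature; it is this departure from linearity that the inverse problem exploits to estimate q.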

  14. Reducing variability of workforce as a tool to improve plan reliability

    DEFF Research Database (Denmark)

    Wandahl, Søren; Yicheng, S.; Zygmunt, K. J.

    Variability of flow is recognized as the greatest obstacle to production management. Since work flow and labour flow are two dominators of work performance, it is important to manage them simultaneously. The objective of this paper is to examine whether plan reliability can be improved by reducing the variance of labour flow. Therefore, three different construction labour data sets have been examined by utilizing Monte Carlo simulation to analyze the probability of finishing simulated projects within a certain time. The research findings revealed that reducing variance in the workforce flow does not necessarily shorten the project length; nevertheless, it increases the probability of finishing the tasks within the critical path duration. Additionally, it was concluded that reducing the variance of crew allocation can improve productivity.
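
The Monte Carlo approach described can be sketched in a few lines. The task durations, coefficient-of-variation values and normal-duration assumption below are ours for illustration, not the paper's data sets:

```python
import random

def simulate_completion_prob(task_means, cv, deadline, n_sims=20000, seed=1):
    """Probability of finishing a serial chain of tasks by `deadline`.

    Task durations are drawn as normal(mean, cv * mean), truncated at zero;
    cv models workforce variability (illustrative assumption).
    """
    random.seed(seed)
    hits = 0
    for _ in range(n_sims):
        total = sum(max(0.0, random.gauss(m, cv * m)) for m in task_means)
        if total <= deadline:
            hits += 1
    return hits / n_sims

# Lower workforce variability (smaller cv) raises the on-time probability
# even though the mean critical-path length is unchanged:
tasks = [5.0, 8.0, 6.0, 4.0]          # critical-path task durations (days)
deadline = 1.05 * sum(tasks)          # 5% schedule buffer
p_low = simulate_completion_prob(tasks, cv=0.1, deadline=deadline)
p_high = simulate_completion_prob(tasks, cv=0.4, deadline=deadline)
```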

  15. Reducing Variability of Workforce as a Tool to Improve Plan Reliability

    DEFF Research Database (Denmark)

    Shen, Yicheng; Zygmunt, Katarzyna Julia; Wandahl, Søren

    2017-01-01

    Variability of flow is recognized as one of the greatest obstacles to production management. Since work flow and labour flow are two dominators of work performance, it is important to manage them simultaneously. The objective of this paper is to examine whether an increased plan reliability could be reached by reducing the variance of labour flow. Therefore, three different construction labour data sets have been examined by utilizing Monte Carlo simulation to analyze the probability of finishing simulated projects within a certain time. The research findings revealed that reducing variance of the workforce flow does not necessarily shorten the project length; nevertheless, it increases the probability of finishing the tasks within the critical path duration. Additionally, it was concluded that reducing the variance of crew allocation can improve productivity.

  16. Life management and operational experience feedback - tools to enhance safety and reliability of the NPP

    International Nuclear Information System (INIS)

    Mach, P.

    1997-01-01

    Preparation of the Temelin power plant's centralized equipment database has started. Principles of reliability-centered maintenance are studied, and use of these activities will be made in the Plant Ageing Management Programme. The aims of the Programme are as follows: selection of important components subject to ageing, data collection, determination of dominant stressors, development, selection and validation of ageing evaluation methods, setup of experience feedback, determination of responsibilities, methodologies and strategy, elaboration of programme procedures and documentation, and maintenance of programme flexibility. Pilot studies of component ageing are under way for the reactor pressure vessel, steam generator, pressurizer, piping, ECCS and cables. The organizational structure of the Operational Experience Feedback system is described, as are the responsibilities of staff and sources of information. (M.D.)

  17. Validation and assessment of uncertainty of chemical tests as a tool for the reliability analysis of wastewater IPEN

    International Nuclear Information System (INIS)

    Silva, Renan A.; Martins, Elaine A.J.; Furusawa, Helio A.

    2011-01-01

    The validation of analytical methods has become an indispensable tool for analysis in chemical laboratories, and is required for accreditation. However, even when a laboratory uses validated methods of analysis, these methods may still generate results that diverge from reality, making it necessary to attach a quantitative attribute (a value) that indicates the degree of certainty of the result of the analytical method used. This measure assigned to the result of a measurement is called measurement uncertainty, and it is estimated with a stated level of confidence; an analytical result has limited significance if a proper assessment of its uncertainty is not carried out. One of the activities of this work was to elaborate a program to help the validation and evaluation of uncertainty in chemical analysis. The program was developed with the Visual Basic programming language, and the method of uncertainty evaluation follows the concepts of the GUM (Guide to the Expression of Uncertainty in Measurement). This uncertainty evaluation program will be applied to chemical analysis in support of the characterization of the Nuclear Fuel Cycle developed by IPEN and to the study of organic substances in wastewater associated with professional activities of the Institute: in the first case, primarily for the determination of total uranium, and in the second case, for substances that were generated by human activities and that are listed in resolution 357/2005. As a strategy for the development of this work, the PDCA cycle was adopted to improve the efficiency of each step and minimize errors while performing the experimental part. The program should be validated to meet the requirements of standards such as ISO/IEC 17025. It is projected for use in other analytical procedures, both in the Nuclear Fuel Cycle and in IPEN's chemical waste control and management program.
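
For uncorrelated input quantities, the GUM's law of propagation of uncertainty, which such a program implements, reduces to a root-sum-of-squares of sensitivity-weighted standard uncertainties. A minimal sketch (the example contributions are hypothetical):

```python
import math

def combined_standard_uncertainty(contributions):
    """u_c(y) = sqrt(sum_i (c_i * u(x_i))^2) for uncorrelated inputs,
    where c_i = dy/dx_i is the sensitivity coefficient (GUM law of
    propagation of uncertainty)."""
    return math.sqrt(sum((c * u) ** 2 for c, u in contributions))

def expanded_uncertainty(u_c, k=2.0):
    """U = k * u_c; a coverage factor k = 2 gives roughly 95% coverage
    for a normally distributed result."""
    return k * u_c

# Hypothetical uranium determination: (sensitivity, standard uncertainty) pairs
u_c = combined_standard_uncertainty([(1.0, 0.03), (0.5, 0.08), (2.0, 0.01)])
U = expanded_uncertainty(u_c)
```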

  18. Validation and assessment of uncertainty of chemical tests as a tool for the reliability analysis of wastewater IPEN

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Renan A.; Martins, Elaine A.J.; Furusawa, Helio A., E-mail: elaine@ipen.br, E-mail: helioaf@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2011-07-01

    The validation of analytical methods has become an indispensable tool for analysis in chemical laboratories, and is required for accreditation. However, even when a laboratory uses validated methods of analysis, these methods may still generate results that diverge from reality, making it necessary to attach a quantitative attribute (a value) that indicates the degree of certainty of the result of the analytical method used. This measure assigned to the result of a measurement is called measurement uncertainty, and it is estimated with a stated level of confidence; an analytical result has limited significance if a proper assessment of its uncertainty is not carried out. One of the activities of this work was to elaborate a program to help the validation and evaluation of uncertainty in chemical analysis. The program was developed with the Visual Basic programming language, and the method of uncertainty evaluation follows the concepts of the GUM (Guide to the Expression of Uncertainty in Measurement). This uncertainty evaluation program will be applied to chemical analysis in support of the characterization of the Nuclear Fuel Cycle developed by IPEN and to the study of organic substances in wastewater associated with professional activities of the Institute: in the first case, primarily for the determination of total uranium, and in the second case, for substances that were generated by human activities and that are listed in resolution 357/2005. As a strategy for the development of this work, the PDCA cycle was adopted to improve the efficiency of each step and minimize errors while performing the experimental part. The program should be validated to meet the requirements of standards such as ISO/IEC 17025. It is projected for use in other analytical procedures, both in the Nuclear Fuel Cycle and in IPEN's chemical waste control and management program.

  19. Prepared for the thirtieth annual conference on bioassay analytical and environmental chemistry. Reliable analysis of high resolution gamma spectra

    International Nuclear Information System (INIS)

    Spitz, H.B.; Buschbom, R.; Rieksts, G.A.; Palmer, H.E.

    1985-01-01

    A new method has been developed to reliably analyze pulse-height energy spectra obtained from measurements employing high resolution germanium detectors. The method employs a simple data transformation and smoothing function to calculate the background and identify photopeaks for isotopic analysis. This technique is elegant in its simplicity because it avoids dependence upon complex spectrum deconvolution, stripping, or other least-squares-fitting techniques which complicate the assessment of measurement reliability. A moving median was chosen for data smoothing because, unlike moving averages, medians are not dominated by extreme data points. Finally, peaks are identified whenever the difference between the background spectrum and the transformed spectrum exceeds a pre-determined number of standard deviations.
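
A minimal sketch of the moving-median background estimate and threshold test described above; the window width and the Poisson-based significance criterion are our assumptions, not the authors' exact transform:

```python
def moving_median(spectrum, half_width):
    """Background estimate: median over a sliding window of channels.
    Unlike a moving average, the median is robust to isolated photopeaks."""
    n = len(spectrum)
    out = []
    for i in range(n):
        lo, hi = max(0, i - half_width), min(n, i + half_width + 1)
        window = sorted(spectrum[lo:hi])
        m = len(window)
        out.append(window[m // 2] if m % 2 else
                   0.5 * (window[m // 2 - 1] + window[m // 2]))
    return out

def find_peaks(spectrum, background, n_sigma=3.0):
    """Flag channels whose counts exceed the background by more than
    n_sigma * sqrt(background), assuming Poisson counting statistics."""
    peaks = []
    for i, (c, b) in enumerate(zip(spectrum, background)):
        if c - b > n_sigma * max(b, 1.0) ** 0.5:
            peaks.append(i)
    return peaks

# Synthetic spectrum: flat 100-count background with a photopeak at channel 50
spectrum = [100.0] * 101
spectrum[50] = 180.0
background = moving_median(spectrum, half_width=5)
```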

  20. Identifying problems and generating recommendations for enhancing complex systems: applying the abstraction hierarchy framework as an analytical tool.

    Science.gov (United States)

    Xu, Wei

    2007-12-01

    This study adopts J. Rasmussen's (1985) abstraction hierarchy (AH) framework as an analytical tool to identify problems and pinpoint opportunities to enhance complex systems. The process of identifying problems and generating recommendations for complex systems using conventional methods is usually conducted based on incompletely defined work requirements. As the complexity of systems rises, the sheer mass of data generated from these methods becomes unwieldy to manage in a coherent, systematic form for analysis. There is little known work on adopting a broader perspective to fill these gaps. AH was used to analyze an aircraft-automation system in order to further identify breakdowns in pilot-automation interactions. Four steps follow: developing an AH model for the system, mapping the data generated by various methods onto the AH, identifying problems based on the mapped data, and presenting recommendations. The breakdowns lay primarily with automation operations that were more goal directed. Identified root causes include incomplete knowledge content and ineffective knowledge structure in pilots' mental models, lack of effective higher-order functional domain information displayed in the interface, and lack of sufficient automation procedures for pilots to effectively cope with unfamiliar situations. The AH is a valuable analytical tool to systematically identify problems and suggest opportunities for enhancing complex systems. It helps further examine the automation awareness problems and identify improvement areas from a work domain perspective. Applications include the identification of problems and generation of recommendations for complex systems as well as specific recommendations regarding pilot training, flight deck interfaces, and automation procedures.

  1. Analytical tools for managing rock fall hazards in Australian coal mine roadways

    Energy Technology Data Exchange (ETDEWEB)

    Ross Seedsman; Nick Gordon; Naj Aziz [University of Wollongong (Australia)

    2009-03-15

    This report provides a reference source for the design of ground control measures in coal mine roadways using analytical methods. Collapse models are provided for roof and rib. The roof models recognise that different collapse modes can apply in different stress fields - high, intermediate, and zero compressive stresses. The rib models draw analogies to rock slope stability and also the impact of high vertical stresses. Methods for determining support or reinforcement requirements are provided. Suspension of collapsed masses is identified as the basis for roof support in both very high and zero compressive stress regimes. Reinforcement of bedding discontinuities is advocated for intermediate compressive stresses. For the ribs, restraint of coal blocks defined by pre-existing joints or by mining induced fractures is required.

  2. saltPAD: A New Analytical Tool for Monitoring Salt Iodization in Low Resource Settings

    Directory of Open Access Journals (Sweden)

    Nicholas M. Myers

    2016-03-01

    Full Text Available We created a paper test card that measures a common iodizing agent, iodate, in salt. To test the analytical metrics, usability, and robustness of the paper test card when it is used in low resource settings, the South African Medical Research Council and GroundWork performed independent validation studies of the device. The accuracy and precision metrics from both studies were comparable. In the SAMRC study, more than 90% of the test results (n=1704) were correctly classified as corresponding to adequately or inadequately iodized salt. The cards are suitable for market and household surveys to determine whether salt is adequately iodized. Further development of the cards will improve their utility for monitoring salt iodization during production.

  3. PLS2 regression as a tool for selection of optimal analytical modality

    DEFF Research Database (Denmark)

    Madsen, Michael; Esbensen, Kim

    Intelligent use of modern process analysers allows process technicians and engineers to look deep into the dynamic behaviour of production systems. This opens up a plurality of new possibilities with respect to process optimisation. Oftentimes, several instruments representing different technologies and price classes are able to decipher relevant process information simultaneously. The question then is: how to choose between available technologies without compromising the quality and usability of the data? We apply PLS2 modelling to quantify the relative merits of competing, or complementing, analytical modalities. We here present results from a feasibility study where Fourier Transform Near InfraRed (FT-NIR), Fourier Transform Mid InfraRed (FT-MIR), and Raman laser spectroscopy were applied to the same set of samples obtained from a pilot-scale beer brewing process. Quantitative PLS1 models...

  4. Accept or Decline? An Analytics-Based Decision Tool for Kidney Offer Evaluation.

    Science.gov (United States)

    Bertsimas, Dimitris; Kung, Jerry; Trichakis, Nikolaos; Wojciechowski, David; Vagefi, Parsia A

    2017-12-01

    When a deceased-donor kidney is offered to a waitlisted candidate, the decision to accept or decline the organ relies primarily upon a practitioner's experience and intuition. Such decisions must achieve a delicate balance between estimating the immediate benefit of transplantation and the potential for future higher-quality offers. However, the current experience-based paradigm lacks scientific rigor and is subject to the inaccuracies that plague anecdotal decision-making. A data-driven analytics-based model was developed to predict whether a patient will receive an offer for a deceased-donor kidney at Kidney Donor Profile Index thresholds of 0.2, 0.4, and 0.6, and at timeframes of 3, 6, and 12 months. The model accounted for Organ Procurement Organization, blood group, wait time, DR antigens, and prior offer history to provide accurate and personalized predictions. Performance was evaluated on data sets spanning various lengths of time to understand the adaptability of the method. Using United Network for Organ Sharing match-run data from March 2007 to June 2013, out-of-sample area under the receiver operating characteristic curve was approximately 0.87 for all Kidney Donor Profile Index thresholds and timeframes considered for the 10 most populous Organ Procurement Organizations. As more data becomes available, area under the receiver operating characteristic curve values increase and subsequently level off. The development of a data-driven analytics-based model may assist transplant practitioners and candidates during the complex decision of whether to accept or forgo a current kidney offer in anticipation of a future high-quality offer. The latter holds promise to facilitate timely transplantation and optimize the efficiency of allocation.
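
The reported out-of-sample discrimination of about 0.87 is the area under the ROC curve, which equals the probability that a randomly chosen positive case is scored above a randomly chosen negative one. A minimal sketch of that rank-based computation with toy labels and scores (not the UNOS data):

```python
def roc_auc(labels, scores):
    """AUC via the rank (Mann-Whitney) formulation: the fraction of
    positive/negative pairs where the positive scores higher (ties count half)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: 1 = received an offer within the timeframe, 0 = did not
auc = roc_auc([1, 1, 0, 0], [0.9, 0.8, 0.3, 0.2])
```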

  5. Innovative analytical tools to characterize prebiotic carbohydrates of functional food interest.

    Science.gov (United States)

    Corradini, Claudio; Lantano, Claudia; Cavazza, Antonella

    2013-05-01

    Functional foods are one of the most interesting areas of research and innovation in the food industry. A functional food or functional ingredient is considered to be any food or food component that provides health benefits beyond basic nutrition. Recently, consumers have shown interest in natural bioactive compounds as functional ingredients in the diet owing to their various beneficial effects for health. Water-soluble fibers and nondigestible oligosaccharides and polysaccharides can be defined as functional food ingredients. Fructooligosaccharides (FOS) and inulin are resistant to direct metabolism by the host and reach the caecocolon, where they are used by selected groups of beneficial bacteria. Furthermore, they are able to improve physical and structural properties of food, such as hydration, oil-holding capacity, viscosity, texture, sensory characteristics, and shelf-life. This article reviews major innovative analytical developments to screen and identify FOS, inulins, and the most employed nonstarch carbohydrates added or naturally present in functional food formulations. High-performance anion-exchange chromatography with pulsed electrochemical detection (HPAEC-PED) is one of the most employed analytical techniques for the characterization of those molecules. Mass spectrometry is also of great help, in particularly matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS), which is able to provide extensive information regarding the molecular weight and length profiles of oligosaccharides and polysaccharides. Moreover, MALDI-TOF-MS in combination with HPAEC-PED has been shown to be of great value for the complementary information it can provide. Some other techniques, such as NMR spectroscopy, are also discussed, with relevant examples of recent applications. A number of articles have appeared in the literature in recent years regarding the analysis of inulin, FOS, and other carbohydrates of interest in the field and

  6. Response process and test–retest reliability of the Context Assessment for Community Health tool in Vietnam

    Directory of Open Access Journals (Sweden)

    Duong M. Duc

    2016-06-01

    Full Text Available Background: The recently developed Context Assessment for Community Health (COACH) tool aims to measure aspects of the local healthcare context perceived to influence knowledge translation in low- and middle-income countries. The tool measures eight dimensions (organizational resources, community engagement, monitoring services for action, sources of knowledge, commitment to work, work culture, leadership, and informal payment) through 49 items. Objective: The study aimed to explore the understanding and stability of the COACH tool among health providers in Vietnam. Design: To investigate the response process, think-aloud interviews were undertaken with five community health workers, six nurses and midwives, and five physicians. Identified problems were classified according to Conrad and Blair's taxonomy and grouped according to an estimation of the magnitude of the problem's effect on the response data. Further, the stability of the tool was examined using a test–retest survey among 77 respondents. The reliability was analyzed for items (intraclass correlation coefficient (ICC) and percent agreement) and dimensions (ICC and Bland–Altman plots). Results: In general, the think-aloud interviews revealed that the COACH tool was perceived as clear, well organized, and easy to answer. Most items were understood as intended. However, seven prominent problems in the items were identified and the content of three dimensions was perceived to be of a sensitive nature. In the test–retest survey, two-thirds of the items and seven of eight dimensions were found to have an ICC agreement ranging from moderate to substantial (0.5–0.7), demonstrating that the instrument has an acceptable level of stability. Conclusions: This study provides evidence that the Vietnamese translation of the COACH tool is generally perceived to be clear and easy to understand and has acceptable stability. There is, however, a need to rephrase and add generic examples to clarify

  7. Response process and test-retest reliability of the Context Assessment for Community Health tool in Vietnam.

    Science.gov (United States)

    Duc, Duong M; Bergström, Anna; Eriksson, Leif; Selling, Katarina; Thi Thu Ha, Bui; Wallin, Lars

    2016-01-01

    The recently developed Context Assessment for Community Health (COACH) tool aims to measure aspects of the local healthcare context perceived to influence knowledge translation in low- and middle-income countries. The tool measures eight dimensions (organizational resources, community engagement, monitoring services for action, sources of knowledge, commitment to work, work culture, leadership, and informal payment) through 49 items. The study aimed to explore the understanding and stability of the COACH tool among health providers in Vietnam. To investigate the response process, think-aloud interviews were undertaken with five community health workers, six nurses and midwives, and five physicians. Identified problems were classified according to Conrad and Blair's taxonomy and grouped according to an estimation of the magnitude of the problem's effect on the response data. Further, the stability of the tool was examined using a test-retest survey among 77 respondents. The reliability was analyzed for items (intraclass correlation coefficient (ICC) and percent agreement) and dimensions (ICC and Bland-Altman plots). In general, the think-aloud interviews revealed that the COACH tool was perceived as clear, well organized, and easy to answer. Most items were understood as intended. However, seven prominent problems in the items were identified and the content of three dimensions was perceived to be of a sensitive nature. In the test-retest survey, two-thirds of the items and seven of eight dimensions were found to have an ICC agreement ranging from moderate to substantial (0.5-0.7), demonstrating that the instrument has an acceptable level of stability. This study provides evidence that the Vietnamese translation of the COACH tool is generally perceived to be clear and easy to understand and has acceptable stability. There is, however, a need to rephrase and add generic examples to clarify some items and to further review items with low ICC.
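Item-level test–retest reliability in this record combines an ICC for agreement with percent agreement. A minimal pure-Python sketch of both statistics, using the standard two-way random-effects, absolute-agreement, single-measure formulation ICC(2,1); this is a generic textbook computation, not the authors' analysis code:

```python
def icc_2_1(data):
    """ICC(2,1): two-way random effects, absolute agreement, single measure.
    data: list of per-subject lists, one value per occasion (or rater)."""
    n, k = len(data), len(data[0])
    grand = sum(sum(row) for row in data) / (n * k)
    row_means = [sum(row) / k for row in data]
    col_means = [sum(row[j] for row in data) / n for j in range(k)]
    msr = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)   # subjects
    msc = n * sum((m - grand) ** 2 for m in col_means) / (k - 1)   # occasions
    sse = sum((data[i][j] - row_means[i] - col_means[j] + grand) ** 2
              for i in range(n) for j in range(k))
    mse = sse / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

def percent_agreement(test, retest):
    """Share of respondents giving the identical answer on both occasions."""
    return sum(a == b for a, b in zip(test, retest)) / len(test)
```

With identical answers on both occasions the ICC is exactly 1; values of 0.5–0.7, as reported above, indicate moderate to substantial stability.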

  8. NMD Classifier: A reliable and systematic classification tool for nonsense-mediated decay events.

    Directory of Open Access Journals (Sweden)

    Min-Kung Hsu

    Full Text Available Nonsense-mediated decay (NMD) degrades mRNAs that include premature termination codons to avoid the translation and accumulation of truncated proteins. This mechanism has been found to participate in gene regulation and a wide spectrum of biological processes. However, the evolutionary and regulatory origins of NMD-targeted transcripts (NMDTs) have been less studied, partly because of the complexity in analyzing NMD events. Here we report NMD Classifier, a tool for systematic classification of NMD events for either annotated or de novo assembled transcripts. This tool is based on the assumption of minimal evolution/regulation-an event that leads to the least change is the most likely to occur. Our simulation results indicate that NMD Classifier can correctly identify an average of 99.3% of the NMD-causing transcript structural changes, particularly exon inclusions/exclusions and exon boundary alterations. Researchers can apply NMD Classifier to evolutionary and regulatory studies by comparing NMD events of different biological conditions or in different organisms.
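Classifying an NMD event presupposes a rule for calling a termination codon premature. A minimal sketch of the commonly used 50-nucleotide rule (a stop codon lying more than ~50 nt upstream of the last exon–exon junction is predicted to trigger NMD); the coordinates and function name are illustrative, and this is not the NMD Classifier algorithm itself:

```python
def is_ptc(stop_codon_end, exon_lengths, rule_nt=50):
    """Predict whether a stop codon is NMD-triggering by the 50-nt rule.

    stop_codon_end: 1-based transcript coordinate of the stop codon's last base.
    exon_lengths:   lengths of the transcript's exons, listed 5' to 3'.
    """
    if len(exon_lengths) < 2:
        # No exon-exon junction downstream: canonical NMD is not triggered.
        return False
    last_junction = sum(exon_lengths[:-1])  # coordinate of the final junction
    return last_junction - stop_codon_end > rule_nt
```

A stop codon well inside an internal exon is flagged, while one in or near the last exon is treated as a normal termination codon.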

  9. Reliability and validity of a novel tool to comprehensively assess food and beverage marketing in recreational sport settings.

    Science.gov (United States)

    Prowse, Rachel J L; Naylor, Patti-Jean; Olstad, Dana Lee; Carson, Valerie; Mâsse, Louise C; Storey, Kate; Kirk, Sara F L; Raine, Kim D

    2018-05-31

    Current methods for evaluating food marketing to children often study a single marketing channel or approach. As the World Health Organization urges the removal of unhealthy food marketing in children's settings, methods that comprehensively explore the exposure and power of food marketing within a setting from multiple marketing channels and approaches are needed. The purpose of this study was to test the inter-rater reliability and the validity of a novel settings-based food marketing audit tool. The Food and beverage Marketing Assessment Tool for Settings (FoodMATS) was developed and its psychometric properties evaluated in five public recreation and sport facilities (sites) and subsequently used in 51 sites across Canada for a cross-sectional analysis of food marketing. Raters recorded the count of food marketing occasions, presence of child-targeted and sports-related marketing techniques, and the physical size of marketing occasions. Marketing occasions were classified by healthfulness. Inter-rater reliability was tested using Cohen's kappa (κ) and intra-class correlations (ICC). FoodMATS scores for each site were calculated using an algorithm that represented the theoretical impact of the marketing environment on food preferences, purchases, and consumption. Higher FoodMATS scores represented sites with higher exposure to, and more powerful (unhealthy, child-targeted, sports-related, large) food marketing. Validity of the scoring algorithm was tested through (1) Pearson's correlations between FoodMATS scores and facility sponsorship dollars, and (2) sequential multiple regression for predicting "Least Healthy" food sales from FoodMATS scores. Inter-rater reliability was very good to excellent (κ = 0.88–1.00). […] In recreation facilities, the FoodMATS provides a novel means to comprehensively track changes in food marketing environments that can assist in developing and monitoring the impact of policies and interventions.
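Inter-rater reliability in the FoodMATS study is summarized with Cohen's κ. A minimal pure-Python sketch of the statistic for two raters' categorical codes (the generic chance-corrected agreement formula, not the authors' software):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(rater_a)
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement: product of each rater's marginal category rates.
    p_expected = sum((freq_a[c] / n) * (freq_b[c] / n)
                     for c in set(freq_a) | set(freq_b))
    return (p_observed - p_expected) / (1 - p_expected)
```

κ = 1 means perfect agreement beyond chance; the 0.88–1.00 range reported above is conventionally read as very good to excellent.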

  10. 3D photography is a reliable burn wound area assessment tool compared to digital planimetry in very young children.

    Science.gov (United States)

    Gee Kee, E L; Kimble, R M; Stockton, K A

    2015-09-01

    Reliability and validity of 3D photography (3D LifeViz™ System) compared to digital planimetry (Visitrak™) has been established in a compliant cohort of children with acute burns. Further research is required to investigate these assessment tools in children representative of the general pediatric burns population, specifically children under the age of three years. To determine if 3D photography is a reliable wound assessment tool compared to Visitrak™ in children of all ages with acute burns ≤10% TBSA. Ninety-six children (median age 1 year 9 months) who presented to the Royal Children's Hospital Brisbane with an acute burn ≤10% TBSA were recruited into the study. Wounds were measured at the first dressing change using the Visitrak™ system and 3D photography. All measurements were completed by one investigator and level of agreement between wound surface area measurements was calculated. Wound surface area measurements were complete (i.e. participants had measurements from both techniques) for 75 participants. Level of agreement between wound surface area measurements calculated using an intra-class correlation coefficient (ICC) was excellent (ICC 0.96, 95% CI 0.93, 0.97). Visitrak™ tracings could not be completed in 19 participants with 16 aged less than two years. 3D photography could not be completed for one participant. Barriers to completing tracings were: excessive movement, pain, young age or wound location (e.g. face or perineum). This study has confirmed 3D photography as a reliable alternative to digital planimetry in children of all ages with acute burns ≤10% TBSA. In addition, 3D photography is more suitable for very young children given its non-invasive nature. Copyright © 2015 Elsevier Ltd and ISBI. All rights reserved.
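Method-comparison studies like this one typically pair an ICC with Bland–Altman limits of agreement between the two area measurements. A minimal sketch of the Bland–Altman computation (a generic method; this abstract itself reports only the ICC, so the pairing is an assumption):

```python
import statistics

def bland_altman_limits(method_a, method_b, z=1.96):
    """Bias (mean difference) and 95% limits of agreement between two
    measurement methods applied to the same wounds."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)  # sample SD (ddof=1)
    return bias, (bias - z * sd, bias + z * sd)
```

A bias near zero with narrow limits supports treating the two techniques as interchangeable, complementing a high ICC.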

  11. Fast neutrons: Inexpensive and reliable tool to investigate high-LET particle radiobiology

    International Nuclear Information System (INIS)

    Gueulette, J.; Slabbert, J.P.; Bischoff, P.; Denis, J.M.; Wambersie, A.; Jones, D.

    2010-01-01

    Radiation therapy with carbon ions, as well as missions into outer space, has boosted interest in high-LET particle radiobiology. Optimization of treatments in accordance with technical developments, as well as the radioprotection of cosmonauts during long missions, requires that research in these domains continue, and suitable radiation fields are therefore needed. Fast neutrons and carbon ions exhibit comparable LET values and similar radiobiological properties. Consequently, the findings obtained with each radiation quality can be shared to benefit knowledge in all concerned domains. The p(66)+Be neutron therapy facility of iThemba LABS (South Africa) and the p(65)+Be neutron facility of Louvain-la-Neuve (Belgium) are in constant use for radiobiological research supporting clinical applications of fast neutrons. These beams - which comply with all physical and technical requirements for clinical applications - are now fully reliable, easy to use and frequently accessible for radiobiological investigations. These facilities thus provide unique opportunities to undertake radiobiological experimentation, especially for investigations that require long irradiation times and/or fractionated treatments.

  12. GPUs, a new tool of acceleration in CFD: efficiency and reliability on smoothed particle hydrodynamics methods.

    Directory of Open Access Journals (Sweden)

    Alejandro C Crespo

    Full Text Available Smoothed Particle Hydrodynamics (SPH) is a numerical method commonly used in Computational Fluid Dynamics (CFD) to simulate complex free-surface flows. Simulations with this mesh-free particle method far exceed the capacity of a single processor. In this paper, as part of a dual-functioning code for either central processing units (CPUs) or Graphics Processor Units (GPUs), a parallelisation using GPUs is presented. The GPU parallelisation technique uses the Compute Unified Device Architecture (CUDA) of nVidia devices. Simulations with more than one million particles on a single GPU card exhibit speedups of up to two orders of magnitude over using a single-core CPU. It is demonstrated that the code achieves different speedups with different CUDA-enabled GPUs. The numerical behaviour of the SPH code is validated with a standard benchmark test case of dam break flow impacting on an obstacle, where good agreement with the experimental results is observed. Both the achieved speedups and the quantitative agreement with experiments suggest that CUDA-based GPU programming can be used in SPH methods with efficiency and reliability.
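The core operation that the GPU accelerates in SPH is, for every particle, a kernel-weighted sum over its neighbours. A minimal serial sketch of that operation in 1D with the standard cubic spline kernel; the particle data are illustrative, and a production code would of course vectorize or parallelize the double loop:

```python
def cubic_spline_w(r, h):
    """1D cubic spline SPH kernel with support radius 2h."""
    q = abs(r) / h
    sigma = 2.0 / (3.0 * h)  # 1D normalization constant
    if q <= 1.0:
        return sigma * (1.0 - 1.5 * q**2 + 0.75 * q**3)
    if q <= 2.0:
        return sigma * 0.25 * (2.0 - q)**3
    return 0.0

def sph_density(positions, masses, h):
    """Summation density: rho_i = sum_j m_j * W(|x_i - x_j|, h)."""
    return [sum(m_j * cubic_spline_w(x_i - x_j, h)
                for x_j, m_j in zip(positions, masses))
            for x_i in positions]
```

Each particle's density sum is independent of the others, which is exactly why the method maps so well onto thousands of CUDA threads.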

  13. Assessing functional mobility in survivors of lower-extremity sarcoma: reliability and validity of a new assessment tool.

    Science.gov (United States)

    Marchese, Victoria G; Rai, Shesh N; Carlson, Claire A; Hinds, Pamela S; Spearing, Elena M; Zhang, Lijun; Callaway, Lulie; Neel, Michael D; Rao, Bhaskar N; Ginsberg, Jill P

    2007-08-01

    Reliability and validity of a new tool, Functional Mobility Assessment (FMA), were examined in patients with lower-extremity sarcoma. FMA requires the patients to physically perform the functional mobility measures, unlike patient self-report or clinician administered measures. A sample of 114 subjects participated, 20 healthy volunteers and 94 patients with lower-extremity sarcoma after amputation, limb-sparing, or rotationplasty surgery. Reliability of the FMA was examined by three raters testing 20 healthy volunteers and 23 subjects with lower-extremity sarcoma. Concurrent validity was examined using data from 94 subjects with lower-extremity sarcoma who completed the FMA, Musculoskeletal Tumor Society (MSTS), Short-Form 36 (SF-36v2), and Toronto Extremity Salvage Scale (TESS) scores. Construct validity was measured by the ability of the FMA to discriminate between subjects with and without functional mobility deficits. FMA demonstrated excellent reliability (ICC [2,1] ≥ 0.97). Moderate correlations were found between FMA and SF-36v2 (r = 0.60, P < 0.01), FMA and MSTS (r = 0.68, P < 0.01), and FMA and TESS (r = 0.62, P < 0.01). The patients with lower-extremity sarcoma scored lower on the FMA as compared to healthy controls (P < 0.01). The FMA is a reliable and valid functional outcome measure for patients with lower-extremity sarcoma. This study supports the ability of the FMA to discriminate between patients with varying functional abilities and supports the need to include measures of objective functional mobility in examination of patients with lower-extremity sarcoma.
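Concurrent validity here is quantified with Pearson correlations between the FMA and established scales. A minimal pure-Python sketch of the Pearson product-moment correlation (the generic formula, not the study's statistical software; score lists in the test are invented):

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

Values around 0.6-0.7, as reported above, are conventionally read as moderate correlations between instruments measuring related but distinct constructs.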

  14. Improved reliability of serological tools for the diagnosis of West Nile fever in horses within Europe.

    Directory of Open Access Journals (Sweden)

    Cécile Beck

    2017-09-01

    Full Text Available West Nile Fever is a zoonotic disease caused by a mosquito-borne flavivirus, WNV. Owing to its clinical sensitivity to the disease, the horse is a useful sentinel of infection. Because of the virus' low-level, short-term viraemia in horses, the primary tools used to diagnose WNV are serological tests. Inter-laboratory proficiency tests (ILPTs) were held in 2010 and 2013 to evaluate WNV serological diagnostic tools suited for the European network of National Reference Laboratories (NRLs) for equine diseases. These ILPTs were designed to evaluate the laboratories' and methods' performances in detecting WNV infection in horses through serology. The detection of WNV immunoglobulin G (IgG) antibodies by ELISA is widely used in Europe, with 17 NRLs in 2010 and 20 NRLs in 2013 using IgG WNV assays. Thanks to the development of new commercial IgM capture kits, WNV IgM capture ELISAs were rapidly implemented in NRLs between 2010 (4 NRLs) and 2013 (13 NRLs). The use of kits allowed the quick standardisation of WNV IgG and IgM detection assays in NRLs, with more than 95% (20/21) and 100% (13/13) of satisfactory results, respectively, in 2013. Conversely, virus neutralisation tests (VNTs) were implemented in 33% (7/21) of NRLs in 2013, and their low sensitivity was evidenced in 29% (2/7) of NRLs during this ILPT. A comparison of serological diagnostic methods highlighted the higher sensitivity of IgG ELISAs compared to WNV VNTs. It also revealed that the low specificity of IgG ELISA kits meant that they could detect animals infected with other flaviviruses. In contrast, VNT and IgM ELISA assays were highly specific and did not detect antibodies against related flaviviruses. These results argue in favour of the need for and development of new, specific serological diagnostic assays that could be easily transferred to partner laboratories.
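Assay performance in such proficiency tests reduces to sensitivity and specificity against a reference panel. A minimal sketch of that tabulation; the panel in the test is invented for illustration, not ILPT data:

```python
def sensitivity_specificity(expected, obtained):
    """Sensitivity and specificity of an assay against a reference panel.
    expected/obtained: lists of booleans (True = positive result)."""
    tp = sum(e and o for e, o in zip(expected, obtained))
    fn = sum(e and not o for e, o in zip(expected, obtained))
    tn = sum(not e and not o for e, o in zip(expected, obtained))
    fp = sum(not e and o for e, o in zip(expected, obtained))
    return tp / (tp + fn), tn / (tn + fp)
```

The trade-off reported above (IgG ELISA: sensitive but cross-reactive; VNT and IgM ELISA: highly specific) is exactly what these two numbers separate.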

  15. An analytical method on the surface residual stress for the cutting tool orientation

    Science.gov (United States)

    Li, Yueen; Zhao, Jun; Wang, Wei

    2010-03-01

    In the experiments reported here, the residual stress was measured for eight cutting tool orientations when machining H13 die steel by high-speed milling (HSM). The measured data show that the residual stress varies periodically with the rake angle (β) and side rake angle (θ), and further study finds a close relationship between cutting tool orientation and residual stress. Because the machined surface residual stress originates from the cutting force and the axial force, a simple tool-workpiece force model can be constructed, from which a residual stress model is derived that makes it feasible to calculate the magnitude of the residual stress. Since almost all of the measured residual stresses are compressive, both their magnitude and direction can be determined from the input data for H13 on HSM. As a result, the residual stress model is the key to the theoretical optimization of the rake angle (β) and side rake angle (θ), and the underlying cutting mechanism can be further explored with this theory.

  16. Study of an ultrasound-based process analytical tool for homogenization of nanoparticulate pharmaceutical vehicles.

    Science.gov (United States)

    Cavegn, Martin; Douglas, Ryan; Akkermans, Guy; Kuentz, Martin

    2011-08-01

    There are currently no adequate process analyzers for nanoparticulate viscosity enhancers. This article aims to evaluate ultrasonic resonator technology as a monitoring tool for homogenization of nanoparticulate gels. Aqueous dispersions of colloidal microcrystalline cellulose (MCC) and a mixture of clay particles with xanthan gum were compared with colloidal silicon dioxide in oil. The processing was conducted using a laboratory-scale homogenizing vessel. The study investigated first the homogenization kinetics of the different systems to focus then on process factors in the case of colloidal MCC. Moreover, rheological properties were analyzed offline to assess the structure of the resulting gels. Results showed the suitability of ultrasound velocimetry to monitor the homogenization process. The obtained data were fitted using a novel heuristic model. It was possible to identify characteristic homogenization times for each formulation. The subsequent study of the process factors demonstrated that ultrasonic process analysis was as sensitive as offline rheological measurements in detecting subtle manufacturing changes. It can be concluded that the ultrasonic method was able to successfully assess homogenization of nanoparticulate viscosity enhancers. This novel technique can become a vital tool for development and production of pharmaceutical suspensions in the future. Copyright © 2011 Wiley-Liss, Inc.
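Identifying a characteristic homogenization time from ultrasound-velocity traces amounts to fitting a kinetic model. The abstract does not specify the authors' heuristic model, so the sketch below is a hedged stand-in: a generic first-order approach-to-plateau model fitted by linearization, with the plateau value `v_inf` assumed known or estimated separately:

```python
import math

def fit_first_order(times, values, v_inf):
    """Least-squares fit of v(t) = v_inf + (v0 - v_inf) * exp(-t / tau),
    linearized as log|v - v_inf| = log|v0 - v_inf| - t / tau.
    Returns (v0, tau); tau is the characteristic homogenization time."""
    ys = [math.log(abs(v - v_inf)) for v in values]
    n = len(times)
    mt, my = sum(times) / n, sum(ys) / n
    slope = (sum((t - mt) * (y - my) for t, y in zip(times, ys))
             / sum((t - mt) ** 2 for t in times))
    intercept = my - slope * mt
    tau = -1.0 / slope
    sign = 1.0 if values[0] >= v_inf else -1.0  # recover the lost sign of v0 - v_inf
    return v_inf + sign * math.exp(intercept), tau
```

On synthetic first-order data the fit recovers the generating parameters exactly, which is a useful sanity check before applying it to noisy traces.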

  17. Tools for the quantitative analysis of sedimentation boundaries detected by fluorescence optical analytical ultracentrifugation.

    Directory of Open Access Journals (Sweden)

    Huaying Zhao

    Full Text Available Fluorescence optical detection in sedimentation velocity analytical ultracentrifugation allows the study of macromolecules at nanomolar concentrations and below. This has significant promise, for example, for the study of systems of high-affinity protein interactions. Here we describe adaptations of the direct boundary modeling analysis approach implemented in the software SEDFIT that were developed to accommodate unique characteristics of the confocal fluorescence detection system. These include spatial gradients of signal intensity due to scanner movements out of the plane of rotation, temporal intensity drifts due to instability of the laser and fluorophores, and masking of the finite excitation and detection cone by the sample holder. In an extensive series of experiments with enhanced green fluorescent protein ranging from low nanomolar to low micromolar concentrations, we show that the experimental data provide sufficient information to determine the parameters required for first-order approximation of the impact of these effects on the recorded data. Systematic deviations of fluorescence optical sedimentation velocity data analyzed using conventional sedimentation models developed for absorbance and interference optics are largely removed after these adaptations, resulting in excellent fits that highlight the high precision of fluorescence sedimentation velocity data, thus allowing a more detailed quantitative interpretation of the signal boundaries that is otherwise not possible for this system.

  18. Thermal Lens Spectroscopy as a 'new' analytical tool for actinide determination in nuclear reprocessing processes

    International Nuclear Information System (INIS)

    Canto, Fabrice; Couston, Laurent; Magnaldo, Alastair; Broquin, Jean-Emmanuel; Signoret, Philippe

    2008-01-01

    Thermal Lens Spectroscopy (TLS) consists of measuring the effects induced by the relaxation of molecules excited by photons. The CEA worked on TLS twenty years ago, but technological limitations impeded its development at the time. Today, the need for sensitive analytical methods coupled with very low sample volumes (for example, for traces of Np in the COEX™ process), together with the drive to reduce nuclear waste, encourages us to revisit this method, taking advantage of improved optoelectronic technologies. TLS can also be coupled with micro-fluidic technologies, significantly decreasing the cost of experiments. Generally, two laser beams are used for TLS: one for selective excitation by molecular absorption (inducing the thermal lens) and one for probing the thermal lens. They can be combined in different geometries, collinear or perpendicular, depending on the application and on the laser mode. Many ways of detecting the thermal lens signal have also been studied: interferometry, direct intensity variations, deflection, etc. In this paper, one geometrical configuration and two measurement schemes have been theoretically evaluated. For single-photodiode detection (z-scan), the limit of detection is calculated to be near 5×10⁻⁶ mol·L⁻¹ for Np(IV) in dodecane. (authors)

  19. External beam milli-PIXE as analytical tool for Neolithic obsidian provenance studies

    Energy Technology Data Exchange (ETDEWEB)

    Constantinescu, B.; Cristea-Stan, D. [National Institute for Nuclear Physics and Engineering Horia Hulubei, Bucharest-Magurele (Romania); Kovács, I.; Szõkefalvi-Nagy, Z. [Wigner Research Centre for Physics, Institute for Particle and Nuclear Physics, Budapest (Hungary)

    2013-07-01

    Full text: Obsidian is the most important archaeological material used for tools and weapons before the appearance of metals. Its geological sources are limited and concentrated in a few geographical zones: Armenia, Eastern Anatolia, the Italian islands of Lipari and Sardinia, the Greek islands of Melos and Yali, and the Hungarian and Slovak Tokaj Mountains. Because of this, in the Mesolithic and Neolithic periods obsidian was the first archaeological material to be intensively traded, even over long distances. Elemental concentration ratios can help a lot in determining the geological provenance of obsidian and in identifying prehistoric long-range trade routes and possible population migrations, since each geological source has its own 'fingerprints'. In this work the external milli-PIXE technique was applied to determine elemental concentration ratios in Neolithic tools found in Transylvania and in the Iron Gates region near the Danube, and in a few relevant geological samples (Slovak Tokaj Mountains, Lipari, Armenia). In Transylvania (the north-western part of Romania, a region surrounded by the Carpathian Mountains), Neolithic obsidian tools were discovered mainly in three regions: the North-West around Oradea (near the border with Hungary, Slovakia and Ukraine), the Centre around Cluj, and the South-West in Banat (near the border with Serbia). A special case is the Iron Gates, with Mesolithic and Early Neolithic sites directly related to the appearance of agriculture replacing the Mesolithic economy based on hunting and fishing. Three long-distance trade routes could be considered for obsidian: from the Caucasus Mountains via the north of the Black Sea, from the Greek islands or Asia Minor via the former Yugoslavia or via Greece and Bulgaria, or from Central Europe (Tokaj Mountains). As provenance 'fingerprints', we focused on the Ti to Mn and Rb-Sr-Y-Zr ratios. The measurements were performed at the external milli-PIXE beamline of the 5 MV VdG accelerator of the Wigner RCP, with a proton energy of 3 MeV and beam currents in the range of 1 […]

  20. External beam milli-PIXE as analytical tool for Neolithic obsidian provenance studies

    International Nuclear Information System (INIS)

    Constantinescu, B.; Cristea-Stan, D.; Kovács, I.; Szõkefalvi-Nagy, Z.

    2013-01-01

    Full text: Obsidian is the most important archaeological material used for tools and weapons before the appearance of metals. Its geological sources are limited and concentrated in a few geographical zones: Armenia, Eastern Anatolia, the Italian islands of Lipari and Sardinia, the Greek islands of Melos and Yali, and the Hungarian and Slovak Tokaj Mountains. Because of this, in the Mesolithic and Neolithic periods obsidian was the first archaeological material to be intensively traded, even over long distances. Elemental concentration ratios can help a lot in determining the geological provenance of obsidian and in identifying prehistoric long-range trade routes and possible population migrations, since each geological source has its own 'fingerprints'. In this work the external milli-PIXE technique was applied to determine elemental concentration ratios in Neolithic tools found in Transylvania and in the Iron Gates region near the Danube, and in a few relevant geological samples (Slovak Tokaj Mountains, Lipari, Armenia). In Transylvania (the north-western part of Romania, a region surrounded by the Carpathian Mountains), Neolithic obsidian tools were discovered mainly in three regions: the North-West around Oradea (near the border with Hungary, Slovakia and Ukraine), the Centre around Cluj, and the South-West in Banat (near the border with Serbia). A special case is the Iron Gates, with Mesolithic and Early Neolithic sites directly related to the appearance of agriculture replacing the Mesolithic economy based on hunting and fishing. Three long-distance trade routes could be considered for obsidian: from the Caucasus Mountains via the north of the Black Sea, from the Greek islands or Asia Minor via the former Yugoslavia or via Greece and Bulgaria, or from Central Europe (Tokaj Mountains). As provenance 'fingerprints', we focused on the Ti to Mn and Rb-Sr-Y-Zr ratios. The measurements were performed at the external milli-PIXE beamline of the 5 MV VdG accelerator of the Wigner RCP, with a proton energy of 3 MeV and beam currents in the range of 1 […]
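Provenance assignment from such 'fingerprint' ratios is, at its simplest, a nearest-centroid comparison between an artefact and the geological reference samples. A minimal sketch in ratio space; the reference centroids below are hypothetical placeholders, not measured values from this work:

```python
import math

# Hypothetical (Ti/Mn, Rb/Sr) centroids per source -- illustrative only,
# NOT measured values from the paper.
REFERENCE_SOURCES = {
    "Tokaj": (5.2, 1.8),
    "Lipari": (9.1, 0.6),
    "Armenia": (3.4, 2.9),
}

def assign_source(sample_ratios, references=REFERENCE_SOURCES):
    """Assign an artefact to the geological source whose elemental-ratio
    centroid is nearest in Euclidean distance."""
    return min(references,
               key=lambda src: math.dist(sample_ratios, references[src]))
```

In practice one would use more ratios, normalize each axis, and report the distance itself so that artefacts matching no known source stand out.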

  1. Titer on chip: new analytical tool for influenza vaccine potency determination.

    Directory of Open Access Journals (Sweden)

    Laura R Kuck

    Full Text Available Titer on Chip (Flu-ToC) is a new technique for quantification of influenza hemagglutinin (HA) concentration. In order to evaluate the potential of this new technique, a comparison of Flu-ToC to more conventional methods was conducted using recombinant HA produced in a baculovirus expression system as a test case. Samples from current vaccine strains were collected from four different steps in the manufacturing process. A total of 19 samples were analysed by Flu-ToC (blinded), single radial immunodiffusion (SRID), an enzyme-linked immunosorbent assay (ELISA), and the purity adjusted bicinchoninic acid assay (paBCA). The results indicated reasonable linear correlation between Flu-ToC and SRID, ELISA, and paBCA, with regression slopes of log-log plots being 0.91, 1.03, and 0.91, respectively. The average ratio for HA content measured by Flu-ToC relative to SRID, ELISA, and paBCA was 83%, 147%, and 81%, respectively; indicating nearly equivalent potency determination for Flu-ToC relative to SRID and paBCA. These results, combined with demonstrated multiplexed analysis of all components within a quadrivalent formulation and robust response to HA strains over a wide time period, support the conclusion that Flu-ToC can be used as a reliable and time-saving alternative potency assay for influenza vaccines.
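The assay comparison above rests on two summary numbers: the slope of a log-log regression (near 1 means the assays scale proportionally) and the average content ratio. A minimal sketch of both computations (generic least-squares on log-transformed values; the data in the test are invented, not the study's measurements):

```python
import math

def loglog_slope(reference, candidate):
    """Slope of the least-squares line through (log x, log y) pairs."""
    xs = [math.log(v) for v in reference]
    ys = [math.log(v) for v in candidate]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

def mean_ratio_percent(reference, candidate):
    """Average candidate/reference content, expressed as a percentage."""
    return 100 * sum(c / r for c, r in zip(candidate, reference)) / len(reference)
```

A candidate assay that reads systematically low but proportionally (e.g. always 80% of the reference) gives a slope of 1.0 with a mean ratio of 80%, the same pattern as the Flu-ToC vs SRID comparison.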

  2. Usefulness of analytical parameters in the management of paediatric patients with suspicion of acute pyelonephritis. Is procalcitonin reliable?

    Science.gov (United States)

    Bañuelos-Andrío, L; Espino-Hernández, M; Ruperez-Lucas, M; Villar-Del Campo, M C; Romero-Carrasco, C I; Rodríguez-Caravaca, G

    To investigate the usefulness of procalcitonin (PCT) and other analytical parameters (white blood cell count [WBC], C-reactive protein [CRP]) as markers of acute renal damage in children after a first febrile or afebrile urinary tract infection (UTI). A retrospective study was conducted on children with a first episode of UTI admitted between January 2009 and December 2011, in whom serum PCT, CRP and white blood cell count were measured, and in whom acute renal damage was assessed by renal scintigraphy with 99m Tc-DMSA (DMSA) within the first 72h after referral. A descriptive study was performed and ROC curves were plotted, with optimal cut-off points calculated for each parameter. The 101 enrolled patients were divided into two groups according to DMSA scintigraphy results, with 64 patients being classified with acute pyelonephritis (APN), and 37 with UTI. The mean WBC, CRP and PCT values were significantly higher in patients with APN than in those with a normal acute DMSA. The area under the ROC curve was 0.862 for CRP, 0.774 for WBC, and 0.731 for PCT. The optimum statistical cut-off value for PCT was 0.285 ng/ml (sensitivity 71.4% and specificity 75%). Although the mean levels of fever, WBC, CRP, and PCT were significantly higher in patients with APN than in those with UTI, the sensitivity and specificity of these analytical parameters are insufficient to predict the existence of acute renal damage, making the contribution of renal DMSA scintigraphy essential. Copyright © 2016 Elsevier España, S.L.U. y SEMNIM. All rights reserved.
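The "optimum statistical cut-off" reported for PCT is typically found by maximizing Youden's J along the ROC curve. A minimal pure-Python sketch of the ROC AUC (via the Mann–Whitney formulation) and the Youden cutoff; the scores in the test are invented, not patient data:

```python
def roc_auc(pos_scores, neg_scores):
    """AUC as the probability that a randomly chosen diseased case
    scores higher than a randomly chosen non-diseased case (ties count 0.5)."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

def youden_cutoff(pos_scores, neg_scores):
    """Threshold maximizing Youden's J = sensitivity + specificity - 1,
    scanning observed score values (test positive if score >= threshold)."""
    best_t, best_j = None, -1.0
    for t in sorted(set(pos_scores) | set(neg_scores)):
        sens = sum(p >= t for p in pos_scores) / len(pos_scores)
        spec = sum(n < t for n in neg_scores) / len(neg_scores)
        if sens + spec - 1 > best_j:
            best_t, best_j = t, sens + spec - 1
    return best_t, best_j
```

An AUC of 0.731, as reported for PCT, sits well below the ~0.9 usually expected of a stand-alone diagnostic marker, which is why the authors still consider DMSA scintigraphy essential.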

  3. An analytical tool to support the pedestrianisation process: The case of via Roma, Cagliari

    Directory of Open Access Journals (Sweden)

    Alfonso Annunziata

    2018-04-01

    Full Text Available The article focuses on the case of the modification of an urban road network: the transformation of a portion of an important distributor road in the urban area of Cagliari into a pedestrian space. By means of this case study the article aims to point out how pedestrianisation interventions have not been completely defined within a theoretical system that clearly establishes modes and conditions of implementation. This lack of theorization has led to the common understanding of pedestrianisation as a good operation in and of itself and, as such, exportable, meant to produce the same effects everywhere (Bianchetti, 2016). This analysis uses the fundamental conditions of hierarchy as a tool to assess to what extent the modification of the road network articulation has resulted in conditions of lesser inter-connectivity, legibility and functionality. In this perspective the article proposes a system of criteria, founded on the principles of hierarchy, meant to be a theoretical support for processes of pedestrianisation.
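The loss of inter-connectivity assessed here can be made concrete by measuring how removing a road segment changes average shortest-path length in the network. A minimal sketch using breadth-first search on an unweighted graph; the toy four-node network in the test stands in for the real street network and is purely illustrative:

```python
from collections import deque

def avg_shortest_path(edges):
    """Mean shortest-path length over all connected node pairs (BFS per node)."""
    graph = {}
    for a, b in edges:
        graph.setdefault(a, set()).add(b)
        graph.setdefault(b, set()).add(a)
    total, pairs = 0, 0
    for start in graph:
        dist = {start: 0}
        queue = deque([start])
        while queue:
            node = queue.popleft()
            for nxt in graph[node]:
                if nxt not in dist:
                    dist[nxt] = dist[node] + 1
                    queue.append(nxt)
        total += sum(d for node, d in dist.items() if node != start)
        pairs += len(dist) - 1
    return total / pairs
```

Comparing the metric before and after deleting the pedestrianised edge quantifies the "lesser inter-connectivity" the criteria are meant to detect.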

  4. Thermodynamics and structure of liquid surfaces investigated directly with surface analytical tools

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, Gunther [Flinders Univ., Adelaide, SA (Australia). Centre for NanoScale Science and Technology; Morgner, Harald [Leipzig Univ. (Germany). Wilhelm Ostwald Inst. for Physical and Theoretical Chemistry

    2017-06-15

    Measuring directly the composition, the distribution of constituents as a function of depth, and the orientation of molecules at liquid surfaces is essential for determining the physicochemical properties of liquid surfaces. While the experimental tools developed for analyzing solid surfaces can in principle be applied to liquid surfaces, it turned out that they had to be adjusted to the particular challenges imposed by liquid samples, e.g. the unavoidable vapor pressure and the mobility of the constituting atoms/molecules. In the present work it is shown how electron spectroscopy and ion scattering spectroscopy have been used for analyzing liquid surfaces. The emphasis of this review is on using the structural information gained to determine the physicochemical properties of liquid surfaces. (copyright 2017 by WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  5. Interactive network analytical tool for instantaneous bespoke interrogation of food safety notifications.

    Science.gov (United States)

    Nepusz, Tamás; Petróczi, Andrea; Naughton, Declan P

    2012-01-01

    The globalization of the food supply necessitates continued advances in regulatory control measures to ensure that citizens enjoy safe and adequate nutrition. The aim of this study was to extend previous reports on network analysis of food notifications by including an optional filter by type of notification and, in cases of contamination, by type of contaminant in the notified foodstuff. A filter function has been applied to enable processing of selected notifications by contaminant or type of notification to i) capture complexity, ii) analyze trends, and iii) identify patterns of reporting activities between countries. The program rapidly assesses nations' roles as transgressor and/or detector for each category of contaminant and for the key class of border rejection. In the open access demonstration version, the majority of notifications in the Rapid Alert System for Food and Feed were categorized by contaminant type as mycotoxin (50.4%), heavy metals (10.9%) or bacteria (20.3%). Examples are given demonstrating how network analytical approaches complement, and in some cases supersede, descriptive statistics such as frequency counts, which may give limited or potentially misleading information. One key feature is that network analysis takes the relationship between transgressor and detector countries, along with the number of reports and their impact, into consideration simultaneously. Furthermore, the indices that complement the network maps and reflect each country's transgressor and detector activities allow comparisons to be made between (transgressing vs. detecting) as well as within (e.g. transgressing) activities. This further development of the network analysis approach to food safety contributes to a better understanding of the complexity of the effort ensuring food is safe for consumption in the European Union. The unique patterns of the interplay between detectors and transgressors, instantly revealed by our approach, could supplement the intelligence
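
The filtering-and-aggregation idea above can be illustrated with a small, stdlib-only sketch. This is not the published tool; the notification records and country codes are invented for illustration.

```python
# Illustrative sketch of the core idea: filter notifications by
# contaminant type, then aggregate them into a transgressor -> detector
# network with per-country activity indices. All records are invented.
from collections import Counter

# (detecting country, transgressing country, contaminant type)
notifications = [
    ("DE", "TR", "mycotoxin"),
    ("IT", "TR", "mycotoxin"),
    ("FR", "TR", "mycotoxin"),
    ("DE", "CN", "heavy metals"),
]

def edge_weights(records, contaminant=None):
    """Weighted edges of the transgressor -> detector graph."""
    edges = Counter()
    for detector, transgressor, kind in records:
        if contaminant is None or kind == contaminant:
            edges[(transgressor, detector)] += 1
    return edges

def activity_indices(edges):
    """Out-strength (transgressor) and in-strength (detector) per country."""
    transgress, detect = Counter(), Counter()
    for (src, dst), w in edges.items():
        transgress[src] += w
        detect[dst] += w
    return transgress, detect

myco_edges = edge_weights(notifications, "mycotoxin")
transgress, detect = activity_indices(myco_edges)
```

The two Counters correspond to the per-country transgressor and detector indices the abstract describes; unlike a plain frequency count, the edge weights retain who reported whom.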

  7. Visual-Haptic Integration: Cue Weights are Varied Appropriately, to Account for Changes in Haptic Reliability Introduced by Using a Tool

    Directory of Open Access Journals (Sweden)

    Chie Takahashi

    2011-10-01

    Full Text Available Tools such as pliers systematically change the relationship between an object's size and the hand opening required to grasp it. Previous work suggests the brain takes this into account, integrating visual and haptic size information that refers to the same object, independent of the similarity of the ‘raw’ visual and haptic signals (Takahashi et al., VSS 2009). Variations in tool geometry also affect the reliability (precision) of haptic size estimates, however, because they alter the change in hand opening caused by a given change in object size. Here, we examine whether the brain appropriately adjusts the weights given to visual and haptic size signals when tool geometry changes. We first estimated each cue's reliability by measuring size-discrimination thresholds in vision-alone and haptics-alone conditions. We varied haptic reliability using tools with different object-size:hand-opening ratios (1:1, 0.7:1, and 1.4:1). We then measured the weights given to vision and haptics with each tool, using a cue-conflict paradigm. The weight given to haptics varied with tool type in a manner that was well predicted by the single-cue reliabilities (MLE model; Ernst and Banks, 2002). This suggests that the process of visual-haptic integration appropriately accounts for variations in haptic reliability introduced by different tool geometries.
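
The MLE cue-combination rule the study tests (Ernst and Banks, 2002) weights each cue by its inverse variance. A minimal sketch, with invented noise levels rather than the study's measured thresholds:

```python
# Minimal MLE cue-combination sketch: each cue's weight is its relative
# reliability (inverse variance). Noise values below are invented.

def mle_weights(sigma_visual, sigma_haptic):
    """Return (visual weight, haptic weight) from single-cue noise."""
    rv, rh = 1.0 / sigma_visual**2, 1.0 / sigma_haptic**2
    return rv / (rv + rh), rh / (rv + rh)

def combined_size(size_v, sigma_v, size_h, sigma_h):
    wv, wh = mle_weights(sigma_v, sigma_h)
    return wv * size_v + wh * size_h

# a 1.4:1 tool amplifies hand opening, so haptic discrimination
# thresholds shrink and the haptic weight should rise
_, wh_noisy = mle_weights(1.0, 2.0)    # 0.7:1-like tool, haptics noisy
_, wh_precise = mle_weights(1.0, 0.5)  # 1.4:1-like tool, haptics precise
```

The prediction tested in the abstract is exactly `wh_precise > wh_noisy`: as tool geometry makes haptics more precise, the haptic weight grows.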

  8. Procedures for treating common cause failures in safety and reliability studies: Volume 2, Analytic background and techniques: Final report

    International Nuclear Information System (INIS)

    Mosleh, A.; Fleming, K.N.; Parry, G.W.; Paula, H.M.; Worledge, D.H.; Rasmuson, D.M.

    1988-12-01

    This report presents a framework for the inclusion of the impact of common cause failures in risk and reliability evaluations. Common cause failures are defined as that subset of dependent failures for which causes are not explicitly included in the logic model as basic events. The emphasis here is on providing procedures for a practical, systematic approach that can be used to perform and clearly document the analysis. The framework and the methods discussed for performing the different stages of the analysis integrate insights obtained from engineering assessments of the system and the historical evidence from multiple failure events into a systematic, reproducible, and defensible analysis. This document, Volume 2, contains a series of appendices that provide additional background and methodological detail on several important topics discussed in Volume 1

  9. Development of a new analytical tool for assessing the mutagen 2-methyl-1,4-dinitro-pyrrole in meat products by LC-ESI-MS/MS.

    Science.gov (United States)

    Molognoni, Luciano; Daguer, Heitor; de Sá Ploêncio, Leandro Antunes; Yotsuyanagi, Suzana Eri; da Silva Correa Lemos, Ana Lucia; Joussef, Antonio Carlos; De Dea Lindner, Juliano

    2018-08-01

    The use of sorbate and nitrite in meat processing may lead to the formation of 2-methyl-1,4-dinitro-pyrrole (DNMP), a mutagenic compound. This work was aimed at developing and validating an analytical method for the quantitation of DNMP by liquid chromatography-tandem mass spectrometry. Full validation was performed in accordance with Commission Decision 2002/657/EC, and the method's applicability was checked in several samples of meat products. A simple procedure, with low-temperature partitioning solid-liquid extraction, was developed. Nitrosation during the extraction was monitored by the N-nitroso-DL-pipecolic acid content. Chromatographic separation was achieved in 8 min with di-isopropyl-3-aminopropyl silane bound to hydroxylated silica as the stationary phase. Samples of bacon and cooked sausage yielded the highest concentrations of DNMP (68 ± 3 and 50 ± 3 μg kg⁻¹, respectively). The developed method proved to be a reliable, selective, and sensitive tool for DNMP measurements in meat products. Copyright © 2018 Elsevier B.V. All rights reserved.

  10. Analytical tools and functions of GIS in the process control and decision support of mining company

    Directory of Open Access Journals (Sweden)

    Semrád Peter

    2001-12-01

    calculate their total quantity of reserves, and distance and network analyses (modelling protection pillars as buffer zones for operating objects above ground, calculating transport times for mineral resources, and using optimal routes combined with cost calculation) from the analytical apparatus and functions of GIS used in the process control and decision support of a mining company. Modern mining belongs to the specific group of fields with high information intensity. Because of the high financial demands of mining processes and technologies, the basic strategy of all mining companies is the utilization of information technologies to reduce expenses. The implementation of GIS in this area is, given its options and functions, ideal.

  11. Experimental anti-GBM nephritis as an analytical tool for studying spontaneous lupus nephritis.

    Science.gov (United States)

    Du, Yong; Fu, Yuyang; Mohan, Chandra

    2008-01-01

    Systemic lupus erythematosus (SLE) is an autoimmune disease that results in immune-mediated damage to multiple organs. Among these, kidney involvement is the most common and most fatal. Spontaneous lupus nephritis (SLN) in mouse models has provided valuable insights into the underlying mechanisms of human lupus nephritis. However, SLN in mouse models takes 6-12 months to manifest; hence there is clearly a need for a mouse model that can be used to unveil the pathogenic processes that lead to immune nephritis over a shorter time frame. This article reviews more than 25 different molecules that have been studied in both the anti-glomerular basement membrane (anti-GBM) model and SLN; these molecules were found to influence both diseases in a parallel fashion, suggesting that the two disease settings share common molecular mechanisms. Based on these observations, the authors believe the experimental anti-GBM disease model might be one of the best tools currently available for uncovering the downstream molecular mechanisms leading to SLN.

  12. Evaluation of Heat Flux Measurement as a New Process Analytical Technology Monitoring Tool in Freeze Drying.

    Science.gov (United States)

    Vollrath, Ilona; Pauli, Victoria; Friess, Wolfgang; Freitag, Angelika; Hawe, Andrea; Winter, Gerhard

    2017-05-01

    This study investigates the suitability of heat flux measurement as a new technique for monitoring product temperature and critical end points during freeze drying. The heat flux sensor is tightly mounted on the shelf and measures non-invasively (no contact with the product) the heat transferred from shelf to vial. Heat flux data were compared to comparative pressure measurement, thermocouple readings, and Karl Fischer titration as current state-of-the-art monitoring techniques. The whole freeze drying process, including freezing (both by ramp freezing and controlled nucleation) and primary and secondary drying, was considered. We found that direct measurement of the transferred heat enables more insight into the thermodynamics of the freezing process. Furthermore, a vial heat transfer coefficient can be calculated from heat flux data, which ultimately provides a non-invasive method to monitor product temperature throughout primary drying. The end point of primary drying determined by heat flux measurements was in accordance with the one defined by thermocouples. During secondary drying, heat flux measurements could not indicate the progress of drying in the way that monitoring the residual moisture content can. In conclusion, heat flux measurement is a promising new non-invasive tool for lyophilization process monitoring and development using energy transfer as a control parameter. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
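
The vial heat transfer coefficient mentioned above follows from a simple relation between measured heat flux and the shelf-to-product temperature difference. A hedged sketch (symbols and numbers are illustrative, not taken from the paper):

```python
# Kv links the measured heat flux to the shelf-to-product temperature
# difference; inverting the relation gives a non-invasive estimate of
# product temperature during primary drying. Values are invented.

def kv_from_heat_flux(q_flux, t_shelf, t_product):
    """Vial heat transfer coefficient, W m^-2 K^-1 (q_flux in W m^-2)."""
    return q_flux / (t_shelf - t_product)

def product_temperature(q_flux, kv, t_shelf):
    """Invert the relation to monitor product temperature from q_flux."""
    return t_shelf - q_flux / kv

kv = kv_from_heat_flux(100.0, 253.15, 233.15)   # ~5 W m^-2 K^-1
tp = product_temperature(100.0, kv, 253.15)     # recovers ~233.15 K
```

Once Kv is calibrated, only the heat flux and shelf temperature (both measured without touching the product) are needed to track product temperature.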

  13. DR-Integrator: a new analytic tool for integrating DNA copy number and gene expression data.

    Science.gov (United States)

    Salari, Keyan; Tibshirani, Robert; Pollack, Jonathan R

    2010-02-01

    DNA copy number alterations (CNA) frequently underlie gene expression changes by increasing or decreasing gene dosage. However, only a subset of genes with altered dosage exhibit concordant changes in gene expression. This subset is likely to be enriched for oncogenes and tumor suppressor genes, and can be identified by integrating these two layers of genome-scale data. We introduce DNA/RNA-Integrator (DR-Integrator), a statistical software tool to perform integrative analyses on paired DNA copy number and gene expression data. DR-Integrator identifies genes with significant correlations between DNA copy number and gene expression, and implements a supervised analysis that captures genes with significant alterations in both DNA copy number and gene expression between two sample classes. DR-Integrator is freely available for non-commercial use from the Pollack Lab at http://pollacklab.stanford.edu/ and can be downloaded as a plug-in application to Microsoft Excel and as a package for the R statistical computing environment. The R package is available under the name 'DRI' at http://cran.r-project.org/. An example analysis using DR-Integrator is included as supplemental material. Supplementary data are available at Bioinformatics online.
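
The core statistic behind the integrative analysis described above is a per-gene correlation between copy number and expression across samples. An illustrative stdlib sketch (not DR-Integrator's code; data invented):

```python
# Illustrative sketch: per-gene Pearson correlation between DNA copy
# number and expression across samples, the kind of statistic used to
# flag dosage-driven candidate genes. Data below are invented.
import math

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# one gene measured in four tumors: dosage tracks expression
copy_number = [2, 3, 4, 6]
expression = [1.0, 1.4, 2.1, 3.0]
r = pearson(copy_number, expression)  # close to 1: dosage-driven candidate
```

Genes with high r across a cohort are the concordant subset the abstract argues is enriched for oncogenes and tumor suppressors.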

  14. Demonstration of FBRM as process analytical technology tool for dewatering processes via CST correlation.

    Science.gov (United States)

    Cobbledick, Jeffrey; Nguyen, Alexander; Latulippe, David R

    2014-07-01

    The current challenges associated with the design and operation of net-energy positive wastewater treatment plants demand sophisticated approaches for the monitoring of polymer-induced flocculation. In anaerobic digestion (AD) processes, the dewaterability of the sludge is typically assessed from off-line lab-bench tests - the capillary suction time (CST) test is one of the most common. Focused beam reflectance measurement (FBRM) is a promising technique for real-time monitoring of critical performance attributes in large scale processes and is ideally suited for dewatering applications. The flocculation performance of twenty-four cationic polymers, that spanned a range of polymer size and charge properties, was measured using both the FBRM and CST tests. Analysis of the data revealed a decreasing monotonic trend; the samples that had the highest percent removal of particles less than 50 microns in size as determined by FBRM had the lowest CST values. A subset of the best performing polymers was used to evaluate the effects of dosage amount and digestate sources on dewatering performance. The results from this work show that FBRM is a powerful tool that can be used for optimization and on-line monitoring of dewatering processes. Copyright © 2014 Elsevier Ltd. All rights reserved.
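
The "decreasing monotonic trend" between FBRM fines removal and CST reported above is exactly what a Spearman rank correlation quantifies. A hedged sketch with invented data (simple ranking, no tie handling):

```python
# Spearman rank correlation as a way to quantify the monotonic
# FBRM-vs-CST relationship. Data are invented for illustration.

def ranks(values):
    order = sorted(range(len(values)), key=values.__getitem__)
    r = [0] * len(values)
    for pos, i in enumerate(order):
        r[i] = pos + 1
    return r

def spearman(xs, ys):
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

fines_removed = [90, 75, 60, 40]  # % removal of particles < 50 um (FBRM)
cst_seconds = [12, 20, 35, 80]    # capillary suction time (CST)
rho = spearman(fines_removed, cst_seconds)  # -1.0: perfect inverse trend
```

A strongly negative rho across polymers is what would justify replacing the off-line CST test with on-line FBRM monitoring.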

  15. Analytical tool for the periodic safety analysis of NPP according to the PSA guideline. Vol. 1

    International Nuclear Information System (INIS)

    Balfanz, H.P.; Boehme, E.; Musekamp, W.; Hussels, U.; Becker, G.; Behr, H.; Luettgert, H.

    1994-01-01

    The SAIS (Safety Analysis and Information System) programme system is based on an integrated data base, which consists of a plant-data part and a PSA-related data part. Using SAIS, analyses can be performed by special tools connected directly to the data base. Two main editors, RISA+ and DEDIT, are used for data base management. Access to the data base is via different types of pages, which are displayed on a computer screen and are called data sheets. Sets of input and output data sheets were implemented, such as system or component data sheets, fault trees and event trees. All input information, models and results needed for updated PSA results (Living PSA) can be stored in the SAIS. The programme system contains the editor KVIEW, which guarantees consistency of the stored data, e.g. with respect to names and codes of components and events. The information contained in the data base is called up by a standardized user guide programme, called the Page Editor. (Reference NPP: Brunsbuettel). (orig./HP) [de

  16. Evaluation and Selection Process of Suppliers Through an Analytical Framework: Empirical Evidence of an Evaluation Tool

    Directory of Open Access Journals (Sweden)

    Imeri Shpend

    2015-09-01

    Full Text Available The supplier selection process is very important to companies, as selecting the right suppliers that fit a company's strategic needs brings drastic savings. This paper therefore addresses the key area of supplier evaluation from the supplier review perspective. The purpose was to identify the most important criteria for supplier evaluation and to develop an evaluation tool based on the surveyed criteria. The research was conducted through a structured questionnaire, and the sample focused on small to medium-sized enterprises (SMEs) in Greece. In total, eighty companies participated in the survey, answering the full questionnaire, which asked whether these companies utilize any supplier evaluation criteria and, if so, which criteria are applied. The main statistical instrument used in the study is Principal Component Analysis (PCA). The research has shown that the main criteria are: the attitude of the vendor towards the customer, supplier delivery time, product quality and price. Conclusions are made on the suitability and usefulness of supplier evaluation criteria and the way they are applied in enterprises.
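
The idea behind the PCA used in the survey can be sketched minimally: the first principal component is the direction of greatest variance in the respondents' criterion scores. This is not the authors' analysis; the power-iteration implementation and the respondent scores are illustrative only.

```python
# Minimal PCA sketch: find the first principal component of invented
# supplier-criterion scores by power iteration on the covariance matrix.

def covariance(data):
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    return [[sum((row[i] - means[i]) * (row[j] - means[j]) for row in data) / (n - 1)
             for j in range(d)] for i in range(d)]

def first_component(cov, iters=200):
    d = len(cov)
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# rows = respondents; columns = scores for two correlated criteria,
# e.g. delivery time and product quality (invented)
scores = [[1.0, 1.1], [2.0, 1.9], [3.0, 3.2], [4.0, 3.8]]
pc1 = first_component(covariance(scores))  # both loadings share one sign
```

Loadings of the same sign on a component indicate criteria that vary together across respondents, which is how PCA groups the surveyed criteria into a few underlying factors.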

  17. ProteoLens: a visual analytic tool for multi-scale database-driven biological network data mining.

    Science.gov (United States)

    Huan, Tianxiao; Sivachenko, Andrey Y; Harrison, Scott H; Chen, Jake Y

    2008-08-12

    New systems biology studies require researchers to understand how interplay among myriads of biomolecular entities is orchestrated in order to achieve high-level cellular and physiological functions. Many software tools have been developed in the past decade to help researchers visually navigate large networks of biomolecular interactions with built-in template-based query capabilities. To further advance researchers' ability to interrogate global physiological states of cells through multi-scale visual network explorations, new visualization software tools still need to be developed to empower the analysis. A robust visual data analysis platform, driven by database management systems and able to perform bi-directional data processing-to-visualization with declarative querying capabilities, is needed. We developed ProteoLens as a Java-based visual analytic software tool for creating, annotating and exploring multi-scale biological networks. It supports direct database connectivity to either Oracle or PostgreSQL database tables/views, on which SQL statements using both Data Definition Language (DDL) and Data Manipulation Language (DML) may be specified. The robust query languages embedded directly within the visualization software help users to bring their network data into a visualization context for annotation and exploration. ProteoLens supports graph/network data represented in the standard Graph Modeling Language (GML) format, which enables interoperation with a wide range of other visual layout tools. The architectural design of ProteoLens enables the de-coupling of complex network data visualization tasks into two distinct phases: 1) creating network data association rules, which are mapping rules between network node IDs or edge IDs and data attributes such as functional annotations, expression levels, scores, synonyms, descriptions, etc.; 2) applying network data association rules to build the network and perform the visual annotation of graph nodes and edges.

  18. Electrochemical immunoassay using magnetic beads for the determination of zearalenone in baby food: An anticipated analytical tool for food safety

    International Nuclear Information System (INIS)

    Hervas, Miriam; Lopez, Miguel Angel; Escarpa, Alberto

    2009-01-01

    In this work, an electrochemical immunoassay involving magnetic beads to determine zearalenone in selected food samples has been developed. The immunoassay scheme has been based on a direct competitive immunoassay method in which antibody-coated magnetic beads were employed as the immobilisation support and horseradish peroxidase (HRP) was used as the enzymatic label. Amperometric detection has been achieved through the addition of hydrogen peroxide substrate and hydroquinone as mediator. The analytical performance of the electrochemical immunoassay has been evaluated by analysis of a maize certified reference material (CRM) and selected baby food samples. A detection limit (LOD) of 0.011 μg L⁻¹ and an EC50 of 0.079 μg L⁻¹ were obtained, allowing the assessment of the detection of zearalenone mycotoxin. In addition, an excellent accuracy, with a high recovery yield ranging between 95 and 108%, has been obtained. The analytical features have shown the proposed electrochemical immunoassay to be a very powerful and timely screening tool for the food safety scene.
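
One common convention for deriving a detection limit like the one quoted above is LOD = 3.3 · sigma_blank / slope of the calibration line. The abstract does not state which convention was used, so the following stdlib sketch, with an invented calibration data set, is purely illustrative:

```python
# Hedged sketch: least-squares calibration line plus the
# LOD = 3.3 * sigma_blank / slope convention. Data are invented.

def linear_fit(xs, ys):
    """Least-squares slope and intercept of a calibration line."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def lod(sigma_blank, slope):
    return 3.3 * sigma_blank / slope

# invented calibration: zearalenone standards (ug/L) vs. signal
conc = [0.0, 0.05, 0.10, 0.20]
signal = [0.1, 5.2, 10.0, 20.3]
slope, intercept = linear_fit(conc, signal)
detection_limit = lod(0.3, slope)  # same units as conc
```

The steeper the calibration slope and the quieter the blank, the lower the achievable LOD, which is why amperometric mediator systems like the HRP/hydroquinone scheme above can reach sub-μg/L limits.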

  20. Towards a green analytical laboratory: microextraction techniques as a useful tool for the monitoring of polluted soils

    Science.gov (United States)

    Lopez-Garcia, Ignacio; Viñas, Pilar; Campillo, Natalia; Hernandez Cordoba, Manuel; Perez Sirvent, Carmen

    2016-04-01

    Microextraction techniques are a valuable tool in the analytical laboratory, since they allow sensitive measurements of pollutants to be carried out with easily available instrumentation. There is a large number of such procedures involving miniaturized liquid-liquid or liquid-solid extractions, with the common denominator of using very low amounts (only a few microliters) of organic solvents, or even none at all. Since minimal amounts of reagents are involved, and the generation of residues is consequently minimized, the approach falls within the concept of Green Analytical Chemistry. This general methodology is useful for both inorganic and organic pollutants. Thus, low amounts of metallic ions can be measured without the need for ICP-MS, since this instrument can be replaced by a simple AAS spectrometer, which is commonly present in any laboratory and involves low acquisition and maintenance costs. When dealing with organic pollutants, the microextracts obtained can be introduced into liquid or gas chromatographs equipped with common detectors, and there is no need for the most sophisticated and expensive mass spectrometers. This communication reports an overview of the advantages of such a methodology, and gives examples for the determination of some particular contaminants in soil and water samples. The authors are grateful to the Comunidad Autónoma de la Región de Murcia, Spain (Fundación Séneca, 19888/GERM/15) for financial support.

  1. Reliable screening of various foodstuffs with respect to their irradiation status: A comparative study of different analytical techniques

    International Nuclear Information System (INIS)

    Ahn, Jae-Jun; Akram, Kashif; Kwak, Ji-Young; Jeong, Mi-Seon; Kwon, Joong-Ho

    2013-01-01

    Cost-effective and time-efficient analytical techniques are required to screen large food lots with respect to their irradiation status. Gamma-irradiated (0–10 kGy) cinnamon, red pepper, black pepper, and fresh paprika were investigated using photostimulated luminescence (PSL), the direct epifluorescent filter technique/aerobic plate count (DEFT/APC), and electronic-nose (e-nose) analyses. The screening results were also confirmed with thermoluminescence analysis. PSL analysis discriminated between irradiated (positive, >5000 PCs) and non-irradiated (negative, <700 PCs) cinnamon and red peppers. Black pepper had intermediate results (700–5000 PCs), while paprika had low sensitivity (negative results) upon irradiation. The DEFT/APC technique also showed clear screening results through the changes in microbial profiles, where the best results were found in paprika, followed by red pepper and cinnamon. E-nose analysis showed a dose-dependent discrimination in volatile profiles upon irradiation through principal component analysis. These methods can be used considering their potential applications for the screening analysis of irradiated foods. - Highlights: • Detection of irradiated food is important to enforce the applied regulations. • Gamma-irradiated spices were investigated to confirm their irradiation status. • Screening techniques such as PSL, DEFT/APC, and e-nose were tested. • Specificity and potential applications of the screening techniques were evaluated. • The screening results were confirmed by the promising thermoluminescence technique
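
The PSL decision rule described above (negative below 700 photon counts, positive above 5000, intermediate in between) is simple enough to sketch directly. Sample names and counts below are invented:

```python
# The three-band PSL screening rule: below the lower threshold the
# sample screens negative (non-irradiated), above the upper threshold
# positive (irradiated), and in between it is intermediate, needing a
# confirmatory test such as thermoluminescence. Counts are invented.

def psl_screen(photon_count, low=700, high=5000):
    if photon_count < low:
        return "negative"
    if photon_count > high:
        return "positive"
    return "intermediate"

samples = {
    "irradiated cinnamon": 61000,
    "non-irradiated red pepper": 310,
    "black pepper": 2400,
}
results = {name: psl_screen(count) for name, count in samples.items()}
```

The intermediate band is why the abstract pairs PSL with confirmatory thermoluminescence: screening alone cannot classify every sample.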

  2. LitPathExplorer: a confidence-based visual text analytics tool for exploring literature-enriched pathway models.

    Science.gov (United States)

    Soto, Axel J; Zerva, Chrysoula; Batista-Navarro, Riza; Ananiadou, Sophia

    2018-04-15

    Pathway models are valuable resources that help us understand the various mechanisms underpinning complex biological processes. Their curation is typically carried out through manual inspection of published scientific literature to find information relevant to a model, which is a laborious and knowledge-intensive task. Furthermore, models curated manually cannot be easily updated and maintained with new evidence extracted from the literature without automated support. We have developed LitPathExplorer, a visual text analytics tool that integrates advanced text mining, semi-supervised learning and interactive visualization, to facilitate the exploration and analysis of pathway models using statements (i.e. events) extracted automatically from the literature and organized according to levels of confidence. LitPathExplorer supports pathway modellers and curators alike by: (i) extracting events from the literature that corroborate existing models with evidence; (ii) discovering new events which can update models; and (iii) providing a confidence value for each event that is automatically computed based on linguistic features and article metadata. Our evaluation of event extraction showed a precision of 89% and a recall of 71%. Evaluation of our confidence measure, when used for ranking sampled events, showed an average precision ranging between 61 and 73%, which can be improved to 95% when the user is involved in the semi-supervised learning process. Qualitative evaluation using pair analytics based on the feedback of three domain experts confirmed the utility of our tool within the context of pathway model exploration. LitPathExplorer is available at http://nactem.ac.uk/LitPathExplorer_BI/. sophia.ananiadou@manchester.ac.uk. Supplementary data are available at Bioinformatics online.
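
The event-extraction figures quoted above (89% precision, 71% recall) come from the standard set-based computation, sketched here with invented extracted and gold-standard event sets:

```python
# Standard precision/recall over extracted vs. curator-annotated event
# sets. The event identifiers below are invented for illustration.

def precision_recall(extracted, gold):
    tp = len(extracted & gold)
    return tp / len(extracted), tp / len(gold)

extracted = {"e1", "e2", "e3", "e9"}   # events the system found
gold = {"e1", "e2", "e3", "e4", "e5"}  # curator-annotated events
precision, recall = precision_recall(extracted, gold)
```

Precision penalises spurious extractions (e9 here), recall penalises missed gold events (e4, e5), which is why a curation tool reports both.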

  3. Operational safety performance indicator system - a management tool for the self assessment of safety and reliability of nuclear power plants

    International Nuclear Information System (INIS)

    Anil Kumar; Mandowara, S.L.; Mittal, S.

    2006-01-01

    The Operational Safety Performance Indicator system is one of the self-assessment tools for station management to monitor the safety and reliability of nuclear power plants. It provides information to station management about the performance of various areas of the plant by means of different colours of the relevant performance indicators. Such systems have been implemented at many nuclear power plants in the world and have been considered a strength during WANO Peer Reviews. The IAEA had a Coordinated Research Programme (CRP) on this subject with several countries participating, including India. In NPCIL, this system was implemented in KAPS about a year ago and found very useful in identifying areas which need to be given more attention. Based on the KAPS feedback, implementation of this system has been taken up in RAPS-3 and 4 and KGS-1 and 2. (author)

  4. Designing and Assessing the Validity and Reliability of a Hospital Readiness Assessment Tool for Conducting Quality Improvement Programs

    Directory of Open Access Journals (Sweden)

    Kamal Gholipoor

    2016-09-01

    Full Text Available Background and objectives: Identifying a hospital's readiness, strengths, and weaknesses can be useful in developing appropriate planning, situation analysis, and management for effective clinical audit programs. The aim of this study was to design and assess the validity and reliability of a hospital readiness assessment tool for conducting quality improvement and clinical audit programs. Material and Methods: In this study, based on the results of a systematic review of the literature, an initial questionnaire with 77 items was designed. The questionnaire's content validity was reviewed by experts in the field of hospital management and quality improvement at Tabriz University of Medical Sciences. For this purpose, 20 questionnaires were sent to experts, of whom 15 returned completed questionnaires. The questionnaire's validity was reviewed and confirmed based on the Content Validity Index and Content Validity Ratio. Its reliability was confirmed based on Cronbach's alpha (α = 0.96) in a pilot study with the participation of 30 hospital managers. Results: The final questionnaire contains 54 questions in nine categories: data and information (9 items), teamwork (12), resources (5), patient and education (5), intervention design and implementation (5), clinical audit management (4), human resources (6), evidence and standards (4), and evaluation and feedback (4). The final questionnaire's content validity index was 0.91, and its Cronbach's alpha coefficient was 0.96. Conclusion: Considering the relatively good validity and reliability of the tool designed in this study, the questionnaire can be used to identify and assess hospitals' readiness for quality improvement and clinical audit program implementation.
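
The Cronbach's alpha reported above is computed from item and total-score variances: alpha = k/(k-1) · (1 - sum of item variances / variance of totals). A hedged sketch with invented respondent data:

```python
# Cronbach's alpha for internal-consistency reliability. The respondent
# scores below are invented, not the study's data.

def variance(xs):
    n = len(xs)
    m = sum(xs) / n
    return sum((x - m) ** 2 for x in xs) / (n - 1)

def cronbach_alpha(items):
    """items: one list of respondent scores per questionnaire item."""
    k = len(items)
    totals = [sum(col) for col in zip(*items)]
    return k / (k - 1) * (1 - sum(variance(i) for i in items) / variance(totals))

# three highly consistent items answered by four respondents
items = [[1, 2, 3, 4], [2, 4, 6, 8], [1, 2, 3, 4]]
alpha = cronbach_alpha(items)  # high internal consistency
```

Items that rise and fall together inflate the variance of the totals relative to the item variances, pushing alpha toward 1; an alpha of 0.96 as in the study indicates very high internal consistency.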

  5. Hazard assessment of exhaust emissions - The next generation of fast and reliable tools for in vitro screening

    Science.gov (United States)

    Rothen-Rutishauser, B.

    2017-12-01

    Hazard assessment of exhaust emissions - The next generation of fast and reliable tools for in vitro screening Barbara Rothen-Rutishauser Adolphe Merkle Institute, University of Fribourg, Switzerland; barbara.rothen@unifr.ch Pollution by vehicles is a major problem for the environment due to the various components in the exhaust gasses that are emitted into the atmosphere. A large number of epidemiological studies demonstrate the profound impact of vehicle emissions upon human health [1-3]. Such studies however, are unable to attribute a given subset of emissions to a certain adverse effect, which renders decision making difficult. Standardized protocols for exhaust toxicity assessment are lacking and it relies in many aspects on epidemiological and in vivo studies (animals), which are very time and cost-intensive and suffer from considerable ethical issues. An overview about the current state of research and clinical aspects in the field, as well as about the development of sophisticated in vitro approaches mimicking the inhalation of airborne particles / exhaust for the toxicological testing of engine emissions will be provided. Data will be presented that show that the combination of an air-liquid exposure system and 3D lung-cell culture model offers an adequate tool for fast and reliable investigations of complete exhaust toxicity as well as the effects of particulate fraction [4,5]. This approach yields important results for novel and improved emission technologies in the early stages of product development. [1] Donaldson et al. Part Fibre Toxicol 2005, 2: 10. [2] Ghio et al. J Toxicol Environ Health B Crit Rev 2012, 15: 1-21. [3] Peters et al. Res Rep Health Eff Inst 2009, 5-77. [4] Bisig et al. Emiss Control Sci Technol 2015, 1: 237-246. [5] Steiner et al. Atmos Environ 2013, 81: 380-388.

  6. Testing the reliability and efficiency of the pilot Mixed Methods Appraisal Tool (MMAT) for systematic mixed studies review.

    Science.gov (United States)

    Pace, Romina; Pluye, Pierre; Bartlett, Gillian; Macaulay, Ann C; Salsberg, Jon; Jagosh, Justin; Seller, Robbyn

    2012-01-01

Systematic literature reviews identify, select, appraise, and synthesize relevant literature on a particular topic. Typically, these reviews examine primary studies based on similar methods, e.g., experimental trials. In contrast, interest in a new form of review, known as mixed studies review (MSR), which includes qualitative, quantitative, and mixed methods studies, is growing. In MSRs, reviewers appraise studies that use different methods, allowing them to obtain in-depth answers to complex research questions. However, appraising the quality of studies with different methods remains challenging. To facilitate systematic MSRs, a pilot Mixed Methods Appraisal Tool (MMAT) has been developed at McGill University (comprising a checklist and a tutorial), which can be used to concurrently appraise the methodological quality of qualitative, quantitative, and mixed methods studies. The purpose of the present study is to test the reliability and efficiency of a pilot version of the MMAT. The Center for Participatory Research at McGill conducted a systematic MSR on the benefits of Participatory Research (PR). Thirty-two PR evaluation studies were appraised by two independent reviewers using the pilot MMAT. Among these, 11 (34%) involved nurses as researchers or research partners. Appraisal time was measured to assess efficiency. Inter-rater reliability was assessed by calculating a kappa statistic based on dichotomized responses for each criterion. An appraisal score was determined for each study, which allowed the calculation of an overall intra-class correlation. On average, it took 14 min to appraise a study (excluding the initial reading of articles). Agreement between reviewers was moderate to perfect with regard to MMAT criteria, and substantial with respect to the overall quality score of appraised studies. The MMAT is unique, thus the reliability of the pilot MMAT is promising, and encourages further development. Copyright © 2011 Elsevier Ltd. All rights reserved.
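The inter-rater kappa described above can be sketched as follows; the two raters' dichotomized (1 = criterion met, 0 = not met) ratings are invented for illustration:

```python
# Sketch of Cohen's kappa for two raters on dichotomized criteria, as used
# to assess MMAT inter-rater reliability; the ratings are illustrative.
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two equal-length lists of binary (0/1) ratings."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal proportion of 1-ratings
    p_a = sum(rater_a) / n
    p_b = sum(rater_b) / n
    expected = p_a * p_b + (1 - p_a) * (1 - p_b)
    return (observed - expected) / (1 - expected)

a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
b = [1, 1, 0, 0, 0, 1, 1, 0, 1, 1]
print(round(cohens_kappa(a, b), 3))
```

Unlike raw percent agreement, kappa discounts the agreement expected by chance, which is why it is preferred for appraisal criteria where one response dominates.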

  7. Establishing magnetic resonance imaging as an accurate and reliable tool to diagnose and monitor esophageal cancer in a rat model.

    Directory of Open Access Journals (Sweden)

    Juliann E Kosovec

    Full Text Available OBJECTIVE: To assess the reliability of magnetic resonance imaging (MRI for detection of esophageal cancer in the Levrat model of end-to-side esophagojejunostomy. BACKGROUND: The Levrat model has proven utility in terms of its ability to replicate Barrett's carcinogenesis by inducing gastroduodenoesophageal reflux (GDER. Due to lack of data on the utility of non-invasive methods for detection of esophageal cancer, treatment efficacy studies have been limited, as adenocarcinoma histology has only been validated post-mortem. It would therefore be of great value if the validity and reliability of MRI could be established in this setting. METHODS: Chronic GDER reflux was induced in 19 male Sprague-Dawley rats using the modified Levrat model. At 40 weeks post-surgery, all animals underwent endoscopy, MRI scanning, and post-mortem histological analysis of the esophagus and anastomosis. With post-mortem histology serving as the gold standard, assessment of presence of esophageal cancer was made by five esophageal specialists and five radiologists on endoscopy and MRI, respectively. RESULTS: The accuracy of MRI and endoscopic analysis to correctly identify cancer vs. no cancer was 85.3% and 50.5%, respectively. ROC curves demonstrated that MRI rating had an AUC of 0.966 (p<0.001 and endoscopy rating had an AUC of 0.534 (p = 0.804. The sensitivity and specificity of MRI for identifying cancer vs. no-cancer was 89.1% and 80% respectively, as compared to 45.5% and 57.5% for endoscopy. False positive rates of MRI and endoscopy were 20% and 42.5%, respectively. CONCLUSIONS: MRI is a more reliable diagnostic method than endoscopy in the Levrat model. The non-invasiveness of the tool and its potential to volumetrically quantify the size and number of tumors likely makes it even more useful in evaluating novel agents and their efficacy in treatment studies of esophageal cancer.
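The diagnostic-accuracy measures reported above follow directly from confusion-matrix counts. The counts below are hypothetical reconstructions chosen to be consistent with the reported MRI percentages, not figures taken from the study:

```python
# Sketch of sensitivity, specificity, accuracy, and false-positive rate from
# confusion-matrix counts; the counts are illustrative, not the study's data.
def diagnostics(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)            # true positives among cancers
    specificity = tn / (tn + fp)            # true negatives among non-cancers
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    false_positive_rate = fp / (fp + tn)
    return sensitivity, specificity, accuracy, false_positive_rate

sens, spec, acc, fpr = diagnostics(tp=49, fn=6, tn=32, fp=8)
print(round(sens, 3), round(spec, 3), round(acc, 3), round(fpr, 3))
```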

  8. Portuguese translation, cross-cultural adaptation and reliability of the questionnaire «Start Back Screening Tool» (SBST).

    Science.gov (United States)

    Raimundo, Armando; Parraça, José; Batalha, Nuno; Tomas-Carus, Pablo; Branco, Jaime; Hill, Jonathan; Gusi, Narcis

    2017-01-01

To translate and cross-culturally adapt the StarT Back Screening Tool (SBST) questionnaire for assessing and screening low back pain in the Portuguese population, and to test its reliability. To establish conceptual equivalence in item, semantic, and operational terms, two independent translations into Portuguese were performed. A combined version was obtained by consensus among the authors of the translations, in order to achieve a version that was semantically clear and easy to understand. The synthesis version was administered to 40 subjects distributed by gender, young and older adults, with and without low back pain. Through cognitive interviews with the subjects of the sample, the clarity, acceptability, and familiarity of the Portuguese version were evaluated, and the changes necessary for better understanding were made. The final Portuguese version of the questionnaire was then back-translated into the original language. To evaluate the psychometric properties of SBST-Portugal, 31 subjects with low back pain were interviewed twice. Interviewees reported that, in general, the items were clear and comprehensible, achieving face validity. The reliability of the SBST-Portugal showed a Kappa value of 0.74 (95% CI 0.53-0.95), and the internal consistency (Cronbach's alpha) was 0.93 for the total score and 0.93 for the psychosocial subscale. The Portuguese version of the SBST questionnaire proved equivalent to the original English version and reliable for the Portuguese population with low back pain. Being an easily accessed and administered instrument, it could be used in primary care.

  9. Clinical assessment of dysphagia in neurodegeneration (CADN): development, validity and reliability of a bedside tool for dysphagia assessment.

    Science.gov (United States)

    Vogel, Adam P; Rommel, Natalie; Sauer, Carina; Horger, Marius; Krumm, Patrick; Himmelbach, Marc; Synofzik, Matthis

    2017-06-01

Screening assessments for dysphagia are essential in neurodegenerative disease. Yet there are no purpose-built tools to quantify swallowing deficits at bedside or in clinical trials. A quantifiable, brief, easy to administer assessment that measures the impact of dysphagia and predicts the presence or absence of aspiration is needed. The Clinical Assessment of Dysphagia in Neurodegeneration (CADN) was designed by a multidisciplinary team (neurology, neuropsychology, speech pathology) and validated against strict methodological criteria in two neurodegenerative diseases, Parkinson's disease (PD) and degenerative ataxia (DA). CADN comprises two parts, an anamnesis (part one) and consumption (part two). Two-thirds of patients were assessed using reference tests, the SWAL-QOL symptoms subscale (part one) and videofluoroscopic assessment of swallowing (part two). CADN has 11 items and can be administered and scored in an average of 7 min. Test-retest reliability was established using correlation and Bland-Altman plots. 125 patients with a neurodegenerative disease were recruited; 60 PD and 65 DA. Validity was established using ROC graphs and correlations. CADN has sensitivity of 79 and 84% and specificity of 71 and 69% for parts one and two, respectively. Significant correlations with disease severity were also observed (p < 0.05), as were correlations with dysphagia symptomatology and risk of aspiration. The CADN is a reliable, valid, brief, quantifiable, and easily deployed assessment of swallowing in neurodegenerative disease. It is thus ideally suited for both clinical bedside assessment and future multicentre clinical trials in neurodegenerative disease.

  10. Reliable computation of roots in analytical waveguide modeling using an interval-Newton approach and algorithmic differentiation.

    Science.gov (United States)

    Bause, Fabian; Walther, Andrea; Rautenberg, Jens; Henning, Bernd

    2013-12-01

For the modeling and simulation of wave propagation in geometrically simple waveguides such as plates or rods, one may employ the analytical global matrix method. That is, a certain (global) matrix depending on the two parameters wavenumber and frequency is built. Subsequently, one must calculate all parameter pairs within the domain of interest where the global matrix becomes singular. For this purpose, one could compute all roots of the determinant of the global matrix when the two parameters vary in the given intervals. This requirement to calculate all roots is actually the method's most concerning restriction. Previous approaches are based on so-called mode-tracers, which use the physical phenomenon that solutions, i.e., roots of the determinant of the global matrix, appear in a certain pattern, the waveguide modes, to limit the root-finding algorithm's search space with respect to consecutive solutions. In some cases, these reductions of the search space yield only an incomplete set of solutions, because some roots may be missed as a result of uncertain predictions. Therefore, we propose replacement of the mode-tracer approach with a suitable version of an interval-Newton method. To apply this interval-based method, we extended the interval and derivative computation provided by a numerical computing environment such that corresponding information is also available for Bessel functions used in circular models of acoustic waveguides. We present numerical results for two different scenarios. First, a polymeric cylindrical waveguide is simulated, and second, we show simulation results of a one-sided fluid-loaded plate. For both scenarios, we compare results obtained with the proposed interval-Newton algorithm and commercial software.
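To make the interval-Newton idea concrete, here is a minimal one-dimensional sketch. It is not the authors' implementation: it uses naive floating-point interval arithmetic without outward rounding, a toy function f(x) = x² - 2, and a hand-supplied derivative enclosure, but it shows how the Newton operator N(X) = m - f(m)/F'(X), intersected with X, contracts an interval around a root without ever losing it:

```python
# Illustrative interval-Newton contraction in 1-D (naive interval arithmetic,
# no rigorous outward rounding); f and its derivative enclosure are toy inputs.
def interval_newton(f, dF, lo, hi, iters=30):
    """Contract [lo, hi] around the root of f, given an interval enclosure
    dF(lo, hi) = (dlo, dhi) of f' over [lo, hi] with dlo > 0."""
    for _ in range(iters):
        m = 0.5 * (lo + hi)
        fm = f(m)
        dlo, dhi = dF(lo, hi)
        q = (fm / dlo, fm / dhi)               # f(m) / [dlo, dhi]
        n_lo, n_hi = m - max(q), m - min(q)    # Newton operator N(X)
        lo, hi = max(lo, n_lo), min(hi, n_hi)  # intersect N(X) with X
        if hi - lo < 1e-8:
            break
    return lo, hi

# f(x) = x^2 - 2 on [1, 2]; f'(x) = 2x, so f'([lo, hi]) = [2*lo, 2*hi]
root_lo, root_hi = interval_newton(lambda x: x * x - 2,
                                   lambda lo, hi: (2 * lo, 2 * hi),
                                   1.0, 2.0)
print(root_lo <= 2 ** 0.5 <= root_hi)  # the enclosure contains sqrt(2)
```

The guarantee that roots cannot be missed comes from this containment property; a production implementation would add directed rounding and interval bisection when N(X) does not contract.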

  11. Real-time imaging as an emerging process analytical technology tool for monitoring of fluid bed coating process.

    Science.gov (United States)

    Naidu, Venkata Ramana; Deshpande, Rucha S; Syed, Moinuddin R; Wakte, Pravin S

    2018-07-01

A direct imaging system (Eyecon™) was used as a Process Analytical Technology (PAT) tool to monitor a fluid bed coating process. Eyecon™ generated real-time on-screen images and particle size and shape information for two identically manufactured laboratory-scale batches. Eyecon™ has an accuracy of ±1 μm when measuring the particle size increase of particles in the size range of 50-3000 μm, and captured data every 2 s during the entire process. The moving averages of the D90 particle size values recorded by Eyecon™ were calculated over every 30 min to calculate the radial coating thickness of the coated particles. After the completion of the coating process, the radial coating thickness was found to be 11.3 and 9.11 μm, with standard deviations of ±0.68 and ±1.8 μm for Batch 1 and Batch 2, respectively. The coating thickness was also correlated with percent weight build-up by gel permeation chromatography (GPC) and dissolution. GPC indicated weight build-up of 10.6% and 9.27% for Batch 1 and Batch 2, respectively. In conclusion, a weight build-up of 10% can also be correlated with a 10 ± 2 μm increase in the coating thickness of the pellets, indicating the potential applicability of real-time imaging as an endpoint determination tool for fluid bed coating processes.
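The moving-average-to-thickness calculation described above can be sketched as follows. The D90 readings and window length are invented for illustration; the key point is that radial coating thickness is half of the diameter (D90) growth:

```python
# Sketch: radial coating thickness from smoothed D90 values; the readings
# below are illustrative, not data from the batches described above.
def moving_average(vals, window):
    """Simple trailing moving average over a fixed window."""
    return [sum(vals[i:i + window]) / window
            for i in range(len(vals) - window + 1)]

d90 = [850.0, 851.2, 853.0, 855.6, 858.9, 862.1, 866.0, 869.8, 872.6]  # μm
smooth = moving_average(d90, 3)
# D90 is a diameter, so radial thickness is half the diameter increase
thickness = (smooth[-1] - smooth[0]) / 2
print(round(thickness, 2))
```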

  12. Rapid process development of chromatographic process using direct analysis in real time mass spectrometry as a process analytical technology tool.

    Science.gov (United States)

    Yan, Binjun; Chen, Teng; Xu, Zhilin; Qu, Haibin

    2014-06-01

    The concept of quality by design (QbD) is widely applied in the process development of pharmaceuticals. However, the additional cost and time have caused some resistance about QbD implementation. To show a possible solution, this work proposed a rapid process development method, which used direct analysis in real time mass spectrometry (DART-MS) as a process analytical technology (PAT) tool for studying the chromatographic process of Ginkgo biloba L., as an example. The breakthrough curves were fast determined by DART-MS at-line. A high correlation coefficient of 0.9520 was found between the concentrations of ginkgolide A determined by DART-MS and HPLC. Based on the PAT tool, the impacts of process parameters on the adsorption capacity were discovered rapidly, which showed a decreased adsorption capacity with the increase of the flow rate. This work has shown the feasibility and advantages of integrating PAT into QbD implementation for rapid process development. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. Reliability and Normative Reference Values for the Vestibular/Ocular Motor Screening (VOMS) Tool in Youth Athletes.

    Science.gov (United States)

    Moran, Ryan N; Covassin, Tracey; Elbin, R J; Gould, Dan; Nogle, Sally

    2018-05-01

    The Vestibular/Ocular Motor Screening (VOMS) measure is a newly developed vestibular and ocular motor symptom provocation screening tool for sport-related concussions. Baseline data, psychometric properties, and reliability of the VOMS are needed to further understand the applications of this tool, especially in the youth population, where research is scarce. To establish normative data and document the internal consistency and false-positive rate of the VOMS in a sample of nonconcussed youth athletes. Cross-sectional study; Level of evidence, 3. A total of 423 youth athletes (male = 278, female = 145) between the ages of 8 and 14 years completed baseline VOMS screening before the start of their respective sport seasons. Internal consistency was measured with Cronbach α and inter-item correlations. Approximately 60% of youth athletes reported no symptom provocation on baseline VOMS assessment, with 9% to 13% scoring over the cutoff levels (score of ≥2 for any individual VOMS symptom, near point convergence distance of ≥5 cm). The VOMS displayed a high internal consistency (Cronbach α = .97) at baseline among youth athletes. The current findings provide preliminary support for the implementation of VOMS baseline assessment into clinical practice, due to a high internal consistency, strong relationships between VOMS items, and a low false-positive rate at baseline in youth athletes.
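The internal-consistency statistic reported above, Cronbach's alpha, can be computed from per-item scores as follows; the item scores below are invented for illustration, not VOMS data:

```python
# Sketch of Cronbach's alpha from raw item scores; data are illustrative.
def cronbach_alpha(items):
    """items: one list of scores per item, all over the same subjects."""
    k = len(items)
    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    item_vars = sum(var(it) for it in items)
    totals = [sum(scores) for scores in zip(*items)]   # per-subject totals
    return (k / (k - 1)) * (1 - item_vars / var(totals))

# 3 hypothetical symptom items scored by 5 subjects
items = [[2, 3, 3, 1, 0], [2, 2, 3, 1, 1], [3, 3, 2, 1, 0]]
print(round(cronbach_alpha(items), 2))
```

Alpha approaches 1 when items co-vary strongly relative to their individual variances, which is what a value of .97 at baseline reflects.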

  14. The validity and reliability of the portuguese versions of three tools used to diagnose delirium in critically ill patients

    Directory of Open Access Journals (Sweden)

    Dimitri Gusmao-Flores

    2011-01-01

Full Text Available OBJECTIVES: The objectives of this study are to compare the sensitivity and specificity of three diagnostic tools for delirium (the Intensive Care Delirium Screening Checklist, the Confusion Assessment Method for Intensive Care Units, and the Confusion Assessment Method for Intensive Care Units Flowsheet) in a mixed population of critically ill patients, and to validate the Brazilian Portuguese Confusion Assessment Method for Intensive Care Units. METHODS: The study was conducted in four intensive care units in Brazil. Patients were screened for delirium by a psychiatrist or neurologist using the Diagnostic and Statistical Manual of Mental Disorders. Patients were subsequently screened by an intensivist using Portuguese translations of the three tools. RESULTS: One hundred and nineteen patients were evaluated and 38.6% were diagnosed with delirium by the reference rater. The Confusion Assessment Method for Intensive Care Units had a sensitivity of 72.5% and a specificity of 96.2%; the Confusion Assessment Method for Intensive Care Units Flowsheet had a sensitivity of 72.5% and a specificity of 96.2%; the Intensive Care Delirium Screening Checklist had a sensitivity of 96.0% and a specificity of 72.4%. There was strong agreement between the Confusion Assessment Method for Intensive Care Units and the Confusion Assessment Method for Intensive Care Units Flowsheet (kappa coefficient = 0.96). CONCLUSION: All three instruments are effective diagnostic tools in critically ill intensive care unit patients. In addition, the Brazilian Portuguese version of the Confusion Assessment Method for Intensive Care Units is a valid and reliable instrument for the assessment of delirium among critically ill patients.

  15. Born analytical or adopted over time? a study investigating if new analytical tools can ensure the survival of market oriented startups.

    OpenAIRE

    Skogen, Hege Janson; De la Cruz, Kai

    2017-01-01

Master's thesis (MSc) in Strategic Marketing Management - Handelshøyskolen BI, 2017. This study investigates whether the prevalence of technological advances within quantitative analytics moderates the effect market orientation has on firm performance, and whether startups can take advantage of the potential opportunities to ensure their own survival. For this purpose, the authors review previous literature on market orientation, startups, marketing analytics, an...

  16. The Outdoor MEDIA DOT: The development and inter-rater reliability of a tool designed to measure food and beverage outlets and outdoor advertising.

    Science.gov (United States)

    Poulos, Natalie S; Pasch, Keryn E

    2015-07-01

Few studies of the food environment have collected primary data, and even fewer have reported the reliability of the tool used. This study focused on the development of an innovative electronic data collection tool used to document outdoor food and beverage (FB) advertising and establishments near 43 middle and high schools in the Outdoor MEDIA Study. Tool development used GIS-based mapping, an electronic data collection form on handheld devices, and an easily adaptable interface to efficiently collect primary data within the food environment. For the reliability study, two teams of data collectors documented all FB advertising and establishments within one half-mile of six middle schools. Inter-rater reliability was calculated overall and by advertisement or establishment category using percent agreement. A total of 824 items were documented: advertisements (n=233), establishment advertisements (n=499), and establishments (n=92) (range=8-229 per school). Overall inter-rater reliability of the developed tool ranged from 69-89% for advertisements and establishments. Results suggest that the developed tool is highly reliable and effective for documenting the outdoor FB environment. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. The Korean version of relative and absolute reliability of gait and balance assessment tools for patients with dementia in day care center and nursing home.

    Science.gov (United States)

    Lee, Han Suk; Park, Sun Wook; Chung, Hyung Kuk

    2017-11-01

[Purpose] This study aimed to determine the relative and absolute reliability of the Korean versions of the Berg Balance Scale (BBS), the Timed Up and Go (TUG), the Four-Meter Walking Test (4MWT), and the Groningen Meander Walking Test (GMWT) in patients with dementia. [Subjects and Methods] A total of 53 patients with dementia were tested on the TUG, BBS, 4MWT, and GMWT in a prospective cohort methodological design. Intra-class correlation coefficients (ICCs) were calculated to assess relative reliability, and the standard error of measurement (SEM), minimal detectable change (MDC95), and its percentage (MDC%) were calculated to analyze absolute reliability. [Results] Inter-rater reliability (ICC(2,3)) of the TUG, BBS, and GMWT was 0.99, and that of the 4MWT was 0.82. Inter-rater reliability was high for the TUG, BBS, and GMWT, with low SEM, MDC95, and MDC%, but low for the 4MWT, with high SEM, MDC95, and MDC%. Test-retest reliability (ICC(2,3)) of the TUG, BBS, and GMWT was 0.96-0.99, and that of the 4MWT was 0.85. Test-retest reliability was high for the TUG, BBS, and GMWT, with low SEM, MDC95, and MDC%, but low for the 4MWT, with high SEM, MDC95, and MDC%. [Conclusion] Relative reliability was high for all the assessment tools. Absolute reliability showed a reasonable level of stability for all tools except the 4MWT.
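The absolute-reliability quantities above follow standard definitions: SEM = SD·√(1 - ICC), MDC95 = 1.96·√2·SEM, and MDC% = 100·MDC95/mean. A sketch with illustrative numbers (not the study's data):

```python
import math

# Standard absolute-reliability formulas; SD, ICC, and mean are illustrative.
def sem(sd, icc):
    """Standard error of measurement."""
    return sd * math.sqrt(1 - icc)

def mdc95(sd, icc):
    """Minimal detectable change at the 95% confidence level."""
    return 1.96 * math.sqrt(2) * sem(sd, icc)

def mdc_percent(sd, icc, mean_score):
    """MDC95 as a percentage of the mean score."""
    return 100 * mdc95(sd, icc) / mean_score

# e.g. a TUG-like measure: SD = 5.0 s, ICC = 0.99, mean = 20 s
print(round(sem(5.0, 0.99), 2),
      round(mdc95(5.0, 0.99), 2),
      round(mdc_percent(5.0, 0.99, 20.0), 1))
```

Note how a high ICC drives the SEM, and therefore the MDC95 and MDC%, down, which is exactly the pattern reported for the TUG, BBS, and GMWT versus the 4MWT.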

  18. Development of Reliable and Validated Tools to Evaluate Technical Resuscitation Skills in a Pediatric Simulation Setting: Resuscitation and Emergency Simulation Checklist for Assessment in Pediatrics.

    Science.gov (United States)

    Faudeux, Camille; Tran, Antoine; Dupont, Audrey; Desmontils, Jonathan; Montaudié, Isabelle; Bréaud, Jean; Braun, Marc; Fournier, Jean-Paul; Bérard, Etienne; Berlengi, Noémie; Schweitzer, Cyril; Haas, Hervé; Caci, Hervé; Gatin, Amélie; Giovannini-Chami, Lisa

    2017-09-01

    To develop a reliable and validated tool to evaluate technical resuscitation skills in a pediatric simulation setting. Four Resuscitation and Emergency Simulation Checklist for Assessment in Pediatrics (RESCAPE) evaluation tools were created, following international guidelines: intraosseous needle insertion, bag mask ventilation, endotracheal intubation, and cardiac massage. We applied a modified Delphi methodology evaluation to binary rating items. Reliability was assessed comparing the ratings of 2 observers (1 in real time and 1 after a video-recorded review). The tools were assessed for content, construct, and criterion validity, and for sensitivity to change. Inter-rater reliability, evaluated with Cohen kappa coefficients, was perfect or near-perfect (>0.8) for 92.5% of items and each Cronbach alpha coefficient was ≥0.91. Principal component analyses showed that all 4 tools were unidimensional. Significant increases in median scores with increasing levels of medical expertise were demonstrated for RESCAPE-intraosseous needle insertion (P = .0002), RESCAPE-bag mask ventilation (P = .0002), RESCAPE-endotracheal intubation (P = .0001), and RESCAPE-cardiac massage (P = .0037). Significantly increased median scores over time were also demonstrated during a simulation-based educational program. RESCAPE tools are reliable and validated tools for the evaluation of technical resuscitation skills in pediatric settings during simulation-based educational programs. They might also be used for medical practice performance evaluations. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. Product ion isotopologue pattern: A tool to improve the reliability of elemental composition elucidations of unknown compounds in complex matrices.

    Science.gov (United States)

    Kaufmann, A; Walker, S; Mol, G

    2016-04-15

Elucidation of the elemental compositions of unknown compounds (e.g., in metabolomics) generally relies on the availability of accurate masses and isotopic ratios. This study focuses on the information provided by the abundance ratio within a product ion pair (monoisotopic versus the first isotopic peak) when isolating and fragmenting the first isotopic ion (first isotopic mass spectrum) of the precursor. This process relies on the capability of the quadrupole within the Q Orbitrap instrument to isolate a very narrow mass window. Selecting only the first isotopic peak (first isotopic mass spectrum) leads to the observation of a unique product ion pair. The lighter ion within such an isotopologue pair is monoisotopic, while the heavier ion contains a single carbon isotope. The observed abundance ratio is governed by the percentage of carbon atoms lost during the fragmentation and can be described by a hypergeometric distribution. The observed carbon isotopologue abundance ratio (product ion isotopologue pattern) gives reliable information regarding the percentage of carbon atoms lost in the fragmentation process. It therefore facilitates the elucidation of the involved precursor and product ions. Unlike conventional isotopic abundances, the product ion isotopologue pattern is hardly affected by isobaric interferences. Furthermore, the appearance of these pairs greatly aids in cleaning up a 'matrix-contaminated' product ion spectrum. The product ion isotopologue pattern is a valuable tool for structural elucidation. It increases confidence in results and permits structural elucidations for heavier ions. This tool is also very useful in elucidating the elemental composition of product ions. Such information is highly valued in the field of multi-residue analysis, where the accurate mass of product ions is required for the confirmation process. Copyright © 2016 John Wiley & Sons, Ltd.
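The hypergeometric reasoning described above can be sketched numerically. A precursor whose single ¹³C sits at a random position among its n carbons fragments to a product retaining k carbons; the chance that the product still carries the ¹³C follows the hypergeometric distribution and reduces to k/n, so the monoisotopic-to-first-isotopic abundance ratio directly encodes the carbon loss. The carbon counts below are illustrative:

```python
from math import comb

# Hypergeometric probability that the single 13C label survives fragmentation;
# carbon counts are illustrative, not taken from the study.
def p_13c_retained(n_carbons, k_retained):
    """P(product retains the one 13C among n carbons when k are kept)."""
    # Hypergeometric: choose k of n carbons, exactly 1 of which is the 13C
    return comb(1, 1) * comb(n_carbons - 1, k_retained - 1) / comb(n_carbons, k_retained)

# 20-carbon precursor whose product ion retains 12 carbons
p = p_13c_retained(20, 12)
print(round(p, 2))  # reduces to k/n = 12/20
```

Here 40% of the carbons are lost, so 40% of the first-isotopic precursor's fragments revert to the monoisotopic mass, producing the characteristic product ion pair ratio.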

  20. A web-based team-oriented medical error communication assessment tool: development, preliminary reliability, validity, and user ratings.

    Science.gov (United States)

    Kim, Sara; Brock, Doug; Prouty, Carolyn D; Odegard, Peggy Soule; Shannon, Sarah E; Robins, Lynne; Boggs, Jim G; Clark, Fiona J; Gallagher, Thomas

    2011-01-01

    Multiple-choice exams are not well suited for assessing communication skills. Standardized patient assessments are costly and patient and peer assessments are often biased. Web-based assessment using video content offers the possibility of reliable, valid, and cost-efficient means for measuring complex communication skills, including interprofessional communication. We report development of the Web-based Team-Oriented Medical Error Communication Assessment Tool, which uses videotaped cases for assessing skills in error disclosure and team communication. Steps in development included (a) defining communication behaviors, (b) creating scenarios, (c) developing scripts, (d) filming video with professional actors, and (e) writing assessment questions targeting team communication during planning and error disclosure. Using valid data from 78 participants in the intervention group, coefficient alpha estimates of internal consistency were calculated based on the Likert-scale questions and ranged from α=.79 to α=.89 for each set of 7 Likert-type discussion/planning items and from α=.70 to α=.86 for each set of 8 Likert-type disclosure items. The preliminary test-retest Pearson correlation based on the scores of the intervention group was r=.59 for discussion/planning and r=.25 for error disclosure sections, respectively. Content validity was established through reliance on empirically driven published principles of effective disclosure as well as integration of expert views across all aspects of the development process. In addition, data from 122 medicine and surgical physicians and nurses showed high ratings for video quality (4.3 of 5.0), acting (4.3), and case content (4.5). Web assessment of communication skills appears promising. Physicians and nurses across specialties respond favorably to the tool.

  1. Reliability Calculations

    DEFF Research Database (Denmark)

    Petersen, Kurt Erling

    1986-01-01

    Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during the operation time, with the purpose to improve the safety or the reliability. Due to plant complexity and safety...... and availability requirements, sophisticated tools, which are flexible and efficient, are needed. Such tools have been developed in the last 20 years and they have to be continuously refined to meet the growing requirements. Two different areas of application were analysed. In structural reliability probabilistic...... approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed based upon numerical integration in several variables. In systems reliability Monte Carlo simulation programs are used especially in analysis of very...
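Monte Carlo simulation of system reliability, mentioned above, can be sketched for a small series-parallel system. The topology and per-component failure probabilities below are invented for illustration and have no connection to the programs described in the record:

```python
import random

# Minimal Monte Carlo system-reliability sketch: two redundant (parallel)
# pairs connected in series; failure probabilities are illustrative.
def system_works(p_fail, rng):
    """Sample each component's state; the system needs one unit per pair."""
    up = [rng.random() >= p for p in p_fail]
    pair_a = up[0] or up[1]        # parallel pair A
    pair_b = up[2] or up[3]        # parallel pair B
    return pair_a and pair_b       # pairs in series

def mc_reliability(p_fail, n=100_000, seed=1):
    rng = random.Random(seed)
    return sum(system_works(p_fail, rng) for _ in range(n)) / n

p_fail = [0.1, 0.1, 0.2, 0.2]
est = mc_reliability(p_fail)
exact = (1 - 0.1 ** 2) * (1 - 0.2 ** 2)   # analytic answer for this topology
print(abs(est - exact) < 0.01)
```

For a system this small the analytic product form is trivial; Monte Carlo earns its keep on the "very complex" systems the record refers to, where no closed form exists.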

  2. Human reliability

    International Nuclear Information System (INIS)

    Embrey, D.E.

    1987-01-01

Concepts and techniques of human reliability have been developed and are used mostly in probabilistic risk assessment. For this, the major application of human reliability assessment has been to identify the human errors which have a significant effect on the overall safety of the system and to quantify the probability of their occurrence. Some of the major issues within human reliability studies are reviewed and it is shown how these are applied to the assessment of human failures in systems. This is done under the following headings: models of human performance used in human reliability assessment; the nature of human error; classification of errors in man-machine systems; practical aspects; human reliability modelling in complex situations; quantification and examination of human reliability; judgement-based approaches; holistic techniques; and decision analytic approaches. (UK)

  3. The Application of State-of-the-Art Analytic Tools (Biosensors and Spectroscopy) in Beverage and Food Fermentation Process Monitoring

    Directory of Open Access Journals (Sweden)

    Shaneel Chandra

    2017-09-01

Full Text Available The production of several agricultural products and foods is linked with fermentation. Traditional methods used to control and monitor the quality of the products and processes are based on simple chemical analyses. However, these methods are time-consuming and do not provide sufficient relevant information to track the chemical changes occurring during the process. Commonly used methods applied in the agriculture and food industries to monitor fermentation are those based on simple or single-point sensors, where only one parameter is measured (e.g., temperature or density). These sensors are read several times per day and are often the only source of data from which the conditions and rate of fermentation are monitored. In the modern food industry, an ideal method to control and monitor the fermentation process should enable a direct, rapid, precise, and accurate determination of several target compounds, with minimal to no sample preparation or reagent consumption. Here, state-of-the-art advancements in the application of both sensors and analytical tools to monitor beverage and food fermentation processes will be discussed.

  4. Visual-Haptic Integration: Cue Weights are Varied Appropriately to Account for Changes in Haptic Reliability Introduced by Using a Tool

    OpenAIRE

    Chie Takahashi; Simon J Watt

    2011-01-01

    Tools such as pliers systematically change the relationship between an object's size and the hand opening required to grasp it. Previous work suggests the brain takes this into account, integrating visual and haptic size information that refers to the same object, independent of the similarity of the ‘raw’ visual and haptic signals (Takahashi et al., VSS 2009). Variations in tool geometry also affect the reliability (precision) of haptic size estimates, however, because they alter the change ...

  5. Commentary on "Theory-Led Design of Instruments and Representations in Learning Analytics: Developing a Novel Tool for Orchestration of Online Collaborative Learning"

    Science.gov (United States)

    Teplovs, Chris

    2015-01-01

    This commentary reflects on the contributions to learning analytics and theory by a paper that describes how multiple theoretical frameworks were woven together to inform the creation of a new, automated discourse analysis tool. The commentary highlights the contributions of the original paper, provides some alternative approaches, and touches on…

  6. Screening for Psychosocial Risk in Dutch Families of a Child With Cancer: Reliability, Validity, and Usability of the Psychosocial Assessment Tool

    NARCIS (Netherlands)

    Sint Nicolaas, Simone M.; Schepers, Sasja A.; Hoogerbrugge, Peter M.; Caron, Huib N.; Kaspers, Gertjan J. L.; van den Heuvel-Eibrink, Marry M.; Grootenhuis, Martha A.; Verhaak, Chris M.

    2016-01-01

    The Psychosocial Assessment Tool (PAT) was developed to screen for psychosocial risk in families of a child diagnosed with cancer. The current study is the first describing the cross-cultural adaptation, reliability, validity, and usability of the PAT in a European country (Dutch translation).

  7. Reliability and Validity of the Sport Concussion Assessment Tool-3 (SCAT3) in High School and Collegiate Athletes.

    Science.gov (United States)

    Chin, Esther Y; Nelson, Lindsay D; Barr, William B; McCrory, Paul; McCrea, Michael A

    2016-09-01

    The Sport Concussion Assessment Tool-3 (SCAT3) facilitates sideline clinical assessments of concussed athletes. Yet, there is little published research on clinically relevant metrics for the SCAT3 as a whole. We documented the psychometric properties of the major SCAT3 components (symptoms, cognition, balance) and derived clinical decision criteria (ie, reliable change score cutoffs and normative conversion tables) for clinicians to apply to cases with and without available preinjury baseline data. Cohort study (diagnosis); Level of evidence, 2. High school and collegiate athletes (N = 2018) completed preseason baseline evaluations including the SCAT3. Re-evaluations of 166 injured athletes and 164 noninjured controls were performed within 24 hours of injury and at 8, 15, and 45 days after injury. Analyses focused on predictors of baseline performance, test-retest reliability, and sensitivity and specificity of the SCAT3 using either single postinjury cutoffs or reliable change index (RCI) criteria derived from this sample. Athlete sex, level of competition, attention-deficit/hyperactivity disorder (ADHD), learning disability (LD), and estimated verbal intellectual ability (but not concussion history) were associated with baseline scores on ≥1 SCAT3 components (small to moderate effect sizes). Female sex, high school level of competition (vs college), and ADHD were associated with higher baseline symptom ratings (d = 0.25-0.32). Male sex, ADHD, and LD were associated with lower baseline Standardized Assessment of Concussion (SAC) scores (d = 0.28-0.68). Male sex, high school level of competition, ADHD, and LD were associated with poorer baseline Balance Error Scoring System (BESS) performance (d = 0.14-0.26). After injury, the symptom checklist manifested the largest effect size at the 24-hour assessment (d = 1.52), with group differences diminished but statistically significant at day 8 (d = 0.39) and nonsignificant at day 15. 
Effect sizes for the SAC and BESS
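    A reliable change index of the kind derived in this study divides the observed score change by the standard error of a difference score. The sketch below uses the common Jacobson-Truax formulation; the baseline score, normative SD, and test-retest reliability are hypothetical placeholders, not the SCAT3 cutoffs reported in the paper:

```python
import math

def reliable_change_index(baseline, retest, sd_baseline, test_retest_r):
    """Jacobson-Truax style RCI: observed change divided by the
    standard error of the difference between two measurements."""
    sem = sd_baseline * math.sqrt(1.0 - test_retest_r)  # standard error of measurement
    se_diff = sem * math.sqrt(2.0)                      # SE of a difference score
    return (retest - baseline) / se_diff

# Hypothetical symptom-checklist example: baseline 5, 24-hour post-injury 15,
# normative SD 6, test-retest reliability 0.6
rci = reliable_change_index(5, 15, 6.0, 0.6)
print(round(rci, 2))  # → 1.86; |RCI| > 1.96 would flag a reliable change
```

    An athlete whose change score exceeds the RCI cutoff is flagged as showing change beyond measurement error, which is what allows the tool to be applied without a preinjury baseline when normative values are substituted.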

  8. The interrater and test-retest reliability of the Home Falls and Accidents Screening Tool (HOME FAST) in Malaysia: Using raters with a range of professional backgrounds.

    Science.gov (United States)

    Romli, Muhammad Hibatullah; Mackenzie, Lynette; Lovarini, Meryl; Tan, Maw Pin; Clemson, Lindy

    2017-06-01

    Falls can be a devastating issue for older people living in the community, including those living in Malaysia. Health professionals and community members have a responsibility to ensure that older people have a safe home environment to reduce the risk of falls. Using a standardised screening tool is beneficial to intervene early with this group. The Home Falls and Accidents Screening Tool (HOME FAST) should be considered for this purpose; however, its use in Malaysia has not been studied. Therefore, the aim of this study was to evaluate the interrater and test-retest reliability of the HOME FAST with multiple professionals in the Malaysian context. A cross-sectional design was used to evaluate interrater reliability where the HOME FAST was used simultaneously in the homes of older people by 2 raters and a prospective design was used to evaluate test-retest reliability with a separate group of older people at different times in their homes. Both studies took place in an urban area of Kuala Lumpur. Professionals from 9 professional backgrounds participated as raters in this study, and a group of 51 community older people were recruited for the interrater reliability study and another group of 30 for the test-retest reliability study. The overall agreement was moderate for interrater reliability and good for test-retest reliability. The HOME FAST was consistently rated by different professionals, and no bias was found among the multiple raters. The HOME FAST can be used with confidence by a variety of professionals across different settings. The HOME FAST can become a universal tool to screen for home hazards related to falls. © 2017 John Wiley & Sons, Ltd.

  9. A design tool to study the impact of mission-profile on the reliability of SiC-based PV-inverter devices

    DEFF Research Database (Denmark)

    Sintamarean, Nicolae Cristian; Wang, Huai; Blaabjerg, Frede

    2014-01-01

    This paper introduces a reliability-oriented design tool for a new generation of grid-connected PV-inverters. The proposed design tool consists of a real-field mission-profile model (for one year of operation in Arizona, USA), a PV-panel model, a grid-connected PV-inverter model, an electro-thermal model and a lifetime model of the power semiconductor devices. A simulation model able to consider one year of real-field operating conditions (solar irradiance and ambient temperature) is developed. Thus, a one-year estimate of the thermal loading distribution of the converter devices is achieved and is further used as an input to the lifetime model. The proposed reliability-oriented design tool is used to study the impact of the mission profile (MP) and device degradation (aging) on the PV-inverter lifetime. The obtained results indicate that the MP of the field where the PV-inverter operates has an important impact.

  10. Design for Reliability and Robustness Tool Platform for Power Electronic Systems – Study Case on Motor Drive Applications

    DEFF Research Database (Denmark)

    Vernica, Ionut; Wang, Huai; Blaabjerg, Frede

    2018-01-01

    Because of the high cost of failure, the reliability performance of power semiconductor devices is becoming an increasingly important and stringent requirement in many energy conversion applications. Thus, the need for appropriate reliability analysis of the power electronics emerges. Due to its...

  11. Life cycle assessment as an analytical tool in strategic environmental assessment. Lessons learned from a case study on municipal energy planning in Sweden

    International Nuclear Information System (INIS)

    Björklund, Anna

    2012-01-01

    Life cycle assessment (LCA) is explored as an analytical tool in strategic environmental assessment (SEA), illustrated by a case where a previously developed SEA process was applied to municipal energy planning in Sweden. The process integrated decision-making tools for scenario planning, public participation and environmental assessment. This article describes the use of LCA for environmental assessment in this context, with a focus on methodology and practical experiences. While LCA provides a systematic framework for the environmental assessment and a wider systems perspective than what is required in SEA, LCA cannot address all aspects of environmental impact required, and therefore needs to be complemented by other tools. The integration of LCA with tools for public participation and scenario planning posed certain methodological challenges, but provided an innovative approach to designing the scope of the environmental assessment and defining and assessing alternatives. - Research highlights: ► LCA was explored as analytical tool in an SEA process of municipal energy planning. ► The process also integrated LCA with scenario planning and public participation. ► Benefits of using LCA were a systematic framework and wider systems perspective. ► Integration of tools required some methodological challenges to be solved. ► This proved an innovative approach to define alternatives and scope of assessment.

  12. Assessing physiotherapists' communication skills for promoting patient autonomy for self-management: reliability and validity of the communication evaluation in rehabilitation tool.

    Science.gov (United States)

    Murray, Aileen; Hall, Amanda; Williams, Geoffrey C; McDonough, Suzanne M; Ntoumanis, Nikos; Taylor, Ian; Jackson, Ben; Copsey, Bethan; Hurley, Deirdre A; Matthews, James

    2018-02-27

    To assess the inter-rater reliability and concurrent validity of the Communication Evaluation in Rehabilitation Tool, which aims to externally assess physiotherapists' competency in using Self-Determination Theory-based communication strategies in practice. Audio recordings of initial consultations between 24 physiotherapists and 24 patients with chronic low back pain in four hospitals in Ireland were obtained as part of a larger randomised controlled trial. Three raters, all of whom had PhDs in psychology and expertise in motivation and physical activity, independently listened to the 24 audio recordings and completed the 18-item Communication Evaluation in Rehabilitation Tool. Inter-rater reliability between all three raters was assessed using intraclass correlation coefficients. Concurrent validity was assessed using Pearson's r correlations with a reference standard, the Health Care Climate Questionnaire. The total score for the Communication Evaluation in Rehabilitation Tool is an average of all 18 items. Total scores demonstrated good inter-rater reliability (Intraclass Correlation Coefficient (ICC) = 0.8) and concurrent validity with the Health Care Climate Questionnaire total score (range: r = 0.7-0.88). Item-level scores of the Communication Evaluation in Rehabilitation Tool identified five items that need improvement. Results provide preliminary evidence to support future use and testing of the Communication Evaluation in Rehabilitation Tool. Implications for Rehabilitation: Promoting patient autonomy is a learned skill and while interventions exist to train clinicians in these skills, there are no tools to assess how well clinicians use these skills when interacting with a patient. The lack of robust assessment has severe implications regarding both the fidelity of clinician training packages and resulting outcomes for promoting patient autonomy. This study has developed a novel measurement tool, the Communication Evaluation in Rehabilitation Tool, and a
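    The intraclass correlation reported above can be reproduced from the standard two-way ANOVA decomposition. Below is a minimal sketch of the average-measures, absolute-agreement form, often written ICC(2,k); the 3-consultation, 2-rater matrix is a hypothetical toy example, not the study's three-rater data:

```python
def icc_2k(ratings):
    """Average-measures intraclass correlation, two-way random effects,
    absolute agreement (ICC(2,k)), from a subjects-by-raters matrix."""
    n, k = len(ratings), len(ratings[0])  # subjects, raters
    grand = sum(sum(row) for row in ratings) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    col_means = [sum(r[j] for r in ratings) / n for j in range(k)]
    ss_total = sum((x - grand) ** 2 for row in ratings for x in row)
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)   # between subjects
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)   # between raters
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (ms_cols - ms_err) / n)

# Hypothetical ratings: 3 consultations each scored by 2 raters
print(round(icc_2k([[1, 2], [3, 3], [5, 6]]), 2))  # → 0.96
```

    Note that a systematic offset between raters lowers the absolute-agreement form but not a consistency-type ICC; which form was used should always be stated alongside the coefficient.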

  13. Using image analysis as a tool for assessment of prognostic and predictive biomarkers for breast cancer: How reliable is it?

    Directory of Open Access Journals (Sweden)

    Mark C Lloyd

    2010-01-01

    Full Text Available Background: Estrogen receptor (ER), progesterone receptor (PR) and human epidermal growth factor receptor-2 (HER2) are important and well-established prognostic and predictive biomarkers for breast cancers and routinely tested on patients' tumor samples by immunohistochemical (IHC) study. The accuracy of these test results has substantial impact on patient management. A critical factor that contributes to the result is the interpretation (scoring) of IHC. This study investigates how computerized image analysis can play a role in a reliable scoring, and identifies potential pitfalls with common methods. Materials and Methods: Whole slide images of 33 invasive ductal carcinomas (IDC) (10 ER and 23 HER2) were scored by a pathologist under the light microscope and confirmed by another pathologist. The HER2 results were additionally confirmed by fluorescence in situ hybridization (FISH). The scoring criteria were adherent to the guidelines recommended by the American Society of Clinical Oncology/College of American Pathologists. Whole slide stains were then scored by commercially available image analysis algorithms from Definiens (Munich, Germany) and Aperio Technologies (Vista, CA, USA). Each algorithm was modified specifically for each marker and tissue. The results were compared with the semi-quantitative manual scoring, which was considered the gold standard in this study. Results: For the HER2-positive group, each algorithm scored 23/23 cases within the range established by the pathologist. For ER, both algorithms scored 10/10 cases within range. The performance of each algorithm varies somewhat from the percentage of staining as compared to the pathologist's reading. Conclusions: Commercially available computerized image analysis can be useful in the evaluation of ER and HER2 IHC results. In order to achieve accurate results, either manual pathologist region selection is necessary, or an automated region selection tool must be employed. Specificity can

  14. Can the second order multireference perturbation theory be considered a reliable tool to study mixed-valence compounds?

    Science.gov (United States)

    Pastore, Mariachiara; Helal, Wissam; Evangelisti, Stefano; Leininger, Thierry; Malrieu, Jean-Paul; Maynau, Daniel; Angeli, Celestino; Cimiraglia, Renzo

    2008-05-07

    In this paper, the problem of the calculation of the electronic structure of mixed-valence compounds is addressed in the frame of multireference perturbation theory (MRPT). Using a simple mixed-valence compound (the 5,5′(4H,4H′)-spirobi[cyclopenta[c]pyrrole] 2,2′,6,6′-tetrahydro cation) and the n-electron valence state perturbation theory (NEVPT2) and CASPT2 approaches, it is shown that the ground state (GS) energy curve presents an unphysical "well" for nuclear coordinates close to the symmetric case, where a maximum is expected. For NEVPT, the correct shape of the energy curve is retrieved by applying the MRPT at the (computationally expensive) third order. This behavior is rationalized using a simple model (the ionized GS of two weakly interacting identical systems, each neutral system being described by two electrons in two orbitals), showing that the unphysical well is due to the canonical orbital energies which, at the symmetric (delocalized) conformation, lead to a sudden modification of the denominators in the perturbation expansion. In this model, the bias introduced in the second order correction to the energy is almost entirely removed going to the third order. With the results of the model in mind, one can predict that all MRPT methods in which the zero order Hamiltonian is based on canonical orbital energies are prone to present unreasonable energy profiles close to the symmetric situation. However, the model allows a strategy to be devised which can give a correct behavior even at the second order, by simply averaging the orbital energies of the two charge-localized electronic states. Such a strategy is adopted in a NEVPT2 scheme, obtaining good agreement with the third order results based on the canonical orbital energies. The answer to the question reported in the title (is this theoretical approach a reliable tool for a correct description of these systems?) is therefore positive, but care must be exercised, either in defining the orbital

  15. On-line near infrared spectroscopy as a Process Analytical Technology (PAT) tool to control an industrial seeded API crystallization.

    Science.gov (United States)

    Schaefer, C; Lecomte, C; Clicq, D; Merschaert, A; Norrant, E; Fotiadu, F

    2013-09-01

    The final step of an active pharmaceutical ingredient (API) manufacturing synthesis process consists of a crystallization during which the API and residual solvent contents have to be quantified precisely in order to reach a predefined seeding point. A feasibility study was conducted to demonstrate the suitability of on-line NIR spectroscopy to control this step in line with the new version of the European Medicines Agency (EMA) guideline [1]. A quantitative method was developed at laboratory scale using statistical design of experiments (DOE) and multivariate data analysis such as principal component analysis (PCA) and partial least squares (PLS) regression. NIR models were built to quantify the API in the range of 9-12% (w/w) and to quantify the residual methanol in the range of 0-3% (w/w). To improve the predictive ability of the models, the development procedure encompassed outlier elimination, optimum model rank definition, and spectral range and spectral pre-treatment selection. Conventional criteria such as the number of PLS factors, R(2), and root mean square errors of calibration, cross-validation and prediction (RMSEC, RMSECV, RMSEP) enabled the selection of three model candidates. These models were tested in the industrial pilot plant during three technical campaigns. Results of the most suitable models were evaluated against the chromatographic reference methods. A maximum relative bias of 2.88% was obtained for the API target content. Absolute biases of 0.01% and 0.02% (w/w), respectively, were achieved at methanol content levels of 0.10% and 0.13% (w/w). The repeatability was assessed as sufficient for the on-line monitoring of the two analytes. The present feasibility study confirmed the possibility of using on-line NIR spectroscopy as a PAT tool to monitor in real time both the API and the residual methanol contents, in order to control the seeding of an API crystallization at industrial scale. Furthermore, the successful scale-up of the method proved its capability to be

  16. On Bayesian System Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Soerensen Ringi, M

    1995-05-01

    The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so-called frequentistic school. A new model for system reliability prediction is given in two papers. The model encloses the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non-identical environments. 85 refs.
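    The Bayesian idea that reliability estimates are revised as new relevant information arrives can be illustrated with the standard conjugate Beta-Binomial model for a demand failure probability. This is a generic textbook sketch, not the dependence model proposed in the thesis, and the prior and observed counts are hypothetical:

```python
def update_beta(alpha, beta, failures, trials):
    """Conjugate Beta-Binomial update: a Beta(alpha, beta) prior on the
    failure probability, revised by observed failures in a number of demands."""
    return alpha + failures, beta + (trials - failures)

# Weakly informative prior with mean 0.1 (hypothetical engineering judgement)
a, b = 1.0, 9.0
# New relevant information: 2 failures observed in 50 demands
a, b = update_beta(a, b, failures=2, trials=50)
print(a, b, a / (a + b))  # posterior Beta(3, 57), mean 0.05
```

    The posterior mean moves from the prior's 0.1 toward the observed failure rate of 0.04, and each further batch of evidence can be folded in by the same update, which is the sense in which reliability "changes as the state of knowledge changes."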

  17. On Bayesian System Reliability Analysis

    International Nuclear Information System (INIS)

    Soerensen Ringi, M.

    1995-01-01

    The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so called frequentistic school. A new model for system reliability prediction is given in two papers. The model encloses the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non identical environments. 85 refs

  18. Big data analytics from strategic planning to enterprise integration with tools, techniques, NoSQL, and graph

    CERN Document Server

    Loshin, David

    2013-01-01

    Big Data Analytics will assist managers in providing an overview of the drivers for introducing big data technology into the organization and for understanding the types of business problems best suited to big data analytics solutions, understanding the value drivers and benefits, strategic planning, developing a pilot, and eventually planning to integrate back into production within the enterprise. Guides the reader in assessing the opportunities and value proposition. Overview of big data hardware and software architectures. Presents a variety of te

  19. Reliability of Patient-Led Screening with the Malnutrition Screening Tool: Agreement between Patient and Health Care Professional Scores in the Cancer Care Ambulatory Setting.

    Science.gov (United States)

    Di Bella, Alexandra; Blake, Claire; Young, Adrienne; Pelecanos, Anita; Brown, Teresa

    2018-02-01

    The prevalence of malnutrition in patients with cancer is reported as high as 60% to 80%, and malnutrition is associated with lower survival, reduced response to treatment, and poorer functional status. The Malnutrition Screening Tool (MST) is a validated tool when administered by health care professionals; however, it has not been evaluated for patient-led screening. This study aims to assess the reliability of patient-led MST screening through assessment of inter-rater reliability between patient-led and dietitian-researcher-led screening and intra-rater reliability between an initial and a repeat patient screening. This cross-sectional study included 208 adults attending ambulatory cancer care services in a metropolitan teaching hospital in Queensland, Australia, in October 2016 (n=160 inter-rater reliability; n=48 intra-rater reliability measured in a separate sample). Primary outcome measures were MST risk categories (MST 0-1: not at risk, MST ≥2: at risk) as determined by screening completed by patients and a dietitian-researcher, patient test-retest screening, and patient acceptability. Percent and chance-corrected agreement (Cohen's kappa coefficient, κ) were used to determine agreement between patient-MST and dietitian-MST (inter-rater reliability) and MST completed by patient on admission to unit (patient-MSTA) and MST completed by patient 1 to 3 hours after completion of initial MST (patient-MSTB) (intra-rater reliability). High inter-rater reliability and intra-rater reliability were observed. Agreement between patient-MST and dietitian-MST was 96%, with "almost perfect" chance-adjusted agreement (κ=0.92, 95% CI 0.84 to 0.97). Agreement between repeated patient-MSTA and patient-MSTB was 94%, with "almost perfect" chance-adjusted agreement (κ=0.88, 95% CI 0.71 to 1.00). Based on dietitian-MST, 33% (n=53) of patients were identified as being at risk for malnutrition, and 40% of these reported not seeing a dietitian. Of 156 patients who provided
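    Cohen's kappa, the chance-corrected agreement statistic reported above, can be computed in a few lines. The ten risk labels below are a hypothetical toy sample, not data from the study:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two categorical raters:
    (observed agreement - expected-by-chance) / (1 - expected-by-chance)."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    count_a, count_b = Counter(rater_a), Counter(rater_b)
    expected = sum(count_a[c] * count_b[c] for c in count_a | count_b) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical screening labels: 0 = not at risk (MST 0-1), 1 = at risk (MST >= 2)
patient   = [0, 0, 1, 1, 0, 1, 0, 0, 1, 0]
dietitian = [0, 0, 1, 1, 0, 1, 0, 1, 1, 0]
print(round(cohens_kappa(patient, dietitian), 2))  # → 0.8
```

    Raw percent agreement (here 9/10) overstates concordance when one category dominates, which is why the study reports kappa alongside it.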

  20. Real-time particle size analysis using focused beam reflectance measurement as a process analytical technology tool for a continuous granulation-drying-milling process.

    Science.gov (United States)

    Kumar, Vijay; Taylor, Michael K; Mehrotra, Amit; Stagner, William C

    2013-06-01

    Focused beam reflectance measurement (FBRM) was used as a process analytical technology tool to perform inline real-time particle size analysis of a proprietary granulation manufactured using a continuous twin-screw granulation-drying-milling process. A significant relationship between D20, D50, and D80 length-weighted chord length and sieve particle size was observed (p < 0.05).
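    A length-weighted percentile such as D50 can be sketched as the chord length below which the stated fraction of the total (length-weighted) chord mass lies. The helper below and its micron values are illustrative assumptions, not the instrument's proprietary algorithm:

```python
def length_weighted_percentile(chords, p):
    """Dxx of a chord-length sample: smallest chord length at which the
    cumulative length reaches p% of the total length."""
    ordered = sorted(chords)
    threshold = p / 100.0 * sum(ordered)
    cumulative = 0.0
    for c in ordered:
        cumulative += c
        if cumulative >= threshold:
            return c
    return ordered[-1]

# Hypothetical chord lengths in microns
chords = [10, 20, 20, 40, 50, 80, 120]
print(length_weighted_percentile(chords, 20),
      length_weighted_percentile(chords, 50))  # → 40 80
```

    Weighting by length biases the statistic toward longer chords, which is one reason FBRM percentiles track but do not equal sieve sizes and must be correlated empirically, as in the study.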

  1. Reliability and validity of a tool to measure the severity of tongue thrust in children: the Tongue Thrust Rating Scale.

    Science.gov (United States)

    Serel Arslan, S; Demir, N; Karaduman, A A

    2017-02-01

    This study aimed to develop a scale called the Tongue Thrust Rating Scale (TTRS), which categorised tongue thrust in children in terms of its severity during swallowing, and to investigate its validity and reliability. The study describes the developmental phase of the TTRS and presents its content and criterion-based validity and its interobserver and intra-observer reliability. For content validation, seven experts assessed the steps in the scale over two Delphi rounds. Two physical therapists evaluated videos of 50 children with cerebral palsy (mean age, 57·9 ± 16·8 months), using the TTRS to test criterion-based validity, interobserver and intra-observer reliability. The Karaduman Chewing Performance Scale (KCPS) and Drooling Severity and Frequency Scale (DSFS) were used for criterion-based validity. All the TTRS steps were deemed necessary. The content validity index was 0·857. A very strong positive correlation was found between two examinations by one physical therapist, indicating intra-observer reliability (r = 0·938), and between the two physical therapists, indicating interobserver reliability (r = 0·892), and correlations with the KCPS and DSFS supported the criterion-based validity of the TTRS. The TTRS is a valid, reliable and clinically easy-to-use functional instrument to document the severity of tongue thrust in children. © 2016 John Wiley & Sons Ltd.

  2. The Managing Emergencies in Paediatric Anaesthesia global rating scale is a reliable tool for simulation-based assessment in pediatric anesthesia crisis management.

    Science.gov (United States)

    Everett, Tobias C; Ng, Elaine; Power, Daniel; Marsh, Christopher; Tolchard, Stephen; Shadrina, Anna; Bould, Matthew D

    2013-12-01

    The use of simulation-based assessments for high-stakes physician examinations remains controversial. The Managing Emergencies in Paediatric Anaesthesia course uses simulation to teach evidence-based management of anesthesia crises to trainee anesthetists in the United Kingdom (UK) and Canada. In this study, we investigated the feasibility and reliability of custom-designed scenario-specific performance checklists and a global rating scale (GRS) assessing readiness for independent practice. After research ethics board approval, subjects were videoed managing simulated pediatric anesthesia crises in a single Canadian teaching hospital. Each subject was randomized to two of six different scenarios. All 60 scenarios were subsequently rated by four blinded raters (two in the UK, two in Canada) using the checklists and GRS. The actual and predicted reliability of the tools was calculated for different numbers of raters using the intraclass correlation coefficient (ICC) and the Spearman-Brown prophecy formula. Average measures ICCs ranged from 'substantial' to 'near perfect' (P ≤ 0.001). The reliability of the checklists and the GRS was similar. Single measures ICCs showed more variability than average measures ICCs. At least two raters would be required to achieve acceptable reliability. We have established the reliability of a GRS to assess the management of simulated crisis scenarios in pediatric anesthesia, and this tool is feasible within the setting of a research study. The global rating scale allows raters to make a judgement regarding a participant's readiness for independent practice. These tools may be used in future research examining simulation-based assessment. © 2013 John Wiley & Sons Ltd.
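    The Spearman-Brown prophecy formula used here to predict reliability for different numbers of raters is a one-liner. The single-rater ICC of 0.67 below is a hypothetical value, not one estimated in the study:

```python
def spearman_brown(r_single, k):
    """Predicted reliability of a score averaged over k raters,
    given the single-rater reliability r_single."""
    return k * r_single / (1 + (k - 1) * r_single)

# With a hypothetical single-measures ICC of 0.67, two raters already
# push the average-measures reliability past 0.80:
for k in (1, 2, 3):
    print(k, round(spearman_brown(0.67, k), 3))  # → 0.67, 0.802, 0.859
```

    Inverting the formula for a target reliability answers the practical question asked in the paper: how many raters are needed before an examination score is defensible.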

  3. COMPI Fertility Problem Stress Scales is a brief, valid and reliable tool for assessing stress in patients seeking treatment

    DEFF Research Database (Denmark)

    Sobral, Maria P.; Costa, Maria E.; Schmidt, Lone

    2017-01-01

    STUDY QUESTION: Are the Copenhagen Multi‐Centre Psychosocial Infertility research program Fertility Problem Stress Scales (COMPI-FPSS) a reliable and valid measure across gender and culture? SUMMARY ANSWER: The COMPI-FPSS is a valid and reliable measure, presenting excellent or good fit, and allows comparability of fertility-related stress across genders and countries. STUDY DESIGN, SIZE, DURATION: Cross-sectional study. First, we tested the structure of the COMPI-FPSS. Then, reliability and validity (convergent and discriminant) were examined for the final model. Finally, measurement invariance both across genders and cultures was tested. PARTICIPANTS/MATERIALS, SETTING, METHODS: Our final sample had 3923 fertility patients (1691 men and 2232 women) recruited in clinical settings from seven different countries: Denmark, China, Croatia, Germany, Greece, Hungary and Sweden. Participants had a mean age of 34…

  4. Electrospray ionization and matrix assisted laser desorption/ionization mass spectrometry: powerful analytical tools in recombinant protein chemistry

    DEFF Research Database (Denmark)

    Andersen, Jens S.; Svensson, B; Roepstorff, P

    1996-01-01

    Electrospray ionization and matrix assisted laser desorption/ionization are effective ionization methods for mass spectrometry of biomolecules. Here we describe the capabilities of these methods for peptide and protein characterization in biotechnology. An integrated analytical strategy is presented encompassing protein characterization prior to and after cloning of the corresponding gene.

  5. Evaluation of structural reliability using simulation methods

    Directory of Open Access Journals (Sweden)

    Baballëku Markel

    2015-01-01

    Full Text Available Eurocode describes the 'index of reliability' as a measure of structural reliability, related to the 'probability of failure'. This paper is focused on the assessment of this index for a reinforced concrete bridge pier. It is rare to explicitly use reliability concepts for design of structures, but the problems of structural engineering are better known through them. Some of the main methods for the estimation of the probability of failure are the exact analytical integration, numerical integration, approximate analytical methods and simulation methods. Monte Carlo Simulation is used in this paper, because it offers a very good tool for the estimation of probability in multivariate functions. Complicated probability and statistics problems are solved through computer aided simulations of a large number of tests. The procedures of structural reliability assessment for the bridge pier and the comparison with the partial factor method of the Eurocodes have been demonstrated in this paper.
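    The Monte Carlo estimation described above can be sketched with a toy limit-state function g = R - S; the normal distributions below are illustrative assumptions, not the bridge-pier model from the paper:

```python
import random
import statistics

def mc_failure_probability(n_samples, seed=1):
    """Crude Monte Carlo estimate of P(g < 0) for the limit state
    g = R - S, with resistance R ~ N(5.0, 0.8) and load S ~ N(3.0, 0.6)."""
    rng = random.Random(seed)
    failures = sum(
        1 for _ in range(n_samples)
        if rng.gauss(5.0, 0.8) - rng.gauss(3.0, 0.6) < 0.0
    )
    return failures / n_samples

pf = mc_failure_probability(200_000)
# For this toy case g is itself normal: mean 2.0, sd sqrt(0.8**2 + 0.6**2) = 1.0,
# so the exact answer is pf = Phi(-2) ~ 0.0228 and the reliability index is beta = 2.
beta = -statistics.NormalDist().inv_cdf(pf)
print(round(pf, 3), round(beta, 2))
```

    The simulated probability of failure maps back to the Eurocode index of reliability through the inverse normal CDF; for realistic structures pf is far smaller, which is why variance-reduction or importance sampling is usually added to plain Monte Carlo.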

  6. I-HASTREAM : density-based hierarchical clustering of big data streams and its application to big graph analytics tools

    NARCIS (Netherlands)

    Hassani, M.; Spaus, P.; Cuzzocrea, A.; Seidl, T.

    2016-01-01

    Big data streams are now very popular, stirred up by a plethora of modern applications such as sensor networks, scientific computing tools, Web intelligence, social network analysis and mining tools, and so forth. Here, the main research issue consists in how to effectively and efficiently...

  7. Optimization of IC/HPLC as a rapid analytical tool for characterization of total impurities in UO2

    International Nuclear Information System (INIS)

    Kelkar, A.G.; Kapoor, Y.S.; Mahanty, B.N.; Fulzele, A.K.; Mallik, G.K.

    2007-01-01

    Use of ion chromatography in the determination of metallic and non-metallic impurities has been studied and found to be very satisfactory. In the present paper, the total analysis time was monitored in all these experiments and compared with that of conventional analytical techniques. (author)

  8. Methodological framework, analytical tool and database for the assessment of climate change impacts, adaptation and vulnerability in Denmark

    DEFF Research Database (Denmark)

    Kaspersen, Per Skougaard; Halsnæs, Kirsten; Gregg, Jay Sterling

    . The project is one of seven initiatives proposed by KFT for 2012. The methodology report includes definitions of major concepts, an outline of an analytical structure, a presentation of models and their applicability, and the results of case studies. The work presented in this report draws on intensive...

  9. The Victorian Institute of Sports Assessment - Achilles Questionnaire (VISA-A) - a reliable tool for measuring Achilles tendinopathy

    DEFF Research Database (Denmark)

    Iversen, Jonas Vestergård; Bartels, Else Marie; Langberg, Henning

    2012-01-01

    Achilles tendinopathy (AT) is a common pathology whose aetiology is unknown. For valid and reliable assessment, The Victorian Institute of Sports Assessment has designed a self-administered Achilles questionnaire, the VISA-A. The aim of the present study was to evaluate the VISA-A as an outcome...

  10. Public Interest Energy Research (PIER) Program Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Tengfang [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Flapper, Joris [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ke, Jing [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kramer, Klaas [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sathaye, Jayant [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-02-01

    The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry - including four dairy processes - cheese, fluid milk, butter, and milk powder. BEST-Dairy tool developed in this project provides three options for the user to benchmark each of the dairy product included in the tool, with each option differentiated based on specific detail level of process or plant, i.e., 1) plant level; 2) process-group level, and 3) process-step level. For each detail level, the tool accounts for differences in production and other variables affecting energy use in dairy processes. The dairy products include cheese, fluid milk, butter, milk powder, etc. The BEST-Dairy tool can be applied to a wide range of dairy facilities to provide energy and water savings estimates, which are based upon the comparisons with the best available reference cases that were established through reviewing information from international and national samples. We have performed and completed alpha- and beta-testing (field testing) of the BEST-Dairy tool, through which feedback from voluntary users in the U.S. dairy industry was gathered to validate and improve the tool's functionality. BEST-Dairy v1.2 was formally published in May 2011, and has been made available for free downloads from the internet (i.e., http://best-dairy.lbl.gov). A user's manual has been developed and published as the companion documentation for use with the BEST-Dairy tool. In addition, we also carried out technology transfer activities by engaging the dairy industry in the process of tool development and testing, including field testing, technical presentations, and technical assistance throughout the project. To date, users from more than ten countries in addition to those in the U.S. have downloaded the BEST-Dairy from the LBNL website. It is expected that the use of BEST-Dairy tool will advance understanding of energy and

  11. Pain Assessment in Critically Ill Adult Patients: Validity and Reliability Research of the Turkish Version of the Critical-Care Pain Observation Tool

    Directory of Open Access Journals (Sweden)

    Onur Gündoğan

    2016-12-01

    Full Text Available Objective: The Critical-Care Pain Observation Tool (CPOT) and the Behavioral Pain Scale (BPS) are behavioral pain assessment scales for unconscious intensive care unit (ICU) patients. The aim was to determine the validity and reliability of the Turkish CPOT in mechanically ventilated adult ICU patients. Material and Method: This prospective observational cohort study included 50 mechanically ventilated mixed ICU patients who were unable to report pain. The CPOT and BPS were translated into Turkish, and language validity was established by ten intensive care specialists. Pain was assessed during painless and painful routine care procedures using the CPOT and the BPS by a resident and an intensivist concomitantly. Test reliability, interrater reliability, and validity of the CPOT and the BPS were evaluated. Results: The mean age was 57.4 years and the mean APACHE II score was 18.7. A total of 100 assessments were recorded from 50 patients using the CPOT and BPS. Scores of both the CPOT and BPS were significantly higher during painful procedures than during painless procedures. The agreement between the CPOT and BPS during painful and painless stimuli ranged as follows: sensitivity 66.7%-90.3%; specificity 89.7%-97.9%; kappa value 0.712-0.892. The agreement between resident and intensivist during painful and painless stimuli ranged from 97% to 100%, with kappa values between 0.904 and 1.0. Conclusion: The Turkish version of the CPOT showed good correlation with the BPS. Interrater reliability between resident and intensivist was good. The study showed that the Turkish versions of the BPS and CPOT are reliable and valid tools for assessing pain in daily clinical practice in intubated, unconscious, mechanically ventilated ICU patients.
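The kappa values reported above measure chance-corrected agreement between two raters. A minimal sketch of unweighted Cohen's kappa (the toy ratings are illustrative, not the study's CPOT scores):

```python
def cohens_kappa(rater_a, rater_b):
    """Unweighted Cohen's kappa for two raters' categorical scores:
    observed agreement po corrected by chance agreement pe."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    cats = sorted(set(rater_a) | set(rater_b))
    # observed proportion of exact agreement
    po = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # chance agreement from each rater's marginal frequencies
    pe = sum((rater_a.count(c) / n) * (rater_b.count(c) / n) for c in cats)
    return (po - pe) / (1 - pe)
```

Perfect agreement gives kappa = 1, while agreement no better than chance gives kappa = 0.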

  12. Is the Nottingham Health Profile a reliable tool to measure quality of life of Filipinos with chronic kidney disease undergoing hemodialysis?

    Science.gov (United States)

    Chuku, Chika Lawson; Valdez, Josephine R; Ajonuma, Louis Chukwuemeka

    2010-12-01

    The quality of life (QOL) of hemodialysis patients is often compromised, and many tools have been developed to assess the health-related QOL of chronic kidney disease (CKD) patients undergoing hemodialysis. However, no such tool is currently in use in the Philippines. The objective of this study is to determine whether the Nottingham Health Profile (NHP) can be a useful tool in the Philippines. Eighty patients undergoing hemodialysis in the dialysis unit of our hospital were enrolled in this study; sixty-nine patients completed it. Comparative analysis revealed a significant difference in social isolation, with a favorable result for the Filipino patients. Other measures correlated well, although the differences were not statistically significant. The NHP can be successfully applied as a standard QOL tool in the Philippines. However, it should be translated into Filipino to avoid language difficulty. The NHP may be recommended for QOL determination in other developing countries.

  13. Reliability calculations

    International Nuclear Information System (INIS)

    Petersen, K.E.

    1986-03-01

    Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during the operation time, with the purpose to improve the safety or the reliability. Due to plant complexity and safety and availability requirements, sophisticated tools, which are flexible and efficient, are needed. Such tools have been developed in the last 20 years and they have to be continuously refined to meet the growing requirements. Two different areas of application were analysed. In structural reliability probabilistic approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed based upon numerical integration in several variables. In systems reliability Monte Carlo simulation programs are used especially in analysis of very complex systems. In order to increase the applicability of the programs variance reduction techniques can be applied to speed up the calculation process. Variance reduction techniques have been studied and procedures for implementation of importance sampling are suggested. (author)
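Importance sampling, the variance-reduction technique the abstract singles out, concentrates samples in the failure region and reweights each hit by the likelihood ratio between the target and proposal densities. A minimal sketch for a toy normal tail probability (an illustration of the technique, not the report's program):

```python
import math
import random

def importance_sampling_pf(n=50_000, shift=3.0, seed=2):
    """Importance-sampling estimate of pf = P(X > 3) for X ~ N(0, 1):
    draw from a proposal N(shift, 1) centred in the failure region and
    weight each failing sample by f(x)/h(x)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(shift, 1.0)      # proposal draw
        if x > 3.0:                    # failure indicator
            # ratio of standard-normal density to proposal density
            total += math.exp(-0.5 * x * x + 0.5 * (x - shift) ** 2)
    return total / n
```

The exact value is 1 − Φ(3) ≈ 1.35 × 10⁻³. Crude Monte Carlo with the same n would see only about 67 failures, whereas the shifted proposal lands in the failure region roughly half the time, which is why the weighted estimator has far lower variance.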

  14. Evaluation of the quality of results obtained in institutions participating in interlaboratory experiments and of the reliability characteristics of the analytical methods used on the basis of certification of standard soil samples

    Energy Technology Data Exchange (ETDEWEB)

    Parshin, A.K.; Obol'yaninova, V.G.; Sul'dina, N.P.

    1986-08-20

    Rapid monitoring of the level of pollution of the environment and, especially, of soils necessitates preparation of standard samples (SS) close in properties and material composition to the objects to be analyzed. During 1978-1982, four sets (three types of samples in each) of State Standard Samples of different soils were developed: soddy-podzolic sandy-loamy, typical chernozem, krasnozem, and calcareous sierozem. The certification studies of the SS of the soils were carried out in accordance with the classical scheme of an interlaboratory experiment (ILE). More than 100 institutions were involved in the ILE, and the total number of independent analytical results was of the order of 10^4. With such a volume of analytical information at their disposal, the authors were able to find some general characteristics intrinsic to certification studies, to assess the quality of work of the ILE participants with due regard for their specialization, and to assess the reliability characteristics of the analytical methods used.

  15. Facial Angiofibroma Severity Index (FASI): reliability assessment of a new tool developed to measure severity and responsiveness to therapy in tuberous sclerosis-associated facial angiofibroma.

    Science.gov (United States)

    Salido-Vallejo, R; Ruano, J; Garnacho-Saucedo, G; Godoy-Gijón, E; Llorca, D; Gómez-Fernández, C; Moreno-Giménez, J C

    2014-12-01

    Tuberous sclerosis complex (TSC) is an autosomal dominant neurocutaneous disorder characterized by the development of multisystem hamartomatous tumours. Topical sirolimus has recently been suggested as a potential treatment for TSC-associated facial angiofibroma (FA). To validate a reproducible scale created for the assessment of clinical severity and treatment response in these patients. We developed a new tool, the Facial Angiofibroma Severity Index (FASI) to evaluate the grade of erythema and the size and extent of FAs. In total, 30 different photographs of patients with TSC were shown to 56 dermatologists at each evaluation. Three evaluations using the same photographs but in a different random order were performed 1 week apart. Test and retest reliability and interobserver reproducibility were determined. There was good agreement between the investigators. Inter-rater reliability showed strong correlations (> 0.98; range 0.97-0.99) with inter-rater correlation coefficients (ICCs) for the FASI. The global estimated kappa coefficient for the degree of intra-rater agreement (test-retest) was 0.94 (range 0.91-0.97). The FASI is a valid and reliable tool for measuring the clinical severity of TSC-associated FAs, which can be applied in clinical practice to evaluate the response to treatment in these patients. © 2014 British Association of Dermatologists.

  16. Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Tengfang [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Flapper, Joris [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ke, Jing [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kramer, Klaas [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sathaye, Jayant [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-02-01

    The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry – including four dairy processes – cheese, fluid milk, butter, and milk powder.

  17. Measuring Outcomes for Dysphagia: Validity and Reliability of the European Portuguese Eating Assessment Tool (P-EAT-10).

    Science.gov (United States)

    Nogueira, Dália Santos; Ferreira, Pedro Lopes; Reis, Elizabeth Azevedo; Lopes, Inês Sousa

    2015-10-01

    The purpose of this study was to evaluate the validity and reliability of the European Portuguese version of the EAT-10 (P-EAT-10). This research was conducted in three phases: (i) cultural and linguistic adaptation; (ii) feasibility and reliability testing; and (iii) validity testing. The final sample comprised a cohort of 520 subjects. The P-EAT-10 index was compared across socio-demographic and clinical variables. It was also compared between dysphagic and non-dysphagic groups, as well as against the results of the 3-oz water swallow test (WST). Lastly, the P-EAT-10 scores were correlated with the EuroQol Group Portuguese EQ-5D index. The Cronbach's α obtained for the P-EAT-10 scale was 0.952, and it remained excellent if any single item was deleted. The item-total and intraclass correlation coefficients were very good. The mean P-EAT-10 of the non-dysphagic cohort was 0.56 and that of the dysphagic cohort was 14.26; the comparisons of mean P-EAT-10 scores between the 3-oz WST groups were significant. A significantly higher perceived quality of life was also found among the non-dysphagic subjects. The P-EAT-10 is a valid and reliable measure for documenting dysphagia, which makes it useful both for screening in clinical practice and in research.
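Cronbach's α, reported here and in several of the records above, relates the sum of per-item variances to the variance of the total score: α = k/(k−1) · (1 − Σ var(item) / var(total)). A minimal sketch with hypothetical item scores, not the P-EAT-10 data:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency.  `items` is a list of
    per-item score lists, each ordered by the same respondents."""
    def var(xs):
        # unbiased sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent total
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))
```

Two perfectly correlated items give α close to 1; uncorrelated items drive α toward 0.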

  18. Blinded evaluation of interrater reliability of an operative competency assessment tool for direct laryngoscopy and rigid bronchoscopy.

    Science.gov (United States)

    Ishman, Stacey L; Benke, James R; Johnson, Kaalan Erik; Zur, Karen B; Jacobs, Ian N; Thorne, Marc C; Brown, David J; Lin, Sandra Y; Bhatti, Nasir; Deutsch, Ellen S

    2012-10-01

    OBJECTIVES To confirm, using blinded evaluation, the interrater reliability of a skills-assessment instrument for the surgical performance of resident and fellow trainees performing pediatric direct laryngoscopy and rigid bronchoscopy in simulated models. DESIGN Prospective, paired, blinded observational validation study. SUBJECTS Paired observers from multiple institutions simultaneously evaluated residents and fellows performing surgery in an animal laboratory or using high-fidelity manikins. The evaluators had no previous affiliation with the residents and fellows and did not know their year of training. INTERVENTIONS One- and two-page versions of an objective structured assessment of technical skills (OSATS) instrument, composed of global and task-specific surgical items, were used to evaluate surgical performance. RESULTS Fifty-two evaluations were completed by 17 attending evaluators. Agreement for the 2-page assessment was 71.4% when measured as a binary variable (ie, competent vs not competent) (κ = 0.38; P = .08). Evaluation as a continuous variable revealed 42.9% agreement (κ = 0.18; P = .14). The intraclass correlation was 0.53, considered substantial/good interrater reliability (69% reliable). For the 1-page instrument, agreement was 77.4% when measured as a binary variable (κ = 0.53; P = .0015) and 71.0% when evaluated as a continuous measure (κ = 0.54); the instrument can thus be used to provide formative feedback on operational competency.

  19. Assessing the Construct Validity and Internal Reliability of the Screening Tool Test Your Memory in Patients with Chronic Pain

    Science.gov (United States)

    Ojeda, B.; Salazar, A.; Dueñas, M.; Torres, L. M.; Mico, J. A.; Failde, I.

    2016-01-01

    Patients with chronic pain often complain about cognitive difficulties, and since these symptoms represent an additional source of suffering and distress, evaluating the cognitive status of these patients with valid and reliable tests should be an important part of their overall assessment. Although cognitive impairment is a critical characteristic of pain, there is no specific measure designed to detect these effects in this population. The objective was to analyze the psychometric properties of the “Test Your Memory” (TYM) test in patients with chronic pain of three different origins. A cross-sectional study was carried out on 72 subjects free of pain and 254 patients suffering from different types of chronic pain: neuropathic pain (104), musculoskeletal pain (99) and fibromyalgia (51). The construct validity of the TYM was assessed using the Mini-Mental State Examination (MMSE), Hospital Anxiety and Depression Scale (HADs), Index-9 from MOS-sleep, SF-12, and through the intensity (Visual Analogical Scale) and duration of pain. An exploratory factor analysis was also performed and internal reliability was assessed using Cronbach’s alpha. After adjusting for potential confounders, the TYM could distinguish between pain and pain-free patients, and it was correlated with the: MMSE (0.89, p<0.001); HAD-anxiety (-0.50, p<0.001) and HAD-depression scales (-0.52, p<0.001); MOS-sleep Index-9 (-0.49, p<0.001); and the physical (0.49, p<0.001) and mental (0.55, p<0.001) components of SF-12. The exploratory structure of the TYM showed an 8-factor solution that explained 53% of the variance, and Cronbach’s alpha was 0.66. The TYM is a valid and reliable screening instrument to assess cognitive function in chronic pain patients that will be of particular value in clinical situations. PMID:27119165

  20. Assessing the Construct Validity and Internal Reliability of the Screening Tool Test Your Memory in Patients with Chronic Pain.

    Science.gov (United States)

    Ojeda, B; Salazar, A; Dueñas, M; Torres, L M; Mico, J A; Failde, I

    2016-01-01

    Patients with chronic pain often complain about cognitive difficulties, and since these symptoms represent an additional source of suffering and distress, evaluating the cognitive status of these patients with valid and reliable tests should be an important part of their overall assessment. Although cognitive impairment is a critical characteristic of pain, there is no specific measure designed to detect these effects in this population. The objective was to analyze the psychometric properties of the "Test Your Memory" (TYM) test in patients with chronic pain of three different origins. A cross-sectional study was carried out on 72 subjects free of pain and 254 patients suffering from different types of chronic pain: neuropathic pain (104), musculoskeletal pain (99) and fibromyalgia (51). The construct validity of the TYM was assessed using the Mini-Mental State Examination (MMSE), Hospital Anxiety and Depression Scale (HADs), Index-9 from MOS-sleep, SF-12, and through the intensity (Visual Analogical Scale) and duration of pain. An exploratory factor analysis was also performed and internal reliability was assessed using Cronbach's alpha. After adjusting for potential confounders, the TYM could distinguish between pain and pain-free patients, and it was correlated with the: MMSE (0.89, p<0.001); HAD-anxiety (-0.50, p<0.001) and HAD-depression scales (-0.52, p<0.001); MOS-sleep Index-9 (-0.49, p<0.001); and the physical (0.49, p<0.001) and mental (0.55, p<0.001) components of SF-12. The TYM is a valid and reliable screening instrument to assess cognitive function in chronic pain patients that will be of particular value in clinical situations.

  1. Reliability and criterion validity of measurements using a smart phone-based measurement tool for the transverse rotation angle of the pelvis during single-leg lifting.

    Science.gov (United States)

    Jung, Sung-Hoon; Kwon, Oh-Yun; Jeon, In-Cheol; Hwang, Ui-Jae; Weon, Jong-Hyuck

    2018-01-01

    The purposes of this study were to determine the intra-rater test-retest reliability of a smart phone-based measurement tool (SBMT) and a three-dimensional (3D) motion analysis system for measuring the transverse rotation angle of the pelvis during single-leg lifting (SLL) and the criterion validity of the transverse rotation angle of the pelvis measurement using SBMT compared with a 3D motion analysis system (3DMAS). Seventeen healthy volunteers performed SLL with their dominant leg without bending the knee until they reached a target placed 20 cm above the table. This study used a 3DMAS, considered the gold standard, to measure the transverse rotation angle of the pelvis to assess the criterion validity of the SBMT measurement. Intra-rater test-retest reliability was determined using the SBMT and 3DMAS using intra-class correlation coefficient (ICC) [3,1] values. The criterion validity of the SBMT was assessed with ICC [3,1] values. Both the 3DMAS (ICC = 0.77) and SBMT (ICC = 0.83) showed excellent intra-rater test-retest reliability in the measurement of the transverse rotation angle of the pelvis during SLL in a supine position. Moreover, the SBMT showed an excellent correlation with the 3DMAS (ICC = 0.99). Measurement of the transverse rotation angle of the pelvis using the SBMT showed excellent reliability and criterion validity compared with the 3DMAS.
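The ICC [3,1] used in this record and several others comes from a two-way ANOVA decomposition of an n-subjects × k-raters (or k-trials) table: ICC(3,1) = (MSR − MSE) / (MSR + (k−1)·MSE). A plain-Python sketch of that model (consistency, single measure; the data are illustrative, not the study's pelvis angles):

```python
def icc_3_1(ratings):
    """ICC(3,1): two-way mixed model, consistency, single measure.
    `ratings` is a list of n rows (subjects), each with k scores."""
    n, k = len(ratings), len(ratings[0])
    grand = sum(sum(row) for row in ratings) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    col_means = [sum(row[j] for row in ratings) / n for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)    # between subjects
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)    # between raters
    ss_total = sum((x - grand) ** 2 for row in ratings for x in row)
    ss_err = ss_total - ss_rows - ss_cols                      # residual
    ms_rows = ss_rows / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)
```

A constant offset between the two trials does not reduce ICC(3,1), since the consistency form removes the rater (column) effect.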

  2. Development and reliability of a Motivational Interviewing Scenarios Tool for Eating Disorders (MIST-ED) using a skills-based intervention among caregivers.

    Science.gov (United States)

    Sepulveda, Ana R; Wise, Caroline; Zabala, Maria; Todd, Gill; Treasure, Janet

    2013-12-01

    The aims of this study were to develop an eating disorder scenarios tool to assess the motivational interviewing (MI) skills of caregivers, to evaluate the coding reliability of the instrument, and to test its sensitivity to change through a pre/post/follow-up design. The resulting Motivational Interviewing Scenarios Tool for Eating Disorders (MIST-ED) was administered to caregivers (n = 66), who were asked to provide oral and written responses before and after a skills-based intervention, and at a 3-month follow-up. Raters achieved excellent inter-rater reliability (intra-class correlations of 91.8% for MI adherent and 86.1% for MI non-adherent statements for written scenarios, and 89.2% and 85.3% for oral scenarios). Following the intervention, MI adherent statements increased (baseline = 9.4%, post = 61.5%, follow-up = 47.2%) and MI non-adherent statements decreased (baseline = 90.6%, post = 38.5%, follow-up = 52.8%). This instrument offers a simple method to measure the acquisition of MI skills intended to improve coping, and both response methods are adequate. The tool shows good sensitivity to improved skills. © 2013.

  3. Evaluation of Four Different Analytical Tools to Determine the Regional Origin of Gastrodia elata and Rehmannia glutinosa on the Basis of Metabolomics Study

    Directory of Open Access Journals (Sweden)

    Dong-Kyu Lee

    2014-05-01

    Full Text Available Chemical profiles of medicinal plants can differ depending on the cultivation environment, which may influence their therapeutic efficacy. Accordingly, the regional origin of medicinal plants should be authenticated for correct evaluation of their medicinal and market values. Metabolomics has been found very useful for discriminating the origin of many plants, and choosing an adequate analytical tool is an essential step because different chemical profiles, with different detection ranges, will be produced depending on the choice. In this study, four analytical tools, Fourier transform near-infrared spectroscopy (FT-NIR), 1H-nuclear magnetic resonance spectroscopy (1H-NMR), liquid chromatography-mass spectrometry (LC-MS), and gas chromatography-mass spectrometry (GC-MS), were applied in parallel to the same samples of two popular medicinal plants (Gastrodia elata and Rehmannia glutinosa) cultivated either in Korea or China. The classification abilities of the four discriminant models for each plant were evaluated based on the misclassification rate and Q2 obtained from principal component analysis (PCA) and orthogonal projection to latent structures-discriminant analysis (OPLS-DA), respectively. 1H-NMR and LC-MS, which were the best techniques for G. elata and R. glutinosa, respectively, were generally preferable for origin discrimination over the others. Integrating all the results, 1H-NMR is the most prominent technique for discriminating the origins of the two plants. Nonetheless, this study suggests that preliminary screening is essential to determine the most suitable analytical tool and statistical method, which will ensure the dependability of metabolomics-based discrimination.
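The origin-discrimination step can be illustrated with a lightweight stand-in for the PCA/OPLS-DA models: a leave-one-out nearest-centroid classifier whose error rate plays the same role as the misclassification rate above. The feature vectors and labels below are hypothetical, not the paper's spectra:

```python
def misclassification_rate(samples, labels):
    """Leave-one-out nearest-centroid classification error on feature
    vectors.  Each held-out sample is assigned to the class whose
    centroid (computed without it) is closest in squared distance."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    errors = 0
    for i, (s, lab) in enumerate(zip(samples, labels)):
        # group the remaining samples by class
        groups = {}
        for j, (t, l2) in enumerate(zip(samples, labels)):
            if j != i:
                groups.setdefault(l2, []).append(t)
        best = None
        for l2, pts in groups.items():
            c = [sum(col) / len(pts) for col in zip(*pts)]  # class centroid
            d = dist2(s, c)
            if best is None or d < best[0]:
                best = (d, l2)
        if best[1] != lab:
            errors += 1
    return errors / len(samples)
```

Well-separated "Korea" and "China" clusters yield a rate of 0, while overlapping chemical profiles push the rate up, mirroring how the abstract ranks the four analytical platforms.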

  4. Evidence of Reliability for the Persian Version of the “Cumberland Ankle Instability Tool (CAIT)” in Iranian Athletes with Lateral Ankle Sprain

    Directory of Open Access Journals (Sweden)

    Mitra Haji-Maghsoudi

    2016-01-01

    Full Text Available Objectives: The purpose of the present study was to evaluate the reliability of the Persian version of the “Cumberland Ankle Instability Tool (CAIT)” in Iranian athletes with lateral ankle sprain. Materials & Methods: The present study is a methodological, non-experimental study. After forward and backward translation of the CAIT, 46 athletes were selected by convenience (non-probability) sampling from the Physical Education Faculty of Tehran University and a Taekwondo club. The questionnaire was given to participants who had experienced at least one lateral ankle sprain based on a physician’s diagnosis. In the second phase (one week later), the questionnaire was distributed among the participants again to test the reliability of the measure between the two tests. After collecting the data, the test-retest reliability of the Persian version of the questionnaire was evaluated by calculating the intraclass correlation coefficient, standard error of measurement and smallest detectable change, and Cronbach’s alpha coefficients were calculated to assess the internal consistency of the questionnaire’s items. Results: Cronbach’s alpha was 0.64, which is close to the acceptable level of internal consistency (0.7-0.95). Factor analysis showed that the questionnaire’s items can be classified into 4 categories covering a maximum of 72% of the variance. The test-retest intraclass correlation coefficient (ICC) for the total score of the CAIT was 0.95 (p<0.001), indicating excellent reproducibility of the Persian version of the questionnaire. The standard error of measurement (SEM) was 1 and the smallest detectable change (SDC) was 2.76 at 95% confidence. Conclusion: The results show that the Persian version of the CAIT can be used in athletes with functional ankle instability as a reliable tool to detect instability and assess changes caused by therapeutic interventions.
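The SEM and SDC quoted in this record follow from the ICC via the standard formulas SEM = SD·√(1 − ICC) and SDC95 = 1.96·√2·SEM. A sketch of that calculation; the SD value below is back-solved from the reported SEM ≈ 1 and ICC = 0.95 (an assumption, since the abstract does not state the sample SD):

```python
import math

def sem_and_sdc(sd, icc):
    """Standard error of measurement and smallest detectable change
    (95% confidence) from a sample SD and a test-retest ICC."""
    sem = sd * math.sqrt(1.0 - icc)          # SEM = SD * sqrt(1 - ICC)
    sdc95 = 1.96 * math.sqrt(2.0) * sem      # SDC95 = 1.96 * sqrt(2) * SEM
    return sem, sdc95
```

With ICC = 0.95 and an assumed SD ≈ 4.47, this reproduces the reported SEM ≈ 1 and SDC ≈ 2.77, which matches the abstract's figures to rounding.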

  5. Development of a Standardized Kalamazoo Communication Skills Assessment Tool for Radiologists: Validation, Multisource Reliability, and Lessons Learned.

    Science.gov (United States)

    Brown, Stephen D; Rider, Elizabeth A; Jamieson, Katherine; Meyer, Elaine C; Callahan, Michael J; DeBenedectis, Carolynn M; Bixby, Sarah D; Walters, Michele; Forman, Sara F; Varrin, Pamela H; Forbes, Peter; Roussin, Christopher J

    2017-08-01

    The purpose of this study was to develop and test a standardized communication skills assessment instrument for radiology. The Delphi method was used to validate the Kalamazoo Communication Skills Assessment instrument for radiology by revising and achieving consensus on the 43 items of the preexisting instrument among an interdisciplinary team of experts consisting of five radiologists and four nonradiologists (two men, seven women). Reviewers assessed the applicability of the instrument to the evaluation of conversations between radiology trainees and trained actors portraying concerned parents in enactments about bad news, radiation risks, and diagnostic errors that were video recorded during a communication workshop. Interrater reliability was assessed by use of the revised instrument to rate a series of enactments between trainees and actors video recorded in a hospital-based simulation center. Eight raters evaluated each of seven different video-recorded interactions between physicians and parent-actors. The final instrument contained 43 items. After three review rounds, 42 of 43 (98%) items had an average rating of relevant or very relevant for bad-news conversations. All items were rated as relevant or very relevant for conversations about error disclosure and radiation risk. Reliability and rater-agreement measures were moderate. The intraclass correlation coefficient range was 0.07-0.58 (mean, 0.30; SD, 0.13; median, 0.30). The range of weighted kappa values was 0.03-0.47 (mean, 0.23; SD, 0.12; median, 0.22). Ratings varied significantly among conversations (χ2(6) = 1186). The communication skills assessment instrument is highly relevant for radiology, having moderate interrater reliability. These findings have important implications for assessing the relational competencies of radiology trainees.

  6. Inter- and intrarater reliability of two proprioception tests using clinical applicable measurement tools in subjects with and without knee osteoarthritis.

    Science.gov (United States)

    Baert, Isabel A C; Lluch, Enrique; Struyf, Thomas; Peeters, Greta; Van Oosterwijck, Sophie; Tuynman, Joanna; Rufai, Salim; Struyf, Filip

    2018-06-01

    The therapeutic value of proprioceptive-based exercises in knee osteoarthritis (KOA) management warrants investigation of proprioceptive testing methods easily accessible in clinical practice. To estimate inter- and intrarater reliability of the knee joint position sense (KJPS) test and knee force sense (KFS) test in subjects with and without KOA. Cross-sectional test-retest design. Two blinded raters independently performed repeated measures of the KJPS and KFS tests, using an analogue inclinometer and handheld dynamometer, respectively, in eight KOA patients (12 symptomatic knees) and 26 healthy controls (52 asymptomatic knees). Intraclass correlation coefficients (ICCs; model 2,1), standard error of measurement (SEM) and minimal detectable change with 95% confidence bounds (MDC95) were calculated. For KJPS, results showed good to excellent test-retest agreement (ICCs 0.70-0.95 in KOA patients; ICCs 0.65-0.85 in healthy controls). A 2° measurement error (SEM 1°) was reported when measuring KJPS in multiple test positions and calculating the mean repositioning error. When testing KOA patients pre- and post-therapy, a repositioning error larger than 4° (MDC95) is needed to consider a change true. Measuring KFS using handheld dynamometry showed poor to fair interrater and poor to excellent intrarater reliability in subjects with and without KOA. Measuring KJPS in multiple test positions using an analogue inclinometer and calculating the mean repositioning error is reliable and can be used in clinical practice. We do not recommend the use of the KFS test to clinicians. Further research is required to establish the diagnostic accuracy and validity of our KJPS test in larger knee pain populations. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Assessing the Construct Validity and Internal Reliability of the Screening Tool Test Your Memory in Patients with Chronic Pain.

    Directory of Open Access Journals (Sweden)

    B Ojeda

Full Text Available Patients with chronic pain often complain about cognitive difficulties, and since these symptoms represent an additional source of suffering and distress, evaluating the cognitive status of these patients with valid and reliable tests should be an important part of their overall assessment. Although cognitive impairment is a critical characteristic of pain, there is no specific measure designed to detect these effects in this population. The objective was to analyze the psychometric properties of the "Test Your Memory" (TYM) test in patients with chronic pain of three different origins. A cross-sectional study was carried out on 72 subjects free of pain and 254 patients suffering from different types of chronic pain: neuropathic pain (104), musculoskeletal pain (99) and fibromyalgia (51). The construct validity of the TYM was assessed using the Mini-Mental State Examination (MMSE), Hospital Anxiety and Depression Scale (HADS), Index-9 from MOS-sleep, SF-12, and through the intensity (Visual Analogue Scale) and duration of pain. An exploratory factor analysis was also performed and internal reliability was assessed using Cronbach's alpha. After adjusting for potential confounders, the TYM could distinguish between pain and pain-free patients, and it was correlated with the MMSE (0.89, p<0.001), HAD-anxiety (-0.50, p<0.001) and HAD-depression scales (-0.52, p<0.001), MOS-sleep Index-9 (-0.49, p<0.001), and the physical (0.49, p<0.001) and mental components (0.55, p<0.001) of the SF-12. The exploratory structure of the TYM showed an 8-factor solution that explained 53% of the variance, and Cronbach's alpha was 0.66. The TYM is a valid and reliable screening instrument to assess cognitive function in chronic pain patients that will be of particular value in clinical situations.
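The internal-reliability figure above (Cronbach's alpha = 0.66) comes from the standard formula α = k/(k−1)·(1 − Σσ²item/σ²total). A self-contained sketch with a made-up toy matrix (respondents × items), not the study's data:

```python
from statistics import variance

def cronbach_alpha(scores):
    """Cronbach's alpha for a persons x items score matrix."""
    k = len(scores[0])                                  # number of items
    item_vars = [variance(col) for col in zip(*scores)] # per-item variance
    total_var = variance([sum(row) for row in scores])  # variance of sum scores
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical toy data: 4 respondents x 3 perfectly consistent items
print(cronbach_alpha([[1, 2, 3], [2, 3, 4], [3, 4, 5], [4, 5, 6]]))  # 1.0
```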

  8. Development of a new assessment tool for cervical myelopathy using hand-tracking sensor: Part 1: validity and reliability.

    Science.gov (United States)

    Alagha, M Abdulhadi; Alagha, Mahmoud A; Dunstan, Eleanor; Sperwer, Olaf; Timmins, Kate A; Boszczyk, Bronek M

    2017-04-01

To assess the reliability and validity of a hand motion sensor, the Leap Motion Controller (LMC), in the 15-s hand grip-and-release test, as compared against human inspection of an external digital camera recording. Fifty healthy participants were asked to fully grip-and-release their dominant hand as rapidly as possible for two trials with a 10-min rest in between, while wearing a non-metal wrist splint. Each test lasted 15 s, and a digital camera was used to film the anterolateral side of the hand on the first test. Three assessors counted the frequency of grip-and-release (G-R) cycles independently and in a blinded fashion. The mean of the three counts was compared with that measured by the LMC using the Bland-Altman method. Test-retest reliability was examined by comparing the two 15-s tests. The mean number of G-R cycles recorded was: 47.8 ± 6.4 (test 1, video observer); 47.7 ± 6.5 (test 1, LMC); and 50.2 ± 6.5 (test 2, LMC). Bland-Altman analysis indicated good agreement, with a low bias (0.15 cycles) and narrow limits of agreement. The ICC showed high inter-rater agreement, and the coefficient of repeatability for the number of cycles was ±5.393, with a mean bias of 3.63. The LMC appears to be valid and reliable in the 15-s grip-and-release test. This serves as a first step towards the development of an objective myelopathy assessment device and a platform for the assessment of neuromotor hand function in general. Further assessment in a clinical setting and to gauge healthy benchmark values is warranted.
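The Bland-Altman method used above compares two measurement methods via the mean of their paired differences (bias) and bias ± 1.96·SD (95% limits of agreement). A sketch with hypothetical G-R counts, not the study's raw data:

```python
from statistics import mean, stdev

def bland_altman(a, b):
    """Bias and 95% limits of agreement between two measurement methods."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = mean(diffs)
    sd = stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

video = [48, 51, 44, 47, 50]   # hypothetical counts, human video rater
lmc   = [48, 50, 45, 47, 49]   # hypothetical counts, motion sensor
bias, loa = bland_altman(video, lmc)
print("bias:", bias, "limits of agreement:", loa)
```

Narrow limits of agreement around a near-zero bias are what "good agreement" means in the abstract above.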

  9. Validity and reliability of a tool for determining appropriateness of days of stay: an observational study in the orthopedic intensive rehabilitation facilities in Italy.

    Directory of Open Access Journals (Sweden)

    Aida Bianco

Full Text Available OBJECTIVES: To test the validity and reliability of a tool specifically developed for the evaluation of appropriateness in rehabilitation facilities and to assess the prevalence of appropriateness of the days of stay. METHODS: The tool underwent a process of cross-cultural translation, content validity, and test-retest validity. Two hospital-based rehabilitation wards providing intensive rehabilitation care located in the Region of Calabria, Southern Italy, were randomly selected. A review of medical records on a random sample of patients aged 18 or more was performed. RESULTS: The process of validation resulted in modifying some of the criteria used for the evaluation of appropriateness. Test-retest reliability showed that the agreement and the k statistic for the assessment of the appropriateness of days of stay were 93.4% and 0.82, respectively. A total of 371 patient days was reviewed, and 22.9% of the days of stay in the sample were judged to be inappropriate. The most frequently selected appropriateness criterion was the evaluation of patients by rehabilitation professionals for at least 3 hours on the index day (40.8%); moreover, the most frequent primary reason accounting for the inappropriate days of stay was social and/or family environment issues (34.1%). CONCLUSIONS: The findings showed that the tool used is reliable and has adequate validity to measure the extent of appropriateness of days of stay in rehabilitation facilities and that the prevalence of inappropriateness is contained in the investigated settings. Further research is needed to expand appropriateness evaluation to other rehabilitation settings, and to investigate more thoroughly internal and external causes of inappropriate use of rehabilitation services.
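The test-retest figures above (93.4% agreement, k = 0.82) are standard percent agreement and Cohen's kappa, which corrects agreement for chance. A sketch with hypothetical binary appropriateness ratings from two reviews:

```python
def agreement_and_kappa(r1, r2):
    """Percent agreement and Cohen's kappa for two binary (0/1) ratings."""
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n   # observed agreement
    p1 = sum(r1) / n                               # rater 1 'appropriate' rate
    p2 = sum(r2) / n                               # rater 2 'appropriate' rate
    pe = p1 * p2 + (1 - p1) * (1 - p2)             # agreement expected by chance
    return po, (po - pe) / (1 - pe)

# Hypothetical test-retest ratings of ten patient days
po, kappa = agreement_and_kappa([1, 1, 0, 0, 1, 0, 1, 0, 1, 1],
                                [1, 1, 0, 0, 1, 0, 1, 1, 1, 0])
print("agreement:", po, "kappa:", round(kappa, 3))
```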

  10. Analytical tools for solitons and periodic waves corresponding to phonons on Lennard-Jones lattices in helical proteins

    DEFF Research Database (Denmark)

    D'ovidio, Francesco; Bohr, Henrik; Lindgård, Per-Anker

    2005-01-01

    We study the propagation of solitons along the hydrogen bonds of an alpha helix. Modeling the hydrogen and peptide bonds with Lennard-Jones potentials, we show that the solitons can appear spontaneously and have long lifetimes. Remarkably, even if no explicit solution is known for the Lennard-Jones...... potential, the solitons can be characterized analytically with a good quantitative agreement using formulas for a Toda potential with parameters fitted to the Lennard-Jones potential. We also discuss and show the robustness of the family of periodic solutions called cnoidal waves, corresponding to phonons...

  11. Development and Testing of the Healthy Work Environment Inventory: A Reliable Tool for Assessing Work Environment Health and Satisfaction.

    Science.gov (United States)

    Clark, Cynthia M; Sattler, Victoria P; Barbosa-Leiker, Celestina

    2016-10-01

    Fostering healthy work environments that enhance job satisfaction and reflect high levels of employee engagement and productivity is imperative for all organizations. This is especially true for health care organizations where unhealthy work conditions can lead to poor patient outcomes. A convenience sample of 520 nursing faculty and practice-based nurses in the United States participated in a study to test the psychometric properties of the Healthy Work Environment Inventory (HWEI). A factor analysis and other reliability analyses support the use of the HWEI as a valid and reliable instrument to measure perceptions of work environment health. The HWEI is a 20-item psychometrically sound instrument to measure perceptions of the health of the work environment. It may be completed either as an individual exercise or by all members of a team to compare perceptions of work environment health, to determine areas of strength and improvement, and to form the basis for interviewing. [J Nurs Educ. 2016;55(10):555-562.]. Copyright 2016, SLACK Incorporated.

  12. Extensions of the Johnson-Neyman Technique to Linear Models with Curvilinear Effects: Derivations and Analytical Tools

    Science.gov (United States)

    Miller, Jason W.; Stromeyer, William R.; Schwieterman, Matthew A.

    2013-01-01

    The past decade has witnessed renewed interest in the use of the Johnson-Neyman (J-N) technique for calculating the regions of significance for the simple slope of a focal predictor on an outcome variable across the range of a second, continuous independent variable. Although tools have been developed to apply this technique to probe 2- and 3-way…
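The J-N regions of significance for a simple slope b1 + b3·w are found by solving (b1 + b3·w)² = t²crit·(v11 + 2w·v13 + w²·v33), a quadratic in the moderator w (here v11, v13, v33 denote the relevant coefficient variances and covariance). A sketch with made-up coefficient values, not taken from the article:

```python
import math

def jn_boundaries(b1, b3, v11, v13, v33, t_crit):
    """Johnson-Neyman boundaries: moderator values w where the simple-slope
    t statistic (b1 + b3*w) / sqrt(v11 + 2*w*v13 + w**2*v33) hits t_crit."""
    a = b3 ** 2 - t_crit ** 2 * v33
    b = 2 * (b1 * b3 - t_crit ** 2 * v13)
    c = b1 ** 2 - t_crit ** 2 * v11
    if a == 0:
        return [-c / b]                  # degenerate linear case
    disc = b ** 2 - 4 * a * c
    if disc < 0:
        return []                        # no real crossing points
    r = math.sqrt(disc)
    return sorted([(-b - r) / (2 * a), (-b + r) / (2 * a)])

# Hypothetical estimates: slope 0.5, interaction 0.2, toy (co)variances
print(jn_boundaries(0.5, 0.2, 0.04, 0.0, 0.005, 2.0))
```

Outside (or between, depending on the sign of `a`) the two boundaries, the simple slope is significant at the chosen t critical value.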

  13. Analytical tools and methodologies for evaluation of residual life of contacting pressure tubes in the early generation of Indian PHWRs

    International Nuclear Information System (INIS)

    Sinha, S.K.; Madhusoodanan, K.; Rupani, B.B.; Sinha, R.K.

    2002-01-01

In-service life of a contacting Zircaloy-2 pressure tube (PT) in the earlier generation of Indian PHWRs is limited mainly by accelerated hydrogen pick-up and the nucleation and growth of hydride blister(s) at the cold spot(s) formed on the outside surface of the pressure tube as a result of its contact with the calandria tube (CT). The development of analytical models for simulating the degradation mechanisms leading to PT-CT contact, and of methodologies for the re-evaluation of safe life under such conditions, forms an important part of our extensive programme for the life management of contacting pressure tubes. Since, after PT-CT contact, the rate of hydrogen pick-up and the nucleation and growth of hydride blisters govern the safe residual life of the pressure tube, two analytical models, (a) a hydrogen pick-up model ('HYCON') and (b) a model for the nucleation and growth of hydride blisters at the contact spot ('BLIST-2D'), have been developed in-house to estimate the extent of degradation caused by these mechanisms. A methodology has also been formulated for evaluating the safe residual life of the contacting channels. This paper gives a brief description of the models and the methodologies relevant to contacting Zircaloy-2 pressure tubes. (author)

  14. Understanding how the placement of an asymmetric vibration damping tool within drilling while underreaming can influence performance and reliability

    Energy Technology Data Exchange (ETDEWEB)

    Kabbara, Alan; McCarthy, John; Burnett, Timm; Forster, Ian [National Oilwell Varco Downhole Ltd. (NOV), Houston, TX (United States)

    2012-07-01

This paper describes the work, on test rigs and full-scale drilling rigs, carried out with respect to the placement of an Asymmetric Vibration Damping Tool (AVDT) within drilling-while-underreaming operations. An AVDT, by virtue of the forward synchronous motion imposed on the drill string, offers benefits in minimizing downhole vibration-related tool failures and therefore maximizing rate of penetration (ROP). Of interest in using the AVDT is its tendency to minimize stick-slip by means of the parasitic torque it generates. This is of particular importance during underreaming operations, where stick-slip can result in low ROP and potentially an increased incidence of downhole tool failures. The use of an AVDT in these operations has been shown to significantly reduce stick-slip. However, due to the forward synchronous motion caused by the AVDT, there is the potential to cause eccentric wear to the Bottom Hole Assembly (BHA) components in the vicinity of the AVDT. If allowed to progress, this eccentric wear can cause a reduction in downhole tool life and drilling performance. Eliminating eccentric wear would be beneficial in reducing repair costs, extending component life and further improving drilling performance. To minimize eccentric wear and maximize drilling performance, the placement of the AVDT within the BHA is critical. This paper describes how the placement of intermediate stabilizers between the AVDT and the underreamer can minimize eccentric wear to the underreamer and the adjacent drill string due to the forward synchronous whirl induced by the AVDT. This approach allows the full benefits of the AVDT to be realized while reducing the potentially damaging effects of eccentric wear on other BHA components. The work has drawn upon small-scale rig testing, full-scale testing at the Ullrigg test facility in Norway, and real-world drilling and underreaming operations in the USA. (author)

  15. Reliability of power system with open access

    International Nuclear Information System (INIS)

    Ehsani, A.; Ranjbar, A. M.; Fotuhi Firuzabad, M.; Ehsani, M.

    2003-01-01

Recently, in many countries, the electric utility industry has been undergoing considerable changes with regard to its structure and regulation. The thrust towards privatization and deregulation or re-regulation of the electric utility industry will introduce numerous reliability problems that will require new criteria and analytical tools recognizing the residual uncertainties in the new environment. In this paper, different risks and uncertainties in competitive electricity markets are briefly introduced; the approaches of customers, operators, planners, generation bodies and network providers to the reliability of the deregulated system are studied; the impact of dispersed generation on system reliability is evaluated; and finally, the reliability cost/reliability worth issues in the new competitive environment are considered.

  16. The use of selective adsorbents in capillary electrophoresis-mass spectrometry for analyte preconcentration and microreactions: a powerful three-dimensional tool for multiple chemical and biological applications.

    Science.gov (United States)

    Guzman, N A; Stubbs, R J

    2001-10-01

    Much attention has recently been directed to the development and application of online sample preconcentration and microreactions in capillary electrophoresis using selective adsorbents based on chemical or biological specificity. The basic principle involves two interacting chemical or biological systems with high selectivity and affinity for each other. These molecular interactions in nature usually involve noncovalent and reversible chemical processes. Properly bound to a solid support, an "affinity ligand" can selectively adsorb a "target analyte" found in a simple or complex mixture at a wide range of concentrations. As a result, the isolated analyte is enriched and highly purified. When this affinity technique, allowing noncovalent chemical interactions and biochemical reactions to occur, is coupled on-line to high-resolution capillary electrophoresis and mass spectrometry, a powerful tool of chemical and biological information is created. This paper describes the concept of biological recognition and affinity interaction on-line with high-resolution separation, the fabrication of an "analyte concentrator-microreactor", optimization conditions of adsorption and desorption, the coupling to mass spectrometry, and various applications of clinical and pharmaceutical interest.

  17. HPTLC-FLD-SERS as a facile and reliable screening tool: Exemplarily shown with tyramine in cheese.

    Science.gov (United States)

    Wang, Liao; Xu, Xue-Ming; Chen, Yi-Sheng; Ren, Jie; Liu, Yun-Tao

    2018-04-01

The serious cytotoxicity of tyramine has attracted marked attention, as it induces necrosis of human intestinal cells. This paper presents a novel and facile high performance thin-layer chromatography (HPTLC) method tailored for screening tyramine in cheese. Separation was performed on glass-backed silica gel plates, using methanol/ethyl acetate/ammonia (6/4/1 v/v/v) as the mobile phase. Special efforts were focused on optimizing the conditions (substrate preparation, laser wavelength, salt types and concentrations) of surface enhanced Raman spectroscopy (SERS) measurements performed directly on the plates after derivatization, which enabled molecule-specific identification of the targeted bands. In parallel, fluorescent densitometry (FLD) scanning was performed at 380 nm. HPTLC-FLD-SERS provided a new horizon in fast and reliable screening of sophisticated samples like food and herbal drugs, striking an excellent balance between specificity, sensitivity and simplicity. Copyright © 2017. Published by Elsevier B.V.

  18. Feasibility and reliability of a mobile tool to evaluate exposure to tobacco product marketing and messages using ecological momentary assessment.

    Science.gov (United States)

    Hébert, Emily T; Vandewater, Elizabeth A; Businelle, Michael S; Harrell, Melissa B; Kelder, Steven H; Perry, Cheryl L

    2017-10-01

Existing measures of tobacco marketing and messaging exposure are limited, relying on recall, recognition, or proxy measures. This study aimed to determine the feasibility and reliability of a mobile application for the measurement of tobacco and e-cigarette marketing and message exposure using ecological momentary assessment (EMA). Young adults from Austin, TX (n=181, ages 18-29) were instructed to use a mobile application to record all sightings of marketing or social media related to tobacco (including e-cigarettes) in real time for 28 days (Event EMAs). Tobacco product use and recall of message encounters were assessed daily using an app-initiated EMA (Daily EMAs). The mobile app was a feasible and acceptable method to measure exposure to tobacco messages. The largest share of messages (45.0%) was seen on the Internet, and many were user-generated. Thirty-day recall of messages at baseline was poorly correlated with messages reported via Event EMA during the study period; however, the correlation between post-study 30-day recall and Event EMA was much stronger (r=0.603 for industry-sponsored messages, r=0.599 for user-generated messages). Correlations between Daily EMAs and 30-day recall of message exposure (baseline and post-study) were small (baseline: r=0.329-0.389) to large (post-study: r=0.656-0.766). These findings suggest that EMA is a feasible and reliable method for measuring tobacco message exposure, especially given the prevalence of messages encountered online and on social media. Recall measures are limited in their ability to accurately represent marketing exposure, but might be improved by a period of priming or clearer response categories. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Developing an analytical tool for evaluating EMS system design changes and their impact on cardiac arrest outcomes: combining geographic information systems with register data on survival rates

    Directory of Open Access Journals (Sweden)

    Sund Björn

    2013-02-01

Full Text Available Abstract Background Out-of-hospital cardiac arrest (OHCA) is a frequent and acute medical condition that requires immediate care. We estimate survival rates from OHCA in the area of Stockholm, through developing an analytical tool for evaluating Emergency Medical Services (EMS) system design changes. The study is also an attempt to validate the proposed model used to generate the outcome measures for the study. Methods and results This was done by combining a geographic information systems (GIS) simulation of driving times with register data on survival rates. The emergency resources comprised ambulance alone and ambulance plus fire services. The simulation model predicted a baseline survival rate of 3.9 per cent, and reducing the ambulance response time by one minute increased survival to 4.6 per cent. Adding the fire services as first responders (dual dispatch) increased survival to 6.2 per cent from the baseline level. The model predictions were validated using empirical data. Conclusion We have presented an analytical tool that can easily be generalized to other regions or countries. The model can be used to predict outcomes of cardiac arrest prior to investment in EMS design changes that affect the alarm process, e.g. (1) static changes such as trimming the emergency call handling time or (2) dynamic changes such as the location of emergency resources or which resources should carry a defibrillator.

  20. Leningrad NPP full scope and analytical simulators as tools for MMI improvement and operator support systems development and testing

    International Nuclear Information System (INIS)

    Rakitin, I.D.; Malkin, S.D.; Shalia, V.V.; Fedorov, E.M.; Lebedev, N.N.; Khoudiakov, M.M.

    1999-01-01

The Training Support Center (TSC) created at the Leningrad NPP (LNPP), Sosnovy Bor, Russia, incorporates full-scope and analytical simulators working in parallel with prototypes of expert and interactive systems, providing a new scope of R and D MMI improvement work for the developer as well as for the user. The paper describes the possibilities for developing, adjusting and testing any new or upgraded operator support system before its installation in the reference unit's control room. These simulators ensure the modeling of a wide range of accidents and transients and provide, through special software and ETHERNET data process communications, links to the operator support systems' prototypes. As an example, the paper describes the development and adjustment, using the simulators, of two state-of-the-art operator support systems of interest. These systems have been developed jointly by the RRC KI and LNPP team. (author)

  1. Single-cell MALDI-MS as an analytical tool for studying intrapopulation metabolic heterogeneity of unicellular organisms.

    Science.gov (United States)

    Amantonico, Andrea; Urban, Pawel L; Fagerer, Stephan R; Balabin, Roman M; Zenobi, Renato

    2010-09-01

    Heterogeneity is a characteristic feature of all populations of living organisms. Here we make an attempt to validate a single-cell mass spectrometric method for detection of changes in metabolite levels occurring in populations of unicellular organisms. Selected metabolites involved in central metabolism (ADP, ATP, GTP, and UDP-Glucose) could readily be detected in single cells of Closterium acerosum by means of negative-mode matrix-assisted laser desorption/ionization (MALDI) mass spectrometry (MS). The analytical capabilities of this approach were characterized using standard compounds. The method was then used to study populations of individual cells with different levels of the chosen metabolites. With principal component analysis and support vector machine algorithms, it was possible to achieve a clear separation of individual C. acerosum cells in different metabolic states. This study demonstrates the suitability of mass spectrometric analysis of metabolites in single cells to measure cell-population heterogeneity.
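The separation step described above can be illustrated with PCA on synthetic "metabolite" intensities; the sketch below uses a nearest-centroid rule as a simple stand-in for the support vector machine used in the paper, and all values are made up:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical intensities for four metabolites (ADP, ATP, GTP, UDP-Glc)
# in two metabolic states, 20 cells each:
state_a = rng.normal([1.0, 3.0, 0.5, 1.5], 0.2, size=(20, 4))
state_b = rng.normal([2.0, 1.0, 0.8, 0.7], 0.2, size=(20, 4))
X = np.vstack([state_a, state_b])

# PCA via SVD of the mean-centred data; keep the first two components
Xc = X - X.mean(axis=0)
_, _, vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ vt[:2].T

# Nearest-centroid split in PC space (stand-in for the paper's SVM)
ca, cb = scores[:20].mean(axis=0), scores[20:].mean(axis=0)
pred = [0 if np.linalg.norm(s - ca) < np.linalg.norm(s - cb) else 1
        for s in scores]
```

With well-separated metabolic states, the two cell populations form distinct clusters in the first two principal components, mirroring the "clear separation" the abstract reports.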

  2. Embodying resistance : a discourse analytical study of the selfie as political tool within the fourth wave of feminism

    OpenAIRE

    Barbala, Astri Moksnes

    2017-01-01

    This Master’s thesis is exploring whether the selfie can be utilised as a political tool in order to challenge the stereotypical ideas of femininity and female beauty that currently dominate the visual social media landscape. Focusing on the photo-sharing application Instagram, the emphasis is here on how the selfie can position the portrayed subject’s body as a site of resistance. By publishing images depicting their non-normative physical appearances, social media-participating feminists ar...

  3. Methodological framework, analytical tool and database for the assessment of climate change impacts, adaptation and vulnerability in Denmark

    Energy Technology Data Exchange (ETDEWEB)

    Skougaard Kaspersen, P.; Halsnaes, K.; Gregg, J.; Drews, M.

    2012-12-15

In this report we provide recommendations about how more consistent studies and data can be provided, based on available modelling tools and data, for integrated assessment of climate change risks and adaptation options. It is concluded that integrated assessment within this area requires the use of a wide range of data and models in order to cover the full chain of elements, including climate modelling, impacts, risks, costs, social issues, and decision making. As an outcome of this activity, a comprehensive data and modelling tool named the Danish Integrated Assessment System (DIAS) has been developed, which may be used by researchers within the field. DIAS has been implemented and tested in a case study on urban flooding caused by extreme precipitation in Aarhus, and this study highlights the usefulness of integrating data, models, and methods from several disciplines into a common framework. DIAS is an attempt to describe such a framework with regard to integrated analysis of climate impacts and adaptation. The final product of the DTU KFT project 'Tool for Vulnerability analysis' is NOT a user-friendly climate adaptation tool ready for various types of analysis that may be used directly by decision makers and consultants on their own. Rather, the developed methodology and the collected/available data can serve as a starting point for case-specific analyses. For this reason alone, this work should very much be viewed as an attempt to coordinate research, data and model outputs between different research institutes from various disciplines. It is unquestionable that there is a future need to integrate information for areas not yet included, and it is very likely that such efforts will depend on research projects conducted in different climate change adaptation areas and sectors in Denmark. (Author)

  4. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    Energy Technology Data Exchange (ETDEWEB)

    Robinson Khosah

    2007-07-31

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project was conducted in two phases. Phase One included the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two involved the development of a platform for on-line data analysis. Phase Two included the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now technically completed.

  5. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    Energy Technology Data Exchange (ETDEWEB)

    Robinson P. Khosah; Frank T. Alex

    2007-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-eighth month of development activities.

  6. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    Energy Technology Data Exchange (ETDEWEB)

    Robinson P. Khosah; Charles G. Crawford

    2003-03-13

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase 1, which is currently in progress and will take twelve months to complete, will include the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. In Phase 2, which will be completed in the second year of the project, a platform for on-line data analysis will be developed. Phase 2 will include the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its sixth month of Phase

  7. Design and validation of new genotypic tools for easy and reliable estimation of HIV tropism before using CCR5 antagonists.

    Science.gov (United States)

    Poveda, Eva; Seclén, Eduardo; González, María del Mar; García, Federico; Chueca, Natalia; Aguilera, Antonio; Rodríguez, Jose Javier; González-Lahoz, Juan; Soriano, Vincent

    2009-05-01

    Genotypic tools may allow easier and less expensive estimation of HIV tropism before prescription of CCR5 antagonists compared with the Trofile assay (Monogram Biosciences, South San Francisco, CA, USA). Paired genotypic and Trofile results were compared in plasma samples derived from the maraviroc expanded access programme (EAP) in Europe. A new genotypic approach was built to improve the sensitivity to detect X4 variants based on an optimization of the webPSSM algorithm. Then, the new tool was validated in specimens from patients included in the ALLEGRO trial, a multicentre study conducted in Spain to assess the prevalence of R5 variants in treatment-experienced HIV patients. A total of 266 specimens from the maraviroc EAP were tested. Overall geno/pheno concordance was above 72%. A high specificity was generally seen for the detection of X4 variants using genotypic tools (ranging from 58% to 95%), while sensitivity was low (ranging from 31% to 76%). The PSSM score was then optimized to enhance the sensitivity to detect X4 variants changing the original threshold for R5 categorization. The new PSSM algorithms, PSSM(X4R5-8) and PSSM(SINSI-6.4), considered as X4 all V3 scoring values above -8 or -6.4, respectively, increasing the sensitivity to detect X4 variants up to 80%. The new algorithms were then validated in 148 specimens derived from patients included in the ALLEGRO trial. The sensitivity/specificity to detect X4 variants was 93%/69% for PSSM(X4R5-8) and 93%/70% for PSSM(SINSI-6.4). PSSM(X4R5-8) and PSSM(SINSI-6.4) may confidently assist therapeutic decisions for using CCR5 antagonists in HIV patients, providing an easier and rapid estimation of tropism in clinical samples.
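A minimal sketch of the decision rule described above: the optimized PSSM algorithms simply call a sample X4 whenever its V3 score exceeds a lowered cut-off (-8 for PSSM(X4R5-8), -6.4 for PSSM(SINSI-6.4)). The scoring of the V3 sequence itself is not shown; the scores below are illustrative values, not data from the paper.

```python
# Illustrative sketch of a PSSM-style tropism call. The V3 scoring step is
# assumed to have happened already; only the threshold rule is shown.

def classify_tropism(v3_score: float, threshold: float = -8.0) -> str:
    """Return 'X4' when the V3 score exceeds the threshold, else 'R5'.

    Lowering the threshold (here to -8 or -6.4) trades specificity for
    sensitivity to X4 variants, as reported in the abstract.
    """
    return "X4" if v3_score > threshold else "R5"

# The same hypothetical score can be called differently by the two rules:
print(classify_tropism(-7.0, -8.0))    # X4 under PSSM(X4R5-8)
print(classify_tropism(-7.0, -6.4))    # R5 under PSSM(SINSI-6.4)
```

This makes concrete why moving the cut-off raises sensitivity: borderline scores that the stricter rule labels R5 are reclassified as X4.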

  8. Reliability Generalization: An Examination of the Positive Affect and Negative Affect Schedule

    Science.gov (United States)

    Leue, Anja; Lange, Sebastian

    2011-01-01

    The assessment of positive affect (PA) and negative affect (NA) by means of the Positive Affect and Negative Affect Schedule has achieved remarkable popularity in the social sciences. Using a meta-analytic tool, namely reliability generalization (RG), population reliability scores of both scales have been investigated on the basis of a random…

  9. The Cognitive Telephone Screening Instrument (COGTEL): A Brief, Reliable, and Valid Tool for Capturing Interindividual Differences in Cognitive Functioning in Epidemiological and Aging Studies

    Directory of Open Access Journals (Sweden)

    Andreas Ihle

    2017-10-01

    Aims: The present study set out to evaluate the psychometric properties of the Cognitive Telephone Screening Instrument (COGTEL) in 2 different samples of older adults. Methods: We assessed the COGTEL in 116 older adults, with retest after 7 days to evaluate test-retest reliability. Moreover, we assessed the COGTEL in 868 older adults to evaluate convergent validity with the Mini-Mental State Examination (MMSE). Results: Test-retest reliability of the COGTEL total score was good at 0.85 (p < 0.001). Latent variable analyses revealed that the COGTEL and MMSE correlated at 0.93 (p < 0.001), indicating convergent validity of the COGTEL. Conclusion: The present analyses suggest the COGTEL as a brief, reliable, and valid instrument for capturing interindividual differences in cognitive functioning in epidemiological and aging studies, with the advantage of covering more cognitive domains than traditional screening tools such as the MMSE, as well as differentiating between individual performance levels in healthy older adults.

  10. A graph algebra for scalable visual analytics.

    Science.gov (United States)

    Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V

    2012-01-01

    Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increased data demands on computing requires redesigning VA tools to consider performance and reliability in the context of analysis of exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.
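The two atomic operators named in the abstract, selection and aggregation, can be sketched on a toy attributed graph. The data layout (attribute dictionaries and an edge list) is an assumption for illustration, not the paper's actual framework.

```python
# Sketch of two atomic graph-algebra operators: selection (filter nodes by a
# predicate) and aggregation (collapse nodes sharing an attribute value).

def select(nodes, edges, pred):
    """Keep only nodes satisfying pred, plus edges between surviving nodes."""
    kept = {n for n, attrs in nodes.items() if pred(attrs)}
    return ({n: nodes[n] for n in kept},
            [(u, v) for u, v in edges if u in kept and v in kept])

def aggregate(nodes, edges, key):
    """Collapse nodes by attribute `key`; edges become edges between groups."""
    group = {n: attrs[key] for n, attrs in nodes.items()}
    super_nodes = {g: {key: g} for g in set(group.values())}
    super_edges = {(group[u], group[v]) for u, v in edges if group[u] != group[v]}
    return super_nodes, sorted(super_edges)

nodes = {"a": {"kind": "person"}, "b": {"kind": "person"}, "c": {"kind": "org"}}
edges = [("a", "b"), ("b", "c")]
print(select(nodes, edges, lambda attrs: attrs["kind"] == "person"))
print(aggregate(nodes, edges, "kind"))
```

Composing such operators is what makes the exploration scalable: an aggregation can reduce a huge graph to a handful of super-nodes before any visual rendering happens.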

  11. Reliability engineering

    International Nuclear Information System (INIS)

    Lee, Chi Woo; Kim, Sun Jin; Lee, Seung Woo; Jeong, Sang Yeong

    1993-08-01

    This book begins with the question of what reliability is, covering the origin of reliability problems and the definition and uses of reliability. It then deals with probability and the calculation of reliability, the reliability function and failure rate, probability distributions used in reliability, estimation of MTBF, downtime, maintainability and availability, breakdown and preventive maintenance, design for reliability, reliability prediction and statistics, reliability testing, reliability data, and the design and management of reliability.
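The core quantities the book covers have a simple worked form for the most common lifetime model, the exponential distribution with constant failure rate λ: the reliability function is R(t) = exp(-λt) and MTBF = 1/λ. The numbers below are illustrative.

```python
import math

# Reliability function and MTBF for a constant-failure-rate (exponential)
# lifetime model: R(t) = exp(-lam * t), MTBF = 1 / lam.

def reliability(t, lam):
    """Probability that the item survives to time t, failure rate lam per hour."""
    return math.exp(-lam * t)

def mtbf(lam):
    """Mean time between failures for a constant failure rate."""
    return 1.0 / lam

lam = 1e-4                                # one failure per 10,000 hours
print(mtbf(lam))                          # 10000.0 hours
print(round(reliability(1000, lam), 4))   # 0.9048 - survival over 1,000 h
```

Note that at t = MTBF the survival probability is only exp(-1) ≈ 0.37, a standard caveat when interpreting MTBF figures.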

  12. HPTLC-FLD-SERS as a facile and reliable screening tool: Exemplarily shown with tyramine in cheese

    Directory of Open Access Journals (Sweden)

    Liao Wang

    2018-04-01

    The serious cytotoxicity of tyramine has attracted marked attention, as it induces necrosis of human intestinal cells. This paper presents a novel and facile high performance thin-layer chromatography (HPTLC) method tailored for screening tyramine in cheese. Separation was performed on glass-backed silica gel plates, using methanol/ethyl acetate/ammonia (6/4/1, v/v/v) as the mobile phase. Special efforts were focused on optimizing the conditions (substrate preparation, laser wavelength, salt types and concentrations) of surface enhanced Raman spectroscopy (SERS) measurements performed directly on the plates after derivatization, which enabled molecule-specific identification of the targeted bands. In parallel, fluorescent densitometry (FLD) scanning at 380 nm supported reliable screening of sophisticated samples like food and herbal drugs, striking an excellent balance between specificity, sensitivity and simplicity. Keywords: FLD, HPTLC, SERS, Screening, Tyramine

  13. Polar Bears or People?: How Framing Can Provide a Useful Analytic Tool to Understand & Improve Climate Change Communication in Classrooms

    Science.gov (United States)

    Busch, K. C.

    2014-12-01

    Not only will young adults bear the brunt of climate change's effects, they are also the ones who will be required to take action - to mitigate and to adapt. The Next Generation Science Standards include climate change, ensuring the topic will be covered in U.S. science classrooms in the near future. Additionally, school is a primary source of information about climate change for young adults. The larger question, though, is how can the teaching of climate change be done in such a way as to ascribe agency - a willingness to act - to students? Framing - as both a theory and an analytic method - has been used to understand how language in the media can affect the audience's intention to act. Frames function as a two-way filter, affecting both the message sent and the message received. This study adapted both the theory and the analytic methods of framing, applying them to teachers in the classroom to answer the research question: How do teachers frame climate change in the classroom? To answer this question, twenty-five lessons from seven teachers were analyzed using semiotic discourse analysis methods. It was found that the teachers' frames overlapped to form two distinct discourses: a Science Discourse and a Social Discourse. The Science Discourse, which was dominant, can be summarized as: Climate change is a current scientific problem that will have profound global effects on the Earth's physical systems. The Social Discourse, used much less often, can be summarized as: Climate change is a future social issue because it will have negative impacts at the local level on people. While it may not be surprising that the Science Discourse was most often heard in these science classrooms, it is possibly problematic if it were the only discourse used. The research literature on framing indicates that the frames found in the Science Discourse - global scale, scientific statistics and facts, and impact on the Earth's systems - are not likely to inspire action-taking. 

  14. Supervisor assessment of clinical and professional competence of medical trainees: a reliability study using workplace data and a focused analytical literature review.

    NARCIS (Netherlands)

    McGill, D.A.; Vleuten, C.P.M. van der; Clarke, M.J.

    2011-01-01

    Even though rater-based judgements of clinical competence are widely used, they are context sensitive and vary between individuals and institutions. To deal adequately with rater-judgement unreliability, evaluating the reliability of workplace rater-based assessments in the local context is essential.

  15. Supervisor Assessment of Clinical and Professional Competence of Medical Trainees: A Reliability Study Using Workplace Data and a Focused Analytical Literature Review

    Science.gov (United States)

    McGill, D. A.; van der Vleuten, C. P. M.; Clarke, M. J.

    2011-01-01

    Even though rater-based judgements of clinical competence are widely used, they are context sensitive and vary between individuals and institutions. To deal adequately with rater-judgement unreliability, evaluating the reliability of workplace rater-based assessments in the local context is essential. Using such an approach, the primary intention…

  16. NIR spectroscopy as a process analytical technology (PAT) tool for monitoring and understanding of a hydrolysis process.

    Science.gov (United States)

    Wu, Zhisheng; Peng, Yanfang; Chen, Wei; Xu, Bing; Ma, Qun; Shi, Xinyuan; Qiao, Yanjiang

    2013-06-01

    The use of near infrared spectroscopy was investigated as a process analytical technology to monitor the amino acid concentration profile during the hydrolysis process of Cornu Bubali. A protocol was followed, including outlier selection using a plot of residuals versus leverage, and calibration models using interval partial least squares and synergy interval partial least squares (SiPLS). A strategy of four robust root mean square error of prediction (RMSEP) values was developed to assess the calibration models by means of the desirability index. Furthermore, multivariate quantification limit (MQL) values of the optimum model were determined using two types of error. The SiPLS(3) models for L-proline, L-tyrosine, L-valine, L-phenylalanine and L-lysine provided excellent accuracies, with RMSEP values of 0.0915 mg/mL, 0.1605 mg/mL, 0.0515 mg/mL, 0.0586 mg/mL and 0.0613 mg/mL, respectively. The MQL ranged from 90 ppm to 810 ppm, which confirmed that these models are suitable for most applications. Copyright © 2013 Elsevier Ltd. All rights reserved.
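The RMSEP figures quoted above are root mean square errors between model-predicted and reference concentrations over an independent prediction set. A minimal sketch, with made-up concentration values rather than the paper's data:

```python
import math

# Root mean square error of prediction (RMSEP): the standard figure of merit
# for NIR calibration models, computed over an independent prediction set.

def rmsep(y_ref, y_pred):
    """RMSEP between reference and predicted values (same units as the data)."""
    return math.sqrt(sum((r - p) ** 2 for r, p in zip(y_ref, y_pred)) / len(y_ref))

reference = [1.20, 0.85, 1.05, 0.90]   # mg/mL, hypothetical reference assays
predicted = [1.25, 0.80, 1.10, 0.88]   # mg/mL, hypothetical NIR predictions
print(round(rmsep(reference, predicted), 4))   # 0.0444
```

Because RMSEP carries the units of the analyte concentration, it can be compared directly against the working range to judge whether a model is fit for purpose.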

  17. Analytical solution of concentric two-pole Halbach cylinders as a preliminary design tool for magnetic refrigeration systems

    Science.gov (United States)

    Fortkamp, F. P.; Lozano, J. A.; Barbosa, J. R.

    2017-12-01

    This work presents a parametric analysis of the performance of nested permanent magnet Halbach cylinders intended for applications in magnetic refrigeration and heat pumping. An analytical model for the magnetic field generated by the cylinders is used to systematically investigate the influence of their geometric parameters. The proposed configuration generates two poles in the air gap between the cylinders, where active magnetic regenerators are positioned for conversion of magnetic work into cooling capacity or heat power. A sample geometry based on previous designs of magnetic refrigerators is investigated, and the results show that the magnetic field in the air gap oscillates between 0 and approximately 1 T, forming a rectified cosine profile along the circumference of the gap. Calculations of the energy density of the magnets indicate the need to operate at low energy (particularly for the inner cylinder) in order to generate a magnetic profile suitable for a magnetic cooler. In practice, these low-energy regions of the magnet can potentially be replaced by soft ferromagnetic material. A parametric analysis of the air gap height shows that there are optimal values which maximize the magnet efficiency parameter Λcool. Some combinations of cylinder radii resulted in magnetic field changes that were too small for practical purposes. No demagnetization of the cylinders was found for the range of parameters considered.
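The textbook result behind such preliminary design tools is that an ideal, infinitely long dipole (two-pole) Halbach cylinder produces a uniform bore field of magnitude B = B_rem ln(R_o/R_i); for nested cylinders the two bore fields add vectorially, which is what produces a field varying between near zero and a maximum. The radii and remanence below are illustrative assumptions, not the paper's geometry.

```python
import math

# Ideal infinitely long dipole Halbach cylinder: bore field B = B_rem*ln(Ro/Ri).
# For two nested cylinders the bore fields add vectorially, so the gap field
# depends on the relative rotation angle between the cylinders.

def halbach_bore_field(b_rem, r_outer, r_inner):
    """Uniform bore flux density (T) of an ideal dipole Halbach cylinder."""
    return b_rem * math.log(r_outer / r_inner)

def nested_gap_field(b1, b2, angle_rad):
    """Magnitude of the vector sum of two coplanar fields at angle angle_rad."""
    return math.sqrt(b1**2 + b2**2 + 2.0 * b1 * b2 * math.cos(angle_rad))

b_out = halbach_bore_field(1.4, 0.16, 0.11)   # outer cylinder, B_rem = 1.4 T
b_in = halbach_bore_field(1.4, 0.10, 0.07)    # inner cylinder
print(round(nested_gap_field(b_out, b_in, 0.0), 3))        # 1.024 - fields add
print(round(nested_gap_field(b_out, b_in, math.pi), 3))    # 0.025 - fields cancel
```

The aligned/opposed extremes are the high-field and low-field states used to magnetize and demagnetize the regenerator in a rotating magnetic refrigerator.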

  18. Temperature-controlled micro-TLC: a versatile green chemistry and fast analytical tool for separation and preliminary screening of steroids fraction from biological and environmental samples.

    Science.gov (United States)

    Zarzycki, Paweł K; Slączka, Magdalena M; Zarzycka, Magdalena B; Bartoszuk, Małgorzata A; Włodarczyk, Elżbieta; Baran, Michał J

    2011-11-01

    This paper is a continuation of our previous research focusing on the development of micro-TLC methodology under temperature-controlled conditions. The main goal of the present paper is to demonstrate the separation and detection capability of the micro-TLC technique using simple analytical protocols without multi-step sample pre-purification. One advantage of planar chromatography over its column counterpart is that each TLC run can be performed on a previously unused stationary phase. Therefore, it is possible to fractionate or separate complex samples characterized by heavy biological matrix loading. In the present studies, components of interest, mainly steroids, were isolated from biological samples such as fish bile using single pre-treatment steps involving direct organic liquid extraction and/or deproteinization by freeze-drying. Low-molecular-mass compounds with polarity ranging from estetrol to progesterone derived from environmental samples (lake water, untreated and treated sewage waters) were concentrated using optimized solid-phase extraction (SPE). Specific band patterns for samples derived from surface water of the Middle Pomerania region in northern Poland can easily be observed on the resulting micro-TLC chromatograms. This approach can be useful as a simple and inexpensive complementary method for fast control and screening of treated sewage water discharged by municipal wastewater treatment plants. Moreover, our experimental results show the potential of micro-TLC as an efficient tool for retention measurements of a wide range of steroids under reversed-phase (RP) chromatographic conditions. These data can be used for further optimization of SPE or HPLC systems working under RP conditions. We also demonstrated that a micro-TLC based analytical approach can be applied as an effective method for the search of internal standard (IS) substances. 
Generally, the described methodology can be applied for fast fractionation or screening of the

  19. MICROSCOPY, MICRO-CHEMISTRY AND FTIR AS ANALYTICAL TOOLS FOR IDENTIFYING TRANSPARENT FINISHES: CASE STUDIES FROM ASTRA MUSEUM – SIBIU

    Directory of Open Access Journals (Sweden)

    Maria Cristina TIMAR

    2015-12-01

    Conservation of cultural heritage relies on scientific investigation of artefacts, a key point being identification of the original materials. In this context, besides wood species identification, investigation of finishing layers is of utmost importance for old furniture and any other wooden objects with historic, documentary or artistic value. The present paper refers to a series of micro-destructive investigation methods applied for identification of finishing materials, namely: simple in situ and laboratory physical tests, optical microscopy, micro-chemistry and FTIR-ATR analysis. Small samples of finishing layers were taken from four furniture objects belonging to CNM ASTRA Sibiu and were analysed according to the usual procedures of the laboratories in Sibiu and Brasov. The results showed that physical tests and microscopy are useful for obtaining basic information on the samples' morphology and the possible classes of coating materials, while successive micro-chemistry tests revealed more specific information on the type of finishing materials. FTIR-ATR is a rapid method of identifying coating materials based on available reference samples or spectra. However, this is not always straightforward, and preliminary physical solubility tests are useful to select adequate references, while micro-chemistry tests can complete the FTIR result, especially for those components of the finishing layer present in very small amounts (less than 5%, below the FTIR sensitivity). Corroboration of microscopy, physical and micro-chemistry tests with FTIR can provide more reliable results in terms of finish identification, as well as valuable information for restoration.

  20. Use of cartography in historical seismicity analysis: a reliable tool to better apprehend the contextualization of the historical documents

    Science.gov (United States)

    Thibault, Fradet; Grégory, Quenet; Kevin, Manchuel

    2014-05-01

    Historical studies, including historical seismicity analysis, deal with historical documents. Numerous factors, such as culture, social condition, demography, and political or religious situations and opinions, influence the way events are transcribed in the archives. As a consequence, it is crucial to contextualize and compare the historical documents reporting on a given event in order to reduce the uncertainties affecting their analysis and interpretation. When studying historical seismic events it is often tricky to get a global view of all the information provided by the historical documents. It is also difficult to extract cross-correlated information from the documents and draw a precise historical context. Cartographic and geographic tools in GIS software are best suited to the synthesis, interpretation and contextualization of this historical material. The main goal is to produce the most complete dataset of available information, in order to take into account all the components of the historical context and consequently improve the macroseismic analysis. The Entre-Deux-Mers earthquake (1759, Iepc = VII-VIII) [SISFRANCE 2013 - EDF-IRSN-BRGM] is well documented but has never benefited from a cross-analysis of historical documents and historical context elements. The map of available intensity data from SISFRANCE highlights a gap in macroseismic information within the estimated epicentral area. The aim of this study is to understand the origin of this gap by making a cartographic compilation of both archive information and historical context elements. The results support the hypothesis that the lack of documents and macroseismic data in the epicentral area is related to low human activity rather than to low seismic effects in this zone. 
Topographic features, geographical position, flood hazard, the locations of roads and pathways, the distribution of vineyards and the forest coverage, mentioned in the archives and reported on the Cassini map, confirm this hypothesis.

  1. Factor Structure, Reliability and Measurement Invariance of the Alberta Context Tool and the Conceptual Research Utilization Scale, for German Residential Long Term Care

    Science.gov (United States)

    Hoben, Matthias; Estabrooks, Carole A.; Squires, Janet E.; Behrens, Johann

    2016-01-01

    We translated the Canadian residential long term care versions of the Alberta Context Tool (ACT) and the Conceptual Research Utilization (CRU) Scale into German, to study the association between organizational context factors and research utilization in German nursing homes. The rigorous translation process was based on best practice guidelines for tool translation, and we previously published methods and results of this process in two papers. Both instruments are self-report questionnaires used with care providers working in nursing homes. The aim of this study was to assess the factor structure, reliability, and measurement invariance (MI) between care provider groups responding to these instruments. In a stratified random sample of 38 nursing homes in one German region (Metropolregion Rhein-Neckar), we collected questionnaires from 273 care aides, 196 regulated nurses, 152 allied health providers, 6 quality improvement specialists, 129 clinical leaders, and 65 nursing students. The factor structure was assessed using confirmatory factor models. The first model included all 10 ACT concepts. We also decided a priori to run two separate models for the scale-based and the count-based ACT concepts as suggested by the instrument developers. The fourth model included the five CRU Scale items. Reliability scores were calculated based on the parameters of the best-fitting factor models. Multiple-group confirmatory factor models were used to assess MI between provider groups. Rather than the hypothesized ten-factor structure of the ACT, confirmatory factor models suggested 13 factors. The one-factor solution of the CRU Scale was confirmed. The reliability was acceptable (>0.7 in the entire sample and in all provider groups) for 10 of 13 ACT concepts, and high (0.90–0.96) for the CRU Scale. We could demonstrate partial strong MI for both ACT models and partial strict MI for the CRU Scale. 
Our results suggest that the scores of the German ACT and the CRU Scale for nursing

  2. Measurement methods to assess diastasis of the rectus abdominis muscle (DRAM): A systematic review of their measurement properties and meta-analytic reliability generalisation.

    Science.gov (United States)

    van de Water, A T M; Benjamin, D R

    2016-02-01

    Systematic literature review. Diastasis of the rectus abdominis muscle (DRAM) has been linked with low back pain and abdominal and pelvic dysfunction. Measurement is used either to screen for DRAM or to monitor DRAM width. Determining which methods are suitable for screening and monitoring DRAM is of clinical value. To identify the best methods to screen for DRAM presence and monitor DRAM width, the AMED, Embase, Medline, PubMed and CINAHL databases were searched for measurement property studies of DRAM measurement methods. Population characteristics, measurement methods/procedures and measurement information were extracted from the included studies. The quality of all studies was evaluated using 'quality rating criteria'. When possible, reliability generalisation was conducted to provide combined reliability estimations. Thirteen studies evaluated the measurement properties of the 'finger width' method, tape measure, calipers, ultrasound, CT and MRI; ultrasound was the most evaluated. The methodological quality of these studies varied widely. Pearson's correlations of r = 0.66-0.79 were found between caliper and ultrasound measurements. Calipers and ultrasound had intraclass correlation coefficients (ICC) of 0.78-0.97 for test-retest, inter- and intra-rater reliability. The 'finger width' method had weighted kappas of 0.73-0.77 for test-retest reliability, but moderate agreement (63%; weighted kappa = 0.53) between raters. Comparing calipers and ultrasound, low measurement error was found (above the umbilicus), and the methods had good agreement (83%; weighted kappa = 0.66) for discriminative purposes. The available information supports ultrasound and calipers as adequate methods to assess DRAM. For the other methods, limited measurement information of low to moderate quality is available, and further evaluation of their measurement properties is required. Copyright © 2015 Elsevier Ltd. All rights reserved.
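The weighted kappa statistics reported above quantify agreement on ordinal categories while crediting near-misses. A minimal sketch with linear weights follows; the three DRAM severity categories and the two raters' scores are made-up illustration data, not values from the review.

```python
# Linearly weighted Cohen's kappa for two raters on ordinal categories:
# kappa_w = 1 - (weighted observed disagreement / weighted expected disagreement),
# with disagreement weights w(i, j) = |i - j| / (k - 1).

def weighted_kappa(rater_a, rater_b, categories):
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    n = len(rater_a)
    # observed joint proportion matrix and its marginals
    obs = [[0.0] * k for _ in range(k)]
    for a, b in zip(rater_a, rater_b):
        obs[idx[a]][idx[b]] += 1.0 / n
    pa = [sum(obs[i][j] for j in range(k)) for i in range(k)]
    pb = [sum(obs[i][j] for i in range(k)) for j in range(k)]

    def w(i, j):                      # linear disagreement weight
        return abs(i - j) / (k - 1)

    num = sum(w(i, j) * obs[i][j] for i in range(k) for j in range(k))
    den = sum(w(i, j) * pa[i] * pb[j] for i in range(k) for j in range(k))
    return 1.0 - num / den

cats = [0, 1, 2]                      # e.g. none / moderate / severe diastasis
a = [0, 0, 1, 1, 2, 2, 1, 0]          # hypothetical ratings, rater A
b = [0, 1, 1, 1, 2, 1, 1, 0]          # hypothetical ratings, rater B
print(round(weighted_kappa(a, b, cats), 3))   # 0.667
```

With linear weights, a one-category disagreement costs half as much as a two-category one, which is why kappa here exceeds what unweighted agreement alone would suggest.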

  3. Nexusing Charcoal in South Mozambique: A Proposal To Integrate the Nexus Charcoal-Food-Water Analysis With a Participatory Analytical and Systemic Tool

    Directory of Open Access Journals (Sweden)

    Ricardo Martins

    2018-06-01

    Nexus analysis identifies and explores the synergies and trade-offs between energy, food and water systems, considered as interdependent systems interacting with contextual drivers (e.g., climate change, poverty). The Nexus is thus a valuable analytical and policy-design-supporting tool to address the widely discussed links between bioenergy, food and water. In fact, the Nexus provides a more integrative and broader approach than the single isolated system approach that has characterized many bioenergy analyses and policies of recent decades. In particular, for the south of Mozambique, charcoal production, food insecurity and water scarcity have been related in separate studies, and thus Nexus analysis would be expected to provide the basis for integrated policies and strategies focused on charcoal as a development factor. However, to date there is no Nexus analysis focused on charcoal in Mozambique, nor an assessment of the comprehensiveness and relevance of Nexus analysis when applied to charcoal energy systems. To address these gaps, this work applies the Nexus to the charcoal-food-water system in Mozambique, integrating national, regional and international studies analysing the isolated, or pairs of, systems. This integration results in a novel Nexus analysis graphic for the charcoal-food-water relationship. Then, to assess the comprehensiveness and depth of analysis, this Nexus analysis is critically compared with the 2MBio-A, a systems analytical and design framework based on a design tool specifically developed for bioenergy (the 2MBio). The results reveal that Nexus analysis is "blind" to specific fundamental social, ecological and socio-historical dynamics of charcoal energy systems. The critical comparison also suggests the need to integrate the high-level systems analysis of the Nexus with non-deterministic, non-prescriptive participatory analysis tools, like the 2MBio-A, as a means to

  4. Neurophysiological analytics for all! Free open-source software tools for documenting, analyzing, visualizing, and sharing using electronic notebooks.

    Science.gov (United States)

    Rosenberg, David M; Horn, Charles C

    2016-08-01

    Neurophysiology requires an extensive workflow of information analysis routines, which often includes incompatible proprietary software, introducing limitations based on financial costs, transfer of data between platforms, and the ability to share. An ecosystem of free open-source software exists to fill these gaps, including thousands of analysis and plotting packages written in Python and R, which can be implemented in a sharable and reproducible format, such as the Jupyter electronic notebook. This tool chain can largely replace current routines by importing data, producing analyses, and generating publication-quality graphics. An electronic notebook like Jupyter allows these analyses, along with documentation of procedures, to display locally or remotely in an internet browser, which can be saved as an HTML, PDF, or other file format for sharing with team members and the scientific community. The present report illustrates these methods using data from electrophysiological recordings of the musk shrew vagus, a model system to investigate gut-brain communication, for example, in cancer chemotherapy-induced emesis. We show methods for spike sorting (including statistical validation), spike train analysis, and analysis of compound action potentials in notebooks. Raw data and code are available from notebooks in data supplements or from an executable online version, which replicates all analyses without installing software, an implementation of reproducible research. This demonstrates the promise of combining disparate analyses into one platform, along with the ease of sharing this work. In an age of diverse, high-throughput computational workflows, this methodology can increase efficiency, transparency, and the collaborative potential of neurophysiological research. Copyright © 2016 the American Physiological Society.
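The first step of a spike-sorting workflow like the one described is typically threshold-crossing spike detection, which is easy to express in an open-source Python stack. The sketch below uses a synthetic trace and a common median-based noise estimate; the threshold, sampling rate and signal are illustrative assumptions, not the paper's recording parameters.

```python
import numpy as np

# Threshold-crossing spike detection on a synthetic extracellular trace.
# Noise SD is estimated robustly from the median absolute amplitude
# (median(|x|)/0.6745), a common choice because spikes inflate the plain SD.

def detect_spikes(signal, fs, thresh_sd=4.0, refractory_s=0.001):
    """Return sample indices where |signal| first crosses thresh_sd * noise SD."""
    sigma = np.median(np.abs(signal)) / 0.6745
    above = np.abs(signal) > thresh_sd * sigma
    crossings = np.flatnonzero(above[1:] & ~above[:-1]) + 1
    # enforce a refractory period so one spike yields one event
    spikes, last = [], -np.inf
    for i in crossings:
        if i - last >= refractory_s * fs:
            spikes.append(int(i))
            last = i
    return spikes

fs = 20_000
rng = np.random.default_rng(0)
trace = rng.normal(0.0, 1.0, fs)      # 1 s of unit-variance noise
trace[[5_000, 12_000]] = 10.0         # two injected "spikes"
print(detect_spikes(trace, fs))       # includes indices 5000 and 12000
```

Inside a notebook, the detected indices would feed directly into waveform extraction and clustering, with the whole chain documented alongside the code.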

  5. Development, reliability and use of a food environment assessment tool in supermarkets of four neighbourhoods in Montréal, Canada.

    Science.gov (United States)

    Jalbert-Arsenault, Élise; Robitaille, Éric; Paquette, Marie-Claude

    2017-09-01

    The food environment is a promising arena in which to influence people's dietary habits. This study aimed to develop a comprehensive food environment assessment tool for businesses and to characterize the food environment of a low-to-medium income area of Montréal, Canada. We developed a tool, Mesure de l'environnement alimentaire du consommateur dans les supermarchés (MEAC-S), and tested it for reliability. We used the MEAC-S to assess the consumer food environment of 17 supermarkets in four neighbourhoods of Montréal. We measured the shelf length, variety, price, display counts and in-store positions of fruits and vegetables (FV) and ultra-processed food products (UPFPs). We also assessed fresh FV for quality. Store size was estimated using the total measured shelf length for all food categories. We conducted Spearman correlations between these indicators of the food environment. Reliability analyses revealed satisfactory results for most indicators. Characterization of the food environment revealed high variability in shelf length, variety and price of FV between supermarkets and suggested a disproportionate promotion of UPFPs. Display counts of UPFPs outside their normal display location ranged from 7 to 26, and they occupied 8 to 33 strategic in-store positions, whereas the number of display counts of fresh FV outside their normal display location exceeded 1 in only 2 of the 17 stores surveyed, and they occupied a maximum of 2 strategic in-store positions per supermarket. Price of UPFPs was inversely associated with their prominence. The observed variability in the food environment between supermarkets underscores the importance of measuring in-store characteristics to adequately picture the consumer food environment.

  6. Development, validity and reliability testing of the East Midlands Evaluation Tool (EMET) for measuring impacts on trainees' confidence and competence following end of life care training.

    Science.gov (United States)

    Whittaker, B; Parry, R; Bird, L; Watson, S; Faull, C

    2017-02-02

    To develop, test and validate a versatile questionnaire, the East Midlands Evaluation Tool (EMET), for measuring effects of end of life care training events on trainees' self-reported confidence and competence. A paper-based questionnaire was designed on the basis of the English Department of Health's core competences for end of life care, with sections for completion pre-training, immediately post-training and for longer-term follow-up. Preliminary versions were field tested at 55 training events delivered by 13 organisations to 1793 trainees working in diverse health and social care backgrounds. Iterative rounds of development aimed to maximise relevance to events and trainees. Internal consistency was assessed by calculating interitem correlations on questionnaire responses during field testing. Content validity was assessed via qualitative content analysis of (1) responses to questionnaires completed by field tester trainers and (2) field notes from a workshop with a separate cohort of experienced trainers. Test-retest reliability was assessed via repeat administration to a cohort of student nurses. The EMET comprises 27 items with Likert-scaled responses supplemented by questions seeking free-text responses. It measures changes in self-assessed confidence and competence on 5 subscales: communication skills; assessment and care planning; symptom management; advance care planning; overarching values and knowledge. Test-retest reliability was found to be good, as was internal consistency: the questions successfully assess different aspects of the same underlying concept. The EMET provides a time-efficient, reliable and flexible means of evaluating effects of training on self-reported confidence and competence in the key elements of end of life care. Published by the BMJ Publishing Group Limited.

  7. Direct analysis in real time mass spectrometry, a process analytical technology tool for real-time process monitoring in botanical drug manufacturing.

    Science.gov (United States)

    Wang, Lu; Zeng, Shanshan; Chen, Teng; Qu, Haibin

    2014-03-01

A promising process analytical technology (PAT) tool has been introduced for batch process monitoring. Direct analysis in real time mass spectrometry (DART-MS), a means of rapid fingerprint analysis, was applied to a percolation process with multi-constituent substances for an anti-cancer botanical preparation. Fifteen batches were carried out, including ten normal operations and five abnormal batches with artificial variations. The obtained multivariate data were analyzed by a multi-way partial least squares (MPLS) model. Control trajectories were derived from eight normal batches, and the qualification was tested by R² and Q². Accuracy and diagnosis capability of the batch model were then validated on the remaining batches. Assisted by high performance liquid chromatography (HPLC) determination, process faults were explained by the corresponding variable contributions. Furthermore, a batch-level model was developed to compare and assess model performance. The present study demonstrates that DART-MS is very promising for process monitoring in botanical manufacturing. Compared with general PAT tools, DART-MS gives a particular account of the effective constituents and can potentially be used to improve batch quality and process consistency for samples in complex matrices. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. Health Heritage© a web-based tool for the collection and assessment of family health history: initial user experience and analytic validity.

    Science.gov (United States)

    Cohn, W F; Ropka, M E; Pelletier, S L; Barrett, J R; Kinzie, M B; Harrison, M B; Liu, Z; Miesfeldt, S; Tucker, A L; Worrall, B B; Gibson, J; Mullins, I M; Elward, K S; Franko, J; Guterbock, T M; Knaus, W A

    2010-01-01

A detailed family health history is currently the most potentially useful tool for diagnosis and risk assessment in clinical genetics. We developed and evaluated the usability and analytic validity of a patient-driven web-based family health history collection and analysis tool. Health Heritage© guides users through the collection of their family health history by relative, generates a pedigree, and completes risk assessment, stratification, and recommendations for 89 conditions. We compared the performance of Health Heritage to that of Usual Care using a nonrandomized cohort trial of 109 volunteers. We contrasted the completeness and sensitivity of family health history collection and risk assessments derived from Health Heritage and Usual Care to those obtained by genetic counselors and genetic assessment teams. Nearly half (42%) of the Health Heritage participants reported discovery of health risks; 63% found the information easy to understand and 56% indicated it would change their health behavior. Health Heritage consistently outperformed Usual Care in the completeness and accuracy of family health history collection, identifying 60% of the elevated risk conditions specified by the genetic team versus 24% identified by Usual Care. Health Heritage also had greater sensitivity than Usual Care when comparing the identification of risks. These results suggest a strong role for automated family health history collection and risk assessment and underscore the potential of these data to serve as the foundation for comprehensive, cost-effective personalized genomic medicine. Copyright © 2010 S. Karger AG, Basel.

  9. The Mayo Dysphagia Questionnaire-30: documentation of reliability and validity of a tool for interventional trials in adults with esophageal disease.

    Science.gov (United States)

    McElhiney, Judith; Lohse, Matthew R; Arora, Amindra S; Peloquin, Joanna M; Geno, Debra M; Kuntz, Melissa M; Enders, Felicity B; Fredericksen, Mary; Abdalla, Adil A; Khan, Yulia; Talley, Nicholas J; Diehl, Nancy N; Beebe, Timothy J; Harris, Ann M; Farrugia, Gianrico; Graner, Darlene E; Murray, Joseph A; Locke, G Richard; Grothe, Rayna M; Crowell, Michael D; Francis, Dawn L; Grudell, April M B; Dabade, Tushar; Ramirez, Angelica; Alkhatib, MhdMaan; Alexander, Jeffrey A; Kimber, Jessica; Prasad, Ganapathy; Zinsmeister, Alan R; Romero, Yvonne

    2010-09-01

The aim of this study was to develop the Mayo Dysphagia Questionnaire-30 Day (MDQ-30), a tool to measure esophageal dysphagia, by adapting items from validated instruments for use in clinical trials, and to assess its feasibility, reproducibility, and concurrent validity. Outpatients referred to endoscopy for dysphagia or seen in a specialty clinic were recruited. Feasibility testing was done to identify problematic items. Reproducibility was measured in a test-retest format. Concurrent validity reflects agreement between information gathered in a structured interview and the patients' written responses. The MDQ-30, a 28-item instrument, took 10 min (range = 5-30 min) to complete. Four hundred thirty-one outpatients [210 (49%) men; mean age = 61 years] participated. Overall, most concurrent validity kappa values for dysphagia were very good to excellent, with a median of 0.78 (min 0.28, max 0.95). The majority of reproducibility kappa values for dysphagia were moderate to excellent, with a median kappa value of 0.66 (min 0.07, max 1.0). Overall, concurrent validity and reproducibility kappa values for gastroesophageal reflux disease (GERD) symptoms were 0.81 (95% CI = 0.72, 0.91) and 0.66 (95% CI = 0.55, 0.77), respectively. Individual item percent agreement was generally very good to excellent. Internal consistency was excellent. We conclude that the MDQ-30 is an easy-to-complete tool for reliably evaluating dysphagia symptoms over the previous 30 days.
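The kappa values quoted above are Cohen's kappa, which corrects raw percent agreement for the agreement expected by chance. A pure-Python sketch on hypothetical test-retest ratings for a yes/no symptom item (not data from the MDQ-30 study):

```python
# Cohen's kappa for agreement between two sets of categorical ratings,
# e.g. test vs. retest responses to one questionnaire item.
# The ratings below are made up for illustration.

def cohens_kappa(a, b):
    """a, b: equal-length lists of categorical ratings."""
    n = len(a)
    cats = sorted(set(a) | set(b))
    # Observed agreement: proportion of identical pairs.
    p_obs = sum(x == y for x, y in zip(a, b)) / n
    # Chance agreement: product of marginal proportions per category.
    p_exp = sum((a.count(c) / n) * (b.count(c) / n) for c in cats)
    return (p_obs - p_exp) / (1 - p_exp)

test = ["yes", "yes", "no", "no", "yes", "no", "yes", "no", "yes", "yes"]
retest = ["yes", "yes", "no", "yes", "yes", "no", "yes", "no", "no", "yes"]
print(round(cohens_kappa(test, retest), 2))  # -> 0.58, "moderate" agreement
```

By the usual benchmarks, values around 0.6 are read as moderate to good and values above 0.8 as excellent, which is how the median kappas above are interpreted.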

  10. CorRECTreatment: a web-based decision support tool for rectal cancer treatment that uses the analytic hierarchy process and decision tree.

    Science.gov (United States)

    Suner, A; Karakülah, G; Dicle, O; Sökmen, S; Çelikoğlu, C C

    2015-01-01

The selection of appropriate rectal cancer treatment is a complex multi-criteria decision making process, in which clinical decision support systems might be used to assist and enrich physicians' decision making. The objective of the study was to develop a web-based clinical decision support tool for physicians in the selection of potentially beneficial treatment options for patients with rectal cancer. The decision support model, developed in our previous study by combining the Analytic Hierarchy Process (AHP) method, which determines the priority of criteria, with a decision tree formed using these priorities, was updated and applied to data from 388 patients collected retrospectively. The updated decision model contained 8 and 10 criteria in the first and second steps, respectively. A web-based decision support tool named corRECTreatment was then developed, and the agreement between its treatment recommendations and expert opinion was examined. Two surgeons were asked to recommend a treatment and an overall survival value for 20 cases that we selected from among the most common and rare treatment options in the patient data set and turned into scenarios. In the AHP analyses of the criteria, the matrices generated for both decision steps were found to be consistent (consistency ratio below the conventional 0.10 threshold). Compared with the decisions of the experts, the consistency value for the most frequent cases was 80% for the first decision step and 100% for the second decision step; similarly, for rare cases, consistency was 50% for the first decision step and 80% for the second decision step. The decision model and corRECTreatment, developed by applying it to real patient data, are expected to provide potential users with decision support in rectal cancer treatment processes and to facilitate projections about treatment options.
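The AHP consistency check mentioned above compares a pairwise-comparison matrix's principal eigenvalue to the matrix order. A sketch of the standard Saaty procedure on a hypothetical 3x3 matrix (not one of the study's decision matrices):

```python
# AHP consistency ratio (CR) for a reciprocal pairwise-comparison matrix.
# CR = CI / RI, where CI = (lambda_max - n) / (n - 1) and RI is Saaty's
# random index for matrices of order n. CR < 0.10 is conventionally
# accepted. The matrix below is hypothetical.

def consistency_ratio(m):
    n = len(m)
    # Priority vector via normalized column averages (Saaty's approximation).
    col_sums = [sum(row[j] for row in m) for j in range(n)]
    w = [sum(m[i][j] / col_sums[j] for j in range(n)) / n for i in range(n)]
    # lambda_max estimated from (M w) / w, averaged over rows.
    mw = [sum(m[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam = sum(mw[i] / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random indices, n >= 3
    return ci / ri

# A perfectly consistent reciprocal matrix (2 * 2 = 4) gives CR = 0.
m = [[1, 2, 4],
     [1 / 2, 1, 2],
     [1 / 4, 1 / 2, 1]]
print(consistency_ratio(m) < 0.10)  # -> True
```

Judgments elicited from real experts are rarely perfectly consistent, so in practice CR is checked against the 0.10 threshold rather than against zero.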

  11. Reliability studies in a developing technology

    International Nuclear Information System (INIS)

    Mitchell, L.A.; Osgood, C.; Radcliffe, S.J.

    1975-01-01

    The standard methods of reliability analysis can only be applied if valid failure statistics are available. In a developing technology the statistics which have been accumulated, over many years of conventional experience, are often rendered useless by environmental effects. Thus new data, which take account of the new environment, are required. This paper discusses the problem of optimizing the acquisition of these data when time-scales and resources are limited. It is concluded that the most fruitful strategy in assessing the reliability of mechanisms is to study the failures of individual joints whilst developing, where necessary, analytical tools to facilitate the use of these data. The approach is illustrated by examples from the field of tribology. Failures of rolling element bearings in moist, high-pressure carbon dioxide illustrate the important effects of apparently minor changes in the environment. New analytical techniques are developed from a study of friction failures in sliding joints. (author)

  12. Development, reliability and use of a food environment assessment tool in supermarkets of four neighbourhoods in Montréal, Canada

    Directory of Open Access Journals (Sweden)

    Élise Jalbert-Arsenault

    2017-09-01

Full Text Available Introduction: The food environment is a promising arena in which to influence people’s dietary habits. This study aimed to develop a comprehensive food environment assessment tool for businesses and characterize the food environment of a low-to-medium income area of Montréal, Canada. Methods: We developed a tool, Mesure de l’environnement alimentaire du consommateur dans les supermarchés (MEAC-S), and tested it for reliability. We used the MEAC-S to assess the consumer food environment of 17 supermarkets in four neighbourhoods of Montréal. We measured the shelf length, variety, price, display counts and in-store positions of fruits and vegetables (FV) and ultra-processed food products (UPFPs). We also assessed fresh FV for quality. Store size was estimated using the total measured shelf length for all food categories. We conducted Spearman correlations between these indicators of the food environment. Results: Reliability analyses revealed satisfactory results for most indicators. Characterization of the food environment revealed high variability in shelf length, variety and price of FV between supermarkets and suggested a disproportionate promotion of UPFPs. Display counts of UPFPs outside their normal display location ranged from 7 to 26, and they occupied 8 to 33 strategic in-store positions, whereas the number of display counts of fresh FV outside their normal display location exceeded 1 in only 2 of the 17 stores surveyed, and they occupied a maximum of 2 strategic in-store positions per supermarket. Price of UPFPs was inversely associated with their prominence (p < .005) and promotion (p < .003). Store size was associated with display counts and strategic in-store positioning of UPFPs (p < .001), but not FV, and was inversely associated with the price of soft drinks (p < .003). Conclusion: This study illustrates the variability of the food environment between supermarkets and underscores the importance of measuring in
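The Spearman correlations used above are Pearson correlations computed on ranks, which makes them robust to non-linear but monotonic relationships between indicators. A pure-Python sketch on made-up store-level data (not the study's measurements):

```python
# Spearman rank correlation between two food-environment indicators.
# The five stores' price and display-count values below are hypothetical.

def rank(xs):
    """Average ranks (1-based), handling ties."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average rank for the tied group
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Pearson correlation of the rank vectors."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical: price vs. display counts for five stores, perfectly
# inverse in rank order, so rho = -1.
price = [2.1, 1.8, 2.5, 1.2, 1.5]
displays = [7, 12, 5, 26, 18]
print(round(spearman(price, displays), 2))  # -> -1.0
```

A negative coefficient of this kind is what the reported inverse association between UPFP price and prominence expresses.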

  13. Analytical tools in accelerator physics

    Energy Technology Data Exchange (ETDEWEB)

    Litvinenko, V.N.

    2010-09-01

This paper is a sub-set of my lectures presented in the Accelerator Physics course (USPAS, Santa Rosa, California, January 14-25, 2008). It is based on notes I wrote during the period from 1976 to 1979 in Novosibirsk. Only a few copies (in Russian) were distributed to my colleagues at the Novosibirsk Institute of Nuclear Physics. The goal of these notes is a complete description, starting from an arbitrary reference orbit, of explicit expressions for the 4-potential and the accelerator Hamiltonian, finishing with parameterization in action and angle variables. To a large degree they follow the logic developed in Theory of Cyclic Particle Accelerators by A.A. Kolomensky and A.N. Lebedev [Kolomensky], but go beyond the book in a number of directions. One unusual feature is the use of matrix functions and the Sylvester formula for calculating the matrices of arbitrary elements. Teaching the USPAS course motivated me to translate a significant part of my notes into English. I also included some introductory materials following Classical Theory of Fields by L.D. Landau and E.M. Lifshitz [Landau]. A large number of short notes covering various techniques are placed in the Appendices.

  14. Analytic tools for Feynman integrals

    Energy Technology Data Exchange (ETDEWEB)

    Smirnov, Vladimir A. [Moscow State Univ. (Russian Federation). Skobeltsyn Inst. of Nuclear Physics

    2012-07-01

The most powerful methods of evaluating Feynman integrals are presented. The reader will be able to apply them in practice. The book contains numerous examples. The goal of this book is to describe the most powerful methods for evaluating multiloop Feynman integrals that are currently used in practice. This book supersedes the author's previous Springer book ''Evaluating Feynman Integrals'' and its textbook version ''Feynman Integral Calculus.'' Since the publication of these two books, powerful new methods have arisen and conventional methods have been improved on in essential ways. A further qualitative change is the fact that most of the methods and the corresponding algorithms have now been implemented in computer codes which are often public. In comparison to the two previous books, three new chapters have been added: One is on sector decomposition, while the second describes a new method by Lee. The third new chapter concerns the asymptotic expansions of Feynman integrals in momenta and masses, which were described in detail in another Springer book, ''Applied Asymptotic Expansions in Momenta and Masses,'' by the author. This chapter describes, on the basis of papers that appeared after the publication of said book, how to algorithmically discover the regions relevant to a given limit within the strategy of expansion by regions. In addition, the chapters on the method of Mellin-Barnes representation and on the method of integration by parts have been substantially rewritten, with an emphasis on the corresponding algorithms and computer codes.

  15. Analytic tools for Feynman integrals

    International Nuclear Information System (INIS)

    Smirnov, Vladimir A.

    2012-01-01

The most powerful methods of evaluating Feynman integrals are presented. The reader will be able to apply them in practice. The book contains numerous examples. The goal of this book is to describe the most powerful methods for evaluating multiloop Feynman integrals that are currently used in practice. This book supersedes the author's previous Springer book ''Evaluating Feynman Integrals'' and its textbook version ''Feynman Integral Calculus.'' Since the publication of these two books, powerful new methods have arisen and conventional methods have been improved on in essential ways. A further qualitative change is the fact that most of the methods and the corresponding algorithms have now been implemented in computer codes which are often public. In comparison to the two previous books, three new chapters have been added: One is on sector decomposition, while the second describes a new method by Lee. The third new chapter concerns the asymptotic expansions of Feynman integrals in momenta and masses, which were described in detail in another Springer book, ''Applied Asymptotic Expansions in Momenta and Masses,'' by the author. This chapter describes, on the basis of papers that appeared after the publication of said book, how to algorithmically discover the regions relevant to a given limit within the strategy of expansion by regions. In addition, the chapters on the method of Mellin-Barnes representation and on the method of integration by parts have been substantially rewritten, with an emphasis on the corresponding algorithms and computer codes.

  16. Analytical tools in accelerator physics

    International Nuclear Information System (INIS)

    Litvinenko, V.N.

    2010-01-01

This paper is a sub-set of my lectures presented in the Accelerator Physics course (USPAS, Santa Rosa, California, January 14-25, 2008). It is based on notes I wrote during the period from 1976 to 1979 in Novosibirsk. Only a few copies (in Russian) were distributed to my colleagues at the Novosibirsk Institute of Nuclear Physics. The goal of these notes is a complete description, starting from an arbitrary reference orbit, of explicit expressions for the 4-potential and the accelerator Hamiltonian, finishing with parameterization in action and angle variables. To a large degree they follow the logic developed in Theory of Cyclic Particle Accelerators by A.A. Kolomensky and A.N. Lebedev (Kolomensky), but go beyond the book in a number of directions. One unusual feature is the use of matrix functions and the Sylvester formula for calculating the matrices of arbitrary elements. Teaching the USPAS course motivated me to translate a significant part of my notes into English. I also included some introductory materials following Classical Theory of Fields by L.D. Landau and E.M. Lifshitz (Landau). A large number of short notes covering various techniques are placed in the Appendices.

  17. Analytic Tools for Feynman Integrals

    CERN Document Server

    Smirnov, Vladimir A

    2012-01-01

    The goal of this book is to describe the most powerful methods for evaluating multiloop Feynman integrals that are currently used in practice.  This book supersedes the author’s previous Springer book “Evaluating Feynman Integrals” and its textbook version “Feynman Integral Calculus.” Since the publication of these two books, powerful new methods have arisen and conventional methods have been improved on in essential ways. A further qualitative change is the fact that most of the methods and the corresponding algorithms have now been implemented in computer codes which are often public. In comparison to the two previous books, three new chapters have been added:  One is on sector decomposition, while the second describes a new method by Lee. The third new chapter concerns the asymptotic expansions of Feynman integrals in momenta and masses, which were described in detail in another Springer book, “Applied Asymptotic Expansions in Momenta and Masses,” by the author. This chapter describes, on t...

  18. The use of generalised audit software by internal audit functions in a developing country: The purpose of the use of generalised audit software as a data analytics tool

    Directory of Open Access Journals (Sweden)

    D.P. van der Nest

    2017-11-01

Full Text Available This article explores the purpose of the use of generalised audit software as a data analytics tool by internal audit functions in the locally controlled banking industry of South Africa. The evolution of the traditional internal audit methodology of collecting audit evidence through the conduct of interviews, the completion of questionnaires, and the testing of controls on a sample basis is long overdue; such practice in the present technological, data-driven era will soon render an internal audit function obsolete. The research results indicate that respondents are utilising GAS for a variety of purposes, but that its frequency of use is not yet optimal and there is still much room for improvement for tests of controls purposes. The top five purposes for which the respondents make use of GAS often to always during separate internal audit engagements are: (1) to identify transactions with specific characteristics or control criteria for tests of control purposes; (2) to conduct full population analysis; (3) to identify account balances over a certain amount; (4) to identify and report on the frequency of occurrence of risks or of specific events; and (5) to obtain audit evidence about control effectiveness.

  19. Development of simulation tools for improvement of measurement accuracy and efficiency in ultrasonic testing. Part 2. Development of fast simulator based on analytical approach

    International Nuclear Information System (INIS)

    Yamada, Hisao; Fukutomi, Hiroyuki; Lin, Shan; Ogata, Takashi

    2008-01-01

CRIEPI developed a high-speed simulation method to predict B-scope images for crack-like defects under ultrasonic testing. The method is based on the geometrical theory of diffraction (GTD) to follow ultrasonic waves transmitted from the angle probe, with reciprocity relations used to derive analytical equations for the echoes received by the probe. The tip and mirror echoes from a slit of arbitrary angle in the thickness direction of the test article and of arbitrary depth can be calculated by this method. The main objective of the study was to develop a high-speed simulation tool to obtain B-scope displays of crack-like defects. This was achieved for simple slits in geometry-change regions by prototype software based on the method. Fairly complete B-scope images for slits could be obtained in about a minute on a current personal computer. The numerical predictions for surface-opening slits were in excellent agreement with the corresponding experimental measurements. (author)

  20. Reliability Engineering

    International Nuclear Information System (INIS)

    Lee, Sang Yong

    1992-07-01

This book is about reliability engineering. It describes the definition and importance of reliability; the development of reliability engineering; the failure rate and failure probability density function and their types; CFR and the exponential distribution; IFR and the normal and Weibull distributions; maintainability and availability; reliability testing and reliability estimation for the exponential, normal and Weibull distribution types; reliability sampling tests; system reliability; design for reliability; and functional failure analysis by FTA.
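The distribution types listed above can be illustrated with the Weibull family, whose shape parameter covers the decreasing, constant (CFR) and increasing (IFR) failure-rate regimes in one formula. A textbook sketch with illustrative parameter values:

```python
# Weibull reliability and hazard functions, a standard reliability-
# engineering example. Parameter values below are illustrative only.
import math

def weibull_reliability(t, beta, eta):
    """R(t) = exp(-(t/eta)^beta): probability of surviving past time t."""
    return math.exp(-((t / eta) ** beta))

def weibull_hazard(t, beta, eta):
    """Failure (hazard) rate h(t) = (beta/eta) * (t/eta)^(beta-1).
    beta < 1: decreasing failure rate (infant mortality);
    beta = 1: constant failure rate, the exponential (CFR) case;
    beta > 1: increasing failure rate (wear-out, IFR)."""
    return (beta / eta) * (t / eta) ** (beta - 1)

# With beta = 1 the Weibull reduces to the exponential distribution:
# R(t) = exp(-t/eta) and a constant hazard 1/eta.
print(round(weibull_reliability(500, 1.0, 1000), 4))  # exp(-0.5)
print(weibull_hazard(500, 1.0, 1000))                 # 1/eta = 0.001
```

For beta > 1 the hazard grows with time, which is the wear-out behavior the IFR chapters of such a book analyze.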

  1. Second International Workshop on Teaching Analytics

    DEFF Research Database (Denmark)

    Vatrapu, Ravi; Reimann, Peter; Halb, Wolfgang

    2013-01-01

    Teaching Analytics is conceived as a subfield of learning analytics that focuses on the design, development, evaluation, and education of visual analytics methods and tools for teachers in primary, secondary, and tertiary educational settings. The Second International Workshop on Teaching Analytics...... (IWTA) 2013 seeks to bring together researchers and practitioners in the fields of education, learning sciences, learning analytics, and visual analytics to investigate the design, development, use, evaluation, and impact of visual analytical methods and tools for teachers’ dynamic diagnostic decision...

  2. A construction of standardized near infrared hyper-spectral teeth database: a first step in the development of reliable diagnostic tool for quantification and early detection of caries

    Science.gov (United States)

    Bürmen, Miran; Usenik, Peter; Fidler, Aleš; Pernuš, Franjo; Likar, Boštjan

    2011-03-01

    Dental caries is a disease characterized by demineralization of enamel crystals leading to the penetration of bacteria into the dentin and pulp. If left untreated, the disease can lead to pain, infection and tooth loss. Early detection of enamel demineralization resulting in increased enamel porosity, commonly known as white spots, is a difficult diagnostic task. Several papers reported on near infrared (NIR) spectroscopy to be a potentially useful noninvasive spectroscopic technique for early detection of caries lesions. However, the conducted studies were mostly qualitative and did not include the critical assessment of the spectral variability of the sound and carious dental tissues and influence of the water content. Such assessment is essential for development and validation of reliable qualitative and especially quantitative diagnostic tools based on NIR spectroscopy. In order to characterize the described spectral variability, a standardized diffuse reflectance hyper-spectral database was constructed by imaging 12 extracted human teeth with natural lesions of various degrees in the spectral range from 900 to 1700 nm with spectral resolution of 10 nm. Additionally, all the teeth were imaged by digital color camera. The influence of water content on the acquired spectra was characterized by monitoring the teeth during the drying process. The images were assessed by an expert, thereby obtaining the gold standard. By analyzing the acquired spectra we were able to accurately model the spectral variability of the sound dental tissues and identify the advantages and limitations of NIR hyper-spectral imaging.

  3. Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.

    Science.gov (United States)

    Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y

    2017-01-01

Protein glycosylation is one of the most important posttranslational modifications, and numerous biological functions are related to it. However, analytical challenges remain in glycoprotein analysis. To overcome them, many analytical techniques have been developed in recent years. Enrichment methods are used to improve the sensitivity of detection, while HPLC and mass spectrometry methods have been developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools have started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis are discussed. Multiple analytical techniques are compared, and the advantages and disadvantages of each are highlighted. © 2017 Elsevier Inc. All rights reserved.

  4. Human factor reliability program

    International Nuclear Information System (INIS)

    Knoblochova, L.

    2017-01-01

The human factor reliability program was introduced at Slovenske elektrarne, a.s. (SE) nuclear power plants as one of the components of the Excellent Performance initiative in 2011. The initiative's goal was to increase the reliability of both people and facilities, in response to three major areas of improvement: the need to improve results, troubleshooting support, and supporting the achievement of the company's goals. In practice, the human factor reliability program included: - tools to prevent human error; - managerial observation and coaching; - human factor analysis; - rapid reporting of events involving a human factor; - human reliability timelines and performance indicators; - basic, periodic and extraordinary training in human factor reliability. (authors)

  5. The MacArthur Competence Assessment Tool-Criminal Adjudication: Factor structure, interrater reliability, and association with clinician opinion of competence in a forensic inpatient sample.

    Science.gov (United States)

    Wood, Mary E; Anderson, Jaime L; Glassmire, David M

    2017-06-01

Adjudicative competence is the most frequently referred evaluation in the forensic context, which makes periodic evaluation of competence assessment instruments imperative. Among those instruments, the MacArthur Competence Assessment Tool-Criminal Adjudication (MacCAT-CA) has demonstrated adequate psychometric properties, suggesting its utility in informing the forensic inquiry. The purpose of the current study was to further investigate the psychometric properties and ultimate utility of subscale scores using archival data from a sample of 103 male and female forensic patients who were hospitalized for competence restoration treatment. Results of the present study suggested adequate internal consistency and good model fit for the factor structure. Interrater reliability was evaluated by comparing the absolute agreement of scores derived from 2 independent research assistants for each of the subscales; 2 of the 3 subscales fell within the acceptable range given established interpretative benchmarks for forensic assessment. Of particular interest, the Appreciation subscale, while yielding the lowest intraclass correlation coefficient, explained the largest proportion of variance in clinician opinion relative to the other 2 subscales. In other words, the most subjective subscale (as evidenced by the lowest intraclass correlation) explained the largest proportion of variance in the ultimate opinion. The authors argue that, although these results are an important consideration in these assessments, they are neither surprising nor entirely problematic given the case-specific nature of the inquiries on the subscale and the subjectivity of the scoring criteria for each of the Appreciation items. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  6. Determination of serum 25-hydroxy cholecalciferol using high-performance liquid chromatography: a reliable tool for assessment of vitamin D status.

    Science.gov (United States)

    Neyestani, Tirang R; Gharavi, A'azam; Kalayi, Ali

    2007-09-01

    This study was undertaken to design and set up a rather simple, reliable, and less expensive high-performance liquid chromatography (HPLC)-based method to assay 25(OH)D as a diagnostic tool for vitamin D assessment. Serum proteins were precipitated using ethanol and, after 10 minutes incubation at room temperature, methanol:isopropanol. The extraction was performed using hexane followed by evaporation under nitrogen flow. The sediment was then reconstituted in methanol and passed through a polypropylene filter. To run the chromatographic analysis, 20 microL of the filtrate was injected to the column. Peaks of 25(OH)D2 and 25(OH)D3 were both detected using a UV detector set at 265 nm. With a flow rate of 1.2 mL/minute, peaks of D3 and D2 vitamers were detected around 9.5 and 10.7 minutes, respectively. The intra- and inter-assay variations were 8.1% and 12.6%, respectively, and the recovery percent was found to be 100 +/- 5%. To compare the procedure with conventional methods, 90 serum samples from subjects (48 females and 42 males) aged 40.5 +/- 13.9 yrs, were analyzed for 25(OH)D using HPLC, competitive protein-binding assay (CPBA), and radioimmunoassay (RIA). Generally, CPBA and RIA assays both showed over-estimation of serum 25(OH)D, compared to HPLC. Though all three methods correlated significantly with each other, with the strongest between HPLC and RIA (r = 0.87, p < 0.001), both RIA and CPBA were found unreliable in detection of some deficient samples.
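The intra- and inter-assay variations quoted above are coefficients of variation (CV%) of replicate measurements. A minimal sketch of the computation on hypothetical replicate values (not data from the study):

```python
# Coefficient of variation (CV%) of replicate assay measurements,
# the figure of merit quoted for intra-/inter-assay variation.
# The replicate values below are hypothetical.
import statistics

def cv_percent(replicates):
    """CV% = 100 * sample standard deviation / mean."""
    return 100 * statistics.stdev(replicates) / statistics.mean(replicates)

# Five hypothetical replicate 25(OH)D3 measurements (ng/mL) of one serum pool.
replicates = [24.1, 25.6, 23.8, 26.0, 24.5]
print(round(cv_percent(replicates), 1))  # -> 3.9
```

Intra-assay CVs are computed from replicates within one run and inter-assay CVs from the same pool measured across runs, which is why the latter (12.6% here) is typically larger than the former (8.1%).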

  7. Travel reliability inventory for Chicago.

    Science.gov (United States)

    2013-04-01

    The overarching goal of this research project is to enable state DOTs to document and monitor the reliability performance : of their highway networks. To this end, a computer tool, TRIC, was developed to produce travel reliability inventories from : ...

  8. OSS reliability measurement and assessment

    CERN Document Server

    Yamada, Shigeru

    2016-01-01

    This book analyses quantitative open source software (OSS) reliability assessment and its applications, focusing on three major topic areas: the Fundamentals of OSS Quality/Reliability Measurement and Assessment; the Practical Applications of OSS Reliability Modelling; and Recent Developments in OSS Reliability Modelling. Offering an ideal reference guide for graduate students and researchers in reliability for open source software (OSS) and modelling, the book introduces several methods of reliability assessment for OSS including component-oriented reliability analysis based on analytic hierarchy process (AHP), analytic network process (ANP), and non-homogeneous Poisson process (NHPP) models, the stochastic differential equation models and hazard rate models. These measurement and management technologies are essential to producing and maintaining quality/reliable systems using OSS.
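One of the NHPP model families the book covers can be illustrated with the classic Goel-Okumoto software reliability growth model, whose mean value function is m(t) = a(1 - e^(-bt)). A sketch with illustrative parameters (the specific values are assumptions, not from the book):

```python
# Goel-Okumoto NHPP software reliability growth model (a standard
# example of the NHPP model class). Parameters a, b are illustrative.
import math

def goel_okumoto_mean(t, a, b):
    """m(t) = a * (1 - exp(-b t)): expected cumulative faults found by t."""
    return a * (1.0 - math.exp(-b * t))

def goel_okumoto_intensity(t, a, b):
    """Fault-detection rate, the derivative of the mean value function."""
    return a * b * math.exp(-b * t)

a, b = 120.0, 0.05  # total fault content and per-fault detection rate
print(round(goel_okumoto_mean(10, a, b), 2))   # faults expected by t = 10
print(round(goel_okumoto_mean(1e6, a, b), 2))  # saturates at a = 120.0
```

In practice a and b are estimated from fault-detection timestamps (e.g. by maximum likelihood), and the fitted m(t) is used to project remaining faults and release readiness.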

  9. Quality Indicators for Learning Analytics

    Science.gov (United States)

    Scheffel, Maren; Drachsler, Hendrik; Stoyanov, Slavi; Specht, Marcus

    2014-01-01

    This article proposes a framework of quality indicators for learning analytics that aims to standardise the evaluation of learning analytics tools and to provide a means to capture evidence for the impact of learning analytics on educational practices in a standardised manner. The criteria of the framework and its quality indicators are based on…

  10. Analytical procedures. Pt. 1

    International Nuclear Information System (INIS)

    Weber, G.

    1985-01-01

    Analytical (Boolean) procedures reveal a close relationship between the safety assessment and the reliability assessment of technical facilities. The paper gives an overview of the organization of models, fault trees, the probabilistic evaluation of systems, evaluation using minimal cut sets or minimal path sets in the presence of statistically dependent components, and systems liable to suffer different kinds of outages. (orig.) [de]
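
    The minimal-cut-set evaluation mentioned in the abstract can be sketched directly: for independent basic events, the top-event probability follows from inclusion-exclusion over the cut sets. The fault tree and probabilities below are illustrative:

```python
from itertools import combinations

def top_event_prob(cut_sets, p):
    """Top-event probability from minimal cut sets of independent
    basic events, via inclusion-exclusion over the cut sets."""
    def cut_prob(events):
        prob = 1.0
        for e in events:
            prob *= p[e]
        return prob
    total = 0.0
    for r in range(1, len(cut_sets) + 1):
        sign = (-1) ** (r + 1)
        for combo in combinations(cut_sets, r):
            # union of the cut sets in this combination
            total += sign * cut_prob(set().union(*combo))
    return total

# Illustrative fault tree: TOP = (A AND B) OR C
p = {"A": 0.1, "B": 0.2, "C": 0.05}
cuts = [{"A", "B"}, {"C"}]
print(round(top_event_prob(cuts, p), 3))  # → 0.069
```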

  11. Comparison of Analytic and Numerical Models With Commercially Available Simulation Tools for the Prediction of Semiconductor Freeze-Out and Exhaustion

    National Research Council Canada - National Science Library

    Reeves, Derek

    2002-01-01

    .... An efficient and reliable method is needed to accomplish this task. Silvaco International's semiconductor simulation software was used to predict temperature dependent majority carrier concentration for a semiconductor cell...

  12. Energy-economy models and energy efficiency policy evaluation for the household sector. An analysis of modelling tools and analytical approaches

    Energy Technology Data Exchange (ETDEWEB)

    Mundaca, Luis; Neij, Lena

    2009-10-15

    Using the residential sector as a case study, the research presented in this report is separated into five main parts: (1) a review of bottom-up methodologies and corresponding energy-economy models; (2) key drivers of energy demand and end-use coverage; (3) choice determinants for efficient technologies embedded in modelling methodologies; and (4) an analysis of modelling studies that focus on ex-ante energy efficiency policy evaluation. Based on the findings, (5) several research areas to further advance models are identified and discussed. We first identify four types of methodological categories: simulation, optimisation, accounting and hybrid models. A representative sample of these various methodological categories is reviewed. Technology representation is mostly explicit and technologically rich across all the reviewed models. This is a critical requisite for simulating energy efficiency policy instruments or portfolios that aim to induce ample technological change. Regardless of the methodological approach, the explicit and rich technological component allows coverage of numerous energy services. All the reviewed models originate from the OECD region, and more than 60 per cent of the identified applications focus mostly on developed countries. To some extent, this finding correlates with the claims about the need for more policy evaluation efforts to assist energy efficiency policy and other GHG mitigation options for the building sector in developing countries. We find that whereas capital and operating costs are relevant for the (non-)adoption of efficient technologies, they represent only a part of a great variety of determinants that drive consumers' energy-related decisions regarding technology choices. Factors including design, comfort, brand, functionality, reliability, and environmental awareness, among others, are likely to influence the decisions of consumers in reality. Whereas economic variables are used as key determinants for technology choice in energy

  13. Ball Bearing Analysis with the ORBIS Tool

    Science.gov (United States)

    Halpin, Jacob D.

    2016-01-01

    Ball bearing design is critical to the success of aerospace mechanisms. Key bearing performance parameters, such as load capability, stiffness, torque, and life, all depend on accurate determination of the internal load distribution. Hence, a good analytical bearing tool that provides both comprehensive capabilities and reliable results becomes a significant asset to the engineer. This paper introduces the ORBIS bearing tool. A discussion of key modeling assumptions and a technical overview is provided. Numerous validation studies and case studies using the ORBIS tool are presented. All results suggest that the ORBIS code correlates closely with predictions of bearing internal load distributions, stiffness, deflection, and stresses.

  14. Monte Carlo simulation: tool for the calibration in analytical determination of radionuclides; Simulacion Monte Carlo: herramienta para la calibracion en determinaciones analiticas de radionucleidos

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez, Jorge A. Carrazana; Ferrera, Eduardo A. Capote; Gomez, Isis M. Fernandez; Castro, Gloria V. Rodriguez; Ricardo, Niury Martinez, E-mail: cphr@cphr.edu.cu [Centro de Proteccion e Higiene de las Radiaciones (CPHR), La Habana (Cuba)

    2013-07-01

    This work shows how the traceability of analytical determinations is established using this calibration method. It highlights the advantages offered by Monte Carlo simulation for applying corrections for differences in chemical composition, density, and height of the analyzed samples. Likewise, the results obtained by the LVRA in two exercises organized by the International Atomic Energy Agency (IAEA) are presented. In these exercises (an intercomparison and a proficiency test), all reported analytical results were obtained from efficiency calibrations by Monte Carlo simulation using the DETEFF program.
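
    The efficiency calibration described here can be caricatured in a few lines: a Monte Carlo code estimates detection efficiency as the fraction of simulated decays registered by the detector, and a correction for, say, sample density is the ratio of efficiencies simulated in the two geometries. The sketch below replaces the photon-transport physics of a real code such as DETEFF with fixed, invented detection probabilities, so it only illustrates the statistical estimation step:

```python
import random

def mc_peak_efficiency(n_decays, detect_prob, seed=42):
    """Toy Monte Carlo efficiency: the fraction of simulated decays
    that register in the detector. A real code transports each photon
    through the sample and detector geometry; a fixed detection
    probability stands in for that physics here."""
    rng = random.Random(seed)
    hits = sum(rng.random() < detect_prob for _ in range(n_decays))
    return hits / n_decays

# Correction factor for a sample denser than the calibration geometry:
# the ratio of the two simulated efficiencies (probabilities invented).
eff_reference = mc_peak_efficiency(200_000, 0.060)
eff_sample = mc_peak_efficiency(200_000, 0.051, seed=7)
print(round(eff_reference / eff_sample, 2))
```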

  15. Strategy for continuous improvement in IC manufacturability, yield, and reliability

    Science.gov (United States)

    Dreier, Dean J.; Berry, Mark; Schani, Phil; Phillips, Michael; Steinberg, Joe; DePinto, Gary

    1993-01-01

    Continual improvements in yield, reliability, and manufacturability measure a fab and ultimately result in Total Customer Satisfaction. A new organizational and technical methodology for continuous defect reduction has been established as a formal feedback loop, which relies on yield and reliability data, failed bit map analysis, analytical tools, inline monitoring, cross-functional teams, and a defect engineering group. The strategy requires the fastest detection, identification, and implementation of possible corrective actions. Feedback cycle time is minimized at all points to improve yield and reliability and reduce costs, essential for competitiveness in the memory business. The payoff was a 9.4X reduction in defectivity and a 6.2X improvement in the reliability of 256 K fast SRAMs over 20 months.

  16. The Movement Imagery Questionnaire-Revised, Second Edition (MIQ-RS) Is a Reliable and Valid Tool for Evaluating Motor Imagery in Stroke Populations

    Directory of Open Access Journals (Sweden)

    Andrew J. Butler

    2012-01-01

    Mental imagery can improve motor performance in stroke populations when combined with physical therapy. Valid and reliable instruments to evaluate the imagery ability of stroke survivors are needed to maximize the benefits of mental imagery therapy. The purposes of this study were to: examine and compare the test-retest intra-rater reliability of the Movement Imagery Questionnaire-Revised, Second Edition (MIQ-RS) in stroke survivors and able-bodied controls; examine the internal consistency of the visual and kinesthetic items of the MIQ-RS; determine whether the MIQ-RS includes both the visual and kinesthetic dimensions of mental imagery; correlate impairment and motor imagery scores; and investigate the criterion validity of the MIQ-RS in stroke survivors by comparing the results to the KVIQ-10. Test-retest analysis indicated good levels of reliability (ICC range: .83–.99) and internal consistency (Cronbach α: .95–.98) of the visual and kinesthetic subscales in both groups. The two-factor structure of the MIQ-RS was supported by factor analysis, with the visual and kinesthetic components accounting for 88.6% and 83.4% of the total variance in the able-bodied and stroke groups, respectively. The MIQ-RS is a valid and reliable instrument in the stroke population examined and in able-bodied populations, and is therefore useful as an outcome measure for motor imagery ability.
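
    The internal-consistency figures quoted (Cronbach α: .95–.98) come from a standard formula over item variances; a minimal sketch with invented item scores (not MIQ-RS data):

```python
def cronbach_alpha(items):
    """Cronbach's alpha from item-score columns: `items` is a list of
    k lists, each holding one item's scores for the same n respondents."""
    k, n = len(items), len(items[0])
    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    sum_item_vars = sum(var(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return (k / (k - 1)) * (1 - sum_item_vars / var(totals))

# Illustrative 7-point imagery ratings: 3 items x 4 respondents
scores = [[6, 5, 7, 4], [6, 4, 7, 5], [5, 5, 6, 4]]
print(round(cronbach_alpha(scores), 2))  # → 0.9
```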

  17. The 2010 American College of Rheumatology Fibromyalgia Survey Diagnostic Criteria and Symptom Severity Scale Is a Valid and Reliable Tool in a French-Speaking Fibromyalgia Cohort

    Directory of Open Access Journals (Sweden)

    Fitzcharles Mary-Ann

    2012-09-01

    Background: Fibromyalgia (FM) is a pain condition with associated symptoms contributing to distress. The Fibromyalgia Survey Diagnostic Criteria and Severity Scale (FSDC) is a patient-administered questionnaire assessing diagnosis and symptom severity. Locations of body pain measured by the Widespread Pain Index (WPI) and the Symptom Severity scale (SS), measuring fatigue, unrefreshing sleep, and cognitive and somatic complaints, provide a score (0–31) measuring a composite of polysymptomatic distress. The reliability and validity of the translated French version of the FSDC were evaluated. Methods: The French FSDC was administered twice to 73 FM patients and was correlated with measures of symptom status including the Fibromyalgia Impact Questionnaire (FIQ), Health Assessment Questionnaire (HAQ), McGill Pain Questionnaire (MPQ), and a visual analogue scale (VAS) for global severity and pain. Test-retest reliability, internal consistency, and construct validity were evaluated. Results: Test-retest reliability was between .600 and .888 for the 25 single items of the FSDC and .912 for the total FSDC, with all correlations significant. Conclusions: The French FSDC is a valid instrument in French FM patients with reliability and construct validity. It is easily completed, simple to score, and has the potential to become the standard for measurement of polysymptomatic distress in FM.

  18. Google analytics integrations

    CERN Document Server

    Waisberg, Daniel

    2015-01-01

    A roadmap for turning Google Analytics into a centralized marketing analysis platform With Google Analytics Integrations, expert author Daniel Waisberg shows you how to gain a more meaningful, complete view of customers that can drive growth opportunities. This in-depth guide shows not only how to use Google Analytics, but also how to turn this powerful data collection and analysis tool into a central marketing analysis platform for your company. Taking a hands-on approach, this resource explores the integration and analysis of a host of common data sources, including Google AdWords, AdSens

  19. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo

  20. The reliability and validity of the informant AD8 by comparison with a series of cognitive assessment tools in primary healthcare.

    Science.gov (United States)

    Shaik, Muhammad Amin; Xu, Xin; Chan, Qun Lin; Hui, Richard Jor Yeong; Chong, Steven Shih Tsze; Chen, Christopher Li-Hsian; Dong, YanHong

    2016-03-01

    The validity and reliability of the informant AD8 in primary healthcare have not been established. Therefore, the present study examined the validity and reliability of the informant AD8 in government-subsidized primary healthcare centers in Singapore. Eligible patients (≥60 years old) were recruited from primary healthcare centers and their informants received the AD8. Patient-informant dyads who agreed to further cognitive assessments received the Mini-Mental State Examination (MMSE), Montreal Cognitive Assessment (MoCA), Clinical Dementia Rating (CDR), and a locally validated formal neuropsychological battery at a research center in a tertiary hospital. 1,082 informants completed the AD8 assessment at two primary healthcare centers. Of these, 309 patient-informant dyads were further assessed, of whom 243 (78.6%) were CDR = 0; 22 (7.1%) were CDR = 0.5; and 44 (14.2%) were CDR ≥ 1. The mean administration time of the informant AD8 was 2.3 ± 1.0 minutes. The informant AD8 demonstrated good internal consistency (Cronbach's α = 0.85), inter-rater reliability (intraclass correlation coefficient (ICC) = 0.85), and test-retest reliability (weighted κ = 0.80). Concurrent validity, as measured by the correlation between total AD8 scores and the CDR global score (R = 0.65), was good, and construct validity, as measured by convergent validity (R ≥ 0.4) between individual items of the AD8 and the CDR and neuropsychological domains, was acceptable. The informant AD8 demonstrated good concurrent and construct validity and is a reliable measure to detect cognitive dysfunction in primary healthcare.
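
    The test-retest figure quoted (weighted κ = 0.80) is Cohen's kappa with distance-based disagreement weights, which credits near-misses on an ordinal scale. A minimal linear-weights sketch with invented ratings (not AD8 study data):

```python
from collections import Counter

def weighted_kappa(rater1, rater2, categories):
    """Cohen's linearly weighted kappa for two ratings per subject
    on an ordinal scale given by `categories` (in order)."""
    n, k = len(rater1), len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    w = lambda a, b: abs(idx[a] - idx[b]) / (k - 1)  # disagreement weight
    obs = Counter(zip(rater1, rater2))
    p1, p2 = Counter(rater1), Counter(rater2)
    observed = sum(w(a, b) * c for (a, b), c in obs.items()) / n
    expected = sum(w(a, b) * p1[a] * p2[b]
                   for a in categories for b in categories) / n ** 2
    return 1 - observed / expected

# Illustrative test-retest ratings on a 3-level scale
t1 = [0, 0, 1, 2, 2, 1, 0, 2]
t2 = [0, 1, 1, 2, 2, 1, 0, 2]
print(round(weighted_kappa(t1, t2, categories=[0, 1, 2]), 2))  # → 0.86
```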

  1. Reliable planning and monitoring tools for dismantling using high-resolution 3D photographic imaging and document management systems. Application of the MEDS system

    International Nuclear Information System (INIS)

    Vela Morales, F.

    2010-01-01

    The MEDS system (Metric Environment Documentation System) is a method developed by CT3 for generating metric engineering documentation of a physical environment using state-of-the-art, high-precision measurement tools, such as the laser scanner. With this equipment it is possible to obtain three-dimensional information about a physical environment through the 3D coordinates of millions of points. This information is processed with software and is a very useful tool for 3D modeling and simulation operations.

  2. Analytic webs support the synthesis of ecological data sets.

    Science.gov (United States)

    Ellison, Aaron M; Osterweil, Leon J; Clarke, Lori; Hadley, Julian L; Wise, Alexander; Boose, Emery; Foster, David R; Hanson, Allen; Jensen, David; Kuzeja, Paul; Riseman, Edward; Schultz, Howard

    2006-06-01

    A wide variety of data sets produced by individual investigators are now synthesized to address ecological questions that span a range of spatial and temporal scales. It is important to facilitate such syntheses so that "consumers" of data sets can be confident that both input data sets and synthetic products are reliable. Necessary documentation to ensure the reliability and validation of data sets includes both familiar descriptive metadata and formal documentation of the scientific processes used (i.e., process metadata) to produce usable data sets from collections of raw data. Such documentation is complex and difficult to construct, so it is important to help "producers" create reliable data sets and to facilitate their creation of required metadata. We describe a formal representation, an "analytic web," that aids both producers and consumers of data sets by providing complete and precise definitions of scientific processes used to process raw and derived data sets. The formalisms used to define analytic webs are adaptations of those used in software engineering, and they provide a novel and effective support system for both the synthesis and the validation of ecological data sets. We illustrate the utility of an analytic web as an aid to producing synthetic data sets through a worked example: the synthesis of long-term measurements of whole-ecosystem carbon exchange. Analytic webs are also useful validation aids for consumers because they support the concurrent construction of a complete, Internet-accessible audit trail of the analytic processes used in the synthesis of the data sets. Finally we describe our early efforts to evaluate these ideas through the use of a prototype software tool, SciWalker. We indicate how this tool has been used to create analytic webs tailored to specific data-set synthesis and validation activities, and suggest extensions to it that will support additional forms of validation. The process metadata created by SciWalker is

  3. Reliability Centered Maintenance - Methodologies

    Science.gov (United States)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  4. Reliability and maintainability

    International Nuclear Information System (INIS)

    1994-01-01

    Several communications in this conference are concerned with nuclear plant reliability and maintainability; their titles are: maintenance optimization of stand-by Diesels of 900 MW nuclear power plants; CLAIRE: an event-based simulation tool for software testing; reliability as one important issue within the periodic safety review of nuclear power plants; design of nuclear building ventilation by means of functional analysis; operation characteristic analysis for a power industry plant park as a function of influence parameters

  5. Chloride present in biological samples as a tool for enhancement of sensitivity in capillary zone electrophoretic analysis of anionic trace analytes

    Czech Academy of Sciences Publication Activity Database

    Křivánková, Ludmila; Pantůčková, Pavla; Gebauer, Petr; Boček, Petr; Caslavska, J.; Thormann, W.

    2003-01-01

    Roč. 24, č. 3 (2003), s. 505-517 ISSN 0173-0835 R&D Projects: GA ČR GA203/02/0023; GA ČR GA203/01/0401; GA AV ČR IAA4031103 Institutional research plan: CEZ:AV0Z4031919 Keywords : acetoacetate * capillary zone electrophoresis * chloride stacking effects Subject RIV: CB - Analytical Chemistry, Separation Impact factor: 4.040, year: 2003

  6. Analytic trigonometry

    CERN Document Server

    Bruce, William J; Maxwell, E A; Sneddon, I N

    1963-01-01

    Analytic Trigonometry details the fundamental concepts and underlying principle of analytic geometry. The title aims to address the shortcomings in the instruction of trigonometry by considering basic theories of learning and pedagogy. The text first covers the essential elements from elementary algebra, plane geometry, and analytic geometry. Next, the selection tackles the trigonometric functions of angles in general, basic identities, and solutions of equations. The text also deals with the trigonometric functions of real numbers. The fifth chapter details the inverse trigonometric functions

  7. How to Conduct Multimethod Field Studies in the Operating Room: The iPad Combined With a Survey App as a Valid and Reliable Data Collection Tool.

    Science.gov (United States)

    Tscholl, David W; Weiss, Mona; Spahn, Donat R; Noethiger, Christoph B

    2016-01-05

    Tablet computers such as the Apple iPad are progressively replacing traditional paper-and-pencil-based data collection. We combined the iPad with the ready-to-use survey software, iSurvey (from Harvestyourdata), to create a straightforward tool for data collection during the Anesthesia Pre-Induction Checklist (APIC) study, a hospital-wide multimethod intervention study involving observation of team performance and team member surveys in the operating room (OR). We aimed to provide an analysis of the factors that led to the use of the iPad- and iSurvey-based tool for data collection, illustrate our experiences with the use of this data collection tool, and report the results of an expert survey about user experience with this tool. We used an iPad- and iSurvey-based tool to observe anesthesia inductions conducted by 205 teams (N=557 team members) in the OR. In Phase 1, expert raters used the iPad- and iSurvey-based tool to rate team performance during anesthesia inductions, and anesthesia team members were asked to indicate their perceptions after the inductions. In Phase 2, we surveyed the expert raters about their perceptions regarding the use of the iPad- and iSurvey-based tool to observe, rate, and survey teams in the ORs. The results of Phase 1 showed that training data collectors on the iPad- and iSurvey-based data collection tool was effortless and there were no serious problems during data collection, upload, download, and export. Interrater agreement of the combined data collection tool was found to be very high for the team observations (median Fleiss' kappa=0.88, 95% CI 0.78-1.00). The results of the follow-up expert rater survey (Phase 2) showed that the raters did not prefer a paper-and-pencil-based data collection method they had used during other earlier studies over the iPad- and iSurvey-based tool (median response 1, IQR 1-1; 1=do not agree, 2=somewhat disagree, 3=neutral, 4=somewhat agree, 5=fully agree). They found the iPad (median 5, IQR 4
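
    The interrater agreement quoted (median Fleiss' kappa = 0.88) generalizes Cohen's kappa to more than two raters. A minimal sketch from a subject-by-category count matrix (the counts are invented, not APIC study data):

```python
def fleiss_kappa(counts):
    """Fleiss' kappa from an n_subjects x n_categories matrix where
    counts[i][j] is the number of raters assigning subject i to
    category j; every row must sum to the same number of raters."""
    n_sub, n_rat = len(counts), sum(counts[0])
    # mean observed pairwise agreement across subjects
    p_bar = sum((sum(c * c for c in row) - n_rat) / (n_rat * (n_rat - 1))
                for row in counts) / n_sub
    # chance agreement from overall category proportions
    col_tot = [sum(row[j] for row in counts) for j in range(len(counts[0]))]
    p_e = sum((t / (n_sub * n_rat)) ** 2 for t in col_tot)
    return (p_bar - p_e) / (1 - p_e)

# Illustrative: 4 checklist items rated by 3 observers (done / not done)
counts = [[3, 0], [3, 0], [2, 1], [0, 3]]
print(round(fleiss_kappa(counts), 3))  # → 0.625
```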

  8. Analytical Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The Analytical Lab specializes in Oil and Hydraulic Fluid Analysis, Identification of Unknown Materials, Engineering Investigations, Qualification Testing (to support...

  9. The Reliability and Validity of a Progress-Monitoring Tool: A Psychometric Examination of the Phonological Awareness Skills of Preschoolers with ASD

    Science.gov (United States)

    Martini, Jay R.

    2017-01-01

    The purpose of this study was to conduct a psychometric evaluation of the "Sound Beginning" phonological awareness progress-monitoring tool. This assessment was used to track the emergent literacy skills of preschoolers with autism spectrum disorder who were participating in a randomized trial studying early literacy interventions. Research…

  10. Prospective, Head-to-Head Study of Three Computerized Neurocognitive Assessment Tools (CNTs): Reliability and Validity for the Assessment of Sport-Related Concussion.

    Science.gov (United States)

    Nelson, Lindsay D; LaRoche, Ashley A; Pfaller, Adam Y; Lerner, E Brooke; Hammeke, Thomas A; Randolph, Christopher; Barr, William B; Guskiewicz, Kevin; McCrea, Michael A

    2016-01-01

    Limited data exist comparing the performance of computerized neurocognitive tests (CNTs) for assessing sport-related concussion. We evaluated the reliability and validity of three CNTs (ANAM, Axon Sports/Cogstate Sport, and ImPACT) in a common sample. High school and collegiate athletes completed two CNTs each at baseline. Concussed (n=165) and matched non-injured control (n=166) subjects repeated testing within 24 hr and at 8, 15, and 45 days