WorldWideScience

Sample records for streamlined review methods

  1. Streamlining the license renewal review process

    International Nuclear Information System (INIS)

    Dozier, J.; Lee, S.; Kuo, P.T.

    2001-01-01

    The staff of the NRC has been developing three regulatory guidance documents for license renewal: the Generic Aging Lessons Learned (GALL) report, Standard Review Plan for License Renewal (SRP-LR), and Regulatory Guide (RG) for Standard Format and Content for Applications to Renew Nuclear Power Plant Operating Licenses. These documents are designed to streamline the license renewal review process by providing clear guidance for license renewal applicants and the NRC staff in preparing and reviewing license renewal applications. The GALL report systematically catalogs aging effects on structures and components; identifies the relevant existing plant programs; and evaluates the existing programs against the attributes considered necessary for an aging management program to be acceptable for license renewal. The GALL report also provides guidance for the augmentation of existing plant programs for license renewal. The revised SRP-LR allows an applicant to reference the GALL report to preclude further NRC staff evaluation if the plant's existing programs meet the criteria described in the GALL report. During the review process, the NRC staff will focus primarily on existing programs that should be augmented or new programs developed specifically for license renewal. The Regulatory Guide is expected to endorse the Nuclear Energy Institute (NEI) guideline, NEI 95-10, Revision 2, entitled 'Industry Guideline for Implementing the Requirements of 10 CFR Part 54 - The License Renewal Rule', which provides guidance for preparing a license renewal application. This paper will provide an introduction to the GALL report, SRP-LR, Regulatory Guide, and NEI 95-10 to show how these documents are interrelated and how they will be used to streamline the license renewal review process. This topic will be of interest to domestic power utilities considering license renewal and international ICONE participants seeking state-of-the-art information about license renewal in the United States

  2. A Streamlined Artificial Variable Free Version of Simplex Method

    OpenAIRE

    Inayatullah, Syed; Touheed, Nasir; Imtiaz, Muhammad

    2015-01-01

    This paper proposes a streamlined form of the simplex method which provides some great benefits over the traditional simplex method. For instance, it does not need any kind of artificial variables or artificial constraints; it can start with any feasible or infeasible basis of an LP. This method follows the same pivoting sequence as simplex phase 1 without any explicit description of artificial variables, which also makes it space efficient. Later in this paper, a dual version of the new ...

  3. Evaluation of two streamlined life cycle assessment methods

    International Nuclear Information System (INIS)

    Hochschomer, Elisabeth; Finnveden, Goeran; Johansson, Jessica

    2002-02-01

    Two different methods for streamlined life cycle assessment (LCA) are described: the MECO-method and SLCA. Both methods are tested on a previously completed case study on cars fuelled with petrol or ethanol, and on electric cars with electricity produced from hydro power or coal. The report also contains some background information on LCA and streamlined LCA, and a description of the case study used. The evaluation of the MECO- and SLCA-methods is based on a comparison of the results from the case study as well as practical aspects. One conclusion is that the SLCA-method has some limitations. Among the limitations are that the whole life cycle is not covered, it requires quite a lot of information, and there is room for arbitrariness. It is not very flexible, which makes it difficult to develop further. We are therefore not recommending the SLCA-method. The MECO-method, in comparison, shows several attractive features. It is also interesting to note that the MECO-method produces information that is complementary to that of a more traditional quantitative LCA. We suggest that the MECO-method needs some further development and adjustment to Swedish conditions.

  4. An Evaluation of the Acquisition Streamlining Methods at the Fleet and Industrial Supply Center Pearl Harbor Hawaii

    National Research Council Canada - National Science Library

    Henry, Mark

    1999-01-01

    ...) Pearl Harbor's implementation of acquisition streamlining initiatives and recommends viable methods of streamlining the acquisition process at FISC Pearl Harbor and other Naval Supply Systems Command...

  5. A streamlined artificial variable free version of simplex method.

    Directory of Open Access Journals (Sweden)

    Syed Inayatullah

    Full Text Available This paper proposes a streamlined form of the simplex method which provides some great benefits over the traditional simplex method. For instance, it does not need any kind of artificial variables or artificial constraints; it can start with any feasible or infeasible basis of an LP. This method follows the same pivoting sequence as simplex phase 1 without any explicit description of artificial variables, which also makes it space efficient. Later in this paper, a dual version of the new method is also presented which provides a way to easily implement phase 1 of the traditional dual simplex method. For a problem whose initial basis is both primal and dual infeasible, our methods give the user full freedom to choose whether to start with the primal artificial-free version or the dual artificial-free version, without any reformulation of the LP structure. Last but not least, it provides a teaching aid for teachers who want to teach feasibility achievement as a separate topic before teaching optimality achievement.

  6. A streamlined artificial variable free version of simplex method.

    Science.gov (United States)

    Inayatullah, Syed; Touheed, Nasir; Imtiaz, Muhammad

    2015-01-01

    This paper proposes a streamlined form of the simplex method which provides some great benefits over the traditional simplex method. For instance, it does not need any kind of artificial variables or artificial constraints; it can start with any feasible or infeasible basis of an LP. This method follows the same pivoting sequence as simplex phase 1 without any explicit description of artificial variables, which also makes it space efficient. Later in this paper, a dual version of the new method is also presented which provides a way to easily implement phase 1 of the traditional dual simplex method. For a problem whose initial basis is both primal and dual infeasible, our methods give the user full freedom to choose whether to start with the primal artificial-free version or the dual artificial-free version, without any reformulation of the LP structure. Last but not least, it provides a teaching aid for teachers who want to teach feasibility achievement as a separate topic before teaching optimality achievement.
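
    The abstracts above describe the artificial-variable-free variant only at a high level; the full pivoting rules are in the paper itself. For orientation, the following is a minimal, hypothetical Python sketch of a plain textbook tableau pivot, the elementary operation that both the traditional simplex method and the streamlined variant repeat. The example tableau and the chosen pivot are illustrative assumptions, not taken from the paper.

      import numpy as np

      def pivot(T, row, col):
          """One tableau pivot: scale the pivot row so T[row, col] becomes 1,
          then eliminate the pivot column from every other row."""
          T = T.astype(float).copy()
          T[row] /= T[row, col]
          for r in range(T.shape[0]):
              if r != row:
                  T[r] -= T[r, col] * T[row]
          return T

      # Hypothetical tableau (last column = right-hand side, last row = objective):
      # maximize 3x + 2y subject to x + y <= 4, x + 3y <= 6, x, y >= 0
      T = np.array([[ 1.0,  1.0, 1.0, 0.0,  4.0],
                    [ 1.0,  3.0, 0.0, 1.0,  6.0],
                    [-3.0, -2.0, 0.0, 0.0,  0.0]])
      T = pivot(T, row=0, col=0)   # bring x into the basis on the first constraint
      print(T[-1, -1])             # objective value after the pivot (12.0 here)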

  7. A new method for calculating volumetric sweep efficiency using streamline simulation concepts

    International Nuclear Information System (INIS)

    Hidrobo, E A

    2000-01-01

    One of the purposes of reservoir engineering is to quantify the volumetric sweep efficiency for optimizing reservoir management decisions. The estimation of this parameter has always been a difficult task. Until now, sweep efficiency correlations and calculations have been limited mostly to homogeneous 2-D cases. Calculating volumetric sweep efficiency in a 3-D heterogeneous reservoir is difficult because of the inherent complexity of multiple layers and arbitrary well configurations. In this paper, a new method for computing volumetric sweep efficiency for any arbitrary heterogeneity and well configuration is presented. The proposed method is based on Datta-Gupta and King's formulation of streamline time-of-flight (1995). Because the time-of-flight reflects the fluid front propagation at various times, the connectivity in the time-of-flight represents a direct measure of the volumetric sweep efficiency. The proposed approach has been applied to synthetic as well as field examples. Synthetic examples are used to validate the volumetric sweep efficiency calculations using the streamline time-of-flight connectivity criterion by comparison with analytic solutions and published correlations. The field example, which illustrates the feasibility of the approach for large-scale field applications, is from the North Robertson unit, a low-permeability carbonate reservoir in West Texas.
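
    Read in the simplest way, a swept-volume estimate follows directly from the time-of-flight field: at a given time, the swept fraction is the pore volume whose time-of-flight does not exceed that time, divided by the total pore volume. The short Python sketch below illustrates that reading; it is a simplified stand-in rather than the paper's connectivity criterion, and the cell values are invented.

      import numpy as np

      def sweep_efficiency(tof, pore_volume, t):
          """Volumetric sweep efficiency at time t: pore volume whose streamline
          time-of-flight is <= t, divided by the total pore volume."""
          tof = np.asarray(tof, dtype=float)
          pv = np.asarray(pore_volume, dtype=float)
          return pv[tof <= t].sum() / pv.sum()

      # Hypothetical example: five cells with equal pore volume
      tof = [0.5, 1.0, 2.0, 4.0, 8.0]        # time-of-flight per cell (years)
      pv = [1.0, 1.0, 1.0, 1.0, 1.0]
      print(sweep_efficiency(tof, pv, t=2.0))  # -> 0.6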

  8. The streamline upwind Petrov-Galerkin stabilising method for the numerical solution of highly advective problems

    Directory of Open Access Journals (Sweden)

    Carlos Humberto Galeano Urueña

    2009-05-01

    Full Text Available This article describes the streamline upwind Petrov-Galerkin (SUPG) method as a stabilisation technique for solving the diffusion-advection-reaction equation by finite elements. The first part of this article gives a short analysis of the importance of this type of differential equation in modelling physical phenomena in multiple fields. A one-dimensional description of the SUPG method is then given, and this basis is extended to two and three dimensions. The outcome of a strongly advective experiment of high numerical complexity is presented. The results show how the implemented version of the SUPG technique allowed stabilised approximations in space, even for high Peclet numbers. Additional graphs of the numerical experiments presented here can be downloaded from www.gnum.unal.edu.co.
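
    For reference, a standard statement of the SUPG-stabilised weak form of the steady diffusion-advection-reaction equation is given below; the notation and the exact choice of stabilisation parameter in the article may differ, so this is a generic textbook form rather than the authors' formulation. Find u_h such that, for all test functions w_h,

      \int_\Omega \big( w_h\, \mathbf{a}\cdot\nabla u_h + \kappa\, \nabla w_h\cdot\nabla u_h + s\, w_h u_h - w_h f \big)\, d\Omega + \sum_e \int_{\Omega_e} \tau\, (\mathbf{a}\cdot\nabla w_h)\, R(u_h)\, d\Omega = 0,

    where R(u_h) = \mathbf{a}\cdot\nabla u_h - \nabla\cdot(\kappa\nabla u_h) + s\,u_h - f is the element residual, and a common one-dimensional choice of the stabilisation parameter is \tau = \frac{h}{2\lvert\mathbf{a}\rvert}\left(\coth \mathrm{Pe} - \frac{1}{\mathrm{Pe}}\right) with element Peclet number \mathrm{Pe} = \frac{\lvert\mathbf{a}\rvert h}{2\kappa}.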

  9. Inter-laboratory validation of an inexpensive streamlined method to measure inorganic arsenic in rice grain.

    Science.gov (United States)

    Chaney, Rufus L; Green, Carrie E; Lehotay, Steven J

    2018-05-04

    With the establishment by CODEX of a 200 ng/g limit of inorganic arsenic (iAs) in polished rice grain, more analyses of iAs will be necessary to ensure compliance in regulatory and trade applications, to assess quality control in commercial rice production, and to conduct research involving iAs in rice crops. Although analytical methods using high-performance liquid chromatography-inductively coupled plasma-mass spectrometry (HPLC-ICP-MS) have been demonstrated for full speciation of As, this expensive and time-consuming approach is excessive when regulations are based only on iAs. We report a streamlined sample preparation and analysis of iAs in powdered rice based on heated extraction with 0.28 M HNO3 followed by hydride generation (HG) under control of acidity and other simple conditions. Analysis of iAs is then conducted using flow-injection HG and inexpensive ICP-atomic emission spectroscopy (AES) or other detection means. A key innovation compared with previous methods was to increase the acidity of the reagent solution with 4 M HCl (prior to reduction of As5+ to As3+), which minimized interferences from dimethylarsinic acid. An inter-laboratory method validation was conducted among 12 laboratories worldwide in the analysis of six shared blind duplicates and a NIST Standard Reference Material involving different types of rice and iAs levels. Also, four laboratories used the standard HPLC-ICP-MS method to analyze the samples. The results between the methods were not significantly different, and the Horwitz ratio averaged 0.52 for the new method, which meets official method validation criteria. Thus, the simpler, more versatile, and less expensive method may be used by laboratories for several purposes to accurately determine iAs in rice grain. Graphical abstract: Comparison of iAs results from the new and FDA methods.
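
    The Horwitz ratio quoted above is a standard measure of inter-laboratory reproducibility; in its usual form (given here for reference, since the abstract does not spell it out) it compares the observed reproducibility relative standard deviation with the value predicted by the Horwitz equation,

      \mathrm{HorRat} = \frac{\mathrm{RSD}_{R,\,\mathrm{observed}}}{\mathrm{PRSD}_R}, \qquad \mathrm{PRSD}_R(\%) = 2\, C^{-0.1505},

    where C is the analyte concentration expressed as a dimensionless mass fraction. Values of roughly 2 or below are generally taken as acceptable, so the reported average of 0.52 indicates good inter-laboratory agreement.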

  10. A streamlined risk screening method for managing reutilization of abandoned factories in Taiwan

    Directory of Open Access Journals (Sweden)

    I-Chun Chen

    2017-05-01

    Full Text Available An integrated management strategy that considers the competing relationships between land values and associated risks in the process of land-use conversion is needed to assess and manage the reutilization of brownfields. However, the often large number of individual brownfields makes it difficult to conduct a complete risk assessment for all sites, and a streamlined risk screening method would facilitate prioritization of the redevelopment of those factories. This methodology takes into account the spatial heterogeneity of contaminated lands and produces risk mapping that compiles complex risk-related information. Using abandoned factories in Taiwan as a case study, the method considers 40 points (50% accumulated probability) as the threshold of acceptable risk. Emergency risk should be over 90% of accumulated probability. For the sustainability of brownfield reutilization in Taiwan, this research uses a risk matrix to identify low, middle, and high risk for brownfield reutilization. It can indicate zones with a high risk level or low economic incentive as areas of concern for future decision making. In Taiwan, high-risk sites with high incentive account for only 21.3% of the sites. In contrast, the sites with the lowest incentive and low risk account for 57.6% of the sites. To avoid failure in the brownfield market, three strategies are suggested: (1) flexible land management with urban planning is a feasible option for protecting the receptor's health; (2) the government could provide the tool or brownfield funds to reduce the uncertainty of investment risk; and (3) risk monitoring and management can reduce the possible pitfalls associated with brownfield reutilization.
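
    The risk matrix described above pairs a screened risk level with an economic-incentive level for each site. As a purely illustrative Python sketch (the cutoff values and field names are assumptions, not the paper's scoring scheme), the classification step can be expressed as:

      def classify_site(risk_points, incentive_points, risk_cutoff=40, incentive_cutoff=40):
          """Place a brownfield site in a simple risk/incentive matrix.
          The 40-point risk cutoff mirrors the 50% accumulated-probability threshold
          of acceptable risk in the abstract; the incentive cutoff is made up."""
          risk = "high" if risk_points > risk_cutoff else "low"
          incentive = "high" if incentive_points > incentive_cutoff else "low"
          return f"{risk} risk / {incentive} incentive"

      print(classify_site(55, 70))   # -> high risk / high incentive
      print(classify_site(20, 15))   # -> low risk / low incentive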

  11. Scientific Evaluation and Review of Claims in Health Care (SEaRCH): A Streamlined, Systematic, Phased Approach for Determining “What Works” in Healthcare

    Science.gov (United States)

    Crawford, Cindy; Hilton, Lara; Elfenbaum, Pamela

    2017-01-01

    Abstract Background: Answering the question of “what works” in healthcare can be complex and requires the careful design and sequential application of systematic methodologies. Over the last decade, the Samueli Institute has, along with multiple partners, developed a streamlined, systematic, phased approach to this process called the Scientific Evaluation and Review of Claims in Health Care (SEaRCH™). The SEaRCH process provides an approach for rigorously, efficiently, and transparently making evidence-based decisions about healthcare claims in research and practice with minimal bias. Methods: SEaRCH uses three methods combined in a coordinated fashion to help determine what works in healthcare. The first, the Claims Assessment Profile (CAP), seeks to clarify the healthcare claim and question, and its ability to be evaluated in the context of its delivery. The second method, the Rapid Evidence Assessment of the Literature (REAL©), is a streamlined, systematic review process conducted to determine the quantity, quality, and strength of evidence and risk/benefit for the treatment. The third method involves the structured use of expert panels (EPs). There are several types of EPs, depending on the purpose and need. Together, these three methods—CAP, REAL, and EP—can be integrated into a strategic approach to help answer the question “what works in healthcare?” and what it means in a comprehensive way. Discussion: SEaRCH is a systematic, rigorous approach for evaluating healthcare claims of therapies, practices, programs, or products in an efficient and stepwise fashion. It provides an iterative, protocol-driven process that is customized to the intervention, consumer, and context. Multiple communities, including those involved in health service and policy, can benefit from this organized framework, assuring that evidence-based principles determine which healthcare practices with the greatest promise are used for improving the public's health and

  12. Scientific Evaluation and Review of Claims in Health Care (SEaRCH): A Streamlined, Systematic, Phased Approach for Determining "What Works" in Healthcare.

    Science.gov (United States)

    Jonas, Wayne B; Crawford, Cindy; Hilton, Lara; Elfenbaum, Pamela

    2017-01-01

    Answering the question of "what works" in healthcare can be complex and requires the careful design and sequential application of systematic methodologies. Over the last decade, the Samueli Institute has, along with multiple partners, developed a streamlined, systematic, phased approach to this process called the Scientific Evaluation and Review of Claims in Health Care (SEaRCH™). The SEaRCH process provides an approach for rigorously, efficiently, and transparently making evidence-based decisions about healthcare claims in research and practice with minimal bias. SEaRCH uses three methods combined in a coordinated fashion to help determine what works in healthcare. The first, the Claims Assessment Profile (CAP), seeks to clarify the healthcare claim and question, and its ability to be evaluated in the context of its delivery. The second method, the Rapid Evidence Assessment of the Literature (REAL©), is a streamlined, systematic review process conducted to determine the quantity, quality, and strength of evidence and risk/benefit for the treatment. The third method involves the structured use of expert panels (EPs). There are several types of EPs, depending on the purpose and need. Together, these three methods (CAP, REAL, and EP) can be integrated into a strategic approach to help answer the question "what works in healthcare?" and what it means in a comprehensive way. SEaRCH is a systematic, rigorous approach for evaluating healthcare claims of therapies, practices, programs, or products in an efficient and stepwise fashion. It provides an iterative, protocol-driven process that is customized to the intervention, consumer, and context. Multiple communities, including those involved in health service and policy, can benefit from this organized framework, assuring that evidence-based principles determine which healthcare practices with the greatest promise are used for improving the public's health and wellness.

  13. Report: Follow-Up Report: EPA Proposes to Streamline the Review, Management and Disposal of Hazardous Waste Pharmaceuticals

    Science.gov (United States)

    Report #15-P-0260, August 19, 2015. EPA states that it intends to issue a proposed rule, Management Standards for Hazardous Waste, which will attempt to streamline the approach to managing and disposing of hazardous and nonhazardous pharmaceutical waste.

  14. Critique of one-stop siting in Washington: streamlining review without compromising effectiveness

    International Nuclear Information System (INIS)

    Granger, J.A.; Wise, K.R.

    1980-01-01

    The state of Washington adopted a one-stop power plant siting law in 1970 so that the regulatory elements could be coordinated into a single siting decision. Efficiency improves as duplications and inconsistencies disappear, but increasing lead times and higher costs persist in the state. An analysis of the legislation examines why certain statutory and regulatory provisions allow this to happen, pointing particularly at the review and approval process. Appropriate reforms include adequate funds and staff for the permit agency, early identification of issues, prehearing conferences, and explicit guidelines and standards. 114 references and footnotes

  15. A scoping review of rapid review methods.

    Science.gov (United States)

    Tricco, Andrea C; Antony, Jesmin; Zarin, Wasifa; Strifler, Lisa; Ghassemi, Marco; Ivory, John; Perrier, Laure; Hutton, Brian; Moher, David; Straus, Sharon E

    2015-09-16

    Rapid reviews are a form of knowledge synthesis in which components of the systematic review process are simplified or omitted to produce information in a timely manner. Although numerous centers are conducting rapid reviews internationally, few studies have examined the methodological characteristics of rapid reviews. We aimed to examine articles, books, and reports that evaluated, compared, used or described rapid reviews or methods through a scoping review. MEDLINE, EMBASE, the Cochrane Library, internet websites of rapid review producers, and reference lists were searched to identify articles for inclusion. Two reviewers independently screened literature search results and abstracted data from included studies. Descriptive analysis was conducted. We included 100 articles plus one companion report that were published between 1997 and 2013. The studies were categorized as 84 application papers, seven development papers, six impact papers, and four comparison papers (one was included in two categories). The rapid reviews were conducted over periods of 1 to 12 months, predominantly in Europe (58 %) and North America (20 %). The included studies failed to report 6 % to 73 % of the specific systematic review steps examined. Fifty unique rapid review methods were identified; 16 methods occurred more than once. Streamlined methods that were used in the 82 rapid reviews included limiting the literature search to published literature (24 %) or one database (2 %), limiting inclusion criteria by date (68 %) or language (49 %), having one person screen and another verify or screen excluded studies (6 %), having one person abstract data and another verify (23 %), not conducting risk of bias/quality appraisal (7 %) or having only one reviewer conduct the quality appraisal (7 %), and presenting results as a narrative summary (78 %). Four case studies were identified that compared the results of rapid reviews to systematic reviews. Three studies found that the conclusions between

  16. Streamline integration as a method for two-dimensional elliptic grid generation

    Energy Technology Data Exchange (ETDEWEB)

    Wiesenberger, M., E-mail: Matthias.Wiesenberger@uibk.ac.at [Institute for Ion Physics and Applied Physics, Universität Innsbruck, A-6020 Innsbruck (Austria); Held, M. [Institute for Ion Physics and Applied Physics, Universität Innsbruck, A-6020 Innsbruck (Austria); Einkemmer, L. [Numerical Analysis group, Universität Innsbruck, A-6020 Innsbruck (Austria)

    2017-07-01

    We propose a new numerical algorithm to construct a structured numerical elliptic grid of a doubly connected domain. Our method is applicable to domains with boundaries defined by two contour lines of a two-dimensional function. Furthermore, we can adapt any analytically given boundary aligned structured grid, which specifically includes polar and Cartesian grids. The resulting coordinate lines are orthogonal to the boundary. Grid points as well as the elements of the Jacobian matrix can be computed efficiently and up to machine precision. In the simplest case we construct conformal grids, yet with the help of weight functions and monitor metrics we can control the distribution of cells across the domain. Our algorithm is parallelizable and easy to implement with elementary numerical methods. We assess the quality of grids by considering both the distribution of cell sizes and the accuracy of the solution to elliptic problems. Among the tested grids these key properties are best fulfilled by the grid constructed with the monitor metric approach. Highlights: • Construct structured, elliptic numerical grids with elementary numerical methods. • Align coordinate lines with or make them orthogonal to the domain boundary. • Compute grid points and metric elements up to machine precision. • Control cell distribution by adaption functions or monitor metrics.
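
    To make the idea of streamline integration concrete, the hypothetical Python sketch below traces a single coordinate line of an annular domain by integrating along the gradient of the function whose contour lines define the two boundaries; the function psi, the boundary values, and the starting point are assumptions for illustration and are not the discretization used in the paper.

      import numpy as np
      from scipy.integrate import solve_ivp

      # Assumed example: psi(x, y) = x**2 + y**2; the contour lines psi = 1 and
      # psi = 4 bound a doubly connected (annular) domain.
      def grad_psi(x, y):
          return np.array([2.0 * x, 2.0 * y])

      def rhs(s, p):
          g = grad_psi(*p)
          return g / np.dot(g, g)   # normalized so psi increases by 1 per unit s

      start = np.array([1.0, 0.0])  # a point on the inner boundary psi = 1
      sol = solve_ivp(rhs, (1.0, 4.0), start, max_step=0.05)
      print(sol.y[:, -1])           # lands on the outer boundary, near (2, 0)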

  17. Materials and Methods for Streamlined Laboratory Analysis of Environmental Samples, FY 2016 Report

    Energy Technology Data Exchange (ETDEWEB)

    Addleman, Raymond S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Naes, Benjamin E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McNamara, Bruce K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Olsen, Khris B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Chouyyok, Wilaiwan [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Willingham, David G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Spigner, Angel C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-11-30

    The International Atomic Energy Agency (IAEA) relies upon laboratory analysis of environmental samples (typically referred to as “swipes”) collected during on-site inspections of safeguarded facilities to support the detection and deterrence of undeclared activities. Unfortunately, chemical processing and assay of the samples is slow and expensive. A rapid, effective, and simple extraction process and analysis method is needed to provide certified results with improved timeliness at reduced costs (principally in the form of reduced labor), while maintaining or improving sensitivity and efficacy. To address these safeguards needs, the Pacific Northwest National Laboratory (PNNL) explored and demonstrated improved methods for environmental sample (ES) analysis. Improvements for both bulk and particle analysis were explored. To facilitate continuity and adoption, the new sampling materials and processing methods will be compatible with existing IAEA protocols for ES analysis. PNNL collaborated with Oak Ridge National Laboratory (ORNL), which performed independent validation of the new bulk analysis methods and compared their performance to the traditional IAEA Network of Analytical Laboratories (NWAL) protocol. ORNL efforts are reported separately. This report describes PNNL’s FY 2016 progress, which was focused on analytical applications supporting environmental monitoring of uranium enrichment plants and nuclear fuel processing. In the future the technology could be applied to other safeguards applications and analytes related to fuel manufacturing, reprocessing, etc. PNNL’s FY 2016 efforts were broken into two tasks, and a summary of progress, accomplishments, and highlights is provided below. Principal progress and accomplishments on Task 1, Optimize Materials and Methods for ICP-MS Environmental Sample Analysis, are listed below. • Completed initial procedure for rapid uranium extraction from ES swipes based upon carbonate-peroxide chemistry (delivered to ORNL for

  18. Rapid Evidence Assessment of the Literature (REAL(©)): streamlining the systematic review process and creating utility for evidence-based health care.

    Science.gov (United States)

    Crawford, Cindy; Boyd, Courtney; Jain, Shamini; Khorsan, Raheleh; Jonas, Wayne

    2015-11-02

    Systematic reviews (SRs) are widely recognized as the best means of synthesizing clinical research. However, traditional approaches can be costly and time-consuming and can be subject to selection and judgment bias. It can also be difficult to interpret the results of a SR in a meaningful way in order to make research recommendations, clinical or policy decisions, or practice guidelines. Samueli Institute has developed the Rapid Evidence Assessment of the Literature (REAL) SR process to address these issues. REAL provides up-to-date, rigorous, high quality SR information on health care practices, products, or programs in a streamlined, efficient and reliable manner. This process is a component of the Scientific Evaluation and Review of Claims in Health Care (SEaRCH™) program developed by Samueli Institute, which aims at answering the question of "What works?" in health care. The REAL process (1) tailors a standardized search strategy to a specific and relevant research question developed with various stakeholders to survey the available literature; (2) evaluates the quantity and quality of the literature using structured tools and rulebooks to ensure objectivity, reliability and reproducibility of reviewer ratings in an independent fashion and; (3) obtains formalized, balanced input from trained subject matter experts on the implications of the evidence for future research and current practice. Online tools and quality assurance processes are utilized for each step of the review to ensure a rapid, rigorous, reliable, transparent and reproducible SR process. The REAL is a rapid SR process developed to streamline and aid in the rigorous and reliable evaluation and review of claims in health care in order to make evidence-based, informed decisions, and has been used by a variety of organizations aiming to gain insight into "what works" in health care. Using the REAL system allows for the facilitation of recommendations on appropriate next steps in policy, funding

  19. Streamline-based microfluidic device

    Science.gov (United States)

    Tai, Yu-Chong (Inventor); Zheng, Siyang (Inventor); Kasdan, Harvey (Inventor)

    2013-01-01

    The present invention provides a streamline-based device and a method for using the device for continuous separation of particles including cells in biological fluids. The device includes a main microchannel and an array of side microchannels disposed on a substrate. The main microchannel has a plurality of stagnation points with a predetermined geometric design, for example, each of the stagnation points has a predetermined distance from the upstream edge of each of the side microchannels. The particles are separated and collected in the side microchannels.

  20. 75 FR 28517 - 2004 and 2006 Biennial Regulatory Reviews-Streamlining and Other Revisions of the Commission's...

    Science.gov (United States)

    2010-05-21

    ... Commission for the transmission of radio energy, the owner of the tower shall maintain the prescribed.... FOR FURTHER INFORMATION CONTACT: John Borkowski, Wireless Telecommunications Bureau, (202) 418-0626, e... its 2004 Biennial Review Comments, PCIA--the Wireless Infrastructure Association (PCIA) states that...

  1. Analysis Streamlining in ATLAS

    CERN Document Server

    Heinrich, Lukas; The ATLAS collaboration

    2018-01-01

    We present recent work within the ATLAS collaboration to centrally provide tools to facilitate analysis management and highly automated container-based analysis execution, in order both to enable non-experts to benefit from these best practices and to allow the collaboration to track and re-execute analyses independently, e.g. during their review phase. Through integration with the ATLAS GLANCE system, users can request a pre-configured, but customizable version control setup, including continuous integration for automated build and testing as well as continuous Linux Container image building for software preservation purposes. As analyses typically require many individual steps, analysis workflow pipelines can then be defined using such images and the yadage workflow description language. The integration into the workflow execution service REANA allows the interactive or automated reproduction of the main analysis results by orchestrating a large number of container jobs using Kubernetes. For long-term archival,...

  2. The Use of Rapid Review Methods for the U.S. Preventive Services Task Force.

    Science.gov (United States)

    Patnode, Carrie D; Eder, Michelle L; Walsh, Emily S; Viswanathan, Meera; Lin, Jennifer S

    2018-01-01

    Rapid review products are intended to synthesize available evidence in a timely fashion while still meeting the needs of healthcare decision makers. Various methods and products have been applied for rapid evidence syntheses, but no single approach has been uniformly adopted. Methods to gain efficiency and compress the review time period include focusing on a narrow clinical topic and key questions; limiting the literature search; performing single (versus dual) screening of abstracts and full-text articles for relevance; and limiting the analysis and synthesis. In order to maintain the scientific integrity, including transparency, of rapid evidence syntheses, it is imperative that procedures used to streamline standard systematic review methods are prespecified, based on sound review principles and empiric evidence when possible, and provide the end user with an accurate and comprehensive synthesis. The collection of clinical preventive service recommendations maintained by the U.S. Preventive Services Task Force, along with its commitment to rigorous methods development, provide a unique opportunity to refine, implement, and evaluate rapid evidence synthesis methods and add to an emerging evidence base on rapid review methods. This paper summarizes the U.S. Preventive Services Task Force's use of rapid review methodology, its criteria for selecting topics for rapid evidence syntheses, and proposed methods to streamline the review process. Copyright © 2018 American Journal of Preventive Medicine. All rights reserved.

  3. VALUATION METHODS- LITERATURE REVIEW

    OpenAIRE

    Dorisz Talas

    2015-01-01

    This paper is a theoretical overview of the often used valuation methods with the help of which the value of a firm or its equity is calculated. Many experts (including Aswath Damodaran, Guochang Zhang and CA Hozefa Natalwala) classify the methods. The basic models are based on discounted cash flows. The main method uses the free cash flow for valuation, but there are some newer methods that reveal and correct the weaknesses of the traditional models. The valuation of flexibility of managemen...

  4. VALUATION METHODS- LITERATURE REVIEW

    Directory of Open Access Journals (Sweden)

    Dorisz Talas

    2015-07-01

    Full Text Available This paper is a theoretical overview of the often-used valuation methods with the help of which the value of a firm or its equity is calculated. Many experts (including Aswath Damodaran, Guochang Zhang and CA Hozefa Natalwala) classify the methods. The basic models are based on discounted cash flows. The main method uses the free cash flow for valuation, but there are some newer methods that reveal and correct the weaknesses of the traditional models. The valuation of the flexibility of management can be conducted mainly with real options. This paper briefly describes the essence of the Dividend Discount Model, the Free Cash Flow Model, the benefit from using real options, and the Residual Income Model. There are a few words about the Adjusted Present Value approach as well. Different models use different premises, and an overall truth is that if the required premises are real and correct, the value will be appropriately accurate. Another important condition is that experts and analysts should choose between the models on the basis of the purpose of valuation. Thus there are no good or bad methods, only methods that fit different goals and aims. The main task is to define the purpose exactly, then to find the most appropriate valuation technique. All the methods originate from the premise that the value of an asset is the present value of its future cash flows. According to the different points of view of different techniques, the resulting values can also differ from each other. Valuation models and techniques should be adapted to the rapidly changing world, but the basic statements remain the same. On the other hand, there is a need for more accurate models in order to help investors get as much information as they can. Today information is one of the most important resources, and financial models should keep up with this trend.
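
    For reference, the standard forms of the models named in the abstract are summarized below; the symbols are the conventional ones (D_t dividends, r the required rate of return, g a constant growth rate, FCF_t free cash flow, WACC the weighted average cost of capital, B_t book value, NI_t net income) and are not taken from the paper itself.

      Dividend Discount Model: V_0 = \sum_{t=1}^{\infty} \frac{D_t}{(1+r)^t}, and with constant growth g < r (Gordon growth) V_0 = \frac{D_1}{r - g}.

      Free Cash Flow Model (firm value): V_0 = \sum_{t=1}^{\infty} \frac{\mathrm{FCF}_t}{(1+\mathrm{WACC})^t}.

      Residual Income Model: V_0 = B_0 + \sum_{t=1}^{\infty} \frac{NI_t - r\,B_{t-1}}{(1+r)^t}.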

  5. Streamlining Smart Meter Data Analytics

    DEFF Research Database (Denmark)

    Liu, Xiufeng; Nielsen, Per Sieverts

    2015-01-01

    Today smart meters are increasingly used worldwide. Smart meters are advanced meters capable of measuring customer energy consumption at a fine-grained time interval, e.g., every 15 minutes. The data are very sizable and might come from different sources, along with other socio-economic metrics such as the geographic information of meters, the information about users and their property, geographic location and others, which makes the data management very complex. On the other hand, data mining and the emerging cloud computing technologies make the collection, management, and analysis of the so-called big data possible. This can improve energy management, e.g., help utilities improve the management of energy and services, and help customers save money. In this regard, the paper focuses on building an innovative software solution to streamline smart meter data analytics, aiming at dealing...
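
    As a toy illustration of the kind of fine-grained aggregation such a solution has to perform (this is an assumed pandas example, not part of the paper's software), 15-minute readings can be rolled up per meter as follows:

      import pandas as pd

      # Assumed 15-minute smart meter readings (kWh per interval)
      readings = pd.DataFrame({
          "meter_id": ["m1"] * 4 + ["m2"] * 4,
          "timestamp": pd.to_datetime(["2015-01-01 00:00", "2015-01-01 00:15",
                                       "2015-01-01 00:30", "2015-01-01 00:45"] * 2),
          "kwh": [0.20, 0.30, 0.25, 0.15, 0.50, 0.40, 0.45, 0.35],
      })

      # Roll the fine-grained intervals up to hourly consumption per meter
      hourly = (readings
                .set_index("timestamp")
                .groupby("meter_id")["kwh"]
                .resample("1H")
                .sum())
      print(hourly)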

  6. Quantitative Methods for Teaching Review

    OpenAIRE

    Irina Milnikova; Tamara Shioshvili

    2011-01-01

    A new method for the quantitative evaluation of teaching processes is elaborated. On the basis of score data, the method permits evaluation of the efficiency of teaching within one group of students and comparison of teaching efficiency across two or more groups. Heterogeneity, stability, and total variability indices, both for a single group and for comparing different groups, are used as basic characteristics of teaching efficiency. The method is easy to use and permits ranking of the results of teaching review which...

  7. A streamlined failure mode and effects analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ford, Eric C., E-mail: eford@uw.edu; Smith, Koren; Terezakis, Stephanie; Croog, Victoria; Gollamudi, Smitha; Gage, Irene; Keck, Jordie; DeWeese, Theodore; Sibley, Greg [Department of Radiation Oncology and Molecular Radiation Sciences, Johns Hopkins University, Baltimore, MD 21287 (United States)

    2014-06-15

    Purpose: Explore the feasibility and impact of a streamlined failure mode and effects analysis (FMEA) using a structured process that is designed to minimize staff effort. Methods: FMEA for the external beam process was conducted at an affiliate radiation oncology center that treats approximately 60 patients per day. A structured FMEA process was developed which included clearly defined roles and goals for each phase. A core group of seven people was identified and a facilitator was chosen to lead the effort. Failure modes were identified and scored according to the FMEA formalism. A risk priority number, RPN, was calculated and used to rank failure modes. Failure modes with RPN > 150 received safety improvement interventions. Staff effort was carefully tracked throughout the project. Results: Fifty-two failure modes were identified, 22 collected during meetings, and 30 from take-home worksheets. The four top-ranked failure modes were: delay in film check, missing pacemaker protocol/consent, critical structures not contoured, and pregnant patient simulated without the team's knowledge of the pregnancy. These four failure modes had RPN > 150 and received safety interventions. The FMEA was completed in one month in four 1-h meetings. A total of 55 staff hours were required and, additionally, 20 h by the facilitator. Conclusions: Streamlined FMEA provides a means of accomplishing a relatively large-scale analysis with modest effort. One potential value of FMEA is that it potentially provides a means of measuring the impact of quality improvement efforts through a reduction in risk scores. Future study of this possibility is needed.
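
    In FMEA the risk priority number is conventionally the product of severity, occurrence, and detectability scores. The short Python sketch below shows that ranking step together with the RPN > 150 intervention threshold from the abstract; the individual scores are invented for illustration, although two of the failure-mode names come from the results above.

      def rpn(severity, occurrence, detectability):
          """Risk priority number: product of the three FMEA scores (1-10 scales)."""
          return severity * occurrence * detectability

      # Hypothetical scores; the first two failure-mode names appear in the abstract.
      failure_modes = {
          "delay in film check": (6, 7, 5),
          "critical structures not contoured": (9, 4, 5),
          "wrong accessory mounted": (5, 3, 4),
      }

      ranked = sorted(((rpn(*scores), name) for name, scores in failure_modes.items()),
                      reverse=True)
      for score, name in ranked:
          action = "safety intervention" if score > 150 else "monitor"
          print(f"{score:4d}  {name:35s}  {action}")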

  8. A streamlined failure mode and effects analysis

    International Nuclear Information System (INIS)

    Ford, Eric C.; Smith, Koren; Terezakis, Stephanie; Croog, Victoria; Gollamudi, Smitha; Gage, Irene; Keck, Jordie; DeWeese, Theodore; Sibley, Greg

    2014-01-01

    Purpose: Explore the feasibility and impact of a streamlined failure mode and effects analysis (FMEA) using a structured process that is designed to minimize staff effort. Methods: FMEA for the external beam process was conducted at an affiliate radiation oncology center that treats approximately 60 patients per day. A structured FMEA process was developed which included clearly defined roles and goals for each phase. A core group of seven people was identified and a facilitator was chosen to lead the effort. Failure modes were identified and scored according to the FMEA formalism. A risk priority number, RPN, was calculated and used to rank failure modes. Failure modes with RPN > 150 received safety improvement interventions. Staff effort was carefully tracked throughout the project. Results: Fifty-two failure modes were identified, 22 collected during meetings, and 30 from take-home worksheets. The four top-ranked failure modes were: delay in film check, missing pacemaker protocol/consent, critical structures not contoured, and pregnant patient simulated without the team's knowledge of the pregnancy. These four failure modes had RPN > 150 and received safety interventions. The FMEA was completed in one month in four 1-h meetings. A total of 55 staff hours were required and, additionally, 20 h by the facilitator. Conclusions: Streamlined FMEA provides a means of accomplishing a relatively large-scale analysis with modest effort. One potential value of FMEA is that it potentially provides a means of measuring the impact of quality improvement efforts through a reduction in risk scores. Future study of this possibility is needed.

  9. Critical review on biofilm methods

    DEFF Research Database (Denmark)

    Azeredo, Joana; F. Azevedo, Nuno; Briandet, Romain

    2017-01-01

    Biofilms are widespread in nature and constitute an important strategy implemented by microorganisms to survive in sometimes harsh environmental conditions. They can be beneficial or have a negative impact, particularly when formed in industrial settings or on medical devices. As such, research in ... and limitations of several methods. Accordingly, this review aims at helping scientists find the most appropriate and up-to-date methods to study their biofilms.

  10. ACHP | News | ACHP Issues Program Comment to Streamline Communication

    Science.gov (United States)

    ACHP Issues Program Comment to Streamline Communication Facilities Construction and Modification. The Advisory Council on

  11. Accelerated Logistics: Streamlining the Army's Supply Chain

    National Research Council Canada - National Science Library

    Wang, Mark

    2000-01-01

    ...) initiative, the Army has dramatically streamlined its supply chain, cutting order and ship times for repair parts by nearly two-thirds nationwide and over 75 percent at several of the major Forces Command (FORSCOM) installations...

  12. Creating customer value by streamlining business processes.

    Science.gov (United States)

    Vantrappen, H

    1992-02-01

    Much of the strategic preoccupation of senior managers in the 1990s is focusing on the creation of customer value. Companies are seeking competitive advantage by streamlining the three processes through which they interact with their customers: product creation, order handling and service assurance. 'Micro-strategy' is a term which has been coined for the trade-offs and decisions on where and how to streamline these three processes. The article discusses micro-strategies applied by successful companies.

  13. Streamlining: Reducing costs and increasing STS operations effectiveness

    Science.gov (United States)

    Petersburg, R. K.

    1985-01-01

    The development of streamlining as a concept, its inclusion in the space transportation system engineering and operations support (STSEOS) contract, and how it serves as an incentive to management and technical support personnel are discussed. The mechanics of encouraging and processing streamlining suggestions, reviews, feedback to submitters, and recognition are described, along with how individual employee performance evaluations are used for motivation. Several items that were implemented are mentioned. The information reported and the methodology for determining estimated dollar savings are outlined. The overall effect of this activity on the ability of the McDonnell Douglas flight preparation and mission operations team to support a rapidly increasing flight rate without a proportional increase in cost is illustrated.

  14. Streamlined Darwin methods for particle beam injectors

    International Nuclear Information System (INIS)

    Boyd, J.K.

    1987-01-01

    Physics issues that involve inductive effects, such as beam fluctuations, electromagnetic (EM) instability, or interactions with a cavity require a time-dependent simulation. The most elaborate time-dependent codes self-consistently solve Maxwell's equations and the force equation for a large number of macroparticles. Although these full EM particle-in-cell (PIC) codes have been used to study a broad range of phenomena, including beam injectors, they have several drawbacks. In an explicit solution of Maxwell's equations, the time step is restricted by a Courant condition. A second disadvantage is the production of anomalously large numerical fluctuations, caused by representing many real particles by a single computational macroparticle. Last, approximate models of internal boundaries can create nonphysical radiation in a full EM simulation. In this work, many of the problems of a fully electromagnetic simulation are avoided by using the Darwin field model. The Darwin field model is the magnetoinductive limit of Maxwell's equations, and it retains the first-order relativistic correction to the particle Lagrangian. It includes the part of the displacement current necessary to satisfy the charge-continuity equation. This feature is important for simulation of nonneutral beams. Because the Darwin model does not include the solenoidal vector component of the displacement current, it cannot be used to study high-frequency phenomena or effects caused by rapid current changes. However, because wave motion is not followed, the Courant condition of a fully electromagnetic code can be exceeded. In addition, inductive effects are modeled without creating nonphysical radiation
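
    For readers unfamiliar with the field model named above, the Darwin (magnetoinductive) approximation is usually written by splitting the electric field into irrotational and solenoidal parts, E = E_L + E_T with ∇×E_L = 0 and ∇·E_T = 0, and keeping only the longitudinal part of the displacement current; the summary below is the standard textbook form rather than a formulation quoted from this paper:

      \nabla\cdot\mathbf{E}_L = \rho/\varepsilon_0, \qquad \nabla\times\mathbf{E}_T = -\,\partial_t \mathbf{B}, \qquad \nabla\times\mathbf{B} = \mu_0 \mathbf{J} + \mu_0\varepsilon_0\, \partial_t \mathbf{E}_L .

    Dropping the transverse displacement current \mu_0\varepsilon_0\,\partial_t \mathbf{E}_T removes freely propagating light waves, which is why the Courant limit tied to the speed of light disappears, while \nabla\cdot(\mathbf{J} + \varepsilon_0\,\partial_t \mathbf{E}_L) = 0 still enforces charge continuity.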

  15. Impact assessment: Eroding benefits through streamlining?

    Energy Technology Data Exchange (ETDEWEB)

    Bond, Alan, E-mail: alan.bond@uea.ac.uk [School of Environmental Sciences, University of East Anglia (United Kingdom); School of Geo and Spatial Sciences, North-West University (South Africa); Pope, Jenny, E-mail: jenny@integral-sustainability.net [Integral Sustainability (Australia); Curtin University Sustainability Policy Institute (Australia); Morrison-Saunders, Angus, E-mail: A.Morrison-Saunders@murdoch.edu.au [School of Geo and Spatial Sciences, North-West University (South Africa); Environmental Science, Murdoch University (Australia); Retief, Francois, E-mail: francois.retief@nwu.ac.za [School of Geo and Spatial Sciences, North-West University (South Africa); Gunn, Jill A.E., E-mail: jill.gunn@usask.ca [Department of Geography and Planning and School of Environment and Sustainability, University of Saskatchewan (Canada)

    2014-02-15

    This paper argues that Governments have sought to streamline impact assessment in recent years (defined as the last five years) to counter concerns over the costs and potential for delays to economic development. We hypothesise that this has had some adverse consequences on the benefits that subsequently accrue from the assessments. This hypothesis is tested using a framework developed from arguments for the benefits brought by Environmental Impact Assessment made in 1982 in the face of the UK Government opposition to its implementation in a time of economic recession. The particular benefits investigated are ‘consistency and fairness’, ‘early warning’, ‘environment and development’, and ‘public involvement’. Canada, South Africa, the United Kingdom and Western Australia are the jurisdictions tested using this framework. The conclusions indicate that significant streamlining has been undertaken which has had direct adverse effects on some of the benefits that impact assessment should deliver, particularly in Canada and the UK. The research has not examined whether streamlining has had implications for the effectiveness of impact assessment, but the causal link between streamlining and benefits does sound warning bells that merit further investigation. -- Highlights: • Investigation of the extent to which government has streamlined IA. • Evaluation framework was developed based on benefits of impact assessment. • Canada, South Africa, the United Kingdom, and Western Australia were examined. • Trajectory in last five years is attrition of benefits of impact assessment.

  16. Impact assessment: Eroding benefits through streamlining?

    International Nuclear Information System (INIS)

    Bond, Alan; Pope, Jenny; Morrison-Saunders, Angus; Retief, Francois; Gunn, Jill A.E.

    2014-01-01

    This paper argues that Governments have sought to streamline impact assessment in recent years (defined as the last five years) to counter concerns over the costs and potential for delays to economic development. We hypothesise that this has had some adverse consequences on the benefits that subsequently accrue from the assessments. This hypothesis is tested using a framework developed from arguments for the benefits brought by Environmental Impact Assessment made in 1982 in the face of the UK Government opposition to its implementation in a time of economic recession. The particular benefits investigated are ‘consistency and fairness’, ‘early warning’, ‘environment and development’, and ‘public involvement’. Canada, South Africa, the United Kingdom and Western Australia are the jurisdictions tested using this framework. The conclusions indicate that significant streamlining has been undertaken which has had direct adverse effects on some of the benefits that impact assessment should deliver, particularly in Canada and the UK. The research has not examined whether streamlining has had implications for the effectiveness of impact assessment, but the causal link between streamlining and benefits does sound warning bells that merit further investigation. -- Highlights: • Investigation of the extent to which government has streamlined IA. • Evaluation framework was developed based on benefits of impact assessment. • Canada, South Africa, the United Kingdom, and Western Australia were examined. • Trajectory in last five years is attrition of benefits of impact assessment

  17. Industrial Compositional Streamline Simulation for Efficient and Accurate Prediction of Gas Injection and WAG Processes

    Energy Technology Data Exchange (ETDEWEB)

    Margot Gerritsen

    2008-10-31

    the redundant work generally done in the near-well regions. We improved the accuracy of the streamline simulator with a higher order mapping from pressure grid to streamlines that significantly reduces smoothing errors, and a Kriging algorithm is used to map from the streamlines to the background grid. The higher accuracy of the Kriging mapping means that it is not essential for grid blocks to be crossed by one or more streamlines. The higher accuracy comes at the price of increased computational costs, but allows coarser coverage and so does not generally increase the overall costs of the computations. To reduce errors associated with fixing the pressure field between pressure updates, we developed a higher order global time-stepping method that allows the use of larger global time steps. Third-order ENO schemes are suggested to propagate components along streamlines. Both in the two-phase and three-phase experiments these ENO schemes outperform other (higher order) upwind schemes. Application of the third order ENO scheme leads to overall computational savings because the computational grid used can be coarsened. Grid adaptivity along streamlines is implemented to allow sharp but efficient resolution of solution fronts at reduced computational costs when displacement fronts are sufficiently separated. A correction for Volume Change On Mixing (VCOM) is implemented that is very effective at handling this effect. Finally, a specialized gravity operator splitting method is proposed for use in compositional streamline methods that gives an effective correction of gravity segregation. A significant part of our effort went into the development of a parallelization strategy for streamline solvers on the next generation shared memory machines. We found in this work that the built-in dynamic scheduling strategies of OpenMP lead to parallel efficiencies that are comparable to optimal schedules obtained with customized explicit load balancing strategies as long as the ratio of
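
    The abstract refers to propagating components along streamlines with higher-order ENO schemes; the coordinate change underlying this, standard in streamline simulation and stated here in a generic two-phase form rather than the full compositional equations of the report, is the time-of-flight transform of Datta-Gupta and King:

      \mathbf{u}\cdot\nabla\tau = \phi, \qquad \text{so that} \qquad \phi\,\frac{\partial S}{\partial t} + \mathbf{u}\cdot\nabla F(S) = 0 \;\;\Longrightarrow\;\; \frac{\partial S}{\partial t} + \frac{\partial F(S)}{\partial \tau} = 0

    along each streamline, i.e. a one-dimensional conservation law in the time-of-flight coordinate τ to which the ENO scheme is applied, with gravity segregation and other cross-streamline effects reintroduced by operator splitting.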

  18. Review of the ISOL Method

    CERN Document Server

    Lindroos, M

    2004-01-01

    The ISOL technique was invented in Copenhagen over 50 years ago and eventually migrated to CERN, where a suitable proton drive beam was available at the Synchro-Cyclotron. The quick spread of the technique from CERN to many other laboratories has resulted in a large user community, which has assured the continued development of the method, physics at the front line of fundamental research, and the application of the method to many applied sciences. The technique is today established as one of the main techniques for on-line isotope production of high-intensity and high-quality beams. The thick targets used allow the production of unmatched high-intensity radioactive beams. The fact that the ions are produced at rest makes it ideally suitable for low-energy experiments and for post-acceleration using well-established accelerator techniques. The many different versions of the technique will be discussed and the many facilities spread all over the world will be reviewed. The major developments at the existing faci...

  19. Streamlining the Bankability Process using International Standards

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, Sarah [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Repins, Ingrid L [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Kelly, George [Sunset Technology, Mount Airy, MD; Ramu, Govind [SunPower, San Jose, California; Heinz, Matthias [TUV Rheinland, Cologne, Germany; Chen, Yingnan [CGC (China General Certification Center), Beijing; Wohlgemuth, John [PowerMark, Union Hall, VA; Lokanath, Sumanth [First Solar, Tempe, Arizona; Daniels, Eric [Suncycle USA, Frederick MD; Hsi, Edward [Swiss RE, Zurich, Switzerland; Yamamichi, Masaaki [RTS, Trumbull, CT

    2017-09-27

    NREL has supported the international efforts to create a streamlined process for documenting bankability and/or completion of each step of a PV project plan. IECRE was created for this purpose in 2014. This poster describes the goals, current status of this effort, and how individuals and companies can become involved.

  20. Hydrodynamic Drag on Streamlined Projectiles and Cavities

    KAUST Repository

    Jetly, Aditya

    2016-04-19

    The air cavity formation resulting from the water entry of solid objects has been the subject of extensive research due to its application in various fields such as biology, marine vehicles, sports, and the oil and gas industries. Recently we demonstrated that, at certain conditions, following the closing of the air cavity formed by the initial impact of a superhydrophobic sphere on a free water surface, a stable streamlined-shape air cavity can remain attached to the sphere. The superhydrophobic sphere and attached air cavity reach a steady state during the free fall. In this thesis we further explore this novel phenomenon to quantify the drag on streamlined-shape cavities. The drag on the sphere-cavity formation is then compared with the drag on solid projectiles which were designed to have a self-similar shape to that of the cavity. The solid projectiles of adjustable weight were produced using a 3D printing technique. In a set of experiments on the free fall of projectiles we determined the variation of the projectiles' drag coefficient as a function of the projectiles' length-to-diameter ratio and specific weight, covering a range of intermediate Reynolds numbers, Re ~ 10^4 - 10^5, which are characteristic for our streamlined cavity experiments. A parallel free-fall experiment with a sphere and attached streamlined air cavity and a projectile of the same shape and effective weight clearly demonstrated the drag reduction effect due to the stress-free boundary condition at the cavity-liquid interface. The streamlined cavity experiments can be used as an upper-bound estimate of the drag reduction by air layers naturally sustained on superhydrophobic surfaces in contact with water. In the final part of the thesis we design an experiment to test the drag reduction capacity of robust superhydrophobic coatings deposited on the surface of various model vessels.
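
    For reference, the drag coefficient referred to above is conventionally defined as below; the frontal-area reference is the usual convention for bluff and streamlined bodies and is an assumption here rather than a detail stated in the abstract:

      C_d = \frac{F_d}{\tfrac{1}{2}\,\rho\, U^2 A}, \qquad \mathrm{Re} = \frac{\rho\, U D}{\mu},

    where F_d is the measured drag force, ρ and μ are the density and dynamic viscosity of water, U is the fall velocity, D the body diameter, and A = \pi D^2/4 the frontal area.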

  1. Software Tools Streamline Project Management

    Science.gov (United States)

    2009-01-01

    Three innovative software inventions from Ames Research Center (NETMARK, Program Management Tool, and Query-Based Document Management) are finding their way into NASA missions as well as industry applications. The first, NETMARK, is a program that enables integrated searching of data stored in a variety of databases and documents, meaning that users no longer have to look in several places for related information. NETMARK allows users to search and query information across all of these sources in one step. This cross-cutting capability in information analysis has exponentially reduced the amount of time needed to mine data from days or weeks to mere seconds. NETMARK has been used widely throughout NASA, enabling this automatic integration of information across many documents and databases. NASA projects that use NETMARK include the internal reporting system and project performance dashboard, Erasmus, NASA's enterprise management tool, which enhances organizational collaboration and information sharing through document routing and review; the Integrated Financial Management Program; International Space Station Knowledge Management; Mishap and Anomaly Information Reporting System; and management of the Mars Exploration Rovers. Approximately $1 billion worth of NASA's projects are currently managed using Program Management Tool (PMT), which is based on NETMARK. PMT is a comprehensive, Web-enabled application tool used to assist program and project managers within NASA enterprises in monitoring, disseminating, and tracking the progress of program and project milestones and other relevant resources. The PMT consists of an integrated knowledge repository built upon advanced enterprise-wide database integration techniques and the latest Web-enabled technologies. The current system is in a pilot operational mode allowing users to automatically manage, track, define, update, and view customizable milestone objectives and goals. The third software invention, Query

  2. Review of Statistical Learning Methods in Integrated Omics Studies (An Integrated Information Science).

    Science.gov (United States)

    Zeng, Irene Sui Lan; Lumley, Thomas

    2018-01-01

    Integrated omics is becoming a new channel for investigating the complex molecular system in modern biological science and sets a foundation for systematic learning for precision medicine. The statistical/machine learning methods that have emerged in the past decade for integrated omics are not only innovative but also multidisciplinary, with integrated knowledge in biology, medicine, statistics, machine learning, and artificial intelligence. Here, we review the nontrivial classes of learning methods from a statistical perspective and streamline these learning methods within the statistical learning framework. The intriguing findings from the review are that the methods used are generalizable to other disciplines with complex systematic structure, and that integrated omics is part of an integrated information science which has collated and integrated different types of information for inference and decision making. We review the statistical learning methods of exploratory and supervised learning from 42 publications. We also discuss the strengths and limitations of the extended principal component analysis, cluster analysis, network analysis, and regression methods. Statistical techniques such as penalization for sparsity induction when there are fewer observations than features, and the use of a Bayesian approach when there is prior knowledge to be integrated, are also included in the commentary. For completeness, a table of currently available software and packages for omics from 23 publications is summarized in the appendix.
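
    As an illustration of the penalization idea mentioned above (inducing sparsity when the number of features far exceeds the number of observations), the sketch below fits an L1-penalized (lasso) regression to a simulated omics-like matrix. The data, dimensions, and penalty strength are hypothetical choices for illustration, not taken from the reviewed studies.

```python
# Minimal sketch of sparsity-inducing penalization for p >> n data (hypothetical example).
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n_samples, n_features = 50, 2000                  # far more features than observations
X = rng.standard_normal((n_samples, n_features))
true_coef = np.zeros(n_features)
true_coef[:5] = [2.0, -1.5, 1.0, 0.8, -0.5]       # only five features truly matter
y = X @ true_coef + 0.1 * rng.standard_normal(n_samples)

model = Lasso(alpha=0.1).fit(X, y)                # the L1 penalty drives most coefficients to zero
selected = np.flatnonzero(model.coef_)
print(f"non-zero coefficients: {len(selected)} of {n_features}")
```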

  3. Streamlining Research by Using Existing Tools

    OpenAIRE

    Greene, Sarah M.; Baldwin, Laura-Mae; Dolor, Rowena J.; Thompson, Ella; Neale, Anne Victoria

    2011-01-01

    Over the past two decades, the health research enterprise has matured rapidly, and many recognize an urgent need to translate pertinent research results into practice, to help improve the quality, accessibility, and affordability of U.S. health care. Streamlining research operations would speed translation, particularly for multi-site collaborations. However, the culture of research discourages reusing or adapting existing resources or study materials. Too often, researchers start studies and...

  4. A Review Of Authentication Methods

    OpenAIRE

    Nilesh A. Lal; Salendra Prasad; Mohammed Farik

    2015-01-01

    Authentication is the process of granting a user access to an information system. There are three main types of authentication mechanism: password entry, smart card, and biometric. Each authentication mechanism functions differently and has its own strengths and weaknesses. In this paper we review different types of authentication mechanisms and their vulnerabilities, and recommend novel solutions.

  5. A Review Of Authentication Methods

    Directory of Open Access Journals (Sweden)

    Nilesh A. Lal

    2015-08-01

    Full Text Available Authentication is the process of granting a user access to an information system. There are three main types of authentication mechanism: password entry, smart card, and biometric. Each authentication mechanism functions differently and has its own strengths and weaknesses. In this paper we review different types of authentication mechanisms and their vulnerabilities, and recommend novel solutions.

  6. A Review of Methods for Missing Data.

    Science.gov (United States)

    Pigott, Therese D.

    2001-01-01

    Reviews methods for handling missing data in a research study. Model-based methods, such as maximum likelihood using the EM algorithm and multiple imputation, hold more promise than ad hoc methods. Although model-based methods require more specialized computer programs and assumptions about the nature of missing data, these methods are appropriate…
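
    As a concrete illustration of the model-based approach favoured above, the sketch below applies an iterative, regression-based imputer (a MICE-style procedure available in scikit-learn) to a small toy matrix with missing entries; the data values are hypothetical.

```python
# Toy example of model-based imputation for missing data (hypothetical values).
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401  (enables the estimator below)
from sklearn.impute import IterativeImputer

X = np.array([
    [1.0,    2.0,    np.nan],
    [3.0,    np.nan, 6.0],
    [np.nan, 5.0,    9.0],
    [4.0,    8.0,    12.0],
])

# Each column with missing values is iteratively regressed on the other columns.
imputer = IterativeImputer(max_iter=10, random_state=0)
print(imputer.fit_transform(X))
```

    Repeating the fit with different random seeds and pooling the analyses of the completed datasets would give a simple form of multiple imputation.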

  7. Analysis of Streamline Separation at Infinity Using Time-Discrete Markov Chains.

    Science.gov (United States)

    Reich, W; Scheuermann, G

    2012-12-01

    Existing methods for analyzing the separation of streamlines are often restricted to a finite time or a local area. In our paper we introduce a new method that complements them by allowing an infinite-time evaluation of steady planar vector fields. Our algorithm unifies combinatorial and probabilistic methods and introduces the concept of separation in time-discrete Markov chains. We compute particle distributions instead of the streamlines of single particles. We encode the flow into a map and then into a transition matrix for each time direction. Finally, we compare the results of our grid-independent algorithm to the popular finite-time Lyapunov exponents and discuss the discrepancies.
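
    To make the "encode the flow into a transition matrix" step concrete, here is a toy sketch: the domain is binned into cells, each cell centre is advected one time step through an assumed analytic vector field, and the resulting row-stochastic matrix is iterated on a particle distribution instead of tracing single streamlines. The field, grid, and integrator are illustrative assumptions, not details from the paper.

```python
# Toy sketch: time-discrete Markov chain built from a steady planar vector field (assumed field).
import numpy as np

n = 20          # n x n grid on the unit square
dt = 0.05       # time step of the flow map

def velocity(x, y):
    """Hypothetical steady vector field (a single vortex), not a field used in the paper."""
    return -(y - 0.5), (x - 0.5)

centers = (np.arange(n) + 0.5) / n
P = np.zeros((n * n, n * n))
for i, x in enumerate(centers):
    for j, y in enumerate(centers):
        u, v = velocity(x, y)
        xn = min(max(x + dt * u, 0.0), 1.0 - 1e-9)          # forward Euler step, clamped to the domain
        yn = min(max(y + dt * v, 0.0), 1.0 - 1e-9)
        P[i * n + j, int(xn * n) * n + int(yn * n)] = 1.0   # deterministic transition to the target cell

dist = np.full(n * n, 1.0 / (n * n))   # start from a uniform particle distribution
for _ in range(200):                   # iterate the chain toward its long-time behaviour
    dist = dist @ P
print("probability mass conserved:", np.isclose(dist.sum(), 1.0))
```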

  8. An outline review of numerical transport methods

    International Nuclear Information System (INIS)

    Budd, C.

    1981-01-01

    A brief review is presented of numerical methods for solving the neutron transport equation in the context of reactor physics. First the various forms of transport equation are given. Second, the various ways of classifying numerical transport methods are discussed. Finally each method (or class of methods) is outlined in turn. (U.K.)

  9. Large-scale renewable energy project barriers: Environmental impact assessment streamlining efforts in Japan and the EU

    International Nuclear Information System (INIS)

    Schumacher, Kim

    2017-01-01

    Environmental Impact Assessment (EIA) procedures have been identified as a major barrier to renewable energy (RE) development with regard to large-scale projects (LS-RE). However, EIA laws have also been neglected by many decision-makers, who have underestimated their impact on RE development and the stifling potential they possess. As a consequence, apart from acknowledging the shortcomings of the systems currently in place, few governments currently have concrete plans to reform their EIA laws. By looking at recent EIA streamlining efforts in two industrialized regions that underwent major transformations in their energy sectors, this paper assesses how such reform efforts can support the balancing of environmental protection and climate change mitigation with socio-economic challenges. It fills this gap by identifying the strengths and weaknesses of the Japanese EIA law and contrasting it with the recently revised EIA Directive of the European Union (EU). This enables the identification of the regulatory provisions that affect RE development the most and the determination of how structured EIA law reforms would affect domestic RE project development. The main focus lies on the evaluation of regulatory streamlining efforts in the Japanese and EU contexts through a mixed-methods approach consisting of in-depth literature and legal reviews, followed by a comparative analysis and a series of semi-structured interviews. Highlighting several legal inconsistencies, in combination with the views of EIA professionals, academics, and law- and policymakers, allowed for a more comprehensive assessment of which streamlining elements of the reformed EU EIA Directive and the proposed Japanese EIA framework modifications could either promote or stifle further RE deployment. - Highlights: •Performs an in-depth review of EIA reforms in OECD territories •First paper to compare Japan and the European

  10. Industrial Practice in Formal Methods : A Review

    DEFF Research Database (Denmark)

    Bicarregui, Juan C.; Fitzgerald, John; Larsen, Peter Gorm

    2009-01-01

    We examine the industrial application of formal methods using data gathered in a review of 62 projects taking place over the last 25 years. The review suggests that formal methods are being applied in a wide range of application domains, with increasingly strong tool support. Significant challenges remain in providing usable tools that can be integrated into established development processes; in education and training; in taking formal methods from first use to second use; and in gathering evidence to support informed selection of methods and tools.

  11. Streamlined approach to waste management at CRL

    International Nuclear Information System (INIS)

    Adams, L.; Campbell, B.

    2011-01-01

    Radioactive, mixed, hazardous and non-hazardous wastes have been and continue to be generated at Chalk River Laboratories (CRL) as a result of research and development activities and operations since the 1940s. Over the years, the wastes produced as a byproduct of activities delivering the core missions of the CRL site have been of many types, and today, over thirty distinct waste streams have been identified, all requiring efficient management. With the commencement of decommissioning of the legacy created as part of the development of the Canadian nuclear industry, the volumes and range of wastes to be managed have been increasing in the near term, and this trend will continue into the future. The development of a streamlined approach to waste management is a key to successful waste management at CRL. Waste management guidelines that address all of the requirements have become complex, and so have the various waste management groups receiving waste, with their many different processes and capabilities. This has led to difficulties for waste generators in understanding all of the requirements to be satisfied for the various CRL waste receivers, whose primary concerns are to be safe and in compliance with their acceptance criteria and license conditions. As a result, waste movement on site can often be very slow, especially for non-routine waste types. Recognizing an opportunity for improvement, the Waste Management organization at CRL has implemented a more streamlined approach with emphasis on early identification of waste type and possible disposition path. This paper presents a streamlined approach to waste identification and waste management at CRL, the implementation methodology applied and the early results achieved from this process improvement. (author)

  12. Bee Queen Breeding Methods - Review

    Directory of Open Access Journals (Sweden)

    Silvia Patruica

    2016-05-01

    Full Text Available The biological potential of a bee family is mainly determined by the biological value of the queen. Whether we rear queens on a large scale or just for our own apiaries, we must consider the acquisition of high-quality biological material, as well as the creation of optimal feeding and care conditions, in order to obtain queens of high genetic value. Queen breeding technology starts with the setting up of hoeing families, nurse families, and drone-breeding families – necessary for the mating of young queens – as well as of the families which will provide the bees used to populate the nuclei where the next queens will hatch. The complex of requirements for the breeding of good, high-production queens is sometimes hard to meet when artificial methods are applied. The selection of a breeding method must rely on all these requirements and on the beekeeper's level of training.

  13. Lightroom 5 streamlining your digital photography process

    CERN Document Server

    Sylvan, Rob

    2014-01-01

    Manage your images with Lightroom and this beautifully illustrated guide. Image management can soak up huge amounts of a photographer's time, but help is on hand. This complete guide teaches you how to use Adobe Lightroom 5 to import, manage, edit, and showcase large quantities of images with impressive results. The authors, both professional photographers and Lightroom experts, walk you through step by step, demonstrating real-world techniques as well as a variety of practical tips, tricks, and shortcuts that save you time. Streamline image management tasks like a pro, and get back to doing

  14. Rapid review programs to support health care and policy decision making: a descriptive analysis of processes and methods.

    Science.gov (United States)

    Polisena, Julie; Garritty, Chantelle; Kamel, Chris; Stevens, Adrienne; Abou-Setta, Ahmed M

    2015-03-14

    Health care decision makers often need to make decisions in limited timeframes and cannot await the completion of a full evidence review. Rapid reviews (RRs), utilizing streamlined systematic review methods, are increasingly being used to synthesize the evidence with a shorter turnaround time. Our primary objective was to describe the processes and methods used internationally to produce RRs. In addition, we sought to understand the underlying themes associated with these programs. We contacted representatives of international RR programs from a broad range of health care settings to gather information about the methods and processes used to produce RRs. The responses were summarized narratively to understand the characteristics associated with their processes and methods. The summaries were compared and contrasted to highlight potential themes and trends related to the different RR programs. Twenty-nine international RR programs were included in our sample, with broad organizational representation from academia, government, research institutions, and not-for-profit organizations. Responses revealed that the main objectives for RRs were to inform decision making with regard to funding health care technologies, services, and policy, and program development. Central themes that influenced the methods used by RR programs, and the report type and dissemination, were the imposed turnaround time to complete a report, the resources available, the complexity and sensitivity of the research topics, and permission from the requestor. Our study confirmed that there is no standard approach to conducting RRs. Differences in processes and methods across programs may be the result of the novelty of RR methods versus other types of evidence syntheses, customization of RRs for various decision makers, and the definition of 'rapid' by organizations, since it impacts both the timelines and the evidence synthesis methods. Future research should investigate the impact of current RR methods and reporting to

  15. A review of methods supporting supplier selection

    NARCIS (Netherlands)

    de Boer, L.; Labro, Eva; Morlacchi, Pierangela

    2001-01-01

    In this paper we present a review of decision methods reported in the literature for supporting the supplier selection process. The review is based on an extensive search in the academic literature. We position the contributions in a framework that takes the diversity of procurement situations in terms

  16. Streamlining environmental product declarations: a stage model

    Science.gov (United States)

    Lefebvre, Elisabeth; Lefebvre, Louis A.; Talbot, Stephane; Le Hen, Gael

    2001-02-01

    General public environmental awareness and education is increasing, stimulating the demand for reliable, objective and comparable information about products' environmental performance. The recently published standard series ISO 14040 and ISO 14025 are normalizing the preparation of Environmental Product Declarations (EPDs) containing comprehensive information relevant to a product's environmental impact during its life cycle. So far, only a few environmentally leading manufacturing organizations (mostly from Europe) have experimented with the preparation of EPDs, demonstrating their great potential as a marketing weapon. However, the preparation of EPDs is a complex process, requiring the collection and analysis of massive amounts of information coming from disparate sources (suppliers, sub-contractors, etc.). In the foreseeable future, streamlining the EPD preparation process will require product manufacturers to adapt their information systems (ERP, MES, SCADA) in order to make them capable of gathering and transmitting the appropriate environmental information. It also requires strong functional integration all along the product supply chain in order to ensure that all the information is made available in a standardized and timely manner. The goal of the present paper is twofold: first, to propose a transitional model towards green supply chain management and EPD preparation; second, to identify key technologies and methodologies allowing to streamline the EPD process and subsequently the transition toward sustainable product development

  17. Epidemiological methods: a brief review

    International Nuclear Information System (INIS)

    Winkelstein, W. Jr.

    1983-01-01

    Epidemiology, the study of disease distributions in populations and the factors which influence these distributions, is an observational science, i.e., its data base consists of measurements made on free living individuals characterized by presence or absence of disease states and putative risk factors. Epidemiological studies are usually classified as descriptive or analytical. Descriptive studies are primarily used for planning and evaluating health programs or to generate etiological hypotheses. Analytical studies are primarily used for testing etiological hypotheses. Analytical studies are designed either as cohort investigations in which populations with and without a putative risk factor are followed through time to ascertain their differential incidence of disease, or case-control investigations in which the history of exposure to a putative risk factor is compared among persons with a disease and appropriate controls free of disease. Both descriptive and analytical epidemiological studies have been applied to health physics problems. Examples of such problems and the epidemiological methods used to explore them will be presented

  18. Review of Test Theory and Methods.

    Science.gov (United States)

    1981-01-01

    literature, although some books, technical reports, and unpublished literature have been included where relevant. The focus of the review is on practical... 1977) and Abu-Sayf (1977) developed new versions of formula scores, and Molenaar (1977) took a Bayesian approach to correcting for random guessing. The... Snow's (1977) book on aptitude and instructional methods is a landmark review of the research on the interaction between instructional methods and

  19. State Models to Incentivize and Streamline Small Hydropower Development

    Energy Technology Data Exchange (ETDEWEB)

    Curtis, Taylor [National Renewable Energy Lab. (NREL), Golden, CO (United States)]; Levine, Aaron [National Renewable Energy Lab. (NREL), Golden, CO (United States)]; Johnson, Kurt [National Renewable Energy Lab. (NREL), Golden, CO (United States)]

    2017-10-31

    In 2016, the hydropower fleet in the United States produced more than 6 percent (approximately 265,829 gigawatt-hours [GWh]) of total net electricity generation. The median-size hydroelectric facility in the United States is 1.6 MW, and 75 percent of all facilities have a nameplate capacity of 10 MW or less. Moreover, the U.S. Department of Energy's Hydropower Vision study identified approximately 79 GW of hydroelectric potential beyond what is already developed. Much of the potential identified is at low-impact new stream-reaches, existing conduits, and non-powered dams, with a median project size of 10 MW or less. To optimize the potential and value of small hydropower development, state governments are crafting policies that provide financial assistance and expedite state and federal review processes for small hydroelectric projects. This report analyzes state-led initiatives and programs that incentivize and streamline small hydroelectric development.

  20. Quantum mechanical streamlines. I - Square potential barrier

    Science.gov (United States)

    Hirschfelder, J. O.; Christoph, A. C.; Palke, W. E.

    1974-01-01

    Exact numerical calculations are made for scattering of quantum mechanical particles hitting a square two-dimensional potential barrier (an exact analog of the Goos-Haenchen optical experiments). Quantum mechanical streamlines are plotted and found to be smooth and continuous, to have continuous first derivatives even through the classical forbidden region, and to form quantized vortices around each of the nodal points. A comparison is made between the present numerical calculations and the stationary wave approximation, and good agreement is found between both the Goos-Haenchen shifts and the reflection coefficients. The time-independent Schroedinger equation for real wavefunctions is reduced to solving a nonlinear first-order partial differential equation, leading to a generalization of the Prager-Hirschfelder perturbation scheme. Implications of the hydrodynamical formulation of quantum mechanics are discussed, and cases are cited where quantum and classical mechanical motions are identical.

  1. [Montessori method applied to dementia - literature review].

    Science.gov (United States)

    Brandão, Daniela Filipa Soares; Martín, José Ignacio

    2012-06-01

    The Montessori method was initially applied to children, but it has now also been applied to people with dementia. The purpose of this study is to systematically review the research on the effectiveness of this method using the Medical Literature Analysis and Retrieval System Online (Medline) with the keywords dementia and Montessori method. We selected 10 studies, in which there were significant improvements in participation and constructive engagement, and reductions in negative affect and passive engagement. Nevertheless, systematic reviews of this non-pharmacological intervention in dementia rate this method as weak in terms of effectiveness. This apparent discrepancy can be explained because the Montessori method may have, in fact, a small influence on dimensions such as behavioral problems, or because there is no research on this method with high levels of control, such as the presence of several control groups or a double-blind design.

  2. A streamlined failure mode and effects analysis.

    Science.gov (United States)

    Ford, Eric C; Smith, Koren; Terezakis, Stephanie; Croog, Victoria; Gollamudi, Smitha; Gage, Irene; Keck, Jordie; DeWeese, Theodore; Sibley, Greg

    2014-06-01

    To explore the feasibility and impact of a streamlined failure mode and effects analysis (FMEA) using a structured process designed to minimize staff effort. FMEA for the external beam process was conducted at an affiliate radiation oncology center that treats approximately 60 patients per day. A structured FMEA process was developed which included clearly defined roles and goals for each phase. A core group of seven people was identified and a facilitator was chosen to lead the effort. Failure modes were identified and scored according to the FMEA formalism. A risk priority number, RPN, was calculated and used to rank failure modes. Failure modes with RPN > 150 received safety improvement interventions. Staff effort was carefully tracked throughout the project. Fifty-two failure modes were identified, 22 collected during meetings and 30 from take-home worksheets. The four top-ranked failure modes were: delay in film check, missing pacemaker protocol/consent, critical structures not contoured, and pregnant patient simulated without the team's knowledge of the pregnancy. These four failure modes had RPN > 150 and received safety interventions. The FMEA was completed in one month in four 1-h meetings. A total of 55 staff hours were required, plus an additional 20 h by the facilitator. A streamlined FMEA provides a means of accomplishing a relatively large-scale analysis with modest effort. One potential value of FMEA is that it provides a means of measuring the impact of quality improvement efforts through a reduction in risk scores. Future study of this possibility is needed.
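
    The ranking step described above (a risk priority number computed from severity, occurrence, and detectability, with interventions triggered above 150) can be sketched as follows. The failure-mode names echo the abstract, but the individual scores are invented placeholders, not the study's data.

```python
# Illustrative RPN ranking for a streamlined FMEA (scores are made up, not from the study).
failure_modes = {
    "delay in film check":               {"severity": 7, "occurrence": 6, "detectability": 5},
    "critical structures not contoured": {"severity": 9, "occurrence": 4, "detectability": 5},
    "wrong accessory mounted":           {"severity": 5, "occurrence": 3, "detectability": 4},
}

THRESHOLD = 150  # failure modes above this RPN receive safety interventions

ranked = sorted(
    ((name, s["severity"] * s["occurrence"] * s["detectability"])
     for name, s in failure_modes.items()),
    key=lambda item: item[1],
    reverse=True,
)
for name, rpn in ranked:
    action = "intervene" if rpn > THRESHOLD else "monitor"
    print(f"RPN {rpn:3d}  {action:9s}  {name}")
```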

  3. Streamlining cardiovascular clinical trials to improve efficiency and generalisability.

    Science.gov (United States)

    Zannad, Faiez; Pfeffer, Marc A; Bhatt, Deepak L; Bonds, Denise E; Borer, Jeffrey S; Calvo-Rojas, Gonzalo; Fiore, Louis; Lund, Lars H; Madigan, David; Maggioni, Aldo Pietro; Meyers, Catherine M; Rosenberg, Yves; Simon, Tabassome; Stough, Wendy Gattis; Zalewski, Andrew; Zariffa, Nevine; Temple, Robert

    2017-08-01

    Controlled trials provide the most valid determination of the efficacy and safety of an intervention, but large cardiovascular clinical trials have become extremely costly and complex, making it difficult to study many important clinical questions. A critical question, and the main objective of this review, is how trials might be simplified while maintaining randomisation to preserve scientific integrity and unbiased efficacy assessments. Experience with alternative approaches is accumulating, specifically with registry-based randomised controlled trials that make use of data already collected. This approach addresses bias concerns while still capitalising on the benefits and efficiencies of a registry. Several completed or ongoing trials illustrate the feasibility of using registry-based controlled trials to answer important questions relevant to daily clinical practice. Randomised trials within healthcare organisation databases may also represent streamlined solutions for some types of investigations, although data quality (endpoint assessment) is likely to be a greater concern in those settings. These approaches are not without challenges, and issues pertaining to informed consent, blinding, data quality and regulatory standards remain to be fully explored. Collaboration among stakeholders is necessary to achieve standards for data management and analysis, to validate large data sources for use in randomised trials, and to re-evaluate ethical standards to encourage research while also ensuring that patients are protected. The rapidly evolving efforts to streamline cardiovascular clinical trials have the potential to lead to major advances in promoting better care and outcomes for patients with cardiovascular disease. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  4. Streamlining of the Decontamination and Demolition Document Preparation Process

    International Nuclear Information System (INIS)

    Durand, Nick; Meincke, Carol; Peek, Georgianne

    1999-01-01

    During the past five years, the Sandia National Laboratories Decontamination, Decommissioning, Demolition, and Reuse (D3R) Program has evolved and become more focused and efficient. Historical approaches to project documentation, requirements, and drivers are discussed, detailing key assumptions, oversight authority, and project approvals. Discussion of efforts to streamline the D3R project planning and preparation process includes the incorporation of the principles of graded approach, Total Quality Management, and the Observational Method (CH2MHILL April 1989). Process improvements were realized by clearly defining regulatory requirements for each phase of a project, establishing general guidance for the program, and combining project-specific documents to eliminate redundant and unneeded information. Process improvements to cost, schedule, and quality are discussed in detail for several projects.

  5. Streamlined bioreactor-based production of human cartilage tissues.

    Science.gov (United States)

    Tonnarelli, B; Santoro, R; Adelaide Asnaghi, M; Wendt, D

    2016-05-27

    Engineered tissue grafts have been manufactured using methods based predominantly on traditional labour-intensive manual benchtop techniques. These methods impart significant regulatory and economic challenges, hindering the successful translation of engineered tissue products to the clinic. Alternatively, bioreactor-based production systems have the potential to overcome such limitations. In this work, we present an innovative manufacturing approach to engineer cartilage tissue within a single bioreactor system, starting from freshly isolated human primary chondrocytes through to the generation of cartilaginous tissue grafts. The limited number of primary chondrocytes that can be isolated from a small clinically sized cartilage biopsy could be seeded and extensively expanded directly within a 3D scaffold in our perfusion bioreactor (5.4 ± 0.9 doublings in 2 weeks), bypassing conventional 2D expansion in flasks. Chondrocytes expanded in 3D scaffolds better maintained a chondrogenic phenotype than chondrocytes expanded on plastic flasks (collagen type II mRNA, 18-fold; Sox-9, 11-fold). After this "3D expansion" phase, bioreactor culture conditions were changed to support subsequent chondrogenic differentiation for two weeks. Engineered tissues based on 3D-expanded chondrocytes were more cartilaginous than tissues generated from chondrocytes previously expanded in flasks. We then demonstrated that this streamlined bioreactor-based process could be adapted to effectively generate up-scaled cartilage grafts at a clinically relevant size (50 mm diameter). Streamlined and robust tissue engineering processes, such as the one described here, may be key for the future manufacturing of grafts for clinical applications, as they facilitate the establishment of compact and closed bioreactor-based production systems, with minimal automation requirements, lower operating costs, and increased compliance with regulatory guidelines.
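
    The expansion figure quoted above (5.4 ± 0.9 population doublings in two weeks) corresponds to roughly a forty-fold increase in cell number; the short calculation below simply converts doublings to fold expansion and is not part of the original study.

```python
# Convert reported population doublings to fold expansion (simple arithmetic check).
doublings, spread = 5.4, 0.9
fold = 2 ** doublings
fold_lo, fold_hi = 2 ** (doublings - spread), 2 ** (doublings + spread)
print(f"~{fold:.0f}-fold expansion (range {fold_lo:.0f}x to {fold_hi:.0f}x)")
```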

  6. The contingent valuation method: a review

    International Nuclear Information System (INIS)

    Venkatachalam, L.

    2004-01-01

    The contingent valuation method (CVM) is a simple, flexible nonmarket valuation method that is widely used in cost-benefit analysis and environmental impact assessment. However, this method is subject to severe criticism. The criticism revolves mainly around two aspects, namely, the validity and the reliability of the results, and the effects of various biases and errors. The major objective of this paper is to review the recent developments on measures to address the validity and reliability issues arising out of different kinds of biases/errors and other related empirical and methodological issues concerning contingent valuation method

  7. Review of strain buckling: analysis methods

    International Nuclear Information System (INIS)

    Moulin, D.

    1987-01-01

    This report represents an attempt to review the mechanical analysis methods reported in the literature to account for the specific behaviour that we call buckling under strain. In this report, this expression covers all buckling mechanisms in which the strains imposed play a role, whether they act alone (as in simple buckling under controlled strain), or whether they act with other loadings (primary loading, such as pressure, for example). Attention is focused on the practical problems relevant to LMFBR reactors. The components concerned are distinguished by their high slenderness ratios and by rather high thermal levels, both constant and variable with time. Conventional static buckling analysis methods are not always appropriate for the consideration of buckling under strain. New methods must therefore be developed in certain cases. It is also hoped that this review will facilitate the coding of these analytical methods to aid the constructor in his design task and to identify the areas which merit further investigation

  8. The design of the Comet streamliner: An electric land speed record motorcycle

    Science.gov (United States)

    McMillan, Ethan Alexander

    The development of the land speed record electric motorcycle streamliner, the Comet, is discussed herein. Its design process includes a detailed literature review of past and current motorcycle streamliners in an effort to highlight the main components of such a vehicle's design, while providing baseline data for performance comparisons. A new approach to balancing a streamliner at low speeds is also addressed, a system henceforth referred to as landing gear, which has proven an effective means of allowing the driver to control the low-speed instabilities of the vehicle with relative ease compared to traditional designs. This is accompanied by a dynamic stability analysis conducted on a test chassis that was developed primarily to understand the handling dynamics of streamliners, while also providing a test bed for the implementation of the landing gear system and a means to familiarize the driver with the operation and handling of such a vehicle. Data gathered through GPS-based velocity tracking, accelerometers, and a linear potentiometer provided a means to validate a dynamic stability analysis of the weave and wobble modes of the vehicle through linearization of a streamliner model developed in the BikeSIM software suite. The results show agreement between the experimental data and the simulation, indicating that the conventional recumbent design of a streamliner chassis is in fact highly stable throughout the performance envelope except at extremely low speeds. A computational fluid dynamics study was also performed and used in the development of the body of the Comet, with a series of tests conducted in order to develop a shape that was both practical to transport and highly efficient. By creating a hybrid airfoil from a NACA 0018 and a NACA 66-018, a drag coefficient of 0.1 and a frontal area of 0.44 m² were obtained for the final design. Utilizing a performance model based on the proposed vehicle's motor, its rolling resistance, and
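
    Using the drag coefficient and frontal area quoted above, the aerodynamic drag at a given speed follows from the standard drag equation F = ½ρC_dAv². The air density and the speeds in the sketch below are assumed values for illustration, not figures from the thesis.

```python
# Drag force for the reported Cd and frontal area (air density and speeds are assumed).
rho_air = 1.2    # kg/m^3, near-sea-level air density (assumption)
cd = 0.1         # drag coefficient reported for the final body design
area = 0.44      # m^2, reported frontal area

def drag_force(speed_m_s):
    """F = 0.5 * rho * Cd * A * v^2"""
    return 0.5 * rho_air * cd * area * speed_m_s ** 2

for v_kmh in (100, 200, 300):
    v = v_kmh / 3.6
    print(f"{v_kmh:3d} km/h -> drag ~ {drag_force(v):6.1f} N")
```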

  9. The impact of groundwater velocity fields on streamlines in an aquifer system with a discontinuous aquitard (Inner Mongolia, China)

    Science.gov (United States)

    Wu, Qiang; Zhao, Yingwang; Xu, Hua

    2018-04-01

    Many numerical methods that simulate groundwater flow, particularly the continuous Galerkin finite element method, do not produce velocity information directly. Many algorithms have been proposed to improve the accuracy of velocity fields computed from hydraulic potentials. The differences in the streamlines generated from velocity fields obtained using different algorithms are presented in this report. The superconvergence method employed by FEFLOW, a popular commercial code, and some dual-mesh methods proposed in recent years are selected for comparison. Applications that use streamlines to depict hydrogeologic conditions are considered, and errors in streamlines are shown to lead to notable errors in boundary conditions, the locations of material interfaces, fluxes, and conductivities. Furthermore, the effects of the procedures used in these two types of methods, including velocity integration and local conservation, are analyzed. The method of interpolating velocities across edges using fluxes is shown to eliminate errors associated with refraction points that are not located along material interfaces and with streamlines ending at no-flow boundaries. Local conservation is shown to be a crucial property of velocity fields and can result in more accurate streamline densities. A case study involving both three-dimensional and two-dimensional cross-sectional models of a coal mine in Inner Mongolia, China, is used to support the conclusions presented.
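
    As a reminder of how velocities are recovered from hydraulic heads before streamlines are traced, the sketch below applies Darcy's law on a small one-dimensional example. The conductivity, porosity, spacing, and heads are hypothetical, and the superconvergence and dual-mesh post-processing schemes compared in the paper are not reproduced here.

```python
# Darcy velocity from hydraulic heads on a 1-D grid (hypothetical parameters).
import numpy as np

K = 1e-4         # hydraulic conductivity, m/s (assumed)
porosity = 0.3   # effective porosity (assumed)
dx = 10.0        # grid spacing, m
heads = np.array([50.0, 49.2, 48.1, 47.5, 46.0])   # hydraulic heads, m (assumed)

darcy_flux = -K * np.gradient(heads, dx)   # q = -K * dh/dx
seepage_velocity = darcy_flux / porosity   # average linear velocity used for particle tracking
print(seepage_velocity)
```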

  10. A REVIEW OF ORDER PICKING IMPROVEMENT METHODS

    Directory of Open Access Journals (Sweden)

    Johan Oscar Ong

    2014-09-01

    Full Text Available As one of the most important parts of warehousing, order picking often raises discussion among warehousing professionals, resulting in various studies that analyze how order picking activity can be improved from various perspectives. This paper reviews past research on order picking improvement and the various methods those studies analyzed or developed. The literature review is based on twenty research articles on order picking improvement viewed from four different perspectives: automation (specifically, stock-to-picker systems), storage assignment policy, order batching, and order picking sequencing. By reviewing these studies, we try to identify the most prevalent approaches to order picking improvement. Keywords: warehousing; stock-to-picker; storage assignment; order batching; order picking sequencing; improvement

  11. Southern Ocean overturning across streamlines in an eddying simulation of the Antarctic Circumpolar Current

    Directory of Open Access Journals (Sweden)

    A. M. Treguier

    2007-12-01

    Full Text Available An eddying global model is used to study the characteristics of the Antarctic Circumpolar Current (ACC in a streamline-following framework. Previous model-based estimates of the meridional circulation were calculated using zonal averages: this method leads to a counter-intuitive poleward circulation of the less dense waters, and underestimates the eddy effects. We show that on the contrary, the upper ocean circulation across streamlines agrees with the theoretical view: an equatorward mean flow partially cancelled by a poleward eddy mass flux. Two model simulations, in which the buoyancy forcing above the ACC changes from positive to negative, suggest that the relationship between the residual meridional circulation and the surface buoyancy flux is not as straightforward as assumed by the simplest theoretical models: the sign of the residual circulation cannot be inferred from the surface buoyancy forcing only. Among the other processes that likely play a part in setting the meridional circulation, our model results emphasize the complex three-dimensional structure of the ACC (probably not well accounted for in streamline-averaged, two-dimensional models and the distinct role of temperature and salinity in the definition of the density field. Heat and salt transports by the time-mean flow are important even across time-mean streamlines. Heat and salt are balanced in the ACC, the model drift being small, but the nonlinearity of the equation of state cannot be ignored in the density balance.

  12. Hydroxyapatite Fibers: A Review of Synthesis Methods

    Science.gov (United States)

    Qi, Mei-Li; He, Kun; Huang, Zhen-Nan; Shahbazian-Yassar, Reza; Xiao, Gui-Yong; Lu, Yu-Peng; Shokuhfar, Tolou

    2017-08-01

    Hydroxyapatite (HA) exhibits excellent biocompatibility, bioactivity, osteoconductivity, non-toxicity and so on, making it a perfect candidate for biomedical applications. However, HA is not qualified to be used in load-bearing sites due to its poor flexural strength and fracture toughness. Design, synthesis and application of fibrous HA is a promising strategy to overcome the inherent brittleness. This review provides a brief description of HA and hydroxyapatite fiber (HAF), then introduces different synthesis methods of HAF and highlights the inherent merits and drawbacks involved in each method. Finally, the future perspectives in this active research area are given. The purpose of this review is to acquaint the reader with this promising new field of biomaterials research and with emphasis on recent techniques to obtain continuous, uniform and long HAF.

  13. Case studies in geographic information systems for environmental streamlining

    Science.gov (United States)

    2012-05-31

    This 2012 summary report addresses the current use of geographic information systems (GIS) and related technologies by State Departments of Transportation (DOTs) for environmental streamlining and stewardship, particularly in relation to the National...

  14. Streamline segment statistics of premixed flames with nonunity Lewis numbers

    Science.gov (United States)

    Chakraborty, Nilanjan; Wang, Lipo; Klein, Markus

    2014-03-01

    The interaction of flame and surrounding fluid motion is of central importance in the fundamental understanding of turbulent combustion. It is demonstrated here that this interaction can be represented using streamline segment analysis, which was previously applied in nonreactive turbulence. The present work focuses on the effects of the global Lewis number (Le) on streamline segment statistics in premixed flames in the thin-reaction-zones regime. A direct numerical simulation database of freely propagating thin-reaction-zones regime flames with Le ranging from 0.34 to 1.2 is used to demonstrate that Le has a significant influence on the characteristic features of the streamline segment, such as the curve length, the difference in the velocity magnitude at the two extremal points, and their correlations with the local flame curvature. The strengthening of the dilatation rate, flame-normal acceleration, and flame-generated turbulence with decreasing Le is principally responsible for these observed effects. An expression for the probability density function (pdf) of the streamline segment length, originally developed for nonreacting turbulent flows, captures the qualitative behavior for turbulent premixed flames in the thin-reaction-zones regime for a wide range of Le values. The joint pdfs between the streamline length and the difference in the velocity magnitude at the two extremal points for both unweighted and density-weighted velocity vectors are analyzed and compared. Detailed explanations are provided for the observed differences in the topological behavior of the streamline segment in response to the global Le.

  15. Streamlined Modeling for Characterizing Spacecraft Anomalous Behavior

    Science.gov (United States)

    Klem, B.; Swann, D.

    2011-09-01

    Anomalous behavior of on-orbit spacecraft can often be detected using passive, remote sensors which measure electro-optical signatures that vary in time and spectral content. Analysts responsible for assessing spacecraft operational status and detecting detrimental anomalies using non-resolved imaging sensors are often presented with various sensing and identification issues. Modeling and measuring spacecraft self emission and reflected radiant intensity when the radiation patterns exhibit a time varying reflective glint superimposed on an underlying diffuse signal contribute to assessment of spacecraft behavior in two ways: (1) providing information on body component orientation and attitude; and, (2) detecting changes in surface material properties due to the space environment. Simple convex and cube-shaped spacecraft, designed to operate without protruding solar panel appendages, may require an enhanced level of preflight characterization to support interpretation of the various physical effects observed during on-orbit monitoring. This paper describes selected portions of the signature database generated using streamlined signature modeling and simulations of basic geometry shapes apparent to non-imaging sensors. With this database, summarization of key observable features for such shapes as spheres, cylinders, flat plates, cones, and cubes in specific spectral bands that include the visible, mid wave, and long wave infrared provide the analyst with input to the decision process algorithms contained in the overall sensing and identification architectures. The models typically utilize baseline materials such as Kapton, paints, aluminum surface end plates, and radiators, along with solar cell representations covering the cylindrical and side portions of the spacecraft. Multiple space and ground-based sensors are assumed to be located at key locations to describe the comprehensive multi-viewing aspect scenarios that can result in significant specular reflection

  16. Gingival Retraction Methods: A Systematic Review.

    Science.gov (United States)

    Tabassum, Sadia; Adnan, Samira; Khan, Farhan Raza

    2017-12-01

    The aim of this systematic review was to assess gingival retraction methods in terms of the amount of gingival retraction achieved and the changes observed in various clinical parameters: gingival index (GI), plaque index (PI), probing depth (PD), and attachment loss (AL). Data sources included three major databases, PubMed, CINAHL plus (Ebsco), and Cochrane, along with a hand search. The search was made using the key terms in different permutations of gingival retraction* AND displacement method* OR technique* OR agents OR material* OR medicament*. The initial search yielded 145 articles, which were narrowed down to 10 articles using strict eligibility criteria that included only clinical trials or experimental studies of gingival retraction methods conducted on human permanent teeth, with the amount of tooth structure gained and the assessment of clinical parameters as the outcomes. Gingival retraction was measured in 6/10 studies, whereas the clinical parameters were assessed in 5/10 studies. The total number of teeth assessed in the 10 included studies was 400. The most common method used for gingival retraction was chemomechanical. The results were heterogeneous with regard to the outcome variables. No method seemed to be significantly superior to the others in terms of gingival retraction achieved. Clinical parameters were not significantly affected by the gingival retraction method. © 2016 by the American College of Prosthodontists.

  17. Mean streamline analysis for performance prediction of cross-flow fans

    International Nuclear Information System (INIS)

    Kim, Jae Won; Oh, Hyoung Woo

    2004-01-01

    This paper presents the mean streamline analysis using the empirical loss correlations for performance prediction of cross-flow fans. Comparison of overall performance predictions with test data of a cross-flow fan system with a simplified vortex wall scroll casing and with the published experimental characteristics for a cross-flow fan has been carried out to demonstrate the accuracy of the proposed method. Predicted performance curves by the present mean streamline analysis agree well with experimental data for two different cross-flow fans over the normal operating conditions. The prediction method presented herein can be used efficiently as a tool for the preliminary design and performance analysis of general-purpose cross-flow fans

  18. Bayesian flood forecasting methods: A review

    Science.gov (United States)

    Han, Shasha; Coulibaly, Paulin

    2017-08-01

    Over the past few decades, floods have been among the most common and most widely distributed natural disasters in the world. If floods could be accurately forecast in advance, their negative impacts could be greatly minimized. It is widely recognized that quantifying and reducing the uncertainty associated with hydrologic forecasts is of great importance for flood estimation and rational decision making. The Bayesian forecasting system (BFS) offers an ideal theoretical framework for uncertainty quantification that can be developed for probabilistic flood forecasting via any deterministic hydrologic model. It provides a suitable theoretical structure, empirically validated models, and reasonable analytic-numerical computation methods, and can be developed into various Bayesian forecasting approaches. This paper presents a comprehensive review of Bayesian forecasting approaches applied to flood forecasting from 1999 to the present. The review starts with an overview of the fundamentals of BFS and recent advances in BFS, followed by BFS applications in river stage forecasting and real-time flood forecasting, then moves to a critical analysis evaluating the advantages and limitations of Bayesian forecasting methods and other predictive uncertainty assessment approaches in flood forecasting, and finally discusses future research directions in Bayesian flood forecasting. Results show that the Bayesian flood forecasting approach is an effective and advanced way to estimate floods: it considers all sources of uncertainty and produces a predictive distribution of the river stage, river discharge, or runoff, thus giving more accurate and reliable flood forecasts. Some emerging Bayesian forecasting methods (e.g. ensemble Bayesian forecasting systems, Bayesian multi-model combination) were shown to overcome the limitations of a single model or fixed model weights and to effectively reduce predictive uncertainty. In recent years, various Bayesian flood forecasting approaches have been
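
    A heavily simplified sketch of the predictive-distribution idea described above: a single deterministic forecast is wrapped in a Gaussian error model whose bias and spread would, in a real Bayesian forecasting system, be estimated from historical forecast–observation pairs. All numbers below are assumptions for illustration.

```python
# Toy predictive distribution around a deterministic river-stage forecast (all values assumed).
import numpy as np

deterministic_forecast = 3.2   # forecast river stage, m (assumed model output)
bias, sigma = 0.1, 0.25        # error-model parameters, normally fitted to past errors

rng = np.random.default_rng(42)
samples = deterministic_forecast + bias + sigma * rng.standard_normal(10_000)

flood_threshold = 3.5                                   # assumed flood stage, m
p_exceed = float(np.mean(samples > flood_threshold))    # probability of exceeding the threshold
lo, hi = np.percentile(samples, [5, 95])
print(f"90% predictive interval: [{lo:.2f}, {hi:.2f}] m; P(stage > {flood_threshold} m) = {p_exceed:.2f}")
```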

  19. Review of human factors guidelines and methods

    International Nuclear Information System (INIS)

    Rhodes, W.; Szlapetis, I.; Hay, T.; Weihrer, S.

    1995-04-01

    The review examines the use of human factors guidelines and methods in high technology applications, with emphasis on application to the nuclear industry. An extensive literature review was carried out identifying over 250 applicable documents, with 30 more documents identified during interviews with experts in human factors. Surveys were sent to 15 experts, of which 11 responded. The survey results indicated guidelines used and why these were favoured. Thirty-three of the most applicable guideline documents were described in detailed annotated bibliographies. A bibliographic list containing over 280 references was prepared. Thirty guideline documents were rated for their completeness, validity, applicability and practicality. The experts survey indicated the use of specific techniques. Ten human factors methods of analysis were described in general summaries, including procedures, applications, and specific techniques. Detailed descriptions of the techniques were prepared and each technique rated for applicability and practicality. Recommendations for further study of areas of importance to human factors in the nuclear field in Canada are given. (author). 8 tabs., 2 figs

  20. Review of human factors guidelines and methods

    Energy Technology Data Exchange (ETDEWEB)

    Rhodes, W; Szlapetis, I; Hay, T; Weihrer, S [Rhodes and Associates Inc., Toronto, ON (Canada)]

    1995-04-01

    The review examines the use of human factors guidelines and methods in high technology applications, with emphasis on application to the nuclear industry. An extensive literature review was carried out identifying over 250 applicable documents, with 30 more documents identified during interviews with experts in human factors. Surveys were sent to 15 experts, of which 11 responded. The survey results indicated guidelines used and why these were favoured. Thirty-three of the most applicable guideline documents were described in detailed annotated bibliographies. A bibliographic list containing over 280 references was prepared. Thirty guideline documents were rated for their completeness, validity, applicability and practicality. The experts survey indicated the use of specific techniques. Ten human factors methods of analysis were described in general summaries, including procedures, applications, and specific techniques. Detailed descriptions of the techniques were prepared and each technique rated for applicability and practicality. Recommendations for further study of areas of importance to human factors in the nuclear field in Canada are given. (author). 8 tabs., 2 figs.

  1. Calculation of heat transfer in transversely stream-lined tube bundles with chess arrangement

    International Nuclear Information System (INIS)

    Migaj, V.K.

    1978-01-01

    A semiempirical theory of heat transfer in transversely stream-lined chess-board tube bundles has been developed. The theory is based on a single cylinder model and involves external flow parameter evaluation on the basis of the solidification principle of a vortex zone. The effect of turbulence is estimated according to experimental results. The method is extended to both average and local heat transfer coefficients. Comparison with experiment shows satisfactory agreement

  2. Review of Calibration Methods for Scheimpflug Camera

    Directory of Open Access Journals (Sweden)

    Cong Sun

    2018-01-01

    Full Text Available The Scheimpflug camera offers a wide range of applications in close-range photogrammetry, particle image velocimetry, and digital image correlation, because its depth of field can be greatly extended by exploiting the Scheimpflug condition. Yet conventional calibration methods are not applicable in this case, because the assumptions used by classical calibration methodologies are no longer valid for cameras satisfying the Scheimpflug condition. Therefore, various methods have been investigated to solve the problem over the last few years. However, no comprehensive review exists that provides an insight into recent calibration methods for Scheimpflug cameras. This paper presents a survey of recent calibration methods for Scheimpflug cameras with perspective lenses, including the general nonparametric imaging model, and analyzes in detail the advantages and drawbacks of the mainstream calibration models with respect to each other. Real-data experiments, including calibrations, reconstructions, and measurements, are performed to assess the performance of the models. The results reveal that the accuracies of the RMM, PLVM, PCIM, and GNIM are broadly comparable, with the accuracy of GNIM slightly lower than that of the other three parametric models. Moreover, the experimental results reveal that the parameters of the tangential distortion are likely coupled with the tilt angle of the sensor in Scheimpflug calibration models. The work of this paper lays the foundation for further research on Scheimpflug cameras.

  3. Earthquake Hazard Analysis Methods: A Review

    Science.gov (United States)

    Sari, A. M.; Fakhrurrozi, A.

    2018-02-01

    Earthquakes are among the natural disasters with the greatest impact in terms of risk and damage. Countries such as China, Japan, and Indonesia are located on actively moving continental plates and experience earthquakes more frequently than other countries. Several methods of earthquake hazard analysis have been applied, for example seismic zoning and earthquake hazard micro-zonation, the Neo-Deterministic Seismic Hazard Analysis (N-DSHA) method, and remote sensing. Before application, it is necessary to review the effectiveness of each technique. Considering time efficiency and data accuracy, remote sensing is used as a reference for assessing earthquake hazard accurately and quickly, since only limited time is available for sound decision-making shortly after a disaster. Exposed areas and areas potentially vulnerable to earthquake hazards can be analyzed easily using remote sensing. Technological developments in remote sensing, such as GeoEye-1, add value and strengthen the case for remote sensing as a method for assessing earthquake risk and damage. Furthermore, the use of this technique is expected to inform disaster management policies and to reduce the risk of natural disasters such as earthquakes in Indonesia.

  4. Review of Quantitative Software Reliability Methods

    Energy Technology Data Exchange (ETDEWEB)

    Chu, T.L.; Yue, M.; Martinez-Guridi, M.; Lehner, J.

    2010-09-17

    The current U.S. Nuclear Regulatory Commission (NRC) licensing process for digital systems rests on deterministic engineering criteria. In its 1995 probabilistic risk assessment (PRA) policy statement, the Commission encouraged the use of PRA technology in all regulatory matters to the extent supported by the state-of-the-art in PRA methods and data. Although many activities have been completed in the area of risk-informed regulation, the risk-informed analysis process for digital systems has not yet been satisfactorily developed. Since digital instrumentation and control (I&C) systems are expected to play an increasingly important role in nuclear power plant (NPP) safety, the NRC established a digital system research plan that defines a coherent set of research programs to support its regulatory needs. One of the research programs included in the NRC's digital system research plan addresses risk assessment methods and data for digital systems. Digital I&C systems have some unique characteristics, such as using software, and may have different failure causes and/or modes than analog I&C systems; hence, their incorporation into NPP PRAs entails special challenges. The objective of the NRC's digital system risk research is to identify and develop methods, analytical tools, and regulatory guidance for (1) including models of digital systems into NPP PRAs, and (2) using information on the risks of digital systems to support the NRC's risk-informed licensing and oversight activities. For several years, Brookhaven National Laboratory (BNL) has worked on NRC projects to investigate methods and tools for the probabilistic modeling of digital systems, as documented mainly in NUREG/CR-6962 and NUREG/CR-6997. However, the scope of this research principally focused on hardware failures, with limited reviews of software failure experience and software reliability methods. NRC also sponsored research at the Ohio State University investigating the modeling of

  5. 76 FR 75825 - Streamlining Inherited Regulations

    Science.gov (United States)

    2011-12-05

    ... easier. DATES: Comments must be submitted by March 5, 2012. Commenters will have 30 additional days... 1700 G Street NW., Washington, DC 20006, on official business days between the hours of 10 a.m. and 5 p... deadline for most of these rules. At the same time, the Bureau wants to start reviewing the inherited...

  6. Managing Written Directives: A Software Solution to Streamline Workflow.

    Science.gov (United States)

    Wagner, Robert H; Savir-Baruch, Bital; Gabriel, Medhat S; Halama, James R; Bova, Davide

    2017-06-01

    A written directive is required by the U.S. Nuclear Regulatory Commission for any use of 131I above 1.11 MBq (30 μCi) and for patients receiving radiopharmaceutical therapy. This requirement has also been adopted and must be enforced by the agreement states. As the introduction of new radiopharmaceuticals increases therapeutic options in nuclear medicine, time spent on regulatory paperwork also increases. The pressure of managing these time-consuming regulatory requirements may heighten the potential for inaccurate or incomplete directive data and subsequent regulatory violations. To improve on the paper-trail method of directive management, we created a software tool using a Health Insurance Portability and Accountability Act (HIPAA)-compliant database. This software allows for secure data-sharing among physicians, technologists, and managers while saving time, reducing errors, and eliminating the possibility of loss and duplication. Methods: The software tool was developed using Visual Basic, which is part of the Visual Studio development environment for the Windows platform. Patient data are deposited in an Access database on a local HIPAA-compliant secure server or hard disk. Once a working version had been developed, it was installed at our institution and used to manage directives. Updates and modifications of the software were released regularly until no more significant problems were found with its operation. Results: The software has been used at our institution for over 2 y and has reliably kept track of all directives. All physicians and technologists use the software daily and find it superior to paper directives. They can retrieve active directives at any stage of completion, as well as completed directives. Conclusion: We have developed a software solution for the management of written directives that streamlines and structures the departmental workflow. This solution saves time, centralizes the information for all staff to share, and decreases

  7. A Streamlined Approach by a Combination of Bioindication and Geostatistical Methods for Assessing Air Contaminants and Their Effects on Human Health in Industrialized Areas: A Case Study in Southern Brazil

    Directory of Open Access Journals (Sweden)

    Angélica B. Ferreira

    2017-09-01

    Full Text Available Industrialization in developing countries associated with urban growth results in a number of economic benefits, especially in small or medium-sized cities, but leads to a number of environmental and public health consequences. This problem is further aggravated when adequate infrastructure is lacking to monitor the environmental impacts left by industries and refineries. In this study, a new protocol was designed combining biomonitoring and geostatistics to evaluate the possible effects of shale industry emissions on human health and wellbeing. Furthermore, the traditional and expensive air quality method based on PM2.5 measurement was also used to validate the low-cost geostatistical approach. Chemical analysis was performed using an Energy Dispersive X-ray Fluorescence Spectrometer (EDXRF) to measure inorganic elements in tree bark and retorted shale samples in São Mateus do Sul city, Southern Brazil. Fe, S, and Si were considered potential pollutants in the study area. Distribution maps of element concentrations were generated from the dataset and used to estimate the spatial behavior of Fe, S, and Si and the range from their hot spot(s), highlighting the regions surrounding the shale refinery. This evidence was also demonstrated in the measurements of PM2.5 concentrations, which are in agreement with the information obtained from the biomonitoring and geostatistical model. Factor and descriptive analyses performed on the concentrations of tree bark contaminants suggest that Fe, S, and Si might be used as indicators of industrial emissions. The number of cases of respiratory diseases obtained from the local basic health unit was used to assess a possible correlation between shale refinery emissions and cases of respiratory disease. These data are public and may be accessed on the website of the Brazilian Ministry of Health. Significant associations were found between the health data and refinery activities. The combination of the spatial
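
    As a sketch of the low-cost mapping step described above, point concentrations measured on tree bark can be interpolated onto a regular grid; the snippet below uses simple inverse-distance weighting with invented sample data (the study's actual geostatistical model, such as kriging, is not reproduced here):

```python
import numpy as np

def idw_grid(xy, values, grid_x, grid_y, power=2.0):
    """Inverse-distance-weighted interpolation of point concentrations onto a
    regular grid, a simple stand-in for the mapping step; kriging would be the
    more rigorous geostatistical choice."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    out = np.zeros_like(gx, dtype=float)
    for i in range(gx.shape[0]):
        for j in range(gx.shape[1]):
            d = np.hypot(xy[:, 0] - gx[i, j], xy[:, 1] - gy[i, j])
            if d.min() < 1e-9:                      # node coincides with a sample
                out[i, j] = values[d.argmin()]
            else:
                w = 1.0 / d ** power
                out[i, j] = (w * values).sum() / w.sum()
    return out

# hypothetical tree-bark Fe concentrations (mg/kg) at sampling coordinates (km)
xy = np.array([[0.0, 0.0], [1.0, 0.5], [2.0, 1.5], [0.5, 2.0]])
fe = np.array([120.0, 340.0, 95.0, 210.0])
fe_map = idw_grid(xy, fe, np.linspace(0.0, 2.0, 21), np.linspace(0.0, 2.0, 21))
```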

  8. A Review of Human Activity Recognition Methods

    Directory of Open Access Journals (Sweden)

    Michalis eVrigkas

    2015-11-01

    Full Text Available Recognizing human activities from video sequences or still images is a challenging task due to problems such as background clutter, partial occlusion, changes in scale, viewpoint, lighting, and appearance. Many applications, including video surveillance systems, human-computer interaction, and robotics for human behavior characterization, require a multiple activity recognition system. In this work, we provide a detailed review of recent and state-of-the-art research advances in the field of human activity classification. We propose a categorization of human activity methodologies and discuss their advantages and limitations. In particular, we divide human activity classification methods into two large categories according to whether they use data from different modalities or not. Then, each of these categories is further analyzed into sub-categories, which reflect how they model human activities and what type of activities they are interested in. Moreover, we provide a comprehensive analysis of the existing, publicly available human activity classification datasets and examine the requirements for an ideal human activity recognition dataset. Finally, we report the characteristics of future research directions and present some open issues on human activity recognition.

  9. 48 CFR 12.602 - Streamlined evaluation of offers.

    Science.gov (United States)

    2010-10-01

    ... offers. 12.602 Section 12.602 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION... for Commercial Items 12.602 Streamlined evaluation of offers. (a) When evaluation factors are used... evaluation factors. (b) Offers shall be evaluated in accordance with the criteria contained in the...

  10. Application-Tailored I/O with Streamline

    NARCIS (Netherlands)

    de Bruijn, W.J.; Bos, H.J.; Bal, H.E.

    2011-01-01

    Streamline is a stream-based OS communication subsystem that spans from peripheral hardware to userspace processes. It improves performance of I/O-bound applications (such as webservers and streaming media applications) by constructing tailor-made I/O paths through the operating system for each

  11. Streamlining of the RELAP5-3D Code

    International Nuclear Information System (INIS)

    Mesina, George L; Hykes, Joshua; Guillen, Donna Post

    2007-01-01

    RELAP5-3D is widely used by the nuclear community to simulate general thermal hydraulic systems and has proven to be so versatile that the spectrum of transient two-phase problems that can be analyzed has increased substantially over time. To accommodate the many new types of problems that are analyzed by RELAP5-3D, both the physics and numerical methods of the code have been continuously improved. In the area of computational methods and mathematical techniques, many upgrades and improvements have been made to decrease code run time and increase solution accuracy. These include vectorization, parallelization, use of improved equation solvers for thermal hydraulics and neutron kinetics, and incorporation of improved library utilities. In the area of applied nuclear engineering, expanded capabilities include boron and level tracking models, radiation/conduction enclosure model, feedwater heater and compressor components, fluids and corresponding correlations for modeling Generation IV reactor designs, and coupling to computational fluid dynamics solvers. Ongoing and proposed future developments include improvements to the two-phase pump model, conversion to FORTRAN 90, and coupling to more computer programs. This paper summarizes the general improvements made to RELAP5-3D, with an emphasis on streamlining the code infrastructure for improved maintenance and development. With all these past, present and planned developments, it is necessary to modify the code infrastructure to incorporate modifications in a consistent and maintainable manner. Modifying a complex code such as RELAP5-3D to incorporate new models, upgrade numerics, and optimize existing code becomes more difficult as the code grows larger. The difficulty of this, as well as the chance of introducing errors, is significantly reduced when the code is structured. To streamline the code into a structured program, a commercial restructuring tool, FOR( ) STRUCT, was applied to the RELAP5-3D source files. The

  12. 28 CFR 34.105 - Peer review methods.

    Science.gov (United States)

    2010-07-01

    ... 28 Judicial Administration 1 2010-07-01 2010-07-01 false Peer review methods. 34.105 Section 34... Review § 34.105 Peer review methods. (a) For both competitive and noncompetitive applications, peer... announcement or otherwise established by the Administrator, together with the assignment of numerical values...

  13. Streamlined, Inexpensive 3D Printing of the Brain and Skull.

    Science.gov (United States)

    Naftulin, Jason S; Kimchi, Eyal Y; Cash, Sydney S

    2015-01-01

    Neuroimaging technologies such as Magnetic Resonance Imaging (MRI) and Computed Tomography (CT) collect three-dimensional data (3D) that is typically viewed on two-dimensional (2D) screens. Actual 3D models, however, allow interaction with real objects such as implantable electrode grids, potentially improving patient specific neurosurgical planning and personalized clinical education. Desktop 3D printers can now produce relatively inexpensive, good quality prints. We describe our process for reliably generating life-sized 3D brain prints from MRIs and 3D skull prints from CTs. We have integrated a standardized, primarily open-source process for 3D printing brains and skulls. We describe how to convert clinical neuroimaging Digital Imaging and Communications in Medicine (DICOM) images to stereolithography (STL) files, a common 3D object file format that can be sent to 3D printing services. We additionally share how to convert these STL files to machine instruction gcode files, for reliable in-house printing on desktop, open-source 3D printers. We have successfully printed over 19 patient brain hemispheres from 7 patients on two different open-source desktop 3D printers. Each brain hemisphere costs approximately $3-4 in consumable plastic filament as described, and the total process takes 14-17 hours, almost all of which is unsupervised (preprocessing = 4-6 hr; printing = 9-11 hr, post-processing = <30 min). Printing a matching portion of a skull costs $1-5 in consumable plastic filament and takes less than 14 hr, in total. We have developed a streamlined, cost-effective process for 3D printing brain and skull models. We surveyed healthcare providers and patients who confirmed that rapid-prototype patient specific 3D models may help interdisciplinary surgical planning and patient education. The methods we describe can be applied for other clinical, research, and educational purposes.
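
    The DICOM-to-STL step can be illustrated with a short script. This is a hedged sketch using common open-source Python libraries (pydicom, scikit-image, numpy-stl) rather than the authors' published pipeline; a usable brain print would first require segmentation or skull-stripping rather than a raw iso-surface, and the file paths and iso-level below are hypothetical:

```python
import glob
import numpy as np
import pydicom                    # assumed available: reads DICOM slices
from skimage import measure      # marching cubes for iso-surface extraction
from stl import mesh             # numpy-stl: writes binary STL files

def dicom_series_to_stl(dicom_dir, out_path, level):
    """Toy DICOM -> STL conversion: stack the slices into a volume, extract an
    iso-surface at `level`, and save it as an STL mesh."""
    slices = [pydicom.dcmread(p) for p in glob.glob(dicom_dir + "/*.dcm")]
    slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))    # order along z
    volume = np.stack([s.pixel_array for s in slices], axis=-1).astype(float)
    verts, faces, _, _ = measure.marching_cubes(volume, level=level)
    surface = mesh.Mesh(np.zeros(faces.shape[0], dtype=mesh.Mesh.dtype))
    for i, f in enumerate(faces):
        surface.vectors[i] = verts[f]       # one triangle per face
    surface.save(out_path)

# e.g. dicom_series_to_stl("t1_dicom_slices", "brain.stl", level=300)  # hypothetical
```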

  14. Streamlined, Inexpensive 3D Printing of the Brain and Skull.

    Directory of Open Access Journals (Sweden)

    Jason S Naftulin

    Full Text Available Neuroimaging technologies such as Magnetic Resonance Imaging (MRI) and Computed Tomography (CT) collect three-dimensional data (3D) that is typically viewed on two-dimensional (2D) screens. Actual 3D models, however, allow interaction with real objects such as implantable electrode grids, potentially improving patient specific neurosurgical planning and personalized clinical education. Desktop 3D printers can now produce relatively inexpensive, good quality prints. We describe our process for reliably generating life-sized 3D brain prints from MRIs and 3D skull prints from CTs. We have integrated a standardized, primarily open-source process for 3D printing brains and skulls. We describe how to convert clinical neuroimaging Digital Imaging and Communications in Medicine (DICOM) images to stereolithography (STL) files, a common 3D object file format that can be sent to 3D printing services. We additionally share how to convert these STL files to machine instruction gcode files, for reliable in-house printing on desktop, open-source 3D printers. We have successfully printed over 19 patient brain hemispheres from 7 patients on two different open-source desktop 3D printers. Each brain hemisphere costs approximately $3-4 in consumable plastic filament as described, and the total process takes 14-17 hours, almost all of which is unsupervised (preprocessing = 4-6 hr; printing = 9-11 hr, post-processing = <30 min). Printing a matching portion of a skull costs $1-5 in consumable plastic filament and takes less than 14 hr, in total. We have developed a streamlined, cost-effective process for 3D printing brain and skull models. We surveyed healthcare providers and patients who confirmed that rapid-prototype patient specific 3D models may help interdisciplinary surgical planning and patient education. The methods we describe can be applied for other clinical, research, and educational purposes.

  15. Streamlined, Inexpensive 3D Printing of the Brain and Skull

    Science.gov (United States)

    Cash, Sydney S.

    2015-01-01

    Neuroimaging technologies such as Magnetic Resonance Imaging (MRI) and Computed Tomography (CT) collect three-dimensional data (3D) that is typically viewed on two-dimensional (2D) screens. Actual 3D models, however, allow interaction with real objects such as implantable electrode grids, potentially improving patient specific neurosurgical planning and personalized clinical education. Desktop 3D printers can now produce relatively inexpensive, good quality prints. We describe our process for reliably generating life-sized 3D brain prints from MRIs and 3D skull prints from CTs. We have integrated a standardized, primarily open-source process for 3D printing brains and skulls. We describe how to convert clinical neuroimaging Digital Imaging and Communications in Medicine (DICOM) images to stereolithography (STL) files, a common 3D object file format that can be sent to 3D printing services. We additionally share how to convert these STL files to machine instruction gcode files, for reliable in-house printing on desktop, open-source 3D printers. We have successfully printed over 19 patient brain hemispheres from 7 patients on two different open-source desktop 3D printers. Each brain hemisphere costs approximately $3–4 in consumable plastic filament as described, and the total process takes 14–17 hours, almost all of which is unsupervised (preprocessing = 4–6 hr; printing = 9–11 hr, post-processing = <30 min). Printing a matching portion of a skull costs $1–5 in consumable plastic filament and takes less than 14 hr, in total. We have developed a streamlined, cost-effective process for 3D printing brain and skull models. We surveyed healthcare providers and patients who confirmed that rapid-prototype patient specific 3D models may help interdisciplinary surgical planning and patient education. The methods we describe can be applied for other clinical, research, and educational purposes. PMID:26295459

  16. Assessing Contractor Capabilities for Streamlined Site Investigations

    Science.gov (United States)

    The purpose of this document is to familiarize brownfields decision makers with innovative methods for characterizing their sites, to encourage them to investigate and employ those methods, and to assist them in assessing contractors' capabilities.

  17. Streamlining and automation of radioanalytical methods at a commercial laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Harvey, J.T.; Dillard, J.W. [IT Corp., Knoxville, TN (United States)

    1993-12-31

    Through the careful planning and design of laboratory facilities and incorporation of modern instrumentation and robotics systems, properly trained and competent laboratory associates can efficiently and safely handle radioactive and mixed waste samples. This paper addresses the potential improvements radiochemistry and mixed waste laboratories can achieve utilizing robotics for automated sample analysis. Several examples of automated systems for sample preparation and analysis will be discussed.

  18. Streamlining and automation of radioanalytical methods at a commercial laboratory

    International Nuclear Information System (INIS)

    Harvey, J.T.; Dillard, J.W.

    1993-01-01

    Through the careful planning and design of laboratory facilities and incorporation of modern instrumentation and robotics systems, properly trained and competent laboratory associates can efficiently and safely handle radioactive and mixed waste samples. This paper addresses the potential improvements radiochemistry and mixed waste laboratories can achieve utilizing robotics for automated sample analysis. Several examples of automated systems for sample preparation and analysis will be discussed

  19. Strategy on review method for JENDL High Energy File

    Energy Technology Data Exchange (ETDEWEB)

    Yamano, Naoki [Sumitomo Atomic Energy Industries Ltd., Tokyo (Japan)

    1998-11-01

    The status of and issues with review methods for the High Energy File of the Japanese Evaluated Nuclear Data Library (JENDL-HE File) are described. Measurements of differential and integral data relevant to the review work for the JENDL-HE File have been examined from the viewpoint of data quality and applicability. In order to carry out the work effectively, a strategy for developing a standard review method is discussed, together with the tools needed in the review scheme. (author)

  20. The role of streamline curvature in sand dune dynamics: evidence from field and wind tunnel measurements

    Science.gov (United States)

    Wiggs, Giles F. S.; Livingstone, Ian; Warren, Andrew

    1996-09-01

    Field measurements on an unvegetated, 10 m high barchan dune in Oman are compared with measurements over a 1:200 scale fixed model in a wind tunnel. Both the field and wind tunnel data demonstrate similar patterns of wind and shear velocity over the dune, confirming significant flow deceleration upwind of and at the toe of the dune, acceleration of flow up the windward slope, and deceleration between the crest and brink. This pattern, including the widely reported upwind reduction in shear velocity, reflects observations of previous studies. Such a reduction in shear velocity upwind of the dune should result in a reduction in sand transport and subsequent sand deposition. This is not observed in the field. Wind tunnel modelling using a near-surface pulse-wire probe suggests that the field method of shear velocity derivation is inadequate. The wind tunnel results exhibit no reduction in shear velocity upwind of or at the toe of the dune. Evidence provided by Reynolds stress profiles and turbulence intensities measured in the wind tunnel suggest that this maintenance of upwind shear stress may be a result of concave (unstable) streamline curvature. These additional surface stresses are not recorded by the techniques used in the field measurements. Using the occurrence of streamline curvature as a starting point, a new 2-D model of dune dynamics is deduced. This model relies on the establishment of an equilibrium between windward slope morphology, surface stresses induced by streamline curvature, and streamwise acceleration. Adopting the criteria that concave streamline curvature and streamwise acceleration both increase surface shear stress, whereas convex streamline curvature and deceleration have the opposite effect, the relationships between form and process are investigated in each of three morphologically distinct zones: the upwind interdune and concave toe region of the dune, the convex portion of the windward slope, and the crest-brink region. The
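
    For context on the field derivation of shear velocity questioned above, u* is typically obtained by fitting the logarithmic wind profile u(z) = (u*/κ) ln(z/z0) to anemometer measurements at several heights; the following is a minimal sketch of that standard fit, with hypothetical heights and speeds:

```python
import numpy as np

KAPPA = 0.4   # von Karman constant

def shear_velocity_loglaw(z, u):
    """Fit u(z) = (u*/kappa) * ln(z / z0) to a measured profile and return
    the friction (shear) velocity u* and the roughness length z0."""
    slope, intercept = np.polyfit(np.log(z), u, 1)
    u_star = KAPPA * slope
    z0 = np.exp(-intercept / slope)
    return u_star, z0

# hypothetical anemometer heights (m) and mean wind speeds (m/s)
z = np.array([0.25, 0.5, 1.0, 2.0, 4.0])
u = np.array([5.1, 5.9, 6.6, 7.4, 8.1])
print(shear_velocity_loglaw(z, u))
```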

  1. Interior-Point Methods for Linear Programming: A Review

    Science.gov (United States)

    Singh, J. N.; Singh, D.

    2002-01-01

    The paper reviews some recent advances in interior-point methods for linear programming and indicates directions in which future progress can be made. Most of the interior-point methods belong to any of three categories: affine-scaling methods, potential reduction methods and central path methods. These methods are discussed together with…
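
    As a concrete illustration of the affine-scaling family mentioned in the review, the sketch below implements a plain primal affine-scaling iteration for a standard-form linear program min c.x subject to Ax = b, x >= 0; it is illustrative only (no presolve and only a crude stopping rule), and the toy problem at the end is invented:

```python
import numpy as np

def affine_scaling_lp(A, b, c, x0, alpha=0.5, tol=1e-8, max_iter=500):
    """Primal affine-scaling iteration for min c.x s.t. Ax = b, x >= 0.
    x0 must be strictly feasible (A @ x0 == b, x0 > 0). Illustrative only."""
    x = x0.astype(float)
    assert np.allclose(A @ x, b) and np.all(x > 0), "x0 must be strictly feasible"
    for _ in range(max_iter):
        D2 = np.diag(x ** 2)                           # scaling by current iterate
        w = np.linalg.solve(A @ D2 @ A.T, A @ D2 @ c)  # dual estimate
        r = c - A.T @ w                                # reduced costs
        dx = -D2 @ r                                   # projected descent direction
        if np.linalg.norm(dx) < tol:
            break
        neg = dx < 0
        if not neg.any():
            raise ValueError("problem appears unbounded")
        step = alpha * np.min(x[neg] / -dx[neg])       # stay strictly inside x > 0
        x = x + step * dx
    return x

# toy problem: min -x1 - 2*x2  s.t.  x1 + x2 + s = 4,  x1, x2, s >= 0
A = np.array([[1.0, 1.0, 1.0]])
b = np.array([4.0])
c = np.array([-1.0, -2.0, 0.0])
print(affine_scaling_lp(A, b, c, x0=np.array([1.0, 1.0, 2.0])))  # tends toward (0, 4, 0)
```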

  2. Gust loading on streamlined bridge decks

    DEFF Research Database (Denmark)

    Larose, Guy; Mann, Jakob

    1998-01-01

    The current analytical description of the buffeting action of wind on long-span bridges is based on the strip assumption. However, recent experiments on closed-box girder bridge decks have shown that this assumption is not valid and is the source of an important part of the error margin of the analytical prediction methods. In this paper, an analytical model that departs from the strip assumption is used to describe the gust loading on a thin airfoil. A parallel is drawn between the analytical model and direct measurements of gust loading on motionless closed-box girder bridge decks. Empirical...

  3. Joint statistics and conditional mean strain rates of streamline segments

    International Nuclear Information System (INIS)

    Schaefer, P; Gampert, M; Peters, N

    2013-01-01

    Based on four different direct numerical simulations of turbulent flows with Taylor-based Reynolds numbers ranging from Re_λ = 50 to 300, among which are two homogeneous isotropic decaying, one forced and one homogeneous shear flow, streamlines are identified and the obtained space curves are parameterized with the pseudo-time as well as the arclength. Based on local extrema of the absolute value of the velocity along the streamlines, the latter are partitioned into segments following Wang (2010 J. Fluid Mech. 648 183–203). Streamline segments are then statistically analyzed based on both parameterizations using the joint probability density function of the pseudo-time lag τ (arclength l, respectively) between the extrema and of the velocity difference Δu at the extrema: P(τ,Δu) (P(l,Δu), respectively). We distinguish positive and negative streamline segments depending on the sign of the velocity difference Δu. Differences as well as similarities in the statistical description for both parameterizations are discussed. In particular, it turns out that the normalized probability distribution functions (pdfs) (of both parameterizations) of the length of positive, negative and all segments assume a universal shape for all Reynolds numbers and flow types and are well described by a model derived in Schaefer P et al (2012 Phys. Fluids 24 045104). Particular attention is given to the conditional mean velocity difference at the ending points of the segments, which can be understood as a first-order structure function in the context of streamline segment analysis. It determines to a large extent the stretching (compression) of positive (negative) streamline segments and corresponds to the convective velocity in phase space in the transport model equation for the pdf. While based on the random sweeping hypothesis a scaling ∝ (u_rms ε τ)^(1/3) is found for the parameterization based on the pseudo-time, the parameterization with the arclength l yields a much larger than expected l^(1/3) scaling. A

  4. Review of geophysical characterization methods used at the Hanford Site

    Energy Technology Data Exchange (ETDEWEB)

    GV Last; DG Horton

    2000-03-23

    This paper presents a review of geophysical methods used at Hanford in two parts: (1) shallow surface-based geophysical methods and (2) borehole geophysical methods. This review was not intended to be "all encompassing" but should represent the vast majority (>90% complete) of geophysical work conducted onsite and aimed at hazardous waste investigations in the vadose zone and/or uppermost groundwater aquifers. This review did not cover geophysical methods aimed at large-scale geologic structures or seismicity and, in particular, did not include those efforts conducted in support of the Basalt Waste Isolation Program. This review focused primarily on the more recent efforts.

  5. Review of geophysical characterization methods used at the Hanford Site

    International Nuclear Information System (INIS)

    GV Last; DG Horton

    2000-01-01

    This paper presents a review of geophysical methods used at Hanford in two parts: (1) shallow surface-based geophysical methods and (2) borehole geophysical methods. This review was not intended to be "all encompassing" but should represent the vast majority (>90% complete) of geophysical work conducted onsite and aimed at hazardous waste investigations in the vadose zone and/or uppermost groundwater aquifers. This review did not cover geophysical methods aimed at large-scale geologic structures or seismicity and, in particular, did not include those efforts conducted in support of the Basalt Waste Isolation Program. This review focused primarily on the more recent efforts.

  6. Streamlined Darwin simulation of nonneutral plasmas

    International Nuclear Information System (INIS)

    Hewett, D.W.; Boyd, J.K.

    1987-01-01

    Efficient, new algorithms that require less formal manipulation than previous implementations have been formulated for the numerical solution of the Darwin model. These new procedures reduce the effort required to achieve some of the advantages that the Darwin model offers. Because the Courant-Friedrichs-Lewy stability limit for radiation modes is eliminated, the Darwin model has the advantage of a substantially larger time-step. Further, without radiation modes, simulation results are less sensitive to enhanced particle fluctuation noise. We discuss methods for calculating the magnetic field that avoid formal vector decomposition and offer a new procedure for finding the inductive electric field. This procedure avoids vector decomposition of plasma source terms and circumvents some source gradient issues that slow convergence. As a consequence, the numerical effort required for each of the field time-steps is reduced, and more importantly, the need to specify several nonintuitive boundary conditions is eliminated. copyright 1987 Academic Press, Inc
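
    For reference, the Darwin approximation referred to above is usually written by splitting the electric field into an irrotational (longitudinal) and a solenoidal (transverse) part, E = E_L + E_T, and dropping the transverse displacement current; in standard textbook form (not the specific streamlined algorithms of this paper) the field equations read:

```latex
\begin{aligned}
\nabla \cdot \mathbf{E}_L &= \rho/\varepsilon_0, &\qquad \nabla \times \mathbf{E}_L &= 0,\\
\nabla \times \mathbf{E}_T &= -\,\partial \mathbf{B}/\partial t, &\qquad \nabla \cdot \mathbf{E}_T &= 0,\\
\nabla \times \mathbf{B} &= \mu_0 \mathbf{J} + \mu_0 \varepsilon_0\,\partial \mathbf{E}_L/\partial t, &\qquad \nabla \cdot \mathbf{B} &= 0 .
\end{aligned}
```

    Because the transverse displacement current is omitted, the model supports no freely propagating light waves, which is why the radiation-mode Courant limit disappears.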

  7. Ethnographic Methods in Academic Libraries: A Review

    Science.gov (United States)

    Ramsden, Bryony

    2016-01-01

    Research in academic libraries has recently seen an increase in the use of ethnographic-based methods to collect data. Primarily used to learn about library users and their interaction with spaces and resources, the methods are proving particularly useful to academic libraries. The data ethnographic methods retrieve is rich, context specific, and…

  8. A Method for Improving the Integrity of Peer Review.

    Science.gov (United States)

    Dadkhah, Mehdi; Kahani, Mohsen; Borchardt, Glenn

    2017-08-15

    Peer review is the most important aspect of reputable journals. Without it, we would be unsure about whether the material published was as valid and reliable as is possible. However, with the advent of the Internet, scientific literature has now become subject to a relatively new phenomenon: fake peer reviews. Some dishonest researchers have been manipulating the peer review process to publish what are often inferior papers. There are even papers that explain how to do it. This paper discusses one of those methods and how editors can defeat it by using a special review ID. This method is easy to understand and can be added to current peer review systems easily.
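
    The abstract does not spell out how the special review ID is constructed, so the following is only one plausible, hypothetical realization: the editor issues an unguessable single-use token tied to the invited reviewer's verified address, and any report quoting an unknown or already-used token is rejected:

```python
import secrets

def make_review_id(manuscript_id: str) -> str:
    """Generate an unguessable, single-use reviewer identifier that the editorial
    office stores alongside the invited reviewer's verified e-mail address."""
    return f"{manuscript_id}-{secrets.token_urlsafe(16)}"

# the reviewer must quote this ID when submitting a report; a report arriving
# with an unknown or reused ID is rejected before it reaches the editor
print(make_review_id("JCS-2017-0415"))  # hypothetical manuscript number
```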

  9. Streamline topology: Patterns in fluid flows and their bifurcations

    DEFF Research Database (Denmark)

    Brøns, Morten

    2007-01-01

    Using dynamical systems theory, we consider structures such as vortices and separation in the streamline patterns of fluid flows. Bifurcation of patterns under variation of external parameters is studied using simplifying normal form transformations. Flows away from boundaries, flows close to fixed walls, and axisymmetric flows are analyzed in detail. We show how to apply the ideas from the theory to analyze numerical simulations of the vortex breakdown in a closed cylindrical container.

  10. Damage Detection with Streamlined Structural Health Monitoring Data

    OpenAIRE

    Li, Jian; Deng, Jun; Xie, Weizhi

    2015-01-01

    The huge amounts of sensor data generated by large-scale sensor networks in on-line structural health monitoring (SHM) systems often overwhelm the systems' capacity for data transmission and analysis. This paper presents a new concept for an integrated SHM system in which a streamlined data flow is used as a unifying thread to integrate the individual components of on-line SHM systems. Such an integrated SHM system has a few desirable functionalities including embedded sensor data compressio...

  11. Zephyr: A secure Internet process to streamline engineering

    Energy Technology Data Exchange (ETDEWEB)

    Jordan, C.W.; Niven, W.A.; Cavitt, R.E. [and others

    1998-05-12

    Lawrence Livermore National Laboratory (LLNL) is implementing an Internet-based process pilot called 'Zephyr' to streamline engineering and commerce using the Internet. Major benefits have accrued by using Zephyr in facilitating industrial collaboration, speeding the engineering development cycle, reducing procurement time, and lowering overall costs. Programs at LLNL are capitalizing on the efficiencies introduced since implementing Zephyr. Zephyr's pilot functionality is undergoing full integration with Business Systems, Finance, and Vendors to support major programs at the Laboratory.

  12. Dividing Streamline Formation Channel Confluences by Physical Modeling

    Directory of Open Access Journals (Sweden)

    Minarni Nur Trilita

    2010-02-01

    Full Text Available Channel confluences are often found in open channel network systems and are their most important element. The flow entering the main channel from a branch channel takes various forms and causes vortex flow. This phenomenon can cause erosion of the channel side walls, scour of the channel bed, and sedimentation in the channel downstream of the confluence. Controlling these problems requires research into the width of the incoming flow from the branch channel. The flow entering the main channel from the branch channel is bounded by a dividing streamline. In this paper, the width of the dividing streamline was observed in the laboratory using a physical model of two open channels of square section joined at an angle of 30º. Observations were made for a variety of flows coming from each channel. The laboratory observations show that the width of the dividing streamline is influenced by the discharge ratio between the branch channel and the main channel, while a comparison with previous studies shows that the laboratory observations are smaller than the results of previous research.

  13. Identification and authentication. Common biometric methods review

    OpenAIRE

    Lysak, A.

    2012-01-01

    Major biometric methods used for identification and authentication purposes in modern computing systems are considered in the article. Basic classification, application areas and key differences are given.

  14. A review on automated pavement distress detection methods

    NARCIS (Netherlands)

    Coenen, Tom B.J.; Golroo, Amir

    2017-01-01

    In recent years, extensive research has been conducted on pavement distress detection. A large part of these studies applied automated methods to capture different distresses. In this paper, a literature review on the distresses and related detection methods is presented. This review also includes

  15. Thresholding methods for PET imaging: A review

    International Nuclear Information System (INIS)

    Dewalle-Vignion, A.S.; Betrouni, N.; Huglo, D.; Vermandel, M.; Dewalle-Vignion, A.S.; Hossein-Foucher, C.; Huglo, D.; Vermandel, M.; Dewalle-Vignion, A.S.; Hossein-Foucher, C.; Huglo, D.; Vermandel, M.; El Abiad, A.

    2010-01-01

    This work deals with positron emission tomography segmentation methods for tumor volume determination. We present a state of the art of techniques based on fixed or adaptive thresholds. Methods found in the literature are analysed objectively with regard to their methodology, advantages and limitations. Finally, a comparative study is presented. (authors)
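
    As an illustration of the fixed- and adaptive-threshold families covered by the review, the sketch below segments an uptake map at a fixed fraction of SUVmax and, alternatively, at a level that shifts with the background uptake; the coefficient values and the synthetic data are placeholders, not recommendations:

```python
import numpy as np

def fixed_threshold_mask(suv, fraction=0.40):
    """Fixed-threshold delineation: keep voxels above a set fraction of SUVmax."""
    return suv >= fraction * suv.max()

def adaptive_threshold_mask(suv, background, a=0.5, b=1.0):
    """Toy adaptive variant: the cut-off shifts with the measured background
    uptake (coefficients a and b are placeholders, not validated values)."""
    return suv >= a * suv.max() + b * background

# synthetic uptake map standing in for a reconstructed PET volume
rng = np.random.default_rng(0)
suv = rng.gamma(shape=2.0, scale=1.0, size=(64, 64, 32))
tumour = fixed_threshold_mask(suv, fraction=0.40)
```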

  16. Review on Finite Element Method * ERHUNMWUN, ID ...

    African Journals Online (AJOL)

    ADOWIE PERE

    ABSTRACT: In this work, we have discussed what Finite Element Method (FEM) is, its historical development, advantages and ... residual procedures, are examples of the direct approach ... The paper centred on the "stiffness and deflection of ...

  17. Review of Upscaling Methods for Describing Unsaturated Flow

    Energy Technology Data Exchange (ETDEWEB)

    Wood, Brian D.

    2000-09-26

    Representing small-scale features can be a challenge when one wants to model unsaturated flow in large domains. In this report, the various upscaling techniques are reviewed. The following upscaling methods have been identified from the literature: stochastic methods, renormalization methods, volume averaging and homogenization methods. In addition, a final technique, full resolution numerical modeling, is also discussed.
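
    For reference, the volume-averaging approach listed above rests on averaging a microscopic quantity ψ of a phase γ over a representative volume V (standard definitions, not specific to this report):

```latex
\langle \psi_\gamma \rangle = \frac{1}{V}\int_{V_\gamma} \psi_\gamma \, dV ,
\qquad
\langle \psi_\gamma \rangle^{\gamma} = \frac{1}{V_\gamma}\int_{V_\gamma} \psi_\gamma \, dV ,
```

    where V_γ is the portion of V occupied by the γ phase; the first expression is the superficial average and the second the intrinsic phase average.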

  18. Speech emotion recognition methods: A literature review

    Science.gov (United States)

    Basharirad, Babak; Moradhaseli, Mohammadreza

    2017-10-01

    Recently, attention to emotional speech signal research for human-machine interfaces has grown due to the availability of high computational capability. Many systems have been proposed in the literature to identify emotional states through speech. Selection of suitable feature sets, design of proper classification methods, and preparation of appropriate datasets are the main issues in speech emotion recognition systems. This paper critically analyzes the currently available approaches to speech emotion recognition with respect to these three parameters (feature set, classification of features, and accuracy of use). In addition, this paper also evaluates the performance and limitations of available methods. Furthermore, it highlights the current promising directions for improvement of speech emotion recognition systems.
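
    A minimal sketch of the feature-set/classifier pipeline discussed above, assuming the librosa and scikit-learn libraries and hypothetical wav files and labels (MFCC statistics plus an SVM are one common baseline, not the only choice):

```python
import numpy as np
import librosa                         # assumed available for audio feature extraction
from sklearn.svm import SVC

def mfcc_features(wav_path, n_mfcc=13):
    """Summarise an utterance by the mean and standard deviation of its MFCCs,
    a common (though not the only) feature set in speech emotion recognition."""
    y, sr = librosa.load(wav_path, sr=None)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)   # (n_mfcc, n_frames)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

# hypothetical training step; wav_paths and emotion_labels would come from a corpus
# X = np.vstack([mfcc_features(p) for p in wav_paths])
# clf = SVC(kernel="rbf").fit(X, emotion_labels)
```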

  19. DEFLUORIDATION METHODS - A REVIEW Belay Woldeyes ...

    African Journals Online (AJOL)

    use of renewable solar energy should be evaluated in order to test the applicability of this method. Ion-exchange .... volcanic ash influence are abundant in Kenya and other countries along the Rift Valley [27]. Further .... adsorption include thermal treatment and acid treatment. Regarding low permeability, development of.

  20. MRI Brain Tumor Segmentation Methods- A Review

    OpenAIRE

    Gursangeet, Kaur; Jyoti, Rani

    2016-01-01

    Medical image processing and its segmentation is an active and interesting area for researchers. It has reached a prominent place in the diagnosis of tumors since the advent of CT and MRI. MRI is a useful tool for detecting brain tumors, and segmentation is performed to extract the useful portion of an image. The purpose of this paper is to provide an overview of different image segmentation methods like the watershed algorithm, morphological operations, neutrosophic sets, thresholding, K-...
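
    The thresholding-plus-morphology family mentioned above can be sketched in a few lines with scikit-image; this is a rough, illustrative pipeline (Otsu threshold, opening, largest connected component) and not a validated tumour segmenter:

```python
import numpy as np
from skimage import filters, measure, morphology

def rough_tumour_mask(slice_img):
    """Illustrative thresholding + morphology pipeline: Otsu threshold,
    opening to remove speckle, then keep the largest connected region."""
    mask = slice_img > filters.threshold_otsu(slice_img)
    mask = morphology.binary_opening(mask, morphology.disk(2))
    labels = measure.label(mask)
    if labels.max() == 0:
        return mask                                   # nothing segmented
    largest = 1 + np.argmax(np.bincount(labels.ravel())[1:])
    return labels == largest

# e.g. mask = rough_tumour_mask(mri_slice)  # mri_slice: hypothetical 2-D numpy array
```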

  1. BOOK REVIEW: Vortex Methods: Theory and Practice

    Science.gov (United States)

    Cottet, G.-H.; Koumoutsakos, P. D.

    2001-03-01

    The book Vortex Methods: Theory and Practice presents a comprehensive account of the numerical technique for solving fluid flow problems. It provides a very nice balance between the theoretical development and analysis of the various techniques and their practical implementation. In fact, the presentation of the rigorous mathematical analysis of these methods instills confidence in their implementation. The book goes into some detail on the more recent developments that attempt to account for viscous effects, in particular the presence of viscous boundary layers in some flows of interest. The presentation is very readable, with most points illustrated with well-chosen examples, some quite sophisticated. It is a very worthy reference book that should appeal to a large body of readers, from those interested in the mathematical analysis of the methods to practitioners of computational fluid dynamics. The use of the book as a text is compromised by its lack of exercises for students, but it could form the basis of a graduate special topics course. Juan Lopez

  2. Computed tomography shielding methods: a literature review.

    Science.gov (United States)

    Curtis, Jessica Ryann

    2010-01-01

    To investigate available shielding methods in an effort to further awareness and understanding of existing preventive measures related to patient exposure in computed tomography (CT) scanning. Searches were conducted to locate literature discussing the effectiveness of commercially available shields. Literature containing information regarding breast, gonad, eye and thyroid shielding was identified. Because of rapidly advancing technology, the selection of articles was limited to those published within the past 5 years. The selected studies were examined using the following topics as guidelines: the effectiveness of the shield (percentage of dose reduction), the shield's effect on image quality, arguments for or against its use (including practicality) and overall recommendation for its use in clinical practice. Only a limited number of studies have been performed on the use of shields for the eyes, thyroid and gonads, but the evidence shows an overall benefit to their use. Breast shielding has been the most studied shielding method, with consistent agreement throughout the literature on its effectiveness at reducing radiation dose. The effect of shielding on image quality was not remarkable in a majority of studies. Although it is noted that more studies need to be conducted regarding the impact on image quality, the currently published literature stresses the importance of shielding in reducing dose. Commercially available shields for the breast, thyroid, eyes and gonads should be implemented in clinical practice. Further research is needed to ascertain the prevalence of shielding in the clinical setting.

  3. Streamlining digital signal processing a tricks of the trade guidebook

    CERN Document Server

    2012-01-01

    Streamlining Digital Signal Processing, Second Edition, presents recent advances in DSP that simplify or increase the computational speed of common signal processing operations and provides practical, real-world tips and tricks not covered in conventional DSP textbooks. It offers new implementations of digital filter design, spectrum analysis, signal generation, high-speed function approximation, and various other DSP functions. It provides: great tips, tricks of the trade, secrets, practical shortcuts, and clever engineering solutions from seasoned signal processing professionals; an assortment.

  4. A streamlined ribosome profiling protocol for the characterization of microorganisms

    DEFF Research Database (Denmark)

    Latif, Haythem; Szubin, Richard; Tan, Justin

    2015-01-01

    Ribosome profiling is a powerful tool for characterizing in vivo protein translation at the genome scale, with multiple applications ranging from detailed molecular mechanisms to systems-level predictive modeling. Though highly effective, this intricate technique has yet to become widely used in the microbial research community. Here we present a streamlined ribosome profiling protocol with reduced barriers to entry for microbial characterization studies. Our approach provides simplified alternatives during harvest, lysis, and recovery of monosomes and also eliminates several time-consuming steps...

  5. Streamlined library programming how to improve services and cut costs

    CERN Document Server

    Porter-Reynolds, Daisy

    2014-01-01

    In their roles as community centers, public libraries offer many innovative and appealing programs; but under current budget cuts, library resources are stretched thin. With slashed budgets and limited staff hours, what can libraries do to best serve their publics? This how-to guide provides strategies for streamlining library programming in public libraries while simultaneously maintaining-or even improving-quality delivery. The wide variety of principles and techniques described can be applied on a selective basis to libraries of all sizes. Based upon the author's own extensive experience as

  6. Topology of streamlines and vorticity contours for two - dimensional flows

    DEFF Research Database (Denmark)

    Andersen, Morten

    on the vortex filament by the localised induction approximation the stream function is slightly modified and an extra parameter is introduced. In this setting two new flow topologies arise, but not more than two critical points occur for any combination of the parameters. The analysis of the closed form show...... by a point vortex above a wall in inviscid fluid. There is no reason to a priori expect equivalent results of the three vortex definitions. However, the study is mainly motivated by the findings of Kudela & Malecha (Fluid Dyn. Res. 41, 2009) who find good agreement between the vorticity and streamlines...

  7. Review of PCMS and heat transfer enhancement methods applied ...

    African Journals Online (AJOL)

    Most available PCMs have low thermal conductivity making heat transfer enhancement necessary for power applications. The various methods of heat transfer enhancement in latent heat storage systems were also reviewed systematically. The review showed that three commercially - available PCMs are suitable in the ...

  8. A systematic review and appraisal of methods of developing and ...

    African Journals Online (AJOL)

    ) risk factors questionnaires is the key to obtaining accurate information to enable planning of CVD prevention program which is a necessity in developing countries. We conducted this review to assess methods and processes used for ...

  9. A systematic review and appraisal of methods of developing and ...

    African Journals Online (AJOL)

    authors and deductive reasoning of authors. For validation, methods .... length of the existing questionnaire thereby making it ... decisions were made by one reviewer, with reference to ..... Inductive reasoning by authors; Inductive reasoning.

  10. Robust Preconditioning Estimates for Convection-Dominated Elliptic Problems via a Streamline Poincaré--Friedrichs Inequality

    Czech Academy of Sciences Publication Activity Database

    Axelsson, Owe; Karátson, J.; Kovács, B.

    2014-01-01

    Roč. 52, č. 6 (2014), s. 2957-2976 ISSN 0036-1429 R&D Projects: GA MŠk ED1.1.00/02.0070 Institutional support: RVO:68145535 Keywords : streamline diffusion finite element method * solving convection-dominated elliptic problems * convergence is robust Subject RIV: BA - General Mathematics Impact factor: 1.788, year: 2014 http://epubs.siam.org/doi/abs/10.1137/130940268

  11. Complementary arsenic speciation methods: A review

    Energy Technology Data Exchange (ETDEWEB)

    Nearing, Michelle M., E-mail: michelle.nearing@rmc.ca; Koch, Iris, E-mail: koch-i@rmc.ca; Reimer, Kenneth J., E-mail: reimer-k@rmc.ca

    2014-09-01

    The toxicity of arsenic greatly depends on its chemical form and oxidation state (speciation) and therefore accurate determination of arsenic speciation is a crucial step in understanding its chemistry and potential risk. High performance liquid chromatography with inductively coupled mass spectrometry (HPLC–ICP-MS) is the most common analysis used for arsenic speciation but it has two major limitations: it relies on an extraction step (usually from a solid sample) that can be incomplete or alter the arsenic compounds; and it provides no structural information, relying on matching sample peaks to standard peaks. The use of additional analytical methods in a complementary manner introduces the ability to address these disadvantages. The use of X-ray absorption spectroscopy (XAS) with HPLC–ICP-MS can be used to identify compounds not extracted for HPLC–ICP-MS and provide minimal processing steps for solid state analysis that may help preserve labile compounds such as those containing arsenic-sulfur bonds, which can degrade under chromatographic conditions. On the other hand, HPLC–ICP-MS is essential in confirming organoarsenic compounds with similar white line energies seen by using XAS, and identifying trace arsenic compounds that are too low to be detected by XAS. The complementary use of electrospray mass spectrometry (ESI–MS) with HPLC–ICP-MS provides confirmation of arsenic compounds identified during the HPLC–ICP-MS analysis, identification of unknown compounds observed during the HPLC–ICP-MS analysis and further resolves HPLC–ICP-MS by identifying co-eluting compounds. In the complementary use of HPLC–ICP-MS and ESI–MS, HPLC–ICP-MS helps to focus the ESI–MS selection of ions. Numerous studies have shown that the information obtained from HPLC–ICP-MS analysis can be greatly enhanced by complementary approaches. - Highlights: • HPLC–ICP-MS is the most common method used for arsenic speciation. • HPLC limitations include

  12. Complementary arsenic speciation methods: A review

    International Nuclear Information System (INIS)

    Nearing, Michelle M.; Koch, Iris; Reimer, Kenneth J.

    2014-01-01

    The toxicity of arsenic greatly depends on its chemical form and oxidation state (speciation) and therefore accurate determination of arsenic speciation is a crucial step in understanding its chemistry and potential risk. High performance liquid chromatography with inductively coupled mass spectrometry (HPLC–ICP-MS) is the most common analysis used for arsenic speciation but it has two major limitations: it relies on an extraction step (usually from a solid sample) that can be incomplete or alter the arsenic compounds; and it provides no structural information, relying on matching sample peaks to standard peaks. The use of additional analytical methods in a complementary manner introduces the ability to address these disadvantages. The use of X-ray absorption spectroscopy (XAS) with HPLC–ICP-MS can be used to identify compounds not extracted for HPLC–ICP-MS and provide minimal processing steps for solid state analysis that may help preserve labile compounds such as those containing arsenic-sulfur bonds, which can degrade under chromatographic conditions. On the other hand, HPLC–ICP-MS is essential in confirming organoarsenic compounds with similar white line energies seen by using XAS, and identifying trace arsenic compounds that are too low to be detected by XAS. The complementary use of electrospray mass spectrometry (ESI–MS) with HPLC–ICP-MS provides confirmation of arsenic compounds identified during the HPLC–ICP-MS analysis, identification of unknown compounds observed during the HPLC–ICP-MS analysis and further resolves HPLC–ICP-MS by identifying co-eluting compounds. In the complementary use of HPLC–ICP-MS and ESI–MS, HPLC–ICP-MS helps to focus the ESI–MS selection of ions. Numerous studies have shown that the information obtained from HPLC–ICP-MS analysis can be greatly enhanced by complementary approaches. - Highlights: • HPLC–ICP-MS is the most common method used for arsenic speciation. • HPLC limitations include

  13. Diagnostic Methods for Feline Coronavirus: A Review

    Directory of Open Access Journals (Sweden)

    Saeed Sharif

    2010-01-01

    Full Text Available Feline coronaviruses (FCoVs) are found throughout the world. Infection with FCoV can result in a diverse range of signs, from clinically inapparent infections to a highly fatal disease called feline infectious peritonitis (FIP). FIP is one of the most serious viral diseases of cats. While there is neither an effective vaccine, nor a curative treatment for FIP, a diagnostic protocol for FCoV would greatly assist in the management and control of the virus. Clinical findings in FIP are non-specific and not helpful in making a differential diagnosis. Haematological and biochemical abnormalities in FIP cases are also non-specific. The currently available serological tests have low specificity and sensitivity for detection of active infection and cross-react with FCoV strains of low pathogenicity, the feline enteric coronaviruses (FECV). Reverse transcriptase polymerase chain reaction (RT-PCR) has been used to detect FCoV and is rapid and sensitive, but results must be interpreted in the context of clinical findings. At present, a definitive diagnosis of FIP can be established only by histopathological examination of biopsies. This paper describes and compares diagnostic methods for FCoVs and includes a brief account of the virus biology, epidemiology, and pathogenesis.

  14. The Scrum agile method: A systematic literature review

    OpenAIRE

    Krajnik, Matevž

    2016-01-01

    Scrum is an agile method for software engineering, used by companies to develop products faster and more efficiently. Because the customer is more engaged in the process and the development is incremental and iterative, projects progress better and it is easier to implement any changes in functionality the customer might want. In this thesis, a review of the existing scientific literature on the Scrum method in software engineering has been made. The review brought forth answers to three p...

  15. A mixed-methods approach to systematic reviews.

    Science.gov (United States)

    Pearson, Alan; White, Heath; Bath-Hextall, Fiona; Salmond, Susan; Apostolo, Joao; Kirkpatrick, Pamela

    2015-09-01

    There are an increasing number of published single-method systematic reviews that focus on different types of evidence related to a particular topic. As policy makers and practitioners seek clear directions for decision-making from systematic reviews, it is likely that it will be increasingly difficult for them to identify 'what to do' if they are required to find and understand a plethora of syntheses related to a particular topic.Mixed-methods systematic reviews are designed to address this issue and have the potential to produce systematic reviews of direct relevance to policy makers and practitioners.On the basis of the recommendations of the Joanna Briggs Institute International Mixed Methods Reviews Methodology Group in 2012, the Institute adopted a segregated approach to mixed-methods synthesis as described by Sandelowski et al., which consists of separate syntheses of each component method of the review. Joanna Briggs Institute's mixed-methods synthesis of the findings of the separate syntheses uses a Bayesian approach to translate the findings of the initial quantitative synthesis into qualitative themes and pooling these with the findings of the initial qualitative synthesis.

  16. Study of streamline flow in the portal system

    International Nuclear Information System (INIS)

    Atkins, H.L.; Deitch, J.S.; Oster, Z.H.; Perkes, E.A.

    1985-01-01

    The study was undertaken to determine if streamline flow occurs in the portal vein, thus separating inflow from the superior mesenteric artery (SMA) and the inferior mesenteric artery. Previously published data on this subject is inconsistent. Patients undergoing abdominal angiography received two administrations of Tc-99m sulfur colloid, first via the SMA during angiography and, after completion of the angiographic procedure, via a peripheral vein (IV). Anterior images of the liver were recorded over a three minute acquisition before and after the IV injection without moving the patient. The image from the SMA injection was subtracted from the SMA and IV image to provide a pure IV image. Analysis of R to L ratios for selected regions of interest as well as whole lobes was carried out and the shift of R to L (SMA to IV) determined. Six patients had liver metastases from the colon, four had cirrhosis and four had no known liver disease. The shift in the ratio was highly variable without a consistent pattern. Large changes in some patients could be attributed to hepatic artery flow directed to metastases. No consistent evidence for streamlining of portal flow was discerned
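
    The subtraction and ratio analysis described above can be written in a couple of lines; the sketch below is a hypothetical illustration (array and mask names are invented) of recovering the IV-only image and comparing right-to-left ratios for the two administration routes:

```python
import numpy as np

def rl_shift(sma_img, sma_plus_iv_img, right_mask, left_mask):
    """Subtract the SMA-only acquisition from the combined SMA + IV acquisition
    to obtain a pure IV image, then compare right-to-left count ratios."""
    iv_img = sma_plus_iv_img - sma_img            # pure IV distribution
    def rl(img):
        return img[right_mask].sum() / img[left_mask].sum()
    return rl(sma_img), rl(iv_img)                # (R/L via SMA, R/L via IV)
```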

  17. Review of experimental methods for evaluating effective delayed neutron fraction

    Energy Technology Data Exchange (ETDEWEB)

    Yamane, Yoshihiro [Nagoya Univ. (Japan). School of Engineering

    1997-03-01

    The International Effective Delayed Neutron Fraction (β_eff) Benchmark Experiments have been carried out at the Fast Critical Assembly of the Japan Atomic Energy Research Institute since 1995. Researchers from six countries, namely France, Italy, Russia, U.S.A., Korea, and Japan, participate in this FCA project. Each team makes use of its own experimental method, such as the Frequency Method, Rossi-α Method, Nelson Number Method, Cf Neutron Source Method, and Covariance Method. In this report these experimental methods are reviewed. (author)

  18. Demystifying Mixed Methods Research Design: A Review of the Literature

    OpenAIRE

    Gail D. Caruth

    2013-01-01

    Mixed methods research evolved in response to the observed limitations of both quantitative and qualitative designs and is a more complex method. The purpose of this paper was to examine mixed methods research in an attempt to demystify the design thereby allowing those less familiar with its design an opportunity to utilize it in future research. A review of the literature revealed that it has been gaining acceptance among researchers, researchers have begun using mixed methods research, it ...

  19. Stakeholder involvement in systematic reviews: a protocol for a systematic review of methods, outcomes and effects.

    Science.gov (United States)

    Pollock, Alex; Campbell, Pauline; Struthers, Caroline; Synnot, Anneliese; Nunn, Jack; Hill, Sophie; Goodare, Heather; Watts, Chris; Morley, Richard

    2017-01-01

    Researchers are expected to actively involve stakeholders (including patients, the public, health professionals, and others) in their research. Although researchers increasingly recognise that this is good practice, there is limited practical guidance about how to involve stakeholders. Systematic reviews are a research method in which international literature is brought together, using carefully designed and rigorous methods to answer a specified question about healthcare. We want to investigate how researchers have involved stakeholders in systematic reviews, and how involvement has potentially affected the quality and impact of reviews. We plan to bring this information together by searching and reviewing the literature for reports of stakeholder involvement in systematic reviews. This paper describes in detail the methods that we plan to use to do this. After carrying out comprehensive searches for literature, we will: 1. Provide an overview of identified reports, describing key information such as types of stakeholders involved, and how. 2. Pick out reports of involvement which include detailed descriptions of how researchers involved people in a systematic review and summarise the methods they used. We will consider who was involved, how people were recruited, and how the involvement was organised and managed. 3. Bring together any reports which have explored the effect, or impact, of involving stakeholders in a systematic review. We will assess the quality of these reports, and summarise their findings. Once completed, our review will be used to produce training resources aimed at helping researchers to improve ways of involving stakeholders in systematic reviews. Background There is an expectation for stakeholders (including patients, the public, health professionals, and others) to be involved in research. Researchers are increasingly recognising that it is good practice to involve stakeholders in systematic reviews. There is currently a lack of evidence

  20. How to measure comorbidity. a critical review of available methods.

    NARCIS (Netherlands)

    de Groot, V.; Beckerman, H.; Lankhorst, G.J.; Bouter, L.M.

    2003-01-01

    The object of this article was to systematically review available methods to measure comorbidity and to assess their validity and reliability. A search was made in Medline and Embase, with the keywords comorbidity and multi-morbidity, to identify articles in which a method to measure comorbidity was

  1. How to measure comorbidity. A critical review of available methods

    NARCIS (Netherlands)

    de Groot, V; Beckerman, H; Lankhorst, G J; Bouter, L M

    2003-01-01

    The object of this article was to systematically review available methods to measure comorbidity and to assess their validity and reliability. A search was made in Medline and Embase, with the keywords comorbidity and multi-morbidity, to identify articles in which a method to measure comorbidity was

  2. Review of analysis methods for prestressed concrete reactor vessels

    International Nuclear Information System (INIS)

    Dodge, W.G.; Bazant, Z.P.; Gallagher, R.H.

    1977-02-01

    Theoretical and practical aspects of analytical models and numerical procedures for detailed analysis of prestressed concrete reactor vessels are reviewed. Constitutive models and numerical algorithms for time-dependent and nonlinear response of concrete and various methods for modeling crack propagation are discussed. Published comparisons between experimental and theoretical results are used to assess the accuracy of these analytical methods

  3. Review of unfolding methods for neutron flux dosimetry

    International Nuclear Information System (INIS)

    Stallmann, F.W.; Kam, F.B.K.

    1975-01-01

    The primary method in reactor dosimetry is the foil activation technique. To translate the activation measurements into neutron fluxes, a special data processing technique called unfolding is needed. Some general observations about the problems and the reliability of this approach to reactor dosimetry are presented. Current unfolding methods are reviewed. 12 references. (auth)
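
    As a hedged illustration of the unfolding idea referred to above, the sketch below solves the activation equations R = A·φ for group fluxes by non-negative least squares. The response matrix, number of foils and group structure are invented for the example and do not correspond to any particular unfolding code discussed in the review.

```python
# Generic non-negative least-squares illustration of unfolding: given
# measured foil activation rates R_i and known foil response functions A_ij
# over energy groups j, estimate the group fluxes phi_j.  All numbers below
# are fabricated for the example.
import numpy as np
from scipy.optimize import nnls

# Hypothetical 4 foils x 3 energy groups response matrix (reactions per
# unit flux) and a "true" group flux used to fabricate consistent data.
A = np.array([[0.9, 0.1, 0.0],
              [0.4, 0.5, 0.1],
              [0.1, 0.6, 0.3],
              [0.0, 0.2, 0.8]])
phi_true = np.array([2.0e10, 5.0e9, 1.0e9])
rates = A @ phi_true * (1 + 0.02 * np.random.default_rng(1).standard_normal(4))

# Unfold: least-squares solution constrained to non-negative fluxes.
phi_est, residual = nnls(A, rates)
print("estimated group fluxes:", phi_est)
print("true group fluxes:     ", phi_true)
```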

  4. Review of Electrical and Gravity Methods of Near-Surface ...

    African Journals Online (AJOL)

    ABSTRACT: The theory and practice of electrical and gravity methods of geophysics for groundwater exploration was reviewed with illustrations and data examples. With the goal of reducing cases of borehole/water-well failure attributed to the lack of the knowledge of the methods of geophysics for groundwater exploration ...

  5. A Review of Classical Methods of Item Analysis.

    Science.gov (United States)

    French, Christine L.

    Item analysis is a very important consideration in the test development process. It is a statistical procedure to analyze test items that combines methods used to evaluate the important characteristics of test items, such as difficulty, discrimination, and distractibility of the items in a test. This paper reviews some of the classical methods for…

  6. The Zig-zag Instability of Streamlined Bodies

    Science.gov (United States)

    Guillet, Thibault; Coux, Martin; Quere, David; Clanet, Christophe

    2017-11-01

    When a floating bluff body, like a sphere, impacts water with a vertical velocity, its trajectory is straight and the depth of its dive increases with its initial velocity. Even though we observe the same phenomenon at low impact speed for axisymmetric streamlined bodies, the trajectory is found to deviate from the vertical when the velocity exceeds a critical value. This instability results from a competition between the destabilizing torque of the lift and the stabilizing torque of the Archimedes (buoyancy) force. Balancing these torques yields a prediction of the critical velocity above which the instability appears. This theoretical value is found to depend on the position of the centre of gravity of the projectile and predicts, in full agreement, the behaviour observed in our different experiments. Project funded by DGA.

  7. Streamlining air import operations by trade facilitation measures

    Directory of Open Access Journals (Sweden)

    Yuri da Cunha Ferreira

    2017-12-01

    Full Text Available Global operations are subject to considerable uncertainties. Given the Trade Facilitation Agreement that became effective in February 2017, the study of measures to streamline customs controls is urgent. This study aims to assess the impact of trade facilitation measures on import flows. An experimental study was performed in the largest cargo airport in South America through discrete-event simulation and design of experiments. The operational impacts of three trade facilitation measures on import flows by air are assessed. We shed light on the following trade facilitation measures: the use of X-ray equipment for physical inspection; an increase in the number of qualified companies in the trade facilitation program; and performance targets for customs officials. All trade facilitation measures studied indicated potential to provide more predictability, cost savings, time reduction, and increased security in the international supply chain.
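
    As a rough illustration of the simulation approach mentioned above, the sketch below models customs inspection as a single-server queue and compares two hypothetical mean inspection times (standing in for physical versus X-ray inspection). All rates are invented; the actual study used a detailed discrete-event model of a specific airport combined with design of experiments.

```python
# Much-simplified single-server queue simulation (a stand-in for the full
# discrete-event model used in the study).  Shipments arrive, wait for one
# inspection resource and are released; interarrival and service times are
# exponential with invented means.
import random

def simulate(mean_service, n_shipments=1000, mean_interarrival=1.0, seed=42):
    rng = random.Random(seed)
    t_arrive = 0.0
    server_free_at = 0.0
    total_time_in_system = 0.0
    for _ in range(n_shipments):
        t_arrive += rng.expovariate(1.0 / mean_interarrival)
        start = max(t_arrive, server_free_at)      # wait if the inspector is busy
        service = rng.expovariate(1.0 / mean_service)
        server_free_at = start + service
        total_time_in_system += server_free_at - t_arrive
    return total_time_in_system / n_shipments

# Compare a baseline physical inspection against a (hypothetically faster)
# X-ray based inspection.
print("baseline mean time in system:", round(simulate(mean_service=0.9), 2))
print("x-ray    mean time in system:", round(simulate(mean_service=0.5), 2))
```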

  8. Scoping reviews: time for clarity in definition, methods, and reporting.

    Science.gov (United States)

    Colquhoun, Heather L; Levac, Danielle; O'Brien, Kelly K; Straus, Sharon; Tricco, Andrea C; Perrier, Laure; Kastner, Monika; Moher, David

    2014-12-01

    The scoping review has become increasingly popular as a form of knowledge synthesis. However, a lack of consensus on scoping review terminology, definition, methodology, and reporting limits the potential of this form of synthesis. In this article, we propose recommendations to further advance the field of scoping review methodology. We summarize current understanding of scoping review publication rates, terms, definitions, and methods. We propose three recommendations for clarity in term, definition and methodology. We recommend adopting the terms "scoping review" or "scoping study" and the use of a proposed definition. Until such time as further guidance is developed, we recommend the use of the methodological steps outlined in the Arksey and O'Malley framework and further enhanced by Levac et al. The development of reporting guidance for the conduct and reporting of scoping reviews is underway. Consistency in the proposed domains and methodologies of scoping reviews, along with the development of reporting guidance, will facilitate methodological advancement, reduce confusion, facilitate collaboration and improve knowledge translation of scoping review findings. Copyright © 2014 Elsevier Inc. All rights reserved.

  9. Mixed methods in psychotherapy research: A review of method(ology) integration in psychotherapy science.

    Science.gov (United States)

    Bartholomew, Theodore T; Lockard, Allison J

    2018-06-13

    Mixed methods can foster depth and breadth in psychological research. However, its use remains in development in psychotherapy research. Our purpose was to review the use of mixed methods in psychotherapy research. Thirty-one studies were identified via the PRISMA systematic review method. Using Creswell & Plano Clark's typologies to identify design characteristics, we assessed each study for rigor and how each used mixed methods. Key features of mixed methods designs and these common patterns were identified: (a) integration of clients' perceptions via mixing; (b) understanding group psychotherapy; (c) integrating methods with cases and small samples; (d) analyzing clinical data as qualitative data; and (e) exploring cultural identities in psychotherapy through mixed methods. The review is discussed with respect to the value of integrating multiple data in single studies to enhance psychotherapy research. © 2018 Wiley Periodicals, Inc.

  10. Mixed methods systematic review exploring mentorship outcomes in nursing academia.

    Science.gov (United States)

    Nowell, Lorelli; Norris, Jill M; Mrklas, Kelly; White, Deborah E

    2017-03-01

    The aim of this study was to report on a mixed methods systematic review that critically examines the evidence for mentorship in nursing academia. Nursing education institutions globally have issued calls for mentorship. There is emerging evidence to support the value of mentorship in other disciplines, but the extant state of the evidence in nursing academia is not known. A comprehensive review of the evidence is required. A mixed methods systematic review. Five databases (MEDLINE, CINAHL, EMBASE, ERIC, PsycINFO) were searched using an a priori search strategy from inception to 2 November 2015 to identify quantitative, qualitative and mixed methods studies. Grey literature searches were also conducted in electronic databases (ProQuest Dissertations and Theses, Index to Theses) and mentorship conference proceedings and by hand searching the reference lists of eligible studies. Study quality was assessed prior to inclusion using standardized critical appraisal instruments from the Joanna Briggs Institute. A convergent qualitative synthesis design was used where results from qualitative, quantitative and mixed methods studies were transformed into qualitative findings. Mentorship outcomes were mapped to a theory-informed framework. Thirty-four studies were included in this review, from the 3001 records initially retrieved. In general, mentorship had a positive impact on behavioural, career, attitudinal, relational and motivational outcomes; however, the methodological quality of studies was weak. This review can inform the objectives of mentorship interventions and contribute to a more rigorous approach to studies that assess mentorship outcomes. © 2016 John Wiley & Sons Ltd.

  11. Landslide Susceptibility Statistical Methods: A Critical and Systematic Literature Review

    Science.gov (United States)

    Mihir, Monika; Malamud, Bruce; Rossi, Mauro; Reichenbach, Paola; Ardizzone, Francesca

    2014-05-01

    Landslide susceptibility assessment, the subject of this systematic review, is aimed at understanding the spatial probability of slope failures under a set of geomorphological and environmental conditions. It is estimated that about 375 fatal landslides occur globally each year, with around 4,600 people killed per year. Past studies have brought out the increasing cost of landslide damage, which can primarily be attributed to human occupation of, and increased human activity in, vulnerable environments. Many scientists, in order to evaluate and reduce landslide risk, have made an effort to efficiently map landslide susceptibility using different statistical methods. In this paper, we carry out a critical and systematic review of the landslide susceptibility literature in terms of the different statistical methods used. For each of a broad set of studies reviewed we note: (i) study geographic region and areal extent, (ii) landslide types, (iii) inventory type and temporal period covered, (iv) mapping technique, (v) thematic variables used, (vi) statistical models, (vii) assessment of model skill, (viii) uncertainty assessment methods, and (ix) validation methods. We then pull out broad trends from our review, particularly regarding the statistical methods. We find that the most common statistical methods used in the study of landslide susceptibility include logistic regression, artificial neural networks, discriminant analysis and weight of evidence. Although most of the studies we reviewed assessed model skill, very few assessed model uncertainty. In terms of geographic extent, the largest number of landslide susceptibility zonations were in Turkey, Korea, Spain, Italy and Malaysia. However, there are also many landslides and fatalities in other localities, particularly India, China, the Philippines, Nepal, Indonesia, Guatemala, and Pakistan, where there are far fewer landslide susceptibility studies available in the peer-reviewed literature. This
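
    As a hedged illustration of the most frequently reported technique (logistic regression), the sketch below fits a susceptibility model to synthetic grid-cell data with a few invented thematic variables and reports one common skill measure (AUC). It is not a reproduction of any study in the review.

```python
# Logistic-regression susceptibility sketch on synthetic data.  The thematic
# variables (slope, lithology class, land cover) and the landslide inventory
# labels are fabricated; a real zonation would use mapped grid cells.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)
n = 2000
slope = rng.uniform(0, 45, n)                 # degrees
lithology = rng.integers(0, 3, n)             # coded rock type
landcover = rng.integers(0, 2, n)             # 0 = forest, 1 = cleared
# Synthetic "truth": failure probability rises with slope and cleared land.
p = 1 / (1 + np.exp(-(0.12 * slope + 0.8 * landcover - 4.0)))
landslide = rng.random(n) < p

X = np.column_stack([slope, lithology, landcover])
X_tr, X_te, y_tr, y_te = train_test_split(X, landslide, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
susceptibility = model.predict_proba(X_te)[:, 1]   # per-cell susceptibility
print("AUC (one common skill measure):",
      round(roc_auc_score(y_te, susceptibility), 3))
```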

  12. Conducting organizational safety reviews - requirements, methods and experience

    International Nuclear Information System (INIS)

    Reiman, T.; Oedewald, P.; Wahlstroem, B.; Rollenhagen, C.; Kahlbom, U.

    2008-03-01

    Organizational safety reviews are part of the safety management process of power plants. They are typically performed after major reorganizations, significant incidents or according to specified review programs. Organizational reviews can also be a part of a benchmarking between organizations that aims to improve work practices. Thus, they are important instruments in proactive safety management and safety culture. Most methods that have been used for organizational reviews are based more on practical considerations than a sound scientific theory of how various organizational or technical issues influence safety. Review practices and methods also vary considerably. The objective of this research is to promote understanding on approaches used in organizational safety reviews as well as to initiate discussion on criteria and methods of organizational assessment. The research identified a set of issues that need to be taken into account when planning and conducting organizational safety reviews. Examples of the issues are definition of appropriate criteria for evaluation, the expertise needed in the assessment and the organizational motivation for conducting the assessment. The study indicates that organizational safety assessments involve plenty of issues and situations where choices have to be made regarding what is considered valid information and a balance has to be struck between focus on various organizational phenomena. It is very important that these choices are based on a sound theoretical framework and that these choices can later be evaluated together with the assessment findings. The research concludes that at its best, the organizational safety reviews can be utilised as a source of information concerning the changing vulnerabilities and the actual safety performance of the organization. In order to do this, certain basic organizational phenomena and assessment issues have to be acknowledged and considered. The research concludes with recommendations on

  13. Conducting organizational safety reviews - requirements, methods and experience

    Energy Technology Data Exchange (ETDEWEB)

    Reiman, T.; Oedewald, P.; Wahlstroem, B. [Technical Research Centre of Finland, VTT (Finland); Rollenhagen, C. [Royal Institute of Technology, KTH, (Sweden); Kahlbom, U. [RiskPilot (Sweden)

    2008-03-15

    Organizational safety reviews are part of the safety management process of power plants. They are typically performed after major reorganizations, significant incidents or according to specified review programs. Organizational reviews can also be a part of a benchmarking between organizations that aims to improve work practices. Thus, they are important instruments in proactive safety management and safety culture. Most methods that have been used for organizational reviews are based more on practical considerations than a sound scientific theory of how various organizational or technical issues influence safety. Review practices and methods also vary considerably. The objective of this research is to promote understanding on approaches used in organizational safety reviews as well as to initiate discussion on criteria and methods of organizational assessment. The research identified a set of issues that need to be taken into account when planning and conducting organizational safety reviews. Examples of the issues are definition of appropriate criteria for evaluation, the expertise needed in the assessment and the organizational motivation for conducting the assessment. The study indicates that organizational safety assessments involve plenty of issues and situations where choices have to be made regarding what is considered valid information and a balance has to be struck between focus on various organizational phenomena. It is very important that these choices are based on a sound theoretical framework and that these choices can later be evaluated together with the assessment findings. The research concludes that at its best, the organizational safety reviews can be utilised as a source of information concerning the changing vulnerabilities and the actual safety performance of the organization. In order to do this, certain basic organizational phenomena and assessment issues have to be acknowledged and considered. The research concludes with recommendations on

  14. [Molecular typing methods for Pasteurella multocida-A review].

    Science.gov (United States)

    Peng, Zhong; Liang, Wan; Wu, Bin

    2016-10-04

    Pasteurella multocida is an important gram-negative pathogenic bacterium that can infect a wide range of animals. Humans can also be infected by P. multocida via animal bites or scratches. Current typing methods for P. multocida include serological typing methods and molecular typing methods. Of these, serological typing methods are based on immunological assays, which are too complicated for clinical bacteriological studies. The molecular methods, including multiple PCRs and multilocus sequence typing (MLST), are more suitable for clinical bacteriological studies of P. multocida; because of their simple operation, high efficiency and accurate detection compared with traditional serological typing, they are widely used. In the current review, we briefly describe the molecular typing methods for P. multocida. Our aim is to provide a knowledge foundation for clinical bacteriological investigation, especially molecular investigation, of P. multocida.

  15. Analytical Work in Support of the Design and Operation of Two Dimensional Self Streamlining Test Sections

    Science.gov (United States)

    Judd, M.; Wolf, S. W. D.; Goodyer, M. J.

    1976-01-01

    A method has been developed for accurately computing the imaginary flow fields outside a flexible walled test section, applicable to lifting and non-lifting models. The tolerances in the setting of the flexible walls introduce only small levels of aerodynamic interference at the model. While it is not possible to apply corrections for the interference effects, they may be reduced by improving the setting accuracy of the portions of wall immediately above and below the model. Interference effects of the truncation of the length of the streamlined portion of a test section are brought to an acceptably small level by the use of a suitably long test section with the model placed centrally.

  16. Organisational reviews - requirements, methods and experience. Progress report 2006

    International Nuclear Information System (INIS)

    Reiman, T.; Oedewald, P.; Wahlstroem, B.; Rollenhagen, C.; Kahlbom, U.

    2007-04-01

    Organisational reviews are important instruments in the continuous quest for improved performance. In the nuclear field there has been an increasing regulatory interest in organisational performance, because incidents and accidents often point to organisational deficiencies as one of the major precursors. Many methods for organisational reviews have been proposed, but they are mostly based on ad hoc approaches to specific problems. The absence of well-established techniques for organisational reviews has already shown to cause discussions and controversies on different levels. The aim of the OrRe project is to collect the experiences from organisational reviews carried out so far and to reflect them in a theoretical model of organisational performance. Furthermore, the project aims to reflect on the criteria for the definition of the scope and content of organisational reviews. Finally, recommendations will be made for guidance for people participating in organisational reviews. This progress report describes regulatory practices in Finland and Sweden together with some case examples of organizational reviews and assessment in both countries. Some issues of concern are raised and an outline for the next year's work is proposed. Issues of concern include the sufficient depth of the assessment, the required competence in assessments, data and criteria problems, definition of the boundaries of the system to be assessed, and the necessary internal support and organisational maturity required for successful assessments. Finally, plans for next year's work are outlined. (au)

  17. Organisational reviews - requirements, methods and experience. Progress report 2006

    Energy Technology Data Exchange (ETDEWEB)

    Reiman, T.; Oedewald, P.; Wahlstroem, B. [VTT, Technical Research Centre of Finland (Finland); Rollenhagen, C.; Kahlbom, U. [Maelardalen University (FI)

    2007-04-15

    Organisational reviews are important instruments in the continuous quest for improved performance. In the nuclear field there has been an increasing regulatory interest in organisational performance, because incidents and accidents often point to organisational deficiencies as one of the major precursors. Many methods for organisational reviews have been proposed, but they are mostly based on ad hoc approaches to specific problems. The absence of well-established techniques for organisational reviews has already shown to cause discussions and controversies on different levels. The aim of the OrRe project is to collect the experiences from organisational reviews carried out so far and to reflect them in a theoretical model of organisational performance. Furthermore, the project aims to reflect on the criteria for the definition of the scope and content of organisational reviews. Finally, recommendations will be made for guidance for people participating in organisational reviews. This progress report describes regulatory practices in Finland and Sweden together with some case examples of organizational reviews and assessment in both countries. Some issues of concern are raised and an outline for the next year's work is proposed. Issues of concern include the sufficient depth of the assessment, the required competence in assessments, data and criteria problems, definition of the boundaries of the system to be assessed, and the necessary internal support and organisational maturity required for successful assessments. Finally, plans for next year's work are outlined. (au)

  18. Review of design optimization methods for turbomachinery aerodynamics

    Science.gov (United States)

    Li, Zhihui; Zheng, Xinqian

    2017-08-01

    In today's competitive environment, new turbomachinery designs need to be not only more efficient, quieter, and 'greener' but also need to be developed on much shorter time scales and at lower costs. A number of advanced optimization strategies have been developed to achieve these requirements. This paper reviews recent progress in turbomachinery design optimization to solve real-world aerodynamic problems, especially for compressors and turbines. This review covers the following topics that are important for optimizing turbomachinery designs: (1) optimization methods, (2) stochastic optimization combined with blade parameterization methods and design-of-experiment methods, (3) gradient-based optimization methods for compressors and turbines, and (4) data mining techniques for Pareto fronts. We also present our own insights regarding the current research trends and the future optimization of turbomachinery designs.
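
    The sketch below illustrates, in a much-simplified form, the gradient-based branch of the methods reviewed: a smooth stand-in loss model over a few hypothetical blade-shape parameters is minimized with a bounded SLSQP search. A real design loop would evaluate the objective with CFD and typically add surrogate models or stochastic search.

```python
# Gradient-based optimization sketch.  The "blade" is parameterized by a few
# hypothetical shape variables and the objective is a stand-in analytic loss
# model, not a CFD evaluation.
import numpy as np
from scipy.optimize import minimize

def pressure_loss(x):
    """Hypothetical smooth surrogate for a cascade loss coefficient."""
    stagger, camber, thickness = x
    return (0.02
            + 0.5 * (stagger - 0.6) ** 2
            + 0.3 * (camber - 0.4) ** 2
            + 2.0 * (thickness - 0.08) ** 2)

x0 = np.array([0.3, 0.2, 0.12])                       # initial blade design
bounds = [(0.0, 1.0), (0.0, 1.0), (0.04, 0.15)]       # simple box constraints
result = minimize(pressure_loss, x0, method="SLSQP", bounds=bounds)

print("optimized parameters:", np.round(result.x, 3))
print("loss: initial %.4f -> optimized %.4f" % (pressure_loss(x0), result.fun))
```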

  19. Issues in benchmarking human reliability analysis methods: A literature review

    International Nuclear Information System (INIS)

    Boring, Ronald L.; Hendrickson, Stacey M.L.; Forester, John A.; Tran, Tuan Q.; Lois, Erasmia

    2010-01-01

    There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessments (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study comparing and evaluating HRA methods in assessing operator performance in simulator experiments is currently underway. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies is presented in order to aid in the design of future HRA benchmarking endeavors.

  20. Issues in benchmarking human reliability analysis methods : a literature review.

    Energy Technology Data Exchange (ETDEWEB)

    Lois, Erasmia (US Nuclear Regulatory Commission); Forester, John Alan; Tran, Tuan Q. (Idaho National Laboratory, Idaho Falls, ID); Hendrickson, Stacey M. Langfitt; Boring, Ronald L. (Idaho National Laboratory, Idaho Falls, ID)

    2008-04-01

    There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessment (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study is currently underway that compares HRA methods with each other and against operator performance in simulator studies. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.

  1. Review of dynamic optimization methods in renewable natural resource management

    Science.gov (United States)

    Williams, B.K.

    1989-01-01

    In recent years, the applications of dynamic optimization procedures in natural resource management have proliferated. A systematic review of these applications is given in terms of a number of optimization methodologies and natural resource systems. The applicability of the methods to renewable natural resource systems is compared in terms of system complexity, system size, and precision of the optimal solutions. Recommendations are made concerning the appropriate methods for certain kinds of biological resource problems.

  2. Book Review: Comparative Education Research: Approaches and Methods

    Directory of Open Access Journals (Sweden)

    Noel Mcginn

    2014-10-01

    Full Text Available Book review of Comparative Education Research: Approaches and Methods (2nd edition), edited by Mark Bray, Bob Adamson and Mark Mason (2014, 453 pp.). ISBN: 978-988-17852-8-2. Hong Kong: Comparative Education Research Centre and Springer.

  3. Using grounded theory as a method for rigorously reviewing literature

    NARCIS (Netherlands)

    Wolfswinkel, J.; Furtmueller-Ettinger, Elfriede; Wilderom, Celeste P.M.

    2013-01-01

    This paper offers guidance to conducting a rigorous literature review. We present this in the form of a five-stage process in which we use Grounded Theory as a method. We first probe the guidelines explicated by Webster and Watson, and then we show the added value of Grounded Theory for rigorously

  4. Review of Electrical and Gravity Methods of Near-Surface ...

    African Journals Online (AJOL)

    In every big city, dozens of new boreholes or hand-dug wells are ... This paper is a review of the electrical and gravity methods of ...

  5. Review of Artificial Abrasion Test Methods for PV Module Technology

    Energy Technology Data Exchange (ETDEWEB)

    Miller, David C. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Muller, Matt T. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Simpson, Lin J. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-08-01

    This review is intended to identify the method or methods--and the basic details of those methods--that might be used to develop an artificial abrasion test. Methods used in the PV literature were compared with their closest implementation in existing standards. Also, meetings of the International PV Quality Assurance Task Force Task Group 12-3 (TG12-3, which is concerned with coated glass) were used to identify established test methods. Feedback from the group, which included many of the authors from the PV literature, included insights not explored within the literature itself. The combined experience and examples from the literature are intended to provide an assessment of the present industry practices and an informed path forward. Recommendations toward artificial abrasion test methods are then identified based on the experiences in the literature and feedback from the PV community. The review here is strictly focused on abrasion. Assessment methods, including optical performance (e.g., transmittance or reflectance), surface energy, and verification of chemical composition were not examined. Methods of artificially soiling PV modules or other specimens were not examined. The weathering of artificial or naturally soiled specimens (which may ultimately include combined temperature and humidity, thermal cycling and ultraviolet light) were also not examined. A sense of the purpose or application of an abrasion test method within the PV industry should, however, be evident from the literature.

  6. Cost-effectiveness of lung cancer screening and treatment methods: a systematic review of systematic reviews.

    Science.gov (United States)

    Azar, Farbod Ebadifard; Azami-Aghdash, Saber; Pournaghi-Azar, Fatemeh; Mazdaki, Alireza; Rezapour, Aziz; Ebrahimi, Parvin; Yousefzadeh, Negar

    2017-06-19

    Due to the extensive literature in the field of lung cancer and its heterogeneous results, the aim of this study was to systematically review systematic reviews that examined the cost-effectiveness of various lung cancer screening and treatment methods. In this systematic review of systematic reviews, the required data were collected by searching the following keywords, selected from MeSH: "lung cancer", "lung oncology", "lung carcinoma", "lung neoplasm", "lung tumors", "cost-effectiveness", "systematic review" and "meta-analysis". The following databases were searched: PubMed, Cochrane Library electronic databases, Google Scholar, and Scopus. Two reviewers (RA and A-AS) evaluated the articles according to the "assessment of multiple systematic reviews" (AMSTAR) checklist. Overall, information from 110 papers was discussed in eight systematic reviews. Authors focused on the cost-effectiveness of lung cancer treatments in five systematic reviews. Targeted therapy options (bevacizumab, erlotinib and crizotinib) show acceptable cost-effectiveness. Results of three studies failed to show cost-effectiveness of screening methods. None of the studies had used the meta-analysis method. The Quality of Health Economic Studies (QHES) tool and the Drummond checklist were the most commonly used instruments for assessing the quality of articles. The payer perspective was the most common (64 times) and the societal perspective the least common (11 times). Incremental analysis was reported most often (82% of cases), whereas discounting was reported least often (in 49% of cases). The average quality score of the included studies was 9.2 out of 11. Targeted therapy can be an option for the treatment of lung cancer. Evaluation of the cost-effectiveness of computerized tomographic colonography (CTC) in lung cancer screening is recommended. The community perspective should be taken into consideration more in cost-effectiveness studies. Paying more attention to the topic of
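
    For readers unfamiliar with the incremental analysis mentioned above, the following is a worked example of an incremental cost-effectiveness ratio (ICER); the costs and effects are purely illustrative and are not taken from the reviewed studies.

```python
# Worked example of an incremental cost-effectiveness ratio (ICER).
# The costs and effects are invented numbers for illustration only.
def icer(cost_new, effect_new, cost_old, effect_old):
    """Incremental cost per additional unit of effect (e.g. per QALY)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Hypothetical comparison of a targeted therapy against standard care.
print("ICER = %.0f per QALY gained" % icer(cost_new=60000, effect_new=1.9,
                                           cost_old=35000, effect_old=1.4))
```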

  7. Review of methods for level density estimation from resonance parameters

    International Nuclear Information System (INIS)

    Froehner, F.H.

    1983-01-01

    A number of methods are available for statistical analysis of resonance parameter sets, i.e. for estimation of level densities and average widths with account of missing levels. The main categories are (i) methods based on theories of level spacings (orthogonal-ensemble theory, Dyson-Mehta statistics), (ii) methods based on comparison with simulated cross section curves (Monte Carlo simulation, Garrison's autocorrelation method), (iii) methods exploiting the observed neutron width distribution by means of Bayesian or more approximate procedures such as maximum-likelihood, least-squares or moment methods, with various recipes for the treatment of detection thresholds and resolution effects. The present review will concentrate on (iii) with the aim of clarifying the basic mathematical concepts and the relationship between the various techniques. Recent theoretical progress in the treatment of resolution effects, detectability thresholds and p-wave admixture is described. (Auth.)
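
    As a hedged sketch of one technique in category (iii), the code below forms a maximum-likelihood estimate of the average neutron width from a sample truncated by a detection threshold, assuming a Porter-Thomas (chi-squared with one degree of freedom) width distribution. The sample is simulated, and resolution effects, p-wave admixture and level-spacing information are ignored.

```python
# Maximum-likelihood estimate of the average neutron width <Gamma> from
# widths observed above a detection threshold, assuming a Porter-Thomas
# distribution.  All numbers are simulated for the example.
import numpy as np
from scipy.special import erfc
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(3)
gamma_avg_true = 50.0                      # meV, "true" average width
widths = gamma_avg_true * rng.chisquare(df=1, size=400)
threshold = 10.0                           # meV, detection threshold
observed = widths[widths > threshold]      # small widths are missed

def neg_log_likelihood(gamma_avg):
    # Porter-Thomas density, renormalized to the observable region G > threshold.
    g = observed
    log_pdf = -0.5 * np.log(2 * np.pi * g * gamma_avg) - g / (2 * gamma_avg)
    log_norm = np.log(erfc(np.sqrt(threshold / (2 * gamma_avg))))
    return -(log_pdf - log_norm).sum()

fit = minimize_scalar(neg_log_likelihood, bounds=(1.0, 500.0), method="bounded")
print("naive mean of observed widths  :", round(observed.mean(), 1))
print("threshold-corrected ML estimate:", round(fit.x, 1))
```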

  8. Streamlined islands and the English Channel megaflood hypothesis

    Science.gov (United States)

    Collier, J. S.; Oggioni, F.; Gupta, S.; García-Moreno, D.; Trentesaux, A.; De Batist, M.

    2015-12-01

    Recognising ice-age catastrophic megafloods is important because they had significant impact on large-scale drainage evolution and patterns of water and sediment movement to the oceans, and likely induced very rapid, short-term effects on climate. It has been previously proposed that a drainage system on the floor of the English Channel was initiated by catastrophic flooding in the Pleistocene but this suggestion has remained controversial. Here we examine this hypothesis through an analysis of key landform features. We use a new compilation of multi- and single-beam bathymetry together with sub-bottom profiler data to establish the internal structure, planform geometry and hence origin of a set of 36 mid-channel islands. Whilst there is evidence of modern-day surficial sediment processes, the majority of the islands can be clearly demonstrated to be formed of bedrock, and are hence erosional remnants rather than depositional features. The islands display classic lemniscate or tear-drop outlines, with elongated tips pointing downstream, typical of streamlined islands formed during high-magnitude water flow. The length-to-width ratio for the entire island population is 3.4 ± 1.3 and the degree-of-elongation or k-value is 3.7 ± 1.4. These values are comparable to streamlined islands in other proven Pleistocene catastrophic flood terrains and are distinctly different to values found in modern-day rivers. The island geometries show a correlation with bedrock type: with those carved from Upper Cretaceous chalk having larger length-to-width ratios (3.2 ± 1.3) than those carved into more mixed Paleogene terrigenous sandstones, siltstones and mudstones (3.0 ± 1.5). We attribute these differences to the former rock unit having a lower skin friction which allowed longer island growth to achieve minimum drag. The Paleogene islands, although less numerous than the Chalk islands, also assume more perfect lemniscate shapes. These lithologies therefore reached island

  9. A Review on the Modified Finite Point Method

    Directory of Open Access Journals (Sweden)

    Nan-Jing Wu

    2014-01-01

    Full Text Available The objective of this paper is to make a review on recent advancements of the modified finite point method, named MFPM hereafter. This MFPM method is developed for solving general partial differential equations. Benchmark examples of employing this method to solve Laplace, Poisson, convection-diffusion, Helmholtz, mild-slope, and extended mild-slope equations are verified and then illustrated in fluid flow problems. Application of MFPM to numerical generation of orthogonal grids, which is governed by Laplace equation, is also demonstrated.

  10. [Teaching methods for clinical settings: a literature review].

    Science.gov (United States)

    Brugnolli, Anna; Benaglio, Carla

    2017-01-01

    Teaching methods for clinical settings: a review. The teaching process during internship requires several methods to promote the acquisition of skills more complex than technical ones, such as relational, decisional and planning abilities. The aim is to describe effective teaching methods that promote the learning of relational, decisional and planning skills. A literature review was conducted of the teaching methods that have proven most effective, most appreciated by students, and most frequently used in Italian nursing schools. Clinical teaching is a central element in transforming the clinical experiences of internship into professional competences. Students are gradually brought to become more independent because they are offered opportunities to practice in real contexts, to receive feedback, to have positive role models and to become more autonomous: all elements that facilitate and potentiate learning. Clinical teaching should be based on a variety of methods. Students value a gradual progression, both in clinical experiences and in teaching strategies, from more supervised methods towards methods oriented to reflection on clinical practice and self-directed learning.

  11. A review of experimental methods for determining residual creep life

    International Nuclear Information System (INIS)

    Bolton, C.J.

    1977-11-01

    Experimental methods available for determining how much creep life remains at a particular time in the high temperature service of a component are reviewed. After a brief consideration of the limitations of stress rupture extrapolation techniques, the application of post-exposure creep testing is considered. Ways of assessing the effect of microstructural degradation on residual life are then reviewed. It is pointed out that while this type of work will be useful for certain materials, there are other materials in which 'mechanical damage' such as cavitation will be more important. Cavitation measurement techniques are therefore reviewed. The report ends with a brief consideration of the use of crack growth measurements in assessing the residual life of cracked components. (author)

  12. Power Mobility Training Methods for Children: A Systematic Review.

    Science.gov (United States)

    Kenyon, Lisa K; Hostnik, Lisa; McElroy, Rachel; Peterson, Courtney; Farris, John P

    2018-01-01

    To summarize and critically appraise the existing evidence related to power mobility training methods used in research studies conducted with children 21 years or younger. A systematic review was conducted using 16 electronic databases to identify primary source quantitative studies published in peer-reviewed journals. Data extraction, determination of level of evidence, evaluation of methodological rigor, and assessment of the risk of bias were completed. The Evidence Alert Traffic Light Grading System (EATLS) was used. Twenty-seven studies were included in the review. Levels of evidence were II to V; scientific rigor scores were 2 to 7. An overall Yellow EATLS level of evidence was found indicating that therapists should use caution when providing power mobility training interventions and measure outcomes related to established goals in areas such as development, functional skills, or use of a power mobility device.

  13. Streamlining Collaboration for the Gravitational-wave Astronomy Community

    Science.gov (United States)

    Koranda, S.

    2016-12-01

    In the morning hours of September 14, 2015 the Laser Interferometer Gravitational-wave Observatory (LIGO) directly detected gravitational waves from inspiraling and coalescing black holes, confirming a major prediction of Albert Einstein's general theory of relativity and beginning the era of gravitational-wave astronomy. With the LIGO detectors in the United States, the Virgo and GEO detectors in Europe, and the KAGRA detector in Japan, the gravitational-wave astronomy community is opening a new window on our Universe. Realizing the full science potential of LIGO and the other interferometers requires global collaboration not only within the gravitational-wave astronomy community but also with the astronomers and astrophysicists across multiple disciplines working to realize and leverage the power of multi-messenger astronomy. Enabling thousands of researchers from around the world and across multiple projects to efficiently collaborate, share, and analyze data and provide streamlined access to services, computing, and tools requires new and scalable approaches to identity and access management (IAM). We will discuss LIGO's IAM journey that began in 2007 and how today LIGO leverages internal identity federations like InCommon and eduGAIN to provide scalable and managed access for the gravitational-wave astronomy community. We will discuss the steps both large and small research organizations and projects take as their IAM infrastructure matures from ad-hoc silos of independent services to fully integrated and federated services that streamline collaboration so that scientists can focus on research and not managing passwords.

  14. Damage Detection with Streamlined Structural Health Monitoring Data

    Directory of Open Access Journals (Sweden)

    Jian Li

    2015-04-01

    Full Text Available The huge amounts of sensor data generated by large scale sensor networks in on-line structural health monitoring (SHM) systems often overwhelm the systems’ capacity for data transmission and analysis. This paper presents a new concept for an integrated SHM system in which a streamlined data flow is used as a unifying thread to integrate the individual components of on-line SHM systems. Such an integrated SHM system has a few desirable functionalities including embedded sensor data compression, interactive sensor data retrieval, and structural knowledge discovery, which aim to enhance the reliability, efficiency, and robustness of on-line SHM systems. Adoption of this new concept will enable the design of an on-line SHM system with more uniform data generation and data handling capacity for its subsystems. To examine this concept in the context of vibration-based SHM systems, real sensor data from an on-line SHM system comprising a scaled steel bridge structure and an on-line data acquisition system with remote data access was used in this study. Vibration test results clearly demonstrated the prominent performance characteristics of the proposed integrated SHM system including rapid data access, interactive data retrieval and knowledge discovery of structural conditions on a global level.

  15. Damage detection with streamlined structural health monitoring data.

    Science.gov (United States)

    Li, Jian; Deng, Jun; Xie, Weizhi

    2015-04-15

    The huge amounts of sensor data generated by large scale sensor networks in on-line structural health monitoring (SHM) systems often overwhelm the systems' capacity for data transmission and analysis. This paper presents a new concept for an integrated SHM system in which a streamlined data flow is used as a unifying thread to integrate the individual components of on-line SHM systems. Such an integrated SHM system has a few desirable functionalities including embedded sensor data compression, interactive sensor data retrieval, and structural knowledge discovery, which aim to enhance the reliability, efficiency, and robustness of on-line SHM systems. Adoption of this new concept will enable the design of an on-line SHM system with more uniform data generation and data handling capacity for its subsystems. To examine this concept in the context of vibration-based SHM systems, real sensor data from an on-line SHM system comprising a scaled steel bridge structure and an on-line data acquisition system with remote data access was used in this study. Vibration test results clearly demonstrated the prominent performance characteristics of the proposed integrated SHM system including rapid data access, interactive data retrieval and knowledge discovery of structural conditions on a global level.
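
    A minimal sketch of the embedded sensor-data compression idea referred to above: keep only the largest Fourier coefficients of a vibration record and reconstruct the signal at the receiving end. The signal, sampling rate and number of retained coefficients are invented; the paper's actual compression scheme is not reproduced here.

```python
# Illustrative Fourier-thresholding compression of a synthetic vibration
# record (not the authors' implementation).
import numpy as np

fs = 200.0                                  # Hz, hypothetical sampling rate
t = np.arange(0, 10, 1 / fs)
# Synthetic acceleration: two structural modes plus measurement noise.
signal = (np.sin(2 * np.pi * 3.2 * t) + 0.4 * np.sin(2 * np.pi * 11.7 * t)
          + 0.05 * np.random.default_rng(0).standard_normal(t.size))

spectrum = np.fft.rfft(signal)
keep = 50                                   # number of coefficients transmitted
idx = np.argsort(np.abs(spectrum))[-keep:]  # indices of the largest coefficients
compressed = np.zeros_like(spectrum)
compressed[idx] = spectrum[idx]

reconstructed = np.fft.irfft(compressed, n=signal.size)
ratio = signal.size / (2 * keep)            # real+imag parts per kept coefficient
rms_error = np.sqrt(np.mean((signal - reconstructed) ** 2)) / np.std(signal)
print(f"compression ratio ~ {ratio:.0f}:1, relative RMS error = {rms_error:.3f}")
```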

  16. An integrated billing application to streamline clinician workflow.

    Science.gov (United States)

    Vawdrey, David K; Walsh, Colin; Stetson, Peter D

    2014-01-01

    Between 2008 and 2010, our academic medical center transitioned to electronic provider documentation using a commercial electronic health record system. For attending physicians, one of the most frustrating aspects of this experience was the system's failure to support their existing electronic billing workflow. Because of poor system integration, it was difficult to verify the supporting documentation for each bill and impractical to track whether billable notes had corresponding charges. We developed and deployed in 2011 an integrated billing application called "iCharge" that streamlines clinicians' documentation and billing workflow, and simultaneously populates the inpatient problem list using billing diagnosis codes. Each month, over 550 physicians use iCharge to submit approximately 23,000 professional service charges for over 4,200 patients. On average, about 2.5 new problems are added to each patient's problem list. This paper describes the challenges and benefits of workflow integration across disparate applications and presents an example of innovative software development within a commercial EHR framework.

  17. Lessons learned in streamlining the preparation of SNM standard solutions

    International Nuclear Information System (INIS)

    Clark, J.P.; Johnson, S.R.

    1986-01-01

    Improved safeguard measurements have produced a demand for greater quantities of reliable SNM solution standards. At the Savannah River Plant (SRP), the demand for these standards has been met by several innovations to improve the productivity and reliability of standards preparations. With the use of a computer-controlled balance, large batches of SNM stock solutions are prepared on a gravimetric basis. Accurately dispensed quantities of the stock solution are weighed and stored in bottles. When needed, they are quantitatively transferred to tared containers, matrix adjusted to target concentrations, weighed, and measured for density at 25 °C. Concentrations of SNM are calculated both gravimetrically and volumetrically. Calculated values are confirmed analytically before the standards are used in measurement control program (MCP) activities. The lessons learned include: MCP goals include error identification and management. Strategy modifications are required to improve error management. Administrative controls can minimize certain types of errors. Automation can eliminate redundancy and streamline preparations. Prudence and simplicity enhance automation success. The effort expended to increase productivity has increased the reliability of standards and provided better documentation for quality assurance

  18. VISMASHUP: streamlining the creation of custom visualization applications

    Energy Technology Data Exchange (ETDEWEB)

    Ahrens, James P [Los Alamos National Laboratory; Santos, Emanuele [UNIV OF UTAH; Lins, Lauro [UNIV OF UTAH; Freire, Juliana [UNIV OF UTAH; Silva, Cláudio T [UNIV OF UTAH

    2010-01-01

    Visualization is essential for understanding the increasing volumes of digital data. However, the process required to create insightful visualizations is involved and time consuming. Although several visualization tools are available, including tools with sophisticated visual interfaces, they are out of reach for users who have little or no knowledge of visualization techniques and/or who do not have programming expertise. In this paper, we propose VISMASHUP, a new framework for streamlining the creation of customized visualization applications. Because these applications can be customized for very specific tasks, they can hide much of the complexity in a visualization specification and make it easier for users to explore visualizations by manipulating a small set of parameters. We describe the framework and how it supports the various tasks a designer needs to carry out to develop an application, from mining and exploring a set of visualization specifications (pipelines), to the creation of simplified views of the pipelines, and the automatic generation of the application and its interface. We also describe the implementation of the system and demonstrate its use in two real application scenarios.

  19. Microdiversification in genome-streamlined ubiquitous freshwater Actinobacteria.

    Science.gov (United States)

    Neuenschwander, Stefan M; Ghai, Rohit; Pernthaler, Jakob; Salcher, Michaela M

    2018-01-01

    Actinobacteria of the acI lineage are the most abundant microbes in freshwater systems, but there are so far no pure living cultures of these organisms, possibly because of metabolic dependencies on other microbes. This, in turn, has hampered an in-depth assessment of the genomic basis for their success in the environment. Here we present genomes from 16 axenic cultures of acI Actinobacteria. The isolates were not only of minute cell size, but also among the most streamlined free-living microbes, with extremely small genome sizes (1.2-1.4 Mbp) and low genomic GC content. Genome reduction in these bacteria might have led to auxotrophy for various vitamins, amino acids and reduced sulphur sources, thus creating dependencies to co-occurring organisms (the 'Black Queen' hypothesis). Genome analyses, moreover, revealed a surprising degree of inter- and intraspecific diversity in metabolic pathways, especially of carbohydrate transport and metabolism, and mainly encoded in genomic islands. The striking genotype microdiversification of acI Actinobacteria might explain their global success in highly dynamic freshwater environments with complex seasonal patterns of allochthonous and autochthonous carbon sources. We propose a new order within Actinobacteria ('Candidatus Nanopelagicales') with two new genera ('Candidatus Nanopelagicus' and 'Candidatus Planktophila') and nine new species.

  20. Electron beam treatment planning: A review of dose computation methods

    International Nuclear Information System (INIS)

    Mohan, R.; Riley, R.; Laughlin, J.S.

    1983-01-01

    Various methods of dose computations are reviewed. The equivalent path length methods used to account for body curvature and internal structure are not adequate because they ignore the lateral diffusion of electrons. The Monte Carlo method for the broad field three-dimensional situation in treatment planning is impractical because of the enormous computer time required. The pencil beam technique may represent a suitable compromise. The behavior of a pencil beam may be described by the multiple scattering theory or, alternatively, generated using the Monte Carlo method. Although nearly two orders of magnitude slower than the equivalent path length technique, the pencil beam method improves accuracy sufficiently to justify its use. It applies very well when accounting for the effect of surface irregularities; the formulation for handling inhomogeneous internal structure is yet to be developed
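
    A toy illustration of the pencil beam concept discussed above: the broad-field lateral dose profile at a fixed depth is built as a superposition of Gaussian pencil kernels, the lateral spread being exactly the electron diffusion that equivalent-path-length methods ignore. The field size and spread parameter are assumed values, not clinical data.

```python
# Superposition of Gaussian pencil beams across a broad electron field.
# The spread parameter sigma stands in for the multiple-scattering lateral
# diffusion at one depth; all numbers are illustrative only.
import numpy as np

x = np.linspace(-8.0, 8.0, 401)            # cm, lateral positions
field_half_width = 3.0                     # cm, half-width of the electron field
sigma = 0.8                                # cm, assumed lateral spread at this depth

# Centers of the individual pencil beams filling the field.
pencil_centers = np.linspace(-field_half_width, field_half_width, 121)

def gaussian_pencil(x, x0, sigma):
    """Lateral dose profile of a single pencil beam (normalized Gaussian)."""
    return np.exp(-0.5 * ((x - x0) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

dose = sum(gaussian_pencil(x, x0, sigma) for x0 in pencil_centers)
dose /= dose.max()                         # normalize to the central-axis value

# The penumbra (80%-20% width) broadens as sigma grows with depth.
penumbra = x[dose >= 0.2].max() - x[dose >= 0.8].max()
print(f"80%-20% penumbra width ~ {penumbra:.2f} cm for sigma = {sigma} cm")
```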

  1. Literature Review of Applying Visual Method to Understand Mathematics

    Directory of Open Access Journals (Sweden)

    Yu Xiaojuan

    2015-01-01

    Full Text Available As a new method to understand mathematics, visualization offers a new way of understanding mathematical principles and phenomena via image thinking and geometric explanation. It aims to deepen the understanding of the nature of concepts or phenomena and enhance the cognitive ability of learners. This paper collates and summarizes the application of this visual method in the understanding of mathematics. It also reviews the existing research, in particular with a visual demonstration of Euler’s formula, introduces the application of this method to solving relevant mathematical problems, and points out the differences and similarities between the visualization method and the numerical-graphic combination method, as well as matters needing attention in its application.
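
    A small sketch of such a visual demonstration, assuming the complex-exponential form of Euler's formula, e^{iθ} = cos θ + i sin θ (the paper could equally intend another form, such as the polyhedron formula V - E + F = 2): plotting e^{iθ} traces the unit circle, and one sample angle is decomposed into its cosine and sine components.

```python
# Unit-circle visualization of Euler's formula (complex-exponential form).
import numpy as np
import matplotlib
matplotlib.use("Agg")                       # render without a display
import matplotlib.pyplot as plt

theta = np.linspace(0, 2 * np.pi, 400)
circle = np.exp(1j * theta)                 # e^{i*theta} traces the unit circle

t0 = np.pi / 3                              # one sample angle
z0 = np.exp(1j * t0)                        # = cos(t0) + i*sin(t0)

fig, ax = plt.subplots(figsize=(4, 4))
ax.plot(circle.real, circle.imag)
ax.plot([0, z0.real], [0, z0.imag], marker="o")           # radius to e^{i*t0}
ax.plot([z0.real, z0.real], [0, z0.imag], linestyle="--")  # sin(t0) component
ax.plot([0, z0.real], [0, 0], linestyle="--")              # cos(t0) component
ax.set_aspect("equal")
ax.set_title(r"$e^{i\theta}=\cos\theta+i\sin\theta$")
fig.savefig("euler_formula.png", dpi=150)

print(np.isclose(z0, np.cos(t0) + 1j * np.sin(t0)))        # numerical check: True
```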

  2. Methods for land use impact assessment: A review

    International Nuclear Information System (INIS)

    Perminova, Tatiana; Sirina, Natalia; Laratte, Bertrand; Baranovskaya, Natalia; Rikhvanov, Leonid

    2016-01-01

    Many types of methods to assess land use impact have been developed. Nevertheless a systematic synthesis of all these approaches is necessary to highlight the most commonly used and most effective methods. Given the growing interest in this area of research, a review of the different methods of assessing land use impact (LUI) was performed using bibliometric analysis. One hundred eighty seven articles of agricultural and biological science, and environmental sciences were examined. According to our results, the most frequently used land use assessment methods are Life-Cycle Assessment, Material Flow Analysis/Input–Output Analysis, Environmental Impact Assessment and Ecological Footprint. Comparison of the methods allowed their specific features to be identified and to arrive at the conclusion that a combination of several methods is the best basis for a comprehensive analysis of land use impact assessment. - Highlights: • We identified the most frequently used methods in land use impact assessment. • A comparison of the methods based on several criteria was carried out. • Agricultural land use is by far the most common area of study within the methods. • Incentive driven methods, like LCA, arouse the most interest in this field.

  3. Methods for land use impact assessment: A review

    Energy Technology Data Exchange (ETDEWEB)

    Perminova, Tataina, E-mail: tatiana.perminova@utt.fr [Research Centre for Environmental Studies and Sustainability, University of Technology of Troyes, CNRS UMR 6281, 12 Rue Marie Curie CS 42060, F-10004 Troyes Cedex (France); Department of Geoecology and Geochemistry, Institute of Natural Resources, National Research Tomsk Polytechnic University, 30 Lenin Avenue, 634050 Tomsk (Russian Federation); Sirina, Natalia, E-mail: natalia.sirina@utt.fr [Research Centre for Environmental Studies and Sustainability, University of Technology of Troyes, CNRS UMR 6281, 12 Rue Marie Curie CS 42060, F-10004 Troyes Cedex (France); Laratte, Bertrand, E-mail: bertrand.laratte@utt.fr [Research Centre for Environmental Studies and Sustainability, University of Technology of Troyes, CNRS UMR 6281, 12 Rue Marie Curie CS 42060, F-10004 Troyes Cedex (France); Baranovskaya, Natalia, E-mail: natalya.baranovs@mail.ru [Department of Geoecology and Geochemistry, Institute of Natural Resources, National Research Tomsk Polytechnic University, 30 Lenin Avenue, 634050 Tomsk (Russian Federation); Rikhvanov, Leonid, E-mail: rikhvanov@tpu.ru [Department of Geoecology and Geochemistry, Institute of Natural Resources, National Research Tomsk Polytechnic University, 30 Lenin Avenue, 634050 Tomsk (Russian Federation)

    2016-09-15

    Many types of methods to assess land use impact have been developed. Nevertheless a systematic synthesis of all these approaches is necessary to highlight the most commonly used and most effective methods. Given the growing interest in this area of research, a review of the different methods of assessing land use impact (LUI) was performed using bibliometric analysis. One hundred eighty seven articles of agricultural and biological science, and environmental sciences were examined. According to our results, the most frequently used land use assessment methods are Life-Cycle Assessment, Material Flow Analysis/Input–Output Analysis, Environmental Impact Assessment and Ecological Footprint. Comparison of the methods allowed their specific features to be identified and to arrive at the conclusion that a combination of several methods is the best basis for a comprehensive analysis of land use impact assessment. - Highlights: • We identified the most frequently used methods in land use impact assessment. • A comparison of the methods based on several criteria was carried out. • Agricultural land use is by far the most common area of study within the methods. • Incentive driven methods, like LCA, arouse the most interest in this field.

  4. Streamlining Workflow for Endovascular Mechanical Thrombectomy: Lessons Learned from a Comprehensive Stroke Center.

    Science.gov (United States)

    Wang, Hongjin; Thevathasan, Arthur; Dowling, Richard; Bush, Steven; Mitchell, Peter; Yan, Bernard

    2017-08-01

    Recently, 5 randomized controlled trials confirmed the superiority of endovascular mechanical thrombectomy (EMT) to intravenous thrombolysis in acute ischemic stroke with large-vessel occlusion. The implication is that our health systems would witness an increasing number of patients treated with EMT. However, in-hospital delays, leading to increased time to reperfusion, are associated with poor clinical outcomes. This review outlines the in-hospital workflow of the treatment of acute ischemic stroke at a comprehensive stroke center and the lessons learned in reduction of in-hospital delays. The in-hospital workflow for acute ischemic stroke was described from prehospital notification to femoral arterial puncture in preparation for EMT. Systematic review of literature was also performed with PubMed. The implementation of workflow streamlining could result in reduction of in-hospital time delays for patients who were eligible for EMT. In particular, time-critical measures, including prehospital notification, the transfer of patients from door to computed tomography (CT) room, initiation of intravenous thrombolysis in the CT room, and the mobilization of neurointervention team in parallel with thrombolysis, all contributed to reduction in time delays. We have identified issues resulting in in-hospital time delays and have reported possible solutions to improve workflow efficiencies. We believe that these measures may help stroke centers initiate an EMT service for eligible patients. Copyright © 2017 National Stroke Association. Published by Elsevier Inc. All rights reserved.

  5. Review of various dynamic modeling methods and development of an intuitive modeling method for dynamic systems

    International Nuclear Information System (INIS)

    Shin, Seung Ki; Seong, Poong Hyun

    2008-01-01

    Conventional static reliability analysis methods are inadequate for modeling dynamic interactions between components of a system. Various techniques such as dynamic fault tree, dynamic Bayesian networks, and dynamic reliability block diagrams have been proposed for modeling dynamic systems based on improvement of the conventional modeling methods. In this paper, we review these methods briefly and introduce dynamic nodes to the existing Reliability Graph with General Gates (RGGG) as an intuitive modeling method to model dynamic systems. For a quantitative analysis, we use a discrete-time method to convert an RGGG to an equivalent Bayesian network and develop a software tool for generation of probability tables
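
    As a rough illustration of the discrete-time idea described above (not the authors' RGGG tool itself), the sketch below enumerates failure intervals for a hypothetical cold-spare pair, a sequence-dependent behaviour that a static fault tree cannot express; the failure rates, mission time, and interval count are assumptions made purely for illustration.

```python
# Minimal sketch of the discrete-time idea behind converting a dynamic
# dependency (here: a cold spare) into tractable probability tables.
# Failure rates and the number of intervals are hypothetical.
import math

LAMBDA_PRIMARY = 1e-3   # failures per hour (assumed)
LAMBDA_SPARE = 1e-3     # failures per hour once activated (assumed)
MISSION_TIME = 1000.0   # hours
N_INTERVALS = 50        # discretisation of the mission time

dt = MISSION_TIME / N_INTERVALS

def cdf_exp(rate, t):
    """Exponential failure CDF."""
    return 1.0 - math.exp(-rate * t)

# Probability that the primary fails in interval i (i = 1..N).
p_primary_in = [
    cdf_exp(LAMBDA_PRIMARY, i * dt) - cdf_exp(LAMBDA_PRIMARY, (i - 1) * dt)
    for i in range(1, N_INTERVALS + 1)
]

# The system fails by the end of the mission if the primary fails in some
# interval i and the spare (activated only then) fails within the remaining
# (N - i) intervals, a sequence dependence a static model cannot express.
p_system_fail = sum(
    p_i * cdf_exp(LAMBDA_SPARE, (N_INTERVALS - i) * dt)
    for i, p_i in enumerate(p_primary_in, start=1)
)

print(f"P(cold-spare pair fails within {MISSION_TIME:.0f} h) ~ {p_system_fail:.4f}")
```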

  6. Analytical Method and Semianalytical Method for Analysis of Scattering by Anisotropic Sphere: A Review

    Directory of Open Access Journals (Sweden)

    Chao Wan

    2012-01-01

    Full Text Available The history of methods for electromagnetic scattering by an anisotropic sphere is reviewed. Two main methods widely used for the anisotropic sphere, the angular expansion method and the T-matrix method, are first expressed in Cartesian coordinates. A comparison of the two and a further exploration of the scattered field are illustrated afterwards. Based on the most general form obtained by the variable separation method, the coupled electric and magnetic fields of a radially anisotropic sphere can be derived. By simplifying the conditions, the simpler case of uniaxial anisotropic media is expressed with confirmed coefficients for the internal and external fields. Details of significant phenomena are presented.

  7. Available Prediction Methods for Corrosion under Insulation (CUI): A Review

    OpenAIRE

    Burhani Nurul Rawaida Ain; Muhammad Masdi; Ismail Mokhtar Che

    2014-01-01

    Corrosion under insulation (CUI) is an increasingly important issue for the piping in industries especially petrochemical and chemical plants due to its unexpected catastrophic disaster. Therefore, attention towards the maintenance and prediction of CUI occurrence, particularly in the corrosion rates, has grown in recent years. In this study, a literature review in determining the corrosion rates by using various prediction models and method of the corrosion occurrence between the external su...

  8. Methods for determination of biomethane potential of feedstocks: a review

    Directory of Open Access Journals (Sweden)

    Raphael Muzondiwa Jingura

    2017-06-01

    Full Text Available Biogas is produced during anaerobic digestion (AD) of biodegradable organic materials. AD is a series of biochemical reactions in which microorganisms degrade organic matter under anaerobic conditions. There are many biomass resources that can be degraded by AD to produce biogas. Biogas consists of methane, carbon dioxide, and trace amounts of other gases. The gamut of feedstocks used in AD includes animal manure, municipal solid waste, sewage sludge, and various crops. Several factors affect the potential of feedstocks for biomethane production. The factors include nutrient content, total and volatile solids (VS) content, chemical and biological oxygen demand, carbon/nitrogen ratio, and presence of inhibitory substances. The biochemical methane potential (BMP), often defined as the maximum volume of methane produced per g of VS substrate, provides an indication of the biodegradability of a substrate and its potential to produce methane via AD. The BMP test is a method of establishing a baseline for performance of AD. BMP data are useful for designing AD parameters in order to optimise methane production. Several methods, both experimental and theoretical, can be used to determine BMP. The objective of this paper is to review several methods with a special focus on their advantages and disadvantages. The review shows that experimental methods, mainly the BMP test, are widely used. The BMP test is credited for its reliability and validity. There are variants of BMP assays as well. Theoretical models are alternative methods to estimate BMP. They are credited for being fast and easy to use. Spectroscopy has emerged as a new experimental tool to determine BMP. Each method has its own advantages and disadvantages with reference to efficacy, time, and ease of use. Choosing a method to use depends on various exigencies. More work needs to be continuously done in order to improve the various methods used to determine BMP.
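
    One of the theoretical approaches mentioned above can be made concrete with the Buswell-Boyle stoichiometry, which estimates methane yield from a substrate's elemental composition; the sketch below uses cellulose (C6H10O5) purely as an assumed example and is not a substitute for an experimental BMP assay.

```python
# A sketch of one "theoretical" BMP estimate: the Buswell-Boyle stoichiometry,
# which predicts methane yield from the elemental composition CnHaObNc of a
# substrate. The cellulose composition is an assumption for illustration only.
MOLAR_VOLUME = 22.414  # litres of gas per mol at STP

def theoretical_bmp(n, a, b, c):
    """Theoretical methane potential in L CH4 per g of substrate (CnHaObNc)."""
    ch4_mol = n / 2 + a / 8 - b / 4 - 3 * c / 8           # mol CH4 per mol substrate
    molar_mass = 12.011 * n + 1.008 * a + 15.999 * b + 14.007 * c
    return ch4_mol * MOLAR_VOLUME / molar_mass            # L CH4 per g substrate

# Cellulose monomer unit C6H10O5 as an example substrate.
bmp = theoretical_bmp(n=6, a=10, b=5, c=0)
print(f"Theoretical BMP ~ {bmp * 1000:.0f} mL CH4 per g substrate")
```

    For cellulose this evaluates to roughly 415 mL CH4 per g of substrate, which is the theoretical ceiling; measured BMP values are lower because only part of the substrate is actually degraded.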

  9. Stream-lined Gating Systems with Improved Yield - Dimensioning and Experimental Validation

    DEFF Research Database (Denmark)

    Tiedje, Niels Skat; Skov-Hansen, Søren Peter

    The paper describes how a stream-lined gating system, where the melt is confined and controlled during filling, can be designed. Commercial numerical modelling software has been used to compare the stream-lined design with a traditional gating system. These results are confirmed by experiments where the two types of lay-outs are cast in production. It is shown that flow in the stream-lined lay-out is well controlled and that the quality of the castings is at least equal to that of castings produced with a traditional lay-out. Further, the yield is improved by 4 % relative to a traditional lay-out.

  10. Trafficking and Health: A Systematic Review of Research Methods.

    Science.gov (United States)

    Cannon, Abby C; Arcara, Jennet; Graham, Laurie M; Macy, Rebecca J

    2018-04-01

    Trafficking in persons (TIP) is a human rights violation with serious public health consequences. Unfortunately, assessing TIP and its health sequelae rigorously and reliably is challenging due to TIP's clandestine nature, variation in definitions of TIP, and the need to use research methods that ensure studies are ethical and feasible. To help guide practice, policy, and research to assess TIP and health, we undertook a systematic literature review of 70 peer-reviewed, published articles to (a) identify TIP and health research methods being used, (b) determine what we can learn about TIP and health from these varied methodologies, and (c) determine the gaps that exist in health-focused TIP research. Results revealed that there are various quantitative and qualitative data collection and analysis methods being used to investigate TIP and health. Furthermore, findings show that the limitations of current methodologies affect what is known about TIP and health. In particular, varying definitions, participant recruitment strategies, ethical standards, and outcome measures all affect what is known about TIP and health. Moreover, findings demonstrate an urgent need for representative and nonpurposive recruitment strategies in future investigations of TIP and health as well as research on risk and protective factors related to TIP and health, intervention effectiveness, long-term health outcomes, and research on trafficked people beyond women trafficked for sex. We offer recommendations for research, policy, and practice based on review results.

  11. Review of teaching methods and critical thinking skills.

    Science.gov (United States)

    Kowalczyk, Nina

    2011-01-01

    Critical information is needed to inform radiation science educators regarding successful critical thinking educational strategies. From an evidence-based research perspective, systematic reviews are identified as the most current and highest level of evidence. Analysis at this high level is crucial in analyzing those teaching methods most appropriate to the development of critical thinking skills. To conduct a systematic literature review to identify teaching methods that demonstrate a positive effect on the development of students' critical thinking skills and to identify how these teaching strategies can best translate to radiologic science educational programs. A comprehensive literature search was conducted resulting in an assessment of 59 full reports. Nineteen of the 59 reports met inclusion criteria and were reviewed based on the level of evidence presented. Inclusion criteria included studies conducted in the past 10 years on sample sizes of 20 or more individuals demonstrating use of specific teaching interventions for 5 to 36 months in postsecondary health-related educational programs. The majority of the research focused on problem-based learning (PBL) requiring standardized small-group activities. Six of the 19 studies focused on PBL and demonstrated significant differences in student critical thinking scores. PBL, as described in the nursing literature, is an effective teaching method that should be used in radiation science education. ©2011 by the American Society of Radiologic Technologists.

  12. A Review of Design Optimization Methods for Electrical Machines

    Directory of Open Access Journals (Sweden)

    Gang Lei

    2017-11-01

    Full Text Available Electrical machines are the hearts of many appliances, industrial equipment and systems. In the context of global sustainability, they must fulfill various requirements, not only physically and technologically but also environmentally. Therefore, their design optimization process becomes more and more complex as more engineering disciplines/domains and constraints are involved, such as electromagnetics, structural mechanics and heat transfer. This paper aims to present a review of the design optimization methods for electrical machines, including design analysis methods and models, optimization models, algorithms and methods/strategies. Several efficient optimization methods/strategies are highlighted with comments, including surrogate-model based and multi-level optimization methods. In addition, two promising and challenging topics in both academic and industrial communities are discussed, and two novel optimization methods are introduced for advanced design optimization of electrical machines. First, a system-level design optimization method is introduced for the development of advanced electric drive systems. Second, a robust design optimization method based on the design for six-sigma technique is introduced for high-quality manufacturing of electrical machines in production. Meanwhile, a proposal is presented for the development of a robust design optimization service based on industrial big data and cloud computing services. Finally, five future directions are proposed, including smart design optimization method for future intelligent design and production of electrical machines.
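
    To make the surrogate-model strategy highlighted above concrete, the following minimal sketch replaces an expensive machine evaluation with a cheap one-dimensional stand-in function, fits a quadratic surrogate to a handful of samples, and re-optimises iteratively; the objective, bounds, and sample counts are invented for illustration and do not represent any production design loop.

```python
# Minimal sketch of surrogate-model based optimization: fit a cheap model
# to a few expensive evaluations, optimise the surrogate, re-evaluate, and
# refine. The 1-D objective stands in for an expensive FE machine model.
import numpy as np

def expensive_objective(x):
    """Placeholder for an expensive evaluation (e.g., an FE loss computation)."""
    return (x - 0.7) ** 2 + 0.1 * np.sin(8 * x)

rng = np.random.default_rng(0)
x_samples = rng.uniform(0.0, 1.0, size=5)          # initial design of experiments
y_samples = np.array([expensive_objective(x) for x in x_samples])

grid = np.linspace(0.0, 1.0, 501)
for iteration in range(10):
    # Quadratic polynomial surrogate fitted to the samples collected so far.
    coeffs = np.polyfit(x_samples, y_samples, deg=2)
    surrogate = np.polyval(coeffs, grid)
    x_new = grid[np.argmin(surrogate)]              # optimise the cheap surrogate
    y_new = expensive_objective(x_new)              # one more expensive evaluation
    x_samples = np.append(x_samples, x_new)
    y_samples = np.append(y_samples, y_new)

best = np.argmin(y_samples)
print(f"Best design found: x = {x_samples[best]:.3f}, objective = {y_samples[best]:.4f}")
```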

  13. Gallic Acid: Review of the Methods of Determination and Quantification.

    Science.gov (United States)

    Fernandes, Felipe Hugo Alencar; Salgado, Hérida Regina Nunes

    2016-05-03

    Gallic acid (3,4,5 trihydroxybenzoic acid) is a secondary metabolite present in most plants. This metabolite is known to exhibit a range of bioactivities including antioxidant, antimicrobial, anti-inflammatory, and anticancer. There are various methods to analyze gallic acid including spectrometry, chromatography, and capillary electrophoresis, among others. They have been developed to identify and quantify this active ingredient in most biological matrices. The aim of this article is to review the available information on analytical methods for gallic acid, as well as presenting the advantages and limitations of each technique.

  14. Review of Tomographic Imaging using Finite Element Method

    Directory of Open Access Journals (Sweden)

    Mohd Fua’ad RAHMAT

    2011-12-01

    Full Text Available Many types of techniques for process tomography have been proposed and developed during the past 20 years. This paper reviews the techniques and the current state of knowledge and experience on the subject, aiming to highlight the problems associated with non-finite-element methods, such as ill-posedness and ill-conditioning, which affect the accuracy and sensitivity of measurements. Considerations for the choice of sensors and their applications are outlined, and descriptions of non-finite-element tomography systems are presented. Finite element method tomography systems from recent works, suitable for process control and measurement, are also presented.

  15. Methods for systematic reviews of health economic evaluations: a systematic review, comparison, and synthesis of method literature.

    Science.gov (United States)

    Mathes, Tim; Walgenbach, Maren; Antoine, Sunya-Lee; Pieper, Dawid; Eikermann, Michaela

    2014-10-01

    The quality of systematic reviews of health economic evaluations (SR-HE) is often limited because of methodological shortcomings. One reason for this poor quality is that there are no established standards for the preparation of SR-HE. The objective of this study is to compare existing methods and suggest best practices for the preparation of SR-HE. To identify the relevant methodological literature on SR-HE, a systematic literature search was performed in Embase, Medline, the National Health System Economic Evaluation Database, the Health Technology Assessment Database, and the Cochrane methodology register, and webpages of international health technology assessment agencies were searched. The study selection was performed independently by 2 reviewers. Data were extracted by one reviewer and verified by a second reviewer. On the basis of the overlaps in the recommendations for the methods of SR-HE in the included papers, suggestions for best practices for the preparation of SR-HE were developed. Nineteen relevant publications were identified. The recommendations within them often differed. However, for most process steps there was some overlap between recommendations for the methods of preparation. The overlaps were taken as basis on which to develop suggestions for the following process steps of preparation: defining the research question, developing eligibility criteria, conducting a literature search, selecting studies, assessing the methodological study quality, assessing transferability, and synthesizing data. The differences in the proposed recommendations are not always explainable by the focus on certain evaluation types, target audiences, or integration in the decision process. Currently, there seem to be no standard methods for the preparation of SR-HE. The suggestions presented here can contribute to the harmonization of methods for the preparation of SR-HE. © The Author(s) 2014.

  16. Mixed-methods research in nursing - a critical review.

    Science.gov (United States)

    Bressan, Valentina; Bagnasco, Annamaria; Aleo, Giuseppe; Timmins, Fiona; Barisone, Michela; Bianchi, Monica; Pellegrini, Ramona; Sasso, Loredana

    2017-10-01

    To review the use of mixed-methods research in nursing with a particular focus on the extent to which current practice informs nurse researchers. It also aimed to highlight gaps in current knowledge, understanding and reporting of this type of research. Mixed-methods research is becoming increasingly popular among nurses and healthcare professionals. Emergent findings from this type of research are very useful for nurses in practice. The combination of both quantitative and qualitative methods provides a scientific base for practice but also richness from the qualitative enquiry. However, at the same time mixed-methods research is underdeveloped. This study identified mixed-methods research papers and critically evaluated their usefulness for research practice. To support the analysis, we performed a two-stage search using CINAHL to find papers with titles that included the key term 'mixed method'. An analysis of studies that used mixed-methods research revealed some inconsistencies in application and reporting. Attempts to use two distinct research methods in these studies often meant that one or both aspects had limitations. Overall, methods were applied in a less rigorous way, which provides somewhat limited direction for novice researchers. There is also potential for the application of evidence in healthcare practice that has limited validity. This study highlights current gaps in knowledge, understanding and reporting of mixed-methods research. While these methods are useful to gain insight into clinical problems, nurses lack guidance with this type of research. This study revealed that the guidance provided by current mixed-methods research is inconsistent and incomplete and this compounds the lack of available direction. There is an urgent need to develop robust guidelines for using mixed-methods research so that findings may be critically implemented in practice. © 2016 John Wiley & Sons Ltd.

  17. Methods to improve rehabilitation of patients following breast cancer surgery: a review of systematic reviews

    Directory of Open Access Journals (Sweden)

    Loh SY

    2015-03-01

    Full Text Available Siew Yim Loh, Aisya Nadia Musa Department of Rehabilitation Medicine, Faculty of Medicine, University of Malaya, Kuala Lumpur, Malaysia Context: Breast cancer is the most prevalent cancer amongst women but it has the highest survival rates amongst all cancer. Rehabilitation therapy of post-treatment effects from cancer and its treatment is needed to improve functioning and quality of life. This review investigated the range of methods for improving physical, psychosocial, occupational, and social wellbeing in women with breast cancer after receiving breast cancer surgery. Method: A search for articles published in English between the years 2009 and 2014 was carried out using The Cochrane Database of Systematic Reviews, the Database of Abstracts of Reviews of Effects, PubMed, and ScienceDirect. Search terms included: ‘breast cancer’, ‘breast carcinoma’, ‘surgery’, ‘mastectomy’, ‘lumpectomy’, ‘breast conservation’, ‘axillary lymph node dissection’, ‘rehabilitation’, 'therapy’, ‘physiotherapy’, ‘occupational therapy’, ‘psychological’, ‘psychosocial’, ‘psychotherapy’, ‘exercise’, ‘physical activity’, ‘cognitive’, ‘occupational’, ‘alternative’, ‘complementary’, and ‘systematic review’. Study selection: Systematic reviews on the effectiveness of rehabilitation methods in improving post-operative physical, and psychological outcomes for breast cancer were selected. Sixteen articles met all the eligibility criteria and were included in the review. Data extraction: Included review year, study aim, total number of participants included, and results. Data synthesis: Evidence for exercise rehabilitation is predominantly in the improvement of shoulder mobility and limb strength. Inconclusive results exist for a range of rehabilitation methods (physical, psycho-education, nutritional, alternative-complementary methods for addressing the domains of psychosocial, cognitive, and

  18. Path Planning Methods in an Environment with Obstacles (A Review

    Directory of Open Access Journals (Sweden)

    W. Liu

    2018-01-01

    Full Text Available Planning the path is the most important task in mobile robot navigation. This task basically involves three aspects. First, the planned path must run from a given starting point to a given endpoint. Secondly, it should ensure the robot’s collision-free movement. Thirdly, among all the possible paths that meet the first two requirements it must be, in a certain sense, optimal. Methods of path planning can be classified according to different characteristics. In the context of using intelligent technologies, they can be divided into traditional methods and heuristic ones. By the nature of the environment, it is possible to divide planning methods into planning methods in a static environment and in a dynamic one (it should be noted, however, that a static environment is rare). Methods can also be divided according to the completeness of information about the environment, namely methods with complete information (in this case the issue is global path planning) and methods with incomplete information (usually, this refers to situational awareness in the immediate vicinity of the robot; in this case it is local path planning). Note that incomplete information about the environment can be a consequence of a changing environment, i.e. in a dynamic environment there is usually local path planning. The literature offers a great many methods for path planning where various heuristic techniques are used, which, as a rule, result from the denotative meaning of the problem being solved. This review discusses the main approaches to the problem solution. Here we can distinguish five classes of basic methods: graph-based methods, methods based on cell decomposition, use of potential fields, optimization methods, and methods based on intelligent technologies. Many methods of path planning, as a result, give a chain of reference points (waypoints) connecting the beginning and end of the path. This should be seen as an intermediate result. The problem
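
    As a concrete instance of the graph-based family mentioned in the review, the sketch below runs A* search on a small occupancy grid and returns the resulting chain of waypoints; the map, start, and goal are made up for illustration.

```python
# A compact example of the graph-based family: A* search on a 4-connected
# occupancy grid, returning the chain of waypoints from start to goal.
import heapq

GRID = [
    "S....",
    ".###.",
    ".#...",
    ".#.#.",
    "...#G",
]

def find(symbol):
    for r, row in enumerate(GRID):
        if symbol in row:
            return (r, row.index(symbol))

def a_star(start, goal):
    def h(cell):  # Manhattan-distance heuristic (admissible on a 4-connected grid)
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    open_set = [(h(start), 0, start)]
    came_from, g_cost = {}, {start: 0}
    while open_set:
        _, g, current = heapq.heappop(open_set)
        if current == goal:
            path = [current]
            while current in came_from:
                current = came_from[current]
                path.append(current)
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = current[0] + dr, current[1] + dc
            if 0 <= nr < len(GRID) and 0 <= nc < len(GRID[0]) and GRID[nr][nc] != "#":
                tentative = g + 1
                if tentative < g_cost.get((nr, nc), float("inf")):
                    g_cost[(nr, nc)] = tentative
                    came_from[(nr, nc)] = current
                    heapq.heappush(open_set, (tentative + h((nr, nc)), tentative, (nr, nc)))
    return None  # no collision-free path exists

waypoints = a_star(find("S"), find("G"))
print("Waypoints:", waypoints)
```

    The returned list of grid cells is exactly the kind of waypoint chain the abstract describes as an intermediate result; in practice it would still be smoothed and checked against the robot's kinematic constraints.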

  19. Review of Upscaling Methods for Describing Unsaturated Flow

    Energy Technology Data Exchange (ETDEWEB)

    BD Wood

    2000-09-26

    The representation of small-scale features can be a challenge when attempting to model unsaturated flow in large domains. Upscaling methods offer the possibility of reducing the amount of resolution required to adequately simulate such a problem. In this report, the various upscaling techniques that are discussed in the literature are reviewed. The following upscaling methods have been identified from the literature: (1) stochastic methods, (2) renormalization methods, and (3) volume averaging and homogenization methods; in addition, a final technique, full resolution numerical modeling, is also discussed. Each of these techniques has its advantages and disadvantages. The trade-off is a reduction in accuracy in favor of a method that is easier to employ. For practical applications, the most reasonable approach appears to be one in which any of the upscaling methods identified above may be suitable for upscaling in regions where the variations in the parameter fields are small. For regions where the subsurface structure is more complex, only the homogenization and volume averaging methods are probably suitable. With the continual increases in computational capacity, full-resolution numerical modeling may in many instances provide a tractable means of solving the flow problem in unsaturated systems.
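
    A minimal numerical illustration of the upscaling idea, assuming the simpler saturated/linear case rather than full unsaturated flow: a heterogeneous conductivity field is replaced by a single effective value, with the arithmetic and harmonic means acting as classical bounds and the geometric mean as a common estimate for log-normal fields. The field below is synthetic.

```python
# Replace a block of spatially variable hydraulic conductivity with a single
# effective value. The arithmetic and harmonic means bound the effective
# conductivity (flow parallel vs. perpendicular to perfect layering); the
# geometric mean is a common estimate for log-normal fields. Values are synthetic.
import numpy as np

rng = np.random.default_rng(42)
k_field = 10 ** rng.normal(loc=-5.0, scale=1.0, size=10_000)  # synthetic K values (m/s)

k_arithmetic = k_field.mean()                      # upper (Wiener) bound
k_harmonic = len(k_field) / np.sum(1.0 / k_field)  # lower (Wiener) bound
k_geometric = np.exp(np.mean(np.log(k_field)))     # common effective estimate

print(f"Arithmetic mean K: {k_arithmetic:.3e} m/s")
print(f"Geometric mean K : {k_geometric:.3e} m/s")
print(f"Harmonic mean K  : {k_harmonic:.3e} m/s")
```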

  20. THE CURRENT METHODS FOR MOLECULAR DIAGNOSTICS OF FISH DISEASES (REVIEW

    Directory of Open Access Journals (Sweden)

    O. Zaloilo

    2016-06-01

    Full Text Available Purpose. Methods of molecular diagnostics (MMD) are gradually becoming widespread in modern fish farming. MMD comprise a wide variety of specific approaches, each of which has distinct limits to its possible applications and is characterized by individual peculiarities in practical performance. In addition to high sensitivity and the possibility of rapid diagnostics, the main advantage of molecular methods is the ability to detect uncultivated infectious agents. DNA amplification allows pathogenic microorganisms to be identified at very small quantities, even in a minimal sample volume. Molecular diagnostic methods enable the detection of infection in latent or acute phases. These methods also allow differences to be shown between pathogens with similar antigenic structures. The current literature on this subject usually presents a methodology only in the narrow context of particular tasks or the practical results obtained through such approaches. Thus, a synthesis of existing information on the mechanisms of action, the limits of application, and the typical problems of the basic methods of molecular diagnostics is an urgent task for fish breeding. In particular, the following description will help to choose one or several approaches more effectively to identify pathogens in fish. Findings. This paper reviews the basic molecular methods that are used in the world's aquaculture for diagnosis of various diseases in commercial fish species. Originality. This work is a generalization of data on the principles and mechanisms for the implementation of diagnostics based on modern molecular techniques. For each of the mentioned approaches, the most promising areas of application are shown. The information is provided in the form of a comparative analysis of each methodology, indicating positive and negative practical aspects. Practical value. The current review of modern methods of molecular diagnostics in aquaculture is focused on practical application. Generalizing and analytical information can be

  1. Methods for investigating biosurfactants and bioemulsifiers: a review.

    Science.gov (United States)

    Satpute, Surekha K; Banpurkar, Arun G; Dhakephalkar, Prashant K; Banat, Ibrahim M; Chopade, Balu A

    2010-06-01

    Microorganisms produce biosurfactant (BS)/bioemulsifier (BE) with wide structural and functional diversity which consequently results in the adoption of different techniques to investigate these diverse amphiphilic molecules. This review aims to compile information on different microbial screening methods, surface active products extraction procedures, and analytical terminologies used in this field. Different methods for screening microbial culture broth or cell biomass for surface active compounds production are also presented and their possible advantages and disadvantages highlighted. In addition, the most common methods for purification, detection, and structure determination for a wide range of BS and BE are introduced. Simple techniques such as precipitation using acetone, ammonium sulphate, solvent extraction, ultrafiltration, ion exchange, dialysis, lyophilization, isoelectric focusing (IEF), and thin layer chromatography (TLC) are described. Other more elaborate techniques including high pressure liquid chromatography (HPLC), infra red (IR), gas chromatography-mass spectroscopy (GC-MS), nuclear magnetic resonance (NMR), and fast atom bombardment mass spectroscopy (FAB-MS), protein digestion and amino acid sequencing are also elucidated. Various experimental strategies including static light scattering and hydrodynamic characterization for micelles have been discussed. A combination of various analytical methods is often essential in this area of research, and a number of trials and errors to isolate, purify and characterize various surface active agents are required. This review introduces the various methodologies that are indispensable for studying biosurfactants and bioemulsifiers.

  2. A review of damage detection methods for wind turbine blades

    International Nuclear Information System (INIS)

    Li, Dongsheng; Song, Gangbing; Ren, Liang; Li, Hongnan; Ho, Siu-Chun M

    2015-01-01

    Wind energy is one of the most important renewable energy sources and many countries are predicted to increase wind energy portion of their whole national energy supply to about twenty percent in the next decade. One potential obstacle in the use of wind turbines to harvest wind energy is the maintenance of the wind turbine blades. The blades are a crucial and costly part of a wind turbine and over their service life can suffer from factors such as material degradation and fatigue, which can limit their effectiveness and safety. Thus, the ability to detect damage in wind turbine blades is of great significance for planning maintenance and continued operation of the wind turbine. This paper presents a review of recent research and development in the field of damage detection for wind turbine blades. Specifically, this paper reviews frequently employed sensors including fiber optic and piezoelectric sensors, and four promising damage detection methods, namely, transmittance function, wave propagation, impedance and vibration based methods. As a note towards the future development trend for wind turbine sensing systems, the necessity for wireless sensing and energy harvesting is briefly presented. Finally, existing problems and promising research efforts for online damage detection of turbine blades are discussed. (topical review)
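
    A rough sketch of the transmittance-function idea named above, assuming two synthetic sensor signals as stand-ins for blade accelerometer data: the transmissibility spectrum of a current state is compared against a healthy baseline and condensed into a single damage index. The sampling rate, resonance shift, and index definition are assumptions chosen only to show the mechanics.

```python
# Estimate the transmissibility between two measurement points from their
# spectra and compare a "current" state against a healthy baseline.
import numpy as np

fs = 1000.0                       # sampling rate, Hz (assumed)
t = np.arange(0, 4.0, 1.0 / fs)
rng = np.random.default_rng(1)

def response(freq_hz, amplitude):
    """Toy two-sensor response with a shared resonance plus noise."""
    base = np.sin(2 * np.pi * freq_hz * t)
    s1 = base + 0.05 * rng.standard_normal(t.size)
    s2 = amplitude * base + 0.05 * rng.standard_normal(t.size)
    return s1, s2

def transmittance(s1, s2):
    """Magnitude ratio |S2(f)| / |S1(f)| of the two sensor spectra."""
    S1, S2 = np.fft.rfft(s1), np.fft.rfft(s2)
    return np.abs(S2) / (np.abs(S1) + 1e-12)

baseline = transmittance(*response(freq_hz=40.0, amplitude=0.8))
current = transmittance(*response(freq_hz=37.0, amplitude=0.5))  # "damaged" state

damage_index = np.sum((current - baseline) ** 2) / np.sum(baseline ** 2)
print(f"Damage index (0 for identical transmittance): {damage_index:.3f}")
```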

  3. Patient and physician attitudes regarding risk and benefit in streamlined development programmes for antibacterial drugs: a qualitative analysis.

    Science.gov (United States)

    Holland, Thomas L; Mikita, Stephen; Bloom, Diane; Roberts, Jamie; McCall, Jonathan; Collyar, Deborah; Santiago, Jonas; Tiernan, Rosemary; Toerner, Joseph

    2016-11-10

    To explore patient, caregiver and physician perceptions and attitudes regarding the balance of benefit and risk in using antibacterial drugs developed through streamlined development processes. Semistructured focus groups and in-depth interviews were conducted to elicit perceptions and attitudes about the use of antibacterial drugs to treat multidrug-resistant infections. Participants were given background information about antibiotic resistance, streamlined drug development programmes and FDA drug approval processes. Audio recordings of focus groups/interviews were reviewed and quotes excerpted and categorised to identify key themes. Two primary stakeholder groups were engaged: one comprising caregivers, healthy persons and patients who had recovered from or were at risk of resistant infection (N=67; 11 focus groups); and one comprising physicians who treat resistant infections (N=23). Responses from focus groups/interviews indicated widespread awareness among patients/caregivers and physicians of the seriousness of the problem of antibacterial resistance. Both groups were willing to accept a degree of uncertainty regarding the balance of risk and benefit in a new therapy where a serious unmet need exists, but also expressed a desire for rigorous monitoring and rapid, transparent reporting of safety/effectiveness data. Both groups wanted to ensure that >1 physician had input on whether to treat patients with antibiotics developed through a streamlined process. Some patients/caregivers unfamiliar with exigencies of critical care suggested a relatively large multidisciplinary team, while physicians believed individual expert consultations would be preferable. Both groups agreed that careful oversight and stewardship of antibacterial drugs are needed to ensure patient safety, preserve efficacy and prevent abuse. Groups comprising patients/caregivers and physicians were aware of serious issues posed by resistant infections and the lack of effective antibacterial drug

  4. Streamlining genomes: toward the generation of simplified and stabilized microbial systems

    NARCIS (Netherlands)

    Leprince, A.; Passel, van M.W.J.; Martins Dos Santos, V.A.P.

    2012-01-01

    At the junction between systems and synthetic biology, genome streamlining provides a solid foundation both for increased understanding of cellular circuitry, and for the tailoring of microbial chassis towards innovative biotechnological applications. Iterative genomic deletions (targeted and

  5. West Virginia Peer Exchange : Streamlining Highway Safety Improvement Program Project Delivery - An RSPCB Peer Exchange

    Science.gov (United States)

    2014-09-01

    The West Virginia Division of Highways (WV DOH) hosted a Peer Exchange to share information and experiences for streamlining Highway Safety Improvement Program (HSIP) project delivery. The event was held September 23 to 24, 2014 in Charleston, West V...

  6. 77 FR 50691 - Request for Information (RFI): Guidance on Data Streamlining and Reducing Undue Reporting Burden...

    Science.gov (United States)

    2012-08-22

    .... Attention: HIV Data Streamlining. FOR FURTHER INFORMATION CONTACT: Andrew D. Forsyth Ph.D. or Vera... of HIV/AIDS programs that vary in their specifications (e.g., numerators, denominators, time frames...

  7. West Virginia peer exchange : streamlining highway safety improvement program project delivery.

    Science.gov (United States)

    2015-01-01

    The West Virginia Division of Highways (WV DOH) hosted a Peer Exchange to share information and experiences : for streamlining Highway Safety Improvement Program (HSIP) project delivery. The event was held September : 22 to 23, 2014 in Charleston, We...

  8. Applications and Preparation Methods of Copper Chromite Catalysts: A Review

    Directory of Open Access Journals (Sweden)

    Ram Prasad

    2011-11-01

    Full Text Available In this review article various applications and preparation methods of copper chromite catalysts are discussed. It is concluded that copper chromite is a versatile catalyst which not only catalyses numerous processes of commercial importance and of national programmes related to defence and space research, but also finds applications in a problem of worldwide concern, i.e. environmental pollution control. Several other very useful applications of copper chromite catalysts are in the production of clean energy, drugs and agrochemicals, etc. About 15 preparation methods are discussed, which gives a clear idea of the dependence of catalytic activity and selectivity on the way the catalyst is prepared. In view of the globally increasing interest in copper chromite catalysis, a re-examination of the important applications of such catalysts and their useful preparation methods is thus timely. This review paper encloses 369 references including a well-conceivable tabulation of the newer state of the art. Copyright © 2011 by BCREC UNDIP. All rights reserved. (Received: 19th March 2011; Revised: 3rd May 2011; Accepted: 23rd May 2011) [How to Cite: R. Prasad and P. Singh (2011). Applications and Preparation Methods of Copper Chromite Catalysts: A Review. Bulletin of Chemical Reaction Engineering & Catalysis, 6(2): 63-113. doi:10.9767/bcrec.6.2.829.63-113] [How to Link / DOI: http://dx.doi.org/10.9767/bcrec.6.2.829.63-113 || or local: http://ejournal.undip.ac.id/index.php/bcrec/article/view/829]

  9. Review methods for image segmentation from computed tomography images

    International Nuclear Information System (INIS)

    Mamat, Nurwahidah; Rahman, Wan Eny Zarina Wan Abdul; Soh, Shaharuddin Cik; Mahmud, Rozi

    2014-01-01

    Image segmentation is a challenging process in order to get the accuracy of segmentation, automation and robustness especially in medical images. There exist many segmentation methods that can be implemented to medical images but not all methods are suitable. For the medical purposes, the aims of image segmentation are to study the anatomical structure, identify the region of interest, measure tissue volume to measure growth of tumor and help in treatment planning prior to radiation therapy. In this paper, we present a review method for segmentation purposes using Computed Tomography (CT) images. CT images has their own characteristics that affect the ability to visualize anatomic structures and pathologic features such as blurring of the image and visual noise. The details about the methods, the goodness and the problem incurred in the methods will be defined and explained. It is necessary to know the suitable segmentation method in order to get accurate segmentation. This paper can be a guide to researcher to choose the suitable segmentation method especially in segmenting the images from CT scan
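
    As a minimal example of one classical technique such a review covers, the sketch below implements Otsu's global threshold, which selects the grey level that maximises between-class variance; the input is a synthetic array standing in for a CT slice, not real patient data.

```python
# Otsu's global threshold: pick the grey level maximising between-class variance.
import numpy as np

def otsu_threshold(image, bins=256):
    hist, edges = np.histogram(image.ravel(), bins=bins)
    p = hist.astype(float) / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])

    best_t, best_var = centers[0], -1.0
    for k in range(1, bins):
        w0, w1 = p[:k].sum(), p[k:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (p[:k] * centers[:k]).sum() / w0
        mu1 = (p[k:] * centers[k:]).sum() / w1
        between_var = w0 * w1 * (mu0 - mu1) ** 2
        if between_var > best_var:
            best_var, best_t = between_var, centers[k]
    return best_t

rng = np.random.default_rng(0)
background = rng.normal(40, 10, size=(128, 128))   # soft-tissue-like intensities
lesion = rng.normal(120, 10, size=(32, 32))        # brighter region of interest
image = background.copy()
image[48:80, 48:80] = lesion

t = otsu_threshold(image)
mask = image > t
print(f"Otsu threshold: {t:.1f}, segmented pixels: {mask.sum()}")
```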

  10. Methods for Force Analysis of Overconstrained Parallel Mechanisms: A Review

    Science.gov (United States)

    Liu, Wen-Lan; Xu, Yun-Dou; Yao, Jian-Tao; Zhao, Yong-Sheng

    2017-11-01

    The force analysis of overconstrained PMs is relatively complex and difficult, for which the methods have always been a research hotspot. However, few literatures analyze the characteristics and application scopes of the various methods, which is not convenient for researchers and engineers to master and adopt them properly. A review of the methods for force analysis of both passive and active overconstrained PMs is presented. The existing force analysis methods for these two kinds of overconstrained PMs are classified according to their main ideas. Each category is briefly demonstrated and evaluated from such aspects as the calculation amount, the comprehensiveness of considering limbs' deformation, and the existence of explicit expressions of the solutions, which provides an important reference for researchers and engineers to quickly find a suitable method. The similarities and differences between the statically indeterminate problem of passive overconstrained PMs and that of active overconstrained PMs are discussed, and a universal method for these two kinds of overconstrained PMs is pointed out. The existing deficiencies and development directions of the force analysis methods for overconstrained systems are indicated based on the overview.
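
    One widely cited idea in the deformation-compatibility family can be sketched in a few lines: when there are more limbs than platform degrees of freedom, a common solution distributes the external wrench among the limbs in proportion to their stiffness, f = K J^T (J K J^T)^{-1} w. The planar geometry and stiffness values below are assumed for illustration and do not reproduce any specific mechanism from the review.

```python
# Toy instance of the stiffness-weighted force distribution for a passive
# overconstrained mechanism: limb forces f satisfying J f = w are not unique,
# and one common solution is f = K J^T (J K J^T)^{-1} w.
import numpy as np

# Planar point platform (2 DOF) supported by 3 axial limbs -> 1 redundant limb.
angles = np.deg2rad([90.0, 210.0, 330.0])          # limb directions (assumed)
J = np.vstack([np.cos(angles), np.sin(angles)])    # 2x3 equilibrium matrix
K = np.diag([2.0e6, 1.0e6, 1.0e6])                 # limb axial stiffnesses, N/m (assumed)
w = np.array([100.0, 50.0])                        # external load on the platform, N

f = K @ J.T @ np.linalg.solve(J @ K @ J.T, w)      # stiffness-weighted limb forces

print("Limb forces [N]:", np.round(f, 2))
print("Equilibrium check J f =", np.round(J @ f, 6), "vs w =", w)
```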

  11. Participatory methods in pediatric participatory research: a systematic review.

    Science.gov (United States)

    Haijes, Hanneke A; van Thiel, Ghislaine J M W

    2016-05-01

    Meaningful child participation in medical research is seen as important. In order to facilitate further development of participatory research, we performed a systematic literature study to describe and assess the available knowledge on participatory methods in pediatric research. A search was executed in five databases: PubMed, CINAHL, PsycINFO, Scopus, and Cochrane. After careful screening of relevant papers, finally 24 documents were included in our analysis. Literature on participatory methods in pediatric research appears generally to be descriptive, whereby high-quality evidence is lacking. Overall, five groups of participatory methods for children could be distinguished: observational, verbal, written, visual, and active methods. The choice for one of these methods should be based on the child's age, on social and demographic characteristics, and on the research objectives. To date, these methods are still solely used for obtaining data, yet they are suitable for conducting meaningful participation. This may result in a successful partnership between children and researchers. Researchers conducting participatory research with children can use this systematic review in order to weigh the current knowledge about the participatory methods presented.

  12. Methods for certification in colonoscopy - a systematic review

    DEFF Research Database (Denmark)

    Preisler, Louise; Svendsen, Morten Bo Søndergaard; Svendsen, Lars Bo

    2018-01-01

    INTRODUCTION: Reliable, valid, and feasible assessment tools are essential to ensure competence in colonoscopy. This study aims to provide an overview of the existing assessment methods and the validity evidence that supports them. METHODS: A systematic search was conducted in October 2016. PubMed, EMBASE, and PsycINFO were searched for studies evaluating assessment methods to ensure competency in colonoscopy. Outcome variables were described and evidence of validity was explored using a contemporary framework. RESULTS: Twenty-five observational studies were included in the systematic review. Most studies were based on small sample sizes. The studies were categorized by outcome measures into five groups: clinical process related outcome metrics (n = 2), direct observational colonoscopy assessment (n = 8), simulator based metrics (n = 11), automatic computerized metrics (n = 2), and self

  13. A review of common methods to convert morphine to methadone

    Directory of Open Access Journals (Sweden)

    Eric Wong

    2013-01-01

    Full Text Available When dosed appropriately on carefully chosen patients, methadone can be a very safe and effective choice in managing chronic pain. Many authors have discussed important issues surrounding patient selection, drug interactions, screening for QTc prolongation and monitoring. This article will focus on the dosing dilemma that exists after the patient is deemed an appropriate candidate for methadone and a conversion is necessary from another opioid. Despite many publications dedicated to addressing this challenging topic, there is no consensus on the most appropriate method for converting an opioid regimen to methadone. Given the lack of concrete guidance, clinicians in a community setting are likely to be faced with an increased challenge if there are no available pain specialists to provide clinical support. Common methods for converting morphine to methadone will be reviewed and two clinical patient scenarios used to illustrate the outcomes of applying the methods.

  14. Economic evaluation in patient safety: a literature review of methods.

    Science.gov (United States)

    de Rezende, Bruna Alves; Or, Zeynep; Com-Ruelle, Laure; Michel, Philippe

    2012-06-01

    Patient safety practices, targeting organisational changes for improving patient safety, are implemented worldwide but their costs are rarely evaluated. This paper provides a review of the methods used in economic evaluation of such practices. International medical and economics databases were searched for peer-reviewed publications on economic evaluations of patient safety between 2000 and 2010 in English and French. This was complemented by a manual search of the reference lists of relevant papers. Grey literature was excluded. Studies were described using a standardised template and assessed independently by two researchers according to six quality criteria. 33 articles were reviewed that were representative of different patient safety domains, data types and evaluation methods. 18 estimated the economic burden of adverse events, 3 measured the costs of patient safety practices and 12 provided complete economic evaluations. Healthcare-associated infections were the most common subject of evaluation, followed by medication-related errors and all types of adverse events. Of these, 10 were selected that had adequately fulfilled one or several key quality criteria for illustration. This review shows that full cost-benefit/utility evaluations are rarely completed as they are resource intensive and often require unavailable data; some overcome these difficulties by performing stochastic modelling and by using secondary sources. Low methodological transparency can be a problem for building evidence from available economic evaluations. Investing in the economic design and reporting of studies with more emphasis on defining study perspectives, data collection and methodological choices could be helpful for strengthening our knowledge base on practices for improving patient safety.

  15. A Review on Different Virtual Learning Methods in Pharmacy Education

    Directory of Open Access Journals (Sweden)

    Amin Noori

    2015-10-01

    Full Text Available Virtual learning is a type of electronic learning system based on the web. It models traditional in-person learning by providing virtual access to classes, tests, homework, feedback, etc. Students and teachers can interact through chat rooms or other virtual environments. Web 2.0 services are usually used for this method. Internet audio-visual tools, multimedia systems, discs, CD-ROMs, videotapes, animation, video conferencing, and interactive phones can all be used to deliver data to the students. E-learning can occur in or out of the classroom. It is time saving with lower costs compared to traditional methods. It can be self-paced, it is suitable for distance learning and it is flexible. It is a great learning style for continuing education and students can independently solve their problems, but it has its disadvantages too. Thereby, blended learning (a combination of conventional and virtual education) is being used worldwide and has improved the knowledge, skills and confidence of pharmacy students. The aim of this study is to review, discuss and introduce different methods of virtual learning for pharmacy students. Google Scholar, PubMed and Scopus databases were searched for topics related to virtual, electronic and blended learning, and different styles like computer simulators, virtual practice environment technology, virtual mentor, virtual patient, 3D simulators, etc. are discussed in this article. Our review of different studies in these areas shows that students are highly satisfied with virtual and blended types of learning.

  16. Streamlining Appointment, Promotion, and Tenure Procedures to Promote Early-Career Faculty Success.

    Science.gov (United States)

    Smith, Shannon B; Hollerbach, Ann; Donato, Annemarie Sipkes; Edlund, Barbara J; Atz, Teresa; Kelechi, Teresa J

    2016-01-01

    A critical component of the progression of a successful academic career is being promoted in rank. Early-career faculty are required to have an understanding of appointment, promotion, and tenure (APT) guidelines, but many factors often impede this understanding, thwarting a smooth and planned promotion pathway for professional advancement. This article outlines the steps taken by an APT committee to improve the promotion process from instructor to assistant professor. Six sigma's DMAIC improvement model was selected as the guiding operational framework to remove variation in the promotion process. After faculty handbook revisions were made, several checklists developed, and a process review rubric was implemented; recently promoted faculty were surveyed on satisfaction with the process. Faculty opinions captured in the survey suggest increased transparency in the process and perceived support offered by the APT committee. Positive outcomes include a strengthened faculty support framework, streamlined promotion processes, and improved faculty satisfaction. Changes to the APT processes resulted in an unambiguous and standardized pathway for successful promotion. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Sonochemical Method for Casting the Polymer Nanocomposites: A Mini Review

    Directory of Open Access Journals (Sweden)

    D. Arthisree

    2018-04-01

    Full Text Available The present nanoscience domain focuses on sample preparation and the initiation of chemical reactions achieved by several techniques based on the principle of the cavitation process driven at ultrasonic frequencies (sonochemical routes). Sonochemical routes are highly advantageous in reaction methods, for example by triggering reaction pathways and inducing rapid reactions through inter-particle collision. In polymers, high-intensity ultrasound waves are used for the polymerization of monomers by a step-growth process. This review is an outlook on the sonochemical approach for polymer nanocomposites, covering the physics of ultrasonic frequency bands, the chemical reactions involved, and the properties of acoustic cavitation that are highly applicable to the development of modern target materials.

  18. Review of training methods employed in nuclear fuel fabrication plants

    International Nuclear Information System (INIS)

    Box, W.D.; Browder, F.N.

    1975-01-01

    A search of the literature through the Nuclear Safety Information Center revealed that 86 percent of the incidents that have occurred in fuel fabrication plants can be traced directly or indirectly to insufficient operator training. In view of these findings, a review was made of the training programs now employed by the nuclear fuel fabrication industry. Most companies give the new employee approximately 20 hours of orientation courses, followed by 60 to 80 hours of on-the-job training. It was concluded that these training programs should be expanded in both scope and depth. A proposed program is outlined to offer guidance in improving the basic methods currently in use

  19. DFRFT: A Classified Review of Recent Methods with Its Application

    Directory of Open Access Journals (Sweden)

    Ashutosh Kumar Singh

    2013-01-01

    Full Text Available In the literature, there are various algorithms available for computing the discrete fractional Fourier transform (DFRFT. In this paper, all the existing methods are reviewed, classified into four categories, and subsequently compared to find out the best alternative from the view point of minimal computational error, computational complexity, transform features, and additional features like security. Subsequently, the correlation theorem of FRFT has been utilized to remove significantly the Doppler shift caused due to motion of receiver in the DSB-SC AM signal. Finally, the role of DFRFT has been investigated in the area of steganography.

  20. [Research methods of carbon sequestration by soil aggregates: a review].

    Science.gov (United States)

    Chen, Xiao-Xia; Liang, Ai-Zhen; Zhang, Xiao-Ping

    2012-07-01

    To increase soil organic carbon content is critical for maintaining soil fertility and agricultural sustainable development and for mitigating increased greenhouse gases and the effects of global climate change. Soil aggregates are the main components of soil, and have significant effects on soil physical and chemical properties. The physical protection of soil organic carbon by soil aggregates is the important mechanism of soil carbon sequestration. This paper reviewed the organic carbon sequestration by soil aggregates, and introduced the classic and current methods in studying the mechanisms of carbon sequestration by soil aggregates. The main problems and further research trends in this study field were also discussed.

  1. A review of zinc oxide mineral beneficiation using flotation method.

    Science.gov (United States)

    Ejtemaei, Majid; Gharabaghi, Mahdi; Irannajad, Mehdi

    2014-04-01

    In recent years, extraction of zinc from low-grade mining tailings of oxidized zinc has been a matter of discussion. This is a material which can be processed by flotation and acid-leaching methods. Owing to the similarities in the physicochemical and surface chemistry of the constituent minerals, separation of zinc oxide minerals from their gangues by flotation is an extremely complex process. It appears that selective leaching is a promising method for the beneficiation of this type of ore. However, with the high consumption of leaching acid, the treatment of low-grade oxidized zinc ores by hydrometallurgical methods is expensive and complex. Hence, it is best to pre-concentrate low-grade oxidized zinc by flotation and then to employ hydrometallurgical methods. This paper presents a critical review on the zinc oxide mineral flotation technique. In this paper, the various flotation methods of zinc oxide minerals which have been proposed in the literature have been detailed with the aim of identifying the important factors involved in the flotation process. The various aspects of recovery of zinc from these minerals are also dealt with here. The literature indicates that the collector type, sulfidizing agent, pH regulator, depressants and dispersants types, temperature, solid pulp concentration, and desliming are important parameters in the process. The range and optimum values of these parameters, as also the adsorption mechanism, together with the resultant flotation of the zinc oxide minerals reported in the literature are summarized and highlighted in the paper. This review presents a comprehensive scientific guide to the effectiveness of flotation strategy. Copyright © 2013 Elsevier B.V. All rights reserved.

  2. Efficacy of contraceptive methods: a review of the literature.

    Science.gov (United States)

    Mansour, Diana; Inki, Pirjo; Gemzell-Danielsson, Kristina

    2010-12-01

    To provide a comprehensive and objective summary of contraceptive failure rates for a variety of methods based on a systematic review of the literature. Medline and Embase were searched using the Ovid interface from January 1990 to February 2008, as well as the reference lists of published articles, to identify studies reporting contraceptive efficacy as a Pearl Index or life-table estimate. Reports that recruited less than 400 subjects per study group and those covering less than six cycles/six months were excluded. In addition, unlicensed products or those not internationally available, emergency contraception, and vasectomy studies were excluded. Information was identified and extracted from 139 studies. One-year Pearl Indices reported for short-acting user-dependent hormonal methods were generally less than 2.5. Gross life-table rates for long-acting hormonal methods (implants and the levonorgestrel releasing-intrauterine system [LNG-IUS]) generally ranged between 0-0.6 per 100 at one year, but wider ranges (0.1-1.5 per 100) were observed for the copper intrauterine devices (0.1-1.4 per 100 for Cu-IUDs with surface area ≥ 300 mm2 and 0.6-1.5 per 100 for those with surface area < 300 mm2); natural methods were the least effective. Our review broadly confirms the hierarchy of contraceptive effectiveness in descending order as: (1) female sterilisation, long-acting hormonal contraceptives (LNG-IUS and implants); (2) Cu-IUDs with ≥ 300 mm2 surface area; (3) Cu-IUDs with < 300 mm2 surface area and short-acting hormonal methods; and, least effective, natural methods.
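
    The Pearl Index used throughout this literature is conventionally the number of unintended pregnancies per 100 woman-years of exposure; the sketch below shows the arithmetic with wholly invented trial numbers.

```python
# Pearl Index: unintended pregnancies per 100 woman-years of exposure.
# The trial numbers below are invented solely to show the arithmetic.
def pearl_index(pregnancies, woman_months_of_exposure):
    """Pregnancies per 100 woman-years (= 1200 woman-months) of exposure."""
    return pregnancies * 1200.0 / woman_months_of_exposure

# Hypothetical study: 4 pregnancies observed over 5,000 woman-months of use.
print(f"Pearl Index = {pearl_index(4, 5000):.2f} per 100 woman-years")
```

    Life-table estimates, the other measure accepted in the review, handle varying lengths of follow-up more gracefully than this single aggregate rate.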

  3. A REVIEW ON EFFICACIOUS METHODS TO DECOLORIZE REACTIVE AZO DYE

    Directory of Open Access Journals (Sweden)

    Jagadeesan Vijayaraghavan

    2013-01-01

    Full Text Available This paper presents an intensive review of the reactive azo dye Reactive Black 5. Various physicochemical methods, namely photocatalysis, electrochemical treatment, adsorption and hydrolysis, and biological methods such as microbial degradation, biosorption and bioaccumulation, have been analyzed thoroughly along with the merits and demerits of each method. Among these various methods, biological treatment methods are found to be the best for decolorization of Reactive Black 5. With respect to dye biosorption, microbial biomass (bacteria, fungi, microalgae, etc.) has outperformed macroscopic materials (seaweeds, crab shell, etc.) used for the decolorization process. The use of living organisms may not be an option for the continuous treatment of highly toxic organic/inorganic contaminants. Once the toxicant concentration becomes too high or the process operates for a long time, the amount of toxicant accumulated will reach saturation. Beyond this point, an organism's metabolism may be interrupted, resulting in death of the organism. This scenario does not exist in the case of dead biomass, which is flexible to environmental conditions and toxicant concentrations. Thus, owing to its favorable characteristics, biosorption has received much attention in recent years.

  4. A Review of the Detection Methods for Climate Regime Shifts

    Directory of Open Access Journals (Sweden)

    Qunqun Liu

    2016-01-01

    Full Text Available An abrupt climate change means that the climate system shifts from one steady state to another steady state. Study of the phenomenon and theory of abrupt climate change is a new research field of modern climatology, and it is of great significance for the prediction of future climate change. The climate regime shift is one of the most common forms of abrupt climate change, and mainly refers to statistically significant changes in a variable of the climate system at a given time scale. Detection methods can be roughly divided into five categories based on the type of abrupt change, namely abrupt changes in mean value, in variance, in frequency, in probability density, and multivariable analysis. The main research progress on abrupt climate change detection methods is reviewed, and some applications of those methods to observational data are provided. With the development of nonlinear science, many new methods for detecting abrupt dynamic changes have been presented in recent years, which are a useful supplement to the existing abrupt change detection methods.
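
    The simplest class of method listed above, detection of an abrupt change in mean value, can be illustrated with a sliding two-sample t-test; the window length and significance threshold below are arbitrary illustrative choices, not recommendations from the review:

        import numpy as np
        from scipy import stats

        def sliding_t_test(x, window=15, alpha=0.01):
            """Indices where the means of the preceding and following windows
            differ significantly (candidate abrupt mean-value changes)."""
            shifts = []
            for i in range(window, len(x) - window):
                before, after = x[i - window:i], x[i:i + window]
                _, p = stats.ttest_ind(before, after, equal_var=False)
                if p < alpha:
                    shifts.append(i)
            return shifts

        # Synthetic series with a mean shift imposed at index 60.
        rng = np.random.default_rng(0)
        series = np.concatenate([rng.normal(0.0, 1.0, 60), rng.normal(1.5, 1.0, 60)])
        print(sliding_t_test(series)[:5])  # indices clustered around the true shift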

  5. Analytical methods for determination of mycotoxins: a review.

    Science.gov (United States)

    Turner, Nicholas W; Subrahmanyam, Sreenath; Piletsky, Sergey A

    2009-01-26

    Mycotoxins are small (MW approximately 700), toxic chemical products formed as secondary metabolites by a few fungal species that readily colonise crops and contaminate them with toxins in the field or after harvest. Ochratoxins and aflatoxins are mycotoxins of major significance, and hence there has been significant research on a broad range of analytical and detection techniques that could be useful and practical. Due to the variety of structures of these toxins, it is impossible to use one standard technique for analysis and/or detection. Practical requirements for high-sensitivity analysis and the need for a specialist laboratory setting create challenges for routine analysis. Several existing analytical techniques, which offer flexible and broad-based methods of analysis and in some cases detection, are discussed in this manuscript. There are a number of methods in use, many of which are lab-based, but to our knowledge no single technique stands out above the rest, although analytical liquid chromatography, commonly linked with mass spectrometry, is likely to be popular. This review discusses (a) sample pre-treatment methods such as liquid-liquid extraction (LLE), supercritical fluid extraction (SFE) and solid-phase extraction (SPE); (b) separation methods such as thin-layer chromatography (TLC), high-performance liquid chromatography (HPLC), gas chromatography (GC) and capillary electrophoresis (CE); and (c) others such as ELISA. Current trends, advantages and disadvantages, and future prospects of these methods are also discussed.

  6. Review of track-fitting methods in counter experiments

    International Nuclear Information System (INIS)

    Regler, M.; Eichinger, H.

    1981-01-01

    We review track-fitting methods recently used in high-energy physics experiments. Assuming that the problem of pattern recognition, i.e. of grouping the often ambiguous coordinate information (as frequently measured by wire chambers) together to form track candidates, has already been solved, we try to point out the way to obtain the ultimate geometrical resolution with the smallest and fastest possible program; owing to the wide variety of detectors and experimental set-ups, no universal method has been found. Some applications will serve as examples, and based on the experience gained we will try to indicate when and under which conditions a known algorithm could be used, and this might even help in designing future experiments. (orig.)
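
    As background to the fits surveyed, a straight-line track fit to hits in parallel detector planes reduces to linear least squares; the sketch below uses synthetic hits with equal measurement errors and is not a specific algorithm from the review:

        import numpy as np

        # Synthetic hits: detector planes at positions z, measured coordinates y.
        z = np.array([0.0, 0.5, 1.0, 1.5, 2.0])            # plane positions (m)
        rng = np.random.default_rng(1)
        y_meas = 0.02 + 0.15 * z + rng.normal(0.0, 0.001, z.size)  # 1 mm resolution

        # Least-squares fit of y = a + b*z via the design-matrix formulation.
        A = np.column_stack([np.ones_like(z), z])
        coeffs, *_ = np.linalg.lstsq(A, y_meas, rcond=None)
        a, b = coeffs
        chi2 = np.sum(((y_meas - (a + b * z)) / 0.001) ** 2)
        print(f"intercept={a:.4f} m, slope={b:.4f}, chi2={chi2:.1f} (ndf={z.size - 2})")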

  7. Different types of anastomotic methods: a review of literature

    Directory of Open Access Journals (Sweden)

    Shadi Mooloughi

    2015-09-01

    Full Text Available Constructing a successful anastomosis is an important concept in gastrointestinal tract surgery, and can be affected by various factors such as the preoperative bowel condition, intra- and postoperative complications, bleeding and the device characteristics. Suturing, stapling and compression anastomosis are different techniques. Despite the invention of compression anastomosis, which goes back almost two centuries, this method has not attained the popularity of suturing and stapling anastomoses, and further studies are required. Designing methods and devices without these drawbacks might reduce the complications associated with anastomosis and provide alternatives to sutured and stapled anastomoses. Several materials can be used as reinforcement materials, which can improve the outcomes of stapled anastomoses. In addition to reinforcement materials, other forms of support have been proposed, which might be capable of reducing the postoperative complications of anastomosis. In this study, we briefly review various types of anastomotic techniques and associated complications in different types of gastrointestinal surgery.

  8. Applying systems ergonomics methods in sport: A systematic review.

    Science.gov (United States)

    Hulme, Adam; Thompson, Jason; Plant, Katherine L; Read, Gemma J M; Mclean, Scott; Clacy, Amanda; Salmon, Paul M

    2018-04-16

    As sports systems become increasingly more complex, competitive, and technology-centric, there is a greater need for systems ergonomics methods to consider the performance, health, and safety of athletes in context with the wider settings in which they operate. Therefore, the purpose of this systematic review was to identify and critically evaluate studies which have applied a systems ergonomics research approach in the context of sports performance and injury management. Five databases (PubMed, Scopus, ScienceDirect, Web of Science, and SPORTDiscus) were searched for the dates 01 January 1990 to 01 August 2017, inclusive, for original peer-reviewed journal articles and conference papers. Reported analyses were underpinned by a recognised systems ergonomics method, and study aims were related to the optimisation of sports performance (e.g. communication, playing style, technique, tactics, or equipment), and/or the management of sports injury (i.e. identification, prevention, or treatment). A total of seven articles were identified. Two articles were focussed on understanding and optimising sports performance, whereas five examined sports injury management. The methods used were the Event Analysis of Systemic Teamwork, Cognitive Work Analysis (the Work Domain Analysis Abstraction Hierarchy), Rasmussen's Risk Management Framework, and the Systems Theoretic Accident Model and Processes method. The individual sport application was distance running, whereas the team sports contexts examined were cycling, football, Australian Football League, and rugby union. The included systems ergonomics applications were highly flexible, covering both amateur and elite sports contexts. The studies were rated as valuable, providing descriptions of injury controls and causation, the factors influencing injury management, the allocation of responsibilities for injury prevention, as well as the factors and their interactions underpinning sports performance. Implications and future

  9. Available Prediction Methods for Corrosion under Insulation (CUI): A Review

    Directory of Open Access Journals (Sweden)

    Burhani Nurul Rawaida Ain

    2014-07-01

    Full Text Available Corrosion under insulation (CUI) is an increasingly important issue for piping in industry, especially in petrochemical and chemical plants, owing to the unexpected catastrophic failures it can cause. Therefore, attention to the maintenance and prediction of CUI occurrence, particularly of corrosion rates, has grown in recent years. In this study, a literature review was carried out on determining corrosion rates using various prediction models and on methods for assessing corrosion between the external surface of piping and its insulation. The prediction models and methods available are presented for future research reference. However, most of the available prediction methods are based only on local industrial data, which may differ with plant location, environment, temperature and many other factors, affecting the reliability of the models developed. Thus, such models or methods are more reliable when supported by laboratory testing or simulation that includes the factors promoting CUI, such as environmental temperature, insulation type and operating temperature.

  10. Halal and kosher slaughter methods and meat quality: a review.

    Science.gov (United States)

    Farouk, M M; Al-Mazeedi, H M; Sabow, A B; Bekhit, A E D; Adeyemi, K D; Sazili, A Q; Ghani, A

    2014-11-01

    There are many slaughter procedures that religions and cultures use around the world. The two that are commercially relevant are the halal and kosher methods practiced by Muslims and Jews respectively. The global trade in red meat and poultry produced using these two methods is substantial, thus the importance of the quality of the meat produced using the methods. Halal and kosher slaughter per se should not affect meat quality more than their industrial equivalents, however, some of their associated pre- and post-slaughter processes do. For instance, the slow decline in blood pressure following a halal pre-slaughter head-only stun and neck cut causes blood splash (ecchymosis) in a range of muscles and organs of slaughtered livestock. Other quality concerns include bruising, hemorrhages, skin discoloration and broken bones particularly in poultry. In addition to these conventional quality issues, the "spiritual quality" of the meat can also be affected when the halal and kosher religious requirements are not fully met during the slaughter process. The nature, causes, importance and mitigations of these and other quality issues related to halal and kosher slaughtering and meat production using these methods are the subjects of this review. Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. Effectiveness of and obstacles to antibiotic streamlining to amoxicillin monotherapy in bacteremic pneumococcal pneumonia.

    Science.gov (United States)

    Blot, Mathieu; Pivot, Diane; Bourredjem, Abderrahmane; Salmon-Rousseau, Arnaud; de Curraize, Claire; Croisier, Delphine; Chavanet, Pascal; Binquet, Christine; Piroth, Lionel

    2017-09-01

    Antibiotic streamlining is pivotal to reduce the emergence of resistant bacteria. However, whether streamlining is frequently performed and safe in difficult situations, such as bacteremic pneumococcal pneumonia (BPP), has still to be assessed. All adult patients admitted to Dijon Hospital (France) from 2005 to 2013 who had BPP without complications, and were alive on the third day were enrolled. Clinical, biological, radiological, microbiological and therapeutic data were recorded. A first analysis was conducted to assess factors associated with being on amoxicillin on the third day. A second analysis, adjusting for a propensity score, was performed to determine whether 30-day mortality was associated with streamlining to amoxicillin monotherapy. Of the 196 patients hospitalized for BPP, 161 were still alive on the third day and were included in the study. Treatment was streamlined to amoxicillin in 60 patients (37%). Factors associated with not streamlining were severe pneumonia (OR 3.11, 95%CI [1.23-7.87]) and a first-line antibiotic combination (OR 3.08, 95%CI [1.34-7.09]). By contrast, starting with amoxicillin monotherapy correlated inversely with the risk of subsequent treatment with antibiotics other than amoxicillin (OR 0.06, 95%CI [0.01-0.30]). The Cox model adjusted for the propensity-score analysis showed that streamlining to amoxicillin during BPP was not significantly associated with a higher risk of 30-day mortality (HR 0.38, 95%CI [0.08-1.87]). Streamlining to amoxicillin is insufficiently implemented during BPP. This strategy is safe and potentially associated with ecological and economic benefits; therefore, it should be further encouraged, particularly when antibiotic combinations are started for severe pneumonia. Copyright © 2017. Published by Elsevier B.V.
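
    The propensity-score adjustment described above can be sketched generically: model the probability of being streamlined from a confounder, then reweight the groups before comparing outcomes. The synthetic data, single confounder and scikit-learn model below are illustrative assumptions, not the authors' analysis:

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(42)
        n = 500
        severity = rng.normal(size=n)                              # confounder
        streamlined = rng.binomial(1, 1 / (1 + np.exp(severity)))  # less likely if severe
        death = rng.binomial(1, 1 / (1 + np.exp(3 - severity)))    # outcome driven by severity

        # Propensity of being streamlined given the confounder.
        X = severity.reshape(-1, 1)
        ps = LogisticRegression().fit(X, streamlined).predict_proba(X)[:, 1]

        # Inverse-probability weights balance the groups before comparing mortality.
        w = np.where(streamlined == 1, 1 / ps, 1 / (1 - ps))
        def weighted_rate(group):
            mask = streamlined == group
            return np.average(death[mask], weights=w[mask])
        print(f"weighted mortality: streamlined {weighted_rate(1):.3f} vs not {weighted_rate(0):.3f}")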

  12. Review of Congestion Management Methods for Distribution Networks with High Penetration of Distributed Energy Resources

    DEFF Research Database (Denmark)

    Huang, Shaojun; Wu, Qiuwei; Liu, Zhaoxi

    2014-01-01

    This paper reviews the existing congestion management methods for distribution networks with high penetration of DERs documented in the recent research literature. The congestion management methods reviewed can be grouped into two categories: market methods and direct control methods. The market methods consist of the dynamic tariff, distribution capacity market, shadow price and flexible service market. The direct control methods comprise network reconfiguration, reactive power control and active power control. Based on the review of the existing methods...

  13. A brief review of other notable protein blotting methods.

    Science.gov (United States)

    Kurien, Biji T; Scofield, R Hal

    2009-01-01

    A plethora of methods have been used for transferring proteins from the gel to the membrane. These include centrifuge blotting, electroblotting of proteins to Teflon tape and membranes for N- and C-terminal sequence analysis, multiple tissue blotting, a two-step transfer of low and high molecular weight proteins, blotting of Coomassie Brilliant Blue (CBB)-stained proteins from polyacrylamide gels to transparencies, acid electroblotting onto activated glass, membrane-array method for the detection of human intestinal bacteria in fecal samples, protein microarray using a new black cellulose nitrate support, electrotransfer using square wave alternating voltage for enhanced protein recovery, polyethylene glycol-mediated significant enhancement of the immunoblotting transfer, parallel protein chemical processing before and during western blot and the molecular scanner concept, electronic western blot of matrix-assisted laser desorption/ionization (MALDI) mass spectrometry-identified polypeptides from parallel processed gel-separated proteins, semidry electroblotting of peptides and proteins from acid-urea polyacrylamide gels, transfer of silver-stained proteins from polyacrylamide gels to polyvinylidene difluoride (PVDF) membranes, and the display of K(+) channel proteins on a solid nitrocellulose support for assaying toxin binding. The quantification of proteins bound to PVDF membranes by elution of CBB, clarification of immunoblots on PVDF for transmission densitometry, gold coating of nonconductive membranes before MALDI tandem mass spectrometric analysis to prevent charging effect for analysis of peptides from PVDF membranes, and a simple method for coating native polysaccharides onto nitrocellulose are some of the methods involving either the manipulation of membranes with transferred proteins or just a passive transfer of antigens to membranes. All these methods are briefly reviewed in this chapter.

  14. Other notable protein blotting methods: a brief review.

    Science.gov (United States)

    Kurien, Biji T; Scofield, R Hal

    2015-01-01

    Proteins have been transferred from the gel to the membrane by a variety of methods. These include vacuum blotting, centrifuge blotting, electroblotting of proteins to Teflon tape and membranes for N- and C-terminal sequence analysis, multiple tissue blotting, a two-step transfer of low- and high-molecular-weight proteins, acid electroblotting onto activated glass, membrane-array method for the detection of human intestinal bacteria in fecal samples, protein microarray using a new black cellulose nitrate support, electrotransfer using square wave alternating voltage for enhanced protein recovery, polyethylene glycol-mediated significant enhancement of the immunoblotting transfer, parallel protein chemical processing before and during western blot and the molecular scanner concept, electronic western blot of matrix-assisted laser desorption/ionization mass spectrometric-identified polypeptides from parallel processed gel-separated proteins, semidry electroblotting of peptides and proteins from acid-urea polyacrylamide gels, transfer of silver-stained proteins from polyacrylamide gels to polyvinylidene difluoride (PVDF) membranes, and the display of K(+) channel proteins on a solid nitrocellulose support for assaying toxin binding. The quantification of proteins bound to PVDF membranes by elution of CBB, clarification of immunoblots on PVDF for transmission densitometry, gold coating of nonconductive membranes before matrix-assisted laser desorption/ionization tandem mass spectrometric analysis to prevent charging effect for analysis of peptides from PVDF membranes, and a simple method for coating native polysaccharides onto nitrocellulose are some of the methods involving either the manipulation of membranes with transferred proteins or just a passive transfer of antigens to membranes. All these methods are briefly reviewed in this chapter.

  15. Clinical tooth preparations and associated measuring methods: a systematic review.

    Science.gov (United States)

    Tiu, Janine; Al-Amleh, Basil; Waddell, J Neil; Duncan, Warwick J

    2015-03-01

    The geometries of tooth preparations are important features that aid in the retention and resistance of cemented complete crowns. The clinically relevant values and the methods used to measure them are not clear. The purpose of this systematic review was to retrieve, organize, and critically appraise studies measuring clinical tooth preparation parameters, specifically the methodology used to measure the preparation geometry. A database search was performed in Scopus, PubMed, and ScienceDirect, with an additional hand search, on December 5, 2013. The articles were screened against inclusion and exclusion criteria, and information regarding the total occlusal convergence (TOC) angle, margin design, and associated measuring methods was extracted. The values and associated measuring methods were tabulated. A total of 1006 publications were initially retrieved. After removing duplicates and filtering by using exclusion and inclusion criteria, 983 articles were excluded. Twenty-three articles reported clinical tooth preparation values. Twenty articles reported the TOC, 4 articles reported margin designs, 4 articles reported margin angles, and 3 articles reported the abutment height of preparations. A variety of methods were used to measure these parameters. TOC values seem to be the most important preparation parameter. Recommended TOC values have increased over the past 4 decades from an unachievable 2- to 5-degree taper to a more realistic 10 to 22 degrees. Recommended values are more likely to be achieved under experimental conditions if crown preparations are performed outside of the mouth. We recommend that a standardized measurement method based on the cross sections of crown preparations, together with standardized reporting, be developed for future studies analyzing preparation geometry. Copyright © 2015 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.

  16. A review of bioinformatic methods for forensic DNA analyses.

    Science.gov (United States)

    Liu, Yao-Yuan; Harbison, SallyAnn

    2018-03-01

    Short tandem repeats, single nucleotide polymorphisms, and whole mitochondrial analyses are three classes of markers which will play an important role in the future of forensic DNA typing. The arrival of massively parallel sequencing platforms in forensic science reveals new information such as insights into the complexity and variability of the markers that were previously unseen, along with amounts of data too immense for analyses by manual means. Along with the sequencing chemistries employed, bioinformatic methods are required to process and interpret this new and extensive data. As more is learnt about the use of these new technologies for forensic applications, development and standardization of efficient, favourable tools for each stage of data processing is being carried out, and faster, more accurate methods that improve on the original approaches have been developed. As forensic laboratories search for the optimal pipeline of tools, sequencer manufacturers have incorporated pipelines into sequencer software to make analyses convenient. This review explores the current state of bioinformatic methods and tools used for the analyses of forensic markers sequenced on the massively parallel sequencing (MPS) platforms currently most widely used. Copyright © 2017 Elsevier B.V. All rights reserved.
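
    As a toy illustration of one task such pipelines perform, calling the repeat number of a short tandem repeat locus from sequencing reads, the repeat motif can simply be counted per read; real MPS software is far more involved, and the motif and reads below are invented:

        import re
        from collections import Counter

        def str_allele(read, motif="GATA"):
            """Longest uninterrupted run of the repeat motif in a read."""
            runs = re.findall(f"(?:{motif})+", read)
            return max((len(r) // len(motif) for r in runs), default=0)

        # Synthetic reads from a heterozygous 11/8 locus.
        reads = ["TTC" + "GATA" * 11 + "ACG",
                 "TTC" + "GATA" * 11 + "ACG",
                 "TTC" + "GATA" * 8 + "ACG"]
        print(Counter(str_allele(r) for r in reads))   # Counter({11: 2, 8: 1})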

  17. Recovery process of elite athletes: A review of contemporary methods

    Directory of Open Access Journals (Sweden)

    Veljović Draško

    2012-01-01

    Full Text Available Numerous training stimuli, as well as competition, can reduce athletes' level of ability. This decline in performance can be a temporary phenomenon, lasting several minutes or several hours after a workout, or take much longer, even several days. Without an adequate recovery process, athletes may not be able to train at the desired intensity or to fully complete the tasks of the next training session. Chronic fatigue can lead to injuries, and therefore full recovery is necessary for achieving the optimal level of ability that will ensure better athletic performance. For these reasons, athletes often use a variety of techniques and methods aimed at recovery after training or a match. These have become a part of the training process, and their purpose is the reduction of stress and fatigue incurred as a result of daily exposure to intense training stimuli. There are numerous methods and techniques today that can accelerate the recovery process of athletes. For this reason it is necessary to know the efficiency of a given method before it is applied in the training process. The aim of this review article is to point out those currently used and their effects on the process of recovery after physical activity in elite sport.

  18. Review: Optimization methods for groundwater modeling and management

    Science.gov (United States)

    Yeh, William W.-G.

    2015-09-01

    Optimization methods have been used in groundwater modeling as well as for the planning and management of groundwater systems. This paper reviews and evaluates the various optimization methods that have been used for solving the inverse problem of parameter identification (estimation), experimental design, and groundwater planning and management. Various model selection criteria are discussed, as well as criteria used for model discrimination. The inverse problem of parameter identification concerns the optimal determination of model parameters using water-level observations. In general, the optimal experimental design seeks to find sampling strategies for the purpose of estimating the unknown model parameters. A typical objective of optimal conjunctive-use planning of surface water and groundwater is to minimize the operational costs of meeting water demand. The optimization methods include mathematical programming techniques such as linear programming, quadratic programming, dynamic programming, stochastic programming, nonlinear programming, and the global search algorithms such as genetic algorithms, simulated annealing, and tabu search. Emphasis is placed on groundwater flow problems as opposed to contaminant transport problems. A typical two-dimensional groundwater flow problem is used to explain the basic formulations and algorithms that have been used to solve the formulated optimization problems.
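
    A minimal sketch of the kind of linear program used for conjunctive-use planning, minimising operational cost subject to meeting demand; the costs, capacities and demand figures are invented for illustration:

        from scipy.optimize import linprog

        # Decision variables: [surface_water, groundwater] deliveries (million m^3/yr).
        costs = [20.0, 35.0]          # unit operational cost of each source
        demand = 120.0                # total demand to be met
        sw_cap, gw_cap = 80.0, 90.0   # source capacities

        # Minimise cost subject to meeting demand exactly and respecting capacities.
        res = linprog(c=costs,
                      A_eq=[[1.0, 1.0]], b_eq=[demand],
                      bounds=[(0.0, sw_cap), (0.0, gw_cap)])
        print(res.x, res.fun)  # optimal deliveries per source and minimum total cost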

  19. Methods of plant root exudates analysis: a review

    Directory of Open Access Journals (Sweden)

    Peter Dundek

    2011-01-01

    Full Text Available The aim of this review is to summarise current knowledge of the methods used to determine individual compounds and properties of water-soluble plant root exudates. These compounds include amino acids, organic acids and simple sugars, as well as polysaccharides, proteins and other organic substances. The qualitative composition of water-soluble root exudates and the exudation rate are commonly measured with the aim of subsequently preparing synthetic plant root exudates to be supplied to soil to create an artificial rhizosphere for different experimental purposes. Root exudate collection usually requires subsequent filtration or centrifugation to remove solids, root detritus and microbial cell debris, followed by concentration using an evaporator, lyophilizer or ultrafiltration. Methods used for the analysis of total groups of compounds (total proteins, total carbohydrates and total organic carbon) are simple. On the other hand, HPLC or GC/MS is commonly used to analyse individual low-molecular-weight organic molecules (sugars, organic acids and amino acids), with separation using different columns. Other properties such as pH, conductivity or the activity of different enzymes, as well as gel electrophoresis of proteins, are sometimes assessed. All of these methods are discussed in this work.

  20. Bioprofiling of unknown antibiotics in herbal extracts: Development of a streamlined direct bioautography using Bacillus subtilis linked to mass spectrometry.

    Science.gov (United States)

    Jamshidi-Aidji, Maryam; Morlock, Gertrud E

    2015-11-13

    Work in the field of profiling and identification of bioactive compounds in herbal extracts faces the challenge that common chromatographic methods do not directly link to bioactive compounds. Direct bioautography, the combination of TLC/HPTLC with bioassays, linked to structure-elucidating techniques, is demonstrated to overcome this challenge. The combination of TLC and a Bacillus subtilis bioassay has already been demonstrated to detect antibiotics in samples. However, previous studies in this field faced some challenges, such as being time-consuming, not producing a homogeneous plate background, or being restricted to non-acidic mobile phases. In this study, these aspects were investigated and a streamlined HPTLC-B. subtilis bioassay was developed that generated a homogeneous plate background, which was crucial to yield a good baseline for biodensitometry. Two commonly used broths for B. subtilis and a self-designed medium were compared with regard to their capability of detection and baseline noise. The workflow developed allowed the use of acidic mobile phases for the first time. To prove this, 20 herbal extracts, developed in parallel with an acidic mobile phase, were screened for antimicrobial substances. The main antimicrobial substance detected in Salvia officinalis tincture was further characterized by microchemical reactions, Aliivibrio fischeri, β-glucosidase and acetylcholinesterase (bio)assays as well as mass spectrometry. Scientists looking for new herbal-based medicines may benefit from this time-saving and streamlined bioactivity profiling. Copyright © 2015 Elsevier B.V. All rights reserved.

  1. Cardiogenic induction of pluripotent stem cells streamlined through a conserved SDF-1/VEGF/BMP2 integrated network.

    Directory of Open Access Journals (Sweden)

    Anca Chiriac

    Full Text Available BACKGROUND: Pluripotent stem cells produce tissue-specific lineages through programmed acquisition of sequential gene expression patterns that function as a blueprint for organ formation. As embryonic stem cells respond concomitantly to diverse signaling pathways during differentiation, extraction of a pro-cardiogenic network would offer a roadmap to streamline cardiac progenitor output. METHODS AND RESULTS: To resolve gene ontology priorities within precursor transcriptomes, cardiogenic subpopulations were here generated according to either growth factor guidance or stage-specific biomarker sorting. Innate expression profiles were independently delineated through unbiased systems biology mapping, and cross-referenced to filter transcriptional noise, unmasking a conserved progenitor motif (55 up- and 233 down-regulated genes). The streamlined pool of 288 genes organized into a core biological network that prioritized the "Cardiovascular Development" function. Recursive in silico deconvolution of the cardiogenic neighborhood and associated canonical signaling pathways identified a combination of integrated axes, CXCR4/SDF-1, Flk-1/VEGF and BMP2r/BMP2, predicted to synchronize cardiac specification. In vitro targeting of the resolved triad in embryoid bodies accelerated expression of Nkx2.5, Mef2C and cardiac-MHC, enhanced beating activity, and augmented cardiogenic yield. CONCLUSIONS: Transcriptome-wide dissection of a conserved progenitor profile thus revealed functional highways that coordinate cardiogenic maturation from a pluripotent ground state. Validating the bioinformatics algorithm established a strategy to rationally modulate cell fate, and optimize stem cell-derived cardiogenesis.

  2. Stabilization/Solidification Remediation Method for Contaminated Soil: A Review

    Science.gov (United States)

    Tajudin, S. A. A.; Azmi, M. A. M.; Nabila, A. T. A.

    2016-07-01

    Stabilization/solidification (S/S) is typically a process that involves mixing waste with binders to reduce contaminant leachability by physical and chemical means and to convert the waste into a form suitable for landfill or other disposal channels. Stabilization attempts to reduce the solubility or chemical reactivity of the waste by changing its physical and chemical properties, while solidification attempts to convert the waste into easily handled solids of low hazard level. These two processes are often discussed together, since they share the purpose of improving the containment of potential pollutants in treated wastes. The primary objective of this review is to examine the materials used as binders in the stabilization/solidification (S/S) method as well as the ability of these binders to remediate contaminated soils, especially those contaminated by heavy metals.

  3. Methods to estimate irrigated reference crop evapotranspiration - a review.

    Science.gov (United States)

    Kumar, R; Jat, M K; Shankar, V

    2012-01-01

    Efficient water management of crops requires accurate irrigation scheduling which, in turn, requires accurate measurement of crop water requirements. Irrigation is applied to replenish depleted moisture for optimum plant growth. Reference evapotranspiration plays an important role in the determination of water requirements for crops and irrigation scheduling. Various models/approaches, ranging from empirical to physically based distributed ones, are available for the estimation of reference evapotranspiration. Mathematical models are useful tools to estimate the evapotranspiration and water requirement of crops, which is essential information required to design or choose the best water management practices. In this paper the most commonly used models/approaches, which are suitable for the estimation of daily water requirements for agricultural crops grown in different agro-climatic regions, are reviewed. Further, an effort has been made to compare the accuracy of various widely used methods under different climatic conditions.
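
    At the empirical end of the spectrum of models reviewed, the Hargreaves-Samani equation estimates reference evapotranspiration from temperature and extraterrestrial radiation alone; a minimal sketch with illustrative inputs, assuming Ra is already expressed as equivalent evaporation in mm/day:

        def hargreaves_et0(t_mean, t_max, t_min, ra_mm_day):
            """Reference evapotranspiration (mm/day) via Hargreaves-Samani:
            ET0 = 0.0023 * Ra * (Tmean + 17.8) * sqrt(Tmax - Tmin)."""
            return 0.0023 * ra_mm_day * (t_mean + 17.8) * (t_max - t_min) ** 0.5

        # Illustrative mid-summer day: Tmean 25 C, Tmax 32 C, Tmin 18 C, Ra ~ 16.5 mm/day.
        print(round(hargreaves_et0(25.0, 32.0, 18.0, 16.5), 2))  # about 6.1 mm/day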

  4. Review of training methods employed in nuclear fuel fabrication plants

    International Nuclear Information System (INIS)

    Box, W.D.; Browder, F.N.

    A search of the literature through the Nuclear Safety Information Center revealed that approximately 86 percent of the incidents that have occurred in fuel fabrication plants can be traced directly or indirectly to insufficient operator training. In view of these findings, a review was made of the training programs now employed by the nuclear fuel fabrication industry. Most companies give the new employee approximately 20 h of orientation courses, followed by 60 to 80 h of on-the-job training. It was concluded that these training programs should be expanded in both scope and depth. A proposed program is outlined to offer guidance in improving the basic methods currently in use. (U.S.)

  5. A review of teaching methods and outcomes of resident phacoemulsification.

    Science.gov (United States)

    Kaplowitz, Kevin; Yazdanie, Mohammad; Abazari, Azin

    Cataract surgery with phacoemulsification is a challenging procedure for surgeons in training to learn to perform safely, efficiently, and effectively. We review the auxiliary learning tools outside the operating room that residency programs have incorporated into their curricula to improve surgical skills, including wet laboratories and surgical simulators. We then discuss different methods of teaching cataract surgery in the operating room. Our goal is to define a learning curve for cataract surgery. We demonstrate that complication rates decline significantly after a resident performs an average of 70 cases. We summarize the reported incidence and risk factors for complications in resident-performed cataract surgery to help identify cases that require a higher level of skill, in order to improve visual outcomes. We suggest that future studies include details on preoperative comorbidities, risk stratification, resident skill level, and the frequency of takeover by the attending surgeon. Published by Elsevier Inc.

  6. Streamlining the Design-to-Build Transition with Build-Optimization Software Tools.

    Science.gov (United States)

    Oberortner, Ernst; Cheng, Jan-Fang; Hillson, Nathan J; Deutsch, Samuel

    2017-03-17

    Scaling-up capabilities for the design, build, and test of synthetic biology constructs holds great promise for the development of new applications in fuels, chemical production, or cellular-behavior engineering. Construct design is an essential component in this process; however, not every designed DNA sequence can be readily manufactured, even using state-of-the-art DNA synthesis methods. Current biological computer-aided design and manufacture tools (bioCAD/CAM) do not adequately consider the limitations of DNA synthesis technologies when generating their outputs. Designed sequences that violate DNA synthesis constraints may require substantial sequence redesign or lead to price-premiums and temporal delays, which adversely impact the efficiency of the DNA manufacturing process. We have developed a suite of build-optimization software tools (BOOST) to streamline the design-build transition in synthetic biology engineering workflows. BOOST incorporates knowledge of DNA synthesis success determinants into the design process to output ready-to-build sequences, preempting the need for sequence redesign. The BOOST web application is available at https://boost.jgi.doe.gov and its Application Program Interfaces (API) enable integration into automated, customized DNA design processes. The herein presented results highlight the effectiveness of BOOST in reducing DNA synthesis costs and timelines.
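
    BOOST's actual rule set is not reproduced here, but the kind of synthesis-constraint screening it automates can be sketched generically; the GC-content limits and homopolymer threshold below are illustrative assumptions, not the tool's defaults:

        import re

        def synthesis_issues(seq, gc_min=0.25, gc_max=0.75, max_homopolymer=8):
            """Flag two common DNA-synthesis problems: extreme global GC content
            and long single-base runs (illustrative thresholds only)."""
            seq = seq.upper()
            issues = []
            gc = (seq.count("G") + seq.count("C")) / len(seq)
            if not gc_min <= gc <= gc_max:
                issues.append(f"GC content {gc:.2f} outside [{gc_min}, {gc_max}]")
            pattern = "|".join(f"{base}{{{max_homopolymer},}}" for base in "ACGT")
            run = re.search(pattern, seq)
            if run:
                issues.append(f"homopolymer run of length {len(run.group(0))} at {run.start()}")
            return issues

        print(synthesis_issues("ATGC" * 10 + "A" * 12 + "GGCC" * 5))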

  7. Inactivation Methods of Trypsin Inhibitor in Legumes: A Review.

    Science.gov (United States)

    Avilés-Gaxiola, Sara; Chuck-Hernández, Cristina; Serna Saldívar, Sergio O

    2018-01-01

    Seed legumes have played a major role as a crop worldwide, being cultivated on about 12% to 15% of Earth's arable land; nevertheless, their use is limited by, among other things, the presence of several antinutritional factors (ANFs), naturally occurring metabolites that the plant produces to protect itself from pest attacks. Trypsin inhibitors (TIs) are one of the most relevant ANFs because they reduce the digestion and absorption of dietary proteins. Several methods have been developed to inactivate TIs, and of these, thermal treatments are the most commonly used; however, they cause loss of nutrients, affect functional properties, and require high amounts of energy. Given the above, new processes have emerged to improve the nutritional quality of legumes while trying to solve the problems caused by the use of thermal treatments. This review examines and discusses the methods developed by researchers to inactivate the TIs present in legumes and their effects on nutritional and functional properties. © 2017 Institute of Food Technologists®.

  8. Dental ceramics: a review of new materials and processing methods.

    Science.gov (United States)

    Silva, Lucas Hian da; Lima, Erick de; Miranda, Ranulfo Benedito de Paula; Favero, Stéphanie Soares; Lohbauer, Ulrich; Cesar, Paulo Francisco

    2017-08-28

    The evolution of computerized systems for the production of dental restorations associated to the development of novel microstructures for ceramic materials has caused an important change in the clinical workflow for dentists and technicians, as well as in the treatment options offered to patients. New microstructures have also been developed by the industry in order to offer ceramic and composite materials with optimized properties, i.e., good mechanical properties, appropriate wear behavior and acceptable aesthetic characteristics. The objective of this literature review is to discuss the main advantages and disadvantages of the new ceramic systems and processing methods. The manuscript is divided in five parts: I) monolithic zirconia restorations; II) multilayered dental prostheses; III) new glass-ceramics; IV) polymer infiltrated ceramics; and V) novel processing technologies. Dental ceramics and processing technologies have evolved significantly in the past ten years, with most of the evolution being related to new microstructures and CAD-CAM methods. In addition, a trend towards the use of monolithic restorations has changed the way clinicians produce all-ceramic dental prostheses, since the more aesthetic multilayered restorations unfortunately are more prone to chipping or delamination. Composite materials processed via CAD-CAM have become an interesting option, as they have intermediate properties between ceramics and polymers and are more easily milled and polished.

  9. Dimension reduction methods for microarray data: a review

    Directory of Open Access Journals (Sweden)

    Rabia Aziz

    2017-03-01

    Full Text Available Dimension reduction has become inevitable for the pre-processing of high-dimensional data. "Gene expression microarray data" is an instance of such high-dimensional data. Gene expression microarray data measures the expression of a very large number of genes (features) simultaneously at a molecular level, in a very small number of samples. This copious number of genes is usually provided to a learning algorithm to produce a complete characterization of the classification task. However, most of the time the majority of the genes are irrelevant or redundant to the learning task. This deteriorates learning accuracy and training speed and leads to the problem of overfitting. Thus, dimension reduction of microarray data is a crucial preprocessing step for the prediction and classification of disease. Various feature selection and feature extraction techniques have been proposed in the literature to identify the genes that have a direct impact on the various machine learning algorithms for classification and to eliminate the remaining ones. This paper describes the taxonomy of dimension reduction methods with their characteristics, evaluation criteria, advantages and disadvantages. It also presents a review of numerous dimension reduction approaches for microarray data, mainly those methods that have been proposed over the past few years.
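
    As a concrete instance of the feature-extraction branch of this taxonomy, principal component analysis compresses a samples-by-genes expression matrix into a handful of components before classification; random data stands in for a real microarray in this sketch:

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(7)
        X = rng.normal(size=(40, 5000))   # 40 samples x 5000 gene-expression features

        # Project onto the first 10 principal components (features << genes).
        pca = PCA(n_components=10)
        X_reduced = pca.fit_transform(X)

        print(X_reduced.shape)                        # (40, 10)
        print(pca.explained_variance_ratio_.sum())    # fraction of variance retained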

  10. Modern acupuncture-like stimulation methods: a literature review

    Directory of Open Access Journals (Sweden)

    Min-Ho Jun

    2015-12-01

    Full Text Available Acupuncture therapy has been shown to be effective for diverse diseases, symptoms, and conditions in numerous clinical trials. The growing popularity of acupuncture therapy has triggered the development of modern acupuncture-like stimulation devices (ASDs), which are equivalent or superior to manual acupuncture with respect to safety, decreased risk of infection, and facilitation of clinical trials. Here, we aim to summarize the research on modern ASDs, with a focus on featured devices undergoing active research and their effectiveness and target symptoms, along with annual publication rates. We searched the popular electronic databases Medline, PubMed, the Cochrane Library, and Web of Science, and analyzed English-language studies on humans. In total, 728 studies were identified, of which 195 met our inclusion criteria. Electrical stimulators were found to be the earliest and most widely studied devices (133 articles), followed by laser (44 articles), magnetic (16 articles), and ultrasound (2 articles) stimulators. A total of 114 studies used randomized controlled trials, and 109 studies reported therapeutic benefits. The majority of the studies (32%) focused on analgesia and pain-relief effects, followed by effects on brain activity (16%). All types of the reviewed ASDs were associated with increasing annual publication trends; specifically, the annual growth in publications regarding noninvasive stimulation methods was more rapid than that regarding invasive methods. Based on this observation, we anticipate that noninvasive or minimally invasive ASDs will become more popular in acupuncture therapy.

  11. Adaptive design methods in clinical trials – a review

    Directory of Open Access Journals (Sweden)

    Chang Mark

    2008-05-01

    Full Text Available Abstract In recent years, the use of adaptive design methods in clinical research and development based on accrued data has become very popular due to its flexibility and efficiency. Based on the adaptations applied, adaptive designs can be classified into three categories: prospective, concurrent (ad hoc), and retrospective adaptive designs. An adaptive design allows modifications to be made to the trial and/or statistical procedures of ongoing clinical trials. However, it is a concern that the actual patient population after the adaptations could deviate from the originally targeted patient population, and consequently the overall type I error rate (the probability of erroneously claiming efficacy for an ineffective drug) may not be controlled. In addition, major adaptations of trial and/or statistical procedures of ongoing trials may result in a totally different trial that is unable to address the scientific/medical questions the trial intended to answer. In this article, several commonly considered adaptive designs in clinical trials are reviewed. Impacts of ad hoc adaptations (protocol amendments), challenges of by-design (prospective) adaptations, and obstacles of retrospective adaptations are described. Strategies for the use of adaptive design in the clinical development of rare diseases are discussed. Some examples concerning the development of Velcade intended for multiple myeloma and non-Hodgkin's lymphoma are given. Practical issues that are commonly encountered when implementing adaptive design methods in clinical trials are also discussed.
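
    The type I error concern raised above can be made concrete with a small simulation: repeatedly testing accruing data from an ineffective drug and stopping at the first nominally significant interim look inflates the error rate well above the nominal 0.05. The sketch is generic and not tied to any design discussed in the article:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        n_trials, looks, n_per_look = 10_000, 5, 20
        false_positives = 0

        for _ in range(n_trials):
            treat = rng.normal(0.0, 1.0, looks * n_per_look)   # drug truly ineffective
            ctrl = rng.normal(0.0, 1.0, looks * n_per_look)
            for k in range(1, looks + 1):                      # unadjusted interim analyses
                n = k * n_per_look
                if stats.ttest_ind(treat[:n], ctrl[:n]).pvalue < 0.05:
                    false_positives += 1
                    break

        print(false_positives / n_trials)   # noticeably above the nominal 0.05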

  12. Dental ceramics: a review of new materials and processing methods

    Directory of Open Access Journals (Sweden)

    Lucas Hian da SILVA

    2017-08-01

    Full Text Available Abstract The evolution of computerized systems for the production of dental restorations associated to the development of novel microstructures for ceramic materials has caused an important change in the clinical workflow for dentists and technicians, as well as in the treatment options offered to patients. New microstructures have also been developed by the industry in order to offer ceramic and composite materials with optimized properties, i.e., good mechanical properties, appropriate wear behavior and acceptable aesthetic characteristics. The objective of this literature review is to discuss the main advantages and disadvantages of the new ceramic systems and processing methods. The manuscript is divided in five parts: I) monolithic zirconia restorations; II) multilayered dental prostheses; III) new glass-ceramics; IV) polymer infiltrated ceramics; and V) novel processing technologies. Dental ceramics and processing technologies have evolved significantly in the past ten years, with most of the evolution being related to new microstructures and CAD-CAM methods. In addition, a trend towards the use of monolithic restorations has changed the way clinicians produce all-ceramic dental prostheses, since the more aesthetic multilayered restorations unfortunately are more prone to chipping or delamination. Composite materials processed via CAD-CAM have become an interesting option, as they have intermediate properties between ceramics and polymers and are more easily milled and polished.

  13. Review on methods of golden mussel control in pipes

    Directory of Open Access Journals (Sweden)

    Edemir Luiz Kowalski

    2008-07-01

    Full Text Available At the beginning of the 1990s, the first specimens of the exotic species Limnoperna fortunei, native to Asia, were detected in the Río de la Plata in Argentina, probably introduced through the ballast water of ships arriving from Asia. In Brazil, the first specimens were detected in Lagoa dos Patos in Rio Grande do Sul in the 1990s, possibly for the same reason. A second invasion front was verified in Campo Grande in Mato Grosso do Sul, probably derived from Argentina via navigation on the Paraguay River and spreading downstream to the Itaipu reservoir, causing its contamination. The invading species is capable of fouling piping through which contaminated water circulates, causing considerable financial damage to the affected industries. In Brazil, industries located in Rio Grande do Sul, as well as hydroelectric plants such as Itaipu, manage these problems by stopping equipment for maintenance and cleaning more often than usual. The United States and Canada already face the same kind of problem with a species similar to the one found in Brazil. The aim of this work is to present a review of the main methods to control the golden mussel without using chemical products, based on the experience of the USA and Canada, where similar problems occur with the zebra mussel. Key-words: non-chemical methods, golden mussel, zebra mussel

  14. Streamline-concentration balance model for in-situ uranium leaching and site restoration

    International Nuclear Information System (INIS)

    Bommer, P.M.; Schechter, R.S.; Humenick, M.J.

    1981-03-01

    This work presents two computer models. One describes in-situ uranium leaching and the other describes post leaching site restoration. Both models use a streamline generator to set up the flow field over the reservoir. The leaching model then uses the flow data in a concentration balance along each streamline coupled with the appropriate reaction kinetics to calculate uranium production. The restoration model uses the same procedure except that binary cation exchange is used as the restoring mechanism along each streamline and leaching cation clean up is simulated. The mathematical basis for each model is shown in detail along with the computational schemes used. Finally, the two models have been used with several data sets to point out their capabilities and to illustrate important leaching and restoration parameters and schemes
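
    The idea of a concentration balance along a single streamline can be sketched numerically: lixiviant is advected cell by cell along the streamline while first-order kinetics release uranium into solution. The grid, velocity and rate constant below are invented for illustration and are not taken from the models described:

        import numpy as np

        # One streamline discretised into cells from injection to production well.
        n_cells, dx, dt = 50, 1.0, 0.1          # cell size (m), time step (days)
        v, k_leach = 2.0, 0.05                  # pore velocity (m/day), leach rate (1/day)
        c = np.zeros(n_cells)                   # dissolved uranium concentration (normalised)
        solid = np.ones(n_cells)                # leachable uranium left in each cell (normalised)

        produced = []
        for _ in range(2000):
            leached = k_leach * solid * dt      # first-order dissolution into solution
            solid -= leached
            c += leached
            # Explicit upwind advection along the streamline (CFL = v*dt/dx = 0.2).
            c[1:] -= v * dt / dx * (c[1:] - c[:-1])
            c[0] = 0.0                          # fresh lixiviant at the injection end
            produced.append(c[-1])              # concentration arriving at the production well

        print(max(produced), solid.mean())      # peak produced concentration, uranium remaining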

  15. Self streamlining wind tunnel: Further low speed testing and final design studies for the transonic facility

    Science.gov (United States)

    Wolf, S. W. D.

    1978-01-01

    Work was continued with the low speed self streamlining wind tunnel (SSWT) using the NACA 0012-64 airfoil in an effort to explain the discrepancies between the NASA Langley low turbulence pressure tunnel (LTPT) and SSWT results obtained with the airfoil stalled. Conventional wind tunnel corrections were applied to straight wall SSWT airfoil data, to illustrate the inadequacy of standard correction techniques in circumstances of high blockage. Also, one SSWT test was re-run at different air speeds to investigate the effects of such changes (perhaps through changes in Reynolds number and freestream turbulence levels) on airfoil data and wall contours. Mechanical design analyses for the transonic self-streamlining wind tunnel (TSWT) were completed by the application of theoretical airfoil flow field data to the elastic beam and streamline analysis. The control system for the transonic facility, which will eventually allow on-line computer operation of the wind tunnel, was outlined.

  16. Streamline-concentration balance model for in situ uranium leaching and site restoration

    International Nuclear Information System (INIS)

    Bommer, P.M.

    1979-01-01

    This work presents two computer models. One describes in situ uranium leaching and the other describes post leaching site restoration. Both models use a streamline generator to set up the flow field over the reservoir. The leaching model then uses the flow data in a concentration balance along each streamline coupled with the appropriate reaction kinetics to calculate uranium production. The restoration model uses the same procedure except that binary cation exchange is used as the restoring mechanism along each streamline and leaching cation clean up is simulated. The mathematical basis for each model is shown in detail along with the computational schemes used. Finally, the two models have been used with several data sets to point out their capabilities and to illustrate important leaching and restoration parameters and schemes

  17. A Selective Review of Multimodal Fusion Methods in Schizophrenia

    Directory of Open Access Journals (Sweden)

    Jing eSui

    2012-02-01

    Full Text Available Schizophrenia (SZ) is one of the most cryptic and costly mental disorders in terms of human suffering and societal expenditure (van Os and Kapur, 2009). Though strong evidence exists for functional, structural and genetic abnormalities associated with this disease, there is as yet no replicable finding that has proven accurate enough to be useful in clinical decision making (Fornito et al., 2009), and its diagnosis relies primarily upon symptom assessment (Williams et al., 2010a). The lack of consistent neuroimaging findings is likely due in part to the fact that most models favor only one data type or do not combine data from different imaging modalities effectively, thus missing potentially important differences which are only partially detected by each modality (Calhoun et al., 2006a). It is becoming increasingly clear that multimodal fusion, a technique which takes advantage of the fact that each modality provides a limited view of the brain/gene and may uncover hidden relationships, is an important tool to help unravel the black box of schizophrenia. In this review paper, we survey a number of multimodal fusion applications which enable us to study the schizophrenia macro-connectome, including brain functional, structural and genetic aspects, and which may help us understand the disorder in a more comprehensive and integrated manner. We also provide a table that characterizes these applications by the methods used and compare these methods in detail, especially for multivariate models, which may serve as a valuable reference to help readers select an appropriate method for a given research question.
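
    One multivariate fusion approach of the kind surveyed, canonical correlation analysis between two imaging modalities, can be sketched with random data; this is a generic illustration, not a method or result taken from the review:

        import numpy as np
        from sklearn.cross_decomposition import CCA

        rng = np.random.default_rng(3)
        n = 100
        shared = rng.normal(size=(n, 1))     # latent factor common to both modalities
        fmri = shared @ rng.normal(size=(1, 50)) + 0.5 * rng.normal(size=(n, 50))
        smri = shared @ rng.normal(size=(1, 40)) + 0.5 * rng.normal(size=(n, 40))

        # Find maximally correlated projections of the two feature sets.
        cca = CCA(n_components=1).fit(fmri, smri)
        u, v = cca.transform(fmri, smri)
        print(np.corrcoef(u[:, 0], v[:, 0])[0, 1])   # strong cross-modal correlation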

  18. Woody biomass comminution and sorting - a review of mechanical methods

    Energy Technology Data Exchange (ETDEWEB)

    Eriksson, Gunnar [Swedish Univ. of Agricultural Sciences, Dept. of Forest Resource Management, Umeaa (Sweden)], e-mail: gunnar.eriksson@slu.se

    2012-11-01

    The increased demand for woody biomass for heat and electricity and biorefineries means that each bio component must be used efficiently. Any increase in raw material supply in the short term is likely to require the use of trees from early thinnings, logging residues and stumps, assortments of low value compared to stemwood. However, sorting of the novel materials into bio components may increase their value considerably. The challenge is to 1) maximise the overall values of the different raw material fractions for different users, 2) minimise costs for raw material extraction, processing, storage and transportation. Comminution of the raw material (e.g. to chips, chunks, flakes and powder) and sorting the bio components (e.g. separating bark from pulp chips and separating alkali-rich needles and shoots for combustion and gasification applications) are crucial processes in this optimisation. The purpose of this study has been to make a literature review of principles for comminution and sorting, with an emphasis on mechanical methods suitable outside industries. More efficient comminution methods can be developed when the wood is to a larger extent cut along the fibre direction, and closer to the surface (with less pressure to the sides of the knife). By using coarse comminution (chunking) rather than fine comminution (chipping), productivity at landings can be increased and energy saved, and the resulting product will have better storage and drying properties. At terminals, any further comminution (if necessary) could use larger-scale equipment of higher efficiency. Rolls and flails can be used to an increasing extent for removing foliage and twigs, possibly in the terrain (for instance fitted on grapples). Physical parameters used for sorting of the main components of trees include particle size, density and shape (aerodynamic drag and lift), optical and IR properties and X-ray fluorescence. Although methods developed for pulp chip production from whole trees may not

  19. Microfluidic DNA microarrays in PMMA chips: streamlined fabrication via simultaneous DNA immobilization and bonding activation by brief UV exposure

    DEFF Research Database (Denmark)

    Sabourin, David; Petersen, J; Snakenborg, Detlef

    2010-01-01

    This report presents and describes a simple and scalable method for producing functional DNA microarrays within enclosed polymeric, PMMA, microfluidic devices. Brief (30 s) exposure to UV simultaneously immobilized poly(T)poly(C)-tagged DNA probes to the surface of unmodified PMMA and activated the surface for bonding below the glass transition temperature of the bulk PMMA. Functionality and validation of the enclosed PMMA microarrays was demonstrated as 18 patients were correctly genotyped for all eight mutation sites in the HBB gene interrogated. The fabrication process therefore produced probes with desired hybridization properties and sufficient bonding between PMMA layers to allow construction of microfluidic devices. The streamlined fabrication method is suited to the production of low-cost microfluidic microarray-based diagnostic devices and, as such, is equally applicable to the development...

  20. Unique encoding for streamline topologies of incompressible and inviscid flows in multiply connected domains

    Energy Technology Data Exchange (ETDEWEB)

    Sakajo, T [Department of Mathematics, Kyoto University, Kitashirakawa Oiwake-cho, Sakyo-ku, Kyoto 606-8502 (Japan); Sawamura, Y; Yokoyama, T, E-mail: sakajo@math.kyoto-u.ac.jp [JST CREST, Kawaguchi, Saitama 332-0012 (Japan)

    2014-06-01

    This study considers the flow of incompressible and inviscid fluid in two-dimensional multiply connected domains. For such flows, encoding algorithms to assign a unique sequence of words to any structurally stable streamline topology based on the theory presented by Yokoyama and Sakajo (2013 Proc. R. Soc. A 469 20120558) are proposed. As an application, we utilize the algorithms to characterize the evolution of an incompressible and viscid flow around a flat plate inclined to the uniform flow in terms of the change of the word representations for their instantaneous streamline topologies. (papers)

  1. Streamline Patterns and their Bifurcations near a wall with Navier slip Boundary Conditions

    DEFF Research Database (Denmark)

    Tophøj, Laust; Møller, Søren; Brøns, Morten

    2006-01-01

    We consider the two-dimensional topology of streamlines near a surface where the Navier slip boundary condition applies. Using transformations to bring the streamfunction in a simple normal form, we obtain bifurcation diagrams of streamline patterns under variation of one or two external parameters. Topologically, these are identical with the ones previously found for no-slip surfaces. We use the theory to analyze the Stokes flow inside a circle, and show how it can be used to predict new bifurcation phenomena. ©2006 American Institute of Physics

  2. Streamlining the Acquisition Process: A DCAA Field-Grade Perspective

    Science.gov (United States)

    2014-03-01

    Initial Capabilities Document IFRS International Financial Reporting Standards IPT Integrated Product Team IRR Independent Reference Review...the responsibilities, programmed focus, strategic plan and recent events impacting the organization. B. DEFENSE CONTRACT AUDIT AGENCY 1. DCAA...material misstatements, whether caused by error or fraud. The type of audit requested by the contracting officer will directly impact both the

  3. Critical Review of Diagnostic Methods Used in Chronic Pancreatic Disease

    Directory of Open Access Journals (Sweden)

    Ivan T Beck

    1995-01-01

    This paper provides a balanced assessment of the various pancreatic function tests and imaging techniques used in the differential diagnosis of chronic pancreatic disease. Function tests that study the digestive capacity of the pancreas (fat absorption of dietary lipids, fluorescein- or radiolabelled fats, bentiromide test, etc) have high specificity, but very low sensitivity. This is because 90% of the pancreas has to be destroyed before steatorrhea or creatorrhea occurs. Tests that directly measure pancreatic bicarbonate and protein secretion (secretin test, etc) are more accurate and may detect pancreatic dysfunction even before anatomical changes occur. Measurement of pancreatic enzymes in serum or urine, or the decreased decline of serum amino acids during their incorporation into pancreatic enzymes, is not sufficiently sensitive or specific to help diagnose pancreatic disease. Sensitive and specific tumour markers are not yet available. Thus screening tests are not cost-effective - if they are negative, they do not exclude pancreatic disease; and if positive, they have to be confirmed by more specific tests. Imaging techniques are the most commonly used methods of investigation. The usefulness of abdominal survey films, barium studies, percutaneous transhepatic cholangiography, endoscopic retrograde cholangiopancreatography (ERCP), ultrasonography, computed tomographic scan, magnetic resonance imaging and endoscopic ultrasonography is critically reviewed. Most of the radiological methods can be combined with cytology or biopsy. Histology demonstrating malignancy establishes this diagnosis, but negative biopsies do not exclude malignant tumours. Presently only ERCP and endoscopic ultrasound can diagnose cancers sufficiently early to allow for possible 'curative' surgery, and only endoscopic ultrasound is capable of staging tumours for the assessment of resectability.

  4. Review of probabilistic pollen-climate transfer methods

    Science.gov (United States)

    Ohlwein, Christian; Wahl, Eugene R.

    2012-01-01

    Pollen-climate transfer methods are reviewed from a Bayesian perspective and with a special focus on the formulation of uncertainties. This approach is motivated by recent developments of spatial multi-proxy Bayesian hierarchical models (BHM), which allow synthesizing local reconstructions from different proxies for a spatially complete picture of past climate. In order to enhance the pollen realism in these models we try to bridge the gap between spatial statistics and paleoclimatology and show how far classical pollen-climate transfer concepts such as regression methods, mutual climatic range, modern analogues, plant functional types, and biomes can be understood in novel ways by refining the data models used in BHMs. As a case study, we discuss modeling of uncertainty by introducing a new probabilistic pollen ratio model, which is a simplified variation of the modern analogue technique (MAT) including the concept of response surfaces and designed for later inclusion in a spatial multiproxy BHM. Applications to fossil pollen data from varved sediments in three nearby lakes in west-central Wisconsin, USA and for a Holocene fossil pollen record from southern California, USA provide local climate reconstructions of summer temperature for the past millennium and the Holocene respectively. The performance of the probabilistic model is generally similar in comparison to MAT-derived reconstructions using the same data. Furthermore, the combination of co-location and precise dating for the three fossil sites in Wisconsin allows us to study the issue of site-specific uncertainty and to test the assumption of ergodicity in a real-world example. A multivariate ensemble kernel dressing approach derived from the post-processing of climate simulations reveals that the overall interpretation based on the individual reconstructions remains essentially unchanged, but the single-site reconstructions underestimate the overall uncertainty.
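
    The modern analogue technique (MAT) discussed above reconstructs past climate by locating the modern pollen samples most similar to a fossil spectrum and averaging the climate values associated with those analogues, typically using the squared-chord distance as the dissimilarity measure. The following is a minimal Python sketch of that idea only; the function names, the choice of five analogues and the assumption that spectra are supplied as proportions are illustrative and are not taken from the paper.

      import numpy as np

      def squared_chord(p, q):
          # Squared-chord dissimilarity between two pollen spectra given as proportions.
          return np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)

      def mat_reconstruct(fossil, modern_spectra, modern_climate, k=5):
          # Reconstruct climate for one fossil sample as the mean climate of its
          # k closest modern analogues; the analogue distances are returned as a
          # crude indication of reconstruction uncertainty.
          dists = np.array([squared_chord(fossil, m) for m in modern_spectra])
          nearest = np.argsort(dists)[:k]
          return modern_climate[nearest].mean(), dists[nearest]

    A probabilistic variant, as discussed in the paper, would replace the simple mean with a weighted or model-based estimate and propagate the analogue distances into an explicit uncertainty distribution.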

  5. Liquid phase microextraction of pesticides: a review on current methods

    International Nuclear Information System (INIS)

    Farajzadeh, Mir Ali; Sorouraddin, Saeed Mohammad; Mogaddam, Mohammad Reza Afshar

    2014-01-01

    Liquid phase microextraction (LPME) enables analytes to be extracted with a few microliters of an organic solvent. LPME is a technique for sample preparation that is extremely simple, affordable and virtually solvent-free. It can provide a high degree of selectivity and enrichment by eliminating carry-over between single runs. A variety of solvents are known for the extraction of the various analytes. These features have led to the development of techniques such as single drop microextraction, hollow fiber LPME, dispersive liquid-liquid microextraction, and others. LPME techniques have been applied to the analysis of pharmaceuticals, food, beverages, and pesticides. This review covers the history of LPME methods, and then gives a comprehensive collection of their application to the preconcentration and determination of pesticides in various matrices. Specific sections cover (a) sample treatment techniques in general, (b) single-drop microextraction, (c) extraction based on the use of ionic liquids, (d) solidified floating organic drop microextraction, and various other techniques. (author)

  6. Lipid Extraction Methods from Microalgae: A Comprehensive Review

    Energy Technology Data Exchange (ETDEWEB)

    Ranjith Kumar, Ramanathan [Department of Plant Biology and Plant Biotechnology, Shree Chandraprabhu Jain College, Chennai (India); Hanumantha Rao, Polur [Department of Microbiology, Madras Christian College, Chennai (India); Arumugam, Muthu, E-mail: arumugam@niist.res.in [Division of Biotechnology, CSIR – National Institute for Interdisciplinary Science and Technology (NIIST), Trivandrum (India)

    2015-01-08

    Energy security has become a serious global issue and a lot of research is being carried out to look for economically viable and environment-friendly alternatives. The only solution that appears to meet futuristic needs is the use of renewable energy. Although various forms of renewable energy are being currently used, the prospects of producing carbon-neutral biofuels from microalgae appear bright because of their unique features such as suitability of growing in open ponds required for production of a commodity product, high CO2-sequestering capability, and ability to grow in wastewater/seawater/brackish water and high-lipid productivity. The major process constraint in microalgal biofuel technology is the cost-effective and efficient extraction of lipids. The objective of this article is to provide a comprehensive review on various methods of lipid extraction from microalgae available, to date, as well as to discuss their advantages and disadvantages. The article covers all areas of lipid extraction procedures including solvent extraction procedures, mechanical approaches, and solvent-free procedures apart from some of the latest extraction technologies. Further research is required in this area for successful implementation of this technology at the production scale.

  7. Lipid Extraction Methods from Microalgae: A Comprehensive Review

    International Nuclear Information System (INIS)

    Ranjith Kumar, Ramanathan; Hanumantha Rao, Polur; Arumugam, Muthu

    2015-01-01

    Energy security has become a serious global issue and a lot of research is being carried out to look for economically viable and environment-friendly alternatives. The only solution that appears to meet futuristic needs is the use of renewable energy. Although various forms of renewable energy are being currently used, the prospects of producing carbon-neutral biofuels from microalgae appear bright because of their unique features such as suitability of growing in open ponds required for production of a commodity product, high CO2-sequestering capability, and ability to grow in wastewater/seawater/brackish water and high-lipid productivity. The major process constraint in microalgal biofuel technology is the cost-effective and efficient extraction of lipids. The objective of this article is to provide a comprehensive review on various methods of lipid extraction from microalgae available, to date, as well as to discuss their advantages and disadvantages. The article covers all areas of lipid extraction procedures including solvent extraction procedures, mechanical approaches, and solvent-free procedures apart from some of the latest extraction technologies. Further research is required in this area for successful implementation of this technology at the production scale.

  8. Geologic storage of carbon dioxide and enhanced oil recovery. I. Uncertainty quantification employing a streamline based proxy for reservoir flow simulation

    International Nuclear Information System (INIS)

    Kovscek, A.R.; Wang, Y.

    2005-01-01

    Carbon dioxide (CO2) is already injected into a limited class of reservoirs for oil recovery purposes; however, the engineering design question for simultaneous oil recovery and storage of anthropogenic CO2 is significantly different from that of oil recovery alone. Currently, the volumes of CO2 injected solely for oil recovery are minimized due to the purchase cost of CO2. If and when CO2 emissions to the atmosphere are managed, it will be necessary to maximize simultaneously both economic oil recovery and the volumes of CO2 emplaced in oil reservoirs. This process is coined 'cooptimization'. This paper proposes a work flow for cooptimization of oil recovery and geologic CO2 storage. An important component of the work flow is the assessment of uncertainty in predictions of performance. Typical methods for quantifying uncertainty employ exhaustive flow simulation of multiple stochastic realizations of the geologic architecture of a reservoir. Such approaches are computationally intensive and thereby time consuming. An analytic streamline based proxy for full reservoir simulation is proposed and tested. Streamline trajectories represent the three-dimensional velocity field during multiphase flow in porous media and so are useful for quantifying the similarity and differences among various reservoir models. The proxy allows rational selection of a representative subset of equi-probable reservoir models that encompass uncertainty with respect to true reservoir geology. The streamline approach is demonstrated to be thorough and rapid.

  9. Empirical methods for systematic reviews and evidence-based medicine

    NARCIS (Netherlands)

    van Enst, W.A.

    2014-01-01

    Evidence-Based Medicine is the integration of best research evidence with clinical expertise and patient values. Systematic reviews have become the cornerstone of evidence-based medicine, which is reflected in the position systematic reviews have in the pyramid of evidence-based medicine. Systematic

  10. Streamlining the Online Course Development Process by Using Project Management Tools

    Science.gov (United States)

    Abdous, M'hammed; He, Wu

    2008-01-01

    Managing the design and production of online courses is challenging. Insufficient instructional design and inefficient management often lead to issues such as poor course quality and course delivery delays. In an effort to facilitate, streamline, and improve the overall design and production of online courses, this article discusses how we…

  11. Less is More : Better Compliance and Increased Revenues by Streamlining Business Registration in Uganda

    OpenAIRE

    Sander, Cerstin

    2003-01-01

    A pilot of a streamlined business registration system in Entebbe, Uganda, reduced compliance costs for enterprises by 75 percent, raised registration numbers and fee revenue by 40 percent and reduced the cost of administering the system. It also reduced opportunities for corruption, improved relations between businesses and the local authorities and resulted in better compliance.

  12. Zephyr: A secure Internet-based process to streamline engineering procurements using the World Wide Web

    Energy Technology Data Exchange (ETDEWEB)

    Jordan, C.W.; Cavitt, R.E.; Niven, W.A.; Warren, F.E.; Taylor, S.S.; Sharick, T.M.; Vickers, D.L.; Mitschkowetz, N.; Weaver, R.L.

    1996-08-13

    Lawrence Livermore National Laboratory (LLNL) is piloting an Internet-based paperless process called 'Zephyr' to streamline engineering procurements. Major benefits have accrued by using Zephyr in reducing procurement time, speeding the engineering development cycle, facilitating industrial collaboration, and reducing overall costs. Programs at LLNL are benefiting by the efficiencies introduced since implementing Zephyr's engineering and commerce on the Internet.

  13. Streamline topologies near simple degenerate critical points in two-dimensional flow away from boundaries

    DEFF Research Database (Denmark)

    Brøns, Morten; Hartnack, Johan Nicolai

    1998-01-01

    Streamline patterns and their bifurcations in two-dimensional incompressible flow are investigated from a topological point of view. The velocity field is expanded at a point in the fluid, and the expansion coefficients are considered as bifurcation parameters. A series of non-linear coordinate c...

  14. Streamline topologies near simple degenerate critical points in two-dimensional flow away from boundaries

    DEFF Research Database (Denmark)

    Brøns, Morten; Hartnack, Johan Nicolai

    1999-01-01

    Streamline patterns and their bifurcations in two-dimensional incompressible flow are investigated from a topological point of view. The velocity field is expanded at a point in the fluid, and the expansion coefficients are considered as bifurcation parameters. A series of nonlinear coordinate ch...

  15. A review of the facile (FN) method in particle transport theory

    International Nuclear Information System (INIS)

    Garcia, R.D.M.

    1986-02-01

    The facile FN method for solving particle transport problems is reviewed. The fundamentals of the method are summarized, recent developments are discussed and several applications of the method are described in detail. (author)

  16. Vortex Generators in a Streamline-Traced, External-Compression Supersonic Inlet

    Science.gov (United States)

    Baydar, Ezgihan; Lu, Frank K.; Slater, John W.; Trefny, Charles J.

    2017-01-01

    Vortex generators within a streamline-traced, external-compression supersonic inlet for Mach 1.66 were investigated to determine their ability to increase total pressure recovery and reduce total pressure distortion. The vortex generators studied were rectangular vanes arranged in counter-rotating and co-rotating arrays. The vane geometric factors of interest included height, length, spacing, angle-of-incidence, and positions upstream and downstream of the inlet terminal shock. The flow through the inlet was simulated numerically through the solution of the steady-state, Reynolds-averaged Navier-Stokes equations on multi-block, structured grids using the Wind-US flow solver. The vanes were simulated using a vortex generator model. The inlet performance was characterized by the inlet total pressure recovery and the radial and circumferential total pressure distortion indices at the engine face. Design of experiments and statistical analysis methods were applied to quantify the effect of the geometric factors of the vanes and search for optimal vane arrays. Co-rotating vane arrays with negative angles-of-incidence positioned on the supersonic diffuser were effective in sweeping low-momentum flow from the top toward the sides of the subsonic diffuser. This distributed the low-momentum flow more evenly about the circumference of the subsonic diffuser and reduced distortion. Co-rotating vane arrays with negative angles-of-incidence or counter-rotating vane arrays positioned downstream of the terminal shock were effective in mixing higher-momentum flow with lower-momentum flow to increase recovery and decrease distortion. A strategy of combining a co-rotating vane array on the supersonic diffuser with a counter-rotating vane array on the subsonic diffuser was effective in increasing recovery and reducing distortion.

  17. Connecting streamlined subglacial bedforms with the geological/geographical environment in which they are located.

    Science.gov (United States)

    Dowling, Tom; Möller, Per; Greenwood, Sarah; Spagnolo, Matteo; Åkesson, Maria; Fraser, Stephen; Hughs, Anna; Clark, Chris

    2016-04-01

    Much work has qualitatively shown that there appears to be a relationship between the morphology of streamlined subglacial bedforms (drumlinoids) and the geological/geographical environment in which said bedforms are located, particularly in terms of bedrock influence. However, the one quantitative study that has been carried out on this connectivity (Greenwood and Clark, 2010) found that there appears to be a connection between bedrock type and morphology only at a local scale. At a regional scale the most important geological factor seemed to be the properties of the substrate, usually till. In order to investigate these connections further, self-organising maps (SOM) are used to investigate the role of contextual geology/geography in drumlinoid morphology. The SOM method allows the statistical exploration of data that cannot normally be evaluated by traditional means; categorical data (e.g. bedrock type) can be used in the same analysis as continuous/vector data (e.g. drift depth). Here, three large morphological data sets from Sweden (20 041), Britain (36 104) and Ireland (13 454) are combined with bedrock type, drift depth, basal elevation and distance to esker to see if there are any relationships to be found between them. The results indicate that there are pervasive, statistically significant, and weak to very weak correlations between contextual geological/geographical factors and drumlinoid morphology. The most important contextual factor appears to be 'drift depth', followed by 'distance to esker'. Therefore, models of drumlinoid formation and any efforts to use such features for palaeo-ice reconstruction must take into account the geological and geographical environment in which they are situated. The logical extension of this is that models of ice-sheet growth and retreat must also take into account and be sensitive to the type of substratum present beneath the ice. Further research into the effect of drift properties on the flow of ice is needed.
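
    The key practical point in the abstract is that a self-organising map can take categorical variables (such as bedrock class) and continuous variables (such as drift depth) in a single analysis once the categories are encoded numerically. The sketch below illustrates one common way to do this in Python with the third-party minisom package and one-hot encoding; the variable names, toy values and map size are purely illustrative and are not the authors' data or software.

      import numpy as np
      from minisom import MiniSom  # generic SOM implementation, used here only for illustration

      # Hypothetical per-bedform records: (elongation, drift depth [m], distance to esker [m], bedrock class)
      rows = [(4.2, 12.0, 850.0, "granite"),
              (2.1,  3.5, 120.0, "limestone"),
              (6.8, 20.0, 400.0, "granite"),
              (3.3,  8.0, 660.0, "sandstone")]

      classes = sorted({r[3] for r in rows})
      onehot = np.array([[1.0 if r[3] == c else 0.0 for c in classes] for r in rows])
      cont = np.array([r[:3] for r in rows])
      cont = (cont - cont.mean(axis=0)) / cont.std(axis=0)   # z-score the continuous variables
      data = np.hstack([cont, onehot])                       # mixed categorical/continuous feature vectors

      som = MiniSom(8, 8, data.shape[1], sigma=1.5, learning_rate=0.5, random_seed=1)
      som.random_weights_init(data)
      som.train_random(data, 5000)
      print([som.winner(v) for v in data])                   # best-matching map unit for each bedform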

  18. Economics methods in Cochrane systematic reviews of health promotion and public health related interventions.

    Science.gov (United States)

    Shemilt, Ian; Mugford, Miranda; Drummond, Michael; Eisenstein, Eric; Mallender, Jacqueline; McDaid, David; Vale, Luke; Walker, Damian

    2006-11-15

    Provision of evidence on costs alongside evidence on the effects of interventions can enhance the relevance of systematic reviews to decision-making. However, patterns of use of economics methods alongside systematic review remain unclear. Reviews of evidence on the effects of interventions are published by both the Cochrane and Campbell Collaborations. Although it is not a requirement that Cochrane or Campbell Reviews should consider economic aspects of interventions, many do. This study aims to explore and describe approaches to incorporating economics methods in a selection of Cochrane systematic reviews in the area of health promotion and public health, to help inform development of methodological guidance on economics for reviewers. The Cochrane Database of Systematic Reviews was searched using a search strategy for potential economic evaluation studies. We included current Cochrane reviews and review protocols retrieved using the search that are also identified as relevant to health promotion or public health topics. A reviewer extracted data which describe the economics components of included reviews. Extracted data were summarised in tables and analysed qualitatively. Twenty-one completed Cochrane reviews and seven review protocols met inclusion criteria. None incorporate formal economic evaluation methods. Ten completed reviews explicitly aim to incorporate economics studies and data. There is a lack of transparent reporting of methods underpinning the incorporation of economics studies and data. Some reviews are likely to exclude useful economics studies and data due to a failure to incorporate search strategies tailored to the retrieval of such data or use of key specialist databases, and application of inclusion criteria designed for effectiveness studies. There is a need for consistency and transparency in the reporting and conduct of the economics components of Cochrane reviews, as well as regular dialogue between Cochrane reviewers and economists to

  19. Improving and streamlining the workflow in the graphic arts and printing industry

    Science.gov (United States)

    Tuijn, Chris

    2003-01-01

    In order to survive in the economy of today, an ever-increasing productivity is required from all the partners participating in a specific business process. This is not different for the printing industry. One of the ways to remain profitable is, on one hand, to reduce costs by automation and aiming for large-scale projects and, on the other hand, to specialize and become an expert in the area in which one is active. One of the ways to realize these goals is by streamlining the communication of the different partners and focus on the core business. If we look at the graphic arts and printing industry, we can identify different important players that eventually help in the realization of printed material. For the printing company (as is the case for any other company), the most important player is the customer. This role can be adopted by many different players including publishers, companies, non-commercial institutions, private persons etc. Sometimes, the customer will be the content provider as well but this is not always the case. Often, the content is provided by other organizations such as design and prepress agencies, advertising companies etc. In most printing organizations, the customer has one contact person often referred to as the CSR (Customers Service Representative). Other people involved at the printing organization include the sales representatives, prepress operators, printing operators, postpress operators, planners, the logistics department, the financial department etc. In the first part of this article, we propose a solution that will improve the communication between all the different actors in the graphic arts and printing industry considerably and will optimize and streamline the overall workflow as well. This solution consists of an environment in which the customer can communicate with the CSR to ask for a quote based on a specific product intent; the CSR will then (after the approval from the customer's side) organize the work and brief

  20. Methods to assess intended effects of drug treatment in observational studies are reviewed

    NARCIS (Netherlands)

    Klungel, Olaf H|info:eu-repo/dai/nl/181447649; Martens, Edwin P|info:eu-repo/dai/nl/088859010; Psaty, Bruce M; Grobbee, Diederik E; Sullivan, Sean D; Stricker, Bruno H Ch; Leufkens, Hubert G M|info:eu-repo/dai/nl/075255049; de Boer, A|info:eu-repo/dai/nl/075097346

    2004-01-01

    BACKGROUND AND OBJECTIVE: To review methods that seek to adjust for confounding in observational studies when assessing intended drug effects. METHODS: We reviewed the statistical, economical and medical literature on the development, comparison and use of methods adjusting for confounding. RESULTS:

  1. [Baseflow separation methods in hydrological process research: a review].

    Science.gov (United States)

    Xu, Lei-Lei; Liu, Jing-Lin; Jin, Chang-Jie; Wang, An-Zhi; Guan, De-Xin; Wu, Jia-Bing; Yuan, Feng-Hui

    2011-11-01

    Baseflow separation research is regarded as one of the most important and difficult issues in hydrology and ecohydrology, but it still lacks unified standards in both concepts and methods. This paper introduced the theories of baseflow separation based on the definitions of baseflow components, and analyzed the development course of different baseflow separation methods. Among the methods developed, the graphical separation method is simple and applicable but arbitrary, the balance method accords with hydrological mechanisms but is difficult to apply, whereas the time series separation methods and the isotopic method can overcome the subjective and arbitrary defects of the graphical separation method, and thus can obtain the baseflow sequence quickly and efficiently. In recent years, hydrological modeling, digital filtering, and isotopic methods have been the main methods used for baseflow separation.
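
    Of the approaches listed above, digital filtering is the easiest to automate. A widely used variant is the one-parameter recursive (Lyne-Hollick-type) filter, which treats quickflow as the high-frequency component of the hydrograph and returns the remainder as baseflow; the sketch below is a minimal Python illustration of that general filter, not the specific implementation reviewed by the authors.

      import numpy as np

      def baseflow_digital_filter(q, alpha=0.925):
          # One-parameter recursive digital filter: separate total streamflow q
          # into quickflow (high-frequency part) and baseflow (the remainder).
          q = np.asarray(q, dtype=float)
          quick = np.zeros_like(q)
          for k in range(1, len(q)):
              f = alpha * quick[k - 1] + 0.5 * (1.0 + alpha) * (q[k] - q[k - 1])
              quick[k] = min(max(f, 0.0), q[k])   # keep quickflow within physical bounds
          return q - quick                         # baseflow series

      # Example with a short daily discharge record (m3/s); in practice the filter
      # is usually passed over the record several times (forward and backward).
      baseflow = baseflow_digital_filter([5, 6, 14, 30, 22, 15, 11, 9, 8, 7])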

  2. Review of Research on Template Methods in Preparation of Nanomaterials

    OpenAIRE

    Yadian Xie; Duygu Kocaefe; Chunying Chen; Yasar Kocaefe

    2016-01-01

    The nanomaterials have been widely used in various fields, such as photonics, catalysis, and adsorption, because of their unique physical and chemical properties. Therefore, their production methods are of utmost importance. Compared with traditional synthetic methods, the template method can effectively control the morphology, particle size, and structure during the preparation of nanomaterials, which is an effective method for their synthesis. The key for the template method is to choose di...

  3. Streamlined approach to mapping the magnetic induction of skyrmionic materials

    International Nuclear Information System (INIS)

    Chess, Jordan J.; Montoya, Sergio A.; Harvey, Tyler R.; Ophus, Colin; Couture, Simon; Lomakin, Vitaliy; Fullerton, Eric E.; McMorran, Benjamin J.

    2017-01-01

    Highlights: • A method to reconstruct the phase of electrons after passing through a sample, requiring only a single defocused image, is presented. • Restrictions as to when it is appropriate to apply this method are described. • The relative error associated with this method is compared to conventional transport of intensity equation analysis. - Abstract: Recently, Lorentz transmission electron microscopy (LTEM) has helped researchers advance the emerging field of magnetic skyrmions. These magnetic quasi-particles, composed of topologically non-trivial magnetization textures, have a large potential for application as information carriers in low-power memory and logic devices. LTEM is one of a very few techniques for direct, real-space imaging of magnetic features at the nanoscale. For Fresnel-contrast LTEM, the transport of intensity equation (TIE) is the tool of choice for quantitative reconstruction of the local magnetic induction through the sample thickness. Typically, this analysis requires collection of at least three images. Here, we show that for uniform, thin, magnetic films, which includes many skyrmionic samples, the magnetic induction can be quantitatively determined from a single defocused image using a simplified TIE approach.
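
    For readers unfamiliar with the transport of intensity equation (TIE), the simplification described in the abstract can be sketched as follows: if the in-focus image of a uniform thin film is approximately a constant i0, the longitudinal intensity derivative can be estimated from a single defocused image, and the TIE reduces to a Poisson equation for the phase that is easily inverted in Fourier space. The Python sketch below illustrates only this textbook-style inversion under that uniform-intensity assumption; the function name, arguments and sign convention are illustrative and do not reproduce the authors' code.

      import numpy as np

      def tie_phase_from_single_image(i_defocus, i0, defocus, wavelength, pixel_size):
          # Assume the in-focus image is the constant i0, so dI/dz ~ (i_defocus - i0) / defocus.
          # Then the TIE, div(I grad(phi)) = -(2*pi/lambda) dI/dz, reduces to
          # laplacian(phi) = -(2*pi / (lambda * i0)) * dI/dz, which is solved here in Fourier space.
          didz = (i_defocus - i0) / defocus
          rhs = -2.0 * np.pi / (wavelength * i0) * didz
          ny, nx = i_defocus.shape
          qy = np.fft.fftfreq(ny, d=pixel_size)
          qx = np.fft.fftfreq(nx, d=pixel_size)
          k2 = (2.0 * np.pi) ** 2 * (qx[None, :] ** 2 + qy[:, None] ** 2)
          k2[0, 0] = np.inf                      # drop the undefined zero-frequency term
          return np.fft.ifft2(np.fft.fft2(rhs) / (-k2)).real

    In LTEM, the gradient of the recovered phase is then related to the in-plane magnetic induction integrated through the sample thickness.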

  4. Using Pre-Statistical Analysis to Streamline Monitoring Assessments

    International Nuclear Information System (INIS)

    Reed, J.K.

    1999-01-01

    A variety of statistical methods exist to aid evaluation of groundwater quality and subsequent decision making in regulatory programs. These methods are applied because of large temporal and spatial extrapolations commonly applied to these data. In short, statistical conclusions often serve as a surrogate for knowledge. However, facilities with mature monitoring programs that have generated abundant data have inherently less uncertainty because of the sheer quantity of analytical results. In these cases, statistical tests can be less important, and 'expert' data analysis should assume an important screening role. The WSRC Environmental Protection Department, working with the General Separations Area BSRI Environmental Restoration project team, has developed a method for an Integrated Hydrogeological Analysis (IHA) of historical water quality data from the F and H Seepage Basins groundwater remediation project. The IHA combines common sense analytical techniques and a GIS presentation that force direct interactive evaluation of the data. The IHA can perform multiple data analysis tasks required by the RCRA permit. These include: (1) Development of a groundwater quality baseline prior to remediation startup, (2) Targeting of constituents for removal from RCRA GWPS, (3) Targeting of constituents for removal from the UIC permit, (4) Targeting of constituents for reduced, (5) Targeting of monitoring wells not producing representative samples, (6) Reduction in statistical evaluation, and (7) Identification of contamination from other facilities.

  5. Streamlined approach to mapping the magnetic induction of skyrmionic materials

    Energy Technology Data Exchange (ETDEWEB)

    Chess, Jordan J., E-mail: jchess@uoregon.edu [Department of Physics, University of Oregon, Eugene, OR 97403 (United States); Montoya, Sergio A. [Center for Memory and Recording Research, University of California, San Diego, CA 92093 (United States); Department of Electrical and Computer Engineering, University of California, San Diego, La Jolla, CA 92093 (United States); Harvey, Tyler R. [Department of Physics, University of Oregon, Eugene, OR 97403 (United States); Ophus, Colin [National Center for Electron Microscopy, Molecular Foundry, Lawrence Berkeley National Laboratory, 1 Cyclotron Road, Berkeley, CA 94720 (United States); Couture, Simon; Lomakin, Vitaliy; Fullerton, Eric E. [Center for Memory and Recording Research, University of California, San Diego, CA 92093 (United States); Department of Electrical and Computer Engineering, University of California, San Diego, La Jolla, CA 92093 (United States); McMorran, Benjamin J. [Department of Physics, University of Oregon, Eugene, OR 97403 (United States)

    2017-06-15

    Highlights: • A method to reconstruct the phase of electrons after passing through a sample, requiring only a single defocused image, is presented. • Restrictions as to when it is appropriate to apply this method are described. • The relative error associated with this method is compared to conventional transport of intensity equation analysis. - Abstract: Recently, Lorentz transmission electron microscopy (LTEM) has helped researchers advance the emerging field of magnetic skyrmions. These magnetic quasi-particles, composed of topologically non-trivial magnetization textures, have a large potential for application as information carriers in low-power memory and logic devices. LTEM is one of a very few techniques for direct, real-space imaging of magnetic features at the nanoscale. For Fresnel-contrast LTEM, the transport of intensity equation (TIE) is the tool of choice for quantitative reconstruction of the local magnetic induction through the sample thickness. Typically, this analysis requires collection of at least three images. Here, we show that for uniform, thin, magnetic films, which includes many skyrmionic samples, the magnetic induction can be quantitatively determined from a single defocused image using a simplified TIE approach.

  6. Empirical pillar design methods review report: Final report

    International Nuclear Information System (INIS)

    1988-02-01

    This report summarizes and evaluates empirical pillar design methods that may be of use during the conceptual design of a high-level nuclear waste repository in salt. The methods are discussed according to category (i.e, main, submain, and panel pillars; barrier pillars; and shaft pillars). Of the 21 identified for main, submain, and panel pillars, one method, the Confined Core Method, is evaluated as being most appropriate for conceptual design. Five methods are considered potentially applicable. Of six methods identified for barrier pillars, one method based on the Load Transfer Distance concept is considered most appropriate for design. Based on the evaluation of 25 methods identified for shaft pillars, an approximate sizing criterion is proposed for use in conceptual design. Aspects of pillar performance relating to creep, ground deformation, interaction with roof and floor rock, and response to high temperature environments are not adequately addressed by existing empirical design methods. 152 refs., 22 figs., 14 tabs

  7. A Systematic Method for Search Term Selection in Systematic Reviews

    Science.gov (United States)

    Thompson, Jenna; Davis, Jacqueline; Mazerolle, Lorraine

    2014-01-01

    The wide variety of readily available electronic media grants anyone the freedom to retrieve published references from almost any area of research around the world. Despite this privilege, keeping up with primary research evidence is almost impossible because of the increase in professional publishing across disciplines. Systematic reviews are a…

  8. Methods, Mechanism, and Applications of Photodeposition in Photocatalysis: A Review

    NARCIS (Netherlands)

    Wenderich, Kasper; Mul, Guido

    2016-01-01

    In this review, for a variety of metals and semiconductors, an attempt is made to generalize observations in the literature on the effect of process conditions applied during photodeposition on (i) particle size distributions, (ii) oxidation states of the metals obtained, and (iii) consequences for

  9. Implementing Montessori Methods for Dementia: A Scoping Review.

    Science.gov (United States)

    Hitzig, Sander L; Sheppard, Christine L

    2017-10-01

    A scoping review was conducted to develop an understanding of Montessori-based programing (MBP) approaches used in dementia care and to identify optimal ways to implement these programs across various settings. Six peer-reviewed databases were searched for relevant abstracts by 2 independent reviewers. Included articles and book chapters were those available in English and published by the end of January 2016. Twenty-three articles and 2 book chapters met the inclusion criteria. Four approaches to implementing MBP were identified: (a) staff assisted (n = 14); (b) intergenerational (n = 5); (c) resident assisted (n = 4); and (d) volunteer or family assisted (n = 2). There is a high degree of variability with how MBP was delivered and no clearly established "best practices" or standardized protocol emerged across approaches except for resident-assisted MBP. The findings from this scoping review provide an initial road map on suggestions for implementing MBP across dementia care settings. Irrespective of implementation approach, there are several pragmatic and logistical issues that need to be taken into account for optimal implementation.

  10. Mapping Saldana's Coding Methods onto the Literature Review Process

    Science.gov (United States)

    Onwuegbuzie, Anthony J.; Frels, Rebecca K.; Hwang, Eunjin

    2016-01-01

    Onwuegbuzie and Frels (2014) provided a step-by-step guide illustrating how discourse analysis can be used to analyze literature. However, more works of this type are needed to address the way that counselor researchers conduct literature reviews. Therefore, we present a typology for coding and analyzing information extracted for literature…

  11. Writing Integrative Reviews of the Literature: Methods and Purposes

    Science.gov (United States)

    Torraco, Richard J.

    2016-01-01

    This article discusses the integrative review of the literature as a distinctive form of research that uses existing literature to create new knowledge. As an expansion and update of a previously published article on this topic, it acknowledges the growth and appeal of this form of research to scholars, it identifies the main components of the…

  12. Demystifying Mixed Methods Research Design: A Review of the Literature

    Science.gov (United States)

    Caruth, Gail D.

    2013-01-01

    Mixed methods research evolved in response to the observed limitations of both quantitative and qualitative designs and is a more complex method. The purpose of this paper was to examine mixed methods research in an attempt to demystify the design thereby allowing those less familiar with its design an opportunity to utilize it in future research.…

  13. The Healthcare Improvement Scotland evidence note rapid review process: providing timely, reliable evidence to inform imperative decisions on healthcare.

    Science.gov (United States)

    McIntosh, Heather M; Calvert, Julie; Macpherson, Karen J; Thompson, Lorna

    2016-06-01

    Rapid review has become widely adopted by health technology assessment agencies in response to demand for evidence-based information to support imperative decisions. Concern about the credibility of rapid reviews and the reliability of their findings has prompted a call for wider publication of their methods. In publishing this overview of the accredited rapid review process developed by Healthcare Improvement Scotland, we aim to raise awareness of our methods and advance the discourse on best practice. Healthcare Improvement Scotland produces rapid reviews called evidence notes using a process that has achieved external accreditation through the National Institute for Health and Care Excellence. Key components include a structured approach to topic selection, initial scoping, considered stakeholder involvement, streamlined systematic review, internal quality assurance, external peer review and updating. The process was introduced in 2010 and continues to be refined over time in response to user feedback and operational experience. Decision-makers value the responsiveness of the process and perceive it as being a credible source of unbiased evidence-based information supporting advice for NHSScotland. Many agencies undertaking rapid reviews are striving to balance efficiency with methodological rigour. We agree that there is a need for methodological guidance and that it should be informed by better understanding of current approaches and the consequences of different approaches to streamlining systematic review methods. Greater transparency in the reporting of rapid review methods is essential to enable that to happen.

  14. A review of analysis methods about thermal buckling

    International Nuclear Information System (INIS)

    Moulin, D.; Combescure, A.; Acker, D.

    1987-01-01

    This paper highlights the main items emerging from a large bibliographical survey carried out on strain-induced buckling analysis methods applicable in the building of fast neutron reactor structures. The work is centred on the practical analysis methods used in construction codes to account for the strain-buckling of thin and slender structures. Methods proposed in the literature concerning past and present studies are rapidly described. Experimental, theoretical and numerical methods are considered. Methods applicable to design and their degree of validation are indicated

  15. Review of methods for determination of ammonia volatilization in farmland

    Science.gov (United States)

    Yang, J.; Jiao, Y.; Yang, W. Z.; Gu, P.; Bai, S. G.; Liu, L. J.

    2018-02-01

    Ammonia is one of the most abundant alkaline trace gases in the atmosphere and an important factor affecting atmospheric quality. Excessive application of nitrogen fertilizer is the main source of global ammonia emissions, which not only exacerbates greenhouse gas emissions but also leads to eutrophication of water bodies. In this paper, the basic principle, operating procedure, advantages and disadvantages, and previous research results of each method are summarized in detail, covering the enclosure method, the venting method, the continuous airflow enclosure method, the wind tunnel method and the micro-meteorological method, so as to provide a theoretical basis for selecting an appropriate method for the determination of ammonia volatilization.

  16. Updated method guidelines for cochrane musculoskeletal group systematic reviews and metaanalyses

    DEFF Research Database (Denmark)

    Ghogomu, Elizabeth A T; Maxwell, Lara J; Buchbinder, Rachelle

    2014-01-01

    The Cochrane Musculoskeletal Group (CMSG), one of 53 groups of the not-for-profit, international Cochrane Collaboration, prepares, maintains, and disseminates systematic reviews of treatments for musculoskeletal diseases. It is important that authors conducting CMSG reviews and the readers of our reviews be aware of and use updated, state-of-the-art systematic review methodology. One hundred sixty reviews have been published. Previous method guidelines for systematic reviews of interventions in the musculoskeletal field published in 2006 have been substantially updated to incorporate ... using network metaanalysis. Method guidelines specific to musculoskeletal disorders are provided by CMSG editors for various aspects of undertaking a systematic review. These method guidelines will help improve the quality of reporting and ensure high standards of conduct as well as consistency across...

  17. Acoustic doppler methods for remote measurements of ocean flows - a review

    Digital Repository Service at National Institute of Oceanography (India)

    Joseph, A.

    The evolution of acoustic doppler methods for remote measurements of ocean flows has been briefly reviewed in historical perspective. Both Eulerian and profiling methods have been discussed. Although the first acoustic Doppler current meter has been...

  18. Test methods for evaluating hot cracking: Review and perspective

    International Nuclear Information System (INIS)

    Goodwin, G.M.

    1990-01-01

    The phenomenon of hot cracking is described and discussed, and criteria for tests to assess hot cracking are elucidated. The historical development of hot cracking tests is traced from the 1930s to present, with categorization of tests into several types. It is noted that the number of tests developed continues to increase dramatically. The number of literature citations also increases with time, with few popular tests receiving a major share of interest. Predominant countries of origin of both tests and citations shift with time, and a few journals account for most of the published information. Reviews of hot cracking are reviewed, and it is predicted that modeling and other developing analytical techniques will contribute greatly to an increase in our understanding of hot cracking. 30 refs., 10 figs., 1 tab

  19. Review on methods for determination of metallothioneins in aquatic organisms.

    Science.gov (United States)

    Shariati, Fatemeh; Shariati, Shahab

    2011-06-01

    One aspect of environmental degradation in coastal areas is pollution from toxic metals, which are persistent and are bioaccumulated by marine organisms, with serious public health implications. A conventional monitoring system of environmental metal pollution includes measuring the level of selected metals in the whole organism or in respective organs. However, measuring only the metal content in particular organs does not give information about its effect at the subcellular level. Therefore, the evaluation of biochemical biomarker metallothionein may be useful in assessing metal exposure and the prediction of potential detrimental effects induced by metal contamination. There are some methods for the determination of metallothioneins including spectrophotometric method, electrochemical methods, chromatography, saturation-based methods, immunological methods, electrophoresis, and RT-PCR. In this paper, different methods are discussed briefly and the comparison between them will be presented.

  20. Review of Synthetic Methods to Form Hollow Polymer Nanocapsules

    Energy Technology Data Exchange (ETDEWEB)

    Barker, Madeline T. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-03-13

    Syntactic foams have attracted growing interest because their mechanical strength and high damage tolerance have widened their range of applications. In the past, hollow glass or ceramic particles were used to create the pores. This paper reviews literature focused on the controlled synthesis of hollow polymer spheres with diameters ranging from 100 to 200 nm. By using hollow polymer spheres, syntactic foams could reach ultra-low densities.

  1. A review of the methods for neuronal response latency estimation

    DEFF Research Database (Denmark)

    Levakovaa, Marie; Tamborrino, Massimiliano; Ditlevsen, Susanne

    2015-01-01

    Neuronal response latency is usually vaguely defined as the delay between the stimulus onset and the beginning of the response. It contains important information for the understanding of the temporal code. For this reason, the detection of the response latency has been extensively studied in the ... by the stimulation using interspike intervals and spike times. The aim of this paper is to present a review of the main techniques proposed in both classes, highlighting their advantages and shortcomings.

  2. The review and results of different methods for facial recognition

    Science.gov (United States)

    Le, Yifan

    2017-09-01

    In recent years, facial recognition has drawn much attention due to its wide range of potential applications. As a unique technology in biometric identification, facial recognition represents a significant improvement since it can be operated without the cooperation of the people under detection. Hence, facial recognition is being adopted in defense systems, medical detection, human behavior understanding, etc. Several theories and methods have been established to make progress in facial recognition: (1) a novel two-stage facial landmark localization method with more accurate localization on a specific database; (2) a statistical face frontalization method that outperforms state-of-the-art methods for face landmark localization; (3) a general facial landmark detection algorithm that handles images with severe occlusion and large head poses; (4) three methods for face alignment, including a shape-augmented regression method, a pose-indexed multi-view method and a learning-based method that regresses local binary features. The aim of this paper is to analyze previous work on different aspects of facial recognition, focusing on the concrete methods and their performance on various databases. In addition, some improvement measures and suggestions for potential applications are put forward.

  3. Chemical analysis of cyanide in cyanidation process: review of methods

    International Nuclear Information System (INIS)

    Nova-Alonso, F.; Elorza-Rodriguez, E.; Uribe-Salas, A.; Perez-Garibay, R.

    2007-01-01

    In cyanidation, the worldwide method for precious metals recovery, the chemical analysis of cyanide is a very important but complex operation. Cyanide can be present as different species, each of them with different stability, toxicity, analysis method and elimination technique. For cyanide analysis there exists a wide selection of analytical methods, but most of them present difficulties because of the interference of species present in the solution. This paper presents the different methods available for the chemical analysis of cyanide - titration, specific electrode and distillation - with special emphasis on the interference problem, with the aim of helping in the interpretation of the results. (Author)

  4. Streamlining Transportation Corridor Planning Processes: Freight and Traffic Information

    Energy Technology Data Exchange (ETDEWEB)

    Franzese, Oscar [ORNL

    2010-08-01

    The traffic investigation is one of the most important parts of an Environmental Impact Statement of projects involving the construction of new roadway facilities and/or the improvement of existing ones. The focus of the traffic analysis is on the determination of anticipated traffic flow characteristics of the proposed project, by the application of analytical methods that can be grouped under the umbrella of capacity analysis methodologies. In general, the main traffic parameter used in EISs to describe the quality of traffic flow is the Level of Service (LOS). The current state of the practice in terms of the traffic investigations for EISs has two main shortcomings. The first one is related to the information that is necessary to conduct the traffic analysis, and specifically to the lack of integration among the different transportation models and the sources of information that, in general, reside in GIS databases. A discussion of the benefits of integrating CRS&SI technologies and the transportation models used in the EIS traffic investigation is included. The second shortcoming is in the presentation of the results, both in terms of the appearance and formatting, as well as content. The presentation of traffic results (current and proposed) is discussed. This chapter also addresses the need of additional data, in terms of content and coverage. Regarding the former, other traffic parameters (e.g., delays) that are more meaningful to non-transportation experts than LOS, as well as additional information (e.g., freight flows) that can impact traffic conditions and safety are discussed. Spatial information technologies can decrease the negative effects of, and even eliminate, these shortcomings by making the relevant information that is input to the models more complete and readily available, and by providing the means to communicate the results in a more clear and efficient manner. The benefits that the application and use of CRS&SI technologies can provide to

  5. Content validity of methods to assess malnutrition in cancer patients: a systematic review

    NARCIS (Netherlands)

    Sealy, Martine; Nijholt, Willemke; Stuiver, M.M.; van der Berg, M.M.; Ottery, Faith D.; van der Schans, Cees; Roodenburg, Jan L N; Jager-Wittenaar, Harriët

    Rationale: Inadequate operationalisation of the multidimensional concept of malnutrition may result in inadequate evaluation of nutritional status. In this review we aimed to assess content validity of methods

  6. Streamline processing of discrete nuclear spectra by means of authoregularized iteration process (the KOLOBOK code)

    International Nuclear Information System (INIS)

    Gadzhokov, V.; Penev, I.; Aleksandrov, L.

    1979-01-01

    A brief description is given of the KOLOBOK computer code, designed for streamline processing of discrete nuclear spectra with a symmetric Gaussian shape of the single line on computers of the ES series, models 1020 and above. The program solves the stream of nonlinear problems generated by discrete spectrometry by means of an authoregularized iteration process. The Fortran-4 text of the code is reported in an Appendix.
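
    The core numerical task described above - fitting a symmetric Gaussian line shape to a segment of a discrete spectrum - is a nonlinear least-squares problem. The original code solves it with an authoregularized iteration process in Fortran-4; the Python sketch below only illustrates the same kind of single-peak fit with a standard modern solver on synthetic data, and is in no way a reimplementation of KOLOBOK.

      import numpy as np
      from scipy.optimize import curve_fit

      def gauss_on_background(x, area, centre, sigma, b0, b1):
          # Single symmetric Gaussian peak sitting on a linear background.
          peak = area / (sigma * np.sqrt(2.0 * np.pi)) * np.exp(-0.5 * ((x - centre) / sigma) ** 2)
          return peak + b0 + b1 * x

      channels = np.arange(100, dtype=float)
      counts = gauss_on_background(channels, 500.0, 48.0, 3.0, 10.0, 0.05)
      counts += np.random.default_rng(0).poisson(5, channels.size)       # synthetic counting noise

      start = [counts.max() * 5.0, channels[counts.argmax()], 2.0, counts.min(), 0.0]
      popt, pcov = curve_fit(gauss_on_background, channels, counts, p0=start,
                             sigma=np.sqrt(np.clip(counts, 1.0, None)))  # approximate Poisson weights
      area, centre, sigma = popt[:3]                                     # fitted peak parameters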

  7. A review of methods for updating forest monitoring system estimates

    Science.gov (United States)

    Hector Franco-Lopez; Alan R. Ek; Andrew P. Robinson

    2000-01-01

    Intensifying interest in forests and the development of new monitoring technologies have induced major changes in forest monitoring systems in the last few years, including major revisions in the methods used for updating. This paper describes the methods available for projecting stand- and plot-level information, emphasizing advantages and disadvantages, and the...

  8. A historical review on ''magnetic focusing method'' in Japan

    International Nuclear Information System (INIS)

    Yamada, Y.; Tanaka, K.; Abe, Z.

    1986-01-01

    Several topics on the development of the magnetic focusing method and its recent progress are discussed. The magnetic focusing method will be effective for measuring the local NMR parameters, and the advanced imaging technique will also be as useful as the recent conventional NMR Imaging techniques

  9. Review of Research on Template Methods in Preparation of Nanomaterials

    Directory of Open Access Journals (Sweden)

    Yadian Xie

    2016-01-01

    The nanomaterials have been widely used in various fields, such as photonics, catalysis, and adsorption, because of their unique physical and chemical properties. Therefore, their production methods are of utmost importance. Compared with traditional synthetic methods, the template method can effectively control the morphology, particle size, and structure during the preparation of nanomaterials, which is an effective method for their synthesis. The key for the template method is to choose different templates, which are divided into hard template and soft template according to their different structures. In this paper, the effects of different types of templates on the morphology of nanomaterials during their preparation are investigated from two aspects: hard template and soft template, combined with the mechanism of action.

  10. A Review of Research Methods in Children's Technology Design

    DEFF Research Database (Denmark)

    Jensen, Janne Jul; Skov, Mikael B.

    2005-01-01

    Research methods have been objects of discussions for decades and defining research methods is still a quite substantial challenge. However, it is important to understand how research methods have been adapted in different disciplines as it potentially informs us on future directions and influences on the discipline. Inspired by previous studies from other disciplines, we conduct a survey of research methods in paper publications. 105 papers on children's technology design are classified on a two-dimensional matrix on research method and purpose. Our results show a strong focus on engineering of products as applied research and on evaluation of developed products in the field or in the lab. Also, we find that much research is conducted in natural setting environments with strong focus on field studies.

  11. The Effects of Propulsive Jetting on Drag of a Streamlined body

    Science.gov (United States)

    Krieg, Michael; Mohseni, Kamran

    2017-11-01

    Recently an abundance of bioinspired underwater vehicles have emerged to leverage eons of evolution. Our group has developed a propulsion technique inspired by jellyfish and squid. Propulsive jets are generated by ingesting and expelling water from a flexible internal cavity. We have demonstrated thruster capabilities for maneuvering on AUV platforms, where the internal thruster geometry minimized forward drag; however, such a setup cannot characterize propulsive efficiency. Therefore, we created a new streamlined vehicle platform that produces unsteady jets for forward propulsion rather than maneuvering. The streamlined jetting body is placed in a water tunnel and held stationary while jetting frequency and background flow velocity are varied. For each frequency/velocity pair the flow field is measured around the surface and in the wake using PIV. Using the zero jetting frequency as a baseline for each background velocity, the passive body drag is related to the velocity distribution. For cases with active jetting the drag and jetting forces are estimated from the velocity field and compared to the passive case. For this streamlined body, the entrainment of surrounding flow into the propulsive jet can reduce drag forces in addition to the momentum transfer of the jet itself. Office of Naval Research.

  12. District nursing workforce planning: a review of the methods.

    Science.gov (United States)

    Reid, Bernie; Kane, Kay; Curran, Carol

    2008-11-01

    District nursing services in Northern Ireland face increasing demands and challenges which may be responded to by effective and efficient workforce planning and development. The aim of this paper is to critically analyse district nursing workforce planning and development methods, in an attempt to find a suitable method for Northern Ireland. A systematic analysis of the literature reveals four methods: professional judgement; population-based health needs; caseload analysis and dependency-acuity. Each method has strengths and weaknesses. Professional judgement offers a 'belt and braces' approach but lacks sensitivity to fluctuating patient numbers. Population-based health needs methods develop staffing algorithms that reflect deprivation and geographical spread, but are poorly understood by district nurses. Caseload analysis promotes equitable workloads but poorly performing district nursing localities may continue if benchmarking processes only consider local data. Dependency-acuity methods provide a means of equalizing and prioritizing workload but are prone to district nurses overstating factors in patient dependency or understating carers' capability. In summary a mixed method approach is advocated to evaluate and adjust the size and mix of district nursing teams using empirically determined patient dependency and activity-based variables based on the population's health needs.

  13. Water demand forecasting: review of soft computing methods.

    Science.gov (United States)

    Ghalehkhondabi, Iman; Ardjmand, Ehsan; Young, William A; Weckman, Gary R

    2017-07-01

    Demand forecasting plays a vital role in resource management for governments and private companies. Considering the scarcity of water and its inherent constraints, demand management and forecasting in this domain are critically important. Several soft computing techniques have been developed over the last few decades for water demand forecasting. This study focuses on soft computing methods of water consumption forecasting published between 2005 and 2015. These methods include artificial neural networks (ANNs), fuzzy and neuro-fuzzy models, support vector machines, metaheuristics, and system dynamics. Furthermore, it is discussed that although ANNs have been superior in many short-term forecasting cases, it is still very difficult to pick a single method as the overall best. According to the literature, various methods and their hybrids are applied to water demand forecasting. However, it seems soft computing has a lot more to contribute to water demand forecasting. These contribution areas include, but are not limited to, various ANN architectures, unsupervised methods, deep learning, various metaheuristics, and ensemble methods. Moreover, it is found that soft computing methods are mainly used for short-term demand forecasting.
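
    As a rough illustration of the ANN-based short-term forecasting discussed above, the sketch below trains a small feed-forward network on a synthetic daily demand series using the previous seven days as inputs; the data, lag length, and scikit-learn model are illustrative assumptions, not taken from the reviewed studies.

        # Minimal ANN demand-forecasting sketch on synthetic data.
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)
        days = np.arange(400)
        demand = 100 + 10 * np.sin(2 * np.pi * days / 7) + rng.normal(0, 2, days.size)

        lag = 7
        X = np.array([demand[i:i + lag] for i in range(days.size - lag)])  # last 7 days
        y = demand[lag:]                                                   # next-day demand

        split = 300
        model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
        model.fit(X[:split], y[:split])

        pred = model.predict(X[split:])
        mape = np.mean(np.abs((y[split:] - pred) / y[split:])) * 100
        print(f"Hold-out MAPE: {mape:.1f}%")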

  14. Review of Monte Carlo methods for particle multiplicity evaluation

    CERN Document Server

    Armesto-Pérez, Nestor

    2005-01-01

    I present a brief review of the existing models for particle multiplicity evaluation in heavy ion collisions which are at our disposal in the form of Monte Carlo simulators. Models are classified according to the physical mechanisms with which they try to describe the different stages of a high-energy collision between heavy nuclei. A comparison of predictions, as available at the beginning of year 2000, for multiplicities in central AuAu collisions at the BNL Relativistic Heavy Ion Collider (RHIC) and PbPb collisions at the CERN Large Hadron Collider (LHC) is provided.

  15. Amperometric and coulometric methods of platinum metal determination. (Review)

    International Nuclear Information System (INIS)

    Ezerskaya, N.A.

    1981-01-01

    Works published in the period 1957-1979 on the amperometric and coulometric (potentiostatic and amperostatic variants) determination of platinum metals, Ru in particular, are reviewed. In amperometric titration of Ru the following titrants are used: hydroquinone, thiooxine, thiourea, and Na2S2O3. It is proposed to titrate Ru in the form of the ruthenate ion with hydrazine sulphate in alkaline medium, according to the current of reagent oxidation. In coulometric determination of Ru, the electrogenerated titrant TiCl3 or Ti2(SO4)3 (for the initial Ru form [RuCl6]2-) is used.
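
    The arithmetic behind an amperostatic (constant-current) coulometric determination follows Faraday's law, n = Q/(zF) with Q = I*t; the sketch below applies it with arbitrary example values, so the current, time and electron count are assumptions rather than data from the reviewed work.

        # Hedged illustration of a coulometric calculation via Faraday's law.
        F = 96485.0  # Faraday constant, C/mol

        def coulometric_amount(current_a, time_s, electrons):
            """Moles of analyte titrated by a constant current over a given time."""
            charge = current_a * time_s       # coulombs
            return charge / (electrons * F)   # mol

        n_ru = coulometric_amount(current_a=5e-3, time_s=120.0, electrons=1)  # assumed z = 1
        print(f"Amount determined: {n_ru * 1e6:.2f} umol")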

  16. Research methods in complementary and alternative medicine: an integrative review.

    Science.gov (United States)

    de Almeida Andrade, Fabiana; Schlechta Portella, Caio Fabio

    2018-01-01

    The scientific literature presents a modest amount of evidence in the use of complementary and alternative medicine (CAM). On the other hand, in practice, relevant results are common. The debates among CAM practitioners about the quality and execution of scientific research are important. Therefore, the aim of this review is to gather, synthesize and describe the differentiated methodological models that encompass the complexity of therapeutic interventions. The process of bringing evidence-based medicine into clinical practice in CAM is essential for the growth and strengthening of complementary medicines worldwide. Copyright © 2017 Shanghai Changhai Hospital. Published by Elsevier B.V. All rights reserved.

  17. Critical review of the probability of causation method

    International Nuclear Information System (INIS)

    Cox, L.A. Jr.; Fiksel, J.R.

    1985-01-01

    In a more controversial report than the others in the study, the authors use one scientific discipline to review the work of another discipline. Their proposal recognizes the imprecision that develops in moving from group to individual interpretations of causal effects by substituting the term assigned share for probability of causation. The authors conclude that the use of a formula will not provide reliable measures of risk attribution in individual cases. The gap between scientific certainty and assigning shares of responsibility must be filled by subjective value judgments supplied by the scientists. 22 references, 2 figures, 4 tables

  18. A review on exudates detection methods for diabetic retinopathy.

    Science.gov (United States)

    Joshi, Shilpa; Karule, P T

    2018-01-01

    The presence of exudates on the retina is the most characteristic symptom of diabetic retinopathy. As exudates are among the early clinical signs of DR, their detection would be an essential asset to the mass screening task and serve as an important step towards automatic grading and monitoring of the disease. Reliable identification and classification of exudates are of inherent interest in an automated diabetic retinopathy screening system. Here we review the numerous early studies on automatic exudate detection, with the aim of providing decision support in addition to reducing the workload of an ophthalmologist. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  19. Review of Monte Carlo methods for particle multiplicity evaluation

    International Nuclear Information System (INIS)

    Armesto, Nestor

    2005-01-01

    I present a brief review of the existing models for particle multiplicity evaluation in heavy ion collisions which are at our disposal in the form of Monte Carlo simulators. Models are classified according to the physical mechanisms with which they try to describe the different stages of a high-energy collision between heavy nuclei. A comparison of predictions, as available at the beginning of year 2000, for multiplicities in central AuAu collisions at the BNL Relativistic Heavy Ion Collider (RHIC) and PbPb collisions at the CERN Large Hadron Collider (LHC) is provided

  20. What methods do reviews of normative ethics literature use for search, selection, analysis, and synthesis? In-depth results from a systematic review of reviews.

    Science.gov (United States)

    Mertz, Marcel; Strech, Daniel; Kahrass, Hannes

    2017-12-19

    (Semi-)systematic approaches to finding, analysing, and synthesising ethics literature on medical topics are still in their infancy. However, our recent systematic review showed that the rate of publication of such (semi-)systematic reviews has increased in the last two decades. This is not only true for reviews of empirical ethics literature, but also for reviews of normative ethics literature. In the latter case, there is currently little in the way of standards and guidance available. Therefore, the methods and reporting strategies of such reviews vary greatly. The purpose of the follow-up study we present was to obtain deeper methodological insight into the ways reviews of normative literature are actually conducted and to analyse the methods used. Our search in the PubMed, PhilPapers, and Google Scholar databases led to the identification of 183 reviews of ethics literature published between 1997 and 2015, of which 84 were identified as reviews of normative and mixed literature. Qualitative content analysis was used to extract and synthesise descriptions of search, selection, quality appraisal, analysis, and synthesis methods. We further assessed quantitatively how often certain methods (e.g. search strategies, data analysis procedures) were used by the reviews. The overall reporting quality varies among the analysed reviews and was generally poor even for major criteria regarding the search and selection of literature. For example, only 24 (29%) used a PRISMA flowchart. Also, only 55 (66%) reviews mentioned the information unit they sought to extract, and 12 (14%) stated an ethical approach as the theoretical basis for the analysis. Interpretable information on the synthesis method was given by 47 (60%); the most common methods applied were qualitative methods commonly used in social science research (83%). Reviews which fail to provide sufficient relevant information to readers have reduced methodological transparency regardless of actual methodological

  1. Review of 3d GIS Data Fusion Methods and Progress

    Science.gov (United States)

    Hua, Wei; Hou, Miaole; Hu, Yungang

    2018-04-01

    3D data fusion is a research hotspot in the fields of computer vision and fine mapping, and plays an important role in fine measurement, risk monitoring, data display and other processes. At present, research on 3D data fusion in the field of surveying and mapping focuses on the 3D model fusion of terrain and ground objects. This paper summarizes the basic methods of 3D data fusion of terrain and ground objects in recent years, classifies the data structures and methods for establishing 3D models, and analyses and comments on some of the most widely used fusion methods.

  2. REVIEW OF 3D GIS DATA FUSION METHODS AND PROGRESS

    Directory of Open Access Journals (Sweden)

    W. Hua

    2018-04-01

    Full Text Available 3D data fusion is a research hotspot in the fields of computer vision and fine mapping, and plays an important role in fine measurement, risk monitoring, data display and other processes. At present, research on 3D data fusion in the field of surveying and mapping focuses on the 3D model fusion of terrain and ground objects. This paper summarizes the basic methods of 3D data fusion of terrain and ground objects in recent years, classifies the data structures and methods for establishing 3D models, and analyses and comments on some of the most widely used fusion methods.

  3. Review of noise reduction methods for centrifugal fans

    Science.gov (United States)

    Neise, W.

    1981-11-01

    Several methods for the reduction of centrifugal fan noise are presented, most of which are aimed at lowering the blade passage frequency level. The methods are grouped into five categories: casing modifications to increase the distance between impeller and cutoff, the introduction of a phase shift of the source pressure fluctuations, impeller modifications, radial clearance between impeller eye and inlet nozzle, and acoustical measures. Resonators mounted at the cutoff of centrifugal fans appear to be a highly efficient and simple means of reducing the blade passage tone, and the method can be used for new fan construction and existing installations without affecting the aerodynamic performance of the fan.

  4. Review of assessment methods discount rate in investment analysis

    Directory of Open Access Journals (Sweden)

    Yamaletdinova Guzel Hamidullovna

    2011-08-01

    Full Text Available The article examines the current methods of calculating discount rate in investment analysis and business valuation, as well as analyzes the key problems using various techniques in terms of the Russian economy.

  5. Review of methods for modelling forest fire risk and hazard

    African Journals Online (AJOL)

    user

    -Leal et al., 2006). Stolle and Lambin (2003) noted that flammable fuel depends on ... advantages over conventional fire detection and fire monitoring methods because of its repetitive and consistent coverage over large areas of land (Martin et ...

  6. Advanced Measuring (Instrumentation) Methods for Nuclear Installations: A Review

    Directory of Open Access Journals (Sweden)

    Wang Qiu-kuan

    2012-01-01

    Full Text Available Nuclear technology has been widely used around the world. Research on measurement in nuclear installations involves many aspects, such as nuclear reactors, the nuclear fuel cycle, safety and security, nuclear accident after-action analysis, and environmental applications. In recent decades, many advanced measuring devices and techniques have been widely applied in nuclear installations. This paper mainly introduces the development of measuring (instrumentation) methods for nuclear installations and the applications of these instruments and methods.

  7. Economics methods in Cochrane systematic reviews of health promotion and public health related interventions

    Directory of Open Access Journals (Sweden)

    McDaid David

    2006-11-01

    Full Text Available Abstract Background Provision of evidence on costs alongside evidence on the effects of interventions can enhance the relevance of systematic reviews to decision-making. However, patterns of use of economics methods alongside systematic review remain unclear. Reviews of evidence on the effects of interventions are published by both the Cochrane and Campbell Collaborations. Although it is not a requirement that Cochrane or Campbell Reviews should consider economic aspects of interventions, many do. This study aims to explore and describe approaches to incorporating economics methods in a selection of Cochrane systematic reviews in the area of health promotion and public health, to help inform development of methodological guidance on economics for reviewers. Methods The Cochrane Database of Systematic Reviews was searched using a search strategy for potential economic evaluation studies. We included current Cochrane reviews and review protocols retrieved using the search that are also identified as relevant to health promotion or public health topics. A reviewer extracted data which describe the economics components of included reviews. Extracted data were summarised in tables and analysed qualitatively. Results Twenty-one completed Cochrane reviews and seven review protocols met inclusion criteria. None incorporate formal economic evaluation methods. Ten completed reviews explicitly aim to incorporate economics studies and data. There is a lack of transparent reporting of methods underpinning the incorporation of economics studies and data. Some reviews are likely to exclude useful economics studies and data due to a failure to incorporate search strategies tailored to the retrieval of such data or use of key specialist databases, and application of inclusion criteria designed for effectiveness studies. Conclusion There is a need for consistency and transparency in the reporting and conduct of the economics components of Cochrane reviews, as

  8. A systematic review of methods for studying consumer health YouTube videos, with implications for systematic reviews

    Directory of Open Access Journals (Sweden)

    Margaret Sampson

    2013-09-01

    Full Text Available Background. YouTube is an increasingly important medium for consumer health information – with content provided by healthcare professionals, government and non-government organizations, industry, and consumers themselves. It is a rapidly developing area of study for healthcare researchers. We examine the methods used in reviews of YouTube consumer health videos to identify trends and best practices. Methods and Materials. Published reviews of consumer-oriented health-related YouTube videos were identified through PubMed. Data extracted from these studies included type of journal, topic, characteristics of the search, methods of review including number of reviewers and method to achieve consensus between reviewers, inclusion and exclusion criteria, characteristics of the videos reported, ethical oversight, and follow-up. Results. Thirty-three studies were identified. Most were recent and published in specialty journals. Typically, these included more than 100 videos, and were examined by multiple reviewers. Most studies described characteristics of the videos, number of views, and sometimes characteristics of the viewers. Accuracy of portrayal of the health issue under consideration was a common focus. Conclusion. Optimal transparency and reproducibility of studies of YouTube health-related videos can be achieved by following guidance designed for systematic review reporting, with attention to several elements specific to the video medium. Particularly when seeking to replicate consumer viewing behavior, investigators should consider the method used to select search terms, and use a snowballing rather than a sequential screening approach. Discontinuation protocols for online screening of relevance-ranked search results are an area identified for further development.

  9. A systematic review of methods for studying consumer health YouTube videos, with implications for systematic reviews

    Science.gov (United States)

    Cumber, Jordi; Li, Claudia; Pound, Catherine M.; Fuller, Ann; Harrison, Denise

    2013-01-01

    Background. YouTube is an increasingly important medium for consumer health information – with content provided by healthcare professionals, government and non-government organizations, industry, and consumers themselves. It is a rapidly developing area of study for healthcare researchers. We examine the methods used in reviews of YouTube consumer health videos to identify trends and best practices. Methods and Materials. Published reviews of consumer-oriented health-related YouTube videos were identified through PubMed. Data extracted from these studies included type of journal, topic, characteristics of the search, methods of review including number of reviewers and method to achieve consensus between reviewers, inclusion and exclusion criteria, characteristics of the videos reported, ethical oversight, and follow-up. Results. Thirty-three studies were identified. Most were recent and published in specialty journals. Typically, these included more than 100 videos, and were examined by multiple reviewers. Most studies described characteristics of the videos, number of views, and sometimes characteristics of the viewers. Accuracy of portrayal of the health issue under consideration was a common focus. Conclusion. Optimal transparency and reproducibility of studies of YouTube health-related videos can be achieved by following guidance designed for systematic review reporting, with attention to several elements specific to the video medium. Particularly when seeking to replicate consumer viewing behavior, investigators should consider the method used to select search terms, and use a snowballing rather than a sequential screening approach. Discontinuation protocols for online screening of relevance-ranked search results are an area identified for further development. PMID:24058879

  10. A systematic review of methods for studying consumer health YouTube videos, with implications for systematic reviews.

    Science.gov (United States)

    Sampson, Margaret; Cumber, Jordi; Li, Claudia; Pound, Catherine M; Fuller, Ann; Harrison, Denise

    2013-01-01

    Background. YouTube is an increasingly important medium for consumer health information - with content provided by healthcare professionals, government and non-government organizations, industry, and consumers themselves. It is a rapidly developing area of study for healthcare researchers. We examine the methods used in reviews of YouTube consumer health videos to identify trends and best practices. Methods and Materials. Published reviews of consumer-oriented health-related YouTube videos were identified through PubMed. Data extracted from these studies included type of journal, topic, characteristics of the search, methods of review including number of reviewers and method to achieve consensus between reviewers, inclusion and exclusion criteria, characteristics of the videos reported, ethical oversight, and follow-up. Results. Thirty-three studies were identified. Most were recent and published in specialty journals. Typically, these included more than 100 videos, and were examined by multiple reviewers. Most studies described characteristics of the videos, number of views, and sometimes characteristics of the viewers. Accuracy of portrayal of the health issue under consideration was a common focus. Conclusion. Optimal transparency and reproducibility of studies of YouTube health-related videos can be achieved by following guidance designed for systematic review reporting, with attention to several elements specific to the video medium. Particularly when seeking to replicate consumer viewing behavior, investigators should consider the method used to select search terms, and use a snowballing rather than a sequential screening approach. Discontinuation protocols for online screening of relevance-ranked search results are an area identified for further development.

  11. InterviewStreamliner, a minimalist, free, open source, relational approach to computer-assisted qualitative data analysis software

    NARCIS (Netherlands)

    H.D. Pruijt (Hans)

    2010-01-01

    InterviewStreamliner is a free, open source, minimalist alternative to complex computer-assisted qualitative data analysis packages. It builds on the flexibility of relational database management technology.

  12. Brownfields Assessing Contractor Capabilities for Streamlined Site Investigation: Additional Information Regarding All Appropriate Inquiries and Hiring an Environmental Professional

    Science.gov (United States)

    This document assists Brownfields grantees and other decision makers as they assess the capabilities of contractors and consultants to determine their qualifications to provide streamlined and innovative strategies for the assessment and cleanup.

  13. Indirect methods for reference interval determination - review and recommendations.

    Science.gov (United States)

    Jones, Graham R D; Haeckel, Rainer; Loh, Tze Ping; Sikaris, Ken; Streichert, Thomas; Katayev, Alex; Barth, Julian H; Ozarda, Yesim

    2018-04-19

    Reference intervals are a vital part of the information supplied by clinical laboratories to support interpretation of numerical pathology results such as are produced in clinical chemistry and hematology laboratories. The traditional method for establishing reference intervals, known as the direct approach, is based on collecting samples from members of a preselected reference population, making the measurements and then determining the intervals. An alternative approach is to perform analysis of results generated as part of routine pathology testing and using appropriate statistical techniques to determine reference intervals. This is known as the indirect approach. This paper from a working group of the International Federation of Clinical Chemistry (IFCC) Committee on Reference Intervals and Decision Limits (C-RIDL) aims to summarize current thinking on indirect approaches to reference intervals. The indirect approach has some major potential advantages compared with direct methods. The processes are faster, cheaper and do not involve patient inconvenience, discomfort or the risks associated with generating new patient health information. Indirect methods also use the same preanalytical and analytical techniques used for patient management and can provide very large numbers for assessment. Limitations to the indirect methods include possible effects of diseased subpopulations on the derived interval. The IFCC C-RIDL aims to encourage the use of indirect methods to establish and verify reference intervals, to promote publication of such intervals with clear explanation of the process used and also to support the development of improved statistical techniques for these studies.
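
    A deliberately simplified sketch of the indirect idea is given below: a reference interval is estimated directly from routine results by trimming gross outliers and taking the central 95% of what remains. Published indirect algorithms (for example Hoffmann- or Bhattacharya-type methods) are considerably more sophisticated; the analyte, the mixed population and the cut-offs here are invented purely for illustration.

        # Toy indirect reference-interval estimate from synthetic routine results.
        import numpy as np

        rng = np.random.default_rng(1)
        healthy = rng.normal(140, 2.0, 9000)    # sodium-like analyte, mmol/L (synthetic)
        diseased = rng.normal(128, 6.0, 1000)   # abnormal subpopulation mixed in
        routine_results = np.concatenate([healthy, diseased])

        # Crude trim of gross outliers before estimating the interval
        lo, hi = np.percentile(routine_results, [1, 99])
        trimmed = routine_results[(routine_results >= lo) & (routine_results <= hi)]

        ref_low, ref_high = np.percentile(trimmed, [2.5, 97.5])
        print(f"Indirectly derived interval: {ref_low:.1f} - {ref_high:.1f} mmol/L")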

  14. Biodiesel production methods of rubber seed oil: a review

    Science.gov (United States)

    Ulfah, M.; Mulyazmi; Burmawi; Praputri, E.; Sundari, E.; Firdaus

    2018-03-01

    The utilization of rubber seed as a raw material for biodiesel production is considered highly promising in Indonesia. The availability of rubber seeds in Indonesia is estimated at about 5 million tons per annum, which can yield about 2 million tons of rubber seed oil per year. Because the demand for edible oils as a food source is tremendous and edible oil feedstocks are far too expensive to be used as fuel, producing biodiesel from non-edible oils such as rubber seed oil is an effective way to overcome the problems associated with edible oils. Various methods for producing biodiesel from rubber seed oil have been reported. This paper introduces optimum conditions for biodiesel production methods from rubber seed oil. This article was written to be a reference in the selection of methods and the further development of biodiesel production from rubber seed oil. Biodiesel production methods for rubber seed oil have been developed by means of homogeneous catalysts, heterogeneous catalysts, supercritical methods, ultrasound, in-situ and enzymatic processes. Production of biodiesel from rubber seed oil using clinker-loaded sodium methoxide as a catalyst is very interesting to be studied and developed further.

  15. A review of monitoring methods for pharmaceutical wet granulation.

    Science.gov (United States)

    Hansuld, E M; Briens, L

    2014-09-10

    High-shear wet granulation is commonly used in the pharmaceutical industry to improve powder properties for downstream processes such as tabletting. Granule growth, however, is difficult to predict because the process is sensitive to raw material properties and operating conditions. Development of process analytical technologies is encouraged by regulatory bodies to improve process understanding and monitor quality online. The primary technologies investigated for high-shear wet granulation monitoring include power consumption, near-infrared spectroscopy, Raman spectroscopy, capacitance measurements, microwave measurements, imaging, focused beam reflectance measurements, spatial filter velocimetry, stress and vibration measurements, as well as acoustic emissions. This review summarizes relevant research related to each of these technologies and discusses the challenges associated with each approach as a possible process analytical technology tool for high-shear wet granulation. Copyright © 2014 Elsevier B.V. All rights reserved.

  16. A Review of Imaging Methods for Prostate Cancer Detection

    Directory of Open Access Journals (Sweden)

    Saradwata Sarkar

    2016-01-01

    Full Text Available Imaging is playing an increasingly important role in the detection of prostate cancer (PCa). This review summarizes the key imaging modalities – multiparametric ultrasound (US), multiparametric magnetic resonance imaging (MRI), MRI-US fusion imaging, and positron emission tomography (PET) imaging – used in the diagnosis and localization of PCa. Emphasis is laid on the biological and functional characteristics of tumors that rationalize the use of a specific imaging technique. Changes to anatomical architecture of tissue can be detected by anatomical grayscale US and T2-weighted MRI. Tumors are known to progress through angiogenesis – a fact exploited by Doppler and contrast-enhanced US and dynamic contrast-enhanced MRI. The increased cellular density of tumors is targeted by elastography and diffusion-weighted MRI. PET imaging employs several different radionuclides to target the metabolic and cellular activities during tumor growth. Results from studies using these various imaging techniques are discussed and compared.

  17. Methods of ammonia removal in anaerobic digestion: a review.

    Science.gov (United States)

    Krakat, Niclas; Demirel, Burak; Anjum, Reshma; Dietz, Donna

    2017-10-01

    The anaerobic digestion of substrates with high ammonia content has always been a bottleneck in the methanisation process of biomasses. Since microbial communities in anaerobic digesters are sensitive to free ammonia at certain conditions, the digestion of nitrogen-rich substrates such as livestock wastes may result in inhibition/toxicity eventually leading to process failures, unless appropriate engineering precautions are taken. There are many different options reported in literature to remove ammonia from anaerobic digesters to achieve a safe and stable process so that along with high methane yields, a good quality of effluents can also be obtained. Conventional techniques to remove ammonia include physical/chemical methods, immobilization and adaptation of microorganisms, while novel methods include ultrasonication, microwave, hollow fiber membranes and microbial fuel cell applications. This paper discusses conventional and novel methods of ammonia removal from anaerobic digesters using nitrogen-rich substrates, with particular focus on recent literature available about this topic.

  18. Task analysis methods applicable to control room design review (CDR)

    International Nuclear Information System (INIS)

    Moray, N.P.; Senders, J.W.; Rhodes, W.

    1985-06-01

    This report presents the results of a research study conducted in support of the human factors engineering program of the Atomic Energy Control Board in Canada. It contains five products which may be used by the Atomic Energy Control Board in relation to Task Analysis of jobs in CANDU nuclear power plants: 1. a detailed method for preparing for a task analysis; 2. a Task Data Form for recording task analysis data; 3. a detailed method for carrying out task analyses; 4. a guide to assessing alternative methods for performing task analyses, if such are proposed by utilities or consultants; and 5. an annotated bibliography on task analysis. In addition, a short explanation of the origins, nature and uses of task analysis is provided, with some examples of its cost effectiveness. 35 refs

  19. Review of control rod calibration methods for irradiated AGRs

    Energy Technology Data Exchange (ETDEWEB)

    Telford, A. R.R.

    1975-10-15

    Methods of calibrating control rods with particular reference to irradiated CAGR are surveyed. Some systematic spatial effects are found and an estimate of their magnitude made. It is concluded that control rod oscillation provides a promising method of calibrating rods at power which is as yet untried on CAGR. Also the rod drop using inverse kinetics provides a rod calibration but spatial effects may be large and these would be difficult to correct theoretically. The pulsed neutron technique provides a calibration route with small errors due to spatial effects provided a suitable K-tube can be developed. The xenon transient method is shown to have spatial effects which have not needed consideration in earlier reactors but which in CAGR would need very careful evaluation.

  20. REVIEW ON NATURAL METHODS FOR WASTE WATER TREATMENT

    Directory of Open Access Journals (Sweden)

    Ashwani Kumar Dubey

    2014-01-01

    Full Text Available In Ethiopia, the most common method of disposal of waste water is by land spreading. This treatment method has numerous problems, namely high labor requirements and the potential for eutrophication of surface and ground waters. Constructed wetlands are commonly used for treatment of secondary municipal wastewaters and they have been gaining popularity for treatment of agricultural wastewaters in Ethiopia. Intermittent sand filtration may offer an alternative to traditional treatment methods. As well as providing comparable treatment performance, they also have a smaller footprint, due to the substantially higher organic loading rates that may be applied to their surfaces. This paper discusses the performance and design criteria of constructed wetlands for the treatment of domestic and agricultural wastewater, and sand filters for the treatment of domestic wastewater. It also proposes sand filtration as an alternative treatment mechanism for agricultural wastewater and suggests design guidelines.

  1. Pregnancy diagnosis in sheep: review of the most practical methods

    International Nuclear Information System (INIS)

    Karen, A.; Szenci, O.; Kovacs, P.; Beckers, J.F.

    2001-01-01

    Various practical methods have been used for pregnancy diagnosis in sheep: radiography, rectal abdominal palpation, assessment of progesterone, assessment of estrone sulphate, RIA assay of placental lactogen, assessment of pregnancy proteins or pregnancy-associated glycoproteins, A-mode ultrasound, Doppler ultrasound and real-time B-mode ultrasonography. Real-time, B-mode ultrasonography appears to be the most practical and accurate method for diagnosing pregnancy and determining fetal numbers in sheep. Transabdominal B-mode ultrasonography achieved high accuracy for pregnancy diagnosis (94-100 %) and the determination of fetal numbers (92-99 %) on d 29 to 106 of gestation

  2. Review of methods for the integration of reliability and design engineering

    International Nuclear Information System (INIS)

    Reilly, J.T.

    1978-03-01

    A review of methods for the integration of reliability and design engineering was carried out to establish a reliability program philosophy, an initial set of methods, and procedures to be used by both the designer and reliability analyst. The report outlines a set of procedures which implements a philosophy that requires increased involvement by the designer in reliability analysis. Discussions of each method reviewed include examples of its application

  3. Guessing right for the next war: streamlining, pooling, and right-timing force design decisions for an environment of uncertainty

    Science.gov (United States)

    2017-05-25

    key ingredients for not only how the Army fought World War II, but also how it continues to organize today. In essence, streamlining pares down every... Germans. The Battle of Mortain reflected the US Army in World War II at its best. It defined US Army success in the European theater of operations... continues to organize today. In essence, streamlining pared down every unit to its essentials based around a critical capability it provided to

  4. The glenoid track: a review of the clinical relevance, method of calculation and current evidence behind this method

    Energy Technology Data Exchange (ETDEWEB)

    Younan, Yara; Wong, Philip K.; Umpierrez, Monica; Gonzalez, Felix; Singer, Adam Daniel [Emory University Hospital, Department of Radiology and Imaging Sciences, Section of Musculoskeletal Imaging, Atlanta, GA (United States); Karas, Spero [Emory University Hospital, Department of Orthopedic Surgery, Atlanta, GA (United States); Jose, Jean [University of Miami, Department of Radiology, Miami, FL (United States)

    2017-12-15

    In the setting of bipolar bone injury, orthopedic surgeons are currently making use of the glenoid track method to guide surgical management. Using preoperative CT or MR imaging, this method allows the identification of patients who are more likely to fail a primary capsuloligamentous Bankart repair. As the glenoid track method becomes increasingly used in preoperative planning, it is important for the radiologist to become familiar with its concept and method of calculation. This review article aims to concisely summarize the current literature and the clinical implications of the glenoid track method. (orig.)

  5. Review of current study methods for VRU safety

    DEFF Research Database (Denmark)

    Andersen, Camilla Sloth; Kamaluddin, Noor Azreena; Várhelyi, András

    written questionnaires (either online or paper-based), interviews may be performed (either face-to-face or via telephone) and people may be asked to report their accident via an app on their mobile device. The method for gaining self-reported information thus varies greatly – and so does the information...

  6. Literature Review on Processing and Analytical Methods for ...

    Science.gov (United States)

    The purpose of this report was to survey the open literature to determine the current state of the science regarding the processing and analytical methods currently available for recovery of F. tularensis from water and soil matrices, and to determine what gaps remain in the collective knowledge concerning F. tularensis identification from environmental samples.

  7. Coordinating contracts in SCM : a review of methods and literature

    NARCIS (Netherlands)

    Hezarkhani, B.; Kubiak, W.

    2010-01-01

    Supply chain coordination through contracts has been a burgeoning area of research in recent years. In spite of rapid development of research, there are only a few structured analyses of assumptions, methods, and applicability of insights in this field. The aim of this paper is to provide a

  8. Methods to Evaluate Corrosion in Buried Steel Structures: A Review

    Directory of Open Access Journals (Sweden)

    Lorena-de Arriba-Rodriguez

    2018-05-01

    Full Text Available Around the world, there are thousands of metal structures completely or partially buried in the soil. The main concern in their design is corrosion. Corrosion is a mechanism that degrades materials and causes structural failures in infrastructures, which can lead to severe effects on the environment and have a direct impact on population health. In addition, corrosion is extremely complex in the underground environment due to the variability of local conditions. The problem is that there are many methods for its evaluation, but none has been clearly established. In order to ensure the useful life of such structures, engineers usually allow an excess thickness, which increases the economic cost of manufacturing and does not satisfy the principles of efficiency in the use of resources. In this paper, an extended review of the existing methods to evaluate corrosion is carried out to optimize the design of buried steel structures according to their service life. The methods are classified into two categories depending on the information they provide: qualitative and quantitative. As a result, it is concluded that the most exhaustive methodologies for estimating soil corrosion are quantitative methods fed by non-electrochemical data, based on experimental studies that measure the mass loss of structures.
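
    One example of the quantitative, mass-loss based evaluation the review favours is the standard coupon calculation, in which a uniform corrosion rate is derived from the mass lost over a known exposed area and exposure time (an ASTM G1-style formula); the coupon values below are invented.

        # Uniform corrosion rate from coupon mass loss (illustrative values).
        def corrosion_rate_mm_per_year(mass_loss_g, area_cm2, hours, density_g_cm3):
            K = 8.76e4  # unit-conversion constant giving mm/year
            return K * mass_loss_g / (area_cm2 * hours * density_g_cm3)

        # Assumed buried carbon-steel coupon after one year of exposure:
        rate = corrosion_rate_mm_per_year(mass_loss_g=0.85, area_cm2=50.0,
                                          hours=8760.0, density_g_cm3=7.85)
        print(f"Corrosion rate: {rate:.3f} mm/year")  # roughly 0.022 mm/year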

  9. A Review of Family Planning Methods Used in Kano, Nigeria ...

    African Journals Online (AJOL)

    Method: All records of the clients that attended the Family Planning Clinic from January 2003 to December 2007 were analyzed. Results: New clients were 22% while revisits were 78%, with a steady increase in the number of new clients from 4% in 2003 to 26% in 2007. Injectable contraceptives were the most commonly ...

  10. Accident Analysis Methods and Models — a Systematic Literature Review

    NARCIS (Netherlands)

    Wienen, Hans Christian Augustijn; Bukhsh, Faiza Allah; Vriezekolk, E.; Wieringa, Roelf J.

    2017-01-01

    As part of our co-operation with the Telecommunication Agency of the Netherlands, we want to formulate an accident analysis method and model for use in incidents in telecommunications that cause service unavailability. In order to not re-invent the wheel, we wanted to first get an overview of all

  11. Bibliography of reviews and methods of photosynthesis-85

    Czech Academy of Sciences Publication Activity Database

    Šesták, Zdeněk; Čatský, Jiří

    2002-01-01

    Roč. 39, č. 4 (2002), s. 615-640 ISSN 0300-3604 R&D Projects: GA AV ČR KSK5020115 Institutional research plan: CEZ:AV0Z5038910 Keywords : methods of photosynthesis Subject RIV: EF - Botanics Impact factor: 0.773, year: 2002

  12. Assessing Internet energy intensity: A review of methods and results

    Energy Technology Data Exchange (ETDEWEB)

    Coroama, Vlad C., E-mail: vcoroama@gmail.com [Instituto Superior Técnico, Universidade Técnica de Lisboa, Av. Rovisco Pais 1, 1049-001 Lisboa (Portugal); Hilty, Lorenz M. [Department of Informatics, University of Zurich, Binzmühlestrasse 14, 8050 Zurich (Switzerland); Empa, Swiss Federal Laboratories for Materials Science and Technology, Lerchenfeldstr. 5, 9014 St. Gallen (Switzerland); Centre for Sustainable Communications, KTH Royal Institute of Technology, Lindstedtsvägen 5, 100 44 Stockholm (Sweden)

    2014-02-15

    Assessing the average energy intensity of Internet transmissions is a complex task that has been a controversial subject of discussion. Estimates published over the last decade diverge by up to four orders of magnitude — from 0.0064 kilowatt-hours per gigabyte (kWh/GB) to 136 kWh/GB. This article presents a review of the methodological approaches used so far in such assessments: i) top–down analyses based on estimates of the overall Internet energy consumption and the overall Internet traffic, whereby average energy intensity is calculated by dividing energy by traffic for a given period of time, ii) model-based approaches that model all components needed to sustain an amount of Internet traffic, and iii) bottom–up approaches based on case studies and generalization of the results. Our analysis of the existing studies shows that the large spread of results is mainly caused by two factors: a) the year of reference of the analysis, which has significant influence due to efficiency gains in electronic equipment, and b) whether end devices such as personal computers or servers are included within the system boundary or not. For an overall assessment of the energy needed to perform a specific task involving the Internet, it is necessary to account for the types of end devices needed for the task, while the energy needed for data transmission can be added based on a generic estimate of Internet energy intensity for a given year. Separating the Internet as a data transmission system from the end devices leads to more accurate models and to results that are more informative for decision makers, because end devices and the networking equipment of the Internet usually belong to different spheres of control. -- Highlights: • Assessments of the energy intensity of the Internet differ by a factor of 20,000. • We review top–down, model-based, and bottom–up estimates from literature. • Main divergence factors are the year studied and the inclusion of end devices
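
    The top-down approach described above reduces to a simple ratio, as the hypothetical figures below illustrate: average intensity is total transmission energy divided by total traffic, and end-device energy is then added separately for a specific task. None of the numbers are estimates from the reviewed studies.

        # Toy top-down intensity estimate plus a per-task energy calculation.
        network_energy_kwh = 3.0e10   # assumed annual energy use of transmission networks, kWh
        network_traffic_gb = 6.0e11   # assumed annual Internet traffic, GB
        intensity = network_energy_kwh / network_traffic_gb
        print(f"Top-down network intensity: {intensity:.3f} kWh/GB")

        # Energy for a task, keeping the end device outside the network boundary
        data_gb = 2.5           # assumed data transferred for the task
        device_power_kw = 0.05  # assumed laptop power draw
        hours = 1.5             # assumed duration
        task_energy = data_gb * intensity + device_power_kw * hours
        print(f"Task energy: {task_energy:.3f} kWh")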

  13. Assessing Internet energy intensity: A review of methods and results

    International Nuclear Information System (INIS)

    Coroama, Vlad C.; Hilty, Lorenz M.

    2014-01-01

    Assessing the average energy intensity of Internet transmissions is a complex task that has been a controversial subject of discussion. Estimates published over the last decade diverge by up to four orders of magnitude — from 0.0064 kilowatt-hours per gigabyte (kWh/GB) to 136 kWh/GB. This article presents a review of the methodological approaches used so far in such assessments: i) top–down analyses based on estimates of the overall Internet energy consumption and the overall Internet traffic, whereby average energy intensity is calculated by dividing energy by traffic for a given period of time, ii) model-based approaches that model all components needed to sustain an amount of Internet traffic, and iii) bottom–up approaches based on case studies and generalization of the results. Our analysis of the existing studies shows that the large spread of results is mainly caused by two factors: a) the year of reference of the analysis, which has significant influence due to efficiency gains in electronic equipment, and b) whether end devices such as personal computers or servers are included within the system boundary or not. For an overall assessment of the energy needed to perform a specific task involving the Internet, it is necessary to account for the types of end devices needed for the task, while the energy needed for data transmission can be added based on a generic estimate of Internet energy intensity for a given year. Separating the Internet as a data transmission system from the end devices leads to more accurate models and to results that are more informative for decision makers, because end devices and the networking equipment of the Internet usually belong to different spheres of control. -- Highlights: • Assessments of the energy intensity of the Internet differ by a factor of 20,000. • We review top–down, model-based, and bottom–up estimates from literature. • Main divergence factors are the year studied and the inclusion of end devices

  14. Out of the frying pan? Streamlining the ethics review process of multisite qualitative research projects.

    Science.gov (United States)

    Iedema, Rick A M; Allen, Suellen; Britton, Kate; Hor, Suyin

    2013-05-01

    This paper describes the ethics approval processes for two multicentre, nationwide, qualitative health service research projects. The paper explains that the advent of the National Ethics Application Form has brought many improvements, but that attendant processes put in place at local health network and Human Research Ethics Committee levels may have become significantly more complicated, particularly for innovative qualitative research projects. The paper raises several questions based on its analysis of ethics application processes currently in place. WHAT IS KNOWN ABOUT THE TOPIC? The complexity of multicentre research ethics applications for research in health services has been addressed by the introduction of the National Ethics Application Form. Uptake of the form across the country's human research ethics committees has been uneven. WHAT DOES THIS PAPER ADD? This paper adds detailed insight into the ethics application process as it is currently enacted across the country. The paper details this process with reference to difficulties faced by multisite and qualitative studies in negotiating access to research sites, ethics committees' relative unfamiliarity with qualitative research, and apparent tensions between harmonisation and local sites' autonomy in approving research. WHAT ARE THE IMPLICATIONS FOR PRACTITIONERS? Practitioners aiming to engage in research need to be aware that ethics approval takes place in an uneven procedural landscape, made up of variable levels of ethics approval harmonization and intricate governance or site-specific assessment processes.

  15. 77 FR 68790 - Program Comment Issued for Streamlining Section 106 Review for Actions Affecting Post-1945...

    Science.gov (United States)

    2012-11-16

    ... Program Comment including: Removing reinforced concrete rigid frames, metal rigid frames, and curved metal... culverts and reinforced concrete boxes among the common bridge types covered by the Program Comment. Two... Program Comment: Program Comment for Common Post-1945 Concrete and Steel Bridges I. Introduction Every...

  16. What is the most appropriate knowledge synthesis method to conduct a review? Protocol for a scoping review

    Directory of Open Access Journals (Sweden)

    Kastner Monika

    2012-08-01

    Full Text Available Abstract Background A knowledge synthesis attempts to summarize all pertinent studies on a specific question, can improve the understanding of inconsistencies in diverse evidence, and can identify gaps in research evidence to define future research agendas. Knowledge synthesis activities in healthcare have largely focused on systematic reviews of interventions. However, a wider range of synthesis methods has emerged in the last decade addressing different types of questions (e.g., realist synthesis to explore mediating mechanisms and moderators of interventions). Many different knowledge synthesis methods exist in the literature across multiple disciplines, but locating these, particularly for qualitative research, presents challenges. There is a need for a comprehensive manual for synthesis methods (quantitative, qualitative, or mixed), outlining how these methods are related and how to match the most appropriate knowledge synthesis method to a research question. The objectives of this scoping review are to: 1) conduct a systematic search of the literature for knowledge synthesis methods across multi-disciplinary fields; 2) compare and contrast the different knowledge synthesis methods; and 3) map out the specific steps to conducting the knowledge syntheses to inform the development of a knowledge synthesis methods manual/tool. Methods We will search relevant electronic databases (e.g., MEDLINE, CINAHL), grey literature, and discipline-based listservs. The scoping review will consider all study designs including qualitative and quantitative methodologies (excluding economic analysis or clinical practice guideline development), and identify knowledge synthesis methods across the disciplines of health, education, sociology, and philosophy. Two reviewers will pilot-test the screening criteria and data abstraction forms, and will independently screen the literature and abstract the data. A three-step synthesis process will be used to map the

  17. Streamlined Approach for Environmental Restoration Plan for Corrective Action Unit 574: Neptune, Nevada National Security Site, Nevada

    Energy Technology Data Exchange (ETDEWEB)

    NSTec Environmental Restoration

    2011-08-31

    This Streamlined Approach for Environmental Restoration (SAFER) Plan identifies the activities required for closure of Corrective Action Unit (CAU) 574, Neptune. CAU 574 is included in the Federal Facility Agreement and Consent Order (FFACO) (1996 [as amended March 2010]) and consists of the following two Corrective Action Sites (CASs) located in Area 12 of the Nevada National Security Site: (1) CAS 12-23-10, U12c.03 Crater (Neptune); (2) CAS 12-45-01, U12e.05 Crater (Blanca). This plan provides the methodology for the field activities that will be performed to gather the necessary information for closure of the two CASs. There is sufficient information and process knowledge regarding the expected nature and extent of potential contaminants to recommend closure of CAU 574 using the SAFER process. Based on historical documentation, personnel interviews, site process knowledge, site visits, photographs, field screening, analytical results, the results of the data quality objective (DQO) process (Section 3.0), and an evaluation of corrective action alternatives (Appendix B), closure in place with administrative controls is the expected closure strategy for CAU 574. Additional information will be obtained by conducting a field investigation to verify and support the expected closure strategy and provide a defensible recommendation that no further corrective action is necessary. This will be presented in a Closure Report that will be prepared and submitted to the Nevada Division of Environmental Protection (NDEP) for review and approval.

  18. The potential of high resolution melting analysis (hrma) to streamline, facilitate and enrich routine diagnostics in medical microbiology.

    Science.gov (United States)

    Ruskova, Lenka; Raclavsky, Vladislav

    2011-09-01

    Routine medical microbiology diagnostics relies on conventional cultivation followed by phenotypic techniques for identification of pathogenic bacteria and fungi. This is not only due to tradition and economy but also because it provides pure culture needed for antibiotic susceptibility testing. This review focuses on the potential of High Resolution Melting Analysis (HRMA) of double-stranded DNA for future routine medical microbiology. The MEDLINE database was searched for publications showing the advantages of HRMA in routine medical microbiology for identification, strain typing and further characterization of pathogenic bacteria and fungi in particular. The results show increasing numbers of newly-developed and more tailor-made assays in this field. For microbiologists unfamiliar with technical aspects of HRMA, we also provide insight into the technique from the perspective of microbial characterization. We can anticipate that the routine availability of HRMA in medical microbiology laboratories will provide a strong stimulus to this field. This is already envisioned by the growing number of medical microbiology applications published recently. The speed, power, convenience and cost effectiveness of this technology virtually predestine that it will advance genetic characterization of microbes and streamline, facilitate and enrich diagnostics in routine medical microbiology without interfering with the proven advantages of conventional cultivation.

  19. Optimal and adaptive methods of processing hydroacoustic signals (review)

    Science.gov (United States)

    Malyshkin, G. S.; Sidel'nikov, G. B.

    2014-09-01

    Different methods of optimal and adaptive processing of hydroacoustic signals for multipath propagation and scattering are considered. Advantages and drawbacks of the classical adaptive (Capon, MUSIC, and Johnson) algorithms and "fast" projection algorithms are analyzed for the case of multipath propagation and scattering of strong signals. The classical optimal approaches to detecting multipath signals are presented. A mechanism of controlled normalization of strong signals is proposed to automatically detect weak signals. The results of simulating the operation of different detection algorithms for a linear equidistant array under multipath propagation and scattering are presented. An automatic detector is analyzed, which is based on classical or fast projection algorithms, which estimates the background proceeding from median filtering or the method of bilateral spatial contrast.
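
    To make the classical adaptive approach concrete, the sketch below computes the Capon (MVDR) spatial spectrum for a uniform linear array from a synthetic sample covariance matrix; the array geometry, source directions, noise level, and diagonal loading are illustrative assumptions rather than parameters taken from the review.

        # Capon (MVDR) spatial spectrum for a synthetic uniform linear array.
        import numpy as np

        M, d = 16, 0.5               # sensors, spacing in wavelengths
        snapshots = 200
        angles_true = [-20.0, 15.0]  # synthetic source directions, degrees

        def steering(theta_deg):
            phase = 2 * np.pi * d * np.sin(np.deg2rad(theta_deg))
            return np.exp(1j * phase * np.arange(M))

        rng = np.random.default_rng(2)
        X = sum(np.outer(steering(a), rng.standard_normal(snapshots)) for a in angles_true)
        X = X + 0.1 * (rng.standard_normal((M, snapshots))
                       + 1j * rng.standard_normal((M, snapshots)))

        R = X @ X.conj().T / snapshots                   # sample covariance
        R_inv = np.linalg.inv(R + 1e-6 * np.eye(M))      # diagonal loading for stability

        scan = np.linspace(-90, 90, 361)
        p = np.array([1.0 / np.real(steering(t).conj() @ R_inv @ steering(t)) for t in scan])
        print(f"Strongest arrival near {scan[np.argmax(p)]:.1f} deg")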

  20. Review of nonchemical methods for controlling stored products pests

    International Nuclear Information System (INIS)

    Ignatowicz, S.

    1996-01-01

    Fumigation of stored products with methyl bromide has been an important means of limiting the loss of quality and quantity of these commodities that are subject to attack by cosmopolitan stored product pests. Methyl bromide was identified as an ozone-depleting substance, and is expected to be withdrawn from production, importation, and use in Poland and other countries soon after 2000. Based on the current knowledge, most of the alternatives to methyl bromide (controlled atmospheres, heat, cold, irradiation, biotechnical methods, inert dusts, biological methods, sanitation) have researchable gaps or other constraints. None of these alternatives used alone will replace methyl bromide. Successful pest control in the absence of methyl bromide will require the development of sophisticated pest monitoring and decision support systems to enable the use of integrated pest management strategies. (author)

  1. Methods in Entrepreneurship Education Research: A Review and Integrative Framework

    DEFF Research Database (Denmark)

    Blenker, Per; Trolle Elmholdt, Stine; Frederiksen, Signe Hedeboe

    2014-01-01

    is fragmented both conceptually and methodologically. Findings suggest that the methods applied in entrepreneurship education research cluster in two groups: 1. quantitative studies of the extent and effect of entrepreneurship education, and 2. qualitative single case studies of different courses and programmes... It integrates qualitative and quantitative techniques, the use of research teams consisting of insiders (teachers studying their own teaching) and outsiders (research collaborators studying the education) as well as multiple types of data. To gain both in-depth and analytically generalizable studies... a variety of helpful methods, explore the potential relation between insiders and outsiders in the research process, and discuss how different types of data can be combined. The integrated framework urges researchers to extend investments in methodological efforts and to enhance the in-depth understanding...

  2. A review on mathematical methods of conventional and Islamic derivatives

    Science.gov (United States)

    Hisham, Azie Farhani Badrol; Jaffar, Maheran Mohd

    2014-12-01

    Despite the impressive growth of risk management tools in financial institutions, Islamic finance remains far behind the conventional institutions. Islamic finance products need to comply with syariah law and prohibitions; therefore, they can use fewer of the available risk management tools than conventional institutions. Derivatives have proven to be effective hedging instruments that are broadly used in conventional institutions to manage their risks. However, derivatives are not generally accepted as legitimate products in Islamic finance and remain a controversial issue among Islamic scholars. This paper reviews the evolution of derivatives such as forwards, futures and options and then explores the mathematical models that are used to value derivatives, such as the random walk model, the asset pricing model that follows Brownian motion, and the Black-Scholes model. Other than that, this paper also critically discusses the perspective of derivatives from an Islamic point of view. In conclusion, this paper presents the traditional Islamic products such as salam, urbun and istijrar that can be used as building blocks of Islamic derivatives.
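
    For reference, the Black-Scholes formula mentioned above for a European call with no dividends can be written in a few lines; the spot, strike, maturity, rate and volatility in the sketch are arbitrary example values.

        # Black-Scholes European call price (illustrative inputs).
        from math import log, sqrt, exp, erf

        def norm_cdf(x):
            return 0.5 * (1.0 + erf(x / sqrt(2.0)))

        def black_scholes_call(S, K, T, r, sigma):
            """S: spot, K: strike, T: years to expiry, r: risk-free rate, sigma: volatility."""
            d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
            d2 = d1 - sigma * sqrt(T)
            return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

        print(f"Call price: {black_scholes_call(S=100, K=105, T=0.5, r=0.03, sigma=0.2):.2f}")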

  3. Review of Methods and Approaches for Deriving Numeric ...

    Science.gov (United States)

    EPA will propose numeric criteria for nitrogen/phosphorus pollution to protect estuaries, coastal areas and South Florida inland flowing waters that have been designated Class I, II and III, as well as downstream protective values (DPVs) to protect estuarine and marine waters. In accordance with the formal determination and pursuant to a subsequent consent decree, these numeric criteria are being developed to translate and implement Florida’s existing narrative nutrient criterion, to protect the designated use that Florida has previously set for these waters, at Rule 62-302.530(47)(b), F.A.C., which provides that “In no case shall nutrient concentrations of a body of water be altered so as to cause an imbalance in natural populations of aquatic flora or fauna.” Under the Clean Water Act and EPA’s implementing regulations, these numeric criteria must be based on sound scientific rationale and reflect the best available scientific knowledge. EPA has previously published a series of peer reviewed technical guidance documents to develop numeric criteria to address nitrogen/phosphorus pollution in different water body types. EPA recognizes that available and reliable data sources for use in numeric criteria development vary across estuarine and coastal waters in Florida and flowing waters in South Florida. In addition, scientifically defensible approaches for numeric criteria development have different requirements that must be taken into consideration.

  4. Modelling Of Flotation Processes By Classical Mathematical Methods - A Review

    Science.gov (United States)

    Jovanović, Ivana; Miljanović, Igor

    2015-12-01

    Flotation process modelling is not a simple task, mostly because of the complexity of the process, i.e. the presence of a large number of variables that (to a lesser or greater extent) affect the final outcome of the separation of mineral particles based on differences in their surface properties. Attempts to develop a quantitative predictive model that would fully describe the operation of an industrial flotation plant began in the middle of the last century and continue to this day. This paper reviews published research directed toward the development of flotation models based on classical mathematical rules. The description and systematization of classical flotation models were performed according to the available references, with emphasis given exclusively to the modelling of the flotation process, regardless of a model's application in a particular control system. In accordance with contemporary considerations, the models were classified as empirical, probabilistic, kinetic and population-balance types. Each model type is presented in terms of flotation modelling at the macro and micro process levels.
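
    The kinetic class referred to above is commonly represented by the classical first-order recovery equation R(t) = R_inf(1 - exp(-k*t)). The sketch below is illustrative rather than taken from the review, and fits that equation to made-up batch flotation data.

      # Illustrative fit of the classical first-order kinetic model; data are invented.
      import numpy as np
      from scipy.optimize import curve_fit

      def first_order_recovery(t, r_inf, k):
          """Cumulative recovery R(t) = R_inf * (1 - exp(-k*t))."""
          return r_inf * (1.0 - np.exp(-k * t))

      t_data = np.array([0.5, 1, 2, 4, 8, 12])       # flotation time, min (hypothetical)
      r_data = np.array([22, 38, 58, 74, 85, 88])    # cumulative recovery, % (hypothetical)

      (r_inf, k), _ = curve_fit(first_order_recovery, t_data, r_data, p0=(90, 0.5))
      print(f"R_inf = {r_inf:.1f} %, k = {k:.2f} 1/min")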

  5. Capillary electrophoresis methods for microRNAs assays: A review

    Energy Technology Data Exchange (ETDEWEB)

    Ban, Eunmi; Song, Eun Joo, E-mail: ejsong@kist.re.kr

    2014-12-10

    Highlights: • A review of CE analysis of miRNAs. • Summary of developments and applications of CE systems in miRNA studies. • Applications and development of microchip-based CE for rapid analysis of miRNA. - Abstract: MicroRNAs (miRNAs) are short noncoding RNAs that play important roles in many cellular processes such as development, proliferation, differentiation, and apoptosis. In particular, circulating miRNAs have been proposed as biomarkers for cancer, diabetes, cardiovascular disease, and other illnesses. Therefore, determination of miRNA expression levels in various biofluids is important for the investigation of biological processes in health and disease and for discovering their potential as new biomarkers and drug targets. Capillary electrophoresis (CE) is emerging as a useful analytical tool for analyzing miRNA because of its simple sample preparation steps and efficient resolution of a diverse size range of compounds. In particular, CE with laser-induced fluorescence detection is a promising and relatively rapidly developing tool with the potential to provide high sensitivity and specificity in the analysis of miRNAs. This paper provides a short overview of the recent developments and applications of CE systems in miRNA studies in biological and biomedical areas.

  6. Review of advanced methods for treating radioactive contaminated water

    International Nuclear Information System (INIS)

    Dubourg, M.

    2002-01-01

    The accidental release of large quantities of radionuclides after a nuclear accident tends to contaminate the groundwater system and rivers and lakes through the transfer of the main radionuclides such as Cesium 137, Strontium 90, Cobalt 60, Ruthenium 106 and others (including transuranic radionuclides such as Pu 239, Pu 240 and Am 241). The aim of this paper is to review possible solutions for the removal of these contaminants from large quantities of water: the use of crown ethers, such as di-cyclohexyl 18 crown 6, for the selective removal of strontium 90 with about 90% efficiency; the use of zeolites for the removal of Cesium 137; and, on a larger scale, the use of electromagnetic filtration technology, which can process large quantities of water in a relatively short time by seeding with resin-coated metallic magnetic particles to enhance filtering efficiency under cold conditions. Examples of efficiencies and results obtained on fairly large loops are given in this paper; these examples show a rather high removal efficiency even at low contaminant concentrations (a few ppb: parts per billion). Examples of water treatment concepts are also given for the treatment of contaminated surface water and for large groundwater applications. Major applications could be implemented on various sites, namely in Russia (Karatchai lake) or in Belarus and Ukraine. Magnetic filtration is not a new concept, but with the use of various selective adsorbing treatment particles it has proven so effective that dissolved metals in process water have been reduced to levels in the very low ppb range. (authors)

  7. Review: engineering particles using the aerosol-through-plasma method

    Energy Technology Data Exchange (ETDEWEB)

    Phillips, Jonathan [Los Alamos National Laboratory; Luhrs, Claudia C [UNM; Richard, Monique [TEMA

    2009-01-01

    For decades, plasma processing of materials on the nanoscale has been an underlying enabling technology for many 'planar' technologies, particularly virtually every aspect of modern electronics from integrated-circuit fabrication with nanoscale elements to the newest generation of photovoltaics. However, it is only recent developments that suggest that plasma processing can be used to make 'particulate' structures of value in fields, including catalysis, drug delivery, imaging, higher energy density batteries, and other forms of energy storage. In this paper, the development of the science and technology of one class of plasma production of particulates, namely, aerosol-through-plasma (A-T-P), is reviewed. Various plasma systems, particularly RF and microwave, have been used to create nanoparticles of metals and ceramics, as well as supported metal catalysts. Gradually, the complexity of the nanoparticles, and concomitantly their potential value, has increased. First, unique two-layer particles were generated. These were postprocessed to create unique three-layer nanoscale particles. Also, the technique has been successfully employed to make other high-value materials, including carbon nanotubes, unsupported graphene, and spherical boron nitride. Some interesting plasma science has also emerged from efforts to characterize and map aerosol-containing plasmas. For example, it is clear that even a very low concentration of particles dramatically changes plasma characteristics. Some have also argued that the local-thermodynamic-equilibrium approach is inappropriate to these systems. Instead, it has been suggested that charged- and neutral-species models must be independently developed and allowed to 'interact' only in generation terms.

  8. A Review of Liver Perfusion Method in Toxicology Studies

    Directory of Open Access Journals (Sweden)

    M karami

    2014-06-01

    Full Text Available Introduction: The isolated perfused rat liver (IPRL) is an accepted method in toxicology studies and a useful experimental system for evaluating hepatic function without the influence of other organ systems, undefined plasma constituents, and neural-hormonal effects. Methods: Untreated male rats (180-220 g body weight) were anesthetised with ether and then underwent surgery. The abdomen was opened through a midline and one transversal incision and the bile duct was cannulated. Heparin sodium solution (0.5 ml; 500 U/ml in 0.9% NaCl) was injected via the abdominal vena cava to prevent blood clotting. The inferior vena cava of the liver was cannulated with PE-10 tubing and secured. The portal vein was immediately cannulated with a 23gr catheter, which was secured, and the liver was then perfused in situ with Krebs-Henseleit buffer (pH 7.4; saturated with 95% O2 and 5% CO2; 37°C) at a flow rate of 20 ml/min for 3 h. Temperature, perfusion pressure, flow rate and perfusion fluid pH were closely monitored during the perfusion. Results: Alterations in transferase enzymes (ALT, AST) can be widely used as a measure of biochemical change to assess liver damage due to drugs such as isoniazid (INH) and animal and plant toxins. Material accumulated in the gallbladder provides valuable samples for assessing glutathione (GSH) levels. Sections of perfused liver tissue can also be analyzed for pathological features such as necrosis, fibrosis and cellularity. Conclusion: The IPRL is a useful and suitable experimental system for evaluating hepatic function; in this system, the effects of adjacent organs on the liver are minimized.

  9. A review on power reducing methods of neural recording amplifiers

    Directory of Open Access Journals (Sweden)

    samira mehdipour

    2016-10-01

    Full Text Available Implantable multi-channel neural recording microsystems comprise a large number of neural amplifiers, which can dominate the overall power consumption and chip area of the analog part of the system. Power, noise, size and DC offset are the main challenges faced by designers. Ideally, the output of the op-amp should be at zero volts when the inputs are grounded; in reality, the input terminals are at slightly different DC potentials. The input offset voltage is defined as the voltage that must be applied between the two input terminals of the op-amp to obtain zero volts at the output, and the amplifier must be able to reject this DC offset. The first method, which uses a capacitive feedback network with AC coupling of the input devices to reject the offset, is very popular in designs and provides a very small low-cutoff frequency. The second method employs a closed-loop resistive feedback and the electrode capacitance to form a highpass filter. The third method adopts a symmetric floating resistor in the feedback path of the low-noise amplifier to achieve a low-frequency cutoff and reject the DC offset voltage. In some applications a folded-cascode topology can be used; the telescopic topology is a good candidate in terms of providing large gain and phase margin while dissipating little power. The cortical VLSI neuron model reduces the power consumption of circuits. Power distribution is the best way to reduce power, noise and silicon area. The total power consumption of the amplifier array is reduced by applying the partial OTA sharing technique, and the silicon area is reduced as a benefit of sharing the bulky capacitor.

  10. A review on measuring methods of gas-liquid flow rates

    International Nuclear Information System (INIS)

    Minemura, Kiyoshi; Yamashita, Masato

    2000-01-01

    This paper presents a review of the state of current measuring techniques for gas-liquid multiphase flow rates. After briefly discussing the basic ideas behind measuring methods for single-phase and two-phase flows, existing methods for two-phase flow rates are classified into several types: with or without a homogenizing device, single techniques or combinations of several techniques, with intrusive or non-intrusive sensors, and physical or software methods. Each method is comparatively reviewed in terms of measuring accuracy and manageability. The scope also covers techniques developed for petroleum-gas-water flow rates. (author)

  11. Review and comparison of recent methods in space geodesy

    International Nuclear Information System (INIS)

    Varga, M.

    1983-01-01

    The study of geodynamic processes requires the application of new space-borne geodetic measuring methods. A terrestrial reference system (TRS) is required for describing geodynamic processes. For this purpose, satisfactory knowledge of polar motion, Earth rotation and tidal forces, determined by laser, global positioning system (GPS) and VLBI measurements, is needed. In addition, the gravity and magnetic fields of the Earth have to be known and modelled using satellite-to-satellite tracking (SST), altimetry, gradiometry and magnetometry results. Motions of the Earth-Moon system, as well as the relation between the terrestrial reference system and the inertial system, can be determined by means of VLBI measurements. (author)

  12. Rapid and stream-lined methods for analysis of actinides in environmental samples

    International Nuclear Information System (INIS)

    Cooper, E.L.

    2001-01-01

    Full text: 1) Project Summary: A systematic study of separating the actinides from each other in 1 M hydrochloric acid media has been carried out using selective oxidation/reduction processes followed by co-precipitation with neodymium fluoride. We have optimized two such procedures, one with bromate and another with permanganate, for the sequential separation of Am, Pu, Np, and U isotopes. The first procedure involves oxidation of Pu, Np and U to the +6 state in 1 M HCl media at 85 deg C with 30% NaBrO3 and separation from trivalent Am by collecting the latter on the first NdF3 co-precipitated source. Plutonium is then reduced and converted to the +4 oxidation state with 40% NaNO2 at 85 deg C, while Np and U are kept oxidized with additional bromate in solution at 50-70 deg C, thus separating Pu by collection on a second NdF3 source. At this stage, Np present in the filtrate is reduced with hydroxylamine hydrochloride and separated from U by collecting on a third source. Subsequently, U is reduced with 30% TiCl3 and co-precipitated on a final source. The second procedure, which employs KMnO4 in 1 M HCl media at 60-85 deg C for oxidizing Pu, Np and U and separating them from Am, produces MnO2, which is collected along with Am on the co-precipitated NdF3. This MnO2 is dissolved on the filter itself with 1 ml of acidified 1.5% H2O2 without any degradation of the α-spectra. After evaporating the filtrate to destroy H2O2, Pu, Np and U are separated by steps similar to those in the bromate procedure. The recoveries of the actinides with both procedures are >99%. The decontamination factors are between 10^3 and 10^4. 2) Summary of Proposed Work for the Next Year: Now that the separation procedure has been developed, we will begin to incorporate it into rapid and streamlined procedures for samples such as water, air filters and environmental materials. (author)

  13. Methods for streamlining intervention fidelity checklists: an example from the chronic disease self-management program.

    Science.gov (United States)

    Ahn, SangNam; Smith, Matthew Lee; Altpeter, Mary; Belza, Basia; Post, Lindsey; Ory, Marcia G

    2014-01-01

    Maintaining intervention fidelity should be part of any programmatic quality assurance (QA) plan and is often a licensure requirement. However, fidelity checklists designed by original program developers are often lengthy, which makes compliance difficult once programs become widely disseminated in the field. As a case example, we used Stanford's original Chronic Disease Self-Management Program (CDSMP) fidelity checklist of 157 items to demonstrate heuristic procedures for generating shorter fidelity checklists. Using an expert consensus approach, we sought feedback from active master trainers registered with the Stanford University Patient Education Research Center about which items were most essential to, and also feasible for, assessing fidelity. We conducted three sequential surveys and one expert group-teleconference call. Three versions of the fidelity checklist were created using different statistical and methodological criteria. In a final group-teleconference call with seven national experts, there was unanimous agreement that all three final versions (e.g., a 34-item version, a 20-item version, and a 12-item version) should be made available because the purpose and resources for administering a checklist might vary from one setting to another. This study highlights the methodology used to generate shorter versions of a fidelity checklist, which has potential to inform future QA efforts for this and other evidence-based programs (EBP) for older adults delivered in community settings. With CDSMP and other EBP, it is important to differentiate between program fidelity as mandated by program developers for licensure, and intervention fidelity tools for providing an "at-a-glance" snapshot of the level of compliance to selected program indicators.

  14. Sentiment analysis system for movie review in Bahasa Indonesia using naive bayes classifier method

    Science.gov (United States)

    Nurdiansyah, Yanuar; Bukhori, Saiful; Hidayat, Rahmad

    2018-04-01

    There are many ways of using the sentiments often found in documents; one of them is the sentiment expressed in product or service reviews. It is therefore important to be able to process and extract textual data from such documents. We propose a system that classifies sentiment in review documents into two classes: positive and negative. We use the Naive Bayes Classifier method in the document classification system that we build. We chose Movienthusiast, a movie-review website in Bahasa Indonesia, as the source of our review documents. From it we collected 1201 movie reviews: 783 positive reviews and 418 negative reviews, which we use as the dataset for this machine learning classifier. The classification accuracy averaged 88.37% over five accuracy-measurement runs using the aforementioned dataset.
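
    A minimal sketch of a Naive Bayes sentiment classifier of the kind described, assuming scikit-learn is available; the two Bahasa Indonesia training snippets are invented and the Movienthusiast data are not included.

      # Illustrative pipeline; training texts are invented, not the Movienthusiast corpus.
      from sklearn.feature_extraction.text import CountVectorizer
      from sklearn.naive_bayes import MultinomialNB
      from sklearn.pipeline import make_pipeline

      train_texts = ["filmnya bagus sekali", "ceritanya sangat membosankan"]  # hypothetical reviews
      train_labels = ["positive", "negative"]

      model = make_pipeline(CountVectorizer(), MultinomialNB())
      model.fit(train_texts, train_labels)
      print(model.predict(["aktingnya bagus"]))      # expected: ['positive']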

  15. Evaluation of a new method for librarian-mediated literature searches for systematic reviews

    NARCIS (Netherlands)

    W.M. Bramer (Wichor); Rethlefsen, M.L. (Melissa L.); F. Mast (Frans); J. Kleijnen (Jos)

    2017-01-01

    textabstractObjective: To evaluate and validate the time of completion and results of a new method of searching for systematic reviews, the exhaustive search method (ESM), using a pragmatic comparison. Methods: Single-line search strategies were prepared in a text document. Term completeness was

  16. The Effect of Peer Review on Student Learning Outcomes in a Research Methods Course

    Science.gov (United States)

    Crowe, Jessica A.; Silva, Tony; Ceresola, Ryan

    2015-01-01

    In this study, we test the effect of in-class student peer review on student learning outcomes using a quasiexperimental design. We provide an assessment of peer review in a quantitative research methods course, which is a traditionally difficult and technical course. Data were collected from 170 students enrolled in four sections of a…

  17. The use of electrical devices for the treatment of bladder dysfunction: a review of methods.

    NARCIS (Netherlands)

    Balken, M.R. van; Vergunst, H.; Bemelmans, B.L.H.

    2004-01-01

    PURPOSE: We reviewed the literature on the application of various devices and techniques for the electrical stimulation treatment of lower urinary tract dysfunction with respect to mechanism of action and clinical outcome. MATERIALS AND METHODS: A systematic review was done in PubMed of publications

  18. Review: Gregory C. Stanczak (Ed.) (2007). Visual Research Methods

    Directory of Open Access Journals (Sweden)

    Boris Traue

    2009-03-01

    Full Text Available In this edited volume, research methods employing the camera as a means of documentation in the context of ethnographic research are presented and discussed. To a lesser extent, the volume deals with research strategies for dealing with images produced by social actors, such as propaganda photography and video diaries. Special attention is given to the way the camera facilitates the process of communication in ethnographic research. This collection may be very helpful for readers looking for a discussion of methodological problems and practical advice for the use of cameras, especially in the context of ethnographic research. Theoretical issues of visuality and visual performance in contemporary societies are mentioned, but not treated in depth. URN: urn:nbn:de:0114-fqs090265

  19. Including mixed methods research in systematic reviews: Examples from qualitative syntheses in TB and malaria control

    Science.gov (United States)

    2012-01-01

    Background Health policy makers now have access to a greater number and variety of systematic reviews to inform different stages in the policy making process, including reviews of qualitative research. The inclusion of mixed methods studies in systematic reviews is increasing, but these studies pose particular challenges to methods of review. This article examines the quality of the reporting of mixed methods and qualitative-only studies. Methods We used two completed systematic reviews to generate a sample of qualitative studies and mixed method studies in order to make an assessment of how the quality of reporting and rigor of qualitative-only studies compares with that of mixed-methods studies. Results Overall, the reporting of qualitative studies in our sample was consistently better when compared with the reporting of mixed methods studies. We found that mixed methods studies are less likely to provide a description of the research conduct or qualitative data analysis procedures and less likely to be judged credible or provide rich data and thick description compared with standalone qualitative studies. Our time-related analysis shows that for both types of study, papers published since 2003 are more likely to report on the study context, describe analysis procedures, and be judged credible and provide rich data. However, the reporting of other aspects of research conduct (i.e. descriptions of the research question, the sampling strategy, and data collection methods) in mixed methods studies does not appear to have improved over time. Conclusions Mixed methods research makes an important contribution to health research in general, and could make a more substantial contribution to systematic reviews. Through our careful analysis of the quality of reporting of mixed methods and qualitative-only research, we have identified areas that deserve more attention in the conduct and reporting of mixed methods research. PMID:22545681

  20. Critical review of methods for the estimation of actual evapotranspiration in hydrological models

    CSIR Research Space (South Africa)

    Jovanovic, Nebojsa

    2012-01-01

    Full Text Available The chapter is structured in three parts, namely: i) A theoretical overview of evapotranspiration processes, including the principle of atmospheric demand-soil water supply, ii) A review of methods and techniques to measure and estimate actual...

  1. An Accurate and Impartial Expert Assignment Method for Scientific Project Review

    Directory of Open Access Journals (Sweden)

    Mingliang Yue

    2017-12-01

    Full Text Available Purpose: This paper proposes an expert assignment method for scientific project review that considers both accuracy and impartiality. As impartial and accurate peer review is extremely important to ensure the quality and feasibility of scientific projects, enhanced methods for managing the process are needed. Design/methodology/approach: To ensure both accuracy and impartiality, we design four criteria, the reviewers’ fitness degree, research intensity, academic association, and potential conflict of interest, to express the characteristics of an appropriate peer review expert. We first formalize the expert assignment problem as an optimization problem based on the designed criteria, and then propose a randomized algorithm to solve the expert assignment problem of identifying reviewer adequacy. Findings: Simulation results show that the proposed method is quite accurate and impartial during expert assignment. Research limitations: Although the criteria used in this paper can properly show the characteristics of a good and appropriate peer review expert, more criteria/conditions can be included in the proposed scheme to further enhance accuracy and impartiality of the expert assignment. Practical implications: The proposed method can help project funding agencies (e.g. the National Natural Science Foundation of China) find better experts for project peer review. Originality/value: To the authors’ knowledge, this is the first publication that proposes an algorithm that applies an impartial approach to the project review expert assignment process. The simulation results show the effectiveness of the proposed method.
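
    The abstract does not give the algorithm itself; the sketch below only illustrates the general idea of a randomized search over reviewer-project assignments scored on criteria such as fitness and conflict of interest. All names and scores are hypothetical.

      # Illustrative randomized search; criteria weights and names are hypothetical.
      import random

      reviewers = ["r1", "r2", "r3"]
      projects = ["p1", "p2", "p3"]
      # score[r][p]: e.g. fitness + research intensity - conflict of interest (made up)
      score = {
          "r1": {"p1": 0.9, "p2": 0.4, "p3": 0.2},
          "r2": {"p1": 0.3, "p2": 0.8, "p3": 0.5},
          "r3": {"p1": 0.1, "p2": 0.6, "p3": 0.7},
      }

      def total(assignment):
          return sum(score[r][p] for r, p in assignment.items())

      best, best_val = None, float("-inf")
      for _ in range(1000):                          # sample random one-to-one assignments
          candidate = dict(zip(reviewers, random.sample(projects, len(projects))))
          if total(candidate) > best_val:
              best, best_val = candidate, total(candidate)

      print(best, round(best_val, 2))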

  2. Sediment core and glacial environment reconstruction - a method review

    Science.gov (United States)

    Bakke, Jostein; Paasche, Øyvind

    2010-05-01

    Alpine glaciers are often located in remote and high-altitude regions of the world, areas that are only rarely covered by instrumental records. Reconstructions of glaciers have therefore proven useful for understanding past climate dynamics on both shorter and longer time-scales. One major drawback with glacier reconstructions based solely on moraine chronologies - by far the most common - is that, due to selective preservation of moraine ridges, such records do not exclude the possibility of multiple Holocene glacier advances. This problem holds regardless of whether cosmogenic isotopes, lichenometry or radiocarbon dating of mega-fossils buried in till or underneath the moraines themselves has been used to date the moraines. To overcome this problem, Karlén (1976) initially suggested that glacial erosion and the associated production of rock flour deposited in downstream lakes could provide a continuous record of glacial fluctuations, hence overcoming the problem of incomplete reconstructions. We want to discuss the methods used to reconstruct past glacier activity based on sediments deposited in distal glacier-fed lakes. By quantifying physical properties of glacial and extra-glacial sediments deposited in catchments, and in downstream lakes and fjords, it is possible to isolate and identify past glacier activity - size and production rate - that subsequently can be used to reconstruct changing environmental shifts and trends. Changes in average sediment evacuation from alpine glaciers are mainly governed by glacier size and the mass turnover gradient, which determine the deformation rate at any given time. The amount of solid precipitation (mainly winter accumulation) versus loss due to melting during the ablation season (mainly summer temperature) determines whether the mass turnover gradient is positive or negative. A prevailing positive net balance will lead to higher sedimentation rates and vice versa, which in turn can be recorded in downstream

  3. Stable–streamlined and helical cavities following the impact of Leidenfrost spheres

    KAUST Repository

    Mansoor, Mohammad M.

    2017-06-23

    We report results from an experimental study on the formation of stable–streamlined and helical cavity wakes following the free-surface impact of Leidenfrost spheres. Similar to the observations of Mansoor et al. (J. Fluid Mech., vol. 743, 2014, pp. 295–326), we show that acoustic ripples form along the interface of elongated cavities entrained in the presence of wall effects as soon as the primary cavity pinch-off takes place. The crests of these ripples can act as favourable points for closure, producing multiple acoustic pinch-offs, which are found to occur in an acoustic pinch-off cascade. We show that these ripples pacify with time in the absence of physical contact between the sphere and the liquid, leading to extremely smooth cavity wake profiles. More importantly, the downward-facing jet at the apex of the cavity is continually suppressed due to a skin-friction drag effect at the colliding cavity-wall junction, which ultimately produces a stable–streamlined cavity wake. This streamlined configuration is found to experience drag coefficients an order of magnitude lower than those acting on room-temperature spheres. A striking observation is the formation of helical cavities which occur for impact Reynolds numbers and are characterized by multiple interfacial ridges, stemming from and rotating synchronously about an evident contact line around the sphere equator. The contact line is shown to result from the degeneration of Kelvin–Helmholtz billows into turbulence which are observed forming along the liquid–vapour interface around the bottom hemisphere of the sphere. Using sphere trajectory measurements, we show that this helical cavity wake configuration has 40 %–55 % smaller force coefficients than those obtained in the formation of stable cavity wakes.

  4. Stable–streamlined and helical cavities following the impact of Leidenfrost spheres

    KAUST Repository

    Mansoor, Mohammad M.; Vakarelski, Ivan Uriev; Marston, J. O.; Truscott, T. T.; Thoroddsen, Sigurdur T

    2017-01-01

    We report results from an experimental study on the formation of stable–streamlined and helical cavity wakes following the free-surface impact of Leidenfrost spheres. Similar to the observations of Mansoor et al. (J. Fluid Mech., vol. 743, 2014, pp. 295–326), we show that acoustic ripples form along the interface of elongated cavities entrained in the presence of wall effects as soon as the primary cavity pinch-off takes place. The crests of these ripples can act as favourable points for closure, producing multiple acoustic pinch-offs, which are found to occur in an acoustic pinch-off cascade. We show that these ripples pacify with time in the absence of physical contact between the sphere and the liquid, leading to extremely smooth cavity wake profiles. More importantly, the downward-facing jet at the apex of the cavity is continually suppressed due to a skin-friction drag effect at the colliding cavity-wall junction, which ultimately produces a stable–streamlined cavity wake. This streamlined configuration is found to experience drag coefficients an order of magnitude lower than those acting on room-temperature spheres. A striking observation is the formation of helical cavities which occur for impact Reynolds numbers and are characterized by multiple interfacial ridges, stemming from and rotating synchronously about an evident contact line around the sphere equator. The contact line is shown to result from the degeneration of Kelvin–Helmholtz billows into turbulence which are observed forming along the liquid–vapour interface around the bottom hemisphere of the sphere. Using sphere trajectory measurements, we show that this helical cavity wake configuration has 40 %–55 % smaller force coefficients than those obtained in the formation of stable cavity wakes.

  5. Review on the refractive treatment methods of aphakia

    Directory of Open Access Journals (Sweden)

    Xiao-Yan Deng

    2015-06-01

    Full Text Available The refractive treatment methods for aphakia include corrective glasses, contact lens correction and intraocular lens (IOL) implantation. Corrective glasses magnify images strongly and limit the visual field. For infant aphakia, corrective glasses are more likely to be chosen because infant eyes are still unable to tolerate IOL implantation at that developmental stage. With lower image magnification, contact lenses include soft contact lenses and rigid contact lenses. The former are rarely used because their poor oxygen permeability makes ocular lesions likely; the latter are widely used due to their good oxygen permeability, and are especially suitable for eyes with irregular astigmatism or iris loss due to trauma. At present, the method most commonly used in clinical work is IOL implantation. An eye with an IOL may avoid anisometropia, aberrations and so on because the anatomy is more physiological. According to the implantation site, IOL implantation is divided into anterior chamber IOL implantation and posterior chamber IOL implantation. Anterior chamber IOL implantation is divided into angle-fixed IOL implantation and iris-fixed IOL implantation. Posterior chamber IOL implantation is divided into secondary in-the-bag IOL implantation, ciliary sulcus IOL implantation and transscleral suture-fixed IOL implantation.

  6. Review and evaluation of metallic TRU nuclear waste consolidation methods

    International Nuclear Information System (INIS)

    Montgomery, D.R.; Nesbitt, J.F.

    1983-08-01

    The US Department of Energy established the Commercial Waste Treatment Program to develop, demonstrate, and deploy waste treatment technology. In this report, viable methods are identified that could consolidate the volume of metallic wastes generated in a fuel reprocessing facility. The purpose of this study is to identify, evaluate, and rate processes that have been or could be used to reduce the volume of contaminated/irradiated metallic waste streams and to produce an acceptable waste form in a safe and cost-effective process. A technical comparative evaluation of various consolidation processes was conducted, and these processes were rated as to the feasibility and cost of producing a viable product from a remotely operated radioactive process facility. Out of the wide variety of melting concepts and consolidation systems that might be applicable for consolidating metallic nuclear wastes, the following processes were selected for evaluation: inductoslay melting, rotating nonconsumable electrode melting, plasma arc melting, electroslag melting with two nonconsumable electrodes, vacuum coreless induction melting, and cold compaction. Each process was evaluated and rated on the criteria of complexity of process, state and type of development required, safety, process requirements, and facility requirements. It was concluded that the vacuum coreless induction melting process is the most viable process to consolidate nuclear metallic wastes. 11 references

  7. A Review of Distributed Parameter Groundwater Management Modeling Methods

    Science.gov (United States)

    Gorelick, Steven M.

    1983-04-01

    Models which solve the governing groundwater flow or solute transport equations in conjunction with optimization techniques, such as linear and quadratic programming, are powerful aquifer management tools. Groundwater management models fall in two general categories: hydraulics or policy evaluation and water allocation. Groundwater hydraulic management models enable the determination of optimal locations and pumping rates of numerous wells under a variety of restrictions placed upon local drawdown, hydraulic gradients, and water production targets. Groundwater policy evaluation and allocation models can be used to study the influence upon regional groundwater use of institutional policies such as taxes and quotas. Furthermore, fairly complex groundwater-surface water allocation problems can be handled using system decomposition and multilevel optimization. Experience from the few real world applications of groundwater optimization-management techniques is summarized. Classified separately are methods for groundwater quality management aimed at optimal waste disposal in the subsurface. This classification is composed of steady state and transient management models that determine disposal patterns in such a way that water quality is protected at supply locations. Classes of research missing from the literature are groundwater quality management models involving nonlinear constraints, models which join groundwater hydraulic and quality simulations with political-economic management considerations, and management models that include parameter uncertainty.
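
    As a toy illustration of the hydraulic management models described above, the sketch below uses linear programming to maximize pumping from two wells subject to a single drawdown constraint; the response coefficients and bounds are invented, not drawn from the review.

      # Illustrative only; the response coefficients and limits are invented.
      from scipy.optimize import linprog

      c = [-1.0, -1.0]                 # maximize q1 + q2 by minimizing the negative
      A_ub = [[0.02, 0.03]]            # drawdown response (m per m^3/day) at a control point
      b_ub = [2.0]                     # allowed drawdown, m
      bounds = [(0, 60), (0, 60)]      # per-well capacity, m^3/day

      res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
      print("optimal pumping rates:", res.x)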

  8. IMPLEMENTING NDN USING SDN: A REVIEW ON METHODS AND APPLICATIONS

    Directory of Open Access Journals (Sweden)

    Shiva Rowshanrad

    2016-11-01

    Full Text Available In recent years there have been many claims about the limitations of today's network architecture, its lack of flexibility and its inability to respond to ongoing changes and increasing user demands. In this regard, new network architectures have been proposed. Software Defined Networking (SDN) is one of these new architectures; it centralizes the control of the network by separating the control plane from the data plane. This separation leads to intelligence, flexibility and easier control in computer networks. One of the advantages of this framework is the ability to implement and test new protocols and architectures in actual networks without any concern of interruption. Named Data Networking (NDN) is another paradigm for future network architecture. With NDN the network becomes aware of the content that it is providing, rather than just transferring it among end-points. NDN has attracted researchers' attention and is seen as a potential future of networking and the internet. Providing NDN functionality over SDN is an important requirement to enable innovation and optimization of network resources. In this paper we first describe SDN and NDN, and then introduce methods for implementing NDN using SDN. We also point out the advantages and applications of implementing NDN over SDN.

  9. Effects of the Pilates method on neck pain: a systematic review

    OpenAIRE

    Cemin, Natália Fernanda; Schmit, Emanuelle Francine Detogni; Candotti, Cláudia Tarragô

    2017-01-01

    Abstract Introduction: The Pilates method has been used for neck pain reduction. Objective: To systematically review randomized and non-randomized controlled trials that assessed the effects of Pilates on neck pain when compared to other groups (CRD42015025987). Methods: This study involved a systematic review directed by the PRISMA Statement based on the recommendations of the Cochrane Colaboration, registered in PROSPERO under the code CRD42015025987. The following databases were searche...

  10. Radon decay product in-door behaviour - parameter, measurement method, and model review

    International Nuclear Information System (INIS)

    Scofield, P.

    1988-01-01

    This report reviews parameters used to characterize indoor radon daughter behavior and concentrations. Certain parameters that affect indoor radon daughter concentrations are described and the values obtained experimentally or theoretically are summarized. Radon daughter measurement methods are reviewed, such as, PAEC, unattached daughters, particle size distributions, and plateout measurement methods. In addition, certain radon pressure driven/diffusion models and indoor radon daughter models are briefly described. (orig.)

  11. Including mixed methods research in systematic reviews: examples from qualitative syntheses in TB and malaria control.

    Science.gov (United States)

    Atkins, Salla; Launiala, Annika; Kagaha, Alexander; Smith, Helen

    2012-04-30

    Health policy makers now have access to a greater number and variety of systematic reviews to inform different stages in the policy making process, including reviews of qualitative research. The inclusion of mixed methods studies in systematic reviews is increasing, but these studies pose particular challenges to methods of review. This article examines the quality of the reporting of mixed methods and qualitative-only studies. We used two completed systematic reviews to generate a sample of qualitative studies and mixed method studies in order to make an assessment of how the quality of reporting and rigor of qualitative-only studies compares with that of mixed-methods studies. Overall, the reporting of qualitative studies in our sample was consistently better when compared with the reporting of mixed methods studies. We found that mixed methods studies are less likely to provide a description of the research conduct or qualitative data analysis procedures and less likely to be judged credible or provide rich data and thick description compared with standalone qualitative studies. Our time-related analysis shows that for both types of study, papers published since 2003 are more likely to report on the study context, describe analysis procedures, and be judged credible and provide rich data. However, the reporting of other aspects of research conduct (i.e. descriptions of the research question, the sampling strategy, and data collection methods) in mixed methods studies does not appear to have improved over time. Mixed methods research makes an important contribution to health research in general, and could make a more substantial contribution to systematic reviews. Through our careful analysis of the quality of reporting of mixed methods and qualitative-only research, we have identified areas that deserve more attention in the conduct and reporting of mixed methods research.

  12. Theoretical Calculations on Sediment Transport on Titan, and the Possible Production of Streamlined Forms

    Science.gov (United States)

    Burr, D. M.; Emery, J. P.; Lorenz, R. D.

    2005-01-01

    The Cassini Imaging Science System (ISS) has been returning images of Titan, along with other Saturnian satellites. Images taken through the 938 nm methane window see down to Titan's surface. One of the purposes of the Cassini mission is to investigate possible fluid cycling on Titan. Lemniscate features shown recently and radar evidence of surface flow prompted us to consider theoretically the creation by methane fluid flow of streamlined forms on Titan. This follows work by other groups in theoretical consideration of fluid motion on Titan's surface.

  13. Measuring solar reflectance - Part II: Review of practical methods

    Energy Technology Data Exchange (ETDEWEB)

    Levinson, Ronnen; Akbari, Hashem; Berdahl, Paul [Heat Island Group, Environmental Energy Technologies Division, Lawrence Berkeley National Laboratory, 1 Cyclotron Road, Berkeley, CA 94720 (United States)

    2010-09-15

    A companion article explored how solar reflectance varies with surface orientation and solar position, and found that clear sky air mass 1 global horizontal (AM1GH) solar reflectance is a preferred quantity for estimating solar heat gain. In this study we show that AM1GH solar reflectance R_g,0 can be accurately measured with a pyranometer, a solar spectrophotometer, or an updated edition of the Solar Spectrum Reflectometer (version 6). Of primary concern are errors that result from variations in the spectral and angular distributions of incident sunlight. Neglecting shadow, background and instrument errors, the conventional pyranometer technique can measure R_g,0 to within 0.01 for surface slopes up to 5:12 [23°], and to within 0.02 for surface slopes up to 12:12 [45°]. An alternative pyranometer method minimizes shadow errors and can be used to measure R_g,0 of a surface as small as 1 m in diameter. The accuracy with which it can measure R_g,0 is otherwise comparable to that of the conventional pyranometer technique. A solar spectrophotometer can be used to determine R*_g,0, a solar reflectance computed by averaging solar spectral reflectance weighted with AM1GH solar spectral irradiance. Neglecting instrument errors, R*_g,0 matches R_g,0 to within 0.006. The air mass 1.5 solar reflectance measured with version 5 of the Solar Spectrum Reflectometer can differ from R*_g,0 by as much as 0.08, but the AM1GH output of version 6 of this instrument matches R*_g,0 to within about 0.01. (author)
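
    The quantity R*_g,0 described above is an irradiance-weighted average of spectral reflectance. The sketch below shows only that averaging step, using three made-up spectral bands in place of real AM1GH data.

      # Illustrative weighting step; the three spectral bands are placeholders.
      import numpy as np

      reflectance = np.array([0.30, 0.55, 0.45])   # surface spectral reflectance per band (hypothetical)
      irradiance = np.array([1.2, 0.8, 0.2])       # AM1GH spectral irradiance weights per band (hypothetical)

      r_star = np.average(reflectance, weights=irradiance)
      print(round(r_star, 3))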

  14. Novel diffusion tensor imaging technique reveals developmental streamline volume changes in the corticospinal tract associated with leg motor control.

    Science.gov (United States)

    Kamson, David O; Juhász, Csaba; Chugani, Harry T; Jeong, Jeong-Won

    2015-04-01

    Diffusion tensor imaging (DTI) has expanded our knowledge of corticospinal tract (CST) anatomy and development. However, previous developmental DTI studies assessed the CST as a whole, overlooking potential differences in development of its components related to control of the upper and lower extremities. The present cross-sectional study investigated age-related changes, side and gender differences in streamline volume of the leg- and hand-related segments of the CST in children. DTI data of 31 children (1-14 years; mean age: 6±4 years; 17 girls) with normal conventional MRI were analyzed. Leg- and hand-related CST streamline volumes were quantified separately, using a recently validated novel tractography approach. CST streamline volumes on both sides were compared between genders and correlated with age. Higher absolute streamline volumes were found in the left leg-related CST compared to the right (p=0.001) without a gender effect (p=0.4), whereas no differences were found in the absolute hand-related CST volumes (p>0.4). CST leg-related streamline volumes, normalized to hemispheric white matter volumes, declined with age in the right hemisphere only (R=-.51; p=0.004). Absolute leg-related CST streamline volumes showed similar, but slightly weaker correlations. Hand-related absolute or normalized CST streamline volumes showed no age-related variations on either side. These results suggest differential development of CST segments controlling hand vs. leg movements. Asymmetric volume changes in the lower limb motor pathway may be secondary to gradually strengthening left hemispheric dominance and is consistent with previous data suggesting that footedness is a better predictor of hemispheric lateralization than handedness. Copyright © 2014 The Japanese Society of Child Neurology. Published by Elsevier B.V. All rights reserved.

  15. Approximation and inference methods for stochastic biochemical kinetics—a tutorial review

    International Nuclear Information System (INIS)

    Schnoerr, David; Grima, Ramon; Sanguinetti, Guido

    2017-01-01

    Stochastic fluctuations of molecule numbers are ubiquitous in biological systems. Important examples include gene expression and enzymatic processes in living cells. Such systems are typically modelled as chemical reaction networks whose dynamics are governed by the chemical master equation. Despite its simple structure, no analytic solutions to the chemical master equation are known for most systems. Moreover, stochastic simulations are computationally expensive, making systematic analysis and statistical inference a challenging task. Consequently, significant effort has been spent in recent decades on the development of efficient approximation and inference methods. This article gives an introduction to basic modelling concepts as well as an overview of state of the art methods. First, we motivate and introduce deterministic and stochastic methods for modelling chemical networks, and give an overview of simulation and exact solution methods. Next, we discuss several approximation methods, including the chemical Langevin equation, the system size expansion, moment closure approximations, time-scale separation approximations and hybrid methods. We discuss their various properties and review recent advances and remaining challenges for these methods. We present a comparison of several of these methods by means of a numerical case study and highlight some of their respective advantages and disadvantages. Finally, we discuss the problem of inference from experimental data in the Bayesian framework and review recent methods developed in the literature. In summary, this review gives a self-contained introduction to modelling, approximations and inference methods for stochastic chemical kinetics. (topical review)
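
    One of the exact simulation methods this review covers is Gillespie's stochastic simulation algorithm. A minimal sketch for a single degradation reaction follows; the rate constant and initial copy number are arbitrary.

      # Illustrative SSA for X -> 0 with first-order rate k; values are arbitrary.
      import random

      def gillespie_decay(x0=50, k=0.1, t_end=30.0):
          t, x, trajectory = 0.0, x0, [(0.0, x0)]
          while t < t_end and x > 0:
              propensity = k * x                     # total propensity of the single reaction
              t += random.expovariate(propensity)    # exponential waiting time to the next event
              x -= 1                                 # one molecule degrades
              trajectory.append((t, x))
          return trajectory

      print(gillespie_decay()[:5])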

  16. Diagonally Implicit Runge-Kutta Methods for Ordinary Differential Equations. A Review

    Science.gov (United States)

    Kennedy, Christopher A.; Carpenter, Mark H.

    2016-01-01

    A review of diagonally implicit Runge-Kutta (DIRK) methods applied to first-order ordinary differential equations (ODEs) is undertaken. The goal of this review is to summarize the characteristics, assess the potential, and then design several nearly optimal, general purpose, DIRK-type methods. Over 20 important aspects of DIRK-type methods are reviewed. A design study is then conducted on DIRK-type methods having from two to seven implicit stages. From this, 15 schemes are selected for general purpose application. Testing of the 15 chosen methods is done on three singular perturbation problems. Based on the review of method characteristics, these methods focus on having a stage order of two, stiff accuracy, L-stability, high quality embedded and dense-output methods, small magnitudes of the algebraic stability matrix eigenvalues, small values of a_ii, and small or vanishing values of the internal stability function for large eigenvalues of the Jacobian. Among the 15 new methods, ESDIRK4(3)6L[2]SA is recommended as a good default method for solving stiff problems at moderate error tolerances.
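
    As a concrete example of the DIRK family (not one of the 15 schemes selected in the review), the sketch below implements a standard two-stage, second-order, L-stable SDIRK method with gamma = 1 - 1/sqrt(2) and applies it to a stiff test problem chosen for illustration.

      # Illustrative SDIRK2 (gamma = 1 - 1/sqrt(2)); the stiff test problem is a
      # textbook-style example, not one of the review's test cases.
      import numpy as np
      from scipy.optimize import fsolve

      GAMMA = 1.0 - 1.0 / np.sqrt(2.0)

      def f(t, y):
          return -50.0 * (y - np.cos(t))             # stiff scalar ODE y' = -50*(y - cos t)

      def sdirk2_step(t, y, h):
          # Stage 1: k1 = f(t + GAMMA*h, y + h*GAMMA*k1), solved implicitly
          k1 = fsolve(lambda k: k - f(t + GAMMA * h, y + h * GAMMA * k), f(t, y))[0]
          # Stage 2: k2 = f(t + h, y + h*(1-GAMMA)*k1 + h*GAMMA*k2)
          k2 = fsolve(lambda k: k - f(t + h, y + h * (1 - GAMMA) * k1 + h * GAMMA * k), k1)[0]
          return y + h * ((1 - GAMMA) * k1 + GAMMA * k2)   # stiffly accurate update

      t, y, h = 0.0, 1.0, 0.1
      for _ in range(10):
          y = sdirk2_step(t, y, h)
          t += h
      print(round(y, 4))                             # approaches the slow solution, close to cos(t)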

  17. Streamlined Approach for Environmental Restoration Plan for Corrective Action Unit 408: Bomblet Target Area, Tonopah Test Range, Nevada

    International Nuclear Information System (INIS)

    NSTec Environmental Management

    2006-01-01

    This Streamlined Approach for Environmental Restoration Plan provides the details for the closure of Corrective Action Unit (CAU) 408, Bomblet Target Area. CAU 408 is located at the Tonopah Test Range and is currently listed in Appendix III of the Federal Facility Agreement and Consent Order of 1996. One Corrective Action Site (CAS) is included in CAU 408: CAS TA-55-002-TAB2, Bomblet Target Areas. Based on historical documentation, personnel interviews, process knowledge, site visits, aerial photography, multispectral data, preliminary geophysical surveys, and the results of the data quality objectives process (Section 3.0), clean closure will be implemented for CAU 408. CAU 408 closure activities will consist of identification and clearance of bomblet target areas, identification and removal of depleted uranium (DU) fragments on South Antelope Lake, and collection of verification samples. Any soil containing contaminants at concentrations above the action levels will be excavated and transported to an appropriate disposal facility. Based on existing information, contaminants of potential concern at CAU 408 include explosives. In addition, at South Antelope Lake, bomblets containing DU were tested. None of these contaminants is expected to be present in the soil at concentrations above the action levels; however, this will be determined by radiological surveys and verification sample results. The corrective action investigation and closure activities have been planned to include data collection and hold points throughout the process. Hold points are designed to allow decision makers to review the existing data and decide which of the available options are most suitable. Hold points include the review of radiological, geophysical, and analytical data and field observations.

  18. Review: Janice M. Morse & Linda Niehaus (2009). Mixed Method Design: Principles and Procedures

    OpenAIRE

    Öhlen, Joakim

    2010-01-01

    Mixed method design related to the use of a combination of methods, usually quantitative and qualitative, is increasingly used for the investigation of complex phenomena. This review discusses the book, "Mixed Method Design: Principles and Procedures," by Janice M. MORSE and Linda NIEHAUS. A distinctive feature of their approach is the consideration of mixed methods design out of a core and a supplemental component. In order to define these components they emphasize the overall conceptual dir...

  19. Caseload management methods for use within district nursing teams: a literature review.

    Science.gov (United States)

    Roberson, Carole

    2016-05-01

    Effective and efficient caseload management requires extensive skills to ensure that patients receive the right care by the right person at the right time. District nursing caseloads are continually increasing in size and complexity, which requires specialist district nursing knowledge and skills. This article reviews the literature related to caseload management with the aim of identifying the most effective method for district nursing teams. The findings from this review are that there are different styles and methods of caseload management. The literature review was unable to identify a single validated tool or method, but identified themes for implementing effective caseload management, specifically caseload analysis; workload measurement; work allocation; service and practice development and workforce planning. This review also identified some areas for further research.

  20. New mobile methods for dietary assessment: review of image-assisted and image-based dietary assessment methods.

    Science.gov (United States)

    Boushey, C J; Spoden, M; Zhu, F M; Delp, E J; Kerr, D A

    2017-08-01

    For nutrition practitioners and researchers, assessing dietary intake of children and adults with a high level of accuracy continues to be a challenge. Developments in mobile technologies have created a role for images in the assessment of dietary intake. The objective of this review was to examine peer-reviewed published papers covering development, evaluation and/or validation of image-assisted or image-based dietary assessment methods from December 2013 to January 2016. Images taken with handheld devices or wearable cameras have been used to assist traditional dietary assessment methods for portion size estimations made by dietitians (image-assisted methods). Image-assisted approaches can supplement either dietary records or 24-h dietary recalls. In recent years, image-based approaches integrating application technology for mobile devices have been developed (image-based methods). Image-based approaches aim at capturing all eating occasions by images as the primary record of dietary intake, and therefore follow the methodology of food records. The present paper reviews several image-assisted and image-based methods, their benefits and challenges; followed by details on an image-based mobile food record. Mobile technology offers a wide range of feasible options for dietary assessment, which are easier to incorporate into daily routines. The presented studies illustrate that image-assisted methods can improve the accuracy of conventional dietary assessment methods by adding eating occasion detail via pictures captured by an individual (dynamic images). All of the studies reduced underreporting with the help of images compared with results with traditional assessment methods. Studies with larger sample sizes are needed to better delineate attributes with regards to age of user, degree of error and cost.

  1. Streamlining the process: A strategy for making NEPA work better and cost less

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, R.P.; Hansen, J.D. [Hansen Environmental Consultants, Englewood, CO (United States); Wolff, T.A. [Sandia National Labs., Albuquerque, NM (United States)

    1998-05-01

    When the National Environmental Policy Act (NEPA) was enacted in 1969, neither Congress nor the Federal Agencies affected anticipated that implementation of the NEPA process would result in the intolerable delays, inefficiencies, duplication of effort, commitments of excessive financial and personnel resources, and bureaucratic gridlock that have become institutionalized. The 1975 Council on Environmental Quality (CEQ) regulations, which were intended to make the NEPA process more efficient and more useful to decision makers and the public, have either been largely ignored or unintentionally subverted. Agency policy mandates, like those of former Secretary of Energy Hazel R. O'Leary, to "make NEPA work better and cost less" have, so far, been disappointingly ineffectual. Federal Agencies have reached the point where almost every constituent of the NEPA process must be subjected to crisis management. This paper focuses on a ten-point strategy for streamlining the NEPA process in order to achieve the Act's objectives while easing the considerable burden on agencies, the public, and the judicial system. How the ten points are timed and implemented is critical to any successful streamlining.

  2. Searching for rigour in the reporting of mixed methods population health research: a methodological review.

    Science.gov (United States)

    Brown, K M; Elliott, S J; Leatherdale, S T; Robertson-Wilson, J

    2015-12-01

    The environments in which population health interventions occur shape both their implementation and outcomes. Hence, when evaluating these interventions, we must explore both intervention content and context. Mixed methods (integrating quantitative and qualitative methods) provide this opportunity. However, although criteria exist for establishing rigour in quantitative and qualitative research, there is poor consensus regarding rigour in mixed methods. Using the empirical example of school-based obesity interventions, this methodological review examined how mixed methods have been used and reported, and how rigour has been addressed. Twenty-three peer-reviewed mixed methods studies were identified through a systematic search of five databases and appraised using the guidelines for Good Reporting of a Mixed Methods Study. In general, more detailed description of data collection and analysis, integration, inferences and justifying the use of mixed methods is needed. Additionally, improved reporting of methodological rigour is required. This review calls for increased discussion of practical techniques for establishing rigour in mixed methods research, beyond those for quantitative and qualitative criteria individually. A guide for reporting mixed methods research in population health should be developed to improve the reporting quality of mixed methods studies. Through improved reporting, mixed methods can provide strong evidence to inform policy and practice. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  3. Review of: Methods to complete watershed analysis on Pacific Lumber lands in Northern California

    Science.gov (United States)

    L. M. Reid

    1999-01-01

    The three questions of primary concern for this review are: 1) are the WDNR modules adequately and validly modified to suit local conditions, as required by the HCP/SYP? 2) is there an adequate "distinct cumulative effects assessment" method, as required by the HCP/SYP? 3) will the cumulative effects assessment method and the modified WDNR modules be...

  4. A review of the physics methods for advanced gas-cooled reactors

    International Nuclear Information System (INIS)

    Buckler, A.N.

    1982-01-01

    A review is given of steady-state reactor physics methods and associated codes used in AGR design and operation. These range from the basic lattice codes (ARGOSY, WIMS), through homogeneous-diffusion theory fuel management codes (ODYSSEUS, MOPSY) to a fully heterogeneous code (HET). The current state of development of the methods is discussed, together with illustrative examples of their application. (author)

  5. Wellbeing Research in Developing Countries: Reviewing the Role of Qualitative Methods

    Science.gov (United States)

    Camfield, Laura; Crivello, Gina; Woodhead, Martin

    2009-01-01

    The authors review the contribution of qualitative methods to exploring concepts and experiences of wellbeing among children and adults living in developing countries. They provide examples illustrating the potential of these methods for gaining a holistic and contextual understanding of people's perceptions and experiences. Some of these come…

  6. A review of methods for the evaluation of the energy contribution of daylight in buildings

    Energy Technology Data Exchange (ETDEWEB)

    Attenborough, M; Goodwin, A

    1996-07-01

    A review has been undertaken of energy prediction methods and daylight calculation methods currently in use in the UK. This was based on a literature review and discussions with large engineering practices and academics involved in the areas of daylighting and energy simulation research. The aim of this review was to identify manual methods or computer programs that are capable of determining energy use in non-domestic buildings and of taking into account the energy savings resulting from daylighting. One potential application for these methods is in supporting anticipated energy targets for non-domestic buildings within Building Regulations and other energy labelling schemes. The review has identified a range of methods which are capable of predicting overall energy use while accounting for daylight. These vary in complexity from empirical methods such as ESICHECK and the CIBSE Energy Code through to dynamic energy simulation models such as DOE 2 and ESP. For each of the methods identified, a brief assessment has been made of their technical capabilities, ease of use and availability. These assessments have been based on discussions with users and program developers. Descriptions of the various methods are given. (Author)

  7. State-of-the-Art Methods for Brain Tissue Segmentation: A Review.

    Science.gov (United States)

    Dora, Lingraj; Agrawal, Sanjay; Panda, Rutuparna; Abraham, Ajith

    2017-01-01

    Brain tissue segmentation is one of the most sought after research areas in medical image processing. It provides detailed quantitative brain analysis for accurate disease diagnosis, detection, and classification of abnormalities. It plays an essential role in discriminating healthy tissues from lesion tissues. Therefore, accurate disease diagnosis and treatment planning depend mainly on the performance of the segmentation method used. In this review, we have studied the recent advances in brain tissue segmentation methods and their state-of-the-art in neuroscience research. The review also highlights the major challenges faced during tissue segmentation of the brain. An effective comparison is made among state-of-the-art brain tissue segmentation methods. Moreover, a study of some of the validation measures to evaluate different segmentation methods is also discussed. The segmentation methodologies and experiments presented in this review are encouraging enough to attract researchers working in this field.
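
    The review above discusses validation measures for comparing segmentation methods without reproducing them. As a purely illustrative aid (not taken from the paper), the short Python sketch below computes the Dice similarity coefficient, one of the most commonly reported overlap measures, on two hypothetical binary masks.

        import numpy as np

        def dice_coefficient(seg_a, seg_b):
            """Dice overlap between two binary segmentation masks."""
            a = np.asarray(seg_a, dtype=bool)
            b = np.asarray(seg_b, dtype=bool)
            intersection = np.logical_and(a, b).sum()
            total = a.sum() + b.sum()
            return 2.0 * intersection / total if total > 0 else 1.0

        # Toy 2-D masks standing in for a predicted and a reference tissue map.
        predicted = np.array([[0, 1, 1], [0, 1, 0], [0, 0, 0]])
        reference = np.array([[0, 1, 1], [0, 0, 0], [0, 0, 0]])
        print(round(dice_coefficient(predicted, reference), 3))  # 0.8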

  8. Methods for increasing upper airway muscle tonus in treating obstructive sleep apnea: systematic review.

    Science.gov (United States)

    Valbuza, Juliana Spelta; de Oliveira, Márcio Moysés; Conti, Cristiane Fiquene; Prado, Lucila Bizari F; de Carvalho, Luciane Bizari Coin; do Prado, Gilmar Fernandes

    2010-12-01

    Treatment of obstructive sleep apnea (OSA) using methods for increasing upper airway muscle tonus has been controversial and poorly reported. Thus, a review of the evidence is needed to evaluate the effectiveness of these methods. The design used was a systematic review of randomized controlled trials. Data sources are from the Cochrane Library, Medline, Embase and Scielo, registries of ongoing trials, theses indexed at Biblioteca Regional de Medicina/Pan-American Health Organization of the World Health Organization and the reference lists of all the trials retrieved. This was a review of randomized or quasi-randomized double-blind trials on OSA. Two reviewers independently applied eligibility criteria. One reviewer assessed study quality and extracted data, and these processes were checked by a second reviewer. The primary outcome was a decrease in the apnea/hypopnea index (AHI) of below five episodes per hour. Other outcomes were subjective sleep quality, sleep quality measured by night polysomnography, quality of life measured subjectively and adverse events associated with the treatments. Three eligible trials were included. Two studies showed improvements through the objective and subjective analyses, and one study showed improvement of snoring, but not of AHI while the subjective analyses showed no improvement. The adverse events were reported and they were not significant. There is no accepted scientific evidence that methods aiming to increase muscle tonus of the stomatognathic system are effective in reducing AHI to below five events per hour. Well-designed randomized controlled trials are needed to assess the efficacy of such methods.

  9. Epidemiological methods for research with drug misusers: review of methods for studying prevalence and morbidity

    Directory of Open Access Journals (Sweden)

    Dunn John

    1999-01-01

    Epidemiological studies of drug misusers have until recently relied on two main forms of sampling: probability and convenience. The former has been used when the aim was simply to estimate the prevalence of the condition and the latter when in depth studies of the characteristics, profiles and behaviour of drug users were required, but each method has its limitations. Probability samples become impracticable when the prevalence of the condition is very low, less than 0.5% for example, or when the condition being studied is a clandestine activity such as illicit drug use. When stratified random samples are used, it may be difficult to obtain a truly representative sample, depending on the quality of the information used to develop the stratification strategy. The main limitation of studies using convenience samples is that the results cannot be generalised to the whole population of drug users due to selection bias and a lack of information concerning the sampling frame. New methods have been developed which aim to overcome some of these difficulties, for example, social network analysis, snowball sampling, capture-recapture techniques, privileged access interviewer method and contact tracing. All these methods have been applied to the study of drug misuse. The various methods are described and examples of their use given, drawn from both the Brazilian and international drug misuse literature.
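
    Capture-recapture is the most formula-driven of the techniques listed above. The following minimal Python sketch, using entirely hypothetical counts rather than any figures from the paper, shows the classic two-source (Lincoln-Petersen) estimate of a hidden population and Chapman's small-sample correction.

        def lincoln_petersen(n1, n2, m2):
            """Two-source capture-recapture estimate of the hidden population size.

            n1: individuals identified in the first source (e.g. a treatment register)
            n2: individuals identified in the second source (e.g. police records)
            m2: individuals appearing in both sources
            """
            if m2 == 0:
                raise ValueError("No overlap between sources; the estimate is undefined.")
            return n1 * n2 / m2

        def chapman(n1, n2, m2):
            """Chapman's less biased variant, usable even when the overlap is small."""
            return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

        # Hypothetical figures: 400 users in one register, 300 in another, 60 in both.
        print(lincoln_petersen(400, 300, 60))   # 2000.0
        print(round(chapman(400, 300, 60), 1))  # 1977.7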

  10. Demystifying the peer-review process - workshop

    Science.gov (United States)

    Scientific writing and peer-review are integral parts of the publishing process. This workshop aims to demystify the peer-review process for early career scientists and provide insightful tips for streamlining the submission and peer review process for all researchers. Providing ...

  11. Introducing PALETTE: an iterative method for conducting a literature search for a review in palliative care.

    Science.gov (United States)

    Zwakman, Marieke; Verberne, Lisa M; Kars, Marijke C; Hooft, Lotty; van Delden, Johannes J M; Spijker, René

    2018-06-02

    In the rapidly developing specialty of palliative care, literature reviews have become increasingly important to inform and improve the field. When applying widely used methods for literature reviews developed for intervention studies onto palliative care, challenges are encountered such as the heterogeneity of palliative care in practice (wide range of domains in patient characteristics, stages of illness and stakeholders), the explorative character of review questions, and the poorly defined keywords and concepts. To overcome the challenges and to provide guidance for researchers to conduct a literature search for a review in palliative care, the Palliative cAre Literature rEview iTeraTive mEthod (PALETTE), a pragmatic framework, was developed. We present PALETTE with a detailed description. PALETTE consists of four phases: developing the review question, building the search strategy, validating the search strategy and performing the search. The framework incorporates different information retrieval techniques: contacting experts, pearl growing, citation tracking and Boolean searching in a transparent way to maximize the retrieval of literature relevant to the topic of interest. The different components and techniques are repeated until no new articles are qualified for inclusion. The phases within PALETTE are interconnected by a recurrent process of validation on 'golden bullets' (articles that undoubtedly should be part of the review), citation tracking and concept terminology reflecting the review question. To give insight into the value of PALETTE, we compared PALETTE with the recommended search method for reviews of intervention studies. By using PALETTE on two palliative care literature reviews, we were able to improve our review questions and search strategies. Moreover, in comparison with the recommended search for intervention reviews, the number of articles needed to be screened was decreased whereas more relevant articles were retrieved. Overall, PALETTE

  12. [Predictive methods versus clinical titration for the initiation of lithium therapy. A systematic review].

    Science.gov (United States)

    Geeraerts, I; Sienaert, P

    2013-01-01

    When lithium is administered, the clinician needs to know when the lithium in the patient’s blood has reached a therapeutic level. At the initiation of treatment the level is usually achieved gradually through the application of the titration method. In order to increase the efficacy of this procedure several methods for dosing lithium and for predicting lithium levels have been developed. To conduct a systematic review of the publications relating to the various methods for dosing lithium or predicting lithium levels at the initiation of therapy. We searched Medline systematically for articles published in English, French or Dutch between 1966 and April 2012 which described or studied a method for dosing lithium or for predicting the lithium level reached following a specific dosage. We screened the reference lists of relevant articles in order to locate additional papers. We found 38 lithium prediction methods, in addition to the clinical titration method. These methods can be divided into two categories: the ‘a priori’ methods and the ‘test-dose’ methods, the latter requiring the administration of a test dose of lithium. The lithium prediction methods generally achieve a therapeutic blood level faster than the clinical titration method, but none of the methods achieves convincing results. On the basis of our review, we propose that the titration method should be used as the standard method in clinical practice.

  13. A review of Green's function methods in computational fluid mechanics: Background, recent developments and future directions

    International Nuclear Information System (INIS)

    Dorning, J.

    1981-01-01

    The research and development over the past eight years on local Green's function methods for the high-accuracy, high-efficiency numerical solution of nuclear engineering problems is reviewed. The basic concepts and key ideas are presented by starting with an expository review of the original fully two-dimensional local Green's function methods developed for neutron diffusion and heat conduction, and continuing through the progressively more complicated and more efficient nodal Green's function methods for neutron diffusion, heat conduction and neutron transport to establish the background for the recent development of Green's function methods in computational fluid mechanics. Some of the impressive numerical results obtained via these classes of methods for nuclear engineering problems are briefly summarized. Finally, speculations are proffered on future directions in which the development of these types of methods in fluid mechanics and other areas might lead. (orig.) [de

  14. Trends in HFE Methods and Tools and Their Applicability to Safety Reviews

    Energy Technology Data Exchange (ETDEWEB)

    O'Hara, J.M.; Plott, C.; Milanski, J.; Ronan, A.; Scheff, S.; Laux, L.; Bzostek, J.

    2009-09-30

    The U.S. Nuclear Regulatory Commission (NRC) conducts human factors engineering (HFE) safety reviews of applicant submittals for new plants and for changes to existing plants. The reviews include the evaluation of the methods and tools (M&T) used by applicants as part of their HFE program. The technology used to perform HFE activities has been rapidly evolving, resulting in a whole new generation of HFE M&Ts. The objectives of this research were to identify the current trends in HFE methods and tools, determine their applicability to NRC safety reviews, and identify topics for which the NRC may need additional guidance to support the NRC's safety reviews. We conducted a survey that identified over 100 new HFE M&Ts. The M&Ts were assessed to identify general trends. Seven trends were identified: Computer Applications for Performing Traditional Analyses, Computer-Aided Design, Integration of HFE Methods and Tools, Rapid Development Engineering, Analysis of Cognitive Tasks, Use of Virtual Environments and Visualizations, and Application of Human Performance Models. We assessed each trend to determine its applicability to the NRC's review by considering (1) whether the nuclear industry is making use of M&Ts for each trend, and (2) whether M&Ts reflecting the trend can be reviewed using the current design review guidance. We concluded that M&T trends that are applicable to the commercial nuclear industry and are expected to impact safety reviews may be considered for review guidance development. Three trends fell into this category: Analysis of Cognitive Tasks, Use of Virtual Environments and Visualizations, and Application of Human Performance Models. The other trends do not need to be addressed at this time.

  15. Systematic reviews of diagnostic tests in endocrinology: an audit of methods, reporting, and performance.

    Science.gov (United States)

    Spencer-Bonilla, Gabriela; Singh Ospina, Naykky; Rodriguez-Gutierrez, Rene; Brito, Juan P; Iñiguez-Ariza, Nicole; Tamhane, Shrikant; Erwin, Patricia J; Murad, M Hassan; Montori, Victor M

    2017-07-01

    Systematic reviews provide clinicians and policymakers with estimates of diagnostic test accuracy and their usefulness in clinical practice. We identified all available systematic reviews of diagnosis in endocrinology, summarized the diagnostic accuracy of the tests included, and assessed the credibility and clinical usefulness of the methods and reporting. We searched Ovid MEDLINE, EMBASE, and Cochrane CENTRAL from inception to December 2015 for systematic reviews and meta-analyses reporting accuracy measures of diagnostic tests in endocrinology. Experienced reviewers independently screened for eligible studies and collected data. We summarized the results, methods, and reporting of the reviews. We performed subgroup analyses to categorize diagnostic tests as most useful based on their accuracy. We identified 84 systematic reviews; half of the tests included were classified as helpful when positive, one-fourth as helpful when negative. Most authors adequately reported how studies were identified and selected and how their trustworthiness (risk of bias) was judged. Only one in three reviews, however, reported an overall judgment about trustworthiness and one in five reported using adequate meta-analytic methods. One in four reported contacting authors for further information and about half included only patients with diagnostic uncertainty. Up to half of the diagnostic endocrine tests in which the likelihood ratio was calculated or provided are likely to be helpful in practice when positive, as are one-quarter when negative. Most diagnostic systematic reviews in endocrinology lack methodological rigor, protection against bias, and offer limited credibility. Substantial efforts, therefore, seem necessary to improve the quality of diagnostic systematic reviews in endocrinology.
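
    The classification of tests as "helpful when positive" or "helpful when negative" in this review rests on likelihood ratios. The exact thresholds are the authors' own, so the Python sketch below only shows how the ratios are derived from a 2x2 table using made-up counts; a common rule of thumb treats LR+ of roughly 10 or more and LR- of roughly 0.1 or less as strongly informative.

        def likelihood_ratios(tp, fp, fn, tn):
            """Positive and negative likelihood ratios from a 2x2 diagnostic table."""
            sensitivity = tp / (tp + fn)
            specificity = tn / (tn + fp)
            lr_positive = sensitivity / (1 - specificity)
            lr_negative = (1 - sensitivity) / specificity
            return lr_positive, lr_negative

        # Hypothetical counts for a test evaluated against a reference standard.
        lr_pos, lr_neg = likelihood_ratios(tp=90, fp=20, fn=10, tn=180)
        print(round(lr_pos, 1), round(lr_neg, 2))  # 9.0 0.11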

  16. Trends in HFE Methods and Tools and Their Applicability to Safety Reviews

    International Nuclear Information System (INIS)

    O'Hara, J.M.; Plott, C.; Milanski, J.; Ronan, A.; Scheff, S.; Laux, L.; Bzostek, J.

    2009-01-01

    The U.S. Nuclear Regulatory Commission (NRC) conducts human factors engineering (HFE) safety reviews of applicant submittals for new plants and for changes to existing plants. The reviews include the evaluation of the methods and tools (M&T) used by applicants as part of their HFE program. The technology used to perform HFE activities has been rapidly evolving, resulting in a whole new generation of HFE M&Ts. The objectives of this research were to identify the current trends in HFE methods and tools, determine their applicability to NRC safety reviews, and identify topics for which the NRC may need additional guidance to support the NRC's safety reviews. We conducted a survey that identified over 100 new HFE M&Ts. The M&Ts were assessed to identify general trends. Seven trends were identified: Computer Applications for Performing Traditional Analyses, Computer-Aided Design, Integration of HFE Methods and Tools, Rapid Development Engineering, Analysis of Cognitive Tasks, Use of Virtual Environments and Visualizations, and Application of Human Performance Models. We assessed each trend to determine its applicability to the NRC's review by considering (1) whether the nuclear industry is making use of M&Ts for each trend, and (2) whether M&Ts reflecting the trend can be reviewed using the current design review guidance. We concluded that M&T trends that are applicable to the commercial nuclear industry and are expected to impact safety reviews may be considered for review guidance development. Three trends fell into this category: Analysis of Cognitive Tasks, Use of Virtual Environments and Visualizations, and Application of Human Performance Models. The other trends do not need to be addressed at this time.

  17. Physical activity among South Asian women: a systematic, mixed-methods review

    OpenAIRE

    Babakus, Whitney S; Thompson, Janice L

    2012-01-01

    Introduction The objective of this systematic mixed-methods review is to assess what is currently known about the levels of physical activity (PA) and sedentary time (ST) and to contextualize these behaviors among South Asian women with an immigrant background. Methods A systematic search of the literature was conducted using combinations of the key words PA, ST, South Asian, and immigrant. A mixed-methods approach was used to analyze and synthesize all evidence, both quantitative an...

  18. Review of methods of detection of oil pollution in the sea

    Energy Technology Data Exchange (ETDEWEB)

    Gurgul, H; Pawlak, B

    1981-01-01

    Given the need for detection, recognition, and identification of oil spills at sea, existing and prospective contactless methods of detecting oil on the water surface are reviewed, including optical methods (in the IR, visible, and UV bands, including lasers), radar methods exploiting fluorescence and interference phenomena, and aerial and space photography and filming. Parameters of instruments that use the optical and radar methods, including CO₂, nitrogen and helium-cadmium lasers, are presented.

  19. Methods for the evaluation of hospital cooperation activities (Systematic review protocol)

    Directory of Open Access Journals (Sweden)

    Rotter Thomas

    2012-02-01

    Background Hospital partnerships, mergers and cooperatives are arrangements frequently seen as a means of improving health service delivery. Many of the assumptions used in planning hospital cooperatives are not stated clearly and are often based on limited or poor scientific evidence. Methods This is a protocol for a systematic review, following the Cochrane EPOC methodology. The review aims to document, catalogue and synthesize the existing literature on the reported methods for the evaluation of hospital cooperation activities as well as methods of hospital cooperation. We will search the Database of Abstracts of Reviews of Effectiveness, the Effective Practice and Organisation of Care Register, the Cochrane Central Register of Controlled Trials and bibliographic databases including PubMed (via NLM), Web of Science, NHS EED, Business Source Premier (via EBSCO) and Global Health for publications that report on methods for evaluating hospital cooperatives, strategic partnerships, mergers, alliances, networks and related activities and methods used for such partnerships. The method proposed by the Cochrane EPOC group regarding randomized study designs, controlled clinical trials, controlled before and after studies, and interrupted time series will be followed. In addition, we will also include cohort, case-control studies, and relevant non-comparative publications such as case reports. We will categorize and analyze the review findings according to the study design employed, the study quality (low versus high quality studies) and the method reported in the primary studies. We will present the results of studies in tabular form. Discussion Overall, the systematic review aims to identify, assess and synthesize the evidence to underpin hospital cooperation activities as defined in this protocol. As a result, the review will provide an evidence base for partnerships, alliances or other fields of cooperation in a hospital setting. PROSPERO

  20. A comparison of statistical methods for identifying out-of-date systematic reviews.

    Directory of Open Access Journals (Sweden)

    Porjai Pattanittum

    BACKGROUND: Systematic reviews (SRs) can provide accurate and reliable evidence, typically about the effectiveness of health interventions. Evidence is dynamic, and if SRs are out-of-date this information may not be useful; it may even be harmful. This study aimed to compare five statistical methods to identify out-of-date SRs. METHODS: A retrospective cohort of SRs registered in the Cochrane Pregnancy and Childbirth Group (CPCG), published between 2008 and 2010, were considered for inclusion. For each eligible CPCG review, data were extracted and "3-years previous" meta-analyses were assessed for the need to update, given the data from the most recent 3 years. Each of the five statistical methods was used, with random effects analyses throughout the study. RESULTS: Eighty reviews were included in this study; most were in the area of induction of labour. The numbers of reviews identified as being out-of-date using the Ottawa, recursive cumulative meta-analysis (CMA), and Barrowman methods were 34, 7, and 7 respectively. No reviews were identified as being out-of-date using the simulation-based power method, or the CMA for sufficiency and stability method. The overall agreement among the three discriminating statistical methods was slight (Kappa = 0.14; 95% CI 0.05 to 0.23). The recursive cumulative meta-analysis, Ottawa, and Barrowman methods were practical according to the study criteria. CONCLUSION: Our study shows that three practical statistical methods could be applied to examine the need to update SRs.
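
    The "slight" agreement reported above is a kappa statistic. As a purely illustrative sketch (the study compared three methods and its per-review classifications are not reproduced here), the Python snippet below computes Cohen's kappa for pairwise agreement between two methods' out-of-date flags on invented data.

        def cohens_kappa(a, b):
            """Cohen's kappa for two raters assigning binary labels to the same items."""
            n = len(a)
            observed = sum(x == y for x, y in zip(a, b)) / n
            p_a_yes, p_b_yes = sum(a) / n, sum(b) / n
            expected = p_a_yes * p_b_yes + (1 - p_a_yes) * (1 - p_b_yes)
            return (observed - expected) / (1 - expected)

        # Hypothetical flags for 10 reviews from two updating methods (1 = out of date).
        method_1 = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0]
        method_2 = [1, 0, 1, 0, 0, 0, 0, 0, 0, 1]
        print(round(cohens_kappa(method_1, method_2), 2))  # ~0.05, i.e. slight agreement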

  1. Qualitative Evaluation Methods in Ethics Education: A Systematic Review and Analysis of Best Practices.

    Science.gov (United States)

    Watts, Logan L; Todd, E Michelle; Mulhearn, Tyler J; Medeiros, Kelsey E; Mumford, Michael D; Connelly, Shane

    2017-01-01

    Although qualitative research offers some unique advantages over quantitative research, qualitative methods are rarely employed in the evaluation of ethics education programs and are often criticized for a lack of rigor. This systematic review investigated the use of qualitative methods in studies of ethics education. Following a review of the literature in which 24 studies were identified, each study was coded based on 16 best practices characteristics in qualitative research. General thematic analysis and grounded theory were found to be the dominant approaches used. Researchers are effectively executing a number of best practices, such as using direct data sources, structured data collection instruments, non-leading questioning, and expert raters. However, other best practices were rarely present in the courses reviewed, such as collecting data using multiple sources, methods, raters, and timepoints, evaluating reliability, and employing triangulation analyses to assess convergence. Recommendations are presented for improving future qualitative research studies in ethics education.

  2. Using logic model methods in systematic review synthesis: describing complex pathways in referral management interventions.

    Science.gov (United States)

    Baxter, Susan K; Blank, Lindsay; Woods, Helen Buckley; Payne, Nick; Rimmer, Melanie; Goyder, Elizabeth

    2014-05-10

    There is increasing interest in innovative methods to carry out systematic reviews of complex interventions. Theory-based approaches, such as logic models, have been suggested as a means of providing additional insights beyond that obtained via conventional review methods. This paper reports the use of an innovative method which combines systematic review processes with logic model techniques to synthesise a broad range of literature. The potential value of the model produced was explored with stakeholders. The review identified 295 papers that met the inclusion criteria. The papers consisted of 141 intervention studies and 154 non-intervention quantitative and qualitative articles. A logic model was systematically built from these studies. The model outlines interventions, short term outcomes, moderating and mediating factors and long term demand management outcomes and impacts. Interventions were grouped into typologies of practitioner education, process change, system change, and patient intervention. Short-term outcomes identified that may result from these interventions were changed physician or patient knowledge, beliefs or attitudes and also interventions related to changed doctor-patient interaction. A range of factors which may influence whether these outcomes lead to long term change were detailed. Demand management outcomes and intended impacts included content of referral, rate of referral, and doctor or patient satisfaction. The logic model details evidence and assumptions underpinning the complex pathway from interventions to demand management impact. The method offers a useful addition to systematic review methodologies. PROSPERO registration number: CRD42013004037.

  3. Setting health research priorities using the CHNRI method: VII. A review of the first 50 applications of the CHNRI method.

    Science.gov (United States)

    Rudan, Igor; Yoshida, Sachiyo; Chan, Kit Yee; Sridhar, Devi; Wazny, Kerri; Nair, Harish; Sheikh, Aziz; Tomlinson, Mark; Lawn, Joy E; Bhutta, Zulfiqar A; Bahl, Rajiv; Chopra, Mickey; Campbell, Harry; El Arifeen, Shams; Black, Robert E; Cousens, Simon

    2017-06-01

    Several recent reviews of the methods used to set research priorities have identified the CHNRI method (acronym derived from the "Child Health and Nutrition Research Initiative") as an approach that clearly became popular and widely used over the past decade. In this paper we review the first 50 examples of application of the CHNRI method, published between 2007 and 2016, and summarize the most important messages that emerged from those experiences. We conducted a literature review to identify the first 50 examples of application of the CHNRI method in chronological order. We searched Google Scholar, PubMed and so-called grey literature. Initially, between 2007 and 2011, the CHNRI method was mainly used for setting research priorities to address global child health issues, although the first cases of application outside this field (eg, mental health, disabilities and zoonoses) were also recorded. Since 2012 the CHNRI method was used more widely, expanding into the topics such as adolescent health, dementia, national health policy and education. The majority of the exercises were focused on issues that were only relevant to low- and middle-income countries, and national-level applications are on the rise. The first CHNRI-based articles adhered to the five recommended priority-setting criteria, but by 2016 more than two-thirds of all conducted exercises departed from recommendations, modifying the CHNRI method to suit each particular exercise. This was done not only by changing the number of criteria used, but also by introducing some entirely new criteria (eg, "low cost", "sustainability", "acceptability", "feasibility", "relevance" and others). The popularity of the CHNRI method in setting health research priorities can be attributed to several key conceptual advances that have addressed common concerns. The method is systematic in nature, offering an acceptable framework for handling many research questions. It is also transparent and replicable, because it
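
    For readers unfamiliar with the arithmetic behind a CHNRI exercise, the sketch below gives a simplified, hypothetical reading of the scoring step: experts answer 1 (yes), 0 (no) or 0.5 (undecided) for each research option against each criterion, "don't know" answers are left out of the averages, and the research priority score is taken here as an unweighted mean of the criterion scores. Real exercises may weight criteria using stakeholder input; none of the numbers below come from the review.

        import numpy as np

        # Hypothetical scores, shape (experts, research options, criteria);
        # np.nan marks "don't know" answers, which are excluded from the averages.
        scores = np.array([
            [[1, 1, 0.5, 1, 0], [0, 0.5, 1, 1, 1]],
            [[1, 0, 1, 1, 0.5], [np.nan, 1, 1, 0, 1]],
            [[1, 1, 1, 0.5, 0], [0, 1, np.nan, 1, 1]],
        ])

        criterion_scores = np.nanmean(scores, axis=0) * 100       # per option, per criterion (%)
        research_priority_scores = criterion_scores.mean(axis=1)  # unweighted mean across criteria
        print(np.round(criterion_scores, 1))
        print(np.round(research_priority_scores, 1))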

  4. Availability and performance of image/video-based vital signs monitoring methods: a systematic review protocol

    Directory of Open Access Journals (Sweden)

    Mirae Harford

    2017-10-01

    Full Text Available Abstract Background For many vital signs, monitoring methods require contact with the patient and/or are invasive in nature. There is increasing interest in developing still and video image-guided monitoring methods that are non-contact and non-invasive. We will undertake a systematic review of still and video image-based monitoring methods. Methods We will perform searches in multiple databases which include MEDLINE, Embase, CINAHL, Cochrane library, IEEE Xplore and ACM Digital Library. We will use OpenGrey and Google searches to access unpublished or commercial data. We will not use language or publication date restrictions. The primary goal is to summarise current image-based vital signs monitoring methods, limited to heart rate, respiratory rate, oxygen saturations and blood pressure. Of particular interest will be the effectiveness of image-based methods compared to reference devices. Other outcomes of interest include the quality of the method comparison studies with respect to published reporting guidelines, any limitations of non-contact non-invasive technology and application in different populations. Discussion To the best of our knowledge, this is the first systematic review of image-based non-contact methods of vital signs monitoring. Synthesis of currently available technology will facilitate future research in this highly topical area. Systematic review registration PROSPERO CRD42016029167
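
    Although the protocol above does not prescribe any particular algorithm, many of the image-based heart-rate methods it will cover share a common core: extract a photoplethysmographic signal from the video and locate its dominant frequency. The Python sketch below illustrates only that spectral-peak idea on a synthetic signal; it is not the pipeline of any study in the review, and all numbers are invented.

        import numpy as np

        # Synthetic stand-in for the mean green-channel intensity of a face region:
        # a weak pulsatile component at 1.2 Hz (72 beats/min) buried in noise.
        fs = 30.0                                  # camera frame rate (frames/s)
        t = np.arange(0, 30, 1 / fs)               # 30 s of "video"
        rng = np.random.default_rng(0)
        signal = 0.05 * np.sin(2 * np.pi * 1.2 * t) + rng.normal(0, 0.2, t.size)

        # Estimate heart rate as the strongest spectral peak in a plausible band (0.7-3 Hz).
        freqs = np.fft.rfftfreq(signal.size, d=1 / fs)
        power = np.abs(np.fft.rfft(signal - signal.mean())) ** 2
        band = (freqs >= 0.7) & (freqs <= 3.0)
        heart_rate_bpm = 60 * freqs[band][np.argmax(power[band])]
        print(round(heart_rate_bpm, 1))            # ~72 beats/min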

  5. Review on Laryngeal Palpation Methods in Muscle Tension Dysphonia: Validity and Reliability Issues.

    Science.gov (United States)

    Khoddami, Seyyedeh Maryam; Ansari, Noureddin Nakhostin; Jalaie, Shohreh

    2015-07-01

    Laryngeal palpation is a common clinical method for the assessment of neck and laryngeal muscles in muscle tension dysphonia (MTD). To review the available laryngeal palpation methods used in patients with MTD for the assessment, diagnosis, or documentation of treatment outcomes. A systematic review of the literature concerning palpatory methods in MTD was conducted using the databases MEDLINE (PubMed), ScienceDirect, Scopus, Web of Science, Web of Knowledge and Cochrane Library between July and October 2013. Relevant studies were identified by one reviewer based on screened titles/abstracts and full texts. Manual searching was also used to track the source literature. There were five main as well as miscellaneous palpation methods that differed according to target anatomical structures, judgment or grading system, and the tasks used. There were only a few scales available, and the majority of the palpatory methods were qualitative. Most of the palpatory methods evaluate the tension at both static and dynamic tasks. There was little information about the validity and reliability of the available methods. The literature on the scientific evidence of muscle tension indicators perceived by laryngeal palpation in MTD is scarce. Future studies should be conducted to investigate the validity and reliability of palpation methods. Copyright © 2015 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  6. Review of analytical methods for the quantification of iodine in complex matrices

    Energy Technology Data Exchange (ETDEWEB)

    Shelor, C. Phillip [Department of Chemistry and Biochemistry, University of Texas at Arlington, Arlington, TX 76019-0065 (United States); Dasgupta, Purnendu K., E-mail: Dasgupta@uta.edu [Department of Chemistry and Biochemistry, University of Texas at Arlington, Arlington, TX 76019-0065 (United States)

    2011-09-19

    Highlights: → We focus on iodine in biological samples, notably urine and milk. → Sample preparation and the Sandell-Kolthoff method are extensively discussed. - Abstract: Iodine is an essential element of human nutrition. Nearly a third of the global population has insufficient iodine intake and is at risk of developing Iodine Deficiency Disorders (IDD). Most countries have iodine supplementation and monitoring programs. Urinary iodide (UI) is the biomarker used for epidemiological studies; only a few methods are currently used routinely for analysis. These methods either require expensive instrumentation with qualified personnel (inductively coupled plasma-mass spectrometry, instrumental neutron activation analysis) or oxidative sample digestion to remove potential interferences prior to analysis by a kinetic colorimetric method originally introduced by Sandell and Kolthoff approximately 75 years ago. The Sandell-Kolthoff (S-K) method is based on the catalytic effect of iodide on the reaction between Ce⁴⁺ and As³⁺. No available technique fully fits the needs of developing countries; research into inexpensive reliable methods and instrumentation is needed. There have been multiple reviews of methods used for epidemiological studies and specific techniques. However, a general review of iodine determination on a wide-ranging set of complex matrices is not available. While this review is not comprehensive, we cover the principal developments since the original development of the S-K method.
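
    For reference, the catalytic chemistry behind the Sandell-Kolthoff method mentioned above can be summarised as follows; the pseudo-first-order rate expression (valid with arsenite in excess) is the commonly used simplification rather than a formulation taken from this particular paper.

        \begin{align}
          2\,\mathrm{Ce}^{4+} + \mathrm{As}^{3+} &\xrightarrow{\;\mathrm{I}^{-}\;} 2\,\mathrm{Ce}^{3+} + \mathrm{As}^{5+} \\
          -\frac{d[\mathrm{Ce}^{4+}]}{dt} &= k_{\mathrm{obs}}\,[\mathrm{Ce}^{4+}], \qquad k_{\mathrm{obs}} \propto [\mathrm{I}^{-}]
        \end{align}

    In practice, the iodide content of a sample is read off a calibration curve relating the observed rate (or the residual Ce⁴⁺ colour after a fixed reaction time) to known iodide standards.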

  7. A Review of Methods for Analysis of the Expected Value of Information.

    Science.gov (United States)

    Heath, Anna; Manolopoulou, Ioanna; Baio, Gianluca

    2017-10-01

    In recent years, value-of-information analysis has become more widespread in health economic evaluations, specifically as a tool to guide further research and perform probabilistic sensitivity analysis. This is partly due to methodological advancements allowing for the fast computation of a typical summary known as the expected value of partial perfect information (EVPPI). A recent review discussed some approximation methods for calculating the EVPPI, but as the research has been active over the intervening years, that review does not discuss some key estimation methods. Therefore, this paper presents a comprehensive review of these new methods. We begin by providing the technical details of these computation methods. We then present two case studies in order to compare the estimation performance of these new methods. We conclude that a method based on nonparametric regression offers the best method for calculating the EVPPI in terms of accuracy, computational time, and ease of implementation. This means that the EVPPI can now be used practically in health economic evaluations, especially as all the methods are developed in parallel with R functions and a web app to aid practitioners.
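
    The regression-based estimators favoured in this review share a simple structure: smooth the simulated net benefits on the parameter(s) of interest, then compare the expected maximum of the fitted values with the maximum of the overall means. The Python sketch below illustrates that idea with a deliberately crude binned smoother on simulated data; the methods the review recommends use proper nonparametric regression (e.g. GAMs or Gaussian processes), and every number here is hypothetical.

        import numpy as np

        rng = np.random.default_rng(1)

        # Hypothetical probabilistic sensitivity analysis: simulated net benefits for
        # two strategies, partly driven by one uncertain parameter phi.
        n = 10_000
        phi = rng.normal(0.0, 1.0, n)                    # parameter of interest
        noise = rng.normal(0.0, 2.0, (n, 2))             # all remaining uncertainty
        net_benefit = np.column_stack([1.0 + 0.5 * phi,  # strategy A
                                       1.2 - 0.5 * phi]) + noise

        def evppi_binned(phi, nb, bins=50):
            """Regression-style EVPPI: smooth net benefit on phi with a binned mean,
            then take E_phi[max_d fitted] minus max_d of the overall means."""
            order = np.argsort(phi)
            nb_sorted = nb[order]
            fitted = np.empty_like(nb_sorted)
            step = len(phi) // bins
            for start in range(0, len(phi), step):
                fitted[start:start + step] = nb_sorted[start:start + step].mean(axis=0)
            return fitted.max(axis=1).mean() - nb.mean(axis=0).max()

        print(round(evppi_binned(phi, net_benefit), 3))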

  8. Novel keyword co-occurrence network-based methods to foster systematic reviews of scientific literature.

    Science.gov (United States)

    Radhakrishnan, Srinivasan; Erbis, Serkan; Isaacs, Jacqueline A; Kamarthi, Sagar

    2017-01-01

    Systematic reviews of scientific literature are important for mapping the existing state of research and highlighting further growth channels in a field of study, but systematic reviews are inherently tedious, time consuming, and manual in nature. In recent years, keyword co-occurrence networks (KCNs) are exploited for knowledge mapping. In a KCN, each keyword is represented as a node and each co-occurrence of a pair of words is represented as a link. The number of times that a pair of words co-occurs in multiple articles constitutes the weight of the link connecting the pair. The network constructed in this manner represents cumulative knowledge of a domain and helps to uncover meaningful knowledge components and insights based on the patterns and strength of links between keywords that appear in the literature. In this work, we propose a KCN-based approach that can be implemented prior to undertaking a systematic review to guide and accelerate the review process. The novelty of this method lies in the new metrics used for statistical analysis of a KCN that differ from those typically used for KCN analysis. The approach is demonstrated through its application to nano-related Environmental, Health, and Safety (EHS) risk literature. The KCN approach identified the knowledge components, knowledge structure, and research trends that match with those discovered through a traditional systematic review of the nanoEHS field. Because KCN-based analyses can be conducted more quickly to explore a vast amount of literature, this method can provide a knowledge map and insights prior to undertaking a rigorous traditional systematic review. This two-step approach can significantly reduce the effort and time required for a traditional systematic literature review. The proposed KCN-based pre-systematic review method is universal. It can be applied to any scientific field of study to prepare a knowledge map.
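
    The KCN construction described above (one node per keyword, link weights counting how many articles a keyword pair co-occurs in) is easy to express directly. The Python sketch below uses a handful of invented keyword lists purely to illustrate the bookkeeping; the paper's own metrics for analysing the resulting network are not reproduced here.

        from collections import Counter
        from itertools import combinations

        # Hypothetical keyword lists, one per article, standing in for a literature corpus.
        articles = [
            ["nanoparticle", "toxicity", "risk assessment"],
            ["nanoparticle", "exposure", "toxicity"],
            ["risk assessment", "exposure", "workplace"],
            ["nanoparticle", "risk assessment"],
        ]

        edge_weights = Counter()
        for keywords in articles:
            for pair in combinations(sorted(set(keywords)), 2):
                edge_weights[pair] += 1          # weight = number of co-occurring articles

        node_strength = Counter()                # sum of link weights attached to each keyword
        for (u, v), w in edge_weights.items():
            node_strength[u] += w
            node_strength[v] += w

        print(edge_weights.most_common(3))
        print(node_strength.most_common(2))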

  9. Streamlining interventional radiology admissions: The role of the interventional radiology clinic and physician's assistant

    International Nuclear Information System (INIS)

    White, R.I. Jr.; Rizer, D.M.; Shuman, K.; White, E.J.; Adams, P.; Doyle, K.; Kinnison, M.

    1987-01-01

    During a 5-year period (1982-1987), 376 patients were admitted to an interventional radiology service where they were managed by the senior physician and interventional radiology fellows. Sixty-eight percent of patients were admitted for angioplasty and 32% for elective embolotherapy/diagnostic angiography. A one-half-day, twice weekly interventional radiology clinic and employment of a physician's assistant who performed preadmission history and physicals and wrote orders accounted, in part, for a decrease in hospital stay length from 3.74 days (1982-1983) to 2.41 days (1986-1987). The authors conclude that use of the clinic and the physician's assistant streamlines patient flow and the admitting process and is partially responsible for a decreased length of stay for patients admitted to an interventional radiology service

  10. Streamlining and Standardizing Due Diligence to Ensure Quality of PV Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, Sarah [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-11-28

    Those investing in PV power plants would like to have confidence that the plants will provide the anticipated return on investment. While due diligence is capably performed by independent engineers today, as PV systems mature, there will be benefit in standardization and streamlining of this process. The IECRE has defined technical information that is needed as a basis for each transaction step such as approving a design to begin construction, documenting readiness to operate, quantifying performance after a year of operation, and assessing the health of the plant in preparation for sale of the plant. The technical requirements have been defined by IEC Technical Committee 82 and have been designed to be both effective and efficient in completing the assessments. This workshop will describe these new tools that are now available to the community and will include a panel/audience discussion about how and when they can be most effectively used.

  11. Roof Box Shape Streamline Adaptation and the Impact towards Fuel Consumption

    Directory of Open Access Journals (Sweden)

    Abdul Latif M.F.

    2017-01-01

    The fuel price hike is currently a sensational national issue in Malaysia. Since the rationalization of fuel subsidies, many households have been affected, especially middle-income families. Vehicle aerodynamics is directly related to fuel consumption: extra frontal area results in a higher drag force and hence higher fuel consumption. Roof boxes are among the largest contributors to this extra drag, so rationalizing the roof box shape is a prominent way to reduce it. The idea of adopting a water-drop shape for the roof box design shows promising results. The roof box was simulated using MIRA virtual wind tunnel modelling via a commercial computational fluid dynamics (CFD) package. The streamlined shape drastically reduces the drag force by 34%, resulting in a 1.7% fuel saving compared with a conventional boxy roof box. This is an effort to reduce the carbon footprint for a sustainable, green world.
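
    The aerodynamic reasoning in the abstract follows directly from the standard drag equation. In the Python sketch below, only the 34% reduction comes from the abstract; the drag coefficient, frontal area and speed assigned to the roof box are invented for illustration.

        def drag_force(cd, frontal_area_m2, speed_kmh, air_density=1.225):
            """Aerodynamic drag F = 0.5 * rho * v^2 * Cd * A (speed given in km/h)."""
            v = speed_kmh / 3.6
            return 0.5 * air_density * v ** 2 * cd * frontal_area_m2

        # Hypothetical figures for the *extra* drag contributed by a roof box at 110 km/h.
        boxy_extra = drag_force(cd=0.9, frontal_area_m2=0.35, speed_kmh=110)
        streamlined_extra = boxy_extra * (1 - 0.34)   # 34% reduction reported in the abstract
        print(f"{boxy_extra:.0f} N -> {streamlined_extra:.0f} N of added drag")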

  12. Easy XMM-Newton Data Analysis with the Streamlined ABC Guide!

    Science.gov (United States)

    Valencic, Lynne A.; Snowden, Steven L.; Pence, William D.

    2016-01-01

    The US XMM-Newton GOF has streamlined the time-honored XMM-Newton ABC Guide, making it easier to find and use what users may need to analyze their data. It takes into account what type of data a user might have, if they want to reduce the data on their own machine or over the internet with Web Hera, and if they prefer to use the command window or a GUI. The GOF has also included an introduction to analyzing EPIC and RGS spectra, and PN Timing mode data. The guide is provided for free to students, educators, and researchers for educational and research purposes. Try it out at: http://heasarc.gsfc.nasa.gov/docs/xmm/sl/intro.html

  13. Investigating the effects of streamline-based fiber tractography on matrix scaling in brain connective network.

    Science.gov (United States)

    Jan, Hengtai; Chao, Yi-Ping; Cho, Kuan-Hung; Kuo, Li-Wei

    2013-01-01

    Investigating the brain connective network using modern graph theory has been widely applied in cognitive and clinical neuroscience research. In this study, we aimed to investigate the effects of streamline-based fiber tractography on the change of network properties and established a systematic framework to understand how an adequate network matrix scaling can be determined. The network properties, including degree, efficiency and betweenness centrality, show similar tendencies in both left and right hemispheres. By employing a curve-fitting process with an exponential law and measuring the residuals, the association between changes in network properties and the track-number threshold is found, and an adequate range for investigating the lateralization of the brain network is suggested. The proposed approach can be further applied in clinical applications to improve diagnostic sensitivity using network analysis with graph theory.
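
    The threshold-selection step described above amounts to fitting an exponential decay to a network property measured at increasing streamline-count thresholds and inspecting the residuals. The Python sketch below reproduces only that generic curve-fitting step on synthetic values; it is not the authors' implementation, and the data are invented.

        import numpy as np
        from scipy.optimize import curve_fit

        # Hypothetical measurements: a network property (e.g. global efficiency) computed
        # at increasing streamline-count thresholds; values decay roughly exponentially.
        thresholds = np.arange(1, 21)
        rng = np.random.default_rng(42)
        efficiency = 0.15 + 0.45 * np.exp(-0.25 * thresholds) + rng.normal(0, 0.01, thresholds.size)

        def exponential(x, a, b, c):
            return a * np.exp(-b * x) + c

        params, _ = curve_fit(exponential, thresholds, efficiency, p0=(0.5, 0.2, 0.1))
        residuals = efficiency - exponential(thresholds, *params)
        print(np.round(params, 3), round(float(np.abs(residuals).mean()), 4))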

  14. A review of the evolution of human reliability analysis methods at nuclear industry

    International Nuclear Information System (INIS)

    Oliveira, Lécio N. de; Santos, Isaac José A. Luquetti dos; Carvalho, Paulo V.R.

    2017-01-01

    This paper reviews the status of research on the application of human reliability analysis methods in the nuclear industry and their evolution over the years. Human reliability analysis (HRA) is one of the elements used in Probabilistic Safety Analysis (PSA) and is performed as part of PSAs to quantify the likelihood that people will fail to take action, such as errors of omission and errors of commission. Although HRA may be used in many areas, the focus of this paper is to review the applicability of HRA methods over the years in the nuclear industry, especially in Nuclear Power Plants (NPP). An electronic search of the CAPES Portal of Journals (a bibliographic database) was performed. This literature review covers original papers published from the first generation of HRA methods up to those published in March 2017. A total of 94 papers were retrieved by the initial search and 13 were selected to be fully reviewed and for data extraction after the application of inclusion and exclusion criteria and an evaluation of quality and suitability according to applicability in the nuclear industry. Results point out that first-generation methods are more widely used in practice than second-generation methods. This is because they are more concentrated on quantification, in terms of success or failure of human action, which makes them useful for quantitative risk assessment in PSA. Although second-generation methods consider context and errors of commission in human error prediction, they are not as widely used in practice in the nuclear industry for PSA. (author)

  15. A review of the evolution of human reliability analysis methods at nuclear industry

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Lécio N. de; Santos, Isaac José A. Luquetti dos; Carvalho, Paulo V.R., E-mail: lecionoliveira@gmail.com, E-mail: luquetti@ien.gov.br, E-mail: paulov@ien.gov.br [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2017-11-01

    This paper reviews the status of research on the application of human reliability analysis methods in the nuclear industry and their evolution over the years. Human reliability analysis (HRA) is one of the elements used in Probabilistic Safety Analysis (PSA) and is performed as part of PSAs to quantify the likelihood that people will fail to take action, such as errors of omission and errors of commission. Although HRA may be used in many areas, the focus of this paper is to review the applicability of HRA methods over the years in the nuclear industry, especially in Nuclear Power Plants (NPP). An electronic search of the CAPES Portal of Journals (a bibliographic database) was performed. This literature review covers original papers published from the first generation of HRA methods up to those published in March 2017. A total of 94 papers were retrieved by the initial search and 13 were selected to be fully reviewed and for data extraction after the application of inclusion and exclusion criteria and an evaluation of quality and suitability according to applicability in the nuclear industry. Results point out that first-generation methods are more widely used in practice than second-generation methods. This is because they are more concentrated on quantification, in terms of success or failure of human action, which makes them useful for quantitative risk assessment in PSA. Although second-generation methods consider context and errors of commission in human error prediction, they are not as widely used in practice in the nuclear industry for PSA. (author)

  16. Reliability and validity of non-radiographic methods of thoracic kyphosis measurement: a systematic review.

    Science.gov (United States)

    Barrett, Eva; McCreesh, Karen; Lewis, Jeremy

    2014-02-01

    A wide array of instruments is available for non-invasive thoracic kyphosis measurement. Guidelines for selecting outcome measures for use in clinical and research practice recommend that properties such as validity and reliability are considered. This systematic review reports on the reliability and validity of non-invasive methods for measuring thoracic kyphosis. A systematic search of 11 electronic databases located studies assessing reliability and/or validity of non-invasive thoracic kyphosis measurement techniques. Two independent reviewers used a critical appraisal tool to assess the quality of retrieved studies. Data was extracted by the primary reviewer. The results were synthesized qualitatively using a level of evidence approach. 27 studies satisfied the eligibility criteria and were included in the review. The reliability, validity and both reliability and validity were investigated by sixteen, two and nine studies respectively. 17/27 studies were deemed to be of high quality. In total, 15 methods of thoracic kyphosis measurement were evaluated in the retrieved studies. All investigated methods showed high (ICC ≥ .7) to very high (ICC ≥ .9) levels of reliability. The validity of the methods ranged from low to very high. The strongest levels of evidence for reliability exist in support of the Debrunner kyphometer, Spinal Mouse and Flexicurve index, and for validity in support of the arcometer and Flexicurve index. Further reliability and validity studies are required to strengthen the level of evidence for the remaining methods of measurement. This should be addressed by future research. Copyright © 2013 Elsevier Ltd. All rights reserved.
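
    The ICC cut-offs quoted above (≥ .7 high, ≥ .9 very high) apply to whichever ICC form each primary study used. As one common choice, the Python sketch below computes ICC(2,1) (two-way random effects, absolute agreement, single measurement) from hypothetical kyphosis angles measured by two raters; none of the figures come from the review.

        import numpy as np

        def icc_2_1(ratings):
            """ICC(2,1) from an ANOVA decomposition; ratings has shape (subjects, raters)."""
            ratings = np.asarray(ratings, dtype=float)
            n, k = ratings.shape
            grand_mean = ratings.mean()
            ss_subjects = k * ((ratings.mean(axis=1) - grand_mean) ** 2).sum()
            ss_raters = n * ((ratings.mean(axis=0) - grand_mean) ** 2).sum()
            ss_error = ((ratings - grand_mean) ** 2).sum() - ss_subjects - ss_raters
            ms_subjects = ss_subjects / (n - 1)
            ms_raters = ss_raters / (k - 1)
            ms_error = ss_error / ((n - 1) * (k - 1))
            return (ms_subjects - ms_error) / (
                ms_subjects + (k - 1) * ms_error + k * (ms_raters - ms_error) / n
            )

        # Hypothetical kyphosis angles (degrees) from 5 participants measured by 2 raters.
        angles = [[42, 44], [35, 36], [50, 48], [28, 30], [39, 41]]
        print(round(icc_2_1(angles), 2))  # ~0.97, i.e. very high reliability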

  17. Can streamlined multi-criteria decision analysis be used to implement shared decision making for colorectal cancer screening?

    Science.gov (United States)

    Dolan, James G.; Boohaker, Emily; Allison, Jeroan; Imperiale, Thomas F.

    2013-01-01

    Background Current US colorectal cancer screening guidelines that call for shared decision making regarding the choice among several recommended screening options are difficult to implement. Multi-criteria decision analysis (MCDA) is an established methodology well suited for supporting shared decision making. Our study goal was to determine if a streamlined form of MCDA using rank order based judgments can accurately assess patients’ colorectal cancer screening priorities. Methods We converted priorities for four decision criteria and three sub-criteria regarding colorectal cancer screening obtained from 484 average risk patients using the Analytic Hierarchy Process (AHP) in a prior study into rank order-based priorities using rank order centroids. We compared the two sets of priorities using Spearman rank correlation and non-parametric Bland-Altman limits of agreement analysis. We assessed the differential impact of using the rank order-based versus the AHP-based priorities on the results of a full MCDA comparing three currently recommended colorectal cancer screening strategies. Generalizability of the results was assessed using Monte Carlo simulation. Results Correlations between the two sets of priorities for the seven criteria ranged from 0.55 to 0.92. The proportions of absolute differences between rank order-based and AHP-based priorities that were more than ± 0.15 ranged from 1% to 16%. Differences in the full MCDA results were minimal and the relative rankings of the three screening options were identical more than 88% of the time. The Monte Carlo simulation results were similar. Conclusion Rank order-based MCDA could be a simple, practical way to guide individual decisions and assess population decision priorities regarding colorectal cancer screening strategies. Additional research is warranted to further explore the use of these methods for promoting shared decision making. PMID:24300851
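
    The rank-order judgments in this study were converted to surrogate weights with rank order centroids. The Python sketch below implements the general ROC formula, w_i = (1/n) * sum_{k=i..n} 1/k, for an arbitrary number of ranked criteria; the specific criteria and the comparison against AHP-derived priorities reported in the paper are not reproduced here.

        from fractions import Fraction

        def rank_order_centroid_weights(n):
            """Rank-order-centroid weights w_i = (1/n) * sum_{k=i..n} 1/k, for ranks i = 1..n."""
            return [sum(Fraction(1, k) for k in range(i, n + 1)) / n for i in range(1, n + 1)]

        # Example: four ranked decision criteria -> surrogate priority weights.
        weights = rank_order_centroid_weights(4)
        print([round(float(w), 3) for w in weights])  # [0.521, 0.271, 0.146, 0.062]
        print(float(sum(weights)))                    # 1.0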

  18. Psychological distress and streamlined BreastScreen follow-up assessment versus standard assessment.

    Science.gov (United States)

    Sherman, Kerry A; Winch, Caleb J; Borecky, Natacha; Boyages, John

    2013-11-04

    To establish whether altered protocol characteristics of streamlined StepDown breast assessment clinics heightened or reduced the psychological distress of women in attendance compared with standard assessment. Willingness to attend future screening was also compared between the assessment groups. Observational, prospective study of women attending either a mammogram-only StepDown or a standard breast assessment clinic. Women completed questionnaires on the day of assessment and 1 month later. Women attending StepDown (136 women) or standard assessment clinics (148 women) at a BreastScreen centre between 10 November 2009 and 7 August 2010. Breast cancer worries; positive and negative psychological consequences of assessment (Psychological Consequences Questionnaire); breast cancer-related intrusion and avoidance (Impact of Event Scale); and willingness to attend, and uneasiness about, future screening. At 1-month follow-up, no group differences were evident between those attending standard and StepDown clinics on breast cancer worries (P= 0.44), positive (P= 0.88) and negative (P = 0.65) consequences, intrusion (P = 0.64), and avoidance (P = 0.87). Willingness to return for future mammograms was high, and did not differ between groups (P = 0.16), although higher levels of unease were associated with lessened willingness to rescreen (P = 0.04). There was no evidence that attending streamlined StepDown assessments had different outcomes in terms of distress than attending standard assessment clinics for women with a BreastScreen-detected abnormality. However, unease about attending future screening was generally associated with less willingness to do so in both groups; thus, there is a role for psycho-educational intervention to address these concerns.

  19. Management methods ash from combustion of biomass. Review of productions and associated methods. Extended abstract

    International Nuclear Information System (INIS)

    Boulday, D.; Marcovecchio, F.

    2016-02-01

    The study deals with the management of biomass ashes from industrial and collective facilities (wood log excluded) and provides a state of the art, in France and in Europe, of flows, methods of recovery and post-treatment, physico-chemical characteristics and programs for new opportunities. Currently, flows of biomass ash are estimated at 110-330 kt in France and 1,500-4,500 kt in Europe, and should reach 330-1,000 kt and 3,100-8,000 kt respectively by 2020. The physical and chemical composition of biomass ash is influenced by many factors: fuel, pretreatment, post-treatment, additives, fly versus bottom ash, installation power, type of combustion equipment, extraction mode... However, these ashes have some commonly accepted characteristics: liming/neutralizing power, fertilizing value, and a pozzolanic behaviour that is generally close to zero. In France and Europe, a distinction is made between fly ash and bottom ash, the latter usually being less polluted. However, this separation does not always make sense depending on the recovery route, the type of equipment (including fluidized bed or grate) or the mixtures of ash made in the plant (e.g. mix of bottom and coarse ash). Currently, the main outlet for ash is landfill, followed by agricultural and forestry recycling. The other identified opportunities concern only a few countries and marginal flows: brick-works, road engineering... The development of biomass energy, coupled with a reduction in landfill options, has given rise to many research and demonstration programs in recent years, particularly in France, with some promising solutions. Many limiting factors, which can differ between opportunities, have been identified, and more or less advanced solutions aim at reducing their harmful effects (slaking lime, sorting, grinding...). However, to date, the most robust and large-scale solution for material recycling of ash undoubtedly remains agricultural recycling. According to the study, it's necessary to consolidate the agricultural

  20. Quantitative methods for analysing cumulative effects on fish migration success: a review.

    Science.gov (United States)

    Johnson, J E; Patterson, D A; Martins, E G; Cooke, S J; Hinch, S G

    2012-07-01

    It is often recognized, but seldom addressed, that a quantitative assessment of the cumulative effects, both additive and non-additive, of multiple stressors on fish survival would provide a more realistic representation of the factors that influence fish migration. This review presents a compilation of analytical methods applied to a well-studied fish migration, a more general review of quantitative multivariable methods, and a synthesis on how to apply new analytical techniques in fish migration studies. A compilation of adult migration papers from Fraser River sockeye salmon Oncorhynchus nerka revealed a limited number of multivariable methods being applied and the sub-optimal reliance on univariable methods for multivariable problems. The literature review of fisheries science, general biology and medicine identified a large number of alternative methods for dealing with cumulative effects, with a limited number of techniques being used in fish migration studies. An evaluation of the different methods revealed that certain classes of multivariable analyses will probably prove useful in future assessments of cumulative effects on fish migration. This overview and evaluation of quantitative methods gathered from the disparate fields should serve as a primer for anyone seeking to quantify cumulative effects on fish migration survival. © 2012 The Authors. Journal of Fish Biology © 2012 The Fisheries Society of the British Isles.

  1. A review of methods for monitoring streamflow for sustainable water resource management

    Science.gov (United States)

    Dobriyal, Pariva; Badola, Ruchi; Tuboi, Chongpi; Hussain, Syed Ainul

    2017-10-01

    Monitoring of streamflow may help to determine the optimum levels of its use for sustainable water management in the face of climate change. We reviewed available methods for monitoring streamflow on the basis of six criteria, viz. their applicability across different terrains and stream sizes, operational ease, time effectiveness, accuracy, the environmental impact they may cause and the cost involved. On the basis of the strengths and weaknesses of each of the methods reviewed, we conclude that the timed volume method is apt for hilly terrain with smaller streams due to its operational ease and accuracy of results. Although comparatively expensive, the weir and flume methods are suitable for long-term studies of small hill streams, since once the structure is put in place, it yields accurate results. In flat terrain, the float method is best suited for smaller streams for its operational ease and cost effectiveness, whereas for larger streams, particle image velocimetry may be used for its accuracy. Our review suggests that the selection of a method for monitoring streamflow may be based on the volume of the stream, the accuracy of the method, the accessibility of the terrain and the financial and physical resources available.
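
    For concreteness, the two simplest discharge estimates named above can be written out directly; the sketch below assumes hypothetical field measurements, and the 0.85 surface-to-mean velocity correction in the float method is a common rule of thumb rather than a value from this review.

      # Sketch: timed volume and float methods for estimating stream discharge.
      def timed_volume_discharge(volume_litres, seconds):
          """Timed volume method: discharge = collected volume / collection time (L/s)."""
          return volume_litres / seconds

      def float_method_discharge(width_m, mean_depth_m, reach_length_m, float_time_s,
                                 correction=0.85):
          """Float method: Q = cross-sectional area * surface velocity * correction,
          where the correction converts surface velocity to approximate mean velocity."""
          area = width_m * mean_depth_m                      # m^2
          surface_velocity = reach_length_m / float_time_s   # m/s
          return area * surface_velocity * correction        # m^3/s

      print(timed_volume_discharge(10.0, 4.0))               # small hill stream: 2.5 L/s
      print(float_method_discharge(3.0, 0.4, 10.0, 12.0))    # larger stream: ~0.85 m^3/s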

  2. The Assessment Methods of Laryngeal Muscle Activity in Muscle Tension Dysphonia: A Review

    Science.gov (United States)

    Khoddami, Seyyedeh Maryam; Nakhostin Ansari, Noureddin; Izadi, Farzad; Talebian Moghadam, Saeed

    2013-01-01

    The purpose of this paper is to review the methods used for the assessment of muscular tension dysphonia (MTD). MTD is a functional voice disorder associated with abnormal laryngeal muscle activity. Various assessment methods are available in the literature to evaluate laryngeal hyperfunction. The case history, laryngoscopy, and palpation are clinical methods for the assessment of patients with MTD. Radiography and surface electromyography (EMG) are objective methods that provide physiological information about MTD. Recent studies show that surface EMG can be an effective tool for assessing muscular tension in MTD. PMID:24319372

  3. A review on brightness preserving contrast enhancement methods for digital image

    Science.gov (United States)

    Rahman, Md Arifur; Liu, Shilong; Li, Ruowei; Wu, Hongkun; Liu, San Chi; Jahan, Mahmuda Rawnak; Kwok, Ngaiming

    2018-04-01

    Image enhancement is an imperative step for many vision-based applications. For image contrast enhancement, popular methods adopt the principle of spreading the captured intensities throughout the allowed dynamic range according to predefined distributions. However, these algorithms take little or no account of maintaining the mean brightness of the original scene, which is of paramount importance for conveying the true scene illumination characteristics to the viewer. Although a significant number of reviews of contrast enhancement methods have been published, an up-to-date review of brightness-preserving image enhancement methods is still scarce. In this paper, a detailed survey is performed on those particular methods that specifically aim to maintain the overall scene illumination characteristics while enhancing the digital image.
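
    One well-known member of this family is brightness-preserving bi-histogram equalization, which splits the histogram at the mean intensity and equalizes each half separately so the output mean stays close to the input mean. The sketch below is a simplified illustration of that idea for an 8-bit greyscale NumPy array; it is not an implementation from any of the surveyed papers.

      # Sketch: brightness-preserving bi-histogram equalization for an 8-bit image.
      import numpy as np

      def bbhe(img):
          """Equalize the sub-histograms below and above the mean intensity separately."""
          mean = int(img.mean())
          out = np.empty_like(img)
          halves = [(0, mean, img <= mean), (min(mean + 1, 255), 255, img > mean)]
          for lo, hi, mask in halves:
              values = img[mask].astype(np.int64)
              if values.size == 0:
                  continue
              hist = np.bincount(values - lo, minlength=hi - lo + 1)
              cdf = np.cumsum(hist) / values.size
              out[mask] = (lo + np.round(cdf[values - lo] * (hi - lo))).astype(img.dtype)
          return out

      # Usage (assumed 8-bit greyscale array): enhanced = bbhe(grey_image)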

  4. A Review of Propensity-Score Methods and Their Use in Cardiovascular Research.

    Science.gov (United States)

    Deb, Saswata; Austin, Peter C; Tu, Jack V; Ko, Dennis T; Mazer, C David; Kiss, Alex; Fremes, Stephen E

    2016-02-01

    Observational studies using propensity-score methods have been increasing in the cardiovascular literature because randomized controlled trials are not always feasible or ethical. However, propensity-score methods can be confusing, and the general audience may not fully understand the importance of this technique. The objectives of this review are to describe (1) the fundamentals of propensity score methods, (2) the techniques to assess propensity-score model adequacy, (3) the 4 major methods for using the propensity score (matching, stratification, covariate adjustment, and inverse probability of treatment weighting [IPTW]) using examples from previously published cardiovascular studies, and (4) the strengths and weaknesses of these 4 techniques. Our review suggests that matching or IPTW using the propensity score have been shown to be most effective in reducing bias in the treatment effect. Copyright © 2016 Canadian Cardiovascular Society. Published by Elsevier Inc. All rights reserved.
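
    As a concrete illustration of one of the four techniques listed (IPTW), the sketch below fits a logistic-regression propensity model and computes a weighted difference in outcome means; the DataFrame and its column names are assumptions made for the example, not data from the reviewed studies.

      # Sketch: inverse probability of treatment weighting (IPTW) with scikit-learn.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      def iptw_effect(df, covariates, treatment="treated", outcome="outcome"):
          """Weight treated subjects by 1/e(x) and controls by 1/(1 - e(x)), where
          e(x) is the estimated propensity score, then compare weighted means."""
          model = LogisticRegression(max_iter=1000).fit(df[covariates], df[treatment])
          ps = model.predict_proba(df[covariates])[:, 1]        # propensity score e(x)
          t = df[treatment].to_numpy()
          y = df[outcome].to_numpy()
          w = np.where(t == 1, 1.0 / ps, 1.0 / (1.0 - ps))      # IPTW weights
          treated_mean = np.average(y[t == 1], weights=w[t == 1])
          control_mean = np.average(y[t == 0], weights=w[t == 0])
          return treated_mean - control_mean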

  5. Physical methods of resveratrol induction in grapes and grape products - a review

    International Nuclear Information System (INIS)

    Triska, J.; Houska, M.

    2012-01-01

    Trans-resveratrol ((E)-3,4',5-trihydroxystilbene) is a substance that is produced by a large number of plants as a phytoalexin. Resveratrol has been credited as being potentially responsible for the 'French paradox' - the observation that the French have a relatively low incidence of coronary heart disease, even though their diet is high in saturated fats. This review deals with methods serving to increase the resveratrol content in wine products - wine and grape juices. The methods reviewed are UV irradiation of grapes and ozonisation of grapes. The discussed methods describe ways of increasing resveratrol contents in grapes and wine by 'natural' means. Resveratrol is increased endogenously and therefore need not be declared as an added substance on product labels.

  6. Comparison of critical methods developed for fatty acid analysis: A review.

    Science.gov (United States)

    Wu, Zhuona; Zhang, Qi; Li, Ning; Pu, Yiqiong; Wang, Bing; Zhang, Tong

    2017-01-01

    Fatty acids are important nutritional substances and metabolites in living organisms. These acids are abundant in Chinese herbs, such as Brucea javanica, Notopterygium forbesii, Isatis tinctoria, Astragalus membranaceus, and Aconitum szechenyianum. This review illustrates the types of fatty acids and their significant roles in the human body. Many analytical methods are used for the qualitative and quantitative evaluation of fatty acids. Some of the methods used to analyze fatty acids in more than 30 kinds of plants, drugs, and other samples are presented in this paper. These analytical methods include gas chromatography, liquid chromatography, near-infrared spectroscopy, and NMR spectroscopy. The advantages and disadvantages of these techniques are described and compared. This review provides a valuable reference for establishing methods for fatty acid determination. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Dating methods and geochronology of fractures and movements in bedrock: a review

    Energy Technology Data Exchange (ETDEWEB)

    Tullborg, E.L. [Terralogica AB, Graabo (Sweden); Larson, Sven Aake [Goeteborgs Univ. (Sweden); Morad, S. [Uppsala Univ. (Sweden)

    2001-06-01

    Constraining the absolute and relative ages of crustal movements is of fundamental importance in evaluating the potential of a site as a repository for spent radioactive fuel. In this report, a review summary of up-to-date absolute and relative dating methods is presented, with specific attention to those methods most amenable to the dating of fractures. A review of major fracture and shear zones in the Swedish part of the Baltic Shield is also given. Since the shield has suffered a long and complicated history, geochronologists are faced with the problem of reactivated zones when attempting to date these. It is important to obtain structural control in order to make the choice of dating method, since different methods may give answers to completely different questions. An integration of all geological background data is necessary in order to make the proper choice to fit the question raised.

  8. Advanced image based methods for structural integrity monitoring: Review and prospects

    Science.gov (United States)

    Farahani, Behzad V.; Sousa, Pedro José; Barros, Francisco; Tavares, Paulo J.; Moreira, Pedro M. G. P.

    2018-02-01

    There is a growing trend in engineering to develop methods for structural integrity monitoring and characterization of the in-service mechanical behaviour of components. The fast growth in recent years of image processing techniques and image-based sensing for experimental mechanics has brought about a paradigm change in phenomena sensing. Hence, several widely applicable optical approaches are playing a significant role in support of experiment. The current review describes advanced image-based methods for structural integrity monitoring, and focuses on methods such as Digital Image Correlation (DIC), Thermoelastic Stress Analysis (TSA), Electronic Speckle Pattern Interferometry (ESPI) and Speckle Pattern Shearing Interferometry (Shearography). These non-contact full-field techniques rely on intensive image processing methods to measure mechanical behaviour, and evolve even as reviews such as this are being written, which justifies a special effort to keep abreast of this progress.

  9. Review of laser-induced fluorescence methods for measuring rf- and microwave electric fields in discharges

    International Nuclear Information System (INIS)

    Gavrilenko, V.; Oks, E.

    1994-01-01

    Development of methods for measuring rf- or μ-wave electric fields E(t) = E_0 cos(ωt) in discharge plasmas is of great practical importance. First, these are fields used for producing rf- or μ-wave discharges. Second, the fields E(t) may represent electromagnetic waves penetrating into a plasma from the outside. This paper reviews methods for diagnostics of the fields E(t) in low-temperature plasmas based on Laser-Induced Fluorescence (LIF). Compared to emission (passive) methods, LIF methods have a higher sensitivity as well as higher spatial and temporal resolutions. The underlying physical effects may be highlighted by the example of LIF of hydrogen atoms in a plasma. After a presentation of the underlying physical principles, the review focuses on key experiments where these principles were implemented for measurements of rf- and μ-wave electric fields in various discharges.

  10. The Impact of Dynamic Lighting in Classrooms. A Review on Methods

    DEFF Research Database (Denmark)

    Hansen, Ellen Kathrine; Nielsen, Stine Maria Louring; Georgieva, Diana Zdravkova

    2018-01-01

    In order to understand how research can support lighting designs to improve nurturing environments for learning, a literature review was carried out. The review examined lighting research methods and parameters used for evaluating the effect of dynamic lighting in classrooms. The test parameter gaining most attention in the studies is academic performance, whereas qualitative test parameters, such as behaviour and mood, are addressed in less than a third of the selected studies. The analysis of these methods leads to the conclusion that learning environments should, to a broader extent, be studied and designed holistically through a mixed-method approach. It is suggested that the potentials of dynamic lighting in learning environments be explored through design-driven innovation and the use of mixed methods, in order to be able to put more emphasis on the students' and teachers' needs for dynamic lighting.

  11. Dating methods and geochronology of fractures and movements in bedrock: a review

    International Nuclear Information System (INIS)

    Tullborg, E.L.; Larson, Sven Aake; Morad, S.

    2001-06-01

    Constraining the absolute and relative ages of crustal movements is of fundamental importance in evaluating the potential of a site as a repository for spent radioactive fuel. In this report, a review summary of up-to-date absolute and relative dating methods is presented, with specific attention to those methods most amenable to the dating of fractures. A review of major fracture and shear zones in the Swedish part of the Baltic Shield is also given. Since the shield has suffered a long and complicated history, geochronologists are faced with the problem of reactivated zones when attempting to date these. It is important to obtain structural control in order to make the choice of dating method, since different methods may give answers to completely different questions. An integration of all geological background data is necessary in order to make the proper choice to fit the question raised.

  12. Methods for estimating the burden of antimicrobial resistance: a systematic literature review protocol

    Directory of Open Access Journals (Sweden)

    Nichola R. Naylor

    2016-11-01

    Background Estimates of the burden of antimicrobial resistance (AMR) are needed to ascertain AMR impact, to evaluate interventions, and to allocate resources efficiently. Recent studies have estimated the health, cost, and economic burden relating to AMR, with outcomes of interest ranging from the impact of drug-bug resistance on mortality in a hospital setting to the total economic impact of AMR on the global economy. However, recent collation of this information has been largely informal, with no formal quality assessment of the current evidence base (e.g. with predefined checklists). This review therefore aims to establish what perspectives and resulting methodologies have been used in establishing the burden of AMR, whilst also ascertaining the quality of these studies. Methods The literature review will identify relevant literature using a systematic review methodology. MEDLINE, EMBASE, Scopus and EconLit will be searched utilising a predefined search string. Grey literature will be identified by searching within a predefined list of organisational websites. Independent screening of retrievals will be performed in a two-stage process (abstracts and full texts), utilising pre-defined inclusion and exclusion criteria. Data will be extracted into a data extraction table and a descriptive examination will be performed. Study quality will be assessed using the Newcastle-Ottawa scales and the Philips checklists where appropriate. A narrative synthesis of the results will be presented. Discussion This review will provide an overview of previous health, cost and economic definitions of burden and the resultant impact of these different definitions on the estimated burden of AMR. The review will also explore the methods that have been used to calculate this burden and discuss the resulting study quality. This review can therefore act as a guide to methods for future research in this area. Systematic review registration PROSPERO CRD42016037510

  13. Review of methods and indicators in sustainable urban transport studies overview from 2000 to 2016

    Directory of Open Access Journals (Sweden)

    Puji Adiatna Nadi

    2017-12-01

    Attention to sustainable urban transport is growing in both developed and developing countries. The purpose of this paper is to review the methods and indicators used for measuring the performance of sustainable urban transport. This study is based on a literature review and case study observation, and also uses quantitative assessment. It reviews the theoretical aspects of sustainability factors in various research works and performance indicators in urban transportation. The indicators were classified into two major categories: (i) assessment methods in sustainable urban transport (SUT), and (ii) basic sustainability indicators for urban transport. This study found several types of analytical techniques for measuring sustainability indicators in urban transport. It also identifies five indicators as basic elements for measuring sustainable urban transport performance, i.e. traffic congestion, traffic air pollution, traffic noise pollution, traffic accidents and land consumption for transport infrastructure.

  14. A review of the methods to measure the ion temperature in a tokamak plasma

    International Nuclear Information System (INIS)

    Zurro Hernandez, B.; Perez-Navarro Gomez, A.

    1976-01-01

    The most important methods to measure the ion temperature in a Tokamak plasma are reviewed, e.g. energy analysis of the fast neutrals which escape from the plasma, Doppler broadening of the emission spectral lines and fusion neutron analysis. Their limits are discussed, as well as the advantages and drawbacks of each one. Other methods of some interest in the future are outlined. (author) [es

  15. A review of cyber security risk assessment methods for SCADA systems

    OpenAIRE

    Cherdantseva, Yulia; Burnap, Peter; Blyth, Andrew; Eden, Peter; Jones, Kevin; Soulsby, Hugh; Stoddart, Kristan

    2016-01-01

    This paper reviews the state of the art in cyber security risk assessment of Supervisory Control and Data Acquisition (SCADA) systems. We select and in-detail examine twenty-four risk assessment methods developed for or applied in the context of a SCADA system. We describe the essence of the methods and then analyse them in terms of aim; application domain; the stages of risk management addressed; key risk management concepts covered; impact measurement; sources of probabilistic data; evaluat...

  16. Analytical Review of Data Visualization Methods in Application to Big Data

    OpenAIRE

    Gorodov, Evgeniy Yur’evich; Gubarev, Vasiliy Vasil’evich

    2013-01-01

    This paper describes the term Big Data with respect to data representation and visualization. There are specific problems in Big Data visualization, so we define these problems and a set of approaches for avoiding them. We also review existing methods for data visualization as applied to Big Data, taking the described problems into account. Summarizing the results, we provide a classification of visualization methods in application to Big Data.

  17. Estimating Rooftop Suitability for PV: A Review of Methods, Patents, and Validation Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Melius, J. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Margolis, R. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Ong, S. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2013-12-01

    A number of methods have been developed using remote sensing data to estimate rooftop area suitable for the installation of photovoltaics (PV) at various geospatial resolutions. This report reviews the literature and patents on methods for estimating rooftop-area appropriate for PV, including constant-value methods, manual selection methods, and GIS-based methods. This report also presents NREL's proposed method for estimating suitable rooftop area for PV using Light Detection and Ranging (LiDAR) data in conjunction with a GIS model to predict areas with appropriate slope, orientation, and sunlight. NREL's method is validated against solar installation data from New Jersey, Colorado, and California to compare modeled results to actual on-the-ground measurements.
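
    A highly simplified version of the GIS-based filtering step described here can be sketched as a mask over slope and aspect rasters derived from LiDAR; the thresholds below are illustrative assumptions, not NREL's published criteria.

      # Sketch: crude rooftop-suitability mask from slope/aspect rasters (northern
      # hemisphere), keeping flat roofs and south-facing pitched roofs.
      import numpy as np

      def suitable_roof_mask(slope_deg, aspect_deg, flat_max=9.5, pitch_max=45.0,
                             south_min=90.0, south_max=270.0):
          flat = slope_deg <= flat_max
          pitched_ok = ((slope_deg <= pitch_max)
                        & (aspect_deg >= south_min)
                        & (aspect_deg <= south_max))
          return flat | pitched_ok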

  18. Lagrange–Galerkin methods for the incompressible Navier-Stokes equations: a review

    Directory of Open Access Journals (Sweden)

    Bermejo Rodolfo

    2016-09-01

    We review in this paper the development of Lagrange-Galerkin (LG) methods to integrate the incompressible Navier-Stokes equations (NSEs) for engineering applications. These methods were introduced in the computational fluid dynamics community in the early eighties of the past century, and at that time they were considered good methods both for their theoretical stability properties and for the way they deal with the nonlinear terms of the equations; however, the numerical experience gained with the application of LG methods to different problems has identified drawbacks, such as the calculation of specific integrals that arise in their formulation and the calculation of the flow trajectories, which have somewhat hampered the applicability of LG methods. In this paper, we focus on these issues and summarize the convergence results of LG methods; furthermore, we briefly introduce a new stabilized LG method suitable for high Reynolds numbers.

  19. An Empirical Review of Research Methodologies and Methods in Creativity Studies (2003-2012)

    Science.gov (United States)

    Long, Haiying

    2014-01-01

    Based on the data collected from 5 prestigious creativity journals, research methodologies and methods of 612 empirical studies on creativity, published between 2003 and 2012, were reviewed and compared to those in gifted education. Major findings included: (a) Creativity research was predominantly quantitative and psychometrics and experiment…

  20. Methods for cost management during product development: A review and comparison of different literatures

    NARCIS (Netherlands)

    Wouters, M.; Morales, S.; Grollmuss, S.; Scheer, M.

    2016-01-01

    Purpose The paper provides an overview of research published in the innovation and operations management (IOM) literature on 15 methods for cost management in new product development, and it provides a comparison to an earlier review of the management accounting (MA) literature (Wouters & Morales,

  1. Soft wheat and flour products methods review: solvent retention capacity equation correction

    Science.gov (United States)

    This article discusses the results of a significant change to calculations made within AACCI Approved methods 56-10 and 56-11, the Alkaline Water Retention Capacity (AWRC) test and the Solvent Retention Capacity (SRC) test. The AACCI Soft Wheat and Flour Products Technical Committee reviewed propos...

  2. Methods for mapping the impact of social sciences and humanities - a literature review

    DEFF Research Database (Denmark)

    Pedersen, David Budtz; Grønvad, Jonas Følsgaard; Hvidtfeldt, Rolf

    2018-01-01

    This article explores the current literature on 'research impact' in the Social Sciences and Humanities (SSH). By providing a comprehensive review of available literature, drawing on national and international experiences, we seek to examine key methods and frameworks used to assess research impact...

  3. The measurement of house prices : A review of the sale price appraisal ratio method

    NARCIS (Netherlands)

    De Haan, J.; Van der Wal, E.B.; De Vries, P.

    2009-01-01

    The sale price appraisal ratio (SPAR) method has been applied in a number of countries to construct house price indexes. This paper reviews the statistical and index number properties of the SPAR approach. Three types of SPAR indexes are distinguished: a weighted index, which aims at tracking the

  4. Review of quantum Monte Carlo methods and results for Coulombic systems

    International Nuclear Information System (INIS)

    Ceperley, D.

    1983-01-01

    The various Monte Carlo methods for calculating ground state energies are briefly reviewed. Then a summary of the charged systems that have been studied with Monte Carlo is given. These include the electron gas, small molecules, a metal slab and many-body hydrogen

  5. Review of the different methods to derive average spacing from resolved resonance parameters sets

    International Nuclear Information System (INIS)

    Fort, E.; Derrien, H.; Lafond, D.

    1979-12-01

    The average spacing of resonances is an important parameter for statistical model calculations, especially concerning non fissile nuclei. The different methods to derive this average value from resonance parameters sets have been reviewed and analyzed in order to tentatively detect their respective weaknesses and propose recommendations. Possible improvements are suggested

  6. Methods for Practising Ethics in Research and Innovation : A Literature Review, Critical Analysis and Recommendations

    NARCIS (Netherlands)

    Reijers, Wessel; Wright, David; Brey, Philip; Weber, Karsten; Rodrigues, Rowena; O’Sullivan, Declan; Gordijn, Bert

    2017-01-01

    This paper provides a systematic literature review, analysis and discussion of methods that are proposed to practise ethics in research and innovation (R&I). Ethical considerations concerning the impacts of R&I are increasingly important, due to the quickening pace of technological innovation and

  7. A review of the current literature on aetiology and measurement methods of halitosis.

    NARCIS (Netherlands)

    Broek, A.M. van den; Feenstra, L.; Baat, C. de

    2007-01-01

    OBJECTIVES: This work reviews the current knowledge of aetiology and measurement methods of halitosis. DATA: Halitosis is an unpleasant or offensive odour emanating from the breath. The condition is multifactorial and may involve both oral and non-oral conditions. SOURCES: A private, monthly with

  8. Diverse Delivery Methods and Strong Psychological Benefits: A Review of Online Formative Assessment

    Science.gov (United States)

    McLaughlin, T.; Yan, Z.

    2017-01-01

    This article is a review of literature on online formative assessment (OFA). It includes a narrative summary that synthesizes the research on the diverse delivery methods of OFA, as well as the empirical literature regarding the strong psychological benefits and limitations. Online formative assessment can be delivered using many traditional…

  9. A Review on Methods of Risk Adjustment and their Use in Integrated Healthcare Systems

    Science.gov (United States)

    Juhnke, Christin; Bethge, Susanne

    2016-01-01

    Introduction: Effective risk adjustment is given more and more weight against the background of competitive health insurance systems and vital healthcare systems. The objective of this review was to obtain an overview of existing models of risk adjustment as well as of crucial weights in risk adjustment. Moreover, the predictive performance of selected methods in international healthcare systems should be analysed. Theory and methods: A comprehensive, systematic literature review on methods of risk adjustment was conducted in terms of an encompassing, interdisciplinary examination of the related disciplines. Results: In general, several distinctions can be made: in terms of risk horizons, in terms of risk factors or in terms of the combination of indicators included. Within these, another differentiation by three levels seems reasonable: methods based on mortality risks, methods based on morbidity risks, and those based on information on (self-reported) health status. Conclusions and discussion: After the final examination of different methods of risk adjustment, it was shown that the methodology used to adjust risks varies. The models differ greatly in terms of their included morbidity indicators. The findings of this review can be used in the evaluation of integrated healthcare delivery systems and can be integrated into quality- and patient-oriented reimbursement of care providers in the design of healthcare contracts. PMID:28316544

  10. Intangibles and methods for their valuation in financial terms: Literature review

    Directory of Open Access Journals (Sweden)

    Damián Pastor

    2017-02-01

    Purpose: The purpose of this paper is to review the literature devoted to intangibles and their valuation and to give examples of the methods that can be used for the valuation of individual intangibles in financial terms. Design/methodology/approach: The paper presents a systematic review of articles dedicated to intangibles and their valuation. Findings: This review shows that there is a need for consensus in the definitions of intangibles, intangible assets, knowledge assets and other related terms. These terms are used interchangeably in spite of their different meanings. Many methods for the valuation of intangibles can be found in the literature, but a widely accepted list of basic intangibles with suggested methods for their valuation in financial terms is still missing. Research limitations/implications: Not all the papers related to this topic could be covered in this paper. The presented list of important intangible components may be enhanced and examples of some other methods for their valuation may be added in the future. Practical implications: The paper calls for the development of a framework comprising a list of the most important intangibles, proposals of methods used for their valuation and examples of their use. This framework can be helpful for organizations confronted with the difficult task of intangibles valuation. Originality/value: Basic definitions of, and differences between, intangibles, intangible assets, identifiable intangible assets, knowledge assets and intellectual capital have not previously been brought together in one paper. The list of intangibles and methods for their valuation gives a direction for future work that can be fruitful for the valuation of intangibles.

  11. Use of Vortex Generators to Reduce Distortion for Mach 1.6 Streamline-Traced Supersonic Inlets

    Science.gov (United States)

    Baydar, Ezgihan; Lu, Frank; Slater, John W.; Trefny, Chuck

    2016-01-01

    The objective is to reduce the total pressure distortion at the engine-fan face due to low-momentum flow caused by the interaction of an external terminal shock with the turbulent boundary layer along a streamline-traced external-compression (STEX) inlet at Mach 1.6.

  12. Proposed Model for a Streamlined, Cohesive, and Optimized K-12 STEM Curriculum with a Focus on Engineering

    Science.gov (United States)

    Locke, Edward

    2009-01-01

    This article presents a proposed model for a clear description of K-12 age-possible engineering knowledge content, in terms of the selection of analytic principles and predictive skills for various grades, based on the mastery of mathematics and science pre-requisites, as mandated by national or state performance standards; and a streamlined,…

  13. The Use of the Delphi and Other Consensus Group Methods in Medical Education Research: A Review.

    Science.gov (United States)

    Humphrey-Murto, Susan; Varpio, Lara; Wood, Timothy J; Gonsalves, Carol; Ufholz, Lee-Anne; Mascioli, Kelly; Wang, Carol; Foth, Thomas

    2017-10-01

    Consensus group methods, such as the Delphi method and nominal group technique (NGT), are used to synthesize expert opinions when evidence is lacking. Despite their extensive use, these methods are inconsistently applied. Their use in medical education research has not been well studied. The authors set out to describe the use of consensus methods in medical education research and to assess the reporting quality of these methods and results. Using scoping review methods, the authors searched the Medline, Embase, PsycInfo, PubMed, Scopus, and ERIC databases for 2009-2016. Full-text articles that focused on medical education and the keywords Delphi, RAND, NGT, or other consensus group methods were included. A standardized extraction form was used to collect article demographic data and features reflecting methodological rigor. Of the articles reviewed, 257 met the inclusion criteria. The Modified Delphi (105/257; 40.8%), Delphi (91/257; 35.4%), and NGT (23/257; 8.9%) methods were most often used. The most common study purpose was curriculum development or reform (68/257; 26.5%), assessment tool development (55/257; 21.4%), and defining competencies (43/257; 16.7%). The reporting quality varied, with 70.0% (180/257) of articles reporting a literature review, 27.2% (70/257) reporting what background information was provided to participants, 66.1% (170/257) describing the number of participants, 40.1% (103/257) reporting if private decisions were collected, 37.7% (97/257) reporting if formal feedback of group ratings was shared, and 43.2% (111/257) defining consensus a priori. Consensus methods are poorly standardized and inconsistently used in medical education research. Improved criteria for reporting are needed.
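
    Because the review flags that consensus is often not defined a priori, a small example of such a rule may be useful; the thresholds below (at least 70% of panellists rating an item 7-9 on a 9-point scale, with a narrow interquartile range) are assumed for illustration, not criteria taken from the reviewed articles.

      # Sketch: checking a pre-specified Delphi consensus rule for one item.
      import numpy as np

      def reached_consensus(ratings, agree_min=7, agree_share=0.70, max_iqr=2.0):
          ratings = np.asarray(ratings, dtype=float)
          share_agree = float(np.mean(ratings >= agree_min))
          iqr = np.percentile(ratings, 75) - np.percentile(ratings, 25)
          return share_agree >= agree_share and iqr <= max_iqr

      print(reached_consensus([8, 7, 9, 7, 6, 8, 9, 7]))   # True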

  14. Quality of life among dermatology patients: a systematic review of investigations using qualitative methods.

    Science.gov (United States)

    Singh, Sanminder; Ehsani-Chimeh, Nazanin; Kornmehl, Heather; Armstrong, April W

    2017-07-13

    Quality of life may be assessed using quantitative or qualitative methods. Quantitative methods are commonly used in research settings; however, they may fail to capture the full range of patient experiences and impact on quality of life. Qualitative methods may be used to address this limitation. In this systematic review, we aim to synthesize data from articles utilizing qualitative methods to assess quality of life in dermatology patients. We performed a systematic review search using the MEDLINE, EMBASE, and SCOPUS databases. The search was conducted using the following search criteria: ("Dermatology" [MeSH]) AND ("Quality of Life" [MeSH]), AND ("Qualitative Research" [MeSH]), searching literature spanning from January 1, 1946- October 5, 2016. The systematic review of 15 articles included 533 dermatology patients. Patients expressed frustration over the unpredictability of disease symptoms and having to compensate for the subsequent limitations by altering their daily routines. Patients also reported profound helplessness due to chronic skin disease and social isolation in an effort to hide their disease. Patients noted the patient-provider relationship as a source of support and information exchange, with the goal to form easy to use treatment plans that met both physician and patient expectations. Qualitative assessment of patient quality of life can provide new insights into the patient experience and the impact of their skin disease. Qualitative methodology may capture meaningful information that may be overlooked by quantitative methods, and it should be included in quality of life research.

  15. A comparison of primary two- and three-dimensional methods to review CT colonography

    International Nuclear Information System (INIS)

    Gelder, Rogier E. van; Florie, Jasper; Nio, C. Yung; Jager, Steven W. de; Lameris, Johan S.; Stoker, Jaap; Jensch, Sebastiaan; Vos, Frans M.; Venema, Henk W.; Bartelsman, Joep F.; Reitsma, Johannes B.; Bossuyt, Patrick M.M.

    2007-01-01

    The aim of our study was to compare primary three-dimensional (3D) and primary two-dimensional (2D) review methods for CT colonography with regard to polyp detection and perceptive errors. CT colonography studies of 77 patients were read twice by three reviewers, first with a primary 3D method and then with a primary 2D method. Mean numbers of true and false positives, patient sensitivity and specificity and perceptive errors were calculated with colonoscopy as a reference standard. A perceptive error was made if a polyp was not detected by all reviewers. Mean sensitivity for large (≥10 mm) polyps for primary 3D and 2D review was 81% (14.7/18) and 70% (12.7/18), respectively (p-values ≥ 0.25). Mean numbers of large false positives for primary 3D and 2D were 8.3 and 5.3, respectively. With primary 3D and 2D review, 1 and 6 perceptive errors, respectively, were made in 18 large polyps (p = 0.06). For medium-sized (6-9 mm) polyps these values were, for primary 3D and 2D respectively: mean sensitivity 67% (11.3/17) and 61% (10.3/17) (p-values ≥ 0.45), number of false positives 33.3 and 15.6, and perceptive errors 4 and 6 (p = 0.53). No significant differences were found in the detection of large and medium-sized polyps between primary 3D and 2D review. (orig.)

  16. Is Video-Based Education an Effective Method in Surgical Education? A Systematic Review.

    Science.gov (United States)

    Ahmet, Akgul; Gamze, Kus; Rustem, Mustafaoglu; Sezen, Karaborklu Argut

    2018-02-12

    Visual cues draw more attention during the learning process, and video is one of the most effective tools because it includes many visual cues. This systematic review set out to explore the influence of video in surgical education. We reviewed the current evidence for video-based surgical education methods and discuss their advantages and disadvantages for the teaching of technical and nontechnical surgical skills. This systematic review was conducted according to the guidelines defined in the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement. The electronic databases Cochrane Library, Medline (PubMed), and ProQuest were searched from their inception to 30 January 2016. The Medical Subject Headings (MeSH) terms and keywords used were "video," "education," and "surgery." We analyzed all full-text, randomised and nonrandomised clinical trials and observational studies involving video-based education methods for any surgery. "Education" here means a medical resident's or student's training and teaching process, not patient education. We did not impose restrictions on language or publication date. A total of nine articles met the inclusion criteria and were included. These trials enrolled 507 participants, and the number of participants per trial ranged from 10 to 172. Nearly all of the studies reviewed report significant knowledge gains from video-based education techniques. The findings of this systematic review, based on fair- to good-quality studies, demonstrate significant gains in knowledge compared with traditional teaching. Adding video to simulator exercises or 3D animations has beneficial effects on training time, learning duration, acquisition of surgical skills, and trainees' satisfaction. Video-based education has potential for use in surgical education, as trainees face significant barriers in their practice. This method is effective according to the recent literature. Video should be used in addition to standard techniques.

  17. Determination of 237Np in environmental and nuclear samples: A review of the analytical method

    International Nuclear Information System (INIS)

    Thakur, P.; Mulholland, G.P.

    2012-01-01

    A number of analytical methods have been developed and used for the determination of neptunium in environmental and nuclear fuel samples using alpha spectrometry, ICP-MS, and other analytical techniques. This review summarizes and discusses the development of radiochemical procedures for the separation of neptunium (Np) since the beginning of the nuclear industry, followed by a more detailed discussion of recent trends in the separation of neptunium. This article also highlights the progress in analytical methods and the issues associated with the determination of neptunium in environmental samples. - Highlights: ► Determination of Np in environmental and nuclear samples is reviewed. ► Various analytical methods used for the determination of Np are listed. ► Progress and issues associated with the determination of Np are discussed.

  18. Review of the Strain Modulation Methods Used in Fiber Bragg Grating Sensors

    Directory of Open Access Journals (Sweden)

    Kuo Li

    2016-01-01

    Fiber Bragg grating (FBG) is inherently sensitive to temperature and strain. By modulating an FBG's strain, various FBG sensors have been developed, such as sensors with enhanced or reduced temperature sensitivity, strain/displacement sensors, inclinometers, accelerometers, pressure meters, and magnetic field meters. This paper reviews the strain modulation methods used in these FBG sensors and categorizes them according to whether the strain of the FBG is changed evenly. Those even-strain-change methods are then subcategorized into (1) attaching/embedding an FBG along its whole length to a base and (2) fixing the two ends of an FBG and either (2.1) changing the distance between the two ends or (2.2) bending the FBG by applying a transverse force at its middle. This review shows that the methods of "fixing the two ends" are prominent because of the advantages of large tunability and frequency modulation.
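
    The relation that all of these strain-modulation schemes exploit is the Bragg wavelength shift with strain and temperature; the sketch below uses typical order-of-magnitude constants for silica fibre (assumed values, not figures from this review).

      # Sketch: Bragg wavelength shift, delta_lambda = lambda_B * ((1 - p_e) * strain
      # + (alpha + xi) * delta_T), with assumed silica-fibre constants.
      def bragg_shift_nm(lambda_b_nm, strain, delta_t=0.0,
                         p_e=0.22, alpha=0.55e-6, xi=6.7e-6):
          """p_e: effective photo-elastic coefficient; alpha: thermal expansion;
          xi: thermo-optic coefficient (both per degree C)."""
          return lambda_b_nm * ((1.0 - p_e) * strain + (alpha + xi) * delta_t)

      # 100 microstrain on a 1550 nm grating at constant temperature: ~0.12 nm shift
      print(bragg_shift_nm(1550.0, 100e-6))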

  19. A Review of Spectral Methods for Variable Amplitude Fatigue Prediction and New Results

    Science.gov (United States)

    Larsen, Curtis E.; Irvine, Tom

    2013-01-01

    A comprehensive review of the available methods for estimating fatigue damage from variable amplitude loading is presented. The dependence of fatigue damage accumulation on the power spectral density (psd) is investigated for random processes relevant to real structures, such as in offshore or aerospace applications. Beginning with the Rayleigh (or narrow band) approximation, attempts at improved approximations or corrections to the Rayleigh approximation are examined by comparison to rainflow analysis of time histories simulated from psd functions representative of simple theoretical and real-world applications. Spectral methods investigated include corrections by Wirsching and Light, Ortiz and Chen, the Dirlik formula, and the Single-Moment method, among other more recently proposed methods. Good agreement is obtained between the spectral methods and the time-domain rainflow identification for most cases, with some limitations. Guidelines are given for using the several spectral methods to increase confidence in the damage estimate.
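
    The baseline in this family of methods, the narrow-band (Rayleigh) estimate, is simple enough to write out; the sketch below assumes a one-sided stress PSD on a uniform frequency grid and an S-N curve of the form N = C * S^(-b) with S the stress amplitude, and the numerical values are illustrative only.

      # Sketch: spectral moments and the narrow-band (Rayleigh) fatigue damage estimate.
      import numpy as np
      from math import gamma

      def spectral_moment(freq, psd, k):
          """m_k ~= sum of f^k * G(f) * df, assuming a uniform frequency grid."""
          df = freq[1] - freq[0]
          return float(np.sum(psd * freq**k) * df)

      def narrowband_damage(freq, psd, duration_s, C, b):
          """Damage = nu0 * T * E[S^b] / C, with S Rayleigh-distributed (sigma^2 = m0)
          and nu0 = sqrt(m2/m0) the expected zero up-crossing rate."""
          m0 = spectral_moment(freq, psd, 0)
          m2 = spectral_moment(freq, psd, 2)
          nu0 = np.sqrt(m2 / m0)
          expected_Sb = (np.sqrt(2.0 * m0)) ** b * gamma(1.0 + b / 2.0)
          return nu0 * duration_s * expected_Sb / C

      f = np.linspace(0.1, 50.0, 500)                      # Hz
      G = np.where((f > 5) & (f < 15), 100.0, 0.0)         # (stress unit)^2 / Hz
      print(narrowband_damage(f, G, duration_s=3600.0, C=1e12, b=3.0))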

  20. Impact Strength of Natural Fibre Composites Measured by Different Test Methods: A Review

    Directory of Open Access Journals (Sweden)

    Navaranjan Namasivayam

    2017-01-01

    Different types of impact test methods have been used in recent years to measure the impact resistance of natural fibre composites (NFCs). After reviewing the literature, the impact resistance of flax, hemp, sisal, wood and jute fibre composites measured using different test methods has been compared and discussed. It was found that the test methods were selected on the basis of research interest, industry requirements or the availability of test equipment. Each method has its own advantages and limitations. Results from a particular test can be compared with each other, but not with results from other test methods. Most impact test methods were developed for testing the ductile-brittle transition of metals. However, each NFC has a different morphology and is not comparable to metals in failure mode and energy absorption characteristics during an impact test. A post-test evaluation of the morphology of an NFC sample is important for characterising the material.

  1. Early Interventions Following the Death of a Parent: Protocol of a Mixed Methods Systematic Review.

    Science.gov (United States)

    Pereira, Mariana; Johnsen, Iren; Hauken, May Aa; Kristensen, Pål; Dyregrov, Atle

    2017-06-29

    Previous meta-analyses have examined the effectiveness of interventions for bereaved children, showing small to moderate effect sizes. However, no mixed methods systematic review has been conducted on bereavement interventions following the loss of a parent that focuses on the time since death in regard to the prevention of grief complications. The overall purpose of the review is to provide a rigorous synthesis of early intervention after parental death in childhood. Specifically, the aims are twofold: (1) to determine the rationales, contents, timeframes, and outcomes of early bereavement care interventions for children and/or their parents and (2) to assess the quality of current early intervention studies. Quantitative, qualitative, and mixed methods intervention studies that start intervention with parentally bereaved children (and/or their parents) up to 6 months postloss will be included in the review. The search strategy was based on the Population, Interventions, Comparator, Outcomes, and Study Designs (PICOS) approach, and it was devised together with a university librarian. The literature searches will be carried out in the Medical Literature Analysis and Retrieval System Online (MEDLINE), PsycINFO, Excerpta Medica Database (EMBASE), and Cumulative Index to Nursing and Allied Health Literature (CINAHL). The Mixed Methods Appraisal Tool will be used to appraise the quality of eligible studies. All data will be narratively synthesized following the Guidance on the Conduct of Narrative Synthesis in Systematic Reviews. The systematic review is ongoing and the data search has started. The review is expected to be completed by the end of 2017. Findings will be submitted to leading journals for publication. In accordance with the current diagnostic criteria for prolonged grief as well as the users' perspectives literature, this systematic review outlines a possible sensitive period for early intervention following the death of a parent. The hereby presented protocol ensures

  2. Text mining for search term development in systematic reviewing: A discussion of some methods and challenges.

    Science.gov (United States)

    Stansfield, Claire; O'Mara-Eves, Alison; Thomas, James

    2017-09-01

    Using text mining to aid the development of database search strings for topics described by diverse terminology has potential benefits for systematic reviews; however, methods and tools for accomplishing this are poorly covered in the research methods literature. We briefly review the literature on applications of text mining for search term development for systematic reviewing. We found that the tools can be used in 5 overarching ways: improving the precision of searches; identifying search terms to improve search sensitivity; aiding the translation of search strategies across databases; searching and screening within an integrated system; and developing objectively derived search strategies. Using a case study and selected examples, we then reflect on the utility of certain technologies (term frequency-inverse document frequency and Termine, term frequency, and clustering) in improving the precision and sensitivity of searches. Challenges in using these tools are discussed. The utility of these tools is influenced by the different capabilities of the tools, the way the tools are used, and the text that is analysed. Increased awareness of how the tools perform facilitates the further development of methods for their use in systematic reviews. Copyright © 2017 John Wiley & Sons, Ltd.
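
    A minimal example of the term frequency-inverse document frequency idea mentioned above: rank candidate terms by their TF-IDF weight in a set of known-relevant records against a background set. The two tiny "documents" and the use of scikit-learn are assumptions made for illustration, not the tools used by the authors.

      # Sketch: ranking candidate search terms by TF-IDF with scikit-learn.
      from sklearn.feature_extraction.text import TfidfVectorizer

      relevant = ("streamlined review methods rapid review evidence synthesis "
                  "search strategy screening study selection")
      background = ("general biomedical abstracts used as a comparison corpus "
                    "covering unrelated clinical topics")

      vectorizer = TfidfVectorizer(ngram_range=(1, 2), stop_words="english")
      tfidf = vectorizer.fit_transform([relevant, background])

      terms = vectorizer.get_feature_names_out()
      scores = tfidf.toarray()[0]                          # weights in the relevant set
      for term, score in sorted(zip(terms, scores), key=lambda x: -x[1])[:10]:
          print(f"{term:25s} {score:.3f}")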

  3. A Review of Data Quality Assessment Methods for Public Health Information Systems

    Directory of Open Access Journals (Sweden)

    Hong Chen

    2014-05-01

    High quality data and effective data quality assessment are required for accurately evaluating the impact of public health interventions and measuring public health outcomes. Data, data use, and the data collection process, as the three dimensions of data quality, all need to be assessed for an overall data quality assessment. We reviewed current data quality assessment methods. Relevant studies were identified in major databases and on well-known institutional websites. We found that the data dimension was assessed most frequently. Completeness, accuracy, and timeliness were the three most-used attributes among a total of 49 attributes of data quality. The major quantitative assessment methods were descriptive surveys and data audits, whereas the common qualitative assessment methods were interviews and documentation review. The limitations of the reviewed studies included inattentiveness to data use and the data collection process, inconsistency in the definition of attributes of data quality, failure to address data users' concerns and a lack of systematic procedures in data quality assessment. This review study is limited by the coverage of the databases and the breadth of public health information systems. Further research could develop consistent data quality definitions and attributes. More research effort should be devoted to assessing the quality of data use and the quality of the data collection process.
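
    For the two attributes most often assessed alongside accuracy (completeness and timeliness), simple quantitative checks can be expressed directly; the pandas sketch below uses hypothetical column names and a hypothetical 7-day reporting target, purely as an illustration of the kind of metric the reviewed studies compute.

      # Sketch: completeness and timeliness metrics for a table of case reports.
      import pandas as pd

      def completeness(df, required_cols):
          """Share of non-missing values per required field."""
          return df[required_cols].notna().mean().round(3)

      def timeliness(df, event_col="onset_date", report_col="report_date", max_days=7):
          """Share of records reported within max_days of the event."""
          delay = (df[report_col] - df[event_col]).dt.days
          return float((delay <= max_days).mean())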

  4. The effectiveness of scoliosis screening programs: methods for systematic review and expert panel recommendations formulation

    Science.gov (United States)

    2013-01-01

    Background Literature on scoliosis screening is vast; however, because of the observational nature of available data and methodological flaws, data interpretation is often complex, leading to incomplete and sometimes somewhat misleading conclusions. The need to propose a set of methods for critical appraisal of the literature about scoliosis screening, together with a comprehensive summary and rating of the available evidence, therefore appeared essential. Methods To address these gaps, the study aims were: i) to propose a framework for the assessment of published studies on scoliosis screening effectiveness; ii) to suggest specific questions to be answered on screening effectiveness instead of trying to reach a global position for or against the programs; iii) to contextualize the knowledge through expert panel consultation and meaningful recommendations. The general methodological approach proceeds through the following steps: elaboration of the conceptual framework; formulation of the review questions; identification of the criteria for the review; selection of the studies; critical assessment of the studies; synthesis of results; formulation and grading of recommendations in response to the questions. This plan follows as closely as possible the GRADE Group (Grades of Recommendation, Assessment, Development and Evaluation) requirements for systematic reviews, assessing the quality of evidence and grading the strength of recommendations. Conclusions In this article, the methods developed in support of this work are presented, since they may be of some interest for similar reviews in the scoliosis and orthopaedic fields. PMID:23883346

  5. Review of literature on bioassay methods for estimating radionuclides in urine

    International Nuclear Information System (INIS)

    Prasad, M.V.R.; Surya Narayana, D.S.; Jeevanram, R.K.; Sundarajan, A.R.

    1991-01-01

    Bioassay methods for certain important radionuclides encountered in nuclear fuel cycle operations, viz. thorium, uranium, 239Pu, 241Am, 90Sr, 99Tc, 106Ru and 137Cs, are reviewed, with special emphasis on urinalysis. Since preconcentration is an important prerequisite for bioassay, various preconcentration methods are also discussed. A brief account of the various instruments, both nuclear and analytical, used in the bioassay programme is included. The sensitivities of the methods cited in the literature are compared vis-a-vis the derived recording levels indicated in ICRP recommendations. Literature surveyed up to 1990 is tabulated. (author). 96 refs., 1 fig., 3 tabs

  6. Methods to determine stratification efficiency of thermal energy storage processes–Review and theoretical comparison

    DEFF Research Database (Denmark)

    Haller, Michel; Cruickshank, Chynthia; Streicher, Wolfgang

    2009-01-01

    This paper reviews different methods that have been proposed to characterize thermal stratification in energy storages from a theoretical point of view. Specifically, this paper focuses on the methods that can be used to determine the ability of a storage to promote and maintain stratification during charging, storing and discharging, and represent this ability with a single numerical value in terms of a stratification efficiency for a given experiment or under given boundary conditions. Existing methods for calculating stratification efficiencies have been applied to hypothetical storage

  7. Mixed-methods designs in mental health services research: a review.

    Science.gov (United States)

    Palinkas, Lawrence A; Horwitz, Sarah M; Chamberlain, Patricia; Hurlburt, Michael S; Landsverk, John

    2011-03-01

    Despite increased calls for use of mixed-methods designs in mental health services research, how and why such methods are being used and whether there are any consistent patterns that might indicate a consensus about how such methods can and should be used are unclear. Use of mixed methods was examined in 50 peer-reviewed journal articles found by searching PubMed Central and 60 National Institutes of Health (NIH)-funded projects found by searching the CRISP database over five years (2005-2009). Studies were coded for aims and the rationale, structure, function, and process for using mixed methods. A notable increase was observed in articles published and grants funded over the study period. However, most did not provide an explicit rationale for using mixed methods, and 74% gave priority to use of quantitative methods. Mixed methods were used to accomplish five distinct types of study aims (assess needs for services, examine existing services, develop new or adapt existing services, evaluate services in randomized controlled trials, and examine service implementation), with three categories of rationale, seven structural arrangements based on timing and weighting of methods, five functions of mixed methods, and three ways of linking quantitative and qualitative data. Each study aim was associated with a specific pattern of use of mixed methods, and four common patterns were identified. These studies offer guidance for continued progress in integrating qualitative and quantitative methods in mental health services research consistent with efforts by NIH and other funding agencies to promote their use.

  8. Methods of blinding in reports of randomized controlled trials assessing pharmacologic treatments: a systematic review.

    Directory of Open Access Journals (Sweden)

    Isabelle Boutron

    2006-10-01

    Full Text Available BACKGROUND: Blinding is a cornerstone of therapeutic evaluation because lack of blinding can bias treatment effect estimates. An inventory of the blinding methods would help trialists conduct high-quality clinical trials and readers appraise the quality of results of published trials. We aimed to systematically classify and describe methods to establish and maintain blinding of patients and health care providers and methods to obtain blinding of outcome assessors in randomized controlled trials of pharmacologic treatments. METHODS AND FINDINGS: We undertook a systematic review of all reports of randomized controlled trials assessing pharmacologic treatments with blinding published in 2004 in high impact-factor journals from Medline and the Cochrane Methodology Register. We used a standardized data collection form to extract data. The blinding methods were classified according to whether they primarily (1) established blinding of patients or health care providers, (2) maintained the blinding of patients or health care providers, and (3) obtained blinding of assessors of the main outcomes. We identified 819 articles, with 472 (58%) describing the method of blinding. Methods to establish blinding of patients and/or health care providers concerned mainly treatments provided in identical form, specific methods to mask some characteristics of the treatments (e.g., added flavor or opaque coverage), or use of double dummy procedures or simulation of an injection. Methods to avoid unblinding of patients and/or health care providers involved use of active placebo, centralized assessment of side effects, patients informed only in part about the potential side effects of each treatment, centralized adapted dosage, or provision of sham results of complementary investigations. The methods reported for blinding outcome assessors mainly relied on a centralized assessment of complementary investigations, clinical examination (i.e., use of video, audiotape, or…

  9. Availability and performance of image/video-based vital signs monitoring methods: a systematic review protocol.

    Science.gov (United States)

    Harford, Mirae; Catherall, Jacqueline; Gerry, Stephen; Young, Duncan; Watkinson, Peter

    2017-10-25

    For many vital signs, monitoring methods require contact with the patient and/or are invasive in nature. There is increasing interest in developing still and video image-guided monitoring methods that are non-contact and non-invasive. We will undertake a systematic review of still and video image-based monitoring methods. We will perform searches in multiple databases which include MEDLINE, Embase, CINAHL, Cochrane library, IEEE Xplore and ACM Digital Library. We will use OpenGrey and Google searches to access unpublished or commercial data. We will not use language or publication date restrictions. The primary goal is to summarise current image-based vital signs monitoring methods, limited to heart rate, respiratory rate, oxygen saturations and blood pressure. Of particular interest will be the effectiveness of image-based methods compared to reference devices. Other outcomes of interest include the quality of the method comparison studies with respect to published reporting guidelines, any limitations of non-contact non-invasive technology and application in different populations. To the best of our knowledge, this is the first systematic review of image-based non-contact methods of vital signs monitoring. Synthesis of currently available technology will facilitate future research in this highly topical area. PROSPERO CRD42016029167.

  10. Scoping review of response shift methods: current reporting practices and recommendations.

    Science.gov (United States)

    Sajobi, Tolulope T; Brahmbatt, Ronak; Lix, Lisa M; Zumbo, Bruno D; Sawatzky, Richard

    2018-05-01

    Response shift (RS) has been defined as a change in the meaning of an individual's self-evaluation of his/her health status and quality of life. Several statistical model- and design-based methods have been developed to test for RS in longitudinal data. We reviewed the uptake of these methods in the patient-reported outcomes (PRO) literature. CINAHL, EMBASE, Medline, ProQuest, PsycINFO, and Web of Science were searched to identify English-language articles about RS published until 2016. Data on year and country of publication, PRO measure adopted, RS detection method, type of RS detected, and testing of underlying model assumptions were extracted from the included articles. Of the 1032 articles identified, 101 (9.8%) articles were included in the study. While 54.5% of the articles reported on the Then-test, 30.7% of the articles reported on Oort's or Schmitt's structural equation modeling (SEM) procedure. Newer RS detection methods, such as relative importance analysis and random forest regression, have been used less frequently. Less than 25% reported on testing the assumptions underlying the adopted RS detection method(s). Despite rapid methodological advancements in RS research, this review highlights the need for further research about RS detection methods for complex longitudinal data and standardized reporting guidelines.
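
    The Then-test named above rests on simple arithmetic: a retrospective pretest collected at follow-up is compared with the conventional baseline pretest, and the mean difference is read as recalibration response shift. The following sketch is a hypothetical illustration of that logic under invented score data; it is not drawn from any of the reviewed articles.

        # Hypothetical illustration of the Then-test logic (not taken from the
        # reviewed articles): recalibration response shift is the mean difference
        # between the retrospective pretest ("then-test") and the conventional
        # pretest, and the RS-adjusted change is post-test minus then-test.
        from statistics import mean

        def then_test(pretest, then, posttest):
            """Each argument is a list of scores for the same respondents."""
            response_shift = mean(t - p for t, p in zip(then, pretest))
            observed_change = mean(q - p for q, p in zip(posttest, pretest))
            adjusted_change = mean(q - t for q, t in zip(posttest, then))
            return response_shift, observed_change, adjusted_change

        # Invented quality-of-life scores for five respondents.
        print(then_test(pretest=[60, 55, 70, 65, 50],
                        then=[55, 50, 68, 60, 45],
                        posttest=[72, 66, 80, 75, 62]))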

  11. A Review on Microdialysis Calibration Methods: the Theory and Current Related Efforts.

    Science.gov (United States)

    Kho, Chun Min; Enche Ab Rahim, Siti Kartini; Ahmad, Zainal Arifin; Abdullah, Norazharuddin Shah

    2017-07-01

    Microdialysis is a sampling technique first introduced in the late 1950s. Although originally designed to study endogenous compounds in the animal brain, the technique was later adapted for use in other organs. Microdialysis not only collects the unbound concentrations of compounds at tissue sites; it can also be used to deliver exogenous compounds to a designated area. Owing to this versatility, the microdialysis technique is widely employed in a number of areas, including biomedical research. However, in most in vivo studies the concentration of a substance obtained directly from microdialysis does not accurately reflect its concentration at the sampling site. To relate the results collected from microdialysis to the actual in vivo condition, a calibration method is required. To date, various microdialysis calibration methods have been reported, each capable of providing valuable insights into the technique itself and its applications. This paper provides a critical review of the various calibration methods used in microdialysis applications, beginning with a detailed description of the microdialysis technique itself. It reviews each calibration method in detail, presents examples of related work, including clinical efforts, and discusses the advantages and disadvantages of each method.
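
    As an illustration of what such calibration involves, the sketch below implements one widely cited approach, the no-net-flux (zero-net-flux) method, with invented data; the review itself covers a broader range of calibration methods, so this example should be read as an assumption-laden sketch rather than a reference implementation.

        # Assumed illustration of one common calibration approach, the no-net-flux
        # (zero-net-flux) method: the perfusate is spiked with known concentrations,
        # the net gain/loss across the probe is regressed on the perfusate
        # concentration, and the x-intercept estimates the tissue concentration.
        def no_net_flux(c_in, c_out):
            """c_in: perfusate concentrations; c_out: matching dialysate concentrations."""
            n = len(c_in)
            net = [o - i for i, o in zip(c_in, c_out)]      # net flux into the probe
            mx, my = sum(c_in) / n, sum(net) / n
            slope = (sum((x - mx) * (y - my) for x, y in zip(c_in, net))
                     / sum((x - mx) ** 2 for x in c_in))
            intercept = my - slope * mx
            tissue_conc = -intercept / slope                # perfusate level giving zero net flux
            recovery = -slope                               # probe extraction fraction
            return tissue_conc, recovery

        # Invented perfusate/dialysate concentration pairs (arbitrary units).
        print(no_net_flux([0.0, 2.0, 4.0, 8.0], [1.5, 2.6, 3.7, 5.9]))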

  12. A Review on Human Activity Recognition Using Vision-Based Method.

    Science.gov (United States)

    Zhang, Shugang; Wei, Zhiqiang; Nie, Jie; Huang, Lei; Wang, Shuang; Li, Zhen

    2017-01-01

    Human activity recognition (HAR) aims to recognize activities from a series of observations of the actions of subjects and the environmental conditions. Vision-based HAR research is the basis of many applications, including video surveillance, health care, and human-computer interaction (HCI). This review highlights advances in state-of-the-art activity recognition approaches, especially the activity representation and classification methods. For representation methods, we trace a chronological research trajectory from global representations to local representations and recent depth-based representations. For classification methods, we follow the categorization into template-based methods, discriminative models, and generative models, and review several prevalent methods. Next, representative and available datasets are introduced. Aiming to provide an overview of those methods and a convenient way of comparing them, we classify the existing literature with a detailed taxonomy covering representation and classification methods as well as the datasets used. Finally, we investigate the directions for future research.

  13. Thresholds for statistical and clinical significance in systematic reviews with meta-analytic methods

    DEFF Research Database (Denmark)

    Jakobsen, Janus Christian; Wetterslev, Jorn; Winkel, Per

    2014-01-01

    BACKGROUND: Thresholds for statistical significance when assessing meta-analysis results are being insufficiently demonstrated by traditional 95% confidence intervals and P-values. Assessment of intervention effects in systematic reviews with meta-analysis deserves greater rigour. METHODS: Methodologies for assessing statistical and clinical significance of intervention effects in systematic reviews were considered. Balancing simplicity and comprehensiveness, an operational procedure was developed, based mainly on The Cochrane Collaboration methodology and the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) guidelines. RESULTS: We propose an eight-step procedure for better validation of meta-analytic results in systematic reviews: (1) Obtain the 95% confidence intervals and the P-values from both fixed-effect and random-effects meta-analyses and report the most...
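
    Step (1) of the proposed procedure amounts to standard inverse-variance arithmetic. The sketch below illustrates that arithmetic with an inverse-variance fixed-effect model and a DerSimonian-Laird random-effects model; it is a minimal assumed example with invented trial data, not the authors' eight-step procedure.

        # Minimal illustration (not the authors' procedure) of step (1): pooled
        # estimate, 95% confidence interval and two-sided P-value from an
        # inverse-variance fixed-effect model and a DerSimonian-Laird
        # random-effects model.
        import math

        def meta_analysis(effects, variances):
            """effects: per-trial estimates (e.g., log odds ratios); variances: their squared SEs."""
            w = [1.0 / v for v in variances]
            fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
            q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))   # Cochran's Q
            c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
            tau2 = max(0.0, (q - (len(effects) - 1)) / c)                   # between-trial variance
            w_re = [1.0 / (v + tau2) for v in variances]
            pooled_re = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)

            def summarize(est, weights):
                se = math.sqrt(1.0 / sum(weights))
                p = math.erfc(abs(est / se) / math.sqrt(2))                 # two-sided normal P-value
                return est, est - 1.96 * se, est + 1.96 * se, p

            return {"fixed": summarize(fixed, w), "random": summarize(pooled_re, w_re)}

        # Invented data: three trials' log odds ratios and their variances.
        print(meta_analysis([-0.4, -0.1, -0.3], [0.04, 0.09, 0.02]))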

  14. Theoretical Methods of Domain Structures in Ultrathin Ferroelectric Films: A Review

    Directory of Open Access Journals (Sweden)

    Jianyi Liu

    2014-09-01

    Full Text Available This review covers methods and recent developments of the theoretical study of domain structures in ultrathin ferroelectric films. The review begins with an introduction to some basic concepts and theories (e.g., polarization and its modern theory, ferroelectric phase transition, domain formation, and finite size effects) that are relevant to the study of domain structures in ultrathin ferroelectric films. Basic techniques and recent progress of a variety of important approaches for domain structure simulation, including first-principles calculation, molecular dynamics, Monte Carlo simulation, effective Hamiltonian approach and phase field modeling, as well as multiscale simulation are then elaborated. For each approach, its important features and relative merits over other approaches for modeling domain structures in ultrathin ferroelectric films are discussed. Finally, we review recent theoretical studies on some important issues of domain structures in ultrathin ferroelectric films, with an emphasis on the effects of interfacial electrostatics, boundary conditions and external loads.

  15. Processing methods, characteristics and adsorption behavior of tire derived carbons: a review.

    Science.gov (United States)

    Saleh, Tawfik A; Gupta, Vinod Kumar

    2014-09-01

    The remarkable increase in the number of vehicles worldwide and the lack of both technical and economic mechanisms of disposal make waste tires a serious source of pollution. One potential recycling process is pyrolysis followed by a chemical activation process to produce porous activated carbons. Many researchers have recently demonstrated the capability of such carbons as adsorbents to remove various types of pollutants, including organic and inorganic species. This review attempts to compile relevant knowledge about the production methods of carbon from waste rubber tires. The effects of various process parameters are reviewed, including temperature and heating rate in the pyrolysis stage, and activation temperature and time, activating agent and activating gas in the activation stage. The review also highlights the use of waste-tire-derived carbon to remove various types of pollutants, such as heavy metals, dyes and pesticides, from aqueous media. Copyright © 2014 Elsevier B.V. All rights reserved.

  16. The effects of the Pilates method in the elderly: a systematic review

    Directory of Open Access Journals (Sweden)

    Patrícia Becker Engers

    Full Text Available Several studies show the benefits of including muscle strength and aerobic physical activity in the routine of elderly people. Among the various possibilities of physical activity, the Pilates method has become a popular modality in recent years, through a system of exercises that works the whole body, corrects posture and realigns the muscles, developing the body stability needed for a healthier life. The aim of this study was to review the current evidence on the effects of the practice of the Pilates method in the elderly. A systematic literature review was conducted in the following electronic databases: PubMed, SciELO, Lilacs/Bireme, Scopus, PEDro and ISI Web of Knowledge, using the descriptors pilates, elderly, old adults and aging. The following inclusion criteria were used in the selection of studies: original articles in English, Portuguese or Spanish. All article selection and evaluation processes were performed in pairs, and methodological quality was verified with the Downs and Black scale. Twenty-one studies were included. The year of publication ranged from 2003 to 2014, and sample sizes varied from 8 to 311 elderly subjects aged at least 60 years. The intervention period ranged from 4 weeks to 12 months of Pilates exercise practice. It was concluded that, although the studies point to physical and motor benefits of the Pilates method in the elderly, it cannot be stated whether or not the method is effective, given the poor methodological quality of the studies included in this review.

  17. Evaluation methods for physical activity-promoting mobile technologies: an interdisciplinary scoping review

    Directory of Open Access Journals (Sweden)

    Claire McCallum

    2015-10-01

    Full Text Available There are many thousands of mobile apps, wearables and other technologies available to support and promote physical activity. However, the rapidly evolving nature of these technologies means that the methodologies traditionally used to evaluate the effectiveness of behaviour change interventions (such as the randomised controlled trial) may not be appropriate for evaluating their effectiveness. A scoping review was conducted to identify the methods currently being used to evaluate physical activity-promoting mobile technologies across health and computing science disciplines. In addition to the range of methods used, the review explored their strengths and weaknesses. The results improve understanding of when and why to use existing methods from health and computing science. Opportunities for combining and hybridising methods across the two disciplines are also identified. The review will be used to inform the development and piloting of novel, ‘fit-for-purpose’ research designs that will allow rigorous evaluation of the effectiveness of rapidly evolving physical activity-promoting mobile technologies and their ‘active ingredients’ to build an evidence base of what works, why and for whom.

  18. A Review of the Ecological Footprint Indicator—Perceptions and Methods

    Directory of Open Access Journals (Sweden)

    Thomas Wiedmann

    2010-06-01

    Full Text Available We present a comprehensive review of perceptions and methods around the Ecological Footprint (EF), based on a survey of more than 50 international EF stakeholders and a review of more than 150 original papers on EF methods and applications over the last decade. The key points identified in the survey are that the EF (a) is seen as a strong communication tool, (b) has a limited role within a policy context, (c) is limited in scope, (d) should be closer aligned to the UN System of Environmental and Economic Accounting and (e) is most useful as part of a basket of indicators. Key issues from the review of methods are: (a) none of the major methods identified can address all relevant issues and questions at once, (b) basing bioproductivity calculations on Net Primary Production (NPP) is a promising approach, (c) advances in linking bioproductivity with ecosystem services and biodiversity have been made by the Dynamic EF concept and the HANPP indicator, (d) environmentally extended input-output analysis (IOA) provides a number of advantages for improving EF calculations and (e) further variations such as the emergy-based concept or the inclusion of further pollutants are not regarded as providing a fundamental shift to the usefulness of EF for policy making. We also discuss the implications of our findings for the use of the EF as a headline indicator for sustainability decision-making.
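
    For readers unfamiliar with the footprint arithmetic referred to above, the basic calculation converts each demand into the world-average bioproductive area needed to supply it and expresses the total in global hectares via equivalence factors. The sketch below is a simplified, assumed illustration; yields, equivalence factors and consumption figures are invented and do not represent any of the reviewed methods.

        # Simplified, assumed illustration of the basic Ecological Footprint arithmetic:
        # each demand is divided by a world-average yield to get the bioproductive area
        # required, then scaled by an equivalence factor into global hectares (gha).
        # All figures are invented and are not taken from the reviewed methods.
        def ecological_footprint(demands):
            """demands: (tonnes consumed per year, world-average yield in t/ha, equivalence factor in gha/ha)."""
            return sum(tonnes / yield_t_per_ha * eqf
                       for tonnes, yield_t_per_ha, eqf in demands)

        # Invented consumption bundle: cereals, timber, and CO2 treated as the forest
        # area assumed to sequester the emissions (tonnes CO2 and t/ha uptake rate).
        print(ecological_footprint([(2.0, 2.7, 2.5),
                                    (1.0, 1.8, 1.3),
                                    (4.5, 0.73, 1.3)]))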

  19. The Advanced Aluminum Nitride Synthesis Methods and Its Applications: Patent Review.

    Science.gov (United States)

    Shishkin, Roman A; Elagin, Andrey A; Mayorova, Ekaterina S; Beketov, Askold R

    2016-01-01

    High-purity nanosized aluminum nitride synthesis is a current issue for both industry and science. However, there is no up-to-date review considering the major issues and the technical solutions for the different methods. This review aims to investigate the advanced methods of aluminum nitride synthesis and its development tendencies. The aluminum nitride application patents and prospects for development of the field have also been considered. A patent search on "aluminum nitride synthesis" was carried out and the research activity analyzed, with special attention paid to the patenting geography and the leading researchers in aluminum nitride synthesis. Aluminum nitride synthesis methods have been divided into six main groups; the most studied approaches are carbothermal reduction (88 patents) and direct nitridation (107 patents). The current issues for each group have been analyzed; the main trends are purification of the final product and nanopowder synthesis. The leading researchers in aluminum nitride synthesis represent five countries: Japan, China, Russia, South Korea and the USA. The main aluminum nitride application areas are electronics (59.1 percent of applications) and new materials manufacturing (30.9 percent). The review covers state-of-the-art data on nanosized aluminum nitride synthesis and the major issues and technical solutions for the different synthesis methods, giving a full understanding of the development tendencies and of the current leaders in the field.

  20. Cochrane Qualitative and Implementation Methods Group guidance series-paper 5: methods for integrating qualitative and implementation evidence within intervention effectiveness reviews.

    Science.gov (United States)

    Harden, Angela; Thomas, James; Cargo, Margaret; Harris, Janet; Pantoja, Tomas; Flemming, Kate; Booth, Andrew; Garside, Ruth; Hannes, Karin; Noyes, Jane

    2018-05-01

    The Cochrane Qualitative and Implementation Methods Group develops and publishes guidance on the synthesis of qualitative and mixed-method evidence from process evaluations. Despite a proliferation of methods for the synthesis of qualitative research, less attention has focused on how to integrate these syntheses within intervention effectiveness reviews. In this article, we report updated guidance from the group on approaches, methods, and tools, which can be used to integrate the findings from quantitative studies evaluating intervention effectiveness with those from qualitative studies and process evaluations. We draw on conceptual analyses of mixed methods systematic review designs and the range of methods and tools that have been used in published reviews that have successfully integrated different types of evidence. We outline five key methods and tools as devices for integration which vary in terms of the levels at which integration takes place; the specialist skills and expertise required within the review team; and their appropriateness in the context of limited evidence. In situations where the requirement is the integration of qualitative and process evidence within intervention effectiveness reviews, we recommend the use of a sequential approach. Here, evidence from each tradition is synthesized separately using methods consistent with each tradition before integration takes place using a common framework. Reviews which integrate qualitative and process evaluation evidence alongside quantitative evidence on intervention effectiveness in a systematic way are rare. This guidance aims to support review teams to achieve integration and we encourage further development through reflection and formal testing. Copyright © 2017 Elsevier Inc. All rights reserved.